I have seen this question for Windows: Run all SQL files in a directory
I was wondering how to do it for Linux. To my knowledge, the .bat file type is Windows-only. Does anyone know a simple script for Linux? I rarely use Linux.
I have code to run them one at a time with
sqlcmd -S localhost -U SA -P myPassword -i myFile1.sql
(My SQL files specify which database to use.) I'm just unsure how to run all of them, since there are a lot.
A very simplistic sh script file might contain:
#!/bin/sh
#
# Loop over all .sql files in the current directory.
# The shell glob expands in locale-based sort order, so there's
# no need to parse the output of 'ls'.
for i in *.sql; do
    sqlcmd -S localhost -U SA -P myPassword -i "$i"
done
If there is a specific order to the sql files then you would need to name them in a way that sorts into the correct order.
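If ordering matters, one common convention is a zero-padded numeric prefix, since the glob expands in lexical order. A small sketch (file names here are made-up examples) showing why the padding matters:

```shell
# Demonstrate glob ordering with zero-padded prefixes (POSIX sh).
dir=$(mktemp -d)
cd "$dir" || exit 1
touch 001_create_tables.sql 002_load_data.sql 010_add_indexes.sql
# The glob expands in lexical order, so 010_ correctly sorts after 002_;
# an unpadded '10_' prefix would sort before '2_'.
order=$(printf '%s\n' *.sql)
printf '%s\n' "$order"
```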
find to the rescue:
find ./directory -maxdepth 1 -name '*.sql' -exec sqlcmd -S localhost -U SA -P myPassword -i {} \;
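Note that the pattern must be quoted ('*.sql') so the shell doesn't expand it before find runs. If file names may contain spaces, a -print0/xargs -0 variant is safer; this dry-run sketch (echo in place of actually calling sqlcmd, credentials as in the question) illustrates it:

```shell
# Dry run: each matching file becomes one sqlcmd invocation.
dir=$(mktemp -d)
touch "$dir/a.sql" "$dir/b c.sql"
# -print0 / -0 keep 'b c.sql' together as a single argument.
cmds=$(find "$dir" -maxdepth 1 -name '*.sql' -print0 \
    | xargs -0 -I {} echo sqlcmd -S localhost -U SA -P myPassword -i {})
printf '%s\n' "$cmds"
```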
I have a folder on my PC that contains multiple SQL files. Each file is a Postgres function. I want to execute every SQL file in the folder at once against a PostgreSQL server, using PgAdmin or some other way. How can I accomplish this?
I apologize if I'm oversimplifying your question, but if the main issue is how to execute all SQL files without having to call them one by one, you just need to put them in a loop, e.g. in bash calling psql
#!/bin/bash
for f in *.sql
do
psql -h dbhost -d db -U dbuser -f "$f"
done
Or cat them and pipe the result to psql stdin:
$ cat /path/to/files/*.sql | psql -h dbhost -d db -U dbuser
And if you need them to run in a single transaction, consider merging the SQL files first, e.g. using cat (this assumes all statements in your SQL files are properly terminated):
$ cat /path/to/files/*.sql > merged.sql
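For the single-transaction case, psql's -1/--single-transaction flag wraps the whole file in BEGIN/COMMIT, so any error rolls everything back. A sketch using throwaway files (the connection options mirror the earlier example):

```shell
# Build a merged script from per-object files, then run it atomically.
dir=$(mktemp -d)
printf 'CREATE TABLE t (x int);\n' > "$dir/01_table.sql"
printf 'INSERT INTO t VALUES (1);\n' > "$dir/02_data.sql"
cat "$dir"/*.sql > "$dir/merged.sql"
# --single-transaction: BEGIN/COMMIT around the whole file, full rollback on error:
#   psql -1 -h dbhost -d db -U dbuser -f "$dir/merged.sql"
wc -l < "$dir/merged.sql"
```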
Currently learning SQL online. I've been trying to restore the database from this link:
http://app.sixweeksql.com:2000/SqlCourse.bak
when I run SQL Server through Docker (Mac user, can't run SSMS unfortunately). I've been following directions from Microsoft here:
https://learn.microsoft.com/en-us/sql/linux/tutorial-restore-backup-in-sql-server-container?view=sql-server-2017
I moved the file into my container and checked the files listed inside (CourseNew and CourseNew_log) so I could write out their file paths:
sudo docker cp SqlCourse.bak container_name:/var/opt/mssql/backup
followed by:
sudo docker exec -it container_name /opt/mssql-tools/bin/sqlcmd -S localhost \
-U SA -P "Password" \
-Q 'RESTORE FILELISTONLY FROM DISK = "/var/opt/mssql/backup/SqlCourse.bak"'
Yet I just don't know how to restore the database. I've tried this:
sudo docker exec -it container_name /opt/mssql-tools/bin/sqlcmd \
-S localhost -U SA -P "Password" \
-Q 'RESTORE DATABASE SqlCourse FROM DISK = "/var/opt/mssql/backup/SqlCourse.bak" WITH MOVE "CourseNew" TO "/var/opt/mssql/data/SqlCourse.mdf", MOVE "CourseNew_log" TO "/var/opt/mssql/data/SqlCourse.ldf"
and it returns "unexpected argument." Clearly that's not the right call but I'm not sure how else to proceed.
(Running mcr.microsoft.com/mssql/server:2019-CTP3.2-ubuntu)
Single quotes are used to enclose string literals in T-SQL so the resultant RESTORE T-SQL script needs to be:
RESTORE DATABASE SqlCourse
FROM DISK = '/var/opt/mssql/backup/SqlCourse.bak'
WITH
MOVE 'CourseNew' TO '/var/opt/mssql/data/SqlCourse.mdf'
, MOVE 'CourseNew_log' TO '/var/opt/mssql/data/SqlCourse.ldf';
Since you are passing the command as a bash shell command-line argument, you also need to use Bash's ANSI-C quoting: prefix the argument string with $ and escape the single quotes within the string by preceding each with a backslash:
sudo docker exec -it container_name /opt/mssql-tools/bin/sqlcmd \
-S localhost -U SA -P "Password" \
-Q $'RESTORE DATABASE SqlCourse FROM DISK = \'/var/opt/mssql/backup/SqlCourse.bak\' WITH MOVE \'CourseNew\' TO \'/var/opt/mssql/data/SqlCourse.mdf\', MOVE \'CourseNew_log\' TO \'/var/opt/mssql/data/SqlCourse.ldf\';'
You can avoid the escaping ugliness by copying the normal RESTORE script into the container and running with the SQLCMD -i argument.
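A hypothetical sketch of that approach: keep the RESTORE statement in a plain .sql file, copy it into the container, and point sqlcmd -i at it (container_name and the password are the placeholders from the question):

```shell
# Write the RESTORE script with normal T-SQL quoting -- no shell escaping needed.
f=$(mktemp)
cat > "$f" <<'EOF'
RESTORE DATABASE SqlCourse
FROM DISK = '/var/opt/mssql/backup/SqlCourse.bak'
WITH
    MOVE 'CourseNew' TO '/var/opt/mssql/data/SqlCourse.mdf'
  , MOVE 'CourseNew_log' TO '/var/opt/mssql/data/SqlCourse.ldf';
EOF
# Copy it in and run it (requires the container from the question):
#   sudo docker cp "$f" container_name:/var/opt/mssql/restore.sql
#   sudo docker exec -it container_name /opt/mssql-tools/bin/sqlcmd \
#     -S localhost -U SA -P "Password" -i /var/opt/mssql/restore.sql
cat "$f"
```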
I have about a thousand files on a remote server (all in different directories). I would like to scp them to my local machine. I would not want to run scp command a thousand times in a row, so I have created a text file with a list of file locations on the remote server. It is a simple text file with a path on each line like below:
...
/iscsi/archive/aat/2005/20050801/A/RUN0010.FTS
/iscsi/archive/aat/2006/20060201/A/RUN0062.FTS
/iscsi/archive/aat/2013/20130923/B/RUN0010.FTS
/iscsi/archive/aat/2009/20090709/A/RUN1500.FTS
...
I have searched and found someone trying to do a similar but not the same thing here. The command I would like to edit is below:
cat /location/file.txt | xargs -i scp {} user@server:/location
In my case I need something like:
cat fileList.txt | xargs -i scp user@server:{} .
To download files from a remote server using the list in fileList.txt located in the same directory I run this command from.
When I run this I get an error: xargs: illegal option -- i
How can I get this command to work?
Thanks,
Aina.
You get the error xargs: illegal option -- i because -i is deprecated in GNU xargs and absent from BSD/macOS xargs. Use -I {} instead (you could also use a different replace string, but {} is fine).
If both the list and the files are remote, you can fetch the list over ssh and feed it to xargs -I {}:
ssh user@server cat fileList.txt | xargs -I {} scp user@server:{} .
But this creates N+1 connections, and more importantly this copies all remote files (scattered in different directories you said) to the same local directory. Probably not what you want.
So, in order to recreate a similar hierarchy locally, let's say everything under /iscsi/archive/aat, you can:
use cut -d/ to extract the part you want to be identical on both sides
use a subshell to create the command that creates the target directory and copies the file there
Thus:
ssh user@server cat fileList.txt \
| cut -d/ -f4- \
| xargs -I {} sh -c 'mkdir -p "$(dirname "{}")"; scp user@server:/iscsi/archive/{} ./{}'
Should work, but that's starting to look messy, and you still have N+1 connections, so now rsync looks like a better option. If you have passwordless ssh connection, this should work:
rsync -a --files-from=<(ssh user@server cat fileList.txt) user@server:/ .
The leading / is stripped by rsync and in the end you'll get everything under ./iscsi/archive/....
You can also copy the files locally first, and then:
rsync -a --files-from=localCopyOfFileList.txt user@server:/ .
You can also edit that file to strip, for example, the first two path components:
rsync -a --files-from=localCopyOfFileList2.txt user@server:/iscsi/archive .
etc.
Example of the contents I require a .cmd to contain
mkdir Output
sqlcmd -S serverName -d dbName -E -i "FILE LOCATION HERE" -o Output\Message.log
sqlcmd -S serverName -d dbName -E -i "FILE LOCATION HERE" >> Output\Messages.log
.
.
.
pause
Specifics: I have a SQL repo and need to generate a script that will take all changes from the last revision to this revision and output the above example.
As far as Mercurial is concerned,
hg status -I 're:.*\.sql$' -am --rev 3:7
will list all files with a .sql extension that were added or modified after changeset 3 and up to changeset 7. You can then massage the output into the desired script with your favourite text-processing tools.
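One way to do that massaging in a shell, sketched here with the serverName/dbName placeholders from the example above (the function reads bare file names, as produced by hg status -n, from stdin):

```shell
# Turn a list of changed .sql files into the .cmd shown in the question.
make_cmd() {
    echo 'mkdir Output'
    while IFS= read -r f; do
        printf 'sqlcmd -S serverName -d dbName -E -i "%s" >> Output\\Messages.log\n' "$f"
    done
    echo 'pause'
}
# Real usage:
#   hg status -I 're:.*\.sql$' -amn --rev 3:7 | make_cmd > run_changes.cmd
out=$(printf 'procs/a.sql\nprocs/b.sql\n' | make_cmd)
printf '%s\n' "$out"
```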
N.B.: Blindly running all modified SQL scripts, especially if you intend to use this procedure more than once, seems rather dangerous unless you are sure all the operations are idempotent.
I have 10 sql scripts lying on 10 SVN urls.
I want to write a single sql script which execute those 10 svn sql scripts.
for example, http://svn/s1.sql, http://svn/s2.sql, ...
I want to write a single sql which does like execute http://svn/s1.sql, execute http://svn/s2.sql, etc
How can I do it?
You can run all the .SQL files using sqlcmd
First, create a batch file and paste the lines below into it:
sqlcmd -S ServerName -U Username -P password -i c:\s1.sql -o C:\s1.txt
sqlcmd -S ServerName -U Username -P password -i c:\s2.sql -o C:\s2.txt
sqlcmd -S ServerName -U Username -P password -i c:\s3.sql -o C:\s3.txt
sqlcmd -S ServerName -U Username -P password -i c:\s4.sql -o C:\s4.txt
Execute the batch file from SQL Server like below:
EXEC master..xp_cmdshell 'c:\filename.bat'
You can also refer to the link below for running a batch file:
SQL SERVER – Running Batch File Using T-SQL – xp_cmdshell bat file
You would need to write a program that downloads the files, reads them in line by line, appends them internally, and executes the whole batch.
It would be a huge security hole if you could execute SQL scripts by calling a URL in your browser.
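A minimal sketch of such a program in shell, assuming the placeholder URLs and credentials from the question and that curl is available; shown here as a dry run (drop the echo to execute for real, noting that sqlcmd reads the script from stdin when no -i/-Q is given):

```shell
# Dry run: print the command that would fetch and execute each script.
out=$(for u in http://svn/s1.sql http://svn/s2.sql; do
    echo "curl -fsS $u | sqlcmd -S ServerName -U Username -P password"
done)
printf '%s\n' "$out"
```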