How to run multiple SQL scripts using a batch file?

I have a case where I've got 10+ SQL scripts.
I don't want to go and run all my scripts one by one.
Is there a way I can run all my scripts in succession in SQL Server Management Studio?
I found this post, and creating a batch file seems easier.
This is all you need:
@echo off
ECHO %USERNAME% started the batch process at %TIME% >output.txt
for %%f in (*.sql) do (
(
sqlcmd.exe -S servername -E -d databasename -i %%f >>output.txt
)
pause
I'm replacing servername and databasename, but it doesn't seem to be working.
Any ideas?

You've got an unmatched parenthesis, there.
Try
for %%f in (*.sql) do sqlcmd.exe -S servername -E -d databasename -i %%f >>output.txt
I just saved it in a .cmd file and it appears to be working.

Yes, it's possible. You can do it with the :r command of SQLCMD.
I strongly recommend reading this article and doing it with SQLCMD:
http://www.mssqltips.com/sqlservertip/1543/using-sqlcmd-to-execute-multiple-sql-server-scripts/
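As a minimal sketch of that approach (the script paths, server, and database names below are placeholders, not taken from the article): list the :r commands in one master script, then run that single file with sqlcmd.
-- master.sql: each :r line pulls in another script; they run in order
:r C:\Scripts\script1.sql
:r C:\Scripts\script2.sql
:r C:\Scripts\script3.sql
Then run it from the command line (Windows authentication assumed):
sqlcmd -S servername -E -d databasename -i C:\Scripts\master.sql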

You can create a stored procedure to call all your scripts. You could also create a scheduled plan to run the scripts automatically.
http://msdn.microsoft.com/en-us/library/aa174792(v=sql.80).aspx

Here is an open source utility with source code: http://scriptzrunner.codeplex.com/
This utility was written in C# and allows you to drag and drop many SQL files and start running them against a database.

You can use the Batch Compiler add-in for SSMS; it lets you run multiple scripts at once, create SQLCMD scripts, or consolidate them into a single *.sql file.

Some batch tricks:
cd %~dp0
REM use this if you use 'for xxx in'; it solved most of my problems
ECHO %USERNAME% started the batch process at %TIME% >output.txt
for %%f in (*.sql) do (
sqlcmd.exe -S servername -E -d databasename -i %%f >>output.txt
)
echo %errorlevel%
pause

If you want to run Oracle SQL files through a batch program, the code below will be useful. Just copy it and change the database credentials and DB name.
@echo off
for %%i in ("%~dp0"*.sql) do echo @"%%~fi" >> "%~dp0all.sql"
echo exit | sqlplus scott/tiger@orcl @"c:\users\all.sql"
pause
Basically, you need to put this batch file in the folder where you have all the SQL files. It first collects the names of all the SQL files in the directory and writes their full paths into a file called all.sql; then sqlplus is called with that all.sql to execute all the SQL files in that directory.

Related

Execute bulk of scripts on different databases using single script

I have to execute a bulk of scripts on a database server. I am able to do it with the help of a batch file, but this database server has multiple databases, so for every database I am writing the script below.
Example:
For database A:
for %%G in (*.sql) do sqlcmd /S [Server] /d [A] -E -i"%%G"
pause
For database B:
for %%G in (*.sql) do sqlcmd /S [Server] /d [B] -E -i"%%G"
pause
Is there any way so that I don't have to write this .bat file for every database name? I want to write a single script which works for all the databases...
You could change your script to be:
for %%G in (*.sql) do sqlcmd /S %1 /d %2 -E -i"%%G" pause
then pass in the server and database name when launching the batch script. For example, if your script was called "myscript.bat", you could launch it from the command line via
> myscript.bat servername databasename
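If you would rather have one script that covers all the databases without passing the name each time, a hedged variation (A and B below are just the placeholder database names from the question) loops over the names inside the batch file and only takes the server as a parameter:
@echo off
REM run every .sql file against each database in the list; %1 is the server name passed on the command line
for %%D in (A B) do (
for %%G in (*.sql) do sqlcmd /S %1 /d %%D -E -i"%%G"
)
pause
You would then launch it with just the server name, e.g. myscript.bat [Server].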

Can I specify an input sql file with bcp?

How can I specify an input sql file with a long query when using bcp? I tried using the -i option but it keeps complaining about a command-line error with no extra information. Is this possible?
I had this problem today and found a convenient workaround, at least in an ad-hoc situation.
Temporary tables can be created by any user with connect permissions. This means you can also create GLOBAL temporary tables.
Just run your query in Enterprise Manager (or sqlcmd or whatever) using SELECT ... INTO with a global temporary table, e.g.
SELECT *
INTO ##mytemptable
FROM SomeTable
WHERE [massive where clause, for example]
You can then use the temporary table in the BCP query with a simple
SELECT * FROM ##mytemptable
Then drop the temp table through enterprise manager
DROP TABLE ##mytemptable
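The bcp step itself then only needs that short query. A hedged example (the server and output file names are placeholders; the session that created the ## table must still be open, otherwise the table is already gone):
bcp "SELECT * FROM ##mytemptable" queryout mytemptable.dat -S servername -T -c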
I fixed that another way.
I created a batch file which reads a file and sends its content to the bcp command. See:
@ECHO off
SETLOCAL EnableDelayedExpansion
SET queryFile=%1
SET outFileName=%2
FOR /F "delims=" %%i IN (%queryFile%) DO SET join=!join! %%i
ECHO %join%
bcp "%join%" queryout %outFileName% /S.\SQLE_CAESAR /d /c /t"|" /T
That script receives two parameters:
the filename of the file containing the query;
the filename for the exported data.
Execute the script in cmd like this:
export-query.bat query.sql export.txt
I hope that helped.
As far as I know, the BCP utility only supports Transact-SQL queries written directly on the command line. For example:
bcp "SELECT Name FROM AdventureWorks.Sales.Currency" queryout Currency.Name.dat -T -c
According to its reference the "-i" option:
Specifies the name of a response file, containing the responses to the command prompt questions for each data field when a bulk copy is being performed using interactive mode (-n, -c, -w, or -N not specified).
Notice that it differs from the sqlcmd Utility "-i" option:
Identifies the file that contains a batch of SQL statements or stored procedures. Multiple files may be specified that will be read and processed in order (...)
Try:
query=$( cat < /file.sql )
export query
bcp "${query}" queryout /home/file.csv
Multi-line queries can be given to bcp easily using powershell:
PS> $query = @'
select *
from <table>
'@
PS> bcp $query queryout <outfile> -d <database> -T -S <server> -c
I faced the same issue. It may not be a very good approach, but I did something like the following:
bcp "declare #query nvarchar(max) set #query = (SELECT * FROM OPENROWSET(BULK 'F:\tasks\report_v2.sql', SINGLE_CLOB) AS Contents) exec sp_executesql #query" queryout %outFileName% /c /C RAW -S . -U sa -P 123 -d blog /T
And I must say that if you use a global temp table, the global temp table is dropped automatically after the query executes, so you can't use this approach in some situations.
What really worked for me is this:
@ECHO off
setlocal enableextensions enabledelayedexpansion
SET "queryFile=%1"
SET "outFileName=%2"
SET RESULT=
FOR /F "delims=" %%i IN ('type %queryFile%') DO SET RESULT=!RESULT! %%i
echo %RESULT%
bcp "%RESULT%" queryout %outFileName% -t^ -r \n -T -k -c -d DB_NAME -S SERVER_NAME
type file is the equivalent of cat file in Unix.
What I did with complex queries was create a stored procedure with the desired statement and call it from BCP:
bcp "exec db.schema.stored_procedure" queryout "c:\file.txt" -T -S localhost -t "|" -c
This worked great for me. Greetings!
I made my own script (called bulk.sh) to do this (not optimal and not best practice; the script is ugly, but very functional).
#!/bin/bash
input="SQL_FILE.sql"
count=0
const=1000
lines=()
mkdir -p bulk
# split the input file into chunks of $const lines each, written under bulk/
while IFS= read -r line
do
    lines+=("$line")
    count=$((count+1))
    check=$((count % const))
    if [[ $check -eq 0 ]]; then
        bulk="${lines[*]}"
        unset lines
        number=$(printf "%010d" $count)
        echo "$bulk" > "bulk/bulk${number}.sql"
        bulk=""
    fi
done < "$input"
# flush any leftover lines that did not fill a complete chunk
if [[ ${#lines[@]} -gt 0 ]]; then
    number=$(printf "%010d" $count)
    echo "${lines[*]}" > "bulk/bulk${number}.sql"
fi
FILES="bulk/*"
for f in $FILES
do
    echo "Processing $f file..."
    sqlcmd -S SERVER -d DATABASE -U USER -P "PASSWORD" -i "$f"
    sleep 2s
done
You can try it with:
$ docker run -v /path/to/your/sql/file/folder:/backup -it mcr.microsoft.com/mssql-tools
$ bash bulk.sh

SQL Server: running every sql script in a directory

I'm running SQL Server 2008 locally. I have a pile of scripts I would like to run on my local database. I can connect to the server and run them manually, but I have over 100 scripts, and I'm sure there is a way to do this. Any help is appreciated, thanks!
You can iterate over all the query files in a directory and execute them with the osql utility.
@echo off
for %%f in (*.sql) do (
echo executing %%f
osql -E -i %%f
)
pause

Run all SQL files in a directory

I have a number of .sql files which I have to run in order to apply changes made by other developers on an SQL Server 2005 database.
The files are named according to the following pattern:
0001 - abc.sql
0002 - abcef.sql
0003 - abc.sql
...
Is there a way to run all of them in one go?
Create a .BAT file with the following command:
for %%G in (*.sql) do sqlcmd /S servername /d databaseName -E -i"%%G"
pause
If you need to provide a username and password:
for %%G in (*.sql) do sqlcmd /S servername /d databaseName -U username -P password -i"%%G"
Note that the "-E" is not needed when a username/password is provided.
Place this .BAT file in the directory from which you want the .SQL files to be executed, double click the .BAT file and you are done!
Use FOR. From the command prompt:
c:\>for %f in (*.sql) do sqlcmd /S <servername> /d <dbname> /E /i "%f"
In SQL Server Management Studio, open a new query and type all the files as below:
:r c:\Scripts\script1.sql
:r c:\Scripts\script2.sql
:r c:\Scripts\script3.sql
Go to the Query menu in SQL Server Management Studio and make sure SQLCMD Mode is enabled.
Click on SQLCMD Mode; the files will be highlighted in grey as below:
:r c:\Scripts\script1.sql
:r c:\Scripts\script2.sql
:r c:\Scripts\script3.sql
Now execute
The easiest way I found included the following steps (the only requirement is Windows 7 or newer):
open the folder in Explorer
select all script files
hold down Shift
right-click the selection and select "Copy as path"
go to SQL Server Management Studio
create a new query
Query Menu, "SQLCMD mode"
paste the list, then Ctrl+H, replace '"C:' (or whatever the drive letter) with ':r "C:' (i.e. prefix the lines with ':r ')
run the query
It sounds long, but in reality it's very fast (it only sounds long because I described even the smallest steps).
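For example (the path is just illustrative), a pasted line such as
"C:\Scripts\script1.sql"
becomes
:r "C:\Scripts\script1.sql"
after the replace, and the whole pasted list then runs as one SQLCMD-mode query.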
Make sure you have SQLCMD enabled by clicking on the Query > SQLCMD mode option in the management studio.
Suppose you have four .sql files (script1.sql, script2.sql, script3.sql, script4.sql) in the folder c:\scripts.
Create a main script file (Main.sql) with the following:
:r c:\Scripts\script1.sql
:r c:\Scripts\script2.sql
:r c:\Scripts\script3.sql
:r c:\Scripts\script4.sql
Save the Main.sql in c:\scripts itself.
Create a batch file named ExecuteScripts.bat with the following:
SQLCMD -E -d<YourDatabaseName> -ic:\Scripts\Main.sql
PAUSE
Remember to replace <YourDatabaseName> with the database you want to execute your scripts against. For example, if the database is "Employee", the command would be the following:
SQLCMD -E -dEmployee -ic:\Scripts\Main.sql
PAUSE
Execute the batch file by double-clicking it.
General Query
Save the lines below in Notepad as batch.bat and place it inside the folder where all your script files are:
for %%G in (*.sql) do sqlcmd /S servername /d databasename -i"%%G"
pause
EXAMPLE
for %%G in (*.sql) do sqlcmd /S NFGDDD23432 /d EMPLYEEDB -i"%%G"
pause
If the login fails for you, please use the code below with a username and password:
for %%G in (*.sql) do sqlcmd /S SERVERNAME /d DBNAME -U USERNAME -P PASSWORD -i"%%G"
pause
for %%G in (*.sql) do sqlcmd /S NE8148server /d EMPLYEEDB -U Scott -P tiger -i"%%G"
pause
After you create the .bat file inside the folder where your script files are, just click on the .bat file and your scripts will be executed.
You could use ApexSQL Propagate. It is a free tool which executes multiple scripts on multiple databases. You can select as many scripts as you need and execute them against one or multiple databases (even multiple servers). You can create a script list and save it, then just select that list each time you want to execute those same scripts in the created order (multiple script lists can be added as well).
When scripts and databases are selected, they will be shown in the main window, and all you have to do is click the “Execute” button; all scripts will then be executed against the selected databases in the given order.
I wrote an open source utility in C# that allows you to drag and drop many SQL files and start running them against a database.
The utility has the following features:
Drag And Drop script files
Run a directory of script files
SQL script output messages during execution
Scripts that passed or failed are colored green and red (yellow while running)
Stop on error option
Open script on error option
Run report with time taken for each script
Total duration time
Test DB connection
Asynchronous
.Net 4 & tested with SQL 2008
Single exe file
Kill the connection at any time
As far as I know, you can use the osql or sqlcmd commands to execute multiple SQL files. The drawback is that you will have to create a script for either command.
Using SQLCMD to Execute Multiple SQL Server Scripts
OSQL (This is for sql server 2000)
http://msdn.microsoft.com/en-us/library/aa213087(v=SQL.80).aspx
@echo off
cd C:\Program Files (x86)\MySQL\MySQL Workbench 6.0 CE
for %%a in (D:\abc\*.sql) do (
echo %%a
mysql --host=ip --port=3306 --user=uid --password=ped < %%a
)
Step 1: Copy the lines above into Notepad and save it as a .bat file.
Step 2: Put all the SQL files you want executed in the abc folder on the D: drive.
Step 3: Fill in your IP, user ID and password.
I know this question is more focused on SQL Server. I had the same question, but for PostgreSQL. The solution is very close for what I needed, so I thought I would share what I got for anyone that needs it:
for %f in (*.sql) do psql -U [username] -d [database name] --command="\i %f";
I ran this from the folder containing all of my sql scripts.
To avoid being prompted for a password, I had to add
*:*:*:[user]:[password]
to my pgpass.conf file that lives in
%APPDATA%\postgresql\
folder on Windows. I had to create the file myself.
You can create a single script that calls all the others.
Put the following into a batch file:
@echo off
echo.>"%~dp0all.sql"
for %%i in ("%~dp0"*.sql) do echo @"%%~fi" >> "%~dp0all.sql"
When you run that batch file it will create a new script named all.sql in the same directory where the batch file is located. It will look for all files with the extension .sql in the same directory where the batch file is located.
You can then run all scripts by using sqlplus user/pwd @all.sql (or extend the batch file to call sqlplus after creating the all.sql script).
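A hedged sketch of that extension (user/pwd is a placeholder; a guard is added so the generated all.sql does not list itself, and "exit" is piped in so SQL*Plus quits when the script finishes):
@echo off
REM rebuild all.sql from every other .sql file next to this batch file, then run it once with SQL*Plus
echo.>"%~dp0all.sql"
for %%i in ("%~dp0"*.sql) do if /i not "%%~nxi"=="all.sql" echo @"%%~fi" >> "%~dp0all.sql"
echo exit | sqlplus user/pwd @"%~dp0all.sql"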
To execute every SQL file in the same directory, use the following command:
ls | awk '{print "@"$0}' > all.sql
This command will create a single SQL file containing the name of every SQL file in the directory, each prefixed with "@".
After all.sql is created, simply execute it with SQL*Plus; this will run every SQL file listed in all.sql.
If you can use Interactive SQL:
1 - Create a .BAT file with this code:
@ECHO OFF
for %%G in (*.sql) do dbisql -c "uid=dba;pwd=XXXXXXXX;ServerName=INSERT-DB-NAME-HERE" %%G
pause
2 - Change the pwd and ServerName.
3 - Put the .BAT file in the folder that contains .SQL files and run it.

DOS command to execute all SQL script in a directory and subdirectories

I need a DOS command or a batch (.bat) file I can execute to run all the *.sql scripts in a directory and its subdirectories. What would the solution be?
The following will get you started
for /r %f in (*.sql) do echo %f
Run from the command line, this will print the names of all the SQL files in the current directory and all subdirectories.
Then substitute sqlcmd <connection args> -i%f for echo %f to execute the scripts.
Hope this helps.
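For example, a hedged version with placeholder connection details would be:
for /r %f in (*.sql) do sqlcmd -S <servername> -d <dbname> -E -i "%f"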
Here you go. This batch file will execute all SQL files in a directory and its subdirectories. It will also create an output.txt file with the results so you can see errors and whatnot. Some notes on the batch file:
[YourDatabase] is the name of the database you want to execute the scripts against.
[YourPath] is the path of where you keep all the scripts.
[YourServerName\YourInstanceName] is the SQL server name and instance name, separated with a '\'
You'll want to replace the text after the '=' for each variable with whatever is appropriate for your server
Be sure NOT to put spaces around the '='
Do not put any quotes around [YourPath]
Make sure that [YourPath] has a '\' at the end
SET Database=[YourDatabase]
SET ScriptsPath=[YourPath]
SET ServerInstance=[YourServerName\YourInstanceName]
IF EXIST "%ScriptsPath%output.txt" del "%ScriptsPath%output.txt"
type NUL > "%ScriptsPath%output.txt"
FOR /R "%ScriptsPath%" %%G IN (*.sql) DO (
sqlcmd -d %Database% -S %ServerInstance% -i "%%G" -o "%%G.txt"
echo ..................................................................................... >> "%ScriptsPath%output.txt"
echo Executing: "%%G" >> "%ScriptsPath%output.txt"
echo ..................................................................................... >> "%ScriptsPath%output.txt"
copy "%ScriptsPath%output.txt"+"%%G.txt" "%ScriptsPath%output.txt"
del "%%G.txt"
)
for %f in ("c:\path\to\dir\*.sql") do sqlcmd -S [SERVER_NAME] -d [DATABASE_NAME] -i "%f" -b
Try a for loop. The options of this command have evolved and I'm not sure what version of DOS you are using, but assuming that DOS includes "cmd.exe from Windows XP", something like this could work:
for /r . %f in (*.sql) do @echo %f
Ok, this will only print the names of the files. I'm assuming you already have a program that you can run from the command line that will execute one SQL file, which you can use instead of echo.
For more information, try for /?.