Stored procedure execution - sql

I have written a stored procedure to be called via sqlcmd from a batch file. This is the code; for some reason it is not executing.
@ECHO OFF
SET /P FDate=Enter From Date:
SET /P TDate=Enter To Date:
ECHO sqlcmd -E -Q "dbo.SavingsAccountsAllDetail @FDate=N'%Param1%', @TDate=N'%Param2%'" -S OMNIDB-UAT -d HNBG_LOAN_TEST -o C:\SavingsAccountsAllDetailRepo.txt
SET Param1=
SET Param2=
Any thoughts?

Try adding EXEC before the procedure's name:
sqlcmd -E -Q "exec dbo.SavingsAccountsAllDetail @FDate=N'%Param1%', @TDate=N'%Param2%'" -S OMNIDB-UAT -d HNBG_LOAN_TEST -o C:\SavingsAccountsAllDetailRepo.txt
You can also try to pass parameters via the /v argument. More details are here:
sqlcmd -E -Q "exec dbo.SavingsAccountsAllDetail @FDate=N'$(Param1)', @TDate=N'$(Param2)'" /v Param1=%Param1% Param2=%Param2% -S OMNIDB-UAT -d HNBG_LOAN_TEST -o C:\SavingsAccountsAllDetailRepo.txt
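Note also that in the batch file from the question the sqlcmd line is prefixed with ECHO, so the command is only printed rather than executed, and the prompts fill FDate/TDate while the query references Param1/Param2, which are never set. A minimal corrected sketch, keeping the original server, database, and output path and using sqlcmd scripting variables (adjust names as needed):
@ECHO OFF
SET /P FDate=Enter From Date:
SET /P TDate=Enter To Date:
sqlcmd -E -S OMNIDB-UAT -d HNBG_LOAN_TEST -v FDate="%FDate%" TDate="%TDate%" -Q "EXEC dbo.SavingsAccountsAllDetail @FDate=N'$(FDate)', @TDate=N'$(TDate)'" -o C:\SavingsAccountsAllDetailRepo.txt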

Related

Unexpected argument executing cmdexec on a SQL job to export to CSV

I am trying to run this in a SQL job:
sqlcmd -S . -d CI_Reports -E -s"," -W -Q "SET NOCOUNT ON SELECT * FROM [dbo].[Table]" > D:\Test.csv
How can I fix this error?
Sqlcmd: '> D:\Test.csv': Unexpected argument.
Have you tried it like this?
sqlcmd -S . -d CI_Reports -E -s"," -W -Q "SET NOCOUNT ON SELECT * FROM [dbo].[Table]" -o D:\Test.csv
where -o output_file identifies the file that receives output from sqlcmd.
Additionally, you could try BCP, which is best suited for bulk copying data between an instance of Microsoft SQL Server and a data file in a user-specified format.
Read more here.
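For example, a rough bcp sketch under the same assumptions as the question (local default instance, CI_Reports database, trusted connection, comma-separated output):
bcp "SELECT * FROM CI_Reports.dbo.[Table]" queryout D:\Test.csv -S . -T -c -t,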

Stopping executing of SQL scripts after error occurs

I am executing multiple SQL scripts using a batch file, but now I want to stop execution of the batch file if an error occurs in any one of the scripts.
This is my batch file:
sqlcmd -S dbdev026b\dbdev026b -i c:Test1.sql -o c:\o1.txt
sqlcmd -S dbdev026b\dbdev026b -i c:Test2.sql -o c:\o2.txt
You can chain a conditional exit onto each line; the || operator runs its right-hand side only when the preceding command fails with a non-zero exit code:
sqlcmd -S dbdev026b\dbdev026b -i c:Test1.sql -o c:\o1.txt || exit /b 1
sqlcmd -S dbdev026b\dbdev026b -i c:Test2.sql -o c:\o2.txt || exit /b 2
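Note that by default sqlcmd can still return 0 even when the T-SQL inside a script raises an error; the -b switch makes sqlcmd abort on error and return a non-zero ERRORLEVEL, which is what the || test reacts to. A sketch combining the two (server and file names taken from the question):
@echo off
sqlcmd -S dbdev026b\dbdev026b -b -i c:Test1.sql -o c:\o1.txt || exit /b 1
sqlcmd -S dbdev026b\dbdev026b -b -i c:Test2.sql -o c:\o2.txt || exit /b 2
echo All scripts completed successfully.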

Can I specify an input sql file with bcp?

How can I specify an input sql file with a long query when using bcp? I tried using the -i option but it keeps complaining about a command-line error with no extra information. Is this possible?
I had this problem today and found a convenient workaround, at least in an ad-hoc situation.
Temporary tables can be created by any user with connect permissions. This means you can also create GLOBAL temporary tables.
Just run your query in Enterprise Manager (or sqlcmd or whatever) using SELECT ... INTO with a global temporary table, e.g.
SELECT *
INTO ##mytemptable
FROM SomeTable
WHERE [massive where clause, for example]
You can then use the temporary table in the BCP query with a simple
SELECT * FROM ##mytemptable
Then drop the temp table through Enterprise Manager:
DROP TABLE ##mytemptable
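For reference, the bcp step in the middle might look like the following sketch (server name and output path are placeholders); keep the session that created ##mytemptable (e.g. the Management Studio window) open until bcp has finished, otherwise the global temp table is dropped early:
bcp "SELECT * FROM ##mytemptable" queryout C:\mytemptable.dat -S YOURSERVER -T -c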
I fixed it another way.
I created a batch file which reads a file and sends its content to the bcp command. See:
@ECHO off
SETLOCAL EnableDelayedExpansion
SET queryFile=%1
SET outFileName=%2
FOR /F "delims=" %%i IN (%queryFile%) DO SET join=!join! %%i
ECHO %join%
bcp "%join%" queryout %outFileName% /S.\SQLE_CAESAR /d /c /t"|" /T
That script receives two parameters:
the name of the file which contains the query;
the name of the file to export the data to.
Execute the script in cmd like this:
export-query.bat query.sql export.txt
I hope this helps.
As far as I can tell, the BCP utility only supports Transact-SQL queries written directly on the command line. For example:
bcp "SELECT Name FROM AdventureWorks.Sales.Currency" queryout Currency.Name.dat -T -c
According to its reference, the "-i" option:
Specifies the name of a response file, containing the responses to the command prompt questions for each data field when a bulk copy is being performed using interactive mode (-n, -c, -w, or -N not specified).
Notice that it differs from the sqlcmd Utility "-i" option:
Identifies the file that contains a batch of SQL statements or stored procedures. Multiple files may be specified that will be read and processed in order (...)
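So if the goal is simply to run a long query that lives in a file and write the results to a file, sqlcmd's -i switch may be the simpler route; a rough sketch with placeholder server and file names:
sqlcmd -S YOURSERVER -E -i c:\queries\longquery.sql -s"," -W -o c:\out\result.csv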
Try:
query=$( cat < /file.sql )
export query
bcp "${query}" queryout /home/file.csv
Multi-line queries can be given to bcp easily using PowerShell:
PS> $query = @'
select *
from <table>
'@
PS> bcp $query queryout <outfile> -d <database> -T -S <server> -c
I faced the same issue. It may not be a very good approach, but I did something like the following:
bcp "declare #query nvarchar(max) set #query = (SELECT * FROM OPENROWSET(BULK 'F:\tasks\report_v2.sql', SINGLE_CLOB) AS Contents) exec sp_executesql #query" queryout %outFileName% /c /C RAW -S . -U sa -P 123 -d blog /T
And I must say: if you use a global temp table, it is dropped automatically once the creating session ends (right after the query executes), so you can't use that approach in some situations.
What really worked for me is this:
@ECHO off
setlocal enableextensions enabledelayedexpansion
SET "queryFile=%1"
SET "outFileName=%2"
SET RESULT=
FOR /F "delims=" %%i IN ('type %queryFile%') DO SET RESULT=!RESULT! %%i
echo %RESULT%
rem bcp "%RESULT%" queryout %outFileName% -t^ -r \n -T -k -c -d DB_NAME -S SERVER_NAME
type file is the equivalent of cat file in Unix.
What I did with complex queries was create a stored procedure with the desired statement and call it from BCP:
bcp "exec db.schema.stored_procedure" queryout "c:\file.txt" -T -S localhost -t "|" -c
This worked great for me. Greetings!
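If it helps, the wrapper procedure itself can be created once with a plain sqlcmd call; a hypothetical sketch (database, procedure name, and the query inside it are placeholders):
sqlcmd -S localhost -d db -E -Q "CREATE PROCEDURE dbo.stored_procedure AS BEGIN SET NOCOUNT ON; SELECT name, create_date FROM sys.objects; END"
bcp "exec db.dbo.stored_procedure" queryout "c:\file.txt" -T -S localhost -t "|" -c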
I made my own script (called bulk.sh) to do this (not optimal and not best practice; the script is ugly, but very functional).
#!/bin/bash
# Split SQL_FILE.sql into chunks of 1000 lines each and run every chunk with sqlcmd.
input="SQL_FILE.sql"
count=0
const=1000
lines=()
mkdir -p bulk
while IFS= read -r line
do
    lines+=("$line")
    count=$((count+1))
    check=$((count % const))
    if [[ $check -eq 0 ]]; then
        bulk="${lines[*]}"
        unset lines
        number=$(printf "%010d" $count)
        echo $bulk > "bulk/bulk${number}.sql"
        bulk=""
    fi
done < "$input"
# Flush any remaining lines that did not fill a complete chunk.
if [[ ${#lines[@]} -gt 0 ]]; then
    number=$(printf "%010d" $count)
    echo "${lines[*]}" > "bulk/bulk${number}.sql"
fi
FILES="bulk/*"
for f in $FILES
do
    echo "Processing $f file..."
    sqlcmd -S SERVER -d DATABASE -U USER -P "PASSWORD" -i "$f"
    sleep 2s
done
You can try it with:
$ docker run -v /path/to/your/sql/file/folder:/backup -it mcr.microsoft.com/mssql-tools
$ bash bulk.sh

DOS batch file - loop and increment by 1

I have this batch file that logs into sql on a remote machine runs a stored procedure and then sends the output to a text file. I would like it to increment both the 3rd octet in the IP address and the name of the output text file by 1 and loop so I don't have to repeat the command over and over. Also, I would like it to stop when it reaches a certain number. Is there a way to do this?
sqlcmd -U user -P password -S 192.168.1.2 -i c:\sql\storecreditfix.sql -o c:\sql\ouput01.txt
sqlcmd -U user -P password -S 192.168.2.2 -i c:\sql\storecreditfix.sql -o c:\sql\ouput02.txt
sqlcmd -U user -P password -S 192.168.3.2 -i c:\sql\storecreditfix.sql -o c:\sql\ouput03.txt
sqlcmd -U user -P password -S 192.168.4.2 -i c:\sql\storecreditfix.sql -o c:\sql\ouput04.txt
sqlcmd -U user -P password -S 192.168.5.2 -i c:\sql\storecreditfix.sql -o c:\sql\ouput05.txt
This will do what you want. The /L switch tells FOR to act like a regular programming for loop: the first number is the starting value, the second is the step, and the third is the end value. So this will loop from 1 through 6 in steps of 1:
@echo off
FOR /L %%G IN (1, 1, 6) DO (
echo sqlcmd -U user -P password -S 192.168.%%G.2 -i c:\sql\storecreditfix.sql -o c:\sql\ouput0%%G.txt
)
And if you want to use a list of numbers instead of a range, you can leave off the /L and just list them, e.g.
FOR %%G IN (1,2,3,99,121) DO ...
Obviously the "echo" before sqlcmd is just for testing ;)
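If you also want the two-digit output file names from the question, one possible sketch pads the counter via delayed expansion (still echoing the command for testing):
@echo off
setlocal enabledelayedexpansion
FOR /L %%G IN (1, 1, 15) DO (
    set "n=0%%G"
    set "n=!n:~-2!"
    echo sqlcmd -U user -P password -S 192.168.%%G.2 -i c:\sql\storecreditfix.sql -o c:\sql\ouput!n!.txt
)
endlocal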
On the assumption that you're actually using cmd.exe rather than MS-DOS, one way to increment and test a variable is as follows:
@setlocal enableextensions enabledelayedexpansion
@echo off
set /a "i = 1"
:loop
if !i! leq 15 (
    if !i! lss 10 (
        echo sqlcmd -S 192.168.!i!.2 -o c:\sql\ouput0!i!.txt
    ) else (
        echo sqlcmd -S 192.168.!i!.2 -o c:\sql\ouput!i!.txt
    )
    set /a "i = i + 1"
    goto :loop
)
endlocal
This is a slightly modified version of what you need which echoes the relevant bits rather than executing them, and it outputs:
sqlcmd -S 192.168.1.2 -o c:\sql\ouput01.txt
sqlcmd -S 192.168.2.2 -o c:\sql\ouput02.txt
sqlcmd -S 192.168.3.2 -o c:\sql\ouput03.txt
sqlcmd -S 192.168.4.2 -o c:\sql\ouput04.txt
sqlcmd -S 192.168.5.2 -o c:\sql\ouput05.txt
sqlcmd -S 192.168.6.2 -o c:\sql\ouput06.txt
sqlcmd -S 192.168.7.2 -o c:\sql\ouput07.txt
sqlcmd -S 192.168.8.2 -o c:\sql\ouput08.txt
sqlcmd -S 192.168.9.2 -o c:\sql\ouput09.txt
sqlcmd -S 192.168.10.2 -o c:\sql\ouput10.txt
sqlcmd -S 192.168.11.2 -o c:\sql\ouput11.txt
sqlcmd -S 192.168.12.2 -o c:\sql\ouput12.txt
sqlcmd -S 192.168.13.2 -o c:\sql\ouput13.txt
sqlcmd -S 192.168.14.2 -o c:\sql\ouput14.txt
sqlcmd -S 192.168.15.2 -o c:\sql\ouput15.txt
Simply take the echo off the line and adjust the command to put back the other bits.
If you have access to cmd.exe then you also have access to cscript, which lets you write the script in JScript (JavaScript) instead of batch.

SQL Server 2008 automated database drop, create and fill

For the database in my project I have a drop/create script for the database, a script for creating tables and SPs and an Access 2003 .mdb file with some exported values.
To set up the database from scratch I can use SQL Server Management Studio to first run one script, then the other, and lastly run the somewhat tedious import task manually.
But I would like to make this as automated as possible. Hopefully something like putting the three files in a folder along with a fourth script to execute, looking something like:
run script "dropcreate.sql"
run script "createtables.sql"
import "values.mdb"
How is this done? I hope to avoid using SSIS and the like. The tricky part is of course the import of data, where I can't seem to find a simple way. It is also important that the files are left as they are and not embedded into anything.
:: DOC AT THE END
@ECHO OFF
::BOOM BOOM BOOM CHANGE THIS ONE WHEN YOU ARE INSTALLING A DIFFERENT DATABASE
SET DbName=CAS_DEV
ECHO FIRST CREATE A BACKUP OF ALL DATABASES ON THE DEFAULT INSTANCE:
ECHO CREATING THE LOG FILES
echo THIS IS THE ERROR LOG OF THE UPDATE OF THE %DbName% ON %DATE% >error.log
echo THIS IS THE INSTALL LOG OF THE UPDATE OF THE %DbName% ON %DATE% >install.log
ECHO STARTING BACKUP
CD .\0.BackUp
ECHO FOR EACH SQL FILE DO RUN IT THIS WILL TAKE A WHILE
ECHO SINCE WE ARE GOING TO MAKE A BACKUP FOR ALL THE DATABASES ON THE CURRENT HOST
for /f %%i in ('dir *.SQL /s /b /o') DO ECHO %DATE% --- %TIME% RUNNING %%i 1>>"..\install.log"&SQLCMD -U ysg -P pass -H hostname -d MASTER -t 30000 -w 80 -u -p 1 -b -i %%i -r1 1>> "..\install.log" 2>> "..\error.log"
ECHO GO ONE FOLDER UP
ECHO SLEEP FOR 1 SECOND
ping -n 1 127.0.0.1 >NUL
ECHO DONE WITH BACKUP GOING UP
cd ..
ECHO THE BACKUPS ARE IN THE FOLDER
ECHO D:\DATA\BACKUPS
ECHO CLICK A KEY TO CONTINUE
ECHO ========================================================================================================================
PAUSE
ECHO STARTING INSTALLING FUNCTIONS
CD ".\1.Functions"
ECHO FOR EACH SQL FILE DO RUN IT
ping -n 1 127.0.0.1 >NUL
for /f %%i in ('dir *.SQL /s /b /o') DO ECHO %DATE% --- %TIME% RUNNING %%i 1>>"..\install.log"&SQLCMD -U ysg -P pass -H hostname -d %DbName% -t 3000 -w 80 -u -p 1 -b -i "%%i" -r1 1>> "..\install.log" 2>> "..\error.log"
ECHO DONE WITH FUNCTIONS GOING UP
cd ..
ping -n 1 127.0.0.1 >NUL
ECHO HIT A KEY AFTER PAUSE
PAUSE
ECHO START TO EXECUTE THE MIXED FILES
CD .\1.Mixed
ECHO CREATING THE LOG FILES
echo. >>"..\error.log"
echo. >>"..\install.log"
ECHO FOR EACH SQL FILE DO RUN IT
for /f %%i in ('dir *.SQL /s /b /o') DO ECHO %DATE% --- %TIME% RUNNING %%i 1>>"..\install.log"&SQLCMD -U ysg -P pass -H hostname -d %DbName% -t 3000 -w 80 -u -p 1 -b -i %%i -r1 1>> "..\install.log" 2>> "..\error.log"
ECHO GO ONE FOLDER UP
cd ..
ECHO SLEEP FOR 1 SECOND
ping -n 1 127.0.0.1 >NUL
ECHO DONE WITH MIXED GOING UP
ECHO HIT A KEY AFTER PAUSE
PAUSE
ECHO STARTING INSTALLING TABLES
CD .\2.Tables
ECHO FOR EACH SQL FILE DO RUN IT
ping -n 1 127.0.0.1 >NUL
for /f %%i in ('dir *.SQL /s /b /o') DO ECHO %DATE% --- %TIME% RUNNING %%i 1>>"..\install.log"&SQLCMD -U ysg -P pass -H hostname -d %DbName% -t 3000 -w 80 -u -p 1 -b -i "%%i" -r1 1>> "..\install.log" 2>> "..\error.log"
ping -n 1 127.0.0.1 >NUL
ECHO DONE WITH TABLES GOING UP
cd ..
ping -n 1 127.0.0.1 >NUL
ECHO HIT A KEY AFTER PAUSE
PAUSE
ECHO STARTING INSTALLING Views
CD ".\3.Views"
ECHO FOR EACH SQL FILE DO RUN IT
ping -n 1 127.0.0.1 >NUL
for /f %%i in ('dir *.SQL /s /b /o') DO ECHO %DATE% --- %TIME% RUNNING %%i 1>>"..\install.log"&SQLCMD -U ysg -P pass -H hostname -d %DbName% -t 3000 -w 80 -u -p 1 -b -i "%%i" -r1 1>> "..\install.log" 2>> "..\error.log"
ECHO DONE WITH Views GOING UP
cd ..
ping -n 1 127.0.0.1 >NUL
ECHO HIT A KEY AFTER PAUSE
PAUSE
ECHO STARTING INSTALLING stored procedures
CD ".\5.StoredProcedures"
ECHO FOR EACH SQL FILE DO RUN IT
ping -n 1 127.0.0.1 >NUL
for /f %%i in ('dir *.SQL /s /b /o') DO ECHO %DATE% --- %TIME% RUNNING %%i 1>>"..\install.log"&SQLCMD -U ysg -P pass -H hostname -d %DbName% -t 3000 -w 80 -u -p 1 -b -i "%%i" -r1 1>> "..\install.log" 2>> "..\error.log"
ECHO DONE WITH STORED PROCEDURES GOING UP
cd ..
ping -n 1 127.0.0.1 >NUL
ECHO HIT A KEY AFTER PAUSE
PAUSE
ECHO STARTING INSTALLING Triggers
CD ".\6.Triggers"
ECHO FOR EACH SQL FILE DO RUN IT
ping -n 1 127.0.0.1 >NUL
for /f %%i in ('dir *.SQL /s /b /o') DO ECHO %DATE% --- %TIME% RUNNING %%i 1>>"..\install.log"&SQLCMD -U ysg -P pass -H hostname -d %DbName% -t 3000 -w 80 -u -p 1 -b -i "%%i" -r1 1>> "..\install.log" 2>> "..\error.log"
ping -n 1 127.0.0.1 >NUL
ECHO DONE WITH Triggers GOING UP
cd ..
ping -n 1 127.0.0.1 >NUL
ECHO HIT A KEY AFTER PAUSE
PAUSE
ECHO Please review the log files and send them back to Advanced Application Support
set mailadd= yordan.georgiev^@oxit.fi
:: WE USE THE "%cd%\bin\bmail.exe".EXE UTILITY TO SEND OURSELF AN E-MAIL CONTAINING THE TEXT FILE
:: ALTERNATIVE SMTP MIGHT BE company.com, UNCOMMENT THE NEXT LINE FOR ALTERN
::cmd /c "%cd%\bin\bmail.exe" -s company.com -m %computername%.txt -t %mailadd% -a %computername% -h
::"%cd%\bin\bmail.exe" -s smtp.company.com -m install.log -t yordan.georgiev#oxit.fi -a "POC 1.2 install log" -h
::"%cd%\bin\bmail.exe" -s smtp.company.com -m error.log -t yordan.georgiev#oxit.fi -a "POC 1.2 error log" -h
cmd /c start /max INSTALL.LOG
CMD /C start /MAX ERROR.LOG
echo DONE !!!
ECHO HIT A KEY TO EXIT
PAUSE
:: WE GO THROUGH ALL THE FOLDERS AND RUN THE SQL FILES IN ALPHABETICAL ORDER
You can run SQL Server Management Studio in SQLCMD mode. There you can run scripts as follows:
:r c:\temp\DropCreate.SQL
:r c:\temp\CreateTables.SQL
Alternatively, you can run the whole thing from a batch file using SQLCMD.exe commands:
SQLCMD -S "." -E -i "c:\temp\DropCreate.SQL"
SQLCMD -S "." -E -i "c:\temp\CreateTables.SQL"
Do you have an alternative to SSIS that can import the data for you? Usually to do any kind of transformations and loading you need error handling, lookups, etc that you will have to code yourself unless you use an off-the-shelf product.
You can read a lot about SSIS right here on SO.
We have a similar project (create db, load data, create code). We do all of this inside a Database Project - with Visual Studio Team System Edition 2008 and GDR2.