Unexpected argument executing CmdExec on a SQL job to export to CSV

I'm trying to run this in a SQL job:
sqlcmd -S . -d CI_Reports -E -s"," -W -Q "SET NOCOUNT ON SELECT * FROM [dbo].[Table]" > D:\Test.csv
It fails with this error:
Sqlcmd: '> D:\Test.csv': Unexpected argument.
How can I fix it?

Have you tried it like this?
sqlcmd -S . -d CI_Reports -E -s"," -W -Q "SET NOCOUNT ON SELECT * FROM [dbo].[Table]" -o D:\Test.csv
where -o output_file identifies the file that receives output from sqlcmd. The error most likely occurs because a CmdExec job step does not run the command through a shell, so the > redirection operator is passed to sqlcmd as a literal argument instead of being interpreted; -o avoids that.
Additionally, you could try BCP, which is best suited for bulk copying data between an instance of Microsoft SQL Server and a data file in a user-specified format.
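As a rough illustration, an equivalent BCP export might look like this (a sketch using the server, database, and table names from the question; the -T trusted-connection flag and comma terminator are assumptions):
bcp "SELECT * FROM CI_Reports.dbo.[Table]" queryout D:\Test.csv -S . -T -c -t,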

Related

SQL Server: batch with special characters in column names

I wrote a batch script with SQL statements in order to export data to a .csv file:
sqlcmd -S DBServer -U User -P Password -d DBName -s"," -Q "SET NOCOUNT on;SELECT <=0.1%, (0.1%,0.5%] FROM t" -o D:\output.csv ;
But the special characters in the column names <=0.1% and (0.1%,0.5%] break the batch file.
What is the correct approach for this SELECT statement?
Any help will be much appreciated.
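The thread does not include an answer, but the usual fix is to quote such identifiers in square brackets; note that a ] inside a bracketed identifier must be doubled. A sketch based on the command from the question (exact escaping of % may differ when this runs inside a batch file):
sqlcmd -S DBServer -U User -P Password -d DBName -s"," -Q "SET NOCOUNT ON; SELECT [<=0.1%], [(0.1%,0.5%]]] FROM t" -o D:\output.csv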

Shell script to load the SqlServer table data into csv File

I need a Unix shell script to load SQL Server table data into a csv file.
Could someone please share a sample shell script?
The following is working:
sqlcmd -S $SQLHOSTNAME -U $SQLUSERNAME -P $SQLPASSWORD -d $SQLDATABASE -s" " -W -w 3000 -Q "SET NOCOUNT ON; $query;" | sed 2d >$csv_filename
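For reference, here is that one-liner wrapped in a minimal self-contained script (all connection values and the query are placeholders to substitute; sed 2d deletes the dashed underline row that sqlcmd prints beneath the header):
#!/bin/bash
# Placeholder connection settings - substitute your own.
SQLHOSTNAME="myhost"
SQLUSERNAME="myuser"
SQLPASSWORD="mypassword"
SQLDATABASE="mydb"
query="SELECT * FROM dbo.MyTable"
csv_filename="output.csv"

# -W trims trailing spaces; -s sets the column separator (use -s"," for a true CSV).
sqlcmd -S "$SQLHOSTNAME" -U "$SQLUSERNAME" -P "$SQLPASSWORD" -d "$SQLDATABASE" -s" " -W -w 3000 -Q "SET NOCOUNT ON; $query;" | sed 2d > "$csv_filename"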

Import CSV from Linux to Azure SQL Server

I have an Azure SQL Server database and a Linux box. I have a csv file on the Linux machine that I want to import into SQL Server, and I have already created the table to import it into.
Why does this command return "Unknown argument: -U"?
bcp table in ~/test.csv -U myUsername -S databaseServerName -d dbName -q -c -t
When I rearrange the arguments passed to bcp as below, it returns "Unknown argument: -S":
bcp table in ~/test.csv -S databaseServerName -d dbName -U myUsername -q -c -t
So contrary to the documentation:
https://learn.microsoft.com/en-us/sql/tools/bcp-utility?redirectedfrom=MSDN&view=sql-server-2017#U
I hit issues where bcp does not like spaces after the argument names.
https://granadacoder.wordpress.com/2009/12/22/bcp-export/
quote from the article above:
The other syntax sugar is that there is no space after the -S argument, as seen below:
-SMyServerName\MyInstanceName
bcp.exe "SELECT cast(LastName as char(50)), cast(FirstName as char(50)), cast(MiddleName as char(50)), cast(Suffix as char(50)) FROM MyAdventureWorksDB.Person.Person ORDER BY NEWID()" queryout PeopleRock.txt -c -t -T -SMyServerName\MyInstanceName
Also see:
https://www.easysoft.com/products/data_access/odbc-sql-server-driver/bulk-copy.html#importing-data-table
Check your syntax sugar on Linux (the example below is from the Easysoft link above):
./bcp AdventureWorks.HumanResources.myTeam in ~/myTeam.csv \
-f ~/myTeam.Fmt -U mydomain\myuser -S mymachine\sqlexpress
Note that the above uses dbname.schemaname.tablename (before the "in" keyword).
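Applied to the command from the question, the only change is removing the spaces after the flags; whether a bare table name resolves depends on your default schema, so fully qualifying it as dbName.dbo.table (as in the Easysoft example) is the safer form:
bcp table in ~/test.csv -SdatabaseServerName -ddbName -UmyUsername -q -c -t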

Can I specify an input sql file with bcp?

How can I specify an input sql file with a long query when using bcp? I tried using the -i option but it keeps complaining about a command-line error with no extra information. Is this possible?
I had this problem today and found a convenient workaround, at least in an ad-hoc situation.
Temporary tables can be created by any user with connect permissions, which means you can also create GLOBAL temporary tables.
Just run your query in Enterprise Manager (or sqlcmd or whatever) using SELECT ... INTO with a global temporary table, e.g.
SELECT *
INTO ##mytemptable
FROM SomeTable
WHERE [massive where clause, for example]
You can then use the temporary table in the BCP query with a simple
SELECT * FROM ##mytemptable
Then drop the temp table through Enterprise Manager:
DROP TABLE ##mytemptable
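Putting the pieces together, the BCP step could then be as simple as this (the output path, server, and trusted-connection flag are assumptions). Note that the session that created ##mytemptable must still be open, since global temp tables are dropped when their creating session ends; a later answer points out the same pitfall:
bcp "SELECT * FROM ##mytemptable" queryout "c:\file.csv" -T -S localhost -c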
I fixed it a different way.
I created a batch file which reads a file and sends its content to the bcp command. See:
@ECHO off
SETLOCAL EnableDelayedExpansion
SET queryFile=%1
SET outFileName=%2
rem Join every line of the query file into a single variable.
FOR /F "delims=" %%i IN (%queryFile%) DO SET join=!join! %%i
ECHO %join%
bcp "%join%" queryout %outFileName% /S.\SQLE_CAESAR /d /c /t"|" /T
That script receives two parameters:
the filename which contains the query;
the filename for the exported data.
Execute the script in cmd like this:
export-query.bat query.sql export.txt
I hope that helped.
As far as I can tell, the BCP utility only supports Transact-SQL queries written directly on the command line. For example:
bcp "SELECT Name FROM AdventureWorks.Sales.Currency" queryout Currency.Name.dat -T -c
According to its reference, the -i option:
Specifies the name of a response file, containing the responses to the command prompt questions for each data field when a bulk copy is being performed using interactive mode (-n, -c, -w, or -N not specified).
Notice that it differs from the sqlcmd utility's -i option:
Identifies the file that contains a batch of SQL statements or stored procedures. Multiple files may be specified that will be read and processed in order (...)
Try:
query=$( cat < /file.sql )
export query
bcp "${query}" queryout /home/file.csv
Multi-line queries can be given to bcp easily using powershell:
PS> $query = @'
select *
from <table>
'@
PS> bcp $query queryout <outfile> -d <database> -T -S <server> -c
I faced the same issue; it may not be a very good approach, but I did something like the following:
bcp "declare @query nvarchar(max) set @query = (SELECT * FROM OPENROWSET(BULK 'F:\tasks\report_v2.sql', SINGLE_CLOB) AS Contents) exec sp_executesql @query" queryout %outFileName% /c /C RAW -S . -U sa -P 123 -d blog /T
And I must say, if you use a global temp table, it is dropped automatically after the query executes, so you can't use that approach in some situations.
What really worked for me is this:
@ECHO off
setlocal enableextensions enabledelayedexpansion
SET "queryFile=%1"
SET "outFileName=%2"
SET RESULT=
rem Join every line of the query file into a single string.
FOR /F "delims=" %%i IN ('type %queryFile%') DO SET RESULT=!RESULT! %%i
echo %RESULT%
rem bcp "%RESULT%" queryout %outFileName% -t^ -r \n -T -k -c -d DB_NAME -S SERVER_NAME
type file is the batch equivalent of cat file in Unix.
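Usage follows the same two-argument pattern as the earlier batch script, once the rem in front of the bcp line is removed (the script name here is hypothetical):
export-query.bat query.sql export.txt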
What I did with complex queries was to create a stored procedure with the desired statement and call it from BCP:
bcp "exec db.schema.stored_procedure" queryout "c:\file.txt" -T -S localhost -t "|" -c
This worked great for me. Greetings!
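For completeness, a minimal sketch of such a stored procedure (the answer does not show its definition, so the names and the query here are hypothetical):
-- wraps the long query so bcp only needs a short exec string
CREATE PROCEDURE dbo.stored_procedure
AS
BEGIN
    SET NOCOUNT ON;
    SELECT Name FROM Sales.Currency; -- replace with the complex query
END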
I made my own script (called bulk.sh) to do this. It's not optimal and not best practice... the script is ugly, but very functional.
#!/bin/bash
# Split SQL_FILE.sql into 1000-line chunks under bulk/ and run each chunk with sqlcmd.
input="SQL_FILE.sql"
count=0
const=1000
lines=()
mkdir -p bulk
while IFS= read -r line
do
    lines+=("$line")
    count=$((count+1))
    check=$((count % const))
    if [[ $check -eq 0 ]]; then
        bulk="${lines[*]}"
        unset lines
        number=$(printf "%010d" $count)
        echo "$bulk" > "bulk/bulk${number}.sql"
        bulk=""
    fi
done < "$input"
# Flush the final partial chunk (the original script silently dropped it).
if [[ ${#lines[@]} -gt 0 ]]; then
    number=$(printf "%010d" $count)
    echo "${lines[*]}" > "bulk/bulk${number}.sql"
fi
FILES="bulk/*"
for f in $FILES
do
    echo "Processing $f file..."
    sqlcmd -S SERVER -d DATABASE -U USER -P "PASSWORD" -i "$f"
    sleep 2s
done
You can try it with:
$ docker run -v /path/to/your/sql/file/folder:/backup -it mcr.microsoft.com/mssql-tools
$ bash bulk.sh

How to export query results to csv in Microsoft SQL Server Management Studio?

Trying to export a custom query to a csv file, I wrote the following command:
sqlcmd [-S myserver -d mydb -E -Q "SELECT column1 ,column_date, DATENAME(WEEKDAY, column_date) AS day_of_week ,distinc_events_count ,total_events_count ,event_duration FROM dbo.event_daily_stats ORDER BY column1" -o "D:\MyData.csv" -h-1 -s"," -w 700]
but it returned the following error message:
The identifier that starts with '-S myserver -d mydb -E -Q "SELECT column1 ,column_date, DATENAME(WEEKDAY, column_date) AS day_of_week ,distinc_events_count ,' is too long. Maximum length is 128.
Does anyone know how this issue could be solved?
Thank you!
I executed the command without the "[" and "]" with no problem. Have you tried it this way?
sqlcmd -S myserver -d mydb -E -Q "SELECT column1 ,column_date, DATENAME(WEEKDAY, column_date) AS day_of_week ,distinc_events_count ,total_events_count ,event_duration FROM dbo.event_daily_stats ORDER BY column1" -o "D:\MyData.csv" -h-1 -s"," -w 700
I think the problem is that you're trying to run this inside of SSMS when it should be run at a command prompt instead.