Import CSV from Linux to Azure SQL Server

I have an Azure SQL Server database and a Linux box. There is a CSV file on the Linux machine that I want to import into SQL Server, and I have already created the table that will receive it.
Why does this command return "Unknown argument: -U"?
bcp table in ~/test.csv -U myUsername -S databaseServerName -d dbName -q -c -t
When I rearrange the arguments passed to bcp as below, it returns "Unknown argument: -S" instead:
bcp table in ~/test.csv -S databaseServerName -d dbName -U myUsername -q -c -t

So, contrary to the documentation at
https://learn.microsoft.com/en-us/sql/tools/bcp-utility?redirectedfrom=MSDN&view=sql-server-2017#U
I hit issues where bcp does not accept a space between an argument name and its value.
https://granadacoder.wordpress.com/2009/12/22/bcp-export/
A quote from the article above:
The other syntax sugar is that there is no space after the -S argument. As seen below:
-SMyServerName\MyInstanceName
bcp.exe "SELECT cast(LastName as char(50)), cast(FirstName as char(50)), cast(MiddleName as char(50)), cast(Suffix as char(50)) FROM MyAdventureWorksDB.Person.Person ORDER BY NEWID()" queryout PeopleRock.txt -c -t -T -SMyServerName\MyInstanceName
Also see:
https://www.easysoft.com/products/data_access/odbc-sql-server-driver/bulk-copy.html#importing-data-table
Check your syntax on Linux as well (the example below is from the Easysoft link above):
./bcp AdventureWorks.HumanResources.myTeam in ~/myTeam.csv \
-f ~/myTeam.Fmt -U mydomain\myuser -S mymachine\sqlexpress
Note that the example uses the fully qualified dbname.schemaname.tablename before the "in" keyword.
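Putting the two points together, a corrected version of the original command would look roughly like the sketch below. This assumes the target table lives in the dbo schema and that the CSV is comma-delimited (the -t, terminator); adjust both to your actual setup, and bcp will prompt for the password since -P is not supplied:
bcp dbName.dbo.table in ~/test.csv -SdatabaseServerName -UmyUsername -q -c -t,
With the three-part table name, the -d argument is no longer needed.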

Related

SQL Server: batch with special characters in column names

I wrote a batch script with SQL statements in order to export data to a .csv file:
sqlcmd -S DBServer -U User -P Password -d DBName -s","
-Q "SET NOCOUNT on;SELECT <=0.1%, (0.1%,0.5%] FROM t" -o D:\output.csv ;
But the special characters in the column names <=0.1% and (0.1%,0.5%] keep the batch file from working.
What should be the correct approach for this select statement?
Any help will be much appreciated.
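No answer is included here, but a common approach is to wrap each problem column name in square brackets (doubling the literal ] inside the second name, since ] is the closing delimiter) and, because this runs from a batch file, to double every literal % so cmd does not try to expand it as a variable. A sketch under those assumptions, to be verified against your environment:
sqlcmd -S DBServer -U User -P Password -d DBName -s"," -Q "SET NOCOUNT ON; SELECT [<=0.1%%], [(0.1%%,0.5%%]]] FROM t" -o D:\output.csv
If you run the same command interactively rather than from a .bat file, use single % signs.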

Shell script to load SQL Server table data into a CSV file

I need a Unix shell script to load SQL Server table data into a CSV file.
Could someone please share a sample shell script?
The following is working:
sqlcmd -S $SQLHOSTNAME -U $SQLUSERNAME -P $SQLPASSWORD -d $SQLDATABASE -s" " -W -w 3000 -Q "SET NOCOUNT ON; $query;" | sed 2d >$csv_filename
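For context, a minimal wrapper script around that one-liner might look like the sketch below. The variable values are placeholders to replace with your own, and sed 2d simply strips the dashed underline row that sqlcmd prints beneath the column headers:
#!/bin/sh
SQLHOSTNAME="myserver.example.com"   # placeholder server name
SQLUSERNAME="myUser"                 # placeholder login
SQLPASSWORD="myPassword"             # placeholder password
SQLDATABASE="myDatabase"             # placeholder database
query="SELECT * FROM dbo.myTable"    # placeholder query
csv_filename="/tmp/output.csv"       # placeholder output path
sqlcmd -S $SQLHOSTNAME -U $SQLUSERNAME -P $SQLPASSWORD -d $SQLDATABASE -s" " -W -w 3000 -Q "SET NOCOUNT ON; $query;" | sed 2d > $csv_filename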

How can I specify the column order while bulk copying (BCP) out tables from SYBASE_IQ?

Below is the code I am using in the batch script for the BCP process:
call bcp.exe DBName.tablename out FILENAME.csv -e FILENAMEerr.txt -c -t"|" -U USER DETAILS -P PASSWORD -S Servicename -r"\n"
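No answer is quoted here, but since bcp copies columns in the order they are stored, one common workaround is to create a view that selects the columns in the order you want and bcp the view out instead of the table. This is only a sketch: the view name and col1/col2/col3 are placeholders, and it assumes your bcp version can copy out from a view and that you are allowed to create one in the database.
Run once in the database (placeholder column order):
create view ordered_tablename as select col3, col1, col2 from tablename
Then in the batch script:
call bcp.exe DBName.ordered_tablename out FILENAME.csv -e FILENAMEerr.txt -c -t"|" -U USERNAME -P PASSWORD -S Servicename -r"\n"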

Unexpected argument executing cmdexec on a SQL job to export to CSV

I am trying to run this in a SQL job:
sqlcmd -S . -d CI_Reports -E -s"," -W -Q "SET NOCOUNT ON SELECT * FROM [dbo].[Table]" > D:\Test.csv
How can I fix this error?
Sqlcmd: '> D:\Test.csv': Unexpected argument.
Have you tried it like this?
sqlcmd -S . -d CI_Reports -E -s"," -W -Q "SET NOCOUNT ON SELECT * FROM [dbo].[Table]" -o D:\Test.csv
Here -o output_file identifies the file that receives output from sqlcmd.
Additionally, you could try BCP, which is best suited for bulk copying data between an instance of Microsoft SQL Server and a data file in a user-specified format.
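As a rough sketch of the BCP route for the same export (assuming the job step runs on the database server and can use Windows authentication via -T; substitute -U/-P otherwise):
bcp CI_Reports.dbo.Table out D:\Test.csv -c -t"," -S . -T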

How to execute Postgres SQL queries from a batch file?

I need to execute SQL from a batch file.
I am executing the following to connect to Postgres and select data from a table:
C:/pgsql/bin/psql -h %DB_HOST% -p 5432 -U %DB_USER% -d %DB_NAME%
select * from test;
I am able to connect to the database; however, I'm getting the error:
'select' is not recognized as an internal or external command,
operable program or batch file.
Has anyone faced this issue?
This is one of the queries I am trying; something similar works in a shell script (please ignore any syntax errors in the query):
copy testdata (col1,col2,col3) from '%filepath%/%csv_file%' with csv;
You could pipe it into psql:
(
echo select * from test;
) | C:/pgsql/bin/psql -h %DB_HOST% -p 5432 -U %DB_USER% -d %DB_NAME%
When closing parentheses are part of the SQL query, they have to be escaped with three carets:
(
echo insert into testconfig(testid,scenarioid,testname ^^^) values( 1,1,'asdf'^^^);
) | psql -h %DB_HOST% -p 5432 -U %DB_USER% -d %DB_NAME%
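Applied to the copy statement from the question, the echo-pipe approach would look something like the sketch below; the closing parenthesis of the column list gets the same triple-caret escape. Keep in mind that server-side COPY reads the file on the database server, so if the CSV sits on the client machine you would use psql's \copy meta-command instead:
(
echo copy testdata (col1,col2,col3^^^) from '%filepath%/%csv_file%' with csv;
) | psql -h %DB_HOST% -p 5432 -U %DB_USER% -d %DB_NAME%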
Use the -f parameter to pass the name of the file containing your SQL:
C:/pgsql/bin/psql -h %DB_HOST% -p 5432 -U %DB_USER% -d %DB_NAME% -f 'sql_batch_file.sql'
http://www.postgresql.org/docs/current/static/app-psql.html
-f filename
--file=filename
Use the file filename as the source of commands instead of reading commands interactively. After the file is processed, psql terminates. This is in many ways equivalent to the meta-command \i.
If filename is - (hyphen), then standard input is read until an EOF indication or \q meta-command. Note however that Readline is not used in this case (much as if -n had been specified).
If running on Linux, this is what worked for me (update the values below with your user, database name, etc.):
psql "host=YOUR_HOST port=YOUR_PORT dbname=YOUR_DB_NAME user=YOUR_USER_NAME password=YOUR_PASSWORD" -f "fully_qualified_path_to_your_script.sql"
You cannot put the query on a separate line; the batch interpreter will assume it's another command instead of a query for psql. I believe you will need to quote it as well.
I agree with Spidey:
1] If you are passing a file with the SQL, use the -f or --file parameter.
When you want to execute several commands, the best way is to add the -f parameter and then just type the path to your file without any " or ' marks (relative paths also work):
psql -h %host% -p 5432 -U %user% -d %dbname% -f ..\..\folder\Data.txt
It also works in .NET Core; I needed it to add basic data to my database after migrations.
Kindly refer to the documentation:
1] If you are passing a file with the SQL, use the -f or --file parameter.
2] If you are passing an individual command, use the -c or --command parameter.
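For completeness, a quick sketch of the -c form using the query and connection variables from the question:
psql -h %DB_HOST% -p 5432 -U %DB_USER% -d %DB_NAME% -c "select * from test;"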
If you are using a shell script:
psql postgresql://$username:$password@$host/$database < /app/sql_script/script.sql