Exporting SQL Table into a CSV file using Windows Batch Script

I am trying to create a Windows batch file to export data from an SQL file to a CSV file.
I have an SQL file at %MYHOME%\database\NET-DB.sql which contains data like this:
NET-DB.sql
insert into net_network (id, a_id, alias, address, domain, mask) values('NET_10.10.1.0_10', 1, 'Local Network', '10.10.1.0', '', '255.255.252.0');
What I have tried so far to export the data from the net_network table into a CSV file from my .bat file is this command:
export.bat
if not exist "%MYHOME%\net\NUL" mkdir "%MYHOME%\net"
COPY net_network TO '%MYHOME%\net\CSV-EXPORT_FILE.csv' DELIMITER ',' CSV HEADER;
pause
Since that does not work for me, what should be the correct approach for this implementation? Any help will be much appreciated.

Use SQLCMD
You need to modify the code to make it work in your environment, but here goes.
if not exist "%MYHOME%\net\NUL" mkdir "%MYHOME%\net"
cd "C:\your path\to\sqlcmd"
sqlcmd -S YourDBServer -d DB_NAME -E -Q "select id, a_id, alias, address, domain, mask from net_network" -o "CSV-EXPORT-FILE.csv" -s"," -w 255
Some explanations:
-S The database server to connect to.
-d Name of the database to connect to.
-Q Query to run, can also be insert, delete, update etc.
-o select the output file
-s"," separated by comma
-w column width; this has to be at least as wide as your widest column's contents.
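Putting it all together, a minimal export.bat might look like this (a sketch; YourDBServer and DB_NAME are placeholders you would replace with your own server and database names):
@echo off
rem Create the output folder if it does not exist yet
if not exist "%MYHOME%\net\NUL" mkdir "%MYHOME%\net"
rem Export the table using Windows authentication (-E), comma separators and 255-character columns
sqlcmd -S YourDBServer -d DB_NAME -E -Q "select id, a_id, alias, address, domain, mask from net_network" -o "%MYHOME%\net\CSV-EXPORT-FILE.csv" -s"," -w 255
pause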

Related

How can I run a .sql file that has multiple .sql files inside using SnowSQL?

I want to figure out how to run multiple SQL files in one go. Suppose I have this test.sql file, which references file1.sql, file2.sql, file3.sql and so on, along with some DML/DDL:
use database &{db};
use schema &{sc};
file1.sql;
file2.sql;
file3.sql;
create table snow_test1
(
name varchar
,add1 varchar
,id number
)
comment = 'this is snowsql testing table' ;
desc table snow_test1;
insert into snow_test1
values('prachi', 'testing', 1);
select * from snow_test1;
Here is what I run in PowerShell:
snowsql -c pp_conn -f ...\test.sql -D db=tbc -D sc=testing;
Is there any way to do this? I know it is possible in Oracle, but I want to do this using SnowSQL. Please guide me. Thanks in advance!
You can run multiple files in a single call:
snowsql -c pp_conn -f file1.sql -f file2.sql -f file3.sql -D db=tbc -D sc=testing;
You might need to put the additional DML statements in a file.
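For example, you could move the create/insert/select statements from the question into a fourth file and pass it last (a sketch; file4.sql is a hypothetical name for that file):
snowsql -c pp_conn -f file1.sql -f file2.sql -f file3.sql -f file4.sql -D db=tbc -D sc=testing;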
I have tried referencing the .sql files with !source inside my test.sql file, and it's working:
!source file1.sql;
!source file2.sql;
!source file3.sql;
....
Then run the same command in PowerShell with the one wrapper .sql file, and it works.
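Putting it together, the wrapper test.sql could mix the !source directives with the inline DDL/DML from the question (a sketch based on the files above):
use database &{db};
use schema &{sc};
!source file1.sql;
!source file2.sql;
!source file3.sql;
create table snow_test1 (name varchar, add1 varchar, id number) comment = 'this is snowsql testing table';
insert into snow_test1 values('prachi', 'testing', 1);
select * from snow_test1;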

SQL Server BCP Empty File

I'm trying to use bcp to query out a comma-separated-value file but each time I get an empty file.
Here's my bcp command:
bcp "SELECT * FROM ##OutAK " QUERYOUT D:\Outbound\raw\li14090413.raw -c -T -t -S DB1
I have verified that ##OutAK is NOT empty, because select count(*) from ##OutAK is not 0. When I open the file in a hex editor, I see the following:
0D 0A
I found the problem. It seems BCP is "allergic" to NULL. So, I just wrapped all the nullable fields in ISNULL() and the output file is back to normal now.
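For example, instead of SELECT *, list the columns and wrap each nullable one in ISNULL() (a sketch; Field1 and Field2 are hypothetical column names, substitute your own):
bcp "SELECT ISNULL(Field1, ''), ISNULL(Field2, '') FROM ##OutAK" QUERYOUT D:\Outbound\raw\li14090413.raw -c -T -S DB1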

bcp utility asking me to enter parameters I do not understand

I am using BCP to export data from a SQL Server 2008 R2 database named Health and a table named Patient. The output of the query should be saved in a text file, ApplicantsName.txt, located at:
C:\Users\meuser\Desktop
After running the following query on the command prompt:
bcp "Select FirstName,LastName,PatientNumber from Health.dbo.Patient order by FirstName" queryout "C:\Users\meuser\Desktop ApplicantsName.txt" -C -T
It prompted me this:
Enter the file storage type of field FirstName [char]: varchar
and then this:
Enter prefix-length of field FirstName [2]: FirstName
I have been entering some values, but I think it would be best to understand how this works. After some research on the internet, I know that the bcp utility is one of the fastest ways to export or import data between an instance and a file. I followed exactly the samples provided by MS here, but I think I need some practical explanation. Can someone guide me through this? A little explanation or a relevant reference will be appreciated too.
@one angry researcher's solution of adding '-C RAW' did not work in my particular case, but adding lower-case '-c' did. It performs the operation using a character data type.
For instance:
bcp mydb.mytable out c:/temp/data.txt -T -c
You need to add a value for the -C parameter (capital C!). If you do not know what you are using it for, you probably won't be needing it and can omit it.
Refer to the official documentation: http://msdn.microsoft.com/en-us/library/ms162802.aspx
edit: you could, for example, use
bcp "Select FirstName,LastName,PatientNumber from Health.dbo.Patient order by FirstName" queryout "C:\Users\meuser\Desktop\ApplicantsName.txt" -C RAW -T
You will need to fix your output directory too (seems you forgot a backslash there).
Here's a sample bcp command with a query and credentials (parameters):
bcp "SELECT * from yourtable" queryout c:\StockItemTransactionID_c.txt -c -Uusername -Pdbpassword -Sinstance -dYourDBName
Note: -U -P -S are case sensitive.

How to export a table row in MS SQL Server to a file so that an import to a different location is possible

I have a sessions table which contains:
sesssion_key (PK, bigint, not null),
created (datetime, not null),
content (image, null)
Now I need to export such a session to a file so that I can import it into another MSSQL instance.
Does someone know how this can be done?
Are the instances on the same server?
If so, just: SELECT * INTO newDB.sessions FROM olddb.sessions;
If they are not on the same server, go with either of Mithrandir's suggestions.
You could use the import/export wizard from SSMS or the command-line tool bcp.
A bcp session might look like this:
Export data to a file:
bcp database.schema.table out outputfile -S source_server -T -n
or
bcp "SELECT sesssion_key, created, content FROM databasae.schema.table WHERE sesssion_key = 0000001" queryout outputfile -S source_server -T -n
Then you can import the data to another server with the same table:
bcp database.schema.table in inputfile -S destination_server -T -n

Execute SQL from file in bash

I'm trying to load SQL from a file in bash and execute the loaded SQL. The SQL file needs to stay versatile, meaning it cannot be altered just to make things easy when run from bash (for example by escaping special characters like *).
So I have run into some problems:
If I read my sample.sql
SELECT * FROM SAMPLETABLE
to a variable with
ab=`cat sample.sql`
and execute it
db2 `echo $ab`
I receive an SQL error, because the * has been expanded by the shell into the names of all the files in the current directory.
An easy solution would be to escape the * as \*. But I cannot do this, because the file needs to stay executable in programs like DB Visualizer etc.
Could someone give me hint in the right direction?
The DB2 command line processor has options that accept a filename as input, so you shouldn't need to load statements from a text file into a shell variable.
This command will execute all SQL statements in the file, with newline treated as the statement terminator:
db2 -f sample.sql
This command will execute all SQL statements in the file, with semicolon treated as the statement terminator:
db2 -t -f sample.sql
Other useful CLP flags are:
-x : Suppress the column headings
-v : Echo the statement text immediately before execution
-z : Tee a copy of all CLP output to the filename immediately following this flag
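For example, to run a semicolon-terminated script, echo each statement as it executes, and keep a copy of the output in a log file (a sketch; output.log is a placeholder filename):
db2 -t -v -f sample.sql -z output.log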
Redirect stdin from the file.
db2 < sample.sql
In case you have a variable used in your script and want it replaced by the shell before the SQL is executed in DB2, use this approach:
Contents of File.sql:
cat <<xEOF
insert into ${MY_SCHEMA}.${MY_TABLE} values(1,2);
select * from ${MY_SCHEMA}.${MY_TABLE};
xEOF
In the shell, do:
export MY_SCHEMA='STAR'
export MY_TABLE='DIMENSION'
Then you are all good to get it executed in DB2:
sh File.sql | db2 +p -t
The shell will replace the exported variables and then DB2 will execute the statements.
Hope it helps.