how to export sql output to excel with text format - sql

I have a requirement to export SQL output data to an Excel sheet. To achieve this I am executing the below query:
xp_cmdshell 'sqlcmd -S -U aaaa -P bbbbb -d dbao -Q "set nocount on; select * from table_tmp" -s "," -o D:\temp_table.xls'
The query executes fine, but the issue I am facing is that I am losing the exact format of the columns.
E.g., 0000012345 is stored as 12345 (since it is a numeric value, xls by default removes the preceding zeros).
Is there a way to load the data into Excel with the format as text?
Thanks.
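One common workaround (a sketch only, not from the original post; the column name col1 is illustrative) is to emit the value wrapped as ="...", which makes Excel treat it as text and keep the leading zeros when the exported file is opened:
-- Assumes the leading zeros are present in the stored value (e.g. a char/varchar column)
select '="' + cast(col1 as varchar(20)) + '"' as col1
from table_tmp;
The same SELECT can go inside the -Q "..." argument of the sqlcmd call above, though the embedded double quotes then need escaping on the command line.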

Related

Creating a Format File for Bulk Import

I am trying to create a format file to bulk import a .csv file, but I am getting an error.
The command I used:
"BCP -SMSSQLSERVER01.[Internal_Checks].[Jan_Flat] format out -fC:\Desktop\exported data\Jan_FlatFormat.fmt -c -T -Uasda -SMSSQLSERVER01 -PPASSWORD"
I am getting an error
"A valid table name is required for in, out, or format options."
Can anyone suggest what needs to be done?
According to the bcp Utility documentation the first parameter should be a [Database.]Schema.{Table | View | "query"}, so don't put -SMSSQLSERVER01 where you've got it. Also use format nul instead of format out.
Try using:
bcp.exe [Internal_Checks].[Jan_Flat] format nul "-fC:\Desktop\exported data\Jan_FlatFormat.fmt" -c -SMSSQLSERVER01 -T -Uasda -PPASSWORD
Note the quotes " around the -f switch because your path name contains space characters.
Also note that the -c switch causes single-byte characters (ASCII/OEM/codepage with SQLCHAR) to be written out. If your table contains nchar, nvarchar or ntext columns you should consider using the -w switch instead so as to write out UTF-16 encoded data (using SQLNCHAR).
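For example, the same command with -w in place of -c (a sketch; shown with just the trusted-connection switch -T, since -T and -U/-P are alternative ways to authenticate):
bcp.exe [Internal_Checks].[Jan_Flat] format nul "-fC:\Desktop\exported data\Jan_FlatFormat.fmt" -w -SMSSQLSERVER01 -T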

Store my "Sybase" query result /output into a script variable

I need a variable to keep the result retrieved from a query (Sybase) that's in a script.
I have built the following script; it works fine and I get the desired result when I run it.
Script: EXECUTE_DAILY:
isql -U database_dba -P password <<EOF!
select the_name from table_name where m_num="NUMB912" and date="17/01/2019"
go
quit
EOF!
echo "All Done"
Output:
"EXECUTE_DAILY" 97 lines, 293 characters
user#zp01$ ./EXECUTE_DAILY
the_name
-----------------------------------
NAME912
(1 row affected)
But now I would like to keep the output (the_name: NAME912) in a variable.
So far this is basically what I'm trying with no success.
variable=$(isql -U database_dba -P password -se "select the_name from table_name where m_num="NUMB912" and date="17/01/2019" ")
But it is not working; I can't save NAME912 in a variable.
You need to parse the output for the desired string/piece-of-data that you wish to store in your variable. I tend to make my life a bit easier by making sure I can easily/quickly search/parse out what I want.
Keeping a few issues in mind ...
I tend to use isql -s"|" -w10000 to ensure (most of the time) that a) the result set has all columns delimited with the pipe ('|') and b) a single row of data does not span multiple rows; the pipe delimiter makes it easier to parse out columns that may contain white space; obviously (?) use a different delimiter if a pipe may be part of your actual data
to make parsing of the isql output a bit easier I tend to add a unique, grep-able (literal) string to the rows that I'm looking to search/parse
some databases (eg, SQLAnywhere, Oracle) tend to mimic a literal value as the column header if said literal string has not been assigned an explicit alias/header; this means that if you do a simple search on your literal string then you'll get a match for the result set header as well as the actual data row
I tend to capture all isql output to a temporary file; this allows for easier follow-on processing, eg, error checking, data parsing, dumping contents to a logfile, etc
So, with the above in mind my code typically looks something like:
$ outfile=/tmp/.$$.isql.outfile
$ isql -s"|" -w10000 -U database_dba -P password <<-EOF > ${outfile} 2>&1
-- 'GREP'||'ME' ensures that 'GREPME' only shows up in the data row
select 'GREP'||'ME',the_name
from table_name
where m_num = "NUMB912"
and date = "17/01/2019"
go
EOF
$ cat ${outfile}
... snip ...
|'GREP'||'ME'|the_name | # notice the default column header = 'GREP'||'ME' which won't match my search for 'GREPME'
|------------|----------|
|GREPME |NAME912 | # this is the line I want to search/parse
... snip ...
$ read -r namevar < <(egrep GREPME ${outfile} | awk -F"|" '{print $3}')
$ echo ${namevar}
NAME912
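If the temporary file is not needed, the same GREPME trick also works with a command substitution that pipes the isql output straight into awk (a minimal variant, not from the original answer; assumes bash):
$ namevar=$(isql -s"|" -w10000 -U database_dba -P password <<EOF | awk -F"|" '/GREPME/ { gsub(/[[:space:]]/,"",$3); print $3 }'
select 'GREP'||'ME', the_name
from table_name
where m_num = "NUMB912"
and date = "17/01/2019"
go
EOF
)
$ echo ${namevar}
NAME912
The temp-file approach above is still preferable when you also want to inspect the isql output for errors afterwards.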

Exporting SQL Table into a CSV file using Windows Batch Script

I am trying to create a Windows batch file to export data from a SQL file to a CSV file.
I have a SQL file in %MYHOME%\database\NET-DB.sql which contains data like this:
NET-DB.sql
insert into net_network (id, a_id, alias, address, domain, mask) values('NET_10.10.1.0_10', 1, 'Local Network', '10.10.1.0', '', '255.255.252.0');
What I have tried so far to export the data from the net_network table into a CSV file in my .bat file is this command:
export.bat
if not exist "%MYHOME%\net\NUL" mkdir "%MYHOME%\net"
COPY net_network TO '%MYHOME%\net\CSV-EXPORT_FILE.csv' DELIMITER ',' CSV HEADER;
pause
Since that does not work for me, what should be the correct approach for this implementation? Any help will be much appreciated.
Use SQLCMD
You need to modify the code to make it work in your environment, but here goes.
if not exist "%MYHOME%\net\NUL" mkdir "%MYHOME%\net"
cd "C:\your path\to\sqlcmd"
sqlcmd -S YourDBServer -d DB_NAME -E -Q "select id, a_id, alias, address, domain, mask from net_network" -o "CSV-EXPORT-FILE.csv" -s"," -w 255
Some explanations:
-S The database server to connect to.
-d Name of the database to connect to.
-Q Query to run, can also be insert, delete, update etc.
-o The output file to write to.
-s"," Separate columns with a comma.
-w Column width; this has to be as large as your widest column's character count.
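If you want the output to look more like a plain CSV, sqlcmd's -W switch (remove trailing spaces) and -h-1 (suppress column headers), plus set nocount on in the query, help tidy it up. A sketch building on the command above:
REM Hypothetical variant: trims padding, drops the header row and the "(n rows affected)" line
if not exist "%MYHOME%\net\NUL" mkdir "%MYHOME%\net"
sqlcmd -S YourDBServer -d DB_NAME -E -s"," -W -h-1 -Q "set nocount on; select id, a_id, alias, address, domain, mask from net_network" -o "%MYHOME%\net\CSV-EXPORT_FILE.csv"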

BCP Import error "Invalid character value for cast specification"

All,
I am using BCP for import/export and I am getting an "Invalid character value for cast specification" error for only one row (the first row of the export) while trying to import back.
Table Structure
Col1 -- Numeric(19,0)
Col2 -- NVARCHAR(400)
Col3 -- NVARCHAR(400)
I am using the following commands.
For export:
EXEC master..xp_cmdshell 'bcp "SELECT TOP 10 Col1, Col2, Col3 FROM Server.dbo.TableName" queryout C:\Data\File.dat -S Server -T -t"<EOFD>" -r"<EORD>" -w'
In the same way I am generating a format file:
EXEC master..xp_cmdshell 'BCP Server.dbo.TableName format nul -S Server -T -w -f "C:\Data\File.fmt" -t"<EOFD>" -r"<EORD>" '
Now when I try importing the data back into the SQL Server table, I get the error "Invalid character value for cast specification".
The error log shows me something like this:
## Row 1, Column 1: Invalid character value for cast specification ##
?1000 Mytestdataunicoded nothing
Where this ? at the start of my column data comes from is still unknown.
I am able to import successfully when using the format file, and also when using the -c switch, but for some purposes we must use the -w switch.
I am using BCP for import export and getting "Invalid character value for cast specification" error for only 1(first row of export) row while trying to import back.
Does the first row of your export file contain column definition information?
If so, use -F2.
https://learn.microsoft.com/en-us/sql/tools/bcp-utility?view=sql-server-ver15#F
When using the -w option, I believe BCP ignores any -t or -r option and uses \t and \n as the field and row terminators.
From MS docs:
-w Performs the bulk copy operation using Unicode characters. This option does not prompt for each field; it uses nchar as the storage type, no prefixes, \t (tab character) as the field separator, and \n (newline character) as the row terminator. -w is not compatible with -c.
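For example, an import that skips that first row while keeping the export's Unicode switch and terminators could look like this (a sketch only; whether -F2 is the right fix depends on what that first row actually contains):
EXEC master..xp_cmdshell 'bcp Server.dbo.TableName in C:\Data\File.dat -S Server -T -w -t"<EOFD>" -r"<EORD>" -F2'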

SQL Server BCP Empty File

I'm trying to use bcp to query out a comma-separated-value file but each time I get an empty file.
Here's my bcp command:
bcp "SELECT * FROM ##OutAK " QUERYOUT D:\Outbound\raw\li14090413.raw -c -T -t -S DB1
I have verified that ##OutAK is NOT empty because select count(*) from ##OutAK is not 0. When I open the file using a hex editor, I see the following:
0D 0A
I found the problem. It seems BCP is "allergic" to NULLs. So I just wrapped all the nullable fields in ISNULL() and the output file is back to normal now.
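For example (a sketch only, with hypothetical column names), the SELECT behind the queryout becomes:
SELECT ISNULL(col1, '') AS col1,
       ISNULL(col2, '') AS col2,
       ISNULL(CAST(col3 AS varchar(50)), '') AS col3   -- cast non-character columns first so a NULL becomes an empty string rather than 0
FROM ##OutAK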