Export to Excel from SQL Server through BCP

I tried to export the result of a query to an Excel sheet with the following BCP command and was successful.
The issue is that one of the columns has data containing the delimiter. Because of that, the structure of the data is broken when I open the file in MS Excel: values do not land in the desired cells.
Is there any way to escape the delimiter in the column's data?
set @cmd='bcp "select ''colName1'',''colName2'',''colName3'',''colName4'',''colName5'',''colName6'',''colName7'' union all SELECT colValue1,colValue2,CONVERT(varchar,[colValue3]),[colValue4],convert(varchar,[colValue5]),[colValue6],[colValue7] from [DBName].[dbo].[ViewName] where [colValue5]='''+ CONVERT(VARCHAR(8), GETDATE()-1, 112)+'''" queryout "D:\Reports\Detailed_Comment_Report_'+CONVERT(VARCHAR(8), GETDATE(), 112)+'.xls" -c -T -t -S ABCD'
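One workaround, as a minimal sketch: wrap the offending column in double quotes and double any embedded quotes so Excel keeps the whole value in one cell. This assumes the delimiter is a comma and that [colValue7] is the problem column (neither is stated above); it also assumes the file is opened as CSV, where Excel honors quoted fields:

SELECT '"' + REPLACE(CONVERT(varchar(500), [colValue7]), '"', '""') + '"' AS quotedCol
FROM [DBName].[dbo].[ViewName]

When embedding this in the dynamic bcp string, double every single quote once more, and pass an explicit terminator such as -t"," so the quoting and the delimiter agree.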

Related

How to fix a BCP file with widechar support that causes Pentaho Data Integration to fail on the first character of the first row?

I have a .bat file that collects data with a bcp extract call executing a stored procedure (SP) with the -w flag. When the data from that file is consumed by our Pentaho transformation, an additional character is added to the first value of the first row. The CSV input step uses "UTF-16LE", but the first field arrives with garbage characters before the value (e.g. "1" preceded by invisible characters instead of plain "1"). Is there an additional option to bcp that can add a header row, or is there something that can cleanse this character on the Pentaho side?
Sample BCP command:
bcp "exec [companyschema].[collectdataprocedure] %SESSIONID%" queryout collectedoutput.csv -t "," -w -T -S
The issue occurs when I try to load to the database within the transformation.
I have tried skipping the first row of the data, but I do need that row loaded to the DB.
Found an answer to the issue: use the Replace in String step with a Search Pattern of "([^A-Za-z0-9\-])", set "Empty String" to "Y", and write the result back to the first field in the row under the same name.
This resolved the issue of losing the first row of data.
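An alternative sketch, assuming the stray characters are a UTF-16 byte-order mark at the start of the extract (the question doesn't confirm this): rewrite the file without the BOM from the same .bat file before Pentaho reads it, using the file name from the sample command above:

powershell -Command "$t=[IO.File]::ReadAllText('collectedoutput.csv',[Text.Encoding]::Unicode); [IO.File]::WriteAllText('collectedoutput.csv',$t,(New-Object Text.UnicodeEncoding($false,$false)))"

ReadAllText drops the BOM while decoding, and UnicodeEncoding($false,$false) writes UTF-16LE back without one, so the CSV input step can keep reading the file as "UTF-16LE".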

Import a text file to Postgres on Ubuntu?

I have a huge text file in the format "name:email".
I also created a table that has name and email columns.
How do I upload the text file into the table?
The text file is on my Ubuntu server, and I have connected to psql using the commands
sudo -u postgres psql
\connect <username>
What do I do next in order to import the text file to my database?
psql has a \COPY command.
\copy lets you copy data into and out of tables in your database. It supports several modes, including:
binary
tab delimited
csv delimited
What you need is:
\COPY tablename(name,email) FROM '/path/to/file' DELIMITER ':' CSV
If you get the error ERROR: missing data for column "name", your file has an empty line at the end; just remove it.
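A minimal end-to-end sketch, with a hypothetical table and file name (yours will differ):

CREATE TABLE contacts (name text, email text);
\COPY contacts(name,email) FROM '/home/user/list.txt' DELIMITER ':' CSV

CSV mode also copes with values that themselves contain the ':' delimiter, as long as they are double-quoted in the file.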

How to use the 'bcp' utility to transfer data from a table to a text file in SQL Server 2000

Does anyone know how to use the 'bcp' utility to transfer data from a table to a text file in SQL Server 2000?
EXEC master..xp_cmdshell 'bcp "SELECT * FROM test..emp" queryout "c:\dept.txt" -c -T'
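A variant sketch with an explicit field terminator and server name (both placeholders, not from the question); note that xp_cmdshell must be enabled and the SQL Server service account needs write access to the target folder:

EXEC master..xp_cmdshell 'bcp "SELECT * FROM test..emp" queryout "c:\dept.txt" -c -t"," -S MyServer -T'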

How to generate a csv file using select query in SQL [duplicate]

Possible Duplicate:
How to export data as CSV format from SQL Server using sqlcmd?
I want to generate a CSV file using a select query in SQL Server.
The code below works correctly in MySQL:
select * into outfile 'd:/report.csv' fields terminated by ',' from tableName;
It generates the CSV file.
Does anybody know how can I create a CSV file using select query in SQL-server?
Will this do the work?
sqlcmd -S server -U loginid -P password -d DBname -Q "select * from tablename" -o output.csv
EDIT:
Use the -i option if you want to execute a SQL script, like -i sql_script_filename.sql:
SQLCMD -S MyInstance -E -d sales -i query_file.sql -o output_file.csv -s","
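Note that without a column separator sqlcmd emits fixed-width columns rather than true CSV. A sketch of the switches usually needed (same placeholder server and table as above): -s"," sets the separator, -W strips trailing spaces, and set nocount on suppresses the "(N rows affected)" footer:

sqlcmd -S server -U loginid -P password -d DBname -Q "set nocount on; select * from tablename" -s"," -W -o output.csv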
You can use OPENROWSET() to read from a CSV file within a T-SQL query but AFAIK you can't write to one. This is really what SSIS/DTS is for.
If you're working with it interactively in SQL Server Management Studio, you could export a grid to a file.

How to generate a CSV file from a table in a stored procedure?

I need a query to export a CSV file from a particular table, and that query must be used inside a stored procedure.
I tried this query:
EXEC master..xp_cmdshell
'osql.exe -S ramcobl412 -U connect -P connect
-Q "select * from ramcodb..rct_unplanned_hdr" -o "c:\out.csv" -h-1 -s","'
but that CSV file is not formatted correctly when I open it in an Excel sheet.
The comma-separated values come through fine, but the padded column width is a problem.
Save the .csv file you want to import to SQL Server to your desktop or some other path you can easily access.
Using SQL Server Management Studio, right-click the database you want the CSV file imported into as a table and go to Tasks > Import Data, then use the Import Wizard to import the CSV file into a table.
The Import Wizard automatically accounts for the different lengths in your rows. For example, if column X has 5 characters in one row and 10 characters in two other rows, the wizard sets the max length for column X to 10.
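If the underlying problem is the trailing padding osql adds to each column, a sketch that switches to sqlcmd (available with SQL Server 2005 and later client tools), whose -W flag strips trailing spaces; server, credentials, and paths are copied from the question:

EXEC master..xp_cmdshell 'sqlcmd -S ramcobl412 -U connect -P connect -Q "set nocount on; select * from ramcodb..rct_unplanned_hdr" -o "c:\out.csv" -h -1 -s"," -W'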