How to get column names in a CSV export using bcp in SQL Server

I already got the CSV file with the data, but I need the data with column headers. I have many tables which I need to export in one go, saving each as a CSV file with a header in one location; I don't want to add the column names manually, as there are 100 tables.
I am using the query below:
Declare @cmd varchar(8000)
Set @cmd = 'bcp "select * from tablename where condition" queryout "path" -S yourserver -T -t, -c'
EXECUTE master..xp_cmdshell @cmd
There are 100 tables like that... any suggestions?
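Not part of the original question, but for context: bcp's queryout mode has no switch that adds a header row, so one common workaround is to UNION the column names on top of the data, with an explicit sort key so the header row stays first. A hedged sketch, where the table, columns, server and output path are all placeholders:
-- Hedged sketch: emit a header row by UNIONing the column names above the data,
-- casting every data column to varchar so the two halves are union-compatible.
-- tablename, col1/col2, yourserver and the output path are placeholders.
DECLARE @cmd varchar(8000);
SET @cmd = 'bcp "SELECT col1, col2 FROM ('
         + 'SELECT 0 AS ord, ''col1'' AS col1, ''col2'' AS col2 '
         + 'UNION ALL '
         + 'SELECT 1, CAST(col1 AS varchar(100)), CAST(col2 AS varchar(100)) FROM mydb.dbo.tablename'
         + ') x ORDER BY ord" queryout "C:\out\tablename.csv" -S yourserver -T -t, -c';
EXECUTE master..xp_cmdshell @cmd;
For 100 tables, the same statement could be generated per table from INFORMATION_SCHEMA.COLUMNS and run in a loop, but that loop is not shown here.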

Related

SQL Server 2012 table data export to Excel file

I would like to export data from a SQL Server stored procedure to an Excel file. How can I achieve that?
I tested like this:
insert into OPENROWSET(
'Microsoft.ACE.OLEDB.12.0',
'Excel 8.0;Database=D:\test.xlsx;;HDR=YES',
'SELECT EmpID FROM [Sheet1$]')
select * from tb1
but it returns an error:
Column name or number of supplied values does not match table definition.
I think it's because the Excel file doesn't have the columns, right? But there's no way to pre-create the columns before exporting to Excel, and SQL Server can't create the Excel file by itself ...
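For context (not part of the original question): this error usually means the SELECT list does not match the columns named in the OPENROWSET target. A hedged sketch where both sides use the same single column; it assumes the sheet already exists in D:\test.xlsx with an EmpID header, and uses the 'Excel 12.0' connection-string version normally paired with the ACE 12.0 provider for .xlsx files:
-- Hedged sketch: keep the OPENROWSET column list and the outer SELECT list identical.
insert into OPENROWSET(
'Microsoft.ACE.OLEDB.12.0',
'Excel 12.0;Database=D:\test.xlsx;HDR=YES',
'SELECT EmpID FROM [Sheet1$]')
select EmpID from tb1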
I also tried with bcp:
bcp "SELECT * FROM mydb.tb1" queryout 'D:\test.xlsx' -t, -c -S . -d DESKTOP-D3PCUR7 -T
It returns error:
Incorrect syntax near 'queryout'.
How can I export table to Excel easily in SQL Server?
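For context (not part of the original question): bcp is a command-line utility, so the "Incorrect syntax near 'queryout'" error usually means the command was run inside SSMS and parsed as T-SQL. A hedged sketch of calling it through xp_cmdshell instead and writing a comma-separated file that Excel can open; bcp cannot write a native .xlsx file, and the server, database and path below are placeholders:
-- Hedged sketch: run bcp via xp_cmdshell (or from a command prompt) and write a CSV.
EXEC master..xp_cmdshell 'bcp "SELECT * FROM mydb.dbo.tb1" queryout "D:\test.csv" -c -t, -S . -T';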

Bash script to export a table into another table

I created a script to export a table from my db into a .csv file
#!/usr/bin/bash
FILE="example.csv"
sqlplus -s abcd/abcd@XE <<EOF
SET PAGESIZE 50000
SET COLSEP ","
SET LINESIZE 200
SET FEEDBACK OFF
SPOOL $FILE
SELECT * FROM myTable;
SPOOL OFF
EXIT
EOF
and now I'd like to modify this script to export my table into another one. How can I change my code?
By "exporting your table into another", do you mean copying data from one table to another? If you don't need indexes or keys or other features on your new table initially, i.e. if it's not for production use, it's quite simple:
#!/usr/bin/bash
TABLE="myOtherTable"
sqlplus -s abcd/abcd@XE <<EOF
CREATE TABLE $TABLE as SELECT * FROM myTable;
EXIT
EOF
You could also do a create table statement first, specifying columns, keys and storage options as any other table, and then have a separate line that does INSERT INTO $TABLE (SELECT * FROM myTable) to fill it with data copied from myTable.
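For illustration, a hedged sketch of that two-step variant; the column definitions below are made-up placeholders and would have to line up with the columns of myTable:
-- Hedged sketch: create the target table explicitly, then copy the rows across.
-- The columns shown here are placeholders, not taken from the original script.
CREATE TABLE myOtherTable (
    id    NUMBER PRIMARY KEY,
    name  VARCHAR2(100)
);
INSERT INTO myOtherTable (SELECT * FROM myTable);
COMMIT;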

Suppressing query output information from the export file

Trying to export SQL Server query results to a .csv file, I have a similar output situation to the one presented here, but since my approach is different I am opening a new thread.
I have a complex query that joins a few tables, applying some filters via temporary tables. I am executing the stored procedure like this:
EXEC xp_cmdshell 'SQLCMD -S . -d myDATABASE -Q "EXECUTE myStoreProcedure;" -s " " -x -o "d:\result.csv"';
I get all the data into result.csv, but the beginning of the file contains lines like:
(10 rows affected)
(4 rows affected)
(23 rows affected)
(5 rows affected)
(8 rows affected)
(2 rows affected)
//followed by the header columns - as expected
----------------------------------------------------------------------
//and effective results - as expected
I would prefer not to have that row-count output and the dashed rows. I'm open to any suggestions / workarounds.
I think you want to add
SET NOCOUNT ON;
as the first line in your stored procedure
and add "-h -1" to your SQLCMD call
EDIT: @siyual beat me to the NOCOUNT part. I'll leave it in, but he got it first :-)
EDIT 2: OK, I coded a short example showing what I mean and the output it produces
--EXEC xp_cmdshell 'SQLCMD -h -1 -S myserver -d Scratchpad_A -Q "EXECUTE spDummy;" -s " " -x -o "C:\TEMP result.csv"';
CREATE procedure spDummy as
SET NOCOUNT ON;
DECLARE @T TABLE( Person varchar(7), FavCol varchar(7));
INSERT INTO @T(Person, FavCol) VALUES ('Alice','Red'), ('Bob','Yellow'), ('Cindy','Green'), ('Dan','Blue');
SELECT 'Person' as H1, 'FavCol' as H2;
SELECT * FROM @T
and it gives me the output
Person FavCol
Alice Red
Bob Yellow
Cindy Green
Dan Blue
How does this compare to what you need? If that's not what you're looking for, maybe you could try a different way of explaining it?

sql insert query through text file

I'm trying to insert data from an Excel sheet into a SQL database. The query is stored in a text file as follows:
insert into [demo].[dbo].[relative]
select *
from openrowset('Microsoft.Jet.OLEDB.4.0','Excel 8.0;Database=D:\relative.xls','select * from [sheet1$]');
When I execute the following command:
sqlcmd -S ADMIN-PC\SEXPRESS -i d:\demo.txt
it is showing this error:
Msg 7357, Level 16, State 2, Server ADMIN-PC\SEXPRESS, Line 1
Can anyone please help me rectify this problem?
Try using the SQL Server Import Wizard to create a new table from the .xls file, and then insert that data into the existing table from there. The problem you are having may be due to the incompatibility between a 64-bit SQL Server instance and 32-bit Excel.
Or try using bcp
bcp demo.dbo.relative in "D:\relative.xls" -c -T
There is another way to get the same result:
Create a temp table:
create table #sometable (value varchar(50), translation varchar(max))
insert into #sometable select * from YOUR_DATABASE_TABLE with (nolock)
Then do your OPENROWSET, BCP, etc. from there.
You can create a shell script which will automatically read the insert commands from the .csv file and write them to the database. If you want, I can help you with it. All you need to do is write all the insert statements in the .csv file.
#!/bin/ksh
# read the .csv file line by line and run each insert statement through sqlplus
while read line; do
    # split each line into words and rebuild the statement from them
    A=`echo "$line" | awk -F" " '{print $1}'`
    B=`echo "$line" | awk -F" " '{print $2}'`
    # ... and so on, depending on the number of words in the insert statements
    sqlplus -silent username/pwd@"Connection String" << EOF > /dev/null
set linesize 0;
set pagesize 0;
set echo off;
$A $B
EOF
done < your_insert_statements.csv
It will read the .csv file and automatically insert the records into the database.

Export #tempTable to .txt file in SQL server

Check my last question, Export values from SQL Server to txt file. I was able to export values to a .txt file based on that question. Now I want to export a #tempTable to a .txt file. How can I do this?
Edit: My requirement: I want to export only newly inserted data from a table to a .txt file. Say a user inserts 10 new rows into an existing table; I want those 10 rows to be in the .txt file. I used the inserted table in a trigger to get the new rows, but when I try to export to a .txt file with bcp, I can't, since the inserted table has no full name ([database].[schema].[tableName]) that bcp could reference. So I decided to put the inserted table's data into a #tempTable and export that to the .txt file.
That is what I am trying to do to export the inserted table data to a .txt file in SQL Server.
You could simply load the in-memory table into a regular table and then use the export code you already have:
select * into queryout from #tempTable
-- your code from step 1
drop table queryout
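A hedged sketch of what the "-- your code from step 1" placeholder above could look like against that regular table; the database name, output path and server are placeholders, and bcp needs the full three-part name because it runs in its own session and cannot see session-scoped temp tables:
-- Hedged sketch: export the regular table created above before dropping it.
DECLARE @cmd varchar(8000);
SET @cmd = 'bcp "select * from mydb.dbo.queryout" queryout "C:\temp\mytest.txt" -c -t, -T -S localhost';
EXEC master..xp_cmdshell @cmd;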
Here is an updated version of your trigger which should do the job:
create trigger monitorTest1 on test for insert as
declare @sql varchar(8000);
select * into tempTable from inserted
SELECT @sql = 'bcp "select * from test2.dbo.tempTable" queryout C:\temp\mytest.txt -c -t -T -S localhost'
exec xp_cmdshell @sql
drop table tempTable