SQL Server 2012 table data export to Excel file

I would like to export data from a SQL Server stored procedure to an Excel file. How can I achieve that?
I tried this:
insert into OPENROWSET(
'Microsoft.ACE.OLEDB.12.0',
'Excel 8.0;Database=D:\test.xlsx;;HDR=YES',
'SELECT EmpID FROM [Sheet1$]')
select * from tb1
but it returns an error:
Column name or number of supplied values does not match table definition.
I think it's because the Excel sheet doesn't have the columns yet, right? But there's no way to pre-create the columns before exporting to Excel, and SQL Server can't create the Excel file by itself ...
I also tried with bcp:
bcp "SELECT * FROM mydb.tb1" queryout 'D:\test.xlsx' -t, -c -S . -d DESKTOP-D3PCUR7 -T
It returns error:
Incorrect syntax near 'queryout'.
How can I export table to Excel easily in SQL Server?
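For reference, the `Incorrect syntax near 'queryout'` error usually means the bcp command was run from an SSMS query window; bcp is a command-line tool, so it must be run from cmd.exe or wrapped in xp_cmdshell. Also, bcp writes delimited text rather than a real .xlsx workbook, so targeting a .csv file (which Excel opens directly) is the safer route. A minimal sketch, assuming the mydb.dbo.tb1 table from the question, a default local instance, and xp_cmdshell enabled:

```sql
-- run bcp via xp_cmdshell (or paste the bcp line into cmd.exe directly);
-- bcp produces delimited text, so export to .csv rather than .xlsx
DECLARE @cmd varchar(1000) =
    'bcp "SELECT EmpID FROM mydb.dbo.tb1" queryout "D:\test.csv" -c -t, -S . -T';
EXEC master..xp_cmdshell @cmd;
```

For a genuine .xlsx file, the OPENROWSET route from the question can work, but only if the workbook already exists with a header row matching the SELECT list; the mismatch is what produces the "Column name or number of supplied values" error.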

Related

Postgres COPY and SQL commands in the same script

I have the following SQL script -
--This is a function
SELECT * FROM import ('test');
TRUNCATE table some_table;
cat <<SQL
\COPY (
SELECT * from large_query
)
TO '/tmp/dash.csv' WITH DELIMITER ',' CSV HEADER
SQL
;
I am getting a parse error when I run this script like this -
psql -h host -p port -U user db -f my_file.sql
Can regular SQL statements not be combined with the \COPY command?
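They can be combined, but \COPY is a psql meta-command, not SQL: psql itself parses it, it must sit on a single line, and it cannot be wrapped in a shell heredoc inside the .sql file. A minimal sketch of the same script (names taken from the question):

```sql
-- regular SQL statements and \copy can share one psql -f script,
-- but the \copy meta-command must be written on a single line
SELECT * FROM import('test');
TRUNCATE TABLE some_table;
\copy (SELECT * FROM large_query) TO '/tmp/dash.csv' WITH DELIMITER ',' CSV HEADER
```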

How to get column name in excel using bcp format in sql

I already get the CSV file with data, but I need the data with column headers.
I have many tables which I need to export in one go, saving the data in CSV format in one location, with headers. I don't want to add the column names by hand, as there are 100 tables.
I am using the query below:
Declare @cmd varchar(8000)
Set @cmd = 'bcp "select * from tablename where condition" queryout "path" -S servername -T -t, -c'
EXECUTE master..xp_cmdshell @cmd
There are 100 tables like that... any suggestions?
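One common workaround, sketched here under the assumption that xp_cmdshell is enabled and with tablename and the output path as placeholders: have bcp export a query that UNIONs the column names (read from INFORMATION_SCHEMA.COLUMNS) on top of the data, so every CSV gets a header row without hand-writing 100 column lists.

```sql
-- placeholder names: tablename, C:\export\tablename.csv
DECLARE @cols varchar(max), @cmd varchar(8000);

-- build a quoted, comma-separated header row from the catalog
SELECT @cols = STUFF((SELECT ',''' + COLUMN_NAME + ''''
                      FROM INFORMATION_SCHEMA.COLUMNS
                      WHERE TABLE_NAME = 'tablename'
                      ORDER BY ORDINAL_POSITION
                      FOR XML PATH('')), 1, 1, '');

-- header row UNION ALL data; bcp -c writes everything as character data
SET @cmd = 'bcp "SELECT ' + @cols + ' UNION ALL SELECT * FROM mydb.dbo.tablename" '
         + 'queryout "C:\export\tablename.csv" -c -t, -T -S .';
EXEC master..xp_cmdshell @cmd;
```

If the table has non-character columns, cast them to varchar in the second SELECT, since otherwise UNION tries to convert the header strings to the column's type. Looping this over the names in INFORMATION_SCHEMA.TABLES would cover all 100 tables in one batch.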

How to include SQL code in SQL query?

Prior to my main select statement, I first have to create a temporary table with thousands of lines (using a select statement to get these lines automatically is not feasible in my SQL Server environment), but working in that SQL query is a nightmare of readability, as my .sql file has thousands of lines :(
Is it possible to achieve something like this?
include('actors_tables.sql') /*including all the insert code*/
select * from #temp_actors
instead of this ?
create table #temp_actors (firstname varchar(50), lastname varchar(50))
insert into #temp_actors values ('George','Clooney')
insert into #temp_actors values ('Bill','Murray')
insert into #temp_actors values ('Bruce','Willis')
... + 1000 inserts in thousands of lines
select * from #temp_actors
Seems to me like a basic simple feature but I can't find how to achieve this ...
The server is running SQL Server 2005, and I'm using SQL Server Management Studio 2008.
Thank you for your help !
Kris.
From the command prompt, start up sqlcmd:
sqlcmd -S <server> -i C:\<your file name here>.sql -o C:\<output file here>.txt
Or, to run it from SQL Server Management Studio, use xp_cmdshell and sqlcmd (xp_cmdshell can't take an expression as its argument, so build the command string first):
DECLARE @cmd varchar(1000) = 'sqlcmd -S ' + @DBServerName + ' -d ' + @DBName + ' -i ' + @FilePathName;
EXEC xp_cmdshell @cmd;
You can use sqlcmd from the command prompt with an input file (use sqlcmd -i actors_tables.sql).
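An alternative in the same spirit: sqlcmd scripts (and SSMS, once you enable Query > SQLCMD Mode) support the :r directive, which includes another script file inline, so the main file stays short. A minimal sketch, assuming actors_tables.sql sits in the directory sqlcmd is started from:

```sql
-- main.sql, run with: sqlcmd -S <server> -i main.sql
-- (or in an SSMS query window with SQLCMD Mode enabled)
:r actors_tables.sql
SELECT * FROM #temp_actors;
```

Note that :r resolves relative paths against the directory sqlcmd was started from, not the including script's location, so an absolute path is the safer choice.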

sql insert query through text file

I'm trying to insert data from excel sheet to sql database. The query is stored in a text file as follows:
insert into [demo].[dbo].[relative]
select *
from openrowset('Microsoft.Jet.OLEDB.4.0','Excel 8.0;Database=D:\relative.xls','select * from [sheet1$]');
When I am executing the following command:
sqlcmd -S ADMIN-PC/SEXPRESS -i d:\demo.txt.
it is showing this error:
Msg 7357, Level 16, State 2, Server ADMIN-PC\SEXPRESS, Line 1
Can anyone please help in rectifying my problem.
Try using the SQL Server Import Wizard to create a new table from the .xls file, and then insert that data into the existing table from there. The problem you are having may be due to incompatibility between a 64-bit SQL Server instance and 32-bit Excel drivers.
Or try using bcp
bcp demo.dbo.relative in "D:\relative.xls" -c -T
There is another way to get the same result..
create a temp table.
create table #sometable (value varchar(50), translation varchar(max))
insert into #sometable select * from YOUR_DATABASE_TABLE with (nolock)
Then, do your OPENROWSET, BCP, etc. from here..
You can create a shell script which will automatically read the insert statements from the .csv file and write them to the database. If you want, I can help you with it. All you need to do is put all the insert statements in the .csv file.
#!/bin/ksh
# read each line (one insert statement) from the .csv
# and run it through sqlplus
while read line; do
    # split out individual words if needed, e.g.:
    # A=$(echo "$line" | awk '{print $1}')
    sqlplus -silent /nolog <<EOF > /dev/null
username/pwd@"Connection String"
set linesize 0;
set pagesize 0;
set echo off;
$line
EOF
done < your_insert_statements.csv
It will read the .csv file and automatically insert the records in the database.

Export #tempTable to .txt file in SQL server

Check my last question, Export values from SQL Server to txt file. I was able to export values to a .txt file with the answer there. Now I want to export a #tempTable to a .txt file. How can I do this?
Edit: My requirement: I want to export only newly inserted data from a table to a .txt file; say a user inserts 10 new rows into an existing table, I want those 10 rows to be in the .txt file. I used the inserted table in a trigger to get the new rows, but when I tried to export to a .txt file with bcp, I couldn't, since I don't know the full name of the inserted table ([database].[schema].[tableName]). So I decided to put the inserted table's data into a #tempTable to export to the .txt file.
This is how I suppose I should export the inserted table's data to a .txt file in SQL Server.
You could simply load the in-memory table into a regular table and then use the export code that you already have:
select * into queryout from #tempTable
-- your code from step 1
drop table queryout
here is an updated version of your trigger which should do the job:
create trigger monitorTest1 on test for insert as
begin
    declare @sql varchar(8000);
    -- bcp runs in its own session and cannot see #temp tables,
    -- so persist the inserted rows in a regular table first
    select * into tempTable from inserted;
    set @sql = 'bcp "select * from test2.dbo.tempTable" queryout C:\temp\mytest.txt -c -T -S localhost';
    exec xp_cmdshell @sql;
    drop table tempTable;
end