I would like to export data from a SQL Server stored procedure to an Excel file. How can I achieve that?
I tried it like this:
insert into OPENROWSET(
'Microsoft.ACE.OLEDB.12.0',
'Excel 8.0;Database=D:\test.xlsx;;HDR=YES',
'SELECT EmpID FROM [Sheet1$]')
select * from tb1
but it returns an error:
Column name or number of supplied values does not match table definition.
I think it's because the Excel file doesn't have the columns defined, right? But there's no way to pre-create the columns before exporting to Excel, and SQL Server can't create the Excel file by itself ...
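For what it's worth, INSERT INTO OPENROWSET can only append to a worksheet that already exists with a matching header row, so a common workaround is to copy a pre-built template workbook into place before inserting. A minimal sketch, assuming a hypothetical template at D:\template.xlsx whose Sheet1 already contains an EmpID header (note the 'Excel 12.0' property string for .xlsx files, rather than 'Excel 8.0'):
-- copy the template workbook that already holds the header row
EXEC master..xp_cmdshell 'copy D:\template.xlsx D:\test.xlsx'
-- append the data to the existing sheet
INSERT INTO OPENROWSET(
    'Microsoft.ACE.OLEDB.12.0',
    'Excel 12.0;Database=D:\test.xlsx;HDR=YES',
    'SELECT EmpID FROM [Sheet1$]')
SELECT EmpID FROM tb1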
I also tried with bcp:
bcp "SELECT * FROM mydb.tb1" queryout 'D:\test.xlsx' -t, -c -S . -d DESKTOP-D3PCUR7 -T
It returns an error:
Incorrect syntax near 'queryout'.
How can I easily export a table to Excel in SQL Server?
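As for the bcp attempt: bcp is a command-line utility, so running it inside an SSMS query window is exactly what produces "Incorrect syntax near 'queryout'" — it has to be run from cmd.exe (or through xp_cmdshell). Also, bcp only writes delimited text, not a real .xlsx workbook, so a .csv target that Excel can open is the realistic goal. A hedged sketch, with the server name and path as placeholders:
bcp "SELECT * FROM mydb.dbo.tb1" queryout "D:\test.csv" -t, -c -S . -T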
I created a script to export a table from my db into a .csv file
#!/usr/bin/bash
FILE="example.csv"
# pipe SQL*Plus commands in via a here-document
sqlplus -s abcd/abcd@XE <<EOF
SET PAGESIZE 50000
SET COLSEP ","
SET LINESIZE 200
SET FEEDBACK OFF
SPOOL $FILE
SELECT * FROM myTable;
SPOOL OFF
EXIT
EOF
and now I'd like to modify this script to export my table into another. How can I change my code?
By "exporting your table into another", do you mean copying data from one table to another? If you don't need indexes or keys or other features on your new table initially, i.e. if it's not for production use, it's quite simple:
#!/usr/bin/bash
TABLE="myOtherTable"
sqlplus -s abcd/abcd@XE <<EOF
CREATE TABLE $TABLE AS SELECT * FROM myTable;
EXIT
EOF
You could also do a CREATE TABLE statement first, specifying columns, keys and storage options as for any other table, and then have a separate line that does INSERT INTO $TABLE (SELECT * FROM myTable) to fill it with the data copied from myTable.
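For instance, a minimal sketch of that two-step variant (the column list is illustrative):
CREATE TABLE myOtherTable (
    firstname VARCHAR2(50),
    lastname  VARCHAR2(50)
);

INSERT INTO myOtherTable (SELECT * FROM myTable);
COMMIT;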
I already got the CSV file with data, but I need the data with column headers.
I have many tables which I need to export in one go, saving the data in CSV format in one location, with headers. I don't want to add the column names by hand, as there are 100 tables.
I am using the query below:
DECLARE @cmd varchar(8000)
SET @cmd = 'bcp "SELECT * FROM tablename WHERE condition" queryout "path" -S . -T -t, -c'
EXECUTE master..xp_cmdshell @cmd
There are 100 tables like that... any suggestions?
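One way to handle 100 tables without typing anything per table is to let the metadata generate the commands. A hedged sketch (database mydb, server '.', and folder D:\export are placeholders) that prints one bcp command per user table, ready to be run from a command prompt; note that bcp itself never emits a header row, so the column names would still have to be prepended separately, e.g. from INFORMATION_SCHEMA.COLUMNS:
-- generate one bcp export command per user table
SELECT 'bcp "SELECT * FROM mydb.dbo.' + name + '" queryout "D:\export\'
       + name + '.csv" -S . -T -t, -c'
FROM sys.tables
ORDER BY name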
Prior to my main SELECT statement, I first have to create a temporary table with thousands of rows (using a SELECT statement to generate these rows automatically is not feasible in my SQL Server environment). But working in that SQL query is a readability nightmare, as my .sql file is thousands of lines long :(
Is it possible to achieve something like this?
include('actors_tables.sql') /*including all the insert code*/
select * from #temp_actors
instead of this ?
create table #temp_actors (firstname varchar(50), lastname varchar(50))
insert into #temp_actors values ('George','Clooney')
insert into #temp_actors values ('Bill','Murray')
insert into #temp_actors values ('Bruce','Willis')
... + 1000 inserts in thousands of lines
select * from #temp_actors
Seems to me like a basic, simple feature, but I can't find out how to achieve it ...
The server is running SQL Server 2005, and I'm using SQL Server Management Studio 2008.
Thank you for your help !
Kris.
From the command prompt, start up sqlcmd:
sqlcmd -S <server> -i C:\<your file name here>.sql -o C:\<output file here>.txt
Or, to run it from SQL Server Management Studio, use xp_cmdshell and sqlcmd (note that xp_cmdshell does not accept an inline expression, so the command string has to be built in a variable first):
DECLARE @cmd varchar(1000)
SET @cmd = 'sqlcmd -S ' + @DBServerName + ' -d ' + @DBName + ' -i ' + @FilePathName
EXEC xp_cmdshell @cmd
You can use sqlcmd from the command prompt with an input file (use sqlcmd -i actors_tables.sql).
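SSMS can also do this directly if you enable SQLCMD Mode (Query menu > SQLCMD Mode); the :r directive reads another file in place, which is exactly the include behaviour asked for (the path below is illustrative):
:r C:\scripts\actors_tables.sql
SELECT * FROM #temp_actors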
How do I dump the data, and only the data, not the schema, of some SQLite3 tables of a database (not all the tables)?
The dump should be in SQL format, as it should be easily re-entered into the database later and should be done from the command line. Something like
sqlite3 db .dump
but without dumping the schema and selecting which tables to dump.
You're not saying what you wish to do with the dumped file.
To get a CSV file (which can be imported into almost everything)
.mode csv
-- use '.separator SOME_STRING' for something other than a comma.
.headers on
.out file.csv
select * from MyTable;
To get an SQL file (which can be reinserted into a different SQLite database)
.mode insert <target_table_name>
.out file.sql
select * from MyTable;
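The resulting file.sql can then be replayed into another database from the command line; the target table must already exist there, since .mode insert emits only INSERT statements:
sqlite3 other.db < file.sql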
You can do this by taking the difference of the .schema and .dump outputs, for example with grep:
sqlite3 some.db .schema > schema.sql
sqlite3 some.db .dump > dump.sql
grep -vx -f schema.sql dump.sql > data.sql
The data.sql file will then contain only the data, without the schema, something like this:
BEGIN TRANSACTION;
INSERT INTO "table1" VALUES ...;
...
INSERT INTO "table2" VALUES ...;
...
COMMIT;
You can specify one or more table arguments to the .dump special command, e.g. sqlite3 db ".dump 'table1' 'table2'".
Not the best way, but at least it does not need external tools (except grep, which is standard on *nix boxes anyway):
sqlite3 database.db3 .dump | grep '^INSERT INTO "tablename"'
but you do need to run this command once for each table you are looking for.
Note that this does not include the schema.
Any answer which suggests using grep to exclude the CREATE lines or just grab the INSERT lines from the sqlite3 $DB .dump output will fail badly. The CREATE TABLE commands list one column per line (so excluding CREATE won't get all of it), and values on the INSERT lines can have embedded newlines (so you can't grab just the INSERT lines).
for t in $(sqlite3 $DB .tables); do
echo -e ".mode insert $t\nselect * from $t;"
done | sqlite3 $DB > backup.sql
Tested on sqlite3 version 3.6.20.
If you want to exclude certain tables you can filter them with $(sqlite3 $DB .tables | grep -v -e one -e two -e three), or, if you only want a specific subset, replace that with an explicit list like one two three.
As an improvement to Paul Egan's answer, this can be accomplished as follows:
sqlite3 database.db3 '.dump "table1" "table2"' | grep '^INSERT'
--or--
sqlite3 database.db3 '.dump "table1" "table2"' | grep -v '^CREATE'
The caveat, of course, is that you have to have grep installed.
From Python or Java or any other high-level language, the .dump command is not available, so we need to code the conversion to CSV by hand. I give a Python example; examples in other languages would be appreciated:
from os import path
import csv
import sqlite3

def convert_to_csv(directory, db_name):
    conn = sqlite3.connect(path.join(directory, db_name + '.db'))
    cursor = conn.cursor()
    cursor.execute("SELECT name FROM sqlite_master WHERE type='table';")
    tables = cursor.fetchall()
    for table in tables:
        table = table[0]
        cursor.execute('SELECT * FROM ' + table)
        column_names = [column_name[0] for column_name in cursor.description]
        # newline='' keeps the csv module from writing blank lines on Windows
        with open(path.join(directory, table + '.csv'), 'w', newline='') as csv_file:
            csv_writer = csv.writer(csv_file)
            csv_writer.writerow(column_names)  # header row first
            for row in cursor:                 # then every data row
                csv_writer.writerow(row)
    conn.close()
If you have panel data, in other words many individual entries with ids, add this to the loop and it will also dump summary statistics (it assumes the table also has a 'round' column):
if 'id' in column_names:
    with open(path.join(directory, table + '_aggregate.csv'), 'w', newline='') as csv_file:
        csv_writer = csv.writer(csv_file)
        column_names.remove('id')
        column_names.remove('round')
        # sum every remaining column, grouped by round
        sum_string = ','.join('sum(%s)' % item for item in column_names)
        cursor.execute('SELECT round, ' + sum_string + ' FROM ' + table + ' GROUP BY round;')
        csv_writer.writerow(['round'] + column_names)
        for row in cursor:
            csv_writer.writerow(row)
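Called, for example, like this (the directory and database name are placeholders):
convert_to_csv('/path/to/data', 'mydb')  # expects /path/to/data/mydb.db, writes one .csv per table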
Review of other possible solutions
Include only INSERTs
sqlite3 database.db3 .dump | grep '^INSERT INTO "tablename"'
Easy to implement, but it will fail if any of your columns include newlines.
SQLite insert mode
for t in $(sqlite3 $DB .tables); do
echo -e ".mode insert $t\nselect * from $t;"
done | sqlite3 $DB > backup.sql
This is a nice and customizable solution, but it doesn't work if your columns hold blob objects, like the 'Geometry' type in SpatiaLite.
Diff the dump with the schema
sqlite3 some.db .schema > schema.sql
sqlite3 some.db .dump > dump.sql
grep -vx -f schema.sql dump.sql > data.sql
Not sure why, but it is not working for me.
Another (new) possible solution
There is probably no single best answer to this question, but one that works for me is to grep out the inserts, taking into account that there may be newlines in the column values, with an expression like this:
grep -Pzo "(?s)^INSERT.*\);[ \t]*$"
To select the tables to be dumped, .dump accepts a LIKE argument to match the table names, but if this is not enough a simple script is probably the better option:
TABLES='table1 table2 table3'
echo '' > /tmp/backup.sql
for t in $TABLES ; do
echo -e ".dump ${t}" | sqlite3 database.db3 | grep -Pzo "(?s)^INSERT.*?\);$" >> /tmp/backup.sql
done
or, something more elaborate that respects foreign keys and wraps the whole dump in a single transaction:
TABLES='table1 table2 table3'
echo 'BEGIN TRANSACTION;' > /tmp/backup.sql
echo '' >> /tmp/backup.sql
for t in $TABLES ; do
echo -e ".dump ${t}" | sqlite3 $1 | grep -Pzo "(?s)^INSERT.*?\);$" | grep -v -e 'PRAGMA foreign_keys=OFF;' -e 'BEGIN TRANSACTION;' -e 'COMMIT;' >> /tmp/backup.sql
done
echo '' >> /tmp/backup.sql
echo 'COMMIT;' >> /tmp/backup.sql
Take into account that the grep expression will fail if ); appears as a string in any of the columns.
To restore it (in a database with the tables already created)
sqlite3 -bail database.db3 < /tmp/backup.sql
According to the SQLite documentation for the Command Line Shell For SQLite, you can export an SQLite table (or part of a table) as CSV simply by setting the "mode" to "csv" and then running a query to extract the desired rows of the table:
sqlite> .header on
sqlite> .mode csv
sqlite> .once c:/work/dataout.csv
sqlite> SELECT * FROM tab1;
sqlite> .exit
Then use the ".import" command to import CSV (comma separated value) data into an SQLite table:
sqlite> .mode csv
sqlite> .import C:/work/dataout.csv tab1
sqlite> .exit
Please read the further documentation about the two cases to consider: (1) Table "tab1" does not previously exist and (2) table "tab1" does already exist.
The best method would be to do what the sqlite3 .dump command does, but excluding the schema parts.
Example pseudo code:
SELECT 'INSERT INTO ' || tableName || ' VALUES( ' ||
{for each value} ' quote(' || value || ')' (+ commas until final)
|| ')' FROM 'tableName' ORDER BY rowid DESC
See: src/shell.c:838 (for sqlite-3.5.9) for actual code
You might even just take that shell and comment out the schema parts and use that.
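As a concrete, runnable instance of that pseudo code for a hypothetical three-column table t (SQLite's quote() function handles the SQL escaping of each value):
SELECT 'INSERT INTO t VALUES(' ||
       quote(col1) || ',' || quote(col2) || ',' || quote(col3) ||
       ');'
FROM t;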
This version works well with newlines inside inserts:
sqlite3 database.sqlite3 .dump | grep -v '^CREATE'
In practice, this excludes all the lines starting with CREATE, which are less likely to contain newlines.
The answer by retracile should be the closest one, yet it does not work for my case. One insert query just broke in the middle and the export stopped. I am not sure of the reason. However, it works fine during .dump.
Finally, I wrote a tool to split up the SQL generated from .dump:
https://github.com/motherapp/sqlite_sql_parser/
You could do a SELECT on the tables, inserting commas after each field, to produce a CSV, or use a GUI tool to return all the data and save it to a CSV.
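For the first approach, a minimal sketch (table and column names are illustrative); note that quoting or escaping would still be needed if the values themselves can contain commas:
SELECT col1 || ',' || col2 || ',' || col3 FROM MyTable;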