I have the following SQL script -
--This is a function
SELECT * FROM import ('test');
TRUNCATE table some_table;
cat <<SQL
\COPY (
SELECT * from large_query
)
TO '/tmp/dash.csv' WITH DELIMITER ',' CSV HEADER
SQL
;
I am getting a parse error when I run this script like this -
psql -h host -p port -U user db -f my_file.sql
Can regular SQL statements not be combined with the \COPY command?
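For what it's worth, regular SQL statements and \copy can live in the same file, but two things break this script: the cat <<SQL ... SQL heredoc is shell syntax that psql cannot parse, and \copy is a psql meta-command that ends at the newline, so it cannot be spread over several lines. A sketch of the script with both fixed (same names as above):

--This is a function
SELECT * FROM import('test');
TRUNCATE TABLE some_table;
-- \copy ends at the newline, so the whole command stays on one line
\copy (SELECT * FROM large_query) TO '/tmp/dash.csv' WITH DELIMITER ',' CSV HEADER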
I would like to export data from a SQL Server stored procedure to an Excel file. How can I achieve that?
I tested like this:
insert into OPENROWSET(
'Microsoft.ACE.OLEDB.12.0',
'Excel 8.0;Database=D:\test.xlsx;;HDR=YES',
'SELECT EmpID FROM [Sheet1$]')
select * from tb1
but it returns an error:
Column name or number of supplied values does not match table definition.
I think it's because the Excel file doesn't have the columns yet, right? But there's no way to pre-write the columns before exporting to Excel, and SQL Server can't create an Excel file by itself ...
I also tried with bcp:
bcp "SELECT * FROM mydb.tb1" queryout 'D:\test.xlsx' -t, -c -S . -d DESKTOP-D3PCUR7 -T
It returns error:
Incorrect syntax near 'queryout'.
How can I easily export a table to Excel in SQL Server?
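Hedged note: bcp is a standalone command-line utility, so "Incorrect syntax near 'queryout'" usually means the command was run inside SSMS as if it were T-SQL. Run it from cmd.exe instead; also note that bcp writes delimited text (which Excel opens fine), not a native .xlsx, and the fully qualified name mydb.dbo.tb1 below is an assumption:

rem run from cmd.exe, not inside SSMS; produces a CSV that Excel can open
bcp "SELECT * FROM mydb.dbo.tb1" queryout "D:\test.csv" -c -t, -S . -T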
When I run my command:
psql -h localhost -p 5432 -U meee -d my_db -f sqltest.sql
it displays:
CREATE VIEW
ALTER TABLE
However I want it to show the result just like pgAdmin shows it (for example: The query was executed successfully in 45 ms, but returns no results).
Add the \timing command at the beginning of sqltest.sql and you will see the time of each command.
for example, script.sql:
\timing
select 2;
select 1;
create table tablax(i int);
Or, if you want the total time from the beginning of the script until the end, add some commands to the script:
at the beginning:
create temp table tab (time1 time,time2 time); insert into tab (time1) select now()::time;
at the end:
update tab set time2=now()::time; select time2-time1 as time_Elapsed from tab;
for example:
create temp table tab (time1 time,time2 time);
insert into tab (time1) select now()::time;
...
your script code
...
update tab set time2=now()::time;
select time2-time1 as time_Elapsed from tab;
Use the -a psql command-line parameter:
psql -h localhost -p 5432 -U meee -d my_db -af sqltest.sql
https://www.postgresql.org/docs/current/static/app-psql.html
And place \timing on at the top of your script
I'm trying to write a script in R to export a query from a PostgreSQL database to a CSV file.
If I do this from cmd it works fine:
psql -U postgres
\copy (select * from clickthru.train limit 10) to 'c:\\me\\psql_test.csv' with csv header;
However when I try this in R it looks like it executes (no errors) but no file is generated:
system('psql -U postgres COPY (select * from clickthru.train limit 10;) TO "C:\\me\\psql_test.csv" with CSV')
Any suggestions?
Use:
\copy (select * from clickthru.train limit 10) to 'c:\\me\\psql_test.csv' DELIMITER ',' CSV HEADER;
There are several issues here:
The query should be passed via the -c argument: psql -U postgres -c '... query ...'
COPY and \copy work differently: COPY operates on the server's disk space, while \copy operates on the client's. If you run both on one computer, the difference is that COPY will run as the postgres user, and \copy as the user who runs the script.
In your example you have an extra semicolon (;) inside the query: ...COPY (select * from clickthru.train limit 10;)...
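Putting those fixes together, a sketch of the corrected R call (same connection defaults as the interactive session; psql may still prompt for a password unless PGPASSWORD or a pgpass file supplies it):

# after R string escaping, \\copy becomes \copy and the whole command is one -c argument
system('psql -U postgres -c "\\copy (select * from clickthru.train limit 10) to \'c:/me/psql_test.csv\' with csv header"')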
I have been looking for a command to export an entire database's data (DB2) to CSV.
I googled it, but it came up with the db2 export command, which only exports table by table.
For example
export to employee.csv of del select * from employee
Therefore I would have to do it for every table, which can be very annoying. Is there a way I can export an entire database in DB2 to CSV (or some other format that I can use with other databases)?
Thank you
You could read the SYSIBM.SYSTABLES table to get the names of all the tables, and generate an export command for each table.
Write the export commands to an SQL file.
Read the SQL file, and execute the export commands.
Edited to add: Warning - some of your foreign keys may not be synchronized, as the database can be changed while you're reading the various tables.
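A minimal sketch of that approach from the DB2 command line, with MYDB and MYSCHEMA as placeholder names (both assumptions, adjust to your environment):

db2 connect to MYDB
# generate one EXPORT command per table (SYSIBM.SYSTABLES exposes CREATOR and NAME)
db2 -x "SELECT 'EXPORT TO ' || RTRIM(NAME) || '.csv OF DEL SELECT * FROM ' || RTRIM(CREATOR) || '.' || RTRIM(NAME) || ';' FROM SYSIBM.SYSTABLES WHERE TYPE = 'T' AND CREATOR = 'MYSCHEMA'" > export_all.sql
# execute the generated commands (-t: ';' terminates statements, -v: echo, -f: read file)
db2 -tvf export_all.sql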
# loop through all the DB2 tables in bash and export them
# all this is just another one-liner (unrolled here for readability) ...
# note the filter by schema name ...
# -td# makes '#' the statement terminator, so "coldel;" can set ';' as the column delimiter
db2 -x "select schemaname from syscat.schemata where schemaname like 'GOSALES%'" | {
  while read -r schema ; do
    db2 connect to GS_DB
    db2 -x "SELECT TABLE_NAME FROM SYSIBM.TABLES WHERE TABLE_CATALOG = 'GS_DB' AND TABLE_SCHEMA = '$schema'" | {
      while read -r table ; do
        db2 connect to GS_DB
        echo -e "start table: $table \n"
        db2 -td# "EXPORT TO $schema.$table.csv of del modified by coldel; SELECT * FROM $schema.$table"
        echo -e " stop table: $table "
      done
    }
  done
}
wc -l *.csv | sort -nr
4185939 total
446023 GOSALES.ORDER_DETAILS.csv
446023 GOSALESDW.SLS_SALES_ORDER_DIM.csv
I know how to select which table I want to dump in SQL format with the shell command:
$ ./sqlite3 test.db '.dump mytable' > test.sql
But this command selects all the data of "mytable".
Can I select the data I want from my table before dumping it, and how?
In other words, I'm looking for a command like:
$ ./sqlite3 test.db '.dump select name from mytable' > test.sql
Obviously this command does not work :'(
The only way to do it within the sqlite console is to create a temporary table:
CREATE TABLE tmp AS SELECT field1, field2 ... FROM yourTable WHERE ... ;
.dump tmp
DROP TABLE tmp;
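From the shell, the same trick can be scripted in one pass; tmp is just a scratch name, and the generated test.sql will of course say CREATE TABLE tmp rather than mytable:

sqlite3 test.db "CREATE TABLE tmp AS SELECT name FROM mytable;"
sqlite3 test.db '.dump tmp' > test.sql
sqlite3 test.db "DROP TABLE tmp;"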