I created a script to export a table from my db into a .csv file
#!/usr/bin/bash
FILE="example.csv"
sqlplus -s abcd/abcd@XE <<EOF
SET PAGESIZE 50000
SET COLSEP ","
SET LINESIZE 200
SET FEEDBACK OFF
SPOOL $FILE
SELECT * FROM myTable;
SPOOL OFF
EXIT
EOF
Now I'd like to modify this script to export my table into another table. How can I change my code?
By "exporting your table into another", do you mean copying data from one table to another? If you don't need indexes or keys or other features on your new table initially, i.e. if it's not for production use, it's quite simple:
#!/usr/bin/bash
TABLE="myOtherTable"
sqlplus -s abcd/abcd@XE <<EOF
CREATE TABLE $TABLE as SELECT * FROM myTable;
EXIT
EOF
You could also do a create table statement first, specifying columns, keys and storage options as any other table, and then have a separate line that does INSERT INTO $TABLE (SELECT * FROM myTable) to fill it with data copied from myTable.
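For that second approach, here is a minimal sketch (the column list is only an assumption for illustration; use your real column definitions):

CREATE TABLE myOtherTable (
  id    NUMBER PRIMARY KEY,   -- hypothetical columns
  name  VARCHAR2(100)
);

INSERT INTO myOtherTable (id, name)
  SELECT id, name FROM myTable;
COMMIT;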
Related
I need to export data from my Oracle table to a csv file.
Below is the code that I am running under SQL Developer:
set termout off
set serveroutput off
set feedback off
set colsep ';'
set lines 100000
set pagesize 0
set echo off
set feedback off
spool D:\myfile.csv
select *
from Employee;
spool off
However, the output of the above code in the csv file is:
select *
from Employee;
I want the data of the Employee table to be in the csv, not the sql statement.
Any idea what might be wrong in the above code? Thanks.
Save your SQL in a file emp.sql in a directory D:\scripts and use it like below:
set term off
set feed off
spool D:\myfile.csv
@D:\scripts\emp.sql
spool off
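For reference, D:\scripts\emp.sql is assumed to contain just the query itself, e.g.:

select * from Employee;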
You're in SQL Developer, so you don't have to write so much code.
SET SQLFORMAT csv
SPOOL C:\your_file.csv
select * from whatever;
spool off
Run with F5
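If you prefer to keep everything inline, recent SQL Developer versions also honour the formatting hint inside the statement, which should give the same CSV output when run as a script with F5 (the file path below is just a placeholder):

SPOOL C:\your_file.csv
select /*csv*/ * from whatever;
SPOOL OFF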
I already got the CSV file with data, but I need the data with column headers.
I have many tables which I need to export in one go and save the data in CSV format at one location, with headers. I don't want to add the column names by hand, as there are 100 tables.
I am using the query below:
Declare @cmd varchar(8000)
Set @cmd = 'bcp "select * from tablename where condition" queryout "path" -S -T -f -t, -c -E'
EXECUTE master..xp_cmdshell @cmd
There are 100 tables like that. Any suggestions?
I have to run several queries on an Oracle 11g database, within SQL Developer 3.1.
For example:
select * from product;
select * from customer;
select * from prices;
At the moment I am exporting the result sets by hand: I simply right-click on the result and then export it.
I would like to automatically save the resultset of each query in a specific folder.
Any recommendation how I could do that?
UPDATE
I tried using both the .csv and the .txt extension for testFile:
spool C:\Users\User\Desktop\testFile.csv  -- I also tried the .txt extension here
set colsep ';'
select * from product;
spool off;
However, when I open the file (for both csv and txt) I get the following result:
> set colsep '
> select * from product
I appreciate your replies!
set echo off
set feedback off
set linesize 1000
set pagesize 0
set sqlprompt ''
set trimspool on
spool output.csv
select columnA || ',' || columnB || ',' || ......
from table
where ...
spool off;
exit 0;
Then create a shell script that calls the sql file
sqlplus >/dev/null 2>&1 "user/pass@DATABASE" << EOF
whenever sqlerror exit 1
@file.sql
EOF
UPDATE: I just saw you are on Windows. The same principle still applies; you will probably need to use PowerShell instead of a shell script.
You can use SPOOL: http://docs.oracle.com/cd/B19306_01/server.102/b14357/ch12043.htm
spool OutFile.txt
select row1||','||row2... from product; --format you prefer
spool off;
I'm trying to insert data from an Excel sheet into a SQL Server database. The query is stored in a text file as follows:
insert into [demo].[dbo].[relative]
select *
from openrowset('Microsoft.Jet.OLEDB.4.0','Excel 8.0;Database=D:\relative.xls','select * from [sheet1$]');
When I execute the following command:
sqlcmd -S ADMIN-PC/SEXPRESS -i d:\demo.txt
it is showing this error:
Msg 7357, Level 16, State 2, Server ADMIN-PC\SEXPRESS, Line 1
Can anyone please help in rectifying my problem?
Try using the SQL Server Import Wizard to create a new table from the .xls file and then insert that data into the existing table from there. The problem you are having may be due to incompatibility between a 64-bit SQL Server instance and the 32-bit Excel (Jet) provider.
Or try using bcp
bcp demo.dbo.relative in "D:\relative.xls" -c -T
There is another way to get the same result..
create a temp table.
-- Note: you cannot SELECT ... INTO a table variable, so use a temp table instead;
-- it is created with the source table's columns (e.g. value varchar(50), translation varchar(max))
select * into #sometable from YOUR_DATABASE_TABLE with (nolock)
Then, do your OPENROWSET, BCP, etc. from here..
You can create a shell script which will automatically read the insert commands from the .csv file and write them to the database. If you want, I can help you with it. All you need to do is write all the insert statements in the .csv file.
#!/bin/ksh
# Build the SQL*Plus session on the fly and feed it the statements read from the .csv file.
{
  echo 'connect username/pwd@"Connection String"'
  echo 'set linesize 0;'
  echo 'set pagesize 0;'
  echo 'set echo off;'
  while read line; do
    A=$(echo "$line" | awk '{print $1}')
    B=$(echo "$line" | awk '{print $2}')
    # ...and so on, depending on the number of words in the insert statements
    echo "$A $B"
  done < your_insert_statements.csv
  echo 'exit'
} | sqlplus -silent /nolog > /dev/null
It will read the .csv file and automatically insert the records in the database.
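For illustration, this assumes every line of your_insert_statements.csv is a complete statement (table and column names here are hypothetical):

insert into employee (id, name) values (1, 'Alice');
insert into employee (id, name) values (2, 'Bob');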
Check my last question, Export values from SQL Server to txt file. With that question I was able to export values to a .txt file. Now I want to export a #tempTable to a .txt file. How can I do this?
Edit: My requirement: I want to export only newly inserted data from a table to a .txt file. Say a user inserts 10 new rows into an existing table; I want those 10 rows to end up in the .txt file. I used the inserted table in a trigger to get the new rows, but when I try to export to a .txt file with bcp, I can't, since I don't know the full name of the inserted table ([database].[schema].[tableName]). So I decided to copy the inserted table data into a #tempTable and export that to the .txt file.
That is what I am trying to do: export the inserted table data to a .txt file in SQL Server.
You could simply load the in-memory table into a regular table and then use the export code you already have:
select * into queryout from #tempTable
-- your code from step 1
drop table queryout
Here is an updated version of your trigger which should do the job:
create trigger monitorTest1 on test for insert as
declare @sql varchar(8000);
select * into tempTable from inserted
SELECT @sql = 'bcp "select * from test2.dbo.tempTable" queryout C:\temp\mytest.txt -c -t -T -S localhost'
exec xp_cmdshell @sql
drop table tempTable