I am very new to Apache Derby, and I need to export all of the rows from a table into a CSV file.
I am using RazorSQL to connect to the database.
What I need to know is whether Derby has an SQL command equivalent to the MySQL export statement shown below.
NOTE: VERY IMPORTANT. One of the columns in this table is defined with the XML datatype. Exporting via any kind of wizard will NOT work, because the select statement needs to wrap that column in 'xmlserialize'.
Here is what I want to do, but Derby does not understand the "INTO OUTFILE" clause I am used to using with MySQL.
select
message_id,
msg_schema_id,
user_id,
req_send_time,
destination,
xmlserialize(msg_content as clob) as message,
facility_id
from Audit
into outfile 'd:\csv\audit.csv'
fields terminated by ','
enclosed by '"'
lines terminated by '\n';
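A possible Derby-side equivalent would be the SYSCS_UTIL.SYSCS_EXPORT_QUERY system procedure, which takes the query text, the output file, the column delimiter, the character delimiter, and the codeset. This is an untested sketch; the UTF-8 codeset is an assumption:
CALL SYSCS_UTIL.SYSCS_EXPORT_QUERY(
    'SELECT message_id, msg_schema_id, user_id, req_send_time, destination,
            XMLSERIALIZE(msg_content AS CLOB) AS message, facility_id
     FROM Audit',
    'd:\csv\audit.csv', ',', '"', 'UTF-8');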
I have a requirement to import multiple files from a folder into a local DB table. The column structures are the same; the row count varies from 25k to 200k per file.
Which tool can do this fastest: SQL Server, Oracle, or MySQL?
I think this could help, at least for MySQL:
LOAD DATA LOCAL INFILE '<filename>' INTO TABLE <table_name> FIELDS TERMINATED BY ';' LINES TERMINATED BY '\n' (<columns>);
You can use SQL Server with OPENROWSET.
1. https://learn.microsoft.com/en-us/sql/t-sql/functions/openrowset-transact-sql
You can import multiple files in one query, but it will depend on your needs.
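For instance, a minimal sketch for a single file (the path, format file, and staging table name below are assumptions), which you would repeat or script once per file in the folder:
INSERT INTO dbo.StagingTable
SELECT f.*
FROM OPENROWSET(
    BULK 'd:\import\file1.csv',
    FORMATFILE = 'd:\import\columns.fmt',
    FIRSTROW = 2
) AS f;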
Below is a sample line of the CSV:
012,12/11/2013,"<555523051548>KRISHNA KUMAR ASHOKU,AR",<10-12-2013>,555523051548,12/11/2013,"13,012.55",
You can see KRISHNA KUMAR ASHOKU,AR as a single field, but it is treated as two different fields (KRISHNA KUMAR ASHOKU and AR) because of the comma, even though the value is enclosed in double quotes. Still no luck.
I tried
BULK
INSERT tbl
FROM 'd:\1.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
FIRSTROW=2
)
GO
Is there any solution for this?
The answer is: you can't do that. See http://technet.microsoft.com/en-us/library/ms188365.aspx.
"Importing Data from a CSV file
Comma-separated value (CSV) files are not supported by SQL Server bulk-import operations. However, in some cases, a CSV file can be used as the data file for a bulk import of data into SQL Server. For information about the requirements for importing data from a CSV data file, see Prepare Data for Bulk Export or Import (SQL Server)."
The general solution is that you must convert your CSV file into one that can be successfully imported. You can do that in many ways, such as by creating the file with a different delimiter (such as TAB) or by importing your table using a tool that understands CSV files (such as Excel or many scripting languages) and exporting it with a unique delimiter (such as TAB), from which you can then BULK INSERT.
They added support for this in SQL Server 2017 (14.x) CTP 1.1. You need to use the FORMAT = 'CSV' input file option with the BULK INSERT command.
To be clear, here is what the CSV that was giving me problems looks like. The first line is easy to parse; the second line contains the curveball, since there is a comma inside the quoted field:
jenkins-2019-09-25_cve-2019-10401,CVE-2019-10401,4,Jenkins Advisory 2019-09-25: CVE-2019-10401:
jenkins-2019-09-25_cve-2019-10403_cve-2019-10404,"CVE-2019-10404,CVE-2019-10403",4,Jenkins Advisory 2019-09-25: CVE-2019-10403: CVE-2019-10404:
Broken Code
BULK INSERT temp
FROM 'c:\test.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '0x0a',
FIRSTROW= 2
);
Working Code
BULK INSERT temp
FROM 'c:\test.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '0x0a',
FORMAT = 'CSV',
FIRSTROW= 2
);
Unfortunately, SQL Server's import methods (BCP and BULK INSERT) do not understand quoting with double quotes.
Source : http://msdn.microsoft.com/en-us/library/ms191485%28v=sql.100%29.aspx
I encountered this problem recently and had to switch to tab-delimited format. If you do that and use SQL Server Management Studio to do the import (right-click on the database, then select Tasks, then Import), tab-delimited works just fine. The bulk insert option with tab-delimited should also work.
I must admit to being very surprised when finding out that Microsoft SQL Server had this comma-delimited issue. The CSV file format is a very old one, so finding out that this was an issue with a modern database was very disappointing.
MS have now addressed this issue, and you can use FIELDQUOTE in your WITH clause to add quoted-string support:
FIELDQUOTE = '"',
anywhere in your WITH clause should do the trick, if you have SQL Server 2017 or above.
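For example, combining it with the FORMAT = 'CSV' option from the answer above (same assumed file and table names):
BULK INSERT temp
FROM 'c:\test.csv'
WITH
(
    FORMAT = 'CSV',
    FIELDQUOTE = '"',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a',
    FIRSTROW = 2
);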
Well, BULK INSERT is very fast but not very flexible. Can you load the data into a staging table and then push everything into a production table? Once the data is in SQL Server, you will have a lot more control over how you move it from one table to another. So, basically (a rough T-SQL sketch follows the list):
1) Load the data into a staging table.
2) Clean/convert it by copying it to a second staging table defined with the desired datatypes. Good data is copied over; bad data is left behind.
3) Copy the data from the "clean" table to the "live" table.
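A rough T-SQL sketch of that flow, with hypothetical table and column names (the raw staging table keeps everything as text; TRY_CAST filters out rows that don't convert):
-- 1) Raw staging: everything lands as plain text
CREATE TABLE StagingRaw (col1 VARCHAR(255), col2 VARCHAR(255), col3 VARCHAR(255));
-- ... BULK INSERT into StagingRaw here ...

-- 2) Typed staging: copy only the rows that convert cleanly
CREATE TABLE StagingClean (id INT, created DATE, amount DECIMAL(18,2));

INSERT INTO StagingClean (id, created, amount)
SELECT TRY_CAST(col1 AS INT), TRY_CAST(col2 AS DATE), TRY_CAST(col3 AS DECIMAL(18,2))
FROM StagingRaw
WHERE TRY_CAST(col1 AS INT) IS NOT NULL;   -- bad rows stay behind in StagingRaw

-- 3) Copy the clean rows into the live table
INSERT INTO LiveTable (id, created, amount)
SELECT id, created, amount FROM StagingClean;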
I have a little problem. My friend has a database with over 10 tables, and each table has around 90-100 records.
I can't find a way to export the records from his tables (to put in a SQL file something like INSERT INTO .... VALUES ... for each existing record) so I can import them into my database.
How can I do that?
I tried: right-click on a table -> Script Table as -> INSERT To -> File ...
but it only generates the INSERT statement, not the values.
Is there a solution, or is this feature only in the commercial version?
UPDATE
You can use the bcp command from the command prompt, like this:
For export: bcp ADatabase.dbo.OneTable out d:\test\OneTable.bcp -c -Usa -Ppassword
For import: bcp ADatabase.dbo.OneTable in d:\test\OneTable.bcp -c -Usa -Ppassword
These commands create a BCP file containing the records of the specified table. You can then import the existing BCP file into another database.
If you use a remote database, then:
bcp ADatabaseRemote.dbo.OneTableRemote out d:\test\OneTableRemote.bcp -Slocalhost\SQLExpress -Usa -Ppassword
Instead of localhost\SQLExpress, you can use localhost or another server name...
Probably the simplest way to do this would be to run a SELECT statement that outputs to a file. Then you can import that data into your database.
For simple moves, I have also done a copy/paste manually. Sometimes it is better to use Excel as a staging platform before pasting it into the new database. You may need to create a temporary table in your new database that matches up exactly with the data you are pasting over. For example, I usually don't put a PK on the temp table at first and make the PK field just an INT. That way the copy will go smoother.
In the corporate world, you would use SSIS to move this data around.
A couple of ways you could do this. One: select everything from each table and save the results as a CSV or delimited file (you can do this from SQL Server Management Studio). You can also script the tables as CREATE statements and copy the scripts over to the new database, assuming it is also SQL Server. Then, for the import, use the LOAD DATA INFILE statement. You may have to Google the syntax for SQL Server, but I know this works in MySQL and Oracle; I haven't tried it in SQL Server yet.
LOAD DATA INFILE 'myfile'
INTO TABLE stuff
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
SET id = NULL;
Or, if you are going to another SQL Server, use the SQL Server Import and Export Wizard.
http://msdn.microsoft.com/en-us/library/ms141209.aspx
I need a full dump of my SQL Server database in one large XML file. I need to get all the tables, except that on some tables I need to exclude specific columns (columns with raw data).
How can I do this?
I am using SQL Server 2008 R2.
Never tried it, but I believe you can use the bulk export option of bcp:
From SQL Server Books Online:
"E. Bulk exporting XML data
The following example uses bcp to bulk export XML data from the table that is created in the preceding example by using the same XML format file. In the following bcp command, <server_name> and <instance_name> represent placeholders that must be replaced with appropriate values:
bcp bulktest..xTable out a-wn.out -N -T -S<server_name>\<instance_name>
"
My company's currently moving our databases around, shifting one set of tables out from the old MySQL instance into the new one. We've done some development prior to this migration, and some tables' structure has been altered from the original (e.g. columns were dropped).
So currently I've dumped the data from the old database and am now attempting to reinsert it into the new table. Of course, the import borks when it tries to insert rows with more fields than the table has.
What's the best way (preferably scriptable, because I foresee myself having to do this a few more times) to import only the fields I need into the new table?
Update the following to suit:
SELECT CONCAT('INSERT INTO NEW_TABLE ... (', ot.some_column, ');')
FROM OLD_TABLE ot
You need an INSERT statement for the table in the new database, with a column list. Then populate the value portion accordingly, based on the values in the old table. Run it in the old environment, and you'll have your inserts with data for the new environment - just copy'n'paste them into a script.
Mind, though, that datatypes have to be handled accordingly - dates (including time) and strings will have to be quoted and escaped, because you're dealing in text.
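A sketch of such a generator, run against the old MySQL database (the columns id, name, and created are hypothetical); MySQL's QUOTE() handles the quoting and escaping of string and date values:
SELECT CONCAT(
    'INSERT INTO NEW_TABLE (id, name, created) VALUES (',
    ot.id, ', ', QUOTE(ot.name), ', ', QUOTE(ot.created), ');'
) AS insert_stmt
FROM OLD_TABLE ot;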
First of all, create a new database with the old structure, or temp tables in the current database. Then run a script with an insert statement for each row, but the values must include only those fields that exist in the new structure.
insert into newTable select col1, col2 from tempTable
Use the fastest way, LOAD DATA INFILE:
-- Dump data
SELECT * INTO OUTFILE 'mybigtable.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM mybigtable
-- Load data
LOAD DATA LOCAL INFILE 'mybigtable.csv'
INTO TABLE mynewbigtable
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
(@col1, @col2, @col3, @col4) SET name = @col4, id = @col2;
Refs:
http://dev.mysql.com/doc/refman/5.6/en/insert-speed.html
http://dev.mysql.com/doc/refman/5.6/en/load-data.html
If you're using MySQL 5.1, a powerful (although maybe, in this case, overkill) solution is to do an XML mysqldump and use an XSLT to transform it. Unfortunately, re-importing that XML file isn't supported in 5.0; you'll need 5.1, 5.4, or 6.0.