I have a requirement to import multiple files from a folder into a local DB table. The column structure is the same in every file. The row count varies from 25k to 200k per file.
Which tool can be used to do this fastest, e.g. SQL Server, Oracle, or MySQL?
I think this could help, at least for MySQL:
LOAD DATA LOCAL INFILE '<filename>' INTO TABLE <table_name> FIELDS TERMINATED BY ';' LINES TERMINATED BY '\n' (<columns>);
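As a minimal sketch for one of the files (the path, table name, and column names here are hypothetical, and I'm assuming semicolon-delimited data as above), you would run one such statement per file, or generate the statements from a small script:
LOAD DATA LOCAL INFILE '/data/incoming/file_01.csv'
INTO TABLE import_target
FIELDS TERMINATED BY ';'
LINES TERMINATED BY '\n'
(col_a, col_b, col_c);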
You can use SQL Server with OPENROWSET.
1. https://learn.microsoft.com/en-us/sql/t-sql/functions/openrowset-transact-sql
You can import multiple files in one query, but it will depend on your needs.
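As a rough sketch (the file path, format file, and target table below are hypothetical; OPENROWSET(BULK ...) needs a format file describing the CSV layout):
INSERT INTO dbo.ImportTarget (ColA, ColB, ColC)
SELECT src.ColA, src.ColB, src.ColC
FROM OPENROWSET(
    BULK 'C:\import\file_01.csv',
    FORMATFILE = 'C:\import\file_01.fmt',
    FIRSTROW = 2  -- skip a header row, if the file has one
) AS src;
To cover several files in one statement you could UNION ALL multiple OPENROWSET reads, or simply repeat the INSERT once per file.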
I have a third-party tool which uses CSV text drivers, which allow executing SQL queries on CSV data imported into the tool. Most Oracle SQL queries work on it, while many don't.
I have a requirement where I have to read and import data into the tool using a CSV file which has no column names or header fields available. How can I execute SQL queries on a table which has no column names or headers defined?
Sample Table:
AB 100 GPAA 9876
AC 101 GPAB 9877
AD 102 GPAC 9878
You would likely need to add the headers before running the queries. Is there a table in which the data will eventually end up? If so, you could export the column names from there first, then append the CSV info to the newly created file afterward.
So apparently, you can specify whether or not your CSV file has a header when using the CSV SQL text driver to interact with CSV files.
jdbc:csv:////<path_to_file>/?_CSV_Header=false;
Then, we can have a query like
select distinct (column1) as accountID, (column2) as groupID from csv_file_name
The parameters (column1), (column2)... represent the actual columns in the file from left to right and they have to be written like this for the query to work.
I have a CSV file with 4,500,000 rows and I need a way to limit the number of rows imported into the database, say to 100,000 rows. How can I import a limited number of rows into the MySQL database?
What you could do is use PhpMyAdmin - this has an option to break up your CSV file into multiple requests when doing the import. Check out this article: http://www.group3solutions.com/blog/tips-on-phpmyadmin-csv-importing/
Also - you could try this application: http://www.ozerov.de/bigdump/ . It worked for me before.
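If you are able to run SQL against the server directly, another route (sketched here with hypothetical table names) is to load the whole file into a staging table and then copy over only the rows you need:
LOAD DATA LOCAL INFILE '/path/to/bigfile.csv'
INTO TABLE staging_rows
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';

-- copy only the first 100,000 rows into the real table
INSERT INTO target_rows
SELECT * FROM staging_rows
LIMIT 100000;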
I have a little problem. My friend has a database with over 10 tables, and each table has around 90-100 records.
I can't find a way to export the records from his tables (to put into a SQL file something like INSERT INTO .... VALUES ... for each existing record) so that I can import them into my database.
How can I do that?
I tried: right-click on a table -> Script Table as -> INSERT TO -> File ...
but it only generates the INSERT statement.
Is there a solution? Or is this feature only in the commercial version?
UPDATE
You can use the BCP command from the command prompt like this:
For export: bcp ADatabase.dbo.OneTable out d:\test\OneTable.bcp -c -Usa -Ppassword
For import: bcp ADatabase.dbo.OneTable in d:\test\OneTable.bcp -c -Usa -Ppassword
These commands will create a BCP file which contains the records for the specified table. You can then import the existing BCP file into another database.
If you use a remote database, then:
bcp ADatabaseRemote.dbo.OneTableRemote out d:\test\OneTableRemote.bcp -Slocalhost\SQLExpress -Usa -Ppassword
Instead of localhost\SQLExpress, you can use localhost or another server name...
Probably the simplest way to do this would be to run a SELECT statement that outputs to a file. Then you can import that data into your database.
For simple moves, I have also done a copy/paste manually. Sometimes it is better to use Excel as a staging platform before pasting it into the new database. You may need to create a temporary table in your new database that matches up exactly with the data you are pasting over. For example, I usually don't put a PK on the temp table at first and make the PK field just an INT. That way the copy will go smoother.
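A rough sketch of that staging-table idea (the table and column names are hypothetical); the point is to leave the PK column as a plain nullable INT so the paste is not rejected:
CREATE TABLE dbo.Staging_Orders (
    OrderId    INT NULL,        -- no PRIMARY KEY / IDENTITY yet, so pasted rows are accepted as-is
    CustomerId INT NULL,
    OrderDate  DATETIME NULL
);

-- after checking the pasted data, move it into the real table
INSERT INTO dbo.Orders (OrderId, CustomerId, OrderDate)
SELECT OrderId, CustomerId, OrderDate
FROM dbo.Staging_Orders;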
In the corporate world, you would use SSIS to move this data around.
There are a couple of ways you could do this. One: select everything from each table and save the results as a CSV or delimited file (you can do this from SQL Server Management Studio). You can also script the tables as CREATE statements and copy the scripts over to the new database, assuming it is also SQL Server. Then, for the import, use the LOAD DATA INFILE statement. You may have to Google the syntax for SQL Server, but I know this works in MySQL and Oracle; I haven't tried it in SQL Server yet.
LOAD DATA INFILE 'myfile'
INTO TABLE stuff
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
SET id = NULL;
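For SQL Server, the rough equivalent is BULK INSERT; a sketch with a hypothetical path and the same table name:
BULK INSERT dbo.stuff
FROM 'C:\export\myfile.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);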
Or, if you are going to another SQL Server, use the SQL Server Import and Export Wizard.
http://msdn.microsoft.com/en-us/library/ms141209.aspx
Hi, I am using SQL Server 2008.
How can I import an Excel file into the database? Which way is the easiest and simplest to do?
OpenRowSet
BulkCopy
Linked Servers
SSIS
I have the above options to import Excel into the database.
In my opinion the SSIS wizard is the best way to import Excel data: you get a row- and column-wise preview of the table data that will be inserted, and you can also specify column names and constraints and parse the data using a query.
UPDATE :
If the data in your Excel file does not require any processing to match your database table, then I recommend you save your Excel file as a CSV and use a combination of BULK INSERT and the bcp.exe program.
To use BULK INSERT you will need a format file which defines how your datafile matches up to your database table. You can write this by hand to match the existing database table or you can use the following command to generate the format file you need:
bcp [ServerName].[SchemaName].[TableName] format nul -c -f [FormatFileOutputName].fmt -S[ServerHostName] -U[DbUserName] -P[DbUserPassword]
Now you will have 2 files:
DatafileName.csv
FormatFileName.fmt.
Use BULK INSERT within SQL Server to insert your data.
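A sketch of that statement (the file names follow the example above; the table name and paths are hypothetical):
BULK INSERT dbo.TableName
FROM 'C:\import\DatafileName.csv'
WITH (
    FORMATFILE = 'C:\import\FormatFileName.fmt',
    FIRSTROW = 1
);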
Note: If the columns in your datafile are in a different order than your database table then you can simply edit the generated format file to have them map correctly.
Is it possible for me to write an SQL query from within PhpMyAdmin that will search for matching records from a .csv file and match them to a table in MySQL?
Basically I want to do a WHERE IN query, but I want the WHERE IN to check records in a .csv file on my local machine, not a column in the database.
Can I do this?
I'd load the .csv content into a new table, do the comparison/merge and drop the table again.
Loading .csv files into mysql tables is easy:
LOAD DATA INFILE 'path/to/industries.csv'
INTO TABLE `industries`
FIELDS TERMINATED BY ';'
IGNORE 1 LINES (`nogaCode`, `title`);
There are a lot more things you can tell the LOAD command, like what character wraps the entries (ENCLOSED BY), etc.
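To finish the idea, a sketch of the comparison and cleanup; main_table and its code column are hypothetical, while industries / nogaCode come from the LOAD example above:
SELECT m.*
FROM main_table AS m
WHERE m.code IN (SELECT i.nogaCode FROM industries AS i);

DROP TABLE industries;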
I would do the following:
Create a temporary or MEMORY table on the server
Copy the CSV file to the server
Use the LOAD DATA INFILE command
Run your comparison
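For step 1, a rough sketch with hypothetical table and column names:
CREATE TEMPORARY TABLE csv_compare (
    code  VARCHAR(32),
    label VARCHAR(255)
) ENGINE = MEMORY;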
There is no way to have the CSV file on the client and the table on the server and be able to compare the contents of both using only SQL.
Short answer: no, you can't.
Long answer: you'll need to build the query locally, maybe with a script (Python/PHP), or just upload the CSV into a table and do a JOIN query (or just WHERE x IN (SELECT y FROM myTmpTable ...)).
For anyone asking now, there is a newer tool that I used: Write SQL on CSV file