I have hundreds of Excel documents which hold the lookup values for all the lookup tables that I am giving to my developers. Some are small and some are super huge, like world cities. I could send them the xls file and let them import it into the DB, but I prefer to send them the SQL inserts in a text file so they can just execute it and save the time of loading all the data.
Now, I don't have any MySQL environment set up, as I don't do development, so the question is: how do I convert the various columns of lookup values on each Excel tab into INSERT statements to load in? Are there any online tools that can read the xls and create SQL inserts? I don't want to do it manually; the city table alone would take me a whole week, even putting in 12 hours a day, to create the inserts for all the rows by hand.
Within Excel, save your spreadsheets as CSV (comma-separated values) files. Your developers will be able to load them into MySQL directly using LOAD DATA INFILE. If they create a table whose columns match the CSV columns, they can import each file with the following SQL command:
LOAD DATA INFILE 'file_name.csv'
INTO TABLE tbl_name
FIELDS
TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '"'
LINES
TERMINATED BY '\r\n' -- or '\n' if you are on unix or '\r' if you are on mac
IGNORE 1 LINES -- if you want to skip over the column headings
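For example, if a lookup sheet has an ID column and a value column, a matching table could be sketched like this (the table and column names below are hypothetical and should be adjusted to your spreadsheet):
CREATE TABLE tbl_name (
  id INT NOT NULL,               -- matches the first CSV column
  lookup_value VARCHAR(255),     -- matches the second CSV column
  PRIMARY KEY (id)
);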
I am having trouble transferring Excel data into MySQL Query Browser. Can someone help me with how to transfer Excel data to SQL?
My Excel data is stored at
C:\2019\countries.xlsx
My Excel data looks like the picture below:
Below is my SQL table info; the table has the columns (id, name, country_code and language):
I hope someone can guide me on an easy way to insert the data into the database. Thanks.
You need to convert it into CSV, then use the query below to import it.
load data local infile 'C:/2019/countries.csv' into table countries
fields terminated by ','
enclosed by '"'
lines terminated by '\r\n'
ignore 1 lines;
You can refer to the details in the MySQL LOAD DATA docs: http://dev.mysql.com/doc/refman/5.6/en/load-data.html
Note: on Windows, terminate lines with '\r\n'; use '\n' on Linux-based OS.
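If some columns of the table (for example an auto-increment id) should not be filled from the file, you can append a column list to the statement. The column names below are the ones mentioned in the question, and the assumption is that the CSV columns appear in that order:
load data local infile 'C:/2019/countries.csv' into table countries
fields terminated by ','
enclosed by '"'
lines terminated by '\r\n'
ignore 1 lines
(name, country_code, language);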
Every morning one of my clients sends me a .txt file with ';' as the separator, and this is how the file is currently being imported into a temp table using SSIS:
mario.mascarenhas;MARIO LUIZ MASCARENHAS;2017-03-21 13:18:22;PDV;94d33a66dbaaff15a01d8139c7acd7c6;;;1;0;0;0;0;0;0;0;0;0;0;\N
evilanio.asevedo;EVILANIO ASEVEDO;2017-03-21 13:26:10;PDV;30a1bd072ac5f158f99445bb0975e423;;;1;1;0;0;0;0;0;0;0;0;0;\N
marcelo.tarso;MARCELO TARSO;2017-03-21 13:47:09;PDV;ef6b5e971242ec345552cdb724968f8a;;;1;0;0;0;0;0;0;0;0;0;0;\N
tiago.rodrigues;TIAGO ALVES RODRIGUES;2017-03-21 13:49:04;PDV;d782d4b30c0d302fe815b2cb48de4d03;;;1;1;0;0;0;0;0;0;0;0;0;\N
roberto.freire;ROBERTO CUSTODIO;2017-03-21 13:54:53;PDV;78794a18187f068d612e6b6370a60781;;;1;0;0;0;1;0;0;0;0;0;0;\N
eduardo.lima;EDUARDO MORO LIMA;2017-03-21 13:55:24;PDV;83e1c2696faa83d54881b13c70a07924;;;1;0;0;0;0;0;0;0;0;0;0;\N
Each file contains at least 23,000 rows just like that.
I already made a table with the correct number of columns to receive this data. So what I want is to "explode" (just like in PHP) each row using ';' as the column separator and loop the insert into my table named dbo.showHistoricalLogging.
I've been searching for a solution here on Stack Overflow, but found nothing specific that takes this volume of data into consideration while looping an insert.
Any ideas? I'm running SQL Server 2008.
My suggestion:
Convert the text file into a CSV file, then refer to this post from Stack Overflow on using BULK INSERT. I have used this before, while I was at the University of Arizona, for one of my programming assignments in my Database Design class. For any clarifications and/or questions, leave a comment and I will do my best.
Something like this should work:
BULK INSERT dbo.showHistoricalLogging FROM 'C:\MyFile.txt' WITH (FIELDTERMINATOR = ';', ROWTERMINATOR = '\n');
Consult the Microsoft BULK INSERT documentation if you need other parameters. Alternatively, SSIS makes this super easy as well - there are many ways you could do this, honestly.
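For reference, a sketch with a few of the optional parameters that tend to be useful for large daily files (the batch size, error threshold and error-file path below are illustrative values, not requirements):
BULK INSERT dbo.showHistoricalLogging
FROM 'C:\MyFile.txt'
WITH (
    FIELDTERMINATOR = ';',
    ROWTERMINATOR = '\n',
    BATCHSIZE = 5000,                   -- commit in chunks rather than one huge transaction
    MAXERRORS = 10,                     -- tolerate a few bad rows before aborting
    ERRORFILE = 'C:\MyFile_errors.txt', -- rows that fail formatting checks are written here
    TABLOCK                             -- take a table lock for faster bulk loading
);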
I am currently trying to move several data tables in my current AWS instance's Redshift database to a new database in a different AWS instance (for background, my company has acquired a new one and we need to consolidate to one instance of AWS).
I am using the UNLOAD command below on a table, and I plan on making that table a CSV, then uploading that file to the destination AWS instance's S3 and using the COPY command to finish moving the table.
unload ('select * from table1')
to 's3://destination_folder'
CREDENTIALS 'aws_access_key_id=XXXXXXXXXXXXX;aws_secret_access_key=XXXXXXXXX'
ADDQUOTES
DELIMITER AS ','
PARALLEL OFF;
My issue is that when I change the file type to .csv and open the file, I get inconsistencies in the data. There are areas where many rows are skipped, and on some rows, after the expected columns end, I get additional columns with the value "f" for unknown reasons. Any help on how I could achieve this transfer would be greatly appreciated.
EDIT 1: It looks like fields with quotes are having the quotes removed. Additionally, fields with commas are being split apart at the commas. I've identified some fields with quotes and commas, and they are throwing everything off. Would the ADDQUOTES clause I have apply to the entire field, regardless of whether there are quotes and commas within the field?
The default output file will have a .txt extension and include the quotes. Try opening it with Excel and then saving it as a CSV file.
Refer to https://help.xero.com/Q_ConvertTXT
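If you would rather load the unloaded files directly instead of round-tripping through Excel, a COPY that mirrors the UNLOAD options above is a reasonable sketch (the table name, bucket path and credentials are placeholders); REMOVEQUOTES strips the quotes that ADDQUOTES wrapped around each field, so embedded commas survive:
copy table1
from 's3://destination_folder'
CREDENTIALS 'aws_access_key_id=XXXXXXXXXXXXX;aws_secret_access_key=XXXXXXXXX'
REMOVEQUOTES
DELIMITER AS ',';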
I have the following table in xlsx format which I would like to import into my MySQL database:
The table is pretty complicated, and I only want the records after '1)HEADING'.
I have been looking at PHP libraries to import Excel into SQL, but they only seem to work for simple Excel files.
You have two ways to do this:
First method:
1) Export it into some text format. The easiest will probably be a tab-delimited version, but CSV can work as well.
2) Use the LOAD DATA capability. See http://dev.mysql.com/doc/refman/5.1/en/load-data.html
3) Look halfway down the page, as it gives a good example for tab-separated data:
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
4) Check your data. Sometimes quoting or escaping has problems, and you need to adjust your source, your import command - or it may just be easier to post-process via SQL. (A combined sketch of the full command follows this list.)
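Putting those steps together, a minimal sketch for a tab-delimited export might look like this (the file path, table name and the number of lines to skip above '1)HEADING' are assumptions you will need to adjust):
LOAD DATA INFILE '/path/to/export.txt'
INTO TABLE my_table
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n'
IGNORE 3 LINES; -- skip everything above and including '1)HEADING'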
Second method:
There's a simple online tool that can do this called sqlizer.io.
You upload an XLSX file to it, enter a sheet name and cell range, and it will generate a CREATE TABLE statement and a bunch of INSERT statements to import all your data into a MySQL database.
My company is currently moving our databases around, shifting one set of tables out of the old MySQL instance into the new one. We've done some development prior to this migration, and some tables' structure has been altered from the original (e.g. columns were dropped).
So currently I've dumped the data from the old database and am now attempting to reinsert it into the new table. Of course, the import borks when it tries to insert rows with more fields than the table has.
What's the best way (preferably scriptable, because I foresee myself having to do this a few more times) to import only the fields I need into the new table?
Update the following to suit:
SELECT CONCAT('INSERT INTO NEW_TABLE ... (', ot.column, ');')
FROM OLD_TABLE ot
You need an INSERT statement for the table in the new database, with a column list. Then populate the VALUES portion accordingly, based on the values in the old table. Run it in the old environment and you'll have your inserts with data for the new environment - just copy'n'paste them into a script.
Mind, though, that datatypes have to be handled accordingly - dates (incl. time) and strings will have to be quoted, because you're dealing in text.
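For example, a minimal sketch assuming the new table kept only two of the old columns (new_table, old_table, col_a and col_b are hypothetical names); QUOTE() takes care of quoting and escaping the string values, and emits the literal NULL for NULL fields:
SELECT CONCAT(
    'INSERT INTO new_table (col_a, col_b) VALUES (',
    QUOTE(ot.col_a), ', ',
    QUOTE(ot.col_b), ');'
) AS insert_stmt
FROM old_table ot;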
First of all, create the new database with the old structure, or temp tables in the current database. Then run a script with insert statements for each row, but the values must include only those fields that exist in the new structure.
insert into newTable select col1, col2 from tempTable
Use the fastest way, LOAD DATA INFILE:
-- Dump the data
SELECT * INTO OUTFILE 'mybigtable.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM mybigtable;
-- Load the data
LOAD DATA LOCAL INFILE 'mybigtable.csv'
INTO TABLE mynewbigtable
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\n'
(@col1, @col2, @col3, @col4) SET name = @col4, id = @col2;
Refs:
http://dev.mysql.com/doc/refman/5.6/en/insert-speed.html
http://dev.mysql.com/doc/refman/5.6/en/load-data.html
If you're using MySQL 5.1, a powerful (although maybe overkill in this case) solution is to do an XML mysqldump and use an XSLT to transform it. Unfortunately, re-importing that XML file isn't supported in 5.0; you'll need 5.1, 5.4, or 6.0.