I have to load a set of questions into a MySQL database table, but the table format is like below:
QNO Questions Option a Option b Option c Option d Rightanswer... The text file which I need to load into the database using the LOAD DATA INFILE command is in a different format. How can I match my text file's format to my table's format?
You can re-map the columns to import in LOAD DATA INFILE.
Check the manual entry
Example:
LOAD DATA LOCAL INFILE '/importfile.csv'
INTO TABLE test_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(field1, field2, field3);
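If the file's column order differs from the table's, you can also read each field into a user variable and assign it to the right column with SET. A minimal sketch, assuming (hypothetically) that the right answer comes first in the file; the table and column names below are placeholders for your actual ones:
LOAD DATA LOCAL INFILE '/importfile.csv'
INTO TABLE questions
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
-- read the file's fields into user variables, in file order...
(@rightanswer, @qno, @question, @a, @b, @c, @d)
-- ...then assign them to the table's columns in any order
SET qno = @qno, question = @question,
    option_a = @a, option_b = @b, option_c = @c, option_d = @d,
    rightanswer = @rightanswer;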
I like using a graphical front-end like HeidiSQL to point-and-click the desired fields, and copy the generated SQL from there.
This depends on your table format, but if there is any complex preprocessing, it's probably a lot easier to do it outside of MySQL.
I'm trying to load a CSV file which is Control+A separated into BigQuery. What should I pass for the -F parameter of the bq load command? All the options I have tried result in an error while loading.
I would guess that Control+A is used in some legacy formats that the OP wants to load into BigQuery. On the other hand, Control+A can be chosen when it is hard to select any of the usually used delimiters.
My recommendation would be to load your CSV file without any real delimiter, so the whole row is loaded as a single field.
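One way to do that with the bq tool is to pass a field delimiter that never occurs in the data. A sketch, assuming bash and that the \x02 byte does not appear anywhere in the file (the dataset and file names are hypothetical):
# load each line as a single STRING column by using an unused delimiter
bq load --source_format=CSV -F $'\x02' mydataset.TempTable ./data.csv FullRow:STRING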
Assume the rows loaded into TempTable look like the one below, with just one column called FullRow:
'value1^Avalue2^Avalue3'
where ^A is the "invisible" Control+A character (\x01).
So, after you have loaded your file into BigQuery, you can parse it into separate columns and write it to the final table with something like the below:
SELECT
REGEXP_EXTRACT(FullRow, r'(?:\w*\x01){0}(\w*)') AS col1,
REGEXP_EXTRACT(FullRow, r'(?:\w*\x01){1}(\w*)') AS col2,
REGEXP_EXTRACT(FullRow, r'(?:\w*\x01){2}(\w*)') AS col3
FROM TempTable
The above is confirmed to work, as I have used this approach multiple times. It works in both Legacy and Standard SQL.
I have the following table in xlsx format which I would like to import into my MySQL database:
The table is pretty complicated and I only want the records after '1)HEADING'.
I have been looking at PHP libraries to import into SQL, but they only seem to be for simple Excel files.
There are two ways to do this:
First method:
1) Export it into some text format. The easiest will probably be a tab-delimited version, but CSV can work as well.
2) Use the load data capability. See http://dev.mysql.com/doc/refman/5.1/en/load-data.html
3) Look halfway down the page, as it gives a good example for tab-separated data:
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
4) Check your data. Sometimes quoting or escaping has problems, and you need to adjust your source or your import command, or it may just be easier to post-process via SQL. A full statement is sketched below.
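Putting the pieces together, a minimal sketch, assuming the sheet was exported as a tab-delimited file (the path and table name are hypothetical):
LOAD DATA LOCAL INFILE '/path/to/export.txt'
INTO TABLE my_table
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n'
IGNORE 1 LINES; -- skip the header row, if the export kept one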
Second method:
There's a simple online tool that can do this called sqlizer.io.
You upload an XLSX file to it, enter a sheet name and cell range, and it will generate a CREATE TABLE statement and a bunch of INSERT statements to import all your data into a MySQL database.
Hi, I often have to insert a lot of data into a table. For example, I would have data from Excel or a text file in the form of
1,a
3,bsdf
4,sdkfj
5,something
129,else
I then construct one insert statement per row (five in this example) and run the SQL script. I found this slow when I have to send thousands of small packets to the server, and it also causes extra overhead on the network.
What's your best way of doing this?
Update: I'm using Oracle 10g.
Use Oracle external tables.
See also e.g.
OraFaq about external tables
What Tom thinks about external tables
René Nyffenegger's notes about external tables
A simple example that should get you started
You need a file located in a server directory (get familiar with directory objects).
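If the directory object doesn't exist yet, it can be created along these lines (a sketch; this requires the CREATE ANY DIRECTORY privilege, and the grantee name is hypothetical):
create or replace directory jtest as 'c:\data\jtest';
grant read, write on directory jtest to jdoe;
With the directory object in place: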
SQL> select directory_path from all_directories where directory_name = 'JTEST';
DIRECTORY_PATH
--------------------------------------------------------------------------------
c:\data\jtest
SQL> !cat ~/.gvfs/jtest\ on\ 192.168.xxx.xxx/exttable-1.csv
1,a
3,bsdf
4,sdkfj
5,something
129,else
Create an external table:
create table so13t (
id number(4),
data varchar2(20)
)
organization external (
type oracle_loader
default directory jtest /* jtest is an existing directory object */
access parameters (
records delimited by newline
fields terminated by ','
missing field values are null
)
location ('exttable-1.csv') /* the file located in jtest directory */
)
reject limit unlimited;
Now you can use all the powers of SQL to access the data:
SQL> select * from so13t order by data;
ID DATA
---------- ------------------------------------------------------------
1 a
3 bsdf
129 else
4 sdkfj
5 something
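Once the external table works, copying the data into a regular table is a single statement. A quick sketch (the target table my_table is hypothetical):
-- the external table re-reads the file on every query, so copy it once
insert into my_table (id, data)
select id, data from so13t;
commit;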
I'm not sure if this works in Oracle, but in SQL Server you can use the BULK INSERT statement to upload data from a txt or a csv file.
BULK INSERT [TableName]
FROM 'c:\FileName.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
)
GO
Just make sure that the table columns correctly match what's in the txt file. For a more complicated setup you may want to use a format file; see the following:
http://msdn.microsoft.com/en-us/library/ms178129.aspx
There are a lot of ways to speed this up.
1) Do it in a single transaction. Committing once at the end, instead of after every statement, avoids a lot of per-statement overhead (see the sketch after this list).
2) Load directly as a CSV file. If you load data as a CSV file, the "SQL" statements aren't required at all. In MySQL, the LOAD DATA INFILE operation accomplishes this very intuitively and simply.
3) You can also simply dump the whole file as text into a table called "raw" and then let the database parse the data on its own using triggers. This is a hack, but it will simplify your application code and reduce network usage.
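Since the update mentions Oracle 10g, here is a minimal sketch of point 1, batching the sample rows into one statement and one commit (the table t and its columns are hypothetical):
-- one round trip and one commit instead of one per row
insert all
  into t (id, data) values (1, 'a')
  into t (id, data) values (3, 'bsdf')
  into t (id, data) values (129, 'else')
select * from dual;
commit;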
I have hundreds of Excel documents which have lookup values for all the lookup tables that I am giving to my developers. Some are small and some are super huge, like world cities. Either I can send them the xls file and let them import it into the DB, or, as I prefer, I can send them the SQL inserts in a text file so they can just execute it and save time loading all the data.
Now, I don't have any MySQL environment set up, as I don't do development, so the question is: how do I convert the various columns of lookup values on each Excel tab into insert statements? Are there any online tools that can read the xls and create SQL inserts? I don't want to do it manually; the city table alone would take me a whole week even if I put in 12 hours a day, each day of the week, creating the inserts for all the rows by hand.
Within Excel, save your spreadsheets as CSV (comma separated values) files. Your developers will be able to load them into MySQL directly using LOAD DATA INFILE. If they create a table with columns that match the CSV columns, then your developers can import them with the following SQL command:
LOAD DATA INFILE 'file_name.csv'
INTO TABLE tbl_name
FIELDS
TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '"'
LINES
TERMINATED BY '\r\n' -- or '\n' if you are on unix or '\r' if you are on mac
IGNORE 1 LINES -- if you want to skip over the column headings
My company's currently moving our databases around, shifting one set of tables out from the old MySQL instance into the new one. We've done some development prior to this migration, and some tables' structure has been altered from the original (e.g. columns were dropped).
So currently I've dumped the data from the old database and am now attempting to reinsert them into the new table. Of course, the import borks when it tries to insert rows with more fields than the table has.
What's the best way (preferably scriptable, because I foresee myself having to do this a few more times) to import only the fields I need into the new table?
Update the following to suit:
SELECT CONCAT('INSERT INTO NEW_TABLE ... (', ot.column, ');')
FROM OLD_TABLE ot
You need an INSERT statement for the table on the new database, with column list. Then populate the value portion accordingly based on the values in the old table. Run in the old environment, and you'll have your inserts with data for the new environment - just copy'n'paste into a script.
Mind, though, that datatypes have to be handled accordingly: dates (incl. time) and strings will need quoting and escaping, because you're generating text.
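A slightly fuller sketch of the generator, using QUOTE() to escape strings and DATE_FORMAT() to render datetimes (the column names are hypothetical):
SELECT CONCAT('INSERT INTO new_table (id, name, created) VALUES (',
              ot.id, ', ',
              QUOTE(ot.name), ', ',
              QUOTE(DATE_FORMAT(ot.created, '%Y-%m-%d %H:%i:%s')), ');')
FROM old_table ot;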
First of all, create a new database with the old structure, or temp tables in the current database. Then run a script with insert statements for each row, where the value list contains only the fields that exist in the new structure:
insert into newTable select row1, row2 from tempTable;
Use the fastest way, LOAD DATA INFILE:
-- Dump data
SELECT * INTO OUTFILE 'mybigtable.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM mybigtable;
-- Load data
LOAD DATA LOCAL INFILE 'mybigtable.csv'
INTO TABLE mynewbigtable
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
(@col1, @col2, @col3, @col4) SET name = @col4, id = @col2;
References:
http://dev.mysql.com/doc/refman/5.6/en/insert-speed.html
http://dev.mysql.com/doc/refman/5.6/en/load-data.html
If you're using MySQL 5.1, a powerful (although in this case maybe overkill) solution is to do an XML mysqldump and use an XSLT to transform it. Unfortunately, re-importing that XML file isn't supported in 5.0; you'll need 5.1, 5.4, or 6.0.
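The dump-and-transform side would look something like this (the database, table, and stylesheet names are hypothetical; the XSLT itself is whatever mapping you write):
# dump one table as XML
mysqldump --xml old_db old_table > old_table.xml
# apply the transform
xsltproc transform.xsl old_table.xml > new_table.xml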