Format 6 Million Codes for a SQL Table

I have a .txt file with 6 million unique codes.
Codes like:
0007:=)GnuW
0045:)w1WKu
.....
I need a way to format them with a separator like || so I can upload them into an SQL table.
I tried to use Sublime Text's ability to select all 6 million lines and jump to the end of each line to add the separator, but that didn't work; Sublime crashes.
Once I have my formatted .csv, how should I import this huge number of records?
Should I split the file into 100 files?

I don't see why you need the file converted from .txt to .csv; a .txt file should already have line breaks.
If you are able to use bcp, this will be the fastest way to import the data.
http://technet.microsoft.com/en-us/library/aa173839(v=sql.80).aspx
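For example, a minimal bcp import of a one-code-per-line file might look like this (the server, database, table, and file names are placeholders):

bcp MyDb.dbo.CodesStaging in C:\data\codes.txt -S myserver -T -c

Here -c imports in character mode and -T uses a trusted connection; the default row terminator (newline) matches a one-code-per-line file.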
Another way would be to use BULK INSERT:
http://technet.microsoft.com/en-us/library/aa225968(v=sql.80).aspx
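A minimal BULK INSERT along the same lines (table and file path are placeholders) could be:

BULK INSERT dbo.CodesStaging
FROM 'C:\data\codes.txt'
WITH (ROWTERMINATOR = '\n', TABLOCK, BATCHSIZE = 100000);

BATCHSIZE commits the load in chunks, so there is no need to split the file into 100 pieces by hand.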
But using the "Import Data" feature in SSMS or an SSIS Data Flow should not take too long either, if you are inserting into an empty table.
I'm assuming your data has line breaks? So what about importing the data first and then splitting it as needed?
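For instance, if each line is a numeric prefix and a code separated by ':', a post-import split could look like this (the staging table and column names are assumptions):

SELECT LEFT(RawCode, CHARINDEX(':', RawCode) - 1) AS Prefix,
       SUBSTRING(RawCode, CHARINDEX(':', RawCode) + 1, LEN(RawCode)) AS Code
FROM dbo.CodesStaging;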

Related

How to export multiple tables in Oracle into multiple .csv files in Oracle SQL Developer

I am trying to export multiple tables to individual .csv files (approximately 70 tables need to be exported) in Oracle SQL Developer. As of right now, I am running through this process:
Run Query
SELECT * FROM TABLE;
From the result window, click "Export Query Results", choosing the encoding and delimiter and saving it as a .csv
This is a lengthy process that takes around a minute per table (lots of information!). I can't help but think there has to be an easier, more efficient way of doing this; I just can't find any information.
Tools - Database Export
Pick your file format (csv) and directory.
The export wizard offers a dozen formats to choose from, including delimited and CSV. If you want European CSV (semicolon-separated), pick delimited and change the delimiter to ;
Then pick your 70 tables.
I think the best solution for mass table export is not to use the embedded SQL Developer export tool, but to use the SQL*Plus SPOOL option.
You can check here for how to generate SPOOL files:
How do I spool to a CSV formatted file using SQLPLUS?
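For one table, a spool script might look like this (the table and file names are placeholders; SET MARKUP CSV ON requires SQL*Plus 12.2 or later, on older versions you would concatenate the columns with a delimiter yourself):

SET MARKUP CSV ON
SET FEEDBACK OFF
SET TERMOUT OFF
SPOOL employees.csv
SELECT * FROM employees;
SPOOL OFF

Repeat the SPOOL / SELECT / SPOOL OFF block per table; a query against user_tables can generate those blocks for all 70 tables.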

Importing an Excel table into a database

I have a table in .xlsx format which I would like to import into my MySQL database.
The table is pretty complicated, and I only want the records after '1)HEADING'.
I have been looking at PHP libraries to import into SQL, but they only seem to handle simple Excel files.
There are two ways to do this:
First method:
1) Export it into some text format. The easiest will probably be a tab-delimited version, but CSV can work as well.
2) Use the load data capability. See http://dev.mysql.com/doc/refman/5.1/en/load-data.html
3) Look halfway down the page, as it gives a good example for tab-separated data:
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
4) Check your data. Sometimes quoting or escaping has problems, and you need to adjust your source or import command, or it may just be easier to post-process via SQL.
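Putting the pieces together, a sketch of the full LOAD DATA statement (the file path and table name are placeholders):

LOAD DATA INFILE '/tmp/export.txt'
INTO TABLE my_table
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;

The IGNORE 1 LINES clause skips a header row; drop it if your export has none.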
Second method:
There's a simple online tool that can do this called sqlizer.io.
You upload an XLSX file to it, enter a sheet name and cell range, and it will generate a CREATE TABLE statement and a bunch of INSERT statements to import all your data into a MySQL database.

How to create format files using bcp from flat files

I want to use a format file to help import a comma-delimited file using BULK INSERT. I want to know how you generate format files from a flat-file source. The Microsoft guidance on this subject makes it seem as though you can only generate a format file from a SQL table. But I want it to look at a text file and tell me what the delimiters in that file are.
Surely this is possible.
Thanks
The format file can, and usually does, include more than just delimiters. It also frequently includes column data types, which is why it can only be generated automatically from the table or view the data is being retrieved from.
If you need to find the delimiters in a flat file, I'm sure there are a number of ways to create a script that could accomplish that, as well as creating a format file.
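For reference, generating a format file from an existing table with bcp looks like this (the database, table, server, and file names are placeholders):

bcp MyDb.dbo.MyTable format nul -c -t, -f MyTable.fmt -T -S myserver

The nul data file tells bcp to produce only the format file; -t sets the field terminator you expect to find in the flat file.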

What is the best way to import data using insert statements into a table in MS SQL?

I have exported a table from another DB into a .sql file as INSERT statements.
The exported file has around 350k lines in it.
When I try to simply run them, I get a "not enough memory" error before the execution even starts.
How can I import this file easily?
Thanks in advance,
Orkun
You have to manually split the .sql file into smaller pieces. Use Notepad++ or some other editor capable of handling huge files.
Also, since you wrote that you have ONE table, you could try a utility or editor that can automatically split the file into pieces of a predefined size.
Use the SQLCMD utility (see the Microsoft documentation). With it you just need to give some parameters, one of them being the file path; there is no need to go through the pain of splitting and other juggling.
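As a sketch, the sqlcmd invocation for a script file is (the server, database, and file names are placeholders):

sqlcmd -S myserver -d mydb -E -i inserts.sql

Because sqlcmd streams the script batch by batch instead of loading all 350k lines into an editor at once, it avoids the "not enough memory" error.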

Is there an easier way to import this data into a sql database?

I'm trying to import a bunch of ACH files and make one big SQL table. An ACH file is a text file with transaction information arranged in columns. The problem is that I need to add a date column, and currently the date is only contained in the file name and header. There are about 3000 files, and each file is a different date.
I have basic knowledge of SQL commands and how to query a database, but I just started learning about importing data for this project. The only tool I have found is the "Import and Export Data" program that comes with SQL Server 2012. It allows me to import the text file and make it into a table.
The problem is that I have to import the text file and create the table. Then I have to add a column for the date and run
update table
set date='date'
and then I can combine tables with an insert command, and do it all again 3000 times.
Is there a better way?
Write a program that can open all files in a directory.
Extend that program to parse ach files.
Extend that program to get date from file name.
Extend that program to write to database.
I'd say it's 3 hours of work.
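As a sketch of the database-side half in T-SQL, assuming the 3000 file names are listed in a table and start with a yyyyMMdd date (all table, column, and path names here are hypothetical):

DECLARE @name nvarchar(260), @sql nvarchar(max), @filedate date;
DECLARE file_cursor CURSOR FOR
    SELECT FileName FROM dbo.AchFileList;  -- hypothetical list of the ~3000 names
OPEN file_cursor;
FETCH NEXT FROM file_cursor INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    TRUNCATE TABLE dbo.AchStaging;         -- reuse one staging table per file
    SET @sql = N'BULK INSERT dbo.AchStaging FROM ''C:\ach\' + @name
             + N''' WITH (ROWTERMINATOR = ''\n'')';
    EXEC sp_executesql @sql;
    SET @filedate = CONVERT(date, LEFT(@name, 8), 112);  -- assumes a yyyyMMdd prefix
    INSERT INTO dbo.AchTransactions (RawLine, FileDate)
    SELECT RawLine, @filedate FROM dbo.AchStaging;
    FETCH NEXT FROM file_cursor INTO @name;
END;
CLOSE file_cursor;
DEALLOCATE file_cursor;

This folds the 3000 manual import-then-UPDATE passes into a single loop.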