Update multiple tables from fixed width file - sql-server-2005

I need to do a bulk import into SQL Server 2005 from a fixed-width file that contains data destined for 3 different tables. I am not sure of the best way to go about this. I assume it would be to import all of the data into a single staging table and then populate each of the 3 tables from it, but I am not sure how to import the data from the fixed-width file in the first place.
Thanks,
Scott
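One common approach is a non-XML format file plus BULK INSERT. The sketch below assumes three fixed-width fields of widths 10, 25 and 8, hypothetical column names, and hypothetical file paths; adjust all of these to match the real layout. The `""` terminator tells bcp/BULK INSERT that each field is identified purely by its length.

```
9.0
3
1   SQLCHAR   0   10   ""       1   CustomerId     ""
2   SQLCHAR   0   25   ""       2   CustomerName   ""
3   SQLCHAR   0   8    "\r\n"   3   OrderDate      ""
```

Saved as, say, `C:\imports\data.fmt`, the format file drives the load into a single staging table, after which each target table is populated with ordinary INSERT...SELECT statements:

```sql
-- Staging table matching the format file (hypothetical names and widths)
CREATE TABLE dbo.Staging (
    CustomerId   CHAR(10),
    CustomerName CHAR(25),
    OrderDate    CHAR(8)
);

BULK INSERT dbo.Staging
FROM 'C:\imports\data.txt'
WITH (FORMATFILE = 'C:\imports\data.fmt');

-- Then split the staged rows out to each of the 3 tables, e.g.:
INSERT INTO dbo.Customers (CustomerId, CustomerName)
SELECT DISTINCT CustomerId, CustomerName
FROM dbo.Staging;
```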

Related

Should I use SSIS or the SQL Server Import Export tool for a large bulk insert operation?

I will soon need to import millions of records into a single SQL Server database table which we use in production. The data to import will be available as about 40 csv files, each having hundreds of thousands of records.
For each row, some of the column values are supplied by the csv files, whereas others will require values that I must supply myself.
I am trying to determine which tool to use. I noticed that SQL Server Management Studio comes with the Import Export Wizard. Is that tool advisable for this type of job? Or should I use SSIS instead?
Some other questions I have:
Should I "lock" the table during the operation?
Should I perform the insert into a copy of the production table and then, once the operation is validated, make the copy the official version of the production table?
Since you have logic to apply to the rows from the CSV files (some rows you will insert as-is, while others require you to supply values), the Import Export Wizard won't work; it only does straightforward loads. You will have to use SSIS.
You need a conditional split to route the rows and supply values for the target table.
For the second question: if possible, I would suggest loading into a separate table and renaming it afterwards. That way, production system users are not impacted by the load.
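The rename swap itself can be done with sp_rename. A minimal sketch, assuming a hypothetical production table dbo.Orders and a validated staging copy dbo.Orders_Staging:

```sql
-- Swap the validated staging table in for the production table.
-- Wrapped in a transaction so users never observe a missing table.
BEGIN TRANSACTION;
EXEC sp_rename 'dbo.Orders', 'Orders_Old';
EXEC sp_rename 'dbo.Orders_Staging', 'Orders';
COMMIT TRANSACTION;

-- Once satisfied with the new data:
-- DROP TABLE dbo.Orders_Old;
```

Note that sp_rename does not carry over permissions or references such as synonyms pointing at the old object, so re-check those after the swap.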

Import Oracle data dump and overwrite existing data

I have an oracle dmp file and I need to import data into a table.
The data in the dump contains new rows and few updated rows.
I am using the imp command with IGNORE=Y, so it imports all the new rows well. But it doesn't import/overwrite the existing rows (it shows a warning that a unique key constraint was violated).
Is there some option to make the import UPDATE the existing rows with new data?
No. If you were using data pump then you could use the TABLE_EXISTS_ACTION=TRUNCATE option to remove all existing rows and import everything from the dump file, but as you want to update existing rows and leave any rows not in the new file alone - i.e. not delete them (I think, since you only mention updating, though that isn't clear) - that might not be appropriate. And as your dump file is from the old exp tool rather than expdp that's moot anyway, unless you can re-export the data.
If you do want to delete existing rows that are not in the dump then you could truncate all the affected tables before importing. But that would be a separate step you'd have to perform yourself; it's not something imp will do for you, and the tables would be empty for a while, so you'd need downtime to do it.
Alternatively you could import into new staging tables - in a different schema, since imp doesn't support renaming either - and then use those to merge the new data into the real tables. That may be the least disruptive approach. You'd still have to design and write all the merge statements, though; there's no built-in way to do this automatically.
You can import into a temporary staging table and then reconcile the records by joining against it.
Use the impdp option REMAP_TABLE to load the dump into the staging table:
impdp .... REMAP_TABLE=TMP_TABLE_NAME
When the load is done, run a MERGE statement on the existing table from the staging table.
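A sketch of that MERGE step, assuming a hypothetical target table NATION and staging table TMP_NATION keyed on N_ID (all names are placeholders for your own schema):

```sql
-- Update rows that already exist; insert the ones that don't.
MERGE INTO nation t
USING tmp_nation s
   ON (t.n_id = s.n_id)
WHEN MATCHED THEN
  UPDATE SET t.n_name      = s.n_name,
             t.n_regionkey = s.n_regionkey,
             t.n_comment   = s.n_comment
WHEN NOT MATCHED THEN
  INSERT (n_id, n_name, n_regionkey, n_comment)
  VALUES (s.n_id, s.n_name, s.n_regionkey, s.n_comment);
```

Rows present in the target but absent from the staging table are left untouched, which matches the "update, don't delete" requirement above.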

Multiple CSV files to multiple tables not yet created

Database platform: SQL Server 2012
I have a folder with a lot of CSV's. I require the creation of a table for each CSV. The CSV has the column names in the first row, data in subsequent rows.
I have a handy SSIS package to iterate through a folder and import over into existing tables in a database but in this case, it is our first load and we would also like to create the tables as part of the process.
I know how to do it one at a time through the Import Wizard, or via an SSIS OLE DB destination and its New table button. I was wondering if there is a more automated way using SSIS.
After further review of the 313 CSVs, I determined that 75% of them are lookup tables and the other 25% are relevant data. I will simply go through each one, build out a staging table for each, and then properly build out the structure. It will only take about a day to build one SSIS package to churn through all the CSVs I want to use, and then I'm all set!

Script to import ever changing Excel CSV file into SQL Server Express 2008

I have looked at this site and several others, and the closest I could find was Dealing with a changing Excel table structure to import to a database table.
But I'm actually looking for a script to identify the column names and types from the CSV file, and to create the table from that.
The actual importing script is working well, but it only caters for fixed headers, while my headers vary between 160 and 170 columns at any given time.
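One way to generate the table from the file itself is to bulk-load just the header row and build a CREATE TABLE statement dynamically. This is only a sketch under several assumptions: a hypothetical file path, comma-separated headers with no embedded commas or quotes, and every column typed as NVARCHAR(255) to be refined later.

```sql
-- Read only the first (header) line of the CSV into a one-column table.
DECLARE @header NVARCHAR(MAX), @sql NVARCHAR(MAX);

CREATE TABLE #HeaderLine (line NVARCHAR(MAX));

BULK INSERT #HeaderLine
FROM 'C:\imports\data.csv'
WITH (ROWTERMINATOR = '\n', LASTROW = 1);  -- header row only

SELECT @header = line FROM #HeaderLine;

-- Turn "ColA,ColB,ColC" into a bracketed column list, one NVARCHAR per header.
SET @sql = 'CREATE TABLE dbo.Staging (['
         + REPLACE(@header, ',', '] NVARCHAR(255), [')
         + '] NVARCHAR(255));';

EXEC sp_executesql @sql;
DROP TABLE #HeaderLine;
```

Since the header set changes between loads, you would drop and recreate dbo.Staging each run; inferring real data types would need a second pass over sample rows.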

How to import pipe delimited text file data to SQLServer table

I have a database table represented as a text file in the following pattern:
0|ALGERIA|0| haggle. carefully f|
1|ARGENTINA|1|al foxes promise|
2|BRAZIL|1|y alongside of the pendal |
3|CANADA|1|eas hang ironic, silent packages. |
I need to import this data to a SQL Server 2008 database table. I have created the table with the types matching the schema.
How to import this data to the table?
EDIT: Solved by following the answer selected.
Note to anyone stumbling upon this in future: The datatype needs to be converted.
Refer: http://social.msdn.microsoft.com/Forums/en/sqlintegrationservices/thread/94399ff2-616c-44d5-972d-ca8623c8014e
You could use the Import Data feature: right-click the database, then click Tasks, then Import Data. This gives you a wizard in which you can specify the delimiters etc. for your file and preview the output before any data is inserted.
If you have a large amount of data you can use bcp to bulk import from file: http://msdn.microsoft.com/en-us/library/ms162802.aspx
The bcp utility bulk copies data between an instance of Microsoft SQL Server and a data file in a user-specified format. The bcp utility can be used to import large numbers of new rows into SQL Server tables...
Except when used with the queryout option, the utility requires no knowledge of Transact-SQL. To import data into a table, you must either use a format file created for that table or understand the structure of the table and the types of data that are valid for its columns.
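The same load can also be done from T-SQL with BULK INSERT. A minimal sketch, assuming the target table is named dbo.nation and the file sits at C:\imports\nation.tbl (both hypothetical):

```sql
-- Table columns must match the file's fields in order and count.
BULK INSERT dbo.nation
FROM 'C:\imports\nation.tbl'
WITH (
    FIELDTERMINATOR = '|',
    ROWTERMINATOR   = '\n'
);
```

Because each sample row above ends with a trailing '|', the parser sees one more field than there are data values, so you may need an extra throwaway column or a format file; character fields may also need the datatype conversion the EDIT above mentions.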