How to import pipe-delimited text file data to a SQL Server table - sql

I have a database table represented as a text file in the following pattern:
0|ALGERIA|0| haggle. carefully f|
1|ARGENTINA|1|al foxes promise|
2|BRAZIL|1|y alongside of the pendal |
3|CANADA|1|eas hang ironic, silent packages. |
I need to import this data into a SQL Server 2008 database table. I have created the table with column types matching the schema.
How do I import this data into the table?
EDIT: Solved by following the answer selected.
Note to anyone stumbling upon this in future: The datatype needs to be converted.
Refer: http://social.msdn.microsoft.com/Forums/en/sqlintegrationservices/thread/94399ff2-616c-44d5-972d-ca8623c8014e

You could use the Import Data feature by right-clicking the database, then clicking Tasks, then Import Data. This launches a wizard in which you can specify the delimiters etc. for your file and preview the output before any data is inserted.

If you have a large amount of data you can use bcp to bulk import from file: http://msdn.microsoft.com/en-us/library/ms162802.aspx
The bcp utility bulk copies data between an instance of Microsoft SQL Server and a data file in a user-specified format. The bcp utility can be used to import large numbers of new rows into SQL Server tables... Except when used with the queryout option, the utility requires no knowledge of Transact-SQL. To import data into a table, you must either use a format file created for that table or understand the structure of the table and the types of data that are valid for its columns.
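For the pipe-delimited file in the question, a minimal bcp sketch might look like the line below; the server, database, table, and file names are assumptions. -c uses character mode, -t sets the field terminator, -r the row terminator, and -T uses Windows authentication. Because each line of the sample data ends with a trailing pipe, the row terminator is given as "|\n":
REM hypothetical server, database, table, and file names
bcp MyDatabase.dbo.Nation in "C:\data\nation.txt" -c -t "|" -r "|\n" -S MyServer -T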

Related

Should I use SSIS or the SQL Server Import Export tool for a large bulk insert operation?

I will soon need to import millions of records into a single SQL Server database table which we use in production. The data to import will be available in the form of about 40 csv files, each having hundreds of thousands of records.
For each row, some of the column values are supplied by the csv files, whereas others will require values that I must supply.
I am trying to determine which tool to use. I noticed that SQL Server Management Studio comes with the Import Export Wizard. Is that tool advisable for this type of job? Or should I use SSIS instead?
Some other questions I have:
Should I "lock" the table during the operation?
Should I perform the insert into a copy of the production table and then, once the operation is validated, make the copy the official version of the production table?
Since you have some logic to apply to the rows from the CSV files (some rows you will insert as-is, while others require you to supply values), you cannot express that kind of logic in the Import Export Wizard; it performs a straightforward load. So you have to go with SSIS.
You need conditional branching to split the rows and supply the extra values before loading the target table.
For the second question: if possible, I would suggest loading into a separate table and then renaming it later, as sketched below. That way, users of the production system are not impacted by the load.
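A minimal sketch of that swap, assuming hypothetical table names (load into dbo.Orders_Staging, validate, then rename):
-- hypothetical table names; validate the staged data before swapping
BEGIN TRANSACTION;
EXEC sp_rename 'dbo.Orders', 'Orders_Old';
EXEC sp_rename 'dbo.Orders_Staging', 'Orders';
COMMIT TRANSACTION;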

Select into VS Import and export wizard in sql server

In SQL Server, I connected to the server from my desktop, and I want to move data from one database to another. I have used both SELECT INTO and the Import wizard, but the Import wizard seems to be slow. Why?
Is there a better methodology for transferring data?
SELECT INTO is a SQL query, and it is executed directly.
The Import and Export Wizard is a tool which invokes Integration Services (SSIS).
The wizard is slower, but it can use various data sources.
More about the export/import wizard:
https://msdn.microsoft.com/en-US/en-en/library/ms141209.aspx
A topic about SELECT INTO and the export/import wizard:
https://social.msdn.microsoft.com/forums/sqlserver/en-US/e0524b2a-0ea4-43e7-b74a-e9c7302e34e0/super-slow-performance-while-using-import-export-wizard
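For comparison, a minimal SELECT INTO sketch; the database and table names here are assumptions:
-- hypothetical names: creates TargetDb.dbo.CustomersCopy and copies the rows in one statement
SELECT *
INTO TargetDb.dbo.CustomersCopy
FROM SourceDb.dbo.Customers;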
I agree with Andrey. The Wizard is super slow. If you perform a Google search on "sql server import and export wizard slow", you will receive nearly 50k hits. You may want to consider a couple of other options.
BCP Utility
Note: I have used this on a number of occasions. Very fast processing.
The bcp utility bulk copies data between an instance of Microsoft SQL Server and a data file in a user-specified format. The bcp utility can be used to import large numbers of new rows into SQL Server tables or to export data out of tables into data files. Except when used with the queryout option, the utility requires no knowledge of Transact-SQL. To import data into a table, you must either use a format file created for that table or understand the structure of the table and the types of data that are valid for its columns.
Example:
BULK INSERT TestServer.dbo.EmployeeAddresses
FROM 'D:\Users\Addresses.txt';
GO
OPENROWSET(BULK) Function
The OPENROWSET(BULK) function connects to an OLE DB data source to retrieve data, and it allows access to remote data by connecting to a remote data source.
Example:
INSERT INTO AllAddress(Address)
SELECT * FROM OPENROWSET(
BULK 'D:\Users\Addresses.txt',
SINGLE_BLOB) AS x;
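SINGLE_BLOB loads the whole file as one value; if you want the file parsed into rows and columns instead, OPENROWSET(BULK) also accepts a format file. A sketch with hypothetical paths and column name:
-- hypothetical paths; the Address column must be defined in the format file
INSERT INTO AllAddress(Address)
SELECT t.Address
FROM OPENROWSET(
BULK 'D:\Users\Addresses.txt',
FORMATFILE = 'D:\Users\Addresses.fmt') AS t;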
Reference
https://msdn.microsoft.com/en-us/library/ms175915.aspx
http://solutioncenter.apexsql.com/sql-server-bulk-copy-and-bulk-import-and-export-techniques/
MySQL stores data in many places, in small chunks of files, for faster retrieval. When you use the export wizard, it first writes all the metadata and data to RAM, which, depending on your system, increases overhead; the same happens when importing. SELECT INTO is fast because the engine creates a built-in replica of data that already exists in the database.
In real life, SELECT INTO is like photocopying a page, whereas the wizard is like rewriting the page by hand.

Oracle SQL Dump file extracting parts to sql/another dump file

I have an Oracle DB dump file, and now I only need parts of the tables that are included there. Does anyone know how I can extract these parts into a separate dump file (or SQL)?
I thought about using the import utility: importing from the dump file (full export) to a dump file (needed parts), something like this, but I don't know if it's possible this way:
import user/pw directory=fullexport_dump dumpfile=part.dmp logfile=import.log status=30
No, it's not possible. You can only limit rows while exporting, using the query parameter.
exp ..... query="where id=10"
You may search further in the Oracle Documentation.
So, import the whole table and create a new table with only the required parts:
create table NEEDEDPARTS as select * from FULLEXPORT where id=10
Or, import the whole table and re-export it with the query parameter, as sketched below.
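A sketch of that re-export with the classic exp utility; the user, table, and file names are assumptions, and the quotes around the query value usually need escaping on the command line:
REM hypothetical user, table, and file names
exp user/pw tables=FULLEXPORT query=\"where id=10\" file=part.dmp log=export.log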

Which is a good way to import Excel to a database?

Hi, I am using SQL Server 2008.
How can I import an Excel file into the database? Which way is the easiest and simplest to do?
OpenRowSet
BulkCopy
Linked Servers
SSIS
I have the above options to import Excel into the database.
In my opinion the SSIS wizard is the best way to import Excel data: you get a row- and column-wise view of the whole table data that will be inserted, and you can also specify column names and constraints and parse the data using a query.
UPDATE :
If the data in your Excel file does not require any processing to match your database table, then I recommend you save your Excel file as a csv and use a combination of BULK INSERT and the BCP.exe program.
To use BULK INSERT you will need a format file which defines how your datafile matches up to your database table. You can write this by hand to match the existing database table or you can use the following command to generate the format file you need:
bcp [ServerName].[SchemaName].[TableName] format nul -c -f [FormatFileOutputName].fmt -S[ServerHostName] -U[DbUserName] -P[DbUserPassword]
Now you will have 2 files:
DatafileName.csv
FormatFileName.fmt.
Use BULK INSERT within SQL Server to insert your data.
Note: If the columns in your datafile are in a different order than your database table then you can simply edit the generated format file to have them map correctly.
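A minimal BULK INSERT sketch using the generated format file; the table name and file paths here are assumptions:
-- hypothetical names; point FORMATFILE at the .fmt file generated by bcp
BULK INSERT dbo.TableName
FROM 'C:\Import\DatafileName.csv'
WITH (FORMATFILE = 'C:\Import\FormatFileName.fmt',
FIRSTROW = 2); -- skip the header row if your csv has one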

BCP utility to create a format file, to import Excel data to SQL Server 2008 for BULK insertion

I am trying to import Excel 2003 data into a SQL table for SQL Server 2008.
I tried to add a linked server, but have met with little success.
Now I am trying to check whether there's a way to use the BCP utility to do a BULK INSERT, or a BULK operation with OPENROWSET, using a format file to get the Excel mapping.
First of all, how can I create a format file for a table that has differently named columns than the Excel spreadsheet columns?
Next, how do I use this format file to import data from, say, a file at C:\Folder1\Excel1.xsl into table Table1?
Thank you.
There are some examples here that demonstrate what the data file should look like (csv) and what the format file should look like. Unless you need to do this a lot, I'd just hand-craft the format file, save the Excel data to csv, then try using bcp or OPENROWSET.
The format file specifies the column names for the destination. The data file doesn't have column headings, so you don't need to worry about the Excel (source) columns being named differently.
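For illustration, a hand-crafted non-XML format file might look like the sketch below, assuming a two-column destination table and a comma-separated data file; the version number, lengths, column names, and collation are assumptions. Each data row lists the host field order, host data type, prefix length, host data length, field terminator, destination column order, destination column name, and collation:
10.0
2
1 SQLCHAR 0 50 "," 1 FirstColumn SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 50 "\r\n" 2 SecondColumn SQL_Latin1_General_CP1_CI_AS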
If you need to do more mapping etc., then create an SSIS package. You can use the data import wizard to get you started, then save it as an SSIS package, then edit it to your heart's content.
If it's a one-off, I'd use the SQL data import wizard, from right-clicking the database in Management Studio. If you just have a few rows to import from Excel, I typically open a query to Edit Top 200 Rows, edit the query to match the columns I have in Excel, then copy and paste the rows from Excel into SQL Management Studio. It doesn't handle errors very well, but it's quick.