Bulk insert in SQL Server database from one table to another - sql

I have 200k records in one table and I want to insert them into another table. I read about BULK INSERT, but the query I found on the MSDN website just doesn't make sense to me.
This is the query
BULK INSERT AdventureWorks2012.Sales.SalesOrderDetail
FROM 'f:\orders\lineitem.tbl'
WITH
(
FIELDTERMINATOR =' |',
ROWTERMINATOR =' |\n'
);
What is f:\orders\lineitem.tbl? The whole thing just doesn't make sense to me.
I have a table with four columns: id, frm, to1 and country
The destination table has the same columns.
Any simple syntax would be helpful.
I am using SQL Server 2008/2012.

BULK INSERT imports from an external data file. If you already have the data in a SQL Server table, then you should do something like:
INSERT INTO NewTable (field1, field2, field3)
SELECT field1, field2, field3 FROM OldTable
DO NOT point BULK INSERT at your SQL Server database file. The .tbl file referenced in your example code refers to a text file with delimited fields.
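For example, with the four columns from the question (id, frm, to1, country), assuming the destination table already exists with the same columns — the table names here are placeholders:

```sql
-- Set-based copy: one statement moves all 200k rows
INSERT INTO DestinationTable (id, frm, to1, country)
SELECT id, frm, to1, country
FROM SourceTable;
```

200k rows is small enough that a single set-based INSERT ... SELECT like this is normally all you need; no bulk-load machinery is involved.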

BULK INSERT is for importing external data from a file into a SQL table, like:
BULK INSERT TableName FROM 'File Path';
If you want to copy data from one table to another in SQL, use SELECT INTO instead of INSERT INTO, like:
SELECT * INTO table1 FROM table2

insert into destinationserver.destinationdatabase.dbo.destinationtable
select * from sourceserver.sourcedatabase.dbo.sourcetable

Related

SQL Server: Bulk Insert to a Linked Server

I have a local database myDB and a server database serverDB that is linked to myDB as [serverDB].serverDB.dbo. I want to upload a 50,000-row table from a .csv file on my computer to the serverDB. I tried this:
bulk insert #temp from 'filename'
insert into [serverDB].serverDB.dbo.tablename select * from #temp
and it takes ages. I found out that the INSERT INTO creates a connection for each row, so it looks like it's not an option in this case. Then I tried
bulk insert [serverDB].serverDB.dbo.tablename from 'filename'
and I get the error Invalid object name 'tablename', even though this table exists in the [serverDB].serverDB database. Does anyone know how I can make SQL "see" the table [serverDB].serverDB.dbo.tablename?

PL/SQL bulk INSERT into a table with an unknown structure

Can you issue a FORALL bulk INSERT into a table with an unknown structure? That means, can you build dynamically the INSERT command in the FORALL construct without knowing the number of fields at compile time?
The number and names of the fields are retrieved at runtime and stored in a collection:
TYPE RowType IS TABLE OF VARCHAR2(50) INDEX BY VARCHAR2(50);
TYPE TableType IS TABLE OF RowType;
my_table TableType;
So at runtime my_table could be filled this way for example:
my_table(1)('FIELD1') := 'VALUE1A';
my_table(1)('FIELD2') := 'VALUE2A';
my_table(1)('FIELD3') := 'VALUE3A';
my_table(2)('FIELD1') := 'VALUE1B';
my_table(2)('FIELD2') := 'VALUE2B';
my_table(2)('FIELD3') := 'VALUE3B';
my_table(3)('FIELD1') := 'VALUE1C';
my_table(3)('FIELD2') := 'VALUE2C';
my_table(3)('FIELD3') := 'VALUE3C';
The insert statements that should be bulk executed therefore are:
INSERT INTO TABLENAME (FIELD1,FIELD2,FIELD3) VALUES (VALUE1A,VALUE2A,VALUE3A);
INSERT INTO TABLENAME (FIELD1,FIELD2,FIELD3) VALUES (VALUE1B,VALUE2B,VALUE3B);
INSERT INTO TABLENAME (FIELD1,FIELD2,FIELD3) VALUES (VALUE1C,VALUE2C,VALUE3C);
EDIT: Do you even read the questions, or do you just read a couple of words in the title? The linked question asks how to bind a variable; this question asks how to bulk-issue dynamic statements. Yes, the words 'insert' and 'table' appear in both questions.
No, you can't build and execute a FORALL ... INSERT statement dynamically. You can, however, build up an INSERT statement dynamically of the form:
INSERT ALL
INTO TABLENAME (FIELD1,FIELD2,FIELD3) VALUES (VALUE1A,VALUE2A,VALUE3A)
INTO TABLENAME (FIELD1,FIELD2,FIELD3) VALUES (VALUE1B,VALUE2B,VALUE3B)
INTO TABLENAME (FIELD1,FIELD2,FIELD3) VALUES (VALUE1C,VALUE2C,VALUE3C)
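A minimal sketch of building such an INSERT ALL statement from the my_table collection and running it with EXECUTE IMMEDIATE. TABLENAME and the sample values are placeholders from the question; because field names and values are concatenated into the SQL text, real code should sanitize them (e.g. with DBMS_ASSERT) to avoid SQL injection:

```sql
DECLARE
  TYPE RowType IS TABLE OF VARCHAR2(50) INDEX BY VARCHAR2(50);
  TYPE TableType IS TABLE OF RowType;
  my_table TableType := TableType();
  r        RowType;
  v_sql    VARCHAR2(32767) := 'INSERT ALL';
  v_cols   VARCHAR2(4000);
  v_vals   VARCHAR2(4000);
  i        PLS_INTEGER;
  f        VARCHAR2(50);
BEGIN
  -- Fill the collection as in the question (two rows shown)
  r('FIELD1') := 'VALUE1A';
  r('FIELD2') := 'VALUE2A';
  my_table.EXTEND; my_table(my_table.LAST) := r;
  r('FIELD1') := 'VALUE1B';
  r('FIELD2') := 'VALUE2B';
  my_table.EXTEND; my_table(my_table.LAST) := r;

  i := my_table.FIRST;
  WHILE i IS NOT NULL LOOP
    v_cols := NULL;
    v_vals := NULL;
    f := my_table(i).FIRST;            -- field names are the index keys
    WHILE f IS NOT NULL LOOP
      v_cols := v_cols || CASE WHEN v_cols IS NULL THEN '' ELSE ',' END || f;
      v_vals := v_vals || CASE WHEN v_vals IS NULL THEN '' ELSE ',' END
                       || '''' || REPLACE(my_table(i)(f), '''', '''''') || '''';
      f := my_table(i).NEXT(f);
    END LOOP;
    v_sql := v_sql || ' INTO TABLENAME (' || v_cols || ') VALUES (' || v_vals || ')';
    i := my_table.NEXT(i);
  END LOOP;

  -- INSERT ALL requires a subquery; DUAL supplies the single driving row
  EXECUTE IMMEDIATE v_sql || ' SELECT * FROM dual';
END;
```

This trades the array binding of FORALL for a single dynamically built multitable insert, so it runs as one round trip rather than one statement per row.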
Or if the data you want to insert into your table resides in another table you might find an INSERT...(subquery) statement like
INSERT INTO TABLENAME
SELECT FIELD1, FIELD2, FIELD3
FROM OTHER_TABLE
WHERE something <> something_else
or you might be able to use a MERGE statement similar to
MERGE INTO TABLENAME t
USING (SELECT FIELD1, FIELD2, FIELD3 FROM OTHER_TABLE) o
ON (t.FIELD1 = o.FIELD1)
WHEN NOT MATCHED THEN
INSERT (FIELD1, FIELD2, FIELD3) VALUES (o.FIELD1, o.FIELD2, o.FIELD3)
which will do a mass insert based on the data specified in the USING clause and the match criteria in the ON predicate.
So there may be ways to do what you want, but without knowing the specifics of the source of your data and how you're manipulating that data prior to inserting it into your database, it's tough to say whether any of them would apply.
Best of luck.

How do I put a text file into a SQL Server 2012 field

I have a table call it tbl1 with a field ASCII_file nvarchar(MAX).
I want to put the file 'c:xyz.ght' into the ASCII_file field using an SSMS SQL INSERT statement.
Also, for existing records, is using UPDATE the only way to get the files into the field? I tried copying the file, but paste did not show up when I tried to paste the file into the field. Is there any easier way than using a SQL UPDATE?
Why can't you do something like this, with a simple insert statement?
INSERT INTO tbl1
VALUES ('c:xyz.ght');
And an update statement:
UPDATE tbl1
SET ASCII_file ='c:xyz.ght'
WHERE some_column=some_value;
If you want to read the contents of a file:
BULK INSERT tbl1
FROM 'c:\temp\file.txt'
WITH
(
ROWTERMINATOR ='\n'
)
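Note that BULK INSERT as above still creates one row per line of the file. If the goal is to store the whole file's contents in the ASCII_file column, a sketch using OPENROWSET with SINGLE_CLOB reads the entire file as a single value (the path and WHERE clause are illustrative):

```sql
-- Insert the file contents as one new row
INSERT INTO tbl1 (ASCII_file)
SELECT BulkColumn
FROM OPENROWSET(BULK 'c:\temp\file.txt', SINGLE_CLOB) AS f;

-- Or update an existing row
UPDATE tbl1
SET ASCII_file = (SELECT BulkColumn
                  FROM OPENROWSET(BULK 'c:\temp\file.txt', SINGLE_CLOB) AS f)
WHERE some_column = some_value;
```

For Unicode files, SINGLE_NCLOB matches the nvarchar(MAX) column type better.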

How do I take all of the text from a file and insert it into one row's column?

I want to read all of the text in a file and insert it into a table's column. One suggested way was to use BULK INSERT. Because of the syntax, I thought it would be better to BULK INSERT into a temp table, then eventually, I would SELECT from the temp table along with other values to fill the main table's row.
I tried:
USE [DB]
CREATE TABLE #ImportText
(
[XSLT] NVARCHAR(MAX)
)
BULK INSERT #ImportText
FROM 'C:\Users\me\Desktop\Test.txt'
SELECT * FROM #ImportText
DROP TABLE #ImportText
But, it is creating a new row in #ImportText per newline in the file. I don't want it split at all. I could not find a FIELDTERMINATOR that would allow for this. (i.e. end of file character)
Try this:
BULK INSERT #ImportText
FROM 'C:\Users\me\Desktop\Test.txt'
WITH (ROWTERMINATOR = '\0')

To read CSV file data one by one from SQL Stored proc

I need to read CSV file information record by record: if the customer in the file already exists in the Customer table, insert the row into the detail table; otherwise insert it into the error table. So I can't use a plain bulk insert.
How do I read records from the CSV file one by one? How do I give the path?
The bulk insert method is not going to work here.
One option is to use an INSTEAD OF INSERT trigger to selectively put the row in the correct table, and then use your normal BULK INSERT with the option FIRE_TRIGGERS.
Something close to;
CREATE TRIGGER bop ON MyTable INSTEAD OF INSERT AS
BEGIN
INSERT INTO MyTable
SELECT inserted.id,inserted.name,inserted.otherfield FROM inserted
WHERE inserted.id IN (SELECT id FROM customerTable);
INSERT INTO ErrorTable
SELECT inserted.id,inserted.name,inserted.otherfield FROM inserted
WHERE inserted.id NOT IN (SELECT id FROM customerTable);
END;
BULK INSERT MyTable FROM 'c:\temp\test.sql'
WITH (FIELDTERMINATOR=',', FIRE_TRIGGERS);
DROP TRIGGER bop;
If you're importing files regularly, you can create a table (ImportTable) with the same schema, set the trigger on that and do the imports to MyTable through bulk import to ImportTable. That way you can keep the trigger and as long as you're importing to ImportTable, you don't need to do any special setup/procedure for each import.
CREATE TABLE #ImportData
(
CVECount varchar(MAX),
ContentVulnCVE varchar(MAX),
ContentVulnCheckName varchar(MAX)
)
BULK INSERT #ImportData
FROM 'D:\test.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',', --CSV field delimiter
ROWTERMINATOR = '\n', --Use to shift the control to next row
TABLOCK
)
select * from #ImportData
-- Here you can write your script to read the data row by row
DROP TABLE #ImportData
Use bulk insert to load into a staging table and then process it line by line.
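A sketch of that row-by-row processing over the staged data, using a cursor. Customer, Detail, ErrorTable and their columns are hypothetical names; only #ImportData and its columns come from the script above:

```sql
-- Assumes #ImportData has been loaded by the BULK INSERT above
DECLARE @cnt varchar(MAX), @cve varchar(MAX), @chk varchar(MAX);

DECLARE import_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT CVECount, ContentVulnCVE, ContentVulnCheckName FROM #ImportData;

OPEN import_cur;
FETCH NEXT FROM import_cur INTO @cnt, @cve, @chk;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Hypothetical existence check against the Customer table
    IF EXISTS (SELECT 1 FROM Customer WHERE CustomerName = @chk)
        INSERT INTO Detail (CVECount, CVE, CheckName) VALUES (@cnt, @cve, @chk);
    ELSE
        INSERT INTO ErrorTable (CVECount, CVE, CheckName) VALUES (@cnt, @cve, @chk);
    FETCH NEXT FROM import_cur INTO @cnt, @cve, @chk;
END

CLOSE import_cur;
DEALLOCATE import_cur;
```

If the existence check is the only per-row logic, two set-based INSERT ... SELECT statements with an EXISTS / NOT EXISTS filter would do the same work much faster than a cursor.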