Ignoring Columns at the end of Bulk Insert Statement - sql

What I'm doing is inserting into a table using BULK INSERT from a CSV file. There is a fixed number of columns which I need to load.
Code:
BULK INSERT #TEMP FROM 'c:\temp.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    CODEPAGE = 'RAW',
    FIRSTROW = 2
)
Input:
A,B,C,D,E
A,B,C,D,E
Problem:
The column containing the values E should not be written into the table because there is no column to store those values. When I load the file as-is, the last column ends up like this:
D,E
D,E
Question:
Is there any way to prevent the insertion of column E into the table without using a format file? I cannot use OPENROWSET to get these values as there are some permission issues.

As others have mentioned, you can't ignore a field while doing a bulk insert. If you can't use a format file, then import everything into your temp table (including an extra column for the unwanted field) and drop the columns you don't need.
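A minimal sketch of that approach (the column names ColA through ColE are hypothetical; ColE exists only to absorb the unwanted field):
-- Staging table with one extra column to absorb the trailing CSV field
CREATE TABLE #TEMP
(
    ColA VARCHAR(50),
    ColB VARCHAR(50),
    ColC VARCHAR(50),
    ColD VARCHAR(50),
    ColE VARCHAR(50)   -- holds the E values you don't want to keep
);

BULK INSERT #TEMP FROM 'c:\temp.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    CODEPAGE = 'RAW',
    FIRSTROW = 2
);

-- Drop the column you don't need once the load is done
ALTER TABLE #TEMP DROP COLUMN ColE;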

Related

What is the right way to handle type string null values in SQL's Bulk Insert?

For example, I have a column with type int.
The raw data source has integer values, but the null values, instead of being empty (''), are written as 'NIL'.
How would I handle those values when trying to Bulk Insert into MSSQL?
My code is
create table test (nid INT);
bulk insert test from #FILEPATH with (format = 'CSV', firstrow = 2);
The first 5 rows of my .csv file look like:
1
2
3
NIL
7
You can replace the NIL with '' (empty string) directly in your data source file, or insert the data into a staging table and transform it:
BULK INSERT staging_sample_data
FROM '\\data\sample_data.dat';
INSERT INTO [sample_data]
SELECT NULLIF(ColA, 'nil'), NULLIF(ColB, 'nil'),...
Of course, if your field is, for example, numeric, the staging table should have a string field instead. Then you can do as Larnu suggests: TRY_CONVERT(INT, ColA).
*Note: if there are default constraints, you may need to check how to keep nulls (see the KEEPNULLS option).
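For the numeric example above, a sketch might look like this (staging_test, nid_raw, and the file path are hypothetical names; FORMAT = 'CSV' requires SQL Server 2017 or later):
-- Staging table keeps the raw text so 'NIL' loads without a conversion error
CREATE TABLE staging_test (nid_raw VARCHAR(20));

BULK INSERT staging_test
FROM 'C:\data\test.csv'   -- hypothetical path
WITH (FORMAT = 'CSV', FIRSTROW = 2);

-- NULLIF maps 'NIL' to NULL; TRY_CONVERT returns NULL for anything non-numeric
INSERT INTO test (nid)
SELECT TRY_CONVERT(INT, NULLIF(nid_raw, 'NIL'))
FROM staging_test;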

Bulk Insert - How to tell SQLServer to insert empty-string and not null

This seems like a trivial question. And it is. But I have googled for over a day now, and still no answer:
I wish to do a bulk insert where, for a column whose datatype is varchar(100), I want to insert an empty string. Not NULL, but empty. For example, for the table:
create table temp(columnName varchar(100))
I wish to insert an empty string as the value:
BULK INSERT sandbox..temp
FROM 'file.txt'
WITH ( FIELDTERMINATOR = '|#', ROWTERMINATOR = '|:' );
And the file contents would be row1|:row2|:|:|:. So it contains 4 rows, where the last two rows are intended to be empty strings. But they get inserted as NULL.
This question is not the same as the question marked as a duplicate: in a column, I wish to have the capacity to insert both NULL and also empty string. The answers provided do only one of them, not both.
Well, instead of inserting the empty string explicitly like this, why not let your table column have a default value of empty string, and in your bulk insert simply don't pass any values for those columns? Something like:
create table temp(columnName varchar(100) default '')
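A sketch of the full flow under that suggestion; by default (that is, without the KEEPNULLS option) BULK INSERT loads the column's default value for empty fields in the file, so those rows arrive as '' rather than NULL:
-- The column default supplies '' when the field in the file is empty
CREATE TABLE temp (columnName VARCHAR(100) DEFAULT '');

-- No KEEPNULLS option here, so empty fields pick up the default ('') instead of NULL
BULK INSERT temp
FROM 'file.txt'
WITH (FIELDTERMINATOR = '|#', ROWTERMINATOR = '|:');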

Insert .csv file into SQL Server with BULK INSERT and reorder columns

I am making an ASP.NET application and I want to insert data into my SQL Server database from a CSV file. I did it with this SQL command:
BULK INSERT Shops
FROM 'C:\..\file.csv'
WITH
(
    FIELDTERMINATOR = ';',
    ROWTERMINATOR = '\n'
);
It works well enough, but I have an id column with the AUTO INCREMENT (IDENTITY) option. I want to reorder the inserted columns so that SQL Server automatically increments the Id column.
How can I do that with the BULK method?
(Of course, I don't want to edit the .csv file manually :P)
Like I said in my comment: you can't. BULK INSERT just pumps data in; you can't transform the data in any way. What you can do is bulk insert into a staging table and then use an INSERT statement to do whatever transformation you need.
You can do it like this:
-- Create staging table
SELECT TOP 0 *
INTO Shops_temp
FROM Shops;
-- Bulk insert into staging
BULK INSERT Shops_temp
FROM 'C:\..\file.csv'
WITH
(
FIELDTERMINATOR = ';',
ROWTERMINATOR = '\n'
);
-- Insert into actual table, use SELECT for transformation, column order etc.
INSERT INTO Shops(name, etc..)
SELECT name
, etc..
FROM Shops_temp;
-- Cleanup
DROP TABLE Shops_temp;

SQL Server 2012. Copy row into single column on another table

I am working with SQL Server on the AdventureWorks2012 database, using triggers. I would like to copy any newly inserted row into one single column in another table called AuditTable. Basically, whenever I insert into the Person.Address table, I would like to copy all of the inserted rows into the AuditTable.PrevValue column. I know how to insert etc., but I am not sure how to write everything to one column.
Here is the general idea.
USE [AdventureWorks2012]
ALTER TRIGGER [Person].[sPerson] ON [Person].[Address]
FOR INSERT AS
INSERT INTO AdventureWorks2012.HumanResources.AuditTable (PrevValue)
SELECT AddressID, AddressLine1, AddressLine2, City, StateProvinceID, PostalCode, SpatialLocation, rowguid, ModifiedDate
FROM Inserted
ERROR: The select list for the INSERT statement contains more items than the insert list. The number of SELECT values must match the number of INSERT columns.
Thank you for any assistance. I have searched loads but cannot find the exact solution anywhere.
The error message says it all: you can't insert 9 columns of different types into a single column. Assuming that your destination AuditTable.PrevValue column is NVARCHAR (and wide enough), you could flatten your insert as follows, by concatenating the columns and casting non-character columns to nvarchar:
INSERT INTO AdventureWorks2012.HumanResources.AuditTable (PrevValue)
SELECT
    N'ID: ' + CAST(AddressID AS NVARCHAR(20)) +
    N', Address: ' + AddressLine1 +
    N', ' + AddressLine2 + ....
FROM Inserted
IMO keeping one long string like this makes the Audit table difficult to search, so you might consider adding SourceTable and possibly Source PK columns.
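For example, a hypothetical audit table along those lines could look like:
CREATE TABLE HumanResources.AuditTable
(
    AuditID     INT IDENTITY(1,1) PRIMARY KEY,
    SourceTable SYSNAME,           -- e.g. 'Person.Address'
    SourcePK    NVARCHAR(50),      -- the source row's key, stored as text
    PrevValue   NVARCHAR(MAX),
    AuditDate   DATETIME NOT NULL DEFAULT GETDATE()
);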
You could also consider converting the row to XML and storing it in an XML column, like this:
create table Audit
(
    AuditXml xml
);

alter trigger [Person].[sPerson] ON [Person].[Address] for INSERT AS
begin
    DECLARE @xml XML;
    SET @xml =
    (
        SELECT *
        FROM INSERTED
        FOR XML PATH('Inserted')
    );
    insert into Audit (AuditXml) VALUES (@xml);
end

Passing default values to a column in Bulk insert

I am trying to load data from a csv file with the following contents.
Station code;Priority vehicle;DateBegin;DateEnd
01;y;20100214;20100214
02;n;20100214;20100214
03;;20100214;20100214
Now I want the value 'n' in the table when no data is provided for the 'Priority vehicle' column in the csv file.
I am writing the query as
BULK INSERT dbo.#tmp_station_details
FROM 'C:\station.csv'
WITH (
    FIELDTERMINATOR = ';',
    FIRSTROW = 2,
    ROWTERMINATOR = '\n'
)
Check the full explanation here:
http://msdn.microsoft.com/en-us/library/ms187887.aspx
"By default, when data is imported into a table, the bcp command and BULK INSERT statement observe any defaults that are defined for the columns in the table. For example, if there is a null field in a data file, the default value for the column is loaded instead. "
My suggestion is to specify a default value of 'n' for the Priority vehicle column; the NULL value from the csv file will then be replaced in your SQL table with the default value specified in the table design.
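A sketch of that suggestion (the temp table's column names and types are hypothetical); without the KEEPNULLS option, the empty 'Priority vehicle' fields pick up the column default during the bulk insert:
CREATE TABLE #tmp_station_details
(
    Station_code     VARCHAR(10),
    Priority_vehicle CHAR(1) DEFAULT 'n',   -- supplied when the field is empty in the file
    DateBegin        CHAR(8),
    DateEnd          CHAR(8)
);

BULK INSERT #tmp_station_details
FROM 'C:\station.csv'
WITH (
    FIELDTERMINATOR = ';',
    FIRSTROW = 2,
    ROWTERMINATOR = '\n'
);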