BULK INSERT errors 4866 and 7301 - SQL Server

I am trying to bulk import data into SQL Server with the statement below, but I am getting these errors:
Msg 4866, Level 16, State 8, Line 3
The bulk load failed. The column is too long in the data file for row 1, column 96. Verify that the field terminator and row terminator are specified correctly.
Msg 7301, Level 16, State 2, Line 3
Cannot obtain the required interface ("IID_IColumnsInfo") from OLE DB provider "BULK" for linked server "(null)".
Is there anything wrong with my statement? When I use the Import Wizard, it works fine.
BULK INSERT BICX.dbo.raw
FROM 'D:\NEW_CDR\NEW.txt'
WITH
(
FIRSTROW = 5,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
);

Since you say the table contains 95 columns, and the error complains that column 96 is too long, the problem is your row terminator.
If the file came from a Windows system, it is most likely \r\n; if that doesn't work, try 0x0a (a bare line feed).
BULK INSERT BICX.dbo.raw
FROM 'D:\NEW_CDR\NEW.txt'
WITH
(
FIRSTROW = 5,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\r\n'
);
or
BULK INSERT BICX.dbo.raw
FROM 'D:\NEW_CDR\NEW.txt'
WITH
(
FIRSTROW = 5,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '0x0a'
);

Related

SQL Server Bulk Insert error - data conversion error (type mismatch or invalid character for the specified codepage)

I'm not a pro at this, but I'm trying to do a bulk insert (from CSV to SQL Server) and I'm getting some errors:
Msg 4864, Level 16, State 1, Line 1
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 2, column 1 (Year).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
Here is the code I have used:
BULK INSERT [dbCen_Staging].dbo.[dc1]
FROM 'C:\Newfolder\dc.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
MAXERRORS = 0,
--DATAFILETYPE = 'widechar',
BATCHSIZE=250000,
KEEPIDENTITY
)
GO
So I'm wondering what mistake I'm making, or whether anyone has a better idea on how to insert a few CSV files with 900 million rows each into a SQL Server table. Any on-premises or Azure cloud solution, maybe? Speed matters a lot.
Thanks all
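One way to find out which rows are actually failing is BULK INSERT's ERRORFILE option, which writes rejected rows to a side file so you can inspect them. A minimal sketch, reusing the table and file path from the question; the error-file path and MAXERRORS value are hypothetical:

```sql
BULK INSERT [dbCen_Staging].dbo.[dc1]
FROM 'C:\Newfolder\dc.csv'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a',                    -- try a bare line feed if '\n' fails
    ERRORFILE = 'C:\Newfolder\dc_errors.txt',  -- hypothetical path; rejected rows land here
    MAXERRORS = 10,                            -- allow a few bad rows so they reach the error file
    BATCHSIZE = 250000,
    KEEPIDENTITY
);
```

For files of that size, the bcp utility, or splitting the file and loading the pieces in parallel with TABLOCK into a heap, are also worth considering.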

Bulk Insert data conversion error with CSV file in SQL Server for BIT datatype

I am trying to load data from a CSV file into a table with BULK INSERT.
In the file, one column's values are True and False, while that column's datatype in the database is BIT.
However, when I tried to import it, it threw the error below.
Msg 4864, Level 16, State 1, Line 8
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 2, column 2 (ISTrue).
Here is the code I used for the import.
BULK INSERT dbo.BulImportTesting
FROM 'C:\ExportData\Test\dbo_tbl_Import.csv'
WITH (FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
BATCHSIZE = 100000,
KEEPIDENTITY);
Here is the sample data I am trying to import
ID UseStartDate
81958 FALSE
81959 FALSE
83336 FALSE
83337 FALSE
83338 TRUE
Thanks in advance!
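BULK INSERT will not convert the literals True/False into BIT (it expects 0 or 1), so a common workaround is to stage the flag as text and convert it afterwards. A sketch, assuming the table and column names from the question and its error message; the staging table is hypothetical:

```sql
-- Hypothetical staging table: load the flag column as plain text first
CREATE TABLE dbo.BulkImportStaging (ID INT, ISTrue VARCHAR(5));

BULK INSERT dbo.BulkImportStaging
FROM 'C:\ExportData\Test\dbo_tbl_Import.csv'
WITH (FIRSTROW = 2,
      FIELDTERMINATOR = ',',
      ROWTERMINATOR = '\n');

-- Convert the text to BIT on the way into the real table.
-- If ID is an IDENTITY column, run SET IDENTITY_INSERT dbo.BulImportTesting ON first.
INSERT INTO dbo.BulImportTesting (ID, ISTrue)
SELECT ID,
       CASE WHEN UPPER(ISTrue) = 'TRUE' THEN 1 ELSE 0 END
FROM dbo.BulkImportStaging;
```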

SQL Server 2017: IID_IColumnsInfo Bulk Insert Error

I've used the following script in the past without issue, so I'm not sure why it's causing me issues now.
Msg 7301, Level 16, State 2, Line 8
Cannot obtain the required interface ("IID_IColumnsInfo") from OLE DB provider "BULK" for linked server "(null)".
My code:
(
FORMAT = 'CSV',
FIELDQUOTE = '"',
FIRSTROW = 2,
FIELDTERMINATOR = ',', --CSV field delimiter
ROWTERMINATOR = '\n', --Use to shift the control to next row
TABLOCK
)
screenshot of setup and error
File Size: 112 MB
Rows: 322,190
Microsoft Server Management Studio v17.4
Can you try
ROWTERMINATOR = '\r\n'
or
ROWTERMINATOR = '0x0a'
Since you're using a CSV file, the row terminator may be a line feed (LF), which 0x0a is the hexadecimal notation for. The example below accounts for this type of row terminator.
BULK INSERT dbo.YourTable
FROM 'C:\FilePath\DataFile.csv'
WITH (
FORMAT = 'CSV',
FIRSTROW = 2,
FIELDQUOTE = '"',
FIELDTERMINATOR = ',',
ROWTERMINATOR = '0x0a',
TABLOCK
);
Try removing the FORMAT = 'CSV' line;
your file may not be RFC 4180 compliant.
This has worked for me with this error.
Make sure there is not a byte-order mark (BOM) at the beginning of the file, which will cause this to fail with this error.
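If stripping the BOM from the file is not an option, recent versions of SQL Server can be told the file is UTF-8 instead. A sketch with hypothetical table and file names; CODEPAGE = '65001' requires SQL Server 2016 or later, FORMAT = 'CSV' requires 2017 or later, and this may cope with a UTF-8 BOM without editing the file:

```sql
BULK INSERT dbo.YourTable              -- hypothetical table name
FROM 'C:\FilePath\DataFile.csv'        -- hypothetical path
WITH (
    FORMAT = 'CSV',
    CODEPAGE = '65001',                -- treat the file as UTF-8
    FIRSTROW = 2,
    FIELDQUOTE = '"',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a',
    TABLOCK
);
```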

How to write a SQL script to read contents of a file

I have a SQL script and a ".csv" file. I want the SQL script to read the data from the ".csv" file instead of manually entering the data in the script. Is it possible?
....
.....
......
and SP_F.trade_id = SP_R.trade_id
and SP_R.iSINCode IN (here is where I can manually enter the data)
ps: I am new to SQL and I am still learning.
Here is a good solution.
BULK INSERT CSVTest
FROM 'c:\csvtest.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
In more detail:
1) We have a csv file named test.csv with this content:
'JE000DT', 'BE000DT2J', 'DE000DT2'
1, 2, 3
2, 3, 4
4, 5, 6
2) We need to create a table for this file in the database:
CREATE TABLE CSVTest ([columnOne] int, [columnTwo] int, [columnThree] int)
3) Insert your data with BULK INSERT. The column count and types must match your csv.
BULK INSERT CSVTest
FROM 'C:\test.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
FIRSTROW = 2
)
4) Use this table in your subquery:
Select
SP_F.trade_id, -- as 'Trade ID',
SP_F.issuer_id, --as 'Issuer ID',
SP_R.iSINCode --as 'ISIN'
from t_SP_Fundamentals SP_F
JOIN t_SP_References SP_R ON SP_F.trade_id = SP_R.trade_id
where
(SP_F.issuer_id = 3608 or SP_F.issuer_id = 3607)
and SP_R.iSINCode IN (SELECT [columnOne] FROM CSVTest)
There is another solution using the OPENROWSET statement, which allows reading directly from the file, but I strongly recommend the solution above. Reading directly from the file in a query is not a great choice.
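For reference, the simplest OPENROWSET form reads the whole file as a single value rather than as rows, which is part of why the table-based approach above is more practical here:

```sql
-- Reads the entire file into one value (one row, one column named BulkColumn)
SELECT BulkColumn
FROM OPENROWSET(BULK 'C:\test.csv', SINGLE_CLOB) AS f;
```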

Bulk Insert Not working

I have a Bulk Insert query as follows
BULK INSERT tmp_table FROM 'file.jrl'
WITH (
DATAFILETYPE='widenative' ,
FIELDTERMINATOR = '~' ,
MAXERRORS = 0 ,
ROWS_PER_BATCH = 116064 ,
ROWTERMINATOR = '0x0a' ,
TABLOCK )
It is giving me the following errors:
Msg 4866, Level 16, State 4, Line 1
The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
I am using DATAFILETYPE = 'widenative' because my data contains some special characters like Ñ, Ã, etc.
I have also tried '\n' as the ROWTERMINATOR value.
My column separator is ~. Is there anything I have to change?
My sample data is as follows
12345 ~asdfdfdfd ~ ~ ~ ~ ~0000000000~ ~0000000000~ ~rrrrñtttttt ~
Do you work for BioWare?
Anyway,
The problem here doesn't come from the row terminator. The error message tells you that the first error occurs at the first column, so there's a problem with your field terminator.
My bet is on the DATAFILETYPE. Try 'widechar' instead of 'widenative': the native formats carry their own length information, so using 'widenative' makes BULK INSERT ignore the FIELDTERMINATOR option.
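Applied to the statement from the question, that suggestion looks like the sketch below. This assumes the file really is Unicode (UTF-16) character data; with 'widechar', the FIELDTERMINATOR is honored:

```sql
BULK INSERT tmp_table FROM 'file.jrl'
WITH (
    DATAFILETYPE = 'widechar',   -- Unicode character data; FIELDTERMINATOR is honored
    FIELDTERMINATOR = '~',
    MAXERRORS = 0,
    ROWS_PER_BATCH = 116064,
    ROWTERMINATOR = '\n',        -- translated for Unicode files; try '0x0a' if this fails
    TABLOCK
);
```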
WKR.