Bulk insert with Blank values - sql

I have a table with 48 columns into which I want to import data from a CSV file. The CSV file contains some blank values.
Whenever I use BULK INSERT I get these errors:
1) Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 1 (column name)
2) The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
3) Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
I am using SQL Server 2008. Below is the BULK INSERT command I am using:
BULK INSERT DataBaseName.dbo.TableName
FROM 'C:\FolderName\FileName.csv'
WITH
(
FIRSTROW = 1,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
KEEPNULLS
)
Please suggest how to handle this.

For this type of error, check the following:
1. The data lengths should match your .CSV file (use trial and error to find the right lengths).
2. The number of columns should match (check manually).
3. The data type conversions should succeed implicitly (using nvarchar for every column is a safe way to avoid conversion errors).
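One hedged way to apply point 3 is to bulk load into an all-nvarchar staging table first, then convert explicitly into the real table. A minimal sketch, assuming hypothetical column names (your table has 48 columns, so repeat the pattern); NULLIF turns blank CSV fields into NULLs, and plain CAST works on SQL Server 2008:

```sql
-- Staging table: every column nvarchar, so BULK INSERT never fails on type conversion
CREATE TABLE dbo.Staging_TableName
(
    Col1 NVARCHAR(4000) NULL,
    Col2 NVARCHAR(4000) NULL,
    Col3 NVARCHAR(4000) NULL   -- ...repeat for all 48 columns
);

BULK INSERT dbo.Staging_TableName
FROM 'C:\FolderName\FileName.csv'
WITH (FIRSTROW = 1, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', KEEPNULLS);

-- Convert explicitly; NULLIF maps blank strings to NULL before the cast
INSERT INTO DataBaseName.dbo.TableName (Col1, Col2, Col3)
SELECT NULLIF(Col1, ''),
       CAST(NULLIF(Col2, '') AS INT),
       CAST(NULLIF(Col3, '') AS DATETIME)
FROM dbo.Staging_TableName;
```

The target types (INT, DATETIME) are illustrative guesses; substitute the real column types of your table.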

Related

Script to Import data into SQL table from flat file (text file) .XYZ file

I am trying to create a script to import flat files into SQL Server tables. I tried using the Import Wizard, but since I need to do this periodically I would have to create a SQL procedure to achieve it, and I am not sure how to go about that. The flat files are stored in the following format:
19350.000 45978.000 1560.631
19352.000 45978.000 1560.234
19354.000 45978.000 1560.021
19356.000 45978.000 1559.809
19358.000 45978.000 1559.596
I have tried the following:
CREATE TABLE #TempTable
(
Id int identity (1,1),
X float,
Y float,
Z float
)
BULK INSERT #TempTable FROM
'\\fcgwnt01\share.$\StandardHaulage\TEST\Automated\EVO\SurfaceFiles\EVO 2019-01-23.xyz'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')
SELECT * INTO [dbo].[SHM_EVO_SURFACE_DETAILS] FROM #TempTable
--Drop temporary table
DROP TABLE #TempTable
But I'm getting the following errors
Msg 4866, Level 16, State 1, Line 12
The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 12
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 12
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)"

SQL Bulk insert ignores first data row

I am trying to import a pipe-delimited file into a temporary table using BULK INSERT (UTF-8, Unix-style row terminator), but it keeps ignoring the first data row (the one after the header) and I don't know why.
Adding | to the header row does not help either.
File contents:
SummaryFile_20191017140001.dat|XXXXXXXXXX|FIL-COUNTRY|128
File1_20191011164611.dat|2|4432|2|Imported||
File2_20191011164611.dat|3|4433|1|Imported||
File3_20191011164611.dat|4|4433|2|Imported||
File4_20191011164611.dat|5|4434|1|Imported|INV_ERROR|
File5_20191011164611.dat|6|4434|2|Imported||
File6_20191011164611.dat|7|4434|3|Imported||
The bulk insert throws no error, but it keeps ignoring the first data line (File1_...)
SQL below:
IF OBJECT_ID('tempdb..#mycsv') IS NOT NULL
DROP TABLE #mycsv
create table #mycsv
(
tlr_file_name varchar(150) null,
tlr_record_id int null,
tlr_pre_invoice_number varchar(50) null,
tlr_pre_invoice_line_number varchar(50) null,
tlr_status varchar (30) null,
tlr_error_code varchar(30) null,
tlr_error_message varchar (500) null)
bulk insert #mycsv
from 'D:\TestData\Test.dat'
with (
rowterminator = '0x0A',
fieldTerminator = '|',
firstrow = 2,
ERRORFILE = 'D:\TestData\Import.log')
select * from #mycsv
It's really bugging me, since I don't know what I'm missing.
If I specify FIRSTROW = 1, the script throws:
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (tlr_record_id).
Thanks in advance!
"UTF-8 with unix style row terminator" — I assume you're using a version of SQL Server that doesn't support UTF-8. From BULK INSERT (Transact-SQL):
Important: Versions prior to SQL Server 2016 (13.x) do not support code page 65001 (UTF-8 encoding).
If you are using 2016+, then specify the code page for UTF-8:
BULK INSERT #mycsv
FROM 'D:\TestData\Test.dat'
WITH (ROWTERMINATOR = '0x0A',
FIELDTERMINATOR = '|',
FIRSTROW = 1,
CODEPAGE = '65001',
ERRORFILE = 'D:\TestData\Import.log');
If you aren't using SQL Server 2016+, then you cannot use BULK INSERT to import a UTF-8 file; you will have to use a different code page or use a different tool.
Note, also, that the above document states the below:
The FIRSTROW attribute is not intended to skip column headers. Skipping headers is not supported by the BULK INSERT statement. When skipping rows, the SQL Server Database Engine looks only at the field terminators, and does not validate the data in the fields of skipped rows.
If you are skipping rows, you still need to ensure the skipped rows are valid, but FIRSTROW is not for skipping headers. This means you should be using FIRSTROW = 1 and fixing your header row, as #sarlacii points out.
Of course, that does not fix the code page problem if you are using an older version of SQL Server; and my point stands that you'll have to use a different technology on 2014 and prior.
To import rows effectively into a SQL database, it is important to make the header formatting match the data rows. Add the missing delimiters, like so, to the header and try the import again:
SummaryFile_20191017140001.dat|XXXXXXXXXX|FIL-COUNTRY|128|||
The number of fields in the header, versus the data fields, must match, else the row is ignored, and the first satisfactory "data" row will be treated as the header.

SQL import/create columns in SMILE chemical structures

We are trying to import a CSV file whose first column contains chemical structures (SMILES) like this:
c1cccc(c12)n(C)c(c2)CN(C)C(=O)c(c3)ccc(c34)NCC(=O)N(C4)C,14-BENZODIAZEPINEDERIV.4_145,1
c1cccc(c12)n(C)c(c2)CN(C)C(=O)c(c3)ccc(c34)N[C#H](C(=O)N(C4)C)CC(=O)OC,14-BENZODIAZEPINEDERIV.3_146,1
Here is the code in SQL
--Define Table
CREATE TABLE Amide_actives_test
(Structure VARCHAR(40),
Name VARCHAR(40),
Active INT)
GO
--Import Data from CSV
BULK
INSERT Amide_actives_test
FROM 'C:\Amide_actives.csv'
WITH
(
FIELDTERMINATOR = ',', --CSV field delimiter
ROWTERMINATOR = '\n' --Use to shift the control to next row
)
GO
--Check the content of the table
SELECT * FROM Amide_actives_test
GO
The following error message pops up:
Bulk load data conversion error (truncation) for row 1, column 1 (Name).
Msg 4863, Level 16, State 1, Line 10
...repeating the previous 2 lines 10 times....
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 10
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
Apparently SQL Server has a problem reading the first column into "Structure VARCHAR(40)". I have tried all the string types (CHAR, VARCHAR, NCHAR, NVARCHAR, NTEXT, TEXT) and none of them works.
http://msdn.microsoft.com/en-us/library/ff848814.aspx
One way to solve this is to purchase a customized cartridge from Daylight. However, (1) it costs money and (2) it does not support SQL Server.
http://www.daylight.com/dayhtml/doc/pgsql/daycart_pg_search.html
May I know if any SQL guru has SQL solutions? Thanks!
The first problem is Structure VARCHAR(40): the column length is shorter than the input, so you get a truncation error. Try increasing the VARCHAR length and check again.
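For instance, the first SMILES string in the question is roughly 70 characters, well over 40. A sketch with wider lengths (the specific numbers are guesses, not requirements):

```sql
CREATE TABLE Amide_actives_test
(Structure VARCHAR(500),  -- SMILES strings can be long; 500 is an arbitrary upper bound
 Name      VARCHAR(100),
 Active    INT);
```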

error Bulk load data conversion error (truncation) when inserting data from a text file

I have about 20 to 50 rows in a text file to insert into my database 'bourse' using SQL Server Management Studio 2012. The database contains a number of tables; for example, the table IB_emetteur has 3 attributes (c_emetteur, libelle, mnemes),
and I have the data in a text file structured like this:
1,UFI,U.F.I
2,TSI, T.S.I
3,ADAY,A.Day
5,CAPITAL,C.PITAL
7,COFCAP,COFjuil
8,SFI,SUIhyuo
9,AFC,A.KIYUI
13,CGI,chakoqguio
14,BNAC,banque hyuijsii
I have to insert this data into my table, so I used this query:
bulk insert [dbo].[IB_Emetteur]
from 'C:\Users\Manu\Documents\GL5\Finance\liste_emetteur.txt'
with (fieldterminator = ',', rowterminator = '\n')
go
but I got this error:
Bulk load data conversion error (truncation) for row 1, column 2 (mnemes)
Try adding FIRSTROW = 2 to skip the first line:
bulk insert [dbo].[IB_Emetteur]
from 'C:\Users\Manu\Documents\GL5\Finance\liste_emetteur.txt'
with (fieldterminator = ',', rowterminator = '\n',FIRSTROW = 2)
go
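If the truncation error then persists on later rows, the mnemes column itself may simply be too narrow for values like "banque hyuijsii". A hedged alternative is to widen it (the length 100 is a guess, since the table definition is not shown):

```sql
ALTER TABLE [dbo].[IB_Emetteur]
ALTER COLUMN mnemes VARCHAR(100) NULL;
```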

SQL Server 2008 bulk import issue

I've been working on this for a while and can't figure out what I'm doing wrong. I have a CSV file with data such as
123,Jon,Son,M,1
When I run the query
BULK INSERT MYDB2..Dependent FROM 'c:\db3\db.csv'
WITH
(FIELDTERMINATOR=',',
ROWTERMINATOR = '/n')
I get errors like
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 5 (AGE).
The thing is, I made EXACT copies of the tables, so there is no way they don't match.
I believe the problem is the format of my query.
Your query does have a small problem: the row terminator should be '\n', not '/n':
BULK INSERT [Dependent] FROM 'c:\db3\db.csv'
WITH
(FIELDTERMINATOR=',' ,ROWTERMINATOR = '\n')
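One more caveat from the BULK INSERT documentation: specifying '\n' as a row terminator for a bulk import is implicitly interpreted as '\r\n'. If the file uses bare line feeds (for example, it was produced on a Unix system), the hex form avoids the ambiguity:

```sql
BULK INSERT [Dependent] FROM 'c:\db3\db.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0a');
```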