Getting an error during bulk insert - SQL

I am getting an error when I try to do a bulk insert:
BULK INSERT #tbl_InterCompanyUploadDetail_Staging
FROM '\\SVRP03546008461D\QA\UploadTemplatewithvalidation.xlsx'
WITH (FIRSTROW = 6, FIELDTERMINATOR ='\t', ROWTERMINATOR ='\\n' )
The error that I am getting is:
Bulk load data conversion error (truncation) for row 6, column 2 (Oracle Company Code).
The column in Excel has the value 470, and the database column is varchar(10).
So what could be the reason for the error?

The Issue
BULK INSERT does not work with .xlsx files; try converting the .xlsx file to a .csv file first if you want to use BULK INSERT.
1st Solution - using OPENROWSET
Try using OPENROWSET with Microsoft.ACE.OLEDB.12.0 provider:
Insert into <rawdatatable>
select * FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
'Excel 12.0;Database=D:\SSIS\FileToLoad.xlsx;HDR=YES',
'SELECT * FROM [Sheet1$]')
OR
SELECT * INTO Data_dq
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
'Excel 12.0; Database=D:\Desktop\Data.xlsx', [Sheet1$]);
2nd Solution - using OPENDATASOURCE
SELECT * INTO Data_dq
FROM OPENDATASOURCE('Microsoft.ACE.OLEDB.12.0',
'Data Source=D:\Desktop\Data.xlsx;Extended Properties=Excel 12.0')...[Sheet1$];
References
Import/Export Excel (.Xlsx) or (.Xls) File into SQL Server
Import data from Excel to SQL Server or Azure SQL Database
How to Bulk Insert from XLSX file extension?

Replace "\n" with "0x0a" as the ROWTERMINATOR and try again.
Alternatively, if you are building the statement as dynamic SQL:
ROWTERMINATOR = '''+CAST(0x0000 AS CHAR(1))+'''
Let me know if it works.
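Applied to the statement from the question, the hex row terminator would look like this. This is only a sketch: it assumes the source file has actually been saved as tab-delimited text, since BULK INSERT cannot parse a real .xlsx file.

```sql
-- The question's statement with the row terminator written as a
-- hex byte (0x0a is the line-feed character) instead of '\n'.
BULK INSERT #tbl_InterCompanyUploadDetail_Staging
FROM '\\SVRP03546008461D\QA\UploadTemplatewithvalidation.xlsx'
WITH (
    FIRSTROW = 6,
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR = '0x0a'
);
```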

I doubt that BULK INSERT supports XLSX files. If XLSX were supported, FIELDTERMINATOR and ROWTERMINATOR would not be needed.
XLSX is a zip archive, so I suspect (though I am not sure) that XLSX is not supported, and that you are getting the truncation error because BULK INSERT is reading the file as plain text and pulling in long runs of bytes up to the next FIELDTERMINATOR.
To confirm, try increasing the length of the column to a few thousand characters and running BULK INSERT again; if you get garbage characters, then it is reading the file as plain text. The garbage may well look the same as what you see when you open the same .xlsx file in Notepad or Notepad++.

You can't bulk load XLSX into SQL Server. You CAN convert the XLSX to a tab delimited text file and bulk load.
If this is a one-off operation, I would recommend converting to text first (but beware of how Excel exports certain types, such as dates and large numbers). Or you can use the Import/Export wizard (https://learn.microsoft.com/en-us/sql/relational-databases/import-export/import-data-from-excel-to-sql).
If this is a process you need to repeat, I would create an SSIS package.

Related

Save output into Varbinary(max) format

Using this SQL statement, I pull locally stored Excel sheet into SQL Server.
SELECT *
FROM OPENROWSET( 'Microsoft.ACE.OLEDB.12.0','Excel 12.0;Database=C:\SampleTestFiles\outputFile.xlsx;HDR=YES', [Sheet1$])
And I get the complete Excel sheet details, with multiple rows, in the output console.
Now I want to save the Excel file itself into a database column of datatype varbinary(max).
Can anyone please let me know how to do that?
Use application code, such as C#, to read the file and then insert the binary data into your varbinary column in the database.
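If you would rather stay in T-SQL, OPENROWSET with the BULK option and SINGLE_BLOB can read a file from disk straight into a varbinary(max) column. This is a sketch; the table name Attachments, the column FileContents, and the file path are assumptions, not from the question.

```sql
-- Load the whole file as a single binary value (SINGLE_BLOB) into a
-- varbinary(max) column. BulkColumn is the name OPENROWSET gives the blob.
INSERT INTO Attachments (FileContents)
SELECT BulkColumn
FROM OPENROWSET(BULK 'C:\SampleTestFiles\outputFile.xlsx', SINGLE_BLOB) AS f;
```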

SQL bulk insert from Cisco Unified Communications Manager (CallManager) file extract

I'm trying to populate a SQL table with data from a CUCM file extract.
For some reason BULK INSERT seems not to work and gives me (0 rows affected); this has been bugging me all day.
The SQL table is located here: SQL Fiddle link
The data is located here (under javascript): Data can be copied and pasted into Notepad++
I opted to use JSFiddle to save the data as it's reasonably practical and I don't have to upload files that might be deemed dodgy.
The data from the JSFiddle can be saved into a file with Notepad++ and loaded with the BULK INSERT command from the SQL Fiddle. However, it will not load.
I am trying this on SQL Server 2012 (11.0.6251.0).
The actual code I use to try to import it:
BULK INSERT CDR FROM 'R:\CDR_Telephony\Imports\cdr_StandAloneCluster_01_201605252003_107183' WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '0a',
FIRSTROW=3
)
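One thing worth checking in the statement above: ROWTERMINATOR = '0a' is taken as the literal characters 0 and a, not as a hex byte, so no row terminator is ever matched. The hex form needs the 0x prefix. A hedged guess at the corrected statement:

```sql
-- Same statement with the row terminator written as a real hex byte
-- (0x0a = line feed); path and other options unchanged.
BULK INSERT CDR
FROM 'R:\CDR_Telephony\Imports\cdr_StandAloneCluster_01_201605252003_107183'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a',
    FIRSTROW = 3
);
```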

SQL Server bulk insert of CSV with data containing a comma

Below is a sample line of the CSV:
012,12/11/2013,"<555523051548>KRISHNA KUMAR ASHOKU,AR",<10-12-2013>,555523051548,12/11/2013,"13,012.55",
You can see that KRISHNA KUMAR ASHOKU,AR is a single field, but BULK INSERT treats KRISHNA KUMAR ASHOKU and AR as two different fields because of the comma, even though they are enclosed in quotes.
I tried
BULK
INSERT tbl
FROM 'd:\1.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
FIRSTROW=2
)
GO
Is there any solution for it?
The answer is: you can't do that. See http://technet.microsoft.com/en-us/library/ms188365.aspx.
"Importing Data from a CSV file
Comma-separated value (CSV) files are not supported by SQL Server bulk-import operations. However, in some cases, a CSV file can be used as the data file for a bulk import of data into SQL Server. For information about the requirements for importing data from a CSV data file, see Prepare Data for Bulk Export or Import (SQL Server)."
The general solution is that you must convert your CSV file into one that can be successfully imported. You can do that in many ways, such as by creating the file with a different delimiter (such as TAB), or by importing your table using a tool that understands CSV files (such as Excel or many scripting languages) and exporting it with a unique delimiter (such as TAB), from which you can then BULK INSERT.
They added support for this in SQL Server 2017 (14.x) CTP 1.1. You need to use the FORMAT = 'CSV' input file option with the BULK INSERT command.
To be clear, here is what the CSV that was giving me problems looks like. The first line is easy to parse; the second line contains the curve ball, since there is a comma inside the quoted field:
jenkins-2019-09-25_cve-2019-10401,CVE-2019-10401,4,Jenkins Advisory 2019-09-25: CVE-2019-10401:
jenkins-2019-09-25_cve-2019-10403_cve-2019-10404,"CVE-2019-10404,CVE-2019-10403",4,Jenkins Advisory 2019-09-25: CVE-2019-10403: CVE-2019-10404:
Broken Code
BULK INSERT temp
FROM 'c:\test.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '0x0a',
FIRSTROW= 2
);
Working Code
BULK INSERT temp
FROM 'c:\test.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '0x0a',
FORMAT = 'CSV',
FIRSTROW= 2
);
Unfortunately, the SQL Server import methods (BCP and BULK INSERT) do not understand quoting with " ".
Source: http://msdn.microsoft.com/en-us/library/ms191485%28v=sql.100%29.aspx
I have encountered this problem recently and had to switch to tab-delimited format. If you do that and use SQL Server Management Studio to do the import (right-click on the database, then select Tasks, then Import), tab-delimited works just fine. The BULK INSERT option with tab-delimited should also work.
I must admit to being very surprised to find that Microsoft SQL Server had this comma-delimited issue. The CSV file format is a very old one, so finding out that this was an issue with a modern database was very disappointing.
MS have now addressed this issue: you can use FIELDQUOTE in your WITH clause to add quoted-string support:
FIELDQUOTE = '"',
anywhere in your WITH clause should do the trick, if you have SQL Server 2017 or above.
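Putting FORMAT and FIELDQUOTE together, a statement for the quoted-comma CSV from the question might look like the sketch below (SQL Server 2017 or later; the table name and path are taken from the question's example).

```sql
-- FORMAT = 'CSV' turns on RFC-4180-style CSV parsing;
-- FIELDQUOTE names the quote character (double quote is the default).
BULK INSERT tbl
FROM 'd:\1.csv'
WITH
(
    FORMAT = 'CSV',
    FIELDQUOTE = '"',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a',
    FIRSTROW = 2
);
```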
Well, BULK INSERT is very fast but not very flexible. Can you load the data into a staging table and then push everything into a production table? Once the data is in SQL Server, you will have much more control over how you move it from one table to another. So, basically:
1) Load the data into staging.
2) Clean/convert by copying to a second staging table defined with the desired datatypes. Good data is copied over; bad data is left behind.
3) Copy the data from the "clean" table to the "live" table.
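The three steps above might be sketched like this; all table and column names here are illustrative, not from the question.

```sql
-- 1) Load everything as raw text into a wide staging table.
BULK INSERT StagingRaw
FROM 'd:\1.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0a', FIRSTROW = 2);

-- 2) Copy convertible rows into a typed staging table.
--    TRY_CONVERT returns NULL for bad data, so bad rows are left behind.
INSERT INTO StagingClean (Id, Amount)
SELECT TRY_CONVERT(int, RawId), TRY_CONVERT(decimal(18, 2), RawAmount)
FROM StagingRaw
WHERE TRY_CONVERT(int, RawId) IS NOT NULL;

-- 3) Copy the clean rows into the live table.
INSERT INTO LiveTable (Id, Amount)
SELECT Id, Amount
FROM StagingClean;
```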

Bulk Import .CSV to SQL table

I have a bizarre problem with importing .csv files into a SQL table.
The bulk import from the .csv does not give any errors and completes fine, but the data is not imported into the SQL table. There is data in the .csv.
However, when I open the .csv in Excel and save it again as .csv, then try the bulk import and check the tables, the data is there.
Any ideas what it could be? And if it is an encoding issue, how can I check the encoding of the .csv file before importing, or force SQL to import it no matter what?
Thanks.
To identify rejects and issues when loading from a file, we have the log file and the bad file:
Your log file will have the reason behind each reject.
Your bad file will have a sample rejected record.
As for your question about why it works after opening the file in Excel: the format of some fields (maybe date columns) changes when you open and re-save it in Excel, which may then suit your table structure.
Try setting MAXERRORS = 0 and run:
BULK INSERT OrdersBulk
FROM 'c:\file.csv'
WITH
(
FIRSTROW = 2,
MAXERRORS = 0,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
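If the difference really is an encoding issue (re-saving from Excel typically rewrites the file in the local ANSI code page), BULK INSERT can be told the encoding of the source file explicitly. This is a sketch, not a diagnosis; check what encoding Notepad++ reports for the file first.

```sql
-- Declare the file's encoding: CODEPAGE = '65001' means UTF-8
-- (supported in SQL Server 2016 and later); for UTF-16 files use
-- DATAFILETYPE = 'widechar' instead of CODEPAGE.
BULK INSERT OrdersBulk
FROM 'c:\file.csv'
WITH
(
    FIRSTROW = 2,
    CODEPAGE = '65001',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);
```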
Hope this helps

BULK INSERT with Excel from a VARBINARY(MAX) field

BULK INSERT Communication.Message
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
'Excel 8.0;Database=C:\temp\Messages201101.XLS', [messages$])
How do I take the above and instead read Message.XLS from a VARBINARY(MAX) field named FileBytes in a table named Attachments.FileContents? I already know how to stage it in the table by various methods; I just do not know a method to do a BULK INSERT from a VARBINARY(MAX) field.
The Jet driver can't open a VARBINARY that contains the bytes of your file. Looking at this MSDN page, the documentation doesn't mention opening or mounting anything except files. You would have to take the bytes out of FileBytes, write them to a file, and then use that file in your OPENROWSET statement.