The resultant error is:
Msg 207, Level 16, State 1, Line 9
Invalid column name 'Email'.
Code:
-- Bulk insert data from csv file into server temp table
BULK INSERT vwTemporaryIT_USE_ONLY_Import FROM 'C:\Bulk\b_email.csv'
WITH (
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
-- Set the flag in db for all records imported from csv
UPDATE [APTIFY].[dbo].[Person]
SET
[IT_Use_Only] = 1
WHERE
[Email] IN
(Select [Email] From vwTemporaryIT_USE_ONLY_Import)
GO
I can see that the vwTemporaryIT_USE_ONLY_Import table is being populated with the data from the CSV fine, but it seems the following statement is failing for some reason:
WHERE
[Email] IN
(Select [Email] From vwTemporaryIT_USE_ONLY_Import)
I am certainly not an expert at this, and I may not have set up the table or view correctly, as I recently added the Email column to both. But they have a matching datatype of nvarchar(100) NOT NULL. I have also tried it as NULL. I'm not even sure whether IN handles nvarchar, such is the level of my SQL expertise. Any clues what I'm doing wrong?
Actually no! In the Person table it was called Email1. I have changed the code to:
WHERE [Email1]
IN
(Select [Email] From vwTemporaryIT_USE_ONLY_Import)
...and now it works fine. Thanks for your help!!
Related
I'm having trouble inserting my CSV file from a location into a table that will be used for making reports and for data extraction matched with other data.
CREATE TABLE #PD_ABC (
Column1,
Column2
-- etc.
)
BULK INSERT #PD_ABC FROM 'F:\BulkInsert\Andrej\UtkastAntal(23)Export20141003.csv'
WITH (FIELDTERMINATOR = ';',CODEPAGE = 'RAW',ROWTERMINATOR = '0x0a')
insert into Maintenance.dbo.PD_ABC_Del1
select * from #PD_ABC
So far I suppose everything should work. I made a similar script for .txt files, but when it comes to CSV I somehow cannot import them correctly.
This is the error message I've been receiving:
Msg 4863, Level 16, State 1, Procedure PD_ABC_SP, Line 49
Bulk load data conversion error (truncation) for row 1, column 3 (Gldnr).
No idea how to move forward from this.
It looks like your third column (Gldnr) doesn't have enough characters for the data. Is it char or varchar? If so, you should give it more characters; see the sketch below.
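A minimal sketch of that fix, assuming Gldnr is a varchar column in the staging table from the question and that 255 is a generous guess at the needed length:
-- Widen the column so the CSV values no longer truncate (255 is a placeholder length)
ALTER TABLE #PD_ABC ALTER COLUMN Gldnr varchar(255)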
This is my source data in CSV format:
4,23,2AY5623,7235623
4,23,2GP1207,1451207
4,23,2GQ6689,4186689
Table:
CREATE TABLE [dbo].[Table1](
[idCodeLevel] [int] NOT NULL,
[idFirm] [int] NOT NULL,
[valCodeFrom] [varchar](15) NOT NULL,
[valCodeTo] [varchar](15) NOT NULL
) ON [PRIMARY]
This is the code I am using to bulk import:
USE Test
GO
TRUNCATE TABLE Table1
GO
BULK INSERT Table1
FROM 'C:\Temp\test.csv'
WITH (
FIELDTERMINATOR = ',',
MAXERRORS=0,
ROWTERMINATOR = '\n'
)
GO
The error I am getting is:
Msg 4864, Level 16, State 1, Line 2
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 1 (idCodeLevel).
Can someone please tell me why it is failing?
I googled and found that I might have to use a format (.fmt) file, but how can I create one from a CSV file? I have only seen code to create a .fmt file from a SQL table.
Thanks a lot for your help!
Does the csv have a row of field names at the top? If so, you'll need to add FIRSTROW = 2 to your bulk statement. If not, try creating a new table that is all VARCHAR fields, then check the data: you probably have something strange in your data that you aren't expecting, like a non-printing character. Import as text and then try something like SELECT ISNUMERIC([FIELD1]) FROM NEWTABLE. A sketch follows.
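A minimal sketch of that staging-table check, reusing the Table1 layout from the question (the staging table name and the varchar(50) lengths are placeholders):
-- All-varchar staging table so the bulk load cannot fail on type conversion
CREATE TABLE Table1_Staging (
idCodeLevel varchar(50) NULL,
idFirm varchar(50) NULL,
valCodeFrom varchar(50) NULL,
valCodeTo varchar(50) NULL
)
GO
BULK INSERT Table1_Staging
FROM 'C:\Temp\test.csv'
WITH (
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
-- Any row returned here has non-numeric data hiding in the first column
SELECT * FROM Table1_Staging WHERE ISNUMERIC(idCodeLevel) = 0
GO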
Alternatively, use the SQL Server Import Wizard to import the data from the external file.
Right-click on the database--->Tasks--->Import Data--->specify the flat file as the source and select the destination server.
For more information please visit Import CSV data to SQL
I have a CSV file that contains stock quotes. I am new when it comes to SQL, but I have done a lot of research and come up with code that I thought should work. But it doesn't; I get errors all the way.
USE ShakeOut
GO
CREATE TABLE CSVTest1
(Ticker varchar(10),
dateval smalldatetime),
timevale time(),
Openval varchar(10),
Highval varchar(10),
Lowval varchar(10),
Closeval varchar(10),
Volume varchar(10),
)
GO
BULK
INSERT CSVTest1
FROM 'c:\TEST.txt'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
--Check the content of the table.
SELECT *
FROM CSVTest1
GO
--Drop the table to clean up database.
DROP TABLE CSVTest1
GO
My CSV file has a time value like 03:15:00 PM, and I'm not sure how to set that up in the table. The other values I think are approximately right; here's a sample of my CSV file:
5/1/2009,9:30:00 AM,18.21,18.45,18.21,18.32,32163
5/1/2009,9:35:00 AM,18.33,18.34,18.27,18.29,36951
5/1/2009,9:40:00 AM,18.29,18.38,18.25,18.37,53198
5/1/2009,9:45:00 AM,18.38,18.4,18.28,18.285,49491
And here is my error messages in SQL Management Studio:
Msg 102, Level 15, State 1, Line 4
Incorrect syntax near ','.
Msg 208, Level 16, State 82, Line 3
Invalid object name 'CSVTest1'.
Msg 208, Level 16, State 1, Line 3
Invalid object name 'CSVTest1'.
Msg 3701, Level 11, State 5, Line 3
Cannot drop the table 'CSVTest1', because it does not exist or you do not have permission.
I would really appreciate help here; my head is about to explode after all these hours without any progress. I've tried MySQL too, and it didn't work there either.
As I'm new, I might need it explained in detail.
It appears you have an extraneous comma in the CREATE TABLE statement: there is a comma following the final column, just before the closing paren. Perhaps it is valid in some implementations, but you might try removing it. Change it to:
Volume varchar(10)
Ah - and it appears there is an extraneous closing paren in the second column definition. Change it to:
dateval smalldatetime,
And the time column:
timevale time,
Ultimately, it appears you should just get the CREATE TABLE statement syntax correct first, then start adding the other parts.
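Putting those fixes together (trailing comma removed, stray paren removed, time() changed to time), the CREATE TABLE statement would read:
CREATE TABLE CSVTest1
(Ticker varchar(10),
dateval smalldatetime,
timevale time,
Openval varchar(10),
Highval varchar(10),
Lowval varchar(10),
Closeval varchar(10),
Volume varchar(10)
)
GO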
There is no need for a comma after the last column definition: Volume varchar(10),.
I assume timevale should be timeval.
time() should just be time.
Also, I'm probably being picky, but you have capitalised the first letter of all the column names except the first two. That won't cause an error, but it's better to have a consistent naming convention; I would capitalise the 'v' in 'val' and write the whole word too.
The CSV data needs revising too - you need to specify EVERY column, even if it is null. See my example data (the new lines at the end of each row are for illustration purposes only).
1234567890,2012-08-25,22:15,anytext,ornum,for,varchar,columns <-new line
abcd123456,2010-05-20,00:01,anything,in,these,varchar,columns <-new line
abcd123456,2010-05-20,00:01,anything,in,,,columns <-new line
This works:
CREATE TABLE CSVTest1 (
Ticker varchar(10) NULL,
DateValue smalldatetime NULL,
TimeValue time NULL,
OpenValue varchar(10) NULL,
HighValue varchar(10) NULL,
LowValue varchar(10) NULL,
CloseValue varchar(10) NULL,
Volume varchar(10) NULL)
GO
BULK INSERT CSVTest1
FROM 'C:\TEST.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')
GO
Your CSV file needs to have a new line for each record you want to insert, as specified by the ROWTERMINATOR = '\n' and a comma between each field as specified by FIELDTERMINATOR = ','.
EDIT:
By the way if you are using SQL Server Management Studio (SSMS) you can create the table through the user interface and then:
Right click on the table
Script Table as
CREATE To
New Query Editor Window
I have the following table:
CREATE TABLE [dbo].[tempTable](
[id] [varchar](50) NULL,
[amount] [varchar](50) NULL,
[bdate] [varchar](50) NULL
)
and the following insert statement:
BULK INSERT dbo.tempTable
FROM 'C:\files\inv123.txt'
WITH
(
FIELDTERMINATOR ='\t',
ROWTERMINATOR ='\n'
)
I get the following error:
Bulk load data conversion error (truncation) for row 1, column 3
(bdate).
Data example in file:
12313 24 2012-06-08 13:25:49
12314 26 2012-06-08 12:25:49
It does look like it is simply never delimiting the row. I've had to separate rows by column delimiter AND row delimiter before, because the text file had a trailing (and unnecessary) column delimiter after the last value that took me a while to spot. Those dates would certainly fit the format (assuming there isn't just some bad data in a huge file that you can't visually spot; since the load doesn't fail until 10 errors by default, there would have to be at least that many bad records), and it looks like it is making it to that point correctly. View the file in hex in a good text editor if you can, or just try:
BULK INSERT dbo.tempTable
FROM 'C:\files\inv123.txt'
WITH
(
FIELDTERMINATOR ='\t',
ROWTERMINATOR = '\t\n'
)
Another possibility (which I doubt, considering the column is varchar(50)) is that there are headers in the inv123.txt file, and a header is being perceived as a data row and exceeding varchar(50), so it is what is being truncated. In that case you can add
FIRSTROW = 2,
If it still fails after these things, try to force some data in, or grab the rows that are erroring so you'll truly know where the problem is. Look into SET ANSI_WARNINGS OFF, or use ERRORFILE (depending on which flavor of SQL Server you're running; a sketch follows), or create a temp table with text as the datatype. SQL Server 2005 forces stricter data validation; forcing an insert without failures is more difficult, but it can be done.
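For instance, a minimal sketch of the ERRORFILE route for this load (the log path and the MAXERRORS value are placeholders, not from the original post):
BULK INSERT dbo.tempTable
FROM 'C:\files\inv123.txt'
WITH
(
FIELDTERMINATOR = '\t',
ROWTERMINATOR = '\n',
ERRORFILE = 'C:\files\inv123_errors.log', -- offending rows land here, with a companion file listing row numbers
MAXERRORS = 1000                          -- keep loading past bad rows instead of aborting
)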
I am doing a bulk insert:
DECLARE @row_terminator CHAR;
SET @row_terminator = CHAR(10);
DECLARE @stmt NVARCHAR(2000);
SET @stmt = '
BULK INSERT accn_errors
FROM ''F:\FullUnzipped\accn_errors_201205080105.txt''
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ''|'' ,
ROWS_PER_BATCH = 10000
,ROWTERMINATOR = '''+@row_terminator+'''
)'
EXEC sp_executesql @stmt;
and am getting the following error:
Msg 4832, Level 16, State 1, Line 2
Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 2
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 2
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
Is there a way to know on which ROW this error occurred?
I am able to import 10,000,000 rows without a problem; the error occurs after that.
To locate the troublesome row, use the ERRORFILE specifier.
BULK INSERT myData
FROM 'C:\...\...\myData.csv'
WITH (
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
ERRORFILE = 'C:\...\...\myRubbishData.log'
);
myRubbishData.log will have the offending rows and a companion file
myRubbishData.log.txt will give you row numbers and offsets into the file.
Companion file example:
Row 3 File Offset 152 ErrorFile Offset 0 - HRESULT 0x80004005
Row 5 File Offset 268 ErrorFile Offset 60 - HRESULT 0x80004005
Row 7 File Offset 384 ErrorFile Offset 120 - HRESULT 0x80004005
Row 10 File Offset 600 ErrorFile Offset 180 - HRESULT 0x80004005
Row 12 File Offset 827 ErrorFile Offset 301 - HRESULT 0x80004005
Row 13 File Offset 942 ErrorFile Offset 416 - HRESULT 0x80004005
Fun, fun, fun. I haven't found a good way to debug these problems, so I use brute force: the FIRSTROW and LASTROW options are very useful.
Start with LASTROW = 2 and keep trying. Load the results into a throw-away table that you can readily truncate; a sketch follows.
And you should also keep in mind that the first row could be causing you problems as well.
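A minimal sketch of that brute-force probe against the load from the question (the throw-away table name and the LASTROW value are placeholders):
-- Throw-away copy of the target table's shape to absorb test loads
SELECT TOP 0 * INTO accn_errors_probe FROM accn_errors;

BULK INSERT accn_errors_probe
FROM 'F:\FullUnzipped\accn_errors_201205080105.txt'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = '|',
ROWTERMINATOR = '0x0A', -- hex notation for CHAR(10)
LASTROW = 1000          -- move this boundary up or down until the failing region is pinned
);

TRUNCATE TABLE accn_errors_probe; -- reset between attempts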
I have a CSV file that I import using BULK INSERT:
BULK INSERT [Dashboard].[dbo].[3G_Volume]
FROM 'C:\3G_Volume.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = '","',
ROWTERMINATOR = '\n'
)
GO
Usually I use this script and it has no problems, but on rare occasions I encounter this error:
"The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error."
Usually, this happens when the last rows have blank (null) values.
You need to link your CSV file into an MS Access DB to check the data. (If your CSV has no more than 1,048,576 rows, Excel's limit, you can open it in Excel.) Since my data is around 3 million rows, I needed to use an Access DB.
Then find the last row with blanks and subtract the number of blank rows from the total row count of the CSV.
If you have 2 blank rows at the end and the total number of rows is 30000005, the script becomes:
BULK
INSERT [Dashboard].[dbo].[3G_Volume]
FROM 'C:\3G_Volume.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = '","',
ROWTERMINATOR = '\n',
LASTROW = 30000003
)
GO
Cheers...
Mhelboy
If CHAR(10) is the row terminator, I don't think you can put it in quotes like you are trying to in BULK INSERT. There is an undocumented way to indicate it, though:
ROWTERMINATOR = '0x0A'
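With that notation, the dynamic SQL and the CHAR(10) variable from the question are no longer needed; a sketch:
BULK INSERT accn_errors
FROM 'F:\FullUnzipped\accn_errors_201205080105.txt'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = '|',
ROWS_PER_BATCH = 10000,
ROWTERMINATOR = '0x0A' -- hex notation for CHAR(10), i.e. LF
);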
Yeah - BULK INSERT would have done well with a bit more detail in its error messages, and the only way around this is to use the brute-force approach, as Gordon rightly pointed out. First, though, based on the error you're getting, either it is not understanding your row terminator, or there is a row terminator missing at the end of the file. Using FIRSTROW and LASTROW will help to determine that.
So, you need to do the following:
Check that there is a row terminator at the end of the file. If not, put one in and try again. Also make sure that the last row contains all of the necessary fields. If it says 'EOF', then that is your problem.
Are you sure there's an LF at the end of each line? Try a CR (\r, 0x0D) and see if that works.
Still not working? Try setting LASTROW=2 and try again. Then try LASTROW=3. If you have more than three rows in your file and this step fails, then the row terminator isn't working.
I ran into the same issue. I had written a shell script to create a .csv on Linux. I took this .csv to Windows and tried to bulk load the data, and it did not "like" the commas. Don't ask me why, but I changed to a * as the delimiter in the bulk import and performed a find-and-replace of comma with * in my .csv, and that worked. I changed to a ~ as the delimiter and that worked too; tab also worked. It just didn't like the comma. Hope this helps someone; a sketch follows.
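A minimal sketch of that workaround, assuming the commas in the file have already been find-and-replaced with ~ (the table name and path are placeholders):
-- The only change that mattered: a FIELDTERMINATOR other than the comma
BULK INSERT dbo.myStagingTable
FROM 'C:\data\quotes_tilde.csv'
WITH
(
FIELDTERMINATOR = '~', -- matches the character substituted for the commas
ROWTERMINATOR = '\n'
);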
In my experience this is almost always caused by something in the last two lines. Run tail on the import file and it should still give you the failure. Then open it in a full text editor that lets you see non-printing characters like CR, LF, and EOF. That should enable you to kludge it into working, even if you don't know why. E.g., BULK INSERT fails with row terminator on last row
I got around the problem by converting all fields to strings and then using a common FIELDTERMINATOR. This worked:
BULK INSERT [dbo].[workingBulkInsert]
FROM 'C:\Data\myfile.txt' WITH (
ROWTERMINATOR = '\n',
FIELDTERMINATOR = ','
)
My data file looks like this now:
"01502","1470"
"01504","686"
"02167","882"
"106354","882"
"106355","784"
"106872","784"
The second field had been a decimal value with no double-quote delimiter (like ,1470.00). Formatting both fields as strings eliminated the error.
You need to create a table in which all the columns are nullable, remove the blank space in the last row, and add only those columns that are available in the file. And please do not create a primary key column: this process does not auto-increment an identity for you, which is why it creates the error.
I have done a bulk insert like this:
CREATE TABLE [dbo].[Department](
[Deptid] [bigint] IDENTITY(1,1) NOT NULL,
[deptname] [nvarchar](max) NULL,
[test] [nvarchar](max) NULL,
CONSTRAINT [PK_Department] PRIMARY KEY CLUSTERED
(
[Deptid] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
CREATE TABLE [dbo].[Table_Column](
[column1] [nvarchar](max) NULL,
[column2] [nvarchar](max) NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
BULK INSERT Table_Column
FROM 'C:\Temp Data\bulkinsert1.csv'
WITH (
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
BATCHSIZE = 300000
);
insert into [dbo].[Department]
select column1,column2 from Table_Column
I got around the problem by converting all fields to strings and then using a common field delimiter.
The rows generating this error either don't have a CHAR(10) terminator or have unnecessary spaces.