Msg 102, Level 15, Line 1, Incorrect syntax (however ...) - sql-server-2005

I'm running Microsoft SQL Server 2005 and I've got the following query to import the records from my CSV file.
However, it keeps giving me this syntax error:
LOAD DATA local INFILE 'C:\Users\Administrator\Downloads\update_05112013.csv' INTO TABLE dbo.Urls
FIELDS TERMINATED BY ';'
ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n'
Perhaps I'm missing something small?
Can any of you see what I'm doing wrong?

The LOAD DATA INFILE statement you're using is MySQL syntax, which is why SQL Server 2005 rejects it with a syntax error. The SQL Server counterpart, BULK INSERT, is a good way to insert data in bulk (as the name suggests), but it doesn't actually support CSV files:
http://technet.microsoft.com/en-us/library/ms188609.aspx
Comma-separated value (CSV) files are not supported by SQL Server
bulk-import operations. However, in some cases, a CSV file can be used
as the data file for a bulk import of data into SQL Server.
If you can create a CSV without quotation marks or escaped characters, this will work:
BULK INSERT dbo.Urls FROM 'C:\Users\Administrator\Downloads\update_05112013.csv'
WITH
(
FIELDTERMINATOR = ';',
ROWTERMINATOR = '\n'
)
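If the file does contain quoted values, one common workaround on SQL Server 2005 (which has no FIELDQUOTE option) is to bulk-load into an all-varchar staging table and strip the quotes afterwards. A minimal sketch, assuming a hypothetical staging table dbo.Urls_Staging with two columns named Url and Description (adjust both to match the real layout of dbo.Urls):
CREATE TABLE dbo.Urls_Staging
(
    Url         varchar(2048),   -- assumed column; match dbo.Urls
    Description varchar(2048)    -- assumed column; match dbo.Urls
);

BULK INSERT dbo.Urls_Staging
FROM 'C:\Users\Administrator\Downloads\update_05112013.csv'
WITH
(
    FIELDTERMINATOR = ';',
    ROWTERMINATOR = '\n'
);

-- Strip the surrounding double quotes while copying into the real table.
INSERT INTO dbo.Urls (Url, Description)
SELECT REPLACE(Url, '"', ''), REPLACE(Description, '"', '')
FROM dbo.Urls_Staging;
Note this only removes the quote characters; it still cannot cope with a field terminator hidden inside a quoted value.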

Related

Can't BULK INSERT Into SQL Table From CSV File [duplicate]

I seem to be unable to run a bulk insert on a SQL table from a CSV file that gets sent through FTP to our server. It runs without error, but alters 0 rows.
If I copy the data into another file, it works, but I need to be able to do this automatically without messing with the file myself. Opening them both, the only difference I can see is that the line breaks are CRLF in the new file and just LF in the original. The encoding looks to be the same in both, so I must be missing something not covered in the other similar questions.
Sample script below:
BULK INSERT dbo.t_process_order_import
FROM 'C:\Root\Product Data\H888 ProcOrd.csv'
WITH
(
FIRSTROW = 2, -- as 1st one is header
FIELDTERMINATOR = '|',
ROWTERMINATOR = '\n',
TABLOCK
)
Sample CSV Data:
ProcessOrder|ProductNumber|MaterialDescription|OrderQuantity|WorkCenter|StartDate|StartTime
001001101111|000000000000101111|TEST|40500.000 |CKET02|20201014|220000
001001221111|000000000000101111|TEST|14124.000 |GHFD02|20210225|032000
You need to use ROWTERMINATOR = '0x0a'.
Your code will become:
BULK INSERT dbo.t_process_order_import
FROM 'C:\Root\Product Data\H888 ProcOrd.csv'
WITH
(
FIRSTROW = 2, -- as 1st one is header
FIELDTERMINATOR = '|',
ROWTERMINATOR = '0x0a',
TABLOCK
)
As suggested, let me back this up with the source:
https://learn.microsoft.com/en-us/sql/relational-databases/import-export/specify-field-and-row-terminators-sql-server?view=sql-server-ver16
See the paragraph "Specifying \n as a Row Terminator for Bulk Import".
Quoting the part that matters for this question:
When you specify \n as a row terminator for bulk import, or implicitly use the default row terminator, bcp and the BULK INSERT statement expect a carriage return-line feed combination (CRLF) as the row terminator. If your source file uses a line feed character only (LF) as the row terminator - as is typical in files generated on Unix and Linux computers - use hexadecimal notation to specify the LF row terminator. For example, in a BULK INSERT statement
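The documentation then shows a short example along these lines (the table name and path below are placeholders, not the doc's exact sample):
BULK INSERT mytable
FROM 'D:\data.csv'
WITH (ROWTERMINATOR = '0x0a');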

SQL Bulk Insert - ROWTERMINATOR '0x0A' vs '\r\n'

I am writing a SQL job to replace some old legacy processing. Basically, it reads in a raw text file, does some manipulations to the data, then runs an update statement. Pretty simple. However, I ran into an issue with the BULK INSERT and I don't exactly understand the resolution, so I wanted to post it here.
The BULK INSERT statement I was writing was as follows:
-- Insert statements for procedure here
BULK INSERT [SomeTableName] FROM '\\FILE_PATH_HERE\test.txt'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = '\t',
ROWTERMINATOR = '0x0a' -- previously '\r\n'
);
When I ran the original statement with a ROWTERMINATOR of '\r\n', no data was populated in my table. When I replaced '\r\n' with '0x0a', my bulk insert worked.
The reason I am unsure why this worked is that when I view the file in Notepad++ with View->Show Symbol->Show All Characters, I see CRLF at the end of each data line. I thought that '\r\n' was the equivalent of a carriage return-line feed (CRLF), but it wasn't working. I thought '0x0a' only represented '\n', so I am a little unsure why changing ROWTERMINATOR from '\r\n' to '0x0a' worked.
Any insight would be appreciated.

SQL: CSV import with semicolons in data and double quotes

I want to import a CSV file which has some values like this:
123;456;"78;9";1011
Simply put, a value can contain the field separator (a semicolon), so it is wrapped in double quotes. When I use a bulk import, the value '"78' is put into one column, whereas '9"' is put into the next column. How can I prevent this?
I am using the query below:
BULK INSERT CSVTest
FROM 'c:\csvtest.csv'
WITH
(
FIELDTERMINATOR = ';',
ROWTERMINATOR = '\n'
)
GO
I'm using SQL Server!
In a test environment I've set up the new SQL Server, but FIELDQUOTE seems to be ignored in the statement and the fields are still split up. What am I doing wrong? I'm doing:
BULK INSERT CSVTest
FROM 'c:\csvtest.csv'
WITH
(
FIELDTERMINATOR = ';',
ROWTERMINATOR = '\n',
FIELDQUOTE='"'
)
GO
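For what it's worth, on SQL Server 2017 and later FIELDQUOTE is generally reported to take effect only when FORMAT = 'CSV' is also specified; a minimal sketch under that assumption:
BULK INSERT CSVTest
FROM 'c:\csvtest.csv'
WITH
(
    FORMAT = 'CSV',          -- needed for FIELDQUOTE to be honoured (SQL Server 2017+)
    FIELDQUOTE = '"',
    FIELDTERMINATOR = ';',
    ROWTERMINATOR = '\n'
)
GO
With FORMAT = 'CSV', quoted values that contain the field terminator (such as "78;9") are kept together in a single column.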

BulkInsert CSV file to table SQL Server

I'm having some real problems inserting my CSV file from a location into a table that will be used for making reports and for data extraction matched with other data.
CREATE TABLE #PD_ABC (
Column1,
Column2 -- etc.
)
BULK INSERT #PD_ABC FROM 'F:\BulkInsert\Andrej\UtkastAntal(23)Export20141003.csv'
WITH (FIELDTERMINATOR = ';',CODEPAGE = 'RAW',ROWTERMINATOR = '0x0a')
insert into Maintenance.dbo.PD_ABC_Del1
select * from #PD_ABC
So far I suppose everything should work. I made a similar script for .txt files, but when it comes to CSV somehow I cannot import them correctly.
This is the error message I've been receiving:
Msg 4863, Level 16, State 1, Procedure PD_ABC_SP, Line 49
Bulk load data conversion error (truncation) for row 1, column 3 (Gldnr).
No idea how to move forward from this.
It looks like your column 3 (Gldnr) isn't wide enough for the data. Is it char or varchar? If so, you should give it more characters.
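If so, a minimal sketch of the fix, assuming column 3 is the Gldnr column named in the error message and that varchar(100) is wide enough (both assumptions to check against the actual file):
-- Widen the staging column before the BULK INSERT runs
-- (or declare it this wide in the CREATE TABLE #PD_ABC statement directly).
ALTER TABLE #PD_ABC ALTER COLUMN Gldnr varchar(100);
-- If Maintenance.dbo.PD_ABC_Del1 has the same narrow column, widen it there too.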

SQL Server 2008 bulk import issue

I've been working on this for a while and can't figure out what I'm doing wrong. I have a CSV file with data such as:
123,Jon,Son,M,1
When I run the query
BULK INSERT MYDB2..Dependent FROM 'c:\db3\db.csv'
WITH
(FIELDTERMINATOR=',',
ROWTERMINATOR = '/n')
I get errors like
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 5 (AGE).
The thing is, I made EXACT copies of the tables, so there is no way my tables don't match.
I believe the problem is the format of my query.
It does have a little problem: it should be \n instead of /n.
BULK INSERT [Dependent] FROM 'c:\db3\db.csv'
WITH
(FIELDTERMINATOR=',' ,ROWTERMINATOR = '\n')
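If the file turns out to use bare LF line endings instead of CRLF (as in the earlier questions above), the same statement with a hexadecimal row terminator may be what's needed:
BULK INSERT [Dependent] FROM 'c:\db3\db.csv'
WITH
(FIELDTERMINATOR=',' ,ROWTERMINATOR = '0x0a')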