I'm having some trouble dealing with a missing row terminator at the end of a .csv file. I'm automatically downloading a Google Sheets .csv which is then bulk inserted into a SQL Server table. However, what I've found is that the final row of the file is not being inserted.
Looking at the file in Notepad++, all of the lines except the final one end with an 'LF' row terminator.
The code I'm using to insert is below.
bulk insert CSVworkout
from 'C:\Users\Documents\Personal\531 Workouts.csv'
with (
fieldterminator = ',',
rowterminator = '0x0a',
firstrow=2)
Has anyone encountered anything similar? Looking around, it seems this is a drawback of the Google Sheets .csv export, but is there a way I can either force the insert to recognise the final row, or is there a tool I can use to automatically append an LF to the final row?
Any tips are very welcome!
Thanks
Can't you just add a CRLF to the end of the file? There are plenty of different ways to do that; for example, you could use this in a batch file (echo. appends a line break, which gives the final data row the LF terminator your bulk insert is looking for):
echo. >> "C:\Users\Documents\Personal\531 Workouts.csv"
Every morning one of my clients sends me a .txt file with ';' as the separator. This is the file that is currently being imported into a temp table using SSIS:
mario.mascarenhas;MARIO LUIZ MASCARENHAS;2017-03-21 13:18:22;PDV;94d33a66dbaaff15a01d8139c7acd7c6;;;1;0;0;0;0;0;0;0;0;0;0;\N
evilanio.asevedo;EVILANIO ASEVEDO;2017-03-21 13:26:10;PDV;30a1bd072ac5f158f99445bb0975e423;;;1;1;0;0;0;0;0;0;0;0;0;\N
marcelo.tarso;MARCELO TARSO;2017-03-21 13:47:09;PDV;ef6b5e971242ec345552cdb724968f8a;;;1;0;0;0;0;0;0;0;0;0;0;\N
tiago.rodrigues;TIAGO ALVES RODRIGUES;2017-03-21 13:49:04;PDV;d782d4b30c0d302fe815b2cb48de4d03;;;1;1;0;0;0;0;0;0;0;0;0;\N
roberto.freire;ROBERTO CUSTODIO;2017-03-21 13:54:53;PDV;78794a18187f068d612e6b6370a60781;;;1;0;0;0;1;0;0;0;0;0;0;\N
eduardo.lima;EDUARDO MORO LIMA;2017-03-21 13:55:24;PDV;83e1c2696faa83d54881b13c70a07924;;;1;0;0;0;0;0;0;0;0;0;0;\N
Each file contains at least 23,000 rows just like that.
I already made a table with the correct number of columns to receive this data. So what I want is to "explode" each row (just like in PHP) using ';' as the column separator and loop the insert into my table named dbo.showHistoricalLogging.
I've been searching for a solution here on Stack Overflow, but found nothing specific that takes this volume of data into account while looping an insert.
Any idea? I'm running SQL Server 2008.
My suggestion:
Convert the text file into a .csv file, then refer to this post from Stack Overflow on using BULK INSERT. I used this approach while I was at the University of Arizona for one of my programming assignments in my Database Design class. If you need any clarification or have questions, leave a comment and I will do my best.
Something like this should work. Note that each line ends with the literal \N followed by a line break, so terminate the rows on the line break; the \N will simply land in the last column:
BULK INSERT [TableName] FROM 'C:\MyFile.txt' WITH (FIELDTERMINATOR = ';', ROWTERMINATOR = '\n');
Consult the Microsoft BULK INSERT documentation if you need other parameters. Alternatively, SSIS makes this easy as well; there are many ways you could do this, honestly.
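If the trailing \N bothers you (it is how MySQL-style exports represent NULL, not real data), you could clear it out after the load. A small sketch, where last_column is a stand-in for whatever your final column is actually named:
-- The literal \N in the source file is a NULL placeholder, not real data.
-- Replace last_column with the actual name of the final column in your table.
UPDATE dbo.showHistoricalLogging
SET last_column = NULL
WHERE last_column = '\N';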
I have a folder full of text files that are tab delimited. However, each has a lot of columns, and the columns can change. What I am looking for is a way to bring in these text files with the column names (1st row). In a perfect world the code would loop through all of the files I have in a folder (DemographicsPL) and import each one as a table with the original file name. I know there has to be a way to do this. Access can do this in one line of code, and I know SQL is better than Access.
I would like to do other things with the data, so I would like to do this in a stored procedure. Any help would be greatly appreciated, as I'm kind of a newbie to SQL. The code below works, but it requires that the table already exist.
--**** This works but is on a local drive and requires the table to already exist. ****
BULK INSERT [dbo].[TR15] FROM '\\MA000XSREA01\E$\TDLoad\DemographicsPL\BG15.txt'
WITH (
FIRSTROW = 2,
FIELDTERMINATOR = '\t',
ROWTERMINATOR = '\n'
);
GO
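There is no one-liner for this in T-SQL, but you can get most of the way with dynamic SQL wrapped in a stored procedure. A rough sketch, assuming the undocumented xp_dirtree procedure is available to list the folder, that each target table already exists with columns matching its file (building a CREATE TABLE from the header row would need an extra step, for example reading the first line via OPENROWSET ... SINGLE_CLOB), and that the SQL Server service account can read the UNC path:
-- List the files in the folder (xp_dirtree: depth 1, include files).
CREATE TABLE #files (subdirectory NVARCHAR(260), depth INT, isfile BIT);
INSERT INTO #files (subdirectory, depth, isfile)
EXEC master.sys.xp_dirtree '\\MA000XSREA01\E$\TDLoad\DemographicsPL', 1, 1;

DECLARE @fileName NVARCHAR(260), @sql NVARCHAR(MAX);

DECLARE fileCursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT subdirectory FROM #files WHERE isfile = 1 AND subdirectory LIKE '%.txt';

OPEN fileCursor;
FETCH NEXT FROM fileCursor INTO @fileName;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Table name = file name minus the .txt extension; the table must already exist
    -- with columns matching that file's header row.
    SET @sql = N'BULK INSERT ' + QUOTENAME(REPLACE(@fileName, '.txt', ''))
             + N' FROM ''\\MA000XSREA01\E$\TDLoad\DemographicsPL\' + @fileName
             + N''' WITH (FIRSTROW = 2, FIELDTERMINATOR = ''\t'', ROWTERMINATOR = ''\n'');';
    EXEC sys.sp_executesql @sql;

    FETCH NEXT FROM fileCursor INTO @fileName;
END

CLOSE fileCursor;
DEALLOCATE fileCursor;
DROP TABLE #files;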
So I stumbled on an interesting problem while trying to load some data. Essentially I have some files with data in them that I am trying to BULK INSERT into a table with varchar columns to make it easy to import. The file is tab delimited with CRLFs as the row terminator.
For some reason, when I write or copy-and-paste the BULK INSERT command from my own PC, the command fails. It gives an error stating:
Bulk load: DataFileType was incorrectly specified as widechar. DataFileType will be assumed to be char because the data file does not have a Unicode signature.
Then it says:
Bulk load failed. The column is too long in the data file for row 1, column 7. Verify that the field terminator and row terminator are specified correctly.
The command is as follows:
BULK INSERT <table>
FROM '<filepath>'
WITH
(
DATAFILETYPE = 'widechar',
FIELDTERMINATOR = ' ',
ROWTERMINATOR = '
'
);
Now, the part that doesn't make sense is that without changing any piece of that code, my co-worker was able to run and load the table with perfect success. No warning messages, no failures, nothing.
When I look at the command in Notepad++ with all character symbols enabled it appears to be correct with CRLFs as the row endings and arrows to denote tabs between columns.
The only thing I could come up with on my own is that somehow the encoding of my SQL Server Management Studio text editor must be messing up the field/row terminator arguments and causing the bulk insert command to fail.
Anyone have any bright ideas?
Turns out my coworker's computer is messed up and does something weird with encoding that enables him to just paste LFs correctly.
I was able to solve my problem by creating some dynamic SQL to execute the bulk insert, with the row terminator generated by CHAR(10) concatenated into the rest of the command.
CHAR(10) is the ASCII representation of a line feed, which was the row terminator in the files I had.
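A minimal sketch of that dynamic SQL approach, with the table name, file path, and terminators as illustrative placeholders (swap in CHAR(13) + CHAR(10) for the row terminator if your files really end in CRLF):
-- Build the command as a string so the terminators come from CHAR() values
-- instead of characters typed or pasted into the editor (which SSMS may re-encode).
DECLARE @tab VARCHAR(2) = CHAR(9);   -- tab field terminator
DECLARE @eol VARCHAR(2) = CHAR(10);  -- line-feed row terminator
DECLARE @sql NVARCHAR(MAX);

SET @sql = N'BULK INSERT dbo.MyStagingTable '
         + N'FROM ''C:\data\myfile.txt'' '
         + N'WITH (FIELDTERMINATOR = ''' + @tab + N''', '
         + N'ROWTERMINATOR = ''' + @eol + N''');';

EXEC sys.sp_executesql @sql;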
I have a bizarre problem with importing .csv files into a SQL table.
The bulk import from the .csv does not give any errors and completes fine, but the data is not imported into the SQL table. There is data in the .csv.
However, when I open the .csv in Excel, save it again as .csv, and try the bulk import, then check the table, the data is there.
Any ideas of what it could be? And if it is an encoding issue, how can I check the encoding of the .csv file before importing, or force SQL to import it no matter what?
Thanks.
To identify rejects and other issues when loading from a file, you have a log file and a bad file:
Your log file will have the reason behind the reject.
Your bad file will have a sample rejected record.
As for your question about why it works when you open it in Excel: the format of some fields (maybe date columns) changes when you open and re-save the file in Excel, and that may suit your table structure.
Try setting MAXERRORS = 0 and run:
BULK INSERT OrdersBulk
FROM 'c:\file.csv'
WITH
(
FIRSTROW = 2,
MAXERRORS = 0,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
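If you also want to capture the rejected rows themselves (the "bad file" mentioned above), BULK INSERT has an ERRORFILE option; a short sketch with an illustrative path (the file must not already exist, and the SQL Server service account needs write access to that folder):
BULK INSERT OrdersBulk
FROM 'c:\file.csv'
WITH
(
FIRSTROW = 2,
MAXERRORS = 0,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
ERRORFILE = 'c:\file_rejects.log'
)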
Hope this helps
I have managed to insert the whole text file's data into a SQL Server database table with this statement:
BULK INSERT [dbo].[tablename] FROM 'c:\weblog.log' WITH (
FIELDTERMINATOR = ' ',
ROWTERMINATOR = '\n' )
But my text file is not organized in any particular format, and it contains some data I want to omit from the insertion process. So I am looking for a way to insert only some of the data in the text file into my database table.
There are two ways. One is to write some code that reads only the specific data you want from the file and then inserts it into the database. The other, if you have only a small amount of data to remove from the file, is to run a regex find-and-replace to delete the unwanted portions and then do a bulk insert.
For bulk insert to work, you need the file to be a delimited text file. So if your log file is not delimited, you might not be able to load it using bulk insert. A sketch of the first approach is below.
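One hedged way to do the first approach entirely in T-SQL is to bulk insert every line into a one-column staging table and then keep only the lines you care about; the staging table name and the LIKE filter here are placeholders:
-- Stage every raw line first: one VARCHAR(MAX) column, one row per line.
-- This assumes the log lines contain no tab characters (the default field terminator);
-- if they might, set FIELDTERMINATOR to something that never appears, e.g. '0x0b'.
CREATE TABLE dbo.WeblogStaging (LogLine VARCHAR(MAX));

BULK INSERT dbo.WeblogStaging
FROM 'c:\weblog.log'
WITH (
ROWTERMINATOR = '\n' );

-- Then keep only the lines you want (the filter here is just an example) and
-- split them into the real table's columns with string functions in a second pass.
SELECT LogLine
FROM dbo.WeblogStaging
WHERE LogLine LIKE '%GET %';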