How to insert data from text file into my table? - sql-server-2005

I have text files that contain one word per line, and I would like to add this content to a column in my table. The column type is varchar. How can I accomplish that?

You can treat your file as a special case of CSV - it's a CSV file with only one column.
See this article for how to bulk insert from a CSV file.
BULK INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
FIELDTERMINATOR = ',', -- irrelevant for a one-column file, but harmless
ROWTERMINATOR = '\n'   -- one row per line
)
GO
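For the one-word-per-line case, a minimal sketch of getting the words into an existing varchar column is to bulk load into a single-column staging table and then copy the rows across (the table, column, and file names here are just illustrative):
-- Staging table with a single varchar column
CREATE TABLE dbo.WordStaging (Word varchar(255) NOT NULL)
GO
BULK INSERT dbo.WordStaging
FROM 'c:\words.txt' -- hypothetical path to the one-word-per-line file
WITH (ROWTERMINATOR = '\n') -- no FIELDTERMINATOR needed: one value per line
GO
-- Copy the staged words into the varchar column of the real table
INSERT INTO dbo.MyTable (Word)
SELECT Word FROM dbo.WordStaging
GO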

You can also use the Import and Export Wizard provided in SQL Server Management Studio.

Related

Can't BULK INSERT Into SQL Table From CSV File [duplicate]

I seem to be unable to run a bulk insert on a SQL table from a CSV file that gets sent through FTP to our server. It runs without error, but alters 0 rows.
If I copy the data into another file, it works, but I need this to run automatically without touching the file myself. Opening both files, the only difference I can see is that the line breaks are CRLF in the new file and just LF in the original. The encoding looks the same in both, so I must be missing something not covered in the other similar questions.
Sample script below:
BULK INSERT dbo.t_process_order_import
FROM 'C:\Root\Product Data\H888 ProcOrd.csv'
WITH
(
FIRSTROW = 2, -- as 1st one is header
FIELDTERMINATOR = '|',
ROWTERMINATOR = '\n',
TABLOCK
)
Sample CSV Data:
ProcessOrder|ProductNumber|MaterialDescription|OrderQuantity|WorkCenter|StartDate|StartTime
001001101111|000000000000101111|TEST|40500.000 |CKET02|20201014|220000
001001221111|000000000000101111|TEST|14124.000 |GHFD02|20210225|032000
You need to use ROWTERMINATOR='0x0a'
Your code will become:
BULK INSERT dbo.t_process_order_import
FROM 'C:\Root\Product Data\H888 ProcOrd.csv'
WITH
(
FIRSTROW = 2, -- as 1st one is header
FIELDTERMINATOR = '|',
ROWTERMINATOR = '0x0a',
TABLOCK
)
As suggested, here is the source that backs this up:
https://learn.microsoft.com/en-us/sql/relational-databases/import-export/specify-field-and-row-terminators-sql-server?view=sql-server-ver16
At paragraph "Specifying \n as a Row Terminator for Bulk Import"
Quoting the part that matters for this question:
When you specify \n as a row terminator for bulk import, or implicitly use the default row terminator, bcp and the BULK INSERT statement expect a carriage return-line feed combination (CRLF) as the row terminator. If your source file uses a line feed character only (LF) as the row terminator - as is typical in files generated on Unix and Linux computers - use hexadecimal notation to specify the LF row terminator. For example, in a BULK INSERT statement

Import CSV into SQL (CODE)

I want to import several CSV files automatically using SQL code (i.e. without using the GUI). Normally, I know the dimensions of my CSV file, so in many cases I create an empty table with, let's say, x columns with the corresponding data types. Then I import the CSV file into this table using BULK INSERT. However, in this case I don't know much about my files, i.e. information about data types and dimensions is not given.
To summarize the problem:
I receive a file path, e.g. C:...\DATA.csv. Then, I want to use this path in SQL-code to import the file to a table without knowing anything about it.
Any ideas on how to solve this problem?
Use something like this:
BULK INSERT tbl
FROM 'csv_full_path'
WITH
(
FIRSTROW = 2, -- skip the header row if the file has one
FIELDTERMINATOR = ',', -- CSV field delimiter
ROWTERMINATOR = '\n', -- row delimiter (expects CRLF by default; use '0x0a' for LF-only files)
ERRORFILE = 'error_file_path',
TABLOCK
)
If the columns are not known, you could try something like:
SELECT * FROM OPENROWSET
Or, do a bulk insert with only the first row as one big column, then parse it to create the dynamic main insert. Or bulk insert the whole file into a table with just one column, then parse that, as in the sketch below...
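A minimal sketch of that last idea (the table name is made up, and 'csv_full_path' stands in for the path received at run time): load every line into one wide column, then work out the columns afterwards.
-- One-column staging table: each line of the file becomes one row
CREATE TABLE dbo.RawLines (Line nvarchar(max) NULL)
GO
BULK INSERT dbo.RawLines
FROM 'csv_full_path'
WITH (ROWTERMINATOR = '\n') -- no FIELDTERMINATOR: keep each whole line intact
GO
-- Read the header row to decide on column names and types,
-- then parse the remaining rows (e.g. with SUBSTRING/CHARINDEX, or STRING_SPLIT on SQL Server 2016+)
SELECT TOP (1) Line FROM dbo.RawLines
GO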
You can use OPENROWSET (documentation).
SELECT *
INTO dbo.MyTable
FROM
OPENROWSET(
BULK 'C:\...\mycsvfile.csv',
SINGLE_CLOB) AS DATA; -- SINGLE_CLOB returns the whole file as a single column named BulkColumn
In addition, you can use dynamic SQL to parameterize the table name and the location of the csv file.
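A minimal sketch of that dynamic SQL (the table name and path values are placeholders; OPENROWSET needs them spliced into the string because BULK paths and identifiers cannot be passed as parameters, so validate both before building the statement):
DECLARE @TableName sysname = N'dbo.MyTable' -- placeholder: validate/whitelist in real code
DECLARE @CsvPath nvarchar(260) = N'csv_full_path' -- placeholder for the path received at run time
DECLARE @Sql nvarchar(max)

SET @Sql = N'SELECT * INTO ' + @TableName
         + N' FROM OPENROWSET(BULK ''' + @CsvPath + N''', SINGLE_CLOB) AS DATA;'

EXEC sys.sp_executesql @Sql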

Bulk insert from txt in SQL table

I need to do some bulk inserts in SQL Table from a txt file.
bulk insert [dbo].[TempSample]
from 'D:\sqls\sample.txt'
with (fieldterminator = ',', rowterminator = '\n')
go
In the txt file I have descriptions like 'Hörsching'. After the insert is made, I find descriptions in my table like 'H÷rsching'. How can I deal with that? The collation of the table is set to Latin1_General_CI_AS.
How is the file encoded?
Have you tried using the CODEPAGE parameter to specify the file's encoding?
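For example, if the file turns out to be saved as ANSI/Windows-1252 (an assumption; check the file's actual encoding first), adding CODEPAGE should keep characters like 'ö' intact:
bulk insert [dbo].[TempSample]
from 'D:\sqls\sample.txt'
with (
codepage = '1252', -- Windows-1252 / ANSI; use 'ACP' for the default ANSI code page
fieldterminator = ',',
rowterminator = '\n'
)
go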

SQL Server Bulk Insert of UTF-8 CSV file not working correctly

I am trying to bulk insert a UTF-8 CSV file that I downloaded as that type from Google Drive, because Excel was not saving my CSV correctly.
I opened the Google Drive-generated CSV file in Notepad++ and went to View > Show Symbol > Show All Characters, and I could see that it contains LF line feeds as the row terminator (correct me if I am wrong here).
So I tried the below and I don't get any records in the temp table. This works for other CSV files that are not UTF-8 when I use the default row terminator (i.e. '\r\n' when you don't specify one).
I have also tried '\t', '\r\n', '\r' and '\0' for the row terminator, with and without a data file type, and nothing seems to be working. Is this to do with my field types in the temp table, or something else?
CREATE TABLE #TEMPResourceContents (
[ResourceName] [nvarchar](250) NOT NULL,
[Language] [nvarchar](250) NOT NULL,
[Content] [nvarchar](max) NOT NULL
)
GO
BULK INSERT #TEMPResourceContents
FROM 'C:\import-resources.csv'
WITH
(FIRSTROW = 2, DATAFILETYPE = 'widechar', FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')
GO
SELECT * FROM #TEMPResourceContents
By the way, BULK INSERT doesn't support UTF-8 (support for CODEPAGE = '65001' was only added later, in SQL Server 2016).
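A common workaround on older versions is to re-save the file as UTF-16 ('UCS-2 / Unicode' in Notepad++) and keep DATAFILETYPE = 'widechar'. On SQL Server 2016 or later, UTF-8 can be read directly with CODEPAGE = '65001'; a sketch, assuming the same file and an LF row terminator:
BULK INSERT #TEMPResourceContents
FROM 'C:\import-resources.csv'
WITH
(FIRSTROW = 2, CODEPAGE = '65001', FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0a') -- UTF-8 file, LF line endings
GO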

Passing default values to a column in Bulk insert

I am trying to import data from a csv file containing the following data.
Station code;Priority vehicle;DateBegin;DateEnd
01;y;20100214;20100214
02;n;20100214;20100214
03;;20100214;20100214
Now I want the value 'n' in the table when no data is provided for the 'Priority vehicle' column in the csv file.
I am writing the query as:
BULK INSERT dbo.#tmp_station_details
FROM 'C:\station.csv'
WITH (
FIELDTERMINATOR = ';', -- the file is semicolon-delimited
FIRSTROW = 2,
ROWTERMINATOR = '\n'
)
Check the full explanation here:
http://msdn.microsoft.com/en-us/library/ms187887.aspx
"By default, when data is imported into a table, the bcp command and BULK INSERT statement observe any defaults that are defined for the columns in the table. For example, if there is a null field in a data file, the default value for the column is loaded instead. "
My suggestion is to define a default value of 'n' for the Priority vehicle column in the table design; when the csv field is empty, BULK INSERT will then load that default instead of NULL (as long as you do not specify KEEPNULLS).
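A minimal sketch of that (the temp table definition is assumed here, since it is not shown in the question, and the column names are guesses based on the csv header):
CREATE TABLE #tmp_station_details (
StationCode varchar(10) NOT NULL,
PriorityVehicle char(1) NOT NULL DEFAULT 'n', -- empty csv field falls back to 'n'
DateBegin char(8) NOT NULL,
DateEnd char(8) NOT NULL
)

BULK INSERT #tmp_station_details
FROM 'C:\station.csv'
WITH (
FIELDTERMINATOR = ';',
FIRSTROW = 2,
ROWTERMINATOR = '\n'
)
-- Rows 01 and 02 keep their 'y'/'n'; row 03's empty Priority vehicle field should become 'n' via the default
-- (do not specify KEEPNULLS, or NULL is kept instead of the default).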