I have a file named queryservices.txt. The data in it is divided with ";". Can I write it directly to a SQL database?
C:\queryservices.txt
File content:
;STATE;4 RUNNING
(Sorry, I'm trying to learn SQL by myself, so the code may look silly.)
The code I've developed so far:
BULK INSERT dbo.query
FROM 'C:\queryservices.txt' -- full path; BULK INSERT resolves paths on the server, not the client
WITH
(
FIELDTERMINATOR = ';'
)
I have a folder full of text files that are tab-delimited. However, each has a lot of columns that can change. What I am looking for is a way to bring in these text files with the column names (first row). In a perfect world, the code would loop through all of the files I have in a folder (DemographicsPL) and import each one as a table with the original name. I know there has to be a way to do this. Access can do this in one line of code, and I know SQL is better than Access.
I would like to do other things with the data as well, so I would like to do this in a stored procedure. Any help would be greatly appreciated, as I'm kind of a newbie to SQL. The code below works, but it requires that the table already exist.
--**** This works but is on a local drive and requires the table to already exist. ****
BULK INSERT [dbo].[TR15] FROM '\\MA000XSREA01\E$\TDLoad\DemographicsPL\BG15.txt'
WITH (
FIRSTROW = 2,
FIELDTERMINATOR = '\t',
ROWTERMINATOR = '\n'
);
GO
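One way to approach the folder loop is to generate the CREATE TABLE and BULK INSERT statements outside SQL Server with a small script, then execute the generated script (e.g. via sqlcmd or from a stored procedure with dynamic SQL). Below is a rough Python sketch of the generation step; the VARCHAR(255) column type, the .txt filter, and the function name are my own placeholders, not anything from the original post:

```python
import os

def build_import_sql(folder, server_path):
    """Generate CREATE TABLE + BULK INSERT statements for each
    tab-delimited file in a folder, using the first row of each
    file as the column names.  Every column is created as
    VARCHAR(255) for simplicity; adjust types afterwards."""
    statements = []
    for name in sorted(os.listdir(folder)):
        if not name.lower().endswith(".txt"):
            continue
        table = os.path.splitext(name)[0]  # table named after the file
        with open(os.path.join(folder, name), encoding="utf-8") as f:
            header = f.readline().rstrip("\n").split("\t")
        cols = ",\n    ".join(f"[{c}] VARCHAR(255)" for c in header)
        statements.append(
            f"CREATE TABLE [dbo].[{table}] (\n    {cols}\n);\n"
            f"BULK INSERT [dbo].[{table}]\n"
            f"FROM '{server_path}\\{name}'\n"
            "WITH (FIRSTROW = 2, FIELDTERMINATOR = '\\t', ROWTERMINATOR = '\\n');"
        )
    return statements
```

The same idea can be done entirely inside T-SQL with a cursor over the file list and EXEC of dynamic SQL, but generating the script externally is easier to debug.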
Below is a sample line of the CSV:
012,12/11/2013,"<555523051548>KRISHNA KUMAR ASHOKU,AR",<10-12-2013>,555523051548,12/11/2013,"13,012.55",
You can see that "KRISHNA KUMAR ASHOKU,AR" is a single field, but BULK INSERT treats KRISHNA KUMAR ASHOKU and AR as two different fields because of the comma, even though they are enclosed in double quotes.
I tried:
BULK INSERT tbl
FROM 'd:\1.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
FIRSTROW=2
)
GO
Is there any solution for this?
The answer is: you can't do that. See http://technet.microsoft.com/en-us/library/ms188365.aspx.
"Importing Data from a CSV file
Comma-separated value (CSV) files are not supported by SQL Server bulk-import operations. However, in some cases, a CSV file can be used as the data file for a bulk import of data into SQL Server. For information about the requirements for importing data from a CSV data file, see Prepare Data for Bulk Export or Import (SQL Server)."
The general solution is that you must convert your CSV file into one that can be successfully imported. You can do that in many ways, such as by creating the file with a different delimiter (such as TAB), or by importing your table using a tool that understands CSV files (such as Excel or many scripting languages) and exporting it with a unique delimiter (such as TAB), from which you can then BULK INSERT.
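As a concrete example of the "different delimiter" route above, a few lines of Python with the standard csv module can re-save a quoted CSV as tab-delimited; the csv module correctly handles the quoted fields that trip up BULK INSERT. The file names here are hypothetical:

```python
import csv

def csv_to_tab(src, dst):
    """Re-save a quoted CSV file as tab-delimited so that BULK INSERT
    can load it with FIELDTERMINATOR = '\\t'.  csv.reader parses the
    double-quoted fields (including embedded commas) correctly."""
    with open(src, newline="", encoding="utf-8") as fin, \
         open(dst, "w", newline="", encoding="utf-8") as fout:
        writer = csv.writer(fout, delimiter="\t")
        for row in csv.reader(fin):
            writer.writerow(row)
```

After converting, the BULK INSERT from the question works unchanged except for `FIELDTERMINATOR = '\t'` and the new file name.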
They added support for this in SQL Server 2017 (14.x) CTP 1.1. You need to use the FORMAT = 'CSV' input file option with the BULK INSERT command.
To be clear, here is what the CSV that was giving me problems looks like. The first line is easy to parse; the second line contains the curve ball, since there is a comma inside a quoted field:
jenkins-2019-09-25_cve-2019-10401,CVE-2019-10401,4,Jenkins Advisory 2019-09-25: CVE-2019-10401:
jenkins-2019-09-25_cve-2019-10403_cve-2019-10404,"CVE-2019-10404,CVE-2019-10403",4,Jenkins Advisory 2019-09-25: CVE-2019-10403: CVE-2019-10404:
Broken Code
BULK INSERT temp
FROM 'c:\test.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '0x0a',
FIRSTROW= 2
);
Working Code
BULK INSERT temp
FROM 'c:\test.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '0x0a',
FORMAT = 'CSV',
FIRSTROW= 2
);
Unfortunately, SQL Server's import methods (BCP and BULK INSERT) do not understand double-quote (") quoting.
Source : http://msdn.microsoft.com/en-us/library/ms191485%28v=sql.100%29.aspx
I encountered this problem recently and had to switch to the tab-delimited format. If you do that and use SQL Server Management Studio to do the import (right-click on the database, then select Tasks, then Import), tab-delimited works just fine. The bulk insert option with tab-delimited should also work.
I must admit to being very surprised when finding out that Microsoft SQL Server had this comma-delimited issue. The CSV file format is a very old one, so finding out that this was an issue with a modern database was very disappointing.
MS have now addressed this issue, and you can use FIELDQUOTE in your WITH clause to add quoted-string support:
FIELDQUOTE = '"',
anywhere in your WITH clause should do the trick, if you have SQL Server 2017 or above.
Well, BULK INSERT is very fast but not very flexible. Can you load the data into a staging table and then push everything into a production table? Once the data is in SQL Server, you will have a lot more control over how you move it from one table to another. So, basically:
1) Load data into staging
2) Clean/Convert by copying to a second staging table defined using the desired datatypes. Good data copied over, bad data left behind
3) Copy data from the "clean" table to the "live" table
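The clean/convert step (2) boils down to attempting the type conversions and keeping only the rows that succeed. Here is a rough illustration of that idea in Python rather than T-SQL; the date and amount columns are hypothetical, chosen to match the sample data earlier in the thread:

```python
from datetime import datetime

def partition_rows(rows):
    """Split raw staging rows (all strings, as loaded by BULK INSERT
    into varchar columns) into rows that convert cleanly to
    (date, amount) and rows that fail conversion.  Mirrors the
    'good data copied over, bad data left behind' step."""
    good, bad = [], []
    for date_s, amount_s in rows:
        try:
            d = datetime.strptime(date_s, "%m/%d/%Y").date()
            a = float(amount_s.replace(",", ""))  # strip thousands separators
            good.append((d, a))
        except ValueError:
            bad.append((date_s, amount_s))  # left behind for inspection
    return good, bad
```

In T-SQL the same effect is usually achieved with TRY_CONVERT in the INSERT ... SELECT from the first staging table to the second.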
A .csv file gets dumped nightly onto my FTP server by an external company. (Thus I have no control over its format, nor will they change it, as it's used by several other companies.)
My mission is to create a job that runs to extract the info from the file (and then delete it) and insert the extracted data into a SQL table.
This is the format of the info contained within the .csv file:
[Message]Message 1 contains, a comma,[Cell]27747642512,[Time]3:06:10 PM,[Ref]144721721
[Message]Message 2 contains,, 2 commas,[Cell]27747642572,[Time]3:06:10 PM,[Ref]144721722
[Message],[Cell]27747642572,[Time]3:06:10 PM,[Ref]144721723
I have a SQL Server 2012 table with the following columns:
Message varchar(800)
Cell varchar(15)
Time varchar(10)
Ref varchar(50)
I would like to use something like the SQL bulk insert (see below) to read from the .csv file and insert into the SQL table above.
BULK INSERT sms_reply
FROM 'C:\feedback.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',', --CSV field delimiter
ROWTERMINATOR = '\n', --Use to shift the control to next row
TABLOCK
)
The delimiter in the .csv file is not common. I was wondering if there was a way for me to use a wildcard character when specifying the delimiter. e.g. '[%]'?
This would then ignore whatever was between the square brackets and extract the info in between.
I cannot simply use the commas to delimit the fields, as there could be commas in the fields themselves, as illustrated in the .csv example above.
Or if anyone has any other suggestions, I'd really appreciate it.
TIA.
My approach would be:
Bulk load the csv file into a staging table. All columns accepting data would be varchar.
Do a SQL update to get rid of the brackets and their contents.
Do whatever other processing is required.
Insert data from your staging table into your production table.
I've managed to overcome this by using a handy function I found here, which comes from an article located here.
This function returns each line in the .csv file, and then I use string manipulation to extract the fields between the [variable_name] delimiters and insert them directly into my SQL table.
Clean, simple and quick.
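For what it's worth, the string manipulation between the [variable_name] delimiters can also be sketched outside SQL Server. Here is a Python version of the same idea (it assumes a tag marker never appears inside a field value, which holds for the sample data above):

```python
import re

TAGS = ("Message", "Cell", "Time", "Ref")

def parse_line(line):
    """Extract the four tagged fields from one line of the feed.
    Each value runs from its [Tag] marker up to the comma that
    precedes the next [Tag] marker (or end of line), so commas
    inside the Message field are preserved."""
    parts = re.split(r",?\[(" + "|".join(TAGS) + r")\]", line)
    # re.split with a capturing group yields ['', 'Message', value,
    # 'Cell', value, 'Time', value, 'Ref', value]
    fields = dict(zip(parts[1::2], parts[2::2]))
    return {t: fields.get(t, "") for t in TAGS}
```

The same pattern-based split could feed an INSERT per line, which matches the "extract, insert, then delete the file" job described above.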
NOTE: You have to enable OLE Automation on your SQL Server:
sp_configure 'show advanced options', 1;
GO
RECONFIGURE;
GO
sp_configure 'Ole Automation Procedures', 1;
GO
RECONFIGURE;
GO
I have managed to insert the whole text file's data into a SQL Server database table with this statement:
BULK INSERT [dbo].[tablename] FROM 'c:\weblog.log' WITH (
FIELDTERMINATOR = ' ',
ROWTERMINATOR = '\n' )
But my text file is not organized in any format, and it contains some data I want to omit from the insertion process. So I am looking for a way to insert only some of the data from the text file into my database table.
There are two ways. One way is to write some code that reads the specific data to be inserted from the file and then inserts it into the database. Second, if you have some minimal data that you want to remove from the file, you could run a regex find-and-replace to delete the unwanted portions from the file, and then do a bulk insert.
For bulk insert to work, the file needs to be a delimited text file. So if your log file is not delimited, you might not be able to load it using BULK INSERT.
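If you go the pre-processing route, filtering the log down to just the lines you want before running BULK INSERT takes only a few lines of script. Here is a sketch in Python; the regex pattern and file names are hypothetical:

```python
import re

def filter_log(src, dst, pattern):
    """Copy only the lines matching a regex from the raw log file to
    a new file; the filtered file can then be loaded with BULK INSERT.
    Returns the number of lines kept."""
    rx = re.compile(pattern)
    kept = 0
    with open(src, encoding="utf-8") as fin, \
         open(dst, "w", encoding="utf-8") as fout:
        for line in fin:
            if rx.search(line):
                fout.write(line)
                kept += 1
    return kept
```

The pattern can also be inverted (keep lines that do NOT match) to strip unwanted entries instead.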