SQL bulk insert from Cisco Unified Communications Manager (CallManager) file extract

I'm trying to populate a SQL table with data from a CUCM file extract.
For some reason BULK INSERT does not seem to work: it reports (0 rows affected), and this has been bugging me all day.
The SQL table is located here: SQL Fiddle link
The data is located here (under javascript): Data can be copied and pasted into Notepad++
I opted to use JSFiddle to hold the data as it's reasonably practical and I don't have to upload files that might be deemed dodgy.
The data from the JSFiddle can be saved into a file with Notepad++ and loaded with the BULK INSERT command from the SQL Fiddle. However, it will not import.
I am trying this on SQL Server 2012 (11.0.6251.0).
The actual code I use to try to import:
BULK INSERT CDR
FROM 'R:\CDR_Telephony\Imports\cdr_StandAloneCluster_01_201605252003_107183'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0a',
    FIRSTROW = 3
)
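Worth noting: BULK INSERT only treats a terminator as hexadecimal when it carries the 0x prefix, so a bare '0a' is read as the literal two-character string "0a", which never matches a real line ending and would explain the (0 rows affected). A minimal corrected sketch:
BULK INSERT CDR
FROM 'R:\CDR_Telephony\Imports\cdr_StandAloneCluster_01_201605252003_107183'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a',  -- hex notation for LF; the 0x prefix is required
    FIRSTROW = 3             -- skip the header lines, as in the original statement
)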

Related

I want to insert data from a txt format file into a SQL database

I have a file named queryservices.txt. The data in it is divided with ";". Can I write it directly to a SQL database?
C:\queryservices.txt
File content:
;STATE;4 RUNNING
(Sorry, I'm trying to learn SQL by myself, so the code may look silly.)
The code I've developed so far:
BULK INSERT dbo.query
FROM 'C:\queryservices.txt'  -- use the full path so the SQL Server service can find the file
WITH
(
    FIELDTERMINATOR = ';'
)
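Without a format file, BULK INSERT expects every row to supply one value per column of the target table, so the sample line ";STATE;4 RUNNING" implies three columns, the first one empty. A minimal sketch of a matching table, with hypothetical column names:
CREATE TABLE dbo.query
(
    col1 varchar(50),  -- empty in the sample row
    col2 varchar(50),  -- 'STATE'
    col3 varchar(50)   -- '4 RUNNING'
);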

SQL Server Bulk Insert won't import text file but DTS does

SQL Server Bulk Insert of text file produces 0 results, no error file and no imported data (rows). I am using a format file.
I have used the DTS function of SQL and successfully imported the data (SSMS Task). I have used BCP (command line) to successfully export and create the format file. I have used that format file with Bulk Insert to import the exported data using the format file. I have viewed my source data in Notepad++ showing all characters to ensure I have the correct column and row terminators.
BULK INSERT [schema].[tablename]
FROM 'FullyQualifiedFileName'
WITH
(
    FORMATFILE = 'FullyQualifiedFileName',
    KEEPNULLS,
    FIRSTROW = 2,
    ERRORFILE = 'FullyQualifiedFilename'
)
No error file is generated, no error is displayed and no data is imported.
I have been 'beating' on this for 2 days searching the hallowed halls of Google U endlessly. The source data is pipe-delimited (|) and I have confirmed through Notepad++ that the EndOfRow is a CR/LF. When I EXPORT the data via BCP, the resulting text file has a number of fields showing as ...|NULL|NULL|... whereas my source file is ...|||.
This learning experience is in aid of trying to develop a core SSIS package that will import multiple different flat files into multiple different SQL tables all 'mapped' correctly by use of the format files - that is the long distance destination. For now - I can't figure out why I can't BULK INSERT my sample data file.
I'm a technology 'generalist' having taught myself coding in numerous different languages and scripts. SQL Server is relatively new to me and the nuances of BULK INSERT and SSIS are challenging. I have worked through all the errors that I kept getting thrown up until this point - no errors... no data.
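For reference, a minimal non-XML format file for a pipe-delimited, CR/LF-terminated file might look like the sketch below: the first line is the bcp version (11.0 corresponds to SQL Server 2012), the second is the number of fields, then one line per field giving host field order, host data type, prefix length, host data length, terminator, server column order, server column name, and collation. The three character columns here are hypothetical.
11.0
3
1   SQLCHAR   0   8000   "|"      1   Col1   ""
2   SQLCHAR   0   8000   "|"      2   Col2   ""
3   SQLCHAR   0   8000   "\r\n"   3   Col3   ""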

Import large delimited .txt file in SQL Server 2008

Every morning one of my clients sends me a .txt file with ';' as the separator, and this is how the file is currently being imported into a temp table using SSIS:
mario.mascarenhas;MARIO LUIZ MASCARENHAS;2017-03-21 13:18:22;PDV;94d33a66dbaaff15a01d8139c7acd7c6;;;1;0;0;0;0;0;0;0;0;0;0;\N
evilanio.asevedo;EVILANIO ASEVEDO;2017-03-21 13:26:10;PDV;30a1bd072ac5f158f99445bb0975e423;;;1;1;0;0;0;0;0;0;0;0;0;\N
marcelo.tarso;MARCELO TARSO;2017-03-21 13:47:09;PDV;ef6b5e971242ec345552cdb724968f8a;;;1;0;0;0;0;0;0;0;0;0;0;\N
tiago.rodrigues;TIAGO ALVES RODRIGUES;2017-03-21 13:49:04;PDV;d782d4b30c0d302fe815b2cb48de4d03;;;1;1;0;0;0;0;0;0;0;0;0;\N
roberto.freire;ROBERTO CUSTODIO;2017-03-21 13:54:53;PDV;78794a18187f068d612e6b6370a60781;;;1;0;0;0;1;0;0;0;0;0;0;\N
eduardo.lima;EDUARDO MORO LIMA;2017-03-21 13:55:24;PDV;83e1c2696faa83d54881b13c70a07924;;;1;0;0;0;0;0;0;0;0;0;0;\N
Each file contains at least 23,000 rows just like that.
I already made a table with the correct number of columns to receive this data. So what I want is to "explode" (just like in PHP) each row using ';' as the column separator and loop the insert into my table named dbo.showHistoricalLogging.
I've been searching for a solution here on Stack Overflow, but found nothing specific that takes this volume of data into consideration while looping an insert.
Any idea? I'm running SQL Server 2008.
My suggestion:
Convert the text file into a CSV file, then refer to this post from Stack Overflow to use the Bulk package. I used this before while I was at the University of Arizona for one of my programming assignments in my Database Design class. Any clarifications and/or questions, leave them in a comment and I will do my best.
Something like this should work:
BULK INSERT [TableName]
FROM 'C:\MyFile.txt'
WITH (FIELDTERMINATOR = ';', ROWTERMINATOR = '\\N');
Consult the Microsoft BULK INSERT documentation if you need other parameters. Alternatively, SSIS makes this super easy as well; honestly, there are many ways you could do this.
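One caveat with ROWTERMINATOR = '\\N': it matches only the literal characters \N, so the line break after it would be swallowed into the first field of the next row. An untested variant, assuming the file uses Windows CRLF line endings, folds the line break into the terminator:
BULK INSERT dbo.showHistoricalLogging
FROM 'C:\MyFile.txt'
WITH
(
    FIELDTERMINATOR = ';',
    ROWTERMINATOR = '\\N\r\n'  -- literal \N followed by CR/LF; assumes CRLF line endings
);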

SQL Server bulk insert of CSV with data containing commas

Below is a sample line of the CSV:
012,12/11/2013,"<555523051548>KRISHNA KUMAR ASHOKU,AR",<10-12-2013>,555523051548,12/11/2013,"13,012.55",
You can see that "KRISHNA KUMAR ASHOKU,AR" is a single field, but it is treated as two different fields because of the comma, even though it is enclosed in quotes. Still no luck.
I tried
BULK INSERT tbl
FROM 'd:\1.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2
)
GO
Is there any solution for it?
The answer is: you can't do that. See http://technet.microsoft.com/en-us/library/ms188365.aspx.
"Importing Data from a CSV file
Comma-separated value (CSV) files are not supported by SQL Server bulk-import operations. However, in some cases, a CSV file can be used as the data file for a bulk import of data into SQL Server. For information about the requirements for importing data from a CSV data file, see Prepare Data for Bulk Export or Import (SQL Server)."
The general solution is that you must convert your CSV file into one that can be successfully imported. You can do that in many ways, such as by creating the file with a different delimiter (such as TAB), or by importing your table using a tool that understands CSV files (such as Excel or many scripting languages) and exporting it with a unique delimiter (such as TAB), from which you can then BULK INSERT.
They added support for this in SQL Server 2017 (14.x) CTP 1.1. You need to use the FORMAT = 'CSV' input file option with the BULK INSERT command.
To be clear, here is what the CSV that was giving me problems looks like. The first line is easy to parse; the second line contains the curve ball, since there is a comma inside a quoted field:
jenkins-2019-09-25_cve-2019-10401,CVE-2019-10401,4,Jenkins Advisory 2019-09-25: CVE-2019-10401:
jenkins-2019-09-25_cve-2019-10403_cve-2019-10404,"CVE-2019-10404,CVE-2019-10403",4,Jenkins Advisory 2019-09-25: CVE-2019-10403: CVE-2019-10404:
Broken Code
BULK INSERT temp
FROM 'c:\test.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a',
    FIRSTROW = 2
);
Working Code
BULK INSERT temp
FROM 'c:\test.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a',
    FORMAT = 'CSV',
    FIRSTROW = 2
);
Unfortunately, the SQL Server import methods (BCP and BULK INSERT) do not understand quoting with " ".
Source: http://msdn.microsoft.com/en-us/library/ms191485%28v=sql.100%29.aspx
I encountered this problem recently and had to switch to tab-delimited format. If you do that and use SQL Server Management Studio to do the import (right-click on the database, then select Tasks, then Import), tab-delimited works just fine. The bulk insert option with tab-delimited should also work.
I must admit to being very surprised when finding out that Microsoft SQL Server had this comma-delimited issue. The CSV file format is a very old one, so finding out that this was an issue with a modern database was very disappointing.
MS have now addressed this issue: you can use FIELDQUOTE in your WITH clause to add quoted-string support:
FIELDQUOTE = '"',
anywhere in your WITH clause should do the trick, provided you have SQL Server 2017 or above.
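Putting the 2017-era pieces together, here is a sketch of the working statement from above with FIELDQUOTE made explicit (when FORMAT = 'CSV' is used, FIELDQUOTE already defaults to '"'):
BULK INSERT temp
FROM 'c:\test.csv'
WITH
(
    FORMAT = 'CSV',          -- RFC 4180 style quoted-field handling (SQL Server 2017+)
    FIELDQUOTE = '"',        -- explicit here, though '"' is the default
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a',
    FIRSTROW = 2
);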
Well, BULK INSERT is very fast but not very flexible. Can you load the data into a staging table and then push everything into a production table? Once the data is in SQL Server, you have a lot more control over how you move it from one table to another. So, basically:
1) Load the data into staging.
2) Clean/convert by copying to a second staging table defined using the desired data types. Good data is copied over, bad data is left behind.
3) Copy the data from the "clean" table to the "live" table (sketched below).
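A sketch of that three-step pattern, with hypothetical table and column names (TRY_CAST requires SQL Server 2012 or later):
-- 1) Load everything as text into staging
CREATE TABLE dbo.StagingRaw (col1 varchar(8000), col2 varchar(8000), col3 varchar(8000));
BULK INSERT dbo.StagingRaw
FROM 'C:\MyFile.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- 2) Clean/convert: rows that fail conversion stay behind in StagingRaw
CREATE TABLE dbo.StagingClean (id int, created datetime, label varchar(100));
INSERT INTO dbo.StagingClean (id, created, label)
SELECT TRY_CAST(col1 AS int), TRY_CAST(col2 AS datetime), col3
FROM dbo.StagingRaw
WHERE TRY_CAST(col1 AS int) IS NOT NULL;

-- 3) Copy from the clean table to the live table
INSERT INTO dbo.LiveTable (id, created, label)
SELECT id, created, label FROM dbo.StagingClean;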

Wildcard character in field delimiter - reading .csv file

A .csv file gets dumped nightly onto my FTP server by an external company. (Thus I have no control over its format, nor will they change it as it's used by several other companies.)
My mission is to create a job that extracts the info from the file (and then deletes it) and inserts the extracted data into a SQL table.
This is the format of the info contained within the .csv file:
[Message]Message 1 contains, a comma,[Cell]27747642512,[Time]3:06:10 PM,[Ref]144721721
[Message]Message 2 contains,, 2 commas,[Cell]27747642572,[Time]3:06:10 PM,[Ref]144721722
[Message],[Cell]27747642572,[Time]3:06:10 PM,[Ref]144721723
I have a SQL Server 2012 table with the following columns:
Message varchar(800)
Cell varchar(15)
Time varchar(10)
Ref varchar(50)
I would like to use something like the SQL bulk insert (see below) to read from the .csv file and insert into the SQL table above.
BULK INSERT sms_reply
FROM 'C:\feedback.csv'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',  -- CSV field delimiter
    ROWTERMINATOR = '\n',   -- used to shift control to the next row
    TABLOCK
)
The delimiter in the .csv file is not common. I was wondering if there is a way to use a wildcard character when specifying the delimiter, e.g. '[%]'?
This would then ignore whatever is between the square brackets and extract the info between them.
I cannot simply use the commas to delimit the fields as there could be commas in the fields themselves as illustrated in the .csv example above.
Or if anyone has any other suggestions, I'd really appreciate it.
TIA.
My approach would be:
Bulk load the csv file into a staging table (see the sketch below). All columns accepting data would be varchar.
Do an SQL UPDATE to get rid of the brackets and their contents.
Do whatever other processing is required.
Insert the data from your staging table into your production table.
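A sketch of that load step, using a single-column staging table so embedded commas can't split the message (table name hypothetical; BULK INSERT's default field terminator is a tab, which these lines shouldn't contain, so each whole line lands in the one column):
CREATE TABLE dbo.sms_staging (line varchar(2000));

BULK INSERT dbo.sms_staging
FROM 'C:\feedback.csv'
WITH (ROWTERMINATOR = '\n');  -- no FIELDTERMINATOR: one whole line per row

-- then strip the bracketed tags, parse the fields, and insert into sms_reply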
I've managed to overcome this by using a handy function I found here, which comes from an article located here.
This function returns each line in the .csv file, and I then use string manipulation to extract the fields between the "[variable_name]" delimiters and insert them directly into my SQL table. (A sketch of that parsing appears after the note below.)
Clean, simple and quick.
NOTE: You have to enable OLE Automation on your SQL Server:
sp_configure 'show advanced options', 1;
GO
RECONFIGURE;
GO
sp_configure 'Ole Automation Procedures', 1;
GO
RECONFIGURE;
GO
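As promised, a sketch of the per-line string manipulation in plain T-SQL, using the bracketed tags themselves as anchors (the offsets 10, 6, 6 and 5 come from the lengths of the tag strings; this parsing snippet itself needs no OLE Automation):
DECLARE @line varchar(2000) =
    '[Message]Message 1 contains, a comma,[Cell]27747642512,[Time]3:06:10 PM,[Ref]144721721';

SELECT
    -- [Message] is 9 characters, so the message text starts at position 10
    SUBSTRING(@line, 10, CHARINDEX(',[Cell]', @line) - 10) AS Message,
    SUBSTRING(@line, CHARINDEX('[Cell]', @line) + 6,
              CHARINDEX(',[Time]', @line) - CHARINDEX('[Cell]', @line) - 6) AS Cell,
    SUBSTRING(@line, CHARINDEX('[Time]', @line) + 6,
              CHARINDEX(',[Ref]', @line) - CHARINDEX('[Time]', @line) - 6) AS [Time],
    -- [Ref] is the last tag, so take everything after it
    SUBSTRING(@line, CHARINDEX('[Ref]', @line) + 5, LEN(@line)) AS Ref;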