How can I avoid the ',' using OPENROWSET or BULK INSERT? - sql

I have to read a txt file with more than 1 million records and insert them into a table I already have in SQL Server 2005. It worked great using this method:
set @variable='
insert into table(row1)
select * from OPENROWSET(''MSDASQL'',''Driver={Microsoft Text Driver (*.txt; *.csv)};
DEFAULTDIR='+@path+';ROWTERMINATOR = \n;'',''SELECT * FROM '+@fileName+''')
'
EXEC(@variable)
The problem is that some records have a ',' in the middle of the client's name, for example:
david,steven abril pulecio
and this method inserts that into the table like this:
|david|
So how can I use this method (or maybe BULK INSERT) and avoid it cutting the records off when they contain a ','?
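If you control how the file is generated, one hedged workaround with the Microsoft Text Driver is to put a schema.ini file in the same folder as the data file and declare a delimiter that cannot appear inside names (the file name and column definition below are made up for illustration):

[clients.txt]
Format=Delimited(;)
ColNameHeader=False
Col1=ClientName Text

If the delimiter has to stay a comma, the usual alternative is to quote the affected fields ("david,steven abril pulecio") and use Format=CSVDelimited, which treats double-quoted fields as single values.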

Related

Import CSV into SQL (CODE)

I want to import several CSV files automatically using SQL code (i.e. without using the GUI). Normally, I know the dimensions of my CSV file, so in many cases I create an empty table with, say, x columns with the corresponding data types. Then I import the CSV file into this table using BULK INSERT. However, in this case I don't know much about my files, i.e. information about data types and dimensions is not given.
To summarize the problem:
I receive a file path, e.g. C:...\DATA.csv. Then, I want to use this path in SQL code to import the file to a table without knowing anything about it.
Any ideas on how to solve this problem?
Use something like this:
BULK INSERT tbl
FROM 'csv_full_path'
WITH
(
FIRSTROW = 2, --Start at row 2 if the file has a header row
FIELDTERMINATOR = ',', --CSV field delimiter
ROWTERMINATOR = '\n', --Row terminator (newline)
ERRORFILE = 'error_file_path',
TABLOCK
)
If the columns are not known, you could try with:
select * from OpenRowset
Or do a bulk insert with only the first row as one big column, then parse it to build the dynamic main insert. Or bulk insert the whole file into a table with just one column, then parse that (sketched below)...
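A hedged sketch of that last idea, assuming the file contains no tab characters (the default field terminator), so each line lands in a single column; the table and path names are made up:

CREATE TABLE #RawLines (Line nvarchar(max));

BULK INSERT #RawLines
FROM 'C:\folder\DATA.csv'
WITH (ROWTERMINATOR = '\n');

-- Inspect the header line to decide column names and types,
-- then build the real CREATE TABLE and INSERT dynamically
SELECT TOP (1) Line FROM #RawLines;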
You can use OPENROWSET (documentation).
SELECT *
INTO dbo.MyTable
FROM OPENROWSET(
    BULK 'C:\...\mycsvfile.csv',
    SINGLE_CLOB) AS DATA;
In addition, you can use dynamic SQL to parameterize the table name and the location of the csv file.
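A minimal sketch of that, assuming the table name and file path arrive at runtime (both values here are hypothetical). BULK INSERT does not accept variables directly, so the statement has to be built as a string:

DECLARE @path  nvarchar(260) = N'C:\import\DATA.csv';
DECLARE @table sysname       = N'dbo.MyTable';

-- Concatenating names like this is injection-prone;
-- validate @table against sys.tables in real code.
DECLARE @sql nvarchar(max) =
    N'BULK INSERT ' + @table +
    N' FROM ''' + REPLACE(@path, N'''', N'''''') + N'''' +
    N' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', FIRSTROW = 2);';

EXEC sys.sp_executesql @sql;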

Insert .csv file into SQL Server with BULK INSERT and reorder columns

I'm making an ASP.NET application and I want to insert data into SQL Server from a CSV file. I did it with this SQL command:
BULK
INSERT Shops
FROM 'C:\..\file.csv'
WITH
(
FIELDTERMINATOR = ';',
ROWTERMINATOR = '\n'
);
It works pretty well, but I have an id column with the AUTO INCREMENT (IDENTITY) option. I want to reorder the inserted columns so that SQL Server increments the Id column automatically.
How can I do that with the BULK method?
(Of course, I don't want to edit the .csv file manually :P)
Like I said in my comment: you can't; bulk insert just pumps data in, and you can't transform it in any way. What you can do is bulk insert into a staging table and then use an INSERT statement to do what you need.
You can do it like this:
-- Create staging table
SELECT TOP 0 *
INTO Shops_temp
FROM Shops;
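-- Note (assumption): SELECT ... INTO also copies the IDENTITY property from Shops.
-- If the CSV has no id field, you may need to drop that column from the staging table
-- (e.g. ALTER TABLE Shops_temp DROP COLUMN id, column name assumed), because
-- BULK INSERT expects one field per table column unless a format file is used.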
-- Bulk insert into staging
BULK INSERT Shops_temp
FROM 'C:\..\file.csv'
WITH
(
FIELDTERMINATOR = ';',
ROWTERMINATOR = '\n'
);
-- Insert into actual table, use SELECT for transformation, column order etc.
INSERT INTO Shops(name, etc..)
SELECT name
, etc..
FROM Shops_temp;
-- Cleanup
DROP TABLE Shops_temp;

SQL BULK INSERT with conditions

What I'm trying to do is read a text file and then use BULK INSERT to create a table.
This is an example of how the text file looks:
TIME DATE USER_NAME VALUE
11:10:04 10/02/15 Irene I. Moosa
There are a lot of rows, and I mean a lot, but sometimes the time is empty or the end character is not just a simple newline, and I'm trying to compensate for that.
Is something like this possible:
BULK INSERT #TEMP FROM 'C:\QPR_Logs\Audit\MetricsServerAudit.txt'
WHERE [TIME] IS NOT NULL WITH (FIELDTERMINATOR =' ', ROWTERMINATOR = '\n')
Something like that, so that it just skips the line when it reads a null value?
For the end character I'm not exactly sure what to use.
Does anyone have any suggestions?
Try OPENROWSET. Since you have custom row/column terminators, you might require a format file.
select t1.*
from openrowset(bulk 'c:\folder\file1.csv'
, formatfile = 'c:\folder\values.fmt'
, firstrow = 2) as t1
where t1.[TIME] is not null
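For reference, a non-XML format file for the sample above might look roughly like this; treat it as a hedged sketch, since the real lengths, terminators, and collations depend on the file (the 12.0 version header assumes SQL Server 2014):

12.0
4
1  SQLCHAR  0  20   " "     1  TIME       ""
2  SQLCHAR  0  20   " "     2  DATE       ""
3  SQLCHAR  0  100  " "     3  USER_NAME  ""
4  SQLCHAR  0  100  "\r\n"  4  VALUE      ""

The columns are: file field order, data type, prefix length, maximum length, terminator, server column order, server column name, collation.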

Generating CSV through a query in SQL

I have created a database, a table, and entries in that table through basic SELECT and INSERT INTO commands.
To view the entries I am using the basic query:
USE test1
SELECT * FROM orders
where test1 is the database and orders is the table name.
I can see the entries.
How can I store the results to a CSV?
With the query
SELECT * FROM orders
INTO OUTFILE '/path/to/file.csv'
FIELDS ESCAPED BY '""'
TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
I am getting an error
Msg 156, Level 15, State 1, Line 4
Incorrect syntax near the keyword 'INTO'.
INTO OUTFILE is MySQL syntax; SQL Server doesn't support it, which is why you get the syntax error. You do have the option to output to csv from within Management Studio: return your grid as normal, then right-click it and choose "Save Results As...".
or
Use sqlcmd as per the question How to export SQL Server 2005 query to CSV:
sqlcmd -q "select col1,col2,col3 from table" -o c:\myfile.csv -h-1 -s","
An alternative is to build it into a variable and then save that via your application, e.g.
declare @csv varchar(max) = ''
select @csv += replace(column1, ',', '","') + ', ' + replace(column2, ',', '","') + char(13) + char(10) from orders
select @csv as csv
Regards
Liam
If you're not as familiar with SQL and would like a more graphical approach, you could always use the Import/Export Wizard. It's a little more user-friendly for people who may be new to SQL.

SQL Server 2012. Copy row into single column on another table

I am working with SQL Server on the AdventureWorks2012 database, and I am working with triggers. I would like to copy any newly inserted row into one single column in another table called AuditTable. Basically, whenever I insert into the Person.Address table, I would like to copy the whole row into the AuditTable.PrevValue column. I know how to insert etc.; I am just not sure how to write it all to one column.
Here is the general idea.
USE [AdventureWorks2012]
ALTER TRIGGER [Person].[sPerson] ON [Person].[Address]
FOR INSERT AS
INSERT INTO AdventureWorks2012.HumanResources.AuditTable (PrevValue)
SELECT AddressID, AddressLine1, AddressLine2, City, StateProvinceID,
       PostalCode, SpatialLocation, rowguid, ModifiedDate
FROM Inserted
ERROR: The select list for the INSERT statement contains more items than the insert list. The number of SELECT values must match the number of INSERT columns.
Thank you for any assistance. I have searched loads but cannot find the exact solution anywhere.
The error message says it all: you can't insert 9 columns of different types into a single column. Assuming that your destination AuditTable.PrevValue column is NVARCHAR(MAX), you could flatten your insert as follows, by concatenating the columns and casting the non-character columns to NVARCHAR:
INSERT INTO AdventureWorks2012.HumanResources.AuditTable (PrevValue)
SELECT
    N'ID: ' + CAST(AddressID AS NVARCHAR(20)) +
    N' Address: ' + AddressLine1 +
    N', ' + ISNULL(AddressLine2, N'') + ....   -- nullable columns need ISNULL or the whole string becomes NULL
FROM Inserted
IMO keeping one long string like this makes the audit table difficult to search, so you might consider adding SourceTable and possibly SourcePK columns, e.g. along the lines of the sketch below.
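For instance, a hypothetical shape for such a table (all names here are illustrative, not from the original post):

CREATE TABLE HumanResources.AuditTable
(
    AuditId     int IDENTITY(1,1) PRIMARY KEY,
    SourceTable sysname       NOT NULL,  -- e.g. N'Person.Address'
    SourcePK    nvarchar(100) NULL,      -- flattened key of the audited row
    PrevValue   nvarchar(max) NOT NULL,  -- the concatenated row text
    AuditedAt   datetime2     NOT NULL DEFAULT SYSUTCDATETIME()
);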
You could also consider converting your row to XML and storing it in an XML column, like this:
create table Audit
(
    AuditXml xml
);

alter trigger [Person].[sPerson] on [Person].[Address] for insert as
begin
    declare @xml xml;
    set @xml =
    (
        select *
        from inserted
        for xml path('Inserted')
    );
    insert into Audit (AuditXml) values (@xml);
end
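If you need the values back out, the xml type's value() method can extract them; a small sketch against the Audit table above (element names follow the FOR XML PATH('Inserted') shape, column types assumed from Person.Address):

SELECT AuditXml.value('(/Inserted/AddressID)[1]', 'int')          AS AddressID,
       AuditXml.value('(/Inserted/City)[1]', 'nvarchar(30)')      AS City,
       AuditXml.value('(/Inserted/ModifiedDate)[1]', 'datetime')  AS ModifiedDate
FROM Audit;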