I am getting an error when I try to do a bulk insert:
BULK INSERT #tbl_InterCompanyUploadDetail_Staging
FROM '\\SVRP03546008461D\QA\UploadTemplatewithvalidation.xlsx'
WITH (FIRSTROW = 6, FIELDTERMINATOR ='\t', ROWTERMINATOR ='\\n' )
The error that I am getting is:
Bulk load data conversion error (truncation) for row 6, column 2 (Oracle Company Code).
The column in Excel has the value 470 and the database column is varchar(10).
So what could be the reason for the error?
The Issue
BULK INSERT does not work with .xlsx files; try converting the .xlsx file to a .csv file and then loading it with BULK INSERT.
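For example, a minimal sketch of the converted load, assuming the workbook has been re-saved as a comma-separated file at the same (hypothetical) share path and the data still starts on row 6:
BULK INSERT #tbl_InterCompanyUploadDetail_Staging
FROM '\\SVRP03546008461D\QA\UploadTemplatewithvalidation.csv'
WITH (
    FIRSTROW = 6,
    FIELDTERMINATOR = ',',   -- comma, now that the file is a real CSV
    ROWTERMINATOR = '0x0a'   -- hex newline tends to be more reliable than '\n'
);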
1st Solution - using OPENROWSET
Try using OPENROWSET with Microsoft.ACE.OLEDB.12.0 provider:
Insert into <rawdatatable>
select * FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
'Excel 12.0;Database=D:\SSIS\FileToLoad.xlsx;HDR=YES',
'SELECT * FROM [Sheet1$]')
OR
SELECT * INTO Data_dq
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
'Excel 12.0; Database=D:\Desktop\Data.xlsx', [Sheet1$]);
2nd Solution - using OPENDATASOURCE
SELECT * INTO Data_dq
FROM OPENDATASOURCE('Microsoft.ACE.OLEDB.12.0',
'Data Source=D:\Desktop\Data.xlsx;Extended Properties=Excel 12.0')...[Sheet1$];
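Note that both OPENROWSET and OPENDATASOURCE rely on the Microsoft.ACE.OLEDB.12.0 provider being installed on the SQL Server machine, and the instance usually has to allow ad hoc distributed queries first. A hedged sketch of the server-level settings (requires appropriate permissions):
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;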
References
Import/Export Excel (.Xlsx) or (.Xls) File into SQL Server
Import data from Excel to SQL Server or Azure SQL Database
How to Bulk Insert from XLSX file extension?
Replace "\n" with "0x0a" as ROWTERMINATOR and try again.
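Applied to the statement from the question, that change would look roughly like this (same share path assumed):
BULK INSERT #tbl_InterCompanyUploadDetail_Staging
FROM '\\SVRP03546008461D\QA\UploadTemplatewithvalidation.xlsx'
WITH (FIRSTROW = 6, FIELDTERMINATOR = '\t', ROWTERMINATOR = '0x0a')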
Or also
ROWTERMINATOR = '''+cast (0x0000 as char(1))+'''
Let me know if it works.
Check also this.
I doubt that an XLSX file works with BULK INSERT. If XLSX files were supported, FIELDTERMINATOR and ROWTERMINATOR would not be needed.
XLSX is a zip file, so my guess (though I am not sure) is that XLSX is not supported and you are getting the truncation error because BULK INSERT is reading it as a plain text file and pulling in long text up to the next FIELDTERMINATOR.
To confirm, try increasing the length of the column to a few thousand characters and run BULK INSERT again; if you get garbage characters, then it is reading the file as plain text. The garbage characters will likely match what you see if you open the same xlsx file in Notepad or Notepad++.
You can't bulk load XLSX into SQL Server. You CAN convert the XLSX to a tab delimited text file and bulk load.
If this is a one-off operation I would recommend converting to text first (but beware how Excel exports certain types like dates and large numbers). Or you can use the Import/Export wizard (https://learn.microsoft.com/en-us/sql/relational-databases/import-export/import-data-from-excel-to-sql)
If this is a process you need to repeat I would create an SSIS script.
I am looking for help to import a .csv file into SQL Server using BULK INSERT and I have a few basic questions.
Issues:
The CSV file data may have , (comma) in between (for example, in the Description field), so how can I make the import handle this data?
If the client creates the CSV from Excel, then data that contain commas are enclosed in "" (double quotes) [as in the example below], so how can the import handle this?
How do we track whether some rows have bad data that the import skips? (Does the import skip rows that are not importable?)
Here is the sample CSV with header:
Name,Class,Subject,ExamDate,Mark,Description
Prabhat,4,Math,2/10/2013,25,Test data for prabhat.
Murari,5,Science,2/11/2013,24,"Test data for his's test, where we can test 2nd ROW, Test."
sanjay,4,Science,,25,Test Only.
And SQL statement to import:
BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',', --CSV field delimiter
ROWTERMINATOR = '\n', --Use to shift the control to next row
TABLOCK
)
Based on SQL Server CSV Import
1) The CSV file data may have , (comma) in between (for example, in the Description field), so how can I make the import handle this data?
Solution
If you're using , (comma) as the delimiter, there is no way to differentiate between a comma as a field terminator and a comma inside your data. I would use a different FIELDTERMINATOR like ||; that handles commas (and single slashes) in the data cleanly, as shown in the sketch below.
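A rough sketch, assuming the file is produced with || as the separator:
BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.csv'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = '||',  --unlikely to appear inside the data itself
    ROWTERMINATOR = '\n',
    TABLOCK
)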
2) If the client creates the CSV from Excel, then data that contain commas are enclosed in " ... " (double quotes) [as in the example below], so how can the import handle this?
Solution
If you're using BULK INSERT, there is no way to handle the double quotes; the data will be inserted into the rows with the double quotes included.
After inserting the data into the table, you could replace those double quotes with ''.
UPDATE table
SET columnhavingdoublequotes = REPLACE(columnhavingdoublequotes, '"', '')
3) How do we track whether some rows have bad data that the import skips? (Does the import skip rows that are not importable?)
Solution
Rows that aren't loaded into the table because of invalid data or format can be handled using the ERRORFILE property. Specify the error file name, and the rows that have errors will be written to that file. The code should look like this:
BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',', --CSV field delimiter
ROWTERMINATOR = '\n', --Use to shift the control to next row
ERRORFILE = 'C:\CSVDATA\SchoolsErrorRows.csv',
TABLOCK
)
From How to import a CSV file into a database using SQL Server Management Studio, from 2013-11-05:
First create a table in your database into which you will be importing
the CSV file. After the table is created:
Log into your database using SQL Server Management Studio
Right click on your database and select Tasks -> Import Data...
Click the Next > button
For the Data Source, select Flat File Source. Then use the Browse button to select the CSV file. Spend some time configuring how you want the data to be imported before clicking on the Next > button.
For the Destination, select the correct database provider (e.g. for SQL Server 2012, you can use SQL Server Native Client 11.0). Enter the Server name; Check Use SQL Server Authentication, enter the User name, Password, and Database before clicking on the Next > button.
On the Select Source Tables and Views window, you can Edit Mappings before clicking on the Next > button.
Check the Run immediately check box and click on the Next > button.
Click on the Finish button to run the package.
2) If the client creates the CSV from Excel, then data that contain commas are enclosed in " ... " (double quotes) [as in the example below], so how can the import handle this?
You should use the FORMAT = 'CSV' and FIELDQUOTE = '"' options:
BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.csv'
WITH
(
FORMAT = 'CSV',
FIELDQUOTE = '"',
FIRSTROW = 2,
FIELDTERMINATOR = ',', --CSV field delimiter
ROWTERMINATOR = '\n', --Use to shift the control to next row
TABLOCK
)
The best, quickest and easiest way to resolve the comma in data issue is to use Excel to save a comma separated file after having set Windows' list separator setting to something other than a comma (such as a pipe). This will then generate a pipe (or whatever) separated file for you that you can then import. This is described here.
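Once Excel has produced the pipe-separated file, the load itself is the usual BULK INSERT with the new terminator (the file name here is hypothetical):
BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.txt'  --pipe-separated file exported from Excel
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = '|',
    ROWTERMINATOR = '\n',
    TABLOCK
)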
For those who have not used the SQL import wizard before, the steps would be as follows:
Right click on the database and, under Tasks, choose Import Data.
Once the wizard is open, select the type of data source to import. In this case it would be the
Flat File Source
Select the CSV file. You can configure the data types of the columns here, but it is usually best to take them from the CSV itself.
Click Next and, for the destination, select the
SQL client
Choose it according to your type of authentication; once this is done, a very important option comes up.
You can define the id of the table in the CSV (it is recommended that the columns of the CSV are named the same as the fields in the target table). In the Edit Mappings option you can preview each table against the spreadsheet columns. If you want the wizard to insert the id by default, leave unchecked the option
Enable id insert
(the generated ids usually do not start from 1); if instead the CSV already has an id column, check Enable id insert. The next step is to finish the wizard, and you can review the changes there.
In the following window, alerts or warnings may appear; it is usually fine to ignore them, and only errors need attention.
This link has images.
First you need to import the CSV file into a DataTable.
Then you can insert the rows in bulk using SqlBulkCopy.
using System;
using System.Data;
using System.Data.SqlClient;
namespace SqlBulkInsertExample
{
class Program
{
static void Main(string[] args)
{
DataTable prodSalesData = new DataTable("ProductSalesData");
// Create Column 1: SaleDate
DataColumn dateColumn = new DataColumn();
dateColumn.DataType = Type.GetType("System.DateTime");
dateColumn.ColumnName = "SaleDate";
// Create Column 2: ProductName
DataColumn productNameColumn = new DataColumn();
productNameColumn.ColumnName = "ProductName";
// Create Column 3: TotalSales
DataColumn totalSalesColumn = new DataColumn();
totalSalesColumn.DataType = Type.GetType("System.Int32");
totalSalesColumn.ColumnName = "TotalSales";
// Add the columns to the ProductSalesData DataTable
prodSalesData.Columns.Add(dateColumn);
prodSalesData.Columns.Add(productNameColumn);
prodSalesData.Columns.Add(totalSalesColumn);
// Let's populate the datatable with our stats.
// You can add as many rows as you want here!
// Create a new row
DataRow dailyProductSalesRow = prodSalesData.NewRow();
dailyProductSalesRow["SaleDate"] = DateTime.Now.Date;
dailyProductSalesRow["ProductName"] = "Nike";
dailyProductSalesRow["TotalSales"] = 10;
// Add the row to the ProductSalesData DataTable
prodSalesData.Rows.Add(dailyProductSalesRow);
// Copy the DataTable to SQL Server using SqlBulkCopy
using (SqlConnection dbConnection = new SqlConnection("Data Source=ProductHost;Initial Catalog=dbProduct;Integrated Security=SSPI;Connection Timeout=60;Min Pool Size=2;Max Pool Size=20;"))
{
dbConnection.Open();
using (SqlBulkCopy s = new SqlBulkCopy(dbConnection))
{
s.DestinationTableName = prodSalesData.TableName;
foreach (var column in prodSalesData.Columns)
s.ColumnMappings.Add(column.ToString(), column.ToString());
s.WriteToServer(prodSalesData);
}
}
}
}
}
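SqlBulkCopy assumes the destination table already exists with compatible columns. A matching table for this example might look like the following (the column types are assumptions based on the DataTable columns above):
CREATE TABLE ProductSalesData (
    SaleDate datetime,
    ProductName varchar(100),
    TotalSales int
);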
Here's how I would solve it:
Just save your CSV file as an XLS sheet in Excel (by doing so, you won't have to worry about delimiters; Excel's spreadsheet format will be read as a table and imported directly into a SQL table).
Import the file using SSIS.
Write a custom script in the import manager to omit/modify the data you're looking for (or run a master script to scrutinize the data you want to remove).
Good luck.
If anyone wants to import a CSV using PowerShell:
## Install module if not installed, this is a one time install.
Install-Module SqlServer
## Input SQL Server Variables and CSV path
$csvPath = "D:\Orders.csv"
$csvDelimiter = ","
$serverName = "DESKTOP-DOG5T0Q\SQLEXPRESS"
$databaseName = "OrderDetails"
$tableSchema = "dbo"
$tableName = "Orders"
## Truncate Table
Invoke-Sqlcmd -ServerInstance $serverName -Database $databaseName -Query "TRUNCATE TABLE $tableSchema.$tableName"
## Import CSV into SQL
Import-Csv -Path $csvPath -header "Id","Country","Price","OrderQuantity" -Delimiter $csvDelimiter | Write-SqlTableData -ServerInstance $serverName -DatabaseName $databaseName -SchemaName $tableSchema -TableName $tableName -Force
Source: Import csv into SQL server (with query OR without query using SSMS)
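The TRUNCATE step assumes the destination table already exists; a matching dbo.Orders table for this example might look like the following (the column types here are assumptions based on the CSV headers):
CREATE TABLE dbo.Orders (
    Id int,
    Country varchar(50),
    Price decimal(10, 2),
    OrderQuantity int
);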
I know this is not the exact solution to the question above, but for me it was a nightmare when I was trying to copy data from a database located on a separate server to my local one.
I was trying to do that by first exporting the data from the server to CSV/txt and then importing it into my local table.
Both approaches, writing the query to import the CSV and using the SSMS Import Data wizard, kept producing errors (the errors were very general, saying there was a parsing problem). And although I wasn't doing anything special, just exporting to CSV and then trying to import the CSV into the local DB, the errors were always there.
I was trying to look at the mapping section and the data preview, but there was always a big mess. And I know the main problem was coming from one of the table columns, which contained JSON that the SQL parser was treating incorrectly.
So eventually I came up with a different solution and want to share it in case someone else has a similar problem.
What I did was use the Export Wizard on the external server.
Here are the steps to repeat the same process:
1) Right click on the database and select Tasks -> Export Data...
2) When the Wizard opens, choose Next and for "Data Source:" choose "SQL Server Native Client".
In the case of an external server, you will most probably have to choose "Use SQL Server Authentication" for the "Authentication Mode:".
3) After hitting Next, you have to select the Destination.
For that, select "SQL Server Native Client" again.
This time you can provide your local DB (or some other external DB).
4) After hitting the Next button, you have two options: either copy the entire table from one DB to another, or write a query to specify the exact data to be copied.
In my case, I didn't need the entire table (it was too large), just part of it, so I chose "Write a query to specify the data to transfer".
I would suggest writing and testing the query in a separate query editor before moving to the Wizard.
5) Finally, you need to specify the destination table where the data will be placed.
I suggest leaving it as [dbo].[Query], or some custom table name, in case you get errors exporting the data, or if you are not sure about the data and want to analyze it further before moving it to the exact table you want.
Now go straight to the end of the Wizard by hitting the Next/Finish buttons.
All of the answers here work great if your data is "clean" (no data constraint violations, etc.) and you have access to put the file on the server. Some of the answers provided here stop at the first error (PK violation, data-loss error, etc.) and give you one error at a time if you are using SSMS's built-in Import task. If you want to gather all the errors at once (for example, so you can tell the person that gave you the .csv file to clean up their data), I recommend the following. This approach also gives you complete flexibility, as you are "writing" the SQL yourself.
Note: I'm going to assume you are running a Windows OS and have access to Excel and SSMS. If not, I'm sure you can tweak this answer to fit your needs.
Using Excel, open your .csv file. In an empty column, write a formula that builds individual INSERT statements, like =CONCATENATE("INSERT INTO dbo.MyTable (FirstName, LastName) VALUES ('", A1, "', '", B1,"')", CHAR(10), "GO"), where A1 is a cell that has the first-name data and B1 has the last-name data, for example.
CHAR(10) adds a newline character to the final result, and GO will allow us to run this INSERT and continue to the next one even if there are errors. A rough sketch of the generated output is shown after these steps.
Highlight the cell with your =CONCATENATE() formula
Shift + End to highlight the same column in the rest of your rows
In the ribbon > Home > Editing > Fill > Click Down
This applies the formula all the way down the sheet so you don't have to copy-paste, drag, etc. down potentially thousands of rows by hand
Ctrl + C to copy the formulated SQL INSERT statements
Paste into SSMS
You will notice Excel, probably unexpectedly, added double quotes around each of your INSERT and GO commands. This is a "feature" (?) of copying multi-line values out of Excel. You can simply find and replace "INSERT and GO" with INSERT and GO respectively to clean that up.
Finally you are ready to run your import process
After the process completes, check the Messages window for any errors. You can select all the content (Ctrl + A) and copy into Excel and use a column filter to remove any successful messages and you are left with any and all the errors.
This process will definitely take longer than other answers here, but if your data is "dirty" and full of SQL violations, you can at least gather all the errors at one time and send them to the person that gave you the data, if that is your scenario.
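For illustration, the pasted script would look roughly like this once the stray quotes are cleaned up (dbo.MyTable and the values are the hypothetical example from above):
INSERT INTO dbo.MyTable (FirstName, LastName) VALUES ('John', 'Smith')
GO
INSERT INTO dbo.MyTable (FirstName, LastName) VALUES ('Jane', 'Doe')
GO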
Import the file into Excel by first opening Excel, then going to DATA > Import from Text File and choosing the .csv file; this preserves zero-prefixed values. Import that column as TEXT, because Excel will otherwise drop the leading 0 (do NOT double-click to open the file with Excel if you have numeric data in a field starting with a 0). When importing you get the option to treat each column as GENERAL, TEXT, etc.; choose TEXT so that quotes in the middle of a string in a field like YourCompany,LLC are also preserved. Then just save the sheet out as a tab-delimited text file.
BULK INSERT dbo.YourTableName
FROM 'C:\Users\Steve\Downloads\yourfiletoIMPORT.txt'
WITH (
FIRSTROW = 2, -- if skipping a header row
FIELDTERMINATOR = '\t',
ROWTERMINATOR = '\n'
)
I wish I could use the FORMAT and FIELDQUOTE functionality, but that does not appear to be supported in my version of SQL Server.
As stated above, you need to add the FORMAT and FIELDQUOTE options to bulk insert .CSV data into SQL Server. For your case, the SQL statement will look like this:
BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.csv'
WITH
(
FORMAT = 'CSV',
FIELDQUOTE = '"',
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
TABLOCK
)
Though BULK INSERT in SSMS is great for a one-time import job, depending on your use case you may need other options inside SSMS or from third-party tools. Here is a detailed guide describing various options to import CSV files into SQL Server, including ways to automate (that is, schedule) the process and to use FTP or file storage as the CSV location.
I know that there is an accepted answer, but I still want to share my scenario in case it helps someone solve their problem.
TOOLS
ASP.NET
EF CODE-FIRST APPROACH
SSMS
EXCEL
SCENARIO
I was loading a dataset in CSV format that was later to be shown on the View.
I tried to use the bulk load, but I was unable to load the data because BULK LOAD was using
FIELDTERMINATOR = ','
and the Excel cells also contained commas.
However, I also couldn't use the Flat File source directly, because I was using the Code-First approach, and doing that only created a model in the SSMS DB, not the model whose properties I had to use later.
SOLUTION
I used the Flat File source and made a DB table from the CSV file (right-click the DB in SSMS -> Import Flat File -> select the CSV path and do all the settings as directed).
Made a model class in Visual Studio (you MUST keep all the data types and names the same as those of the CSV file loaded into SQL).
Use Add-Migration in the NuGet package console.
Update the DB.
Maybe not exactly what you're asking, but another option is to use the CSV Lint plug-in for Notepad++.
The plug-in can validate the CSV data beforehand, meaning it checks for bad data like missing quotes, an incorrect decimal separator, datetime formatting errors, etc. And instead of BULK INSERT, it can convert the CSV file to a SQL insert script.
The SQL script will contain INSERT statements for each CSV line in batches of 1000 records, and it also adjusts any datetime and decimal values. The plug-in automatically detects the data types in the CSV, and it will include a CREATE TABLE part with the correct data type for each column.
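For the Schools sample above, the generated script would look roughly like this (the column types are whatever the plug-in infers, so treat this as an illustration rather than its exact output):
CREATE TABLE SchoolsTemp (
    Name varchar(50),
    Class int,
    Subject varchar(50),
    ExamDate datetime,
    Mark int,
    Description varchar(255)
);
INSERT INTO SchoolsTemp (Name, Class, Subject, ExamDate, Mark, Description)
VALUES ('Prabhat', 4, 'Math', '2013-02-10', 25, 'Test data for prabhat.'),
       ('Murari', 5, 'Science', '2013-02-11', 24, 'Test data for his''s test, where we can test 2nd ROW, Test.'),
       ('sanjay', 4, 'Science', NULL, 25, 'Test Only.');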
Below is a sample line of the CSV:
012,12/11/2013,"<555523051548>KRISHNA KUMAR ASHOKU,AR",<10-12-2013>,555523051548,12/11/2013,"13,012.55",
You can see KRISHNA KUMAR ASHOKU,AR as a single field, but the import treats KRISHNA KUMAR ASHOKU and AR as two different fields because of the comma, even though they are enclosed in double quotes; still no luck.
I tried:
BULK INSERT tbl
FROM 'd:\1.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
FIRSTROW=2
)
GO
Is there any solution for it?
The answer is: you can't do that. See http://technet.microsoft.com/en-us/library/ms188365.aspx.
"Importing Data from a CSV file
Comma-separated value (CSV) files are not supported by SQL Server bulk-import operations. However, in some cases, a CSV file can be used as the data file for a bulk import of data into SQL Server. For information about the requirements for importing data from a CSV data file, see Prepare Data for Bulk Export or Import (SQL Server)."
The general solution is that you must convert your CSV file into one that can be be successfully imported. You can do that in many ways, such as by creating the file with a different delimiter (such as TAB) or by importing your table using a tool that understands CSV files (such as Excel or many scripting languages) and exporting it with a unique delimiter (such as TAB), from which you can then BULK INSERT.
They added support for this in SQL Server 2017 (14.x) CTP 1.1. You need to use the FORMAT = 'CSV' input file option with the BULK INSERT command.
To be clear, here is what the CSV that was giving me problems looks like. The first line is easy to parse; the second line contains the curve ball, since there is a comma inside the quoted field:
jenkins-2019-09-25_cve-2019-10401,CVE-2019-10401,4,Jenkins Advisory 2019-09-25: CVE-2019-10401:
jenkins-2019-09-25_cve-2019-10403_cve-2019-10404,"CVE-2019-10404,CVE-2019-10403",4,Jenkins Advisory 2019-09-25: CVE-2019-10403: CVE-2019-10404:
Broken Code
BULK INSERT temp
FROM 'c:\test.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '0x0a',
FIRSTROW= 2
);
Working Code
BULK INSERT temp
FROM 'c:\test.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '0x0a',
FORMAT = 'CSV',
FIRSTROW= 2
);
Unfortunately, SQL Server's import methods (BCP and BULK INSERT) do not understand quoting with " ".
Source : http://msdn.microsoft.com/en-us/library/ms191485%28v=sql.100%29.aspx
I have encountered this problem recently and had to switch to tab-delimited format. If you do that and use SQL Server Management Studio to do the import (right-click on the database, then select Tasks, then Import), tab-delimited works just fine. The bulk insert option with tab-delimited should also work.
I must admit to being very surprised when finding out that Microsoft SQL Server had this comma-delimited issue. The CSV file format is a very old one, so finding out that this was an issue with a modern database was very disappointing.
MS have now addressed this issue, and you can use FIELDQUOTE in your WITH clause to add quoted-string support:
FIELDQUOTE = '"',
anywhere in your WITH clause should do the trick, if you have SQL Server 2017 or above.
Well, BULK INSERT is very fast but not very flexible. Can you load the data into a staging table and then push everything into a production table? Once the data is in SQL Server, you will have a lot more control over how you move it from one table to another. So, basically:
1) Load data into staging
2) Clean/Convert by copying to a second staging table defined using the desired datatypes. Good data copied over, bad data left behind
3) Copy data from the "clean" table to the "live" table
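A minimal T-SQL sketch of that pattern, with hypothetical table and column names:
-- 1) Load the raw text into staging (every column as varchar so nothing is rejected)
BULK INSERT dbo.Schools_Staging
FROM 'C:\CSVData\Schools.csv'
WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- 2) Clean/convert into a typed staging table; bad rows stay behind
INSERT INTO dbo.Schools_Clean (Name, Class, ExamDate, Mark)
SELECT Name, TRY_CAST(Class AS int), TRY_CAST(ExamDate AS date), TRY_CAST(Mark AS int)
FROM dbo.Schools_Staging
WHERE TRY_CAST(Mark AS int) IS NOT NULL;

-- 3) Copy the clean data into the live table
INSERT INTO dbo.Schools (Name, Class, ExamDate, Mark)
SELECT Name, Class, ExamDate, Mark
FROM dbo.Schools_Clean;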
I have a very large csv file with ~500 columns, ~350k rows, which I am trying to import into an existing SQL Server table.
I have tried BULK INSERT; I get "Query executed successfully, 0 rows affected". Interestingly, BULK INSERT worked, in a matter of seconds, for a similar operation with a much smaller csv file: fewer than 50 columns, ~77k rows.
I have also tried bcp; I get "Unexpected EOF encountered in BCP data-file. BCP copy in failed."
The task is simple - it shouldn't be hard to the point of pure frustration. Any ideas or suggestions? Any other tools or utilities that you have successfully used for a bulk import operation or something similar? Thanks.
-- BULK INSERT
USE myDb
BULK INSERT myTable
FROM 'C:\Users\myFile.csv'
WITH
(
FIRSTROW = 2,
-- DATAFILETYPE = 'char',
-- MAXERRORS = 100,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
);
-- bcp
bcp myDb.dbo.myTable in 'C:\Users\myFile.csv' -T -t, -c
UPDATE
I have now changed course. I've decided to join the csv files, which was my goal to begin with, outside of SQL Server, so that I don't have to upload the data to a table for now. However, it will be interesting to try to upload (BULK INSERT or bcp) only 1 record (~490 cols.) from the csv file, which otherwise failed, and see if it works.
Check your file for an EOF character where it shouldn't be - BCP is telling you there is a problem with the file.
Notepad++ may be able to load the file for you to view and search.
Most likely the last line lacks a \n. Also, there is a limitation on the row size (8060 bytes) in SQL Server, although the error message should have mentioned this. However, check this link:
My advice: Start with one row and get it to work. Then the rest.
How are you mapping the fields in the file with the columns in the table? Are the number of columns in the table the same as the number of fields in the file? Or are you using a format file to specify the column mapping? If so, is the format file formatted correctly?
If you are using a format file and you have the "Number of columns" parameter wrong, it will cause the error "Unexpected end of file". See this for some other errors/issues with bulk uploading.
It is probably not the solution you're expecting, but with Python you could create a table out of the CSV very easily (I just uploaded a 1 GB CSV file this way):
import pandas as pd
import psycopg2
from sqlalchemy import create_engine
# Read the csv to a dataframe
df = pd.read_csv('path_to_csv_file', index_col='name_of_index_column', sep=",")
# Connect and upload (this example targets PostgreSQL; use the appropriate
# SQLAlchemy connection string for your database)
engine = create_engine('postgresql+psycopg2://db_user_name:db_password@localhost:5432/' + 'db_name', client_encoding='utf8')
df.to_sql('table_name', engine, if_exists='replace', index =True, index_label='name_of_index_column')