I am using the BCP utility to import data into SQL Server 2012. I am importing a tab-delimited text file. If a field in my file contains commas, SQL Server wraps the field value in double quotes. How can I stop SQL Server from adding this text qualifier? I want the data without the quotation marks, exactly as it appears in the flat text file.
I would like to avoid using a format file because I am working with lots of flat text files with great variability.
I also cannot use BULK INSERT because my files are not stored on the server but on a local machine that connects via SSMS.
EDIT: My BCP Command:
bcp "[DB].[SCHEMA].[TABLE]" in "PATH\FILE.txt" -F2 -w -S"SERVERSTRING" -U"USERNAME" -P"PASSWORD";
Related
I have a huge text file in the format "name:email".
I also created a table that has name and email columns.
How do I upload the text file into the table?
The text file is on my Ubuntu server, and I have connected to psql using the commands
sudo -u postgres psql
\connect <username>
What do I do next in order to import the text file to my database?
psql has a \COPY command
COPY allows you to copy data into and out of tables in your
database. It supports several modes, including:
binary
tab delimited
csv delimited
What you need is:
\COPY tablename(name,email) FROM '/path/to/file' DELIMITER ':' CSV
If you get the error ERROR: missing data for column "name", it means your file has an empty line at the end; just remove it.
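As a concrete sketch, assuming a table created as follows (the table name, column types, and sample file contents here are illustrative):

CREATE TABLE users (
    name  text,
    email text
);

-- /path/to/file, one record per line:
--   alice:alice@example.com
--   bob:bob@example.com

\COPY users(name,email) FROM '/path/to/file' DELIMITER ':' CSV

-- quick sanity check after the load
SELECT count(*) FROM users;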
I have a .CSV file that contains more than 100,000 rows.
I have tried the following method to import the CSV into the table "Root":
BULK INSERT [dbo].[Root]
FROM 'C:\Original.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
But I get many errors, such as being told to check my terminators.
I opened the CSV with Notepad. There is no terminator , or \n; at the end of each row there is a square box.
Please help me import this CSV into the table.
http://msdn.microsoft.com/en-us/library/ms188609.aspx
Comma-separated value (CSV) files are not supported by SQL Server bulk-import operations. However, in some cases, a CSV file can be used as the data file for a bulk import of data into SQL Server. Note that the field terminator of a CSV file does not have to be a comma. To be usable as a data file for bulk import, a CSV file must comply with the following restrictions:
Data fields never contain the field terminator.
Either none or all of the values in a data field are enclosed in quotation marks ("").
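To illustrate the two restrictions above, a file like this satisfies both (no quotes anywhere, and no field contains a comma):

1,John Smith,New York
2,Jane Doe,Los Angeles

whereas an unquoted line such as 3,Smith, John,Chicago violates the first restriction, because the name field contains the field terminator.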
Note: there may be other unseen characters that need to be stripped from the source file. Vim (command ":set list") or Notepad++ (View > Show Symbol > Show All Characters) are two ways to check.
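A common cause of terminator errors is a file whose rows end in a bare line feed (LF): when you specify '\n' as the row terminator, BULK INSERT actually expects the carriage return/line feed pair (CRLF). If that is the case here (an assumption worth verifying with one of the editors above), specifying the row terminator in hexadecimal may help:

BULK INSERT [dbo].[Root]
FROM 'C:\Original.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a' -- a bare LF, given in hex
)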
If you are comfortable with Java, I have written a set of tools for CSV manipulation, including an importer and exporter. The project is up on Github.com:
https://github.com/carlspring/csv-db-tools
The importer is here:
https://github.com/carlspring/csv-db-tools/tree/master/csv-db-importer
For instructions on how to use the importer, check:
https://github.com/carlspring/csv-db-tools/blob/master/csv-db-importer/USAGE
You will need to make a simple mapping file. An example can be seen here:
https://github.com/carlspring/csv-db-tools/blob/master/csv-db-importer/src/test/resources/configuration-large.xml
I have a table with more than 3,000,000 rows. I have tried to export the data from it manually and with the SQL Server Management Studio Export Data functionality to Excel, but I have run into several problems:
when I create the .txt file manually by copying and pasting the data (in several batches, because copying all rows from SQL Server Management Studio at once throws an out-of-memory error), I am not able to open it with any text editor and copy the rows;
the Export Data to Excel does not work, because Excel does not support that many rows
Finally, with the Export Data functionality I created a .sql file, but it is 1.5 GB, and I am not able to open it in SQL Server Management Studio again.
Is there a way to import it with the Import Data functionality, or some other cleverer way to back up the information in my table and import it again if I need it?
Thanks in advance.
I am not quite sure I understand your requirements (I don't know whether you need to export your data to Excel or want to make some kind of backup).
In order to export data from single tables, you could use the Bulk Copy Program (bcp), which allows you to export data from single tables to files and import it back. You can also use a custom query to export the data.
Note that this does not generate an Excel file, but a different format. You can use it to move data from one database to another (both must be MS SQL).
Examples:
Create a format file:
bcp [TABLE_TO_EXPORT] format nul -n -f "[FORMAT_FILE]" -S [SERVER] -E -T -a 65535
Export all Data from a table:
bcp [TABLE_TO_EXPORT] out "[EXPORT_FILE]" -f "[FORMAT_FILE]" -S [SERVER] -E -T -a 65535
Import the previously exported data:
bcp [TABLE_TO_EXPORT] in "[EXPORT_FILE]" -f "[FORMAT_FILE]" -S [SERVER] -E -T -a 65535
I redirect the output from the export/import operations to a log file (by appending "> mylogfile.log" at the end of the commands) - this helps if you are exporting a lot of data.
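As a concrete sketch with hypothetical placeholders throughout (MyDb.dbo.Orders, MYSERVER, NEWSERVER, and the C:\temp paths are all made up for illustration):

bcp MyDb.dbo.Orders format nul -n -f "C:\temp\Orders.fmt" -S MYSERVER -T > bcp.log
bcp MyDb.dbo.Orders out "C:\temp\Orders.dat" -f "C:\temp\Orders.fmt" -S MYSERVER -T >> bcp.log
bcp MyDb.dbo.Orders in "C:\temp\Orders.dat" -f "C:\temp\Orders.fmt" -S NEWSERVER -E -T >> bcp.log

Here -T uses Windows authentication and -E keeps the original identity values on import; drop -E if you want the target table to generate new ones.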
Here is a way of doing it without bcp:
EXPORT THE SCHEMA AND DATA IN A FILE
Use the SSMS wizard:
Database >> Tasks >> Generate Scripts… >> choose the table >> choose to script both schema and data
Save the SQL file (can be huge)
Transfer the SQL file on the other server
SPLIT THE DATA IN SEVERAL FILES
Use a program like textfilesplitter to split the file into smaller files of 10,000 lines each (so no single file is too big)
Put all the files in the same folder, with nothing else
IMPORT THE DATA IN THE SECOND SERVER
Create a .bat file in the same folder, named execFiles.bat
You may need to check the table schema and disable the identity column for the first file (see the sketch after the script below); you can re-enable it once the import is finished.
The script below executes all the files in the folder against the server and database; the -f option sets the text encoding (65001 = UTF-8) so accented characters are handled correctly:
for %%G in (*.sql) do sqlcmd /S ServerName /d DatabaseName -E -i"%%G" -f 65001
pause
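If the table has an identity column, here is a minimal sketch of the wrapper mentioned above (dbo.MyTable is a hypothetical placeholder; the generated INSERT statements must list the identity column explicitly):

SET IDENTITY_INSERT dbo.MyTable ON;
-- ... the generated INSERT statements for this file ...
SET IDENTITY_INSERT dbo.MyTable OFF;

Note that IDENTITY_INSERT can be ON for only one table per session at a time.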
I have an odd problem. I need to export Japanese characters from a table to a raw text file. When I run the select statement in SQL, I am able to see the characters displayed correctly. However, when I run the SSIS package to export these values to a text file, they are displayed as ?'s.
The data type of the field is NTEXT. Has anyone run into this problem before?
SQL statement:
select cast(body as nvarchar(max)) as body from msgsMarket
In the SSIS package's flat file connection manager, I have set the output encoding to code page 932.
This is not a solution, but it might help you identify the problem in your case.
I created a sample SSIS package using SSIS 2008 R2 with UTF-8 and Unicode encodings, and the SQL Server data exported correctly to flat files.
Sample SQL data in the file; the Description field was of data type NVARCHAR. The sample was also tried with the Description field changed to NTEXT, and the flat files still exported correctly.
SSIS package was created with a data flow task with two outputs for UTF-8 and Unicode.
First flat file connection manager to generate flat file with encoding UTF-8.
Output file generated with UTF-8 encoding.
Second flat file connection manager to generate flat file with encoding Unicode.
Output file generated with Unicode encoding.
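To tie this back to the question, here is a minimal sketch of the kind of source data and query that exported correctly (the table definition and sample value are assumed; only the msgsMarket name and the cast come from the question):

-- assumed sample table holding Unicode content
CREATE TABLE dbo.msgsMarket (body NTEXT);
INSERT INTO dbo.msgsMarket (body) VALUES (N'こんにちは');

-- the question's source query: cast NTEXT to NVARCHAR(MAX) so Unicode is preserved
SELECT CAST(body AS NVARCHAR(MAX)) AS body FROM dbo.msgsMarket;

The N prefix on the literal matters: without it, the Japanese characters would be converted through the database's default code page and could already be ?'s before SSIS ever sees them.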
I need a query to export a CSV file from a particular table, and the query must be used inside a stored procedure.
I tried this query
EXEC master..xp_cmdshell
'osql.exe -S ramcobl412 -U connect -P connect
-Q "select * from ramcodb..rct_unplanned_hdr" -o "c:\out.csv" -h-1 -s","'
but the CSV file is not formatted correctly when I open it in an .xls sheet.
Comma-separated files work fine, but the column width is a problem.
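For reference, a hedged sketch of the same export using bcp's queryout mode instead of osql (reusing the server, credentials, and query from above; -c writes character data and -t, sets a comma field terminator):

EXEC master..xp_cmdshell
'bcp "select * from ramcodb..rct_unplanned_hdr" queryout "c:\out.csv" -c -t, -S ramcobl412 -U connect -P connect'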
Save the .csv file you want to import to SQL Server on your desktop or some other path you can easily access.
Using SQL Server Management Studio, right-click the database you want the CSV file imported into as a table, then go to Tasks >
Import Data and use the Import Wizard to import the CSV file into a table.
The Import Wizard will automatically account for the different value lengths in your rows. For example, if column X has 5 characters in one row and 10 characters in two other rows, the Import Wizard will set the maximum length for column X to 10.