I am trying to insert the data from this link into my SQL Server database:
https://www.ian.com/affiliatecenter/include/V2/CityCoordinatesList.zip
I created the table
CREATE TABLE [dbo].[tblCityCoordinatesList](
[RegionID] [int] NOT NULL,
[RegionName] [nvarchar](255) NULL,
[Coordinates] [nvarchar](4000) NULL
) ON [PRIMARY]
And I am running the following script to do the bulk insert
BULK INSERT tblCityCoordinatesList
FROM 'C:\data\CityCoordinatesList.txt'
WITH
(
FIRSTROW = 2,
MAXERRORS = 0,
FIELDTERMINATOR = '|',
ROWTERMINATOR = '\n'
)
But the bulk insert fails with the following error:
Cannot obtain the required interface ("IID_IColumnsInfo") from OLE DB provider "BULK" for linked server "(null)".
When I googled, I found several articles saying the issue may be with the row terminator, but I tried everything like \n\r, \n, etc., and nothing works.
Could anyone please help me to insert this data into my database?
Try ROWTERMINATOR = '0x0a'.
It should work.
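Applied to the statement from the question, only the row terminator changes:
BULK INSERT tblCityCoordinatesList
FROM 'C:\data\CityCoordinatesList.txt'
WITH
(
    FIRSTROW = 2,
    MAXERRORS = 0,
    FIELDTERMINATOR = '|',
    ROWTERMINATOR = '0x0a'   -- hex for a bare line feed (LF-only line endings)
);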
This can also happen if the number of columns does not match between the table and the imported file.
I got the same error message, and as you mentioned, it was related to an unexpected line ending.
In my case the line ending was specified in a fmt file as a Windows line ending (CRLF), written as \r\n, while the data file to process had a classic Mac one (CR).
I solved it with an editor that can show the current line ending and change it. I used EditPad Lite, which shows the opened file's line ending in the bottom bar; clicking it lets you convert it to the expected one.
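For reference, the row terminator lives on the last field line of a non-XML format file. A rough sketch for a three-column, pipe-delimited file (the version number, widths and collations below are made up), with the terminator changed from "\r\n" to "\r" for classic Mac line endings:
12.0
3
1   SQLCHAR   0   12     "|"    1   RegionID      ""
2   SQLCHAR   0   510    "|"    2   RegionName    SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   8000   "\r"   3   Coordinates   SQL_Latin1_General_CP1_CI_AS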
I had this on SQL Server 2019 when the FORMAT='CSV' option was used and there was a comma at the end of each line in the source file. The table you're BULK INSERTing into then needs an extra dummy field to cater for the fact that each record effectively has a blank field at the end.
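A rough sketch of that workaround, with made-up table and file names (FORMAT = 'CSV' needs SQL Server 2017 or later); the last column exists only to absorb the empty field produced by the trailing comma:
CREATE TABLE dbo.MyImport
(
    Col1  nvarchar(255) NULL,
    Col2  nvarchar(255) NULL,
    Dummy nvarchar(1)   NULL   -- catches the blank field after the trailing comma
);

BULK INSERT dbo.MyImport
FROM 'C:\data\myfile.csv'
WITH (FORMAT = 'CSV', FIRSTROW = 2);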
I got the same error, probably from a file-encoding problem. I fixed it by opening the problematic CSV file in Notepad++, selecting everything, and copying it to the clipboard. Next, create a new text file (making sure it has the .csv extension), open it in Notepad++, then paste the text into the new file. Save and close all files. You should then be able to load the new CSV file into SQL Server.
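If re-saving the file is not convenient, another option (assuming SQL Server 2016 or later and a UTF-8 source file; table name and path here are only examples) is to declare the encoding in the BULK INSERT itself:
BULK INSERT dbo.MyTable
FROM 'C:\data\myfile.csv'
WITH
(
    CODEPAGE = '65001',        -- UTF-8; use DATAFILETYPE = 'widechar' for UTF-16 files
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a',
    FIRSTROW = 2
);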
You need to run the BULK INSERT command from a Windows login (not from a SQL login). I don't have an example to hand right now.
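If you want to confirm which kind of login your session is using before trying this, a quick check against the standard connection DMV (nothing here is specific to this setup):
SELECT SUSER_SNAME() AS current_login,
       auth_scheme              -- NTLM/KERBEROS for Windows logins, SQL for SQL logins
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;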
Hi everyone. I am quite a new user of Azure Data Studio and stumbled upon the following problem:
I am trying to read a CSV-formatted file from SQL and take the data from it. Here is my SQL code:
USE excel_checks
BULK INSERT short_info from '/Item.csv'
with (fieldterminator = ',', rowterminator = '/n');
I tried to change the location of the file by copying it into the Docker container with the following command in my terminal:
docker cp /Users/office/Desktop/Item.csv <container name>:/
It throws the following error: Cannot bulk load. The file "/Item.csv" does not exist or you don't have file access rights.
I would be happy if someone could help me get this issue sorted out. Googling did not help much, as I keep getting the same error.
Thank you.
Instead of copying the file to the root of the container's file system, copy it into a subdirectory to avoid the file access rights issue:
docker cp /Users/office/Desktop/Item.csv <container name>:/tmp
Then run the BULK INSERT with that path:
USE excel_checks
BULK INSERT short_info from '/tmp/Item.csv'
with (fieldterminator = ',', rowterminator = '\n');
I am trying to import a TXT file into a PostgreSQL database table, but I am getting an error:
ERROR:
missing data for column "bts_name"
SQL state: 22P04
My code is:
COPY indicadores2g (
Daily,
BTS_NAME,
SITE_CODE
)
FROM 'C:\Users\Public\Documents\GEO_2G_CELL.txt'
WITH CSV HEADER DELIMITER ' ' NULL AS '' ;
I know that the problem is in the txt file: the last two lines are blank, and when I remove them, the SQL runs without problems.
My problem is that I need to import this file every day. Is there anything I can put in my SQL code so it runs without problems?
Another way to make it work is to open the TXT in Excel and save it as CSV. Can I do this automatically?
Create a simple batch file (for example inpfixer.bat):
@echo off
rem Echo every line of the input file passed as %1.
rem "for /f" skips empty lines, so the trailing blank lines never reach COPY.
for /f "delims=" %%a in (%1) do (
    echo %%a
)
Then
COPY indicadores2g (
Daily,
BTS_NAME,
SITE_CODE
)
FROM PROGRAM 'inpfixer.bat C:\Users\Public\Documents\GEO_2G_CELL.txt'
WITH CSV HEADER DELIMITER ' ' NULL AS '' ;
Of course, inpfixer.bat must be available on the PATH.
Disclaimer: tested on Wine.
I'm trying to import data from a Windows CSV (comma-delimited) file into the pgSQL faxtest1 table, but I keep getting an error saying "The server encountered an internal error and was unable to complete your request. Either the server is overloaded or there is an error in the application."
The following is my code:
COPY faxtest1
FROM 'C:\Users\David\Desktop\test3.csv'
WITH DELIMITER AS ',' CSV ;
The CSV file looks like this:
Status,Fax ID
Fax to Email,2104
Fax to Email,2108
It is a bug in pgAdmin 4; hopefully they will fix it in the future.
In version 14, the Import/Export Data dialog has two tabs, "Options" and "Columns." Try manually selecting the columns one at a time, separated by a comma, and see if this bypasses the error.
It worked for me.
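If the dialog keeps failing, the same import can be done with a plain COPY statement and an explicit column list, which is essentially what the workaround above does. A minimal sketch, assuming the file is readable by the PostgreSQL server process and the two columns are named status and fax_id (adjust to the real table definition):
COPY faxtest1 (status, fax_id)
FROM 'C:\Users\David\Desktop\test3.csv'
WITH (FORMAT csv, HEADER true, DELIMITER ',');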
I am trying to load a text file into a SQL Server database table using BULK INSERT.
BULK
INSERT My_Tablename
FROM 'C:\testing\temptest.txt'
WITH
(
FIELDTERMINATOR = '|',
ROWTERMINATOR = '\n'
)
GO
But I got an error saying I 'do not have permission to use the bulk load statement'.
Is there any alternative way to do it?
I don't want to set TRUSTWORTHY ON or create a certificate for bulk admin permissions.
Try using the SQL Server Import and Export Wizard.
Right-click on the database in Object Explorer within SSMS.
Go to Tasks > Import Data
Select "Flat File Source" for your data source and follow the wizard to specify delimiters, etc.
Although @SQLChao definitely has the answer, I did not remember the location of said Import Data option, so I simply opened the delimited file in my favorite text editor, Notepad++, and did the following find-and-replaces with Search Mode set to Extended:
Find: ' Replace: ''
Find: | Replace: ','
Find: \r\n Replace: ')\r\n
Find: \r\n Replace: \r\nINSERT INTO [DB_Name].[Schema_Name].[Table_Name] VALUES(\r\n'
The only issues should be in your first and last INSERT statements, which can be manually edited as needed.
I then copied the text straight into SQL Server and executed it.
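For illustration, after those replacements a pipe-delimited row such as value1|value2|value3 ends up as a statement like this (the values are made up; the opening quote comes from the replacement applied to the previous line, which is why the first and last statements need hand-editing):
INSERT INTO [DB_Name].[Schema_Name].[Table_Name] VALUES(
'value1','value2','value3')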
I have the following query to insert into a table
BULK
INSERT tblMain
FROM 'c:\Type.txt'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
I get the message
Msg 4860, Level 16, State 1, Line 1
Cannot bulk load. The file "c:\Type.txt" does not exist.
The file is clearly there. Anything I may be overlooking?
Look at that:
Cannot bulk load. The file "c:\data.txt" does not exist
Is that file on the SQL Server's C:\ drive?
BULK INSERT always reads files from a drive the SQL Server machine itself can see; it cannot reach onto your own local client drive.
You need to put the file onto the SQL Server's C:\ drive and try again.
The BULK INSERT syntax is described here:
http://msdn.microsoft.com/en-us/library/ms188365.aspx
> BULK INSERT [ database_name . [ schema_name ] . | schema_name . ]
> [ table_name | view_name ]
> FROM 'data_file'
> [ WITH
> (
The note on the data_file argument says:
' data_file '
Is the full path of the data file that contains data to import into
the specified table or view. BULK INSERT can import data from a disk
(including network, floppy disk, hard disk, and so on).
data_file must specify a valid path from the server on which SQL
Server is running. If data_file is a remote file, specify the
Universal Naming Convention (UNC) name. A UNC name has the form
\\Systemname\ShareName\Path\FileName. For example,
\\SystemX\DiskZ\Sales\update.txt.
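So if the file lives on your own machine rather than on the server, share the folder and point BULK INSERT at the UNC path. A minimal sketch with a made-up machine and share name (the SQL Server service account needs read access to the share):
BULK INSERT tblMain
FROM '\\MyWorkstation\ImportShare\Type.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);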
I've had this problem before. In addition to checking the file path, you'll want to make sure you're referencing the correct file name and file type. Make sure it is indeed a text file saved in the source location and not a Word file, etc. I got tripped up with .doc and .docx; a newbie mistake, but it can happen. Changing the file type fixed the problem.