SQL Server Bulk Insert Error 7301 "IID_IColumnsInfo"

I'm trying to bulk insert from a CSV file (the load will eventually run every day through a stored procedure), but I keep getting the same error:
Msg 7301, Level 16, State 2, Line 16
Cannot obtain the required interface ("IID_IColumnsInfo") from OLE DB provider "BULK" for linked server "(null)".
In the table I'm importing into, I made every column nvarchar with a length of at least 500 characters, because I thought the column sizes were the problem.
I export the CSV file from PowerShell as follows:
Export-Csv -Path $DirPath -Delimiter ';' -NoTypeInformation -Encoding UTF8
The file has 40 columns and 685 rows. I already tried saving the CSV with ',' as the delimiter and with ';' as the delimiter, but both give the same error.
I tried the BULK INSERT in several ways, as below, but without success.
BULK INSERT DEV_DS_SANTOGRAU..GB_TBIMP_FOTOS_CSV
FROM 'C:\Users\userbi\Desktop\Projetos-Santo-Grau\Projeto1-RelatoriodeEstoque\TBIMP_FOTOS_CSV.csv'
WITH (FORMAT = 'CSV',
      --MAXERRORS = 0,
      --CODEPAGE = '65001',
      CODEPAGE = 'ACP',
      --FIELDQUOTE = '"',
      FIELDTERMINATOR = '";"',
      --ROWTERMINATOR = '"\n"',
      ROWTERMINATOR = '\r\n',
      --ROWTERMINATOR = '0x0a'
      FIRSTROW = 2,
      ERRORFILE = 'C:\Users\userbi\Desktop\Projetos-Santo-Grau\Projeto1-RelatoriodeEstoque\TBIMP_FOTOS_CSV_ERROS.csv');
The statement above did export CSV and TXT error files; in them the data appeared garbled in a way that does not occur in the original file.
What should I do?
I'd rather not, but if it's possible to skip the bad records and still complete the insert, that would be the lesser evil.
Information:
SQL Server 2019 (v15.0.18330.0)
SQL Server Management Objects (SMO) v16.100.37971.0
Microsoft SQL Server Management Studio v18.5

It's usually easier to BULK INSERT data with a format file. Use the bcp.exe utility to create a format file with a command such as the following:
bcp.exe DEV_DS_SANTOGRAU..GB_TBIMP_FOTOS_CSV format nul -c -t; -f C:\Temp\TBIMP_FOTOS_CSV.fmt -S(local) -T
Where:
DEV_DS_SANTOGRAU..GB_TBIMP_FOTOS_CSV is the Database.Schema.Table we're interacting with.
format specifies format file creation mode.
nul specifies the input/output data file, which in this case means "don't write any data".
-c specifies character mode, as opposed to native (binary) mode.
-t; specifies to use ; as the field separator character.
-f C:\Temp\TBIMP_FOTOS_CSV.fmt specifies the path to write the format file to, relative to your local computer.
-S(local) is the SQL Server to connect to, (local) in my case.
-T means Trusted Authentication (Windows authentication); use -U Username and -P Password if you have SQL login authentication instead.
This creates a format file something like the following (yours will have more and different columns):
14.0
2
1 SQLCHAR 0 510 ";" 1 Filename SQL_Latin1_General_Pref_CP1_CI_AS
2 SQLCHAR 0 510 "\r\n" 2 Resolution SQL_Latin1_General_Pref_CP1_CI_AS
Now, in SSMS, you should be able to run something like the following to import your data file (adjust file paths relative to your SQL Server as appropriate):
BULK INSERT DEV_DS_SANTOGRAU..GB_TBIMP_FOTOS_CSV
FROM 'C:\Temp\TBIMP_FOTOS_CSV.csv'
WITH (
CODEPAGE = '65001',
DATAFILETYPE = 'char',
FORMAT = 'CSV',
FORMATFILE = 'C:\Temp\TBIMP_FOTOS_CSV.fmt'
);
-- edit --
On SQL Server and international character support.
SQL Server and UTF-8 have had a bit of a checkered history, with only partial support arriving in SQL Server 2016 and UTF-8 code pages only really supported in SQL Server 2019. Importing and exporting files with international characters is still best handled using UTF-16 encoded files. The adjustments to the workflow are as follows...
In PowerShell, use the Unicode encoding instead of UTF8:
Export-Csv -Path $DirPath -Delimiter ';' -NoTypeInformation -Encoding Unicode
When generating the BCP format file, use the -w switch (for widechar) instead of -c (for char):
bcp.exe DEV_DS_SANTOGRAU..GB_TBIMP_FOTOS_CSV format nul -w -t; -f C:\Temp\TBIMP_FOTOS_CSV-widechar.fmt -S(local) -T
This causes the SQLCHAR columns to be written out as SQLNCHAR, i.e. national character support:
14.0
2
1 SQLNCHAR 0 510 ";\0" 1 Filename SQL_Latin1_General_Pref_CP1_CI_AS
2 SQLNCHAR 0 510 "\r\0\n\0" 2 Resolution SQL_Latin1_General_Pref_CP1_CI_AS
When using BULK INSERT, specify DATAFILETYPE = 'widechar' instead of DATAFILETYPE = 'char', and omit the CODEPAGE option, e.g.:
BULK INSERT GB_TBIMP_FOTOS_CSV
FROM 'C:\Temp\TBIMP_FOTOS_CSV.csv'
WITH (
DATAFILETYPE = 'widechar',
FORMATFILE = 'C:\Temp\TBIMP_FOTOS_CSV-widechar.fmt'
);
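Putting the pieces together, a minimal end-to-end sketch of this widechar workflow in PowerShell might look like the following (it assumes the SqlServer module for Invoke-Sqlcmd and reuses the paths and table name from the question; $rows stands for whatever objects you are exporting):
# Hedged sketch: Unicode export, widechar format file, bulk import.
$csv = 'C:\Temp\TBIMP_FOTOS_CSV.csv'
$fmt = 'C:\Temp\TBIMP_FOTOS_CSV-widechar.fmt'

# 1. Export the data as UTF-16 ("Unicode") CSV.
$rows | Export-Csv -Path $csv -Delimiter ';' -NoTypeInformation -Encoding Unicode

# 2. Generate the widechar format file (-w) from the target table's definition.
bcp.exe DEV_DS_SANTOGRAU..GB_TBIMP_FOTOS_CSV format nul -w -t ';' -f $fmt -S '(local)' -T

# 3. Import the file, skipping the header row written by Export-Csv.
Invoke-Sqlcmd -ServerInstance '(local)' -Query @"
BULK INSERT DEV_DS_SANTOGRAU..GB_TBIMP_FOTOS_CSV
FROM '$csv'
WITH (DATAFILETYPE = 'widechar', FORMATFILE = '$fmt', FIRSTROW = 2);
"@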

Related

Bash script to import multiple CSV files into a MySQL database using the LOAD DATA LOCAL INFILE command

I have multiple CSV files stored in one folder, and I need to fetch the CSV files from that folder and load them into a database table.
The script needs to be written in Bash with parameterized fields such as InputFolderPath (to loop over the CSV files), DatabaseConnection, SchemaName, and TableName, and then pass these fields to the
LOAD DATA LOCAL INFILE command.
This worked for me,
for f in /var/www/path_to_your_folder/*.csv
do
    mysql -e "use database_name" -e "
        load data local infile '"$f"' into table your_table_name
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' ESCAPED BY '\"'
        LINES TERMINATED BY '\r\n' IGNORE 1 LINES
        (column_name1, @date_time_variable1, column_name3)
        SET some_column_name_which_contains_date = STR_TO_DATE(@date_time_variable1, '%d-%m-%Y');" \
        -u your_mysql_username_here -p --local-infile=1
    echo "Done: '"$f"' at $(date)"
done
This script will prompt for the MySQL password.
I am using this script on EC2 + Ubuntu.

Is it possible to run BCP without mentioning the input file name?

BULK INSERT NECCOI_DB.dbo.ALL_Stores
FROM 'C:\TestingforAutomation\AllStores.csv'
WITH
(
FIELDTERMINATOR=',',
ROWTERMINATOR='\n',
FIRSTROW=2
)
My bat file is like this:
bcp NECCOI_DB.dbo.ALL_Stores in C:\TestingforAutomation\AllStores.csv -Uvm -PMadhu#9515 -SBLLT-5CD124JQHQ -c -F2 -t ","
TIMEOUT /T 60
It is working fine without any issues, but the script has to work without mentioning the CSV file name; it should work with any CSV file.
So I changed my script to this:
bcp NECCOI_DB.dbo.ALL_Stores in C:\TestingforAutomation\*.csv -Uvm -PMadhu#9515 -SBLLT-5CD124JQHQ -c -F2 -t ","
TIMEOUT /T 60
It gives an error like this:
SQLState = S1000, NativeError = 0 Error = [Microsoft][ODBC Driver 17
for SQL Server]Unable to open BCP host data-file
Can anyone guide me on how to fix this? Thanks in advance.
There is no way to do this within the BCP command.
To bulk copy many files, you'll need to write code to cycle through the list of files you want loaded and build a new bcp or bulk insert statement for each file to be loaded.
Neither the BCP utility nor the BULK INSERT T-SQL command can accept wildcards.
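For example, a minimal hedged PowerShell sketch (reusing the server, table, and credentials from the question) that cycles through the folder and runs bcp once per file:
# Run bcp once for each CSV file in the folder.
Get-ChildItem 'C:\TestingforAutomation\*.csv' | ForEach-Object {
    bcp.exe NECCOI_DB.dbo.ALL_Stores in $_.FullName -U vm -P 'Madhu#9515' -S BLLT-5CD124JQHQ -c -F 2 -t ','
}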

Pass byte[] as a parameter to a SQL INSERT script

I am trying to upload the byte[] contents of a zip file to my database. I used Get-Content -Encoding Byte -ReadCount 0 to read the data into a variable. I want to use this variable in an INSERT statement. Unfortunately, sqlcmd doesn't like the size of the variable and gives me this error:
Program 'SQLCMD.EXE' failed to run: The filename or extension is too longAt line:1 char:1.
I have tried using the -Q option to run the query, and also -i to run a sql file.
DECLARE @data varbinary(MAX)
SET @data = '$(data_stuff)'
INSERT INTO MyTable
(v1,v2,v3,v4,v5)
VALUES
(v1,v2,v3,v4,@data)
sqlcmd -S servername -E -i .\file.sql -v data = "$binarydata"
Is there a workaround for doing this?
In a SQL query/batch/.sql file, binary/varbinary/image literal data values must be in hexadecimal format with a 0x prefix:
INSERT INTO tableName ( binaryColum ) VALUES ( 0x1234567890ABCDEF )
I don't know what the maximum length of a binary literal is, but I suspect things might stop working, or be very slow, if you exceed more than a few hundred kilobytes.
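For completeness, one hedged way to build such a literal in PowerShell (the path is an example; $bytes holds the file's contents):
# Turn a byte array into a 0x... hex literal suitable for a SQL batch.
$bytes   = [System.IO.File]::ReadAllBytes('C:\Temp\archive.zip')
$literal = '0x' + ([System.BitConverter]::ToString($bytes) -replace '-', '')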
I recommend using ADO.NET directly via PowerShell, which will also let you use binary parameter values (SqlParameter): How do you run a SQL Server query from PowerShell?
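A hedged sketch of that ADO.NET approach, binding the bytes to a varbinary(max) parameter instead of splicing them into the SQL text (server, database, table, and column names are placeholders):
# Insert the zip bytes through a parameter; a size of -1 means varbinary(max).
$bytes = [System.IO.File]::ReadAllBytes('C:\Temp\archive.zip')
$conn  = New-Object System.Data.SqlClient.SqlConnection('Server=servername;Database=MyDb;Integrated Security=SSPI')
$cmd   = $conn.CreateCommand()
$cmd.CommandText = 'INSERT INTO MyTable (v1, v2, v3, v4, v5) VALUES (@v1, @v2, @v3, @v4, @data)'
foreach ($name in 'v1', 'v2', 'v3', 'v4') {
    $null = $cmd.Parameters.AddWithValue("@$name", 'placeholder value')  # substitute real values
}
$param = $cmd.Parameters.Add('@data', [System.Data.SqlDbType]::VarBinary, -1)
$param.Value = $bytes
$conn.Open()
$null = $cmd.ExecuteNonQuery()
$conn.Close()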

Unable to open BCP host data-file

Below is an example of the BCP Statement.
I'm not accustomed to using BCP, so your help and candor are greatly appreciated.
I am using it with a format file as well.
If I execute from CMD prompt it works fine but from SQL I get the error.
The BCP statement is all on one line and the SQL Server Agent is running as Local System.
The SQL server, and script are on the same system.
I ran exec master..xp_fixeddrives, which returned:
C,45589
E,423686
I've tried output to C and E with the same result
EXEC xp_cmdshell 'bcp "Select FILENAME, POLICYNUMBER, INSURED_DRAWER_100, POLICY_INFORMATION, DOCUMENTTYPE, DOCUMENTDATE, POLICYYEAR FROM data.dbo.max" queryout "E:\Storage\Export\Data\max.idx" -fmax-c.fmt -SSERVERNAME -T'
Here is the format file rmax-c.fmt
10.0
7
1 SQLCHAR 0 255 "$#Y#$" 1 FILENAME
2 SQLCHAR 0 40 "" 2 POLICYNUMBER
3 SQLCHAR 0 40 "" 3 INSURED_DRAWER_100
4 SQLCHAR 0 40 "" 4 POLICY_INFORMATION
5 SQLCHAR 0 40 "" 5 DOCUMENTTYPE
6 SQLCHAR 0 40 "" 6 DOCUMENTDATE
7 SQLCHAR 0 8 "\r\n" 7 POLICYYEAR
Due to formatting in this post, the last column of the format file is cut off, but it reads SQL_Latin1_General_CP1_CI_AS for each column other than DOCUMENTDATE.
Does the output path exist? BCP does not create the folder before trying to create the file.
Try this before your BCP call:
EXEC xp_cmdshell 'MKDIR "E:\Storage\Export\Data\"'
First, rule out an xp_cmdshell issue by doing a simple 'dir c:*.*';
Check out my blog on using BCP to export files.
I had problems on my system in which I could not find the path to BCP.EXE.
Either change the PATH variable or hard-code the path.
Example below works with Adventure Works.
-- BCP - Export query, pipe delimited format, trusted security, character format
DECLARE @bcp_cmd4 VARCHAR(1000);
DECLARE @exe_path4 VARCHAR(200) =
    ' cd C:\Program Files\Microsoft SQL Server\100\Tools\Binn\ & ';
SET @bcp_cmd4 = @exe_path4 +
    ' BCP.EXE "SELECT FirstName, LastName FROM AdventureWorks2008R2.Sales.vSalesPerson" queryout ' +
    ' "C:\TEST\PEOPLE.TXT" -T -c -q -t0x7c -r\n';
PRINT @bcp_cmd4;
EXEC master..xp_cmdshell @bcp_cmd4;
GO
Before changing the path to \110\ for SQL Server 2012 and the database name to [AdventureWorks2012], I received the same "Unable to open BCP host data-file" error.
After making those changes, the code works fine from SSMS. The service is running under NT AUTHORITY\Local Service, the SQL Server Agent is disabled, and the output file was created.
Please check whether the file is open in another application or program.
If it is, bcp.exe cannot overwrite the existing file's contents.
In my case, I solved the problem in the following way. My command was:
bcp "select Top 1000 * from abc.dbo.abcd" queryout FileNameWithDirectory -c -t "|" -r "0x0a" -S 192.111.1.111 -U xx -P xxxxx
My FileNameWithDirectory was too long, like "D:\project-abc\R&D\abc-608\FilesNeeded\FilesNeeded\DataFiles\abc.csv".
I changed it to a simpler directory, like "D:\abc.csv".
Problem solved.
So I guess the problem occurred because the file name was too long, and thus the file was not found.
If it works from the command line but not from the SQL Agent, I think it is an authentication issue.
The SQL Server Agent is running under an account. Make sure that account has the ability to read the format file and generate the output file.
Also, make sure the account has the ability to execute the xp_cmdshell stored procedure.
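A quick hedged way to check which account the Agent runs under, from PowerShell (service names vary for named instances):
# List SQL Server Agent services and the accounts they log on as.
Get-CimInstance Win32_Service -Filter "Name = 'SQLSERVERAGENT' OR Name LIKE 'SQLAgent`$%'" |
    Select-Object Name, StartName, State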
Write back with your progress ...
I received this after I shared my output folder, even when there were no files open.
I created a new, unshared folder for output and all was fine.
(might help someone ;-))
In my case, the fix was simply running in administrator mode.
This error can be due to insufficient write permissions to the target folder.
This is a common issue, since the user writing the query might have access to a folder, but the SQL Server Agent or logged-in server account which actually invokes bcp.exe may not.
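If that is the problem, one hedged fix is to grant the invoking account modify rights on the target folder; the path and account below are examples, so substitute your own:
# Grant the service account modify (M) rights, inherited by subfolders (CI) and files (OI).
icacls 'E:\Storage\Export\Data' /grant 'NT SERVICE\SQLSERVERAGENT:(OI)(CI)M'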
The destination path has to exist already (except for the file name itself).
Remove no_output from your command, if you are using it, of course:
SET #sql = 'BCP ....'
EXEC master..xp_cmdshell #sql , no_output
EXEC master..xp_cmdshell #sql
In case anyone else runs into the same problem: I had ...lesPerson" queryout' rather than ...lesPerson" queryout ' (note the missing space before the quote).
If your code is writing the data file, and then reading it with BCP, make sure that you CLOSE THE DATA FILE before trying to read it!
Failure to do so gives: 'Unable to open host data-file'.
Python example:
import csv

# Management of the temporary bulk insert file.
def openBulkInsertFile(self):
    self.bulkInsertFile = open('c:/tmp/bulkInsertContent.txt', 'w', newline='')
    self.csvWriter = csv.writer(self.bulkInsertFile)

def closeBulkInsertFile(self):
    self.bulkInsertFile.close()
When using a job in SQL Server, the account that the SQL Express server uses is the currently logged-in user; you should give that user write permission on the folder where the batch writes its output.
This usually happens only with bcp; when using type commands, ownership goes to the computer (Administrator) and the command runs without problems.
So if you have a long command in your job, just look at the bcp parts.

Cyrillic symbols in SQL code are not stored correctly after insert

I use SQL Server 2005 and I am trying to store Cyrillic characters, but I can't when running this SQL in SQL Server:
INSERT INTO Assembly VALUES('Македонски парлиамент број 1','','');
The same problem happens from C#; however, inserting/updating the column directly in SQL Server works and the value is stored normally.
The datatype of the column is nvarchar.
You have to add the N prefix before your string.
When you implicitly declare a string it is treated as varchar by default. The N prefix denotes that the following string literal is Unicode (nvarchar).
INSERT INTO Assembly VALUES(N'Македонски парлиамент број 1','','');
Here is some reading:
http://databases.aspfaq.com/general/why-do-some-sql-strings-have-an-n-prefix.html
https://msdn.microsoft.com/en-IN/library/ms186939.aspx
What is the meaning of the prefix N in T-SQL statements?
I'm not sure if you are writing a static stored procedure or scripting, but maybe the text is not being encoded properly when you save it to disk. I ran into this, and my problem was solved in PowerShell by correcting the encoding of the SQL that I saved to disk for osql processing:
Out-File -FilePath "MyFile.sql" -InputObject $MyRussianSQL -Encoding "Unicode" -Force;
& osql -U myuser -P password -i "MyFile.sql";