Cyrillic characters in SQL code are not stored correctly after insert - sql

I use SQL Server 2005 and I am trying to store Cyrillic characters, but I can't when I run this SQL code in SQL Server:
INSERT INTO Assembly VALUES('Македонски парлиамент број 1','','');
The same problem happens from C#, but when I insert/update the column directly in SQL Server it works and the value is stored normally.
The datatype of the column is nvarchar.

You have to add the N prefix before your string.
A string literal without the prefix is treated as varchar by default. Adding the N prefix marks the literal as Unicode (nvarchar), so non-ASCII characters are preserved.
INSERT INTO Assembly VALUES(N'Македонски парлиамент број 1','','');
Here is some reading:
http://databases.aspfaq.com/general/why-do-some-sql-strings-have-an-n-prefix.html
https://msdn.microsoft.com/en-IN/library/ms186939.aspx
What is the meaning of the prefix N in T-SQL statements?

I'm not sure whether you are using a static stored procedure or a script, but maybe the text is not being encoded properly when you save it to disk. I ran into this, and my problem was solved in PowerShell by correcting the encoding of the SQL file that I saved to disk for osql processing:
Out-File -FilePath "MyFile.sql" -InputObject $MyRussianSQL -Encoding "Unicode" -Force;
& osql -U myuser -P password -i "MyFile.sql";

Related

Creating a Format File for Bulk Import

I am trying to create a format file to bulk import a .csv file, but I am getting an error.
The command I used:
"BCP -SMSSQLSERVER01.[Internal_Checks].[Jan_Flat] format out -fC:\Desktop\exported data\Jan_FlatFormat.fmt -c -T -Uasda -SMSSQLSERVER01 -PPASSWORD"
I am getting this error:
"A valid table name is required for in, out, or format options."
Can anyone suggest what I need to do?
According to the bcp Utility documentation the first parameter should be a [Database.]Schema.{Table | View | "query"}, so don't put -SMSSQLSERVER01 where you've got it. Also use format nul instead of format out.
Try using:
bcp.exe [Internal_Checks].[Jan_Flat] format nul "-fC:\Desktop\exported data\Jan_FlatFormat.fmt" -c -SMSSQLSERVER01 -T -Uasda -PPASSWORD
Note the quotes " around the -f switch because your path name contains space characters.
Also note that the -c switch causes single-byte characters (ASCII/OEM/codepage with SQLCHAR) to be written out. If your table contains nchar, nvarchar or ntext columns you should consider using the -w switch instead so as to write out UTF-16 encoded data (using SQLNCHAR).
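For example, the same command with -w swapped in might look like this (an untested sketch; the format-file name is a placeholder and only the SQL login switches are shown):
bcp.exe [Internal_Checks].[Jan_Flat] format nul "-fC:\Desktop\exported data\Jan_FlatWideFormat.fmt" -w -SMSSQLSERVER01 -Uasda -PPASSWORD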

SQL Server Bulk Insert Error 7301 "IID_IColumnsInfo"

I'm trying to import data from a CSV file (this will eventually run every day through a stored procedure), but I keep getting the same error.
Msg 7301, Level 16, State 2, Line 16
Cannot obtain the required interface ("IID_IColumnsInfo") from OLE DB provider "BULK" for linked server "(null)".
In the table I'm importing into, I made all the fields nvarchar, each with a length of at least 500 characters, because I thought that might be the problem.
I am exporting this CSV file through PowerShell as follows:
Export-Csv -Path $DirPath -Delimiter ';' -NoTypeInformation -Encoding UTF8
The file has 40 columns and 685 rows. I have already tried saving the CSV file with ',' as the delimiter and with ';' as the delimiter, but both give the same error.
I tried the BULK INSERT in several ways, as below, but without success.
BULK INSERT DEV_DS_SANTOGRAU..GB_TBIMP_FOTOS_CSV
FROM 'C:\Users\userbi\Desktop\Projetos-Santo-Grau\Projeto1-RelatoriodeEstoque\TBIMP_FOTOS_CSV.csv'
WITH (FORMAT = 'CSV',
--MAXERRORS = 0,
--CODEPAGE = '65001',
CODEPAGE = 'ACP',
--FIELDQUOTE = '"',
FIELDTERMINATOR ='";"',
--ROWTERMINATOR ='"\n"',
ROWTERMINATOR = '\r\n',
--ROWTERMINATOR = "0x0a"
FIRSTROW = 2,
ERRORFILE = 'C:\Users\userbi\Desktop\Projetos-Santo-Grau\Projeto1-RelatoriodeEstoque\TBIMP_FOTOS_CSV_ERROS.csv');
When the code above exported the CSV and TXT error files, the data in them looked like this (although it does not look like this in the original file):
What should I do?
I would prefer not to, but if it were possible to skip these records so that the rest of the insert completes, that would be the lesser evil.
Information:
SQL Server 2019 (v15.0.18330.0)
SQL Server Management Objects (SMO) v16.100.37971.0
Microsoft SQL Server Management Studio v18.5
It's usually easier to BULK INSERT data with a format file. Use the bcp.exe utility to create a format file with a command such as the following:
bcp.exe DEV_DS_SANTOGRAU..GB_TBIMP_FOTOS_CSV format nul -c -t; -f C:\Temp\TBIMP_FOTOS_CSV.fmt -S(local) -T
Where:
DEV_DS_SANTOGRAU..GB_TBIMP_FOTOS_CSV is the Database.Schema.Table we're interacting with.
format specifies format file creation mode.
nul specifies the input/output data file, which in this case means "don't write any data".
-c specifies character mode, as opposed to native (binary) mode.
-t; specifies to use ; as the field separator character.
-f C:\Temp\TBIMP_FOTOS_CSV.fmt specifies the path to write the format file to, relative to your local computer.
-S(local) is the SQL Server to connect to, (local) in my case.
-T means trusted authentication (Windows authentication); use -UUsername and -PPassword if you have SQL login authentication instead.
This creates a format file something like the following (yours will have more and different columns):
14.0
2
1 SQLCHAR 0 510 ";" 1 Filename SQL_Latin1_General_Pref_CP1_CI_AS
2 SQLCHAR 0 510 "\r\n" 2 Resolution SQL_Latin1_General_Pref_CP1_CI_AS
Now, in SSMS, you should be able to run something like the following to import your data file (adjust file paths relative to your SQL Server as appropriate):
BULK INSERT DEV_DS_SANTOGRAU..GB_TBIMP_FOTOS_CSV
FROM 'C:\Temp\TBIMP_FOTOS_CSV.csv'
WITH (
CODEPAGE = '65001',
DATAFILETYPE = 'char',
FORMAT = 'CSV',
FORMATFILE = 'C:\Temp\TBIMP_FOTOS_CSV.fmt'
);
-- edit --
On SQL Server and international character support.
SQL Server and UTF-8 has had a bit of a checkered history, only gaining partial support with SQL Server 2016 and really only supporting UTF-8 code pages with SQL Server 2019. Importing and exporting files with international characters is still best handled using UTF-16 encoded files. Adjustments to the workflow are as follows...
In PowerShell, use the Unicode encoding instead of UTF8:
Export-Csv -Path $DirPath -Delimiter ';' -NoTypeInformation -Encoding Unicode
When generating the BCP format file, use the -w switch (for widechar) instead of -c (for char):
bcp.exe DEV_DS_SANTOGRAU..GB_TBIMP_FOTOS_CSV format nul -w -t; -f C:\Temp\TBIMP_FOTOS_CSV-widechar.fmt -S(local) -T
This causes the SQLCHAR columns to be written out as SQLNCHAR, aka. national character support:
14.0
2
1 SQLNCHAR 0 510 ";\0" 1 Filename SQL_Latin1_General_Pref_CP1_CI_AS
2 SQLNCHAR 0 510 "\r\0\n\0" 2 Resolution SQL_Latin1_General_Pref_CP1_CI_AS
When using BULK INSERT, specify DATAFILETYPE = 'widechar' instead of DATAFILETYPE = 'char', and do not specify a code page, e.g.:
BULK INSERT GB_TBIMP_FOTOS_CSV
FROM 'C:\Temp\TBIMP_FOTOS_CSV.csv'
WITH (
DATAFILETYPE = 'widechar',
FORMATFILE = 'C:\Temp\TBIMP_FOTOS_CSV-widechar.fmt'
);

Pass byte[] as parameter to sql insert script

I am trying to upload the byte[] contents of a zip file to my database. I used Get-Content -Encoding Byte -ReadCount 0 to read the data into a variable. I want to use this variable in an INSERT statement. Unfortunately, sqlcmd doesn't like the size of the variable and gives me this error:
Program 'SQLCMD.EXE' failed to run: The filename or extension is too long. At line:1 char:1.
I have tried using the -Q option to run the query, and also -i to run a sql file.
DECLARE @data varbinary(MAX)
SET @data = '$(data_stuff)'
INSERT INTO MyTable
(v1,v2,v3,v4,v5)
VALUES
(v1,v2,v3,v4,@data)
sqlcmd -S servername -E -i .\file.sql -v data = "$binarydata"
Is there a workaround for doing this?
In a SQL query/batch/.sql file, binary/varbinary/image literal data values must be in hexadecimal format with a 0x prefix:
INSERT INTO tableName ( binaryColum ) VALUES ( 0x1234567890ABCDEF )
I don't know what the maximum length of a binary literal is, but I suspect things might stop working, or be very slow, if you exceed a few hundred kilobytes.
I recommend using ADO.NET directly via PowerShell, which will also let you use binary parameter values (SqlParameter): How do you run a SQL Server query from PowerShell?
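A minimal PowerShell sketch of that approach (server, database, table and column names, values, and the file path are placeholders; it assumes Windows authentication and the five-column table from the question):
# Read the zip file into a byte[] and pass it as a varbinary(MAX) parameter.
$bytes = [System.IO.File]::ReadAllBytes('C:\Temp\archive.zip')
$conn = New-Object System.Data.SqlClient.SqlConnection('Server=servername;Database=MyDatabase;Integrated Security=True')
$cmd = $conn.CreateCommand()
$cmd.CommandText = 'INSERT INTO MyTable (v1, v2, v3, v4, v5) VALUES (@v1, @v2, @v3, @v4, @data)'
# Scalar columns (placeholder values).
$null = $cmd.Parameters.AddWithValue('@v1', 'value1')
$null = $cmd.Parameters.AddWithValue('@v2', 'value2')
$null = $cmd.Parameters.AddWithValue('@v3', 'value3')
$null = $cmd.Parameters.AddWithValue('@v4', 'value4')
# Binary column: size -1 means varbinary(MAX).
$dataParam = $cmd.Parameters.Add('@data', [System.Data.SqlDbType]::VarBinary, -1)
$dataParam.Value = $bytes
$conn.Open()
$null = $cmd.ExecuteNonQuery()
$conn.Close()
Passing the bytes as a parameter avoids building a huge hexadecimal literal in the script at all.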

How to insert UTF-8 characters into an Oracle database using the Robot Framework database library

I have a Robot Framework script which runs some SQL statements from a SQL file; some of these statements contain UTF-8 characters. If I run this file manually against the database using the Navicat tool, everything is fine. But when I try to execute the file using the Robot Framework database library, the UTF-8 characters go crazy!
This is my SQL statement containing UTF-8 characters:
INSERT INTO "MY_TABLE" VALUES (2, 'تست1');
This is how I use database library:
Connect To Database Using Custom Params cx_Oracle ${dbConnection}
Execute Sql Script ${sqlFile}
Disconnect From Database
This is what I get in the database:
������������ 1
I have tried executing the SQL file using cx_Oracle directly and it still fails, so the problem seems to be in the underlying library. This is what I used to run the SQL file:
import cx_Oracle

if __name__ == "__main__":
    dsn_tns = cx_Oracle.makedsn(ip, port, sid)
    db = cx_Oracle.connect(username, password, dsn_tns)
    sql_commands = open(sql_file_addr, 'r').read().split(";")
    cr = db.cursor()
    for command in sql_commands:
        if not command in ["", "\t", "\n", "\r", "\n\r", "\r\n", None]:
            print "Executing SQL command:", command
            cr.execute(command)
    db.commit()
I have found that I can define the character set in the connection string. I've done it for a MySQL database and the framework successfully inserted UTF-8 characters into the database; this is my connection string for MySQL:
database='db_name', user='db_username', password='db_password', host='db_ip', port=3306, charset='utf8'
But I don't know how to define the character set in the Oracle connection string. I have tried this:
'db_username','db_password','db_ip:1521/db_sid','utf8'
And I've got this error:
TypeError: an integer is required
As @Yu Zhang suggested, I read the discussion in this link and found out that I should set the NLS_LANG environment variable in order to have a UTF-8 connection to the database. So I added the line below to my test setup:
os.environ["NLS_LANG"] = "AMERICAN_AMERICA.AL32UTF8"
Would any of the links below help?
http://docs.oracle.com/cd/B19306_01/server.102/b14225/ch6unicode.htm#i1006779
http://www.theserverside.com/news/thread.tss?thread_id=39575
https://community.oracle.com/thread/502949
There can be several problems here...
The first problem might be that you don't save the test files using UTF-8 encoding.
Robot Framework expects plain text test files to be saved using UTF-8 encoding, yet most text editors will not save in UTF-8 by default.
Verify that your editor saves that way - for example, by opening the file in Notepad++ and choosing Encoding -> UTF-8.
Another problem might be the connection to the Oracle database. It doesn't seem like you can configure the connection's custom properties to explicitly state UTF-8.
This means you probably need to make sure that the database schema itself is UTF-8.

DB2 database stores/reads umlauts and special chars wrong

I create my database with the following command:
db2 create database kixfs using codeset UTF-8 territory AT
My insert scripts (DMLs) are encoded in UTF-8.
Example insert statement from resources.dml:
INSERT INTO RESOURCES (RESOURCEKEY, LANGUAGE, CATEGORY, RESOURCEVALUE) VALUES ('XXXX', 'de', 'action', 'Funktion "Gerätemodell erfassen" erfolgreich ausgeführt.');
If I check the table content after running the inserts:
Fehler bei der Ausführung der Funktion "Gerätemodell erfassen".
If I check the database configuration, everything looks fine (db2 get db cfg for MY_DB).
Any ideas why the data is stored or read incorrectly?
Edit:
I execute the insert script via a batch file from the db2 admin console (CLP):
db2 -t -v -f resources.dml +o -z createTablesViews.log
Could it depend on the encoding of the db2 terminal? And if yes, how do I change it?
We had the same problem, also with a script. The script encoding has to match the server encoding, in our case ANSI, even if the database is UTF-8. Converting the script file to ANSI did the trick for us.
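If you want to script that conversion, a minimal Windows PowerShell sketch (file names are placeholders; -Encoding Default writes the system ANSI code page):
Get-Content -Path .\resources.dml -Encoding UTF8 | Out-File -FilePath .\resources-ansi.dml -Encoding Default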