Error on converting Persian characters from Excel to SQL

I'm trying to convert an Excel file to SQL, but I'm getting the error below.
When I change the On Truncation value to "Ignore", the conversion completes, but the Persian characters show up as "?".

Make sure that all your destination columns are NVARCHAR if they are going to accept Unicode text.
See here for a detailed explanation: What is the difference between varchar and nvarchar?
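A minimal sketch of the point (the table and sample data here are hypothetical): an NVARCHAR column plus an N'...' literal keeps Persian text intact, while a VARCHAR column or an unprefixed literal degrades it to "?":

-- NVARCHAR stores Unicode; a VARCHAR column would force the text through
-- a non-Unicode code page and the Persian characters would become "?"
CREATE TABLE dbo.People
(
    Id       INT IDENTITY(1,1) PRIMARY KEY,
    FullName NVARCHAR(100) NOT NULL
);

-- The N prefix marks the literal as Unicode; without it the value is
-- converted through the database code page before it reaches the column
INSERT INTO dbo.People (FullName) VALUES (N'سلام دنیا');

SELECT FullName FROM dbo.People;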

As you know, importing Excel files into SQL Server is done by the Access engine, and in most of the cases where you see this error, it can be solved by the following method:
First, create an Access file and save it in MDB format (a format SQL Server can read),
then import your Excel file into this Access .mdb file. Once all your data is in place, you can import this MDB file from SQL Server into your database.
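For that last step, a sketch of pulling the .mdb contents into a table (the paths and names are assumptions; it also assumes ad hoc distributed queries are enabled and the Jet provider is installed):

-- Read the intermediate Access file; keep the destination columns NVARCHAR
-- so the Persian text survives the final hop into SQL Server
SELECT *
INTO dbo.ImportedFromExcel
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'C:\Data\Access.mdb'; 'Admin'; '',
                'SELECT * FROM Sheet1');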

Related

Exporting SQL Server table containing a large text column

I have to export a table from a SQL Server instance; the table contains a column holding large text content, with lengths going up to 100,000 characters.
When I use Excel as an export destination, I find that the text is capped and truncated to 32,765 characters.
Is there an export format that preserves the full length?
Note:
I will eventually be importing this data into another SQL Server
The destination SQL Server is in another network, so linked servers and other local options are not feasible
I don't have access to the actual server, so generating a backup is difficult
As documented in the Excel specifications and limits, the maximum number of characters that can be stored in a single Excel cell is 32,767, hence your data is being truncated.
You might be better off exporting to a CSV; however, note that quote-identified CSV files aren't supported by bcp/BULK INSERT until SQL Server 2019 (currently in preview). You can use a character sequence like || to denote a field delimiter, but if you have any line breaks you'll need to choose a different row delimiter too. SSIS and other ETL tools, however, do support quote-identified CSV files, so you could use something like that.
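A minimal sketch of that delimiter approach (the file path, table name, and terminators are assumptions, and the destination table must already exist):

-- Load a ||-delimited export; the row terminator must be something that
-- never appears inside the large text values (0x0a is a bare line feed)
BULK INSERT dbo.LargeTextImport
FROM 'C:\export\large_text.txt'
WITH
(
    FIELDTERMINATOR = '||',
    ROWTERMINATOR   = '0x0a',
    DATAFILETYPE    = 'widechar'  -- UTF-16 file, so Unicode text survives
);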
Otherwise, if you need to export such long values and want to use Excel as much as you can (which I personally don't recommend, due to those awful ACE drivers), I would suggest exporting the (n)varchar(MAX) values to something else, like a text file, naming each file with the value of your primary key. Then, when you import the data back, you can retrieve the (n)varchar(MAX) value for each row from its individual file.
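One way to produce those per-row files is bcp's queryout mode; a sketch, where the server name, database, table, and key value are all assumptions:

rem One file per primary key value; -w writes UTF-16 so Unicode text survives
bcp "SELECT LongText FROM MyDb.dbo.BigTable WHERE Id = 42" queryout C:\export\42.txt -w -T -S MYSERVER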
A .sql script is the best format for a SQL table. It's the native format for a SQL table, so you don't have to convert anything on export.

Japanese ANSI character in CSV file

I have a CSV file generated from a Japanese source system. The Japanese characters are shown as below: ¬¼ˆã—Ê튔Ž®‰ïŽÐ ‘åã‰c‹ÆŠ. I have changed the file type to UTF-8 and adjusted the ETL setting to match, but that only works for new data.
How can I change the existing data in my table, which shows characters like ‘åã‰c‹ÆŠ?
Is it possible to get the original Japanese characters back using SQL functions? I am using SQL Server as the database.
Thanks in advance.

How to export Japanese characters from SQL Server 2008 database to a text file?

I have an odd problem. I need to export Japanese characters from a table to a raw text file. When I run the SELECT statement in SQL, I can see the characters displayed correctly. However, when I run the SSIS package to export these values to a text file, they come out as ?'s.
The data type of the field is NTEXT. Has anyone run into this problem before?
SQL statement:
select cast(body as nvarchar(max)) as body from msgsMarket
In the SSIS package's flat file connection manager, I have set the output encoding to code page 932.
This is not a solution, but it might help you identify the problem in your case.
I created a sample SSIS package using SSIS 2008 R2 with UTF-8 and Unicode encodings, and the SQL Server data exported correctly to flat files.
In the sample data, the Description field was of data type NVARCHAR. The sample was also tried with the Description field changed to NTEXT, and the flat files still exported correctly.
The SSIS package was created with a data flow task with two outputs. The first flat file connection manager generated a flat file with UTF-8 encoding; the second generated a flat file with Unicode encoding. Both output files came out correct.

SQL Server 2005 -> Excel export doesn't keep data types?

Trying (and largely succeeding) to export the results of a query from SQL Server to Excel, like so:
insert into OPENROWSET('Microsoft.Jet.OLEDB.4.0',
'Excel 8.0;Database=c:\exported excel files\exported_data.xls;',
'SELECT * FROM [Query$]') SELECT dbo.blabbityblah FROM dbo.the_table
It works! Sort of. It does export the data to the Excel file, but it puts it all in there as text, even though some of the columns are datetime and most of them are numbers. None of them are being convert()-ed in the query itself. I've tried preformatting the cells in the actual Excel file before running the query, but it ignores the existing formatting and spits it all out as text again.
There's got to be a way to do this, right?
Excel doesn't really have column data types; the export is text-based, and preformatting doesn't work because the export replaces the existing file. If you want data types preserved, try MS Access.
Look into using a schema.ini file to define the data types in a CSV or TXT file; when you open either in Excel, you may achieve what you want:
[sample_out.csv]
Format=CSVDelimited
DecimalSymbol=.
Col1=DATE datetime
Col2=FName Text
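With that schema.ini sitting in the same folder as the file, a sketch of reading the CSV back with typed columns through the Jet text driver (the folder path is an assumption, and the 32-bit Jet provider must be available):

-- schema.ini in the Database folder supplies the column names and types
SELECT *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Text;Database=C:\exported excel files\;',
                'SELECT * FROM [sample_out.csv]');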
Another approach you may want to look at, depending on your needs, is the Import and Export Wizard. You can customize a query for the data and specify the data types in the wizard. If you are using a SKU other than Express, you can either run it right away or save the SSIS package it generates for further manipulation.

BCP utility to create a format file, to import Excel data to SQL Server 2008 for BULK insertion

I am trying to import Excel 2003 data into a SQL table in SQL Server 2008.
I tried to add a linked server, but have met with little success.
Now I am trying to check whether there's a way to use the BCP utility to do a BULK INSERT or a BULK operation with OPENROWSET, using a format file to get the Excel mapping.
First of all, how can I create a format file for a table that has differently named columns than the Excel spreadsheet columns?
Next, how do I use this format file to import data from, say, a file at C:\Folder1\Excel1.xls
into table Table1?
Thank you.
There are some examples here that demonstrate what the data file should look like (CSV) and what the format file should look like. Unless you need to do this a lot, I'd just hand-craft the format file, save the Excel data to CSV, then try using bcp or OPENROWSET.
The format file specifies the column names for the destination. The data file doesn't have column headings, so you don't need to worry about the Excel (source) columns being named differently.
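As a sketch (the database, table, column names, widths, and server are all assumptions), a hand-crafted non-XML format file for a two-column Table1 might look like this:

10.0
2
1  SQLCHAR  0  100  ","      1  CustomerName  SQL_Latin1_General_CP1_CI_AS
2  SQLCHAR  0  50   "\r\n"   2  City          SQL_Latin1_General_CP1_CI_AS

The first line is the bcp version (10.0 for SQL Server 2008), the second is the field count, and each remaining row maps a field in the data file to a destination column by position and name. Then, after saving the spreadsheet as C:\Folder1\Excel1.csv:

bcp MyDb.dbo.Table1 in C:\Folder1\Excel1.csv -f C:\Folder1\Table1.fmt -T -S MYSERVER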
If you need to do more mapping etc, then create an SSIS package. You can use the data import wizard to get you started, then save as SSIS package, then edit to your heart's content.
If it's a one-off, I'd use the SQL data import wizard, from right-clicking the database in Management Studio. If you just have a few rows to import from Excel, I typically open a query to Edit Top 200 Rows, edit the query to match the columns I have in Excel, then copy and paste the rows from Excel into Management Studio. It doesn't handle errors very well, but it's quick.