Issues with special characters from IBM DB2 to SQL Server in SSIS

I am trying to pull data from IBM DB2 into a SQL Server table via SSIS.
However, the source table contains a special character in one of its columns, like below:
"?Ǫ. note pads wearing thin"
As a result, my package fails with the following error:
[OLE DB Source [33]] Error: There was an error with OLE DB
Source.Outputs[OLE DB Source Output].Columns[VTTEXT] on OLE DB
Source.Outputs[OLE DB Source Output]. The column status returned was:
"Text was truncated or one or more characters had no match in the
target code page.".
The data is being loaded into a VARCHAR column in SQL Server, but the error seems specific to my data source.
How can I get around it?

While an nvarchar column can store any Unicode data, a varchar column is restricted to a single 8-bit code page. Code page incompatibilities like this are a common problem, and switching to Unicode is the usual cure. Change the destination column to NVARCHAR:
ALTER TABLE yourtable
ALTER COLUMN thecolumn NVARCHAR(255);
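If you want to see which rows are actually affected, a quick check like the one below can help. This is just a sketch run on the SQL Server side, assuming the column has already been widened to NVARCHAR(255) as above (yourtable and thecolumn are the same placeholders):
-- rows whose text does not survive a round trip through the 8-bit code page
SELECT thecolumn
FROM yourtable
WHERE thecolumn <> CAST(CAST(thecolumn AS VARCHAR(255)) AS NVARCHAR(255));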

Related

Error when trying to load data from an OData Source Editor to a SQL db table

Several Errors of the same type when trying to load a SQL db table from an OData SharePoint connection.
[GFXBankAccountProcessing - DB GFX Account List [2]] Error: An error occurred while setting up a binding for the "BLCompanyID" column. The binding status was "DBBINDSTATUS_UNSUPPORTEDCONVERSION". The data flow column type is "DT_NTEXT". The conversion from the OLE DB type of "DBTYPE_IUNKNOWN" to the destination column type of "DBTYPE_WVARCHAR" might not be supported by this provider.
It is expected to load the table and proceed to the next step in the process. I believe it has something to do with data conversion, but I am not sure what to convert the data to. I have tried to compare the data types with what the destination table requires. The column is NVARCHAR, so I am not sure why it would fail. BLCompanyID is only one column; some of the other columns are succeeding while that one and a few others are failing.
The problem is that the data type you are trying to store into the destination table is not the correct format. For instance, if the destination table requires an nvarchar(255) and you're trying to insert a DT_NTEXT, it will fail. You will need to convert the column to DT_WSTR with a length of 255.
Here's a quick reference that I have bookmarked to help me:
http://wiki.melissadata.com/index.php?title=FAQ%3ASSIS%3AData_Type_Conversions
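For reference, this is roughly how the SSIS types line up with SQL Server types at the destination end (the table and column definition below are illustrative, not taken from the question): DT_WSTR(n) binds to NVARCHAR(n), while DT_NTEXT binds to NVARCHAR(MAX)/NTEXT. So once the data flow column is converted to DT_WSTR with a length of 255, a destination column like this accepts it without the unsupported-conversion binding:
-- hypothetical destination table
CREATE TABLE dbo.GFXAccountList_Staging
(
    BLCompanyID NVARCHAR(255) NULL   -- matches DT_WSTR with length 255
);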

SSIS error when adding excel destination

I have an SSIS package with data in a SQL Server 2012 table. I have added an Excel destination and get the error:
There is no sufficient information about mapping ssis types to types of the selected .net data provider. As a result you may need to modify the default types of the SQL statement on the next screen
Code:
CREATE TABLE `Excel Destination`
(
`name` VARCHAR(50),
`date` DATETIME
)
It doesn't like the 'name' column. I have added a Data Conversion task, but the 'name' column is already set to Unicode string, so I'm not sure why I get a message about converting between non-Unicode and Unicode.
Any advice would be welcome.
Please check whether your target sheet is empty: this error occurs when there are no columns defined in it. You have to write the column names in the first row of the target sheet.
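If the sheet does have a header row and the Unicode message persists, another thing to try (assuming you control the source query; the column names below are taken from the CREATE TABLE above, the source table name is a placeholder) is to cast the string column to NVARCHAR in the OLE DB Source so it already arrives as a Unicode string (DT_WSTR):
SELECT CAST([name] AS NVARCHAR(50)) AS [name],
       [date]
FROM dbo.YourSourceTable;   -- placeholder source table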

[Excel Destination [28]] Error: An error occurred while setting up a binding for the column. The binding status was "DT_NTEXT"

I'm working on an SSIS package which exports data from SQL Server to Excel. I had a problem converting non-Unicode to Unicode string data types, so I created a Derived Column task and converted four columns, which are varchar(40) in the SQL Server table, to Unicode string [DT_WSTR]. That worked for those columns. But I also have a Description column of type varchar(max), and when I tried to convert it to Unicode text stream [DT_NTEXT] it did not work.
If your source is SQL Server (as you said), you can convert it directly in your SQL query:
SELECT
    CONVERT(NVARCHAR(40), att1) AS att1,
    CONVERT(NTEXT, att2) AS att2
FROM yourtable;
Convert your VARCHAR into NVARCHAR
Convert your TEXT into NTEXT
It's faster.
P.S. To test it, do not forget to delete or reset your previous OLE DB Source component, so that it is forced to re-evaluate your data types.
Does that help?
The only thing that worked was to cast the Description column in the stored procedure as varchar(1000). I checked the max length of this field and it was about 300 characters, so I made it varchar(1000) and used Unicode string [DT_WSTR] in the Derived Column. This was a workaround, but I still want to know how to do it in the SSIS package without converting the data type in the stored procedure.
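For reference, the stored procedure side of that workaround is essentially just a cast (the table name below is a placeholder; only the CAST and its length matter):
SELECT
    CAST([Description] AS VARCHAR(1000)) AS [Description]   -- then converted to DT_WSTR(1000) in the Derived Column
FROM dbo.YourSourceTable;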

Can't convert String to Numeric/Decimal in SSIS

I have five or six OLE DB Sources with a String [DT_STR] column, length 500, code page 1252 (Latin).
The values in the column look like 0,08 or 0,10, etc. As you can see, the decimal separator is a comma.
All of the sources are equal except one. In that one source the decimal separator is a POINT, and it works when I set the data type in the Advanced Editor of the OLE DB Source. On another source (comma-separated) it also works if I set the data type in the Advanced Editor. BUT the weird thing is that it isn't working with the other sources, although they are the same (separated with a comma).
I tested Numeric(18,2) and decimal(2).
Attempts to solve the problem with a Data Conversion task and/or a Derived Column task also failed.
I'm using SQL Server 2008 R2
I'm slowly starting to think SSIS is fooling me :)
Has anyone an idea?
/// EDIT
Two screenshots were attached here: one of the OLE DB Source configuration that works and one of the configuration that doesn't.
I would not set the data type in the Advanced Editor of the OLE DB Source. I would convert the data in the SQL of the OLE DB Source, or in a Script Transformation, e.g. using Decimal.TryParse, which would populate a new column.
SSIS is unbelievably fussy over data types, and trying to mess with its internals is not productive.
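A minimal sketch of the source-query approach, using placeholder table/column names and assuming the comma decimal separator from the question (SQL Server 2008 R2 has no TRY_CONVERT, so genuinely non-numeric values will still raise an error):
SELECT CAST(REPLACE(the_value, ',', '.') AS NUMERIC(18,2)) AS the_value   -- swap comma for point, then cast
FROM dbo.SourceTable;   -- placeholder names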
Check whether there are any spaces between the commas; SSIS may be throwing the error while trying to convert a blank space to a number. A blank space is not the same as an empty value.
Redirect the error rows and output the data to a file. Then you can examine the data that is being rejected by SSIS and determine why it's causing the error.
Reasons for the error:
1) NULLs are not properly handled, either in the destination database or during SSIS package creation. It is quite possible that the source contains NULL data but the destination does not accept NULLs, which generates the above error.
2) Data types between source and destination do not match. For example, the source column has varchar data and the destination column has an int data type. This can easily generate the above error. Certain data types will convert to another data type automatically without generating the error, but incompatible data types will generate the "The value could not be converted because of a potential loss of data" error.
The issue arises when there is an unhandled space or NULL. I have worked around it using the conditional (ternary) operator, which checks the length:
LEN(TRIM([Column Name])) >= 1 ? (DT_NUMERIC,38,8)[Column Name] : 0
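A rough T-SQL equivalent of that expression, if you would rather handle it in the source query (placeholder names; the REPLACE is only needed when the decimal separator is a comma, as in the question above):
SELECT CASE
           WHEN LEN(LTRIM(RTRIM(the_value))) >= 1
               THEN CAST(REPLACE(the_value, ',', '.') AS NUMERIC(38,8))
           ELSE 0
       END AS the_value
FROM dbo.SourceTable;   -- placeholder table name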

Conversion Failed Due To Data Overflow (Numeric)

I am trying to move data from a .dbf file to a table in SQL Server 2008 and am getting the following error on multiple numeric columns:
OLE DB provider "MSDASQL" for linked server "(null)" returned message "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
Msg 7341, Level 16, State 2, Line 1
Cannot get the current row value of column "[MSDASQL].apryr" from OLE DB provider "MSDASQL" for linked server "(null)". Conversion failed because the data value overflowed the data type used by the provider.
It only happens on numeric columns and not on every numeric column. Character data is fine and there is no date/time data that could give any issues.
Here is a sample of the code I'm using:
insert into [table] select * from OPENROWSET('MSDASQL',
'DRIVER=Microsoft Visual FoxPro Driver;
SourceDB=[filepath];
SourceType=DBF',
'select *
from [file].dbf');
Since the data in the dbf file is customer data, I've been told I can't manually fix the garbage data in the file (assuming there is any) and everything has to be done through the SQL code. I have searched around the internet and haven't really found a solution to this problem. I'd appreciate any help.
Thank you.
Without knowing more specifics, the situation sounds simple enough: there is data in the dbf file that does not match the data types in your SQL Server table. If that is the case, then you have two options:
Change your SQL Server table to accommodate the data in your dbf file.
Do not import data from the dbf file that is causing the issue.
In option #1, you could change restrictive numeric or date-type fields to varchar or nvarchar fields. Then you would want to modify any programs that assume certain data types in the dbf file so they can accommodate varchar or nvarchar data. For instance, you could use try/catch-style logic that tests the conversion of the data before letting a program use it.
If you decide to go with option #2, you can change your select query to filter out data that does not meet the field requirements of your SQL Server table(s).
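One way to act on option #1 without touching the dbf file is to bring the offending column across as text in the pass-through query and convert or clean it afterwards on the SQL Server side. A rough sketch reusing the placeholders from the question (STR() is a Visual FoxPro function; the length, decimals, and the staging table are assumptions, and it is worth verifying that your ODBC driver accepts STR() in a pass-through query):
insert into [staging_table] select * from OPENROWSET('MSDASQL',
'DRIVER=Microsoft Visual FoxPro Driver;
SourceDB=[filepath];
SourceType=DBF',
'select str(apryr, 10, 2) as apryr_txt from [file].dbf');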
Good luck!
Check your field types on the SQL Server table. Maybe some of them are unable to hold your DBF's numeric (BCD) values.