I'm using SSIS to import Salesforce data into SQL Server. In SF, I have ntext fields that contain values like "Live" / "Not Live", which could easily be represented by a bit field in SQL Server.
Is there a way to convert these ntext fields to Boolean values using SSIS? I have tried using a Derived Column transformation and get the following errors:
[Insert Destination [807]] Error: SSIS Error Code DTS_E_OLEDBERROR.
An OLE DB error has occurred. Error code: 0x80040E21. An OLE DB record
is available. Source: "Microsoft SQL Server Native Client 10.0"
Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation
generated errors. Check each OLE DB status value, if available. No
work was done.".
[Insert Destination [807]] Error: SSIS Error Code
DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "input "OLE DB Destination
Input" (820)" failed because error code 0xC020907B occurred, and the
error row disposition on "input "OLE DB Destination Input" (820)"
specifies failure on error. An error occurred on the specified object
of the specified component. There may be error messages posted
before this with more information about the failure.
[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The
ProcessInput method on component "Insert Destination" (807) failed
with error code 0xC0209029 while processing input "OLE DB Destination
Input" (820).
The identified component returned an error from the ProcessInput method.
The error is specific to the component, but the error is fatal and will cause
the Data Flow task to stop running. There may be error messages posted
before this with more information about the failure.
Any help would be greatly appreciated.
NTEXT is your Unicode text stream. You can't do much with a stream, so we'll need to make it into something more manageable like DT_STR or DT_WSTR. Whether you have internationalization in your text data is something only you will know. Either way, connect a Data Conversion Transformation to your SF source and convert that data into a non-stream type (give it a max width). The goal of this operation is to take the stream data and turn it into straight text values (Live, Not Live). I'm assuming this column will be called IsLiveString.
Now that you're dealing with a string type, add your Derived Column Transformation to the Conversion output; here is where you will need an expression to determine whether the supplied values from the Data Conversion task evaluate to true/false. Even if SQL Server understood that Live translates to a 1 (true), there is no way in heck I'd want to rely on that magic working forever. Instead, I'd create a new column, IsLiveBoolean, with an expression like [IsLiveString] == "Live" ? TRUE : FALSE. That expression is approximate; I'm not sitting in front of an instance of SSDT/BIDS. It could also be simplified by dropping the ternary in favor of the bare equality check. If you need to deal with NULL values in the IsLiveString column, though, the ternary syntax makes that evaluation easier (e.g. ISNULL([IsLiveString]) ? FALSE : [IsLiveString] == "Live").
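If your source component lets you write the extraction query yourself, the whole conversion can also be pushed into the query instead of using the two transformations. A minimal T-SQL sketch, with a hypothetical dbo.SourceTable and ntext Status column standing in for the Salesforce data:

    -- Hypothetical names; cast the ntext stream to a fixed-width string,
    -- then map "Live" to 1 and anything else (including NULL) to 0
    SELECT CASE WHEN CAST(Status AS nvarchar(50)) = N'Live'
                THEN 1 ELSE 0
           END AS IsLiveBoolean
    FROM dbo.SourceTable;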
Edit
A picture's worth a thousand words, so consider this 3k and some change. This represents the actions in your data flow. I used a simple query to POC your example: I generate a column of data type ntext with values of "Live", "Not Live", and a NULL. From the source, I use the Data Conversion task as described above. I went with DT_WSTR to make this answer apply to the broadest possible audience and left the length at the default of 50. To optimize your memory usage, you'd want to decrease the length to match the longest possible value from the source system.
I configured my Derived Column transformation thusly. Options are shown for dealing with NULLs; if you know your data is NULL-free, then the first will work.
Results: you can observe that this correctly converts the strings into their corresponding Boolean counterparts. Those would then be piped into your destination component.
Related
Several Errors of the same type when trying to load a SQL db table from an OData SharePoint connection.
[GFXBankAccountProcessing - DB GFX Account List [2]] Error: An error occurred while setting up a binding for the "BLCompanyID" column. The binding status was "DBBINDSTATUS_UNSUPPORTEDCONVERSION". The data flow column type is "DT_NTEXT". The conversion from the OLE DB type of "DBTYPE_IUNKNOWN" to the destination column type of "DBTYPE_WVARCHAR" might not be supported by this provider.
It is expected to load the table and proceed to the next function within the process. I believe it has something to do with the data conversion, but I am not sure what to convert the data to. I have tried to compare the data types that the table in the DB requires. The column is NVARCHAR, but I am not sure why it would fail. BLCompanyID is only one column; some of the other columns are succeeding while that one and a few others are failing.
The problem is that the data type you are trying to store into the destination table is not in the correct format. For instance, if the destination table requires an nvarchar(255) and you're trying to insert a DT_NTEXT, it will fail. You will need to convert the column to DT_WSTR with a length of 255.
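In T-SQL terms, that Data Conversion step is the equivalent of the cast below; a minimal sketch (hypothetical temp table) just to illustrate the mapping of DT_NTEXT to ntext and DT_WSTR(255) to nvarchar(255):

    -- Illustration only: DT_NTEXT corresponds to ntext and
    -- DT_WSTR(255) to nvarchar(255)
    CREATE TABLE #Source (BLCompanyID ntext);
    INSERT INTO #Source VALUES (N'ABC123');

    SELECT CAST(BLCompanyID AS nvarchar(255)) AS BLCompanyID
    FROM #Source;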
Here's a quick reference that I have bookmarked to help me:
http://wiki.melissadata.com/index.php?title=FAQ%3ASSIS%3AData_Type_Conversions
I have five or six OLE DB Sources with a String [DT_STR] column, with a length of 500 and 1252 (Latin) as the code page.
The format of the column is like 0,08 or 0,10, etc. As you can see, the decimal separator is a comma.
All of them are equal except one. In that one source, I have a POINT as the decimal separator, and it works when I set the data type in the Advanced Editor of the OLE DB Source. On another (comma-separated) source it also works if I set the data type in the Advanced Editor of the OLE DB Source. BUT the weird thing is that it isn't working with the other sources, although they are the same (separated with a comma).
I tested Numeric(18,2) and decimal(2).
Another attempt to solve the problem with the Data Conversion task and/or the Derived Column task also failed.
I'm using SQL Server 2008 R2.
Slowly, I'm starting to think SSIS is fooling me :)
Does anyone have an idea?
/// EDIT
Two screenshots were attached here: one showing the source where it is working, and one showing the source where it isn't.
I would not set the data type in the Advanced Editor of the OLE DB Source. I would convert the data in the SQL code of the OLE DB Source, or in a Script Transformation, e.g. using Decimal.TryParse, which would populate a new column.
SSIS is unbelievably fussy over data types, and trying to mess with its internals is not productive.
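For example, if the source is a SQL Server database, the conversion can be pushed into the source query itself. A minimal sketch with a hypothetical Amount column and table name (no TRY_CONVERT, since that requires SQL Server 2012+):

    -- Hypothetical names; normalize the comma decimal separator in the
    -- source query so the pipeline only ever sees one numeric format
    SELECT CAST(REPLACE(Amount, ',', '.') AS decimal(18, 2)) AS AmountDecimal
    FROM dbo.SourceTable;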
Check whether there are any spaces in between the commas; SSIS may be throwing an error trying to convert a blank space to a number. A blank space is not the same as an empty value.
Redirect the error rows and output the data to a file. Then you can examine the data that is being rejected by SSIS and determine why it's causing the error.
Reasons for the error
1) NULLs are not properly handled, either in the destination database or during SSIS package creation. It is quite possible that the source contains NULL data but the destination is not accepting it, which generates the above error.
2) Data types between source and destination do not match. For example, the source column has varchar data and the destination column has an int data type. This can easily generate the above error. Certain data types will automatically convert to another data type without generating an error, but incompatible data types will generate the "The value could not be converted because of a potential loss of data." error.
The issue arises when there is an unhandled space or NULL. I have worked around it using the conditional (ternary) operator, which checks the length:
LEN(LTRIM(RTRIM([Column Name]))) >= 1 ? (DT_NUMERIC,38,8)[Column Name] : 0
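The same guard can be written on the SQL side when the source allows it; a minimal sketch with hypothetical column and table names, treating blank or NULL as zero:

    -- Hypothetical names; treat blank or NULL as 0, otherwise convert
    SELECT CASE
               WHEN [Column Name] IS NULL
                    OR LTRIM(RTRIM([Column Name])) = '' THEN 0
               ELSE CAST([Column Name] AS numeric(38, 8))
           END AS ColumnValue
    FROM dbo.SourceTable;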
I have a SQL Server 2005 SP2 database which has a table with a poc_resp_city attribute which is nvarchar(35).
It was changed to nvarchar(80) 2 months ago without aligning the very same attribute in the data warehouse (which still has nvarchar(35)).
The SSIS data loading package (after two months of working properly) now fails every time I run it with the following error:
There was an error with output column "poc_resp_city" (2250) on output
"OLE DB Source Output" (11). The column status returned was: "Text was
truncated or one or more characters had no match in the target code
page.". There was an error with output column "poc_resp_city"
(2250) on output "OLE DB Source Output" (11). The column status
returned was: "Text was truncated or one or more characters had no
match in the target code page.".
SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on
component "Source Table" (1) returned error code 0xC020902A. The
component returned a failure code when the pipeline engine called
PrimeOutput(). The meaning of the failure code is defined by the
component, but the error is fatal and the pipeline stopped executing.
There may be error messages posted before this with more information
about the failure.
Neither the package nor the databases were modified regarding this issue. I know that I could ignore this error, or I could make arrangements to get it working, but I want a proper and acceptable answer as to why this error appears 2 months after the modification. Maybe I'm missing an important step in this situation.
Important note: I don't have even a single record with more than 35 characters, so truncation never occurs (this error comes from some kind of SSIS validation step).
Now I think that maybe, after a period of time, the SSIS package recompiles itself and now sees this misalignment in its metadata (35 ≠ 80), and because the TruncationRowDisposition property is set to RD_FailComponent, it fails the component.
And I would exclude the code page option, because every database column is nvarchar, not varchar, so that shouldn't be the case.
Thanks!
You need to refresh the size of the column:
Right-click the OLE DB Source -> Show Advanced Editor
Choose the Input and Output Properties tab -> OLE DB Source Output -> Output Columns
In the right panel, enter your new size in the Length row.
Click OK.
Or you can copy your query from the OLE DB Source, delete the OLE DB Source, insert a new OLE DB Source, and paste the query back in. This will automatically refresh your columns.
Just remember that there are probably more elements in the data flow where you need to edit the length of your column, like a Data Conversion...
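Alternatively, you can make the new width explicit in the source query so the refreshed metadata matches the resized column; a minimal sketch using poc_resp_city from the question (the table name is hypothetical):

    -- Explicit cast so the source metadata reports nvarchar(80),
    -- matching the resized column in the source database
    SELECT CAST(poc_resp_city AS nvarchar(80)) AS poc_resp_city
    FROM dbo.SourceTable;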
Good afternoon,
I have a dilemma that I cannot figure out. I am trying to import a flat file into a SQL Server table but am having issues. My column in SQL is a datetime column for date of birth (DOB). The extraction flat file provided to me has this column as a date... thus when I import into SQL I am getting:
Messages
Error 0xc0202009: Data Flow Task 1: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0" Hresult: 0x80004005 Description: "Invalid date format".
(SQL Server Import and Export Wizard)
Error 0xc020901c: Data Flow Task 1: There was an error with input column "DOB" (212) on input "Destination Input" (147). The column status returned was: "Conversion failed because the data value overflowed the specified type.".
(SQL Server Import and Export Wizard)
Error 0xc0209029: Data Flow Task 1: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "input "Destination Input" (147)" failed because error code 0xC020907A occurred, and the error row disposition on "input "Destination Input" (147)" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
(SQL Server Import and Export Wizard)
I am trying to figure out how to add a default time like 00:00:00.000 to every DOB in the flat file. For example, they send me 1983-11-30 but I want to use 1983-11-30 00:00:00.000.
Is there a way to do this, or does anyone have an idea of what I can do? Thanks in advance.
My situation was a little different, but I feel posting about it could help others out there. Unlike the OP, I was importing data from an Access 2000 .mde file into SQL Server Management Studio, but similar to the OP I ran into the "Invalid date format" error. I knew the conflict was caused by the fact that the columns of type DateTime in the Access file conflicted with the datetime format in SQL Server.
Solution: After I went through the import data process, which is done by right-clicking the server name in SQL Server Management Studio -> Tasks -> Import Data and then choosing the source (Access), destination (SQL Server), and table to be imported, I noticed that 0 rows were transferred because of the type issue. However, a table was created for me in SQL Server Management Studio that represented the table I was trying to import from Access. It had the right columns and their associated data types.
So what I did was right-click the table to go into Design view and change all the data types of the DateTime columns to just date.
Then I redid the import data process, and the rows were successfully transferred.
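The same column change can be scripted instead of using the Design view; a minimal sketch, with a hypothetical dbo.ImportedTable and DOB column:

    -- Hypothetical names; switch the column from datetime to date
    -- (the table is still empty at this point, so no data is at risk)
    ALTER TABLE dbo.ImportedTable
    ALTER COLUMN DOB date;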
MSSQL will have no problem converting a yyyy-MM-dd string like yours to a datetime; you'll end up with 00:00:00.000 automatically. The error message you posted, "Conversion failed because the data value overflowed the specified type", indicates that one of the dates is outside the valid range. You most likely got an invalid date of birth (look for a 0000-00-00 record or similar). If the flat file is delimited in a way Excel can easily parse, pull it into Excel and sort that column. Look at both the smallest and largest values and you'll likely find the offending record(s).
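If Excel is not an option, you can also stage the raw column as text and let SQL Server point out the bad values; a minimal sketch, assuming a hypothetical dbo.StagingDOB table where DOB was loaded as varchar(10):

    -- Hypothetical staging table; find values that are not valid dates
    -- or fall below the datetime minimum of 1753-01-01
    SELECT DOB
    FROM dbo.StagingDOB
    WHERE ISDATE(DOB) = 0
       OR DOB < '1753-01-01';  -- string compare works for yyyy-MM-dd values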
I changed the type of the column from datetime to date (in the mapping) while importing the data via the wizard, and it worked for me.
This is old, but I was having a similar issue in my SSIS data flow task and hopefully this might help someone: for me, the solution was to check the "Retain null values from the source as null values in the data flow" option in the Flat File Source Editor window (on the Connection Manager page).
I found that when copying a database from my local Express instance to my production server, the table definitions were smalldatetime instead of datetime. In the wizard where you select the tables, you can select Edit Mapping; in there you can change your data type.
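To spot those definitions without clicking through every table, a quick metadata query works; a minimal sketch against the standard INFORMATION_SCHEMA views:

    -- List every smalldatetime column in the current database so
    -- mismatched definitions are easy to find
    SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, DATA_TYPE
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE DATA_TYPE = 'smalldatetime'
    ORDER BY TABLE_SCHEMA, TABLE_NAME;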
Is there any way to capture the error messages that occur during a bulk insert?
If I specify an error file, I get 2 separate files: one that contains the record that errored, and one that contains the row.
The messages that are displayed for errors contain more information:
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 6 (temp_batch_date).
Is it possible to write these messages into a temp table so I can handle them accordingly?
Check out the SqlBulkCopy class to perform the operation and you should have programmatic access to any errors (exceptions) that are generated. This should allow you to attempt recovery/logging.
If you don't want to use Visual Studio (why not? it's free), you could also use PowerShell.
Handling these errors outside of SQL Server really opens up what you can do to work around hurdles with your source data.
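That said, if you'd rather stay inside T-SQL, a TRY...CATCH around the BULK INSERT can capture the batch-level message into a table you can query later; a minimal sketch with hypothetical table and file names (this catches the error that aborts the batch, not a per-row listing like the error file gives you):

    -- Hypothetical names throughout
    CREATE TABLE #BulkErrors
    (
        ErrorNumber  int,
        ErrorMessage nvarchar(4000),
        LoggedAt     datetime DEFAULT GETDATE()
    );

    BEGIN TRY
        BULK INSERT dbo.TempBatch
        FROM 'C:\loads\batch.csv'
        WITH (FIELDTERMINATOR = ',',
              ROWTERMINATOR = '\n',
              ERRORFILE = 'C:\loads\batch_errors.txt');
    END TRY
    BEGIN CATCH
        -- ERROR_MESSAGE() returns the text of the error that stopped
        -- the load, which can then be handled or reported on
        INSERT INTO #BulkErrors (ErrorNumber, ErrorMessage)
        VALUES (ERROR_NUMBER(), ERROR_MESSAGE());
    END CATCH;

    SELECT ErrorNumber, ErrorMessage, LoggedAt FROM #BulkErrors;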