SSIS failed validation and returned validation status "VS_ISBROKEN" - sql

I'm trying to create a temp table and process two data flows using the temp table. It is in a sequence container, and if I just execute the container it runs perfectly, but when the entire package is run it returns this error:
Information: 0x4004300A at V-AccidentCodesBase, SSIS.Pipeline:
Validation phase is beginning.
Error: 0xC0202009 at V-AccidentCodesBase, Insert into Temp Table [69]:
SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error
code: 0x80040E14.
An OLE DB record is available. Source: "Microsoft SQL Server Native
Client 11.0" Hresult: 0x80040E14 Description: "Statement(s) could
not be prepared.".
An OLE DB record is available. Source: "Microsoft SQL Server Native
Client 11.0" Hresult: 0x80040E14 Description: "Invalid object name
'##TmpAccidentCode'.".
Error: 0xC004706B at V-AccidentCodesBase, SSIS.Pipeline: "Insert into
Temp Table" failed validation and returned validation status
"VS_ISBROKEN".
Error: 0xC004700C at V-AccidentCodesBase, SSIS.Pipeline: One or more
component failed validation.
Error: 0xC0024107 at V-AccidentCodesBase: There were errors during
task validation.

I would set the DelayValidation property to True. You may get away with just setting this on the Sequence Container, or you may need to repeat that setting on child objects, e.g. your Data Flow Task.
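For reference, here is a minimal sketch of the usual setup behind this pattern (only the ##TmpAccidentCode name comes from the error above; the columns are purely illustrative). The CREATE statement would run in an Execute SQL Task before the data flows, on a connection manager with RetainSameConnection = True so the global temp table survives between tasks, while DelayValidation = True keeps SSIS from validating the data flows before the table exists:

-- Execute SQL Task, run before the data flows (illustrative columns)
IF OBJECT_ID('tempdb..##TmpAccidentCode') IS NOT NULL
    DROP TABLE ##TmpAccidentCode;

CREATE TABLE ##TmpAccidentCode
(
    AccidentCode varchar(10)  NOT NULL,   -- illustrative
    Description  varchar(255) NULL        -- illustrative
);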

I also faced the same error message. In my case the issue was permissions on the database for the user that runs the ETL (a service account). Make sure the user that runs the package has enough permissions to execute the query.
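As a rough, hypothetical sketch (database, domain and account names are made up), granting the service account basic read/write rights in the target database would look something like this:

-- Hypothetical names throughout; adjust to your environment and to what the package actually does.
USE TargetDb;
CREATE USER [DOMAIN\etl_service] FOR LOGIN [DOMAIN\etl_service];
ALTER ROLE db_datareader ADD MEMBER [DOMAIN\etl_service];
ALTER ROLE db_datawriter ADD MEMBER [DOMAIN\etl_service];
GRANT EXECUTE TO [DOMAIN\etl_service];  -- only if the package calls stored procedures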

As others have mentioned, the error may happen for different reasons. In my case, I realized that I had tried to convert a NULL to an int in a Script Component in SSIS. Something like:
ProductsBuffer.ProductId = Int64.Parse(reader["ProductId"].ToString());
So the fix was easy: I just checked the field before converting, to make sure it was not null:
if (reader["ProductId"] != DBNull.Value)
ProductsBuffer.ProductId = Int64.Parse(reader["ProductId"].ToString());

In my case I switched from OLE DB to SQLNCLI and it worked :/

I ended up solving the problem, which was overloading tempdb. When I slowed the process down to one command against tempdb at a time, it all ran through smoothly.

I faced the same error. In my case, I was using SSIS to import data from an Excel file into a couple of tables.
I used 2 different files and it failed with one and worked with the other. After some review I found that I was referring to the name of the Excel sheet within the package, so the Excel sheet has to be named EXACTLY (I think it is case sensitive) as it is referenced in the SSIS package.

I'm using VS 2017. I wonder if after a while it forgets your "saved" database passwords, because mine worked fine for days, then just quit working out of the blue and gave the VS_ISBROKEN error. After I re-entered the password for one of my database connections (despite the fact I had checked the Save Password checkbox previously), it started working again.

Related

"An error occurred while extracting the result into a variable of type (DBTYPE_I2)"

I am getting the below error in the production instance. There were no changes to the ETL and the job had been running OK every day. Today it suddenly started failing with this error:
Source: SQL Update Audit Table Processing Execute SQL Task Description: Executing the query "UPDATE AuditTableProcessing SET ExtractRowCnt = ?..." failed with the following error: "An error occurred while extracting the result into a variable of type (DBTYPE_I2)"
When I run the job in the development environment everything runs smoothly without an issue. Any hints on the issue would be much appreciated!
It looks like a variable mapped in the Result Set of the Execute SQL Task is of type DBTYPE_I2 and the value does not fit this type. Try changing it to DBTYPE_I4 or another appropriate data type.
More information about Result Sets:
SSIS Basics: Using the Execute SQL Task to Generate Result Sets
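If you want to see the mismatch outside SSIS, here is a small T-SQL illustration (the variable names and the value 40000 are only illustrative; ExtractRowCnt is taken from the query in the question):

DECLARE @ExtractRowCnt smallint;     -- what a DBTYPE_I2 result-set mapping corresponds to
SELECT @ExtractRowCnt = 40000;       -- fails: arithmetic overflow for data type smallint

DECLARE @ExtractRowCntOk int;        -- the DBTYPE_I4 equivalent
SELECT @ExtractRowCntOk = 40000;     -- succeeds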

WIS 10901 error while refreshing Webi report

While refreshing a Webi report I am getting an error:
A database error occured. The database error text is: (CS) "Unexpected behavior" . (WIS 10901)
All the objects parse in the universe and the server is also responding. What could be the possible reason?
We are also able to run the query in the database using a database client tool.
If the error message appears after a long time, it might just be a timeout issue.
Otherwise, you could try to import a version of the report that works from the CMS to your local drive, rename it, and run it again.
It can be caused by some special character in the data combined with the fact that the server language settings do not foresee such a character, and therefore Business Objects cannot parse it for presentation.
If that is the case you might need to configure an environment variable on the server (like NLS_LANG), setting it to a value such that those special characters in your data can be handled by Business Objects.
In my situation, the error appears when some object from the database has changed or does not exist anymore. So we need to delete this object in the Universe, or be sure that the field exists in the database with the same name and type.
I had the same problem with my reports. After a couple of hours of "investigation", I found the cause.
I had created an object in my universe and set an inappropriate object data type, Number, when the value in the database has type Character.
It threw an Oracle error (ORA-01722) and a Business Objects error (WIS 10901), even though the SQL copied from the report creation interface and executed directly against the database returned proper data.

SSIS string truncation error

I have a SQL Server 2005 SP2 database which has a table with a poc_resp_city attribute which is nvarchar(35).
It was changed to nvarchar(80) two months ago without aligning the very same attribute in the data warehouse (which still has nvarchar(35)).
The SSIS data loading package (after two months of working properly) now fails every time I run it with the following error:
There was an error with output column "poc_resp_city" (2250) on output
"OLE DB Source Output" (11). The column status returned was: "Text was
truncated or one or more characters had no match in the target code
page.". There was an error with output column "poc_resp_city"
(2250) on output "OLE DB Source Output" (11). The column status
returned was: "Text was truncated or one or more characters had no
match in the target code page.".
SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on
component "Source Table" (1) returned error code 0xC020902A. The
component returned a failure code when the pipeline engine called
PrimeOutput(). The meaning of the failure code is defined by the
component, but the error is fatal and the pipeline stopped executing.
There may be error messages posted before this with more information
about the failure.
Neither the package nor the databases were modified regarding this issue. I know that I could ignore this error or make the arrangements to get it working, but I want a proper and acceptable answer to why this error appears two months after the modification, because maybe I am missing an important step in this situation.
Important note: I don't have even a single record which has more than 35 characters so truncation never occurs. (this warning belongs to some kind of an SSIS validation step)
Now I think that maybe, after a period of time, the SSIS package recompiles itself and now sees this misalignment in its metadata (35 ≠ 80), and because the TruncationRowDisposition attribute is set to RD_FailComponent, it fails the component.
And I would exclude the code page option because every database column is nvarchar, not varchar, so this shouldn't be the case.
Thanks!
You need to refresh the size of the column:
Right-click the OLE DB Source -> Show Advanced Editor
Choose the Input and Output Properties tab -> OLE DB Source Output -> Output Columns
In the right panel, enter your new size in the Length row.
Click OK.
Alternatively, you can copy your query from the OLE DB Source, delete the OLE DB Source, insert a new OLE DB Source and paste the query. This will automatically refresh your columns.
Just remember that there are probably more elements in the data flow where you need to edit the length of your column, like a Data Conversion...
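Whichever way you refresh the metadata, it is also worth confirming what the source data actually holds, since the question states nothing exceeds 35 characters. A simple check (the table name is illustrative; use the table your OLE DB Source reads from):

SELECT MAX(LEN(poc_resp_city)) AS MaxPocRespCityLength
FROM dbo.SourceTable;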

"Invalid date format" when importing flat file with Import Wizard

Good afternoon,
I have a dilemma that I cannot figure out. I am trying to import a flat file into a SQL db table but am having issues. My column in SQL is a datetime column for date of birth (DOB). The extraction flat file provided to me has this column as a date... thus when I import into SQL I am getting:
Messages
Error 0xc0202009: Data Flow Task 1: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0" Hresult: 0x80004005 Description: "Invalid date format".
(SQL Server Import and Export Wizard)
Error 0xc020901c: Data Flow Task 1: There was an error with input column "DOB" (212) on input "Destination Input" (147). The column status returned was: "Conversion failed because the data value overflowed the specified type.".
(SQL Server Import and Export Wizard)
Error 0xc0209029: Data Flow Task 1: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "input "Destination Input" (147)" failed because error code 0xC020907A occurred, and the error row disposition on "input "Destination Input" (147)" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
(SQL Server Import and Export Wizard)
I am trying to figure out how to add a time like 00:00:00.000 to every DOB in the flat file. For example, they send me 1983-11-30 but I want to use 1983-11-30 00:00:00.000.
Is there a way to do this, or does anyone have an idea of what I can do? Thanks in advance.
My situation was a little different, but I feel posting about it could help others out there. Unlike the OP, I was importing data from an Access 2000 .mde file into SQL Server Management Studio, but similar to the OP I ran into the "Invalid date format" error. I knew the conflict was caused by the fact that the columns of type DateTime in the Access file conflicted with the DateTime format in SQL.
Solution: After I went through the import data process, which is done by right clicking the server name in sql server management studio -> Tasks -> Import Data and then choosing the source (access), destination (sql server), and table to be imported, I noticed that 0 rows were transferred because of the type issue. However, a table was created for me in SQL Server Management Studio that represented the table I was trying to import from the access table. It had the right columns and their associated data types.
So what I did was right-click the table to go into design view and change all the data types of the DateTime columns to just Date.
Then I redid the import data process, and the rows were successfully transferred.
MSSQL will have no problem converting a yyyy-MM-dd string like yours to a date. You'll end up with 00:00:00.000 automatically. The error message you posted "Conversion failed because the data value overflowed the specified type" indicates that one of the dates is outside the valid range. You most likely got an invalid date of birth (look for a 0000-00-00 record or similar). If the flat file is delimited in a way Excel can easily parse, pull it into Excel and sort that column. Look at both the smallest and largest values and you'll likely find the offending record(s).
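A short T-SQL illustration of both points, assuming default language settings (the second value is just an example of an out-of-range date):

SELECT CAST('1983-11-30' AS datetime);   -- 1983-11-30 00:00:00.000, the time part is added automatically
SELECT CAST('1492-10-12' AS datetime);   -- fails: datetime only covers 1753-01-01 through 9999-12-31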
I changed the type of the column from datetime to date (in the mapping) while importing the data via the Wizard and it worked for me.
This is old, but I was having a similar issue in my SSIS data flow task and hopefully this might help someone: for me the solution was to check the "Retain null values from the source as null values in the data flow" option in the Flat File Source Editor window (in the Connection Manager page)
I found that when copying a database from the local EXPRESS to my production server, the table definitions were smalldatetime instead of datetime. In the wizard where you select the tables, you can select Edit Mapping. In there you can change your data type.
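If the destination column did end up as smalldatetime, note that its range is much narrower than datetime, so older dates of birth overflow it. A quick illustration:

SELECT CAST('1983-11-30' AS smalldatetime);  -- OK
SELECT CAST('1899-12-31' AS smalldatetime);  -- fails: smalldatetime only covers 1900-01-01 through 2079-06-06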

How to Convert NTEXT values to BOOLEAN using SSIS

I'm using SSIS to import SalesForce data into SQL Server. In SF, I have ntext fields that contain values like "Live/ Not Live" which can easily be represented with a bit field in SQL Server.
Is there a way to convert these ntext fields to Boolean values using SSIS? I have tried using a Derived Column transformation and get the following errors:
[Insert Destination [807]] Error: SSIS Error Code DTS_E_OLEDBERROR.
An OLE DB error has occurred. Error code: 0x80040E21. An OLE DB record
is available. Source: "Microsoft SQL Server Native Client 10.0"
Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation
generated errors. Check each OLE DB status value, if available. No
work was done.".
[Insert Destination [807]] Error: SSIS Error Code
DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "input "OLE DB Destination
Input" (820)" failed because error code 0xC020907B occurred, and the
error row disposition on "input "OLE DB Destination Input" (820)"
specifies failure on error. An error occurred on the specified object
of the specified component. There may be error messages posted
before this with more information about the failure.
[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The
ProcessInput method on component "Insert Destination" (807) failed
with error code 0xC0209029 while processing input "OLE DB Destination
Input" (820).
The identified component returned an error from the ProcessInput method.
The error is specific to the component, but the error is fatal and will cause
the Data Flow task to stop running. There may be error messages posted
before this with more information about the failure.
Any help would be greatly appreciated.
NTEXT is your Unicode text stream. You can't do much with a stream, so we'll need to make it into something more manageable like DT_STR or DT_WSTR. Whether you have internationalization in your text data is something only you will know. Either way, connect a Data Conversion Transformation up to your SF source and make that data into a non-stream type (give it a max width). The goal of this operation is to take the stream data and turn it into straight text values (Live, Not Live). I'm assuming this column will be called IsLiveString.
Now that you're dealing with a string type, add your Derived Column Transformation to the Conversion output, and here is where you will need an expression to determine whether the supplied values from the Data Conversion task evaluate to True/False. Even if SQL Server understood that Live translates to a 1 (true), there is no way in heck I'd want to rely on that magic working forever. Instead, I'd look to create a new column, IsLiveBoolean, and it'd have an expression like ([IsLiveString]=="Live") ? True : False. That expression is approximate; I'm not sitting in front of an instance of SSDT/BIDS. It could also be simplified to just the equality check, dropping the ternary expression. If you need to deal with NULL values in the IsLiveString column, then the ternary operator syntax makes that evaluation easier.
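Purely as an illustration of the mapping that expression performs (the real work happens in the Derived Column, not in SQL), a T-SQL equivalent would be:

-- IsLiveString and IsLiveBoolean are the column names assumed above
SELECT s.IsLiveString,
       CASE WHEN s.IsLiveString = N'Live' THEN 1 ELSE 0 END AS IsLiveBoolean  -- NULL falls through to 0 here
FROM (VALUES (N'Live'), (N'Not Live'), (NULL)) AS s (IsLiveString);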
Edit
A picture's worth a thousand words so consider this 3k and some change. This represents the actions in your data flow. I used a simple query to POC your example. I generate a column of data type ntext with values of "Live" and "Not Live" and a NULL. From the source, I use the Data Conversion task as described above. I went with DT_WSTR to make this answer apply to the broadest possible audience and left the length at the default of 50. To optimize your memory usage, you'd want to decrease the length to match the longest possible value from the source system.
I configured my Derived Column transformation thusly. Options are there for dealing with NULLs, or if you know your data is NULL free then the first option will work.
Results. You can observe that this correctly makes the strings into their corresponding boolean counterparts. Those would then be piped into your destination component.