SSIS - Data source with SQL statement containing a Convert

I have created an SQL OLEDB data source (SQL Server) and I have the following Command Text:
select replace(convert(varchar(10), dob, 111), '/', '-') As DOB FROM Person
I get a warning on executing that says: [OLE DB Source [1]] Warning: Truncation may occur due to retrieving data from database column "DOB" with a length of 8000 to data flow column "DOB" with a length of 10.
When I try to change the external column from 8000 to 10 (as stated in the query), the designer automatically changes it back. I thought that external columns represented metadata from the data source. The dob in the data source (see query above) is a varchar(10), so why does it have to have a length of 8000? I don't have a lot of experience with SSIS.

I have found the solution. I had to do this:
select cast(replace(convert(varchar(10), dob, 111), '/', '-') As varchar(10)) As DOB from Person
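For anyone wondering why the extra CAST is needed: REPLACE reports its result with a worst-case length (8000 for non-max varchar input), which is what SSIS picks up as the external column metadata, and the outer CAST pins it back to 10. A quick way to see the metadata SQL Server hands to SSIS (a sketch, assuming SQL Server 2012+ for sp_describe_first_result_set):
-- Inspect the metadata SQL Server reports for the query
EXEC sp_describe_first_result_set
    N'select replace(convert(varchar(10), dob, 111), ''/'', ''-'') As DOB FROM Person';
-- Without the outer CAST, DOB is reported as varchar(8000); with it, varchar(10).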

There is another workaround: you can ignore truncation errors by setting the appropriate property on the 'Error Output' tab of the Source component, and the package will execute past them.

Related

Column data types issue in SQL Server 2019 when importing a flat file using SSIS

I have a column in a flat file containing values like 2021-12-15T02:40:39+01:00.
When I try to insert it into a table whose column data type is datetime2,
it throws an error:
The data conversion for column "Mycol" returned status value 2 and status text
"The value could not be converted because of a potential loss of data.".
What would be the best data type for such values?
It seems the problem is two-fold here. One, the destination column for your value should be a datetimeoffset(0), and two, SSIS doesn't support the format yyyy-MM-ddThh:mm:ss for a DT_DBTIMESTAMPOFFSET; the T causes it problems.
Therefore I suggest that you define the column, MyCol, in your Flat File Connection as a DT_STR. Then, in your data flow task, use a Derived Column transformation which replaces MyCol, using the following expression to swap the T for a space ( ):
(DT_DBTIMESTAMPOFFSET,0) (REPLACE(Mycol,"T"," "))
This will then cause the correct data type and value to be inserted into the database.
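As a side check on the T-SQL half of this (a small sketch; @v is just an illustrative variable), datetimeoffset(0) parses the ISO 8601 literal directly, so it is only the SSIS-side cast that needs the T removed:
-- datetimeoffset(0) accepts both the T form and the space form in T-SQL
DECLARE @v varchar(30) = '2021-12-15T02:40:39+01:00';
SELECT CAST(@v AS datetimeoffset(0));
SELECT CAST(REPLACE(@v, 'T', ' ') AS datetimeoffset(0));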

SSIS Variable Date Failing between SQL Server and ORACLE

Good Afternoon All,
I have spent about 6 hours trying to get the formatting to work through SSIS, using a max-date variable in a WHERE clause - just no luck!
I have created a variable called my_date which fetches the MAX(Date) from a local SQL Server table, to find the last load point for that table, using the code below:
SELECT CAST(FORMAT(MAX(Business_Date), 'dd-MMM-yyyy') AS varchar) AS my_date FROM Table
This fetches the date correctly as 17-Sep-2018.
I have then mapped my result set as my_date -> User::max_date
I have set my max_date variable to a string data type under the package scope.
I have tested my variable out by using breakpoints to ensure this runs all the way through in the correct format - and this works 100%.
I then have a data flow task running to fetch data from my ORACLE DB to insert into my SQL Server table which contains the following SQL command:
SELECT *
FROM Table2
WHERE (BUSINESS_DATE > to_date('#[User::max_date]', 'DD-MON-YYYY'))
However I get the ORA-01858 - [TABLE] Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E07.
An OLE DB record is available. Source: "Microsoft OLE DB Provider for Oracle" Hresult: 0x80040E07 Description: "ORA-01858: a non-numeric character was found where a numeric was expected
".
If I replace the variable placeholder directly with the variable's contents as shown at the breakpoint in the Locals window, it works perfectly!
I have attempted multiple format types, from the initial export through to the final WHERE clause, and this seems to be the closest I have come to pushing it through, but it is still complaining about the format.
Please help - images below to help see the setup.
Control Flow - showing the Execute SQL task and the data flow task
Locals window - showing the variable value after the breakpoint is reached
Managed to get it working!
I added an intermediary variable whose expression contains the following:
"SELECT *
FROM TABLE
WHERE (BUSINESS_DATE > to_date('" + @[User::max_date] + "', 'DD-MON-YYYY'))"
I then changed my OLE DB source to 'SQL command from variable', selected the variable created above, and it worked perfectly!
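For context on why the original command failed (my reading of the error, not stated in the post): the SQL command box of an OLE DB source is not evaluated as an expression, so #[User::max_date] was most likely sent to Oracle literally, and to_date then hit the # where it expected a digit. The format model itself can be confirmed directly against Oracle:
-- Sanity check run directly against Oracle (assumes the default NLS date language)
SELECT to_date('17-Sep-2018', 'DD-MON-YYYY') FROM dual;
-- ORA-01858 appears when the incoming string doesn't match this model,
-- e.g. when placeholder text is passed through un-substituted.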
Try mapping your user parameter in the "OLE DB Data Source Editor", under "Parameters".
1) Change the SQL command text (replace #[User::max_date] with a ? parameter marker; note that it must not be wrapped in quotes, or it is treated as a literal question mark), like this:
SELECT *
FROM Table2
WHERE (BUSINESS_DATE > to_date(?, 'DD-MON-YYYY'))
2) Then in the parameter editor, map parameter 1 to #[User::max_date].
https://learn.microsoft.com/en-us/sql/integration-services/data-flow/map-query-parameters-to-variables-in-a-data-flow-component?view=sql-server-2017
Also, the "Oracle Provider for OLE DB" behaves differently than the "Microsoft OLE DB Provider for Oracle", so it depends which you are using.

SSIS error when adding Excel destination

I have an SSIS package with data in a SQL Server 2012 table. I have added an Excel destination and I get the error:
There is no sufficient information about mapping ssis types to types of the selected .net data provider. As a result you may need to modify the default types of the SQL statement on the next screen
Code:
CREATE TABLE `Excel Destination`
(
`name` VARCHAR(50),
`date` DATETIME
)
It doesn't like the 'name' column. I have added a data conversion task, but the 'name' column is already set to Unicode string, so I'm not sure why I get a message about converting between non-Unicode and Unicode.
Any advice would be welcome.
Please check: this error occurs because your sheet is empty and there are no columns defined in it. You have to write the names of the columns in the first row of the target sheet.
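One more general point that may explain the Unicode message (standard SSIS/ACE driver behavior, not something visible in this package): the Excel driver exposes text columns as Unicode (DT_WSTR), so a varchar coming from SQL Server still needs an explicit conversion, for example in a Derived Column (the 50 is an assumed length):
(DT_WSTR, 50)[name]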

Azure SQL Data Warehouse - Strange DateTime conversion error/ behaviour

I am reading data from a data lake (CSV), and when running the below query I get a 'Conversion failed when converting date and/or time from character string' error message.
select convert(datetime, NullIf(ltrim(rtrim([Date started])), ''), 111)
FROM dl.temp
Looked through the data and checked the source file as well, couldn't spot anything unusual.
As soon as I include the * and change the query to the below, everything runs fine and the conversion seems to be doing its job.
select convert(datetime, NullIf(ltrim(rtrim([Date started])), ''), 111),*
from dl.temp
Out of curiosity I also wanted to check the maximum and minimum dates; running MAX returns the literal string 'Date started'.
However, when I search for that particular value as below, I don't get any rows returned. It seems like it is returning the column name. Does anyone know what is going on?
select *
from dl.temp
where [Date started] = 'Date started'
I am running this against an Azure Data Warehouse.
I think you'll find the issue is in your external file format.
In the CREATE EXTERNAL FILE FORMAT you probably need to add FIRST_ROW=2 in your FORMAT OPTIONS.
https://learn.microsoft.com/en-us/sql/t-sql/statements/create-external-file-format-transact-sql
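A minimal sketch of what that could look like (the format name and delimiters are assumptions; the key line is FIRST_ROW = 2, which stops the header row being read as data and would explain both the conversion error and MAX returning 'Date started'):
CREATE EXTERNAL FILE FORMAT csv_skip_header
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (
        FIELD_TERMINATOR = ',',
        STRING_DELIMITER = '"',
        FIRST_ROW = 2  -- skip the header row
    )
);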

Import Package Error - Cannot Convert between Unicode and Non Unicode String Data Type

I have made a dtsx package on my computer using SQL Server 2008. It imports data from a semicolon-delimited CSV file into a table where all of the field types are nvarchar(max).
It works on my computer, but it needs to run on the client's server. Whenever they create the same package with the same CSV file and destination table, they receive the error in the title.
We have gone through the creation of the package step by step, and everything seems OK. The mappings are all correct, but when they run the package in the last step, they receive this error. They are using SQL Server 2005.
Can anyone advise where to begin looking for this problem?
The problem of converting from any non-unicode source to a unicode SQL Server table can be solved by:
add a Data Conversion transformation step to your Data Flow
open the Data Conversion and select Unicode for each data type that applies
take note of the Output Alias of each applicable column (they are named Copy Of [original column name] by default)
now, in the Destination step, click on Mappings
change all of your input mappings to come from the aliased columns in the previous step (this is the step that is easily overlooked and will leave you wondering why you are still getting the same errors)
At some point, you're trying to convert an nvarchar column to a varchar column (or vice versa).
Moreover, why is everything (supposedly) nvarchar(max)? That's a code smell if I ever saw one. Are you aware of how SQL Server stores those columns? The rows store pointers to where the column data lives, since the values don't fit within the 8 KB pages.
Non-Unicode string data types:
Use string [DT_STR] for text file columns and VARCHAR for SQL Server columns.
Unicode string data types:
Use Unicode string [DT_WSTR] for text file columns and NVARCHAR for SQL Server columns.
The problem is that your data types do not match, so there could be a loss of data during the conversion.
Two solutions:
1- if the type of the target column is [nvarchar], it should be changed to [varchar]
2- Add a "Derived Column" component to the SSIS package and add a new column with the following expression:
(DT_WSTR, «length») [ColumnName]
Length is the length of the column in the target table and ColumnName is the name of the column in the target table.
Finally, in the mapping step, use this newly added column instead of the original column.
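For instance, if the target column were an nvarchar(50) called LastName (illustrative names, not from the question), the expression would read:
(DT_WSTR, 50)[LastName]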
Not sure if this is a best practice with SSIS, but sometimes I find its tools are a bit clunky for this type of activity.
Instead of using its components, you can convert the data within your query:
Instead of doing
SELECT myField = myNvarchar20Field
FROM myTable
You could do
SELECT myField = CONVERT(VARCHAR(20),myNvarchar20Field)
FROM myTable
This is a solution that uses the IDE to fix it:
Add a Data Conversion item to your data flow.
Double-click the Data Conversion item and set the conversion for each applicable column.
Now double-click the DB destination item, click on Mappings, and ensure that your input column is actually the one coming from Copy of [your column name], which is in fact the Data Conversion output, NOT the DB source output (be careful here).
And that's it: save and run.
Mike, I had the same problem with SSIS in SQL Server 2005...
Apparently, the data flow destination object will always attempt to validate the incoming data as Unicode. Go to that object, open the Advanced Editor, and on the Component Properties pane change the ValidateExternalMetadata property to False. Now, go to the Input and Output Properties pane, Destination Input, External Columns, and set each column's DataType and Length to match the database table it's going to. When you close that editor, those column changes will be saved rather than validated over, and it will work.
Follow the steps below to avoid the 'cannot convert between unicode and non-unicode string data types' error:
i) Add the Data Conversion transformation to your data flow.
ii) Open the Data Conversion and select the string [DT_STR] data type.
iii) Then go to the destination, and select Mappings.
iv) Change your input column to the Copy of column name.
Go to the registry configuration of the client and change the language.
For Oracle, go to HKLM\SOFTWARE\ORACLE\KEY_OraClient...Home\NLS_LANG and change it to the appropriate language.
The Data Conversion task is time-consuming if there are 50-plus columns! I found a fix for this at the link below:
http://rdc.codeplex.com/releases/view/48420
However, it does not seem to work for versions above 2008, so this is how I had to work around the problem:
*Open the .dtsx file in Notepad++ and choose XML as the language.
*Go to the <DTS:FlatFileColumns> tag and select all items within this tag.
*Find the string DTS:DataType="129" and replace it with DTS:DataType="130".
*Save the .dtsx file.
*Open the project again in Visual Studio (BIDS).
*Double-click the source task. You will get the message:
the metadata of the following output columns does not match the metadata of the external columns with which the output columns are associated:
...
Do you want to replace the metadata of the output columns with the metadata of the external columns?
*Now click Yes. We are done!
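For orientation, the attribute being edited sits on each flat-file column element inside the .dtsx; a rough sketch of such a fragment (exact attributes vary by SSIS version; 129 and 130 are the type codes for DT_STR and DT_WSTR):
<DTS:FlatFileColumn
    DTS:ObjectName="MyColumn"
    DTS:DataType="130"
    DTS:MaximumWidth="50" />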
Resolved - back to the original question:
I've seen this before. Easiest way to fix (don't need all those data conversion steps as ALL of the meta data is available from the source connection):
Delete the OLE DB Source & OLE DB Destinations
Make sure Delayed Validation is FALSE (you can set it to True later)
Recreate the OLE DB Source with your query, etc.
Verify in the Advanced Editor that all of the output data column types are correct
Recreate your OLE DB Destination, map, create new table (or remap to existing) and you'll see that SSIS got all the data types correct (same as source).
So much easier than the stuff above.
Not sure if this is still a problem but I found this simple solution:
Right-click the OLE DB Source
Select 'Edit'
Select the Input and Output Properties tab
Under "Inputs and Outputs", expand "OLE DB Source Output" into External Columns and Output Columns
In Output Columns, select the offending field; in the right-hand panel, ensure the DataType property matches that of the field under External Columns
Hope this was clear and easy to follow
Sometimes we get this error when we select a static character value as a field in the source query/view/procedure and the destination field's data type is Unicode.
Below is the issue I faced:
I used a script like the one below (initially without the N prefix) at the source
and got the error message Column "CATEGORY" cannot convert between Unicode and non-Unicode string data types.
Resolution:
I tried multiple options but none worked for me. Then I prefixed the static value with N to make it Unicode, as below:
SELECT N'STUDENT DETAIL' CATEGORY, NAME, DATEOFBIRTH FROM STUDENTS
UNION
SELECT N'FACULTY DETAIL' CATEGORY, NAME, DATEOFBIRTH FROM FACULTY
If anyone is still experiencing this issue, I found that it was related to a difference in Oracle client versions.
I have posted my full experience and solution here: https://stackoverflow.com/a/43806765/923177
1. Add a Data Conversion tool from the toolbox.
2. Open it; it shows all columns from the Excel source. Convert them to the desired output type, and take note of the Output Alias of each applicable column (they are named Copy of [original column name] by default).
3. Now, in the destination step, click on Mappings.
I changed ValidateExternalMetadata=False for each transformation task. It worked for me.