ODP.NET - Oracle NUMBER(11,6) column is a System.Double in the data table

Ran into an issue recently: we encounter an exception when trying to set a value in our data table. The exception is:
System.ArgumentException: Input string was not in a correct format.Couldn't store <42.356> in COLUMN_NAME Column. Expected type is Double. ---> System.FormatException: Input string was not in a correct format.
We are using the Oracle client; the DLL version is 4.112.2.0. In the database, the column COLUMN_NAME is of type NUMBER(11,6).
We create the data table using the OracleDataAdapter.FillSchema method. My understanding was that the data column would have the type System.Decimal; however, it is created as System.Double, hence the exception. Stranger still, this exception is not reported against our dev, QA, or integration databases, but only against one specific client database.
Is it correct for the data column to be created as Double?
What else could I be looking at?
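For reference, this is roughly how the table schema is obtained (a minimal sketch; the connection string and table name below are placeholders), which also makes it easy to log the .NET type the provider actually assigns in each environment:

    using System;
    using System.Data;
    using Oracle.DataAccess.Client; // ODP.NET 4.112.x (unmanaged provider)

    class SchemaCheck
    {
        static void Main()
        {
            // Placeholder connection string and table name - adjust to your environment.
            using (var conn = new OracleConnection("User Id=scott;Password=tiger;Data Source=orcl"))
            using (var adapter = new OracleDataAdapter("SELECT COLUMN_NAME FROM MY_TABLE", conn))
            {
                var table = new DataTable();
                adapter.FillSchema(table, SchemaType.Source);

                // Prints System.Decimal or System.Double, depending on how the
                // provider maps NUMBER(11,6) in that particular environment.
                Console.WriteLine(table.Columns["COLUMN_NAME"].DataType);
            }
        }
    }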

Is the client in question in a country where the decimal separator is a comma, e.g. Germany, where the value would be written as 42,356? It looks like you have hit a culture issue when converting a string to a double without specifying an explicit format or culture.
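To see the effect concretely, here is a minimal standalone C# sketch (an illustration only, not code from the question) showing how the same string is read differently, or rejected outright, depending on the culture:

    using System;
    using System.Globalization;

    class CultureParseDemo
    {
        static void Main()
        {
            const string value = "42.356";

            // Invariant culture: '.' is the decimal separator -> 42.356.
            Console.WriteLine(double.Parse(value, CultureInfo.InvariantCulture));

            // de-DE: '.' is the group separator, so the value is silently read as 42356.
            Console.WriteLine(double.Parse(value, CultureInfo.GetCultureInfo("de-DE")));

            // fr-FR: ',' is the decimal separator and '.' is not accepted at all,
            // which fails in the same way as the FormatException above.
            if (!double.TryParse(value, NumberStyles.Float | NumberStyles.AllowThousands,
                                 CultureInfo.GetCultureInfo("fr-FR"), out double parsed))
            {
                Console.WriteLine("parse failed (double.Parse would throw a FormatException)");
            }
        }
    }

Parsing (and formatting) with CultureInfo.InvariantCulture, or with the culture the data was actually written in, removes the dependency on the machine's regional settings.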

Related

SQL Query in Azure Dataflow does not work when using parameter value in where clause

I use an Azure Data Factory pipeline.
Within that pipeline I use two activities:
A Lookup to get a date value.
This is the output:
"firstRow": {
"Date": "2022-10-26T00:00:00Z"
A dataflow which gets the date from the Lookup in step 1 and uses it in the source options' SQL query, in the WHERE clause:
This is the query:
"SELECT ProductID ,ProductName ,SupplierID,CategoryID ,QuantityPerUnit ,UnitPrice ,UnitsInStock,UnitsOnOrder,ReorderLevel,Discontinued,LastModifiedDate FROM Noordwind.Products where LastModifiedDate >= '{$DS_LastPipeLineRunDate}'"
When I fill the parameter by hand with, for example, '2022-10-26', it works great, but when I let the parameter get its value from the Lookup in step 1, the dataflow fails.
Error message:
{"message":"Job failed due to reason: Converting to a date or time failed due to an invalid character. Details:null","failureType":"UserError","target":"Products","errorCode":"DF-Executor-Conversion"}
This is the parameter in the pipeline view, with the dataflow selected:
I have tried casting the date in all kinds of ways, but haven't found the right one.
Can you help me?
UPDATE:
After a question from Rakesh:
This is the activity parameter
#activity('LookupLastPipelineRunDate').output.firstRow
I have reproduced the above and got the below results.
My source sample data is from a SQL database.
For the demo, I have used a Set variable activity for the date and given it a sample date like the one below.
I created a string parameter and assigned this variable's value to it.
In your case, pass the Lookup firstRow output date.
I have used the below dataflow expression in the query of the dataflow source and got the desired result.
concat('select * from dbo.table1 where d1 >=','\'',$date_value,'\'')
Result in a target SQL table.
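As a side note on the reshaping involved (ISO timestamp from the Lookup into a plain date literal for the query), here is a standalone C# sketch of the same idea; inside the pipeline the usual equivalent is a formatDateTime(...) expression over the Lookup output. The table and column names are the ones from the expression above:

    using System;
    using System.Globalization;

    class DateParameterDemo
    {
        static void Main()
        {
            // The Lookup activity returns an ISO 8601 timestamp like this.
            string lookupValue = "2022-10-26T00:00:00Z";

            // The dataflow query only worked with a plain date literal, so the
            // value has to be reformatted before it is concatenated into the SQL.
            DateTime parsed = DateTime.Parse(lookupValue, CultureInfo.InvariantCulture,
                                             DateTimeStyles.AdjustToUniversal);
            string forQuery = parsed.ToString("yyyy-MM-dd", CultureInfo.InvariantCulture);

            // Mirrors the concat(...) expression shown above.
            Console.WriteLine($"select * from dbo.table1 where d1 >= '{forQuery}'");
            // select * from dbo.table1 where d1 >= '2022-10-26'
        }
    }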
I have created a Set variable activity:
The first pipeline still returns the right date.
Just to be sure, I even converted it to datetime.
I can create a variable with type string.
Code:
#activity('LookupLastPipelineRunDate').output.firstRow
Regardless of the Set variable activity that fails, it looks like the date comes in nicely as an input to the Set variable activity.
And I still get an error:
When I read this error message, it says that you can't put a date in a string variable. But I can only choose string, boolean, and array, so there is no better option.
I also reviewed a website on this subject.
Therefore I have altered the table which contains the source data I use in the dataflow.
I deleted the column LastModifiedDate because it had data type datetime.
Then I created the same column with data type datetime2.
I did this because I read that datetime2 has fewer problems with conversions.

Column data type issue in SQL Server 2019 when importing a flat file using SSIS

I have a column in a flat file containing values like 2021-12-15T02:40:39+01:00.
When I try to insert it into a table whose column data type is datetime2,
it throws an error:
The data conversion for column "Mycol" returned status value 2 and status text
"The value could not be converted because of a potential loss of data.".
What would be the best data type for such values?
It seems the problem is two-fold here. One, the destination column for your value should be a datetimeoffset(0), and two, SSIS doesn't support the format yyyy-MM-ddThh:mm:ss for a DT_DBTIMESTAMPOFFSET; the T causes it problems.
Therefore I suggest that you define the column, MyCol, in your Flat File Connection as a DT_STR. Then, in your data flow task, use a Derived Column transformation which replaces MyCol, using the following expression to swap the T for a space ( ):
(DT_DBTIMESTAMPOFFSET,0) (REPLACE(Mycol,"T"," "))
This will then cause the correct data type and value to be inserted into the database.
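As a standalone illustration (C#, outside the package), the flat-file value is plain ISO 8601 with an offset, which is why datetimeoffset is the natural destination, and the derived column simply rewrites it into the space-separated form that the DT_DBTIMESTAMPOFFSET cast accepts:

    using System;
    using System.Globalization;

    class OffsetParseDemo
    {
        static void Main()
        {
            // The flat-file value is ISO 8601 with a UTC offset.
            string raw = "2021-12-15T02:40:39+01:00";

            // .NET reads it directly as a DateTimeOffset - the value carries an
            // offset, so datetimeoffset(0) is the natural destination column type.
            DateTimeOffset value = DateTimeOffset.Parse(raw, CultureInfo.InvariantCulture);
            Console.WriteLine(value.ToString("o")); // 2021-12-15T02:40:39.0000000+01:00

            // The derived column does the equivalent of this replacement, because
            // the cast to DT_DBTIMESTAMPOFFSET expects a space rather than the 'T'.
            Console.WriteLine(raw.Replace("T", " ")); // 2021-12-15 02:40:39+01:00
        }
    }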

Error when trying to load data from an OData Source Editor to a SQL db table

I get several errors of the same type when trying to load a SQL DB table from an OData SharePoint connection.
[GFXBankAccountProcessing - DB GFX Account List [2]] Error: An error occurred while setting up a binding for the "BLCompanyID" column. The binding status was "DBBINDSTATUS_UNSUPPORTEDCONVERSION". The data flow column type is "DT_NTEXT". The conversion from the OLE DB type of "DBTYPE_IUNKNOWN" to the destination column type of "DBTYPE_WVARCHAR" might not be supported by this provider.
It is expected to load the table and proceed to the next step in the process. I believe it has something to do with data conversion, but I am not sure what to convert the data to. I have tried to compare it against the data types that the table in the DB requires. The column is an NVARCHAR, but I am not sure why it would fail. BLCompanyID is only one column; some of the other columns are succeeding while that one and a few others are failing.
The problem is that the data type you are trying to store into the destination table is not the correct format. For instance, if the destination table requires an nvarchar(255) and you're trying to insert a DT_NTEXT, it will fail. You will need to convert the column to DT_WSTR with a length of 255.
Here's a quick reference that I have bookmarked to help me:
http://wiki.melissadata.com/index.php?title=FAQ%3ASSIS%3AData_Type_Conversions

Integration Services double to string

I'm using Integration Services to load data from an Excel file into a SQL Server table. When I try to send a number stored as a double (DT_R8) into a database column where the data is stored as varchar(50), I get odd rounding.
For example, consider the data in the first row, first column of the image above. The original value is 31.35, but as a string it is stored with spurious extra decimal digits.
I already tried using a Derived Column transformation to cast to a string before exporting to SQL, and I also added a ROUND(x, 5), but I get the same result.
How can I solve this problem, given that I can't change the SQL column data type?
The only working solution was changing the input type from double (DT_R8) to currency [DT_CY]. It seems that the rounding behaviour of double (DT_R8) makes it hard to use whenever parsing is involved in the export process.
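The behaviour is easier to see outside SSIS; this small C# sketch (an illustration only, not part of any package) shows why a DT_R8 value picks up extra digits when rendered at full precision, while a decimal-based type such as DT_CY does not:

    using System;
    using System.Globalization;

    class DoubleRoundingDemo
    {
        static void Main()
        {
            double asDouble = 31.35;    // roughly what DT_R8 holds
            decimal asDecimal = 31.35m; // closer to DT_CY / DT_NUMERIC behaviour

            // Printing all 17 significant digits exposes the binary representation
            // error that can leak into the exported string.
            Console.WriteLine(asDouble.ToString("G17", CultureInfo.InvariantCulture)); // 31.350000000000001
            Console.WriteLine(asDecimal.ToString(CultureInfo.InvariantCulture));       // 31.35
        }
    }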

Can't convert String to Numeric/Decimal in SSIS

I have five or six OLE DB Sources with a String [DT_STR] column, with a length of 500 and 1252 (Latin) as the code page.
The values in the column look like 0,08 or 0,10, etc. As you can see, the decimal separator is a comma.
All of the sources are the same except one. In that one source, the decimal separator is a POINT, and the conversion works when I set the data type in the Advanced Editor of the OLE DB Source. For one of the comma-separated sources it also works if I set the data type in the Advanced Editor of the OLE DB Source. BUT the weird thing is that it isn't working for the other sources, although they are the same (separated with a comma).
I tested Numeric(18,2) and decimal(2).
Another attempt to solve the problem with a Data Conversion task and/or a Derived Column task also failed.
I'm using SQL Server 2008 R2.
Slowly, I think SSIS is fooling me :)
Does anyone have an idea?
/// EDIT
Here are two screenshots:
Working: (screenshot)
Not working: (screenshot)
I would not set the data type in the Advanced Editor of the OLE DB Source. I would convert the data in the SQL of the OLE DB Source, or in a Script Transformation, e.g. using Decimal.TryParse, which would populate a new column.
SSIS is unbelievably fussy about data types, and trying to mess with its internals is not productive.
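As a rough sketch of what such a Script Transformation could do (the helper name and the de-DE culture choice are assumptions, not something from the original answer), a culture-aware Decimal.TryParse can cope with both the comma-separated and the point-separated sources, and with blank values:

    using System;
    using System.Globalization;

    static class FlexibleDecimal
    {
        // Parses values like "0,08" (comma decimal) and "0.08" (point decimal).
        // NumberStyles.Float deliberately excludes thousands separators, so a
        // point-decimal value cannot be silently misread as grouped digits by
        // the comma culture.
        public static bool TryParse(string input, out decimal value)
        {
            value = 0m;
            if (string.IsNullOrWhiteSpace(input))
                return false; // blank or whitespace rows are one known cause of the error

            var commaCulture = CultureInfo.GetCultureInfo("de-DE"); // ',' as decimal separator
            return decimal.TryParse(input, NumberStyles.Float, commaCulture, out value)
                || decimal.TryParse(input, NumberStyles.Float,
                                    CultureInfo.InvariantCulture, out value);
        }
    }

A Script Component would call something like this once per row and write the result (or redirect the row when it returns false) to a new numeric output column, instead of fighting the source's Advanced Editor.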
Check whether there are any spaces in between the commas; SSIS will throw an error when it tries to convert a blank space to a number. A blank space is not the same as nothing between the separators.
Redirect the error rows and output the data to a file. Then you can examine the data that SSIS is rejecting and determine why it causes the error.
Reasons for the error:
1) NULLs are not properly handled, either in the destination database or during SSIS package creation. It is quite possible that the source contains NULL data but the destination does not accept it, which generates the error above.
2) The data types of the source and destination do not match. For example, the source column holds varchar data and the destination column has an int data type. This can easily generate the error above. Some data types convert automatically to another data type without an error, but incompatible data types generate the "The value could not be converted because of a potential loss of data." error.
The issue arises when there is an unhandled space or NULL. I worked around it using the conditional (ternary) operator, which checks the length:
LEN(TRIM([Column Name])) >= 1 ? (DT_NUMERIC,38,8)[Column Name] : 0