Unpivot Transformation Error: XX column is not of the same unpivot datatype (Data flow in ADF) - azure-data-factory-2

I want to unpivot the following data table, but I got the error: "KPI1 column is not of the same unpivot datatype." I checked the KPI1 datatype: it is long, which is one of the numerical datatypes (integer, decimal, long, double, etc.), so I expected all numerical columns to be treated the same way.
Thank you for any insights on how to fix the error.

Finally, I fixed it by changing the datatypes of the other numerical columns so that they all match.
If the columns use several different numerical datatypes, it is hard to define the datatype of the new unpivoted numerical column.
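The post does not show how the source is configured, but if the data comes in through a SQL query (or you can add one upstream of the unpivot), one way to align the types is to cast every KPI column to the same numeric type before unpivoting. The table and column names below are made up for the sketch:
SELECT
    Id,
    CAST(KPI1 AS bigint) AS KPI1,   -- cast all KPI columns to one numeric type
    CAST(KPI2 AS bigint) AS KPI2,
    CAST(KPI3 AS bigint) AS KPI3
FROM dbo.KpiSource;
With every unpivoted column sharing one datatype, the new value column produced by the unpivot has a single, well-defined type.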

Related

Making sense of date-time and datatypes in SQLite

I'm learning SQL and SQLite at the moment, and from what I understand, SQLite doesn't support a datetime datatype.
However, when I run the command PRAGMA table_info(Orders); it shows one column as being of type datetime.
I've read that "In SQLite, the datatype of a value is associated with the value itself, not with its container" (taken from https://www.sqlite.org/datatype3.html).
I'm trying to understand what exactly this means, as that may be the explanation.
Can someone help me understand what's going on here?
What you see in the column type when you execute:
PRAGMA table_info(tablename)
or:
SELECT * FROM pragma_table_info('tablename');
is the data type that was used for the definition of the column in the CREATE TABLE statement.
SQLite allows the use of anything (even nothing) as a data type, as long as it is not a reserved word.
This:
CREATE TABLE tablename (
    column0 integer primary key,
    column1 integer,
    column2 text,
    column3 real,
    column4 string,
    column5 something,
    column6 Jack,
    column7,                -- no data type defined
    column8 real char int,  -- ??
    column9 datetime
);
is a valid statement!
From all the above column definitions, SQLite will enforce type checking only for the column column0 which is defined as integer primary key.
From Datatypes In SQLite Version 3/Storage Classes and Datatypes:
Any column in an SQLite version 3 database, except an INTEGER PRIMARY KEY column, may be used to store a value of any storage class.
When you define a column as datetime, don't expect SQLite to understand your intention and set any internal constraints so that the values you store in it will be datetime-like.
Actually, by the rules described in Determination Of Column Affinity, this column will have NUMERIC affinity.
Of course you can store datetime values in such a column, as described in Date and Time Datatype, by using the INTEGER, REAL or TEXT data type, depending on the form of the datetimes you want: Unix times, Julian day numbers or ISO8601 strings ("YYYY-MM-DD HH:MM:SS.SSS"), respectively.
In conclusion, SQLite will never complain about any value in any column defined with any data type (except for integer primary key).
It is your responsibility to make sure that you store values in a consistent format that you can read, write, calculate with, compare, etc.
For example, never store text datetimes in any format other than "YYYY-MM-DD HH:MM:SS.SSS", because SQLite's datetime functions work only with this format.
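For illustration, a small sketch (the events table is made up) showing that what matters is the stored value, not the declared type, and that the formats above work with the built-in date and time functions:
CREATE TABLE events (happened_at datetime);           -- "datetime" is just a label; the column gets NUMERIC affinity
INSERT INTO events VALUES ('2021-03-01 10:30:00');    -- stored as TEXT (ISO8601)
INSERT INTO events VALUES (1614594600);               -- stored as INTEGER (Unix time)
SELECT happened_at, typeof(happened_at) FROM events;  -- reports 'text' and 'integer'
SELECT datetime(happened_at, 'unixepoch')             -- converts the Unix time to '2021-03-01 10:30:00'
FROM events WHERE typeof(happened_at) = 'integer';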
To amplify @forpas's excellent answer: you can make up for SQLite's weak type checking by using CHECK constraints in your table definitions. I frequently do that to enforce integers or string lengths, and you could use SQLite's date functions to verify that a value can be parsed as a date and lies between two bounds.
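A sketch of that idea with made-up names, using the date functions inside a CHECK constraint:
CREATE TABLE appointments (
    id        INTEGER PRIMARY KEY,
    starts_at TEXT NOT NULL
              CHECK (datetime(starts_at) IS NOT NULL                      -- must parse as a datetime
                     AND starts_at BETWEEN '2000-01-01' AND '2100-01-01') -- and fall within a sane range
);
INSERT INTO appointments (starts_at) VALUES ('2024-05-01 09:00:00');  -- accepted
INSERT INTO appointments (starts_at) VALUES ('next tuesday');         -- fails the CHECK constraint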

Conditional casting of column datatype

I have a subquery that returns a varchar column. In some cases this column contains only numeric values, and in those cases I need to cast it to bigint. I tried using a CAST(CASE ...) construction, but CASE is an expression that returns a single result and, regardless of the branch taken, it must always produce the same data type (or one implicitly convertible to it). Is there any tricky way to change a column's datatype depending on a condition in PostgreSQL, or not? Google can't help me.
SELECT
    prefix,
    module,
    postfix,
    id,
    created_date
FROM
    (SELECT
        s."prefix",
        coalesce(m."replica", to_char(CAST((m."id_type" * 10 ^ 12) AS bigint) + m."id", 'FM0000000000000000')) "module",
        s."postfix",
        s."id",
        s."created_date"
    FROM some_subquery
There is really no way to do what you want.
A SQL query returns a fixed set of columns, with the names and types fixed. So, a priori, what you want to do does not fit well within SQL.
You could work around this by inventing your own type that can hold either a big integer or a string, or by storing the value as JSON. But those are workarounds: the SQL query itself really returns one type for each column; that is how SQL works.
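For what it is worth, a sketch of the JSON workaround with made-up sample data: the query still returns a single column type (jsonb), but each row carries either a number or a string inside it:
SELECT val,
       CASE
           WHEN val ~ '^[0-9]+$' THEN to_jsonb(val::bigint)  -- numeric text becomes a JSON number
           ELSE to_jsonb(val)                                -- everything else stays a JSON string
       END AS val_json
FROM (VALUES ('12345'), ('abc')) AS t(val);
The caller then has to inspect the JSON value to decide how to use it, which is the trade-off described above.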

Error unable to convert data type nvarchar to float

I have searched both this great forum and googled around but unable to resolve this.
We have two tables (and trust me I have nothing to do with these tables). Both tables have a column called eventId.
However, in one table, data type for eventId is float and in the other table, it is nvarchar.
We are selecting from table1, where eventId is defined as float, and saving that Id into table2, where eventId is defined as nvarchar(50).
As a result of the discrepancy in data types, we are getting an error converting data type nvarchar to float.
Without fooling around with the database, I would like to cast the eventId to get rid of this error.
Any ideas what I am doing wrong with the code below?
SELECT
CAST(CAST(a.event_id AS NVARCHAR(50)) AS FLOAT) event_id_vre,
The problem is most likely that some of the rows have an empty event_id. There are two ways to go about solving this: either convert your float to nvarchar, rather than the other way around, or add a condition to check for empty IDs before the conversion. The first conversion will always succeed; the only problem is if the textual representations differ (say, the table with float-as-nvarchar uses fewer decimal digits). The second may not work if some of the event IDs are non-empty strings that are not float-convertible either (e.g. there is a word in the field instead of a number).
The second solution would look like this:
SELECT
    CASE WHEN a.event_id <> ''
         THEN CAST(CAST(a.event_id AS NVARCHAR(50)) AS FLOAT)
         ELSE 0.0
    END AS event_id_vre,
Convert float to nvarchar instead of nvarchar to float. Of course!
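For reference, a rough sketch of both options against the tables from the question; the exact column names are assumptions, and TRY_CAST requires SQL Server 2012 or later:
-- Convert the float side to nvarchar, which always succeeds:
SELECT CONVERT(NVARCHAR(50), a.event_id) AS event_id_vre
FROM table1 a;

-- Or keep the float target but turn unconvertible values into NULL instead of an error:
SELECT TRY_CAST(b.event_id AS FLOAT) AS event_id_float
FROM table2 b;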

SQL Convert statement

I need help with a SQL CONVERT statement. I have NetQuanity (masterTable), which is a varchar(15), and another table with Purchase price (PO TABLE), which is money. When I try to multiply them in a SQL view it gives me an error.
If your field is a VARCHAR, you'll need to CAST to the appropriate data type prior to your operation. e.g.
CAST(myVarCharField as INT) * myIntField
Be forewarned however, if you attempt to CAST this field to a numeric data type and it's not numeric, you'll be in the same boat.
I would recommend using CAST over CONVERT in your example, for the reasons covered in this related SO post: T-SQL Cast versus Convert.
Maybe try using the CONVERT function? CONVERT(money, NetQuantity).
First of all, you have a data definition problem.
The first step is to eliminate any non-numeric entries in the master table:
SELECT whatever FROM masterTable WHERE ISNUMERIC(NetQuanity) = 1
The next step is to include this as a subquery in the calculation.
In this query, use CONVERT or CAST to convert the valid quantities to integer, i.e.
CONVERT(INT, NetQuantity)
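Putting the two steps together might look roughly like this; the PO table name, its price column and the join key are assumptions made for the sketch, so substitute your own:
SELECT CONVERT(INT, m.NetQuantity) * p.PurchasePrice AS LineTotal   -- quantity times money price
FROM (
    SELECT *                                    -- keep only rows whose quantity is numeric
    FROM masterTable
    WHERE ISNUMERIC(NetQuantity) = 1
) m
JOIN poTable p ON p.OrderId = m.OrderId;        -- hypothetical join column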

Define the datatype of a column that holds really big numbers in SQL Server

I have data greater than this number. If I attempt to sum several of them, like:
1,22826520941614E+24 + 1,357898350941614E+34 + 1,228367878888764E+26
I get NULL as the result. How should I define the table datatype for that kind of field?
I am using float, but it does not work.
If you're getting NULL back, it's not the data type. It's because you have a null value in one of the rows of data. NULL + anything is NULL.
Change your Sum() to include a WHERE YourNumericColumn IS NOT NULL, or use COALESCE().
A float is sufficiently large to contain data of that range. It can store binary floating-point values from -1.79E+308 to 1.79E+308. I suspect an error elsewhere in your statement.
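A small sketch of both points, with invented table and column names: float comfortably holds numbers of that magnitude, and wrapping each operand in COALESCE() stops a single NULL from turning the whole expression into NULL:
-- Literals of the same magnitude as in the question add up fine as float:
SELECT CAST(1.228E+24 AS float) + CAST(1.358E+34 AS float) + CAST(1.228E+26 AS float);

-- Row-wise addition: one NULL operand makes the result NULL, so default NULLs to 0:
SELECT COALESCE(col_a, 0) + COALESCE(col_b, 0) + COALESCE(col_c, 0) AS total
FROM dbo.BigNumbers;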