Errors in the OLAP storage engine: The attribute key cannot be found when processing: Table: 'table1', Column: 'STRT_DT', Value: '6/1/2007 10:16:57'. The attribute is 'Date'.
I got this error after linking the fact table 'table1' with the time dimension table.
After investigating, I found that the STRT_DT value is not found because the corresponding value in the Date dimension looks like 6/1/2007 00:00:00 (midnight only), while the fact table carries a full timestamp.
So is there any way to link on the dates only, without the timestamp?
Use the CONVERT function, e.g. CONVERT(VARCHAR(10), GETDATE(), 110), to convert the data and link on the converted value.
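The underlying mismatch can be illustrated outside SQL as well. A minimal Python sketch (using made-up sample values matching the error message) of why '6/1/2007 10:16:57' fails to match the dimension member '6/1/2007 00:00:00' until both sides are truncated to the date:

```python
from datetime import datetime

fact_value = datetime(2007, 6, 1, 10, 16, 57)  # fact table STRT_DT (carries a time)
dim_value = datetime(2007, 6, 1, 0, 0, 0)      # Date dimension member (midnight)

# The full timestamps differ, so the attribute key lookup fails
assert fact_value != dim_value

# Truncating both sides to the date makes the join key match
assert fact_value.date() == dim_value.date()
```

This is exactly what stripping the time portion with CONVERT achieves on the SQL side.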
Related
We have a source flat file with a date column in the format 19/08/2013.
We have a target Oracle table with a date column in the format 11-AUG-13.
When we try to pass the source column value to the target using TO_CHAR and TO_DATE expressions like:

A ---> source column
V1 = TO_CHAR(A)
O1 (output column) = TO_DATE(V1, 'DD-MON-YY')

we get the error below:

Invalid date: [19/8/2013]. The row will be skipped.

Can anyone please point out where I'm going wrong?
Thank you
You need to convert the string to a date properly, and then Informatica will load the rows.
Change the mapping like this:
1. Change the data type in the source to a string, so the data is read as a string.
2. Keep your expressions, but change the TO_DATE call as below:
V1 (variable port, data type string) = A
O1 (output port, data type date) = IIF(IS_DATE(V1,'DD/MM/YYYY'), TO_DATE(V1,'DD/MM/YYYY'), NULL)
IS_DATE checks whether the value is a valid date; only if it is does the expression convert it to a proper date, otherwise it outputs NULL.
3. Connect the O1 column to the date-typed column in the Oracle target.
This makes sure all valid rows are loaded and none are skipped.
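The validate-then-convert pattern used above is not Informatica-specific. A small Python sketch of the same IIF(IS_DATE(...), TO_DATE(...), NULL) logic (the function name is mine, for illustration only):

```python
from datetime import datetime, date


def to_date_or_none(value):
    """Mimic IIF(IS_DATE(V1,'DD/MM/YYYY'), TO_DATE(V1,'DD/MM/YYYY'), NULL):
    validate the string first, convert only if valid, otherwise return None."""
    try:
        return datetime.strptime(value, "%d/%m/%Y").date()
    except ValueError:
        return None


print(to_date_or_none("19/08/2013"))  # 2013-08-19
print(to_date_or_none("not a date"))  # None
```

Rows that fail validation are carried through as NULL rather than rejected, which is why none get skipped.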
I am using a SQL script to parse a JSON document into a table using dbt. One of the columns has this date value: '2022-02-09T20:28:59+0000'. What would be the correct way to define this ISO date's data type in Snowflake?
Currently, I just used the date type like this in my dbt SQL script:
JSON_DATA:"situation_date"::date AS MY_DATE
but clearly date isn't the correct one, because later when I test it using select *, I get this error:
SQL Error [100040] [22007]: Date '2022-02-09T20:28:59+0000' is not recognized
So I need to know which Snowflake date or datetime data type suits this best.
Pulling the date out of the JSON correctly is not so clear-cut. This first attempt:
SELECT
'{"date":"2022-02-09T20:28:59+0000"}' as json_str
,parse_json(json_str) as json
,json:date as data_from_json
,TRY_TO_TIMESTAMP_NTZ(data_from_json, 'YYYY-MM-DDTHH:MI:SS+0000') as date_1
,TRY_TO_TIMESTAMP_NTZ(substr(data_from_json,1,19), 'YYYY-MM-DDTHH:MI:SS') as date_2
;
gives the error:
Function TRY_CAST cannot be used with arguments of types VARIANT and TIMESTAMP_NTZ(9)
because data_from_json is of type VARIANT, while TRY_TO_TIMESTAMP_NTZ expects TEXT, so we need to cast it to text first:
SELECT
'{"date":"2022-02-09T20:28:59+0000"}' as json_str
,parse_json(json_str) as json
,json:date as data_from_json
,TRY_TO_TIMESTAMP_NTZ(data_from_json::text, 'YYYY-MM-DDTHH:MI:SS+0000') as date_1
,TRY_TO_TIMESTAMP_NTZ(substr(data_from_json::text,1,19), 'YYYY-MM-DDTHH:MI:SS') as date_2
;
If all your timezones are always +0000, you can just put that in the parse format (as in date_1), or you can truncate that part off (as in date_2).
This gives:

JSON_STR: {"date":"2022-02-09T20:28:59+0000"}
JSON: { "date": "2022-02-09T20:28:59+0000" }
DATA_FROM_JSON: "2022-02-09T20:28:59+0000"
DATE_1: 2022-02-09 20:28:59.000
DATE_2: 2022-02-09 20:28:59.000
Using TRY_TO_TIMESTAMP:
SELECT TRY_TO_TIMESTAMP(JSON_DATA:"situation_date", 'format_here')
FROM tab;
so I need to know which Snowflake date data type or datetime type suits the best with this one

The input format for timestamps can be set at the ACCOUNT, USER, or SESSION level via the TIMESTAMP_INPUT_FORMAT parameter. From the Snowflake documentation on AUTO detection of integer-stored date, time, and timestamp values:
Avoid using AUTO format if there is any chance for ambiguous results. Instead, specify an explicit format string by setting TIMESTAMP_INPUT_FORMAT and other session parameters for dates, timestamps, and times. See Session Parameters for Dates, Times, and Timestamps (in this topic).
I think ::TIMESTAMP should work for this, i.e. JSON_DATA:"situation_date"::TIMESTAMP. If you need just the date after that, you can then cast with ::DATE or use TO_DATE().
After some testing, it seems to me you have two options.
Either you can get rid of the +0000 at the end:
left(column_date, len(column_date)-5)::timestamp
or use the function try_to_timestamp with an explicit format:
try_to_timestamp('2022-02-09T20:28:59+0000','YYYY-MM-DD"T"HH24:MI:SS+TZHTZM')
TZH and TZM are the time zone offset hours and minutes.
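Both options, parsing the offset as part of the format versus truncating it, can be sketched in Python with the same sample value to show they recover the same timestamp:

```python
from datetime import datetime

raw = "2022-02-09T20:28:59+0000"

# Option 1: parse the offset as part of the format (%z accepts +0000)
ts_with_tz = datetime.strptime(raw, "%Y-%m-%dT%H:%M:%S%z")

# Option 2: cut off the 5-character offset and parse a naive timestamp
ts_naive = datetime.strptime(raw[:-5], "%Y-%m-%dT%H:%M:%S")

print(ts_with_tz.date())  # 2022-02-09
print(ts_naive)           # 2022-02-09 20:28:59
```

Note that truncation only matches the full parse because the offset here is +0000; with a nonzero offset the two options would represent different instants.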
My Flat file has a date column in this format: 2020-03-31
My SQL table has a column SaleDate as Date datatype : Example: 2020-11-01
I am getting error when I am importing the data:
Data conversion failed while converting column "SaleDate" (74). The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
What I tried:
I tried changing the flat file column type to DT_DBDATE and to DT_DBTIMESTAMP; neither of them works.
I also tried a Data Conversion transformation, and even that is not working.
Does anyone know how to handle this?
Thank you
The value could not be converted because of a potential loss of data
Usually this error occurs when the destination column size is smaller than the source column.
For example: I am bringing a Hive table column (datetime data type) value into Pig and want to extract only the DATE portion. I have tried using the ToDate function; the error information is below. Please help me in this critical situation.
The original value in this column is "2014-07-29T06:01:33.705-04:00", and I need the output as "2014-07-29".
ToDate(eff_end_ts,'YYYY-MM-DD') AS Delta_Column;
2016-07-28 07:07:25,298 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1045: Could not infer the matching function for org.apache.pig.builtin.ToDate as multiple or none of them fit. Please use an explicit cast.
Assuming your column name is f1 and it holds timestamps with values like 2014-07-29T06:01:33.705-04:00, you will have to use GetYear(), GetMonth(), and GetDay() and CONCAT the parts into the required format:
B = FOREACH A GENERATE CONCAT(
        CONCAT(
            CONCAT((chararray)GetYear(f1), '-'),
            CONCAT((chararray)GetMonth(f1), '-')),
        (chararray)GetDay(f1)) AS Day;
Note that GetMonth and GetDay return unpadded integers, so this yields 2014-7-29 rather than 2014-07-29.
I found a workaround, and it works this way:
ToDate(ToString(eff_end_ts,'YYYY-MM-DD'),'YYYY-MM-DD') AS (Delta_Column:datetime)
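The shape of that workaround, format the timestamp down to a date-only string, then parse that string back into a date-typed value, can be sketched in Python (using the sample value from the question):

```python
from datetime import datetime

# Format-then-reparse, mirroring ToDate(ToString(eff_end_ts,'YYYY-MM-DD'),'YYYY-MM-DD')
ts = datetime.fromisoformat("2014-07-29T06:01:33.705-04:00")
date_str = ts.strftime("%Y-%m-%d")                         # ToString(...)
as_date = datetime.strptime(date_str, "%Y-%m-%d").date()   # ToDate(...)
print(as_date)  # 2014-07-29
```

Formatting first throws away the time and offset, so the second parse has only a clean date pattern to match, which is why the ambiguity error disappears.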
I added a named query to my data source view, converting datetime to date. When I run the query, it displays datetime anyway. Do you know how to fix this? I need the date format to match the Time dimension.
I'm using SQL Server Data Tools 2012.
The SQL DATE datatype is rendered with a time portion in the Query preview, so that is how it is shown there. If the datatype of your Time dimension key is also DATE, then you should be OK to continue. Have you tried?
FWIW, I would use a BIGINT key for a Date dimension, containing the date represented as YYYYMMDD. This allows for extra rows, e.g. a -1 key for unspecified dates (such as NULLs in the fact data).
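The YYYYMMDD surrogate-key scheme is simple to compute; a small Python sketch (the function name is mine, for illustration):

```python
from datetime import date


def date_key(d):
    """Surrogate key for a date dimension: the date as a YYYYMMDD integer,
    with -1 reserved for unknown/NULL dates."""
    if d is None:
        return -1
    return d.year * 10000 + d.month * 100 + d.day


print(date_key(date(2007, 6, 1)))  # 20070601
print(date_key(None))              # -1
```

Because the key is an ordinary integer, it sorts chronologically and leaves room for sentinel values that no real date can collide with.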