I'm trying to use PDI to subtract 1 minute (60 seconds) from the default system date variable. I have this connected to a Calculator step, but it either gives me an error or no time value:
2020/04/20 11:51:00 - Calculator.0 - ERROR (version 9.0.0.0-423, build 9.0.0.0-423 from 2020-01-31 04.53.04 by buildguy) : Unexpected error
2020/04/20 11:51:00 - Calculator.0 - ERROR (version 9.0.0.0-423, build 9.0.0.0-423 from 2020-01-31 04.53.04 by buildguy) : org.pentaho.di.core.exception.KettleStepException:
2020/04/20 11:51:00 - Calculator.0 - Unable to find the second argument field 'test for calculation #1
2020/04/20 11:51:00 - Calculator.0 -
2020/04/20 11:51:00 - Calculator.0 - at org.pentaho.di.trans.steps.calculator.Calculator.processRow(Calculator.java:133)
2020/04/20 11:51:00 - Calculator.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2020/04/20 11:51:00 - Calculator.0 - at java.lang.Thread.run(Thread.java:748)
That calculation expects a fieldname, not a static value.
You can fix this by first adding another calculation, "Set field to constant value A", which does accept a literal value in the Field A column. Give it a name (sixty_seconds) and a value (60), then use that field as Field B in your real calculation.
Set Remove? to Y for the first calculation so that it doesn't get added to the output rows.
Alternatively, use a Modified Java Script Value step. MyTime is the system date field; add the code snippet below:
var MyNewTime;
// Copy the incoming date so the original MyTime field is not modified
MyNewTime = new Date(MyTime);
// Subtract one minute (60 seconds)
MyNewTime.setMinutes(MyTime.getMinutes() - 1);
I've tried finding an answer to this question but haven't been entirely successful (or maybe I just don't understand the PostgreSQL documentation).
I am using this function in SQLite:
datetime (message.date / 1000000000 + strftime ("%s", "2001-01-01"), "unixepoch", "localtime")
message.date is a large integer that can be converted to a date starting in 1987 e.g. 550535817000000000
strftime gives me 978307200
I am attempting to recreate this function in PostgreSQL but am definitely missing something, because I keep getting errors or incorrect output. A few attempts are below:
// Gives SQL Error [42883]: ERROR: operator does not exist: timestamp with time zone + timestamp with time zone
Hint: No operator matches the given name and argument types. You might need to add explicit type casts. Position: 51
TO_TIMESTAMP((message.date / 1000000000)) + TO_TIMESTAMP('2001-01-01', 'YYYY-MM-DD HR24:MI:SS')
// Date in 1987 with TZ -7 e.g. 1987-05-12 16:36:38.000 -0700
TO_TIMESTAMP((message.date / 1000000000))
// 2001-01-01 00:00:00.000 -0800
TO_TIMESTAMP('2001-01-01', 'YYYY-MM-DD HR24:MI:SS'),
I'm pretty sure I'm missing something. Can anyone help direct me to the right solution?
UPDATE:
The solution below is based on #matbalie's feedback; this date comes from the Messages chat.db date column in the message table.
to_timestamp((message.date / 1000000000)::integer + EXTRACT(EPOCH FROM '2001-01-01'::date)::integer) message_date,
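For reference, a complete query built around that expression might look like the sketch below. The message table comes from the question; the AT TIME ZONE clause mirrors SQLite's "localtime" modifier, and the zone name is an assumption to substitute with your own.
-- Sketch: message.date stores nanoseconds since 2001-01-01 (the Apple epoch), so
-- dividing by 1e9 and adding the Unix epoch value of 2001-01-01 gives Unix time.
SELECT
  to_timestamp(
    (message.date / 1000000000)::integer
    + EXTRACT(EPOCH FROM '2001-01-01'::date)::integer
  ) AT TIME ZONE 'America/Los_Angeles' AS message_date  -- zone name is an assumption
FROM message;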
A question from a beginner in BigQuery and data analytics.
What I want to do: upload a CSV file into BigQuery.
Columns of the dataset: Id, ActivityHour, StepTotal
Sample row: (1503960366, 4/12/2016 12:00:00 AM, 373)
Problem: When I upload it with the Schema (Auto detect) option, I get the error below.
Question: Could you please help me figure out how to upload this data?
Error while reading data, error message: Could not parse '4/12/2016
12:00:00 AM' as TIMESTAMP for field ActivityHour (position 1) starting
at location 27 with message 'Invalid time zone: AM'
Problem : So I added field manually as "Id:Integer, ActivityHour:Timestamp, StepTotal:Integer" but error again
Error while reading data, error message: Could not parse 'Id' as INT64
for field Id (position 0) starting at location 0 with message 'Unable
to parse'
Problem: So I added the fields manually with all columns as String, and it uploaded successfully.
But when I try to change the data type in SQL, SELECT CAST(Id as INTEGER), I get the error "Bad int64" again.
Now I have no idea how to proceed to the next step.
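The "Could not parse 'Id' as INT64" message usually means the CSV header row is being read as data (the load job has a "Header rows to skip" option for this). With everything loaded as STRING, a query along these lines can cast the columns; the staging table name is hypothetical, and SAFE_CAST / SAFE.PARSE_TIMESTAMP turn the header row into NULLs instead of failing:
-- Sketch: assumes the CSV was loaded with all columns as STRING into a
-- staging table named my_dataset.activity_raw (hypothetical name).
SELECT
  SAFE_CAST(Id AS INT64) AS Id,
  SAFE.PARSE_TIMESTAMP('%m/%d/%Y %I:%M:%S %p', ActivityHour) AS ActivityHour,
  SAFE_CAST(StepTotal AS INT64) AS StepTotal
FROM `my_dataset.activity_raw`
WHERE Id != 'Id'  -- drop the header row that was loaded as data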
I get the error Syntax error: Illegal input character "%" at [3:35] when trying to use this syntax:
WHERE _TABLE_SUFFIX = FORMAT_DATE(%E4Y%m%d, DATE_TRUNC(DATE_SUB(date_val, INTERVAL 1 MONTH), MONTH))
I get the same error for %Y%m%d. I'm trying to get, for example, 20210901 for Sep 1, 2021. Could you help with that?
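In BigQuery the format argument of FORMAT_DATE must be a quoted string literal, which is most likely what triggers the "Illegal input character "%"" error. A sketch of the quoted version, using the column name from the question (the %E4Y%m%d form should also work once quoted):
-- Sketch: quoting the format string; '%Y%m%d' yields e.g. 20210901.
WHERE _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d',
  DATE_TRUNC(DATE_SUB(date_val, INTERVAL 1 MONTH), MONTH))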
I have a column of string data type that represents a date. The date format is 'yyyy-MM-dd HH:mm:ss.SSS'. I want to truncate this date to the start of day. So for example,
2011-07-19 12:44:42.453 should become 2011-07-19 00:00:00.0
I have tried trunc(record_timestamp, 'DD') but it just gives me a blank string.
I also tried date_trunc(record_timestamp, 'DD') but I got the following exception:
java.lang.RuntimeException: org.apache.spark.sql.AnalysisException: Undefined function: 'date_trunc'. This function is neither a registered temporary function nor a permanent function registered in the database 'default'.; line 1 pos 7
Any help is appreciated.
Try this -
scala> spark.sql(s""" select "2011-07-19 12:44:42.453" as TS, concat(substring("2011-07-19 12:44:42.453", 0,10), " 00:00:00.0") as TS_Start_Of_Day """).show(false)
+-----------------------+---------------------+
|TS |TS_Start_Of_Day |
+-----------------------+---------------------+
|2011-07-19 12:44:42.453|2011-07-19 00:00:00.0|
+-----------------------+---------------------+
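If the string manipulation feels fragile, an alternative sketch (assuming the string can be cast to DATE in your Spark version) is to round-trip through DATE, which also yields midnight of the same day:
-- Sketch: casting to DATE drops the time portion; casting back gives 00:00:00.
SELECT CAST(CAST('2011-07-19 12:44:42.453' AS DATE) AS TIMESTAMP) AS TS_Start_Of_Day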
While using this query:
SELECT date_time
FROM test_table
where EXTRACT('epoch' FROM CONVERT_TIMEZONE(replace(timezone, '+', '-'),date_time::timestamp)) >= 1513036800
and EXTRACT('epoch' FROM CONVERT_TIMEZONE(replace(timezone, '+', '-'),date_time::timestamp)) <= 1513555200
limit 10
I am getting this error:
An error occurred when executing the SQL command:
[Amazon](500310) Invalid operation: Invalid data
Details:
-----------------------------------------------
error: Invalid data
code: 8001
context: Invalid format or data given: 0000-00-00 00:00:00
query: 1909217
location: funcs_timestamp.cpp:219
process: query1_59 [pid=25572]
-----------------------------------------------;
1 statement failed.
Execution time: 0.63s
I am unable to use the EXTRACT function twice in a single query.
I have tried using BETWEEN as well, but that did not work.
If I remove either EXTRACT expression from the date range, the query works.
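The "Invalid format or data given: 0000-00-00 00:00:00" detail suggests the failure comes from bad rows in the date_time column rather than from EXTRACT itself. A sketch of one workaround, assuming date_time is stored as text and that excluding the invalid value removes the offending rows (Redshift does not guarantee predicate evaluation order, so cleaning the bad rows in a staging step or subquery is the more reliable fix):
-- Sketch: exclude the invalid '0000-00-00 00:00:00' rows before casting.
SELECT date_time
FROM test_table
WHERE date_time <> '0000-00-00 00:00:00'
  AND EXTRACT('epoch' FROM CONVERT_TIMEZONE(replace(timezone, '+', '-'), date_time::timestamp)) >= 1513036800
  AND EXTRACT('epoch' FROM CONVERT_TIMEZONE(replace(timezone, '+', '-'), date_time::timestamp)) <= 1513555200
LIMIT 10;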