Converting an epoch timestamp to a date in Vega

I need to do some light date addition in vega. I'm using the following expression, where datum.date is of type Date and datum.days is a number:
'transform': [
  {
    'type': 'formula',
    'expr': 'time(datum.date) + (1000*60*60*24*datum.days)',
    'as': 'x'
  },
]
This works great, but it results in a timestamp (e.g. 1627057587) instead of a JS Date object. I looked into toDate(), but that also seems to return a timestamp rather than a Date object.
How do I convert the result of this operation back to a date object?

Easy solution here. Similar to what you have done, you just want to use a formula like so (I'm assuming "x" is your epoch value):
{
  'type': 'formula',
  'expr': 'datetime(datum.x)',
  'as': 'TimestampFromEpoch'
}
Arguably, it's not overly clear in the Vega documentation that this is possible, but it does the job :)
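For completeness, here is the full transform with the two formulas chained (a sketch; the output field names 'x' and 'dateObj' are just illustrative):
'transform': [
  {
    'type': 'formula',
    'expr': 'time(datum.date) + (1000*60*60*24*datum.days)',
    'as': 'x'
  },
  {
    'type': 'formula',
    'expr': 'datetime(datum.x)',
    'as': 'dateObj'
  }
]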

Related

BigQuery : Returning timestamp from JS udf throwing "Failed to coerce output value to type TIMESTAMP"

I have the following BigQuery code.
CREATE TEMP FUNCTION to_struct_attributes(input STRING)
RETURNS STRUCT<status_code STRING, created_time TIMESTAMP>
LANGUAGE js AS """
  let res = JSON.parse(input);
  res['created_time'] = Date(res['created_time']);
  return res;
""";
SELECT
  5 AS ID,
  to_struct_attributes(
    TO_JSON_STRING(
      STRUCT(
        TIMESTAMP(PARSE_TIMESTAMP('%Y%m%d%H%M%S', '20220215175959', 'America/Los_Angeles')) AS created_time
      )
    )
  ) AS ATTRIBUTES;
When I execute this, I'm getting the following error:
Failed to coerce output value "2022-02-16 01:59:59+00" to type TIMESTAMP
I feel this is quite strange, since BigQuery should be able to interpret it correctly, and I haven't had this issue with any other data types. Also, if I do:
SELECT TIMESTAMP("2022-02-16 01:59:59+00")
It returns:
2022-02-16 01:59:59 UTC
So BigQuery can indeed parse it correctly. I'm not sure why it doesn't happen for the UDF. On searching the internet, I found this question and as the answer suggests, if I change the return statement to:
return Date(res.created_time);
It resolves the issue. But for a project of mine, doing it for every timestamp is not feasible due to the high number of struct columns.
So, I wanted to know if someone has a better alternative to it?
PS: I have removed a lot of non-essential parts from the above example, so it might look a bit abstract. Also, the actual use case is a bit different and more complex; that's why I need the JS UDF.
The best way to do what you want is to keep the conversion you already found:
return Date(res.created_time);
This happens because when you pass a TIMESTAMP to a JavaScript UDF, it is represented as a Date object, as stated in the documentation. The same applies in reverse: to return a TIMESTAMP from a JavaScript UDF, you need to construct and return a Date object.
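If many of your struct columns carry timestamps, one possible workaround (just a sketch, untested against your schema; the helper name coerceTimestamps and the field list are illustrative) is to centralise the conversion in one helper inside the UDF body:
LANGUAGE js AS """
  // Hypothetical helper: rebuild each listed field as a JavaScript Date,
  // which is how a TIMESTAMP is represented on its way out of a JS UDF.
  function coerceTimestamps(obj, fields) {
    fields.forEach(function (f) {
      if (obj[f] != null) {
        obj[f] = new Date(obj[f]);
      }
    });
    return obj;
  }

  let res = JSON.parse(input);
  return coerceTimestamps(res, ['created_time']); // list the other timestamp fields here
""";
(The sketch uses new Date(...) to construct an actual Date object; plain Date(...) without new returns a string in JavaScript.)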

dojo/mvc/at doesn't return dijit/form/DateTextBox in format of constraints datePattern

This seems to be a question that is often asked, but there doesn't seem to be an easy answer, or any answer at all, so I'll risk a duplicate here and ask again. I feel like I have a puzzle with four pieces and can't manage to put them together:
I'm using a dojo date picker like this:
<input data-dojo-type="dijit/form/DateTextBox"
data-dojo-props="constraints: { datePattern: 'yyyy-MM-dd'},
value: at(model, 'myDate')" />
The date picker displays the date in the UI like I want, but the value that's assigned to model.myDate keeps being in ISO format. I'd need that to be in yyyy-MM-dd, too.
I know that I can use dojo.date.locale.format to post-process the value, but that would run after it is saved in model.myDate. I'd like to get the value in the correct format right away: null if there's no input, undefined if there's no valid value, and a value in yyyy-MM-dd format when the given date is valid.
Maybe I can integrate that call to dojo.date.locale.format somehow? Something like .transform(..) or whatever is possible in dojo!?
I also read about overwriting the serialize method, but I don't see how and where to do that in here.
Any ideas or hint in the right direction? Many thanks in advance.
Hi, just wondering if something like at(model, prop).transform(converterObj) helps: http://dojotoolkit.org/reference-guide/1.10/dojox/mvc/at.html#data-converter
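In case it helps, here's a rough sketch of what such a converter object could look like (untested; the shape follows the data-converter section of those docs, with dojo/date/locale doing the conversion, and dateConverter is just an illustrative name):
require(["dojox/mvc/at", "dojo/date/locale"], function (at, locale) {
  var dateConverter = {
    // format: model -> widget; turn a stored "yyyy-MM-dd" string back into a Date
    format: function (value) {
      return typeof value === "string"
        ? locale.parse(value, { selector: "date", datePattern: "yyyy-MM-dd" })
        : value;
    },
    // parse: widget -> model; store the picked Date as a "yyyy-MM-dd" string
    parse: function (value) {
      return value instanceof Date
        ? locale.format(value, { selector: "date", datePattern: "yyyy-MM-dd" })
        : value; // null/undefined pass through for empty or invalid input
    }
  };
  // then bind with: value: at(model, 'myDate').transform(dateConverter)
});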

Cannot write date in BigQuery using Java Bigquery Client API

I'm doing some ETL from a CSV file in GCS to BQ. Everything works fine, except for dates. The field name in my table is TEST_TIME and the type is DATE, so in the TableRow I tried passing a java.util.Date, a com.google.api.client.util.DateTime, a String, and a Long value with the number of seconds, but none worked.
I got error messages like these:
Could not convert non-string JSON value to DATE type. Field: TEST_TIME; Value: ...
When using DateTime I got this error:
JSON object specified for non-record field: TEST_TIME.
//tableRow.set("TEST_TIME", date);
//tableRow.set("TEST_TIME", new DateTime(date));
//tableRow.set("TEST_TIME", date.getTime()/1000);
//tableRow.set("TEST_TIME", dateFormatter.format(date)); //e.g. 05/06/2016
I think that you're expected to pass a String in the format YYYY-MM-DD, just as if you were using the REST API directly with JSON. Try this:
tableRow.set("TEST_TIME", "2017-04-06");
If that works, then you can convert the actual date that you have to that format and it should also work.
While working with Google Cloud Dataflow, I used Google's wrapper for timestamps, com.google.api.client.util.DateTime.
This worked for me while inserting rows into BigQuery tables. So, instead of
tableRow.set("TEST_TIME" , "2017-04-07");
I would recommend
tableRow.set("TEST_TIME" , new DateTime(new Date()));
I find this to be a lot cleaner than passing timestamp as a string.
Using the Java class com.google.api.services.bigquery.model.TableRow, to set milliseconds since UTC into a BigQuery TIMESTAMP do this:
tableRow.set("timestamp", millisecondsSinceUTC / 1000.0d);
tableRow.set() expects a floating point number representing seconds since UTC with up to microsecond precision.
This is very non-standard and undocumented (set() boxes the value in an object, so it's unclear what data types set() accepts; the other proposed solution of using com.google.api.client.util.DateTime did not work for me).

jQuery Input Masks for datetime input using sql timestamp yyyy/mm/dd hh:mm:ss

I'm using Robin Herbots' jQuery Input Mask plugin on my project.
It's very good, but I need an SQL timestamp mask: yyyy/mm/dd hh:mm:ss. I don't know if I'm doing something wrong, but it seems the datetime alias shows only hours and minutes.
I've tried some changes to the mask, but without success.
Thanks.
I see the question is from long ago; I also came here while trying to learn jquery.inputmask.
Remember to always include what you have done (code sample) when asking a question. Even if it is wrong/not working, it will help the one providing an answer, and others looking for answers.
In general terms, I found it somewhat helpful to read through the jquery.inputmask.xxx.extensions.js files, where xxx = date in this instance. In there you can see how more complex aliases are constructed from more basic ones (by overriding the basic ones), and you can apply the same ideas in constructing a new alias if you don't find a useful one.
Code that should work for your case:
$("#tsfield").inputmask("timestamp", {
mask: "y/1/2 h:s:s",
placeholder: "yyyy/mm/dd hh:mm:ss",
separator: "/",
alias: "datetime",
hourFormat: "24"
});
... which creates a new alias named timestamp, overriding datetime, and applies it to the input with id="tsfield".
If you have more than one input field with the same input mask on your page, I find it is better to create the new alias just once in your $(document).ready(), and then apply it by name to each field (refer to jquery.inputmask.date.extensions.js and documentation for instructions).
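For example, something along these lines (a sketch against the older jquery.inputmask API, where aliases are registered via $.inputmask.defaults.aliases, which is how the extension files themselves do it; newer versions expose Inputmask.extendAliases instead, and ".ts-input" is just an illustrative selector):
$(document).ready(function () {
  // register the alias once...
  $.extend($.inputmask.defaults.aliases, {
    timestamp: {
      mask: "y/1/2 h:s:s",
      placeholder: "yyyy/mm/dd hh:mm:ss",
      separator: "/",
      alias: "datetime",
      hourFormat: "24"
    }
  });
  // ...then apply it by name to every field that needs it
  $(".ts-input").inputmask("timestamp");
});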

How to store a date in postgresql "json" datatype for use with plv8?

I wanted to use Date.UTC to store dates and datetimes in a PostgreSQL 9.2 "json" field, but of course it fails:
hp=> update formapp_record set data='{"dt": Date.UTC(120, 10, 2)}' where id=17;
ERROR: invalid input syntax for type json
LINE 1: update formapp_record set data='{"dt": Date.UTC(120, 10, 2)}...
^
DETAIL: Token "Date" is invalid.
CONTEXT: JSON data, line 1: {"dt": Date...
It is possible to store the UTC timestamp directly, but then how could the decoder know that the value should decode to a date or datetime instead of an int?
It is also possible to store the Date.UTC call as a string, like so:
update formapp_record set data='{"dt": "Date.UTC(120, 10, 2)"}' where id=17;
While that works, it requires (0) checking whether the string starts with Date.UTC, and (1) using eval in plv8.
A solution would be to store some metadata like:
update formapp_record set data='{"dt": {"_type": "date", "_value": [120, 10, 2]}}' where id=17;
But that's not very "standard", it's even "hackish".
What's your take on this matter?
Alas, JSON doesn't know anything about dates.
I'd store an ISO 8601 date as a string. Yes, it's a pain. Yes, it means there's no nice standard way to tell "this is a date" vs "this is a string". IMO it's less painful than most of the other options, though.
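For what that round trip looks like from the plv8 (JavaScript) side, a quick sketch with illustrative values:
// encode: store the date as an ISO 8601 string inside the json value
var d = new Date(Date.UTC(2020, 10, 2)); // JS months are 0-based, so this is 2020-11-02
var encoded = JSON.stringify({ dt: d.toISOString() }); // '{"dt":"2020-11-02T00:00:00.000Z"}'

// decode: by convention, any string that parses as ISO 8601 is treated as a date
var decoded = new Date(JSON.parse(encoded).dt);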
A possible solution is to use Postgres's row_to_json function and just store your dates as timestamps, extracting them to JSON as required. However, Tobe Hede wrote some JSON functions for Postgres that may help, and they seem to be a lot more complete than the two native options that Postgres has made available for 9.2.
See the post How do I query using fields inside the new PostgreSQL JSON datatype? for the thread.