I am storing a timestamp field in a SQLite3 column as TIMESTAMP DATETIME DEFAULT CURRENT_TIMESTAMP and I was wondering if there was any way for it to include milliseconds in the timestamp as well?
Instead of CURRENT_TIMESTAMP, use (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')) so that your column definition becomes:
TIMESTAMP DATETIME DEFAULT(STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW'))
For example:
CREATE TABLE IF NOT EXISTS event
(when_ts DATETIME DEFAULT(STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')));
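As a quick sanity check (a sketch reusing the event table and when_ts column from the example above), inserting a row and selecting it back should show the fractional seconds:
INSERT INTO event DEFAULT VALUES;
SELECT when_ts FROM event;  -- e.g. 2024-03-01 12:34:56.789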
To get the number of milliseconds since the epoch you can use julianday() with some additional calculations:
-- Julian time to Epoch MS
SELECT CAST((julianday('now') - 2440587.5)*86400000 AS INTEGER);
The following method doesn't require any multiplication or division and should always produce the correct result, since multiple calls to 'now' within a single query always return the same value:
SELECT strftime('%s','now') || substr(strftime('%f','now'),4);
This generates the number of seconds and concatenates it with the milliseconds part of the current second.
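If you would rather store that value as a number than as text, the same expression can be wrapped in a CAST; a minimal sketch (the epoch_ms alias is just illustrative):
SELECT CAST(strftime('%s','now') || substr(strftime('%f','now'),4) AS INTEGER) AS epoch_ms;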
Here's a query that will generate a timestamp as a string with milliseconds:
select strftime("%Y-%m-%d %H:%M:%f", "now");
If you're really bent on using a numeric representation, you could use:
select julianday("now");
The accepted answer only gives you UTC. If you need a local time instead of UTC, use this:
strftime('%Y-%m-%d %H:%M:%f', 'now', 'localtime')
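The modifier also works inside a column default; a sketch based on the earlier example (table and column names assumed):
CREATE TABLE IF NOT EXISTS event_local
(when_ts DATETIME DEFAULT(STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW', 'localtime')));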
I want to MD5 a timestamp column in Hive, without the milliseconds.
If the timestamp is before the Unix epoch (year 1970), the timestamp is corrupted:
START_DATE=1915-07-15 23:25:26.290448384
select ID, START_DATE, MD5(START_DATE) from TABLE1
Result: START_DATE = 2500-02-02 00:00:00.0
There is no issue without the MD5 function, or if the timestamp is after 1970.
I've tried the vectorized parameter (https://cwiki.apache.org/confluence/display/hive/vectorized+query+execution#VectorizedQueryExecution-Limitations) but the issue is still the same.
I also tried casting to string, substr... before MD5.
How can we handle timestamps before 1970?
Can you use something like this?
select md5(from_unixtime(unix_timestamp(substring('1915-07-15 23:25:26.290448384',1,19)))) as md5_out
unix_timestamp will represent any date before 1970 as a negative number and calculate accordingly, so I think it should be alright.
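Applied to the question's table, it would look roughly like this (a sketch assuming START_DATE is a timestamp column, hence the cast to string before taking the substring):
select ID, START_DATE,
       md5(from_unixtime(unix_timestamp(substring(cast(START_DATE as string), 1, 19)))) as md5_out
from TABLE1;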
I store the date from Calendar.getTimeInMillis() in a SQLite DB.
I need to mark the first rows of every month in a SELECT statement, so I need to convert the time in milliseconds into some date format using SQLite functions only. How can I avoid this?
One of SQLite's supported date/time formats is Unix timestamps, i.e., seconds since 1970.
To convert milliseconds to that, just divide by 1000.
Then use some date/time function to get the year and the month:
SELECT strftime('%Y-%m', MillisField / 1000, 'unixepoch') FROM MyTable
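To get closer to the original goal of marking the first row per month, you could group on that expression; a minimal sketch, assuming the same MillisField and MyTable names:
SELECT strftime('%Y-%m', MillisField / 1000, 'unixepoch') AS month,
       MIN(MillisField) AS first_row_millis
FROM MyTable
GROUP BY strftime('%Y-%m', MillisField / 1000, 'unixepoch');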
datetime() expects an epoch time in seconds, while you are passing in milliseconds. Convert to seconds and apply:
SELECT datetime(1346142933585/1000, 'unixepoch');
You can verify this with this fiddle:
http://sqlfiddle.com/#!5/d41d8/223
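Note that the integer division drops the milliseconds; if you want to keep them, divide by 1000.0 and format with %f (a sketch using the same sample value):
SELECT strftime('%Y-%m-%d %H:%M:%f', 1346142933585 / 1000.0, 'unixepoch');
-- e.g. 2012-08-28 08:35:33.585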
Do you need to avoid the milliseconds-to-date conversion, or do you need a function to convert milliseconds to a date?
Since SQLite date functions work with seconds, you can either:
1. convert the milliseconds in your query, like this:
select date(milliscolumn/1000,'unixepoch','localtime') from table1
2. convert the milliseconds to seconds before saving to the DB, and then use the date function in the SQL query (see the sketch below).
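A sketch of the second option, with a hypothetical secondscolumn holding the already-divided value:
INSERT INTO table1 (secondscolumn) VALUES (1346142933585 / 1000);
SELECT date(secondscolumn, 'unixepoch', 'localtime') FROM table1;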
I have a temp table.
It has a last_update column in the 2/10/2018 6:01:50 PM datetime format.
How can I write the best query to display all rows that were updated on 02-Oct-2018?
You can use the TRUNC function:
select *
from tab
where trunc(last_update) = date'2018-10-02'
It is preferable to avoid TRUNC, especially if you have an index on the column last_update.
A simple WHERE condition should be better and may perform better:
WHERE last_update >= date '2018-10-02' AND
last_update < date '2018-10-02' + 1
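Put together, the full query would look like this (a sketch assuming the same tab table as above):
select *
from tab
where last_update >= date '2018-10-02'
  and last_update < date '2018-10-02' + 1;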
Use the TRUNC function to get the same day:
trunc(last_update) = trunc(to_date('02-Oct-2018', 'DD-MONTH-YYYY'))
The TRUNC (date) function returns date with the time portion of the day truncated to the unit specified by the format model fmt. The value returned is always of datatype DATE, even if you specify a different datetime datatype for date. If you omit fmt, then date is truncated to the nearest day.
You can also use format DD-MON-YYYY
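To illustrate the fmt argument mentioned above (a hedged example with arbitrarily chosen values):
select trunc(to_date('02-Oct-2018 18:01:50', 'DD-MON-YYYY HH24:MI:SS')) as day_start,        -- 02-OCT-2018 00:00:00
       trunc(to_date('02-Oct-2018 18:01:50', 'DD-MON-YYYY HH24:MI:SS'), 'MM') as month_start -- 01-OCT-2018 00:00:00
from dual;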
I have a column eventtime that stores only the time of day as a string, e.g.:
0445AM means 04:45 AM. I am using the query below to convert it to a Unix timestamp.
select unix_timestamp(eventtime,'hhmmaa'),eventtime from data_raw limit 10;
This seems to work fine for test data. I always thought a Unix timestamp was a combination of date and time, while here I only have the time. My question is: what date does it assume when executing the above function? The timestamps seem to be quite small.
A Unix timestamp is the bigint number of seconds since the Unix epoch (1970-01-01 00:00:00 UTC); it is a way to track time as a running total of seconds.
select unix_timestamp('0445AM','hhmmaa') as unixtimestamp
Returns
17100
And this is exactly 4hrs, 45min converted to seconds.
select 4*60*60 + 45*60
returns 17100
And to convert it back use from_unixtime function
select from_unixtime (17100,'hhmmaa')
returns:
0445AM
If you convert using a format that includes the date, you will see it assumes the date is 1970-01-01:
select from_unixtime (17100,'yyyy-MM-dd hhmmaa')
returns:
1970-01-01 0445AM
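If you want the parsed time anchored to today's date instead of 1970-01-01, one possible sketch (assuming Hive 1.2+ for current_date) is to prepend the current date before parsing:
select from_unixtime(unix_timestamp(concat(cast(current_date as string), ' ', '0445AM'), 'yyyy-MM-dd hhmmaa')) as today_at_0445;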
See the Hive functions docs here.
There is also a very useful site about the Unix timestamp.
We have a timestamp epoch column (BIGINT) stored in Hive.
We want to get Date 'yyyy-MM-dd' for this epoch.
Problem is my epoch is in milliseconds e.g. 1409535303522.
So select timestamp, from_unixtime(timestamp,'yyyy-MM-dd') gives wrong results for the date, as it expects the epoch in seconds.
So I tried dividing it by 1000, but then it gets converted to a double and we cannot apply the function to it. Even CAST is not working when I try to convert this double to a bigint.
Solved it with the following query:
select timestamp, from_unixtime(CAST(timestamp/1000 as BIGINT), 'yyyy-MM-dd') from Hadoop_V1_Main_text_archieved limit 10;
The type should be double to ensure precision is not lost:
select from_unixtime(cast(1601256179170 as double)/1000.0, "yyyy-MM-dd hh:mm:ss.SSS") as event_timestamp
Here timestamp_ms is Unix time in milliseconds:
SELECT from_unixtime(floor(CAST(timestamp_ms AS BIGINT)/1000), 'yyyy-MM-dd HH:mm:ss.SSS') as created_timestamp FROM table_name;
In the original answer you'll get a string, but if you'd like to get a date you need an extra cast to date:
select
timestamp,
cast(from_unixtime(CAST(timestamp/1000 as BIGINT), 'yyyy-MM-dd') as date) as date_col
from Hadoop_V1_Main_text_archieved
limit 10;
Docs for casting dates and timestamps. For converting string to date:
cast(string as date)
If the string is in the form 'YYYY-MM-DD', then a date value corresponding to that year/month/day is returned. If the string value does not match this format, then NULL is returned.
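For example (a small illustrative sketch):
select cast('2014-08-31' as date) as valid_date,     -- returns 2014-08-31
       cast('31-Aug-2014' as date) as wrong_format;  -- returns NULL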
The DATE type is available only from Hive 0.12.0 onward, as mentioned here:
DATE (Note: Only available starting with Hive 0.12.0)