I have a table with a column defined as time CHAR(6) with values like '18:00' which I need to convert from char to time.
I searched here, but didn't succeed.
You can use the :: syntax to cast the value:
SELECT my_column::time
FROM my_table
If the value really is a valid time, you can just cast it:
select '18:00'::time
As said, you could use :: to cast, but you could also use the standard CAST() function:
SELECT CAST(my_column AS time) AS my_column_time
FROM my_table;
This also works in other databases, not just PostgreSQL.
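One caveat worth noting: since the column is declared CHAR(6), values shorter than six characters are blank-padded. In PostgreSQL the cast still works, because the time input parser ignores surrounding whitespace (a sketch, assuming PostgreSQL's padding behavior):

```sql
-- '18:00' stored in CHAR(6) is padded to '18:00 ';
-- PostgreSQL's time parser tolerates the trailing space
SELECT '18:00 '::time;           -- 18:00:00
SELECT CAST('18:00 ' AS time);   -- same result via standard CAST()
```

Other databases may be stricter about padding, so trimming first (`TRIM(my_column)::time`) is the defensive choice.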
Related
I have this Teradata query, but when I run it on Hive it isn't supported:
CAST(DATE '1900-01-01'+CAST( 999999999 - TRIM(BASM_DATE) AS INTEGER)
AS DATE) AS BASM_DATE
Error
Error while compiling statement: FAILED: SemanticException line
0:undefined:-1 Wrong arguments 'BASM_DATE': No matching method for class
org.apache.hadoop.hive.ql.udf.generic.GenericUDFOPDTIPlus with (date,
int)
Can you tell me which part to fix and what the corrected query would be? Thank you.
You need to use date_add(dt, num). Please use the SQL below:
date_add( '1900-01-01', 999999999 - cast( TRIM(BASM_DATE) as INT) ) AS BASM_DATE
I assumed BASM_DATE is a string column and using TRIM you are trying to remove trailing or leading spaces.
The Teradata SQL is relying on two implicit conversions that happen around the TRIM.
First the BASM_DATE is implicitly cast to VARCHAR using the default format, then TRIM is applied, then that string is implicitly cast to FLOAT followed by a floating-point subtraction, with that result explicitly cast to INTEGER. Then that number of days is added to the 1900-01-01 date, followed by a redundant CAST from DATE to DATE. (There's really no reason to use FLOAT here, that's just the type Teradata uses for implicit numeric conversions.) A better, clearer choice in Teradata that gives the same result would have been:
DATE '1900-01-01' + (999999999 - CAST(TO_CHAR(BASM_DATE,'YYYYMMDD') AS INTEGER)) AS BASM_DATE
So the Hive equivalent would be:
date_add('1900-01-01',999999999 - cast(date_format(BASM_DATE,'yyyyMMdd') AS INT)) AS BASM_DATE
or it may be more efficient to use
date_add('1900-01-01',999999999 - year(BASM_DATE)*10000 - month(BASM_DATE)*100 - day(BASM_DATE)) AS BASM_DATE
The query I'm running as a test is:
SELECT
UNIX_DATE(created_utc)
FROM `fh-bigquery.reddit_comments.2017_08`
But I keep getting this error:
Error: No matching signature for function UNIX_DATE for argument types:
INT64. Supported signature: UNIX_DATE(DATE) at [2:3]
I checked the datatype for the created_utc field and it's an integer. Casting and whatnot won't work either.
Would really appreciate any help. Thanks!
You should use TIMESTAMP_SECONDS() instead
#standardSQL
SELECT
TIMESTAMP_SECONDS(created_utc)
FROM `fh-bigquery.reddit_comments.2017_08`
LIMIT 5
Then you can use DATE() if you need date only
DATE(TIMESTAMP_SECONDS(created_utc))
UNIX_DATE() takes a DATE, not an INT64.
And DATE_FROM_UNIX_DATE() takes an INT64. SQL has a legacy problem of thinking of time ("date") in DAYS and not SECONDS like Unix. Thus:
SELECT DATE_FROM_UNIX_DATE(CAST(created_utc/86400 as INT64))
FROM `fh-bigquery.reddit_comments.2017_08`
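A small hedged refinement: in BigQuery standard SQL, `/` returns FLOAT64, and casting that to INT64 rounds to the nearest integer, which can push timestamps late in the day onto the next date. Integer division avoids that:

```sql
#standardSQL
-- DIV() truncates instead of rounding, so a comment posted
-- at 23:59 UTC stays on its own calendar day
SELECT DATE_FROM_UNIX_DATE(DIV(created_utc, 86400))
FROM `fh-bigquery.reddit_comments.2017_08`
LIMIT 5
```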
I'm trying to get the maximum date in a row. Both functions, MAX and GREATEST, return errors:
SEL Max(date1,date2,date3...)
SELECT Failed. 3706: Syntax error: expected something between a string or a Unicode character literal and ','.
SEL Greatest(date1,date2,date3...)
SELECT Failed. 9881: Function 'GREATEST' called with an invalid number or type of parameters
How to solve this?
thx
Yep, it's annoying: LEAST and GREATEST don't work with date/time types (fixed in 16.10).
As a workaround you can cast it to integer:
SEL cast(GREATEST(cast(date1 as int)
,cast(date2 as int)
,cast(date3 as int)
...) as date)
Hopefully there's no NULL in those columns; otherwise it gets ugly with additional COALESCE/NULLIF calls.
According to the documentation, arguments to the GREATEST function can't be dates. Try converting them to strings in the YYYYMMDD (or a similar fixed-width) format, so that string ordering matches date ordering.
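A sketch of that string-based workaround (assuming a Teradata release that ships TO_CHAR; my_table is a placeholder name). Fixed-width YYYYMMDD strings sort the same way the underlying dates do, so GREATEST on the strings picks the latest date:

```sql
SELECT CAST(GREATEST(TO_CHAR(date1, 'YYYYMMDD'),
                     TO_CHAR(date2, 'YYYYMMDD'),
                     TO_CHAR(date3, 'YYYYMMDD'))
            AS DATE FORMAT 'YYYYMMDD') AS max_date
FROM my_table;
```

As with the integer-cast workaround, NULLs in any column will need COALESCE handling.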
Try this :
SELECT (
SELECT MAX(maxdate)
FROM (
VALUES (date1)
,(date2)
,(date3)
) AS maximumdate(maxdate)
) AS maxdate
FROM #temp
In TD 16.x0, GREATEST/LEAST work with both dates and timestamps. However, users may need to qualify the function with its database, as if it were a UDF:
SELECT TD_SYSFNLIB.LEAST(CURRENT_TIMESTAMP(0),ADD_MONTHS(CURRENT_TIMESTAMP(0),2))
I have a column in a table where timestamps have been stored in VARCHAR format, but I need to compare these against a column of DATETIME values from another table to find time intervals, so I want to either cast or convert the VARCHAR timestamps to DATETIME. However, both casting and converting are giving me problems.
The format of the VARCHAR timestamp looks like this: "29/07/2012 01:53:36 +12".
Using the query:
SELECT CAST(event_timestamp AS datetime) FROM the_table
produces ERROR: date/time field value out of range: "29/07/2012 01:53:36 +12".
Using the query:
SELECT CONVERT(datetime, event_timestamp, 131) from the_table;
produces
ERROR: syntax error at or near ","
LINE 1: select CONVERT(datetime, event_timestamp, 131) from the_tab...
^ (note: this is pointing at the first comma).
The error with CONVERT actually happens even if you use a generic function such as getdate() for the data source. This db uses ANSI SQL-92 (or so I'm told). Could anyone please help me out with this?
This seems really painful, but the following should work:
select dateadd(hh, cast(right(tv, 3) as int),
CONVERT(datetime, left(tv, 10), 103)+CONVERT(datetime, substring(tv, 12, 8), 108)
)
from (select '29/07/2012 01:53:36 +12' as tv) t
I've never added datetime's before, but this just worked on SQL Server 2008.
Why can't SQL Server just support a flexible notation built around yyyy, mm, mmm, dd and so on?
The actual database is Aster Data, which is based on Postgres (as are several other engines). In this database, you would use to_timestamp(). See the documentation here: http://www.postgresql.org/docs/8.2/static/functions-formatting.html. The call would be something like:
to_timestamp(val, 'DD/MM/YYYY HH24:MI:SS') -- not sure if this gets the +12
There are no ANSI functions for date conversion, so each database does its own. Even string functions vary among databases (substr? substring? charindex? instr? locate?), so there is no ANSI way to do this.
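To illustrate how dialect-specific this is, here is the same string-to-timestamp conversion sketched in three dialects (illustrative only; not verified against every version of each engine):

```sql
-- PostgreSQL / Aster Data:
SELECT to_timestamp('29/07/2012 01:53:36', 'DD/MM/YYYY HH24:MI:SS');
-- MySQL:
SELECT STR_TO_DATE('29/07/2012 01:53:36', '%d/%m/%Y %H:%i:%s');
-- SQL Server (style 103 = dd/mm/yyyy):
SELECT CONVERT(datetime, '29/07/2012 01:53:36', 103);
```

Three engines, three unrelated function names and format mini-languages for the identical task.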
You are using the wrong syntax, try:
CONVERT(varchar(X), datetimeValue, 131)
Where X is the total number of characters desired.
You will then be able to search for a match with datetimeValue and event_timestamp, assuming each value share the same structure. This will allow you to match string against string.
If I'm not mistaken, the standard (ANSI SQL) CAST operator always expects time/date/timestamp literals in ISO format ('YYYY-MM-DD').
But according to the manual for Teradata V12 (can't test it), the format of the CAST operator is
CAST(character_expression AS TIMESTAMP timestamp_data_attribute)
with timestamp_data_attribute being a character value plus an optional FORMAT specifier.
So in your case this would probably be:
cast(event_timestamp AS TIMESTAMP FORMAT 'DD/MM/YYYYBHH:MI:SSBZ');
I'm not entirely sure about the format definition though. You'll probably need to adjust that
Btw: CONVERT isn't a standard SQL function. It's SQL Server specific.
We have a Netezza table that contains dates stored in a numeric YYYYMMDD format (eg 20090731).
What is the best Netezza syntax to use to convert this into date format?
eg
SELECT somefunction(20090731) as NZDATE
?
Easiest way to convert number to date would be
select date(to_char(20090731,'99999999')) as Number_As_DATE;
You can also use TO_DATE directly:
SELECT TO_DATE('20090731','YYYYMMDD') as NZDATE
to_date(sk_dim_time, 'YYYYMMDD')
My efforts were originally thwarted by invalid dates. The code below does work, as long as you wrap it in a statement to catch bad dates.
select to_date(substring(20090731 from 1 for 8),'YYYYMMDD') as NZDATE
Obviously 20090731 should be replaced with the name of the numeric variable.
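One way to sketch that "wrap it to catch bad dates" idea (num_col and my_table are hypothetical names; a range check filters obviously malformed values, though it cannot catch impossible dates like 20090231):

```sql
-- NULL out values that cannot possibly be a YYYYMMDD date
-- before handing them to to_date()
SELECT CASE
         WHEN num_col BETWEEN 19000101 AND 29991231
           THEN to_date(to_char(num_col, '99999999'), 'YYYYMMDD')
       END AS nzdate
FROM my_table;
```

Values that pass the range check but are still invalid dates (e.g. month 13, day 32) would still need to be filtered or caught separately.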
select to_date(20090731,'YYYYMMDD') as Number_As_DATE
This will work without converting to char.