I am trying to extract the time from a datetime column in my Amazon Redshift database (based on PostgreSQL 8.0). I have already referred to previous questions on this topic, but I am getting an unusual error.
When I try:
SELECT collected_timestamp::time
or
SELECT cast(collected_timestamp as time)
I get the following error:
ERROR: Specified types or functions (one per INFO message) not supported on Redshift tables
The goal is to pull the time portion from the timestamp such that 2017-11-06 13:03:28 returns 13:03:28.
This seems like an easy problem to solve but for some reason I am missing something. Researching that error does not lead to anything meaningful. Any help is appreciated.
Note that Redshift <> PostgreSQL - it was forked from PostgreSQL but is very different under the hood.
You're trying to cast a timestamp value to the data type "time", which does not exist in Redshift. To return only the time component of a timestamp, you will need to cast it to a character data type, e.g.:
SELECT to_char(collected_timestamp, 'HH24:MI:SS');
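Applied to the sample value from the question (the literal here is just that example, substituted in for illustration):

-- The question's sample timestamp formatted with to_char;
-- the result is the time-of-day portion as text.
SELECT to_char('2017-11-06 13:03:28'::timestamp, 'HH24:MI:SS');
-- -> '13:03:28'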
There are a few ways; here is one I use:
SELECT ('2018-03-07 21:55:12'::timestamp - trunc('2018-03-07 21:55:12'::timestamp))::time;
I hope that helps.
EDIT: I have made incorrect use of ::time; please see the comments on the other answer.
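A hedged sketch of the same subtraction idea without the unsupported ::time cast, expressed with DATEDIFF instead (note this yields seconds since midnight rather than a formatted time):

-- Hedged sketch: seconds elapsed since midnight of the same day,
-- avoiding any cast to the unsupported "time" type.
SELECT DATEDIFF(
         second,
         trunc('2018-03-07 21:55:12'::timestamp)::timestamp,  -- midnight of that day
         '2018-03-07 21:55:12'::timestamp
       ) AS seconds_since_midnight;
-- -> 78912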
Related
I'm trying to query a BigQuery table that has a column "date" (set to type DATE in the schema) formatted as yyyy-mm-dd-??. In other words, there's an extra piece of information appended to the date, and I'm not really sure what it is. When I try to query the "date" column I run into this error:
SQL Error [100032] [HY000]: [Simba]BigQueryJDBCDriver Error executing query job. Message: Invalid date: '2022-09-03-01'
I've tried cast(date as string), cast(left(date, 10) as string), and all kinds of workarounds, but the error persists. No matter how hard I try to nail it home in the query that I want this weird date column read as a string so I can work with it, BigQuery still wants to treat it as a date, I guess because that's how it's set up in the schema. I don't care whether this is parsed into a proper date or read as a string that I then parse myself; I just want to be able to query the date column without getting an error.
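No answer is recorded for this one, but here is a hedged sketch of one likely way out: a value like '2022-09-03-01' cannot live in a native DATE column, which suggests the table is external and the schema is only applied at read time. If so, redefining the column as STRING lets the raw value through, and the leading date can be parsed defensively. All names below are hypothetical.

-- Hedged sketch: assumes a variant of the table whose schema declares
-- the column as STRING; SAFE.PARSE_DATE returns NULL instead of
-- erroring on malformed values.
SELECT
  raw_date,
  SAFE.PARSE_DATE('%Y-%m-%d', LEFT(raw_date, 10)) AS parsed_date
FROM `my_project.my_dataset.my_table_as_strings`;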
Complete newbie to PrestoDB here. I'm following the documentation and can create a table with several types, but when it comes to timestamps with time zones or intervals I can't create them from DBeaver on my Presto 0.252 (using driver 0.273.3); I get syntax errors.
However, I can create them on the underlying PostgreSQL that I use. The timestamptz then gets shown as a plain timestamp in Presto, and the interval doesn't show up as a column at all. I'm missing something here; isn't it listed as a supported type? Should I work around it by storing the value as varchar and then casting it?
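No answer is recorded for this one either, but here is a hedged sketch of the varchar workaround the question floats, i.e. storing the value as varchar and casting at query time (table and column names are hypothetical):

-- Hedged sketch: store the zoned timestamp as text, cast when querying.
CREATE TABLE events (ts_text varchar);
INSERT INTO events VALUES ('2021-05-01 10:15:00 UTC');
SELECT CAST(ts_text AS timestamp with time zone) AS ts FROM events;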
I want to find the difference between two dates in Azure ML using the Apply SQL Transformation module. After a lot of searching I found that DateDiff would be helpful for this task. Unfortunately, it's not working: it always fails on the datepart argument with an error saying there is no such column in the database. How do I resolve this?
SQL query
SELECT datediff(month,Dispatch_Date,Order_Date) as Month_Diff
from t1;
Error: is not correct: SQL logic error or missing database no such column: month
Use the abbreviation for the date part instead of using month directly.
SELECT datediff(mm,Dispatch_Date,Order_Date) as Month_Diff
from t1;
Refer to the SQL Server documentation for more details: SQL Server DATEPART documentation.
DateDiff won't work because the module isn't running SQL Server, it's SQLite.
You should use the SQLite functions to get the difference.
For example, to get the difference in days, use:
CAST((julianday(EndDate) - julianday(StartDate)) AS INTEGER)
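Following the same logic, a hedged sketch of a month difference in SQLite for the original query, computed from the year and month parts (column names taken from the question):

-- Hedged sketch: months between two dates via strftime,
-- since DATEDIFF does not exist in SQLite.
SELECT (CAST(strftime('%Y', Order_Date) AS INTEGER) -
        CAST(strftime('%Y', Dispatch_Date) AS INTEGER)) * 12
     + (CAST(strftime('%m', Order_Date) AS INTEGER) -
        CAST(strftime('%m', Dispatch_Date) AS INTEGER)) AS Month_Diff
FROM t1;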
The latest version of Tableau has started using standard SQL when it connects to Google's BigQuery.
I recently tried to update a large table but found that there appeared to be errors when trying to parse datetimes. The table originates as a CSV which is loaded into BigQuery, where further manipulation happens. The datetime column in the original CSV contains strings in ISO standard datetime format (basically yyyy-mm-dd hh:mm). This saves a lot of annoying manipulation later.
But on trying to convert the datetime strings in Tableau into dates or datetimes I got a bunch of errors. On investigation they seemed to come from BigQuery and looked like this:
Error: Invalid timestamp: '2015-06-28 02:01'
I thought at first this might be a Tableau issue, so I loaded a chunk of the original CSV into Tableau directly, where the conversion of the string to a date worked perfectly well.
I then tried simpler versions of the conversion (to a year rather than a full datetime) and they still failed. The generated SQL for the simplest conversion looks like this:
SELECT
EXTRACT(YEAR
FROM
CAST(`Arrival_Date` AS TIMESTAMP)) AS `yr_Arrival_Date_ok`
FROM
`some_dataset`.`some_table` `some_table`
GROUP BY
1
The invalid timestamp in the error message always looks to me like a perfectly valid timestamp. And further analysis suggests it doesn't happen for all the rows in the source table, just occasional ones.
This error did not appear in older versions of Tableau/BigQuery, where legacy SQL was the default for Tableau, so I'm presuming it is a consequence of standard SQL.
So is there an intermittent problem with casting to timestamps in BigQuery? Or is this a Tableau problem which causes the SQL to be incorrectly formatted? And what can I do about it?
The seconds part in the canonical timestamp representation is required if the hour and minute are present. Try this instead with PARSE_TIMESTAMP and see if it works:
SELECT
EXTRACT(YEAR
FROM
PARSE_TIMESTAMP('%F %R', `Arrival_Date`)) AS `yr_Arrival_Date_ok`
FROM
`some_dataset`.`some_table` `some_table`
GROUP BY
1
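As a quick sanity check, the format string lines up with the value from the error message ('%F %R' is shorthand for '%Y-%m-%d %H:%M', i.e. no seconds):

-- Should parse cleanly, defaulting seconds to zero.
SELECT PARSE_TIMESTAMP('%F %R', '2015-06-28 02:01');
-- -> 2015-06-28 02:01:00 UTC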
I have a simple query which states
convert(decimal(20,10), a.sumclk) / nullif(convert(decimal(20,10), a.sumimp), 0) as CTR1
When I run this I get a message saying 'Data Type "sumclk" does not match a Defined Type name.'
I looked around for what this means, but I'm stuck.
I'm using Teradata
Instead of convert(decimal(20,10), a.sumclk) (which is MSSQL syntax), try CAST(a.sumclk AS decimal(20,10)) (which I found on the Teradata forums: http://forums.teradata.com/forum/database/explicit-casting).
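Putting that together, the original expression would become something like this (the FROM clause is hypothetical, reconstructed from the alias in the question):

-- ANSI CAST syntax, which Teradata accepts; NULLIF still guards
-- against division by zero.
SELECT CAST(a.sumclk AS DECIMAL(20,10))
       / NULLIF(CAST(a.sumimp AS DECIMAL(20,10)), 0) AS CTR1
FROM my_table a;  -- table name is hypothetical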