I have a query wherein one of the columns is a DATE type. I'm trying to convert that to the nanosecond representation of the timestamp associated with the date:
Input        Output
2022-07-15   1657843200000000000
2022-07-18   1658102400000000000
2022-07-19   1658188800000000000
I can get a timestamp from a date by doing this:
SELECT TRY_TO_TIMESTAMP(TO_VARCHAR($1))
FROM (
SELECT DATE($1) FROM VALUES ('2022-07-15'), ('2022-07-18'), ('2022-07-19'))
but using TRY_TO_NUMERIC doesn't work, so I'm not sure what to do here.
You can use the datediff function to return the nanoseconds since the start of the Unix epoch, which is 1970-01-01:
with source_data as
(
    select COLUMN1::date as INPUT
    from (values
        ('2022-07-15'),
        ('2022-07-18'),
        ('2022-07-19')
    )
)
select INPUT
      ,datediff(nanoseconds, '1970-01-01'::date, INPUT) as OUTPUT
from SOURCE_DATA
;
INPUT        OUTPUT
2022-07-15   1657843200000000000
2022-07-18   1658102400000000000
2022-07-19   1658188800000000000
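Alternatively, DATE_PART with the epoch_nanosecond date part should give the same number directly (a sketch, assuming that date part is available in your Snowflake account):
-- sketch: assumes epoch_nanosecond is a supported date part
select COLUMN1::date                              as INPUT
      ,date_part(epoch_nanosecond, COLUMN1::date) as OUTPUT
from (values ('2022-07-15'), ('2022-07-18'), ('2022-07-19'));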
I have a timestamp field in a table where the seconds portion, SUBSTR(col,13,2), is 60 or more in some places.
I want to fix the invalid seconds portion and convert this kind of data into a valid timestamp in the format DDMMYYYYHHMISS.
Sample data:
CREATE VOLATILE TABLE TEST (COL VARCHAR(50)) ON COMMIT PRESERVE ROWS;
INSERT INTO TEST ('04012022000010');
INSERT INTO TEST ('31012022000066');
INSERT INTO TEST ('02012021000067');
   COL
1  31012022000066
2  02012021000067
3  04012022000010
That's @Kendle's logic in Teradata SQL:
select
cast(substring(col from 1 for 12) as timestamp(0) format 'ddmmyyyyhhmi') +
cast(substring(col from 13 for 2) as interval second) as TS_correct,
to_char(TS_correct,'ddmmyyyyhhmiss')
from test;
I think this is what you need. We convert the string without the seconds to a timestamp and add the number of seconds; for example, '31012022000066' becomes 2022-01-31 00:01:06.
I give two versions, because the requested format is not the standard ISO format and I don't know whether your local settings modify the automatic conversions. The first version uses the date format as requested in the question.
DDMMYYYYHHMISS
SELECT CAST(
    CONCAT(
        SUBSTR(COL,1,12),'00'
    ) AS TIMESTAMP(0) FORMAT 'DDMMYYYYHHMISS')
    + CAST(SUBSTR(COL,13,2) AS INTERVAL SECOND)
FROM TEST;
We convert the input to an ISO intermediate and then format the result to the requested format.
Intermediate: YYYY-MM-DD HH:MI:SS
Output: DDMMYYYYHHMISS
SELECT (CAST(
    CONCAT(
        SUBSTR(COL,5,4),'-',
        SUBSTR(COL,3,2),'-',
        SUBSTR(COL,1,2),' ',
        SUBSTR(COL,9,2),':',
        SUBSTR(COL,11,2),':00'
    ) AS TIMESTAMP)
    + CAST(SUBSTR(COL,13,2) AS INTERVAL SECOND))
    (FORMAT 'DDMMYYYYHHMISS') AS corrected_date
FROM TEST;
I need to concatenate two char columns into a single date/timestamp column.
I tried this:
INSERT INTO tb_teste PARTITION (dt_originacao_fcdr)
SELECT
tp_registro_fcdr,
seq_registro_fcdr,
tp_cdr_fcdr,
dt_atendimento_fcdr,
date_dt_atendimento_fcdr,
hr_atendimento_fcdr,
timestamp(from_unixtime(unix_timestamp(CONCAT(dt_atendimento_fcdr, hr_atendimento_fcdr), 'yyyyMMddHHmmss')), "yyyy-MM-dd HH:mm:ss") as date_hr_atendimento_fcdr,
duracao_atend_fcdr,
hr_originacao_fcdr,
duracao_total_fcdr,
duracao_chamada_tarifada_fcdr,
st_chamada_fcdr,
fim_sel_orig_fcdr,
numero_a_fcdr,
tp_numero_a_fcdr,
numero_b_fcdr,
tp_numero_b_fcdr,
numero_b_orig_fcdr,
numero_c_fcdr,
tp_numero_c_fcdr,
tp_trafego_fcdr,
esn_fcdr,
central_fcdr,
erb_fcdr,
tp_erb_fcdr,
face_erb_inici_fcdr,
erb_final_fcdr,
face_erb_final_fcdr,
erb_original_fcdr,
imsi_fcdr,
imei_fcdr,
tecnologia_fcdr,
cd_oper_ass_a_fcdr,
cd_oper_ass_b_fcdr,
cgi_fcdr,
nu_tlfn_fcdr,
tp_tlfn_fcdr,
tp_tarifa_fcdr,
ident_num_a_fcdr,
ident_num_b_fcdr,
cd_prestadora_fcdr,
cna_orig_ar_erb_fcdr
FROM tb_op_nor;
Result: date_hr_atendimento_fcdr = 2019-03-03 (the time portion is missing).
The columns containing the date and time are not null or empty.
Time zone: Brazil.
I need the date and time in the same column.
You can concatenate the two fields, pass the result to unix_timestamp, and then use the from_unixtime function to format the output timestamp.
with cte as (select stack(1,"20190303","131615") as (dt,hr)) --sample data
select
timestamp( --cast to timestamp
from_unixtime(unix_timestamp(concat(dt,hr),'yyyyMMddHHmmss'),"yyyy-MM-dd HH:mm:ss") --concat and change the format
)
from cte
Output:
2019-03-03 13:16:15.0
If you want to convert Brazil time to UTC or vice versa, use to_utc_timestamp/from_utc_timestamp.
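For example, a minimal sketch assuming 'America/Sao_Paulo' as the Brazil time zone id:
with cte as (select timestamp("2019-03-03 13:16:15") as local_ts)
select
  to_utc_timestamp(local_ts, 'America/Sao_Paulo')              as utc_ts,     -- Brazil local time -> UTC
  from_utc_timestamp(to_utc_timestamp(local_ts, 'America/Sao_Paulo'),
                     'America/Sao_Paulo')                       as round_trip  -- back to Brazil local time
from cte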
I'm trying to convert epoch time to a timestamp; in legacy SQL it works fine, but in standard SQL I'm getting an error.
The field custom_field_6 (a string field) represents epoch time, and I want to convert it to a timestamp (in standard SQL).
In legacy SQL I did the following:
select
custom_field_6 as custom_field_6,
timestamp(custom_field_6) as custom_field_6_convert
FROM [yellowhead-visionbi-rivery:yellowhead_prod.affise_conversions]
This query returns the expected output.
In standard SQL:
select
custom_field_6 as custom_field_6,
cast(custom_field_6 as date) as custom_field_6_converted
FROM `yellowhead-visionbi-rivery.yellowhead_prod.affise_conversions`
The error I get: "invalid date"
Below is for BigQuery Standard SQL:
#standardSQL
SELECT
custom_field_6,
TIMESTAMP_SECONDS(CAST(custom_field_6 AS INT64)) custom_field_6_convert
FROM `yellowhead-visionbi-rivery.yellowhead_prod.affise_conversions`
You can test it using dummy data as below:
#standardSQL
WITH `project.dataset.table` AS (
SELECT '1540051185' custom_field_6 UNION ALL
SELECT '1540252572'
)
SELECT
custom_field_6,
TIMESTAMP_SECONDS(CAST(custom_field_6 AS INT64)) custom_field_6_convert
FROM `project.dataset.table`
with the result:
Row  custom_field_6  custom_field_6_convert
1    1540051185      2018-10-20 15:59:45 UTC
2    1540252572      2018-10-22 23:56:12 UTC
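If the field ever holds epoch milliseconds or microseconds instead of seconds, the analogous TIMESTAMP_MILLIS / TIMESTAMP_MICROS functions apply (a sketch with the same instant as above):
#standardSQL
SELECT TIMESTAMP_MILLIS(CAST('1540051185000' AS INT64)) custom_field_6_convert
-- 2018-10-20 15:59:45 UTC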
I am selecting a date column which is in the format "YYYY-MM-DD".
I want to cast it to a timestamp such that it will be "YYYY-MM-DD HH:MM:SS:MS"
I attempted:
select CAST(mycolumn as timestamp) from mytable;
but this resulted in the format YYYY-MM-DD HH:MM:SS
I also tried
select TO_TIMESTAMP(mycolumn, 'YYYY-MM-DD HH:MM:SS:MS') from mytable;
but this did not work either. I cannot seem to figure out the correct way to format this. Note that I only want the first digit of the milliseconds.
Second question:
I am also trying to select numeric data such that there will not be any trailing zeros.
For example, if I have values in a table such as 1, 2.00, 3.34, 4.50.
I want to be able to select those values as 1, 2, 3.34, 4.5.
I tried using ::float, but I occasionally get strange output. I also tried the rounding function, but how can I use it properly without knowing how many decimal places I need beforehand?
Thanks for your help!
It seems that the functions to_timestamp() and to_char() are unfortunately not perfect.
If you cannot find anything better, use these workarounds:
with example_data(d) as (
values ('2016-02-02')
)
select d, d::timestamp || '.0' tstamp
from example_data;
d | tstamp
------------+-----------------------
2016-02-02 | 2016-02-02 00:00:00.0
(1 row)
create function my_to_char(numeric)
returns text language sql as $$
select case
    when strpos($1::text, '.') = 0 then $1::text
    -- trim trailing zeros, then the trailing dot; rtrim($1::text, '.0') would also eat digits (e.g. '10.0' -> '1')
    else trim(trailing '.' from trim(trailing '0' from $1::text))
end
$$;
with example_data(n) as (
values (100), (2.00), (3.34), (4.50))
select n::text, my_to_char(n)
from example_data;
n | my_to_char
------+------------
100 | 100
2.00 | 2
3.34 | 3.34
4.50 | 4.5
(4 rows)
See also: How to remove the dot in to_char if the number is an integer
SELECT to_char(current_timestamp, 'YYYY-MM-DD HH:MI:SS:MS');
prints
2016-02-05 03:21:18:346
Just add ::timestamp without time zone:
select mycolumn::timestamp without time zone from mytable;
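If you are on PostgreSQL 13 or later, the FF1 template pattern may give exactly the single fractional-second digit asked for (a sketch; check your server version):
-- assumes PostgreSQL 13+ (FF1..FF6 patterns)
select to_char(mycolumn::timestamp, 'YYYY-MM-DD HH24:MI:SS.FF1') from mytable;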
I have a column called DateOfBirth in my CSV file containing Excel date serial numbers.
Example:
36464
37104
35412
When I format the cells in Excel, these are converted as follows:
36464 => 1/11/1999
37104 => 1/08/2001
35412 => 13/12/1996
I need to do this transformation in SSIS or in SQL. How can this be achieved?
In SQL:
select dateadd(d,36464,'1899-12-30')
-- or thanks to rcdmk
select CAST(36464 - 2 as SmallDateTime)
In SSIS, see here
http://msdn.microsoft.com/en-us/library/ms141719.aspx
The marked answer does not work correctly; please change the date to "1899-12-30" instead of "1899-12-31":
select dateadd(d,36464,'1899-12-30')
You can cast it to a SQL SMALLDATETIME:
CAST(36464 - 2 as SMALLDATETIME)
MS SQL Server counts its dates from 01/01/1900 and Excel from 12/30/1899 = 2 days less.
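A quick sanity check (a sketch) that the DATEADD and CAST forms agree for the serial from the question:
-- both columns should show the same date
select dateadd(d, 36464, '1899-12-30')   as via_dateadd,
       cast(36464 - 2 as smalldatetime)  as via_cast;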
tldr:
select cast(@Input - 2e as datetime)
Explanation:
Excel stores datetimes as a floating point number that represents elapsed time since the beginning of the 20th century, and SQL Server can readily cast between floats and datetimes in the same manner. The difference between Excel and SQL server's conversion of this number to datetimes is 2 days (as of 1900-03-01, that is). Using a literal of 2e for this difference informs SQL Server to implicitly convert other datatypes to floats for very input-friendly and simple queries:
select
cast('43861.875433912' - 2e as datetime) as ExcelToSql, -- even varchar works!
cast(cast('2020-01-31 21:00:37.490' as datetime) + 2e as float) as SqlToExcel
-- Results:
-- ExcelToSql SqlToExcel
-- 2020-01-31 21:00:37.490 43861.875433912
This actually worked for me:
dateadd(mi, CONVERT(numeric(17,5), 41869.166666666664) * 1440, '1899-12-30')
(minus one more day in the base date), referring to the post with negative comments further down.
SSIS Solution
The DT_DATE data type is implemented using an 8-byte floating-point number. Days are represented by whole number increments, starting with 30 December 1899, and midnight as time zero. Hour values are expressed as the absolute value of the fractional part of the number. However, a floating point value cannot represent all real values; therefore, there are limits on the range of dates that can be presented in DT_DATE. Read more
From the description above, you can see that you can convert these values implicitly by mapping them to a DT_DATE column after converting them to an 8-byte floating-point number (DT_R8).
Use a Derived Column transformation to convert this column to an 8-byte floating-point number:
(DT_R8)[dateColumn]
Then map it to a DT_DATE column
Or cast it twice:
(DT_DATE)(DT_R8)[dateColumn]
You can check my full answer here:
Is there a better way to parse [Integer].[Integer] style dates in SSIS?
Found this topic so helpful that I created a quick SQL UDF for it.
CREATE FUNCTION dbo.ConvertExcelSerialDateToSQL
(
    @serial INT
)
RETURNS DATETIME
AS
BEGIN
    DECLARE @dt AS DATETIME
    SELECT @dt =
        CASE
            WHEN @serial IS NOT NULL THEN CAST(@serial - 2 AS DATETIME)
            ELSE NULL
        END
    RETURN @dt
END
GO
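Usage, for example (a sketch with a serial from the question):
SELECT dbo.ConvertExcelSerialDateToSQL(36464) AS DateOfBirth;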
I had to take this to the next level because my Excel dates also had times, so I had values like this:
42039.46406 --> 02/04/2015 11:08 AM
42002.37709 --> 12/29/2014 09:03 AM
42032.61869 --> 01/28/2015 02:50 PM
(also, to complicate it a little more, my numeric value with decimal was saved as an NVARCHAR)
The SQL I used to make this conversion is:
SELECT DATEADD(SECOND, (
CONVERT(FLOAT, t.ColumnName) -
FLOOR(CONVERT(FLOAT, t.ColumnName))
) * 86400,
DATEADD(DAY, CONVERT(FLOAT, t.ColumnName), '1899-12-30')
)
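For instance, plugging the first sample value in as a literal (a sketch; the NVARCHAR literal stands in for t.ColumnName):
SELECT DATEADD(SECOND,
       (CONVERT(FLOAT, N'42039.46406') - FLOOR(CONVERT(FLOAT, N'42039.46406'))) * 86400,
       DATEADD(DAY, FLOOR(CONVERT(FLOAT, N'42039.46406')), '1899-12-30'))  -- 2015-02-04 11:08, matching the first sample above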
In PostgreSQL, you can use the following syntax:
SELECT ((DATE('1899-12-30') + INTERVAL '1 day' * FLOOR(38242.7711805556)) + (INTERVAL '1 sec' * (38242.7711805556 - FLOOR(38242.7711805556)) * 3600 * 24)) as date
In this case, 38242.7711805556 represents 2004-09-12 18:30:30 in excel format
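A shorter equivalent (a sketch) relies on PostgreSQL's numeric-times-interval arithmetic to handle the fractional day in one step:
-- multiplying the interval by the whole serial covers both the date and the time of day
SELECT timestamp '1899-12-30' + 38242.7711805556 * interval '1 day' as date;
-- approximately 2004-09-12 18:30:30 (tiny floating-point residue is possible)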
In addition to @Nick.McDermaid's answer, I would like to post this solution, which converts not only the day but also the hours, minutes, and seconds:
SELECT DATEADD(s, (42948.123 - FLOOR(42948.123))*3600*24, dateadd(d, FLOOR(42948.123),'1899-12-30'))
For example
42948.123 to 2017-08-01 02:57:07.000
42818.7166666667 to 2017-03-24 17:12:00.000
You can do this if you just need to display the date in a view.
CAST will be faster than CONVERT if you have a large amount of data; also remember to subtract 2 from the Excel date:
CAST(CAST(CAST([Column_With_Date]-2 AS INT)AS smalldatetime) AS DATE)
If you need to update the column to show a date, you can either update through a join (a self join if necessary) or simply try the following.
You may not need to cast the Excel date as INT, but since the table I was working with was a varchar, I had to do that manipulation first. I also did not want the "time" element, so I removed it with the final cast to "date".
UPDATE [Table_with_Date]
SET [Column_With_Excel_Date] = CAST(CAST(CAST([Column_With_Excel_Date]-2 AS INT)AS smalldatetime) AS DATE)
If you are unsure of what you would like to do with this, test and re-test! Make a copy of your table if you need to. You can always create a view!
Google BigQuery solution
Standard SQL
Select Date, DATETIME_ADD(DATETIME(xy, xm, xd, 0, 0, 0), INTERVAL xonlyseconds SECOND) xaxsa
from (
  Select Date,
         EXTRACT(YEAR FROM xonlydate) xy,
         EXTRACT(MONTH FROM xonlydate) xm,
         EXTRACT(DAY FROM xonlydate) xd,
         xonlyseconds
  From (
    Select Date
         , DATE_ADD(DATE '1899-12-30', INTERVAL cast(FLOOR(cast(Date as FLOAT64)) as INT64) DAY) xonlydate
         , cast(FLOOR((cast(Date as FLOAT64) - cast(FLOOR(cast(Date as FLOAT64)) as INT64)) * 86400) as INT64) xonlyseconds
    FROM (Select '43168.682974537034' Date)  -- 09.03.2018 16:23:28
  ) xx1
)
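A possibly simpler equivalent (a sketch, using the same '1899-12-30' base) adds the whole serial, scaled to seconds, in one step:
SELECT DATETIME_ADD(DATETIME '1899-12-30 00:00:00',
         INTERVAL CAST(FLOOR(CAST('43168.682974537034' AS FLOAT64) * 86400) AS INT64) SECOND) AS converted
-- 2018-03-09 16:23:28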
For those looking to do this in Excel itself (rather than formatting the cell as a date field), you can use the TEXT function: https://exceljet.net/excel-functions/excel-text-function
e.g. with A1 = 132134,
=TEXT(A1,"MM-DD-YYYY") will result in a date.
This worked for me because the field was sometimes numeric and I needed to get the time portion.
Command:
dateadd(mi,CONVERT(numeric(17,5),41869.166666666664)*1440,'1899-12-31')