PostgreSQL: set timestamp to NULL - sql

I have searched related questions on SO, but I couldn't find the solution for:
How do I set a timestamp to NULL during an insert or an update (in PostgreSQL)?
If I pass NULL like pstat.setTimestamp(idx++, null); I get an error saying that the type I passed in is integer and that I should cast to timestamp.
The table column is defined as: ADD COLUMN admin_validation_date timestamp with time zone
Thank you for your help.

pstat.setNull(idx++, Types.TIMESTAMP);
should do it.
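For context, setNull(idx, Types.TIMESTAMP) just binds a typed SQL NULL, which is the same thing you would write by hand in plain SQL. A minimal sketch against a hypothetical table (the table name is made up; the column definition is the one from the question):

-- Hypothetical table using the column definition from the question
CREATE TABLE my_table (
    id                    serial PRIMARY KEY,
    admin_validation_date timestamp with time zone
);

-- A NULL timestamp is perfectly valid in plain SQL; the JDBC
-- setNull(idx, Types.TIMESTAMP) call binds the same typed NULL.
INSERT INTO my_table (admin_validation_date) VALUES (NULL);
UPDATE my_table SET admin_validation_date = NULL WHERE id = 1;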

Related

Incorrect time value showing in Snowflake table for datatype TIMESTAMP_LTZ

We are specifying all our datetime/timestamp datatypes as TIMESTAMP_LTZ while creating Snowflake tables.
For one of the values in the Oracle source, '0001-01-01 00:00:00.000001000', we are observing that when it gets inserted into the Snowflake table it changes to '0000-12-31 23:52:58.000001000'.
There are no issues with the other datetime/timestamp values, only this one.
There are no timezone changes or ALTER statements when the query runs.
Is this a conversion issue due to the timezone, or a bug on the Snowflake side?
This can happen depending on the timezone set for your session/user. More details on the TIMEZONE parameter can be found here:
https://docs.snowflake.com/en/sql-reference/parameters.html#timezone
Use the link below to check the timezone set for your session/user/account:
https://docs.snowflake.com/en/sql-reference/parameters.html#viewing-the-parameters-and-their-values
show parameters like '%zone%';
alter session set TIMEZONE = 'America/New_York';
CREATE or replace TABLE time_test (timeval TIMESTAMP_LTZ);
INSERT INTO time_test values('0001-01-01 00:00:00.000001000');
select * from time_test;
Output of the select query is 0000-12-31 23:56:02.000.
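To confirm it is the session timezone (and not the stored value) causing the shift, you can repeat the same test with the session timezone set to UTC; the literal is then interpreted on INSERT and rendered on SELECT with the same offset, so it round-trips unchanged:

-- With the session timezone set to UTC, interpretation and display use
-- the same offset, so the value comes back unchanged.
ALTER SESSION SET TIMEZONE = 'UTC';
CREATE OR REPLACE TABLE time_test (timeval TIMESTAMP_LTZ);
INSERT INTO time_test VALUES ('0001-01-01 00:00:00.000001000');
SELECT timeval FROM time_test;
-- Expected: 0001-01-01 00:00:00.000001000 (the exact rendering depends on
-- the TIMESTAMP_LTZ_OUTPUT_FORMAT parameter).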

Updating invalid timestamps in Google BigQuery table

My table has a few invalid timestamps that I think are too precise (maybe beyond microseconds) for BigQuery.
When I tried updating the table with the following query using Standard SQL:
UPDATE mytable
SET event_time = TIMESTAMP(DATETIME(TIMESTAMP_MILLIS(CAST(SUBSTR(SAFE_CAST(UNIX_MILLIS(event_time) AS string),1,13) AS int64))))
WHERE DATE(logtime) BETWEEN "2018-03-21" AND "2018-03-23"
AND event_time IS NOT NULL
I get the invalid timestamp error:
Cannot return an invalid timestamp value of 1521738691071000064 microseconds relative to the Unix epoch. The range of valid timestamp values is [0001-01-1 00:00:00, 9999-12-31 23:59:59.999999]; error in writing field event_time
I think the problem is the SET event_time = part, but I don't know how to get around setting the values in the event_time column without referring to it.
Anyone have any ideas on how to resolve this?
Necessity is the mother of invention. For anyone who has a similar issue, I've figured out a workaround.
Create a new table of the affected rows (include LENGTH(CAST(UNIX_MILLIS(event_time) as string)) > 13 in the WHERE clause), transforming the invalid timestamp into a valid value with TIMESTAMP(DATETIME(TIMESTAMP_MILLIS(CAST(SUBSTR(SAFE_CAST(UNIX_MILLIS(event_time) AS string),1,13) AS int64)))), as sketched below.
Delete the affected rows from the original table using the WHERE clause mentioned above.
Insert all rows from the new table back into the original table.
A lot more work, but it should do the trick.
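Roughly, the three steps could look like this in Standard SQL (the mydataset.mytable and mydataset.mytable_fixed names are illustrative, and any partition filters your table needs are omitted):

-- 1. Materialize the affected rows with event_time truncated to milliseconds
CREATE TABLE mydataset.mytable_fixed AS
SELECT * REPLACE (
  TIMESTAMP(DATETIME(TIMESTAMP_MILLIS(
    CAST(SUBSTR(SAFE_CAST(UNIX_MILLIS(event_time) AS STRING), 1, 13) AS INT64)
  ))) AS event_time
)
FROM mydataset.mytable
WHERE LENGTH(CAST(UNIX_MILLIS(event_time) AS STRING)) > 13;

-- 2. Delete the affected rows from the original table
DELETE FROM mydataset.mytable
WHERE LENGTH(CAST(UNIX_MILLIS(event_time) AS STRING)) > 13;

-- 3. Copy the repaired rows back
INSERT INTO mydataset.mytable
SELECT * FROM mydataset.mytable_fixed;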

Column of Date type and inserting value into it

Hi, I created a table in which one column is of date type and also serves as the PK. I tried to insert the value 2009-01-07 into this column and got the error below. Isn't the default date format yyyy-mm-dd? I don't understand this.
Msg 241, Level 16, State 1, Line 3
Conversion failed when converting date and/or time from character string.
This is my query:
INSERT INTO Table_Name
Values ('2009-01-07', other column values)
Your value '2009-01-07' should convert without problems.
Date literals are always a deep source of trouble... The best approach is to use one of:
Unseparated: 20090107
ODBC: {d'2009-01-07'}
ISO 8601: 2009-01-07T00:00:00
But your format is the short ISO 8601 form and should work...
Some possible reasons:
Other values in your VALUES list
a trigger
a constraint
As a_horse_with_no_name stated in a comment: without a column list, as in INSERT INTO Table(col1, col2, ...), there is a great risk of stating your values in a different order and thus pairing values with the wrong columns...
Invalid (but good-looking) dates such as 2016-06-31
Or a well-known issue with SQL Server: sometimes the order of execution is not at all what one expects, and there are several known issues with conversion errors...
What you can try (both sketched below):
Use ODBC format (which is treated as DATETIME immediately)
DECLARE a variable with this value and put it in place of the literal
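A quick sketch of those two options, using an explicit column list as suggested above (the column names are made up):

-- Option 1: ODBC date literal (treated as DATETIME immediately)
INSERT INTO Table_Name (Date_Column, Other_Column)
VALUES ({d'2009-01-07'}, 'other value');

-- Option 2: put the value into a typed variable first
DECLARE @d date = '2009-01-07';
INSERT INTO Table_Name (Date_Column, Other_Column)
VALUES (@d, 'other value');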
Thank you all for the prompt replies. I read and tried all of them and found out why.
'2009-01-07' can be inserted into a column with Date as the data type if no CONSTRAINT has an issue with it;
my problem was caused by a CHECK constraint on that column.
Originally I had defined the CONSTRAINT as
Column_Name = 'Wednesday'
After I modified it to
DATENAME(dw, [Column_Name]) = 'Wednesday'
the insert began to work.
Thanks again.
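For reference, that kind of check constraint would look roughly like this (the constraint name is made up):

-- Reject any date that does not fall on a Wednesday
-- (DATENAME(dw, ...) is language-dependent, so SET LANGUAGE matters)
ALTER TABLE Table_Name
ADD CONSTRAINT CK_Table_Name_Wednesday
    CHECK (DATENAME(dw, [Column_Name]) = 'Wednesday');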

DB2/400 - Auto generated timestamp on change (error)

I'm trying to create a table with a timestamp column that is automatically set to the current timestamp on each update of the record. I'm on DB2/400 (version V5R3) using the ODBC driver.
This is the query:
CREATE TABLE random_table_name (
ID integer not null generated always as identity,
USERS_ID varchar (30),
DETAILS varchar (1000),
TMSTML_CREATE timestamp default current timestamp ,
TMSTMP_UPDATE timestamp not null generated always for each row on update as row change timestamp,
PRIMARY KEY ( ID )
)
I get this error (translated):
ERROR [42000] [IBM][iSeries Access ODBC Driver][DB2 UDB]SQL0104 - Token EACH not valid. Valid tokens: BIT SBCS MIXED.
Without the TMSTMP_UPDATE line the query works. How can I solve this?
EDIT: OK, I understand that in my DB2 version the only way is to use triggers, but today the AS400 seems to have it in for me.
I'm trying with this:
CREATE TRIGGER random_trigger_name
AFTER UPDATE ON random_table_name
REFERENCING NEW AS NEW_ROW
FOR EACH ROW MODE DB2SQL
BEGIN ATOMIC
SET NEW_ROW.TMSTM_UPDATE = CURRENT TIMESTAMP;
END
Error (translated):
ERROR [42000] [IBM][iSeries Access ODBC Driver][DB2 UDB]SQL0312 - Variable TMSTM_UPDATE not defined or not available.
The column TMSTM_UPDATE exists and it's a normal timestamp.
EDIT 2: I've solved the trigger problem by replacing 'after' with 'before'. Now everything works as expected. Thank you all!
There is a standard way to do it in iSeries DB2. It is documented here: IBM Knowledge center - Creating a row change timestamp column
You should change your table definition to:
TMSTMP_UPDATE TIMESTAMP NOT NULL FOR EACH ROW ON UPDATE AS ROW CHANGE TIMESTAMP
I am using it in production tables on V7R2 and it works like a charm :) Hopefully it is available for V5R3.
EDIT
As Charles mentioned below, unfortunately this feature has only been available since DB2 for i V6R1.

While updating, datetime value is rounded to seconds! I want milliseconds too

While updating a datetime column in one table from another table, I noticed that the milliseconds value is not kept; instead the value is rounded, as in the example below.
Example :
Original Value: 2008-06-26 14:06:36.643
Updated Value : 2008-06-26 14:07:00
Please help me get the actual value back, including milliseconds.
In the case where you're doing a straight update of a datetime in one table with one from another table (i.e. no fiddling with the value), it sounds like the datatype of the column being updated is not the same.
For example, in the SQL Server world, it could be that you are using a SMALLDATETIME column in the table being updated but a DATETIME column in the table being copied from. SMALLDATETIME is only accurate to the minute and so would show this behaviour.
In SQL Server;
SELECT CAST('2008-06-26 14:06:36.643' AS SMALLDATETIME)
> 2008-06-26 14:07:00
So the destination table column is probably SMALLDATETIME (or you are casting to it in the query).
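If you need the milliseconds to survive, one option is to widen the destination column and re-run the update. A rough sketch (table, column and key names are made up, and assume nothing else depends on the column type):

-- SMALLDATETIME is accurate to the minute; DATETIME keeps fractional
-- seconds (to roughly 3 ms precision)
ALTER TABLE dbo.Destination_Table
ALTER COLUMN Event_Time datetime;

-- Re-run the copy; the fractional seconds are now preserved
UPDATE d
SET d.Event_Time = s.Event_Time
FROM dbo.Destination_Table AS d
JOIN dbo.Source_Table AS s ON s.Id = d.Id;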