BigQuery datetime in PST - google-bigquery

I have an issue in BigQuery. I have a raw table which holds created_date (DATE type) and created_time (DATETIME type). From the raw table I have created an aggregate table with a column created_date_pst (TIMESTAMP type), produced by the conversion TIMESTAMP(created_date, "America/Los_Angeles") AS created_date_pst.
The problem is that whenever data is inserted from the raw table into the aggregate table, I don't get the correct values or summaries of the values (for example SUM(calls)). I want to load the data into the aggregate table correctly, based on the PST time zone. Please advise. Thank you.
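As a sanity check on what that conversion means, BigQuery's TIMESTAMP(value, tz) interprets a civil date/datetime as wall-clock time in the given zone and yields the corresponding absolute instant (stored as UTC). A minimal Python sketch of the same semantics, using illustrative values rather than the question's actual data:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# A civil (wall-clock) datetime, analogous to BigQuery's DATETIME type.
created_time = datetime(2023, 6, 1, 9, 30, 0)

# TIMESTAMP(created_time, "America/Los_Angeles") treats that wall-clock
# value as Pacific time...
as_pacific = created_time.replace(tzinfo=ZoneInfo("America/Los_Angeles"))

# ...and the resulting instant, expressed in UTC (how BigQuery stores it):
as_utc = as_pacific.astimezone(ZoneInfo("UTC"))
print(as_utc.isoformat())  # 2023-06-01T16:30:00+00:00 (PDT is UTC-7)
```

Note that passing a bare DATE into such a conversion pins the time-of-day to midnight, which matters when rows are later grouped or summed by day.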

Related

Error inserting data into Hive partitioned table

I'm trying to insert data into a partitioned Hive table. The partition value is yesterday's date in yyyyMMdd format, and I want to generate it dynamically with a query. The date expression works fine in my other SELECT statement; however, when inserting, it throws an error like this:
[screenshot of the error]
Could you guys help me? Thank you and have a nice day.
You can tweak your SQL to do it, or create a view and load the data from that. Make sure the date column is the last column in the SELECT and that the table is partitioned by that column.
INSERT OVERWRITE TABLE dwh_vts.staging_f_vts_sale_revenue PARTITION(`date`)
SELECT 'N350','10','4500000.000000',DATE_FORMAT(date_sub(CURRENT_DATE,1),'yyyyMMdd')
UNION ALL
SELECT 'T280','21','3760000.000000',DATE_FORMAT(date_sub(CURRENT_DATE,1),'yyyyMMdd')
Or you can put the SQL above into a view and then INSERT OVERWRITE from the view.
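For reference, the partition-key expression DATE_FORMAT(date_sub(CURRENT_DATE,1),'yyyyMMdd') is just "yesterday as an 8-digit string"; a minimal Python equivalent:

```python
from datetime import date, timedelta

# Equivalent of Hive's DATE_FORMAT(date_sub(CURRENT_DATE, 1), 'yyyyMMdd'):
# yesterday's date rendered as an 8-digit partition key.
partition_key = (date.today() - timedelta(days=1)).strftime("%Y%m%d")
print(partition_key)  # e.g. "20240314"
```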

An issue related to changing varchar to datetime in SQL

I have a question related to data type. Now I'm trying to import a flat file into my destination database.
I import the flat file into one table A inside my staging database. All the columns in table A are in varchar(50) data type.
I wrote a SQL query to change the data type and clean the data in table A, and finally insert the clean data into table B inside the destination database.
Here is the question: one column in this file contains date data. It is varchar(50) in table A, but it also contains empty rows. In table A this looks fine: some rows hold a date and some are empty. However, after I run the SQL query, all the rows that were empty in table A show up in table B as 1900-01-01 00:00:00.000. Note that I set this column in table B to the datetime data type.
Now I want the rows with a date to keep their date in the destination database, and the empty rows to stay empty, not 1900-01-01 00:00:00.000. How can I modify my SQL code to achieve this?
Presumably, you can do something like this:
nullif(<expression to convert col to date/time>, '1900-01-01 00:00:00.000')
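NULLIF(x, y) returns NULL when its two arguments are equal, and x otherwise, so it maps the sentinel value back to NULL. A small sketch of that behavior, run through SQLite from Python (the sentinel is the one from the question; the second value is an arbitrary illustrative date):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# NULLIF(x, y) yields NULL when x = y, otherwise x -- so the sentinel
# '1900-01-01 00:00:00.000' becomes NULL while real dates pass through.
row = conn.execute(
    "SELECT NULLIF('1900-01-01 00:00:00.000', '1900-01-01 00:00:00.000'),"
    "       NULLIF('2023-05-01 00:00:00.000', '1900-01-01 00:00:00.000')"
).fetchone()
print(row)  # (None, '2023-05-01 00:00:00.000')
```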

SQL: Getting a JSON data value from a row into another column

I have a couple of columns in my table, and one of them is a CLOB holding a JSON object.
I am working on a data extraction mechanism for the table, and I was wondering if it is possible to create a new view with a new column containing a certain value from that JSON (for example, one column has rows with data like ...,"request":{"status":"open",.....}, and I want a new column STATUS).
Do you have any ideas how I could achieve this?
You can use JSON_VALUE.
SELECT
JSON_VALUE(jsonInfo,'$.request.status') status
FROM
( VALUES('{"request":{"status":"open"}}') ) J(jsonInfo)
Result:
status
------------
open
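The path expression '$.request.status' just walks the nested object. The same extraction can be mimicked outside the database; a small Python sketch using the sample JSON from the answer:

```python
import json

# Mirrors JSON_VALUE(jsonInfo, '$.request.status') on one row of JSON text.
json_info = '{"request":{"status":"open"}}'
status = json.loads(json_info)["request"]["status"]
print(status)  # open
```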

SQL : datatype as trunc(sysdate)

I was trying to create a table with a column's data type as trunc(sysdate).
Is that possible?
When I tried it, I got the error below:
SQL Error: ORA-00902: invalid datatype
I am trying this because I want to make sure data inserted into that column doesn't have timestamp.
Just create a trigger:
CREATE TRIGGER schema.trigger_name
BEFORE INSERT OR UPDATE
ON schema.table_name
FOR EACH ROW
BEGIN
  :new.column_name := TRUNC(:new.column_name);
END;
No, that is not possible.
TRUNC() is a function that truncates a date to a specific unit of measure.
The DATE datatype stores point-in-time values (dates and times) in a
table. The DATE datatype stores the year (including the century), the
month, the day, the hours, the minutes, and the seconds (after
midnight).
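For comparison, TRUNC(date_value) with no format unit keeps the date and zeroes the time-of-day; a small Python sketch of the same idea, with an illustrative timestamp:

```python
from datetime import datetime

# Oracle's TRUNC(date_value) zeroes the time-of-day; the same idea here:
ts = datetime(2023, 6, 1, 14, 35, 27)
truncated = ts.replace(hour=0, minute=0, second=0, microsecond=0)
print(truncated)  # 2023-06-01 00:00:00
```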

Copy Contents of One Column of a Table to Another Column of a Different Table in SQL

I want to copy the content of one column in table A and replace the contents (not insert into it - the number of rows will be the same) of another column in another table.
I can't use a WHERE condition; the table has only just been created at this point, with one empty timestamp column. It will be populated via a pyodbc class after the timestamps have been added; this query will fill the timestamps for me.
What is the SQL command for this?
Thanks!
After discussion, this is the query needed: INSERT INTO OCAT_test_table (DateTimeStamp) SELECT DateTimeStamp FROM DunbarGen
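A minimal sketch of that INSERT ... SELECT against an in-memory SQLite database (table and column names taken from the question; the sample timestamps are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE DunbarGen (DateTimeStamp TEXT)")
conn.execute("CREATE TABLE OCAT_test_table (DateTimeStamp TEXT)")
conn.executemany("INSERT INTO DunbarGen VALUES (?)",
                 [("2023-01-01 00:00:00",), ("2023-01-02 00:00:00",)])

# The accepted query: copy every value of the column into the other table.
conn.execute(
    "INSERT INTO OCAT_test_table (DateTimeStamp) "
    "SELECT DateTimeStamp FROM DunbarGen"
)
rows = conn.execute(
    "SELECT DateTimeStamp FROM OCAT_test_table ORDER BY DateTimeStamp"
).fetchall()
print(rows)  # [('2023-01-01 00:00:00',), ('2023-01-02 00:00:00',)]
```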