Moving data from table to table, column type error - SQL

I'm using PostgreSQL with the TimescaleDB extension, PG 13 / TSDB 2.2.1.
Table 1
create table user
(
...
join_dt timestamptz NULL,
...
);
Table 2
create table a_user
(
...
join_dt timestamptz NULL,
...
);
I populated the a_user table with SQL INSERT statements, so there is data in that table, and I want to move the data into the user table with this query:
insert into user
select * from a_user;
I get this error:
[42804] ERROR: column "join_dt" is of type timestamp with time zone but expression is of type character varying
The original data comes from a table that looks like this:
create table ori_user
(
...
join_dt timestamp(6) with time zone,
...
);
I exported the data from ori_user as SQL INSERT statements, inserted it into a_user, and now I want to move it from a_user to user.
This is what I have tried:
insert into user select
...
join_dt::timestamptz,
...
from a_user;
does not work.
with aa as (select pg_typeof(join_dt)::varchar as type,* from a_user)
select * from aa where aa.type not like 'timestamp with time zone';
returns no rows.
Any other solutions? Please help.
Thanks in advance.

I thought the columns of both tables were in the same order, but they weren't.
After fixing the order, the problem was solved.
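The root cause (mismatched column order combined with `select *`) can be avoided entirely by naming the columns on both sides of the INSERT. A minimal sketch, assuming hypothetical columns `id` and `name` alongside `join_dt` (`user` is quoted because it is a reserved word in PostgreSQL):

```sql
-- Explicit column lists map each selected column to its target by name,
-- so a difference in table column order can no longer bind join_dt
-- to a varchar column.
insert into "user" (id, name, join_dt)
select id, name, join_dt
from a_user;
```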

Related

Recreate table from a select and add an extra datetime default column (Snowflake)

I'm having problems creating a table that should be pretty straightforward. The SQL code (Snowflake) is:
create or replace table bank_raw as
select
*,
created_at datetime default current_timestamp()
from bank_raw;
My error is: Syntax error: unexpected 'DEFAULT'. (line 12).
I don't know how I can recreate this table and add this default timestamp column. By the way, I have already created multiple tables from scratch with created_at DateTime default current_timestamp().
Any ideas?
It is possible to define the column list when using CTAS:
Sample data:
CREATE TABLE bank_raw(id INT, col TEXT);
INSERT INTO bank_raw(id, col) VALUES (1, 'a'), (2,'b');
Query:
CREATE OR REPLACE TABLE bank_raw(id INT,
col TEXT,
created_at datetime default current_timestamp())
AS
SELECT
id, col, CURRENT_TIMESTAMP()
FROM bank_raw;
Output:
SELECT * FROM bank_raw;
DESCRIBE TABLE bank_raw;
Since the SELECT list is a DML construct, not a DDL one, the DEFAULT keyword does not apply there. You can simply remove it and instead project the column and name it:
create or replace table bank_raw as
select
*,
current_timestamp() as created_at
from bank_raw;
Edit: to enforce a default, note that in Snowflake you cannot ALTER a table to add a column with a default value (except for sequence defaults). So you'd need to do something like this:
select get_ddl('table','BANK_RAW');
-- Copy and paste the DDL. Rename the new table,
-- and add the default timestamp:
create or replace table A
(
-- Existing columns here then:
created_at timestamp default current_timestamp
);
You can then do an INSERT from a SELECT on the table BANK_RAW. You'll need to specify a column list and omit the CREATED_AT column.
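Following the sample data above, that insert might look like this; the column names `id` and `col` come from the earlier sample table, and `A` is the renamed table created from the pasted DDL:

```sql
-- List only the existing columns; CREATED_AT is omitted so its
-- DEFAULT current_timestamp fires for every inserted row.
insert into A (id, col)
select id, col
from bank_raw;
```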

Partition key failing error for postgres table

ERROR: no partition of relation "test_table" found for row
DETAIL: Partition key of the failing row contains (start_time) = (2021-04-25 00:00:00).
SQL state: 23514
I am inserting data where the start_time column contains (2021-04-25 00:00:00).
This is my Schema
CREATE TABLE test_table (
start_time timestamp NULL
)
PARTITION BY RANGE (start_time);
This sounds as if you have no partitions defined for this table.
You might need something like this:
CREATE TABLE test_table_2021 PARTITION OF test_table
FOR VALUES FROM ('2021-01-01') TO ('2022-01-01');
After you define this partition for your partitioned table, you should be able to insert the data (as long as start_time falls anywhere in 2021).
See the docs: https://www.postgresql.org/docs/current/ddl-partitioning.html
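If rows can legitimately fall outside every defined range, PostgreSQL 11+ also allows a catch-all DEFAULT partition, which makes this error impossible at the cost of an unbounded bucket. A sketch:

```sql
-- Rows whose start_time matches no range partition land here
-- instead of raising "no partition of relation ... found for row".
CREATE TABLE test_table_default PARTITION OF test_table DEFAULT;
```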

Azure Data Factory v2 Not Null Columns in sink

I'm trying out Azure Data Factory v2 and I want to pipe data from an SQL source to an Oracle sink.
My problem is that I have several NOT NULL columns in my Oracle tables which record, for example, the date and time at which a dataset was loaded into Oracle. These columns don't exist in the SQL tables, so when I start the pipeline I get the error that these columns can't be null in the Oracle sink.
My question is now: is it possible to artificially add these columns during the pipeline run so that they get filled by the Data Factory?
Can I use a stored procedure or a custom activity for that?
Or do I have to create a Powershell script which "hardcodes" the values I want to add to the source?
You can accomplish this in ADFv2 using a query against your source dataset in the Copy activity to insert values.
Using the table ex_employee, with the following configuration in each database:
Source table (SQL):
ID int not null,
Name nvarchar(25) not null
Sink table (Oracle):
ID number(p,0) not null,
Name nvarchar2(25) not null,
CreatedDate timestamp not null
In the Source configuration on your Copy activity in ADF, you would select the Query option under Use Query, and input a query, such as:
SELECT ID, Name, CURRENT_TIMESTAMP AS CreatedDate FROM ex_employee
This will take the existing values from your SQL table, and insert a default value into the result set, which can then be inserted into your Oracle sink.
Does this column have a default value? Can you add a default to this column and then try? I'm not familiar with piping data into Oracle, but the example below shows a similar approach: adding a default value to a NOT NULL column.
drop table ex_employee
/
create table ex_employee (
  id number(1) null,
  name varchar2(100) default 'A' not null
)
/
insert into ex_employee (id)
select 1 from dual
/
commit
/
select * from ex_employee where id = 1

ORDER BY on two datetime fields: if one field is NULL, how to stop Sybase from populating it with a default datetime

I am using a SELECT query to insert data into a temp table. In the SELECT query I order by two columns, something like this:
insert into #temp
Select accnt_no,acct_name, start_date,end_date From table
Order by start_date DESC,end_date DESC
Select * from #temp
When there is an entry in the start_date field but a NULL entry in the end_date field, Sybase fills it with a default date (Jan 1 1900) during the ORDER BY operation. I don't want that to happen: if the end_date field is NULL, the data should be written as NULL. Any suggestions on how to keep it NULL while fetching the data from the table?
The 1/1/1900 usually comes from trying to cast an empty string into a datetime.
Is your 'date' source column an actual datetime datatype or a string-ish varchar or char?
Sounds like the table definition requires that end_date not be null, and has default values inserted automatically to prevent them. Are you sure there are even nulls when you do a straight select on the table without the confusion of ordering and inserting?
I created a temp table with a nullable datetime column that has a default value.
Defaults are not there to handle nulls per se; they are there to handle values that were not supplied on insert. If I run an insert without a column list (just as you have done), the default value does not apply and a NULL is still inserted.
I suggest adding the column list to your insert statement. This might prevent the problem (or expose a different problem in having them in the wrong order.)
insert into #temp (accnt_no, acct_name, start_date, end_date)
select accnt_no,acct_name, start_date,end_date from ...
Here's a query that should help you find the actual defaults on any of the columns if you don't have access to the create script:
select c.name, object_name(cdefault), text
from tempdb..syscolumns c, tempdb..syscomments cm
where c.id = object_id('tempdb..#temp') and cm.id = c.cdefault

Compare columns with time datatype in SQL

I have a table with a column of TIME datatype. I want to compare times in SQL. The main problem is when I compare any time with '00:00:00'/'00:10:00'.
For example: select timings from train where trn_time > '19:00:00'.
In the output I also want '00:00:00', which I am not getting.
And I don't want to use the DATETIME datatype.
Please help.
I created a table in MySQL named table11:
CREATE TABLE `table11` (
`id` INT(10) NULL DEFAULT NULL,
`time_col` TIME NULL DEFAULT NULL
)
and inserted some rows. When I ran the following query
select * from table11 tbl where tbl.time_col < '19:00:00'
I got the rows whose time_col is below '19:00:00'.
Only if you use tbl.time_col < '19:00:00' will you get '00:00:00' in the output, since '00:00:00' sorts before '19:00:00'.
I think in your case it's better to use the DATETIME datatype.
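If the intent is to treat times shortly after midnight as part of the evening window while staying on the TIME type, one workaround is to combine two ranges explicitly; the '04:00:00' cutoff below is an assumption for illustration:

```sql
-- TIME has no date component, so "after 19:00" must include the
-- wrap-around past midnight as a second, explicit condition.
select timings from train
where trn_time > '19:00:00'
   or trn_time < '04:00:00';
```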