Convert table column data type from blob to raw - sql

I have a table designed like,
create table tbl (
id number(5),
data blob
);
It's found that the column data only holds very small values, which can be stored in raw(200).
so the new table would be,
create table tbl (
id number(5),
data raw(200)
);
How can I migrate this table to the new design without losing the data in it?

This is a somewhat lengthy method, but it works if you are sure that your data column values don't exceed 200 bytes in length.
Create a table to hold the contents of tbl temporarily
create table tbl_temp as select * from tbl;
Rem -- Ensure that tbl_temp contains all the contents
select * from tbl_temp;
Rem -- Double-check by subtracting the ids (LOB columns can't be compared with MINUS)
select id from tbl minus select id from tbl_temp;
Delete the contents in tbl
delete from tbl;
commit;
Drop column data
alter table tbl drop column data;
Create a column data with raw(200) type
alter table tbl add data raw(200);
Select & insert from the temporary table created
insert into tbl select id, dbms_lob.substr(data,200,1) from tbl_temp;
commit;
We are using the substr function of the dbms_lob package, which returns raw when applied to a BLOB, so the resulting value can be inserted directly.
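You can also verify things at both ends: before the insert, confirm that no value exceeds 200 bytes, and afterwards compare the migrated values byte-for-byte against the originals. A minimal sketch using the standard dbms_lob.getlength and utl_raw.compare functions; both queries should return no rows:
Rem -- Before the insert: any BLOBs longer than 200 bytes?
select id from tbl_temp where dbms_lob.getlength(data) > 200;
Rem -- After the insert: any mismatching values?
select t.id
from tbl_temp t
join tbl n on n.id = t.id
where utl_raw.compare(dbms_lob.substr(t.data, 200, 1), n.data) <> 0;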

Related

BigQuery Drop Table Column - DDL Bug

After removing a column from a table by:
ALTER TABLE MyTable
DROP COLUMN IF EXISTS MyColumn
In the BigQuery UI I can see that the column was deleted successfully, and I can't query the specific column, but when I query the DDL I can see that the column still exists in the schema:
SELECT DDL FROM MyDataSet.INFORMATION_SCHEMA.TABLES
WHERE DDL LIKE '%MyTable%'
What am I doing wrong?
This is a nasty, undocumented side effect of BigQuery's Time Travel. Time Travel makes it unsafe to use ALTER TABLE statements in BigQuery.
Demonstration of problem:
create table apu.time_travel_problem
( id int64
, name string
);
select column_name, data_type
FROM apu.INFORMATION_SCHEMA.COLUMNS
WHERE table_name = 'time_travel_problem';
column_name   data_type
id            INT64
name          STRING
This is all normal so far, but after an ALTER TABLE everything goes odd:
alter table apu.time_travel_problem drop column name;
select column_name, data_type
FROM apu.INFORMATION_SCHEMA.COLUMNS
WHERE table_name = 'time_travel_problem';
column_name   data_type
id            INT64
name          STRING
The column we just dropped is still there!
Now try this:
alter table apu.time_travel_problem add column name string;
Column `name` was recently deleted in the table `time_travel_problem`. Deleted column name is reserved for up to the time travel duration, use a different column name instead.
Solution:
Do not use ALTER TABLE in BigQuery. Instead, DROP and re-CREATE the table using a temporary table.
This is a Jinja template which I use:
/* {{TABLE}} */
CREATE TABLE IF NOT EXISTS {{DATASET}}.{{TABLE}}_migration
OPTIONS (expiration_timestamp = timestamp_add(CURRENT_TIMESTAMP(), INTERVAL 8 HOUR))
AS SELECT * FROM {{DATASET}}.{{TABLE}};
DROP TABLE {{DATASET}}.{{TABLE}};
CREATE TABLE {{DATASET}}.{{TABLE}}
(
{{COLUMN_DDL}}
);
INSERT INTO {{DATASET}}.{{TABLE}}
(
{{COLUMN_LIST}}
)
SELECT
{{COLUMN_LIST}}
FROM {{DATASET}}.{{TABLE}}_migration;
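For instance, rendered against the apu.time_travel_problem table above (with name dropped, so only id remains in the column DDL), the template would expand to something like:
/* time_travel_problem */
CREATE TABLE IF NOT EXISTS apu.time_travel_problem_migration
OPTIONS (expiration_timestamp = timestamp_add(CURRENT_TIMESTAMP(), INTERVAL 8 HOUR))
AS SELECT * FROM apu.time_travel_problem;
DROP TABLE apu.time_travel_problem;
CREATE TABLE apu.time_travel_problem
(
id int64
);
INSERT INTO apu.time_travel_problem
(
id
)
SELECT
id
FROM apu.time_travel_problem_migration;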

String is too long and would be truncated

Query:
CREATE TABLE SRC(SRC_STRING VARCHAR(20));
CREATE OR REPLACE TABLE TGT(tgt_STRING VARCHAR(10));
INSERT INTO SRC VALUES('JKNHJYGHTFGRTYGHJ');
INSERT INTO TGT(TGT_STRING) SELECT SRC_STRING::VARCHAR(10) FROM SRC;
Error: String 'JKNHJYGHTFGRTYGHJ' is too long and would be truncated
Is there any way to enforce the length (other than with the COPY command) when inserting data from a higher-precision column into a lower-precision one?
I'd recommend using the SUBSTR() function to pick the piece of data you want. The example below takes the first 10 characters (if available; if there were only 5, it would use those 5 characters).
CREATE OR REPLACE TEMPORARY TABLE SRC(
src_string VARCHAR(20));
CREATE OR REPLACE TEMPORARY TABLE TGT(
tgt_STRING VARCHAR(10));
INSERT INTO src
VALUES('JKNHJYGHTFGRTYGHJ');
INSERT INTO tgt(tgt_string)
SELECT SUBSTR(src_string, 1, 10)
FROM SRC;
SELECT * FROM tgt; --JKNHJYGHTF
Here's the documentation on the function:
https://docs.snowflake.com/en/sql-reference/functions/substr.html
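If you would rather reject oversized values than silently truncate them, you could first check for rows that won't fit the target column (a small sketch against the same tables):
SELECT src_string
FROM src
WHERE LENGTH(src_string) > 10; -- rows that would be truncated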

PostgreSQL INSERT INTO table that does not exist

I have some temp table:
CREATE TEMP TABLE IF NOT EXISTS temp_test (
col1 INTEGER NOT NULL,
col2 CHARACTER VARYING NOT NULL,
col3 BOOLEAN);
Then I do some inserts into temp_test (that works fine).
Later, without creating a new table test, I try doing the following:
INSERT INTO test(col1,col2,col3) SELECT col1,col2,col3 FROM temp_test;
And I get the following error:
ERROR: relation "test" does not exist
I thought that if I'm using INSERT INTO, it would create the table for me. Does it not?
If it matters, I'm using PostgreSQL 9.6.16.
You are wrong. INSERT inserts into an existing table; it does not create a table.
If you want to create a table, use CREATE TABLE AS:
CREATE TABLE test AS
SELECT col1, col2, col3
FROM temp_test;
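Alternatively, if test is meant to be a permanent table you keep inserting into, create it once with the same definition as the temp table; after that your original INSERT works as written:
CREATE TABLE IF NOT EXISTS test (
col1 INTEGER NOT NULL,
col2 CHARACTER VARYING NOT NULL,
col3 BOOLEAN);

INSERT INTO test(col1,col2,col3) SELECT col1,col2,col3 FROM temp_test;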

MACRO to create a table in SQL

Hi everyone, thanks so much for taking the time to read this.
I'd like to create a macro in Teradata that will create a table from another table based on specific parameters.
My original table consists of three columns: patient_id, diagnosis_code and Date_of_birth.
......
I'd like to build a macro that would allow me to specify a diagnosis code and it would then build the table consisting of data of all patients with that diagnosis code.
My current code looks like this
Create Macro All_pats (diag char) as (
create table pats as(
select *
from original_table
where diag = :diagnosis_code;)
with data primary index (patid);
I can't seem to get this to work. Any tips?
Thanks once again
Your code has a semicolon in the wrong place, a missing closing bracket, and a parameter name that doesn't match the :diagnosis_code reference:
Create Macro All_pats (diagnosis_code char) as (
create table pats as
(
select *
from original_table
where diag = :diagnosis_code
) with data primary index (patid);
);
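You can then run the macro with EXEC. Note that a bare CHAR parameter is CHAR(1) in Teradata, so declare it with a length (e.g. CHAR(3) or VARCHAR(20)) if your codes are longer than one character:
EXEC All_pats('111');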
Edit:
Passing multiple values as a delimited list is more complicated (unless you use Dynamic SQL in a Stored Procedure):
REPLACE MACRO All_lpats (diagnosis_codes VARCHAR(1000)) AS
(
CREATE TABLE pats AS
(
SELECT *
FROM original_table AS t
JOIN TABLE (StrTok_Split_To_Table(1, :diagnosis_codes, ',')
RETURNS (outkey INTEGER,
tokennum INTEGER,
token VARCHAR(20) CHARACTER SET Unicode)
) AS dt
ON t.diag = dt.token
) WITH DATA PRIMARY INDEX (patid);
);
EXEC All_lpats('111,112,113');
As the name implies, StrTok_Split_To_Table splits a delimited string into a table. You might need to adjust the delimiter and the length of the resulting token.
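To see what the split produces before wiring it into the macro, you can run the table function on its own:
SELECT *
FROM TABLE (StrTok_Split_To_Table(1, '111,112,113', ',')
RETURNS (outkey INTEGER,
tokennum INTEGER,
token VARCHAR(20) CHARACTER SET Unicode)) AS dt;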

Convert table column data type from image to varbinary

I have a table like:
create table tbl (
id int,
data image
)
It's found that the column data only holds very small values, which can be stored in varbinary(200).
So the new table would be,
create table tbl (
id int,
data varbinary(200)
)
How can I migrate this table to the new design without losing the data in it?
Just do two separate ALTER TABLEs, since you can only convert image to varbinary(max), but you can afterwards change its length:
create table tbl (
id int,
data image
)
go
insert into tbl(id,data) values
(1,0x0101010101),
(2,0x0204081632)
go
alter table tbl alter column data varbinary(max)
go
alter table tbl alter column data varbinary(200)
go
select * from tbl
Result:
id data
----------- ---------------
1 0x0101010101
2 0x0204081632
You can use this ALTER statement to convert the existing IMAGE column to VARBINARY(MAX):
ALTER Table tbl ALTER COLUMN DATA VARBINARY(MAX)
After this conversion, you will surely get your data back.
NOTE: Don't forget to take a backup before execution.
The IMAGE datatype is deprecated and will be removed in a future version of SQL Server, so it should be converted to VARBINARY(MAX) wherever possible.
How about you create a NewTable with the varbinary column, then copy the data from the OldTable into it?
INSERT INTO [dbo].[NewTable] ([id], [data])
SELECT [id], CAST([data] AS varbinary(200)) FROM [dbo].[OldTable]
First of all, from BOL:
image: Variable-length binary data from 0 through 2^31-1
(2,147,483,647) bytes.
The image data type is essentially an alias for varbinary (2GB), so converting it to a varbinary(max) should not lead to data loss.
But to be sure (sketched in T-SQL below):
1. back up your existing data
2. add a new field (varbinary(max))
3. copy the data from the old field to the new field
4. swap the fields with sp_rename
5. test
6. after a successful test, drop the old column
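A minimal sketch of those steps in T-SQL, using placeholder column names data_new and data_old:
alter table tbl add data_new varbinary(max)
go
-- image cannot be converted implicitly, so cast explicitly
update tbl set data_new = cast(data as varbinary(max))
go
-- swap the old and new fields
exec sp_rename 'tbl.data', 'data_old', 'COLUMN'
exec sp_rename 'tbl.data_new', 'data', 'COLUMN'
go
-- test, and only after a successful test:
alter table tbl drop column data_old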