PostgreSQL: how to efficiently alter multiple columns from psql? - sql

I have a PostgreSQL table with several boolean columns, currently containing only true or null. I want to do the following for all of them:
Add a default value of false
Change all null values to false
Add a not null constraint
i.e.:
-- for each column specified:
update my_table set my_column = 'f' where my_column is null;
alter table my_table alter column my_column set default 'f';
alter table my_table alter column my_column set not null;
Is there a feature of psql (or standard SQL) that will iterate over a specified list of columns and apply a sequence of operations to each one?

You cannot iterate over all columns directly, but to be safe you probably don't want to do that anyway; it is better to specify yourself which ones to alter. Another way would be to write a script that queries for the column names and then alters them.
To alter them you use ALTER TABLE. See the PgSQL doc:
http://www.postgresql.org/docs/8.4/static/sql-altertable.html
ALTER TABLE xy ALTER COLUMN a SET DEFAULT FALSE, ALTER COLUMN b SET NOT NULL
etc
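For the scenario in the question, with the columns listed explicitly (a and b stand in for your actual column names, my_table for your table), the three steps might look like the sketch below; the two ALTER COLUMN actions can share one ALTER TABLE, while the backfilling UPDATE has to be a separate statement:
-- set the defaults for several columns in one statement
ALTER TABLE my_table
    ALTER COLUMN a SET DEFAULT false,
    ALTER COLUMN b SET DEFAULT false;
-- backfill the existing NULLs
UPDATE my_table SET a = false WHERE a IS NULL;
UPDATE my_table SET b = false WHERE b IS NULL;
-- then forbid NULLs, again in one statement
ALTER TABLE my_table
    ALTER COLUMN a SET NOT NULL,
    ALTER COLUMN b SET NOT NULL;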

This will do it; it needs version 8.4 or higher because of the VARIADIC parameter.
CREATE OR REPLACE FUNCTION setdefaults(
    IN _tname TEXT,          -- tablename to alter
    VARIADIC _cname TEXT[]   -- all columnnames to alter
)
RETURNS boolean
LANGUAGE plpgsql
AS
$$
DECLARE
    row record;
BEGIN
    FOR row IN SELECT unnest(_cname) AS colname LOOP
        EXECUTE 'ALTER TABLE ' || quote_ident(_tname) || ' ALTER COLUMN ' || quote_ident(row.colname) || ' SET DEFAULT false;';
        EXECUTE 'UPDATE ' || quote_ident(_tname) || ' SET ' || quote_ident(row.colname) || ' = DEFAULT WHERE ' || quote_ident(row.colname) || ' IS NULL;';
        EXECUTE 'ALTER TABLE ' || quote_ident(_tname) || ' ALTER COLUMN ' || quote_ident(row.colname) || ' SET NOT NULL;';
    END LOOP;
    RETURN TRUE;
END;
$$;
SELECT setdefaults('foo', 'x','y','z'); -- alter table "foo"
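If you would rather not list the columns at all, the loop could instead pull the boolean column names from the catalog, as suggested in the first answer. A rough sketch, assuming every boolean column of public.my_table should be converted (my_table is a placeholder; DO and format() need 9.0/9.1 or later, on 8.4 build the strings with quote_ident() as above):
DO $$
DECLARE
    col text;
BEGIN
    FOR col IN
        SELECT column_name
        FROM information_schema.columns
        WHERE table_schema = 'public'
          AND table_name = 'my_table'
          AND data_type = 'boolean'
    LOOP
        EXECUTE format('ALTER TABLE public.my_table ALTER COLUMN %I SET DEFAULT false', col);
        EXECUTE format('UPDATE public.my_table SET %I = false WHERE %I IS NULL', col, col);
        EXECUTE format('ALTER TABLE public.my_table ALTER COLUMN %I SET NOT NULL', col);
    END LOOP;
END;
$$;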

Related

REPLACE() in Postgres can't replace data

I'm new to Postgres, and I'm creating a procedure to rename table constraint names using REPLACE(). If I test all the variables in this procedure the data is there, and if I do the replacement manually it works. The problem is that when this procedure runs, the constraint name doesn't change.
create or replace procedure public.rename_existing_constraint_table(in table_name text, in date_now text, in list_constraint text[])
as $$
declare
    const text;
    table_rename text;
begin
    table_rename := (select concat(table_name, '_', date_now));
    if array_length(list_constraint, 1) >= 1 then
        foreach const in array list_constraint loop
            execute 'alter table if exists ' || table_name || ' RENAME CONSTRAINT ' || const || ' to ' || replace(const, table_name, table_rename);
        end loop;
    end if;
end $$
language plpgsql;
I get a duplicate error because the constraint was not successfully renamed; it should go from
app_devlogdetail_pkey to app_devlogdetail_20221214_pkey
psycopg2.errors.DuplicateTable: relation "app_devlogdetail_pkey" already exists
CONTEXT: SQL statement "alter table if exists app_devlogdetail_20221214 RENAME CONSTRAINT app_devlogdetail_pkey to app_devlogdetail_pkey"
PL/pgSQL function rename_existing_constraint_table(text,text,text[]) line 10 at EXECUTE
I have tried running the REPLACE() outside the procedure and it runs normally and the data is changed, but when it is run inside the procedure the REPLACE() doesn't change the data. How can I make the REPLACE() work so that the data is changed, or is there some other way around it?
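(Not an answer, just a debugging aid: one way to narrow this down is to check what replace() returns for the exact values the procedure works with. The literals below are only taken from the CONTEXT line of the error above and are purely illustrative; substitute whatever your procedure really receives.)
DO $$
DECLARE
    table_name   text := 'app_devlogdetail_20221214';  -- value visible in the error's ALTER TABLE
    date_now     text := '20221214';
    const        text := 'app_devlogdetail_pkey';
    table_rename text;
BEGIN
    table_rename := concat(table_name, '_', date_now);
    RAISE NOTICE 'replace() result: %', replace(const, table_name, table_rename);
END;
$$;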

How to drop all not null constraints from a DB2 table

I would like to drop all not null constraints from all columns in a table in DB2 without having to specify each column name.
Ideally this would be a function, where I could pass a table name as a parameter. Going through sysibm.syscolumns to get the not null columns is perfectly fine.
Thanks!
EDIT:
A bit of background:
DB: DB2 LUW v11.1.0.0
OS: Linux, Debian (Debian 4.9.51-1 (2017-09-28))
I am creating a table from another table and need to import data into the newly created table. Unfortunately, the data to be imported sometimes does not have all the values which are needed for the not null columns, hence I have to remove all not null constraints before loading the data.
For Oracle, I have the following:
function f_remove_mandatory(p_tbname in varchar2) return boolean is
    l_tbname    all_tab_columns.table_name%type;
    l_AltTabTxt varchar2(220);
    cursor c_AlterDlTab (p_tablename in varchar2) IS
        select column_name
          from user_tab_columns
         where table_name = p_tablename
           and nvl(nullable,'x') = 'N';
begin
    l_tbname := UPPER(p_tbname);
    for c1 in c_AlterDlTab (l_tbname) loop
        execute immediate 'ALTER TABLE ' || l_tbname || ' MODIFY ' || c1.column_name || ' NULL';
    end loop;
    return true;
exception when others then return false;
end;
And need something similar for DB2.
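For reference, a rough DB2 LUW translation of the same idea might be an anonymous compound statement like the one below. This is an untested sketch: it assumes SYSCAT.COLUMNS is available, uses MY_TABLE as a placeholder table name, and assumes your DB2 version supports ALTER COLUMN ... DROP NOT NULL; dropping NOT NULL may also leave the table in reorg-pending state, so a REORG could be needed afterwards.
-- untested sketch, run with an alternative statement terminator (e.g. @)
BEGIN
  FOR v AS
    SELECT colname
    FROM syscat.columns
    WHERE tabschema = CURRENT SCHEMA
      AND tabname   = 'MY_TABLE'
      AND nulls     = 'N'
  DO
    EXECUTE IMMEDIATE
      'ALTER TABLE MY_TABLE ALTER COLUMN "' || v.colname || '" DROP NOT NULL';
  END FOR;
END
@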

PostgreSQL: How to add a column in every table of a database?

I have a database with 169 tables
I need this column in every table:
wid integer not null primary key
I tried this (thanks to https://stackoverflow.com/users/27535/gbn for the solution):
SELECT
    'ALTER TABLE ' + T.name + ' ADD foo int NULL'
FROM
    sys.tables AS T
WHERE
    T.is_ms_shipped = 0
But it didn't work on PostgreSQL; it only works in T-SQL.
How can I add this column to every table at once?
do $$
declare
    selectrow record;
begin
    for selectrow in
        select
            'ALTER TABLE '|| T.mytable || ' ADD COLUMN foo integer NULL' as script
        from
        (
            select tablename as mytable from pg_tables where schemaname ='public' --your schema name here
        ) t
    loop
        execute selectrow.script;
    end loop;
end;
$$;
You can check whether all your tables were altered with the new column using the following select:
select
    table_name, COLUMN_NAME
from
    INFORMATION_SCHEMA.COLUMNS
where
    COLUMN_NAME = 'foo' -- column name here
Try this (change 'public' to whatever schema you're doing this in)
DO $$
DECLARE
    row record;
    cmd text;
BEGIN
    FOR row IN SELECT schemaname, tablename FROM pg_tables WHERE schemaname = 'public' LOOP
        cmd := format('ALTER TABLE %I.%I ADD COLUMN foo SERIAL PRIMARY KEY ', row.schemaname, row.tablename);
        RAISE NOTICE '%', cmd;
        -- EXECUTE cmd;
    END LOOP;
END
$$ LANGUAGE plpgsql;
If you run as is, it'll show you the commands. Uncomment the EXECUTE line to actually perform the alterations.
I'd run within a transaction so you can roll back if you're not happy with the results.
Note that the type is SERIAL: the column type will be integer, but it also creates a sequence owned by the table and defaults the column value to the next value of that sequence.
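A minimal sketch of that transactional run (the check query is the same one shown earlier in this thread; DDL is transactional in PostgreSQL, so it can be rolled back):
BEGIN;
-- run the DO block from above here, with the EXECUTE line uncommented
SELECT table_name, column_name
FROM information_schema.columns
WHERE column_name = 'foo';
ROLLBACK;  -- or COMMIT; once the result looks right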
We may need to check whether the column already exists or not.
Tested on PostgreSQL V10
do $$
declare
    selectrow record;
begin
    for selectrow in
        select 'ALTER TABLE '|| T.mytable || ' ADD COLUMN x_is_exported boolean DEFAULT FALSE' as script
        from (select tablename as mytable from pg_tables where schemaname ='public') t
    loop
        begin
            execute selectrow.script;
        EXCEPTION WHEN duplicate_column THEN CONTINUE;
        END;
    end loop;
end;
$$;
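On PostgreSQL 9.6 and later, an alternative to catching duplicate_column is to have the generated script use ADD COLUMN IF NOT EXISTS, e.g. (my_table is a placeholder for the generated table name):
ALTER TABLE my_table ADD COLUMN IF NOT EXISTS x_is_exported boolean DEFAULT FALSE;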

constraint on all numerical tables

I need to put a constraint on all numerical columns; this is what I tried:
Everything has to be positive
ALTER TABLE * ADD CONSTRAINT Checknumbers CHECK ( > 0 )
This isn't working but I can't find a solution for it.
Is there any other syntax that I can use, or do I need to do it manually for each table?
You would need to create a separate constraint for each column in each table. You could potentially write a bit of dynamic SQL for this
DECLARE
    l_sql_stmt VARCHAR2(1000);
BEGIN
    FOR x IN (SELECT *
                FROM user_tab_columns
               WHERE data_type = 'NUMBER'
                 AND table_name IN (SELECT table_name
                                      FROM user_tables
                                     WHERE dropped = 'NO'))
    LOOP
        l_sql_stmt := 'ALTER TABLE ' || x.table_name ||
                      ' ADD CONSTRAINT chk_' || x.table_name || '_' || x.column_name ||
                      ' CHECK( ' || x.column_name || ' > 0)';
        EXECUTE IMMEDIATE l_sql_stmt;
    END LOOP;
END;
For every numeric column in every table in the current schema, this will attempt to create a check constraint. The constraint name is limited to 30 characters so if the sum of the length of the table name and the column name is more than 25, this will attempt to generate an invalid identifier. You'd need to figure out an alternate way of generating the constraint name (or you could let the system generate a name). This also won't handle case-sensitive identifiers if you happen to have any of those. You'd need to double-quote the identifiers if that is an issue for you.
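For example, letting the system generate the names would just mean dropping the explicit constraint name from the dynamic statement; a sketch of that variant of the block above (same catalog query, Oracle will assign SYS_C names):
DECLARE
    l_sql_stmt VARCHAR2(1000);
BEGIN
    FOR x IN (SELECT *
                FROM user_tab_columns
               WHERE data_type = 'NUMBER'
                 AND table_name IN (SELECT table_name
                                      FROM user_tables
                                     WHERE dropped = 'NO'))
    LOOP
        -- no constraint name given: Oracle generates one, so the 30-character limit is not an issue
        l_sql_stmt := 'ALTER TABLE ' || x.table_name ||
                      ' ADD CHECK (' || x.column_name || ' > 0)';
        EXECUTE IMMEDIATE l_sql_stmt;
    END LOOP;
END;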

Is there a way to alter many tables to add default values to a common column name?

We have a set of 45 tables which carry a common column {variety}.
The need is to set all such columns with a default value {comedy}.
The ALTER TABLE (SCHEMA.TABLE_NAME) MODIFY(VARIETY DEFAULT 'COMEDY')
Will get it done, but I am wondering if there is a way to create a SQL script in Oracle 11g that will change all tables within the schema which have a common column name to the common default value.
DECLARE
BEGIN
    FOR x IN (
        SELECT DISTINCT t.table_name
          FROM user_tables t
         INNER JOIN user_tab_columns c ON c.table_name = t.table_name
         WHERE c.column_name = 'VARIETY'  -- only tables that actually have the common column
    ) LOOP
        EXECUTE IMMEDIATE 'ALTER TABLE ' || x.table_name || ' MODIFY (VARIETY DEFAULT ''COMEDY'')';
    END LOOP;
END;
The ALTER TABLE statement can also be written as follows, using the alternative quoting mechanism:
'alter table ' || x.table_name || q'[ modify (variety default 'COMEDY')]'
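In the loop above, the dynamic statement would then read, for example:
EXECUTE IMMEDIATE 'alter table ' || x.table_name || q'[ modify (variety default 'COMEDY')]';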