Convert a single row with dynamic number of columns into a single column in Oracle - sql

I have to convert a single row that is obtained via a SELECT statement into a single column containing the concatenated values of the individual columns of the result. The problem is that the columns are unknown and can vary in number.
Let's say the table looks similar to this:
Table USER
Name  Surname  Age  Logindate  City
Max   Smith    25   20.05.20   NY
I need to SELECT * FROM USER and convert the result into a single string like Max, Smith, 25, 20.05.20, NY or, with column names, Name: Max, Surname: Smith, Age: 25, Logindate: 20.05.20, City: NY, which I can afterwards insert into a column of another table. The name of the table I'm selecting from is known and hardcoded into the SELECT statement that is executed inside a stored procedure.
Since the number of columns and the column names are unknown, I cannot use the CONCAT function. I would also have been satisfied with the output format of SELECT JSON_OBJECT(*) FROM USER, but that usage of the star operator is not supported in Oracle 18c (it is in 19c).
Transforming the column values of a single row into a single string seems like a basic operation, but I wasn't able to find any simple solution.

Use the data dictionary to generate the right SQL statement and then use dynamic SQL to execute it.
--Sample tables for input and output:
create table user_table as
select 'Max' Name, 'Smith' Surname, 25 Age, date '2020-05-20' LoginDate, 'NY' City
from dual;
create table concatenated_values(value varchar2(4000));
--Procedure to read all columns from USER_TABLE and write them to CONCATENATED_VALUES.
create or replace procedure concatenate_values(p_table_name varchar2) is
    v_sql varchar2(4000);
begin
    --Generate a SQL statement to concatenate all the values.
    select
        'select ' ||
        listagg(column_name, '||'',''||') within group (order by column_id) ||
        ' from ' || owner || '.' || table_name
    into v_sql
    from all_tab_columns
    where owner = user
        and table_name = p_table_name
    group by owner, table_name;
    --Run the SQL statement and insert the value.
    execute immediate 'insert into concatenated_values ' || v_sql;
end;
/
--Call the procedure.
begin
concatenate_values('USER_TABLE');
end;
/
--Results:
select * from concatenated_values;
VALUE
-------------------------
Max,Smith,25,20-MAY-20,NY
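If the second format from the question is wanted (column names included, e.g. Name: Max, Surname: Smith, ...), only the generated SELECT has to change. A hedged sketch of that variant of the query inside the procedure, leaving everything else as-is:
--Generate a SQL statement that prefixes each value with its column name.
select
    'select ' ||
    listagg('''' || column_name || ': '' || ' || column_name, '||'', ''||')
        within group (order by column_id) ||
    ' from ' || owner || '.' || table_name
into v_sql
from all_tab_columns
where owner = user
    and table_name = p_table_name
group by owner, table_name;
For USER_TABLE this should generate a statement along the lines of select 'NAME: ' || NAME || ', ' || 'SURNAME: ' || SURNAME || ... and produce NAME: Max, SURNAME: Smith, AGE: 25, LOGINDATE: 20-MAY-20, CITY: NY.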


Oracle SQL/PLSQL: change type of specific columns in one time

Assume the following table named t1:
create table t1(
    clid number,
    A number,
    B number,
    C number
);
insert into t1 values(1, 1, 1, 1);
insert into t1 values(2, 0, 1, 0);
insert into t1 values(3, 1, 0, 1);
clid  A  B  C
   1  1  1  1
   2  0  1  0
   3  1  0  1
The type of columns A, B and C is NUMBER. What I need to do is change the types of those columns to VARCHAR, but in a quite tricky way.
In my real table I need to change the datatype of hundreds of columns, so it is not convenient to write a statement like the following hundreds of times:
ALTER TABLE table_name
MODIFY column_name datatype;
What I need to do instead is convert all columns except the CLID column to VARCHAR, the way we can do it in Python or R.
Is there any way to do so in Oracle SQL or PL/SQL?
I appreciate your help.
Here is an example of a procedure that can help.
It accepts two parameters: the name of your table and a list of the columns you do not want to change.
At the beginning there is a cursor that gets all the column names for your table except the ones you do not want to change.
Then it loops through those columns and changes them:
CREATE OR REPLACE procedure test_proc(p_tab_name in varchar2
                                    , p_col_names in varchar2)
IS
    v_string varchar2(4000);
    cursor c_tab_cols
    is
        SELECT column_name
        FROM ALL_TAB_COLS
        WHERE table_name = upper(p_tab_name)
        and column_name not in (select regexp_substr(p_col_names, '[^,]+', 1, level) from dual
                                connect by regexp_substr(p_col_names, '[^,]+', 1, level) is not null);
begin
    FOR i_record IN c_tab_cols
    loop
        v_string := 'alter table ' || p_tab_name || ' modify '
                    || i_record.column_name || ' varchar(30)';
        EXECUTE IMMEDIATE v_string;
    end loop;
end;
/
You can also extend this procedure with a parameter for the data type you want to change into... and with some more options, I am sure.
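For example, using the t1 table from the question, a call might look like this (a sketch; note that, as the next answer shows, ALTER TABLE ... MODIFY raises ORA-01439 if the columns still contain data, so this only works on empty columns):
begin
    --Convert every column of T1 except CLID.
    test_proc('t1', 'CLID');
end;
/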
Unfortunately, that isn't as simple as you'd want it to be. It is not a problem to write a query which writes a query for you (by querying USER_TAB_COLUMNS), but a column must be empty in order to change its datatype:
SQL> create table t1 (a number);
Table created.
SQL> insert into t1 values (1);
1 row created.
SQL> alter table t1 modify a varchar2(1);
alter table t1 modify a varchar2(1)
*
ERROR at line 1:
ORA-01439: column to be modified must be empty to change datatype
SQL>
If there are hundreds of columns involved, maybe you can't even
create additional columns in the same table (of VARCHAR2 datatype)
move values in there
drop "original" columns
rename "new" columns to "old names"
because there's a limit of 1000 columns per table.
Therefore,
creating a new table (with appropriate columns' datatypes),
moving data over there,
dropping the "original" table
renaming the "new" table to "old name"
is probably what you'll finally do. Note that it won't necessarily be easy either, especially if there are foreign keys involved.
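For the small t1 example above, the new-table route might look like this (a sketch only; it assumes a plain TO_CHAR conversion is acceptable, and any constraints, indexes, grants and triggers would have to be recreated separately):
--Build the replacement table with the desired datatypes.
create table t1_new as
    select clid,
           to_char(a) a,
           to_char(b) b,
           to_char(c) c
    from t1;
--Swap it in under the old name.
drop table t1;
rename t1_new to t1;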
A "query that writes query for you" might look like this (Scott's sample tables):
SQL> SELECT 'insert into dept (deptno, dname, loc) '
2 || 'select '
3 || LISTAGG ('to_char(' || column_name || ')', ', ')
4 WITHIN GROUP (ORDER BY column_name)
5 || ' from emp'
6 FROM user_tab_columns
7 WHERE table_name = 'EMP'
8 AND COLUMN_ID <= 3
9 /
insert into dept (deptno, dname, loc) select to_char(EMPNO), to_char(ENAME), to_char(JOB) from emp
SQL>
It'll save you from typing names of hundreds of columns.
I think it's not possible to change the data type of a column if values are there...
Empty the column by copying its values to a dummy column, then change the data type.
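A minimal sketch of that idea for a single column A of the t1 example above (the VARCHAR2(30) size and the A_TMP column name are assumptions):
alter table t1 add (a_tmp number);
--Park the values and empty the original column.
update t1 set a_tmp = a, a = null;
commit;
--Now the datatype can be changed, and the values copied back.
alter table t1 modify a varchar2(30);
update t1 set a = to_char(a_tmp);
commit;
alter table t1 drop column a_tmp;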

How can I loop to get counts from multiple tables?

I am trying to derive a table with counts from multiple tables. The tables are not on my schema. The table names on the schema that I am interested in all start with 'STAF_' and end with '_TS'. The criterion I am looking for is WHERE SEP = 'MO'. So, for example, the query in its base form is:
select area, count(SEP) areacount
from mous.STAF_0001_TS
where SEP = 'MO'
group by area;
I have about 1000 tables that I'd like to do this for.
Ultimately, I'd like the output to be a table on my schema that looks like the following:
area | areacount
-----+----------
0001 |         3
0002 |         7
0003 |       438
Thank you.
As a first step I'd write an SQL query that generates an SQL query:
SELECT 'SELECT area, count(SEP) AS areacount FROM ' || c.owner || '.' || c.table_name
       || ' WHERE SEP = ''MO'' GROUP BY area UNION ALL' as run_me
FROM all_tables c
WHERE c.table_name LIKE 'STAF\_%\_TS' escape '\'
Running this will produce output that is itself another SQL query. Copy the result text out of your results grid, paste it back into your query pane, delete the final UNION ALL, and run it.
Once you get the hang of writing an SQL query that generates an SQL query, you can look at turning it into a view, or at building the dynamic query in a string.
Gotta say, this is a horrible way to store data; you'd be better off using ONE table with an extra column containing whatever is in the xxx of STAF_xxx_TS right now.
In Oracle 12c, you can embed a FUNCTION that will query the number of rows in any given table. Then you can use that function in your main query. Here is an example:
WITH FUNCTION cnt ( p_owner VARCHAR2, p_table_name VARCHAR2 ) RETURN NUMBER IS
    l_cnt NUMBER;
BEGIN
    EXECUTE IMMEDIATE 'SELECT count(*) FROM ' || p_owner || '.' || p_table_name INTO l_cnt;
    RETURN l_cnt;
EXCEPTION WHEN OTHERS THEN
    RETURN NULL; -- This will happen for entries in ALL_TABLES that are not directly accessible (e.g., IOT overflow tables)
END cnt;
SELECT t.owner, t.table_name, cnt(t.owner, t.table_name)
FROM all_tables t
where t.table_name like 'STAF\_%\_TS' escape '\';
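If the end result should really be a table on your own schema, as the question asks, a hedged PL/SQL sketch combining the ideas above (the STAF_AREA_COUNTS table name and the MOUS owner are assumptions taken from the question):
create table staf_area_counts (area varchar2(30), areacount number);

begin
    for t in (select owner, table_name
              from all_tables
              where owner = 'MOUS'
              and table_name like 'STAF\_%\_TS' escape '\')
    loop
        --Count the 'MO' rows per area in each table and store them.
        execute immediate
            'insert into staf_area_counts (area, areacount) ' ||
            'select area, count(SEP) from ' || t.owner || '.' || t.table_name ||
            ' where SEP = ''MO'' group by area';
    end loop;
    commit;
end;
/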

Oracle Object : How to show all the fields in select query?

I have the following Oracle Object:
CREATE TYPE person_typ AS OBJECT (
Id NUMBER,
first_name VARCHAR2(20),
last_name VARCHAR2(25));
And a table:
CREATE TABLE MyTable (
contact person_typ,
contact_date DATE );
I would like to make a select query that shows all the fields of the Person_Typ object without specifying their names.
When I do
select * from MyTable
The contact column is shown as [unsupported data type]; instead I have to use:
select T.contact.ID, T.contact.First_name, T.contact.last_name from MyTable T
Is there another way to show the values of the object without specifying the column names?
Thanks,
Cheers,
I don't use SQL Developer, but according to the article Showing TYPE’d Column Values in SQL Developer you could use the option:
Preferences / Database / Advanced / Display Struct Value in Grid
Also, you can query user_type_attrs (or all_type_attrs) to obtain the attribute names. Then copy/paste the SELECT part from the output and run it, or create a view as proposed by #sep. Here is my test data and code block:
insert into mytable values (person_typ(1, 'Paulina', 'Thomson'), date '2017-12-17');
insert into mytable values (person_typ(7, 'Keanu', 'Stevens'), date '2017-12-28');
declare
    v_sql varchar2(32000);
begin
    select listagg('T.CONTACT.' || attr_name || ' ' || attr_name, ', ')
           within group (order by attr_no)
    into v_sql
    from user_type_attrs
    where type_name = 'PERSON_TYP';
    v_sql := 'SELECT ' || v_sql || ' FROM MYTABLE T';
    dbms_output.put_line(v_sql);
    execute immediate 'CREATE OR REPLACE VIEW VW_CONTACTS AS ' || v_sql;
end;
select * from vw_contacts;
Result:
ID FIRST_NAME LAST_NAME
------ -------------------- -------------------------
1 Paulina Thomson
7 Keanu Stevens
I had the same problem and I created a view with all the fields. In your case:
CREATE VIEW v_myTable AS
select T.contact.ID, T.contact.First_name, T.contact.last_name
from MyTable T
and use the view instead of the table
SELECT * from v_myTable

How to find non-numeric columns containing only numeric data?

I would like to find all columns in my Oracle database schema that contain only numeric data but have a non-numeric type. (So basically, column candidates whose data types were probably chosen wrongly.)
I have a query for all varchar2-columns:
SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE
FROM user_tab_cols
WHERE DATA_TYPE = 'VARCHAR2';
Furthermore I have a query to check for any non-numeric data inside a table myTable and a column myColumn:
SELECT 1
FROM myTable
WHERE NOT REGEXP_LIKE(myColumn, '^[[:digit:]]+$');
I would like to combine both queries in such a way that the first query only returns the rows for which the second one finds nothing.
The main problem here is that the first query works on the meta layer of the data dictionary, where TABLE_NAME and COLUMN_NAME come as data, and I need that data as identifiers (and not as data) in the second query.
In pseudo-SQL I have something like this in mind:
SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE
FROM user_tab_cols
WHERE DATA_TYPE = 'VARCHAR2'
AND NOT EXISTS
(SELECT 1 from asIdentifier(TABLE_NAME)
WHERE NOT REGEXP_LIKE(asIdentifier(COLUMN_NAME), '^[[:digit:]]+$'));
Create a function like this:
create or replace function isNumeric(val in VARCHAR2) return INTEGER AS
    res NUMBER;
begin
    res := TO_NUMBER(val);
    RETURN 1;
EXCEPTION
    WHEN OTHERS THEN
        RETURN 0;
END;
Then you can use it like this:
DECLARE
    r integer;
BEGIN
    For aCol in (SELECT TABLE_NAME, COLUMN_NAME FROM user_tab_cols WHERE DATA_TYPE = 'VARCHAR2') LOOP
        -- What about CHAR and CLOB data types?
        execute immediate 'select count(*) from ' || aCol.TABLE_NAME ||
                          ' WHERE isNumeric(' || aCol.COLUMN_NAME || ') = 0' into r;
        if r = 0 then
            DBMS_OUTPUT.put_line(aCol.TABLE_NAME || ' ' || aCol.COLUMN_NAME || ' contains numeric values only');
        end if;
    end loop;
end;
Note, the performance of this PL/SQL block will be poor. Hopefully this is a one-time job only.
There are two possible approaches: dynamic SQL (DSQL) and XML.
The first one was already demonstrated in another reply, and it's faster.
XML approach just for fun
create or replace function to_number_udf(p in varchar2) return number
deterministic is
pragma udf;
begin
return p * 0;
exception when invalid_number or value_error then return 1;
end to_number_udf;
/
create table t_chk(str1, str2) as
select '1', '2' from dual union all
select '0001.1000', 'helloworld' from dual;
SQL> column owner format a20
SQL> column table_name format a20
SQL> column column_name format a20
SQL> with tabs_to_check as
2 (
3 select 'collection("oradb:/'||owner||'/'||table_name||'")/ROW/'||column_name||'/text()' x,
4 atc.*
5 from all_tab_columns atc
6 where table_name = 'T_CHK'
7 and data_type = 'VARCHAR2'
8 and owner = user
9 )
10 select --+ no_query_transformation
11 owner, table_name, column_name
12 from tabs_to_check ttc, xmltable(x columns "." varchar2(4000)) x
13 group by owner, table_name, column_name
14 having max(to_number_udf(".")) = 0;
OWNER TABLE_NAME COLUMN_NAME
-------------------- -------------------- --------------------
TEST T_CHK STR1
PS. On Oracle 12.2 you can use to_number(... default ... on conversion error) instead of UDF.
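For example, on 12.2+ something like this returns NULL instead of raising ORA-01722:
select to_number('helloworld' default null on conversion error) as n
from dual;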
The faster way to check whether a string is all digits or contains at least one non-digit character is to use the translate function. Alas, due to the non-SQL-standard way Oracle handles empty strings, the form of the function we must use is a little complicated:
translate(input_string, 'z0123456789', 'z')
(z can be any non-digit character; we need it so that the third argument is not null.) This works by translating z to itself and the digits 0 through 9 to nothing. So if the input string was null or all digits, and ONLY in that case, the value returned by the function is null.
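A quick illustration of the check against dual (the sample strings are arbitrary):
select translate('12345', 'z0123456789', 'z') as all_digits, -- NULL: only digits
       translate('12a45', 'z0123456789', 'z') as mixed       -- 'a': a non-digit remains
from dual;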
In addition: to make the process faster, you can test each column with an EXISTS condition. If a column is not meant to be numeric, then in most cases the EXISTS condition will become true very quickly, so you will have to inspect a very small number of values from such columns.
As I tried to make this work, I ran into numerous side issues. Presumably you want to look in all schemas (except SYS and perhaps SYSTEM), so you need to run the procedure (anonymous block) from an account with SYSDBA privileges. Then I ran into issues with non-standard table and column names (names starting with an underscore and such), which brought to mind identifiers defined in double quotes, a terrible practice.
For illustration, I will use the HR schema - on which the approach worked. You may need to tweak this further; I wasn't able to make it work by changing the line
and owner = 'HR'
to
and owner != 'SYS'
So - with this long intro - here is what I did.
First, in a "normal" user account (my own, named INTRO - I run a very small database, with only one "normal" user, plus the Oracle "standard" users like SCOTT, HR etc.) - so, in schema INTRO, I created a table to receive the owner name, table name and column name for all columns of data type VARCHAR2 and which contain only "numeric" values or null (numeric defined the way you did.) NOTE HERE: If you then want to really check for all numeric values, you will indeed need a regular expression, or something like what Wernfried has shown; I would still, otherwise, use an EXISTS condition rather than a COUNT in the anonymous procedure.
Then I created an anonymous block to find the needed columns. NOTE: You will not have a schema INTRO - so change it everywhere in my code (both in creating the table and in the anonymous block). If the procedure completes successfully, you should be able to query the table. I show that at the end too.
While logged in as SYS (or another user with SYSDBA powers):
create table intro.cols_with_numbers (
    owner_name  varchar2(128),
    table_name  varchar2(128),
    column_name varchar2(128)
);

declare
    x number;
begin
    execute immediate 'truncate table intro.cols_with_numbers';
    for t in ( select owner, table_name, column_name
               from dba_tab_columns
               where data_type like 'VARCHAR2%'
               and owner = 'HR'
             )
    loop
        execute immediate 'select case when exists (
                               select *
                               from ' || t.owner || '.' || t.table_name ||
                               ' where translate(' || t.column_name || ',
                               ''z0123456789'', ''z'') is not null
                           ) then 1 end
                           from dual'
        into x;
        if x is null then
            insert into intro.cols_with_numbers (owner_name, table_name, column_name)
            values (t.owner, t.table_name, t.column_name);
        end if;
    end loop;
end;
/
Run this procedure and then query the table:
select * from intro.cols_with_numbers;
no rows selected
(which means there were no numeric columns in tables in the HR schema, in the wrong data type VARCHAR2 - or at least, no such columns that had only non-negative integer values.) You can test further, by intentionally creating a table with such a column and testing to see it is "caught" by the procedure.
ADDED - Here is what happens when I change the owner from 'HR' to 'SCOTT':
PL/SQL procedure successfully completed.
OWNER_NAME TABLE_NAME COLUMN_NAME
-------------------- -------------------- --------------------
SCOTT BONUS JOB
SCOTT BONUS ENAME
so it seems to work fine (although on other schemas I sometimes run into an error... I'll see if I can figure out what that is).
In this case the table is empty (no rows!) - this is one example of a "false positive" you may find. (More generally, you will get a false positive if everything in a VARCHAR2 column is null - in all rows of the table.)
NOTE also that a column may have only numeric values and still the best data type would be VARCHAR2. This is the case when the values are simply identifiers and are not meant as "numbers" (which we can compare to each other or to fixed values, and/or with which we can do arithmetic). Example - a SSN (Social Security Number) or the equivalent in other countries; the SSN is each person's "official" identifier for doing business with the government. The SSN is numeric (actually, perhaps to accentuate the fact it is NOT supposed to be a "number" despite the name, it is often written with a couple of dashes...)

Oracle/SQL - Using query results as parameters in another query - looping?

Hi everyone, what I'm wondering is whether I can create a table that lists the record counts of other tables. It would get those table names from a table. So let's assume I have the table TABLE_LIST that looks like this:
name
---------
sports_products <-- contains 10 records
house_products <-- contains 8 records
beauty_products <-- contains 15 records
I would like to write a statement that pulls the names from that table, queries each of those tables to count its records, and ultimately produces this table:
name             numRecords
---------------  ----------
sports_products          10
house_products            8
beauty_products          15
So I think I would need to do something like this pseudo code:
select *
from foreach tableName in select name from table_list
select count(*) as numRecords
from tableName
loop
You can have a function that does this for you via dynamic SQL.
However, make sure to declare it as AUTHID CURRENT_USER: you do not want anyone to gain some sort of privilege elevation by exploiting your function.
create or replace function SampleFunction
(
    owner     in VarChar
    ,tableName in VarChar
) return integer authid current_user is
    result Integer;
begin
    execute immediate 'select count(*) from "' || owner || '"."' || tableName || '"'
        INTO result;
    return result;
end;
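It could then be driven directly from TABLE_LIST to produce the output the question asks for (a sketch; it assumes the tables live in the current schema and that upper-casing the stored names yields the real identifiers, since the function quotes them):
select name,
       SampleFunction(user, upper(name)) as numRecords
from table_list;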
One option is to simply keep your DB statistics updated, using the dbms_stats package or EM, and then:
select num_rows
from all_tables
where table_name in (select name from table_list);
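Refreshing the statistics first might look like this (a sketch; gathering for the current schema is an assumption, and NUM_ROWS is only as accurate as the last gathering):
begin
    dbms_stats.gather_schema_stats(ownname => user);
end;
/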
I think Robert Giesecke's solution will work fine.
A more exotic way of solving this is by using dbms_xmlgen.getxml.
See for example: Identify a table with maximum rows in Oracle
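A hedged sketch of that approach (the C alias is mine, and it assumes the names in TABLE_LIST resolve as-is from the current schema):
select name,
       to_number(extractvalue(
           xmltype(dbms_xmlgen.getxml('select count(*) c from ' || name)),
           '/ROWSET/ROW/C')) as numRecords
from table_list;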