View/edit table structure with SQL in Base - sql

I'm JPablos and I'm trying to view the structure of the "orders" table.
I'm using Base.
LibreOffice Version: 5.2.0.4; Build ID: 1:5.2.0~rc4-0ubuntu1~xenial2; CPU threads: 1; OS version: Linux 4.4
SQL statement:
select listagg(column_name || ',' || data_type || ',' ||
           case
               when data_type in ('VARCHAR2', 'NVARCHAR2', 'CHAR', 'RAW')
                   then to_char(data_length)
               when data_type = 'NUMBER' and (data_precision is not null or data_scale is not null)
                   then data_precision ||
                        case
                            when data_scale > 0 then '.' || data_scale
                        end
           end, ',') within group (order by column_id)
from all_tab_columns
where table_name = 'orders';
Then Base reports:
1: Access is denied: LISTAGG in statement [select listagg(]
Note: obviously there is the easy way in the Base UI: select "orders", right-click, Edit, and yes, that opens the structure of the "orders" table. But I want to do it with SQL.
Thanks in advance
JPablos

The SQL statement is written for the Oracle database. The LISTAGG function is not supported by HSQLDB.
If you use LibreOffice Base together with the latest HSQLDB 2.3.4 (instead of the bundled version 1.8.0), then you can use the HSQLDB function GROUP_CONCAT.
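For example, a rough equivalent on HSQLDB 2.x could query the metadata view INFORMATION_SCHEMA.SYSTEM_COLUMNS (the same view used in the follow-up below). This is only a sketch, not tested against your database; the separator and column list are arbitrary:

-- sketch for HSQLDB 2.x: GROUP_CONCAT over the built-in column catalog
SELECT GROUP_CONCAT("COLUMN_NAME" || ',' || "TYPE_NAME"
                    ORDER BY "ORDINAL_POSITION" SEPARATOR '; ')
FROM "INFORMATION_SCHEMA"."SYSTEM_COLUMNS"
WHERE "TABLE_NAME" = 'orders'   -- must match the stored case of the table name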

After all, there is an SQL statement that does the query from my question above, and it is:
SELECT * FROM "INFORMATION_SCHEMA"."SYSTEM_COLUMNS" WHERE "TABLE_NAME" = 'Students'
Where "Students" is the name of a table used for this answer.
The SQL statement reports:
[screenshot: result of the query]
Best regards
JPablos

Related

Sum all numeric columns in database and log results

I have a query which gives me all numeric columns in my Postgres database:
SELECT table_schema, table_name, column_name
FROM information_schema.columns
WHERE table_schema in (
'datawarehouse_x',
'datawarehouse_y',
'datawarehouse_z',
'datawarehouse_w'
)
and udt_name not in
('date','timestamp','bool','varchar')
and column_name not like '%_id'
This gives me what I need:

table_schema    table_name    column_name
schema_1        table_x       column_z
schema_2        table_y       column_w

I checked it and it's fine.
What I now want to do is query each of these columns with a select sum(column) and then insert the schema_name, table_name, query result and the current date into a log table on a daily basis.
Writing the results into a target table shouldn't be a big deal, but how in the world can I run queries based on the results of this query?
Thanks in advance.
EDIT: What I will write afterwards is a procedure which takes the schema/table/column as input, then queries the table and writes into the log table. I just don't know the part in between. This is roughly what I would be doing, but I don't yet know which types I should use for schema, table and column.
create or replace function sandbox.daily_routine_metrics(schema_name regnamespace, table_name regclass, column_name varchar)
returns void
language plpgsql
as $$
BEGIN
    EXECUTE
        'INSERT INTO LOGGING.DAILY_ROUTINE_SIZE
         SELECT '
        || QUOTE_LITERAL(schema_name) || ' schema_name, '
        || QUOTE_LITERAL(table_name)  || ' table_name, '
        || QUOTE_LITERAL(column_name) || ' column_name, '
        || 'current_timestamp, sum(' || QUOTE_LITERAL(column_name) || ')
         FROM ' || QUOTE_LITERAL(schema_name) || '.' || QUOTE_LITERAL(table_name);
END;
$$;
The feature you need is known as "dynamic SQL". Its implementation is RDBMS-specific; the Postgres documentation is here.
While it's possible to achieve what you want with dynamic SQL, you might find it easier to use a scripting language like Python or Ruby. Dynamic SQL is hard to code and debug: you find yourself concatenating lots of hardcoded strings with results from SQL queries, printing them to the console to see if they work, and discovering all sorts of edge cases that blow up.
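As a rough sketch of the missing piece (an assumption, not part of the original answer): in PL/pgSQL, format() with %I for identifiers and %L for literals avoids the quoting pitfalls of hand-built strings. Plain text parameters are one workable answer to the "which types" question; the function name and the logging.daily_routine_size target table come from the question above.

create or replace function sandbox.daily_routine_metrics(p_schema text, p_table text, p_column text)
returns void
language plpgsql
as $$
begin
    -- %L inserts quoted literals, %I inserts quoted identifiers,
    -- so schema/table/column names are interpolated safely
    execute format(
        'insert into logging.daily_routine_size
         select %L, %L, %L, current_timestamp, sum(%I)
         from %I.%I',
        p_schema, p_table, p_column,
        p_column, p_schema, p_table);
end;
$$;

A daily job can then loop over the rows returned by the information_schema query above and call this function once per row.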

How to get a list of triggers with lists of all tables which are used inside the triggers

How can I get a list of triggers together with lists of all tables used inside each trigger? That means both:
1) the table the trigger is created on
2) the tables used inside the trigger logic
Please provide the solution for Oracle as well as for SQL Server.
You'd use ALL_TRIGGERS and ALL_DEPENDENCIES to find all triggers with their dependent tables, i.e. all tables that the DBMS sees as dependencies. Where dynamic SQL is used, the DBMS is blind to which tables the dynamic query will touch, so you'll miss those.
select
    t.owner || '.' || t.trigger_name as trigger_name,
    t.table_owner || '.' || t.table_name as table_name,
    (
        select
            listagg(d.referenced_owner || '.' || d.referenced_name, ', ')
                within group (order by d.referenced_owner, d.referenced_name)
        from all_dependencies d
        where d.owner = t.owner
          and d.name = t.trigger_name
          and d.type = 'TRIGGER'
          and d.referenced_type = 'TABLE'
          and not (d.referenced_owner = t.table_owner and d.referenced_name = t.table_name)
    ) as other_tables
from all_triggers t;
This is for Oracle. (I don't know how the same is done in SQL Server, but I'd guess it's quite similar; SQL Server, too, has system tables/views you could get this information from.)
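For SQL Server, a hedged sketch (not from the original answer) would combine sys.triggers, which lists DML triggers with their parent table, with sys.sql_expression_dependencies, which lists the objects each trigger references. It has the same blind spot for dynamic SQL, and the dependency list includes every referenced object, not only tables.

SELECT
    tr.name                                                             AS trigger_name,
    OBJECT_SCHEMA_NAME(tr.parent_id) + '.' + OBJECT_NAME(tr.parent_id)  AS table_name,
    ISNULL(d.referenced_schema_name + '.', '') + d.referenced_entity_name AS referenced_object
FROM sys.triggers AS tr
LEFT JOIN sys.sql_expression_dependencies AS d
       ON d.referencing_id = tr.object_id
WHERE tr.parent_class = 1;   -- DML triggers defined on tables or views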
In Oracle, try the query below to get the desired output:
SELECT TRIGGER_NAME,TABLE_NAME,TRIGGER_TYPE, TRIGGERING_EVENT, LINE, TEXT
FROM USER_TRIGGERS UT, USER_SOURCE US
WHERE UT.TRIGGER_NAME=US.NAME;

In Oracle, is it possible to query columns with the same data type

I have a table which has almost a hundred fields. I want to get all the fields whose data type is DATE. Is it possible in Oracle to write a query that returns only the fields of a certain data type? Here is my pseudo-query:
Select * from mytable
where colum_datatype is date
Similarly, I want to get all fields of VARCHAR2 type. Is it possible to do that?
I could find all the date fields manually and put them in the query, but I just want to know whether there is another way to do it.
Thank you!
You can query one of the system tables/views to get the list of columns:
select column_name
from all_tab_cols
where owner = :owner and table_name = :table and data_type = 'DATE';
If you need a one-off solution, just aggregate these column names and plug them into a SQL query. You can even construct the entire SQL statement:
select 'SELECT ' || listagg(column_name, ', ') within group (order by column_id) || ' FROM ' || :table
from all_tab_cols
where owner = :owner and table_name = :table and data_type = 'DATE';
You can also put the query into a string and use dynamic SQL (execute immediate) to run the query.
Sorry, no such functionality exists in vanilla SQL. You may be able to simulate such functionality by creating a PL/SQL function that returns a cursor to a dynamically created SQL statement.
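As a minimal PL/SQL sketch of that idea (the function name cols_of_type and its parameters are invented for illustration), the statement is built from the data dictionary and returned as an open ref cursor:

create or replace function cols_of_type(
    p_owner     in varchar2,
    p_table     in varchar2,
    p_data_type in varchar2)
    return sys_refcursor
is
    l_sql varchar2(32767);
    l_cur sys_refcursor;
begin
    -- build "select col1, col2, ... from owner.table" from the data dictionary
    select 'select ' || listagg(column_name, ', ') within group (order by column_id)
           || ' from ' || p_owner || '.' || p_table
      into l_sql
      from all_tab_cols
     where owner = p_owner
       and table_name = p_table
       and data_type = p_data_type;

    -- dynamic SQL: the statement text only exists at run time
    -- (in real code, validate p_owner/p_table, e.g. with dbms_assert)
    open l_cur for l_sql;
    return l_cur;
end;
/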

Get Column Names of a Table

How can I get the column names of a specific table?
I tried this :
SELECT column_name
FROM USER_TAB_COLUMNS
WHERE table_name = 'x' ;
But it doesn't work.
Try this :
SELECT COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'TABLENAME'
Hope this helps.
EDIT :
The Oracle equivalent of information_schema.COLUMNS is USER_TAB_COLS for tables owned by the current user, and ALL_TAB_COLS or DBA_TAB_COLS for tables owned by all users.
A tablespace is not equivalent to a schema, nor do you have to provide the tablespace name.
Providing the schema/username would be of use if you want to query ALL_TAB_COLS or DBA_TAB_COLS for columns of tables owned by a specific user. In your case, I'd imagine the query would look something like:
String sqlStr =
      "SELECT column_name "
    + "FROM all_tab_cols "
    + "WHERE table_name = 'users' "
    + "AND owner = '" + _db + "' "
    + "AND column_name NOT IN ('password', 'version', 'id')";
Note that with this approach, you risk SQL injection.
Source : Oracle query to fetch column names
String comparisons in Oracle are case-sensitive by default, and table and column names are stored in upper case by default, so you need to make sure that the capitalization of the table name you are searching for matches the way it's stored in the database. Unless your table was created with a mixed-case or lower-case quoted name, make your string all upper case:
SELECT column_name from USER_TAB_COLUMNS
WHERE table_name = 'X'; -- not 'x'
Additionally, if you don't own the table, then you need to use either ALL_TAB_COLUMNS or DBA_TAB_COLUMNS instead of USER_TAB_COLUMNS since USER_TAB_COLUMNS only lists details for tables owned by your current schema.
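For example (the owner name below is just a placeholder):

SELECT column_name
FROM ALL_TAB_COLUMNS
WHERE owner = 'OTHER_SCHEMA'   -- placeholder: the schema that owns the table
  AND table_name = 'X';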
You can just describe the table:
desc x;
Try querying the USER_TAB_COLUMNS view:
SELECT column_name
FROM USER_TAB_COLUMNS
WHERE table_name = 'TABLE_NAME'

How to extract table definitions using SQL or Toad

Can somebody tell me how to extract my table definitions using SQL? I want to extract the data types of all my tables and other information from my Oracle schema. I have about 100 tables.
I need the complete documentation of my Oracle schema. My schema name is "cco".
Can I do this with SQL?
I am using Toad for Data Analyst 3.3. Please let me know if this tool helps.
You can try this -
select * from all_tab_cols
where owner = 'CCO';
To get the DDL for all tables of the current user, you can use this:
select dbms_metadata.get_ddl('TABLE', table_name)
from user_tables;
You will need to adjust your SQL client to be able to properly display the content of a CLOB column.
More details (e.g. about how to get the DDL for other objects) can be found in the manual: http://docs.oracle.com/cd/B28359_01/appdev.111/b28419/d_metada.htm
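For example, in SQL*Plus (other clients have their own settings) you can raise the LONG limit so the generated DDL is not cut off:

SET LONG 100000   -- allow long CLOB values to be displayed in full
SELECT dbms_metadata.get_ddl('TABLE', table_name)
FROM user_tables;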
You can use the view USER_TAB_COLUMNS. Below is an example query:
select
table_name,
column_name,
data_type,
data_length,
data_precision,
nullable
from USER_TAB_COLUMNS
where table_name = '<table_name>';
This is only an example; you can also do a select * to get more information.
You can also use the view all_tab_columns.
For a better display you can use:
select table_name, column_name, data_type ||
       case
           when data_precision is not null and nvl(data_scale, 0) > 0
               then '(' || data_precision || ',' || data_scale || ')'
           when data_precision is not null and nvl(data_scale, 0) = 0
               then '(' || data_precision || ')'
           when data_precision is null and data_scale is not null
               then '(*,' || data_scale || ')'
           when char_length > 0
               then '(' || char_length ||
                    case char_used
                        when 'B' then ' Byte'
                        when 'C' then ' Char'
                        else null
                    end || ')'
       end || decode(nullable, 'N', ' NOT NULL') as data_type
from user_tab_columns
where table_name = '<TABLE_NAME>';