Generating Snowflake scripts that grant the privileges to all the appropriate roles in an account

I'm trying to write SELECT statements that generate GRANT scripts granting all the privileges to the appropriate roles across the entire account.
This is needed to migrate a Snowflake account from one region to another.
Has anyone come up with such a script?
I have the following so far (it runs OK, but it's not validated yet; it returns too many rows, perhaps it's a Cartesian product):
select 'grant ' || gt.privilege || ' on ' || gt.table_catalog || '.' || gt.table_schema
       || '.' || gt.granted_on || ' ' || gt.name || ' to role ' || gt.grantee_name || ';'
from account_usage.grants_to_roles gt
where gt.table_catalog is not null
  and gt.table_schema is not null
order by gt.grantee_name, gt.name, gt.granted_on, gt.privilege;
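For comparison, a revised sketch (hedged guesses, not validated): Snowflake's GRANT syntax puts the object type before the qualified name (grant SELECT on TABLE db.sch.t ...), and the ACCOUNT_USAGE views retain rows for dropped objects and revoked grants, so filtering on DELETED_ON IS NULL may account for some of the excess rows:

```sql
select 'grant ' || privilege
       || ' on ' || granted_on || ' '
       || table_catalog || '.' || table_schema || '.' || name
       || ' to role ' || grantee_name || ';'
from snowflake.account_usage.grants_to_roles
where granted_to = 'ROLE'        -- grants can also go to users/shares
  and deleted_on is null         -- skip revoked/dropped grants
  and table_catalog is not null
  and table_schema is not null
order by grantee_name, name, granted_on, privilege;
```

Note that some GRANTED_ON values (e.g. MATERIALIZED_VIEW) may need the underscore replaced with a space before they are valid in a GRANT statement.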
Thanks
NJ_JA

Related

Cleaning a SQL schema before using SQL Developer Database copy tool

I have a database for an application and several environments (development, test and production). I would like to use the Database Copy option from SQL Developer to retrieve data from production and copy it into development, so that the data on both environments is the same.
With a previous version of the program everything worked perfectly. However, with a new version (SQL Developer 18.2) imposed by our company, I get several errors on different objects (sequences, existing tables, primary keys, ...) during the copy.
So I would like to use a script to clean the objects of the database before using the tool, to see if that solves the problem. But I don't know how to do that.
I found and updated this script:
BEGIN
   FOR cur_rec IN (SELECT object_name, object_type
                     FROM user_objects
                    WHERE object_type IN ('TABLE',
                                          'VIEW',
                                          'PACKAGE',
                                          'PROCEDURE',
                                          'FUNCTION',
                                          'SEQUENCE',
                                          'SYNONYM'))
   LOOP
      BEGIN
         IF cur_rec.object_type = 'TABLE'
         THEN
            EXECUTE IMMEDIATE   'DROP '
                             || cur_rec.object_type
                             || ' "'
                             || cur_rec.object_name
                             || '" CASCADE CONSTRAINTS';
         ELSE
            EXECUTE IMMEDIATE   'DROP '
                             || cur_rec.object_type
                             || ' "'
                             || cur_rec.object_name
                             || '"';
         END IF;
      EXCEPTION
         WHEN OTHERS
         THEN
            DBMS_OUTPUT.put_line (   'FAILED: DROP '
                                  || cur_rec.object_type
                                  || ' "'
                                  || cur_rec.object_name
                                  || '"');
      END;
   END LOOP;
END;
/
However, this script cleans up the schema by DROPPING all objects. I would like to keep the structure and the objects, and just empty the content.
Could you please help me clean the different objects without deleting them and importing them again?
Thanks in advance for your help.
Sebastien
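One possible direction for keeping the objects but emptying them (a sketch only, not validated against the Database Copy tool): disable the foreign-key constraints, truncate every table, then re-enable the constraints, using the standard USER_CONSTRAINTS and USER_TABLES views:

```sql
BEGIN
   -- disable foreign keys so truncation order doesn't matter
   FOR c IN (SELECT table_name, constraint_name
               FROM user_constraints
              WHERE constraint_type = 'R')
   LOOP
      EXECUTE IMMEDIATE 'ALTER TABLE "' || c.table_name
                     || '" DISABLE CONSTRAINT "' || c.constraint_name || '"';
   END LOOP;
   -- empty every table while keeping its structure
   FOR t IN (SELECT table_name FROM user_tables)
   LOOP
      EXECUTE IMMEDIATE 'TRUNCATE TABLE "' || t.table_name || '"';
   END LOOP;
   -- re-enable the foreign keys
   FOR c IN (SELECT table_name, constraint_name
               FROM user_constraints
              WHERE constraint_type = 'R')
   LOOP
      EXECUTE IMMEDIATE 'ALTER TABLE "' || c.table_name
                     || '" ENABLE CONSTRAINT "' || c.constraint_name || '"';
   END LOOP;
END;
/
```

Sequences would keep their current values with this approach; they would have to be dropped and recreated (or altered) separately if they also need resetting.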

Execute a SQL statement prepared by another SQL

Is it possible to execute an ALTER INDEX statement prepared by a SELECT on the sys tables in Oracle 12c? Please see below.
I am trying to find unused indexes, for which I have prepared the ALTER statements with the SELECT below:
SELECT 'alter index ' || owner || '.' || index_name || ' monitoring usage;'
FROM dba_indexes
WHERE owner NOT in (
'SYSTEM',
'SYS');
Next, I would have to manually copy the output of this SQL into a new SQL Commander tab and execute it. Instead, is it possible to execute these statements directly rather than just displaying them?
I am trying to achieve this in SQL only, as a single statement executable from any SQL utility like DbVisualizer or SQL*Plus, and NOT in PL/SQL or Unix.
You can do this in two ways:
1- Spooling the result to a file and running the file:
set pages 0 lines 200 feed off term off
spool _file
SELECT 'alter index ' || owner || '.' || index_name || ' monitoring usage;'
FROM dba_indexes
WHERE owner NOT in ('SYSTEM','SYS');
spool off
@_file
2- Or via PL/SQL:
DECLARE
   query   VARCHAR2(200);
BEGIN
   FOR rec IN (SELECT owner, index_name
                 FROM dba_indexes
                WHERE owner NOT IN ('SYSTEM', 'SYS'))
   LOOP
      query := 'alter index ' || rec.owner || '.' || rec.index_name || ' monitoring usage';
      EXECUTE IMMEDIATE query;
   END LOOP;
END;
/
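Once monitoring is enabled, the collected usage can be checked later with something like the following (run from the index owner's schema; V$OBJECT_USAGE only shows indexes for which monitoring was enabled):

```sql
-- USED is YES/NO depending on whether the index was used since monitoring started
SELECT index_name, table_name, used, start_monitoring, end_monitoring
FROM   v$object_usage;
```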

Dynamic SQL with table name as a parameter

I am trying to execute a procedure into which I pass the table name and two column names as parameters:
EXECUTE IMMEDIATE 'select avg(#column1) from #Table1 where REF_D = #column2' into ATTR_AVG;
I have tried using the variables in combinations of '#', ':' and '||', but nothing seems to work.
Has anyone used table names as a parameter? There are a few solutions here, but for SQL Server.
You can only use bind variables (denoted by colons) for values, not for parts of the structure. You will have to concatenate the table and column names into the query:
EXECUTE IMMEDIATE 'select avg(' || column1 || ') from ' || Table1
   || ' where REF_D = ' || column2 into ATTR_AVG;
Which implies REF_D is a fixed column name that can appear in any table you'll call this for; in a previous question that seems to be a variable. If it is actually a string variable then you'd need to bind and set that:
EXECUTE IMMEDIATE 'select avg(' || column1 || ') from ' || Table1
   || ' where ' || column2 || ' = :REF_D' into ATTR_AVG using REF_D;
If it's supposed to be a date you should make sure the local variable is the right type, or explicitly convert it.
You need to construct the executable statement using || (or else define it as one string containing placeholders that you can then manipulate with replace). Something like:
create or replace procedure demo
( p_table user_tab_columns.table_name%type
, p_column1 user_tab_columns.column_name%type
, p_column2 user_tab_columns.column_name%type )
is
attr_avg number;
begin
execute immediate
'select avg(' || p_column1 || ') from ' || p_table ||
' where ref_d = ' || p_column2
into attr_avg;
dbms_output.put_line('Result: ' || attr_avg);
end demo;
It's generally a good idea to build the string in a debugger-friendly variable first, i.e. something like:
create or replace procedure demo
( p_table user_tab_columns.table_name%type
, p_column1 user_tab_columns.column_name%type
, p_column2 user_tab_columns.column_name%type )
is
attr_avg number;
sql_statement varchar2(100);
begin
sql_statement :=
'select avg(' || p_column1 || ') from ' || p_table ||
' where ref_d = ' || p_column2;
execute immediate sql_statement into attr_avg;
dbms_output.put_line('Result: ' || attr_avg);
end demo;
Depending on what ref_d is, you may have to be careful with what you compare it to, so the above could require some more work, but hopefully it gives you the idea.
Edit: however see Alex Poole's answer for a note about the use of bind variables. If ref_d is a variable that may need to become:
sql_statement :=
'select avg(' || p_column1 || ') from ' || p_table ||
' where ' || p_column2 || ' = :b1';
execute immediate sql_statement into attr_avg using ref_d;
(The convention is to put the search expression on the right e.g. where name = 'SMITH' rather than where 'SMITH' = name, though they are the same thing to SQL.)
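For completeness, a hypothetical call of the demo procedure above (the table and column names are invented for illustration):

```sql
set serveroutput on

begin
   demo(p_table => 'EMPLOYEES', p_column1 => 'SALARY', p_column2 => 'HIRE_DATE');
end;
/
```

Without `set serveroutput on`, the DBMS_OUTPUT line from the procedure will not be displayed.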

How to create a script that will copy tables from another schema into my schema using PL/SQL advanced scripting in SQLDEV?

So, I'm trying to create a script that will generate SQL code for copying tables from the HR schema into my own schema.
What I've got so far is this, but it's incorrect... can anyone help me or give me some hints?
select 'create table ' || USER_TABLES || '_copy as select * from ' || USER_TABLES || ';'from hr_tables;
Please help me, I'm new at this and so desperate.
Try this (note the HR. prefix, so that the generated statements read from the HR schema rather than looking for the tables in your own schema):
SELECT 'CREATE TABLE ' || table_name || '_copy AS SELECT * FROM HR.' || table_name || ';'
FROM all_tables
WHERE owner = 'HR';
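If you would rather execute the statements directly instead of copying the generated output, the same dynamic-SQL pattern as in the answers above works here too (this assumes you have SELECT access on the HR tables):

```sql
begin
   for t in (select table_name from all_tables where owner = 'HR')
   loop
      -- create <name>_COPY in the current schema from HR's table
      execute immediate 'create table "' || t.table_name
                     || '_COPY" as select * from HR."' || t.table_name || '"';
   end loop;
end;
/
```

Note that CREATE TABLE ... AS SELECT copies only the data and column definitions, not indexes, constraints or grants.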

How can I set up a SPOOL script in Oracle APEX to export view data as a CSV?

I have a view in APEX which I want to export to a drive as a CSV file (which would be picked up by processes from other applications). There is the UTL_FILE approach, but it seems much more complex. How can I use SPOOL to export a view as a CSV? I tried the following, but it failed to run inside of APEX, so I am not sure about the script itself, nor about where to save/schedule it.
spool out.csv
select '"'|| EVENT_ID || '",' || ENTER_DATE || ',' || START_TIME || ',' || END_TIME || ',' || PLANNED_FLAG || ',' || PURPOSE
|| ',' || TITLE || ',' || SERVICES || ',' || CAUSES || ',' || TICKET_NUM || ',' || OWNER || ',' || DETAILS from DT_FULLVIEW;
spool off
exit
There are a couple of ways to do this:
1) You can create your script and schedule it to run with SQL*Plus outside of the database, using operating-system utilities.
2) DBMS_SCHEDULER is said to be able to run external programs (I have never done this myself).
3) Create a packaged procedure that creates the file using UTL_FILE and schedule it from within the database.
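A minimal sketch of option 3 (assumptions: a directory object named EXPORT_DIR already exists on the database server and you have write privileges on it; only the first few of the question's columns are shown):

```sql
create or replace procedure export_dt_fullview is
   f utl_file.file_type;
begin
   -- open the file for writing in the EXPORT_DIR directory object
   f := utl_file.fopen('EXPORT_DIR', 'out.csv', 'w', 32767);
   for rec in (select event_id, enter_date, start_time, end_time
                 from dt_fullview)
   loop
      utl_file.put_line(f, '"' || rec.event_id || '",' || rec.enter_date
                           || ',' || rec.start_time || ',' || rec.end_time);
   end loop;
   utl_file.fclose(f);
end export_dt_fullview;
/
```

The procedure could then be scheduled from inside the database with DBMS_SCHEDULER.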