How to create copy of full schema on same database in Redshift - sql

Currently, Redshift does not provide a way to create a copy of a full schema within the same database. I followed this (http://docs.aws.amazon.com/redshift/latest/dg/r_CREATE_SCHEMA.html), but did not find any information on my question.

Yes, you can use the query below to copy your data from one schema to another:
CREATE TABLE new_schema.table_name AS SELECT * FROM old_schema.table_name;

It sounds like you want to create a copy of all the tables with data. If so then you will have to:
Create the new schema
Retrieve the DDL for all tables in the existing schema
Use this view: https://github.com/awslabs/amazon-redshift-utils/blob/master/src/AdminViews/v_generate_tbl_ddl.sql
Modify the DDL to reference the new schema
Run the DDL to create the target tables
Run an INSERT INTO new_schema.new_table SELECT * FROM old_schema.old_table; for each table in the schema.
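The steps above can be sketched in SQL; the table and column names here are made up for illustration (the real DDL for step 2 comes from the v_generate_tbl_ddl view):

```sql
-- 1. Create the target schema
CREATE SCHEMA new_schema;

-- 2. Recreate each table using the DDL emitted by v_generate_tbl_ddl,
--    with the schema name edited from old_schema to new_schema
--    (hypothetical table and columns shown):
CREATE TABLE new_schema.my_table (
    id   INTEGER,
    name VARCHAR(100)
);

-- 3. Copy the data, one statement per table
INSERT INTO new_schema.my_table
SELECT * FROM old_schema.my_table;
```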

Related

Setting transactional-table properties results in external table

I am creating a managed table via Impala as follows:
CREATE TABLE IF NOT EXISTS table_name
STORED AS parquet
TBLPROPERTIES ('transactional'='false', 'insert_only'='false')
AS ...
This should result in a managed table which does not support HIVE-ACID.
However, when I run the command I still end up with an external table.
Why is this?
I found out in the Cloudera documentation that omitting the EXTERNAL keyword when creating the table does not mean that the table will definitely be managed:
When you use EXTERNAL keyword in the CREATE TABLE statement, HMS stores the table as an external table. When you omit the EXTERNAL keyword and create a managed table, or ingest a managed table, HMS might translate the table into an external table or the table creation can fail, depending on the table properties.
Thus, setting transactional=false and insert_only=false leads to an External Table in the interpretation of the Hive Metastore.
Interestingly, setting only TBLPROPERTIES ('transactional'='false') is completely ignored and will still result in a managed table with transactional=true.
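Conversely, if a managed table is what you actually want, the quoted documentation suggests keeping the table transactional so HMS has no reason to translate it. A sketch under that assumption, keeping insert-only semantics:

```sql
-- Hypothetical sketch: leaving the table as insert-only transactional
-- should let HMS keep it as a managed table
CREATE TABLE IF NOT EXISTS table_name
STORED AS parquet
TBLPROPERTIES ('transactional'='true',
               'transactional_properties'='insert_only')
AS ...
```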

How to get information in one database when row is inserted in another database?

I have two databases. How can I write a trigger to get a notification in database1 when a row is inserted into some table in database2? I have a link from database1 to database2, but I don't have a link from database2 to database1.
A standard way to replicate data between databases like this is to create a materialized view in database1 for your table(s) in database2. E.g.
CREATE MATERIALIZED VIEW db1_table
REFRESH FORCE AS
SELECT * FROM db2_table@db2;
See for example this article for details.
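To pick up new rows without manual intervention, the materialized view can also be given a periodic refresh schedule; a sketch (the hourly interval and object names are assumptions):

```sql
-- Refresh roughly every hour over the db2 database link
CREATE MATERIALIZED VIEW db1_table
REFRESH FORCE
START WITH SYSDATE NEXT SYSDATE + 1/24
AS SELECT * FROM db2_table@db2;
```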

Can't delete table in different schema from Apache Ignite

I am trying to drop my tables in Apache Ignite.
For example:
drop table if exists city
The code is working for PUBLIC schema but I couldn't delete tables from other schemas.
The error says 'Failed to parse query. Table "PRODUCT" not found;'
Here is the screenshot of my PowerShell session.
How can I delete tables that belong to different schemas?
You need to fully qualify it:
drop table "Product".PRODUCT;
You need to do the same thing whenever you reference tables in different schemas, for example with a JOIN.
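For instance, joining a table in the default PUBLIC schema with one in the "Product" schema would look like this (the column names are made up for illustration):

```sql
-- Quoted schema names are case-sensitive in Ignite SQL
SELECT c.name, p.title
FROM city AS c
JOIN "Product".PRODUCT AS p ON p.city_id = c.id;
```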

Best way of importing tables to PostgreSQL from Oracle?

I have run some queries in a read-only Oracle database (via SQL Developer) where I have absolutely no privileges to create temporary tables or anything else. I want to export the tables resulting from my queries into another database (PostgreSQL/pgAdmin) in order to be able to run other queries on the result.
Is there an easy way to create the columns of my exported tables in the PostgreSQL database using pgAdmin, or do I have to create all the columns manually?
You could install the Oracle Foreign Data Wrapper in your PostgreSQL database and use IMPORT FOREIGN SCHEMA to create foreign tables for your Oracle tables.
Then you can use
CREATE TABLE local_table AS SELECT * FROM foreign_table;
to copy the data.
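The full sequence might look like this; the server name, connection string, credentials, and schema names are all illustrative:

```sql
-- One-time setup of the foreign server and credentials
CREATE EXTENSION oracle_fdw;
CREATE SERVER oradb FOREIGN DATA WRAPPER oracle_fdw
    OPTIONS (dbserver '//oracle-host:1521/ORCL');
CREATE USER MAPPING FOR CURRENT_USER SERVER oradb
    OPTIONS (user 'scott', password 'secret');

-- Mirror the Oracle schema as foreign tables, then materialize locally
CREATE SCHEMA ora;
IMPORT FOREIGN SCHEMA "SCOTT" FROM SERVER oradb INTO ora;
CREATE TABLE local_table AS SELECT * FROM ora.foreign_table;
```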

List Hive table properties to create another table

I have a table on Hive already created. Is there a way to copy the table schema to a terminal to pass it to a create table on another Hive server?
Have you tried the SHOW CREATE TABLE <tablename> command? It should give you the CREATE TABLE DDL you are looking for.
This link provides some background on when this was implemented.
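For example (the database and table names are placeholders):

```sql
-- On the source Hive server: print the full DDL for the table
SHOW CREATE TABLE my_db.my_table;

-- Copy the printed statement and run it on the other Hive server,
-- adjusting LOCATION or database names as needed.
```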