Dynamically drop partition in Hive SQL

I need to drop partitions older than 6 months from the table; this needs to be part of a job that runs every day. I am using the code below:
ALTER TABLE ab_test_cart_sbu_tableau_test_2 DROP IF EXISTS PARTITION (partition_day = add_months(current_date(),-6))
and getting the following error
Error: Error while compiling statement: FAILED: ParseException line
1:104 cannot recognize input near 'add_months' '(' 'current_date' in
constant (state=42000,code=40000)
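The underlying problem is that Hive only accepts constants in a PARTITION spec, so add_months(current_date(), -6) cannot be evaluated there. One workaround, sketched below under the assumption that the daily job is a shell script with GNU date available: compute the cutoff in the wrapper and substitute it. Hive's DROP PARTITION also accepts comparison operators against a constant, which removes every partition older than the cutoff in a single statement:

```shell
# Hive cannot evaluate functions inside a PARTITION spec, so compute the
# cutoff date (6 months ago) in the job's shell wrapper instead.
cutoff=$(date -d "-6 months" +%Y-%m-%d)

# DROP PARTITION accepts comparison operators against a constant, so this
# drops every partition older than the cutoff in one statement.
# Guarded so the script is a no-op where the hive CLI is not installed.
if command -v hive >/dev/null 2>&1; then
  hive -e "ALTER TABLE ab_test_cart_sbu_tableau_test_2 \
    DROP IF EXISTS PARTITION (partition_day < '${cutoff}')"
fi
```

Note this assumes partition_day holds dates as yyyy-MM-dd strings, so lexicographic comparison matches date order.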

Related

pgAdmin 4 - error creating any new table

Whenever I try to create a new table with the pgAdmin 4 tool for PostgreSQL, I get an error:
Query: CREATE TABLE geeks_table
ERROR: syntax error at end of input
LINE 1: CREATE TABLE geeks_table
^
SQL state: 42601
Character: 25
As per the comments, you need to add the column definitions (the data definition language, DDL) to the CREATE statement.
The database needs to know what columns and data types are required; for example, the following DDL will create a two-column table:
CREATE TABLE geeks_table (
code INTEGER PRIMARY KEY,
title varchar(40) NOT NULL
);
If you're aiming for a table without columns, this is the syntax:
CREATE TABLE geeks_table ();

AWS Glue drop partition using Spark SQL

Dropping a partition via Spark SQL from Glue metadata throws an error, while the same code works in the Hive shell.
**Hive shell**
hive> alter table prc_db.detl_stg drop IF EXISTS partition(prc_name="dq") ;
OK
Time taken: 1.013 seconds
**spark shell**
spark.sql(''' alter table prc_db.detl_stg drop IF EXISTS partition(prc_name="dq") ''') ;
Error message:
py4j.protocol.Py4JJavaError: An error occurred while calling o60.sql.
: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: InvalidObjectException(message:Unsupported expression (Service: AWSGlue; Status Code: 400; Error Code: InvalidInputException
If you are planning to drop the partition (Glue Catalog) from the Spark shell, you have to add an extra pair of quotes inside the partition value. For example:
spark.sql(""" alter table reinvent.newtesttable drop partition ( part= " 'part2' " ) """)
For this case it was:
spark.sql(''' alter table prc_db.detl_stg drop IF EXISTS partition(prc_name="'dq'") ''')

Using new columns added to a temp table by ALTER TABLE does not work

I have a problem: a newly added column can't be used in subsequent statements.
I have a temp table built by "select into", then I need to add an identity column with "alter table". But when I try to use the new column in a "join", I get an "Invalid column" error. Please note that these commands work when run separately.
I think the reason is that the compiler doesn't see the new column, so it raises the error before the batch runs.
Is there a solution for that?
I get this problem in SQL Server 2000; it seems the problem is not there in newer versions.
create table #tmp_tb
(name varchar(4), val int)
insert into #tmp_tb values('ab',1);
insert into #tmp_tb values('abc',2);
select * from #tmp_tb
alter table #tmp_tb add id int NOT NULL IDENTITY(1,1);
select * from #tmp_tb
select id,name,val from #tmp_tb
An error occurred:
Msg 207, Level 16, State 3, Line 9
Invalid column name 'id'.
Replace the last line with
EXECUTE sp_executesql N'select id,name,val from #tmp_tb';
Indeed, the parser doesn't know about the new column yet: the whole batch is compiled before the ALTER TABLE runs. Passing the query through sp_executesql compiles it separately, after the column exists, which avoids the error.

Getting Security error while altering table

I have a table called borrowlist. When I try to drop a column I get an error that says:
Cannot find the object "borrowlist" because it does not exist or you do not have permissions.
Query: USE Test
alter table borrowlist DROP column MEMBERFULLNAME
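A common cause of this error (assuming SQL Server here, and that the table really exists) is that the table lives in a schema other than dbo, in which case an unqualified name won't resolve. A quick diagnostic sketch, using the table name from the question:

```sql
USE Test;

-- See whether a table named borrowlist exists in this database,
-- and under which schema (it may not be dbo)
SELECT s.name AS schema_name, t.name AS table_name
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
WHERE t.name = 'borrowlist';

-- Then schema-qualify the table in the ALTER, replacing dbo
-- with whatever schema the query above reports
ALTER TABLE dbo.borrowlist DROP COLUMN MEMBERFULLNAME;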

TABLE DROP statement

I am trying to drop a table that I created in SQL Server. I typed the command as follows:
DROP TABLE [Exercise2]
I received the following error message:
Msg 3701, Level 11, State 5, Line 1
Cannot drop the table 'Exercise2', because it does not exist or you do not have permission.
I have dropped multiple tables trying to clean out my database, but this one will not drop and I'm not sure why.
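One thing worth ruling out first (a sketch, assuming SQL Server Management Studio): the query window may be connected to a different database than the one holding Exercise2, and Msg 3701 is exactly what that produces. Check the session's context and look the table up before dropping; OtherDb below is a placeholder for whatever database the table turns out to live in:

```sql
-- Which database is this session actually using?
SELECT DB_NAME() AS current_database;

-- Does a table called Exercise2 exist here, and under which schema?
SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_NAME = 'Exercise2';

-- If it lives elsewhere, name it fully (OtherDb is a placeholder)
DROP TABLE [OtherDb].[dbo].[Exercise2];
```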