Hive insert into query from CLI not working

I'm trying to run an insert into statement from the CLI but I'm getting the following error. If I run the same query in the hive prompt, the records are inserted. Any help would be appreciated. I'm using Hive 2.x.
hive -e "insert into table dim.mpp_dim_count (run_date ,run_type , count1,count2 ,dup_check_all_dim ,fact_dim_sk_match ) values ('05-04-2019','before',100,100,'Y','Y')"
FAILED: SemanticException [Error 10293]: Unable to create temp file
for insert values AlreadyExistsException(message:Table
Values__Tmp__Table__1 already exists.)
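One possible workaround, if the problem really is the temporary table Hive creates for a VALUES clause: rewrite the statement as an insert ... select, which does not go through Values__Tmp__Table__N. This is only a sketch, assuming your Hive 2.x build accepts a SELECT without a FROM clause and that the select list matches the table's column order:
hive -e "insert into table dim.mpp_dim_count select '05-04-2019', 'before', 100, 100, 'Y', 'Y'"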

Related

Cannot create Temporary Table in Hive using Dbvisualizer

When I run the command ->
create temporary table temp_sbm_test as select * from tablename limit 1;
I get the error below in DBVisualizer.
Can someone explain why the temporary table is created in a regular Hive database rather than in a temporary location?
[Code: 40000, SQL State: 42000] Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [userid] does not have [CREATE] privilege on [somedatabse/temp_sbm_test]
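Even though a temporary table's data only lives in the session scratch directory, Hive still registers the table under the current database, so the authorizer checks the CREATE privilege on that database. A minimal sketch, assuming you have CREATE privilege on some other database (my_sandbox is a hypothetical name):
use my_sandbox;
create temporary table temp_sbm_test as select * from tablename limit 1;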

pgpointcloud backup fails due to a missing relation

I'm trying to back up a schema containing only point clouds. When I use the built-in backup in pgAdmin 4, I receive the error below, which complains that ERROR: relation "pointcloud_formats" does not exist. I tested the select schema, srid from pointcloud_formats where pcid = 1 query on its own and it returns without issue. Any ideas?
pg_dump: Dumping the contents of table "pc_0201507091159" failed: PQgetResult() failed.
pg_dump: Error message from server: ERROR: relation "pointcloud_formats" does not exist
LINE 1: select schema, srid from pointcloud_formats where pcid = 1
^
QUERY: select schema, srid from pointcloud_formats where pcid = 1
pg_dump: The command was: COPY pointcloud_99_526.pc_0201507091159 (id, pa) TO stdout;
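One thing worth checking (an assumption, not a confirmed diagnosis): pointcloud_formats is created by the pointcloud extension, usually in public, and the patch output function looks it up by an unqualified name, so a dump that only includes the point cloud schema may fail to resolve it. A sketch of locating the table and including its schema in the dump (mydb is a placeholder database name):
psql -d mydb -c "select n.nspname from pg_class c join pg_namespace n on n.oid = c.relnamespace where c.relname = 'pointcloud_formats';"
pg_dump -n pointcloud_99_526 -n public -f pointclouds.sql mydb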

Hive Update 0.14 version is not working: "Attempt to do update or delete using transaction manager that does not support these operations."

I'm trying to update a Hive ORC bucketed table, but it throws the exception FAILED: SemanticException [Error 10294]: Attempt to do update or delete using transaction manager that does not support these operations.
I'm running it in the Hive command prompt.
STEP 1:
set hive.support.concurrency = true;
SET hive.enforce.bucketing = true;
SET hive.exec.dynamic.partition.mode = nonstrict;
SET hive.txn.manager = org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
SET hive.compactor.initiator.on = true;
SET hive.compactor.worker.threads = 1;
STEP 2:
create table test(id int ,name string ) clustered by (id) into 2 buckets stored as orc TBLPROPERTIES('transactional'='true');
STEP 3:
insert into table test values (1,'row1'),(2,'row2'),(3,'row3'); -- 3 rows inserted successfully
STEP 4:
insert into table testTable values (1,'row1'),(2,'row2');
FAILED: SemanticException [Error 10294]: Attempt to do update or delete using transaction manager that does not support these operations.
After this, when I open another Hive prompt and run show tables, it hangs and no results are returned. I restarted the Hive services as well, but it made no difference.
According to the Hive wiki, Bucketing and Partitioning columns cannot be updated. Can you retry by using a column other than id?
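For example, a sketch against the test table from step 2 (the new value is made up): updating name rather than the clustering column id should at least get past this check once the transactional settings from step 1 are in place.
update test set name = 'row1_updated' where id = 1;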

Need to delete BULK INSERT object to run again

I have a SQL script I am supposed to run that starts with:
BULK INSERT #BridgeVendors
FROM 'D:\projects\databases\Scripts\Release\6.7.1\BridgeVendors.csv'
WITH ( FIELDTERMINATOR=',', FIRSTROW = 2 )
The first time I ran it, the path pointed to an incorrect location, so it didn't execute properly. Now I can't run it again because I get the error:
There is already an object named '#BridgeVendors' in the database.
How do I UNDO or DELETE this "Object" that was a BULK INSERT??
You just need to drop the table :)
drop table #BridgeVendors
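If you want the script to be rerunnable, a common guard is to drop the temp table if it already exists before the BULK INSERT runs (a sketch; the table creation itself is whatever your script already does):
IF OBJECT_ID('tempdb..#BridgeVendors') IS NOT NULL
    DROP TABLE #BridgeVendors;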

Failing to update a table in DB2 with SQLCODE: -668, SQLSTATE: 57016, SQLERRMC: 7;

I am using DB2 9.5. I added a column to a table, and the column was created successfully, but I am not able to update that column and I get the following error:
[Error] Script lines: 1-1 --------------------------
DB2 SQL error: SQLCODE: -668, SQLSTATE: 57016, SQLERRMC: 7;DB2ADMIN.XCATENTRYEXT
Message: Operation not allowed for reason code "7" on table "DB2ADMIN.XCATENTRYEXT".
Following some blogs/sites found via Google, I found the REORG command suggested as a solution, as mentioned in the following link:
http://bytes.com/topic/db2/answers/508869-reorg-tablespace
I have tried running the following queries against the database to solve the problem:
Database["DB2"].ExecuteNonQuery("call SYSPROC.ADMIN_CMD ('REORG TABLE DB2ADMIN.XCATENTRYEXT index CATENTRY_ID INPLACE')")
REORG TABLE DB2ADMIN.XCATENTRYEXT index CATENTRY_ID INPLACE
REORG TABLE DB2ADMIN.XCATENTRYEXT
REORG INDEXES I0000908 FOR TABLE DB2ADMIN.XCATENTRYEXT
but all of these queries result in the same error:
DB2 SQL error: SQLCODE: -104, SQLSTATE: 42601, SQLERRMC: Database;BEGIN-OF-STATEMENT;<variable_set>
Message: An unexpected token "Database" was found following "BEGIN-OF-STATEMENT". Expected tokens may include: "<variable_set>".
I am stuck on this error; I am not even able to update any column of that particular table.
It is possible to do REORG through an SQL statement:
CALL SYSPROC.ADMIN_CMD('REORG TABLE SCHEMA.TABLENAME');
It follows from the error message that you are somehow submitting the entire string Database["DB2"].ExecuteNonQuery("call SYSPROC.ADMIN_CMD ('REORG TABLE DB2ADMIN.XCATENTRYEXT index CATENTRY_ID INPLACE')") as a SQL statement, which is obviously incorrect.
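In other words, in a SQL editor you would submit only the inner statement, for example (using the table name from the question):
CALL SYSPROC.ADMIN_CMD('REORG TABLE DB2ADMIN.XCATENTRYEXT');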
Simply issue these on the shell command line:
db2 connect to <your database name here>
db2 REORG TABLE DB2ADMIN.XCATENTRYEXT
If you are using a tool like DBeaver, you can go to Schema --> table name --> right-click --> select Tools, and you should see an option to reorg the table.