When I try to query the metastore in Hive, I get the error below.
hive> use mydb;
OK
Time taken: 0.052 seconds
hive> select * from DBS;
FAILED: SemanticException [Error 10001]: Line 1:14 Table not found 'DBS'
hive> select * from TBLS;
FAILED: SemanticException [Error 10001]: Line 1:14 Table not found 'TBLS'
I am using Hadoop version: Hadoop 2.7.3.2.6.2.3-1
Is this an access privilege issue?
Kindly share your suggestions.
You can try this (DBS and TBLS live in the metastore database, not in your Hive database mydb):
select * from hive.TBLS;
Regarding your question from comments:
use hive;
show tables;
You may also find this link useful.
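If neither of the above works, DBS and TBLS can also be read directly from the metastore's backing RDBMS. A sketch, assuming a MySQL metastore whose database is named hive (the column names come from the standard metastore schema):

```sql
-- Run in the metastore RDBMS (e.g. the mysql client), not in the Hive CLI
USE hive;
SELECT NAME, DB_LOCATION_URI FROM DBS;
SELECT TBL_NAME, TBL_TYPE FROM TBLS;
```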
Related
I am having problems trying to read from Presto:
presto> select * from DEFAULT.MYDIM;
Error: Query 20190722_165050_00066_7g652 failed: Hive table 'default.mydim' is corrupt. Found sub-directory in bucket directory for partition: <UNPARTITIONED>
presto> SELECT * FROM DEFAULT.hello_acid;
Query 20190722_171313_00080_7g652, FAILED, 11 nodes
Splits: 16 total, 0 done (0.00%)
0:01 [0 rows, 0B] [0 rows/s, 0B/s]
Query 20190722_171313_00080_7g652 failed: Hive table 'default.hello_acid' is corrupt. Found sub-directory in bucket directory for partition: load_date=2016-03-02
How could I solve this?
Presto currently does not support ACID/Transactional tables, unless they're fully VACUUM-ed. You can track this feature request at https://github.com/prestosql/presto/issues/576
Currently you can read such tables using Starburst Presto 323e.
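To check from the Hive side whether a table is transactional before querying it in Presto, something like this should work (a sketch, using the table name from the question):

```sql
-- In the Hive CLI / beeline: look for transactional=true in the output
SHOW TBLPROPERTIES default.hello_acid;
-- DESCRIBE FORMATTED also lists Table Parameters, including transactional
DESCRIBE FORMATTED default.hello_acid;
```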
I'm using the Microsoft Hive ODBC driver to connect to the Hive server. An error occurs when I try to execute 'select * from tb limit 100' on a table 'tb' stored as CSV with a partition key. Other tables without a partition key execute successfully.
ERROR [HY000] [Microsoft][Hardy] (97) Error occurred while trying to
get table schema from server. Error: [Microsoft][Hardy] (35) Error
from server: error code: '0' error message:
'MetaException(message:java.lang.UnsupportedOperationException:
Storage schema reading not supported)'.
Add below configuration under "Custom hive-site":
metastore.storage.schema.reader.impl=org.apache.hadoop.hive.metastore.SerDeStorageSchemaReader
It worked for me.
Note: Restart affected services after saving the configuration.
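If you edit hive-site.xml by hand instead of through the UI's "Custom hive-site" section, the equivalent entry would look like this (same property name and value as above):

```xml
<property>
  <name>metastore.storage.schema.reader.impl</name>
  <value>org.apache.hadoop.hive.metastore.SerDeStorageSchemaReader</value>
</property>
```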
I was able to truncate an external table using the FORCE keyword in one environment, but in another it did not work and threw an error.
TRUNCATE TABLE hive_table_name FORCE
org.apache.hive.service.cli.HiveSQLException: Error while compiling
statement: FAILED: ParseException line 1:24 extraneous input 'FORCE'
expecting EOF near '
Please help if we need to configure something.
I have created one table in hive from existing s3 file as follows:
create table reconTable (
  entryid string,
  run_date string
)
LOCATION 's3://abhishek_data/dump1';
Now I would like to update one entry as follows:
update reconTable set entryid='7.24E-13' where entryid='7.24E-14';
But I am getting the following error:
FAILED: SemanticException [Error 10294]: Attempt to do update or delete using transaction manager that does not support these operations.
I have gone through a few posts here, but haven't figured out how to fix this.
I think you should create an external table when reading data from a source like S3.
Also, UPDATE and DELETE only work on tables stored as ORC with the table property 'transactional'='true' set.
Please refer to this for more info: attempt-to-do-update-or-delete-using-transaction-manager
You can refer to this Cloudera Community Thread:
https://community.cloudera.com/t5/Support-Questions/Hive-update-delete-and-insert-ERROR-in-cdh-5-4-2/td-p/29485
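The approach above can be sketched as follows. This is only a sketch: the _ext/_acid table names are invented here, Hive ACID must already be enabled (hive.txn.manager set to org.apache.hadoop.hive.ql.lockmgr.DbTxnManager), and the exact requirements depend on your Hive version:

```sql
-- External table over the existing S3 data (read-only)
CREATE EXTERNAL TABLE reconTable_ext (
  entryid  string,
  run_date string
)
LOCATION 's3://abhishek_data/dump1';

-- Managed ORC table with ACID support (bucketing is required on Hive 1.x/2.x)
CREATE TABLE reconTable_acid (
  entryid  string,
  run_date string
)
CLUSTERED BY (entryid) INTO 4 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional'='true');

-- Copy the data in; UPDATE then works on the ACID table
INSERT INTO reconTable_acid SELECT * FROM reconTable_ext;
UPDATE reconTable_acid SET entryid = '7.24E-13' WHERE entryid = '7.24E-14';
```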
I am fetching data from HiveServer2 in the SpagoBI 5.1 open source tool.
When I create a dashboard, it shows an error: Impossible to load dataset [test_connection] due to the following service errors: Method not supported.
And in the Hive background terminal it shows an error: SemanticException [Error 10001]: Line 1:14 Table not found 'test1'.
HiveServer2 was started with: hive --service hiveserver2 10000 &.
Thanks