While creating a table (as user admin) with Hive, I get this error:
Your query has the following error(s):
Error while compiling statement: FAILED: SemanticException No valid privileges Required privileges for this query: Server=server1->Db=*->Table=+->action=insert;Server=server1->Db=*->Table=+->action=select;
I do not understand why admin does not have enough privileges. What should I do to solve the problem?
Thanks
Log in to a terminal, run this command: sudo -u hdfs hadoop fs -chmod -R 777 /user/hive/warehouse, and then try creating databases/tables again.
The admin user in Hue may not have permissions on the HDFS location /user/hive/warehouse, which is where all the Hive tables point.
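The error itself comes from Sentry authorization (note the Server=server1 privilege string), so instead of opening up the warehouse directory you could also grant the admin group a Sentry role with the required privileges. A minimal sketch, run in Hive/Beeline as a Sentry admin, assuming your group is called admin and the role name admin_role is free to choose:
CREATE ROLE admin_role;
GRANT ALL ON SERVER server1 TO ROLE admin_role;
GRANT ROLE admin_role TO GROUP admin;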
I am implementing a Python script which uses paramiko to connect to a Hadoop cluster. My problem is that I can only SSH in as the root user, and from there I have to switch to the hdfs user to execute my command.
Now I need something to automate switching to the hdfs user, cd'ing into /tmp/, and then executing the command from there. I have tried invoke_shell(), but it hangs, and && inside exec_command doesn't work either.
I am getting a permission denied exception:
java.io.FileNotFoundException: file.txt (Permission denied)
There are two workflows that I have thought of:
1st one:
1. sudo -u hdfs -s
2. cd /tmp/
3. <execute the command> <outputDir>
2nd one:
sudo -u hdfs <execution command> /tmp/<outputDir>
The first one doesn't give the above error, but the second one throws it. I was trying the second one just to avoid the dependent-command issue.
Any help or suggestions will be appreciated.
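One possible approach, as a minimal sketch (the hostname, password, and the run_job command below are placeholders): wrap the whole sequence in a single sudo -u hdfs sh -c '...' string passed to exec_command, which avoids both invoke_shell() and chaining separate commands with &&.
import paramiko

# Connect as root, the only user we can SSH in as.
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("edgenode.example.com", username="root", password="secret")

# Switch to hdfs, cd into /tmp and run the command, all in one shell invocation.
# If sudo complains about requiring a tty, pass get_pty=True to exec_command.
cmd = "sudo -u hdfs sh -c 'cd /tmp && ./run_job outputDir'"
stdin, stdout, stderr = client.exec_command(cmd)
print(stdout.read().decode())
print(stderr.read().decode())

client.close()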
Hey everyone, I am fairly new to doing a pg_dump in PostgreSQL. I have logged into the server and am running as the postgres user. I try to run pg_dump in order to do a database migration, but I keep getting a "Permission denied" error. I believe that I have the highest permissions and should be able to run this. Is there something wrong with my syntax in the terminal? Would appreciate the help.
Issue:
[user@dfhsdaf07 ~]$ sudo su
[root@dfhsdaf07 user]# su postgres
bash-4.2$ pg_dump -F t file > file.tar
bash: file.tar: Permission denied
bash-4.2$ pg_dump -F t file >./file.tar
bash: ./file.tar: Permission denied
System info:
MacBook Pro 2015
16 GB RAM
Intel i7 processor
OS:
macOS Big Sur 11.5.2
The permission you are lacking is the permission to create the dump file in the current directory.
The reason is that if you use su postgres, you remain in the previous working directory (presumably /root), and user postgres doesn't have permission to create a file there.
Use the - option:
su - postgres
That will start a login shell and put you into postgres's home directory.
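For example (a minimal sketch, assuming the database is named file as in the question), either switch to a login shell first, or simply point the dump at a directory the postgres user can write to:
su - postgres
pg_dump -F t file -f ~/file.tar

# or, staying in the root shell:
sudo -u postgres pg_dump -F t file -f /tmp/file.tar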
I am beginning to learn big data with Hadoop and Hive.
I can't load local data into a Hive table.
The Hive command is:
load data local inpath '/usr/local/nhanvien/testHive.txt' into table nhanvien;
I get this error:
Loading data to table hivetest.nhanvien Failed with exception Unable
to move source file:/usr/local/nhanvien/testHive.txt to destination
hdfs://localhost:9000/user/hive/warehouse/hivetest.db/nhanvien/testHive_copy_3.txt
FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.MoveTask
I tried:
hadoop fs -chmod g+w /user/hive/warehouse
sudo chmod -R 777 /home/abc/employeedetails
but I still get the same error.
Can someone give me a solution?
You can try setting HADOOP_USER_NAME so the load runs as the hdfs user (this works on clusters without Kerberos):
export HADOOP_USER_NAME=hdfs
hive -e "load data local inpath '/usr/local/nhanvien/testHive.txt' into table nhanvien;"
It's a permission issue. Try granting permission on the local file and on the directory where your file exists.
sudo chmod -R 777 /usr/local/nhanvien/testHive.txt
Then
Log in as $HDFS_USER and run the following commands:
hdfs dfs -chown -R $HIVE_USER:$HDFS_USER /user/hive
hdfs dfs -chmod -R 775 /user/hive
hdfs dfs -chmod -R 775 /user/hive/warehouse
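To confirm that the ownership and mode changes took effect, one way is to list the warehouse directory (assuming the default warehouse path):
hdfs dfs -ls /user/hive/warehouse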
You can also configure hdfs-site.xml as follows:
<property>
<name>dfs.permissions</name>
<value>false</value>
</property>
This configuration disables permission checking on HDFS, so a regular user can perform operations on HDFS.
Hope this helps.
I'm seeing an issue with running the Hive CLI. When I run the CLI on an edge node I receive the following error regarding HDFS permissions:
c784gnj:~ # sudo hive
/usr/lib/hive/conf/hive-env.sh: line 5: /usr/lib/hive/lib/hive-hbase-handler-1.1.0-cdh5.5.2.jar,/usr/lib/hbase/hbase-common.jar,/usr/lib/hbase/lib/htrace-core4-4.0.1-incubating.jar,/usr/lib/hbase/lib/htrace-core-3.2.0-incubating.jar,/usr/lib/hbase/lib/htrace-core.jar,/usr/lib/hbase/hbase-hadoop2-compat.jar,/usr/lib/hbase/hbase-client.jar,/usr/lib/hbase/hbase-server.jar,/usr/lib/hbase/hbase-hadoop-compat.jar,/usr/lib/hbase/hbase-protocol.jar: No such file or directory
Java HotSpot(TM) 64-Bit Server VM warning: Using incremental CMS is deprecated and will likely be removed in a future release
16/10/11 10:35:49 WARN conf.HiveConf: HiveConf of name hive.metastore.local does not exist
Logging initialized using configuration in jar:file:/usr/lib/hive/lib/hive-common-1.1.0-cdh5.5.2.jar!/hive-log4j.properties
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=app1_K, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:216)
What is hive trying to write to in the /user directory in HDFS?
I can already see that /user/hive is created:
drwxrwxr-t - hive hive 0 2015-03-16 22:17 /user/hive
As you can see, I am behind Kerberos auth on Hadoop.
Thanks in advance!
The log says you need to grant user app1_K write permission on the HDFS /user directory.
Command:
hadoop fs -setfacl -m -R user:app1_K:rwx /user
Execute this command as a privileged user from the Hadoop bin directory.
If you get a similar permission error on any other HDFS directory, you have to grant permission on that directory as well.
Refer to the link below for more information.
https://hadoop.apache.org/docs/r2.7.2/hadoop-project-dist/hadoop-hdfs/HdfsPermissionsGuide.html#ACLs_Access_Control_Lists
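To inspect the ACLs currently applied to a directory, you can use getfacl, e.g.:
hdfs dfs -getfacl /user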
Instead of disabling HDFS access privileges altogether, as suggested by @Kumar, you might simply create an HDFS home dir for every new user on the system, so that Hive/Spark/Pig/Sqoop jobs have a valid location to create temp files...
On a Kerberized cluster:
kinit hdfs@MY.REALM
hdfs dfs -mkdir /user/app1_k
hdfs dfs -chown app1_k:app1_k /user/app1_k
Otherwise:
export HADOOP_USER_NAME=hdfs
hdfs dfs -mkdir /user/app1_k
hdfs dfs -chown app1_k:app1_k /user/app1_k
I'm using Hive through Hue. I tried to create a table using the following schema:
create table temp_batting (col_value STRING);
and I'm getting the following error:
Driver returned: 1. Errors: OK Hive history
file=/tmp/hue/hive_job_log_4d872c22-e58c-4f9a-9573-442c2be4664b_1970355385.txt
FAILED: Error in metadata:
MetaException(message:javax.jdo.JDODataStoreException: Add request
failed : INSERT INTO "COLUMNS_V2"
("CD_ID","COMMENT","COLUMN_NAME","TYPE_NAME","INTEGER_IDX") VALUES
(?,?,?,?,?) NestedThrowables: org.postgresql.util.PSQLException:
ERROR: permission denied for relation COLUMNS_V2) FAILED: Execution
Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
Looks like a system problem (Linux?). The user executing this command probably doesn't have write access to the folder where the Postgres tables (files) are stored.
It seems like your PostgreSQL Hive metastore is not configured properly. The Hive user should have the permissions to modify the metastore database. More information is here.
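As a sketch of the kind of grants involved, assuming the metastore database is named metastore and the Hive metastore connects as a role named hiveuser (both names are assumptions; match them to your hive-site.xml), connect to the metastore database as the postgres superuser and run:
GRANT ALL PRIVILEGES ON DATABASE metastore TO hiveuser;
GRANT ALL PRIVILEGES ON ALL TABLES IN SCHEMA public TO hiveuser;
GRANT ALL PRIVILEGES ON ALL SEQUENCES IN SCHEMA public TO hiveuser;
The error in the question (permission denied for relation COLUMNS_V2) suggests the metastore role lacks privileges on the existing metastore tables, which the second statement addresses.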
You need to change the permissions on your database files. In the path where your database is stored, you need to ensure that you can write to it. If you know the path, run this command in that directory:
For Linux:
$ sudo chmod -R a+w .