schematool: command not found - hive

I am trying to install Hive on my Ubuntu 19.10 machine.
I am using this doc https://phoenixnap.com/kb/install-hive-on-ubuntu.
As mentioned in step 6, where I am trying to initialize the Derby database, I run the command from the right path: ~/apache-hive-3.1.2-bin/bin
schematool -initSchema -dbType derby
But I get this error:
schematool: command not found.
How can I resolve this, please?

I had the same problem before.
It may be caused by a mistake in the configuration files, such as hive-site.xml or hive-env.sh. In my case, a stray blank space in a configuration file caused this error.
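If you suspect stray whitespace, one quick check (a sketch, assuming a standard $HIVE_HOME/conf layout) is to grep the configuration files for trailing spaces:
# flag any lines ending in whitespace in the Hive config files
grep -nE ' +$' $HIVE_HOME/conf/hive-env.sh $HIVE_HOME/conf/hive-site.xml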

The default path for schematool is $HIVE_HOME/bin/schematool (~/apache-hive-3.1.2-bin/bin/schematool in your case). Try adding this HIVE_HOME setup to your .bashrc file; it worked for me.
# Hive
export HIVE_HOME=/<your hive path>
export PATH=$PATH:$HIVE_HOME/bin
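After editing .bashrc, a quick way to verify the change took effect (a sketch, assuming a Derby metastore as in the question):
# reload the profile and confirm schematool is now found on the PATH
source ~/.bashrc
which schematool
schematool -initSchema -dbType derby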

Try this; I resolved this issue using the following command:
hive --service schematool -dbType mysql -password hive -username hive -validate
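For the Derby setup in the original question, the equivalent invocation would presumably be (a sketch, not verified against your install):
# run schematool through the hive launcher to initialize the Derby metastore schema
hive --service schematool -dbType derby -initSchema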

Run ./schematool -initSchema -dbType derby
Don't forget the ./ prefix; without it the shell won't find schematool, because the current directory is usually not on the PATH.
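For example, from the path mentioned in the question:
# run schematool directly from the Hive bin directory
cd ~/apache-hive-3.1.2-bin/bin
./schematool -initSchema -dbType derby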

Related

Where is s3-dist-cp of EMR 6.2.0?

I created an EMR Spark cluster on release emr-6.2.0.
Then I SSHed into the master node, typed the command s3-dist-cp, and got the following error:
s3-dist-cp: command not found
I searched the whole disk but found nothing:
sudo find / -name "*s3-dist-cp*"
Where is the s3-dist-cp command? Thanks!
It turns out I must also select "Hadoop" when choosing the cluster's applications; s3-dist-cp is only installed when the Hadoop application is included.
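For example, a hypothetical AWS CLI sketch that includes Hadoop alongside Spark when creating the cluster (the name, instance type, and count are placeholders):
# create an EMR 6.2.0 cluster with both Hadoop and Spark so s3-dist-cp is installed
aws emr create-cluster \
  --name "spark-with-hadoop" \
  --release-label emr-6.2.0 \
  --applications Name=Hadoop Name=Spark \
  --instance-type m5.xlarge \
  --instance-count 3 \
  --use-default-roles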

Hive CLI giving problem while starting it

When I run the hive command, it only starts from the bin folder, because the metastore was created in bin. If I run it from my home directory, it fails to start and shows an error.
I have added these lines to my .bashrc file for Hive:
# HIVE env variables
export HIVE_HOME=/opt/hadoop/hive/apache-hive-2.3.4-bin
export PATH=$HIVE_HOME/bin:$PATH
Can you try to set up the path as shown below and retry?
user@ubuntu:~$ sudo gedit ~/.bashrc
Copy and paste the following lines at the end of the file:
# Set HIVE_HOME
export HIVE_HOME="/opt/hadoop/hive/apache-hive-2.3.4-bin"
PATH=$PATH:$HIVE_HOME/bin
export PATH
But my suggestion here is: instead of using the Hive command prompt, use the recommended Beeline client. If you have HiveServer2 configured, you can connect with Beeline and query Hive, as in the sketch below.
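A minimal connection example, assuming HiveServer2 is running on the default port 10000 on the same host and accepts the hive user (adjust the URL and credentials to your setup):
# connect Beeline to HiveServer2 and run queries from its prompt
beeline -u jdbc:hive2://localhost:10000 -n hive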

Hive script not running in crontab with hadoop must be in the path error

Even after setting the Hadoop home and prefix paths in .bashrc and /etc/profile, I am still getting the same error: Cannot find hadoop installation: $HADOOP_HOME or $HADOOP_PREFIX must be set or hadoop must be in the path
I face this error when the script runs from crontab; from the hive> prompt it works fine.
Please help me figure out how to solve this.
Set $HADOOP_HOME in $HIVE_HOME/conf/hive-env.sh
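A minimal sketch of what that could look like, assuming Hadoop lives under /usr/local/hadoop (substitute your actual installation directory):
# add to $HIVE_HOME/conf/hive-env.sh so Hive can locate Hadoop even without the login environment
export HADOOP_HOME=/usr/local/hadoop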
Try loading the user's bash profile at the start of the script, as below:
. ~/.bash_profile
Cron runs jobs with a minimal environment, so the script will not see HADOOP_HOME, HIVE_HOME, or the user's PATH unless the user-specific profile is sourced explicitly.
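For example, a hypothetical crontab entry along these lines (the schedule, script, and log paths are placeholders):
# source the user's profile, then run the Hive script, logging its output
0 2 * * * . $HOME/.bash_profile; /home/hadoop/scripts/hive_job.sh >> /home/hadoop/logs/hive_job.log 2>&1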
See the similar question: Hbase commands not working in script executed via crontab.

Apache Hadoop 2.6: Pseudo Distribution Mode Setup

I am setting up Apache Hadoop 2.6 for pseudo-distributed operation by following the instructions provided in the link:
http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/SingleCluster.html
I am facing an issue after I execute the command: $ bin/hdfs dfs -put etc/hadoop input
The error message is: put: 'input': No such file or directory
How to resolve this?
Also, I have edited hadoop-env.sh with the statement export HADOOP_PREFIX=/usr/local/hadoop, but I cannot understand why the shell prints the warning: /usr/local/hadoop/etc/hadoop/hadoop-env.sh: line 32: export:='/usr/local/hadoop': not a valid identifier
Thanks for the help.
I have fixed this problem.
I created the directory with $ bin/hdfs dfs -mkdir /user/root and the problem was solved, since I was logged in as root on Ubuntu. Earlier, I was using the wrong username, which caused the issue.
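Putting it together, a short sketch of the fix, assuming you run the commands as root from the Hadoop installation directory (relative HDFS paths resolve against /user/<username>):
# create the HDFS home directory for the current user (-p creates missing parents), then retry the put
bin/hdfs dfs -mkdir -p /user/root
bin/hdfs dfs -put etc/hadoop input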

Hadoop + Hive - hcatalog won't startup

I just installed a single node Hadoop 2.2.0 cluster running on ubuntu.
I tried a couple of basic example calculations and it works fine.
I then tried to setup hive 0.12.0, that includes hcatalog.
I actually followed this tutorial.
And when I try to start HCatalog, I always get the following error:
bash $HIVE_HOME/hcatalog/sbin/hcat_server.sh start
dirname: missing operand
Try `dirname --help' for more information.
Started metastore server init, testing if initialized correctly...
/usr/local/hive/hcatalog/sbin/hcat_server.sh: line 91: /usr/local/hive-0.12.0/hcatalog/sbin/../var/log/hcat.out: No such file or directory
Metastore startup failed, see /usr/local/hive-0.12.0/hcatalog/sbin/../var/log/hcat.err
But there's no hcat.err file at all, so I'm kind of blocked right now.
Any help would be much appreciated!
Thanks in advance,
Guillaume
I worked out that hcat was not executable in the Hive installation I had downloaded.
So just sudo chmod a+x hcat and it works.
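For reference, a sketch of the steps covering both symptoms above, assuming the /usr/local/hive-0.12.0 layout shown in the error output (adjust paths to your install):
# the missing var/log directory explains the "No such file or directory" message
sudo mkdir -p /usr/local/hive-0.12.0/hcatalog/var/log
# make the hcat launcher executable (its location is assumed from the standard Hive 0.12 layout)
sudo chmod a+x /usr/local/hive-0.12.0/hcatalog/bin/hcat
# then retry starting the HCatalog server
bash $HIVE_HOME/hcatalog/sbin/hcat_server.sh start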