How to locate/export Hive query results?

I am new to Hive and am attempting to export the results of a Hive query to a local file on my computer so that I can import them into Excel.
When I run, from inside Hive:
hive -e 'select * from TABLE limit 10' > output.txt;
I get "FAILED: ParseException line 1:0 cannot recognize input near 'hive' '-' 'e'".
When I run
hive -S -e "USE DATABASE; select * from TABLE limit 10" > /tmp/test/test.csv;
from the shell, or
insert overwrite local directory '/tmp/hello'
select * from TABLE limit 10;
the output goes to the HDFS filesystem -- how do I get the results onto my local machine?

You can export the query results to a CSV file like this:
hive -e 'select * from your_Table' > /home/yourfile.csv
To get the /tmp/hello output from HDFS to your local machine, use hdfs dfs -get:
hdfs dfs -get /tmp/hello /PATHinLocalMachine
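If the goal is Excel, you can also convert the tab-separated output that hive -e produces into comma-separated values as you export. A minimal sketch, assuming GNU sed and placeholder database/table names, and assuming the data itself contains no tabs or commas:
hive -e 'use your_database; select * from your_table limit 10' | sed 's/\t/,/g' > /tmp/output.csv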

You are seeing the error because you are running the hive -e command inside the Hive REPL, as shown below:
hive (venkat)> hive -e 'select * from a';
NoViableAltException(26@[])
at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1084)
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:202)
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:166)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:437)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:320)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1219)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1260)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1156)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1146)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:216)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:168)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:379)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:739)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:684)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:624)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:233)
at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
FAILED: ParseException line 1:0 cannot recognize input near 'hive' '-' 'e'
You have to run it from the OS shell instead, as shown below:
[venkata_udamala@gw02 ~]$ hive -e 'use database_name;select * from table_name;' > temp.txt

Related

Hive query error

I'm trying to load a tab-separated file into a Hive text file table using hiveconf parameters, as below:
load data local inpath '${hiveconf:TEXT_FILE}' into table ${hiveconf:HIVE_TABLE};
But when I run the .hql file like this:
hive -hiveconf DB=$DB TEXT_FILE="$text_file_name" HIVE_TABLE=$HIVE_TABLE -f file_load.hql
I get the below error:
NoViableAltException(16@[202:1: tableName : (db= identifier DOT tab= identifier -> ^( TOK_TABNAME $db $tab) |tab= identifier -> ^( TOK_TABNAME $tab) );])
at org.antlr.runtime.DFA.noViableAlt(DFA.java:158)
......
......
FAILED: ParseException line X:YY cannot recognize input near '$' '{' 'hiveconf' in table name
I searched on Google and gathered that it might be due to a Hive keyword, but I have already created the table successfully, and when I hardcode the file name and table name the data loads fine. Please help me here!
Thank you!
You are passing the context variables incorrectly. There should be a -hiveconf before each variable:
hive -hiveconf DB=$DB -hiveconf TEXT_FILE="$text_file_name" -hiveconf HIVE_TABLE=$HIVE_TABLE -f file_load.hql
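For completeness, a minimal sketch of what file_load.hql could look like with those variables (its contents are assumed here, not shown in the question):
-- file_load.hql: switch to the target database, then load the file
use ${hiveconf:DB};
load data local inpath '${hiveconf:TEXT_FILE}' into table ${hiveconf:HIVE_TABLE};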

Hive CLI option -f not working when an error occurs

I have one file with many load data local statements, some of which may cause errors in the Hive CLI.
After an error, the CLI stops processing the remaining statements.
How can I ignore these errors and continue with the rest of the statements?
set hive.cli.errors.ignore=true;
Demo
hive -f <(echo 'select x;select 1+1 as x')
FAILED: SemanticException [Error 10004]: Line 1:7 Invalid table alias
or column reference 'x': (possible column names are: )
hive --hiveconf hive.cli.errors.ignore=true -f <(echo 'select x;select 1+1 as x')
FAILED: SemanticException [Error 10004]: Line 1:7 Invalid table alias
or column reference 'x': (possible column names are: )
OK
2
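The same setting can also go at the top of the script itself, so no extra command-line flag is needed. A minimal sketch with a hypothetical file name; the set must appear before the failing statements:
-- load_all.hql (hypothetical)
set hive.cli.errors.ignore=true;
select x;        -- fails as above, but the CLI continues
select 1+1 as x; -- still runs and prints 2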

HBase/Hive table queried from Squirrel SQL - Error in loading storage handler.org.apache.hadoop.hive.hbase.HBaseStorageHandler

I am trying to query an HBase table through Squirrel SQL. I created a Hive external table like the following:
create external table tweets_hbase(key string, value string)
stored by 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
with serdeproperties ("hbase.columns.mapping" = ":key,data:tweet_text")
tblproperties ("hbase.table.name" = "tweets_hbase")
I am able to query it through the command-line Hive client:
hive> select * from tweets_hbase;
OK
20160725001730109 {"createdat":"25-Jul-2016 12:17:03","tweet_date":"2016-07-25","text":"私のランドールスゴビ:) \n#abyssrium\nhts:t.co/NcKtQi9lzm ht/t.co/WNgQIxLU05","user":"uw_kyaaaan","uniqueid":1469420239464,"searchtag":"Apple"}
20160725001730266 {"createdat":"25-Jul-2016 12:17:03","tweet_date":"2016-07-25","text":"2016年7月24日\n8422 Steps\n移動距離 6.485 km\n消費カロリー 467.6 kcal\n\n#M7POPOPO ht/t.co/eFathZXTHD","user":"matsuwichi","uniqueid":1469420239465,"searchtag":"Apple"}
20160725001730308 {"createdat":"25-Jul-2016 12:17:03","tweet_date":"2016-07-25","text":"RT @JBCrewdotcom: Don't forget to leave a nice review for #Coldwater after purchasing! \niTunes: t.co/p5YKRwPKNw\nGoogle Play: ht…","user":"2016OLLGAndUGRL","uniqueid":1469420239466,"searchtag":"Apple"}
However, when I try to query through Squirrel SQL, I get an error in loading the storage handler. The necessary JARs have been added to the Extra Class Path:
hive-hbase-handler-1.1.0.jar
hbase-client-1.1.5.jar
hbase-common-1.1.5.jar
hbase-protocol-1.1.5.jar
hbase-server-1.1.5.jar
hive-jdbc-1.1.1-standalone.jar
Please help
java.sql.SQLException: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Error in loading storage handler.org.apache.hadoop.hive.hbase.HBaseStorageHandler
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:296)
at net.sourceforge.squirrel_sql.client.session.StatementWrapper.execute(StatementWrapper.java:165)
at net.sourceforge.squirrel_sql.client.session.SQLExecuterTask.processQuery(SQLExecuterTask.java:369)
at net.sourceforge.squirrel_sql.client.session.SQLExecuterTask.run(SQLExecuterTask.java:212)
at net.sourceforge.squirrel_sql.fw.util.TaskExecuter.run(TaskExecuter.java:82)
at java.lang.Thread.run(Unknown Source)
I solved this myself. The following is what I had to do:
Upgrade HBase to 1.2.2.
When starting the Thrift server, include the following jars via the --jars option:
./start-thriftserver.sh --hiveconf hive.server2.thrift.port=10001 \
  --hiveconf hive.server2.thrift.bind.host=xxx.xxx.xxx.xxx \
  --hiveconf spark.cores.max=2 \
  --master spark://xxx.xxx.xxx.xxx:7077 \
  --name ThriftServer \
  --jars file:///home/hadoop/software/apache-hive-1.2.1-bin/lib/hive-hbase-handler-1.2.1.jar,file:///home/hadoop/software/hbase-1.2.2/lib/hbase-common-1.2.2.jar,file:///home/hadoop/software/hbase-1.2.2/lib/hbase-protocol-1.2.2.jar,file:///home/hadoop/software/hbase-1.2.2/lib/hbase-client-1.2.2.jar,file:///home/hadoop/software/hbase-1.2.2/lib/guava-12.0.1.jar,file:///home/hadoop/software/hbase-1.2.2/lib/hbase-server-1.2.2.jar,file:///home/hadoop/software/hbase-1.2.2/lib/htrace-core-3.1.0-incubating.jar,file:///home/hadoop/software/hbase-1.2.2/lib/metrics-core-2.2.0.jar
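Once the Thrift server is up with those jars, a quick sanity check from the command line before going back to Squirrel (host and port assumed from the command above):
beeline -u jdbc:hive2://xxx.xxx.xxx.xxx:10001 -e 'select * from tweets_hbase limit 1;'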

sqoop-export is failing when I have \N as data

I am getting the below error when I run my sqoop export command.
This is the content to be exported by the sqoop command:
00001|Content|1|Content-article|\N|2015-02-1815:16:04|2015-02-1815:16:04|1 |\N|\N|\N|\N|\N|\N|\N|\N|\N
00002|Content|1|Content-article|\N|2015-02-1815:16:04|2015-02-1815:16:04|1 |\N|\N|\N|\N|\N|\N|\N|\N|\N
The sqoop command:
sqoop export --connect jdbc:postgresql://10.11.12.13:1234/db --table table1 --username user1 --password pass1 --export-dir /hivetables/table/ --fields-terminated-by '|' --lines-terminated-by '\n' -- --schema schema
15/06/09 08:05:16 INFO mapreduce.Job: Task Id : attempt_1431442954745_1210_m_000001_0, Status : FAILED
Error: java.io.IOException: Can't export data, please check failed map task logs
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.lang.RuntimeException: Can't parse input data: '\N'
at duser.__loadFromFields(duser.java:690)
at duser.parse(duser.java:558)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
... 10 more
Caused by: java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]
at java.sql.Timestamp.valueOf(Timestamp.java:202)
at duser.__loadFromFields(duser.java:627)
Can you help me resolve it?
Try adding these arguments to the export statement:
--input-null-string "\\\\N" --input-null-non-string "\\\\N"
From the documentation:
If --input-null-string is not specified, then the string "null" will be interpreted as null for string-type columns. If --input-null-non-string is not specified, then both the string "null" and the empty string will be interpreted as null for non-string columns.
If you don't add those arguments, it won't be able to understand that the \N in your data is actually null.
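Putting it together, the export command would look roughly like this (a sketch based on the command in the question; only the two --input-null flags are new):
sqoop export --connect jdbc:postgresql://10.11.12.13:1234/db --table table1 --username user1 --password pass1 --export-dir /hivetables/table/ --fields-terminated-by '|' --lines-terminated-by '\n' --input-null-string '\\N' --input-null-non-string '\\N' -- --schema schema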
The problem seems to be the order in which columns are being imported. Sqoop doesn't automatically understand the column mapping. Try using the --columns argument to specify the order in which the columns appear. Here's how to use it:
sqoop export --connect jdbc:postgresql://10.11.12.13:5432/reports ... --columns col1,col2,col3,...
See http://sqoop.apache.org/docs/1.4.6/SqoopUserGuide.html#_purpose_4 for documentation on how to use --columns.

Run an OS command and set its output to a Hive variable

Is it possible to run something like this in the Hive CLI?
I am trying to pass the contents of a file as a variable to another query.
set column_list=!cat /home/user/filename.lst ;
create table tabname as select $column_list from ...
If you have a query file, you pass the variables as hiveconf:
hive -hiveconf var1=abcd -f file.txt
Or you can construct your query and then pass it to the Hive CLI using -e:
hive -e "create table ..."
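A minimal sketch of that approach, reading the column list from the file first (source_table is a placeholder; tabname is taken from the question):
# read the column list from the file, then inline it into the query
column_list=$(cat /home/user/filename.lst)
hive -e "create table tabname as select ${column_list} from source_table"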
Given a file filename.lst containing:
line
make a file test.sh:
temp=$(cat /home/user/filename.lst)
hive -f test.hql -hiveconf var="$temp"
and another file test.hql:
create table test(${hiveconf:var} string);
Then, on the terminal, run:
sh -x test.sh
This passes the line to test.hql and creates a table with line as the column name.
Note: all files should be in the same directory. This script passes only one variable; see the sketch below for passing more than one.
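A hedged extension of the same idea for more than one variable (file layout and names are illustrative: one column name per line in filename.lst):
# pick individual lines out of the file and pass each as its own -hiveconf
col1=$(sed -n 1p /home/user/filename.lst)
col2=$(sed -n 2p /home/user/filename.lst)
hive -f test.hql -hiveconf var1="$col1" -hiveconf var2="$col2"
test.hql would then reference ${hiveconf:var1} and ${hiveconf:var2}.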