Hive on DSE 4.0 throwing NoSuchMethodError - hive

Getting this exception when using Hive on DSE 4.0.
Looks like the Hive version that is shipped with DSE 4.0 has a known issue:
https://issues.apache.org/jira/browse/HIVE-6962
Does anyone have a workaround? I tried a few that were mentioned in the JIRA, but they did not help.
hive> SHOW TABLES;
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.unset(Ljava/lang/String;)V
at org.apache.hadoop.hive.ql.exec.Utilities.createDirsWithPermission(Utilities.java:3416)
at org.apache.hadoop.hive.ql.exec.Utilities.createDirsWithPermission(Utilities.java:3401)
at org.apache.hadoop.hive.ql.Context.getScratchDir(Context.java:214)
at org.apache.hadoop.hive.ql.Context.getLocalScratchDir(Context.java:241)
at org.apache.hadoop.hive.ql.Context.getLocalTmpPath(Context.java:333)
at org.apache.hadoop.hive.ql.parse.DDLSemanticAnalyzer.analyzeInternal(DDLSemanticAnalyzer.java:296)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:391)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:291)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:944)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1009)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:880)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:870)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:792)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)

DSE 4.0 ships with Hive 0.12.0, so if you run a Hive 0.13.0 client against it, it will throw that exception.
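A quick way to confirm the mismatch is to check which Hive jars your DSE install actually ships before running an external Hive client against it. A minimal sketch, assuming a standard DSE layout (the paths are assumptions and differ between package and tarball installs):
# paths are assumptions; adjust for your install method
ls /usr/share/dse/resources/hive/lib/hive-exec-*.jar   # package install
ls $DSE_HOME/resources/hive/lib/hive-exec-*.jar        # tarball install
If these read 0.12.0 while your client-side Hive jars are 0.13.0, that mismatch is the likely source of the NoSuchMethodError described above.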

Related

Weblogic Domain creation error through script in putty

I am trying to create a WebLogic domain using silent mode through PuTTY. I have used the below command:
./config.sh -mode=silent -silent_xml=/home/ec2-user/createdomain.xml
I am getting the below error message while executing it:
Exception in thread "Thread-1" java.lang.IllegalStateException: No able to create the instance of the template catalog class com.oracle.cie.domain.template.catalog.impl.GlobalTemplateCat
at com.oracle.cie.domain.template.catalog.TemplateCatalogFactory.createGlobalTemplateCatalog(TemplateCatalogFactory.java:138)
at com.oracle.cie.domain.template.catalog.TemplateCatalogFactory.getGlobalCatalog(TemplateCatalogFactory.java:78)
at com.oracle.cie.domain.template.catalog.TemplateCatalogFactory.getGlobalCatalog(TemplateCatalogFactory.java:33)
at com.oracle.cie.wizard.domain.silent.tasks.LoadTemplateCatalogTask$1.run(LoadTemplateCatalogTask.java:23)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.oracle.cie.domain.template.catalog.TemplateCatalogFactory.createGlobalTemplateCatalog(TemplateCatalogFactory.java:133)
... 4 more
Caused by: com.oracle.cie.domain.env.EnvironmentServiceException: Failed to get inventory for /home/ec2-user/oracle/middleware/oracle_common/common/bin
at com.oracle.cie.domain.env.EnvironmentServiceImpl.init(EnvironmentServiceImpl.java:425)
at com.oracle.cie.domain.env.EnvironmentServiceImpl.<init>(EnvironmentServiceImpl.java:89)
at com.oracle.cie.domain.env.EnvironmentServiceImpl.getInstance(EnvironmentServiceImpl.java:364)
at com.oracle.cie.domain.env.EnvironmentServiceFactory.getEnvironmentService(EnvironmentServiceFactory.java:35)
at com.oracle.cie.domain.template.catalog.impl.OracleHomeLocator.getProductInstalDirs(OracleHomeLocator.java:31)
at com.oracle.cie.domain.template.catalog.impl.GlobalTemplateCat.populateProductCatalogs(GlobalTemplateCat.java:446)
at com.oracle.cie.domain.template.catalog.impl.GlobalTemplateCat.<init>(GlobalTemplateCat.java:90)
at com.oracle.cie.domain.template.catalog.impl.GlobalTemplateCat.<init>(GlobalTemplateCat.java:83)
... 9 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.oracle.cie.common.ReflectionHelper.process(ReflectionHelper.java:48)
at com.oracle.cie.domain.env.EnvironmentServiceImpl.init(EnvironmentServiceImpl.java:384)
... 16 more
Caused by: com.oracle.cie.gdr.external.InventoryException: com.oracle.cie.gdr.utils.GdrException: The gdr meta-data directory /home/ec2-user/oracle/middleware/oracle_common/common/bin/inventory is invalid or does not exist.
at com.oracle.cie.gdr.external.impl.OracleHomeInventoryImpl.<init>(OracleHomeInventoryImpl.java:55)
at com.oracle.cie.gdr.external.impl.OracleHomeInventoryFactory.createInventory(OracleHomeInventoryFactory.java:60)
at com.oracle.cie.gdr.external.InventoryFactory.getOracleHomeInventory(InventoryFactory.java:99)
... 22 more
Caused by: com.oracle.cie.gdr.utils.GdrException: The gdr meta-data directory /home/ec2-user/oracle/middleware/oracle_common/common/bin/inventory is invalid or does not exist.
at com.oracle.cie.gdr.MetaDataHome.init(MetaDataHome.java:206)
at com.oracle.cie.gdr.MetaDataHome.<init>(MetaDataHome.java:188)
at com.oracle.cie.gdr.MetaDataHome.<init>(MetaDataHome.java:172)
at com.oracle.cie.gdr.MetaDataHome.<init>(MetaDataHome.java:157)
at com.oracle.cie.gdr.MetaDataHome.<init>(MetaDataHome.java:144)
at com.oracle.cie.gdr.MetaDataHome.<init>(MetaDataHome.java:86)
at com.oracle.cie.gdr.Home.getMetaDataHome(Home.java:619)
Which WebLogic version are you using? I have not seen a silent script used to create domains for a while. If you are trying to do this on WebLogic 12c, it won't work, as this kind of script was only available in older versions such as 8 and 9, as far as I remember.
If you want to automate domain provisioning for versions such as 12c, you should use a newer approach. Here are a few options.
You can use Ansible, WLST and Python to create the domain. You can see an example here: https://github.com/textanalyticsman/ansible-soa
You can use WebLogic Deploy Tooling, an open-source tool provided by Oracle; you can find it here: https://github.com/oracle/weblogic-deploy-tooling
The combination of WebLogic Deploy Tooling and Ansible is also a good option, as shown in https://github.com/textanalyticsman/ansible-soa-wldt
You can also try the WebLogic Kubernetes Operator: https://oracle.github.io/weblogic-kubernetes-operator/userguide/managing-domains/domain-resource/
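If you go the WLST route, the domain-creation logic lives in a WLST script that you run offline with the wlst.sh shipped under oracle_common. A minimal sketch, assuming a 12c ORACLE_HOME layout and a hypothetical script named create_domain.py:
# ORACLE_HOME and the script name are assumptions for illustration
$ORACLE_HOME/oracle_common/common/bin/wlst.sh create_domain.py
The linked Ansible examples wrap this kind of WLST call in playbooks so the provisioning is repeatable.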

Hive-HBase integration not working

Hive-HBase integration is not working with the versions below.
Hive - 1.2.1
Hbase - 1.2.3
We are able to create a view for an HBase table using HBaseStorageHandler, but we are not able to insert data into HBase through that view. Below is the exception:
Error: java.lang.RuntimeException: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Put.setDurability(Lorg/apache/hadoop/hbase/client/Durability;)V
at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:172)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Put.setDurability(Lorg/apache/hadoop/hbase/client/Durability;)V
at org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat$MyRecordWriter.write(HiveHBaseTableOutputFormat.java:142)
at org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat$MyRecordWriter.write(HiveHBaseTableOutputFormat.java:117)
at org.apache.hadoop.hive.ql.io.HivePassThroughRecordWriter.write(HivePassThroughRecordWriter.java:40)
at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:753)
at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:837)
at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:88)
at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:837)
at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:97)
at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:162)
at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:508)
at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:163)
... 8 more
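Since a NoSuchMethodError almost always means the class loaded at runtime is older than the one the calling code was compiled against, a useful first step is to see which HBase jars Hive and the MapReduce tasks can actually pick up. A minimal diagnostic sketch, with the paths as assumptions:
# look for duplicate or older hbase jars on the Hive and Hadoop classpaths
find $HIVE_HOME/lib $HADOOP_HOME -name 'hbase*.jar' 2>/dev/null
If more than one HBase version shows up, aligning them (or pointing Hive at the right jars via hive.aux.jars.path) is typically the next thing to try.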

Why do I keep getting an exception java.lang.NoClassDefFoundError: org/apache/lucene/codecs/lucene50/Lucene50Codec?

I am using Lucene 5.4 and recently wanted to migrate a project to the Spring Framework.
If I invoke my indexing code from a Java main method it works with no errors, but when I deploy the code on Tomcat 9.0 it fails with the following error. The WEB-INF/lib folder has four Lucene jars: lucene-core-5.4.0.jar, lucene-facet-5.4.0.jar, lucene-queries-5.4.0.jar and lucene-queryparser-5.4.0.jar. I think these four jars should be enough for document indexing, right? Also, since I am using Lucene 5.4, why does the code try to find the Lucene50Codec class rather than Lucene54Codec?
Tomcat Exception report
message Handler processing failed; nested exception is java.lang.NoClassDefFoundError: org/apache/lucene/codecs/lucene50/Lucene50Codec
description The server encountered an internal error that prevented it from fulfilling this request.
exception
org.springframework.web.util.NestedServletException: Handler processing failed; nested exception is java.lang.NoClassDefFoundError: org/apache/lucene/codecs/lucene50/Lucene50Codec
org.springframework.web.servlet.DispatcherServlet.triggerAfterCompletionWithError(DispatcherServlet.java:1302)
org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:977)
org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:893)
org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:969)
org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:860)
javax.servlet.http.HttpServlet.service(HttpServlet.java:622)
org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:845)
javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:53)
root cause
java.lang.NoClassDefFoundError: org/apache/lucene/codecs/lucene50/Lucene50Codec
java.lang.Class.getDeclaredConstructors0(Native Method)
java.lang.Class.privateGetDeclaredConstructors(Class.java:2671)
java.lang.Class.getConstructor0(Class.java:3075)
java.lang.Class.newInstance(Class.java:412)
org.apache.lucene.util.NamedSPILoader.reload(NamedSPILoader.java:67)
org.apache.lucene.util.NamedSPILoader.<init>(NamedSPILoader.java:47)
org.apache.lucene.util.NamedSPILoader.<init>(NamedSPILoader.java:37)
org.apache.lucene.codecs.Codec$Holder.<clinit>(Codec.java:47)
org.apache.lucene.codecs.Codec.getDefault(Codec.java:140)
org.apache.lucene.index.LiveIndexWriterConfig.<init>(LiveIndexWriterConfig.java:120)
org.apache.lucene.index.IndexWriterConfig.<init>(IndexWriterConfig.java:140)
com.zhaoyun.r3ds.core.lucene.LuceneFactoryImpl.createWriter(LuceneFactoryImpl.java:113)
com.zhaoyun.r3ds.core.engine.SearchEngineImpl.getImageWriter(SearchEngineImpl.java:87)
com.zhaoyun.r3ds.core.engine.ImageEngine.addImageDocument(ImageEngine.java:50)
com.zhaoyun.r3ds.restful.controller.SemanticController.index(SemanticController.java:43)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:497)
org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:222)
org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:137)
org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:110)
org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:814)
org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:737)
org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85)
org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:959)
org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:893)
org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:969)
org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:860)
javax.servlet.http.HttpServlet.service(HttpServlet.java:622)
org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:845)
javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:53)
The index might have been written with an earlier version of Lucene, and that codec is no longer available in Lucene 5.4.
You need to include the lucene-backward-codecs-5.4.0.jar file as well.
Alternatively, you might have multiple versions of Lucene on Tomcat's classpath, where some are version 5.0 and some are version 5.4. You should make sure that there is only one version of Lucene on Tomcat's classpath.
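A minimal sketch of both checks (the Tomcat and webapp paths are assumptions):
# add the backward-compatibility codecs next to the other 5.4.0 jars
cp lucene-backward-codecs-5.4.0.jar /path/to/tomcat/webapps/myapp/WEB-INF/lib/
# then look for mixed Lucene versions anywhere Tomcat can load them from
find /path/to/tomcat -name 'lucene-*.jar'
If the second command turns up any 5.0.x jars alongside the 5.4.0 ones, remove the older set.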

Pig and Jython - Can't Register UDF

I am trying to write a Python UDF; I am using the DataStax package for that. When I try to write a simple UDF such as:
@outputSchema("word:chararray")
def helloworld():
    return 'Hello, World'
And then register it in the grunt shell:
REGISTER 'pig.py' USING org.apache.pig.scripting.jython.JythonScriptEngine as myfuncs;
I get the following error:
ERROR 2998: Unhandled internal error. org/python/core/PyObject
java.lang.NoClassDefFoundError: org/python/core/PyObject
at org.apache.pig.scripting.jython.JythonScriptEngine.registerFunctions(JythonScriptEngine.java:304)
at org.apache.pig.PigServer.registerCode(PigServer.java:534)
at org.apache.pig.tools.grunt.GruntParser.processRegister(GruntParser.java:423)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:419)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:190)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:166)
at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
at org.apache.pig.Main.run(Main.java:490)
at org.apache.pig.Main.main(Main.java:111)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.lang.ClassNotFoundException: Class org.python.core.PyObject not found in modules [ModuleClassLoader:Ana$
at com.datastax.bdp.loader.SystemClassLoader.loadClass(SystemClassLoader.java:120)
at com.datastax.bdp.loader.ModuleClassLoader.loadClass(ModuleClassLoader.java:38)
at com.datastax.bdp.loader.ModuleClassLoader.loadClass(ModuleClassLoader.java:32)
... 14 more
Does anyone know what could be causing this error?
Add $PIG_HOME/lib/jython.jar to your PIG_CLASSPATH environment variable.
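A minimal sketch of that, assuming the jar lives where the Pig distribution normally puts it:
# make the Jython runtime visible to Pig before starting the grunt shell
export PIG_CLASSPATH=$PIG_HOME/lib/jython.jar:$PIG_CLASSPATH
pig
After that, the REGISTER statement above should load the script without the NoClassDefFoundError.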

Hive 0.10.0 Exception in thread "main" java.lang.NoSuchMethodError: org.apache.thrift.EncodingUtils.setBit(BIZ)B

Could you help me? I am using Hive 0.10.0.
hive> show tables;
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.thrift.EncodingUtils.setBit(BIZ)B
at org.apache.hadoop.hive.ql.plan.api.Query.setStartedIsSet(Query.java:487)
at org.apache.hadoop.hive.ql.plan.api.Query.setStarted(Query.java:474)
at org.apache.hadoop.hive.ql.QueryPlan.updateCountersInQueryPlan(QueryPlan.java:309)
at org.apache.hadoop.hive.ql.QueryPlan.getQueryPlan(QueryPlan.java:450)
at org.apache.hadoop.hive.ql.QueryPlan.toString(QueryPlan.java:622)
at org.apache.hadoop.hive.ql.history.HiveHistory.logPlanProgress(HiveHistory.java:503)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1097)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:973)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:893)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
This issue occurs because of an incompatible "libthrift" jar version. I downloaded the latest libthrift-0.9.3.jar, and it worked for me.
I faced a similar issue. The version of Hive used was not compatible with Hadoop: the Thrift version used by Hadoop was different from the one used by Hive. It is best to use a compatible version of Hive, or to replace the Thrift jar used by Hadoop with the one used by Hive.
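A minimal sketch of checking which libthrift versions each side bundles (the lib locations are assumptions):
# the versions listed here should match, or at least be compatible
find $HIVE_HOME/lib $HADOOP_HOME/lib -name 'libthrift-*.jar'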
When I faced this problem, this was my situation:
In HADOOP_HOME/lib I had placed mahout-examples-0.7-job.jar for some other exercises, where it is not supposed to be.
When I ran Hive, it threw the same error as in your question.
I moved mahout.X.y.jar out of lib, then started the Hive CLI, and it worked like a charm.
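A minimal sketch of that cleanup (the path is an assumption; move the jar aside rather than deleting it):
# get the stray Mahout jar off Hadoop's classpath
mv $HADOOP_HOME/lib/mahout-examples-0.7-job.jar ~/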