Error when starting beeline because of Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf - hive

I'm working on Windows 8 and have installed Hadoop and Hive successfully. HiveServer2 is running on 10002, and Hive is running as well.
But when I try to connect with Beeline by entering this command:
beeline !connect jdbc:hive2://localhost:10000/ I get this error:
File not found
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
I can't figure out the source of the error.
I'm using MySQL for the metastore.
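As a diagnostic sketch (the paths and jar name below are assumptions, not from the question): since the launcher cannot load org.apache.hadoop.hive.conf.HiveConf, one quick check is whether the jar that normally contains that class (hive-common-<version>.jar) is actually present and readable under the Hive lib directory. Run from Git Bash or a similar Unix-style shell on Windows, assuming a single matching jar:
$ ls "$HIVE_HOME"/lib/hive-common-*.jar
$ jar tf "$HIVE_HOME"/lib/hive-common-*.jar | grep HiveConf.class
If the class does not show up there, HIVE_HOME or the classpath that beeline builds is probably pointing at the wrong place.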

Related

Apache hive beeline error: JAVA Error: A JNI error has occurred, please check your installation and try again

When I run the beeline command that came with Apache Hive 3.1.2 I get an error that says:
PS C:\Users\bluet> beeline.cmd
File Not Found
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hive/jdbc/JdbcUriParseException
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:650)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:632)
Caused by: java.lang.ClassNotFoundException: org.apache.hive.jdbc.JdbcUriParseException
at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 7 more
Hadoop is running smoothly, including HDFS and YARN.
I am on Java 8.
What could be the problem?
I also received the same error. When I checked the file beeline.cmd, I saw that it needs the hive-jdbc standalone jar (hive-jdbc-*-standalone.jar). Downloading that jar and placing it in C:\Users-----\apache-hive-3.1.2-bin\lib solved the issue.
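As a quick check (a sketch, not from the original answer; the path and wildcard are illustrative), you can confirm from Git Bash or a similar shell that the standalone jar landed in the lib directory beeline.cmd scans and that it actually contains the missing class:
$ ls "$HIVE_HOME"/lib/hive-jdbc-*-standalone.jar
$ jar tf "$HIVE_HOME"/lib/hive-jdbc-*-standalone.jar | grep JdbcUriParseException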

Apache Ignite ignite-spring dependency error

I have added the ignite-spring 2.3.0 dependency to my Maven project, and then this exception occurred: java.util.concurrent.ExecutionException.
I have moved the ignite-spring jar to the libs folder of the Ignite installation.
Am I missing some project configuration, or is there a problem with creating threads?
Detailed error message:
SEVERE: A child container failed during start
java.util.concurrent.ExecutionException: org.apache.catalina.LifecycleException: Failed to start component [StandardEngine[Catalina].StandardHost[localhost].StandardContext[/FrontEndService]]
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:192)
at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:939)
at org.apache.catalina.core.StandardHost.startInternal(StandardHost.java:872)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1419)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1409)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.catalina.LifecycleException: Failed to start component [StandardEngine[Catalina].StandardHost[localhost].StandardContext[/FrontEndService]]
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:167)
... 6 more
Caused by: org.apache.catalina.LifecycleException: Failed to start component [org.apache.catalina.webresources.StandardRoot@7e28b64b]
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:167)
at org.apache.catalina.core.StandardContext.resourcesStart(StandardContext.java:4869)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5001)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
... 6 more
Caused by: org.apache.catalina.LifecycleException: Failed to initialize component [org.apache.catalina.webresources.JarResourceSet@3361151a]
at org.apache.catalina.util.LifecycleBase.init(LifecycleBase.java:112)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:140)
at org.apache.catalina.webresources.StandardRoot.startInternal(StandardRoot.java:709)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
... 9 more
Caused by: java.lang.IllegalArgumentException: java.util.zip.ZipException: invalid LOC header (bad signature)
at org.apache.catalina.webresources.AbstractSingleArchiveResourceSet.initInternal(AbstractSingleArchiveResourceSet.java:113)
at org.apache.catalina.util.LifecycleBase.init(LifecycleBase.java:107)
... 12 more
Caused by: java.util.zip.ZipException: invalid LOC header (bad signature)
at java.util.zip.ZipFile.read(Native Method)
at java.util.zip.ZipFile.access$1400(ZipFile.java:60)
at java.util.zip.ZipFile$ZipFileInputStream.read(ZipFile.java:717)
at java.util.zip.ZipFile$ZipFileInflaterInputStream.fill(ZipFile.java:419)
at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:158)
at sun.misc.IOUtils.readFully(IOUtils.java:65)
at java.util.jar.JarFile.getBytes(JarFile.java:425)
at java.util.jar.JarFile.getManifestFromReference(JarFile.java:193)
at java.util.jar.JarFile.getManifest(JarFile.java:180)
at org.apache.catalina.webresources.AbstractSingleArchiveResourceSet.initInternal(AbstractSingleArchiveResourceSet.java:111)
... 13 more
I think you have a corrupted (probably empty) JAR file in the libs directory of the web app. Check them all with jar tf.
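For example, a small loop like the following (a sketch; the webapp path is an assumption based on the /FrontEndService context in the trace) runs jar tf over every jar in the webapp's lib directory and prints the ones that cannot be read, which will usually include the archive behind the "invalid LOC header" error:
for f in /path/to/FrontEndService/WEB-INF/lib/*.jar; do
  jar tf "$f" > /dev/null 2>&1 || echo "corrupted: $f"
done
Re-downloading or re-copying whichever jar it flags should let Tomcat start the context.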

React Native no such method- setCallWebSocket

React Native on Android fails with an ANR / fatal exception:
java.lang.RuntimeException: Could not invoke WebSocketModule.connect
at com.facebook.react.bridge.BaseJavaModule$JavaMethod.invoke(BaseJavaModule.java:382)
at com.facebook.react.cxxbridge.JavaModuleWrapper.invoke(JavaModuleWrapper.java:138)
at com.facebook.react.bridge.queue.NativeRunnable.run(Native Method)
at android.os.Handler.handleCallback(Handler.java:751)
at android.os.Handler.dispatchMessage(Handler.java:95)
at com.facebook.react.bridge.queue.MessageQueueThreadHandler.dispatchMessage(MessageQueueThreadHandler.java:31)
at android.os.Looper.loop(Looper.java:154)
at com.facebook.react.bridge.queue.MessageQueueThreadImpl$3.run(MessageQueueThreadImpl.java:208)
at java.lang.Thread.run(Thread.java:761)
Caused by: java.lang.reflect.InvocationTargetException
at java.lang.reflect.Method.invoke(Native Method)
at com.facebook.react.bridge.BaseJavaModule$JavaMethod.invoke(BaseJavaModule.java:368)
at com.facebook.react.cxxbridge.JavaModuleWrapper.invoke(JavaModuleWrapper.java:138) 
at com.facebook.react.bridge.queue.NativeRunnable.run(Native Method) 
at android.os.Handler.handleCallback(Handler.java:751) 
at android.os.Handler.dispatchMessage(Handler.java:95) 
at com.facebook.react.bridge.queue.MessageQueueThreadHandler.dispatchMessage(MessageQueueThreadHandler.java:31) 
at android.os.Looper.loop(Looper.java:154) 
at com.facebook.react.bridge.queue.MessageQueueThreadImpl$3.run(MessageQueueThreadImpl.java:208) 
at java.lang.Thread.run(Thread.java:761) 
Caused by: java.lang.NoSuchMethodError: No virtual method setCallWebSocket(Lokhttp3/Call;)V in class Lokhttp3/internal/Internal; or its super classes (declaration of 'okhttp3.internal.Internal' appears in /data/app/com.corpay.mwallet.sit.dev-2/base.apk:classes23.dex)
Your react-native bundling server may have become corrupted. Try resetting it by killing the local server and running the following commands in your terminal, then run your server again.
$ rm -rf $TMPDIR/react-*
$ watchman watch-del-all
$ yarn cache clean

Failed to create InputInitializerManager error - TEZ on HIVE

I have installed Apache Tez 0.8.1, Hadoop 2.7.0 and Hive 2.0.1. I am able to run MapReduce jobs successfully, but when I configure Hive to use Tez and try to run a simple count query, it returns the error below. From the error it seems to be looking for a class from a jar; I have placed that jar on the classpath, but the error is still not resolved.
Please help me resolve this. Thanks in advance!
hive> select count(*) from sample1;
Query ID = root_20160728215555_a58e91a6-8913-4a57-8715-bc1739a2cb02
Total jobs = 1
Launching Job 1 out of 1
----------------------------------------------------------------------------------------------
VERTICES MODE STATUS TOTAL COMPLETED RUNNING PENDING FAILED KILLED
----------------------------------------------------------------------------------------------
Map 1 container FAILED -1 0 0 -1 0 0
Reducer 2 container KILLED 1 0 0 1 0 0
----------------------------------------------------------------------------------------------
VERTICES: 00/02 [>>--------------------------] 0% ELAPSED TIME: 0.17 s
----------------------------------------------------------------------------------------------
Status: Failed
Vertex failed, vertexName=Map 1, vertexId=vertex_1469720608711_0011_1_00, diagnostics=[Vertex vertex_1469720608711_0011_1_00 [Map 1] killed/failed due to:INIT_FAILURE, Fail to create InputInitializerManager, org.apache.tez.dag.api.TezReflectionException: Unable to instantiate class with 1 arguments: org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator
at org.apache.tez.common.ReflectionUtils.getNewInstance(ReflectionUtils.java:70)
at org.apache.tez.common.ReflectionUtils.createClazzInstance(ReflectionUtils.java:89)
at org.apache.tez.dag.app.dag.RootInputInitializerManager.createInitializer(RootInputInitializerManager.java:138)
at org.apache.tez.dag.app.dag.RootInputInitializerManager.runInputInitializers(RootInputInitializerManager.java:115)
at org.apache.tez.dag.app.dag.impl.VertexImpl.setupInputInitializerManager(VertexImpl.java:4676)
at org.apache.tez.dag.app.dag.impl.VertexImpl.access$4300(VertexImpl.java:204)
at org.apache.tez.dag.app.dag.impl.VertexImpl$InitTransition.handleInitEvent(VertexImpl.java:3445)
at org.apache.tez.dag.app.dag.impl.VertexImpl$InitTransition.transition(VertexImpl.java:3394)
at org.apache.tez.dag.app.dag.impl.VertexImpl$InitTransition.transition(VertexImpl.java:3375)
at org.apache.hadoop.yarn.state.StateMachineFactory$MultipleInternalArc.doTransition(StateMachineFactory.java:385)
at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
at org.apache.tez.state.StateMachineTez.doTransition(StateMachineTez.java:57)
at org.apache.tez.dag.app.dag.impl.VertexImpl.handle(VertexImpl.java:1975)
at org.apache.tez.dag.app.dag.impl.VertexImpl.handle(VertexImpl.java:203)
at org.apache.tez.dag.app.DAGAppMaster$VertexEventDispatcher.handle(DAGAppMaster.java:2090)
at org.apache.tez.dag.app.DAGAppMaster$VertexEventDispatcher.handle(DAGAppMaster.java:2076)
at org.apache.tez.common.AsyncDispatcher.dispatch(AsyncDispatcher.java:183)
at org.apache.tez.common.AsyncDispatcher$1.run(AsyncDispatcher.java:114)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at org.apache.tez.common.ReflectionUtils.getNewInstance(ReflectionUtils.java:68)
... 20 more
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/mapred/split/SplitLocationProvider
at org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator.<init>(HiveSplitGenerator.java:96)
... 25 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapred.split.SplitLocationProvider
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 26 more
]
Vertex killed, vertexName=Reducer 2, vertexId=vertex_1469720608711_0011_1_01, diagnostics=[Vertex received Kill in NEW state., Vertex vertex_1469720608711_0011_1_01 [Reducer 2] killed/failed due to:OTHER_VERTEX_FAILURE]
DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:1
UPDATE:
After facing the above issue, I copied hadoop-core-1.2.1.jar into the Hive lib folder. After that I am facing another issue while starting Hive. From the trace I can see that an illegal argument is being passed.
Exception in thread "main" java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1550)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3080)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3108)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:543)
at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:516)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:648)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1548)
... 14 more
Caused by: java.lang.IllegalArgumentException: Unrecognized Hadoop major version number: 1.2.1
at org.apache.hadoop.hive.shims.ShimLoader.getMajorVersion(ShimLoader.java:165)
at org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:132)
at org.apache.hadoop.hive.shims.ShimLoader.getHadoopShims(ShimLoader.java:93)
at org.apache.hadoop.hive.metastore.ObjectStore.getDataSourceProps(ObjectStore.java:376)
at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:268)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:517)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:482)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:544)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:370)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:219)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:67)
The issue is with the Apache Tez version: Apache Tez 0.8.1 is not compatible with Hadoop 2.7.0 and Hive 2.0.1.
I downloaded the Tez 0.8.4 source and built it, which resolved the issue.
Thanks!
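For reference, building Tez from source is the standard Maven build described in the Tez install docs (a sketch, assuming protoc and the other Tez build prerequisites are already installed):
$ mvn clean package -DskipTests=true -Dmaven.javadoc.skip=true
After the build, the generated Tez tarball still has to be uploaded to HDFS and referenced through tez.lib.uris in tez-site.xml, as in the normal Tez installation procedure.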

Pig script running into Java heap space error 2997

Here is the Pig stack trace. I run my code over each day's data individually; it fails on some days, while others finish within 5 minutes. I have about ten GROUP ALL ... PARALLEL 1 statements at the end to do some counts. Is this the reason for the error?
Backend error message
Error: Java heap space
Pig Stack Trace
ERROR 2997: Unable to recreate exception from backed error: Error: Java heap space
org.apache.pig.backend.executionengine.ExecException: ERROR 0: org.apache.pig.backend.executionengine.ExecException: ERROR 2997: Unable to recreate exception from backed error: Error: Java heap space
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.getStats(MapReduceLauncher.java:819)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:452)
at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.launchPig(HExecutionEngine.java:282)
at org.apache.pig.PigServer.launchPlan(PigServer.java:1431)
at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1416)
at org.apache.pig.PigServer.execute(PigServer.java:1405)
at org.apache.pig.PigServer.executeBatch(PigServer.java:456)
at org.apache.pig.PigServer.executeBatch(PigServer.java:439)
at org.apache.pig.tools.grunt.GruntParser.executeBatch(GruntParser.java:171)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:234)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:205)
at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:81)
at org.apache.pig.Main.run(Main.java:495)
at org.apache.pig.Main.main(Main.java:170)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.pig.backend.executionengine.ExecException: ERROR 2997: Unable to recreate exception from backed error: Error: Java heap space
at org.apache.pig.backend.hadoop.executionengine.Launcher.getErrorMessages(Launcher.java:215)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.getStats(MapReduceLauncher.java:803)
... 19 more
================================================================================
Pig Stack Trace
ERROR 2244: Job failed, hadoop does not return any error message
org.apache.pig.backend.executionengine.ExecException: ERROR 2244: Job failed, hadoop does not return any error message
at org.apache.pig.tools.grunt.GruntParser.executeBatch(GruntParser.java:179)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:234)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:205)
at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:81)
at org.apache.pig.Main.run(Main.java:495)
at org.apache.pig.Main.main(Main.java:170)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)