Spark read from S3 hangs (Caused by: java.net.UnknownHostException) - amazon-s3

Sometimes my Spark job runs without a problem, and sometimes it gets stuck with a networking-related error, e.g.:
...
Caused by: java.net.UnknownHostException: MYBUCKET.s3.amazonaws.com
...
Are there any guidelines or tips for reading files hosted on S3? My dataset is about 3,700 files, roughly 50 MB each. They are gzipped JSON files.
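For reference, reading such a dataset with PySpark typically looks something like the sketch below (illustrative only; the bucket name and path are placeholders, and the s3n:// scheme is an assumption based on the jets3t/commons-httpclient frames in the stack trace):

from pyspark import SparkContext
import json

sc = SparkContext(appName="read-gzipped-json-from-s3")

# Placeholder bucket and prefix. Spark decompresses .gz files transparently,
# but each gzipped file is read as a single, non-splittable partition.
raw = sc.textFile("s3n://MYBUCKET/path/to/data/*.json.gz")

# Assumes one JSON document per line (JSON Lines); adjust if each file
# instead holds a single large JSON object.
records = raw.map(json.loads)
print(records.take(1))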
Here is the full stack trace for the UnknownHostException above.
15/10/09 13:47:21 INFO scheduler.TaskSetManager: Lost task 3.3 in stage 12.0 (TID 146) on executor 172.24.12.183: java.net.UnknownHostException (kaidee-tracking.s3.amazonaws.com) [duplicate 10]
py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.runJob.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 12.0 failed 4 times, most recent failure: Lost task 0.3 in stage 12.0 (TID 147, 172.24.12.183): java.net.UnknownHostException: kaidee-tracking.s3.amazonaws.com
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:178)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:579)
at sun.security.ssl.SSLSocketImpl.connect(SSLSocketImpl.java:618)
at sun.security.ssl.SSLSocketImpl.<init>(SSLSocketImpl.java:451)
at sun.security.ssl.SSLSocketFactoryImpl.createSocket(SSLSocketFactoryImpl.java:140)
at org.apache.commons.httpclient.protocol.SSLProtocolSocketFactory.createSocket(SSLProtocolSocketFactory.java:82)
at org.apache.commons.httpclient.protocol.ControllerThreadSocketFactory$1.doit(ControllerThreadSocketFactory.java:91)
at org.apache.commons.httpclient.protocol.ControllerThreadSocketFactory$SocketTask.run(ControllerThreadSocketFactory.java:158)
at java.lang.Thread.run(Thread.java:744)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1280)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1268)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1267)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1267)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:697)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1493)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1455)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1444)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:567)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1813)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1826)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1839)
at org.apache.spark.api.python.PythonRDD$.runJob(PythonRDD.scala:361)
at org.apache.spark.api.python.PythonRDD.runJob(PythonRDD.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
at py4j.Gateway.invoke(Gateway.java:259)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:207)
at java.lang.Thread.run(Thread.java:744)
Caused by: java.net.UnknownHostException: MYBUCKET.s3.amazonaws.com
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:178)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:579)
at sun.security.ssl.SSLSocketImpl.connect(SSLSocketImpl.java:618)
at sun.security.ssl.SSLSocketImpl.<init>(SSLSocketImpl.java:451)
at sun.security.ssl.SSLSocketFactoryImpl.createSocket(SSLSocketFactoryImpl.java:140)
at org.apache.commons.httpclient.protocol.SSLProtocolSocketFactory.createSocket(SSLProtocolSocketFactory.java:82)
at org.apache.commons.httpclient.protocol.ControllerThreadSocketFactory$1.doit(ControllerThreadSocketFactory.java:91)
at org.apache.commons.httpclient.protocol.ControllerThreadSocketFactory$SocketTask.run(ControllerThreadSocketFactory.java:158)
... 1 more

Related

wildfly+intelij Error running admin process

I'm using wildfly-8.2.0.Final. It works fine on my work PC, but when I try to deploy it on my home PC I get this error:
Error running admin process:
Message: java.lang.NoClassDefFoundError: org/wildfly/security/password/Password
Stack trace:
com.intellij.javaee.process.common.JavaeeProcessUtilException: java.lang.NoClassDefFoundError: org/wildfly/security/password/Password
at com.intellij.javaee.process.common.MethodInvocator.invoke(MethodInvocator.java:47)
at com.intellij.javaee.oss.process.JavaeeProcess.processRequest(JavaeeProcess.java:112)
at com.intellij.javaee.oss.process.JavaeeProcess.run(JavaeeProcess.java:52)
at com.intellij.javaee.oss.process.JavaeeProcess.main(JavaeeProcess.java:31)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.CommandLineWrapper.main(CommandLineWrapper.java:66)
Caused by: java.lang.NoClassDefFoundError: org/wildfly/security/password/Password
at com.intellij.javaee.oss.jboss.agent.WildFly11Agent.createAuthHandler(WildFly11Agent.java:15)
at com.intellij.javaee.oss.jboss.agent.JBoss7Agent.doConnect(JBoss7Agent.java:49)
at com.intellij.javaee.oss.agent.SimpleAgentBase$1.doJob(SimpleAgentBase.java:24)
at com.intellij.javaee.oss.agent.SimpleAgentBase$1.doJob(SimpleAgentBase.java:20)
at com.intellij.javaee.oss.agent.SimpleAgentJob.perform(SimpleAgentJob.java:12)
at com.intellij.javaee.oss.agent.SimpleAgentBase.connect(SimpleAgentBase.java:33)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.javaee.process.common.MethodInvocator.invoke(MethodInvocator.java:41)
... 8 more
Caused by: java.lang.ClassNotFoundException: org.wildfly.security.password.Password
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 19 more
19:06:24,746 INFO [org.jboss.as.connector.subsystems.datasources] (MSC service thread 1-4) JBAS010400: Bound data source [java:jboss/datasources/ehospital]
I am using:
IntelliJ IDEA,
wildfly-8.2.0.Final,
jdk1.8.0_231
Has anyone faced this error?
The problem was with the PostgreSQL version and driver. I downloaded the driver matching my PostgreSQL version and replaced the one in JBOSS_HOME\modules\org\postgres\main. That helped me.

Not able to start hive on cloudera vm

I am a beginner with big data systems.
I have installed the Cloudera QuickStart VM 5.13.0 in Oracle VirtualBox.
I issued the following command to start Hive:
[cloudera@quickstart ~]$ hive
But I get the following error and Hive does not start.
It seems it is not able to connect to the Hive metastore, but I haven't made any changes to the VM installation.
Please let me know of any manual changes I need to make to start Hive.
Thanks.
Logging initialized using configuration in file:/etc/hive/conf.dist/hive-log4j.properties
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html#release for an explanation.
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:833)
at org.apache.hadoop.hive.ql.session.SessionState.getAuthorizationMode(SessionState.java:1679)
at org.apache.hadoop.hive.ql.session.SessionState.isAuthorizationModeV2(SessionState.java:1690)
at org.apache.hadoop.hive.ql.processors.CommandUtil.authorizeCommand(CommandUtil.java:55)
at org.apache.hadoop.hive.ql.processors.AddResourceProcessor.run(AddResourceProcessor.java:66)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:281)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:177)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:389)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:324)
at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:422)
at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:438)
at org.apache.hadoop.hive.cli.CliDriver.processInitFiles(CliDriver.java:472)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:723)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:699)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:634)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:391)
at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:810)
... 20 more
Caused by: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProviderBase.setConf(HiveAuthorizationProviderBase.java:114)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:388)
... 21 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:220)
at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:338)
at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:299)
at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:274)
at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:256)
at org.apache.hadoop.hive.ql.security.authorization.DefaultHiveAuthorizationProvider.init(DefaultHiveAuthorizationProvider.java:29)
at org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProviderBase.setConf(HiveAuthorizationProviderBase.java:112)
... 24 more
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1562)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:67)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:82)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3399)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3418)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3643)
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:231)
at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:215)
... 30 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1560)
... 37 more
Caused by: MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused
at org.apache.thrift.transport.TSocket.open(TSocket.java:226)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:464)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:244)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1560)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:67)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:82)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3399)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3418)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3643)
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:231)
at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:215)
at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:338)
at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:299)
at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:274)
at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:256)
at org.apache.hadoop.hive.ql.security.authorization.DefaultHiveAuthorizationProvider.init(DefaultHiveAuthorizationProvider.java:29)
at org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProviderBase.setConf(HiveAuthorizationProviderBase.java:112)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:388)
at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:810)
at org.apache.hadoop.hive.ql.session.SessionState.getAuthorizationMode(SessionState.java:1679)
at org.apache.hadoop.hive.ql.session.SessionState.isAuthorizationModeV2(SessionState.java:1690)
at org.apache.hadoop.hive.ql.processors.CommandUtil.authorizeCommand(CommandUtil.java:55)
at org.apache.hadoop.hive.ql.processors.AddResourceProcessor.run(AddResourceProcessor.java:66)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:281)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:177)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:389)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:324)
at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:422)
at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:438)
at org.apache.hadoop.hive.cli.CliDriver.processInitFiles(CliDriver.java:472)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:723)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:699)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:634)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.net.ConnectException: Connection refused
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:579)
at org.apache.thrift.transport.TSocket.open(TSocket.java:221)
... 45 more
)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:512)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:244)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
... 42 more

FAILED: SemanticException org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeExcept

I am using a CDH 5.13.0 environment.
Whenever I try to execute the hive command it shows the error:
FAILED: SemanticException org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
I checked hive-metastore.log, and it shows:
2018-05-02 06:15:53,225 ERROR [main]: Datastore.Schema (Log4JLogger.java:error(125)) - Failed initialising database.
Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to create database 'metastore_db', see the next exception for details.
at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
at java.sql.DriverManager.getConnection(DriverManager.java:571)
at java.sql.DriverManager.getConnection(DriverManager.java:187)
at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
at java.security.AccessController.doPrivileged(Native Method)
at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:418)
at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:447)
at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:342)
at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:298)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:60)
at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:69)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:682)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:660)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:709)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:508)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6474)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6469)
at org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:6719)
at org.apache.hadoop.hive.metastore.HiveMetaStore.main(HiveMetaStore.java:6646)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: ERROR XJ041: Failed to create database 'metastore_db', see the next exception for details.
at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
... 61 more
Caused by: ERROR XBM0H: Directory /metastore_db cannot be created.
at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
at org.apache.derby.impl.services.monitor.StorageFactoryService$10.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
at org.apache.derby.impl.services.monitor.FileMonitor.createPersistentService(Unknown Source)
at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
... 58 more
org.datanucleus.exceptions.NucleusDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to create database 'metastore_db', see the next exception for details.
at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
at java.sql.DriverManager.getConnection(DriverManager.java:571)
at java.sql.DriverManager.getConnection(DriverManager.java:187)
at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
at java.security.AccessController.doPrivileged(Native Method)
at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:418)
at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:447)
at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:342)
at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:298)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:60)
at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:69)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:682)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:660)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:709)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:508)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6474)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6469)
at org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:6719)
at org.apache.hadoop.hive.metastore.HiveMetaStore.main(HiveMetaStore.java:6646)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: ERROR XJ041: Failed to create database 'metastore_db', see the next exception for details.
at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
... 61 more
Caused by: ERROR XBM0H: Directory /metastore_db cannot be created.
at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
at org.apache.derby.impl.services.monitor.StorageFactoryService$10.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
at org.apache.derby.impl.services.monitor.FileMonitor.createPersistentService(Unknown Source)
at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
... 58 more
I don't know what to do.
The hive-metastore server status is down, and it does not shut down cleanly.
Cause
The Out of Memory (OOM) error in this case is a result of the server not having enough memory to start the HiveMetaStore service.
RCA
The HiveMetaStore service is configured in hive-site.xml on the client side, but the service itself is not started.
For example:
/etc/gphd/hive-0.11.0_gphd_2_1_0_0/conf/hive-site.xml
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://hdw1.viadea.com:9083</value>
</property>
However on hdw1:
-bash-4.1$ service hive-metastore status
hive-metastore dead but pid file exists
Looking at hive-metastore.log, you can see that the reason the service failed to start was an Out of Memory (OOM) error, as shown in the example below:
14/04/07 16:43:13 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please use org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties files.
library initialization failed - unable to allocate file descriptor table - out of memory
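As a quick client-side sanity check, a minimal sketch along these lines (the host and port are the example values from the hive-site.xml above) can confirm that the configured metastore endpoint is not accepting connections:

import socket

# Thrift endpoint taken from the example hive-site.xml above.
host, port = "hdw1.viadea.com", 9083
try:
    socket.create_connection((host, port), timeout=5).close()
    print("metastore port is reachable")
except socket.error as err:
    print("metastore port is NOT reachable:", err)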
Procedure
To resolve this issue, follow the steps below:
1. Understand why the HiveMetaStore service is not starting. In this case, increase the physical memory of the server.
2. Then start the HiveMetaStore service manually as the root user:
service hive-metastore start

Tomcat8 java.lang.NoClassDefFoundError: org/apache/tomcat/util/http/mapper/Mapper

I am upgrading a reporting application that comes with Tomcat bundled in the package. Recently I upgraded to the latest version, which contains Tomcat 8 (the earlier version was Tomcat 7). The upgrade completed without any errors or failures, but after the upgrade I am not able to start Tomcat.
I see the following error in stderr:
SEVERE: Begin event threw error
java.lang.NoClassDefFoundError: org/apache/tomcat/util/http/mapper/Mapper
at org.apache.catalina.connector.Connector.<init>(Connector.java:230)
at org.apache.catalina.startup.ConnectorCreateRule.begin(ConnectorCreateRule.java:62)
at org.apache.tomcat.util.digester.Digester.startElement(Digester.java:1178)
at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.startElement(AbstractSAXParser.java:509)
at com.sun.org.apache.xerces.internal.parsers.AbstractXMLDocumentParser.emptyElement(AbstractXMLDocumentParser.java:182)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanStartElement(XMLDocumentFragmentScannerImpl.java:1344)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl$FragmentContentDriver.next(XMLDocumentFragmentScannerImpl.java:2787)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(XMLDocumentScannerImpl.java:606)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:510)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:857)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:777)
at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:141)
at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1213)
at com.sun.org.apache.xerces.internal.jaxp.SAXParserImpl$JAXPSAXParser.parse(SAXParserImpl.java:643)
at org.apache.tomcat.util.digester.Digester.parse(Digester.java:1451)
at org.apache.catalina.startup.Catalina.load(Catalina.java:615)
at org.apache.catalina.startup.Catalina.load(Catalina.java:663)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.catalina.startup.Bootstrap.load(Bootstrap.java:310)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:484)
Caused by: java.lang.ClassNotFoundException: org.apache.tomcat.util.http.mapper.Mapper
at java.net.URLClassLoader.findClass(URLClassLoader.java:444)
at java.lang.ClassLoader.loadClass(ClassLoader.java:490)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
... 23 more

Migrating application from WAS7 to WAS 8.5

I am in the process of migrating an application from WAS 7 to WAS 8.5, but am having trouble starting the application. Below is the error I get. A similar question was asked earlier but was not answered: Unable to start application on websphere 8.5, but running on version 7. I would appreciate it if someone could throw some light on this issue.
ADMA0116W: Unable to start: dcaEAR using: WebSphere:name=ApplicationManager,process=server1,platform=proxy,node=W7-PC009NYLNode01,version=8.5.0.1,type=ApplicationManager,mbeanIdentifier=ApplicationManager,cell=W7-PC009NYLNode01Cell,spec=1.0 exception is: javax.management.MBeanException: Exception thrown in RequiredModelMBean while trying to invoke operation startApplication
at javax.management.modelmbean.RequiredModelMBean.invokeMethod(RequiredModelMBean.java:1112)
at javax.management.modelmbean.RequiredModelMBean.invoke(RequiredModelMBean.java:966)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.invoke(DefaultMBeanServerInterceptor.java:848)
at com.sun.jmx.mbeanserver.JmxMBeanServer.invoke(JmxMBeanServer.java:773)
at com.ibm.ws.management.AdminServiceImpl$1.run(AdminServiceImpl.java:1335)
at com.ibm.ws.security.util.AccessController.doPrivileged(AccessController.java:118)
at com.ibm.ws.management.AdminServiceImpl.invoke(AdminServiceImpl.java:1228)
at com.ibm.ws.management.application.AppManagementImpl._startApplication(AppManagementImpl.java:1482)
at com.ibm.ws.management.application.AppManagementImpl.startApplication(AppManagementImpl.java:1371)
at com.ibm.ws.management.application.AppManagementImpl.startApplication(AppManagementImpl.java:1320)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:49)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:256)
at javax.management.modelmbean.RequiredModelMBean.invokeMethod(RequiredModelMBean.java:1085)
at javax.management.modelmbean.RequiredModelMBean.invoke(RequiredModelMBean.java:966)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.invoke(DefaultMBeanServerInterceptor.java:848)
at com.sun.jmx.mbeanserver.JmxMBeanServer.invoke(JmxMBeanServer.java:773)
at com.ibm.ws.management.AdminServiceImpl$1.run(AdminServiceImpl.java:1335)
at com.ibm.ws.security.util.AccessController.doPrivileged(AccessController.java:118)
at com.ibm.ws.management.AdminServiceImpl.invoke(AdminServiceImpl.java:1228)
at com.ibm.ws.management.connector.AdminServiceDelegator.invoke(AdminServiceDelegator.java:181)
at com.ibm.ws.management.connector.ipc.CallRouter.route(CallRouter.java:247)
at com.ibm.ws.management.connector.ipc.IPCConnectorInboundLink.doWork(IPCConnectorInboundLink.java:360)
at com.ibm.ws.management.connector.ipc.IPCConnectorInboundLink$IPCConnectorReadCallback.complete(IPCConnectorInboundLink.java:602)
at com.ibm.ws.tcp.channel.impl.AioReadCompletionListener.futureCompleted(AioReadCompletionListener.java:165)
at com.ibm.io.async.AbstractAsyncFuture.invokeCallback(AbstractAsyncFuture.java:217)
at com.ibm.io.async.AsyncChannelFuture.fireCompletionActions(AsyncChannelFuture.java:161)
at com.ibm.io.async.AsyncFuture.completed(AsyncFuture.java:138)
at com.ibm.io.async.ResultHandler.complete(ResultHandler.java:204)
at com.ibm.io.async.ResultHandler.runEventProcessingLoop(ResultHandler.java:775)
at com.ibm.io.async.ResultHandler$2.run(ResultHandler.java:905)
at com.ibm.ws.util.ThreadPool$Worker.run(ThreadPool.java:1814)
Caused by: com.ibm.ws.exception.RuntimeWarning: com.ibm.ws.webcontainer.exception.WebAppNotLoadedException: Failed to load webapp: Failed to load webapp: null
at com.ibm.ws.webcontainer.component.WebContainerImpl.install(WebContainerImpl.java:432)
at com.ibm.ws.webcontainer.component.WebContainerImpl.start(WebContainerImpl.java:718)
at com.ibm.ws.runtime.component.ApplicationMgrImpl.start(ApplicationMgrImpl.java:1173)
at com.ibm.ws.runtime.component.DeployedApplicationImpl.fireDeployedObjectStart(DeployedApplicationImpl.java:1370)
at com.ibm.ws.runtime.component.DeployedModuleImpl.start(DeployedModuleImpl.java:639)
at com.ibm.ws.runtime.component.DeployedApplicationImpl.start(DeployedApplicationImpl.java:968)
at com.ibm.ws.runtime.component.ApplicationMgrImpl.startApplication(ApplicationMgrImpl.java:772)
at com.ibm.ws.runtime.component.ApplicationMgrImpl.startApplicationDynamically(ApplicationMgrImpl.java:1367)
at com.ibm.ws.runtime.component.ApplicationMgrImpl.start(ApplicationMgrImpl.java:2172)
at com.ibm.ws.runtime.component.CompositionUnitMgrImpl.start(CompositionUnitMgrImpl.java:445)
at com.ibm.ws.runtime.component.CompositionUnitImpl.start(CompositionUnitImpl.java:123)
at com.ibm.ws.runtime.component.CompositionUnitMgrImpl.start(CompositionUnitMgrImpl.java:388)
at com.ibm.ws.runtime.component.CompositionUnitMgrImpl.access$500(CompositionUnitMgrImpl.java:116)
at com.ibm.ws.runtime.component.CompositionUnitMgrImpl$1.run(CompositionUnitMgrImpl.java:663)
at com.ibm.ws.security.auth.ContextManagerImpl.runAs(ContextManagerImpl.java:5363)
at com.ibm.ws.security.auth.ContextManagerImpl.runAsSystem(ContextManagerImpl.java:5579)
at com.ibm.ws.security.core.SecurityContext.runAsSystem(SecurityContext.java:255)
at com.ibm.ws.runtime.component.CompositionUnitMgrImpl.startCompositionUnit(CompositionUnitMgrImpl.java:677)
at com.ibm.ws.runtime.component.CompositionUnitMgrImpl.startCompositionUnit(CompositionUnitMgrImpl.java:621)
at com.ibm.ws.runtime.component.ApplicationMgrImpl.startApplication(ApplicationMgrImpl.java:1259)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:49)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:256)
at javax.management.modelmbean.RequiredModelMBean.invokeMethod(RequiredModelMBean.java:1085)
... 38 more
Caused by: com.ibm.ws.webcontainer.exception.WebAppNotLoadedException: Failed to load webapp: Failed to load webapp: null
at com.ibm.ws.webcontainer.WSWebContainer.addWebApp(WSWebContainer.java:759)
at com.ibm.ws.webcontainer.WSWebContainer.addWebApplication(WSWebContainer.java:634)
at com.ibm.ws.webcontainer.component.WebContainerImpl.install(WebContainerImpl.java:426)
... 68 more
Caused by: com.ibm.ws.webcontainer.exception.WebAppNotLoadedException: Failed to load webapp: null
at com.ibm.ws.webcontainer.VirtualHostImpl.addWebApplication(VirtualHostImpl.java:176)
at com.ibm.ws.webcontainer.WSWebContainer.addWebApp(WSWebContainer.java:749)
... 70 more
Caused by: java.lang.reflect.GenericSignatureFormatError
at sun.reflect.generics.parser.SignatureParser.error(SignatureParser.java:115)
at sun.reflect.generics.parser.SignatureParser.parseSimpleClassTypeSignature(SignatureParser.java:274)
at sun.reflect.generics.parser.SignatureParser.parseClassTypeSignatureSuffix(SignatureParser.java:282)
at sun.reflect.generics.parser.SignatureParser.parseClassTypeSignature(SignatureParser.java:256)
at sun.reflect.generics.parser.SignatureParser.parseClassSignature(SignatureParser.java:183)
at sun.reflect.generics.parser.SignatureParser.parseClassSig(SignatureParser.java:138)
at sun.reflect.generics.repository.ClassRepository.parse(ClassRepository.java:46)
at sun.reflect.generics.repository.ClassRepository.parse(ClassRepository.java:35)
at sun.reflect.generics.repository.AbstractRepository.<init>(AbstractRepository.java:68)
at sun.reflect.generics.repository.GenericDeclRepository.<init>(GenericDeclRepository.java:42)
at sun.reflect.generics.repository.ClassRepository.<init>(ClassRepository.java:42)
at sun.reflect.generics.repository.ClassRepository.make(ClassRepository.java:59)
at java.lang.Class.getClassRepository(Class.java:1825)
at java.lang.Class.getTypeParameters(Class.java:1842)
at org.apache.webbeans.util.ClassUtil.isDefinitionConstainsTypeVariables(ClassUtil.java:1653)
at org.apache.webbeans.config.BeanTypeSetResolver.normalClassConfiguration(BeanTypeSetResolver.java:72)
at org.apache.webbeans.config.BeanTypeSetResolver.startConfiguration(BeanTypeSetResolver.java:60)
at org.apache.webbeans.util.ClassUtil.setTypeHierarchy(ClassUtil.java:1710)
at org.apache.webbeans.portable.AbstractAnnotated.<init>(AbstractAnnotated.java:55)
at org.apache.webbeans.portable.AnnotatedTypeImpl.<init>(AnnotatedTypeImpl.java:58)
at org.apache.webbeans.portable.AnnotatedElementFactory.newAnnotatedType(AnnotatedElementFactory.java:98)
at org.apache.webbeans.config.BeansDeployer.deployFromClassPath(BeansDeployer.java:484)
at org.apache.webbeans.config.BeansDeployer.deploy(BeansDeployer.java:171)
at org.apache.webbeans.lifecycle.AbstractLifeCycle.startApplication(AbstractLifeCycle.java:124)
at org.apache.webbeans.web.lifecycle.WebContainerLifecycle.startApplication(WebContainerLifecycle.java:78)
at com.ibm.ws.webbeans.common.CommonLifeCycle.startApplication(CommonLifeCycle.java:106)
at com.ibm.ws.webbeans.services.JCDIServletContainerInitializer.onStartup(JCDIServletContainerInitializer.java:85)
at com.ibm.ws.webcontainer.webapp.WebAppImpl.initializeServletContainerInitializers(WebAppImpl.java:613)
at com.ibm.ws.webcontainer.webapp.WebAppImpl.initialize(WebAppImpl.java:409)
at com.ibm.ws.webcontainer.webapp.WebGroupImpl.addWebApplication(WebGroupImpl.java:88)
at com.ibm.ws.webcontainer.VirtualHostImpl.addWebApplication(VirtualHostImpl.java:169)
... 71 more
According to soa4j:
This error happens when you use reflection and try to load a class, but the definition of the class does not match what you expected to find, probably because an older JAR library is being loaded in a more modern version of a server. Check the nested exception to try to find the name of the colliding class and put it in another classloader.
http://www.soa4j.com/RequestContentFromID?q=968830446190576900&lang=en
Have you tried enabling verbose class loading to see in which class this happens?