I am running Hive on localhost and my queries are throwing an error.
Everything works fine with commands like SELECT, INSERT, and CREATE, but any query that uses GROUP BY or JOINs throws an error.
Below is the error:
java.io.FileNotFoundException: File does not exist: hdfs://localhost:9000/usr/local/hive/lib/hive-common-2.1.0.jar
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1093)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1085)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1085)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:99)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:267)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:388)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:481)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:564)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:559)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:559)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:550)
at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:433)
at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:138)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:197)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1858)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1562)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1313)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1084)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1072)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:232)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:183)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:399)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:776)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:714)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Job Submission failed with exception 'java.io.FileNotFoundException(File does not exist: hdfs://localhost:9000/usr/local/hive/lib/hive-common-2.1.0.jar)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask. File does not exist: hdfs://localhost:9000/usr/local/hive/lib/hive-common-2.1.0.jar
It says the file does not exist, but the file is in place. Below is the relevant snippet of my .bashrc for Hive:
#hive installation
export HIVE_HOME=/usr/local/hive
export PATH=$PATH:$HIVE_HOME/bin
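A quick way to see the mismatch (a hedged sketch, assuming the paths from the error above): the jar exists on the local filesystem, but job submission resolves the same path against the default filesystem hdfs://localhost:9000, where it does not exist.
# The jar is present locally:
ls -l /usr/local/hive/lib/hive-common-2.1.0.jar
# ...but the job submitter looks for it in HDFS, where it is missing:
hdfs dfs -ls hdfs://localhost:9000/usr/local/hive/lib/hive-common-2.1.0.jar
# One common workaround is to mirror the Hive lib directory into HDFS
# at the same absolute path:
hdfs dfs -mkdir -p /usr/local/hive/lib
hdfs dfs -put /usr/local/hive/lib/*.jar /usr/local/hive/lib/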
Related
I am a beginner with big data systems.
I have installed the Cloudera QuickStart VM 5.13.0 in Oracle VirtualBox.
I issued the following command to start Hive:
[cloudera@quickstart ~]$ hive
But I get the following error and Hive does not start.
It seems it is unable to connect to the Hive metastore, although I haven't made any changes to the VM installation.
Please let me know what manual changes I need to make to start Hive.
Thanks.
Logging initialized using configuration in file:/etc/hive/conf.dist/hive-log4j.properties
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html#release for an explanation.
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:833)
at org.apache.hadoop.hive.ql.session.SessionState.getAuthorizationMode(SessionState.java:1679)
at org.apache.hadoop.hive.ql.session.SessionState.isAuthorizationModeV2(SessionState.java:1690)
at org.apache.hadoop.hive.ql.processors.CommandUtil.authorizeCommand(CommandUtil.java:55)
at org.apache.hadoop.hive.ql.processors.AddResourceProcessor.run(AddResourceProcessor.java:66)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:281)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:177)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:389)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:324)
at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:422)
at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:438)
at org.apache.hadoop.hive.cli.CliDriver.processInitFiles(CliDriver.java:472)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:723)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:699)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:634)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:391)
at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:810)
... 20 more
Caused by: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProviderBase.setConf(HiveAuthorizationProviderBase.java:114)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:388)
... 21 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:220)
at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:338)
at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:299)
at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:274)
at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:256)
at org.apache.hadoop.hive.ql.security.authorization.DefaultHiveAuthorizationProvider.init(DefaultHiveAuthorizationProvider.java:29)
at org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProviderBase.setConf(HiveAuthorizationProviderBase.java:112)
... 24 more
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1562)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:67)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:82)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3399)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3418)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3643)
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:231)
at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:215)
... 30 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1560)
... 37 more
Caused by: MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused
at org.apache.thrift.transport.TSocket.open(TSocket.java:226)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:464)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:244)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1560)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:67)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:82)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3399)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3418)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3643)
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:231)
at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:215)
at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:338)
at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:299)
at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:274)
at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:256)
at org.apache.hadoop.hive.ql.security.authorization.DefaultHiveAuthorizationProvider.init(DefaultHiveAuthorizationProvider.java:29)
at org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProviderBase.setConf(HiveAuthorizationProviderBase.java:112)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:388)
at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:810)
at org.apache.hadoop.hive.ql.session.SessionState.getAuthorizationMode(SessionState.java:1679)
at org.apache.hadoop.hive.ql.session.SessionState.isAuthorizationModeV2(SessionState.java:1690)
at org.apache.hadoop.hive.ql.processors.CommandUtil.authorizeCommand(CommandUtil.java:55)
at org.apache.hadoop.hive.ql.processors.AddResourceProcessor.run(AddResourceProcessor.java:66)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:281)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:177)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:389)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:324)
at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:422)
at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:438)
at org.apache.hadoop.hive.cli.CliDriver.processInitFiles(CliDriver.java:472)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:723)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:699)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:634)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.net.ConnectException: Connection refused
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:579)
at org.apache.thrift.transport.TSocket.open(TSocket.java:221)
... 45 more
)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:512)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:244)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
... 42 more
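The innermost cause in this trace is the java.net.ConnectException: Connection refused raised while opening the Thrift socket, which means nothing is listening at the configured metastore URI. A minimal check on the QuickStart VM (assuming the standard CDH service names; adjust if yours differ):
# Is the metastore service actually running?
sudo service hive-metastore status
# If not, start it (and HiveServer2, which depends on it):
sudo service hive-metastore start
sudo service hive-server2 start
# Confirm something is now listening on the default metastore port 9083:
netstat -an | grep 9083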
I am using a CDH 5.13.0 environment.
Whenever I try to execute the hive command it shows this error:
FAILED: SemanticException org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
I checked hive-metastore.log, and it shows:
2018-05-02 06:15:53,225 ERROR [main]: Datastore.Schema (Log4JLogger.java:error(125)) - Failed initialising database.
Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to create database 'metastore_db', see the next exception for details.
at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
at java.sql.DriverManager.getConnection(DriverManager.java:571)
at java.sql.DriverManager.getConnection(DriverManager.java:187)
at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
at java.security.AccessController.doPrivileged(Native Method)
at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:418)
at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:447)
at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:342)
at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:298)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:60)
at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:69)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:682)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:660)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:709)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:508)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6474)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6469)
at org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:6719)
at org.apache.hadoop.hive.metastore.HiveMetaStore.main(HiveMetaStore.java:6646)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: ERROR XJ041: Failed to create database 'metastore_db', see the next exception for details.
at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
... 61 more
Caused by: ERROR XBM0H: Directory /metastore_db cannot be created.
at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
at org.apache.derby.impl.services.monitor.StorageFactoryService$10.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
at org.apache.derby.impl.services.monitor.FileMonitor.createPersistentService(Unknown Source)
at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
... 58 more
org.datanucleus.exceptions.NucleusDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to create database 'metastore_db', see the next exception for details.
at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
at java.sql.DriverManager.getConnection(DriverManager.java:571)
at java.sql.DriverManager.getConnection(DriverManager.java:187)
at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
at java.security.AccessController.doPrivileged(Native Method)
at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:418)
at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:447)
at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:342)
at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:298)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:60)
at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:69)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:682)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:660)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:709)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:508)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6474)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6469)
at org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:6719)
at org.apache.hadoop.hive.metastore.HiveMetaStore.main(HiveMetaStore.java:6646)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: ERROR XJ041: Failed to create database 'metastore_db', see the next exception for details.
at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
... 61 more
Caused by: ERROR XBM0H: Directory /metastore_db cannot be created.
at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
at org.apache.derby.impl.services.monitor.StorageFactoryService$10.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
at org.apache.derby.impl.services.monitor.FileMonitor.createPersistentService(Unknown Source)
at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
... 58 more
I don't know what to do.
The hive-metastore service reports as down and will not come back up.
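The innermost cause is the telling one: ERROR XBM0H: Directory /metastore_db cannot be created. Hive has fallen back to an embedded Derby metastore, and Derby creates metastore_db relative to the current working directory, here the filesystem root, which an ordinary user cannot write to. A hedged first check, assuming the standard CDH service layout:
# Running hive from a writable directory sidesteps the Derby error:
cd ~
hive
# But on CDH the real fix is usually to get the metastore service back up
# so the embedded-Derby fallback is never used:
sudo service hive-metastore status
sudo service hive-metastore start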
Cause
The Out of Memory (OOM) error in this case is the result of the server not having enough memory to start the HiveMetaStore service.
RCA
The HiveMetaStore service is configured in hive-site.xml on the client side, but the service itself is not started.
For example:
/etc/gphd/hive-0.11.0_gphd_2_1_0_0/conf/hive-site.xml
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://hdw1.viadea.com:9083</value>
</property>
However on hdw1:
-bash-4.1$ service hive-metastore status
hive-metastore dead but pid file exists
Looking at hive-metastore.log, you can see that the reason the service failed to start was an Out of Memory (OOM) error, as shown in the example below:
14/04/07 16:43:13 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please use org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties files.
library initialization failed - unable to allocate file descriptor table - out of memory
Procedure
To resolve this issue, follow the steps below:
1. Understand why the HiveMetaStore service did not start. In this case, increase the physical memory of the server.
2. Then start the HiveMetaStore service manually as the root user:
service hive-metastore start
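Putting the procedure together (a sketch assuming the same init-script packaging shown above):
# Check available memory first; the log showed the JVM failing to allocate:
free -m
# Then start the metastore as root and confirm it stays up:
sudo service hive-metastore start
service hive-metastore status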
I am getting the exception below in Hive when running a simple SELECT COUNT(*) FROM Table.
Job Submission failed with exception 'org.apache.hadoop.io.nativeio.NativeIOException(No such file or directory)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask. No such file or directory
No issue occurs with a simple SELECT * FROM Table.
Please suggest where the problem might be. The Hive execution engine is MR. Full stack trace of the error:
2017-07-18T07:18:52,744 ERROR [main]: exec.Task (:()) - Job Submission failed with exception 'org.apache.hadoop.io.nativeio.NativeIOException(No such file or directory)'
ENOENT: No such file or directory
at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmodImpl(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmod(NativeIO.java:230)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:729)
at org.apache.hadoop.fs.ChecksumFileSystem$1.apply(ChecksumFileSystem.java:505)
at org.apache.hadoop.fs.ChecksumFileSystem$FsOperation.run(ChecksumFileSystem.java:486)
at org.apache.hadoop.fs.ChecksumFileSystem.setPermission(ChecksumFileSystem.java:502)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:602)
at org.apache.hadoop.mapreduce.JobResourceUploader.uploadFiles(JobResourceUploader.java:94)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:95)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:190)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:575)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:570)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:570)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:561)
at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:433)
at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:138)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:197)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1858)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1562)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1313)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1084)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1072)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:232)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:183)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:399)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:776)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:714)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
2017-07-18T07:18:52,745 ERROR [main]: ql.Driver (:()) - FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask. No such file or directory
Try checking the permissions on the Hadoop tmp dir. The path to this directory is configured, for example, in core-site.xml; the property name is hadoop.tmp.dir.
I had a similar issue to the one you described, and it was caused by the user under which the MR job was executed not having write permission to that location.
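A quick way to check and fix this (assuming the common default of /tmp/hadoop-${user.name}; substitute whatever your core-site.xml actually sets):
# See what the directory's permissions and ownership look like:
ls -ld /tmp/hadoop-$USER
# If it was created by another user (e.g. root), hand it back:
sudo chown -R $USER:$USER /tmp/hadoop-$USER
chmod -R u+rwx /tmp/hadoop-$USER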
While uploading a build to the App Store, I am getting this error. Please help.
java.lang.RuntimeException: Unable to open JAR file, probably deleted: error in opening zip file
at org.apache.felix.framework.cache.JarContent.openZipFile(JarContent.java:489)
at org.apache.felix.framework.cache.JarContent.<init>(JarContent.java:63)
at org.apache.felix.framework.cache.JarContent.getEntryAsContent(JarContent.java:275)
at org.apache.felix.framework.ModuleImpl.calculateContentPath(ModuleImpl.java:595)
at org.apache.felix.framework.ModuleImpl.initializeContentPath(ModuleImpl.java:546)
at org.apache.felix.framework.ModuleImpl.getContentPath(ModuleImpl.java:532)
at org.apache.felix.framework.ModuleImpl.access$800(ModuleImpl.java:71)
at org.apache.felix.framework.ModuleImpl$ModuleClassLoader.findClass(ModuleImpl.java:1810)
at org.apache.felix.framework.ModuleImpl.findClassOrResourceByDelegation(ModuleImpl.java:727)
at org.apache.felix.framework.ModuleImpl.access$400(ModuleImpl.java:71)
at org.apache.felix.framework.ModuleImpl$ModuleClassLoader.loadClass(ModuleImpl.java:1768)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at org.apache.felix.framework.ModuleImpl.getClassByDelegation(ModuleImpl.java:645)
at org.apache.felix.framework.resolver.WireImpl.getClass(WireImpl.java:99)
at org.apache.felix.framework.ModuleImpl.searchImports(ModuleImpl.java:1390)
at org.apache.felix.framework.ModuleImpl.findClassOrResourceByDelegation(ModuleImpl.java:722)
at org.apache.felix.framework.ModuleImpl.access$400(ModuleImpl.java:71)
at org.apache.felix.framework.ModuleImpl$ModuleClassLoader.loadClass(ModuleImpl.java:1768)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at com.apple.transporter.softwaresupport.SoftwareSupportBundleActivator.registerService(SoftwareSupportBundleActivator.java:22)
at com.apple.transporter.softwaresupport.SoftwareSupportBundleActivator.start(SoftwareSupportBundleActivator.java:15)
at org.apache.felix.framework.util.SecureAction.startActivator(SecureAction.java:629)
at org.apache.felix.framework.Felix.activateBundle(Felix.java:1827)
at org.apache.felix.framework.Felix.startBundle(Felix.java:1744)
at org.apache.felix.framework.BundleImpl.start(BundleImpl.java:922)
at org.apache.felix.framework.BundleImpl.start(BundleImpl.java:909)
at com.apple.transporter.osgi.OSGiBootstrapper.startBundle(OSGiBootstrapper.java:522)
at com.apple.transporter.osgi.OSGiBootstrapper.bootstrap(OSGiBootstrapper.java:297)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.apple.transporter.Application.start(Application.java:133)
at com.apple.transporter.Application.main(Application.java:357)
An error occurred while starting bundles for the software update process. Transporter will try to continue. Activator start error in bundle com.apple.transporter.softwaresupport [8].
java.lang.RuntimeException: Unable to open JAR file, probably deleted: error in opening zip file
at org.apache.felix.framework.cache.JarContent.openZipFile(JarContent.java:489)
at org.apache.felix.framework.cache.JarContent.<init>(JarContent.java:63)
at org.apache.felix.framework.cache.JarContent.getEntryAsContent(JarContent.java:275)
at org.apache.felix.framework.ModuleImpl.calculateContentPath(ModuleImpl.java:595)
at org.apache.felix.framework.ModuleImpl.initializeContentPath(ModuleImpl.java:546)
at org.apache.felix.framework.ModuleImpl.getContentPath(ModuleImpl.java:532)
at org.apache.felix.framework.ModuleImpl.access$800(ModuleImpl.java:71)
at org.apache.felix.framework.ModuleImpl$ModuleClassLoader.findClass(ModuleImpl.java:1810)
at org.apache.felix.framework.ModuleImpl.findClassOrResourceByDelegation(ModuleImpl.java:727)
at org.apache.felix.framework.ModuleImpl.access$400(ModuleImpl.java:71)
at org.apache.felix.framework.ModuleImpl$ModuleClassLoader.loadClass(ModuleImpl.java:1768)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at org.apache.felix.framework.ModuleImpl.getClassByDelegation(ModuleImpl.java:645)
at org.apache.felix.framework.resolver.WireImpl.getClass(WireImpl.java:99)
at org.apache.felix.framework.ModuleImpl.searchImports(ModuleImpl.java:1390)
at org.apache.felix.framework.ModuleImpl.findClassOrResourceByDelegation(ModuleImpl.java:722)
at org.apache.felix.framework.ModuleImpl.access$400(ModuleImpl.java:71)
at org.apache.felix.framework.ModuleImpl$ModuleClassLoader.loadClass(ModuleImpl.java:1768)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at com.apple.transporter.softwaresupport.SoftwareSupportBundleActivator.registerService(SoftwareSupportBundleActivator.java:22)
at com.apple.transporter.softwaresupport.SoftwareSupportBundleActivator.start(SoftwareSupportBundleActivator.java:15)
at org.apache.felix.framework.util.SecureAction.startActivator(SecureAction.java:629)
at org.apache.felix.framework.Felix.activateBundle(Felix.java:1827)
at org.apache.felix.framework.Felix.startBundle(Felix.java:1744)
at org.apache.felix.framework.BundleImpl.start(BundleImpl.java:922)
at org.apache.felix.framework.BundleImpl.start(BundleImpl.java:909)
at com.apple.transporter.osgi.OSGiBootstrapper.startBundle(OSGiBootstrapper.java:522)
at com.apple.transporter.osgi.OSGiBootstrapper.bootstrap(OSGiBootstrapper.java:297)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.apple.transporter.Application.start(Application.java:133)
at com.apple.transporter.Application.main(Application.java:357)
An error occurred while starting bundles for the software update process. Transporter will try to continue. Activator start error in bundle com.apple.transporter.softwaresupport [9].
[2017-02-07 15:54:03 IST] INFO: Transporter is searching for updated software components.
[2017-02-07 15:54:07 IST] INFO: Transporter is up-to-date.
java.lang.RuntimeException: Unable to open JAR file, probably deleted: error in opening zip file
at org.apache.felix.framework.cache.JarContent.openZipFile(JarContent.java:489)
at org.apache.felix.framework.cache.JarContent.<init>(JarContent.java:63)
at org.apache.felix.framework.cache.JarContent.getEntryAsContent(JarContent.java:275)
at org.apache.felix.framework.ModuleImpl.calculateContentPath(ModuleImpl.java:595)
at org.apache.felix.framework.ModuleImpl.initializeContentPath(ModuleImpl.java:546)
at org.apache.felix.framework.ModuleImpl.getContentPath(ModuleImpl.java:532)
at org.apache.felix.framework.ModuleImpl.access$800(ModuleImpl.java:71)
at org.apache.felix.framework.ModuleImpl$ModuleClassLoader.findClass(ModuleImpl.java:1810)
at org.apache.felix.framework.ModuleImpl.findClassOrResourceByDelegation(ModuleImpl.java:727)
at org.apache.felix.framework.ModuleImpl.access$400(ModuleImpl.java:71)
at org.apache.felix.framework.ModuleImpl$ModuleClassLoader.loadClass(ModuleImpl.java:1768)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at org.apache.felix.framework.ModuleImpl.getClassByDelegation(ModuleImpl.java:645)
at org.apache.felix.framework.resolver.WireImpl.getClass(WireImpl.java:99)
at org.apache.felix.framework.ModuleImpl.searchImports(ModuleImpl.java:1390)
at org.apache.felix.framework.ModuleImpl.findClassOrResourceByDelegation(ModuleImpl.java:722)
at org.apache.felix.framework.ModuleImpl.access$400(ModuleImpl.java:71)
at org.apache.felix.framework.ModuleImpl$ModuleClassLoader.loadClass(ModuleImpl.java:1768)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at com.apple.transporter.osgi.TransporterService.run(TransporterService.java:55)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.apple.transporter.osgi.OSGiBootstrapper.runTransporter(OSGiBootstrapper.java:579)
at com.apple.transporter.osgi.OSGiBootstrapper.bootstrap(OSGiBootstrapper.java:305)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.apple.transporter.Application.start(Application.java:133)
at com.apple.transporter.Application.main(Application.java:357)
An error occurred while trying to start transporter. Exception's name: java.lang.reflect.InvocationTargetException, Exception's message: null
ERROR: Unable to get module class path. (java.lang.RuntimeException: Unable to open JAR file, probably deleted: error in opening zip file)
ERROR: Unable to get module class path. (java.lang.RuntimeException: Unable to open JAR file, probably deleted: error in opening zip file)
ERROR: Unable to get module class path. (java.lang.RuntimeException: Unable to open JAR file, probably deleted: error in opening zip file)
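The repeated "Unable to open JAR file, probably deleted" errors suggest Transporter's cached OSGi bundles are missing or corrupt. A commonly reported remedy (an assumption based on the standard iTMSTransporter cache location, not something the log above confirms) is to delete the cache so it is re-downloaded:
# Quit Xcode / Application Loader first, then clear Transporter's cache:
rm -rf ~/.itmstransporter
# The next upload re-downloads Transporter's software components;
# the first run after this can take several minutes.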
I'm running the Pentaho 5.3 BI Server on a Red Hat machine. Yesterday there was a power failure, and when the power came back I tried to start the BI Server; it gives:
HTTP Status 404 -
type Status report
message
description The requested resource is not available.
Apache Tomcat/6.0.41
I checked pentaho.log:
2015-03-24 12:11:23,595 ERROR [org.pentaho.platform.osgi.OSGIBoot] Error installing Bundle
org.osgi.framework.BundleException: Bundle symbolic name and version are not unique: org.objectweb.asm.all:4.0.0
at org.apache.felix.framework.BundleImpl.createRevision(BundleImpl.java:1233)
at org.apache.felix.framework.BundleImpl.<init>(BundleImpl.java:96)
at org.apache.felix.framework.Felix.installBundle(Felix.java:2899)
at org.apache.felix.framework.BundleContextImpl.installBundle(BundleContextImpl.java:165)
at org.apache.felix.framework.BundleContextImpl.installBundle(BundleContextImpl.java:138)
at org.pentaho.platform.osgi.OSGIBoot.startup(OSGIBoot.java:137)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:421)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:391)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.access$000(PentahoSystem.java:77)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:343)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:340)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:391)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:340)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:311)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:212)
at org.pentaho.platform.web.http.context.SolutionContextListener.contextInitialized(SolutionContextListener.java:135)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4210)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4709)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:583)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:675)
at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:601)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:822)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
2015-03-24 12:11:23,609 ERROR [org.pentaho.platform.osgi.OSGIBoot] Error installing Bundle
org.osgi.framework.BundleException: Bundle symbolic name and version are not unique: com.springsource.org.apache.commons.logging:1.1.1
at org.apache.felix.framework.BundleImpl.createRevision(BundleImpl.java:1233)
at org.apache.felix.framework.BundleImpl.<init>(BundleImpl.java:96)
at org.apache.felix.framework.Felix.installBundle(Felix.java:2899)
at org.apache.felix.framework.BundleContextImpl.installBundle(BundleContextImpl.java:165)
at org.apache.felix.framework.BundleContextImpl.installBundle(BundleContextImpl.java:138)
at org.pentaho.platform.osgi.OSGIBoot.startup(OSGIBoot.java:137)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:421)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:391)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.access$000(PentahoSystem.java:77)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:343)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:340)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:391)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:340)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:311)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:212)
at org.pentaho.platform.web.http.context.SolutionContextListener.contextInitialized(SolutionContextListener.java:135)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4210)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4709)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:583)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:675)
at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:601)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:822)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
How do I resolve this? Please help.
You should clean your OSGi cache:
1) Stop the Pentaho server.
2) Go to /biserver-ce/pentaho-solutions/system/osgi/cache and delete everything.
3) Start the Pentaho server.
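As shell commands (a sketch; /path/to/biserver-ce is wherever your server lives, and the script names assume the community-edition layout):
# 1) Stop the Pentaho BI Server
/path/to/biserver-ce/stop-pentaho.sh
# 2) Delete the OSGi bundle cache; it is rebuilt on the next startup
rm -rf /path/to/biserver-ce/pentaho-solutions/system/osgi/cache/*
# 3) Start the server again
/path/to/biserver-ce/start-pentaho.sh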