I am using Mondrian OLAP for analytics. We have been using MySQL but have decided to migrate to Apache Drill.
Now I have configured Apache Drill and I am able to connect to it from external clients like the SQuirreL SQL Client, but when I try to connect via Saiku, I get the following stack trace in catalina.out:
INFO: Deploying web application directory saiku
name:Lists
driver:mondrian.olap4j.MondrianOlap4jDriver
url:jdbc:mondrian:Jdbc=jdbc:drill:drillbit=localhost:31010;schema=Saiku.root;Catalog=../webapps/saiku/WEB-INF/classes/imi_India/INDIA.xml;JdbcDrivers=org.apache.drill.jdbc.Driver;
23:59:21,878 WARN [RolapUtil] Mondrian: Warning: JDBC driver sun.jdbc.odbc.JdbcOdbcDriver not found
23:59:21,879 WARN [RolapUtil] Mondrian: Warning: JDBC driver oracle.jdbc.OracleDriver not found
23:59:22,162 WARN [DrillMetrics] Removing old metric since name matched newly registered metric. Metric name: drill.allocator.root.used
23:59:22,162 WARN [DrillMetrics] Removing old metric since name matched newly registered metric. Metric name: drill.allocator.root.peak
23:59:25,246 WARN [ThreadLocalRandom] Failed to generate a seed from SecureRandom within 3 seconds. Not enough entrophy?
mondrian.olap.MondrianException: Mondrian Error:Internal error: Error while creating SQL connection: Jdbc=jdbc:drill:drillbit=localhost:31010; JdbcUser=admin; JdbcPassword=admin
at mondrian.resource.MondrianResource$_Def0.ex(MondrianResource.java:972)
at mondrian.olap.Util.newInternal(Util.java:2403)
at mondrian.olap.Util.newError(Util.java:2419)
at mondrian.rolap.RolapConnection.<init>(RolapConnection.java:246)
at mondrian.rolap.RolapSchema.<init>(RolapSchema.java:188)
at mondrian.rolap.RolapSchema.<init>(RolapSchema.java:216)
at mondrian.rolap.RolapSchemaPool.get(RolapSchemaPool.java:214)
at mondrian.rolap.RolapSchemaPool.get(RolapSchemaPool.java:66)
at mondrian.rolap.RolapConnection.<init>(RolapConnection.java:160)
at mondrian.rolap.RolapConnection.<init>(RolapConnection.java:90)
at mondrian.olap.DriverManager.getConnection(DriverManager.java:112)
at mondrian.olap.DriverManager.getConnection(DriverManager.java:68)
at mondrian.olap4j.MondrianOlap4jConnection.<init>(MondrianOlap4jConnection.java:153)
at mondrian.olap4j.FactoryJdbc4Plus$AbstractConnection.<init>(FactoryJdbc4Plus.java:323)
at mondrian.olap4j.FactoryJdbc41Impl$MondrianOlap4jConnectionJdbc41.<init>(FactoryJdbc41Impl.java:118)
at mondrian.olap4j.FactoryJdbc41Impl.newConnection(FactoryJdbc41Impl.java:32)
at mondrian.olap4j.MondrianOlap4jDriver.connect(MondrianOlap4jDriver.java:134)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at org.saiku.datasources.connection.SaikuOlapConnection.connect(SaikuOlapConnection.java:75)
at org.saiku.datasources.connection.SaikuOlapConnection.connect(SaikuOlapConnection.java:46)
at org.saiku.datasources.connection.SaikuConnectionFactory.getConnection(SaikuConnectionFactory.java:29)
at org.saiku.web.impl.SecurityAwareConnectionManager.connect(SecurityAwareConnectionManager.java:284)
at org.saiku.web.impl.SecurityAwareConnectionManager.getInternalConnection(SecurityAwareConnectionManager.java:99)
at org.saiku.datasources.connection.AbstractConnectionManager.getConnection(AbstractConnectionManager.java:110)
at org.saiku.datasources.connection.AbstractConnectionManager.getAllConnections(AbstractConnectionManager.java:136)
at org.saiku.web.impl.SecurityAwareConnectionManager.init(SecurityAwareConnectionManager.java:58)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1536)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1477)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1409)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:291)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:288)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:190)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:574)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:895)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:425)
at org.springframework.web.context.ContextLoader.createWebApplicationContext(ContextLoader.java:276)
at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:197)
at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:47)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:3972)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4467)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:791)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:771)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:526)
at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:1041)
at org.apache.catalina.startup.HostConfig.deployDirectories(HostConfig.java:964)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1277)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:321)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:119)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1053)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:722)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:443)
at org.apache.catalina.core.StandardService.start(StandardService.java:516)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:710)
at org.apache.catalina.startup.Catalina.start(Catalina.java:593)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Caused by: org.apache.commons.dbcp.SQLNestedException: Cannot get a connection, pool error Could not create a validated object, cause: null
at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:104)
at mondrian.rolap.RolapConnection.<init>(RolapConnection.java:226)
... 66 more
Caused by: java.util.NoSuchElementException: Could not create a validated object, cause: null
at org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1008)
at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:96)
... 67 more
I have added the following dependency to the application's pom.xml:
<dependency>
  <groupId>org.apache.drill.exec</groupId>
  <artifactId>drill-jdbc</artifactId>
  <version>1.1.0</version>
</dependency>
and used the following connection string:
jdbc:drill:drillbit=localhost:31010;schema=Saiku.root
I also have a bunch of Parquet files in the HDFS directory /dashboard/saiku3//.parquet.
There are 17 such tables. I have created a storage plugin called Saiku with root storage at /saiku3 and created views for all 17 Hadoop directories. Those directories were imported from MySQL to HDFS as Parquet via Sqoop.
I have been stuck here for hours and couldn't find a solution. Am I doing something wrong here?
Thanks in advance.
"Not enough entropy" ? What version of Java is this? You seem to be using /dev/random iso. /dev/urandom. Anyway, you can change this with -Djava.security.egd if I'm not mistaken.
Check whether you have drill-jdbc-all-1.8.0.jar in tomcat/lib; Mondrian needs the Drill JDBC driver on Tomcat's classpath at runtime, not just declared as a Maven dependency of the webapp.
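A sketch of what that could look like, assuming the shaded driver jar comes from a local Drill 1.8.0 distribution and a standard Tomcat layout (both paths are illustrative):
# Copy the shaded Drill JDBC driver into Tomcat's shared lib directory
# so Mondrian can load org.apache.drill.jdbc.Driver at runtime.
cp /opt/apache-drill-1.8.0/jars/jdbc-driver/drill-jdbc-all-1.8.0.jar /opt/tomcat/lib/
# Restart Tomcat so the new jar is picked up.
/opt/tomcat/bin/shutdown.sh && /opt/tomcat/bin/startup.sh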
Related
I am trying to create a WebLogic domain in silent mode through PuTTY. I used the command below:
./config.sh -mode=silent -silent_xml=/home/ec2-user/createdomain.xml
I am getting the error message below while executing it:
Exception in thread "Thread-1" java.lang.IllegalStateException: No able to create the instance of the template catalog class com.oracle.cie.domain.template.catalog.impl.GlobalTemplateCat
at com.oracle.cie.domain.template.catalog.TemplateCatalogFactory.createGlobalTemplateCatalog(TemplateCatalogFactory.java:138)
at com.oracle.cie.domain.template.catalog.TemplateCatalogFactory.getGlobalCatalog(TemplateCatalogFactory.java:78)
at com.oracle.cie.domain.template.catalog.TemplateCatalogFactory.getGlobalCatalog(TemplateCatalogFactory.java:33)
at com.oracle.cie.wizard.domain.silent.tasks.LoadTemplateCatalogTask$1.run(LoadTemplateCatalogTask.java:23)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.oracle.cie.domain.template.catalog.TemplateCatalogFactory.createGlobalTemplateCatalog(TemplateCatalogFactory.java:133)
... 4 more
Caused by: com.oracle.cie.domain.env.EnvironmentServiceException: Failed to get inventory for /home/ec2-user/oracle/middleware/oracle_common/common/bin
at com.oracle.cie.domain.env.EnvironmentServiceImpl.init(EnvironmentServiceImpl.java:425)
at com.oracle.cie.domain.env.EnvironmentServiceImpl.<init>(EnvironmentServiceImpl.java:89)
at com.oracle.cie.domain.env.EnvironmentServiceImpl.getInstance(EnvironmentServiceImpl.java:364)
at com.oracle.cie.domain.env.EnvironmentServiceFactory.getEnvironmentService(EnvironmentServiceFactory.java:35)
at com.oracle.cie.domain.template.catalog.impl.OracleHomeLocator.getProductInstalDirs(OracleHomeLocator.java:31)
at com.oracle.cie.domain.template.catalog.impl.GlobalTemplateCat.populateProductCatalogs(GlobalTemplateCat.java:446)
at com.oracle.cie.domain.template.catalog.impl.GlobalTemplateCat.<init>(GlobalTemplateCat.java:90)
at com.oracle.cie.domain.template.catalog.impl.GlobalTemplateCat.<init>(GlobalTemplateCat.java:83)
... 9 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.oracle.cie.common.ReflectionHelper.process(ReflectionHelper.java:48)
at com.oracle.cie.domain.env.EnvironmentServiceImpl.init(EnvironmentServiceImpl.java:384)
... 16 more
Caused by: com.oracle.cie.gdr.external.InventoryException: com.oracle.cie.gdr.utils.GdrException: The gdr meta-data directory /home/ec2-user/oracle/middleware/oracle_common/common/bin/inventory is invalid or does not exist.
at com.oracle.cie.gdr.external.impl.OracleHomeInventoryImpl.<init>(OracleHomeInventoryImpl.java:55)
at com.oracle.cie.gdr.external.impl.OracleHomeInventoryFactory.createInventory(OracleHomeInventoryFactory.java:60)
at com.oracle.cie.gdr.external.InventoryFactory.getOracleHomeInventory(InventoryFactory.java:99)
... 22 more
Caused by: com.oracle.cie.gdr.utils.GdrException: The gdr meta-data directory /home/ec2-user/oracle/middleware/oracle_common/common/bin/inventory is invalid or does not exist.
at com.oracle.cie.gdr.MetaDataHome.init(MetaDataHome.java:206)
at com.oracle.cie.gdr.MetaDataHome.<init>(MetaDataHome.java:188)
at com.oracle.cie.gdr.MetaDataHome.<init>(MetaDataHome.java:172)
at com.oracle.cie.gdr.MetaDataHome.<init>(MetaDataHome.java:157)
at com.oracle.cie.gdr.MetaDataHome.<init>(MetaDataHome.java:144)
at com.oracle.cie.gdr.MetaDataHome.<init>(MetaDataHome.java:86)
at com.oracle.cie.gdr.Home.getMetaDataHome(Home.java:619)
Which WebLogic version are you using? I have not seen a silent script to create domains for a while. If you are trying to do this on WebLogic 12c, it won't work, as this kind of script used to be available only for older versions such as 8 and 9, as far as I remember.
If you want to automate domain provisioning for versions such as 12c, you should use a newer approach. Here I am proposing a few options.
You can use Ansible, WLST and Python to create the domain. You can see an example at https://github.com/textanalyticsman/ansible-soa (a minimal WLST sketch is shown after these options).
You can use WebLogic Deploy Tooling, an open-source tool provided by Oracle; you can find it at https://github.com/oracle/weblogic-deploy-tooling
The combination of WebLogic Deploy Tooling and Ansible is also a good option, as shown in https://github.com/textanalyticsman/ansible-soa-wldt
You can also try the WebLogic Kubernetes Operator: https://oracle.github.io/weblogic-kubernetes-operator/userguide/managing-domains/domain-resource/
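To give an idea of the WLST route, here is a minimal offline-WLST sketch (Jython, run with wlst.sh); the template path, domain path, port and password are placeholders you would replace with your own values:
# create_domain.py -- run with: $MW_HOME/oracle_common/common/bin/wlst.sh create_domain.py
# Offline WLST: read the base WebLogic template, set the admin credentials,
# and write a new domain to disk. All paths and the password are placeholders.
readTemplate('/u01/oracle/middleware/wlserver/common/templates/wls/wls.jar')
cd('Servers/AdminServer')
set('ListenPort', 7001)
cd('/')
cd('Security/base_domain/User/weblogic')
cmo.setPassword('welcome1')          # placeholder admin password
setOption('OverwriteDomain', 'true')
writeDomain('/u01/domains/base_domain')
closeTemplate()
exit()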
During a hardware migration, I thought it sensible to upgrade GeoServer from 2.13.0 to 2.14.2 on GlassFish 4.0, hoping to cure some flakiness.
The JVM is configured to use a non-standard data directory via the JVM parameter:
-DGEOSERVER_DATA_DIR=/geoserver
Using the .war downloaded from geoserver.org, deployment fails.
2.15.x exhibits the same failure.
2.13.0 deploys and runs, albeit with some flakiness (a similar stack trace is shown on the GeoServer home page once logged in).
The issue appears platform independent; environments used:
Windows 10 x64 running Oracle Java 1.8.0_161 on GlassFish 4 or GlassFish 5 (including fresh out-of-the-box installs of GlassFish)
CentOS 7 running OpenJDK 1.8.0_222 on GlassFish 4
The WAR did deploy on a localhost (Windows 10) install of Tomcat, albeit without the custom data directory set.
If the latest GeoServer builds could be made to work on GlassFish 4, this would save considerable time and expense.
Any suggestions appreciated.
Severe: Exception during lifecycle processing
java.lang.Exception: java.lang.IllegalStateException: ContainerBase.addChild: start: org.apache.catalina.LifecycleException: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'gwcFacade' defined in URL [jar:file:/E:/glassfish-4.0/glassfish/domains/domain1/applications/geoserver/WEB-INF/lib/gs-gwc-2.14.5.jar!/applicationContext.xml]: Bean instantiation via constructor failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.geoserver.gwc.GWC]: Constructor threw exception; nested exception is java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkNotNull(Ljava/lang/Object;Ljava/lang/String;Ljava/lang/Object;)Ljava/lang/Object;
at com.sun.enterprise.web.WebApplication.start(WebApplication.java:168)
at org.glassfish.internal.data.EngineRef.start(EngineRef.java:122)
at org.glassfish.internal.data.ModuleInfo.start(ModuleInfo.java:291)
at org.glassfish.internal.data.ApplicationInfo.start(ApplicationInfo.java:352)
at com.sun.enterprise.v3.server.ApplicationLifecycle.deploy(ApplicationLifecycle.java:497)
at com.sun.enterprise.v3.server.ApplicationLifecycle.deploy(ApplicationLifecycle.java:219)
at org.glassfish.deployment.admin.DeployCommand.execute(DeployCommand.java:491)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$2$1.run(CommandRunnerImpl.java:527)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$2$1.run(CommandRunnerImpl.java:523)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:360)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$2.execute(CommandRunnerImpl.java:522)
at com.sun.enterprise.v3.admin.CommandRunnerImpl.doCommand(CommandRunnerImpl.java:546)
at com.sun.enterprise.v3.admin.CommandRunnerImpl.doCommand(CommandRunnerImpl.java:1423)
at com.sun.enterprise.v3.admin.CommandRunnerImpl.access$1500(CommandRunnerImpl.java:108)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$ExecutionContext.execute(CommandRunnerImpl.java:1762)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$ExecutionContext.execute(CommandRunnerImpl.java:1674)
at org.glassfish.admin.rest.resources.admin.CommandResource.executeCommand(CommandResource.java:396)
at org.glassfish.admin.rest.resources.admin.CommandResource.execCommandSimpInMultOut(CommandResource.java:234)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory$1.invoke(ResourceMethodInvocationHandlerFactory.java:81)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:125)
at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$ResponseOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:152)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:91)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:346)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:341)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:101)
at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:224)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:271)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:267)
at org.glassfish.jersey.internal.Errors.process(Errors.java:315)
at org.glassfish.jersey.internal.Errors.process(Errors.java:297)
at org.glassfish.jersey.internal.Errors.process(Errors.java:267)
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:317)
at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:198)
at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:946)
at org.glassfish.jersey.grizzly2.httpserver.GrizzlyHttpContainer.service(GrizzlyHttpContainer.java:331)
at org.glassfish.admin.rest.adapter.JerseyContainerCommandService$3.service(JerseyContainerCommandService.java:165)
at org.glassfish.admin.rest.adapter.RestAdapter.service(RestAdapter.java:181)
at com.sun.enterprise.v3.services.impl.ContainerMapper.service(ContainerMapper.java:246)
at org.glassfish.grizzly.http.server.HttpHandler.runService(HttpHandler.java:191)
at org.glassfish.grizzly.http.server.HttpHandler.doHandle(HttpHandler.java:168)
at org.glassfish.grizzly.http.server.HttpServerFilter.handleRead(HttpServerFilter.java:189)
at org.glassfish.grizzly.filterchain.ExecutorResolver$9.execute(ExecutorResolver.java:119)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.executeFilter(DefaultFilterChain.java:288)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.executeChainPart(DefaultFilterChain.java:206)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.execute(DefaultFilterChain.java:136)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.process(DefaultFilterChain.java:114)
at org.glassfish.grizzly.ProcessorExecutor.execute(ProcessorExecutor.java:77)
at org.glassfish.grizzly.nio.transport.TCPNIOTransport.fireIOEvent(TCPNIOTransport.java:838)
at org.glassfish.grizzly.strategies.AbstractIOStrategy.fireIOEvent(AbstractIOStrategy.java:113)
at org.glassfish.grizzly.strategies.WorkerThreadIOStrategy.run0(WorkerThreadIOStrategy.java:115)
at org.glassfish.grizzly.strategies.WorkerThreadIOStrategy.access$100(WorkerThreadIOStrategy.java:55)
at org.glassfish.grizzly.strategies.WorkerThreadIOStrategy$WorkerThreadRunnable.run(WorkerThreadIOStrategy.java:135)
at org.glassfish.grizzly.threadpool.AbstractThreadPool$Worker.doWork(AbstractThreadPool.java:564)
at org.glassfish.grizzly.threadpool.AbstractThreadPool$Worker.run(AbstractThreadPool.java:544)
at java.lang.Thread.run(Thread.java:748)
Severe: Exception while loading the app
Severe: Undeployment failed for context /geoserver
Warning: The web application [/geoserver] registered the JDBC driver [org.h2.Driver] but failed to unregister it when the web application was stopped. To prevent a memory leak, the JDBC Driver has been forcibly unregistered.
Severe: The web application [/geoserver] created a ThreadLocal with key of type [java.lang.ThreadLocal.SuppliedThreadLocal] (value [java.lang.ThreadLocal$SuppliedThreadLocal#e524836]) and a value of type [org.geowebcache.storage.CompositeBlobStore.StoreSuitabilityCheck] (value [EXISTING]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
Severe: Exception while loading the app : java.lang.IllegalStateException: ContainerBase.addChild: start: org.apache.catalina.LifecycleException: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'gwcFacade' defined in URL [jar:file:/E:/glassfish-4.0/glassfish/domains/domain1/applications/geoserver/WEB-INF/lib/gs-gwc-2.14.5.jar!/applicationContext.xml]: Bean instantiation via constructor failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.geoserver.gwc.GWC]: Constructor threw exception; nested exception is java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkNotNull(Ljava/lang/Object;Ljava/lang/String;Ljava/lang/Object;)Ljava/lang/Object;
You are seeing this because the application you are trying to deploy packages a different Google Guava version than the one GlassFish ships with. Unfortunately, GlassFish favours its own bundled version, which does not have the method com.google.common.base.Preconditions.checkNotNull(Ljava/lang/Object;Ljava/lang/String;Ljava/lang/Object;)Ljava/lang/Object;.
There are some (more or less involved) workarounds.
https://github.com/eclipse-ee4j/glassfish/issues/20850
Since you seem to have a WAR archive, this comment https://github.com/eclipse-ee4j/glassfish/issues/20850#issuecomment-422010251 suggests that it might be enough to disable classloader delegation in glassfish-web.xml (a minimal sketch follows below). Maybe you are lucky and this indeed helps with your WAR file.
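A sketch of that workaround, assuming you can add (or edit) WEB-INF/glassfish-web.xml inside the GeoServer WAR; whether this is enough for GeoServer is exactly what the linked comment leaves open:
<?xml version="1.0" encoding="UTF-8"?>
<!-- WEB-INF/glassfish-web.xml: prefer classes bundled in the WAR
     (e.g. its own Guava) over the ones GlassFish ships with. -->
<glassfish-web-app>
  <class-loader delegate="false"/>
</glassfish-web-app>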
Otherwise you'll have to switch to Payara, which seems to have had this bug fixed for some time.
I'm using the spark-submit command as below:
spark-submit --class com.example.hdfs.spark.RawDataAdapter --master yarn --deploy-mode cluster --jars /home/hadoop/emr/deployment/server/emr-core-1.0-SNAPSHOT.jar home/hadoop/emr-spark-1.0-SNAPSHOT.jar hdfs://111.11.11.111:8020/user/hdfsinputfile.zip 8000
However, it gives me the error java.lang.NoClassDefFoundError: com/example/emr/parser/IParser3, even though IParser3.class is present in emr-core-1.0-SNAPSHOT.jar. I don't understand why it throws that error. I tried several approaches but couldn't succeed. How can I resolve this?
I am able to run the same command in client mode and also as a standalone Spark application. I get this error only in yarn cluster mode.
Exception from container-launch. Container id: container_e37_1526066605784_0014_02_000001 Exit code: 15 Container exited with a non-zero exit code 15. Error file: prelaunch.err. Last 4096 bytes of prelaunch.err : Last 4096 bytes of stderr : g.ClassLoader.defineClass(ClassLoader.java:763) at java.lang.ClassLoader.defineClass(ClassLoader.java:642) at com.example.hdfs.spark.utils.SimpleClassLoader.loadJarFile(SimpleClassLoader.java:126) at com.example.hdfs.spark.utils.SimpleClassLoader.(SimpleClassLoader.java:38) at com.example.hdfs.spark.input RawInputFormat.loadPlugins(RawInputFormat.java:71) at com.example.hdfs.spark.RawDataAdapter.run(RawDataAdapter.java:54) at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76) at com.example.hdfs.spark.RawDataAdapter.main(RawDataAdapter.java:33) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.spark.deploy.yarn.ApplicationMaster$anon$3.run(ApplicationMaster.scala:646) 18/05/14 14:00:13 ERROR ApplicationMaster: Uncaught exception: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205) at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:423) at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:282) at org.apache.spark.deploy.yarn.ApplicationMaster$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:768) at org.apache.spark.deploy.SparkHadoopUtil$anon$2.run(SparkHadoopUtil.scala:67) at org.apache.spark.deploy.SparkHadoopUtil$anon$2.run(SparkHadoopUtil.scala:66) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869) at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:66) at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:766) at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala) Caused by: java.util.concurrent.ExecutionException: Boxed Error at scala.concurrent.impl.Promise$.resolver(Promise.scala:55) at scala.concurrent.impl.Promise$.scala$concurrent$impl$Promise$resolveTry(Promise.scala:47) at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:244) at scala.concurrent.Promise$class.tryFailure(Promise.scala:112) at scala.concurrent.impl.Promise$DefaultPromise.tryFailure(Promise.scala:153) at org.apache.spark.deploy.yarn.ApplicationMaster$anon$3.run(ApplicationMaster.scala:664) Caused by: java.lang.NoClassDefFoundError: com/example/emr/parser/IParser3 at java.lang.ClassLoader.defineClass1(Native Method) at java.lang.ClassLoader.defineClass(ClassLoader.java:763) at java.lang.ClassLoader.defineClass(ClassLoader.java:642) at com.example.hdfs.spark.utils.SimpleClassLoader.findClass(SimpleClassLoader.java:152) at java.lang.ClassLoader.loadClass(ClassLoader.java:424) at java.lang.ClassLoader.loadClass(ClassLoader.java:357) at java.lang.ClassLoader.defineClass1(Native Method) at java.lang.ClassLoader.defineClass(ClassLoader.java:763) at java.lang.ClassLoader.defineClass(ClassLoader.java:642) at com.example.hdfs.spark.utils.SimpleClassLoader.loadJarFile(SimpleClassLoader.java:126) at 
com.example.hdfs.spark.utils.SimpleClassLoader.(SimpleClassLoader.java:38) at com.example.hdfs.spark.input.RawInputFormat.loadPlugins(RawInputFormat.java:71) at com.example.hdfs.spark.RawDataAdapter.run(RawDataAdapter.java:54) at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76) at com.example.hdfs.spark.RawDataAdapter.main(RawDataAdapter.java:33) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.spark.deploy.yarn.ApplicationMaster$anon$3.run(ApplicationMaster.scala:646) Failing this attempt. Failing the application.
Quoting from the Spark documentation:
http://spark.apache.org/docs/latest/running-on-yarn.html
In client mode, the driver runs in the client process, and the application master is only used for requesting resources from YARN.
In cluster mode, the Spark driver runs inside an application master process which is managed by YARN on the cluster, and the client can go away after initiating the application
So in cluster mode the jar is executed on whichever node YARN picks, so you can try these two approaches:
1) Copy the dependency jar to each node.
2) Copy the jar to the distributed file system (HDFS) and then use it (see the sketch below).
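A sketch of the second option, assuming the jar is uploaded to HDFS first; the host, port and paths mirror the ones in the question but are illustrative:
# Put the dependency jar on HDFS so every YARN container can fetch it.
hdfs dfs -put /home/hadoop/emr/deployment/server/emr-core-1.0-SNAPSHOT.jar hdfs://111.11.11.111:8020/user/libs/
# Reference the HDFS copy in --jars; in cluster mode Spark distributes it
# to the driver (ApplicationMaster) and the executors.
spark-submit --class com.example.hdfs.spark.RawDataAdapter \
  --master yarn --deploy-mode cluster \
  --jars hdfs://111.11.11.111:8020/user/libs/emr-core-1.0-SNAPSHOT.jar \
  /home/hadoop/emr-spark-1.0-SNAPSHOT.jar \
  hdfs://111.11.11.111:8020/user/hdfsinputfile.zip 8000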
For more details you can have a look at:
https://spark.apache.org/docs/latest/submitting-applications.html#advanced-dependency-management
I'm running the Pentaho 5.3 BI Server on a Red Hat machine. Yesterday there was a power failure, and when the power came back and I tried to start the BI Server, it gives:
HTTP Status 404 -
type Status report
message
description The requested resource is not available.
Apache Tomcat/6.0.411
I checked pentaho.log:
2015-03-24 12:11:23,595 ERROR [org.pentaho.platform.osgi.OSGIBoot] Error installing Bundle
org.osgi.framework.BundleException: Bundle symbolic name and version are not unique: org.objectweb.asm.all:4.0.0
at org.apache.felix.framework.BundleImpl.createRevision(BundleImpl.java:1233)
at org.apache.felix.framework.BundleImpl.<init>(BundleImpl.java:96)
at org.apache.felix.framework.Felix.installBundle(Felix.java:2899)
at org.apache.felix.framework.BundleContextImpl.installBundle(BundleContextImpl.java:165)
at org.apache.felix.framework.BundleContextImpl.installBundle(BundleContextImpl.java:138)
at org.pentaho.platform.osgi.OSGIBoot.startup(OSGIBoot.java:137)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:421)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:391)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.access$000(PentahoSystem.java:77)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:343)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:340)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:391)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:340)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:311)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:212)
at org.pentaho.platform.web.http.context.SolutionContextListener.contextInitialized(SolutionContextListener.java:135)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4210)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4709)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:583)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:675)
at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:601)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:822)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
2015-03-24 12:11:23,609 ERROR [org.pentaho.platform.osgi.OSGIBoot] Error installing Bundle
org.osgi.framework.BundleException: Bundle symbolic name and version are not unique: com.springsource.org.apache.commons.logging:1.1.1
at org.apache.felix.framework.BundleImpl.createRevision(BundleImpl.java:1233)
at org.apache.felix.framework.BundleImpl.<init>(BundleImpl.java:96)
at org.apache.felix.framework.Felix.installBundle(Felix.java:2899)
at org.apache.felix.framework.BundleContextImpl.installBundle(BundleContextImpl.java:165)
at org.apache.felix.framework.BundleContextImpl.installBundle(BundleContextImpl.java:138)
at org.pentaho.platform.osgi.OSGIBoot.startup(OSGIBoot.java:137)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:421)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:391)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.access$000(PentahoSystem.java:77)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:343)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:340)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:391)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:340)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:311)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:212)
at org.pentaho.platform.web.http.context.SolutionContextListener.contextInitialized(SolutionContextListener.java:135)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4210)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4709)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:583)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:675)
at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:601)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:822)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
How do I resolve this? Help please.
You should clean your OSGi cache (a sketch of the commands follows below):
1) Stop the Pentaho server.
2) Go to /biserver-ce/pentaho-solutions/system/osgi/cache and delete everything.
3) Start the Pentaho server.
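A minimal sketch of those steps, assuming a biserver-ce install under /opt and the stock stop/start scripts (adjust paths to your installation):
# 1) Stop the Pentaho BI Server
/opt/biserver-ce/stop-pentaho.sh
# 2) Wipe the OSGi bundle cache so the bundles are re-installed cleanly on next start
rm -rf /opt/biserver-ce/pentaho-solutions/system/osgi/cache/*
# 3) Start the BI Server again
/opt/biserver-ce/start-pentaho.sh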
While creating a JDBC connection between SpagoBI Studio and Hive (CDH5/CDH4), this is my log:
eclipse.buildId=unknown
java.version=1.7.0_45
java.vendor=Oracle Corporation
BootLoader constants: OS=linux, ARCH=x86_64, WS=gtk, NL=en_US
Command-line arguments: -os linux -ws gtk -arch x86_64 -consoleLog
This is a continuation of log file /home/cloudera/SpagoBIStudio_4.2.0_linux64/workspace/.metadata/.bak_0.log
Created Time: 2014-06-30 01:38:31.422
Error
Mon Jun 30 02:51:46 PDT 2014
Impossible to generate metamodel
java.lang.RuntimeException: Impossible to initialize the model
at it.eng.spagobi.meta.editor.multi.wizards.SpagoBIModelEditorWizard.createModel(SpagoBIModelEditorWizard.java:241)
at it.eng.spagobi.meta.editor.multi.wizards.SpagoBIModelEditorWizard.performFinish(SpagoBIModelEditorWizard.java:217)
at org.eclipse.jface.wizard.WizardDialog.finishPressed(WizardDialog.java:811)
at org.eclipse.jface.wizard.WizardDialog.buttonPressed(WizardDialog.java:430)
at org.eclipse.jface.dialogs.Dialog$2.widgetSelected(Dialog.java:624)
at org.eclipse.swt.widgets.TypedListener.handleEvent(TypedListener.java:234)
at org.eclipse.swt.widgets.EventTable.sendEvent(EventTable.java:84)
at org.eclipse.swt.widgets.Widget.sendEvent(Widget.java:1258)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Display.java:3540)
at org.eclipse.swt.widgets.Display.readAndDispatch(Display.java:3161)
at org.eclipse.jface.window.Window.runEventLoop(Window.java:825)
at org.eclipse.jface.window.Window.open(Window.java:801)
at it.eng.spagobi.studio.core.views.actionProvider.ResourceNavigatorActionProvider$13.run(ResourceNavigatorActionProvider.java:411)
at org.eclipse.jface.action.Action.runWithEvent(Action.java:498)
at org.eclipse.jface.action.ActionContributionItem.handleWidgetSelection(ActionContributionItem.java:584)
at org.eclipse.jface.action.ActionContributionItem.access$2(ActionContributionItem.java:501)
at org.eclipse.jface.action.ActionContributionItem$5.handleEvent(ActionContributionItem.java:411)
at org.eclipse.swt.widgets.EventTable.sendEvent(EventTable.java:84)
at org.eclipse.swt.widgets.Widget.sendEvent(Widget.java:1258)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Display.java:3540)
at org.eclipse.swt.widgets.Display.readAndDispatch(Display.java:3161)
at org.eclipse.ui.internal.Workbench.runEventLoop(Workbench.java:2640)
at org.eclipse.ui.internal.Workbench.runUI(Workbench.java:2604)
at org.eclipse.ui.internal.Workbench.access$4(Workbench.java:2438)
at org.eclipse.ui.internal.Workbench$7.run(Workbench.java:671)
at org.eclipse.core.databinding.observable.Realm.runWithDefault(Realm.java:332)
at org.eclipse.ui.internal.Workbench.createAndRunWorkbench(Workbench.java:664)
at org.eclipse.ui.PlatformUI.createAndRunWorkbench(PlatformUI.java:149)
at org.eclipse.ui.internal.ide.application.IDEApplication.start(IDEApplication.java:115)
at org.eclipse.equinox.internal.app.EclipseAppHandle.run(EclipseAppHandle.java:196)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.runApplication(EclipseAppLauncher.java:110)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.start(EclipseAppLauncher.java:79)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:369)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:179)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:620)
at org.eclipse.equinox.launcher.Main.basicRun(Main.java:575)
at org.eclipse.equinox.launcher.Main.run(Main.java:1408)
at org.eclipse.equinox.launcher.Main.main(Main.java:1384)
Caused by: java.lang.RuntimeException: Impossible to initialize the physical model
at it.eng.spagobi.meta.editor.multi.wizards.SpagoBIModelEditorWizard.createPhysicalModel(SpagoBIModelEditorWizard.java:340)
at it.eng.spagobi.meta.editor.multi.wizards.SpagoBIModelEditorWizard.createModel(SpagoBIModelEditorWizard.java:237)
... 41 more
Caused by: java.lang.reflect.InvocationTargetException
at org.eclipse.jface.operation.ModalContext.run(ModalContext.java:421)
at org.eclipse.jface.dialogs.ProgressMonitorDialog.run(ProgressMonitorDialog.java:507)
at it.eng.spagobi.meta.editor.multi.wizards.SpagoBIModelEditorWizard.createPhysicalModel(SpagoBIModelEditorWizard.java:295)
... 42 more
Caused by: java.lang.RuntimeException: Impossible to initialize physical model
at it.eng.spagobi.meta.initializer.PhysicalModelInitializer.initialize(PhysicalModelInitializer.java:121)
at it.eng.spagobi.meta.editor.multi.wizards.SpagoBIModelEditorWizard$1.run(SpagoBIModelEditorWizard.java:321)
at org.eclipse.jface.operation.ModalContext$ModalContextThread.run(ModalContext.java:121)
Caused by: java.sql.SQLException: Method not supported
at org.apache.hive.jdbc.HiveDatabaseMetaData.getIdentifierQuoteString(HiveDatabaseMetaData.java:342)
at it.eng.spagobi.meta.initializer.PhysicalModelInitializer.initialize(PhysicalModelInitializer.java:112)
... 2 more
Some of the related questions ("hive method not supported", "java.sql.SQLException: Method not supported") say:
Your original error results from using Cloudera's Hive driver which does not implement many JDBC API methods that PDI needs to function properly. That's why we have our own version of the hive driver in the cdh4 folder (called hive-0.7.0-pentaho-1.0.2 or something like that). Simply put, there should be no JARs copied from your cluster to your PDI client, the cdh4 folder already contains the correct versions of all necessary JARs.
But I didn't find any SpagoBI Hive driver for CDH5/CDH4. I am able to connect to Hive, but I get the above error in Studio when accessing a table; I am able to access the table on SpagoBI Server. Any help? Thanks.
You have to use Cloudera's .jar files in order to connect to CDH. Have a look here and put all of the .jar files on the CLASSPATH. If running SpagoBI Server, put them in the /lib folder of the SpagoBI (Tomcat) server.
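A sketch of that last step, assuming the Cloudera Hive JDBC driver jars have already been downloaded somewhere locally and SpagoBI runs on its bundled Tomcat (both paths are illustrative):
# Copy the Cloudera Hive JDBC driver and its dependency jars into the
# lib folder of the SpagoBI (Tomcat) server, then restart it.
cp /opt/cloudera-hive-jdbc/*.jar /opt/SpagoBI-Server/lib/
/opt/SpagoBI-Server/bin/shutdown.sh && /opt/SpagoBI-Server/bin/startup.sh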