Impossible to generate metamodel in SpagoBI Studio with Hive (CDH5/CDH4) - sql

While creating a JDBC connection between SpagoBI Studio and Hive (CDH5/CDH4), I get the following error. This is my log:
eclipse.buildId=unknown
java.version=1.7.0_45
java.vendor=Oracle Corporation
BootLoader constants: OS=linux, ARCH=x86_64, WS=gtk, NL=en_US
Command-line arguments: -os linux -ws gtk -arch x86_64 -consoleLog
This is a continuation of log file /home/cloudera/SpagoBIStudio_4.2.0_linux64/workspace/.metadata/.bak_0.log
Created Time: 2014-06-30 01:38:31.422
Error
Mon Jun 30 02:51:46 PDT 2014
Impossible to generate metamodel
java.lang.RuntimeException: Impossible to initialize the model
at it.eng.spagobi.meta.editor.multi.wizards.SpagoBIModelEditorWizard.createModel(SpagoBIModelEditorWizard.java:241)
at it.eng.spagobi.meta.editor.multi.wizards.SpagoBIModelEditorWizard.performFinish(SpagoBIModelEditorWizard.java:217)
at org.eclipse.jface.wizard.WizardDialog.finishPressed(WizardDialog.java:811)
at org.eclipse.jface.wizard.WizardDialog.buttonPressed(WizardDialog.java:430)
at org.eclipse.jface.dialogs.Dialog$2.widgetSelected(Dialog.java:624)
at org.eclipse.swt.widgets.TypedListener.handleEvent(TypedListener.java:234)
at org.eclipse.swt.widgets.EventTable.sendEvent(EventTable.java:84)
at org.eclipse.swt.widgets.Widget.sendEvent(Widget.java:1258)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Display.java:3540)
at org.eclipse.swt.widgets.Display.readAndDispatch(Display.java:3161)
at org.eclipse.jface.window.Window.runEventLoop(Window.java:825)
at org.eclipse.jface.window.Window.open(Window.java:801)
at it.eng.spagobi.studio.core.views.actionProvider.ResourceNavigatorActionProvider$13.run(ResourceNavigatorActionProvider.java:411)
at org.eclipse.jface.action.Action.runWithEvent(Action.java:498)
at org.eclipse.jface.action.ActionContributionItem.handleWidgetSelection(ActionContributionItem.java:584)
at org.eclipse.jface.action.ActionContributionItem.access$2(ActionContributionItem.java:501)
at org.eclipse.jface.action.ActionContributionItem$5.handleEvent(ActionContributionItem.java:411)
at org.eclipse.swt.widgets.EventTable.sendEvent(EventTable.java:84)
at org.eclipse.swt.widgets.Widget.sendEvent(Widget.java:1258)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Display.java:3540)
at org.eclipse.swt.widgets.Display.readAndDispatch(Display.java:3161)
at org.eclipse.ui.internal.Workbench.runEventLoop(Workbench.java:2640)
at org.eclipse.ui.internal.Workbench.runUI(Workbench.java:2604)
at org.eclipse.ui.internal.Workbench.access$4(Workbench.java:2438)
at org.eclipse.ui.internal.Workbench$7.run(Workbench.java:671)
at org.eclipse.core.databinding.observable.Realm.runWithDefault(Realm.java:332)
at org.eclipse.ui.internal.Workbench.createAndRunWorkbench(Workbench.java:664)
at org.eclipse.ui.PlatformUI.createAndRunWorkbench(PlatformUI.java:149)
at org.eclipse.ui.internal.ide.application.IDEApplication.start(IDEApplication.java:115)
at org.eclipse.equinox.internal.app.EclipseAppHandle.run(EclipseAppHandle.java:196)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.runApplication(EclipseAppLauncher.java:110)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.start(EclipseAppLauncher.java:79)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:369)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:179)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:620)
at org.eclipse.equinox.launcher.Main.basicRun(Main.java:575)
at org.eclipse.equinox.launcher.Main.run(Main.java:1408)
at org.eclipse.equinox.launcher.Main.main(Main.java:1384)
Caused by: java.lang.RuntimeException: Impossible to initialize the physical model
at it.eng.spagobi.meta.editor.multi.wizards.SpagoBIModelEditorWizard.createPhysicalModel(SpagoBIModelEditorWizard.java:340)
at it.eng.spagobi.meta.editor.multi.wizards.SpagoBIModelEditorWizard.createModel(SpagoBIModelEditorWizard.java:237)
... 41 more
Caused by: java.lang.reflect.InvocationTargetException
at org.eclipse.jface.operation.ModalContext.run(ModalContext.java:421)
at org.eclipse.jface.dialogs.ProgressMonitorDialog.run(ProgressMonitorDialog.java:507)
at it.eng.spagobi.meta.editor.multi.wizards.SpagoBIModelEditorWizard.createPhysicalModel(SpagoBIModelEditorWizard.java:295)
... 42 more
Caused by: java.lang.RuntimeException: Impossible to initialize physical model
at it.eng.spagobi.meta.initializer.PhysicalModelInitializer.initialize(PhysicalModelInitializer.java:121)
at it.eng.spagobi.meta.editor.multi.wizards.SpagoBIModelEditorWizard$1.run(SpagoBIModelEditorWizard.java:321)
at org.eclipse.jface.operation.ModalContext$ModalContextThread.run(ModalContext.java:121)
Caused by: java.sql.SQLException: Method not supported
at org.apache.hive.jdbc.HiveDatabaseMetaData.getIdentifierQuoteString(HiveDatabaseMetaData.java:342)
at it.eng.spagobi.meta.initializer.PhysicalModelInitializer.initialize(PhysicalModelInitializer.java:112)
... 2 more
One of the related questions, "hive method not supported" (java.sql.SQLException: Method not supported), says:
"Your original error results from using Cloudera's Hive driver which does not implement many JDBC API methods that PDI needs to function properly. That's why we have our own version of the hive driver in the cdh4 folder (called hive-0.7.0-pentaho-1.0.2 or something like that). Simply put, there should be no JARs copied from your cluster to your PDI client, the cdh4 folder already contains the correct versions of all necessary JARs."
But I didn't find any SpagoBI Hive driver for CDH5/CDH4. I am able to connect to Hive, but I get the above error in Studio when accessing a table, even though I can access the table on SpagoBI Server. Any help? Thanks.

You have to use Cloudera's .jar files in order to be able to connect to CDH. Have a look here and put all of the .jar files on the CLASSPATH. If you are running SpagoBI Server, put them in the /lib folder of the SpagoBI (Tomcat) installation.
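For example, with a parcel-based CDH install the copy could look roughly like this (the source and destination paths are assumptions, so adjust them to your CDH and SpagoBI layout):
# Copy the Hive JDBC driver and its dependencies from a CDH node into SpagoBI's Tomcat lib folder.
# Paths and jar names below are placeholders for a typical CDH parcel install.
cp /opt/cloudera/parcels/CDH/lib/hive/lib/hive-jdbc-*.jar /opt/spagobi/SpagoBI-Server/lib/
cp /opt/cloudera/parcels/CDH/lib/hive/lib/hive-service-*.jar /opt/spagobi/SpagoBI-Server/lib/
cp /opt/cloudera/parcels/CDH/lib/hadoop/hadoop-common-*.jar /opt/spagobi/SpagoBI-Server/lib/
# Restart the SpagoBI (Tomcat) server afterwards so the new jars are picked up.
For SpagoBI Studio, the same driver jars would presumably need to be available to the Studio connection definition as well, not just to the server.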

Related

WebLogic domain creation error through script in PuTTY

I am trying to create a WebLogic domain in silent mode through PuTTY. I have used the command below:
./config.sh -mode=silent -silent_xml=/home/ec2-user/createdomain.xml
I am getting the following error message while executing it:
Exception in thread "Thread-1" java.lang.IllegalStateException: No able to create the instance of the template catalog class com.oracle.cie.domain.template.catalog.impl.GlobalTemplateCat
at com.oracle.cie.domain.template.catalog.TemplateCatalogFactory.createGlobalTemplateCatalog(TemplateCatalogFactory.java:138)
at com.oracle.cie.domain.template.catalog.TemplateCatalogFactory.getGlobalCatalog(TemplateCatalogFactory.java:78)
at com.oracle.cie.domain.template.catalog.TemplateCatalogFactory.getGlobalCatalog(TemplateCatalogFactory.java:33)
at com.oracle.cie.wizard.domain.silent.tasks.LoadTemplateCatalogTask$1.run(LoadTemplateCatalogTask.java:23)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.oracle.cie.domain.template.catalog.TemplateCatalogFactory.createGlobalTemplateCatalog(TemplateCatalogFactory.java:133)
... 4 more
Caused by: com.oracle.cie.domain.env.EnvironmentServiceException: Failed to get inventory for /home/ec2-user/oracle/middleware/oracle_common/common/bin
at com.oracle.cie.domain.env.EnvironmentServiceImpl.init(EnvironmentServiceImpl.java:425)
at com.oracle.cie.domain.env.EnvironmentServiceImpl.<init>(EnvironmentServiceImpl.java:89)
at com.oracle.cie.domain.env.EnvironmentServiceImpl.getInstance(EnvironmentServiceImpl.java:364)
at com.oracle.cie.domain.env.EnvironmentServiceFactory.getEnvironmentService(EnvironmentServiceFactory.java:35)
at com.oracle.cie.domain.template.catalog.impl.OracleHomeLocator.getProductInstalDirs(OracleHomeLocator.java:31)
at com.oracle.cie.domain.template.catalog.impl.GlobalTemplateCat.populateProductCatalogs(GlobalTemplateCat.java:446)
at com.oracle.cie.domain.template.catalog.impl.GlobalTemplateCat.<init>(GlobalTemplateCat.java:90)
at com.oracle.cie.domain.template.catalog.impl.GlobalTemplateCat.<init>(GlobalTemplateCat.java:83)
... 9 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.oracle.cie.common.ReflectionHelper.process(ReflectionHelper.java:48)
at com.oracle.cie.domain.env.EnvironmentServiceImpl.init(EnvironmentServiceImpl.java:384)
... 16 more
Caused by: com.oracle.cie.gdr.external.InventoryException: com.oracle.cie.gdr.utils.GdrException: The gdr meta-data directory /home/ec2-user/oracle/middleware/oracle_common/common/bin/inventory is invalid or does not exist.
at com.oracle.cie.gdr.external.impl.OracleHomeInventoryImpl.<init>(OracleHomeInventoryImpl.java:55)
at com.oracle.cie.gdr.external.impl.OracleHomeInventoryFactory.createInventory(OracleHomeInventoryFactory.java:60)
at com.oracle.cie.gdr.external.InventoryFactory.getOracleHomeInventory(InventoryFactory.java:99)
... 22 more
Caused by: com.oracle.cie.gdr.utils.GdrException: The gdr meta-data directory /home/ec2-user/oracle/middleware/oracle_common/common/bin/inventory is invalid or does not exist.
at com.oracle.cie.gdr.MetaDataHome.init(MetaDataHome.java:206)
at com.oracle.cie.gdr.MetaDataHome.<init>(MetaDataHome.java:188)
at com.oracle.cie.gdr.MetaDataHome.<init>(MetaDataHome.java:172)
at com.oracle.cie.gdr.MetaDataHome.<init>(MetaDataHome.java:157)
at com.oracle.cie.gdr.MetaDataHome.<init>(MetaDataHome.java:144)
at com.oracle.cie.gdr.MetaDataHome.<init>(MetaDataHome.java:86)
at com.oracle.cie.gdr.Home.getMetaDataHome(Home.java:619)
Which WebLogic version are you using? I have not seen a silent script to create domains for a while. If you are trying to do this on WebLogic 12c, it won't work, as this kind of script was only available for older versions such as 8 and 9, as far as I remember.
If you want to automate domain provisioning for versions such as 12c, you should use a newer approach. Here I am proposing two options.
You can use Ansible, WLST and Python to create the domain. You can see an example here: https://github.com/textanalyticsman/ansible-soa
You can use WebLogic Deploy Tooling, an open-source tool provided by Oracle: https://github.com/oracle/weblogic-deploy-tooling (a minimal invocation is sketched below).
The combination of WebLogic Deploy Tooling and Ansible is also a good option, as shown in https://github.com/textanalyticsman/ansible-soa-wldt
You can also try the WebLogic Kubernetes Operator: https://oracle.github.io/weblogic-kubernetes-operator/userguide/managing-domains/domain-resource/
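As a rough sketch of the WebLogic Deploy Tooling option (the Oracle home, domain parent and model file name here are placeholders, not taken from your environment):
# Create a 12c domain from a declarative model with WebLogic Deploy Tooling (WDT).
# ORACLE_HOME, /u01/domains and model.yaml are assumptions for illustration only.
export ORACLE_HOME=/u01/oracle/middleware
weblogic-deploy/bin/createDomain.sh \
  -oracle_home "$ORACLE_HOME" \
  -domain_parent /u01/domains \
  -domain_type WLS \
  -model_file model.yaml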

Flink 1.10 not connecting to MinIO (S3)

I'm running Flink and MinIO locally via Docker Compose.
When I try to connect to MinIO, I always get the following error:
caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 's3'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded.
It seems that the plugin isn't loaded correctly.
My Flink config (flink-conf.yaml):
state.backend: filesystem
s3.endpoint: http://minio:9000
s3.path.style.access: true
s3.access-key: minio
s3.secret-key: minio123
presto.s3.access-key: minio
presto.s3.secret-key: minio123
presto.s3.endpoint: http://minio:9000
presto.s3.path-style-access: true
I've copied the required plugin as follows:
mkdir -p plugins/s3-fs-presto
cp opt/flink-s3-fs-presto-*.jar plugins/s3-fs-presto
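For reference, the plugin has to end up under the plugins directory of the Flink distribution inside every container (jobmanager and taskmanagers). A quick check, assuming the official flink Docker image where the distribution lives under /opt/flink (the container name is a placeholder):
# Expected layout inside each Flink container, e.g.:
#   /opt/flink/plugins/s3-fs-presto/flink-s3-fs-presto-1.10.0.jar
docker exec <taskmanager-container> ls /opt/flink/plugins/s3-fs-presto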
Any suggestions?
Stack trace:
The program finished with the following exception:
org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 7ae6657256719d8c32d76ba113fb35f0)
at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
at org.apache.flink.client.program.ContextEnvironment.execute(ContextEnvironment.java:62)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:820)
at org.apache.flink.api.java.DataSet.collect(DataSet.java:413)
at org.apache.flink.api.java.DataSet.print(DataSet.java:1652)
at org.apache.flink.examples.java.wordcount.WordCount.main(WordCount.java:96)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:604)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:466)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:274)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:746)
at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:273)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:205)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1008)
at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1081)
at org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1081)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:147)
at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
... 21 more
Caused by: java.io.IOException: Error opening the Input Split s3://test/test.txt [0,3243]: Could not find a file system implementation for scheme 's3'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded.
at org.apache.flink.api.common.io.FileInputFormat.open(FileInputFormat.java:824)
at org.apache.flink.api.common.io.DelimitedInputFormat.open(DelimitedInputFormat.java:470)
at org.apache.flink.api.common.io.DelimitedInputFormat.open(DelimitedInputFormat.java:47)
at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:173)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 's3'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded.
at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:450)
at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:362)
at org.apache.flink.api.common.io.FileInputFormat$InputSplitOpenThread.run(FileInputFormat.java:995)
Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Hadoop is not in the classpath/dependencies.
at org.apache.flink.core.fs.UnsupportedSchemeFactory.create(UnsupportedSchemeFactory.java:58)
at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:446)
... 2 more

Apache Tomcat unable to create a JDBC connection to Apache Drill

I am using Mondrian OLAP for analytics. We've been using MySQL but have decided to migrate to Apache Drill.
I have configured Apache Drill and I am able to connect to it from external clients like SQuirreL, but when I try to connect via Saiku, I get the following stack trace in catalina.out:
INFO: Deploying web application directory saiku
name:Lists
driver:mondrian.olap4j.MondrianOlap4jDriver
url:jdbc:mondrian:Jdbc=jdbc:drill:drillbit=localhost:31010;schema=Saiku.root;Catalog=../webapps/saiku/WEB-INF/classes/imi_India/INDIA.xml;JdbcDrivers=org.apache.drill.jdbc.Driver;
23:59:21,878 WARN [RolapUtil] Mondrian: Warning: JDBC driver sun.jdbc.odbc.JdbcOdbcDriver not found
23:59:21,879 WARN [RolapUtil] Mondrian: Warning: JDBC driver oracle.jdbc.OracleDriver not found
23:59:22,162 WARN [DrillMetrics] Removing old metric since name matched newly registered metric. Metric name: drill.allocator.root.used
23:59:22,162 WARN [DrillMetrics] Removing old metric since name matched newly registered metric. Metric name: drill.allocator.root.peak
23:59:25,246 WARN [ThreadLocalRandom] Failed to generate a seed from SecureRandom within 3 seconds. Not enough entrophy?
mondrian.olap.MondrianException: Mondrian Error:Internal error: Error while creating SQL connection: Jdbc=jdbc:drill:drillbit=localhost:31010; JdbcUser=admin; JdbcPassword=admin
at mondrian.resource.MondrianResource$_Def0.ex(MondrianResource.java:972)
at mondrian.olap.Util.newInternal(Util.java:2403)
at mondrian.olap.Util.newError(Util.java:2419)
at mondrian.rolap.RolapConnection.<init>(RolapConnection.java:246)
at mondrian.rolap.RolapSchema.<init>(RolapSchema.java:188)
at mondrian.rolap.RolapSchema.<init>(RolapSchema.java:216)
at mondrian.rolap.RolapSchemaPool.get(RolapSchemaPool.java:214)
at mondrian.rolap.RolapSchemaPool.get(RolapSchemaPool.java:66)
at mondrian.rolap.RolapConnection.<init>(RolapConnection.java:160)
at mondrian.rolap.RolapConnection.<init>(RolapConnection.java:90)
at mondrian.olap.DriverManager.getConnection(DriverManager.java:112)
at mondrian.olap.DriverManager.getConnection(DriverManager.java:68)
at mondrian.olap4j.MondrianOlap4jConnection.<init>(MondrianOlap4jConnection.java:153)
at mondrian.olap4j.FactoryJdbc4Plus$AbstractConnection.<init>(FactoryJdbc4Plus.java:323)
at mondrian.olap4j.FactoryJdbc41Impl$MondrianOlap4jConnectionJdbc41.<init>(FactoryJdbc41Impl.java:118)
at mondrian.olap4j.FactoryJdbc41Impl.newConnection(FactoryJdbc41Impl.java:32)
at mondrian.olap4j.MondrianOlap4jDriver.connect(MondrianOlap4jDriver.java:134)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at org.saiku.datasources.connection.SaikuOlapConnection.connect(SaikuOlapConnection.java:75)
at org.saiku.datasources.connection.SaikuOlapConnection.connect(SaikuOlapConnection.java:46)
at org.saiku.datasources.connection.SaikuConnectionFactory.getConnection(SaikuConnectionFactory.java:29)
at org.saiku.web.impl.SecurityAwareConnectionManager.connect(SecurityAwareConnectionManager.java:284)
at org.saiku.web.impl.SecurityAwareConnectionManager.getInternalConnection(SecurityAwareConnectionManager.java:99)
at org.saiku.datasources.connection.AbstractConnectionManager.getConnection(AbstractConnectionManager.java:110)
at org.saiku.datasources.connection.AbstractConnectionManager.getAllConnections(AbstractConnectionManager.java:136)
at org.saiku.web.impl.SecurityAwareConnectionManager.init(SecurityAwareConnectionManager.java:58)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1536)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1477)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1409)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:291)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:288)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:190)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:574)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:895)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:425)
at org.springframework.web.context.ContextLoader.createWebApplicationContext(ContextLoader.java:276)
at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:197)
at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:47)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:3972)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4467)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:791)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:771)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:526)
at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:1041)
at org.apache.catalina.startup.HostConfig.deployDirectories(HostConfig.java:964)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1277)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:321)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:119)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1053)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:722)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:443)
at org.apache.catalina.core.StandardService.start(StandardService.java:516)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:710)
at org.apache.catalina.startup.Catalina.start(Catalina.java:593)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Caused by: org.apache.commons.dbcp.SQLNestedException: Cannot get a connection, pool error Could not create a validated object, cause: null
at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:104)
at mondrian.rolap.RolapConnection.<init>(RolapConnection.java:226)
... 66 more
Caused by: java.util.NoSuchElementException: Could not create a validated object, cause: null
at org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1008)
at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:96)
... 67 more
I have added the following dependency to the application's pom.xml:
<dependency>
  <groupId>org.apache.drill.exec</groupId>
  <artifactId>drill-jdbc</artifactId>
  <version>1.1.0</version>
</dependency>
and used the following connection string:
jdbc:drill:drillbit=localhost:31010;schema=Saiku.root
I also have a bunch of Parquet files in the HDFS directory /dashboard/saiku3//.parquet
There are 17 such tables. I have created a storage engine called Saiku with root storage in /saiku3 and created views for all 17 Hadoop directories. Those directories were imported from MySQL to HDFS as Parquet via Sqoop.
I have been stuck here for hours and couldn't find a solution. Am I doing something wrong here?
Thanks in advance.
"Not enough entropy" ? What version of Java is this? You seem to be using /dev/random iso. /dev/urandom. Anyway, you can change this with -Djava.security.egd if I'm not mistaken.
Check if you have drill-jdbc-all-1.8.0.jar in tomcat/lib
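Something along these lines, assuming a default Drill 1.8.0 distribution layout (the paths are placeholders):
# See whether the all-in-one Drill JDBC driver is already on Tomcat's classpath...
ls $CATALINA_HOME/lib | grep drill-jdbc-all
# ...and if not, copy it from the Drill distribution (source path is an assumption).
cp /opt/apache-drill-1.8.0/jars/jdbc-driver/drill-jdbc-all-1.8.0.jar $CATALINA_HOME/lib/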

Can't start IDEA 14.1.1

I have installed IntelliJ IDEA 14.1.1.
After going through the setup wizard on first run, this error was thrown:
Internal error. Please report to https://youtrack.jetbrains.com
java.lang.RuntimeException: com.intellij.ide.plugins.PluginManager$StartupAbortedException: Fatal error initializing 'org.intellij.images.fileTypes.impl.ImageFileTypeManagerImpl'
at com.intellij.idea.IdeaApplication.run(IdeaApplication.java:178)
at com.intellij.idea.MainImpl$1$1$1.run(MainImpl.java:52)
at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:312)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:738)
at java.awt.EventQueue.access$300(EventQueue.java:103)
at java.awt.EventQueue$3.run(EventQueue.java:699)
at java.awt.EventQueue$3.run(EventQueue.java:697)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:76)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:708)
at com.intellij.ide.IdeEventQueue.dispatchEvent(IdeEventQueue.java:362)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:242)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:161)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:150)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:146)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:138)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:91)
Caused by: com.intellij.ide.plugins.PluginManager$StartupAbortedException: Fatal error initializing 'org.intellij.images.fileTypes.impl.ImageFileTypeManagerImpl'
at com.intellij.ide.plugins.PluginManager.handleComponentError(PluginManager.java:248)
at com.intellij.openapi.fileTypes.impl.FileTypeManagerImpl.initStandardFileTypes(FileTypeManagerImpl.java:273)
at com.intellij.openapi.fileTypes.impl.FileTypeManagerImpl.<init>(FileTypeManagerImpl.java:230)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.picocontainer.defaults.InstantiatingComponentAdapter.newInstance(InstantiatingComponentAdapter.java:193)
at org.picocontainer.defaults.ConstructorInjectionComponentAdapter$1.run(ConstructorInjectionComponentAdapter.java:220)
at org.picocontainer.defaults.ThreadLocalCyclicDependencyGuard.observe(ThreadLocalCyclicDependencyGuard.java:53)
at org.picocontainer.defaults.ConstructorInjectionComponentAdapter.getComponentInstance(ConstructorInjectionComponentAdapter.java:248)
at com.intellij.util.pico.ConstructorInjectionComponentAdapter.getComponentInstance(ConstructorInjectionComponentAdapter.java:58)
at com.intellij.openapi.components.impl.ComponentManagerImpl$ComponentConfigComponentAdapter$1.getComponentInstance(ComponentManagerImpl.java:550)
at com.intellij.openapi.components.impl.ComponentManagerImpl$ComponentConfigComponentAdapter.getComponentInstance(ComponentManagerImpl.java:610)
at com.intellij.util.pico.DefaultPicoContainer.getLocalInstance(DefaultPicoContainer.java:245)
at com.intellij.util.pico.DefaultPicoContainer.getComponentInstance(DefaultPicoContainer.java:211)
at com.intellij.openapi.components.impl.ComponentManagerImpl.createComponent(ComponentManagerImpl.java:125)
at com.intellij.openapi.application.impl.ApplicationImpl.createComponent(ApplicationImpl.java:359)
at com.intellij.openapi.components.impl.ComponentManagerImpl.createComponents(ComponentManagerImpl.java:116)
at com.intellij.openapi.components.impl.ComponentManagerImpl.init(ComponentManagerImpl.java:87)
at com.intellij.openapi.components.impl.stores.ApplicationStoreImpl.load(ApplicationStoreImpl.java:101)
at com.intellij.openapi.application.impl.ApplicationImpl.load(ApplicationImpl.java:504)
at com.intellij.openapi.application.impl.ApplicationImpl.load(ApplicationImpl.java:486)
at com.intellij.idea.IdeaApplication.run(IdeaApplication.java:170)
... 16 more
Caused by: java.util.ServiceConfigurationError: javax.imageio.spi.ImageWriterSpi: Provider com.sun.media.imageioimpl.plugins.jpeg.CLibJPEGImageWriterSpi could not be instantiated
at java.util.ServiceLoader.fail(ServiceLoader.java:224)
at java.util.ServiceLoader.access$100(ServiceLoader.java:181)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:377)
at java.util.ServiceLoader$1.next(ServiceLoader.java:445)
at javax.imageio.spi.IIORegistry.registerApplicationClasspathSpis(IIORegistry.java:210)
at javax.imageio.spi.IIORegistry.<init>(IIORegistry.java:138)
at javax.imageio.spi.IIORegistry.getDefaultInstance(IIORegistry.java:159)
at javax.imageio.ImageIO.<clinit>(ImageIO.java:65)
at org.intellij.images.fileTypes.impl.ImageFileTypeManagerImpl.createFileTypes(ImageFileTypeManagerImpl.java:80)
at com.intellij.openapi.fileTypes.impl.FileTypeManagerImpl.initStandardFileTypes(FileTypeManagerImpl.java:270)
... 38 more
Caused by: java.lang.IllegalArgumentException: vendorName == null!
at javax.imageio.spi.IIOServiceProvider.<init>(IIOServiceProvider.java:76)
at javax.imageio.spi.ImageReaderWriterSpi.<init>(ImageReaderWriterSpi.java:231)
at javax.imageio.spi.ImageWriterSpi.<init>(ImageWriterSpi.java:213)
at com.sun.media.imageioimpl.plugins.jpeg.CLibJPEGImageWriterSpi.<init>(CLibJPEGImageWriterSpi.java:84)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at java.lang.Class.newInstance(Class.java:379)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:373)
... 45 more
Note:
Source: https://www.jetbrains.com/idea/download/
There are similar problems to be found on Google, but mine is different from them.
I'm using JDK 7 on Ubuntu 14.04 LTS.
Simply deleting ~/Library/Java/Extensions (which contains the offending JAI jar) does the job on a Mac.
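A slightly more cautious variant is to move the directory aside instead of deleting it, so the JAI jars can be restored later if some other application needs them:
# Move the user-level Java extensions out of IDEA's way, then relaunch IDEA.
mv ~/Library/Java/Extensions ~/Library/Java/Extensions.bak
# If another application complains later, simply move the directory back.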
Relevant issues:
https://youtrack.jetbrains.com/issue/IDEA-137147
http://youtrack.jetbrains.com/issue/IDEA-139178
A similar problem on OSX seems to have as its root cause the commonly used JAI extension jar files being mis-configured with a null vendorName. If you need the JAI extensions for some reason (for example, because your local GeoServer upgrade re-installs them in ~/Library/Java/Extensions), then a workaround is to create a ~/Library/Preferences/IdeaIC14/idea.vmoptions file that contains ONLY:
-Djava.ext.dirs=/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/ext
That seems to override OSX Java 6's default behavior of checking the user's ~/Library/Java/Extensions directory in addition to the system's extensions on application start up.
If you are using some other JRE version or operating system, it may help to adjust those paths, or to make sure that the Java extension directory used by the application that needs JAI is different from the one IntelliJ uses. At worst, you might install two JDKs/JREs and give each application a different JAVA_HOME value.
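A minimal way to create that vmoptions file from a terminal, assuming the default IdeaIC14 preferences location mentioned above:
# Create the per-user vmoptions file so IDEA only scans the system JRE's ext directory.
mkdir -p ~/Library/Preferences/IdeaIC14
echo "-Djava.ext.dirs=/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/ext" > ~/Library/Preferences/IdeaIC14/idea.vmoptions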

Jenkins build failed on OSX

I am trying to build my project using Jenkins to deploy the artifacts to Nexus. I have Jenkins set up on macOS.
Below is the error I am getting:
Parsing POMs
[maventest] $ /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/bin/java -Xmx512m -XX:MaxPermSize=128m -Dfile.encoding=UTF-8 -cp /Users/Shared/Jenkins/Home/plugins/maven-plugin/WEB-INF/lib/maven3-agent-1.3.jar:/usr/share/maven/boot/plexus-classworlds-2.4.jar org.jvnet.hudson.maven3.agent.Maven3Main /usr/share/maven /Users/Shared/Jenkins/Home/war/WEB-INF/lib/remoting-2.26.jar /Users/Shared/Jenkins/Home/plugins/maven-plugin/WEB-INF/lib/maven3-interceptor-1.3.jar 59985
<===[JENKINS REMOTING CAPACITY]===>channel started
channel stopped
ERROR: Failed to parse POMs
java.io.IOException: Remote call on Channel to Maven [/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/bin/java, -Xmx512m, -XX:MaxPermSize=128m, -Dfile.encoding=UTF-8, -cp, /Users/Shared/Jenkins/Home/plugins/maven-plugin/WEB-INF/lib/maven3-agent-1.3.jar:/usr/share/maven/boot/plexus-classworlds-2.4.jar, org.jvnet.hudson.maven3.agent.Maven3Main, /usr/share/maven, /Users/Shared/Jenkins/Home/war/WEB-INF/lib/remoting-2.26.jar, /Users/Shared/Jenkins/Home/plugins/maven-plugin/WEB-INF/lib/maven3-interceptor-1.3.jar, 59985] failed
at hudson.remoting.Channel.call(Channel.java:727)
at hudson.maven.ProcessCache$MavenProcess.call(ProcessCache.java:156)
at hudson.maven.MavenModuleSetBuild$MavenModuleSetBuildExecution.doRun(MavenModuleSetBuild.java:770)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:586)
at hudson.model.Run.execute(Run.java:1593)
at hudson.maven.MavenModuleSetBuild.run(MavenModuleSetBuild.java:491)
at hudson.model.ResourceController.execute(ResourceController.java:88)
at hudson.model.Executor.run(Executor.java:247)
Caused by: java.lang.InternalError: Can't connect to window server - not enough permissions.
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1827)
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1724)
at java.lang.Runtime.loadLibrary0(Runtime.java:823)
at java.lang.System.loadLibrary(System.java:1045)
at sun.security.action.LoadLibraryAction.run(LoadLibraryAction.java:50)
at java.security.AccessController.doPrivileged(Native Method)
at java.awt.Toolkit.loadLibraries(Toolkit.java:1605)
at java.awt.Toolkit.(Toolkit.java:1627)
at java.awt.Color.(Color.java:263)
at hudson.util.ColorPalette.(ColorPalette.java:39)
at hudson.model.BallColor.(BallColor.java:56)
at hudson.model.Result.(Result.java:51)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:171)
at com.sun.proxy.$Proxy8.(Unknown Source)
at sun.reflect.GeneratedSerializationConstructorAccessor41.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
at java.io.ObjectStreamClass.newInstance(ObjectStreamClass.java:929)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1759)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1969)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
at java.util.HashMap.readObject(HashMap.java:1030)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:979)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1871)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1969)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
at hudson.remoting.UserRequest.deserialize(UserRequest.java:182)
at hudson.remoting.UserRequest.perform(UserRequest.java:98)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
at java.lang.Thread.run(Thread.java:680)
Finished: FAILURE
I already tried the solution below, but it didn't work:
http://jenkins-ci.361315.n4.nabble.com/JIRA-Created-HUDSON-5584-java-io-IOException-Remote-call-on-Channel-to-Maven-td1475049.html
Configurations I have:
MAVEN_OPTS: -Xmx1024m -XX:MaxPermSize=128m -Dfile.encoding=UTF-8 -Djava.awt.headless=true
Output of ps -ef | grep java: /usr/bin/java -Djava.awt.headless=true -jar /Applications/Jenkins/jenkins.war
Build command: clean deploy -DaltDeploymentRepository=central::default::http://<user>:<pwd>#<host>:<port>/nexus/content/groups/public/
The solution I used was to move to Java 7. What you want to do is add JDK 1.7 to Jenkins. Following these steps, I was able to successfully build my project:
Go to the Oracle Java page and download the 1.7.0_51 JDK for Mac.
Open the dmg and run the installer.
On the Mac, this installs the JDK to /Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/
In Jenkins, go to 'Manage Jenkins' > 'Configure System'.
Under the JDK heading, click the 'JDK Installations' button.
Under Name, type 'JDK 1.7.0_51'.
For JAVA_HOME, type '/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/' (one way to verify this path is sketched after these steps).
Select Save.
Go to your project and select Configure.
You should now have a JDK drop-down near the top of the page.
Select the JDK you just configured under 'Manage Jenkins'.
Run the build.
After doing this, my build ran successfully without the 'Can't connect to window server - not enough permissions' error.
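To double-check the JAVA_HOME value before entering it in Jenkins (macOS-specific, and assuming the 1.7.0_51 install from the steps above):
# Print the home directory of the installed JDK 7; it should match what you enter in Jenkins.
/usr/libexec/java_home -v 1.7
# Sanity-check that the java binary actually exists under that home.
ls /Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/bin/java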
This line looks curious:
hudson.model.Executor.run(Executor.java:247)
Caused by: java.lang.InternalError: Can't connect to window server - not enough permissions.
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
I would start with a simpler project and then add complexity from there, just to make sure your basic assumptions are correct.
You might need to set the JVM property -Djava.awt.headless=true. That disables the (most likely unnecessary) GUI libraries that are trying to load.
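One sketch of where to put it: make sure the flag actually reaches the forked Maven JVM, not only the Jenkins master process (the field name below is the usual Maven-job setting; adjust to your job type):
# The Jenkins master already runs headless (see the ps output above), but the Maven build JVM
# is launched separately; add the flag to the job's MAVEN_OPTS / "JVM Options" field, e.g.:
MAVEN_OPTS="-Xmx512m -XX:MaxPermSize=128m -Dfile.encoding=UTF-8 -Djava.awt.headless=true"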