Hive error: java.io.IOException: Couldn't create proxy provider class org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider

I have a Hive query that sometimes runs successfully, but most of the time fails with the error "java.io.IOException: Couldn't create proxy provider class org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider".
Below is my error log:
java.lang.RuntimeException: java.io.IOException: Couldn't create proxy provider class org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider
at org.apache.hadoop.mapred.lib.CombineFileInputFormat.isSplitable(CombineFileInputFormat.java:154)
at org.apache.hadoop.mapreduce.lib.input.CombineFileInputFormat.getMoreSplits(CombineFileInputFormat.java:283)
at org.apache.hadoop.mapreduce.lib.input.CombineFileInputFormat.getSplits(CombineFileInputFormat.java:239)
at org.apache.hadoop.mapred.lib.CombineFileInputFormat.getSplits(CombineFileInputFormat.java:75)
at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileInputFormatShim.getSplits(HadoopShimsSecure.java:336)
at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileInputFormatShim.getSplits(HadoopShimsSecure.java:302)
at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getSplits(CombineHiveInputFormat.java:435)
at org.apache.hadoop.mapreduce.JobSubmitter.writeOldSplits(JobSubmitter.java:525)
at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:517)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:399)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:564)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:559)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:559)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:550)
at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:420)
at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1516)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1283)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1101)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:924)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:914)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:269)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:221)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:431)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:367)
at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:464)
at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:474)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:756)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:694)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:633)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.io.IOException: Couldn't create proxy provider class org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider
at org.apache.hadoop.hdfs.NameNodeProxies.createFailoverProxyProvider(NameNodeProxies.java:475)
at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:148)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:632)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:570)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:147)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2596)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:367)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
at org.apache.hadoop.mapred.lib.CombineFileInputFormat.isSplitable(CombineFileInputFormat.java:151)
... 45 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.GeneratedConstructorAccessor32.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hdfs.NameNodeProxies.createFailoverProxyProvider(NameNodeProxies.java:458)
... 53 more
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.util.Arrays.copyOf(Arrays.java:2219)
at java.util.ArrayList.grow(ArrayList.java:242)
at java.util.ArrayList.ensureExplicitCapacity(ArrayList.java:216)
at java.util.ArrayList.ensureCapacityInternal(ArrayList.java:208)
at java.util.ArrayList.add(ArrayList.java:440)
at java.lang.String.split(String.java:2288)
at sun.net.util.IPAddressUtil.textToNumericFormatV4(IPAddressUtil.java:47)
at java.net.InetAddress.getAllByName(InetAddress.java:1129)
at java.net.InetAddress.getAllByName(InetAddress.java:1098)
at java.net.InetAddress.getByName(InetAddress.java:1048)
at org.apache.hadoop.security.SecurityUtil$StandardHostResolver.getByName(SecurityUtil.java:474)
at org.apache.hadoop.security.SecurityUtil.getByName(SecurityUtil.java:461)
at org.apache.hadoop.net.NetUtils.createSocketAddrForHost(NetUtils.java:235)
at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:215)
at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:163)
at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:152)
at org.apache.hadoop.hdfs.DFSUtil.getAddressesForNameserviceId(DFSUtil.java:677)
at org.apache.hadoop.hdfs.DFSUtil.getAddressesForNsIds(DFSUtil.java:645)
at org.apache.hadoop.hdfs.DFSUtil.getAddresses(DFSUtil.java:628)
at org.apache.hadoop.hdfs.DFSUtil.getHaNnRpcAddresses(DFSUtil.java:727)
at org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider.<init>(ConfiguredFailoverProxyProvider.java:88)
at sun.reflect.GeneratedConstructorAccessor32.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hdfs.NameNodeProxies.createFailoverProxyProvider(NameNodeProxies.java:458)
at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:148)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:632)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:570)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:147)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2596)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:367)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
Job Submission failed with exception 'java.lang.RuntimeException(java.io.IOException: Couldn't create proxy provider class org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider)'
Could anyone tell me why this happens?

I just stumbled across a similar exception myself, and increasing the Hive client heap didn't help. I found I was able to clear up the "GC overhead limit exceeded" OutOfMemoryError by adding a partition column to the WHERE clause of the query, so I've concluded that having a very large number of splits causes this exception. I haven't dug into the code, but I've seen string concatenation in a loop trigger this kind of GC thrashing, and something similar might be happening in the CombineHiveInputFormat.getSplits method.
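For illustration, here is the shape of the change that cleared it up for me (the table and partition-column names below are hypothetical); the point is just that a partition predicate shrinks the split list the client has to build in memory:
-- Hypothetical schema: fact_events is partitioned by dt.
-- Without a partition predicate, split computation walks every file
-- under the table, and the client heap fills up with split metadata.
SELECT user_id, event_type
FROM fact_events
WHERE dt = '2016-01-01'; -- partition column in the WHERE clause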

Related

Payara DataSource not being injected

I'm using Payara 5.184 and Java 1.8.241 on Linux (Ubuntu 19.10). I'm also using Liquibase as my database schema change management. A field named myDataSource is being injected as follows:
@Resource(mappedName = "java:global/ds")
private DataSource myDataSource;
When the method createDataSource() from Liquibase is invoked, I noticed that the variable myDataSource is null, which makes me believe the resource is not being properly injected. This error seems to happen only on Linux, as my colleagues who run Windows have not had any problems so far. We're using exactly the same versions of Payara/Java. Is there any specific step that needs to be done on Linux? Here is the stack trace found in the Payara logs.
Exception during lifecycle processing
org.glassfish.deployment.common.DeploymentException: CDI deployment failure:Exception List with 2 exceptions:
Exception 0 :
org.jboss.weld.exceptions.WeldException: WELD-000049: Unable to invoke public void liquibase.integration.cdi.CDILiquibase.onStartup() on liquibase.integration.cdi.CDILiquibase#911a89a
at org.jboss.weld.injection.producer.DefaultLifecycleCallbackInvoker.invokeMethods(DefaultLifecycleCallbackInvoker.java:85)
at org.jboss.weld.injection.producer.DefaultLifecycleCallbackInvoker.postConstruct(DefaultLifecycleCallbackInvoker.java:66)
at org.jboss.weld.injection.producer.BasicInjectionTarget.postConstruct(BasicInjectionTarget.java:122)
at liquibase.integration.cdi.CDIBootstrap$1.create(CDIBootstrap.java:95)
at liquibase.integration.cdi.CDIBootstrap$1.create(CDIBootstrap.java:38)
at org.jboss.weld.contexts.AbstractContext.get(AbstractContext.java:96)
at org.jboss.weld.bean.ContextualInstanceStrategy$DefaultContextualInstanceStrategy.get(ContextualInstanceStrategy.java:100)
at org.jboss.weld.bean.ContextualInstance.get(ContextualInstance.java:50)
at org.jboss.weld.bean.proxy.ContextBeanInstance.getInstance(ContextBeanInstance.java:102)
at org.jboss.weld.bean.proxy.ProxyMethodHandler.getInstance(ProxyMethodHandler.java:131)
at liquibase.integration.cdi.CDILiquibase$Proxy$_$$_WeldClientProxy.toString(Unknown Source)
at liquibase.integration.cdi.CDIBootstrap.afterDeploymentValidation(CDIBootstrap.java:111)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.jboss.weld.injection.StaticMethodInjectionPoint.invoke(StaticMethodInjectionPoint.java:95)
at org.jboss.weld.injection.MethodInvocationStrategy$SpecialParamPlusBeanManagerStrategy.invoke(MethodInvocationStrategy.java:144)
at org.jboss.weld.event.ObserverMethodImpl.sendEvent(ObserverMethodImpl.java:330)
at org.jboss.weld.event.ExtensionObserverMethodImpl.sendEvent(ExtensionObserverMethodImpl.java:123)
at org.jboss.weld.event.ObserverMethodImpl.sendEvent(ObserverMethodImpl.java:308)
at org.jboss.weld.event.ObserverMethodImpl.notify(ObserverMethodImpl.java:286)
at javax.enterprise.inject.spi.ObserverMethod.notify(ObserverMethod.java:124)
at org.jboss.weld.util.Observers.notify(Observers.java:166)
at org.jboss.weld.event.ObserverNotifier.notifySyncObservers(ObserverNotifier.java:285)
at org.jboss.weld.event.ObserverNotifier.notify(ObserverNotifier.java:273)
at org.jboss.weld.event.ObserverNotifier.fireEvent(ObserverNotifier.java:177)
at org.jboss.weld.event.ObserverNotifier.fireEvent(ObserverNotifier.java:171)
at org.jboss.weld.bootstrap.events.AbstractContainerEvent.fire(AbstractContainerEvent.java:53)
at org.jboss.weld.bootstrap.events.AbstractDeploymentContainerEvent.fire(AbstractDeploymentContainerEvent.java:35)
at org.jboss.weld.bootstrap.events.AfterDeploymentValidationImpl.fire(AfterDeploymentValidationImpl.java:28)
at org.jboss.weld.bootstrap.WeldStartup.validateBeans(WeldStartup.java:499)
at org.jboss.weld.bootstrap.WeldBootstrap.validateBeans(WeldBootstrap.java:93)
at org.glassfish.weld.WeldDeployer.processApplicationLoaded(WeldDeployer.java:517)
at org.glassfish.weld.WeldDeployer.event(WeldDeployer.java:428)
at org.glassfish.kernel.event.EventsImpl.send(EventsImpl.java:131)
at org.glassfish.internal.data.ApplicationInfo.load(ApplicationInfo.java:333)
at com.sun.enterprise.v3.server.ApplicationLifecycle.prepare(ApplicationLifecycle.java:497)
at org.glassfish.deployment.admin.DeployCommand.execute(DeployCommand.java:540)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$2$1.run(CommandRunnerImpl.java:549)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$2$1.run(CommandRunnerImpl.java:545)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:360)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$2.execute(CommandRunnerImpl.java:544)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$3.run(CommandRunnerImpl.java:575)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$3.run(CommandRunnerImpl.java:567)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:360)
at com.sun.enterprise.v3.admin.CommandRunnerImpl.doCommand(CommandRunnerImpl.java:566)
at com.sun.enterprise.v3.admin.CommandRunnerImpl.doCommand(CommandRunnerImpl.java:1475)
at com.sun.enterprise.v3.admin.CommandRunnerImpl.access$1300(CommandRunnerImpl.java:111)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$ExecutionContext.execute(CommandRunnerImpl.java:1857)
at com.sun.enterprise.v3.admin.CommandRunnerImpl$ExecutionContext.execute(CommandRunnerImpl.java:1733)
at com.sun.enterprise.v3.admin.AdminAdapter.doCommand(AdminAdapter.java:564)
at com.sun.enterprise.v3.admin.AdminAdapter.onMissingResource(AdminAdapter.java:251)
at org.glassfish.grizzly.http.server.StaticHttpHandlerBase.service(StaticHttpHandlerBase.java:166)
at com.sun.enterprise.v3.services.impl.ContainerMapper$HttpHandlerCallable.call(ContainerMapper.java:520)
at com.sun.enterprise.v3.services.impl.ContainerMapper.service(ContainerMapper.java:217)
at org.glassfish.grizzly.http.server.HttpHandler.runService(HttpHandler.java:182)
at org.glassfish.grizzly.http.server.HttpHandler.doHandle(HttpHandler.java:156)
at org.glassfish.grizzly.http.server.HttpServerFilter.handleRead(HttpServerFilter.java:218)
at org.glassfish.grizzly.filterchain.ExecutorResolver$9.execute(ExecutorResolver.java:95)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.executeFilter(DefaultFilterChain.java:260)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.executeChainPart(DefaultFilterChain.java:177)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.execute(DefaultFilterChain.java:109)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.process(DefaultFilterChain.java:88)
at org.glassfish.grizzly.ProcessorExecutor.execute(ProcessorExecutor.java:53)
at org.glassfish.grizzly.nio.transport.TCPNIOTransport.fireIOEvent(TCPNIOTransport.java:524)
at org.glassfish.grizzly.strategies.AbstractIOStrategy.fireIOEvent(AbstractIOStrategy.java:89)
at org.glassfish.grizzly.strategies.WorkerThreadIOStrategy.run0(WorkerThreadIOStrategy.java:94)
at org.glassfish.grizzly.strategies.WorkerThreadIOStrategy.access$100(WorkerThreadIOStrategy.java:33)
at org.glassfish.grizzly.strategies.WorkerThreadIOStrategy$WorkerThreadRunnable.run(WorkerThreadIOStrategy.java:114)
at org.glassfish.grizzly.threadpool.AbstractThreadPool$Worker.doWork(AbstractThreadPool.java:569)
at org.glassfish.grizzly.threadpool.AbstractThreadPool$Worker.run(AbstractThreadPool.java:549)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.jboss.weld.injection.producer.DefaultLifecycleCallbackInvoker.invokeMethods(DefaultLifecycleCallbackInvoker.java:83)
... 74 more
Caused by: java.lang.NullPointerException
at liquibase.integration.cdi.CDILiquibase.performUpdate(CDILiquibase.java:126)
at liquibase.integration.cdi.CDILiquibase.onStartup(CDILiquibase.java:116)
... 79 more
EDIT: Added stacktrace
EDIT 2: Problem solved! It ended up being something not related to Liquibase at all. The queue size of the PayaraExecutorService was too small, which leads me to think that some task submissions were rejected. Apparently this is a known issue in Payara 5.184, and a related issue had already been opened on GitHub (https://github.com/payara/Payara/issues/3495). Once I increased the queue size to a large number, everything started working as expected.
I think you need to use lookup instead of mappedName to inject the resource.
Compare this bug report: https://github.com/payara/Payara/issues/4413
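A minimal sketch of what that change would look like, assuming the same JNDI name as in the question:
import javax.annotation.Resource;
import javax.sql.DataSource;

public class MyBean {
    // lookup resolves the JNDI name directly, instead of the
    // vendor-interpreted mappedName attribute.
    @Resource(lookup = "java:global/ds")
    private DataSource myDataSource;
}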

Why do I get "File could only be replicated to 0 nodes" when writing to a partitioned table?

I create an external table in Hive with partitions and then try to populate it from the existing table; however, I get the following exceptions:
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /apps/hive/warehouse/pavel.db/browserdatapart/.hive-staging_hive_2018-12-28_13-22-45_751_6056004898772238481-1/_task_tmp.-ext-10000/cityid=1/_tmp.000001_3 could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1719)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNewBlockTargets(FSNamesystem.java:3372)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3296)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:850)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:504)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2347)
at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:814)
at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:841)
at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:88)
at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:841)
at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:133)
at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:170)
at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:555)
... 18 more
Caused by: org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /apps/hive/warehouse/pavel.db/browserdatapart/.hive-staging_hive_2018-12-28_13-22-45_751_6056004898772238481-1/_task_tmp.-ext-10000/cityid=1/_tmp.000001_3 could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1719)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNewBlockTargets(FSNamesystem.java:3372)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3296)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:850)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:504)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2347)
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1554)
at org.apache.hadoop.ipc.Client.call(Client.java:1498)
at org.apache.hadoop.ipc.Client.call(Client.java:1398)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
at com.sun.proxy.$Proxy11.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:459)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:290)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:202)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:184)
at com.sun.proxy.$Proxy12.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1580)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1375)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:
According to the internet, these exceptions occur when the datanode can't communicate with the namenode or when you are running low on memory, but in my case everything is fine. I have already tried formatting my namenode and datanode as well. What else could be the issue?
I've also read https://wiki.apache.org/hadoop/CouldOnlyBeReplicatedTo, and it didn't help me.
I am running on Tez. This works:
insert into table browserdatapart partition(cityid) select UserAgent,cityid from browserdata limit 100;
And this fails with the exception I provided:
insert into table browserdatapart partition(cityid) select UserAgent,cityid from browserdata;
SET hive.exec.max.dynamic.partitions=100000;
SET hive.exec.max.dynamic.partitions.pernode=100000;
Setting the above parameters solved it for me. I guess Hive was not able to replicate data to the partitions that show up in the exception because there were more partitions than the configured maximum (224 in the case of my dataset).
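If you want to check how many dynamic partitions an insert like this will create before raising the limits, counting the distinct values of the partition column gives the answer:
-- One dynamic partition is created per distinct cityid value.
SELECT COUNT(DISTINCT cityid) FROM browserdata;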

0xDBE / PHPStorm / IDEA fails to synchronize database tables

I'm getting the following error when trying to sync the database tables in 0xDBE:
Method org.postgresql.jdbc4.Jdbc4Array.free() is not yet implemented.
Method org.postgresql.jdbc4.Jdbc4Array.free() is not yet implemented.
The SQL statement:
with languages as (select oid as lang_oid, lanname as lang
from pg_catalog.pg_language),
routines as (select proname as r_name,
prolang as lang_oid,
oid as r_id,
xmin as r_state_number,
proargnames as arg_names,
proargmodes as arg_modes,
proargtypes::int[] as in_arg_types,
proallargtypes::int[] as all_arg_types,
proargdefaults as arg_defaults,
provariadic as arg_variadic_id,
prorettype as ret_type_id,
proisagg as is_aggregate,
proiswindow as is_window,
provolatile as volatile_kind
from pg_catalog.pg_proc
where pronamespace = oid(?)
and xmin::varchar::bigint > ?)
select *
from routines natural join languages
at org.jetbrains.jdba.jdbc.BaseExceptionRecognizer.recognizeException(BaseExceptionRecognizer.java:48)
at org.jetbrains.jdba.jdbc.JdbcIntermediateSession.recognizeException(JdbcIntermediateSession.java:347)
at org.jetbrains.jdba.jdbc.JdbcIntermediateCursor.fetch(JdbcIntermediateCursor.java:249)
at com.intellij.database.remote.jdba.impl.RemoteCursorImpl.fetch(RemoteCursorImpl.java:31)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at sun.rmi.server.UnicastServerRef.dispatch(UnicastServerRef.java:322)
at sun.rmi.transport.Transport$2.run(Transport.java:202)
at sun.rmi.transport.Transport$2.run(Transport.java:199)
at java.security.AccessController.doPrivileged(Native Method)
at sun.rmi.transport.Transport.serviceCall(Transport.java:198)
at sun.rmi.transport.tcp.TCPTransport.handleMessages(TCPTransport.java:567)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run0(TCPTransport.java:828)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.access$400(TCPTransport.java:619)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler$1.run(TCPTransport.java:684)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler$1.run(TCPTransport.java:681)
at java.security.AccessController.doPrivileged(Native Method)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run(TCPTransport.java:681)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
at sun.rmi.transport.StreamRemoteCall.exceptionReceivedFromServer(StreamRemoteCall.java:275)
at sun.rmi.transport.StreamRemoteCall.executeCall(StreamRemoteCall.java:252)
at sun.rmi.server.UnicastRef.invoke(UnicastRef.java:161)
at java.rmi.server.RemoteObjectInvocationHandler.invokeRemoteMethod(RemoteObjectInvocationHandler.java:217)
at java.rmi.server.RemoteObjectInvocationHandler.invoke(RemoteObjectInvocationHandler.java:171)
at com.sun.proxy.$Proxy93.fetch(Unknown Source)
at sun.reflect.GeneratedMethodAccessor37.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.execution.rmi.RemoteUtil.invokeRemote(RemoteUtil.java:124)
at com.intellij.execution.rmi.RemoteUtil.access$100(RemoteUtil.java:36)
at com.intellij.execution.rmi.RemoteUtil$RemoteInvocationHandler.invoke(RemoteUtil.java:229)
at com.sun.proxy.$Proxy94.fetch(Unknown Source)
at org.jetbrains.jdba.intermediate.AdaptIntermediateStructCollectingCursor.fetch(AdaptIntermediateStructCollectingCursor.java:107)
at org.jetbrains.jdba.core.BaseQueryRunner.fetchPack(BaseQueryRunner.java:88)
at org.jetbrains.jdba.core.BaseQueryRunner.run(BaseQueryRunner.java:70)
at com.intellij.dbm.postgre.PostgreIntrospector$SchemaRetriever.h(PostgreIntrospector.java:459)
at com.intellij.dbm.postgre.PostgreIntrospector$SchemaRetriever.access$600(PostgreIntrospector.java:177)
at com.intellij.dbm.postgre.PostgreIntrospector$SchemaRetriever$4.run(PostgreIntrospector.java:246)
at com.intellij.dbm.postgre.PostgreIntrospector$SchemaRetriever.a(PostgreIntrospector.java:203)
at com.intellij.dbm.postgre.PostgreIntrospector$SchemaRetriever.retrieve(PostgreIntrospector.java:242)
at com.intellij.dbm.postgre.PostgreIntrospector$2.run(PostgreIntrospector.java:169)
at org.jetbrains.jdba.core.BaseSession.inTransaction(BaseSession.java:88)
at org.jetbrains.jdba.core.BaseFacade$2.run(BaseFacade.java:93)
at org.jetbrains.jdba.core.BaseFacade.inSession(BaseFacade.java:125)
at org.jetbrains.jdba.core.BaseFacade.inTransaction(BaseFacade.java:89)
at com.intellij.dbm.postgre.PostgreIntrospector.a(PostgreIntrospector.java:165)
at com.intellij.dbm.postgre.PostgreIntrospector.b(PostgreIntrospector.java:153)
at com.intellij.dbm.postgre.PostgreIntrospector.introspect(PostgreIntrospector.java:86)
at com.intellij.database.dataSource.NativeSchemaLoader.a(NativeSchemaLoader.java:113)
at com.intellij.database.dataSource.NativeSchemaLoader.introspectAndAdapt(NativeSchemaLoader.java:67)
at com.intellij.database.dataSource.DatabaseSchemaLoader.loadDataSourceState(DatabaseSchemaLoader.java:107)
at com.intellij.database.dataSource.AbstractDataSource.refreshMetaData(AbstractDataSource.java:59)
at com.intellij.database.dataSource.AbstractDataSource$1.perform(AbstractDataSource.java:34)
at com.intellij.database.dataSource.AbstractDataSource$1.perform(AbstractDataSource.java:32)
at com.intellij.database.dataSource.AbstractDataSource.performJdbcOperation(AbstractDataSource.java:110)
at com.intellij.database.dataSource.AbstractDataSource.refreshMetaData(AbstractDataSource.java:32)
at com.intellij.database.dataSource.DataSourceUiUtil$2.run(DataSourceUiUtil.java:167)
at com.intellij.openapi.progress.impl.CoreProgressManager$TaskRunnable.run(CoreProgressManager.java:563)
at com.intellij.openapi.progress.impl.CoreProgressManager$2.run(CoreProgressManager.java:142)
at com.intellij.openapi.progress.impl.CoreProgressManager.a(CoreProgressManager.java:446)
at com.intellij.openapi.progress.impl.CoreProgressManager.executeProcessUnderProgress(CoreProgressManager.java:392)
at com.intellij.openapi.progress.impl.ProgressManagerImpl.executeProcessUnderProgress(ProgressManagerImpl.java:54)
at com.intellij.openapi.progress.impl.CoreProgressManager.runProcess(CoreProgressManager.java:127)
at com.intellij.openapi.progress.impl.ProgressManagerImpl$1.run(ProgressManagerImpl.java:126)
at com.intellij.openapi.application.impl.ApplicationImpl$8.run(ApplicationImpl.java:367)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
at org.jetbrains.ide.PooledThreadExecutor$1$1.run(PooledThreadExecutor.java:55)
Caused by: java.sql.SQLFeatureNotSupportedException: Method org.postgresql.jdbc4.Jdbc4Array.free() is not yet implemented.
at org.postgresql.Driver.notImplemented(Driver.java:727)
at org.postgresql.jdbc4.Jdbc4Array.free(Jdbc4Array.java:57)
at org.jetbrains.jdba.jdbc.JdbcValueGetters$AbstractArrayGetter.getValue(JdbcValueGetters.java:473)
at org.jetbrains.jdba.jdbc.JdbcRowFetchers$TupleFetcher.fetchRow(JdbcRowFetchers.java:163)
at org.jetbrains.jdba.jdbc.JdbcRowFetchers$TupleFetcher.fetchRow(JdbcRowFetchers.java:145)
at org.jetbrains.jdba.jdbc.JdbcRowsCollectors$ListCollector.collectRows(JdbcRowsCollectors.java:211)
at org.jetbrains.jdba.jdbc.JdbcRowsCollectors$ListCollector.collectRows(JdbcRowsCollectors.java:198)
at org.jetbrains.jdba.jdbc.JdbcIntermediateCursor.fetch(JdbcIntermediateCursor.java:245)
at com.intellij.database.remote.jdba.impl.RemoteCursorImpl.fetch(RemoteCursorImpl.java:31)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at sun.rmi.server.UnicastServerRef.dispatch(UnicastServerRef.java:322)
at sun.rmi.transport.Transport$2.run(Transport.java:202)
at sun.rmi.transport.Transport$2.run(Transport.java:199)
at java.security.AccessController.doPrivileged(Native Method)
at sun.rmi.transport.Transport.serviceCall(Transport.java:198)
at sun.rmi.transport.tcp.TCPTransport.handleMessages(TCPTransport.java:567)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run0(TCPTransport.java:828)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.access$400(TCPTransport.java:619)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler$1.run(TCPTransport.java:684)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler$1.run(TCPTransport.java:681)
at java.security.AccessController.doPrivileged(Native Method)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run(TCPTransport.java:681)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
However, when executing queries from the console, the data I get is correct. I'm not sure what's failing here, as I don't have any problems when using pgAdmin.
Has this happened to anyone else?
I solved the issue by re-downloading the JDBC driver 0xDBE was using: right-click on the data source, go to its preferences, delete the driver files currently being used, and download them again.
You have to restart the IDE after re-downloading the driver for the changes to take effect.
I was able to fix this issue by opening the data source settings page (where you enter the DB credentials), switching to the Options tab, and enabling "Introspect using JDBC metadata".

"Unexpected Error occurred attempting to open an SQL connection." when opening InMemory DB using SQuirreL SQL Client 3.5.0

We have an event-generation mechanism that generates & saves events in a flat file (or, say, a DB file to keep it simple).
To view these events in the DB, we complete the event-generation run & then open the file with SQuirreL SQL Client 3.5.0.
When the DB file is small, SQuirreL works fine. But when its size grows to around 20 MB, it refuses to open the database & throws the exception below:
Unexpected Error occurred attempting to open an SQL connection.
Below is the stack trace for your reference:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.sql.SQLException: Out of Memory
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:202)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand.awaitConnection(OpenConnectionCommand.java:132)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand.access$100(OpenConnectionCommand.java:45)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand$2.run(OpenConnectionCommand.java:115)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
Caused by: java.lang.RuntimeException: java.sql.SQLException: Out of Memory
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand.executeConnect(OpenConnectionCommand.java:171)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand.access$000(OpenConnectionCommand.java:45)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand$1.run(OpenConnectionCommand.java:104)
... 5 more
Caused by: java.sql.SQLException: Out of Memory
at org.hsqldb.jdbc.Util.sqlException(Unknown Source)
at org.hsqldb.jdbc.Util.sqlException(Unknown Source)
at org.hsqldb.jdbc.JDBCConnection.<init>(Unknown Source)
at org.hsqldb.jdbc.JDBCDriver.getConnection(Unknown Source)
at org.hsqldb.jdbc.JDBCDriver.connect(Unknown Source)
at net.sourceforge.squirrel_sql.fw.sql.SQLDriverManager.getConnection(SQLDriverManager.java:133)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand.executeConnect(OpenConnectionCommand.java:167)
... 7 more
Caused by: org.hsqldb.HsqlException: Out of Memory
at org.hsqldb.error.Error.error(Unknown Source)
at org.hsqldb.error.Error.error(Unknown Source)
at org.hsqldb.persist.ScriptRunner.runScript(Unknown Source)
at org.hsqldb.persist.ScriptRunner.runScript(Unknown Source)
at org.hsqldb.persist.Log.processLog(Unknown Source)
at org.hsqldb.persist.Log.open(Unknown Source)
at org.hsqldb.persist.Logger.openPersistence(Unknown Source)
at org.hsqldb.Database.reopen(Unknown Source)
at org.hsqldb.Database.open(Unknown Source)
at org.hsqldb.DatabaseManager.getDatabase(Unknown Source)
at org.hsqldb.DatabaseManager.newSession(Unknown Source)
... 12 more
Try increasing the memory allocated to SQuirreL: open the launch script or batch file in a text editor, look for -Xmx256m, and replace the 256 with a larger number.
On Windows the file is squirrel-sql.bat; on Linux it is squirrel-sql.sh.
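On Linux that edit can be done in one line (1024 is just an illustrative value; pick whatever your machine can spare):
# Raise SQuirreL's maximum heap from 256 MB to 1 GB in the launch script.
sed -i 's/-Xmx256m/-Xmx1024m/' squirrel-sql.sh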

Deploying Liferay on CloudBees

I'd like to deploy Liferay on CloudBees. I found the blog post '15 minutes to liferay on cloudbees paas' with instructions on how to do it. I followed all of those instructions, and the actual deployment seemed to go fine; there are no error messages in the logs either. But when I click the link that should lead to my site (with Liferay on it), I get a message that the resource isn't available. Does anyone have any clue what I might have done wrong?
My own guess is that I don't have the right values in portal-ext.properties, so here it is:
jdbc.default.driverClassName=com.cloudbees.jdbc.Driver
jdbc.default.url=jdbc:cloudbees://liferay_db/liferay_db?useUnicode=true&characterEncoding=UTF-8&useFastDateParsing=false
jdbc.default.username=MyUserName
jdbc.default.password=MyPassword
#lucene.store.type=ram
#lucene.store.jdbc.auto.clean.up.enabled=true
[update]
I have now tried the Liferay ClickStart. Now I get a '502 Bad Gateway' when I access the site. But at least this time there are error messages in the logs:
19:17:03,413 ERROR [main][JDBCExceptionReporter:76] Table 'liferay-po-12.portalpreferences' doesn't exist
19:17:03,565 ERROR [main][FileChecker:123]
com.liferay.portal.kernel.exception.SystemException:
com.liferay.portal.kernel.dao.orm.ORMException:
org.hibernate.exception.SQLGrammarException: could not execute query
com.liferay.portal.kernel.exception.SystemException:
com.liferay.portal.kernel.dao.orm.ORMException:
org.hibernate.exception.SQLGrammarException: could not execute query
at
com.liferay.portal.service.persistence.impl.BasePersistenceImpl.processException(BasePersistenceImpl.java:193)
at com.liferay.portal.service.persistence.PortalPreferencesPersistenceImpl.fetchByO_O(PortalPreferencesPersistenceImpl.java:600)
at com.liferay.portal.service.persistence.PortalPreferencesPersistenceImpl.fetchByO_O(PortalPreferencesPersistenceImpl.java:519)
at com.liferay.portal.service.impl.PortalPreferencesLocalServiceImpl.doGetPreferences(PortalPreferencesLocalServiceImpl.java:170)
at com.liferay.portal.service.impl.PortalPreferencesLocalServiceImpl.getPreferences(PortalPreferencesLocalServiceImpl.java:101)
at com.liferay.portal.service.impl.PortalPreferencesLocalServiceImpl.getPreferences(PortalPreferencesLocalServiceImpl.java:88)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at com.liferay.portal.spring.aop.ServiceBeanMethodInvocation.proceed(ServiceBeanMethodInvocation.java:122)
at com.liferay.portal.spring.transaction.TransactionInterceptor.invoke(TransactionInterceptor.java:71)
at com.liferay.portal.spring.aop.ServiceBeanMethodInvocation.proceed(ServiceBeanMethodInvocation.java:118)
at com.liferay.portal.spring.aop.ChainableMethodAdvice.invoke(ChainableMethodAdvice.java:57)
at com.liferay.portal.spring.aop.ServiceBeanMethodInvocation.proceed(ServiceBeanMethodInvocation.java:118)
at com.liferay.portal.spring.aop.ChainableMethodAdvice.invoke(ChainableMethodAdvice.java:57)
at com.liferay.portal.spring.aop.ServiceBeanMethodInvocation.proceed(ServiceBeanMethodInvocation.java:118)
at com.liferay.portal.spring.aop.ChainableMethodAdvice.invoke(ChainableMethodAdvice.java:57)
at com.liferay.portal.spring.aop.ServiceBeanMethodInvocation.proceed(ServiceBeanMethodInvocation.java:118)
at com.liferay.portal.spring.aop.ChainableMethodAdvice.invoke(ChainableMethodAdvice.java:57)
at com.liferay.portal.spring.aop.ServiceBeanMethodInvocation.proceed(ServiceBeanMethodInvocation.java:118)
at com.liferay.portal.security.pacl.PACLAdvice.invoke(PACLAdvice.java:51)
at com.liferay.portal.spring.aop.ServiceBeanMethodInvocation.proceed(ServiceBeanMethodInvocation.java:118)
at com.liferay.portal.spring.aop.ServiceBeanAopProxy.invoke(ServiceBeanAopProxy.java:211)
at $Proxy67.getPreferences(Unknown Source)
at com.liferay.portal.service.PortalPreferencesLocalServiceUtil.getPreferences(PortalPreferencesLocalServiceUtil.java:282)
at com.liferay.portal.util.PrefsPropsUtil.getPreferences(PrefsPropsUtil.java:250)
at com.liferay.portal.util.PrefsPropsUtil.getPreferences(PrefsPropsUtil.java:241)
at com.liferay.portal.util.PrefsPropsUtil.getString(PrefsPropsUtil.java:448)
at com.liferay.portal.deploy.DeployUtil.getAutoDeployDestDir(DeployUtil.java:64)
at com.liferay.portal.deploy.DeployManagerImpl.getInstalledDir(DeployManagerImpl.java:60)
at com.liferay.portal.kernel.deploy.DeployManagerUtil.getInstalledDir(DeployManagerUtil.java:48)
at com.liferay.portal.security.pacl.checker.FileChecker.afterPropertiesSet(FileChecker.java:119)
at com.liferay.portal.security.pacl.BasePACLPolicy.initChecker(BasePACLPolicy.java:111)
at com.liferay.portal.security.pacl.BasePACLPolicy.initCheckers(BasePACLPolicy.java:146)
at com.liferay.portal.security.pacl.BasePACLPolicy.<init>(BasePACLPolicy.java:48)
at com.liferay.portal.security.pacl.InactivePACLPolicy.<init>(InactivePACLPolicy.java:32)
at com.liferay.portal.security.pacl.PACLPolicyManager.<clinit>(PACLPolicyManager.java:200)
at com.liferay.portal.security.pacl.PACLAdvice.invoke(PACLAdvice.java:47)
at com.liferay.portal.spring.aop.ServiceBeanMethodInvocation.proceed(ServiceBeanMethodInvocation.java:118)
at com.liferay.portal.spring.aop.ServiceBeanAopProxy.invoke(ServiceBeanAopProxy.java:211)
at $Proxy48.clear(Unknown Source)
at com.liferay.portal.service.LockLocalServiceUtil.clear(LockLocalServiceUtil.java:267)
at com.liferay.portal.events.StartupAction.doRun(StartupAction.java:75)
at com.liferay.portal.events.StartupAction.run(StartupAction.java:52)
at com.liferay.portal.servlet.MainServlet.processStartupEvents(MainServlet.java:1306)
at com.liferay.portal.servlet.MainServlet.init(MainServlet.java:214)
at javax.servlet.GenericServlet.init(GenericServlet.java:212)
at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:1206)
at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:1026)
at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:4421)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4734)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:840)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.startup.Embedded.start(Embedded.java:825)
at com.staxnet.appserver.TomcatServerBase.startContainer(TomcatServerBase.java:120)
at com.staxnet.appserver.TomcatServerBase.start(TomcatServerBase.java:190)
at com.staxnet.appserver.StaxAppServer.main(StaxAppServer.java:89)
at com.staxnet.appserver.SnazAppServer.main(SnazAppServer.java:26)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at net.stax.appserver.bootstrap.Bootstrap.invokeAppServerMain(Bootstrap.java:41)
at net.stax.appserver.bootstrap.Bootstrap.main(Bootstrap.java:30)
Caused by: com.liferay.portal.kernel.dao.orm.ORMException:
org.hibernate.exception.SQLGrammarException: could not execute query
at com.liferay.portal.dao.orm.hibernate.ExceptionTranslator.translate(ExceptionTranslator.java:30)
at com.liferay.portal.dao.orm.hibernate.QueryImpl.list(QueryImpl.java:98)
at com.liferay.portal.dao.orm.hibernate.QueryImpl.list(QueryImpl.java:75)
at com.liferay.portal.service.persistence.PortalPreferencesPersistenceImpl.fetchByO_O(PortalPreferencesPersistenceImpl.java:575)
... 65 more
Caused by: org.hibernate.exception.SQLGrammarException: could not execute query
at org.hibernate.exception.SQLStateConverter.convert(SQLStateConverter.java:92)
at org.hibernate.exception.JDBCExceptionHelper.convert(JDBCExceptionHelper.java:66)
at org.hibernate.loader.Loader.doList(Loader.java:2545)
at org.hibernate.loader.Loader.listIgnoreQueryCache(Loader.java:2276)
at org.hibernate.loader.Loader.list(Loader.java:2271)
at org.hibernate.hql.classic.QueryTranslatorImpl.list(QueryTranslatorImpl.java:940)
at org.hibernate.engine.query.HQLQueryPlan.performList(HQLQueryPlan.java:196)
at org.hibernate.impl.SessionImpl.list(SessionImpl.java:1268)
at org.hibernate.impl.QueryImpl.list(QueryImpl.java:102)
at com.liferay.portal.dao.orm.hibernate.QueryImpl.list(QueryImpl.java:86)
... 67 more
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'liferay-po-12.portalpreferences' doesn't exist
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:407)
at com.mysql.jdbc.Util.getInstance(Util.java:382)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1052)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3603)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3535)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1989)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2150)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2626)
at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:2119)
at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:2281)
at org.apache.tomcat.dbcp.dbcp.DelegatingPreparedStatement.executeQuery(DelegatingPreparedStatement.java:96)
at org.apache.tomcat.dbcp.dbcp.DelegatingPreparedStatement.executeQuery(DelegatingPreparedStatement.java:96)
at org.hibernate.jdbc.AbstractBatcher.getResultSet(AbstractBatcher.java:208)
at org.hibernate.loader.Loader.getResultSet(Loader.java:1953)
at org.hibernate.loader.Loader.doQuery(Loader.java:802)
at org.hibernate.loader.Loader.doQueryAndInitializeNonLazyCollections(Loader.java:274)
at org.hibernate.loader.Loader.doList(Loader.java:2542)
... 74 more
19:17:04,035 ERROR [main][JDBCExceptionReporter:76] Table 'liferay-po-12.lock_' doesn't exist
There was also a message about the log file not being writable:
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /tmp/liferay/logs/liferay.2012-09-17.log (Permission denied)
at java.io.FileOutputStream.openAppend(Native Method)
at java.io.FileOutputStream.<init>(FileOutputStream.java:192)
at java.io.FileOutputStream.<init>(FileOutputStream.java:116)
at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
at org.apache.log4j.rolling.RollingFileAppender.activateOptions(RollingFileAppender.java:179)
at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
at org.apache.log4j.xml.DOMConfigurator.parseAppender(DOMConfigurator.java:295)
at org.apache.log4j.xml.DOMConfigurator.findAppenderByName(DOMConfigurator.java:176)
at org.apache.log4j.xml.DOMConfigurator.findAppenderByReference(DOMConfigurator.java:191)
at org.apache.log4j.xml.DOMConfigurator.parseChildrenOfLoggerElement(DOMConfigurator.java:523)
at org.apache.log4j.xml.DOMConfigurator.parseRoot(DOMConfigurator.java:492)
at org.apache.log4j.xml.DOMConfigurator.parse(DOMConfigurator.java:1001)
at org.apache.log4j.xml.DOMConfigurator.doConfigure(DOMConfigurator.java:867)
at org.apache.log4j.xml.DOMConfigurator.doConfigure(DOMConfigurator.java:815)
at com.liferay.util.log4j.Log4JUtil.configureLog4J(Log4JUtil.java:79)
at com.liferay.util.log4j.Log4JUtil.configureLog4J(Log4JUtil.java:57)
at com.liferay.portal.util.InitUtil.init(InitUtil.java:99)
at com.liferay.portal.spring.context.PortalContextLoaderListener.contextInitialized(PortalContextLoaderListener.java:149)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4206)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4705)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:840)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.startup.Embedded.start(Embedded.java:825)
at com.staxnet.appserver.TomcatServerBase.startContainer(TomcatServerBase.java:120)
at com.staxnet.appserver.TomcatServerBase.start(TomcatServerBase.java:190)
at com.staxnet.appserver.StaxAppServer.main(StaxAppServer.java:89)
at com.staxnet.appserver.SnazAppServer.main(SnazAppServer.java:26)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at net.stax.appserver.bootstrap.Bootstrap.invokeAppServerMain(Bootstrap.java:41)
at net.stax.appserver.bootstrap.Bootstrap.main(Bootstrap.java:30)
[/update]
I recommend using the newly created ClickStart for Liferay, which you can find here:
https://github.com/CloudBees-community/liferay-clickstart
You would have to click the "Deploy instantly on CloudBees" button, and there are two modifications to be made before it can run properly (and maybe those were the reason why it didn't work originally for you, as Tomcat will return "resource not available" if it runs out of PermGen space):
1. Set a bigger PermGen space with:
bees app:update ACCOUNT/APP_NAME jvmPermSize=256
2. Change the size of your application to the recommended 1024MB for Liferay, using the CloudBees web management console.