Cannot run Sample 460, Introduction to Eventing and Event Mediator - wso2-esb

I am currently trying to run this sample, but it doesn't work:
[2014-01-23 14:34:22,519] ERROR - DefaultRealmService Tenant domain has not been set in CarbonContext
java.lang.NullPointerException: Tenant domain has not been set in CarbonContext
at org.wso2.carbon.caching.impl.CacheManagerFactoryImpl.getCacheManager(CacheManagerFactoryImpl.java:79)
at org.wso2.carbon.user.core.tenant.TenantCache.getTenantCache(TenantCache.java:39)
at org.wso2.carbon.user.core.tenant.TenantCache.getValueFromCache(TenantCache.java:77)
at org.wso2.carbon.user.core.tenant.JDBCTenantManager.getTenant(JDBCTenantManager.java:224)
at org.wso2.carbon.user.core.tenant.JDBCTenantManager.getTenant(JDBCTenantManager.java:54)
at org.wso2.carbon.user.core.common.DefaultRealmService.getTenantUserRealm(DefaultRealmService.java:159)
at org.wso2.carbon.event.core.internal.delivery.inmemory.InMemoryDeliveryManager.publish(InMemoryDeliveryManager.java:87)
at org.wso2.carbon.event.core.internal.EventPublisher.run(EventPublisher.java:56)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
[2014-01-23 14:34:22,523] ERROR - EventPublisher Can not publish the message
org.wso2.carbon.event.core.exception.EventBrokerException: Can not access the user store manager
at org.wso2.carbon.event.core.internal.delivery.inmemory.InMemoryDeliveryManager.publish(InMemoryDeliveryManager.java:109)
at org.wso2.carbon.event.core.internal.EventPublisher.run(EventPublisher.java:56)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
Could you please explain what is happening, and how can I solve this issue?

This is a known bug; see https://wso2.org/jira/browse/ESBJAVA-2857


Alfresco LDAP batch sync

I am trying to connect my Alfresco instance to our LDAP server to authenticate users.
My configuration:
# LDAP Authentication
authentication.chain=alfrescoNtlm1:alfrescoNtlm,ldap1:ldap
ldap.authentication.active=true
ldap.authentication.java.naming.provider.url=ldap://myurl:389
ldap.authentication.userNameFormat=dc=example,dc=com
ldap.authentication.java.naming.security.authentication=simple
ldap.synchronization.java.naming.security.principal=cn\=myCN,ou\=admin,dc\=example,dc\=com
ldap.synchronization.java.naming.security.credentials=secret
ldap.authentication.allowGuestLogin=false
ldap.synchronization.userSearchBase=ou\=users,dc\=example,dc\=com
ldap.synchronization.groupSearchBase=dc\=example,dc\=com
ldap.synchronization.attributeBatchSize=200
ldap.synchronization.queryBatchSize=200
The problem is that I hit the size limit of the LDAP server every time. It doesn't seem like the batch size is used. I cannot raise the size limit of the LDAP server. Is there a way to process user data in batches?
Alfresco throws the following error:
2021-04-01 13:28:54,863 ERROR [org.alfresco.repo.security.sync.ChainingUserRegistrySynchronizer] [localhost-startStop-1] Synchronization aborted due to error
org.alfresco.error.AlfrescoRuntimeException: 03010018 Error during LDAP Search. Reason:[LDAP: error code 4 - Sizelimit Exceeded]
at org.alfresco.repo.security.sync.ldap.LDAPUserRegistry.processQuery(LDAPUserRegistry.java:1335)
at org.alfresco.repo.security.sync.ldap.LDAPUserRegistry.access$14(LDAPUserRegistry.java:1287)
at org.alfresco.repo.security.sync.ldap.LDAPUserRegistry$PersonCollection.<init>(LDAPUserRegistry.java:1524)
at org.alfresco.repo.security.sync.ldap.LDAPUserRegistry.getPersons(LDAPUserRegistry.java:573)
at org.alfresco.repo.security.sync.ChainingUserRegistrySynchronizer.syncWithPlugin(ChainingUserRegistrySynchronizer.java:1775)
at org.alfresco.repo.security.sync.ChainingUserRegistrySynchronizer.synchronizeInternal(ChainingUserRegistrySynchronizer.java:739)
at org.alfresco.repo.security.sync.ChainingUserRegistrySynchronizer.access$16(ChainingUserRegistrySynchronizer.java:474)
at org.alfresco.repo.security.sync.ChainingUserRegistrySynchronizer$7.doWork(ChainingUserRegistrySynchronizer.java:2138)
at org.alfresco.repo.security.authentication.AuthenticationUtil.runAs(AuthenticationUtil.java:555)
at org.alfresco.repo.security.sync.ChainingUserRegistrySynchronizer.onBootstrap(ChainingUserRegistrySynchronizer.java:2132)
at org.springframework.extensions.surf.util.AbstractLifecycleBean.onApplicationEvent(AbstractLifecycleBean.java:56)
at org.alfresco.repo.security.sync.ChainingUserRegistrySynchronizer.onApplicationEvent(ChainingUserRegistrySynchronizer.java:2495)
at org.springframework.context.event.SimpleApplicationEventMulticaster.doInvokeListener(SimpleApplicationEventMulticaster.java:172)
at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:165)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:139)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:127)
at org.alfresco.repo.management.subsystems.ChildApplicationContextFactory$ChildApplicationContext.publishEvent(ChildApplicationContextFactory.java:569)
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:887)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:552)
at org.alfresco.repo.management.subsystems.ChildApplicationContextFactory$ApplicationContextState.start(ChildApplicationContextFactory.java:824)
at org.alfresco.repo.management.subsystems.AbstractPropertyBackedBean.start(AbstractPropertyBackedBean.java:1098)
at org.alfresco.repo.management.subsystems.AbstractPropertyBackedBean.onApplicationEvent(AbstractPropertyBackedBean.java:637)
at org.alfresco.repo.management.SafeApplicationEventMulticaster.multicastEventInternal(SafeApplicationEventMulticaster.java:221)
at org.alfresco.repo.management.SafeApplicationEventMulticaster.multicastEvent(SafeApplicationEventMulticaster.java:186)
at org.alfresco.repo.management.SafeApplicationEventMulticaster.multicastEvent(SafeApplicationEventMulticaster.java:206)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:399)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:353)
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:887)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:552)
at org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:409)
at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:291)
at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:103)
at org.alfresco.web.app.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:70)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4753)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5215)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:752)
at org.apache.catalina.core.ContainerBase.access$000(ContainerBase.java:129)
at org.apache.catalina.core.ContainerBase$PrivilegedAddChild.run(ContainerBase.java:150)
at org.apache.catalina.core.ContainerBase$PrivilegedAddChild.run(ContainerBase.java:140)
at java.security.AccessController.doPrivileged(Native Method)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:726)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:734)
at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:1141)
at org.apache.catalina.startup.HostConfig$DeployDirectory.run(HostConfig.java:1875)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: javax.naming.SizeLimitExceededException: [LDAP: error code 4 - Sizelimit Exceeded]; remaining name 'ou=users,dc=example,dc=com'
at com.sun.jndi.ldap.LdapCtx.mapErrorCode(LdapCtx.java:3206)
at com.sun.jndi.ldap.LdapCtx.processReturnCode(LdapCtx.java:3100)
at com.sun.jndi.ldap.LdapCtx.processReturnCode(LdapCtx.java:2891)
at com.sun.jndi.ldap.AbstractLdapNamingEnumeration.getNextBatch(AbstractLdapNamingEnumeration.java:148)
at com.sun.jndi.ldap.AbstractLdapNamingEnumeration.hasMoreImpl(AbstractLdapNamingEnumeration.java:217)
at com.sun.jndi.ldap.AbstractLdapNamingEnumeration.hasMore(AbstractLdapNamingEnumeration.java:189)
at org.alfresco.repo.security.sync.ldap.LDAPUserRegistry.processQuery(LDAPUserRegistry.java:1316)
... 49 more
Thanks for any help.
You should be able to configure "ldap.synchronization.queryBatchSize=1000" (or some other batch size) in alfresco-global.properties. Are you sure you're editing the effective alfresco-global.properties?
Additionally, if you set "org.alfresco.repo.security.sync.ldap.LDAPUserRegistry" to debug, you should be able to see the batch size reflected in the log as:
Return result limit:
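In case it helps, here is a minimal sketch of enabling that debug logger via Alfresco's log4j mechanism; the exact properties file and its location depend on your installation, so treat the path as an assumption:
# e.g. in a custom log4j.properties / dev-log4j.properties picked up by Alfresco (path varies by install)
log4j.logger.org.alfresco.repo.security.sync.ldap.LDAPUserRegistry=debug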

Why is my RabbitMQ message impossible to serialize using Apache Beam?

I'm trying to read a RabbitMQ queue using Apache Beam.
I've written some transformation code to have messages written to Kafka.
I've even tested my scenario using simple text messages.
Now I am trying to deploy it with the actual messages this transformer is meant to run on. These are JSON messages of moderate size.
Strangely, when I try to read "production" messages, I get this exception stack trace:
java.lang.IllegalArgumentException: Unable to encode element 'ValueWithRecordId{id=[], value=org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage@f179a7f}' with coder 'ValueWithRecordId$ValueWithRecordIdCoder(org.apache.beam.sdk.coders.SerializableCoder@76190fb2)'.
org.apache.beam.sdk.coders.Coder.getEncodedElementByteSize(Coder.java:300)
org.apache.beam.sdk.coders.Coder.registerByteSizeObserver(Coder.java:291)
org.apache.beam.sdk.util.WindowedValue$FullWindowedValueCoder.registerByteSizeObserver(WindowedValue.java:564)
org.apache.beam.sdk.util.WindowedValue$FullWindowedValueCoder.registerByteSizeObserver(WindowedValue.java:480)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory$ElementByteSizeObservableCoder.registerByteSizeObserver(IntrinsicMapTaskExecutorFactory.java:400)
org.apache.beam.runners.dataflow.worker.util.common.worker.OutputObjectAndByteCounter.update(OutputObjectAndByteCounter.java:125)
org.apache.beam.runners.dataflow.worker.DataflowOutputCounter.update(DataflowOutputCounter.java:64)
org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:43)
org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.process(StreamingDataflowWorker.java:1283)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.access$1000(StreamingDataflowWorker.java:147)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker$6.run(StreamingDataflowWorker.java:1020)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
java.lang.Thread.run(Thread.java:745)
Caused by: java.io.NotSerializableException: com.rabbitmq.client.impl.LongStringHelper$ByteArrayLongString
java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
java.util.HashMap.internalWriteEntries(HashMap.java:1785)
java.util.HashMap.writeObject(HashMap.java:1362)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:1028)
java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1496)
java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
org.apache.beam.sdk.coders.SerializableCoder.encode(SerializableCoder.java:183)
org.apache.beam.sdk.coders.SerializableCoder.encode(SerializableCoder.java:53)
org.apache.beam.sdk.values.ValueWithRecordId$ValueWithRecordIdCoder.encode(ValueWithRecordId.java:105)
org.apache.beam.sdk.values.ValueWithRecordId$ValueWithRecordIdCoder.encode(ValueWithRecordId.java:99)
org.apache.beam.sdk.values.ValueWithRecordId$ValueWithRecordIdCoder.encode(ValueWithRecordId.java:81)
org.apache.beam.sdk.coders.Coder.getEncodedElementByteSize(Coder.java:297)
org.apache.beam.sdk.coders.Coder.registerByteSizeObserver(Coder.java:291)
org.apache.beam.sdk.util.WindowedValue$FullWindowedValueCoder.registerByteSizeObserver(WindowedValue.java:564)
org.apache.beam.sdk.util.WindowedValue$FullWindowedValueCoder.registerByteSizeObserver(WindowedValue.java:480)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory$ElementByteSizeObservableCoder.registerByteSizeObserver(IntrinsicMapTaskExecutorFactory.java:400)
org.apache.beam.runners.dataflow.worker.util.common.worker.OutputObjectAndByteCounter.update(OutputObjectAndByteCounter.java:125)
org.apache.beam.runners.dataflow.worker.DataflowOutputCounter.update(DataflowOutputCounter.java:64)
org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:43)
org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.process(StreamingDataflowWorker.java:1283)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.access$1000(StreamingDataflowWorker.java:147)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker$6.run(StreamingDataflowWorker.java:1020)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
java.lang.Thread.run(Thread.java:745)
My understanding is that the RabbitMQ reader considers the messages big enough to require the use of LongString, which is not serializable.
Am I right on this point? And if so, how do I get RabbitMQ to use a plain String (which would be enough for these messages)?
This is an Apache Beam bug (https://issues.apache.org/jira/browse/BEAM-7414) for which a fix exists but, out of pure laziness on my part (which is bad), has not yet been merged into the Apache Beam repo. If someone wants the fix immediately, it is possible to build my branch: https://github.com/Riduidel/beam/tree/fix/rabbitmq-message-not-serializable
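For context, the NotSerializableException comes from header values that the RabbitMQ client materializes as LongString instances (ByteArrayLongString), which do not implement java.io.Serializable. Below is a minimal sketch of the kind of conversion such a fix performs; the helper class is illustrative, not Beam's actual code:
import java.util.HashMap;
import java.util.Map;

import com.rabbitmq.client.LongString;

public class SerializableHeaders {

    // Hypothetical helper: coerce RabbitMQ header values into
    // Java-serializable types. ByteArrayLongString implements LongString
    // but not java.io.Serializable, so serializing the raw headers map
    // fails; LongString.toString() yields the plain String form.
    public static Map<String, Object> convert(Map<String, Object> headers) {
        Map<String, Object> result = new HashMap<>();
        for (Map.Entry<String, Object> entry : headers.entrySet()) {
            Object value = entry.getValue();
            result.put(entry.getKey(),
                    value instanceof LongString ? value.toString() : value);
        }
        return result;
    }
}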

ERROR {API_LOGGER ...} - DataMapper mediator : mapping failed WSO2

I am having a small problem with the DataMapper mediator. I created one to map some small messages, and I get the error below:
TID[-1234] [EI] [2017-08-02 14:37:18,356] ERROR {org.wso2.carbon.mediator.datamapper.DataMapperMediator} - DataMapper mediator : mapping failed
org.wso2.carbon.mediator.datamapper.engine.input.readers.XMLInputReader.xmlTraverse(XMLInputReader.java:174)
org.wso2.carbon.mediator.datamapper.engine.input.readers.XMLInputReader.read(XMLInputReader.java:120)
org.wso2.carbon.mediator.datamapper.engine.input.InputBuilder.buildInputModel(InputBuilder.java:59)
org.wso2.carbon.mediator.datamapper.engine.core.mapper.MappingHandler.doMap(MappingHandler.java:88)
org.wso2.carbon.mediator.datamapper.DataMapperMediator.transform(DataMapperMediator.java:309)
org.wso2.carbon.mediator.datamapper.DataMapperMediator.mediate(DataMapperMediator.java:259)
org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:97)
org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:59)
org.apache.synapse.mediators.base.SequenceMediator.mediate(SequenceMediator.java:158)
org.apache.synapse.rest.Resource.process(Resource.java:343)
org.apache.synapse.rest.API.process(API.java:399)
org.apache.synapse.rest.RESTRequestHandler.apiProcess(RESTRequestHandler.java:123)
org.apache.synapse.rest.RESTRequestHandler.dispatchToAPI(RESTRequestHandler.java:101)
org.apache.synapse.rest.RESTRequestHandler.process(RESTRequestHandler.java:69)
org.apache.synapse.core.axis2.Axis2SynapseEnvironment.injectMessage(Axis2SynapseEnvironment.java:304)
org.apache.synapse.core.axis2.SynapseMessageReceiver.receive(SynapseMessageReceiver.java:78)
org.apache.axis2.engine.AxisEngine.receive(AxisEngine.java:180)
org.apache.synapse.transport.passthru.ServerWorker.processNonEntityEnclosingRESTHandler(ServerWorker.java:326)
org.apache.synapse.transport.passthru.ServerWorker.processEntityEnclosingRequest(ServerWorker.java:372)
org.apache.synapse.transport.passthru.ServerWorker.run(ServerWorker.java:151)
org.apache.axis2.transport.base.threads.NativeWorkerPool$1.run(NativeWorkerPool.java:172)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
java.lang.Thread.run(Thread.java:748)
TID[-1234] [EI] [2017-08-02 14:37:18,357] ERROR {API_LOGGER.vendedor} - DataMapper mediator : mapping failed (same stack trace as above)
If anyone could help me, I would be quite grateful.
Thank you.
I was able to solve my problem.
The thing is, my project set a ContentType property but did not set a MessageType property. I added one after the data mapping and before the ContentType property, and this solved my problem.
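For illustration, a minimal sketch of the property mediator placement in the Synapse sequence; the application/json value is an assumption, so use the media type your flow actually produces:
<!-- assumed values: set messageType after the <datamapper> mediator, before ContentType -->
<property name="messageType" value="application/json" scope="axis2"/>
<property name="ContentType" value="application/json" scope="axis2"/>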
Thanks to everyone who tried to help.

Unable to produce data to Hazelcast in Apache Camel

I have the following route configured in Apache Camel:
from("direct:hazelCast")
.setHeader(HazelcastConstants.OPERATION, constant(HazelcastConstants.PUT_OPERATION))
.toF("hazelcast:map:testHazel", HazelcastConstants.MAP_PREFIX);
But when the above route is invoked, I'm getting the following error:
java.lang.NullPointerException: Null key is not allowed!
at com.hazelcast.map.impl.proxy.MapProxyImpl.put(MapProxyImpl.java:95)
at com.hazelcast.map.impl.proxy.MapProxyImpl.put(MapProxyImpl.java:89)
at org.apache.camel.component.hazelcast.map.HazelcastMapProducer.put(HazelcastMapProducer.java:125)
at org.apache.camel.component.hazelcast.map.HazelcastMapProducer.process(HazelcastMapProducer.java:60)
at org.apache.camel.util.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process(AsyncProcessorConverterHelper.java:61)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:141)
at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:77)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:460)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:190)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:121)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:83)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:190)
at org.apache.camel.component.direct.DirectProducer.process(DirectProducer.java:62)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:141)
at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:77)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:190)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:460)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:190)
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:109)
at org.apache.camel.processor.MulticastProcessor.doProcessParallel(MulticastProcessor.java:814)
at org.apache.camel.processor.MulticastProcessor.access$200(MulticastProcessor.java:84)
at org.apache.camel.processor.MulticastProcessor$1.call(MulticastProcessor.java:314)
at org.apache.camel.processor.MulticastProcessor$1.call(MulticastProcessor.java:299)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
The code that I used is almost identical to the example in the Camel docs: http://camel.apache.org/hazelcast-component.html
The following is the code snippet that I have used to produce the data to Hazelcast in Camel:
from("direct:hazelCast")
.setHeader(HazelcastConstants.OPERATION, constant(HazelcastConstants.PUT_OPERATION))
.setHeader(HazelcastConstants.OBJECT_ID, constant("SOME BLA BLA"))
.split()
.tokenizeXML(<SOMEValidTag>).streaming()
.unmarshal(jaxb)
.convertBodyTo(<Valid>.class)
.marshal().json(JsonLibrary.Jackson)
.toF("hazelcast:%stestHazel", HazelcastConstants.MAP_PREFIX);
Note: we need to convert the body to a class which should be serializable.
You need to set the HazelcastConstants.OBJECT_ID header; it is missing from your first route. That header supplies the map key, which is why the put fails with "Null key is not allowed!".
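A minimal sketch of the corrected first route, mirroring the working snippet above (the literal key value is just a placeholder):
from("direct:hazelCast")
    // OPERATION selects the map action; OBJECT_ID supplies the map key
    .setHeader(HazelcastConstants.OPERATION, constant(HazelcastConstants.PUT_OPERATION))
    .setHeader(HazelcastConstants.OBJECT_ID, constant("SOME BLA BLA"))
    .toF("hazelcast:%stestHazel", HazelcastConstants.MAP_PREFIX);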

IntelliJ Ultimate not executing JPA-QL query

I have IntelliJ IDEA Ultimate 14.1.4, and I was trying to execute a JPA-QL query in the console.
I tried a simple one and got the error reported below. Is this some misconfiguration on my end, or a bug?
[2015-08-24 14:43:59] java.lang.IllegalArgumentException: Argument for #NotNull parameter 'stream' of com/intellij/openapi/util/io/FileUtil.loadTextAndClose must not be null
at com.intellij.openapi.util.io.FileUtil.loadTextAndClose(FileUtil.java)
at com.intellij.jpa.engine.JpaEngine.loadJpaTemplate(JpaEngine.java:241)
at com.intellij.jpa.engine.JpaEngine.createPersistenceXmlText(JpaEngine.java:151)
at com.intellij.jpa.engine.JpaEngine.createTemporaryJpaConfig(JpaEngine.java:207)
at com.intellij.jpa.engine.JpaEngine.access$000(JpaEngine.java:61)
at com.intellij.jpa.engine.JpaEngine$1.compute(JpaEngine.java:143)
at com.intellij.jpa.engine.JpaEngine$1.compute(JpaEngine.java:136)
at com.intellij.openapi.project.DumbService$1.run(DumbService.java:87)
at com.intellij.openapi.project.DumbService$2.compute(DumbService.java:123)
at com.intellij.openapi.project.DumbService$2.compute(DumbService.java:117)
at com.intellij.openapi.application.impl.ApplicationImpl.runReadAction(ApplicationImpl.java:888)
at com.intellij.openapi.project.DumbService.runReadActionInSmartMode(DumbService.java:117)
at com.intellij.openapi.project.DumbService.runReadActionInSmartMode(DumbService.java:84)
at com.intellij.jpa.engine.JpaEngine.createTemporaryConfig(JpaEngine.java:136)
at com.intellij.jpa.engine.JpaEngine.ensureInitialized(JpaEngine.java:97)
at com.intellij.jpa.engine.JpaEngine.createQuery(JpaEngine.java:104)
at com.intellij.jpa.engine.JpaEngineBase.executeQueryInner(JpaEngineBase.java:166)
at com.intellij.jpa.engine.JpaEngineBase.access$000(JpaEngineBase.java:56)
at com.intellij.jpa.engine.JpaEngineBase$1.compute(JpaEngineBase.java:128)
at com.intellij.jpa.engine.JpaEngineBase$1.compute(JpaEngineBase.java:123)
at com.intellij.database.console.AbstractEngine$4.compute(AbstractEngine.java:179)
at com.intellij.database.console.AbstractEngine$4.compute(AbstractEngine.java:174)
at com.intellij.database.console.AbstractEngine$3.run(AbstractEngine.java:159)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)