I am using a WSO2 ESB (4.8.1) message store and message processor.
I am using ActiveMQ as the message store. I successfully store the message in the queue, and the message processor delivers it successfully. I want to pass the reply that comes back from the backend on delivery to the out sequence.
Here is my message processor configuration:
<?xml version="1.0" encoding="UTF-8"?>
<messageProcessor xmlns="http://ws.apache.org/ns/synapse"
                  class="org.apache.synapse.message.processor.impl.forwarder.ScheduledMessageForwardingProcessor"
                  name="Forwarder_Cluster"
                  messageStore="cluster_jms">
    <parameter name="max.delivery.attempts">2</parameter>
    <parameter name="interval">3000</parameter>
    <parameter name="is.active">true</parameter>
    <parameter name="message.processor.reply.sequence">out</parameter>
</messageProcessor>
I am getting the error "Scheduled Message Processor Could not find the message consumer to clean up."
Is there any way to do this?
You need to create a separate sequence for handling the response; the default sequence won't support your use case. Please refer to Store and Forward Using JMS Message Stores for more details.
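For instance, a minimal reply-handling sequence could look like the sketch below. The sequence name "replyHandler" and the placeholder endpoint URI are my assumptions for illustration; point the message.processor.reply.sequence parameter at whatever sequence you create.
<!-- Sketch only: a reply sequence that logs the backend response and forwards it (placeholder endpoint) -->
<sequence xmlns="http://ws.apache.org/ns/synapse" name="replyHandler">
    <!-- Log the backend response that the message processor hands to this sequence -->
    <log level="full"/>
    <!-- Forward the response wherever you need it; this URI is a placeholder -->
    <send>
        <endpoint>
            <address uri="http://localhost:8280/services/ReplySink"/>
        </endpoint>
    </send>
</sequence>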
Best Regards,
Vanji
For one of our requirements I want to integrate Splunk with the existing Mule ESB server we are using (we are using CloudHub).
How can we achieve that?
Any help will be greatly appreciated.
You can use the Splunk connector to integrate Splunk within your applications. If you just want to send event data from existing applications, see this other documentation page: Sending Data from Runtime Manager to External Monitoring Software.
Add the configuration below to your log4j2 file (the Http appender goes inside the Appenders element):
<Configuration packages="com.mulesoft.ch.logging.appender,com.splunk.logging">
    <Appenders>
        <Http name="SPLUNK" url="https://input-prd-p-3ppsrv6qrg3h.cloud.splunk.com:8088" token="19954BDA-7A21-4FFA-A1E5-B3DFB7C75D55" disableCertificateValidation="true">
            <PatternLayout pattern="%m" />
        </Http>
    </Appenders>
where you should prefix "input-" to the domain name you have for Splunk, and the token is the one from your HTTP Event Collector.
Also, in the loggers section, reference the Splunk HTTP appender defined above:
<Loggers>
    <Root level="INFO">
        <AppenderRef ref="SPLUNK" />
    </Root>
</Loggers>
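Note that the com.splunk.logging package referenced in the packages attribute comes from Splunk's Java logging library, so that jar also has to be on the application's classpath. The Maven coordinates below are my assumption of the usual ones; check Splunk's documentation for the version and repository you need:
<!-- Assumed coordinates for Splunk's Java logging library (provides the HTTP Event Collector appender) -->
<dependency>
    <groupId>com.splunk.logging</groupId>
    <artifactId>splunk-library-javalogging</artifactId>
    <version>1.5.2</version>
</dependency>
Older releases of this library are published in Splunk's own Maven repository rather than Maven Central, so you may also need to add that repository to your build.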
Let me know if it works
I need to consume a web service that uses JMS as the transport rather than HTTP.
The Web Service Consumer documentation says that it supports JMS, but unfortunately no example is provided of using the component with non-HTTP transports.
I need help with this. These are the steps I've taken:
The WSDL was successfully loaded by the connector wizard in Anypoint. I've specified the method to be called. The parameters were recognized by DataSense, so I can see the input parameters with DataMapper, etc.
The URL looks like this: jms:queue:toOrderManagement?replyToName=fromOrderManagement?targetService=OrderManagement
I've defined global JMS connector like this:
<jms:connector name="JMSConnector" specification="1.1" username="user"
password="******" validateConnections="true" doc:name="JMS">
<reconnect-forever />
</jms:connector>
and associated it with the WS connector like this:
<ws:consumer-config name="Web_Service_Consumer" wsdlLocation="myOrder.wsdl"
service="OrderManager" port="JMSOrderManager"
serviceAddress="jms:queue:toOrderManagement?replyToName=fromOrderManagement?targetService=OrderManagement"
doc:name="Web Service Consumer" connector-ref="JMSConnector"/>
So, how do I specify the actual JMS queue name, and what is the format of the serviceAddress attribute when it's configured for JMS/WS?
OK, it took me some time to get the answer to this.
So, first, the format of the serviceAddress should be like this:
jms://${toQueue}?exchangePattern=request-response
where toQueue is the "request" queue name, i.e. the one the request will be sent to.
Now, if you don't specify any additional parameters, a temporary "response" queue will be created automatically and the WS Consumer will wait for the response on it.
If you want to use a pre-configured queue for the responses then, before calling the WS Consumer, you need to set the JMSReplyTo message property to the response queue name. If the property is set, the WS Consumer will wait for the response on that queue rather than on the temporary one.
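In Mule 3 XML that could look roughly like the sketch below; the flow name and the operation name are assumptions for illustration, so adapt them to your own configuration:
<!-- Sketch: set the reply queue as an outbound property before invoking the WS Consumer -->
<flow name="callOrderManagementOverJms">
    <!-- ... your inbound endpoint and payload preparation go here ... -->
    <set-property propertyName="JMSReplyTo" value="fromOrderManagement" doc:name="Set reply queue"/>
    <!-- operation name below is a placeholder; use the operation from your WSDL -->
    <ws:consumer config-ref="Web_Service_Consumer" operation="yourOperation" doc:name="Consume WS over JMS"/>
</flow>
With this, the serviceAddress on the consumer-config would be jms://toOrderManagement?exchangePattern=request-response, as described above.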
I'm attempting to call a Stored proc, get some data back, map that to the desired schema, output the result.
However, I'm getting the following error:
The Messaging engine failed to process a message submitted by
adapter:WCF-SQL Source
URL:mssql://master-biztalk//ReportServer?InboundId=batman. Details:The
published message could not be routed because no subscribers were
found. This error occurs if the subscribing orchestration or send port
has not been enlisted, or if some of the message properties necessary
for subscription evaluation have not been promoted. Please use the
Biztalk Administration console to troubleshoot this failure.
I'm not quite certain why I'm getting this error. Searching the web hasn't enlightened me any further. Following are some of the steps I've undertaken:
Generated the schema with the Consume Adapter Service.
Map this schema to desired output.
Receive port in the Orchestration which connects to a receive message which has the Schema generated by the consume adapter as the type.
Added a receive port in Biztalk, configured WCF-SQL and setup bindings. (Typed polling)
Linked this receive port to the logical receive port in orchestration.
I haven't promoted any elements in messages.
Thanks for the help
EDIT: I updated the pipeline to XML, and now I'm getting the following error:
There was a failure executing the receive pipeline:
"Microsoft.BizTalk.DefaultPipelines.XMLReceive,
Microsoft.BizTalk.DefaultPipelines, Version=3.0.1.0, Culture=neutral,
PublicKeyToken=31bf3856ad364e35" Source: "XML disassembler" Receive
Port: "DatabaseReceiveport" URI:
"mssql://master-biztalk//ReportServer?InboundId=batman" Reason:
Finding the document specification by message type
"http://schemas.microsoft.com/Sql/2008/05/TypedPolling/batman#TypedPolling"
failed. Verify the schema deployed properly.
I'm not quite sure what it's trying to do above.
This means that either:
The Project/Assembly with the specified Schema has not been deployed.
The WCF-SQL configuration is off somehow, frequently the InboundId parameter (that's where "batman" would come from).
In the All Artifacts Application, check the Schemas folder for that Schema by Root Node and Namespace.
I'm trying to switch from ActiveMQ 5.6 to Apollo 1.5.
I have two pieces of software exchanging messages, using publish/subscribe on topics.
The first one is C++ and uses OpenWire over TCP.
The second one is JavaScript and uses STOMP over WebSockets.
With ActiveMQ everything worked fine, and the messages I sent could be read and written by both programs, and I haven't changed the clients since.
Now I send messages from the C++ program (using OpenWire) and try to read them with the JS program, and I get errors. In fact I receive messages with the header content-type: "protocol/openwire", but I expect STOMP.
This is how I configured the connector section of apollo.xml:
<connector id="tcp" bind="tcp://0.0.0.0:61613">
<openwire max_inactivity_duration="-1" max_inactivity_duration_delay="-1" />
<stomp max_header_length="10000" die_delay="-1" />
</connector>
<connector id="ws" bind="tcp://0.0.0.0:61623">
<stomp max_header_length="10000" die_delay="-1" />
</connector>
I also tried with <detect /> in the tcp and ws connectors, which is supposed to auto-detect the client protocol, but that doesn't work either.
Can someone help me figure this out?
Thank you,
Edit:
I found out that I do receive STOMP protocol messages, but they are very weirdly formatted and even contain non-text characters that make stomp.js fail to parse the message and correctly fill the message body.
Here is the same message received once from ActiveMQ OpenWire and then from Apollo OpenWire, with the same C++ publisher and JS subscriber:
activemq
"MESSAGE
message-id:ID:myID-61443-1352999572576-0:0:0:0:0
class:Message.PointToPoint
destination:/topic/my-topic
timestamp:1352999626186
expires:0
subscription:sub-0
priority:4
<PointToPoint xmlns="Message" ><SourceId>u_23</SourceId><TargetId>u_75</TargetId></PointToPoint>"
apollo
"MESSAGE
subscription:sub-0
destination:
content-length:331
content-type:protocol/openwire
message-id:xps-broker-291
Eç{#ID:myID-61463-1352999939140-0:0emy-topicn{#ID:myID-61463-1352999939140-0:0; Å??<PointToPoint xmlns="Message" ><SourceId>u_23</SourceId><TargetId>u_75</TargetId></PointToPoint>(class Message.PointToPoint
"
Do you think it could be a problem in Apollo?
ActiveMQ 5.6 handles translating the logical OpenWire messages into a text representation for STOMP clients. Apollo currently does not support that feature yet! :( See:
https://issues.apache.org/jira/browse/APLO-267
It just takes the full OpenWire message and uses it as the body of the STOMP message. By the way, using binary data in a STOMP message is totally valid as long as the content-length header is properly set.
I am creating a web service in Mule using cxf:jaxws-service. This is the URL: http://localhost:65042/InsertDocService/InsertDoc. I am able to generate the WSDL file, but I want to consume this service in Mule using cxf:jaxws-client.
<flow name="documentumclientflowFlow1" doc:name="documentumclientflowFlow1">
<inbound-endpoint address="http://localhost:65042/InsertDocumentumService/InsertDocumentum" doc:name="Generic"/>
<cxf:jaxws-client operation="insertDocumentum" serviceClass="com.integration.IDocumentumInsert" port="80" mtomEnabled="true" enableMuleSoapHeaders="true" doc:name="SOAP"/>
<outbound-endpoint address="http://locahhost:65042/InsertDocumentumService/InsertDocumentum" doc:name="Generic"/>
</flow>
If I invoke this, the request goes to the service project and I get an error like "org.apache.cxf.interceptor.Fault: No such operation: (HTTP GET PATH_INFO: /InsertDocumentumService/InsertDocumentum)". Can anyone please suggest how I can solve this issue and how I can consume this service?
Besides the typo (locahhost instead of localhost), it looks like you're trying to use the same address in the outbound endpoint as in the inbound one. I don't think this is what you want to do, as it will lead to a looping re-entrant call that will eventually exhaust the pool threads, block, time out and die in a fire.
So what do you want to do with "documentumclientflowFlow1"? It is unclear from your question.
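If the intent is simply for this flow to act as a client of the jaxws-service you already exposed, a rough sketch is shown below. It assumes the service really lives at http://localhost:65042/InsertDocService/InsertDoc as stated in your question, and that the client flow listens on a different, made-up address:
<!-- Sketch only: the client flow listens on its own address and calls the actual service address -->
<flow name="documentumClientFlow" doc:name="documentumClientFlow">
    <!-- inbound address here is an assumption; use whatever you want the client to listen on -->
    <inbound-endpoint address="http://localhost:65043/InsertDocClient" doc:name="Generic"/>
    <cxf:jaxws-client operation="insertDocumentum" serviceClass="com.integration.IDocumentumInsert" mtomEnabled="true" enableMuleSoapHeaders="true" doc:name="SOAP"/>
    <!-- outbound points at the address where the jaxws-service is actually exposed -->
    <outbound-endpoint address="http://localhost:65042/InsertDocService/InsertDoc" doc:name="Generic"/>
</flow>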