RabbitMQ Priority Queues not working in Mule

I'm trying to use the Priority Queues mechanism provided by RabbitMQ within Mule ESB.
I have created a queue with x-max-priority = 3 and, in a new Mule project, I've set up 3 amqp:connectors, one for each priority, like this:
<amqp:connector
name="amqpLocalhostConnector0"
host="${amqp.host}"
port="${amqp.port}"
fallbackAddresses="${amqp.fallbackAddresses}"
virtualHost="${amqp.virtualHost}"
username="${amqp.username}"
password="${amqp.password}"
priority="0"
ackMode="AMQP_AUTO"
prefetchCount="${amqp.prefetchCount}" />
(The values within ${} are taken from a properties file. The priorities are "0", "1" and "2".)
Then there is a simple flow that sends messages every second to the queue, with the priority as a string in the body in addition to the priority property set on the connector.
<flow name="rabbitmqFlow1" doc:name="rabbitmqFlow1">
<poll frequency="1000" doc:name="Poll">
<set-payload value="PRIORITY 1" doc:name="Set Payload"/>
</poll>
<amqp:outbound-endpoint
exchangeName="amq.direct"
routingKey="priority"
connector-ref="amqpLocalhostConnector0">
</amqp:outbound-endpoint>
</flow>
I manually repeat this process, alternating the priority used by each message batch. This way, my queue has several messages with different priorities mixed.
Now, if I manually dequeue messages using the Get Message(s) button in the Management UI, they are delivered according to priority: first those with priority 2, then those with priority 1 and finally those with priority 0.
The problem is when I try to get the messages using an amqp:inbound-endpoint component in Mule.
Here is another simple flow that just gets the messages from the priority queue and logs the content of each one.
<flow name="rabbitmqFlow2" doc:name="rabbitmqFlow2">
<amqp:inbound-endpoint
queueName="priority"/>
<byte-array-to-string-transformer doc:name="Byte Array to String"/>
<logger message="#[payload]" level="INFO" doc:name="Logger"/>
</flow>
Here the messages are obtained in the order they were sent to the queue and not according to their priorities.
What can I do to read the messages from the queue respecting priorities?

Related

Consuming message in sequence from Jms in mule flow

I have a TIBCO queue where we receive messages from producers, and a Mule flow is consuming these messages. I have numberOfConsumers set to 20 in the jms:connector. Sometimes, when the load is high, my messages are received out of sequence in the flow.
I want my flow to receive messages in sequence without making it single threaded.
Below is the flow, with a logger at the beginning of the flow:
<flow name="some name" doc:name="ServiceId-8" initialState="started">
<jms:inbound-endpoint queue="${queue1}" connector-ref="jmsconnector" doc:name="JMS">
<logger message="Receiving Message: #[message.payload]" category="com.xyz" level="INFO" doc:name="Logger"/>
<jms:transaction action="ALWAYS_BEGIN"/>
</jms:inbound-endpoint>
<processor....
<component....
........
........
</flow>
Connector:
<jms:connector name="jmsconnector" specification="1.1" username="${name}" password="${pass}" validateConnections="true" jndiInitialFactory="factoryClass" jndiProviderUrl="${url}" connectionFactoryJndiName="GenericConnectionFactory" cacheJmsSessions="true" eagerConsumer="true" forceJndiDestinations="true" numberOfConsumers="20" persistentDelivery="true" maxRedelivery="5" doc:name="JMS">
<spring:property name="jndiProviderProperties">
<spring:map>
<spring:entry key="java.naming.security.principal" value="${name}"/>
<spring:entry key="java.naming.security.credentials" value="${pass}"/>
</spring:map>
</spring:property>
<reconnect-forever/>
</jms:connector>
You can take either of the approaches below.
Set the queue to be exclusive. This makes the server deliver messages to only one consumer.
Use the JMSXGroupID property of JMS. This ensures that messages are processed in order within a particular group, based on the "JMSXGroupID" header. For example, if there are 5 messages, with JMSXGroupID set to "customer1" for 3 of them and "customer2" for the other 2, then the 3 "customer1" messages are processed sequentially and the 2 "customer2" messages are processed sequentially, but the two groups are processed in parallel. A producer-side sketch is shown below.
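On the producer side, the group key is just a JMS message property. A minimal sketch, assuming your broker actually supports message groups via JMSXGroupID, and using a hypothetical VM inbound and a customerId field in the payload purely for illustration (in Mule 3, outbound properties set with set-property are sent as JMS message properties by the JMS outbound endpoint):
<flow name="groupedProducerFlow">
    <!-- hypothetical source of the messages that need to stay in sequence -->
    <vm:inbound-endpoint path="producer.in" doc:name="VM"/>
    <!-- outbound properties become JMS message properties on the outgoing message;
         messages sharing the same JMSXGroupID are delivered in order to a single consumer -->
    <set-property propertyName="JMSXGroupID" value="#[payload.customerId]" doc:name="Set group id"/>
    <jms:outbound-endpoint queue="${queue1}" connector-ref="jmsconnector" doc:name="JMS"/>
</flow>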

Mule-Unable to access Session Variable value in another flow

I need to retrieve, in flow2, the value set in a session variable in flow1. The code I've written looks like this:
<flow name="demo1Flow">
<http:listener config-ref="HTTP_Listener_Configuration" path="demo" doc:name="HTTP"/>
<set-session-variable variableName="name" value="balwant" doc:name="Session Variable"/>
<logger message="Inside demo1 #[sessionVars.name]" level="INFO" doc:name="Logger"/>
<http:request config-ref="HTTP_Request_Configuration" path="/test" method="GET" doc:name="HTTP"/>
</flow>
<flow name="demoFlow">
<http:listener config-ref="HTTP_Listener_Configuration" path="/test" doc:name="HTTP"/>
<logger message="Inside demo flow #[sessionVars['name']]" level="INFO" doc:name="Logger"/>
</flow>
With the above code I'm not able to get, in demoFlow, the value of the session variable that was set in demo1Flow. The output I'm getting is:
INFO 2017-03-07 12:55:28,455 [[demo].HTTP_Listener_Configuration.worker.01] org.mule.api.processor.LoggerMessageProcessor: Inside demo1 balwant
INFO 2017-03-07 12:55:28,536 [[demo].HTTP_Listener_Configuration.worker.02] org.mule.api.processor.LoggerMessageProcessor: Inside demo flow null.
The documentation says that the value in a session variable is accessible across the session in different flows, but here that is not happening :(. Not sure what the reason is.
Referring to the Session Variable Transformer Reference documentation: session variables persist for the entire message lifecycle, regardless of transport barriers, except for the HTTP connector, which does not propagate them.
These are two independent flows which process messages based on different input paths; although you are calling the second one using the HTTP requester from flow1, it has its own scope.
Every flow's scope begins at its inbound endpoint.
As there is no relationship between those two flows, you can't access anything from flow1 in the other. If you want that value, you can set it as an outbound property; it will then arrive as an inbound property in the second flow. Otherwise you can pass it as a URI parameter. A sketch of the outbound-property approach is shown below.
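A minimal sketch of that approach, assuming the Mule 3.7+ HTTP connector configuration from the question and using the header name "name" purely as an illustration (HTTP headers sent by the requester arrive as inbound properties on the listener side):
<flow name="demo1Flow">
    <http:listener config-ref="HTTP_Listener_Configuration" path="demo" doc:name="HTTP"/>
    <set-session-variable variableName="name" value="balwant" doc:name="Session Variable"/>
    <http:request config-ref="HTTP_Request_Configuration" path="/test" method="GET" doc:name="HTTP">
        <http:request-builder>
            <!-- send the value explicitly as an HTTP header -->
            <http:header headerName="name" value="#[sessionVars.name]"/>
        </http:request-builder>
    </http:request>
</flow>
<flow name="demoFlow">
    <http:listener config-ref="HTTP_Listener_Configuration" path="/test" doc:name="HTTP"/>
    <!-- the header sent above is now available as an inbound property -->
    <logger message="Inside demo flow #[message.inboundProperties.'name']" level="INFO" doc:name="Logger"/>
</flow>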
Regards,
Mallesh

synchronizing a database insert and select behind a web service

I'm struggling to figure out how to solve this problem in Mule using the studio and thought that perhaps reaching out to the good users of SO may be of some help.
I have a simple web service that takes a request from a client. This request will perform an insert into a database table, effectively using this database as a message queue. A separate process periodically polls this table, performs additional processing on the message, and then writes results to an output table. The database insert and the subsequent select will be linked by a correlationId that I can pass along to ensure I get the result for the message that was sent. Unfortunately, the software this will integrate with requires this pattern to work correctly.
Here's the workflow that is needed:
HttpRequest -> insert record into a table -> wait(or poll/retry/etc?) until a record is written to another table by a separate process(with the same correlationId) -> return data from this other table back to the httpRequest
Here's a sample flow that is as close as I've been able to get to this. Oddly enough, this flow does actually return a payload, but it seems to always be "1". I can't quite see how to make this flow retry the database query until a row exists and then return the resulting row.
How should I be synchronizing the 2 database calls? Is this possible within Mule, perhaps with a different combination of components?
Thanks.
<flow name="mainFlow">
<http:listener config-ref="HTTP_Listener_Configuration" path="hello" doc:name="HTTP"/>
<cxf:jaxws-service doc:name="CXF" configuration-ref="CXF_Configuration" serviceClass="kansas.MuleTestServiceImpl"/>
<db:insert config-ref="Oracle_Configuration" doc:name="Database">
<db:parameterized-query><![CDATA[insert into tblRequest (id, correlationId) values(#[payload], #[message.correlationId])]]></db:parameterized-query>
</db:insert>
<until-successful objectStore-ref="MyObjectStore" maxRetries="5" millisBetweenRetries="2000" doc:name="Until Successful" > <!-- failureExpression="???" -->
<db:select config-ref="Oracle_Configuration" doc:name="Database">
<db:parameterized-query><![CDATA[select correlationId,msgResponse from tblResponse where correlationId = #[message.correlationId]]]></db:parameterized-query>
</db:select>
</until-successful>
<logger level="INFO" doc:name="Logger" message="#[payload]"/> <!-- why is payload always = 1? -->
</flow>
Mule is a great tool, but it makes your life too easy. Sometimes so easy that you forget simple things.
In your case you forgot that the payload is one object, the result of the last component. Think about a flow as rails with just one cart: whatever you load at the last station is delivered to the next one, and then the process repeats. What was originally delivered to the station does not matter; what matters is what you load.
In your case the first database component receives the original payload from CXF and stores something in the database. It returns the result of the INSERT statement, which is 1 - one row was inserted. So our cart now delivers the new cargo: 1.
But you need the original payload from CXF. Where is it? It is gone - we have only one flow, one pair of rails, one cart.
What to do in this situation? Keep the required information not in the cart but somewhere else, for example in flow variables. Store the original payload in a variable and then restore it when it is required again. Like this:
<flow name="mainFlow">
<http:listener config-ref="HTTP_Listener_Configuration" path="hello" doc:name="HTTP"/>
<cxf:jaxws-service doc:name="CXF" configuration-ref="CXF_Configuration" serviceClass="kansas.MuleTestServiceImpl"/>
<set-variable variableName="storedPaylod" value="#[payload]" doc:name="Store original payload"/>
<db:insert config-ref="Oracle_Configuration" doc:name="Database">
<db:parameterized-query><![CDATA[insert into tblRequest (id, correlationId) values(#[payload], #[message.correlationId])]]></db:parameterized-query>
</db:insert>
<set-payload value="#[flowVars.storedPaylod]" doc:name="Restore Payload"/>
<until-successful objectStore-ref="MyObjectStore" maxRetries="5" millisBetweenRetries="2000" doc:name="Until Successful" > <!-- failureExpression="???" -->
<db:select config-ref="Oracle_Configuration" doc:name="Database">
<db:parameterized-query><![CDATA[select correlationId,msgResponse from tblResponse where correlationId = #[message.correlationId]]]></db:parameterized-query>
</db:select>
</until-successful>
<logger level="INFO" doc:name="Logger" message="#[payload]"/> <!-- why is payload always = 1? -->
</flow>
A good idea is to check that the first database component really returns 1, i.e. that the record was inserted. Do this, produce an alert on error, and then restore the original payload and continue your flow.
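For example, a minimal sketch of such a check placed right after the db:insert, assuming (as described above) that the insert returns the affected row count as the payload:
<choice doc:name="Check insert result">
    <when expression="#[payload != 1]">
        <!-- the insert did not affect exactly one row: raise an alert
             (or throw an exception here to stop the flow) -->
        <logger level="ERROR" message="Insert failed, rows affected: #[payload]" doc:name="Alert"/>
    </when>
    <otherwise>
        <logger level="INFO" message="Record inserted" doc:name="Logger"/>
    </otherwise>
</choice>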
The best solution to avoid losing the actual value of your payload after the database insert is to make use of the Message Enricher processor.
Try the code below:
<flow name="mainFlow">
<http:listener config-ref="HTTP_Listener_Configuration" path="hello" doc:name="HTTP"/>
<cxf:jaxws-service configuration-ref="CXF_Configuration" serviceClass="kansas.MuleTestServiceImpl" doc:name="CXF"/>
<enricher source="#[payload]" target="#[flowVars.insertResponse]" doc:name="Message Enricher">
<db:insert config-ref="Oracle_Configuration" doc:name="Database">
<db:parameterized-query><![CDATA[insert into tblRequest (id, correlationId) values(#[payload], #[message.correlationId])]]></db:parameterized-query>
</db:insert>
</enricher>
<flow-ref name="dbSelectSubFlow" doc:name="dbSelectSubFlow"/>
<logger message="#[payload]" level="INFO" doc:name="Logger"/>
</flow>
<sub-flow name="dbSelectSubFlow">
<until-successful objectStore-ref="MyObjectStore" maxRetries="5" millisBetweenRetries="2000" doc:name="Until Successful">
<db:select config-ref="Oracle_Configuration" doc:name="Database">
<db:parameterized-query><![CDATA[select correlationId,msgResponse from tblResponse where correlationId = #[message.correlationId]]]></db:parameterized-query>
</db:select>
</until-successful>
</sub-flow>

Parsing multiple records after Polling in Mule and pushing every single record to the queue

As of now, when I poll the database with a query, it fetches multiple records during the specified duration. The problem is that these multiple records are pushed as a single message into the queue. How do I push each record from the set of records as an individual message?
As you have explained, the JDBC endpoint is fetching a collection of records and sending them as one single message to the queue. There are two options to solve this.
Use Mule's For-Each message processor. This iterates through the collection object and processes each item as one message.
Use Mule's collection splitter to iterate through the collection of records (a sketch is shown after the option 1 flow below).
The flow for option 1 looks like this:
<flow name="JDBC-For-Each-JMS-Flow" >
<jdbc-ee:inbound-endpoint queryKey="SelectAll" mimeType="text/javascript" queryTimeout="500000" pollingFrequency="1000" doc:name="Database">
<jdbc-ee:query key="SelectAll" value="select * from users"/>
</jdbc-ee:inbound-endpoint>
<foreach doc:name="For Each" collection="#[payload]" >
<jms:outbound-endpoint doc:name="JMS"/>
</foreach>
<catch-exception-strategy doc:name="Catch Exception Strategy"/>
</flow>
Note: This is a sample flow.
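A minimal sketch for option 2, using a collection splitter instead of for-each (the JDBC endpoint is the same as above; the queue name "out" is just illustrative):
<flow name="JDBC-Splitter-JMS-Flow">
    <jdbc-ee:inbound-endpoint queryKey="SelectAll" queryTimeout="500000" pollingFrequency="1000" doc:name="Database">
        <jdbc-ee:query key="SelectAll" value="select * from users"/>
    </jdbc-ee:inbound-endpoint>
    <!-- splits the collection so that each record travels through the rest of the flow as its own message -->
    <collection-splitter doc:name="Collection Splitter"/>
    <jms:outbound-endpoint queue="out" doc:name="JMS"/>
</flow>
The practical difference is that for-each processes each record inside its scope and then continues with the original collection, while the splitter turns each record into a separate message for the rest of the flow.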
Hope this helps.

Mule MEL to read database result-set from a second 'database outbound endpoint'

I have a flow something like this:
A 'Database inbound endpoint' which polls (every 5 mins) a MySQL database server and gets a result-set from a select query (this automatically becomes the current payload, i.e. #[message.payload]).
A 'For each' component with a 'Logger' component in it, using the expression #[message.payload].
Now the flow has one more 'Database outbound endpoint' component which executes another select query and obtains a result-set.
A 'For each' component with a 'Logger' component in it, using the expression #[message.payload].
Note: both loggers print the result-set of the first query. I mean the second logger is also showing the result-set of the first query, because that result-set is stored as the payload.
So, my questions are:
What is the MEL to read the result-set of the second database query in the above scenario?
Is there any other way to read the result-set in the flow?
Here is the configuration XML
<jdbc-ee:connector name="oracle_database" dataSource-ref="Oracle_Data_Source" validateConnections="true" queryTimeout="-1" pollingFrequency="0" doc:name="Database"/>
<flow name="testFileSaveFlow3" doc:name="testFileSaveFlow3">
<poll frequency="1000" doc:name="Poll">
<jdbc-ee:outbound-endpoint exchange-pattern="one-way" queryKey="selectTable1" queryTimeout="-1" connector-ref="oracle_database" doc:name="get data from table 1">
<jdbc-ee:query key="selectTable1" value="SELECT * FROM TABLE1"/>
</jdbc-ee:outbound-endpoint>
</poll>
<foreach doc:name="For Each">
<logger message="#[message.payload]" level="INFO" doc:name="prints result-set of table1"/>
</foreach>
<jdbc-ee:outbound-endpoint exchange-pattern="one-way" queryKey="selectTable2" queryTimeout="-1" connector-ref="oracle_database" doc:name="get data from table 2">
<jdbc-ee:query key="selectTable2" value="SELECT * FROM TABLE2"/>
</jdbc-ee:outbound-endpoint>
<foreach doc:name="For Each">
<logger message="#[message.payload]" level="INFO" doc:name="prints result-set of table2"/>
</foreach>
</flow>
Thanks in advance.
This is not an issue with MEL. It is an issue with your flow logic.
The second result set is not available in the message.
The JDBC outbound endpoint is one-way, so the Mule flow will not wait for the reply (result set) from the second JDBC (outbound) endpoint in the middle of the flow. That is why the second time it also prints the first result set.
Type 1:
Try making your JDBC outbound endpoint request-response instead of one-way.
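For example, changing the second endpoint from the question to:
<jdbc-ee:outbound-endpoint exchange-pattern="request-response" queryKey="selectTable2" queryTimeout="-1" connector-ref="oracle_database" doc:name="get data from table 2">
    <jdbc-ee:query key="selectTable2" value="SELECT * FROM TABLE2"/>
</jdbc-ee:outbound-endpoint>
With request-response the result set of SELECT * FROM TABLE2 replaces the payload, so the second for-each will iterate over the TABLE2 rows.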
Type 2:
Use a Mule Enricher around the JDBC outbound endpoint: it calls the DB and stores the result set in a variable, and you can then loop over that variable.
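A minimal sketch of the enricher approach, reusing the connector and query from the question (the enricher needs a request-response endpoint inside it, and the variable name table2Rows is just illustrative):
<enricher target="#[flowVars.table2Rows]" doc:name="Message Enricher">
    <jdbc-ee:outbound-endpoint exchange-pattern="request-response" queryKey="selectTable2" queryTimeout="-1" connector-ref="oracle_database" doc:name="get data from table 2">
        <jdbc-ee:query key="selectTable2" value="SELECT * FROM TABLE2"/>
    </jdbc-ee:outbound-endpoint>
</enricher>
<foreach collection="#[flowVars.table2Rows]" doc:name="For Each">
    <logger message="#[message.payload]" level="INFO" doc:name="prints result-set of table2"/>
</foreach>
This leaves the original payload (the TABLE1 result set) untouched while still giving you the TABLE2 rows to loop over.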
Hope this helps.