Upgrading From ActiveMQ "Classic" to ActiveMQ Artemis - class java.util.ArrayList is not a valid property type

I am upgrading from ActiveMQ "Classic" to ActiveMQ Artemis while keeping the existing client code. I have code like this in multiple places:
import java.io.IOException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;
import org.springframework.jms.core.JmsTemplate;

public class TestV {
    public static void main(String[] args) throws IOException {
        ApplicationContext ctx = new ClassPathXmlApplicationContext("root.xml");
        JmsTemplate jms = ctx.getBean(JmsTemplate.class);

        Map<String, Object> map = new HashMap<>();
        List<Integer> ids = new ArrayList<>();
        ids.add(10);
        ids.add(20);
        map.put("ids", ids);
        map.put("updated", true);

        jms.convertAndSend("mytest", map);
    }
}
How do I fix the error below, which comes from the code above?
Exception in thread "main" org.springframework.jms.UncategorizedJmsException: Uncategorized exception occurred during JMS processing; nested exception is javax.jms.JMSException: org.apache.activemq.artemis.api.core.ActiveMQPropertyConversionException: class java.util.ArrayList is not a valid property type
at org.springframework.jms.support.JmsUtils.convertJmsAccessException(JmsUtils.java:311)
at org.springframework.jms.support.JmsAccessor.convertJmsAccessException(JmsAccessor.java:185)
at org.springframework.jms.core.JmsTemplate.execute(JmsTemplate.java:507)
at org.springframework.jms.core.JmsTemplate.send(JmsTemplate.java:584)
at org.springframework.jms.core.JmsTemplate.convertAndSend(JmsTemplate.java:661)
at com.mycompany.adhoc.TestV.main(TestV.java:33)
Caused by: javax.jms.JMSException: org.apache.activemq.artemis.api.core.ActiveMQPropertyConversionException: class java.util.ArrayList is not a valid property type
at org.apache.activemq.util.JMSExceptionSupport.create(JMSExceptionSupport.java:54)
at org.apache.activemq.ActiveMQConnection.syncSendPacket(ActiveMQConnection.java:1404)
at org.apache.activemq.ActiveMQConnection.syncSendPacket(ActiveMQConnection.java:1437)
at org.apache.activemq.ActiveMQConnection.syncSendPacket(ActiveMQConnection.java:1324)
at org.apache.activemq.ActiveMQSession.send(ActiveMQSession.java:1981)
at org.apache.activemq.ActiveMQMessageProducer.send(ActiveMQMessageProducer.java:288)
at org.apache.activemq.ActiveMQMessageProducer.send(ActiveMQMessageProducer.java:223)
at org.apache.activemq.ActiveMQMessageProducerSupport.send(ActiveMQMessageProducerSupport.java:241)
at org.springframework.jms.core.JmsTemplate.doSend(JmsTemplate.java:634)
at org.springframework.jms.core.JmsTemplate.doSend(JmsTemplate.java:608)
at org.springframework.jms.core.JmsTemplate.lambda$send$3(JmsTemplate.java:586)
at org.springframework.jms.core.JmsTemplate.execute(JmsTemplate.java:504)
In ActiveMQ "Classic" we can set trusted packages in connection factory. How do I do it in ActiveMQ Artemis?

This problem has nothing to do with setting the trusted packages on the JMS ConnectionFactory.
The problem is that your application is implicitly using this JMS "extension" provided by ActiveMQ "Classic." As the documentation states:
This JMS extension feature allows you to attach Map and List properties to any JMS Message or to use nested Maps and Lists inside a MapMessage. [emphasis mine]
When you pass the Map<String, Object> variable map to JmsTemplate.convertAndSend it uses the default SimpleMessageConverter to convert that Map into a javax.jms.MapMessage. As the JavaDoc for MapMessage states:
The names are String objects, and the values are primitive data types in the Java programming language. [emphasis mine]
In other words, according to the JMS specification the values in the MapMessage can only be primitive data types. However, ActiveMQ "Classic" provides an extension which allows using List implementations. Code which uses this extension is not portable to other JMS brokers since it does not adhere to the JMS specification. This is why ActiveMQ Artemis throws the error ActiveMQPropertyConversionException: class java.util.ArrayList is not a valid property type.
You will either have to change your application code to adhere to the JMS specification (i.e. use primitive data types for the values in your Map), or this same extension will have to be implemented by ActiveMQ Artemis.
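If you go the first route, one option is to flatten the List into a value type that MapMessage does allow and let the consumer rebuild it. Below is a minimal sketch of that approach; the helper class name and the comma delimiter are illustrative choices, not part of the original code.
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

import org.springframework.jms.core.JmsTemplate;

public class TestVCompliant {

    public static void send(JmsTemplate jms) {
        List<Integer> ids = List.of(10, 20);

        Map<String, Object> map = new HashMap<>();
        // MapMessage values must be primitives, their wrappers, String or byte[],
        // so join the ids into a single String instead of passing the List itself.
        map.put("ids", ids.stream().map(String::valueOf).collect(Collectors.joining(",")));
        map.put("updated", true);

        jms.convertAndSend("mytest", map);
    }
}
On the consuming side the value can be read back with MapMessage.getString("ids") and split on the delimiter.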

Related

@Named doesn't resolve the value parameter

After upgrading to Micronaut 3.x and replacing the javax.inject annotations with jakarta.inject, @Named is unable to resolve the property.
import io.micronaut.context.annotation.Bean
import io.micronaut.context.annotation.Factory
import io.micronaut.context.annotation.Requires
import java.util.concurrent.ExecutorService
import jakarta.inject.Named
import kotlin.coroutines.CoroutineContext

@Factory
open class ExecutorServiceCoroutineContextFactory {

    @Bean
    @Requires(missingBeans = [CoroutineContext::class])
    fun executorServiceCoroutineContext(@Named("\${coroutines.executor}") executorService: ExecutorService): CoroutineContext {
        return ExecutorServiceCoroutineDispatcher(executorService)
    }
}
application.yml
coroutines:
  executor: coroutines
The resulting error
Message: No bean of type [java.util.concurrent.ExecutorService] exists for the given qualifier: @Named('${coroutines.executor}'). Make sure the bean is not disabled by bean requirements (enable trace logging for 'io.micronaut.context.condition' to check) and if the bean is enabled then ensure the class is declared a bean and annotation processing is enabled (for Java and Kotlin the 'micronaut-inject-java' dependency should be configured as an annotation processor).
Path Taken: new Controller(Client client,CoroutineContext coroutineContext) --> new Controller(Client client,[CoroutineContext coroutineContext]) --> CoroutineContext.executorServiceCoroutineContext([ExecutorService executorService],String name)
io.micronaut.context.exceptions.DependencyInjectionException: Failed to inject value for parameter [executorService] of method [executorServiceCoroutineContext] of class: kotlin.coroutines.CoroutineContext
Hardcoding the value "coroutines" works, but I am wondering why the code can't resolve the placeholder.

Where to define setTrustAllPackages=true in @MessageDriven bean in external ActiveMQ

I am doing publish-subscribe using an external ActiveMQ (5.15.10). My application is deployed on a TomEE 8.0.1 server, and the ActiveMQ configuration is done in tomee.xml.
I am able to publish the message successfully, but I am facing issues while receiving messages. In the onMessage method I need to process a POJO, and I get the error below:
"This class is not trusted to be serialized as ObjectMessage payload"
I use EclipseLink JPA in my application, and I need to send the POJO that I receive in the onMessage method to my @Stateless bean (here UserService) to process it further. So, UserService is injected with the @EJB annotation in my MDBSubscriber class below.
@MessageDriven(
    activationConfig = {
        @ActivationConfigProperty(
            propertyName = "destinationType",
            propertyValue = "javax.jms.Queue"),
        @ActivationConfigProperty(
            propertyName = "destination",
            propertyValue = "userQueue")
    }
)
public class MDBSubscriber implements MessageListener {

    @EJB
    UserService uService;

    public void onMessage(Message msg) {
        if (msg instanceof ObjectMessage) {
            try {
                ObjectMessage objMsg = (ObjectMessage) msg;
                UserForm uForm = (UserForm) objMsg.getObject();
                // ...
                uService.process(uForm);
            } catch (JMSException e) {
                // getObject() declares JMSException, so it has to be handled here
                throw new RuntimeException(e);
            }
        }
    }
}
When I read through the ActiveMQ docs, it says setTrustAllPackages=true can be set on an ActiveMQConnectionFactory object, but since I am using a @MessageDriven bean I don't have an ActiveMQConnectionFactory object in the class defined above.
So, my problem is: where or how do we set setTrustAllPackages=true for a @MessageDriven bean?
I have been stuck on this problem for more than 10 days and could not find a solution.
Can someone help me here?
You can configure this via a system property as well, which avoids the trustAllPackages connection factory option. There is already documentation for this on the ActiveMQ site.
In case you want to shortcut this mechanism, you can allow all packages to be trusted by using the * wildcard, like:
-Dorg.apache.activemq.SERIALIZABLE_PACKAGES=*
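Since the application runs on TomEE, one way to supply that system property is through CATALINA_OPTS, for example in bin/setenv.sh. A hedged sketch; the package com.mycompany.forms stands in for wherever UserForm actually lives, and java.util is included on the assumption that the form contains collection fields:
export CATALINA_OPTS="$CATALINA_OPTS -Dorg.apache.activemq.SERIALIZABLE_PACKAGES=com.mycompany.forms,java.util"
Listing only the packages you actually deserialize keeps the ObjectMessage whitelist narrower than the * wildcard.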

Consumer-driven contract testing with Spring, JMS and ActiveMQ

I am currently trying to learn consumer-driven contract testing in the context of Services that use ActiveMQ Messages to communicate.
Spring offers documentation for Spring Cloud Contract Verifier Messaging:
https://cloud.spring.io/spring-cloud-contract/spring-cloud-contract.html#_spring_cloud_contract_verifier_messaging
but I was not able to follow it with the setup I have (Spring services and ActiveMQ).
My questions are:
Is it a good idea to use consumer-driven contract testing for messaging?
What are the best practices?
If it is a good idea, do you have any good tutorials on consumer-driven contract testing for Spring services that communicate via JMS and ActiveMQ?
I managed to create a custom MessageVerifier for JMS and ActiveMQ which looks like this:
package de.itemis.seatreservationservice;

import com.fasterxml.jackson.databind.ObjectMapper;
import de.itemis.seatreservationservice.domain.ReservationRequest;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.contract.verifier.messaging.MessageVerifier;
import org.springframework.context.annotation.Primary;
import org.springframework.jms.core.JmsTemplate;
import org.springframework.messaging.support.GenericMessage;
import org.springframework.stereotype.Component;

import javax.jms.JMSException;
import javax.jms.Message;
import javax.jms.TextMessage;
import java.io.IOException;
import java.util.Map;
import java.util.concurrent.TimeUnit;

@Component
@Primary
public class JmsMessageVerifier implements MessageVerifier {

    @Autowired
    JmsTemplate jmsTemplate;

    @Autowired
    ObjectMapper mapper;

    @Override
    public void send(Object message, String destination) {
        jmsTemplate.convertAndSend(destination, message, new ReplyToProcessor());
    }

    @Override
    public Object receive(String destination, long timeout, TimeUnit timeUnit) {
        jmsTemplate.setReceiveTimeout(timeout);
        return receiveMessage(destination);
    }

    @Override
    public Object receive(String destination) {
        return receiveMessage(destination);
    }

    @Override
    public void send(Object payload, Map headers, String destination) {
        ReservationRequest request = null;
        try {
            request = mapper.readValue((String) payload, ReservationRequest.class);
        } catch (IOException e) {
            e.printStackTrace();
        }
        jmsTemplate.convertAndSend(destination, request, new ReplyToProcessor());
    }

    private Object receiveMessage(String queueName) {
        Message message = jmsTemplate.receive(queueName);
        TextMessage textMessage = (TextMessage) message;
        try {
            return new GenericMessage<>(textMessage.getText());
        } catch (JMSException e) {
            e.printStackTrace();
            return null;
        }
    }
}
Currently I need it in both test folders (producer and consumer).
Normally I would expect the JmsMessageVerifier in my producer to be packaged inside the generated stubs JAR, so that the consumer contract tests use this JmsMessageVerifier instead of implementing their own.
What are your thoughts on this, Marcin Grzejszczak?
I would create an issue for that if it is a useful feature.
Here is the repository with both services:
but I was not able to follow it with the setup I have (Spring services and ActiveMQ).
ActiveMQ is not supported out of the box; you would have to provide your own bean of the MessageVerifier type, where you teach the framework how to send and receive messages.
Is it a good idea to use consumer-driven contract testing for messaging?
Absolutely! You can follow the same flow as with HTTP, but for messaging.
What are the best practices?
It depends ;) You can follow the standard practices of doing CDC as if it were HTTP-based communication. If you want to abstract the producer and the consumer of such messages so that you care more about the topic/queue itself, you can follow this guideline: https://cloud.spring.io/spring-cloud-static/Greenwich.SR2/single/spring-cloud.html#_how_can_i_define_messaging_contracts_per_topic_not_per_producer where we describe how to define messaging contracts per topic rather than per producer.
If it is a good idea, do you have any good tutorials on consumer-driven contract testing for Spring services that communicate via JMS and ActiveMQ?
As I said earlier, we don't have such support out of the box. You can, however, use Spring Integration or Apache Camel and communicate with ActiveMQ via those.
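To make that last suggestion a bit more concrete, here is a minimal sketch of a Spring Integration Java DSL flow that bridges an ActiveMQ destination to a channel the contract tests can read from; the queue name "seatReservations", the channel name "verifications" and the config class name are illustrative assumptions, not names taken from the question.
import javax.jms.ConnectionFactory;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.jms.dsl.Jms;

@Configuration
public class ActiveMqContractTestConfig {

    // Listens on the JMS queue and republishes each message to a named channel,
    // which Spring Cloud Contract's messaging support can then verify against.
    @Bean
    public IntegrationFlow inboundReservations(ConnectionFactory connectionFactory) {
        return IntegrationFlows
                .from(Jms.messageDrivenChannelAdapter(connectionFactory)
                        .destination("seatReservations"))
                .channel("verifications")
                .get();
    }
}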

How to use Apache Apex Malhar RabbitMQ operator in DAG

I have an Apache Apex application DAG which reads RabbitMQ messages from a queue. Which Apache Apex Malhar operator should I use? There are several operators, but it's not clear which one to use or how to use it.
Have you looked at https://github.com/apache/apex-malhar/tree/master/contrib/src/main/java/com/datatorrent/contrib/rabbitmq ? There are also tests in https://github.com/apache/apex-malhar/tree/master/contrib/src/test/java/com/datatorrent/contrib/rabbitmq that show how to use the operator.
https://github.com/apache/apex-malhar/blob/master/contrib/src/main/java/com/datatorrent/contrib/rabbitmq/AbstractRabbitMQInputOperator.java
That is the main operator code where the tuple type is a generic parameter and emitTuple() is an abstract method that subclasses need to implement.
AbstractSinglePortRabbitMQInputOperator is a simple subclass that provides a single output port and implements emitTuple() using another abstract method getTuple() which needs an implementation in its subclasses.
The tests that Sanjay pointed to show how to use these classes.
I also had problems figuring out how to read messages from RabbitMQ into Apache Apex. With the help of the links provided in Sanjay's answer (https://stackoverflow.com/a/42210636/2350644) I finally managed to get it running. Here's how it all works together:
1. Setup a RabbitMQ Server
There are a lot of ways to install RabbitMQ, described here: https://www.rabbitmq.com/download.html
The simplest way for me was using Docker (see: https://store.docker.com/images/rabbitmq):
docker pull rabbitmq
docker run -d --hostname my-rabbit --name some-rabbit -p 5672:5672 -p 15672:15672 rabbitmq:3-management
To check if RabbitMQ is working, open a browser and navigate to: http://localhost:15672/. You should see the Management page of RabbitMQ.
2. Write a Producer program
To send messages to the queue you can write a simple Java program like this:
import com.rabbitmq.client.BuiltinExchangeType;
import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;

import java.util.Arrays;
import java.util.List;

public class Send {

    private final static String EXCHANGE = "myExchange";

    public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        factory.setHost("localhost");
        Connection connection = factory.newConnection();
        Channel channel = connection.createChannel();

        channel.exchangeDeclare(EXCHANGE, BuiltinExchangeType.FANOUT);
        String queueName = channel.queueDeclare().getQueue();
        channel.queueBind(queueName, EXCHANGE, "");

        List<String> messages = Arrays.asList("Hello", "World", "!");
        for (String msg : messages) {
            channel.basicPublish(EXCHANGE, "", null, msg.getBytes("UTF-8"));
            System.out.println(" [x] Sent '" + msg + "'");
        }

        channel.close();
        connection.close();
    }
}
If you execute the Java program you should see some output in the management UI of RabbitMQ.
3. Implement a sample Apex Application
3.1 Bootstrap a sample Apex application
Follow the official Apex documentation: http://docs.datatorrent.com/beginner/
3.2 Add additional dependencies to pom.xml
To use the classes provided by Malhar, add the following dependencies:
<dependency>
  <groupId>org.apache.apex</groupId>
  <artifactId>malhar-contrib</artifactId>
  <version>3.7.0</version>
</dependency>
<dependency>
  <groupId>com.rabbitmq</groupId>
  <artifactId>amqp-client</artifactId>
  <version>4.2.0</version>
</dependency>
3.3 Create a Consumer
We first need to create an InputOperator that consumes messages from RabbitMQ using available code from apex-malhar.
import com.datatorrent.contrib.rabbitmq.AbstractSinglePortRabbitMQInputOperator;

public class MyRabbitMQInputOperator extends AbstractSinglePortRabbitMQInputOperator<String> {

    @Override
    public String getTuple(byte[] message) {
        return new String(message);
    }
}
You only have to override the getTuple() method. In this case we simply return the message that was received from RabbitMQ.
3.4 Setup an Apex DAG
To test the application we simply add an InputOperator (the MyRabbitMQInputOperator we implemented before) that consumes data from RabbitMQ, and a ConsoleOutputOperator that prints the received messages.
import com.rabbitmq.client.BuiltinExchangeType;
import org.apache.hadoop.conf.Configuration;

import com.datatorrent.api.annotation.ApplicationAnnotation;
import com.datatorrent.api.StreamingApplication;
import com.datatorrent.api.DAG;
import com.datatorrent.api.DAG.Locality;
import com.datatorrent.lib.io.ConsoleOutputOperator;

@ApplicationAnnotation(name = "MyFirstApplication")
public class Application implements StreamingApplication
{
    private final static String EXCHANGE = "myExchange";

    @Override
    public void populateDAG(DAG dag, Configuration conf)
    {
        MyRabbitMQInputOperator consumer = dag.addOperator("Consumer", new MyRabbitMQInputOperator());
        consumer.setHost("localhost");
        consumer.setExchange(EXCHANGE);
        consumer.setExchangeType(BuiltinExchangeType.FANOUT.getType());

        ConsoleOutputOperator cons = dag.addOperator("console", new ConsoleOutputOperator());

        dag.addStream("myStream", consumer.outputPort, cons.input).setLocality(Locality.CONTAINER_LOCAL);
    }
}
3.5 Test the Application
To test the created application we can simply write a unit test, so there is no need to set up a Hadoop/YARN cluster.
The bootstrap application already contains a unit test, ApplicationTest.java, that we can use:
import java.io.IOException;

import javax.validation.ConstraintViolationException;

import org.junit.Assert;
import org.apache.hadoop.conf.Configuration;
import org.junit.Test;

import com.datatorrent.api.LocalMode;

/**
 * Test the DAG declaration in local mode.
 */
public class ApplicationTest {

    @Test
    public void testApplication() throws IOException, Exception {
        try {
            LocalMode lma = LocalMode.newInstance();
            Configuration conf = new Configuration(true);
            //conf.addResource(this.getClass().getResourceAsStream("/META-INF/properties.xml"));
            lma.prepareDAG(new Application(), conf);
            LocalMode.Controller lc = lma.getController();
            lc.run(10000); // runs for 10 seconds and quits
        } catch (ConstraintViolationException e) {
            Assert.fail("constraint violations: " + e.getConstraintViolations());
        }
    }
}
Since we don't need any properties for this application, the only change in this file is commenting out the line:
conf.addResource(this.getClass().getResourceAsStream("/META-INF/properties.xml"));
If you run ApplicationTest.java and send messages to RabbitMQ using the producer program described in step 2, the test should output all the messages.
You might need to increase the run time of the test to see all messages (it is currently set to 10 seconds).

How to get mule security context or security principal reference in mule flow

I was trying out authentication and authorisation in Mule and got it working. Now I want a reference to the Mule security context, specifically the principal object, to be used within the flow. How do I get hold of a reference to the principal object inside a Mule flow?
link to mule xml
The security context is available through the MuleSession, and the session is available through the eventContext. To get a reference to the eventContext, the following can be done.
This can be achieved by implementing Callable. Create the following Java class, then place a Java component in the Mule flow where it has to be invoked and configure it with the created class. Mule automatically calls the onCall method, which takes the eventContext as a parameter, so no extra configuration is required to invoke it.
The example Java component gets the security context from the session, retrieves the security principal from it, and stores it in a flow variable "user", which can be used by other flow elements that appear after this Java component in the flow.
import org.mule.api.MuleEventContext;
import org.mule.api.lifecycle.Callable;
import org.mule.api.security.Authentication;
import org.springframework.security.core.userdetails.UserDetails;

public class GetSecurityPrincipalCallable implements Callable {

    @Override
    public Object onCall(MuleEventContext eventContext) throws Exception {
        Authentication auth = eventContext.getSession().getSecurityContext()
                .getAuthentication();
        UserDetails principal = (UserDetails) auth.getPrincipal();
        System.out.println("username is : " + principal.getUsername());
        eventContext.getMessage().setInvocationProperty("user", principal);
        return null;
    }
}
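For completeness, hooking this class into a flow only needs a component element. A minimal sketch for a Mule 3 XML configuration, assuming the class sits in a hypothetical com.example package and the flow name is illustrative:
<flow name="securedFlow">
    <!-- inbound endpoint and security filter go before the component -->
    <component class="com.example.GetSecurityPrincipalCallable"/>
    <!-- downstream processors can read the principal via the "user" flow variable, e.g. #[flowVars.user] -->
</flow>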