I am currently trying to learn consumer-driven contract testing in the context of services that communicate via ActiveMQ messages.
Spring offers documentation for Spring Cloud Contract Verifier Messaging:
https://cloud.spring.io/spring-cloud-contract/spring-cloud-contract.html#_spring_cloud_contract_verifier_messaging
but I was not able to follow it with the setup I have (Spring services and ActiveMQ).
My question is:
Is it a good idea to use consumer-driven contract testing for messaging?
What are the best practices?
If it is a good idea, do you have any good tutorials on consumer-driven contract testing for Spring services that communicate via JMS and ActiveMQ?
I managed to create a custom MessageVerifier for JMS and ActiveMQ which looks like this:
package de.itemis.seatreservationservice;

import com.fasterxml.jackson.databind.ObjectMapper;
import de.itemis.seatreservationservice.domain.ReservationRequest;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.contract.verifier.messaging.MessageVerifier;
import org.springframework.context.annotation.Primary;
import org.springframework.jms.core.JmsTemplate;
import org.springframework.messaging.support.GenericMessage;
import org.springframework.stereotype.Component;

import javax.jms.JMSException;
import javax.jms.Message;
import javax.jms.TextMessage;
import java.io.IOException;
import java.util.Map;
import java.util.concurrent.TimeUnit;

@Component
@Primary
public class JmsMessageVerifier implements MessageVerifier {

    @Autowired
    JmsTemplate jmsTemplate;

    @Autowired
    ObjectMapper mapper;

    @Override
    public void send(Object message, String destination) {
        jmsTemplate.convertAndSend(destination, message, new ReplyToProcessor());
    }

    @Override
    public Object receive(String destination, long timeout, TimeUnit timeUnit) {
        // JmsTemplate expects milliseconds, so convert from the given time unit
        jmsTemplate.setReceiveTimeout(timeUnit.toMillis(timeout));
        return receiveMessage(destination);
    }

    @Override
    public Object receive(String destination) {
        return receiveMessage(destination);
    }

    @Override
    public void send(Object payload, Map headers, String destination) {
        ReservationRequest request = null;
        try {
            request = mapper.readValue((String) payload, ReservationRequest.class);
        } catch (IOException e) {
            e.printStackTrace();
        }
        jmsTemplate.convertAndSend(destination, request, new ReplyToProcessor());
    }

    private Object receiveMessage(String queueName) {
        Message message = jmsTemplate.receive(queueName);
        TextMessage textMessage = (TextMessage) message;
        try {
            return new GenericMessage<>(textMessage.getText());
        } catch (JMSException e) {
            e.printStackTrace();
            return null;
        }
    }
}
Currently I need this class in both test folders (producer and consumer).
Normally I would expect the JmsMessageVerifier from my producer to be packaged inside the generated stubs JAR so that the consumer contract tests can use it instead of implementing their own.
What are your thoughts on this, Marcin Grzejszczak?
I would create an issue for that if it is a useful feature.
Here is the repository with both services:
but I was not able to follow it with the setup I have (Spring services and ActiveMQ).
ActiveMQ is not supported out of the box; you have to provide your own bean of the MessageVerifier type, which teaches the framework how to send and receive messages.
Is it a good idea to use consumer-driven contract testing for messaging?
Absolutely! You can follow the same flow as with HTTP, just for messaging.
What are the best practices?
It depends ;) You can follow the standard practices of doing CDC as if it were HTTP-based communication. If you want to abstract the producer and the consumer of such messages so that you care more about the topic / queue as such, you can follow this guideline https://cloud.spring.io/spring-cloud-static/Greenwich.SR2/single/spring-cloud.html#_how_can_i_define_messaging_contracts_per_topic_not_per_producer where we describe how to define messaging contracts per topic rather than per producer.
If it is a good idea, do you have any good tutorials on consumer-driven contract testing for Spring services that communicate via JMS and ActiveMQ?
As I said earlier, we don't have such support out of the box. You can, however, use Spring Integration or Apache Camel and communicate with ActiveMQ through those.
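To make the Spring Integration route a bit more concrete, here is a rough sketch (for illustration only, not an official sample): it bridges an ActiveMQ queue into a Spring Integration channel, which the built-in Spring Cloud Contract messaging support can then work with. The queue name reservation-requests and the channel name reservationRequests are assumptions, not taken from the question.

import javax.jms.ConnectionFactory;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.jms.dsl.Jms;

@Configuration
public class JmsBridgeConfiguration {

    // Bridges the ActiveMQ queue into a Spring Integration channel so that
    // the stock contract-verifier messaging support can read from it.
    @Bean
    public IntegrationFlow reservationRequestsInbound(ConnectionFactory connectionFactory) {
        return IntegrationFlows
                .from(Jms.messageDrivenChannelAdapter(connectionFactory)
                        .destination("reservation-requests")) // assumed queue name
                .channel("reservationRequests")               // channel the contracts would refer to
                .get();
    }
}

With messages flowing through a channel like this, the generated contract tests can use the Spring Integration flavour of the verifier instead of a hand-rolled MessageVerifier.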
Related
I am new to Spring Boot and Spring Webflux. I am working on a Spring Webflux reactor-netty server to handle WebSocket connections. In the simplest sense, this is what the server looks like:
...
@Component
public class ServerWebSocketHandler implements WebSocketHandler {

    private final Logger logger = LoggerFactory.getLogger(getClass());

    @Override
    public Mono<Void> handle(WebSocketSession session) {
        String sessionId = session.getId();
        Sinks.Many<String> unicastSink = Sinks.many().unicast().onBackpressureError();
        // save the unicastSink in cache so that on demand messages can be sent to the sink

        Mono<Void> receiver =
            session
                .receive()
                .map(WebSocketMessage::getPayloadAsText)
                .doOnNext(message -> this.handleIncomingMessage(sessionId, message))
                .doOnError(error -> {
                    logger.info("Error occurred in the session - Session: '{}'; Error: '{}'", sessionId, error);
                })
                .doFinally(s -> {
                    this.cleanUp(sessionId, s);
                })
                .then();

        Mono<Void> sender =
            session
                .send(unicastSink.asFlux().map(session::textMessage));

        return Mono.zip(receiver, sender).then();
    }

    // handleIncomingMessage, cleanUp, and other private methods to handle business logic
}
Now, I want to monitor the meters, specifically meters that can help in identifying back pressure or memory leaks, like reactor.netty.eventloop.pending.tasks, reactor.netty.bytebuf.allocator.used.direct.memory, and reactor.netty.bytebuf.allocator.used.heap.memory. I read about these meters in the Reactor Netty Reference Guide https://projectreactor.io/docs/netty/1.1.0-SNAPSHOT/reference/index.html#_metrics. The example there enables them on server creation, but in WebFlux all of this is abstracted away. So my question is: in this case, how can I enable these meters and how can I consume them? A small example showing how to do it would be greatly useful.
You can use the Spring Boot API for configuring the web server:
https://docs.spring.io/spring-boot/docs/current/reference/htmlsingle/#howto.webserver.configure
@Component
public class MyNettyWebServerCustomizer
        implements WebServerFactoryCustomizer<NettyReactiveWebServerFactory> {

    @Override
    public void customize(NettyReactiveWebServerFactory factory) {
        factory.addServerCustomizers(httpServer -> httpServer.metrics(...));
    }
}
These built-in Reactor Netty metrics use Micrometer, so you can consume them with anything that integrates with Micrometer.
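As a rough illustration of consuming one of these meters (just a sketch; the class below and its wiring are assumptions, only the meter name comes from the question), you can look the gauge up in the Micrometer MeterRegistry:

import io.micrometer.core.instrument.Gauge;
import io.micrometer.core.instrument.MeterRegistry;
import org.springframework.stereotype.Component;

@Component
public class NettyMemoryMetricsReader {

    private final MeterRegistry registry;

    public NettyMemoryMetricsReader(MeterRegistry registry) {
        this.registry = registry;
    }

    // Looks up the direct-memory gauge that Reactor Netty registers once metrics are enabled.
    public Double usedDirectMemory() {
        Gauge gauge = registry.find("reactor.netty.bytebuf.allocator.used.direct.memory").gauge();
        return gauge != null ? gauge.value() : null;
    }
}

With Spring Boot Actuator on the classpath, the same meter is also exposed under /actuator/metrics/reactor.netty.bytebuf.allocator.used.direct.memory.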
I'm trying to create a scheduled job sample, but the task doesn't execute.
What's wrong?
@ApplicationScoped
public class CustomApplication extends Application {
    @Override
    public Set<Class<?>> getClasses() {
        Set<Class<?>> classes = new HashSet<Class<?>>();
        classes.add(FileService.class);
        return classes;
    }
}
public class FileService {
    public void schedulerFeature() throws InterruptedException {
        Scheduling.fixedRateBuilder()
                .delay(4)
                .initialDelay(2)
                .timeUnit(TimeUnit.SECONDS)
                .task(inv -> {
                    System.out.println("Running in:" + Thread.currentThread().getName());
                    System.out.println("Every 4 seconds an action, with an initial delay");
                })
                .build();
        Thread.sleep(12000);
    }
}
I'm trying to create a scheduled job sample, but the task doesn't execute.
I don't know which version of Helidon you're using, and hence I don't know which version of JAX-RS/Jakarta RESTful Web Services you're using. For simplicity and brevity, I will assume you are using Helidon 3.x and therefore Jakarta RESTful Web Services 3.0.0.
This is not a Helidon question, but rather a basic JAX-RS/Jakarta RESTful Web Services question. You are really asking: "Why is my FileService class not instantiated by Jersey?"
Checking the documentation for Application#getClasses(), we can see that it reads: "Get a set of root resource, provider and feature classes." Your FileService class does not meet any of these requirements, so it is simply ignored.
Guessing some more: I see you use the word "feature" in your example. This suggests that perhaps you want this class to actually be a true Jakarta RESTful Web Services Feature. Once again, the documentation here will tell you what you need to know to do next.
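For illustration only (this class is made up, not taken from your code): a true Feature would look roughly like this, and such a class would then legitimately belong in the set returned by getClasses():

import jakarta.ws.rs.core.Feature;
import jakarta.ws.rs.core.FeatureContext;

// Hypothetical example; a Feature bundles registrations (providers, filters, properties),
// it is not a place to run scheduled work.
public class FileServiceFeature implements Feature {

    @Override
    public boolean configure(FeatureContext context) {
        context.property("file.service.enabled", true);
        return true; // returning true marks the feature as enabled
    }
}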
I revised my application and found that I had imported the wrong Maven library. To schedule a job, it is necessary to import the MicroProfile scheduling Maven artifact:
<dependency>
    <groupId>io.helidon.microprofile.scheduling</groupId>
    <artifactId>helidon-microprofile-scheduling</artifactId>
    <version>3.0.2</version>
</dependency>
import java.util.HashSet;
import java.util.Set;

import jakarta.enterprise.context.ApplicationScoped;
import jakarta.ws.rs.core.Application;

@ApplicationScoped
public class CustomApplication extends Application {
    @Override
    public Set<Class<?>> getClasses() {
        Set<Class<?>> classes = new HashSet<Class<?>>();
        classes.add(FileService.class);
        return classes;
    }
}
import io.helidon.microprofile.scheduling.Scheduled;
import jakarta.enterprise.context.ApplicationScoped;

/**
 * File service.
 */
@ApplicationScoped
public class FileService {

    @Scheduled("0/2 * * * * ? *")
    //@FixedRate(1)
    public void schedulerFeature() {
        System.out.println("Running");
    }
}
I'm trying to integrate the StarMX framework (https://github.com/rogeriogentil/starmx) into a legacy web application. This framework uses JMX technology and is initialized using the Singleton pattern: StarMXFramework.createInstance(). The web application uses Java EE 6 technologies such as EJB and CDI (also DeltaSpike). However, the way the framework is being initialized (code below) doesn't add its instance to the CDI context.
import org.starmx.StarMXException;
import org.starmx.StarMXFramework;

import javax.annotation.PostConstruct;
import javax.annotation.PreDestroy;
import javax.ejb.Singleton;
import javax.ejb.Startup;

@Startup
@Singleton
public class StarMXSingleton {

    private StarMXFramework starMX;

    @PostConstruct
    public void postConstruct() {
        try {
            starMX = StarMXFramework.createInstance();
        } catch (StarMXException e) {
            (...)
        }
    }

    @PreDestroy
    public void preDestroy() {
        if (starMX != null) {
            try {
                starMX.shutdown();
            } catch (StarMXException e) {
                (...)
            }
        }
    }
}
I know that it is possible to extend CDI, but is it possible to add an instance of the singleton framework to the CDI context?
There are two ways. The first and easy one is a producer. Here is a link to what CDI producers are and how they work. In short, CDI will use the producer to create the instance of a bean whose types are determined by the return type of the producer method.
The producer method has to be placed inside a CDI bean so that it is picked up by CDI. Note that the scope of the producer affects how often it will be invoked, just as it would with a standard bean. Here is how it could look:
@ApplicationScoped
public class SomeCdiBeanInYourApplication {

    @Produces // denotes a producer method
    @ApplicationScoped // scope of the produced bean; use a CDI scope (the @Singleton you have is an EJB annotation)
    public StarMXFramework produceMxFramework() {
        try {
            return StarMXFramework.createInstance();
        } catch (StarMXException e) {
            // createInstance() declares a checked exception, so wrap it
            throw new IllegalStateException(e);
        }
    }
}
The second way is a CDI extension, namely a lifecycle observer for the AfterBeanDiscovery event, where you can addBean(). Here is a link to the CDI 2.0 spec; feel free to browse older versions based on what version you are on.
I won't write the full code for that as it is rather complex and long; the producer should do the trick for you.
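That said, a very rough sketch of the extension route (for illustration only, assuming the CDI 2.0 BeanConfigurator API; on CDI 1.x you would have to implement the Bean interface yourself) could look like this:

import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.event.Observes;
import javax.enterprise.inject.spi.AfterBeanDiscovery;
import javax.enterprise.inject.spi.Extension;

import org.starmx.StarMXException;
import org.starmx.StarMXFramework;

// Registered via META-INF/services/javax.enterprise.inject.spi.Extension
public class StarMXExtension implements Extension {

    void registerStarMx(@Observes AfterBeanDiscovery abd) {
        abd.addBean()
           .types(StarMXFramework.class)
           .scope(ApplicationScoped.class)
           .createWith(ctx -> {
               try {
                   return StarMXFramework.createInstance();
               } catch (StarMXException e) {
                   throw new IllegalStateException(e);
               }
           });
    }
}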
See also
Please explain the @Produces annotation in CDI
I have an Apache Apex application DAG which reads RabbitMQ message from a queue. Which Apache Apex Malhar operator should I use? There are several operators but it's not clear which one to use and how to use it.
Have you looked at https://github.com/apache/apex-malhar/tree/master/contrib/src/main/java/com/datatorrent/contrib/rabbitmq ? There are also tests in https://github.com/apache/apex-malhar/tree/master/contrib/src/test/java/com/datatorrent/contrib/rabbitmq that show how to use the operator
https://github.com/apache/apex-malhar/blob/master/contrib/src/main/java/com/datatorrent/contrib/rabbitmq/AbstractRabbitMQInputOperator.java
That is the main operator code where the tuple type is a generic parameter and emitTuple() is an abstract method that subclasses need to implement.
AbstractSinglePortRabbitMQInputOperator is a simple subclass that provides a single output port and implements emitTuple() using another abstract method getTuple() which needs an implementation in its subclasses.
The tests that Sanjay pointed to show how to use these classes.
I also had problems finding out how to read messages from RabbitMQ into Apache Apex. With the help of the links provided in Sanjay's answer (https://stackoverflow.com/a/42210636/2350644), I finally managed to get it running. Here's how it all works together:
1. Set up a RabbitMQ server
There are a lot of ways to install RabbitMQ, described here: https://www.rabbitmq.com/download.html
The simplest way for me was using docker (See: https://store.docker.com/images/rabbitmq)
docker pull rabbitmq
docker run -d --hostname my-rabbit --name some-rabbit -p 5672:5672 -p 15672:15672 rabbitmq:3-management
To check if RabbitMQ is working, open a browser and navigate to: http://localhost:15672/. You should see the Management page of RabbitMQ.
2. Write a Producer program
To send messages to the queue you can write a simple Java program like this:
import com.rabbitmq.client.BuiltinExchangeType;
import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;

import java.util.Arrays;
import java.util.List;

public class Send {

    private final static String EXCHANGE = "myExchange";

    public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        factory.setHost("localhost");
        Connection connection = factory.newConnection();
        Channel channel = connection.createChannel();

        channel.exchangeDeclare(EXCHANGE, BuiltinExchangeType.FANOUT);
        String queueName = channel.queueDeclare().getQueue();
        channel.queueBind(queueName, EXCHANGE, "");

        List<String> messages = Arrays.asList("Hello", "World", "!");
        for (String msg : messages) {
            channel.basicPublish(EXCHANGE, "", null, msg.getBytes("UTF-8"));
            System.out.println(" [x] Sent '" + msg + "'");
        }

        channel.close();
        connection.close();
    }
}
If you execute the Java program, you should see some output in the RabbitMQ management UI.
3. Implement a sample Apex Application
3.1 Bootstrap a sample Apex application
Follow the official Apex documentation: http://docs.datatorrent.com/beginner/
3.2 Add additional dependencies to pom.xml
To use the classes provided by Malhar, add the following dependencies:
<dependency>
    <groupId>org.apache.apex</groupId>
    <artifactId>malhar-contrib</artifactId>
    <version>3.7.0</version>
</dependency>
<dependency>
    <groupId>com.rabbitmq</groupId>
    <artifactId>amqp-client</artifactId>
    <version>4.2.0</version>
</dependency>
3.3 Create a Consumer
We first need to create an InputOperator that consumes messages from RabbitMQ using available code from apex-malhar.
import com.datatorrent.contrib.rabbitmq.AbstractSinglePortRabbitMQInputOperator;

public class MyRabbitMQInputOperator extends AbstractSinglePortRabbitMQInputOperator<String> {

    @Override
    public String getTuple(byte[] message) {
        return new String(message);
    }
}
You only have to override the getTuple() method. In this case we simply return the message that was received from RabbitMQ.
3.4 Setup an Apex DAG
To test the application we simply add an InputOperator (the MyRabbitMQInputOperator we implemented before) that consumes data from RabbitMQ and a ConsoleOutputOperator that prints the received messages.
import com.rabbitmq.client.BuiltinExchangeType;
import org.apache.hadoop.conf.Configuration;

import com.datatorrent.api.annotation.ApplicationAnnotation;
import com.datatorrent.api.StreamingApplication;
import com.datatorrent.api.DAG;
import com.datatorrent.api.DAG.Locality;
import com.datatorrent.lib.io.ConsoleOutputOperator;

@ApplicationAnnotation(name="MyFirstApplication")
public class Application implements StreamingApplication
{
    private final static String EXCHANGE = "myExchange";

    @Override
    public void populateDAG(DAG dag, Configuration conf)
    {
        MyRabbitMQInputOperator consumer = dag.addOperator("Consumer", new MyRabbitMQInputOperator());
        consumer.setHost("localhost");
        consumer.setExchange(EXCHANGE);
        consumer.setExchangeType(BuiltinExchangeType.FANOUT.getType());

        ConsoleOutputOperator cons = dag.addOperator("console", new ConsoleOutputOperator());

        dag.addStream("myStream", consumer.outputPort, cons.input).setLocality(Locality.CONTAINER_LOCAL);
    }
}
3.5 Test the Application
To test the created application, we can simply write a unit test, so there is no need to set up a Hadoop/YARN cluster.
The bootstrapped application already contains a unit test, ApplicationTest.java, that we can use:
import java.io.IOException;

import javax.validation.ConstraintViolationException;

import org.junit.Assert;
import org.apache.hadoop.conf.Configuration;
import org.junit.Test;

import com.datatorrent.api.LocalMode;

/**
 * Test the DAG declaration in local mode.
 */
public class ApplicationTest {

    @Test
    public void testApplication() throws IOException, Exception {
        try {
            LocalMode lma = LocalMode.newInstance();
            Configuration conf = new Configuration(true);
            //conf.addResource(this.getClass().getResourceAsStream("/META-INF/properties.xml"));
            lma.prepareDAG(new Application(), conf);
            LocalMode.Controller lc = lma.getController();
            lc.run(10000); // runs for 10 seconds and quits
        } catch (ConstraintViolationException e) {
            Assert.fail("constraint violations: " + e.getConstraintViolations());
        }
    }
}
Since we don't need any properties for this application, the only thing changed in this file is commenting out the line:
conf.addResource(this.getClass().getResourceAsStream("/META-INF/properties.xml"));
If you execute ApplicationTest.java and send messages to RabbitMQ using the producer program described in step 2, the test should output all the messages.
You might need to increase the duration of the test to see all messages (it is currently set to 10 seconds).
I am new to JEE7 and have been working on some quick exercises but I've bumped into a problem. I have a sample Java SE application that sends a message to an ActiveMQ queue and I have an MDB deployed on Wildfly 8 that reads the messages as they come in. This all works fine and I can receive the messages using getText. However, when I use getBody to get the message body, I get an "Unknown Error". Can anyone let me know what I'm doing wrong?
Here's my code below;
/***CLIENT CODE****/
import javax.jms.*;
import org.apache.activemq.ActiveMQConnection;
import org.apache.activemq.ActiveMQConnectionFactory;

public class SimpleMessageClient {

    // URL of the JMS server. DEFAULT_BROKER_URL will just mean
    // that JMS server is on localhost
    private static String url = ActiveMQConnection.DEFAULT_BROKER_URL;

    // Name of the queue we will be sending messages to
    private static String subject = "MyQueue";

    public static void main(String[] args) throws JMSException {
        // Getting JMS connection from the server and starting it
        ConnectionFactory connectionFactory =
                new ActiveMQConnectionFactory(url);
        Connection connection = connectionFactory.createConnection();
        connection.start();

        // JMS messages are sent and received using a Session. We will
        // create here a non-transactional session object. If you want
        // to use transactions you should set the first parameter to 'true'
        Session session = connection.createSession(false,
                Session.AUTO_ACKNOWLEDGE);

        // Destination represents here our queue 'MyQueue' on the
        // JMS server. You don't have to do anything special on the
        // server to create it, it will be created automatically.
        Destination destination = session.createQueue(subject);

        // MessageProducer is used for sending messages (as opposed
        // to MessageConsumer which is used for receiving them)
        MessageProducer producer = session.createProducer(destination);

        // We will send a small text message
        TextMessage message = session.createTextMessage("Jai Hind");
        //Message someMsg=session.createMessage();
        // someMsg.

        // Here we are sending the message!
        producer.send(message);
        System.out.println("Sent message '" + message.getText() + "'");

        connection.close();
    }
}
And the consumer:
package javaeetutorial.simplemessage.ejb;
import java.util.logging.Level;
import java.util.logging.Logger;
import javax.annotation.Resource;
import javax.ejb.ActivationConfigProperty;
import javax.ejb.MessageDriven;
import javax.ejb.MessageDrivenContext;
import javax.jms.JMSException;
import javax.jms.Message;
import javax.jms.MessageListener;
import javax.jms.TextMessage;
@MessageDriven(activationConfig = {
    @ActivationConfigProperty(propertyName = "destinationType", propertyValue = "javax.jms.Queue"),
    @ActivationConfigProperty(propertyName = "destination", propertyValue = "MyQueue")
})
public class SimpleMessageBean implements MessageListener {

    @Resource
    private MessageDrivenContext mdc;

    static final Logger logger = Logger.getLogger("SimpleMessageBean");

    public SimpleMessageBean() {
    }

    @Override
    public void onMessage(Message inMessage) {
        try {
            if (inMessage instanceof TextMessage) {
                logger.log(Level.INFO,
                        "MESSAGE BEAN: Message received: {0}",
                        inMessage.getBody(String.class));
            } else {
                logger.log(Level.WARNING,
                        "Message of wrong type: {0}",
                        inMessage.getClass().getName());
            }
        } catch (JMSException e) {
            e.printStackTrace();
            logger.log(Level.SEVERE,
                    "SimpleMessageBean.onMessage: JMSException: {0}",
                    e.toString());
            mdc.setRollbackOnly();
        }
    }
}
Part of the error I get is:
16:47:48,510 ERROR [org.jboss.as.ejb3] (default-threads - 32) javax.ejb.EJBTransactionRolledbackException: Unexpected Error
16:47:48,511 ERROR [org.jboss.as.ejb3.invocation] (default-threads - 32) JBAS014134: EJB Invocation failed on component SimpleMessageBean for method public void javaeetutorial.simplemessage.ejb.SimpleMessageBean.onMessage(javax.jms.Message): javax.ejb.EJBTransactionRolledbackException: Unexpected Error
at org.jboss.as.ejb3.tx.CMTTxInterceptor.handleInCallerTx(CMTTxInterceptor.java:157) [wildfly-ejb3-8.2.0.Final.jar:8.2.0.Final]
at org.jboss.as.ejb3.tx.CMTTxInterceptor.invokeInCallerTx(CMTTxInterceptor.java:253) [wildfly-ejb3-8.2.0.Final.jar:8.2.0.Final]
at org.jboss.as.ejb3.tx.CMTTxInterceptor.required(CMTTxInterceptor.java:342) [wildfly-ejb3-8.2.0.Final.jar:8.2.0.Final]
The method
<T> T Message.getBody(Class<T> c)
you refer to was an addition to JMS 2.0 (see also: http://www.oracle.com/technetwork/articles/java/jms20-1947669.html).
While WildFly 8 is fully compliant with Java EE 7 and therefore supports JMS 2.0, the current ActiveMQ (5.12.0) is still restricted to JMS 1.1.
Since you presumably compile against the JMS 2.0 API in your SimpleMessageBean, you reference a method that is simply not present in the ActiveMQ message.
When you try to call the getBody() method on the message, it cannot be resolved in the message implementation and hence an AbstractMethodError is thrown. This results in the rollback of the transaction, which gives you the EJBTransactionRolledbackException.
I see two immediate solutions for your problem:
If you want to keep using ActiveMQ, confine yourself to the JMS 1.1 API. The getText() method you mentioned is part of JMS 1.1 and therefore works flawlessly; a small sketch follows after this list. See here for the JMS 1.1 API (https://docs.oracle.com/javaee/6/api/javax/jms/package-summary.html) and here for the current ActiveMQ API documentation (http://activemq.apache.org/maven/5.12.0/apidocs/index.html).
Switch to a JMS 2.x compliant message broker. Since you are using WildFly, I recommend taking a look at HornetQ (http://hornetq.jboss.org/).
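For the first option, here is a small sketch (an illustration based on your SimpleMessageBean, not a definitive implementation) of an onMessage body that stays on the JMS 1.1 API:

@Override
public void onMessage(Message inMessage) {
    try {
        if (inMessage instanceof TextMessage) {
            // JMS 1.1 style: cast to TextMessage and use getText() instead of the JMS 2.0 getBody(Class)
            TextMessage textMessage = (TextMessage) inMessage;
            logger.log(Level.INFO,
                    "MESSAGE BEAN: Message received: {0}",
                    textMessage.getText());
        } else {
            logger.log(Level.WARNING,
                    "Message of wrong type: {0}",
                    inMessage.getClass().getName());
        }
    } catch (JMSException e) {
        logger.log(Level.SEVERE,
                "SimpleMessageBean.onMessage: JMSException: {0}",
                e.toString());
        mdc.setRollbackOnly();
    }
}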