Quarkus AMQP: send a message to a queue after request business logic (Kotlin)

Once I receive an HTTP GET/POST I have to persist an object and then send a message to a queue that other services listen to, so they can start doing other complex work.
My current issue is that I can't just call a method annotated with @Outgoing("channel"); I tried that and the method just keeps executing without ever publishing anything.
Is there a way to call a method to send a JSON payload to a queue using the Quarkus framework?
PS: I also tried RabbitMQ and then switched back to ActiveMQ.
I've followed the Quarkus tutorial on reactive messaging and tried to register something in my resource implementation, but no luck.
@Path("/part")
class PartService : PanacheRepository<PartDao>, Logging {

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    @Transactional
    fun fetchParts(): List<PartDao> {
        val partDao = PartDao(label = "Test", status = PartStatus.INBANK, creatorId = "ghost-007")
        partDao.persist()
        if (partDao.isPersistent) {
            // Send a message to a queue -> PoC
            send(partDao)
        }
        return findAll().list()
    }

    @Outgoing("part-persisted")
    @Transactional
    fun send(partDao: PartDao): CompletionStage<AmqpMessage<*>> {
        val future = CompletableFuture<AmqpMessage<*>>()
        val message = "hello from sender"
        // Debug purposes
        println("Sending (data): $message")
        logger.debug(partDao.toString())
        future.complete(AmqpMessage(message))
        return future
    }
}
Expected:
The message "hello from sender" is registered in the queue after running:
curl http://localhost/part
Actual result:
The send method just keeps executing; nothing reaches the queue.

If I understand correctly, you want to call a method that puts something onto a stream.
To my knowledge, you have to use an Emitter to do that; see e.g. https://github.com/michalszynkiewicz/devoxxpl-demo/blob/master/search/src/main/java/com/example/search/SearchEndpoint.java#L23
See also the SmallRye Reactive Messaging documentation: https://smallrye.io/smallrye-reactive-messaging/#_stream
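For illustration, a minimal Java sketch of the Emitter approach (the channel name "part-persisted" is taken from the question; the exact imports, the MicroProfile @Channel vs. the older SmallRye-specific annotations, depend on your Quarkus version):

import javax.enterprise.context.ApplicationScoped;
import javax.inject.Inject;

import org.eclipse.microprofile.reactive.messaging.Channel;
import org.eclipse.microprofile.reactive.messaging.Emitter;

@ApplicationScoped
public class PartSender {

    @Inject
    @Channel("part-persisted")   // must match an outgoing channel configured for the AMQP connector
    Emitter<String> emitter;

    // Call this from the resource after persisting; the connector forwards
    // the payload to the broker address/queue configured for the channel.
    public void send(String jsonPayload) {
        emitter.send(jsonPayload);
    }
}

The resource would then inject PartSender and call send(...) instead of relying on an @Outgoing method being picked up.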

Related

WebFlux with WebSocket: how to prevent subscribing twice to a reactive Redis messaging operation

I have a websocket implementation using a Redis messaging operation on WebFlux. It listens to a topic and returns the values via the websocket endpoint.
The problem I have is that each time a user sends a message via websocket to the endpoint, a brand new Redis subscription seems to be made. The subscribers on the Redis message topic accumulate, and the websocket responses multiply with the number of Redis topic subscriptions as well (for example, if the user sends 3 messages, the Redis topic subscriptions increase to three and the websocket connection responds three times).
I would like to know if there is a way to reuse the same subscription to the messaging topic, so multiple Redis topic subscriptions are prevented.
The code I use is as follows:
Websocket Handler
public class SendingMessageHandler implements WebSocketHandler {

    private final Gson gson = new Gson();
    private final MessagingService messagingService;

    public SendingMessageHandler(MessagingService messagingService) {
        this.messagingService = messagingService;
    }

    @Override
    public Mono<Void> handle(WebSocketSession session) {
        Flux<WebSocketMessage> stringFlux = session.receive()
                .map(WebSocketMessage::getPayloadAsText)
                .flatMap(inputData ->
                        messagingService.playGame(inputData)
                                .map(data ->
                                        session.textMessage(gson.toJson(data))
                                )
                );
        return session.send(stringFlux);
    }
}
Message Handling service
public class MessagingService {

    private final ReactiveRedisOperations<String, GamePubSub> reactiveRedisOperations;

    public MessagingService(ReactiveRedisOperations<String, GamePubSub> reactiveRedisOperations) {
        this.reactiveRedisOperations = reactiveRedisOperations;
    }

    public Flux<Object> playGame(UserInput userInput) {
        return reactiveRedisOperations.listenTo("TOPIC_NAME");
    }
}
Thank you in advance.
Instead of using ReactiveRedisOperations, a MessageListener is the way to go here. You can register a listener once and use the following as the listener:
data -> session.textMessage(gson.toJson(data))
The registration should happen only once, at the beginning of the connection. You can override void afterConnectionEstablished(WebSocketSession session) of SendingMessageHandler to accomplish this. That way a new subscription is created per websocket connection, not per message.
Also, don't forget to override afterConnectionClosed to unsubscribe from the Redis topic and clean up the listener.
Instructions on how to use MessageListener.
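A rough sketch of that idea, assuming a configured RedisMessageListenerContainer bean (the class and topic names here are illustrative, not taken from the original code):

import java.util.function.Consumer;

import org.springframework.data.redis.connection.MessageListener;
import org.springframework.data.redis.listener.ChannelTopic;
import org.springframework.data.redis.listener.RedisMessageListenerContainer;

public class GameTopicSubscription {

    private final RedisMessageListenerContainer container;
    private MessageListener listener;

    public GameTopicSubscription(RedisMessageListenerContainer container) {
        this.container = container;
    }

    // Register exactly one listener for the topic, e.g. from afterConnectionEstablished.
    public void subscribe(Consumer<String> onMessage) {
        listener = (message, pattern) -> onMessage.accept(new String(message.getBody()));
        container.addMessageListener(listener, new ChannelTopic("TOPIC_NAME"));
    }

    // Remove the listener when the connection closes so subscriptions don't accumulate.
    public void unsubscribe() {
        if (listener != null) {
            container.removeMessageListener(listener);
        }
    }
}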

Spring Cloud Gateway custom filter: WebClient.create().post() causes hanging when testing

So I've created a custom filter that, when accessed, will create a WebFlux client and POST to a predetermined URL. This seems to work fine when running, but when testing this code the test hangs (until I cancel the test). So I feel there is a possible memory leak, on top of not being able to complete the test to make sure this route is working properly. If I switch the WebClient method to get(), the resulting test of the filter works fine. With post() I am not sure what is missing.
@Component
class ProxyGatewayFilterFactory : AbstractGatewayFilterFactory<ProxyGatewayFilterFactory.Params>(Params::class.java) {

    override fun apply(params: Params): GatewayFilter {
        return OrderedGatewayFilter(
            GatewayFilter { exchange, chain ->
                exchange.request.mutate().header("test", "test1").build()
                WebClient.create().post()
                    .uri(params.proxyBasePath)
                    .body(BodyInserters.fromDataBuffers(exchange.request.body))
                    .headers { it.addAll(exchange.request.headers) }
                    .exchange()
                    .flatMap {
                        println("the POST statusCode is " + it.statusCode())
                        Mono.just(it.statusCode().is2xxSuccessful)
                    }
                    .map {
                        exchange.request.mutate().header("test", "test2").build()
                        println("exchange request uri is " + exchange.request.uri)
                        println("exchange response statusCode is " + exchange.response.statusCode)
                        exchange
                    }
                    .flatMap(chain::filter)
            }, params.order)
    }
Taken from the documentation: when using exchange() you have an obligation to consume the body.
Unlike retrieve(), when using exchange(), it is the responsibility of the application to consume any response content regardless of the scenario (success, error, unexpected data, etc). Not doing so can cause a memory leak. The Javadoc for ClientResponse lists all the available options for consuming the body. Generally prefer using retrieve() unless you have a good reason for using exchange() which does allow to check the response status and headers before deciding how to or if to consume the response.
Spring Framework 5.2.9 WebClient: this API was changed in Spring Framework 5.3.0. Spring will now force you to consume the body, because developers didn't actually read the docs.
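As a hedged sketch of the retrieve() variant (class and parameter names are placeholders, not taken from the original filter):

import org.springframework.web.reactive.function.client.WebClient;

import reactor.core.publisher.Mono;

class ProxyCall {

    // retrieve() drains and releases the response body for you, so nothing is
    // left unconsumed to make the test hang; status and headers stay available.
    Mono<Boolean> forward(String targetUrl, String payload) {
        return WebClient.create()
                .post()
                .uri(targetUrl)
                .bodyValue(payload)
                .retrieve()
                .toBodilessEntity()
                .map(entity -> entity.getStatusCode().is2xxSuccessful());
    }
}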

How to read the request body with Spring WebFlux

I'm using Spring 5, Netty and Spring WebFlux to develop an API gateway. Sometimes I want the request to be stopped by the gateway, but I also want to read the request body, to log it for example, and return an error to the client.
I try to do this in a WebFilter by subscribing to the body.
@Override
public Mono<Void> filter(ServerWebExchange exchange, GatewayFilterChain chain) {
    if (enabled) {
        logger.debug("gateway is enabled. The Request is routed.");
        return chain.filter(exchange);
    } else {
        logger.debug("gateway is disabled. A 404 error is returned.");
        exchange.getRequest().getBody().subscribe();
        exchange.getResponse().setStatusCode(HttpStatus.NOT_FOUND);
        return exchange.getResponse().writeWith(Mono.just(exchange.getResponse().bufferFactory().allocateBuffer(0)));
    }
}
When I do this it works when the content of the body is small. But when I have a large body, only the first element of the Flux is read, so I can't get the entire body. Any idea how to do this?
1. Add readBody() to the post route:
builder.routes()
    .route("get_route", r -> r.path("/**")
        .and().method("GET")
        .filters(f -> f.filter(myFilter))
        .uri(myUrl))
    .route("post_route", r -> r.path("/**")
        .and().method("POST")
        .and().readBody(String.class, requestBody -> {return true;})
        .filters(f -> f.filter(myFilter))
        .uri(myUrl))
2. Then you can get the body string in your filter:
String body = exchange.getAttribute("cachedRequestBodyObject");
Advantages:
No blocking.
No need to refill the body for further processing.
Works with Spring Boot 2.0.6.RELEASE + Spring Cloud Finchley.SR2 + Spring Cloud Gateway.
The problem here is that you are subscribing manually within the filter, which means you're disconnecting the reading of the request from the rest of the pipeline. Calling subscribe() gives you a Disposable that helps you manage the underlying Subscription.
So you need to connect the whole process into a single pipeline, a bit like:
Flux<DataBuffer> requestBody = exchange.getRequest().getBody();
// decode the request body as a Mono or a Flux
Mono<String> decodedBody = decodeBody(requestBody);
exchange.getResponse().setStatusCode(HttpStatus.NOT_FOUND);
return decodedBody.doOnNext(s -> logger.info(s))
        .then(exchange.getResponse().setComplete());
Note that decoding the whole request body as a Mono means your gateway will have to buffer the whole request body in memory.
DataBuffer is, on purpose, a low-level type. If you'd like to decode it (i.e. implement the sample decodeBody method) as a String, you can use one of the various Decoder implementations in Spring, like StringDecoder.
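For illustration, one possible decodeBody sketch that joins the buffers and reads them as UTF-8 (this buffers the whole body in memory, as noted above; the class name is made up):

import java.nio.charset.StandardCharsets;

import org.springframework.core.io.buffer.DataBuffer;
import org.springframework.core.io.buffer.DataBufferUtils;

import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

class BodyDecoder {

    // Join all DataBuffers into one, copy the bytes out, then release the pooled buffer.
    Mono<String> decodeBody(Flux<DataBuffer> body) {
        return DataBufferUtils.join(body)
                .map(buffer -> {
                    byte[] bytes = new byte[buffer.readableByteCount()];
                    buffer.read(bytes);
                    DataBufferUtils.release(buffer);
                    return new String(bytes, StandardCharsets.UTF_8);
                });
    }
}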
Now, because this is a rather large and complex space, you can use (or at least take a look at) Spring Cloud Gateway, which does just that and much more.

Spring Cloud Stream unable to parse message posted to RabbitMQ using Spring RestTemplate

I have an issue getting a message into a spring-cloud-stream Spring Boot app.
I am using RabbitMQ as the message engine.
The message producer is a non-Spring-Boot app which sends a message using Spring RestTemplate.
Queue name: "audit.logging.rest"
The consumer application is set up to listen to that queue. This app is a Spring Boot app (spring-cloud-stream).
Below is the consumer code
application.yml
cloud:
  stream:
    bindings:
      restChannel:
        binder: rabbit
        destination: audit.logging
        group: rest
AuditServiceApplication.java
@SpringBootApplication
public class AuditServiceApplication {

    @Bean
    public ByteArrayMessageConverter byteArrayMessageConverter() {
        return new ByteArrayMessageConverter();
    }

    @Input
    @StreamListener(AuditChannelProperties.REST_CHANNEL)
    public void receive(AuditTestLogger logger) {
        ...
    }
AuditTestLogger.java
public class AuditTestLogger {

    private String applicationName;

    public String getApplicationName() {
        return applicationName;
    }

    public void setApplicationName(String applicationName) {
        this.applicationName = applicationName;
    }
}
Below is the request being sent from the producer App in JSON format.
{"applicationName" : "AppOne" }
I found a couple of issues:
Issue 1:
What I noticed is that the method below is triggered only when the method parameter is declared as Object, because spring-cloud-stream is not able to parse the message into the Java POJO.
@Input
@StreamListener(AuditChannelProperties.REST_CHANNEL)
public void receive(AuditTestLogger logger) {
Issue 2:
When I changed the method to receive an Object, I see the object is of type RMQTextMessage, which cannot be parsed. However, I can see the actual posted message inside it, under the text property.
I wrote a ByteArrayMessageConverter, which didn't help either.
Is there any way to tell Spring Cloud Stream to extract the message from the RMQTextMessage using a MessageConverter and get the actual message out of it?
Thanks in advance.
RMQTextMessage? It looks like it is part of rabbitmq-jms-client.
In the case of the RabbitMQ binder you should rely only on Spring AMQP.
Now let's figure out what your producer application is doing.
Since you get RMQTextMessage as the value in the @StreamListener method, that tells me the sender really is using rabbitmq-jms-client for producing, and therefore the real AMQP message in the queue has that RMQTextMessage as a wrapper around the real payload.
Why not use Spring AMQP there as well?
It's a late reply, but I had the exact same problem and solved it by sending and receiving the messages in application/json format. Use this in the Spring Cloud Stream config:
content-type: application/json
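For reference, a sketch of where that property would sit in the consumer's configuration shown in the question (binding, destination and group names taken from the question above):

cloud:
  stream:
    bindings:
      restChannel:
        binder: rabbit
        destination: audit.logging
        group: rest
        content-type: application/json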

Java Spring RabbitMQ consumer

I am trying to create a RabbitMQ consumer in the Java Spring framework. I need to implement the RabbitMQ RPC model, so basically the consumer shall receive a message from the RabbitMQ broker, process it, and send the result back to the associated reply queue.
Can somebody please point me to neat sample code that implements this requirement in Spring?
Thanks in advance.
Consider using the Spring AMQP Project.
See the documentation about async consumers. You just need to implement a POJO method and use a MessageListenerAdapter (which is inserted by default when using XML configuration); if your POJO method returns a result, the framework will automatically send the reply to the replyTo in the inbound message, which can be a simple queue name or an exchange/routingKey.
<rabbit:listener-container connection-factory="rabbitConnectionFactory">
    <rabbit:listener queues="some.queue" ref="somePojo" method="handle"/>
</rabbit:listener-container>
public class SomePojo {

    public String handle(String in) {
        return in.toUpperCase();
    }
}
Or, you can use the @RabbitListener annotation in your POJO; again, see the documentation.
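As an illustration (queue name borrowed from the XML example above), a minimal annotation-based listener; because the method returns a value, the framework sends the result to the replyTo of the incoming message, giving the RPC-style behaviour:

import org.springframework.amqp.rabbit.annotation.RabbitListener;
import org.springframework.stereotype.Component;

@Component
public class SomeListenerPojo {

    // The returned String is sent back to the replyTo address of the inbound message.
    @RabbitListener(queues = "some.queue")
    public String handle(String in) {
        return in.toUpperCase();
    }
}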
Thanks Gary, it worked for me. I used the @RabbitListener annotation.
Strangely, it only works when I provide the queue alone; specifying a binding of exchange, routing key and queue doesn't work. Not sure what the issue is here.
Here is the client code snippet in Python.
#!/usr/bin/env python
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(
    host='localhost'))
channel = connection.channel()
channel.queue_declare(queue='myQueue', durable='true')
channel.basic_publish(exchange='myExchange',
                      routing_key='rpc_queue',
                      body='Hello World!')
print " [x] Sent 'Hello World!'"
connection.close()
Here is the Spring consumer code.
@RabbitListener(
    bindings = @QueueBinding(
        value = @Queue(value = "myQueue", durable = "true"),
        exchange = @Exchange(value = "myExchange"),
        key = "rpc_queue")
)
public void processOrder(Message message) {
    String messageBody = new String(message.getBody());
    System.out.println("Received : " + messageBody);
}
Not sure what's going wrong with this binding.