Is there an integration of Spring Cloud Function WebFlux + Spring Cloud Stream with an HTTP source?

I am trying to integrate Spring Cloud Stream with Spring Cloud Function WebFlux, since the reactive streams support in Spring Cloud Stream is being deprecated in future releases:
https://cloud.spring.io/spring-cloud-static/spring-cloud-stream/2.1.2.RELEASE/single/spring-cloud-stream.html#spring-cloud-stream-preface-notable-deprecations
Spring Cloud Function Web can expose an endpoint for a function with paths as in the doc:
https://cloud.spring.io/spring-cloud-static/spring-cloud-function/1.0.0.RELEASE/single/spring-cloud-function.html
From Spring Cloud Stream I can see that a source needs to be defined as a Supplier:
https://cloud.spring.io/spring-cloud-static/spring-cloud-stream/2.1.2.RELEASE/single/spring-cloud-stream.html#_spring_cloud_function
But my use case is to take POST data from a reactive HTTP endpoint and ingest it into Kafka. Is there any way of achieving that with Spring Cloud Function Web and Spring Cloud Stream?
From the doc for Spring Cloud Function with Spring Cloud Stream:
@SpringBootApplication
@EnableBinding(Source.class)
public class SourceFromSupplier {

    public static void main(String[] args) {
        SpringApplication.run(SourceFromSupplier.class, "--spring.cloud.stream.function.definition=date");
    }

    @Bean
    public Supplier<Date> date() {
        return () -> new Date(12345L);
    }
}
If I run this, I can see a date getting inserted into Kafka every second, and calling the supplier's GET endpoint at localhost:8080/date returns a date response. Is there any way of injecting the payload from a POST into Kafka with Spring Cloud Function?

There is an issue, which your question helped to discover, that has to do with a lifecycle inconsistency between the auto-configurations provided by function and stream. It manifests itself in such a way that the REST endpoint created by Spring Cloud Function cannot see the bindings, as it is created much earlier.
We'll address the issue shortly. Meanwhile, there is a workaround which requires you to access the output channel from the ApplicationContext (see below):
import java.util.function.Consumer;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.messaging.Source;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.support.MessageBuilder;

@SpringBootApplication
@EnableBinding(Source.class)
public class SimpleFunctionRabbitDemoApplication {

    public static void main(String[] args) throws Exception {
        SpringApplication.run(SimpleFunctionRabbitDemoApplication.class);
    }

    @Bean
    public Consumer<String> storeSync(ApplicationContext context) {
        return v -> {
            // Look up the output binding lazily, after it has been created
            MessageChannel channel = context.getBean(Source.OUTPUT, MessageChannel.class);
            channel.send(MessageBuilder.withPayload(v).build());
        };
    }
}
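Once this starts, spring-cloud-function-web exposes the consumer at a path derived from the bean name, so you can POST the payload to /storeSync and it will be pushed to the output binding. A client-side sketch for completeness (WebClient here is just illustrative, any HTTP client works; bodyValue assumes Spring Framework 5.2+):
import org.springframework.http.MediaType;
import org.springframework.web.reactive.function.client.WebClient;

public class StoreSyncClient {

    public static void main(String[] args) {
        WebClient.create("http://localhost:8080")
                .post()
                .uri("/storeSync") // path matches the Consumer bean name
                .contentType(MediaType.TEXT_PLAIN)
                .bodyValue("my-payload") // this payload ends up on the Source output binding
                .retrieve()
                .toBodilessEntity()
                .block();
    }
}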

Related

How to monitor meters in Spring Webflux for a reactor-netty server

I am new to Spring Boot and Spring WebFlux. I am working on a Spring WebFlux reactor-netty server to handle WebSocket connections. In the simplest sense, this is what the server looks like:
...
@Component
public class ServerWebSocketHandler implements WebSocketHandler {

    private final Logger logger = LoggerFactory.getLogger(getClass());

    @Override
    public Mono<Void> handle(WebSocketSession session) {
        String sessionId = session.getId();
        Sinks.Many<String> unicastSink = Sinks.many().unicast().onBackpressureError();
        // save the unicastSink in cache so that on-demand messages can be sent to the sink

        Mono<Void> receiver =
                session
                        .receive()
                        .map(WebSocketMessage::getPayloadAsText)
                        .doOnNext(message -> this.handleIncomingMessage(sessionId, message))
                        .doOnError(error ->
                                logger.info("Error occurred in the session - Session: '{}'; Error: '{}'",
                                        sessionId, error))
                        .doFinally(s -> this.cleanUp(sessionId, s))
                        .then();

        Mono<Void> sender =
                session.send(unicastSink.asFlux().map(session::textMessage));

        return Mono.zip(receiver, sender).then();
    }

    // handleIncomingMessage, cleanUp, and other private methods to handle business logic
}
Now I want to monitor the meters, specifically the ones that can help in identifying back pressure or memory leaks, like reactor.netty.eventloop.pending.tasks, reactor.netty.bytebuf.allocator.used.direct.memory, and reactor.netty.bytebuf.allocator.used.heap.memory. I read about these meters in the Reactor Netty Reference Guide: https://projectreactor.io/docs/netty/1.1.0-SNAPSHOT/reference/index.html#_metrics. The guide shows how to enable them during server creation, but in WebFlux all of that is abstracted away. So my question is: in this case, how can I enable these meters, and how do I consume them? A small example showing how to do it would be greatly useful.
You can use the Spring Boot API for configuring the web server:
https://docs.spring.io/spring-boot/docs/current/reference/htmlsingle/#howto.webserver.configure
@Component
public class MyNettyWebServerCustomizer
        implements WebServerFactoryCustomizer<NettyReactiveWebServerFactory> {

    @Override
    public void customize(NettyReactiveWebServerFactory factory) {
        factory.addServerCustomizers(httpServer -> httpServer.metrics(...));
    }
}
These built-in Reactor Netty metrics use Micrometer, so you can consume them with anything that integrates with Micrometer.
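For example, a minimal sketch filling in the metrics(...) call (same imports as the customizer above; the class name and the uriTagValue lambda are assumptions, and mapping URIs onto a few fixed values keeps the URI tag from exploding in cardinality):
@Component
public class NettyMetricsCustomizer
        implements WebServerFactoryCustomizer<NettyReactiveWebServerFactory> {

    @Override
    public void customize(NettyReactiveWebServerFactory factory) {
        // Enable the built-in Reactor Netty meters; the second argument derives
        // the value of the URI tag attached to each meter.
        factory.addServerCustomizers(httpServer ->
                httpServer.metrics(true, uri -> uri.startsWith("/ws") ? "/ws" : "other"));
    }
}
With Spring Boot Actuator on the classpath, the meters then show up under /actuator/metrics, e.g. /actuator/metrics/reactor.netty.eventloop.pending.tasks.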

Spring cloud stream dlq processing with spring cloud function for rabbitmq

I have read the Spring Cloud Stream RabbitMQ binder reference document, which mentions DLQ processing using @RabbitListener: https://docs.spring.io/spring-cloud-stream-binder-rabbit/docs/3.0.10.RELEASE/reference/html/spring-cloud-stream-binder-rabbit.html#rabbit-dlq-processing
Can we achieve the same via a Spring Cloud Function, like we do for consumers?
Like
@Bean
public Consumer<Message<?>> dlqprocess(DLQProcess dlqprocess) {
    // 'do' is a reserved word in Java; assuming the handler method is called process(...)
    return t -> dlqprocess.process(t);
}
I am not sure whether we can do this or not. If this is allowed, what other configuration do we have to do?
If your aim is to requeue failed messages, the function can just throw an exception, as described in the docs.
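For example (a sketch; whether the rejected message is requeued or dead-lettered depends on binder settings such as requeueRejected and autoBindDlq):
@Bean
public Consumer<Message<String>> dlqprocess() {
    return message -> {
        if (message.getPayload().isEmpty()) {
            // The binder handles the exception: the message is rejected and,
            // depending on the binding configuration, requeued or sent to the DLQ.
            throw new IllegalArgumentException("Empty payload, rejecting");
        }
        // normal processing ...
    };
}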
Furthermore, if you need more fine-grained control over sending and requeuing messages, you can use StreamBridge. Here you need to explicitly define the DLQ binding in the configuration file:
spring.cloud.stream.bindings.myDlq-out-0.destination=DLX
spring.cloud.stream.rabbit.bindings.myDlq-out-0.producer.exchangeType=direct
spring.cloud.stream.rabbit.bindings.myDlq-out-0.producer.routingKeyExpression='myDestination.myGroup'
spring.cloud.stream.source=myDlq
Finally, the function controls whether to send and/or requeue the message:
@Bean
public Consumer<Message<?>> process(StreamBridge streamBridge) {
    return t -> {
        // ....
        if (republish) streamBridge.send("myDlq-out-0", t);
        if (sendToDestination) streamBridge.send("myDestination-out-0", t);
        // ....
    };
}
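Note that the myDestination-out-0 binding used above would need to be declared as well, for example (a sketch; spring.cloud.stream.source takes a semicolon-separated list of source names):
spring.cloud.stream.bindings.myDestination-out-0.destination=myDestination
spring.cloud.stream.source=myDlq;myDestination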

Webflux access log header

How do I customize the Reactor access log in Spring WebFlux?
I am able to turn on the Reactor Netty access log by setting
-Dreactor.netty.http.server.accessLogEnabled=true
I would like to customize the format, e.g. I need a few request headers to be logged and the IP address removed.
Any hints on achieving this in a Spring WebFlux application would be helpful.
You can do it programmatically like this:
@Component
public class MyNettyWebServerCustomizer
        implements WebServerFactoryCustomizer<NettyReactiveWebServerFactory> {

    @Override
    public void customize(NettyReactiveWebServerFactory factory) {
        factory.addServerCustomizers(httpServer ->
                httpServer.accessLog(true,
                        x -> AccessLog.create("method={}, uri={}", x.method(), x.uri())));
    }
}
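To pull in request headers, the format can reference AccessLogArgProvider#requestHeader; the IP address is dropped simply by not including it in the format. A sketch of the same customizer with that change (class name and header choice are illustrative):
@Component
public class HeaderAccessLogCustomizer
        implements WebServerFactoryCustomizer<NettyReactiveWebServerFactory> {

    @Override
    public void customize(NettyReactiveWebServerFactory factory) {
        factory.addServerCustomizers(httpServer ->
                httpServer.accessLog(true,
                        x -> AccessLog.create("method={}, uri={}, user-agent={}",
                                x.method(), x.uri(), x.requestHeader("User-Agent"))));
    }
}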
More about custom access logging can be found in the documentation.

How can WebFlux handle global errors, like 404 page not found

I use @RestControllerAdvice and @ExceptionHandler, but I can only handle controller exceptions. Server errors like 404 and 500 can't be handled this way.
@RestControllerAdvice
public class HttpExceptionHandler {

    private static final Logger logger = LoggerFactory.getLogger(HttpExceptionHandler.class);

    @ExceptionHandler(value = Exception.class)
    public String exceptions(Exception e) {
        String code = Global.ERR_UNKNOWN;
        if (e instanceof MethodNotAllowedException) {
            code = Global.ERR_HTTP_METHOD;
        }
        return code;
    }
}
If you're using Spring Boot, this is already done for you and you can customize this support as well quite easily (see Spring Boot reference docs).
If you're using plain Spring Framework, then you need to register a custom WebExceptionHandler bean to handle that (see Spring Framework reference docs). Because those errors can happen at any point during request handling (i.e. not only during the controller handling phase, but also during response encoding, within a WebFilter...), the API there is quite low level and you need to deal with raw DataBuffer instances. If you're looking for inspiration on how to achieve higher level error handling support, you can also take a look at what's done in Spring Boot.
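For reference, a minimal low-level sketch of such a WebExceptionHandler (class name and JSON body are illustrative; getStatus() assumes Spring Framework 5):
import java.nio.charset.StandardCharsets;

import org.springframework.core.annotation.Order;
import org.springframework.core.io.buffer.DataBuffer;
import org.springframework.http.HttpStatus;
import org.springframework.http.MediaType;
import org.springframework.stereotype.Component;
import org.springframework.web.server.ResponseStatusException;
import org.springframework.web.server.ServerWebExchange;
import org.springframework.web.server.WebExceptionHandler;

import reactor.core.publisher.Mono;

@Component
@Order(-2) // run before the default exception handlers
public class GlobalErrorWebExceptionHandler implements WebExceptionHandler {

    @Override
    public Mono<Void> handle(ServerWebExchange exchange, Throwable ex) {
        // 404, 405 and friends surface here as ResponseStatusException subclasses
        HttpStatus status = ex instanceof ResponseStatusException
                ? ((ResponseStatusException) ex).getStatus()
                : HttpStatus.INTERNAL_SERVER_ERROR;
        exchange.getResponse().setStatusCode(status);
        exchange.getResponse().getHeaders().setContentType(MediaType.APPLICATION_JSON);
        // The low-level API requires writing raw DataBuffer instances
        byte[] body = ("{\"code\":" + status.value() + "}").getBytes(StandardCharsets.UTF_8);
        DataBuffer buffer = exchange.getResponse().bufferFactory().wrap(body);
        return exchange.getResponse().writeWith(Mono.just(buffer));
    }
}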

spring cloud stream unable to parse message posted to RabbitMq using Spring RestTemplate

I have an issue getting a message into a spring-cloud-stream Spring Boot app.
I am using RabbitMQ as the message engine.
The message producer is a non-Spring-Boot app, which sends a message using Spring RestTemplate.
Queue name: "audit.logging.rest"
The consumer application is set up to listen to that queue. This app is a Spring Boot app (spring-cloud-stream).
Below is the consumer code:
application.yml
cloud:
  stream:
    bindings:
      restChannel:
        binder: rabbit
        destination: audit.logging
        group: rest
AuditServiceApplication.java
@SpringBootApplication
public class AuditServiceApplication {

    @Bean
    public ByteArrayMessageConverter byteArrayMessageConverter() {
        return new ByteArrayMessageConverter();
    }

    @Input
    @StreamListener(AuditChannelProperties.REST_CHANNEL)
    public void receive(AuditTestLogger logger) {
        ...
    }
}
AuditTestLogger.java
public class AuditTestLogger {

    private String applicationName;

    public String getApplicationName() {
        return applicationName;
    }

    public void setApplicationName(String applicationName) {
        this.applicationName = applicationName;
    }
}
Below is the request being sent from the producer app in JSON format:
{"applicationName" : "AppOne" }
I found a couple of issues:
Issue 1:
What I noticed is that the method below only gets triggered when the method parameter is declared as Object, as spring-cloud-stream is not able to parse the message into the Java POJO.
@Input
@StreamListener(AuditChannelProperties.REST_CHANNEL)
public void receive(AuditTestLogger logger) {
Issue 2:
When I changed the method to receive an Object, I see the object is of type RMQTextMessage, which cannot be parsed. However, I can see the actual posted message within it, under its text property.
I wrote a ByteArrayMessageConverter, which didn't help either.
Is there any way to tell Spring Cloud Stream to extract the message from the RMQTextMessage using a MessageConverter and get the actual message out of it?
Thanks in advance.
RMQTextMessage? Looks like it is part of rabbitmq-jms-client.
In the case of the RabbitMQ binder you should rely only on Spring AMQP.
Now let's figure out what your producer application is doing.
Since you get an RMQTextMessage as the value in the @StreamListener method, that tells me the sender really uses rabbitmq-jms-client for producing, and therefore the real AMQP message in the queue has that RMQTextMessage as a wrapper around the real payload.
Why not use Spring AMQP there as well?
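For illustration, a producer-side sketch using Spring AMQP instead of the JMS client, so a plain JSON AMQP message lands in the queue (the exchange and routing key here are assumptions based on the binder's destination/group conventions):
import org.springframework.amqp.rabbit.connection.CachingConnectionFactory;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.amqp.support.converter.Jackson2JsonMessageConverter;

public class AuditLogProducer {

    public static void main(String[] args) {
        CachingConnectionFactory connectionFactory = new CachingConnectionFactory("localhost");
        RabbitTemplate template = new RabbitTemplate(connectionFactory);
        // Serialize the payload to JSON so the consumer side can bind it to the POJO
        template.setMessageConverter(new Jackson2JsonMessageConverter());

        AuditTestLogger payload = new AuditTestLogger();
        payload.setApplicationName("AppOne");
        // The binder binds queue "audit.logging.rest" to the "audit.logging" exchange
        template.convertAndSend("audit.logging", "audit.logging.rest", payload);

        connectionFactory.destroy();
    }
}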
It's a late reply, but I had the exact same problem and solved it by sending and receiving the messages in application/json format. Use this in the Spring Cloud Stream config:
content-type: application/json
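In the consumer's application.yml from the question, that property sits on the binding, like so (a sketch; the docs also spell it contentType):
cloud:
  stream:
    bindings:
      restChannel:
        binder: rabbit
        destination: audit.logging
        group: rest
        content-type: application/json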