I want to stream a file in a reactive way using Spring WebFlux.
What should my endpoint look like, and more specifically, what is the type of the object?
@GetMapping("/file")
Flux<???> file() {
    // Read file content into this ??? thing.
}
You can return a Resource instance like this:
@GetMapping("/file")
Mono<Resource> file() {
    // Create a ClassPathResource, for example
}
Note that this supports Byte Range HTTP requests automatically.
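A minimal sketch of that approach, assuming the file lives on the classpath at a placeholder path like static/report.pdf and is served as an octet stream:
@GetMapping(value = "/file", produces = MediaType.APPLICATION_OCTET_STREAM_VALUE)
Mono<Resource> file() {
    // ClassPathResource is just one option; FileSystemResource or UrlResource work as well.
    return Mono.just(new ClassPathResource("static/report.pdf"));
}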
My microservice needs to communicate with 2 different services over HTTP. One has an API contract with snake_case JSON, while the other uses camelCase. How can I configure WebFlux to deserialize and serialize JSON with a certain Jackson ObjectMapper on one set of functional endpoints, while using another one on different endpoints?
The WebFlux documentation shows how to wire in another ObjectMapper, but this applies to all the endpoints of my API. So right now either all my JSON is in snake_case or all of it is in camelCase. I can't find any resource that solves this issue, but it must be doable, right?
Update: to make it clear, I want to configure the web server which receives the requests from other services, not the WebClient for sending HTTP requests myself. I know how to do the latter.
You can use the @JsonNaming annotation on the classes you want to serialize/deserialize and specify what type of naming strategy you want. See:
jackson-advanced-annotations
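For example, a rough sketch (the UserDto class name is just a placeholder; on newer Jackson versions the strategy class lives under PropertyNamingStrategies instead):
@JsonNaming(PropertyNamingStrategy.SnakeCaseStrategy.class)
public class UserDto {
    // serialized/deserialized as "application_name" instead of "applicationName"
    private String applicationName;

    public String getApplicationName() { return applicationName; }
    public void setApplicationName(String applicationName) { this.applicationName = applicationName; }
}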
Okay, so this is not the cleaned-up solution I will eventually use in our library, but the basic gist of my workaround looks like this:
@Controller
public class Handler {

    private ObjectMapper mapper;

    public Handler(@Qualifier("snakeCaseWrapper") ObjectMapper mapper) {
        this.mapper = mapper;
    }

    Mono<ServerResponse> returnUser(final ServerRequest request) {
        // REQUEST DESERIALIZATION
        var messageReader = new DecoderHttpMessageReader<>(new Jackson2JsonDecoder(mapper));
        var configuredRequest = ServerRequest.create(request.exchange(), List.of(messageReader));
        // RESPONSE SERIALIZATION
        return configuredRequest.bodyToMono(UserDto.class)
                .map(userDto -> {
                    try {
                        return mapper.writeValueAsString(userDto);
                    } catch (JsonProcessingException e) {
                        e.printStackTrace();
                        // properly handle the error here
                        return "";
                    }
                })
                .flatMap(json -> ServerResponse.ok()
                        .contentType(MediaType.APPLICATION_JSON)
                        .body(BodyInserters.fromObject(json))
                );
    }
}
This is the only way I could find to programmatically choose which ObjectMapper to use for a specific endpoint/handler method for request deserialization. For response serialization, the trick was to first use the ObjectMapper to serialize the response body to a String, and then put that String into the response with BodyInserters.fromObject(json).
It works, so I'm happy with it.
The title basically explains itself.
I have a REST endpoint with Vert.x. Upon hitting it, I have some logic which results in an AWS S3 object.
My previous logic did not upload to S3 but saved the file locally, so I could simply respond with routerCxt.response().sendFile(file_path...).
Now that the file is in S3, I have to download it locally before I can call the code above.
That is slow and inefficient. I would like to stream the S3 object directly to the response object.
In Express, it's something like this. s3.getObject(params).createReadStream().pipe(res);.
I read a little bit, and saw that Vert.x has a class called Pump. But it is used with vertx.fileSystem() in the examples.
I am not sure how to plug the InputStream from S3's getObjectContent() into vertx.fileSystem() to use Pump.
I am not even sure Pump is the correct way because I tried to use Pump to return a local file, and it didn't work.
router.get("/api/test_download").handler(rc -> {
    rc.response().setChunked(true).endHandler(endHandlr -> rc.response().end());
    vertx.fileSystem().open("/Users/EmptyFiles/empty.json", new OpenOptions(), ares -> {
        AsyncFile file = ares.result();
        Pump pump = Pump.pump(file, rc.response());
        pump.start();
    });
});
Is there any example for me to do that?
Thanks
It can be done if you use the Vert.x WebClient to communicate with S3 instead of the Amazon Java Client.
The WebClient can pipe the content to the HTTP server response:
webClient = WebClient.create(vertx, new WebClientOptions().setDefaultHost("s3-us-west-2.amazonaws.com"));

router.get("/api/test_download").handler(rc -> {
    HttpServerResponse response = rc.response();
    response.setChunked(true);
    webClient.get("/my_bucket/test_download")
        .as(BodyCodec.pipe(response))
        .send(ar -> {
            if (ar.failed()) {
                rc.fail(ar.cause());
            } else {
                // Nothing to do: the content has been sent to the client and response.end() has been called
            }
        });
});
The trick is to use the pipe body codec.
I'm using Spring 5, Netty and Spring WebFlux to develop an API gateway. Sometimes I want the gateway to stop the request, but I also want to read the body of the request, for example to log it, and then return an error to the client.
I tried to do this in a WebFilter by subscribing to the body.
@Override
public Mono<Void> filter(ServerWebExchange exchange, GatewayFilterChain chain) {
    if (enabled) {
        logger.debug("gateway is enabled. The request is routed.");
        return chain.filter(exchange);
    } else {
        logger.debug("gateway is disabled. A 404 error is returned.");
        exchange.getRequest().getBody().subscribe();
        exchange.getResponse().setStatusCode(HttpStatus.NOT_FOUND);
        return exchange.getResponse().writeWith(Mono.just(exchange.getResponse().bufferFactory().allocateBuffer(0)));
    }
}
When I do this it works when the content of the body is small, but when I have a large body, only the first element of the flux is read, so I can't get the entire body. Any idea how to do this?
1. Add readBody() to the POST route:
builder.routes()
    .route("get_route", r -> r.path("/**")
        .and().method("GET")
        .filters(f -> f.filter(myFilter))
        .uri(myUrl))
    .route("post_route", r -> r.path("/**")
        .and().method("POST")
        .and().readBody(String.class, requestBody -> {return true;})
        .filters(f -> f.filter(myFilter))
        .uri(myUrl))
    .build();
2. Then you can get the body string in your filter:
String body = exchange.getAttribute("cachedRequestBodyObject");
Advantages:
No blocking.
No need to refill the body for further processing.
Works with Spring Boot 2.0.6.RELEASE + Spring Cloud Finchley.SR2 + Spring Cloud Gateway.
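For reference, a hedged sketch of what myFilter might look like once readBody() has cached the body (the class name LoggingGatewayFilter and the logging are just illustrative):
public class LoggingGatewayFilter implements GatewayFilter {

    private static final Logger logger = LoggerFactory.getLogger(LoggingGatewayFilter.class);

    @Override
    public Mono<Void> filter(ServerWebExchange exchange, GatewayFilterChain chain) {
        // readBody() has already consumed the request body and cached it as an exchange attribute
        String body = exchange.getAttribute("cachedRequestBodyObject");
        logger.info("rejected request body: {}", body);
        exchange.getResponse().setStatusCode(HttpStatus.NOT_FOUND);
        return exchange.getResponse().setComplete();
    }
}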
The problem here is that you are subscribing manually within the filter, which means you're disconnecting the reading of the request from the rest of the pipeline. Calling subscribe() gives you a Disposable that helps you manage the underlying Subscription.
So you need to connect the whole process as a single pipeline, a bit like:
Flux<DataBuffer> requestBody = exchange.getRequest().getBody();
// decode the request body as a Mono or a Flux
Mono<String> decodedBody = decodeBody(requestBody);
exchange.getResponse().setStatusCode(HttpStatus.NOT_FOUND);
return decodedBody.doOnNext(s -> logger.info(s))
    .then(exchange.getResponse().setComplete());
Note that decoding the whole request body as a Mono means your gateway will have to buffer the whole request body in memory.
DataBuffer is, on purpose, a low-level type. If you'd like to decode it (i.e. implement the sample decodeBody method) as a String, you can use one of the various Decoder implementations in Spring, like StringDecoder.
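A hedged sketch of such a decodeBody method, assuming the body is plain text (the MIME type and hints are placeholders you may need to adjust):
private Mono<String> decodeBody(Flux<DataBuffer> requestBody) {
    // StringDecoder joins the DataBuffer chunks and decodes them into a single String
    return StringDecoder.allMimeTypes()
            .decodeToMono(requestBody,
                    ResolvableType.forClass(String.class),
                    MimeTypeUtils.TEXT_PLAIN,
                    Collections.emptyMap());
}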
Now because this is a rather large and complex space, you can use and/or take a look at Spring Cloud Gateway, which does just that and way more.
I have an issue getting a message into my spring-cloud-stream spring-boot app.
I am using RabbitMQ as the message engine.
The message producer is a non-spring-boot app, which sends the message using Spring's RestTemplate.
Queue Name: "audit.logging.rest"
The consumer application is set up to listen to that queue. This app is a spring-boot app (spring-cloud-stream).
Below is the consumer code
application.yml
cloud:
  stream:
    bindings:
      restChannel:
        binder: rabbit
        destination: audit.logging
        group: rest
AuditServiceApplication.java
@SpringBootApplication
public class AuditServiceApplication {

    @Bean
    public ByteArrayMessageConverter byteArrayMessageConverter() {
        return new ByteArrayMessageConverter();
    }

    @Input
    @StreamListener(AuditChannelProperties.REST_CHANNEL)
    public void receive(AuditTestLogger logger) {
        ...
    }
}
AuditTestLogger.java
public class AuditTestLogger {

    private String applicationName;

    public String getApplicationName() {
        return applicationName;
    }

    public void setApplicationName(String applicationName) {
        this.applicationName = applicationName;
    }
}
Below is the request being sent from the producer App in JSON format.
{"applicationName" : "AppOne" }
I found a couple of issues:
Issue 1:
What I noticed is that the method below only gets triggered when the method parameter is declared as Object, because spring-cloud-stream is not able to parse the message into the Java POJO.
@Input
@StreamListener(AuditChannelProperties.REST_CHANNEL)
public void receive(AuditTestLogger logger) {
Issue 2:
When I changed the method to receive an Object, I see that the object is of type RMQTextMessage, which cannot be parsed. However, I can see the actually posted message inside it, under the text property.
I also wrote a ByteArrayMessageConverter, which didn't help either.
Is there any way to tell Spring Cloud Stream to extract the message from the RMQTextMessage using a MessageConverter and get the actual payload out of it?
Thanks in advance.
RMQTextMessage? Looks like it is a part of rabbitmq-jms-client.
With the RabbitMQ binder you should rely only on Spring AMQP.
Now let's figure out what your producer application is doing.
Since you get an RMQTextMessage as the value for the @StreamListener method, that tells me the sender really does use rabbitmq-jms-client for producing, and therefore the real AMQP message in the queue has that RMQTextMessage as a wrapper around the real payload.
Why not use Spring AMQP there as well?
It's a late reply, but I had the exact same problem and solved it by sending and receiving the messages in application/json format. Use this in the Spring Cloud Stream config:
content-type: application/json
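For example, a sketch of where that property would sit in the application.yml above (binding name kept from the question):
cloud:
  stream:
    bindings:
      restChannel:
        content-type: application/json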
I am new to RabbitMQ and I am trying to send an .sh file over RabbitMQ. I have set up my queue and exchanges. I am using spring-amqp, and I can send JSON messages with my listener container:
public SimpleMessageListenerContainer messageListenerContainer() {
    SimpleMessageListenerContainer container = new SimpleMessageListenerContainer(connectionFactory());
    container.setQueues(topicQueue());
    container.setAcknowledgeMode(AcknowledgeMode.AUTO);
    container.setMessageListener(new MessageListenerAdapter(pageListener(), jsonMessageConverter()));
    return container;
}
but I am not sure how to send an .sh file and handle it in my pageListener. Any idea how to do it?
You need to read the file and send the content.
You can use a SimpleMessageConverter (the default) and if the content_type property is text/plain, you'll get a String; otherwise you'll get a byte[].
On the receiving side (presumably) you'd have to write it to a file and set the permissions.
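A hedged sketch of both sides, assuming a RabbitTemplate is available and using placeholder exchange, routing key, and file paths:
// Producer side: read the script and send its bytes as an octet-stream message.
byte[] content = Files.readAllBytes(Paths.get("/scripts/setup.sh"));
MessageProperties props = new MessageProperties();
props.setContentType(MessageProperties.CONTENT_TYPE_BYTES);
rabbitTemplate.send("topicExchange", "scripts.setup", new Message(content, props));

// Consumer side (e.g. the listener method invoked by MessageListenerAdapter):
// with the default SimpleMessageConverter and a non-text content type, the payload arrives as byte[].
public void handleMessage(byte[] body) throws IOException {
    Path target = Paths.get("/tmp/setup.sh");
    Files.write(target, body);
    target.toFile().setExecutable(true);
}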