I'm trying to learn Spring WebFlux & R2DBC. What I'm trying is a simple use case:
have a book table
create an API (/books) that provides a text stream and returns Flux<Book>
I'm hoping that when I hit /books once and keep my browser open, any new data inserted into the book table will be sent to the browser.
Scenario 2, still with the book table:
have a book table
create an API (/books/count) that returns the count of rows in book as Mono<Long>
I'm hoping that when I hit /books/count once and keep my browser open, any insert into or delete from the book table will send the new count to the browser.
But it does not work. After I insert new data, nothing is sent to either of my endpoints.
I need to hit /books or /books/count to get the updated data.
I think to do this I need to use Server-Sent Events? But how do I do that while also querying data? Most samples I've found are simple SSE examples that send a string at a fixed interval.
Any sample to do this?
Here is my BookApi.java
@RestController
@RequestMapping(value = "/books")
public class BookApi {

    private final BookRepository bookRepository;

    public BookApi(BookRepository bookRepository) {
        this.bookRepository = bookRepository;
    }

    @GetMapping(produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<Book> getAllBooks() {
        return bookRepository.findAll();
    }

    @GetMapping(value = "/count", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Mono<Long> count() {
        return bookRepository.count();
    }
}
BookRepository.java (R2DBC)
import org.springframework.data.r2dbc.repository.R2dbcRepository;
public interface BookRepository extends R2dbcRepository<Book, Long> {
}
Book.java
@Table("book")
@Data
@AllArgsConstructor
@NoArgsConstructor
public class Book {

    @Id
    private Long id;

    @Column(value = "name")
    private String name;

    @Column(value = "author")
    private String author;
}
Use a Processor or Sink to handle the Book-created event.
Check my example using Reactor Sinks, and read this article for the details.
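For illustration, here is a minimal sketch of the Sinks approach (this is not the linked example; the controller, the /books/stream path and the publish(...) hook are assumptions, and you would have to call publish(...) yourself after bookRepository.save(...), because R2DBC does not push table-change events):

import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Sinks;

@RestController
public class BookStreamController {

    // Multicast sink: late subscribers only see books emitted after they subscribe.
    private final Sinks.Many<Book> bookSink = Sinks.many().multicast().onBackpressureBuffer();

    // Hypothetical hook: call this after bookRepository.save(book) completes,
    // since R2DBC itself does not emit change events for the table.
    public void publish(Book saved) {
        bookSink.tryEmitNext(saved);
    }

    // SSE endpoint: the connection stays open and every published Book is written
    // to the browser as a new event.
    @GetMapping(value = "/books/stream", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<Book> streamBooks() {
        return bookSink.asFlux();
    }
}

The existing /books endpoint could then emit the current rows followed by the live updates, e.g. bookRepository.findAll().concatWith(bookSink.asFlux()).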
Or use a tailable MongoDB cursor.
A tailable cursor (on a capped collection) can do the work automatically; check the main branch of the same repo.
My example above used the WebSocket protocol; it is easy to switch to SSE or RSocket.
The post below should help you achieve your first requirement:
Spring WebFlux (Flux): how to publish dynamically
Let me know if that helps you.
I'm very new to the Spring Reactor project.
Until now I've only used Mono from WebClient's .bodyToMono() step, and mostly block() those Monos or .zip() multiple of them.
But this time I have a use case where I need to asynchronously call methods in multiple service classes, and those service classes each call multiple backend APIs.
I understand Project Reactor doesn't provide asynchronous flow by default.
But we can make the publishing and/or subscribing happen on a different thread and make the code asynchronous.
And that's what I am trying to do.
I tried to read the documentation here (the Reactor reference), but it's still not clear.
For the purpose of this question, I'm making up an imaginary scenario that is a little closer to my use case.
Let's assume we need to get a search response from Google for some texts searched under images.
Example Scenario
Let's have an endpoint in a Controller
This endpoint accepts the following object from request body
class MultimediaSearchRequest {
    Set<String> searchTexts; // many texts
    boolean isAddContent;
    boolean isAddMetadata;
}
In the controller, I'll break the above single request object into multiple objects of the type below.
class MultimediaSingleSearchRequest {
    String searchText;
    boolean isAddContent;
    boolean isAddMetadata;
}
This Controller talks to 3 Service classes.
Each of the service classes has a method searchSingleItem.
Each service class uses a few different backend APIs, but finally combines the results of those API responses into the same type of response class; let's call it MultimediaSearchResult.
class JpegSearchHandleService {
    public MultimediaSearchResult searchSingleItem(MultimediaSingleSearchRequest req) {
        return combineAllImageData(
                getNameApi(req),
                getImageUrlApi(req),
                getContentApi(req) // don't call if req.isAddContent is false
        );
    }
}
class GifSearchHandleService {
    public MultimediaSearchResult searchSingleItem(MultimediaSingleSearchRequest req) {
        return combineAllImageData(
                getNameApi(req),
                gifPartApi(req),
                someRandomApi(req),
                someOtherRandomApi(req)
        );
    }
}
class VideoSearchHandleService {
    public MultimediaSearchResult searchSingleItem(MultimediaSingleSearchRequest req) {
        return combineAllImageData(
                getNameApi(req),
                codecApi(req),
                commentsApi(req),
                anotherApi(req)
        );
    }
}
In the end, my controller returns the response as a List of MultimediaSearchResult
class MultimediaSearchResponse {
    List<MultimediaSearchResult> results;
}
If I want to do all of this asynchronously using Project Reactor, how do I achieve it?
For example, calling the searchSingleItem method in each service for each searchText asynchronously.
And even within the services, calling each backend API asynchronously (I'm already using WebClient and converting the response with bodyToMono for the backend API calls).
First, I will outline a solution for the upper "layer" of your scenario.
The code (a simple simulation of the scenario):
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

public class ChainingAsyncCallsInSpring {

    public Mono<MultimediaSearchResponse> controllerEndpoint(MultimediaSearchRequest req) {
        return Flux.fromIterable(req.getSearchTexts())
                .map(searchText -> new MultimediaSingleSearchRequest(searchText, req.isAddContent(), req.isAddMetadata()))
                .flatMap(multimediaSingleSearchRequest -> Flux.merge(
                        classOneSearchSingleItem(multimediaSingleSearchRequest),
                        classTwoSearchSingleItem(multimediaSingleSearchRequest),
                        classThreeSearchSingleItem(multimediaSingleSearchRequest)
                ))
                .collectList()
                .map(MultimediaSearchResponse::new);
    }

    private Mono<MultimediaSearchResult> classOneSearchSingleItem(MultimediaSingleSearchRequest req) {
        return Mono.just(new MultimediaSearchResult("1"));
    }

    private Mono<MultimediaSearchResult> classTwoSearchSingleItem(MultimediaSingleSearchRequest req) {
        return Mono.just(new MultimediaSearchResult("2"));
    }

    private Mono<MultimediaSearchResult> classThreeSearchSingleItem(MultimediaSingleSearchRequest req) {
        return Mono.just(new MultimediaSearchResult("3"));
    }
}
Now, some rationale.
In the controllerEndpoint() function, first we create a Flux that will emit every single searchText from the request. We map these to MultimediaSingleSearchRequest objects, so that the services can consume them with the additional metadata that was provided with the original request.
Then, Flux::flatMap the created MultimediaSingleSearchRequest objects into a merged Flux, which (as opposed to Flux::concat) ensures that all three publishers are subscribed to eagerly, i.e. they don't wait for one another. It works best in this exact scenario, where several independent publishers need to be subscribed to at the same time and their order is not important.
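To make that difference concrete, here is a tiny standalone illustration (not part of the original answer; the values and delays are arbitrary):

import java.time.Duration;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

public class MergeVsConcat {
    public static void main(String[] args) {
        Mono<String> slow = Mono.just("slow").delayElement(Duration.ofMillis(300));
        Mono<String> fast = Mono.just("fast").delayElement(Duration.ofMillis(100));

        // merge subscribes to both sources immediately, so "fast" arrives first.
        Flux.merge(slow, fast).doOnNext(System.out::println).blockLast();

        // concat subscribes to the second source only after the first completes,
        // so "slow" always arrives first and the delays add up.
        Flux.concat(slow, fast).doOnNext(System.out::println).blockLast();
    }
}

With merge the whole pipeline takes roughly as long as the slowest source; with concat the delays are summed.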
After the flat map, at this point, we have a Flux<MultimediaSearchResult>.
We continue with Flux::collectList, thus collecting the emitted values from all publishers (we could also use Flux::reduceWith here).
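For the record, the Flux::reduceWith variant mentioned above could look roughly like this (a sketch, not part of the original answer; it produces the same Mono<List<...>> as collectList):

import java.util.ArrayList;
import java.util.List;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

class CollectVariants {
    // Same outcome as .collectList(): start from an empty list and add each emitted result to it.
    static Mono<List<MultimediaSearchResult>> collectWithReduce(Flux<MultimediaSearchResult> results) {
        return results.reduceWith(ArrayList::new, (list, result) -> {
            list.add(result);
            return list;
        });
    }
}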
As a result, we now have a Mono<List<MultimediaSearchResult>>, which can easily be mapped to a Mono<MultimediaSearchResponse>.
The results list of the MultimediaSearchResponse will have 3 items for each searchText in the original request.
Hope this was helpful!
Edit
Extending the answer with a point of view from the service classes as well. Assuming that each inner (optionally skipped) call returns a different type of result, this would be one way of going about it:
public class MultimediaSearchResult {
    private Details details;
    private ContentDetails content;
    private MetadataDetails metadata;
}

public Mono<MultimediaSearchResult> classOneSearchSingleItem(MultimediaSingleSearchRequest req) {
    return Mono.zip(getSomeDetails(req), getContentDetails(req), getMetadataDetails(req))
            .map(tuple3 -> new MultimediaSearchResult(
                    tuple3.getT1(),
                    tuple3.getT2().orElse(null),
                    tuple3.getT3().orElse(null)
            ));
}

// Always wanted
private Mono<Details> getSomeDetails(MultimediaSingleSearchRequest req) {
    return Mono.just(new Details("details")); // api call etc.
}

// Wanted if isAddContent is true
private Mono<Optional<ContentDetails>> getContentDetails(MultimediaSingleSearchRequest req) {
    return req.isAddContent()
            ? Mono.just(Optional.of(new ContentDetails("content-details"))) // api call etc.
            : Mono.just(Optional.empty());
}

// Wanted if isAddMetadata is true
private Mono<Optional<MetadataDetails>> getMetadataDetails(MultimediaSingleSearchRequest req) {
    return req.isAddMetadata()
            ? Mono.just(Optional.of(new MetadataDetails("metadata-details"))) // api call etc.
            : Mono.just(Optional.empty());
}
Optionals are used for the requests that might be skipped, since Mono::zip completes empty (so the zipped map never runs) if any of the zipped publishers completes without emitting a value.
If the results of each inner call extend the same base class or are the same wrapped return type, then the original answer applies as to how they can be combined (Flux::merge etc.)
I am currently developing an API which needs to "extend" its own data (my database) with the data I receive from another API.
I have an inventory class/aggregate in the domain layer with a repository interface which is implemented in the infrastructure layer.
Now I have injected via Dependency Injection both the Entity Manager for my own database as well as the RestClient for the external API.
public class RestInventoryRepository implements InventoryRepository {

    @RestClient
    @Inject
    InventoryRestClient inventoryRestClient;

    @Inject
    EntityManager eM;
In the repository method implementation I first call the rest client and receive its representation of the requested inventory. I then map it to my inventory class with an Object Mapper. After that I try to get the additional information from my boxInventory by the same inventory id and then append it to the earlier received inventory.
The result is a rather big method which only gets bigger with further additions. My question is whether there is a good practice for handling situations like this. I found the API Composition pattern, but I am not sure if I can handle mixing a database with an API the same way as mixing different APIs.
public Optional<Inventory> findInventoryById(String inventoryId) {
    JSONObject inventoryJSON = inventoryRestClient.getInventoryById(inventoryId);
    if (inventoryJSON == null) {
        return Optional.empty();
    }

    ObjectMapper objectMapper = new ObjectMapper();
    objectMapper.registerModule(new JavaTimeModule());
    objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
    Inventory inventory = objectMapper.convertValue(inventoryJSON, Inventory.class);

    // extracts boxTagInventory
    TypedQuery<BoxInventory> typedQuery =
            eM.createQuery("FROM BoxInventory WHERE INVENTORY_ID = :inventoryId",
                    BoxInventory.class);
    typedQuery.setParameter("inventoryId", inventory.inventoryId());
    List<BoxInventory> resultSet = typedQuery.getResultList();
    if (resultSet.isEmpty()) {
        return Optional.empty();
    }
    inventory.addBoxInventory(resultSet.get(0));
    return Optional.of(inventory);
}
I am using JdbcTemplate to query Hive and then writing the results to a .csv file. I basically just generate a list of objects and then stream the list to write each record to the file.
I would like to stream the results as they come back from Hive and write them to the file, instead of waiting for the whole result set and then processing it. Can anyone point me in the right direction? Thanks!
private List<Avs> queryAvsData(String asSql) {
    List<Avs> llistAvs = new ArrayList<Avs>();
    List<Map<String, Object>> rows = hiveJdbcTemplate.queryForList(asSql);
    Iterator<Map<String, Object>> it = rows.iterator();
    while (it.hasNext()) {
        Map<String, Object> row = it.next();
        Avs laAvs = Avs.builder()
                .make((String) row.get("make"))
                .model((String) row.get("model"))
                .build();
        llistAvs.add(laAvs);
    }
    return llistAvs;
}
It doesn't look like there's a built-in solution, but you can do it. Basically, you wrap the existing functionality in an iterator, and use a spliterator to turn it into a stream. Here's a blog post on the subject:
The code implements Spring’s ResultSetExtractor interface, which is a Single Abstract Method (SAM) interface, allowing the use of a lambda expression to implement it.
The implementation wraps the SQL ResultSet in an iterator, constructs a stream using the Spliterators and StreamSupport utility classes, and applies that to a Function taking a stream of row sets and returning a generic result.
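A rough sketch of that approach, reusing the Avs builder from the question (the getters and the CSV line format are assumptions; adapt the writing part to your output of choice):

import java.io.PrintWriter;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.Iterator;
import java.util.Spliterator;
import java.util.Spliterators;
import java.util.stream.Stream;
import java.util.stream.StreamSupport;

import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.ResultSetExtractor;

public class AvsCsvExporter {

    private final JdbcTemplate hiveJdbcTemplate;

    public AvsCsvExporter(JdbcTemplate hiveJdbcTemplate) {
        this.hiveJdbcTemplate = hiveJdbcTemplate;
    }

    // Writes rows to the file as they are read, instead of building a List<Avs> first.
    public void exportToCsv(String asSql, PrintWriter out) {
        ResultSetExtractor<Void> extractor = rs -> {
            // Wrap the forward-only ResultSet in an Iterator.
            // Note: hasNext() advances the cursor; the stream machinery below calls
            // hasNext()/next() strictly alternately, so no rows are skipped.
            Iterator<Avs> it = new Iterator<Avs>() {
                @Override
                public boolean hasNext() {
                    try {
                        return rs.next();
                    } catch (SQLException e) {
                        throw new IllegalStateException(e);
                    }
                }

                @Override
                public Avs next() {
                    try {
                        return Avs.builder()
                                .make(rs.getString("make"))
                                .model(rs.getString("model"))
                                .build();
                    } catch (SQLException e) {
                        throw new IllegalStateException(e);
                    }
                }
            };
            // Turn the Iterator into a Stream and write each record as it arrives.
            Stream<Avs> rows = StreamSupport.stream(
                    Spliterators.spliteratorUnknownSize(it, Spliterator.ORDERED), false);
            rows.forEach(avs -> out.println(avs.getMake() + "," + avs.getModel()));
            return null;
        };
        hiveJdbcTemplate.query(asSql, extractor);
    }
}

Spring's queryForStream, shown in the next answer, gets you the same row-by-row streaming without the manual iterator/spliterator plumbing.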
It's possible to stream values from JdbcTemplate. The following example is a service based on Spring Boot 2.4.8.
Since I ran into problems (a connection leak) using queryForStream, I will put demo code here as a reminder that the stream must be closed after use.
import lombok.RequiredArgsConstructor;
import org.springframework.jdbc.core.SingleColumnRowMapper;
import org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate;
import org.springframework.stereotype.Service;

import java.util.Map;
import java.util.stream.Stream;

@Service
@RequiredArgsConstructor
public class DataCleaningService {

    private final NamedParameterJdbcTemplate jdbcTemplate;

    public void doSomeStreaming() {
        String nativeQuery = "SELECT string_value FROM my_table WHERE column = :valueToFilter";
        Map<String, Object> queryParameters = Map.of("valueToFilter", "my value");
        SingleColumnRowMapper<String> stringRowMapper = SingleColumnRowMapper.newInstance(String.class);

        try (Stream<String> stringValueStream = jdbcTemplate.queryForStream(nativeQuery, queryParameters, stringRowMapper)) {
            stringValueStream.forEach(stringValue -> {
                // do the needed action with the value
                // ...
                System.out.printf("My cool value: %s%n", stringValue);
            });
        }
    }
}
I have an application that acts as a SOAP server; how do I write a PHPUnit test for it?
The SOAP extension reads data from the PHP input stream. You just provide your own data there and create some integration/unit tests for your API.
Take a look at the signature of the SoapServer::handle() method. It takes a string argument, which is the request itself. The parameter is optional, and if you don't pass anything in, PHP will just read the data itself. But you can simply override it.
I used streams to do it. First you wrap the SoapServer with your own class like this:
class MyServer
{
    /** @var \SoapServer */
    private $soapServer;

    public function __construct(\SoapServer $soapServer)
    {
        $this->soapServer = $soapServer;
    }

    public function handle(Psr\Http\Message\StreamInterface $inputStream): void
    {
        $this->soapServer->handle($inputStream->getContents());
    }
}
Now you are ready to mock the request.
In your test you can do:
class MyTest extends TestCase
{
    public function testMyRequest(): void
    {
        $mySoapServer = $this->createMySoapServer();
        $request = $this->createRequest();

        $mySoapServer->handle($request);
    }

    private function createRequest(): \Psr\Http\Message\StreamInterface
    {
        $requestString = '<soap:Envelope></soap:Envelope>';
        $fh = fopen('php://temp', 'r+');
        fwrite($fh, $requestString);
        fseek($fh, 0, SEEK_SET);

        // Any PSR-7 StreamInterface implementation works here, e.g. guzzlehttp/psr7.
        return new \GuzzleHttp\Psr7\Stream($fh);
    }

    private function createMySoapServer(): MyServer
    {
        // SoapServer needs a WSDL or, in non-WSDL mode, at least the 'uri' option.
        return new MyServer(new \SoapServer(null, ['uri' => 'urn:example']));
    }
}
One thing to keep in mind: this test will generate output. You may want to assert on that output or ignore it, depending on your use case.
Another side note: what you are asking for really has nothing to do with PHPUnit. It's just a matter of designing your SOAP server correctly.
If you are wondering how to set up the stream when you have a live request, this is really simple (again using any PSR-7 stream implementation, e.g. guzzlehttp/psr7):
$server->handle(new \GuzzleHttp\Psr7\Stream(fopen('php://input', 'r+')));
I am consuming an API method and it returns a response of type Product; below is the response class structure.
public class Product
{
    public int Id;
    public string Name;
    public IList<Product> MasterProduct { get; set; }
}
The API result includes the product attributes along with the IList. Since this API cannot be consumed directly through our Windows client, we have a wrapper web API which consumes it; for this we have defined a similar Product class in the local API. The issue I am facing is when trying to map the attributes of the external API to the local one. Below is what I am trying to do.
response = Response.Result.Select(x => new Product
{
    Id = x.Id,
    Name = x.Name,
    MasterProduct = x.MasterProduct.Cast<MasterProduct>().ToList() // tried below
}).ToList();
but it fails with the error: Unable to cast object of type 'Api.Models.Product' to type 'App.DataContracts.Product'.
The MasterProduct consists of hierarchical data. I am wondering whether the approach I am taking is right or whether it has to be done through some other method. Any suggestion or help would be appreciated.
Upon searching the web I came across some code where a separate method is called to parse using Microsoft.Its.Data, but that was for a single object, whereas in my case I have a List (hierarchical).
I'd appreciate it if someone could point me to a link/sample to achieve the same.
Perhaps trying serialization/deserialization would do. Below is the code:
if (Response.Result != null)
{
    var serializedResponse = JsonConvert.SerializeObject(Response.Result, Formatting.Indented);
    response = JsonConvert.DeserializeObject<List<Product>>(serializedResponse);
}
return response;