I have a Spring Boot 2.x app using WebFlux where I'm streaming in a flux of Parts using @RequestBody Flux<Part>.
My problem is that I need to read the first Part, initialize an object using the content from that Part, and then pass that object to be used in the second Part. How do I go about doing that while ensuring I read each part as it's available?
My current solution is to use groupBy, but doing so triggers waiting until all parts are finished, which isn't acceptable.
Here's an example of what I'm trying to do:
parts.scan(new Foo(), (foo, part) -> {
    if (part.name().equalsIgnoreCase("first_part")) {
        Jackson2JsonDecoder jackson2JsonDecoder = new Jackson2JsonDecoder();
        Mono<Metadata> metadata = jackson2JsonDecoder.decodeToMono(part.content(), ResolvableType.forClass(Metadata.class), null, null);
        // Here's my problem. How do I call foo.init(metadata) in a non-blocking way while still returning foo so it can be used by the next part?
    } else if (part.name().equalsIgnoreCase("second_part")) {
        // Use initialized foo from part 1 to push second_part's DataBuffer Flux
    }
    return foo;
});
Thanks!
Every time I think I understand WebFlux and Project Reactor, I find out I have no idea.
I'm making some API calls: I want to call one first, get information back, and use that information to make subsequent calls.
So I do it like so:
public Mono<ResponseObject> createAggregatedRecords(RecordToPersist recordToPersist){
    return randomApiClient.createRecord(recordToPersist)
        .flatMap(result -> {
            return Mono.zip(
                    webClientInstance.createOtherRecord1(result.getChildRecord1()),
                    webClientInstance2.createOtherRecord2(result.getChildRecord2()),
                    webClientInstance3.createOtherRecord3(result.getChildRecord3()))
                .map(tuple -> {
                    ResponseObject respObj = new ResponseObject();
                    respObj.setChildResult1(tuple.getT1());
                    respObj.setChildResult2(tuple.getT2());
                    respObj.setChildResult3(tuple.getT3());
                    return respObj;
                });
        })
        .doOnSuccess(res -> log.info("This throws an error: {}", res.getChildResult1().getFirstField()));
}
Now, for some reason, this very code returns a null object to my Controller, and I am not printing out the object as JSON.
I suspect it is because I am nesting the Mono.zip inside the flatMap and am not returning the results back correctly. I am making all of those API calls, though, as my end-to-end integration tests are succeeding.
I thought I would return the response object from the .map function in the Mono.zip chain and then return that to the flatMap call in the chain. If I put observers on the chain, like a doOnSuccess, and print out the response object's fields, I get a null pointer. Not sure what I am missing.
Is this a good pattern to achieve that goal? Or should I try a different path?
Why can I not get the response Object to return?
I have managed to read data from my Firebase database but can't seem to re-use the String which has been read.
My successful read is as per below. When I check the logcat for the Log.d("Brand"), it actually shows the String as expected.
brandchosenRef = FirebaseDatabase.getInstance().reference
val brandsRef = brandchosenRef.child("CarList2").orderByChild("Car").equalTo(searchable_spinner_brand.selectedItem.toString())
val valueEventListener = object : ValueEventListener {
    override fun onDataChange(dataSnapshot: DataSnapshot) {
        for (ds in dataSnapshot.children) {
            Log.d("spinner brand", searchable_spinner_brand.selectedItem.toString())
            val Brand = ds.child("Brand").getValue(String::class.java)
            val brandselected = Brand.toString()
            Log.d("Brand", "$brandselected")
            selectedbrand = brandselected
            Log.d("selected brand", selectedbrand)
        }
    }

    override fun onCancelled(databaseError: DatabaseError) {
        Log.d("Branderror", "error on brand")
    }
}
brandsRef.addListenerForSingleValueEvent(valueEventListener)
What I am trying to do is write "selectedbrand" into a separate node using the following:
val carselected = searchable_spinner_brand.selectedItem.toString()
val dealref = FirebaseDatabase.getInstance().getReference("Deal_Summary2")
val dealsummayId = dealref.push().key
val summaryArray = DealSummaryArray(dealsummayId.toString(),"manual input for testing","brand","Deal_ID",carselected,extrastext.text.toString(),otherinfo.text.toString(),Gauteng,WC,KZN,"Open")
dealref.child(dealsummayId.toString()).setValue(summaryArray).addOnCompleteListener{
}
Note, in the above I was inputting "manual input for testing" to check that my write to Firebase was working, and it works as expected. If I replace that with selectedbrand, then I get the below error.
kotlin.UninitializedPropertyAccessException: lateinit property selectedbrand has not been initialized
The summary array indicated above is defined in a separate class as follows, and as seen, "manual input for testing" is declared as a String.
class DealSummaryArray(val id:String,val brand:String,val Buyer_ID:String,val Deal_ID:String,val Car:String,val extras:String,val other_info:String,val Gauteng:String,val Western_Cape:String,val KZN:String,val Status:String) {
constructor():this("","","","","","","","","","",""){
}
}
My question, simply put, is why can I not re-use the value I read from the database? Even if I was not trying to re-write it to a new node, I cannot seem to utilize the value outside of the Firebase query.
I seem to get this problem everywhere in my activities and have to find strange workarounds, like writing to a TextView and then referencing the TextView. Please assist.
Data is loaded from Firebase asynchronously, as it may take some time before you get a response from the server. To prevent blocking the application (which would be a bad experience for your users), your main code continues to run while the data is being loaded. And then when the data is available, Firebase calls your onDataChange method.
What this means in practice is that any code that needs the data from the database, needs to be inside the onDataChange method or be called from there. So any code that requires selectedbrand needs to be inside onDataChange or called from there (typically through a callback interface).
Also see:
How to check a certain data already exists in firestore or not, which contains example code, including the callback interface, in Java.
getContactsFromFirebase() method return an empty list, which contains a similar example for the Firebase Realtime Database.
Setting Singleton property value in Firebase Listener, which shows a way to make the code behave more synchronously, and explains that this may not work on various Android versions.
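What "through a callback interface" means can be sketched in plain Java with no Firebase dependency. This is only an illustration: BrandCallback, loadBrandAsync, and the "Toyota" value are made up, and CompletableFuture.runAsync merely simulates the server responding some time later, the way Firebase eventually calls onDataChange.

```java
import java.util.concurrent.CompletableFuture;

// Hypothetical callback interface -- illustrative only, not part of the Firebase API.
interface BrandCallback {
    void onBrandLoaded(String brand);
}

public class CallbackDemo {

    // Simulates an asynchronous load: the callback fires later,
    // just like onDataChange does once the server responds.
    static void loadBrandAsync(BrandCallback callback) {
        CompletableFuture.runAsync(() -> callback.onBrandLoaded("Toyota"));
    }

    // The only reliable place to consume the value is inside the callback.
    static String waitForBrand() {
        CompletableFuture<String> result = new CompletableFuture<>();
        loadBrandAsync(brand -> {
            // Code that needs the brand goes here, or is called from here.
            result.complete(brand);
        });
        return result.join(); // bridging back to synchronous code for the demo only
    }

    public static void main(String[] args) {
        System.out.println("brand = " + waitForBrand());
    }
}
```

Trying to read the value before the callback has fired is exactly the lateinit failure in the question; moving the dependent code into the callback removes the race.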
I am trying to call an external service in a micro-service application to get all responses in parallel and combine them before starting the other computation. I know I can use a block() call on each Mono object, but that will defeat the purpose of using the reactive API. Is it possible to fire up all requests in parallel and combine them at one point?
Sample code is as below. In this case "Done" prints before the actual responses come up. I also know that the subscribe call is non-blocking.
I want "Done" to be printed after all responses have been collected, so I need some kind of blocking; however, I do not want to block each and every request.
final List<Mono<String>> responseOne = new ArrayList<>();
IntStream.range(0, 10).forEach(i -> {
    Mono<String> responseMono =
            WebClient.create("https://jsonplaceholder.typicode.com/posts")
                    .post()
                    .retrieve()
                    .bodyToMono(String.class);
    System.out.println("create mono response lazy initialization");
    responseOne.add(responseMono);
});
Flux.merge(responseOne).collectList().subscribe(res -> {
    System.out.println(res);
});
System.out.println("Done");
Based on the suggestion, I came up with this which seems to work for me.
StopWatch watch = new StopWatch();
watch.start();
final List<Mono<String>> responseOne = new ArrayList<>();
IntStream.range(0, 10).forEach(i -> {
    Mono<String> responseMono =
            WebClient.create("https://jsonplaceholder.typicode.com/posts")
                    .post()
                    .retrieve()
                    .bodyToMono(String.class);
    System.out.println("create mono response lazy initialization");
    responseOne.add(responseMono);
});
CompletableFuture<List<String>> futureCount = new CompletableFuture<>();
List<String> res = new ArrayList<>();
Mono.zip(responseOne, Arrays::asList)
        .flatMapIterable(objects -> objects) // make flux of objects
        .doOnComplete(() -> futureCount.complete(res)) // runs on completion of the flux created above
        .subscribe(responseString -> res.add((String) responseString));
watch.stop();
List<String> response = futureCount.get();
System.out.println(response);
// do rest of the computation
System.out.println(watch.getLastTaskTimeMillis());
If you want your calls to be parallel it is a good idea to use Mono.zip
Now, you want Done to be printed after the collection of all the responses
So, you can modify your code as below
final List<Mono<String>> responseMonos = IntStream.range(0, 10)
        .mapToObj(index -> WebClient.create("https://jsonplaceholder.typicode.com/posts")
                .post()
                .retrieve()
                .bodyToMono(String.class))
        .collect(Collectors.toList()); // create iterable of mono of network calls

Mono.zip(responseMonos, Arrays::asList) // make parallel network calls and collect results to a list
        .flatMapIterable(objects -> objects) // make flux of objects
        .doOnComplete(() -> System.out.println("Done")) // will be printed on completion of the flux created above
        .subscribe(responseString -> System.out.println("responseString = " + responseString)); // subscribe and start emitting values from flux
It's also not a good idea to call subscribe or block explicitly in your reactive code.
is it possible to fire up all requests in parallel and combine them at one point.
That's exactly what your code is doing already. If you don't believe me, stick .delayElement(Duration.ofSeconds(2)) after your bodyToMono() call. You'll see that your list prints out after just over 2 seconds, rather than 20 (which is what it would be if executing sequentially 10 times).
The combining part is happening in your Flux.merge().collectList() call.
In this case "Done" prints before actual response comes up.
That's to be expected, as your last System.out.println() call is executing outside of the reactive callback chain. If you want "Done" to print after your list is printed (the res variable in the consumer passed to your subscribe() call), then you'll need to put it inside that consumer, not outside it.
If you're interfacing with an imperative API, and you therefore need to block on the list, you can just do:
List<String> list = Flux.merge(responseOne).collectList().block();
...which will still execute the calls in parallel (so still gain you some advantage), but then block until all of them are complete and combined into a list. (If you're just using reactor for this type of usage however, it's debatable if it's worthwhile.)
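The fan-out/join shape this answer describes (fire everything in parallel, then combine, then print "Done") can also be mimicked with the JDK's CompletableFuture. This is only an analogy to Flux.merge(...).collectList(), not Reactor code; fakeCall is a made-up stand-in for the WebClient request.

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class ParallelJoinDemo {

    // Stand-in for the WebClient call; in real code this would be a network request.
    static CompletableFuture<String> fakeCall(int i) {
        return CompletableFuture.supplyAsync(() -> "response-" + i);
    }

    // Fire all calls, then join into one list -- the analogue of
    // Flux.merge(responseOne).collectList().
    static List<String> collectAll() {
        List<CompletableFuture<String>> calls = IntStream.range(0, 10)
                .mapToObj(ParallelJoinDemo::fakeCall)
                .collect(Collectors.toList());
        return CompletableFuture.allOf(calls.toArray(new CompletableFuture[0]))
                .thenApply(v -> calls.stream()
                        .map(CompletableFuture::join)
                        .collect(Collectors.toList()))
                .join();
    }

    public static void main(String[] args) {
        System.out.println(collectAll());
        System.out.println("Done"); // guaranteed to print after the results
    }
}
```

Because "Done" is printed after the join point, it cannot appear before the collected responses, which is the ordering the question asks for.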
I'm trying to use Kotlin Kovenant because I want a promise-based solution to track my Retrofit calls.
What I did first was this:
all(
    walkingRoutePromise,
    drivingRoutePromise
) success { responses ->
    // Do stuff with the list of responses
}
where the promises I pass are those that are resolved at the completion of my Retrofit calls. However, "responses" is a list of two identical objects. When debugging, I can confirm that two different objects with different values are being passed to the respective resolve methods. However, Kovenant returns two identical objects (same location in memory).
My next attempt was this:
task {
    walkingRoutePromise
} then {
    var returnval = it.get()
    walkingDTO = returnval.deepCopy()
    drivingRoutePromise
} success {
    val returnval = it.get()
    drivingDTO = returnval.deepCopy()
    mapRoutes = MapRoutes(walkingDTO!!, drivingDTO!!)
    currentRoute = mapRoutes!!.walking
    callback()
}
where I tried to do the calls one at a time and perform deep copies of the results. This worked for the first response, but then I found that it.get() in the success block (the success block of the second call) is the same unchanged object that I get from it.get() in the then block. It seems Kovenant is implemented to use one object for all of its resolutions, but after you resolve once, the single object it uses cannot be changed. What am I supposed to do if I want to access unique values from promise.resolve(object)? It seems like a very broken system.
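For what it's worth, the "combine two independent async results into one object" pattern the poster is after can be sketched with the JDK's CompletableFuture.thenCombine, as a library-neutral alternative to Kovenant's all. The walkingRoute and drivingRoute methods below are made-up stand-ins for the Retrofit-backed promises; each future carries its own result object, so nothing is shared or overwritten between resolutions.

```java
import java.util.concurrent.CompletableFuture;

public class CombineDemo {

    // Stand-ins for the two promises resolved by the Retrofit calls.
    static CompletableFuture<String> walkingRoute() {
        return CompletableFuture.supplyAsync(() -> "walking-route");
    }

    static CompletableFuture<String> drivingRoute() {
        return CompletableFuture.supplyAsync(() -> "driving-route");
    }

    // Both calls run independently; the combiner sees each distinct result.
    static CompletableFuture<String> combined() {
        return walkingRoute().thenCombine(drivingRoute(),
                (walking, driving) -> walking + " + " + driving);
    }

    public static void main(String[] args) {
        System.out.println(combined().join());
    }
}
```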
I've got two dojo.dnd.Sources with items. Whenever an item is dropped I need to persist the new order of the items in the Sources using an xhr.
Is there a dojo event or topic that is fired after a dnd operation has (successfully) finished? What would be the best way to use it?
Probably I don't understand the problem in all its details, but I don't see why you need to process events or topics. The best way to record changes is to intercept updating methods on the relevant sources. Specifically, you need to intercept insertNodes() for drops or any other additions.
Simple example (pseudo-code):
var source1, source2;
// ...
// initialize sources
// populate sources
// ...

function getAllItems(source){
    var items = source.getAllNodes().map(function(node){
        return source.getItem(node.id);
    });
    return items;
}

function dumpSource(source){
    var items = getAllItems(source);
    // XHR items here to your server
}

function recordChanges(){
    // now we know that some change has occurred
    // it could be a drop or some programmatic updates
    // we don't really care
    dumpSource(source1);
    dumpSource(source2);
}

dojo.connect(source1, "insertNodes", recordChanges);
dojo.connect(source2, "insertNodes", recordChanges);
// now any drop or other change will trigger recordChanges()
// after the change has occurred.
You can try to be smart about that and send some diff information instead of a whole list, but it is up to you to generate it — you have everything you need for that.
You can use dojo.subscribe to do something when a drop is finished like so:
dojo.subscribe("/dnd/drop", function(source, nodes, copy, target) {
    // do your magic here
});
There are examples of using subscribe on the dojotoolkit tests site. More info about dojo publish and subscribe too.
Alternately, you could connect to the onDndDrop method.
var source = new dojo.dnd.Source( ... );
dojo.connect(source, "onDndDrop", function(source, nodes, copy, target) {
    // make magic happen here
});
Connect methods are called at the end, so the items will be there at that point.
I'm keeping this note for dojo Tree folks just like me who run into this problem. The solutions given here did not quite work in my situation. I was using a dijit.tree.dndSource with a dojo Tree, and subscribing to "/dnd/drop" allowed me to capture the event even though, at that point, my underlying data store hadn't been updated with the latest changes. So I tried waiting as Wienczny explains; that doesn't solve the problem completely, as I can't rely on a timeout to do the waiting. The time taken for the store update can vary, i.e. shorter or much longer depending on how complex your data structure is. I found the solution by overriding the onDndDrop method of the dndController: you can simply specify onDndDrop: on your tree initialization. One odd thing I found, though: you cannot hitch this method, or you will get weird behavior during dnd.
Tree
this._tree = new MapConfigTree({
    checkAcceptance: this.dndAccept,
    onDndDrop: this.onDndDrop,
    betweenThreshold: 5,
    // ...
});
method
onDndDrop : function(source, nodes, copy, target){
    if(source.dropPosition === 'Over' && (target.targetAnchor.item.type[0] == 'Test layer')) {
        this.inherited(arguments);
        // do your bit here
    } else {
        this.onDndCancel();
    }
}