How to merge 2 separate streams, buffer populated data from them and subscribe to it after some amount of time - kotlin

I am trying to test a situation like this:
I have 2 classes which just extend the same Parent.
I am creating an Observable from a list of items for each of the classes:
val listSomeClass1 = ArrayList<SomeClass1>()
val listSomeClass2 = ArrayList<SomeClass2>()

fun populateJust1() {
    listSomeClass1.add(SomeClass1("23", 23))
    listSomeClass1.add(SomeClass1("24", 24))
    listSomeClass1.add(SomeClass1("25", 25))
}

fun populateJust2() {
    listSomeClass2.add(SomeClass2(23.00))
    listSomeClass2.add(SomeClass2(24.00))
    listSomeClass2.add(SomeClass2(25.00))
}

populateJust1()
populateJust2()
Now I can create 2 observables:
val someClass1Observable = Observable.fromIterable(listSomeClass1)
val someClass2Observable = Observable.fromIterable(listSomeClass2)
And here I want to merge the emissions from them, buffer them, and subscribe after 10 seconds:
Observable.merge(someClass1Observable, someClass2Observable)
    .buffer(10, TimeUnit.SECONDS)
    .doOnSubscribe { Log.v("parentObservable", "STARTED") }
    .subscribe { t: MutableList<Parent> ->
        Log.v("parentObservable", "onNext")
        t.forEach { Log.v("onNext", it.toString()) }
    }
However, the observable does not start after 10 seconds as I expected; it starts immediately with the data already there.
How can I simulate something like this, so that I gather 2 separate streams and only after 10 seconds get the gathered data?
I must point out that I don't want to use any Subject.
UPDATE
I've done something like this:
val list1 = listOf(SomeClass1("1", 1), SomeClass1("2", 2), SomeClass1("3", 3))
val list2 = listOf(SomeClass2(5.00), SomeClass2(4.00), SomeClass2(6.00))

val someClass1Observable = Observable
    .fromIterable(list1)
    .zipWith(Observable.interval(2, TimeUnit.SECONDS),
        BiFunction { item: SomeClass1, _: Long -> item })

val someClass2Observable = Observable
    .fromIterable(list2)
    .zipWith(Observable.interval(1, TimeUnit.SECONDS),
        BiFunction { item: SomeClass2, _: Long -> item })

someClass1Observable.subscribe {
    Log.v("someClass1", it.toString())
}
someClass2Observable.subscribe {
    Log.v("someClass2", it.toString())
}

Observable.merge(someClass1Observable, someClass2Observable)
    .buffer(10, TimeUnit.SECONDS)
    .delay(10, TimeUnit.SECONDS)
    .doOnSubscribe { Log.v("parentObservable", "STARTED") }
    .subscribe { t: MutableList<Parent> ->
        Log.v("parentObservable", "onNext")
        t.forEach { Log.v("onNext", it.toString()) }
    }

Thread.sleep(13000)

someClass1Observable.subscribe {
    Log.v("someClass1", it.toString())
}
someClass2Observable.subscribe {
    Log.v("someClass2", it.toString())
}
Here I just want to simulate 2 infinite streams for the someClass1 and someClass2 Observables, and the same for the merged Observable.
Again, I want the ability to merge those 2 streams, buffer the populated data and do something with it after 10 seconds. If after those 10 seconds the 2 streams populate some data again, the merged Observable should clear the previous buffer, buffer the new data and emit it after another 10 seconds, and so on, infinitely. However, my code is not working as I expected. What changes do I need to make so it behaves as described?

I think you're looking for the delay operator
http://reactivex.io/documentation/operators/delay.html
Delay
shift the emissions from an Observable forward in time by a particular amount
So something like:
.delay(10, TimeUnit.SECONDS)
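For example, inserted into the merged chain from the question, it might look like this (a sketch only: delay shifts when each buffered list is delivered, not when the buffer windows start):
Observable.merge(someClass1Observable, someClass2Observable)
    .buffer(10, TimeUnit.SECONDS)   // collects whatever arrived during each 10-second window
    .delay(10, TimeUnit.SECONDS)    // delivers each buffered list 10 seconds later
    .doOnSubscribe { Log.v("parentObservable", "STARTED") }
    .subscribe { t: MutableList<Parent> ->
        t.forEach { Log.v("onNext", it.toString()) }
    }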

Related

Custom function to update data in pending kotlin channel buffer

I have an UNLIMITED size buffered channel where senders are much faster than receivers. I would like to update the buffer by removing old data and replacing it with a newer one (if the receiver has not yet consumed it).
Here is my code
import kotlinx.coroutines.channels.Channel
import kotlinx.coroutines.coroutineScope
import kotlinx.coroutines.delay
import kotlinx.coroutines.launch

data class Item(val id: Int, val value: Int)

val testData = listOf(
    Item(1, 10),
    Item(2, 24),
    Item(3, 12),
    Item(1, 17), // This one should replace the Item(1, 10) if it's not yet consumed
    Item(4, 16),
    Item(2, 32), // This one should replace the Item(2, 24) if it's not yet consumed
)

suspend fun main(): Unit = coroutineScope {
    val channel = Channel<Item>(Channel.UNLIMITED)
    launch {
        for (item in testData) {
            delay(50)
            println("Producing item $item")
            channel.send(item)
        }
    }
    // As you can see the sender already sent all the testData and they are waiting in the buffer to be consumed by the receiver.
    // I would like to do some checks whenever new item is added to the buffer
    // if(itemInBuffer.id == newItem.id && itemInBuffer.value < newItem.value) then replace it with newItem
    launch {
        for (item in channel) {
            delay(5000)
            println(item.toString())
        }
    }
}
Is there any built-in Kotlin function which takes some custom condition and removes items from the buffer? I saw there is a function called distinctUntilChangedBy in Flow which removes duplicate data based on a custom key selector. Is there anything similar available for Channel, or is it possible to achieve this with channelFlow? (Note: in my real code the events come from network calls, so I'm not sure channelFlow is suitable there.)
This isn't as simple as it sounds. We can't access the channel queue to modify its contents and moreover, even if we could, it wouldn't be easy to find an item with the same id. We would have to iterate over the whole queue. distinctUntilChangedBy() is a much different case, because it only compares the last item - it doesn't look through the whole queue.
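For comparison, here is a minimal sketch (reusing the Item class from the question) showing that distinctUntilChangedBy only looks at the directly preceding element:
import kotlinx.coroutines.flow.*
import kotlinx.coroutines.runBlocking

fun main() = runBlocking {
    // Only consecutive duplicates are dropped: Item(1, 17) follows another id=1 and is filtered out,
    // but the later Item(1, 30) passes because it comes right after id=2.
    flowOf(Item(1, 10), Item(1, 17), Item(2, 24), Item(1, 30))
        .distinctUntilChangedBy { it.id }
        .collect { println(it) } // Item(id=1, value=10), Item(id=2, value=24), Item(id=1, value=30)
}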
I think our best bet is not to use the queue provided by the channel, but instead to store the data ourselves in a map and only provide send and receive functionality for it. I implemented this as a flow-like operator and also made it generic, so it could be used in other similar cases:
context(CoroutineScope)
fun <T : Any, K> ReceiveChannel<T>.groupingReduce(keySelector: (T) -> K, reduce: (T, T) -> T): ReceiveChannel<T> = produce {
    val items = mutableMapOf<K, T>()
    while (!isClosedForReceive) {
        select<Unit> {
            if (items.isNotEmpty()) {
                val (key, item) = items.entries.first()
                onSend(item) {
                    items.remove(key)
                }
            }
            onReceiveCatching { result ->
                val item = result.getOrElse { return@onReceiveCatching }
                items.merge(keySelector(item), item, reduce)
            }
        }
    }
    items.values.forEach { send(it) }
}
It keeps the data in a map and tries to send and receive at the same time, whichever finishes first. If it receives an item whose key is already in the map, it merges both values in a way provided by the caller. It sends items in the order in which they first appeared in the source channel, so a new value for the same key doesn't push the item back to the last position in the queue.
This is how we can use it with the example you provided. I modified it a little, as your version is confusing to me: it consumes (1, 10) before producing (1, 17), so the example is actually incorrect. Also, the producer and consumer don't run at the same time, so launching them concurrently and adding delays doesn't change much:
suspend fun main(): Unit = coroutineScope {
    val channel = Channel<Item>(Channel.UNLIMITED)
    val channel2 = channel.groupingReduce(
        keySelector = { it.id },
        reduce = { it1, it2 -> if (it1.value > it2.value) it1 else it2 }
    )
    for (item in testData) {
        println("Producing item $item")
        channel.send(item)
    }
    channel.close()
    // Needed because while using `UNLIMITED` sending is almost immediate,
    // so it actually starts consuming at the same time it is producing.
    delay(100)
    for (item in channel2) {
        println(item.toString())
    }
}
I created another example where the producer and consumer actually run concurrently. Items are produced every 100 ms and consumed every 200 ms, with an initial delay of 50 ms.
suspend fun main(): Unit = coroutineScope {
    val channel = Channel<Item>(Channel.UNLIMITED)
    val channel2 = channel.groupingReduce(
        keySelector = { it.id },
        reduce = { it1, it2 -> if (it1.value > it2.value) it1 else it2 }
    )
    launch {
        delay(50)
        for (item in channel2) {
            println(item.toString())
            delay(200)
        }
    }
    launch {
        listOf(
            Item(1, 10),
            // consume: 1, 10
            Item(2, 20),
            Item(1, 30),
            // consume: 2, 20
            Item(3, 40),
            Item(1, 50),
            // consume: 1, 50
            Item(4, 60),
            Item(1, 70),
            // consume: 3, 40
            Item(5, 80),
            // consume: 4, 60
            // consume: 1, 70
            // consume: 5, 80
        ).forEach {
            channel.send(it)
            delay(100)
        }
        channel.close()
    }
}
Maybe there is a better way to solve this. Also, to be honest, I'm not 100% sure this code is correct. Maybe I missed some corner case around channel closing, cancellations or failures. Additionally, I'm not sure whether select { onSend() } guarantees that if the code block has not been executed, then the item has not been sent. When we cancel send(), we don't have a guarantee that the item has not been sent; it may be the same in this case.

How to remove item from mutableList in kotlin

I am scanning a list and adding unique items to a mutableList. I scan items through ScanCallback, but the example below uses a Kotlin Flow to keep the use case simple and easy to understand. I am giving an example of emitting different types of items.
Basically I want to remove items under these specific conditions:
when the flow has finished emitting new values.
when emitting items, if we no longer receive a given item within 30 seconds, we remove it from the list.
import kotlinx.coroutines.delay
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.collectLatest
import kotlinx.coroutines.flow.flow
import kotlinx.coroutines.runBlocking

class ItemList {
    val scanResultList = mutableListOf<ScanResults>()

    fun doSomething(): Flow<ScanResults> = flow {
        (0..20).forEach {
            delay(200L)
            when (it) {
                in 10..12 -> {
                    emit(ScanResults(Device("item is Adding in range of 10 -- 20")))
                }
                in 15..18 -> {
                    emit(ScanResults(Device("item is Adding in range of 15 -- 18")))
                }
                else -> {
                    emit(ScanResults(Device("item is Adding without range")))
                }
            }
        }
    }

    fun main() = runBlocking {
        doSomething().collectLatest { value ->
            handleScanResult(value)
        }
    }

    private fun handleScanResult(result: ScanResults) {
        if (!scanResultList.contains(result)) {
            result.device?.name?.let {
                if (hasNoDuplicateScanResult(scanResultList, result)) {
                    scanResultList.add(result)
                    println("Item added")
                }
            }
        }
    }

    private fun hasNoDuplicateScanResult(value: List<ScanResults>, result: ScanResults): Boolean {
        return value.count { it.device == result.device } < 1
    }

    data class ScanResults(val device: Device? = null)
    data class Device(val name: String? = null)
}
I am not using a Set because there is no Set equivalent of SnapshotStateList in Jetpack Compose.
I'll try to reword the problem in simple terms. I'll say the input is a Flow of some imaginary data class DeviceInfo so it's easier to describe.
Problem:
There is a source flow of DeviceInfos. We want our output to be a Flow of Set<DeviceInfo>, where the Set is all DeviceInfo's that have been emitted from the source in the past 30 seconds.
(If you want, you can convert this output Flow into State, or collect it and update a mutableStateListOf with it, etc.)
Here is a strategy I thought of. Disclaimer: I haven't tested it.
Tag each incoming DeviceInfo with a unique ID (could be based on system time or a UUID). Add each DeviceInfo to a Map with its latest ID. Launch a child coroutine that delays 30 seconds and then removes the item from the map if the ID matches. If newer values have arrived, then the ID won't match so obsolete child coroutines will expire silently.
val sourceFlow: Flow<DeviceInfo> = TODO()

val outputFlow: Flow<Set<DeviceInfo>> = flow {
    coroutineScope {
        val tagsByDeviceInfo = mutableMapOf<DeviceInfo, Long>()
        suspend fun emitLatest() = emit(tagsByDeviceInfo.keys.toSet())
        sourceFlow.collect { deviceInfo ->
            val id = System.currentTimeMillis()
            if (tagsByDeviceInfo.put(deviceInfo, id) == null) {
                emitLatest() // emit if the key was new to the map
            }
            launch {
                delay(30.seconds)
                if (tagsByDeviceInfo[deviceInfo] == id) {
                    tagsByDeviceInfo.remove(deviceInfo)
                    emitLatest()
                }
            }
        }
    }
}
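As mentioned above, if you want this output as observable state rather than a plain Flow, one option is stateIn. A minimal sketch, where the scope parameter (e.g. a viewModelScope) is an assumption and not part of the original snippet:
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.flow.*

// Sketch only: exposes the Flow<Set<DeviceInfo>> above as hot state with an empty initial value.
fun currentDevices(scope: CoroutineScope): StateFlow<Set<DeviceInfo>> =
    outputFlow.stateIn(scope, SharingStarted.WhileSubscribed(5_000), emptySet())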

Kotlin - Debounce Only One Specific Value When Emitting from Flow

I have two flows that are being combined into a single flow. One of the flows has a backing data set that emits much faster than the other.
Flow A - emits every 200 ms
Flow B - emits every ~1s
The problem I am trying to fix is this one:
combine(flowA, flowB) { flowAValue, flowBValue -> // just booleans
    flowAValue && flowBValue
}.collect {
    if (it) {
        doSomething()
    }
}
Because Flow A emits extremely quickly, the boolean that's emitted can get cleared rapidly, which means that when flowB emits true, flowA already emitted true and the state is now false.
I've attempted something like:
suspend fun main() {
    flowA.debounce {
        if (it) {
            1250L
        } else {
            0L
        }
    }.collect {
        println(it)
    }
}
But this doesn't work, as sometimes the true values aren't emitted; inverting the conditional (so that if(true) = 0L else 1250L) also doesn't work. Basically what I'm looking for is: if flowA is true, hold that value for 1 second before changing values. Is something like that possible?
I made this use conflate() on the 2nd flow, the one that is drastically faster, so that zipping them always takes the latest value from fastFlow when slowFlow is finally ready. If you don't use conflate() on the 2nd flow, zip will always pair values in emission order (the first of each, the second of each, and so on).
fun forFlow() = runTest {
    val slowString = listOf("first", "second", "third", "fourth")
    val slowFlow = flow {
        slowString.forEach {
            delay(100)
            emit(it)
        }
    }
    val fastFlow = flow {
        (1 until 1000).forEach { num ->
            delay(5)
            emit(num)
        }
    }.conflate()

    suspend fun zip() {
        slowFlow.zip(fastFlow) { first, second -> "$first: $second" }
            .collect {
                println(it)
            }
    }

    runBlocking {
        zip()
    }
    println("Done!")
}
With Conflated on fastFlow:
first: 1
second: 15
third: 32
fourth: 49
Done!
Without Conflated on fastFlow:
first: 1
second: 2
third: 3
fourth: 4
Done!

Kotlin - How To Collect X Values From a Flow?

Let's say I have a flow that is constantly sending updates like the following:
val locationFlow = MutableStateFlow<Location?>(null)
I have a use-case where after a particular event occurs, I want to collect X values from the flow and continue, so something like what I have below. I know that collect is a terminal operator, so I don't think the logic I have below works, but how could I do this in this case? I'd like to collect X items, save them, and then send them to another function for processing/handling.
fun onEventOccurred() {
    launch {
        val locations = mutableListOf<Location?>()
        locationFlow.collect {
            // collect only X locations
            locations.add(it)
        }
        saveLocations(locations)
    }
}
Is there a pre-existing Kotlin function for something like this? I'd like to collect from the flow X times, save the items to a list, and pass that list to another function.
It doesn't matter that collect is terminal. The upstream StateFlow will keep behaving normally, because StateFlows don't care what their collectors are doing. You can use the take function to get a specific number of items, and you can use toList() (another terminal function) to concisely copy them into a list once they're all ready.
fun onEventOccurred() {
    launch {
        saveLocations(locationFlow.take(5).toList())
    }
}
If I understood your use case correctly, you want to:
discard elements until a specific one is sent (actually, after re-reading your question I don't think this is the case; I'm leaving it in the example just FYI)
when that happens, collect X items for further processing
Assuming that's correct, you can use a combination of dropWhile and take, like so:
fun main() = runBlocking {
    val messages = flow {
        repeat(10) {
            println(it)
            delay(500)
            emit(it)
        }
    }
    messages
        .dropWhile { it < 5 }
        .take(3)
        .collect { println(it) } // prints 5, 6, 7
}
You can even have more complex logic, i.e. discard any number that's less than 5, and then take the first 10 even numbers:
fun main() = runBlocking {
    val messages = flow {
        repeat(100) {
            delay(500)
            emit(it)
        }
    }
    messages
        .dropWhile { it < 5 }
        .filter { it % 2 == 0 }
        .take(10)
        .collect { println(it) } // prints even numbers, 6 to 24
}

Kotlin \ Android - LiveData async transformation prevent previous result

So I have a LiveData that I transform with an async function that takes a while to execute (like 2 seconds sometimes, or 4 seconds).
Sometimes the call takes long, sometimes it's really fast (depends on the results), and sometimes it's instant (empty result).
The problem is that if I have 2 consecutive emits in my LiveData, the first result sometimes takes a while to compute while the second one is instant, so it shows the second result before the first and then overwrites it with the earlier calculation.
What I want is more of a sequential effect (kinda like RxJava concatMap).
private val _state = query.mapAsync(viewModelScope) { searchString ->
    if (searchString.isEmpty()) {
        NoSearch
    } else {
        val results = repo.search(searchString)
        if (results.isNotEmpty()) {
            Results(results.map { mapToMainResult(it, searchString) })
        } else {
            NoResults
        }
    }
}

@MainThread
fun <X, Y> LiveData<X>.mapAsync(
    scope: CoroutineScope,
    mapFunction: androidx.arch.core.util.Function<X, Y>
): LiveData<Y> {
    val result = MediatorLiveData<Y>()
    result.addSource(this) { x ->
        scope.launch(Dispatchers.IO) { result.postValue(mapFunction.apply(x)) }
    }
    return result
}
How do I prevent the second result from overwriting the first result?
@MainThread
fun <X, Y> LiveData<X>.mapAsync(
    scope: CoroutineScope,
    mapFunction: (X) -> Y,
): LiveData<Y> = switchMap { value ->
    // switchMap drops the previous source LiveData whenever a new value arrives,
    // so a late result from an earlier input can no longer overwrite a newer one.
    liveData(scope.coroutineContext) {
        withContext(Dispatchers.IO) {
            emit(mapFunction(value))
        }
    }
}