I have the following problem: there are two types of events:
enum class Event {
    REGULAR,
    URGENT
}
These events are sent to `val eventsFlow = MutableSharedFlow<Event>()` (or a channel), where they are later processed by a collector:
scope.launch {
    eventsFlow.collect { event ->
        delay(100) // process event (same logic)
    }
}
What I want to achieve:
REGULAR events should not cancel processing (if a new `REGULAR` event arrives while the collector is busy, the current processing should not be cancelled, just like with a standard collect)
But if an `URGENT` event arrives, it should cancel the processing of the current event (like `collectLatest` would do)
So I need to come up with a solution. I was trying to play around with `flatMapLatest` and other operators, but had no luck.
scope.launch {
    var job: Job? = null
    eventsFlow.collect { event ->
        when (event) {
            Event.URGENT -> job?.cancelAndJoin() // interrupt the event currently being processed
            Event.REGULAR -> job?.join()         // wait for the current event to finish
        }
        job = launch {
            delay(100) // process event (same logic)
        }
    }
}
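For illustration, here is a minimal, self-contained sketch of the same idea. The enum simply repeats the one defined above, the 100 ms delay stands in for the real processing, and the emissions at the bottom are an assumed test sequence, not part of the original setup:
import kotlinx.coroutines.*
import kotlinx.coroutines.flow.*

enum class Event { REGULAR, URGENT }

fun main() = runBlocking {
    val eventsFlow = MutableSharedFlow<Event>()
    var job: Job? = null

    val collector = launch {
        eventsFlow.collect { event ->
            when (event) {
                Event.URGENT -> job?.cancelAndJoin() // interrupt current processing
                Event.REGULAR -> job?.join()         // let current processing finish first
            }
            job = launch {
                delay(100) // process event (same logic)
                println("processed $event")
            }
        }
    }

    eventsFlow.emit(Event.REGULAR)
    eventsFlow.emit(Event.URGENT) // cancels the REGULAR item that is still being processed
    delay(300)                    // only "processed URGENT" is printed
    collector.cancel()
}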
I am observing the events of a SharedFlow inside a fragment, like this:
myEvent.collectInLifeCycle(viewLifecycleOwner) { event ->
    when (event) {
        // check the event. The event emitted from onStart is never reached here :(
    }
}
Whereas in the viewmodel I have
private val _myEvent = MutableSharedFlow<MyEvent>()
val myEvent: SharedFlow<MyEvent> = _myEvent

fun loadData() =
    viewModelScope.launch {
        getDataUseCase
            .safePrepare(onGenericError = { _myEvent.emit(Event.Error(null)) })
            .onStart { _myEvent.emit(Event.Loading) }
            .onEach { result ->
                result.onSuccess { response ->
                    _myEvent.emit(Event.Something)
                }
            }
            .launchIn(viewModelScope)
    }
So the problem is that only Event.Something is being properly collected in the fragment, whereas `_myEvent.emit(Event.Loading)` is never collected... If I debug, execution reaches onStart, but the event never arrives in the fragment.
Your SharedFlow needs to have a replay so that collectors always get at least the most recent value. Otherwise, if you emit to the Flow before the collector is registered, it will never see anything emitted. Do this:
private val _myEvent = MutableSharedFlow<MyEvent>(replay = 1)
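As a quick sanity check, here is a standalone sketch (not your real types) showing that with replay = 1 a late collector still receives the value emitted before it started collecting:
import kotlinx.coroutines.*
import kotlinx.coroutines.flow.*

fun main() = runBlocking {
    val events = MutableSharedFlow<String>(replay = 1)

    events.emit("Loading")             // emitted before any collector exists
    val job = launch {
        events.collect { println(it) } // still prints "Loading" thanks to the replay cache
    }

    delay(100)
    job.cancel()
}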
Personally, unless I'm missing some detail here that would change my mind, I would simplify all your code to avoid having to manually call loadData(). Something like this but I'm guessing a bit because I don't know all your types and functions.
val myEvent: SharedFlow<MyEvent> = flow {
    emit(Event.Loading)
    emitAll(
        getDataUseCase
            .transform { result ->
                result.onSuccess { response ->
                    emit(Event.Something)
                }
            }
            .catch { error -> emit(Event.Error(null)) }
    )
}.shareIn(viewModelScope, SharingStarted.Lazily, replay = 1)
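On the fragment side, the shared flow above could be collected with the lifecycle-aware pattern below. This is only a sketch: it assumes androidx.lifecycle 2.4+ (for repeatOnLifecycle), a `viewModel` property, and that it runs in something like onViewCreated; adapt it to your own collectInLifeCycle helper as needed.
import androidx.lifecycle.Lifecycle
import androidx.lifecycle.lifecycleScope
import androidx.lifecycle.repeatOnLifecycle
import kotlinx.coroutines.launch

viewLifecycleOwner.lifecycleScope.launch {
    viewLifecycleOwner.repeatOnLifecycle(Lifecycle.State.STARTED) {
        viewModel.myEvent.collect { event ->
            // handle Event.Loading / Event.Something / Event.Error here
        }
    }
}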
I'm trying to wrap a callbackFlow within an outer flow - there are items I'd like to emit from the outer flow, but I've got an old callback interface, which I'd like to adapt to Kotlin flow. I've looked at several examples of usage of callbackFlow but I can't figure out how to properly trigger it within another flow.
Here's an example:
class Processor {
    fun start(processProgress: ProcessProgressListener) {
        processProgress.onFinished() // finishes as soon as it starts!
    }
}

interface ProcessProgressListener {
    fun onFinished()
}

// main method here:
fun startProcess(processor: Processor): Flow<String> {
    val mainFlow = flow {
        emit("STARTED")
        emit("IN_PROGRESS")
    }
    return merge(processProgressFlow(processor), mainFlow)
}

fun processProgressFlow(processor: Processor) = callbackFlow {
    val listener = object : ProcessProgressListener {
        override fun onFinished() {
            trySend("FINISHED")
        }
    }
    processor.start(listener)
}
The Processor takes a listener, which is triggered when the process has finished. When that happens, I would like to emit the final item FINISHED.
The way I invoke the whole flow is as follows:
runBlocking {
    startProcess(Processor()).collect {
        print(it)
    }
}
But I get no output whatsoever. If I don't use the merge and only return the mainFlow, however, I do get the STARTED and IN_PROGRESS items.
What am I doing wrong?
You forgot to call awaitClose at the end of the callbackFlow block:
fun processProgressFlow(processor: Processor) = callbackFlow<String> {
    val listener = object : ProcessProgressListener {
        override fun onFinished() {
            trySend("FINISHED")
            channel.close()
        }
    }

    processor.start(listener)

    /*
     * Suspends until 'channel.close() or cancel()' is invoked
     * or the flow collector is cancelled (e.g. by 'take(1)' or because a collector's coroutine was cancelled).
     * In both cases, the callback will be properly unregistered.
     */
    awaitClose { /* unregister listener here */ }
}
awaitClose {} should be used at the end of the callbackFlow block.
Otherwise, a callback/listener may leak in case of external cancellation.
According to the callbackFlow docs:
awaitClose should be used to keep the flow running, otherwise the channel will be closed immediately when block completes. awaitClose argument is called either when a flow consumer cancels the flow collection or when a callback-based API invokes SendChannel.close manually and is typically used to cleanup the resources after the completion, e.g. unregister a callback. Using awaitClose is mandatory in order to prevent memory leaks when the flow collection is cancelled, otherwise the callback may keep running even when the flow collector is already completed. To avoid such leaks, this method throws IllegalStateException if block returns, but the channel is not closed yet.
I want to use a Channel as a queue, but I need to clear it periodically. I didn't find a clear method on Channel, so I made a workaround with Channel.cancel and creating a new Channel, but it looks bad.
The question is:
How can I use a Kotlin channel as a queue that supports clearing? Recreating the channel doesn't look so good...
Simplified context.
I have methods called by an external system: enqueue(id: Int) and cancel(), and I don't have access to the system that invokes these methods (React Native methods in my case).
enqueue(id: Int) - enqueues id into the queue for processing (only one item at a time can be processed) and starts processing the queue if it isn't already running.
cancel() - cancels pending processing but allows the item currently being processed to finish.
My Processor is a singleton, and enqueue(id: Int) can be called multiple times before cancelling (to add items to the queue) and after (for new processing).
My solution is to use a channel as the queue and consume its items as a flow. cancel() cancels the channel, which allows the current item's processing to finish.
The problem is that after channel.cancel() the channel is closed, and I need to create a new channel, which is not so elegant.
fun main() = runBlocking<Unit> {
    val processor = Processor()
    repeat(3) { processor.enqueue(it) }
    delay(150)
    processor.cancelPending()
    delay(500)
    println("Run processing one more time.")
    repeat(3) { processor.enqueue(it) }
    delay(500)
}

class Processor : CoroutineScope by CoroutineScope(Dispatchers.Default) {

    private var channel = Channel<Int>(50)
    private var processJob: Job? = null

    fun enqueue(id: Int) {
        channel.offer(id) // trySend(id) in newer coroutines versions
        if (processJob?.isActive == true) return
        processJob = launch {
            channel.consumeAsFlow().collect { process(it) }
        }
    }

    private suspend fun process(id: Int) {
        delay(100)
        println("[$id] processed.")
    }

    fun cancelPending() {
        println("Cancel.")
        channel.cancel()
        channel = Channel(50)
    }
}
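If recreating the channel is the only thing that bothers you, one alternative is to drain the pending items instead of cancelling the channel; the item already taken by the consumer still finishes. This is only a rough sketch of a method that could be added to the Processor above (the name clearPending is made up, and it assumes kotlinx.coroutines 1.5+, where tryReceive is available; older versions had poll()):
fun clearPending() {
    println("Clear.")
    // Discard everything still queued; the item currently being processed is
    // unaffected because the consumer has already taken it out of the channel.
    // Note: not synchronized against a concurrent enqueue().
    while (channel.tryReceive().isSuccess) {
        // drop the queued id
    }
}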
Update Coroutines 1.3.0-RC
Working version:
@FlowPreview
suspend fun streamTest(): Flow<String> = channelFlow {
    listener.onSomeResult { result ->
        if (!isClosedForSend) {
            offer(result)
        }
    }

    awaitClose {
        listener.unsubscribe()
    }
}
Also check out this Medium article by Roman Elizarov: Callbacks and Kotlin Flows
Original Question
I have a Flow emitting multiple Strings:
@FlowPreview
suspend fun streamTest(): Flow<String> = flowViaChannel { channel ->
    listener.onSomeResult { result ->
        if (!channel.isClosedForSend) {
            channel.sendBlocking(result)
        }
    }
}
After some time I want to unsubscribe from the stream. Currently I do the following:
viewModelScope.launch {
    beaconService.streamTest().collect {
        Timber.i("stream value $it")
        if (it == "someString")
            // Here the coroutine gets canceled, but streamTest is still executed
            this.cancel()
    }
}
If the coroutine gets canceled, the stream is still executed. There is just no subscriber listening to new values. How can I unsubscribe and stop the stream function?
A solution is not to cancel the flow, but the scope it's launched in.
val job = scope.launch { flow.cancellable().collect { } }
job.cancel()
NOTE: You should call cancellable() before collect if you want your collector to stop when the Job is cancelled.
You could use the takeWhile operator on Flow.
flow.takeWhile { it != "someString" }.collect { emittedValue ->
    // Do stuff until the predicate is false
}
For those willing to unsubscribe from the Flow within the coroutine scope itself, this approach worked for me:
viewModelScope.launch {
    beaconService.streamTest().collect {
        // Do something, then
        this.coroutineContext.job.cancel()
    }
}
With the current version of coroutines / Flows (1.2.x) I don't know of a good solution. With onCompletion you will get informed when the flow stops, but you are then outside of the streamTest function, and it will be hard to stop listening for new events.
beaconService.streamTest().onCompletion {
}.collect {
...
}
With the next version of coroutines (1.3.x) it will be really easy. The function flowViaChannel is deprecated in favor of channelFlow. This function allows you to wait for the flow to be closed and do something at that moment, e.g. remove a listener:
channelFlow<String> {
    println("Subscribe to listener")
    awaitClose {
        println("Unsubscribe from listener")
    }
}
When a flow runs in a coroutine scope, you can get a Job from it to control when to stop the subscription.
// Make member variable if you want.
var jobForCancel: Job? = null

// Begin collecting
jobForCancel = viewModelScope.launch {
    beaconService.streamTest().collect {
        Timber.i("stream value $it")
        // if (it == "someString") this.cancel() // Don't cancel from inside the collector
    }
}

// Call whenever you want to cancel
jobForCancel?.cancel()
For completeness, there is a newer version of the accepted answer. Instead of explicitly using the launch coroutine builder, we can use the launchIn method directly on the flow:
val job = flow.cancellable().launchIn(scope)
job.cancel()
Based on @Ronald's answer, this works great for testing when you need to make your Flow emit again.
val flow = MutableStateFlow(initialValue)

flow.take(n).collectIndexed { index, _ ->
    if (index == something) {
        flow.value = update
    }
}

// your assertions
We have to know how many emissions in total we expect (n), and then we can use the index to know when to update the Flow so we can receive more emissions.
If you want to cancel only the subscription, from inside the collector itself, you can do it like this:
viewModelScope.launch {
    testScope.collect {
        return@collect cancel()
    }
}
There are two ways to do this that are by design from the Kotlin team:
As @Ronald pointed out in another comment:
Option 1: takeWhile { //predicate }
Cancel collection when the predicate is false. Final value will not be collected.
flow.takeWhile { value ->
    value != "finalString"
}.collect { value ->
    // Do stuff, but "finalString" will never hit this
}
Option 2: transformWhile { //predicate }
When predicate is false, collect that value, then cancel
flow.transformWhile { value ->
    emit(value)
    value != "finalString"
}.collect { value ->
    // Do stuff, but "finalString" will be the last value
}
https://github.com/Kotlin/kotlinx.coroutines/issues/2065
I want an observable that:
Can emit items on demand and never really completes (a hot observable?)
Is aware of when it has subscribers or not
If no subscribers, it will buffer items that I tell it to emit
When subscribed, it will emit the buffered items in order, then clear the buffer, and then continue to allow me to emit more items
When unsubscribed (subscriber is disposed?), it will go back to buffering.
Also:
There is only expected to be one subscriber at a time
It does not need to be thread safe
Here's sort of a pseudocode of what I am thinking -- I don't have the necessary callbacks though to do this the right way. Also it would be nice if I could wrap it all up in an Observable or Subject.
class RxEventSender {
    private val publishSubject = PublishSubject.create<Action>()
    val observable: Observable<Action> = publishSubject

    private val bufferedActions = arrayListOf<Action>()
    private var hasSubscribers = false

    fun send(action: Action) {
        if (hasSubscribers) {
            publishSubject.onNext(action)
        } else {
            bufferedActions.add(action)
        }
    }

    // Subject was subscribed to -- not a real callback
    fun onSubscribed() {
        hasSubscribers = true
        bufferedActions.forEach { action ->
            publishSubject.onNext(action)
        }
        bufferedActions.clear()
    }

    // Subject was unsubscribed -- not a real callback
    fun onUnsubscribed() {
        hasSubscribers = false
    }
}
Use a ReplaySubject. It has unbounded and bounded versions if you get worried about the buffer getting too big.
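A minimal sketch of that suggestion (RxJava 2 import shown; in RxJava 3 it lives in io.reactivex.rxjava3.subjects). Note that a ReplaySubject replays its buffer to every new subscriber rather than clearing it after delivery, which may or may not match the clearing requirement above:
import io.reactivex.subjects.ReplaySubject

val subject = ReplaySubject.create<String>()               // unbounded buffer
// val bounded = ReplaySubject.createWithSize<String>(100) // bounded variant

subject.onNext("first")            // buffered; nobody is subscribed yet
subject.subscribe { println(it) }  // late subscriber immediately receives "first"
subject.onNext("second")           // delivered live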
After spending some time, I think this works well enough for my needs. It's not wrapped up nicely in a Subject or Observable but all I needed was to emit items and subscribe.
class RxEventSender<T> {
    private val bufferedEvents = arrayListOf<T>()
    private val publishSubject = PublishSubject.create<T>()

    val observable: Observable<T> = publishSubject
        .mergeWith(Observable.fromIterable(bufferedEvents))
        .doOnDispose {
            bufferedEvents.clear()
        }

    fun send(event: T) {
        if (publishSubject.hasObservers()) {
            publishSubject.onNext(event)
        } else {
            bufferedEvents.add(event)
        }
    }
}
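Rough usage of the class above, with String as the event type, just to show the buffering behaviour:
val sender = RxEventSender<String>()

sender.send("queued")   // no observers yet, so this goes into bufferedEvents

val disposable = sender.observable.subscribe { println(it) } // prints "queued" on subscribe
sender.send("live")     // now delivered straight through the PublishSubject

disposable.dispose()    // doOnDispose clears the buffered events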