Why doesn't my code execute from top to bottom in Kotlin? - kotlin

I'm making a RecyclerView adapter, and I'm having a problem with its init block.
I expected the code to execute in order, but it runs out of order.
Some code:
inner class TodoListFragmentRecyclerViewAdapter : RecyclerView.Adapter<RecyclerView.ViewHolder>() {
    val doListDTOs = ArrayList<DoListDTO>()

    init {
        Log.e("1", "1")
        doListListenerRegistration = fireStore.collection("doList")
            .whereEqualTo("doListName", todoList_name)
            .orderBy("doListTimestamp", Query.Direction.DESCENDING)
            .limit(100)
            .addSnapshotListener { querySnapshot, firebaseFirestoreException ->
                if (querySnapshot == null) return@addSnapshotListener
                doListDTOs.clear()
                for (snapshot in querySnapshot.documents) {
                    val item = snapshot.toObject(DoListDTO::class.java)
                    doListDTOs.add(item!!)
                    Log.e("2", doListDTOs.toString())
                    notifyDataSetChanged()
                }
            }
        Log.e("3", doListDTOs.toString())
    }
}
I want the logs to appear in the order below:
1 -> 2 -> 3
but the actual output is
1 -> 3 -> 2
Why is this?
As an additional issue, because of the above order, doListDTOs.toString() is empty in log 3, while doListDTOs.toString() in log 2 has some value.
If it's not an ordering problem, I'd be grateful if you could tell me what the problem is.

When you connect to Firestore and request data from the Firestore DB, you are actually making a network call, which runs on a background thread.
The main thread first prints log 1, then registers the listener, which starts the background work (call it thread A), and finally prints 3.
Log 2 is only printed once the data has been read on thread A and delivered back to your callback on the main thread, and that is exactly why 3 shows up before 2.
If you need to run the "3" step after "2", you have to write that code inside the Firestore callback; when that task is completed, you can initiate your next step.
Hope it helps!
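The same callback-ordering behavior can be reproduced without Firebase. Below is a minimal, dependency-free sketch: addFakeSnapshotListener and runDemo are hypothetical names, with the fake listener standing in for addSnapshotListener by invoking its callback from a background thread, and the "3" step moved inside the callback as the answer suggests:

```kotlin
import java.util.concurrent.CountDownLatch

// Hypothetical stand-in for Firestore's addSnapshotListener: the callback
// fires on a background thread some time after registration.
fun addFakeSnapshotListener(onData: (List<String>) -> Unit) {
    Thread {
        Thread.sleep(100) // simulate network latency
        onData(listOf("doc1", "doc2"))
    }.start()
}

fun runDemo(): List<String> {
    val events = ArrayList<String>()
    val done = CountDownLatch(1)

    events.add("1") // runs first, synchronously
    addFakeSnapshotListener { docs ->
        events.add("2:$docs") // runs when the data arrives
        events.add("3:$docs") // the "3" step, now guaranteed to see the data
        done.countDown()
    }
    done.await() // only so this demo process waits; a UI app would not block

    return events
}

fun main() = println(runDemo())
```

Registering the listener returns immediately, so without moving the last step into the callback the function would reach its end before the callback ever ran, with the list still empty.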

Related

Flow.take(ITEM_COUNT) returning all the elements rather than the specified number of elements

I have a method X that gets data from the server via pub/sub and returns a flow. Another method subscribes to the flow returned by method X, but only wants to take at most the first 3 values from the flow, if the data is distinct compared to the previous data. I've written the following code:
fun subscribeToData(): Flow<List<MyData>> {
    ....
    // incoming data
    emit(list)
}

fun getUptoFirst3Items() {
    subscribeToData()
        .take(ITEM_COUNT) // ITEM_COUNT is 3
        .distinctUntilChanged() // only proceed if the data is different from the previous top 3 items
        .mapIndex {
            // do transformation
        }
        .collect { transformedListOf3Elements ->
        }
}
Problem:
In collect {} I'm not getting 3 elements; rather, I'm getting all the data that's coming in the flow.
I'm not sure what's wrong here. Can someone help me?
You have a Flow<List<MyData>> here, which means every element of this flow is itself a list.
The take operator is applied on the flow, so you will take the 3 first lists of the flow. Each individual list is not limited, unless you use take on the list itself.
So the name transformedListOf3Elements is incorrect, because the list contains an unknown number of elements, unless you filter it somehow in the map.
@Joffrey's answer already explained why you get the whole list returned and suggested you use take() on the list itself.
If you want to take just the first ITEM_COUNT elements from every list that is emitted/observed, then you have to map the result and take ITEM_COUNT items from each list, instead of taking ITEM_COUNT items from the flow.
fun getUptoFirst3Items() {
    subscribeToData()
        .map {
            // in the Kotlin stdlib, Iterable<T> has an extension function take(n: Int)
            // that returns a List<T> containing the first n elements of the iterable
            it.take(ITEM_COUNT)
            // alternatively you can use subList, but the semantics are not the same
            // (it throws if the list has fewer than ITEM_COUNT elements), so check
            // the subList documentation before using it:
            // it.subList(0, ITEM_COUNT)
        }
        .distinctUntilChanged() // only proceed if the data is different from the previous top 3 items
        .mapIndex {
            // do transformation
        }
        .collect { transformedListOf3Elements ->
        }
}
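The difference between take on the stream and take inside map on each emitted list can be demonstrated without kotlinx-coroutines; here is a minimal sketch using a plain Sequence of lists (the names lists, firstTwoLists, and firstThreeOfEach are illustrative; Flow's take and map behave analogously):

```kotlin
// A stream of lists, like a Flow<List<Int>>, but as a plain Sequence
// for a dependency-free demo.
val lists = sequenceOf(
    listOf(1, 2, 3, 4, 5),
    listOf(6, 7, 8, 9),
    listOf(10, 11)
)

// take(2) applied to the stream: the first 2 *lists*, each still full length
fun firstTwoLists() = lists.take(2).toList()

// map { it.take(3) }: every list truncated to its first 3 elements
fun firstThreeOfEach() = lists.map { it.take(3) }.toList()

fun main() {
    println(firstTwoLists())    // [[1, 2, 3, 4, 5], [6, 7, 8, 9]]
    println(firstThreeOfEach()) // [[1, 2, 3], [6, 7, 8], [10, 11]]
}
```

Note that Iterable.take(n) simply returns the whole list when it has fewer than n elements, which is why the last list comes through untruncated.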

How do I add a short delay so the user can see every number that was rolled? Kotlin, Android Studio

Let's say I'm making a simple DnD dice roller (because I am). It rolls a bunch of random numbers based on how many dice the user wants rolled and the type of dice, then sends them to a text view one at a time (which is what I want). However, it only shows the last number, because there is no delay letting the user see each number rolled.
How would I do that?
else if (numTimesRolled.progress <= 4) {
    for (i in 0 until numTimesRolled.progress) {
        randNum = Random.nextInt(1, diceIsComfirm)
        resultsArray[i] = randNum.toString()
    }
    for (i in 0 until numTimesRolled.progress) {
        randNumDisplay.text = resultsArray[i]
    }
}
A non-coroutines solution is to post Runnables:
val delayPerNumber = 500L // 500ms
for (i in 0 until numTimesRolled.progress) {
    randNumDisplay.postDelayed({ randNumDisplay.text = resultsArray[i] }, i * delayPerNumber)
}
With a coroutine:
lifecycleScope.launch {
    for (i in 0 until numTimesRolled.progress) {
        delay(500) // 500ms
        randNumDisplay.text = resultsArray[i]
    }
}
An advantage of the coroutine is that it is automatically cancelled if the Activity or Fragment is destroyed, so if the Activity/Fragment is closed while the coroutine is still running, it won't hold your obsolete views in memory.

Emitting flow values asynchronously with Kotlin's Flow

I am building a simple Spring service with Kotlin and WebFlux.
I have an endpoint which returns a flow. The flow contains elements which take a long time to compute, which is simulated by a delay.
It is constructed like this:
suspend fun latest(): Flow<Message> {
    println("generating messages")
    return flow {
        for (i in 0..20) {
            println("generating $i")
            if (i % 2 == 0) delay(1000)
            else delay(200)
            println("generated message $i")
            emit(generateMessage(i))
        }
        println("messages generated")
    }
}
My expectation was that it would return Message1, followed by Message3, Message5..., and then Message0, because of the different delays the individual generations take.
But in reality the flow contains the elements in order.
I guess I am missing something important about coroutines and flows. I've tried different things to achieve what I want with coroutines, but I can't figure it out.
Solution
As pointed out by Marko Topolnik and William Reed, using channelFlow works.
fun latest(): Flow<Message> {
    println("generating numbers")
    return channelFlow {
        for (i in 0..20) {
            launch {
                send(generateMessage(i))
            }
        }
    }
}

suspend fun generateMessage(i: Int): Message {
    println("generating $i")
    val time = measureTimeMillis {
        if (i % 2 == 0) delay(1000)
        else delay(500)
    }
    println("generated message $i in ${time}ms")
    return Message(UUID.randomUUID(), "This is Message $i")
}
When run, the results are as expected:
generating numbers
generating 2
generating 0
generating 1
generating 6
...
generated message 5 in 501ms
generated message 9 in 501ms
generated message 13 in 501ms
generated message 15 in 505ms
generated message 4 in 1004ms
...
Once you go concurrent with the computation of each element, your first problem will be to figure out when all the computation is done.
You have to know in advance how many items to expect. So it seems natural to me to construct a plain List<Deferred<Message>> and then await on all the deferreds before returning the entire thing. You aren't getting any mileage from the flow in your case, since flow is all about doing things synchronously, inside the flow collection.
You can also use channelFlow in combination with a known count of messages to expect, and then terminate the flow based on that. The advantage would be that Spring can start collecting the flow earlier.
EDIT
Actually, the problem of knowing the count isn't present: the flow will automatically wait for all the child coroutines you launched to complete.
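The plain List<Deferred<Message>> approach mentioned above can be sketched as follows. This is a minimal, self-contained version (requiring kotlinx-coroutines) with a hypothetical stand-in for generateMessage and a reduced range:

```kotlin
import kotlinx.coroutines.async
import kotlinx.coroutines.awaitAll
import kotlinx.coroutines.delay
import kotlinx.coroutines.runBlocking

data class Message(val text: String)

// Hypothetical stand-in for the real generateMessage: even indices take
// longer, mirroring the delays in the question.
suspend fun generateMessage(i: Int): Message {
    delay(if (i % 2 == 0) 200L else 50L)
    return Message("This is Message $i")
}

// Start every computation concurrently via async, then await them all.
// Unlike the channelFlow version, the result preserves index order, and
// nothing is delivered until the slowest element is done.
fun latestAsList(): List<Message> = runBlocking {
    (0..5).map { i -> async { generateMessage(i) } }.awaitAll()
}

fun main() {
    println(latestAsList().map { it.text })
}
```

Because all the async blocks run concurrently, the whole call takes roughly the time of the slowest element (about 200 ms here) rather than the sum of all the delays.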
Your current approach uses a single coroutine for the entire function, including the for loop. That means any call to a suspend fun, e.g. delay, suspends that entire coroutine until it completes. It does free up the thread to go do other work, but the current coroutine makes no progress in the meantime.
It's hard to say what the right solution is based on your simplified example. If you truly did want a new coroutine for each loop iteration, you could launch one there, but it isn't clear from the information given that that is the right solution.

Freeing resources associated with items

I am implementing some multi-stage pipeline-style processing with RxJava3. I use several consecutive .concatMap calls for the individual steps. As a side effect, the steps create some (large) temporary files that should be deleted on both error and success. The first step hands over the files to the next one. I successfully use Single.using to close the file handles, but can't delete the file this way, as it would be gone before the next step can use it. Deleting the file in doOnError of the first step, and in using in the second step works in most cases.
However, there is a corner case where the file "leaks", i.e. is not deleted: if the second step of the first work item fails (throws an exception) after the second item has completed its first step (first concatMap) but not yet begun its second step (second concatMap), that second item sits in an intermediate place and is not deleted, as it is not currently captured by any using scope.
My minimal example is:
import io.reactivex.rxjava3.core.Observable
import io.reactivex.rxjava3.core.Single
import io.reactivex.rxjava3.schedulers.Schedulers
import java.io.File
import java.io.FileOutputStream
fun main(args: Array<String>) {
    Observable.just(5, 4).subscribeOn(Schedulers.computation())
        .concatMapSingle { workItem ->
            val file = File("/tmp/foo/work$workItem")
            Single.using({ FileOutputStream(file) }, { oStream ->
                Single.just(oStream)
                    .subscribeOn(Schedulers.computation())
                    .map { os ->
                        println("Pretending to work on item $workItem")
                        os.write("File $workItem".toByteArray())
                        // if (Math.random() > 0.5) throw Exception("Something else failed")
                        Thread.sleep(workItem.toLong() * 1000) // second work item's first step should be quicker than the first item's second step
                        Pair(file, workItem) // propagate both file and item number
                    }.doOnError { println("Deleting ${file.absolutePath}"); file.delete() }
            }, { os -> println("Closing file ${file.absolutePath}"); os.close() })
        }
        .concatMapSingle { data1 ->
            Single.using({ data1 }, { data2 ->
                Single.just(data2)
                    .subscribeOn(Schedulers.computation())
                    .map { data ->
                        val workItem = data.second
                        println("Continuing pretend work on item $workItem")
                        Thread.sleep(workItem.toLong() * 1000)
                        // ... more complex stuff here ...
                        if (workItem == 5) throw Exception("Something failed")
                    }
            }, { data -> println("Deleting ${data.first.absolutePath}"); data.first.delete() })
        }
        .blockingSubscribe()
}
If the exception is thrown, the file /tmp/foo/work4 is not deleted, because work item 4 waits 1 s to be processed by the second concatMap. The output is:
Pretending to work on item 5
Closing file /tmp/foo/work5
Continuing pretend work on item 5
Pretending to work on item 4
Closing file /tmp/foo/work4
Deleting /tmp/foo/work5
Exception in thread "main" java.lang.RuntimeException: java.lang.Exception: Something failed
at io.reactivex.internal.util.ExceptionHelper.wrapOrThrow(ExceptionHelper.java:46)
[...]
If the first (commented-out) exception is thrown, or no exception at all, everything is deleted fine. The problem is the same with flatMap, but harder to debug, as more things run in parallel.
Therefore, my question is: can I associate some "clean-up" function with the items (here: 5 and 4) that is always called when an item goes out of scope?

Custom command to go back in a process instance (execution)

I have a process with 3 sequential user tasks (something like Task 1 -> Task 2 -> Task 3). So, to validate Task 3, I have to validate Task 1, then Task 2.
My goal is to implement a workaround to go back in an execution of a process instance, using a Command as suggested in this link. The problem is that I started to implement the command, but it does not work as I want. The algorithm should be something like:
Retrieve the task with the passed id
Get the process instance of this task
Get the historic tasks of the process instance
From the list of the historic tasks, deduce the previous one
Create a new task from the previous historic task
Make the execution to point to this new task
Maybe clean the task pointed before the update
So, the code of my command looks like this:
public class MoveTokenCmd implements Command<Void> {

    protected String fromTaskId = "20918";

    public MoveTokenCmd() {
    }

    public Void execute(CommandContext commandContext) {
        HistoricTaskInstanceEntity currentUserTaskEntity = commandContext.getHistoricTaskInstanceEntityManager()
                .findHistoricTaskInstanceById(fromTaskId);
        ExecutionEntity currentExecution = commandContext.getExecutionEntityManager()
                .findExecutionById(currentUserTaskEntity.getExecutionId());

        // Get the process instance
        HistoricProcessInstanceEntity historicProcessInstanceEntity = commandContext
                .getHistoricProcessInstanceEntityManager()
                .findHistoricProcessInstance(currentUserTaskEntity.getProcessInstanceId());

        HistoricTaskInstanceQueryImpl historicTaskInstanceQuery = new HistoricTaskInstanceQueryImpl();
        historicTaskInstanceQuery.processInstanceId(historicProcessInstanceEntity.getId()).orderByExecutionId().desc();
        List<HistoricTaskInstance> historicTaskInstances = commandContext.getHistoricTaskInstanceEntityManager()
                .findHistoricTaskInstancesByQueryCriteria(historicTaskInstanceQuery);

        int index = 0;
        for (HistoricTaskInstance historicTaskInstance : historicTaskInstances) {
            if (historicTaskInstance.getId().equals(currentUserTaskEntity.getId())) {
                break;
            }
            index++;
        }

        if (index > 0) {
            HistoricTaskInstance previousTask = historicTaskInstances.get(index - 1);
            TaskEntity newTaskEntity = createTaskFromHistoricTask(previousTask, commandContext);
            currentExecution.addTask(newTaskEntity);
            commandContext.getTaskEntityManager().insert(newTaskEntity);
            AtomicOperation.TRANSITION_CREATE_SCOPE.execute(currentExecution);
        } else {
            // TODO: find the last task of the previous process instance
        }

        // To overcome the "Task cannot be deleted because is part of a running process"
        TaskEntity currentUserTask = commandContext.getTaskEntityManager().findTaskById(fromTaskId);
        if (currentUserTask != null) {
            currentUserTask.setExecutionId(null);
            commandContext.getTaskEntityManager().deleteTask(currentUserTask, "jumped to another task", true);
        }

        return null;
    }

    private TaskEntity createTaskFromHistoricTask(HistoricTaskInstance historicTaskInstance,
            CommandContext commandContext) {
        TaskEntity newTaskEntity = new TaskEntity();
        newTaskEntity.setProcessDefinitionId(historicTaskInstance.getProcessDefinitionId());
        newTaskEntity.setName(historicTaskInstance.getName());
        newTaskEntity.setTaskDefinitionKey(historicTaskInstance.getTaskDefinitionKey());
        newTaskEntity.setProcessInstanceId(historicTaskInstance.getProcessInstanceId());
        newTaskEntity.setExecutionId(historicTaskInstance.getExecutionId());
        return newTaskEntity;
    }
}
But the problem is that I can see my task is created, yet the execution does not point to it but to the current one.
I had the idea of using the activity (via the ActivityImpl object) to set it on the execution, but I don't know how to retrieve the activity of my new task.
Can someone help me, please?
Unless something has changed significantly in the engine, the code in the link you reference should still work (I have used it on a number of projects).
That said, when scanning your code I don't see the most important call: once you have the current execution, you can move the token by setting its current activity.
Like I said, the code in the referenced article used to work and still should.
Greg
Referring to the same link in your question, I would personally recommend reworking the design of your process: use an exclusive gateway to decide whether the process should end or be returned to the previous task. If the generation of the task is dynamic, you can point back to the same task and delete the local variable. Activiti has constructs that save you from implementing this yourself. :)