Kotlin coroutine for executing an external process

The traditional Java approach to executing an external process is to start a new Process, start two threads to consume its inputStream and errorStream, and then call the blocking Process.waitFor() to wait until the external command has exited.
How can this be done in an (almost) non-blocking style with Kotlin coroutines?
I tried it this way. Do you have any suggestions to improve it?
(How should the streams be read asynchronously, should ProcessBuilder.start() also be called in withContext(Dispatchers.IO), are there too many calls to Dispatchers.IO, ...?)
suspend fun executeCommand(commandArgs: List<String>): ExecuteCommandResult {
    try {
        val process = ProcessBuilder(commandArgs).start()
        val outputStream = GlobalScope.async(Dispatchers.IO) { readStream(process.inputStream) }
        val errorStream = GlobalScope.async(Dispatchers.IO) { readStream(process.errorStream) }
        val exitCode = withContext(Dispatchers.IO) {
            process.waitFor()
        }
        return ExecuteCommandResult(exitCode, outputStream.await(), errorStream.await())
    } catch (e: Exception) {
        return ExecuteCommandResult(-1, "", e.localizedMessage)
    }
}
private suspend fun readStream(inputStream: InputStream): String {
    val readLines = mutableListOf<String>()
    withContext(Dispatchers.IO) {
        try {
            inputStream.bufferedReader().use { reader ->
                var line: String?
                do {
                    line = reader.readLine()
                    if (line != null) {
                        readLines.add(line)
                    }
                } while (line != null)
            }
        } catch (e: Exception) {
            // ..
        }
    }
    return readLines.joinToString(System.lineSeparator())
}
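One possible refinement, shown below as a sketch only (it assumes ExecuteCommandResult and readStream exactly as defined above): run everything inside a single withContext(Dispatchers.IO) so the reader coroutines are children of the caller instead of GlobalScope, and use runInterruptible so cancellation can interrupt the blocking waitFor().
suspend fun executeCommand(commandArgs: List<String>): ExecuteCommandResult =
    withContext(Dispatchers.IO) {
        try {
            val process = ProcessBuilder(commandArgs).start()
            // Child coroutines inherit Dispatchers.IO from withContext and are
            // cancelled together with the caller.
            val stdout = async { readStream(process.inputStream) }
            val stderr = async { readStream(process.errorStream) }
            // runInterruptible lets cancellation interrupt the blocking waitFor().
            val exitCode = runInterruptible { process.waitFor() }
            ExecuteCommandResult(exitCode, stdout.await(), stderr.await())
        } catch (e: CancellationException) {
            throw e // let cancellation propagate instead of mapping it to a result
        } catch (e: Exception) {
            ExecuteCommandResult(-1, "", e.localizedMessage ?: "")
        }
    }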

Related

Does cancelling a coroutine cancel the write to a file?

suspend fun copy(oldFile: File, newFile: File): Boolean {
    return withContext(Dispatchers.IO) {
        var inputStream: InputStream? = null
        var outputStream: OutputStream? = null
        try {
            val fileReader = ByteArray(4096)
            inputStream = oldFile.inputStream()
            outputStream = FileOutputStream(newFile)
            while (true) {
                val read: Int = inputStream.read(fileReader)
                if (read == -1) {
                    break
                }
                outputStream.write(fileReader, 0, read)
            }
            outputStream.flush()
            true
        } catch (e: IOException) {
            Log.e(TAG, "${e.message}")
            false
        } finally {
            inputStream?.close()
            outputStream?.close()
        }
    }
}
In the above code, if I cancel the job that is running the function, does the copying get cancelled, or do I have to manually check the state of the job inside the while loop using ensureActive()?
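For reference, a minimal sketch of the same loop with an explicit check: coroutine cancellation is cooperative, and the blocking read/write calls above contain no suspension points, so without a check like ensureActive() the copy runs to completion and the CancellationException only surfaces once withContext returns.
suspend fun copy(oldFile: File, newFile: File): Boolean =
    withContext(Dispatchers.IO) {
        try {
            oldFile.inputStream().use { input ->
                FileOutputStream(newFile).use { output ->
                    val buffer = ByteArray(4096)
                    while (true) {
                        // Throws CancellationException once the job is cancelled,
                        // so the copy stops between chunks.
                        ensureActive()
                        val read = input.read(buffer)
                        if (read == -1) break
                        output.write(buffer, 0, read)
                    }
                    output.flush()
                }
            }
            true
        } catch (e: IOException) {
            false
        }
    }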

How to create a polling mechanism with kotlin coroutines?

I am trying to create a polling mechanism with Kotlin coroutines using SharedFlow, and I want it to stop when there are no subscribers and be active when there is at least one subscriber. My question is: is SharedFlow the right choice in this scenario, or should I use a channel? I tried using channelFlow, but I don't know how to close the channel (not cancel the job) outside the block body. Can someone help? Here's the snippet.
fun poll(id: String) = channelFlow {
    while (!isClosedForSend) {
        try {
            send(repository.getDetails(id))
            delay(MIN_REFRESH_TIME_MS)
        } catch (throwable: Throwable) {
            Timber.e("error -> ${throwable.message}")
        }
        invokeOnClose { Timber.e("channel flow closed.") }
    }
}
You can use SharedFlow, which emits values in a broadcast fashion (it won't emit a new value until the previous one has been consumed by all collectors).
val sharedFlow = MutableSharedFlow<String>()
val scope = CoroutineScope(Job() + Dispatchers.IO)
var producer: Job? = null

fun CoroutineScope.startProducing() {
    producer = launch {
        sharedFlow.emit(...) // poll and emit values here
    }
}

fun stopProducing() {
    producer?.cancel()
}

scope.launch {
    sharedFlow.subscriptionCount
        .map { count -> count > 0 }
        .distinctUntilChanged()
        .collect { hasSubscribers ->
            if (hasSubscribers) startProducing() else stopProducing()
        }
}
First of all, when you call channelFlow(block), there is no need to close the channel manually; it is closed automatically once the execution of block is done.
I think the produce coroutine builder may be what you need, but unfortunately it is still an experimental API.
fun poll(id: String) = someScope.produce {
    invokeOnClose { Timber.e("channel flow closed.") }
    while (true) {
        try {
            send(repository.getDetails(id))
            // delay(MIN_REFRESH_TIME_MS) // no need
        } catch (throwable: Throwable) {
            Timber.e("error -> ${throwable.message}")
        }
    }
}
fun main() = runBlocking {
    val channel = poll("hello")
    channel.receive()
    channel.cancel()
}
The coroutine started by produce suspends in send() until you call receive() on the returned channel, so there is no need for the delay.
UPDATE: Use broadcast to share values across multiple ReceiveChannels.
fun poll(id: String) = someScope.broadcast {
    invokeOnClose { Timber.e("channel flow closed.") }
    while (true) {
        try {
            send(repository.getDetails(id))
            // delay(MIN_REFRESH_TIME_MS) // no need
        } catch (throwable: Throwable) {
            Timber.e("error -> ${throwable.message}")
        }
    }
}
fun main() = runBlocking {
    val broadcast = poll("hello")
    val channel1 = broadcast.openSubscription()
    val channel2 = broadcast.openSubscription()
    channel1.receive()
    channel2.receive()
    broadcast.cancel()
}

How to create a flow with a few subscriptions in Kotlin?

I need to run a task which emits some data, and I want to subscribe to this data like a PublishSubject. But I can't solve the problem of keeping a single flow instance: if I collect it again, another instance is created and the job is done twice.
I tried running the flow internally and posting values to a BroadcastChannel, but this solution doesn't seem correct.
What is the best practice for such a task?
This will do the magic:
fun <T> Flow<T>.refCount(capacity: Int = Channel.CONFLATED, dispatcher: CoroutineDispatcher = Dispatchers.Default): Flow<T> {
    class Context(var counter: Int) {
        lateinit var job: Job
        lateinit var channel: BroadcastChannel<T>
    }

    val context = Context(0)

    fun lock() = synchronized(context) {
        if (++context.counter > 1) {
            return@synchronized
        }
        context.channel = BroadcastChannel(capacity)
        context.job = GlobalScope.async(dispatcher) {
            try {
                collect { context.channel.offer(it) }
            } catch (e: Exception) {
                context.channel.close(e)
            }
        }
    }

    fun unlock() = synchronized(context) {
        if (--context.counter == 0) {
            context.job.cancel()
        }
    }

    return flow {
        lock()
        try {
            emitAll(context.channel.openSubscription())
        } finally {
            unlock()
        }
    }
}
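A hedged usage sketch of the extension above (the ticker flow and the timings are invented for illustration): both collectors attach to the same upstream collection, and when the last collector goes away the internal upstream job is cancelled.
// Hypothetical usage of refCount(); the ticker flow and delays are made up.
fun main() = runBlocking {
    val ticker = flow {
        var i = 0
        while (true) {
            emit(i++)
            delay(100)
        }
    }.refCount()

    // Both collectors share one upstream collection of `ticker`.
    val a = launch { ticker.collect { println("A: $it") } }
    val b = launch { ticker.collect { println("B: $it") } }

    delay(500)
    // Cancelling the last collector drops the counter to zero,
    // which cancels the internal upstream job.
    a.cancel()
    b.cancel()
}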

Will this code leak resources when the coroutine is canceled?

Will the following code leak resources when the Kotlin coroutine is canceled?
General: the code lives inside a ViewModel.
The method retrievePDFDocument is triggered in the onStart event of a Fragment.
fun retrievePDFDocument() {
    job = viewModelScope.launch {
        withContext(Dispatchers.IO) {
            downloadFile(assetPath.value!!)
        }
    }
}
And here is the suspend function:
private suspend fun downloadFile(strPdfUrl: String): File? {
    var inputStream: InputStream? = null
    val lengthOfFile: Int // lengthOfFile is used for calculating download progress
    // this is where the file will be seen after the download
    var fileOut: FileOutputStream? = null
    var localPDFFile: File? = null
    if (strPdfUrl.isBlank())
        return localPDFFile
    try {
        val pdfUrl = URL(strPdfUrl)
        val urlConnection = pdfUrl.openConnection() as HttpURLConnection
        if (urlConnection.responseCode == 200) {
            // file input is from the url
            inputStream = BufferedInputStream(urlConnection.inputStream)
            lengthOfFile = urlConnection.contentLength
            localPDFFile = File(localPdfDirectory, pdfFileName.value!!)
            fileOut = FileOutputStream(localPDFFile)
            // here is the download code
            val buffer = ByteArray(1024)
            var total: Long = 0
            while (true) {
                // If the coroutine is cancelled,
                // a CancellationException will be thrown here.
                // Do no more work than necessary.
                coroutineContext.ensureActive()
                val length = inputStream.read(buffer)
                if (length <= 0) break
                total += length.toLong()
                _currProgress.postValue((total * 100 / lengthOfFile).toInt())
                fileOut.write(buffer, 0, length)
            }
        }
    } catch (e: IOException) {
        Log.e("PdfViewerViewModel", "downloadFile Error: ${e.message}")
        localPDFFile?.delete() // remove partially downloaded file
        localPDFFile = null
    } catch (e1: CancellationException) {
        Log.e("PdfViewerViewModel", "downloadFile CancellationException: ${e1.message}")
        localPDFFile?.delete() // remove partially downloaded file
        localPDFFile = null
    } finally {
        try {
            inputStream?.close()
            fileOut?.apply {
                flush()
                close()
            }
        } catch (e1: IOException) {
            // do nothing here
        }
    }
    return localPDFFile
}
Kind regards,
Frank
Update 10.04.2020: implementation with coroutineContext.ensureActive() and catching the exception.
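For comparison, a minimal sketch of the stream handling written with Kotlin's use { }, which closes both streams even when an IOException or the CancellationException from ensureActive() propagates. downloadTo is a made-up helper name; the URL setup, progress reporting, and partial-file deletion from the original are omitted for brevity.
// Sketch only: the copy loop with use {} taking care of the cleanup.
private suspend fun downloadTo(target: File, urlConnection: HttpURLConnection) {
    BufferedInputStream(urlConnection.inputStream).use { input ->
        FileOutputStream(target).use { output ->
            val buffer = ByteArray(1024)
            while (true) {
                // Cooperative cancellation check, as in the original.
                coroutineContext.ensureActive()
                val length = input.read(buffer)
                if (length <= 0) break
                output.write(buffer, 0, length)
            }
            output.flush()
        }
    }
}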

Fan-out / fan-in - closing result channel

I'm producing items, consuming them from multiple coroutines, and pushing the results back to resultChannel. The producer closes its channel after the last item.
The code never finishes because resultChannel is never closed. How can I detect that the work is done and properly finish the iteration so that hasNext() returns false?
val inputData = (0..99).map { "Input$it" }
val threads = 10

val bundleProducer = produce<String>(CommonPool, threads) {
    inputData.forEach { item ->
        send(item)
        println("Producing: $item")
    }
    println("Producing finished")
    close()
}

val resultChannel = Channel<String>(threads)
repeat(threads) {
    launch(CommonPool) {
        bundleProducer.consumeEach {
            println("CONSUMING $it")
            resultChannel.send("Result ($it)")
        }
    }
}

val iterator = object : Iterator<String> {
    val iterator = resultChannel.iterator()
    override fun hasNext() = runBlocking { iterator.hasNext() }
    override fun next() = runBlocking { iterator.next() }
}.asSequence()

println("Starting iteration...")
val result = iterator.toList()
println("finish: ${result.size}")
You can run a coroutine that waits for the consumers to finish and then closes the resultChannel.
First, rewrite the code that starts the consumers to save the Jobs:
val jobs = (1..threads).map {
    launch(CommonPool) {
        bundleProducer.consumeEach {
            println("CONSUMING $it")
            resultChannel.send("Result ($it)")
        }
    }
}
And then run another coroutine that closes the channel once all the Jobs are done:
launch(CommonPool) {
    jobs.forEach { it.join() }
    resultChannel.close()
}