I'm trying to execute a DB transaction with the Vert.x reactive SQL client in a coroutine.
Somehow I can't figure out how to convert the CompletableFuture to the desired io.vertx.core.Future type. Are there any helper methods or extensions to do this easily?
val client: PgPool
...
suspend fun someServiceFunction() {
    coroutineScope {
        client.withTransaction { connection ->
            val completableFuture = async {
                repository.save(connection, requestDTO) // This is a suspend function
            }.asCompletableFuture()
            // Return type has to be an io.vertx.core.Future
            // How can I transform the completableFuture to it?
        }
    }
}
Thank you for your help!
Vert.x Future has a conversion method:
future = Future.fromCompletionStage(completionStage, vertxContext)
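Applied to the snippet in the question, that could look roughly like this (a sketch, not tested; it assumes the kotlinx-coroutines-jdk8 asCompletableFuture() extension and that Vertx.currentContext() returns the context you want the future completed on):
client.withTransaction { connection ->
    val completableFuture = async {
        repository.save(connection, requestDTO) // suspend function
    }.asCompletableFuture()
    // Adapt the CompletableFuture to the io.vertx.core.Future the lambda must return
    Future.fromCompletionStage(completableFuture, Vertx.currentContext())
}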
I adapted this from the code for asCompletableFuture() to use as an alternative. Disclaimer: I don't use Vert.x and I didn't test this.
fun <T> Deferred<T>.asVertxFuture(): Future<T> {
    val promise = Promise.promise<T>()
    invokeOnCompletion {
        try {
            // Deferred has completed; getCompleted() rethrows its exception if it failed
            promise.complete(getCompleted())
        } catch (t: Throwable) {
            promise.fail(t)
        }
    }
    return promise.future()
        .onComplete { result ->
            // Propagate completion of the Future back to the coroutine as cancellation
            cancel(result.cause()?.let {
                it as? CancellationException ?: CancellationException("Future was completed exceptionally", it)
            })
        }
}
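With that extension in place, the withTransaction block from the question could return the adapted future directly, roughly like this (again untested):
client.withTransaction { connection ->
    async {
        repository.save(connection, requestDTO) // suspend function
    }.asVertxFuture()
}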
I wonder if mixing coroutines into Vert.x could hurt performance, because you're not using the Vert.x thread pools. Maybe you can create a Dispatchers.Vertx-style dispatcher that borrows its thread pools.
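For what it's worth, the vertx-lang-kotlin-coroutines module does ship a dispatcher backed by the Vert.x event loop, so a sketch along those lines (assuming that dependency and a vertx instance in scope) could be:
import io.vertx.kotlin.coroutines.dispatcher
import kotlinx.coroutines.CoroutineScope

// Run coroutines on the Vert.x event loop instead of a default dispatcher
val vertxScope = CoroutineScope(vertx.dispatcher())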
Coroutines and RxJava3
I have the following method that first calls a suspend method, and in the same launch scope I make two RxJava calls.
I am wondering if there is a way to move the RxJava code out of the viewModelScope.launch scope and return the result of fetchRecentUseCase.execute().
Basically, is it possible for the viewModelScope.launch to return the listOfProducts rather than doing everything in the launch scope?
fun loadRecentlyViewed() {
    viewModelScope.launch {
        val listOfProducts = withContext(Dispatchers.IO) {
            fetchRecentUseCase.execute()
        }
        val listOfSkus = listOfProducts.map { it.sku }
        if (listOfSkus.isNotEmpty()) {
            loadProductUseCase.execute(listOfSkus)
                .subscribeOn(schedulersFacade.io)
                .flatMap(convertProductDisplayUseCase::execute)
                .map { /* work being done */ }
                .observeOn(schedulersFacade.ui)
                .subscribeBy(
                    onError = Timber::e,
                    onSuccess = { }
                )
        }
    }
}
Use case for the suspend method:
class FetchRecentUseCaseImp() {
    override suspend fun execute(): List<Products> {
        // Call to network
    }
}
Many thanks in advance
With coroutines, the way to return a single item that is produced asynchronously is to use a suspend function. So instead of launching a coroutine, you mark the function as suspend and convert blocking or async callback functions into non-blocking code.
The places where coroutines are launched are typically UI interactions (click listeners) or points where classes are first created (on Android, places like a ViewModel constructor or a Fragment's onViewCreated()).
As a side note, it is against convention for any suspend function to expect the caller to have to specify a dispatcher. It should internally delegate if it needs to, for example:
class FetchRecentUseCaseImp() {
    override suspend fun execute(): List<Products> = withContext(Dispatchers.IO) {
        // Synchronous call to network
    }
}
But if you were using a library like Retrofit, you'd simply make your request and await() it without specifying a dispatcher, because await() is a suspend function itself.
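For example, with recent Retrofit versions the request can also be declared as a suspend function directly on the service interface, so no dispatcher or await() is needed at the call site (the endpoint and names here are illustrative):
interface ProductApi {
    @GET("products/recently-viewed")
    suspend fun fetchRecentlyViewed(): List<Products>
}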
So your function should look something like:
suspend fun loadRecentlyViewed(): List<SomeProductType> {
    val listOfSkus = fetchRecentUseCase.execute().map(Product::sku)
    if (listOfSkus.isEmpty()) {
        return emptyList()
    }
    return runCatching {
        loadProductUseCase.execute(listOfSkus) // A Single, I'm assuming
            .await() // Only if you're not completely stripping Rx from the project
            .map { convertProductDisplayUseCase.execute(it).await() } // Ditto for await()
            .toList()
            .flatten()
    }.onFailure(Timber::e)
        .getOrDefault(emptyList())
}
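The coroutine is then launched at the call site, for example in the ViewModel (a sketch; the function and property names are illustrative):
fun onRecentlyViewedRequested() {
    viewModelScope.launch {
        val products = loadRecentlyViewed()
        // e.g. publish products to a LiveData/StateFlow for the UI
    }
}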
I have a suspend function
private suspend fun getResponse(record: String): HashMap<String, String> {}
When I call it in my main function I'm doing this, but the type of response is Job, not HashMap. How can I get the correct return type?
override fun handleRequest(event: SQSEvent?, context: Context?): Void? {
    event?.records?.forEach {
        try {
            val response: Job = GlobalScope.launch {
                getResponse(it.body)
            }
        } catch (ex: Exception) {
            logger.error("error message")
        }
    }
    return null
}
Given your answers in the comments, it looks like you're not looking for concurrency here. The best course of action would then be to just make getResponse() a regular function instead of a suspend one.
Assuming you can't change this, you need to call a suspend function from a regular one. To do so, you have several options depending on your use case:
1. Block the current thread while you do your async stuff.
2. Make handleRequest a suspend function.
3. Make handleRequest take a CoroutineScope to start coroutines with some lifecycle controlled externally, but that means handleRequest will return immediately and the caller has to deal with the running coroutines (please don't use GlobalScope for this, it's a delicate API).
Options 2 and 3 are provided for completeness, but most likely they won't work in your context. So you have to block the current thread while handleRequest is running, which you can do using runBlocking:
override fun handleRequest(event: SQSEvent?, context: Context?): Void? {
    runBlocking {
        // do your stuff
    }
    return null
}
Now what to do inside runBlocking depends on what you want to achieve.
If you want to process elements sequentially, simply call getResponse directly inside the loop:
override fun handleRequest(event: SQSEvent?, context: Context?): Void? {
    runBlocking {
        event?.records?.forEach {
            try {
                val response = getResponse(it.body)
                // do something with the response
            } catch (ex: Exception) {
                logger.error("error message")
            }
        }
    }
    return null
}
If you want to process elements concurrently, but independently, you can use launch and put both getResponse() and the code using the response inside the launch:
override fun handleRequest(event: SQSEvent?, context: Context?): Void? {
    runBlocking {
        event?.records?.forEach {
            launch { // coroutine scope provided by runBlocking
                try {
                    val response = getResponse(it.body)
                    // do something with the response
                } catch (ex: Exception) {
                    logger.error("error message")
                }
            }
        }
    }
    return null
}
If you want to get the responses concurrently, but process all responses only when they're all done, you can use map + async:
override fun handleRequest(event: SQSEvent?, context: Context?): Void? {
    runBlocking {
        val responses = event?.records?.map {
            async { // coroutine scope provided by runBlocking
                try {
                    getResponse(it.body)
                } catch (ex: Exception) {
                    logger.error("error message")
                    null // if you want to still handle the other responses
                    // you could also rethrow the exception otherwise
                }
            }
        }?.awaitAll()?.filterNotNull()
        // do something with all responses
    }
    return null
}
You can use GlobalScope.async() instead of launch(): it returns a Deferred, which is a future/promise object. You can then call await() on it to get the result of getResponse().
Just make sure not to do something like async { ... }.await() in a single expression; it wouldn't make any sense, because it would still run synchronously. If you need to run getResponse() on all event.records in parallel, first loop and collect all the Deferred objects, then await on all of them.
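A rough sketch of that idea (keeping in mind the warning above about GlobalScope being a delicate API):
val deferredResponses = event?.records?.map { record ->
    GlobalScope.async { getResponse(record.body) }
}
// Block until every response is available; await()/awaitAll() are suspend functions
val responses = runBlocking { deferredResponses?.awaitAll() }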
I'm trying to write a Kotlin multiplatform library (Android and iOS) that uses ktor. In doing so I'm running into some issues with Kotlin's coroutines:
When writing tests I always get a kotlinx.coroutines.JobCancellationException: Parent job is Completed; job=JobImpl{Completed}#... exception.
I use ktor's mock engine for my tests:
client = HttpClient(MockEngine) {
    engine {
        addHandler { request ->
            // Create response object
        }
    }
}
A sample method (commonMain module) using ktor. All methods in my library are written in a similar way. The exception occurs when client.get is called.
suspend fun getData(): Either<Exception, String> = coroutineScope {
    // Exception occurs in this line:
    val response: HttpResponse = client.get { url("https://www.google.com") }
    if (response.status == HttpStatusCode.OK) {
        response.readText().right()
    } else {
        Exception("Error").left()
    }
}
A sample unit test (commonTest module) for the above method. The assertTrue statement is never called since the exception is thrown before.
@Test
fun getDataTest() = runTest {
    val result = getData()
    assertTrue(result.isRight())
}
The actual implementation of runTest in the androidTest and iosTest modules:
actual fun <T> runTest(block: suspend () -> T) { runBlocking { block() } }
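For completeness, the matching expect declaration in the commonTest module would presumably look like:
expect fun <T> runTest(block: suspend () -> T)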
I thought when I use coroutineScope, it waits until all child coroutines are done. What am I doing wrong and how can I fix this exception?
You can't cache the HttpClient (of CIO or any other engine) in the client variable and reuse it. It would be best to change your implementation to the following:
val client: HttpClient get() = HttpClient(MockEngine) {
    engine {
        addHandler { request ->
            // Create response object
        }
    }
}
The library must be updated; this glitch is listed in the fix report here: https://newreleases.io/project/github/ktorio/ktor/release/1.6.1
The problem is that you cannot reuse the same instance of the HttpClient. My example:
HttpClient(CIO) {
    install(JsonFeature) {
        serializer = GsonSerializer()
    }
}.use { client ->
    return@use client.request("URL") {
        method = HttpMethod.Get
    }
}
I use WebFlux with Netty and JDBC, so I wrap blocking JDBC operations as follows:
static <T> Mono<T> fromOne(Callable<T> blockingOperation) {
    return Mono.fromCallable(blockingOperation)
            .subscribeOn(jdbcScheduler)
            .publishOn(Schedulers.parallel());
}
The blocking operation will be processed by the jdbcScheduler, and I want the rest of the pipeline to be processed by the WebFlux event-loop scheduler.
How do I get the WebFlux event-loop scheduler?
I would strongly advise revisiting the technology options. If you are going to use JDBC, which is still blocking, then you should not use WebFlux. WebFlux shines in a non-blocking stack, but coupled with JDBC it acts as a bottleneck, and performance will actually go down.
I agree with @Vikram Rawat that using JDBC here is dangerous, mainly because JDBC is a blocking I/O API; on an event-loop reactive model it is very easy to block the whole server.
However, even though it is still an experimental effort, I suggest you keep an eye on the R2DBC project, which provides a non-blocking API for SQL. I used it for a spike and it is very elegant.
I can provide an example taken from a home project of mine on GitHub, based on Spring Boot 2.1 and Kotlin.
Web layer:
@Configuration
class ReservationRoutesConfig {

    @Bean
    fun reservationRoutes(@Value("\${baseServer:http://localhost:8080}") baseServer: String,
                          reservationRepository: ReservationRepository) =
        router {
            POST("/reservation") {
                it.bodyToMono(ReservationRepresentation::class.java)
                    .flatMap { Mono.just(ReservationRepresentation.toDomain(reservationRepresentation = it)) }
                    .flatMap { reservationRepository.save(it).toMono() }
                    .flatMap { ServerResponse.created(URI("$baseServer/reservation/${it.reservationId}")).build() }
            }
            GET("/reservation/{reservationId}") {
                reservationRepository.findOne(it.pathVariable("reservationId")).toMono()
                    .flatMap { Mono.just(ReservationRepresentation.toRepresentation(it)) }
                    .flatMap { ok().body(BodyInserters.fromObject(it)) }
            }
            DELETE("/reservation/{reservationId}") {
                reservationRepository.delete(it.pathVariable("reservationId")).toMono()
                    .then(noContent().build())
            }
        }
}
Repository layer:
class ReactiveReservationRepository(private val databaseClient: TransactionalDatabaseClient,
                                    private val customerRepository: CustomerRepository) : ReservationRepository {

    override fun findOne(reservationId: String): Publisher<Reservation> =
        databaseClient.inTransaction {
            customerRepository.find(reservationId).toMono()
                .flatMap { customer ->
                    it.execute().sql("SELECT * FROM reservation WHERE reservation_id=$1")
                        .bind("$1", reservationId)
                        .exchange()
                        .flatMap { sqlRowMap ->
                            sqlRowMap.extract { t, u ->
                                Reservation(t.get("reservation_id", String::class.java)!!,
                                    t.get("restaurant_name", String::class.java)!!,
                                    customer, t.get("date", LocalDateTime::class.java)!!)
                            }.one()
                        }
                }
        }

    override fun save(reservation: Reservation): Publisher<Reservation> =
        databaseClient.inTransaction {
            customerRepository.save(reservation.reservationId, reservation.customer).toMono()
                .then(it.execute().sql("INSERT INTO reservation (reservation_id, restaurant_name, date) VALUES ($1, $2, $3)")
                    .bind("$1", reservation.reservationId)
                    .bind("$2", reservation.restaurantName)
                    .bind("$3", reservation.date)
                    .fetch().rowsUpdated())
        }.then(Mono.just(reservation))

    override fun delete(reservationId: String): Publisher<Void> =
        databaseClient.inTransaction {
            customerRepository.delete(reservationId).toMono()
                .then(it.execute().sql("DELETE FROM reservation WHERE reservation_id = $1")
                    .bind("$1", reservationId)
                    .fetch().rowsUpdated())
        }.then(Mono.empty())
}
I hope that helps.
I have a collection of objects on which I need to perform some transformation. Currently I am using:
var myObjects: List<MyObject> = getMyObjects()

myObjects.forEach { myObj ->
    someMethod(myObj)
}
It works fine, but I was hoping to speed it up by running someMethod() in parallel, instead of waiting for each object to finish, before starting on the next one.
Is there any way to do this in Kotlin? Maybe with doAsyncTask or something?
I know that when this was asked over a year ago it was not possible, but now that Kotlin has coroutines I am curious whether any of them can help.
Yes, this can be done using coroutines. The following function applies an operation in parallel on all elements of a collection:
fun <A> Collection<A>.forEachParallel(f: suspend (A) -> Unit): Unit = runBlocking {
    map { async(CommonPool) { f(it) } }.forEach { it.await() }
}
While the definition itself is a little cryptic, you can then easily apply it as you would expect:
myObjects.forEachParallel { myObj ->
    someMethod(myObj)
}
Parallel map can be implemented in a similar way, see https://stackoverflow.com/a/45794062/1104870.
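Note that CommonPool comes from the old experimental coroutines API; with current kotlinx.coroutines the same helper could be sketched roughly as:
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.async
import kotlinx.coroutines.awaitAll
import kotlinx.coroutines.coroutineScope

suspend fun <A> Collection<A>.forEachParallel(f: suspend (A) -> Unit) {
    coroutineScope {
        map { async(Dispatchers.Default) { f(it) } }.awaitAll()
    }
}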
Java Stream is simple to use in Kotlin:
tasks.stream().parallel().forEach { computeNotSuspend(it) }
If you are using Android however, you cannot use Java 8 if you want an app compatible with an API lower than 24.
You can also use coroutines as you suggested. But they're not really part of the language as of now (August 2017) and you need to install an external library. There is a very good guide with examples.
runBlocking<Unit> {
    val deferreds = tasks.map { async(CommonPool) { compute(it) } }
    deferreds.forEach { it.await() }
}
Note that coroutines are implemented with non-blocking multi-threading, which means they can be faster than traditional multi-threading. I have code below benchmarking parallel Streams versus coroutines, and in that case the coroutine approach is about 7 times faster on my machine. However, you have to do some work yourself to make sure your code is "suspending" (non-blocking), which can be quite tricky. In my example I'm just calling delay, which is a suspend function provided by the library. Non-blocking multi-threading is not always faster than traditional multi-threading. It can be faster if you have many threads doing nothing but waiting on IO, which is kind of what my benchmark is doing.
My benchmarking code:
import kotlinx.coroutines.experimental.CommonPool
import kotlinx.coroutines.experimental.async
import kotlinx.coroutines.experimental.delay
import kotlinx.coroutines.experimental.launch
import kotlinx.coroutines.experimental.runBlocking
import java.util.*
import kotlin.system.measureNanoTime
import kotlin.system.measureTimeMillis

class SomeTask() {
    val durationMS = random.nextInt(1000).toLong()

    companion object {
        val random = Random()
    }
}

suspend fun compute(task: SomeTask): Unit {
    delay(task.durationMS)
    //println("done ${task.durationMS}")
    return
}

fun computeNotSuspend(task: SomeTask): Unit {
    Thread.sleep(task.durationMS)
    //println("done ${task.durationMS}")
    return
}

fun main(args: Array<String>) {
    val n = 100
    val tasks = List(n) { SomeTask() }

    val timeCoroutine = measureNanoTime {
        runBlocking<Unit> {
            val deferreds = tasks.map { async(CommonPool) { compute(it) } }
            deferreds.forEach { it.await() }
        }
    }
    println("Coroutine ${timeCoroutine / 1_000_000} ms")

    val timePar = measureNanoTime {
        tasks.stream().parallel().forEach { computeNotSuspend(it) }
    }
    println("Stream parallel ${timePar / 1_000_000} ms")
}
Output on my 4-core computer:
Coroutine: 1037 ms
Stream parallel: 7150 ms
If you uncomment the println calls in the two compute functions, you will see that in the non-blocking coroutine code the tasks are processed in the right order, but not with Streams.
You can use RxJava to solve this.
val items: List<MyObjects> = getList()

Observable.from(items)
    .flatMap(object : Func1<MyObjects, Observable<String>> {
        override fun call(item: MyObjects): Observable<String> {
            return someMethod(item)
        }
    })
    .subscribeOn(Schedulers.io())
    .observeOn(AndroidSchedulers.mainThread())
    .subscribe(object : Subscriber<String>() {
        override fun onCompleted() {
        }

        override fun onError(e: Throwable) {
        }

        override fun onNext(s: String) {
            // do something with each output string
        }
    })
By subscribing on Schedulers.io(), someMethod is scheduled on a background thread.
To process items of a collection in parallel you can use Kotlin Coroutines. For example the following extension function processes items in parallel and waits for them to be processed:
suspend fun <T, R> Iterable<T>.processInParallel(
    dispatcher: CoroutineDispatcher = Dispatchers.IO,
    processBlock: suspend (v: T) -> R,
): List<R> = coroutineScope { // or supervisorScope
    map {
        async(dispatcher) { processBlock(it) }
    }.awaitAll()
}
This is a suspend extension function on the Iterable<T> type, which processes the items in parallel and returns the result of processing each item. By default it uses the Dispatchers.IO dispatcher to offload blocking tasks to a shared pool of threads. It must be called from a coroutine (including a coroutine with the Dispatchers.Main dispatcher) or from another suspend function.
Example of calling from a coroutine:
val myObjects: List<MyObject> = getMyObjects()

someCoroutineScope.launch {
    val results = myObjects.processInParallel {
        someMethod(it)
    }
    // use processing results
}
where someCoroutineScope is an instance of CoroutineScope.
Or if you want to just launch and forget you can use this function:
fun <T> CoroutineScope.processInParallelAndForget(
    iterable: Iterable<T>,
    dispatcher: CoroutineDispatcher = Dispatchers.IO,
    processBlock: suspend (v: T) -> Unit
) = iterable.forEach {
    launch(dispatcher) { processBlock(it) }
}
This is an extension function on CoroutineScope, which doesn't return any result. It also uses the Dispatchers.IO dispatcher by default. It can be called on a CoroutineScope or from another coroutine.
Calling example:
someCoroutineScope.processInParallelAndForget(myObjects) {
    someMethod(it)
}

// OR from another coroutine:
someCoroutineScope.launch {
    processInParallelAndForget(myObjects) {
        someMethod(it)
    }
}
where someCoroutineScope is an instance of CoroutineScope.