Is there a thread-safe way in Ktor to statically access the current ApplicationCall? I am trying to get the following simple example to work:
object Main {
    fun start() {
        val server = embeddedServer(Jetty, 8081) {
            intercept(ApplicationCallPipeline.Call) {
                // START: this will be more dynamic in the future, we don't want to pass ApplicationCall
                Addon.processRequest()
                // END: this will be more dynamic in the future, we don't want to pass ApplicationCall
                call.respondText(output, ContentType.Text.Html, HttpStatusCode.OK)
                return@intercept finish()
            }
        }
        server.start(wait = true)
    }
}

fun main(args: Array<String>) {
    Main.start()
}

object Addon {
    fun processRequest() {
        val call = RequestUtils.getCurrentApplicationCall()
        // processing of call.request.queryParameters
        // ...
    }
}

object RequestUtils {
    fun getCurrentApplicationCall(): ApplicationCall {
        // Here is where I am getting lost..
        return null
    }
}
I would like the ApplicationCall for the current context to be available statically from RequestUtils so that I can access information about the request anywhere. This of course needs to scale to handle multiple requests at the same time.
I have done some experiments with dependency injection and ThreadLocal, but without success.
Well, the application call is passed to a coroutine, so it's really dangerous to try to access it "statically": all requests are handled concurrently.
The official Kotlin documentation discusses thread-locals in the context of coroutine execution. It uses a CoroutineContext element to restore thread-local values in a specific/custom coroutine context.
However, if you are able to design a fully asynchronous API, you can bypass thread-locals entirely by creating a custom CoroutineContext element that embeds the request call.
EDIT: I've updated my example code to test 2 flavors:
async endpoint: a solution fully based on coroutine contexts and suspend functions
blocking endpoint: uses a thread-local to store the application call, as described in the Kotlin docs
import io.ktor.server.engine.embeddedServer
import io.ktor.server.jetty.Jetty
import io.ktor.application.*
import io.ktor.http.ContentType
import io.ktor.http.HttpStatusCode
import io.ktor.response.respondText
import io.ktor.routing.get
import io.ktor.routing.routing
import kotlinx.coroutines.asContextElement
import kotlinx.coroutines.launch
import kotlin.coroutines.AbstractCoroutineContextElement
import kotlin.coroutines.CoroutineContext
import kotlin.coroutines.coroutineContext

/**
 * Thread-local in which you'll inject the application call.
 */
private val localCall: ThreadLocal<ApplicationCall> = ThreadLocal()

object Main {
    fun start() {
        val server = embeddedServer(Jetty, 8081) {
            routing {
                // Solution requiring fully coroutine-based / suspendable execution.
                get("/async") {
                    // Ktor will launch this block of code in a coroutine, so you can create a subroutine with
                    // an overloaded context providing the needed information.
                    launch(coroutineContext + ApplicationCallContext(call)) {
                        PrintQuery.processAsync()
                    }
                }
                // Solution based on a thread-local, not requiring suspending functions.
                get("/blocking") {
                    launch(coroutineContext + localCall.asContextElement(value = call)) {
                        PrintQuery.processBlocking()
                    }
                }
            }
            intercept(ApplicationCallPipeline.ApplicationPhase.Call) {
                call.respondText("Hé ho", ContentType.Text.Plain, HttpStatusCode.OK)
            }
        }
        server.start(wait = true)
    }
}

fun main() {
    Main.start()
}

interface AsyncAddon {
    /**
     * Asynchronicity propagates in order to properly access coroutine execution information.
     */
    suspend fun processAsync()
}

interface BlockingAddon {
    fun processBlocking()
}

object PrintQuery : AsyncAddon, BlockingAddon {
    override suspend fun processAsync() = processRequest("async", fetchCurrentCallFromCoroutineContext())

    override fun processBlocking() = processRequest("blocking", fetchCurrentCallFromThreadLocal())

    private fun processRequest(prefix: String, call: ApplicationCall?) {
        println("$prefix -> Query parameter: ${call?.parameters?.get("q") ?: "NONE"}")
    }
}

/**
 * Custom coroutine context element that provides information about the request execution.
 */
private class ApplicationCallContext(val call: ApplicationCall) : AbstractCoroutineContextElement(Key) {
    companion object Key : CoroutineContext.Key<ApplicationCallContext>
}

/**
 * This is your RequestUtils rewritten as a top-level function. It is declared as suspending;
 * if it were not, you would not be able to access the coroutineContext.
 */
suspend fun fetchCurrentCallFromCoroutineContext(): ApplicationCall? {
    // Here is where I am getting lost..
    return coroutineContext.get(ApplicationCallContext.Key)?.call
}

fun fetchCurrentCallFromThreadLocal(): ApplicationCall? {
    return localCall.get()
}
You can test it in your browser:
http://localhost:8081/blocking?q=test1
http://localhost:8081/blocking?q=test2
http://localhost:8081/async?q=test3
server log output:
blocking -> Query parameter: test1
blocking -> Query parameter: test2
async -> Query parameter: test3
The key mechanism you want to use here is the CoroutineContext. This is where you can set key-value pairs that are available in any child coroutine or suspending function call.
I will try to lay out an example.
First, let us define a CoroutineContextElement that will let us add an ApplicationCall to the CoroutineContext.
class ApplicationCallElement(var call: ApplicationCall?) : AbstractCoroutineContextElement(ApplicationCallElement) {
    companion object Key : CoroutineContext.Key<ApplicationCallElement>
}
Now we can define some helpers that will add the ApplicationCall on one of our routes. (This could be done as some sort of Ktor plugin that intercepts the pipeline; a rough sketch of that idea follows right after the helpers below.)
suspend fun PipelineContext<Unit, ApplicationCall>.withCall(
    bodyOfCall: suspend PipelineContext<Unit, ApplicationCall>.() -> Unit
) {
    val pipeline = this
    val appCallContext = buildAppCallContext(this.call)
    withContext(appCallContext) {
        pipeline.bodyOfCall()
    }
}

internal suspend fun buildAppCallContext(call: ApplicationCall): CoroutineContext {
    var context = coroutineContext
    val callElement = ApplicationCallElement(call)
    context = context.plus(callElement)
    return context
}
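As an aside, here is a rough, untested sketch of that plugin-style idea, assuming the Ktor 1.x pipeline API; installCallContext is a hypothetical name, and the interceptor simply wraps the rest of the pipeline in the call-carrying context so every handler and nested suspend function can read ApplicationCallElement:
import io.ktor.application.*
import kotlinx.coroutines.withContext

// Hypothetical helper: installs an interceptor so the whole call pipeline runs
// inside a context that carries the current ApplicationCall.
fun Application.installCallContext() {
    intercept(ApplicationCallPipeline.Features) {
        withContext(buildAppCallContext(call)) {
            proceed()
        }
    }
}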
And then we can use it all together like in this test case below where we are able to get the call from a nested suspending function:
suspend fun getSomethingFromCall(): String {
    val call = coroutineContext[ApplicationCallElement.Key]?.call ?: throw Exception("Element not set")
    return call.parameters["key"] ?: throw Exception("Parameter not set")
}

fun Application.myApp() {
    routing {
        route("/foo") {
            get {
                withCall {
                    call.respondText(getSomethingFromCall())
                }
            }
        }
    }
}
class ApplicationCallTest {

    @Test
    fun `we can get the application call in a nested function`() {
        withTestApplication({ myApp() }) {
            with(handleRequest(HttpMethod.Get, "/foo?key=bar")) {
                assertEquals(HttpStatusCode.OK, response.status())
                assertEquals("bar", response.content)
            }
        }
    }
}
Related
I'm trying to write a Kotlin multiplatform library (Android and iOS) that uses Ktor, and I'm running into some issues with Kotlin's coroutines:
When writing tests I always get a kotlinx.coroutines.JobCancellationException: Parent job is Completed; job=JobImpl{Completed}#... exception.
I use Ktor's MockEngine for my tests:
client = HttpClient(MockEngine) {
    engine {
        addHandler { request ->
            // Create response object
        }
    }
}
A sample method (commonMain module) using Ktor. All methods in my library are written in a similar way. The exception occurs when client.get is called.
suspend fun getData(): Either<Exception, String> = coroutineScope {
    // Exception occurs in this line:
    val response: HttpResponse = client.get { url("https://www.google.com") }

    return@coroutineScope if (response.status == HttpStatusCode.OK) {
        response.readText().right()
    } else {
        Exception("Error").left()
    }
}
A sample unit test (commonTest module) for the above method. The assertTrue statement is never called, since the exception is thrown before it is reached.
@Test
fun getDataTest() = runTest {
    val result = getData()
    assertTrue(result.isRight())
}
Actual implementation of runTest in androidTest and iosTest modules.
actual fun <T> runTest(block: suspend () -> T) { runBlocking { block() } }
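For reference, the matching expect declaration in the common module would presumably look like this (inferred from the actual implementation above; it is not shown in the question):
expect fun <T> runTest(block: suspend () -> T)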
I thought that when I use coroutineScope, it waits until all child coroutines are done. What am I doing wrong and how can I fix this exception?
You can't cache the HttpClient in the client variable and reuse it. It would be best to change the following code in your implementation:
val client: HttpClient
    get() = HttpClient(MockEngine) {
        engine {
            addHandler { request ->
                // Create response object
            }
        }
    }
The library must be updated; this issue is listed in the fix report here: https://newreleases.io/project/github/ktorio/ktor/release/1.6.1
The problem is that you cannot reuse the same instance of the HttpClient. My example:
HttpClient(CIO) {
    install(JsonFeature) {
        serializer = GsonSerializer()
    }
}.use { client ->
    return@use client.request("URL") {
        method = HttpMethod.Get
    }
}
I have code that should turn SharedPreferences into observable storage with Flow, so I have code like this:
internal val onKeyValueChange: Flow<String> = channelFlow {
    val callback = SharedPreferences.OnSharedPreferenceChangeListener { _, key ->
        coroutineScope.launch {
            //send(key)
            offer(key)
        }
    }
    sharedPreferences.registerOnSharedPreferenceChangeListener(callback)
    awaitClose {
        sharedPreferences.unregisterOnSharedPreferenceChangeListener(callback)
    }
}
or this
internal val onKeyValueChange: Flow<String> = callbackFlow {
    val callback = SharedPreferences.OnSharedPreferenceChangeListener { _, key ->
        coroutineScope.launch {
            send(key)
            //offer(key)
        }
    }
    sharedPreferences.registerOnSharedPreferenceChangeListener(callback)
    awaitClose {
        sharedPreferences.unregisterOnSharedPreferenceChangeListener(callback)
    }
}
Then I observe these preferences for token, userId and companyId and log in, but something odd happens: I need to build the app three times. The first time, changing the token doesn't cause tokenFlow to emit anything; the second time, a new userId doesn't cause userIdFlow to emit anything; only after the third login can I log out and log in again and it works. On logout I am clearing all three properties stored in prefs: token, userId and companyId.
For callbackFlow:
You cannot use emit() inside a callback the way you would in a simple flow builder (because it's a suspend function). Therefore callbackFlow offers you a non-suspending way to do it with trySend().
Example:
fun observeData() = flow {
    myAwesomeInterface.addListener { result ->
        emit(result) // NOT ALLOWED
    }
}
So, coroutines offer you the option of callbackFlow:
fun observeData() = callbackFlow {
    myAwesomeInterface.addListener { result ->
        trySend(result) // ALLOWED
    }
    awaitClose { myAwesomeInterface.removeListener() }
}
For channelFlow:
The main difference between it and the basic Flow is described in the documentation:
A channel with the default buffer size is used. Use the buffer operator on the resulting flow to specify a user-defined value and to control what happens when data is produced faster than consumed, i.e. to control the back-pressure behavior.
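For instance, applying the buffer operator to a flow like the one in the question might look like this (just a sketch; onKeyValueChange is the property defined above):
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.buffer
import kotlinx.coroutines.flow.conflate

// Keep up to 64 pending keys if the collector is slower than the listener callback.
val bufferedChanges: Flow<String> = onKeyValueChange.buffer(capacity = 64)

// Or keep only the most recent key, dropping intermediate values.
val latestChangeOnly: Flow<String> = onKeyValueChange.conflate()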
trySend() still serves the same purpose: it's just a synchronous (non-suspending) alternative to emit() or send().
I suggest you check Roman Elizarov's blog for more detailed information, especially this post.
Regarding your code, for callbackFlow you won't be needing a coroutine launch:
coroutineScope.launch {
    send(key)
    //trySend(key)
}
Just use trySend()
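For example, here is a sketch of the callbackFlow from the question rewritten without the inner launch (same sharedPreferences receiver as in your code):
internal val onKeyValueChange: Flow<String> = callbackFlow {
    val callback = SharedPreferences.OnSharedPreferenceChangeListener { _, key ->
        // trySend() is non-suspending, so it can be called directly from the listener.
        trySend(key)
    }
    sharedPreferences.registerOnSharedPreferenceChangeListener(callback)
    awaitClose {
        sharedPreferences.unregisterOnSharedPreferenceChangeListener(callback)
    }
}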
Another, more concrete example:
private fun test() {
    lifecycleScope.launch {
        someFlow().collectLatest {
            Log.d("TAG", "Finally we received the result: $it")
            // Cancel this listener, so it will not be subscribed anymore to the callbackFlow. awaitClose() will be triggered.
            // cancel()
        }
    }
}

/**
 * Define a callbackFlow.
 */
private fun someFlow() = callbackFlow {
    // A dummy class which runs some business logic and sends results back to listeners through the ApiCallback methods.
    val service = ServiceTest() // a REST API class, for example

    // A simple callback interface which will be called from ServiceTest.
    val callback = object : ApiCallback {
        override fun someApiMethod(data: String) {
            // Sending method used by callbackFlow. In a Flow we have emit(...); for a ChannelFlow we have send(...).
            trySend(data)
        }

        override fun anotherApiMethod(data: String) {
            // Sending method used by callbackFlow. In a Flow we have emit(...); for a ChannelFlow we have send(...).
            trySend(data)
        }
    }

    // Register the ApiCallback for later usage by ServiceTest.
    service.register(callback)

    // Dummy sample usage of the callback flow.
    service.execute(1)
    service.execute(2)
    service.execute(3)
    service.execute(4)

    // When a listener subscribed through .collectLatest {} calls cancel(), awaitClose will get executed.
    awaitClose {
        service.unregister()
    }
}

interface ApiCallback {
    fun someApiMethod(data: String)
    fun anotherApiMethod(data: String)
}

class ServiceTest {
    private var callback: ApiCallback? = null

    fun unregister() {
        callback = null
        Log.d("TAG", "Unregister the callback in the service class")
    }

    fun register(callback: ApiCallback) {
        Log.d("TAG", "Register the callback in the service class")
        this.callback = callback
    }

    fun execute(value: Int) {
        CoroutineScope(Dispatchers.IO).launch {
            if (value < 2) {
                callback?.someApiMethod("message sent through someApiMethod: $value.")
            } else {
                callback?.anotherApiMethod("message sent through anotherApiMethod: $value.")
            }
        }
    }
}
I'm trying to learn a bit of functional programming using Kotlin and Arrow, and along the way I've read some blog posts like this one: https://jorgecastillo.dev/kotlin-fp-1-monad-stack, which is good. I understand the main idea, but when creating a program I can't figure out how to run it.
Let me be more explicit:
I have the following piece of code:
typealias EitherIO<A, B> = EitherT<ForIO, A, B>

sealed class UserError(
    val message: String,
    val status: Int
) {
    object AuthenticationError : UserError(HttpStatus.UNAUTHORIZED.reasonPhrase, HttpStatus.UNAUTHORIZED.value())
    object UserNotFound : UserError(HttpStatus.NOT_FOUND.reasonPhrase, HttpStatus.NOT_FOUND.value())
    object InternalServerError : UserError(HttpStatus.INTERNAL_SERVER_ERROR.reasonPhrase, HttpStatus.INTERNAL_SERVER_ERROR.value())
}

@Component
class UserAdapter(
    private val myAccountClient: MyAccountClient
) {
    @Lazy
    @Inject
    lateinit var subscriberRepository: SubscriberRepository

    fun getDomainUser(ssoId: Long): EitherIO<UserError, User?> {
        val io = IO.fx {
            val userResource = getUserResourcesBySsoId(ssoId, myAccountClient).bind()
            userResource.fold(
                { error -> Either.Left(error) },
                { success ->
                    Either.right(composeDomainUserWithSubscribers(success, getSubscribersForUserResource(success, subscriberRepository).bind()))
                })
        }
        return EitherIO(io)
    }

    fun composeDomainUserWithSubscribers(userResource: UserResource, subscribers: Option<Subscribers>): User? {
        return subscribers.map { userResource.toDomainUser(it) }.orNull()
    }
}

private fun getSubscribersForUserResource(userResource: UserResource, subscriberRepository: SubscriberRepository): IO<Option<Subscribers>> {
    return IO {
        val msisdnList = userResource.getMsisdnList()
        Option.invoke(subscriberRepository.findAllByMsisdnInAndDeletedIsFalse(msisdnList).associateBy(Subscriber::msisdn))
    }
}

private fun getUserResourcesBySsoId(ssoId: Long, myAccountClient: MyAccountClient): IO<Either<UserError, UserResource>> {
    return IO {
        val response = myAccountClient.getUserBySsoId(ssoId)
        if (response.isSuccessful) {
            val userResource = JacksonUtils.fromJsonToObject(response.body()?.string()!!, UserResource::class.java)
            Either.Right(userResource)
        } else {
            when (response.code()) {
                401 -> Either.Left(UserError.AuthenticationError)
                404 -> Either.Left(UserError.UserNotFound)
                else -> Either.Left(UserError.InternalServerError)
            }
        }
    }.handleError { Either.Left(UserError.InternalServerError) }
}
which, as you can see, accumulates some results into an IO monad. I should run this program using unsafeRunSync() from Arrow, but the javadoc states the following: **NOTE** this function is intended for testing, it should never appear in your mainline production code!
I should mention that I know about unsafeRunAsync, but in my case I want to be synchronous.
Thanks!
Instead of running unsafeRunSync, you should favor unsafeRunAsync.
If you have myFun(): IO<A> and want to run this, then you call myFun().unsafeRunAsync(cb) where cb: (Either<Throwable, A>) -> Unit.
For instance, if your function returns IO<List<Int>> then you can call
myFun().unsafeRunAsync { /* it: Either<Throwable, List<Int>> -> */
    it.fold(
        { Log.e("Foo", "Error! $it") },
        { println(it) })
}
This will run the program contained in the IO asynchronously and pass the result safely to the callback, which will log an error if the IO threw, and otherwise it will print the list of integers.
You should avoid unsafeRunSync for a number of reasons, discussed here. It's blocking, it can cause crashes, it can cause deadlocks, and it can halt your application.
If you really want to run your IO as a blocking computation, then you can precede this with attempt() to have your IO<A> become an IO<Either<Throwable, A>> similar to the unsafeRunAsync callback parameter. At least then you won't crash.
But unsafeRunAsync is preferred. Also, make sure the callback you pass to unsafeRunAsync won't throw any errors, as it's assumed it won't. Docs.
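For illustration, here is a minimal sketch of the attempt() approach mentioned above, assuming the pre-1.0 arrow-fx IO API used in the question (runBlockingSafely is a hypothetical helper name, not part of Arrow):
import arrow.core.Either
import arrow.fx.IO // arrow.effects.IO in older Arrow versions

// attempt() wraps any thrown error into an Either, so the blocking run itself won't throw.
fun <A> IO<A>.runBlockingSafely(): Either<Throwable, A> =
    attempt().unsafeRunSync()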
This is my interface:
interface BlogService {
    suspend fun tag(): JsonObject
}
Is it possible to create a dynamic proxy for the suspend method and run a coroutine inside?
I can't use "Proxy.newProxyInstance" from the JDK because I get a compilation error (a suspend function should be called from another suspend function).
I had the same problem. And I think the answer is Yes. Here's what I figured out.
The following interface
interface IService {
    suspend fun hello(arg: String): Int
}
is compiled into this
interface IService {
    fun hello(var1: String, var2: Continuation<Int>): Any
}
After compilation, there's no difference between a normal function and a suspend function, except that the latter has an additional argument of type Continuation. Just return COROUTINE_SUSPENDED in the delegated InvocationHandler.invoke if you actually want it suspended.
Here's an example of creating an IService instance via the Java dynamic proxy facility Proxy.newProxyInstance:
import java.lang.reflect.InvocationHandler
import java.lang.reflect.Proxy
import kotlin.coroutines.Continuation
import kotlin.coroutines.intrinsics.COROUTINE_SUSPENDED
import kotlin.coroutines.resume

fun getServiceDynamic(): IService {
    val proxy = InvocationHandler { _, method, args ->
        val lastArg = args?.lastOrNull()
        if (lastArg is Continuation<*>) {
            val cont = lastArg as Continuation<Int>
            val argsButLast = args.take(args.size - 1)
            doSomethingWith(method, argsButLast, onComplete = { result: Int ->
                cont.resume(result)
            })
            COROUTINE_SUSPENDED
        } else {
            0
        }
    }
    return Proxy.newProxyInstance(
        proxy.javaClass.classLoader,
        arrayOf(IService::class.java),
        proxy
    ) as IService
}
I believe this code snippet is simple enough to be self-explanatory.
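For instance, a hypothetical usage sketch, assuming doSomethingWith eventually invokes the onComplete callback with an Int:
import kotlinx.coroutines.runBlocking

fun main() = runBlocking {
    val service: IService = getServiceDynamic()
    // Suspends here until the InvocationHandler's callback calls cont.resume(result).
    val result: Int = service.hello("world")
    println(result)
}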
I have a collection of objects, which I need to perform some transformation on. Currently I am using:
val myObjects: List<MyObject> = getMyObjects()
myObjects.forEach { myObj ->
    someMethod(myObj)
}
It works fine, but I was hoping to speed it up by running someMethod() in parallel instead of waiting for each object to finish before starting on the next one.
Is there any way to do this in Kotlin? Maybe with doAsyncTask or something?
I know that when this was asked over a year ago it was not possible, but now that Kotlin has coroutines (and helpers like doAsyncTask) I am curious whether coroutines can help.
Yes, this can be done using coroutines. The following function applies an operation in parallel on all elements of a collection:
fun <A> Collection<A>.forEachParallel(f: suspend (A) -> Unit): Unit = runBlocking {
    map { async(CommonPool) { f(it) } }.forEach { it.await() }
}
While the definition itself is a little cryptic, you can then easily apply it as you would expect:
myObjects.forEachParallel { myObj ->
    someMethod(myObj)
}
Parallel map can be implemented in a similar way, see https://stackoverflow.com/a/45794062/1104870.
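For example, here is a sketch of such a parallel map using structured concurrency from current kotlinx.coroutines (coroutineScope and awaitAll, rather than the now-deprecated CommonPool used above):
import kotlinx.coroutines.async
import kotlinx.coroutines.awaitAll
import kotlinx.coroutines.coroutineScope

// Runs f on every element concurrently and returns the results in the original order.
suspend fun <A, B> Collection<A>.mapParallel(f: suspend (A) -> B): List<B> = coroutineScope {
    map { async { f(it) } }.awaitAll()
}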
Java Stream is simple to use in Kotlin:
tasks.stream().parallel().forEach { computeNotSuspend(it) }
If you are using Android however, you cannot use Java 8 if you want an app compatible with an API lower than 24.
You can also use coroutines as you suggested. But they are not really part of the language as of now (August 2017) and you need to install an external library. There is a very good guide with examples.
runBlocking<Unit> {
    val deferreds = tasks.map { async(CommonPool) { compute(it) } }
    deferreds.forEach { it.await() }
}
Note that coroutines are implemented with non-blocking multi-threading, which means they can be faster than traditional multi-threading. I have code below benchmarking parallel Streams versus coroutines, and in that case the coroutine approach is 7 times faster on my machine. However, you have to do some work yourself to make sure your code is "suspending" (non-blocking), which can be quite tricky. In my example I'm just calling delay, which is a suspend function provided by the library. Non-blocking multi-threading is not always faster than traditional multi-threading; it can be faster if you have many threads doing nothing but waiting on IO, which is kind of what my benchmark is doing.
My benchmarking code:
import kotlinx.coroutines.experimental.CommonPool
import kotlinx.coroutines.experimental.async
import kotlinx.coroutines.experimental.delay
import kotlinx.coroutines.experimental.launch
import kotlinx.coroutines.experimental.runBlocking
import java.util.*
import kotlin.system.measureNanoTime
import kotlin.system.measureTimeMillis

class SomeTask() {
    val durationMS = random.nextInt(1000).toLong()

    companion object {
        val random = Random()
    }
}

suspend fun compute(task: SomeTask): Unit {
    delay(task.durationMS)
    //println("done ${task.durationMS}")
    return
}

fun computeNotSuspend(task: SomeTask): Unit {
    Thread.sleep(task.durationMS)
    //println("done ${task.durationMS}")
    return
}

fun main(args: Array<String>) {
    val n = 100
    val tasks = List(n) { SomeTask() }

    val timeCoroutine = measureNanoTime {
        runBlocking<Unit> {
            val deferreds = tasks.map { async(CommonPool) { compute(it) } }
            deferreds.forEach { it.await() }
        }
    }
    println("Coroutine ${timeCoroutine / 1_000_000} ms")

    val timePar = measureNanoTime {
        tasks.stream().parallel().forEach { computeNotSuspend(it) }
    }
    println("Stream parallel ${timePar / 1_000_000} ms")
}
Output on my 4 cores computer:
Coroutine: 1037 ms
Stream parallel: 7150 ms
If you uncomment the println in the two compute functions you will see that with the non-blocking coroutine code the tasks are processed in the right order, but not with Streams.
You can use RxJava to solve this.
val items: List<MyObjects> = getList()

Observable.from(items).flatMap(object : Func1<MyObjects, Observable<String>> {
    override fun call(item: MyObjects): Observable<String> {
        return someMethod(item)
    }
}).subscribeOn(Schedulers.io()).observeOn(AndroidSchedulers.mainThread()).subscribe(object : Subscriber<String>() {
    override fun onCompleted() {
    }

    override fun onError(e: Throwable) {
    }

    override fun onNext(s: String) {
        // do on output of each string
    }
})
By subscribing on Schedulers.io(), someMethod is scheduled on a background thread.
To process items of a collection in parallel you can use Kotlin Coroutines. For example the following extension function processes items in parallel and waits for them to be processed:
suspend fun <T, R> Iterable<T>.processInParallel(
    dispatcher: CoroutineDispatcher = Dispatchers.IO,
    processBlock: suspend (v: T) -> R,
): List<R> = coroutineScope { // or supervisorScope
    map {
        async(dispatcher) { processBlock(it) }
    }.awaitAll()
}
This is a suspend extension function on the Iterable<T> type, which processes items in parallel and returns the result of processing each item. By default it uses the Dispatchers.IO dispatcher to offload blocking tasks to a shared pool of threads. It must be called from a coroutine (including a coroutine with the Dispatchers.Main dispatcher) or another suspend function.
Example of calling from a coroutine:
val myObjects: List<MyObject> = getMyObjects()

someCoroutineScope.launch {
    val results = myObjects.processInParallel {
        someMethod(it)
    }
    // use processing results
}
where someCoroutineScope is an instance of CoroutineScope.
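For illustration, one way such a scope could be created (the names here are assumptions, not part of the answer):
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.SupervisorJob

// A scope whose children fail independently; cancel it when the owning component goes away.
val someCoroutineScope = CoroutineScope(SupervisorJob() + Dispatchers.Default)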
Or if you want to just launch and forget you can use this function:
fun <T> CoroutineScope.processInParallelAndForget(
    iterable: Iterable<T>,
    dispatcher: CoroutineDispatcher = Dispatchers.IO,
    processBlock: suspend (v: T) -> Unit
) = iterable.forEach {
    launch(dispatcher) { processBlock(it) }
}
This is an extension function on CoroutineScope, which doesn't return any result. It also uses the Dispatchers.IO dispatcher by default. It can be called on a CoroutineScope or from another coroutine.
Calling example:
someCoroutineScope.processInParallelAndForget(myObjects) {
    someMethod(it)
}

// OR from another coroutine:
someCoroutineScope.launch {
    processInParallelAndForget(myObjects) {
        someMethod(it)
    }
}
where someCoroutineScope is an instance of CoroutineScope.