Inter-service communication: injecting ServiceB into ServiceAImpl fails to compile (direct remote call without message broker) - Lagom

I am a beginner with Lagom and am using it with Scala. Creating a microservice with it has been a great experience so far.
Following
https://www.lagomframework.com/documentation/1.4.x/scala/ServiceClients.html#Binding-a-service-client
I am trying out inter-service communication without a message broker, via remote calls. That is, I want to inject ServiceB into ServiceAImpl and invoke it through the client. This is for a scenario where I do not want the call to go through an event or a message broker; it should be a direct service-to-service call.
In Lagom Scala I have ServiceA and ServiceB. I have created a ServiceB client in the ServiceAApplication and am trying to inject ServiceB into ServiceAImpl. During compilation I get the following error:
Cannot find a value of type : [com.example.ServiceB]
override lazy val lagomServer = serverForServiceA
in the abstract class ServiceAApplication.
Below is a snippet of the application loader class; I get this error only when I inject ServiceB into the ServiceAImpl constructor.
Snippet of ServiceAApplicationLoader.scala:
trait ServiceAComponents extends LagomServerComponents
  with SlickPersistenceComponents
  with LagomConfigComponent
  with HikariCPComponents
  with LagomKafkaComponents {

  implicit def executionContext: ExecutionContext
  def environment: Environment
  implicit def materializer: Materializer

  override lazy val lagomServer = serverFor[ServiceA](wire[ServiceAImpl])
  lazy val serviceARepository = wire[ServiceARepository]
  lazy val jsonSerializerRegistry = ServiceASerializerRegistry

  persistentEntityRegistry.register(wire[ServiceAEntity])
  readSide.register(wire[ServiceAEventProcessor])
}
abstract class ServiceAApplication(context: LagomApplicationContext) extends LagomApplication(context)
  with ServiceAComponents
  with AhcWSComponents
  with SlickPersistenceComponents
  with LagomServiceClientComponents
  with LagomConfigComponent {

  lazy val serviceB = serviceClient.implement[ServiceB]
}
class ServiceAApplicationLoader extends LagomApplicationLoader {
  override def load(context: LagomApplicationContext) =
    new ServiceAApplication(context) with LagomServiceLocatorComponents

  override def loadDevMode(context: LagomApplicationContext) =
    new ServiceAApplication(context) with LagomDevModeComponents

  override def describeService = Some(readDescriptor[ServiceA])
}
Snippet of ServiceAImpl.scala:
class ServiceAImpl(registry: PersistentEntityRegistry, serviceARepository: ServiceARepository, serviceB: ServiceB)
                  (implicit ec: ExecutionContext) extends ServiceA {
  // One of the service call implementations here invokes the serviceB client that is injected via the constructor.
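For context, the kind of direct call intended here would look roughly like the sketch below (doSomethingInA and getSomethingFromB are placeholder call names, not from the actual code):

// Hypothetical sketch of a direct service-to-service call from ServiceAImpl;
// doSomethingInA and getSomethingFromB are made-up service-call names.
override def doSomethingInA(id: String) = ServiceCall { _ =>
  // The injected ServiceB client returns a ServiceCall; invoke() performs the
  // remote call and yields a Future with ServiceB's response.
  serviceB.getSomethingFromB(id).invoke()
}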
When I compile, I get the following error:
Cannot find a value of type : [com.example.ServiceB]
override lazy val lagomServer = serverForServiceA
Note: When I write the application loader in the following way, I do not get the error. But, as you will see below, I do not define a components trait, and hence I lose testability.
I do not have a trait like ServiceAComponents above; instead I have defined it as below.
abstract class ServiceAApplication(context: LagomApplicationContext) extends LagomApplication(context)
  with ServiceAComponents
  with AhcWSComponents
  with SlickPersistenceComponents
  with LagomServiceClientComponents
  with LagomConfigComponent {

  override lazy val lagomServer = serverFor[ServiceA](wire[ServiceAImpl])
  lazy val serviceARepository = wire[ServiceARepository]
  lazy val jsonSerializerRegistry = ServiceASerializerRegistry

  persistentEntityRegistry.register(wire[ServiceAEntity])
  readSide.register(wire[ServiceAEventProcessor])

  lazy val serviceB = serviceClient.implement[ServiceB]
}
class ServiceAApplicationLoader extends LagomApplicationLoader {
  override def load(context: LagomApplicationContext) =
    new ServiceAApplication(context) with LagomServiceLocatorComponents

  override def loadDevMode(context: LagomApplicationContext) =
    new ServiceAApplication(context) with LagomDevModeComponents

  override def describeService = Some(readDescriptor[ServiceA])
}
This application loader works fine with runAll, but because there is no components trait, I am not able to run the unit tests together with ServiceB.
Snippet of the unit test:
class ServiceAImplIntegrationTest extends AsyncWordSpec with Matchers with BeforeAndAfterAll {

  private val server = ServiceTest.startServer(ServiceTest.defaultSetup.withCassandra(true)) { ctx =>
    new ServiceAApplication(ctx) with LocalServiceLocator {
      override def additionalConfiguration: AdditionalConfiguration =
        super.additionalConfiguration ++ ConfigFactory.parseString(
          "cassandra-query-journal.eventual-consistency-delay = 0"
        )
    }
  }
Note: this test case does not follow the approach used in the sample ItemIntegrationTest; instead of starting the server with a components trait, I start it with just ServiceAApplication, and the test therefore fails saying that ServiceB is not running. How do I deal with this?
Questions:
1. Is injecting ServiceB into ServiceAImpl and calling it with invoke the right approach (without having a subscriber)?
2. How do I test this as an integration of ServiceA and ServiceB?

I got the answer in the following link:
https://discuss.lightbend.com/t/inter-service-communication-with-out-message-broker-unable-to-inject-if-creating-component-extending-lagomservercomponent/3257
I changed the component definition as described in that link, and that solved the issue.
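For reference, my reading of that answer (a sketch paraphrasing it, not the exact code from the link) is that the components trait itself should extend LagomServiceClientComponents and declare the ServiceB client, so that wire[ServiceAImpl] can resolve it:

trait ServiceAComponents extends LagomServerComponents
  with SlickPersistenceComponents
  with LagomConfigComponent
  with HikariCPComponents
  with LagomKafkaComponents
  with LagomServiceClientComponents {

  implicit def executionContext: ExecutionContext
  def environment: Environment
  implicit def materializer: Materializer

  // Declaring the client here puts it in scope for macwire when wiring ServiceAImpl.
  lazy val serviceB: ServiceB = serviceClient.implement[ServiceB]

  override lazy val lagomServer = serverFor[ServiceA](wire[ServiceAImpl])
  lazy val serviceARepository = wire[ServiceARepository]
  lazy val jsonSerializerRegistry = ServiceASerializerRegistry
}

With serviceB declared as a member of the trait, a ServiceTest-based integration test can also override it with a stub implementation instead of requiring a running ServiceB, which addresses question 2.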

Related

Hilt Singleton doesn't seem to work in my Service

I'm facing an issue where my injected repository behaves as if it were not a singleton.
I have a repository (in reality many, but let's keep it simple) marked as @Singleton:
@Module
@InstallIn(SingletonComponent::class)
class AppModule {
    @Provides
    @Singleton
    fun provideSettingsRepository(@ApplicationContext context: Context): SettingsRepository {
        return SettingsRepositoryImpl(context)
    }
}
Here is the implementation of my repository:
class SettingsRepositoryImpl(context: Context) : SettingsRepository {
    private var _flow = MutableStateFlow("init value")

    override fun getFlow(): StateFlow<String?> = _flow.asStateFlow()

    override fun setMyValue(value: String) {
        _flow.value = value
    }
}
When I use it outside of my service (in view models or other classes with DI), it works perfectly. Today I implemented an AIDL service and wanted to do some DI there. I had to use field injection because the service constructor has to be empty. It seems like an update made from my application isn't reflected in my "TestApplication" that consumes the service (as if I had two instances of my repository).
Here is the code of my service:
@AndroidEntryPoint
class AppService : Service() {
    @Inject lateinit var settingsRepository: SettingsRepository

    fun someActionOnMyRepository() {
        settingsRepository.setMyValue("whatever")
    }
}
When I set the value from my UI (a view model or any other class that has the repository injected), it is not updated in my service; the flow does not contain the new value (verified by debugging and logcat).
I'm expecting my settingsRepository to be a singleton. What am I missing? Is it because of the field injection?
Best regards
OK, the problem was not Hilt but how I declared my service in my AndroidManifest.xml:
<service android:name=".services.MyAppService"
    android:process=":remote" <----- THIS
    android:exported="true">
Declaring it like that runs the service in another process, which means it is effectively another app instance (so no shared singletons / SharedPreferences).

How to mock a class which takes a parameter and validates it?

I am trying to mock an org.apache.kafka.clients.producer.KafkaProducer.
But the mock fails because of the implementation of the class: the input parameter is validated, and if it is null, a NullPointerException is thrown.
How can I mock it?
The reason I think it fails is that the first parameter of the KafkaProducer constructor is a ProducerConfig, which extends AbstractConfig. It validates the properties that are passed; if they are null, it throws a NullPointerException.
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerConfig}
import org.scalamock.scalatest.MockFactory

object MyProducerTest extends MockFactory with App {
  val mockKafkaProducer = mock[KafkaProducer[String, String]]
}
Exception in thread "main" java.lang.NullPointerException
at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:52)
at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:63)
at org.apache.kafka.clients.producer.ProducerConfig.<init>(ProducerConfig.java:340)
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:166)
at org.hs.My.tools.connector.MyProducerTest$$anon$1.<init>(MyProducerTest.scala:21)
at org.hs.My.tools.connector.MyProducerTest$.delayedEndpoint$org$hs$My$tools$connector$MyProducerTest$1(MyProducerTest.scala:21)
at org.hs.My.tools.connector.MyProducerTest$delayedInit$body.apply(MyProducerTest.scala:16)
at scala.Function0.apply$mcV$sp(Function0.scala:39)
at scala.Function0.apply$mcV$sp$(Function0.scala:39)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:17)
at scala.App.$anonfun$main$1$adapted(App.scala:80)
at scala.collection.immutable.List.foreach(List.scala:392)
at scala.App.main(App.scala:80)
at scala.App.main$(App.scala:78)
at org.hs.My.tools.connector.MyProducerTest$.main(MyProducerTest.scala:16)
at org.hs.My.tools.connector.MyProducerTest.main(MyProducerTest.scala)
I think you can try extending the object with your own custom class.
class MyKafkaProducer extends KafkaProducer[String, String]()
val mockKafkaProducer = mock[MyKafkaProducer]
Or, mock the class above your KafkaProducer that makes the calls instead. Something like this:
// main:
class BusinessApp {
  // create a KafkaProducer
  def sendMessage(msg: String) = {
    kafkaProducer.send(new ProducerRecord(msg))
  }
}

// tests:
val mockBusinessApp = mock[BusinessApp]
(mockBusinessApp.sendMessage _).expects("test").returns(true)
Then you're not mocking the lower-level API of the KafkaProducer.
As a note, you could use embedded Kafka so that you don't have to mock KafkaProducer at all, and get actual messages produced and consumed during tests.
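A rough sketch of that approach, assuming the embedded-kafka library (io.github.embeddedkafka) and ScalaTest; the topic name and message below are placeholders:

import io.github.embeddedkafka.EmbeddedKafka
import org.scalatest.funsuite.AnyFunSuite
import org.scalatest.matchers.should.Matchers

class MyProducerSpec extends AnyFunSuite with Matchers with EmbeddedKafka {

  test("produces and consumes a real message") {
    // Spins up an in-memory Kafka broker for the duration of the block.
    withRunningKafka {
      publishStringMessageToKafka("test-topic", "hello")            // real produce
      consumeFirstStringMessageFrom("test-topic") shouldBe "hello"  // real consume
    }
  }
}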

How to correctly use Mockito's verify on a spring #Service test

I have this service (all in Kotlin):
@Service
class MyService {
    fun getSomeString(): String = "test"
}
And this integration test class:
@RunWith(SpringRunner::class)
@SpringBootTest
@EmbeddedKafka // used on some kafka tests
class BasicTests {
and this test method:
@Test
fun `test method count`() {
    // here I have a kafka producer sending a message to a topic that ends up
    // calling myService.getSomeString via @KafkaListener from other services
    verify(someObjectRelatedToMyService, atLeast(1)).getSome()
}
In the place of someObjectRelatedToMyService I tried to use
@Autowired
lateinit var myService: MyService
But then I got: Argument passed to verify() is of type MyService and is not a mock!
But when I use
@Mock
lateinit var myMock: MyService
I get: Actually, there were zero interactions with this mock.
And actually, that makes sense to me, since my mock wasn't called; the real service in the application was.
Is it possible to count method calls on my real object?
You can spy on the real object to count method calls on it like this:
@Test
fun `test method count`() {
    Mockito.spy(someObjectRelatedToMyService)
    verify(someObjectRelatedToMyService, atLeast(1)).getSome()
}
As you can see, the only thing you have to do is call the spy method, which enables tracking interactions with the target object.
With this call added before the verify, you should no longer get the error that the object is not a mock.
[Posting here since I have no rep to comment] Have you tried using @Spy? Then you could specify which methods to mock and which methods to call. I suppose you can also apply Mockito.verify on spies...

How to create a TestContainers base test class in Kotlin with JUnit 5

I am trying to use the Neo4j Testcontainers module with Kotlin, Spring Data Neo4j, Spring Boot and JUnit 5. I have a lot of tests that require the test container. Ideally, I would like to avoid copying the container definition and configuration into each test class.
Currently I have something like:
@Testcontainers
@DataNeo4jTest
@Import(Neo4jConfiguration::class, Neo4jTestConfiguration::class)
class ContainerTest(@Autowired private val repository: XYZRepository) {

    companion object {
        const val IMAGE_NAME = "neo4j"
        const val TAG_NAME = "3.5.5"

        @Container
        @JvmStatic
        val databaseServer: KtNeo4jContainer = KtNeo4jContainer("$IMAGE_NAME:$TAG_NAME")
            .withoutAuthentication()
    }

    @TestConfiguration
    internal class Config {
        @Bean
        fun configuration(): Configuration = Configuration.Builder()
            .uri(databaseServer.getBoltUrl())
            .build()
    }

    @Test
    @DisplayName("Create xyz")
    fun testCreateXYZ() {
        // ...
    }
}

class KtNeo4jContainer(val imageName: String) : Neo4jContainer<KtNeo4jContainer>(imageName)
How can I extract the databaseServer definition and the @TestConfiguration? I tried different ways of creating a base class and having ContainerTest extend it, but it is not working. From what I understand, static attributes are not inherited in Kotlin.
Below is my solution for sharing the same container between tests.
@Testcontainers
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
abstract class IntegrationTest {

    companion object {
        @JvmStatic
        private val mongoDBContainer = MongoDBContainer(DockerImageName.parse("mongo:4.0.10"))
            .waitingFor(HostPortWaitStrategy())

        @BeforeAll
        @JvmStatic
        fun beforeAll() {
            mongoDBContainer.start()
        }

        @JvmStatic
        @DynamicPropertySource
        fun registerDynamicProperties(registry: DynamicPropertyRegistry) {
            registry.add("spring.data.mongodb.host", mongoDBContainer::getHost)
            registry.add("spring.data.mongodb.port", mongoDBContainer::getFirstMappedPort)
        }
    }
}
The key here is not to use the @Container annotation, as it closes the newly created container as soon as the first test subclass has executed its tests.
The start() call in beforeAll() initializes the container only once (upon the first subclass test execution), then does nothing while the container is running.
In theory we shouldn't need this hack; according to
https://www.testcontainers.org/test_framework_integration/junit_5/
a static container should not be closed until all tests of all subclasses are finished, but it does not work that way and I don't know why. It would be nice to have an answer on that :).
I had the same issue (making Spring Boot + Kotlin + Testcontainers work together) and after searching the web for (quite) a while I found this nice solution: https://github.com/larmic/testcontainers-junit5. You'll just have to adapt it to your database.
I faced a very similar issue with Kotlin and Spring Boot 2.4.0.
One way to reuse a single Testcontainers configuration is through initializers, e.g.:
https://dev.to/silaev/the-testcontainers-mongodb-module-and-spring-data-mongodb-in-action-53ng or https://nirajsonawane.github.io/2019/12/25/Testcontainers-With-Spring-Boot-For-Integration-Testing/ (Java versions)
I also wanted to use the new approach with dynamic properties, and it worked out of the box in Java. In Kotlin I did something like the following (I wasn't able to get the @Testcontainers annotations working for some reason). It's not very elegant, but it is a pretty simple solution that worked for me:
MongoContainerConfig class:
import org.testcontainers.containers.MongoDBContainer

class MongoContainerConfig {
    companion object {
        @JvmStatic
        val mongoDBContainer = MongoDBContainer("mongo:4.4.2")
    }

    init {
        mongoDBContainer.start()
    }
}
Test class:
@SpringBootTest(
    classes = [MongoContainerConfig::class]
)
internal class SomeTest {
    companion object {
        @JvmStatic
        @DynamicPropertySource
        fun setProperties(registry: DynamicPropertyRegistry) {
            registry.add("mongodb.uri") {
                MongoContainerConfig.mongoDBContainer.replicaSetUrl
            }
        }
    }
}
The disadvantage is having this properties block in every test class, which suggests that the approach with initializers may be preferable here.

How to correct "verify should appear after all code under test has been exercised" when verify is last?

I get the error "verify should appear after all code under test has been exercised" with the following:
class CowTest extends MockFactory {
  Cow.init(testCowProcesses)

  @Test
  def noProcessesTest: Unit = {
    val cow: Cow = Cow(testCowProcesses)
    cow.simulateOneDay(0 nanoseconds)
  }

  @Test
  def processSimulationTest: Unit = {
    val NUMBER_OF_TRIES: Int = 10
    val cow: Cow = Cow(testCowProcesses)
    for (ii <- 0 until NUMBER_OF_TRIES) {
      cow.simulateOneDay(0 nanoseconds)
    }
    (cow.metabolicProcess.simulateOneDay _).verify(0 nanoseconds).repeated(NUMBER_OF_TRIES)
  }
}
testCowProcesses is defined in another file, like this (abbreviated):
object CowTesters extends MockFactory {
  val metProc = stub[MetabolicProcess]
  (metProc.replicate _).when().returns(metProc)
  val testCowProcesses = CowProcesses(metProc)
}
I don't quite understand the error message. If I comment out the verify line, the test runs. Alternatively, if I comment out the first test, the second test can run. There are no other tests in the test class. This seems to indicate that the stub objects cannot be reused, as they could be in Mockito (I'm adapting code from Mockito).
Is the best solution to re-instantiate the mock objects, perhaps by converting CowTesters into a class?
Edit:
I confirmed the above suggestion works (not sure if it is the best), but in the meantime I did something a bit more convoluted to get me through compiles:
//TODO: once all tests are converted to ScalaMock,
//TODO: just make this a class with a companion object
trait CowTesters extends MockFactory {
  val metProc = stub[MetabolicProcess]
  (metProc.replicate _).when().returns(metProc)
  val testCowProcesses = CowProcesses(metProc)
}

object CowTesters extends CowTesters {
  def apply(): CowTesters = new CowTesters {}
}
From your code above, it seems you are trying to use either JUnit or TestNG. ScalaMock doesn't support either of those frameworks directly, which is why you are struggling with the verification of mocks.
You need to implement your tests using either ScalaTest or Specs2. See http://scalamock.org/user-guide/integration/
The conversion from JUnit to ScalaTest should be pretty straightforward if you switch to e.g. a FunSuite: http://www.scalatest.org/user_guide/selecting_a_style
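As a rough sketch of what that conversion could look like for the second test above, assuming ScalaTest's AnyFunSuite (Cow, CowProcesses and MetabolicProcess are the question's own types; this is not tested against the original code):

import org.scalamock.scalatest.MockFactory
import org.scalatest.funsuite.AnyFunSuite
import scala.concurrent.duration._

// Hypothetical conversion of CowTest to ScalaTest + ScalaMock.
class CowTest extends AnyFunSuite with MockFactory {

  test("processSimulationTest") {
    val NUMBER_OF_TRIES = 10
    // Creating fresh stubs inside each test avoids reusing recorded interactions between tests.
    val metProc = stub[MetabolicProcess]
    (metProc.replicate _).when().returns(metProc)
    val cow = Cow(CowProcesses(metProc))

    for (_ <- 0 until NUMBER_OF_TRIES) {
      cow.simulateOneDay(0.nanoseconds)
    }

    // Same verification call as in the question, now running under ScalaTest.
    (metProc.simulateOneDay _).verify(0.nanoseconds).repeated(NUMBER_OF_TRIES)
  }
}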