I am trying to mock an org.apache.kafka.clients.producer.KafkaProducer.
But the mock fails because of the implementation of the class: the constructor validates its input and throws a NullPointerException when it is null.
How can I mock it?
The reason I think it fails: the first parameter of KafkaProducer is a ProducerConfig, which extends AbstractConfig. It validates the properties that are passed, and if they are null it throws a NullPointerException.
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerConfig}
import org.scalamock.scalatest.MockFactory
object MyProducerTest extends MockFactory with App {
  val mockKafkaProducer = mock[KafkaProducer[String, String]]
}
Exception in thread "main" java.lang.NullPointerException
at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:52)
at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:63)
at org.apache.kafka.clients.producer.ProducerConfig.<init>(ProducerConfig.java:340)
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:166)
at org.hs.My.tools.connector.MyProducerTest$$anon$1.<init>(MyProducerTest.scala:21)
at org.hs.My.tools.connector.MyProducerTest$.delayedEndpoint$org$hs$My$tools$connector$MyProducerTest$1(MyProducerTest.scala:21)
at org.hs.My.tools.connector.MyProducerTest$delayedInit$body.apply(MyProducerTest.scala:16)
at scala.Function0.apply$mcV$sp(Function0.scala:39)
at scala.Function0.apply$mcV$sp$(Function0.scala:39)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:17)
at scala.App.$anonfun$main$1$adapted(App.scala:80)
at scala.collection.immutable.List.foreach(List.scala:392)
at scala.App.main(App.scala:80)
at scala.App.main$(App.scala:78)
at org.hs.My.tools.connector.MyProducerTest$.main(MyProducerTest.scala:16)
at org.hs.My.tools.connector.MyProducerTest.main(MyProducerTest.scala)
I think you can try extending KafkaProducer with your own class that supplies a minimal valid config, and then mock that class:

class MyKafkaProducer extends KafkaProducer[String, String](new java.util.Properties() {
  // placeholder config values, just enough to satisfy ProducerConfig's validation
  put("bootstrap.servers", "localhost:9092")
  put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
  put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
})

val mockKafkaProducer = mock[MyKafkaProducer]
Or, mock the class above your KafkaProducer that makes the calls instead. Something like this:
# main:
class BusinessApp(kafkaProducer: KafkaProducer[String, String]) {
  def sendMessage(msg: String): Boolean = {
    kafkaProducer.send(new ProducerRecord[String, String]("some-topic", msg)) // "some-topic" is a placeholder
    true
  }
}
# tests:
val mockBusinessApp = mock[BusinessApp]
(mockBusinessApp.sendMessage _).expects("test").returns(true)
Then you're not mocking the lower-level API of a KafkaProducer.
As a note, you could use embedded Kafka so that you don't have to mock KafkaProducer at all, and actual messages get produced/consumed during tests.
Related
Using Byte Buddy, how can I create an enum with a constructor such as this one:
public enum EnumConstructorSample {
    STATE1(10),
    STATE2(15);

    public int count;

    EnumConstructorSample(int count) {
        this.count = count;
    }
}
I tried this code and it gives me an error.
Class enumClass = new ByteBuddy().makeEnumeration("STATE1", "STATE2")
        .name("DynamicEnum")
        .defineConstructor(Visibility.PACKAGE_PRIVATE)
        .withParameters(int.class)
        .intercept(FixedValue.value(1))
        .make()
        .load(EnumWithConstructor.class.getClassLoader(), ClassLoadingStrategy.Default.WRAPPER)
        .getLoaded();

System.out.println(enumClass.getDeclaredConstructors()[0]);
This is the error, and it happens in enumClass.getDeclaredConstructors():
Exception in thread "main" java.lang.VerifyError: Constructor must call super() or this() before return
Exception Details:
Location:
DynamicEnum.<init>(I)V #2: return
Reason:
Error exists in the bytecode
Bytecode:
0x0000000: 0457 b1
at java.lang.Class.getDeclaredConstructors0(Native Method)
at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671)
at java.lang.Class.getDeclaredConstructors(Class.java:2020)
at EnumWithConstructor.main(EnumWithConstructor.java:19)
For constructors, it is required to invoke a super constructor within the constructor. For enumerations, you'd need to invoke the Enum(String, int) constructor. You can implement this using MethodCall.invoke(...).onSuper().
If you wanted to achieve this, I'd recommend subclassing Enum manually, since you'd otherwise be defining multiple constructors for the enum you are creating, where Byte Buddy would invoke its own enum constructor and the fields would all keep their default values.
Rather, implement a method and return the value based on the constant's name. You can, for example, use a MethodDelegation with a @This Enum<?> parameter injection and switch over the name to return the correct value, as if it were stored in a field.
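For illustration, here is a minimal sketch of that second approach (written in Kotlin, but Byte Buddy is a plain JVM library, so it translates directly to Java; the names DynamicEnum and getCount and the hard-coded values are assumptions for this example, not from the question):

import net.bytebuddy.ByteBuddy
import net.bytebuddy.description.modifier.Visibility
import net.bytebuddy.dynamic.loading.ClassLoadingStrategy
import net.bytebuddy.implementation.MethodDelegation
import net.bytebuddy.implementation.bind.annotation.This

// Delegate that derives the "count" from the enum constant's name instead of a constructor-set field.
object CountDelegate {
    @JvmStatic
    fun count(@This value: Enum<*>): Int = when (value.name) {
        "STATE1" -> 10
        "STATE2" -> 15
        else -> 0
    }
}

fun main() {
    val enumClass = ByteBuddy()
        .makeEnumeration("STATE1", "STATE2")
        .name("DynamicEnum")
        // define a getCount() method instead of a constructor-initialized field
        .defineMethod("getCount", Int::class.java, Visibility.PUBLIC)
        .intercept(MethodDelegation.to(CountDelegate::class.java))
        .make()
        .load(CountDelegate::class.java.classLoader, ClassLoadingStrategy.Default.WRAPPER)
        .getLoaded()

    val state1 = enumClass.enumConstants[0]
    println(enumClass.getMethod("getCount").invoke(state1)) // expected to print 10
}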
Is there any way to create an instance of Derived but not call the constructor of Base?
open class Base(p: Int)
class Derived(p: Int) : Base(p)
You actually can do it
import sun.misc.Unsafe

open class Base(p: Int) {
    init {
        println("Base")
    }
}

class Derived(p: Int) : Base(p) {
    init {
        println("Derived")
    }
}

fun main() {
    val unsafe = Unsafe::class.java.getDeclaredField("theUnsafe").apply {
        isAccessible = true
    }.get(null) as Unsafe

    val x = unsafe.allocateInstance(Derived::class.java)
    println("X = $x")
}
But don't: this solution is a low-level mechanism that was designed to be used only by the core Java library, not by standard users. You will break the logic of OOP if you use it.
This is not possible. The constructor of the derived class has to call (any) constructor of the base class in order to initialise the contents (fields) of the base class.
This is the same in Java, except that the no-arg constructor is called implicitly if the base class has one; if you have to choose between constructors with parameters, you always have to call one explicitly, because you have to choose which values to pass into it.
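To make that concrete, a two-line sketch: Kotlin will not even compile a subclass that does not initialize its base class.

open class Base(p: Int)

class Derived(p: Int) : Base(p)  // OK: a Base constructor is called explicitly
// class Broken(p: Int) : Base   // does not compile: Base has a constructor and must be initialized here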
You must always call a constructor of a super-class to ensure that the foundation of the class is initialized. But you can work around your issue by providing a no-arg constructor in the base class. Something like this:
open class Base(p: Int?) {
    val p: Int? = p

    constructor() : this(null)
}

class Derived(p: Int) : Base()
The way you handle which constructor of the base class is default and which parameters are nullable, etc. will depend highly on the specific case.
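One consequence worth noting, shown in this small sketch using the classes above: when Derived goes through the no-arg Base constructor, the base property simply stays null.

fun main() {
    val d = Derived(42)
    println(d.p)  // prints "null": Derived(42) used Base's no-arg constructor, so p was never set
}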
I have this service (all in Kotlin):
@Service
class MyService {
    fun getSomeString(): String = "test"
}
And this integration test class:
@RunWith(SpringRunner::class)
@SpringBootTest
@EmbeddedKafka // used on some kafka tests
class BasicTests {
and method:
@Test
fun `test method count`() {
    // here I have a kafka producer sending a message to a topic that ends up
    // calling myService.getSomeString via @KafkaListener from other services
    verify(someObjectRelatedToMyService, atLeast(1)).getSome()
}
In the place of someObjectRelatedToMyService I tried to use
@Autowired
lateinit var myService: MyService
But then I got Argument passed to verify() is of type MyService and is not a mock!
But when I use
@Mock
lateinit var myMock: MyService
I get Actually, there were zero interactions with this mock.
And actually, to me it makes sense, since my mock wasn't called, but my real service at the application was.
Is it possible to count method calls from my real object?
You can spy on the real object to count method calls on it like this:
@Test
fun `test method count`() {
    Mockito.spy(someObjectRelatedToMyService)
    verify(someObjectRelatedToMyService, atLeast(1)).getSome()
}
As you can see, the only thing you have to do is to call the spy method which enables tracking interactions with the target object.
When adding this call before the verify method, you should not get the error anymore that the object is not a mock.
[Posting here since no rep to comment] Have you tried using a @Spy? Then you could specify which methods to mock and which methods to call. I suppose you can also apply Mockito.verify on spies...
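To expand on that with a sketch: for the spy to see the calls made by the application in a Spring Boot test like this one, it has to replace the MyService bean that the @KafkaListener actually invokes, which is what Spring Boot's @SpyBean does. This assumes the MyService/getSomeString names from the question and spring-boot-starter-test's Mockito support:

import org.junit.Test
import org.junit.runner.RunWith
import org.mockito.Mockito.atLeast
import org.mockito.Mockito.verify
import org.springframework.boot.test.context.SpringBootTest
import org.springframework.boot.test.mock.mockito.SpyBean
import org.springframework.kafka.test.context.EmbeddedKafka
import org.springframework.test.context.junit4.SpringRunner

@RunWith(SpringRunner::class)
@SpringBootTest
@EmbeddedKafka
class BasicTests {

    // The real MyService bean is wrapped in a Mockito spy, so the @KafkaListener
    // still runs the real implementation while interactions are recorded.
    @SpyBean
    lateinit var myService: MyService

    @Test
    fun `test method count`() {
        // ... send the Kafka message that triggers the listener, as in the question ...
        verify(myService, atLeast(1)).getSomeString()
    }
}

Two caveats: MyService must be open (or you need the inline mock maker / the kotlin-spring plugin) for Mockito to create the spy, and since the listener is asynchronous you may want a form like verify(myService, timeout(5000).atLeast(1)).getSomeString() to avoid a race.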
I am a beginner with Lagom, using it with Scala. Creating a microservice with it has been an awesome experience so far.
I referred to the following:
https://www.lagomframework.com/documentation/1.4.x/scala/ServiceClients.html#Binding-a-service-client
and I am trying out inter-service communication without a message broker, via remote calls. That means I want to inject ServiceB into ServiceAImpl and invoke a call on the client. This is for a scenario where I don't want the call to go via an event or the message broker; it should be a direct service-to-service call.
In Lagom Scala I have ServiceA and ServiceB. I have created a ServiceB client in ServiceAApplication and am trying to inject ServiceB into ServiceAImpl. I get an error during compilation saying the following:
Cannot find a value of type : [com.example.ServiceB]
override lazy val lagomServer = serverForServiceA
in the abstract class ServiceAApplication
Here is a snippet of the application loader; I get this error only when I inject ServiceB into the ServiceAImpl constructor.
Snippet of ServiceAApplicationLoader.scala:
trait ServiceAComponents extends LagomServerComponents
  with SlickPersistenceComponents
  with LagomConfigComponent
  with HikariCPComponents
  with LagomKafkaComponents {

  implicit def executionContext: ExecutionContext
  def environment: Environment
  implicit def materializer: Materializer

  override lazy val lagomServer = serverFor[ServiceA](wire[ServiceAImpl])
  lazy val serviceARepository = wire[ServiceARepository]
  lazy val jsonSerializerRegistry = ServiceASerializerRegistry

  persistentEntityRegistry.register(wire[ServiceAEntity])
  readSide.register(wire[ServiceAEventProcessor])
}
abstract class ServiceAApplication(context: LagomApplicationContext) extends LagomApplication(context)
  with ServiceAComponents
  with AhcWSComponents
  with SlickPersistenceComponents
  with LagomServiceClientComponents
  with LagomConfigComponent {

  lazy val serviceB = serviceClient.implement[ServiceB]
}
class ServiceAApplicationLoader extends LagomApplicationLoader {
  override def load(context: LagomApplicationContext) =
    new ServiceAApplication(context) with LagomServiceLocatorComponents

  override def loadDevMode(context: LagomApplicationContext) =
    new ServiceAApplication(context) with LagomDevModeComponents

  override def describeService = Some(readDescriptor[ServiceA])
}
Snippet of ServiceAImpl.scala:
class ServiceAImpl(registry: PersistentEntityRegistry, serviceARepository: ServiceARepository, serviceB: ServiceB)
                  (implicit ec: ExecutionContext) extends ServiceA {
  // One of the methods here calls serviceB, which is injected through the constructor.
When I compile I get the following error:
Cannot find a value of type : [com.example.ServiceB]
override lazy val lagomServer = serverForServiceA
Note: when I write the application loader in the following way, I don't get the error. But, as you will see below, I don't define a components trait and hence lose testability:
I don't have a trait like the ServiceAComponents above; instead everything is defined as below.
abstract class ServiceAApplication(context: LagomApplicationContext) extends LagomApplication(context)
  with ServiceAComponents
  with AhcWSComponents
  with SlickPersistenceComponents
  with LagomServiceClientComponents
  with LagomConfigComponent {

  override lazy val lagomServer = serverFor[ServiceA](wire[ServiceAImpl])
  lazy val serviceARepository = wire[ServiceARepository]
  lazy val jsonSerializerRegistry = ServiceASerializerRegistry

  persistentEntityRegistry.register(wire[ServiceAEntity])
  readSide.register(wire[ServiceAEventProcessor])

  lazy val serviceB = serviceClient.implement[ServiceB]
}
class ServiceAApplicationLoader extends LagomApplicationLoader {
  override def load(context: LagomApplicationContext) =
    new ServiceAApplication(context) with LagomServiceLocatorComponents

  override def loadDevMode(context: LagomApplicationContext) =
    new ServiceAApplication(context) with LagomDevModeComponents

  override def describeService = Some(readDescriptor[ServiceA])
}
This application loader works fine with runAll, but when I run the unit test there is no components trait, so I am not able to run it along with ServiceB.
Snippet of the unit test:
class ServiceAImplIntegrationTest extends AsyncWordSpec with Matchers with BeforeAndAfterAll {
  private val server = ServiceTest.startServer(ServiceTest.defaultSetup.withCassandra(true)) { ctx =>
    new ServiceAApplication(ctx) with LocalServiceLocator {
      override def additionalConfiguration: AdditionalConfiguration =
        super.additionalConfiguration ++ ConfigFactory.parseString(
          "cassandra-query-journal.eventual-consistency-delay = 0"
        )
    }
  }
Note: as you can see, the test does not follow the way it is done in the sample ItemIntegrationTest; instead of starting the server with a components trait, I start it with just ServiceAApplication, and hence the test fails saying ServiceB is not running. How do I deal with this?
Questions:
1. Is injecting ServiceB into ServiceAImpl and calling it with invoke the right way (without having a subscriber)?
2. How do I test this as an integration of ServiceA and ServiceB?
I have got the answer in the following link:
https://discuss.lightbend.com/t/inter-service-communication-with-out-message-broker-unable-to-inject-if-creating-component-extending-lagomservercomponent/3257
I changed the component definition as mentioned in the link above, and that solved the issue.
If I am modeling my value objects using Kotlin data classes, what is the best way to handle validation? Seems like the init block is the only logical place since it executes after the primary constructor.
data class EmailAddress(val address: String) {
    init {
        if (address.isEmpty() || !address.matches(Regex("^[a-zA-Z0-9]+@[a-zA-Z0-9]+(\\.[a-zA-Z]{2,})$"))) {
            throw IllegalArgumentException("${address} is not a valid email address")
        }
    }
}
Example using JSR-303:
The downside to this is that it requires load-time weaving.
@Configurable
data class EmailAddress(@Email val address: String) {

    @Autowired
    lateinit var validator: Validator

    init {
        validator.validate(this)
    }
}
It seems unreasonable to me to have object creation validation anywhere else but in the class constructor. This is the place responsible for the creation, so that is the place where the rules which define what is and isn't a valid instance should be. From a maintenance perspective it also makes sense to me as it would be the place where I would look for such rules if I had to guess.
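For what it's worth, the idiomatic Kotlin way to express such constructor-time rules is require in the init block (a minimal sketch; the check here is deliberately simplified):

data class EmailAddress(val address: String) {
    init {
        // require() throws IllegalArgumentException with this message when the condition is false
        require(address.contains("@")) { "$address is not a valid email address" }
    }
}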
I did make a comment, but I thought I would share my approach to validation instead.
First, I think it is a mistake to perform validation on instantiation. This will make the boundary between deserialization and handing over to your controllers messy. Also, to me, if you are sticking to a clean architecture, validation is part of your core logic, and you should ensure with tests on your core logic that it is happening.
So, to tackle this the way I want, I first define my own core validation API. Pure Kotlin. No frameworks or libraries. Keep it clean.
interface Validatable {
    /**
     * @throws [ValidationErrorException]
     */
    fun validate()
}

class ValidationErrorException(
    val errors: List<ValidationError>
) : Exception() {

    /**
     * Convenience method for getting a data object from the Exception.
     */
    fun toValidationErrors() = ValidationErrors(errors)
}

/**
 * Data object to represent the data of an Exception. Convenient for serialization.
 */
data class ValidationErrors(
    val errors: List<ValidationError>
)

data class ValidationError(
    val path: String,
    val message: String
)
Then I have framework-specific implementations, for example a javax.validation implementation:
open class ValidatableJavax : Validatable {

    companion object {
        val validator = Validation.buildDefaultValidatorFactory().validator!!
    }

    override fun validate() {
        val violations = validator.validate(this)
        val errors = violations.map {
            ValidationError(it.propertyPath.toString(), it.message)
        }.toMutableList()
        if (errors.isNotEmpty()) {
            throw ValidationErrorException(errors = errors)
        }
    }
}
The only problem with this is that the javax annotations don't play so well with Kotlin data objects, but here is an example of a class with validation:
import javax.validation.constraints.Positive

class MyObject(
    myNumber: BigDecimal
) : ValidatableJavax() {

    @get:Positive(message = "Must be positive")
    val myNumber: BigDecimal = myNumber
}
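A hypothetical usage of the classes above (this assumes a Bean Validation implementation such as Hibernate Validator is on the classpath):

import java.math.BigDecimal

fun main() {
    val obj = MyObject(myNumber = BigDecimal(-5))
    try {
        obj.validate()
    } catch (e: ValidationErrorException) {
        // roughly: ValidationErrors(errors=[ValidationError(path=myNumber, message=Must be positive)])
        println(e.toValidationErrors())
    }
}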
Actually, it looks like validation is not a responsibility of data classes. The word data speaks for itself: they are meant for data storage.
So if you would like to validate a data class, it makes perfect sense to put @get: validation annotations on the constructor arguments and validate outside of the data class, in the class responsible for construction.
Your second option is not to use a data class: just use a simple class and implement the whole logic in the constructor, passing the validator there.
Also, if you use the Spring Framework, you can make this class a bean with prototype scope, but chances are it will be absolutely uncomfortable to work with that kind of spaghetti code :)
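A minimal sketch of that idea; the factory, the @get:Email annotation and the names are just illustrative, not from the question:

import javax.validation.Validation
import javax.validation.constraints.Email

data class EmailAddress(@get:Email val address: String)

object EmailAddressFactory {
    private val validator = Validation.buildDefaultValidatorFactory().validator

    // The class responsible for construction runs the validation, not the data class itself.
    fun create(address: String): EmailAddress {
        val candidate = EmailAddress(address)
        val violations = validator.validate(candidate)
        require(violations.isEmpty()) { violations.joinToString { it.message } }
        return candidate
    }
}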
I disagree with the following statement:
Seems like the init block is the only logical place since it executes after the primary constructor.
Validation should not be done at construction time, because sometimes you need intermediate steps before getting a valid object, and it does not work well with Spring MVC, for example.
Maybe use a specific interface (as suggested in a previous answer) with a method dedicated to executing validation.
For the validation framework, I personally use valiktor, as I find it a lot less cumbersome than JSR-303.
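For reference, a sketch of what that looks like with valiktor, kept out of the constructor as argued above (based on valiktor's documented DSL; double-check the exact function names against the version you use):

import org.valiktor.functions.isEmail
import org.valiktor.functions.isNotBlank
import org.valiktor.validate

data class EmailAddress(val address: String)

// Validation is run explicitly (e.g. in core logic), not at construction time.
fun EmailAddress.checkValid(): EmailAddress = validate(this) {
    validate(EmailAddress::address).isNotBlank().isEmail()
}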