jOOQ Custom Pojo & DAO Generation

Problem
I'm having some issues configuring mapping to custom Pojos during code generation.
Question
I have implemented RecordMapperProvider but wondering how I register it to be used during the code generation phase, or even if that is possible?
More Context
I love the fact that Pojos & DAOs are generated but I want to define the Pojo myself without too much configuration code. I am using ModelMapper to map from Type to Target:
@Override
public <R extends Record, E> RecordMapper<R, E> provide(RecordType<R> recordType,
        Class<? extends E> type) {
    if (mapping.containsKey(type)) {
        return record -> modelMapper.map(mapping.get(type), type);
    }
    return new DefaultRecordMapper<>(recordType, type);
}
If it helps, I am configuring jOOQ using a DefaultConfiguration object (which is a bean):
@Bean
public DefaultConfiguration configuration() {
    DefaultConfiguration jooqConfiguration = new DefaultConfiguration();
    jooqConfiguration.setConnectionProvider(dataSourceConnectionProvider());
    jooqConfiguration.setExecuteListenerProvider(new DefaultExecuteListenerProvider(
            jooqToSpringExceptionTranslator()));
    jooqConfiguration.setSQLDialect(
            SQLDialect.valueOf(env.getRequiredProperty("jooq.sql.dialect")));
    jooqConfiguration.setRecordMapperProvider(new JooqRecordMapperFactory(modelMapper()));
    return jooqConfiguration;
}
And then for Code Generation I am configuring it in gradle:
jooq {
    version = '3.10.5'
    edition = 'OSS'
    myDb(sourceSets.getByName("main")) {
        jdbc {
            driver = dbDriver
            url = dbUrl
            user = dbUsername
        }
        generator {
            name = 'org.jooq.util.JavaGenerator'
            strategy {
                name = 'org.jooq.util.DefaultGeneratorStrategy'
            }
            database {
                name = 'org.jooq.util.postgres.PostgresDatabase'
                inputSchema = dbSchema
            }
            generate {
                relations = true
                deprecated = false
                records = true
                immutablePojos = true
                fluentSetters = true
            }
            target {
                packageName = 'com.textiq.quinn.common.dao.model.generated'
            }
        }
    }
}
I am sure there is a disconnect here between the two configurations, but I can't glean from the documentation how to sync them. Ideally I want jOOQ to generate POJOs (based on the mapping that ModelMapper provides in my implementation of RecordMapperProvider) and also have jOOQ provide the DAOs for these POJOs. Is this possible? The documentation states:
If you're using jOOQ's code generator, you can configure it to generate POJOs for you, but you're not required to use those generated POJOs. You can use your own. See the manual's section about POJOs with custom RecordMappers to see how to modify jOOQ's standard POJO mapping behaviour.
Source: https://www.jooq.org/doc/3.9/manual/sql-execution/fetching/pojos/
Which to me indicates the possibility of this but only leads me to implementing RecordMapperProvider and nothing after that.

I have implemented RecordMapperProvider but wondering how I register it to be used during the code generation phase, or even if that is possible?
No, it's not possible, out of the box.
I love the fact that Pojos & DAOs are generated but I want to define the Pojo myself without too much configuration code
Then I suggest turning off the generation of POJOs and DAOs and rolling your own: either create manual implementations of your DAOs, or extend the JavaGenerator to generate them.
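For example, with the Gradle configuration from the question, a minimal sketch of the generate block with POJO and DAO generation switched off (immutablePojos is dropped since no POJOs are generated; the other flags are unchanged):
generate {
    relations = true
    deprecated = false
    records = true
    pojos = false // hand-write the POJOs instead
    daos = false  // hand-write the DAOs instead
    fluentSetters = true
}
A hand-rolled DAO can then extend jOOQ's DAOImpl base class. The table, record, POJO, and ID types below are placeholders:
public class PersonDao extends DAOImpl<PersonRecord, Person, Long> {

    public PersonDao(Configuration configuration) {
        // PERSON is the generated table; Person is your hand-written POJO
        super(PERSON, Person.class, configuration);
    }

    @Override
    protected Long getId(Person person) {
        return person.getId();
    }
}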

I'm a few years late to the party, but I actually found a very simple way to do this. I'll admit that it's a little brittle, but you can refine it further to suit your needs.
Create a new module in your gradle project, for example called jooq-generator
Add jooq-codegen as a compileOnly dependency to the module
Create a new class in the module:
public class Generator extends JavaGenerator {
    @Override
    public boolean generatePojos() {
        return false;
    }
}
Create a new class in the module:
public class MyGeneratorStrategy extends DefaultGeneratorStrategy {
    @Override
    public String getJavaPackageName(Definition definition, Mode mode) {
        if (mode != Mode.POJO) {
            return super.getJavaPackageName(definition, mode);
        }
        return "com.example.my.model.package.prefix";
    }
}
Add the module as a dependency of the jooqGenerator configuration, as shown below
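A minimal sketch of that dependency declaration, assuming the gradle-jooq-plugin's jooqGenerator configuration:
dependencies {
    // puts the custom Generator and MyGeneratorStrategy on the code generator's classpath
    jooqGenerator project(":jooq-generator")
}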
Add your new classes to the jooq config
jooq {
    configurations {
        main {
            generationTool {
                generator {
                    name = 'com.example.my.package.name.Generator'
                    strategy {
                        name = 'com.example.my.package.name.MyGeneratorStrategy'
                    }
                }
            }
        }
    }
}
The DAOs will now be generated using that package prefix for the POJOs instead of the generated package name.


Which design pattern to use for using different subclasses based on input [closed]

There is an interface called Processor, which has two implementations SimpleProcessor and ComplexProcessor.
Now I have a process, which consumes an input, and then using that input decides whether it should use SimpleProcessor or ComplexProcessor.
Current solution : I was thinking to use Abstract Factory, which will generate the instance on the basis of the input.
But the issue is that I don't want new instances. I want to use already instantiated objects. That is, I want to re-use the instances.
That means Abstract Factory is absolutely the wrong pattern to use here, as it is meant for creating new objects on the basis of type.
Another thing, that our team normally does is to create a map from input to the corresponding processor instance. And at runtime, we can use that map to get the correct instance on the basis of input.
This feels like an ad hoc solution.
I want this to be extendable : new input types can be mapped to new processor types.
Is there some standard way to solve this?
You can use a variation of the Chain of Responsibility pattern.
It will scale far better than using a Map (or hash table in general).
This variation will support dependency injection and is very easy to extend (without breaking any code or violating the Open-Closed principle).
As opposed to the classic version, handlers do not need to be explicitly chained; the classic version scales very badly.
The pattern uses polymorphism to enable extensibility and is therefore targeting an object oriented language.
The pattern is as follows:
The client API is a container class that manages a collection of input handlers (for example SimpleProcessor and ComplexProcessor).
Each handler is only known to the container by a common interface and unknown to the client.
The collection of handlers is passed to the container via the constructor (to enable optional dependency injection).
The container accepts the predicate (input) and passes it on to the anonymous handlers by iterating over the handler collection.
Each handler now decides based on the input if it can handle it (return true) or not (return false).
If a handler returns true (to signal that the input was successfully handled), the container will break further input processing by other handlers (alternatively, use a different criteria e.g., to allow multiple handlers to handle the input).
In the following very basic example implementation, the order of handler execution is simply defined by their position in their container (collection).
If this isn't sufficient, you can simply implement a priority algorithm.
Implementation (C#)
Below is the container. It manages the individual handler implementations using polymorphism. Since handler implementations are only known by their common interface, the container scales extremely well: simply add/inject an additional handler implementation.
The container is actually used directly by the client (whereas the handlers are hidden from the client and remain anonymous to the container).
interface IInputProcessor
{
    void Process(object input);
}

class InputProcessor : IInputProcessor
{
    private IEnumerable<IInputHandler> InputHandlers { get; }

    // Constructor.
    // Optionally use an IoC container to inject the dependency (a collection of input handlers).
    public InputProcessor(IEnumerable<IInputHandler> inputHandlers)
    {
        this.InputHandlers = inputHandlers;
    }

    // Method to handle the input.
    // The input is then delegated to the input handlers.
    public void Process(object input)
    {
        foreach (IInputHandler inputHandler in this.InputHandlers)
        {
            if (inputHandler.TryHandle(input))
            {
                return;
            }
        }
    }
}
Below are the input handlers.
To add new handlers i.e. to extend input handling, simply implement the IInputHandler interface and add it to a collection which is passed/injected to the container (IInputProcessor):
interface IInputHandler
{
    bool TryHandle(object input);
}

class SimpleProcessor : IInputHandler
{
    public bool TryHandle(object input)
    {
        // Equals performs a value comparison; == on a boxed int would compare references
        if (input.Equals(1))
        {
            //TODO::Handle input
            return true;
        }
        return false;
    }
}

class ComplexProcessor : IInputHandler
{
    public bool TryHandle(object input)
    {
        if (input.Equals(3))
        {
            //TODO::Handle input
            return true;
        }
        return false;
    }
}
Usage Example
public class Program
{
    public static void Main()
    {
        /* Setup the Chain of Responsibility.
           Preferably configure an IoC container. */
        var inputHandlers = new List<IInputHandler>
        {
            new SimpleProcessor(),
            new ComplexProcessor()
        };
        IInputProcessor inputProcessor = new InputProcessor(inputHandlers);

        /* Use the handler chain */
        int input = 3;
        inputProcessor.Process(input); // Will execute the ComplexProcessor

        input = 1;
        inputProcessor.Process(input); // Will execute the SimpleProcessor
    }
}
It is possible to use the Strategy pattern in combination with the Factory pattern. Factory objects can be cached so that instances are reused rather than recreated whenever they are needed.
As an alternative to caching, it is possible to use the Singleton pattern. In ASP.NET Core this is pretty simple: if you have a DI container, just make sure the instance is registered with a singleton lifetime.
Let's start with the first example. We need an enum of ProcessorType:
public enum ProcessorType
{
    Simple, Complex
}
Then this is our abstraction of processors:
public interface IProcessor
{
    DateTime DateCreated { get; }
}
And its concrete implementations:
public class SimpleProcessor : IProcessor
{
    public DateTime DateCreated { get; } = DateTime.Now;
}

public class ComplexProcessor : IProcessor
{
    public DateTime DateCreated { get; } = DateTime.Now;
}
Then we need a factory with cached values:
public class ProcessorFactory
{
    private static readonly IDictionary<ProcessorType, IProcessor> _cache
        = new Dictionary<ProcessorType, IProcessor>()
        {
            { ProcessorType.Simple, new SimpleProcessor() },
            { ProcessorType.Complex, new ComplexProcessor() }
        };

    public IProcessor GetInstance(ProcessorType processorType)
    {
        return _cache[processorType];
    }
}
And code can be run like this:
ProcessorFactory processorFactory = new ProcessorFactory();
Thread.Sleep(3000);
var simpleProcessor = processorFactory.GetInstance(ProcessorType.Simple);
Console.WriteLine(simpleProcessor.DateCreated); // OUTPUT: 2022-07-07 8:00:01
ProcessorFactory processorFactory_1 = new ProcessorFactory();
Thread.Sleep(3000);
var complexProcessor = processorFactory_1.GetInstance(ProcessorType.Complex);
Console.WriteLine(complexProcessor.DateCreated); // OUTPUT: 2022-07-07 8:00:01
The second way
The second way is to use a DI container, so we need to modify our factory to get its instances from the dependency injection container:
public class ProcessorFactoryByDI
{
    private readonly IDictionary<ProcessorType, IProcessor> _cache;

    public ProcessorFactoryByDI(
        SimpleProcessor simpleProcessor,
        ComplexProcessor complexProcessor)
    {
        _cache = new Dictionary<ProcessorType, IProcessor>()
        {
            { ProcessorType.Simple, simpleProcessor },
            { ProcessorType.Complex, complexProcessor }
        };
    }

    public IProcessor GetInstance(ProcessorType processorType)
    {
        return _cache[processorType];
    }
}
And if you use ASP.NET Core, then you can declare your objects as singletons like this:
services.AddSingleton<SimpleProcessor>();
services.AddSingleton<ComplexProcessor>();
Read more about object lifetimes in the ASP.NET Core dependency injection documentation.

Is it possible to add completion items to a Microsoft Language Server in runtime?

I am trying to develop an IntelliJ plugin which provides a Language Server with the help of lsp4intellij by Ballerina.
The thing is, I've got a special condition: the list of completion items should be editable at runtime.
But I've not found any way to communicate new CompletionItems to the LanguageServer process once it's running.
My current idea is to add an action to the plugin which builds a new jar and then restarts the server with the new jar, using the Java Compiler API.
The problem with that is, I need the source code from the plugin project, including the Gradle dependencies, to be accessible from the running plugin... any ideas?
If your requirement is to modify the completion items (coming from the language server) before displaying them in the IntelliJ UI, you can do that by implementing LSP4IntelliJ's LSPExtensionManager in your plugin.
Currently we do not have proper documentation for LSP4IntelliJ's extension points, but you can refer to our Ballerina IntelliJ plugin as a reference implementation; it implements a Ballerina LSP extension manager that overrides/modifies completion items at client runtime.
For those who might stumble upon this - it is indeed possible to change the set of CompletionItems the LanguageServer provides at runtime.
I simply edited the TextDocumentService.java (the library I used is LSP4J).
It works like this:
The main function of the LanguageServer needs to be started with an additional argument, which is the path to the config file in which you define the CompletionItems.
Being called from LSP4IntelliJ it would look like this:
String[] command = new String[]{"java", "-jar",
"path\\to\\LangServer.jar", "path\\to\\config.json"};
IntellijLanguageClient.addServerDefinition(new RawCommandServerDefinition("md,java", command));
The path String will then be passed through to the constructor of your CustomTextDocumentService.java, which will parse the config.json in a new Timer thread.
An Example:
public class CustomTextDocumentService implements TextDocumentService {

    private List<CompletionItem> providedItems;
    private String pathToConfig;

    public CustomTextDocumentService(String pathToConfig) {
        this.pathToConfig = pathToConfig;
        Timer timer = new Timer();
        timer.schedule(new ReloadCompletionItemsTask(), 0, 10000);
        loadCompletionItems();
    }

    @Override
    public CompletableFuture<Either<List<CompletionItem>, CompletionList>> completion(CompletionParams completionParams) {
        return CompletableFuture.supplyAsync(() -> {
            List<CompletionItem> completionItems;
            completionItems = this.providedItems;
            // Return the list of completion items.
            return Either.forLeft(completionItems);
        });
    }

    @Override
    public void didOpen(DidOpenTextDocumentParams didOpenTextDocumentParams) {
    }

    @Override
    public void didChange(DidChangeTextDocumentParams didChangeTextDocumentParams) {
    }

    @Override
    public void didClose(DidCloseTextDocumentParams didCloseTextDocumentParams) {
    }

    @Override
    public void didSave(DidSaveTextDocumentParams didSaveTextDocumentParams) {
    }

    private void loadCompletionItems() {
        providedItems = new ArrayList<>();
        CustomParser customParser = new CustomParser(pathToConfig);
        ArrayList<String> variables = customParser.getTheParsedItems();
        for (String variable : variables) {
            String itemTxt = "$" + variable + "$";
            CompletionItem completionItem = new CompletionItem();
            completionItem.setInsertText(itemTxt);
            completionItem.setLabel(itemTxt);
            completionItem.setKind(CompletionItemKind.Snippet);
            completionItem.setDetail("CompletionItem");
            providedItems.add(completionItem);
        }
    }

    class ReloadCompletionItemsTask extends TimerTask {
        @Override
        public void run() {
            loadCompletionItems();
        }
    }
}
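For reference, the config.json consumed by CustomParser could be as simple as a list of variable names; this format is entirely hypothetical and depends on how your CustomParser is written:
{
  "variables": ["firstName", "lastName", "orderId"]
}
Each entry then becomes a snippet completion item of the form $variable$.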

Spring Data Rest ResourceProcessor not applied on Projections

I am using a ResourceProcessor to add additional links to my resource object when listed in a collection or fetched individually. However, when I apply a projection (or an excerpt project) to my repository, the ResourceProcessor does not get run and thus my links for that resource do not get created. Is there a means to allow my custom resource links to be added to a resource regardless of how the resource content is projected?
I think this issue is describing your case:
https://jira.spring.io/browse/DATAREST-713
Currently, spring-data-rest does not offer functionality to solve your problem.
We are using a little workaround that still needs a separate ResourceProcessor for each projection but we do not need to duplicate the link logic:
We have a base class that gets the underlying entity for a projection, invokes the entity's ResourceProcessor, and applies the resulting links to the projection.
Entity is a common interface for all our JPA entities - but I think you could also use org.springframework.data.domain.Persistable or org.springframework.hateoas.Identifiable.
/**
 * Projections need their own resource processors in spring-data-rest.
 * To avoid code duplication the ProjectionResourceProcessor delegates the link creation to
 * the resource processor of the underlying entity.
 * @param <E> entity type the projection is associated with
 * @param <T> the resource type that this ResourceProcessor is for
 */
public class ProjectionResourceProcessor<E extends Entity, T> implements ResourceProcessor<Resource<T>> {

    private final ResourceProcessor<Resource<E>> entityResourceProcessor;

    public ProjectionResourceProcessor(ResourceProcessor<Resource<E>> entityResourceProcessor) {
        this.entityResourceProcessor = entityResourceProcessor;
    }

    @SuppressWarnings("unchecked")
    @Override
    public Resource<T> process(Resource<T> resource) {
        if (resource.getContent() instanceof TargetAware) {
            TargetAware targetAware = (TargetAware) resource.getContent();
            if (targetAware != null
                    && targetAware.getTarget() != null
                    && targetAware.getTarget() instanceof Entity) {
                E target = (E) targetAware.getTarget();
                resource.add(entityResourceProcessor.process(new Resource<>(target)).getLinks());
            }
        }
        return resource;
    }
}
An implementation of such a resource processor would look like this:
@Component
public class MyProjectionResourceProcessor extends ProjectionResourceProcessor<MyEntity, MyProjection> {

    @Autowired
    public MyProjectionResourceProcessor(EntityResourceProcessor resourceProcessor) {
        super(resourceProcessor);
    }
}
The implementation itself just takes the ResourceProcessor that can handle the entity class and passes it to our ProjectionResourceProcessor. It does not contain any link-creation logic of its own.
Here is a generic solution:
@Component
public class ProjectionProcessor implements RepresentationModelProcessor<EntityModel<TargetAware>> {

    private final RepresentationModelProcessorInvoker processorInvoker;

    public ProjectionProcessor(@Lazy RepresentationModelProcessorInvoker processorInvoker) {
        this.processorInvoker = processorInvoker;
    }

    @Override
    public EntityModel<TargetAware> process(EntityModel<TargetAware> entityModel) {
        TargetAware content = entityModel.getContent();
        if (content != null) {
            entityModel.add(processorInvoker.invokeProcessorsFor(EntityModel.of(content.getTarget())).getLinks());
        }
        return entityModel;
    }
}
It gets the links for the original entities and adds them to the corresponding projections.

Hazelcast 3.6.1 "There is no suitable de-serializer for type" exception

I am using Hazelcast 3.6.1 to read from a Map. The object class stored in the map is called Schedule.
I have configured a custom serializer on the client side like this.
ClientConfig config = new ClientConfig();
SerializationConfig sc = config.getSerializationConfig();
sc.addSerializerConfig(add(new ScheduleSerializer(), Schedule.class));
...
private SerializerConfig add(Serializer serializer, Class<? extends Serializable> clazz) {
    SerializerConfig sc = new SerializerConfig();
    sc.setImplementation(serializer).setTypeClass(clazz);
    return sc;
}
The map is created like this
private final IMap<String, Schedule> map = client.getMap("schedule");
If I get from the map using schedule id as key, the map returns the correct value e.g.
return map.get("zx81");
If I try to use an SQL predicate e.g.
return new ArrayList<>(map.values(new SqlPredicate("statusActive")));
then I get the following error
Exception in thread "main" com.hazelcast.nio.serialization.HazelcastSerializationException: There is no suitable de-serializer for type 2. This exception is likely to be caused by differences in the serialization configuration between members or between clients and members.
The custom serializer is using Kryo to serialize (based on this blog http://blog.hazelcast.com/comparing-serialization-methods/)
public class ScheduleSerializer extends CommonSerializer<Schedule> {

    @Override
    public int getTypeId() {
        return 2;
    }

    @Override
    protected Class<Schedule> getClassToSerialize() {
        return Schedule.class;
    }
}
The CommonSerializer is defined as
public abstract class CommonSerializer<T> implements StreamSerializer<T> {

    protected abstract Class<T> getClassToSerialize();

    @Override
    public void write(ObjectDataOutput objectDataOutput, T object) {
        Output output = new Output((OutputStream) objectDataOutput);
        Kryo kryo = KryoInstances.get();
        kryo.writeObject(output, object);
        output.flush(); // do not close!
        KryoInstances.release(kryo);
    }

    @Override
    public T read(ObjectDataInput objectDataInput) {
        Input input = new Input((InputStream) objectDataInput);
        Kryo kryo = KryoInstances.get();
        T result = kryo.readObject(input, getClassToSerialize());
        input.close();
        KryoInstances.release(kryo);
        return result;
    }

    @Override
    public void destroy() {
        // empty
    }
}
Do I need to do any configuration on the server side? I thought that the client config would be enough.
I am using Hazelcast client 3.6.1 and have one node/member running.
Queries require the nodes to know about the classes as the bytestream has to be deserialized to access the attributes and query them. This means that when you want to query on objects you have to deploy the model classes (and serializers) on the server side as well.
With key-based access, on the other hand, we do not need to look into the values (nor into the keys, since we compare the byte arrays of the keys) and just send the result back. That way neither model classes nor serializers have to be available on the Hazelcast nodes.
I hope that makes sense.
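For illustration, here is a minimal sketch of the member-side registration, assuming the same Schedule and ScheduleSerializer classes are put on the member's classpath:
import com.hazelcast.config.Config;
import com.hazelcast.config.SerializerConfig;
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;

public class Member {
    public static void main(String[] args) {
        Config config = new Config();
        // Mirror the client's serializer registration so that members
        // can deserialize Schedule values when evaluating predicates.
        config.getSerializationConfig().addSerializerConfig(
                new SerializerConfig()
                        .setImplementation(new ScheduleSerializer())
                        .setTypeClass(Schedule.class));
        HazelcastInstance member = Hazelcast.newHazelcastInstance(config);
    }
}
The same registration can also be done declaratively in the member's hazelcast.xml serialization section.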

How to mock method call from other class in Rhino Mock AAA?

I have the following code(simplified).
public class OrderProcessor
{
    public virtual string PlaceOrder(string test)
    {
        OrderParser orderParser = new OrderParser();
        string tester = orderParser.ParseOrder(test);
        return tester + " here";
    }
}

public class OrderParser
{
    public virtual string ParseOrder(string test)
    {
        if (!string.IsNullOrEmpty(test.Trim()))
        {
            if (test == "Test1")
                return "Test1";
            else
            {
                return "Hello";
            }
        }
        else
            return null;
    }
}
My test is as follows -
public class OrderTest
{
    public void TestParser()
    {
        // Arrange
        var client = MockRepository.GenerateMock<OrderProcessor>();
        var spec = MockRepository.GenerateStub<OrderParser>();
        spec.Stub(x => x.ParseOrder("test")).IgnoreArguments().Return("Test1");

        // How to pass spec to client so that it uses the same?
    }
}
Now how do I test the client so that it uses the mocked method from OrderParser?
I can mock the OrderParser, but how do I pass that to the mocked OrderProcessor class?
Please do let me know.
Thanks in advance.
I'm a little confused by your test since you are not really testing anything except that RhinoMocks works. You create two mocks and then do some assertions on them. You haven't even tested your real classes.
You need to do some dependency injection if you really want to get a good unit test. You can quickly refactor your code to use interfaces and dependency injection to make your test valid.
Start by extracting an interface from your OrderParser class:
public interface IOrderParser
{
    String ParseOrder(String value);
}
Now make sure your OrderParser class implements that interface:
public class OrderParser : IOrderParser { ... }
You can now refactor your OrderProcessor class to take in an instance of an IOrderParser object through its constructor. In this way you "inject" the dependency into the class.
public class OrderProcessor
{
    IOrderParser _orderParser;

    public OrderProcessor(IOrderParser orderParser)
    {
        _orderParser = orderParser;
    }

    public virtual string PlaceOrder(string val)
    {
        string tester = _orderParser.ParseOrder(val);
        return tester + " here";
    }
}
In your test you only want to mock out the dependency and not the SUT (Subject Under Test). Your test would look something like this:
public class OrderTest
{
    public void TestParser()
    {
        // Arrange
        var spec = MockRepository.GenerateMock<IOrderParser>();
        var client = new OrderProcessor(spec);
        spec.Stub(x => x.ParseOrder("test")).IgnoreArguments().Return("Test1");

        // Act
        var s = client.PlaceOrder("Blah");

        // Assert
        Assert.AreEqual("Test1 here", s); // PlaceOrder appends " here" to the stubbed value
    }
}
It is difficult for me to gauge what you are trying to do with your classes, but you should be able to get the idea from this. A few axioms to follow:
Use interfaces and composition over inheritance
Use dependency injection for external dependencies (inversion of control)
Test a single unit, and mock its dependencies
Only mock one level of dependencies. If you are testing class X which depends on Y which depends on Z, you should only be mocking Y and never Z.
Always test behavior and never implementation details
You seem to be on the right track, but need a little guidance. I would suggest reading material by Martin Fowler and Bob Martin to get up to speed.