OptaPlanner - ConstraintProvider not found if in submodule

Still trying to run OptaPlanner on our projects; today I tried to split our application into multiple modules, but ran into some trouble with OptaPlanner.
We have multiple Gradle modules that look like this:
app -> Contains the @SpringBootApplication
referentialdata -> Contains almost all our entities / repository / etc
auth -> Contains the authentication logic
simulation -> Contains the solver logic
And they are all under the parent build.gradle file
It seems that if I move my ConstraintProvider from the base module (app), where the @SpringBootApplication is, to another module (simulation), I get the error below. All the other solver-related pieces (the SolverConfig.xml and the SolverService where I invoke the solver) are already in the simulation module, and the @SpringBootApplication is scanning the packages containing the ConstraintProvider: the entities and repositories under the same package are detected.
Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.optaplanner.core.config.solver.SolverConfig]: Factory method 'solverConfig' threw exception; nested exception is java.lang.IllegalStateException: No classes found that implement EasyScoreCalculator, ConstraintProvider or IncrementalScoreCalculator.
Neither was a property optaplanner.score-drl defined, nor a constraints.drl resource found.
Maybe your ConstraintProvider class is not in a subpackage of your @SpringBootApplication annotated class's package.
Maybe move your constraint provider class to your application class's (sub)package.
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:185) ~[spring-beans-5.3.1.jar:5.3.1]
at org.springframework.beans.factory.support.ConstructorResolver.instantiate(ConstructorResolver.java:651) ~[spring-beans-5.3.1.jar:5.3.1]
... 48 common frames omitted
Here is my configuration (the paths are good):
<?xml version="1.0" encoding="UTF-8"?>
<solver>
  <solutionClass>com.api.simulation.impl.domain.optimizer.AllocationProblemSolution</solutionClass>
  <entityClass>com.api.simulation.impl.domain.optimizer.OptimizerAffectation</entityClass>
  <termination>
    <secondsSpentLimit>160</secondsSpentLimit>
  </termination>
  <scoreDirectorFactory>
    <!-- <scoreDrl>com.api.simulation.optim/rules.drl</scoreDrl> -->
    <constraintProviderClass>com.api.simulation.impl.AllocationConstraintProvider</constraintProviderClass>
  </scoreDirectorFactory>
</solver>
Is there a way to make it work in submodules, or should the constraint provider be in the main module?

The error message says that this is intentional, and it also hints at what you need to do to fix that:
Maybe your ConstraintProvider class is not in a subpackage of your @SpringBootApplication annotated class's package.
It's been a while since that code was written, and we are currently evaluating whether or not we still need to place that limitation on users. I will update my answer here if/when we decide to change the behavior.
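Until that changes, the practical workaround is to keep the simulation module's packages under the application class's base package, so the default scan still finds the provider. A layout sketch, assuming the @SpringBootApplication class lives in com.api (the other package names are taken from the question's config):

```
app module:        com.api.Application                                   <- @SpringBootApplication
simulation module: com.api.simulation.impl.AllocationConstraintProvider  <- subpackage of com.api, so it is found
```

The Gradle module boundary itself is not the problem; only the Java package hierarchy matters for the scan.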

Related

AutoMapper Dependency Injection into Profile classes

I added dependencies to my profile class:
public class MyModelMappingProfile : Profile
{
    public MyModelMappingProfile(
        IDependency1 dependencyOne, IDependencyTwo dependencyTwo)
    {
        // mapping configuration using the dependencies
    }
}
When I start the service, it complains: System.MissingMethodException: No parameterless constructor defined for type 'MyModelMappingProfile'.
I found this solution, which solves the problem but is very manual. Is there something more generic? I haven't found an answer in the docs.
https://jimmybogard.com/automapper-usage-guidelines/
X DO NOT inject dependencies into profiles
Profiles are static configuration, and injecting dependencies into them can cause unknown behavior at runtime. If you need to use a dependency, resolve it as part of your mapping operation. You can also have your extension classes (resolvers, type converters, etc.) take dependencies directly.

Gradle extra properties not visible in a custom task defined in a subproject

I'm trying to reuse common logic among multiple Gradle tasks, similar to what was suggested in this answer, but I'm having trouble with extra project properties not being visible.
Boiled down, here's the problem. Say I have a root Gradle build script, build.gradle that sets an extra project property,
project.ext.myProp = 'myValue'
I have a subproject defined in settings.gradle,
include 'subproject'
and the subproject defines and uses a custom task that references that extra project property,
class CustomTask extends DefaultTask {
    CustomTask() {
        doFirst {
            println project.ext.myProp
        }
    }
}

task custom(type: CustomTask) {
    println 'custom task'
}
Executing this gives me:
FAILURE: Build failed with an exception.
...
* Exception is:
org.gradle.api.GradleScriptException: A problem occurred evaluating project ':subproject'.
...
Caused by: org.gradle.api.tasks.TaskInstantiationException: Could not create task of type 'CustomTask'.
...
Caused by: groovy.lang.MissingPropertyException: cannot get property 'myProp' on extra properties extension as it does not exist
...
BUILD FAILED
Note that this seems to work if:
the custom task is defined in the root project alongside the extra property
you use dynamic properties instead of extra properties, but those are deprecated
The recommended syntax for reading an extra property named foo in a build script is foo or project.foo (rather than ext.foo), which will also search the parent projects' (extra) properties. EDIT: In a task class, you can use project.foo.
It's important to note that extra properties are only meant for ad-hoc scripting in build scripts; task classes and plugins should not use them. A task class shouldn't reach out into the Gradle object model at all; instead, it should declare properties (and, if necessary, methods) that allow build scripts and/or plugins to supply it with all the information it needs. This makes it easier to understand, reuse, and document the task class, and makes it possible to declare inputs and outputs via @Input... and @Output... annotations.
PS: Instead of calling doFirst in a constructor, a task class usually has a method annotated with @TaskAction.
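Following that advice, the task class can declare its own input and let the build script wire the value in at configuration time. A sketch in the question's Groovy DSL (names as in the question; `project.myProp` searches the parent projects' properties, so the root project's value is found):

```groovy
class CustomTask extends DefaultTask {
    // the task declares its own input instead of reading project.ext at execution time
    @Input
    String myProp

    @TaskAction
    void run() {
        println myProp
    }
}

task custom(type: CustomTask) {
    myProp = project.myProp   // resolved at configuration time, walks up to the root project
}
```

Because the property is annotated with @Input, it also participates in up-to-date checking for free.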

Save and Load instances of objects created earlier via the Eclipse registry

I am currently experiencing a problem in my RCP application and wanted to ask if someone has stumbled over the same problem and can give me some valuable hints:
My RCP application allows plugins to provide implementations of a specific abstract class of my model (a singleton) to extend the model at runtime via the update manager. I instantiate these classes via
extensionPointImplementation.createExecutableExtension(..)
after parsing the Eclipse registry. I can serialize the created instances using the default Java serialization API.
Now to the problem: the plugin trying to deserialize the objects cannot find the class implementations of the model extensions, because there is no plugin dependency between the plugins. Nevertheless, it is not possible for me to create such a dependency, which would make the idea of extending the model at runtime obsolete.
Is it possible to solve this problem with the default Java serialization API, or do I have to implement my own serialization, which parses the Eclipse registry and creates the instances via the line shown above if all the necessary plugins are available, and otherwise throws an exception? If possible, I would like to base it on the default Java serialization API rather than write the serialization completely by myself.
Thanks.
You need to define a so-called buddy policy.
In the bundle trying to instantiate the class add
Eclipse-BuddyPolicy: registered
to the MANIFEST.MF.
In the bundle providing the class add
Eclipse-RegisterBuddy: <symbolic name of the bundle instantiating the class>
to the MANIFEST.MF.
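If changing the manifests is not an option, deserialization can also be pointed at a specific class loader, e.g. the contributing plugin's loader obtained while parsing the registry, by overriding ObjectInputStream.resolveClass. A plain-JDK sketch; the class name ClassLoaderObjectInputStream is illustrative, not an Eclipse API:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.ObjectInputStream;
import java.io.ObjectStreamClass;

// An ObjectInputStream that resolves classes through a caller-supplied
// ClassLoader before falling back to the default lookup.
class ClassLoaderObjectInputStream extends ObjectInputStream {
    private final ClassLoader loader;

    ClassLoaderObjectInputStream(InputStream in, ClassLoader loader) throws IOException {
        super(in);
        this.loader = loader;
    }

    @Override
    protected Class<?> resolveClass(ObjectStreamClass desc)
            throws IOException, ClassNotFoundException {
        try {
            // look the class up in the supplied (plugin) class loader first
            return Class.forName(desc.getName(), false, loader);
        } catch (ClassNotFoundException e) {
            // primitives and classes unknown to the plugin loader go the default route
            return super.resolveClass(desc);
        }
    }
}
```

If the contributing plugin is missing, Class.forName and the fallback both fail, which gives you the "throw an exception if the plugin is unavailable" behavior for free.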

Can't access package scoped field of depend plugin

I have a plugin A which exports the package foo.bar. In the package foo.bar there is an abstract class FooBar with default-scope members. In plugin B I'd like to extend FooBar within the same package and access the default-scoped fields.
Plugin A manifest:
.
Bundle-SymbolicName: A
Export-Package: foo.bar
.
Plugin B manifest:
.
Bundle-SymbolicName: B
Require-Bundle: A
.
Class FooBar in Plugin A:
package foo.bar;

public abstract class FooBar {
    int min = -1;
}
Class MyFooBar in Plugin B:
package foo.bar;

public class MyFooBar extends FooBar {
    public void setMin(int min) {
        this.min = min;
    }
}
The result:
..Caused by: java.lang.IllegalAccessError: tried to access field foo.bar.FooBar.min from class foo.bar.MyFooBar
In a normal Java environment I can access package-scoped members if I define my class in the same package. Apparently this is not so in an OSGi environment, is it?
Try making bundle B a fragment of A (if that is possible in your case). Then B can access package-private classes, because the package name and the class loader are the same.
Read the article Split Packages – An OSGi nightmare; it explains the problem clearly.
You are correct, OSGi bundles aren't the same as Java packages. If you truly want to share the same min value in both plugins, the best option is to share it through extension points. Extension points are the OSGi way to share data and code between plugins, and they lead to a solution that doesn't cause tight coupling between the plugins.
If plugin B truly must access the min value in plugin A, that might be an indication that those two classes should be in the same plugin.
Also see this question
More information about extension points can be found here
Eclipse plug-ins / OSGi bundles each have their own separate classloaders (and thus namespaces). Package private means same package and classloader.

'Ambiguous EJB reference "beanName" or more precise "beanInterface" should be specified'

I have a multi-module project where the EJB BarService in the bar project refers to a FooService EJB in the foo project. The @EJB annotation is used to inject the EJB.
@EJB
private FooService fooService;
I'm using IntellijIDEA 11, and it complains with
'Ambiguous EJB reference "beanName" or more precise "beanInterface" should be specified'.
This error only shows up for EJB references in a different module. If I use the beanName as suggested, the error goes away. However, I would prefer not to use it, since it would be hard to refactor the component name as it is a string.
Is this a bug in IDEA, or am I trying to do something wrong here?
(Noticed this and this asking the exact same question in the JetBrains forums, but there are no replies sadly).
The javadoc for javax.ejb.EJB is somewhat unclear on this point:
If no explicit linking information is provided and there is only one
session bean within the same application that exposes the matching
client view type, by default the EJB dependency resolves to that
session bean.
It's debatable whether application in this context means "EJB module" or "EAR", so I don't necessarily think IDEA is to blame. I'm not familiar with other vendors, but at least WebSphere Application Server will attempt to disambiguate within the client EJB/WAR module before considering all EJBs in all modules in the EAR.
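For what it's worth, the inspection's second suggestion avoids the string: beanInterface takes a class literal, so it survives refactoring. A sketch using the question's types (it only disambiguates when a single bean exposes that client view):

```java
@EJB(beanInterface = FooService.class)
private FooService fooService;
```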