When I use these two dependencies together:
<dependency>
    <groupId>org.kie.server</groupId>
    <artifactId>kie-server-client</artifactId>
</dependency>
<dependency>
    <groupId>org.optaplanner</groupId>
    <artifactId>optaplanner-spring-boot-starter</artifactId>
</dependency>
I get the warning below:
Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'solverManager' defined in class path resource [org/optaplanner/spring/boot/autoconfigure/OptaPlannerAutoConfiguration.class]: Unsatisfied dependency expressed through method 'solverManager' parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'solverFactory' defined in class path resource [org/optaplanner/spring/boot/autoconfigure/OptaPlannerAutoConfiguration.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.optaplanner.core.api.solver.SolverFactory]: Factory method 'solverFactory' threw exception; nested exception is java.lang.NoClassDefFoundError: org/drools/core/reteoo/CoreComponentFactory
followed by an exception.
Do you know how I can use these two dependencies together?
I'm adding a second answer which takes an entirely different approach.
The Spring Boot starter is a way to develop standalone OptaPlanner-based applications. KIE Server is an application that embeds OptaPlanner and allows third parties to run solvers on KIE Server.
From this point of view, the two are fundamentally incompatible:
Are you embedding OptaPlanner? Use the Spring Boot starter.
Are you talking to KIE Server? Use the client.
I fail to see how you would even combine both in a single JAR.
The fact still stands, though: KIE Server is an obsolete technology, and OptaPlanner 8 does not support it.
OptaPlanner 8 and Drools 7 cannot be used together. OptaPlanner 8 relies on Drools 8, and mixing the two will cause all sorts of classpath conflicts.
There is a way to use OptaPlanner 8 without Drools. Assuming you do not use score DRL (or, if you're using constraint streams, you switch to the BAVET implementation), you should be able to remove all Drools dependencies from OptaPlanner 8. However, I cannot guarantee that this will not blow up for some other reason, as it has never been tried; the safest answer is that this is simply not going to work, because OptaPlanner 8 requires Drools 8.
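If you do want to experiment with that (untried, as noted above), a minimal sketch of the Maven exclusion might look like the following; the `org.kie` exclusion is an assumption about what the starter pulls in transitively, and the `*` wildcard requires Maven 3.2.1 or later:

```xml
<dependency>
    <groupId>org.optaplanner</groupId>
    <artifactId>optaplanner-spring-boot-starter</artifactId>
    <exclusions>
        <!-- Exclude all Drools artifacts; only viable without score DRL -->
        <exclusion>
            <groupId>org.drools</groupId>
            <artifactId>*</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.kie</groupId>
            <artifactId>*</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```

Any solver configuration that still references DRL or the Drools-based constraint stream implementation will fail at runtime with this setup.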
Another option is to use OptaPlanner 7, which was designed to work with KIE, but that (very old) version is no longer maintained by the community.
Related
Still trying to run OptaPlanner on our projects; today I tried to split our application into multiple modules, but ran into some trouble with OptaPlanner.
So we have multiple Gradle modules that look like this:
app -> Contains the @SpringBootApplication
referentialdata -> Contains almost all our entities / repositories / etc.
auth -> Contains the authentication logic
simulation -> Contains the solver logic
And they are all under the parent build.gradle file
It seems that if I move my ConstraintProvider from the base module (app), where the @SpringBootApplication is, to another module (simulation), then I get this error. All the other solver-related things (like the SolverConfig.xml and the SolverService where I invoke the solver) are already in the simulation module, and the @SpringBootApplication is scanning the packages containing the ConstraintProvider, since the entities and repositories under the same package are detected:
Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.optaplanner.core.config.solver.SolverConfig]: Factory method 'solverConfig' threw exception; nested exception is java.lang.IllegalStateException: No classes found that implement EasyScoreCalculator, ConstraintProvider or IncrementalScoreCalculator.
Neither was a property optaplanner.score-drl defined, nor a constraints.drl resource found.
Maybe your ConstraintProvider class is not in a subpackage of your @SpringBootApplication annotated class's package.
Maybe move your constraint provider class to your application class's (sub)package.
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:185) ~[spring-beans-5.3.1.jar:5.3.1]
at org.springframework.beans.factory.support.ConstructorResolver.instantiate(ConstructorResolver.java:651) ~[spring-beans-5.3.1.jar:5.3.1]
... 48 common frames omitted
Here is my configuration (the paths are good):
<?xml version="1.0" encoding="UTF-8"?>
<solver>
    <solutionClass>com.api.simulation.impl.domain.optimizer.AllocationProblemSolution</solutionClass>
    <entityClass>com.api.simulation.impl.domain.optimizer.OptimizerAffectation</entityClass>
    <termination>
        <secondsSpentLimit>160</secondsSpentLimit>
    </termination>
    <scoreDirectorFactory>
        <!-- <scoreDrl>com.api.simulation.optim/rules.drl</scoreDrl>-->
        <constraintProviderClass>com.api.simulation.impl.AllocationConstraintProvider</constraintProviderClass>
    </scoreDirectorFactory>
</solver>
Is there a way to make it work in submodules, or should the constraint provider be in the main module?
The error message says that this is intentional, and it also hints at what you need to do to fix that:
Maybe your ConstraintProvider class is not in a subpackage of your @SpringBootApplication annotated class's package.
It's been a while since that code was written, and we are currently evaluating whether or not we still need to place that limitation on users. I will update my answer here if/when we decide to change the behavior.
I am currently experiencing a problem in my RCP application and wanted to ask if someone has stumbled over the same problem and can give me some valuable hints:
My RCP application allows plugins to provide implementations of a specific abstract class of my model (a singleton) to extend the model at runtime via the update manager. I instantiate these classes via
extensionPointImplementation.createExecutableExtension(..)
after parsing the Eclipse registry. I can serialize the created instances using the default Java serialization API.
Now to the problem: the plugin trying to deserialize the objects cannot find the class implementations of the model extensions, because there is no plugin dependency between the plugins. Nevertheless, it is not possible for me to create such a dependency, which would defeat the whole idea of extending the model at runtime.
Is it possible to solve this problem using the default Java serialization API, or do I have to implement my own serialization? A custom serialization would parse the Eclipse registry and create the instances via the line shown above if all necessary plugins are available, and otherwise throw an exception. If possible, I would like to base it on the default Java serialization API rather than write it completely myself.
Thanks.
You need to define a so-called buddy policy.
In the bundle trying to instantiate the class add
Eclipse-BuddyPolicy: registered
to its MANIFEST.MF.
In the bundle providing the class add
Eclipse-RegisterBuddy: <symbolic name of the bundle instantiating the class>
to its MANIFEST.MF.
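Put together, it would look roughly like this (the symbolic names are hypothetical, and the `#` lines are explanatory only; real manifests have no comment syntax):

```
# MANIFEST.MF of the bundle doing the deserialization
Bundle-SymbolicName: com.example.host
Eclipse-BuddyPolicy: registered

# MANIFEST.MF of each plugin contributing model extension classes
Bundle-SymbolicName: com.example.extension
Eclipse-RegisterBuddy: com.example.host
```

This lets the host bundle load classes from any bundle that registers itself as a buddy, without a compile-time dependency in either direction.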
I have a multi-module project where the EJB BarService in the bar project refers to a FooService EJB in the foo project. The @EJB annotation is used to inject the EJB.
@EJB
private FooService fooService;
I'm using IntelliJ IDEA 11, and it complains with
'Ambiguous EJB reference "beanName" or more precise "beanInterface" should be specified'.
This error only shows up for EJB references in a different module. If I use the beanName as suggested, the error goes away. However, I would prefer not to use it, since the component name would be hard to refactor, being a string.
Is this a bug in IDEA, or am I trying to do something wrong here?
(I noticed this and this asking the exact same question in the JetBrains forums, but sadly there are no replies.)
The javadoc for javax.ejb.EJB is somewhat unclear on this point:
If no explicit linking information is provided and there is only one
session bean within the same application that exposes the matching
client view type, by default the EJB dependency resolves to that
session bean.
It's debatable whether application in this context means "EJB module" or "EAR", so I don't necessarily think IDEA is to blame. I'm not familiar with other vendors, but at least WebSphere Application Server will attempt to disambiguate within the client EJB/WAR module before considering all EJBs in all modules in the EAR.
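If you want to silence the inspection without hard-coding a beanName string, the type-safe alternative the message itself hints at is beanInterface (names taken from the question; a sketch, not the only fix):

```java
// Disambiguates by client view type rather than by a bean-name string,
// so it survives renaming the bean class.
@EJB(beanInterface = FooService.class)
private FooService fooService;
```

This only helps when exactly one bean exposes that interface; if several do, you are back to beanName or an explicit ejb-ref.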
Trying to build a module with Spring Aspects gives me:
can't determine superclass of missing type org.springframework.transaction.interceptor.TransactionAspectSupport
It works in other modules; what's up with this one? A missing dependency?
/S
This is unfortunately an error that occurs from time to time when developing with AspectJ.
Often, in the classpath of any Java application, there are some "dead" classes, that is, classes that sit inside some jar but are never used.
These classes often also have missing dependencies. For example, Velocity (to name one, but most libraries do this) ships with classes that bridge many logging facilities, like Log4j, Java logging, etc. If you want to use one of them, you also need to include its dependency (like log4j.jar); if you don't use it, you can leave that dependency out.
This is not a problem per se when using the library, because those classes will never be loaded. However, when you use AspectJ, things change a bit.
Suppose you have a pointcut like:
execution(int MyClass+.getSomething())
While this pointcut seems very specific, it says "a method named getSomething in any subclass of MyClass". That means that to know whether a certain class matches the pointcut, AspectJ has to check all of its superclasses while weaving.
But what happens if AspectJ tries to do that on a "dead class" like the one mentioned above? It will search for the superclass and fail, because the class is not used and its dependencies are not satisfied.
I usually instruct AspectJ to only warn me in this situation, instead of raising a blocking error, because 9 times out of 10 this happens on dead code and can be safely ignored.
Another way is to spot which pointcut is causing AspectJ to check that class and try to rewrite it so that the scope is stricter. However, this is not always possible.
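For example, restricting the pointcut with within() keeps AspectJ from resolving types outside your own code (the package name here is hypothetical):

```
// Only weave join points lexically inside your own packages,
// so unrelated "dead" classes are never examined
execution(int MyClass+.getSomething()) && within(com.example.myapp..*)
```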
A dirty but quick hack is to write:
execution(... MyClass+ ....) && !this(org.springframework.....)
This is (usually) optimized by AspectJ so that the !this(....) check fails before the complete execution pointcut is evaluated. But it ties your pointcut to a specific situation, so it is useful only for testing or for last-second patching of a running system while you search for a better solution.
The ones to blame, in this case, are not AspectJ but the libraries that include dead classes, which could (and should) be placed in separate modules. Many libraries don't do this to avoid "module proliferation" (as if each library had to release a separate module for every logging system, and so on), which is a fair argument, but it can be solved more easily and cleanly with modern dependency-management systems (like Maven, Ivy, etc.) than by packing single jar files with tons of classes with unmet dependencies and then stating in the documentation that you need a certain dependency to load a certain class.
You'll need to add the spring-tx dependency to clear this:
http://mvnrepository.com/artifact/org.springframework/spring-tx
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-tx</artifactId>
    <version>${spring.version}</version>
</dependency>
I just solved a similar problem by running a Maven clean.
The error message was almost the same, but it was about my own classes. So I think the answer from Simone Gianni is correct: there were some incorrect classes that had been generated by the IDE for some reason, so just remove them and it should be fine.
AbstractTransactionAspect from spring-aspects references TransactionAspectSupport from spring-tx; do you have it in your dependencies?
Add it as an optional dependency if it is not actually needed at runtime:
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-tx</artifactId>
    <optional>true</optional>
</dependency>
Alternatively, change the Xlint option to warning (or ignore).
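With the aspectj-maven-plugin, that looks roughly like this (plugin version omitted; check the one your build uses):

```xml
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>aspectj-maven-plugin</artifactId>
    <configuration>
        <!-- Downgrade errors like "can't determine superclass of
             missing type" to warnings instead of failing the build -->
        <Xlint>warning</Xlint>
    </configuration>
</plugin>
```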
Now, with EJB 3.1, we have the javax.ejb.Singleton annotation, which ensures that a bean is a singleton.
Is there a way I can ensure singleton behavior using stateless beans in EJB 3.0 with some modifications to my code (using the static keyword, or some other way)?
If you're able to limit your @Stateless bean pool size to exactly 1, then you can get pretty close to an @Singleton.
The effect would be like having an @Singleton that uses @Lock(WRITE) for all calls (i.e. no concurrency) and does not start eagerly via @Startup (it will start on first access).
You might still be able to get the effect of @Startup if your platform has the option to eagerly fill @Stateless bean pools.
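How you pin the pool size to 1 is vendor-specific. On GlassFish, for example, it goes in sun-ejb-jar.xml (the bean name here is hypothetical; other containers have their own equivalents):

```xml
<sun-ejb-jar>
    <enterprise-beans>
        <ejb>
            <ejb-name>MySingletonLikeBean</ejb-name>
            <bean-pool>
                <!-- Exactly one instance ever serves requests -->
                <steady-pool-size>1</steady-pool-size>
                <max-pool-size>1</max-pool-size>
            </bean-pool>
        </ejb>
    </enterprise-beans>
</sun-ejb-jar>
```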
Is there a way that I can ensure singleton using stateless beans in EJB 3.0 with some modifications in my code (use of the keyword static, or other way to do that....)
No, nothing standard. Your container might provide some specific extensions, though (e.g. JBoss has a proprietary @Service annotation).