After upgrading to Micronaut 3.x and replacing the javax.inject annotations with jakarta.inject, @Named is unable to resolve the property.
import io.micronaut.context.annotation.Bean
import io.micronaut.context.annotation.Factory
import io.micronaut.context.annotation.Requires
import java.util.concurrent.ExecutorService
import jakarta.inject.Named
import kotlin.coroutines.CoroutineContext

@Factory
open class ExecutorServiceCoroutineContextFactory {

    @Bean
    @Requires(missingBeans = [CoroutineContext::class])
    fun executorServiceCoroutineContext(@Named("\${coroutines.executor}") executorService: ExecutorService): CoroutineContext {
        return ExecutorServiceCoroutineDispatcher(executorService)
    }
}
application.yml
coroutines:
  executor: coroutines
The resulting error
Message: No bean of type [java.util.concurrent.ExecutorService] exists for the given qualifier: @Named('${coroutines.executor}'). Make sure the bean is not disabled by bean requirements (enable trace logging for 'io.micronaut.context.condition' to check) and if the bean is enabled then ensure the class is declared a bean and annotation processing is enabled (for Java and Kotlin the 'micronaut-inject-java' dependency should be configured as an annotation processor).
Path Taken: new Controller(Client client,CoroutineContext coroutineContext) --> new Controller(Client client,[CoroutineContext coroutineContext]) --> CoroutineContext.executorServiceCoroutineContext([ExecutorService executorService],String name)
io.micronaut.context.exceptions.DependencyInjectionException: Failed to inject value for parameter [executorService] of method [executorServiceCoroutineContext] of class: kotlin.coroutines.CoroutineContext
Hardcoding the value "coroutines" does work, but I am wondering why the code can't resolve the placeholder.
I am upgrading from ActiveMQ "Classic" to ActiveMQ Artemis while maintaining the client code. I have code like this in multiple places.
import java.io.IOException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;
import org.springframework.jms.core.JmsTemplate;

public class TestV {
    public static void main(String[] args) throws IOException {
        ApplicationContext ctx = new ClassPathXmlApplicationContext("root.xml");
        JmsTemplate jms = ctx.getBean(JmsTemplate.class);

        Map<String, Object> map = new HashMap<>();
        List<Integer> ids = new ArrayList<>();
        ids.add(10);
        ids.add(20);
        map.put("ids", ids);
        map.put("updated", true);

        jms.convertAndSend("mytest", map);
    }
}
How do I fix the below error coming from the above code?
Exception in thread "main" org.springframework.jms.UncategorizedJmsException: Uncategorized exception occurred during JMS processing; nested exception is javax.jms.JMSException: org.apache.activemq.artemis.api.core.ActiveMQPropertyConversionException: class java.util.ArrayList is not a valid property type
at org.springframework.jms.support.JmsUtils.convertJmsAccessException(JmsUtils.java:311)
at org.springframework.jms.support.JmsAccessor.convertJmsAccessException(JmsAccessor.java:185)
at org.springframework.jms.core.JmsTemplate.execute(JmsTemplate.java:507)
at org.springframework.jms.core.JmsTemplate.send(JmsTemplate.java:584)
at org.springframework.jms.core.JmsTemplate.convertAndSend(JmsTemplate.java:661)
at com.mycompany.adhoc.TestV.main(TestV.java:33)
Caused by: javax.jms.JMSException: org.apache.activemq.artemis.api.core.ActiveMQPropertyConversionException: class java.util.ArrayList is not a valid property type
at org.apache.activemq.util.JMSExceptionSupport.create(JMSExceptionSupport.java:54)
at org.apache.activemq.ActiveMQConnection.syncSendPacket(ActiveMQConnection.java:1404)
at org.apache.activemq.ActiveMQConnection.syncSendPacket(ActiveMQConnection.java:1437)
at org.apache.activemq.ActiveMQConnection.syncSendPacket(ActiveMQConnection.java:1324)
at org.apache.activemq.ActiveMQSession.send(ActiveMQSession.java:1981)
at org.apache.activemq.ActiveMQMessageProducer.send(ActiveMQMessageProducer.java:288)
at org.apache.activemq.ActiveMQMessageProducer.send(ActiveMQMessageProducer.java:223)
at org.apache.activemq.ActiveMQMessageProducerSupport.send(ActiveMQMessageProducerSupport.java:241)
at org.springframework.jms.core.JmsTemplate.doSend(JmsTemplate.java:634)
at org.springframework.jms.core.JmsTemplate.doSend(JmsTemplate.java:608)
at org.springframework.jms.core.JmsTemplate.lambda$send$3(JmsTemplate.java:586)
at org.springframework.jms.core.JmsTemplate.execute(JmsTemplate.java:504)
In ActiveMQ "Classic" we can set trusted packages on the connection factory. How do I do that in ActiveMQ Artemis?
This problem has nothing to do with setting the trusted packages on the JMS ConnectionFactory.
The problem is that your application is implicitly using this JMS "extension" provided by ActiveMQ "Classic." As the documentation states:
This JMS extension feature allows you to attach Map and List properties to any JMS Message or to use nested Maps and Lists inside a MapMessage. [emphasis mine]
When you pass the Map<String, Object> variable map to JmsTemplate.convertAndSend it uses the default SimpleMessageConverter to convert that Map into a javax.jms.MapMessage. As the JavaDoc for MapMessage states:
The names are String objects, and the values are primitive data types in the Java programming language. [emphasis mine]
In other words, according to the JMS specification the values in the MapMessage can only be primitive data types. However, ActiveMQ "Classic" provides an extension which allows using List implementations. Code which uses this extension is not portable to other JMS brokers since it does not adhere to the JMS specification. This is why ActiveMQ Artemis throws the error ActiveMQPropertyConversionException: class java.util.ArrayList is not a valid property type.
You will either have to change your application code to adhere to the JMS specification (i.e. use primitive data types in the values of your Map) or this same extension will have to be implemented by ActiveMQ Artemis.
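For illustration, one spec-adherent variant of the sending code above could flatten the list into a String before putting it into the Map. The encoding and the helper class name below are my own assumptions, not part of the original code:

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

import org.springframework.jms.core.JmsTemplate;

public class TestVPortable {

    public static void send(JmsTemplate jms) {
        List<Integer> ids = new ArrayList<>();
        ids.add(10);
        ids.add(20);

        Map<String, Object> map = new HashMap<>();
        // Flatten the list into a plain String so every value in the Map is a
        // type the JMS MapMessage specification allows (primitives and String).
        map.put("ids", ids.stream().map(String::valueOf).collect(Collectors.joining(",")));
        map.put("updated", true);

        // SimpleMessageConverter can now build a spec-compliant MapMessage.
        jms.convertAndSend("mytest", map);
    }
}

The consumer would then have to split the string back into individual IDs.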
I am using Spring Boot 2.7.0 to create an API gateway, which was working fine until I tried to replace RestTemplate with OpenFeign.
Here is the relevant build.gradle contents:
plugins {
    id 'org.springframework.boot' version '2.7.0'
    id 'io.spring.dependency-management' version '1.0.11.RELEASE'
    id 'java'
    id 'org.liquibase.gradle' version '2.1.0'
    id 'groovy'
}

ext {
    set('springCloudVersion', "2021.0.3")
}

// implementation 'org.springframework.boot:spring-boot-starter-web'
implementation 'org.springframework.boot:spring-boot-starter-webflux'
implementation "org.springframework.cloud:spring-cloud-dependencies:${springCloudVersion}"
implementation 'org.springframework.cloud:spring-cloud-starter-circuitbreaker-resilience4j'
implementation 'org.springframework.cloud:spring-cloud-starter-netflix-eureka-client'
implementation 'org.springframework.cloud:spring-cloud-starter-openfeign'
implementation 'org.springframework.cloud:spring-cloud-starter-gateway'
implementation 'org.springframework.cloud:spring-cloud-starter-config'
implementation 'org.springframework.cloud:spring-cloud-starter-bootstrap'
My application class:
@SpringBootApplication
@EnableDiscoveryClient
@EnableFeignClients
public class ApiGatewayApplication {
    ...
}
My FeignClient:
@FeignClient(name = "identity-service")
public interface IdentityServiceClient {

    @GetMapping("/api/all")
    public List<ApiKey> getAllApiKeys();
}
When I try to start the application now, I get:
o.s.c.openfeign.FeignClientFactoryBean : For 'identity-service' URL not provided. Will try picking an instance via load-balancing.
DiscoveryClientOptionalArgsConfiguration : Eureka HTTP Client uses RestTemplate
....
onfigReactiveWebServerApplicationContext : Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'apiGatewayApplication': Invocation of init method failed; nested exception is feign.codec.DecodeException: No qualifying bean of type 'org.springframework.boot.autoconfigure.http.HttpMessageConverters' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true)}
com.netflix.discovery.DiscoveryClient : Shutting down DiscoveryClient ...
com.netflix.discovery.DiscoveryClient : Unregistering ...
com.netflix.discovery.DiscoveryClient : DiscoveryClient_API-GATEWAY/192.168.1.72:api-gateway:8081 - deregister status: 404
com.netflix.discovery.DiscoveryClient : Completed shut down of DiscoveryClient
***************************
APPLICATION FAILED TO START
***************************
Description:
A component required a bean of type 'org.springframework.boot.autoconfigure.http.HttpMessageConverters' that could not be found.
Action:
Consider defining a bean of type 'org.springframework.boot.autoconfigure.http.HttpMessageConverters' in your configuration.
But none of the online examples seem to show a need for adding custom HttpMessageConverters, since they are included in spring-web and spring-boot-starter-webflux.
UPDATE -
This appears to be related to @Autowired. When I put both my service and the Feign client in a regular class, the application starts up fine. But when I autowire those two services, I get this error again.
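For reference, a minimal sketch of the workaround that the error's "Action" line points at, declaring the HttpMessageConverters bean yourself, might look like this (the class name FeignConfig is mine, and it assumes Jackson is on the classpath):

import org.springframework.boot.autoconfigure.http.HttpMessageConverters;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.converter.json.MappingJackson2HttpMessageConverter;

@Configuration
public class FeignConfig {

    // spring-boot-starter-web normally auto-configures this bean; in a
    // WebFlux-only gateway it is absent, which is what Feign's decoder
    // complains about at startup.
    @Bean
    public HttpMessageConverters messageConverters() {
        return new HttpMessageConverters(new MappingJackson2HttpMessageConverter());
    }
}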
I have a Quarkus application that makes use of caches for methods.
These methods and cache eviction have to be tested somehow, preferably when the Quarkus context is fully operational.
This is what I came up with (PostgresContainer is just for reference):
@QuarkusTest
class ScreeningRepositorySpec : PostgresContainer() {

    private val cacheManager = CaffeineCacheSupplier().get()

    init {
        "test cache manager gets initialized" {
            logger().info("Cache size: {}", cacheManager.size)
        }
    }
}
The problem arises when any kind of invocation happens on cacheManager: it throws an NPE. Sample project: https://github.com/im-infamou5/quarkus-cache-playground
The downstream code is as follows:
@Override
public List<CaffeineCache> get() {
    CacheManager cacheManager = cacheManager();
    ...
}
which ultimately yields:
public static CacheManager cacheManager() {
    return Arc.container().instance(CacheManager.class).get();
}
And it turns out that Arc.container() is somehow null.
What else was tried:
@QuarkusIntegrationTest - no bean for injection, null for the Arc container
Explicit @Inject for CacheManager - yields "no bean matches injection point"
Explicit Cache definition with manual CacheManager instantiation - same issue with the null Arc container
Variations of @Inject for default bean injection - same message about the missing injection point
It looks like a CacheManager bean lifecycle issue: the bean is expected way too early and the lookup never succeeds as a result.
I've dug into the Quarkus tests, which use quite a workaround, but I still hope an easier approach is available that avoids pulling in this many dependencies just to test the cache properly.
The version of Quarkus is 2.7.0.
A sample project with a simple form of direct injection can be found at the link above.
The output is as follows:
org.acme.GreetingResourceIT > test cache size FAILED
kotlin.UninitializedPropertyAccessException at GreetingResourceIT.kt:11
Caused by: kotlin.UninitializedPropertyAccessException at GreetingResourceIT.kt:11
I'm trying to integrate the StarMX framework (https://github.com/rogeriogentil/starmx) into a legacy web application. This framework uses JMX technology and is initialized using the Singleton pattern: StarMXFramework.createInstance(). The web application uses Java EE 6 technologies such as EJB and CDI (also DeltaSpike). However, the way the framework is being initialized (code below) doesn't add its instance to the CDI context.
import org.starmx.StarMXException;
import org.starmx.StarMXFramework;

import javax.annotation.PostConstruct;
import javax.annotation.PreDestroy;
import javax.ejb.Singleton;
import javax.ejb.Startup;

@Startup
@Singleton
public class StarMXSingleton {

    private StarMXFramework starMX;

    @PostConstruct
    public void postConstruct() {
        try {
            starMX = StarMXFramework.createInstance();
        } catch (StarMXException e) {
            (...)
        }
    }

    @PreDestroy
    public void preDestroy() {
        if (starMX != null) {
            try {
                starMX.shutdown();
            } catch (StarMXException e) {
                (...)
            }
        }
    }
}
I know that it is possible to extend CDI, but is it possible to add an instance of a singleton framework to the CDI context?
There are two ways; the first and easy one is a producer. Here is a link to what CDI producers are and how they work. In short, CDI will use the producer to create the instance of a bean whose types are determined by the return type of the producer method.
The producer method has to be placed inside a CDI bean so that it is picked up by CDI. Note that the scope of the producer affects how often it will be invoked, just as it would with a standard bean. Here is how it could look:
import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Produces;
import org.starmx.StarMXException;
import org.starmx.StarMXFramework;

@ApplicationScoped
public class SomeCdiBeanInYourApplication {

    @Produces // denotes a producer method
    @ApplicationScoped // scope of the produced bean; use a CDI scope (the @Singleton you have is an EJB annotation)
    public StarMXFramework produceMxFramework() throws StarMXException {
        return StarMXFramework.createInstance();
    }
}
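Once such a producer is discovered, the framework can be injected like any other bean; a minimal consumer (the class name here is made up) could look like:

import javax.enterprise.context.ApplicationScoped;
import javax.inject.Inject;

import org.starmx.StarMXFramework;

@ApplicationScoped
public class SomeConsumerOfTheFramework {

    // Resolved through the producer method shown above.
    @Inject
    private StarMXFramework starMX;
}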
The second means is a CDI extension, namely a lifecycle observer for the AfterBeanDiscovery event, where you can call addBean(). Here is a link to the CDI 2.0 spec; feel free to browse older versions based on which version you are on.
I won't write code for that as it is rather complex and long; the producer should do the trick for you.
See also
Please explain the @Produces annotation in CDI
Using @ApplicationScoped @Named @Eager, my @EJB-injected @Stateless beans are not properly instantiated and evaluate to null.
I had an @ApplicationScoped @ManagedBean(eager=true) that was used to schedule a few jobs. Some @Stateless beans were injected using the @EJB annotation, and that worked fine.
In the move to CDI annotations, I added the OmniFaces @Eager annotation as a substitute for @ManagedBean(eager=true), which is missing in standard CDI:
import java.io.Serializable;

import javax.annotation.PostConstruct;
import javax.annotation.PreDestroy;
import javax.ejb.EJB;
import javax.enterprise.context.ApplicationScoped;
import javax.inject.Inject;
import javax.inject.Named;

import org.omnifaces.cdi.Eager;

@Named
@ApplicationScoped
@Eager
public class MyScheduler implements Serializable {

    @EJB
    private MyService myService;

    @Inject
    private MyNamedBean myNamedBean;

    @PostConstruct
    public void init() {
        setupSchedulers();
    }

    @PreDestroy
    public void destroy() {
        destroySchedulers();
    }

    //...
}
Using this setup, the @PostConstruct method is correctly called on application startup (though it seems to run even before the context is initialized), but then myService evaluates to null.
In the log, the following warnings appear:
Severe: No valid EE environment for injection of org.omnifaces.cdi.eager.EagerBeansRepository
Severe: No valid EE environment for injection of my.package.MyScheduler
Info: Initializing Mojarra 2.2.8 ( 20140814-1418 https://svn.java.net/svn/mojarra~svn/tags/2.2.8#13507) for context '/tagific'
Since I need to access this bean from other ones, I couldn't use the @Singleton and @Schedule annotations.
How can I properly inject @Stateless beans into a @Named application-scoped bean that is instantiated on application startup?
This looks like an initialization ordering bug in GlassFish. The @Eager @ApplicationScoped bean is instantiated in a ServletContextListener. Apparently, at that point GlassFish doesn't yet have EJBs ready for injection. This construct works in e.g. WildFly.
However, in the name of CDI's goal of unifying the various dependency injection approaches throughout Java EE, you can also just use @Inject instead of @EJB. The CDI proxy is capable of delegating further to the right @Stateless instance.
@Inject
private MyService myService;
You can also use @Inject inside EJBs themselves, but as of now (Java EE 7) it doesn't yet support self-referencing for e.g. @Asynchronous methods. For that you still have to stick to @EJB.
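For illustration only, a hypothetical self-referencing EJB (the class and method names are made up) would keep @EJB for the self-reference so the asynchronous call goes through the container proxy:

import javax.ejb.Asynchronous;
import javax.ejb.EJB;
import javax.ejb.Stateless;

@Stateless
public class ReportService {

    // Self-reference via @EJB; a plain this.generateAsync() call would bypass
    // the container proxy and run synchronously.
    @EJB
    private ReportService self;

    public void generateAll() {
        self.generateAsync();
    }

    @Asynchronous
    public void generateAsync() {
        // long-running work goes here
    }
}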
That said, are you aware that Oracle stopped commercial support for GlassFish and that you'd better not use it for production environments? See also this blog.