Micronaut bean injection is not working for newly created beans unless the Gradle build is run with --rerun-tasks - Kotlin

We use micronaut-inject for dependency injection. Whenever a new bean is created, we have to run the build with the --rerun-tasks flag for kapt to work; otherwise the newly created beans are not injected.
It seems to be a Gradle cache issue.
What is a possible solution, or what configuration might be missing?
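For reference, a minimal sketch of the kapt wiring such a build typically relies on; the Kotlin DSL below, the artifact coordinates, and the versions are assumptions, not taken from the question. Making sure the Micronaut processor actually runs through kapt, and setting kapt.incremental.apt=false in gradle.properties to rule out incremental annotation processing, are common first things to check:

// build.gradle.kts - sketch only; versions and artifacts are illustrative
plugins {
    kotlin("jvm") version "1.5.31"
    kotlin("kapt") version "1.5.31"
}

dependencies {
    // Micronaut's annotation processor must run through kapt for Kotlin sources;
    // without it, no BeanDefinition is generated for new beans at all
    kapt("io.micronaut:micronaut-inject-java")
    implementation("io.micronaut:micronaut-inject")
}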

Related

Using kotlinx serialization inside a Gradle plugin that creates a Gradle task

I am trying to create a Gradle plugin that will generate files (serialized from data classes) from a Gradle task that can run in another project.
Let's say the classes that I am serializing are marked with some annotation @Annot, and I find all the relevant classes with reflection in the Gradle task (I made sure to depend on the Kotlin compile task so that the binaries are created). The problem is that when I try to use
val clazz: Class<*> = /* a class found via reflection */
clazz.kotlin.serializer()
I get "Serializer for class 'Type' is not found." (Type is the actual class that I found; it is annotated with @Serializable and @Annot.)
I am using Gradle 7.2 and Kotlin 1.5.21 (tried with 1.5.31 too).
The project that uses the plugin has the kotlinx serialization plugin enabled.
What am I missing? Why can't I access the class's serializer from the Gradle task?
Note: if I run the above code in the target project (and not in the plugin), the serializer() function doesn't throw an exception.
So this didn't work the way I wanted it to, but I found a way to make it work.
I defined a task that uses the JavaExec task type:
tasks.create(createFilesTaskName, JavaExec::class.java) {
    // Top-level main function to execute
    mainClass.set("package.of.file.SchemaKt")
    // Use the project's own runtime classpath, where the generated serializers are visible
    classpath = sourceSets.getByName("main").runtimeClasspath
    group = groupName
}
The code in SchemaKt lives in my Kotlin source set, or alternatively in a package required by the current project.
serializer() is accessible and working from there, and I can run the schema creation from a Gradle task, which is exactly what I needed.
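For completeness, a hypothetical sketch of what such an entry point can look like; the package, the looked-up class name, and the use of the internal KClass.serializer() extension are my assumptions, not the asker's actual code. The lookup succeeds here because the class was compiled by the kotlinx-serialization compiler plugin in the same project:

// Schema.kt - compiled into the main source set, producing the class SchemaKt
package com.example.schema // illustrative; the task above points at "package.of.file.SchemaKt"

import kotlinx.serialization.InternalSerializationApi
import kotlinx.serialization.serializer

@OptIn(InternalSerializationApi::class)
fun main() {
    // Look up a class that carries @Serializable (and @Annot) by name - name is illustrative
    val clazz = Class.forName("com.example.model.Type").kotlin
    // Works here because the plugin-generated serializer is on this runtime classpath
    val ser = clazz.serializer()
    println(ser.descriptor)
}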
I hope this helps someone in the future.

Calling a Java component with multiple Java classes from Mule

I am using Anypoint Studio 5.3.0 and server runtime 3.7.0. I want to invoke a main() method from my component. The application is developed using Maven, Spring Boot, and JPA. It sits in a JAR file and has the following structure:
com
  package
    Application.class (with main method)
  another package
    other classes
lib
  other jars
META-INF
  persistence.xml
  MANIFEST.MF
org
  springframework boot loader and other Spring classes
When a file matching the pattern I detect with the Mule polling component arrives, I would like to invoke a Java component in the Mule flow that uses the main class and all the supporting classes.
Thanks,
David
Did you Mavenize your application? If yes, you can add it as a dependency in your Mule project's POM, which is also Mavenized. But you need to make sure the JARs are added to your Maven repository, i.e. first execute "mvn clean install" on your Java application. Otherwise, add the JARs to your build path. Once you have done that, you can create a Spring bean or a Java component in Mule that calls your class's main() method.
I have never come across a production scenario where there is a need to call the main method of a Java class in an enterprise application. Are you sure the main method is the only way into the other classes? There should be initialization, Spring-style injection, etc. The simple answer to your question: create a Mule Java component and override the onCall method to call Application.main(). I would never do this kind of thing myself (it will surely cause more problems, depending on what is written in the main method); in general, main method invocation belongs in desktop applications. If possible, work on the JAR (or let the application team work on it) to provide better initialization options.
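To make that concrete, here is a rough sketch of such a component, written in Kotlin and assuming the Mule 3.x org.mule.api.lifecycle.Callable interface; the class name and the Application package are illustrative, and this is not something I would recommend for the reasons above:

import org.mule.api.MuleEventContext
import org.mule.api.lifecycle.Callable

// Mule Java component that simply delegates to the application's main method
class ApplicationLauncher : Callable {
    override fun onCall(eventContext: MuleEventContext): Any? {
        // Hypothetical main class from the packaged JAR
        com.example.Application.main(arrayOf())
        // Pass the incoming payload straight through
        return eventContext.message.payload
    }
}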

Custom Gradle configurations by example

I am developing a library (JAR) that is meant to be used across many projects. I am using SLF4J for logging, and so I have declared the SLF4J API JAR as a compile-configuration dependency.
When I'm developing this library locally on my machine, I'd like to run tests and see the output from all the SLF4J log statements. Or, outside of a test, it helps to add a temporary main(String[]) method to a random class and test functionality and log output as if the lib were an executable JAR. Since SLF4J's default binding is a no-op (no output whatsoever), I have been getting by so far by adding the SLF4J Simple binding as a compile-configuration dependency while I am developing and testing. Then, before I commit and publish, I simply remove the Simple binding as a dependency (since each developer who uses my lib should be able to select their own binding).
This is hacky, and I know Gradle supports custom configurations, but I have yet to see a coherent example that could act as a guide. Ideally I'd like to define a custom dev configuration so that as dependencies I could have:
dependencies {
    compile 'org.slf4j:slf4j-api:1.7.5'
    dev 'org.slf4j:slf4j-simple:1.7.5' // Only used when running/testing locally
}
...but then only the SLF4J API JAR gets added to my POM. Any ideas as to how to accomplish this? Perhaps Gradle already has such a concept built in, or perhaps a custom configuration isn't even the right approach.
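One way this is usually approached, sketched here in the Gradle Kotlin DSL with current configuration names (implementation/testRuntimeOnly rather than the question's compile-era syntax), is a custom configuration that only feeds the local test and run classpaths and is never mapped into the published POM. Treat the details as assumptions rather than a verified recipe:

// build.gradle.kts - sketch of a "dev" configuration kept out of the POM
val dev: Configuration by configurations.creating

// Let the test runtime see whatever is on the dev configuration
configurations.testRuntimeOnly.get().extendsFrom(dev)

dependencies {
    implementation("org.slf4j:slf4j-api:1.7.5")
    dev("org.slf4j:slf4j-simple:1.7.5") // local-only binding, not published
}

// If you run a temporary main() via JavaExec, add the dev files there too
tasks.withType<JavaExec>().configureEach {
    classpath += dev
}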

Invalid ejb jar: it contains zero ejb.

I have two modules, an EJB and a WAR, plus an EAR module that contains them. The modules build successfully, but when I try to deploy the EAR to GlassFish, I receive this error:
glassfish3.1.2|javax.enterprise.system.tools.admin.org.glassfish.deployment.admin|_ThreadID=17;_ThreadName=Thread-2;|Exception while deploying the app [EarModule] : Invalid ejb jar [BackEnd-1.0-SNAPSHOT.jar]: it contains zero ejb.
Note:
1. A valid ejb jar requires at least one session, entity (1.x/2.x style), or message-driven bean.
2. EJB3+ entity beans (@Entity) are POJOs and please package them as library jar.
3. If the jar file contains valid EJBs which are annotated with EJB component level annotations (@Stateless, @Stateful, @MessageDriven, @Singleton), please check server.log to see whether the annotations were processed properly.|#]
I really don't know what to do. I've found a lot of questions like mine, but there was no solution.
I figured out what was wrong. The problem was in the run configuration: I'm using IntelliJ IDEA, and the run configuration for my EAR module included a build/make step before run. I removed it, and after a Maven install it deployed successfully.
You have to add an EJB to your WAR or EAR file. Just create a new class and annotate it with @Stateless.
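For illustration, a minimal sketch of such a bean, written in Kotlin here; the class and method names are mine, not from the question:

import javax.ejb.Stateless

// Any class carrying an EJB component annotation makes the module a valid EJB JAR
@Stateless
open class PingService {                 // open so the container can proxy it
    open fun ping(): String = "pong"     // business methods must not be final
}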
I know this is very build-specific and it uses NetBeans instead of the OP's IDE, but because I was led here, this will likely be useful to some users:
I had the following build:
NetBeans Enterprise Application with Maven
GlassFish 4.1
Java EE 7
I had tried migrating from a previous non-Maven enterprise application, and the clone didn't quite work the way I expected; there were some old EJB JARs lying around that I deleted.
I did quite a few things to fix it:
Ensure there are no EJB JARs lying around that shouldn't be there. Ensure that you haven't accidentally included the EJB module JAR more than once, as this can result in the same error too (manually deploying the EAR and deploying through NetBeans sometimes gave me different errors).
I used the @Remote interface on my EJB applications. You should not be importing your EJB into your WAR; you should use the annotations correctly as described at https://docs.oracle.com/javaee/7/tutorial/ejb-intro004.htm (see the sketch after this list).
(This is more of a note.) When you update your WAR or EJB, clean and build them before cleaning and building your EAR (sounds funny, right?).
If you are using interfaces for your session beans, you should put them in a separate JAR: make a new Maven > Java Application project. Do the same with your persistence entities. Add these as dependencies to both your EJB and WAR projects.
This doesn't relate to me in particular, but you should have at least one @Stateless (or, I think, @Stateful) annotation on a Java class inside your EJB module for the module to be considered an EJB.
I likely had to do a few more things that I have forgotten, but if you still run into issues, comment below and I'll try to update.
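As mentioned above, here is a sketch of the @Remote split, with the interface in a shared API JAR and the implementation in the EJB module. It is written in Kotlin and the names are illustrative:

import javax.ejb.Remote
import javax.ejb.Stateless

// In the shared API JAR, depended on by both the EJB and WAR modules
@Remote
interface GreetingService {
    fun greet(name: String): String
}

// In the EJB module
@Stateless
open class GreetingServiceBean : GreetingService {
    override fun greet(name: String): String = "Hello, $name"
}

In the WAR you would then inject or look up the bean through the interface rather than depending on the implementation class directly.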
Just try to build and install your project using Maven, and then deploy it to GlassFish (do not run your project directly from your IDE).
I encountered this problem as well. It occurred when I had imported a new EJB project into my Eclipse workspace. The project didn't have a reference to the GlassFish libraries yet, since it was not yet included in the EAR deployment assembly.
Upon saving the bean file, the IDE automatically imported javax.inject.Singleton instead of javax.ejb.Singleton. This made the code compile without warnings, but throw the same error as in the original post.
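In code the difference is just the import, sketched in Kotlin below with an illustrative class name, but only the javax.ejb annotation marks the class as an enterprise bean:

import javax.ejb.Singleton   // not javax.inject.Singleton

@Singleton
open class StartupCache {
    // EJB singleton: counted by the deployer, unlike a javax.inject/CDI singleton
}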

GlassFish 3.1 CDI problem with multi-module layout

I'm not sure if this is a bug in Weld or GlassFish, or if I'm doing something just plain wrong.
I have three JARs: api, impl, and base. These JARs are packaged into a WAR that is deployed to GlassFish 3.1 (b37). The outcome is an error stating that an injection point is unsatisfied in a POJO that is in impl. The POJO that fails to inject is in the base JAR. The curious thing is that I can inject that very same POJO into a backing bean in my WAR with no fuss, and I can also just instantiate the injection dependency by hand using the old-fashioned new keyword.
Any thoughts? I also have a Maven-based test project that replicates this, if anyone is interested in seeing it.
Every JAR has a beans.xml in META-INF; even the WAR has a beans.xml.
Ville
This problem can be solved by replacing the weld-osgi-bundle.jar module (GlassFish ships with Weld 1.1.0.Final) in the GlassFish modules directory with the newest one, 1.1.1.Final. After that, remove all files from the domains/domain1/osgi-cache directory.