Writing HK2 components in GlassFish 4?

In version 3 there is this guide on how to write components (including HK2 components):
Oracle GlassFish Server 3.0.1 Add-On Component Development Guide
This documentation is not available with GF4. Why not?
Why am I asking?
Because I want to write a custom Logging Handler, as documented in Chapter 7 of the administration guide (https://glassfish.java.net/docs/4.0/administration-guide.pdf).
"Note: The custom handler class should be packaged in an OSGi
module and the JAR file placed in the as-install/modules directory."
So how exactly do I proceed to create an HK2 component for GF4? Is it the same as in GF3?

GlassFish 4 added a new handlerServices logging property that does not exist in GlassFish 3. If your handler is an HK2 handler, you have to place it in the modules directory and register the handler under the handlerServices property in the logging.properties file. The Hundred-Kilobyte Kernel (HK2) website contains all of the documentation to get you started. I would reference the GFFileHandler source code and Adding custom handlers to GlassFish v3 loggers when building your handler.
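For illustration, a minimal handler along those lines might look like this (a sketch only; the package, class name, and the use of HK2's @Service annotation are my assumptions, not code taken from GlassFish):
package com.example.logging; // illustrative package

import java.util.logging.Handler;
import java.util.logging.LogRecord;
import org.jvnet.hk2.annotations.Service;

@Service // lets the handler be discovered as an HK2 component
public class MyLogHandler extends Handler {

    @Override
    public void publish(LogRecord record) {
        // forward the record to your own sink (file, socket, queue, ...)
    }

    @Override
    public void flush() {
        // nothing buffered in this sketch
    }

    @Override
    public void close() throws SecurityException {
        // release resources held by the sink, if any
    }
}
Packaged as an OSGi module and dropped into as-install/modules, the handler would then be referenced from logging.properties via the handlerServices property, as shown further below.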
The GlassFish 4 Administration Guide also states on page 7-17:
To configure a custom handler that is not developed as an HK2 component, add the
new handler to the logging.properties file after the developer has put the custom
handler JAR file into the domain-dir/lib/ext directory
So you might be able to set up your handler without using HK2. For these handlers you have to use the standard handlers property in the logging.properties file.
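To make the two variants concrete, the relevant logging.properties entries would look roughly like this (the handler class name is the illustrative one from the sketch above; which line you need depends on where the JAR is installed):
# HK2 handler, JAR installed as an OSGi module in as-install/modules (GlassFish 4)
handlerServices=com.example.logging.MyLogHandler

# plain java.util.logging handler, JAR placed in domain-dir/lib/ext
handlers=java.util.logging.ConsoleHandler,com.example.logging.MyLogHandler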

Related

Get SLF4J running without copying anything into WebLogic lib-directory

I need to get SLF4J working inside a WebLogic application. According to Buttso [1] and Oracle [2], one needs to copy these files into the domain/lib directory:
slf4j-api
slf4j-jdk14-1.6.0.jar
Then define the following handler in the logging.properties file:
handlers = weblogic.logging.ServerLoggingHandler
and start WebLogic with the following parameter:
-Djava.util.logging.config.file=C:\tmp\logging.properties
I understand why the property file must be defined globally. But I don't understand why the JARs must be copied into the domain/lib directory of WebLogic. I tried to leave them inside my WAR file, but it doesn't work.
Is there a way to keep the logging libraries under the control of the application? Where does this limitation come from? It is possible to use the JDK14 logging infrastructure of WebLogic directly from the application, as in:
java.util.logging.Logger LOGGER = java.util.logging.Logger.getLogger("my.logger.Name");
LOGGER.info("JDK14 Anonymous info");
It works as expected. The handler weblogic.logging.ServerLoggingHandler successfully intercepts the message and forwards it into the WLS log file. Why is the SLF4J bridge not able to do the same?
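For comparison, the SLF4J call I expected to be routed through slf4j-jdk14 into the same handler is simply (logger name chosen to match the example above):
org.slf4j.Logger LOGGER = org.slf4j.LoggerFactory.getLogger("my.logger.Name");
LOGGER.info("SLF4J info that should also end up in the WLS log");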
[1] Using SLF4J with WebLogic Server Logging
http://buttso.blogspot.com/2011/06/using-slf4j-with-weblogic-server.html
[2] How to Redirect SLF4J to the WebLogic Logging System?
https://support.oracle.com/epmos/faces/DocumentDisplay?id=1507456.1 (Oracle subscription needed)
SHORT DESCRIPTION:
It works perfectly with SLF4J packaged with the application. The important thing is that the API slf4j-api and the implementation slf4j-jdk14 must be loaded by the same classloader.
LONG DESCRIPTION:
By default the WebLogic classloader has priority. If both libraries (slf4j-api and slf4j-jdk14) are located in the domain/lib directory, nothing can go wrong.
If slf4j-api is located in the application classpath but NOT in the WebLogic classpath, two things can happen:
SLF4J finds a wrong implementation packaged with the application, for example Logback pulled in as a mandatory dependency of some third-party library. In this case the messages will be forwarded to the wrong implementation and will never reach the WebLogic logging infrastructure.
SLF4J finds an implementation inside the WebLogic classpath. In this case the application will most probably fail to deploy because of a ClassCastException.
As I said, it is possible to have all the SLF4J logging libraries inside the application. This is needed, for example, if the WebLogic server is a shared instance and not under your control. Two things need to be done:
Ensure that only a single SLF4J implementation is packaged with the application. In our case it is slf4j-jdk14. Run mvn clean to be sure all the leftovers from previous attempts are removed from the WAR file! (A typical Maven exclusion for keeping competing bindings out is sketched after the descriptor below.)
Enforce the use of the application classloader for loading the SLF4J library. This is done in WEB-INF/weblogic.xml like this:
<wls:weblogic-web-app
    xmlns:wls="http://xmlns.oracle.com/weblogic/weblogic-web-app"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_3_0.xsd http://xmlns.oracle.com/weblogic/weblogic-web-app http://xmlns.oracle.com/weblogic/weblogic-web-app/1.9/weblogic-web-app.xsd">
    <wls:weblogic-version>14.1.1.0</wls:weblogic-version>
    <wls:context-root>test-oauth</wls:context-root>
    <wls:container-descriptor>
        <wls:prefer-application-packages>
            <wls:package-name>org.slf4j.*</wls:package-name>
        </wls:prefer-application-packages>
    </wls:container-descriptor>
</wls:weblogic-web-app>
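As mentioned in the first point, a typical way to keep a competing binding such as Logback out of the WAR is a Maven exclusion on the dependency that drags it in (the third-party artifact below is purely illustrative):
<dependency>
    <groupId>com.example</groupId> <!-- illustrative third-party library -->
    <artifactId>some-third-party-lib</artifactId>
    <version>1.0</version>
    <exclusions>
        <exclusion>
            <!-- keep the Logback binding out so slf4j-jdk14 is the only one on the classpath -->
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-classic</artifactId>
        </exclusion>
    </exclusions>
</dependency>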
Here is a useful example of how to find out which classloader was responsible for loading a given class or instance:
Which classloader loaded a class of the provided instance
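In the simplest case (a rough sketch, not the full answer behind that link) it boils down to asking the class itself:
// prints the application classloader if the SLF4J API came from the WAR,
// a WebLogic classloader if it came from the server, or null for bootstrap classes
System.out.println(org.slf4j.LoggerFactory.class.getClassLoader());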

Dynamically selecting log4j2 configuration in Mule

I have gone through the Mule logging documentation but am not clear on how to dynamically load different logging configuration files for each environment. Basically I want to control log verbosity and the sync/async feature across environments, so I am looking for a way to dynamically select the configuration file based on a server environment property.
There are two ways you can load the log4j2 file dynamically in your application, either from an external path or from your application classpath:
Setting the log4j2 file path in your application's mule-deploy.properties, like:
log.configFile=E:\common-log4j2.xml
Loading the log4j2.xml in your application programmatically, by reconfiguring the log manager via Spring and loading your own log4j2.xml file from a defined path, as sketched below.
Reference: https://dzone.com/articles/getting-own-log4j2-file-for-mule-via-spring
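A rough sketch of that programmatic approach, assuming the log4j2 core library is available and using an illustrative helper class name and path:
import java.io.File;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.core.LoggerContext;

public class Log4j2Reconfigurer {

    // Point log4j2 at an external configuration file and trigger a reload.
    public static void reconfigure(String configFilePath) {
        LoggerContext context = (LoggerContext) LogManager.getContext(false);
        context.setConfigLocation(new File(configFilePath).toURI());
    }

    public static void main(String[] args) {
        reconfigure("E:/common-log4j2.xml"); // illustrative path, matching the example above
    }
}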
I haven't tried it, but you should be able to set the log4j config file at the command line when launching Mule, using the log4j.configuration system property.
For example (in Windows) by adding -Dlog4j.configuration=c:\some-path\log4j-%MULE-ENV%.xml if your env variable is called MULE-ENV.
Note that there are several places this can be set - directly on the command line if using Mule standalone (in which case I believe you need -M-Dlog4j.configuration=...), in the wrapper.conf file if using standalone, or in the VM params section of the Arguments tab in Run Configurations when running in Studio.
You can have a bean in your application which calls a method to set the configuration. You can pass the environment name as an argument to this bean, and it will pick the configuration file associated with that environment. You can call the method using the invoke component and have this flow executed at startup.
Until that flow is executed, the default logging configuration is used.
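For illustration, such a bean could be as small as this (the log4j2-<env>.xml naming convention and the reuse of the reconfigure helper sketched above are my assumptions):
public class EnvLoggingConfigurer {

    // Called from a Mule invoke component at startup, e.g. with env = "dev" or "prod".
    public void configure(String env) {
        // picks e.g. /opt/mule/conf/log4j2-dev.xml; the base path is illustrative
        Log4j2Reconfigurer.reconfigure("/opt/mule/conf/log4j2-" + env + ".xml");
    }
}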

Calling a Java component with multiple Java classes from Mule

I am using Anypoint Studio 5.3.0 and server runtime 3.7.0. I want to invoke a main() method from my component. The application is developed using Maven, Spring Boot and JPA. It sits in a JAR file and has the following structure:
com
  package
    Application.class (with main method)
  another package
    Other classes
lib
  other jars
META-INF
  persistance.xml
  MANIFEST.MF
org
  springframework boot loader and other spring classes
When a file arrives matching the pattern that I detect with a Mule polling component, I would like to invoke a Java component in the Mule flow that calls this main class and all the supporting classes.
Thanks,
David
Did you mavenize your application? If yes, you can add it as a dependency in your Mule project POM, which is also mavenized. But you need to make sure that the JARs are added to your local Maven repository, i.e. first execute "mvn clean install" on your Java application. Otherwise, add the JARs to your build path. Once that is done, you can create a Spring bean or a Java component in Mule that calls your class with the main() method.
I have never come across a production scenario where there is a need to call the main method of a Java class in an enterprise application. Are you sure the main method is the only way to access the other classes? It should have an initializer, Spring-style injection, etc. The simple answer to your question: create a Mule Java component and override the onCall method to call Application.main(). I would never do this kind of thing [for sure it will cause more problems, depending on what is written in the main method]. In general, main method invocation is used in desktop applications. If possible, work on (or let the application team work on) the JAR file to provide better initialization options.
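A minimal sketch of such a component for Mule 3.x (the package of the Application class is an assumption; only the structure of onCall is the point here):
import org.mule.api.MuleEventContext;
import org.mule.api.lifecycle.Callable;

public class ApplicationLauncher implements Callable {

    // Invoked by the Mule Java component when the flow reaches it.
    @Override
    public Object onCall(MuleEventContext eventContext) throws Exception {
        com.example.Application.main(new String[0]); // illustrative package name
        return eventContext.getMessage().getPayload(); // pass the payload through unchanged
    }
}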

inverse_loading in WASCE 3.0.0.3 / Geronimo 3.0

I am trying to deploy a WAR onto an IBM WebSphere Application Server Community Edition (WASCE) 3.0.0.3. I had some JAR conflict problems between the JARs that come with WASCE 3.0.0.3 and the JARs that come from our application dependencies. In the end, I fixed the problem by using the property below in geronimo-web.xml to force WASCE to load the JARs from my application.
<import-package>!the.conflicting.jars</import-package>
However, I would like to force WASCE to always take JARs from my application first, i.e. invert the default classloader behavior to load from the application first. What is the correct configuration to change in this case?
After some searching, WASCE 3.0 is based on Geronimo 3.0 according to the link. I found that setting <inverse-classloading> in geronimo-web.xml might be helpful. But the two documents below on the Apache Geronimo 3.0 website mention that this function is no longer available in Geronimo 3.0.
In Migrating from G 2.x to G 3.x, it says:
inverse-classloading: Geronimo 3.0 does not support the element in the deployment plan.
In geronimo-web.xml:
The <sys:environment> element contains the following elements:
...
The <inverse-classloading> element can be used to specify that standard classloader delegation is to be reversed for this module. The Geronimo classloader delegation follows the Java EE 5 specifications, and the normal behavior is to load classes from a parent classloader (if available) before checking the current classloader. ... (Not supported in 3.0, use <import-package/> instead)
So if <inverse-classloading> is no longer available, what is the equivalent of this property in WASCE 3.0.0.3? Or how exactly should I do this using <import-package/> for all duplicated JARs?
In the link you mentioned, you will find the following section:
<sys:environment>
The <sys:environment> XML element uses the Geronimo System namespace, which is used to specify the common elements for common libraries and module-scoped services, and is documented here:
http://geronimo.apache.org/schemas-3.0/docs/geronimo-module-1.2.xsd.html
The element contains the following elements:
The <moduleId> element is used to provide the configuration name for the web application as deployed in the Geronimo server. It contains elements for the groupId, artifactId, version and module type. Module IDs are normally printed with slashes between the four components, such as GroupID/ArtifactID/Version/Type.
The <dependencies> element is used to provide the configurations and third party libraries on which the web module is dependent upon. These configurations and libraries are made available to the web module via the Geronimo classloader hierarchy.
The <bundle-activator> element is used to create Bundle-Activator header in the manifest file of the web application. It specifies the entry point of the web application as deployed in the Geronimo server.
The <bundle-classPath> element is used to create the Bundle-ClassPath header in the manifest file of the web application. It contains a list of directories or embedded JAR files, which are also called bundle resources and extend the classpath of the web application.
The <import-package> element is used to create the Import-Package header in the manifest file of the web application. It specifies a list of packages to be resolved before the web application is started. Use <import-package>!packagename</import-package> to override the specific package in the server.
The <export-package> element is used to create Export-Package header in the manifest file of the web application. It specifies a list of packages to be exported.
The <require-bundle> element is used to create the Require-Bundle header in the manifest file of the web application. It specifies a list of bundles to bind to regardless of their packages.
The <dynamic-import-package> element is used to create the DynamicImport-Package header in the manifest file of the web application. It specifies a list of packages to be imported dynamically, especially during class loading.
So basically, you need to add the following directive, i.e.
<sys:import-package>!package-class-name-here*</sys:import-package>
within the <sys:environment> stanza, typically before the application context-root directive.
As you already know, this goes in the geronimo-web.xml embedded in the application WAR/EAR, as mentioned in the link:
http://geronimo.apache.org/GMOxDOC30/geronimo-webxml.html

How to load a dll in Karaf container?

I have a DLL which provides simple functionality (called HelloCpp.dll), and
I need to access the content of this library from a Karaf container via REST calls.
I created a simple Maven bundle which provides the REST API and a class (HelloJNI) which loads HelloCpp.dll using: System.loadLibrary("HelloCpp");
I have also addressed this DLL in my POM file using: <Bundle-NativeCode>HelloCpp.dll</Bundle-NativeCode>
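For completeness, the relevant part of my POM looks roughly like this (assuming the maven-bundle-plugin generates the manifest; the plugin version is omitted):
<plugin>
    <groupId>org.apache.felix</groupId>
    <artifactId>maven-bundle-plugin</artifactId>
    <extensions>true</extensions>
    <configuration>
        <instructions>
            <!-- the DLL is expected to be packaged inside the bundle at this path -->
            <Bundle-NativeCode>HelloCpp.dll</Bundle-NativeCode>
        </instructions>
    </configuration>
</plugin>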
I have copied the DLL into both the project directory and the karaf/lib folder.
I can successfully install the bundle and I don't receive any compilation errors either, but when I deploy my bundle into the Karaf container and try to start the bundle, I get this error message: No matching native libraries found.
Could you please help me solve the problem? Maybe I am not addressing the DLL correctly in the POM file.
Thanks in advance,
Mandana