I'm using some off-the-shelf OSGi bundles in my application and would like to repackage them together with additional packages that are not yet OSGi compatible into a new bundle.
Case in point is EclipseLink, which is available as several OSGi bundles, most of which are optional, depending on what you want to do. I want to pick those bundles that are relevant for me, add database drivers (for example the MySQL JDBC connector) and repackage them into a new bundle that is easier to deploy.
I'm using the maven-bundle-plugin from Apache Felix. I set up a new Maven project without source code, added the four EclipseLink bundles and the MySQL connector as dependencies and tried the following:
use the <Embed-Dependency> and <Embed-Transitive> instructions to include all dependencies in one bundle. Problem: optional dependencies of the EclipseLink bundles (for example, javax.mail.internet) become required, because the plugin rewrites the manifest. The original bundles mark these imports with "resolution:=optional" in their manifests and thus work fine without them.
use the manifest goal of the plugin together with a jar-with-dependencies assembly, but that gives me basically the same result, only with more work.
use the bundleall goal of the plugin, which is not quite what I want because it creates separate bundles again. Even worse, these bundles then don't have their dependencies inside.
I'm going to face similar issues with Struts 2. I'm not going to be obsessive about this, and just go with a whole bunch of separate third-party bundles, but if I can package them more neatly, I would really like to. I'm aware that a point of OSGi is modularity, so creating big bundles kind of defeats that, but I feel that if your modules are tightly coupled anyway, you might as well put them into a single bundle.
Of course, I could manually tweak the manifests, but I definitely don't want to.
As omerkudat says, this is probably not a practice to encourage, but since you have your reasons, here is a way you could do a poor man's merge.
Assuming you are handling the OSGi manifest yourself, you only really need to get all the classes from the bundles and jars into the target/classes directory before the package phase.
You can do this with either the dependency plugin's unpack-dependencies or unpack goal. I'd use unpack-dependencies if you want to process all the project dependencies (or those following a certain naming pattern or in a certain groupId), and the unpack goal if you want fine-grained control over the artifacts to be unpacked (at the expense of a more verbose POM). I'll assume unpack in my example. Each unpack is output to the project's output directory (i.e. target/classes).
Note this will overwrite duplicate files from each artifact in the order they are unpacked, so the manifests will clobber each other. To ensure your own files win, I would bind the unpack goal to an early phase so that your src/main/resources are copied on top of the unpacked contents rather than overwritten by them. In the sample below this phase is generate-resources, which runs before your resources are copied and before your own sources are compiled, so anything you define locally ends up on top. If you need the unpacked contents available even earlier, bind the goal to a preceding phase such as generate-sources.
My sample below unpacks the contents of junit 3.8.1 and commons-io 1.4 (just the first two dependencies I had declarations for) into target/classes before the project's resources are copied there. Note that the versions are taken from my dependencies section; if you haven't got the bundles/JARs declared as dependencies, you'll need to declare the version in each artifactItem as well.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack</id>
      <phase>generate-resources</phase>
      <goals>
        <goal>unpack</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <overWrite>false</overWrite>
            <outputDirectory>${project.build.outputDirectory}</outputDirectory>
          </artifactItem>
          <artifactItem>
            <groupId>commons-io</groupId>
            <artifactId>commons-io</artifactId>
            <overWrite>false</overWrite>
            <outputDirectory>${project.build.outputDirectory}</outputDirectory>
          </artifactItem>
        </artifactItems>
        <overWriteReleases>false</overWriteReleases>
        <overWriteSnapshots>true</overWriteSnapshots>
      </configuration>
    </execution>
  </executions>
</plugin>
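If you would rather process every project dependency (or everything in a certain groupId) instead of listing artifacts one by one, a minimal sketch of the unpack-dependencies variant could look like the one below; the groupId filter and the manifest exclusion are illustrative assumptions, not part of the original setup.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack-deps</id>
      <phase>generate-resources</phase>
      <goals>
        <goal>unpack-dependencies</goal>
      </goals>
      <configuration>
        <!-- illustrative: only unpack artifacts from these groupIds -->
        <includeGroupIds>org.eclipse.persistence,mysql</includeGroupIds>
        <!-- illustrative: drop the embedded manifests so they don't clobber your own -->
        <excludes>META-INF/MANIFEST.MF</excludes>
        <outputDirectory>${project.build.outputDirectory}</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>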
We have a Maven project with multiple compile dependencies, and every time a new <dependency> is added, we need to create an equivalent <weaveDependency> entry in:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>aspectj-maven-plugin</artifactId>
  <version>1.7</version>
  <configuration>
    <weaveDependencies>
      <weaveDependency>
        <groupId>a-group</groupId>
        <artifactId>new-dependency</artifactId>
      </weaveDependency>
    </weaveDependencies>
    <weaveDirectories>
      <weaveDirectory>${project.build.directory}/classes/</weaveDirectory>
    </weaveDirectories>
    <complianceLevel>${java.version}</complianceLevel>
    <showWeaveInfo>true</showWeaveInfo>
    <source>${java.version}</source>
    <target>${java.version}</target>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>
This is being done exactly as described in
http://mojo.codehaus.org/aspectj-maven-plugin/examples/weaveJars.html
But this could easily lead to problems if everything needs to be woven, because someone could forget to add the <weaveDependency> after adding a new <dependency>. So is there a way of detecting and weaving all compile dependencies automatically? Maybe with another plugin?
AFAIK there is no such option or plugin, unless you decide to write one or open a ticket for AspectJ Maven.
One question before we continue: Are you really sure you want to weave all dependencies? What about libraries such as JUnit or Log4J in your aspect POM?
The way I usually weave my aspects into the code - if they are production aspects rather than just development, debugging or profiling aspects, that is - is the other way around from yours: I use the AspectJ Maven Plugin in each of my modules to compile the aspects directly into my code from source. So in my case each Java module depends on an aspect module and uses it as an aspect library. Because I usually have far fewer aspect libraries than Java modules, I cannot so easily forget to include them. Okay, I have to do it in each module, but that is a no-brainer with a good IDE (global search and replace on all pom.xml files in my project).
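A minimal sketch of that per-module configuration is shown below; the my.group:my-aspects coordinates are an illustrative placeholder for your aspect module, which must also be declared as a regular dependency of the module being woven.
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>aspectj-maven-plugin</artifactId>
  <version>1.7</version>
  <configuration>
    <!-- weave the aspects from the aspect module into this module's own classes -->
    <aspectLibraries>
      <aspectLibrary>
        <groupId>my.group</groupId>
        <artifactId>my-aspects</artifactId>
      </aspectLibrary>
    </aspectLibraries>
    <complianceLevel>${java.version}</complianceLevel>
    <source>${java.version}</source>
    <target>${java.version}</target>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>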
If you really want to do it even more cleanly and nicely, you can use the approach explained in Strategies for using AspectJ in a Maven multi-module reactor, i.e. you create a normal root POM and an aspect root POM which has the root pom as its parent. Then each Java module which needs the aspects can use the aspect root POM as its parent, other Java modules use the root POM directly.
The advantage of compiling the aspects into your artifacts right away is that you do not end up with two versions of each artifact: an original without aspects and a woven version in the aspect module's target directory. The only reason not to do it the way I explained is if, for some reason, you also need artifact versions without aspect code. But then, as I said, you are probably using development, debugging or profiling aspects. Be that as it may, you can still use my approach for production aspects and your old approach for the development stuff.
I have a regular Maven JAR project which has dependencies such as the Reflections library, and I want to convert it to OSGi. What I've already done:
created a common interface layer in a separate Maven JAR project and added it to the bundle as a dependency.
changed the packaging of the OSGi-module-to-be to 'bundle'.
created an implementation of BundleActivator.
added this plugin to the POM:
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <version>2.3.7</version>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <Bundle-Activator>my.package.MyServiceActivator</Bundle-Activator>
      <Export-Package>
        my.package.exp.*
      </Export-Package>
      <Import-Package>
        !org.reflections,???
      </Import-Package>
      <Embed-Dependency>
        slf4j-api;scope=compile,???
      </Embed-Dependency>
    </instructions>
  </configuration>
</plugin>
Here is where I get lost: I need to figure out the "Import-Package" and "Embed-Dependency" instructions and, even more important, figure out how to deploy it on GlassFish as a ZIP or, maybe, via ORB (or Gogo), so that it is deployed with all its dependency JARs.
any ideas?
G.
BTW: the org.reflections package is not OSGi ready
It seems you're confused about how OSGi and the Maven Bundle plugin work.
Maybe reading the Felix guide will help you:
http://felix.apache.org/site/apache-felix-maven-bundle-plugin-bnd.html
Basically, you should have something like this:
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <Bundle-Activator>my.package.MyServiceActivator</Bundle-Activator>
      <Export-Package>
        my.package.exp.*
      </Export-Package>
      <Import-Package>
        !org.reflections*,*
      </Import-Package>
      <Embed-Dependency>
        org.reflections
      </Embed-Dependency>
    </instructions>
  </configuration>
</plugin>
By default (i.e. if you just omit it, which is usually the case), your Import-Package instruction is *, which means anything you refer to in your code that is not in java.* or in your own bundle must be imported. But as you have a dependency on a non-bundle JAR that you want to embed, you need to tell the plugin about it using the expression !org.reflections*,*, which means you don't want to import the org.reflections packages but everything else is fine. You also need to declare that the artifact org.reflections should be embedded in the JAR by using the Embed-Dependency instruction.
BTW, you most likely don't want to embed your SLF4J logging implementation, let alone the API, as just about any OSGi environment should provide a logging implementation for you.
After you package your bundle (mvn package or just mvn install), make sure to check the generated MANIFEST to ensure that everything looks correct (most importantly, check the Import-Package list and see whether your environment will have bundles that provide all of those packages).
Once you get your bundle set up correctly, deploying it is trivial. Just drop it into your framework's bundle directory, ensure all other bundles you need are also there, and everything should work fine.
As a side note, you might want to consider wrapping the non-bundle JAR you need as a bundle by using PAX-WRAP or just Karaf (just throw a JAR in the deploy folder and you will get it wrapped as an OSGi bundle immediately), for example.
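If you prefer to stay inside Maven instead, a rough sketch of such a wrapper is a small dedicated module (the instructions below are illustrative assumptions) with packaging 'bundle', whose only job is to embed the plain JAR and re-export its packages:
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <!-- inline the non-OSGi JAR's classes into this wrapper bundle -->
      <Embed-Dependency>org.reflections;inline=true</Embed-Dependency>
      <!-- and export its packages so other bundles can import them -->
      <Export-Package>org.reflections*</Export-Package>
    </instructions>
  </configuration>
</plugin>
Your own bundle can then simply import org.reflections instead of embedding it.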
I am trying to figure out how to aggregate my maven dependencies in a multi-module project. For example, if I have:
root pom/project1
root pom/project2
and I run mvn dependency:copy-dependencies, I end up with the dependencies in:
root pom/project1/target/dependency
root pom/project2/target/dependency
What I really want is that if I run the mvn command in the root pom folder, all of the dependencies to be copied to root pom/dependency. Is there a maven property that gets me the output directory of the root pom? (similar to ${project.build.directory})? I realize that I can just copy all the dependency folders to the same place after the fact, but I was hoping for something a little cleaner.
You will have to configure the dependency plugin to copy dependencies to a particular location. This can be done with the outputDirectory configuration property.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-dependencies</id>
      <phase>package</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <outputDirectory>${outputDir}</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
But if you are trying to do this for distribution, I'd recommend you create an assembly using the Maven Assembly Plugin.
The documentation says:
The Assembly Plugin for Maven 2.0 is primarily intended to allow users to aggregate the
project output along with its dependencies, modules, site documentation, and other files
into a single distributable archive.
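For example, a rough descriptor sketch along the following lines (the id, format and output directory are illustrative, and newer plugin versions may require useAllReactorProjects when run from the root POM) gathers every module's dependencies into a single dependency folder:
<assembly>
  <id>dependencies</id>
  <formats>
    <format>dir</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <moduleSets>
    <moduleSet>
      <!-- walk the reactor modules and collect their binaries plus dependencies -->
      <useAllReactorProjects>true</useAllReactorProjects>
      <binaries>
        <outputDirectory>dependency</outputDirectory>
        <unpack>false</unpack>
        <includeDependencies>true</includeDependencies>
      </binaries>
    </moduleSet>
  </moduleSets>
</assembly>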
What I really want is that if I run the mvn command in the root pom folder, all of the dependencies to be copied to root pom/dependency. Is there a maven property that gets me the output directory of the root pom? (similar to ${project.build.directory})?
No, because modules shouldn't actually be aware of that.
I realize that I can just copy all the dependency folders to the same place after the fact, but I was hoping for something a little cleaner.
The Maven way would be to use the Maven Assembly Plugin and a custom descriptor. But if you're not familiar with the Maven Assembly Plugin and its descriptor format, it won't be easy.
A less clean but easier approach would be to configure the Maven Dependency plugin to copy the dependencies into the parent project using a relative path. Something like this:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <version>2.1</version>
  <configuration>
    <outputDirectory>../root_pom/target/dependency</outputDirectory>
  </configuration>
</plugin>
But as I said, this introduces tight coupling between the modules and the root project, which is not good at all (and I wouldn't go further by including the goal invocation as part of the build; modules should remain independent and you should be able to build one module without checking out the parent).
If you have an existing Ant build file, what is the best way to convert the project to Maven? I've checked out things like fAnt, but if I'm going to mess with this stuff, I might as well go full-bore for Maven. I expected something to exist that could just start the pom.xml for me based on the existing build.xml, but I haven't found anything yet. Suggestions?
I don't know of any good automated way to do such a migration because things may just be too different, so I would do it manually, step by step, keeping the existing Ant build in parallel with the new one until the whole migration is done (from both a technical and a human point of view).
First, refactor the existing Ant build to align it with Maven conventions:
Make things modular: if your existing build is a big monolithic build producing several artifacts from a single source tree, break it down into separate modules, one for each artifact.
Update the directory structure: Maven comes with a standard directory layout and, while it is possible to customize this layout (i.e. to configure plugins for another layout), this is not really recommended and is more a source of trouble than benefit. So I'd move the existing app sources, configuration files, tests, etc. to match Maven's layout (e.g. src/main/java for application sources).
Then, start to create the Maven build:
Create POMs for each module: create a POM, declare external libraries as Maven dependencies (maybe add them to a corporate repository; using an enterprise repository is a good practice in an enterprise context anyway), and add dependencies between modules (a minimal sketch of such a module POM follows this list).
Finalize the multi-module build: add parent POM(s) and the inheritance/aggregation relationships. Test that there is no regression in the produced artifacts.
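Here is a minimal sketch of what one module's POM might look like at this stage; all coordinates and versions are illustrative.
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>com.mycompany</groupId>
    <artifactId>my-app-parent</artifactId>
    <version>1.0-SNAPSHOT</version>
  </parent>
  <artifactId>my-app-core</artifactId>
  <packaging>jar</packaging>
  <dependencies>
    <!-- an external library, now declared as a Maven dependency -->
    <dependency>
      <groupId>commons-io</groupId>
      <artifactId>commons-io</artifactId>
      <version>1.4</version>
    </dependency>
    <!-- a sibling module this one depends on -->
    <dependency>
      <groupId>com.mycompany</groupId>
      <artifactId>my-app-api</artifactId>
      <version>${project.version}</version>
    </dependency>
  </dependencies>
</project>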
You could do this work in a separate VCS branch if you don't want to change anything until the work is done and create scripts to move things. And when ready, merge the Maven specific stuff and apply the scripts.
You could run the Ant script from Maven with the maven-antrun-plugin. Your pom.xml would look something like this:
<project>
  ...
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-antrun-plugin</artifactId>
        <dependencies>
          <dependency>
            <groupId>org.apache.ant</groupId>
            <artifactId>ant-nodeps</artifactId>
            <version>${ant-nodeps.version}</version>
          </dependency>
        </dependencies>
        <executions>
          <execution>
            <id>init</id>
            <phase>compile</phase>
            <configuration>
              <tasks>
                <!-- Ant code goes here -->
              </tasks>
            </configuration>
            <goals>
              <goal>run</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
That way you can start to move your dependencies into Maven, and reference them in the Ant script like so
${com.foo.bar:my-lib:jar}
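For instance, inside the <tasks> block above you could use that property to copy the resolved artifact into a folder your Ant targets expect (the coordinates are the placeholder ones from the snippet above):
<tasks>
  <!-- copy the Maven-resolved dependency into a lib folder used by the Ant build -->
  <copy file="${com.foo.bar:my-lib:jar}" todir="${project.build.directory}/lib"/>
</tasks>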
Then just start slowly moving pieces of your Ant into pure Maven stuff.
I'm using Maven and its assembly plugin to build a distribution package of my project like this:
one project assembles a basic runtime (based on Felix), with the appropriate directories and bundles, in a ZIP file.
third-party libraries are collected in one project each and either converted to OSGi bundles or, if they are already OSGi compatible, they are just copied
my own project consists of several modules that are built into OSGi bundles, too.
Now I'm adding another project that unpacks the ZIP, drops all the other JARs into the proper directories, and repackages it for distribution. However, my bundles might contain configuration files that I want to merge into, rather than replace, identically named files in the runtime assembly. How do I do that?
The files are plain text (property files), but I might run into a similar situation with XML files later.
Expanding a bit on Juergen's answer for those who stumble on this: the containerDescriptorHandler in the descriptor can take four values (as of v2.3): metaInf-services, file-aggregator, plexus and metaInf-spring. It's a bit buried in the code (found in the package org.apache.maven.plugin.assembly.filter), but it is possible to aggregate config/properties files.
Here's an example descriptor that aggregates the META-INF/services files and a named properties file located in com/mycompany/actions.
descriptor.xml
<assembly>
  ...
  <containerDescriptorHandlers>
    <containerDescriptorHandler>
      <handlerName>metaInf-services</handlerName>
    </containerDescriptorHandler>
    <containerDescriptorHandler>
      <handlerName>file-aggregator</handlerName>
      <configuration>
        <filePattern>com/mycompany/actions/action.properties</filePattern>
        <outputPath>com/mycompany/actions/action.properties</outputPath>
      </configuration>
    </containerDescriptorHandler>
  </containerDescriptorHandlers>
  ...
</assembly>
The file-aggregator can contain a regular expression in the filePattern to match multiple files. The following would match all files named 'action.properties'.
<filePattern>.+/action.properties</filePattern>
The metaInf-services and metaInf-spring handlers are used for aggregating SPI and Spring config files respectively, whilst the plexus handler aggregates META-INF/plexus/components.xml files.
If you need something more specialised you can add your own configuration handler by implementing ContainerDescriptorHandler and defining the component in META-INF/plexus/components.xml. You can do this by creating an upstream project which has a dependency on maven-assembly-plugin and contains your custom handler. It might be possible to do this in the same project you're assembling but I didn't try that. Implementations of the handlers can be found in org.apache.maven.plugin.assembly.filter.* package of the assembly source code.
CustomHandler.java
package com.mycompany;
import org.apache.maven.plugin.assembly.filter.ContainerDescriptorHandler;
public class CustomHandler implements ContainerDescriptorHandler {
// body not shown
}
then define the component in /src/main/resources/META-INF/plexus/components.xml
components.xml
<?xml version='1.0' encoding='UTF-8'?>
<component-set>
  <components>
    <component>
      <role>org.apache.maven.plugin.assembly.filter.ContainerDescriptorHandler</role>
      <role-hint>custom-handler</role-hint>
      <implementation>com.mycompany.CustomHandler</implementation>
      <instantiation-strategy>per-lookup</instantiation-strategy>
    </component>
  </components>
</component-set>
Finally, you add this as a dependency of the assembly plugin in the project you wish to assemble:
pom.xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>2.2.1</version>
  <configuration>
    <descriptors>
      <descriptor>...</descriptor>
    </descriptors>
  </configuration>
  <dependencies>
    <dependency>
      <groupId>com.mycompany</groupId>
      <artifactId>sample-handler</artifactId>
      <version>1.0</version>
    </dependency>
  </dependencies>
</plugin>
and define the handlerName in the descriptor
descriptor.xml
...
<containerDescriptorHandler>
  <handlerName>custom-handler</handlerName>
</containerDescriptorHandler>
...
The maven-shade-plugin can also create 'uber-jars' and has some resource transformers for handling XML files, licences and manifests.
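As a rough sketch (the merged resource path is the illustrative one used earlier in this thread), its AppendingTransformer concatenates identically named resources while shading:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- appends every copy of this resource found across the shaded JARs -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>com/mycompany/actions/action.properties</resource>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>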
J
Old question, but I stumbled over it while trying to solve a similar problem: Assembly Plugin 2.2 has capabilities to merge files: http://maven.apache.org/plugins/maven-assembly-plugin/assembly.html#class_containerDescriptorHandler
e.g. the handlerNames "metaInf-services" (which concatenates all META-INF/services files) and "metaInf-spring" are the only ones I know of (I personally needed metaInf-services).
I don't know of a robust solution to this problem. But a bit of looking around shows that somebody has created a plugin to merge properties files. By the look of it you need to tell it which files to merge, which is a good thing, as you don't want this applied willy-nilly.
Assuming you have used dependency:unpack to unpack the ZIP to a known location, it would be a case of configuring the plugin to merge each pair of properties files and specifying the appropriate target location.
You could extend the plugin to handle XML by using something like xmlmerge from EL4J, as described in this Javaworld article.
I've also created a merge-files plugin; in my case I use it to merge SQL files from various projects into a single installer SQL file which can create all the schemas/tables/static data etc. for our apps: http://croche.googlecode.com/svn/docs/maven-merge-files-plugin/0.1/usage.html
https://github.com/rob19780114/merge-maven-plugin (available on maven central) also seems to do the job.
See below for an example configuration
<plugin>
  <groupId>org.zcore.maven</groupId>
  <artifactId>merge-maven-plugin</artifactId>
  <version>0.0.3</version>
  <executions>
    <execution>
      <id>merge</id>
      <phase>generate-resources</phase>
      <goals>
        <goal>merge</goal>
      </goals>
      <configuration>
        <mergers>
          <merger>
            <target>${project.build.outputDirectory}/output-file-1</target>
            <sources>
              <source>src/main/resources/file1</source>
              <source>src/main/resources/file2</source>
            </sources>
          </merger>
          <merger>
            <target>${project.build.outputDirectory}/output-file-2</target>
            <sources>
              <source>src/main/resources/file3</source>
              <source>src/main/resources/file4</source>
            </sources>
          </merger>
        </mergers>
      </configuration>
    </execution>
  </executions>
</plugin>