Maven2: Possible to deploy depending on artifact classifier? - maven-2

In fact I have 2 different problems, but I think they are kind of related:
I have an artifact, with an assembly descriptor set which will build an extra JAR (with extra classifier). By default, Maven2/3 will deploy the assembly generated together with the main artifact to remove Maven repository. Is there any way that I can deploy only the main artifact but not the assembly?
I have an artifact, in which I have jar plugin generate another artifact with different classifier (more specific, an EJB artifact, and I generate an client JAR). I want to deploy only the client JAR to Maven repo coz I think the main EJB artifact is not really going to be shared by other project. Is it possible to do so?
Thanks a lot
Edited to provide more info:
The reason for avoiding deploying the EJB is that the main EJB artifact is not going to be depended on by any project except the containing project. The containing project builds an EAR (which contains the EJB), and normally we only need that built locally (by mvn package). However, the EJB client is something that we will deploy to our repository so that other projects can use it when they need to communicate with our application.
Honestly it doesn't hurt to deploy the EJB too, but I just want to see if I can avoid wasting disk space on our repository.
Similarly for the assembly: the project itself is something we want to deploy so that other projects can depend on it. However, when building that project, we also have a separate assembly created at the same time (for example, an all-in-one executable JAR) which we only need built locally, and it is not something that other projects will depend on.

Turn off the 'attach' option on the assembly plugin. Then it won't officially be an artifact and it won't be deployed; it will just lurk in the target directory, sulking that you don't love it as much as its elder sibling, and plot revenge.
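A minimal sketch of what that might look like in the POM (the plugin version is omitted and the descriptor path is an assumption; adjust to your build):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <!-- don't attach the assembly to the project, so install/deploy ignore it -->
    <attach>false</attach>
    <descriptors>
      <descriptor>src/main/assembly/all-in-one.xml</descriptor>
    </descriptors>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>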

Based on your first question, I would like to know why you create the supplemental assembly, which is usually deployed along with the main artifact. If you want to prevent that, you can put the creation of the assembly into a profile, but this means you will not generate the supplemental artifact in your usual build, only when activating the profile.
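A rough sketch of that profile approach (the profile id and descriptor path are invented for illustration):

<profiles>
  <profile>
    <id>with-assembly</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-assembly-plugin</artifactId>
          <executions>
            <execution>
              <phase>package</phase>
              <goals>
                <goal>single</goal>
              </goals>
              <configuration>
                <descriptors>
                  <descriptor>src/main/assembly/all-in-one.xml</descriptor>
                </descriptors>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>

With this, mvn -Pwith-assembly package produces the extra JAR, while a plain mvn deploy skips it entirely.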

Related

IntelliJ, JRebel, Maven and a JEE 6 application

My setup is
IDE: IntelliJ
Application: JEE6 with an EAR and a WAR module
Build: Maven
Hot-Code-Replacement: JRebel
App-Server: Glassfish 3.1
I configured the application in IntelliJ in such a way that the EAR gets deployed. The EAR "target" folder looks like this:
target/classes/
target/appEar/appWeb-version-Snapshot.war/
target/appEar/lib/
target/appEar/META-INF
In the default configuration JRebel listens for changes in the classes/ folder.
When I change something in the web module, and build this, the classes are only updated in appWeb/target/classes/ but not in appEar/target/appEar/appWeb-version-Snapshot.war/.
If I want to update those classes I have to select "Build Artifacts" in IntelliJ after building the project.
To sum up, I have to do these steps for a hot code replacement:
(once) Configure JRebel correctly.
Make project
Build Artifacts
This whole procedure appears to be too complicated to me. Does anyone have a clue how to setup IntelliJ/Maven/Glassfish/JEE/JRebel correctly? I have not found an example containing all my tools. I'd like to have only one action for the code replacement, not two.
There's "build on make" checkbox in your project artifact settings, that will always recreate your artifact on compiling, if that's what you are looking for. However JRebel should remap where your application is reading class files and resources based on rebel.xml, so you probably should just rewrite rebel.xml to look for classes where they are compiled to, not where they end up after building the artifact.
Why do you need to Build Artifacts every time?
Your war should contain the rebel.xml that maps to the classes in /target/classes folder.
When you make changes to said classes, your server then knows to load the changes from those classes.
So you only need to build your project in order to see the changes assuming your rebel.xml classpath points to /target/classes.
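For illustration only, a rebel.xml along these lines (the module paths are assumptions, and the schema location may differ between JRebel versions) would point the deployed WAR back at the Maven output directories:

<application xmlns="http://www.zeroturnaround.com">
  <classpath>
    <!-- load classes straight from the web module's Maven output -->
    <dir name="C:/projects/app/appWeb/target/classes"/>
  </classpath>
  <web>
    <link target="/">
      <dir name="C:/projects/app/appWeb/src/main/webapp"/>
    </link>
  </web>
</application>

With that in place, a plain "Make project" in IntelliJ recompiles into target/classes and JRebel picks the changes up from there, without rebuilding the artifact.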

How to make a maven project buildable for the customer

We have a project which should be buildable by the customer using Maven. It has some open source dependencies that are mavenized (no problem), some that aren't mavenized, proprietary stuff (the Oracle JDBC driver) and some internal stuff.
Until now we had everything but the first category packaged with the project itself in a local repository (a repository with file://path-in-project-folder specified in the project's pom.xml).
We would love to move these out of the project, as we are about to use them in other projects as well. Currently we plan to use nexus as an internal maven repository.
What's the best practice to make such dependencies/Maven repositories available to the customer so he can continue to build the project?
Ideas so far:
The customer sets up a Nexus repository as well, and we somehow deploy all these non-public dependencies to his repository (like a mirror)
We provide a 'dumb' dump/snapshot of the non-public dependencies, and the customer adds this snapshot to his settings.xml as a repository (but how would that work?)
Make our internal Nexus repo available to the customer's build server (not an option in our case)
I'm wondering how others solve these problems.
Thank you!
Of course, hosting a repository of some kind is a straightforward option, as long as you can cover the uptime / bandwidth / authentication requirements.
If you're looking to ship physical artifacts, you'll find this pattern helpful: https://brettporter.wordpress.com/2009/06/10/a-maven-friendly-pattern-for-storing-dependencies-in-version-control/
That relies on the repository being created in source control - if you want a project to build a repository, consider something like: http://svn.apache.org/viewvc/incubator/npanday/trunk/dist/npanday-repository-builder/pom.xml?revision=1139488&view=markup (using the assembly plugin's capability to build a repository).
Basically, by building a repository you can ship that with the source code and use file:// to reference it from within the build.
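As a sketch, the shipped repository can then be referenced from the POM roughly like this (the lib-repo directory name is just an example):

<repositories>
  <repository>
    <id>project-local</id>
    <!-- repository directory shipped alongside the source code -->
    <url>file://${project.basedir}/lib-repo</url>
  </repository>
</repositories>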
There are two options:
Document exactly which artifacts you need to compile that are not available via Maven Central.
Implement Nexus, make an export with Nexus, give the export to the customer, and have them import it. I'm not sure whether you would run into licensing issues.
I assumed that you already have a repository manager, but it reads like you don't.
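For the first option, the customer can then install each documented artifact into his local repository by hand, roughly like this (the coordinates are placeholders):

mvn install:install-file -Dfile=ojdbc6.jar \
    -DgroupId=com.oracle -DartifactId=ojdbc6 \
    -Dversion=11.2.0.1 -Dpackaging=jar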

How to generate different deployables from the same Maven project?

I have a situation that I'm sure must be fairly common. I have some Maven-built applications that deploy to different types of application server - like Tomcat, JBoss, etc.
The build process 'tunes' the deployable artifact to the specific target type of application server (for example, different included dependencies, context roots, other config). This tuning is controlled with build profiles (-Ptomcat, -Pjboss etc).
So, for a given version of my application, I need to run builds that produce different deployables. I run mvn -Ptomcat clean package for example and I get an artifact in my /target directory that is the tomcat-tuned version.
The best approach I've been able to come up with so far is to specify finalName values for the artifacts that include the profile information, but with that approach I'm not sure how to configure Maven to copy the final artifact off to some specific location so that the next build for a different type doesn't overwrite it.
Is this a good approach? If so, how can I achieve that final copy?
Or is there a better way?
You'll need to use the Maven Assembly Plugin.
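One possible shape for this (the server.type property and descriptor paths are assumptions, not something from the original answer): let each profile set a property, and let the assembly descriptor's id act as the classifier, so the tomcat and jboss artifacts get distinct file names and no longer overwrite each other in /target.

<profiles>
  <profile>
    <id>tomcat</id>
    <properties>
      <server.type>tomcat</server.type>
    </properties>
  </profile>
  <profile>
    <id>jboss</id>
    <properties>
      <server.type>jboss</server.type>
    </properties>
  </profile>
</profiles>

<!-- in <build><plugins> -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <!-- each descriptor's <id> (e.g. "tomcat", "jboss") becomes the classifier -->
    <descriptors>
      <descriptor>src/main/assembly/${server.type}.xml</descriptor>
    </descriptors>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>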

How does the maven file structure work?

We are planning on restructuring a complex project with many modules/pieces, whatever you want to call them. In order to move toward a standard directory structure, we would like to adopt the Maven file structure.
So the big question is: Can anybody provide a description of the maven file structure, where we don't have to dig through all the maven speak?
Please see
http://maven.apache.org/guides/introduction/introduction-to-the-standard-directory-layout.html
src/main/java Application/Library sources
src/main/resources Application/Library resources
src/main/filters Resource filter files
src/main/assembly Assembly descriptors
src/main/config Configuration files
src/main/webapp Web application sources
src/test/java Test sources
src/test/resources Test resources
src/test/filters Test resource filter files
src/site Site
LICENSE.txt Project's license
README.txt Project's readme
BTW, we did that migration on existing projects.
It was a really long and hard task to make everything work as intended, but we are finally done and happy with it.
UPDATED
When you have many projects, you have the same structure for each project.
Now the real problem starts when you want to group them. We had a hard time reading Maven documentation and best-practices, and deciding what was the appropriate structure for us.
The basic idea would be to group related projects in a common directory (that we call a module), allowing us to process the module as a whole without listing each project. But if you open the module in an IDE (Eclipse in our case), the projects themselves belong to it but are not opened as subprojects (that notion doesn't exist in Eclipse).
We ended up with a strict hierarchy, that freed us from many maven problems:
The actual coding projects (Java projects) are always leaves in our directory tree. They are the only ones we open in the IDE. They are of type JAR or WAR.
Their parents/modules are always of type POM. They have no java code.
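A minimal sketch of such a hierarchy (artifact ids invented for illustration): the parent/module POM only aggregates, the leaves hold the code.

<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>billing</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>
  <modules>
    <module>billing-core</module>  <!-- JAR leaf -->
    <module>billing-web</module>   <!-- WAR leaf -->
  </modules>
</project>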
I've been using the same approach as Jens on a number of projects both with Maven 2.2.1 and now with Maven 3.0-alpha-6: POM modules define the module structure of your project tree, JAR/WAR modules are the leaves of the tree. All modules have the same version.
Advantages:
You can place properties or dependencies on specific levels in the module hierarchy and they will be inherited by all sub-modules (a sketch follows after the disadvantages below).
You can build related modules simply by going to the appropriate level in the tree and running "mvn install" - Maven will work out the correct build order accordingly.
Various Maven plugins such as the release plugin rely on this tree structure.
The latest Maven Eclipse plugin can handle this structure very well and will represent the tree as a flat list. There is an experimental feature in the plugin which ensures that so-called "shadowed" artifacts appear only once, which helps when searching for resources in Eclipse.
Disadvantages:
Extension takes some time. For instance, if you decide that a JAR module requires sub-modules, you will need to convert the existing JAR module into a POM module and then distribute its contents to the newly created JAR sub-modules as POM modules cannot contain any code themselves.
All the POM modules will appear in Eclipse and can slow down the build somewhat. However, you can close them and Eclipse will source them from the repository instead.
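To make the inheritance advantage above concrete, a POM module might centralize versions roughly like this (the coordinates are chosen only as an example); the JAR/WAR leaves below it then declare the dependency without a version:

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.8.2</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
</dependencyManagement>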

Maven: local development deploy vs bundling for distribution

Bear with me, I'm migrating from Ant to Maven2: I think I've hit one of those little things that was easy in Ant, but not so in Maven...
How do I handle the difference between a local deployment vs. creating an archive/bundle for distribution to another machine?
Let's assume my project's output is an EAR plus some additional config files. A developer that is actively working on the project will need to deploy and re-deploy frequently to his local app-server (say JBoss), while an Integration Engineer that is building for QA/production will need only to create the final archive assembly (tar/gz).
In Ant we had two targets for this: "dev-deploy" and "bundle". Both do a complete build, but differ in the final step: "dev-deploy" copies the EAR and config files to the respective local folders, while "bundle" just puts the EAR & config files in a tar.gz assembly.
How do you do this in Maven?
I've seen that the assembly plugin can create either archives (tar, gz, etc.) or exploded directories (from the same assembly descriptor). I can invoke either assembly:assembly or assembly:directory, but for the latter, how do I copy the final output to the local JBoss deployment folders? From a related post it seems that ad-hoc copying of files is not really what Maven is about, so an antrun copy is probably the most appropriate?
Finally, since the type of assembly may differ depending on who invokes it, it doesn't seem wise to bind assembly to the build lifecycle, not so? But this means that a developer will always need to invoke 'mvn package' followed by 'mvn assembly:directory' to rebuild and test a change. Conversely, an Integration Engineer will always need to run 'mvn package' followed by 'mvn assembly:assembly' to create the distributable archive. I was hoping for a one-command solution for each, or should I just script it?
In Ant we had two targets for this: "dev-deploy" and "bundle". Both do a complete build, but differ in the final step: "dev-deploy" copies the EAR and config files to the respective local folders, while "bundle" just puts the EAR & config files in a tar.gz assembly.
Not sure what you mean by "respective local folders" for "dev-deploy", but this sounds like what mvn package does, and "bundle" indeed sounds like a Maven assembly.
I've seen that the assembly plugin can create either archives (tar, gz, etc.) or exploded directories (from the same assembly descriptor). I can invoke either assembly:assembly or assembly:directory, but for the latter, how do I copy the final output to the local JBoss deployment folders? From a related post it seems that ad-hoc copying of files is not really what Maven is about, so an antrun copy is probably the most appropriate?
I guess that we are talking about the Integration Engineer's tasks here. As you didn't explain exactly what the "bundle" contains, what the target application server is (my understanding is that you are using JBoss for QA/production too, but again, this is a guess), or whether this bundle has to be deployed automatically, it's hard to enumerate all solutions and/or alternatives to antrun. But indeed, to copy/move/unzip/whatever the assembly, the Maven AntRun plugin is a candidate.
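For the developer's local deploy, such an AntRun copy might look roughly like this (the JBoss deploy path and the phase binding are placeholders):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <executions>
    <execution>
      <id>dev-deploy</id>
      <phase>package</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <tasks>
          <!-- copy the freshly built EAR into the local JBoss deploy folder -->
          <copy file="${project.build.directory}/${project.build.finalName}.ear"
                todir="/opt/jboss/server/default/deploy"/>
        </tasks>
      </configuration>
    </execution>
  </executions>
</plugin>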
Finally, since the type of assembly may differ depending on who invokes it, it doesn't seem wise to bind assembly to the build lifecycle, not so? But this means that a developer will always need to invoke 'mvn package' followed by 'mvn assembly:directory' to rebuild and test a change. Conversely, an Integration Engineer will always need to run 'mvn package' followed by 'mvn assembly:assembly' to create the distributable archive. I was hoping for a one-command solution for each, or should I just script it?
My understanding was that the Integration Engineer was building the bundle. Why would a developer need the bundle? This is confusing... Anyway, I don't really need the details to think of an answer. You could actually declare the Maven Assembly plugin in specific build profiles, one for development and one for integration, and bind either the single or the directory-single mojo to the project's build lifecycle in each profile. This would allow you to use only one command each and avoid any scripting (really, don't go the scripting way).
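A hedged sketch of those two profiles (the ids and descriptor path are invented); mvn -Pdev package would then produce the exploded directory and mvn -Pintegration package the archive:

<profiles>
  <profile>
    <id>dev</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-assembly-plugin</artifactId>
          <executions>
            <execution>
              <phase>package</phase>
              <goals>
                <goal>directory-single</goal>
              </goals>
              <configuration>
                <descriptors>
                  <descriptor>src/main/assembly/bundle.xml</descriptor>
                </descriptors>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
  <profile>
    <id>integration</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-assembly-plugin</artifactId>
          <executions>
            <execution>
              <phase>package</phase>
              <goals>
                <goal>single</goal>
              </goals>
              <configuration>
                <descriptors>
                  <descriptor>src/main/assembly/bundle.xml</descriptor>
                </descriptors>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>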