Summary: I'm looking for a way to instruct Maven to search for dependencies in target/classes instead of the jar in the local repository.
Say I have 2 modules, A and B, where A depends on B. Both are listed in a module S. Normally I need to run 'mvn install' in S. I'm looking for a way to run 'mvn compile' so that when A is compiled, its classpath contains ../B/target/classes instead of ~/.m2/repository/com/company/b/1.0/b-1.0.jar.
(My reason is to get continuous compilation without having to go through packaging and installation, or, more exactly, to use 'mvn scala:cc' on multiple modules.)
I don't think this is possible without horrible hacking; this is just not how Maven works. Maven uses binary dependencies and needs a local repository to resolve them. So the Maven way to handle this is to launch a reactor build on all modules. Just in case, have a look at Maven Tips and Tricks: Advanced Reactor Options.
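For example, a rough sketch of what a reactor build and the Advanced Reactor Options look like on the command line (assuming A's artifactId is simply 'a'; adjust to your real coordinates):

# launch a reactor build from the aggregator module S so that A and B
# are built together in one session
mvn compile

# Advanced Reactor Options (Maven 2.1+): build only A plus the modules
# it depends on within the reactor
mvn compile -pl a -am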
But, during development, can't you just import all your projects into your IDE and use "project references" (i.e. configure your projects to depend on source code instead of a JAR), as most Java developers do? This is the common approach to avoid having to install an artifact to "see" the modifications.
If this is not possible and if you really don't want to install artifacts into your local repository, then you'll have to move your code into a single module.
I know this is annoying. What helped me here is definitely IDE support: Eclipse and IntelliJ are clever enough to collect all dependencies once a Maven project import is done, and even cross-module dependencies are compiled live.
To sum up the components and environment:
multi-project setup; typically each Gradle project lives solely in a separate Git repository
you don't want to use Git submodules
Gradle init scripts live in a separate config / super repository
using the Gradle wrapper
for the GUI folks: IntelliJ IDEA with Gradle integration -> help
allowed to use 'gradle idea' -> guide
So,
Q: How can I elegantly marry these components? How can I define an init script to be used by the wrapper of a single repository without affecting other repositories?
I know:
init scripts typically live in a "GRADLE_HOME" directory
init scripts can be passed per invocation on the command line via -I
(yes, I read the documentation 😅 )
Problems found:
IntelliJ doesn't allow defining the -I option in the UI
everyone needs to check out and update a separate repository if you want to share settings between projects
neither settings.gradle nor gradle.properties seems to support such an option either
Constraints:
(while these are possible answers, they are neither elegant nor fault-proof)
the desired solution should be applicable to SINGLE projects and should not be applied globally to all projects on the same computer
Hidden Questions:
can I include global Gradle settings from a URL so no one needs a clone of the meta-repo?
does a URL include do the same as an init script? Or what can you do with an init script that you can't with an include?
You can do the following:
Create a custom Gradle distribution with the common settings defined in the init script
Configure your projects to use that distribution through the distributionUrl key in gradle/wrapper/gradle-wrapper.properties
Use a regular Gradle build from the command line / the usual import into IntelliJ - it just works
By the way, there is a Gradle plugin for simplifying the construction of custom Gradle distributions
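As a rough sketch, once the custom distribution zip is hosted somewhere, pointing a single repository's wrapper at it could look like this (the URL is a made-up placeholder, not a real distribution):

# regenerate this repository's wrapper so it downloads the custom
# distribution that bundles the shared init script
./gradlew wrapper --gradle-distribution-url \
    https://repo.example.com/gradle/custom-gradle-7.6-bin.zip

The change lands in gradle/wrapper/gradle-wrapper.properties of that one repository, so other repositories on the same machine are unaffected, and IntelliJ's wrapper-based import picks it up automatically.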
You can use the buildSrc customization - depending on what you need -
where buildSrc/build.gradle takes effect prior to the configuration phase of your project.
What you should know is that it has a different scope, i.e. an allprojects block in buildSrc/build.gradle applies to any project beneath buildSrc, not to your normal projects.
More generally speaking: buildSrc/build.gradle is like what you would normally do in buildscript blocks or task declarations in script plugins, and you can write clean plugin code without publishing it as a plugin.
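A minimal sketch of the layout, assuming a Groovy-based build (the setup below is illustrative, not the only way to do it):

# buildSrc sits next to settings.gradle and is compiled before the
# main build is configured
mkdir -p buildSrc/src/main/groovy
cat > buildSrc/build.gradle <<'EOF'
// configures the buildSrc build itself, not your normal projects
apply plugin: 'groovy'
repositories { mavenCentral() }
dependencies {
    implementation gradleApi()   // needed to write plugin/task classes
    implementation localGroovy()
}
EOF

Any plugin or helper classes you put under buildSrc/src/main/groovy are then available on the build script classpath of every project in that repository.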
⚠️ Limitations:
you can't take care of plugin resolution - for that you still have to go into your project's settings.gradle
you can't change dependency management for your projects - you still have to do this in the projects' own build scripts
for both, see How can the gradle plugin repository be changed?
you still have to apply plugins (even those homed in buildSrc) in your projects (which is a good thing if you ask me, because it's more visible / clear what happens)
you can't share this with a second repository - not without using git submodules, etc.
I'm developing an application with JOGL2 and my favorite IDE, Eclipse, and I want to use Maven2 for this purpose. Unfortunately, JOGL2 has no Maven artifact yet. Also, I plan to deploy the application as a runnable jar file.
So I want to install the JOGL artifacts locally: I'll use the install:install-file command.
But I want to group several jars to make several artifacts, that is:
gluegen-rt.jar and jogl.all.jar as a single artifact named jogl.core
gluegen-rt-natives-linux-i586.jar and jogl.all-natives-linux-i586.jar as a single artifact named jogl-natives-linux-i586
and so on
Is it possible? (The official documentation does not say whether or not this can be done.)
Thanks in advance
Install all the files as usual, each under its own groupId:artifactId:version. Then create a POM with pom packaging and declare gluegen-rt and jogl.all as dependencies in it (they must already be installed). After that, use the new POM as a dependency in your project.
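A sketch of what that could look like on the command line (the groupId, artifactIds and versions below are made up; pick whatever coordinates suit you):

# install the individual jars first
mvn install:install-file -Dfile=gluegen-rt.jar \
    -DgroupId=com.example.jogl -DartifactId=gluegen-rt -Dversion=2.0 -Dpackaging=jar
mvn install:install-file -Dfile=jogl.all.jar \
    -DgroupId=com.example.jogl -DartifactId=jogl-all -Dversion=2.0 -Dpackaging=jar

# jogl-core.pom groups them: packaging "pom", the two jars as dependencies
cat > jogl-core.pom <<'EOF'
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example.jogl</groupId>
  <artifactId>jogl-core</artifactId>
  <version>2.0</version>
  <packaging>pom</packaging>
  <dependencies>
    <dependency>
      <groupId>com.example.jogl</groupId>
      <artifactId>gluegen-rt</artifactId>
      <version>2.0</version>
    </dependency>
    <dependency>
      <groupId>com.example.jogl</groupId>
      <artifactId>jogl-all</artifactId>
      <version>2.0</version>
    </dependency>
  </dependencies>
</project>
EOF
mvn install:install-file -Dfile=jogl-core.pom -Dpackaging=pom \
    -DgroupId=com.example.jogl -DartifactId=jogl-core -Dversion=2.0

Your project then depends on com.example.jogl:jogl-core:2.0 with <type>pom</type>, and the grouped jars come in transitively.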
Maven doesn't have support for that. You would have to unpack these JAR files and repackage them together.
Maven does have support for merging a JAR with its dependencies (http://stackoverflow.com/questions/574594) - and it's done the way I mentioned above. But you are asking about merging two arbitrary JARs, which is not possible in Maven.
I have a lot of jobs on Hudson, most of which are really small and consist of just a few modules. But one is big and consists of several modules.
Whenever I commit to our Subversion repository for any of those modules in the big job, Hudson builds the entire job instead of just the module that has changed.
It doesn't matter whether I use SCM polling or a Subversion hook; the result is the same.
It seems to me that it would be better if the modules were built instead of the jobs, since the other modules in other jobs depend on the modules and not on the jobs.
Can this be configured, or do I have to create several jobs instead of the big one? And if so, can I configure the big job to never build when one of its modules is triggered, but still build when its own pom.xml is changed?
Thanks.
Hudson has an "Incremental Build" option in the Maven area of the job configuration.
It's hidden in the "Advanced" area.
You could make use of the reactor plugin. For example:
mvn reactor:make-scm-changes
This will only build those modules that have been changed in the SCM. Follow the link for other examples.
Doesn't your compiler offer you an incremental compile option? The Java 1.6 compiler usually looks at both class and source files and uses their timestamps to decide whether to use the source or the class file. Just leave out the clean goal when building your code.
Another option would be to first run a batch/shell script to determine which files changed and delete the corresponding class files, so that the compiler incrementally rebuilds only the class files that are missing.
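In other words, a trivial sketch:

# recompiles only sources that are newer than their class files
mvn compile
# wipes target/ first and forces a full recompile
mvn clean compile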
Suppose I have an open-source project that depends on some library that must be patched in order to fix some issues. How do I do that? My ideas are:
Have the library sources set up as a module and keep them in my VCS. Pros: simple. Cons: third-party sources in my repo, might slow down the build process, hard to find the patched places (though this can be fixed in a README).
Have a module like in 1, but keep only the patched source files, compile them with the original library jar on the classpath, and somehow replace the *.class files in the library jar during the build. Pros: builds faster, easy to find the patched places. Cons: hard to configure, and that jar hackery is non-obvious (the library jar in the repository and in my project assembly would be different).
Keep the patched *.class files in main/resources and replace them on packaging as in 2. Pros: almost none. Cons: binaries in VCS, hard to recompile a patched class since patch compilation is not automated.
One nice solution is to create a distinct project with the patched library sources and deploy it to a local/enterprise repository with a -patched qualifier. But that would not fit an open-source project that is meant to be easily buildable by anyone who checks out its sources. Or should I just say "and also, before you build my project, please check out that stuff and run mvn install"?
One nice solution is to create a distinct project with the patched library sources and deploy it to a local/enterprise repository with a -patched qualifier. But that would not fit an open-source project that is meant to be easily buildable by anyone who checks out its sources. Or should I just say "and also, before you build my project, please check out that stuff and run mvn install"?
This is what I would do (and actually what I do) for both corporate and open-source projects: get the sources, put them under version control in a distinct project, patch them, rebuild the patched library (and include this information in the version, something like X.Y.Z-patched), deploy it to a repository (you could use SVN for this, a la Google Code1), declare the repository in your POM and update the dependency to point at your patched version.
With this approach, you can tell your users: check out my code and run mvn install, and they will just get the patched version without any extra action. This is IMHO the cleanest way (not error-prone, no classpath order mess, no increase of the build time, etc.).
1 Lots of people deploy their code to their hosted Subversion repository (see the how-to in this post).
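A sketch of the deploy step (file name, coordinates and repository path are placeholders):

# publish the rebuilt library under a -patched version into a file-based
# repository, e.g. a local checkout of the repository you host in SVN
mvn deploy:deploy-file \
    -Dfile=somelib-1.2.3-patched.jar \
    -DgroupId=org.example -DartifactId=somelib -Dversion=1.2.3-patched \
    -Dpackaging=jar \
    -Durl=file:///path/to/checkout/of/maven-repo
# then commit the generated files so your users can resolve them

Your POM then declares that repository in <repositories> and depends on org.example:somelib:1.2.3-patched.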
One nice solution is to create a distinct project with the patched library sources and deploy it to a local/enterprise repository with a -patched qualifier. But that would not fit an open-source project that is meant to be easily buildable by anyone who checks out its sources. Or should I just say "and also, before you build my project, please check out that stuff and run mvn install"?
I'd agree with this and with Pascal's answer. Some additional notes:
you may use dependency:unpack on the original artifact and then combine that with your compiled classes if you don't want to rebuild the whole dependent project (see the sketch after this list)
in either case, your pom.xml will need to correctly represent the dependencies of that library
you can still integrate this as part of your project's build to avoid the 'deploy to a repository' step
make sure you honour the constraints of the project's license when doing all this!
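For the first note, one possible (untested) command-line wiring, assuming the original library's artifactId is somelib:

# unpack the original library's classes into target/classes, then build so
# the patched, recompiled classes overwrite only the ones they replace
mvn dependency:unpack-dependencies \
    -DincludeArtifactIds=somelib \
    -DoutputDirectory=target/classes
mvn package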
I'm currently working on two projects simultaneously:
My main project (built with Maven)
A spike of an open-source project, which my main project depends on (not built with Maven)
How do I set up Maven to use the OSS project as a dependency with the least amount of friction, given that I'm often developing the two in tandem?
I can think of several solutions:
Mavenize the existing OSS project. This is of course the "ideal" option but often not feasible (even if you introduce the new build system in parallel with the existing one). The project likely has an existing structure that differs from Maven's standard layout. Changing the existing layout and build script may not be desired by its developers, and adapting a Maven build to use a non-standard layout can be painful. In both cases, you're screwed.
Wrap the existing Ant build with Maven. This can be nice if you want to include the build of the OSS project in the lifecycle of your project and have both of them built in one step. You can check this answer on SO for details on how to do this.
Use Apache Ivy or the Maven Ant Tasks in the existing build to produce and install a Maven artifact in your local repository. Use this artifact as a regular dependency in your Maven project (except that you'll have to declare its transitive dependencies manually). This is maybe the quickest and least intrusive approach if building both projects separately is not a problem.
It looks like you chose option 3. I think it's a good choice for a quick win.
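If you'd rather not touch the Ant build at all, a rough command-line take on option 3 is to install the jar that the existing build already produces (path and coordinates below are assumptions):

# install the jar built by the OSS project's own build into the local
# repository so the Maven project can depend on it as usual
mvn install:install-file \
    -Dfile=../oss-project/dist/oss-project.jar \
    -DgroupId=org.example.oss -DartifactId=oss-project \
    -Dversion=1.0-SNAPSHOT -Dpackaging=jar -DgeneratePom=true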
The solution I've used is the Maven Ant Tasks (http://maven.apache.org/ant-tasks/).
I added an install task to the build.xml file, which installs the compiled .jar into the local repo.
While adding a full-fledged POM to the project would definitely be the best approach, that is a major chunk of work and inflicts Maven on the project (where the other users would prefer not to use it).
I think you probably need to bite the bullet and set up a POM for your OSS project tree. This is the painful part (you will need to hunt down the details of specifying resource paths for the various plugins involved, depending on the OSS app type, i.e. web, etc.). The good news is that this is a one-time effort.
Once that is done, your main project can refer to the (wrapped) OSS project as a dependency. Here a (standard Maven) multi-project structure would apply.
If the OSS project has dependencies, create a POM with those dependencies (your project will use them as transitive dependencies) and install that artifact and POM into your local repository. If the OSS project doesn't have any other dependencies, it's even simpler: the POM is generated automatically during installation.
In both cases, use the maven-install-plugin:
mvn install:install-file -Dfile=your-artifact-1.0.jar \
[-DpomFile=your-pom.xml] \
[-Dsources=src.jar] \
[-Djavadoc=apidocs.jar] \
[-DgroupId=org.some.group] \
[-DartifactId=your-artifact] \
[-Dversion=1.0] \
[-Dpackaging=jar] \
[-Dclassifier=sources] \
[-DgeneratePom=true] \
[-DcreateChecksum=true]