A tool to manage source libraries in a project (without submodules)? - automation

Is there a command-line tool to automate maintaining a collection of source libraries and applying them to projects?
For gamejams (or possibly also longer-term projects), I don't want to host my own clone of each library I use to be able to modify it (and grant each member of my team permission on each clone). Instead, I want all of my source code inside my project source tree -- which means I can't use git submodules.
So say I have a project "puppypark" and I want to use some libraries "baton" and "windfield".
I'm looking for this kind of workflow:
register libraries:
librarian add baton git://lib.com/baton.git
(clones to a central repository, e.g., in ~/.librarian/)
add libraries to a project:
librarian use baton puppypark ./src/foreign/
(creates a branch off master for that project in the central repository)
merge project changes into the central repository:
librarian apply puppypark
(switches to the project's branch and copies changes into the clone)
merge upstream changes into the project:
librarian apply baton puppypark
(takes the current state of the project's branch and copies it to the project)
In all cases: no action occurs unless the target repository has a clean workspace, every action results in a commit summarizing what was done, and commits always have a commit id (SHA).
This could be independent of my version control system, but if it worked via git, that's great too.
I think this is kinda like Carthage (but for source instead of built libraries, and I'm not interested in dependency management). Maybe I can achieve some of this with git-subtree, but I don't understand how. I think this is like loverboy, but I'm interested in a more general solution.
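For context, the closest git-subtree approximation of this workflow that I can piece together looks something like this (the paths and branch name are invented):
# vendor the library into the project tree, squashing upstream history
git subtree add --prefix=src/foreign/baton git://lib.com/baton.git master --squash
# later, merge upstream changes into the project
git subtree pull --prefix=src/foreign/baton git://lib.com/baton.git master --squash
# push local edits under src/foreign/baton back to a branch of the clone
git subtree push --prefix=src/foreign/baton git://lib.com/baton.git puppypark-changes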

I couldn't find a solution, so I wrote my own: librarian automates copying modules to and from projects. Unlike loverboy, you give it general rules for how to copy modules instead of requiring rules for each module.
Example usage to set up love2d defaults and import windfield into the project "puppypark":
librarian config love --path src/lib/ --root-marker init.lua --rename-single-file-root-marker ".*.lua" --include-pattern ".*.lua|LICENSE.*"
librarian acquire love windfield https://github.com/adnzzzzZ/windfield.git
librarian checkout puppypark windfield
librarian checkin puppypark windfield

Related

How to make a maven project buildable for the customer

We have a project which should be buildable by the customer using Maven. It has some open-source dependencies that are mavenized (no problem), some that aren't mavenized, proprietary stuff (Oracle JDBC driver), and some internal stuff.
Until now we had everything but the first category packaged with the project itself in a local repository (a repository with a file://path-in-project-folder URL specified in the project's pom.xml).
We would love to move these out of the project, as we are about to use them in other projects as well. Currently we plan to use Nexus as an internal Maven repository.
What's the best practice for making such dependencies/Maven repositories available to the customer so they can continue to build the project?
Ideas so far:
The customer sets up a Nexus repository as well, and we somehow deploy all these non-public dependencies to their repository (like a mirror)
We provide a 'dumb' dump/snapshot of the non-public dependencies; the customer adds this snapshot to their settings.xml as a repository (but how is this possible?)
Make our internal Nexus repo available to the customer's build server (not an option in our case)
I'm wondering how others solve these problems.
Thank you!
Of course, hosting a repository of some kind is a straightforward option, as long as you can cover the uptime / bandwidth / authentication requirements.
If you're looking to ship physical artifacts, you'll find this pattern helpful: https://brettporter.wordpress.com/2009/06/10/a-maven-friendly-pattern-for-storing-dependencies-in-version-control/
That relies on the repository being created in source control - if you want a project to build a repository, consider something like: http://svn.apache.org/viewvc/incubator/npanday/trunk/dist/npanday-repository-builder/pom.xml?revision=1139488&view=markup (using the assembly plugin's capability to build a repository).
Basically, by building a repository you can ship that with the source code and use file:// to reference it from within the build.
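For illustration, referencing such an in-project repository from the POM looks roughly like this (the id and path are made up):
<!-- sketch: a repository committed alongside the sources; id/path are arbitrary -->
<repositories>
  <repository>
    <id>in-project</id>
    <name>In-project repository</name>
    <url>file://${project.basedir}/repo</url>
  </repository>
</repositories>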
There are two options:
Document exactly which artifacts you need for the build that are not available via Maven Central.
Set up Nexus, create an export of the non-public dependencies, and hand that export to the customer, who then imports it into their own repository manager. I'm not sure whether you would run into licensing issues.
I assumed you already had a repository manager, but it reads like you don't.
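With the first option, the customer would then install each documented artifact by hand, along these lines (the coordinates are invented):
mvn install:install-file -Dfile=ojdbc6.jar -DgroupId=com.oracle \
    -DartifactId=ojdbc6 -Dversion=11.2.0 -Dpackaging=jar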

How can I tell Hudson to build the modules instead of the jobs?

I have a lot of jobs on Hudson, most of which are really small and consist of just a few modules. But one is big and consists of several modules.
Whenever I commit to our Subversion repository for any of those modules in that big job, Hudson builds the entire job instead of just the module that has changed.
It doesn't matter whether I use SCM polling or a Subversion hook; the result is the same.
It seems to me it would be better if the modules were built instead of the jobs, since the modules in other jobs depend on the modules, not on the jobs.
Can this be configured, or do I have to create several jobs instead of the big one? And if so, can I configure the big job to never build when any of its modules is triggered, but still build when its own pom.xml changes?
Thanks.
Hudson has an "Incremental Build" option in the Maven area of the job configuration.
It's hidden in the "Advanced" area.
You could make use of the reactor plugin. For example:
mvn reactor:make-scm-changes
This will only build those modules that have been changed in the SCM. Follow the link for other examples.
Doesn't your compiler offer an incremental compile option? The Java 1.6 compiler searches for class and source files and uses timestamps to decide whether to use the source or the class file. Just leave out the clean goal when building your code.
Another option would be to first run a batch/shell script that determines which files changed and deletes the corresponding class files, so that the compiler incrementally rebuilds just the missing class files; see the sketch below.
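A minimal sketch of that script idea, assuming a standard Maven layout and Subversion (it ignores inner and nested classes):
# delete the class file for each locally modified Java source,
# so the next (non-clean) build recompiles only those
svn status | awk '$1 == "M" { print $2 }' | grep '\.java$' | while read src; do
    rel="${src#src/main/java/}"
    rm -f "target/classes/${rel%.java}.class"
done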

A layout for maven project with a patched dependency

Suppose I have an open-source project that depends on some library that must be patched to fix some issues. How do I do that? My ideas are:
Set the library's sources up as a module and keep them in my VCS. Pros: simple. Cons: third-party sources in my repo, might slow down the build, hard to find the patched places (though a README can fix that).
Have a module like in 1, but keep only the patched source files, compile them with the original library jar on the classpath, and somehow replace the *.class files in the library jar at build time. Pros: builds faster, easy to find the patched places. Cons: hard to configure, and that jar hackery is non-obvious (the library jar in the repository and in my project assembly would differ).
Keep the patched *.class files in main/resources and replace them at packaging time, as in 2. Pros: almost none. Cons: binaries in the VCS, and it's hard to recompile a patched class since patch compilation is not automated.
One nice solution is to create a distinct project with the patched library sources and deploy it to a local/enterprise repository with a -patched qualifier. But that doesn't fit an open-source project that is meant to be easily buildable by anyone who checks out its sources. Or should I just say 'and also, before you build my project, please check out that stuff and run mvn install'?
Creating a distinct project with the patched sources and deploying it with a -patched qualifier is what I would do (and actually what I do) for both corporate and open-source projects. Get the sources, put them under version control in a distinct project, patch them, rebuild the patched library (and encode this in the version, something like X.Y.Z-patched), deploy it to a repository (you could use SVN for this, à la Google Code [1]), declare the repository in your POM, and update the dependency to point at your patched version.
With this approach, you can tell your users: check out my code and run mvn install, and they will get the patched version without any extra action. This is IMHO the cleanest way (not error prone, no classpath-order mess, no increase in build time, etc.).
[1] Lots of people deploy their code to their hosted Subversion repository (how-to in this post).
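A rough sketch of the resulting POM entries (the coordinates, URL, and version are invented):
<!-- sketch: repository hosting the patched build -->
<repositories>
  <repository>
    <id>patched-libs</id>
    <url>http://myproject.googlecode.com/svn/repo</url>
  </repository>
</repositories>
<!-- the dependency now points at the patched version -->
<dependency>
  <groupId>com.thirdparty</groupId>
  <artifactId>somelib</artifactId>
  <version>1.2.3-patched</version>
</dependency>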
I'd agree with the -patched-qualifier approach and with Pascal's answer. Some additional notes:
you may use dependency:unpack on the original artifact and then combine it with your compiled classes if you don't want to rebuild the whole dependent project (see the sketch after this list)
in either case, your pom.xml will need to correctly represent the dependencies of that library
you can still integrate this as part of your project's build to avoid the 'deploy to a repository' step
make sure you honour the constraints of the project's license when doing all this!
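Here is a hedged sketch of the dependency:unpack idea: unpack the original jar before compilation, so that your freshly compiled patched classes overwrite the unpacked originals (the coordinates and phase are assumptions):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack-original-library</id>
      <!-- runs before compile, so the compiled patched classes win -->
      <phase>process-resources</phase>
      <goals>
        <goal>unpack</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>com.thirdparty</groupId>
            <artifactId>somelib</artifactId>
            <version>1.2.3</version>
            <outputDirectory>${project.build.outputDirectory}</outputDirectory>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>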

Maven - installing artifacts to a local repository in workspace

I'd like a way for 'mvn install' to put files in a repository folder under my source (checkout) root, while using third-party dependencies from ~/.m2/repository.
So after 'mvn install', the layout is:
/work/project/
    repository/
        com/example/foo-1.0.jar
        com/example/bar-1.0.jar
    foo/
        src/main/java
    bar/
        src/main/java
~/.m2/repository/
    log4j/log4j/1.2/log4j-1.2.jar
(In particular, /work/project/repository does not contain log4j)
In essence, I'm looking for a way of creating a composite repository that references other repositories.
My intention is to have multiple checkouts of the same source and work on each without them overwriting each other in the local repository on 'install'. Multiple checkouts can arise from working on different branches in CVS/SVN, but in my case it is due to cloning the master branch in git (in git, each clone is like a branch). I don't like the alternatives, which are to use a special version/classifier per checkout or to reinstall (rebuild) everything each time I switch.
Maven can search multiple repositories (local, remote, "fake" remote) to resolve dependencies, but there is only ONE local repository where artifacts get installed during install. It would be a real nightmare to install artifacts into specific locations and maintain that list without breaking anything; that would just not work, and you don't want to do it.
But, TBH, I don't get the point. Why do you want to do this? There might be alternative and much simpler solutions, like installing your artifacts in the local repository and then copying them under your project root. Why wouldn't that work? I'd really like to know the final intention, though.
UPDATE: Having read the update of the initial question, the only solution I can think of (given that you don't want to use different versions/tags) would be to use two local repositories and to switch between them (very error prone though).
To do so, either use different user accounts (the local repository is user-specific by default).
Or update your ~/.m2/settings.xml each time you want to switch:
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
                              http://maven.apache.org/xsd/settings-1.0.0.xsd">
  <localRepository>${user.home}/.m2/repository</localRepository>
  <!--localRepository>${user.home}/.m2/repository2</localRepository-->
  ...
</settings>
Or have another settings.xml and point at it using the --settings option:
mvn install --settings /path/to/alternate/settings.xml
Or specify the alternate location on the command line using the -Dmaven.repo.local option:
mvn install -Dmaven.repo.local=/path/to/repo
These solutions are all error prone, as I said, and none of them is very satisfying. Even if you have very good reasons to work on several branches in parallel, your use case (not rebuilding everything) is not very common. Here, using distinct user accounts might be the least bad solution IMO.
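One way to make the -Dmaven.repo.local route less error prone is to bake it into a per-checkout alias or wrapper script, e.g. (the alias name and directory are made up):
# every checkout gets its own local repository under .repository/
alias mvnl='mvn -Dmaven.repo.local=$(pwd)/.repository'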
This is INDEED possible with the command line, and in fact is quite useful. For example, if you want to create an additional repo under your Eclipse project, you just do:
mvn install:install-file -DlocalRepositoryPath=repo ^
    -DcreateChecksum=true -Dpackaging=jar ^
    -Dfile=%2 -DgroupId=%3 -DartifactId=%4 -Dversion=%5
It's the "localRepositoryPath" parameter that will direct your install to any local repo you want.
I have this in a batch file that I run from my project root, and it installs the file into a "repo" directory within my project (hence the % parameters). So why would you want to do this? Well, let's say you are a professional services consultant, and you regularly go into customer locations where you are forced to use their security-hardened laptops. You copy your self-contained project to their laptop from a USB stick, and presto, you can do your Maven build, no problem.
Generally, if you are using YOUR laptop, it makes sense to have a single local repo that has everything in it. But to those who got cocky and said things like "why would you want to do that", I have some news... the world is a bigger place with more options than you might realize. If you are using a laptop that is NOT yours and you need to build your project on it, get the resulting artifact, and then remove your project directory (and the local repo you just used), this is the way to go.
As for why you would want two local repos: the default .m2/repository is where the company's standard stuff goes, and the local "in project" repo is where YOUR stuff goes.
This is not possible with the command-line client, but you can create more complex repository layouts with a Maven repository server like Nexus.
The reason it's not possible is that Maven lets you nest projects, and most of them will reference each other, so installing each artifact in a different repository would lead to lots of searches on your local hard disk (or to failed builds when you start a build in a sub-project).
FYI: symlinks work in Windows 7 and above, so this kind of thing is easy to achieve if all your code goes in the same place in the local repo, i.e. /com/myco/. Type mklink for details.
I can see that you do not want to use special versions or classifiers, but that is one of the best solutions to this problem. I work on different versions of the same project, and each mvn install takes half an hour. The best option is to append the change name to the POM version, for example 1.0.0-SNAPSHOT-change1 for the change I'm working on, thereby having multiple versions of the same project but with different code bases.
It has made my life much easier in the long run. It lets me run multiple builds at the same time without issues. Even during an SCM push, we can keep the POM file out of staging, so there can always be two versions for you to work on.
If you have a huge project with multiple sub-modules and want to change all the versions together, you can use the command below to do just that:
mvn versions:set -DnewVersion=1.0.0-SNAPSHOT-change1 -DprocessAllModules
And once done, you can revert using
mvn versions:revert
I know this might be not what you are looking for, but it might help someone who wants to do this.

Maven: local development deploy vs bundling for distribution

Bear with me, I'm migrating from Ant to Maven2: I think I've hit one of those little things that was easy in Ant, but not so in Maven...
How do I handle the difference between a local deployment vs. creating an archive/bundle for distribution to another machine?
Let's assume my project's output is an EAR plus some additional config files. A developer that is actively working on the project will need to deploy and re-deploy frequently to his local app-server (say JBoss), while an Integration Engineer that is building for QA/production will need only to create the final archive assembly (tar/gz).
In Ant we had two targets for this: "dev-deploy" and "bundle". Both do a complete build, but differ in the final step: "dev-deploy" copies the EAR and config files to the respective local folders, while "bundle" just puts the EAR & config files in a tar.gz assembly.
How do you do this in Maven?
I've seen that the assembly plugin can create either archives (tar, gz, etc.) or exploded directories (from the same assembly descriptor). I can invoke either assembly:assembly or assembly:directory, but for the latter, how do I copy the final output to the local JBoss deployment folders? From a related post it seems that ad-hoc copying of files is not really what Maven is about, so an antrun copy is probably the most appropriate?
Finally, since the type of assembly may differ depending on who invokes it, it doesn't seem wise to bind assembly to the build lifecycle, not so? But this means that a developer will always need to invoke 'mvn package' followed by 'mvn assembly:directory' to rebuild and test a change. Conversely, an Integration Engineer will always need to run 'mvn package' followed by 'mvn assembly:assembly' to create the distributable archive. I was hoping for a one-command solution for each, or should I just script it?
Regarding the two Ant targets ("dev-deploy" and "bundle"): I'm not sure what you mean by "respective local folders" for dev-deploy, but it sounds like what mvn package does, and "bundle" indeed sounds like a Maven assembly.
Regarding copying the exploded assembly to the JBoss deployment folders: I guess we are talking about the Integration Engineer's tasks here. As you didn't explain exactly what the "bundle" contains, what the target application server is (my understanding is that you use JBoss for QA/production too, but again, this is a guess), or whether the bundle has to be deployed automatically, it's hard to enumerate all solutions and/or alternatives to antrun. But indeed, to copy/move/unzip/whatever the assembly, the maven-antrun-plugin is a candidate.
Regarding the one-command wish: my understanding was that the Integration Engineer was building the bundle, so why would a developer need it? This is confusing... Anyway, I don't really need the details to think of an answer. You could declare the maven-assembly-plugin in specific build profiles, one for development and one for integration, and bind either the single or the directory-single mojo to the project's build lifecycle in each profile. This would let each role use a single command and avoid any scripting (really, don't go that way).
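For illustration, such a profile setup could look roughly like this (the profile ids, phase, and descriptor path are assumptions); a developer would then run mvn package -Pdev, while the Integration Engineer runs mvn package -Pintegration:
<profiles>
  <!-- developer: exploded directory for quick local deployment -->
  <profile>
    <id>dev</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-assembly-plugin</artifactId>
          <executions>
            <execution>
              <phase>package</phase>
              <goals>
                <goal>directory-single</goal>
              </goals>
              <configuration>
                <descriptors>
                  <descriptor>src/main/assembly/bundle.xml</descriptor>
                </descriptors>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
  <!-- integration: tar.gz archive for distribution -->
  <profile>
    <id>integration</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-assembly-plugin</artifactId>
          <executions>
            <execution>
              <phase>package</phase>
              <goals>
                <goal>single</goal>
              </goals>
              <configuration>
                <descriptors>
                  <descriptor>src/main/assembly/bundle.xml</descriptor>
                </descriptors>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>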