I'm learning how to augment my build with Ivy using a "brute force" method of just trying to get a few sample projects up and running. I've pored over the official docs and read several online tutorials, but am choking on a few terms that seem to be used vaguely, ambiguously, or in outright conflicting ways. I'm just looking for an experienced Ivy connoisseur to help bring some clarity to these terms for me:
"Resolution" Cache vs. "Repository" Cache vs. "Ivy" Cache
The "Ivy Repository", as opposed to my normal SCM which is a server running SVN
What's the difference between these 3 types of cache? What's the difference between the "Ivy Repository" and my SVN?
Thanks to anyone who can help!
"Resolution" Cache vs. "Repository" Cache vs. "Ivy" Cache
The Ivy cache is basically a folder where Ivy stores artifacts and configuration files. Unless configured otherwise, it can be found in UserHome/.ivy2
The Ivy cache consists of the resolution cache and one or more repository caches.
The repository cache contains the artifacts that Ivy downloaded from a repository. It caches the repository so that Ivy won't need to query the repository every time it tries to resolve/download an artifact. If Ivy finds a suitable artifact in the repository cache, it will not query the repository, thus saving the cost of that query. If and how the cache is used is a bit more complicated and depends on your dependencies/configuration.
The resolution cache is a collection of Ivy-specific files that tell Ivy how an artifact was resolved (downloaded).
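If you want the caches somewhere other than UserHome/.ivy2, you can point Ivy at different directories in your ivysettings.xml. A minimal sketch, assuming made-up paths (and, as far as I recall, the defaultCacheDir/resolutionCacheDir attributes of the caches element):
<!-- ivysettings.xml sketch: relocate the Ivy caches (paths are illustrative) -->
<ivysettings>
    <settings defaultResolver="public"/>
    <!-- defaultCacheDir moves the repository cache, resolutionCacheDir the resolution cache -->
    <caches defaultCacheDir="${user.home}/build/ivy-repo-cache"
            resolutionCacheDir="${user.home}/build/ivy-resolution-cache"/>
    <resolvers>
        <ibiblio name="public" m2compatible="true"/>
    </resolvers>
</ivysettings>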
The "Ivy Repository", as opposed to my normal SCM which is a server running SVN
A repository in the Ivy world is a location that contains artifact (JAR) files. This can be the local filesystem or a web server. It has no versioning system; each version of an artifact lives in a separate folder. You can't commit artifacts, you just add them to the file system (see the Ivy terminology page). A typical layout looks like this:
org\artifact\version1\artifact.jar
org\artifact\version2\artifact.jar
A repository is accessed via a resolver, which has to know the layout of the repository.
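For example, a simple filesystem resolver describing the layout above might look roughly like this (the base directory is made up for illustration):
<!-- ivysettings.xml sketch: a resolver whose patterns match the
     org/artifact/version/artifact.jar layout shown above -->
<ivysettings>
    <settings defaultResolver="my-local-repo"/>
    <resolvers>
        <filesystem name="my-local-repo">
            <!-- module descriptors (ivy.xml files), if the repository has them -->
            <ivy pattern="${user.home}/ivy-repo/[organisation]/[module]/[revision]/ivy.xml"/>
            <!-- the artifacts themselves -->
            <artifact pattern="${user.home}/ivy-repo/[organisation]/[module]/[revision]/[artifact].[ext]"/>
        </filesystem>
    </resolvers>
</ivysettings>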
From the doc on caches:
Cache types
An Ivy cache is composed of two different parts:
the repository cache
The repository cache is where Ivy stores data downloaded from module repositories, along with some meta information concerning these artifacts, like their original location.
This part of the cache can be shared if you use a well suited lock strategy.
the resolution cache
This part of the cache is used to store resolution data, which is used by Ivy to reuse the results of a resolve process.
This part of the cache is overwritten each time a new resolve is performed, and should never be used by multiple processes at the same time.
While there is always only one resolution cache, you can define multiple repository caches, each resolver being able to use a separate cache.
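As a concrete (hypothetical) illustration of that last point, you could define two named repository caches in ivysettings.xml and assign one to each resolver via its cache attribute; the names and paths below are made up:
<!-- ivysettings.xml sketch: one repository cache per resolver -->
<ivysettings>
    <settings defaultResolver="public"/>
    <caches>
        <cache name="public-cache" basedir="${user.home}/.ivy2/cache-public"/>
        <cache name="team-cache"   basedir="${user.home}/.ivy2/cache-team"/>
    </caches>
    <resolvers>
        <ibiblio name="public" m2compatible="true" cache="public-cache"/>
        <filesystem name="team" cache="team-cache">
            <artifact pattern="${user.home}/team-repo/[organisation]/[module]/[revision]/[artifact].[ext]"/>
        </filesystem>
    </resolvers>
</ivysettings>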
Related
I am trying to revise my build process to use Ant with Apache Ivy for my personal projects. These consist of a few shared modules, and a few application modules that depend on the shared modules. For the sake of this post, let's simplify and say I have a shared module (common) and an application module (application) which depends on common. Each module has its own effective svn repository:
svn_repo_1/common/trunk
                 /branches
                 /tags
svn_repo_2/application/trunk
                      /branches
                      /tags
I check out the relevant revision into a common workspace, in a flat structure:
workspace/common
workspace/application
In general, application will depend on a published version of common, so there will be no need to build common when building application.
However, when I need to add new functionality to common that is required by application, I would then like application to depend on the latest common build from my workspace (without needing to publish common to my repository).
I assumed this is what latest.integration meant (i.e. changing application's ivy.xml to specify latest.integration for the common revision). My intention was to use the Ivy buildlist task to find the local modules that needed to be built before application could be built. This does not work, however, because the buildlist task seems to include the common/build.xml entry regardless of whether application's ivy.xml file specifies latest.integration or some other published revision.
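For reference, here is roughly what I have (the organisation/module names and the workspace.dir property are placeholders, not anything Ivy requires):
<!-- application/ivy.xml (sketch) -->
<ivy-module version="2.0">
    <info organisation="myorg" module="application"/>
    <dependencies>
        <dependency org="myorg" name="common" rev="latest.integration"/>
    </dependencies>
</ivy-module>

<!-- build.xml fragment (sketch): compute the build order of the workspace modules -->
<ivy:buildlist reference="ordered.builds">
    <fileset dir="${workspace.dir}" includes="*/build.xml"/>
</ivy:buildlist>
<subant buildpathref="ordered.builds" target="build"/>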
I would appreciate any suggestions. I am struggling with ivy's documentation and samples, so any real-world examples would also be helpful. Note: I am not interested in a Maven solution here.
Wow, this is truly deja vu! Go back to some of my first questions on this site from 3-4 months ago and they're almost all Ivy-related! I empathize with you 100% that Ivy is a difficult beast to learn and tame, but after using it professionally for a few months now, I'll never develop without it again. So my first piece of advice: keep going. Sooner or later, what little (practical) documentation you find on Apache Ivy will all start to make sense and fall into place.
I can understand there may be extenuating reasons for why you don't want to publish your common to your repo. However, if you are a newcomer to transitive dependency management, the first piece of practical advice I can give you is that you should always publish your JARs/WARs/whatever to your repo, not to an intermediary "integration" area local to your workspace.
The reason for this is simple: Ivy only has the ability to crawl the repositories you define in your settings file (basically). If you deliberately keep a JAR like common outside of one of these defined repositories, then: (a) Ivy has no way to resolve transitive dependencies (its primary job), and (b) "downstream" (dependent) JARs fail to be dynamically updated every time you tweak common. Thus, using Ivy while keeping JARs unpublished is a bit counter-productive; I'm surprised Ivy even includes it as a feature.
I guess I would need to understand your motivation for not publishing common. If you're simply having problems getting the ivy:publish task to work, no worries; I can provide plenty of examples to help get you started. But if there are other reasons, then I ask you to consider this solution: set up multiple repositories.
Perhaps you have one "primary" repository where mostly everything gets published; and then you have a "secondary" or "intermediary" repository where you publish common to whenever it makes sense (for you) to do that. You can then configure your Ant build with two different publish tasks, such as publish-main and publish-integration.
That way you get the best of both worlds: you get your intermediary staging area, and you get to keep everything inside of Ivy's powerful control.
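A rough sketch of what I mean (the resolver names, patterns, target names, and the ${version} property below are placeholders I made up, not something Ivy mandates):
<!-- ivysettings.xml fragment (sketch): a main repo plus an integration repo -->
<resolvers>
    <filesystem name="main-repo">
        <ivy pattern="/repos/main/[organisation]/[module]/[revision]/ivy.xml"/>
        <artifact pattern="/repos/main/[organisation]/[module]/[revision]/[artifact].[ext]"/>
    </filesystem>
    <filesystem name="integration-repo">
        <ivy pattern="/repos/integration/[organisation]/[module]/[revision]/ivy.xml"/>
        <artifact pattern="/repos/integration/[organisation]/[module]/[revision]/[artifact].[ext]"/>
    </filesystem>
</resolvers>

<!-- build.xml fragment (sketch): one publish target per repository -->
<target name="publish-main" depends="jar">
    <ivy:publish resolver="main-repo" pubrevision="${version}" status="release" overwrite="false">
        <artifacts pattern="dist/[artifact].[ext]"/>
    </ivy:publish>
</target>
<target name="publish-integration" depends="jar">
    <ivy:publish resolver="integration-repo" pubrevision="${version}" status="integration" overwrite="true">
        <artifacts pattern="dist/[artifact].[ext]"/>
    </ivy:publish>
</target>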
I'm trying to set up my first Ivy-powered build and am running into implementation problems, and I feel like I don't fully understand Ivy terminology and best practices, even though I've spent a great deal of time reading the official docs and countless articles.
I have an SVN server that I want to use as the central repository for all of my projects. I do not want to use any public repositories! When I need a JAR, I'll pull it down from one of those public repos, run a checksum for security, and then push it to my SVN server (whereby it will be deemed a "certified" version of the JAR; by certified, I really mean "safe").
(1) I want all of my projects to share the same ivy-settings.xml file. Do I put this in my SVN root, or somewhere inside SVN that makes sense? Here was my tentative thinking:
svn://MyRepoRoot/
    ivy/
        ivy-settings.xml
        artifacts/
    Project1/
        trunk/
            ivy.xml
            ...
        branches/
        tags/
    Project2/
        ...
    ...
The ivy/ directory would contain a master copy of my ivy-settings.xml file. It would also contain an artifacts subdirectory where all of my "certified" JARs/WARs would go (as well as any publications my projects produce for downstream modules). Any commentary on this layout would be appreciated.
(2) Also, something that I'm just not getting: if each of my projects (modules) has its own ivy.xml file, and I want that file to reference the "global" ivy-settings.xml file (which should by all means fall under its own, non-module-related versioning scheme), how do I pull down, say, Project1's trunk as my working copy, but configure it with the settings file that is not even part of the same SVN project?
Thanks to anyone who can help give me a little practical advice and better clarity!
The ivysettings.xml is not referenced in the ivy.xml. You need the ivysettings.xml in your Ant tasks to find the defined resolvers, which resolve the artifacts declared in the ivy.xml.
ivy.xml defines the dependencies, and ivysettings.xml the (local) runtime environment for Ivy. You can change the ivysettings.xml at any time without needing to edit the ivy.xml files.
The ivysettings.xml needs to be referenced in your (Ant) build.xml via Ivy's <settings/> task.
As for the layout: I use the same approach and it works fine for me.
I wrote my Ant files so that the ivy folder needs to be checked out parallel to my project(s).
Another approach could be svn:externals, but I've never tried that.
If your SVN server is accessible over HTTP, you could also use the url parameter of the task to access the ivysettings.xml.
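A sketch of how the reference could look in build.xml (this assumes the Ivy Ant tasks are loaded under the usual antlib namespace; the relative path and URL are made up):
<!-- build.xml fragment (sketch); requires xmlns:ivy="antlib:org.apache.ivy.ant" on <project> -->
<!-- option 1: ivy folder checked out next to the project -->
<ivy:settings file="${basedir}/../ivy/ivy-settings.xml"/>
<!-- option 2: load it straight from SVN over HTTP -->
<!-- <ivy:settings url="http://svn.example.com/MyRepoRoot/ivy/ivy-settings.xml"/> -->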
We recently switched from Ant to Buildr for building our projects. We use Ivy for dependency management, using the ivy4r Buildr extension. We have a local repository at the office which is used as a cache for public artifacts and in which we also publish our own artifacts.
Now for the problem: I'd like to be able to build my project when I do not have access to the office repository. Buildr has a flag to tell it to work offline (-o), but ivy4r does not seem to take this into account. Is there any way to make Ivy not try to download artifacts? I have them all available in the cache on my machine already.
Setting the cache timeout to eternal
You can set the cache property ${ivy.cache.ttl.default} to eternal. This sets the TTL so that the repository will not be checked for new revisions.
You could achieve this by calling ant with the following parameter:
ant -Divy.cache.ttl.default=eternal build
This is from the documentation:
Defines a TTL (Time To Live) rule for resolved revision caching. When
Ivy resolves a dynamic version constraint (like latest.integration or
a version range), it can store the result of the resolution (like
latest.integration=1.5.1) for a given time, called TTL. It means that
Ivy will reuse this dynamic revision resolution result without
accessing the repositories for the duration of the TTL, unless running
resolve in refresh mode.
...
The TTL duration can also be set to 'eternal', in which case once
resolved the revision is always used, except when resolving in refresh
mode.
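If you'd rather not pass the property on every invocation, the same thing can (as far as I know) also be configured in ivysettings.xml, either as a property or as a catch-all TTL rule under the caches element:
<!-- ivysettings.xml fragment (sketch) -->
<property name="ivy.cache.ttl.default" value="eternal"/>
<!-- or a TTL rule that applies to all dynamic revisions: -->
<caches>
    <ttl duration="eternal"/>
</caches>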
Other references:
IVY-879 Implementation of this feature
Setting useCacheOnly for the resolve task
The resolve task has the attribute useCacheOnly, which "force[s] the resolvers to only use their caches and not their actual contents".
Example:
<ivy:resolve file="path/to/ivy.xml" useCacheOnly="true"/>
We have a project which should be buildable by the customer using Maven. It has some open-source dependencies that are mavenized (no problem), some that aren't, proprietary stuff (the Oracle JDBC driver), and some internal stuff.
Until now we had everything but the first category packaged with the project itself in a local repository (a repository with file://path-in-project-folder specified in the project's pom.xml).
We would love to move these out of the project, as we are about to use them in other projects as well. Currently we plan to use nexus as an internal maven repository.
What's the best practice to make such dependencies/Maven repositories available to the customer so he can continue to build the project?
Ideas so far:
Customer sets up a nexus repository as well, we somehow deploy all these non-public dependencies to his repository (like a mirror)
We provide a 'dumb' dump/snapshot of the non-public dependencies; the customer adds this snapshot to his settings.xml as a repository (but how is this possible?).
Make our internal nexus repo available to the customers build server (not an option in our case)
I'm wondering how others solve these problems.
Thank you!
Of course, hosting a repository of some kind is a straightforward option, as long as you can cover the uptime / bandwidth / authentication requirements.
If you're looking to ship physical artifacts, you'll find this pattern helpful: https://brettporter.wordpress.com/2009/06/10/a-maven-friendly-pattern-for-storing-dependencies-in-version-control/
That relies on the repository being created in source control - if you want a project to build a repository, consider something like: http://svn.apache.org/viewvc/incubator/npanday/trunk/dist/npanday-repository-builder/pom.xml?revision=1139488&view=markup (using the assembly plugin's capability to build a repository).
Basically, by building a repository you can ship that with the source code and use file:// to reference it from within the build.
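In the shipped project's pom.xml, referencing such a bundled repository then looks roughly like this (the id and directory name are just examples):
<!-- pom.xml fragment (sketch): repository shipped inside the project tree -->
<repositories>
    <repository>
        <id>project-local</id>
        <url>file://${project.basedir}/3rdparty-repo</url>
    </repository>
</repositories>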
There are two options:
Document exactly which artifacts you need for compilation that are not available via Maven Central.
Set up Nexus, make an export with Nexus, give the export to the customer, and have them import it. I'm not sure whether you might run into licensing issues.
I assumed that you already have a repository manager, but it reads like you don't.
Some of the artifacts in my local Nexus repository don't have the correct checksum. For example (wrong checksum):
cat central/org/codehaus/plexus/plexus-compiler-api/1.8/plexus-compiler-api-1.8.pom.sha1
95f3332c2bbace129da501424f297e47dd0e976b
vs (correct checksum):
sha1sum central/org/codehaus/plexus/plexus-compiler-api/1.8/plexus-compiler-api-1.8.pom
4c2947f7e2d09b6e13da34292d897c564f1f9828
It looks like I have a few artifacts in my repository that were downloaded when this bug was active.
Maven Central has the correct checksum (4c29...) now, but the checksums in my local Nexus repository remain stale. I don't know how to get my local repository to verify and / or re-download the correct checksum from central.
What is the correct way of fixing my local repository? There aren't too many artifacts with this problem, so I think I could verify (by hand) that they still exist in central and delete them from my local repository. They should get re-cached with the correct checksums. Is there a better way?
Update:
I've looked at this more and I'm almost positive I know what the source of my problem is. One of the artifacts I'm having trouble with is this one (plexus-compiler-api:1.8):
In my repository, both the .pom and .pom.sha1 are timestamped as 29-Mar-2010. At central, the .pom is timestamped as 29-Mar-2010 while the .pom.sha1 is timestamped as 21-Apr-2010. I was reading about Nexus maintenance. I assume that, on 21-Apr-2010, Maven Central rebuilt metadata and verified checksums which fixed the incorrect .sha1 for the plexus-compiler-api:1.8 artifact.
According to the Sonatype link above, I should be able to expire the caches for Maven Central and have my local installation pull new copies of anything with newer timestamps than the originally cached artifacts. However, based on the behavior I've observed, I think it's only checking timestamps for artifact files, not checksum files.
As far as my local Nexus repository is concerned, I have the most recent version of the artifact (29-Mar-2010), so there's no need to re-download anything.
I've noticed my version of Nexus is quite old (1.5 vs 1.9.1), so I'll try updating and see if the newer version does a better job of expiring caches. If not, I'll probably see what the Sonatype guys think (maybe it's a bug?).
Nope, what you face is the defined behaviour of Nexus and Maven.
First, expiring caches does not delete anything from Nexus's local cache; it just marks the items as "old". The effect of marking items as "old" shows up on the next incoming request for those same artifacts (if they are never asked for, the "old" artifacts just sit there). Meaning, expiring the cache alone will not cause Nexus to download remotely changed (newer) files. Nexus never downloads on its own (if we leave the index out of this discussion). You have to force a client (Maven) to ask for them, and that will result in the following chain of actions: "cache content old", remote change detection, and finally re-download and caching of the new file.
Next, what happens here is that since the artifact (the JAR file) has not changed, Maven does not even ask for the checksum file, hence nothing triggers the refetch of the "old"-marked checksum on the Nexus side. Also note that if we are talking about released artifacts (and Maven Central contains released artifacts only), Maven will never re-check them unless they are not present in the local repository (once brought into the local repository, Maven will never try to refetch them). Meaning, you need to remove them from your local repository to be sure that Maven will ask Nexus for them and, finally, that Nexus will detect the checksum file change on the remote side and do what you actually want.
Re-download should happen, for example, if you nuke your Maven local repository and rebuild with a clean/empty one. In this case, Maven should ask for both the JAR artifact and the checksum file, but from your description it's not clear how you invoked Maven (or whether you did) after expiring the caches on Nexus.
Try this:
a) run "expire caches" on the Nexus "Maven Central" proxy repository
b) nuke your local repository (or just redirect it to a new, clean folder by editing ~/.m2/settings.xml; see the snippet below)
c) make Maven build your project; it should refetch both the JAR and the checksum files (since it is using the empty/nuked local repository)
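For step b), redirecting the local repository is a one-liner in ~/.m2/settings.xml (the path is just an example):
<!-- ~/.m2/settings.xml fragment (sketch): point Maven at a clean local repository -->
<settings>
    <localRepository>/tmp/m2-repo-clean</localRepository>
</settings>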
Hope this explains some of the stuff you wrote.
Reference to the JIRA issue discussing the same thing.
This was a bug.
As explained by Tamas, when a proxied repository cache is expired, Nexus will check the remote repository for newer timestamps. The locally cached artifacts are essentially flagged dirty and the check for updated artifacts happens on demand as artifacts are requested from the local Nexus server.
Nexus (1.9.1) is making the assumption that if an artifact timestamp is unchanged, the checksums should be unchanged as well. Most of the time this will be true, but, due to the old bug in Maven that was deploying artifacts with incorrect checksums, there are rare cases where an artifact can be unchanged yet have an updated checksum.
I think the best way to deal with this for now will be to move any bad checksums and let Nexus try to re-resolve them the next time they are requested:
mv plexus-compiler-api-1.8.pom.sha1 plexus-compiler-api-1.8.pom.sha1.bak
Thanks for the help Tamas.