Is there a way to make Maven depend on a SNAPSHOT only if a release isn't available? - maven-2

Suppose we have a project called tools-lib.jar that is currently at version 1.0.0, and other projects such as rest-api.war depend on tools-lib-1.0.0.jar. Let's assume that both tools-lib and our rest-api are under development by separate teams with separate timelines. What I would like to do is make rest-api.war depend on tools-lib-2.0.0.jar; if the 2.0.0 release isn't available yet, the build would fall back to 2.0.0-SNAPSHOT. Does anyone know of a way to configure this via Maven?
The goal is that if rest-api is finished before tools-lib, we don't have to go back and change all the dependencies manually in order to build against the release; it would just happen.
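For illustration, the dependency currently looks something like this (the groupId is made up), with the version held in a property so that it at least only has to change in one place:

    <properties>
        <!-- would like this to resolve to 2.0.0 automatically once it is released -->
        <tools-lib.version>2.0.0-SNAPSHOT</tools-lib.version>
    </properties>

    <dependencies>
        <dependency>
            <!-- groupId is a placeholder -->
            <groupId>com.example.tools</groupId>
            <artifactId>tools-lib</artifactId>
            <version>${tools-lib.version}</version>
        </dependency>
    </dependencies>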
Thanks in advance.

Related

How can I add a framework to an Xcode project using fastlane or the command line?

I have an app that depends on a framework I wrote. They are independent Xcode projects, each with its own GitHub repo. I want to create a fastlane lane that automatically adds the framework to the app project and does a build whenever I commit to the app repo. Right now I have to manually add the framework to Embedded Binaries and Linked Frameworks and Libraries inside the app project. I can't find any fastlane actions that update the project's framework section.
Thanks
If I understand your question correctly, you want to manage an API framework project which your main project uses.
fastlane (IMHO) is about building and managing built artefacts. Managing dependencies within your project is a different thing. To that end, there are two options I am aware of:
CocoaPods - the only option for a long time. A Ruby-based tool that manages dependencies by adding builds and xcconfigs to your project, which it does by rewriting your project files. Rebuilds all dependencies when you build. My opinion: I've never been a fan. I don't like that you really need to know Ruby to use it, the way it hacks into your project, and the way it enforces its build ideas on you.
Carthage - a newer option. A native OS X tool that updates and builds dependencies only when you tell it to. Zero impact on your project, but you have to do a little work to include the framework files. Only works with frameworks. My opinion: it feels more natural than CocoaPods and is the better option.
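As a rough sketch of the Carthage flow (the repository name below is made up): you declare the framework in a Cartfile, build it, and then drag the produced .framework into Embedded Binaries yourself:

    # Cartfile in the app project (repository name is only an example)
    github "yourorg/YourFramework" ~> 1.0

After running carthage update --platform iOS, the built framework ends up under Carthage/Build, and that is what you link against; fastlane can then drive the rest of the build as usual.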

How to ensure an Eclipse plugin has required bundles available?

I'm just starting to develop a new Eclipse plugin where I want a web application server running inside Eclipse. I found a nice blog post, OSGi as a Web Application Server, that describes how to do this. The author suggests creating a target environment for my bundle requirements, and some of those bundles get pulled in from the Equinox Project SDK (now called Equinox Target Components in Juno). I notice that the tutorial project runs fine when my target platform is the one I created in the tutorial, but fails to start when it is the default platform. So, now for my question...
If I need bundles that are not part of the default, how will my plugin project get access to those bundles? Will I need to deploy them along with my plugin? How would I know whether the user's Eclipse already has those required bundles?
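For reference, the extra bundles show up as requirements in my plugin's MANIFEST.MF, roughly like this (the exact list depends on the tutorial):

    Require-Bundle: org.eclipse.equinox.http.jetty,
     org.eclipse.equinox.http.registry,
     org.eclipse.equinox.http.servlet,
     javax.servlet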
You weren't very clear about what kind of application you are developing. Running a web server inside the Eclipse IDE as a plugin doesn't make much sense to me; this kind of server application is better run directly on top of Equinox.
Anyway, the right path is to create a "Product Configuration" file and add the categories that contain the needed bundles (go to File/Plug-in Development/Product Configuration).
With this file you can run an instance of the product (inside the IDE) and export it (creating a zip containing all the needed bundles).
And if you want to enable your users to install the plugin inside their IDE, you must create a p2 repository (using a Target Definition file) and expose the exported directory through an HTTP server. You could look into Tycho to build these kinds of components in a Maven style.
Well, I'm not sure reinventing the wheel is really necessary.
You might take a look at Pax-Web for inspiration on how to do it, or take a look at Apache Karaf as an OSGi container (which uses Pax-Web). Or, even better, start contributing to one of the two :-)

Problems with Maven 2

I recently started to use Maven 2 in one of my Java web application projects. I have had many issues with it: sometimes the project fails to build for no apparent reason and then suddenly starts to work again when nothing about it was changed, and sometimes our project members have to delete the project from their hard drive and check it out again from SVN. There seem to be many very odd bugs in Maven within Eclipse, but there are some issues I would like to know whether it is possible to solve.
1) I have understood that Maven 2 should be able to fetch dependencies for added jars, but when I add a new dependency in Eclipse, the build fails and says dependencies are missing. How can I make Maven download those missing dependencies automatically?
2) I use the Tuckey UrlRewrite Filter, but the public repositories only have an old version of this dependency (3.1 when I need 3.2). How can I include it in the project? We have many programmers on this project, so setting up a local repository would mean that all of them would have to install that local repository.
I have had many issues with it: sometimes the project fails to build for no apparent reason and then suddenly starts to work again when nothing about it was changed. (...)
OK, and what is the point of this rant? I use Maven and my builds are 100% reproducible; there are well-known practices to follow to achieve this. Maybe you're just not following them. Anyway, if you're not happy with it, what can I say: don't use it.
I have understood that Maven 2 should be able to fetch dependencies for added jars, but when I add a new dependency in Eclipse, the build fails and says dependencies are missing. How can I make Maven download those missing dependencies automatically?
I think you misunderstood: Eclipse won't guess which Maven coordinates to add if you don't provide the required information. Dependencies must be declared in the POM, either by editing the POM manually or by using the m2eclipse wizards.
And if this is what you did (and I misunderstood the question), then please provide the <dependency> declaration and the exact error trace.
I use the Tuckey UrlRewrite Filter, but the public repositories only have an old version of this dependency (3.1 when I need 3.2). How can I include it in the project? We have many programmers on this project, so setting up a local repository would mean that all of them would have to install that local repository.
This question has already been asked several times; see for example "Maven, how to add additional libs not available in repo", where I suggest two possible solutions: use a corporate repository like Nexus, or a "file-based" repository (the former being preferred as a long-term solution).
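For reference, a minimal declaration in the POM looks like this (using JUnit purely as an example):

    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.8.2</version>
            <scope>test</scope>
        </dependency>
    </dependencies>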
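As a sketch of the corporate-repository route (the URL and repository id are placeholders, and the Tuckey coordinates are approximate), you push the jar to Nexus once and every developer then resolves it like any other dependency:

    mvn deploy:deploy-file \
        -Dfile=urlrewritefilter-3.2.jar \
        -DgroupId=org.tuckey \
        -DartifactId=urlrewritefilter \
        -Dversion=3.2 \
        -Dpackaging=jar \
        -Durl=http://nexus.example.com/content/repositories/thirdparty \
        -DrepositoryId=thirdparty

The repositoryId has to match a <server> entry with credentials in the uploader's settings.xml.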

maven deploy changed artifacts only

I'm using Maven 2.2 with Nexus 1.4.0.
Let's say I have a POM structure like this (with corresponding versions):
parentproj, v1.0.1
- childproj1, v1.0.2
- childproj2, v1.0.7
childproj1 and childproj2 represent different parts of the application (e.g. GUI and backend), and I want to be able to keep their versions separate so that I can release a new version of the backend without having to release a new version of the GUI.
Now, to deploy this structure to Nexus it would be convenient to go to parentproj and say
mvn deploy -DperformRelease=true
which would deploy all artifacts to the Nexus release repository. This works fine the first time I deploy it, but the second time I run into problems: let's say that I made an update to childproj1 so that we now have the following versions:
parentproj, v1.0.1
- childproj1, v1.0.3
- childproj2, v1.0.7
In this situation Nexus will not let me run mvn deploy from parentproj, since it already has a copy of childproj2 in version 1.0.7. Nexus says "Resource, illegal request:Repository with ID='releases' does not allow updating artifacts." This is fine; I don't want to update existing versions by mistake.
But what I would like is to be able to tell Maven something like "deploy only those artifacts whose versions are not already present in the release repository".
Is there a way to do this, or would I have to deploy each project by itself?
In my experience, it has been easier to deploy everything and often to use the same version number for all the components. For example, if my team is working on version 1.0.7, all the submodules have the version 1.0.7-SNAPSHOT until we release, even if no code has changed in certain modules. Then, when we deploy, we deploy the whole application. I think this has several advantages over a piecemeal deployment. First, if you ever have to roll back to the last stable version, you just roll back to 1.0.6 for all modules; you don't have to remember that the backend was 1.0.3 while the GUI was 1.0.6. Second, it ensures that all the components are compiled correctly against each other and have been tested as a logical group.
Sorry, I know this isn't a specific answer to your question, but, at least in my team's case, it was useful to think about it slightly differently.
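A minimal sketch of that layout, reusing the names from the question (groupId invented): the parent carries the single version, and the modules inherit it by omitting their own <version> element:

    <!-- parentproj/pom.xml -->
    <project>
        <modelVersion>4.0.0</modelVersion>
        <groupId>com.example</groupId>
        <artifactId>parentproj</artifactId>
        <version>1.0.7-SNAPSHOT</version>
        <packaging>pom</packaging>
        <modules>
            <module>childproj1</module>
            <module>childproj2</module>
        </modules>
    </project>

    <!-- childproj1/pom.xml: no <version> of its own, so it is released as 1.0.7 as well -->
    <project>
        <modelVersion>4.0.0</modelVersion>
        <parent>
            <groupId>com.example</groupId>
            <artifactId>parentproj</artifactId>
            <version>1.0.7-SNAPSHOT</version>
        </parent>
        <artifactId>childproj1</artifactId>
    </project>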
First of all, I think you should distinguish between a parent project and an aggregation project. Parent projects should be used for settings that are common to several projects, e.g. dependency versions; aggregation projects should be used to build a group of projects at the same time, e.g. a set of jars and the war that includes them.
The two kinds of projects are best kept separate. The parent project usually does not change very often, and when it does it is usually best to release new versions of all the projects that depend on it; the aggregation project's only purpose is to drive the build of a group of projects, so its version number should probably change whenever one of the projects it contains needs to be released.
Once you've separated the parent from the aggregator, you're in a better position to choose whether to follow John Paulett's advice and keep everything at the same version number, or to change each project's version number only when you actually need to release it. The first option is simpler and less error-prone, but causes you to release new versions of libraries that haven't changed; this might not be acceptable if, for instance, you need to ship patches rather than full releases. The second option is more complicated and error-prone, but keeps your release numbers in line with the evolution of your software. The Maven release plugin and the Jenkins continuous integration tool may help there; I think you should check them out. Also, see if you can upgrade Maven to at least version 2.2.1 and Nexus to a more recent version.
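A skeletal illustration of that separation (names invented): the parent holds shared settings and no <modules>, while the aggregator holds only <modules> and is never used as a <parent>:

    <!-- company-parent/pom.xml: shared settings only -->
    <project>
        <modelVersion>4.0.0</modelVersion>
        <groupId>com.example</groupId>
        <artifactId>company-parent</artifactId>
        <version>1</version>
        <packaging>pom</packaging>
        <dependencyManagement>
            <dependencies>
                <!-- pinned dependency versions go here -->
            </dependencies>
        </dependencyManagement>
    </project>

    <!-- build-all/pom.xml: aggregation only -->
    <project>
        <modelVersion>4.0.0</modelVersion>
        <groupId>com.example</groupId>
        <artifactId>build-all</artifactId>
        <version>1.0.0-SNAPSHOT</version>
        <packaging>pom</packaging>
        <modules>
            <module>../childproj1</module>
            <module>../childproj2</module>
        </modules>
    </project>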
I would suggest the Artifact Exists Maven Plugin (https://github.com/chonton/exists-maven-plugin). It only needs to be declared in the parent POM, and it will automatically skip the install and deploy phases for all release artifacts that already exist in the repository (Nexus or Artifactory), while still deploying the snapshots (this is configurable).
Example:
<plugins>
    <plugin>
        <groupId>org.honton.chas</groupId>
        <artifactId>exists-maven-plugin</artifactId>
        <version>0.0.6</version>
        <executions>
            <execution>
                <goals>
                    <goal>remote</goal>
                </goals>
            </execution>
        </executions>
    </plugin>
</plugins>
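With that in place, a plain mvn deploy from parentproj should skip childproj2 1.0.7 (already present in the release repository) and deploy only childproj1 1.0.3, while snapshots continue to deploy as usual.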
I would suggest that if you plan to maintain, build, and deploy the modules independently, you should consider setting up separate CI and mvn deploy jobs for each. Having independent mvn deploy jobs will give you the behavior you are looking for out of the box. This means not using the aggregator pom (parentproj) to attempt to build and deploy these modules.
If you want to do everything from the aggregator pom, like build and deploy, then I would suggest following John's answer and keeping all the version numbers in sync.
It just depends on how your team wants to look at the code base. If you want to keep things truly modular, you should use your Maven modules like building blocks, treating them independently, until you are ready to put the whole app together. If your app is more monolithic in nature, treat it as such and keep things in sync. This doesn't mean you can't still break out separate Maven modules to maintain code-base modularity; just recognize that they don't have any value outside the context of your larger app.
A good way of making this decision is asking yourself "Will any other projects/apps need to reference this module as a dependency?". If so, it is best practice to build, version, and deploy it independently. If not, I don't see any pitfalls to making the versions match up.
Clearly this need is not addressed by Maven, nor by Nexus or Archiva.
For now it can only be addressed by additional tricks set up by the build manager, like the ones suggested in previous posts.
In an ideal world:
- the POM would include both the release version and the snapshot version of the module, a definition of the files which, if changed, justify the use of the snapshot version, and the source control management reference of the released module;
- dependent modules' POMs would add, in the appropriate dependency section, the release version info next to the snapshot version info, so that they link to the snapshot library if it is present in the repository and to the release library otherwise;
- the Maven reactor would have an option to read both the dependency hierarchy and the file-change info (SCM diff) to know whether a given module is to be used in its release or snapshot version;
- the release plugin would, by default, skip releasing the modules which can still be used in their release version, based on the file changes and the dependency info.

The best way to clean your plugins out of Eclipse 3.2

Since the configuration manager and update manager in Eclipse 3.2 are devoid of nice options for REMOVING or DELETING all my plugins, it can be cumbersome to get your plugins in order. Just getting your dependencies worked out can be a nightmare when you have installed a version higher than you needed, depending on the JDK version you are developing for.
Other than trashing the files in the plugins and features directories (which sometimes works), what other options do we have in a M$ environment?
If you are using RAD 7, you also have to deal with the shared SDP70Shared folder, which is a bit ethereal as well.
I want a foolproof way to clean house for regular Eclipse 3.x, RAD, or any all-in-one package.
Eclipse 3.2 has an "uninstall" feature for plugins under Help->Software updates->Manage configuration.
Eclipse 3.4 has the same functionality under Help->Software updates->Installed software.
You could do a complete uninstall of Eclipse and use a custom Eclipse builder like http://www.yoxos.com/ondemand/ to create a build with the base set of plugins you use. I don't recall how Eclipse is configured on Windows, but on Linux there's usually a hidden project directory in your workspace that you may want to remove just in case. I'm not sure if there are any registry settings you need to worry about, though.
The best solution I have found thus far is to uninstall and rebuild from scratch. Sometimes you can delete files in the plugins dir and run CCleaner on the registry, and that might fix things, but it is problematic depending on the situation. If there were an application that could really make sense of Eclipse plugins, everyone would use it, but there isn't.
Though it's not recommended to manually remove plugins managed by p2, I find a regular plugin cleanup greatly improves performance and portability, especially if you have a master Eclipse configuration copied and shared with multiple developers. In that case it's better to just archive the master Eclipse install instead of relying on everyone to update their configs in sync.
See: How to remove old versions of Eclipse plugins?