Optional artifact download task in Bamboo?

Is it possible to configure a deployment project with an optional 'Artifact Download' task?
The artifact comes from another plan that has 2 stages producing 2 artifacts. If only 1 stage is executed, the plan will have 1 shared artifact. I want my deployment project to run even when there is only 1 artifact.
But Bamboo fails the whole execution with the error "Unable to download artifact Shared artifact: ..." while trying to locate the 2nd artifact.
How can I tell Bamboo to ignore the missing artifact and continue the execution?

The only way I've found around this is, instead of naming each artifact individually, to put all of the artifacts into a "directory" as part of the build process, say "artifacts/", and define the artifact as "artifacts/**". Then, on the deployment side, be clever about manipulating the artifacts for deployment.
Note that in my case I have an issue with multiple branches for the same build (think "future release", "current release", "legacy release") that may have different artifacts on them (either new features in "future release", or aged-off artifacts from "legacy release"). I had to wrap the actual deployments in a script that was "smart enough" to just iterate through the artifacts that actually existed for a given deployment environment.
I'm not at all happy with Bamboo's treatment of special cases in artifact management. In fact, I've found that judicious use of the "script" task in Bamboo (and managing those scripts in an external git repo) seems to be the only real way to manage larger Bamboo installations in general.
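For illustration, a rough sketch of such a script task, assuming the deployment downloads everything under artifacts/ and that deploy_one stands in for your real per-artifact deploy command (both names are hypothetical):

    #!/bin/sh
    # Iterate over whatever artifacts actually exist for this branch and
    # environment; artifacts that were never produced simply never appear.
    for artifact in artifacts/*; do
        [ -e "$artifact" ] || continue   # guard against an unmatched glob
        echo "Deploying $artifact"
        deploy_one "$artifact"           # placeholder for the real deploy step
    done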


Several artifacts, how to force build order?

I have a (standard non-maven, non-gradle, non-whatsoever) project in IntelliJ IDEA that consists of several modules.
One of those modules results in a jar that is used by one of the other modules.
I have two artifacts. The first one creates a war file. This one depends on the jar file built from the second artifact.
How can I order the build process of the two artifacts so that the second one creates the jar file and copies it to the lib folder of the first, before the first one builds, without the need to recreate both artifacts?
As soon as I select "Build/Build Artifacts/All Artifacts" it always tries to build the first one first.
EDIT: Maybe a better question: What is the recommended way to manually build several artifacts in order of their dependencies?
How can I [configure IDEA] ... so that [it] ... creates the jar file and copies it to the lib folder of the first...
You can't really configure IDEA to do this directly. While you can configure artifacts in the Project Structure dialog, there are no provisions for copying artifacts. IntelliJ IDEA is an IDE, not a build tool. While it can do a lot regarding compiling and building, it has its limits.
One possible hackish way would be to go to the artifact definition in the Project Structure dialog. There you'll find "Pre-processing" and "Post-processing" tabs, which have the option to run an Ant target. So you could create a simple Ant target to do the copying; a minimal sketch follows at the end of this answer. But in the end, I think the best answer to your question:
Maybe a better question: What is the recommended way to manually build several artifacts in order of their dependencies?
is to use a build tool such as Ant, Maven, or Gradle for building the project.
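If you do go the pre/post-processing route, here is a minimal Ant sketch, assuming the second artifact's jar is built to out/artifacts/library/library.jar and the first module's lib folder is web/lib (both paths are placeholders for your actual layout):

    <project name="copy-artifact" default="copy-jar">
        <!-- Copy the jar produced by the second artifact into the lib
             folder of the first module, before the first one builds. -->
        <target name="copy-jar">
            <copy file="out/artifacts/library/library.jar" todir="web/lib"/>
        </target>
    </project>

Hook this up as the "Pre-processing" Ant target of the war artifact.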

Trouble deploying snapshot from Bamboo to Artifactory

I would like to deploy snapshot builds from Bamboo to Artifactory. My repository's Handle Snapshots option is checked and its Maven Snapshot Version Behavior is set to Unique. The repository's layout is gradle-default.
My goal is for a build plan to deploy an artifact at a location similar to the following:
repo-local:com.company/project/1.0-SNAPSHOT/project-1.0-20120612.101600.txt
In Bamboo I have an Artifactory Generic Deploy task, configured with the following in the Edit Published Artifacts field:
project-1.0-SNAPSHOT.txt=>com.company/project/1.0-SNAPSHOT
However, Artifactory rejects my build artifacts, saying: "The repository 'repo-local' rejected the artifact 'repo-local:com.company/project/1.0-SNAPSHOT/project-1.0-SNAPSHOT.txt' due to its snapshot/release handling policy."
How do I get Artifactory to accept the artifact and automatically replace SNAPSHOT with a timestamp in the filename?
Your problem is most likely the fact that the path you deploy to is not considered a valid integration revision by the layout you've selected (gradle-default).
The gradle-default layout expects integration revisions like:
org/module/1.0-12345678912345/module-1.0-12345678912345.jar
That is, it expects a 14-digit timestamp appended after the base revision, while your path contains SNAPSHOT instead of a 14-digit timestamp.
If you want a pattern like:
com.company/project/1.0-SNAPSHOT/project-1.0-20120612.101600.txt
You will have to customize the layout to accept -SNAPSHOT as the folder integration revision and modify your artifact to contain a timestamp as the file integration revision.
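For illustration, a hedged sketch of such a custom layout, assuming Artifactory's layout token syntax and regular expressions for the two integration revision patterns (exact field names may differ in your Artifactory version):

    Artifact path pattern:
        [org]/[module]/[baseRev]-[folderItegRev]/[module]-[baseRev]-[fileItegRev](-[classifier]).[ext]
    Folder integration revision pattern: SNAPSHOT
    File integration revision pattern:   \d{8}\.\d{6}

With a layout like that, the Published Artifacts mapping would also have to upload a timestamped filename (e.g. project-1.0-20120612.101600.txt) rather than project-1.0-SNAPSHOT.txt.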
I'm guessing your assumption was that Artifactory would convert the non-unique integration revision to a unique one; Artifactory performs this conversion only when the repository is set to the default Maven layout and the artifacts adhere to Maven's layout.
This is due to the fact that while Maven has defined standards for integration revisions, Gradle has no such standard; basically, a Gradle revision could be practically anything.
On top of that, the concept of unique and non-unique integration revisions doesn't really exist in the Gradle world; Gradle has no built-in functionality to support these features, so when you see a Mavenized path in Gradle, it's basically just mimicking the pattern.

Configure a hudson maven job to keep building if there are test failures, but only deploy if there are no test failures

I've created a Hudson job for our Maven multi-project with 5 modules to deploy the SNAPSHOT artifacts to the Maven repository. That's OK, as long as it builds successfully without test failures. However, now I'd like to fulfill the following requirements:
When a module has a test failure, the build should continue building and testing the other modules, but turn yellow. Using -Dmaven.test.failure.ignore=true accomplishes this, but fails at the next requirement.
When a module has a test failure, none of the artifacts should be deployed to the Maven repository. Other projects depend on the snapshots of this project, and those projects only want to use the latest snapshots that don't have any failing tests.
Preferably, use the Hudson Maven integration instead of a freestyle script, so that we get the Hudson report pages (red/yellow/blue status per module, build log error coloring, ...). Specifically, running the Maven build twice (first mvn test -Dmaven.test.failure.ignore=true, then mvn deploy -DskipTests) is not a solution, because it's a performance loss, it confuses the Hudson report pages, and it's not atomic (it updates from the repositories again in the second build).
Is there any way to accomplish this?
There is a post-build option called Deploy artifacts to Maven repository. If you do not select Deploy even if the build is unstable, then nothing will be deployed when a test fails. Together with the -fae flag on the Maven command line, things should work the way you want.
Maybe you can try the mvn -fae option with your jobs on Hudson; it makes Maven fail only after the full build.
If build time isn't a problem for you, I think the better option is to create another job, just for deploying. Something like this:
Configure your original job (let's call it "build job") with "mvn -fae clean install"
Create a new job ("deploy job") with "mvn deploy", and don't configure any Build triggers for it
In the "build job", enable the Build other projects option, under Post-build actions and set it to run your "deploy job".
Maybe you can try to configure both jobs to use the same workspace, saving some time on the whole build/deploy process.
If you happen to use Artifactory as a repository manager, you can use the Hudson/Jenkins Artifactory plugin to deploy your artifacts. This plugin will only deploy your artifacts if all tests pass for all modules of a Maven build.

How to generate different deployables from the same Maven project?

I have a situation that I'm sure must be fairly common. I have some Maven-built applications that deploy to different types of application server, like Tomcat, JBoss, etc.
The build process 'tunes' the deployable artifact to the specific target type of application server (for example, different included dependencies, context roots, or other config). This tuning is controlled with build profiles (-Ptomcat, -Pjboss, etc.).
So, for a given version of my application, I need to run builds that produce different deployables. I run mvn -Ptomcat clean package for example and I get an artifact in my /target directory that is the tomcat-tuned version.
The best approach I've been able to come up with so far is to specify finalNames for the artifacts that include the profile information, but with that approach I'm not sure how to configure Maven to copy the final artifact off to some specific location so that the next build for a different type doesn't overwrite it.
Is this a good approach? If so, how can I achieve that final copy?
Or is there a better way?
You'll need to use the Maven Assembly Plugin.
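A minimal sketch of what that could look like, assuming a tomcat profile with an assembly descriptor at src/assembly/tomcat.xml (the jboss profile would mirror it; descriptor paths and profile ids are placeholders):

    <profiles>
      <profile>
        <id>tomcat</id>
        <build>
          <plugins>
            <plugin>
              <groupId>org.apache.maven.plugins</groupId>
              <artifactId>maven-assembly-plugin</artifactId>
              <configuration>
                <descriptors>
                  <!-- assembly descriptor whose <id> is "tomcat" -->
                  <descriptor>src/assembly/tomcat.xml</descriptor>
                </descriptors>
              </configuration>
              <executions>
                <execution>
                  <phase>package</phase>
                  <goals>
                    <goal>single</goal>
                  </goals>
                </execution>
              </executions>
            </plugin>
          </plugins>
        </build>
      </profile>
    </profiles>

By default the assembly plugin appends the descriptor's id to the artifact name as a classifier, so the tomcat and jboss builds produce differently named files (e.g. project-1.0-tomcat.war vs project-1.0-jboss.war) and no longer overwrite each other in /target.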

Maven: local development deploy vs bundling for distribution

Bear with me, I'm migrating from Ant to Maven2: I think I've hit one of those little things that was easy in Ant, but not so in Maven...
How do I handle the difference between a local deployment vs. creating an archive/bundle for distribution to another machine?
Let's assume my project's output is an EAR plus some additional config files. A developer that is actively working on the project will need to deploy and re-deploy frequently to his local app-server (say JBoss), while an Integration Engineer that is building for QA/production will need only to create the final archive assembly (tar/gz).
In Ant we had two targets for this: "dev-deploy" and "bundle". Both do a complete build, but differ in the final step: "dev-deploy" copies the EAR and config files to the respective local folders, while "bundle" just puts the EAR & config files in a tar.gz assembly.
How do you do this in Maven?
I've seen that the assembly plugin can create either archives (tar, gz, etc.) or exploded directories (from the same assembly descriptor). I can invoke either assembly:assembly or assembly:directory, but for the latter, how do I copy the final output to the local JBoss deployment folders? From a related post it seems that ad-hoc copying of files is not really what Maven is about, so an antrun copy is probably the most appropriate?
Finally, since the type of assembly may differ depending on who invokes it, it doesn't seem wise to bind assembly to the build lifecycle, not so? But this means that a developer will always need to invoke 'mvn package' followed by 'mvn assembly:directory' to rebuild and test a change. Conversely, an Integration Engineer will always need to run 'mvn package' followed by 'mvn assembly:assembly' to create the distributable archive. I was hoping for a one-command solution for each, or should I just script it?
In Ant we had two targets for this: "dev-deploy" and "bundle". Both do a complete build, but differ in the final step: "dev-deploy" copies the EAR and config files to the respective local folders, while "bundle" just puts the EAR & config files in a tar.gz assembly.
I'm not sure what you mean by "respective local folders" for "dev-deploy", but this sounds like what mvn package does, and "bundle" indeed sounds like a Maven assembly.
I've seen that the assembly plugin can create either archives (tar, gz, etc.) or exploded directories (from the same assembly descriptor). I can invoke either assembly:assembly or assembly:directory, but for the latter, how do I copy the final output to the local JBoss deployment folders? From a related post it seems that ad-hoc copying of files is not really what Maven is about, so an antrun copy is probably the most appropriate?
I guess that we are talking about the Integration Engineer's tasks here. As you didn't explain exactly what the "bundle" contains, what the target application server is (my understanding is that you are using JBoss for QA/production too, but again, this is a guess), or whether this bundle has to be deployed automatically, it's hard to imagine all solutions and/or alternatives to antrun. But indeed, to copy/move/unzip/whatever the assembly, the maven-antrun-plugin is a candidate; a sketch follows below.
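For the "copy the EAR to the local app server" part, a hedged antrun sketch, assuming a jboss.home property pointing at the local JBoss installation (the property name and deploy path are assumptions; adjust them to your server):

    <profile>
      <id>dev-deploy</id>
      <build>
        <plugins>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-antrun-plugin</artifactId>
            <executions>
              <execution>
                <phase>package</phase>
                <goals>
                  <goal>run</goal>
                </goals>
                <configuration>
                  <target>
                    <!-- ${jboss.home} is an assumed property; point it at your server -->
                    <copy file="${project.build.directory}/${project.build.finalName}.ear"
                          todir="${jboss.home}/server/default/deploy"/>
                  </target>
                </configuration>
              </execution>
            </executions>
          </plugin>
        </plugins>
      </build>
    </profile>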
Finally, since the type of assembly may differ depending on who invokes it, it doesn't seem wise to bind assembly to the build lifecycle, not so? But this means that a developer will always need to invoke 'mvn package' followed by 'mvn assembly:directory' to rebuild and test a change. Conversely, an Integration Engineer will always need to run 'mvn package' followed by 'mvn assembly:assembly' to create the distributable archive. I was hoping for a one-command solution for each, or should I just script it?
My understanding was that the Integration Engineer was building the bundle. Why would a developer need the bundle? This is confusing... Anyway, I don't really need the details to think of an answer. You could declare the maven-assembly-plugin in specific build profiles, one for development and one for integration, and bind either the single or the directory-single mojo to the project's build lifecycle in each profile. This would allow you to use only one command and avoid any scripting (really, don't go that way).
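A minimal sketch of that profile setup, assuming one shared assembly descriptor at src/assembly/bundle.xml (the path is a placeholder); the dev profile binds directory-single (exploded directory) and the integration profile binds single (the tar.gz archive) to the package phase:

    <profiles>
      <profile>
        <id>dev</id>
        <build>
          <plugins>
            <plugin>
              <groupId>org.apache.maven.plugins</groupId>
              <artifactId>maven-assembly-plugin</artifactId>
              <configuration>
                <descriptors>
                  <descriptor>src/assembly/bundle.xml</descriptor>
                </descriptors>
              </configuration>
              <executions>
                <execution>
                  <phase>package</phase>
                  <goals>
                    <goal>directory-single</goal>
                  </goals>
                </execution>
              </executions>
            </plugin>
          </plugins>
        </build>
      </profile>
      <profile>
        <id>integration</id>
        <build>
          <plugins>
            <plugin>
              <groupId>org.apache.maven.plugins</groupId>
              <artifactId>maven-assembly-plugin</artifactId>
              <configuration>
                <descriptors>
                  <descriptor>src/assembly/bundle.xml</descriptor>
                </descriptors>
              </configuration>
              <executions>
                <execution>
                  <phase>package</phase>
                  <goals>
                    <goal>single</goal>
                  </goals>
                </execution>
              </executions>
            </plugin>
          </plugins>
        </build>
      </profile>
    </profiles>

With that in place, 'mvn package -Pdev' and 'mvn package -Pintegration' each become the one-command solution you were hoping for.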