We have developed a script using NSIS 2.46 that generates an installer for Windows, and we now want to automate building the installer with Maven.
We currently use Maven to build our Java projects and our end product.
While trying to automate the NSIS script build, I have not been able to find a Maven plugin that supports building NSIS scripts.
I googled for it, but I did not find any concrete information on how to get started.
Could anyone explain how to start, or point me to a page that explains this with an example?
Try the nsis-maven-plugin from Codehaus.
After you install or build 'makensis', you should be able to configure your pom to look something like this:
<!-- Codehaus Snapshots - the NSIS plugin needs this -->
<pluginRepository>
<id>Codehaus Snapshots</id>
<url>http://nexus.codehaus.org/snapshots/</url>
<snapshots>
<enabled>true</enabled>
</snapshots>
<releases>
<enabled>true</enabled> <!-- Workaround for MNG-2974, see note below -->
</releases>
</pluginRepository>
<!-- NSIS plugin for producing nsis installer -->
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>nsis-maven-plugin</artifactId>
<version>1.0-SNAPSHOT</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>generate-project</goal>
<goal>compile</goal>
</goals>
<configuration>
<makensisBin>/usr/local/nsis/nsis-2.46/bin/makensis</makensisBin>
<setupScript>src/nsis/setup.nsi</setupScript>
<outputFile>${project.build.directory}/${project.build.finalName}.exe</outputFile>
</configuration>
</execution>
</executions>
</plugin>
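With that configuration in place, the installer should be produced as part of the normal packaging step, for example:
mvn clean package
Per the outputFile setting above, the generated installer should end up at target/${project.build.finalName}.exe (assuming the plugin resolves from the Codehaus snapshot repository and makensis is installed at the configured path).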
I have written a WSDL and I want to generate java classes.
I use the cxf-codegen-plugin for maven2, but I get the following validation error in Eclipse:
Plugin execution not covered by lifecycle configuration:
org.apache.cxf:cxf-codegen-plugin:2.2.7:wsdl2java (execution:
generate-sources, phase: generate-sources).
Can anybody help me to resolve this? Or propose another solution?
The error message sounds like it is coming from the Maven Eclipse integration (m2e). If the build works from the command line but not from within Eclipse, then maybe this article on the Eclipse wiki can help you configure Eclipse.
This answer shows the solution for a similar problem with another maven plugin.
You can add this snippet to your pom.xml (from the link published by @Jörn Horstmann):
<pluginManagement>
<plugins>
<!--This plugin's configuration is used to store Eclipse m2e settings
only. It has no influence on the Maven build itself. -->
<plugin>
<groupId>org.eclipse.m2e</groupId>
<artifactId>lifecycle-mapping</artifactId>
<version>1.0.0</version>
<configuration>
<lifecycleMappingMetadata>
<pluginExecutions>
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.apache.cxf</groupId>
<artifactId>cxf-codegen-plugin</artifactId>
<versionRange>[2.3.3,)</versionRange>
<goals>
<goal>wsdl2java</goal>
</goals>
</pluginExecutionFilter>
<action>
<execute />
</action>
</pluginExecution>
</pluginExecutions>
</lifecycleMappingMetadata>
</configuration>
</plugin>
</plugins>
</pluginManagement>
The most important lines are:
<groupId>org.apache.cxf</groupId>
<artifactId>cxf-codegen-plugin</artifactId>
<versionRange>[2.3.3,)</versionRange>
<goals>
<goal>wsdl2java</goal>
</goals>
So, make sure the versionRange covers the version of CXF you are using.
Hope this helps.
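If you don't actually need wsdl2java to run inside Eclipse, m2e also accepts an <ignore /> action in place of <execute />, which silences the error without running the goal during Eclipse builds (command-line builds are unaffected). A minimal variant of the action element above:
<action>
  <ignore />
</action>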
Change the plugin version to this specific one. Afterwards, run Maven > Update Project, and finally run Maven generate-sources:
<groupId>org.apache.cxf</groupId>
<artifactId>cxf-codegen-plugin</artifactId>
<version>2.7.10</version>
Best regards,
I am using the maven-assembly-plugin to create a jar of my application, including its dependencies, as follows:
<assembly>
<id>macosx</id>
<formats>
<format>tar.gz</format>
<format>dir</format>
</formats>
<dependencySets>
<dependencySet>
<includes>
<include>*:jar</include>
</includes>
<outputDirectory>lib</outputDirectory>
</dependencySet>
</dependencySets>
</assembly>
(I omitted some other stuff that is not related to the question)
So far this has worked fine: it creates a lib directory with all the dependencies. However, I recently added a new dependency whose scope is system, and it does not get copied to the lib output directory. I must be missing something basic here, so I'm asking for help.
The dependency that I just added is:
<dependency>
<groupId>sourceforge.jchart2d</groupId>
<artifactId>jchart2d</artifactId>
<version>3.1.0</version>
<scope>system</scope>
<systemPath>${project.basedir}/external/jchart2d-3.1.0.jar</systemPath>
</dependency>
The only way I was able to include this dependency was by adding the following to the assembly element:
<files>
<file>
<source>external/jchart2d-3.1.0.jar</source>
<outputDirectory>lib</outputDirectory>
</file>
</files>
However, this forces me to change the pom and the assembly file whenever this jar is renamed, if ever. Also, it seems just wrong.
I have tried with <scope>runtime</scope> in the dependencySets and <include>sourceforge.jchart2d:jchart2d</include> with no luck.
So how do you include a system-scoped jar in your assembly file in Maven 2?
Thanks a lot
I'm not surprised that system scope dependencies are not added (after all, dependencies with a system scope must be explicitly provided by definition). Actually, if you really don't want to put that dependency in your local repository (for example because you want to distribute it as part of your project), this is what I would do:
I would put the dependency in a "file system repository" local to the project.
I would declare that repository in my pom.xml like this:
<repositories>
<repository>
<id>my</id>
<url>file://${basedir}/my-repo</url>
</repository>
</repositories>
I would then declare the artifact without the system scope, which is just a source of trouble:
<dependency>
<groupId>sourceforge.jchart2d</groupId>
<artifactId>jchart2d</artifactId>
<version>3.1.0</version>
</dependency>
I'm not 100% sure this will suit your needs but I think it's a better solution than using the system scope.
Update: I should have mentioned that in my original answer and I'm fixing it now. To install a third party library in the file-based repository, use install:install-file with the localRepositoryPath parameter:
mvn install:install-file -Dfile=<path-to-file> \
-DgroupId=<myGroup> \
-DartifactId=<myArtifactId> \
-Dversion=<myVersion> \
-Dpackaging=<myPackaging> \
-DlocalRepositoryPath=<path-to-my-repo>
You can paste this as-is into a *nix shell. On Windows, remove the backslashes and put everything on a single line.
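For the jchart2d dependency from the question, run from the project root (so that my-repo matches the repository URL declared above), the invocation would look something like this:
mvn install:install-file -Dfile=external/jchart2d-3.1.0.jar \
    -DgroupId=sourceforge.jchart2d \
    -DartifactId=jchart2d \
    -Dversion=3.1.0 \
    -Dpackaging=jar \
    -DlocalRepositoryPath=my-repo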
Btw you can automate it and make it a part of your maven build. The following will install your jar into your local repository before compilation:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-install-plugin</artifactId>
<executions>
<execution>
<id>hack-binary</id>
<phase>validate</phase>
<configuration>
<file>${basedir}/lib/your-lib.jar</file>
<repositoryLayout>default</repositoryLayout>
<groupId>your-group</groupId>
<artifactId>your-artifact</artifactId>
<version>0.1</version>
<packaging>jar</packaging>
<generatePom>true</generatePom>
</configuration>
<goals>
<goal>install-file</goal>
</goals>
</execution>
</executions>
</plugin>
I found an easy solution in case you are creating a war:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.1.1</version>
<configuration>
<webResources>
<resource>
<directory>dependencies/mydep</directory>
<targetPath>WEB-INF/lib</targetPath>
<filtering>true</filtering>
<includes>
<include>**/*.jar</include>
</includes>
</resource>
</webResources>
</configuration>
</plugin>
You can also handle this by adding a supplemental dependencySet to your dependencySets:
<dependencySet>
<scope>system</scope>
<includes>
<include>*:jar</include>
</includes>
<outputDirectory>lib</outputDirectory>
</dependencySet>
The best thing would be to use a repository manager (like Nexus, Artifactory, or Archiva) and install this kind of dependency into a particular repository. After that you can use it as a simple dependency. This will simplify your life.
Edit: Sorry, I didn't realize alx had also mentioned the clean lifecycle workaround.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-install-plugin</artifactId>
<executions>
<execution>
<id>hack-binary</id>
<phase>clean</phase>
<configuration>
<file>${basedir}/lib/your-lib.jar</file>
<repositoryLayout>default</repositoryLayout>
<groupId>your-group</groupId>
<artifactId>your-artifact</artifactId>
<version>0.1</version>
<packaging>jar</packaging>
<generatePom>true</generatePom>
</configuration>
<goals>
<goal>install-file</goal>
</goals>
</execution>
</executions>
</plugin>
Based on the solution provided by alx, you can execute the install-file step in the clean phase. But since the clean phase is not part of the default lifecycle, you have to execute mvn clean first to ensure the jar is ready in the local repo.
For example: mvn clean; mvn package
A simple solution for this is to add the jar to your local Maven repository.
One way to do this is via the mvn install:install-file command, as suggested in the previous post.
Another easy way is from the Eclipse IDE:
1) In your Eclipse IDE, right-click on the project and select the Maven option.
2) Select "Install or deploy an artifact to a Maven repository" and click Next.
3) Click Browse next to the Artifact file field and select your jar file.
4) Enter the GroupId, ArtifactId and Version, make sure "Generate POM" and "Create checksum" are checked and the packaging is jar.
Click Finish, and that's it! The jar is added to your local repository (which you can define in settings.xml, otherwise the default .m2 directory).
Now just add a plain Maven dependency using the GroupId, ArtifactId and version you entered, and your external jar will be packaged by Maven.
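For example, if you installed the jar under the (hypothetical) coordinates com.example:my-external-lib:1.0, the dependency would simply be:
<dependency>
    <!-- hypothetical coordinates; use whatever GroupId/ArtifactId/version you entered in the dialog -->
    <groupId>com.example</groupId>
    <artifactId>my-external-lib</artifactId>
    <version>1.0</version>
</dependency>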
It worked in an easier way in my solution: remove this from your dependencies:
<dependency>
<groupId>tiago.medici</groupId>
<artifactId>eureka</artifactId>
<version>0.0.1</version>
</dependency>
Then add the maven-install-plugin to the pom.xml as well:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-install-plugin</artifactId>
<executions>
<execution>
<id>install-external</id>
<phase>clean</phase>
<configuration>
<file>${basedir}/external/tiago.medici-0.0.1.jar</file>
<repositoryLayout>default</repositoryLayout>
<groupId>tiago.medici</groupId>
<artifactId>eureka</artifactId>
<version>0.0.1</version>
<packaging>jar</packaging>
<generatePom>true</generatePom>
</configuration>
<goals>
<goal>install-file</goal>
</goals>
</execution>
</executions>
</plugin>
What I want to do is create a source code distribution of my application with all dependencies and burn it on a DVD, so that I could build it in 100 years (well, OK, you know what I mean...). No online dependencies on libraries or Maven plugins!
I know that Ant would be better for this, but I'm using maven in my project. I'm not going to switch to Ant just for that, I'm asking how to do this with maven. Or, if there is a way how to generate self sustainable Ant build that I could put on DVD that would be great too.
(there is the ant:ant plugin, but it just generates an Ant build.xml whose dependencies point at the local Maven repo)
The approach I've taken is to create a special local repository that I can put on the DVD and then build the project with mvn -o -Dmaven.repo.local=repo/on/dvd. I was trying to create such a repository with dependency:copy-dependencies and the useRepositoryLayout param set to true, but it doesn't copy the freaking Maven plugins that my build depends on...
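For reference, the copy-dependencies setup I'm describing looks roughly like this (a sketch; parameter names per the maven-dependency-plugin; it lays out the project's dependencies in repository layout, but, as said, not the plugins):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <id>copy-deps-as-repo</id>
            <phase>package</phase>
            <goals>
                <goal>copy-dependencies</goal>
            </goals>
            <configuration>
                <useRepositoryLayout>true</useRepositoryLayout>
                <copyPom>true</copyPom>
                <outputDirectory>${project.build.directory}/repo</outputDirectory>
            </configuration>
        </execution>
    </executions>
</plugin>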
The only way I can think of to include the plugins is to specify a different local repository for the build on the command line and ensure all the dependency sources etc are downloaded, then create an archive including the project's contents and the custom repository.
Here is a pom that downloads the sources and javadocs (it downloads them to the project's target directory, which we exclude from the archive because they will also be in the local repository). The assembly descriptor bundles the project's contents and the local repository into a single (pretty large) archive.
Note the processing is all in a profile because you really don't want this running on every build. If the temporary local repository is in the target directory, you can easily clean the mess up afterwards with mvn clean.
To activate the profile do something like the following:
mvn package -Parchive -Dmaven.repo.local=.\target\repo
Here's the pom:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>name.seller.rich</groupId>
<artifactId>test-archive</artifactId>
<version>0.0.1</version>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.5</version>
<scope>test</scope>
</dependency>
</dependencies>
<profiles>
<profile>
<id>archive</id>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<executions>
<execution>
<id>sources</id>
<phase>prepare-package</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
<configuration>
<classifier>sources</classifier>
<failOnMissingClassifierArtifact>false</failOnMissingClassifierArtifact>
<!--the target directory won't be included, but the sources will be in the repository-->
<outputDirectory>${project.build.directory}/sources</outputDirectory>
</configuration>
</execution>
<execution>
<id>javadocs</id>
<phase>prepare-package</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
<configuration>
<classifier>javadoc</classifier>
<failOnMissingClassifierArtifact>false</failOnMissingClassifierArtifact>
<outputDirectory>${project.build.directory}/javadocs</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.2-beta-4</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
<configuration>
<descriptors>
<descriptor>src/main/assembly/archive.xml</descriptor>
</descriptors>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
</profiles>
</project>
And here's the assembly:
<assembly>
<id>archive</id>
<formats>
<format>zip</format>
</formats>
<fileSets>
<fileSet>
<directory>${project.basedir}</directory>
<outputDirectory>/</outputDirectory>
<excludes>
<exclude>target/**</exclude>
</excludes>
</fileSet>
<fileSet>
<directory>${maven.repo.local}</directory>
<outputDirectory>repo</outputDirectory>
</fileSet>
</fileSets>
</assembly>
Watch this:
Maven Assembly Plugin
Quote from the homepage:
Do you want to create a binary
distribution from a Maven project that
includes supporting scripts,
configuration files, and all runtime
dependencies? You need to use the
Assembly Plugin to create a
distribution for your project.
It's well configurable. I used it especially for making self-running demo versions of web-applications with an embedded jetty server and user documentation.
I don't have a complete answer. Last time I looked at this, my thinking was to clean out the localRepository at the start of the build (or use a separate one) and then run mvn dependency:go-offline.
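A sketch of that idea, assuming the dedicated repository lives under target/repo:
mvn -Dmaven.repo.local=target/repo dependency:go-offline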
If you're really keen, you'll also want to bundle maven itself and a JDK into the distribution. This likely takes it out of scope of a pure maven build.
I have a multi-module project and I want to deploy on the project's site an HTML version of my source code using the JXR maven plugin.
The problem is that the JXR plugin runs well and the xref folder is properly generated for each of my modules, but when I use the mvn site:stage command to collect all of the project's site content and get all links generated properly, it does not pick up the xref folders.
Here is an extract of my POM file where the JXR plugin is configured:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>jxr-maven-plugin</artifactId>
<configuration>
<aggregate>true</aggregate>
</configuration>
</plugin>
Here is the command I use to create and stage my site:
mvn site site:stage
Do you guys have any idea?
Thanks in advance.
r.
Not sure this is relevant, but your command runs the site twice: mvn site will generate the site, and site:stage will also run it. Perhaps this is causing problems, but I honestly can't see why.
Looking at the JXR documentation, it only mentions the site:site goal; I can't see why it wouldn't run properly for the site:stage goal, as it extends it. If you run the site goal, then copy the output to another directory, run the site:stage goal and compare the output, it might give some insight into the problem.
Update: I tried this myself and the xref was included and aggregated nicely in c:\test\stage with the cross references correctly managed. I've included the configuration I used.
In my parent pom I defined the site configuration like this:
<build>
<plugins>
<plugin>
<artifactId>maven-site-plugin</artifactId>
<executions>
<execution>
<phase>prepare-package</phase>
<goals>
<goal>stage</goal>
</goals>
</execution>
</executions>
<configuration>
<stagingDirectory>c:\test\stage</stagingDirectory>
</configuration>
</plugin>
</plugins>
</build>
The distributionManagement section was configured with the site information (not really needed as I set the stagingDirectory above, but the goal won't run without it).
<distributionManagement>
<site>
<id>mojo.website</id>
<name>Mojo Website</name>
<url>scp://test/</url>
</site>
</distributionManagement>
My JXR configuration in the parent pom was as follows:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jxr-plugin</artifactId>
<reportSets>
<reportSet>
<id>src-xref</id>
<reports>
<report>jxr</report>
</reports>
</reportSet>
<reportSet>
<id>test-xref</id>
<reports>
<report>test-jxr</report>
</reports>
</reportSet>
</reportSets>
<configuration>
<aggregate>true</aggregate>
</configuration>
</plugin>
The commandline run was mvn clean site:stage
Edit: Per the comments, there is a codehaus jxr plugin with slightly different semantics. Be sure to use the org.apache.maven.plugins version rather than the org.codehaus.mojo version.
I'm experimenting with Protocol Buffers in an existing, fairly vanilla Maven 2 project. Currently, I invoke a shell script every time I need to update my generated sources. This is obviously a hassle, as I would like the sources to be generated automatically before each build. Hopefully without resorting to shameful hackery.
So, my question is two-fold:
Long shot: is there a "Protocol Buffers plugin" for Maven 2 that can achieve the above in an automagic way? There's a branch on Google Code whose author appears to have taken a shot at implementing such a plugin. Unfortunately, it hasn't passed code review or been merged into protobuf trunk. The status of that plugin is thus unknown.
Probably more realistic: lacking an actual plugin, how else might I go about invoking protoc from my Maven 2 build? I suppose I may be able to wire up my existing shell script into an antrun invocation or something similar.
Personal experiences are most appreciated.
You'll find some information about the plugin available in the Protocol Buffers repository in the Protocol Buffers Compiler Maven Plug-In thread on the Protocol Buffers discussion group. My understanding is that it's usable but lacking tests. I'd give it a try.
Or you could just use the antrun plugin (snippet pasted from the thread mentioned above):
<build>
<plugins>
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<id>generate-sources</id>
<phase>generate-sources</phase>
<configuration>
<tasks>
<mkdir dir="target/generated-sources"/>
<exec executable="protoc">
<arg value="--java_out=target/generated-sources"/>
<arg value="src/main/protobuf/test.proto"/>
</exec>
</tasks>
<sourceRoot>target/generated-sources</sourceRoot>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-java</artifactId>
<version>2.0.3</version>
</dependency>
</dependencies>
The accepted answer encouraged me to get the Google-provided plugin to work. I merged the branch mentioned in my question into a checkout of 2.2.0 source code, built and installed/deployed the plugin, and was able to use it in my project as follows:
<build>
<plugins>
<plugin>
<groupId>com.google.protobuf.tools</groupId>
<artifactId>maven-protoc-plugin</artifactId>
<version>0.0.1</version>
<executions>
<execution>
<id>generate-sources</id>
<goals>
<goal>compile</goal>
</goals>
<phase>generate-sources</phase>
<configuration>
<protoSourceRoot>${basedir}/src/main/protobuf/</protoSourceRoot>
<includes>
<param>**/*.proto</param>
</includes>
</configuration>
</execution>
</executions>
<configuration>
<protocExecutable>/usr/local/bin/protoc</protocExecutable>
</configuration>
</plugin>
</plugins>
</build>
Note that I changed the plugin's version to 0.0.1 (no -SNAPSHOT) in order to make it go into my non-snapshot thirdparty Nexus repository. YMMV. The takeaway is that this plugin will be usable once it's no longer necessary to jump through hoops in order to get it going.
The accepted solution does not scale for multiple proto files. I had to come up with my own:
<build>
<plugins>
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<id>compile-protoc</id>
<phase>generate-sources</phase>
<configuration>
<tasks>
<mkdir dir="${generated.sourceDirectory}" />
<path id="proto.path">
<fileset dir="src/main/proto">
<include name="**/*.proto" />
</fileset>
</path>
<pathconvert pathsep=" " property="proto.files" refid="proto.path" />
<exec executable="protoc" failonerror="true">
<arg value="--java_out=${generated.sourceDirectory}" />
<arg value="-I${project.basedir}/src/main/proto" />
<arg line="${proto.files}" />
</exec>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
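Note that, unlike the earlier antrun snippet, this one does not register ${generated.sourceDirectory} as a source root, so the generated classes won't be compiled unless that is handled elsewhere. One way to do it (a sketch, not part of the original answer) is the build-helper-maven-plugin:
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>build-helper-maven-plugin</artifactId>
    <executions>
        <execution>
            <id>add-generated-sources</id>
            <phase>generate-sources</phase>
            <goals>
                <goal>add-source</goal>
            </goals>
            <configuration>
                <sources>
                    <source>${generated.sourceDirectory}</source>
                </sources>
            </configuration>
        </execution>
    </executions>
</plugin>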
There's also a great plugin by Igor Petruk named protobuf-maven-plugin. It's in the central repo now and plays nicely with Eclipse (m2e 1.1 is recommended).
I just updated the Maven plugin to work with 2.2.0 -- the updated pom is attached to the code review bug.
Here are the instructions to build the plugin yourself:
svn co http://protobuf.googlecode.com/svn/branches/maven-plugin/tools/maven-plugin
cd maven-plugin
wget -O pom.xml 'http://protobuf.googlecode.com/issues/attachment?aid=8860476605163151855&name=pom.xml'
mvn install
You can then use the maven config above.
I just tried a less official but very recent (v 0.1.7) fork from https://github.com/dtrott/maven-protoc-plugin, courtesy of David Trott, and it worked very well. I tested it with a couple of Maven modules, one of which contained DTO-style messages and the other a service depending on them. I borrowed the plugin configuration MaxA posted on Oct 16 '09, had protoc on my PATH, and added
<temporaryProtoFileDirectory>${basedir}/target/temp</temporaryProtoFileDirectory>
right after
<protocExecutable>protoc</protocExecutable>.
What is really nice is that all I had to do was declare a normal dependency from the service module on the DTO module. The plugin was able to resolve .proto file dependencies by finding the proto files packaged with the DTO module, extracting them to a temporary directory, and using them while generating code for the service. And it was smart enough not to package a second copy of the generated DTO classes with the service module.
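As an illustration (module coordinates are hypothetical), the service module's pom only needed an ordinary dependency on the DTO module, something like:
<dependency>
    <!-- hypothetical coordinates of the module that packages the .proto files and DTO messages -->
    <groupId>com.example</groupId>
    <artifactId>my-dto-messages</artifactId>
    <version>1.0-SNAPSHOT</version>
</dependency>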
There is a Maven plugin for protobuf: https://www.xolstice.org/protobuf-maven-plugin/usage.html
The minimal config:
<plugin>
<groupId>org.xolstice.maven.plugins</groupId>
<artifactId>protobuf-maven-plugin</artifactId>
<version>0.5.0</version>
<configuration>
<protocExecutable>/usr/local/bin/protoc</protocExecutable>
</configuration>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>test-compile</goal>
</goals>
</execution>
</executions>
</plugin>
I think that using antrun to invoke non-Maven steps is the generally accepted solution.
You could also try the exec-maven-plugin.
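A rough sketch of what that could look like for protoc (goal and parameter names per the exec-maven-plugin; the paths follow the antrun example above and are illustrative):
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>exec-maven-plugin</artifactId>
    <executions>
        <execution>
            <id>run-protoc</id>
            <phase>generate-sources</phase>
            <goals>
                <goal>exec</goal>
            </goals>
            <configuration>
                <executable>protoc</executable>
                <arguments>
                    <argument>--java_out=${project.build.directory}/generated-sources</argument>
                    <argument>src/main/protobuf/test.proto</argument>
                </arguments>
            </configuration>
        </execution>
    </executions>
</plugin>
As with the antrun approach, you would still need to add the output directory as a source root (for example with the build-helper-maven-plugin).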
I forked the plugin from David Trott and have it compiling multiple languages, which makes it a lot more useful. See the GitHub project here and a tutorial on integrating it with a Maven build here.