What causes jarsigner to overwrite MANIFEST.MF?

The question is why jarsigner sometimes overwrites the MANIFEST.MF file rather than just appending the signing entries to the existing MANIFEST.MF in the jar being signed.
Note: I'm posting this as an open-ended question and will provide an answer describing one situation I ran into that caused this problem. If there are other situations or circumstances in which this can happen, hopefully others will expand on the posted question.

Here's one situation I saw. I have a Gradle build that produces both a debug and a release jar, each signed with jarsigner. The debug jar was signed with the default "debug.keystore" provided by Android Studio, and the release jar was signed with my private keystore. For practical purposes the two jars were pretty much identical; both contained the same META-INF/MANIFEST.MF entry. Signing the debug jar overwrote MANIFEST.MF, so it contained only the signing entries. Signing the release jar, on the other hand, appended the signing entries to the end of the existing MANIFEST.MF as expected. Just for fun, I made a copy of the "debug.keystore", renamed it, and used it as the keystore instead; the manifest file was still overwritten. I then created a private keystore with the same key alias, CN, O, and OU as the debug.keystore. This time the manifest file was appended to rather than overwritten. That seems to imply that something about the keystore itself determines whether MANIFEST.MF is overwritten or not. Weird, but that's what I observed.
With some further investigation I discovered that my original MANIFEST.MF file did not contain a "Manifest-Version" header entry. So I added the entry and tried again, signing the debug jar with the original "debug.keystore". This time both jars had the signing entries appended to the end of the existing MANIFEST.MF rather than the debug jar's manifest being overwritten. In hindsight that makes sense, but who would have thought of it? I certainly could not find this behavior documented anywhere.
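For reference, here is what a minimal MANIFEST.MF with the missing header looks like; the Main-Class line is just a hypothetical example of a pre-existing entry that should survive signing:
Manifest-Version: 1.0
Main-Class: com.example.Main
With the Manifest-Version header present, jarsigner appends its per-entry Name: and SHA digest attributes below the existing entries instead of replacing the file.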

Ugh. My own 2 cents on the matter: simply upgrading the jarsigner plugin to version 3.0.0 solved the problem for me (the original manifest entries are kept intact).
in pom.xml:
...
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jarsigner-plugin</artifactId>
  <version>3.0.0</version>
  <executions>
    <execution>
      <id>sign</id>
      <goals>
        <goal>sign</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <keystore>mykeystore.jks</keystore>
    <alias>myalias</alias>
    <storepass>password</storepass>
    <keypass>password</keypass>
  </configuration>
</plugin>
...
Of course, other problems might still follow. In my case a ClassNotFoundException was still thrown when running the resulting jar with java -jar myjar.jar: it turned out that some Bouncy Castle .SF and .DSA files had been carried into my JAR's META-INF folder from the signed Bouncy Castle JARs by the shade plugin, and they interfered with the signing I did later with the jarsigner plugin.
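If you run into the same thing, one common fix (a minimal sketch, assuming the uber-jar is built with maven-shade-plugin) is to strip the foreign signature files while shading:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <filters>
      <filter>
        <!-- drop signature files carried over from signed dependencies -->
        <artifact>*:*</artifact>
        <excludes>
          <exclude>META-INF/*.SF</exclude>
          <exclude>META-INF/*.DSA</exclude>
          <exclude>META-INF/*.RSA</exclude>
        </excludes>
      </filter>
    </filters>
  </configuration>
</plugin>
The jar can then be re-signed cleanly with jarsigner afterwards.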

Related

IntelliJ: how to make non-Java files get copied to the bin directory as well?

My module contains some non-Java files alongside the Java source files. When the module is built, the Java files end up in the bin folder (and are included in the jar artifact), but the non-Java files are left out.
I need them to be copied as well (this is what Eclipse does). Note that they do appear in the project tree view on the left; I did not exclude them in any way.
How can I make them get into the bin folder (jar artifact)?
Thanks.
Settings (Preferences on Mac) | Compiler | Resource Patterns.
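For example, a resource pattern along these lines tells the compiler to copy properties and XML files next to the compiled classes (the exact extensions are just an illustration; add whatever you need copied):
?*.properties;?*.xml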
This question duplicates/relates to:
copy jbehave stories into target directory using IntelliJ Idea
IntelliJ, Akka and Configuration files
IntelliJ IDEA 11.1.2 does not copy SQL files to the out folder
Add a properties file to IntelliJ's classpath
import images into an intelliJ Java project
Intellij - how do I add a text file to the resources
Null Pointer Exception for read properties file in Idea
IntelliJ Idea - resource SQL files not being copied to target
Scala getClass.getResource() returning null
On IDEA 14.1.4, an XML file in the src/main/java/my/package folder is not copied. My compiler resource patterns are !?*.java;!?*.form;!?*.class;!?*.groovy;!?*.scala;!?*.flex;!?*.kt;!?*.clj;!?*.aj.
I changed the gradle file by adding:
test {
    resources {
        srcDir 'src/main/java'
        include '**/*.xml'
    }
}
It started working. I am not sure whether I have missed anything, but I could not find that part reflected in the project settings.
If you are working with Maven, the following code should have the same effect:
<build>
  <testResources>
    <testResource>
      <filtering>false</filtering>
      <directory>src/test/java</directory>
      <includes>
        <include>**/*.xml</include>
      </includes>
    </testResource>
    <testResource>
      <directory>src/test/resources</directory>
    </testResource>
  </testResources>
</build>
I posted this here as an answer because it may help someone who has the same issue and for whom the answers above don't work.
Uncheck "Use external build" in the project compiler settings.
Using CrazyCoder's info about version 12 (which I'm not using), I added the following as my resource pattern, which worked well:
*.*;!*.form;!*.java;!*.class;!*.groovy;!*.as;!*.flex;!*.kt

Attaching Build Number for binaries in Maven

I am running a Maven build and storing the files in Artifactory. One issue I am facing is that whenever I deploy a -SNAPSHOT version, it overwrites the binary in Artifactory. I tried using the Maven build number plugin, but ran into issues. I referred to this:
http://blog.codehangover.com/track-every-build-number-with-maven/
Here is what I did.
I updated the master pom.xml with the following plugin:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>buildnumber-maven-plugin</artifactId>
  <version>1.0-beta-3</version>
  <executions>
    <execution>
      <phase>validate</phase>
      <goals>
        <goal>create</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <doCheck>false</doCheck>
    <doUpdate>false</doUpdate>
    <format>${version}.{0,number}</format>
    <items>
      <item>buildNumber</item>
    </items>
  </configuration>
</plugin>
Then I updated the pom of the EAR and web project as below:
<build>
  <finalName>${project.artifactId}-${project.version}.${buildNumber}</finalName>
</build>
When I ran mvn clean install, the EAR and WAR were generated, but when I checked the WAR inside the EAR it was named something like war-1.0-SNAPSHOT-null.war. I believe the WAR and EAR couldn't pick up the buildNumber property. I was able to successfully generate the buildNumber.properties file and increment the number by running the buildnumber:create goal. Here are my questions:
1. What am I doing wrong here, and why is the buildNumber property not picked up?
2. I also want to generate all the binaries, including jars, in the format binary-version-SNAPSHOT.${buildNumber}. Do I need to update the pom of each module, or is there another way to do this?
3. We are also using Hudson builds for continuous integration and want to separate developer builds from Hudson builds using the Hudson build number. How can we achieve this if we don't want to check in buildNumber.properties after the Hudson build?
To get unique snapshots, use the uniqueVersion flag (see James Lorenzen's blog). If you use the Maven goal deploy:deploy-file, the uniqueVersion flag is true by default. At my company we have the following policy: only "official" snapshots go to the repository, where an "official" snapshot is one that was built on our reference system (our Jenkins CI server). We don't need the unique-version feature for snapshots, since we let Jenkins archive the artifacts; this way we can always go back to a certain version by using Jenkins. If the build breaks, the snapshot will not be deployed to the repo.
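For reference, a minimal sketch of where that flag lives in a Maven 2 POM (the id and URL are placeholders; Maven 3 always deploys timestamped snapshots and ignores this flag):
<distributionManagement>
  <snapshotRepository>
    <id>snapshots</id>
    <url>http://artifactory.example.com/libs-snapshot-local</url>
    <!-- true (the default) keeps a timestamped copy per deployment instead of overwriting -->
    <uniqueVersion>true</uniqueVersion>
  </snapshotRepository>
</distributionManagement>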
To your 2nd question; my understanding is that you need to update every pom file. But since it is a one time change, it shouldn't be too much of a burden.
I don't completely understand your 3rd question ("... separate developers builds with Hudson Build number..."). If you want to add the build number to every build done by Hudson, you have several options:
You can add a string as a classifier while deploying. Maven will add that classifier to the filename (artifactId-version-classifier.jar, e.g. my.company.calendar-0.0.1-SNAPSHOT-Hudson.jar). The artifact is then retrieved by adding the classifier to the dependency.
You can add another parameter to your Maven call to override the output file name (${project.build.finalName}, see the Maven docs).
You can change your version string or final name to embed the Hudson build number; a sketch of this option follows below.
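A minimal sketch of that last option, assuming the CI server exposes its build number as the BUILD_NUMBER environment variable (the property name build.number and the "local" fallback are just illustrations):
<properties>
  <!-- overridden on the CI server, e.g. with -Dbuild.number=$BUILD_NUMBER -->
  <build.number>local</build.number>
</properties>
<build>
  <finalName>${project.artifactId}-${project.version}-${build.number}</finalName>
</build>
Developer builds then produce artifact-1.0-SNAPSHOT-local.jar, while Hudson builds carry the real build number.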

Release problems with Nexus + Maven + Hudson

When using the release plugin for Maven on Hudson (1.368), I am getting an error that my distributionManagement section is missing during the deployment phase to our Nexus Maven repository manager. If I deploy without using release it works just fine, so it should not be a misconfiguration of the server, the section, or the settings.
It is worth noting that my company uses different pom files for Hudson and has named them differently. Also, the settings.xml is in the individual project directories. This has never been a problem, as Hudson allows the name of the pom and the location and name of the settings file to be specified.
The reason I note the above is that when distributionManagement is moved into the regular pom.xml it does find it (but it still doesn't work, because it's missing the username and password from the settings file). This confuses the heck out of me, since for the prior parts of the release process it uses the correct pom and settings; it just seems to forget them later on. What is going on here?
Thank you in advance.
UPDATE
It seems that the Maven release plugin spins up a new instance of Maven which uses the default pom.xml rather than our differently named pom. More testing is needed.
The answer (for any lost souls who stumble upon this question) is that Maven was indeed forking a new process which was not using the correct pom file and settings. The solution was to add a section to the pom file like this:
<plugin>
  <artifactId>maven-release-plugin</artifactId>
  <version>2.0</version>
  <configuration>
    <goals>-f POMFILE -s SETTINGSFILE deploy</goals>
  </configuration>
</plugin>
This passes those two files to the new Maven process.
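As a side note, the same effect can be sketched with the plugin's arguments parameter, which is intended for extra command-line flags passed to the forked Maven (POMFILE and SETTINGSFILE are the same placeholders as above):
<plugin>
  <artifactId>maven-release-plugin</artifactId>
  <version>2.0</version>
  <configuration>
    <!-- flags forwarded to the forked Maven process -->
    <arguments>-f POMFILE -s SETTINGSFILE</arguments>
    <goals>deploy</goals>
  </configuration>
</plugin>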
If I deploy without using release it works just fine, so it should not be a misconfiguration of the server, the section, or the settings.
Well, there is clearly a misconfiguration somewhere, be it at the Hudson level. But it will be hard to spot it without seeing the pom, the settings, the active profiles, the profiles used during the release, the Hudson setup, etc.
First step: try to reproduce the problem on the command line using the exact same configuration as Hudson.
Second step: use the Maven Help Plugin to understand and debug the issue. More specifically, the following goals:
help:active-profiles
help:effective-pom
help:effective-settings
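For example, using the POMFILE and SETTINGSFILE placeholders from the answer above:
mvn -f POMFILE -s SETTINGSFILE help:active-profiles
mvn -f POMFILE -s SETTINGSFILE help:effective-pom
mvn -f POMFILE -s SETTINGSFILE help:effective-settings
Comparing the effective POM and settings produced this way with what Hudson actually uses usually reveals the mismatch.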
The reason I note the above is that when distributionManagement is moved into the regular pom.xml it does find it (but it still doesn't work, because it's missing the username and password from the settings file).
It's unclear where the distributionManagement is specified if it's outside the project's pom.xml (in a corporate environment it typically goes in a corporate pom.xml; is that the case here?).
It's also unclear if you are actually providing the username and password for a server id matching the repository id of the distributionManagement.
But somehow, a wrong combination is used here. Double check what profiles/settings are active during release/deploy to spot the problem as suggested.
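For reference, the server id in settings.xml has to match the repository id in distributionManagement; a minimal sketch with placeholder ids, URL, and credentials:
<!-- pom.xml -->
<distributionManagement>
  <repository>
    <id>nexus-releases</id>
    <url>http://nexus.example.com/content/repositories/releases</url>
  </repository>
</distributionManagement>
<!-- settings.xml -->
<settings>
  <servers>
    <server>
      <id>nexus-releases</id>
      <username>deploy-user</username>
      <password>deploy-password</password>
    </server>
  </servers>
</settings>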
See also
The Maven Deploy Plugin Usage page

Maven Checkstyle Plugin - Test XREF

My Maven reports are working great, all except Checkstyle and the test xref. My test sources are still being cross-referenced at xref and not at the test xref. So, when I click on the xref link from within a Checkstyle report, I naturally get an error: the file isn't found. If I click on a source file, it works perfectly.
I tried testXrefLocation in the configuration to no avail. Is this by design, or am I missing a configuration?
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-checkstyle-plugin</artifactId>
  <configuration>
    <enableRulesSummary>false</enableRulesSummary>
    <includeTestSourceDirectory>true</includeTestSourceDirectory>
    <configLocation>${project.build.directory}/checkstyle.xml</configLocation>
    <testXrefLocation>${project.reporting.outputDirectory}/xref-test</testXrefLocation>
  </configuration>
</plugin>
mvn clean install site
In my target directory, where all this output is generated, I have both xref and xref-test. However, the Checkstyle report for my test source code still links to target/xref and not target/xref-test.
Also, FYI, I am using a lot of inheritance to reduce the amount of configuration in any single Maven POM. This plugin therefore lives in a parent pom that declares which plugins I want to use for testing; I have another parent that adds javadoc and source generation on top of the compiled code.
Walter
I actually ended up removing this configuration in favor of using Sonar since it gives me much more information with a nicer UI.

using maven to manage java dependencies in a jruby rails app

I'm trying to write a pom.xml that will allow me to run a command locally and fetch all the dependencies that my JRuby Rails app has. I'm seeing two different configs, though, and I'm not totally sure which to use (as I'm not a Java person whatsoever).
First, many POMs I'm seeing just have a <dependencies> section under the root of the pom.xml that lists all the dependencies. This doesn't, however, have any information about where the jars get stored, so I feel like this isn't what I want (I need to copy them to my Rails lib dir).
The second option I'm seeing in the mvn docs is to use the maven-dependency-plugin, which seems more like what I'm looking for. I assume then that my outputDirectory would be something like lib.
So I don't fully understand what the purpose of the first option's dependency list is. All I want is for mvn to copy my jars locally (and then eventually have my CI server do it when it deploys). Can someone point me in the right direction?
First Option
<project>
  <dependencies>
    <dependency>
      <groupId>commons-lang</groupId>
      <artifactId>commons-lang</artifactId>
      <version>2.4</version>
    </dependency>
  </dependencies>
</project>
Second Option
<project>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-dependency-plugin</artifactId>
        <configuration>
          <artifactItems>
            <artifactItem>
              <groupId>[ groupId ]</groupId>
              <artifactId>[ artifactId ]</artifactId>
              <version>[ version ]</version>
              <type>[ packaging ]</type>
              <classifier>[ classifier - optional ]</classifier>
              <overWrite>[ true or false ]</overWrite>
              <outputDirectory>[ output directory ]</outputDirectory>
              <destFileName>[ filename ]</destFileName>
            </artifactItem>
          </artifactItems>
          <!-- other configurations here -->
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
First, many POMs I'm seeing just have a <dependencies> section under the root of the pom.xml that lists all the dependencies. This doesn't, however, have any information about where the jars get stored, so I feel like this isn't what I want (I need to copy them to my Rails lib dir).
This is the traditional way to declare and use dependencies in a Java project. Dependencies declared under the <dependencies> element are downloaded from a "remote repository" and installed into your "local repository" (~/.m2/repository by default), and artifacts are then handled from there. Maven projects (at least the Java ones) don't use a local lib/ folder for their dependencies.
The second option I'm seeing in the mvn docs is to use the maven-dependency-plugin, which seems more like what I'm looking for. I assume then that my outputDirectory would be something like lib.
The Maven dependency plugin allows you to interact with artifacts and to copy/unpack them from local or remote repositories to a specified location. So it can indeed be used to fetch some dependencies and copy them into, say, a lib/ directory. It actually has several goals for doing this:
dependency:copy takes a list of artifacts defined in the plugin configuration section and copies them to a specified location, renaming them or stripping the version if desired. This goal can resolve the artifacts from remote repositories if they don't exist locally.
dependency:copy-dependencies takes the list of project direct dependencies and optionally the transitive dependencies and copies them to a specified location, stripping the version if desired. This goal can also be run from the command line.
The first goal would use the setup you described in your second option. The second goal would use the standard project dependencies that you described in your first option. Both approaches would work.
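For instance, a minimal sketch of the copy-dependencies approach (binding it to the package phase and copying into lib/ are assumptions here; adjust to your workflow):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-deps</id>
      <phase>package</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <!-- copy every declared dependency (and its transitives) into lib/ -->
        <outputDirectory>${basedir}/lib</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
Running mvn package (or mvn dependency:copy-dependencies -DoutputDirectory=lib from the command line) would then populate the lib/ directory.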
The problem here is that I don't know exactly what a JRuby Rails app is, what the development workflow is, how to build such an app, etc., so I don't know exactly what you need to do and, consequently, what the best way to implement it with Maven would be.
So I googled a bit and found this post that shows another approach based on OS commands (using the maven exec plugin) and has a complete pom.xml doing some other things. Maybe you should look at it and use it as a starting point instead of reinventing everything. This is my suggestion actually.