I am opening this thread because I was asked to do so in a reply to a comment/question I added here:
Disable scanning of CDI beans in WAR
The question is the following:
Is there any specific additional step one needs to take, besides adding Jandex to the build pom, to get the feature enabled?
I notice no deployment speed difference when using Jandex on WildFly 10.1.0.Final and WebLogic 12.2.1.2 deployments. If anything, the deployment tends to be about one second slower.
Steps taken:
1. Visit https://github.com/wildfly/jandex-maven-plugin
2. Enrich a multi-module pom with the plugin:
<plugin>
  <groupId>org.jboss.jandex</groupId>
  <artifactId>jandex-maven-plugin</artifactId>
  <version>1.0.5</version>
  <executions>
    <execution>
      <id>make-index</id>
      <goals>
        <goal>jandex</goal>
      </goals>
      <!-- phase is 'process-classes' by default -->
      <configuration>
        <!-- Nothing needed here for simple cases -->
      </configuration>
    </execution>
  </executions>
</plugin>
3. Notice that all .jar files are slightly larger due to the jandex.idx file written to META-INF/.
4. Deploy the WAR application via the WildFly/WebLogic console.
Result: no difference at all in deployment time.
And on this point, believe me, the application is not lightweight in the number of CDI beans it holds.
That is being addressed, but as a short-term solution I would like to find a quick fix to accelerate deployment, and I was hoping Jandex would have some impact.
Instead it seems to make zero difference; if anything, the deployment with Jandex always seemed to take one to two seconds extra.
Some additional information that might be relevant:
Both in WildFly and in WebLogic there is tuning that tells newer versions of Weld not to scan all deployed .jar files.
We use the setting that tells Weld to only consider jar files that contain a beans.xml file.
And these jar files use bean-discovery-mode="all", while CDI recommends the "annotated" mode to reduce analysis time and memory footprint (but that would require a bigger refactoring).
See http://weld.cdi-spec.org/news/2016/10/25/tip3-performance/
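For reference, the "annotated" discovery mode from that tip is set per archive in beans.xml; a minimal CDI 1.1 descriptor would look like the sketch below (shown only to illustrate the refactoring mentioned above):
<?xml version="1.0" encoding="UTF-8"?>
<!-- META-INF/beans.xml (jar) or WEB-INF/beans.xml (war) -->
<beans xmlns="http://xmlns.jcp.org/xml/ns/javaee"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee
                           http://xmlns.jcp.org/xml/ns/javaee/beans_1_1.xsd"
       version="1.1"
       bean-discovery-mode="annotated">
  <!-- with "annotated", only classes carrying a bean-defining
       annotation (@ApplicationScoped, @RequestScoped, ...) become beans -->
</beans>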
So in short:
Is there something more that needs to be done to tell a container to consider the Jandex index?
Or is it simply that Weld is already so fast at analyzing the deployed classes that pre-building the index makes virtually no difference, except adding a few MBytes to the deployment?
I would assume not, because Jandex is still mentioned as a Weld tip for improving deployment speed, so I am tempted to think I am missing some piece of configuration.
Many thanks for any help on this front.
You are right - this won't be faster. It would (most likely) be faster in SE and Servlet environments, but not necessarily in an EE server.
The Weld SPI offers a service interface to integrators (such as WildFly and WebLogic), and they may or may not choose to use it to feed Weld with class information (from Jandex, for instance). Now, I don't know about WebLogic, but I guess they don't use Jandex at all (it is a WildFly sub-project, after all). WildFly, on the other hand, does use Jandex, but it creates its own Jandex index on the fly during deployment, which it then uses instead of the pre-built one you might have bundled. That explains the additional second or so you see.
On the other hand, in SE/Servlet environment, Weld is an "integrator" for itself and can (and does) make sure that Jandex will be used.
Related
I have multiple Maven modules: DBLayer (contains all entities), UserManagement (user management services), WebApp.
To set up the testing, I need to understand how to structure everything.
I want to test each module in an embedded container. As you all know, in each module I have my test folder where I put my test classes. Do I have to put the ShrinkWrap setup in each module test class (code that prepares embedded container for testing), or is that only needed in the webapp?
I also have the arquillian dependencies in each module pom file. Is there a better way to add these dependencies?
Thank you in advance.
I want to test each module in an embedded container (...)
If you are considering an embedded container, please take a look at this answer, where Dan Allen explains it in detail in his short article The Danger of Embedded Containers.
It looks like you are using Maven, so you may find the following answer useful in terms of separating integration tests from unit tests. In short: stick to the Maven naming conventions (name your integration tests with the *IT suffix, e.g. MyClassIT.java) and let maven-failsafe-plugin do its job, as sketched below.
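A minimal failsafe setup following that convention might look like this (the version is just an example; *IT.java classes are picked up by default):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <version>2.18.1</version>
  <executions>
    <execution>
      <goals>
        <!-- run *IT.java classes during the integration-test phase -->
        <goal>integration-test</goal>
        <!-- fail the build afterwards if any of them failed -->
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>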
Do I have to put the ShrinkWrap setup in each module test class (code that prepares embedded container for testing), or is that only needed in the webapp?
Unfortunately, yes. That is because you may want to prepare a different deployment for each test class. But the drawbacks are clear:
Usually repeated deployment setup code.
Slower test execution (there is a new deploy operation for every test class).
Fortunately, the Arquillian Suite Extension lets you avoid that: it forces all classes in a test suite to run from the same DeploymentScenario. But as far as I know, this is a more complex topic than it appears on the surface - just keep in mind that this extension (although extremely useful) may not work with some other Arquillian extensions.
I also have the arquillian dependencies in each module pom file. Is there a better way to add these dependencies?
Your intuition is correct :) Yes, there are better ways. Personally, I would recommend keeping the Arquillian dependency only in a parent pom.xml file; all sub-modules then inherit it:
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.jboss.arquillian</groupId>
      <artifactId>arquillian-bom</artifactId>
      <version>1.1.5.Final</version>
      <scope>import</scope>
      <type>pom</type>
    </dependency>
  </dependencies>
</dependencyManagement>
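With that BOM imported, a sub-module can declare an Arquillian artifact without repeating the version; for example (arquillian-junit-container is one of the artifacts the BOM manages):
<dependencies>
  <dependency>
    <groupId>org.jboss.arquillian.junit</groupId>
    <artifactId>arquillian-junit-container</artifactId>
    <!-- version inherited from the imported arquillian-bom -->
    <scope>test</scope>
  </dependency>
</dependencies>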
Generally, be sure to read Getting started with Arquillian Guide.
I'm working on getting the rpm-maven-plugin set up in a project. In our staging and production environments, the build occurs on Red Hat boxes, but we have several Windows boxes that are used for development and testing, so I wanted the RPM build process to be part of a profile that is only active on a box that has rpmbuild installed.
This was my first attempt at an activation condition:
<activation>
  <os>
    <family>unix</family>
  </os>
  <file>
    <exists>/usr/bin/rpmbuild</exists>
  </file>
</activation>
My initial testing only involved building on a Windows box and building on a CentOS box, and both gave me the results I expected. Later, the build broke on a Linux machine that didn't have rpmbuild available. It looks like having two conditions like this isn't supported. Is this the case? I realize I can probably just get rid of the <os/> element and get the results I want, but for future reference is there a better way to create profiles with multiple activation conditions?
The Maven <activation> block is a list of OR conditions - the profile will be activated as soon as the first criterion is met. So it is unlikely that your problem has a solution, at least until this bug report gets fixed: https://issues.apache.org/jira/browse/MNG-4565
Update:
It's fixed in 3.2.2 now – sfussenegger (via comment)
And worse: you can mix conditions of different types (for example file, jdk, and property), as described here: http://www.sonatype.com/books/mvnref-book/reference/profiles-sect-activation.html, but you can't even put two conditions of the same type, for example two properties:
<activation>
  <property>
    <name>integrationTest</name>
  </property>
  <property>
    <name>packaging</name>
    <value>swf</value>
  </property>
</activation>
This won't work, as only one <property> tag is allowed.
Associated JIRA: https://issues.apache.org/jira/browse/MNG-3328
And the bug described above is still open... five years, it's just a shame!
Just fixed by me :)
Starting from 3.2.2 it works as expected: multiple conditions are ANDed.
Reference - https://github.com/apache/maven/commits/master, search by MNG-4565
Commit URL - https://github.com/apache/maven/commit/c6529932f9e3efdfc86ed73f59a307a8f8b6ea5f
I think this is what these Maven extensions do:
Maven EL Profile Activator Extension
This one is pretty simple; have a look at the source.
Maven Profile Activation Extension
This one has more options for the actual activation expression, including Scala.
However, since it's an extension (not a plugin), every project using it will have to register the extension. And there's a risk that the project author will abandon it and it won't work with future Maven versions.
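For reference, since Maven 3.3.1 a core extension like these can typically be registered per project in a .mvn/extensions.xml file next to the root pom.xml; the coordinates below are placeholders, not the real ones of either project:
<?xml version="1.0" encoding="UTF-8"?>
<extensions xmlns="http://maven.apache.org/EXTENSIONS/1.0.0">
  <extension>
    <groupId>com.example</groupId>
    <artifactId>profile-activator-extension</artifactId>
    <version>1.0.0</version>
  </extension>
</extensions>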
When using the release plugin for Maven on Hudson (1.368), I am getting an error that my distributionManagement section is missing during the deployment phase to our Nexus Maven Repository Manager. If I deploy without using release, it works just fine, so it should not be a misconfiguration with the server, the section or the settings.
It is worth noting that my company uses different pom files for Hudson and has named them differently. Also, the settings.xml is in the individual project directories. This has never been a problem, as Hudson allows the name of the pom and the location and name of the settings file to be specified.
The reason I note the above is that when distributionManagement is moved into the regular pom.xml, it does find it (but still doesn't work, because it is missing the username and password in the settings file). This confuses the heck out of me, since for the prior parts of the release process it uses the correct pom and settings. It just seems to forget them later on. What is going on here?
Thank you in advance.
UPDATE
It seems that the maven-release-plugin spins up a new instance of Maven which uses the default pom.xml rather than our differently named pom. More testing is needed.
The answer (for any lost souls who stumble upon this question) is that Maven was indeed forking a new process which was not using the correct pom file and settings. The solution was to add a section to the pom file like this:
<plugin>
  <artifactId>maven-release-plugin</artifactId>
  <version>2.0</version>
  <configuration>
    <goals>-f POMFILE -s SETTINGSFILE deploy</goals>
  </configuration>
</plugin>
This passes those two files to the new Maven process.
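As a side note, the release plugin also exposes an <arguments> parameter intended for passing extra command-line arguments to the forked Maven, which may be a cleaner home for the flags than <goals>; a sketch, reusing the placeholder file names above:
<plugin>
  <artifactId>maven-release-plugin</artifactId>
  <version>2.0</version>
  <configuration>
    <!-- passed verbatim to the forked Maven invocation -->
    <arguments>-f POMFILE -s SETTINGSFILE</arguments>
    <goals>deploy</goals>
  </configuration>
</plugin>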
If I deploy without using release, it works just fine, so it should not be a misconfiguration with the server, the section or the settings.
Well, there is clearly a misconfiguration somewhere, be it at the Hudson level. But it will be hard to spot without seeing the pom, the settings, the active profiles, the profiles used during the release, the Hudson setup, etc.
First step: try to reproduce the problem on the command line using the exact same configuration as Hudson.
Second step: use the Maven Help Plugin to understand and debug the issue. More specifically, the following goals:
help:active-profiles
help:effective-pom
help:effective-settings
The reason I note the above is that when distributionManagement is moved into the regular pom.xml, it does find it (but still doesn't work, because it is missing the username and password in the settings file).
It's unclear where the distributionManagement is specified if it is outside the project's pom.xml (in a corporate environment it typically goes in a corporate pom.xml - is that the case here?).
It's also unclear whether you are actually providing the username and password for a server id matching the repository id of the distributionManagement.
But somehow a wrong combination is used here. Double-check which profiles/settings are active during release/deploy to spot the problem, as suggested; the matching pair is sketched below.
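To make the matching explicit, a minimal sketch of that combination - the id under <servers> in settings.xml must equal the repository id in <distributionManagement> (ids, URL and credentials here are made up):
<!-- pom.xml -->
<distributionManagement>
  <repository>
    <id>nexus-releases</id>
    <url>http://nexus.example.com/content/repositories/releases</url>
  </repository>
</distributionManagement>

<!-- settings.xml: credentials for the matching server id -->
<settings>
  <servers>
    <server>
      <id>nexus-releases</id>
      <username>deployer</username>
      <password>secret</password>
    </server>
  </servers>
</settings>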
See also
The Maven Deploy Plugin Usage page
I have to deal with a pretty ugly and large blob of ColdFusion code which, to this day, is maintained by direct modifications on the production server (don't ask). I managed to clean it up from dupes and backups and put it into Subversion; now I need to pick a build system to be able to put this onto continuous build (TeamCity) and also do scheduled releases.
To my surprise I found pretty much a single blog article on how to retrofit a CF project with Maven, so the question is - does anyone have experience successfully using Maven with CF, and what in general do people use to manage large CF projects?
Your suggestions, tips and links will be much appreciated.
Since I don't want to start religious wars - Maven is pretty much the company standard (vs Ant).
First, here's another blog you might find helpful.
build-tools-maven-and-coldfusion
I haven't tried to build ColdFusion with Maven, but I have experience with managing Maven builds for a large company. There are a few things for you to consider.
Project structure
ColdFusion cfm and cfc files should be put in src/main/resources so they are bundled in the jar (the blog referenced above overrides the Maven convention to put them in src; this is OK, but could be a problem if you later need to add anything else to the project).
I'd probably keep cfc and cfm files in separate projects with appropriate dependency declarations to link them; this keeps your cfc projects as libraries and helps reuse. It is also worth considering the granularity of the cfc projects. Generally Maven's dependency management helps you keep artifacts small, with little need to worry about finding all the jars.
Deployment
The simplest way to deliver the artifacts is to use the maven-war-plugin to create a war containing your artifacts and all their transitive dependencies. This makes each application self-contained, which can be useful. The downside is that you'll end up bundling the same artifacts repeatedly, and they can be quite large. To mitigate this you can either use the assembly-plugin to create custom packages excluding the common components, or specify that certain components (e.g. ColdSpring) have scope provided, which means they won't be included in the war (see the sketch below).
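For instance, marking a common library as provided keeps it out of the war (coordinates borrowed from the ColdSpring example later in this answer, purely for illustration):
<dependency>
  <groupId>org.coldspringframework</groupId>
  <artifactId>coldspring</artifactId>
  <version>1.2</version>
  <!-- provided: available at build time, but not bundled in the war -->
  <scope>provided</scope>
</dependency>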
Version Management
Maven encourages a proliferation of dependencies; by default each dependency declaration has a version, and this can cause maintenance issues, particularly when you want to bump the version of an external dependency. You can mitigate this by defining a parent POM or an "app" POM. Either would have a dependencyManagement section declaring the details (groupId, artifactId, and version) of common artifacts. Any POM inheriting from the parent need not declare the dependency version, as it will be inherited (note this doesn't mean that all children will have all dependencies, only that any that declare a dependency don't need to declare the version). If you define an "app" project with packaging "pom" and a dependencyManagement section, you can reference it with scope import (from Maven 2.0.9 onwards); this will import the dependencyManagement section from the "app" project into the project POM, as sketched below. See the dependency documentation for more details.
If you declare a dependency with a scope in the dependencyManagement section, that scope will be inherited unless it is overridden in the child POM. Related to the deployment section above, this means you can declare the common libraries scope provided in the parent to ensure they are not bundled in each application.
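A sketch of that "app" POM import (the coordinates are hypothetical):
<dependencyManagement>
  <dependencies>
    <!-- pulls the dependencyManagement section of the "app" POM
         into this project (Maven 2.0.9+) -->
    <dependency>
      <groupId>com.mycompany</groupId>
      <artifactId>app</artifactId>
      <version>1.0.0</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>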
Naming Conventions
You'll need a naming convention for the packages to avoid collisions.
It's probably best to follow the Maven convention and use java package-like groupIds (org.apache.maven for maven.apache.org) and the jar name for the artifact. This convention would give the groupId "org.coldspringframework" and artifactId "coldspring" for ColdSpring.
Further distinctions might need to be made across the company. For example, if you have a web and core team, you could give the web team the groupIds com.mycompany.web.* and the core team com.mycompany.core.*
Dependency Management
You'll need to add your CFC packages to a Maven repository such as Nexus so they are accessible to other builds across the enterprise.
If you want to keep the CFC packages separate from the jars, you can specify a custom packaging type so that they won't be mixed up with any Java artifacts. If you create a custom packaging type, the artifacts can still have the ".jar" extension, but any dependency declaration must have the type set.
Here's an example following those conventions:
<dependency>
  <groupId>org.coldspringframework</groupId>
  <artifactId>coldspring</artifactId>
  <version>1.2</version>
  <!-- custom packaging type helps keep it separate from Java artifacts -->
  <type>cfc</type>
</dependency>
There's a section in the Nexus book that describes custom lifecycles (follow the links for more details). Essentially you need to create a plugin with a META-INF/plexus/components.xml to describe the plexus mechanics (what archiver to use, what extension to output, etc.).
The components.xml would look something like this:
<component-set>
  <components>
    <component>
      <role>org.apache.maven.lifecycle.mapping.LifecycleMapping</role>
      <role-hint>cfc</role-hint>
      <implementation>org.apache.maven.lifecycle.mapping.DefaultLifecycleMapping</implementation>
      <configuration>
        <phases>
          <process-resources>org.apache.maven.plugins:maven-resources-plugin:resources</process-resources>
          <package>com.hsbc.maven.plugins:maven-jar-plugin:jar</package>
          <install>org.apache.maven.plugins:maven-install-plugin:install</install>
          <deploy>org.apache.maven.plugins:maven-deploy-plugin:deploy</deploy>
        </phases>
      </configuration>
    </component>
    <component>
      <role>org.apache.maven.artifact.handler.ArtifactHandler</role>
      <role-hint>cfc</role-hint>
      <implementation>org.apache.maven.artifact.handler.DefaultArtifactHandler</implementation>
      <configuration>
        <extension>jar</extension>
        <type>cfc</type>
        <packaging>cfc</packaging>
      </configuration>
    </component>
    <component>
      <role>org.codehaus.plexus.archiver.Archiver</role>
      <role-hint>cfc</role-hint>
      <implementation>org.codehaus.plexus.archiver.zip.ZipArchiver</implementation>
      <instantiation-strategy>per-lookup</instantiation-strategy>
    </component>
    <component>
      <role>org.codehaus.plexus.archiver.UnArchiver</role>
      <role-hint>cfc</role-hint>
      <implementation>org.codehaus.plexus.archiver.zip.ZipUnArchiver</implementation>
      <instantiation-strategy>per-lookup</instantiation-strategy>
    </component>
  </components>
</component-set>
Maven looked interesting to me too, but I couldn't find enough resources and didn't have enough time to figure it out, so I moved on to other options that seemed good as well.
I understand you prefer to use Maven, but I have come across several articles regarding Ant and ColdFusion, as well as a recent one about Hudson with ColdFusion.
ColdFusion also has the (undocumented) cfant tag, so you can run Ant scripts right from CF.
I'm just in the middle of revisiting Maven. Our team had a bad experience when we last looked at it, as that was during the period when Maven was rearchitecting from 1.x to 2.x, so a lot of the dependencies we needed hadn't been moved across to the new repositories. However, I have the time to reconsider now.
I am interested in using Maven and either LaTeX or DocBook for creating documentation, and I was wondering if anyone had any experiences to share: project/module structure, good plugins to use, etc.
Many thanks :-)
Edit:
Just to clarify, I was looking to write a technical article/book, and my desired artifact would probably be a PDF.
DocBook is one of the many supported input formats for Doxia, the engine Maven uses to generate docs. Refer here: http://maven.apache.org/doxia/modules/index.html
In fact, the Doxia site answers your exact question: http://maven.apache.org/doxia/book/index.html
You can easily create a site (that contains documentation) with Maven using the mvn site command (i.e. using the site plugin).
This plugin creates technical reports (such as Javadoc, unit test reports, code coverage...) but can also be used to create a "real site".
There are more details about that on this page.
Basically, you write your pages using APT (Almost Plain Text, which is quite simple to understand) or an XML-based format, Xdoc; a minimal Xdoc page is sketched below.
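A minimal Xdoc page (conventionally placed under src/site/xdoc/) looks roughly like this:
<?xml version="1.0" encoding="UTF-8"?>
<document>
  <properties>
    <title>User Guide</title>
  </properties>
  <body>
    <section name="Introduction">
      <p>Documentation content goes here.</p>
    </section>
  </body>
</document>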
Two years ago I created a complete user guide for an application I developed, using the Xdoc format and the Maven Site Plugin. Overall, it was quite easy to create!
I hope this will help you!
I've been using the Docbkx Maven plugin with success. You should give it a try:
Docbkx
You should definitely take a look at the Maven Docbkx Plugin. It probably fits your needs. Doxia's support of DocBook is - uhm - suboptimal. In fact, last time I tried it, it generated something new that - as far as I could tell - wasn't DocBook.
The Maven Docbkx Plugin that I'm referring to supports all the customizations of the world (through plugin parameters, or XSLT overrides if you're into that), plus it features some mechanisms to integrate it with the Maven build (such as processing instructions for including Maven pom properties in your documents).
Note that the ambition is to have a plugin that prevents you from having to manually put together a processing chain yourself. So this plugin will both do the transformation to FO and transform that to PDF.
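A rough sketch of how that plugin is typically wired in (goal names and the xincludeSupported flag are from memory; double-check them against the plugin's documentation):
<plugin>
  <groupId>com.agilejava.docbkx</groupId>
  <artifactId>docbkx-maven-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <!-- transform the DocBook sources to PDF (via FO) and HTML -->
        <goal>generate-pdf</goal>
        <goal>generate-html</goal>
      </goals>
      <phase>package</phase>
    </execution>
  </executions>
  <configuration>
    <xincludeSupported>true</xincludeSupported>
  </configuration>
</plugin>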
I recently implemented the project documentation for my Maven multi-module project using DocBook and the docbkx plugin for Maven. I now have it automatically generating HTML and PDF files every time I build the project site. I think docbkx really rocks, so I would suggest you use that.
It's true - you can create a very nice site just using the Maven site and Doxia plugins. In fact I'm using those two to generate my project site. But Doxia's support for DocBook is very limited and doesn't let you modularize documentation (including parts of documents in a main document, for instance), so for the big reference manuals I'm using docbkx.
If you want to take a peek, my project is here. You can download the source and see the nitty-gritty of it. And, of course, if you have any questions regarding this setup, I'll be more than glad to help.
Cheers
Carlos
Although the question is quite old, I want to give an update on this. If you want to use LaTeX for your documentation, you should use a Maven plugin to generate it. There are a couple of Maven plugins doing this, but a lot of them are not maintained anymore.
There is a new Maven plugin which requires little or no configuration to get working, and the generated PDF (or PS or DVI) can be published as an artifact.
Have a look at: mathan-latex-maven-plugin
There is AFAIK no official or semi-official plugin that will process LaTeX or DocBook, but what you could do (besides using the aforementioned site plugin) is configure the exec plugin to process your LaTeX/DocBook sources during the site lifecycle, i.e. at the same time the project's website is built.
E.g., something like:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>latex</id>
      <goals>
        <goal>exec</goal>
      </goals>
      <phase>site</phase>
      <configuration>
        ...
      </configuration>
    </execution>
  </executions>
</plugin>
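To make the elided <configuration> concrete, here is a sketch assuming pdflatex is on the PATH and the sources live in a hypothetical src/site/latex/manual.tex (adjust paths and file names to your layout):
<configuration>
  <executable>pdflatex</executable>
  <workingDirectory>${basedir}/src/site/latex</workingDirectory>
  <arguments>
    <!-- write the generated PDF into target/ -->
    <argument>-output-directory=${project.build.directory}</argument>
    <argument>manual.tex</argument>
  </arguments>
</configuration>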