I have experience swapping business logic in .NET by loading assemblies and using reflection to find implementations of an interface. This enabled behaviour composition at runtime simply by distributing DLL files and dropping them into the application's working directory. How can I achieve the same in Clojure?
I have been told I could compile my Leiningen project without AOT compilation, declaring a dependency on a class which the JVM will search for, I assume, in sibling JAR files. I've also seen that Java 9 has a solution called "Jigsaw", and there are other projects such as lein-jlink too. I'm unsure whether those are suitable.
I'd really appreciate an article/tutorial, a working example, or a few good hints on how to do this, as I'm also new to the JVM.
My project in particular would involve a business logic model "module" loaded at startup, consuming messages and producing messages in return. It's meant to be somewhat of a black box.
An alternate route I'd like to avoid is an MQTT-style approach where distributed modules are relatively heavy standalone programs.
Thank you for your time.
In plain Java you can use the same approach as you did in C#: you develop a core and provide interfaces that can be used for extensions, then you inspect (using reflection) the Java classpath for implementations of the interface inside JAR files (the same idea as DLLs). The Java classpath is either an environment variable or a command-line parameter holding a list of paths where to search for JAR files.
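Rather than scanning JARs by hand, the JDK's java.util.ServiceLoader already implements this classpath-based discovery. A minimal sketch of the idea (the MessageHandler interface and PluginLoader class are hypothetical names, not anything from the question):

```java
import java.util.ServiceLoader;

// Hypothetical extension point shipped with the core application.
interface MessageHandler {
    String handle(String message);
}

public class PluginLoader {
    public static void main(String[] args) {
        // ServiceLoader walks the classpath looking for JARs that list an
        // implementation class in META-INF/services/MessageHandler
        // (the file is named after the interface's fully qualified name).
        ServiceLoader<MessageHandler> handlers = ServiceLoader.load(MessageHandler.class);
        for (MessageHandler handler : handlers) {
            System.out.println(handler.handle("ping"));
        }
    }
}
```

A plug-in JAR dropped onto the classpath then only needs that one provider-configuration file plus the implementation class, which is roughly the JAR equivalent of dropping a DLL into the working directory; from Clojure you can call the same API through Java interop if you go the compiled-class route.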
In Clojure, you have the advantage that you can distribute libraries either as compiled code or as source code which the Clojure runtime will load. I'd recommend looking into the Deps and CLI guide, because it will give you good guidance on how to:
add dependencies in a configuration file through various means, including loading dependencies from private repos, or even a dependency on a git repo at an exact commit
launch your code with the various switches or configuration you might need, so that you can change behaviour by editing a config file
I have never used modules in IntelliJ IDEA, but Java 9 introduced modules (which I also haven't used but want to study now).
So the question is: do they match each other? Or did IDEA modules appear long before and for different purposes?
It's a similar concept that appeared long before Java 9 modules. It's also not IDE-specific: build systems like Maven and Gradle use the same concept when working with projects that consist of multiple sub-projects. In IntelliJ IDEA terminology a module is just a sub-project (in Eclipse a module is a Project, and a Workspace can have multiple Projects).
Java 9 modules map to IntelliJ IDEA modules and provide additional features via the module descriptor, which specifies (see the sketch after this list):
the packages it explicitly makes available to other modules (all other packages in the module are implicitly unavailable to other modules)
the services it offers
the services it consumes
to what other modules it allows reflection
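Those four capabilities correspond one-to-one to clauses in a module-info.java descriptor. A rough sketch, with every module and package name invented for illustration:

```java
// module-info.java -- every module and package name here is hypothetical
module com.example.orders {
    requires com.example.payments;  // the module that defines PaymentGateway

    // packages explicitly made available to other modules
    // (everything not exported stays inaccessible to them)
    exports com.example.orders.api;

    // services the module offers
    provides com.example.orders.api.OrderService
        with com.example.orders.internal.DefaultOrderService;

    // services the module consumes
    uses com.example.payments.PaymentGateway;

    // modules allowed (deep) reflection into a given package
    opens com.example.orders.internal to com.example.persistence;
}
```

IntelliJ IDEA picks this descriptor up per IDE module and compiles against the module path instead of the classpath, which is what the quoted passage below is about.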
IntelliJ IDEA already has a concept of modules for a project. Every IntelliJ IDEA module builds its own classpath. With the introduction of the new Java platform module system, IntelliJ IDEA modules had to extend their capability by supporting the Java platform's module path if it is used instead of the classpath.
Related links:
IntelliJ IDEA modules explained
Migrating From Eclipse to IntelliJ IDEA
Understanding Java 9 Modules
Getting Started with Java 9 Module System
Support for Java 9 Modules in IntelliJ IDEA
Much as I love IntelliJ, I have to answer yes: there is one big difference.
Java 9 Modules are a much needed step toward encapsulation and decoupling. But there is a form of (accidental? pathological?) coupling in IntelliJ modules by virtue of their membership in a single IntelliJ (and therefore VCS) project. Java 9 Modules can (and probably should, from the standpoint of encapsulation) be developed in separate VCS/IDE projects, from which they can expose only the APIs that make sense. A super (parent) POM is a mechanism providing non-pathological coupling across projects to reduce redundancy.
A module based on a well-defined domain and bounded context should be freely available for reuse: it's a big step toward the componentization we have been talking about for years. These domains are not random - they reflect an emerging analysis of the world. If it sounds like I'm advocating something like the anarchy and Balkanization faced by Node developers - I'm not: the well-analyzed domain is the key.
Either IntelliJ modules are "benefiting corruptly" from the kind of otherwise highly desirable coupling-under-the-hood that is one of the fantastic benefits of IntelliJ projects, or there is no value added by maintaining them in a single project. Working in a non-TBD environment with a branch for every JIRA ticket increases the cost of this kind of coupling (personal experience).
The justification in another answer for this coupling is the possibility of a "refactoring" that involves changes to multiple modules. But this is a code/design smell: either breaking changes are being made to a public API, which is a problem for all clients and can usually be avoided or mitigated with some imagination, deprecation, EOL-warnings (Strangler pattern), etc., or the modules exhibit pathological cohesion, probably due to incomplete analysis into bounded contexts.
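As a toy Java illustration of the deprecate-and-delegate route mentioned above (every class and method name is invented): the old entry point stays in place, forwards to its replacement, and advertises its end of life, so client modules can migrate on their own schedule instead of being broken.

```java
// All names are invented for illustration.
public class InvoiceService {

    public record Invoice(String id) {}
    public record Receipt(String invoiceId, String method) {}

    // New API the replacement code exposes going forward.
    public Receipt settle(Invoice invoice, String paymentMethod) {
        return new Receipt(invoice.id(), paymentMethod);
    }

    // Old API kept alive during the migration window: it delegates to the
    // new method and signals its planned removal, so callers get a
    // deprecation warning instead of a breaking change.
    @Deprecated(since = "2.3", forRemoval = true)
    public Receipt pay(Invoice invoice) {
        return settle(invoice, "card");
    }
}
```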
I'm starting development using OSGi, but one of my concerns is the lack of support at development time, meaning that IDEs commonly (I started with IntelliJ IDEA) don't use OSGi for class discovery but an IDE-managed classpath search (I'm searching for one that uses OSGi instead).
The main concern here is to prevent classpath issues at execution time by using the same OSGi mechanisms at development time.
Does any IDE work this way?
OSGi is a runtime technology, so there is no such thing as an OSGi mechanism at build time. Also bear in mind that ultimately all Java code must be compiled by a Java compiler, usually javac. The javac compiler does not use package dependencies like Import-Package; it always uses JARs or directories on the classpath.
Having said that, Bndtools uses package filtering at build time, based on the exported and private packages of the dependencies. This is a special feature of Eclipse and it does not work when you compile outside of the IDE, e.g. with Ant or Maven. However it may still be useful because if you try to use a non-exported package from another bundle you will get a problem marker with a red X in the Eclipse IDE.
I have a Maven-managed project which contains a few modules, one of which is the actual library of interest. The other modules are just add-ons or examples that build off of the library. I'm looking to generate the Maven site for this library and have it automatically deployed (as a standalone site and not as part of a multi-module site) but I am having trouble with the Javadoc plugin.
When executing the javadoc:javadoc goal, the javadoc plugin is attempting to access the jar for the other modules causing a failure.
I have created a simple example which demonstrates this phenomenon. Make sure you run the clean goal before any others so that the flaw is shown. Though executing the packaging first would avoid this error, that cannot be done, because the use case occurs during the Maven-managed release process, which starts from a clean state.
Is there a way for me to disable this functionality in the javadoc plugin so that I only get the documentation for the library module?
I can think of two options, depending on your preference. Both involve using profiles. If you want the default build to create the javadocs for your library of interest, make the other modules use a property inside the default profile in order to skip the javadocs.
If you are okay with passing in a profile, just have the javadocs only run in the profile.
I want to deliver a single .jar file to my clients, but my project is currently built with Maven, and I have several modules that generate a single .jar each.
I know nesting different .jar files is not a great idea, so I am not sure how I can achieve this.
Any ideas?
If you really want to go this direction, there are several ways to do that:
with the Maven Assembly Plugin and maybe the jar-with-dependencies predefined assembly descriptor (that will unpack dependencies)
with the Maven Shade Plugin (similar to the above one but gives more flexibility)
with the Maven One-Jar Plugin (that uses One-JAR and its custom classloader to allow nesting of JARs)
Depending on your exact requirements and constraints, you might prefer one or the other.
First of all, ask yourself if you have a really good reason for packaging your application and all of its dependencies into a single jar. I haven't found very many good reasons for this at all (most of them being related to organizational policy foolishness or just plain ignorance). The way to go is to keep libraries in their own jars and supply a .zip/.tar.gz containing all of your libraries and your application, with either
an executable .jar with the classpath set up appropriately in your MANIFEST.MF file
a .bat/.sh script that invokes java and builds an appropriate classpath based on your deps
Alternatively, use JNLP (better known as Java Web Start).
If you really want to have Maven bundle all of your dependencies and your application into a single jar, what you want to use is the "jar-with-dependencies" predefined assembly. The Maven Assembly Plugin usage page also shows how you might set this up.
You can try Maven Shade plugin too.
http://maven.apache.org/plugins/maven-shade-plugin/
General instructions on how to use the Shade Plugin can be found on its usage page, and more specific use cases are described in the plugin's examples and on its wiki page.
Actually, nesting .jar files doesn't work out of the box: the standard classloader can't load classes from a jar inside another jar (which is why tools like One-JAR ship a custom classloader).
.war and .ear files can contain jars, and that's a good solution if you're delivering a J2EE application.
If your app is just J2SE, however, I recommend looking at the Maven Assembly plugin. As the name implies, it allows you to create a single binary distribution of your build.
We have a Java codebase that is currently one Web-based Netbeans project. As our organization and codebase grows it seems obvious that we should partition the various independent pieces of our system into individual jars. So one Jar library for the data access layer, one for a general lib, one for a specialized knowledge access, etc. Then we'd have a separate project for the web application, and could have one for a command line tools app, another web app eventually, etc.
What is the recommended practice for doing and managing this? Is it Maven? Can it all be done effectively with just NetBeans alone, by simply creating individual projects and setting the dependencies of one project on the jar files of the others?
I'd agree with SteveG above on using Maven2 to help you modularise your code base, but I'd use Nexus as the local repository for Maven instead of Archiva. The guys at Sonatype also have an excellent (free html/pdf) book on how to use Maven, Nexus, and integrate it into IDEs.
Be careful on how you decide to partition up your projects, though. There's no sense in over-complicating your dependencies just for the sake of it.
I would definitely say check out Maven (2). It is very good for doing this sort of thing. You can define individual modules and version them very easily. NetBeans also does a decent job of integrating with it.
Also, I suggest you set up Archiva, which will let you depend on binaries of other artifacts that your company generates internally. It also acts as a proxy and keeps a local copy of any external dependencies your projects might have, so it's very quick to get new versions internally.
I would create ant scripts to build the pieces and for deployment. Then you are not depending on your IDE for build/deployment.
It sounds like your code is getting to the point where you're graduating from the WAR approach and have entered into the EAR level.
An EAR is just another archive that contains all the other JARs and WARs that get combined to create an application. There are four types of modules that can reside inside it: Web, EJB, Connectors and Utilities. Most people only use Web and Utilities, so they go with the WEB-INF/lib approach.
But if you're starting to get a lot of interdependencies, what you do is create an EAR project and make your web project a child of it. Each utility JAR, which is just straight Java code used by other modules, also becomes a child of the EAR. Finally, each of your projects should have a META-INF/MANIFEST.MF file that lists the names of the JARs that JAR/WAR depends on.
I'm an Eclipse guy and most of this gets taken care of for you in Eclipse, but I'm sure NetBeans has very similar functionality.
Now the only problem is that you have to use a full Java EE server to deploy an EAR so I don't think you can use Tomcat if that's what you're currently using.