Implications of Mule runtime version change from 3.8.x and 3.9 to 3.9.2

I have a few projects that need to run at the same time, so I have one run/debug configuration. Not all projects use the same Mule runtime (EE), and I want all of them under the same runtime version. One is 3.8.4, another 3.9.0, and the rest 3.9.2. What are the main implications that I should be aware of?

Related

Adding different versions of same plugin in Feature file

While trying to generate a build we have some plugin dependencies. When we try to add them in the .product file, the plugin version is shown as 0.0.0 by default. We have a situation where we need to add more than one version of the same plugin.
We tried to manually change 0.0.0 to the required version from the dependencies. We are able to launch the application successfully, but while trying to generate a build we get some errors. We have the required plugins installed.
If anyone knows how to add different versions, the help is much appreciated.
Edit:
Screenshots (not reproduced here) show the problem and the workaround we tried: manually changing the version number from 0.0.0 to the required one, which then causes an error during build generation.
I'm not sure that this scenario is supported by PDE Build, because it sounds a bit exotic.
You can try to use different features to introduce the different versions of the bundle.
But I think a more promising strategy would be to "align" your dependencies, i.e. in your case it is better to select the version of GMF that uses the right Batik version.
Otherwise sooner or later you will get a "blocking" bundle with singleton:true in your dependency tree, as #greg-449 mentioned.
Also, please have a look at this question: Tycho | How to build multiple version of same plugin using tycho
See my answer there:
https://stackoverflow.com/a/62426443/9062163
In fact the details I mentioned in my other answer were the result of a successful integration of Sirius 6.0.1 in an Eclipse RCP based on Photon. The trouble came from the integration of Batik 1.7 and 1.6 in the same product, the latter version being forced by the GMF version I use. I also needed some Batik plugins of versions 1.8.0 and 1.9.1 for other reasons.
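For illustration, the "separate features" idea could look roughly like this (the feature and plug-in IDs and versions below are made up; whether PDE Build accepts the resulting product is exactly what has to be verified in your setup):

    <!-- Hypothetical feature carrying the older Batik -->
    <feature id="com.example.batik16.feature" version="1.0.0">
       <plugin id="org.apache.batik.util" version="1.6.0" unpack="false"/>
    </feature>

    <!-- Hypothetical second feature carrying the newer Batik; both features
         are then included in the .product instead of the raw plug-ins. -->
    <feature id="com.example.batik17.feature" version="1.0.0">
       <plugin id="org.apache.batik.util" version="1.7.0" unpack="false"/>
    </feature>

As far as I know the version attributes have to match the exact versions (including qualifiers) present in your target platform; 0.0.0 simply means "take whatever is there".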

OSGI aware IDE at development time

I'm starting development using OSGi, but one of my concerns is the lack of support at development time, meaning that IDEs commonly (I started with IntelliJ IDEA) don't use OSGi for class discovery but an IDE-managed classpath search (I'm looking for one that uses OSGi instead).
The main concern here is to prevent classpath issues at execution time by using the same OSGi mechanisms at development time.
Does any IDE work this way?
Update: added link to blog post with my experience with IDEA
OSGi is a runtime technology, therefore there is no such thing as an OSGi mechanism at build time. Also bear in mind that ultimately all Java code must be compiled by a Java compiler, usually javac. The javac compiler does not use package dependencies like Import-Package, it always uses JARs or directories on the classpath.
Having said that, Bndtools uses package filtering at build time, based on the exported and private packages of the dependencies. This is a special feature of Eclipse and it does not work when you compile outside of the IDE, e.g. with Ant or Maven. However it may still be useful because if you try to use a non-exported package from another bundle you will get a problem marker with a red X in the Eclipse IDE.
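As a rough sketch of what that filtering looks like (the bundle and package names are invented), the bnd.bnd files of a provider and a consumer might look like this in Bndtools:

    # Provider bundle's bnd.bnd: only the api package is exported;
    # the impl package stays private and is invisible to other bundles.
    Export-Package: com.example.api
    Private-Package: com.example.impl

    # Consumer bundle's bnd.bnd: the provider is on the build path, so
    # com.example.api compiles, while a reference to com.example.impl
    # is flagged with a problem marker in the IDE.
    -buildpath: com.example.provider;version=1.0

Outside the IDE, javac still just sees the provider's JAR on the classpath, which is the limitation described above.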

IntelliJ Datanucleus Enhancer plugin not working

The project I'm developing uses DataNucleus 2.0.3, so I'm using those libraries for enhancement (the plugin is configured to use the module dependencies as well). IntelliJ version 12.0.1 on an Ubuntu 12.04 machine. I know 2.0.3 is ancient history, but upgrading it is not an option for me at the moment.
From Gradle everything works fine. I imported my project into IntelliJ, and when I ran the tests from JUnit I got the usual ClassNotPersistenceCapableException, so I recalled I need a plugin for this.
I installed the newest plugin (tried both the beta and the last stable version) and configured the plugin to enhance this one module. I chose JDO and applied; it discovered all the classes annotated for persistence. I rebuilt the whole project, ran the tests again, and the same error occurred.
Some things I've noticed / checked:
- the Enhancer is ticked in "Build / Datanucleus Enhancer"
- looked for multiple datanucleus jars, but there is only one
- haven't seen any message in IntelliJ's Event Log saying it has done the enhancing (the Gradle enhancer logs such a message)
- haven't seen any error messages in IntelliJ saying enhancement failed; I also didn't find any log files outside IntelliJ (should there be any?)
- when I manually added the Gradle-built classes at the top of the classpath for the test, the tests passed - but this is no good
- the module has the following datanucleus 2.0.3 jars on its classpath: datanucleus-core, datanucleus-enhancer, datanucleus-connectionpool, datanucleus-rdbms and asm-3.1.jar (the dependency range says 3.0-4.0, so this one should fit)
I have no idea why it sees the classes but doesn't enhance them, or maybe it does try and silently fails... but then I don't know how to diagnose the problem.
No other ideas come to mind; please advise what to check or what to try.
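For context, the classes in question are plain JDO-annotated classes like the hypothetical one below; checking from a test whether the compiled class implements javax.jdo.spi.PersistenceCapable is one way to see whether the IDE build actually enhanced it:

    import javax.jdo.annotations.PersistenceCapable;
    import javax.jdo.annotations.Persistent;
    import javax.jdo.annotations.PrimaryKey;

    // Hypothetical persistent class of the kind the enhancer should process.
    @PersistenceCapable
    public class Customer {

        @PrimaryKey
        @Persistent
        private long id;

        @Persistent
        private String name;

        // Enhanced classes implement javax.jdo.spi.PersistenceCapable after
        // bytecode rewriting; calling this from a test shows whether the
        // classes the IDE compiled were actually enhanced.
        public static boolean isEnhanced() {
            return javax.jdo.spi.PersistenceCapable.class.isAssignableFrom(Customer.class);
        }
    }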

Make sure your Plugin runs with Eclipse 3.4 when compiling with 3.5

I am developing an Eclipse Plugin and want most of the features to be compatible with Eclipse 3.4.
Until now that was no problem, because we could just use Eclipse 3.4 in the build process, so compile errors would be found easily.
Now we have a new feature that requires Eclipse 3.5, and we cannot use 3.4 for the build any longer but have to use at least 3.5. The problem is that we don't know whether the old features are still compatible with Eclipse 3.4 (at least not through the automated build).
Is there any smart solution to this problem? Making sure some of the plugin features are compatible with Eclipse 3.4 and some with 3.5? Preferably a solution that can be automated and added to the build process.
Build your 3.5 plugins with a 3.4 target. Then you'll see which problems occur :).
After you have identified the bundles that use new features only available in 3.5, set the version of the dependency in your MANIFEST.MF to the version used in 3.5, so that resolving the dependencies of those bundles against a 3.4 target will fail.
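For example (the bundle names and version ranges are placeholders for whatever API you actually use), a Require-Bundle entry along these lines in the MANIFEST.MF makes resolution against a 3.4 target fail immediately:

    Require-Bundle: org.eclipse.ui;bundle-version="[3.5.0,4.0.0)",
     org.eclipse.core.runtime;bundle-version="[3.5.0,4.0.0)"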
In general I would recommend that you set up the target you're building against in your IDE, so you get notified about possible problems while you're writing the code, not only when building your plugins.
To make a bundle runnable in 3.4 and in 3.5 with the new features there is no easy solution; probably the easiest way is to split the bundle and isolate the 3.5 features in a new bundle, so that your plugins can also run in a 3.4 environment.
In addition to tom's answer, I would recommend that you run your test suite during a headless build against a 3.5 Eclipse as well as a 3.4 Eclipse.
The way we do this in our own shop is with 4 automated build jobs:
Build the product against a 3.5 target eclipse
Run the tests on the 3.5 target
Build the product against a 3.4 target eclipse
Run the tests on the 3.4 target
If the 3.5 target fails, then we don't build against the 3.4 target. (Of course, in our case, we are doing 3.6 and 3.5, and starting to introduce 3.7.)

Can I package my Eclipse extension so the right version is automatically installed?

Is there a way to build an Eclipse Update Site so that Eclipse 3.3 will install one version of my plug-in while Eclipse 3.4 will install another version? The feature spec allows for "optional included features" but I can't see how to make them conditional on the version of the target.
Background: I've become responsible for an Eclipse extension that has half a dozen plug-ins which depend on a "support" plug-in, and the support plug-in needs to be one version for Eclipse 3.3 and a different version for Eclipse 3.4/3.5. I currently have two separate features, "extension for 3.3" and "extension for 3.4+" but I'd like to not bother my users with this detail.
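One approach worth trying (an untested sketch; the IDs and versions are placeholders) is to give each feature variant a platform requirement in its feature.xml, so the update mechanism can only resolve the variant that matches the installed Eclipse:

    <!-- feature.xml of the "extension for 3.4+" variant: it can only be
         installed where a 3.4-level platform is present. -->
    <feature id="com.example.extension" version="2.0.0">
       <requires>
          <import plugin="org.eclipse.core.runtime" version="3.4.0" match="compatible"/>
       </requires>
       <plugin id="com.example.support" version="2.0.0" unpack="false"/>
    </feature>

The 3.3 variant would carry a lower requirement (e.g. version="3.3.0" with match="equivalent"), so both variants can live on the same update site and the one whose requirement cannot be satisfied will fail to resolve; whether Eclipse 3.3's old Update Manager handles this as cleanly as p2 in 3.4+ is something to verify.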