We're hoping to implement plugins that interact through interfaces declared in a shared assembly. Technically, several plugins will register entities with a "main" plugin (during integration), which in turn makes use of these entities (during the Petrel session).
As the plugins will typically be distinct products on the Store, they need to be installed by distinct PIPs.
The plugins will be backwards compatible with regards to the shared assembly.
Is it possible to accomplish this with PIP installers - and how?
It is not possible with PIP installers right now, but we have this requirement on our list and will implement it in one of our future releases.
I want to test my OSGi app in Pax Exam, but I have trouble starting the Application from one of my plug-ins.
I use Equinox, and there is an Equinox-specific class that implements org.eclipse.equinox.app.IApplication. This class can then be selected in the Eclipse Application Launcher and is the first class to be run (in my case it controls the app lifecycle).
When I run the Pax Exam test, all bundles are resolved, but my IApplication is not started.
How can I run this kind of application in Pax Exam?
Additionally, how can I pass some app arguments? I see only frameworkProperty (-F) and systemProperty (-D), but I need regular app arguments.
As far as I can tell, IApplication is not part of Equinox but part of the Eclipse platform. So I think it is not directly supported in Pax Exam, which will only start the OSGi framework and load and start the bundles you specify.
So the way to make this work might be to load the Eclipse bundles that take care of starting applications. I am not sure, though, how this would work in detail.
Using Eclipse-specific features makes your application less portable, so maybe you can achieve the same thing with pure OSGi infrastructure?
Or is the application you want to test an Eclipse RCP application? In that case Pax Exam is probably not the best test facility; some UI test frameworks would be a better match (e.g. https://developers.google.com/java-dev-tools/wintester/html/)
The IApplication is part of Equinox, and it uses the registry to find out what is installed. So as well as including the appropriate JARs in Pax Exam, you'll also need to ensure that you start at least the declarative services and extension registry bundles, as otherwise the IApplication machinery won't be found.
Secondly, no bundle calls the EclipseStarter class, which is what handles the main arguments and passes them through to the runtime. So unless you're doing that yourself, you will find that the application won't run at all.
If you're starting Eclipse specifically, you may be able to specify these as Java system properties; the Eclipse runtime options are documented here:
http://help.eclipse.org/luna/index.jsp?topic=%2Forg.eclipse.platform.doc.isv%2Freference%2Fmisc%2Fruntime-options.html
for example, you could specify -Declipse.application=yourapp
You might also try eclipse.commands as a newline-separated list of arguments.
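A minimal Pax Exam configuration along those lines might look like the sketch below. The bundle coordinates and versions, as well as the application ID (com.example.myapp), are placeholders; you would substitute the declarative services, extension registry, and Equinox application-container bundles matching your target platform.

    import static org.ops4j.pax.exam.CoreOptions.*;

    import org.ops4j.pax.exam.Configuration;
    import org.ops4j.pax.exam.Option;

    public class MyApplicationTest {

        @Configuration
        public Option[] config() {
            return options(
                // Declarative services and the extension registry, so the
                // IApplication extension point can be resolved at all
                // (placeholder versions; use the ones from your target platform).
                mavenBundle("org.apache.felix", "org.apache.felix.scr", "1.8.2"),
                mavenBundle("org.eclipse.platform", "org.eclipse.equinox.registry", "3.8.0"),
                // The Equinox application container, which launches IApplications.
                mavenBundle("org.eclipse.platform", "org.eclipse.equinox.app", "1.3.500"),
                // Tell the runtime which application to start (hypothetical ID) ...
                systemProperty("eclipse.application").value("com.example.myapp"),
                // ... and pass program arguments via eclipse.commands.
                systemProperty("eclipse.commands").value("-myarg\nmyvalue"),
                junitBundles()
            );
        }
    }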
I need to maintain two versions of the same application in parallel: one for the App Store and one standalone (non App Store) version.
My initial plan was to use the master branch for developing all features that are common to both versions, then create one branch for the standalone-only features and another for the App Store-only features.
I tried this plan, but I have a problem with CocoaPods. I have multiple dependencies that are common, but the standalone application also needs the Sparkle framework. Now every time I try to merge master into one of my branches I get huge conflicts in the CocoaPods files that are very difficult to resolve.
I was thinking about removing the pods from my source control, but then I would need to run "pod install" every time I switch branches.
Is there a better way to do this?
Thanks.
I strongly recommend using targets.
See the Apple docs on targets.
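As a sketch of how that could look with two targets in one project (the target names and the shared pod are hypothetical), the Podfile might be:

    # Podfile sketch: one project, two targets sharing most pods.
    platform :osx, '10.10'

    # Pods common to both versions (placeholder pod name).
    def shared_pods
      pod 'AFNetworking'
    end

    # Hypothetical App Store target: common pods only.
    target 'MyApp' do
      shared_pods
    end

    # Hypothetical standalone target: common pods plus Sparkle for updates.
    target 'MyApp-Standalone' do
      shared_pods
      pod 'Sparkle'
    end

With this, both variants live on one branch and the difference is expressed per target rather than per branch, so there are no Pods conflicts left to merge.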
Using the Monticello package manager does not seem to guarantee that, once you have added the package(s) you are interested in, the overall image is still coherent. Are there any ways to verify that? Are dependencies verified? Are there guidelines in that direction?
I think you're looking for Metacello, a package and configuration manager for Monticello.
You can check out this guide: Managing projects with Metacello, and there's also a page on Google Code.
While Monticello can actually ensure that dependencies are met, it is limited to the form “this Monticello version depends on exactly these other Monticello versions”. Also, specifying these dependencies is a bit hidden in the Monticello browser and, above all, scarcely used in the community.
As Uko said, Metacello is exactly intended to solve the problem of dependency management in Smalltalk systems. It is not limited to Monticello, conceptually. To my knowledge, most GemStone, Pharo, and Squeak images come with Metacello pre-installed or easily installable.
Have a look at the blog of Metacello’s author, Dale Henrichs, where he gives some introduction to using Metacello.
There is also the Metacello Repository, where most configurations (think software recipes) can be found.
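For illustration, loading a project through such a configuration typically looks like this (the project name and repository URL are placeholders):

    "Hypothetical example: load the stable version of a project whose
     ConfigurationOf package is published in a Monticello repository."
    Metacello new
        configuration: 'MyProject';
        repository: 'http://www.squeaksource.com/MyProject/main';
        version: #stable;
        load.

Metacello then resolves and loads the required versions of all dependent Monticello packages declared in that configuration.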
Monticello's responsibility ends with loading individual packages. Coherence comes with either Metacello (see Uko's answer) or with SqueakMap.
SqueakMap stores install scripts that ensure that entire applications get loaded into your image.
Preface:
My company, like most, has several run-time environments and several release versions, which are themselves composed of different versions of various JARs.
For example, let us consider release versions 1.1, 1.2, and 1.3 of Software X, which may be deployed to a developer computer, testing, or production.
Software-x-1.1 is itself composed of jarA-0.9.1 and jarB-0.7.5, but Software-x-1.3 is composed of jarA-1.7.31 and jarB-0.8.1.
Currently we use Spring's PropertyPlaceholderConfigurer to configure run-time variables (such as database credentials); however, properties also change with release versions.
We also use Maven 2 POM version 4 to specify which versions of our code need to be used. We place the version numbers of our jars as properties within profiles (dev,test,prod) inside of the parent pom and then reference those version numbers in all project poms.
As of right now, we have no way to specify which project versions pertain to a given release other than the most current one. Moreover, we deploy our run-time configurations to the SSDM pickup, which then configures and creates the services defined by the built versions of our software.
--
Questions:
Is there any procedure/tool we can use to build our product by merely providing the run-time environment and version number, i.e. "build 1.1 dev"?
Is there any way we can store the required JAR versions for each release build? We currently version all files, including the parent pom, but merely versioning the parent pom does not record which release version that parent pom pertains to.
What else can we do to further automate the process of builds?
For example, if we could manage run-time configurations within the parent pom that would be a step in the right direction, but that seems like a violation of scope.
Any tool outside of our framework is inconceivable at this point, but not in the far future.
Summary:
How can we automate our build process to the fullest extent without being error prone?
Based on the part about release versions 1.1, 1.2, and 1.3 of Software X, using profiles to handle the differences between the test, production, etc. environments seems to be the right approach.
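For illustration, the parent pom could carry the environment-specific values in per-environment profiles like the sketch below (property names and values are placeholders), selected at build time with e.g. mvn -Pdev package:

    <!-- Parent pom sketch: one profile per run-time environment. -->
    <profiles>
      <profile>
        <id>dev</id>
        <properties>
          <db.url>jdbc:mydb://dev-host/xdb</db.url>    <!-- placeholder -->
          <db.user>dev_user</db.user>                  <!-- placeholder -->
        </properties>
      </profile>
      <profile>
        <id>prod</id>
        <properties>
          <db.url>jdbc:mydb://prod-host/xdb</db.url>   <!-- placeholder -->
          <db.user>prod_user</db.user>                 <!-- placeholder -->
        </properties>
      </profile>
    </profiles>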
The software itself is another story. I assume you are using a version control tool (VCT) to store the state of your development. So during the preparation of Software-x-1.1 you change your root pom to define the dependencies (jarA-0.9.1, jarB-0.7.5), tag it as release 1.1, and then continue towards release 1.2. During the development of release 1.3 you decide to change the dependencies (to jarA-1.7.31 and jarB-0.8.1), which results in a change to the poms (or to your root pom only). Maybe I am overlooking your real problem.
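A sketch of that workflow, using the versions from the question and a hypothetical tag name:

    <!-- Root pom as tagged for release 1.1: dependency versions pinned
         via properties that the module poms reference. -->
    <properties>
      <jarA.version>0.9.1</jarA.version>
      <jarB.version>0.7.5</jarB.version>
    </properties>

followed by, say, svn copy ^/trunk ^/tags/software-x-1.1 -m "Release 1.1". Checking out that tag later reproduces exactly the dependency set that shipped with 1.1.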
If I summarize your problem: you want to manage the release of versions across multiple environments, and your release distribution is an aggregate of executables (JARs) as well as environment properties. Different versions of these deployable distributions propagate to different environments at different stages, each with its own set of environment properties, and you are looking for a common roll-out (or release) process to handle all of this.
It seems the first problem you have is that you run one build per release per environment when you are propagating a release. If I am not wrong, you should first look at your application architecture to see whether there is a way to create environment-independent binaries. In some cases, projects keep the properties in a separate module that is deployed along with the JARs, together with a property manager of sorts that reads the files. So you may have a Maven module called properties that bundles one zip for each environment's set of property files. Your deployer script can then be given a parameter saying which zip file to extract to the location the application reads its properties from. What you gain this way is that you create one release distribution per release, which has the contents to run on all environments (see the sketch below).
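A minimal sketch of that idea; the module layout, zip names, and deployer invocation are all hypothetical:

    properties/                            # Maven module bundling per-env config
      src/main/config/dev/app.properties
      src/main/config/test/app.properties
      src/main/config/prod/app.properties
      pom.xml                              # assembles properties-dev.zip,
                                           # properties-test.zip, properties-prod.zip

and a deployer step along the lines of:

    # deploy.sh <env>, e.g. "deploy.sh prod": unpack the zip for that environment.
    unzip -o "dist/properties-$1.zip" -d /opt/software-x/conf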
Also, is it the case that your release version is not the version you have in the POM? If they are not aligned, they should be: the POM should be 1.3-SNAPSHOT while you are in the development phase of that release, and be bumped to 1.3 in a branch when you are releasing it.
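The maven-release-plugin automates exactly this bump; a sketch, assuming the scm section of your POM is configured:

    # From the 1.3-SNAPSHOT tree: tag the release, set the poms in the tag
    # to 1.3, and bump the working copy to 1.4-SNAPSHOT.
    mvn release:prepare -DreleaseVersion=1.3 -DdevelopmentVersion=1.4-SNAPSHOT
    # Check out the tag, then build and deploy the released artifacts.
    mvn release:perform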
There are no one-size-fits-all solutions for such things, but practices similar to this one do help to a good extent.
PS Do let me know if I got your problem right, or have ended up beating around the bushes ;-) DS.
We have a largish standalone (i.e. not Java EE) commercial Java project (10,000+ classes, four or five SVN repositories, ten or twenty third-party libraries) that's in the process of switching over to Maven. Unfortunately only one engineer (in a team of a dozen or so distributed across three countries) has any prior Maven experience, so we're kind of figuring it out as we go.
In the old Ant way of doing things, we'd:
check out source code from three or four repositories
compile it all into a single monolithic JAR
release that (as part of a ZIP file with library JARs, an installer, various config files, etc.)
check the JAR into SVN so we had a record of what the customers had actually got.
Now, we've got a Maven repository full of artifacts, and a build process that depends on Maven having access to that repository. So if we need to replicate what we actually shipped to a customer, we need to do a build against a Maven repository that has all the proper versions of everything. This is doable, I guess, if in (some version of) the (SVN-controlled) POM files we set all the dependencies to released versions?
But it gives our release engineer the creepy-crawlies, because there doesn't seem to be any way:
to make sure that somebody doesn't clobber the copy of foo-api-1.2.3.jar on the WebDAV server by mistake (the WebDAV server has access control, but that wouldn't stop a buggy build script)
to detect it if they did
to recover afterwards
His idea is, for release builds, to use a local file system as the repository rather than the WebDAV server, and put that local repository under SVN control.
Our one Maven-experienced engineer doesn't like that -- I guess because he doesn't like putting binaries under version control? -- and suggests that maybe the professional version of the Nexus server can solve the clobbering or clobber-tracking/recovery problem.
Personally, I'm not happy (sorry, Sonatype readers) with shelling out money for a non-free build system when we haven't even seen any benefit from the free version yet, and there's no guarantee it will actually solve the problem.
So our choices seem to be:
WebDAV server
Pros: only one server, also accessible by devs, ...?
Cons: easy clobbering, no clobber-tracking/recovery
Local file system
Pros: can be placed under revision control
Cons: only works with the distribution script
Frankly, both of these seem like hacks to me, and I have to wonder if there isn't a better way to do this.
So: Is there a right thing to do here?
I'm not sure I get everything, but I would:
Use the maven-release-plugin (which automates the release process, i.e. executes all the steps documented in release:prepare).
Use WebDAV with an anonymous read-only and authenticated write policy (so only the release engineer can actually deploy released artifacts to the corporate repo).
There is no need to put generated artifacts under version control (as long as you have the poms under version control). I don't see the benefit of using the local file system instead of WebDAV (it doesn't provide more security; you can secure WebDAV as well), and I don't see what the commercial version of Nexus would solve here.
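For illustration, the wiring for that setup might look like this sketch (the repository id and URL are placeholders; the matching write credentials would live only in the release engineer's settings.xml under the same server id):

    <!-- pom.xml: deploy released artifacts over WebDAV (placeholder URL). -->
    <distributionManagement>
      <repository>
        <id>corporate-releases</id>
        <url>dav:https://repo.example.com/releases</url>
      </repository>
    </distributionManagement>
    <build>
      <extensions>
        <!-- WebDAV wagon so the dav: URL works at deploy time. -->
        <extension>
          <groupId>org.apache.maven.wagon</groupId>
          <artifactId>wagon-webdav-jackrabbit</artifactId>
          <version>2.10</version>
        </extension>
      </extensions>
    </build>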
Nexus has a setting which prevents you from clobbering an already released artefact in a release repository.
For a team of about a dozen, the free version of Nexus should be enough.
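If you do go the Nexus route, the relevant knob is the hosted release repository's deployment policy, which can be set to disable redeployment; pointing distributionManagement at that repository (placeholder URL below) then makes a second deploy of the same released version fail instead of silently clobbering the artefact:

    <distributionManagement>
      <repository>
        <id>nexus-releases</id>
        <url>https://nexus.example.com/content/repositories/releases</url>
      </repository>
    </distributionManagement>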