Can the SAP Java Connector JCo3 lib and JCo2 lib co-exist in a Solaris/Apache/Tomcat server? I am wondering whether I can use JCo3 for a new application without touching the existing JCo2 applications.
Yes! The classes, packages, JAR files and native libraries are independent and have different names (JCo3 lives under com.sap.conn.jco, while JCo2 uses com.sap.mw.jco), so you can have both loaded at the same time. The API is different though, so your code will have to be different for each.
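For a rough side-by-side sketch, here is the same RFC call in each API. This is illustrative only: the pool/destination names and logon parameters are placeholders, JCo3 additionally needs a destination configuration (a .jcoDestination file or a DestinationDataProvider), and error handling is omitted.

```java
// Both APIs in one file just to show the different packages and styles.
import com.sap.mw.jco.IRepository;               // JCo 2.x
import com.sap.mw.jco.JCO;
import com.sap.conn.jco.JCoDestination;          // JCo 3.x
import com.sap.conn.jco.JCoDestinationManager;
import com.sap.conn.jco.JCoException;
import com.sap.conn.jco.JCoFunction;

public class JcoSideBySide {

    // JCo 2.x: pool-based, client-centric API.
    static void callWithJco2() {
        JCO.addClientPool("POOL", 5, "000", "user", "secret", "EN", "apphost", "00");
        IRepository repo = JCO.createRepository("MyRepo", "POOL");
        JCO.Function fn = repo.getFunctionTemplate("STFC_CONNECTION").getFunction();
        JCO.Client client = JCO.getClient("POOL");
        try {
            client.execute(fn);
        } finally {
            JCO.releaseClient(client);
        }
    }

    // JCo 3.x: destination-based API; "MY_DEST" is a placeholder name
    // that must be configured separately.
    static void callWithJco3() throws JCoException {
        JCoDestination dest = JCoDestinationManager.getDestination("MY_DEST");
        JCoFunction fn = dest.getRepository().getFunction("STFC_CONNECTION");
        fn.execute(dest);
    }
}
```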
I'm looking for a "repository" to store derived information (build artifacts).
We have a repository (currently Mercurial) to store our source code. When something is pushed to the source repository, the code goes through a continuous integration server, which does an incremental build; as a result, some DLLs will be changed. These should be added to some "repository" so that everybody can use that version without needing to do the build again.
I'm looking for the following features:
It should be easy to update the source code and get the corresponding binaries (we could probably make a script for that)
You should easily get all binaries at once (not only those that changed during the last incremental build).
Binaries that weren't changed should only be stored once in the repository.
When updating the source code and the binaries only the changed binaries should be transferred (and not all binaries). This is similar to what happens for source code.
When updating to some version, only that version should be stored locally, not the complete history.
We should be able to remove certain versions from the binary "repository" after a while. However, if the DLLs are still necessary for subsequent incremental builds, these DLLs should of course not be completely removed from the "repository".
What would fit these requirements?
I agree with Manfred: what you are looking for is a binary repository manager. Besides the Nexus repository manager, you should also consider Artifactory.
As for the feature list you asked about:
As you have mentioned, the CI server should be responsible for identifying a change in version control and starting a build process which creates the binaries. The CI server/build tool should also deploy the generated binaries to the repository manager, provided the build was successful. Artifactory offers a build integration feature which takes care of deploying the binaries together with the build metadata.
Using the build integration feature of Artifactory, you can get a list of all the binaries generated by a specific build and download them as an archive. Artifactory provides a REST API for those actions.
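As a sketch of what downloading such an archive could look like from Java, the snippet below posts to what I believe is Artifactory's build artifacts archive endpoint. Treat the URL, payload fields, and the endpoint itself as assumptions to verify against the REST API documentation for your Artifactory version; authentication is also omitted.

```java
// Hypothetical sketch: verify the endpoint and payload against your
// Artifactory version's REST API docs before relying on them.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Path;

public class FetchBuildArtifacts {
    public static void main(String[] args) throws Exception {
        String json = "{\"buildName\":\"my-app\",\"buildNumber\":\"42\",\"archiveType\":\"zip\"}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://repo.example.com/artifactory/api/archive/buildArtifacts"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(json))
                .build();

        // Stream the returned archive straight to disk.
        HttpClient.newHttpClient().send(
                request, HttpResponse.BodyHandlers.ofFile(Path.of("build-42.zip")));
    }
}
```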
There are different approaches to storing artifacts in a repository manager. Some tools store multiple copies of the same binary. Others, for example Artifactory, use checksum-based storage which keeps only one copy per binary (based on its checksum). This pays off if you keep multiple copies of the same binary in different repositories, especially if you are dealing with large binaries (WAR files, Docker images, ISOs etc.). Another benefit is cheap copies/moves between repositories, which is a common practice for promotion workflows.
The Artifactory build integration uses checksum-based deployment, which deploys only binaries that do not already exist in Artifactory. For binaries which do exist and have not changed, it only creates a new reference to the existing binary, saving the need to send the actual bytes.
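To make the checksum-based idea concrete, here is a minimal sketch of such a store. It illustrates the technique only and is not how Artifactory is implemented internally: bytes are stored once under their SHA-256, and every repository path is just a reference to a blob.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;
import java.util.HexFormat;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ChecksumStore {
    private final Path blobDir;
    // Logical repository path -> checksum of the blob it points to.
    private final Map<String, String> references = new ConcurrentHashMap<>();

    public ChecksumStore(Path blobDir) throws IOException {
        this.blobDir = Files.createDirectories(blobDir);
    }

    /** Stores the bytes only if the checksum is new; always records the reference. */
    public String put(String repoPath, byte[] content) throws Exception {
        String sha = HexFormat.of().formatHex(
                MessageDigest.getInstance("SHA-256").digest(content));
        Path blob = blobDir.resolve(sha);
        if (!Files.exists(blob)) {       // dedup: identical binaries share one blob
            Files.write(blob, content);
        }
        references.put(repoPath, sha);   // a copy/move is just another map entry
        return sha;
    }
}
```

Under this scheme, promoting a binary from a staging repository to a release repository costs one new reference, no matter how large the binary is.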
Artifactory provides multiple options for cleaning up binaries, including built-in cleanup policies and the option to develop your own custom logic using user plugins and the Artifactory Query Language (AQL).
In addition, I highly recommend taking a look at the binary repository comparison matrix.
Disclaimer: I work for JFrog, the company behind Artifactory.
You are basically asking for a repository manager like the Nexus Repository Manager, as you have correctly identified with the tags.
In terms of the specific requirements from your question, here are a couple of ideas.
Binary components are typically identified via coordinates that most of the time include some sort of name and version. A release and build process changes those and deploys the binaries to the repository. This allows you to match source code with binaries. You can also embed information like git refs in the produced binaries (see the sketch after this list).
Accessing the binaries is typically done via HTTP, so it's easy. You then just have to determine what it means to get "all binaries".
Not duplicating binaries that are essentially the same can be supported by the underlying file system or the build tool. I have seen both approaches work. Often, however, it is not worth the effort, since storage is cheap.
There are various ways to automatically clean up repositories, including scheduled tasks that do it regularly. Worst case, you have to implement your own logic in an extension.
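As a sketch of the first point, a build can stamp version information into a JAR's MANIFEST.MF, and anyone can read it back later to match a binary to its source (the same idea applies to DLL version resources). Implementation-Version is a standard manifest attribute; Git-Ref is a hypothetical attribute your build script would have to write itself.

```java
import java.util.jar.JarFile;
import java.util.jar.Manifest;

public class WhichBuildIsThis {
    public static void main(String[] args) throws Exception {
        try (JarFile jar = new JarFile(args[0])) {
            Manifest mf = jar.getManifest();  // assumes the JAR has a manifest
            System.out.println("Version: "
                    + mf.getMainAttributes().getValue("Implementation-Version"));
            System.out.println("Git ref: "
                    + mf.getMainAttributes().getValue("Git-Ref"));
        }
    }
}
```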
Disclaimer: I work as community advocate and trainer for the Nexus Repository Manager with Sonatype.
I have my application split into 4 main parts:
main application (acting as glue for the other parts - loads plugins, has the core and ui libraries linked in)
core (shared library with classes etc.; it will eventually be something like an SDK - basically it contains everything except things related to the UI)
ui (shared library that contains UI resources, types, etc.)
other plugins (shared libraries, loaded by the main application, which will use the plugin manager from core)
The main reason for this is that I want the ability to replace any part of the application just by downloading plugins for it (through a plugin manager window in the application).
Let's say I want to redesign the look of my app. In that case I should just have to release a new version of the ui shared library/plugin.
I am not sure this will work, though, since the ui shared library is linked to my app by the linker when the application is compiled (core and ui are linked by the linker; the other shared libraries/plugins are loaded by the plugin manager when the app starts).
Question: will some metadata about those libraries (for instance, their size) be stored in the final executable? Does that mean I can't just replace the ui shared library without compiling and linking my app again?
Generally speaking, you can replace a shared library with another version of it in a distribution (without recompiling or relinking the executable) as long as the original library and the replacement have the same ABI. The executable does not record things like the library's size; at link time it stores the library's name (its soname, on ELF-based systems) and the symbols it imports, and the dynamic linker resolves those at load time. So a drop-in replacement of the ui library works as long as the symbols it exports keep the same binary interface (function signatures, class layouts, etc.).
I'm developing an iOS B2B app and I have several questions regarding app modularization.
Firstly, I need to understand the main difference between bundles and frameworks: when to use bundles and when to use frameworks.
Another question: is it possible for a bundle to contain a .framework inside it, and vice versa?
Is it possible to create plugins for an iOS app and load them dynamically? If yes, what should they be: bundles, frameworks, or libraries?
Is it possible for a library to contain resource files?
Is it possible to create a resource bundle and a dynamic library and then load them dynamically at runtime?
Is it possible to create plugins for an iOS app and load them dynamically? If yes, what should they be: bundles, frameworks, or libraries?
No.
Is it possible for a library to contain resource files?
No.
Is it possible to create a resource bundle and a dynamic library and then load them dynamically at runtime?
No.
A Bundle is a type of Directory, a folder. A Framework is a bundle. So is an Application and so is a Plugin.
A Static Library is a single-file code archive you can compile into your app at build time.
A Dynamic Library is a single-file code archive you can load at Runtime.
A Framework is a Dynamic Library in a Bundle with other things.
A Plugin is a Dynamic Library in a Bundle with other things.
The Xcode build option 'Bundle' means 'Place the compiled Dynamic Library in a Bundle' - this is what you do when you want to create a Plugin.
Static libraries are the only option for modularising your code on iOS.
On the desktop...
Typically a Framework is for sharing code and resources between multiple apps. You want your app to behave as though the code was actually compiled into it. You want loading to happen transparently and you don't want to do anything special to use the methods, functions, etc. contained in it.
A Plugin (a Bundle containing compiled code and resources) is for optional, dynamically loaded code, e.g. a software extension that you can choose to load or not. You want to carefully architect your app so that it isn't dependent on the Plugin but acquires new behaviour if you manually locate and load it at Runtime.
A Framework and a Plugin are very similar, but a Framework has a strict file layout to facilitate locating and loading code and resources. With a plugin, these jobs are your responsibility so you can structure the Bundle contents however you want.
Because loading code is so easy in Cocoa on OS X (but not on iOS), Frameworks can contain Plugins which contain Frameworks which contain more Frameworks, etc.
On iOS some people put Static Libraries in Bundles with resources and call them Frameworks. This has none of the benefits and all of the drawbacks of a real framework.
Where can I get an XD version of the Dojo source like the one hosted on Google? What I want to do is host the Dojo source on my local CDN, and my custom Dojo modules in my web application. Is this a good practice? Or should I just include the Dojo source in my web app and run a custom build?
Thanks,
You can build an XD version of Dojo from the source code.
Here are instructions on how to do it:
http://dojotoolkit.org/reference-guide/1.7/quickstart/custom-builds.html
See the section on "doing xdomain builds".
In our organization (a large one), we have a CDN version of Dojo deployed on an internal CDN, mainly because some of our webapps are not allowed to access the extranet (firewall issues).
For performance, though, a custom build gives the biggest boost, since it is customized to the modules you need/use. Once the custom build is done, you only need to ship a single compressed JS output file and a small number of supporting files.
When doing your custom build, you can use xdDojoPath and loader=xdomain if you wish to use cross-domain Dojo to load your optimized JS; see http://osdir.com/ml/cometd-users/2011-08/msg00050.html for some notes on this.
Also see related SO question: Dojo on a CDN vs own install
The good news is that with Dojo 1.7+ and the new loader, you don't have to do anything special for a cross-domain build (good answer above from @Vijay Agrawal, but I think that reference guide link may need some updating for 1.7). Just write your code in the new AMD format, use async: true, run the build tools to create layers, and deploy them on any server. AMD makes use of callbacks and many of the tricks the old Dojo XD builder used to employ, but in a much simpler way.
To support older code, there is a legacy cross domain mode mentioned in the loader docs.
I'm trying to figure out the best way to use NuGet in a development environment to manage our own libraries.
We want to standardize on the NuGet way of doing things for our third-party libs, but we would also like to use NuGet to manage our internal utility libraries. For developers consuming the in-house libs this is great, and everyone's happy. However, for devs actively working on a utility lib it seems to be more problematic: their previous process of build lib, build main app, F5 and go is now slowed down by publishing and updating, and potentially lots of packages - not to mention the moaning about the additional process!
We use TDD on the internal libs, but everyone needs to be able to debug and modify the libs along with the main app. I have seen Phil Haack's demo on debug packages in 1.3 and read David Ebbo's blog, but that fits a different scenario.
So what is the best process for dev/debug cycles? If we use NuGet, do we need to accept the existing constraints? Is there a hybrid practice people are using, and does 1.3 maybe get closer to automating all this? Or do we just avoid NuGet for internal packages, which would be a real shame?
Loving NuGet; maybe wanting way too much from the little guy. Feedback appreciated.
Thanks
I'd suggest you use separate network shares or feeds (similar to what myget.org supports in the cloud) for different scenarios.
You could imagine creating a CI share, a QA share, a Releases share, ...
Make people working on the referenced library do CI builds that drop CI packages into the CI repository, for instance, and have those picked up by other projects (which just need to do a simple update; this could be automated through PowerShell in a pre-build step: check for a new version, and if there is one, update).
Just make sure that when products release their milestones, they also release with released dependencies (this could be as simple as switching feeds; releases will always have a higher version number than CI builds).
Hope that helps!
Cheers,
Xavier
If you're working on the source code for the lib and the main app at the same time, I'd say NuGet is probably not a good solution. I think it'll only work in situations where you work with a "stable" version of the library that doesn't need to change frequently during the development of your main app.
That said, is it possible for the development on your library to be done in isolation? You already mention you're doing TDD on the lib, so why can't that work be done first, then built and deployed, and the main app work done against the published package?