I've been evaluating several open-source message queue technologies, such as RabbitMQ, ActiveMQ, OpenAMQ, etc. My question is, what benefits are gained by using a commercial technology such as Tibco EMS, WebSphere MQ, Sonic, etc. instead of something like Active or Rabbit? PHP will be the primary language involved, although Java systems will be interacting as well.
I'd say the benefits are few and far between. You really need to be sure that a commercial system is for you before you invest as there is likely to be no going back.
Some of these things are so esoteric, so prone to vendor lock-in, so damn heavyweight that you'll feel like you have a gorilla on your back, not just a monkey ;)
Those commercial technologies are good, but the investment in them can be steep. Both yearly license costs and ongoing support costs must be considered when making a decision. As far as vendor lock-in goes, in the commercial world there's only one vendor offering support for a given product. In the open source world, there's typically more than one vendor offering support. Consider ActiveMQ, for example: both Progress Software and SpringSource offer support agreements for ActiveMQ, as do a few others.
Also, in the commercial world, you won't ever get to look at the source code yourself. For a product like ActiveMQ, anyone can grab the source code. This is pretty powerful because it means that you can add features, etc., and quite possibly get them added to the product.
ActiveMQ has a great community and is very widely deployed. ActiveMQ provides client APIs for many languages including C/C++, Java, .NET, Perl, PHP, Python, Ruby and more.
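To give a feel for the Java side, here is a minimal sketch of a JMS producer talking to a local ActiveMQ broker; the broker URL, queue name and payload are placeholders of my own, not anything from the question. A PHP client could read from the same queue over ActiveMQ's STOMP support.

    import javax.jms.Connection;
    import javax.jms.DeliveryMode;
    import javax.jms.MessageProducer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import org.apache.activemq.ActiveMQConnectionFactory;

    public class OrderProducer {
        public static void main(String[] args) throws Exception {
            // Placeholder broker URL and queue name for this sketch.
            ActiveMQConnectionFactory factory =
                    new ActiveMQConnectionFactory("tcp://localhost:61616");
            Connection connection = factory.createConnection();
            connection.start();
            try {
                Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
                Queue queue = session.createQueue("orders");
                MessageProducer producer = session.createProducer(queue);
                // PERSISTENT delivery asks the broker to write the message to its
                // store so it survives a broker restart.
                producer.setDeliveryMode(DeliveryMode.PERSISTENT);
                TextMessage message = session.createTextMessage("{\"orderId\": 42}");
                producer.send(message);
            } finally {
                connection.close();
            }
        }
    }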
There are great communities around projects like RabbitMQ (check out the mailing list for example). Also, if cost is an issue, obviously open source is a win there.
The biggest difference I have found is operational support and management. The commercial vendors usually provide better tools for ops/support staff to resubmit or edit messages, etc.
This is often a weakness of open source offerings, which, if rectified, should cause some serious loss of sleep for commercial vendors.
I think it's always best to thoroughly examine your requirements before choosing a messaging system:
Not all the commercial vendors will support PHP, for example; ActiveMQ and RabbitMQ do.
Not all the messaging systems can support very large queue sizes - though ActiveMQ does.
Not all the messaging systems survive a hard broker stop without losing messages - ActiveMQ will, without you having to use transactions.
And if you are going to use open source - always look at the community. ActiveMQ has the most active community of any open source messaging project - and it's also Apache, which means diversity and no reliance on any single developer or vendor for delivery.
Commercial products come with everything included, ready to use. Open source products cover the basic features, and you can still build the commercial-product features yourself, but that involves a lot of development.
I'm developing an open source OTA update system for a few MCUs of a certain project. I wonder if there is some "standard" protocol for CAN-bus based bootloaders. Everything I have seen online and in application notes from the chip manufacturers seems to use its own brand of communication and thus its own specialized upload software too (mainly for demonstration in the ANs).
My question is, am I missing something? Is there some standard way of doing this I'd rather adhere to, or should I just roll my own like they do and call it a day?
Features I'm interested in for the protocol side besides the obvious ones: checksumming, digital signatures, authenticated encryption.
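Not an answer to the standards question, but to make that feature list concrete, here is a rough sketch of the receive-side checks such a protocol implies. It is written in Java purely for illustration (a real bootloader would do this in C on the MCU), and the chunking, key handling and algorithm choices are my own assumptions, not part of any CAN standard.

    import java.security.PublicKey;
    import java.security.Signature;
    import java.util.zip.CRC32;

    public class FirmwareVerifier {

        // Per-chunk integrity check: recompute the CRC the sender attached
        // to each transfer block and compare.
        static boolean crcMatches(byte[] chunk, long expectedCrc) {
            CRC32 crc = new CRC32();
            crc.update(chunk);
            return crc.getValue() == expectedCrc;
        }

        // Whole-image authenticity check against a signature shipped with the
        // update, verified with a vendor public key baked into the bootloader.
        static boolean signatureValid(byte[] image, byte[] sig, PublicKey vendorKey)
                throws Exception {
            Signature verifier = Signature.getInstance("SHA256withECDSA");
            verifier.initVerify(vendorKey);
            verifier.update(image);
            return verifier.verify(sig);
        }
    }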
Based on your tag (though I do not see this stated in your question), I assume for now that you want to develop a bootloader for automotive ECUs that have a CAN connection.
The relevant protocols providing these services are ISO 14229-3 and SAE J1939/73, with the first one much more common in my experience.
For development purposes, ASAM MCD-1 XCP also has support for this.
However, these are just the communication services and do not include the usual usage patterns, which differ a lot across the OEMs.
For security, the German OEMs put together a document called "HIS Security Module Specification", which unfortunately I can no longer find on the web.
They also have a blueprint for the design of a bootloader.
However, this is somewhat outdated anyway, as bootloaders today are often at least partially based on AUTOSAR, like the applications.
Lastly, from them you could also get a document partially specifying how the services above are used for flashing an ECU.
If you need further input, feel free to ask.
However, you will need access to the non-free industry standards and recommendations yourself.
I could not find an answer to this question, so I would like to initiate it:
Tibco EMS vs. MSMQ vs. MQ.
How do these 3 technologies compare?
Which one is better and in which kinds of scenarios?
Specifically, I am thinking of using one of these in an SOA environment (.NET + WCF), where the scenario will mature over time.
I have one additional specific interest that is important to mention: performance. If given a choice, performance is a critical priority.
I would appreciate a comparison table for a clear picture.
Thanks!
EDIT:
I am concentrating on two parameters: performance and scalability.
Scalability - how do these technologies compare in terms of the number of concurrent users supported? Which can support more users? The scenario does not matter; let's choose a scenario supported by all of them, e.g. simple queues.
Performance - in exactly the same scenarios, which performs faster?
If you want to use WCF, then none of them really matters. You will get the most out of them only when you use their direct APIs.
MSMQ - a Microsoft technology installed with every Windows installation. It is only a transport technology, with support for queues.
Tibco EMS - a Tibco technology supporting both queues and topics (publish/subscribe). It is expensive and more suitable for enterprise scenarios. You will most probably need other Tibco tools and technologies as well to implement a full SOA solution (the Tibco ActiveMatrix product suite). .NET and WCF will only be apps connected to this infrastructure, which is designed more for the Java world. It runs on non-Windows platforms as well, and together with Tibco BusinessWorks it offers connectors (adapters) to many LOB applications. I like the APIs for Tibco products, but I really don't like the UIs of their tools.
IBM MQ - an IBM technology supporting queues, and it also somehow emulates topics (publish/subscribe). Again, it is an expensive commercial solution more suitable for enterprise scenarios where mainframes are involved - that is MQ's biggest advantage: it runs "everywhere". But that is the end of the advantages. The APIs for both Java and .NET are terrible. The .NET API is full of bugs and doesn't work as expected. IBM doesn't understand .NET library versioning, which leads to terrible problems when moving your client application to machines with different MQ clients installed, etc.
Edit:
There were several questions/comments about what problems MQ has. As a few examples, you can check my MQ questions. Not every question is actually an issue, but you will find a few of them pointing directly to bugs. Those issues may already be fixed in newer MQ client versions, but that doesn't mean there are no others. Generally, I found the MQ .NET API the most frustrating library I have ever used - it even beat the hated SharePoint.
On the other hand, if you just need to send and receive some messages and don't plan to do anything special or use low-level features, you should be OK. After all, the API has been in use for a while and common use cases should work - unless you are unlucky enough to hit regression bugs.
For a simple integration scenario - i.e., two applications interacting in a point-to-point manner - there will be no real difference. You would do better to check each technology's support within your applications. In that type of scenario you shouldn't be worried about performance, as messaging time shouldn't be the main issue. On the other hand, the real selection should be based on the target model for integrating your whole enterprise. For example:
- Are you performing any mediation functions, e.g. data transformation, protocol mapping, etc.?
- Will you integrate systems in a point-to-point manner, or might you consider having a hub/ESB?
- Will you cover security aspects in your integration scenario (authorization, authentication, auditing, encryption, certificate exchange, etc.)?
Finally, having such a vision will give you a better understanding of the real constraints on your design. Personally, I would go for WCF only if I'm not expecting complex integration scenarios and I'm not willing to spend money on the solution. I would go for IBM if I'm building a foundation for SOA, and for Tibco if I'm planning a Java-based integration with a defined scope.
Again, it is an expensive commercial solution more suitable for enterprise scenarios where mainframes are involved
Not sure why you mentioned mainframes. Many MQ enterprise customers don't have them.
IBM MQ - an IBM technology supporting queues, and it also somehow emulates topics (publish/subscribe)
MQ v7.0.0 (released in 2008) and onwards supports pub/sub topics as a native feature; there is no emulation involved.
The APIs for both Java and .NET are terrible.
The MQ Classes for Java and JMS have evolved over 10+ years and are used heavily by thousands of enterprises.
The .NET API is full of bugs and doesn't work as expected.
The .NET API has been around for 7+ years over a few major releases of MQ. I would imagine that the obvious bugs would have been shaken out by now.
I am concentrating on two parameters: performance and scalability.
MQ has unlimited scalability. Performance is very good even with no tuning.
MQ is best only if you need to integrate with lots of mainframes. Pub/Sub is implemented poorly and the many APIs are 'strange to use'.
If all your applications are Windows, MSMQ might be a good choice, but it will be difficult to bridge into Unix or Java worlds.
The whole Java community standardized on JMS so TIBCO EMS is a good choice if you ever want to connect non-Windows applications.
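Since EMS is a JMS provider, Java code can stay provider-neutral by looking up the connection factory through JNDI instead of instantiating a vendor class; only the JNDI properties are TIBCO-specific. Here is a rough sketch; the server URL, factory name, credentials and topic name are typical defaults or placeholders, so check them against your EMS installation.

    import java.util.Properties;
    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.MessageConsumer;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import javax.jms.Topic;
    import javax.naming.Context;
    import javax.naming.InitialContext;

    public class PriceListener {
        public static void main(String[] args) throws Exception {
            // Only these JNDI properties are provider-specific; the values below
            // are placeholders for an EMS server on its default port.
            Properties env = new Properties();
            env.put(Context.INITIAL_CONTEXT_FACTORY,
                    "com.tibco.tibjms.naming.TibjmsInitialContextFactory");
            env.put(Context.PROVIDER_URL, "tibjmsnaming://emshost:7222");
            Context ctx = new InitialContext(env);

            ConnectionFactory factory = (ConnectionFactory) ctx.lookup("TopicConnectionFactory");
            Connection connection = factory.createConnection("user", "password");
            connection.start();

            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            Topic topic = (Topic) session.createTopic("prices");
            MessageConsumer consumer = session.createConsumer(topic);

            // Block for up to five seconds waiting for one published message.
            TextMessage message = (TextMessage) consumer.receive(5000);
            System.out.println(message != null ? message.getText() : "no message received");
            connection.close();
        }
    }

In principle, swapping the JNDI properties is all it takes to point the same code at another JMS provider.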
My company runs a couple of B2B apps (written in Rails) dealing with parts and inventory, and we've been trying to figure out the best way to integrate with some of our bigger users. We already offer the REST-style API that comes with Rails, but that, of course, requires an IT department on their end to decide to integrate with it, so we'd like to lower that barrier if possible.
From what we've found, most of them are on SAP systems. Now, pretty much all I know about SAP is that it's 1) expensive, 2) huge, and 3) does everything and anything you could ever need for your gigantic business to run. Naturally, this is all a bit imposing, and the resources on the site are a cross between impenetrable buzzword-laden sales material and impenetrable jargon-laden advanced technical material, with little for the new but technically competent user to sink their teeth into.
So what I'm wondering is: as a 3rd party, that's not running a SAP installation, is there a way for us to offer access to our site's data through a web service or other API? Is it just a matter of providing or implementing a certain WSDL (and what would that be)? Is this feasible for someone without in-depth experience with SAP? Or is this a complete non-starter?
I'd say it's not possible without someone who knows the SAP system. You probably won't need to hire someone with in-depth SAP knowledge, but at least for the initial implementation, you'll need both the knowledge and a working system you can develop against. Technically speaking, it's not really that hard, but considering that SAP systems are designed to handle multiple organizations, countries, legal systems, localizations and several thousand users simultaneously, things are bound to be a bit more complex than almost any other software around - and most of the time it's not even bloated; it's just easy to get lost in that kind of flexibility.
My recommendation would be to find a customer (or a prospective customer) who has someone in their IT department with the necessary technical and process knowledge and who is interested in conducting a development project. This way, you'd get access to a real system (a test system, of course) and someone who can explain the basics of the system to you. But, as I said, be prepared for complexity.
vwegert makes some excellent points.
As to this part of your question:
So what I'm wondering is: as a 3rd party, that's not running a SAP installation, is there a way for us to offer access to our site's data through a web service or other API? Is it just a matter of providing or implementing a certain WSDL (and what would that be)?
Technically it is possible to expose any of your system's services as web services to a client's SAP system. To do this you do not need any prior knowledge of SAP. (SAP should be able to import a WSDL, although there may be some limitations in earlier, pre-ECC5 systems.)
For example, a service that provides meter readings, airport departure schedules, industry trends, etc. is not dependent on what is in the user's system or how they set it up. However, as soon as there is a need to initiate updates to the client system's data, you need access to more specialised SAP knowledge.
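To make the "providing a WSDL" part concrete: most SOAP stacks will generate the WSDL for you from a service definition. The sketch below uses Java's JAX-WS annotations purely to illustrate the shape of such a service (the class, method and URL are made up); in your Rails app the equivalent would be done with a Ruby SOAP library.

    import javax.jws.WebMethod;
    import javax.jws.WebService;
    import javax.xml.ws.Endpoint;

    // Publishing this endpoint makes a generated WSDL available at
    // http://localhost:8080/parts?wsdl, which the customer's SAP system can import.
    @WebService
    public class PartsInventoryService {

        @WebMethod
        public int stockLevel(String partNumber) {
            // Placeholder; a real implementation would query the inventory database.
            return 0;
        }

        public static void main(String[] args) {
            Endpoint.publish("http://localhost:8080/parts", new PartsInventoryService());
        }
    }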
Also note that many SAP functions can also be exposed as web services, but generally you do need someone with SAP (ABAP) knowledge to do this.
The ABAP language is actually fairly simple, but there is a huge learning curve to understand the data model and the myriad of configurable options in SAP.
As it is included in the license (according to SAP), we would prefer using Solution Manager over other tools for the entire software development life cycle. Or is it highly recommended to use specific tools for particular processes like test management? Any opinions?
In general, before answering this question, please be aware that SAP will bring out a new support model, and the features and functions available in your SolMan installation will differ according to the support you requested from SAP. If you stick to Enterprise Support you will get (nearly) every feature; with Standard Support you will get less, and a lot of features will not be included. At the moment, SolMan 7.10 is in the ramp-up phase and 7.20 will be released in 2011. Because SAP is changing the kernel of the Solution Manager stack, which is apparently CRM, from 5.0 to 7.0, you should keep in mind that any functionality you implement in your current SolMan will lead to high migration efforts.
Apart from this, if you look at the Enterprise version, my experience is that not all features are particularly good and suitable. It also depends on the organization you are working in. The SAP tools focus only on SAP, so if you are working in an environment where non-SAP Java plays an important part, I would look for different tools. If you look into change management (ChaRM), it is suitable for small landscapes, and for big ones only with some effort. Here you should also consider at least having a look at different technologies and tools. From my point of view, some things, like monitoring and job scheduling, are quite good, but for the more general application lifecycle management tooling you should at least take other options into account.
There are a couple of questions on Stack Overflow asking whether x (Ruby / Drupal) technology is 'enterprise ready'.
I would like to ask how is 'enterprise ready' defined.
Has anyone created their own checklist?
Does anyone have a benchmark that they test against?
"Enterprise Ready" for the most part means can we run it reliably and effectively within a large organisation.
There are several factors involved:
Is it reliable?
Can our current staff support it, or do we need specialists?
Can it fit in with our established security model?
Can deployments be done with our automated tools?
How easy is it to administer? Can the business users do it or do we need a specialist?
If it uses a database, is it our standard DB, or do we need to train up more specialists?
Depending on how important the system is to the business the following question might also apply:
Can it be made highly available?
Can it be load balanced?
Is it secure enough?
Open source projects often do not pay enough attention to the difficulties of deploying and running software within a large organisation. For example, most open source projects default to MySQL as the database, which is a good and sensible choice for most small projects; however, if your enterprise has an Oracle site license and a team of highly skilled Oracle DBAs in place, the MySQL option looks distinctly unattractive.
To be short:
"Enterprise ready" means: If it crashes, the enterprises using it will possibly sue you.
Most of the time the "test", if it can really be called that, is that some enterprise (i.e. a large business) has deployed a successful and stable product using it. So it's more like saying it has proven its worth on the battlefield, or something like that. In other words, the framework either has or hasn't been used successfully in the real world; you can't just follow some checklist and load tests and say it's enterprise ready.
Like Robert Gould says in his answer, it's "Enterprise-ready" when it's been proven by some other huge project. I'd put it this way: if somebody out there has made millions of dollars with it and gotten written up by venture capitalist magazines as the year's (some year, not necessarily this one) hottest new thing, then it's Enterprise-ready. :)
Another way to look at the question is that a tech is Enterprise-ready when a non-tech boss or business owner won't worry about whether or not they've chosen a good platform to run their business on. In this sense Enterprise-ready is a measure of brand recognition rather than technological maturity.
Having built a couple "Enterprise" applications...
Enterprise outside of development means that if it breaks, someone can fix it. I've worked with employers/contractors that stick with quite possibly the worst managed hosting providers, data vendors, and the like, because those vendors will fix problems when they crop up (even if they crop up a lot) and there is someone to call when things break.
So to restate it another way, Enterprise software is Enterprisey because it has support options available. A simple example: jQuery isn't enterprisey while ExtJS is, because ExtJS has a corporate support structure behind it. (Yes, I know comparing these two frameworks is like comparing a toolset to a factory-manufactured home kit.)
As my day job is all about enterprise architecture, I believe that the word enterprise nowadays isn't about size or scale but refers more to how a software product is sold.
For example, Ruby on Rails isn't enterprise because there is no vendor that will come into your shop and do Powerpoint presentations repeatedly for the developer community. Ruby on Rails doesn't have a sales executive that takes me out to the golf course or my favorite restaurant for lunch. Ruby on Rails also isn't deeply covered by industry analyst firms such as Gartner.
Ruby on Rails will never be considered "enterprise" until these things occur...
In my experience, the "enterprise ready" label is an indicator of managers' fear of adopting an open-source technology, possibly balanced with a desire not to remain a mere follower in that technology.
This may be objectively argued with considerations such as support from a third-party company or integration into existing development tools.
I suppose an application could be considered "enterprise ready" when it is stable enough that a large company would use it. It would also imply some level of support for when it does inevitably break.
Whether or not something is "enterprise ready" is entirely subjective, undefined, and rather buzzword-y. Basically, you can't have a test_isEnterpriseReady() - just make your application as reliable and efficient as it can be.