How to avoid short-lifespan enterprise applications?

A while ago another question referred to the (possibly urban legend) statistic that
... the average lifespan of software is about 3 years
At the time I came up with the following reasons (and I'm sure there are more, possibly better, ones):
A new major system (ERP, CRM, etc.) is implemented and it has an "integrated" module to replace the old app.
Same, but with no integrated module - and the existing app is not adaptable (the people left, technology has changed, current IT policies have changed, users don't like the existing app.)
The company from which you acquired the base app, to customize it for your needs, has disappeared.
Or you don't get along well with them any more.
The technology for the existing app is "obsolete" (according to the framework vendor/Microsoft/consultant/industry expert/new IT manager who has management's ear.)
"We're phasing out (Windows 95/Windows 98/Windows 2000/Windows XP/NT) and we need matching technology in our apps".
"We've learned a lot from (App Version n) and we'll do a lot better the second/third/fourth/n+1th time."
Job justification for developers/IT manager/Division VP/consulting company.
The users hate it.
We've merged/acquired a competitor/been acquired by a competitor and theirs is better.
Some of these are unavoidable (e.g. your company gets bought), but overall this is surely something that needs to be avoided. Does your organization intentionally fight this syndrome? What effective strategies would you recommend?

That's why an application needs to be easy to expand, and you should be able to easily add in all the buzzwords.
If you have a solid code base, most of the buzzwords are related to the UI (Vista Controls, Ajax, .NET, ASP.NET 3.5)...
You could be running COBOL in the back-end (I wouldn't).
A new major system is implemented - There's nothing you can do.
Current IT policies have changed - the app should be adaptable.
Users don't like/hate the existing app - why? Cosmetic changes in the UI can fix this most of the time.
The company you acquired the basic app from, to customize it for your needs, has disappeared - I wouldn't do that; I'd prefer to write it myself.
The technology for the existing app is "obsolete" (according to the framework vendor/Microsoft/consultant/industry expert/new IT manager who has management's ear.) - same as the above, if the back-end is solid, you should follow these in the front-end.
"We're phasing out (Windows 95/Windows 98/Windows 2000/Windows XP/NT) and we need matching technology in our apps". - a simple compatibility test and minor UI elements solve this.
I'll also say that this is different when you compare in-house to commercial apps. If you're doing an in-house app, change guarantees your job (if you know what you're doing). If you're doing a commercial app, change is an opportunity to make more money: new features get you upgrades from existing clients and new clients who are looking for the buzzwords, and these buzzwords could become your advantage over a competitor.

The average lifetime of software I write at the moment is probably a few days. (I write a lot of scripts, so I might be an aberration. ;-) But the core system I work with is probably 15 to 20 years old now. The underlying OS is about 30 years old. There is nothing inherently wrong with either old or young software. In fact, software ages best when it's possible to adapt it to new uses.
Having layers of abstraction between functional parts makes it easier to replace functionality in a system. For instance, we've gone through several different tape libraries on our system and now we are considering going to disk archives in the future. Since the "archive" portion of our system sits behind an abstraction layer, we can replace it fairly easily without replacing the rest of the system.
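A minimal sketch of that kind of abstraction layer in Java (ArchiveStore and the two implementations are hypothetical names for illustration, not our actual system):

    // Hypothetical abstraction layer: the rest of the system sees only ArchiveStore.
    public interface ArchiveStore {
        void store(String id, byte[] data) throws java.io.IOException;
        byte[] retrieve(String id) throws java.io.IOException;
    }

    // One implementation drives a tape library...
    class TapeLibraryArchive implements ArchiveStore {
        public void store(String id, byte[] data) { /* talk to the tape robot here */ }
        public byte[] retrieve(String id) { return new byte[0]; }
    }

    // ...and can later be swapped for a disk archive without touching the callers.
    class DiskArchive implements ArchiveStore {
        private final java.nio.file.Path root = java.nio.file.Paths.get("/archive");
        public void store(String id, byte[] data) throws java.io.IOException {
            java.nio.file.Files.write(root.resolve(id), data);
        }
        public byte[] retrieve(String id) throws java.io.IOException {
            return java.nio.file.Files.readAllBytes(root.resolve(id));
        }
    }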
When possible, it's also best to use standard parts. That way, if you run into some limitation, it's likely others will have the same problems and more likely someone will come up with a fix.

Continuous improvement - add useful features at regular intervals
No show-stopping bugs in new versions - testing, testing, testing...
Be nice to your clients and treat them with respect (most users really don't want to change their ERPs every three years, so if you have a good relationship with them they'll be on your side)
Stay current with new technologies and integrate them in your application when needed

When gathering requirements and someone says "Situation X will always be the case, no exceptions", make it configurable. It will always change, no exceptions.
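For instance, a minimal Java sketch of pushing such an "always" value into configuration rather than hard-coding it (the property file and key names are made up):

    import java.io.FileInputStream;
    import java.util.Properties;

    public class BusinessRules {
        // "Payment terms are always net 30, no exceptions" -- until the exception arrives.
        // Keeping the value in a properties file makes that change a config edit, not a release.
        public static int paymentTermDays() throws Exception {
            Properties props = new Properties();
            try (FileInputStream in = new FileInputStream("app.properties")) {
                props.load(in);
            }
            return Integer.parseInt(props.getProperty("payment.term.days", "30"));
        }
    }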

Most companies don't make it for 5 years. Their software implementations wouldn't be expected to last as long.


What is MAGIC programming language? Which other language is closest in syntax?

I have recently heard about Magic programming language from several sources and didn't recall ever hearing about it before. It was mentioned that it is a programming language from Israel.
I did some googling and couldn't find much information about it. I couldn't find any code examples, and Wikipedia didn't have any information on it either.
I think this is the site for it: http://www.magicsoftware.com/en/products/?catID=70 though I am not sure, as it mentions uniPaaS instead of Magic. However, other material on the site indicates that this is the new name for it.
I was interested in learning more about it from its practitioners, rather than from the company. I saw several claims on the internet that it provided really fast application development, similar to claims made by RoR proponents when it came out.
How does it compare to VB?
Is it still a better RAD tool than current .NET or MVC frameworks like Django, RoR, etc.?
How hard is it to learn?
If you can post some sample code it would be most helpful as well.
Could this site be it? Though it links back to the page above.
You're right my friend, Magic is the original name of the "programming language"; nowadays it's called uniPaaS (Uni Platform as a Service). I use it to develop some business applications. It may be the fastest way to create an application (data manipulation) - you can create apps in just a few days - but like everything in life it has its own drawbacks:
it's very unusual, so that makes it difficult to learn.
you do not have full control over what's happening in the background.
and you have to pay a lot for licensing (servers, clients, etc.)
If you are interested in learning this, you can download a "free" version of the software that only works with SQLite databases, called uniPaaS Jet.
The Magic language, as it's called today, is uniPaaS; it used to be Magic, then eDeveloper, and now uniPaaS, as PachinSV mentioned before.
uniPaaS is an application platform enabling enterprises, independent software vendors (ISVs) and system integrators (SIs) to more successfully build and deploy business applications.
You can download the free version of uniPaaS Jet here: http://web.magicsoftware.com/unipaas-jet-download,
try it yourself and see how easy it is to use.
Magic technology as you described is a Magic Software Enterprises tool (uniPaaS); you can find more information on:
official website: www.magicsoftware.com/en/products/?catID=70&pageID=55
uniPaaS Jet developer group on facebook: https://www.facebook.com/groups/unipaasJet/
Magic developer zone: devnet.magicsoftware.com/en/unipaas
Let me know if you find the information helpful
Bob
As PachinSV explained, there is a RAD once called Magic, then eDeveloper, now uniPaaS. This RAD is dedicated to database applications. Programming in this RAD does not look like anything else I know; you mostly don't write code as with usual languages, but it is nearly impossible to explain just with words. The applications are interpreted, not compiled.
As PachinSV said, when developing, you must follow UniPaaS' way of doing things. This is probably why so many people never manage to use Magic properly: if you thought like Magic before learning about it, then you will adapt to it easily; but if you have a long and successful experience using other database development tools, then often the Magic paradigm will never become natural to you. The learning curve is quite steep, you must learn a lot of things before being able to write a little application.
Previous versions stored the "code" inside a database table. The last version, uniPaaS, stores the code in XML files. I could send you an example, if PachinSV does not answer you before. But the files are pretty big: the smallest XML file I have in a test app is 4000 bytes, and any application is made of at least 11 files; an empty application is 7600 bytes. You must also understand that developers never use those files directly (they are undocumented AFAIK); they are only the storage format used internally by uniPaaS. The only way to use them is to set them up as a uniPaaS application.
I'm still an active MAGIC developer... This is the old name used, and it's a completely different paradigm, like some of you mentioned. I've been developing with it from Magic version 8.x to eDeveloper 9.x to 10.x, then renamed to uniPaaS.
The newer version is much easier to use and it is still very RAD in the sense that there is little or no code you write... a lot of the common programming tasks like IO, SQL commands, etc. are handled by the tool and are transparent (so even less code to write, since we use it in almost all types of applications)... It's mostly an enterprise tool... you wouldn't use it for small applications...
You can download the free version to learn the paradigm... but the enterprise licenses are expensive.. you need both the development tool and the runtime license if you want to deploy... so it can be costly for small scale projects...
I enjoy it personally, especially when you have to do quick proof of concepts or a quick data migration or porting onto any db platform and bridging any existing system through a wide range of gateways they provide with the licensed version.. It is up to date with the commonly used web technology out there...like SOAP, RIA ...
It's more popular in Europe... The HQ in the States is in Irvine... we used to have 2 branches in Canada but they closed down in 2001... Visit the Magic User Group on Yahoo... It's a very active forum with lots of cool people who will help you out in your quest...
http://tech.groups.yahoo.com/group/magicu-l/
I programmed with Magic for 6 years and found it to be an amazingly fast tool, easy to understand if you are a competent database programmer, because all operations are really about data manipulation. It is certainly a niche area to develop in, and because of this jobs are few and far between. As it is interpreted, there are really no bugs to make. It will work with many databases/connections simultaneously, but there is a big memory and processing hit.
Drawbacks :
Little control over communications between machines and devices
No mobile API as yet
Niche area so few skilled practitioners or companies willing to invest.
Good Points :
You can say you are a Magician; you can impress people with uber fast apps development (really)
It is easy to understand if you don't have a PhD in Maths
zero programming "bugs" can creep in. What you do is what you get.
I developed in the original Magic PC referred to by several of the folks above.
It is exactly this: FAST, FAST, but expensive and rigid in what it will allow you to do. It works on a tic-tac-toe-like matrix. Dropping commands into the various sections determines when they are run. The middle column is run indefinitely until you break the cycle. It is like a do-until loop. If you have to do an item once, you put it into this infinite loop and end it after one cycle.
The first column of procedures is run first, ONCE, before the infinite middle column is run. The 3rd column of commands is run after the infinite cycle, once. It is very efficient and logical once you get over the idea of an infinite loop.
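This is not Magic code (Magic tasks are defined in a grid, not written as text), but a rough Java analogue of that three-column execution model might look like this:

    // Rough analogy only -- real Magic tasks are laid out in a matrix, not written as code.
    public class TaskSkeleton {
        void prefix() { System.out.println("first column: runs once, before the cycle"); }
        boolean body() { System.out.println("middle column: one pass"); return true; /* true = end after one cycle */ }
        void suffix() { System.out.println("third column: runs once, after the cycle"); }

        void run() {
            prefix();
            boolean done = false;
            while (!done) {        // the "infinite" middle column, like a do-until loop
                done = body();
            }
            suffix();
        }

        public static void main(String[] args) { new TaskSkeleton().run(); }
    }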
Types can be specified, with an associated program to present each type. Then everywhere the type is used, all the settings automatically kick in. I especially like that one can write the program and 5 months later change the name of a variable and it is carried throughout the program. In fact the program does not use your name for anything; the internal name of any and all variables is hidden from the end user, so of course it is not a problem to change a name. It takes a minute to write an input program for any table. It takes a minute to write an export/import program for all the data files in the database.
You attach to a type of database, like Btrieve or SQL, independently of the program itself.
I stopped using the language because they demand more for the runtime engine than I could charge for the programs I wished to run with it. Bill Gates went the opposite direction. VB is superior in control and in being able to drop 10 datagridviews onto the same screen, but development is 10 times slower.
Its niche, then, is proof of concept for a program in a big company, or conversion, importing and exporting for a development company. It is good for $25k programs that are database heavy and not going mobile.
uniPaaS, Magic PC
I did some Magic work around 1993. It was a DOS based 4GL that came from Israel. Haven't seen it since.
How does it compare to VB?
It doesn't.
Is it still a better RAD tool than current .NET or MVC frameworks like Django, RoR, etc.?
If you mean "is it more Rapid", then yes, otherwise no.
How hard is it to learn?
About as hard as learning MS Access.
Coincidentally, if you want to get an idea of what it is and how it works, I've found that comparing it to MS Access is handy. It works in much the same way from a user's or developer's perspective. Obviously what happens in the background is vastly different, but if you've ever developed a form in design view in Access, Magic will seem very familiar.
Google tells me there's also MAGIC/L. All I could find about it was this blurb:
A procedural language written in Forth. Originally ran on Z80s under CP/M and later available for IBM-PCs and Sun 3s.
The only Magic programming language that I know about is one used by a company called Meditech. It's a proprietary language derived from MUMPS.
The language is truly miserable - here's a sample.

Is IDS overkill for a single-user app?

I have the following dilemma: my clients (mom-n-pop pawnshops) have been using my mgmt. system, developed with ISQL, for over 20 years. Throughout these two decades, I have customized the app to each client's desires, or when changes in laws/regulations have required it. Most clients are single-user sites. Some have multiple stores, but have never wanted a distributed db; they don't trust the reliability or security of the internet or any other type of networking. So, they all use Standard Engines. I've been able to work around some SE limitations and done some clever tricks with ISQL and SE, but sooner or later new laws may require images of pawnshop customers, merchandise, electronic transmission, etc., and then it will be time to upgrade to IDS, re-write the app in 4GL or change to another RDBMS. The logical and easiest route would be IDS/4GL; however, when I mentioned Linux or Unix-like platforms to my clients, they reacted negatively and demanded a Windows platform, so the easiest solution could be 4Js, Querix, etc.... or Access, Visual FoxPro or ???... anyone have suggestions?
This whole issue probably comes down to a couple of issues that you'll have to deal with.
The first thing is what application programming and development language are you willing to learn and work with?
The other thing is what kind of Internet capabilities do you want?
So for example while looking at a report do you want to be able to click on a button and have the report converted to a PDF document, and then launch the e-mail client with that PDF attached?
What about after they enter all the information into the system - perhaps each store would like their own miniature web site where people in town could go to check what the store has, in place of having to phone up the store and ask if they have a $3 used lighter (the labor of phoning and checking for these cheap items is MORE than the cost of selling the item - so the web is really great for this type of scenario).
The other issue is what kind of interface do you want? I assume you currently have some type of green screen or text based interface? Or perhaps over the years you did convert over to a GUI (graphical user interface).
If you are still on a green screen (text based) interface, you now have to sit down and put a considerable amount of effort and time into the layout and how your screens will work with a graphical based system. I can remember when going from green screens to color: all of a sudden the choices and effort of having to choose correct colors and layouts for each screen actually increased the workload by quite a bit. And then I went from color text screens to a graphical interface, and again all of a sudden we were presented with a large number of new controls and colors, and in addition to that we had large choices in terms of different fonts and sizes.
And now with the web, not only do you deal with different kinds of button styles (round, oval, shading, shadows, glow effects), but in addition to all those hover effects and shading effects etc., you now have to get down to some pretty serious issues in terms of what kind of colors (theme) your software will adopt for the whole web site.
This really comes down to how much learning and time you are willing to invest into new tools and how much software you can and will produce for given amount of time and effort.
I'm quite partial to RAD tools when you get down into the smaller business marketplace. Most smaller businesses cannot afford the rates for a .NET developer (it's not so much the rate as the time to build an application). So, using MS Access is a good choice in the smaller business market place. Access is still a good 3 to 5 times faster than many of the other tools in the marketplace. So a quote by a .NET developer to develop something might be 12,000 bucks, and the same thing in Access might be $3000. I mean that a small business cannot afford to pay you to write unit testing code. This type of extra cost is just not going to happen on the smaller scale projects.
The other big issue you have to deal with is what kind of report writing system you are going to build into the system. This is another reason why I like Access for the smaller business applications: the report writer is really fantastic. Access reports have a whole bunch of abilities to bake connections in from forms and queries and pass filters and parameters into those reports. And often the forms and queries that you already spent time building can talk to reports with parameters and pass values in a way that again really reduces the workload (development costs).
I think the number one issue that you'll have to address here, however, is what you're going to do for your web based strategy. You absolutely have to have one. Even if you build the front end part in Access, you might still want to use a free edition of SQL Server for the back end part. There are several reasons for this, but one reason is that it then makes it easy to connect multiple stores up over the Internet.
Another advantage of putting your data in some type of server based system is that you can now set up some type of web server for all the stores to use, and build a tiny little customized system that allows each store to have their products and listings online (but they use YOUR web server, or one that you're paying $15 per month for to host all of those customers). This web part could be an optional component that perhaps not all customers necessarily want. It would work off of the data they have to enter into the system anyway.
One great advantage of adopting these web based systems is not only does it allow these stores to serve their customers far better, but it also opens up the doors for you to convert your software into a monthly fee based system, or at least some part of it such as the optional web hosting part you offer.
When I converted some of my longer-lived applications from green screen mainframe type software into Windows desktop based applications, it opened up large markets for me. With remote desktop, downloadable software and issuing updates from a web site, these new software systems make all of the nuts-and-bolts parts of delivering software very easy now, and especially so for supporting customers in different cities that you've never met face to face.
So, if you're talking still primarily single user and one location, Access will reduce your development costs by a lot. It really depends on how complex and rich an application you are talking about. If the size and scope of the project is beyond one developer, then you're talking more about developer scaling (source code control, object development methodology, unit testing, cost and time of setting up a server based database system like SQL Server, etc.). There is certainly a tipping point here; when you go beyond that tipping point of cost, time and complexity, then I actually don't recommend Access. So this all comes down to the right horse for the right course.
Perhaps at the end of the day, it really comes down to what application development system you are willing to invest the time to learn.
Look at Aubit4GL - that is, I believe, available on (or can be compiled on) Windows.
Yes, IDS is verging on overkill for a single-user system, but if SE doesn't provide all the features you need, or anticipate needing in the near future, it is a perfectly sensible choice. However, with a modicum of care, it can be set up to be (essentially) completely invisible to the user. And for a non-stressful application like this, the configuration is not complicated. You, as the supplier, would need to be fairly savvy about it. But there are features like silent install such that you could have your own installer run the IDS installer to get the software onto the customer's machine without extra ado. The total size of the system would go up - IDS is a lot bigger on disk than SE is (but you get a lot more functionality). There are also mechanisms to strip out the bigger chunks of code that you won't be using - in all probability. For example, you'd probably use ON-Tape for the backups; you would therefore omit ON-Bar and ISM from what you ship to customers.
IDS is used in embedded systems where there are no users and no managers working with the system. The hardware sits in the cupboard (closet) and works, communicating over the network.
It's good to see folks still getting value out of "old school" Informix tools. I was never adept at Perform, but the ACE report writer always suited me. We skipped Perform and went straight for FourGen, and I lament that I've never been as productive as I was with FourGen. It had its own kind of elegance, from its code generators to its funky, but actually quite powerful, standalone menu system.
I appreciate the modern UI dynamics, but, damn, is it hard to write applications today. Not just tools, but simply industry requirements et al (such as you may be experiencing in your domain). And the Web is just flat out murder.
I guess part of it is that since most "green screen" apps look the same, it's hard to make one that looks bad! With GUIs and the Web etc., you can't simply get away with a good field order and the labels lining up.
But, alas, such as it is, that is what we have.
I have not used it in, what now, 15 years, but you may also want to look at Alpha 5. It was a pretty powerful, but not overly complicated, database development package, and (apparently) still going strong.
I wouldn't be too afraid of IDS. It runs pretty simply. Out of the box with zero or little tweaking, the DB works and is efficient, and it used to be pretty trivial to install. It was no SE, in that SE's access was tied to the application (using a library) vs an independent server that is IDS. But, operationally, it's really straightforward -- especially for an app like what you're talking about. I appreciate that it might be overkill, but even today, the resource requirements won't necessarily be insane. There's a lot of functionality, of course, and flexibility that you won't use. But frankly, beyond "flat file" DBase style databases, pretty much ALL of the server based SQL databases are very powerful and capable and potentially complicated. But they don't have to be. They can still be used "simply" and easily (well, save for Oracle -- Oracle can't do anything "simply").
As far as exploring other solutions, don't be too afraid of the "OOP" stuff, as most applications, while they leverage OOP libraries, aren't really OOP themselves (they can be, they just typically aren't; they simply don't need to be). The biggest issue with many of the OOP systems is that they're simply too finely structured, dealing with events at far too low a level. While many programs need access to that fine level of control, most applications, particularly ones much like yours, do not. So the extra flexibility simply gets in the way or creates more boilerplate.
That said, you shouldn't be frightened away from them per se, citing lacking of expertise. They can be picked up reasonably quickly. But I would certainly exhaust the more specialized tools (like Alpha 5, or Access, etc.) first to see if they don't offer what you want.
As for Visual FoxPro, it was and remains a peerless tool (despite flak from people who know little about it). It has a fast, native database engine, built-in SQL, a powerful report designer and so on. But you also have to consider that Microsoft support will be dropped for it in 2014, there will never be a 64-bit version, and so on. And the file locking method it uses will be increasingly flaky on future versions of Windows IMO.

How do you handle technology updates in long running projects?

Let's assume you're in the middle of a long running project (long running = several years) and, as expected, there will be several things coming up with brand new releases. There might be a new .Net Framework with brand new features (e.g. Linq, Entity Framework, WPF, WF...), a new Visual Studio or V.next of your favorite Control Library, a new Mock Framework and a lot more things.
What are your guidelines for handling these technology updates? Do you adopt them instantly or do you ignore them until the end of the project? Do you have different guidelines for different things (Tools, Frameworks, supporting stuff)?
In my experience, these decisions are always made on a case-by-case basis. Several factors are considered, including:
How mature is the new technology? Does the organization like to be at the forefront working with bleeding edge new technologies, or does it prefer to work with proven tools and methodologies?
What skill sets do your people have? Are they consistent with use of the new technology, or is more training needed? Will improved productivity outweigh the time it takes to come up to speed?
What investment do you have in the existing technology? What is the cost of moving to the new technology? How much rework and rewriting of code is involved?
What is the requirement? Is it supported by the existing technology, or are new tools needed to fulfill the requirement?
What are the performance expectations? Does the new technology provide a performance improvement that cannot be met with the old technology?
What about the technological culture? Is the organization vendor specific (e.g. a Microsoft shop)? Can open-source code be used?
What is the scope of the project? Is it a large project that would benefit from supporting technologies like frameworks and tools, or is it a small project that would be unduly weighed down and complicated by these things?
How is the new technology supported? Does the vendor have good documentation? Is there someone you can talk to if you have problems? Or are you an organization that has people that know how to solve problems without a support contract?
Is the technology comfortable to work with? Does it seem to make sense? Is it clean and elegant? Do other people seem to like it? Are other people having problems with it?
Is the technology the latest flavor of the week? Has it proven itself in the battlefield to produce tangible results, or is it just a religion?
How much time do you have to learn the new technology and iron out the kinks? Do the benefits outweigh the costs?
As a very brief example, I chose Linq to SQL for my most recent project, because the project was complex enough to warrant an ORM, L2S performs well and is lightweight, we are a Microsoft shop, and it is my sense that the Entity Framework is not quite ready for prime time (even though Microsoft says that it will be the go-to framework for the future).
Stick with what you've started with.
A large and long running project often comes with a huge and highly complex code-base. Any change or upgrade to a new version of a library can add bugs in very subtle and unexpected ways.
Also: For large projects the tools and libraries used should have been tested and evaluated in the design-phase. Unless you find a show-stopper or a security issue it's best to not upgrade.
Always remember: Don't change horses in the middle of a stream. :-)
I would say different factors pitch in, like:
Say a piece of software is nearing its end of life; for example, last April Microsoft retired mainstream support for SQL Server 2000. If your product uses it, then it's wiser to go for the next version of SQL Server in your next release.
Another factor which comes into play is how much value the new features in the latest release would bring to your product. It may well be the case that the new release of the .NET framework has nothing that adds value to your product; then that does not build a strong case to upgrade.
Budget is also an important factor. I think you need to upgrade licenses in order to step up to the next release, unless you are already part of something like Software Assurance.
Training for the team is also a factor. If the latest release is going to add to your product then you will have to train your team as well.
Well, there could be other telling factors too. These were the ones off the top of my head. I hope it helps.
cheers
If you're talking about a framework-specific example, the biggest piece of advice I'll give you is to keep the system and your application separate. This is why I love patterns such as Model-View-Controller - it keeps your code modular and means you can upgrade sections without breaking the app in its entirety.
On a more practical level, if your framework has a Git or SVN repository, checkout the usual 'system' directory from the repo, then you can call 'svn update' occasionally to keep up with the latest and greatest builds.
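A tiny Java sketch of that separation (all names here are hypothetical): the application depends only on an interface it owns, and a single adapter class knows about the framework, so a framework upgrade stays contained:

    // Application-owned interface: the rest of the app depends only on this.
    public interface CustomerRepository {
        String findName(int id);
    }

    // The only class that touches the (hypothetical) framework API.
    // Upgrading the framework means revisiting this adapter, not the whole app.
    class FrameworkCustomerRepository implements CustomerRepository {
        public String findName(int id) {
            // e.g. delegate to the framework's data-access layer here
            return "customer-" + id;
        }
    }

    class CustomerController {
        private final CustomerRepository repo;
        CustomerController(CustomerRepository repo) { this.repo = repo; }
        String show(int id) { return "Name: " + repo.findName(id); }
    }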
I would suggest that the project not last that long. Develop the application in smaller pieces with iterations every couple of months. That way, as new technology comes out, you can make the necessary changes and implement updates as you go rather than having to decide to redevelop the whole application. As you say, trying to develop the whole application as things change just doesn't work.
As another poster said, it's certainly a case-by-case basis thing. What you can upgrade and when is determined mostly by how hard or easy it is to test the new version of the system. Having a comprehensive automated test suite for your application helps a lot with this.
Generally, I try to update to the latest stable release of libraries and so on as often as possible, because that makes maintenance easier. If you don't update, you may find yourself patching or working around bugs in the version of the library you are using. If you update less frequently, each update will be more work because you have more changes to deal with, and it's been longer since you last touched the system, and thus you remember less about it.

How important is platform independence?

A lot of software frameworks, languages and platforms claim platform independence and boast of it as a selling feature. However, I have failed to understand how this could be such an important feature. For example, Java is said to be platform independent - but why should I care when I know that my webapp is going to run on only one platform? Is the overhead of making an application platform independent really worthwhile?
For webapps it mostly isn't an issue, as they are by definition almost "platform independent". I mean, users of the application mostly aren't tied to any particular platform.
For desktop apps it is a question of your potential client base. If you think that you will benefit from targeting multiple platforms, then it's worth making your application platform independent; otherwise, better to stay away from it :)
If you know your app is going to run on only one platform you shouldn't care - you should evaluate the framework using the same criteria as every other framework on your target platform.
This of course depends on the application in question. If you know that the application is going to run on only one platform, then there's obviously no reason to require it to be platform independent. On the other hand, if you are building an application that is supposed to be usable for, say, next 15 years, how can you know that the platform you choose will even exist then? It's hard to predict the future, and therefore making your app platform independent gives you one headache less.
Platform independence doesn't necessarily imply overhead. Rather, it implies good programming practices; if you make your app orthogonal to the platform, then changing the platform is a breeze.
Sometimes it's impossible to avoid platform-dependent function calls, for example because of having to directly communicate with some hardware device at a low level. Even then it's possible to make the app "almost platform independent". Instead of scattering the platform dependent things everywhere, wrap them all strictly into one class/package/whatever. Then you need to change just that one unit in order to port your app to another platform.
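As a sketch of that idea in Java (the device I/O interface and implementations are made up for illustration):

    // Hypothetical example: all platform-specific calls live behind one interface.
    public interface DeviceIO {
        byte[] readSensor();
    }

    class LinuxDeviceIO implements DeviceIO {
        public byte[] readSensor() { /* e.g. read from a /dev node */ return new byte[0]; }
    }

    class WindowsDeviceIO implements DeviceIO {
        public byte[] readSensor() { /* e.g. call a vendor DLL via JNI */ return new byte[0]; }
    }

    class DeviceIOFactory {
        // The only place in the application that inspects the platform.
        static DeviceIO create() {
            String os = System.getProperty("os.name").toLowerCase();
            return os.contains("win") ? new WindowsDeviceIO() : new LinuxDeviceIO();
        }
    }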
We develop a Java B2B application that is Unix only, but works on all Unix flavors (where Java is available).
The advantage of having a multiplatform application is that our customers sometimes have knowledge of Linux, sometimes Solaris, sometimes FreeBSD, ...
This way we can adapt to the customer and not force them to use one specific platform.
For example, Java is said to be platform independent - but why should I care when I know that my webapp is going to run on only one platform?
The fact that it's not advantageous to you doesn't mean that it's of no benefit. I'm sure many Java developers enjoy the fact that they don't have to recompile their application for each platform (hence it's a selling point). A web app that makes exclusive use of ActiveX for certain components will face more roadblocks if, in the future, other platforms also become of interest.
Is the overhead of making an application platform independent really worthwhile?
Depends on what you mean by overhead. If it's a good framework, there might be minimal overhead. Of course if other platforms are of no interest to you, then yes, it's an overhead. However, the fact is that unlike a decade or so ago, more platforms are starting to matter these days (at least for web and desktop application). So, the overhead could be worth it in the long run.
If you're developing only server-side, you probably don't need to take care of it at the moment. However, you might be extremely happy down the road to find that you can run your application seamlessly on another OS if the need arises (for instance, if asked for by a client, or if you have specific performance/functionality needs).
For a client-side application, platform-independence means a lot less work to be able to ship for Mac and Linux, and yes, that might be worth it.
You almost answer your own question. Platform independence is only important if you want your application to work on multiple platforms. If you don't, then that's one less thing to worry about.
Take OpenOffice or Firefox for example. You can use those on every major platform. That's important to them because they want everyone to be able to use them and have the same experience no matter what their OS is.
If your project is smaller and doesn't really need to be on every platform, then don't worry about it. It's really a judgment call for each program you develop.
but why should I care when I know that my webapp is going to run on only one platform?
You shouldn't. If you know that you are going to run on only one platform, platform independence is not very relevant to you.
But you are not equal to all the population of potential users. Other people will want to target pc's in multiple platforms.
It's like having a version in Chinese. If you're going to sell only in English speaking countries, it's irrelevant. If you're trying to sell in China, it might help.
Theoretically, platform independence helps you avoid the so-called "vendor lock" while at the same time giving you a broader reach and potentially more customers.
In practical terms, you should evaluate your target audience and do good business calculation on whether the profit potential of being able to deliver to multiple platforms outweighs the cost of adopting a platform independent framework. After all, the framework might claim to work the same on all platforms, but you will have to verify that claim. Not to mention that no framework solves all problems for delivering an application, like deployment, configuration, centralized management, updating/upgrading and so on.
Of course, if your product is server-based and the end user is going to consume it through an HTTP agent, you don't have to worry about it - for the most part, and as long as you stay in the [relatively] safe realm of HTML, JavaScript and Flash.
Platform independence is a desirable feature for software vendors because they invest a large amount of money developing a modern, sophisticated application so they don't want to artificially cut out any market segment. They want to sell their baby to as many organizations as possible.
Software vendors try to convince IT departments that platform independence is a good thing because it avoids vendor lock-in. I'm sure that is important, in theory; however, in practice most IT departments self-impose vendor lock-in with their attitudes, usually concerning a particular technology vendor of high prominence.
"Platform independence" can mean different things to different people. For example, is "Windows XP" a different platform than "XP 64", or Vista, or Windows 7? It depends upon whether you write application software or drivers, and on what pre-installed libraries and services you depend on.
In the most general sense, no application can be truly platform-independent - you won't expect to run a web application on the embedded Linux in your toaster, or on a 16-MB Windows 3.11 machine.
But software frameworks that have platform independence as an architectural target are generally better prepared when your platform changes, and in any long-lived project, it will change, if only because hardware will be replaced every 3-5 years, and new hardware often comes with new OS versions.
You always pay for Flexibility.
Always.
Deciding if the cost is worth it (the pay offs can be very high) is entirely dependent on the needs of the individual/company at hand but there is always a cost. Many of these are implicitly assumed, for example:
Most people code to a file-system-agnostic[1] API rather than one assuming a particular implementation, and this choice is correct so often as to be a reasonable default in the absence of any particular requirement in that area.
Nonetheless it is sometimes worth revisiting your core assumptions every so often, simply to know what they are.
[1] at least to the level of saying it's a tree with path separators '/' as opposed to talking ext3, NTFS, ReiserFS, etc...
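As a small Java illustration of that footnote's point, coding against the platform-agnostic path API rather than a particular separator or file system (the path segments are made up):

    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class PathExample {
        public static void main(String[] args) {
            // Build the path through the API instead of hard-coding '/' or '\\',
            // and make no assumptions about ext3 vs NTFS underneath.
            Path reports = Paths.get("data", "2009", "reports.csv");
            System.out.println(reports);                 // data/2009/reports.csv or data\2009\reports.csv
            System.out.println(reports.toAbsolutePath());
        }
    }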
For a web application that only you are going to use, the only point of being platform-independent is that it makes it easier on you if you change servers down the line.
Of course, languages like Java are used for a lot more than web applications - people write standalone(-ish) desktop programs in them as well, and for those it's a lot more useful to be platform independent. Sun can do the work of making sure Java runs on a whole bunch of different computers, and every Java application developer shares the benefits of that work for free, basically. It's especially beneficial to developers of mobile phone applications (not the iPhone or Android, but good old basic cell phones): writing different code for every different phone out there would be a nightmare. The fact that many phones include a JRE to run applications makes the developers' jobs easier.
One field where cross-platform is an issue even for desktop applications is software for the scientific community. From my experience, the desktops in academia are much more heterogeneous than the ones you see at home, in offices, etc.
Platform independence is not much of an issue when you target a certain platform but it is when you write an application. There are libraries and frameworks out there which solve about any problem you might encounter. Only you can't use them unless they have been written for your target platform.
Which is why it is usually a good thing for a library or framework to be as platform independent as possible because every developer on the planet is a possible client. In the next step, it makes it more simple for application developers to write code which runs on any platform. In the last years, we have seen the user numbers of Mac and Linux grow steadily. So if you can sell to them for little additional cost, why not?

What is "Enterprise ready"? Can we test for it?

There are a couple of questions on Stack Overflow asking whether x (Ruby / Drupal) technology is 'enterprise ready'.
I would like to ask how is 'enterprise ready' defined.
Has anyone created their own checklist?
Does anyone have a benchmark that they test against?
"Enterprise Ready" for the most part means can we run it reliably and effectively within a large organisation.
There are several factors involved:
Is it reliable?
Can our current staff support it, or do we need specialists?
Can it fit in with our established security model?
Can deployments be done with our automated tools?
How easy is it to administer? Can the business users do it or do we need a specialist?
If it uses a database, is it our standard DB, or do we need to train up more specialists?
Depending on how important the system is to the business the following question might also apply:
Can it be made highly available?
Can it be load balanced?
Is it secure enough?
Open source projects often do not pay enough attention to the difficulties of deploying and running software within a large organisation. E.g. most OS projects default to MySQL as the database, which is a good and sensible choice for most small projects; however, if your enterprise has an Oracle site license and a team of highly skilled Oracle DBAs in place, the MySQL option looks distinctly unattractive.
To be short:
"Enterprise ready" means: If it crashes, the enterprises using it will possibly sue you.
Most of the time the "test", if it may really be called such, is that some enterprise (= large business) has deployed a successful and stable product using it. So it's more like saying it has proven its worth on the battlefield, or something like that. In other words, the framework has (or hasn't) been used successfully in the real world; you can't just follow some checklist and load tests and say it's enterprise ready.
Like Robert Gould says in his answer, it's "Enterprise-ready" when it's been proven by some other huge project. I'd put it this way: if somebody out there has made millions of dollars with it and gotten written up by venture capitalist magazines as the year's (some year, not necessarily this one) hottest new thing, then it's Enterprise-ready. :)
Another way to look at the question is that a tech is Enterprise-ready when a non-tech boss or business owner won't worry about whether or not they've chosen a good platform to run their business on. In this sense Enterprise-ready is a measure of brand recognition rather than technological maturity.
Having built a couple "Enterprise" applications...
Enterprise outside of development means that if it breaks, someone can fix it. I've worked with employers/contractors that stick with quite possibly the worst managed hosting providers, data vendors, or such, because those vendors will fix problems when they crop up, even if they crop up a lot, and there is someone to call when things break.
So to restate it another way, Enterprise software is Enterprisey because it has support options available. A simple example: jQuery isn't enterprisey while ExtJS is, because ExtJS has a corporate support structure to it. (Yes, I know comparing these two frameworks is like comparing a toolset to a factory-manufactured home kit.)
As my day job is all about enterprise architecture, I believe that the word enterprise isn't nowadays about size nor scale but refers more to how a software product is sold.
For example, Ruby on Rails isn't enterprise because there is no vendor that will come into your shop and do Powerpoint presentations repeatedly for the developer community. Ruby on Rails doesn't have a sales executive that takes me out to the golf course or my favorite restaurant for lunch. Ruby on Rails also isn't deeply covered by industry analyst firms such as Gartner.
Ruby on Rails will never be considered "enterprise" until these things occur...
From my experience, the "Enterprise ready" label is an indicator of managers' fear of adopting an open-source technology, possibly balanced with a desire not to fall behind on that technology.
This may be objectively argued with considerations such as support from a third party company or integration with existing development tools.
I suppose an application could be considered "enterprise ready" when it is stable enough that a large company would use it. It would also imply some level of support for when it does inevitably break.
Whether or not something is "enterprise ready" is entirely subjective, undefined, and rather buzzwordy. Basically, you can't have a test_isEnterpriseReady() - just make your application as reliable and efficient as it can be.