What is "Enterprise ready"? Can we test for it? - enterprise

There are a couple of questions on Stack Overflow asking whether a given technology (Ruby, Drupal, ...) is 'enterprise ready'.
I would like to ask how 'enterprise ready' is defined.
Has anyone created their own checklist?
Does anyone have a benchmark that they test against?

"Enterprise Ready" for the most part means can we run it reliably and effectively within a large organisation.
There are several factors involved:
Is it reliable?
Can our current staff support it, or do we need specialists?
Can it fit in with our established security model?
Can deployments be done with our automated tools?
How easy is it to administer? Can the business users do it or do we need a specialist?
If it uses a database, is it our standard DB, or do we need to train up more specialists?
Depending on how important the system is to the business, the following questions might also apply:
Can it be made highly available?
Can it be load balanced?
Is it secure enough?
Open source projects often do not pay enough attention to the difficulties of deploying and running software within a large organisation. For example, most open source projects default to MySQL as the database, which is a good and sensible choice for most small projects; however, if your enterprise has an Oracle site license and a team of highly skilled Oracle DBAs in place, the MySQL option looks distinctly unattractive.
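To illustrate why the database question matters, here is a minimal, hypothetical C# sketch of keeping data access behind ADO.NET's provider factory layer, so the same calling code can target whichever database the enterprise has standardised on. The provider name, connection string, and table are placeholders, not recommendations.

    using System;
    using System.Data.Common;

    class ProviderNeutralQuery
    {
        static void Main()
        {
            // "System.Data.SqlClient" ships with .NET; an Oracle or MySQL shop would
            // substitute its own installed provider's invariant name here (assumed,
            // not prescribed) and keep the rest of the code unchanged.
            DbProviderFactory factory = DbProviderFactories.GetFactory("System.Data.SqlClient");

            using (DbConnection conn = factory.CreateConnection())
            {
                conn.ConnectionString = "Data Source=.;Initial Catalog=Example;Integrated Security=True";
                conn.Open();

                using (DbCommand cmd = conn.CreateCommand())
                {
                    cmd.CommandText = "SELECT COUNT(*) FROM Orders"; // illustrative table
                    Console.WriteLine(cmd.ExecuteScalar());
                }
            }
        }
    }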

To be short:
"Enterprise ready" means: If it crashes, the enterprises using it will possibly sue you.

Most of the time the "test", if it can really be called that, is that some enterprise (i.e. a large business) has deployed a successful and stable product using it. So it's more like saying the technology has proven its worth on the battlefield. In other words, the framework has (or hasn't) been used successfully in the real world; you can't just follow some checklist, run load tests, and declare it enterprise ready.

Like Robert Gould says in his answer, it's "Enterprise-ready" when it's been proven by some other huge project. I'd put it this way: if somebody out there has made millions of dollars with it and gotten written up by venture capitalist magazines as the year's (some year, not necessarily this one) hottest new thing, then it's Enterprise-ready. :)
Another way to look at the question is that a tech is Enterprise-ready when a non-tech boss or business owner won't worry about whether or not they've chosen a good platform to run their business on. In this sense Enterprise-ready is a measure of brand recognition rather than technological maturity.

Having built a couple "Enterprise" applications...
Enterprise outside of development means that if it breaks, someone can fix it. I've worked with employers/contractors that stick with quite possibly the worst managed hosting providers, data vendors, and the like, because those vendors will fix problems when they crop up, even if they crop up a lot, and there is someone to call when things break.
So to restate it another way: enterprise software is enterprisey because it has support options available. A simple example: jQuery isn't enterprisey while ExtJS is, because ExtJS has a corporate support structure behind it. (Yes, I know comparing these two frameworks is like comparing a toolset to a factory-manufactured home kit.)

As my day job is all about enterprise architecture, I believe that the word enterprise nowadays isn't about size or scale but refers more to how a software product is sold.
For example, Ruby on Rails isn't enterprise because there is no vendor that will come into your shop and give PowerPoint presentations repeatedly to the developer community. Ruby on Rails doesn't have a sales executive who takes me out to the golf course or to my favorite restaurant for lunch. Ruby on Rails also isn't deeply covered by industry analyst firms such as Gartner.
Ruby on Rails will never be considered "enterprise" until these things occur...

From my experience, the "enterprise ready" label is an indicator of managers' fear of adopting an open-source technology, possibly balanced by a desire not to remain a mere follower in that technology.
This may be argued objectively with considerations such as support from a third-party company or integration with existing development tools.

I suppose an application could be considered "enterprise ready" when it is stable enough that a large company would use it. It would also imply some level of support for when it does inevitably break.
Whether or not something is "enterprise ready" is entirely subjective, undefined, and rather buzzword-y. Basically, you can't have a test_isEnterpriseReady() - just make your application as reliable and efficient as it can be.


Should developers be limited to certain software for development?

Should developers be limited to certain applications for development use?
For most, the answer would be that as long as the development team agrees, it shouldn't matter.
For a company that is audited for security certifications, is there an approach that balances the company's risk against the developers' flexibility and productivity?
Scope
coding/development software
build system software
3rd party software included with distribution (libraries, utilities)
(Additional) Remaining software on workstation
Possible solutions
Create a white-list of approved software, where a developer must ask for approval for desired software before he/she can use it. Approval would be based on business purpose/security risk. (A rough sketch of this kind of check appears after this list.)
Create a black-list for software. Developers list all software used. A review board periodically goes over the list.
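The white-list idea could be automated, at least crudely. The following C# sketch is hypothetical: it walks the Windows uninstall registry key and flags anything whose display name isn't on an approved list. The approved names, the exact-match rule, and the console reporting are all assumptions for illustration; a real process would need an approval workflow behind it.

    using System;
    using System.Collections.Generic;
    using Microsoft.Win32;

    class SoftwareWhitelistAudit
    {
        static void Main()
        {
            // Illustrative approved list; in practice this would come from the
            // review board's records, not a hard-coded set.
            var approved = new HashSet<string>(StringComparer.OrdinalIgnoreCase)
            {
                "Microsoft Visual Studio 2008",
                "TortoiseSVN",
                "NUnit"
            };

            using (RegistryKey uninstall = Registry.LocalMachine.OpenSubKey(
                @"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"))
            {
                if (uninstall == null) return;

                foreach (string subKeyName in uninstall.GetSubKeyNames())
                {
                    using (RegistryKey app = uninstall.OpenSubKey(subKeyName))
                    {
                        string name = app == null ? null : app.GetValue("DisplayName") as string;
                        if (name != null && !approved.Contains(name))
                            Console.WriteLine("Not on the approved list: " + name);
                    }
                }
            }
        }
    }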
Has anyone had to work at a company that restricted developer tools beyond the team setting? How did they handle the situation?
Edit
Cleaned up question. Attempted to make less argumentative.
Limiting the software that developers can use on their work machines is a fantastic idea. This way, all the developers will quit, and then the company won't have to spend as much money on salaries and equipment, resulting in higher profits.
Real answer: NO!!!
No, developers should not be limited in the software they use, because it prevents them from successfully doing their jobs. Think about how much you are paying your team of developers - do you really want all that money to go spiraling down the drain because you've artificially prevented them from solving problems?
1) Company locks down the PC and treats the developer as no more competent than a secretary
What happens when the developer needs to do something with administrative permissions? EG: Register a COM object, restart IIS, or install the product they're building? You've just shut them down.
2) Create a white-list of approved software...
This is also impractical due to the sheer amount of software. As a .NET developer I regularly (at least once per week) use upwards of 50 distinct applications, and am constantly evaluating newer upgrades/alternatives for many of these applications. If everything must go through a whitelist, your "approval" staff are going to be utterly swamped by just one or 2 developers, let alone a team of them.
If you take either of these actions, you'll achieve the following:
You'll burn giant piles of time and money as the developers sit on their thumbs waiting for your approval team, or doing things the long slow tedious way because they weren't allowed to install a helpful tool
You'll make yourself the enemy of the development department (not good if you want your devs to actually do what you ask them to do)
You'll depress team morale substantially. Nobody enjoys feeling like they're locked in a cage, and every time they think "This would have been finished 5 hours ago if only I could install grep", they'll be unhappy.
A more acceptable answer is to create a blacklist for "problem" software (and websites) such as Pidgin, MSN messenger, etc if you have problems with developers slacking off. Some developers will also rail against this, but many will be OK with it, provided you are sensible in what you blacklist and don't go overboard.
I think developers should have total control over the applications they use, as long as they can do their job with them. Developers' productivity is directly related to the working environment, and no one likes being restricted; everyone likes to use the software they have chosen themselves.
Of course there should be some standards in terms of version control, document format, etc., but generally developers should have right to use any programs they want.
And security shouldn't be the developer's concern - company admins should take care of setting up proper firewalls to protect against any kind of attack.
A better solution would be to create a secure, independent environment for the developers - an environment that, if compromised, won't put the rest of the company at risk.
The very nature of development is to create crafty, ingenious, pithy solutions. To achieve this, failures must happen.
Whatever they do, don't take away the Internet in general. Google = Coding Help 101 :)
Or maybe just leave www.stackoverflow.com allowed haha.
I'd say this depends on quite a list of factors.
One is team size. If you have a team of half a dozen developers, this can be negotiated whenever a need for some application pops up. If you have a team of 100 developers, some policy is probably in order.
Another factor is what those developers do. If they compile C code using a proprietary compiler for an embedded platform, things are very different from a team producing distributed web or PC software in a constantly shifting environment.
The software you produce and the target customers are important, too. If you're porting the Linux kernel to some new platform, whether code leaks probably doesn't matter all that much. OTOH, there are a lot of cases where this is very different.
There are more factors, but in the end it all boils down to two conflicting goals:
You want to give your developers as much freedom as possible, because that stimulates their creativity.
You want to restrict them as much as possible, as this reduces risks. (I'm talking of security risks as well as the risk to ship non-functioning software etc.)
You'll have to find a middle ground that doesn't hurt creativity while providing enough guarantees not to hurt the company.
Of course! If you want a repeatable build process, you don't want it contaminated by whatever random bit of junk a programmer happens to use as a tool to generate part of the code. Since whatever application you are building lasts much longer than anyone expects, you also want to ensure that the tools used to build it are available for roughly the same duration; random tools from the internet don't provide any such guarantee.
Your team should say "The following tools are allowed for build steps and nothing else" and attempt to make that list short.
Obviously, it shouldn't matter what a programmer looks at to decide what to do, so the entire Internet is just fine as long as it's look-only. Nor does it matter if he produces code by magic (or a random tool) as long as your team doesn't mind accepting that tool's output as though it were written by hand.

How do you handle technology updates in long running projects?

Let's assume you're in the middle of a long running project (long running = several years) and, as expected, there will be several things coming up with brand new releases. There might be a new .NET Framework with brand new features (e.g. LINQ, Entity Framework, WPF, WF...), a new Visual Studio or v.next of your favorite control library, a new mock framework and a lot more things.
What are your guidelines for handling these technology updates? Do you adopt them instantly or do you ignore them until the end of the project? Do you have different guidelines for different things (Tools, Frameworks, supporting stuff)?
In my experience, these decisions are always made on a case-by-case basis. Several factors are considered, including:
How mature is the new technology? Does the organization like to be at the forefront working with bleeding edge new technologies, or does it prefer to work with proven tools and methodologies?
What skill sets do your people have? Are they consistent with use of the new technology, or is more training needed? Will improved productivity outweigh the time it takes to come up to speed?
What investment do you have in the existing technology? What is the cost of moving to the new technology? How much rework and rewriting of code is involved?
What is the requirement? Is it supported by the existing technology, or are new tools needed to fulfill the requirement?
What are the performance expectations? Does the new technology provide a performance improvement that cannot be met with the old technology?
What about the technological culture? Is the organization vendor specific (e.g. a Microsoft shop)? Can open-source code be used?
What is the scope of the project? Is it a large project that would benefit from supporting technologies like frameworks and tools, or is it a small project that would be unduly weighed down and complicated by these things?
How is the new technology supported? Does the vendor have good documentation? Is there someone you can talk to if you have problems? Or are you an organization that has people that know how to solve problems without a support contract?
Is the technology comfortable to work with? Does it seem to make sense? Is it clean and elegant? Do other people seem to like it? Are other people having problems with it?
Is the technology the latest flavor of the week? Has it proven itself in the battlefield to produce tangible results, or is it just a religion?
How much time do you have to learn the new technology and iron out the kinks? Do the benefits outweigh the costs?
As a very brief example, I chose LINQ to SQL for my most recent project, because the project was complex enough to warrant an ORM, L2S performs well and is lightweight, we are a Microsoft shop, and it is my sense that the Entity Framework is not quite ready for prime time (even though Microsoft says that it will be the go-to framework for the future).
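For anyone who hasn't used it, the sketch below shows roughly what that LINQ to SQL choice looks like in code. The Customer table, its columns, and the connection string are invented for illustration, not taken from the project described above.

    using System;
    using System.Data.Linq;
    using System.Data.Linq.Mapping;
    using System.Linq;

    [Table(Name = "Customers")]   // hypothetical table
    public class Customer
    {
        [Column(IsPrimaryKey = true)] public int CustomerId { get; set; }
        [Column] public string Name { get; set; }
        [Column] public string City { get; set; }
    }

    class Program
    {
        static void Main()
        {
            // The connection string is a placeholder for whatever the shop's standard is.
            using (var db = new DataContext("Data Source=.;Initial Catalog=Example;Integrated Security=True"))
            {
                var londoners = from c in db.GetTable<Customer>()
                                where c.City == "London"
                                orderby c.Name
                                select c.Name;

                foreach (string name in londoners)
                    Console.WriteLine(name);
            }
        }
    }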
Stick with what you've started with.
A large and long running project often comes with a huge and highly complex code-base. Any change or upgrade to a new version of a library can add bugs in very subtle and unexpected ways.
Also: For large projects the tools and libraries used should have been tested and evaluated in the design-phase. Unless you find a show-stopper or a security issue it's best to not upgrade.
Always remember: Don't change horses in the middle of a stream. :-)
I would say different factors pitch in, like:
Say a piece of software is nearing its end of life - for example, last April Microsoft retired mainstream support for SQL Server 2000 - and your product uses it; then it's wiser to go for the next version of SQL Server in your next release.
Another factor which comes into play is how much value the new features in the latest release of a piece of software would bring to your product. It may well be the case that the new release of the .NET Framework has nothing that adds value to your product; then that does not build a strong case to upgrade.
Budget is also an important factor. You typically need to upgrade licenses in order to step up to the next release unless you are already part of something like Software Assurance.
Training for the team is also a factor. If the latest release is going to add to your product, then you will have to train your team as well.
Well, there could be other telling factors too. These were the ones off the top of my head. I hope it helps.
cheers
If you're talking about a framework-specific example, the biggest piece of advice I'll give you is to keep the system and your application separate. This is why I love patterns such as Model-View-Controller - it keeps your code modular and means you can upgrade sections without breaking the app in its entirety.
On a more practical level, if your framework has a Git or SVN repository, check out the usual 'system' directory from the repo; then you can run 'svn update' occasionally to keep up with the latest and greatest builds.
I would suggest that the project not last that long. Develop the application in smaller pieces with iterations every couple of months. That way, as new technology comes out, you can make the necessary changes and implement updates as you go rather than having to decide to redevelop the whole application. As you say, trying to develop the whole application as things change just doesn't work.
As another poster said, it's certainly a case-by-case basis thing. What you can upgrade and when is determined mostly by how hard or easy it is to test the new version of the system. Having a comprehensive automated test suite for your application helps a lot with this.
Generally, I try to update to the latest stable release of libraries and so on as often as possible, because that makes maintenance easier. If you don't update, you may find yourself patching or working around bugs in the version of the library you are using. If you update less frequently, each update will be more work because you have more changes to deal with, and it's been longer since you last touched the system, and thus you remember less about it.

As a programmer how much are you expected to know outside of programming? [closed]

I'm wondering what you do as a programmer that's not programming but necessary for your task (eg: local setup, server setup, deployment, etc). I'm curious to know how many non-programming related tasks people are performing.
For example, when on web development projects I often:
Install servers
Manage user rights/access to servers
Perform backups
Configure IIS/Apache
Setup FTP sites
On non-web projects I often:
Write build scripts
Setup source code management tools/procedures
Probably more stuff I'm not thinking of
Some tasks are more related to programming than others (such as writing build scripts) but others fall outside of my area of expertise (domain setup comes to mind). Just interested to know how many people perform tasks in their jobs that are not programming related.
The sad reality is that non-technical people look at technical people and expect them to know everything that is technology related, not understanding that there are specializations within technology which we might know nothing about.
I often think it is very much like a doctor that specializes in a particular discipline. All doctors have a baseline of knowledge in the medical field, but will not know the specifics of other specializations (a cardiologist will not know as much about anesthesiology and vice versa).
So while I think it is unreasonable for people to expect technologists to know everything, I do think that it is reasonable for them to expect that we know something when it comes to technology.
I think a more important facet of this question is how much one is expected to know about the specific domain where they apply their skills (finance, manufacturing, etc, etc). I think that is incredibly important, as having that domain knowledge makes them much more valuable as a programmer, as they can understand the problems on a deep level, and as a result, provide more comprehensive solutions for them.
Expected? Almost nothing, but everyone's always really happy when you know more.
The more you know outside the narrow confines of programming, the more valuable you are to your employer.
Things that have come up for me:
requirements gathering
writing use cases
evaluating test plans
negotiating with vendors
tax law
revenue recognition rules
ideas about how users behave
basic economic theory
usability guidelines
differences in consumer behavior in different countries
system administration (being a full on sysadmin)
database configuration, optimization, setup (basically being a DBA)
monitoring systems
networking principles and techniques (you'd be amazed how handy a packet trace can be when debugging something...)
being able to evaluate a business plan written by someone else
image manipulation
how to defuse a situation and avoid arguments
how to corner someone and get them to commit to something when they don't want to
how to choose battles
I think the non-programming skill I use the most in my programming job is writing. It's really crucial to be able to explain ideas, designs, algorithms, and so on, and you can never count on being around to do it in person (or having the time). I spend a good amount of time at work writing up design documents and other documentation so other engineers can get their heads around my code and algorithms. So I'm really thankful that I had good writing classes in school and can put a sentence together. :-)
Probably depends on the size of the company you work for. As someone who has worked mainly at small to medium sized businesses, I've also been responsible for:
database creation, management, and tuning
supporting the internal applications I launch
managing website certificates
setting up external hosting
and I'm sure there's more as well
Well, since a programmer's primary tool is his computer, I think it's fair to assume some expertise with it. Most of those sorts of things you've described are difficult for someone unfamiliar with computers, but pretty easy (even with little prior experience) for someone who understands the domain and knows how to find and read documentation.
In a big, well-organized business or project, I'd expect someone who was more specifically familiar with those sort of administrative things to take care of them. However, if there's not enough of them to warrant a full-time job, then I don't think it's unreasonable to have anyone competent work on it; and programmers are probably at the head of the queue in that regard.
I find the vast majority of "bugs" discovered by users are configuration problems with the systems on which the application is installed. Having developers that understand the common machine and network setup errors is very desirable.
For example, if an application sends email as part of its operation, it's useful to have developers knowledgeable in DNS and SMTP configuration.
Of course it depends on your size of business, large organisations can probably shield developers from this by using other specialists.
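To make the email example concrete, here is a minimal sketch using System.Net.Mail; the host, port, and addresses are placeholders. The point is that when Send() throws, the cause is very often DNS, relay, or firewall configuration rather than the application code, which is exactly where a developer with some SMTP knowledge earns their keep.

    using System.Net.Mail;

    class SendNotification
    {
        static void Main()
        {
            // Placeholder SMTP host and addresses.
            var client = new SmtpClient("smtp.example.com", 25);

            using (var message = new MailMessage(
                "app@example.com", "user@example.com",
                "Nightly import finished",
                "The import completed without errors."))
            {
                // Failures here (unreachable host, relay denied, bad MX records)
                // are usually environment/configuration issues, not code bugs.
                client.Send(message);
            }
        }
    }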
I realized I'm never hired for the actual job, but as a problem solver. Whether I figure out what's going on, and fix it through code, or software, or something on the network, this seems to be the main perception of what clients want.
This will vary greatly depending on where you are. I've worked with people who know none of this stuff, and people who are experts.
Knowing this will help you greatly. In general it's always better to understand the environment your code is running in. Not understanding the context leaves you somewhat helpless.
Additionally there are often bugs that are not code related but configuration related, for example a page not showing up because of the apache configuration. You're very handicapped in debugging if you don't understand the environment.
People around a workplace often expect a programmer to be their IT help desk guy... it happens to me around here. Argh.
Where I work, all developers are expected to be able to use Subversion and have to be able to setup and configure Apache and Tomcat on their PC.
The biggest challenge is not the technical issues associated with getting the environment up and running but the domain knowledge required to effectively develop software in a small shop. For me, I work on a lot of different projects from a variety of sources in a mostly isolated development environment. This means that I need to come up to speed on the domain of the project pretty quickly in order to be effective in developing a solution. In the past I've worked on print accounting solutions, active directory management, research survey databases, and currently a quasi-CRM solution for a charitable organization. I wish I only had to know the nuts and bolts of setting up my development and build environment.
It often depends on the size of the company. In a little company, you have to know how to do everything, including systems admin, and network admin, even if your job is focused on programming.
In a big company, you get to see a little slice of the universe, and they often don't like you peeking outside of your box. Not only do you not need to learn everything, they're often unhappy with you if you try.
However, the more you understand about the machines, how they work, and how they function in an operational environment, the easier it is to diagnose problems and write better software. The more you understand about the domain you're writing applications for, the better you are able to differentiate between the users' needs and their desires.
One of the coolest things about being a software developer is you have a life long excuse for sticking your nose into both the technologies and the various business domains. If you've shifted around to a few different industries, you tend to become loaded down with all sorts of interesting tidbits. There is always more to learn ...
Paul.
It's good to expose yourself to other technologies, but I really think it's a bad idea for you to not fully disclose the fact that you aren't experts in those areas (esp. domain setup). I've worked with people who thought they could do it all but ended up doing those tasks so poorly that with all the time (and money) they've spent trying to get it right, a consultant would have been paid for several times over.
I've worked at a company where I was responsible for everything "related to a computer" including the domain, PCs, database, custom software, builds, MS Office, PowerPoint, Quickbooks...; a mid-size company where it was development and builds; and a large company where I focus solely on the .Net code for my project (someone else handles the database and another handles reporting).
The mid-size company has been the best experience so far (pretty new at the large company) where I was given enough responsibility to feel useful and had easy access to everyone else to ask questions about those other tasks.
You are not alone out there. The position I signed up for was "ASP.NET Web Developer"... However, my job consists of:
Windows Server Administration
Limited Linux Administration (running top to monitor CPU utilization and changing Apache configs)
LDAP Administration / Tuning
MS SQL Server 2005 Administration / Tuning
Database Development
Crystal Reports Development
Perl Scripts
C# Win32 Development
C# / ASP.NET Web Development
Managing User Access Rights for Windows Servers
Limited Network Troubleshooting
Being in a company that is constantly striving for supreme "Operation Effectiveness" my task list only grows by the day. I did not make up that list either. All of the items mentioned above, I have either touched or supported in the past 3 years I have worked in this company.
That being said, in a good development shop, you should have one specific task. As the saying goes, Jack of all trades ... master of none.
This depends greatly on what you're programming. If you're doing low level device drivers, it's vital that you understand the underlying hardware. If you're doing a standalone Java app, the better you understand the JVM and libraries you're using, the better - but it isn't strictly necessary to know a lot.
In general, the more you understand about your system environment, the better. How much your peers and management expect you to know depends on them.
Ignorance will, eventually, be punished. If not by your peers and management, the world will do it. Check any week's headlines or RISKS digest for examples where ignorance of the system environment caused software failures.
[rant mode on]
Ha, the curse of Excel and Word.
Outside work - particularly friends and family but sometimes when consulting or delivering software too, any and all non-technical people expect you to understand these. There's that internal groan when someone asks you across to have a look at a small problem they're having with some facet of Office. And because it's a client and you want to appear helpful you agree.
There's just this blanket expectation that because you're a developer you have an innate knowledge of configuring spreadsheets, fixing Word templates, and any and all other office techie tasks, and furthermore you can cast your eye over some badly configured Office mess and instantly diagnose what the problem is.
I can only just about manage to put together a spreadsheet to schedule my recurring invoices and set up a Word template to write them. I regularly tell people that too - but no-one ever listens.
It depends a lot on the type of software you're currently developing
For example, when I was working on software for a local government, I had to learn things like
What are the rules for registering animals (pets). What are the types of registrations, what discounts apply, what are penalties for not registering on time
How are council rates calculated. How are rates raised yearly (actually, the algorithm for raising yearly rates, and its implementation, was the most complex task I have met so far).
How are building permits issued. What types of inspections can be performed. Who is involved in the process of issuing a building permit (owner, builder, architect, officers etc.)
How often are water meters read. How are water meters assigned to properties, how many dials are on a water meter, how to detach a water meter from one property and to attach to a different one
What are different pension types. What are different discounts that are granted depending on a pension type.
What are different types of receipts. What different types of terminal printers (those that are used to print small receipts) exist and how to print to them.
What are properties, strata children, what are rules for dividing properties into 'parcels' ...
Well, that's just part of the non-programming stuff that I learned during my 2 years on the project. The most unfortunate thing is that, now that I have moved to a different company, there is very little chance that I will ever use any of this knowledge.
My job title is "Senior Software Engineer". In point of fact, for most of the past several years, I did fairly little software development, but did do a lot of:
Systems & web administration
Static web page development with HTML (I don't consider that programming, although I have done PHP, CGI, and JavaScript).
As others have said, help desk sorts of stuff, although not as much as in the past.
As a "task leader", I'm expected to have some people/management skills, although that usually devolves to writing monthly reports. I also get sucked into CMMi stuff from time to time, which in an ideal world might be somewhat relevant, but is usually just record keeping so the employer can bid on new contracts which require it.
Working in a science lab, there's a need to know some of the science, especially if you want/need to work on the code doing the scientific calculations.
Working in a (U.S.) government facility, there's lots of paperwork and a need to know lots of government regulation (e.g. Freedom of Information Act)
Fortunately, I've recently made an internal transfer where I'm doing more development work and less of this other stuff!
Personally, I find that knowing more is always good; it paves the way to the next level. The hardest problems in life are at the integration points. Literally. People focus a lot on specializing, but don't forget that you need people who can straddle both realms.

Visual Studio Team System switching opinions

Assume your .NET-based development team is already using the following set of tools in its processes:
Subversion / TortoiseSVN / VisualSVN (source control)
NUnit (unit testing)
An open source Wiki
A proprietary bug-tracking system that is paid for
You are happy with Subversion and NUnit, but dislike the Wiki and bug-tracking system. You also would like to add some lightweight project-management software (like Fogbugz/Trac) - it does not have to be free, but obviously cheaper is better.
Can you make a compelling argument for adopting VSTS, either to add missing features and replace disliked software or to handle everything (including the source control)? Is the integration of all these features greater than the sum of the parts, or would it simply be better to acquire and replace the parts that you either do not like or do not have?
I remember looking into VSTS a few years ago and thought it was terribly expensive and not really better than many of the free options, but I assume Microsoft has continued to work on it?
VSTS is great, if you do everything in it. Unfortunately the price has not become better over the years. :( The CALs are still ludicrously expensive. The only improvement is that if a person uses only the work item system, and works only with his/her own work items (no peeking at other people's work items!), then there is no need for a CAL. This makes it a bit easier to use as an external bug-report system. Still, it leaves a lot to be desired in this area.
There is one way to alleviate the cost - become a Microsoft Certified Partner. If you are a simple partner, you get 5 VS/TFS licenses for free; if you are a Gold Certified Partner, you get 25 (if memory serves). That should be enough for most companies. But getting Gold status might be tricky, depending on what you do.
If you only dislike those two parts, then perhaps it's better just to find replacements for them rather than for everything. There are many wiki systems out there; some should be to your liking. The same goes for bug tracking.
We are extremely happy with not only the tools, but the integration that Team Foundation Server, and the various Team Editions have given us. We previously used Borland's StarTeam for source control and issue tracking with a 3rd party wiki, the name of which escapes me at the moment.
It came time for us to extend our licensing and support agreement with Borland, only to learn that the cost of adding users to our license and upgrading the product would cost us as much as (a little more than, actually) biting the bullet and making the switch. One thing to consider is that you would normally pay for the development tools to begin with, so the cost is partially absorbed by our budget.
We also did not feel the need for getting Team Suite for every person. You might want to consider it for the developers, but other disciplines don't really have a benefit in using all of the tools in most companies.
We were able to get the appropriate team editions for twelve people, enough CALs for 50 users (for Team Explorer, Teamprise, Team Project Portals, Team Web Access), Teamprise for the five Mac Users that we have, and the Team Foundation Server software itself for under six figures. Considering that includes the developer tools that we normally would be buying, it was a good deal.
The upfront cost on new licensing also covered two years, so we could split the budget between the 2008 and 2009 fiscal years. The very important thing is to make sure not to let the licenses lapse, as the renewals on licenses cost a fraction of the initial cost and also include version upgrades.
As to the features, we are in the process of rolling them out. About half of our department has completed training, and I have already started migrating projects over. The development team absolutely loves the features and the tight integration with their workflow. Version control is a snap, and work items (and their related reporting artifacts) are extensible to the nth degree. The fact that TFS relies heavily on bringing sanity to workflow management helps tie all of the processes together to a level that you just cannot get with multiple vendors.
My absolute favorite thing, though, is the extensibility model. Using the Team Foundation Server API, you can easily write check-in policies, write tools to interface with the system, develop plug-ins, and more. We are already seeing gains in productivity and the quality of our products through a minimal implementation.
Still on the horizon, though, is integrating Team Build. I have yet to set up a build project, but it seems to be seamless and painless. Time will tell... :-)
Edit - I forgot to mention that our migration to TFS includes licensing for the Test Load Agent. The load testing functionality within Team Test is one of, if not the absolute best that I have seen.
Where I'm at, we've settled on the following:
SVN for source control
Redmine for bug-tracking and wiki
NUnit for unit testing
CruiseControl.NET for our build server
Redmine is an open source Ruby on Rails application that supports multiple projects much better than Trac and seems to be much easier to administer. It's definitely worth checking out.
VSTS seems to be way too much money compared to other products. As an additional benefit, you also get the source with open source solutions, which allows you to modify things to fit your needs if the capability isn't there yet.
I'd stick with SVN and use Trac or Bugzilla or FogBugz. You could also do a trial of Team Server. In my opinion it is not worth the money. MS had their chance with version control and they screwed it up a long time ago. Too late to the party if you ask me, and frankly I am not impressed with how they try to control your entire development experience in the IDE with "integration" to the source control. I prefer Perforce/SVN and a separate defect-tracking solution.
With all that said, you probably can't go wrong with any of the following:
bugzilla or trac or fogbugz AND SVN
MS team thingamabob

How to avoid short-lifespan enterprise applications?

A while ago another question referred to the (possibly urban tale) statistic that
... the average lifespan of software is about 3 years
At the time I came up with the following reasons (and I'm sure there are more possibly better ones):
A new major system (ERP, CRM, etc.) is implemented and it has an "integrated" module to replace the old app.
Same, but with no integrated module - and the existing app is not adaptable (the people left, the technology has changed, current IT policies have changed, users don't like the existing app).
The company you acquired the basic app from (to customize it for your needs) has disappeared.
Or you don't get along well with them any more.
The technology for the existing app is "obsolete" (according to the framework vendor/Microsoft/consultant/industry expert/new IT manager who has management's ear.)
"We're phasing out (Windows 95/Windows 98/Windows 2000/Windows XP/NT) and we need matching technology in our apps".
"We've learned a lot from (App Version n) and we'll do a lot better the second/third/fourth/n+1th time."
Job justification for developers/IT manager/Division VP/consulting company.
The users hate it.
We've merged/acquired a competitor/been acquired by a competitor and theirs is better.
Some of these are unavoidable (e.g. your company gets bought), but overall this is surely something that needs to be avoided. Does your organization intentionally fight this syndrome? What effective strategies would you recommend?
That's why an application needs to be easy to expand, and you should be able to easily add-in all the buzzwords.
If you have a solid code base, most of the buzzwords are related to the UI (Vista controls, Ajax, .NET, ASP.NET 3.5)...
You could be running COBOL in the back-end ( I wouldn't).
A new major system is implemented - There's nothing you can do.
Current IT policies have changed - the app should be adaptable.
Users don't like/hate the existing app - why? Cosmetic changes in the UI can fix this most of the time.
The company you acquired the basic app from, to customize it for your needs has disappeared. - I wouldn't do that, I'd prefer to write it myself.
The technology for the existing app is "obsolete" (according to the framework vendor/Microsoft/consultant/industry expert/new IT manager who has management's ear.) - same as the above, if the back-end is solid, you should follow these in the front-end.
"We're phasing out (Windows 95/Windows 98/Windows 2000/Windows XP/NT) and we need matching technology in our apps". - a simple compatibility test and minor UI elements solve this.
I'll also say that this is different when you compare in-house to commercial apps, if you're doing an in-house app, change guarantees your job (if you know what you're doing). If you're doing a commercial app, change is an opportunity to make more money, new features would get you upgrades from existing clients and new clients who are looking for the buzzwords, these buzzword could become your advantage when compared to a competitor.
The average lifetime of software I write at the moment is probably a few days. (I write a lot of scripts, so I might be an aberration. ;-) But the core system I work with is probably 15 to 20 years old now. The underlying OS is about 30 years old. There is nothing inherently wrong with either old or young software. In fact, software ages best when it's possible to adapt it to new uses.
Having layers of abstraction between functional parts makes it easier to replace functionality in a system. For instance, we've gone through several different tape libraries on our system and now we are considering moving to disk archives in the future. Since the "archive" portion of our system sits behind an abstraction layer, we can replace it fairly easily without replacing the rest of the system.
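A small, invented sketch of that abstraction-layer point: the rest of the system depends only on an archive interface, so replacing a tape library with disk storage (or anything else) means adding an implementation rather than touching the callers. The interface and class names here are made up for illustration.

    using System;
    using System.IO;

    public interface IArchive
    {
        void Store(string id, byte[] data);
        byte[] Retrieve(string id);
    }

    // One possible implementation; a TapeArchive or CloudArchive would
    // implement the same interface and the calling code would not change.
    public class DiskArchive : IArchive
    {
        private readonly string root;
        public DiskArchive(string root) { this.root = root; }

        public void Store(string id, byte[] data)
        {
            File.WriteAllBytes(Path.Combine(root, id), data);
        }

        public byte[] Retrieve(string id)
        {
            return File.ReadAllBytes(Path.Combine(root, id));
        }
    }

    class Demo
    {
        static void Main()
        {
            IArchive archive = new DiskArchive(Path.GetTempPath());
            archive.Store("report.dat", new byte[] { 1, 2, 3 });
            Console.WriteLine(archive.Retrieve("report.dat").Length);
        }
    }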
When possible, it's also best to use standard parts. That way, if you run into some limitation, it's likely others will have the same problems and more likely someone will come up with a fix.
Continuous improvement - add useful features at regular intervals
No show-stopping bugs in new versions - testing, testing, testing...
Be nice to your clients and treat them with respect (most users really don't want to change their ERP every three years, so if you have good relations with them they'll be on your side)
Stay current with new technologies and integrate them in your application when needed
When gathering requirements and someone says "Situation X will always be the case, no exceptions", make it configurable. It will always change, no exceptions.
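As a tiny sketch of that last point, assuming a standard app.config and the System.Configuration API - the setting name and default value are made up:

    // app.config (illustrative):
    //   <appSettings>
    //     <add key="MaxLineItems" value="10" />
    //   </appSettings>
    using System;
    using System.Configuration;   // requires a reference to System.Configuration.dll

    class OrderRules
    {
        static void Main()
        {
            // "Orders will never have more than 10 line items, no exceptions."
            // ...until they do. Reading the limit from configuration turns that
            // change into a config edit instead of a recompile and redeploy.
            string raw = ConfigurationManager.AppSettings["MaxLineItems"];
            int maxLineItems = string.IsNullOrEmpty(raw) ? 10 : int.Parse(raw);

            Console.WriteLine("Current limit: " + maxLineItems);
        }
    }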
Most companies don't make it for 5 years. Their software implementations wouldn't be expected to last as long.