How do you handle technology updates in long-running projects?

Let's assume you're in the middle of a long-running project (long-running = several years) and, as expected, several new releases will come up along the way. There might be a new .NET Framework version with brand-new features (e.g. LINQ, Entity Framework, WPF, WF...), a new Visual Studio, the next version of your favorite control library, a new mocking framework, and a lot more.
What are your guidelines for handling these technology updates? Do you adopt them instantly, or do you ignore them until the end of the project? Do you have different guidelines for different things (tools, frameworks, supporting libraries)?

In my experience, these decisions are always made on a case-by-case basis. Several factors are considered, including:
How mature is the new technology? Does the organization like to be at the forefront working with bleeding edge new technologies, or does it prefer to work with proven tools and methodologies?
What skill sets do your people have? Do they align with the new technology, or is more training needed? Will improved productivity outweigh the time it takes to come up to speed?
What investment do you have in the existing technology? What is the cost of moving to the new technology? How much rework and rewriting of code is involved?
What is the requirement? Is it supported by the existing technology, or are new tools needed to fulfill it?
What are the performance expectations? Does the new technology provide a performance improvement that cannot be met with the old technology?
What about the technological culture? Is the organization vendor specific (e.g. a Microsoft shop)? Can open-source code be used?
What is the scope of the project? Is it a large project that would benefit from supporting technologies like frameworks and tools, or is it a small project that would be unduly weighed down and complicated by these things?
How is the new technology supported? Does the vendor have good documentation? Is there someone you can talk to if you have problems? Or are you an organization that has people that know how to solve problems without a support contract?
Is the technology comfortable to work with? Does it seem to make sense? Is it clean and elegant? Do other people seem to like it? Are other people having problems with it?
Is the technology the latest flavor of the week? Has it proven itself on the battlefield to produce tangible results, or is it just a religion?
How much time do you have to learn the new technology and iron out the kinks? Do the benefits outweigh the costs?
As a very brief example, I chose LINQ to SQL for my most recent project because the project was complex enough to warrant an ORM, L2S performs well and is lightweight, we are a Microsoft shop, and my sense is that Entity Framework is not quite ready for prime time (even though Microsoft says it will be the go-to framework in the future).
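To make that concrete, here is a minimal LINQ to SQL sketch; the Customer table, its columns, and the connection string are illustrative assumptions, not details of the project above:

using System;
using System.Linq;
using System.Data.Linq;
using System.Data.Linq.Mapping;

[Table(Name = "Customers")]
public class Customer
{
    [Column(IsPrimaryKey = true)] public int Id;
    [Column] public string Name;
}

class Demo
{
    static void Main()
    {
        // DataContext maps tables to objects; the connection string is made up.
        using (var db = new DataContext(@"Data Source=.;Initial Catalog=Shop;Integrated Security=True"))
        {
            var matches = from c in db.GetTable<Customer>()
                          where c.Name.StartsWith("A")
                          select c;
            foreach (var c in matches)
                Console.WriteLine(c.Name);
        }
    }
}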

Stick with what you've started with.
A large and long running project often comes with a huge and highly complex code-base. Any change or upgrade to a new version of a library can add bugs in very subtle and unexpected ways.
Also: for large projects, the tools and libraries used should have been tested and evaluated in the design phase. Unless you find a show-stopper or a security issue, it's best not to upgrade.
Always remember: Don't change horses in the middle of a stream. :-)

I would say several different factors come into play:
Say a piece of software is nearing its end of life. For example, last April Microsoft retired mainstream support for SQL Server 2000; if your product uses it, then it's wiser to go for the next version of SQL Server in your next release.
Another factor which comes into play is how much value the new features in the latest release would bring to your product. It may well be the case that the new release of the .NET Framework has nothing that adds value to your product, and that does not build a strong case for upgrading.
Budget is also an important factor. You generally need to upgrade licenses to step up to the next release, unless you are already part of something like Software Assurance.
Training for the team is also a factor. If the latest release is going to add to your product, then you will have to train your team as well.
Well, there could be other telling factors too; these were the ones off the top of my head. I hope it helps.

If you're talking about a framework-specific example, the biggest piece of advice I'll give you is to keep the system and your application separate. This is why I love patterns such as Model-View-Controller: it keeps your code modular and means you can upgrade sections without breaking the app as a whole (see the sketch at the end of this answer).
On a more practical level, if your framework has a Git or SVN repository, check out the usual 'system' directory from the repo; then you can run 'svn update' occasionally to keep up with the latest and greatest builds.
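As a minimal sketch of that separation (all names here are invented, not from any particular framework), the application programs against an interface so that the framework-specific piece can be upgraded or swapped in isolation:

using System;

// The application depends only on this interface.
public interface IReportRenderer
{
    string Render(string title, string body);
}

// Framework-specific implementation; a framework upgrade touches only this class.
public class HtmlReportRenderer : IReportRenderer
{
    public string Render(string title, string body)
    {
        return "<h1>" + title + "</h1><p>" + body + "</p>";
    }
}

class App
{
    static void Main()
    {
        IReportRenderer renderer = new HtmlReportRenderer(); // swap here on upgrade
        Console.WriteLine(renderer.Render("Status", "All systems nominal."));
    }
}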

I would suggest that the project not last that long. Develop the application in smaller pieces, with iterations every couple of months. That way, as new technology comes out, you can make the necessary changes and implement updates as you go, rather than having to decide whether to redevelop the whole application. As you say, trying to develop the whole application as things change just doesn't work.

As another poster said, it's certainly a case-by-case basis thing. What you can upgrade and when is determined mostly by how hard or easy it is to test the new version of the system. Having a comprehensive automated test suite for your application helps a lot with this.
Generally, I try to update to the latest stable release of libraries and so on as often as possible, because that makes maintenance easier. If you don't update, you may find yourself patching or working around bugs in the version of the library you are using. If you update less frequently, each update is more work, because you have more changes to deal with and it's been longer since you last touched the system, so you remember less about it.
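One NUnit-style characterization test illustrates the idea; the InvoiceCalculator class and its tax rule are hypothetical stand-ins for whatever behavior a library upgrade must not change:

using NUnit.Framework;

// Hypothetical production class whose behavior we pin down before upgrading.
public class InvoiceCalculator
{
    private readonly decimal _taxRate;
    public InvoiceCalculator(decimal taxRate) { _taxRate = taxRate; }
    public decimal Total(decimal net) { return net * (1 + _taxRate); }
}

[TestFixture]
public class InvoiceCalculatorTests
{
    [Test]
    public void Total_AddsTaxAtConfiguredRate()
    {
        var calc = new InvoiceCalculator(0.20m);
        // If a library or framework upgrade changes this result, the suite catches it.
        Assert.AreEqual(120.00m, calc.Total(100.00m));
    }
}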

Related

Powerbuilder 10.5: upgrade or migrate away?

We've recently started supporting a PowerBuilder 10.5 application, and the question has come up whether we should think about alternatives or keep the app running in PB 10.5. It is a classic PB app: administrative software built on an Oracle DB.
Right now, the app works great, but there are two reasons why we reconsider:
The sole developer of this app is about to retire. He's the only one who has all the PB knowledge to support this app.
We might want to improve the services the app offers, so integrations with other tools are right around the corner.
I'm not very familiar with PB, but I've read that it (only the newest versions) is now supported by Appeon. The latest version is 2017 R3, with a 2019 version coming up.
I'm wondering what the pros & cons are of trying to update the current 10.5 version to the latest version. Is it worth it to update? Or what are the pros & cons of sticking with 10.5?
Or should we consider moving to a newer technology, since so few PowerBuilder developers are to be found nowadays? And if so, what technology would you advise?
Rather than just differences between the older and newer PB versions, I'm looking for motivations to upgrade, migrate, or do nothing at all.
Thanks.
So, there's no clear-cut answer, but we can throw around some ideas on the non-technical bullet points (as requested).
Staying on 10.5: There's a lot to be said for "if it ain't broke, don't fix it." If it works and you're happy with what it does, don't move it.
However, since you've said that you're planning on moving it forward, you might want to consider that 10.5 doesn't support current operating systems (within a year, the only Windows versions still supported by MS will be Win8 and Win10), which were nothing but figments of imagination when 10.5 came out. Your 10.5 app may work on Win10 now, but that's solely because of MS's work on backward compatibility for apps, and because you haven't leveraged an area of PB that had a problem in a then-future version of Windows. If you need to add functionality, being on a version that at least claims to work on your operating system could be helpful.
A parallel argument applies to databases, with one addition: if your app uses SQL Anywhere, the database that used to come free in several PB packages, it is now something you'd have to purchase separately.
One thing to remember about trying to move forward with an old version of anything is support. If you get stuck, the vendor will basically not talk to you, and the peer community has been shrinking, so you've got less chance of getting into a dialog with fellow developers.
Upgrading: Upgrading is usually a minor effort. The most frequent exceptions I've seen: deprecated functionality, and code that depends on behaviours that didn't stay consistent between versions (some behaviours are promised to stay consistent, but not all). Run a migration test with a trial version, together with your PB expert, to get that question off the table.
One thing to keep in mind when upgrading is that the licensing model has changed. PB used to have a perpetual model (buy once, use forever), but it's now a subscription model. Whether this is an improvement for you or not is up to you to figure out.
Whether it is "worth it" to upgrade usually boils down, in my mind, to:
OS support
DB support
vendor support
peer support
deprecated features, and whether I use them
new features, and whether I would use them (and you asked us not to discuss these last two items, which need to be weighed very individually anyway, and are well documented on Appeon's site)
"Migrating": I've put "migrating" in quotes, because I don't believe there's a technology that lets you "migrate" in the sense of a code translation. (I'll let you read one of my old tirades about wanting to "migrate" off PB.) What I'll talk about here is rewriting in a new technology. Both pulling business rules out of an old PB system and redesigning/rewriting in another technology is a big effort.
The biggest argument in favour these days is getting and keeping PowerBuilder talent. Getting people with PB under their belt is hard, and judging legitimate talent is challenging, even with someone who knows PB on your side of the interview table. (Leverage your retiring guy if you want to move forward with PB.) Training someone in PB is no small task either. Someone once asked me, not an educator, if I could come up with a course and train his team in a week. I laughed. After a two-week course designed and given by professional educators from the then-vendor Powersoft, I came home and wrote incredibly embarrassing code. I also needed lots of time practicing and getting feedback from my peers. Even if you can hire or train someone, if they only do PB work a couple of weeks per year, those PB "muscles" will atrophy. Whatever the technical arguments of PB versus something else, if you can't get PB talent to maintain it, PB is a dead end.
I'm afraid I'm not one to suggest an alternative technology. It used to be that, in terms of rich client apps, you couldn't go wrong choosing Microsoft, but since then MS has sent the development community on some wild goose chases that have ended in deprecated technologies. I wouldn't want to be the guy guessing at the future.
Good luck.
I would recommend migrating.
You will find several companies that offer migration to both Java and .NET, which are the leading platforms.
In terms of UI, for me the only current option is the web. Using other technologies does not make a lot of sense.
If your company uses a lot of the MS stack (Windows, SQL Server, Exchange, SharePoint, etc.), I would recommend migrating to C#; otherwise, migrating to Java makes more sense.
Terry's answer is quite good, but the point about migration was not addressed with respect to the new features in PowerBuilder 2019.
One major feature of PowerBuilder 2019 is a C# DataStore (compatible with .NET Core) together with a DataWindow object migration utility. The C# DataStore has the same APIs and transaction mechanism as the PowerScript DataStore. It is documented in detail on the Appeon website: https://www.appeon.com/support/documents/appeon_online_help/powerbuilder/api_reference/PowerBuilder.Data/DataStore/IDataStore/IDataStore.html
Should you decide C# is the way to go, this feature of PowerBuilder 2019 makes the migration effort a "port" of the PowerScript non-visual code rather than a rewrite (for the reasons mentioned above).
Here is example PowerScript code:
public function datastore of_retrieve (date ad_start, date ad_end, decimal adec_amt);
    Datastore lds
    lds = Create Datastore
    lds.dataobject = "d_order_customer"
    lds.SetTransObject(SQLCA)
    lds.Retrieve(ad_start, ad_end, adec_amt)
    Return lds
end function
Here is the same example in C# using the C# DataStore:
public IDataStore GetOrderCustomerInfo(DateTime startDate, DateTime endDate, decimal amount)
{
    // _context is the database context the application supplies (see Appeon's docs).
    IDataStore dataStore = new DataStore("d_order_customer", _context);
    dataStore.Retrieve(startDate, endDate, amount);
    return dataStore;
}

Choosing between dnx451 and dnxcore 50 for Azure Web App in terms of functionality, performance, etc

I am creating a new project that will run in an Azure Web App on the new ASP.NET 5. We are not planning to run it on Linux or anything like that, at least for now. So the question is: should I try to keep both frameworks if possible, just in case, or should I prefer one of them? For example, there are far fewer dependencies that I can use with dnxcore50, which is not so nice. So the main question is: are there any benefits of using dnxcore50 over dnx451 when running in an Azure Web App, such as performance, stability, etc.?
I have to start by saying that I'm still a beginner in ASP.NET 5 (like most others), which is why I didn't post my answer before, and you should ignore my reputation, because it comes from other subjects that I know better.
I think everybody who switches to ASP.NET 5 asks the same question: whether it makes sense to keep both frameworks in their projects. Below are my personal thoughts on the subject.
My personal choice is my short recommendation to you: keep both frameworks until you find some really important reason to drop one of them.
ASP.NET 5 is still not final. The strategy is not fully fixed and could change on short notice. Just some examples: previous beta versions supported "Helios" as an option for hosting ASP.NET 5 applications on IIS; the option was dropped later (see the statement). Even the name dnxcore50 is now being renamed to dotnet5.4, at least in all internal Microsoft components (see the announcement). One can suppose that other things could change in the future too. Thus I think putting all your eggs in one basket would be too dangerous right now: keeping both frameworks reduces the risk.
The next thing I found was the following: dnxcore50 (dotnet5.4, or CoreFX, the .NET Core foundational libraries) doesn't support many features that the full .NET Framework does. One important example for me was the missing XSD schema validation (see here and here). I use XML only in combination with XSD schema validation; I prefer JSON in most other cases. Keeping both frameworks in your project can help you locate the parts of your code that are not yet implemented in CoreFX, so that you can move that code into a separate component or change the implementation (see the sketch below).
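As an illustration, here is a hedged sketch of fencing off such a gap behind the per-framework compilation symbols that a dual-target build defines; the XmlChecker class name and file paths are invented for this example:

// project.json targets both frameworks, e.g. "frameworks": { "dnx451": { }, "dnxcore50": { } },
// which defines the DNX451 / DNXCORE50 compilation symbols for each build.
#if DNX451
using System.Xml;
using System.Xml.Schema;
#endif

public static class XmlChecker
{
    // Validates xmlPath against xsdPath on the full framework;
    // on dnxcore50 the feature is missing, so we degrade explicitly.
    public static bool TryValidate(string xmlPath, string xsdPath)
    {
#if DNX451
        var settings = new XmlReaderSettings { ValidationType = ValidationType.Schema };
        settings.Schemas.Add(null, xsdPath);
        using (var reader = XmlReader.Create(xmlPath, settings))
        {
            while (reader.Read()) { }  // throws XmlSchemaValidationException on violations
        }
        return true;
#else
        return false;  // not supported on this framework; the caller must handle it
#endif
    }
}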
About performance: one should distinguish the potential of both frameworks from the current implementations. In general, CoreFX was redesigned and decomposed. Many parts of the one big mscorlib were separated out or removed (remoting, AppDomains, and so on). That means the performance of CoreFX should be better; a factored API can, in theory, provide better performance. Moreover, one can more easily improve a single part of CoreFX and publish a new version with improved performance. Having many modules instead of one monolith gives us a new path for improving performance and fixing bugs. On the other side, replacing dependencies with new versions can introduce new compatibility problems, which increases risk and can decrease stability. By keeping both frameworks we can test whether a new problem exists in the alternative framework as well, which lets us tell whether the latest dependency changes, rather than the latest changes to our own code, are the origin of new problems.
I could continue with pros and cons of each framework, but nobody likes reading long text, and all my arguments lead me to the same practical decision: keep both frameworks in my projects by default until I find a real requirement to drop one of them.
No major advantages so far, really.
This might change in the future, which is why I'm planning to target both (CoreCLR and .NET 4.6). A lot of investment is going into CoreCLR, but also into Docker and Service Fabric.
Just my 2 cents.

What is "Enterprise ready"? Can we test for it?

There are a couple of questions on Stack Overflow asking whether a given technology (Ruby, Drupal) is 'enterprise ready'.
I would like to ask how is 'enterprise ready' defined.
Has anyone created their own checklist?
Does anyone have a benchmark that they test against?
"Enterprise Ready" for the most part means can we run it reliably and effectively within a large organisation.
There are several factors involved:
Is it reliable?
Can our current staff support it, or do we need specialists?
Can it fit in with our established security model?
Can deployments be done with our automated tools?
How easy is it to administer? Can the business users do it or do we need a specialist?
If it uses a database, is it our standard DB, or do we need to train up more specialists?
Depending on how important the system is to the business the following question might also apply:
Can it be made highly available?
Can it be load balanced?
Is it secure enough?
Open-source projects often do not pay enough attention to the difficulties of deploying and running software within a large organisation. For example, most OSS projects default to MySQL as the database, which is a good and sensible choice for most small projects; however, if your enterprise has an Oracle site license and a team of highly skilled Oracle DBAs in place, the MySQL option looks distinctly unattractive.
To be short:
"Enterprise ready" means: If it crashes, the enterprises using it will possibly sue you.
Most of the time the "test", if it may really be called such, is that some enterprise (= large business) has deployed a successful and stable product using it. So it's more like saying it has proven its worth on the battlefield. In other words, the framework has (or has not) been used successfully in the real world; you can't just follow some checklist and run load tests and call it enterprise ready.
Like Robert Gould says in his answer, it's "Enterprise-ready" when it's been proven by some other huge project. I'd put it this way: if somebody out there has made millions of dollars with it and gotten written up by venture-capitalist magazines as the year's (some year, not necessarily this one) hottest new thing, then it's Enterprise-ready. :)
Another way to look at the question is that a tech is Enterprise-ready when a non-tech boss or business owner won't worry about whether or not they've chosen a good platform to run their business on. In this sense Enterprise-ready is a measure of brand recognition rather than technological maturity.
Having built a couple of "Enterprise" applications...
Enterprise outside of development means that if it breaks, someone can fix it. I've worked with employers/contractors that stick with quite possibly the worst managed-hosting providers, data vendors, and such, because those vendors will fix problems when they crop up, even if they crop up a lot, and there is someone to call when things break.
So to restate it another way: Enterprise software is Enterprisey because it has support options available. A simple example: jQuery isn't enterprisey while ExtJS is, because ExtJS has a corporate support structure behind it. (Yes, I know comparing these two frameworks is like comparing a toolset to a factory-manufactured home kit.)
As my day job is all about enterprise architecture, I believe that the word enterprise isn't about size or scale nowadays, but refers more to how a software product is sold.
For example, Ruby on Rails isn't enterprise because there is no vendor that will come into your shop and give PowerPoint presentations repeatedly to the developer community. Ruby on Rails doesn't have a sales executive who takes me out to the golf course or to my favorite restaurant for lunch. Ruby on Rails also isn't deeply covered by industry analyst firms such as Gartner.
Ruby on Rails will never be considered "enterprise" until these things occur...
In my experience, the "Enterprise ready" label is an indicator of managers' fear of adopting an open-source technology, possibly balanced by a desire not to lag behind on that technology.
This may be objectively argued with considerations such as support from a third-party company or integration with existing development tools.
I suppose an application could be considered "enterprise ready" when it is stable enough that a large company would use it. It would also imply some level of support for when it does inevitably break.
Whether or not something is "enterprise ready" is entirely subjective, undefined, and rather buzzwordy. Basically, you can't have a test_isEnterpriseReady() - just make your application as reliable and efficient as it can be.

Visual Studio Team System switching opinions

Assume your .NET-based development team is already using the following set of tools in its processes:
Subversion / TortoiseSVN / VisualSVN (source control)
NUnit (unit testing)
An open source Wiki
A proprietary bug-tracking system that is paid for
You are happy with Subversion and NUnit, but dislike the Wiki and the bug-tracking system. You would also like to add some lightweight project-management software (like FogBugz or Trac) - it does not have to be free, but obviously cheaper is better.
Can you make a compelling argument for adopting VSTS, either to add missing features and replace disliked software or to handle everything (including the source control)? Is the integration of all these features greater than the sum of the parts, or would it simply be better to acquire and replace the parts that you either do not like or do not have?
I remember looking into VSTS a few years ago and thought it was terribly expensive and not really better than many of the free options, but I assume Microsoft has continued to work on it?
VSTS is great, if you do everything in it. Unfortunately the price has not gotten better over the years. :( The CALs are still ludicrously expensive. The only improvement is that if a person uses only the work-item system, and works only with his/her own work items (no peeking at other people's work items!), then there is no need for a CAL. This makes it a bit easier to use as an external bug-report system. Still, it leaves a lot to be desired in this area.
There is one way to alleviate the cost: become a Microsoft Certified Partner. If you are a simple partner, you get 5 VS/TFS licenses for free; if you are a Gold Certified Partner, you get 25 (if memory serves). That should be enough for most companies. But getting the Gold status might be tricky, depending on what you do.
If you only dislike those two parts, then perhaps it's better just to find replacements for them instead of replacing everything? There are many wiki systems out there; some should be to your liking. The same goes for bug tracking.
We are extremely happy with not only the tools, but the integration that Team Foundation Server, and the various Team Editions have given us. We previously used Borland's StarTeam for source control and issue tracking with a 3rd party wiki, the name of which escapes me at the moment.
It came time for us to extend our licensing and support agreement with Borland, only to learn that adding users to our license and upgrading the product would cost as much as (a little more than, actually) biting the bullet and making the switch. One thing to consider is that you would normally pay for the development tools to begin with, so the cost is partially absorbed by the budget.
We also did not feel the need for getting Team Suite for every person. You might want to consider it for the developers, but other disciplines don't really have a benefit in using all of the tools in most companies.
We were able to get the appropriate team editions for twelve people, enough CALs for 50 users (for Team Explorer, Teamprise, Team Project Portals, Team Web Access), Teamprise for the five Mac Users that we have, and the Team Foundation Server software itself for under six figures. Considering that includes the developer tools that we normally would be buying, it was a good deal.
The upfront cost on new licensing also covered two years, so we could split the budget between the 2008 and 2009 fiscal years. The very important thing is to make sure not to let the licenses lapse, as the renewals on licenses cost a fraction of the initial cost and also include version upgrades.
As to the features, we are in the process of rolling them out. About half of our department has completed training, and I have already started migrating projects over. The development team absolutely loves the features and the tight integration with their workflow. Version control is a snap, and work items (and their related reporting artifacts) are extensible to the nth degree. The fact that TFS relies heavily on bringing sanity to workflow management helps tie all of the processes together to a level that you just cannot get with multiple vendors.
My absolute favorite thing, though, is the extensibility model. Using the Team Foundation Server API, you can easily write check-in policies, write tools that interface with the system, develop plug-ins, and more. We are already seeing gains in productivity and product quality from a minimal implementation. (A small sketch of the API follows.)
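For a flavor of that API, here is a hedged sketch that queries active bugs through the work-item store; the server URL and project name are made up, and it assumes the TFS SDK client assemblies (Microsoft.TeamFoundation.Client, Microsoft.TeamFoundation.WorkItemTracking.Client):

using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

class ActiveBugReport
{
    static void Main()
    {
        // Connect to the project collection (URL is illustrative).
        var tpc = new TfsTeamProjectCollection(new Uri("http://tfsserver:8080/tfs/DefaultCollection"));
        var store = tpc.GetService<WorkItemStore>();

        // WIQL query: all active bugs in a (hypothetical) team project.
        WorkItemCollection results = store.Query(
            "SELECT [System.Id], [System.Title] FROM WorkItems " +
            "WHERE [System.TeamProject] = 'MyProject' " +
            "AND [System.WorkItemType] = 'Bug' AND [System.State] = 'Active'");

        foreach (WorkItem wi in results)
            Console.WriteLine("{0}: {1}", wi.Id, wi.Title);
    }
}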
Still on the horizon, though, is integrating Team Build. I have yet to set up a build project, but it seems to be seamless and painless. Time will tell... :-)
Edit - I forgot to mention that our migration to TFS includes licensing for the Test Load Agent. The load-testing functionality within Team Test is one of the best, if not the absolute best, that I have seen.
Where I'm at, we've settled on the following:
SVN for source control
Redmine for bug-tracking and wiki
NUnit for unit testing
CruiseControl.NET for our build server
Redmine is an open source Ruby on Rails application that supports multiple projects much better than Trac and seems to be much easier to administer. It's definitely worth checking out.
VSTS seems to be way too much money compared to other products. As an additional benefit of open-source solutions, you also get the source, which allows you to modify things to fit your needs if a capability isn't there yet.
I'd stick with SVN and use Trac, Bugzilla, or FogBugz. You could also do a trial of Team Server. In my opinion it is not worth the money. MS had their chance with version control and they screwed it up a long time ago. Too late to the party, if you ask me, and frankly I am not impressed with how they try to control your whole development experience in the IDE with "integration" to the source control. I prefer a Perforce/SVN plus separate defect-tracking solution.
With all that said, you probably can't go wrong with any of the following:
Bugzilla, Trac, or FogBugz, plus SVN
MS team thingamabob

How to avoid short-lifespan enterprise applications?

A while ago another question referred to the (possibly apocryphal) statistic that
... the average lifespan of software is about 3 years
At the time I came up with the following reasons (and I'm sure there are more possibly better ones):
A new major system (ERP, CRM, etc.) is implemented and it has an "integrated" module to replace the old app.
Same, but with no integrated module - and the existing app is not adaptable (the people left, technology has changed, current IT policies have changed, users don't like the existing app).
The company you acquired the basic app from, to customize it for your needs, has disappeared.
Or you don't get along well with them any more.
The technology for the existing app is "obsolete" (according to the framework vendor/Microsoft/consultant/industry expert/new IT manager who has management's ear.)
"We're phasing out (Windows 95/Windows 98/Windows 2000/Windows XP/NT) and we need matching technology in our apps".
"We've learned a lot from (App Version n) and we'll do a lot better the second/third/fourth/n+1th time."
Job justification for developers/IT manager/Division VP/consulting company.
The users hate it.
We've merged/acquired a competitor/been acquired by a competitor and theirs is better.
Some of these are unavoidable (e.g. your company gets bought), but overall this is surely something that needs to be avoided. Does your organization intentionally fight this syndrome? What effective strategies would you recommend?
That's why an application needs to be easy to extend, and you should be able to easily add in all the buzzwords.
If you have a solid code base, most of the buzzwords relate to the UI (Vista controls, Ajax, .NET, ASP.NET 3.5)...
You could be running COBOL in the back-end (I wouldn't).
A new major system is implemented - There's nothing you can do.
current IT policies have changed - the app should be adaptable.
users don't like/hate the existing app - why? Cosmetic changes to the UI can fix this most of the time.
The company you acquired the basic app from, to customize it for your needs, has disappeared. - I wouldn't do that; I'd prefer to write it myself.
The technology for the existing app is "obsolete" (according to the framework vendor/Microsoft/consultant/industry expert/new IT manager who has management's ear.) - same as above; if the back-end is solid, you can follow these trends in the front-end.
"We're phasing out (Windows 95/Windows 98/Windows 2000/Windows XP/NT) and we need matching technology in our apps." - a simple compatibility test and minor UI changes solve this.
I'll also say that this is different when you compare in-house apps to commercial apps. If you're doing an in-house app, change guarantees your job (if you know what you're doing). If you're doing a commercial app, change is an opportunity to make more money: new features get you upgrades from existing clients and new clients who are looking for the buzzwords, and those buzzwords can become your advantage over a competitor.
The average lifetime of software I write at the moment is probably a few days. (I write a lot of scripts, so I might be an aberration. ;-) But the core system I work with is probably 15 to 20 years old now. The underlying OS is about 30 years old. There is nothing inherently wrong with either old or young software. In fact, software ages best when it's possible to adapt it to new uses.
Having layers of abstraction between functional parts make it easier to replace functionality in a system. For instance, we've gone through several different tape libraries on our system and now we are considering going to disk archives in the future. Since the "archive" portion of our system sits behind an abstraction layer, we can replace it fairly easily without replacing the rest of the system.
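In code, that kind of abstraction layer might look like the following sketch; the IArchive interface and both implementations are invented for illustration:

using System;
using System.Collections.Generic;
using System.IO;

// The rest of the system depends only on this interface.
public interface IArchive
{
    void Store(string id, byte[] data);
    byte[] Fetch(string id);
}

// Current backend: stand-in for the tape library (in-memory here).
public class TapeLibraryArchive : IArchive
{
    private readonly Dictionary<string, byte[]> _tapes = new Dictionary<string, byte[]>();
    public void Store(string id, byte[] data) { _tapes[id] = data; }
    public byte[] Fetch(string id) { return _tapes[id]; }
}

// Future backend: disk archiving; callers never change.
public class DiskArchive : IArchive
{
    private readonly string _root;
    public DiskArchive(string root) { _root = root; Directory.CreateDirectory(root); }
    public void Store(string id, byte[] data) { File.WriteAllBytes(Path.Combine(_root, id), data); }
    public byte[] Fetch(string id) { return File.ReadAllBytes(Path.Combine(_root, id)); }
}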
When possible, it's also best to use standard parts. That way, if you run into some limitation, it's likely others will have the same problems and more likely someone will come up with a fix.
Continuous improvement - add useful features at regular intervals
No show-stopping bugs in new versions - testing, testing, testing...
Be nice to your clients and treat them with respect (most users really don't want to change their ERP every three years, so if you have good relations with them, they'll be on your side)
Stay current with new technologies and integrate them into your application when needed
When gathering requirements and someone says "Situation X will always be the case, no exceptions", make it configurable. It will always change, no exceptions. (See the sketch below.)
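As a tiny sketch of that idea, here is one way to push an "eternal truth" into configuration rather than code; the setting name and default are invented, and it needs a reference to the System.Configuration assembly:

using System;
using System.Configuration;

class ShippingRules
{
    static void Main()
    {
        // app.config:
        //   <appSettings>
        //     <add key="MaxParcelWeightKg" value="30" />  <!-- "will never change"... -->
        //   </appSettings>
        string raw = ConfigurationManager.AppSettings["MaxParcelWeightKg"];
        decimal maxWeightKg = raw == null ? 30m : decimal.Parse(raw);
        Console.WriteLine("Max parcel weight: {0} kg", maxWeightKg);
    }
}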
Most companies don't make it past five years, so their software implementations wouldn't be expected to last any longer.