Single developer-to-do-all SDLC activities -- How should I proceed further?

I work at a client's (non-IT) site as a .NET programmer, alone, and have been asked to develop a Windows application. There is no project manager, no SRS, no technical people to lead, etc.
I get requirements directly from the customer on an as-needed basis. They keep changing and carry a lot of ambiguity. Since the client does not understand the need to freeze requirements, this becomes a huge headache to deal with. I have to document the requirements myself, and also do the coding, testing, bug-fixing, build delivery, and user education, all on my own. I report to a boss who is non-technical and often doesn't understand these problems.
So it has become a single-developer-does-all-SDLC-activities situation. How should I proceed in this work environment?

Start by making demands on your environment, and on what is asked of you:
Demand that requirements and deadlines are fixed and agreed upon, in writing, before you write a single line of code.
Demand that you are given enough time for testing and bugfixing in the development cycle.
Demand that you are given time to set up source control, automated builds, etc. (whatever you feel you need in your development environment to promote effective work).
Demand that you are given time to write documentation, so that you can spend more time writing code and less time doing application demos.
Continue with backing it up:
Document and show your boss some statistics on how you use your time. If it turns out you spend much less of it than expected on actually writing code, maybe he'll consider handing some of the less programming-related tasks to another member of the department.
And finally, remember that this is not the only company in the world:
Robert L. Read has a very good point in his How to be a Programmer: A Short, Comprehensive and Personal Summary: under Recognizing when to go home, he simply states:
Quit if you have to.
This might not be a very compelling option, but should it come to it, leave the company for the greener grass on the other side. Even telling your boss you're ready to quit if your working conditions don't improve might help you actually get what you want. I doubt that your company wants to be left with a software product that suddenly can't be supported or updated because its only developer quit.

Count your blessings, I'd say. Usually all the people standing between the developers and the users are just getting in the way of making successful software.
I think it is a good idea to adopt some agile tooling to organize yourself, like a scrum whiteboard, and by defining sprint periods/iterations. That will allow you to manage your boss's and users' expectations, and still give them control over what should get priority. Don't forget to schedule the SDLC tasks, so you can make them visible to your boss. Feel free to treat agile tooling as a supermarket: take what you think is useful and keep the rest in mind for later consideration.
As far as requirements documentation is concerned, I'd keep it very high level. I would not mind skipping it altogether but I can imagine that it feels sloppy, and it is perhaps also a way to document your achievements.

A combination of educating both your customer and your boss, and taking an agile approach, could be helpful here. It depends on how this project is billed to the customer.
If the customer is getting a fixed-price deal yet is allowed to change the specs, then educate your boss (or whoever is accountable for the financial results of the project) about the implications: the customer gets to ask for whatever they want without needing to pay more. If the project isn't time-boxed, your boss is giving away unlimited developer time at a fixed price. Make that clear. If the project is time-boxed, explain that changing means redoing, and that there's only so much redoing before you run out of time. If he doesn't see that this is a problem, document your time use.
It's the equivalent of going to a car-repair shop, agreeing a price, and then pushing the mechanic to not only fix your air conditioning (the original scope) but also change your oil, uprate your suspension, and do a full engine overhaul. In the long run, expect the customer to demand that the car fly, solve world hunger, and bring world peace.
If you're on a billable-hours project, then you're in more trouble. Your boss may not have any incentive for the customer to make reasonable demands; he may just care about you being effectively contracted out to a customer and bringing in revenue. In that case, take charge of the project by agreeing an agile methodology with the customer, so you can at least deliver something that addresses some customer needs. Feel free to take charge; it seems you're the de facto manager. Just make sure you understand the terms of the contract for this project and work within those boundaries. If the contract is a bad deal, alert your boss, but your company will need to ride it out or renegotiate.
Work in two-week sprints, and show both your boss and your client the ratio of functionality/features delivered vs. overhead (rework) vs. other work (training, ...). It may become clear quite quickly that your project is under-resourced, or that the demands are too high for the price agreed. Track it in a spreadsheet, or use a lightweight agile project management tool like TargetProcess.
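If a full tool feels like too much, even a short script over an exported time log produces that ratio. A minimal sketch in Python -- every number and category name below is made up for illustration:

```python
from collections import defaultdict

# Hypothetical time log: (sprint, category, hours) rows, e.g. exported
# from the spreadsheet. Categories follow the split suggested above.
time_log = [
    ("sprint-1", "features", 42.0),
    ("sprint-1", "rework",   18.5),
    ("sprint-1", "other",    12.0),   # training, demos, support...
    ("sprint-2", "features", 30.0),
    ("sprint-2", "rework",   31.0),
    ("sprint-2", "other",    15.5),
]

def sprint_ratios(log):
    """Return {sprint: {category: share_of_total_hours}} for reporting."""
    totals = defaultdict(lambda: defaultdict(float))
    for sprint, category, hours in log:
        totals[sprint][category] += hours
    return {
        sprint: {c: round(h / sum(cats.values()), 2) for c, h in cats.items()}
        for sprint, cats in totals.items()
    }

for sprint, shares in sprint_ratios(time_log).items():
    print(sprint, shares)
# Rework overtaking features in sprint-2 is exactly the kind of trend
# worth putting in front of your boss and client.
```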
If the customer is unworkable and your boss only sees you as somebody to pimp out, reconsider whether you want to work in such a place and whether there is any particular reason you're spending your professional time at your current company rather than another.
Keep in mind that you could be in a reasonably strong position to push for some change to improve the situation. If you're the only developer in a non-IT shop and you quit, your company will struggle to fulfill its obligations to its customer; your boss, unless he's a halfwit, will be mindful of that. Of course, threatening to quit is the nuclear option; don't play that card lightly.

What I usually do in such situations -- and I come across these situations much more often than I'd like -- is to demand a certain minimum. You can only demand something if you have something you can use to pressure your "Boss". In a single-developer-does-all-and-can-do-all situation, your means of pressure is yourself.
There are some countries in the world where employees are badly protected and you have to be careful. For any other country, this almost becomes a no-brainer: simply demand minimum working conditions.
This means: you make a short-list of things you need. Keep it simple. Keep it — almost — free. Don't come with all kinds of procedures. Use a simple bug-tracking system you can also use for planning, report, feature-tracking and new-development (Jira comes to mind). Both your Boss and your Client should be taught to use it. For yourself you probably want to add source control if you don't have it already.
Now comes the tricky part: for a short while, become very strict. Use the comment threads of your tracking system for communication. Your client will continue to call you or e-mail you. Let him. But copy everything to the comment threads and write your answer there. Send the guy a link to the thread as an answer.
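For illustration, here is roughly what that copying step can look like if the tracker is Jira (the instance URL, issue key, and email contents below are hypothetical; the endpoint is Jira's v2 REST comment API, so adjust for your own tracker and version):

```python
import requests  # third-party HTTP library: pip install requests

JIRA_URL = "https://tracker.example.com"  # hypothetical instance
ISSUE_KEY = "PROJ-123"                    # hypothetical issue

def copy_mail_to_thread(sender, subject, body, auth):
    """Append a customer email to the issue's comment thread and
    return the link to paste into your reply."""
    comment = f"Email from {sender} -- {subject}\n\n{body}"
    resp = requests.post(
        f"{JIRA_URL}/rest/api/2/issue/{ISSUE_KEY}/comment",
        json={"body": comment},
        auth=auth,       # (username, api_token) basic auth
        timeout=10,
    )
    resp.raise_for_status()  # fail loudly if the tracker rejects it
    return f"{JIRA_URL}/browse/{ISSUE_KEY}"
```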
Your Boss may not like this because he thinks it will slow things down. Tell him it will become clearer what can and will be done. It will also become clearer where his money (i.e., your time) went. Tell him that you want to keep an overview yourself, but that it's good for him too. If he's not convinced, tell him to give me a call, and then propose that he lets you do it 100% your way for two months. He'll make it one month, and then you have a deal.
It's a tough game out there. But it's doable. Propose a few simple steps towards bettering your environment, communication and tracking. If he refuses, you refuse to do anything more and he'll be stuck with an even worse situation.


How do you respond to the argument "No time to test/develop clean code, because of the deadline"?

Ok, I think this question is in the wrong place, and I'll head over to https://softwareengineering.stackexchange.com/ to read/ask about it. Thanks all for your answers up to this point. :)
Apologies ;) I'm sorry if this question is a little bit subjective, but I cannot come up with a better title. I'll correct it if you know something better.
In my organization there is a lot of buzz about this whole automated testing and continuous integration thing, but one argument I constantly hear is this:
How should I develop good, clean, easy to maintain code and write unit tests, if the
deadline is already set and it is only half of my estimate?
I'm a developer myself, so I can understand this. But I always try to respond that not only the developers need a paradigm shift; the management does too.
If you are a developer and your estimates are cut in half, you are not going anywhere, no matter what you estimate and no matter how complex or trivial your problems are. You need the backing of the management guys, the One Guy who is giving the money.
Conclusion?
Can you give me some help, may it be a good URL to read about this development/management conflict, a book or maybe a personal insight? Did you survive a large process shift like this in a Waterfall company that is now doing Lean development? Or do you know this argument and have a clever answer to it?
And please, help me rename or move this question. :-)
Update
Thanks for all the answers already! :) I think I have to make clear that my point wasn't the "do it twice as fast" statement from management. It's about the negative point of view a developer takes on when confronted with that statement.
Is there anything I can do to help people understand that this is not the default in software development? That the PM is not actively preventing good code from being written, and that maybe both sides need a bit more education about the pros/cons of clean code bases, good coverage, and lots of automated tests?
One good example is Technical Debt. It's manager-friendly. Imagine your credit card: if you accrue debt for a few weeks, that can be helpful. You don't need to carry around cash for daily purchases, and you pay it off at the end of the month.
This is like a crunch before a release. You take on some debt and then pay it back soon. If you keep charging things and never paying off that debt, it starts to compound. That new feature you want is more difficult to build because the foundation you're building on is unsound. The debt you've accumulated is keeping you from acting quickly. If you're over your limit, even typical small purchases won't go through.
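To make the compounding concrete, here is a deliberately crude toy model; every number in it is invented, but the shape of the curve is the point:

```python
# Toy model: each sprint adds one unit of shortcut "debt", old debt
# accrues interest, and servicing it eats into delivery capacity.
def effective_velocity(sprints, base=10.0, shortcut=1.0, interest=0.15):
    debt = 0.0
    for n in range(1, sprints + 1):
        debt = debt * (1 + interest) + shortcut
        velocity = max(base - debt * interest, 0.0)
        print(f"sprint {n:2d}: debt={debt:5.1f}  velocity={velocity:4.1f}")

effective_velocity(10)
# By sprint 10 roughly a third of each sprint goes to servicing old
# shortcuts instead of new features -- the compounding described above.
```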
You might also want to take a look at Facts and Fallacies of Software Engineering. It talks about estimates and the trouble they can cause when they're not reviewed as the project evolves.
It may sound defeatist to say this, but I've worked in a few shops that had this issue, and they never changed -- or, more accurately, I found that it was not possible to change the system from within.
The issue is that, from the perspective of the management that insists on this type of development, as long as the product is being released approximately on time and the customers are buying it, the goal is accomplished. To put it another way: as long as you are making money, quality does not matter.
Now, you, I, and experienced management understand the long term cost of technical debt. It may be possible to explain to a rational manager the cost of technical debt, the compounding reduction in return on investment in programmer time (by far the most expensive part of a software project), and the fact that a clean, well designed, well tested code base means that new features can be implemented more quickly, and that more time can be spent on new features instead of fixing bugs- leading to a long term improvement in the mean time between releases.
It may be possible to explain this to your management, but every place I've worked that had these issues required a critical failure before they wised up. This usually involved a large portion of the team quitting in frustration, or a large drop in sales as quality diminished due to unrealistic scheduling (in turn leading to massive layoffs). Either way, I've only ever heard of organizations changing after such a failure, not before it.
In short, try to explain the cost of technical debt, and the benefit of a clean codebase. Explain it in terms of sales, releases, and customer satisfaction, instead of from a technical perspective. If that doesn't work, start looking for a new job, because poor management leads to a poor product, and a poor product reflects poorly on you as a developer.
What do you mean, your "estimates are cut in half"? Do you mean you give an estimate, and management says, "No, do it in half that time"? That is unacceptable.
Someone must push back against management. (I say "someone" because I don't know your hierarchy.) There is no such thing as a free lunch. If they want it sooner, then they must make hard, painful tradeoffs. They must prioritize and drop lower-priority features.
If they say, "No. We need it all now. Do it sooner or else," hold the line. They may be surprised, and they may be upset, but you'll earn their respect. The changes will come when they start listening.
There doesn't need to be a conflict between management and development. The conflict is between management and time. It's not your fault it takes time to do things. It's their job to make the hard decisions to get the products out on time without overworking developers until they quit in exhaustion. Just saying "Wrong, do it in half that time" is not management. It's fantasy.
In reality, your management will probably continue to be foolish. If so, you can try to play their game: come up with an estimate that you feel is very safe, including the automated testing, and then double it. Complain loudly when they cut the hours in half, then sigh in resignation. Allow them to feel they are doing their job. Mission accomplished!
A good PM does not estimate. Ever! A good PM will get an estimate from the person who is going to do the work. They will not change it. They may try to coax the worker to change it but, since the worker is the one doing the job, they should be controlling the estimate.
If you have a PM who cuts your estimate in half, make sure your estimate was in writing, and then use it to explain to him (sorry for the gender bias; English doesn't really have a good neutral pronoun) -- and hopefully to his boss -- that the reason your work only seems half finished is that he was screwing around with your estimates.
Tactfully point out that, if they're not going to take your estimates seriously, they should leave you alone and just pluck any old number out of their derrière. That will have the same effect of missed deadlines and unhappy customers but without you wasting time providing numbers that are just going to be ignored anyway.
In any case, the cold war between bad PMs and smart developers will naturally lead to the situation where you should initially double your estimates so that the halving will have little effect :-)
It may sound obvious, but my answer to this question is:
Writing unit tests before the code will allow you to develop good, clean and easy to maintain code
I had a problem with my management, who were concerned that developers would not be able to complete their tasks on time if they were required to write unit tests -- a common concern when trying to introduce TDD in a Waterfall company. So I made this statement, and we had to prove it by writing tests before code and not missing the deadline :) Actually, once you get used to it, it allows you to write even more code.
Typically, if you improve your programming practices and code quality, you'll almost certainly speed up your development, as you'll save more time debugging than you will in writing unit tests and trying to make everything right to begin with. Very few shops are in the position of spending more time on code quality than it saves.
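For anyone who hasn't seen the test-first rhythm, a minimal sketch (parse_price is an invented example, not from the story above): the failing tests are written first, then just enough code to make them pass.

```python
import pytest

# Step 1: tests written first -- they fail until parse_price exists.
def test_parse_price_strips_currency_and_whitespace():
    assert parse_price(" $1,299.00 ") == 1299.00

def test_parse_price_rejects_garbage():
    with pytest.raises(ValueError):
        parse_price("abc")

# Step 2: the minimal implementation, just enough to go green.
def parse_price(text: str) -> float:
    cleaned = text.strip().lstrip("$").replace(",", "")
    try:
        return float(cleaned)
    except ValueError:
        raise ValueError(f"not a price: {text!r}") from None
```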
Another danger is if management meddles in the process, rather than just serves up impossible deadlines. If they expect you to cowboy code in order to make the deadline, you really won't be able to use good practices.

Engineer accountability and code review processes

In your “enterprise” work environment, how are engineers held accountable for performing code inspections and unit testing? What processes do you follow (formal methodology or custom process) to ensure the quality of your software? Do you or have you tried implementing a developer "signoff" sheet for deliverables?
Thanks in advance!
Update: I forgot to mention we are using Code Collaborator to perform our inspections. The problem is getting people to "get it" and buy into doing them outside of a core group of people. As stalbot pointed out below it is a cultural change but the question becomes, how do you change your culture to promote quality initiatives such as reviews/unit tests?
• Our company uses peer code reviews. We conduct them as over-the-shoulder reviews and invite the team's tester to participate in the meeting to gain a better understanding of the changes. We use source control software with a check-in rule that requires code reviews to be signed off -- nothing big, just the name of another developer who has reviewed the code.
• There are definite benefits to code review as several studies have been able to demonstrate. For our company, it was evident that code quality increased as the number of support calls decreased and the number of reported bugs decreased as well. NOTE: Some of the benefits here were the result of implementing Scrum and abandoning Waterfall. More on Scrum below.
• The benefits of code review can be a more stable product, more maintainable code as it applies to structure and coding standards, and it allows developers to focus more on new features rather than “fire-fighting” bugs, and other production issues. There really aren’t any drawbacks if code reviews are conducted “right”. More on the “right way” below.
• Some of the hurdles to overcome while implementing code reviews are the idea that “big brother” is watching me and the idea that not having perfect code means torture and pain. Getting developers to trust each other is difficult sometimes, especially when it involves “pecking order” or the “holier than thou” attitudes and putting your hard work under a microscope. Trust is the key to resolving these issues. A developer must trust that they will not be punished by peers or management for mistakes in code. It happens to everyone. Make a note of the issue, get it resolved and move on.
Scrum
One of the benefits of using the Scrum methodology is that a development cycle ("sprint") is short. The time-frame of the sprint is determined by what works best for your organization and will need some trial and error, but it really shouldn't be longer than four weeks. The benefit is that it requires the developers to communicate daily and to raise problems early in the project. This was initially adopted by our development department and has spread to all areas of our company, as the benefits of Scrum are far-reaching. For more information, see http://en.wikipedia.org/wiki/SCRUM or http://www.scrumalliance.org/. As the development iterations are smaller, the code review process reviews smaller pieces of code, making the review more likely to find problems than hours or days of formal reviews.
“Right Way”
Code Reviews done the “right way” is completely subjective. However, I personally believe that they should be informal, over-the-shoulder reviews. All of the participants in a review should avoid personally attacking the person being reviewed with statements such as “why did you do it that way?” or “what were you thinking?” etc. These types of comments diminish the trust between peers, leading to animosity, hours of arguing over the best/right way to code a solution. Keep in mind that developers do not think or code exactly the same, and there are many solutions to a problem.
Just a little clarification on over-the-shoulder reviews; these can be conducted via remote desktop sharing (pick flavor here), or in person. However, they shouldn’t be limited to the developers only. We typically invite our entire scrum team which consists of two developers per team, a tester, a documentation person, and product owner. All non-developers are there to gain a better understanding of the changes or new functionality being made. They are free to ask questions or provide input, but not to make coding decisions or comments. This has been effective as certain questions will be asked that may change the direction of the project as the initial requirements may have missed a scenario, but that is what agile is all about, change.
Suggestion
I would highly recommend researching scrum and code reviews, before mandating them. Create the basic rules for each and implement them as part of your culture to achieve a better quality product. It must become part of your culture so that it is part of a natural process and integrated at all levels, as it is a paradigm shift from poor quality, missed deadlines and frustration to better quality products, less frustration, and more on-time deliverables.
If you want to ensure that every changelist gets reviewed, before checkin, then you could have your source control tool reject unreviewed checkins. For example, a trigger could reject checkins without "CodeReview: " in the checkin comment. Although people could still lie, they could also be held accountable.
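As a sketch of that idea for git (the answer doesn't name a particular tool), a commit-msg hook along these lines rejects local commits that lack the trailer; the same check in a server-side pre-receive hook would make it unskippable:

```python
#!/usr/bin/env python3
# .git/hooks/commit-msg -- git passes the path of the file holding the
# commit message as the first argument; a non-zero exit aborts the commit.
import re
import sys

with open(sys.argv[1], encoding="utf-8") as f:
    message = f.read()

if not re.search(r"^CodeReview:\s*\S+", message, re.MULTILINE):
    sys.stderr.write(
        "Rejected: add a 'CodeReview: <reviewer>' line to the commit message.\n")
    sys.exit(1)
```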
If you want to ensure that every changelist gets reviewed, after checkin, then you could see if Code Collaborator will play nicely with your source control system and automatically make a review task after each checkin (push or pull; whatever works). After that, use whatever "polite annoyance" features Code Collaborator has, to make sure reviews actually get done.
If you want people to review only some checkins, not all checkins, then good luck with that.
We have a pretty cool setup. Coders are expected to test their code before check-in to ensure that it doesn't break the build, and to write tests where they make sense, but high coverage isn't required.
Complex methods are expected to be commented.
At the end of phases code is reviewed by the whole team.
Pair programming. Work items have a required collaborator field: the person you paired with for the work.
We lean heavily on ITIL concepts. While we don't need the full scale ITSM that ITIL provides, we have implemented processes that draw from some of the best practices in ITIL, specifically in the areas of Change Management and Release Management.
Code reviews are part of our RM strategy. As a change or new piece of code makes its way through our RM process, a lot of eyes look at it. Ultimately the Release Manager makes the call on approval or rework, and all of this is documented (we use TFS and SharePoint). Formal code reviews are held by the Release Manager and the technical team he selects. The primary developer for a release candidate is held accountable for adherence to standards, functionality, and a verification of a completed test plan. If the quality standards aren't met, the deliverable is rejected and the project schedule is updated to reflect the rework.
Yes, this is all very heavy. I work in government and we have complex laws to follow, specifically in the areas of taxes, ADA compliance, and so on.
We use three basic rules:
1) The developer is responsible for fixing bugs in code when unit tests don't exist. In cases where there is a test, the person breaking the test is responsible for fixing it.
2) Code reviews. There are some code review smells that are a good warning sign, over defensiveness and blame redirection being the two most common.
3) NO EMAILING CODE, JARs, or config files. Everything is in the SCM.
To create the culture, first define your standards and values and, most of all, make them known.
Then hire people who believe in them or who can adapt to them. Don't hire someone who has no connection at all with your company's values.
Make sure that those who respect these values and show improvement are "rewarded" and "properly" recognized and held up as models. Don't forget that for many people it's not all about the money.
Don't hesitate to take appropriate measures against those who do not fulfill their responsibilities, but make sure they know what those responsibilities are, and hold them accountable for their deeds.
Give people time to get used to any new responsibility.
Making a change in culture is a big deal, but there are some ways to approach it:
Create awareness of code review and of the importance of a code review tool. This can be done through training sessions.
Motivate people: give some reward for doing code reviews.
Change the process: make sure that code reviews happen, and happen properly. This can be done with a checklist and by making reviews part of the release process.
Do not try to change everything at once. Introduce changes slowly, and create a small group to observe and discuss each change to the code review process.
Provide solutions instead of creating problems. The process should not be overhead; it should come naturally. Provide solutions to people's problems with the process.

Low Friction Minimal Requirements Gathering

How can our team gather requirements from our "Product Owner" in as low friction yet useable of a way as possible?
Now, here are the guidelines: no posts saying it can't be done or that the business needs to decide that it cares about quality, yada yada. The product I work on is built by a small group that has been successful for years. I just want to help them step it up a notch.
Basically, I'm on a 6 or 7 person team with one Product Owner. She does a great job but is juggling a few different roles (as I believe is common on extremely small teams). Usually requirements are given at sporadic times (email convos, face to face discussions, meetings, etc). They are never entered into a system and sometimes this results in features missing a release or the release getting pushed back since everyone forgot about the necessary feature.
If you're in a similar situation but you found a way to overcome this, I'd love to hear it. I'm happy to write code to help ease this situation but it can't be a web site that the Product Owner has to go to in order to get anything done. She is extremely busy and we need some way of working together as a team in order to gather these requirements.
I'm currently thinking of something like this: Developers and team members gather requirements discussed in face to face meetings and write some quick notes on the features discussed on a wiki page. Product owner is notified whenever these pages are updated and it then becomes her responsibility to ensure accuracy.
Pros: We'll have some record of the features. Cons: The developers are taking responsibility for something that they ordinarily wouldn't. I'm okay with that here. I think in this situation it's teamwork.
Of course once we do this, then we're going to see that the product owner probably doesn't have enough time to ensure feature accuracy. Ultimately she is overburdened and I think this will help showcase that fact, but I just need to be able to draw attention to that first.
So any suggestions?
P.S. Her time is extremely limited, so it is considered unreasonable to expect her to type in the requirements after a discussion. She only has time to discuss them once and move on.
Although the concept of "product owner" is a little ambiguous to me, I think I am working in very similar circumstances: the customer is extremely busy and is always a bottleneck in developing requirements.
On the surface, what we try to do in this situation is quite obvious and seemingly simple: we try to make sure that the customer is involved in "read-only / talk-only" mode. No writing. Minimum reading. Mostly talking.
The devil, of course, is in details. So, here are some specifics about our process (in no particular order):
We often start from recording problem statements, which are the ultimate sources of requirements. In fact, sometimes a problem statement is all that we record initially, just to make sure it does not get lost.
NB: It is important to distinguish problem statements from requirements. Although a problem statement sometimes clearly implies some requirement, in general a single problem statement may yield a whole bunch of requirements (each having its own severity and priority); moreover, sometimes a given requirement may define a solution (usually just a partial one) to multiple problems.
One of the main reasons for recording problem statements (and this is very relevant to your question!) is that semantically they are somewhat "closer to the customer's skin" and more stable than the requirements derived from them. I believe those problem statements make it much easier and quicker to put the customer into the proper context whenever he has time to provide feedback to the development team.
We do record all the requirements (and back-track them to problem statements), regardless of when we are going to implement them. Priorities govern the order in which requirements get implemented. Of course, they also govern the order in which the customer reviews unfinished requirements.
NB: A single fat document containing all requirements is an absolute no-no! All the requirements are placed in "problem tracking database", along with bug reports. (A bug is just a special case of a problem in our book.)
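A minimal sketch of the shape of such a tracking database (all field and record names invented for illustration), with every requirement back-tracked to problem statements and sorted by priority for review:

```python
from dataclasses import dataclass, field

@dataclass
class Problem:
    pid: str
    statement: str  # "closer to the customer's skin" than requirements

@dataclass
class Requirement:
    rid: str
    text: str
    severity: str                                 # e.g. "major", "minor"
    priority: int                                 # drives build *and* review order
    problems: list = field(default_factory=list)  # back-links to Problem ids

problems = {"P-7": Problem("P-7", "Clerks re-key invoices twice a day")}
reqs = [
    Requirement("R-12", "Import invoices from CSV", "major", 1, ["P-7"]),
    Requirement("R-13", "Flag duplicate invoice numbers", "minor", 3, ["P-7"]),
]

# The review queue: open requirements in priority order, each traceable
# back to the problem statement that motivated it.
for r in sorted(reqs, key=lambda r: r.priority):
    origin = "; ".join(problems[p].statement for p in r.problems)
    print(f"{r.rid} [{r.severity}/p{r.priority}] {r.text}  <- {origin}")
```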
We always try to do our best to minimize the number of iterations necessary to "finalize" each requirement (or a group of related requirements). Ideally, a customer should have to review a requirement only once.
Whenever the first review turns out to be insufficient (happens all the time), and the requirement in question is complex enough to require a lot of text and/or illustrations, we make sure that the customer does not have to re-read everything from scratch. All the important changes/additions/deletions since the previously reviewed version are highlighted.
While a problem or requirement remains in an unfinished state, all the open issues (mostly questions to the customer) are embedded in the document and highlighted. As a result, whenever the customer has time to review requirements, he does not have to call a meeting and solicit questions from the team; instead, he can open any unfinished document, see exactly what is expected from him, and then decide the best way and time (for him) to address any of the open issues. Sometimes the customer chooses to write an email or add a comment directly to the problem document.
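Highlighting the delta doesn't require special tooling. Assuming each requirement lives in a versioned text file (the file names below are hypothetical placeholders), Python's standard difflib is enough:

```python
import difflib

# Show the customer only what changed since the version he last
# reviewed, instead of making him re-read the whole document.
with open("req-R-12.reviewed.txt", encoding="utf-8") as f:
    previous = f.read().splitlines()
with open("req-R-12.current.txt", encoding="utf-8") as f:
    current = f.read().splitlines()

for line in difflib.unified_diff(previous, current,
                                 fromfile="last reviewed",
                                 tofile="current", lineterm=""):
    print(line)

# difflib.HtmlDiff().make_file(previous, current) renders a friendlier
# side-by-side HTML view for non-developers.
```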
We try our best to establish and maintain official domain vocabulary (even if it gets scattered across the documentation). Most importantly, we practically force the customer to stick to that vocabulary.
NB: This is one of the most difficult parts of the process, and customer tries to "rebel" from time to time. However, at the end of the day everybody agrees that it is the only way to make precious meetings with the customer as efficient as possible. If you ever attended one-hour meetings where 30 minutes were being spent just to get everybody on the same page (again), I'm sure you would appreciate having a vocabulary.
NB: Whenever possible, any changes in the official vocabulary get reflected in the very next release of the software.
Sometimes, a given problem can be solved in multiple ways, and the right choice is not obvious without consulting with the customer. It means that there will be a "menu of requirements" for the customer to pick from. We document such "menus", not just the finally chosen requirement.
This may seem controversial and look like an unnecessary overhead. However, this approach saves a lot of time whenever the customer (usually few weeks or months down the road) suddenly jumps in with a question like "why the heck did we do it this way and not that way?" Also, it is not such a big deal to hide "rejected branches" using proper organization/formatting of requirements documentation. Boring but doable. :-)
NB: When preparing "menus of requirements", it is very important not to overdo them. Too many choices or too many choice nesting levels - and the next review may require much more customer's time than really necessary. Needless to say that the time spent on elaborated branches may be totally wasted. Yes, it is difficult to find some balance here (it greatly depends on the always-in-a-hurry customer's ability to think two or more steps ahead and do it quickly). But, what can I say? If you really want to do your job well, I am sure that after some time you will find the right balance. :-)
Our customer is a very "visual" guy. Therefore, whenever we discuss any significant user interface elements, screen mockups (or even lightweight prototypes) often are extremely helpful. Real time savers sometimes!
NB: We do screen mockups exclusively for the customer, only in order to facilitate discussions. They may be used by developers too, but in no way do they substitute user interface specifications! More often than not, there are some very important UI details that get specified in writing (now - primarily for developers).
We are lucky enough to have a customer with a very technical background. So we do not hesitate to use UML diagrams as discussion aid. All kinds of UML diagrams - as long as they help customer to get into proper context quicker and stay there.
I am talking about requirements-level UML diagrams, of course. Not about implementation-level ones. I believe that even not very technical people can start digging requirements-level UML diagrams sooner or later; you just have to be patient and know what to put on a diagram.
Obviously, the cost of such process greatly depends on analytical and writing skills of the team, and of course on the tools that you have at your disposal. And I must admit that in our case this process appears to be quite expensive and slow. But, taking into account the very low rate of bugs and low rate of "vapor-features"... I think, in the long run, we get very good payback.
FWIW: According to Joel's nice classification of software products, this project is an "internal" one. So we can afford to be as agile as our customer can handle. :-)
"Developers and team members gather requirements discussed in face to face meetings and write some quick notes"
Start with that. If you aren't taking notes, just make one small change. Take Notes. Later, you might post them to a wiki or create a feature backlog or start using Scrum or bugzilla or something.
First, however, make small changes. "Write stuff down" sounds like something you're not doing, so just do that and see what improves and what you can do next. Be Agile. Work Incrementally.
You might want to be careful of the HiPPO in the room. The Highest Paid Person's Opinion is not always a good one. We've tended to focus more on providing great tools and support for developers. These things, done right, take some of the hassle out of development, so that it becomes faster and more fun. Developers are then more flexible in terms of their workload, and more amenable to late-breaking changes.
One-Click testing and deployment are a couple of good ones to start with; make sure every developer can run up their own software stack in a few seconds and try out ideas directly. Developers are then able to make revisions quickly or run down side paths they find interesting, and these paths are often the most successful. And by successful I mean measured success based on real metrics gathered right in the system and made readily available to all concerned. The owner is then able to set the metrics, which they probably care about, rather than the requirements, which they either don't care about or have no experience in defining.
Of course it depends on the owner and your particular situation, but we've found that metrics are easier to discuss than requirements, and that developers are pretty good at interpreting them too. A typical problem might be that customers seem to spend a long time filling their shopping carts but don't go on to checkout.
1) A marketing requirement might be to make the checkout button bigger and redder.
2) The CEO's requirement might be to take the customer straight to checkout, as the CEO only ever buys one item at a time anyway.
3) The UI designer's requirement might be to place a second checkout button at the top of the cart as well as the existing one at the bottom.
4) The developer's requirement might be some Web 2.0 AJAX widget that follows the mouse pointer around the screen.
Who's right?
Who cares... the customer probably saw the ridiculous cost of delivery and ran away. But redefine the problem as a metric, instead of a requirement, and suddenly the developer becomes interested. The developer doesn't have to do 10 rounds with the CMO on what shade of red the button should be. He can play with his Web 2.0 thing all week, and then rush off the other 3 solutions on Monday morning. Each one gets deployed live for 48 hours, and the cart-to-checkout rate gets measured and reported instantly. None of it makes any difference, but the developer got to do their job, and the business shifts its focus onto the crappy products they sell and the price they gouge on delivery.
Well, ok, so the example is contrived. There's a lot of work in there to make sure that the project is small, the team is experienced, hot deployment is simple, instant rollback is provided, and everyone's on board. What we wanted to get to is a state where the developer's full potential is not wasted, which is why they're involved not just from the start, but also in the success. We once started with an issue like "the number of clicks during registration is too high", ran it through a design committee, and found that the number of clicks actually went up in the design specification. That was our experience, anyway. But leave the developer some freedom to just reduce the number of clicks, and you might actually end up with a patented solution, as we did. Not that the developer cares about patents, but it had merit - and no clicks!
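As a toy illustration of "redefine the problem as a metric": computing the cart-to-checkout rate per deployed variant from a raw event stream (variant names and events invented):

```python
from collections import Counter

# Hypothetical event stream: (variant, event) pairs logged live while
# each candidate solution was deployed for its 48 hours.
events = [
    ("bigger_button", "cart"), ("bigger_button", "checkout"),
    ("straight_to_checkout", "cart"),
    ("second_button", "cart"), ("second_button", "checkout"),
    ("ajax_widget", "cart"),
    # ...thousands more rows in a real window
]

counts = Counter(events)
for variant in sorted({v for v, _ in events}):
    carts = counts[(variant, "cart")]
    checkouts = counts[(variant, "checkout")]
    rate = checkouts / carts if carts else 0.0
    print(f"{variant:22s} cart->checkout {rate:.0%}")
```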

How can I think like a user?

We're neck deep in a project right now, schedules are tight (but reasonable). Our general strategy is to get a strong beta done, release it for testing, and get feedback from our testers.
Quite frequently, we're being hit by small things that spiral into long, time-costing discussions. They all boil down to one thing: While we know what features we need, we are having trouble with the little details, things like 'where should this message go' and 'do they need this feedback immediately, or will it break their flow, so we should hold off'?
These are all things that our testers SHOULD catch, but
a) Each 'low priority' bug like this drains time from critical issues
b) We want to have as strong a product as possible
and
c) Even the best testing group will miss things from time to time.
We use our product, and we know how our users use the old version...but we're all at a loss as to how to think like a user when we try to use the new version (which has significant graphical as well as underlying changes).
edit - a bit more background:
We're writing a web app used by a widely-distributed base of users. Our app is a big part of their jobs, but not the biggest (and, of course, we only matter to them when it doesn't work). Getting actual users in to use our product is difficult, as we're geographically distant from the nearest location that serves as an end user (We're in Ohio, and I think the nearest location we serve is 3+ hours away).
The closest we can get is our Customer Service team (who have been a big help, really) but they don't really think like the users either. They also serve as our testers (it really motivates them to find bugs when they know that any they DON'T find may mean a big upswing in number of calls). We've had three (of about 12 total) customer service reps back here most of the week doing some preliminary testing...they've gotten involved in the discussions as well.
Watching someone using the app is a huge benefit to me. Possibly someone who is not entirely familiar with it.
Seeing how they try to navigate, how they try to enter information or size windows. Things we take for granted after creating/running the app hour after hour, day after day.
Users will always try and do things you never expected and watching them in action might bring to light how you can change something that might have seemed minor, but really makes a big impact on them.
Read Don't Make Me Think.
Speaking generally, you can't. There's not any way you can turn off the "programmer" part of your brain and think like a user.
And you're right about (c), testing groups don't necessarily catch all the bugs. But the best thing you can do is get a testing group comprised of real, honest-to-goodness end users, and value their feedback. Draw further conclusions from their general comments.
If you want to know how your users will see your system, the closest you can get is usability testing with real users. Everything else is just heuristics and experience, and is also subject to error. There's no such thing as a bug-free product, but you should be able to get a "strong" product with usability testing.
Buy a cheap, easy-to-use video camera and record your testers using the app. Even better, get some people unfamiliar with the app to use it and video them. It's relatively cheap, and you'd be surprised what it will highlight.
I like the policy of "eating your own dog food" (http://en.wikipedia.org/wiki/Eat_one's_own_dog_food). It brings you one step closer, because you become a user, although you might not think like one.
Try to use your app when you are in a big hurry (e.g. when someone is waiting to go to dinner with you).
You will see all those little things, because you have to wait, you have to go back to the mouse or the keyboard, etc.
And also, make your wife use it. Or your mother.
Another useful test: help someone use it over the phone. If they can't find the button from your directions, that's probably a bug.
The important thing is to get enough information that you yourself can become a "user". Once you do that you can answer most questions yourself.
The way I always do this is to go talk with them about what they need to do, what they typically do, and how they use their current tools to do it. Then (very important) sit with them while they do it. Make sure you get on with them well enough that you can come back to them with questions about how they handle edge cases you think of later (often the answer will be the appalling "we go around the system manually for that").
I will almost always notice something they are doing that is a royal PITA that they didn't bring up, because they are used to having to do it that way and don't know any better. I will always notice that their 90% typical workflow isn't the easiest workflow the tools provide.
You can't really rely on plain old-fashioned requirements gathering by itself, because that is asking them to think like a developer. They generally don't know what is possible to do with your software, what is easy, and what is hard. Also they typically have no clue on GUI design principles. If you ask them for design input they will just tell you to put any new control on their favorite page, until the thing looks like a 747 control panel.
The problem is often that even the users don't know what they want until they are actually working with the software. Sometimes, a small oversight can be a big usability problem, sometimes a well thought out function that was requested by many users sees only little use.
My suggestions to decrease the risk of not implementing the right usability features:
Take a look at users actually doing their day to day work. Even if they use another software or no software at all. You will be able to determine the artifacts they often need to get their job done. You will see what data they frequently need. Concentrate on the artifacts, data and workflows most used. They should be the most usable. Exotic workflows may be a bit more time consuming for the users than often used workflows.
Use working prototypes of the GUI to let users work through a realistic workflow. Watch them and note what hinders them and what works well. Adjust your prototypes accordingly.
If an issue arises in an often-used part of your software, it is time to discuss it now and in details. If the issue concerns a seldom used part, make it a low priority issue and discuss it if you have the time. If issues or suggestions are low priority, they should stay low priority. If you can't determine if solution A or solution B is the best, don't run in circles with the same arguments over and over. Just implement one of the solutions and see if the beta testers like it. The worst thing you could do is waste time over tiny issues, while big issues need to be fixed.
Software will never be perfect, because users' viewpoints differ. Some users will think that a minor problem breaks the whole application. Others will live with even severe usability issues. People tend to lend their ear to those who argue the loudest. Get to know your users to separate the "loud" issues from the important ones. It takes experience to do this, and sometimes you will make wrong decisions, but there is no perfect way, only one of steady improvement.
If you can, set aside a certain amount of usability development resources for the rollout phase of your software. Usability issues will arise when people start working with it in a real production environment. Sometimes it is not important to present the perfect software, but to solve issues quickly as they arise.
The flippant (yet somewhat accurate) answer to how to think like a user is put a knitting needle in your ear and push really hard.
The longer response is that we as programmers are not normal and I mean that in a good way. I scratch my head at the number of people who still run executables they receive from strangers in emails and then wonder how their computer got infected.
Any group of people will in time develop their own jargon, conventions, practices and expectations. As a programmer you will expect different things from an operating system than Joe User will. This is natural, to be expected yet hard to work around.
It's also why BAs (business analysts) exist. They typically come from a business or testing background and don't think like programmers. They are your link to the users.
Really though, you should be talking to your users. There's no point debating what users do. Just drag a few in and see what they do.
A usability test group will help: tests focused not on discovering bugs, but on the learning curve of the new design, conducted by a group of users, not programmers.
I treat all users like malicious idiots.
Malicious because I assume all users are going to try and break my code, do stuff that is not allowed, avoid typing in valid data, and will do anything in their power to make my life hell.
Idiots because, again, I can't assume they will understand simple stuff like phone formats; they will run away screaming if presented with too many choices, and will not make any leap of faith on complicated instructions. The goal is to hold their hand the entire way.
At the same time, it's important to make sure the user doesn't realize you think they're an idiot.
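A small sketch of what that hand-holding can look like in code, using the phone-format example (US-centric rules assumed purely for illustration):

```python
import re

NON_DIGITS = re.compile(r"\D")  # strip everything that isn't a digit

def normalize_us_phone(raw: str) -> str:
    """Accept the messy formats users actually type; reject real
    garbage with a hint instead of a bare error."""
    digits = NON_DIGITS.sub("", raw)
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]  # tolerate a leading country code
    if len(digits) != 10:
        raise ValueError(
            "Please enter a 10-digit phone number, e.g. 555-123-4567")
    return digits

# All of these survive the malicious idiot:
for raw in ["(555) 123-4567", "555.123.4567", "1 555 123 4567"]:
    print(normalize_us_phone(raw))
```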
To think like a user, be one. But are these actually bugs that your testers are reporting? Or are they "enhancement requests"? If the software behaves as designed per requirements and they just don't like the way it operates, that's not a bug. That's a failure of requirements and design. Make it work, make it rock solid, make it easy to change and you'll be able to make it what your users want.
I see some good suggestions here, especially observing people trying to use your app. One thing I would suggest is to look at the order in which things are presented to the user on paper forms (if they do data entry from these) and make the final data entry page mimic that order as closely as possible. Many data entry errors (and much lost data entry speed) come from users having to jump around on the page and losing their place. I did some work for a political campaign this year, and in every case entering data was made much more difficult because the computer screen did things in a different order than the paper inputs. This is particularly important if the form is one that can't be changed to match the computer screen (like a voter registration form; a campaign has to use what the state provides). Also, be consistent from screen to screen if possible. If it is first name, last name on one form, making it last name, first name on the next will confuse people and guarantee data entry errors.
If you are truly interested in understanding users though I strongly suggest taking a course in Human factors engineering. It is an enlightening experience.
The 'right' way to do this is to prototype (or mock up) your new interface features, and watch your users try to use them. Nothing is as enlightening as seeing a real user try to use a new feature.
Unfortunately, given most projects' time and resources, this is not possible. If that is the position you are in, I would recommend discussing within the team who has the best grasp of usability, and then making that person responsible for usability decisions - but they will need to consult real users regularly to make sure their ideas are consistent with what the users want.
I'd suggest doing some form of usability testing; I've participated in such in the past, and found them quite useful.
If you were writing a ticketing system, for example, bring up tasks, and ask questions like "how would you update this ticket" or "what do you expect to happen if this button is clicked".
You don't necessarily need a full application, either, in some places screen shots can be used.
You could take the TDD/BDD approach and get the users involved before beta, having them work with you on refining requirements as you write your unit tests. We're beginning to incorporate some of those trends into our current project, and we're seeing fewer bugs in the areas where we have involved the users earlier.
There is no "think like a user" technique; get your hands on someone who knows nothing of the project and throw what you have done at them.
It's the only way to see how the look + feel + functionality present themselves to the end user.
Once you've shocked that person who knew nothing of the product, listen to all of their idiotic (or so you think) complaints, fix them, and address every silly cosmetic thing they point out (either by fixing the UI or by improving whatever documentation you had)...
And after you have satisfied the first person you chose to look at your app with zero knowledge of the subject, pick another... and another... until they stop being shocked when they see it and no longer get stuck in the "ok... what does this do?" kind of phase.
You (as a member of the project, be it the project manager, developer, etc) will never think like a user is my answer to that question.
Old saying: You can make something "fool proof" but you can't make it "Damn-fool proof".
Additionally: When you make something "idiot proof" the world invents a better idiot.
Other than that, I agree with what everyone else said.
Ask someone with absolutely no knowledge, insight or programming experience to use the program and try to figure out every function of the program.
People who would NEVER use such a program are most likely to find bugs.
See it as a new Safari user (or FF) who tries to put the URL inside the search field...
As a programmer, you assume no one would be that stupid (or, well... unknowing), but people actually find themselves in these situations sometimes. As programmers, we miss these things.

How to find (and keep) a tester who is developer

I work for a software vendor whose market is developer tools and we have been looking for a QA person for our products.
Since we are a small shop the position will be a combination of Support and QA however since we make developer tools, our support consists in large part of actual development (in that the person must read and understand our customers code and find and point out bugs in it).
The QA portion will also consist of writing applications (in a variety of platforms and languages) and testing how they work with our tools.
The main issue I am running into is that when you tell someone with development experience that the position contains "QA" in its title (or even in the job description), they shy away from considering the job.
I'm very interested in feedback and suggestions for how I can find a good person to fill this job and ensure that they are happy doing it. Any ideas?
Money and responsibility.
The reason I shy away from these types of jobs is that they don't tend to hold my interest long enough. Having real development tasks should keep you out of that category. The other problem is that the salary is usually significantly lower with that in the title.
I am a developer, but spent time working as a QA person (test writing, automation, tool writing/coding). I saw it as something I was doing on the side, and would eventually move out of.
The main reason I wanted out was that it simply was not the career I wanted. No amount of money/responsibility would change that. However I think respect has something to do with it as well. A lot of QA work is simply unappreciated, so that is something that would need to be clearly explained as "not how things work at your company."
I would find someone who wants a QA position but has strong development/coding/problem-solving skills. They could fill in doing the tool creation or other small coding tasks, but it would be on the side. Sort of the reverse of my feelings above.
I think the ideal combination of jobs is product manager + QA. What I mean by product manager is someone who writes requirements documents and is responsible for making sure the product meets the requirements. This person would be a peer of the lead developer, not a superior. A person who is a developer but likes management and wants to take that career path might be very interested in that combination of roles.
To start with, you can just take "QA" out of the title and description if that seems to be the 'hot button' that is keeping candidates from looking at the position seriously.
From your description, your position doesn't have much in common with a traditional 'tester' role - the work is mostly writing and thinking about code, not banging on someone else's code and trying to break it. Think of it as a fairly eclectic, tools-oriented development position, and try to advertise and staff it accordingly. (And expect to pay accordingly as well - you get what you pay for.) There are quite a few developers out there who have good skills, but maybe a little shorter attention span than others, and who would prefer to work on a succession of mini projects rather than a longer-lasting piece of a bigger project.
You may just want to keep "QA" out of the title, and call the position "Developer Support" or something like that. Don't mislead any candidates about the duties of the role, but you can cast it more as a "You will be responsible for building the releases and ensuring they are ready to ship to customers."
Also make sure that there is a career path that leads into more development, not more QA, if that's what the candidate wants.
Finally, make sure that the other developers treat this person as a fellow developer, and not as somebody outside the team.
It's sad that "QA" has some stigma attached to it among developers, but it does.
I was a programmer working as a tester for a little time. If I may, the answer is quite simple: let them do whatever they want.
If you give them free rein, I can guarantee that your software will be tested in ways you never imagined.
If, on the other hand, you try to control such a person, then they will grow to despise you. This is inevitable.
The benefits outweigh the costs. If you're a large corp, then this decision is easy. Just hire software developers and tell them to "go to town" on your product. You'll love the results.
Money and responsibility are key, as Adam and Chops point out. Quality engineers should be on the same pay scale as the developers. Interesting work is also an important factor. The role sounds like a nice variety of tasks.
At my company, developers are often loaned to the test team between projects or when test team is swamped. Some have a knack, others don't. Still, most developers would rather test their own code than find bugs in others' work. The test managers actively woo developers with strong testing skills. I resisted switching to the test team for seven years. A promotion, a 20% raise and a promise that my role was primarily trouble-shooting, management and planning finally convinced me to switch. I do more hands on testing than I thought I would, but I get the challenging work too.
Pay comparable to development. Be truthful; disclose actual expectations of the role. Change the title to Software Quality Engineer.
I agree with Adam: money and responsibility are key. I would guess that, since you're a small company, your QA team is small or non-existent. That probably means there's a good opportunity for someone to come in and make a genuine effort to contribute to and shape your company's QA policy, procedures, and workflow.
Our company had a similar issue with QA, and we're still not there 100% with it. But giving the QA person the power to dictate policy and procedure, and participate in all aspects of product development so as to keep them in the loop has worked well for us. This means, when it comes to QA and testing, we've got someone who understands the product, knows it inside and out, has been heavily involved from the start, and has heavily shaped the procedures they themselves, and the development team, will follow. Responsibility is key.
Most developers are neither good testers nor do they enjoy testing, and you want someone who is both. Be honest in your job ad that the position is NOT a stepping stone to a developer position and you will have fewer applicants but a better chance of keeping who you do hire. QA typically has lousy pay, so if you are willing to pay better, you should be able to find someone. You won't keep them if you hire someone who wants to write code all day, regardless of how much you pay them.
I think you have a toughie here:
The cost of a full-time developer for doing the job you require would be too high.
Most devs (including myself) would get incredibly fed up, very quickly. Most devs' passion is coding; they want to do it as much as possible, whereas, TBH, from what you have said, there may be very little of it in the job role you have.
I would say perhaps look for a Junior, someone fresh with little experience. They will probably mould better to your testing/QA process, and it gives them a chance to start looking at production code, with perhaps opportunity to work with it.
Unless you are lucky, I would not expect a "developer" to stay for long, so either expect a bit of turnover, or possibly expand to a full dev role if required, and get a cheaper sole tester in.
I know you are a small shop, so finances may play a large part, but I would say you need to weigh up the possibility of getting a dev in and fixing the problems you have, if they occur that often. Testers are cheap by comparison. It may be best to get a tester in, find all the issues, then get a contractor/part-time dev in to fix them.
Dude, a certain company I work for has found the solution to your problems: hire QE, not QA. QA (Quality Assurance) does have a stigma to it; the job title itself implies boring rote tasks to most developers. QE (Quality Engineering) sounds just as bad, but doesn't scare off nearly as many people.
If all else fails, just hire a developer. I mean, seriously, you want someone who can write code, so hire someone who has training in that. The thing is, you need to look at your applicants and talk to them. You are looking for someone who knows how QE works, and you want to hire a developer who works in the languages your product targets, not just the one it's written in.
The most common title for this position is "Software Developer in Test".
But I think another problem is much more important: it's hard to prevent a person with good testing and dev knowledge from migrating to the dev team.