How do you verify that users' requirements are addressed in the code you're working on? [closed] - requirements

Code can be perfect, and also perfectly useless at the same time. Getting requirements right is as important as making sure that requirements are implemented correctly.
How do you verify that users' requirements are addressed in the code you're working on?

You show it to the users as early and as often as possible.
Chances are that what they've asked for isn't actually what they want - and the best way of discovering that is to show them what you've got, even before it's finished.
EDIT: And yes, this is also an approach to answering questions on StackOverflow :)

You write tests that assert that the behavior the user requires exists. And, as was mentioned in another answer, you get feedback from the users early and often.
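For instance, a requirement like "a newly opened account has a zero balance" maps directly onto a test. A minimal sketch, assuming JUnit 4; Account is a hypothetical domain class used only for illustration:

import static org.junit.Assert.assertEquals;
import org.junit.Test;

public class AccountTest {
    // Requirement: "A newly opened account has a zero balance."
    @Test
    public void newAccountHasZeroBalance() {
        Account account = new Account(); // hypothetical domain class
        assertEquals(0, account.getBalance());
    }
}

If a requirement can't be turned into a test like this, it is probably too vague to implement, which is useful feedback in itself.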

Even if you talk with the user and get everything right, the user might have gotten it wrong. They won't know until they use the software that they didn't want what they asked for. The surest way is to do some sort of prototype that allows the user to "try it out" before you write the code. You could try something like paper prototyping.

If possible, get your users to write your acceptance tests. This will help them think through what it means for the application to work correctly. Break the development down into small increments that build on each other. Expose these to the customer early (and often), getting them to use it, as others have said, but also have them run their acceptance tests. These should also be developed in tandem with the code under test. Passing the test won't mean that you have completely fulfilled the requirements (the tests themselves may be lacking), but it will give you and the customer some confidence that you are on the right track.
This is just one example of where heavy customer interaction pays off when developing code. The way to get the most assurance that you are developing the right code is having the customer participating in the development effort.
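As an illustration only (Order, Customer and qualifiesForFreeShipping are invented names, and JUnit 4 is assumed), a customer-authored acceptance criterion might end up encoded like this:

import static org.junit.Assert.assertTrue;
import org.junit.Test;

public class FreeShippingAcceptanceTest {
    // Criterion agreed with the customer:
    // "An order over $500 placed by a registered customer ships free."
    @Test
    public void registeredCustomerOrderOverFiveHundredShipsFree() {
        Customer alice = new Customer("alice", true); // true = registered
        Order order = new Order(alice);
        order.addItem("laptop", 600.00);
        assertTrue(order.qualifiesForFreeShipping());
    }
}

The customer can read the criterion in the comment and confirm it matches their intent, even if they never read the code beneath it.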

How do you verify that users' requirements are addressed in the code you're working on?
For a question put in this form the answer is "You can't".
The best way is to work with users from the very first days, show them prototypes and incorporate their feedback continuously.
Even so, at the end of the road, there will likely be nothing resembling what was originally discussed and agreed on.

Ask them what they want you to build before you build it.
Write that down and show them the list of requirements you have written down.
Get them to sign off on the functional design.
Build a mock up and confirm that it does what they want it to.
Show them the features as it is being implemented to confirm that they are correct.
Show them the application when it's finished and allow them to go through acceptance testing.
They still won't be happy, but you will have done everything you can.
Any features that are not in the document they signed off on can be considered change requests, for which you can charge them extra. Get them to sign off on everything you show them, to limit your liability.

By using a development method that frequently checks alignment between the implementation and the requirements.
For me, the best way is to involve an "expert customer" who validates and tests the implementation iteratively, as often as possible.
If you don't, you risk ending up with, as you said, very beautiful software that is perfectly useless.

You can try personas: a cohort of example users who use the system.
Quantify their needs and wants, and make up scenarios of what is important to them and what they need to get done with the software.
Most importantly, make sure that the users' (the personas') goals are met.
Here's a post I wrote that explains it in more detail.

You write unit tests that expect an answer that supports the requirements. If the requirement is to sum a set of numbers, you write
import static org.junit.Assert.assertEquals;
import org.junit.Test;

@Test
public void testSumInvoice()
{
    // create an invoice of 3 lines of $1, $2, $3 respectively
    Invoice myInvoice = new Invoice().addLine(1).addLine(2).addLine(3);
    // assertEquals(expected, actual) - assertTrue takes a boolean, so it was the wrong assertion here
    assertEquals(6, myInvoice.getSum());
}
If the unit test fails, either your code is wrong or the code was changed, possibly due to some other requirement. Now you know that there is a conflict between the two cases that needs to be resolved. It could be as simple as updating the test code, or as complex as going back to the customer with a newly discovered edge case that isn't covered by the requirements.
The beauty of writing unit tests is it forces you to understand what the program should do such that if you have trouble writing the unit test, you should revisit your requirements.

I don't really agree that code can be perfect, but that's outside the real question. You need to find out from the users what they want before any design or coding is done: ask them "what does success look like?", "what do you expect when the system is complete?", "how do you expect to use it?". Videotape the responses, mind-map them, or wireframe them, and then review the result with the users to ensure you captured the most important aspects. You can then use those items to verify the iterative deliveries. Expect the users to change their minds/needs over time and once they have "it in their hand" (IKIWISI - I Know It When I See It), and record any change requests in the same fashion.

AlbertoPL is right: "Most of the time even the users don't know what they want!"
And if they know, they have a solution in mind and specify aspects of that solution instead of just telling the problem.
And if they tell you a problem, they may have other problems without being aware that these are related by having a common cause or a common solution.
Thus, before you implement mockups and prototypes, go and watch the use of what the customer already has or what the staff is still doing by hand.


How to hand over a project systematically? [closed]

We had a project handed over from an onshore team to our team (offshore) not long ago. However, we had difficulties with the hand-over process.
We couldn't think of any questions to ask during their design walk-through, because we were overwhelmed by the sheer amount of information. We wanted to ask, but we didn't know what to ask. Since they got no questions from us, management thinks the hand-over was done successfully.
We had tried to go through all the documentation on our company wiki before attending the handover presentation, but there are too many documents and we didn't even know where to start.
I wonder, are there any rules or best practices that we can follow to ensure a successful project hand-over, either from us, or to us?
Thanks.
In terms of reading the documentation, personally I'd go for this order:
Get a short overview of the basic function of the application - what is it meant to achieve. The business case is probably the best document which will already exist.
Then the functional specification. At this point you're not trying to understand any sort of how or technology, just what the app is meant to do. If it's massive, ask them what the key business processes are and focus on those.
Then the high level technical overview. This should include an architecture diagram, required platforms, versions, config and so on. List any questions you have.
Then skim any other useful looking technical documents - certainly a FAQ if there is one, test scripts can be good too as they outline detailed "how to" type scenarios. Maybe it's just me but I find reading technical documents before I've seen the system a waste - it's too academic and they're normally shockingly written. It's certainly an area I'd limit the time I spent on if I didn't feel I was getting a reasonable return for the time I was spending.
If there are several of you, arrange structured reviews between you and discuss the documents you've read, making sure you've got what you need out of them. If the system is big, then each of you take an area and present it to the others - give yourselves a reason to learn as much as possible; knowing you're going to be quizzed is a good motivator. Make a list of questions where you don't understand something. Having structured reviews between you will focus your minds and make it more of an interactive task, rather than just trawling through page after page of tedious documents.
Once you get face to face with them:
Start with a full system demo. Ask questions as they come up, don't let them fob you off with unclear answers - if they can't answer something have it written down and task them with getting the answer.
Now get the code checked out and running on your machines. Do this on at least two machines - one they lead, one you lead. Document the whole process - this is the most important step. If you can't get the code running you're screwed.
Go through the build process. Ensure that you can build the app (including any automated build and unit tests they may have). Note that all unit tests should pass - if they don't or if they say "oh, that one always fails" then they need to fix that before final acceptance.
Go through the install process. Do this at least twice, once they lead, once you lead. Make sure that it's documented.
Now come up with a set of common business functions carried out with the application. Use this to walk the code with them. The code base will be too big to cover the whole thing but make sure you cover a representative sample.
If there is a database or an API do a similar exercise. Come up with some standard data you might need to extract or some basic tasks you might need to carry out using the API and spend some time working through these with them.
Ask them if there's anything they think you should know.
Make sure that any questions you've written down anywhere else are answered.
You may consider it worth going through the bug list (open and closed) - start with the high-priority ones and talk through anything that looks particularly worrying. Even if they've fixed it, it may point at a bit of code which is troublesome.
And finally if the opportunity exists - if there are any outstanding bugs or changes, see if you can pair program a couple.
Do not finally accept the app unless you are 100% sure you can:
Get the code to compile
Get the code to build (including the database)
Get the application installed
Do not accept handover is complete until they have:
Documented anything you picked up on that wasn't covered to your satisfaction
Answered ALL of your questions - a question they won't answer after being asked repeatedly screams of something they're hiding
And grab their e-mail addresses and phone numbers. Even if it's only informal they'll probably be willing to help out if the shit really hits the fan...
Good luck.
My basic process for receiving a handover would be:
Get a general overview of the app, document it
Get a list of all future work that the client expects
... all known issues
... any implementation specifics
As much up-to-date documentation as they have
If possible, have them write some tests for critical components of the system (or at least get them thoroughly documented)
If there is too much documentation (quite possible), just confirm that it is all up to date, and make sure you find out from them where to start if it is not clear.
Ask as many questions as possible; anything that comes to mind, because you may not have the chance again.
Most handovers, perhaps all of them, will cause a lot of information to be lost. The only effective way to perform a handover that I have seen is to do it gradually. One way to do it is to allow a few key people from Phase One to stay on the project well into Phase Two.
The extreme solution is to get rid of all handovers, and start using an Agile mindset.
As a start, define the exit criteria for the handover. This should be discussed, negotiated and agreed with both parties and make sure higher management knows this. Then write up a checklist of all things needed to achieve the exit criteria and chase it.
Check out "Software Requirements" and Software Requirement Patterns for ideas on questions to ask when gathering information about a project. I think that just as they would work for new development, they would also help you to come to terms with an existing project.

Should human factor be taken into account when deciding on what process to use? [closed]

When you are deciding on what methodology or process to use for your project, should you take into account the human factors? If there is any resistance to things, do you go with the flow or force people to change?
For example, say you want to push for pair programming but the team members resist working in that mode (or show dislike); what would you do? Make them get used to it, try to convince them to do it, or go with the flow and let them do what they like?
The human factor is the most important one.
If you consider nothing else, consider the culture and proclivities of the group.
People who want a process to fail will succeed. It's far easier to alter process than to alter people.
First try to reason, you may be wrong:
If you have resistance to certain things, you can usually give your points for why you think it's good, hear their points for why they think it is bad, and come to some common ground.
You should never force people into doing something against their will, but instead try to convince them based on your logical reasoning. Many times you will see reasons from them that changes your point of view.
If your developer is too afraid to voice their opinion, then you should make them feel comfortable with giving their opinion. If they are still reluctant, then you should consider new developers.
Foot in the door principle:
If you want to try some new concept that neither you nor they have experience in, say pair programming, then you can ask them to try it for 1-2 weeks and then you can sit together again after this trial period and assess the effectiveness. I think most people will find it perfectly reasonable to try something new if they have no experience in it, if it is for the purpose of finding out the method's effectiveness, and if it is only for a trial period.
If after this trial period, the thing you were testing was successful, then your developer will be more open to the idea.
Don't change them, find someone who fits:
If you are 100% for some way of doing things, and your developer is 100% against it, and he won't try it and has no logical reason why, instead of trying to change him you're better off finding a developer that will fit into your way of doing things.
If they are 100% against what you want to change, you have to make a decision. Is the developer themselves more important to you, or is the process that you want to change more important.
If you force someone into something they don't want to do, they will find a way to make your method fail.
Yes. Your development process needs to be humane. That said, there are better and worse development practices and you should strive to use the better practices. The best methodologies understand both human strengths and weaknesses and have practices that promote the former and compensate for the latter.
For example, most agile processes put a high value on trusting developers to do the right thing -- to work hard and value quality. They allow developers to have significant input into the process and into the product. This takes advantage of the human quality of rising to expectations. On the other hand, humans have trouble managing too much complexity at one time, so agile practices insist on breaking things down into manageable chunks.
On the other hand, we know that people don't like to do things that don't directly add value to their work. Agile practices, recognizing the value of things like unit testing, insist on this however and require the developer to conform to it despite the initial reluctance. Using TDD compensates for this somewhat by giving real value to developing tests -- you do them first and let them guide the design. It's a bit of the carrot and stick approach to get developers over the initial reluctance to the point where they can experience the value of the method and buy into it on their own.
Adapting the Process
The key to developing a good process with your people lies in adapting the process to the amount of ceremony that you need or want. We use the RUP where I work and one of the central goals of the RUP is to tailor the amount of ceremony in your process to fit your project and the personnel.
For instance, small projects require far less ceremony and tool support. As well, people new to a process need time to adapt. It's best not to flood them with information and let them adapt at their own pace.
Show Me the Money!
To get people to buy into a new process, let them make a mistake (or present an example from the past) and then show them how the process could have helped prevent it. Try to draw a direct line showing how the process will help them improve the way they work.
For instance: if people are resistant to automating builds and running tests automatically, then the next time a fix they release breaks a piece of code that was already working, use that opportunity to illustrate that an automated test would have caught the error before it got released, saving everyone time and money.
Automation
The way to ensure people can adapt to a process is to remove as much human intervention as you can. Automate builds, tests and reporting as much as possible, using information that is automatically captured.
This helps support the process by removing the "nag" factor. Many people resist a new process because they figure it means more work for them or extra work that produces little result in the end. By automating existing tasks and gathering data from them, you get a lot of benefit without increasing any individual developer's workload.
A classic example is continuous integration. Continuous Integration tools like CruiseControl, TeamCity or Hudson can work with version control repositories to extract latest versions of source code, build that code, execute and archive test results and package stuff for deployment. This requires no extra effort on the part of the developer but you get a lot of extra "process" in return. You now know how good your source code is, you can distribute it easily and you can catch bugs earlier.

Scrum, but with no testing or documentation [closed]

What do you do when you join a team that says they use Scrum, but only use it as a time-management tool and not the whole process?
How can I reinstate testing and documentation?
I was thinking of starting off by adding user stories specifically for testing and documenting.
Perhaps someone else has more experience with this than I do, as I am sure it's not that uncommon.
The key to Scrum is that a task be identifiable as "done" before it can be classed as done. How does your company assess whether something is done without reviewing documentation and tests?
Perhaps they have an unusual, but valid, way of doing it. Or perhaps they have missed the point of "done" tasks. I'd suggest you start by asking them how they measure "done" and whether it could be improved. Then suggest documentation and testing as ways of improving the process.
Note that neither testing nor documentation are in fact part of Scrum. Scrum is a pure project management approach - the required engineering practices, like the ones you mention, are supposed to "emerge" during the project. And most specifically, they are supposed to be identified during the heartbeat retrospectives that you do at the end of every sprint. Are you doing those? Can you bring up your concerns there - and are they actually the biggest concerns the team has?
Is the issue that they don't have any documentation and tests, or that they aren't implementing the entire Scrum methodology? Those are 2 very different problems in my mind.
I would much prefer an organization that has taken the time and effort to find and fit a development process that matches their development style as opposed to mandating down from on high the one true process. So I would not be concerned at all if they were using a process that they called Scrum but that didn't meet all the "official" guidelines. Try to determine why the process is the way it is. Chances are that if they have taken the time to tailor it, the team will be receptive to your ideas, especially if you have taken the time to determine why things are the way they are. If you simply approach it as "this isn't Scrum and so isn't right", you will probably not make much headway, but by being pragmatic about the benefits you can likely make some substantial improvements.
Alternatively, if they aren't doing testing and don't have any documentation I would consider that a fairly bad sign. And by documentation I am taking the minimalist view here - a list of features, bug tracking, etc. - I would be very concerned by the absence of these items, less concerned by the absence of items higher up the abstraction list. In the absence of support from management, I would suggest you lead by example. Take it on yourself to setup a simple bug tracking system (there are several - in a pinch, simple text lists in a central location work as well). Don't declare your features complete until someone else has tested it. This can be as simple as walking over to another developer and asking them to try it in front of you. If someone claims a feature is complete, take a few minutes to familiarize yourself with it. If you find a bug, politely mention it to the responsible developer. Slowly build an environment where the team can see the benefits of running tests and tracking features and bugs.
Most teams operate in this manner simply because of a mistaken belief that they don't have time to "do it right", or that they will get to it later. Often this will occur when a simple proof-of-concept done by a developer or two as a side-project turns into a full-on development effort. By showing that it can actually save time and effort, and reducing the initial costs to the rest of the team, you will often find that it becomes ingrained as part of the process without ever actually being officially endorsed or accepted.
If you have management support it will make it much easier, but always be careful to make sure that the team is receptive to the changes. This may mean it takes longer than you want, but so be it, without the team's support any mandated process will fail at the first sign of pressure, which is when you need the process the most.
*Disclaimer - On my last project I spearheaded the movement to tailor the SCRUM process to fit our environment. The "official" process was simply untenable for our client, but it was still an invaluable guide in tailoring our process.
"adding user stories specifically for testing and documenting"
While meta-user stories might make sense in some circles, it rarely works out well. Software folks rarely cope well with meta-user stories, they either don't get the idea that they can change their own processes by writing a story, or -- more typically -- they engineer the meta-user story to death.
When you're interviewing users, it feels like they're making the user story up. Certainly, you're making it up as you listen to them and try to capture it.
When an IT organization tries to make up its own user stories about how IT should work, the process falls apart. Until the organization has done the thing (testing, for example) a bunch of times manually, they're not really qualified to write user stories. Then, after they've done it, they don't need software development processes, they'll just automate the important bits a little at a time.
I think change has to come from a less formal direction. Actually balking at calling something "done" that hasn't been tested is a good starting point.
IT doesn't do things unless forced. So, meet the users and find out why they're not requiring testing. Coach them to require testing. Tell them the consequences and the words to use.
A lot can go wrong in an organization to lead to poor processes. It's important to know what's wrong, and create a demand for change. The best possible thing is to have your boss complaining that you're not fixing it, rather than you suggesting that perhaps it would be good to fix it.
[It doesn't feel right when your boss demands you fix the process, but it's about the only way change will happen.]

What do you do with a developer who does not test his code? [closed]

One of our developers is continually writing code and putting it into version control without testing it. The quality of our code is suffering as a result.
Besides getting rid of the developer, how can I solve this problem?
EDIT
I have talked to him about it a number of times and even given him a written warning.
If you can do code reviews -- that's a perfect place to catch it.
We require reviews prior to merging to iteration trunk, so typically everything is caught then.
If you systematically perform code reviews before allowing a developer to commit the code, well, your problem is mostly solved. But this doesn't seem to be your case, so this is what I recommend:
Talk to the developer. Discuss the consequences for others in the team. Most developers want to be recognized by their peers, so this might be enough. Also point out that it is much easier to fix bugs in code that's fresh in your mind than in weeks-old code. This part makes sense if you have some form of code ownership in place.
If this doesn't work after some time, try to put in place a policy that will make committing buggy code unpleasant for the author. One popular way is to make the person who broke the build responsible for the chores of creating the next one. If your build process is fully automated, look for another menial task for them to take care of instead. This approach has the added benefit of not pinpointing anyone in particular, making it more acceptable to everybody.
Use disciplinary measures. Depending on the size of your team and of your company, those can take many forms.
Fire the developer. There is a cost associated with keeping bad apples. When you get this far, the developer doesn't care about his fellow developers, and you've got a people problem on your hands already. If the work environment becomes poisoned, you might lose far more - productivity-wise and people-wise - than this single bad developer.
As a developer who rarely tests his own code, I can tell you the one thing that's made me slowly shift my behavior...
Visibility
If the environment allows pushing code out, waiting for users to find problems, and then essentially asking "How about now?" after making a change to the code, there's no real incentive to test your own stuff.
Code reviews and collaboration encourage you to work towards making a quality product much more than if you were just delivering 'Widget X' while your coworkers work on 'Widget Y' and 'Widget Z'
The more visible your work is, the more likely you are to care about how well it works.
Code review. Stick all of your devs in a room every Monday morning and ask them to bring their proudest code-based accomplishment from the previous week along to the meeting.
Let them take the spotlight and get excited about explaining what they did. Have them bring copies of the code so other devs can see what they're talking about.
We started this process a few months ago, and it's astonishing to see the amount of subconscious quality checks that take place. After all, if the devs are simply asked to talk about what they're most excited about, they'll be totally stoked to show people their code. Then, other devs will see the quality errors and publicly discuss why they're wrong and how the code should really be written instead.
If this doesn't get your dev to write quality code, he's probably not a good fit for your team.
Make it part of his Annual Review objectives. If he doesn't achieve it, no pay rise.
Sometimes though you do just have to accept that someone is just not right for your team/environment, it should be a last resort and can be tough to handle but if you have exhausted all other options it may be the best thing in the long run.
Tell the developer you would like to see a change in their practices within 2 weeks or you will begin your company's disciplinary procedure. Offer as much help and assistance as you can, but if you can't change this person, he's not right for your company.
Using CruiseControl or a similar tool, you can make check-ins automatically trigger a build and unit tests. You would still need to ensure that there are unit tests for any new functionality he adds, which you can do by looking at his check-ins.
However, this is a human problem, so a technical solution can only go so far.
Why not just talk to him? He probably won't actually bite you.
Make him "babysit" the build, and become the build manager. This will give him less time to develop code (thus increasing everyone's performance) and teach him why a good build is so necessary.
Enforce test cases - code cannot be submitted without unit test cases. Modify the build system so that if the test cases don't compile and run correctly, or don't exist, then the entire task checkin is denied.
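A rough sketch of one way to wire that up, assuming JUnit 4 and a version-control hook that rejects the commit on a non-zero exit code (InvoiceTest stands in for your real test classes):

import org.junit.runner.JUnitCore;
import org.junit.runner.Result;

public class CheckinGate {
    public static void main(String[] args) {
        // A VCS pre-commit hook runs this class before accepting the check-in.
        Result result = JUnitCore.runClasses(InvoiceTest.class /* , other suites */);
        if (!result.wasSuccessful()) {
            System.err.println(result.getFailureCount() + " test(s) failed; check-in denied.");
            System.exit(1); // non-zero exit makes the hook reject the commit
        }
    }
}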
-Adam
Publish stats on test code coverage per developer; this would be after talking to him.
Here are some ideas from a sea shanty.
Intro
What shall we do with a drunken sailor, (3×)
Early in the morning?
Chorus
Wey–hey and up she rises, (3×)
Early in the morning!
Verses
Stick him in a bag and beat him senseless, (3×)
Early in the morning!
Put him in the longboat till he’s sober, (3×)
Early in the morning!
etc. Replace "drunken sailor" with a "sloppy developer".
Depending on the type of version control system you are using, you could set up check-in policies that force the code to pass certain requirements before being allowed in. If you are using a system like Team Foundation Server, it gives you the ability to specify code-coverage and unit-testing requirements for check-ins.
You know, this is a perfect opportunity to avoid singling him out (though I agree you need to talk with him) and implement a Test-first process in-house. If the rules aren't clear and the expectations are known to all, I've found that what you describe isn't all that uncommon. I find that doing the test-first development scheme works well for me and improves the code quality.
They may be overly focused on speed rather than quality.
This can tempt some people into rushing through issues to clear their list and see what comes back in bug reports later.
To rectify this balance:
assign only a couple of items at a time in your issue tracking system,
code review and test anything they have "completed" as soon as possible so it will be back with them immediately if there are any problems
talk to them about your expectations about how long an item will take to do properly
Pair programming is another possibility. If he pairs with another skilled developer on the team who does meet quality standards and knows the procedures, then this has a few benefits:
With an experienced developer over his shoulder he will learn what is expected of him and see the difference between his code and code that meets expectations
The other developer can enforce a test-first policy: not allowing code to be written until tests have been written for it (see the sketch after this list)
Similarly, the other developer can verify that the code is up to standard before it is checked in, reducing the number of bad check-ins
All of this of course requires the company and developers to be receptive to this process, which they may not be.
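As a minimal sketch of what that test-first step looks like in practice (JUnit 4; OrderTotals and applyDiscount are invented names, not anything from the original answer), the pair writes a failing test first and only then writes enough code to make it pass:

import static org.junit.Assert.assertEquals;
import org.junit.Test;

public class OrderTotalsTest {
    // Written before OrderTotals exists; it won't even compile until
    // the class and method are created - which is the point of test-first.
    @Test
    public void tenPercentDiscountAppliesAtOneHundredOrMore() {
        OrderTotals totals = new OrderTotals();
        assertEquals(90.0, totals.applyDiscount(100.00), 0.001);
    }
}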
It seems that people have come up with a lot of imaginative and devious answers to this problem. But the fact is that this isn't a game. Devising elaborate peer-pressure systems to "name and shame" him is not going to get to the root of the problem, i.e. why is he not writing tests?
I think you should be direct. I know you say that you've talked to him, but have you tried to find out why he isn't writing tests? Clearly at this point he knows that he should be, so surely there must be some reason why he isn't doing what he's been told to do. Is it laziness? Procrastination? Programmers are famous for their egos and strong opinions - perhaps he's convinced for some reason that testing is a waste of time, or that his code is always perfect and doesn't need testing. If he's an immature programmer, he might not fully understand the implications of his actions. If he's "too mature" he might be too set in his ways. Whatever the reason, address it.
If it does come down to a matter of opinion, you need to make him understand that he needs to set his own personal opinion aside and just follow the rules. Make it clear that if he can't be trusted to follow the rules then he will be replaced. If he still doesn't, do just that.
One last thing - document all of your discussions along with any problems that occur as a result of his changes. If it comes to the worst you may be forced to justify your decisions, in which case, having documentary evidence will surely be invaluable.
Stick him on his own development branch, and only bring his stuff into the trunk when you know it's thoroughly tested. This might be a place where a distributed source control tool like Git or Mercurial would excel. Although with the improved branching/merging support in SVN, you might not have too much trouble managing it.
EDIT
This is only if you can't get rid of him or get him to change his ways. If you simply can't get this behaviour to stop (by changing or firing), then the best you can do is buffer the rest of the team from the bad effects of his coding.
If you are at a place where you can affect the policies, make some changes. Do code reviews before check ins and make testing part of the development cycle.
It seems pretty simple. Make it a requirement and if he can't do it, replace him. Why would you keep him?
I usually don't advocate this unless all else fails...
Sometimes, a publicly-displayed chart of bug-count-by-developer can apply enough peer pressure to get favorable results.
Try the carrot: make it a fun game.
E.g. the Continuous Integration Game plugin for Hudson:
http://wiki.hudson-ci.org/display/HUDSON/The+Continuous+Integration+Game+plugin
Put your developers on branches of your code, based on some logic like, per feature, per bug fix, per dev team, whatever. Then bad check-ins are isolated to those branches. When it comes time to do a build, merge to a testing branch, find problems, resolve, and then merge your release back to a main branch.
Or remove commit rights for that developer and have them send their code to a younger developer for review and testing before it can be committed. That might motivate a change in procedure.
You could put together a report with errors found in the code with the name of the programmer that was responsible for that piece of software.
If he's a reasonable person, discuss the report with him.
If he cares for his "reputation" publish the report regularly and make it available to all his peers.
If he only listens to the "authority", do the report and escalate the issue to his manager.
Anyway, I've often seen that when people are made aware of how bad they look from the outside, they change their behaviour.
Hey this reminds me of something I read on xkcd :)
Are you referring to writing automated unit tests, or to manually unit testing prior to check-in?
If your shop does not write automated tests then his checking in of code that does not work is reckless. Is it impacting the team? Do you have a formalized QA department?
If you are all creating automated unit tests then I would suggest that part of your code review process include the unit tests as well. It will become obvious that the code is not acceptable per your standards during your review.
Your question is rather broad but I hope I provided some direction.
I would agree with Phil that the first step is to individually talk to him and explain the importance of quality. Poor quality can often be linked to the culture of the team, department and company.
Make executed test cases one of the deliverables before something is considered "done."
If you don't have executed test cases, then the work is not complete, and if the deadline passes before you have the documented test case execution, then he has not delivered on time, and the consequences would be the same as if he had not completed the development.
If your company's culture would not allow for this, and it values speed over accuracy, then that's probably the root of the problem, and the developer is simply responding to the incentives that are in place -- he is being rewarded for doing a lot of things half-assed rather than fewer things correctly.
Make the person clean latrines. Worked in the Army. And if you work in a group with individuals who eat a lot of Indian food, it won't take long for them to fall in line.
But that's just me...
Every time a developer checks something in that does not compile, make them put some money in a jar. They'll think twice before checking in then.
Unfortunately if you have already spoken to him many times and given him written warnings I would say it is about time to eliminate him from the team.
You might find some helpful answers here: How to make junior programmers write tests?
I'd be tempted to suggest elaborating a bit on what you've tried and what results you got, as that may change things, but here are my initial suggestions:
Is it any tests or comprehensive tests? Some may code blindly and do zero tests, but this is rather rare, IME. Usually there are some tests done but not enough to cover most of the cases that would be comprehensive testing.
Group dynamics may help. I'd assume he is part of a team and that the team's view may be of some help here. In a way this is trying to get peer pressure which is usually a bad thing but sometimes it can be used in good ways.
How well spelled out were the warnings? In a way this can seem childish but there is a chance that what you think of as testing may not be the same as his. Do you want nUnit tests, an excel spreadsheet, logs from his computer, or something else as proof of the existence and use of tests? From what you've described there isn't anything to confirm that he did understand what you meant, was going to use tests and provide evidence of doing so.
Check-in policy question. Some places, such as my current workplace, encourage committing often which can mean that one does commit code without tests. Is there a known, accepted and well-followed policy where you are? That's another aspect here.

Coding Test - allow use of web? [closed]

When hiring a .NET web developer, I give the candidate a coding test.
I tend to limit the candidate to MSDN installed on the test server - I think it holds everything the candidate needs to complete the task.
I admit, this is not the normal case as I don't expect the candidate to do his work without use of the web.
On the other hand, I don't want the candidate to google for a complete example and copy-paste it; I want to evaluate his skills.
The question is do I need to allow free use of the web during the test?
If you think the whole coding test is wrong - I would like to hear alternatives you may have for me.
As you say, 'I don't expect the candidate to do his work without use of the web' why not allow it too during the test? And what if he does copy and paste? I do that too. Surely the key is to know where to look, be discerning with what you find and apply it intelligently. Do you want to hire someone with a terrific memory or someone who can develop software for you?
When I was at school, calculators were just becoming affordable. As their use was seen as unavoidable, the exams were changed. Simple number-crunching was no longer tested in the way it was before (it was important then). Rather problem-solving techniques were to be tested.
I usually allow candidates to use whatever resources they want. After they're done, I sit down with them and go through their code together, ask questions like why they chose that particular approach etc.
If a couple of minutes of googling was enough not just to copy-paste some code but to learn enough about it to be able to defend the decisions within it, then he's intelligent enough!
There are tests where web access can be given, and there are tests where it doesn't really make sense.
Cases where it's fine to allow web access:
When it's unlikely the candidate could find even 60 percent of the code on the net
When you will ask the candidate to explain the code after he/she has completed it
A very specific solution using an SQL query, which is unlikely to be found on the web
Cases where it's fine not to allow web access:
Some basic programs like recursion, Fibonacci, factorial, string manipulation, small trick programs, etc. There is no need for a computer at all in some of these cases.
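For instance, a candidate should manage something like the recursive factorial below without web access (a hypothetical example exercise, not one from the original answer):

public class Factorial {
    static long factorial(int n) {
        if (n < 0) throw new IllegalArgumentException("n must be non-negative");
        return n <= 1 ? 1 : n * factorial(n - 1);
    }

    public static void main(String[] args) {
        System.out.println(factorial(5)); // prints 120
    }
}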
I'm very sceptical about coding tests during interviews. A lot of the tests I have seen represent very specific (artificial, non-real-world) problems, exactly the kind where you would use the internet to solve them.
I think it's not really important to know how to solve such problems by heart; oftentimes it is much more important that you know how and where to search for answers.
If you want to test the persons during the interview, I think it is better to ask them some conceptual questions instead of a specific programming problem. E.g: questions about object orientation, polymorphism, design of n-tier application, etc. etc.
Or as an example from the ASP.NET world, ask the interviewed person question such as: what is ViewState, what is a postback, what is session-/application-state, etc.
If you want to get an idea of how a candidate will perform in a job, I think it's best to try and make the conditions of the test as close as possible to the actual working conditions.
It should be pretty easy to prevent copy-and-pasters from slipping through the cracks by asking the candidate to explain his/her code.
Well, one thing you want to be aware of is that the developer you hire might not know everything that he will be thrown during the time he is working for you. If you ask him a question that he doesn't know off the top of his head you would want and expect him to research it and come back to you with proof that he understood the concepts that he just learned.
I say let them use the web - but ask them to explain in their own words how their code works. Most of my knowledge comes from online resources. However, I make sure that every line of code I write I understand.
There is a baseline of knowledge that developers in a particular field should have, but you also want to figure out how quickly he can learn new things. A good test, IMO, is to throw him a question you know he doesn't know and see how long it takes him to figure it out using the resources he would have if he were an employee of your company.
If your goal is to see what basic knowledge the candidate has and whether he can code without copying solutions from the web, then don't allow internet access. If you want to see what strategies he employs to get to a solution, let him use the web if he wants to.
I personally find it more interesting if a candidate can solve problems on a larger scale than just solving a simple programming problem. So I tend to ask him about the methods he uses when programming (Unit testing? Ever worked with it? What do you think of it?). This gives me a better picture than coding in an interview situation.
Sometimes it helps if you ask the candidates beforehand to bring a one-page coding sample to take a look at their coding style. This also saves you time during the interview.
It's important to make sure a candidate is resourceful - you don't want your programmer sitting there when they get stuck, not moving forward; you want them to use whatever resources are at hand - be it MSDN, picking someone else's brains, using the web, etc. - to get the job done. Cut-and-paste from the web does seem like cheating, but (a) if you design your task carefully then it will be unique enough for there not to be a standard answer they can copy from the web, and (b) isn't re-using existing code a key part of building software? It's not much different from using third-party libraries to avoid reinventing the wheel. On the downside, of course, you also want them to show they can develop algorithms, so the unique task needs to include some element that requires that, without the solution already being on the web. Trouble is, forums are the Achilles heel of all that, since a candidate can simply ask for the solution and someone, somewhere, is going to hand over the answer unwittingly!
Allow the candidate to use the web but tell him beforehand that if he used the web, you will have to evaluate HOW he solved the problem.
If he used the web for something simple such as finding the syntax or parameters which he forgot, don't mark him down. This is normal.
If he used the web for something like look at how a specific function is used, don't mark him down. This is normal.
If he searched for a specific code and then copy-paste it, then ask him about how the code works. If he can explain how the code works, then there's no reason to mark him down. If he can't explain it without looking at the site where he got the code, you have to mark him down.
If he used stackoverflow.com, check his profile for questions, answers and badges. From there, you can check how good a programmer he is.
It all depends what you want out of your successful candidate. I contest the view that knowing how to google makes you a good programmer, because the simple fact is that the internet is full of bad examples as well as good ones. You don't really want your codebase to reflect how lucky your googler was on the day he cut and pasted all his code off the web. You want it to demonstrate sound practices, proven methodologies and elegant, efficient solutions that your team understands and is enthusiastic about - not a jumble of styles that don't resemble each other. There's a wealth of good to be gotten from knowing how to get help from the interweb, but real knowledge and ancient wisdom are lost every day that people who don't really understand what they are doing are given jobs because they appear to solve problems with their ability to "google it".
If you really want to give your candidates access to the web then by all means do, but make the questions hard and scrutinise the results to see if they've picked the first solution they found or if they've picked the best solution to the problem.
As do many other respondents, I'd rather employ a resourceful developer who know how to use the web to the fullest to draw on other's experiences and previous work, than a developer who limits himself and his applications to the MSDN way of doing things.
I copy other people's code all the time - daily, in fact. The knack of it is finding the right solution quickly and integrating it into your existing work.
So let your candidate use the web, and ask him how he came to his solutions. You might learn more about him from his methods than from how well he can remember previous solutions.
Three things I'd do.
Let applicants send in a coding example along with their cv.
Let applicants produce some real-life code (maybe even pair-program with a developer on your team); this will show you whether they can actually use the tools. The internet is a tool too, so they should be able to use it.
Let applicants solve a problem in pseudo code on a blackboard during the interview. In this case you can be their "internet" by helping them.
These three approaches will show you different things. The first is a good early-warning mechanism but can easily be faked (they could just download OSS code from the web somewhere). The second is good to see if they can actually code, but they might score badly if they're unfamiliar with the tools you use. The third will show you if they can solve theoretical problems, but won't show you if they are actually good team players or if they write maintainable code.
I recently had a friend start talking to me on IM; he was in a coding-test job interview. He had a couple of SQL questions. At first I thought, hell, you've got to do this yourself. I'm not going to help you cheat during an interview.
Then I thought about it again. I've been answering questions and talking to him about various technical issues for years on IM as part of his work. So when he encounters problems in the real world with the job, if he gets hired, he'll do the same thing.
We don't talk about it much, but having a good network of friends to ask questions, and knowing how to search out relevant answers on the net are a big part of being an effective programmer or sysadmin. I've met people who were super smart programmers, but didn't really know how to find information online. They missed a lot, were kind of out of the loop. Knowing how to use resources should be important.
When I do interviews I often ask people what websites they read, what development tools they use, and why. It's a similar thing. Sure, it's not about how they write x lines of code, but it's about how they work.
Now, how do you get around somebody just copying and pasting "answers"? Well, first, don't ask questions which have pat answers. Secondly, when I'm interviewing I like to give people some code, ask them to refactor it, and have them talk through what they are thinking. Then ask them to write some new code which implements a feature. Pair program with them. It's hard to hide an inability to code when pair programming. While they are pairing, it totally makes sense to say, "let's go look up the API on the date-time library."