User Stories - Problems that can't be made user stories [closed] - process

I am from an XP background. I know the process very well and have solid working experience with it. I have found it to be the best way to develop software.
I find myself in the position of a process doctor of sorts, and this creates much self-examination and re-evaluation of my own understanding.
A very common thing I hear is that some work can't be made into stories. I personally don't believe this. The excuses include:
It's too big (the developer will have nothing to show until the end of 5 weeks).
It's a complicated algorithm or abstract concept (it will take 5 weeks to write, with nothing to show).
I am looking for hints, tips, and suggestions as to how to address these and similar perceived problems (and more if you can think of them).
I will mark as the answer the one with the most information on how to get around users/developers who won't write stories and how to address their many excuses (I have only listed a few; there are many more).

Here are a few resources that I've collected over time and that might help:
Patterns for Splitting User Stories
Story Weight Reduction Toolkit
Twenty Ways to Split Stories
Ways to split user stories
Too big or too complicated, there is always a way to put a story on a diet. Maybe you won't obtain the final result in one iteration, but that doesn't mean you can't deliver anything - and, well, there will be more than one iteration.

So basically, your question is: "What can I do if people claim a task is too big for a user story and can't be split up?"
In my experience, almost any problem can be split up. Ask them if they can implement a simplified version, leave out advanced features, maybe even use default values in some places; basically anything to produce something that gives meaningful (i.e. testable) results within one iteration.
Remember: The point of an iteration is not to deliver complete functionality, but just useful and testable functionality.
This splitting can be difficult, but it forces you to consider what you really need first, which is very valuable. The developers may bitch about it (I often do myself :-)), but it's really necessary. Breaking down big tasks into manageable user stories is at the very heart of all agile methods.
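To make that concrete, here is a hypothetical example (both the feature and the split are invented for illustration). Suppose the "too big" story is "As a shopper, I can search the product catalog." A possible diet:
Iteration 1: search by exact product name only, results shown as a plain list.
Iteration 2: partial and keyword matching.
Iteration 3: filters (price, category) and sorting.
Later: relevance ranking, spelling suggestions and other advanced features.
Each slice is small, demonstrable and testable on its own, even though the "complete" search only exists at the end.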
That said, if the task really, really, really cannot be broken down (think complex mathematical algorithm in a research setting, that takes weeks to even understand the basics of), then your iteration is too short. The iteration needs to be long enough to produce meaningful results. And if most of your problems are so hard that they take 2-3 months to get anything done, then that's your iteration length. But I've never seen a project where that was really the case...

Usually when you get "it's too big", what they are really saying is "I only have a vague idea how this should work". You need to work with them to better define it until it becomes possible to split it into logical parts that can be more easily managed.

users/developers who won't write stories
Users aren't supposed to write user stories. They aren't even supposed to tell you user stories. You can expect them to talk about how they work, the problems that bother them, and what they would like to have to facilitate their everyday work.
You, in your turn, are supposed to listen to them and take notes. If they allow it, use a tape recorder or a camera. Then you take the collected information back, replay it, and identify what you call user stories. You discuss them with the team, and when you have agreement you have use cases to target in your development.
What role the developers play is up to you. If they are just coders, they don't take part in the process.
If they partly act as consultants, then they help define user stories.

The "algorithmic specification" problem is common.
Many people prefer to write code and don't really care who the user is or what they do.
I try to get them to focus by asking these questions.
What action can the person take? What could they possibly do with the information? If they have some responsibility, they can take action to deny, approve, hold, reject, reprocess, stop, start, something. If the user can't take any action, you need to ask if they're really stakeholders.
What decision do they have to make? How do they decide which action (if any) to take? We can't automate that decision -- that's why people are in the loop.
What information does this person need to make the decision to take action?
Information-Decision-Action.
We only write software to prepare information for people to make decisions so they can take action.
If that's not the focus, then the stories get out of control.

It's basically the duty and responsibility of the product owner. And I doubt there is any requirement or task that cannot be split into user stories. I found many such discussions on Scrum Master forums.

If the development team claims that the story is too big and cannot fit within the sprint, take their feedback, separate the must-have tasks from the nice-to-have ones, and try to split the story based on that.
Check this flowchart, it can help: http://www.agileforall.com/wp-content/uploads/2012/01/Story-Splitting-Flowchart.pdf

Related

How to hand over a project systematically? [closed]

We had a project handed over from the onshore team to our team (offshore) not long ago, and we had difficulties with the hand-over process.
We couldn't think of any questions to ask during their design walk-through, because we were overwhelmed by the sheer amount of information. We wanted to ask, but we didn't know what to ask. Since they got no questions from us, management thinks that the hand-over was done successfully.
We had tried to go through all the documentation on our company wiki before attending the hand-over presentation, but there were too many documents and we didn't even know where to start.
I wonder, are there any rules or best practices that we can follow to ensure a successful project hand-over, either from us or to us?
Thanks.
In terms of reading the documentation, personally I'd go for this order:
Get a short overview of the basic function of the application - what is it meant to achieve. The business case is probably the best document which will already exist.
Then the functional specification. At this point you're not trying to understand any sort of how or technology, just what the app is meant to do. If it's massive, ask them what the key business processes are and focus on those.
Then the high level technical overview. This should include an architecture diagram, required platforms, versions, config and so on. List any questions you have.
Then skim any other useful looking technical documents - certainly a FAQ if there is one, test scripts can be good too as they outline detailed "how to" type scenarios. Maybe it's just me but I find reading technical documents before I've seen the system a waste - it's too academic and they're normally shockingly written. It's certainly an area I'd limit the time I spent on if I didn't feel I was getting a reasonable return for the time I was spending.
If there are several of you, arrange structured reviews between you and discuss the documents you've read, making sure you've got what you need out of them. If the system is big then each take an area and present it to the others - give yourselves a reason to learn as much as possible, and knowing you're going to be quizzed is a good motivator. Make a list of questions where you don't understand something. Having structured reviews between you will focus your minds and make it more of an interactive task, rather than just trawling through page after page of tedious documents.
Once you get face to face with them:
Start with a full system demo. Ask questions as they come up, don't let them fob you off with unclear answers - if they can't answer something have it written down and task them with getting the answer.
Now get the code checked out and running on your machines. Do this on at least two machines - one they lead, one you lead. Document the whole process - this is the most important step. If you can't get the code running you're screwed.
Go through the build process. Ensure that you can build the app (including any automated build and unit tests they may have). Note that all unit tests should pass - if they don't or if they say "oh, that one always fails" then they need to fix that before final acceptance.
Go through the install process. Do this at least twice, once they lead, once you lead. Make sure that it's documented.
Now come up with a set of common business functions carried out with the application. Use this to walk the code with them. The code base will be too big to cover the whole thing but make sure you cover a representative sample.
If there is a database or an API, do a similar exercise. Come up with some standard data you might need to extract, or some basic tasks you might need to carry out using the API, and spend some time working through these with them (a sketch of what such an extraction task might look like follows this list).
Ask them if there's anything they think you should know.
Make sure that any questions you've written down anywhere else are answered.
You may consider it worth going through the bug list (open and closed) - start with the high-priority ones and talk through anything that looks particularly worrying. Even if they've fixed it, it may point at a bit of code which is troublesome.
And finally if the opportunity exists - if there are any outstanding bugs or changes, see if you can pair program a couple.
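As an example of the kind of "standard data extraction" task meant above - note that the "orders" schema and the rows here are invented placeholders; in a real handover you would run such a query against a throwaway copy of the actual database:

```python
# Sketch: a data-extraction exercise to walk through together during handover.
# The schema and data are hypothetical, set up in-memory so the sketch runs as-is.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, 19.99), (1, 5.00), (2, 42.50)],
)

# The actual exercise: top customers by revenue.
rows = conn.execute(
    """
    SELECT customer_id, COUNT(*) AS order_count, SUM(total) AS revenue
    FROM orders
    GROUP BY customer_id
    ORDER BY revenue DESC
    LIMIT 10
    """
).fetchall()
for customer_id, order_count, revenue in rows:
    print(customer_id, order_count, revenue)
conn.close()
```

Working a handful of such queries with the outgoing team quickly exposes whether you actually understand the data model.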
Do not finally accept the app unless you are 100% sure you can:
Get the code to compile
Get the code to build (including the database)
Get the application installed
Do not accept that the handover is complete until they have:
Documented anything you picked up on that wasn't covered to your satisfaction
Answered ALL of your questions - a question they won't answer after being asked repeatedly screams of something they're hiding
And grab their e-mail addresses and phone numbers. Even if it's only informal they'll probably be willing to help out if the shit really hits the fan...
Good luck.
My basic process for receiving a handover would be:
Get a general overview of the app, document it
Get a list of all future work that the client expects
... all known issues
... any implementation specifics
As much up-to-date documentation as they have
If possible, have them write some tests for critical components of the system, or at least get those components thoroughly documented (a minimal sketch of such a test follows this list)
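As an illustration only - the calculator class and its 20% tax rule below are invented, not from any handed-over project - a "critical component" test can be as small as this:

```python
# A minimal characterization test (pytest style). InvoiceCalculator is a tiny
# stand-in for a real critical component; in reality you would import the real one.
from decimal import Decimal


class InvoiceCalculator:  # hypothetical stand-in for handed-over code
    def __init__(self, tax_rate: Decimal):
        self.tax_rate = tax_rate

    def total(self, net: Decimal) -> Decimal:
        return net * (1 + self.tax_rate)


def test_invoice_total_includes_tax():
    calc = InvoiceCalculator(tax_rate=Decimal("0.20"))
    # The outgoing team confirms this is the expected behaviour today;
    # if this test ever fails, the new team knows something changed.
    assert calc.total(net=Decimal("100.00")) == Decimal("120.00")
```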
If there is too much documentation (quite possible), just confirm that it is all up to date, and make sure you find out from them where to start if that is not clear.
Ask as many questions as possible; anything that comes to mind, because you may not have the chance again.
Most handovers, perhaps all of them, will cause a lot of information to be lost. The only effective way to perform a handover that I have seen is to do it gradually. One way to do that is to allow a few key people from Phase One to stay on the project well into Phase Two.
The extreme solution is to get rid of all handovers, and start using an Agile mindset.
As a start, define the exit criteria for the handover. This should be discussed, negotiated and agreed with both parties and make sure higher management knows this. Then write up a checklist of all things needed to achieve the exit criteria and chase it.
Check out "Software Requirements" and "Software Requirement Patterns" for ideas on questions to ask when gathering information about a project. I think that, just as they work for new development, they would also help you come to terms with an existing project.

Scrum, but with no testing or documentation [closed]

What do you do when you join a team that says they use Scrum, but only use it as a time-management tool and not the whole process?
How can I reinstate testing and documentation?
I was thinking to start off with adding user stories specifically for testing and documenting.
Perhaps someone else has more experience with this than I do, as I am sure it's not that uncommon.
The key to Scrum is that a task be identifiable as "done" before it can be classed as done. How does your company assess whether something is done without reviewing documentation and tests?
Perhaps they have an unusual, but valid, way of doing it. Or perhaps they have missed the point of "done" tasks. I'd suggest you start by asking them how they measure "done" and whether that could be improved. Then suggest documentation and testing as ways of improving the process.
Note that neither testing nor documentation are in fact part of Scrum. Scrum is a pure project management approach - the required engineering practices, like the ones you mention, are supposed to "emerge" during the project. And most specifically, they are supposed to be identified during the heartbeat retrospectives that you do at the end of every sprint. Are you doing those? Can you bring up your concerns there - and are they actually the biggest concerns the team has?
Is the issue that they don't have any documentation and tests, or that they aren't implementing the entire Scrum methodology? Those are 2 very different problems in my mind.
I would much prefer an organization that has taken the time and effort to find and fit a development process that matches their development style as opposed to mandating down from on high the one true process. So I would not be concerned at all if they were using a process that they called Scrum but that didn't meet all the "official" guidelines. Try to determine why the process is the way it is. Chances are that if they have taken the time to tailor it, the team will be receptive to your ideas, especially if you have taken the time to determine why things are the way they are. If you simply approach it as "this isn't Scrum and so isn't right", you will probably not make much headway, but by being pragmatic about the benefits you can likely make some substantial improvements.
Alternatively, if they aren't doing testing and don't have any documentation I would consider that a fairly bad sign. And by documentation I am taking the minimalist view here - a list of features, bug tracking, etc. - I would be very concerned by the absence of these items, less concerned by the absence of items higher up the abstraction list. In the absence of support from management, I would suggest you lead by example. Take it on yourself to setup a simple bug tracking system (there are several - in a pinch, simple text lists in a central location work as well). Don't declare your features complete until someone else has tested it. This can be as simple as walking over to another developer and asking them to try it in front of you. If someone claims a feature is complete, take a few minutes to familiarize yourself with it. If you find a bug, politely mention it to the responsible developer. Slowly build an environment where the team can see the benefits of running tests and tracking features and bugs.
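As a sketch of what "simple text lists in a central location" can mean here - the format and entries below are invented, just one possibility:

```
BUGS.txt (kept in version control, one line per issue)
#014 OPEN  checkout: total ignores discount when cart has one item
#013 FIXED login: error page shows raw exception text (fixed in rev 4821)
```

Even something this crude gives the team a shared, reviewable record of features and bugs, and it costs minutes to set up.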
Most teams operate in this manner simply because of a mistaken belief that they don't have time to "do it right", or that they will get to it later. Often this will occur when a simple proof-of-concept done by a developer or two as a side-project turns into a full-on development effort. By showing that it can actually save time and effort, and reducing the initial costs to the rest of the team, you will often find that it becomes ingrained as part of the process without ever actually being officially endorsed or accepted.
If you have management support it will make it much easier, but always be careful to make sure that the team is receptive to the changes. This may mean it takes longer than you want, but so be it, without the team's support any mandated process will fail at the first sign of pressure, which is when you need the process the most.
*Disclaimer - On my last project I spearheaded the movement to tailor the SCRUM process to fit our environment. The "official" process was simply untenable for our client, but it was still an invaluable guide in tailoring our process.
"adding user stories specifically for testing and documenting"
While meta-user stories might make sense in some circles, it rarely works out well. Software folks rarely cope well with meta-user stories: they either don't get the idea that they can change their own processes by writing a story, or -- more typically -- they engineer the meta-user story to death.
When you're interviewing users, it feels like they're making the user story up. Certainly, you're making it up as you listen to them and try to capture it.
When an IT organization tries to make up its own user stories about how IT should work, the process falls apart. Until the organization has done the thing (testing, for example) a bunch of times manually, they're not really qualified to write user stories. Then, after they've done it, they don't need software development processes, they'll just automate the important bits a little at a time.
I think change has to come from a less formal direction. Actually balking at calling something "done" that hasn't been tested is a good starting point.
IT doesn't do things unless forced. So, meet the users and find out why they're not requiring testing. Coach them to require testing. Tell them the consequences and the words to use.
A lot can go wrong in an organization to lead to poor processes. It's important to know what's wrong, and create a demand for change. The best possible thing is to have your boss complaining that you're not fixing it, rather than you suggesting that perhaps it would be good to fix it.
[It doesn't feel right when your boss demands you fix the process, but it's about the only way change will happen.]

Low Friction Minimal Requirements Gathering [closed]

How can our team gather requirements from our "Product Owner" in as low friction yet useable of a way as possible?
Now here are the guidelines - no posts saying it can't be done, or that the business needs to decide that it cares about quality, yada yada. The product I work on belongs to a small group that has been successful for years. I just want to help them step it up a notch.
Basically, I'm on a 6- or 7-person team with one Product Owner. She does a great job but is juggling a few different roles (as I believe is common on extremely small teams). Usually requirements are given at sporadic times (email conversations, face-to-face discussions, meetings, etc.). They are never entered into a system, and sometimes this results in features missing a release, or the release getting pushed back because everyone forgot about the necessary feature.
If you're in a similar situation but you found a way to overcome this, I'd love to hear it. I'm happy to write code to help ease this situation but it can't be a web site that the Product Owner has to go to in order to get anything done. She is extremely busy and we need some way of working together as a team in order to gather these requirements.
I'm currently thinking of something like this: Developers and team members gather requirements discussed in face to face meetings and write some quick notes on the features discussed on a wiki page. Product owner is notified whenever these pages are updated and it then becomes her responsibility to ensure accuracy.
Pros: We'll have some record of the features. Cons: The developers are taking responsibility for something that they ordinarily wouldn't. I'm okay with that here. I think in this situation it's teamwork.
Of course once we do this, then we're going to see that the product owner probably doesn't have enough time to ensure feature accuracy. Ultimately she is overburdened and I think this will help showcase that fact, but I just need to be able to draw attention to that first.
So any suggestions?
P.S. Her time is extremely limited, so it is considered unreasonable to expect her to type in the requirements after a discussion. She only has time to discuss them once and move on.
Although the concept of "product owner" is a little ambiguous to me, I think I am working in very similar circumstances: the customer is extremely busy and is always the bottleneck in developing requirements.
On the surface, what we try to do in this situation is quite obvious and seemingly simple: we try to make sure that the customer is involved in "read-only / talk-only" mode. No writing. Minimum reading. Mostly talking.
The devil, of course, is in details. So, here are some specifics about our process (in no particular order):
We often start from recording problem statements, which are the ultimate sources of requirements. In fact, sometimes a problem statement is all that we record initially, just to make sure it does not get lost.
NB: It is important to distinguish problem statements from requirements. Although a problem statement sometimes clearly implies some requirement, in general a single problem statement may yield a whole bunch of requirements (each having its own severity and priority); moreover, sometimes a given requirement may define a solution (usually just a partial one) to multiple problems.
One of the main reasons of recording problem statements (and this is very relevant to your question!) is that semantically they are somewhat "closer to the customer's skin" and more stable than requirements derived from them. I believe those problem statements make it much easier and quicker to put the customer into proper context whenever he has time to provide feedback to the development team.
We do record all the requirements (and back-track them to problem statements), regardless of when we are going to implement them. Priorities govern the order in which requirements get implemented. Of course, they also govern the order in which the customer reviews unfinished requirements.
NB: A single fat document containing all requirements is an absolute no-no! All the requirements are placed in "problem tracking database", along with bug reports. (A bug is just a special case of a problem in our book.)
We always try to do our best to minimize the number of iterations necessary to "finalize" each requirement (or a group of related requirements). Ideally, a customer should have to review a requirement only once.
Whenever the first review turns out to be insufficient (happens all the time), and the requirement in question is complex enough to require a lot of text and/or illustrations, we make sure that the customer does not have to re-read everything from scratch. All the important changes/additions/deletions since the previously reviewed version are highlighted.
While a problem or requirement remains in an unfinished state, all the open issues (mostly questions to the customer) are embedded into the document and highlighted. As a result, whenever the customer has time to review requirements, he does not have to call a meeting and solicit questions from the team; instead the customer can open any unfinished document, see what exactly is expected from him, and then decide the best way and time (for him) to address any of the open issues. Sometimes the customer chooses to write an email or add a comment directly to the problem document.
We try our best to establish and maintain official domain vocabulary (even if it gets scattered across the documentation). Most importantly, we practically force the customer to stick to that vocabulary.
NB: This is one of the most difficult parts of the process, and the customer tries to "rebel" from time to time. However, at the end of the day everybody agrees that it is the only way to make precious meetings with the customer as efficient as possible. If you have ever attended one-hour meetings where 30 minutes were spent just getting everybody on the same page (again), I'm sure you would appreciate having a vocabulary.
NB: Whenever possible, any changes in the official vocabulary get reflected in the very next release of the software.
Sometimes, a given problem can be solved in multiple ways, and the right choice is not obvious without consulting with the customer. It means that there will be a "menu of requirements" for the customer to pick from. We document such "menus", not just the finally chosen requirement.
This may seem controversial and look like an unnecessary overhead. However, this approach saves a lot of time whenever the customer (usually few weeks or months down the road) suddenly jumps in with a question like "why the heck did we do it this way and not that way?" Also, it is not such a big deal to hide "rejected branches" using proper organization/formatting of requirements documentation. Boring but doable. :-)
NB: When preparing "menus of requirements", it is very important not to overdo them. Too many choices or too many choice nesting levels - and the next review may require much more customer's time than really necessary. Needless to say that the time spent on elaborated branches may be totally wasted. Yes, it is difficult to find some balance here (it greatly depends on the always-in-a-hurry customer's ability to think two or more steps ahead and do it quickly). But, what can I say? If you really want to do your job well, I am sure that after some time you will find the right balance. :-)
Our customer is a very "visual" guy. Therefore, whenever we discuss any significant user interface elements, screen mockups (or even lightweight prototypes) often are extremely helpful. Real time savers sometimes!
NB: We do screen mockups exclusively for the customer, only in order to facilitate discussions. They may be used by developers too, but in no way do they substitute user interface specifications! More often than not, there are some very important UI details that get specified in writing (now - primarily for developers).
We are lucky enough to have a customer with a very technical background. So we do not hesitate to use UML diagrams as discussion aid. All kinds of UML diagrams - as long as they help customer to get into proper context quicker and stay there.
I am talking about requirements-level UML diagrams, of course. Not about implementation-level ones. I believe that even not very technical people can start digging requirements-level UML diagrams sooner or later; you just have to be patient and know what to put on a diagram.
Obviously, the cost of such process greatly depends on analytical and writing skills of the team, and of course on the tools that you have at your disposal. And I must admit that in our case this process appears to be quite expensive and slow. But, taking into account the very low rate of bugs and low rate of "vapor-features"... I think, in the long run, we get very good payback.
FWIW: According to Joel's nice classification of software products, this project is an "internal" one. So we can afford to be as agile as our customer can handle. :-)
"Developers and team members gather requirements discussed in face to face meetings and write some quick notes"
Start with that. If you aren't taking notes, just make one small change. Take Notes. Later, you might post them to a wiki or create a feature backlog or start using Scrum or bugzilla or something.
First, however, make small changes. Write stuff down sounds like something you're not doing, so just do that and see what improves and what you can do next. Be Agile. Work Incrementally.
You might want to be careful of the HiPPO in the room. The Highest Paid Person's Opinion is not always a good one. We've tended to focus more on providing great tools and support for developers. These things, done right, take some of the hassle out of development, so that it becomes faster and more fun. Developers are then more flexible in terms of their workload, and more amenable to late-breaking changes.
One-Click testing and deployment are a couple of good ones to start with; make sure every developer can run up their own software stack in a few seconds and try out ideas directly. Developers are then able to make revisions quickly or run down side paths they find interesting, and these paths are often the most successful. And by successful I mean measured success based on real metrics gathered right in the system and made readily available to all concerned. The owner is then able to set the metrics, which they probably care about, rather than the requirements, which they either don't care about or have no experience in defining.
Of course it depends on the owner and your particular situation, but we've found that metrics are easier to discuss than requirements, and that developers are pretty good at interpreting them too. A typical problem might be that customers seem to spend a long time filling their shopping carts but don't go on to checkout.
1) A marketing requirement might be to make the checkout button bigger and redder.
2) The CEO's requirement might be to take the customer straight to checkout, as the CEO only ever buys one item at a time anyway.
3) The UI designer's requirement might be to place a second checkout button at the top of the cart as well as the existing one at the bottom.
4) The developer's requirement might be some Web 2.0 AJAX widget that follows the mouse pointer around the screen.
Who's right?
Who cares... the customer probably saw the ridiculous cost of delivery and ran away. But redefine the problem as a metric, instead of a requirement, and suddenly the developer becomes interested. The developer doesn't have to do 10 rounds with the CMO on what shade of red the button should be. He can play with his Web 2.0 thing all week, and then rush off the other three solutions on Monday morning. Each one gets deployed live for 48 hours, and the cart-to-checkout rate gets measured and reported instantly. None of it makes any difference, but the developer got to do their job and the business shifts its focus onto the crappy products they sell and the price they gouge on delivery.
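If it helps to see the metric side concretely, here is a minimal sketch; the event counts are invented for illustration, and in practice they would come from logs or an analytics system:

```python
# A minimal sketch of the cart-to-checkout metric described above.
def cart_to_checkout_rate(carts_filled: int, checkouts_started: int) -> float:
    """Fraction of filled carts that went on to checkout."""
    if carts_filled == 0:
        return 0.0
    return checkouts_started / carts_filled


# Hypothetical counts: measure each variant live for 48 hours and compare.
baseline = cart_to_checkout_rate(carts_filled=1200, checkouts_started=240)
variant = cart_to_checkout_rate(carts_filled=1150, checkouts_started=253)
print(f"baseline={baseline:.1%} variant={variant:.1%}")
```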
Well, OK, so the example is contrived. There's a lot of work in there to make sure that the project is small, the team is experienced, hot deployment is simple, instant rollback is provided, and everyone's on board. What we wanted to get to is a state where the developer's full potential is not wasted, which is why they're involved not just from the start, but also in the success. We once started with an issue like "the number of clicks during registration is too high", ran it through a design committee, and found that the number of clicks actually went up in the design specification. That was our experience anyway. But leave the developer some freedom to just reduce the number of clicks and you might actually end up with a patented solution, as we did. Not that the developer cares about patents, but it had merit - and no clicks!

Coding Test - allow use of web? [closed]

During hiring a .NET web developer I give the candidate a coding test.
I tend to limit the candidate to MSDN installed on the test server - I think it holds everything the candidate needs to complete the task.
I admit, this is not the normal case as I don't expect the candidate to do his work without use of the web.
On the other hand, I don't want the candidate to google for a complete example and copy-paste it; I want to evaluate his skills.
The question is do I need to allow free use of the web during the test?
If you think the whole coding test is wrong - I would like to hear alternatives you may have for me.
As you say, "I don't expect the candidate to do his work without use of the web", so why not allow it during the test too? And what if he does copy and paste? I do that too. Surely the key is to know where to look, to be discerning with what you find, and to apply it intelligently. Do you want to hire someone with a terrific memory or someone who can develop software for you?
When I was at school, calculators were just becoming affordable. As their use was seen as unavoidable, the exams were changed. Simple number-crunching was no longer tested in the way it was before (it was important then). Rather problem-solving techniques were to be tested.
I usually allow candidates to use whatever resources they want. After they're done, I sit down with them and go through their code together, ask questions like why they chose that particular approach etc.
If a couple of minutes of googling was enough not just to copy-paste some code but to learn enough about it to be able to defend the decisions within it, then he's intelligent enough!
There are tests where web access can be given, and there are tests where it doesn't really make sense.
Cases where it's fine to allow web access:
When it's unlikely they will find even 60 percent of the code on the net
When you will ask them to explain the code after they have completed it
A very specific solution using an SQL query, which is unlikely to be found on the web
Cases where it's fine not to allow web access:
Some basic programs: recursion, Fibonacci, factorial, string manipulation, small trick programs, etc. There is no need for a computer in some of these cases (a sketch of this kind of exercise follows the list).
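For instance, a "no web access needed" exercise might be as small as this (a sketch; any reasonable language would do):

```python
# A classic no-web exercise: recursive Fibonacci, plus the follow-up question
# of why it is slow and how to fix it (memoization or iteration).
def fib(n: int) -> int:
    """Return the n-th Fibonacci number (fib(0) == 0, fib(1) == 1)."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)


def fib_iter(n: int) -> int:
    """The iterative version a good candidate might offer when asked about efficiency."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a


assert [fib(i) for i in range(8)] == [0, 1, 1, 2, 3, 5, 8, 13]
assert fib_iter(30) == 832040
```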
I'm very sceptical about coding tests during interviews. I think that a lot of the tests I have seen represent very specific (artificial, non-real-world) problems that you would use the internet to solve.
I think it's not really important to know how to solve such problems by heart - oftentimes it is much more important that you know how and where to search for answers.
If you want to test the persons during the interview, I think it is better to ask them some conceptual questions instead of a specific programming problem. E.g: questions about object orientation, polymorphism, design of n-tier application, etc. etc.
Or as an example from the ASP.NET world, ask the interviewed person question such as: what is ViewState, what is a postback, what is session-/application-state, etc.
If you want to get an idea of how a candidate will perform in a job, I think it's best to try and make the conditions of the test as close as possible to the actual working conditions.
It should be pretty easy to prevent copy-and-pasters from slipping through the cracks by asking the candidate to explain his/her code.
Well, one thing you want to be aware of is that the developer you hire might not know everything that will be thrown at him during his time working for you. If you ask him a question that he doesn't know off the top of his head, you would want and expect him to research it and come back to you with proof that he understood the concepts he just learned.
I say let them use the web - but ask them to explain in their own words how their code works. Most of my knowledge comes from online resources; however, I make sure that I understand every line of code I write.
There is a baseline knowledge that developers in a particular field should have, but you also want to figure out how quickly he can learn new things. A good test, IMO, is to throw him a question you know he doesn't know the answer to and see how quickly he can figure it out using the resources he would have as an employee of your company.
If your goal is to see what basic knowledge the candidate has and whether he can code without copying solutions from the web, then don't allow internet access. If you want to see what strategies he employs to get to a solution, let him use the web if he wants to.
I personally find it more interesting if a candidate can solve problems on a larger scale than just solving a simple programming problem. So I tend to ask him about the methods he uses when programming (Unit testing? Ever worked with it? What do you think of it?). This gives me a better picture than coding in an interview situation.
Sometimes it helps if you ask the candidates beforehand to bring a one-page coding sample to take a look at their coding style. This also saves you time during the interview.
It's important to make sure a candidate is resourceful - you don't want your programmer sitting there stuck, not moving forward; you want them to use whatever resources are at hand - be it MSDN, picking someone else's brains, using the web, etc. - to get the job done. Cut-and-paste from the web does seem like cheating, but (a) if you design your task carefully then it will be unique enough for there not to be a standard answer they can copy from the web, and (b) isn't re-using existing code a key part of building software? It's not much different from using third-party libraries to avoid reinventing the wheel. On the downside, of course, you also want them to show they can develop algorithms, so the unique task needs to include some element that requires that, without the solution already being on the web. Trouble is, forums are the Achilles heel of all that, since they can simply ask for the solution and someone, somewhere, is going to hand over the answer unwittingly!
Allow the candidate to use the web but tell him beforehand that if he used the web, you will have to evaluate HOW he solved the problem.
If he used the web for something simple such as finding the syntax or parameters which he forgot, don't mark him down. This is normal.
If he used the web for something like look at how a specific function is used, don't mark him down. This is normal.
If he searched for a specific code and then copy-paste it, then ask him about how the code works. If he can explain how the code works, then there's no reason to mark him down. If he can't explain it without looking at the site where he got the code, you have to mark him down.
If he used stackoverflow.com, check his profile for questions, answers and badges. From there, you can check how good a programmer he is.
It all depends what you want out of your successful candidate. I contest the view that knowing how to google makes you a good programmer because the simple fact is that the internet is full of bad examples as well as good ones. You don't really want your codebase to reflect how lucky your googler was on the day he cut and pasted all his code off the web. You want it to demonstrate sound practices, proven methodologies & elegant, efficient solutions that your team understand and are enthusiastic about. Not a jumble of styles that don't resemble each other. There's a wealth of good to be gotten from knowing how to get help from the interweb but real knowledge and ancient wisdom is being lost every day that people who don't really understand what they are doing are given jobs because they appear to solve problems with their ability to "google it".
If you really want to give your candidates access to the web then by all means do, but make the questions hard and scrutinise the results to see if they've picked the first solution they found or if they've picked the best solution to the problem.
As do many other respondents, I'd rather employ a resourceful developer who know how to use the web to the fullest to draw on other's experiences and previous work, than a developer who limits himself and his applications to the MSDN way of doing things.
I copy other peoples code all the time - daily in fact. The knack of it depends on finding the right solution quickly and integrating it into your existing work.
So let your candidate use the web, and ask him how he came to his solutions. You might learn more about him from his methods than from how well he can remember previous solutions.
Three things I'd do.
Let applicants send in a coding example along with their CV.
Let applicants produce some real-life code (maybe even pair-program with a developer on your team); this will show you whether they can actually use the tools. The internet is a tool too, so they should be able to use it.
Let applicants solve a problem in pseudo code on a blackboard during the interview. In this case you can be their "internet" by helping them.
These three approaches will show you different things. The first is a good early warning mechanism but can easily be faked (they could just download oss code from the web somewhere). The second is good to see if they can actually code but they might score badly if they're unfamiliar with the tools you use. The third will show you if they can solve theoretical problems but won't show you if they actually are good team players or if they write maintainable code.
I recently had a friend start talking to me on IM; he was in a coding-test job interview. He had a couple of SQL questions. At first I thought, hell, you've got to do this yourself. I'm not going to help you cheat during an interview.
Then I thought about it again. I've been answering questions and talking to him about various technical issues for years on IM as part of his work. So when he encounters problems in the real world, if he gets hired, he'll do the same thing.
We don't talk about it much, but having a good network of friends to ask questions of, and knowing how to search out relevant answers on the net, are a big part of being an effective programmer or sysadmin. I've met people who were super smart programmers but didn't really know how to find information online. They missed a lot and were kind of out of the loop. Knowing how to use resources should be important.
When I do interviews I often ask people what websites they read and what development tools they use, and why. It's a similar thing. Sure, it's not about how they write x lines of code, but it's about how they work.
Now, how do you get around somebody just copying and pasting "answers"? Well, first, don't ask questions which have pat answers. Secondly, when I'm interviewing I like to give people some code, ask them to refactor it, and have them talk through what they are thinking. Then I ask them to write some new code which implements a feature. Pair program with them. It's hard to hide an inability to code when pair programming. While they are pairing, it totally makes sense to say, "let's go look up the API on the date-time library."
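To give a flavour of the "give them some code and ask them to refactor it" part - the snippet below is invented for illustration, not from the original answer:

```python
# Deliberately clumsy code to hand a candidate ("refactor this"):
def get_names(users):
    result = []
    for i in range(len(users)):
        if users[i]["active"] == True:
            result.append(users[i]["name"])
    return result


# One refactoring a strong candidate might talk through:
def active_user_names(users):
    """Names of active users, in their original order."""
    return [u["name"] for u in users if u["active"]]


users = [{"name": "Ada", "active": True}, {"name": "Bob", "active": False}]
assert get_names(users) == active_user_names(users) == ["Ada"]
```

What matters is less the final code than hearing the candidate explain why each change is an improvement.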

How can I think like a user? [closed]

We're neck deep in a project right now, schedules are tight (but reasonable). Our general strategy is to get a strong beta done, release it for testing, and get feedback from our testers.
Quite frequently, we're being hit by small things that spiral into long, time-costing discussions. They all boil down to one thing: While we know what features we need, we are having trouble with the little details, things like 'where should this message go' and 'do they need this feedback immediately, or will it break their flow, so we should hold off'?
These are all things that our testers SHOULD catch, but
a) Each 'low priority' bug like this drains time from critical issues
b) We want to have as strong a product as possible
and
c) Even the best testing group will miss things from time to time.
We use our product, and we know how our users use the old version...but we're all at a loss as to how to think like a user when we try to use the new version (which has significant graphical as well as underlying changes).
edit - a bit more background:
We're writing a web app used by a widely-distributed base of users. Our app is a big part of their jobs, but not the biggest (and, of course, we only matter to them when it doesn't work). Getting actual users in to use our product is difficult, as we're geographically distant from the nearest location that serves as an end user (We're in Ohio, and I think the nearest location we serve is 3+ hours away).
The closest we can get is our Customer Service team (who have been a big help, really) but they don't really think like the users either. They also serve as our testers (it really motivates them to find bugs when they know that any they DON'T find may mean a big upswing in number of calls). We've had three (of about 12 total) customer service reps back here most of the week doing some preliminary testing...they've gotten involved in the discussions as well.
Watching someone using the app is a huge benefit to me. Possibly someone who is not entirely familiar with it.
Seeing how they try to navigate, how they try to enter information or size windows. Things we take for granted after creating/running the app hour after hour, day after day.
Users will always try and do things you never expected and watching them in action might bring to light how you can change something that might have seemed minor, but really makes a big impact on them.
Read "Don't Make Me Think".
Speaking generally, you can't. There's not any way you can turn off the "programmer" part of your brain and think like a user.
And you're right about (c), testing groups don't necessarily catch all the bugs. But the best thing you can do is get a testing group comprised of real, honest-to-goodness end users, and value their feedback. Draw further conclusions from their general comments.
If you want to know how your users will see your system, the closest you can get is usability testing with real users. Everything else is just heuristics and experience, and is also subject to error. There's no such thing as a bug-free product, but you should be able to get a "strong" product with usability testing.
Buy a cheap, easy to use video camera and record your testers using the app. Even better, get some people unfamiliar with the app. to use it and video them. It's relatively cheap, and you'd be surprised what it will highlight.
I like the policy of "eating your own dog food" (http://en.wikipedia.org/wiki/Eat_one's_own_dog_food). It brings you one step closer, because you become a user, although you might not think like one.
Try to use your app when you are in a hurry (e.g. someone is waiting for you for dinner).
You will notice all those little things, because you have to wait, you have to go back and forth between the mouse and the keyboard, etc.
And also, make your wife use it. Or your mother.
Another useful test: help someone use it, over the phone. If he can't find the button from your directions, that's probably a bug.
The important thing is to get enough information that you yourself can become a "user". Once you do that you can answer most questions yourself.
The way I always do this is to go talk with them about what they need to do, what they typically do, and how they use their current tools to do it. Then (very important) sit with them while they do it. Make sure you get on with them well enough that you can come back to them with questions about how they handle edge cases you think of later (often the answer will be the appalling "we go around the system manually for that").
I will almost always notice something they are doing that is a royal PITA which they didn't bring up, because they are used to having to do it that way and don't know any better. I will always notice that their 90% typical workflow isn't the easiest workflow the tools provide.
You can't really rely on plain old-fashioned requirements gathering by itself, because that is asking them to think like a developer. They generally don't know what is possible to do with your software, what is easy, and what is hard. Also they typically have no clue on GUI design principles. If you ask them for design input they will just tell you to put any new control on their favorite page, until the thing looks like a 747 control panel.
The problem is often that even the users don't know what they want until they are actually working with the software. Sometimes, a small oversight can be a big usability problem, sometimes a well thought out function that was requested by many users sees only little use.
My suggestions to decrease the risk of not implementing the right usability features:
Take a look at users actually doing their day to day work. Even if they use another software or no software at all. You will be able to determine the artifacts they often need to get their job done. You will see what data they frequently need. Concentrate on the artifacts, data and workflows most used. They should be the most usable. Exotic workflows may be a bit more time consuming for the users than often used workflows.
Use working prototypes of the GUI to let users work through a realistic workflow. Watch them and note what hinders them and what works well. Adjust your prototypes accordingly.
If an issue arises in an often-used part of your software, it is time to discuss it now and in detail. If the issue concerns a seldom-used part, make it a low-priority issue and discuss it if you have the time. If issues or suggestions are low priority, they should stay low priority. If you can't determine whether solution A or solution B is the best, don't run in circles with the same arguments over and over. Just implement one of the solutions and see if the beta testers like it. The worst thing you could do is waste time over tiny issues while big issues need to be fixed.
A software will never be perfect, because the viewpoints of users differ. Some users will think that a minor problem breaks the whole application. Others will live with even severe usability issues. People tend to lend their ear to those who argue the loudest. Get to know your users to separate the "loud" issues from the important ones. It takes experience to do this, and sometimes you will make wrong decisions, but there is no perfect way, only one of steady improvement.
If you can, set aside a certain amount of usability development resources for the rollout phase of your software. Usability issues will arise when people start working with it in a real production environment. Sometimes it is not important to present the perfect software, but to solve issues quickly as they arise.
The flippant (yet somewhat accurate) answer to how to think like a user is put a knitting needle in your ear and push really hard.
The longer response is that we as programmers are not normal and I mean that in a good way. I scratch my head at the number of people who still run executables they receive from strangers in emails and then wonder how their computer got infected.
Any group of people will in time develop their own jargon, conventions, practices and expectations. As a programmer you will expect different things from an operating system than Joe User will. This is natural, to be expected yet hard to work around.
It's also why BAs (business analysts) exist. They typically come from a business or testing background and don't think like programmers. They are your link to the users.
Really though, you should be talking to your users. There's no point debating what users do. Just drag a few in and see what they do.
A usability test group will help - tests focused not on discovering bugs, but on the learning curve of the new design, run by a group of users, not programmers.
I treat all users like malicious idiots.
Malicious because I assume all users are going to try and break my code, do stuff that is not allowed, avoid typing in valid data, and will do anything in their power to make my life hell.
Idiots because, again, I can't assume they will understand simple stuff like phone formats; they will run away screaming if presented with too many choices, and will not make any leap of faith on complicated instructions. The goal is to hold their hand the entire way.
At the same time, it's important to make sure the user doesn't realize you think they're an idiot.
To think like a user, be one. But are these actually bugs that your testers are reporting? Or are they "enhancement requests"? If the software behaves as designed per requirements and they just don't like the way it operates, that's not a bug. That's a failure of requirements and design. Make it work, make it rock solid, make it easy to change and you'll be able to make it what your users want.
I see some good suggestions here, especially observing people trying to use your app. One thing I would suggest is to look at the order in which things are presented to the user on paper forms (if they do data entry from these) and make the final data entry page mimic that order as closely as possible. So many data entry errors (and so much lost data entry speed) come from having to jump around on the page and losing your place.
I did some work for a political campaign this year, and in every case entering data was made much more difficult because the computer screen did things in a different order than the paper inputs. This is particularly important if the form is one that can't be changed to match the computer screen (like a voter registration form; a campaign has to use what the state provides).
Also, be consistent from screen to screen if possible. If it is first name, last name on one form, making it last name, first name on the next will confuse people and guarantee data entry errors.
If you are truly interested in understanding users though I strongly suggest taking a course in Human factors engineering. It is an enlightening experience.
The 'right' way to do this is to prototype (or mock up) your new interface features, and watch your users try to use them. Nothing is as enlightening as seeing a real user try to use a new feature.
Unfortunately, given most projects time and resources, this is not possible. If that is the position you are in I would recommend you discuss in the team who has the best grasp of usability, and then make them responsible for usability decisions - but that person will need to regularly consult real users to make sure his/her ideas are consistent with what the users want.
I'd suggest doing some form of usability testing; I've participated in such in the past, and found them quite useful.
If you were writing a ticketing system, for example, bring up tasks, and ask questions like "how would you update this ticket" or "what do you expect to happen if this button is clicked".
You don't necessarily need a full application, either, in some places screen shots can be used.
You could take the TDD/BDD approach and get the users involved before beta, having them work with you on refining requirements as you write your unit tests. We're beginning to incorporate some of those trends into our current project, and we're seeing fewer bugs in the areas where we have involved the users earlier.
There is no "think like a user" technique, get your hands on someone who knows nothing of the project and throw what you have done at them.
It's the only way to see how the look + feel + functionality present themselves to the end user.
Once you have shocked that person who knew nothing of the product, listen to all of their idiotic (or so you think) complaints, fix them, and arrange every silly cosmetic thing they point out (either by fixing the UI or by improving whatever documentation you had)...
and after you have satisfied the first person you chose to look at your app with zero knowledge of the subject, pick another... and another... until they stop being shocked when they see it, and they don't get stuck in an "OK... what does this do?" kind of phase.
You (as a member of the project, be it the project manager, developer, etc) will never think like a user is my answer to that question.
Old saying: You can make something "fool proof" but you can't make it "Damn-fool proof".
Additionally: When you make something "idiot proof" the world invents a better idiot.
Other than that, I agree with what everyone else said.
Ask someone with absolutely no knowledge, insight or programming experience to use the program and try to figure out every function of the program.
People who would NEVER use such a program are most likely to find bugs.
Think of a new Safari (or Firefox) user who tries to put the URL inside the search field...
As a programmer you assume no one would be that stupid (or, well... unknowing), but people actually find themselves in these situations sometimes. As programmers, we miss these things.