Closed 10 years ago.
I am going to look for a job as a software tester (maybe an SDET), especially for website testing. I only have a vague impression of this area and have a few specific questions:
Among the many documents involved, such as the functional spec and the design spec, which should I pay more attention to? How should I read them from a tester's point of view?
Any good suggestions for writing a test spec?
What should I pay particular attention to when testing websites?
These are just the questions I have for now; I'll update with more shortly.
I'd very much like to hear your thoughts. Many thanks.
Credentials: I'm an SDET with 5 years of experience, 2 of those years testing web applications.
1- I'd say testerab has a pretty good answer. There is no single document that you can invariably rely upon across companies, or even across teams within a single company. Pay attention to whichever documents actually contain the information you need.
I'd augment that answer with this advice: Don't be surprised if the documentation is insufficient. Strike up strong relationships with people who help define the product (the dev, the business owner, the program manager, etc.). You will nearly always be relying on them for some of your specifications, since it is difficult to cover everything on paper (and, as you gain expertise as a tester, you will learn to see things that others don't notice). Try to write down any "verbal specs" as you hear them, and ideally get any requests for specification clarification in writing or email. Gathering them all in a public document is wise, and may help to uncover if two people have very different ideas about what the spec "ought" to be.
2- testerab also has a good answer to this question, here: How Do You Keep Automated Tests in Synch With Test Plans
"1) Who reads it? 2) Who should probably read it, but currently you suspect they don't bother? (Do you know why they don't bother?) 3) What information do they need to get from it? Does it give them that info? 4) How do you currently present that information? Does that work for your readers/non-readers? 5) What sort of feedback do you need to get from the readers of your test plan? 6) Do you have any regulatory requirements that you need to satisfy with your test planning? "
Test plans, like product specs, will vary greatly depending on the needs of your group. If you are in an Agile group you may spend very little time on your test plan, doing little more than outlining the areas you need to cover - or you might not even have a test plan at all, but just a conversation with the team about what will be sufficient testing for everyone to feel confident about making decisions about the product. Other companies will have very specific guidelines you will need to follow.
Cem Kaner's classic book "Testing Computer Software" is slightly outdated, but it is still a good place to start and it discusses test planning. I'd strongly recommend you buy a copy, unless someone can recommend something as authoritative that is more current. Last I heard, this was still the software testing book.
3- I'm having a little trouble understanding this question, but I'll do my best. Do you mean: what, specifically, will you need to know to test websites? First, what do you mean by websites? Do you mean web applications? If so, you will probably need to understand client/server architecture, web services, databases and basic SQL, at least rudimentary security testing, integration testing, and functional testing. You will also benefit from an understanding of (or specialization in) performance testing, load testing, deeper security testing, and familiarity with web GUI testing with Selenium or Watir.
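To make the Selenium suggestion concrete, here is a minimal sketch of a web GUI test written in C# with Selenium WebDriver and NUnit. The URL, element IDs and credentials are hypothetical, and the Selenium.WebDriver and NUnit packages (plus a chromedriver binary) are assumed, so treat it as a sketch rather than a drop-in test:

    // Minimal Selenium WebDriver + NUnit sketch; the URL and element IDs are invented.
    using NUnit.Framework;
    using OpenQA.Selenium;
    using OpenQA.Selenium.Chrome;

    [TestFixture]
    public class LoginPageTests
    {
        private IWebDriver _driver;

        [SetUp]
        public void StartBrowser() => _driver = new ChromeDriver();

        [TearDown]
        public void StopBrowser() => _driver.Quit();

        [Test]
        public void ValidLogin_ShowsWelcomeMessage()
        {
            _driver.Navigate().GoToUrl("https://example.com/login"); // hypothetical URL

            // Fill in the form and submit it.
            _driver.FindElement(By.Id("username")).SendKeys("testuser");
            _driver.FindElement(By.Id("password")).SendKeys("Secret123#");
            _driver.FindElement(By.Id("login")).Click();

            // Assert on something the user can actually see.
            StringAssert.Contains("testuser", _driver.FindElement(By.Id("welcome")).Text);
        }
    }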
Some helpful things for us to know to help you get started:
How much experience do you have, both as a developer and as a tester? If you are just getting started in your career, what is your educational background?
How much experience do you have working with web applications, and in what roles (dev, test, PM, etc.)?
And, you might want to try asking some of these questions over at http://www.softwaretestingclub.com - this is a site for software testers to build community. You will get a lot of good advice and support there, so long as you are active in the community, and many of the most influential software test writers hang out there. If you do stop by there, feel free to look me up!
Hope this helps!
Edit: Added some info to answer q. #2 and to mention Cem Kaner's book.
I'm a developer with 2 years of .NET experience, 1.5 years of previous testing experience, and an ISTQB/ISEB Foundation qualification.
To answer your questions:
1: A test manager will (typically) have a test plan and an awareness of the specification documents to be tested against. Using what a developer is using is a good start. If the development methodology is agile, this will probably be a "user story".
A good way to look at the documents is to go through them, find where individual elements of functionality are specified, and create steps to exercise them (see some of the functional techniques below).
2: What do you mean by "test spec"?
You will need to prioritise the areas of the application that need testing and understand the coverage needed. A "test case spec" (or test script) fits into higher-level documents (like test plans and test strategies) and can be written efficiently and effectively using some Black box (Functional) techniques, including:
Equivalence Partitioning,
Boundary Value Analysis,
Decision Tables,
State Transition Analysis,
Use Case Analysis (which could be based on a user story)
to come up with scripts that contain test cases. These techniques can be looked up online; a small boundary-value example is sketched below.
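As a concrete illustration of boundary value analysis, here is a small NUnit sketch. The "passwords must be 8-20 characters" rule and the PasswordPolicy class are invented for the example; each boundary gets a test case on it and just either side of it:

    // Boundary value analysis expressed as NUnit test cases.
    using NUnit.Framework;

    public static class PasswordPolicy
    {
        // Hypothetical rule under test: passwords must be 8-20 characters long.
        public static bool IsValidLength(string password) =>
            password.Length >= 8 && password.Length <= 20;
    }

    [TestFixture]
    public class PasswordLengthBoundaryTests
    {
        [TestCase(7, false)]   // just below the lower boundary
        [TestCase(8, true)]    // on the lower boundary
        [TestCase(9, true)]    // just above the lower boundary
        [TestCase(19, true)]   // just below the upper boundary
        [TestCase(20, true)]   // on the upper boundary
        [TestCase(21, false)]  // just above the upper boundary
        public void LengthBoundaries(int length, bool expected)
        {
            var password = new string('x', length);
            Assert.That(PasswordPolicy.IsValidLength(password), Is.EqualTo(expected));
        }
    }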
White box (Structural) testing involves an awareness of the code and includes (the sketch below illustrates the difference):
Statement coverage,
Decision coverage
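Here is a small, invented example of why the distinction matters, again assuming NUnit. The first test alone reaches 100% statement coverage, but decision coverage also requires the "false" outcome of the branch to be exercised:

    // Statement coverage vs. decision coverage on a hypothetical Discount method.
    using NUnit.Framework;

    public static class Pricing
    {
        public static decimal Discount(decimal total)
        {
            decimal discount = 0m;
            if (total > 100m)            // the decision
                discount = total * 0.1m; // the only conditional statement
            return discount;
        }
    }

    [TestFixture]
    public class PricingCoverageTests
    {
        // Executes every statement in Discount (full statement coverage),
        // but only the "true" outcome of the decision.
        [Test]
        public void LargeOrder_GetsTenPercentDiscount() =>
            Assert.That(Pricing.Discount(200m), Is.EqualTo(20m));

        // Exercises the "false" outcome too, which decision coverage requires.
        [Test]
        public void SmallOrder_GetsNoDiscount() =>
            Assert.That(Pricing.Discount(50m), Is.EqualTo(0m));
    }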
If you're looking at a website, this may involve JavaScript; QUnit is a testing framework for automating JavaScript testing and would be useful to research. NUnit is a commonly used test framework for .NET applications (including web applications). NUnit was ported from its Java equivalent, JUnit, and has been expanded (most probably owing to the popularity of .NET).
3: I'm not sure what you mean by this. A web application needs to be tested in many different ways; it contains server-side and client-side functionality that will be tested using different techniques, and the testing needs will have to be analysed. It will depend on the project.
As mentioned in other answers there are also other types of testing:
Unit - modular testing of functions at the lowest possible levels
Integration - testing functionality between different functional areas
Regression - testing to ensure that previously working functionality hasn't been broken by changes
System testing (Functional) - ensuring that the code/system under test is working as specified
System testing (Non-functional) - ensuring that aspects of the system that may not be specified are appropriate e.g. performance, load, stress, interoperability, maintainability, reliability, portability, usability
Acceptance (sometimes called User Acceptance Testing or UAT) - ensuring that the system under test is fit for use
As mentioned in other answers, you will be retesting existing defects, and including these in your test scripts is a good idea.
Hopefully this answer has given you a lot of food for thought and a good base for research. Testing qualifications or a role as a Junior Tester in an established team to build your understanding and experience could prove to be very useful.
"Among so many documents, such as functional spec, design spec, which should I pay more attention to? How to view them in a tester's view?"
Being able to extract useful information from many different sources of documentation is a critical skill for a tester, so you're right to identify that as an area you need to look at. The documents you need to look at will vary from project to project, and from company to company, so there isn't one good answer about what document you need to look at - but having good specification analysis skills will mean you'll be able to cope with whatever you're given.
For that, I'd strongly recommend this BBST course on specification based testing - it will show you how to analyse specifications, applying the Satisfice Heuristic Test Strategy model. That should also help you with your second question about writing a test spec.
http://www.testingeducation.org/BBST/BBSTSpecificationTesting.html
I'd recommend the BBST courses in general - the course materials are all available freely online, at the website above.
If you're really serious about testing, you should also consider taking the online course from the Association of Software Testing. The Foundations course is free to members, and you'll get the opportunity to practice your skills online, gain really valuable feedback on how you present yourself and your ideas, and you'll also meet a lot of outstanding testers, both as fellow pupils and as instructors. It's hard work - but if you're willing to put the effort in you will really get a tremendous amount out of it. Being able to discuss the basics with other people will really help you to get a deeper understanding.
my 50c
If you don't have test specs, or any kind of specs, you can transform your bug reports into a test plan.
For each bug report that occurs, create one test item. That way you'll have a list of tests that you can follow when doing regression testing.
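Whether those items end up as a manual checklist or as automated checks, the idea is the same. Here is a sketch of the automated version in C# with NUnit; the bug ID "BUG-142" and the OrderCalculator class are invented for the example:

    // A bug report turned into a permanent regression test.
    using System.Collections.Generic;
    using System.Linq;
    using NUnit.Framework;

    // Hypothetical class under test, included only so the sketch compiles.
    public class OrderCalculator
    {
        private readonly List<decimal> _prices = new List<decimal>();
        public void AddItem(string name, decimal price) => _prices.Add(price);
        public decimal Total() => _prices.Sum();
    }

    [TestFixture]
    public class RegressionTests
    {
        // One test per bug report; the name points back to the ticket.
        [Test]
        public void Bug142_FreeItemDoesNotBreakOrderTotal()
        {
            // Steps to reproduce, taken straight from the bug report.
            var order = new OrderCalculator();
            order.AddItem("book", price: 10m);
            order.AddItem("free-sample", price: 0m);

            // Expected behaviour once the fix is in place.
            Assert.That(order.Total(), Is.EqualTo(10m));
        }
    }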
Closed 11 years ago.
I'm wondering what you do as a programmer that's not programming but is necessary for your task (e.g. local setup, server setup, deployment, etc.). I'm curious to know how many non-programming related tasks people are performing.
For example, when on web development projects I often:
Install servers
Manage user rights/access to servers
Perform backups
Configure IIS/Apache
Setup FTP sites
On non-web projects I often:
Write build scripts
Setup source code management tools/procedures
Probably more stuff I'm not thinking of
Some tasks are more related to programming than others (such as writing build scripts) but others fall outside of my area of expertise (domain setup comes to mind). Just interested to know how many people perform tasks in their jobs that are not programming related.
The sad reality is that non-technical people look at technical people and expect them to know everything that is technology related, not understanding that there are specializations within technology which we might know nothing about.
I often think it is very much like a doctor that specializes in a particular discipline. All doctors have a baseline of knowledge in the medical field, but will not know the specifics of other specializations (a cardiologist will not know as much about anesthesiology and vice versa).
So while I think it is unreasonable for people to expect technologists to know everything, I do think that it is reasonable for them to expect that we know something when it comes to technology.
I think a more important facet of this question is how much one is expected to know about the specific domain where they apply their skills (finance, manufacturing, etc, etc). I think that is incredibly important, as having that domain knowledge makes them much more valuable as a programmer, as they can understand the problems on a deep level, and as a result, provide more comprehensive solutions for them.
Expected? Almost nothing, but everyone's always really happy when you know more.
The more you know outside the narrow confines of programming, the more valuable you are to your employer.
Things that have come up for me:
requirements gathering
writing use cases
evaluating test plans
negotiating with vendors
tax law
revenue recognition rules
ideas about how users behave
basic economic theory
usability guidelines
differences in consumer behavior in different countries
system administration (being a full on sysadmin)
database configuration, optimization, setup (basically being a DBA)
monitoring systems
networking principles and techniques (you'd be amazed how handy a packet trace can be when debugging something...)
being able to evaluate a business plan written by someone else
image manipulation
how to defuse a situation and avoid arguments
how to corner someone and make them commit to something when they don't want to
how to choose battles
I think the non-programming skill I use the most in my programming job is writing. It's really crucial to be able to explain ideas, designs, algorithms, and so on, and you can never count on being around to do it in person (or having the time). I spend a good amount of time at work writing up design documents and other documentation so other engineers can get their heads around my code and algorithms. So I'm really thankful that I had good writing classes in school and can put a sentence together. :-)
Probably depends on the size of the company you work for. As someone who has worked mainly at small to medium sized businesses, I've also been responsible for:
database creation, management, and tuning
supporting the internal applications I launch
managing website certificates
setting up external hosting
and I'm sure there's more as well
Well, since a programmer's primary tool is his computer, I think it's fair to assume some expertise with it. Most of those sorts of things you've described are difficult for someone unfamiliar with computers, but pretty easy (even with little prior experience) for someone who understands the domain and knows how to find and read documentation.
In a big, well-organized business or project, I'd expect someone who was more specifically familiar with those sort of administrative things to take care of them. However, if there's not enough of them to warrant a full-time job, then I don't think it's unreasonable to have anyone competent work on it; and programmers are probably at the head of the queue in that regard.
I find the vast majority of "bugs" discovered by users are configuration problems with the systems on which the application is installed. Having developers that understand the common machine and network setup errors is very desirable.
For example, if an application sends email as part of its operation, it's useful to have developers knowledgeable in DNS and SMTP configuration.
Of course it depends on the size of your business; large organisations can probably shield developers from this by using other specialists.
I realized I'm never hired for the actual job, but as a problem solver. Whether I figure out what's going on, and fix it through code, or software, or something on the network, this seems to be the main perception of what clients want.
This will vary greatly depending on where you are. I've worked with people who know none of this stuff, and people who are experts.
Knowing this will help you greatly. In general it's always better to understand the environment your code is running in. Not understanding the context leaves you somewhat helpless.
Additionally there are often bugs that are not code related but configuration related, for example a page not showing up because of the apache configuration. You're very handicapped in debugging if you don't understand the environment.
People around a workplace probably expect a programmer to be their IT HelpDesk guy... it happens to me around here. Argh.
Where I work, all developers are expected to be able to use Subversion and have to be able to setup and configure Apache and Tomcat on their PC.
The biggest challenge is not the technical issues associated with getting the environment up and running but the domain knowledge required to effectively develop software in a small shop. For me, I work on a lot of different projects from a variety of sources in a mostly isolated development environment. This means that I need to come up to speed on the domain of the project pretty quickly in order to be effective in developing a solution. In the past I've worked on print accounting solutions, active directory management, research survey databases, and currently a quasi-CRM solution for a charitable organization. I wish I only had to know the nuts and bolts of setting up my development and build environment.
It often depends on the size of the company. In a little company, you have to know how to do everything, including systems admin, and network admin, even if your job is focused on programming.
In a big company, you get to see a little slice of the universe, and they often don't like you peeking outside of your box. Not only do you not need to learn everything, they're often unhappy with you if you try.
However, the more you understand about the machines, how they work, and how they function in an operational environment, the easier it is to diagnose problems and write better software. The more you understand about the domain you're writing applications for, the better you are able to differentiate between the users' needs and their desires.
One of the coolest things about being a software developer is you have a life long excuse for sticking your nose into both the technologies and the various business domains. If you've shifted around to a few different industries, you tend to become loaded down with all sorts of interesting tidbits. There is always more to learn ...
Paul.
It's good to expose yourself to other technologies, but I really think it's a bad idea not to fully disclose the fact that you aren't an expert in those areas (esp. domain setup). I've worked with people who thought they could do it all but ended up doing those tasks so poorly that, with all the time (and money) they spent trying to get it right, a consultant would have been paid for several times over.
I've worked at a company where I was responsible for everything "related to a computer" including the domain, PCs, database, custom software, builds, MS Office, PowerPoint, Quickbooks...; a mid-size company where it was development and builds; and a large company where I focus solely on the .Net code for my project (someone else handles the database and another handles reporting).
The mid-size company has been the best experience so far (pretty new at the large company) where I was given enough responsibility to feel useful and had easy access to everyone else to ask questions about those other tasks.
You are not alone out there. The position I signed up for was "ASP.NET Web Developer"... However, my job consists of:
Windows Server Administration
Limited Linux Administration (running top to monitor CPU utilization and changing Apache configs)
LDAP Administration / Tuning
MS SQL Server 2005 Administration / Tuning
Database Development
Crystal Reports Development
Perl Scripts
C# Win32 Development
C# / ASP.NET Web Development
Managing User Access Rights for Windows Servers
Limited Network Troubleshooting
Being in a company that is constantly striving for supreme "Operation Effectiveness" my task list only grows by the day. I did not make up that list either. All of the items mentioned above, I have either touched or supported in the past 3 years I have worked in this company.
That being said, in a good development shop, you should have one specific task. As the saying goes, Jack of all trades ... master of none.
This depends greatly on what you're programming. If you're doing low level device drivers, it's vital that you understand the underlying hardware. If you're doing a standalone Java app, the better you understand the JVM and libraries you're using, the better - but it isn't strictly necessary to know a lot.
In general, the more you understand about your system environment, the better. How much your peers and management expect you to know depends on them.
Ignorance will, eventually, be punished. If not by your peers and management, the world will do it. Check any week's headlines or RISKS digest for examples where ignorance of the system environment caused software failure.
[rant mode on]
Ha, the curse of Excel and Word.
Outside work - particularly with friends and family, but sometimes when consulting or delivering software too - any and all non-technical people expect you to understand these. There's that internal groan when someone asks you across to have a look at a small problem they're having with some facet of Office. And because it's a client and you want to appear helpful, you agree.
There's just this blanket expectation that because you're a developer you have an innate knowledge of configuring spreadsheets, fixing Word templates, and any and all other office techie tasks, and furthermore you can cast your eye over some badly configured Office mess and instantly diagnose what the problem is.
I can only just about manage to put together a spreadsheet to schedule my recurring invoices and set up a Word template to write them. I regularly tell people that too - but no-one ever listens.
It depends a lot on the type of software you're currently developing.
For example, when I was working on software for a local government, I had to learn things like:
What are the rules for registering animals (pets). What are the types of registrations, what discounts apply, what are penalties for not registering on time
How are council rates calculated. How are rates raised yearly (actually, the algorithm for raising yearly rates and its implementation was the most complex task I have met so far).
How are building permits issued. What types of inspections can be performed. Who is involved in the process of issuing a building permit (owner, builder, architect, officers etc.)
How often are water meters read. How are water meters assigned to properties, how many dials are on a water meter, how to detach a water meter from one property and to attach to a different one
What are different pension types. What are different discounts that are granted depending on a pension type.
What are different types of receipts. What different types of terminal printers (those that are used to print small receipts) exist and how to print to them.
What are properties, strata children, what are rules for dividing properties into 'parcels' ...
Well, that's just part of the non-programming stuff that I learned during the 2 years on the project. The most unfortunate thing is that, now that I have moved to a different company, there is very little chance I will ever use any of this knowledge.
My job title is "Senior Software Engineer". In point of fact, for most of the past several years, I did fairly little software development, but did do a lot of:
Systems & web administration
Static web page development with HTML (I don't consider that programming, although I have done PHP, CGI, and JavaScript).
As others have said, help desk sorts of stuff, although not as much as in the past.
As a "task leader", I'm expected to have some people/management skills, although that usually devolves to writing monthly reports. I also get sucked into CMMi stuff from time to time, which in an ideal world might be somewhat relevant, but is usually just record keeping so the employer can bid on new contracts which require it.
Working in a science lab, there's a need to know some of the science, especially if you want/need to work on the code doing the scientific calculations.
Working in a (U.S.) government facility, there's lots of paperwork and a need to know lots of government regulation (e.g. Freedom of Information Act)
Fortunately, I've recently made an internal transfer where I'm doing more development work and less of this other stuff!
Personally, I find that knowing more is always good; it paves the way to the next level. The hardest problems in life are at the integration points. Literally. People focus a lot on specializing, but don't forget that you need people who can straddle both realms.
Closed 11 years ago.
I am about to go for an interview for a software testing summer job. What questions should I ask the professor about this? Also, I have never done software testing before, so any good reference material you can recommend would be appreciated.
thanks
You should be prepared to discuss a variety of testing terms, such as:
"black box" testing, "white box" testing, etc.
unit tests
functional tests
smoke tests
BVTs (Build Verification Tests)
the differences between stress testing and load testing
performance testing
globalization testing
interoperability testing
manual testing vs. automated testing (when?, why?)
API testing
security testing
regression testing
code coverage testing
(etc...)
You likely don't need experience in all of them, but you should express an awareness.
A general knowledge of the following is helpful (refer to IEEE 829 for a start):
- test plans - what should be in a good plan?
- test cases - what should be in a good test case?
- test design specifications
- incident reporting (including bug tracking)
- software specifications - what does one look for?
You should start thinking about how you would test different things. What are the base cases? Are there any boundary cases? What could be wrong with any given product or item? Think creatively...
For a few starting references on testing, I suggest looking at the following:
Cem Kaner's book on software testing
Wikipedia for some more starting points
IEEE 829 (related articles should be sufficient to get you thinking, as the full spec is good for insomniacs)
If you've never done software testing before, it would be a good idea to learn some things quickly.
I'd recommend checking out the Black Box Software Testing course, available free (without an instructor) at http://www.testingeducation.org/BBST, or in an instructor-led version that is free to members of the Association for Software Testing (http://www.associationforsoftwaretesting.org). This is a university-level course, hours and hours of video, supplementary materials, quizzes, self-tests, and pointers to other information.
James Bach and I co-author and teach a course called Rapid Software Testing (http://www.developsense.com/courses.html). The course notes for that are available for free at James' Web site, http://www.satisfice.com/rst.pdf.
I've written a lot of articles on testing for Better Software magazine. They're available free at http://www.developsense.com/publications.html.
In addition, there's a blog post for you: http://www.developsense.com/2009/02/how-can-trainee-improve-his-her-skills.html
There are several testing communities online where you can ask questions and get mentorship. http://www.softwaretestingclub.com and http://www.testrepublic.com are two of them.
Best of luck.
---Michael B.
Besides the questions you will be asked, don't forget the interview is actually a conversation, and you look much better if you ask questions yourself. So let me list a few things I'd ask if I were you :)
For me, when it comes to working as a tester, most important is communication. How well you can communicate with team members, managers, team that develops the software you test.
Do they use some kind of bug tracking system, if so, what system is it? Is it the same system the development team uses?
Does this tool cover most of the communication needs, or are there going to be a lot of calls and email exchanges, resulting in a total mess in discussions about issues?
Is there any automated tool used for testing? This gets you quite close to what are your responsibilities on this position, so will probably be covered in the interview anyway.
Do you get 2 monitors ;) ? (Really, getting a second display was a huge improvement for me in a tester job.) Do you get the tools that make your work faster and more effective?
Terms, definitions and tools are important things... but analytical skills, logic, communication and other skills may be more important.
Maybe it won't be just a summer job, but a career.
Closed 5 years ago.
I'm trying to establish more formal requirements and testing procedures than we have now, but I can't find any good reference examples of the documents involved.
At the moment, after feature freeze, testers "click through the application" before deployment; however, there is no formal specification of what needs to be tested.
First, I'm thinking about a document which specifies every feature that needs to be tested, something like this (making this up):
user registration form
country dropdown (are countries fetched from the server correctly?)
password validation (are all password rules observed, is user notified if password is too weak?)
thank-you-for-registration
...and so on. This could also serve as something the client can sign as part of the requirements before programmers start coding. After the feature list is complete, I'm thinking about making this list the first column in a spreadsheet which also says when the feature was last tested, whether it worked, and if it didn't work, how it broke. This would give me a document testers could fill in after each testing cycle, so that programmers have a to-do list with information about what doesn't work and when it broke.
Secondly, I'm thinking of test cases for testers, with detailed steps like:
Load user registration form.
(Feature 1.1) Check country dropdown menu.
Is country dropdown populated with countries?
Are names of countries localized?
Is the sort order correct for each language?
(Feature 1.2) Enter these passwords: "a", "bob", "password", "password123", "password123#". Only the last password should be accepted.
Press "OK".
(Feature 2) Check thank-you note.
Is the text localized to every supported language?
This would give testers specific cases and a checklist of what to pay attention to, with pointers to the features in the first document. It would also give me something to start automating the testing process with (currently we don't have much test automation apart from unit tests).
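As a sketch of what automating one of these cases could look like, here is test case 1.1 (the country dropdown) written in C# with Selenium WebDriver and NUnit. The URL and the "country" element id are invented, and the Selenium.WebDriver, Selenium.Support and NUnit packages are assumed:

    // Automated version of test case 1.1: the country dropdown is populated and sorted.
    using System.Linq;
    using NUnit.Framework;
    using OpenQA.Selenium;
    using OpenQA.Selenium.Chrome;
    using OpenQA.Selenium.Support.UI;

    [TestFixture]
    public class RegistrationFormTests
    {
        private IWebDriver _driver;

        [SetUp]
        public void StartBrowser() => _driver = new ChromeDriver();

        [TearDown]
        public void StopBrowser() => _driver.Quit();

        [Test]
        public void Feature_1_1_CountryDropdownIsPopulatedAndSorted()
        {
            _driver.Navigate().GoToUrl("https://example.com/register"); // hypothetical URL

            var dropdown = new SelectElement(_driver.FindElement(By.Id("country")));
            var countries = dropdown.Options.Select(o => o.Text).ToList();

            Assert.That(countries, Is.Not.Empty);  // is the dropdown populated at all?
            Assert.That(countries, Is.Ordered);    // is the sort order correct?
        }
    }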
I'm looking for some examples of how others have done this, without too much paperwork. Typically, a tester should be able to go through all the tests in an hour or two. I'm looking for a simple way to make the client agree on which features we should implement for the next version, and for testers to verify that all new features are implemented and all existing features are working, and report it to the programmers.
This is mostly internal testing material, which should be a couple of Word/Excel documents. I'm trying to keep one testing/bugfixing cycle under two days. I'm tracking programming time, implementation of new features and customer tickets in other ways (JIRA); this would basically be the testing documentation. This is the lifecycle I had in mind:
PM makes list of features. Customer signs it. (Document 1 is created.)
Test cases are created. (Document 2.)
Programmers implement features.
Testers test features according to test cases. (And report bugs through Document 1.)
Programmers fix bugs.
GOTO 4 until all bugs are fixed.
End of internal testing; product is shown to customer.
Does anyone have pointers to where some sample documents with test cases can be found? Also, all tips regarding the process I outlined above are welcome. :)
I've developed two documents I use.
One is for more 'standard' websites (e.g. a business web presence):
http://pm4web.blogspot.com/2008/07/quality-test-plan.html
The other I use for web-based applications:
http://pm4web.blogspot.com/2008/07/writing-system-test-plan.html
Hope that helps.
First, I think combining the requirements document with the test case document makes the most sense, since much of the information is the same for both, and having the requirements in front of the testers and the test cases in front of the users and developers reinforces the requirements and provides varying viewpoints of them. Here's a good starting point for the document layout: http://www.volere.co.uk/template.htm#anchor326763 - if you add steps to test, the resulting expectations of each test, and edge/boundary cases, you should have a pretty solid requirement spec and testing spec in one.
For the steps, don't forget to include an evaluate step, where you, the testers, developers, etc. evaluate the testing results and update the requirement/test doc for the next round (you will often run into things that you could not have thought of and should add into the spec...both from a requirements perspective and testing one).
I also highly recommend using mindmapping/work-breakdown-structure to ensure you have all of the requirements properly captured.
David Peterson's Concordion web-site has a very good page on techniques for writing good specifications (as well as a framework for executing said specifications). His advice is simple and concise.
As well, you may want to check out Dan North's classic blog post on Behavior-Driven Development (BDD). Very helpful!
You absolutely need a detailed specification before starting work; otherwise your developers don't know what to write or when they have finished. Joel Spolsky has written a good essay on this topic, with examples. Don't expect the spec to remain unchanged during development though: build revisions into the plan.
meade, above, has recommended combining the spec with the tests. This is known as Test Driven Development and is a very good idea. It pins things down in a way that natural language often doesn't, and cuts down the amount of work.
You also need to think about unit tests and automation. This is a big time saver and quality booster. The GUI level tests may be difficult to automate, but you should make the GUI layer as thin as possible, and have automated tests for the functions underneath. This is a huge time saver later in development because you can test the whole application thoroughly as often as you like. Manual tests are expensive and slow, so there is a strong temptation to cut corners: "we only changed the Foo module, so we only need to repeat tests 7, 8 and 9". Then the customer phones up complaining that something in the Bar module is broken, and it turns out that Foo has an obscure side effect on Bar that the developers missed. Automated tests would catch this because automated tests are cheap to run. See here for a true story about such a bug.
If your application is big enough to need it then specify modules using TDD, and turn those module tests into automated tests.
An hour to run through all the manual tests sounds a bit optimistic, unless it's a very simple application. Don't forget you have to test all the error cases as well as the main path.
Go through old bug reports and build up your test cases from them. You can test for specific old bugs and also make more generalizations. Since the same sorts of bugs tend to crop up over and over again this will give you a test suite that's more about catching real bugs and less about the impossible (or very expensive) task of full coverage.
Make use of GUI and web automation. Selenium, for example. A lot can be automated, much more than you think. Your user registration scenario, for example, is easily automated. Even if they must be checked by a human, for example cross browser testing to make sure things look right, the test can be recorded and replayed later while the QA engineer watches. Developers can even record the steps to reproduce hard to automate bugs and pass that on to QA rather than taking the time consuming, and often flawed, task of writing down instructions. Save them as part of the project. Give them good descriptions as to the intent of the test. Link them to a ticket. Should the GUI change so the test doesn't work any more, and it will happen, you can rewrite the test to cover its intention.
I will amplify what Paul Johnson said about making the GUI layer as thin as possible. Separate form (the GUI or HTML or formatting) from functionality (what it does) and automate testing the functionality. Have functions which generate the country list, test that thoroughly. Then a function which uses that to generate HTML or AJAX or whatever, and you only have to test that it looks about right because the function doing the actual work is well tested. User login. Password checks. Emails. These can all be written to work without a GUI. This will drastically cut down on the amount of slow, expensive, flawed manual testing which has to be done.
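Here is a sketch of what that separation can look like in C# with NUnit; the CountryService class, its hard-coded country lists and the tests are all invented for illustration:

    // "Thin GUI" idea: the logic lives in a plain function that is easy to test
    // without a browser; the GUI layer only formats its output.
    using System.Collections.Generic;
    using System.Linq;
    using NUnit.Framework;

    public static class CountryService
    {
        // All of the real logic lives here, away from the GUI.
        public static IReadOnlyList<string> GetCountries(string languageCode)
        {
            var countries = languageCode == "de"
                ? new[] { "Deutschland", "Frankreich", "Spanien" }
                : new[] { "France", "Germany", "Spain" };
            return countries.OrderBy(c => c).ToList();
        }

        // The GUI layer is a trivial wrapper that needs only a light check.
        public static string ToHtmlOptions(IEnumerable<string> countries) =>
            string.Join("\n", countries.Select(c => $"<option>{c}</option>"));
    }

    [TestFixture]
    public class CountryServiceTests
    {
        [TestCase("en")]
        [TestCase("de")]
        public void CountryListIsNonEmptyAndSorted(string lang)
        {
            var countries = CountryService.GetCountries(lang);
            Assert.That(countries, Is.Not.Empty);
            Assert.That(countries, Is.Ordered);
        }
    }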
Closed 5 years ago.
In your “enterprise” work environment, how are engineers held accountable for performing code inspections and unit testing? What processes do you follow (formal methodology or custom process) to ensure the quality of your software? Do you or have you tried implementing a developer "signoff" sheet for deliverables?
Thanks in advance!
Update: I forgot to mention we are using Code Collaborator to perform our inspections. The problem is getting people to "get it" and buy into doing them outside of a core group of people. As stalbot pointed out below, it is a cultural change, but the question becomes: how do you change your culture to promote quality initiatives such as reviews/unit tests?
• Our company uses peer code reviews. We conduct them as Over-The-Shoulder reviews and invite the team's tester to participate in the meeting to gain a better understanding of the changes. We use source control software with check-in rules that require the code review to be signed off. Nothing big, just the name of another developer who has reviewed the code.
• There are definite benefits to code review as several studies have been able to demonstrate. For our company, it was evident that code quality increased as the number of support calls decreased and the number of reported bugs decreased as well. NOTE: Some of the benefits here were the result of implementing Scrum and abandoning Waterfall. More on Scrum below.
• The benefits of code review can be a more stable product, more maintainable code as it applies to structure and coding standards, and it allows developers to focus more on new features rather than “fire-fighting” bugs, and other production issues. There really aren’t any drawbacks if code reviews are conducted “right”. More on the “right way” below.
• Some of the hurdles to overcome while implementing code reviews are the idea that “big brother” is watching me and the idea that not having perfect code means torture and pain. Getting developers to trust each other is difficult sometimes, especially when it involves “pecking order” or the “holier than thou” attitudes and putting your hard work under a microscope. Trust is the key to resolving these issues. A developer must trust that they will not be punished by peers or management for mistakes in code. It happens to everyone. Make a note of the issue, get it resolved and move on.
Scrum
One of the benefits of using the Scrum methodology is that a development cycle (”sprint”) is short. The time-frame of the “sprint” is determined by what works best for your organization and will need some trial and error, but really shouldn’t be longer than four week iterations. The benefit is that it requires the developers communicate daily and communicate problems early on in the project. This was initially adopted by our development department and has spread to all areas of our company as the benefits of scrum are far reaching. For more information, see: http://en.wikipedia.org/wiki/SCRUM or http://www.scrumalliance.org/ . As the development iterations are smaller, the code review process reviews smaller pieces of code, making the review more likely to find problems than hours or days of formal reviews.
“Right Way”
Code Reviews done the “right way” is completely subjective. However, I personally believe that they should be informal, over-the-shoulder reviews. All of the participants in a review should avoid personally attacking the person being reviewed with statements such as “why did you do it that way?” or “what were you thinking?” etc. These types of comments diminish the trust between peers, leading to animosity, hours of arguing over the best/right way to code a solution. Keep in mind that developers do not think or code exactly the same, and there are many solutions to a problem.
Just a little clarification on over-the-shoulder reviews; these can be conducted via remote desktop sharing (pick flavor here), or in person. However, they shouldn’t be limited to the developers only. We typically invite our entire scrum team which consists of two developers per team, a tester, a documentation person, and product owner. All non-developers are there to gain a better understanding of the changes or new functionality being made. They are free to ask questions or provide input, but not to make coding decisions or comments. This has been effective as certain questions will be asked that may change the direction of the project as the initial requirements may have missed a scenario, but that is what agile is all about, change.
Suggestion
I would highly recommend researching scrum and code reviews, before mandating them. Create the basic rules for each and implement them as part of your culture to achieve a better quality product. It must become part of your culture so that it is part of a natural process and integrated at all levels, as it is a paradigm shift from poor quality, missed deadlines and frustration to better quality products, less frustration, and more on-time deliverables.
If you want to ensure that every changelist gets reviewed, before checkin, then you could have your source control tool reject unreviewed checkins. For example, a trigger could reject checkins without "CodeReview: " in the checkin comment. Although people could still lie, they could also be held accountable.
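As a sketch of the check such a trigger might run (how it is wired into your particular SCM - Perforce trigger, git hook, TFS check-in policy - is not shown here, and the "CodeReview:" convention is just the one suggested above):

    // Reads a checkin comment from stdin and rejects it (non-zero exit code)
    // unless it contains a "CodeReview: <reviewer>" tag.
    using System;
    using System.Text.RegularExpressions;

    public static class CheckinPolicy
    {
        private static readonly Regex ReviewTag =
            new Regex(@"^CodeReview:\s*\S+", RegexOptions.Multiline);

        public static bool HasReviewTag(string commitMessage) =>
            commitMessage != null && ReviewTag.IsMatch(commitMessage);

        public static int Main()
        {
            var message = Console.In.ReadToEnd();
            if (HasReviewTag(message))
                return 0; // accept the checkin

            Console.Error.WriteLine(
                "Rejected: add 'CodeReview: <reviewer>' to the checkin comment.");
            return 1; // reject the checkin
        }
    }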
If you want to ensure that every changelist gets reviewed, after checkin, then you could see if Code Collaborator will play nicely with your source control system and automatically make a review task after each checkin (push or pull; whatever works). After that, use whatever "polite annoyance" features Code Collaborator has, to make sure reviews actually get done.
If you want people to review only some checkins, not all checkins, then good luck with that.
We have a pretty cool setup. Coders are expected to test their code before check-in to ensure that it doesn't break the build, and to write tests where they make sense to have, but high coverage isn't required.
Complex methods are expected to be commented.
At the end of phases code is reviewed by the whole team.
Pair programming. Work items have a required "collaborator" field: the person you paired with for the work.
We lean heavily on ITIL concepts. While we don't need the full scale ITSM that ITIL provides, we have implemented processes that draw from some of the best practices in ITIL, specifically in the areas of Change Management and Release Management.
Code reviews are part of our RM strategy. As a change or new piece of code makes its way through our RM process, a lot of eyes look at it. Ultimately the Release Manager makes the call on approval or rework, and all of this is documented (we use TFS and SharePoint). Formal code reviews are held by the Release Manager and the technical team he selects. The primary developer for a release candidate is held accountable for adherence to standards, functionality, and a verification of a completed test plan. If the quality standards aren't met, the deliverable is rejected and the project schedule is updated to reflect the rework.
Yes, this is all very heavy. I work in government and we have complex laws to follow, specifically in the areas of taxes, ADA compliance, and so on.
We use three basic rules:
1) The developer is responsible for fixing bugs in code when unit tests don't exist. In cases where there is a test, the person breaking the test is responsible for fixing it.
2) Code reviews. There are some code review smells that are a good warning sign, over-defensiveness and blame redirection being the two most common.
3) NO EMAILING CODE, JARs or config files. Everything is in the SCM.
To create the culture, first try to define your standards and values and, most of all, make them known.
Then hire people who believe in them or who are able to adapt to them. Don't hire someone who has no connection at all with your company values.
Make sure that those who respect these values and show improvement are "rewarded", "properly" recognized and seen as models. Don't forget that for many it is not all about the money.
Don't hesitate to take appropriate measures against those who do not fulfill their responsibilities, but make sure they know what those responsibilities are, and hold them accountable for their deeds.
Allow people to become used to any new responsibility.
Making a change in culture is a big deal. Still, there are some ways to approach it:
Create awareness of code review and of the importance of a code review tool. This can be done with training sessions.
Motivate people: give some reward for doing code reviews.
Change the process: make sure that code reviews happen, and happen properly. This can be done with a checklist and by making review part of the release process.
Do not try to change everything at once. Introduce changes slowly. Create a small group to observe and discuss each change in the code review process.
Provide solutions instead of creating problems. The process should not be overhead; it should come naturally. Provide solutions to people's problems related to the process.
Closed 5 years ago.
If you had to do without one or the other in a software project, which would you pick?
I've had plenty of projects in which the client or PM thought they could get away without one or the other. We always paid the price.
Turn this around and repeat after me: "Tests are requirements." :-)
If you mean "formal requirements", I can and easily do without those. I would much prefer a living, breathing customer who can tell me what they want over a rigid, out-of-date document. Having switched to TDD, I wouldn't ever want to go back to a "no test" environment. I choose informal requirements -- stories, on-site customer, and customer-written acceptance tests -- over formal requirements and no tests.
I'd say you could go without Testing rather than Requirements. If you don't have requirements, how do you know what you're developing?
If the programmers are good enough, they should be able to catch most of the egregious errors that testing would find.
You have to test against the requirements, so if you don't have requirements you can't do testing. So if you have to pick one, you can only pick requirements.
But not doing testing is a path to failure. Guaranteed.
If I had to pick one, it would be requirements.
It doesn't have to be a formal, excruciatingly detailed document with twenty signatures, but you have to know exactly what the customer wants and more importantly what the customer needs.
The requirements are also your first communication to the development team. How will they know what you're asking if you're not asking it clearly? At best you're at grave risk of building the wrong thing right. I'd rather have the right thing built slightly wrong.
If I were asked to choose between requirements or testing I would choose to polish up my resume. You really can’t do without either in any projects because the basic project lifecycle is:
Define Needs/Goals (AKA Requirements)
Design & Build to the requirements
Verify that you built to spec (to requirements.)
If you don't have success criteria and goals that are verifiable (and are then verified), how can you ensure that you are going to succeed? And if you don't have a chance to succeed, why start the project?
I would say requirements because there always seems to be some level of "feature creep" from the client when you are developing software. Testing is one of the crucial pieces in the SDLC.
Requirements and testing are important for most projects but if you really have to pick, you should go with requirements. One of the advantages of picking requirements over testing is that, you might save some development time since the developers know what they have to build, and if the development is done with extra time in hand, you can allocate that time for testing :)
Tests (feature and integration) are more important than requirements; if you can specify the tests, then you have also specified the requirements, at least implicitly.
Comments are also the developer documentation, with unit tests being the how-to 'quickstart' examples ;-)
I'm not sure if "requirements" here refers to requirements as an artefact or as a process. Although it is possible to skip requirements as an artefact, especially for smaller teams, and still deliver a product, skipping requirements as a process is out of the question. Requirements as an artefact let you model the system at a cost lower than building the entire thing, do feasibility studies and estimates, and, for a larger and more dispersed team, cut communication overhead and establish common ground. Neglect the requirements and you get lousy estimates (regardless of whether you plan a lot up front or just do a short sprint), a poor idea of feasibility, and possibly very inefficient communication and a lot of miscommunication.
Requirements as a process, on the other hand, is going to exist regardless of whether it is formally acknowledged or not. You cannot really exclude it; you can pretend the requirements process does not exist, or fold it into design, coding, testing, or stages as late as pilot and maintenance. Obviously, treating the process this way means it will not get a fair amount of attention and resources. The consequences normally range from delivering something that is ultimately useless to having to fix the now-obvious shortcomings of the product later in the development cycle, or even discovering the real requirements once the product fails in the field, increasing the cost of development, defaulting on deadlines, ruining the team's good name, destroying user confidence, etc.
Testing usually boils down to validation and verification; more recently, improvements in testing technology have let automated testing be used as a solid tool for achieving greater efficiency in debugging and reducing the time necessary for regression testing. Validation is making sure that the team has built the right product, i.e. the scoped requirements are correct, not contradictory, and have no gaps. Verification, on the other hand, is making sure that the product is built right: no technical defects, accidental errors, etc.
As we can see, testing provides a safety net in the scenario where requirements were neglected. Normally, as the team starts testing, they need to refine their understanding of the requirements and, as a result, modify the software. Since both requirement artefacts and the software itself just represent different levels of fidelity in modelling a solution to a real-life problem, and software as a model is an order of magnitude more precise, testing the application evaluates the requirements as well (regardless of whether they are implicit or explicit, formally analysed or informally communicated).
Normally, the alternative to testing is to let users report a substantially larger number of defects and shortcomings and to try to fix them as part of maintenance (meaning later in the product lifecycle), increasing the cost of every fix.
So, requirements versus testing? Fire the manager. OK, skip the requirements if you want the project schedule to slip during the testing phase and to get yourself into the mess of building something other than what users need; skip the testing if you just want to show utter disrespect to your users.
Without requirements you don't need testing since what you end up with is exactly what was spec'd
There are categories of software that can be developed perfectly well without requirements, at least anything more than a vaguely expressed idea the length of an email.
Thing is, if you have a specific client, and a project manager, it is unlikely your software is in one of them. It's unlikely someone is specifically paying you to, say, 'make me a fun game involving a juggling monkey'.
The only category of software that can be developed without testing is failware: where your company has managed to sucker some customer into paying whether or not the software works (or if you have a really dumb customer, pay more if it doesn't work, in support and maintenance).
That's probably more likely: contracts structured so that success is less profitable than failure are still fairly common. If you think that's the case, and you want to develop working software, then consider switching to a job where your interests and your bosses' are less opposed.
Without requirements, can we even make a test plan? We can't do testing even if we pick testing instead of requirements.
So requirements should be the priority, even if you are considering an agile testing environment.