How can a large number of developers write software together without either a cumbersome process or poor quality software?

I work at a company with hundreds of people writing software for essentially the same product. The quality of the software has to be high because so many people depend on it (not least the developers themselves). Because of this every major issue has resulted in a new check - either automated or manual.
As a result, the process of delivering software is becoming ever more burdensome, so it requires more developers, which... well, you can see it is a vicious circle.
We now have a problem with releasing software quickly - the lead time even to change one line of code for a very serious issue is at least one day.
What techniques do you use to speed up the delivery of software in a large organization, while still maintaining software quality?

I also work in a large and cumbersome organization. I've had success implementing several agile software development methodologies, but there are two in particular that I have found especially valuable.
Iterative and incremental development - Keep your release cycle short and tight. Across a large team, if numerous changes are made in between releases, you can find yourself in an integration nightmare.
Large organizations lean towards big project plans with lengthy development time lines. Fight this. Planning a project out an entire year makes no sense when your whole perception might change after the first two weeks of development. Condition your stakeholders to get used to the idea of making small incremental releases and adapting to ever-changing requirements.
Automated Unit Tests - This is a great way to ensure you are releasing quality software. The worst bugs are caused by seemingly innocent code changes that have unintended consequences elsewhere. Comprehensive unit tests are perhaps the best way to catch this kind of bug. If you can graduate to test-driven development or behavior-driven development, even better.
Any large organization is going to have some mediocre developers. It's a fact of life. Automated unit testing is a great way to keep an eye on them. Have one of your better developers write their unit tests for them. You will have assurance that at least their code works (even if it's ugly).
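To make that concrete, here is a minimal sketch in Python (the function and the shop domain are invented for illustration). A small suite like this is exactly what catches the "innocent change with consequences elsewhere" class of bug:

    import unittest

    def apply_discount(price, percent):
        """Return price reduced by percent; rejects out-of-range percentages."""
        if not 0 <= percent <= 100:
            raise ValueError("percent must be between 0 and 100")
        return round(price * (100 - percent) / 100.0, 2)

    class TestApplyDiscount(unittest.TestCase):
        def test_typical_discount(self):
            self.assertEqual(apply_discount(200.0, 25), 150.0)

        def test_zero_and_full_discount(self):
            self.assertEqual(apply_discount(80.0, 0), 80.0)
            self.assertEqual(apply_discount(80.0, 100), 0.0)

        def test_rejects_invalid_percent(self):
            with self.assertRaises(ValueError):
                apply_discount(80.0, 150)

    if __name__ == "__main__":
        unittest.main()

If someone later "simplifies" the range check or the rounding, the suite fails at build time instead of the bug surfacing in production.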

See Continuous Integration.
And please read The Joel Test: 12 Steps to Better Code.

There are many ways to improve the process, but the key component is modularity. By clearly separating responsibilities (both in code and in the organization) and defining clear and consistent interfaces, one large team can work as many small teams all tied together.
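As a rough sketch of what that separation can look like in code (Python, with invented names): each team depends on a shared contract rather than on another team's internals.

    from typing import Protocol

    class PaymentGateway(Protocol):
        """Contract owned by the payments team; other teams depend only on this."""
        def charge(self, account_id: str, cents: int) -> str:
            """Charge the account and return a transaction id."""
            ...

    class FakeGateway:
        """In-memory stand-in other teams can develop and test against."""
        def __init__(self):
            self.transactions = []

        def charge(self, account_id: str, cents: int) -> str:
            tx_id = "tx-%d" % (len(self.transactions) + 1)
            self.transactions.append((tx_id, account_id, cents))
            return tx_id

    def checkout(gateway: PaymentGateway, account_id: str, cents: int) -> str:
        # The checkout team never imports the payments team's internals.
        return gateway.charge(account_id, cents)

    print(checkout(FakeGateway(), "acct-42", 1999))  # -> tx-1

As long as the interface stays stable, the teams on either side of it can release on their own schedules.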

Wise but gloomy sayings:
Fred Brooks: Adding programmers to a late project makes it later.
Gerry Sussman: Programmers combine like resistors in parallel.
Somebody else: The cost of a project is the number of programmers times the length of the schedule.
Something that might work:
Organize the project so that there is a core application built by one or two people, preferably operated via a language. Then let all the programmers effectively be users of the core, by coding in that language.
I'm thinking of language-based products: SAS, S/R, MATLAB, etc.


What types of architecture or architecture layers are not suitable for automated testing?

I was recently tasked with developing automated build and release pipelines for one of my company's legacy applications. After some investigation, I keep hearing from managers and other devs that certain application layers and architectures don't lend themselves to automation, particularly automated testing. Therefore, it's often suggested I shouldn't bother trying to apply DevOps principles and AT unless I want to re-architect the whole app.
The commonly cited example would be PL/SQL backends or monolithic architectures. I asked why these were not suitable, but never got a really clear answer. Does anyone have any insight on when automated testing should not be used, in favor of dumping the old architecture and starting fresh?
Short answer - ones that suffer from testability issues.
For a more in-depth answer, let's first admit that many software systems are untestable, or not immediately testable, so the effort of
"trying to apply DevOps principles and AT"
is far greater than the ROI. One notorious example is Google's reCAPTCHA, which causes real pain for the test-automation folks (like me). The devs are right to say that it may well be a
"re-architect the whole app"
journey, as testability is closely related to other key software qualities such as encapsulation, coupling, cohesion, and redundancy.
"The commonly cited example would be PL/SQL backends or monolithic architectures"
Now, that is not the case at all. The first one is more data-centric and requires a deeper understanding, but there are solutions for it as well. As for single-tiered software applications: one can argue that, in contrast to the mSOA (microservices) approach, monolithic applications are much easier to debug and test. Since a monolithic app is a single indivisible unit, you can run end-to-end tests much faster and more easily.
Put simply: if your app is highly testable, it is highly usable. If the architecture and design were aligned to very, very specific company needs, it is no wonder the app is usable (and testable) only up to a point.
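To illustrate the end-to-end point, here is a minimal sketch (Python standard library only; the URL and endpoint are invented): with a monolith there is exactly one process to start and one surface to hit.

    import json
    import urllib.request

    BASE = "http://localhost:8080"  # hypothetical local build of the monolith

    def test_order_roundtrip():
        # One process serves the UI, the API and the data layer, so a single
        # HTTP call exercises the whole stack end to end.
        payload = json.dumps({"sku": "ABC-1", "qty": 2}).encode()
        req = urllib.request.Request(
            BASE + "/orders", data=payload,
            headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            assert resp.status == 201
            assert json.load(resp)["qty"] == 2

In an mSOA setup, the same check means orchestrating several services, queues, and their versions before the first assertion can even run.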

What is the best method for Software/Web Application Testing?

Does anyone know how to manage the software/web application testing process?
The Hexawise test case generator can dramatically reduce your test design time (e.g., figuring out what combinations of browsers, configurations and functions should be tested so as to achieve maximum coverage in a manageable number of test cases).
Free trials are available at: http://hexawise.com/users/new
Disclosure: I created the tool based on the applied statistics-based Design of Experiments principles my dad, William G. Hunter, used to teach about when he was a professor at the University of Wisconsin.
/Slight tangent: it is surprising to me that Design of Experiments methods are widely used in manufacturing today, but rarely used in software testing. The same types of combinatorial explosions exist in both manufacturing (e.g., think about how many billions of ways there could be to manufacture a radial tire) and software testing (where billions of potential use cases exist in most non-trivial software applications, particularly when browser types and/or configuration options are thrown into the mix). These Design of Experiments methods dramatically improve efficiency in both situations (and are extremely easy to prove out in software testing projects by simple "bake-offs"); despite this, they remain unused in the vast majority of testing organizations, even at Fortune 100 firms. I'm hoping to raise awareness and help change this. End of tangent/
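As a toy illustration of the combinatorial point (plain Python; the greedy heuristic below is a simplification for illustration, not what any particular commercial tool does): exhaustively testing even four small parameters takes 54 cases, while covering every pair of values takes about ten.

    import itertools

    params = {
        "browser": ["Firefox", "Chrome", "IE"],
        "os":      ["Windows", "macOS", "Linux"],
        "locale":  ["en", "de", "ja"],
        "account": ["free", "paid"],
    }
    names = list(params)

    # Exhaustive: 3 * 3 * 3 * 2 = 54 combinations, and this is a tiny system.
    everything = list(itertools.product(*params.values()))

    # Every parameter-value pair that a pairwise suite must cover.
    required = set()
    for (i, a), (j, b) in itertools.combinations(enumerate(names), 2):
        for va in params[a]:
            for vb in params[b]:
                required.add(((i, va), (j, vb)))

    def pairs_of(combo):
        return set(itertools.combinations(enumerate(combo), 2))

    # Greedy cover: repeatedly pick the test case that covers the most
    # still-uncovered pairs.
    suite, uncovered = [], set(required)
    while uncovered:
        best = max(everything, key=lambda c: len(pairs_of(c) & uncovered))
        suite.append(best)
        uncovered -= pairs_of(best)

    print(len(everything), "exhaustive cases vs", len(suite), "pairwise cases")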
Justin
There are lots of aspects to it, but one tool that is quickly becoming important in our arsenal is Selenium.
You need to test multiple layers of your software. Unit tests cover the smallest segments of code, like a specific method's output, rather than the way methods interact. At the opposite end of the spectrum is integrated user-experience testing, which mimics a user pressing buttons and doing things with your application (using a tool like Selenium if you're writing web applications).
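At that outer layer, a minimal Selenium sketch looks something like this (Python bindings; the URL and element IDs are invented):

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    # Drives a real browser, so this exercises the integrated user
    # experience rather than any single method.
    driver = webdriver.Firefox()
    try:
        driver.get("http://localhost:8080/login")  # hypothetical app
        driver.find_element(By.ID, "username").send_keys("alice")
        driver.find_element(By.ID, "password").send_keys("secret")
        driver.find_element(By.ID, "submit").click()
        assert "Dashboard" in driver.title
    finally:
        driver.quit()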

What is your or your company's programming process? [closed]

I'm looking for process suggestions, and I've seen a few around the site. What I'd love to hear is what you specifically use at your company, or just you and your hobby projects. Any links to other websites talking about these topics is certainly welcome!
Some questions to base an answer off of:
How do users report bugs/feature requests to you? What software do you use to keep track of them?
How do bugs/feature requests get turned into "work"? Do you plan the work? Do you have a schedule?
Do you have specs and follow them? How detailed are they?
Do you have a technical lead? What is their role? Do they do any programming themselves, or just architecture/mentoring?
Do you unit test? How has it helped you? What would you say your coverage is?
Do you code review? When working on a tight deadline, does code readability suffer? Do you plan to go back later and clean it up?
Do you document? How much commenting do you or your company feel comfortable with? (Description of class, each method and inside methods? Or just tricky parts of the code?)
What does your SCM flow look like? Do you use feature branches, tags? What does your "trunk" or "master" look like? Is it where new development happens, or the most stable part of your code base?
For my (small) company:
We design the UI first. This is absolutely critical for our designs, as a complex UI will almost immediately alienate potential buyers. We prototype our designs on paper, then as we decide on specifics for the design, prepare the View and any appropriate Controller code for continuous interactive prototyping of our designs.
As we move towards an acceptable UI, we then write a paper spec for the workflow logic of the application. Paper is cheap, and churning through designs guarantees that you've at least spent a small amount of time thinking about the implementation rather than coding blind.
Our specs are kept in revision control along with our source. If we decide on a change, or want to experiment, we branch the code, and IMMEDIATELY update the spec to detail what we're trying to accomplish with this particular branch. Unit tests for branches are not required; however, they are required for anything we want to incorporate back into trunk. We've found this encourages experiments.
Specs are not holy, nor are they owned by any particular individual. By committing the spec to the democratic environment of source control, we encourage constant experimentation and revision - as long as it is documented so we aren't saying "WTF?" later.
On a recent iPhone game (not yet published), we ended up with almost 500 branches, which later translated into nearly 20 different features, a huge number of concept simplifications ("Tap to Cancel" on the progress bar instead of a separate button), a number of rejected ideas, and 3 new projects. The great thing is each and every idea was documented, so it was easy to visualize how the idea could change the product.
After each major build (anything in trunk gets updated, with unit tests passing), we try to have at least 2 people test out the project. Mostly, we try to find people who have little knowledge of computers, as we've found it's far too easy to design complexity rather than simplicity.
We use Doxygen to generate our documentation. We don't really have auto-generation incorporated into our build process yet, but we are working on it.
We do not code review. If the unit test works, and the source doesn't cause problems, it's probably ok - but this is because we are able to rely on the quality of our programmers. This probably would not work in all environments.
Unit testing has been a god-send for our programming practices. Since any new code can not be passed into trunk without appropriate unit tests, we have fairly good coverage with our trunk, and moderate coverage in our branches. However, it is no substitute for user testing - only a tool to aid in getting to that point.
For bug tracking, we use bugzilla. We don't like it, but it works for now. We will probably soon either roll our own solution or migrate to FogBugz. Our goal is to not release software until we reach a 0 known bugs status. Because of this stance, our updates to our existing code packages are usually fairly minimal.
So, basically, our flow usually looks something like this:
Paper UI Spec + Planning » Mental Testing » Step 1
View Code + Unit Tests » User Testing » Step 1 or 2
Paper Controller & Model Spec + Planning » Mental Testing » Step 2 or 3
Model & Controller Code + Unit Tests » User Testing » Step 3 or 4
Branched Idea » Spec » Coding (no unit tests) » Mental Testing » Rejection
Branched Idea » Spec » Coding (no unit tests) » Mental Testing » Acceptance » Unit Tests » Trunk » Step 2 or 4
Known Bugs » Bug Tracker » Bug Repair » Step 2 or 4
Finished Product » Bug Reports » Step 2 or 4
Our process is not perfect by any means, but a perfect process would also imply perfect humans and technology - and THAT's not going to happen anytime soon. The amount of paper we go through in planning is staggering - maybe it's time for us to get a contract with Dunder Mifflin?
I am not sure why this question was downvoted. I think it's a great question. It's one thing to do a Google search and read some random websites, which are often trying to sell you something rather than be objective. It's another thing to ask the SO crowd, who are developers and IT managers, to share their experiences and what works or doesn't work for their teams.
Now that this point is out of the way: I am sure a lot of developers will point you towards "Agile" and/or Scrum. Keep in mind that these terms are often used very loosely, especially Agile. I am probably going to sound controversial by saying this, which is not my intention, but these methodologies are over-hyped; especially Scrum, which is more of a product marketed by Scrum consultants than a "real" methodology. Having said that, at the end of the day, you have to use what works best for you and your team. If that's Agile/Scrum/XP or whatever, go for it. At the same time you need to be flexible about it; don't become religious about any methodology, tool, or technology. If something is not working for you, or you can get more efficient by changing something, go for it.
To be more specific regarding your questions. Here's the basic summary of techniques that have been working for me (a lot of these are common sense):
Organize all the documents and emails pertaining to a specific project, and make them accessible to others through a central location (I use MS OneNote 2007 and love it for all my documentation, progress, features, bug tracking, etc.)
All meetings (which you should try to minimize) should be followed by action items where each item is assigned to a specific person. Any verbal agreement should be put into a written document. All documents added to the project site/repository. (MS OneNote in my case)
Before starting any new development, have a written document of what the system will be capable of doing (and what it won't do). Commit to it, but be flexible to business needs. How detailed should the document be? Detailed enough that everyone understands what the final system will be capable of.
Schedules are good, but be realistic and honest with yourself and your business users. The basic guideline that I use: release quality, usable software that lacks some features rather than buggy software with all the features.
Have open lines of communication among your dev. team and between your developers and business groups, but at the end of a day, one person (or a few key people) should be responsible for making key decisions.
Unit test where it makes sense, but DO NOT become obsessive about it. 100% code coverage != no bugs, nor does it guarantee that the software works correctly according to the specs (see the small example after this list).
Do have code standards and code reviews. Commit to the standards, but if they do not work for some situations, allow for flexibility.
Comment your code especially hard to read/understand parts, but don't make it into a novel.
Go back and clean up your code if you are already working in that class/method anyway (implementing a new feature, working on a bug fix, etc.). But don't refactor just for the sake of refactoring, unless you have nothing else to do and you're bored.
And the last and more important item:
Do not become religious about any specific methodology or technology. Borrow the best aspects from each, and find the balance that works for you and your team.
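On the unit-testing caveat above, a toy example (invented function): the test below achieves 100% line coverage and passes, yet the code is wrong for negative input. Coverage measures execution, not correctness.

    def sign(n):
        """Return 1 for non-negative n, -1 for negative n."""
        if n >= 0:
            return 1
        return 1  # BUG: should be -1, and no coverage report will flag it

    def test_sign():
        assert sign(5) == 1  # checked
        sign(-5)             # executed but never checked

    test_sign()  # passes; a coverage tool would happily report 100%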
We use Trac as our bug/feature request tracking system
Trac Tickets are reviewed, changed to be workable units and then assigned to a milestone
The trac tickets are our specs, containing mostly very sparse information which has to be talked over during the milestone
No, but our development team consists only of two members
Yes, we test, and yes, TDD has helped us very much. Coverage is at about 70 percent (Cobertura)
No, we refactor when appropriate (during code changes)
We document only public methods and classes; our maximum method line count is 40, so methods are usually small enough to be self-describing (if there is such a thing ;-)
svn with trunk, rc and stable branches
trunk - Development of new features, bugfixing of older features
rc - For in house testing, bugfixes are merged down from trunk
stable - only bugfixing merged down from trunk or rc
To give a better answer, my company's policy is to use XP as much as possible and to follow the principles and practices as outlined in the Agile manifesto.
http://agilemanifesto.org/
http://www.extremeprogramming.org/
So this includes things like story cards, test-driven development, pair programming, automated testing, continuous integration, one-click installs and so on. We are not big on documentation, but we realize that we need to produce just enough documentation in order to create working software.
In a nut shell:
create just enough user stories to start development (user stories here are meant to be the beginning of the conversation with the business, not completed specs or fully fleshed-out use cases, but short bits of business value that can be implemented in less than one iteration)
iteratively implement story cards based on what the business prioritizes as the most important
get feedback from the business on what was just implemented (e.g., good, bad, almost, etc)
repeat until business decides that the software is good enough

Are embedded developers more conservative than their desktop brethren? [closed]

I've been in the embedded space for a while now, and it seems that most programmers I talk to seem to be doing things pretty much the same way it was done 15 years or more ago: Waterfall(ish) Development, command line tools and a small group uses lint.
Contrast this with the server/desktop environment, where there seems to be lots of activity related to all sorts of facets of programming:
XP, Scrum, Iterative, Lean/Agile
Continuous Integration
Automated Builds
Automated Unit Testing Frameworks
Refactoring tool support
Is it just that the embedded environment makes it more difficult to implement new practices or tools?
Is it that the mindset of embedded programmers steers them away from new tools/concepts?
Is it that management in the typical embedded industry is behind the curve compared to IT-focused fields?
I do realize that this is a generalization, and some embedded projects do use Scrum, Agile, CI, Automated Builds (in fact I worked at a company that had that in place since the 80s). But my impression is that it is a very small percentage.
We are all used to the fact that our desktop PC crashes once in a while (or at least an application on the desktop suddenly disappears). It's no big deal. The next patch will fix it.
In the embedded space, you are building something which can't be patched. Lives can depend on your device (in a car, an elevator or a medical system). Most devices are installed and then must run unattended for years. So embedded people tend to be very conservative. TCP/IP is often "too modern". They stick to their trusty serial bus with a communication "stack" that is roughly 50 lines of assembler code.
What's worse, you simply don't have an abundance of space on the device, which means you can't use one of the latest programming languages that make TDD and automated builds a joy.
Next, a lot of embedded development environments are proprietary. If your supplier doesn't support it, you won't get it. Linux has started to weaken this in the past years but a whole lot of devices are not powerful enough to run Linux, yet. And even if they were, the CPU power would be used for something else instead of running a fancy OS which comes with source.
So yes, there are powerful forces working in the background to keep the embedded space where it is.
Are embedded developers more conservative than their desktop brethren?
Yes, because they are more concerned with the consequences of making errors. It’s a big deal to patch an embedded device. Not so much for a desktop app.
Waterfall-ish development is necessary in the embedded world because you are generally building hardware at the same time as the software. You need to know as soon as possible how much memory, how much processor speed, how big a flash, and what (if any) special hardware is necessary. The hardware design can't be completed until you know these answers. Once you decide, that is pretty much it: the lead time for redoing a board is far too long. If you mess up, the software is going to have to work around any shortcomings. Not usually an ideal situation.
As for the tools, that is largely based on what the supplier provides and any biases of the developers. On some projects I have used XP Embedded and got pretty much everything that the desktop developer gets.
XP, Scrum, Iterative, Lean/Agile:
Since most of the design is done up front (by necessity), and you usually don’t have working hardware when it is time to code, the quick turn-around processes don’t really provide much benefit.
Continuous Integration/Automated Builds
Nice to have, but not really necessary. What…it takes about 15 seconds to open the IDE and press the compile button.
Automated Unit Testing
No reason why this shouldn't be done, but only part of the code can truly be tested automatically, because the rest is either hardware-dependent or has other dependencies, like timing. So you can't really tell whether the code is working from the automated tests alone.
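That said, the hardware-independent part can usually be carved out and tested on the host. A sketch of the pattern, shown in Python for brevity (in C the same separation is done with function pointers or link-time substitution; the debounce logic is invented):

    def debounce(samples, threshold=3):
        """Return True once threshold consecutive high samples are seen.
        Pure logic: no register reads, so it runs anywhere."""
        run = 0
        for s in samples:
            run = run + 1 if s else 0
            if run >= threshold:
                return True
        return False

    # On target, samples would come from polling a GPIO register; on the
    # host we feed in recorded or synthetic sequences instead.
    assert debounce([1, 1, 1]) is True
    assert debounce([1, 0, 1, 0, 1]) is False
    assert debounce([0, 0, 1, 1, 1, 0]) is True

The timing-dependent and register-level code still needs the oscilloscope, but it becomes a thin layer around logic that is tested automatically.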
Refactoring Tool Support
An embedded processor vendor's product is the processor. They provide the IDE support in order to encourage you to purchase their processor. They couldn't possibly afford to pay for a Visual Studio-sized development team to add all the bells and whistles to an IDE, which isn't even their product.
These are some reasons I can think of:
Embedded teams are usually smaller than desktop/web teams, and the code base is smaller.
System testing is much more important than unit testing. The software needs to be tested together with the hardware. Automated testing is not always possible and can only be applied to a small fraction of the code base.
Embedded engineers have a different skill set than software engineers. They interact with hardware and know how to use an oscilloscope and a logic analyzer. Usually, the difficult part of their job is to find a glitch in the hardware. They do not have the time to adopt modern software methodologies.
Embedded programmers are mostly electrical engineers, not computer scientists or software engineers.
They excel in their field of expertise. They bring a slower, more methodical approach than most computer programmers. When it comes to programming firmware, electrical engineers know just enough to be dangerous.
Here are some of the things I've noticed electrical engineers doing in C:
All code in ONE single file
Math like variable names: x, y, z
No or missing indentation
No standard comment headers
No comments at all
Too many comments
In their defense, EEs didn't train to be computer programmers; it's not their job. I think software is the hardest part of creating embedded devices. Designing PCBs and choosing components requires skill, but it pales in comparison to the complexity of 10,000 lines of code.
Embedded programmers also have to deal with IDEs that look and behave like the IDEs of the '90s.
MPLAB
AVR Studio
Is it just that the embedded environment makes it more difficult to implement new practices or tools?
It's partly a matter of scale. Software is NOT the product; the product is the product. However, there are thousands of different types of microcontrollers and microprocessors out there, and the most popular thousand have 3-4 different compilers that aren't completely compatible.
So a given tool is only going to be used by a few hundred or thousand engineers.
In windows development, however, there are millions of programmers of many levels - the tools produce software directly which is the product, and so it's going to get more eyeballs, and more money.
Each new product that an engineer puts out might have a different processor.
Is it that the mindset of embedded programmers steers them away from new tools/concepts?
Embedded programmers are generally software or firmware engineers, as opposed to programmers. Engineering implies a certain amount of design, design analysis, and design proof prior to implementation - in other words a ton of work is done before the first line of code is written, and the documentation, ideally, is specific enough that implementation is merely turning pseudocode like documentation into compilable code.
New tools and concepts are needed in the design phase, not the implementation phase. An IDE with intellisense may be nice, but by the time the code is being written it's useless cruft - they already know what they need.
CAD - computer aided design - tools are being developed for firmware engineers that are used in the design phase to develop models and simulations that are directly turned into code. Matlab and simulink are good examples of this. The system as a whole is designed.
In fact, one might wonder why software developers are still writing code while the engineers are making data/program flow charts and state machine diagrams. Why is UML uptake so slow in the application world? It sounds like application developers can use some of the tools in common use among embedded systems engineers...
Is it that management in the typical embedded industry behind the curve compared to IT focused fields?
Actually, it's likely the reverse. When a project starts the engineers have to pick the processor.
The processor manufacturers get less money on older chips, so they pitch the latest and greatest, and they are generally cheaper overall than the chips used in the previous design (either by die shrinks, more integration, etc).
So the design is actually using the latest and greatest chips.
The downside is that the compiler and tools are often immature. They can only build so much on the older tools, and since the target moves with each new processor, they can't focus on a lot of the nice features application programmers might like. Especially since many of those features won't be useful to an embedded engineer.
There are many other factors, some of which are enumerated by other answers, but it's really a different field even though they both involve programming.
-Adam
I would also add a couple of points here:
In general, embedded projects tend to be smaller than desktop projects. This decreases the need for very elaborate software processes.
Requirements for embedded projects are often precise and better defined, so Scrum and agile are not so crucial.
Finally, embedded projects are generally a mix of software and hardware. The software being only a part of the project, embedded developers invest less time in software processes.
I agree with much that's been written here:
Old tools without the bells and whistles: far fewer refactorings are available (if any at all) thanks to C/C++'s preprocessor directives, and it's time-consuming just to choose a unit test framework, versus simply using JUnit.
It's true that waterfall feels more efficient. If I'm going to open the hood and get into a hard-to-access place, I'll want to do as much as I can while I'm there, rather than exiting and closing the hood after each task just to open it again. The idea that creating the most important features first allows you the option of shipping when promised instead of going late can also be hard to grasp when you believe nothing is optional, which might be true. IME, though, when the deadline looms something always becomes unnecessary.
Less visibility into the system makes it riskier to revisit existing code to refactor or change functionality. There are often timing issues, which automated tests running on the host using stubs and mocks won't catch. It can be hard for someone who's been bitten by these issues to take a different perspective.
I'll add one more: the language of agile/scrum is in workstation programmers' terms. To an embedded developer who knows just enough C to get the job done, what is a class, object, or method? When the "user" is typically regarded as a physical person clicking and typing, and the product has no human user interface, it's easy to dismiss the idea as Not Applicable. This may change with James Grenning's forthcoming book about TDD in C. I've been reading the beta ebook and it's quite good.
I would say it's more lack of good toolsets. It's really frustrating when you want to use C++ for its compile-time features not present in C (templates, namespaces, object-orientedness, etc) rather than its run-time features (exceptions, virtual functions) -- but the device manufacturers & 3rd parties just give you a C compiler, not C++. This probably results more from market size (hundreds of millions of PCs running Windows, with hundreds of thousands or even millions of developers -- vs. hundreds of thousands of Chip X, with hundreds or low thousands of developers) than from device capability.
edit: w/r/t robustness: there are different markets out there. The car/elevator/aeronautics/medical device market is going to have to be rigorous about getting rid of bugs. Other markets (toys, MP3 players, & other consumer electronics) can be less rigorous, especially if it's possible to upgrade code in the field. ("Oops! We're sorry we deleted your music library! We just fixed that bug, you can grab the latest release at our website at your convenience!")
I'd say different sorts of problem environments.
The biggest problem with the waterfall methodology is that requirements change. In every environment I've been in, there has been at least the likelihood of a requirements change, which means that the successful methodologies are those that keep flexibility as long as possible. Even if the customer has signed off in blood, and stands to forfeit his left hand if he suggests a change, there are changes coming in the future.
In embedded programming, it is possible to nail the requirements down up front. They come from the behavior of the system as a whole, and engineers are good at nailing down system requirements. Nobody's going to come in halfway through and say that the user now wants the pacemaker to deliver syncopated impulses while the recipient is dancing.
Once the requirements are frozen beyond thawing, which never happens in software designed for human use, waterfall is a very efficient methodology. The team proceeds from well-specified requirements to overall design, then detailed design, then coding, verifying all the way that the stages are done correctly. Then it's time to debug the code (since it's never perfect when written), and final tests to make sure the code meets the requirements.
I would also posit that some fields are inherently conservative. The transportation industry for example, where trains and planes may have life spans of 30 years or so. Customers tend to require tried and true practices, probably derived from IEEE. Waterfall is what customers know, waterfall is what customers demand.

Requirements or Testing? [closed]

If you had to do without one or the other in a software project, which would you pick?
I've had plenty of projects in which the client or PM thought they could get away without one or the other. We always paid the price.
Turn this around and repeat after me: "Tests are requirements." :-)
If you mean "formal requirements", I can and easily do without those. I would much prefer a living, breathing customer who can tell me what they want over a rigid, out-of-date document. Having switched to TDD, I wouldn't ever want to go back to a "no test" environment. I choose informal requirements -- stories, on-site customer, and customer-written acceptance tests -- over formal requirements and no tests.
I'd say you could go without Testing rather than Requirements. If you don't have requirements, how do you know what you're developing?
If the programmers are good enough, they should be able to catch most of the egregious errors that testing would find.
You have to test against the requirements, so if you don't have requirements you can't do testing. So if you have to pick one, you can only pick requirements.
But not doing testing is a path to failure. Guaranteed.
If I had to pick one, it would be requirements.
It doesn't have to be a formal, excruciatingly detailed document with twenty signatures, but you have to know exactly what the customer wants and more importantly what the customer needs.
The requirements are also your first communication to the development team. How will they know what you're asking if you're not asking it clearly? At best you're at grave risk of building the wrong thing right. I'd rather have the right thing built slightly wrong.
If I were asked to choose between requirements or testing I would choose to polish up my resume. You really can’t do without either in any projects because the basic project lifecycle is:
Define Needs/Goals (AKA Requirements)
Design & Build to the requirements
Verify that you built to spec (to requirements.)
If you don't have success criteria and goals that are verifiable (and then are verified), how can you ensure that you are going to succeed? And if you don't have a chance to succeed, why start the project?
I would say requirements because there always seems to be some level of "feature creep" from the client when you are developing software. Testing is one of the crucial pieces in the SDLC.
Requirements and testing are important for most projects but if you really have to pick, you should go with requirements. One of the advantages of picking requirements over testing is that, you might save some development time since the developers know what they have to build, and if the development is done with extra time in hand, you can allocate that time for testing :)
Tests (feature and integration) are more important than requirements; if you can specify the tests, then you have also specified the requirements, at least implicitly.
comments are also the developer documentation, with unit tests being the how-to 'quickstart' examples ;-)
I'm not sure whether "requirements" here refers to an artefact or a process. Although it is possible to skip requirements as an artefact, especially for smaller teams, and still deliver a product, skipping requirements as a process is out of the question. Requirements as an artefact let you model the system at a cost lower than building the entire thing, do feasibility studies and estimates, and, for a larger and more dispersed team, cut communication overheads and provide common ground. Neglect the requirements and you get lousy estimates (regardless of whether you plan a lot up front or just do a short sprint), a poor idea of feasibility, and possibly very inefficient communication and a lot of miscommunication.
Requirements as a process, on the other hand, will exist regardless of whether it is formally acknowledged. You cannot really exclude it; you can only pretend the requirements process does not exist, or fold it into design, coding, testing, or stages as late as pilot and maintenance. Obviously, treating the process this way means it will not get a fair amount of attention and resources. The consequences normally range from delivering something ultimately useless, to having to fix the now-obvious shortcomings of the product late in the development cycle, or even discovering the real requirements only once the product fails in the field, increasing the cost of development, defaulting on deadlines, ruining the team's good name, destroying user confidence, etc.
Testing usually boils down to validation and verification; more recently, improvements in testing technology have let automated testing be used as a solid tool for achieving greater efficiency in debugging and reducing the time necessary for regression testing. Validation is making sure that the team has built the right product, i.e., the scoped requirements are correct, not contradictory, and there are no gaps. Verification, on the other hand, is making sure that the product is built right: no technical defects, accidental errors, etc.
As we can see, testing provides a safety net in the scenario where requirements were neglected. Normally, as the team starts testing, they need to refine their understanding of the requirements and modify the software as a result. Since both requirement artefacts and the software itself just represent different levels of fidelity in modelling a solution to a real-life problem, and software as a model is an order of magnitude more precise, testing the application evaluates the requirements as well (regardless of whether they are implicit or explicit, formally analysed or informally communicated).
Normally, the alternative to testing is to let users report a substantially larger number of defects and shortcomings, and to try to fix them as part of maintenance (meaning later in the product lifecycle), increasing the cost of every fix.
So, requirements versus testing? Fire the manager. OK: skip requirements if you want the project schedule to slip during the testing phase and to get yourself into the mess of building something other than what users need; skip the testing if you just want to show utter disrespect to your users.
Without requirements you don't need testing since what you end up with is exactly what was spec'd
There are categories of software that can be developed perfectly well without requirements, at least anything more than a vaguely expressed idea the length of an email.
Thing is, if you have a specific client, and a project manager, it is unlikely your software is in one of them. It's unlikely someone is specifically paying you to, say, 'make me a fun game involving a juggling monkey'.
The only category of software that can be developed without testing is failware: where your company has managed to sucker some customer into paying whether or not the software works (or if you have a really dumb customer, pay more if it doesn't work, in support and maintenance).
That's probably more likely: contracts structured so that success is less profitable than failure are still fairly common. If you think that's the case, and you want to develop working software, then consider switching to a job where your interests and your boss's are less opposed.
Without requirements, can we make a test plan? No. So we can't really do testing even if we pick testing instead of requirements.
So requirements should be the priority, even in an agile testing environment.