Need help regarding the test management tool TestLink

Our company is small, and only 1 or 2 testers are assigned to a project. All our test-related artifacts are maintained in Excel sheets, and for bug tracking we use Mantis. We create test cases in Excel and execute them from those same sheets.
Would TestLink or any other test management tool be helpful to us? Since the number of testers is small, no merging of test cases is done; a single QA engineer develops the test cases and executes them. Please advise whether such a tool would be of any help.
If so, please suggest only free applications.

I am working for a startup and we rely heavily on TestLink. Our QA team has always been pretty small (between 1 and 3 people). It's very helpful for organizing and keeping the test cases for your whole system, and it becomes even more useful when you go for a release. You can assign your testers to a test build so that they can go through the test cases one after another and mark which tests pass or fail. Finally, you can generate a report based on those results for your build.
Hope that helps.
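Worth adding: TestLink also exposes an XML-RPC API, so recording pass/fail results can be scripted instead of clicked through. A minimal sketch in Python, assuming a TestLink 1.9-style endpoint; the dev key and IDs are placeholders, and exact method names and paths vary between versions:

```python
# Sketch: record a test result in TestLink over its XML-RPC API.
# Server URL, dev key, and IDs are placeholders for your own setup.
import xmlrpc.client

server = xmlrpc.client.ServerProxy(
    "http://testlink.example.com/lib/api/xmlrpc/v1/xmlrpc.php")

result = server.tl.reportTCResult({
    "devKey": "YOUR_DEV_KEY",         # generated on your TestLink account page
    "testcaseexternalid": "PROJ-42",  # assumed external ID of the test case
    "testplanid": 7,                  # assumed ID of the test plan
    "buildname": "release-2.1",       # build the result belongs to
    "status": "p",                    # 'p' = pass, 'f' = fail, 'b' = blocked
    "notes": "Recorded by script",
})
print(result)
```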

Regardless of whether there is one tester or many, it is still good practice to use a test management tool, and a lightweight solution will make you more productive.
There are many benefits over a static Excel file; we recently put together a short blog post that goes into some detail about the benefits of organizing your testing process with a test management tool, which may be of interest.
Since you are using Mantis to track your issues, note that test management tools often integrate with trackers like it, so that when a test fails a ticket is created automatically, which is a huge time saver.
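To give an idea of the glue involved, here is a minimal sketch of filing a Mantis ticket when a test fails, using the MantisBT 2.x REST API; the base URL, API token, and project/category names are placeholders:

```python
# Sketch: open a MantisBT issue when a test fails.
# The base URL, API token, and project name are placeholders.
import requests

def report_failure(test_name: str, details: str) -> None:
    resp = requests.post(
        "http://mantis.example.com/api/rest/issues",
        headers={"Authorization": "YOUR_API_TOKEN"},
        json={
            "summary": f"Test failed: {test_name}",
            "description": details,
            "project": {"name": "MyProject"},
            "category": {"name": "General"},
        },
    )
    resp.raise_for_status()

report_failure("login_smoke_test", "Expected dashboard, got HTTP 500.")
```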


How do you organize your business requirements and tests?

As of right now I'm working at a place where there's a lot of legacy code and pretty much no useful documentation.
Most of the time we just treat the business requirements as whatever was already implemented previously.
I'm looking for any tools or useful methods to keep all the requirements for future use, mostly for regression testing.
I'm thinking of maybe linking them up to tests/unit tests too, so that the business requirements are tied directly to the code.
Any good tools or resources to get me started?
Thanks!
Updates
As of now I'm keeping things simple by writing use cases, creating simple use case diagrams with this awesome tool, and then converting each use case into a test plan. The test plan is meant for the end user, so I just turn it into a simple step-by-step flow. I had planned to automate this part using Selenium, but it wasn't working that well on our website and was taking too long. It's a bit TDD-like, but I think it creates a simple, understandable goal for both the end user and the developer, I hope.
So for now it's just Excel and doc files, lugged into the project doc folder and checked into CVS/SVN, doomed to be outdated and forgotten :P
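For anyone trying the same Selenium route, a step-by-step flow from a test plan maps fairly directly onto a WebDriver script. A minimal sketch with the Python bindings; the URL and the element locators are invented for illustration:

```python
# Sketch: "log in and upload an image" as a Selenium WebDriver test.
# The URL and locators are hypothetical; adapt them to your own site.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Firefox()
try:
    driver.get("http://app.example.com/login")
    driver.find_element(By.NAME, "username").send_keys("tester")
    driver.find_element(By.NAME, "password").send_keys("secret")
    driver.find_element(By.ID, "login-button").click()

    driver.get("http://app.example.com/upload")
    driver.find_element(By.NAME, "file").send_keys("/tmp/photo.jpg")
    driver.find_element(By.ID, "submit-upload").click()

    assert "Upload successful" in driver.page_source
finally:
    driver.quit()
```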
Business requirements can be captured well in FitNesse tests. Unit tests certainly help too; put both together in a continuous integration server like Hudson to detect regressions as soon as possible.
PS: Sorry, pretty much all the links go to articles I wrote, because I'm also interested in this subject.
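If FitNesse feels like too much machinery, even plain unit tests can carry the requirement link. One lightweight convention, sketched here with made-up requirement IDs, is to tag each test with the requirement it verifies:

```python
# Sketch: tying unit tests back to business requirements by convention.
# The REQ-xxx IDs are placeholders for whatever numbering scheme you use.
import unittest

def requirement(req_id):
    """Attach a requirement ID to a test so reports can trace coverage."""
    def decorator(test_func):
        test_func.requirement = req_id
        return test_func
    return decorator

def apply_discount(total):
    """Example business rule under test: 10% off orders above 100."""
    return total * 0.9 if total > 100 else total

class OrderRules(unittest.TestCase):
    @requirement("REQ-101")
    def test_discount_applies_over_threshold(self):
        self.assertAlmostEqual(apply_discount(120.0), 108.0)

if __name__ == "__main__":
    unittest.main()
```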
Here are some methods/systems that I have used:
HP Quality Center
Big and bulky. Not very agile, but it works and has a lot of features.
It's used in many larger corporations, and if you can afford it you can get great support from HP.
https://h10078.www1.hp.com/cda/hpms/display/main/hpms_content.jsp?zn=bto&cp=1-11-127-24%5E1131_4000_100__
Bugzilla-Testopia
An open source test case management extension for Bugzilla, managed by Mozilla, which is good enough in my book to give it a try.
http://www.mozilla.org/projects/testopia/
Excel/Open Office Calc
Just do everything in spreadsheets and link between them.
Very flexible, everybody knows how to use them, and you probably already have the software in your organization.
Other Open Source Solutions
List of 15+ Open Source Test Management Tools
http://www.jayphilips.com/2009/09/10/15-open-source-test-management-tools/

How to manage test cases using Confluence?

We would like to use Confluence for writing and managing our test cases. The Confluence Testplan plugin seems close to what I'm looking for, but it's a bit too simple and limited.
How are you using Confluence to manage your test cases?
We both do and don't use Confluence for managing our test cases.
We Don't
In my project we use, and love, Confluence, but only for knowledge documentation and sharing. I'm sorry, but I can't see how Confluence would be a good fit for writing and managing test cases.
We Do
We use Excel/Calc spreadsheets to write and manage manual test cases. We write them at a very high level, e.g. "Log in and upload a JPEG image." We expect all testers to have high domain knowledge and to know how to log in and upload images.
Then we upload the spreadsheets to a dedicated page in Confluence. Every time the tests are run, before every release/sprint demo, we check them out, enter the results (sometimes adding new tests), and check the spreadsheet back in with comments.
It works fine; it's fast, flexible, low overhead, and ready to send to management or the customer at any time.
IMHO, spreadsheets honestly beat most test management tools.
Assuming you are using Jira for Agile management, it is best practice to associate test cases with Jira tickets. Confluence does a nice job of letting users link user stories within the wiki; for instance, you can create a 'sub-task' against a user story. Typically I write automated tests for all the user stories I test, so I can associate a git commit with a particular QA sub-task, which makes linking a ticket worthwhile. You might also want to look at the Confluence API: I push my automated test results into Confluence, which prints out my test cases.
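As a rough idea of what pushing results through the API can look like, here is a sketch against the Confluence REST content API; the base URL, credentials, and space key are placeholders:

```python
# Sketch: publish a test-run summary as a Confluence page.
# The base URL, credentials, and space key are placeholders.
import requests

def publish_results(title: str, html_body: str) -> None:
    resp = requests.post(
        "http://confluence.example.com/rest/api/content",
        auth=("ci-bot", "secret"),
        json={
            "type": "page",
            "title": title,
            "space": {"key": "QA"},
            "body": {"storage": {"value": html_body,
                                 "representation": "storage"}},
        },
    )
    resp.raise_for_status()

publish_results(
    "Nightly run 2024-05-01",
    "<p>42 passed, 1 failed. See attached log.</p>",
)
```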
On the topic of using spreadsheets: it's a terrible practice. Test cases should be accessible by anyone, and I don't mean on a shared drive somewhere. Product, management, and anyone in engineering should be able to visit a page and look at the test cases, coverage, and results.
If the question is about functional testing or BDD, have you checked out GreenPepper? See the documentation.
We're not using Confluence for test cases right now, but we are for use cases. I wrote up some examples about how we manage use cases here. The general idea could probably be applied to test cases also.

How are integration tests performed at your company/job/project?

I want to improve the integration testing methods where I work, and I would like to know how this process happens in other places.
Things like:
- When test plan writing begins
- The ratio of testers to developers, and how much (entire applications or just modifications) gets tested
- What kinds of methods are used for integration testing
Currently I test web apps, and test plans are managed with TestLink. Bugs found are reported in Bugzilla. I am trying to automate tests with Selenium RC, but it takes time to write the plans and the code that Selenium executes, and time is something I don't have, because I am testing 3 or more applications.
Most of my problems are caused by differences between the test environment and the production environment. But tests also take too long to start: if someone finishes a modification today, it will be about 3 weeks before I can begin testing it, and the test queue keeps growing.
It would be really good if anyone could suggest something to improve the testing process (like more people testing, etc.). But mostly, I would like to hear how the testing process works in other places.
Thanks.
For us, integration testing is generally performed by the developer before a commit: just a simple surface test to see that nothing obvious is broken.
Then we deploy the code from trunk to a development server connected to a test database that is a complete copy of the production database, and we have the users responsible for the new functionality do acceptance and further integration tests on that server.
We have a concept of "super user" to organize this. Super users are responsible for educating other users in their area of expertise and answering helpdesk questions related to the usage of the system. The super users are also the people who are involved in feature requests and requirement discussions for all features related to their work.
So when a new feature is developed, the super user is the one who first validates the design suggestion and then performs the final stages of testing before deployment.
This setup is good because it ensures that domain experts are the ones who validate the system's functionality, and it removes some responsibility from the IT department.
The bad thing is that they are usually not very technical, nor good testers. As users they tend to see the system for what it is rather than what it could be. And because they also have their ordinary duties in the organization as full-time employees, they are a very limited resource in terms of testing.
I'll assume you mean integration testing as in checking that the parts of the application work together (for example, getting the database and the website to work together after the DBA and the web developer respectively say they're done). I'll use an example from my current project.
I generate several configuration files so I can observe the application with certain modules switched on/off, namely error reporting, authentication, debug-mode compilation, and with/without SSL. Development environments are likely to have "friendly error pages" turned off, no authentication, no SSL, etc.
I also use a build script to create a copy of the application for each variant of the config file.
It is helpful to pedantically reproduce the characteristics of production in staging and development as much as you can; use virtual machines if you lack the hardware.
I also wrote into the production code base a few pages that test the sorts of things that break when code moves from one machine to another (does the DB connection work, do emails send, is the temp folder writable) and made that page the home page for the server operator.
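Such a self-diagnosis page boils down to a handful of environment checks. A rough sketch of the idea in Python; the connection details are placeholders:

```python
# Sketch: environment smoke checks of the kind described above.
# Connection details are placeholders; swap in your real drivers/hosts.
import os
import smtplib
import sqlite3
import tempfile

def check_database():
    """Can we open a connection and run a trivial query?"""
    try:
        with sqlite3.connect("app.db") as conn:  # stand-in for your real DB
            conn.execute("SELECT 1")
        return True
    except Exception:
        return False

def check_temp_writable():
    """Can we create and delete a file in the temp folder?"""
    try:
        fd, path = tempfile.mkstemp()
        os.close(fd)
        os.remove(path)
        return True
    except OSError:
        return False

def check_smtp(host="localhost"):
    """Does the mail server answer at all?"""
    try:
        with smtplib.SMTP(host, timeout=5) as smtp:
            smtp.noop()
        return True
    except OSError:
        return False

for name, ok in [("database", check_database()),
                 ("temp folder", check_temp_writable()),
                 ("smtp", check_smtp())]:
    print(f"{name}: {'OK' if ok else 'FAIL'}")
```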
The key is automating as much as you can; frequent integration testing catches issues earlier.
From check-in to packaged code ready for deployment, it takes me 8 minutes of automated work and half an hour of manual clicking for smoke tests.

How do you organize your release tests?

In the company where I work we have major releases twice a year. Extensive testing (automated and manual) is done in the weeks before.
The automated tests produce log files; the results of the manual tests are written down in test plans (Word documents). As you can imagine, this results in a lot of different files that the test engineers have to manage and interpret.
How do you organize your release tests?
E.g. do you use a bug tracker? Do you use any other tools? How do you specify what has to be tested? Who does the testing? What is the ratio of developers to testers?
You could use a combination of a bug tracker (JIRA, Mantis, Bugzilla) and a test case management tool like TestLink.
It's almost impossible to organise testing properly without keeping good track of your tests and their results.
We use the PMC suite(1), and it has a very useful organisational structure for the tests:
Test Scenarios (batteries of tests)
Test Cases (linked to the Requirements)
Test runs with their respective results
These are linked to the bugs, which are in turn linked to the tasks, etc.
When a manual test is run, the tester executes a scenario and goes through its test cases, with the results being tracked. All issues found are documented as bugs.
1. It's developed by my company, but please don't consider this an ad :)
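Whatever the tool, that scenario/case/run structure is easy to picture as a small data model. A sketch with invented names, just to show how the pieces link up:

```python
# Sketch: the scenario -> case -> run structure as a minimal data model.
# All names here are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class TestCase:
    case_id: str
    title: str
    requirement_ids: list[str]          # links back to the requirements

@dataclass
class TestScenario:                     # a battery of test cases
    name: str
    cases: list[TestCase] = field(default_factory=list)

@dataclass
class TestRun:
    scenario: TestScenario
    results: dict[str, str] = field(default_factory=dict)  # case_id -> status
    bug_ids: list[str] = field(default_factory=list)       # issues raised

    def record(self, case_id, status, bug_id=None):
        """Track one case's outcome; optionally link a bug that was filed."""
        self.results[case_id] = status
        if bug_id:
            self.bug_ids.append(bug_id)

login = TestCase("TC-1", "Log in with valid credentials", ["REQ-7"])
run = TestRun(TestScenario("Smoke battery", [login]))
run.record("TC-1", "fail", bug_id="BUG-123")
print(run.results, run.bug_ids)
```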
If you develop with MS products and technologies, you could always look at Team Foundation Server. I find it fits perfectly for managing automated unit testing/builds, bugs, test results, testing task assignments, etc. That is what we use. It's not cheap, though, but it's worth the investment if it's in the budget.

Test planning/documentation/management tools

I'm looking for a good, preferably free, test planning and documentation tool. Ideally something that will keep track of which tests have been run against which software version, with reporting ability. There's a whole bunch of tools listed here but are there any others, and which ones have you had the best experience with? (You do run tests, right?)
UPDATE 2008-01-29
So far TestLink and FitNesse have been mentioned. A related question also yielded a link to the ReadySet project, an open collection of software planning documentation templates.
I have used TestLink and found it okayish, but I cannot say I enjoyed using it. Has anyone had any experience with FitNesse? Or are there any other free tools out there that you have used and found satisfactory?
You should definitely try out Klaros-Testmanagement (http://www.klaros-testmanagement.com), which has a free, unrestricted Community Edition.
Yes, we do run tests, but not nearly as many as I'd like!
I highly recommend TestLink - the list of tools that you linked to shows that it's had more downloads than all of the other tools put together.
I've used Quality Center/TestDirector for a long time.
I'm now using TestLink, and I must say that I prefer Quality Center/TestDirector by far, even if it is based on some buggy ActiveX controls.
Quality Center/TestDirector is much easier to use, and its interface is quite a bit better.
TestLink and Quality Center/TestDirector are mainly for manual test cases (however, you can use QuickTest Pro with Quality Center/TestDirector to automate your tests).
FitNesse is another kind of tool in my mind: basically, you write your test case on a wiki and link it to a JUnit test. Other tools like that are GreenPepper, Concordion, etc.
PractiTest is a very good option. Not free but very affordable - http://www.practitest.com
A little late on this one, but I would suggest you try TestLodge for your manual test management. It works in a similar way to TestLink, but it has a more professional interface, and it's something we also let our clients use during their UAT phase.
We use the Quality Center/TestDirector stuff. It's expensive as far as I know, and it's not that great.
I've heard good things about FitNesse, but I don't know how good its test tracking is.
I know I just recently saw a slick looking test tracker for Trac or something, but I can't find it now...
Quality Center. It's expensive, but it is the best.
I'm with Patrick - good ol' office tools :)
I just write mine in Microsoft Word.
This is the structure I developed: Writing a System Test Plan.
One thought, and perhaps not a good one, would be to have every test submit a ticket to your ticketing system when it runs, indicating the test name, build version, date, and result.
That would make the results searchable later on.
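As a sketch of that idea, a small helper could run each check and file the result as a ticket; the tracker endpoint and payload shape are invented, so adapt them to whatever your ticketing system accepts:

```python
# Sketch: submit one ticket per test run so results are searchable later.
# The tracker endpoint and payload shape are invented for illustration.
import datetime
import requests

BUILD_VERSION = "2.1.0-rc3"  # placeholder; read from your build system

def run_and_file(test_name, test_func):
    """Run one test and record its outcome as a ticket."""
    try:
        test_func()
        status = "pass"
    except AssertionError as exc:
        status = f"fail: {exc}"
    requests.post(
        "http://tracker.example.com/api/tickets",
        json={
            "test": test_name,
            "build": BUILD_VERSION,
            "date": datetime.date.today().isoformat(),
            "result": status,
        },
    ).raise_for_status()

def test_addition():
    assert 1 + 1 == 2

run_and_file("test_addition", test_addition)
```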
We use a home-grown Access database.
This database keeps track of our requirements, test cases, test plan and test runs. We're able to produce an up-to-date RVTM, keep track of progress against the plan, and assign tasks to testers. We integrated it with Outlook, so each tester is assigned a task from the plan by the QA lead. When they're complete, they just tick it off in Outlook and it updates the database.
For our small team of testers it works nicely, and we're free to customize it however we want.