Recently I have been focusing on static analysis software, especially the Indus and Soot Java frameworks. I want to test these tools. Can anyone provide comprehensive test cases? I think the test cases I write are not typical enough.
My standard advice in evaluating static analysis tools is to test them on the real software you’ll be using them for: “Pitfall II: Don’t buy a tool based on bugs it finds in other people’s code. Before you commit to a static analysis tool, make sure that it finds important bugs in your real code. Bugs found in open source or demo code can be very impressive; but your organization’s code, while it’s under development (which is the cheapest time to find bugs), will be very different from code which has already been made public.” Supplemental Proceedings of the 21st IEEE International Symposium on Software Reliability Engineering, http://pobox.com/~flash/Static_Analysis_Deployment_Pitfalls.pdf.
Your best bet is to contact the vendors of these software packages and ask them for test cases. It is in their own interest to have as many as possible.
One way of obtaining test cases is to hold onto the input files you get from your users when things break -- distill each one to the smallest input that still triggers the bug in the broken version, then make sure newer versions handle it correctly.
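For example, a distilled case might be a tiny Java class with one known defect and the expected finding recorded next to it, so the analyzer's report can be checked against it. This is only a sketch; the class and the null-dereference pattern are made up for illustration:

    // Hypothetical distilled test case for a null-dereference analysis
    // (e.g. one built on Soot). The expected finding is recorded in a
    // comment so a harness (or a human) can check the analyzer's report.
    public class NullDerefCase {
        static String lookup(boolean found) {
            return found ? "value" : null;
        }

        public static void main(String[] args) {
            String s = lookup(false);
            // EXPECT: null dereference reported on the next line
            System.out.println(s.length());
        }
    }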
My company recently did a POC and has decided to use a commercial CMS. It is being implemented, and we have been asked to test it. What is there to test in a third-party CMS that has already been tested and is being sold on the market? Any direction would be great!
I would recommend adjusting your mind-set. What do you know about the test regime of this product? Starting from the point of view that "it's a commercially shipped product, it must have been tested, so I don't need to test it" is deeply flawed thinking.
First, all software has bugs.
Second, in testing the product you could reasonably focus on your proposed usage scenarios. You may choose patterns of use that were not anticipated by the development and test teams. At the very least you gain experience of the product's capabilities and limitations.
Third, installation into your environment may impact the system in unexpected ways. So at the very least the product must be exercised in your environment before you start to trust it. You need to explore the operational aspects, backups and restores for example. Now, before the system is live, is the time to find out how to recover from a disk crash.
I would ask the vendor if they have a regression suite you can run in your environment. If not, I would devise a quick check-list of my own, trying to think about corner cases. Then also start to explore how your teams will use the product. Presumably there will be a "Build Master" role? Work with the people in that role and walk through some common scenarios. The likelihood is that you will uncover some ways of working that are better than others.
Summary: testing isn't just about finding bugs (though you may well find some); it's also about understanding the product better and learning how best to use it.
As of right now I'm working at a place where there's a lot of legacy code and pretty much no useful documentation.
And most of the time we just treat the business requirements as whatever was already implemented previously.
I'm looking for any tools or useful methods to keep all the requirements for future use, mostly for regression testing.
I'm thinking of maybe linking them up to tests/unit tests too, so that the business requirements are linked directly to the code logic.
Any good tools or resources to get me started?
Thanks!
Updates
As of now I'm keeping things simple for myself by writing use cases, creating some simple use case diagrams with this awesome tool, and then converting each use case into a test plan. The test plan is meant for the end user, so I just make it a simple step-by-step flow. I had plans to automate this part using Selenium, but it wasn't working that well on our website and was taking too long. It's a bit TDD-like, but I think it creates a simple, understandable goal for both the end user and the developer, I hope.
So for now it's just Excel and doc files, lugged into the project doc folder and checked into CVS/SVN, doomed to be outdated and forgotten :P
Business requirements can be captured well in FitNesse tests. Unit tests surely help too; put both together in continuous integration, e.g. Hudson, to detect regressions ASAP.
PS: Sorry, pretty much all the links go to articles I wrote, because I'm also interested in this subject.
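To sketch the unit-test side of that idea (the @Requirement annotation and the REQ-042 ID are hypothetical, something you would define in your own codebase), each test can carry the business requirement it verifies, so a CI server such as Hudson gives you requirement-level regression feedback:

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;
    import java.lang.annotation.Retention;
    import java.lang.annotation.RetentionPolicy;

    // Hypothetical annotation linking a test to a business requirement.
    @Retention(RetentionPolicy.RUNTIME)
    @interface Requirement {
        String value();
    }

    // Minimal class under test, invented so the sketch is self-contained.
    class Invoice {
        private final double amount;
        Invoice(double amount) { this.amount = amount; }
        double total() { return amount > 100.00 ? amount * 0.95 : amount; }
    }

    public class InvoiceTest {
        @Requirement("REQ-042: invoices over 100.00 get a 5% discount")
        @Test
        public void discountAppliedAboveThreshold() {
            assertEquals(114.00, new Invoice(120.00).total(), 0.001);
        }
    }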
Here are some methods/systems that I have used:
HP Quality Center
Big and bulky. Not very agile, but it works and has a lot of features.
It's used in many larger corporations, and if you can afford it you can get great support from HP.
https://h10078.www1.hp.com/cda/hpms/display/main/hpms_content.jsp?zn=bto&cp=1-11-127-24%5E1131_4000_100__
Bugzilla-Testopia
An open source test case management extension for Bugzilla, managed by Mozilla, which is good enough in my book to give it a try.
http://www.mozilla.org/projects/testopia/
Excel/Open Office Calc
Just do everything in spreadsheets and link between them.
Very flexible, everybody knows how to use them, and you probably have the software in your organization already.
Other Open Source Solutions
List of 15+ Open Source Test Management Tools
http://www.jayphilips.com/2009/09/10/15-open-source-test-management-tools/
Our software vendor is currently working on a project to migrate our enterprise-scale laboratory system from Tru64 Unix to Red Hat.
This obviously means recompiling with a new compiler and performing lots of testing.
While the vendor will do their own testing, we also need to do acceptance testing.
We don't exactly trust that the vendor will be as thorough with their testing as we hope.
So I have been tasked to think of things that will need to be tested.
This is a laboratory system, so things such as calculations and rounding (and general maths) need to be tested.
But I thought I would ask the SO community for advice on what to test, or for past experiences with this sort of thing.
You will need to test everything. Whatever you tested in your original environment, you will need to test in your new environment.
Eventually, you'll gain confidence that most of your tests will simply never fail in the new environment. There will surely be a set of tests that will always succeed, as long as the old and new environments are Unix-based systems. That's fine - that's a set of tests you won't need to run constantly. I'd still keep those tests around to run once per release of the new OS or per release of your product, however, just to be safe.
Check that it works on 32- and 64-bit CPUs, handles spaces in filenames, and that users don't need admin rights to run it or change configs.
One unix to another isn't a huge leap.
If you can come up with a suite of regression tests, you can use those scenarios via an automated tool against the original and ported systems to make sure they match. The QA and UAT tests that you currently run against the system would probably be a good starting point, and then you could add any critical edge cases (such as those that test the math in detail) as needed. Paul's suggestion above about compiler issues would also allow derivation of some good edge cases; I'd recommend looking at that scope from the perspective of both the Tru64 and RHEL compilers.
A fair amount of my recent experience is with JMeter, which has a number of assertions, pre-conditions, and post-conditions that can be evaluated to ensure compliance. A number of the tools in this space would also allow you to do load testing, if appropriate.
If your system doesn't have a remotely accessible interface (like a web-based or socket-based interface), you could potentially do the same thing with scripted tools.
Thirteen or fourteen years ago, I couldn't move an Informix database from SCO OpenServer to Linux because SCO used 16-bit inode numbers, Linux used 32-bit inode numbers, and Linux's 'personalities' support was nowhere near as advanced as it is today. I can appreciate your skepticism.
If you can re-run old experiments with saved data and saved outcomes, that would be my preferred place to start. Given simple datatypes, the precision or range of operations may be vastly different on different compilers/platforms, so I wouldn't be surprised if small differences in output were common; exact matches may not be realistic, but results should certainly be close enough not to influence the larger 'outcomes' of your testing runs.
Rather than searching for test cases, use it for what you've already done with it. (As an aside, that's also a good way to build test cases for software development.)
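A minimal sketch of that kind of comparison (the one-value-per-line file format and the tolerance are assumptions; adjust both to your data): read the saved Tru64 results and the new Red Hat results and compare numeric values within an epsilon rather than exactly:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.List;

    // Compares two result files (assumed: one numeric value per line)
    // within a relative tolerance instead of demanding exact matches.
    public class ResultDiff {
        private static final double EPSILON = 1e-9; // tune to your domain

        public static void main(String[] args) throws IOException {
            List<String> expected = Files.readAllLines(Paths.get(args[0])); // saved Tru64 output
            List<String> actual = Files.readAllLines(Paths.get(args[1]));   // new Red Hat output
            if (expected.size() != actual.size()) {
                System.err.println("Line counts differ: " + expected.size() + " vs " + actual.size());
                System.exit(1);
            }
            for (int i = 0; i < expected.size(); i++) {
                double e = Double.parseDouble(expected.get(i));
                double a = Double.parseDouble(actual.get(i));
                // Relative tolerance, falling back to absolute near zero.
                double scale = Math.max(1.0, Math.max(Math.abs(e), Math.abs(a)));
                if (Math.abs(e - a) / scale > EPSILON) {
                    System.err.printf("Mismatch at line %d: %s vs %s%n", i + 1, expected.get(i), actual.get(i));
                    System.exit(1);
                }
            }
            System.out.println("All " + expected.size() + " values match within tolerance.");
        }
    }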
Differences in precision between the standard math library functions. They are not the same on different systems. If you need consistent calculations, you will need to replace them. Look into crlibm and/or fdlibm.
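Incidentally, Java makes the same distinction visible: java.lang.StrictMath is specified to follow the fdlibm algorithms, while java.lang.Math is allowed to use faster platform-specific implementations, so the two can disagree in the last bits. A quick way to see whether your platform's fast path drifts:

    // Platform-dependent vs. reproducible math results in Java.
    // StrictMath is specified to follow fdlibm; Math may use faster
    // platform-specific code, so the two can differ in the last bits.
    public class LibmDrift {
        public static void main(String[] args) {
            double x = 1e10;
            double fast = Math.sin(x);
            double strict = StrictMath.sin(x);
            System.out.println("Math.sin       = " + fast);
            System.out.println("StrictMath.sin = " + strict);
            System.out.println("difference     = " + (fast - strict));
        }
    }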
It seems to me that if you are writing in an interpreted language, it must be difficult to sell software, because anyone who buys it can edit, change, or resell it without much difficulty.
How do you get around this? I have a couple of PHP apps that I'm reluctant to sell to people, as it seems too easy for them to change/read/edit/sell what I've produced.
Hardly anyone sells code. We sell the ability to create, edit, support and understand the code.
As a potential buyer of your application, I might find these features attractive:
The ability to change the code to suit my needs
The ability to read the code to better understand what it's doing
... and yes ...
The ability to sell my modifications
All three of those are features.
The third one might be a feature you can't afford to give me. Fix that through legal measures, not technical measures. That's what licensing is for. You could also sell more expensive licenses which do allow resale.
There are things you could do to remove the first two features, but bear in mind that in doing so you are reducing the overall value of your product to some people, and therefore its sale price.
For many people the primary reason for using Open Source software is not that it costs nothing -- it's that you get the source code.
People sell the service of creating web sites all the time. Also, even a compiled program can be altered; it's just more difficult.
Most of the time the user base does not understand how to make the changes or what to do with the scripts so you are really selling your knowledge of how to install and change the scripts.
Don't sell the software, sell "licences".
I'll try to explain better: build the web app but provide hosting for it. This way your client will never get to "hold" the source code.
If you really must deliver the source code, obfuscation is the way to go ;)
Possible routes to go:
Translate to a bytecode, binary or an obfuscated format
For instance, Splunk is written mostly in Python and distributes bytecode. The EVE Online client uses Stackless Python to compile to an executable binary.
Host the solution yourself
Put up a website, charge for use.
License the software
They get the source, but cannot legally modify or redistribute the source.
Open source the solution
Anyone can change the code, but you are the de facto authority on it, and you can earn money by selling support, consultancy, and customization services.
You could also consider a combination of approaches. E.g., partition your solution into several stand-alone packages, then open source some of them and sell bytecode versions of the other parts. What you then sell is the complete solution, as well as other services, and some people may benefit from and enhance other parts of the solution.
Plenty of companies make money off applications in interpreted languages and happily distribute the source code with them. Don't take this personally, but your program probably isn't going to be popular enough to have a large following of pirates. And anybody who would pirate your software probably isn't going to buy it in the first place; if they can't pirate it, they'll pirate somebody else's.
Whatever you do, please don't obfuscate your code. It's not an effective means of preventing infringement and it won't do anything other than make life miserable for you and your customers.
Protecting your secret bits is getting more and more difficult.
IMHO, your solution really depends on your target market. If you are targeting business, just give them the code with a good license, and possibly some type of defect so you can determine who gave your code away if that ever happens. Businesses will mostly pay for your app just to stay compliant; it's not worth the legal hassles. And if an individual gets your app for free, that's probably a good thing, since they will try to convince their current and future employers to buy it.
If you are targeting individuals, and can do it as a web app (which you obviously are with PHP), do it as a hosted service, and either sell a monthly subscription or allow free access and find another way to monetize it.
If you definitely need to or want to distribute it to individuals for whatever reason, you can give it away for free, and try to monetize customizations, add-ins, & other support features.
This is a problem that's been discussed a lot, and a few hours’ worth of really focused googling should reveal all the current philosophies on this.
I hope this helps.
Obfuscation may be a good way to go
With PHP you have the option of using Zend Guard for PHP. I believe it compiles the source code in a way similar to what the PHP interpreter does, so it should also increase performance. Of course, the price of $600 may be too much for your liking ;-)
Anyway, I see no reason why you shouldn't distribute your code with an open source license (see the Open Source Initiative for a list of available licenses). You can find one that prohibits your customer from redistributing your app.
EDIT:
As Novelocrat points out in his comment, a license that prohibits distribution is by definition not an open source license; the term "open source" refers to a lot more than just the availability of the source code. (See also the answers to this related question for further discussion.)
If you had to do without one or the other in a software project, which would you pick?
I've had plenty of projects in which the client or PM thought they could get away without one or the other. We always paid the price.
Turn this around and repeat after me: "Tests are requirements." :-)
If you mean "formal requirements", I can and easily do without those. I would much prefer a living, breathing customer who can tell me what they want over a rigid, out-of-date document. Having switched to TDD, I wouldn't ever want to go back to a "no test" environment. I choose informal requirements -- stories, on-site customer, and customer-written acceptance tests -- over formal requirements and no tests.
I'd say you could go without Testing rather than Requirements. If you don't have requirements, how do you know what you're developing?
If the programmers are good enough, they should be able to catch most of the egregious errors that testing would find.
You have to test against the requirements, so if you don't have requirements you can't do testing. So if you have to pick one, you can only pick requirements.
But not doing testing is a path to failure. Guaranteed.
If I had to pick one, it would be requirements.
It doesn't have to be a formal, excruciatingly detailed document with twenty signatures, but you have to know exactly what the customer wants and more importantly what the customer needs.
The requirements are also your first communication to the development team. How will they know what you're asking if you're not asking it clearly? At best you're at grave risk of building the wrong thing right. I'd rather have the right thing built slightly wrong.
If I were asked to choose between requirements and testing, I would choose to polish up my resume. You really can't do without either in any project, because the basic project lifecycle is:
Define Needs/Goals (AKA Requirements)
Design & Build to the requirements
Verify that you built to spec (to requirements).
If you don't have success criteria and goals that are verifiable (and that are then verified), how can you ensure that you are going to succeed? And if you don't have a chance to succeed, why start the project?
I would say requirements because there always seems to be some level of "feature creep" from the client when you are developing software. Testing is one of the crucial pieces in the SDLC.
Requirements and testing are important for most projects, but if you really have to pick, you should go with requirements. One advantage of picking requirements over testing is that you might save some development time, since the developers know what they have to build; and if development finishes with extra time in hand, you can allocate that time to testing :)
Tests (feature and integration) are more important than requirements; if you can specify the tests, then you have also specified the requirements, at least implicitly.
Comments are also the developer documentation, with unit tests being the how-to 'quickstart' examples ;-)
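As a toy illustration of tests doubling as requirements (the password rule here is invented for the sketch), the test names read as requirement statements while the bodies pin them down precisely enough to catch regressions:

    import org.junit.Test;
    import static org.junit.Assert.assertFalse;
    import static org.junit.Assert.assertTrue;

    // Each test name doubles as a requirement statement; the body makes
    // it executable. The PasswordPolicy rule is invented for this sketch.
    public class PasswordPolicyTest {
        @Test
        public void passwordsShorterThanEightCharactersAreRejected() {
            assertFalse(PasswordPolicy.isValid("short1A"));
        }

        @Test
        public void passwordsWithAtLeastEightCharactersAndADigitAreAccepted() {
            assertTrue(PasswordPolicy.isValid("longenough1"));
        }
    }

    class PasswordPolicy {
        static boolean isValid(String p) {
            return p.length() >= 8 && p.chars().anyMatch(Character::isDigit);
        }
    }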
I'm not sure if "requirements" here refers to an artefact or to a process. Although it is possible to skip requirements as an artefact, especially for smaller teams, and still deliver a product, skipping requirements as a process is out of the question. Requirements as an artefact let you model the system at a cost lower than building the entire thing, do feasibility studies and estimates, and, for a larger and more dispersed team, cut communication overheads and establish common ground. Neglect the requirements and you get lousy estimates (regardless of whether you plan a lot up front or just do a short sprint), a poor idea of feasibility, and possibly very inefficient communication and a lot of miscommunication.
Requirements as a process, on the other hand, will exist regardless of whether it is formally acknowledged or not. You cannot really exclude it; you can only pretend the requirements process does not exist, or fold it into design, coding, testing, or stages as late as pilot and maintenance. Obviously, treating the process this way means it will not get a fair amount of attention and resources. The consequences normally range from delivering something ultimately useless to having to fix the now-obvious shortcomings of the product later in the development cycle, or even discovering the real requirements once the product fails in the field, increasing the cost of development, defaulting on deadlines, ruining the team's good name, destroying user confidence, etc.
Testing usually boils down to validation and verification; more recently, improvements in testing technology have let automated testing become a solid tool for achieving greater efficiency in debugging and reducing the time necessary for regression testing. Validation is making sure that the team has built the right product, i.e. that the scoped requirements are correct, not contradictory, and have no gaps. Verification, on the other hand, is making sure that the product is built right: no technical defects, accidental errors, etc.
As we can see, testing provides a safety net in the scenario where requirements were neglected. Normally, as the team starts testing, they need to refine their understanding of the requirements and, as a result, modify the software. Since requirement artefacts and the software itself just represent different levels of fidelity in modelling a solution to a real-life problem, and software as a model is an order of magnitude more precise, testing the application evaluates the requirements as well (regardless of whether they are implicit or explicit, formally analysed or informally communicated).
Normally, the alternative to testing is to let users report a substantially larger number of defects and shortcomings and to try to fix them as part of maintenance (i.e. later in the product lifecycle), increasing the cost of every fix.
So, requirements versus testing? Fire the manager. OK: skip the requirements if you want the project schedule to slip during the testing phase and to get yourself into the mess of building something other than what users need; skip the testing if you just want to show utter disrespect to your users.
Without requirements you don't need testing, since what you end up with is exactly what was spec'd.
There are categories of software that can be developed perfectly well without requirements, at least anything more than a vaguely expressed idea the length of an email.
Thing is, if you have a specific client and a project manager, it is unlikely your software is in one of those categories. It's unlikely someone is specifically paying you to, say, 'make me a fun game involving a juggling monkey'.
The only category of software that can be developed without testing is failware: software where your company has managed to sucker some customer into paying whether or not it works (or, if you have a really dumb customer, paying more if it doesn't work, in support and maintenance).
That's probably the more likely case: contracts structured so that success is less profitable than failure are still fairly common. If you think that's the case, and you want to develop working software, then consider switching to a job where your interests and your boss's are less opposed.
Without requirements, can we make a test plan? No; so we can't do testing even if we pick testing instead of requirements.
So requirements should be the priority, even in an agile testing environment.