I have read the LoadRunner basics and have an overview of LoadRunner's components and the general workflow.
Since this is load testing of a website, I need to plan realistic scenarios for the site's functionality, for example 100 users who log in simultaneously.
In LoadRunner I need to create all of these users so that they emulate the steps of real users using the application; these would be virtual users (Vusers).
Could you please help me write this script? A sample script that creates a Vuser, along with a description of the script, would be a great help.
The relevant LoadRunner component is VuGen (the Virtual User Generator), which also runs the scripts. How do I execute it?
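For context (this is not from the original post), here is a minimal sketch of what a recorded login Vuser's Action section typically looks like in VuGen's web (HTTP/HTML) protocol. The URL, form name and field names are placeholders; in practice you record your own application and VuGen generates these calls for you:

```c
/* Action.c - minimal sketch of a login Vuser (web HTTP/HTML protocol).
   The URL and form/field names are placeholders; record your own
   application in VuGen to generate the real steps. */

Action()
{
    /* Open the login page and measure it as a transaction */
    lr_start_transaction("open_login_page");
    web_url("login_page",
            "URL=http://example.com/login",   /* placeholder URL */
            "Resource=0",
            "Mode=HTML",
            LAST);
    lr_end_transaction("open_login_page", LR_AUTO);

    lr_think_time(5);   /* emulate user think time between steps */

    /* Submit the login form; {username} and {password} would normally be
       parameterized so each Vuser logs in with different credentials */
    lr_start_transaction("login");
    web_submit_form("do_login",
            ITEMDATA,
            "Name=username", "Value={username}", ENDITEM,
            "Name=password", "Value={password}", ENDITEM,
            LAST);
    lr_end_transaction("login", LR_AUTO);

    return 0;
}
```

VuGen itself is used to record, edit and debug a single Vuser (replaying in VuGen runs one user); to actually run 100 simultaneous users you create a scenario in the Controller, point it at this script, and set the Vuser count and ramp-up there.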
Rohit,
There is no substitute for training in the tool. The path of self-education in this space is one that has been trodden by many, and the results are almost universally poor for the traveler. Affirm your core skills related to development, testing, project management, systems analysis, and monitoring and diagnostic actions. Attend training and work with a mentor. That is the path to success.
James Pulley
Moderator (LoadRunner Yahoo Groups, Advanced LoadRunner Yahoo Group, lr-LoadRunner Google Group, SQA Forums LoadRunner, SQAForums WinRunner, Linkedin LoadRunner, Linkedin LoadRunnerbythehour, The Pithy Performance list)
You can find useful articles on performance testing principles, as well as tips & tricks for performance testing tools (including HP LoadRunner), on this site:
http://www.perftesting.co.uk/
The best way of studying and understanding LoadRunner is the built-in tutorial.
After installation you get a pre-installed 10-day evaluation license, under which you can work through the extensive tutorial on writing and creating scripts.
Good luck!
Start with this,
https://automation-performance.blogs...e-testing.html
Then (in sequence)
https://automation-performance.blogs...oadrunner.html
https://automation-performance.blogs...scripting.html
https://automation-performance.blogs...nner-tool.html
Thereafter, a few other topics:
https://automation-performance.blogs...oadrunner.html
https://automation-performance.blogs...ttp-codes.html
https://automation-performance.blogs...ation-and.html
Once you are done with these, you will start getting the hang of what you are into; you can take it forward from there.
And for advanced learning of performance engineering, you can go to the following link:
https://automation-performance.blogs...gineering.html
My team is migrating from manual to automation testing. Our first issue is that we cannot find any tool that satisfies our need for generating visual and in-depth execution reports.
We really need your help to find a tool that provides detailed test reports which can be analyzed to visualize test execution status, performance, and flakiness in different ways.
My team has used several tools: Selenium, TestNG, Robot Framework & Katalon Studio. From my own experience, the closest fit among the tools we used is Katalon. Its native reports provide a "snapshot" of each execution session with intuitive charts & graphs.
With that visualization you can easily identify particular issues through flakiness analysis of your test executions. And of course, it's available for free.
My team is using Katalon Analytics; it has a lot of useful features and it's free. I have just begun to use it, and I think it's quite good at producing detailed reports. You should try it out.
I know this is a late response to the post.
But still, if someone is in need of a good test automation dashboard driven by just a few API calls, the ARES dashboard (from the ZenQ team) is a much better option to try, and it's completely free.
ARES is an acronym for Test Automation Results dashboard. It's a test-automation framework/tool-agnostic solution that simplifies the collection of test automation results and their analysis via a live dashboard, daily/weekly trends, frequent failures, etc.
Website: http://www.testastra.com/
The repo below has some code samples, documentation and usage notes for the ARES test automation dashboard:
https://github.com/testastra/ARES
My team is using https://github.com/last-hit-aab/last-hit for UI automation testing; the tool is powerful and can record and replay without script changes.
I am working on research into test case prioritization, and I need some sample programs with sets of test cases. I found some programs here.
But I need some more. If anyone has any resources like that, please share them with me. It would be a big help!
Thanks in advance
Incremental Scenario Testing Tool
Description
ISTT supports test teams in managing complexity by adaptively selecting and prioritizing test cases according to past test results. It guides testers through a test session with high-level test scenarios generated on the fly; a toy sketch of this kind of result-driven prioritization appears after the download link below.
Features
ISTT is an innovative hybrid of scripted and exploratory testing that automates test planning for flexible, adaptable and efficient test sessions.
Find as many bugs as possible by generating prioritized scenarios based on the analysis of previous test runs.
Involving developers allows test resources to be focused on areas of the software that are naturally error-prone.
Install it on your own server or try it for free on the demo server at https://istt.symphonyteleca.com/ ! (request trial access through the contact mailing list, which is linked from the login page)
For download or more information, please visit http://sourceforge.net/projects/istt/
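The posts above don't spell out ISTT's actual selection algorithm, but to illustrate the general idea of prioritizing test cases from past results, here is a toy, self-contained sketch (the sample data and the scoring formula are my own assumptions): each case is scored by its historical failure rate plus a bonus for recent failures, and the riskiest cases are suggested first.

```c
/* prioritize.c - toy sketch of result-based test case prioritization.
   The sample data and the scoring formula are illustrative assumptions,
   not ISTT's actual algorithm. */
#include <stdio.h>
#include <stdlib.h>

typedef struct {
    const char *name;
    int runs;            /* how many times the case has been executed */
    int failures;        /* how many of those runs failed */
    int runs_since_fail; /* runs since the last failure (lower = more recent) */
} TestCase;

/* Score = historical failure rate + a bonus for recent failures.
   Cases that have never run get the maximum failure rate so they are tried early. */
static double score(const TestCase *t)
{
    double fail_rate = t->runs ? (double)t->failures / t->runs : 1.0;
    double recency   = 1.0 / (1.0 + t->runs_since_fail);
    return fail_rate + 0.5 * recency;
}

static int by_score_desc(const void *a, const void *b)
{
    double sa = score((const TestCase *)a);
    double sb = score((const TestCase *)b);
    return (sb > sa) - (sb < sa);
}

int main(void)
{
    TestCase cases[] = {
        { "login",        20, 1, 15 },
        { "upload_image", 12, 4,  1 },
        { "search",        0, 0,  0 },   /* never executed yet */
        { "checkout",     30, 2,  8 },
    };
    size_t n = sizeof cases / sizeof cases[0];
    size_t i;

    qsort(cases, n, sizeof cases[0], by_score_desc);

    puts("Suggested execution order:");
    for (i = 0; i < n; i++)
        printf("%u. %-14s (score %.2f)\n", (unsigned)(i + 1), cases[i].name, score(&cases[i]));
    return 0;
}
```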
TestCube
Description:
Jataka TestCube is a web-based test case management tool designed to integrate and track enterprise-wide test cases.
For download or more information, please visit http://www.jatakasource.org/testcube/#1
Right now I'm working at a place where there's a lot of legacy code and pretty much no useful documentation.
Most of the time we just treat the business requirements as whatever was already implemented previously.
I'm looking for any tools or useful methods to keep all the requirements for future use, mostly for regression testing.
I'm thinking of maybe linking them up to tests/unit tests too, so that the business requirements are tied directly to the code logic.
Any good tools or resources to get me started?
thanks~
Updates
For now I'm keeping things simple by writing use cases, creating some simple use case diagrams with this awesome tool, and then converting each use case into a test plan. The test plan is meant for the end user, so I just make it a simple step-by-step flow. I had plans to automate this part with Selenium, but it wasn't working that well on our website and was taking too long. It's a bit TDD-like, but I think it creates a simple, understandable goal for both the end user and the developer, I hope.
So for now it's just Excel and doc files, lugged into the project doc folder and checked into CVS/SVN, doomed to be outdated and forgotten :P
Business requirements can be captured well in FitNesse tests. Unit tests surely help as well; put both together in a continuous integration server like Hudson to detect regressions as early as possible.
PS: Sorry, pretty much all the links go to articles I wrote, because I'm also interested in this subject.
Here are some methods/systems that I have used
HP Quality Center
Big and bulky. Not very agile, but it works and has a lot of features.
It's used in many larger corporations, and if you can afford it you can get great support from HP.
https://h10078.www1.hp.com/cda/hpms/display/main/hpms_content.jsp?zn=bto&cp=1-11-127-24%5E1131_4000_100__
Bugzilla-Testopia
An open source test case management extension for Bugzilla, managed by Mozilla, which is good enough in my book to give it a try.
http://www.mozilla.org/projects/testopia/
Excel/Open Office Calc
Just do everything in spreadsheets and link between them.
Very flexible, everybody knows how to use them, and you probably have the software in your organization already.
Other Open Source Solutions
List of 15+ Open Source Test Management Tools
http://www.jayphilips.com/2009/09/10/15-open-source-test-management-tools/
We would like to use Confluence for writing and managing our test cases. The Confluence Testplan plugin seems close to what I'm looking for, but it's a bit too simple and limited.
How are you using Confluence to manage your test cases?
We both do and don't use Confluence for managing our test cases.
We Don't
In my project we use, and love, Confluence, but only for documenting and spreading knowledge. I'm sorry, but I can't see how Confluence would be a good fit for writing and managing test cases.
We Do
We use Excel/Calc spreadsheets to write and manage manual test cases. We write them at a very high level, e.g. "Log in and upload a JPEG image." We expect all testers to have high domain knowledge and to know how to log in and upload images.
Then we upload the spreadsheets to a special page in Confluence. Every time the tests are run, before every release/sprint demo, we check them out, enter the results (sometimes adding new tests), and check the spreadsheet back in with comments.
It works fine, is fast, flexible and low-overhead, and it's ready to send to management or the customer at any time.
IMHO, spreadsheets honestly beat most test management tools.
Assuming that you are using Jira for Agile management, it is best practice to associate test cases with Jira tickets. Confluence does a nice job of letting users link user stories within the wiki; for instance, you can create a 'sub-task' against a user story. I typically write automated tests for all the user stories I test, so I can associate a git commit with a particular QA sub-task, which makes linking a ticket worthwhile. You might also want to look at the Confluence API; I push my automated test results into Confluence, which prints out my test cases.
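As a rough illustration of that last point (a sketch only, not the poster's actual setup: the base URL, space key and credentials are placeholders, and your Confluence version's REST endpoint may differ), a CI job can create a results page through Confluence's /rest/api/content endpoint, for example with libcurl:

```c
/* post_results.c - sketch: create a Confluence page with test results
   via the REST API. Build with: gcc post_results.c -lcurl
   The base URL, space key, page body and credentials are placeholders. */
#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
    const char *json =
        "{\"type\":\"page\","
        "\"title\":\"Automated test results - build 123\","
        "\"space\":{\"key\":\"QA\"},"
        "\"body\":{\"storage\":{"
        "\"value\":\"<p>42 passed, 3 failed</p>\","
        "\"representation\":\"storage\"}}}";

    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (!curl) return 1;

    struct curl_slist *hdrs = NULL;
    hdrs = curl_slist_append(hdrs, "Content-Type: application/json");

    curl_easy_setopt(curl, CURLOPT_URL,
                     "https://confluence.example.com/rest/api/content");
    curl_easy_setopt(curl, CURLOPT_USERPWD, "ci-bot:secret"); /* placeholder credentials */
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, hdrs);
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, json);

    CURLcode rc = curl_easy_perform(curl);
    if (rc != CURLE_OK)
        fprintf(stderr, "request failed: %s\n", curl_easy_strerror(rc));

    curl_slist_free_all(hdrs);
    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return rc == CURLE_OK ? 0 : 1;
}
```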
On the topic of using spreadsheets: it's a terrible practice. Test cases should be accessible to anyone, and I don't mean on a shared drive somewhere. Product, management and anyone in engineering should be able to visit a page and look at the test cases, coverage & results.
If the question is about functional testing or BDD, did you check GreenPepper? See the documentation.
We're not using Confluence for test cases right now, but we are for use cases. I wrote up some examples about how we manage use cases here. The general idea could probably be applied to test cases also.
What are the most common things to test in a new site?
For instance to prevent exploits by bots, malicious users, massive load, etc.?
And just as importantly, what tools and approaches should you use?
(some stress test tools are really expensive/hard to use; do you write your own? etc.)
Common exploits that should be checked for.
Edit: the reason for this question comes partly from being in the SO beta (please refrain from SO beta discussion, though); it got me thinking about my own site, and a good thing too. This is meant to be a checklist of things that I, you, or someone else hasn't thought of before.
Try to break your own site before someone else does. Your web site is basically a publicly accessible API that allows access to a database and other backend systems, so test the URLs as if they were any other API. I like to start by cataloguing all URLs that have some sort of permanent effect on the state of the system - this is easy if you are doing Ruby on Rails development or trying to follow a RESTful design pattern. For each of those URLs, try running the GET, POST, PUT and DELETE HTTP methods with different parameters so that you can ensure you're only giving access to what you want to give access to.
This is of course in addition to the obvious: functional testing, load testing, SQL injection, XSS, etc.
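One possible shape for that probing loop (a sketch; the URLs are placeholders and you should only ever run it against a site you own) is to issue each HTTP method against every catalogued URL and compare the returned status codes with what you intended to allow:

```c
/* probe_methods.c - sketch: hit each catalogued URL with several HTTP
   methods and print the response codes. Build with: gcc probe_methods.c -lcurl
   Only run this against a site you own; the URLs below are placeholders. */
#include <stdio.h>
#include <curl/curl.h>

/* Discard response bodies; we only care about status codes here. */
static size_t discard(char *p, size_t sz, size_t nm, void *ud)
{
    (void)p; (void)ud;
    return sz * nm;
}

int main(void)
{
    const char *urls[] = {
        "http://example.com/posts/1",
        "http://example.com/users/1/avatar",
    };
    const char *methods[] = { "GET", "POST", "PUT", "DELETE" };
    size_t u, m;

    curl_global_init(CURL_GLOBAL_DEFAULT);

    for (u = 0; u < sizeof urls / sizeof urls[0]; u++) {
        for (m = 0; m < sizeof methods / sizeof methods[0]; m++) {
            CURL *curl = curl_easy_init();
            if (!curl) continue;

            curl_easy_setopt(curl, CURLOPT_URL, urls[u]);
            curl_easy_setopt(curl, CURLOPT_CUSTOMREQUEST, methods[m]);
            curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, discard);

            long code = 0;
            if (curl_easy_perform(curl) == CURLE_OK)
                curl_easy_getinfo(curl, CURLINFO_RESPONSE_CODE, &code);

            /* A 200/302 on a method you meant to forbid is worth a closer look. */
            printf("%-6s %-40s -> %ld\n", methods[m], urls[u], code);
            curl_easy_cleanup(curl);
        }
    }
    curl_global_cleanup();
    return 0;
}
```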
Turn off JavaScript and make sure your site can still be navigated.
Even if you want to ignore the small but significant number of people who have it disabled, this will impact search engines as well.
YSlow can give you a quick analysis of different metrics.
Check what friendly bots (e.g. Google) see, using Google Webmaster Tools.
Regarding tools for running functional tests of web pages, I've found Selenium IDE to be useful.
The Firefox plug-in (only compatible with version 2 at the moment) lets you capture almost all web events, save them, and replay them in the same browser.
In conjunction with another Firefox add-on, Firebug (https://addons.mozilla.org/en-US/firefox/addon/1843), you can create some very powerful tests.
If you set up Selenium Remote Control, you can then convert the Selenium IDE tests into NUnit tests, which you can run automatically.
I use CruiseControl and run these web tests as part of a daily build.
The nice thing about using Selenium remote control is that it can run the same functional tests on multiple browsers and operating systems, something that you can't do with the IDE.
Although the web tests will take ages to run, there is a version of Selenium called Selenium Grid that lets you use any old hardware you have spare to run the tests in parallel as part of a computing grid. I haven't tried this myself, but it sounds interesting.
All of the above is open source and free, which helped me convince management to use it :-)
For checking the cross-browser and cross-platform look of your site, browsershots.org is maybe the best free tool; it can save a lot of time and cost.
There are separate stages for this one.
Firstly there's the technical testing, where you check all technical functionality:
SQL injections
Cross-site Scripting (XSS)
load times
stress levels
Then there's the phase where you have someone completely computer-illiterate sit down and ask them to find something. Not only does it show you where there are flaws in your navigational logic (I find that developers look at things very differently from 'other people'), but they're also guaranteed to find some way to break your site.