I am using JMeter to test my own web application with HTTP requests. The final results seem okay, but I have one question: is there any detailed testing standard? I am writing a report which needs some data as a reference.
For example, something like "connect time and load time should be lower than XXXX ms" or "sample time should be between XX and XX".
I couldn't find any references about this. Does anyone know of something I can use as reference data?
I doubt you will be able to find "references". Normally, when people invest in performance testing, they either have non-functional requirements to check, or they would rather spend money on performance testing to find out if/when/where their system breaks than lose money for every minute of unexpected system downtime.
So if you're developing an internal application for your company, users will "have to" wait until it does its job, as they don't have any alternative. On the other hand, they will lose their valuable time, so you will be like "serial programmer John".
If you're running an e-commerce website and it is not responsive enough, users will just go to your competitors and never return.
If you still want some reference numbers:
According to a report by Akamai, 49% of respondents expected web pages to load in under 2 seconds, 30% expected a 1-second response, and 18% expected a site to load immediately. 50% of frustrated users will visit another website to accomplish their activity, and 22% will leave and never return to a website where problems have occurred.
Similarly, a Dynatrace survey last year found that 75 percent of all smartphone and tablet users said they would abandon a retailer's mobile site or app if it was buggy, slow or prone to crashes.
See the Why Performance Testing Matters - Looking Back at Past Black Friday Failures article for more information.
Feng,
There is no standard acceptance criterion for application performance. Most of the time the product owner decides what an acceptable response time is, but as performance testers we should generally recommend keeping response times within 2 seconds.
If you are running performance testing on your application for the first time, it's good to set a benchmark and baseline; based on those, you can run your future tests and make recommendations to the development team.
In performance testing, you can set benchmarks for the following KPIs:
Response time
Throughput
Also, it's recommended to share a detailed performance report with the stakeholders so that they can easily make their decisions. JMeter now provides a Dashboard Report that contains all the critical KPIs and performance-related information.
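If you run your test in non-GUI mode, you can have JMeter generate that dashboard as part of the run, for example `jmeter -n -t plan.jmx -l results.jtl -e -o dashboard/` (the file and folder names here are placeholders); `-e` builds the HTML report at the end of the test and `-o` sets the output folder.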
We use an internal website for our office work as part of our daily routine. The website, however, works haphazardly and is frequently down or has a very slow response time. In spite of raising the issue with the admin multiple times, there has been no improvement in the website's performance, and the IT department denies that there is a systemic problem, insisting that the issue is very rare.
I want suggestions for desktop software / browser extensions that I can use to record the website's response time every 5 minutes during office hours, so that I can present the data at the next meeting presided over by our 'big boss'. I would prefer software / a browser extension that keeps measuring and storing the response times in the background at whatever interval I set.
Note: I know that browser extensions cannot write files to the hard disk, but even a popup with the median response time will do, as I just want to show with supporting data that there is a systemic problem with the website.
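If a small script is an acceptable alternative to a packaged tool, the background measuring could be done with something along these lines (a minimal sketch in Python using the `requests` library; the URL, interval, and output file are placeholder assumptions):

```python
import csv
import time
from datetime import datetime

import requests  # third-party: pip install requests

URL = "http://intranet.example.com/"  # placeholder: your internal site
INTERVAL_SECONDS = 5 * 60             # measure every 5 minutes

with open("response_times.csv", "a", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "status", "seconds"])
    while True:
        started = datetime.now().isoformat(timespec="seconds")
        try:
            response = requests.get(URL, timeout=30)
            # elapsed covers the time from sending the request until the
            # response headers arrive - a reasonable proxy for responsiveness
            writer.writerow([started, response.status_code,
                             response.elapsed.total_seconds()])
        except requests.RequestException as exc:
            # Record the downtime too - that is the evidence you want.
            writer.writerow([started, "ERROR: " + type(exc).__name__, ""])
        f.flush()
        time.sleep(INTERVAL_SECONDS)
```

The resulting CSV can be charted in a spreadsheet for the meeting.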
I'm new to the field of website performance testing and will be using JMeter. After playing with it, I am still having trouble identifying what to optimize in a website's load time.
I'm still learning about load testing - who should I give the performance report to? Developers/programmers, or the network department? Examples of errors I usually get are 502s and timeouts.
Thanks in advance.
JMeter cannot identify anything; all it does is execute HTTP requests and measure response times. Ideally it should be you who takes the raw JMeter results, performs the analysis, and creates the final report highlighting the current problems and bottlenecks (and ideally what needs to be done to fix them).
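To illustrate the "you perform the analysis" part: JMeter writes raw results to a JTL file, which by default is a CSV containing `timeStamp`, `elapsed`, `label`, `responseCode`, and `success` columns among others. A first pass over it might look like this (a sketch in Python; the file name and the choice of the 90th percentile are assumptions):

```python
import csv
import statistics
from collections import defaultdict

# Collect elapsed times (milliseconds) per sampler label.
elapsed_by_label = defaultdict(list)
with open("results.jtl", newline="") as f:
    for row in csv.DictReader(f):
        elapsed_by_label[row["label"]].append(int(row["elapsed"]))

for label, samples in elapsed_by_label.items():
    samples.sort()
    p90 = samples[int(len(samples) * 0.9)]  # rough 90th percentile
    print(
        f"{label}: {len(samples)} samples, "
        f"median {statistics.median(samples)} ms, 90% line {p90} ms"
    )
```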
Consider the following checklist:
Your load test needs to be realistic; a test which doesn't represent real-life application usage doesn't make sense. So make sure your JMeter test carefully represents real users in terms of cookies, headers, cache, downloading of images, styles and scripts, virtual user group distribution, etc.
Increase and decrease the load gradually. This way you will be able to correlate metrics such as transactions per second and response time with the increasing/decreasing number of users, so make sure you apply reasonable ramp-up and ramp-down settings.
Monitor the health of the application under test. The cause of errors may be as simple as a lack of hardware resources (CPU, RAM, disk, etc.). This can be done using e.g. the PerfMon JMeter Plugin; a bare-bones alternative is sketched after this list.
Do the same for your JMeter instance(s). JMeter measures response time from "just before sending the request" until "the last response byte arrives", so if JMeter itself is not able to send requests fast enough, you will see high response times with no other visible cause.
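The PerfMon plugin is the JMeter-native way to do this. Purely as an illustration of the idea, here is a bare-bones resource logger you could run on the application or JMeter host (a sketch assuming Python with the third-party `psutil` package; the interval and file name are placeholders):

```python
import csv
import time

import psutil  # third-party: pip install psutil

# Append CPU/RAM/disk usage to a CSV once per interval, so the numbers
# can later be lined up against the JMeter results timeline.
INTERVAL_SECONDS = 5  # assumption: adjust to your sampling needs

with open("resource_usage.csv", "a", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "cpu_percent", "ram_percent", "disk_percent"])
    while True:
        writer.writerow([
            time.strftime("%Y-%m-%d %H:%M:%S"),
            psutil.cpu_percent(interval=1),   # CPU averaged over 1 second
            psutil.virtual_memory().percent,  # RAM in use
            psutil.disk_usage("/").percent,   # disk usage of the root volume
        ])
        f.flush()
        time.sleep(INTERVAL_SECONDS)
```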
Website load time is a combination of many factors, including browser rendering time, script execution time, resource download time, etc. You can't use JMeter to validate front-end time; you can measure it using Chrome Developer Tools and the similar tools available for each browser. Refer to https://developers.google.com/web/fundamentals/performance/
JMeter is primarily used for measuring protocol-level performance, to ensure that your server can process heavy workloads when it is subjected to real-time stress from many concurrent customers. It won't compute JavaScript execution time or HTML parsing time. Your JMeter script should be written in such a way that it emulates the logic of your JavaScript and other presentation logic in order to form the request inputs and the subsequent requests.
Your question is way too open-ended, and you might have to start with a mentor who can help you with the whole process and train you.
Also, the mindsets for functional testing and performance testing are totally different. Many key players in the performance area have suggested measuring load time as part of the functional testing effort, while the majority of server-side performance is validated by the performance team.
I want to know how I can test my website's (web-based program's) performance in terms of speed and response time when using MS SQL Server and ASP.NET.
Specifically, I want to know how the speed and performance will change when my users increase to 1,000,000 and more.
Thank you
There are a number of tools for running load tests against websites; I like JMeter (http://jmeter.apache.org/) - open source, free, easy to use - but there are lots of others; google "web performance testing" and take your pick.
All those tools allow you to specify a number of concurrent users, wait times between page requests, and then specify one or more user journeys through the site. They will give you a report showing response times as the number of users changes.
You can install the load testing applications on any machine; most have the concept of a "controller" and "load agents". The controller orchestrates the load test, while the load agents execute the tests. Generating the equivalent load of 1 million visitors is likely to require significant horsepower - you may need to use one of the cloud providers of load testing solutions. Again, Google is your friend here.
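For JMeter specifically, this controller/agent split is called distributed testing: you start the `jmeter-server` script on each agent machine and point the controller at them with something like `jmeter -n -t plan.jmx -R agent1,agent2` (the plan and host names here are placeholders).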
We are working with a .NET 3.5 app which is fast approaching legacy status. We have an existing SOAP service which reads records from our database and saves them to a third party MS SQL database, sending all the data rows in a single batch.
This has always worked fine, but recently we've taken on a much larger client than any we've had before, and they are transmitting much larger batches - so much so that the batches have begun to fail. We've upped the timeout and max memory sizes in IIS, and maxed out the maxRequestLength in web.config, but we are still bumping up against size problems.
So, I understand that long term we should consider moving away from SOAP to WCF, and plans for that are in the works. But in the meantime, we need a short-term fix for this new client. And of course, to make the business and sales people happy, we need it kinda quickly.
I'm wondering what the best-practice approach might be. Initially I'm thinking something like this, but I could be thinking inside the box too much:
Establish a benchmark number of records over which we don't want to attempt to sync all at once.
Before attempting to save the data, check the number of records against that benchmark.
If it's above it, break the transmission down into segments which are each below that benchmark: SELECT TOP 10000 * FROM table WHERE sent = false, etc., if the benchmark is 10,000. Then update sent to true for those records once submitted. Repeat.
Obviously, this will slow the process down, so to handle the user experience, we may want to toss in a status bar so they can see the progress.
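Roughly, the loop I have in mind is this (sketched here in Python with pyodbc purely for illustration - the real service is .NET, and the table, column, and `submit_batch` names are placeholders):

```python
import pyodbc  # third-party: pip install pyodbc

BATCH_SIZE = 10000  # the benchmark from step 1; a placeholder value

conn = pyodbc.connect("DSN=SourceDb")  # placeholder source-database connection
cursor = conn.cursor()

while True:
    # Pull the next segment of unsent rows, no more than BATCH_SIZE at a time.
    rows = cursor.execute(
        f"SELECT TOP {BATCH_SIZE} id, payload FROM table WHERE sent = 0"
    ).fetchall()
    if not rows:
        break  # everything has been synced

    submit_batch(rows)  # hypothetical helper that calls the SOAP service

    # Mark just-submitted rows as sent so the next iteration skips them.
    ids = [row.id for row in rows]
    cursor.execute(
        "UPDATE table SET sent = 1 WHERE id IN ({})".format(
            ",".join("?" * len(ids))
        ),
        ids,
    )
    conn.commit()
```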
Am I on the right track?
In addition to the comments from John, you should consider whether you are solving the problem in the optimal way.
It looks like you are triggering a one-way sync between two databases by calling a web service. This approach leads to the timeout and memory problems that you are experiencing.
If your goal is a one-way sync, you could use a free framework such as Microsoft's Sync Framework: http://msdn.microsoft.com/en-US/sync
Recently a Rails 3 app we built and host had some issues with the Google Analytics tracker installed. This resulted in vastly diminished statistics being tracked during the last month. We have our production logs from the app and I'm wondering if anyone knows of any way to parse these to produce visitor statistics (similar to what web analytics packages would provide). We need to deliver a stats report this week and would like to have some account for the missing visitors. Any suggestions or help would be greatly appreciated!
Probably the better place to look would be your web server logs. 5 or 10 years ago all the popular analytics software gobbled up web server logs, and there are a few free ones out there. Google "web log analytics" and see if there's anything suitable.
The problem is, web logs contain all traffic, and for many websites this can include all sorts of sources you don't care about, like GoogleBot and other crawlers that index your site for search engines ... and many more. Look for software that will try to filter these out, and will also know to ignore assets (JS, CSS, images, etc.). Analytics doesn't have to worry about this kind of thing, since it's based on cookies and JavaScript running in a real visitor's browser.
No matter how good these programs are, there are two things you'll need to take into account.
Numbers will not align with GA, and you'll go crazy if you try to make them add up -- the differences can be astonishingly large, as much as 20% or more.
It may be more work than it's worth to get the software configured -- even if you do, the level of detail pales in comparison to GA.
If you're handy with grep, the Rails log might help you get some quick-and-dirty counts (although they also record all traffic, unless users need to log in, in which case logs may be a little less noisy).
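For example, a quick-and-dirty pass over a Rails 3 production log might look like this (a sketch in Python; the `Started GET "..."` request-line format is the Rails 3 default, and the bot/asset filters are assumptions you would tune for your app):

```python
import re
from collections import Counter

# Rails 3 request lines look like:
#   Started GET "/some/path" for 1.2.3.4 at 2012-09-26 14:51:42 -0700
REQUEST_LINE = re.compile(
    r'^Started \w+ "(?P<path>[^"]+)" for \S+ at (?P<date>\d{4}-\d{2}-\d{2})'
)

hits_per_day = Counter()
with open("production.log") as log:
    for line in log:
        match = REQUEST_LINE.match(line)
        if not match:
            continue
        # Crude filter: skip asset requests that GA would never have counted.
        path = match.group("path")
        if path.startswith("/assets") or path.endswith((".js", ".css", ".png")):
            continue
        hits_per_day[match.group("date")] += 1

for date, hits in sorted(hits_per_day.items()):
    print(date, hits)
```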
A different approach might be to look in your database -- is there anything you can track that acts as a proxy for a visit or any other goal you have been tracking? How useful this is depends entirely on your app and what you store in the database.
Some combination of the above may be the best way to get at something, but I hate to be the bearer of bad news -- it's very likely that what you're able to glean from logs creates more confusion than it's worth. Been there, tried that :-(
Tom