Software / Browser Extension to check the median response time of a website during specific times - response-time

We use an internal website for our office work as part of our daily routine. However, the website works haphazardly: it is frequently down or has a very slow response time. In spite of our taking up the issue with the admin multiple times, there has been no improvement in the website's performance. The IT department denies that there is a systemic problem and insists that the issue is very rare.
I would like suggestions for desktop software / browser extensions that I can use to record the response time of the website every 5 minutes during office hours, so that I can present the data at the next meeting presided over by our 'big boss'. I would prefer that the software / browser extension keep measuring and storing the response times in the background at whatever interval I set.
Note: I know that browser extensions cannot write files to the hard disk, but even a popup with the median response time will do, as I just want to show, with supporting data, that there is a systemic problem with the website.
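For what it's worth, if a small script is acceptable instead of an extension, this is easy to do without any special software. Below is a minimal Python 3 sketch (the URL, interval, and output file are placeholder assumptions to adjust; it uses the third-party requests library):

    # Poll a URL at a fixed interval, append each response time to a CSV
    # file, and print the running median. Outages are recorded as NaN.
    import csv
    import statistics
    import time
    from datetime import datetime

    import requests

    URL = "http://intranet.example.com/"  # placeholder: your internal site
    INTERVAL_SECONDS = 300                # every 5 minutes
    LOGFILE = "response_times.csv"        # placeholder output file

    samples = []
    while True:
        timestamp = datetime.now().isoformat(timespec="seconds")
        try:
            elapsed = requests.get(URL, timeout=60).elapsed.total_seconds()
        except requests.RequestException:
            elapsed = float("nan")        # site down: record the outage
        samples.append(elapsed)
        with open(LOGFILE, "a", newline="") as f:
            csv.writer(f).writerow([timestamp, elapsed])
        ok = [s for s in samples if s == s]  # drop NaN outage markers
        if ok:
            print(f"{timestamp}  median so far: {statistics.median(ok):.3f}s")
        time.sleep(INTERVAL_SECONDS)

Left running on a desktop during office hours, the CSV gives you raw data for the meeting and the console shows the median at a glance.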

Related

JMeter test standards

I am using JMeter to test my own web application with HTTP requests. The final results seem okay, but I have one question: are there any standard criteria for the test results? I am writing a report which needs some data as a reference.
For example, something like: the connect time and loading speed should be lower than XXXX ms, or the sample time should be between XX and XX.
I didn't find any references about this, so does anyone know of something that I can use as reference data?
I doubt you will be able to find "references". Normally, when people invest in performance testing, they either have non-functional requirements to check, or they would rather spend money on performance testing to see if/when/where their system breaks than lose money for every minute of unexpected system downtime.
So if you're developing an internal application for your company, users will "have to" wait until it does its job, as they don't have any alternative. On the other hand, they will lose their valuable time, so you will be like "serial programmer John".
If you're running an e-commerce website and it is not responsive enough, users just go to your competitors and never return.
If you still want some reference numbers:
According to a report by Akamai, 49% of respondents expected web pages to load in under 2 seconds, 30% expected a 1-second response, and 18% expected a site to load immediately. 50% of frustrated users will visit another website to accomplish their activity, and 22% will leave and won't return to a website where problems have occurred.
Similarly, a Dynatrace survey last year found that 75 percent of all smartphone and tablet users said they would abandon a retailer's mobile site or app if it was buggy, slow or prone to crashes.
See the Why Performance Testing Matters - Looking Back at Past Black Friday Failures article for more information.
Feng,
There is no standard acceptance criterion for application performance. Most of the time the product owner decides the acceptable response time, but as performance testers we should always recommend keeping response times within 2 seconds.
If you are running performance tests against your application for the first time, it is good to establish a benchmark and baseline; based on those you can run your future tests and make recommendations to the development team.
In performance testing, you can set benchmarks for the following KPIs:
Response time
Throughput
Also, it's recommended to share a detailed performance report with the stakeholders so that they can easily make their decisions. JMeter now provides a Dashboard Report that contains all the critical KPIs and performance-related information.
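For reference, the dashboard is generated from the command line. With JMeter 3.0 or later, a non-GUI run along these lines produces it (the test-plan and output names below are placeholders):

    jmeter -n -t test_plan.jmx -l results.jtl -e -o report-dashboard

Here -n runs JMeter in non-GUI mode, -t and -l name the test plan and results file, and -e / -o generate the HTML dashboard into the given (empty) output folder.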

Testing website program performance under pressure (visualization)

I want to know how I can test my website's (web-based program's) performance, in terms of speed and response time, when using MS SQL Server and ASP.net.
Specifically, I want to know how the speed and performance change when my users increase to 1,000,000 and more.
Thank you
There are a number of tools to run load tests against web sites; I like JMeter (http://jmeter.apache.org/) - open source, free, easy to use - but there are lots of others - google "web performance testing" and take your pick.
All those tools allow you to specify a number of concurrent users, wait times between page requests, and then specify one or more user journeys through the site. They will give you a report showing response times as the number of users changes.
You can install the load testing applications on any machine; most have the concept of a "controller" and "load agents". The controller orchestrates the load test, while the load agents execute the tests. Generating the equivalent load of 1 million visitors is likely to require significant horsepower; you may need to use one of the cloud providers of load-testing solutions. Again, Google is your friend here.
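If you just want to see the shape of the effect before committing to a tool, a rough Python 3 sketch like the one below (standard library only; the URL and user counts are placeholders, and it has none of JMeter's pacing or user-journey features) shows how response time grows with concurrency:

    # Fire N concurrent requests for increasing N and report the average
    # response time at each load level.
    import statistics
    import time
    from concurrent.futures import ThreadPoolExecutor
    from urllib.request import urlopen

    URL = "http://example.com/"  # placeholder: your ASP.NET site

    def timed_request(_):
        start = time.perf_counter()
        with urlopen(URL, timeout=30) as resp:
            resp.read()
        return time.perf_counter() - start

    for users in (1, 10, 50, 100):
        with ThreadPoolExecutor(max_workers=users) as pool:
            times = list(pool.map(timed_request, range(users)))
        print(f"{users:4d} concurrent users -> avg {statistics.mean(times):.3f}s")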

How do I stop spammers from using my site's FTP, bandwidth, and MySQL?

THE PROBLEM
My server gave me an ultimatum (3 business days):
"We regret to say That database is currently consuming excessive resources on our servers Which causes our servers to degrade performance Affecting ITS customers to other database driven sites are hosted on this server That. The database / tables / queries statistical information's are provided below:
AVG Queries / logged / killed
79500/0/0
There are Several Reasons where the queries gets Increased. Unused plugins will Increase the number of queries. If the plugins are not causing the issue, you can go ahead and block the IP addresses of the spammers Which will optimize the queries. Also you can look for any spam Existed contents in the database and clear them up.
You need to check for the top hitters in the Stats page. Depending upon the bandwidth accessed, top hits and IP you need to take specific actions on Them to optimize the database queries. you need to block the Unknown robot (Identified by 'bot *'). Since These bots are scraping content from your website, blog comment spamming your area, harvesting email addresses, sniffing for security holes in your scripts, trying to use your mail form scripts as relays to send spam email. .htaccess Editor tool is available to block the IP address."
THE BACKGROUND
The site was made 100% by us in VB.NET and MySQL on a Windows platform (except for the Snitz Forum). The only point from which we received spam was a comments form, which now has a captcha. We are talking about more than 4,000 files (tools, articles, forums, etc.) for a total of 19 GB of space; the upload alone takes me two weeks.
STATISTICS OF ROBOTS
AWStats tells us the following for the month of February 2012:
ROBOTS AND SPIDERS
Googlebot: 2,572,945+303 accesses, 5.35 GB
Unknown robot (identified by 'bot*'): 772,520+2,740 accesses, 259.55 MB
BaiDuSpider: 96,639+95 accesses, 320.02 MB
Google AdSense: 35,907 accesses, 486.16 MB
MJ12bot: 33,567+1,208 accesses, 844.52 MB
Yandex bot: 18,876+104 accesses, 433.84 MB
[...]
STATISTICS OF IP
IP (pages, accesses, bandwidth):
41.82.76.159: 11,681 pages, 12,078 accesses, 581.68 MB
87.1.153.254: 9,807 pages, 10,734 accesses, 788.55 MB
[...]
Other: 249,561 pages, 4,055,612 accesses, 59.29 GB
THE SITUATION
Help!!! I don't know how to block an IP with .htaccess, and I don't even know which IPs to block! I'm not sure! And AWStats stops short of the past 4 days!
In the past I already tried changing the FTP and account passwords: nothing! I think these are not targeted attacks but generic ones aimed at obtaining backlinks and redirects (which often do not work)!
This isn't really an .htaccess issue. Look at your own stats: you've had roughly 4M hits, at some 12 KB per hit, for the month. I ran the OpenOffice.org user forums for 5 years, and this sort of access rate can be typical for a busy forum. I used to run on a dedicated quad-core box but migrated to a modern single-core VM, and when tuned, it took this sort of load.
The relative Bot volumetrics are also not surprising as a % of these volumes, nor are the 75K D/B queries.
I think what your hosting provider is pointing out is that you are using an unacceptable amount of system (D/B) resources for your type of account. You either need to upgrade your hosting plan or examine how you can optimise your database use - e.g., are your tables properly indexed, and do you routinely run a Check/Analyze/Optimize pass over all tables? If not, then you should!
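For the routine maintenance just mentioned, the statements are built into MySQL; a quick sketch (the table name is a placeholder for your own forum tables, and each statement is repeated per table):

    -- Routine maintenance for a heavily used table
    CHECK TABLE forum_posts;     -- verify the table for errors
    ANALYZE TABLE forum_posts;   -- refresh index statistics for the optimizer
    OPTIMIZE TABLE forum_posts;  -- reclaim space and defragment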
It may well be that spammers are exploiting your forum for SPAM link posts, but you need to look at the content in the first instance to see if this is the case.
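If, after reviewing the content, you do decide to block the worst offenders at the web-server level, a minimal .htaccess sketch (Apache 2.2 syntax; the IP addresses and bot name below are simply taken from the AWStats figures quoted in the question, not a judgement that these are the culprits) looks like this:

    # Deny the heaviest individual IPs from the stats page
    Order Allow,Deny
    Allow from all
    Deny from 41.82.76.159
    Deny from 87.1.153.254

    # Flag and deny an abusive crawler by its User-Agent string
    SetEnvIfNoCase User-Agent "MJ12bot" bad_bot
    Deny from env=bad_bot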

debugging a slow site -> long delay between connection and data sending

I ran a test from Pingdom Tools to check the loading time of my website... the result is that I have a lot of files that, in spite of being very small (5 kB), take a long time (1 second or more) to load, because there is a big delay between the start of the connection and the start of the data download (in Pingdom Tools, this shows up as a very large green bar).
Have a look at this for example: http://tools.pingdom.com/default.asp?url=http%3a%2f%2fwww.giochigratis-online.net%2f&id=5691308
How can I lower the "green bar" time? Is this an Apache problem (like, I don't know, the number of max parallel connections, or something similar...), or a hardware problem? Is it CPU-limited, bandwidth-limited, or something else?
I see that many other websites have very short green bars... how do they reduce the delay between the connection and the actual sending of data?
Thanks!
PS: the site is made with Drupal. Homepage generation takes about 700 ms.
PPS: I tested 3 other websites on the same server: same problem.
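One way to confirm where the time goes, independently of Pingdom, is to time the TCP connect and the first byte separately. A rough Python 3 sketch (plain sockets, plain HTTP, hostname taken from the question):

    # Separate connect time from time-to-first-byte - the latter is the
    # "green bar" delay between connection and data sending.
    import socket
    import time

    HOST = "www.giochigratis-online.net"

    start = time.perf_counter()
    sock = socket.create_connection((HOST, 80), timeout=30)
    connected = time.perf_counter()

    request = f"GET / HTTP/1.1\r\nHost: {HOST}\r\nConnection: close\r\n\r\n"
    sock.sendall(request.encode())
    sock.recv(1)                          # blocks until the first byte arrives
    first_byte = time.perf_counter()
    sock.close()

    print(f"connect:    {connected - start:.3f}s")
    print(f"first byte: {first_byte - connected:.3f}s after connect")

If the connect time is small but the first byte is slow, the delay is the server generating the page (Drupal bootstrap, database queries) rather than the network.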
I think it could be a problem with the max number of parallel connections, as you mentioned - either on the server or the client side. For instance, Firefox has a default of network.http.max-connections-per-server = 15 (see here), while you have >70 files to be downloaded from your domain and another 40 from Facebook.
You can reduce the number of loaded images by generating sprites, i.e. a single image consisting of multiple small images, which you then crop with CSS to display the right part in the right place. This is widely used, e.g. by Google - see http://www.google.com/images/nav_logo83.png
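For illustration, the sprite technique itself is only a few lines of CSS (the class names and offsets here are invented and must match your own sprite sheet's layout):

    /* One shared sprite image; each class shifts the background so that
       only its own 16x16 region shows. */
    .icon        { background: url(sprite.png) no-repeat; width: 16px; height: 16px; }
    .icon-home   { background-position: 0 0; }
    .icon-search { background-position: -16px 0; }
    .icon-user   { background-position: -32px 0; }

One HTTP request then serves every icon, which directly reduces the number of connections competing for the per-server limit.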

Robust and Accurate IIS Reporting tool for failover IIS Services

Good day,
I need to be able to produce IIS usage reports for our SharePoint 2007 custom application. The application runs on a 2-server IIS 6 farm for load-balancing/failover purposes.
Here is the list of requirements that my management poses:
1. Daily visitors (per farm).
2. Daily hits (per farm).
3. Daily activity (hits, page views, visitors, avg. session duration).
4. Activity by hour of day (for the whole farm).
5. Activity by day of week (for the whole farm combined).
6. Activity by month.
7. Page access statistics / most popular pages.
8. Top authenticated users.
9. Browser usage statistics.
10. Client OS usage statistics.
So I need to combine the report results from the two IIS boxes in the load-balanced rotation.
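As an aside - not a tool recommendation - the combining step itself is simple to script, since IIS W3C logs are plain text. A minimal Python 3 sketch (the UNC paths are placeholders; it assumes the date is the first field, as in the default W3C layout - check the #Fields: header of your own logs):

    # Merge W3C-format IIS logs from both farm nodes and count daily hits
    # across the whole farm.
    from collections import Counter
    from pathlib import Path

    LOGS = [
        r"\\iis-box1\logs\ex120201.log",  # placeholder paths
        r"\\iis-box2\logs\ex120201.log",
    ]

    daily_hits = Counter()
    for log in LOGS:
        for line in Path(log).read_text().splitlines():
            if line.startswith("#"):      # skip W3C directive lines
                continue
            daily_hits[line.split()[0]] += 1  # first field is the date

    for date, hits in sorted(daily_hits.items()):
        print(date, hits)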
So far I have tried these tools:
1. Web Log Expert - produces the desired report types and can combine IIS logs from multiple locations, but the tool has some major bugs, such as:
a. Some important information is missing from the report: in the authenticated-users report, the test user I log into the application with is absent, even though that user is not excluded by the ignore filter and does appear in the IIS logs.
b. A bug with times and dates: even though there is an option to adjust the time from GMT, that change is not obeyed by the software. It can be worked around by converting the W3C-format log files to NCSA format with the convlog utility; however, in that case the browser and OS usage data is lost from the report.
2. Samurize - I am a bit perplexed about configuring it to report on the W3C log files, and there is a lack of good tutorials or other information on that software as well.
Please recommend tools that have worked for you, ideally covering at least a good number of the requirements above.
Thanks.
Nintex has a program - no idea if it's any good (but their workflow product does rock).