Google Analytics doesn't count requests from JMeter - apache

I am testing my website (which calls Google Analytics) with JMeter and all my responses are OK; I can even see the response data, but Google Analytics doesn't show anything.
Does that mean that my requests failed?
This is my test plan:
Test Plan
  Thread Group
    HTTP Header Manager
    HTTP Cache Manager
    Sample Request to my website
    View Results Tree

JMeter is not a browser, so it does not execute the JavaScript inside HTML pages, which is typically where Google Analytics calls are made:
http://jmeter.apache.org/
So it is not a real problem.
Looking at your test plan, it contains elements that are not OK during a load test, as they consume a lot of resources, like View Results Tree:
http://jmeter.apache.org/usermanual/best-practices.html (16.6)
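As that best-practices page suggests, heavy listeners such as View Results Tree should be removed from the plan for real load runs, and JMeter should be launched in non-GUI mode, e.g.:

    jmeter -n -t testplan.jmx -l results.jtl

Here -n means non-GUI, -t names the test plan, and -l writes the raw results to a JTL file for later analysis (the file names are placeholders).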

In general it is also considered poor performance-testing practice to include third-party items you do not have direct written permission to test from the site owners. If you did find an error in Google Analytics, just who would you report it to, and what would be your reconciliation path to ensure that it is fixed prior to your release?

Related

Active Collab API doesn't show all reports

I'm fetching all the open reports tagged as CHECKPOINT using the Active Collab API, and it's working fine. However, when I run a custom report for the tasks on the Active Collab website, I get more, and different, results than what I fetched.
What I need is to get exactly the same results shown when running a custom report. Does anyone know how I can fix this, or whether it's a problem with the API itself?
Open the browser console and see which requests ActiveCollab's web interface makes when fetching data to build a report. Compare this with the requests that you are making and see where the differences are.
The web interface is served by the same API as your app, so both can be made to work the same, as long as they make the same requests as the same user.
Solution by OP.
By creating a data filter, it shows all the reports if "include_all_projects": true is set. As simple as that.
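For illustration, a rough sketch of what that request could look like with Python's requests library; the endpoint path, auth header, and payload shape here are assumptions, so check them against your ActiveCollab version's API docs:

    import requests

    # Hypothetical base URL and token -- substitute your own instance's values.
    BASE_URL = "https://example.activecollab.com/api/v1"
    TOKEN = "your-api-token"

    # The key point from the OP's fix: include_all_projects must be true,
    # otherwise the report silently filters down to a subset of projects.
    payload = {
        "type": "AssignmentFilter",   # assumed filter type; adjust for your report
        "include_all_projects": True,
    }

    resp = requests.post(
        BASE_URL + "/reports/run",                # assumed endpoint path
        json=payload,
        headers={"X-Angie-AuthApiToken": TOKEN},  # assumed auth header
    )
    resp.raise_for_status()
    print(resp.json())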

How to record individual resource performance for a webpage in Apache JMeter HTTP Request

I am trying to load test a web page using Apache JMeter. I am able to record the response time of the whole web page. Is it possible to see the individual performance of each requested resource (all resource requests are done via HTTP) on that webpage, so that I can identify which resource is taking time to load?
There are two things here:
1. Web page performance
2. Actual server performance
You can use PageSpeed or YSlow to determine how your individual web page is performing. That will give you the total rendering time and the size of images, JS and CSS. It will also suggest whether you are missing out on compression, combining JS, etc.
Next is server performance. It is quite possible to measure it, but that depends on your requirements. Run your tests and apply an Aggregate Report listener. The Aggregate Report will give you the response time for each request. This will be the HTTP response time, not the rendering time.
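If you prefer to crunch the numbers outside JMeter, the raw JTL results file can be aggregated per request label with a few lines of Python. A minimal sketch, assuming a CSV-format JTL with a header row and the default "label" and "elapsed" columns:

    import csv
    from collections import defaultdict

    times = defaultdict(list)
    with open("results.jtl", newline="") as f:
        for row in csv.DictReader(f):
            # 'label' is the sampler name, 'elapsed' the response time in ms
            times[row["label"]].append(int(row["elapsed"]))

    for label, samples in times.items():
        avg = sum(samples) / len(samples)
        print(f"{label}: {len(samples)} samples, avg {avg:.0f} ms, max {max(samples)} ms")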
I would suggest you come up with a realistic workload and then run these tests. Also use a resource monitoring tool like PerfMon, sar or vmstat, depending on the operating system, to monitor the target server's performance.

How to properly benchmark / stress test a single-page web application

I am somewhat familiar with benchmarking/stress testing a traditional web application, and I find it relatively easy to start estimating the maximum load for it. With the tools I am familiar with (Apache ab, Apache JMeter) I can get a rough estimate of the number of requests/second a server with a standard app can handle. I can come up with a user story, create a list of pages I would like to check, and benchmark them separately. A lot of information can be found on the internet about how to go from a novice like me to a master.
But in my opinion a lot of things are different when benchmarking a single-page application. The main entry point is the most expensive request, because the user loads the majority of things needed for the proper app experience (or at least my app works this way). After that, navigating to other places is just AJAX requests, waiting for JSON, and templating. So time to window load is not important anymore.
To add to the problem, I was not able to find any resources on how people do this properly.
In my particular case I have an SPA written with Knockout, sitting on an Apache server (this is most probably irrelevant). I would like a rough estimate of how many users my app on that particular server can handle. I am not looking for a tool recommendation (though it would be nice); I am looking for an experienced person to share their insight into the benchmarking process.
I suggest you test this application just like you would test any other web application: as you said, identify the common use cases, prepare scripts for them, run them in the appropriate mix, and analyze the results.
Web applications can break in many different ways for different reasons. You are speculating that the first page load is heavy and the rest is just small AJAX. From experience I can tell you that this is sometimes misleading: for example, you can find that the heavy page is coming from cache and the server is not working hard for it, while a small AJAX response requires a lot of computing power, runs a long database query, or has some locking in the code that causes it to break or be slow under load. That's why we do load testing.
You can do this with any load testing tool, ideally one that can handle those types of scripts with many dynamic values. My personal preference is WebLOAD by RadView.
I am dealing with a similar scenario: an SPA application where the first page loads, and thereafter everything is done by requesting other HTML pages and/or making web service calls to get the data.
My goal is to stress test the web server and DB server.
My solution is to just create requests for those HTML pages (very low performance impact, IMO, since they are static and can be cached in the browser for a very long time) and the web service call requests. The biggest load will come from the requests for data/processing via the web service calls.
Capture all the requests for HTML and web service calls using a tool like Fiddler, and use any load testing tool (like JMeter) to replay these requests with as many virtual users as you want to test your application with.
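To illustrate the replay idea, here is a minimal Python sketch that spins up N virtual users, each performing one expensive initial page load followed by a loop of AJAX-style calls. The URLs and iteration counts are placeholders for whatever you captured in Fiddler:

    import threading
    import requests

    BASE = "https://example.com"                  # placeholder for your app
    AJAX_ENDPOINTS = ["/api/data", "/api/items"]  # placeholders for captured calls

    def virtual_user(user_id, iterations=10):
        session = requests.Session()
        session.get(BASE + "/")              # the expensive initial page load
        for _ in range(iterations):
            for path in AJAX_ENDPOINTS:      # the cheap follow-up AJAX traffic
                r = session.get(BASE + path)
                print(user_id, path, r.status_code, r.elapsed.total_seconds())

    threads = [threading.Thread(target=virtual_user, args=(i,)) for i in range(20)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()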

How to avoid the Google Webmaster Tools API limit

I just wrote an application which gets the crawl issues from Google Webmaster Tools. The API has a limit of 100 entries per request. Is there any solution or workaround to get all crawl issues at once?
Google API query bottlenecks like this can generally be worked around by controlling each query thread (or batch of query threads) via a static or dynamic internal thread-running rate (e.g., Java concurrency with a thread pool/buffer). In short, you want your own query thread controller that pages through the results.
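In practice that usually means paginating through the feed 100 entries at a time with your own pacing. A minimal sketch, assuming a hypothetical fetch_crawl_issues(start_index, count) wrapper around your API client (the real client call and parameter names will differ):

    import time

    def fetch_all_crawl_issues(fetch_crawl_issues, page_size=100, delay=1.0):
        """Page through the API 100 entries at a time, throttling between calls."""
        issues, start = [], 1
        while True:
            batch = fetch_crawl_issues(start, page_size)  # hypothetical wrapper
            issues.extend(batch)
            if len(batch) < page_size:   # last page reached
                return issues
            start += page_size
            time.sleep(delay)            # self-imposed rate control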

How to find inbound links to a given URL on the fly?

Technorati's got their Cosmos API, which works fairly well but limits you to noncommercial use and no more than 500 queries a day.
Yahoo's got a Site Explorer InLink Data API, but it defines the task very literally, returning links from sidebar widgets in blogs rather than just links from inside blog content.
Is there any other alternative for tracking who's linking to a given URL (think of the discussion links that run below stories on Techmeme.com)? Or will I have to roll my own?
Well, it's not an API, but if you Google (for example) "link:nytimes.com", the search results that come back show inbound links to that site.
I haven't tried to implement what you want yet, but the Google search API almost certainly has that functionality built in.
Is this for links to URLs under your control?
If so, you could whip up something quick that logs entries from the Referer HTTP header.
If you wanted to do this for an entire web site without altering application code, you could implement it as an ISAPI filter or the equivalent for your web server of choice.
Information available publicly from web crawlers is always going to be incomplete and unreliable (not that my solution isn't...).
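As a sketch of that quick-and-dirty approach, here is a minimal Python WSGI middleware that logs the Referer header of every incoming request; wrap your existing app with it (the log file name is just an example):

    import logging

    logging.basicConfig(filename="inbound_links.log", level=logging.INFO)

    class RefererLogger:
        """WSGI middleware that records where each request came from."""
        def __init__(self, app):
            self.app = app

        def __call__(self, environ, start_response):
            referer = environ.get("HTTP_REFERER")  # WSGI spelling of the header
            if referer:
                logging.info("%s -> %s", referer, environ.get("PATH_INFO"))
            return self.app(environ, start_response)

    # usage: application = RefererLogger(application)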