How to differentiate between test and live websites without changing the back-end?

We have an issue where we have a test website and an equivalent live website. What we are finding is that, through carelessness, our testers sometimes use the wrong site (e.g. running tests against the live site!).
We have total control over both sites, but since the test site is for acceptance testing by the users we don't want to make them different in any way. The sites must look the same and there is also a layer of management that will kick up a storm if the test and live sites are in any way different.
How have other people solved this problem? I am thinking of a browser plugin to make the browser look different somehow (e.g. changing the colour of the location bar when on the test website). Does anyone know of a plugin or a technique that would work? (We primarily use Firefox and Chrome)
Thanks,
Phil
UPDATE
We eventually settled on a combination of: different credentials for the test and live sites (this was not popular!) and making a series of plugins available for those who wanted them (colourize tabs for Chrome and Firefox users; we never did find a good plugin for IE).
Thanks to those who answered.

In our company we use different site names:
www.dev.site.com - for developers
www.qa.site.com - for QA's
www.site.com - production site
Another good practice is to use different user credentials for the dev/QA and prod sites.
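If any of your checks are automated, a guard in the test harness can also act as a backstop against pointing at the wrong host. A minimal sketch in Python with pytest (the BASE_URL environment variable is hypothetical; the hostnames are borrowed from the naming scheme above):

```python
# Hypothetical safety net: abort the test session if it targets production.
import os
from urllib.parse import urlparse

import pytest

PROD_HOSTS = {"www.site.com"}  # production hostname from the scheme above

@pytest.fixture(autouse=True)
def refuse_production():
    base_url = os.environ.get("BASE_URL", "https://www.qa.site.com")
    host = urlparse(base_url).hostname
    if host in PROD_HOSTS:
        pytest.exit(f"Refusing to run tests against production host {host!r}")
```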

Lighthouse metrics variance

Can anyone tell me why there is such a big difference between the "Total Blocking Time" reported for my site on https://developers.google.com/ and a local Lighthouse run for a mobile device?
I am not able to give you the address and show you the site; it is just text and a picture. The site is built with Nuxt.js and generated fully statically.
My computer is a MacBook Air M1; does Google use less powerful device emulation, and is that why it reports lower numbers?
This official page explains it well: https://web.dev/performance-scoring/
Common problems:
A/B tests or changes in ads being served
Internet traffic routing changes
Testing on different devices, such as a high-performance desktop and a low-performance laptop
Browser extensions that inject JavaScript and add/modify network requests
Antivirus software
Otherwise, this article gives another answer: https://www.debugbear.com/blog/why-is-my-lighthouse-score-different-from-pagespeed-insights
Google PageSpeed Insights uses a combination of lab and real-world data (historical data for your website), whereas Lighthouse uses lab data only (tested locally on your machine) to build its report.
Note: You should trust PSI metrics over just lab data.
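Because lab runs are inherently noisy, one common mitigation is to run Lighthouse several times and compare medians rather than single runs. A minimal sketch, assuming the Lighthouse CLI is installed (npm install -g lighthouse) and Chrome is available; the URL is a placeholder:

```python
# Run Lighthouse several times and report the median Total Blocking Time.
import json
import statistics
import subprocess

URL = "https://example.com"  # placeholder: your page
RUNS = 5

tbt_ms = []
for i in range(RUNS):
    out = f"report-{i}.json"
    subprocess.run(
        ["lighthouse", URL,
         "--only-categories=performance",
         "--output=json", f"--output-path={out}",
         "--chrome-flags=--headless"],
        check=True,
    )
    with open(out) as f:
        report = json.load(f)
    tbt_ms.append(report["audits"]["total-blocking-time"]["numericValue"])

print("TBT per run (ms):", [round(v) for v in tbt_ms])
print("Median TBT (ms):", round(statistics.median(tbt_ms)))
```

If the median of several local runs still differs wildly from the PSI lab number, the gap is more likely down to hardware and throttling differences than to run-to-run variance.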

How to selenium test web sites depending on each other? (OAuth2 IdS, protected sites)

I have an IdS (Thinktecture Identity Server3) and various web sites trusting the IdS.
I have selenium tests for IdS and for each of the sites.
I use TeamCity and Octopus Deploy.
Changes in IdS should trigger test of dependent web sites. Changes in individual sites should trigger only test of the site (as it is).
What is the best way of ensuring this? I should think this is a common problem? ;)
BR, Anders
One way to do this is to use the app settings configuration options of .NET itself. You can use config transformations to create a different configuration per site and change; you will, however, have to map each one. This allows you to keep everything in the project. There is an example of such a script creating transformed config files using the command-line transform execution tool, or, if you prefer to stay in TeamCity, you can use XML pokes. I've used the latter with great success on a Selenium multi-site test framework: before each chained test build, we modified the XML files so the execution was dedicated to the Git branch or repository that TeamCity was set to monitor.
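The XML-poke idea itself is just rewriting an app setting on disk before the chained build runs. A minimal sketch in Python (the config path and the SiteUrl key are hypothetical; TeamCity's XML poke feature does the equivalent without custom scripting):

```python
# Point a .NET-style config file at a different site before a chained build.
import sys
import xml.etree.ElementTree as ET

config_path, site_url = sys.argv[1], sys.argv[2]  # e.g. Web.config https://qa.example.com

tree = ET.parse(config_path)
# Assumes the usual <appSettings><add key="SiteUrl" value="..."/></appSettings> layout.
for add in tree.getroot().iter("add"):
    if add.get("key") == "SiteUrl":
        add.set("value", site_url)
tree.write(config_path, xml_declaration=True, encoding="utf-8")
```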
I found what I was looking for in the most obvious of places: on the web site builds, I added a Finish Build Trigger pointing to the IdS build. This way all my sites (I have only one :)) get Selenium tested.

How do I go about safely taking a screenshot of a website that I know is infected with malware?

Background:
One of my clients' websites has become a malware infested hotbed.
Disposing of the malware has proven difficult and time consuming, and, in the meantime, we still have had to do work on the site.
For now, we went to some trouble to do our work - creating a disposable VM to just run a web browser, so we can see what the site looks like for the designers' work, for example.
I'm wondering if there's an easier (and faster) way to get an idea what the design of the site looks like. Not everyone on the project is tech savvy enough to be trusted with, for example, properly handling switching VMs.
Question:
Is there a method for safely seeing what a malware-infested website looks like (for example, a service that will browse the site for me and send a screenshot), ideally one easy and simple enough that I can trust our non-tech-savvy designers to use it?
You might take a look at the Internet Archive's Wayback Machine to see if the site has been archived.
If a screenshot is all you need, there are several online browser simulators, such as Net Renderer (which will run any inputted web URL in a given version of Internet Explorer and then supply a screenshot). You might also try BrowserStack, which requires an account, and is not free, but does have a free trial period, and offers more than Internet Exploder.
You could also try running a browser in Sandboxie, which is simpler to set up and use than a VM (you just install it, and then use the windows right-click menu to launch any program in a sandbox of your choosing). However, it isn't free for commercial use.
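If you do keep one disposable machine or container around, a headless screenshot with JavaScript disabled is a low-effort way to hand designers an image without anyone browsing the site directly. A minimal sketch with Python and Selenium (the URL is a placeholder; disabling JavaScript reduces risk but is not a sandbox, so still run this in the isolated environment):

```python
# Screenshot a page with JavaScript disabled, using headless Firefox.
from selenium import webdriver

options = webdriver.FirefoxOptions()
options.add_argument("-headless")
options.set_preference("javascript.enabled", False)  # don't run the page's scripts

driver = webdriver.Firefox(options=options)
try:
    driver.set_page_load_timeout(30)
    driver.get("https://infected-site.example.com")  # placeholder URL
    driver.save_screenshot("site.png")
finally:
    driver.quit()
```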
I don't know if a standalone tool exists to scan a website for malware, but I think this can help you: it's a Google tool that you can query with a request, and it will send you back a response.
Follow the link:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=168328
Hope it helped.

How to stress test simulating heavy load using Selenium

I have a system to test, which is a video ads distribution technology. I need to load every video like 1-2 mins to serve the ads. The videos are played in a Flash client and streamed as FLV streams like in YouTube.
The reason why I need to test it only via browsers -- and every other method won't work -- is to stress test both the video streaming servers and the ads servers simultaneously and displaying ads in real-time.
I have used Selenium, WatiN, Automation Anywhere and many other automation tools. However, when I am trying to start like 10000 browsers on my machine (32GB RAM, 16-core CPU), none of them are able to do the job.
With Selenium I have been able to start the most Firefox instances so far, but that's still too few: half of the instances don't run the test.
Any suggestions to do with Selenium?
You aren't going to run 10,000 browsers on your machine. That would leave 3.2 MB of physical memory per browser instance, and I'm pretty sure Firefox just won't like that.
You could create a JMeter script that hits your server with many threads. It won't interact with the UI but would simulate the load of many clients hitting whatever URLs you tell it. I believe it also includes the ability to record a session and play it back for easy setup of your sessions.
Selenium isn't really optimized for load/stress testing, especially if you're running your browsers locally. Running 1000+ browsers is going to choke even the beefiest server. Though RAM is an obvious bottleneck, you also have limited CPU resources and bandwidth. The latter being a primary concern if you are loading videos.
Not to mention you'd be testing from a single IP with 10k browsers, so load balancing may not kick in properly, as well as the actual distribution of video ads to specific virtual users.
If you want to stick with existing Selenium tests, I've had good experiences with BrowserMob. They basically have a huge grid to do real browser load-testing, distributed across AWS.
Another recommendation would be an actual performance testing tool. I'd recommend Soasta CloudTest. They have a free version that runs 100 users so you can see if it will be a good fit for you. I have found that scripting for CloudTest is relatively simple.
Disclaimer: My experiences with both companies have been as a paying customer and I have never worked for either.
If you are using a Windows machine then, in my experience, there is a limit on the number of browser windows that can be opened; when I last tested, it topped out somewhere between 100 and 150 browser windows.
I would recommend using a headless robot, which doesn't require opening a browser window; I think the latest version of Selenium has that capability. But since this looks more like a load test, given you are trying to simulate 10,000+ user instances, I would recommend a load-testing tool like JMeter or LoadRunner.
It looks to me that you are trying to verify what the client will see based on high traffic, no?
In that case, Joel is quite correct. If you absolutely have to see what the client sees, you could use threaded hits and just dump the results in a database. That'll show you anything the client would see anyway, and it's a lot easier to sort through than thousands of browser instances.
Either way, your client will not see errors if there are no errors present on the server side. If you're testing functionality in bandwidth restricted environments, CPU-intensive environments, or memory-intensive environments, those are much easier achieved than running thousands of browser instances.
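A minimal sketch of the threaded-hits idea, using Python's standard thread pool and the requests library (the endpoint and counts are placeholders); each worker records a status code and latency, which you can dump to a database or CSV instead of eyeballing browser windows:

```python
# Fire concurrent HTTP hits and collect status codes and latencies.
import time
from concurrent.futures import ThreadPoolExecutor

import requests

URL = "https://ads.example.com/vast"  # placeholder endpoint under test
WORKERS = 100
TOTAL = 1000

def hit(_):
    start = time.time()
    try:
        status = requests.get(URL, timeout=10).status_code
    except requests.RequestException as exc:
        status = exc.__class__.__name__
    return status, time.time() - start

with ThreadPoolExecutor(max_workers=WORKERS) as pool:
    results = list(pool.map(hit, range(TOTAL)))

failures = [s for s, _ in results if s != 200]
print(f"{TOTAL} requests, {len(failures)} non-200 or failed")
print(f"average latency: {sum(t for _, t in results) / TOTAL:.3f}s")
```

This won't exercise the Flash player, but it will tell you how the streaming and ad servers behave under concurrency, which is usually the question a load test needs to answer.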
Your post smells of some form of ad-based fraud to me, but either way: have you considered using different web browsers besides Firefox? PhantomJS is a headless WebKit-based browser that is compatible with Selenium. It supports all the core browser features like DOM handling, CSS selectors, JavaScript and Canvas. I do not know if it supports Flash.
This post has a decent list of other headless and automatable web-browsers that you might consider.
Also, if each browser instance is instantiating a Flash plugin, don't neglect the possibility that the issue could be with Flash and not Firefox. Alternatively, why instantiate several different Firefox processes? Can you accomplish what you want through the use of tabs instead?
The in-house way to do this with Selenium is to use BrowserMob Proxy and multiple browser agents to recreate the experience of different users; changing the IP is more difficult because it requires changing your home network.
Here is a good example
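For reference, a minimal sketch of the BrowserMob Proxy wiring with the Python bindings (the proxy binary path and page URL are hypothetical); each virtual user can get its own proxy port and record a HAR of what it requested:

```python
# One browser behind a BrowserMob proxy, recording a HAR of its traffic.
from browsermobproxy import Server
from selenium import webdriver

server = Server("/path/to/browsermob-proxy/bin/browsermob-proxy")  # hypothetical path
server.start()
proxy = server.create_proxy()

options = webdriver.FirefoxOptions()
options.proxy = proxy.selenium_proxy()
driver = webdriver.Firefox(options=options)

try:
    proxy.new_har("video-ads")  # start recording traffic
    driver.get("https://player.example.com")  # placeholder page under test
    entries = proxy.har["log"]["entries"]
    print(f"{len(entries)} requests captured")
finally:
    driver.quit()
    server.stop()
```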

Tools for finding Non SSL resources in web page (firebug like tool)

I'm trying to find a non-SSL resource that is being loaded on my site.
This happens occasional where one of us forgets to use the https version of a resource (like some js in a CDN).
My question: are there any Firebug-like tools to find these "turds in the punch bowl"? I want my green padlock back :)
Besides Firebug, which you've mentioned, you can use the developer tools in Chrome:
Tools menu -> Developer Tools
Go through the list of loaded resources in the Network tab
Alternatively, the HttpFox extension for Firefox can also be useful. It will keep logging the traffic even when you change pages, which may be useful in some cases.
(This is very similar to Firebug.)
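If you'd rather script the check than click through a network panel, scanning the delivered HTML for plain http:// references catches most offenders (it won't see resources injected by JavaScript, which the devtools Network tab will). A minimal sketch with the requests and BeautifulSoup libraries:

```python
# List insecure (http://) resource references in a page's HTML.
import sys

import requests
from bs4 import BeautifulSoup

url = sys.argv[1]  # e.g. https://www.example.com
html = requests.get(url, timeout=15).text
soup = BeautifulSoup(html, "html.parser")

for tag in soup.find_all(["script", "img", "link", "iframe", "source"]):
    ref = tag.get("src") or tag.get("href")
    if ref and ref.startswith("http://"):
        print(f"<{tag.name}> loads an insecure resource: {ref}")
```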
mitm-proxy is great for stuff like this - http://crypto.stanford.edu/ssl-mitm/
You run it on your local machine in a console window, set your browser to use it as a proxy, and you can watch/log everything that your browser requests. It's a little noisy, since it shows SSL handshaking and file contents, but you can filter that down. When you need to debug SSL communications, it's invaluable to see those details.
mitm-proxy is based on http://grinder.sourceforge.net/g3/tcpproxy.html which has more in the way of scripting capabilities.