Webdriver (Selenium 2) - How to make Selenium operate elements without waiting for connections to external ad links? - selenium

Environment:
- Selenium 2.39 Standalone Server
- PHP 5.4.11
- PHPUnit 3.7.28
- Chrome V31 & ChromeDriver v2.7
I'm testing a website which invokes a lot of advertisement systems, such as Google AD.
The browser takes a lot of time connecting to the external ad links, even after all the elements of the page have already been loaded.
If my internet connection is slow when I run my tests on a webpage,
Selenium waits for a very long time, since the ad links respond slowly.
Under this condition, Selenium usually waits for over 60 seconds and then throws a timeout exception.
I'm not sure how Selenium works, but it seems that Selenium has to wait for a sign of the webpage's full loading before it pulls the DOM to find elements.
I want to make Selenium operate elements without waiting for the connections to the external ad links.
Is there a way to do that? Thank you very much.

I would suggest that you make use of a proxy. BrowserMob integrates well with Selenium and is very easy to use:
import net.lightbody.bmp.proxy.ProxyServer;
import org.openqa.selenium.Proxy;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.remote.CapabilityType;
import org.openqa.selenium.remote.DesiredCapabilities;

// start the proxy
ProxyServer server = new ProxyServer(4444);
server.start();
// get the Selenium proxy object
Proxy proxy = server.seleniumProxy();
// this will automatically return HTTP 200 for any request going to Google Analytics
server.blacklistRequests("https?://.*\\.google-analytics\\.com/.*", 200);
// configure it as a desired capability
DesiredCapabilities capabilities = new DesiredCapabilities();
capabilities.setCapability(CapabilityType.PROXY, proxy);
// start the browser up
WebDriver driver = new FirefoxDriver(capabilities);
I'm not sure how Selenium works, but it seems that Selenium has to wait for a sign of the webpage's full loading before it pulls the DOM to find elements.
It is pretty much like this. The default loading strategy is "NORMAL", which means:
NORMAL of type DOMString
The remote end MUST wait until the "document.readyState" of the frame currently handling commands equals "complete", or there are no more outstanding network requests other than XMLHttpRequests.
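If the goal is simply to stop WebDriver from waiting on those outstanding ad requests, newer Selenium versions (this is not available in the 2.39 from the question) expose the W3C pageLoadStrategy capability with the values "normal", "eager" and "none"; a minimal sketch with the modern Java client:
import org.openqa.selenium.PageLoadStrategy;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;

ChromeOptions options = new ChromeOptions();
// "none" returns control as soon as the initial document arrives,
// so the test can look up elements without waiting on ad requests
options.setPageLoadStrategy(PageLoadStrategy.NONE);
WebDriver driver = new ChromeDriver(options);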

I finally found a simple solution for my situation.
I decided to block these ad requests, and tried some firewall and proxy software, for example
Comodo, Privatefirewall, etc.
Comodo is too heavy and complex, Privatefirewall doesn't support wildcards, and a firewall would interrupt the tests. In the end I chose a proxy tool, CCProxy; the trial version is enough.
I created a rule for localhost so that it can only request my test website's domain, and all other requests are rejected.
Running a test used to take about 1-2 minutes and now takes only 30 seconds; it's noticeably faster and more stable without connecting to the useless ad links.
Here are the configuration steps:
1. Launch CCProxy with Administrator privileges (you should set it to run as Administrator in the file's properties).
2. Click Options, select AutoStartup, select AutoDetected for Local IP Address, and click OK.
3. Create a txt file and enter your domains, like "*.rong360.com*;*.rong360.*;".
4. Click Account and select PermitOnly for Permit Category;
click New and enter 127.0.0.1 for IP Address/Range;
select WebFilter and click the E button on the right side to create a filter;
click the ... button and select the text file you created at step 3;
select PermittedSites and click OK;
click OK again.
5. Click OK to return to the main UI of CCProxy.
6. Launch IE and configure the local proxy as 127.0.0.1:808;
other browsers will pick up this configuration automatically too.
Now you can run the tests again; you'll feel better if you have the same situation :)
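Since this environment uses Chrome, note that you can also point just the test browser at CCProxy instead of changing the system-wide proxy through IE; a minimal sketch, assuming CCProxy is listening on its default 127.0.0.1:808:
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;

ChromeOptions options = new ChromeOptions();
// route only this browser instance through CCProxy
options.addArguments("--proxy-server=http://127.0.0.1:808");
WebDriver driver = new ChromeDriver(options);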

Related

Simulating intermittent network failures with Selenium

Running a test that has a long-running (non-request/response) action, along with polling to check the status. While this polling is going on, I'd like to make a few packets disappear. I keep seeing things that look like this might work with WebDriver, only to come up short. Is there any way to do this completely inside of Selenium, or do I have to go to a completely external proxy?
My thought was to act like an ad blocker, in that I could watch what was being requested and refuse certain connections, returning things like 502s or nothing at all. But I'd like it to be under the control of the test, not an external setup.
You can manipulate requests by using a proxy, and it's simplest to block requests. Depending on how your webapp behaves, this might be a way to do it.
Check out BrowserMob proxy usage.
Look here to get started: URL blacklisting with BrowserMobProxy in Robot Framework/Selenium?
import net.lightbody.bmp.proxy.ProxyServer;
import org.openqa.selenium.Proxy;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.remote.CapabilityType;
import org.openqa.selenium.remote.DesiredCapabilities;

// start the server on the chosen port and get the Selenium proxy object
ProxyServer server = new ProxyServer(proxy_port);
server.start();
server.setCaptureHeaders(true);
// blacklist Google Analytics: matching requests get an HTTP 410 response
server.blacklistRequests("https?://.*\\.google-analytics\\.com/.*", 410);
// or whitelist what you need: anything not matching the patterns gets an HTTP 200
server.whitelistRequests("https?://.*\\.yoursite\\.com/.*,https?://.*\\.someOtherYourSite\\.com/.*".split(","), 200);
Proxy proxy = server.seleniumProxy();
// configure it as a desired capability
DesiredCapabilities capabilities = new DesiredCapabilities();
capabilities.setCapability(CapabilityType.PROXY, proxy);
// start the driver
WebDriver driver = new FirefoxDriver(capabilities);
return driver;
But you should be able to dynamically reconfigure the proxy to alternate the request blocking.
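For example, with the newer BrowserMobProxyServer API (BrowserMob Proxy 2.1+) the blacklist can be changed while the proxy is running, which lets the test itself switch the failures on and off; a sketch, with a hypothetical polling endpoint:
import net.lightbody.bmp.BrowserMobProxyServer;

BrowserMobProxyServer proxy = new BrowserMobProxyServer();
proxy.start(0); // 0 = pick any free port
// mid-test: make the polling endpoint start failing with HTTP 502
proxy.blacklistRequests("https?://api\\.example\\.com/status.*", 502);
// ... let a few polls fail and assert the app copes ...
proxy.clearBlacklist(); // restore normal traffic
proxy.stop();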

TestCafe EC2 Network logs

We are "successfully" running our gherkin-testcafe build on ec2 headless against chromium. The final issue we are dealing with is that at a certain point in the test a CTA button is showing ...loading instead of Add to Bag, presumably because a service call that gets the status of the product, out of stock, in stock, no longer carry, etc. is failing. The tests work locally of course and we have the luxury of debugging locally opening chrome's dev env and inspecting the network calls etc. But all we can do on the ec2 is take a video and see where it fails. Is there a way to view the logs of all the calls being made by testcafe's proxy browser so we can confirm which one is failing and why? We are using. const rlogger = RequestLogger(/.*/, {
logRequestHeaders: true,
logResponseHeaders: true
});
to log our headers but not getting very explicit reasons why calls are not working.
TestCafe uses the debug module to perform internal logging functionality. So, in order to view the TestCafe proxy logs, you can set the DEBUG environment variable in the following manner:
export DEBUG='hammerhead:*'
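Since the debug module writes to stderr, on a headless EC2 box you can redirect the proxy logs to a file for later inspection; a hypothetical invocation (adjust the browser alias and test path to however you launch your gherkin-testcafe run):
export DEBUG='hammerhead:*'
testcafe chromium:headless tests/ 2> hammerhead-proxy.log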

Testing a site with an IP address whitelist using BrowserStack Automate + cloud-hosted CI

I have a test system (various web pages / web applications) that is hosted in an environment accessible only via machines with IP addresses that are whitelisted. I control the whitelist.
Our CI system is cloud-hosted (GitLab), so VMs are spun up dynamically as needed to run automated integration tests as part of the build pipeline.
The tests in question use BrowserStack automation to run Selenium-based tests, which means the source IP addresses of the BrowserStack-driven requests that hit the test environment are dynamic, as BS is cloud-hosted. The IP addresses of our test runner machines that invoke the BrowserStack automation are dynamic as well.
The whole system worked fine before the introduction of IP whitelisting on the test environment. Since whitelisting was enabled, the BrowserStack tests can no longer access the environment URLs (due to not being able to whitelist the dynamic IPs).
I have been trying to get the CI-driven tests working again using the BS "Local Testing" feature, outlined here: https://www.browserstack.com/local-testing.
I have set up a dedicated Linux VM with a static IP address (cloud hosted). I have installed and am running the BrowserStackLocal.exe binary, using our BS key. It starts up fine and says it has connected to BrowserStack via a web socket. My understanding is this should cause all http(s) requests that come from my CI / BrowserStack automation driven tests to be routed through that stand-alone machine (via the BS cloud), resulting in its static IP address being the source of the requests seen at the test environment. That IP address is whitelisted.
This is the command that is running on the dedicated / static IP machine:
BrowserStackLocal.exe --{access key} --verbose 3
I have also tried the below, but it made no apparent difference:
BrowserStackLocal.exe --{access key} --force-local --verbose 3
However, this does not seem to work, either through "live" testing (if I try to access the test env directly through BrowserStack) or through BS Automate. In both cases the http(s) requests all time out and cannot reach our test environment URLs. Also, even with --verbose 3 logging enabled on the BrowserStackLocal.exe process, I never see any request being logged on the stand-alone / static IP machine when I run the tests in various ways.
So I am wondering: is this the correct way to solve this problem? Am I misunderstanding how to do this? Do I need to run BrowserStackLocal.exe on the same CI runner machine that is invoking the BS automation? That would be problematic, as those machines have dynamic IPs as well (currently).
Thanks in advance for any help!
SOLUTION: I managed to get this to work; it's just a bit slow. I run the following command on my existing dedicated / static IP server (note that verbose logging seems to slow things down further, so no --verbose flag is used now; with --verbose 3 each request took 20-30 seconds, without it 5-10 seconds):
BrowserStackLocal.exe --key {mykey} --force-local
Then from another machine (like my dev laptop), if I hit the BS WebDriver server http://hub-cloud.browserstack.com/wd/hub and access the site http://www.whatsmyip.org/ to see what IP address comes back, it does come back with my static IP machine's address. Note that for this to work your test code must set the "local" BrowserStack capability flag to 'true', e.g. for Node.js:
// Input capabilities
var capabilities = {
    'browserstack.local' : 'true'
}
So while a little slow, that might have to do. But this does work as described.
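For completeness, the same flag from the Java client looks roughly like this; a minimal sketch, with USER and KEY standing in for your BrowserStack credentials:
import java.net.URL;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.openqa.selenium.remote.RemoteWebDriver;

DesiredCapabilities caps = new DesiredCapabilities();
caps.setCapability("browserName", "Chrome");
// route this session's traffic through the machine running BrowserStackLocal
caps.setCapability("browserstack.local", "true");

WebDriver driver = new RemoteWebDriver(
        new URL("https://" + USER + ":" + KEY + "@hub-cloud.browserstack.com/wd/hub"), caps);
driver.get("http://www.whatsmyip.org/"); // should now report the static IP
driver.quit();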

How to run selenium chrome nodes using proxy?

I'm using the Docker Selenium images to run browser nodes; the repo is available here: https://github.com/SeleniumHQ/docker-selenium. There is no documentation on how config.json can be used to provide proxy values.
I'm using Selenium version 2.44.0.
In my infrastructure, certain assets are sourced from a location that needs a proxy configured on the browser to access them. I'm trying to set up a proxy on a Chrome node. According to the documentation here, a proxy can be set like the following:
java -jar selenium-2.44.0.jar -Dhttp.proxyHost=192.168.2.10 -Dhttp.proxyPort=80
My proxy does not require a username and password, hence I have ignored those values.
What is not clearly mentioned in the SeleniumHQ documentation is whether the proxy configuration is needed on both the hub and the nodes, or just the nodes. I've tried different combinations, but none have worked for me.
The actual commands I'm running are:
For the hub:
java -jar /opt/selenium/selenium-server-standalone.jar -role hub -Dhttp.proxyHost=192.168.2.10 -Dhttp.proxyPort=80 -hubConfig /opt/selenium/hubconfig.json
When I run the command above, I can see the -D* values displayed in the console config.
For the node:
xvfb-run --server-args=":99.0 -screen 0 1360x1020x24 -ac +extension RANDR" java -jar /opt/selenium/selenium-server-standalone.jar -Dhttp.proxyHost=192.168.2.10 -Dhttp.proxyPort=80 -role node -hub http://$HUB_PORT_4444_TCP_ADDR:$HUB_PORT_4444_TCP_PORT/grid/register -nodeConfig /opt/selenium/config.json
When I run this command I can see the proxy values on the console again, but the assets are still not loaded by the browser.
Also, on a side note, it seems like this could be done on the developers' side (in Java code), but I'm keen to solve it on my (operations) side.
Thanks - here is what we got:
First you need a way to verify your settings made it into the browser:
chrome://net-internals/#proxy
The actual command-line instruction is:
/chromeexec --proxy-server="http=http://proxyserver:port/;https=http://proxyserver:port/"
Note that the semicolon will blow up on the bash command line if you don't use double quotes.
Now if you're sending this from WebDriver Java code programmatically, you'll need to escape the double quotes, so the proxy server setting in Java may look like:
org.openqa.selenium.Proxy proxy = new org.openqa.selenium.Proxy();
proxy.setHttpProxy("\"http://proxyserver:port/\"");
Alternatively you can pass this in as an execution parameter:
DesiredCapabilities capabilities = DesiredCapabilities.chrome();
capabilities.setCapability("chrome.switches", Arrays.asList("--proxy-server=\"http=http://proxyserver:port/;https=http://proxyserver:port/\""));
WebDriver driver = new ChromeDriver(capabilities);
Now, your original question was about accessing external resources through the proxy. What we did (similar to your question) was to pass a proxy exception for the primary site we were hitting, so that it is reached directly while the external resources still go via the proxy.
So you add an exception for your primary website; assuming the resource is at 10.1.10.5, it looks like:
--proxy-bypass-list=10.1.10.5
Which we then do in code as:
capabilities.setCapability("chrome.switches", Arrays.asList("--proxy-server=\"http=http://proxyserver:port/;https=http://proxyserver:port/\"", "--proxy-bypass-list=10.1.10.5"));
Note that setting a username and password is a bug in Chrome (please star the issue if this holds you up).
If you need a username and password, then the solution is a PAC file.
The syntax is:
--proxy-pac-url=file:///proxy.pac
The file format looks like:
function FindProxyForURL(url, host) {
    if (host == "mylocalserver.com") {
        return "DIRECT";
    } else {
        return "PROXY wcg2.example.com:8080";
    }
}
For the case of usernames and passwords in proxy settings, note the following:
Proxy auto-configuration files do not support hard-coded usernames and passwords. There's good reasoning behind this too, since providing support for hard-coded credentials would open up significant security holes, as anybody would be able to easily view the required credentials to access the proxy.
Rather, configure the proxy as a transparent proxy; that way you won't need a username and password. You mention in one of your comments that the proxy server is located outside your LAN, which is why you require authentication. However, most proxies support rules based on the source IP, in which case it's a simple matter of only allowing requests originating from your corporate network.
The original proxy auto-config specification was originally drafted by Netscape in 1996. The original specification is no longer available directly, but you can still access it using The Wayback Machine's archived copy. The specification hasn't changed much, and is still largely the same as it was originally. You'll see the specification is quite simple, and that there is no provision for hard-coded credentials.
To solve this problem you can use this tool:
https://github.com/sjitech/proxy-login-automator
This tool can create a local proxy and automatically inject the user/password into requests to the real proxy server. It supports PAC scripts.

how to handle cross-domain testing in selenium

How do you handle cross-domain functionality in Selenium? Can anyone explain, please?
For example: I need to open google.com and Gmail with the same Selenium session object. I was seeing a permission denied error; I tried with *iehta and with proxy injection mode as well, and it didn't work. Can you help me out?
I found this answer on stackexchange.com:
You should be able to do so while using browsers with elevated security privileges like *chrome for Firefox. So you could just do
selenium.open("newURL");
in your test. The problem with changing the URL is that the domain changes, and normal Selenium browser mode is restricted by JavaScript's Same Origin Policy; as mentioned above, browsers with elevated security privileges should get you going.
I suppose this is the point where you are trying to load another URL in the same Selenium session -
sel.open("www.google.com");
sel.waitForPageToLoad(stimeout);
First - don't use waitForPageToLoad; the open API takes care of it.
Now, if sel.open does not work, then you should definitely encounter an error. Don't keep your method in a try-catch block, and see the error you encounter.
1: https://sqa.stackexchange.com/questions/761/can-the-base-url-be-changed-in-the-same-browser-session-using-selenium-rc
If you can't open two different domains with one Selenium object, try using a different object for each domain (e.g. an object called seleniumGoogle and an object called seleniumGmail).
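A minimal Selenium RC sketch of that idea (the question predates WebDriver); the server host/port and start URLs are assumptions:
import com.thoughtworks.selenium.DefaultSelenium;
import com.thoughtworks.selenium.Selenium;

// one RC session per domain, so the Same Origin Policy is never crossed
Selenium seleniumGoogle = new DefaultSelenium("localhost", 4444, "*chrome", "https://www.google.com/");
Selenium seleniumGmail = new DefaultSelenium("localhost", 4444, "*chrome", "https://mail.google.com/");

seleniumGoogle.start();
seleniumGoogle.open("/");
// ... interact with google.com in this session ...

seleniumGmail.start();
seleniumGmail.open("/");
// ... interact with Gmail in the other session ...

seleniumGoogle.stop();
seleniumGmail.stop();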