Setting Device Width for Remote Webdriver in Selenium

I am using Selenium Grid 40 with the Firefox remote driver running on Windows 7, via the C# API. I was wondering how to set the device width on a headless Firefox browser. The device width is less than the iPad's max-device-width, which causes the browser to pick up iPad-specific CSS defined like this:
@media only screen and (min-device-width: 481px) and (max-device-width: 1024px) and (orientation:portrait) {
/* For portrait layouts only */
}
@media only screen and (min-device-width: 481px) and (max-device-width: 1024px) and (orientation:landscape) {
/* For landscape layouts only */
}
I have already changed window size using:
driver.Manage().Window.Size = new System.Drawing.Size(1290, 900);
But it still picks up those css directives.
Other information: my grid node is a virtual machine that nobody actually logs into. I run the Selenium grid remotely, and that might be why the device width is small: Windows may be defaulting to its smallest resolution. If there is a way to change that, it might help, but I am not aware of one.
Update: I tried setting all instances of DefaultSettings.XResolution, DefaultSettings.YResolution and DefaultSettings.BitsPerPel in the registry to 1290, 900 and 16 through a PowerShell script and restarting the machine, but it didn't work.

I don't know how to set the device width using Selenium, but I figured out how to set it using the remote frame buffer. For me this was Xvfb. I pass in the screen resolution when I start the service.
Below is an example of an Xvfb service with a resolution of 1024x768 with a depth of 24.
https://gist.github.com/dloman/931d65cbb79b00593ac3dd5d0cdf37d9
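As a rough sketch of what that service does (the display number :99 is an assumption here; see the gist for the full service definition):

```shell
# Start Xvfb on virtual display :99 with one 1024x768 screen at 24-bit depth
Xvfb :99 -screen 0 1024x768x24 &

# Point the Selenium node (and thus Firefox) at the virtual display
export DISPLAY=:99
```

The idea is that Firefox then reports the frame buffer's dimensions as the screen size, so device-width media queries see the resolution you chose.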

My experience is limited to the Python implementation of Selenium, so this may not work for you, but there you can call driver.set_window_size(width, height) before performing the driver.get() action that loads the desired page.
The key is the order in which you perform these actions: tell Selenium to change the dimensions of the browser before you load the page, because otherwise you are simulating the same behaviour as resizing a browser window after the initial page load. As you may know, when a browser window is resized, a page reload is needed for some 'responsive' features to be activated.
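A minimal sketch of that ordering in the Python bindings (this assumes a local Firefox with geckodriver available; the URL is a placeholder):

```python
from selenium import webdriver

driver = webdriver.Firefox()
driver.set_window_size(1290, 900)   # resize first...
driver.get("https://example.com")   # ...then load, so the page sees the new size
driver.quit()
```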

Related

Chrome Headless Selenium documentHeight

I am using Selenium C# to drive a headless instance of Chrome:
((ChromeOptions)_Options).AddArgument("--headless");
((ChromeOptions)_Options).AddArgument("window-size=1920,1080");
I have run into the problem that my JavaScript always reports both
$(document).height()
and
$(window).height()
as 1080, which is not accurate. The document height should be much greater in some cases. Is there a reason this is not working correctly, and/or a workaround to solve the issue?
In my troubleshooting, I grabbed the value of this JavaScript expression and discovered that it was also 1080:
Math.max(document.body.scrollHeight, document.body.offsetHeight,
document.documentElement.clientHeight,
document.documentElement.scrollHeight,
document.documentElement.offsetHeight)
This particular page is definitely taller than the screen, and I used Selenium's GetScreenshot() method to take a picture and verify that the scrollbar exists and content extends below the visible area.
For clarification, this works correctly when running the headed version of Chrome. The JavaScript in question runs from jQuery's ready handler:
$(document).ready(function () {

Selenium - Element not visible when the browser set to mobile responsive mode

I am testing the browser for mobile responsiveness. I changed the browser window size to the iPhone 5 size, 320 x 568, using this command:
driver.Manage().Window.Size = new Size(320, 568);
When I run the test, the browser opens at the specified size without any issue, but it fails to find a hyperlink text that is displayed on the page. I get an 'Element not visible' exception even though I can actually see the link text on the screen. Could anyone help me solve this issue, or suggest ideas I could try?
Any help would be highly appreciated.
Thanks.
Perhaps it's due to a timing issue: the code executes before the link appears. Try an explicit wait, shown here using the Ruby Selenium bindings; translate it into your language:
wait = Selenium::WebDriver::Wait.new(timeout: 10) # seconds
wait.until { driver.find_element(id: "foo").displayed? }
driver.find_element(id: "foo").click
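A rough equivalent in the Python bindings (assuming the same placeholder id "foo" as in the Ruby snippet):

```python
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# Wait up to 10 seconds for the link to become visible, then click it
wait = WebDriverWait(driver, 10)
link = wait.until(EC.visibility_of_element_located((By.ID, "foo")))
link.click()
```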
Try to scroll to the element.
You could use JavaScript to do that.
In Python this can be done via
driver.execute_script("arguments[0].scrollIntoView();", elem)
Some elements of the page's DOM change when you test for mobile responsiveness, so Selenium is unable to locate the element you are specifically trying to target. You should debug to find the methods where the code fails to perform the action, then find the locators for those elements in the mobile responsiveness view and trigger only those methods when you are testing for mobile.

Animation with delay in Firefox Developer Edition: weird behavior

I have recently started using Firefox Developer Edition and I have found strange behavior in animations with a delay.
http://plnkr.co/edit/Tr1nd5r0gAyy3eH3Z7Zh?p=preview
.selection-buttons.ng-animate.ng-hide-add {
animation: 30s linear 1s fadeOutDown;
opacity: 1;
}
.selection-buttons.ng-animate.ng-hide-remove {
animation: 30s linear 1s fadeInUp;
opacity: 0;
}
If you open this Plunker in Firefox Developer Edition, the animation plays correctly, but then the text hides again for a couple of seconds and comes back (or is shown again for a couple of seconds and then disappears when you click a second time).
I have tested this in other browsers (Chrome, Firefox standard, Safari and Pale Moon) and the animation works fine.
The animations are realized using animate.css: I used its animation names in the CSS in order to use them together with ng-animate from Angular. I haven't tested normal animations without ng-animate.
I have noticed that ng-animate removes its classes well after the animation has ended. I timed it, and the class removal is delayed by exactly half of the animation duration (if you increase the animation duration, the delay increases accordingly), so the animation ends but the other properties added by the class (such as opacity) are left in place until the classes are removed. This happens for any delay value: as long as there is a delay on the animation, it behaves this way.
Surprisingly, it happens even if there is a delay on transitions (while still using an animation), whether from the transition property or from the transition-delay property.
If I remove the delay from the CSS attribute, the animation works fine.
I have also tested using animation-name, animation-duration and animation-delay instead of the animation shorthand property.
I have opened a ticket on Bugzilla, and it has been confirmed to be a bug introduced by a new feature (hi-res DOM timestamps) replacing the current timestamps in Firefox starting from version 44 and in Chrome starting from version 49.
Firefox Developer Edition uses experimental builds, while standard Firefox uses only stable builds (the latest stable is currently 42, which is unaffected, as is version 43).
https://bugzilla.mozilla.org/show_bug.cgi?id=1231619
The Angular team has been notified of this bug.

Browser is sending request for same content (PNG image) multiple times

I have a PNG file named icons.png hosted on an Apache server. Basically this file is a combination of other small image elements (a CSS sprite). It loads with a normal 200 OK response when the page loads for the first time.
After the page loads, there are some links; hovering over them triggers a custom tooltip. The tooltip displays part of the image from icons.png as the background image of some HTML elements.
For example, the code that builds the tooltip looks like this:
jQuery(".profile").tipTip({delay: 200, defaultPosition: "top", content: "<a href='#' style='width: 32px; height: 32px; display: block; background: url(images/icons.png) no-repeat -200px -64px'></a>"});
[There are some other places in the HTML file where icons.png has been referred to]
Now every time I hover over the link, the tooltip shows up, but simultaneously the browser sends an HTTP request to the server for the icons.png file, and the response from the server comes back as 304 Not Modified.
Although the content of the file is not re-fetched, the overhead of sending the headers (around 166 bytes) is still there every time, which in turn causes a latency of 1.5 s (I'm on a very slow connection). During this 1.5 s the tooltip element has no background image, and then the image suddenly shows up out of nowhere.
Here are some screenshots of the Chrome network panel, the Firebug net panel, and the HTTP headers.
As far as I know, once a resource has been fetched, the browser should hold it in its cache and serve it from there whenever necessary, instead of requesting it from the server multiple times.
I have found that the server is not sending any Expires or Cache-Control header along with the content. I'm not sure whether this is the reason behind such exceptional behavior in Chrome. Any suggestion is highly appreciated.
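To illustrate why those missing headers matter: with no freshness information, a cache has nothing to go on and must revalidate on every use, which is exactly the 304 round trip described above. A simplified sketch of that decision (the is_fresh helper is hypothetical, condensed from the HTTP caching rules):

```python
# Sketch of the freshness check a browser applies (simplified from the HTTP
# caching rules): with no Cache-Control/Expires, nothing is "fresh", so every
# use of the cached file triggers a conditional request (hence the 304s).
def is_fresh(response_headers, age_seconds):
    cache_control = response_headers.get("Cache-Control", "")
    for directive in cache_control.split(","):
        directive = directive.strip()
        if directive.startswith("max-age="):
            return age_seconds < int(directive.split("=", 1)[1])
    return False  # no freshness info: the browser must revalidate

print(is_fresh({"Cache-Control": "max-age=2592000"}, 60))  # True: still fresh
print(is_fresh({}, 60))  # False: revalidate (304 round trip) every time
```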
P.S: The application is hosted on Heroku's shared hosting environment. I'm using Firefox 15.0 & Chrome Version 21.0.1180.89 on Ubuntu 12.04 x86_64.
Every time you show an element for the first time, that is the point at which the browser downloads any associated background images, in modern browsers at least.
Your multiple requests likely correspond to the times you hovered over a new tooltip, bringing it into visibility and thus prompting the image to be requested.
Your instincts are right, though: if no caching headers are configured directly on your server, or through .htaccess files, the browser will keep sending HTTP requests to check whether it needs to download a newer version. Once you sort out your Expires headers, which can be set through mod_expires, it will start serving a locally cached version of the file each time instead.
Source: http://httpd.apache.org/docs/2.2/mod/mod_expires.html
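As a sketch, an .htaccess fragment along these lines would do it (the one-month lifetime is an arbitrary choice, and mod_expires must be enabled on the server):

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Serve PNGs with an Expires header one month after each access
  ExpiresByType image/png "access plus 1 month"
</IfModule>
```

With a freshness lifetime in place, the browser can reuse the cached sprite without the per-hover revalidation request.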
I recently came across this behaviour as well, during local development. An element with a sprite background had a :hover state in the CSS file that pointed to the same sprite image with a different background position, and that caused a very small but nevertheless noticeable latency when switching the background of the element.
.class {
background: transparent url('sprite.png') 0 0 scroll;
}
.class:hover {
background: transparent url('sprite.png') -50px 0 scroll;
}
One way of making sure this doesn't happen is to override only the background-position CSS property in the :hover rule.
.class {
background: transparent url('sprite.png') 0 0 scroll;
}
.class:hover {
background-position: -50px 0;
}
Of course cache control is still necessary, but this approach can save you some headache.

I want to capture a document from Safari in Mac OS X

I want to get an image of a web page from Safari, not a screenshot of the Safari application.
I tried to capture this with the CGWindowImageList(?) function, but that function captures an application screenshot.
Try loading the page in a UIWebView and drawing the view into a buffer, unless you specifically need to capture a running Safari session. This is what Paparazzi does, if I recall correctly.