Selecting multiple elements with Selenium

I'm creating a list of all the available elements with the XPath below:
IList<IWebElement> test = Driver.FindElements(By.XPath("//*[@id='middle-container']//div[@class='middle-section match-list']//div[contains(@class,'title')]//span[contains(text(),'" + Event.Trim() + "')]//..//..//..//..//div[contains(@class,'drop-down-content')]//table[contains(@class,'hidden-xs')]//tr//td[contains(@class,'bettype')]//a[@class='bet']//span"));
All the elements matched by that XPath need to be clicked, so I run a foreach loop:
foreach (var item in test)
{
    item.Click();
}
My problem: if test contains more than about 10 elements, the clicking stops after around 8 or 9 clicks and raises this error:
StaleElementReferenceException
So I'm just wondering how I can write the method so that it keeps clicking until the last available element without failing.

You are getting a StaleElementReferenceException because something has changed in the DOM after you performed the FindElements operation.
You have mentioned that you are clicking the items in the list. Does this click action reload the page or navigate to a different page? In both cases the DOM has changed, hence the exception.
You can (hopefully) handle this with the following logic. I am a Java guy and the code below is in Java, but I think you get the idea.
List<WebElement> test = driver.findElements(By.xpath("//*[@id='middle-container']//div[@class='middle-section match-list']//div[contains(@class,'title')]//span[contains(text(),'" + Event.trim() + "')]//..//..//..//..//div[contains(@class,'drop-down-content')]//table[contains(@class,'hidden-xs')]//tr//td[contains(@class,'bettype')]//a[@class='bet']//span"));
// Instead of using the for-each loop, get the size of the list and iterate through it
for (int i = 0; i < test.size(); i++) {
    try {
        test.get(i).click();
    } catch (StaleElementReferenceException e) {
        // If the exception occurs, find the elements again and click the current one
        test = driver.findElements(By.xpath("//*[@id='middle-container']//div[@class='middle-section match-list']//div[contains(@class,'title')]//span[contains(text(),'" + Event.trim() + "')]//..//..//..//..//div[contains(@class,'drop-down-content')]//table[contains(@class,'hidden-xs')]//tr//td[contains(@class,'bettype')]//a[@class='bet']//span"));
        test.get(i).click();
    }
}
Hope this helps you.
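An alternative that avoids the try/catch is to re-locate the list on every iteration and index into the fresh copy. A minimal sketch, assuming the long XPath above is stored in a By named betLocator and that clicking does not change how many elements match:
By betLocator = By.xpath("..."); // the same long XPath as above
int count = driver.findElements(betLocator).size();
for (int i = 0; i < count; i++) {
    // Re-find the whole list each time so the reference can never be stale
    driver.findElements(betLocator).get(i).click();
}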

Related

org.openqa.selenium.StaleElementReferenceException: stale element reference: element is not attached to the page document for finding element

I use this code to click on an element:
List<WebElement> list = driver.findElements(By.xpath("//div[@class='ag-center-cols-container']//div"));
// check for list elements and print all found elements
if (!list.isEmpty())
{
    for (WebElement element : list)
    {
        // System.out.println("Found inner WebElement " + element.getText());
    }
}
// iterate sub-elements
for (WebElement element : list)
{
    System.out.println("Searching for " + element.getText());
    if (element.getText().equals(valueToSelect))
    {
        new WebDriverWait(driver, 30).until(ExpectedConditions.invisibilityOfElementLocated(By.xpath("//div[@class='overlay ng-star-inserted']")));
        element.click();
        break; // We need to break here because otherwise the loop continues and we get an exception
    }
}
But from time to time I get this error at this line element.getText():
org.openqa.selenium.StaleElementReferenceException: stale element reference: element is not attached to the page document
Do you know how I can implement a listener in order to fix this issue?
org.openqa.selenium.StaleElementReferenceException:
Indicates that a reference to an element is now "stale" --- the element no longer appears on the DOM of the page. The reason for this exception may be that your DOM got updated or refreshed. For example, after performing an action like click(), the DOM may get updated or refreshed; if you then try to use the old element reference, you will experience this error.
You have to re-find that element in the updated or refreshed DOM.
But from time to time I get this error at this line element.getText():
Create a reusable method to handle this exception.
Code:
public static String getTextFromElement(WebElement element, WebDriver driver) {
    try {
        return element.getText();
    } catch (org.openqa.selenium.StaleElementReferenceException e) {
        new WebDriverWait(driver, 15).until(ExpectedConditions.visibilityOf(element));
        return element.getText();
    }
}
For example, you can call it this way:
System.out.println("Found inner WebElement " + getTextFromElement(element, driver));

How to click child links of each menu in Selenium with Java

I am trying to open the link https://www.bloomingdales.com/ and click the child links of each menu. Below is the code I tried:
public void iClickOnFOBSShouldVerifyTheRespectivePages() throws Throwable {
    List allElements = Elements.findElements(By.xpath("//ul[@id='mainNav']/li/a"));
    for (int i = 0; i <= allElements.size(); i++) {
        List<WebElement> links = Elements.findElements(By.xpath("//ul[@id='mainNav']/li/a"));
        WebElement ele = links.get(i);
        ele.click();
        List<WebElement> childlinks = Elements.findElements("left_facet.left_nav");
        for (int j = 0; j <= childlinks.size(); j++) {
            List<WebElement> ele2 = Elements.findElements(By.xpath("left_facet.left_nav"));
            WebElement ele3 = links.get(i);
            ele3.click();
            Navigate.browserBack();
        }
    }
}
Below is the error I am getting:
org.openqa.selenium.StaleElementReferenceException: stale element reference: element is not attached to the page document
Try this:
List<WebElement> ele = driver.findElements(By.xpath("//ul[@id='mainNav']/li/a"));
Actions act = new Actions(driver);
Thread.sleep(4000);
for (int i = 0; i < ele.size(); i++) {
    WebElement a = ele.get(i);
    act.moveToElement(a).build().perform();
    Thread.sleep(2000);
    List<WebElement> ChildMenu = driver.findElements(By.xpath("//nav[@id='nav']/div[2]/div/div/div[@class='flyoutCol']/div/ul/li"));
    System.out.println("Sub-Menu=" + ChildMenu.size());
    for (int j = 0; j < ChildMenu.size(); j++) {
        ChildMenu.get(j).click();
        Thread.sleep(2000);
        driver.navigate().back();
        Thread.sleep(2000);
        act.moveToElement(a).build().perform();
    }
}
On clicking any of the links, the browser will open a new page. At that moment the elements in the page corresponding to allElements, ele and ele2 disappear ("element is not attached to the page document") in the browser and are thus not valid anymore. I would expect that the first click on a submenu child would work but anything after that will fail.
What you could do is first check how many children each submenu has, store this in an array, and then create a double loop somewhat similar to what you have already done. For each submenu and submenu-child click, you can't use any of the WebElements held in your program's memory, because their state has changed; you would need to initialize a completely new WebElement.
With XPath you should be able to refer directly to an element like 'parent - 3rd submenu - 7th submenu child'.
I am unfortunately not able to quickly create working code, otherwise I would have done so.
What I am wondering is: what is the purpose of this code? You don't do any assertions, so if any submenu child click leads to an error page, your code will not fail. Is this desired? It might not even fail if clicking doesn't even open an error page. I guess this code would fail if some of the submenu children are present in the HTML but for some reason not visible.
If this is meant to be a test, I would recommend to make a handful of separate tests that cover a few examples (and assert that clicking leads to the expected page) instead of looping through everything. This will make the tests far more understandable for yourself and others.
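A very rough, untested sketch of the index-based XPath idea described above, using plain WebDriver calls (the top-level menu locator is taken from the question; the child locator is a placeholder and would need to match the real page):
// Count the top-level menu links once (locator taken from the question)
int menuCount = driver.findElements(By.xpath("//ul[@id='mainNav']/li/a")).size();
for (int i = 1; i <= menuCount; i++) {
    // XPath positions are 1-based; re-find the i-th link so the reference is never stale
    driver.findElement(By.xpath("(//ul[@id='mainNav']/li/a)[" + i + "]")).click();
    // Placeholder locator for the child links on the opened page
    String childXpath = "//div[contains(@class,'left_nav')]//a";
    int childCount = driver.findElements(By.xpath(childXpath)).size();
    for (int j = 1; j <= childCount; j++) {
        driver.findElement(By.xpath("(" + childXpath + ")[" + j + "]")).click();
        driver.navigate().back(); // back to the menu page
    }
    driver.navigate().back(); // back to the start page
}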

Click Next link until pagination Next action is disabled in selenium webdriver

I want to verify elements in a row inside a table by navigating through each page via pagination. I am able to navigate to each page and assert the element, but the problem is that on the last page the loop still continues even though the Next link is grayed out.
When Next link is disabled
<span>
<a class="next_btn inactive">NEXT ></a>
</span>
When Next link is enabled
<span>
<a class="next_btn" href="home.do?action=Next&start=10&end=20&sort=&
type=00&status=&fromDate=04/02/2017&toDate=05/02/2017&school=&
district=0009">NEXT ></a>
</span>
Actual Code
public void submissionType() throws Exception {
    driver.findElement(By.linkText("NEXT >"));
    while (true) {
        processPage();
        if (pagination.isDisplayed() && pagination.isEnabled()) {
            pagination.click();
            Thread.sleep(100L);
        } else {
            break;
        }
    }
}
private void processPage() {
    String statusColumn = "//td[@class='gridDetail'][2]";
    List<WebElement> list = table.findElements(By.xpath(statusColumn));
    for (WebElement checkbox : list) {
        String columnName = checkbox.getText();
        Asserts.assertThat(columnName, "File");
    }
}
Instead of identifying the element with By.linkText("NEXT >"), try identifying it with By.cssSelector("a.next_btn").
With this approach, when the object becomes disabled its class name changes, so the element would no longer be identified and your loop would break.
Edit: add a try block and catch NoSuchElementException to avoid exceptions.
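A minimal sketch of what that suggestion could look like (note the caveat about the inactive class raised in the next answer):
while (true) {
    processPage();
    try {
        driver.findElement(By.cssSelector("a.next_btn")).click();
    } catch (NoSuchElementException e) {
        break; // pagination control not found, assume we are on the last page
    }
}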
I know you already accepted an answer but one of the statements is not correct. The locator, By.cssSelector("a.next_btn"), will find both the enabled and disabled button so it will not cause the loop to break.
Looking at your code, I would offer a few suggestions/corrections.
.isEnabled() only really works on INPUT tags so testing for that doesn't really accomplish anything here.
Using Thread.sleep() is not a good practice. You can google some explanations but basically the problem is that it's an inflexible wait. If the element you are waiting for becomes available in 15ms, you are still going to wait 10s or whatever your sleep is set to. Using an explicit wait (WebDriverWait) is a best practice.
I would clean up your functions and write them like
public void submissionType()
{
    WebDriverWait wait = new WebDriverWait(driver, 10);
    By nextButtonLocator = By.cssSelector("a.next_btn:not(.inactive)");
    List<WebElement> nextButton = driver.findElements(nextButtonLocator);
    while (!nextButton.isEmpty())
    {
        processPage();
        nextButton.get(0).click();
        wait.until(ExpectedConditions.stalenessOf(nextButton.get(0))); // wait until the page transitions
        nextButton = driver.findElements(nextButtonLocator);
    }
}
private void processPage()
{
    for (WebElement checkbox : table.findElements(By.xpath("//td[@class='gridDetail'][2]")))
    {
        Asserts.assertThat(checkbox.getText(), "File");
    }
}

How to select the checkbox in the list of checkboxes ONE-BY-ONE in selenium?

I need to select all the checkboxes from here one by one, every 3 seconds. I tried a couple of XPaths with a list; none of them worked.
Tried XPaths:
//div/div[@class='filters-list sdCheckbox ']
I also tried using input and type, but none of them worked. Can you please help me out?
Reference website: https://www.snapdeal.com/products/storage-devices?sort=plrty
-> Capacity, at the left-hand corner
By.xpath("//a[@class='filter-name']") listed out all the filters on the page.
The XPath "//div[@data-name='Capacity_s']/div[@class='filters-list sdCheckbox ']/input" will fetch you the list of all the input elements that you need to check.
There is a container DIV that holds all the filters of a certain type, e.g. Brand, Capacity, etc. The one for Brand is shown below.
<div class="filter-inner " data-name="Brand">
Under that container, all the LABEL tags are what you need to click to check the boxes. So, we can craft a CSS selector using the grouping as a filter to reach only the checkboxes we want.
"div[data-name='Brand'] label"
Since this is something I assume you will reuse, I would write this as a function.
public static void CheckFilters(String filterGroup)
{
    WebDriverWait wait = new WebDriverWait(driver, 10);
    List<WebElement> filters = driver.findElements(By.cssSelector("div[data-name='" + filterGroup + "'] label"));
    // System.out.println(filters.size()); // for debugging
    for (int i = 0; i < filters.size(); i++)
    {
        filters.get(i).click();
        // wait for the two overlays to disappear
        wait.until(ExpectedConditions.invisibilityOfElementLocated(By.cssSelector("div.searcharea-overlay")));
        wait.until(ExpectedConditions.presenceOfElementLocated(By.cssSelector("div.filterLoader.hidden")));
        // reload the element list after the refresh so you don't get StaleElementExceptions
        filters = driver.findElements(By.cssSelector("div[data-name='" + filterGroup + "'] label"));
    }
}
and you would call it like
driver.get("https://www.snapdeal.com/products/storage-devices?sort=plrty");
CheckFilters("Brand");

WebDriver does not use latest page source

I'm running into problems when executing Selenium 2.18 WebDriver tests against an Oracle SSXA-based site, which translates to tons of popups, Ajax-loaded content and iframes. For a given page, based on manual observation, the page is initially loaded with an empty sslw_doc_content_id span (no text). About a second later, the span still exists and contains text.
To check that this page has loaded, I'm using a WebDriverWait with a Predicate that checks that the sslw_doc_content_id span has non-empty text:
new Predicate<WebDriver>() {
    @Override
    public boolean apply(final WebDriver input) {
        return StringUtils.isNotEmpty(input.findElement(By.id("sslw_doc_content_id")).getText());
    }
}
Somehow, WebDriver always finds the WebElement but always returns an empty string when calling WebElement.getText(). And so this predicate always evaluates to false.
Inspecting the page with Chrome or Firefox shows that the element exists and does have text. When debugging the predicate, I've observed that input.getPageSource() contains the span with no text on its first invocation, but that input.getPageSource() contains the span with some text on its second invocation (after the page has been ajax-refreshed).
Why doesn't WebDriver consider the refreshed page source on the second invocation?
Thanks!
You can try a loop:
for (int seconds = 0; seconds < 30; seconds++) {
    if (apply(driver)) {
        break;
    }
    Thread.sleep(1000);
}
The above loop will check every second, for up to 30 seconds, whether the text is there. If it finds it, it breaks out of the loop. You can still add one more check afterwards:
if (!apply(driver)) {
    throw new RuntimeException("Expected text still not there");
}
Or another approach, which I am using: whenever a page takes a long time to load, I pick an element that is loaded among the last on the page and check for its presence the same way:
for (int seconds = 0; seconds < 30; seconds++) {
    try {
        driver.findElement(By.id("lastelementonpage"));
        break; // element found, stop waiting
    } catch (Exception e) {
        Thread.sleep(1000);
    }
}
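A sketch of the same wait using the built-in WebDriverWait instead of a hand-rolled loop (assuming the same element id; it throws a TimeoutException if the element never appears):
// Wait up to 30 seconds for the element to be present in the DOM
WebDriverWait wait = new WebDriverWait(driver, 30);
wait.until(ExpectedConditions.presenceOfElementLocated(By.id("lastelementonpage")));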
It turns out that WebDriver does not allow access to hidden elements, and this span was hidden.
http://code.google.com/p/selenium/wiki/FrequentlyAskedQuestions#Q:_Why_is_it_not_possible_to_interact_with_hidden_elements?
I've resorted to using the JavaScript-based solution described in the above link.
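For reference, a minimal sketch of that kind of JavaScript-based workaround, reading the text of the hidden span through JavascriptExecutor (the element id is taken from the question; textContent is a standard DOM property):
// Read the text of a hidden element directly from the DOM, bypassing WebDriver's visibility rules
JavascriptExecutor js = (JavascriptExecutor) driver;
String text = (String) js.executeScript(
        "return document.getElementById('sslw_doc_content_id').textContent;");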