Has anyone tried fetching the web elements present on the "About Chrome" page (use the following URL: "chrome://settings/help")?
Although I can see them in the DOM, when I try to find them using locators they don't get highlighted.
I'm not sure if there is a way to interact with these web elements.
On this page:
https://www.bedbathandbeyond.com/store/product/o-o-by-olivia-oliver-turkish-modal-bath-towel-collection/5469128?categoryId=13434
I can see a button with the text "Add to Cart", and I can also see it in dev tools.
But when the same page source is retrieved by headless Chrome using Selenium and my script searches for it, the text is not present.
I tried selecting "show page source" in the browser; that source did not have the "Add To Cart" text.
Further, I used curl to GET the page; "Add To Cart" wasn't in the returned page source either.
What am I doing wrong?
Is the page hiding the button?
How can I check for its presence, for a product availability check?
The elements you are looking for are inside the shadow DOM, so you need to access the shadow root first. It's hard to see exactly what is going on in the DOM without some trial and error, but try something like this:
// Locate the shadow host, then get its shadow root (getShadowRoot() requires Selenium 4+)
WebElement shadowHost = driver.findElement(By.cssSelector("#wmHostPdp"));
SearchContext shadowRoot = shadowHost.getShadowRoot();
// Search inside the shadow root rather than the main document
WebElement addToCart = shadowRoot.findElement(By.cssSelector(".shipItBtnCont button"));
More info on Shadow DOM & Selenium — https://titusfortner.com/2021/11/22/shadow-dom-selenium.html
I am trying to capture (find the CSS/XPath of) an inline warning message that disappears after a few seconds (BTW, I am using Selenium WebDriver / Java for my automation).
e.g. on the public link below, I click the Reset button without entering any email. The text box briefly shows "Please fill out this field." I want to automate a check that it shows this message as expected.
https://app.shipt.com/password_resets/new
Please help.
PS: I searched this website and Google but could not find any useful information.
For elements that appear or disappear after a certain time, you should use explicit waits with Expected Conditions:
// Wait up to 10 seconds for the element to become clickable
WebDriverWait wait = new WebDriverWait(driver, 10);
WebElement element = wait.until(ExpectedConditions.elementToBeClickable(By.id("someid")));
And then you can click on element as usual.
However, in the case of the Shipt page you are trying to automate, the popup is a native HTML5 validation message, so you cannot get its text with Selenium locators directly; you have to use this workaround:
Stackoverflow - how to get HTML5 error message in Selenium
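That workaround reads the HTML5 constraint-validation message straight from the input element with JavaScript. Here is a minimal Java sketch; the email-field and button locators are assumptions about the Shipt page's markup, so adjust them to the actual DOM:

import org.openqa.selenium.By;
import org.openqa.selenium.JavascriptExecutor;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;

public class Html5ValidationMessageCheck {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("https://app.shipt.com/password_resets/new");

            // Assumed locators -- verify against the real page
            WebElement email = driver.findElement(By.cssSelector("input[type='email']"));
            driver.findElement(By.cssSelector("button[type='submit']")).click();

            // validationMessage is the browser-generated text, e.g. "Please fill out this field."
            String message = (String) ((JavascriptExecutor) driver)
                    .executeScript("return arguments[0].validationMessage;", email);
            System.out.println(message);
        } finally {
            driver.quit();
        }
    }
}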
I'm writing a test in C# with Selenium for this site: http://onliner.by.
First, I have to be authorized (logged in) on this site.
I found (by XPath) the button labelled "Вход" ("Sign in") in the upper right corner and clicked it.
The page then refreshed and its content changed, but the URL remained the same (http://onliner.by).
Now I need to enter the login and password on this page and submit them, but I cannot do it.
I found the XPaths of these elements and used this code:
// this doesn't work
driver.FindElement(By.XPath("//*[@id='auth-container__forms']/div/div[2]/form/div[1]/div[1]/input")).SendKeys("user");
driver.FindElement(By.XPath("//*[@id='auth-container__forms']/div/div[2]/form/div[1]/div[2]/input")).SendKeys("password");
driver.FindElement(By.XPath("//*[@id='auth-container__forms']/div/div[2]/form/div[3]/div/button")).Click();
How can I do this? I tried using SwitchTo().Frame(), but that didn't help either.
I will be very grateful for help.
You have to wait for the element to be available, like this:
new WebDriverWait(driver, TimeSpan.FromSeconds(timeOut)).Until(ExpectedConditions.ElementExists(By.XPath("your_locator_here")));
When typing an XPath into the FirePath text field, if the XPath is correct FirePath displays the corresponding HTML. This was working fine previously.
But now it's not displaying the corresponding HTML even though the XPath is correct.
Can anyone help me find a solution to this problem? I even uninstalled FirePath and installed it again, but it's still not working.
If you visit the GitHub page of FirePath, it clearly mentions:
FirePath is a Firebug extension that adds a development tool to edit, inspect and generate XPath expressions and CSS3 Selectors
Now if you visit the home page of Firebug, it clearly mentions that:
The Firebug extension isn't being developed or maintained any longer. We invite you to use the Firefox DevTools instead, which ship with Firebug.next
So the direction is clear: we have to use the DevTools [F12] that come integrated with Mozilla Firefox 56.x and later releases.
Example Usage :
Now, let us assume we have to identify the xpath of the Search Box on Google Home Page.
Open Mozilla Firefox 56.x browser and browse to the url https://www.google.co.in
Press F12 to open the DevTools
Within the DevTools section, on the Inspector tab, use the Inspector to identify the Search Box WebElement.
Copy the XPath (absolute) and paste it into a text pad.
Construct a logical unique xpath.
Within the DevTools section, on the Console tab (JS sub-menu), paste the logical unique XPath you have constructed in the following format and hit Enter or Return:
$x("logical_unique_xpath_of_search_box")
The WebElement identified by the XPath will then be displayed in the console.
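For example, assuming the Google search box is still the element with name="q" (an assumption about Google's markup, so verify it with the Inspector first), the console command would be:
$x("//*[@name='q']")
The console lists the matching node(s), which confirms the XPath identifies the element uniquely.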
The new version of Firefox does not support Firebug.
You can use Chrome DevTools instead if you like.
I personally write XPaths using Chrome DevTools.
For more info, refer to my answer here:
Is there a way to get the xpath in google chrome?
Summary: When I try to go to the third page of a web site I'm screen scraping using Selenium and Chrome, I can't find any elements on that third page.
I still see the HTML for the 2nd page in Driver.PageSource.
Steps:
Navigate.GoToUrl("LoginPage.aspx")
Find username and password elements.
SendKeys to Username and Password.
Find and click the Submit/Login button.
The web site displays the main menu page with two link-style menu items.
Find the desired menu item using FindElement(By.LinkText("New Person System")).
Click on the link menu item. This should get me to the "Person Search" page (the 3rd page).
Try to wait with WebDriverWait for an element on the "Person Search" page. This fails to find the element on the new page after 5-10 seconds.
Instead of using WebDriverWait, I then simply wait 5 or 10 seconds for the page to load using Thread.Sleep(5000). (I realize WebDriverWait is the better design option.)
Try to find link with text "Person".
Selenium fails to find link tag.
I see desired "Person Search" page displayed in Chrome.
I still see the previous page's HTML in ChromeDriver.PageSource.
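For reference, here is the flow described above as a compact Java sketch (the question uses VB, but the Selenium calls map one-to-one); the login URL and field locators are placeholders/assumptions, and only the link texts come from the steps above:

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

public class PersonSearchFlow {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();

        // Steps 1-4: log in (URL and locators are assumptions)
        driver.navigate().to("https://example.com/LoginPage.aspx");
        driver.findElement(By.id("Username")).sendKeys("user");
        driver.findElement(By.id("Password")).sendKeys("secret");
        driver.findElement(By.id("LoginButton")).click();

        // Steps 6-7: click the menu link to reach the "Person Search" page
        driver.findElement(By.linkText("New Person System")).click();

        // Step 8: explicitly wait for an element that only exists on the new page
        WebDriverWait wait = new WebDriverWait(driver, 10);
        wait.until(ExpectedConditions.visibilityOfElementLocated(By.linkText("Person")));

        driver.quit();
    }
}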
Chrome geckodriver.exe. Build 7/21/2017.
NUnit 3.7.1
Selenium 3.4
VB
I used IE for another project with similar environment. Didn't have a problem getting to any page (once I coded Selenium correctly).
The web site I'm trying to screen scrape only supports recent IE versions. I'm testing a legacy app for another project that requires IE 8, so using IE is out of the question.
Maybe I should try Firefox...