I want to click a specific button on my page using Selenium WebDriver.
However, all the buttons share the same class name and have no IDs.
Since it is the first button, I tried this:
button = driver.find_element(By.CLASS_NAME,"actionButton-0-2-74[0]").click()
Then I tried with XPath to click directly on the link, like this:
button = driver.find_element(By.XPATH, "//input[@xmlns='http://www.w3.org/2000/svg']")
but the error message shows:
NoSuchElementException: Message: no such element: Unable to locate element: {"method":"xpath","selector":"//input[@xmlns='http://www.w3.org/2000/svg']"}
The HTML is shown in the screenshot below:
[screenshot: HTML of the target button]
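For reference, a minimal sketch of locators that usually work in this situation, assuming the generated class name actionButton-0-2-74 from the attempt above and a <button> element wrapping an SVG icon (both are assumptions, since the real HTML is only in the screenshot):

from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.com/my-page")  # placeholder URL, the real page is not given

# find_element does not accept an index inside the locator string;
# collect every element with the generated class and take the first one
buttons = driver.find_elements(By.CLASS_NAME, "actionButton-0-2-74")
buttons[0].click()

# XPath attribute tests use @, and an SVG icon needs its namespace handled,
# e.g. by matching on local-name() instead of the tag name
icon_button = driver.find_element(By.XPATH, "(//button[.//*[local-name()='svg']])[1]")
icon_button.click()

Note that class names like actionButton-0-2-74 are typically generated by the styling library and can change between builds, so a locator anchored on stable text or structure tends to be more robust.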
I would like to have Selenium navigate through the items of a navbar on this page, then retrieve the contents and pass them to BeautifulSoup.
My code works fine for the first page, after which I get the following error:
ElementClickInterceptedException: Message: element click intercepted: Element ... is not clickable at point (77, 73). Other element would receive the click: <div class="fc-dialog-overlay"></div>
Previous answers have not helped.
Here is my code. I would truly appreciate any help.
import time
from bs4 import BeautifulSoup
from selenium import webdriver

options = webdriver.ChromeOptions()
options.add_argument('--ignore-certificate-errors')
options.add_argument('--incognito')
options.add_argument('--headless')
driver = webdriver.Chrome(options=options)

target_page = 'https://www.rollingstone.com/music/music-lists/500-greatest-songs-of-all-time-151127/'
driver.get(target_page)

header = driver.find_element_by_id('pmc-gallery-list-nav-bar-render')
items = header.find_elements_by_tag_name('a')

songs = []
for item in items:
    driver.maximize_window()
    time.sleep(5)
    item.click()
    page_source = driver.page_source
    soup = BeautifulSoup(page_source)
    song_all = soup.find_all('h2', {'class': 'c-gallery-vertical-album__title'})
    for song in song_all:
        songs.append(strip_extra_chars(song.get_text()))

driver.quit()
First, comment out options.add_argument('--headless') and watch how the program runs.
You need to identify which element overlaps your link.
For example, when I ran your code a notification dialog appeared.
In this case, to disable notifications you can add:
prefs = {"profile.default_content_setting_values.notifications": 2}
options.add_experimental_option("prefs", prefs)
But this dialog doesn't match the .fc-dialog-overlay locator, so your problem is probably with another element. You need to find it.
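One way to find it is to ask the browser which element sits on top of the reported click point (77, 73 in the error above); a minimal sketch:

# returns the outer HTML of whatever element covers the reported click point
covering = driver.execute_script(
    "var el = document.elementFromPoint(arguments[0], arguments[1]);"
    "return el ? el.outerHTML : null;",
    77, 73)
print(covering)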
Another solution: you can perform the click with JavaScript via driver.execute_script().
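A minimal sketch of that approach, assuming item is the link from the loop in the question:

# a JavaScript click ignores whatever overlay sits on top of the element
driver.execute_script("arguments[0].click();", item)

# alternatively, scroll the element to the centre of the viewport and retry a normal click
driver.execute_script("arguments[0].scrollIntoView({block: 'center'});", item)
item.click()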
Geb automation: failing to click on an element. Can someone help me resolve this?
Note: the button is at the bottom left of the page, but it is visible.
<button data-haspromo="" data-element="add" id="qa_button_0" type="button" class="amain btn primary" data-formname="" data-partnumber="12314" data-count="0" data-crossitem="" data-noncompliant="" data-partpreferred="false">Add</button>
Here is my code to click the button on the page. It fails to identify the element and click on it.
def clickAdd() {
    Thread.sleep(3000)
    waitFor(60) { $("#qa_button_0").displayed }
    $('#qa_button_0').click()
    Thread.sleep(3000)
}
I am facing an issue where the dropdown has a <select> tag, but I am still unable to select a value in the dropdown and it throws an exception. I am able to get the dropdown values but unable to select one.
Here are the complete details:
URL: https://semantic-ui.com/modules/dropdown.html
Test case: select multiple values in the Skills dropdown (see the attachment for the exact field on the web page).
WebDriver driver = new FirefoxDriver();
driver.get("https://semantic-ui.com/modules/dropdown.html");
driver.manage().timeouts().implicitlyWait(10L, TimeUnit.SECONDS);

WebElement Dropdown = driver.findElement(By.name("skills"));
Select sel = new Select(driver.findElement(By.name("skills")));
List<WebElement> Options = sel.getOptions();
System.out.println(Options.size());

for (int i = 0; i < Options.size() - 1; i++) {
    driver.findElement(By.xpath("//*[@id=\"example\"]/div[4]/div[1]/div[2]/div[4]/div[1]/div[8]/div")).click();
    System.out.println(Options.get(i).getAttribute("value"));
    if (Options.get(i).getAttribute("value").equalsIgnoreCase("angular")
            || Options.get(i).getAttribute("value").equalsIgnoreCase("Graphic Design")
            || Options.get(i).getAttribute("value").equalsIgnoreCase("HTML")) {
        Thread.sleep(6000);
        sel.selectByIndex(i);
    }
}
Exception:
Exception in thread "main" org.openqa.selenium.ElementNotInteractableException:
Please suggest how to resolve this.
Your dropdown is simulated with CSS; it is not a native HTML <select> dropdown, so you cannot operate it with the Select class as if it were native.
Looking into the HTML of your dropdown, there is an embedded native <select>, but it is always invisible whether you expand the options or not. Selenium cannot operate on an invisible element (although you can still read values/attributes from it), which is why you got that exception.
All the visible options come from the div with class "menu", so you should click the option inside that div.
Code to resolve your problem:
// click the dropdown arrow to expand the options
driver.findElement(By.cssSelector("select[name='skills'] + i")).click();
// choose the option: Angular
driver.findElement(By.xpath("//div[contains(@class, 'multiple')][select[@name='skills']]//div[.='Angular']")).click();
I'm trying to scrape reviews from this site:
https://www.bbb.org/sacramento/business-reviews/heating-and-air-conditioning/elite-heating-air-conditioning-in-elk-grove-ca-47012326/reviews-and-complaints
But the content of the reviews isn't being loaded by Scrapy.
So I tried to use Selenium to press the button and load the content:
url = 'https://www.bbb.org/losangelessiliconvalley/business-reviews/plumbers/bryco-plumbing-in-chatsworth-ca-13096711/reviews-and-complaints'
driver_1 = webdriver.Firefox()
driver_1.get(url)
content = driver_1.page_source
REVIEWS_BUTTON = '//*[@class="button orange first"]'
button = driver_1.find_element_by_xpath(REVIEWS_BUTTON)
button.click()
But Selenium isn't able to find the button with the above XPath; I'm getting the following error:
selenium.common.exceptions.NoSuchElementException: Message: Unable to locate element: {"method":"xpath","selector":"//*[@class=\"button orange first\"]"}
Your button is located inside an iframe, so you need to switch to it first and then handle the button:
REVIEWS_BUTTON = '//*[@class="button orange first"]'

driver_1.switch_to_frame('the_iframe')  # 'the_iframe' is a placeholder for the actual frame name or id
button = driver_1.find_element_by_xpath(REVIEWS_BUTTON)
button.click()
driver_1.switch_to_default_content()
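On newer Selenium releases switch_to_frame and switch_to_default_content are deprecated; a sketch of the same flow with the current API and an explicit wait (the frame name 'the_iframe' is still a placeholder here):

from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

wait = WebDriverWait(driver_1, 10)
# wait until the iframe exists, then switch into it
wait.until(EC.frame_to_be_available_and_switch_to_it((By.NAME, 'the_iframe')))
wait.until(EC.element_to_be_clickable((By.XPATH, REVIEWS_BUTTON))).click()
# return to the main document afterwards
driver_1.switch_to.default_content()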
Actions expected from the code below:
User successfully logs in.
User moves to the top right corner of the website and clicks on the greeting link "Hi ....!".
Step 2 is not happening because the greeting hyperlink is not identified by WebDriver. What am I doing wrong?
WebDriver driver = new FirefoxDriver();
driver.manage().window().maximize();
driver.get("http://flipkart.com");
driver.findElement(By.xpath(".//*[#id='container']/div/div/header/div[2]/div/div[1]/ul/li[8]/a")).click();
driver.findElement(By.xpath("//input[#class='fk-input login-form-input user-email']")).sendKeys("emailid");
driver.findElement(By.xpath("//input[#class='fk-input login-form-input user-pwd']")).sendKeys("password");
driver.findElement(By.xpath(".//*[#id='fk-mainbody-id']/div/div/div[1]/div/div[4]/div[7]/input")).click();
driver.findElement(By.linkText("Greeting _link")).click();
Error message:
Exception in thread "main" org.openqa.selenium.NoSuchElementException: Unable to locate element: {"method":"link text","selector":"Greeting _link"}
The HTML is:
<li class="_2sYLhZ _2mEF1S" data-reactid="26">
<a class="_1AHrFc _2k0gmP" data-reactid="27" href="#">Hi Neha!</a>
<ul class="_1u5ANM" data-reactid="28">
As I see it, after login there is no link that looks like "Hi username..!", but according to your comment you are talking about the "My Account" link, which is visible in my case. I'm rewriting your code so that it automates everything from login to logout, as below:
driver.get("http://www.flipkart.com/");
driver.manage().window().maximize();
WebDriverWait wait = new WebDriverWait(driver, 10);
wait.until(ExpectedConditions.elementToBeClickable(By.linkText("Log In"))).click(); //it will click on login button
wait.until(ExpectedConditions.visibilityOfElementLocated(By.cssSelector("input.user-email"))).sendKeys("user name"); //it will fill user name
wait.until(ExpectedConditions.visibilityOfElementLocated(By.cssSelector("input.user-pwd"))).sendKeys("password"); //it will fill password
wait.until(ExpectedConditions.elementToBeClickable(By.cssSelector("input.login-btn"))).click(); //it will click on login button
WebElement myAccount = wait.until(ExpectedConditions.elementToBeClickable(By.partialLinkText("My Account")));
Mouse mouse = ((HasInputDevices) driver).getMouse();
mouse.mouseMove(((Locatable) myAccount).getCoordinates()); //it will perform a mouse-over on the My Account link; if in your case it shows as 'Hi Neha!' you can replace it
wait.until(ExpectedConditions.elementToBeClickable(By.partialLinkText("Log Out"))).click(); //it will click on logout click after mouse over
Hope it helps..:)