Accept cookies with SeleniumBasic on IBM website - vba

I have VBA code that navigates to www.ibm.com.
There is a cookie consent window I want to bypass.
I tried to click the accept button by XPath but I get an error:
Run-time Error 7:
NoSuchElementError
Element not found for XPath=/html/body/div[8]/div[1]/div/div[3]/a[1]
I tried to wait until it fully loads, but still no success.
Here is my attempt:
Dim driver
Set driver = CreateObject("Selenium.EdgeDriver")
With driver
    .start
    .get "https://ibm.com"
    .FindElementByXPath("/html/body/div[8]/div[1]/div/div[3]/a[1]").Click
    .Quit
End With
I found something about switching to a frame, but I do not know how to perform it.

Your button is inside a frame, so you could do something like this:
Sub CloseConsent()
    With New ChromeDriver
        .get "http://hcad.org/quick-search/"
        .SwitchToFrame .FindElementByXPath("//iframe[@class='truste_popframe']", timeout:=10000)
        .FindElementByXPath("//a[@class='call']", timeout:=10000).Click
    End With
End Sub
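For the original IBM page, the same SwitchToFrame idea could look like the sketch below. The iframe and link selectors here are assumptions for illustration only; inspect the actual consent banner with the browser dev tools to get the real ones, and note that the banner may not be inside an iframe at all, in which case the SwitchToFrame call should simply be dropped.
Sub AcceptIbmCookies()
    Dim driver As Object
    Set driver = CreateObject("Selenium.EdgeDriver")
    With driver
        .start
        .get "https://www.ibm.com"
        'Hypothetical selector: switch into the consent iframe, if the banner is rendered in one
        .SwitchToFrame .FindElementByXPath("//iframe[contains(@title, 'consent')]", timeout:=10000)
        'Hypothetical selector: click the link whose text contains "Accept"
        .FindElementByXPath("//a[contains(., 'Accept')]", timeout:=10000).Click
        .SwitchToDefaultContent
        .Quit
    End With
End Sub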

Related

Can't set chrome preferences in vba selenium

I've created a script in VBA in combination with Selenium to parse the first headline from this webpage. Most of the time my script throws a timeout error, or Run-time error 21: Application-defined or object-defined error, while sometimes it works flawlessly. As the page takes too much time to load its content, I suppose I'm seeing one of the side effects of a slow-loading page, so I wish to disable images on that page.
I've tried with:
Sub TestSelenium()
    Const URL$ = "https://www.marketscreener.com/"
    Dim driver As Object, post As Object
    Set driver = New ChromeDriver
    driver.get URL
    Set post = driver.FindElementByCss(".une_title")
    MsgBox post.Text
    driver.Quit
End Sub
When I use the Python Selenium bindings, I can use this option to disable images:
option = webdriver.ChromeOptions()
chrome_prefs = {}
option.experimental_options["prefs"] = chrome_prefs
chrome_prefs["profile.default_content_settings"] = {"images": 2}
chrome_prefs["profile.managed_default_content_settings"] = {"images": 2}
driver = webdriver.Chrome(options=option)
I know there are options to set different preferences in VBA, but for disabling images I can't find the proper way to set them:
driver.SetPreference
driver.AddArgument
How can I set chrome preferences in vba selenium to let the page load quickly without images?
To disable images on that page, this is how you can set the preference with the VBA Selenium bindings:
driver.SetPreference "profile.managed_default_content_settings.images", 2
Your script looks like the following when you implement the above suggestion:
Sub TestSelenium()
    Const URL$ = "https://www.marketscreener.com/"
    Dim driver As Object, post As Object
    Set driver = New ChromeDriver
    driver.SetPreference "profile.managed_default_content_settings.images", 2
    driver.get URL
    Set post = driver.FindElementByCss(".une_title")
    MsgBox post.Text
    Stop
    driver.Quit
End Sub
You could always try running it headless. That should remove any delay associated with image loading.
driver.AddArgument "--headless"
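A minimal sketch combining both ideas against the same page (whether headless actually makes this particular page load faster is untested):
Sub TestSeleniumFast()
    Dim driver As New ChromeDriver, post As Object
    'Disable image downloads and run without a visible browser window
    driver.SetPreference "profile.managed_default_content_settings.images", 2
    driver.AddArgument "--headless"
    driver.get "https://www.marketscreener.com/"
    Set post = driver.FindElementByCss(".une_title")
    MsgBox post.Text
    driver.Quit
End Sub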

Trouble clicking on a load more button using vba

I'm trying to click on the Load More button located at the bottom of the left window of this webpage using VBA in combination with Selenium, but the script always throws a timeout error pointing at the .Get Url line. Although it seems I've defined an accurate XPath to locate the element, I can't think of what else I should do to achieve this.
How can I click on that Load More button?
Sub ClickOnLoadMore()
    Const Url$ = "http://www.ratemyprofessors.com/search.jsp?queryoption=TEACHER&queryBy=schoolDetails&schoolID=457&schoolName=James+Madison+University&dept=select"
    Dim driver As New ChromeDriver, post As Object
    With driver
        .Get Url
        Set post = .FindElementByXPath("//div[contains(.,'Load More')]")
        .ExecuteScript "arguments[0].scrollIntoView();", post
        post.Click
    End With
End Sub
I see two "Load More" buttons; both are matched by "//div[contains(.,'Load More')]". The first one is hidden, so you need to handle the second one.
Try this XPath
"//div[#class='content' and . = 'Load More']"
At least for me there were a couple of banners to dismiss, as well as some scrolling to do. There was no problem with the .Get line:
Option Explicit

Public Sub ClickOnLoadMore()
    Const Url$ = "http://www.ratemyprofessors.com/search.jsp?queryoption=TEACHER&queryBy=schoolDetails&schoolID=457&schoolName=James+Madison+University&dept=select"
    Dim driver As New ChromeDriver, post As Object
    With driver
        .get Url
        If .FindElementsByCss(".close-notice.close-this").Count > 0 Then
            .FindElementByCss(".close-notice.close-this").Click
        End If
        .SwitchToFrame .FindElementByCss("[id^='spout-unit-iframe']")
        With .FindElementByCss("#spout-ads #spout-header-close")
            .ScrollIntoView
            .Click
        End With
        .SwitchToDefaultContent
        .ExecuteScript "document.querySelector('.result-list [onclick*=LoadMore]').scrollIntoView(true);" & _
                       "window.scrollBy(0, -(window.innerHeight - this.clientHeight) / 2);"
        .FindElementByCss(".result-list [onclick*=LoadMore]").Click
        Stop '<== Delete me later
        'other code
        .Quit
    End With
End Sub

How to click the date picker in Selenium VBA?

I'd like to use Selenium VBA to download some data from the Yahoo Finance KOSPI Composite Index page.
I have difficulty clicking the date picker arrow to open the mini window and select the end date as today. I tried to record the macro through Selenium IDE in Chrome, but the IDE does not record the step where I click the arrow of the Time Period field to make the date picker visible.
Below is my code in VBA.
Public Function seleniumKorea(bot As WebDriver)
    Dim url As String
    url = "https://finance.yahoo.com/quote/%5EKS11/history?period1=1484018309&period2=1515554309&interval=1d&filter=history&frequency=1d"
    bot.Start "chrome", url
    bot.Get "/"
    'Not sure how to add date picker here
    bot.FindElementByName("endDate").Clear
    bot.FindElementByName("endDate").SendKeys (Date)
    bot.FindElementByXPath("(.//*[normalize-space(text()) and normalize-space(.)='End Date'])[1]/following::button[1]").Click
    Application.Wait (Now + TimeValue("0:01:00"))
    bot.FindElementByXPath("(.//*[normalize-space(text()) and normalize-space(.)='As of'])[1]/following::div[4]").Click
    Application.Wait (Now + TimeValue("0:01:00"))
    bot.FindElementByXPath("(.//*[normalize-space(text()) and normalize-space(.)='Currency in KRW'])[1]/following::span[2]").Click
    Application.Wait (Now + TimeValue("0:01:00"))
End Function
I tried to use FindElementByXPath to get the svg class but failed.
Thanks in advance.
I would use the following, which submits the Oath consent if required and scrolls the date picker into the viewport:
Option Explicit

Public Sub DatePicking()
    Dim d As WebDriver
    Set d = New ChromeDriver
    Const URL = "https://finance.yahoo.com/quote/%5EKS11/history?period1=1484018309&period2=1515554309&interval=1d&filter=history&frequency=1d/"
    With d
        .get URL
        If .Title = "Yahoo is now part of Oath" Then
            .FindElementByCss("form").submit
        End If
        With .FindElementByCss("[data-test='date-picker-full-range']")
            .ScrollIntoView
            .Click
        End With
        With .FindElementByCss("[name=startDate]")
            .Clear
            .SendKeys "05/10/2017"
        End With
        With .FindElementByCss("[name=endDate]")
            .Clear
            .SendKeys "05/10/2017"
        End With
        Stop '<== Delete me later
        .Quit
    End With
End Sub
Check this out. It should serve the purpose. I used XPath to solve the issue.
Sub CustomizeDate()
    Const Url$ = "https://finance.yahoo.com/quote/%5EKS11/history?period1=1484018309&period2=1515554309&interval=1d&filter=history&frequency=1d"
    Dim driver As New ChromeDriver
    With driver
        .get Url
        .FindElementByXPath("//input[@data-test='date-picker-full-range']", timeout:=5000).Click
        .FindElementByXPath("//input[@name='startDate']").Clear.SendKeys ("01/05/2017")
        .FindElementByXPath("//input[@name='endDate']").Clear.SendKeys ("01/08/2017")
        .FindElementByXPath("//button/span[.='Done']", timeout:=5000).Click
        .FindElementByXPath("//button/span[.='Apply']", timeout:=5000).Click
    End With
End Sub
Don't use a hardcoded delay to wait for an item to appear; use an explicit wait instead. With timeout:=5000 the script will wait up to 5000 ms, i.e. 5 seconds, for that item to become available.
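As a minimal illustration of the difference, using the date picker selector from the answer above (the commented-out Application.Wait line is shown only for contrast):
Sub ExplicitWaitDemo()
    Dim driver As New ChromeDriver
    With driver
        .get "https://finance.yahoo.com/quote/%5EKS11/history"
        'Hardcoded delay: always burns a full minute, even if the element is already there
        'Application.Wait Now + TimeValue("0:01:00")
        'Explicit wait: returns as soon as the element exists, raises an error after 5 seconds
        .FindElementByXPath("//input[@data-test='date-picker-full-range']", timeout:=5000).Click
        .Quit
    End With
End Sub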

Cannot Login with Excel VBA Selenium

So I am trying to create a script that will automatically log in for me on this website:
https://teespring.com/login
This is my script so far:
Sub openBrowser()
    'Open new browser
    Dim driver As Selenium.ChromeDriver
    Set driver = New Selenium.ChromeDriver
    'Navigate to the website
    Dim urlWebsite As String
    urlWebsite = "https://teespring.com/login"
    driver.Get urlWebsite
So I have tried to enter my username with the following lines of code:
driver.FindElementByCss("").sendKeys Username
But I got an error saying the element is not visible.
Is there any way I can still automate the login process?
Thanks for your help, and if you need further information I will do my best to provide it, as I am still learning how to handle VBA Selenium :-)
It is not visible because it is inside a form.
You need to access it via the form:
Option Explicit

Public Sub EnterInfo()
    Dim d As WebDriver
    Set d = New ChromeDriver
    Const URL = "https://teespring.com/login"
    With d
        .Start "Chrome"
        .get URL
        With .FindElementByClass("js-email-login-form")
            .FindElementByCss("input[name=email]").SendKeys "myEmail"
            .FindElementByCss("input[name=password]").SendKeys "myPassWord"
            .FindElementByCss("input[type=submit]").Click
        End With
        Stop '<== Delete me after inspection
        .Quit
    End With
End Sub

Loop over a set of pages with Selenium Basic (VBA)

Task:
So this is my first foray into Selenium, and I am attempting to:
1. Find the number of pages in a pagination set listed at the bottom of https://codingislove.com/ (this is purely to support task 2 by determining the loop end).
2. Loop over them.
I believe these are linked, but for those that want a single issue: I simply want to find the correct collection and loop over it to load each page.
The number of pages is, at the time of writing, 6, as seen at the bottom of the webpage.
As an MCVE, I simply want to find the number of pages and click my way through them using Selenium Basic.
What I have tried:
I have read through a number of online resources; I have listed a few in the references.
Task 1)
It seems that I should be able to find the count of pages using the Size property, but I can't seem to find the right object to use it with. I have made a number of attempts; a few are shown below:
bot.FindElementsByXPath("//*[@id=""main""]/nav/div/a[3]").Size '<==this I think is too specific
bot.FindElementsByClass("page-numbers").Size
But these yield the run-time error 438:
"Object does not support this property or method"
And the following doesn't seem to expose the required methods:
bot.FindElementByCss(".navigation.pagination")
I have fudged with
bot.FindElementsByClass("page-numbers").Count + 1
But I would like something more robust.
Task 2)
I know that I can navigate to the next page, from page 1, with:
bot.FindElementByXPath("//*[@id=""main""]/nav/div/a[3]").Click
But I can't use this in a loop, presumably because the XPath needs to be updated; if it is not updated, it leads to run-time error 13.
As the redirects follow a general pattern of
href="https://codingislove.com/page/pageNumber/"
I can again fudge my way through by constructing each URL in the loop with
bot.Get "https://codingislove.com/page/" & i & "/"
But I would like something more robust.
Question:
How do I loop over the pagination set in a robust fashion using Selenium? I'm sure I'm having a dense day and that there should be an easy way to target the appropriate collection to loop over.
Code - My current attempt
Option Explicit

Public Sub scrapeCIL()
    Dim bot As New WebDriver, i As Long, pageCount As Long
    bot.Start "chrome", "https://codingislove.com"
    bot.Get "/"
    pageCount = bot.FindElementsByClass("page-numbers").Count + 1
    For i = 1 To pageCount 'technically can loop from 2 I know!
        ' bot.FindElementByXPath("//*[@id=""main""]/nav/div/a[3]").Click 'runtime error 13
        ' bot.FindElementByXPath("//*[@id=""main""]/nav/div/a[2]/span").Click 'runtime error 13
        bot.Get "https://codingislove.com/page/" & i & "/"
    Next i
    Stop
    bot.Quit
End Sub
Note:
Any supported browser will do. It doesn't have to be Chrome.
References:
Finding the number of pagination buttons in Selenium WebDriver
http://seleniumhome.blogspot.co.uk/2013/07/how-can-we-automate-pagination-using.html
Requirements:
Selenium Basic
ChromeDriver 2.37 'Or use IE but zoom must be at 100%
VBE Tools > references > Selenium type library
To click the element, it must be visible on the screen, so you need to scroll to the bottom of the page first (Selenium might do this implicitly sometimes, but I don't find it reliable).
Try this:
Option Explicit

Public Sub scrapeCIL()
    Dim bot As New WebDriver, btn As Object, i As Long, pageCount As Long
    bot.Start "chrome", "https://codingislove.com"
    bot.Get "/"
    pageCount = bot.FindElementsByClass("page-numbers").Count
    For i = 1 To pageCount
        bot.ExecuteScript ("window.scrollTo(0,document.body.scrollHeight);")
        Application.Wait Now + TimeValue("00:00:02")
        On Error Resume Next
        Set btn = bot.FindElementByCss("a[class='next page-numbers']")
        If btn.IsPresent = True Then
            btn.Click
        End If
        On Error GoTo 0
    Next i
    bot.Quit
End Sub
Similar principle:
Option Explicit

Public Sub GetItems()
    Dim i As Long
    With New ChromeDriver
        .Get "https://codingislove.com/"
        For i = 1 To 6
            .FindElementByXPath("//*[@id=""main""]/nav/div/a[3]").SendKeys ("Keys.PageDown")
            Application.Wait Now + TimeValue("00:00:02")
            On Error Resume Next
            .FindElementByCss("a.next").Click
            On Error GoTo 0
        Next i
    End With
End Sub
Reference:
'http://seleniumhome.blogspot.co.uk/2013/07/how-to-press-keyboard-in-selenium.html
If you're only interested in clicking through each of the pages (and getting the number of pages is just an aid to doing this) then you should be able to click this element until it's no longer there:
<span class="screen-reader-text">Next Page</span>
Using
bot.FindElementByXPath("//span[contains(text(), 'Next Page')]")
Have a loop click that link on each page load; eventually it won't be there. Then use VBA's error/exception handling to handle whatever the equivalent of NoSuchElementException is in this implementation of WebDriver. You will need to re-find the element each time in the loop.
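A minimal sketch of that loop; it checks the Count of FindElementsByXPath instead of catching the exception, and the two-second Application.Wait for each new page to render is an arbitrary choice:
Public Sub ClickThroughPages()
    Dim bot As New ChromeDriver
    bot.Get "https://codingislove.com/"
    'Keep clicking "Next Page" until it is no longer present on the page
    Do While bot.FindElementsByXPath("//span[contains(text(), 'Next Page')]").Count > 0
        bot.FindElementByXPath("//span[contains(text(), 'Next Page')]").Click
        Application.Wait Now + TimeValue("00:00:02")
    Loop
    bot.Quit
End Sub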
How about trying it like this? A few days back I figured out that there is an option, .SendKeys("Keys.END"), which takes you to the bottom of a page so that the driver can reach the expected element to click. I used If Err.Number <> 0 Then Exit Do within the Do loop so that if the scraper encounters any error, it breaks out of the loop; in this case that is the element-not-found error raised after the last page button has been clicked.
Give this a shot:
Sub GetItems()
    Dim pagenum As Object
    With New ChromeDriver
        .get "https://codingislove.com/"
        Do
            On Error Resume Next
            Set pagenum = .FindElementByCss("a.next")
            pagenum.SendKeys ("Keys.END")
            Application.Wait Now + TimeValue("00:00:03")
            pagenum.Click
            If Err.Number <> 0 Then Exit Do
            On Error GoTo 0
        Loop
        .Quit
    End With
End Sub
Reference to add to the library:
Selenium Type Library