How to import multiple pages into Excel from ESPN - vba

I need to import data from ESPN Fantasy Football into Excel. This spans multiple pages, and a web query does not work. The URL is http://games.espn.com/ffl/tools/projections?startIndex=0. When you click Next, the URL becomes http://games.espn.com/ffl/tools/projections?startIndex=40, then 80, and so on; basically, the startIndex increments in multiples of 40 each time you click to the next page.
This is the code I've used so far, modifying the code provided by @Dee, and it seems to add a new worksheet but with no data in it.
Private Const URL_TEMPLATE As String = _
"URL;http://games.espn.com/ffl/tools/projections?startIndex={0}"
Private Const NUMBER_OF_PAGES As Byte = 7
Sub test()
Dim page As Byte
Dim queryTableObject As QueryTable
Dim url As String
For page = 1 To NUMBER_OF_PAGES
url = VBA.Strings.Replace(URL_TEMPLATE, "{0}", page * 40)
Set queryTableObject = ActiveSheet.QueryTables.Add(Connection:=url, Destination:=ThisWorkbook.Worksheets.Add.[a1])
queryTableObject.WebSelectionType = xlSpecifiedTables
queryTableObject.WebTables = "14"
queryTableObject.Refresh
Next page
End Sub
The end result should pull all the columns from each page into a single worksheet. Is there a way to do this? Please assist.
Thanks,
J

The id of the table is not 14 but playertable_0. If you need to paste all the data into one single sheet, try the following modified code. The number of pages here is determined just by guessing. You would have to add some more code to check whether e.g. the Next element is present on the page and do it in a do-loop, or just simply guess the number of pages. HTH
Option Explicit
Private Const URL_TEMPLATE As String = _
"URL;http://games.espn.com/ffl/tools/projections?startIndex={0}"
Private Const NUMBER_OF_PAGES As Byte = 20
Private Const TableId As String = "playertable_0"
Sub test()
Dim page As Byte
Dim queryTableObject As QueryTable
Dim url As String
Dim ws As Worksheet
Dim target As Range
Dim lastRow As Long
Set ws = ThisWorkbook.Worksheets.Add
Set target = ws.[a1]
lastRow = 1
For page = 0 To NUMBER_OF_PAGES
url = VBA.Strings.Replace(URL_TEMPLATE, "{0}", page * 40)
Set queryTableObject = ws.QueryTables.Add( _
Connection:=url, _
Destination:=target)
With queryTableObject
.BackgroundQuery = False ' Run the query synchronously
' to be able to compute the last row
.WebSelectionType = xlSpecifiedTables
.WebTables = TableId
.Refresh
End With
lastRow = ws.Cells.Find( _
what:="*", _
SearchDirection:=xlPrevious, _
SearchOrder:=xlByRows) _
.Row
Set target = ws.Cells(lastRow + 1, 1)
target.Select
Next page
End Sub
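Rather than guessing NUMBER_OF_PAGES, you could check whether the paging link is still present before requesting the next page. This is only a sketch, untested against the live page; the literal "NEXT" is an assumption about the paging markup, so inspect the page source first:

```
' Sketch: keep paging while the response still contains a "NEXT" link.
' The literal "NEXT" is an assumption about the page's paging markup.
Sub ImportUntilNoNextLink()
    Dim http As Object, page As Long, url As String
    Set http = CreateObject("MSXML2.XMLHTTP")
    page = 0
    Do
        url = "http://games.espn.com/ffl/tools/projections?startIndex=" & page * 40
        http.Open "GET", url, False
        http.send
        ' ... import this page with a QueryTable as in the code above ...
        page = page + 1
    Loop While InStr(1, http.responseText, "NEXT", vbTextCompare) > 0
End Sub
```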

Give this a go. I hope it will fix all the issues you are having.
Sub Espn_Data()
Const URL = "http://games.espn.com/ffl/tools/projections?startIndex="
Dim http As New XMLHTTP60, html As New HTMLDocument, page As Long
Dim htmla As Object, tRow As Object, tCel As Object, ws As Worksheet
Dim c As Long, x As Long
For page = 0 To 120 Step 40 '''input the highest number replacing 120 to get them all
With http
.Open "GET", URL & page, False
.send
html.body.innerHTML = .responseText
End With
With ThisWorkbook
Set ws = .Sheets.Add(After:=.Sheets(.Sheets.Count))
ws.Name = page
End With
Set htmla = html.getElementsByClassName("playerTableTable tableBody")(0)
For Each tRow In htmla.Rows
For Each tCel In tRow.Cells
c = c + 1: ws.Cells(x + 1, c) = tCel.innerText
Next tCel
c = 0
x = x + 1
Next tRow
c = 0
x = 0
Next page
End Sub

The Web Query for ESPN doesn't work any longer with ESPN Fantasy Football. I don't know why, as it returns all of the page info except what is in the table. Don't insult my intelligence, I've tried all of the settings.
I have come to discover that Excel only allows IE 6 as a web browser: you can find this under File > Options > Advanced > General > "Web Options", then Browsers, then scroll through Target Browsers. Those are your only choices of browser for a Web Query to obtain data. ESPN does not support any of those browsers, so like you, we're screwed until Excel updates our choices. Yes, I know, it's 2021 :(


How to extract a list of vehicles from multiple pages?

Good day - This is a follow-up to my previous post, but since it was answered I can't get further updates to it, so I created a new post for a different issue.
I am reading the list of cars from Craigslist Miami using VBA. The code works fine, printing the links for each vehicle and the price. The only issue is that when I select a different city, for instance Los Angeles, the URL syntax changes and I am not sure how to read the class name. The code below works fine for the Miami link (in the code), but does not work for this link:
https://losangeles.craigslist.org/search/cta#search=1~list~1~0
Sub newandoptimized()
Dim link As HTMLLinkElement
Dim blog As HTMLLinkElement
Dim price As HTMLLinkElement
Dim IE As Object
Dim html As HTMLDocument
Dim URL As String
Dim URLParameter As String
Dim page As Long, counter As Long
'Dim http As Object
Dim links As Object
Dim blogpost As Object
Dim priceonly As Object
Dim StartCell As Range
Dim increment As Integer
Dim htmlele1 As HTMLLinkElement
Dim ss As Integer
Dim ee As Integer
' This is the first cell that a blog post hyperlink is created in
Set StartCell = Range("A1")
URL = "https://miami.craigslist.org/search/cta"
Set IE = CreateObject("InternetExplorer.Application")
Application.ScreenUpdating = True
' Change this to False if you want to hide IE
IE.Visible = True
counter = 0
page = 0
'Set the number of pages of the website to go through in the browser
For page = 0 To 480 Step 120 'increment by 120 - total 4 pages
' Debug.Print page
If page >= 0 Then URLParameter = "?s=" & page
IE.navigate URL & URLParameter
'Wait for the browser to load the page
Do Until IE.readyState = 4
DoEvents
Loop
Set html = IE.document
Set links = html.getElementsByTagName("h3")
Index = 0
For Each link In links
If InStr(LCase(link.outerHTML), "result-heading") Then
Set blogpost = link.getElementsByTagName("a")
Set priceonly = link.getElementsByClassName("result-price")
Set Results = html.getElementsByClassName("result-row")
For Each blog In blogpost
StartCell.Offset(counter, 0).Hyperlinks.Add _
Anchor:=StartCell.Offset(counter, 0), Address:=blog, _
TextToDisplay:=link.innerText
StartCell.Offset(counter, 1).Value = Results(Index).getElementsByTagName("span")(0).innerText
Index = Index + 1
Next blog
counter = counter + 1
End If
Next link
Next page
IE.Quit
Set IE = Nothing
Columns("B:B").Select
Selection.NumberFormat = "$#,##0.00"
Columns("D:D").Select
Selection.NumberFormat = "m/d/yyyy;#"
End Sub

Need to split the data in the cell into different columns and help in copying data from website to excel using vba

So I am copying the traffic data from this website.
I have used the following code so far:
Sub main()
Dim IE As InternetExplorer
Dim i
Set IE = New InternetExplorer
IE.Navigate "https://www.cp24.com/mobile/commuter-centre/traffic"
Do
DoEvents
Loop Until IE.ReadyState = ReadyState_Complete
Dim Doc As HTMLDocument
Set Doc = IE.Document
Dim AllRoute As String
Dim holdingsClass As Object
Set holdingsClass = Doc.getElementsByClassName("trafficWidget")
ActiveSheet.Range("A1").Value = holdingsClass(0).textContent
IE.Quit
End Sub
There are two problems I am facing:
1) It's copying all the data in the traffic widget class into one cell, so data is lost when the cell runs out of space.
2) I want a way to split the data; right now everything shows up in one cell.
It should look like this
col.A col.B col.C col.D
HighwayName Current Ideal Delay
Any guidance would be appreciated.
Here you go using CSS selectors to target the information required.
Option Explicit
Sub Getinfo()
Dim http As New XMLHTTP60, html As New HTMLDocument ' XMLHTTP60 is for Excel 2016, so change according to your version, e.g. XMLHTTP for 2013
Const URL As String = "https://www.cp24.com/mobile/commuter-centre/traffic"
Application.ScreenUpdating = False
With http
.Open "GET", URL, False
.send
html.body.innerHTML = .responseText
End With
Dim routeNodeList As Object, currentNodeList As Object, idealNodeList As Object, delayNodeList As Object
With html
Set routeNodeList = .querySelectorAll(".location")
Set currentNodeList = .querySelectorAll(".current")
Set idealNodeList = .querySelectorAll(".ideal")
Set delayNodeList = .querySelectorAll(".delaymin")
End With
Dim i As Long
For i = 0 To routeNodeList.Length - 1
With ActiveSheet
.Cells(i + 2, 1) = routeNodeList.item(i).innerText
.Cells(i + 2, 2) = currentNodeList.item(i).innerText
.Cells(i + 2, 3) = idealNodeList.item(i).innerText
.Cells(i + 2, 4) = delayNodeList.item(i).innerText
End With
Next i
Application.ScreenUpdating = True
End Sub
References required (VBE > Tools > References):
Microsoft HTML Object Library and Microsoft XML (the version matching your Excel, e.g. v6.0)
Late bound version:
Option Explicit
Public Sub Getinfo()
Dim http As Object, html As Object, i As Long
Const URL As String = "https://www.cp24.com/mobile/commuter-centre/traffic"
Application.ScreenUpdating = False
With CreateObject("MSXML2.serverXMLHTTP")
.Open "GET", URL, False
.send
Set html = CreateObject("HTMLFile")
html.body.innerHTML = .responseText
End With
Dim counter As Long: counter = 1
With ActiveSheet
For i = 0 To html.all.Length - 1
Select Case html.all(i).className
Case "location"
counter = counter + 1
.Cells(counter, 1).Value = html.all(i).innerText
Case "current"
.Cells(counter, 2).Value = html.all(i).innerText
Case "ideal"
.Cells(counter, 3).Value = html.all(i).innerText
Case "delaymin"
.Cells(counter, 4).Value = html.all(i).innerText
End Select
Next i
End With
Application.ScreenUpdating = True
End Sub

Extracting data from a live website

I am a bit new to using VBA to extract data from a website into Excel, and I was wondering if you can help me extract data from a website into Excel using VBA and then make it run every hour?
I can use the code to educate myself on it.
Basically, i want to go to the following website
And then just copy and paste the travel times for all the roads into Excel. This would include Current, Ideal and Delay.
Would it be possible to do that?
You will need to add two references in your VBA library to run this code.
To add required references: (VBE > Tools > References)
Microsoft HTML Object Library & Microsoft Internet Controls
After running this, you will notice that you need to Split the string outputs to isolate the individual variables (Current, Ideal, Delay) that you want, and then systematically assign these to a table in your Excel sheet.
You should do some research on web scraping to fully understand what is happening. If you navigate to the page > right-click on a route > Inspect Element, you will see the code below is pulling from the tag "tr". ("tr")(3) will correspond to the 3rd route detailed on the site.
Sub MainSub()
Dim IE As InternetExplorer
Set IE = New InternetExplorer
IE.Navigate "https://www.cp24.com/mobile/commuter-centre/traffic"
Do
DoEvents
Loop Until IE.ReadyState = ReadyState_Complete
Dim Doc As HTMLDocument
Set Doc = IE.Document
Dim FirstRoute As String
Dim SecondRoute As String
FirstRoute = Trim(Doc.getElementsByTagName("tr")(1).innerText)
SecondRoute = Trim(Doc.getElementsByTagName("tr")(2).innerText)
MsgBox FirstRoute & vbNewLine & vbNewLine & vbNewLine & SecondRoute
IE.Quit
Set IE = Nothing
End Sub
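To then isolate the individual values (Current, Ideal, Delay), one option is VBA's Split function on the row text. A sketch only; innerText may separate the cells with tabs or line breaks depending on the page, so check the actual string before choosing the delimiter:

```
' Sketch: split a row string pulled above, e.g. "Hwy 401 E" & vbTab & "25" & vbTab & "20" & vbTab & "5".
Dim parts() As String
parts = Split(FirstRoute, vbTab)
' parts(0) = route name, parts(1) = Current, parts(2) = Ideal, parts(3) = Delay
```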
This should do what you want!
Option Explicit
Sub Web_Table_Option_One()
Dim xml As Object
Dim html As Object
Dim objTable As Object
Dim result As String
Dim lRow As Long
Dim lngTable As Long
Dim lngRow As Long
Dim lngCol As Long
Dim ActRw As Long
Set xml = CreateObject("MSXML2.XMLHTTP.6.0")
With xml
.Open "GET", "https://www.cp24.com/mobile/commuter-centre/traffic", False
.send
End With
result = xml.responseText
Set html = CreateObject("htmlfile")
html.body.innerHTML = result
Set objTable = html.getElementsByTagName("Table")
For lngTable = 0 To objTable.Length - 1
For lngRow = 0 To objTable(lngTable).Rows.Length - 1
For lngCol = 0 To objTable(lngTable).Rows(lngRow).Cells.Length - 1
ThisWorkbook.Sheets("Sheet1").Cells(ActRw + lngRow + 1, lngCol + 1) = objTable(lngTable).Rows(lngRow).Cells(lngCol).innerText
Next lngCol
Next lngRow
ActRw = ActRw + objTable(lngTable).Rows.Length + 1
Next lngTable
End Sub
Set appropriate references...

Excel VBA Macro: Scraping data from site table that spans multiple pages

Thanks in advance for the help. I'm running Windows 8.1, I have the latest IE / Chrome browsers, and the latest Excel. I'm trying to write an Excel macro that pulls data from StackOverflow (https://stackoverflow.com/tags). Specifically, I'm trying to pull the date (that the macro is run), the tag names, the # of tags, and the brief description of what the tag is. I have it working for the first page of the table, but not for the rest (there are 1132 pages at the moment). Right now, it overwrites the data every time I run the macro, and I'm not sure how to make it look for the next empty cell before writing. Lastly, I'm trying to make it run automatically once per week.
I'd much appreciate any help here. Problems are:
Pulling data from the web table beyond the first page
Making it scrape data to the next empty row rather than overwriting
Making the Macro run automatically once per week
Code (so far) is below. Thanks!
Enum READYSTATE
READYSTATE_UNINITIALIZED = 0
READYSTATE_LOADING = 1
READYSTATE_LOADED = 2
READYSTATE_INTERACTIVE = 3
READYSTATE_COMPLETE = 4
End Enum
Sub ImportStackOverflowData()
'to refer to the running copy of Internet Explorer
Dim ie As InternetExplorer
'to refer to the HTML document returned
Dim html As HTMLDocument
'open Internet Explorer in memory, and go to website
Set ie = New InternetExplorer
ie.Visible = False
ie.navigate "http://stackoverflow.com/tags"
'Wait until IE is done loading page
Do While ie.READYSTATE <> READYSTATE_COMPLETE
Application.StatusBar = "Trying to go to StackOverflow ..."
DoEvents
Loop
'show text of HTML document returned
Set html = ie.document
'close down IE and reset status bar
Set ie = Nothing
Application.StatusBar = ""
'clear old data out and put titles in
'Cells.Clear
'put heading across the top of row 3
Range("A3").Value = "Date Pulled"
Range("B3").Value = "Keyword"
Range("C3").Value = "# Of Tags"
'Range("C3").Value = "Asked This Week"
Range("D3").Value = "Description"
Dim TagList As IHTMLElement
Dim Tags As IHTMLElementCollection
Dim Tag As IHTMLElement
Dim RowNumber As Long
Dim TagFields As IHTMLElementCollection
Dim TagField As IHTMLElement
Dim Keyword As String
Dim NumberOfTags As String
'Dim AskedThisWeek As String
Dim TagDescription As String
'Dim QuestionFieldLinks As IHTMLElementCollection
Dim TodaysDate As Date
Set TagList = html.getElementById("tags-browser")
Set Tags = html.getElementsByClassName("tag-cell")
RowNumber = 4
For Each Tag In Tags
'if this is the tag containing the details, process it
If Tag.className = "tag-cell" Then
'get a list of all of the parts of this question,
'and loop over them
Set TagFields = Tag.all
For Each TagField In TagFields
'if this is the keyword, store it
If TagField.className = "post-tag" Then
'store the text value
Keyword = TagField.innerText
Cells(RowNumber, 2).Value = TagField.innerText
End If
If TagField.className = "item-multiplier-count" Then
'store the integer for number of tags
NumberOfTags = TagField.innerText
'NumberOfTags = Replace(NumberOfTags, "x", "")
Cells(RowNumber, 3).Value = Trim(NumberOfTags)
End If
If TagField.className = "excerpt" Then
TagDescription = TagField.innerText
Cells(RowNumber, 4).Value = TagField.innerText
End If
TodaysDate = Format(Now, "MM/dd/yy")
Cells(RowNumber, 1).Value = TodaysDate
Next TagField
'go on to next row of worksheet
RowNumber = RowNumber + 1
End If
Next
Set html = Nothing
'do some final formatting
Range("A3").CurrentRegion.WrapText = False
Range("A3").CurrentRegion.EntireColumn.AutoFit
Range("A1:C1").EntireColumn.HorizontalAlignment = xlCenter
Range("A1:D1").Merge
Range("A1").Value = "StackOverflow Tag Trends"
Range("A1").Font.Bold = True
Application.StatusBar = ""
MsgBox "Done!"
End Sub
There's no need to scrape Stack Overflow when they make the underlying data available to you through things like the Data Explorer. Using this query in the Data Explorer should get you the results you need:
select t.TagName, t.Count, p.Body
from Tags t inner join Posts p
on t.ExcerptPostId = p.Id
order by t.count desc;
The permalink to that query is here and the "Download CSV" option which appears after the query runs is probably the easiest way to get the data into Excel. If you wanted to automate that part of things, the direct link to the CSV download of results is here
You can improve this to parse out exact elements, but as written it loops through all the pages and grabs all the tag info (everything next to a tag).
Option Explicit
Public Sub ImportStackOverflowData()
Dim ie As New InternetExplorer, html As HTMLDocument
Application.ScreenUpdating = False
With ie
.Visible = True
.navigate "https://stackoverflow.com/tags"
While .Busy Or .READYSTATE < 4: DoEvents: Wend
Set html = .document
Dim numPages As Long, i As Long, info As Object, item As Object, counter As Long
numPages = html.querySelector(".page-numbers.dots ~ a").innerText
For i = 1 To 2 ' numPages ''<==1 to 2 for testing; use to numPages
DoEvents
Set info = html.getElementById("tags_list")
For Each item In info.getElementsByClassName("grid-layout--cell tag-cell")
counter = counter + 1
Cells(counter, 1) = item.innerText
Next item
html.querySelector(".page-numbers.next").Click
While .Busy Or .READYSTATE < 4: DoEvents: Wend
Set html = .document
Next i
Application.ScreenUpdating = True
.Quit '<== Remember to quit application
End With
End Sub
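On the "run automatically once per week" part of the question: Application.OnTime can reschedule a macro, though it only fires while Excel and the workbook stay open (otherwise you would need Windows Task Scheduler to open the workbook). A minimal sketch, assuming the sub name from the question:

```
' Sketch: reschedule the scrape seven days out each time it runs.
' Call ScheduleWeekly once, and have ImportStackOverflowData call it again at the end.
Public Sub ScheduleWeekly()
    Application.OnTime Now + 7, "ImportStackOverflowData"
End Sub
```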
I'm not making use of the DOM, but I find it very easy to get around by just searching between known tags. If ever the expressions you are looking for are too common, just tweak the code a bit so that it looks for a string after a string.
An example:
Public Sub ZipLookUp()
Dim URL As String, xmlHTTP As Object, html As Object, htmlResponse As String
Dim SStr As String, EStr As String, EndS As Long, StartS As Long
Dim Zip4Digit As String
URL = "https://tools.usps.com/go/ZipLookupResultsAction!input.action?resultMode=1&companyName=&address1=1642+Harmon+Street&address2=&city=Berkeley&state=CA&urbanCode=&postalCode=&zip=94703"
Set xmlHTTP = CreateObject("MSXML2.XMLHTTP")
xmlHTTP.Open "GET", URL, False
On Error GoTo NoConnect
xmlHTTP.send
On Error GoTo 0
Set html = CreateObject("htmlfile")
htmlResponse = xmlHTTP.ResponseText
If Len(htmlResponse) = 0 Then
MsgBox ("Aborted Run - HTML response was null")
Application.ScreenUpdating = True
GoTo End_Prog
End If
'Searching for a string within 2 strings
SStr = "<span class=""address1 range"">" ' first string
EStr = "</span><br />" ' second string
StartS = InStr(1, htmlResponse, SStr, vbTextCompare) + Len(SStr)
EndS = InStr(StartS, htmlResponse, EStr, vbTextCompare)
Zip4Digit = Left(Mid(htmlResponse, StartS, EndS - StartS), 4)
MsgBox Zip4Digit
GoTo End_Prog
NoConnect:
If Err = -2147467259 Or Err = -2146697211 Then MsgBox "Error - No Connection": GoTo End_Prog 'MsgBox Err & ": " & Error(Err)
End_Prog:
End Sub

Get website data from Urls using VBA

I have multiple URLs stored in an Excel sheet. I want to get data residing within a particular div tag. For one website it works fine:
Sub Cityline()
Dim IE As Object
Set IE = CreateObject("Internetexplorer.application")
IE.Visible = True
IE.navigate "http://Someurl.com/bla/bla/bla"
Do While IE.busy
DoEvents
Loop
Do
DoEvents
Dim Doc As Object
Set Doc = IE.Document
Dim workout As String
workout = Doc.getElementsByClassName("CLASS_NAME_OF_DATA")(0).innertext
Range("A2") = workout
Loop
End Sub
I used the code below to loop through all the URLs, but it's not working:
Sub GetData()
Dim oHtm As Object: Set oHtm = CreateObject("HTMLFile")
Dim req As Object: Set req = CreateObject("msxml2.xmlhttp")
Dim oRow As Object
Dim oCell As Range
Dim url As String
Dim y As Long, x As Long
x = 1
For Each oCell In Sheets("sheet1").Range("A2:A340")
req.Open "GET", oCell.Offset(, 1).Value, False
req.send
With oHtm
.body.innerhtml = req.responsetext
With .getelementsbytagname("table")(1)
With Sheets(1)
.Cells(x, 1).Value = oCell.Offset(, -1).Value
.Cells(x, 2).Value = oCell.Value
End With
y = 3
For Each oRow In .Rows
Sheets(1).Cells(x, y).Value = oRow.Cells(1).innertext
y = y + 1
Next oRow
End With
End With
x = x + 1
Next oCell
End Sub
But it's not working.
Can anyone suggest where I went wrong?
I tried Fetching Data from multiple URLs but it doesn't work for me.
Please guide me on how to get data from all the URLs at once.
I'm new to SO, so apologies to the mods if this should be in comments (I couldn't get it to fit).
I agree with Silver's comments, but I thought I'd suggest a different approach that might help. If you have URLs in a column of cells, you could create a custom VBA function that will extract the relevant data out of the HTML. Just use this function in the cells to the right of your URL to return the relevant data from the HTML. An example is this:
Public Function GetHTMLData(SiteURL As String, FieldSearch As String) As String
Dim IE As Object
Dim BodyHTML As String
Dim FieldStart As Long
Dim FieldEnd As Long
Set IE = CreateObject("InternetExplorer.Application")
With IE
.Navigate SiteURL
Do While .Busy Or .ReadyState <> 4
DoEvents
Loop
BodyHTML = IIf(StrComp(.Document.Title, "Cannot find server", vbTextCompare) = 0, _
vbNullString, .Document.body.innerhtml)
FieldStart = InStr(1, BodyHTML, FieldSearch) + Len(FieldSearch) + 12
FieldEnd = InStr(FieldStart, BodyHTML, "<")
GetHTMLData = Mid(BodyHTML, FieldStart, FieldEnd - FieldStart)
.Quit
End With
Set IE = Nothing
End Function
The function above has 2 input parameters: the URL and a string that will be searched for within the HTML. It will then return a string from within the HTML, starting from 12 characters after the searched parameter and ending at the following '<' within the HTML.
Hope that helps.
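For example, with a URL in A2 you could call the function straight from the worksheet; the search string below is purely illustrative, so pass whatever text immediately precedes the value you want in that page's HTML:

```
' In a worksheet cell (the "result-price" search string is hypothetical):
' =GetHTMLData(A2, "result-price")
' Or from VBA:
MsgBox GetHTMLData(Range("A2").Value, "result-price")
```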