tl;dr: How can one send a POST request in VBA by using an XMLHttpRequest within Internet Explorer?
I'm automating the use of a page on the internet. The idea is to start in Excel, gather some data in a worksheet, then transfer that data to the internet page, click some buttons along the way, and thereafter continue working manually.
I recorded the whole process, executed manually, with the F12 developer tools. When the data is entered into the text boxes of the page, some JavaScript events take place which fire a POST request through XMLHttpRequest.
Now, when replicating the procedure described above in VBA (using getElementById, .click(), .value= and the like), it appears that the relevant events are not fired: no such POST request is sent to the server, and no additional input forms open. This is why I want to skip the textbox-filling-and-clicking approach and simply replicate the POST request.
How can I do this from VBA by using the Internet Explorer?
Disclaimer: I'm aware of how to send a POST request using MSXML2.serverXMLHTTP, which is described hundreds of times all over the web. There, however, it's always either-or: either use Internet Explorer, or use MSXML2.serverXMLHTTP. I need the request to be sent from within Internet Explorer.
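For reference, that standard approach, which runs outside Internet Explorer and therefore does not share its session, looks something like this (the URL and body are placeholders):

Dim xhr As Object
Set xhr = CreateObject("MSXML2.ServerXMLHTTP.6.0")
xhr.Open "POST", "https://example.com/target", False   ' placeholder URL
xhr.SetRequestHeader "Content-Type", "application/x-www-form-urlencoded"
xhr.Send "field1=value1&field2=value2"                 ' placeholder body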
These links look promising |VB5| C# |
Navigate method, 4th argument |Another VB example using StrConv for byte array conversion|
It looks like you simply use the Navigate method, with the data packed into a byte array as the fourth argument. One of the links above is old, but the newer one looks very similar, so I think the interface has not changed much. A minimal sketch is below.
Do please post feedback.
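For concreteness, here is a minimal, untested sketch of that idea in VBA; the target URL and form fields are placeholders for the ones captured with the F12 tools:

Dim ie As Object
Dim postData() As Byte
Dim headers As String

' Encode the form body as an ANSI byte array, as the links above suggest
postData = StrConv("field1=value1&field2=value2", vbFromUnicode)
headers = "Content-Type: application/x-www-form-urlencoded" & vbCrLf

Set ie = CreateObject("InternetExplorer.Application")
ie.Visible = True
' 4th argument = POST body (byte array), 5th argument = extra request headers
ie.Navigate "https://example.com/target", , , postData, headers

Because the request is issued by the browser itself, it should be sent within Internet Explorer's session, cookies included.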
Related
I have a list in "SharePoint in Microsoft 365" and am looking to extend its usefulness. I would like to create a button or something that can take a row of data and send it to an external site via API (another tool that we have created).
To be clear:
SharePoint initiates a GET or POST request
the request must have row or field data attached
the destination is outside of Microsoft/SharePoint
Can it be done?
(Apologies if this is a repeat question, I could only find answers that pertained to older versions of SharePoint and focused on hitting external APIs to retrieve data, not send it.)
OK, I think the best choice for sending data from a selected row is to use a ListView Command Set; there you can do anything with #pnp/nodejs.
Otherwise...
Power Automate can help send the data.
I'm trying to get Excel to download a CSV file, from a link that changes daily, at the click of a button. The thing is, it's locked behind an agreement number, an ID, and a password.
I, however, got two API tokens:
TheAppSecretToken
TheAgreementGrantToken
The link is:
https://secure.e-conomic.com/secure/generelt/exportdata2.asp?mode=doexport&kartotek=5&fradato=01-01-2017&tildato=01-02-2018&vcseparator=%3B&vcQualifier=%22
If people have a way other than VBA code to download this file at the click of a button, don't hold back with the suggestion.
I appreciate any help I can get, thank you. :-)
EDIT: It's not a duplicate of the other question, as this one uses tokens and/or three pieces of login information.
EDIT2: Never mind the link changing from day to day; I figured out that I can just set the date as far in the future as I like.
Edit:
I assume that the login call needs to be a POST request, but this is only a guess, as I can't test it with the information you have given.
Just change loginBody to the information you need, or change it to the format the URL expects (such as JSON); you can send as much information as needed.
If needed, you can also set more headers for any other tokens you have, as sketched after the code below.
Dim HttpObj As Object, URL As String, loginBody As String

Set HttpObj = CreateObject("MSXML2.ServerXMLHTTP.6.0")   ' late-bound request object
URL = "YOURURL"
loginBody = "username=username&password=password&token=token"
HttpObj.Open "POST", URL, False   ' False = synchronous: wait for the response
HttpObj.SetRequestHeader "Content-Type", "application/x-www-form-urlencoded"
HttpObj.Send loginBody
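For the two tokens from the question, extra headers would be set the same way, between Open and Send. Note the header names below are an assumption based on e-conomic's REST API; the legacy export URL may expect the credentials in a different place:

HttpObj.SetRequestHeader "X-AppSecretToken", "TheAppSecretToken"            ' assumed header name
HttpObj.SetRequestHeader "X-AgreementGrantToken", "TheAgreementGrantToken"  ' assumed header name

Once Send returns, the downloaded CSV can be written to disk with ADODB.Stream (the path is a placeholder):

Dim stream As Object
Set stream = CreateObject("ADODB.Stream")
stream.Type = 1                        ' adTypeBinary
stream.Open
stream.Write HttpObj.responseBody
stream.SaveToFile "C:\export.csv", 2   ' adSaveCreateOverWrite
stream.Close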
Old answer:
As I'm not allowed to comment: it seems what you are trying to do is explained here:
How do i download a file using VBA (Without internet explorer)
I am performing an integration to a third party site and they have asked me to redirect to their URL with a bunch of POST variables.
The only way I can work out how to do this is by creating an HTML form with hidden fields, and then trigger a JS click event on a button to POST the form to their site.
Ideally I would like to do this from the VB code, as the data may be sensitive and there are bits I don't want to render on the client side.
Does anyone have experience of doing this successfully? I have googled around, but can only find ways to use GET variables on the URL, or to POST and read the response.
How does one write a script to download one's Google web history?
I know about
https://www.google.com/history/
https://www.google.com/history/lookup?hl=en&authuser=0&max=1326122791634447
feed:https://www.google.com/history/lookup?month=1&day=9&yr=2011&output=rss
but they fail when called programmatically rather than through a browser.
I wrote up a blog post on how to download your entire Google Web History using a script I put together.
It all works directly within your web browser on the client side (i.e. no data is transmitted to a third-party), and you can download it to a CSV file. You can view the source code here:
http://geeklad.com/tools/google-history/google-history.js
My blog post has a bookmarklet you can use to easily launch the script. It works by accessing the same feed, but it iterates through the entire history 1,000 records at a time, converts it into a CSV string, and makes the data downloadable at the touch of a button.
I ran it against my own history, and successfully downloaded over 130K records, which came out to around 30MB when exported to CSV.
EDIT: It seems that a number of folks who have used my script have run into problems, likely due to some oddities in their history data. Unfortunately, since the script does everything within the browser, I cannot debug it when it encounters histories that break it. If you're a JavaScript developer and my script appears to break on your history, please feel free to help me fix it and send me any updates to the code.
I tried GeekLad's system; unfortunately, two breaking changes have occurred: #1, the URL has changed (I modified and hosted my own copy), which led to #2: the type=rss argument no longer works.
I only needed the timestamps... so began the best/worst hack I've written in a while.
Step 1 - https://stackoverflow.com/a/3177718/9908 - Using Chrome, disable ALL security protections.
Step 2 - https://gist.github.com/devdave/22b578d562a0dc1a8303
Using contentscript.js and manifest.json, make a Chrome extension, and host ransack.js locally with whatever service you want (PHP, Ruby, Python, etc.). Go to https://history.google.com/history/ after installing your content-script extension in developer mode (unpacked). It will automatically inject ransack.js + jQuery into the DOM, harvest the data, and then move on to the next "Later" link.
Roughly every 60 seconds, Google will randomly force you to log in again, so this is not a start-it-and-walk-away process, BUT it does work, and if they up the obfuscation ante you can always resort to chaining Ajax calls and sending the page back to the backend for post-processing. At full tilt, my abomination of a script collected one page of data per second.
On moral grounds I will not help anyone modify this script to harvest search terms and results, as this process is not sanctioned by Google (though apparently not blocked), and I recommend it only to sufficiently motivated individuals who can make it work for themselves. By my estimate it took me 3-4 hours to get all 9 years of data (90K records) at 1 page every 900 ms or faster.
While this thing is running, DO NOT browse the rest of the web, because Chrome is running with no safeguards in place; most of them exist for a reason.
One can download their search logs directly from Google (in case downloading them with a script is not the primary goal).
Steps:
1) Log in and go to https://history.google.com/history/
2) Just below your profile picture, towards the right side, you will find a settings icon. The second option there is called "Download". Click on that.
3) Then click on "Create Archive", and Google will mail you the log within minutes.
Maybe, before issuing a request to get the feed, the script should add a User-Agent HTTP header of a well-known browser, so that Google decides the request came from that browser.
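In VBA, for example, that could look like the sketch below (untested; the feed also requires an authenticated Google session, which this sketch does not handle):

Dim req As Object
Set req = CreateObject("MSXML2.ServerXMLHTTP.6.0")
req.Open "GET", "https://www.google.com/history/lookup?month=1&day=9&yr=2011&output=rss", False
' Impersonate a well-known browser in case Google rejects unknown clients
req.SetRequestHeader "User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
req.Send
Debug.Print req.responseText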
I want to be able to retrieve dynamic data from a web page (share prices). I started out by retrieving the HTML code before I realised that, as it is live data, the HTML code will be of little use. Although I am looking to capture specific data, all I wish to do is process a web page that I specify, which will return the text of that website and not the HTML code. Basically, a copy and paste of the entire page would be great.
Any ideas would be really appreciated!
'Screen scraping' by parsing HTML is so early 2000s... What I would do is read up on Amazon's Mechanical Turk. You can develop a queued architecture where you submit URLs to this Mechanical Turk service. The service would automatically distribute these bits of work to users, who would then do the dirty task of copying and pasting out the valuable stock quote information you require. Users around the world would anxiously await delivery of the next URL to their Mechanical Turk inbox... pining for the opportunity to copy/paste out another share price for your application. Sure, it might take a few minutes to update your prices, but hey, they would be HAND parsed by REAL people around the globe! Just think of the possibilities!
Well, the HTML contains the text of the website, so you "just" need to parse the HTML.
EDIT: If the data is not in the HTML but loaded dynamically, the situation is different. As I see it, you have two options:
Find out how the data is loaded (i.e. read the JavaScript on the page). If it is updated via some web service, you could query the same web service in your program.
Use a web browser to get the data and then get the dynamic HTML tree of the page. Maybe the WPF Webbrowser control can help you with this, but I'm not sure since I've never done this myself.
Is it possible to find this same data provided in a ready-to-consume format rather than scraping HTML for it? It seems like there's probably public web-services for stock quotes.
For example: A quick search for "Stock price webservice" turned up http://www.webservicex.net/stockquote.asmx; an ASMX web-service that is easy to consume in .NET.
In your Visual Studio project you should be able to add a reference to this service via the "Add Web Reference" command; the dialog you're given varies depending on whether your project is targeting .NET 2.0 or .NET 3.0/3.5.
I added a reference to the service named StockPriceProxy:
Public Function GetQuote(ByVal symbol As String) As String
    ' Dispose of the web-service proxy automatically when done
    Using quoteService As New StockPriceProxy.StockQuote
        Return quoteService.GetQuote(symbol)
    End Using
End Function