Say I have a webpage that renders all its content using JavaScript.
It has to fetch the data from somewhere (likely some API). How do I see which webpages/APIs the JavaScript calls?
I have used the Chrome DevTools and looked under Fetch/XHR in the Network tab, but the only thing that pops up is an .svg file.
Do I then need to inspect each JavaScript function under JS and look at what it calls, or..?
Finding out which servers are contacted via DevTools is not always easy. You can try extensions like WebQsee (https://chrome.google.com/webstore/detail/webqsee-web-sniffer-recor/gamdpfnfkjknkimfbboonmgdfnondfme). They explicitly list the domains the browser is communicating with.
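If you'd rather not install an extension, one alternative (not part of the original answer, so treat it as a sketch) is to log every request the page's JavaScript makes by wrapping fetch and XMLHttpRequest from the DevTools console. Also make sure the Network tab is open and set to "All" before you reload the page, since requests made before DevTools starts recording won't show up.

// Paste into the DevTools console, then interact with the page.
// Every fetch() and XMLHttpRequest the page's scripts make from now on
// will be logged with its URL.
const originalFetch = window.fetch;
window.fetch = function (...args) {
  console.log('[fetch]', args[0]);
  return originalFetch.apply(this, args);
};

const originalOpen = XMLHttpRequest.prototype.open;
XMLHttpRequest.prototype.open = function (method, url, ...rest) {
  console.log('[XHR]', method, url);
  return originalOpen.call(this, method, url, ...rest);
};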
I'm using Selenium (Python) to automate some tests on websites. Because Selenium's API is quite limited, I'm using a web extension to perform advanced JavaScript tests.
What would be the proper way to communicate results from the web extension back to the Python script? So far, I'm passing them through console.log messages, but that fails if the target site overrides console.log() (and it seems quite hacky anyway).
I'd probably tackle this the following way:
Firstly, if you have control over the web extension's source code, I'd add a simple method that serializes your data into a convenient format and stores it in the browser's local storage.
Note: If you haven't worked with this, don't worry, there are multiple examples online. Keep in mind that local storage is limited to roughly 5-10 MB of data (varying across browsers).
Secondly, you'd have to read the localStorage values. I see two ways to do this:
make use of your underlying Selenium-based framework's API (usually all of them have some sort of localStorage/cookies call). For example, in most frameworks you can use the .execute() (or executeScript) command to read the value back: browser.execute("return localStorage.getItem('PerduData');");
use plain JavaScript inside your scripts to read the same local storage value (see the sketch after this list)
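To make both steps concrete, here is a minimal sketch, assuming a placeholder key name of 'PerduData' and placeholder result data:

// Web-extension side (e.g. in a content script): serialize the test
// results and store them under an agreed-upon key. Content scripts share
// the page's localStorage, so the test runner can read it back later.
const results = { passed: 12, failed: 1, notes: ['placeholder data'] };
localStorage.setItem('PerduData', JSON.stringify(results));

// Test-runner side: read the value back and parse it. With a
// WebdriverIO-style API this could look like:
//   const raw = browser.execute(() => localStorage.getItem('PerduData'));
//   const data = JSON.parse(raw);

In Python's Selenium bindings the read side would be driver.execute_script("return localStorage.getItem('PerduData');"), which you can then pass to json.loads() to get a dict back.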
I'm sure there are multiple ways to achieve the same outcome. If you're comfortable working with files, you can also consider storing the data object in other browser or machine (OS) storage areas.
Lastly, I think the most elegant way to achieve this is to use some third-party storage with a publicly exposed API that can ultimately be accessed by both the web extension and the Selenium script.
Hope this helps!
So I'm using Google Custom Search (Google CSE) and I'm trying to use the refinement functionality to redirect search queries to Google Scholar.
Basically I'm following exactly the documentation found here. However, it turns out that, despite there being documentation, this functionality doesn't exist, and it doesn't appear that Google has any plans to implement it in the near future (see the StackOverflow post here).
My question is, does anyone have a hack/workaround for this problem, so that I could use Google CSE to search Google Scholar?
Server Side
You can use something like https://github.com/ckreibich/scholar.py to parse the results from Google Scholar yourself and expose them as an API that you could consume and render any way you like.
It would use Scholar search under the hood. However, since this isn't an official API, it might break at any time, and it requires server-side resources to service the requests, but it gives you the nicest interface, one you have full control over.
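A rough sketch of what that proxy API could look like, using Express and a child process to call scholar.py. The scholar.py flags shown here are assumptions, so check python scholar.py --help for the real ones.

// Tiny proxy API around scholar.py (the CLI flags below are assumptions).
const express = require('express');
const { execFile } = require('child_process');

const app = express();

app.get('/api/scholar', (req, res) => {
  const query = req.query.q || '';
  // '--csv' and '-c' are assumed to select CSV output and result count.
  execFile('python', ['scholar.py', '--csv', '-c', '10', '--all', query],
    (err, stdout, stderr) => {
      if (err) {
        return res.status(502).json({ error: 'scholar.py failed', detail: stderr });
      }
      // Return the raw CSV lines; parse and shape them however you like.
      res.json({ results: stdout.trim().split('\n') });
    });
});

app.listen(3000);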
IFrame
You can open an iframe at the particular URL, and this can be embedded inside your page. It looks a bit clunkier, but it means you don't have to link externally and you can embed it locally.
<iframe src='http://scholar.google.com/scholar?q={query}'></iframe>
See the documentation here; it might be exactly what you need if it renders well for you.
External Link
Alternatively, you can just open a new tab/window with:
<a href='http://scholar.google.com/scholar?q={query}' target='_blank'> My Link </a>
I'm using React & Node.js for building universal apps. I'm also using react-helmet as a library to handle the page title, meta, description, etc.
But I have some trouble: content is loaded dynamically using AJAX, so the Google crawler cannot fetch my site correctly. Any suggestions to tackle this problem?
Thank you!
I had a similar situation, but with Django as the backend; however, I think which backend you use doesn't matter.
First, let me get to the basics: the Google bots don't actually wait for your AJAX calls to complete. If you want to test it out, register your page with Google Webmaster Tools and try "Fetch as Google"; you will see how your page is seen by the bots (mine was just an empty page with a loading icon). Since the calls don't complete, there is no data and the page is empty, which is bad for SEO, as bots read text.
So what you need to do is try server-side rendering. You can do this in two ways: either use prerender.io, or create templates on the backend that are served when the page is requested for the first time, after which your single-page app kicks in.
Prerender is paid, but internally it uses PhantomJS, which you can also use directly. It did not work out well for me, so I went with creating templates on the backend. So when a bot or a user hits the page for the first time (the first entry), the page is served from the backend; after that, the frontend takes over.
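Since your stack is React/Node rather than Django, the "templates on the backend" idea translates to rendering your React tree on the server. Below is a minimal sketch using Express, ReactDOMServer and react-helmet's static API; App and the data-loading stub are placeholders for your own code.

// server.js - minimal server-side rendering sketch.
const express = require('express');
const React = require('react');
const ReactDOMServer = require('react-dom/server');
const { Helmet } = require('react-helmet');
const App = require('./App'); // placeholder: your root component

// Placeholder: load on the server whatever the client would fetch via AJAX,
// so crawlers see the full content in the initial HTML.
async function fetchInitialData(url) {
  return { url };
}

const app = express();

app.get('*', async (req, res) => {
  const data = await fetchInitialData(req.url);
  const html = ReactDOMServer.renderToString(React.createElement(App, { data }));
  const helmet = Helmet.renderStatic(); // collects the tags set via react-helmet

  res.send(`<!doctype html>
<html>
  <head>${helmet.title.toString()}${helmet.meta.toString()}</head>
  <body>
    <div id="root">${html}</div>
    <script>window.__INITIAL_DATA__ = ${JSON.stringify(data)}</script>
    <script src="/bundle.js"></script>
  </body>
</html>`);
});

app.listen(3000);

On the client you then hydrate the same markup instead of rendering from scratch, and crawlers get meaningful HTML on the first request.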
Feel free to ask in case of any questions :)
I'm thinking of something like pulling in a JSON file and generating a page from it. I just want to know if this is possible in Shopify.
We want to move our current site completely over to Shopify, and we're hoping we can still have much of the functionality we got from Angular and Node.
Yes, you can do this through application proxies.
An application proxy is a feature that fetches and displays data on a Shopify shop page from an outside location that you, the developer, specify. For app developers, this means that the content on these proxy pages can be dynamic; capable of being updated as much as you feel necessary. App developers can use application proxies to display image galleries, statistics and custom forms, just to name a few examples.
Basically, when Shopify receives an HTTP request on a proxied path, it forwards that request to the URL you specified, so you can do as you wish. Even better, if you set Content-Type: application/liquid on your response headers, Shopify will render the template for you; so you could use your very own Liquid templates, making this page look and behave exactly like the rest of your shop.
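For example, here is a minimal sketch of the endpoint your proxy points at, written with Express; the route path, the placeholder data and the markup are all assumptions you'd replace with your own:

// Handler for the URL your application proxy forwards to.
// Responding with Content-Type: application/liquid makes Shopify render
// the body inside the shop's theme layout.
const express = require('express');
const app = express();

app.get('/proxy/gallery', (req, res) => {
  // Build the dynamic content however you like (database, JSON file, etc.).
  const images = ['a.jpg', 'b.jpg']; // placeholder data

  const liquid = `
    <h1>{{ shop.name }} gallery</h1>
    <ul>
      ${images.map(src => `<li><img src="${src}"></li>`).join('')}
    </ul>`;

  res.set('Content-Type', 'application/liquid');
  res.send(liquid);
});

app.listen(3000);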
There's information on how to get started in the official Shopify documentation.
I've no idea what exactly it is I need, but I'll use Facebook as an example. When you load another page (for example, on the home page you click your profile), the page itself doesn't look like it reloads. The data loads of course, and the URL changes, but the top bar and chat bar stay put.
I'm looking to do something similar to this. Am I assuming too much and it's really a simple cache function, or is there a JavaScript way to do it, or some other way?
Thanks.
It's using AJAX, which is a combination of various technologies (primarily JavaScript) to achieve this dynamic client-server interaction within the overall context of a page.
Essentially what you'd have for your setup is something like this:
A page which the user requests and is returned to the browser. This page will contain (or reference in external files, etc.) some JavaScript code to drive the interactive functionality. For example, a "link" may trigger a JavaScript function instead of navigate to another page. That function will make an AJAX call to the server.
An AJAX handler on the server. Think of it like another page, or like a web service of some kind. It expects requests to come from JavaScript code, not from humans. (Though humans can call it manually if they want, so don't return sensitive data or anything like that.) This can return data in any number of formats (JSON, XML, HTML, etc.) and the client-side JavaScript code will use that response in code.
Depending on your preferred web development technologies, there are various frameworks and tools to help with AJAX functionality. My personal favorite is just to use the AJAX methods in jQuery and to manually control the responses from the server. If you specify your development platform, I can help find some useful examples to get you started.
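For instance, here is a minimal jQuery sketch (the /profile URL, the query parameter and the element IDs are placeholders): clicking a "link" calls JavaScript, which fetches JSON from the server and swaps only part of the page:

// Assumes jQuery is loaded and the server exposes a /profile endpoint
// returning JSON such as { "name": "...", "bio": "..." } (placeholders).
$(function () {
  $('#profile-link').on('click', function (e) {
    e.preventDefault(); // stop the normal full-page navigation

    $.getJSON('/profile', { id: 42 })
      .done(function (data) {
        // Only this region changes; the top bar, chat bar, etc. stay put.
        $('#content').html('<h1>' + data.name + '</h1><p>' + data.bio + '</p>');
      })
      .fail(function () {
        $('#content').text('Could not load the profile.');
      });
  });
});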
This is all done using AJAX. Depending on what you want, you can load entire HTML chunks, or just load the data and have the page's JavaScript output what's needed.