jQuery tabs taking a long time to load due to Google OAuth2 - jquery-ui-tabs

I recently moved to Google OAuth2 authorisation for my webpage, which has multiple jQuery tabs.
Clicking a tab triggers a GET request. After shifting to Google OAuth2, the time to load these tabs has increased by at least 5x. Previously I used server-side Apache authorisation. I am assuming the GET request data is getting authorised by Google before being rendered on the page (which might be causing this latency). Is there some way to solve this problem?
As one approach, I tried loading each tab serially by following this solution, but that didn't work well. I think there could be a better way to solve this.
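For reference, a minimal sketch of the caching I'm considering next, so that each tab's authorised GET only fires once (assuming standard jQuery UI remote tabs; the #tabs selector is a placeholder, not my actual markup):

```javascript
$("#tabs").tabs({
  beforeLoad: function (event, ui) {
    // Reuse the panel once it has been fetched, instead of repeating
    // the (OAuth2-authorised) GET on every click
    if (ui.tab.data("loaded")) {
      event.preventDefault();
      return;
    }
    ui.jqXHR.done(function () {
      ui.tab.data("loaded", true);
    });
  }
});
```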

Related

Active Collab API doesn't show all reports

I'm fetching all the open reports tagged as CHECKPOINT using the Active Collab API, and it's working fine. However, when I run a custom report for the tasks on the Active Collab website, I get more (and different) results than what I fetched.
What I need is to get exactly the same results shown when running a custom report. Does anyone know how I can fix this, or whether it's a problem with the API itself?
Open the browser console and see which requests ActiveCollab's web interface makes when fetching data to build a report. Compare this with the requests that you are making and see where the differences are.
The web interface is served by the same API as your app, so both can be made to work the same, as long as they are making the same requests as the same user.
Solution by OP.
By creating a data filter, it shows all the reports if "include_all_projects": true is set! Simple as that.
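A rough illustration of that filter (the URL, header, and surrounding field names here are placeholders rather than the exact Active Collab API call; the important part is the include_all_projects flag):

```javascript
// Placeholder request: adjust the URL, auth header, and filter fields to your Active Collab setup.
const filter = {
  label_filter: "CHECKPOINT",   // only reports tagged CHECKPOINT
  include_all_projects: true    // without this, some projects were silently excluded
};

fetch("https://example.activecollab.example/api/v1/reports/run?" + new URLSearchParams(filter), {
  headers: { "X-Angie-AuthApiToken": "<your token>" }  // header name is an assumption
})
  .then(res => res.json())
  .then(report => console.log(report));
```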

Causes of duplicate apache POST requests, other than double submission of form?

This might sound like a question that gets asked frequently but I am not looking for solutions to handle duplicate requests. I just want to know what could cause Apache to receive duplicate requests in the first place.
I have been running into a rather sporadic problem. I have a form that does a POST request on submit, but the request can somehow get duplicated just a second later (according to the access logs). This used to be a more frequent problem because we were not handling it as gracefully, so I put in some client-side code to disable the submit button during the form's submit event. This prevents double submission (obviously, as long as JavaScript is enabled), but the problem still persists in a very random manner. One thing I noticed from the logs is that the clients causing the issue are Android phones running Chrome. Does mobile Chrome do funky things like retrying POST requests on its own? When testing it on my own, I could never get duplicate POST requests to occur, unless I removed the JavaScript code that disables the submit button.
The web server setup is super simple: no load balancing or anything, just a single server running Apache 2.2.15. We use PHP 5.6, but that probably has nothing to do with this.
My guess is that users are double-clicking rather than clicking, and the application they use turns every click into a new POST request. I'd look into the application design here.
Usually I use frameworks that cover this completely, so I can only guess. Clicking the button should not only trigger the POST request but also disable the button while the action is in progress. So the JavaScript flow could look like:
disable button
post the data
enable button
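A minimal sketch of that idea, assuming a plain HTML form (the #my-form selector is just a placeholder, not the poster's actual markup):

```javascript
var form = document.querySelector("#my-form");           // placeholder selector
form.addEventListener("submit", function () {
  var button = form.querySelector("button[type=submit]");
  button.disabled = true;  // block further clicks while the POST is in flight
  // If you submit via fetch/XHR instead of a normal form submit,
  // re-enable the button once the request settles:
  // fetch(form.action, { method: "POST", body: new FormData(form) })
  //   .finally(function () { button.disabled = false; });
});
```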
If the browser navigates to another page as a result of the POST, this would not be harmful at all.
EDIT: Since you did exactly what I suggested, maybe there is another cause.
Suppose users POST their data, and then the screen goes dark, or they switch applications. When they reactivate the browser, is it possible the browser reloads the page by repeating the last request?
I think frameworks cover such situations by responding to the POST with a redirect, and the redirect retrieves the resulting page via GET. Since GET is idempotent, it can be repeated without further damage.
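A sketch of that Post/Redirect/Get idea, shown in Node/Express purely for illustration (your actual setup is Apache + PHP; the in-memory order store is only there to keep the example self-contained):

```javascript
const express = require("express");
const app = express();
app.use(express.urlencoded({ extended: false }));

// Hypothetical in-memory "persistence", just to make the sketch runnable
const orders = [];

app.post("/orders", (req, res) => {
  const id = orders.push(req.body) - 1;  // save the submitted form data
  res.redirect(303, "/orders/" + id);    // 303 See Other -> browser follows up with GET
});

app.get("/orders/:id", (req, res) => {
  // Reloading or re-activating this page only repeats an idempotent GET,
  // never the original POST
  res.json(orders[req.params.id] || {});
});

app.listen(8080);
```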

Handling SEO on Isomorphic React

I'm using React & Node.js for building universal apps. I'm also using react-helmet as the library to handle the page title, meta, description, etc.
But I have some trouble when loading content dynamically using AJAX: the Google crawler cannot fetch my site correctly because the content is loaded dynamically. Any suggestions to tackle this problem?
Thank you!
I had a similar situation, but with Django as the backend; I think which backend you use doesn't matter, though.
First, let me get to the basics: the Google bots don't actually wait for your AJAX calls to complete. If you want to test it out, register your page at Google Webmaster Tools and try "Fetch as Google"; you will see how your page is seen by the bots (mine was just an empty page with a loading icon). Since the calls don't complete, there is no data and the page is empty, which is bad for SEO, as bots read text.
So what you need to do is try server-side rendering. You can do this in two ways: either use prerender.io, or create templates on the backend which are served when the page is requested for the first time, after which your single-page app kicks in.
Prerender is paid, but internally it uses PhantomJS, which you can use directly. It did not work out really well for me, so I went with creating templates on the backend. So when bots or users hit the page for the first time (the first entry), the page is served from the backend; after that, from the front end.
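Since your stack is React + Node, the "templates on the backend" idea maps to rendering the app on the server. A rough sketch with ReactDOMServer and react-helmet could look like this (Express server; <App /> and the bundle path are placeholders, and the server code would need to be transpiled for JSX):

```javascript
import express from "express";
import React from "react";
import { renderToString } from "react-dom/server";
import { Helmet } from "react-helmet";
import App from "./App";   // placeholder: your root component

const app = express();

app.get("*", (req, res) => {
  const markup = renderToString(<App />);
  const helmet = Helmet.renderStatic();   // title/meta collected by react-helmet

  res.send(`<!doctype html>
<html>
  <head>${helmet.title.toString()}${helmet.meta.toString()}</head>
  <body>
    <div id="root">${markup}</div>
    <script src="/bundle.js"></script>
  </body>
</html>`);
});

app.listen(3000);
```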
Feel free to ask in case of any questions :)

How can I extract data behind a login page using import.io

I need to crawl some data that sits behind a login page. To be able to scrape it I need a tool that is able to log in and then crawl the pages behind it. Is it possible to do this with import.io?
Short version: yes, it is.
Longer version:
There are at least two ways; both require you to sign up and download the desktop app (all free).
Extractor version (simpler):
Point the browser to the page where the login is. Log in normally, then train your API to extract the data you need. The downside of this method is that it will only work as long as you are logged in. If you want import.io to log in for you, you'll need the...
Authenticated version:
As above, but create an authenticated API. This will record your login procedure and execute it for you every time you run the API.
Since the chosen answer doesn't work anymore :( I recommend Cloudscrape. You will get a free trial with 20 hours of crawling and/or scraping if you sign up. For data behind a login you will need a scraper.
Handy tutorials
Tutorial for logging in with scraper.
Tutorial for pagination.

Injected scripts not firing for Google results pages

As a newbie to Safari Extensions I have what I am sure is a terribly trivial question ...
Here goes: I'm building an extension to work with some search engines. To cut a long story short, I have boiled my issue down to its simplest form. I have an injected script (an end script). It fires as planned on the Google homepage, but when I enter a query into Google, the script does not fire when the subsequent results page loads.
For example, to keep things really basic for testing, I created a simple script that just writes to the console; I've set the access level to All so that it fires for all pages. I can see the console message when I open the Google homepage, but I don't see it when the subsequent results pages load.
For all intents and purposes, it seems as if the transition from the Google homepage to the results page is not a normal one (that is, not a conventional page load) and does not cause injected scripts to fire. I've only seen this problem on Google, so I assume it is something to do with their page-loading mechanism. I've tried it with Google Instant on and off, and both produce the same behaviour.
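In case it helps to reproduce, a bare-bones end script along these lines (the #search selector is just a guess at Google's results container) shows whether new results arrive via in-place DOM updates rather than a fresh page load:

```javascript
// Logs once per real page load...
console.log("injected script ran on", location.href);

// ...and again whenever the results are swapped in place without a navigation
var target = document.querySelector("#search") || document.body;
new MutationObserver(function () {
  console.log("DOM changed without a page load:", location.href);
}).observe(target, { childList: true, subtree: true });
```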
It's one of those problems that seems so basic as to be stupefying! Please help.