SEOstats API Google PageRank blocked by Google

I wrote a PHP script to get the PageRank of every URL on my company's website, but I get the following response from GetWithCurl($url) in $str. It looks like Google has some restriction on fetching PageRank dynamically.
Is there any way to resolve this, or to contact Google about it? Thank you!
We're sorry... but your computer or network may be sending automated queries. To protect our users, we can't process your request right now. See Google Help for more information. (© 2009 Google)

Well, that is not an issue with SEOstats. The problem is that Google detected that you are sending automated requests to it, which is against their Terms of Service!
You should be able to "solve" this by getting a fresh IP from your provider (turn your router off and on) or by sending your requests through proxies. Either way, you must decrease your request frequency to avoid getting blocked again!
See: https://github.com/eyecatchup/SEOstats/issues/33
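As a rough illustration of the advice above, here is a minimal PHP sketch that spaces requests out and can optionally route them through a proxy (the fetch helper, URLs, and proxy address are hypothetical and not part of SEOstats):

    <?php
    // Hypothetical helper: fetch a URL with cURL, optionally through a proxy,
    // then pause so requests stay infrequent.
    function fetchSlowly($url, $proxy = null, $delaySeconds = 10) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);
        if ($proxy !== null) {
            curl_setopt($ch, CURLOPT_PROXY, $proxy); // e.g. "203.0.113.5:8080" (placeholder)
        }
        $body = curl_exec($ch);
        curl_close($ch);
        sleep($delaySeconds); // keep the request frequency low
        return $body;
    }

    // Usage sketch: query the company URLs slowly instead of in a tight loop.
    $urls = array('http://example.com/', 'http://example.com/about');
    foreach ($urls as $u) {
        $response = fetchSlowly($u);
        // ... hand $response to your parsing code here ...
    }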

Related

API for Google PageSpeed Insights

I just set up Google PageSpeed Insights in my Google Webmaster account, but whenever I try to run a PageSpeed test this error occurs: "The referrer https://www.googleapis.com/ does not match the referrer restrictions configured on your API key. Please use the API Console to update your key restrictions."
I already created an API key for my URL, added an HTTP referrer restriction, and submitted my website in it, but it is still not working.
Any solution for it?
You have set your restrictions incorrectly; the error message points you directly to the problem.
Remove all restrictions and try again, then slowly add restrictions back until you hit the problem.
If you have restricted the key to "Accept requests from these HTTP referrers (web sites) (Optional)", bear in mind that you have to verify your site first for some APIs to function correctly.
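As a quick sanity check (just a sketch; the URL and key below are placeholders), you can call the PageSpeed Insights v5 endpoint directly with an unrestricted test key to confirm the key itself works before re-adding referrer restrictions:

    curl "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=https://www.example.com/&key=YOUR_API_KEY"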

Getting a 411 response for an HTTP GET request on a Cloudflare URL

Our website uses Cloudflare as its CDN to handle load.
One of our apps requests the URL http://www.codenameone.com/files/cn1libs/CN1JSON.cn1lib with a GET request. This works fine for every machine/location we tested, but we have user complaints that they are getting an HTTP 411 "Length Required" response.
Since this is a GET request, Content-Length doesn't seem like a header we would need to send...
Our server logs don't show any 411 responses, so my only conclusion is that this is a failure on the Cloudflare side. However, since we can't reproduce this and the Cloudflare layer is a black box, I don't have much to go on in terms of debugging.
I tried contacting Cloudflare support but effectively got the usual "runaround", being asked to send traces from a user's machine on the other side of the world, which is not something I can realistically do.
After a long session with Cloudflare support, it seems that unless you are an enterprise subscriber, access log files just don't exist. So effectively their support sees Cloudflare as a black box, just like we do.
Since the problem clearly isn't in our servers, my educated guess is that this is a bug in Cloudflare related to some odd edge case.
If someone has a better answer than this I'll gladly accept it.

Google OAuth 2 works only from localhost [duplicate]

I'm trying to set up a web service that needs the user's Google Latitude info, so I'm using Google OAuth to handle the user authorization.
However, when trying to set the redirect URI in the Google APIs Console for a web application client ID, I get an error message if I try to set it to 'http://PUBLIC_IP/'.
I need to test it with non local users (thus localhost can't be used), so I would like to know if having a web domain is mandatory in order to use Google's OAuth. If not, how can I solve this issue?
This is not currently supported. I filed a feature request and will update on progress.
Update: Essential app verification activities have continued to make support of IP address-based apps unlikely. These verification activities are necessary to provide protections against abuse of user accounts. In addition, the cost of setting up dedicated domains has been reduced significantly since this feature was requested. Please read other responses here about possible options.
You can use xip.io to work around it.
For example: '192.168.0.50.xip.io:3000' will resolve to '192.168.0.50:3000'
I ran into this issue too, so I entered a URL with a .com extension and also added it to my /etc/hosts file. Works like a charm.
It totally sucks that my entire app now has to be developed on an apparently 'live' domain though.
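Concretely, the hosts-file entry might look like this (the domain name is made up; it just has to match what you register in the Google APIs Console):

    # /etc/hosts
    127.0.0.1    dev.myfakeapp.com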
I used my public hostname. It helps if you have a static IP address. I used http://www.displaymyhostname.com/ to get my hostname. I plugged it straight into the Authorized JavaScript origins field when I created a new Web Application Client ID.
P.S. My hostname looked something like this: 111.111.111.111.static.exetel.com.au
You can use dynamic DNS. I used ddns.net, which offers a free solution. Basically, you enter your FQDN, e.g. yourcompany.ddns.net, as your domain. When that name is looked up, ddns.net checks its database for your entry and returns your IP. So mine looks like this: https://wigwam.ddns.net and everything works fine. You don't need to buy a domain; you can substitute your known IP, and Google is happy with that.
Your IP must be static, of course.
Yes, as of now you still need to have a domain name to use Google OAuth in your application. If you have a static public IP and don't want to buy a domain name, you could use a free subdomain from FreeDNS to link to your public IP. Seemed to work well enough for me with a Django app.
Echoing what Breno said in response to his earlier comment:
Apologies for the lack of updates here. Essential app verification activities have continued to make support of IP address-based apps unlikely. These verification activities are necessary to provide protections against abuse of user accounts. In addition, the cost of setting up dedicated domains has been reduced significantly since this feature was requested. Please read other responses here about possible options.
You can read more about Google's app verification requirements [1] and Google's policies requiring secure handling of data [2].
[1] https://support.google.com/cloud/answer/9110914?hl=en
[2] https://developers.google.com/identity/protocols/oauth2/policies#secure-response-handling.
xip.io is not working anymore; as an alternative you can use nip.io the same way, for example:
10.0.0.1.nip.io:8000 will resolve to 10.0.0.1:8000
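You can verify the wildcard DNS mapping yourself, assuming you have dig installed:

    $ dig +short 10.0.0.1.nip.io
    10.0.0.1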
It seems like xip.io is down, but there are alternatives such as sslip.io and nip.io. However, I couldn't get either of these to work.
I ended up hosting the main file server on the main machine, and ran said server on a 192.168.1.xx IP address. I then ran servers on each of the test machines (including a second server on the main machine), all of which were on the localhost address. Any requests that the localhost servers received were then passed off to the 192.168.1.xx server, which allowed testing on all of the devices.
This should also work with public facing IP addresses.

Rails: Detecting bot IPs to get around shortener pings

I have an application that logs clicks by users. The problem is, these clicks are being pushed through Twitter, which shortens every single link with t.co. Because of this, Twitter appears to hit the link between 7-15 times from different IPs, probably to do things like logging and SPAM protection. The issue is that this logs 7-15 "clicks" on my app that didn't come from actual users.
I am wondering if there is a way to detect if a visit is coming from an actual user or is simply being cURL'd or something of the sort from a bot or spider.
The one method that seemed like it could have worked was using the http://www.projecthoneypot.org/ API to see if the IPs hitting my site come from known bots. I found a gem to help (http://cl.ly/GlT8) but kept getting a NET DNS error while trying to use it.
I am fresh out of ideas. Would really appreciate any assistance!
Twitter should set its User-Agent: HTTP header properly, so you can filter those requests out. This can be forged, of course, but it's a start.
You can obtain the header in Rails with request.headers["User-Agent"].
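For example, a minimal sketch of a Rails controller that skips logging for bot-looking user agents (the controller name, log_click helper, and pattern are hypothetical; remember the header can be forged):

    class ClicksController < ApplicationController
      # Rough pattern for common crawlers; Twitter's link checker identifies
      # itself as "Twitterbot" in its User-Agent header.
      BOT_UA = /twitterbot|facebookexternalhit|bot|crawler|spider/i

      def show
        log_click unless bot_request?   # log_click is a hypothetical helper
        # ... render or redirect as the app normally would ...
      end

      private

      def bot_request?
        request.headers["User-Agent"].to_s =~ BOT_UA
      end
    end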

If I take my website offline for maintenance, does that ruin my Google Juice (tm)?

If I take my website offline (e.g. for an IIS7 site, I'm using the app_offline.htm file), then all requests go to my maintenance page.
But Google (and other search engines) don't 'know' that: they try to hit http://www.blahblah.com/whatever and get the maintenance page back (with a 404 HTTP status, which IMO is bad; shouldn't it be a 5xx Server Unavailable? But that's another debate for another day...).
Anyway, as Googlebot crawls my site while it's offline, will Google think my site has bad pages/links/etc. and, as such, damage my Google Juice score/rating/magic stuff?
Are there tricks to tell Googlebot 'easy, tiger! my site's offline, so be nice to me because you're not going to find anything to crawl'?
You should respond with a 503 Service Unavailable with a Retry-After header. See:
http://googlewebmastercentral.blogspot.com/2011/01/how-to-deal-with-planned-site-downtime.html
http://googlewebmastercentral.blogspot.com/2006/08/all-about-googlebot.html
http://www.seroundtable.com/archives/015171.html
http://www.blogstorm.co.uk/handle-googlebot-during-site-downtime/
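For illustration, the maintenance response Googlebot should receive would look roughly like this (the Retry-After value is just an example; it can be a number of seconds or an HTTP date):

    HTTP/1.1 503 Service Unavailable
    Retry-After: 3600
    Content-Type: text/html

    <html><body>Down for maintenance, back shortly.</body></html>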