We sometimes see 404 errors when accessing https://www.googleapis.com/oauth2/v2/userinfo?alt=json in our application.
apiclient.errors.HttpError: <HttpError 404 when requesting https://www.googleapis.com/oauth2/v2/userinfo?alt=json returned "Not Found">
The overall rate is low, though it is a bit weird to be getting back this error given that this endpoint does exist.
Can you please try again? I think this was a temporary outage.
Thanks for reporting the issue.
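In the meantime, if you want to be resilient to blips like this, a minimal retry sketch might look like the following (assuming the Python google-api-python-client shown in the traceback; the service object is whatever you built with build('oauth2', 'v2', ...), and the retry counts are illustrative):

    import time
    from apiclient.errors import HttpError

    def get_userinfo(service, retries=3, delay=1):
        # Retry transient failures from the userinfo endpoint with
        # simple exponential backoff; persistent errors are re-raised.
        for attempt in range(retries):
            try:
                return service.userinfo().get().execute()
            except HttpError:
                if attempt == retries - 1:
                    raise
                time.sleep(delay * (2 ** attempt))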
Related
I've been looking into New Relic, BugSnag, and a few others, but none of them seem to catch 404 errors in their client-side monitoring. For example, say I have the following script tag in a given page:
<script src="//cdn.example.com/app.js1445291270"></script>
with the minor but critical typo of a missing ? after .js (it should be app.js?1445291270).
This of course returns a 404 and the script never loads.
Are there any services that would catch client-side errors like this?
You should be able to do this with Bugsnag. You can detect when the status is 404, and use Bugsnag.notifyException() to send the error in. Hope this helps!
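For example, a rough sketch (assuming the legacy Bugsnag JS notifier, which exposes Bugsnag.notifyException(); note that the error event itself doesn't carry the HTTP status, so in practice you detect the load failure and report it, 404s included):

    // Resource load errors don't bubble, so listen in the capture phase.
    window.addEventListener('error', function (event) {
      var el = event.target;
      if (el && (el.tagName === 'SCRIPT' || el.tagName === 'LINK' || el.tagName === 'IMG')) {
        Bugsnag.notifyException(
          new Error('Failed to load resource: ' + (el.src || el.href)));
      }
    }, true);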
TrackJS error monitoring supports this use case. Here is how to add monitoring for resource loading failures.
So I've got a problem where a small percentage of incoming requests are resulting in "400 bad request" errors and I could really use some input. At first I thought they were just caused by malicious spiders, scrapers, etc. but they seem to be legitimate requests.
I'm running Apache 2.2.15 and mod_perl2.
The first thing I did was turn on mod_logio and interestingly enough, for every request where this happens the request headers are between 8000-9000 bytes, whereas with most requests it's under 1000. Hmm.
There are a lot of cookies being set, and it's happening across all browsers and operating systems, so I assumed it had to be related to bad or "corrupted" cookies somehow - but it's not.
I added \"%{Cookie}i\" to my LogFormat directive hoping that would provide some clues, but as it turns out, half the time the 400 error is returned the client doesn't even send a cookie. Darn.
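For reference, the directive ended up looking something like this (the leading fields are just our standard common log format; the trailing cookie field is the addition):

    LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Cookie}i\"" cookie_log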
Next I fired up mod_log_forensic hoping to be able to see ALL the request headers, but as luck would have it, nothing is logged when this happens. I guess Apache returns the 400 error before the forensic module gets a chance to log?
By the way, when this happens I see this in the error log:
request failed: error reading the headers
To me this says Apache doesn't like something about the raw incoming request, rather than a problem with our rewriting, etc. Or am I misunderstanding the error?
I'm at a loss where to go from here. Is there some other way that I can easily see all the request headers? I feel like that's the only thing that will possibly provide a clue as to what's going on.
We set a lot of cookies, and it turns out we just needed to bump up LimitRequestFieldSize, which defaults to 8190 bytes. Hope this helps someone else some day...
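For anyone hitting the same wall, the change is a one-liner in httpd.conf (the value below is illustrative; size it to your actual cookie volume):

    # Each request header line (e.g. the Cookie header) may be at most
    # this many bytes; Apache's default is 8190.
    LimitRequestFieldSize 16380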
I've come across 404 errors a few times, and I have difficulty debugging this kind of problem.
What strategies and tools are available to analyse such problems (Firebug, logs, ...)?
How can I differentiate between the possible causes and fix them:
a page that doesn't exist, a wrong path, redirection and rewriting, a server problem, ...
A 404 error code means that a file was not found, for whatever reason.
Just check that the file exists and that the path you use is right.
If you want more detail about why a request failed, you can analyse the request and response headers and bodies in your browser's developer console.
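For example, from the command line you can see the exact status and headers with curl (the URL is illustrative, output trimmed):

    $ curl -sv http://example.com/missing.html -o /dev/null
    > GET /missing.html HTTP/1.1
    > Host: example.com
    ...
    < HTTP/1.1 404 Not Found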
Something is accessing my site using this URL:
/(Yvax:%20uggc:/jjj.tbbtyr-nanylgvpf.pbz/hepuva.wf)uggc:/jjj.tbbtyr-nanylgvpf.pbz/hepuva.wf
So I get this error:
[error] [exception.CHttpException.404] exception 'CHttpException'
Is it something to worry about?
How can I stop Yii from reporting this specific error and other similar unwanted 404 errors?
Probably just a scanner script. We get hit by these all the time. Review security, but as long as that checks out you should be fine.
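If you also want to keep these out of your logs, one option is to exclude that log category in your application config (a sketch assuming Yii 1.1's standard log component; CLogRoute's 'except' property is available since Yii 1.1.1, and the category name comes straight from the error line above):

    // protected/config/main.php (excerpt)
    'log' => array(
        'class' => 'CLogRouter',
        'routes' => array(
            array(
                'class' => 'CFileLogRoute',
                'levels' => 'error, warning',
                // Don't log 404 exceptions such as the scanner hits above.
                'except' => 'exception.CHttpException.404',
            ),
        ),
    ),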
If you have to take a site down for some type of unavoidable maintenance task (and it's not a big enough site that you have a backup server), what HTTP status code should you have your server return to minimize the possibility that search engines will think the site is gone?
I found this list of status codes from W3C, of which the following seem applicable:
503 Service Unavailable
500 Internal Server Error
408 Request Timeout
404 Not Found
I think 503 is the most appropriate, but I don't know what search engines might prefer.
From the horse's mouth:
If my site is down for maintenance, how can I tell Googlebot to come back later rather than to index the "down for maintenance" page?
You should configure your server to return a status of 503 (network unavailable) rather than 200 (successful). That lets Googlebot know to try the pages again later.
Don't send a 404 -- they may remove you from their index.
I'd probably send a 503 and an appropriate Retry-After header, although I don't know if anything actually uses it.
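On Apache, for example, a maintenance-mode sketch might look like this (assuming mod_rewrite and mod_headers are enabled; the page path and retry interval are illustrative):

    ErrorDocument 503 /maintenance.html
    RewriteEngine On
    # Serve the maintenance page itself normally; 503 everything else.
    RewriteCond %{REQUEST_URI} !=/maintenance.html
    RewriteRule ^ - [R=503]
    # Tell well-behaved clients when to retry (in seconds).
    Header always set Retry-After "3600"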
According to Google the 503 code would be the way to go, since it means "the server is temporarily unavailable."
Also check out the W3C page on the same topic.