Google Webmaster Tools: Fetch and Render - seo

I have a question about one of the Google Webmaster Tools features - Fetch as Google -> Fetch and Render.
I get some errors about my resources when I use it. If I use Fetch only, I get a "Complete" status, and that is great. But if I use Fetch and Render, I get a "Temporarily unreachable" status for some of my images, even though an image is reachable if I click its link. There can be a lot of images on my pages, and I think this feature has a limit on the number of external resources it will load to render the screenshot. What do you think? Has anyone encountered this problem? I also wonder how much this feature has in common with the real Googlebot engine. Does Googlebot get these errors too or not? Should I worry about it or not?
Google says:
About "Partial" status - "You can assess the gravity of the situation by clicking through the missing resources. A fetch with the Partial status could mean a small problem with the page (e.g. few or insignificant resources could not be retrieved)"
But when I click on a URL that is marked as unreachable, it is reachable.
For example:
http://cdn.zenfolio.net/img/s10/v109/p458738824-2.jpg?sn=2YH Image Temporarily unreachable
http://cdn.zenfolio.net/img/s5/v124/p533056357-2.jpg?sn=2YH Image Temporarily unreachable
http://cdn.zenfolio.net/img/s5/v119/p79952255-2.jpg?sn=2YH Image Temporarily unreachable
http://cdn.zenfolio.net/img/s5/v132/p201642078-2.jpg?sn=2YH Image Temporarily unreachable
http://cdn.zenfolio.net/img/s7/v152/p126202968-2.jpg?sn=2YH Image Temporarily unreachable
http://cdn.zenfolio.net/img/s5/v124/p189570842-2.jpg?sn=2YH Image Temporarily unreachable
http://cdn.zenfolio.net/img/s7/v155/p124919933-2.jpg?sn=2YH Image Temporarily unreachable
You can use this URL to test: http://www.photographercentral.com/photographers/us/louisiana/new-orleans
Thanks

The engine that Google uses for Fetch and Render (FNR) is not a standard browser. Quite often a valid webpage is reported by FNR as an error / not reachable. Partial loading is fine, but if you see an error while fetching, or the page loads incorrectly, here are the measures you can take to debug the issue:
1) Create a page on your site that prints the user agent, browser version, etc., and open that page using FNR (a minimal sketch of such a page is below). This way you will know which browser it is using, and you can test and fix the problem for that browser.
2) You can use an error aggregator tool (we use track.js; others include Rollbar, etc.) and load a particular page in FNR with a unique query parameter, e.g. www.example.com/pageurl?bot=93. You can then easily search for bot=93 in the track.js error list and see exactly which errors are generated for that page.
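For idea 1), here is a minimal sketch of such a page, assuming a Node/Express backend; the /whoami route name and the port are made up for illustration:

const express = require('express');
const app = express();

// Echo the request's User-Agent so that fetching this page with
// Fetch and Render reveals which browser/engine Google renders with.
app.get('/whoami', (req, res) => {
  res.type('text/plain').send('User-Agent: ' + req.get('User-Agent'));
});

app.listen(3000);

Loading the page through FNR then shows you the exact user agent string its rendering engine sends.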

Just delete www from your address, then you can see a complete Fetch and Render.
Change the preferred domain: if it's with www, delete the www, and if it's without www, add it.
It works for me.


Ctrl+F5 shows me two different types of display

I'm currently working on a website.
Attached is the part of the website as I see it on my computer monitor.
The changed display below is what I want.
However, if I keep hitting Ctrl+F5, the screen shows me either the unchanged display or the changed display.
I have no idea why it shows me two different types of screen.
As far as I know, Ctrl+F5 clears the cache and fetches fresh data, but that is not what is happening for me.
Strangely, if I keep hitting F5, I only get the changed display, as I want.
I guess I have a problem with CSS because I get an error message: DevTools failed to load SourceMap: Could not load content for http://localhost:8090/asset/css/sub.css.map: HTTP error: status code 404, net::ERR_HTTP_RESPONSE_CODE_FAILURE.
Does anyone have a clue about this problem?
There's not enough info here for anyone to tell you how to fix your problem with any amount of certainty.
What do you mean by changed display? Just the website looking different? Or a different debug preview (like scaling your website down to a phone's screen size)?
If it looks different, that might just have to do with caching.
The difference between CTRL + F5 and just F5 is that CTRL + F5 doesn't use your browser's cache, instead fetching everything fresh from the server, whereas just F5 uses your browser's cache. Your browser generally keeps track of when it cached things and will automatically fetch data anew if the time of caching was too long ago.
The former generally takes longer to load, naturally, which might be why the website looks different, at least until everything has been loaded.
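For context, whether a plain F5 can reuse a cached copy at all is driven by the caching headers the server sends with each asset. Here is a minimal sketch, assuming a Node/Express static server; the /asset path only mirrors the error message above, and the max-age value is purely illustrative:

const express = require('express');
const app = express();

// Assets served with a max-age can be reused from the browser cache on a plain F5;
// Ctrl+F5 bypasses the cache and refetches them regardless of these headers.
app.use('/asset', express.static('public', { maxAge: '1h' }));

app.listen(8090); // matches the localhost port in the error message, nothing more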
Other than that, CSS gets a little weird sometimes, applying styles in a weird order. This generally has to do with the order in which, and where in your HTML document, you actually load your stylesheets. Generally, loading all of them in the head of the document is a good idea. Complete redefinitions of styles in separate stylesheets can get very weird, even if they should follow normal precedence (Thread on CSS precedence)
Though, again, you'll have to elaborate on your problem further, maybe provide some screenshots, for anyone to be able to definitively help you.

video.js Change Src When Seeking Without Resetting Playback

My video src is an AWS presigned request URL, which expires after x amount of time. The video starts playing just fine in video.js. For large video files, after the brief URL expiration time, changing the seek bar causes a network error because the original src link has expired. How do you refresh the src with another unexpired presigned URL without restarting from the beginning of the video? I don't want the video to go back to the beginning.
So far I have found that you can capture the change of the seek bar by listening for the event 'timeupdate' and in the passed event testing for e.manuallyTriggered.
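Roughly, that looks like this; 'my-video' is a placeholder element id, and manuallyTriggered is simply what I observe on the event object, so it may vary between video.js versions:

var player = videojs('my-video');
player.on('timeupdate', function (e) {
  // manuallyTriggered appears on events fired by a seek rather than by normal playback
  if (e && e.manuallyTriggered) {
    console.log('seek detected at', player.currentTime());
  }
});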
Thanks
I had this same issue today. I'm using Plyr instead of video.js, so I'm not sure if you can do this exact thing, but my solution was:
Bind an error handler to the player for when the link has expired and someone tries to play/seek, and then in the handler (see the sketch below)...
1) store the current time of the video
2) send an ajax request to my server to get an updated signed URL
3) update the source of the player
4) set the current time of the video to the previously stored time
It's kind of slow/clunky, but it's the best fix I could come up with at the moment, aside from loading the entire video before allowing playback (which didn't seem like great UX).
Update: this does work with video.js... but it doesn't work with either player in Safari, which apparently doesn't fire the error event at all.
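For reference, a rough video.js sketch of those four steps; the /refresh-url endpoint, the videoId variable, and the 'video/mp4' type are assumptions you would adapt to your own backend (and, per the update above, Safari may never fire the error event at all):

var player = videojs('my-video');   // placeholder element id
var videoId = 'abc123';             // hypothetical identifier your server understands

player.on('error', function () {
  var resumeAt = player.currentTime();                         // 1) store the current time
  fetch('/refresh-url?video=' + encodeURIComponent(videoId))   // 2) request a fresh signed URL
    .then(function (res) { return res.json(); })
    .then(function (data) {
      player.src({ src: data.url, type: 'video/mp4' });        // 3) update the source
      player.one('loadedmetadata', function () {
        player.currentTime(resumeAt);                          // 4) restore the stored time
        player.play();
      });
    });
});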

Import.io > Extractor: page never loads, so I cannot extract data

Import.io is working pretty well, but there is one website I would like to extract data from. When I start the extractor and enter the URL http://restaurant.michelin.fr/restaurants/france/75000-paris/restaurants-michelin/page-4/, it loads. But then I press the ON button and the page won't load, nothing is displayed... just a blank page that looks like it's still loading. What can I do in that case? I've also tried the crawler, but got the same result. I restarted the program and the computer, but it's always the same issue. Thanks a lot.
The import.io desktop app browser uses Firefox 24. A few websites aren't compatible with that browser, and this appears to be what is happening in this case.
It does however work in Magic! https://magic.import.io/
Once you have published the Magic API, you can then use the tools in MyData such as Bulk and Chain to add more URLs.
I have just tried to save a Magic API and it worked a treat. The only disadvantage here is that you won't be able to edit the columns until after you have extracted the data.

Show a single thumbnail when posting on facebook

A little background info: my team and I developed a website for a real estate agency, and I've been assigned the task of setting the image of the currently selected property for Facebook's sharing feature.
The webpage for the property is dynamic as there are several listings, so what I've done is select the first image that is loaded on the page and set it to the og:image meta tag.
Now let's say I copy the URL and post it on Facebook, it'll show the correct thumbnail, HOWEVER, it'll also show multiple thumbnails from other listings.
All images on the website are over 200 x 200px and are within an aspect ratio of 3:1.
My question is, how do I tell Facebook to take only my initial image and not grab the others while it's at it?
Is there perhaps a SelectSingleImage property that I can apply?
I've already spent more time searching for the answer to this issue than I would have liked, so thanks for any help provided, it's much appreciated.
One method I use sometimes is to recognize Facebook's server and simply provide it with different data. This way you can actually only have one image on the page (as far as Facebook knows).
I don't know anything about vb.net, but here is a simple code sample in PHP. All it does is perform a regular expression on the user agent of the request to match it against the string "facebook".
// Flag requests coming from Facebook's crawler by checking the User-Agent header.
$isFacebook = false;
if (preg_match("/facebook/", strtolower($_SERVER["HTTP_USER_AGENT"]))) {
    $isFacebook = true;
}
// When $isFacebook is true, render the page with only the og:image you want shared.
Facebook may very well change their user agent signature one day, but for now I'm pretty sure you'll be safe; just keep synced with the Developers Blog and the Roadmap.
It seems that Facebook saved those images in its cache for some bizarre reason, but to resolve this issue all I had to do was enter the URL into Facebook's Linter tool, which in turn cleared the cache on their server.

Facebook Page Tab app not appearing to non-page admin users

I am experiencing a problem similar to 2 others who have recently written about this issue.
A newly created Page Tab Facebook app displays for admin users but not for regular users.
I only have 8 page tabs currently so it cannot be that there are too many.
Also, sandbox mode is disabled (I have tried both enabled and disabled).
Can anyone think of a reason that this might be occurring?
I added the tab with the code:
http://www.facebook.com/dialog/pagetab?app_id=YOUR_APP_ID&next=YOUR_URL
Could it have something to do with https in the url as opposed to http?
I am at a loss and do not know how to go about solving this issue.
Any ideas, however far-out they may be, would be appreciated.
Thank you to anyone who might think they can help...
Have you forgotten to take the app out of Sandbox Mode? (try toggling it just in case, even if it's not in sandbox mode, as this is by far the most likely explanation here)
Also, check there aren't demographic restrictions set on the app or page via the API, as in this case only the users that meet the restrictions will see the tab. (though the admins will always see it)
Also check that you've configured the Page Tab URL and Secure Page Tab URL settings for the app correctly; if a user is browsing over HTTPS and the app doesn't support it, they won't see the tab.
If the tab isn't appearing at all, it's almost certainly one of those problems. If it's displaying but the content isn't rendering, also check your code and make sure it isn't failing fatally when checking the signed_request for non-admins, or something like that.
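To illustrate that last point: the tab's backend should treat a missing or invalid signed_request as a normal, non-admin visitor instead of throwing. A minimal sketch, assuming a Node/Express backend (Node 16+ for the base64url encoding) and an APP_SECRET environment variable, both of which are assumptions rather than details from the question:

const crypto = require('crypto');
const express = require('express');
const app = express();

app.use(express.urlencoded({ extended: false }));

function parseSignedRequest(signedRequest, appSecret) {
  if (!signedRequest) return null;                       // tab loaded without a signed_request
  const [encodedSig, payload] = signedRequest.split('.');
  if (!encodedSig || !payload) return null;
  const expected = crypto.createHmac('sha256', appSecret)
    .update(payload)
    .digest('base64url');                                // Facebook signs the payload with HMAC-SHA256
  if (expected !== encodedSig) return null;              // bad signature: treat as anonymous
  return JSON.parse(Buffer.from(payload, 'base64url').toString('utf8'));
}

app.post('/page-tab', (req, res) => {
  const data = parseSignedRequest(req.body.signed_request, process.env.APP_SECRET);
  // Render the tab either way; only branch on admin data when it is actually present.
  const isAdmin = Boolean(data && data.page && data.page.admin);
  res.send(isAdmin ? 'Admin view' : 'Visitor view');
});

app.listen(3000);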