Improve loading speed without losing search ranking - SEO

I have a webpage with many areas whose visibility can be toggled by the user.
The default visibility state for those areas is hidden (CSS display: none).
I don't have control over what's going to be put inside, but it could include a lot of images.
I saw with Firefox's network monitor that all the images were loaded with the page. This is quite a waste of bandwidth, since the user might choose not to display every area.
I came up with a workaround: I put all that content inside a <script type="late-rendering"></script> element, and to avoid any potential conflict (e.g. a closing </script> tag inside the content) I replace every "<" with "8691jQfdtxm" (a randomly picked string). Then, when the user wants to make an area visible, I fill the area with that content after replacing "8691jQfdtxm" back with "<".
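For reference, the workaround amounts to roughly the following (a sketch only; the element ids, the sample image, and the use of jQuery are my own assumptions, not part of the real page):

<script type="late-rendering" id="area1-source">
8691jQfdtxmdiv>8691jQfdtxmimg src="big-photo.jpg" alt="...">8691jQfdtxm/div>
</script>
<div id="area1" style="display: none;"></div>

<script>
// The browser ignores scripts with an unknown type, so nothing inside is
// parsed as HTML or downloaded until the user asks for it.
$('#toggle-area1').on('click', function () {
    var html = $('#area1-source').text().replace(/8691jQfdtxm/g, '<');
    $('#area1').html(html).show();
});
</script>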
It works fine, but I think proceeding like this will make crawlers (e.g. Google) think my webpage is pure garbage. How can I avoid that?

Unless search engines were heavily relying on the alt tags of your images, or their filenames, there is little risk you will lose search rankings. If your site loads more quickly as a result, it will provide a better user experience, which will probably be detected by Google, and this influences rankings positively.

Google executes a lot of JavaScript these days, and your trick of breaking the HTML with a random string seems hokey to me.
I would preload all the textual content (e.g. have it all in there on first load, with the div hidden via display:none). This content will not count as much as visible content, but it does count.
Then I'd delay loading of the images. For example, make all your images something like:
<img src="blank.jpg" loadlater="realimage.jpg">
blank.jpg can be a tiny placeholder image. When the div opens, you can use JavaScript/jQuery to rewrite each src with the loadlater value, as in the sketch below.
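A minimal sketch of that swap, assuming the hidden area has id="area-one" and is revealed by a button with id="show-area-one" (both ids are placeholders):

<script>
$('#show-area-one').on('click', function () {
    $('#area-one').show();
    // Copy each image's real URL into src once the area is opened.
    $('#area-one img[loadlater]').each(function () {
        $(this).attr('src', $(this).attr('loadlater'));
    });
});
</script>

A data-* attribute (e.g. data-loadlater read via .data()) would be the standards-valid way to spell the same thing.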

Related

Responsive design and content duplication issues

I am working on a responsive design site. I have a large navigation contained in one UL but want to turn it into two ULs on smaller screens.
From a technical point of view, using media queries this doesn't pose a problem, but it means the same links appear twice in the source.
Also, on some of the pages we want to add condensed content for smaller screens. Again, I could put the two variations of the content into two DIVs, with one always hidden depending on the device.
The question I have is about search engines: I am guessing this would be seen as content duplication and could lead to penalties. What would be the best option?
Repeating the navigation twice will not create a real problem: even if Google does not ignore one of the navigations, it will just mean that these links carry a bit more importance on your page.
But duplicate content is bad for an additional reason beyond a possible SEO problem: you will serve the same content twice and waste bandwidth. Whether this is a real problem is debatable, since at any time at least one copy is hidden with CSS display:none, but it is a better idea to find a way to serve the same content for different display widths using only CSS; a rough sketch follows.
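A sketch of the CSS-only route, assuming one copy of the content whose presentation simply changes with the viewport (the class name and breakpoint are illustrative):

<style>
/* One copy of the content in the HTML; only its layout changes. */
.page-content { column-count: 2; font-size: 1em; }

@media (max-width: 600px) {
    .page-content { column-count: 1; font-size: 0.9em; line-height: 1.4; }
}
</style>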

Javascript served tooltips - bad for Google / SEO?

I have a client who wants a feature on his site that he has seen on a competitor's. It is essentially a group of icons where, when you mouse over them, an extended tooltip appears with content, links, etc.
The tooltips are not hidden divs. The tooltip content appears nowhere in the source code of the page itself. I believe the text of the tooltips is being pulled from an external file (e.g. an XML file or some such thing) via JavaScript.
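Mechanically, that usually looks something like the following (a guess at the setup; the file paths, selectors, and use of jQuery are assumptions on my part):

<script>
// On hover, fetch the tooltip text from an external file and show it near the icon.
// Assumes #tooltip is an empty, absolutely positioned element.
$('.icon').on('mouseenter', function () {
    var offset = $(this).offset();
    $.get('tooltips/' + $(this).data('tooltip-id') + '.html', function (html) {
        $('#tooltip').html(html).css({ top: offset.top + 24, left: offset.left }).show();
    });
}).on('mouseleave', function () {
    $('#tooltip').hide();
});
</script>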
My question(s) are this:
a) since the tooltip content isn't actually on the page, does it even affect SEO efforts at all?
b) would Google consider this spam (or at best questionable)?
Many thanks!
a) since the tooltip content isn't actually on the page, does it even affect SEO efforts at all?
It won't affect SEO efforts in the slightest.
b) would Google consider this spam (or at best questionable)?
No.
I should also point out that, from an accessibility point of view, this is pretty bad practice as well.
a) No; content loaded from external scripts won't be considered relevant for SEO, so it's as if you didn't have the extra content at all.
If, on the other hand, your text is in display: none or visibility: hidden, it will affect SEO, but make sure the user has access to the content.
b) No, because you just want to give extra information and it won't be used by Google. Google treats content as spam when it is hidden and the user doesn't have access to it.

Googlebot and "hidden" content inside dynamically shown (js based) tabs within a page - Impact on SERPS?

Let's say someone has 'legitimately' hidden content within a page.
To explain this further, imagine the following:
<div id="tab-one">This is the content inside tab one</div>
<div id="tab-two">This is the content inside tab two</div>
Tab one
Tab two
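For concreteness, the show/hide wiring is something along these lines (the link markup and handler here are only illustrative, not the actual code):

<script>
// Hide the second panel initially, then swap panels when a tab link is clicked.
$('#tab-two').hide();
$('a[href="#tab-one"], a[href="#tab-two"]').on('click', function (e) {
    e.preventDefault();
    $('#tab-one, #tab-two').hide();
    $($(this).attr('href')).show();
});
</script>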
From an SEO perspective, assuming that none of this is done to manipulate Google, and that in fact "tab two" contains spam-free, relevant data, how does this impact SEO?
Will Googlebot index and consider the 'hidden' content as part of the content of the page?
Will it use this content in the same way as if the content were "visible" on the page without the use of JavaScript?
Thanks.
I don't believe there has been an official Google response on this topic, but from experience I can tell you that Google will index the tabbed content just fine. You'll even see SEO traffic from the content. If your site is fairly clean, I wouldn't worry about being flagged as having "hidden content", as long as the content is accessible by user action (e.g. clicking) and obviously clickable.
However, you'll want to consider this: say, for example, some of the content in a hidden tab is a product description such as "child safe". If a user is looking for "child safe products" and they arrive at your site through a search engine, they probably won't immediately see that information because they don't know it's buried behind a tab.
Most users don't spend a lot of time hunting, so they might not find the content and bounce because they don't feel they found the relevant information they were looking for. If you subscribe to the idea that Google and Bing use search query refinements as a search signal, this could potentially "harm" your SEO.
Personally, unless it's truly tertiary information or crucial to the UX, I wouldn't put it behind a tab. From my experience, users don't mind scrolling if the information is relevant ... but they tend to have "tab" blindness, or only really interact with "hidden" elements when they're part of the navigation or already in a transactional flow.
p.s. An alternative is to use crawlable AJAX or pushState() to have the individual tabs indexed separately on their own URLs. But you'll want to be careful ... if you're rendering out the main content on the tab "pages", you might have a duplicate content concern. If it makes sense, you can potentially use the rel="next" and rel="prev" spec that Google released (but only supported by Google right now).
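A bare-bones sketch of that pushState() idea (the URLs, selectors, and the showTab() helper are made up for illustration): each tab click updates the address bar so the tab can be linked to and indexed on its own URL.

<script>
$('a.tab-link').on('click', function (e) {
    e.preventDefault();
    var href = $(this).attr('href');   // e.g. "/widgets/child-safety"
    showTab(href);                     // whatever show/hide logic the tabs already use
    history.pushState({ tab: href }, '', href);
});

// Restore the right tab when the user navigates back/forward.
window.addEventListener('popstate', function (e) {
    if (e.state && e.state.tab) { showTab(e.state.tab); }
});
</script>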
In Webmaster Tools you will find the option to Fetch as Google. There you can see just how Google is crawling the page. I've noticed some JavaScript carousel libraries are crawled, while others aren't. It's just a matter of how Google is able to read the JavaScript code.
As far as impact goes, it's not like all hidden content is bad. The content is still crawled (as you will see with the fetch). Now if there were an abundance of keyword-stuffed content, that would be susceptible to penalty.
Used correctly, it's definitely still beneficial.
The hidden content will be crawled, and this is not a problem for Google; many sites have this kind of menu. Assuming the hidden tabs are not keyword-stuffed and are useful for the users, you shouldn't worry about this - it is useful for the user and Googlebot!

Does changing the order of HTML with Javascript help SEO

On my website, I have a booking widget at the top of each page to allow visitors to enter our booking engine. The code behind it uses quite a bit of HTML, pushing down the content on each page in the source. In an attempt to improve my SEO, I decided to place the code in a DIV tag at the bottom of the page and, when the DOM is ready, use jQuery to physically move the DIV from the bottom of the DOM to the top where it needs to be to render correctly (roughly as sketched below).
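For what it's worth, the move itself is a one-liner once the DOM is ready (the #booking-widget and #header ids below stand in for whatever the real markup uses):

<script>
$(function () {
    // Source order: the widget sits at the bottom of the HTML.
    // On DOM ready, move it up to where it should actually render.
    $('#booking-widget').prependTo('#header');
});
</script>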
My question is whether this is really helping SEO. Does Google look at the DOM/source after all JavaScript has run, or before? Does moving these few hundred lines of HTML to the bottom of the source gain me any advantage?
Spiders do not process JavaScript, so any content that appears, moves, or is created by JavaScript will appear as if it hasn't been moved or created at all.
I'd be really surprised if web crawlers execute the scripts on the page. They probably scan the raw response.
That does not have any effect on SEO.
But placing the JavaScript at the bottom will definitely help your web pages load faster.
There is no harm for SEO either, so you can definitely proceed with your approach.
There is a distinction between JavaScript executed on load versus during the user session. The on-load JavaScript is more often than not indexed by Google. Dynamic content or alterations made on the client side later are not well indexed.
So, it can't be ignored.

How to tell image search which image matters?

Google image search seems to do a poor job on a site I run in identifying which image on a page should be indexed. In addition it doesn't seem to link that image with lots of the associated data.
Are there any ways of focusing spiders' attention on particular images and their associated data? Do they need to be within the same tags, or adjacent on the page?
A few tips:
Use a descriptive name, e.g. "tabby-cat.jpg" instead of "img02396.jpg" (see the example markup after these tips).
Use alt tags on images.
Use descriptive text on the page and around the image.
Make sure the images are in the generated source, i.e. if you click "View source" in your browser, you see <img> tags.
It's also useful to validate your site at http://validator.w3.org in case there are major errors like missing brackets etc that could prevent a spider from parsing the page. (Note: I wouldn't worry about making everything 100% valid since Google is fine with invalid code)
Images in CSS (i.e. backgrounds) are not indexed AFAIK. However I'd suggest using CSS backgrounds for "design" images (a subtle way of getting Google to ignore site headers, custom borders, shadows, etc).
Nor are any images generated from Javascript.
Make sure you're not blocking images through robots.txt. I know that Joomla does this by default.
Sign up at Google Webmaster Tools, add your site, then allow it to be used in Google's "Image Labeller" game which should help tag images.
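Putting the first few tips together, the markup might look roughly like this (the filename, alt text, and caption are only examples):

<figure>
    <img src="/images/tabby-cat.jpg" alt="Tabby cat sleeping on a windowsill">
    <figcaption>Our tabby cat asleep on the kitchen windowsill.</figcaption>
</figure>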
All images on a page should be indexed. If they aren't, then improve your alt tags and possibly rename the image file. There really isn't anything more you can do, since search engines do not read any other context for the image itself except size. If Google thinks the image is a duplicate, it won't index it either.
Of course, if images really do inherit context from the surrounding page, then you could just use fewer images or move them into CSS.
I think search robots cannot read images as we do, so the simplest and most important thing you should do with your images is use descriptive names, so that the spider knows what the image is all about. The second is using ALT tags on images; put in keywords relating to the images.
Those things are what I do.