noscript text is appearing in Google - SEO

I have added this at the bottom of my HTML (just like how Stack Overflow has it implemented):
<noscript>This site works best with JavaScript enabled</noscript>
but on one of my pages that has very little text, this noscript message appears in the Google search snippet.
Is there a way to tell Google to avoid indexing this part? Or is there a better alternative to using the <noscript> tag?

The issue is that Google often won't render JavaScript. It can, but it often won't.
You either need to present a pre-rendered page or provide a meta description that accurately describes the content. Look up meta description tags and how Google uses them to embellish its search listings.
Other options, such as snippet-control directives, can discourage Google from deviating from the provided description. However, a pre-rendered page for it to scrape is always more reliable.
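As a rough sketch of that head markup (the description text is made up, and the nosnippet directive is one assumption about which snippet controls were meant):
<head>
  <!-- An accurate description Google can use for the search snippet -->
  <meta name="description" content="A short, accurate summary of this page's content.">
  <!-- Optional: ask Google not to generate a snippet at all -->
  <meta name="robots" content="nosnippet">
</head>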

Related

Googlebot and "hidden" content inside dynamically shown (JS-based) tabs within a page - Impact on SERPs?

Let's say someone has 'legitimately' hidden content within a page.
To explain this further, imagine the following:
<div id="tab-one">This is the content inside tab one</div>
<div id="tab-two">This is the content inside tab two</div>
Tab one
Tab two
From an SEO perspective, assuming that none of this is done to manipulate Google and that "tab two" in fact contains spam-free, relevant data, how does this impact SEO?
Will Googlebot index and consider the 'hidden' content as part of the content of the page?
Will it use this content in the same way as though the content was "visible" on the page without the use of JavaScript?
Thanks.
I don't believe there has been an official Google response on this topic, but from experience I can tell you that Google will index the tabbed content just fine. You'll even see SEO traffic from the content. If your site is fairly clean, I wouldn't worry about being flagged as having "hidden content", as long as the content is accessible by user action (e.g. clicking) and obviously clickable.
However, you'll want to consider this. Say, for example, some of the content in a hidden tab is a product description such as "child safe". If a user is looking for "child safe products" and arrives at your site through a search engine, they probably won't immediately see that information because they don't know it's buried behind a tab.
Most users don't spend a lot of time hunting, so they might not find the content and bounce because they don't feel they found the relevant information they were looking for. If you subscribe to the idea that Google and Bing use search query refinements as a search signal, this could potentially "harm" your SEO.
Personally, unless it's truly tertiary information or the tab is crucial to the UX, I wouldn't put content behind one. From my experience, users don't mind scrolling if the information is relevant ... but they tend to have "tab" blindness, and only really interact with "hidden" elements when they're part of the navigation or already in a transactional flow.
P.S. An alternative is to use crawlable AJAX or pushState() to have the individual tabs indexed separately on their own URLs. But you'll want to be careful ... if you're rendering out the main content on the tab "pages", you might have a duplicate content concern. If it makes sense, you can potentially use the rel="next" and rel="prev" spec that Google released (currently only supported by Google).
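A minimal sketch of the pushState() idea, building on the tab markup in the question (the showTab() helper and the URL scheme are hypothetical):
<script>
// Hypothetical helper: reveal one tab and give it its own crawlable URL.
function showTab(id) {
  document.getElementById('tab-one').style.display = (id === 'tab-one') ? '' : 'none';
  document.getElementById('tab-two').style.display = (id === 'tab-two') ? '' : 'none';
  history.pushState({ tab: id }, '', '/tabs/' + id + '/'); // assumed URL scheme
}
</script>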
In Webmaster Tools you will find the option to Fetch as Google. There you can see just how Google is crawling the page. I've noticed some JavaScript carousel libraries are crawled, while others aren't. It's just a matter of how Google is able to read the JavaScript code.
As far as impact goes, it's not like all hidden content is bad. The content is still crawled (as you will see with the fetch). Now, if there were an abundance of keyword-stuffed content, that would be susceptible to a penalty.
Used correctly, it's definitely still beneficial.
The hidden content will be crawled, and this is not a problem for Google; many sites have this kind of menu. Assuming the hidden tabs are not keyword-stuffed and are useful for users, you shouldn't worry about this - it is useful for the user and Googlebot!

SEO - META Tags and Google

I just found out that Google recently decided to start using its own "title" when displaying search results. Also, after checking Yahoo and Bing, I saw that they display their results in the same way as each other, but completely differently from Google.
I guess my question is whether there is an actual "correct" way of adding titles to my pages so that Google displays what I want it to, and this way get the same results as Yahoo/Bing, which currently use the page's title as the search result title (sometimes they pick up the first heading tag and use it as the title).
Any recommendations or links to follow for more studying would be appreciated.
There's nothing you can really do about it. Google will choose which title to display based on criteria they have not made public. This is usually the page's title as found in the <title> tag, but if Google feels a different title better summarizes the page's content, they may choose to display something else.
You can try to change your page titles to better reflect the page's content and see if that helps.
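For what it's worth, a minimal sketch of a head that gives Google little reason to substitute its own title (the site name and description here are made up):
<head>
  <title>Blue Widgets - Acme Store</title>
  <meta name="description" content="Browse our range of blue widgets, with specifications and pricing.">
</head>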
Use optimal keyword prominence in your meta tags according to the guidelines, and Google will pick up your meta tags. See our news portal's source and meta tags (keywords: hírek, választás 2014, etc.): http://valasztas2014.hir24.hu/

SEO: Can dynamically generated links be crawled?

I have a page containing <div> tags with onclick="" code that makes an AJAX request for JSON data and then iterates through the results to form links (<a />) appended to the page. These links do not exist anywhere else on my website. How can I make these dynamically generated links crawlable?
My initial thought was to turn the <div> tags into <a> tags with href="#", but with my limited knowledge of how typical crawlers work, I don't think this would solve my problem, since the "#" would be what's recognized by the crawler, not necessarily the dynamically generated output. Besides, I don't want the scroll position to be altered at all, which also rules out giving the <a> tag an id and having it reference itself.
Do I have any options aside from making a new page containing all of the links I need to be crawled? Thanks.
As a general rule, content that is created or made available through JavaScript cannot be found or indexed by search engines. Google does support crawlable AJAX, but using it as the only means of accessing your content is bad for accessibility. Also, other search engines can't get to that content, which is also not a good thing. Basically, crawlable AJAX is a bad thing.
You should always make your content available without requiring JavaScript to get it. Then you can improve your site by adding JavaScript to make getting the content faster or easier. This is called Progressive Enhancement and is how good websites are built.
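A minimal sketch of that approach (the URLs and the /items.json endpoint are hypothetical): the links exist as plain HTML that any crawler can follow, and JavaScript only enhances the page.
<!-- Real, crawlable links rendered by the server -->
<ul id="results">
  <li><a href="/items/1/">Item one</a></li>
  <li><a href="/items/2/">Item two</a></li>
</ul>
<script>
// Enhancement only: fetch fresh results as JSON and re-render the list.
// Crawlers that never run this script still see the server-rendered links above.
fetch('/items.json')
  .then(function (response) { return response.json(); })
  .then(function (items) {
    var list = document.getElementById('results');
    list.innerHTML = '';
    items.forEach(function (item) {
      var li = document.createElement('li');
      var a = document.createElement('a');
      a.href = item.url;
      a.textContent = item.title;
      li.appendChild(a);
      list.appendChild(li);
    });
  });
</script>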

SEO - Do Google and other search engines index links within <noscript> tags?

I have set up some dropdown menus allowing users to find pages on my website by selecting options across multiple dropdowns:
e.g. color of car, year
This would generate a link like: mysite.xyz/blue/2010/
The only problem is that because this link is dynamically assembled with JavaScript, I've also had to assemble each possible combination from the dropdowns into a list like:
<noscript>
No javascript enabled? Here are all the links:
<a href='mysite.xyz/blue/2009/'>mysite.xyz/blue/2009/</a>
<a href='mysite.xyz/blue/2010/'>mysite.xyz/blue/2010/</a>
<a href='mysite.xyz/red/2009/'>mysite.xyz/red/2009/</a>
<a href='mysite.xyz/red/2010/'>mysite.xyz/red/2010/</a>
</noscript>
My question is, if I put these in a <noscript> tag like this, will I be penalized by search engines such as Google? I've already been doing so for some navigational stuff which required offsets etc. However, now I would be listing a whole list of links here too. I want to provide them here mostly so that Google can actually index my pages - but for those without JavaScript, they can still navigate too.
Your thoughts? Also, even though I have some links that appear to have been indexed, I AM NOT 100% SURE, which is why I'm asking :P
If the noscript code represents an alternative to the JavaScript code, then it should be fine, I think, but Google does try to spot fishy SEO and may penalize, so it's better to avoid doing this when possible.
In your case, consider spending some time making a drop-down menu such that the links are on the page in a list, with JavaScript + CSS used to simulate the drop-down; this way you will not need to use the noscript tag.
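A rough sketch of that suggestion, reusing the URLs from the question (the class names and toggle script are made up): the links are ordinary anchors in a list, and JavaScript only controls visibility.
<div class="dropdown">
  <span class="dropdown-label">Blue cars</span>
  <!-- Real links a crawler can follow, shown/hidden by script -->
  <ul class="dropdown-items" style="display: none">
    <li><a href="http://mysite.xyz/blue/2009/">Blue, 2009</a></li>
    <li><a href="http://mysite.xyz/blue/2010/">Blue, 2010</a></li>
  </ul>
</div>
<script>
// Toggle the list when the label is clicked; no noscript fallback is needed,
// because the links are already in the markup.
document.querySelector('.dropdown-label').addEventListener('click', function () {
  var items = document.querySelector('.dropdown-items');
  items.style.display = (items.style.display === 'none') ? '' : 'none';
});
</script>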
A decade ago, I made my website using image links for internal navigation (this was at a time when CSS was brand-new and HTML4 Transitional was normal). I then added text navigation links at the bottom of the page.
I believe this (and your idea) is a common enough technique that, as long as you really aren't trying to do something sketchy, Google et al. should interpret it correctly.
I think the noscript tag is irrelevant, but having a giant list of links may make their algorithms think you're doing some fishy SEO, like having a wall of keywords.
Google (or whoever) will index these, and as long as you're not going overboard with a bunch of BS links, I don't see a problem. Though from an SEO standpoint, it's not good to create menus from JavaScript or Flash. I might look for an alternative that uses anchor tags with some CSS to dress them up.

Scribd Search Engine Optimization Features for PDF

I recently noticed that PDF documents on Scribd are also SEO-friendly for search engines. For example, the link http://www.scribd.com/doc/17135767/FREE-by-Chris-Anderson
If you open the page and view the HTML source code, the plain text from the PDF is not present. However, if you open the cached version of the page from Google search, there is an html_wrapper tag which contains the text of the entire PDF document.
Do they display different content depending on the User-Agent that makes the request, e.g. browser or bot?
I've heard that some SEO practices don't recommend displaying different content to bots. How bad is this practice from an SEO perspective?
This is what Google sees:
http://webcache.googleusercontent.com/search?q=cache:-LY7o-liYlsJ:www.scribd.com/doc/17135767/FREE-by-Chris-Anderson+site:www.scribd.com/doc/17135767/FREE-by-Chris-Anderson&hl=en&strip=1
Yeah, you should not show Googlebot different content than a human user. That said, there are ways to do acceptable conditional rendering (i.e., render for clients with no cookies, render for clients with no JavaScript, render for clients without a language header, ...). This kind of rendering can be misleading, but if it is not misleading, then it might be OK with Google. If you do this kind of conditional rendering, it is always a question of intent.
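As a small illustration of the "render for no-JavaScript clients" case (placeholder text, not Scribd's actual markup), the key point being that both versions convey the same content:
<!-- JavaScript-enabled clients get the interactive viewer -->
<div id="viewer"></div>
<script>
  document.getElementById('viewer').textContent = 'Full document text rendered here...';
</script>
<!-- Clients without JavaScript (and some crawlers) see plain text instead -->
<noscript>
  Full document text rendered here...
</noscript>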