"Inline" Google sitelinks - seo

I want to achieve this when a site is searched for on Google (sub-links below the description).
Are these Google Sitelinks?
From what I've seen while researching this, sitelinks are usually larger and sit side by side, as shown in the image in this question.
If these aren't Sitelinks, can they be defined and how would this be done?

Yes, these are sitelinks. The large two-column sitelinks mainly appear for the homepage, or for other pages with high PageRank.
The little links that appear beside each other are also sitelinks; they show for pages with lower PageRank, thinner content, or a poorer HTML structure.
You can't control which links appear on Google; many factors affect them, such as HTML structure, PageRank, content, CTR, and the search query.
You can only remove them, by demoting a specific link for a specific page in Google Webmaster Tools.

These are one-line sitelinks, introduced in April 2009.
They are similar to the "full two-column" sitelinks, but one-line sitelinks can appear for any result, not only the first one.


Google search results site map?

I was wondering how to achieve the following when searching for my website on Google. I've tried searching around for it, but I'm not sure what the exact term is, so I haven't gotten anywhere.
Basically, when my website is searched for on Google, I'd like the subpages to be listed as shown in the image below, instead of coming up as separate results. Is this possible, or is it something that Google does for you?
Take a look at this screenshot:
Google calls them sitelinks.
You can’t enforce them currently:
We only show sitelinks for results when we think they'll be useful to the user. If the structure of your site doesn't allow our algorithms to find good sitelinks, or we don't think that the sitelinks for your site are relevant for the user's query, we won't show them.
At the moment, sitelinks are automated.
To encourage Google to display them for your site, see this question on Webmasters SE:
What are the most important things I need to do to encourage Google Sitelinks?
They also have a "sitelinks" tag.

Schema.org siteNavigationElement

I'm having trouble getting the Webmaster Tools rich snippet testing tool to properly return markup for schema.org's WebPageElement types.
http://schema.org/WebPageElement
Does anyone have a site that hosts this markup?
I'm looking for solutions for a website that has undesirable snippets returned on Google search. The website is an interactive library of slide presentations, with an advanced search function.
Many different search pages on this site are being dropped from the Google index every week. The snippet returned on these pages includes the navigation menu. There is no h1 tag and the first line of the navigation menu is in bold, so Google is identifying the menu as the main content of the page and returning this info in the search results.
I need Google to put the actual page content in the search results, to increase click through rate and resolve a probable duplicate content issue.
I thought it would be good to put an h1 tag on the site, and add schema for WebPageElement, SiteNavigationElement, WPHeader, WPFooter, and WebPage.
Does anyone have examples of this markup on their site?
In the past I've used the rich snippet tool and had it return errors, and in every instance I found that my code did indeed contain an error, so I don't think the tool is at fault.
I have implemented several of the schema.org WebPageElement types on http://gamesforkidsfree.net/en/, including SiteNavigationElement.
You can check how it is being recognized by Google in the Rich Snippets Testing Tool.
Also, in Google Webmaster Tools there is a section to check this kind of markup under "Optimization / Structured Data"; for this site it shows:
Type                   Schema      Items    # Pages
---------------------------------------------------
ItemPage               schema.org  109,657    6,866
WPAdBlock              schema.org   20,727    6,973
SiteNavigationElement  schema.org    7,350    7,322
WPHeader               schema.org    7,319    7,319
WPFooter               schema.org    7,319    7,319
WebPage                schema.org      649      649
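For reference, here is a minimal microdata sketch of those WebPageElement types (illustrative only, with placeholder links and text, not the actual markup from that site):

    <!-- Minimal sketch: WPHeader, SiteNavigationElement, WPAdBlock and
         WPFooter as schema.org microdata; links are placeholders. -->
    <body itemscope itemtype="http://schema.org/WebPage">
      <header itemscope itemtype="http://schema.org/WPHeader">
        <nav itemscope itemtype="http://schema.org/SiteNavigationElement">
          <a href="/">Home</a>
          <a href="/games">Games</a>
        </nav>
      </header>
      <div itemscope itemtype="http://schema.org/WPAdBlock">
        <!-- ad markup goes here -->
      </div>
      <footer itemscope itemtype="http://schema.org/WPFooter">
        <a href="/about">About</a>
      </footer>
    </body>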
Regarding duplicate content, you can have a look at one of the many Google support pages about canonicalization (isn't that duplicate content? :), e.g. the canonicalization help page and its hints.
It would be easier to answer if you could show the actual website or a SERP screenshot. By the way, I don't think your problem can be solved with that kind of markup, since there is no evidence that Google supports it, even though Schema.org is a Google initiative.
From what I understand, you have two different kinds of issues:
Bad search snippets. Google shows in the search snippet a fragment of the on-page text that is relevant to the user's query, so what you see in the snippet largely depends on the query typed into the search box. If you see a piece of the navigation menu in the snippets, it may be that there is no relevant text in the indexed page, so Google has nothing better to show than the text of the navigation menu.
Search pages being dropped from the Google index. This is a different, and more serious, problem. Are those "search pages" a good and relevant result compared to the other pages ranking for the query you are typing? Is the main topic of the page clear and explicit (remember that sometimes you need to spoon-feed the search engines)? I'm giving you more questions than answers but, as I stated before, it is not easy to diagnose an SEO problem without seeing the website.
All the above being said, Google does show breadcrumbs in its SERPs when you mark them up, and schema.org as a whole is backed by the major search engines, so implementing it gives the bots some level of better understanding of your pages. Search engines do not tell you everything they do, but if you follow the main standards they produce together, you pretty much ensure good visibility of your content within the SERPs.
You shouldn't count much on the impact from that though.
I suggest you focus mainly on pretty URLs, canonical usage, title, description, and proper implementation of schema.org itemprops for your main content type on the inner pages, as well as an H1 for your title.
Also try to render your main content as high as possible within the HTML, and avoid splitting your title, summary, and image; in the best case they should be close to each other as H1, IMG, and P elements, not divided by divs, tables, and so on.
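A minimal sketch of that structure (a hypothetical page; all URLs and text below are placeholders, using schema.org Article microdata):

    <!-- Canonical URL, descriptive title/description, and the main
         content (H1, IMG, P) kept high in the HTML and close together. -->
    <head>
      <title>The digitization process in Bulgaria begins</title>
      <meta name="description" content="A short summary of the article.">
      <link rel="canonical" href="http://example.com/digitization-in-bulgaria">
    </head>
    <body>
      <article itemscope itemtype="http://schema.org/Article">
        <h1 itemprop="headline">The digitization process in Bulgaria begins</h1>
        <img itemprop="image" src="/images/digitization.jpg" alt="Digitization in Bulgaria">
        <p itemprop="articleBody">The article text starts here...</p>
      </article>
    </body>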
You can have a look at this site: http://svejo.net/1792774-protsesat-na-tsifrovizatsiya-v-balgariya-zapochva
It has pretty good SEO on its article pages and shows up quite nicely and often in SERPs because of its on-page SEO.
I hope this helps you.

Linking together >100K pages without getting SEO penalized

I'm making a site which will have reviews of the privacy policies of hundreds of thousands of other sites on the internet. Its initial content is based on my running through the CommonCrawl 5 billion page web dump and analyzing all the privacy policies with a script, to identify certain characteristics (e.g. "Sells your personal info").
According to the SEO MOZ Beginner's Guide to SEO:
Search engines tend to only crawl about 100 links on any given page. This loose restriction is necessary to keep down on spam and conserve rankings.
I was wondering what would be a smart way to create a web of navigation that leaves no page orphaned, but would still avoid this SEO penalty they speak of. I have a few ideas:
Create alphabetical pages (or Google Sitemap .xml's), like "Sites beginning with Ado*". And it would link "Adobe.com" there for example. This, or any other meaningless split of the pages, seems kind of contrived and I wonder whether Google might not like it.
Use meta keywords or descriptions to categorize.
Find some way to apply more interesting categories, such as geographical or content-based. My concern here is I'm not sure how I would be able to apply such categories across the board to so many sites. I suppose if need be I could write another classifier to try and analyze the content of the pages from the crawl. Sounds like a big job in and of itself though.
Use the DMOZ project to help categorize the pages.
Wikipedia and StackOverflow have obviously solved this problem very well by allowing users to categorize or tag all of the pages. In my case I don't have that luxury, but I want to find the best option available.
At the core of this question is how Google responds to different navigation structures. Does it penalize those who create a web of pages in a programmatic/meaningless way? Or does it not care so long as everything is connected via links?
Google PageRank does not penalize you for having more than 100 links on a page. But each link beyond a certain threshold decreases in value/importance in the PageRank algorithm.
Quoting SEOmoz and Matt Cutts:

Could You Be Penalized?

Before we dig in too deep, I want to make it clear that the 100-link limit has never been a penalty situation. In an August 2007 interview, Rand quotes Matt Cutts as saying:

The "keep the number of links to under 100" is in the technical guideline section, not the quality guidelines section. That means we're not going to remove a page if you have 101 or 102 links on the page. Think of this more as a rule of thumb.

At the time, it's likely that Google started ignoring links after a certain point, but at worst this kept those post-100 links from passing PageRank. The page itself wasn't going to be de-indexed or penalized.
So the question really is how to get Google to take all your links seriously. You accomplish this by generating an XML sitemap for Google to crawl (you can either have a static sitemap.xml file, or generate its content dynamically). You will want to read up on the About Sitemaps section of the Google Webmaster Tools help documents.
Just as having too many links on a page is an issue, having too many links in an XML sitemap file is also an issue. What you need to do is paginate your XML sitemap. Jeff Atwood talks about how StackOverflow implements this: The Importance of Sitemaps. Jeff also discusses the same issue on StackOverflow podcast #24.
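A paginated sitemap is simply a sitemap index file that points at the individual sitemap files (the sitemaps.org protocol caps each file at 50,000 URLs); here is a minimal sketch with placeholder file names and dates:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- sitemap-index.xml: lists each paginated sitemap file. -->
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://example.com/sitemap-1.xml.gz</loc>
        <lastmod>2012-10-01</lastmod>
      </sitemap>
      <sitemap>
        <loc>http://example.com/sitemap-2.xml.gz</loc>
        <lastmod>2012-10-01</lastmod>
      </sitemap>
    </sitemapindex>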
This concept applies to Bing as well.

schema.org markups for search results pages

I was wondering if there are any markups in schema.org for a search results page that Google currently honors. I was trying
ItemList (http://schema.org/ItemList)
and
AggregateOffer (http://schema.org/AggregateOffer),
but neither of them seems to be showing up on Google yet (that is, Google still doesn't support or display that markup on the search results page). Are there any other markups I can try?
Thank you :)
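(For reference, an ItemList attempt on a search results page would be microdata roughly like this sketch; the names and text are placeholders, and there is no guarantee Google will display it.)

    <!-- Hypothetical ItemList microdata for a results page. -->
    <div itemscope itemtype="http://schema.org/ItemList">
      <h2 itemprop="name">Search results for "widgets"</h2>
      <ul>
        <li itemprop="itemListElement">Blue widget</li>
        <li itemprop="itemListElement">Red widget</li>
      </ul>
    </div>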
Search for a restaurant, place, or product and you'll see microformats that Google recognizes and uses to format its search results. Yelp reviews all also show a price range. They are used widely. I am pretty sure they use the Places data widely as well, and I believe I have seen cases of books having the author name and so on displayed.
But...
How they are used, in which cases, for which sites, and for which queries Google decides to use this information is entirely up to the search engine.
Within weeks of the announcements about microformats for product ratings, sites entirely unrelated to the topic were adding markup with product-rating information, so think of it as a hint that Google (and other search engines) might use in some cases, when they are confident that it's accurate and helpful.
It might just take time for Google to trust your site.

Can we "instruct" Google to display links in Google sitelinks

When you search for something on Google, e.g. stackoverflow.com, it shows you sitelinks on the search results page. Is there a way to manipulate this information? Or is there some way to suggest to Google that link x, link y, and link z should be promoted on the search results page?
Short answer: not at present: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=47334
Quote:
At the moment, sitelinks are completely automated. We're always working to improve our sitelinks algorithms, and we may incorporate webmaster input in the future. There are best practices you can follow, however, to improve the quality of your sitelinks. For example, for your site's internal links, make sure you use anchor text and alt text that's informative, compact, and avoids repetition.
It seems you can remove pages you'd rather weren't in there, but you can't promote things you'd like (beyond normal good SEO), and you can't make Google display sitelinks if it isn't doing so already.
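To illustrate the "informative, compact" internal-link advice from the quote above (placeholder URLs and text):

    <!-- Descriptive anchor text and alt text: -->
    <a href="/pricing">Pricing plans</a>
    <img src="/img/logo.png" alt="Acme Widgets logo">
    <!-- rather than vague, repetitive text: -->
    <a href="/page?id=7">Click here</a>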
As Andrew said, all you can really do regarding sitelinks is:
1. Be considered an authority website by Google, worthy of sitelinks in the first place, and
2. Remove sitelinks that you don't want from your Google Webmaster control panel.
I have some high-ranking websites with sitelinks, and it's clear that Google doesn't always have an easy time discerning what's worthwhile. Sometimes my vBulletin forums even include sitelinks to random members from the first page of Google. The next day (or refresh), the sitelinks include non-prominent subforums. It truly feels random unless you have a more straightforward navigation system like a WordPress blog.