When you search for something on Google, e.g. stackoverflow.com, it shows you sitelinks on the search results page. Is there a way to manipulate this information? Or is there some way to suggest to Google that link x, link y, and link z should be promoted on the search results page?
Short answer: not at present. See http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=47334
Quote:
At the moment, sitelinks are completely automated. We're always working to improve our sitelinks algorithms, and we may incorporate webmaster input in the future. There are best practices you can follow, however, to improve the quality of your sitelinks. For example, for your site's internal links, make sure you use anchor text and alt text that's informative, compact, and avoids repetition.
It seems you can remove pages you'd rather weren't in there, but you can't promote links you'd like (beyond normal good SEO), and you can't make Google display sitelinks if it isn't doing so already.
As Andrew said, all you can really do regarding sitelinks is:
1. Be considered by Google an authority website worthy of sitelinks in the first place, and
2. Remove sitelinks that you don't want via your Google Webmaster control panel.
I have some high-ranking websites with sitelinks, and it's clear that Google doesn't always have an easy time discerning what's worthwhile. Sometimes the sitelinks for my vBulletin forums even point to random member profiles; the next day (or after a refresh), they point to non-prominent subforums. It truly feels random unless you have a more straightforward navigation system, like a WordPress blog.
I want to achieve this when a site is searched for on Google (sub-links below the description).
Are these Google Sitelinks?
From what I've seen when researching this, Sitelinks are larger and sit side by side, as shown in the image in this question.
If these aren't Sitelinks, can they be defined and how would this be done?
Yes, these are sitelinks. The large, two-column sitelinks mainly appear for the homepage or other pages of yours with high PageRank.
The little links that appear beside each other are also sitelinks; they show for pages with lower PageRank, thinner content, or poorer HTML structure.
You can't control which links appear on Google; many factors affect them, such as HTML structure, PageRank, content, CTR, and the search query.
You can only remove them, via Google Webmaster Tools, by demoting a certain link from a certain page of yours.
These are one-line sitelinks, introduced in April 2009.
They are similar to the "full two-column" sitelinks, but one-line sitelinks can appear for every result, not only the first one.
I was wondering how to achieve the following when searching for my website on Google. I've tried searching around for it but I'm not sure what the exact term is so I haven't gotten anywhere.
Basically, when my website is searched for on Google, I'd like the subpages to be displayed as shown in the image below, instead of coming up as separate results. Is this possible, or is it something that Google does for you?
Take a look at this screenshot:
Google calls them sitelinks.
You can’t enforce them currently:
We only show sitelinks for results when we think they'll be useful to the user. If the structure of your site doesn't allow our algorithms to find good sitelinks, or we don't think that the sitelinks for your site are relevant for the user's query, we won't show them.
At the moment, sitelinks are automated.
For tips on encouraging Google to display them for your site, see this question on Webmasters SE:
What are the most important things I need to do to encourage Google Sitelinks?
They also have a "sitelinks" tag.
I'm making a site which will have reviews of the privacy policies of hundreds of thousands of other sites on the internet. Its initial content is based on my running through the CommonCrawl 5 billion page web dump and analyzing all the privacy policies with a script, to identify certain characteristics (e.g. "Sells your personal info").
According to the SEO MOZ Beginner's Guide to SEO:
Search engines tend to only crawl about 100 links on any given page. This loose restriction is necessary to keep down on spam and conserve rankings.
I was wondering what would be a smart way to create a web of navigation that leaves no page orphaned, but would still avoid this SEO penalty they speak of. I have a few ideas:
Create alphabetical pages (or Google sitemap XMLs), like "Sites beginning with Ado*", which would then link to "Adobe.com", for example (a sketch of this bucketing follows this list). This, or any other semantically meaningless split of the pages, seems kind of contrived, and I wonder whether Google might dislike it.
Use meta keywords or descriptions to categorize the pages.
Find some way to apply more interesting categories, such as geographical or content-based ones. My concern here is that I'm not sure how I could apply such categories across the board to so many sites. I suppose, if need be, I could write another classifier to try to analyze the content of the pages from the crawl. That sounds like a big job in and of itself, though.
Use the DMOZ project to help categorize the pages.
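To make the first idea concrete, here is a minimal sketch (the names and URL scheme are made up for illustration) of bucketing the reviewed sites into alphabetical hub pages that each stay under the ~100-link guideline:

    from collections import defaultdict

    MAX_LINKS_PER_PAGE = 100  # the rough guideline quoted above

    def build_hub_pages(site_names):
        """Group site names by first letter, splitting any bucket that
        exceeds MAX_LINKS_PER_PAGE into numbered sub-pages."""
        buckets = defaultdict(list)
        for name in sorted(site_names, key=str.lower):
            buckets[name[0].lower()].append(name)

        pages = {}  # page slug -> site names linked from that page
        for letter, names in buckets.items():
            for i in range(0, len(names), MAX_LINKS_PER_PAGE):
                chunk = names[i:i + MAX_LINKS_PER_PAGE]
                pages[f"sites-{letter}-{i // MAX_LINKS_PER_PAGE + 1}"] = chunk
        return pages

    # "adobe.com" would land on a hub page like /sites-a-1/
    pages = build_hub_pages(["adobe.com", "amazon.com", "bing.com"])

A single index page linking to these hub pages then keeps every review reachable within two clicks, with no page orphaned.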
Wikipedia and StackOverflow have obviously solved this problem very well by allowing users to categorize or tag all of the pages. In my case I don't have that luxury, but I want to find the best option available.
At the core of this question is how Google responds to different navigation structures. Does it penalize those who create a web of pages in a programmatic/meaningless way? Or does it not care so long as everything is connected via links?
Google PageRank does not penalize you for having more than 100 links on a page, but each link above a certain threshold decreases in value/importance in the PageRank algorithm.
Quoting SEOMOZ and Matt Cutts:
Could You Be Penalized?
Before we dig in too deep, I want to make it clear that the 100-link limit has never been a penalty situation. In an August 2007 interview, Rand quotes Matt Cutts as saying:
The "keep the number of links to under 100" is in the technical guideline section, not the quality guidelines section. That means we're not going to remove a page if you have 101 or 102 links on the page. Think of this more as a rule of thumb.
At the time, it's likely that Google started ignoring links after a certain point, but at worst this kept those post-100 links from passing PageRank. The page itself wasn't going to be de-indexed or penalized.
So the question really is how to get Google to take all your links seriously. You accomplish this by generating an XML sitemap for Google to crawl (you can either have a static sitemap.xml file or generate its content dynamically). You will want to read up on the About Sitemaps section of the Google Webmaster Tools help documents.
Just like having too many links on a page is an issue, having too many links in an XML sitemap file is also an issue (the sitemaps protocol caps each file at 50,000 URLs). What you need to do is paginate your XML sitemap. Jeff Atwood talks about how Stack Overflow implements this: The Importance of Sitemaps. Jeff also discusses the same issue on Stack Overflow podcast #24.
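As a rough sketch of that pagination (the file layout and example.com URLs are made up), you can emit one sitemap file per 50,000 URLs plus a sitemap index that points at them, per the sitemaps.org protocol:

    from xml.sax.saxutils import escape

    URLS_PER_FILE = 50000  # the sitemaps.org per-file limit

    def write_sitemaps(urls, base="http://example.com"):
        """Write paginated sitemap files plus an index referencing them."""
        names = []
        for i in range(0, len(urls), URLS_PER_FILE):
            name = f"sitemap-{i // URLS_PER_FILE + 1}.xml"
            names.append(name)
            with open(name, "w") as f:
                f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
                f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
                for url in urls[i:i + URLS_PER_FILE]:
                    f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
                f.write('</urlset>\n')

        # The index is what you point Google at; it references the parts.
        with open("sitemap-index.xml", "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for name in names:
                f.write(f"  <sitemap><loc>{base}/{name}</loc></sitemap>\n")
            f.write('</sitemapindex>\n')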
This concept applies to Bing as well.
Pretty much, that is the question. Is there a way more efficient than the standard sitemap.xml to add pages, force a recrawl, or remove pages, i.e. to manage your website's index entries in Google?
I remember reading an article a few years ago by an unknown blogger who said that when he published news on his website, the URL of the news item would appear immediately in Google's search results. I think he mentioned something special, though I don't remember exactly what: some automatic re-crawling system offered by Google themselves? I'm not sure about it. So I ask: am I deluding myself, or is there really NO OTHER way to manage index content besides sitemap.xml? I just need to be sure about this.
Thank you.
I don't think you will find that magical "silver bullet" answer you're looking for, but here's some additional information and tips that may help:
Depth of crawl and rate of crawl are directly influenced by PageRank (one of the few things it does influence), so increasing the quantity and quality of back-links to your site's homepage and internal pages will assist you.
QDF - this Google algorithm factor, "Query Deserves Freshness", does have a real impact and is one of the core reasons behind the Google Caffeine infrastructure project to allow much faster finding of fresh content. This is one of the main reasons that blogs and sites like SE do well - because the content is "fresh" and matches the query.
XML sitemaps do help with indexation, but they won't result in better ranking. Use them to assist search bots to find content that is deep in your architecture.
Pinging services that monitor site changes, like Ping-O-Matic, can really assist in pushing out notification of your new content, especially for blogs; it can also ensure the search engines become aware of it immediately (a minimal ping sketch follows this list).
Crawl budget: be mindful of wasting a search engine's time on parts of your site that don't change or don't deserve a place in the index. Using robots.txt and the robots meta tags can herd the search bots to different parts of your site (use these with caution so as not to block high-value content); see the sketch below.
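A minimal sketch of the pinging idea (the endpoints below were the documented sitemap ping URLs for Google and Bing at the time; example.com is a placeholder):

    import urllib.parse
    import urllib.request

    def ping_search_engines(sitemap_url):
        """Tell Google and Bing that the sitemap has changed."""
        encoded = urllib.parse.quote(sitemap_url, safe="")
        for endpoint in ("http://www.google.com/ping?sitemap=",
                         "http://www.bing.com/ping?sitemap="):
            urllib.request.urlopen(endpoint + encoded)

    ping_search_engines("http://example.com/sitemap-index.xml")

And for crawl budget, a robots.txt along these lines (the disallowed paths are hypothetical) keeps bots out of low-value areas while advertising the sitemap:

    User-agent: *
    Disallow: /search/
    Disallow: /print/

    Sitemap: http://example.com/sitemap-index.xml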
Many of these topics are covered online, but there are other intrinsic things, like navigational structure, internal linking, and site architecture, that contribute just as much as any "trick" or "device".
Getting many links from good sites to your website will make the Google "spiders" reach your site faster.
Also, links from social sites like Twitter can help the crawlers visit your site (although Twitter links do not pass "link juice", the spiders still follow them).
One last thing: update your content regularly, and think of content as "Google spider food". If the spiders come to your site and don't find new food, they won't come back again soon; if there is new food each time they come, they will come often. Article directories, for example, get indexed several times a day.
In SEO, people talk a lot about Google PageRank. It's kind of a catch-22: until your site is actually big (at which point you don't really need search engines as much), it's unlikely that big sites will link to you and increase your PageRank!
I've been told that it's easiest to simply get a couple of high-quality links pointing to a site to raise its PageRank. I've also been told that there are certain open directories, like dmoz.org, that Google pays special attention to (since their links are human-curated). Can anyone speak to the validity of this, or suggest another site/technique to increase a site's PageRank?
Have great content
Nothing helps your Google rank more than having content or offering a service people are interested in. If your web site is better than the competition and solves a real need, you will naturally generate more traffic and inbound links.
Keep your content fresh
Use friendly URLs that contain keywords
Good: http://cars.com/products/cars/ford/focus/
Bad: http://cars.com/p?id=1232
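A minimal sketch of generating such keyword slugs (the slugify helper and path scheme are made-up illustrations):

    import re

    def slugify(text):
        """Lower-case, strip punctuation, hyphenate:
        'Ford Focus (2009)' -> 'ford-focus-2009'."""
        text = re.sub(r"[^a-z0-9]+", "-", text.lower())
        return text.strip("-")

    # Build a cars.com-style path from make and model names:
    path = f"/products/cars/{slugify('Ford')}/{slugify('Focus')}/"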
Make sure the page title is relevant and well constructed
For example: Buy A House In France :: Property Purchasing in France
Use a domain name that describes your site
Good: http://cars.com/
Bad: http://somerandomunrelateddomainname.com/
Example
Type car into Google: 4 of the top 5 links have car in the domain: http://www.google.co.uk/search?q=car
Make it accessible
Make sure people can read your content. This includes a variety of different audiences:
People with disabilities: sight, motor, cognitive disabilities, etc.
Search bots
In particular, make sure search bots can read every single relevant page on your site. Quite often, search bots get blocked by the use of JavaScript to link between pages, or by the use of frames, Flash, or Silverlight. One easy way to avoid this is to have a site map page that gives access to the whole site, dividing it into categories, sub-categories, etc.
Down-level browsers
Submit your site map automatically
Most search engines allow you to submit a list of pages on your site including when they were last updated.
Google: https://www.google.com/webmasters/tools/docs/en/about.html
Inbound links
Generate as much buzz about your website as possible to increase the likelihood of people linking to you. Blog/podcast about your website if appropriate. List it in online directories (if appropriate).
References
Google Search Engine Ranking Factors, by an SEO company
Creating a Google-friendly site: Best practices
Wikipedia - Search engine optimization
Good content.
Update it often.
Read and digest everything at Creating a Google-friendly site: Best practices.
Be active on the web. Comment on blogs; correspond genuinely with people via email, IM, Twitter.
I'm not too sure about the domain name. Wikipedia? What does that mean? Mozilla? What word is that? Google? Was a typo. Yahoo? Sounds like that chocolate drink Yoohoo.
Trying to stuff keywords into the domain name shoehorns you in anyway, and it can be construed as an SEO technique in the future (if it isn't already!).
Answer all email. Answer blog comments. Be nice and helpful.
Go watch garyvee's Better Than Zero. That'll motivate you.
If it's appropriate, having a blog is a good way of keeping content fresh, especially if you post often. A CMS would be handy too, as it reduces the friction of updating. The best way is user-generated content, as other people make your site bigger and keep it updated, and they may well link to their contributions from their other sites.
Google doesn't want you to have to engineer your site specifically to get a good PageRank. Having popular content and a well designed website should naturally get you the results you want.
An easy trick is to use Google Webmaster Tools: https://www.google.com/webmasters/tools
And you can generate a sitemap using http://www.xml-sitemaps.com/
Then, don't forget to use www.google.com/analytics/
And be careful: most SEO guides are not correct, and playing fair is not always the best approach. For example, everyone says that spamming .edu sites is bad and ineffective, but it is effective.