Schema data on inner page - SEO

I have a website and I'm beginning to add schema tags to it. One concern I have is that the schema data only appears on subpages.
My reviews page is located under /testimonials and the schema data works perfectly, as tested in Google's rich snippets testing tool.
However, these reviews don't appear anywhere on the home page, so the review schema is NOT present on the home page. Should I add them to the home page HTML, hidden, so that they're picked up, or is there a way to tell Google that my reviews page is located at /testimonials?

To answer your first question: no, you should never hide your schema markup. That goes against Google's guidelines and they will simply ignore it. Secondly, a homepage is typically not a good place to mark up reviews and ratings, because the markup should reflect the main content of the page, and because the page should also include a mechanism to gather and post customer reviews. Without that mechanism, Google won't trust your review markup.
So my advice would be to create a strong testimonials page that includes your business's reviews and ratings, along with a system to gather and post them to that page. If you mark them up well and structure the markup correctly (don't trust Google's testing tool to catch every error), Google may very well display a rating rich snippet for that page. And with good SEO, you can have both pages appearing on the first page of Google for relevant search queries, with rich snippets.
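For illustration only, here is a rough sketch of what review markup on a /testimonials page could look like, expressed as schema.org JSON-LD. The business name, rating figures and review text are made-up placeholders, and the types should be adjusted to match your actual business:

    <!-- placeholder values for illustration only -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Example Business",
      "url": "http://www.example.com/",
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.8",
        "reviewCount": "27"
      },
      "review": [
        {
          "@type": "Review",
          "author": { "@type": "Person", "name": "Jane Doe" },
          "reviewRating": { "@type": "Rating", "ratingValue": "5" },
          "reviewBody": "Great service, highly recommended."
        }
      ]
    }
    </script>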

Related

SEO Search Only content

We have a ton of content on our website which a user can get to by performing a search on the website. For example, we have data for all public companies, in the form of individual pages per company. So think something like 10,000 pages in total. Now, in order to get to these pages, a user needs to search for the company name and, from the search results, click on the company name they are interested in.
How would a search bot find these pages? There is no page on the website which links to all 10,000 of them. Think Amazon: you need to search for your product and then, from the search results, click on the product you are interested in to get to it.
The closest solution I could find was a sitemap.xml. Is that it? Is there anything that doesn't require adding 10,000 links to an XML file?
A page needs to be linked to, or to sit close to the homepage, to stand a decent chance of getting indexed by Google.
A sitemap helps, sure, but a page still needs to exist in the menu / site structure. A sitemap reference alone does not guarantee a resource will be indexed.
Google - Webmaster Support on Sitemaps: "Google doesn't guarantee that we'll crawl or index all of your URLs. However, we use the data in your Sitemap to learn about your site's structure, which will allow us to improve our crawler schedule and do a better job crawling your site in the future. In most cases, webmasters will benefit from Sitemap submission, and in no case will you be penalized for it."
If you browse Amazon, it is possible to find 99% of the products available. Amazon do a lot of interesting stuff in their faceted navigation; you could write a book on it.
Speak to an SEO or a usability/CRO expert - they will be able to tell you what you need to do, which is basically to create a user-friendly site with categories and links to all your products.
An XML sitemap pretty much is your only on-site option if you do not or cannot link to these products on your website. You could link to these pages from other websites but that doesn't seem like a likely scenario.
Adding 10,000 products to an XML sitemap is easy to do. Your sitemap can be dynamic just like your web pages are. Just generate it on the fly when requested like you would a regular web page and include whatever products you want to be found and indexed.
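As a rough sketch, the dynamically generated response only needs to follow the standard sitemaps.org format; the URLs and dates below are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- one <url> entry per company page, emitted from your database -->
      <url>
        <loc>http://www.example.com/companies/acme-corp</loc>
        <lastmod>2013-05-01</lastmod>
      </url>
      <url>
        <loc>http://www.example.com/companies/globex</loc>
        <lastmod>2013-04-18</lastmod>
      </url>
    </urlset>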

Google displaying website title differently in search results

Google displays my website’s page title differently to how it is meant to be.
The page title should be:
Graphic Designer Brighton and Lewes | Lewis Wallis Graphic Design
It displays fine in Bing, Yahoo and on my actual website.
However, Google displays it differently:
Lewis Wallis Graphic Design: Graphic Designer Brighton and Lewes
This is annoying as I want my keywords "graphic designer brighton" to go before my name.
I am using the Yoast SEO plugin and my only suspicion is that there might be a conflict between that and my theme, Workality.
Has anyone got any suggestions as to why this might be happening?
Google Search may change the webpage titles it shows on the results page (since January 2012):
We use many signals to decide which title to show to users, primarily the <title> tag if the webmaster specified one. But for some pages, a single title might not be the best one to show for all queries, and so we have algorithms that generate alternative titles to make it easier for our users to recognize relevant pages.
See also the documentation at http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35624:
Google's generation of page titles and descriptions (or "snippets") is completely automated and takes into account both the content of a page as well as references to it that appear on the web. The goal of the snippet and title is to best represent and describe each result and explain how it relates to the user's query.
[…]
While we can't manually change titles or snippets for individual sites, we're always working to make them as relevant as possible.
In my answer on Webmasters SE I linked to questions from people having the same issue.
Is it possible that you changed the title, or installed the plugin, and Google hasn't picked up the changes yet?
It can take a few weeks for Google to pick up changes to your site, depending on how often it crawls it. The HTML looks fine, so I can only think that Google hasn't got round to picking up the changes yet.

SEO: secure pages and rel=nofollow

Should one apply rel="nofollow" attribute to site links that are bound for secure/login required pages?
We have a URI date-based link structure where the previous year's news content is free, while the current year, and any year prior to the previous one, are paid, login-required content.
The net effect is that when doing a search for our company name in Google, what comes up first is Contact, About, Login, etc. - standard content that doesn't require a login. That's fine, but ideally we'd have our free content, the pages we want to promote, shown first in the search engine results.
Toward this end, the link structure now generates rel="follow" for the free content we want to promote, and rel="nofollow" for all paid content and Contact, About, Login, etc. screens that we want at the bottom of the SEO search result ladder.
I have yet to deploy the new linking scheme for fear of, you know, blowing up the site SEO-wise ;-) It's not in great shape to begin with, despite our decent ranking, but I don't want us to disappear either.
Anyway, words of wisdom appreciated.
Thanks
nofollow
I think Emil Vikström is wrong about nofollow. You can use the rel value nofollow for internal links; neither the microformats spec nor the HTML5 spec says otherwise.
Google even gives such an example:
Crawl prioritization: Search engine robots can't sign in or register as a member on your forum, so there's no reason to invite Googlebot to follow "register here" or "sign in" links. Using nofollow on these links enables Googlebot to crawl other pages you'd prefer to see in Google's index. However, a solid information architecture — intuitive navigation, user- and search-engine-friendly URLs, and so on — is likely to be a far more productive use of resources than focusing on crawl prioritization via nofollowed links.
This does apply to your use case, so you could nofollow the links to your login page. Note, however, that if you also meta-noindex them, people who search for "YourSiteName login" probably won't get the desired page in their search results.
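For example, link-level nofollow on the kind of links the quote describes would look something like this (the paths are placeholders):

    <a href="/login" rel="nofollow">Sign in</a>
    <a href="/register" rel="nofollow">Register here</a>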
follow
There is no rel value "follow". It is defined neither in the HTML5 spec nor in the HTML5 link type extensions, and it isn't mentioned at http://microformats.org/wiki/existing-rel-values at all. A link without the rel value nofollow is automatically a "follow" link.
You can't override a page-level meta nofollow for individual links (the two nofollow values even have different semantics).
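To make the distinction concrete, here is a small sketch of the two forms; the path is hypothetical:

    <!-- page-level: asks crawlers not to follow ANY link on this page -->
    <meta name="robots" content="nofollow">

    <!-- link-level: asks crawlers not to follow this one link only -->
    <a href="/archive/2013/" rel="nofollow">Member archive</a>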
Your case
I'd use nofollow for all links to restricted/paid content. I wouldn't nofollow the links to the informational pages about the site (About, Contact, Login), because they are useful, people might search specifically for them, and they give information about your site, while the content pages give information about the various topics.
Nofollow is only for external links; it does not apply to links within your own domain. Search engines will try to give the most relevant content for the query asked, and they generally actively avoid taking the website owner's wishes into account. Thus, nofollow will not help you here.
What you really want to do is make the news content the best choice for a search on your company name. A user searching for your company name may do this for two reasons: They want your homepage (the first page) or they more specifically want to know more about your company. This means that your homepage as well as "About", "Contact", etc, are generally actually what the user is looking for and the search engines will show them at the top of their results pages.
If you don't want this you must make those pages useless for one wanting to know more about your company. This may sound really silly. To make your "About" and "Contact" pages useless to one searching for your company you should remove your company name from those pages, as well as any information about what your company does. Put that info on the news pages instead and the search engines may start to rank the news higher.
Another option is to not let the search engine index those other pages at all by adding them to a robots.txt file.
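A minimal sketch of that option; the paths here are hypothetical and would need to match your real URL structure (note also that robots.txt blocks crawling rather than removing pages that are already indexed):

    User-agent: *
    Disallow: /about
    Disallow: /contact
    Disallow: /login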

SEO: Allowing crawler to index all pages when only a few are visible at a time

I'm working on improving the site for SEO purposes and have hit an interesting issue. The site, among other things, includes a large directory of individual items (it doesn't really matter what these are). Each item has its own details page, which is accessed via
http://www.mysite.com/item.php?id=item_id
or
http://www.mysite.com/item.php/id/title
The directory is large - it has about 100,000 items in it. Naturally, only a few items are listed on any given page. For example, on the main site homepage there are links to about 5 or 6 items, from some other page there are links to about a dozen different items, etc.
When real users visit the site, they can use the search form to find items by keyword or location, which produces a list matching their search criteria. However, when, for example, the Google crawler visits the site, it won't even attempt to put text into the keyword search field and submit the form. Thus, as far as the bot is concerned, after indexing the entire site it has covered only a few dozen items at best. Naturally, I want it to index each individual item separately. What are my options here?
One thing I considered is to check the user agent and IP ranges and, if the requester is a bot (as best I can tell), add a div to the end of the most relevant page with links to each individual item. Yes, this would be a huge page to load - and I'm not sure how the Google bot would react to this.
Any other things I can do? What are best practices here?
Thanks in advance.
One thing I considered is to check the user agent and IP ranges and, if the requester is a bot (as best I can tell), add a div to the end of the most relevant page with links to each individual item. Yes, this would be a huge page to load - and I'm not sure how the Google bot would react to this.
That would be a very bad thing to do. Serving up different content to the search engines specifically for their benefit is called cloaking and is a great way to get your site banned. Don't even consider it.
Whenever a webmaster is concerned about getting their pages indexed, an XML sitemap is an easy way to ensure the search engines are aware of the site's content. Sitemaps are very easy to create and update, too, if your site is database-driven. The XML file does not have to be static, so you can dynamically produce it whenever the search engines request it (Google, Yahoo, and Bing all support XML sitemaps). You can find out more about XML sitemaps at sitemaps.org.
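With roughly 100,000 items you would also run into the 50,000-URL limit of a single sitemap file, so the usual approach is a sitemap index pointing at several smaller sitemaps; the file names below are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap><loc>http://www.mysite.com/sitemap-items-1.xml</loc></sitemap>
      <sitemap><loc>http://www.mysite.com/sitemap-items-2.xml</loc></sitemap>
      <sitemap><loc>http://www.mysite.com/sitemap-items-3.xml</loc></sitemap>
    </sitemapindex>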
If you want to make your content available to search engines and want to benefit from semantic markup (i.e. HTML), you should also make sure all of your content can be reached through hyperlinks (in other words, not through form submissions or JavaScript). The reason for this is twofold:
The anchor text in the links to your items will contain the keywords you want to rank well for. This is one of the more heavily weighted ranking factors.
Links count as "votes", especially to Google. Links from external websites, especially related websites, are what you'll hear people recommend the most and for good reason. They're valuable to have. But internal links carry weight, too, and can be a great way to prop up your internal item pages.
(Bonus) Google has PageRank, which used to be a huge part of their ranking algorithm but plays only a small part now. It still has value, though, and links "pass" PageRank to each page they link to, increasing the PageRank of that page. When you have as many pages as you do, that's a lot of potential PageRank to pass around. If you built your site well you could probably get your home page to a PageRank of 6 just from internal linking alone.
Having an HTML sitemap that links to all of your products is a great way to ensure that search engines, and users, can easily find them all. It is also recommended that you structure your site so that more important pages are closer to the root of your website (the home page), branching out to subpages (categories) and then to specific items. This gives search engines an idea of which pages are important and helps them organize them (which helps them rank them). It also helps them follow those links from top to bottom and find all of your content.
Each item has its own details page, which is accessed via
http://www.mysite.com/item.php?id=item_id
or
http://www.mysite.com/item.php/id/title
This is also bad for SEO. When you can pull up the same page using two different URLs you have duplicate content on your website. Google is on a crusade to increase the quality of their index and they consider duplicate content to be low quality. Their infamous Panda Algorithm is partially out to find and penalize sites with low quality content. Considering how many products you have it is only a matter of time before you are penalized for this. Fortunately the solution is easy. You just need to specify a canonical URL for your product pages. I recommend the second format as it is more search engine friendly.
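For example, each item page could declare its preferred URL in the <head> like this (the id and title are placeholders, using the second URL format):

    <link rel="canonical" href="http://www.mysite.com/item.php/123/example-item-title">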
Read my answer to an SEO question on the Pro Webmasters site for even more information on SEO.
I would suggest, for starters, having an XML sitemap. Generate a list of all your pages and submit it to Google via Webmaster Tools. It wouldn't hurt to have a "friendly" HTML sitemap either - linked to from the front page - which lists all these pages, preferably by category, too.
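A hypothetical fragment of such a friendly sitemap page, grouped by category (the categories, ids and titles are made up):

    <h2>Category A</h2>
    <ul>
      <li><a href="/item.php/101/first-item-title">First item</a></li>
      <li><a href="/item.php/102/second-item-title">Second item</a></li>
    </ul>
    <h2>Category B</h2>
    <ul>
      <li><a href="/item.php/201/another-item-title">Another item</a></li>
    </ul>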
If you're concerned with SEO, then having links to your pages is hugely important. Google could see a page and think "wow, awesome!" and give it lots of authority - this authority (some like to call it "link juice") is then passed down to the pages that are linked from it. You ought to create a hierarchy of pages, with the more important ones closer to the top, and/or make the structure wide instead of deep.
Also, showing different stuff to the Google crawler than the "normal" visitor can be harmful in some cases, if Google thinks you're trying to con it.
Sorry - a little biased towards Google here - but the other engines are similar.

Search Engine Optimisation

My neighbour popped over last night to ask me for help with regards to his company's website. He said that it used to be ranked pretty high on Google but has since fallen off completely.
Now, I'm a Windows app programmer, hence my request for help. I took a look and the meta tags seem OK. I recommended that he add an <h1>heading</h1> to each page, echoing the page title, to help reinforce the content.
I also suggested that finding related websites and getting them to link to his site was good for search ranking.
Are there any other general strategies / tools that could help?
His site is: http://www.colofinder.co.uk/
PS: this isn't just an attempt to have Stack Overflow link to my neighbour's site - I'm aware that links from SO don't add to its ranking.
Go to http://ooyes.net/blog/a-step-by-step-15-minute-seo-audit-%28a-sample-from-seo-secrets%29 and read it. Then go to http://www.searchenginejournal.com/55-quick-seo-tips-even-your-mother-would-love/6760/ and read it. Then go to your friend's site and look at it with that information in mind. Off the top of my head, I would flip the company name and page title in the <title> tags. Look at the Google Analytics account and see how people are coming to the site. That will give you an idea of where you should start your efforts to build a workable base.
First of all, he needs to make sure that his website content is well managed and to the point. The page title has to be precise; the meta keywords tag is obsolete, so concentrate on the meta description instead. The main heading should be in an h1 tag, subheadings in h2, and further subheadings in h3. Try to update the website at least once a month.
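For illustration only, this is the kind of head and heading structure meant here, with made-up content:

    <title>Primary Keyword and Location | Company Name</title>
    <meta name="description" content="One or two concise sentences describing what this page offers.">

    <h1>Main heading for the page</h1>
    <h2>Subheading for a section</h2>
    <h3>Further subheading within that section</h3>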
Use community websites like Facebook, Twitter and LinkedIn, and other related forums, to post updates about completed projects, and make sure those posts include inbound links. You can use your company name as the anchor text for a link to your primary website, and the project name as the anchor text for a link to the relevant subpage of your company website.
Keep posting at least once a week. Submitting the website URL to online directories will be a great help. Do not use black-hat SEO techniques like cloaking, and do not use any invisible text or divs on your website. Whenever you place a link to your website anywhere, make it the most relevant and to-the-point link you can.
The link you post should be relevant to the content it appears alongside. Make a section on your website for tag clouds/Google tags; this will be attractive to search engines and helps associate your website with other popular websites.
Make sure these tags point to top-ranking websites with relevant material. I hope this helps. Feel free to ask if you have trouble understanding anything I have mentioned above. Best of luck.