Do Canonical Tags Prevent Google Indexing?

We're about to embark on a restructuring of our website, and we will be separating some of our customers into different groups.
Currently all of our customers visit our homepage: www.example.com
What we are going to be doing is sending customers to specific landing pages depending on marketing segmentation.
For instance, people who we know are more likely to book a hotel might go to www.example.com/hotels, whilst people who like cars will go to www.example.com/cars.
The content might be ever so slightly different (a banner or parameter might change) but the vast majority of text (copy, layout) will stay the same.
Firstly, are canonical tags appropriate to use in this case to direct any Google juice back to www.example.com?
Secondly, since we will be marketing to specific groups, we will not want these pages to be indexed by Google, nor to appear in search rankings. With this in mind, are canonical tags still the correct tags to be using? That is, do canonical tags pass the Google juice on to the canonical page, meaning the referring page is not indexed?

If the core content of all those pages is the same, then I think using the canonical tag will work. If Google accepts the pages as canonical variants of each other, it will always send people to the page you specify.
What do you mean by "sending customers to specific landing pages depending on marketing segmentation"? How is that implemented?
If all that changes is adverts then why not use the one page and dynamically insert the adverts that suit the visitor?
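As a rough sketch of that single-page approach, assuming the segment is passed in a hypothetical ?segment= query parameter (the parameter name and banner copy are invented for illustration):

```html
<!-- One landing page; the banner swaps based on a hypothetical ?segment= parameter -->
<div id="banner">Welcome to Example.com</div>
<script>
  var segment = new URLSearchParams(window.location.search).get('segment');
  var banners = {
    hotels: 'Book your next hotel stay with us',
    cars:   'Find your next car here'
  };
  if (segment && banners[segment]) {
    document.getElementById('banner').textContent = banners[segment];
  }
</script>
```

With this setup there is only one URL for Google to index, so the canonical/noindex question largely disappears.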

Seems that if you don't want the specific landing pages indexed by Google, or appearing in the search rankings, then the pages wouldn't have any 'Google juice' to consolidate. In that case, canonical tags won't hurt, but I don't think they'll have any effect.
To keep the landing pages from being indexed, you can use a robots meta tag with noindex, or block them in robots.txt. Note that if a page is blocked in robots.txt, Google can never crawl it to see a noindex meta tag, so pick one approach rather than combining both.
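To make the two options concrete, here is a sketch of what each tag would look like on www.example.com/hotels (pick one, not both):

```html
<!-- Option 1: consolidate signals back to the main page (a hint, not a directive) -->
<link rel="canonical" href="http://www.example.com/" />

<!-- Option 2: keep the landing page out of the index entirely (a directive) -->
<meta name="robots" content="noindex" />
```

The canonical tag asks Google to credit the main page; noindex asks Google to drop the page from results. If staying out of the index is the hard requirement, noindex is the safer choice.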

Related

Best Approach for integrating Microdata schema.org

I am developing a simple website and want to implement microdata on it.
The website is for a local business and simply has the default structure (about, services, contact, etc..).
My question is whether the microdata can be cloned on every page or whether I should change it from page to page. Logically I would say it should change from page to page, but on the other hand, information like the Facebook page, Twitter account, and map will stay the same, so I don't know what I should do.
I'll take the chance to also ask if there is a better category for listing a software company. I am using LocalBusiness, but maybe there are better ones that I am missing (this also applies to the meta description and keywords on the different sections of the site).
You should declare your LocalBusiness information only on the start page or on the contact/about page.
On all other pages, use markup that matches the content, such as Article, Product, etc.
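A minimal sketch of LocalBusiness microdata for the contact page (the business name and address values are placeholders):

```html
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Acme Software</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">1 Main Street</span>,
    <span itemprop="addressLocality">Springfield</span>
  </div>
  <a itemprop="url" href="http://www.example.com/">example.com</a>
</div>
```

The social profile and map links can stay the same across pages without harm; it's the itemtype that should match each page's actual content.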

Multiple categories of item in sitemap.xml

On my site I have items that can be in few or more categories.
Links to one item may look like (for example):
example.com/category_id_1/item_id_1
example.com/category_id_67/item_id_1
example.com/category_id_106/item_id_1
So I don't understand: do I need to list all the links for one item in sitemap.xml, or just one of them? If only one, which one?
Which way would be more correct in terms of SEO?
If all three URLs serve the same content, it will create duplicate content issues.
You didn't mention whether you are using a dedicated product URL. A better approach is to use product URLs like:
yourdomain.com/products/xyx-product
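To sketch how that fits together, using the question's example item and the /products/ pattern suggested above (the exact path is illustrative): each category view points at one canonical product URL, and only that URL goes in the sitemap.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only the one canonical URL per item, not the category variants -->
  <url>
    <loc>http://example.com/products/item_id_1</loc>
  </url>
</urlset>
```

On each category view (example.com/category_id_1/item_id_1 and so on), add a matching `<link rel="canonical" href="http://example.com/products/item_id_1" />` so the duplicate-content signals consolidate onto the URL you put in the sitemap.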

Multipage Bootstrap and Google Analytics

I have a bit of a problem figuring out how to use Google Analytics properly with Bootstrap.
My page has subpages three levels deep, and the last subpage has its own subdomain. In GA I see I can use a maximum of 50 tracking codes within one service. What if I need more than that?
You are limited to 50 properties, not 50 pages. Each property can track many pages (up to 10 million hits a month for the free version) and events.
Typically you would use the same property code on all pages on the same site so you can see all that data together (though with option to drill down).
You would only use a new property code for a new site (though your subdomain might qualify for that if you want to track it separately).
So the two questions you want to ask yourself are:
Do you want to be able to report on two pages together? E.g. To see that your site gets 10,000 hits and 20% are for this page and 5% are for that page. Or people start at this page and then go to that page and then on to this page. If so it should be the same analytics property.
Do different people need to see these page stats? And is it a problem if they do? If so, put them in a separate property so you can set permissions separately.
It sounds like these are part of the same site, so I'd veer towards tracking them together in the same property.
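For reference, this is what the standard analytics.js snippet of the time looks like; the property ID is a placeholder, and the point is that the same ID goes on every page (and subdomain) you want reported together:

```html
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','//www.google-analytics.com/analytics.js','ga');

ga('create', 'UA-XXXXX-Y', 'auto');  // same placeholder property ID on every page;
ga('send', 'pageview');              // 'auto' sets the cookie so subdomains are covered
</script>
```

If you later decide the subdomain should be reported separately, you would create a second property and use its ID only there.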
On a different note, you should set one page as the main version (with a rel canonical tag) and redirect the other versions to that page, to avoid search engines thinking you have duplicate content. Do you have a reason for having the same content at two different addresses? It can cause SEO and other problems.

Reselling products from another site - should I worry about duplicate content?

I want to sell some products that are also present on another webshop. They provide a datafeed with all the information about each product, and they have nothing against me posting the info on my webshop.
The question is: should I worry about duplicate content? The number of products is too high and it's not worth rewriting their descriptions. Will Google think that I stole the content?
Depends.
Personally, I would prevent Google from indexing duplicate-content pages by adding this to the <head>...</head>:
<meta name="robots" content="noindex,follow"/>
The URLs in question won't rank anyway, so it's (usually) OK to keep them completely out of Google's sight and stop worrying about all the algorithm updates.
Or, if I had a lot of pages and needed more crawl budget, I would use the robots.txt file:
User-agent: *
Disallow: /path/to/affiliate/products/
In this case the link juice can no longer flow freely within the site, but all the important pages get indexed. Plus, it's incredibly easy to implement. (Just don't do this if you have a lot of deep links to your products from your homepage, etc.)
Matt Cutts in 2009:
"Can product descriptions be considered duplicate content?"
http://www.youtube.com/watch?v=z07IfCtYbLw
He doesn't say "it's bad", but he clearly shows that Google doesn't like it.
Matt Cutts in 2012:
"Is it useful to have a section of my site that re-posts articles from other sites?"
http://www.youtube.com/watch?v=o7sfUDr3w8I - he states that it's probably a good idea to remove duplicate-content pages (like content from RSS feeds, press releases, or product description feeds).
So to make a long story short: I'm really not saying "start panicking" or anything. I'm just saying "remove everything from your site that could send negative signals to Google, so you don't have to worry about it anymore" - then you can go on and build up your brand to sell as many products as possible ;o)
Don't worry about the content; the site comes under the category of affiliate sites, so the product descriptions would be the same. It won't affect your site.
If you want to do it properly, I would get all the content rewritten. There is an amazing service out there called wordai.com.
Their site will rewrite the content for you as if a human had, on their Turing plan.
You can then check the content with copyscape.com to see how unique it is!
Best of luck.

What's more important in SEO: Title or link data?

I'm developing a store locator web site where users may search for a brand and get a list of stores selling this brand.
Now I'm doing some SEO. My goal is that when someone googles a store name, or store name + city, my site will be listed on page one.
If you visit a store on my site today, the title will show:
storename, city, country - at mysite.com
My URL will look like this:
http://mysite.com/store/?store=Mardou+&+Dean&storeid=5459
My question is:
- Should I add city name and country in my URL?
- Would it be good or bad in terms of SEO to have this url:
http://mysite.com/store/Norway/Oslo/Mardou+&+Dean/?storeid=5459
In terms of usability, the last URL is best, but I'm not sure if it matters to search engines.
I know that there is a lot more to SEO, but now I'm just wondering about this part.
It depends on how much information you want to expose to search engines; you should adjust the data used in your microdata accordingly.
This article is worth reading:
http://www.vanseodesign.com/web-design/html5-microdata/
The impact of keywords in the URL is not as important as you might think. Having your targeted terms in the page title, content, headings, and internal and external backlink anchor text are all much more powerful signals.
Don't confuse "exact match domains" with exact match URLs.
For more "educated" "opinion" on what matters, take a look at http://www.seomoz.org/article/search-ranking-factors
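Whichever structure you choose, one detail worth fixing either way: the bare & in the current URL (?store=Mardou+&+Dean&storeid=5459) splits the query string, so the store parameter is truncated at "Mardou+". A sketch of the head section for the path-based version, with the ampersand percent-encoded (the URLs are the question's own examples):

```html
<!-- Page at: http://mysite.com/store/Norway/Oslo/Mardou+%26+Dean/?storeid=5459 -->
<title>Mardou &amp; Dean, Oslo, Norway - at mysite.com</title>
<link rel="canonical" href="http://mysite.com/store/Norway/Oslo/Mardou+%26+Dean/?storeid=5459" />
```

The canonical tag also protects you if the same store page is reachable under both the old query-string URL and the new path-based one.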