How do I disallow URLs for specific countries in a robots.txt file?

I have a WordPress multisite with a setup like this:
domain.com (main version for Australia)
domain.com/us
domain.com/eu
...
I want Australian users searching on Google to see only the main URL, domain.com; if someone in the USA or Canada searches, only domain.com/us should appear in Google's results; similarly, all Europeans should see only the domain.com/eu version.
What is the optimal approach?
Thanks.
Currently I'm using a redirection plugin, which I think blocks Google from crawling all the URLs. So I figured that maybe the robots.txt file could fix the issue.

Related

Add multiple domains for the same website in Google Webmasters

I have 3 domains; these two:
http://www.janhendrikx.be
http://www.standenbouw-jan.be
redirect to
http://www.ontwerpbureaujan.be/
In Google Webmasters I added ontwerpbureaujan. Do I have to add the others too, or do I get duplicate content? Do I have to use canonical URLs? How?
'Standenbouw' is my main SEO keyword; maybe I should add http://www.standenbouw-jan.be to Webmasters instead of ontwerpbureaujan? ...
First, make sure janhendrikx.be & standenbouw-jan.be have 301 redirects to ontwerpbureaujan.be.
If not, please have your web host set them up; otherwise Google will treat them as three different websites and crawl them all.
If they do have 301 redirects, you can add the others too, and Google will know they are the same site and won't crawl all of them.
You could still add all of them, which may improve your ranking for the keyword; in that case, set http://www.ontwerpbureaujan.be/ as your preferred domain in the site settings area.
Here are a few recommendations by Google itself:
https://support.google.com/webmasters/answer/139066?hl=en#4
Good luck!
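As for the "How?" on canonical URLs: a minimal sketch, assuming http://www.ontwerpbureaujan.be/ is the preferred domain, is a rel="canonical" link in the <head> of every page, pointing at the matching URL on the preferred domain:
<link rel="canonical" href="http://www.ontwerpbureaujan.be/" />
That way, even if Google crawls janhendrikx.be or standenbouw-jan.be before the 301s are in place, it knows which version belongs in the index.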

SEO when subdomains point to the same site?

My subdomains are going to be city names:
miami.mysite.com
newyork.mysite.com
I don't know how most sites handle subdomains. My idea is simply to point them all to mysite.com, detect the subdomain name with PHP, and then echo the city-specific posts and content.
Provided all subdomains have different titles and descriptions, will Google index each subdomain as a different website?
Yes, Google will index each one as a separate site. However, make sure you consider the pros and cons. Here's a good starting point: http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites
My opinion is to go with subfolders (e.g. mysite.com/miami) instead of subdomains, mainly because consolidating inbound links on a single hostname will build more authority over time than if the same link juice is diluted among hundreds of subdomains. I also think it would be hard to build enough unique content on each subdomain to support or justify a separate site.
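For reference, the subdomain-detection idea from the question is straightforward in PHP. A minimal sketch, assuming every subdomain points at the same document root; get_city_posts is a hypothetical helper, just to show where the city name would be used:
<?php
// Extract the subdomain, e.g. "miami" from miami.mysite.com.
// HTTP_HOST is filled in by the web server from the Host header.
$host = strtolower($_SERVER['HTTP_HOST'] ?? '');
$parts = explode('.', $host);
// Anything before mysite.com is treated as the city; fall back
// to a default for the bare domain or www.
$city = (count($parts) > 2 && $parts[0] !== 'www') ? $parts[0] : 'default';
// Hypothetical helper that fetches and echoes posts for that city.
echo get_city_posts($city);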

Is an addon domain affecting SEO?

I am just a learner in the field of SEO. I have a main domain and an addon domain, and both have separate websites. Say main.com is my main domain and addon.com is my addon domain, which points to a subdirectory called "addon".
I can access addon.com in the following 3 ways:
addon.com
main.com/addon
addon.main.com
Are these URLs indexed separately by search engines? If so, how can I prevent this?
Does a search engine treat main.com/addon as a page of main.com?
I am not sure whether I need to worry about all this or should just leave it as it is. I searched Google but couldn't find the right answer.
It may be too late to answer, but it may benefit others.
The primary domain and the subdomain or addon domain will not be linked by search engines automatically, unless you link them purposely or inadvertently. The inadvertent case happens only when all of these conditions are true:
Your web root (normally public_html) has no index page.
Directory indexing of your web root is enabled, eventually exposing/linking your subfolder (the one attached to your addon domain) to Google and the entire world.
In that scenario a robots.txt solution is not recommended, because search engines may ignore robots.txt rules.
Reference
Google will only index pages if they are linked to or listed in the sitemap. You can stop addon.main.com or main.com/addon from being indexed by using noindex tags:
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
or disallowing it in the robots.txt
The search engine will consider main.com/addon a page of main.com. If the sites are completely separate, I'd recommend using a separate domain (preferably a keyword-rich domain), but it's up to you really.
We have three domain names with the same content, and each of the three returns a 200 OK HTTP code, so it looks like duplicates of the same content. A canonical tag on every page would be better.
The best option would be to create a redirect in the subdomain panel in cPanel, so that at least addon.main.com redirects to addon.com.
Then you can add a robots.txt to the root path of the primary domain containing:
User-agent: *
Disallow: /addon/
so that no robot will crawl main.com/addon. (Note that Disallow: / would block the whole of main.com, not just the addon folder.)
Google gives less weight to a site hosted on a subdomain of another domain.
That is super bad for SEO.
If you are hosting for SEO and love the convenience of cPanel, then forget hosting domains as addon domains.
#Vasanthan R.P.
It's an excellent question, often overlooked by SEO professionals. +1 for you.

Using DNS to Redirect Several Domains to a Single Site's Content. Disaster?

When searching for our website on Google, I found three sites with the same content showing up. I always thought we were using only one site, www.foo.com, but it turns out we also have www.foo.net and www.foo.info with the same content as www.foo.com.
I know it is extremely bad to have the same content under different URLs. Yet it seems we have been using three domains for years, and I have not seen any punitive blow so far. What is going on? Is Google using a new policy, like this blog advocates? http://www.seodenver.com/duplicate-content-over-multiple-domains-seo-issues/ Or is it OK to use a DNS redirect? What should I do? Thanks
If you are managing the websites via Google Webmaster Tools, it is possible to specify the "primary domain".
However, the world of search engines doesn't stop with Google, so your best bet is to send a 301 redirect to your primary domain. For example:
www.foo.net should 301 redirect to www.foo.com
www.foo.net/bar should 301 redirect to www.foo.com/bar
and so on.
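On Apache, a minimal sketch of such a redirect, assuming mod_rewrite is enabled and this goes in the .htaccess served for www.foo.net and www.foo.info:
RewriteEngine On
# Anything not already on www.foo.com gets a permanent (301)
# redirect to the same path on www.foo.com.
RewriteCond %{HTTP_HOST} !^www\.foo\.com$ [NC]
RewriteRule ^(.*)$ http://www.foo.com/$1 [R=301,L]
The captured path ($1) is what makes www.foo.net/bar land on www.foo.com/bar rather than the homepage.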
This will ensure that www.foo.com gets the entire score, rather than (potentially) a third of the score that you might get for link-backs (internal and external).
Look into canonical links, as documented by Google.
If your site has identical or vastly similar content that's accessible through multiple URLs, this format provides you with more control over the URL returned in search results. It also helps to make sure that properties such as link popularity are consolidated to your preferred version.
They explicitly state it will work cross-domain.

Will IP Masking Affect SEO Results / Ranking?

When users enter the domain www.example.com, it has to detect their country from the IP address and redirect them to a country-specific domain, e.g. www.example.co.in. Will search engine crawlers recognize both www.example.com and www.example.co.in? Will this affect the search engine ranking?
Could someone guide me on the disadvantages of using IP masking?
Thanks & Regards,
Kavitha
I think it is interesting to note that Google returns HTTP/1.1 302 Found to redirect you to your country-specific domain when you visit google.com from any country outside the US.
I suggest reading Matt Cutts's article (he is a Google software engineer) on how Google handles the 302 redirect: SEO advice: discussing 302 redirects.
Different search engines handle the 302 redirect in a different way. Google also makes a distinction between redirects towards the same domain, and off-domain redirects. In general, using redirects will make your SEO more complicated and very tricky, and you risk having your original domain ignored by search engines.
You may also want to check out the following article on how the Google crawler handles the various HTTP status codes: Google Webmaster Central - HTTP status codes.
The HTTP/1.1 302 Found code is used for temporary redirects, so it could work. However, it is not recommended, as Google will be unable to associate the domain example.com with any content at the other domains. Basically, the Google spider will be redirected too, based on its IP, and only the (presumably) English content will be indexed for this domain. If that's OK with you, then you are set.
Please be advised that this is a big no-no(!) from a usability perspective, and users everywhere hate this behaviour. Google itself offers a link to go to google.com even after redirecting you.
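If you implement such a redirect in PHP yourself, note that header('Location: ...') sends a 302 by default, so a permanent redirect has to be requested explicitly. A minimal sketch; lookup_country is a hypothetical helper standing in for whatever GeoIP lookup you use:
<?php
// Hypothetical lookup mapping the visitor's IP to a country code.
$country = lookup_country($_SERVER['REMOTE_ADDR']);
if ($country === 'IN') {
    // The third argument forces a 301 Moved Permanently;
    // without it PHP would send 302 Found.
    header('Location: http://www.example.co.in/', true, 301);
    exit;
}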