What is a good way to do multi-language sites in Umbraco?

We are going to implement our site in both English and German. What is the best solution for redirecting customers to the right site? I thought of having the domains like www.mydomain.com/en and www.mydomain.com/de. My question is: what is the best way to redirect the user to the right site?
Should I have a landing page at www.mydomain.com where you could switch language, or should I simply look at browser settings and redirect to the appropriate version of the site?
Also, are there any SEO issues with this setup?

The best solution is often to use two domains, with two site trees in Umbraco and the hostnames/domains set on each tree. The domain name counts for SEO.
It can also work fine with one domain. A solution I use is to put a redirect on www.mydomain.com that sends the visitor to /en or /de:
@inherits Umbraco.Web.Mvc.UmbracoTemplatePage
@{
    Layout = null;

    // Read the visitor's preferred language from the Accept-Language header
    var language = "";
    string[] userLang = Request.UserLanguages;
    if (userLang != null && userLang.Length > 0)
    {
        language = userLang[0];
    }

    if (language.StartsWith("de", StringComparison.OrdinalIgnoreCase))
    {
        Response.Redirect("/de");
    }
    else
    {
        Response.Redirect("/en");
    }
}
This code only checks the browser language; you can also use a geo/IP database, or set a cookie once you know the visitor's language. It depends on your situation: do your visitors want a landing page to choose a language? For SEO it does not matter whether you have one or not.
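If you go the cookie route, a minimal sketch of the same redirect page could look like this (an illustration only; the cookie name "lang" and the one-year lifetime are assumptions, adjust them to your setup):
@inherits Umbraco.Web.Mvc.UmbracoTemplatePage
@{
    Layout = null;

    // "lang" is a hypothetical cookie name used here for illustration
    var cookie = Request.Cookies["lang"];
    var language = cookie != null ? cookie.Value : "";

    if (string.IsNullOrEmpty(language))
    {
        // Fall back to the browser's Accept-Language header
        var userLang = Request.UserLanguages;
        language = (userLang != null && userLang.Length > 0) ? userLang[0] : "en";

        // Remember the choice so returning visitors skip detection
        Response.Cookies.Add(new HttpCookie("lang", language) { Expires = DateTime.Now.AddYears(1) });
    }

    Response.Redirect(language.StartsWith("de", StringComparison.OrdinalIgnoreCase) ? "/de" : "/en");
}
You would also update the cookie whenever the visitor switches language manually, so their explicit choice wins over browser detection.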

Related

Disallow double or junk wildcard subdomains in htaccess mod rewrite for SEO

I have wildcard subdomains enabled on my domain. I use this so that I can rewrite urls like es.domain.com to domain.com/page.php?lang=es and display to the user the local language version of page.php.
The one potential problem I see with allowing wildcard subdomains is that people can link to www.es.domain.com or even anything.they.like.domain.com and it will display a perfectly working clone of the website. I presume this 'duplicate content' is bad for SEO.
Can anyone come up with a RewriteRule which detects subdomains of more than 2 letters (www. excluded of course) and 301 redirects offending urls to the clean base domain.com? I'm having trouble when I consider domains like domain.co.uk which already look like they are on a subdomain.
As a side note, are there any similar implications for SEO on the opposite side of the url, with query parameters? For example, domain.com?param=anything-I-like will surely show a duplicate page. How does Google handle this content?
UPDATE:
Here's the rewrite rule I'm using currently. If I wanted to clean up bad URLs with PHP, I'd need to modify this to catch all subdomains. I need to do this generically (without specifying domain.com) as it's going to be used on a CMS. Any suggestions?
RewriteCond %{HTTP_HOST} ^([a-z]{2})\.
RewriteRule p/(.*) page.php?p=$1&lang=%1
I honestly can't speak to fixing your actual issue, but I can confirm that anything.I.want.domain.com is really, REALLY bad for SEO. I've got two years' experience in the field and I'm currently working on a project cleaning up links for our main U.S. site. A couple of the biggest problems have come from sites just like you described, where there were around 100 *.domain.com. The biggest issue is the effect on trust flow: it basically sends a link's trust rating to 0 and tells Google that not only should this link be disregarded, but the domain it came from and links to should be investigated for potential spamminess.
As to your final question on implications:
Query parameters can be just as helpful or detrimental as any other URL structure, so you want to be careful with those as well. If you've got different language versions of your site, be sure to mark one as the rel-canonical page (especially if you don't have entirely unique content). The thing is, linking structure is important to search engines, but not overly so; it's one of many metrics. I'd be far more concerned about the subdomains. If you happen to be able to sneak some small, basic keywords that help describe the page into your query vars, it could help a bit. I would, however, highly suggest that you have a three or four tiered structure to your site, supported in the URLs.
For example, Google tends to like: domain.com/landingpage/category/subcategory?somevars=44
Going more than three deep spreads you too thin, and less than that makes the site too bulky to navigate. I believe it's covered somewhat here if you've never seen it: http://moz.com/beginners-guide-to-seo
Search Engine Journal, Single Grain, and Moz can answer a lot of your SEO questions, and tools like Majestic, Soolve, Mozcast, and SERPMetrics Flux can help a lot, too. Try doing a little reading and see if you can decide on a good scheme for your links.
Again, sorry, I don't really know any Apache, but hopefully that'll help!
Presumably you have a rewrite rule that takes anything in front of domain.com and puts it into the lang parameter. Rather than having a rewrite rule to do the redirecting, have your page.php script examine the lang parameter and issue a redirect for invalid values.
Thanks to all for the info & replies on this. The solution I've found is to write a more generic .htaccess rule to catch all subdomains and forward them to PHP for processing. PHP then checks if the subdomain is valid and, if not, 301 redirects the visitor to the root domain. This way, if someone links to blah.blah.domain.com, search engines should see that as a link to just domain.com. I'm only using language subdomains on my site, but it should work for any subdomains you want to use.
Here's the htaccess rewrite:
The regex works by finding the last instance of three or more domain-name-valid characters, followed by a dot, followed by any other string. The idea is that it finds the domain name in the URL, then captures everything before it. Obviously this won't work for domains which are shorter than three characters.
#All sub domains are redirected to p.php for processing:
RewriteCond %{HTTP_HOST} ^(.*)\.[a-z0-9\-]{3,}\..*
RewriteRule (.*) p.php?subdom=%1 [L]
Here's the PHP:
function redirect301($page = '/'){
    header("HTTP/1.1 301 Moved Permanently");
    header("Location: {$page}");
    exit();
}

$subdom = isset($_REQUEST['subdom']) ? $_REQUEST['subdom'] : ''; // matches the "subdom" parameter set in the .htaccess rule - sanitise this if using this script!
$defaultLang = 'en';
$alternateLangs = "de|es"; // list of allowed subdomains
$alternateLangs = explode('|', $alternateLangs);
$ISOlangCode = '';

if(!empty($subdom) && $subdom != 'www'){
    if( !in_array($subdom, $alternateLangs) ) redirect301(); // unknown subdomain: redirect to homepage
    $ISOlangCode = $subdom; // en, es, de, etc. - capture code for use later
}
if($defaultLang && $ISOlangCode == $defaultLang) redirect301(); // disallow subdomain for default language (redirect to homepage)
Hopefully this helps someone out.

Is there a way to recognize Google bots with PHP?

I just made a website for an alcoholic drink. They need to have age verification on all links. It's a single-page website and I use the Backbone routing system. I've created the check with the SESSION object, so I am loading the intro view (age verification view) if the SESSION object is unset. This is all working as expected, but the problem is the Google bots. When they try to crawl my pages, the app always loads the intro (age verification) view. Here is a link for the website, but I think it won't be very useful, because I guess that this is more a logical than a technical question...
So, my question is: how do I redirect only visitors and let Google bots see the actual content of the page? Should I use cookies, or is there a way to achieve this with PHP?
Yes. Something like
if (stripos($_SERVER['HTTP_USER_AGENT'], 'Googlebot') !== false) {
    $_SESSION['ageverified'] = true;
    // do more
}
Should work.
See here for all the exact user-agent names and what they crawl.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1061943

Hash character in URLs (accessing and redirecting in Apache)

It looks as though this question has been asked in part by some others, but I can't find the answer I'm looking for specifically, so I thought I'd pose my particular scenario in case anyone is able to help.
We have an old website (developed externally by a third party) that is due to be retired and replaced by a new site designed in house. For reasons best known to themselves, the developers of the old site used the hash character as part of the URL for the old site (www.mysite.com/#/my-content-stuff). To assist with the transition and help with SEO, I need to set up 301 redirects for the top performing URLs from the old site. As I'm now discovering, however, I'm not able to set up a simple redirect in the .htaccess file, as I believe it takes the hash character to be a comment and ignores the remainder of the line. I've tried escape characters, using %23 instead, and wildcard matching; nothing seems to work.
As a workaround, I wondered about simply creating dummy files with the same paths and URLs as the old site had, then creating HTML redirects within them to drive traffic to the correct new pages, but it looks as though the server is doing something similar regarding the hash character in the URL, and ignoring anything after it. So, if I create a sub-folder on my new server called '#' and create a file in there called 'test.html', I expected to be able to just go to 'www.myNEWsite.com/#/test.html', but it just takes me to the default root file of my site.
Please can anyone shed any light on how I might get around this? I must admit I'm not that clued up on Apache so I'm having to learn a lot as I go.
Many thanks in advance for any pointers or info anyone can provide.
Cheers,
Rich
A hash character in the URL specifies the anchor, and it's not even sent to your webserver. A redirect is impossible on the server side, and the old developer probably did it using JavaScript. Implement fallback URLs without the hash instead, and have a global JavaScript script detect these URLs and redirect automatically.
Hash fragments cannot be read by the server. They are regarded as locations within the document and are therefore not exposed to the server; the client is the only one who sees them. The best you could do is use a "meta refresh" tag, or alternatively you could use JavaScript to detect the URL and, if it's one which requires 301 redirection, use "window.location" to move the user to a full URL where mod_rewrite or a PHP page can issue a 301 header.
However, neither approach is SEO-friendly, and they only really solve the issue for users who click an old link on an external site.
<!-- Put in the head tag so the page does not wait to load the content -->
<script type="text/javascript">
if (window.location.hash != "") {
    var h = window.location.hash.match(/#\/?(.*)/i)[1];
    switch (h) {
        case "something_old":
            window.location = "/something_new.html";
            break;
        case "something_also_old":
            window.location = "/something_also_new.html";
            break;
    }
}
</script>

Page contains multiple canonical issues

I am using the IIS SEO Toolkit to analyze our website. From that I got 450 canonical issues. All the errors are in the same format, as follows:
The page with URL "http://dynamicsexchange.com/images/Logoscroll/Images/511201091716pm_a.jpg" can also be accessed by using URL "http://www.dynamicsexchange.com/images/Logoscroll/Images/511201091716pm_a.jpg". Search engines identify unique pages by using URLs. When a single page can be accessed by using any one of multiple URLs, a search engine assumes that there are multiple unique pages. Use a single URL to reference a page to prevent dilution of page relevance. You can prevent dilution by following a standard URL format.
Please help me to solve these problems, with examples. I am using the master page concept.
I am using the IIS web server; please give me the solution to set up the 301 redirect (I am using the master page concept) with an example.
Choose whether you want your visitors to use http://dynamicsexchange.com or http://www.dynamicsexchange.com. Then set up a 301 redirect to redirect any traffic from the one you don't want to the one you do want.
That way any traffic which accidentally links to the wrong one will be redirected to the right one. Search engines will then record all links to the right one, and so your page rank won't be diluted.
How to set up the 301 redirect will depend on what your web server is.
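Since you mention IIS and master pages, here is a minimal ASP.NET sketch of that redirect (an illustration only, assuming you pick the www host as canonical and are on .NET 4 or later; a rule in the IIS URL Rewrite module is an equally valid way to do this). It goes in Global.asax.cs, so it covers every request regardless of which master page is used:
// Global.asax.cs - a sketch: assumes www.dynamicsexchange.com is the host you chose as canonical
protected void Application_BeginRequest(object sender, EventArgs e)
{
    var url = Request.Url;

    // Requests that arrive without the www prefix get a 301 to the canonical host
    if (!url.Host.StartsWith("www.", StringComparison.OrdinalIgnoreCase))
    {
        var canonical = new UriBuilder(url) { Host = "www." + url.Host };
        Response.RedirectPermanent(canonical.Uri.ToString(), endResponse: true);
    }
}
With that in place, links to dynamicsexchange.com/... are consolidated onto www.dynamicsexchange.com/..., which is what resolves the canonical warnings.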

How Can I Deal With Those Dead Links After Revamping My Web Site?

A couple of months ago, we revamped our web site. We adopted a totally new site structure and, specifically, merged several pages into one. Everything looks charming.
However, there are lots of dead links which produce a large number of 404 errors.
So what can I do about it? If I leave it alone, could it bite back someday, say by eating up my PageRank?
One basic option is using 301 redirects; however, that is almost impossible considering the number of them.
So is there any workaround? Thanks for your consideration!
301 is an excellent idea.
Consider that you can take advantage of global configuration to map a whole group of pages; you don't necessarily need to write one redirect for every 404.
For example, if you removed the http://example.org/foo folder, using Apache you can write the following configuration
RedirectMatch 301 ^/foo/(.*)$ http://example.org/
to catch all 404s generated from the removed folder.
Also, consider redirecting selectively. You can use Google Webmaster Tools to check which 404 URIs are receiving the highest number of inbound links and create redirect configuration only for those.
Chances are the number of redirection rules you need to create will decrease drastically.
301 is definitely the correct route to go down to preserve your page rank.
Alternatively, you could catch 404 errors and redirect either to a "This content has moved" type page, or your home page. If you do this I would still recommend cherry picking busy pages and important content and setting up 301s for these - then you can preserve PR on your most important content, and deal gracefully with the rest of the dead links...
I agree with the other posts - using mod_rewrite you can remap URLs and return 301s. Note - it's possible to call an external program or database with mod_rewrite - so there's a lot you can do there.
If your new and old sites don't follow any remappable pattern, then I suggest you make your 404 page as useful as possible. Google has a widget which will suggest the page the user is probably looking for. This works well once Google has spidered your new site.
Along with the other 301 suggestions, you could also split the requested URL string into a search string and route it to your default search page (if you have one), passing those terms automatically to the search.
For example, if someone tries to visit http://example.com/2009/01/new-years-was-a-blast, this would route to your search page and automatically search for "new years was a blast" returning the best result for those key words and hopefully your most relevant article.
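As a rough sketch of that idea (illustrative only; the helper name is made up and the 404-handler wiring will depend on your stack, shown here in C#):
using System;

static class DeadLinkHelper
{
    // Turns a dead URL path like "/2009/01/new-years-was-a-blast"
    // into the search terms "new years was a blast".
    public static string ToSearchTerms(string path)
    {
        var segments = path.Trim('/').Split('/');
        var slug = segments[segments.Length - 1];   // last path segment
        return slug.Replace('-', ' ');              // hyphens become spaces
    }
}
Your 404 handler could then redirect to something like "/search?q=" + Uri.EscapeDataString(terms) and let the site's own search surface the most relevant article.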