Will redirecting an old unused domain to a new domain help increase Google authority? - seo

I had been blogging for five years, but last year I stopped and deleted the domain's content. (Over the last year all of its pages were removed from the index.)
Now I have purchased a new domain, BlogTechie, and I am planning to 301 redirect the old domain to the new one.
Will this help me gain SEO authority in Google, or should I start from scratch without worrying about the old domain?
I am also adding settings in Webmaster Tools to inform Google of the change.

SEOs attribute a large portion of most search engines' ranking algorithms to link-based factors. There may well be old links to your pages still out there on other websites. If you still own the old domain, you can capitalize on those links and boost your new domain's ranking with redirects.
If you know some of your older content's URLs, it makes sense to set up one-to-one redirects to the corresponding new pages. If you're using Apache, you can do this with an .htaccess file:
RewriteEngine On
# One-to-one redirect for a known old page
RewriteRule ^folder/oldpage\.php$ http://www.newdomain.org/newpage.php [R=301,L]
Anything remaining can redirect to the root.
RewriteRule ^(.*)$ http://newdomain.com/ [R=301,L]
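If mod_rewrite isn't available, a sketch of the same mapping using mod_alias alone (same placeholder URLs as above) would be:
# mod_alias applies the first matching directive, so the specific page must come before the catch-all
RedirectMatch 301 ^/folder/oldpage\.php$ http://www.newdomain.org/newpage.php
RedirectMatch 301 ^ http://newdomain.com/
Either way, avoid mixing mod_alias and mod_rewrite rules for the same URLs, since the two modules run at different stages and the interaction can be confusing.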
Check out Moz for more explanation of redirection: http://moz.com/learn/seo/redirection

Related

guidelines for htaccess code to transition users to new website

Suppose I have an old website with 10 webpages, and I want to deprecate it and move visitors to a new website with 100 webpages.
Two of the webpages on the old website map directly to two corresponding webpages on the new website. All other pages on the old website should go to the new website's home page. How do I set that up?
I know I can use .htaccess in the public_html folder of the old website to create some permanent redirect rules for individual pages. So for the two pages that need to map one-to-one, I can do:
Redirect 301 /oldfile1.htm http://www.example.net/newfile1.htm
Redirect 301 /def/oldfile2.htm http://www.example.net/123/456/newfile2.htm
But what about all the other webpages on the old website? This is where my knowledge of .htaccess is lacking. Does .htaccess stop executing when it reaches one of the redirects above? If so, perhaps I can simply place the following AFTER the above code to catch the remaining pages?
Redirect 301 / http://www.example.net/
Or something else? Also, will the redirect directly above map every webpage it sees to the home page of the new website (I assume so), or to a matching directory/webpage on the new website? That is, I don't want http://www.olddomain.com/abc/Oldfile1234.html to take users to http://www.example.net/abc/Oldfile1234.html on the new website; it should take them to http://www.example.net, since most webpages do not map one-to-one.
Lastly, in the .htaccess file in the old website's public_html directory, how do I account for users coming from https versus http, and www versus non-www URLs?
I'm hoping there's a common strategy people use for this sort of thing, since it should be fairly common, so I don't have to re-invent one.
You should use RedirectMatch for precise matching with regular expressions. mod_alias processes these directives in order and applies the first one that matches, so put the catch-all last. You can place these rules in the old site's root .htaccess:
RedirectMatch 301 ^/oldfile1\.htm$ http://www.example.net/newfile1.htm
RedirectMatch 301 ^/def/oldfile2\.htm$ http://www.example.net/123/456/newfile2.htm
RedirectMatch 301 ^ http://www.example.net/
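As for the last part of the question: these rules apply to whatever request reaches the old site's .htaccess, so http versus https and www versus non-www are all covered, as long as every variant resolves to the same document root (and the https variants have a valid certificate). If you'd rather do everything with mod_rewrite, a rough, untested sketch using the same URLs from the question would be:
RewriteEngine On
# One-to-one mappings first; [L] stops rule processing once a rule matches
RewriteRule ^oldfile1\.htm$ http://www.example.net/newfile1.htm [R=301,L]
RewriteRule ^def/oldfile2\.htm$ http://www.example.net/123/456/newfile2.htm [R=301,L]
# Everything else, from any host or scheme variant, goes to the new home page
RewriteRule ^ http://www.example.net/ [R=301,L]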

Using .htaccess to remove www canonical: should I still verify www and non-www in webmaster tools?

I have an SEO guy who is confusing me. He mentioned that in Google Webmaster Tools I should verify the www version of a site along with the non-www version (the non-www is already verified). So I informed him there's no need, because I use .htaccess to 301 redirect all www URLs to the non-www URL for canonical reasons (as Matt Cutts recommends). He tells me I should still verify both versions.
I see verifying as telling Google I want them to index things on a certain domain/subdomain... and the whole point of the canonical redirect is that I DON'T want them indexing www subdomain URLs! Not to mention, if every request to the www subdomain is redirected, is it even possible to verify it?
Should I try to do this or should I not verify the www?
Your SEO guy is not an "SEO moron". You can verify both the www and the non-www prefix in Google Webmaster Tools, no problem, and then select which one is the preferred domain, www or non-www. You cannot select a preferred domain in GWT without verifying both prefixes.
Google explains this here:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=44231
That "SEO guy" is an "SEO moron". If your site can be pulled up with the www and without it you are technically serving up duplicate content and is exactly what Google doesn't want. You are much smarter then that "SEO guy". You're doing it the right way. Don't change a thing.
You need to verify both to prove you are the owner of both, because technically with and without www are in fact different websites. www is just a sub-domain, and can point to a different site if need be.
So you must prove to Google you manage both. Then because Google knows you manage both, and have the authority to specify what to do with both domains, you then tell Google the site is to use www (or not, whichever you want), and you keep the redirects.
Google will not index a domain that 301 redirects to another domain.
I do agree that it's not nice to have both www and non-www listed separately in Webmaster Tools, though. But you could in theory have loads and loads of different sites as sub-domains, so they must be treated as different sites by Google.
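For reference, the kind of www-to-non-www redirect kept in place alongside the verification usually looks something like this (example.com stands in for the real domain; a generic sketch, not the asker's actual file):
RewriteEngine On
# Send any www request to the bare domain, keeping the path
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]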

Proper 301 redirect for sites

I have a bit of a complex question. I am moving sites from
http://www.hikingsanfrancisco.com
to
http://www.comehike.com
The directory structures will not be the same on the two sites. What are some best practices I can follow to retain most of my existing SEO strength, both for the domain overall and for individual pages in searches related to them?
Thank you,
Alex
If most of the URLs are staying the same and just the domain is changing, you could create an .htaccess file in the root folder at the old site with the following:
Options +FollowSymLinks
RewriteEngine on
RewriteRule (.*) http://www.comehike.com/$1 [R=301,L]
This will make hikingsanfrancisco.com/some-page go to comehike.com/some-page.
Otherwise in that same htaccess file you could add a line for each redirect. So if hikingsanfrancisco.com/big-hikes is now going to comehike.com/even-bigger-hikes the redirect would look like:
Redirect 301 /big-hikes http://www.comehike.com/even-bigger-hikes
That 301 tells Google to now consider the new URL correct.
To redirect the whole site no matter what to the new URL you could use this:
Redirect 301 / http://www.comehike.com/
A page-by-page 301 redirect is the best option (regular expressions make this easier if the old URLs follow a pattern; see the sketch below). Redirect each old page to the page on the new site with the most similar content.
Use the Change of Address tool in Google Webmaster Tools.
Try to contact some of the sites linking to you and ask them to update the links that point at your old site.
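For example, if a whole section keeps its slugs but moves under a different prefix, one RedirectMatch per section can cover it, followed by a catch-all. The paths below are hypothetical, just to show the shape:
# Hypothetical section move: old /hikes/... pages now live under /trails/... on the new domain
RedirectMatch 301 ^/hikes/(.*)$ http://www.comehike.com/trails/$1
# Anything not matched above goes to the new home page
RedirectMatch 301 ^ http://www.comehike.com/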

How to prevent a search engine from indexing a directory for a particular domain?

I have a web hosting package with two domains pointing to it. I've noticed that Google has indexed the directory of one of the domains under the other domain. Is there a way of preventing this from happening?
You could try the Robots Exclusion Standard (robots.txt), but it is no guarantee.
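A minimal robots.txt in the main domain's document root, disallowing the other site's folder, might look like the following (the folder name is a stand-in, since the real one isn't given); crawlers that ignore robots.txt will still fetch the pages:
User-agent: *
Disallow: /other-site-folder/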
Redirect all pages of one of your domains to the other one. You can do that with .htaccess and modRewrite similar to this:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
This would perform a 301 redirect (Permanently moved) from example.com to www.example.com.
For SEO purposes you never want to have duplicate content (identical pages on different URLs), there should always be exactly one URL for your content, all other possible URLs should redirect to that one.
Updating your robots.txt will definitely solve the problem going forward, but I think the question you should be asking is: how did Google know those pages were there?
First, you should ensure that a user can't browse your site's directory listings (if your server is *nix, your .htaccess should have something like Options -Indexes). If you had a public link anywhere that joined the two sites on a single domain, that could be how Google found it. If you are careful to keep your site clean and never link to the files in the other docroot, there should be no problem hosting one domain out of a subdirectory of another domain.
You can clear Google's index of those pages by using their Webmaster Tools. In order to identify yourself as the site's owner, you'll need to install a unique file (they create it for you) in the root directory of your various document roots, then you can manually update the parts of your site that they've indexed. This applies only to Google.
If you've been indexed by other search engines (and you probably have been if Google indexed you), you should try to figure out how they got there, fix the problem, move the second site to another folder (causing the old pages to return 404 Not Found on your main domain), and then get the search engines to reindex.
If you are using Linux, then some additions to your .htaccess file would probably work, but the specifics would depend on your site setup.
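As one possible shape for those additions (the hostname and folder name below are placeholders, since the actual setup isn't described), the main domain's .htaccess could send any request for the second site's folder that doesn't arrive on the second domain itself over to that domain:
RewriteEngine On
# If the request did not come in on the second site's own hostname...
RewriteCond %{HTTP_HOST} !^(www\.)?seconddomain\.com$ [NC]
# ...but asks for its folder, redirect it to the second domain, keeping the path
RewriteRule ^other-site-folder/(.*)$ http://www.seconddomain.com/$1 [R=301,L]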

Multiple domains for one site: alias or redirect?

I'm setting up a number of sites right now and many of them have multiple domains. The question is: do I alias the domains (with ServerAlias) or do I redirect the requests?
Obviously ServerAlias is better/easier from a readability or scripting perspective. I have heard however that Google likes it better if everything redirects to one domain. Is this true? If so, what redirect code should be used?
Common vhost examples will have:
ServerName example.net
ServerAlias www.example.net
Is this wrong? Should the www also be a redirect, in addition to example2.net and www.example2.net? Or is Google smart enough to know that all these sites (or at least the www) are the same site?
UPDATE: Part of the reasoning for wanting aliases is that they are much faster. A redirect for a dialup user just because they did (or didn't) use the www adds significantly to initial page load.
UPDATE and ANSWER: Thanks Paul for finding the Google link which instructs us to "help your fellow webmasters by not perpetuating the myth of duplicate content penalties". Note, however, this only applies to content ON THE SAME SITE, exemplified in the article with "www.example.com/skates.asp?color=black&brand=riedell or www.example.com/skates.asp?brand=riedell&color=black". In fact, the article explicitly says "Don't create multiple pages, subdomains, or domains with substantially duplicate content."
Redirecting is better; then there is always one canonical domain for your content. I hear Google penalises multiple domains hosting the same content, but I can't find a source for that at the moment (edit: here's one article, but from 2005, which is ancient history in Internet years!) (not correct, see edit below).
Here are some mod_rewrite rules to redirect to a canonical domain (they work in a vhost or an .htaccess):
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.foobar\.com [NC]
RewriteCond %{HTTP_HOST} !^$
RewriteRule ^/?(.*) http://www.foobar.com/$1 [L,R=permanent]
That checks that the host isn't the canonical domain (www.foobar.com) and checks that a domain has actually been specified, before deciding to redirect the request to the canonical domain.
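If you control the vhost configuration, an alternative is a dedicated VirtualHost that catches the non-canonical names and redirects with mod_alias, which appends the request path automatically. A sketch, keeping www.foobar.com as the canonical name (otherdomain.com is a placeholder):
<VirtualHost *:80>
    # All the non-canonical names land here...
    ServerName foobar.com
    ServerAlias otherdomain.com www.otherdomain.com
    # ...and are sent, path included, to the canonical host
    Redirect permanent / http://www.foobar.com/
</VirtualHost>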
Further edit: here's an article straight from the horse's mouth; it seems it's not as big an issue as you might think. Please read the article CAREFULLY, as it distinguishes between duplicate content on the same site (as in "www.example.com/skates.asp?color=black&brand=riedell" and "www.example.com/skates.asp?brand=riedell&color=black") and specifically says "Don't create multiple pages, subdomains, or domains with substantially duplicate content."
SSL certificates can also be an issue (wild card certs mitigate this but are more expensive).
So if the cert is only bound to www.example.com, it won't validate for example.com. If this applies to your case, then carefully handling redirects and hyperlink references in your HTML and JavaScript is very important.
If they are entirely different domain names, you will want to redirect, because otherwise cookies cannot be shared between the two. If a user logs into your website at example1.com, they will need to log in again if they visit example2.com.
If they are just different subdomains (example.com vs www.example.com) this won't matter.
Server aliasing can cause problems with CGI session continuity. Since cookies are attached to the domain they were served from, CGI scripts have to be written carefully so that they are aware of the aliasing, or all links within and into the site have to be relative, or both. It is much harder to avoid niggly little hard-to-debug problems caused by the browser serving you different cookies depending on whether the user last entered your site through name.tld or www.name.tld.
Nowadays I doubt it matters. If you see both entries in google, then you know you're doing it wrong.
If half the links to your site refer to one URL and half refer to the other, each URL is only going to get half the PageRank. Even if Google doesn't penalize your rank for duplicate content, you're going to suffer.