I have hundreds of blog posts that need to be redirected due to a site redesign. Currently, the URL for each blog post contains a number followed by the name of the post. For instance, www.mysite.com/2199-this-is-the-blog-post-name. I want to redirect all of the posts to new directories so that the URL will appear as www.mysite.com/new-directory/2199-this-is-the-blog-post-name.
What I want to know is: what is the easiest way to redirect these? I would like to know if there is a way that any string starting with a number, for example 2, could be redirected (I am not worried about non-blog URLs that may start with a number). I have tried several RewriteCond/RewriteRule combinations but have yet to find anything that works.
Try the following in your .htaccess:
RedirectMatch 301 ^/([0-9]+.+)$ /newdir/$1
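If the target folder is the /new-directory/ path mentioned in the question, the same rule adapts directly; the following is an untested sketch with that name substituted in, so change it to whatever directory you actually use:
# Any URL path that begins with one or more digits is sent to /new-directory/
RedirectMatch 301 ^/([0-9]+.+)$ /new-directory/$1
Note that the redirect target itself starts with a letter rather than a digit, so the rule will not loop.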
Related
History/Context: I have a site, 'www.oldexample.com', built in 1998 in HTML 4.01 Transitional on an Apache/cPanel server. Until last fall our main keywords got us into the top 10. After the mobile changes, Panda, etc., we dropped to page 2 or 3 for all but one very specific keyword. The old site, 'www.oldexample.com', has many good backlinks and history in Google and all the main directories. I am now rebuilding a test site on 'mycompany.myshopify.com', as it addresses all the Google error issues on the old site. I have set up 'www.newexample.com' to point to the Shopify site, which is served under 'www.newexample.com'; the myshopify.com URL does not show up at all.
Question: If I were to do a cPanel 301 redirect of the whole of 'oldexample.com' to 'newexample.com', would I still benefit from the many links and history of the old site?
When you say that the shopify URL doesn't show at all, do you mean it's not showing when you search for those keywords, or it's not indexed at all? If it's the latter, prompt Google to index it using Google Search Console. If it's the former, there are a number of things that could have affected this:
the authority of the new site - if you've just launched it, it naturally won't have the authority of the previous site and therefore is less likely to get visibility
you are correct that the backlinks would have played a major part in this. What you need to do is redirect the old domain to the new one you want to appear in Google. For example, if you want to actually take people to newsite.shopify.com, you should redirect the old domain directly to that one. If you redirect the old one to newdomain.com, which you then redirect to newsite.shopify.com, the result won't be the same: link value is lost via chained redirects (see the sketch after this answer). Ideally, you should also get in touch with as many third-party websites linking to your old domain as you can and ask them to update their links to point to newsite.shopify.com
Even if you do that you might still not see those rankings because of various other factors. If you fancy posting the actual URLs and keywords in question, I can spare a few minutes to have a look.
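As a rough illustration of a single-hop, site-wide redirect, here is a minimal .htaccess sketch for the old domain, using the hostnames mentioned in the thread as placeholders (swap in whichever destination you actually want to rank):
# In oldexample.com's .htaccess: send every request straight to the final destination
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?oldexample\.com$ [NC]
RewriteRule ^(.*)$ https://mycompany.myshopify.com/$1 [R=301,L]
Redirecting in one hop is the point here: it avoids the chained redirects described above.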
Hello, I'm rebuilding my website with another CMS and I wondered whether I should use new directory names for the content.
Example:
I have a keyword for which I show up first in Google under the URL
www.domain.com/content/view/articleName
In the new website it is called
www.domain.com/blog/articleTitle
The reason is that I also have
www.domain.com/news/articleTitle
www.domain.com/events/eventName
Will it be bad for my SEO?
How should I do it correctly?
Make sure the URLs are the same as before, with the same content on them; otherwise you may lose rankings.
Or you can 301 redirect the old URLs to the new ones, but that's not recommended because not all link juice will be passed and you may lose a few ranking positions.
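If you do go down the redirect route and the article slug stays identical between the old and the new CMS, a single pattern-based rule is enough. This is a sketch under that assumption; if the slugs change, you would need a per-URL mapping instead:
# Old /content/view/<slug> URLs are permanently redirected to the new /blog/<slug> structure
RedirectMatch 301 ^/content/view/(.*)$ /blog/$1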
I have wildcard subdomains enabled on my domain. I use this so that I can rewrite URLs like es.domain.com to domain.com/page.php?lang=es and display to the user the local-language version of page.php.
The one potential problem I see with allowing wildcard subdomains is that people can link to www.es.domain.com or even anything.they.like.domain.com and it will display a perfectly working clone of the website. I presume this 'duplicate content' is bad for SEO.
Can anyone come up with a RewriteRule which detects subdomains of more than 2 letters (www. excluded, of course) and 301 redirects offending URLs to the clean base domain.com? I'm having trouble when I consider domains like domain.co.uk, which already look like they are on a subdomain.
As a side note, are there any similar implications for SEO on the opposite side of the url, with query parameters? For example, domain.com?param=anything-I-like will surely show a duplicate page. How does Google handle this content?
UPDATE:
Here's the rewrite rule I'm using currently. If I wanted to clean up bad URLs with PHP, I'd need to modify this to catch all subdomains. I need to do this generically (without specifying domain.com) as it's going to be used on a CMS. Any suggestions?
RewriteCond %{HTTP_HOST} ^([a-z]{2})\.
RewriteRule p/(.*) page.php?p=$1&lang=%1
I honestly can't speak to fixing your actual issue, but I can confirm that anything.I.want.domain.com is really, REALLY bad for SEO. I've got two years' experience in the field and I'm currently working on a project cleaning up links for our main U.S. site. A couple of the biggest problems have come from sites just like you described, where there were around 100 *.domain.com. The biggest issue is the effect of this problem on trust flow: it basically sends a link's trust rating to 0 and tells Google that not only should this link be disregarded, but the domain it came from and links to should be investigated for potential spamminess.
As to your final question on implications:
Query parameters can be just as helpful or detrimental as any other URL structure, so you want to be careful with those as well. If you've got different language versions of your site, be sure to designate one (especially if you don't have entirely unique content) as the rel-canonical page. The thing is, linking structure is important to search engines, but not overly so; it's one of many metrics. I'd be far more concerned about the subdomains. If you happen to be able to sneak some small, basic keywords that help describe the page into your query vars, it could help a bit. I would, however, highly suggest that you have a three- or four-tiered structure to your site, supported in the URLs.
For example, Google tends to like: domain.com/landingpage/category/subcategory?somevars=44
Going more than three deep spreads you too thin and less than that makes the site too bulky to navigate. I believe it's covered somewhat here if you've never seen it: http://moz.com/beginners-guide-to-seo
Search Engine Journal
Single Grain and
Moz
can answer a lot of your SEO questions and tools like:
Majestic
Soovle
Mozcast
SERPMetrics Flux
can help a lot, too. Try doing a little reading and see if you can decide a good scheme for your links.
Again, sorry, I don't know really any Apache, but hopefully that'll help!
Presumably you have a rewrite rule that takes anything in front of domain.com and puts it into the lang parameter. Rather than having a rewrite rule to do the redirecting, have your page.php script examine the lang parameter and issue a redirect for invalid values.
Thanks to all for the info and replies on this. The solution I've found is to write a more generic .htaccess rule to catch all subdomains and forward them to PHP for processing. PHP then checks if the subdomain is valid and, if not, 301 redirects the visitor to the root domain. This way, if someone links to blah.blah.domain.com, search engines should see that as a link to just domain.com. I'm only using language subdomains on my site, but it should work for any subdomains you want to use.
Here's the htaccess rewrite:
The regex works by finding the last instance of three or more domain-name-valid characters, followed by a dot, followed by any other string. The idea is that it finds the domain name in the URL, then captures everything before it. Obviously this won't work for domains that are shorter than three characters.
# All subdomains are redirected to p.php for processing:
RewriteCond %{HTTP_HOST} ^(.*)\.[a-z0-9\-]{3,}\..*
RewriteRule (.*) p.php?subdom=%1 [L]
Here's the PHP:
function redirect301($page = '/') {
    header("HTTP/1.1 301 Moved Permanently");
    header("Location: {$page}");
    exit();
}

$subdom = $_REQUEST['subdom']; // must match the 'subdom' parameter set by the rewrite rule - sanitise this if using this script!
$defaultLang = 'en';
$alternateLangs = "de|es"; // list of allowed language subdomains
$alternateLangs = explode('|', $alternateLangs);
$ISOlangCode = null;

if (!empty($subdom) && $subdom != 'www') {
    if (!in_array($subdom, $alternateLangs)) redirect301(); // unknown subdomain: redirect to the homepage
    $ISOlangCode = $subdom; // en, es, de, etc. - capture the code for use later
}

// disallow a subdomain for the default language (redirect to the homepage)
if ($defaultLang && $ISOlangCode == $defaultLang) redirect301();
Hopefully this helps someone out.
A couple of months ago, we revamped our website. We adopted a totally new site structure and, specifically, merged several pages into one. Everything looks charming.
However, there are lots of dead links which produce a large number of 404 errors.
So what can I do about it? If I leave it alone, could it bite back someday, say by eating up my PageRank?
One basic option is to use 301 redirects; however, that seems almost impossible considering the number of dead links.
So is there any workaround? Thanks for your consideration!
301 is an excellent idea.
Consider that you can take advantage of pattern-based configurations to map a whole group of pages. You don't necessarily need to write one redirect for every 404.
For example, if you removed the http://example.org/foo folder, using Apache you can write the following configuration
RedirectMatch 301 ^/foo/(.*)$ http://example.org/
to catch all 404s generated from the removed folder.
Also, consider redirecting selectively. You can use Google Webmaster Tools to check which 404 URIs are receiving the highest number of inbound links and create a redirect configuration only for those.
Chances are the number of redirection rules you need to create will decrease drastically.
301 is definitely the correct route to go down to preserve your page rank.
Alternatively, you could catch 404 errors and redirect either to a "This content has moved" type page, or your home page. If you do this I would still recommend cherry picking busy pages and important content and setting up 301s for these - then you can preserve PR on your most important content, and deal gracefully with the rest of the dead links...
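As a minimal sketch of the "catch and explain" approach, assuming a static page such as /content-moved.html exists on the site (a hypothetical name), Apache's ErrorDocument directive can serve it for any missing URL:
# Serve a friendly "this content has moved" page for any 404
ErrorDocument 404 /content-moved.html
The 301s for your most important legacy URLs would still sit alongside this, so only the long tail of dead links falls through to the 404 page.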
I agree with the other posts - using mod_rewrite you can remap URLs and return 301s. Note - it's possible to call an external program or database with mod_rewrite - so there's a lot you can do there.
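For instance, one way to drive the remapping from a lookup table rather than hand-written rules is RewriteMap. This is a rough sketch with made-up file paths; note that RewriteMap has to live in the server or virtual-host configuration, not in .htaccess:
# Hypothetical map file (/etc/apache2/legacy-redirects.txt) with one "old-path new-url" pair per line
RewriteEngine On
RewriteMap legacyurls txt:/etc/apache2/legacy-redirects.txt
# If the map has an entry for the requested path, 301 to it
RewriteCond ${legacyurls:$1} !=""
RewriteRule ^/(.+)$ ${legacyurls:$1} [R=301,L]
A prg: map works the same way if you need a script or database lookup instead of a flat file.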
If your new and old site don't follow any remapable pattern, then I suggest you make your 404 page as useful as possible. Google has a widget which will suggest the page the user is probably looking for. This works well once Google has spidered your new site.
Along with the other 301 suggestions, you could also split the requested URL string into a search string and route it to your default search page (if you have one), passing those parameters automatically to the search.
For example, if someone tries to visit http://example.com/2009/01/new-years-was-a-blast, this would route to your search page and automatically search for "new years was a blast" returning the best result for those key words and hopefully your most relevant article.
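A rough mod_rewrite sketch of that idea, assuming a hypothetical /search page that accepts a q parameter and that the old blog URLs follow the /YYYY/MM/slug pattern from the example (the search page itself would turn the hyphens into spaces):
# Only rewrite requests that don't match a real file or directory
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^[0-9]{4}/[0-9]{2}/(.+)$ /search?q=$1 [R=302,L]
A 302 is used here because the search page isn't the permanent home of the content.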
With ASP.NET MVC (or using HttpHandlers) you can dynamically generate URLs, like the one in this question, which includes the title.
What happens if the title changes (for example, by editing it) and there's a link pointing to the page from another site, or Google's PageRank was calculated for that URL?
I guess it's all lost, right? (The link points to nowhere and the calculated PageRank is lost.)
If so, is there a way to avoid it?
I use the same system as is in place here: everything after the number in the URL is not used in the DB query, and then I 301 redirect anything else to the URL with the current title.
In other words, if the title changed, then it would redirect to the correct place. I do it in PHP rather than htaccess as it's easier to manage more complex ideas.
I think you're generally best off having the server send a permanent redirect to the new location, if possible.
That way any rank which is gained from third party links should, in theory, be transferred to the new location. I'm not convinced whether this happens in practice, but it should.
The way Stackoverflow seems to be implemented everything after the question number is superfluous as far as linking to the question goes. For instance:
SEO and hard links with dynamic URLs
links to this question, despite the fact that I just made up the 'question title' part out of thin air. So the link will not point to nowhere and the PageRank is not lost (though it may be split between the two URLs, depending on whether or not Google can canonicalize them into a single URL).
Have your app redirect the old URL via a 301 Redirect. This will tell Google to transfer the pagerank to the new URL.
If a document is moved to a different URL, the server should be configured to return a HTTP status code of 301 (Moved Permanently) for the old URL to tell the client where the document has been moved to. With Apache, this is done using mod_rewrite and RewriteRule.
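As an illustration only, with made-up paths, a permanent redirect for a single moved document could look like this in an .htaccess file:
# Permanently redirect the old URL to the document's new location (hypothetical paths)
RewriteEngine On
RewriteRule ^old-article-title\.html$ /new-article-title.html [R=301,L]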
The best thing to help Google in this instance is to return a permanent redirect on the old URL to the new one.
I'm not an ASP.NET hacker - so I can't recommend the best way to implement this - but Googling the topic looks fairly productive :-)
Yes, all SEO is lost upon a URL change; it forks to an entirely new record. The way to handle that is to leave a 301 redirect from the old title to the new one, and some search engines (read: Google) are smart enough to pick that up.
EDIT: Fixed to 301 redirect!