I have three multilanguage domains, all hosted on the same servers/databases.
For example: thanks.com/en, gracias.com/es and danke.com/de.
So in terms of text content there is no duplicate content, since each text is properly translated.
The problem I am facing is with images.
For example:
a_cow.jpg is the same for all domains and will load at thanks.com/a_cow.jpg, gracias.com/a_cow.jpg and danke.com/a_cow.jpg.
My question is whether this will be counted as duplicate content by search engines, since the same image can be accessed from multiple domains.
Should I force them all to load from one domain? For example, on gracias.com load the image from thanks.com/a_cow.jpg.
I can do that in .htaccess, but I am wondering whether I should, and what the pros and cons are.
Thanks in advance.
My question is whether this will be counted as duplicate content by search engines, since the same image can be accessed from multiple domains.
It will count as a duplicate image, but not as duplicate content as it is typically understood in SEO. It is only an issue if you are trying to get traffic from those images: since they have different URLs, they will be treated as different images competing for the same traffic, even though they are the same file.
Should I force them all to load from one domain? For example, on gracias.com load the image from thanks.com/a_cow.jpg.
Not if you don't care about getting traffic from these images. Otherwise, yes, it would help.
But on the other hand, the other two domains will load images from an external domain.
That is not an issue. Keep in mind that if you load images from one domain only, it is that domain that will get the traffic, not the others.
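If you do decide to consolidate, here is a minimal .htaccess sketch for that setup, assuming Apache with mod_rewrite enabled and thanks.com as the chosen image host (the scheme and extension list are illustrative):

```apache
# On the shared docroot: permanently redirect image requests that arrive
# on gracias.com or danke.com to the same path on the canonical image host,
# so only one URL per image ends up indexed.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^(www\.)?thanks\.com$ [NC]
RewriteRule \.(jpe?g|png|gif|webp)$ https://thanks.com%{REQUEST_URI} [R=301,L]
```

The host condition keeps thanks.com itself out of the rule, which avoids a redirect loop since all three domains share the same server.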
Related
I'm working on a website which currently has two different domains pointing at it:
example1.com
example2.com
I have read that serving identical content to multiple domains can harm rankings.
The website being served is largely the same with the exception of item listings (think of an e-commerce site) and a few other minor tweaks (title, description, keywords, etc). Depending on the domain used it will adapt to serve different items.
Does this resolve the issue of serving duplicated content across multiple domains thus not harming the rankings?
Or would I be better to 301 redirect to a single domain and go from there?
If both your URLs show the same styled product listings, it will definitely affect your search engine results. Give the two websites a different look in terms of how products are displayed or how the navigation menu is arranged. Use slightly different images and write different descriptions for your products.
If you run a website with the same content and design on two different domains, even with modified titles, descriptions and keywords, it is bad SEO practice and your website will be penalized by search engines.
The best option would be to make a new website design with original content for the second domain and optimize it. Otherwise you can set up a 301 redirect pointing domain 2 to domain 1 (a minimal example follows below); this will neither harm you nor help you.
I have also seen multiple domains with the same website, content, title and description, yet to my surprise they rank well. Crazy search engines!
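For the 301 option mentioned above, a minimal .htaccess sketch, assuming Apache and that example2.com should consolidate onto example1.com (domain names follow the question's examples; the scheme is illustrative):

```apache
# Answering for example2.com: send every request to the matching URL
# on example1.com with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example2\.com$ [NC]
RewriteRule ^(.*)$ https://example1.com/$1 [R=301,L]
```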
I have a client who has bought a truckload of domains he wants me to redirect to his site.
A few of them are the same name with different top-level domains (mysite.com, mysite.co.uk, etc.), but a lot of them are keyword-related (mylocation-businessType.com, etc.).
I am wondering if either of these will be negative for SEO. I am thinking the top-level domain variants will be fine, and expected by Google, but the keyword domains might be viewed as a bit hacky?
What do the good people of Stack Overflow think about this?
If they are redirected properly then they'll have no effect at all. The only advantage will be if the name makes sense and a user might type it in, e.g. identical names with and without hyphens.
For this situation all of the other answers are correct: you won't get any benefits in PageRank, etc., and it wouldn't be useful except to pick up direct traffic to those domain names that you are then redirecting.
How would it affect your SEO though? That's a little trickier. Two ways of looking at it:
1.) Competitors could do this to you and it'd be completely out of your control. If redirecting a bunch of domains did any real harm to rankings it'd be a great way to do negative SEO, or "Google Bowling," and could be used to take down a site's rankings. That isn't the case though, so it probably wouldn't have too much of a negative effect.
UNLESS
2.) The nameservers for your redirected domains match the nameservers for your main domain. Pointing all domains to the same set of nameservers will help show that all domains are under the control of the same webmaster.
Even if you are using different nameservers and using 301 redirects as recommended, if the server with your redirects comes back to (at least) the same Class C IP address as your main site's server, a search engine would still be able to tie you together as likely being run by the same owner.
Either of these setups can identify you as the source of the redirects and devalue the ranking ability of your main site since there is a much higher likelihood the redirects are coming from you.
winwaed is correct. If you're doing a proper 301 redirect, the other domains are only valuable if people type them in directly. They won't rank, won't get any link juice, and won't get any inbound links. If you do seed inbound links, Google will treat them as if they point to the target of your 301 redirect. It's a waste of time to do that directly just for SEO purposes.
The way to use each of those domains for SEO would be to build a bit of unique content on each one, get some inbound links, and then link out to your target page. It's not really worth doing unless you spend a lot of time on it, and Google still tends to penalize obvious gaming of the system like that.
They won't contribute toward ranking; however, keyword domains do get some advantage for those terms. So, the way to use them is to build sites on all of them and funnel traffic to the main site.
Of course, they can also be used for extra backlinks, but you really want different Class C IP addresses for the servers. For that reason you might want to go with SEO hosting.
Matt Cutts from Google explained it in this video:
http://www.youtube.com/watch?v=r1lVPrYoBkA
and here:
http://www.youtube.com/watch?v=a70ygsHgvMw
He also said that if he were doing this, he would redirect each of the domains to different important pages on the target site. If the redirected domains had PageRank before, they will still pass PageRank (not all of it, but a reduced amount).
So, we're trying to move our application up in the search engine rankings, and one way our SEO guy told us to do that was to register similar domains... for example, we have something like
http://www.myapplication.com/parks.html
so..we acquired the domain parks.com (again just an example).
Now when people go to http://www.parks.com ...we want it to display the content of http://www.myapplication.com/parks.html.
I could just put a forwarding page there, but from what I've been told that makes us look bad because it's technically a permanent redirect, and we're trying to get higher in the search engine rankings, not lower.
Is this a situation where we would use the Server.Transfer method of ASP.net?
How are situations like this handled? I've definitely seen this done by many websites.
We also don't want to cheat the system; we are showing relevant content, not spam, and we're not tricking customers in any way, so the proper way to achieve what I'm looking for would be great.
Thanks
Use your "similar" domain names to host individual, targeted landing pages that point to your master content.
It's easier to manage and you will get a higher conversion rate.
Having to create individual pages will force you to write relevant content and will increase the popularity of each page.
I also suggest you build not only landing pages, but mini sites (of a few pages).
SEO is a very demanding task.
Regarding the technical aspects: Server.Transfer is what you should use. Never use Response.Redirect; Google and other search engines will drop your ranking.
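A minimal sketch of the Server.Transfer approach, assuming ASP.NET Web Forms, that parks.com points at the same application, and that the parks content lives in a page ASP.NET can execute (a hypothetical Parks.aspx here, since Server.Transfer needs an ASP.NET-handled target):

```csharp
// Default.aspx.cs -- domain and page names are illustrative.
protected void Page_Load(object sender, EventArgs e)
{
    var host = Request.Url.Host;
    if (host.Equals("parks.com", StringComparison.OrdinalIgnoreCase) ||
        host.Equals("www.parks.com", StringComparison.OrdinalIgnoreCase))
    {
        // Executes Parks.aspx server-side and returns its output directly;
        // no HTTP redirect is sent, so the address bar stays on parks.com.
        Server.Transfer("~/Parks.aspx");
    }
}
```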
I used permanent URL rewriting in the past. I changed my website, and since a lot of traffic was coming from other websites linking to mine, I wanted a permanent solution.
Read more about URL rewriting: http://msdn.microsoft.com/en-us/library/ms972974.aspx
We have a family of sites (about games) with shared content. Each site has its own top level domain, and most content has a "home" domain, but all content is accessible on each domain. This allows a user who is logged in on, for example, the board game site, to page through their new subscribed content and see pages about RPGs or video games (content that is based in another of our domains) without having to jump to another domain.
I am concerned that this duplicate content will be used to penalize us in search engine rankings. Canonical links do not work across domains. Google recommends using 301 redirects to force all users to a single domain for a particular page, but we do not want to do that because we don't want to force users off their preferred domain. In addition, we have other content that genuinely belongs to multiple domains--lists that might include games from multiple domains,for example.
How can we continue to show our content in this way, without being penalized for having duplicate content across domains?
Have a read of this article; Google does support the cross-domain canonical tag. So just point it at the single source of truth!
http://searchengineland.com/google-supports-cross-domain-canonical-tag-32044
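In practice that is just a link element in the head of every copy of the page, whichever domain happens to serve it; a sketch with an illustrative URL for the content's home domain:

```html
<link rel="canonical" href="https://www.boardgamesite.com/games/settlers-of-catan" />
```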
I have a site with a huge number (well, thousands or tens of thousands) of dynamic URLs, plus a few static URLs.
In theory, due to some cunning SEO linkage on the homepage, it should be possible for any spider to crawl the site and discover all the dynamic URLs via a spider-friendly search.
Given this, do I really need to worry about expending the effort to produce a dynamic sitemap index that includes all these URLs, or should I simply ensure that all the main static URLs are in there?
The actual way in which I would generate this isn't a concern; I'm just questioning the need to do it at all.
Indeed, the Google FAQ (and yes, I know they're not the only search engine!) about this recommends including URLs in the sitemap that might not be discovered by a crawl; based on that fact, then, if every URL in your site is reachable from another, surely the only URL you really need as a baseline in your sitemap for a well-designed site is your homepage?
If there is more than one way to get to a page, you should pick a main URL for each page of actual content, and put those URLs in the sitemap. That is, the sitemap should contain links to the actual content, not every possible URL that leads to the same content.
Also consider putting canonical link tags in the pages, pointing at this main URL, so that spiders can recognise a page even if it's reachable through different dynamic URLs.
Spiders only spend a limited time searching each site, so you should make it easy to find the actual content as soon as possible. A site map can be a great help as you can use it to point directly to the actual content so that the spider doesn't have to look for it.
We have had pretty good results using these methods, and Google now indexes 80-90% of our dynamic content. :)
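A minimal sketch of such a sitemap, listing only the chosen main URL for each piece of content (domain and paths are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One entry per piece of content, using its canonical URL only -->
  <url>
    <loc>https://www.example.com/widgets/blue-widget</loc>
  </url>
  <url>
    <loc>https://www.example.com/widgets/red-widget</loc>
  </url>
</urlset>
```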
In an SO podcast they talked about limitations on the number of links you can include/submit in a sitemap (around 500 per page, with a page limit based on PageRank?) and how you would need to break them over multiple files.
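Splitting over multiple files is done with a sitemap index that points at the individual sitemap files; a sketch with illustrative file names (for reference, the sitemaps.org protocol itself allows up to 50,000 URLs per sitemap file):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-static.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-dynamic-1.xml</loc>
  </sitemap>
</sitemapindex>
```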
Given this, do I really need to worry about expending the effort to produce a dynamic sitemap index that includes all these URLs, or should I simply ensure that all the main static URLs are in there?
I was under the impression that the sitemap wasn't necessarily about disconnected pages but rather about increasing the crawling of existing pages. In my experience, when a site includes a sitemap, minor pages are more likely to appear in Google results, even when they are prominently linked to. Depending on the PageRank, inbound links, etc. of your site this may be less of an issue.