vBulletin forum under multiple domains - apache

I hope someone can give me a hand with this problem. Here it goes.
There is a website with a vBulletin forum integrated into it. The forum is accessible at
https://site.de/forum. The main site itself has many other domains based on locale; that is to say, there are https://site.ch, https://site.it, https://site.at, etc. (each one in the corresponding language).
Now there is a need to have this forum under at least two of these additional domains. I mean, there should be a https://site.ch/forum URL which serves the same forum, but with some differences in style, and whose internal links, of course, use its own domain (site.ch). The whole system also needs to be search engine optimized.
So my question is how to achieve this. I know there are some plugins to manage multi-domain access, but they are unsupported and still in beta.
First, how do I set up the forum to work under multiple domains?
Then, do I maybe need to manually change some code to set the $vbulletin->options['bburl'] that is used to generate the links inside the forum?
And lastly, how do I make all this search engine optimized?

You're asking numerous questions; you might get better results if you created a separate question for each of:
1) How to use one forum directory for multiple domains? (with the vbulletin tag and the tag for the web server you are using)
2) How to set the language based on the current domain in vbulletin? (with the vbulletin tag and one or more of these tags: localized, locale, multi-language, multilanguage)
3) Best practices for duplicate content presented in multiple languages on multiple domains (with the seo and vbulletin tags)
Some Answers:
1) If you're using the Apache web server, you could add something like this to your httpd.conf file:
# Use the path to your forum directory (no trailing slash):
Alias /forums /var/www/...xxx.../forum_directory
<Directory /var/www/...xxx.../forum_directory>
Order allow,deny
Allow from all
</Directory>
Then in the vbulletin ACP, change the setting for your basepath URL to "No":
Admin Control Panel -> Site Name / URL / Contact Details -> Always use Forum URL as Base Path
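As for $vbulletin->options['bburl'] (your second question): rather than editing core files, you could set it per domain from a plugin registered on an early hook such as init_startup. This is only a rough, untested sketch to show the idea - the domain list is just your own examples, and the code is the plugin body as entered in the ACP Plugin Manager (no <?php tag needed there):

// Hypothetical plugin on the init_startup hook: pick the forum URL that
// matches the domain this request came in on, so vBulletin builds all
// internal links against that domain.
$host = strtolower($_SERVER['HTTP_HOST']);

$urls = array(
    'site.de' => 'https://site.de/forum',
    'site.ch' => 'https://site.ch/forum',
    'site.it' => 'https://site.it/forum',
);

if (isset($urls[$host]))
{
    $vbulletin->options['bburl'] = $urls[$host];
}

You could switch stylesheets per domain the same way, but test carefully - overriding options at runtime is essentially what those beta multi-domain plugins do, with all the edge cases that implies.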
2) There are a few plugins that detect the language used by the browser and set vBulletin to use that language:
Language Detection
Set forum-language automatic to browser-language for first-time-visitors
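The core idea behind these plugins is roughly the following (again an untested sketch of a plugin body; the language IDs are hypothetical - the real IDs come from Languages & Phrases in your ACP, and the linked plugins additionally handle cookies, fallbacks and other edge cases):

// Rough sketch: map the browser's preferred language to a vBulletin
// language ID for first-time visitors.
$accept = isset($_SERVER['HTTP_ACCEPT_LANGUAGE']) ? $_SERVER['HTTP_ACCEPT_LANGUAGE'] : 'en';
$prefix = strtolower(substr($accept, 0, 2));

// Assumed mapping - replace with the IDs defined in your ACP.
$languages = array('en' => 1, 'de' => 2, 'it' => 3);

if (isset($languages[$prefix]))
{
    $vbulletin->userinfo['languageid'] = $languages[$prefix];
}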
3) SEO covers many things, but to deal with having duplicate content on multiple domains you can look at the Google Webmaster Central Blog.
This posting is helpful:
Working with multi-regional websites
A relevant section from the post, "Dealing with duplicate content on global websites":
Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this may not always be possible for all pages and variations from the start. There is generally no need to "hide" the duplicates by disallowing crawling in a robots.txt file or by using a "noindex" robots meta tag. However, if you're providing the same content to the same users on different URLs (for instance, if both "example.de/" and "example.com/de/" show German language content for users in Germany), it would make sense to choose a preferred version and to redirect (or use the "rel=canonical" link element) appropriately.
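To make the quoted advice concrete: if example.de/ were chosen as the preferred version, the duplicate at example.com/de/ could either declare it canonical in its <head> or 301-redirect to it (the URLs are the quote's own example):

<link rel="canonical" href="http://example.de/" />

or, as a redirect in the Apache configuration for example.com:

Redirect 301 /de http://example.de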
I don't have anything on the other search engines.

Related

Google Custom Search - Add/remove sites to search dynamically

Google Custom Search has a feature to specify the sites you want the search engine to search - the "Sites to search" feature.
I have a requirement to add/remove these sites on the fly. Is there any API or other way provided by Google with which I can achieve this?
Here you can find the relevant information:
https://developers.google.com/custom-search/docs/tutorial/creatingcse
To create a custom search engine:
Sign in to the Control Panel using your Google Account (get an account if you don't have one).
In the Sites to search section, add the pages you want to include in your search engine. You can include any sites you want, not just the sites you own. You can include whole site URLs or individual page URLs. You can also use URL patterns.
https://support.google.com/customsearch/answer/71826?hl=en
URL patterns
URL patterns are used to specify what pages you want included in your custom search engine. When you use the control panel or the Google Marker to add sites, you're generating URL patterns. Most URL patterns are very simple and simply specify a whole site. However, by using more advanced patterns, you can more precisely pick out portions of sites.
For example, the pattern 'www.foo.com/bar' will only match the single page 'www.foo.com/bar'. To cover all the pages where the URL starts with 'www.foo.com/bar', you must explicitly add a '*' at the end. In the form-based interfaces for adding sites, 'foo.com' defaults to '*.foo.com/*'. If this is not what you want, you can change it back in the control panel. No such defaulting occurs for patterns that you upload. Also note that URLs are case sensitive - if your site URLs include capital letters, you'll need to make sure your patterns do as well.
In addition, the use of wildcards in URL patterns allows you to include or exclude multiple pages or portions of a site all at once.
So basically you have to navigate to the "Sites to search" section and enter the needed sites there. If you want to change these sites on the fly, you have to manipulate your URL patterns.
There's also an option to use the XML configuration files. You just have to add (or remove) your sites there:
https://developers.google.com/custom-search/docs/annotations
Annotations: The annotations XML file lists the webpages or websites you want your search engine to cover, and indicates any preferences you have about how these sites should be ranked in your search results. Each site and its associated information is called an annotation. More information about the annotations XML file.
An example for an annotation:
<Annotation about="http://www.solarenergy.org/*">
<Label name="_cse_abcdefghijk"/>
</Annotation>
Using the API you can add the filters "siteSearch"=>"somedomain.com somedomain2.com", "siteSearchFilter"=>"e", but the separate domains have to be delimited by spaces.
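For completeness, here is a minimal sketch of querying the Custom Search JSON API with a siteSearch restriction from PHP (YOUR_API_KEY, YOUR_CX and the domain are placeholders; check the current API reference, since quotas and parameter behaviour may change):

<?php
// Minimal sketch: query the Custom Search JSON API, restricted to one site.
// Requires allow_url_fopen (or swap file_get_contents for cURL).
$params = http_build_query(array(
    'key'              => 'YOUR_API_KEY',
    'cx'               => 'YOUR_CX',        // your search engine ID
    'q'                => 'some keywords',
    'siteSearch'       => 'somedomain.com',
    'siteSearchFilter' => 'i',              // 'i' = include only, 'e' = exclude
));

$json    = file_get_contents('https://www.googleapis.com/customsearch/v1?' . $params);
$results = json_decode($json, true);

if (!empty($results['items']))
{
    foreach ($results['items'] as $item)
    {
        echo $item['link'], "\n";
    }
}

Because siteSearch is just a request parameter, adding/removing sites "on the fly" can be done per query, without touching the control panel or the annotations file.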

Duplicate content and international sites clarification

Something is not clear; here is my case:
I want to have the same content for US and UK people.
Could I safely avoid duplicate content with these URLs:
www.example.us/info.html (hosted on a US server)
www.example.co.uk/info.html (hosted on a UK server)
From Google:
Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this might not always be possible. There is generally no need to "hide" the duplicates by disallowing crawling in a robots.txt file or by using a "noindex" robots meta tag. However, if you're providing the same content to the same users on different URLs (for instance, if both example.de/ and example.com/de/ show German language content for users in Germany), you should pick a preferred version and redirect (or use the rel=canonical link element) appropriately. In addition, you should follow the guidelines on rel-alternate-hreflang to make sure that the correct language or regional URL is served to searchers.
It still seems unclear to me - what do you think about my case?
Go for hreflang. When implemented properly, you will avoid all duplicate content issues.
if you're providing the same content to the same users on different URLs (for instance, if both example.de/ and example.com/de/ show German language content for users in Germany), you should pick a preferred version and redirect (or use the rel=canonical link element) appropriately. In addition, you should follow the guidelines on rel-alternate-hreflang to make sure that the correct language or regional URL is served to searchers
That covers your scenario:
Choose one as your preferred URL for the US and make it redirect (or use canonical), and
Follow hreflang guidelines: https://support.google.com/webmasters/answer/189077?hl=en
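For illustration, both pages would carry reciprocal link elements along these lines (en-us/en-gb per the hreflang guidelines, assuming English content targeted at the US and the UK; adjust to your markets):

<link rel="alternate" hreflang="en-us" href="http://www.example.us/info.html" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/info.html" />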

Geotargeting a blog folder with GWT

I'm about to launch a blog on a multilingual website.
The website uses geotargeting: site.com/fr/ for France, /be/ for Belgium, /ch/ for Switzerland, ...
I was wondering if the blog should run at root level: site.com/blog/
In that case, how could the blog be geotargeted?
Thanks a lot
You should have different URLs for each region/language. For example:
example.com/fr/blog or
example.com/be/blog
Or, even:
example.com/blog/fr or
example.com/blog/be
That's up to you. The main thing is to have separate URLs for the different languages/regions.
After you do all this, you should add hreflang attributes. That way you tell Google which version of a URL should be displayed when someone searches in a certain language or from a certain region.
If you use hreflang, you don't have to set geotargeting in WMT. If you still want to do that, you should add separate folders to WMT as different websites.
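For example, assuming the example.com/fr/blog style and French content targeted at France and Belgium (hypothetical region codes - use your own), each blog page would reference its alternates:

<link rel="alternate" hreflang="fr-fr" href="http://example.com/fr/blog/" />
<link rel="alternate" hreflang="fr-be" href="http://example.com/be/blog/" />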

Associated Content & SEO, Sitemaps with External links, using CNAMEs to include External Links as my own in the sitemap

Is there any HTML code, page parameter or meta name that can tell search engines that the content of a page is closely linked to another page on another domain?
I keep the content meta tag updated, and also the keyword meta tag.
I don't want to show these links to my visitors.
1)
I need to know if there is a protocol for communicating related links specifically to crawlers so as to improve my ranking.
Is there any way via code I can tell crawlers (crawlers specifically, like how nofollow is addressed to crawlers) that mydomain.com/Porduct.php is closely linked to, say,
http://ebay.com/sameProduct
http://wikipedia.com/GenericProduct or
http://google.com?q=someKeywords
Should I include external links, or CNAME-mapped external links (see Q3), inside the content tag? Would that make a difference?
2)
Can I include these links in my sitemap? Common sense suggests that links in my sitemap should be hosted on my own domain; still, I ask, since the sitemap takes the full URL, including the domain name.
3)
If a particular well-indexed page has content largely similar to mine, can I map a CNAME of my page to that site and include that in the sitemap? Would that amount to cheating?
First of all, I'm not sure what you want to achieve here. Search engines in general are already pretty good at recognizing what your page is about. If your content is about product A, write a description of product A, have images of product A, let your users comment on or review product A, or add microdata to your page (e.g. http://schema.org/Product). All of these will help search engines recognize that your page is about that product, just like the page on the other site which also has content about the same product.
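For instance, a minimal schema.org Product snippet in microdata might look like this (all names and values are placeholders):

<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Generic Product</span>
  <img itemprop="image" src="generic-product.jpg" alt="Generic Product" />
  <span itemprop="description">A short description of the product.</span>
</div>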
To answer your questions:
1) I'm not aware of any tag like that which would also be supported by search engines.
2) In your Sitemap you can include only URLs that point to a location on the same hostname the Sitemap is hosted on (there are some exceptions, but those are irrelevant now). See http://www.sitemaps.org/protocol.html for more info about Sitemaps.
3) A CNAME resource record specifies that the domain name is an alias of another domain name, and thus it can't be used the way you described.
Lastly, you're trying to do something for crawlers, which is usually a bad idea. Create an awesome website - something useful for the users, something they would love and would miss if you closed up shop. Just focus on the users and all else will follow.

How to get sites identical in content but different in language and TLD indexed by major search engines?

Is it possible to get two "editions" of a website both indexed by the major search engines (Google/Yahoo/Bing/Teoma) which differ in content language only and are hosted under different TLDs?
Say English content is available at "http://domain.com/" and German content at "http://domain.de/". Now, if e.g. Google.com is used, I want it to list the "domain.com" entry, and vice versa. Is "duplicate content" an issue here?
Depending on the website software you use (WordPress, Joomla, custom, etc.), there may be a plugin or add-on that supports multiple domains and search-engine pinging/SEO. If that's the case, it should be possible.
I'm assuming your website layout is the same, but you have a ".com" and a ".de" TLD pointing to the same directory/software installation, with an (automatic?) language selector to choose between English and German.
Edit: (for quick readers)
You shouldn't need separate webspace for each site. What I do for my sites to get them submitted is use sitemaps. I've never generated one by hand, so I can't help with that aspect. However, you could generate a sitemap for each language (e.g. sitemap.en.xml.gz | sitemap.de.xml.gz) and have your application ping the search engines with these sitemaps. Essentially, you'll have the same content in different languages, and it'll be in sitemaps which can be submitted to Google/Bing/Yahoo/etc.
I used this method on a WordPress blog I had, and every time I submitted or changed content, it would regenerate the sitemaps (updating links etc.) and ping the search engines again.
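To tie this to the hreflang advice in the earlier answers: Google also documents language annotations inside sitemaps themselves, so each sitemap entry can point at its alternates. A minimal sketch using the question's domains (verify against the current sitemap documentation before relying on it):

<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap.en.xml, hosted on domain.com; a mirror entry (same alternates,
     but <loc>http://domain.de/</loc>) goes in domain.de's own sitemap,
     since a sitemap may only list URLs on its own host -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://domain.com/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://domain.com/"/>
    <xhtml:link rel="alternate" hreflang="de" href="http://domain.de/"/>
  </url>
</urlset>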