Google Custom Search Engine for www.example.com/* but exclude example.com/* - google-custom-search

I am having trouble setting up the following Google CSE:
It should search for URLs like http://www.example.com/*.
But it should exclude URLs like http://example.com/* because that is a different website.
I first set up the search engine to look for www.example.com, but that also includes all URLs like http://example.com/*. I then tried adding an exclusion for example.com, but that excludes all URLs like http://www.example.com/* as well (so there are no results). I then tried several other variants I could think of, but the results all fall into these two categories (all or none).

Related

Google Custom Search - Add/remove sites to search dynamically

Google Custom Search has a feature to specify the sites you want the search engine to search - "Sites to search" feature.
I have a requirement to add/remove these sites on the fly. Is there any API or any other way provided by Google with which I can achieve this?
Here you can find the relevant information:
https://developers.google.com/custom-search/docs/tutorial/creatingcse
To create a custom search engine:
Sign into Control Panel using your Google Account (get an account if you don't have one).
In the Sites to search section, add the pages you want to include in your search engine. You can include any sites you want, not just the sites you own. You can include whole site URLs or individual page URLs. You can also use URL patterns.
https://support.google.com/customsearch/answer/71826?hl=en
URL patterns
URL patterns are used to specify what pages you want included in your
custom search engine. When you use the control panel or the Google
Marker to add sites, you're generating URL patterns. Most URL patterns
are very simple and simply specify a whole site. However, by using
more advanced patterns, you can more precisely pick out portions of
sites.
For example, the pattern 'www.foo.com/bar' will only match the single page 'www.foo.com/bar'. To cover all the pages where the URL starts with 'www.foo.com/bar', you must explicitly add a '*' at the end. In the form-based interfaces for adding sites, 'foo.com' defaults to '*.foo.com/*'. If this is not what you want, you can change it back in the control panel. No such defaulting occurs for patterns that you upload. Also note that URLs are case sensitive - if your site URLs include capital letters, you'll need to make sure your patterns do as well.
In addition, the use of wildcards in URL patterns allows you to
include or exclude multiple pages or portions of a site all at once.
So, basically, you have to navigate to the "Sites to search" section and enter the needed sites there. If you want to change these sites on the fly, you have to manipulate your URL patterns.
There's also an option to use the XML configuration files. You just have to add (or remove) your sites there:
https://developers.google.com/custom-search/docs/annotations
Annotations: The annotations XML file lists the webpages or websites
you want your search engine to cover, and indicates any preferences
you have about how these sites should be ranked in your search
results. Each site and its associated information is called an
annotation. More information about the annotations XML file.
An example for an annotation:
<Annotation about="http://www.solarenergy.org/*">
<Label name="_cse_abcdefghijk"/>
</Annotation>
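Since, as quoted above, no defaulting to '*.foo.com/*' occurs for patterns you upload, an uploaded annotations file could in principle handle the include-www/exclude-bare-domain case from the question at the top. This is only a hedged sketch - the label names below are placeholders modeled on the control panel's exported files, so check the labels your own engine actually exports:
<Annotations>
  <!-- Hypothetical include label for everything under www.example.com -->
  <Annotation about="http://www.example.com/*">
    <Label name="_cse_abcdefghijk"/>
  </Annotation>
  <!-- Hypothetical exclusion label for the bare domain only -->
  <Annotation about="http://example.com/*">
    <Label name="_cse_exclude_abcdefghijk"/>
  </Annotation>
</Annotations>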
Using the API, we can add the filters "siteSearch"=>"somedomain.com somedomain2.com", "siteSearchFilter"=>"e", with a space between the separate domains.
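As an illustration of those parameters, here is a minimal sketch of building a Custom Search JSON API request URL in Python. The API key and engine ID are placeholders, and siteSearchFilter is "i" to include or "e" to exclude the given site:

```python
from urllib.parse import urlencode

API_KEY = "YOUR_API_KEY"    # placeholder, not a real key
CX = "YOUR_ENGINE_ID"       # placeholder search engine ID

def build_search_url(query, site, exclude=False):
    """Build a Custom Search JSON API request URL.

    siteSearch restricts results to the given site;
    siteSearchFilter chooses include ("i") or exclude ("e").
    """
    params = {
        "key": API_KEY,
        "cx": CX,
        "q": query,
        "siteSearch": site,
        "siteSearchFilter": "e" if exclude else "i",
    }
    return "https://www.googleapis.com/customsearch/v1?" + urlencode(params)

url = build_search_url("solar panels", "example.com", exclude=True)
print(url)
```

The resulting URL can then be fetched with any HTTP client; the response is JSON.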

Subsite as unique Google search result

I have a question regarding SEO:
let's say I have a site example.com, and a subsite example.com/mysite.
Is it possible (with proper SEO) for "mysite" to show up in Google search results as a unique result? Right now it's under "More links from example.com domain". If this is not possible, would subdomain work (mysite.example.com)?
With regards,
Looted
Actually "example.com/mysite" is look like a page of "example.com" so it is possible to show up in Google search results as a unique result.
And sub-domain like- mysite.example.com is possible to display in Google search results separately.

Is serving the same website to multiple domains bad for SEO?

I'm working on a website which currently has two different domains pointing at it:
example1.com
example2.com
I have read that serving identical content to multiple domains can harm rankings.
The website being served is largely the same with the exception of item listings (think of an e-commerce site) and a few other minor tweaks (title, description, keywords, etc). Depending on the domain used it will adapt to serve different items.
Does this resolve the issue of serving duplicated content across multiple domains thus not harming the rankings?
Or would I be better to 301 redirect to a single domain and go from there?
If both your URLs show the same styled product listings, it will definitely affect the search engine results. Give the two websites a different look, for example by displaying products differently or changing the navigation menu, and use slightly different images and different descriptions for your products.
If you run a website with the same content and design on two different domains, even with modified titles, descriptions and keywords, it is bad SEO practice and your website will be penalized by search engines.
The best option would be to build a new website design with original content for the second domain and optimize it. Otherwise you can use a 301 redirect to point domain 2 to domain 1; this will neither harm nor help you!
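For the 301 option, a minimal .htaccess sketch could look like this (domain names are placeholders and mod_rewrite is assumed to be enabled):
# Sketch: permanently redirect every example2.com URL to example1.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example2\.com$ [NC]
RewriteRule ^(.*)$ https://example1.com/$1 [R=301,L]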
I have also seen multiple domains serving the same website, content, title and description, and to my surprise those domains rank well. Crazy search engines!

Planning url rewrite for my web app

I'm working on a site which shows different products for different countries. The current URL scheme I'm using is "index.php?country=US" for the main page, and "product.php?country=US&id=1234" to show a product from a specific country.
I'm planning now to implement url rewrite to use cleaner urls. The idea would be using each country as subdomain, and product id as a page. Something like this:
us.example.com/1234 -> product.php?country=US&id=1234
I have full control of my dns records and web server, and currently have set a * A record to point to my IP in order to receive *.example.com requests. This seems to work ok.
Now my question is what other things I'd need to take care of. Is it right to assume that just adding a .htaccess would be enough to handle all requests? Do I need to add a VirtualHost for each subdomain as well? Is there anything else that would be needed, or that should be avoided?
I'm basically trying to figure out the simplest correct way to design this.
The data you need to determine the country is already in the request URL (in the hostname). Moving it to a GET variable introduces additional complications (for example, how do you deal with POSTs?).
You don't need separate vhosts unless the domains have different SSL certs.
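As a concrete illustration, a minimal .htaccess sketch for the us.example.com/1234 scheme might look like the following. It assumes mod_rewrite is enabled, that a single catch-all VirtualHost with ServerAlias *.example.com receives the wildcard requests, and that country codes are two lowercase letters:
# Sketch only: map us.example.com/1234 -> product.php?country=us&id=1234
RewriteEngine On
# Capture a two-letter country code from the subdomain
RewriteCond %{HTTP_HOST} ^([a-z]{2})\.example\.com$ [NC]
# Capture a numeric product id from the path
RewriteRule ^([0-9]+)$ product.php?country=%1&id=$1 [L,QSA]
Note that %1 is captured in lowercase here; product.php can uppercase it before use.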

vBulletin forum under multiple domains

Hope someone will give me a hand with this problem I have. So here it goes.
There is a website with an integrated vBulletin forum. The forum is accessible at https://site.de/forum. The main site itself has many other domains based on locale; that is to say, there are https://site.ch, https://site.it, https://site.at, etc. (each one in the corresponding language).
Now there is a need to have this forum under at least two of these additional domains. I mean, there should be a https://site.ch/forum URL which contains the same forum, but with some differences in style and, of course, with working internal forum links using its own domain (site.ch). The whole system also needs to be SEO-friendly.
So now my question is how to achieve this. I know there are some plugins for managing multi-domain access, but they are unsupported and still in beta.
First, how do I set up the forum to work under multiple domains?
Then, do I perhaps need to manually change some code to set the $vbulletin->options['bburl'] that is used to generate the links inside the forum?
And lastly, how do I make all this search engine optimized?
You're asking numerous questions, you might get better results if you created a separate question for each of:
1) How to use one forum directory for multiple domains? (with the vbulletin tag and the tag for the web server you are using)
2) How to set the language based on the current domain in vbulletin? (with the vbulletin tag and one or more of these tags: localized, locale, multi-language, multilanguage)
3) Best practices for duplicate content presented in multiple languages on multiple domains (with the seo and vbulletin tags)
Some Answers:
1) If you're using the apache web server, you could add something like this to your httpd.conf file:
# Use the path to your forum directory, no trailing slash:
Alias /forums /var/www/...xxx.../forum_directory
<Directory /var/www/...xxx.../forum_directory>
Order allow,deny
Allow from all
</Directory>
Then in the vbulletin ACP, change the setting for your basepath URL to "No":
Admin Control Panel -> Site Name / URL / Contact Details -> Always use Forum URL as Base Path
2) There are a few plugins that detect the language used by the browser and set vBulletin to use that language:
Language Detection
Set forum-language automatic to browser-language for first-time-visitors
3) SEO covers many things, but to deal with having duplicate content on multiple domains you can look at the Google Webmaster Central Blog.
This posting is helpful:
Working with multi-regional websites
A section from the post: Dealing with duplicate content on global websites
Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this may not always be possible for all pages and variations from the start. There is generally no need to "hide" the duplicates by disallowing crawling in a robots.txt file or by using a "noindex" robots meta tag. However, if you're providing the same content to the same users on different URLs (for instance, if both "example.de/" and "example.com/de/" show German language content for users in Germany), it would make sense to choose a preferred version and to redirect (or use the "rel=canonical" link element) appropriately.
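For the rel="canonical" case the post mentions, the duplicate page would carry a link element in its head pointing at the preferred URL. A minimal sketch, using the domains from the quoted example:
<!-- Placed in the <head> of http://example.com/de/,
     pointing at the preferred German version -->
<link rel="canonical" href="http://example.de/"/>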
I don't have anything on the other search engines.