Multiple URL restrictions in Google Custom Search - google-custom-search

Hi, I have set up a paid Google Custom Search for my domain.
Is there a way to limit the results based on the directory the search is run from?
The directories I need to limit the search to are company divisional sites, so each search should only show results from that division's directory.
For example:
On http://www.mysite.com/* show results from the whole site.
On http://www.mysite.com/directory1/* show only results from directory1.
On http://www.mysite.com/directory2/* show only results from directory2.
etc.
Thanks

You can limit the results to a domain using:
&as_sitesearch=mydomain.com
Using the same technique, you can also limit your results to a folder within a domain. For example:
http://www.google.com/cse?cx=YOUR-GOOGLE-ID&client=google-csbe&output=xml_no_dtd&filter=1&q=post&num=10&start=10&as_sitesearch=YOUR-DOMAIN.com/send-uk
The previous URL will only return links in your domain that are inside /send-uk. Just tested it and it works fine :-).
You can see this idea working in a Drupal module, where I use exactly this technique; I did not even know you could also limit by folder.
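To make the folder restriction concrete, here is a small sketch in JavaScript (the cx, query, and folder values are placeholders, not real credentials) that assembles the CSE XML API URL with as_sitesearch set to a per-directory path:

```javascript
// Sketch: build a CSE XML API request URL restricted to one folder
// via as_sitesearch. The cx and folder arguments are placeholders.
function buildCseUrl(cx, query, folder) {
  const params = new URLSearchParams({
    cx,                       // your CSE id
    client: 'google-csbe',
    output: 'xml_no_dtd',
    filter: '1',
    q: query,
    as_sitesearch: folder,    // e.g. 'www.mysite.com/directory1'
  });
  return 'http://www.google.com/cse?' + params.toString();
}

console.log(buildCseUrl('YOUR-GOOGLE-ID', 'post', 'www.mysite.com/directory1'));
```

On the divisional pages, the folder argument could be derived from window.location.pathname, so each directory's search box restricts itself automatically.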


Filter Google Custom Search Engine results by site

I have been having issues using the Custom Search Engine API while trying to use its specific-site search functionality.
I have created a CSE (using the web console) and defined it to search two sites:
*.ebay.com
*.amazon.com
First, when searching for the term 'pcrush' with curl, I receive results from both the amazon.com and ebay.com domains, as expected. This is OK:
curl -X GET 'https://www.googleapis.com/customsearch/v1?key=<my-key>&cx=<my-cx>&q=pcrush'
When I try the same search but limit the results to a specific site, I still get results from both ebay and amazon.
Here, I want to receive only the ebay.com results:
adding &as_sitesearch=ebay.com
adding &as_sitesearch=ebay.com&as_dt=i
adding &as_sitesearch=amazon.com&as_dt=e
All of the above were tried while the configured sites were *.ebay.com and *.amazon.com.
Every variant still returned results from both ebay.com and amazon.com.
I should mention we are following the API as described here.
What am I doing wrong?
Thanks in advance.
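One likely cause, going by the Custom Search JSON API v1 reference: as_sitesearch and as_dt belong to the older XML API, while the /customsearch/v1 endpoint uses siteSearch together with siteSearchFilter ('i' = include, 'e' = exclude). A sketch of the corrected request URL, with placeholder key and cx values:

```javascript
// Sketch: restrict Custom Search JSON API results to a single site.
// The v1 endpoint takes siteSearch plus siteSearchFilter ('i'/'e');
// key and cx are placeholders for your credentials.
function buildJsonApiUrl(key, cx, query, site, mode) {
  const params = new URLSearchParams({
    key,
    cx,
    q: query,
    siteSearch: site,
    siteSearchFilter: mode,   // 'i' = only this site, 'e' = exclude it
  });
  return 'https://www.googleapis.com/customsearch/v1?' + params.toString();
}

// Only ebay.com results:
console.log(buildJsonApiUrl('MY-KEY', 'MY-CX', 'pcrush', 'ebay.com', 'i'));
```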

How to remove URLs with arguments from Google results

I have a website which I recently started, and I have submitted my sitemap in Google Webmaster Tools. My site got indexed within a short time, but whenever I search for my website on Google, I see two or three versions of the same pages, each with different URL arguments.
Suppose my site is example.com; when I search for example.com on Google, I get results like the following:
www.example.com/?page=2
www.example.com/something/?page=3
www.example.com
As far as I know, result 1 and result 3 are the same, so why are they shown separately? I don't have any such URL in my sitemap, or even in any of my HTML pages, so why is this happening? I am a little confused and want to get rid of it.
Also, result 2 should be displayed simply as www.example.com/something/
and not as www.example.com/something/?page=3
There is actually a setting in Google Webmaster Tools that helps remove URLs with parameters. To access and configure it, navigate to Webmaster Tools --> Crawl --> URL Parameters and set the parameters according to your needs.
I also found the following article useful for understanding the concept behind those parameters and how to stop pages with unnecessary parameters from being crawled:
http://www.shoutmeloud.com/google-webmaster-tool-added-url-parameter-option-seo.html
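Beyond the Webmaster Tools setting, a common complement (an assumption about the setup, not part of the original answer) is a rel="canonical" link on each page pointing at the parameter-free URL, so Google collapses the ?page= variants into one result. A JavaScript sketch of deriving that canonical form from the question's example URLs:

```javascript
// Sketch: derive the canonical (parameter-free) form of a page URL,
// suitable for a <link rel="canonical" href="..."> tag in the page head.
function canonicalUrl(raw) {
  const u = new URL(raw);
  u.search = '';   // drop ?page=2 and similar arguments
  u.hash = '';
  return u.toString();
}

console.log(canonicalUrl('http://www.example.com/something/?page=3'));
// → http://www.example.com/something/
```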

In a Google search result, how do links show up beneath a result?

I already have a sitemap ready.
How do I get Google results to show a set of links beneath the main link?
I understand that Google generates sitelinks by itself, which in turn causes this type of result to show:
Is there a difference between the first image and the second? Are they both sitelinks auto-generated by Google?
If not, what is the difference between them?
Thank you
Google adds them if it thinks your site is big or popular enough to warrant them, so unfortunately you can't add code to your site to make them display.
Google also doesn't let you choose the links if it does display them. If you've registered your site in Google Search Console (aka Google Webmaster Tools), you can ask it not to display certain links, but not which ones to display. How it decides which ones to show is not publicly stated, but you can probably influence it with a combination of site structure, links common to all pages, and which pages are most popular on your site.
The difference between the two is that the first is a generic search (mma) that this site shows up for, and the second is a specific brand search (mma fighting) that exactly matches this specific site and/or brand.

Infinite amount of Google custom search boxes per website?

I've got a site where users can create groups (we call them games):
www.ongoingworlds.com/games/270/
www.ongoingworlds.com/games/287/ etc
Each of these games has its own user-generated content. I want to use a Google custom search for each game, but I can't see an easy way to amend the embed code to add a dynamic path, and I don't want to register hundreds of GCSEs separately to get an embed code for each one.
What would be the best way of allowing each of these URLs (above) to have their own GCSE?
You can search subparts of your site by combining the site: operator with the webSearchQueryAddition parameter on the gcse element.
webSearchQueryAddition appends an additional search term to your user's query. If, for each of the "games", you change webSearchQueryAddition to point at that game's base URL, the search results will be restricted to that URL. You can inject the parameter programmatically, e.g. with JavaScript, for each game.
Documentation is here: https://developers.google.com/custom-search/docs/element#supported_attributes
And here is a working example:
http://jsfiddle.net/t2s5M/
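The injection described above can be sketched like this (the game paths are the ones from the question; data-webSearchQueryAddition is the attribute form of the parameter from the linked documentation):

```javascript
// Sketch: build the webSearchQueryAddition value for one "game" so the
// Custom Search Element only matches pages under that game's folder.
// Paths follow the examples in the question.
function queryAdditionFor(gameId) {
  return 'site:www.ongoingworlds.com/games/' + gameId + '/';
}

// On a game page this value would be injected into the element, roughly:
//   <div class="gcse-search"
//        data-webSearchQueryAddition="site:www.ongoingworlds.com/games/270/">
//   </div>
console.log(queryAdditionFor(270)); // → site:www.ongoingworlds.com/games/270/
```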

What should I add to my site to make Google index the subpages as well?

I am a beginner web developer and I have a site, JammuLinks.com, built on PHP. It is a local city listing search engine. Basically, I've written search pages which take a parameter, fetch the matching records from the database, and display them, so the content is generated dynamically. However, if you look at the bottom of the site, I have added many static links where I have hard-coded the parameters in the link, like searchresult.php?tablename='schools'. So my questions are:
Since Google crawls the page and also the links listed on the page, will it crawl the result-page data as well? How can I tell if it has? So far I tried site:www.jammulinks.com, but it returns only the homepage and the blog.
What more can I add to make the static links indexed as well?
The best way to do this is to create a sitemap document (you can even get the template from the webmaster portion of Google's site, www.google.com/webmasters/, I believe).
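As a concrete sketch, a minimal sitemap.xml listing the hard-coded search links could be generated like this (JavaScript for illustration; the URL is patterned on the searchresult.php example in the question):

```javascript
// Sketch: emit a minimal sitemap.xml for the static search links so
// Google can discover the dynamically generated result pages.
function buildSitemap(urls) {
  const entries = urls
    .map(u => '  <url><loc>' + u + '</loc></url>')
    .join('\n');
  return '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
    entries + '\n' +
    '</urlset>';
}

console.log(buildSitemap([
  'http://www.jammulinks.com/searchresult.php?tablename=schools',
]));
```

Once generated, the file is submitted in Webmaster Tools so the parameterized pages get crawled directly rather than only via the footer links.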