I've got a site where users can create groups (we call them games)
www.ongoingworlds.com/games/270/
www.ongoingworlds.com/games/287/ etc
Each of these games has its own user-generated content. I want to use a Google Custom Search for each game. But I can't see an easy way to amend the embed code to add a dynamic path, and I don't want to have to register hundreds of GCSEs separately just to get an embed code for each.
What would be the best way of allowing each of these URLs (above) to have their own GCSE?
You can search subparts of your site by using a combination of the site: operator and the webSearchQueryAddition parameter on the gcse element.
webSearchQueryAddition appends an additional search term to your user's query. If, for each of the "games", you change webSearchQueryAddition to point to that game's base URL, the search results will be restricted to that URL. You can inject that parameter programmatically, e.g. with JavaScript, for each of the "games".
Documentation is here: https://developers.google.com/custom-search/docs/element#supported_attributes
And here is a working example:
http://jsfiddle.net/t2s5M/
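For illustration, here's a minimal sketch of that injection approach (the container id search-container and the /games/<id>/ path pattern are assumptions for this example; substitute your own engine id for YOUR_ENGINE_ID):

```javascript
// Embed one shared CSE per page, but restrict every query to the current
// game's URL by setting data-webSearchQueryAddition to a site: term.
(function () {
  var gamePath = window.location.pathname;              // e.g. "/games/270/"
  var box = document.createElement('div');
  box.className = 'gcse-search';
  box.setAttribute('data-webSearchQueryAddition',
                   'site:www.ongoingworlds.com' + gamePath);
  document.getElementById('search-container').appendChild(box);

  // Load the CSE element script as usual.
  var loader = document.createElement('script');
  loader.async = true;
  loader.src = 'https://cse.google.com/cse.js?cx=YOUR_ENGINE_ID';
  document.head.appendChild(loader);
})();
```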
I'm trying to figure out how I can search multiple sites using the Google Custom Search JSON API.
Meaning that the search should only cover a specific list of sites.
I was playing with the API Explorer - https://developers.google.com/custom-search/v1/reference/rest/v1/cse/list?apix_params=%7B%22cx%22%3A%22011602274690322925368%3Atkz2zvvpmk0%22%2C%22siteSearch%22%3A%22www.walla.co.il%22%7D
and noticed the siteSearch query key, but it only accepts a single string, not a list of sites.
What is the way to search only in specific sites?
Thanks
There are a couple of things you can do.
If you know the specific sites you want to search, you can add them as refinements to your engine. Then query for that refinement by adding 'more:<REFINEMENT_LABEL>' to the query.
Or, add 'site:' operators to the query itself. For example: cats site:cnn.com OR site:bbc.com
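For instance, a rough sketch of the second approach using the Custom Search JSON API (the API key, engine id, and the browser fetch call are placeholders/assumptions for illustration):

```javascript
// Query the JSON API with site: operators so only cnn.com and bbc.com results come back.
const apiKey = 'YOUR_API_KEY';
const cx = 'YOUR_ENGINE_ID';
const q = 'cats site:cnn.com OR site:bbc.com';   // or 'cats more:<REFINEMENT_LABEL>'
const url = 'https://www.googleapis.com/customsearch/v1' +
            '?key=' + encodeURIComponent(apiKey) +
            '&cx=' + encodeURIComponent(cx) +
            '&q=' + encodeURIComponent(q);

fetch(url)
  .then(r => r.json())
  .then(data => (data.items || []).forEach(item => console.log(item.link)));
```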
I am making an API where I want to dynamically get data from the site http://transportformumbai.com/mumbai_local_train.php
Depending on the start and end stations and the timings, I want to get the list of all available trains, along with the table shown by clicking a link in the "view route" column.
I am using an import.io Connector... but it works well with a single textbox, not with multiple textboxes (refer to this link) or dropdown lists...
Can anyone guide me on what I should do next...
Apart from import.io, is there any other alternative?
I am a newbie working with crawlers... So please justify your answer.
What is web scraping... Do I have to use a web scraper?
Thank you.
Actually, if you look in the URL bar, the parameters for destination and time are defined there (see the URL below), so you don't need to worry about drop-down menus or using a Connector.
Use an Extractor on this page:
http://transportformumbai.com/get_schedule_new.php?user_route=western&start_station=khar_road&end_station=malad&start_time=00&end_time=18
Train it to get every column - note that the view route column contains links.
You can create a separate Extractor for the "view route" page:
http://transportformumbai.com/view_route_new.php?trainno=BYR1097&user_route=western&train_origin=Churchgate&train_end=Bhayandar&train_speed=S
Now you should "Chain" the second Extractor to the first one and it will pull that information from every link on the first one.
If you want to choose different destinations and times, just change the URL parameters of the original link.
http://support.import.io/knowledgebase/articles/613374-how-do-i-get-data-behind-dropdown-menus
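As a small illustration of the "just change the URL parameters" point (the parameter names are taken from the URL above; the helper function itself is only a sketch):

```javascript
// Build the schedule URL for any combination of route, stations and time window.
function scheduleUrl(route, startStation, endStation, startTime, endTime) {
  const params = new URLSearchParams({
    user_route: route,
    start_station: startStation,
    end_station: endStation,
    start_time: startTime,
    end_time: endTime,
  });
  return 'http://transportformumbai.com/get_schedule_new.php?' + params;
}

// The page used in the answer above:
console.log(scheduleUrl('western', 'khar_road', 'malad', '00', '18'));
```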
Your best bet here seems to be to make an API call for every URL combination. You have to analyze the URL structure.
I have created a custom Google search application for my website.
Below is the URL to create an application:
https://www.google.com/cse/
Under the Autocomplete section I have enabled autocomplete, but it still doesn't suggest options when I start typing in the search box.
It only suggests the keywords that we define under custom autocompletions.
So my question is: do we need to provide all the custom keywords that we want autocompleted in the search box, or does Google create its own autosuggestions from the website?
It creates autosuggestions from the website, but it may take a few days to collect all the data. It also takes into account users' queries on your website.
As the previous answer suggests, autocomplete can take a few days to begin working. In addition, you must specify one or more specific sites/pages to which to restrict the custom search engine. On occasion, you may also have to specify that the CSE only search those sites/pages instead of simply emphasizing them in the results.
Is it possible to tell the Google autocomplete API to return results only for my site, not for all sites? I see that there is a ds parameter, but its only purpose is to search within YouTube. So how can I get autocomplete (or maybe related or suggested search words) for only a single site?
I needed the very same thing and so far the only way I found to get this working is to create a custom search engine and then add it as a parameter to the autocomplete call:
http://clients1.google.com/complete/search?client=partner&gs_ri=partner&partnerid={0}&ds=cse
where {0} is your custom search engine id.
Certain features, such as returning the results as XML, don't work if you use the partner id, but at least all the autocomplete results will be from your site.
You can also have multiple search engines and use different ones in different textboxes. The results are just a JSON string you parse.
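For illustration, a rough sketch of calling that endpoint from a browser (the response format is undocumented, so the parsing below is an assumption based on the usual Google suggest payload of ["query", [["suggestion", ...], ...]], and the endpoint may not allow cross-origin requests from every page):

```javascript
// Fetch autocomplete suggestions scoped to a custom search engine.
function suggest(query, cseId, callback) {
  var url = 'https://clients1.google.com/complete/search' +
            '?client=partner&gs_ri=partner&ds=cse' +
            '&partnerid=' + encodeURIComponent(cseId) +
            '&q=' + encodeURIComponent(query);
  fetch(url)
    .then(function (r) { return r.text(); })
    .then(function (text) {
      var data = JSON.parse(text);                        // assumed JSON array payload
      var suggestions = data[1].map(function (item) { return item[0]; });
      callback(suggestions);
    });
}
```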
Good luck
I am trying to use the Wikimedia public APIs to access the English Wikipedia database.
I would like to have a way to obtain all the page ids linked to a given page.
If I do like this:
http://en.wikipedia.org/w/api.php?action=query&titles=computer&format=xml
I am only able to obtain the page id of the 'computer' page.
I know I could parse for the 'href' tags inside that page and make n queries, but it is not very efficient.
Can I achieve this through the API alone?
It looks like you're looking for the backlinks module.
With that, you can do something like:
http://en.wikipedia.org/w/api.php?action=query&bltitle=computer&list=backlinks&format=xml
Also, the API uses paging, so you'll most likely need to add &bllimit=max to the query and then make follow-up requests to get the remaining pages.
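For example, a minimal sketch (assuming format=json and a fetch-capable environment) that follows the API's continuation parameters until all backlinks have been collected:

```javascript
// Collect every page that links to the given title, following paging via blcontinue.
async function getBacklinks(title) {
  const endpoint = 'https://en.wikipedia.org/w/api.php';
  const params = {
    action: 'query',
    list: 'backlinks',
    bltitle: title,
    bllimit: 'max',
    format: 'json',
  };
  let all = [];
  while (true) {
    const data = await (await fetch(endpoint + '?' + new URLSearchParams(params))).json();
    all = all.concat(data.query.backlinks);   // each entry has pageid, ns and title
    if (!data.continue) break;                // no more pages
    Object.assign(params, data.continue);     // adds blcontinue for the next batch
  }
  return all;
}

getBacklinks('Computer').then(links => console.log(links.length, 'backlinks'));
```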