How to dynamically extract data from dropdown lists or multiple textboxes using the import.io API

I am building an API in which I want to dynamically get data from the site http://transportformumbai.com/mumbai_local_train.php
Depending on the start station, end station, and timings, I want to get the list of all available trains, along with the table shown when clicking a link in the "view route" column.
I am using an import.io Connector. It works well with a single textbox, but not with multiple textboxes (refer to this link) or dropdown lists.
Can anyone guide me on what I should do next?
Apart from import.io, is there any other alternative?
I am a newbie at working with crawlers, so please explain your answer.
What is web scraping? Do I have to use a web scraper?
Thank you.

Actually, if you look in the URL bar, the parameters for destination and time are defined right there (see the example URL below), so you don't need to worry about dropdown menus or using a Connector.
Use an Extractor on this page:
http://transportformumbai.com/get_schedule_new.php?user_route=western&start_station=khar_road&end_station=malad&start_time=00&end_time=18
Train it to get every column - note that the view route column contains links.
You can create a separate Extractor for the "view route" page:
http://transportformumbai.com/view_route_new.php?trainno=BYR1097&user_route=western&train_origin=Churchgate&train_end=Bhayandar&train_speed=S
Now you should "Chain" the second Extractor to the first one and it will pull that information from every link on the first one.
If you want to choose different destinations and times, just change the URL parameters of the original link.
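For example, a few lines of JavaScript can assemble that URL for any combination of stations and times (the parameter names are taken from the URL above; the helper function itself is only an illustration):

    // Build the schedule URL for a given start/end station and time window.
    function scheduleUrl(startStation, endStation, startTime, endTime) {
      var params = {
        user_route: 'western',
        start_station: startStation, // e.g. 'khar_road'
        end_station: endStation,     // e.g. 'malad'
        start_time: startTime,       // hour as two digits, e.g. '00'
        end_time: endTime            // e.g. '18'
      };
      return 'http://transportformumbai.com/get_schedule_new.php?' +
        Object.keys(params).map(function (key) {
          return key + '=' + encodeURIComponent(params[key]);
        }).join('&');
    }

    // Each URL produced this way can be fed to the Extractor.
    console.log(scheduleUrl('khar_road', 'malad', '00', '18'));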

http://support.import.io/knowledgebase/articles/613374-how-do-i-get-data-behind-dropdown-menus
Your best bet here seems to be to have an API for every URL combination. You will have to analyze the URL structure.

Related

Crawling recursively with import.io

I want to crawl all the links, sub-links and so on, that live inside a page (recursively).
Is there a recursive option in import.io? If so, how do I use it?
Can you tell us more about the specific use case? What site / sub section are you trying to extract data from?
Based on your questions, you may want to check out the "Chain APIs" feature.
Essentially it allows you to have an API that extracts a set of links, and feed that set into a second API that extracts sub links.
http://support.import.io/knowledgebase/articles/629686-chain-apis-combine-two-apis

Custom Google search autosuggestion

I have created a custom Google search application for my website.
Below is the URL used to create the application:
https://www.google.com/cse/
Under the Autocomplete section I have enabled autocomplete, but it still doesn't suggest options when I start typing in the search box.
It only suggests the keywords that we define under custom autocompletions.
So my question is: do we need to provide all the custom keywords that we want autocompleted in the search box, or does Google create its own autosuggestions from the website?
It creates autosuggestions from the website, but it may take a few days to collect all the data. It also takes into account users' queries on your website.
As the previous answer suggests, autocomplete can take a few days to begin working. In addition, you must specify one or more specific sites/pages to which to restrict the custom search engine. On occasion, you may also have to specify that the CSE only search those sites/pages instead of simply emphasizing them in the results.

Infinite amount of Google custom search boxes per website?

I've got a site where users can create groups (we call them games)
www.ongoingworlds.com/games/270/
www.ongoingworlds.com/games/287/ etc
Each of these games has its own user-generated content. I want to use a Google custom search for each game. But I can't see an easy way to amend the embed code to add a dynamic path, and I don't want to have to register multiple GCSEs (hundreds of them) separately to get an embed code for each.
What would be the best way of allowing each of these URLs (above) to have their own GCSE?
You can search subparts of your site by using a combination of the site: operator and the webSearchQueryAddition parameter on the gcse element.
webSearchQueryAddition appends an additional search term to your user's query. If, for each of the "games", you change webSearchQueryAddition to point to that game's base URL, the search results will be restricted to that URL. You can inject that parameter programmatically, e.g. with JavaScript, for each of the "games".
Documentation is here: https://developers.google.com/custom-search/docs/element#supported_attributes
And here is a working example:
http://jsfiddle.net/t2s5M/
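Below is a rough sketch of that programmatic injection using the explicit render mode of the Custom Search Element. The render API and the webSearchQueryAddition attribute come from the documentation linked above; the cx value is a placeholder, and the exact term appended per game (here simply the game's base URL) is an assumption you may need to tune:

    <div id="game-search"></div>
    <script>
      // Sketch only: render one search element per game page and append the
      // current game's base URL to every query via webSearchQueryAddition.
      window.__gcse = {
        parsetags: 'explicit',
        callback: function () {
          var gamePath = window.location.pathname; // e.g. "/games/270/"
          google.search.cse.element.render({
            div: 'game-search',
            tag: 'search',
            attributes: {
              // Appended to each user query; adjust the form of this term
              // (plain URL, site: restriction, etc.) to whatever works for your pages.
              webSearchQueryAddition: 'www.ongoingworlds.com' + gamePath
            }
          });
        }
      };
    </script>
    <script async src="https://cse.google.com/cse.js?cx=YOUR_CSE_ID"></script>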

Best approach to build a DYNAMIC query-by-example form in AngularJS?

I'm relatively experienced with Angular having written many directives, but I have a new requirement where I have to build a query-by-example form into which a user can enter different search criteria. My problem is that I do not know ahead of time what the possible criteria will be. This criteria information will be coming from the server via an ajax request and can differ per user. Thus I will need to dynamically construct a suitable user interface based on the information I get from the server.
I have built individual directives suitable for capturing the search criteria (for example a custom calendar control for date criteria) but I am unsure of the best approach to adding these directives to a form dynamically. Is this even possible in Angular?
I have built something like this before in jQuery, but it's not so clear to me how I would best do this the 'Angular way'.
Any suggestions would be most appreciated!
Everything that you can express as a model-to-view projection can be implemented in AngularJS.
I think you can make a model here consisting of "query params". Each of them has a name, a type, and data for the filter builder. For example, for the "select" type, the data can contain the list of all possible values to choose from.
Then you iterate through the list of "query params" with ng-repeat, rendering each control differently according to its type (see the sketch below). That's all.
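A minimal sketch of this approach, assuming the "query params" arrive from the server with name, type, and data fields (the module, controller, and field names below are illustrative, not a fixed API):

    <div ng-app="qbe" ng-controller="QueryCtrl">
      <div ng-repeat="param in queryParams" ng-switch="param.type">
        <label>{{param.name}}</label>
        <input ng-switch-when="text" type="text" ng-model="criteria[param.name]">
        <input ng-switch-when="date" type="date" ng-model="criteria[param.name]">
        <select ng-switch-when="select" ng-model="criteria[param.name]"
                ng-options="opt for opt in param.data"></select>
      </div>
    </div>
    <script>
    angular.module('qbe', [])
      .controller('QueryCtrl', function ($scope) {
        // In practice this list would come from the server, e.g. via $http.get().
        $scope.queryParams = [
          { name: 'owner',   type: 'text' },
          { name: 'created', type: 'date' },
          { name: 'status',  type: 'select', data: ['open', 'closed'] }
        ];
        $scope.criteria = {}; // the query-by-example values the user enters
      });
    </script>

Your existing custom directives (such as the calendar control) can be handled the same way by giving each one its own ng-switch-when branch.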
If I understood the task wrong, please provide more info.

Pass a string to various websites

I have a product code which I need to enter into 6 different websites in order to pull different information about the product from each of them. Is there a way to save this product code in some sort of variable and pass it into each website's input box, so that it returns all the information from each one automatically? I really have no idea where to start with this, so if anyone can brainstorm a few ideas to get me moving, that would be great.
In order to get what you are planning:
You need a script that visits the specified website;
then, at the website, you can get the element by tag.
For instance, in JavaScript:
    var textBox = document.getElementsByTagName('input')[0]; // first <input> on the page
This gives you a reference to the text field. You can then enter text into it as follows:
    textBox.value = "any string";
Once you have done this, you will have to retrieve the results from the page, based on the website's layout.
So if you can describe your task in more detail, you will get a better response.
Assuming you're talking about using an ordinary GUI browser, the best you can do is copy it to your system clipboard, and paste it into each page on the browser.
If you're talking about a programmatic web-access like wget or curl, it depends on what language you are writing your script in.
You have to create a web request for each website and find a way to parse the response, which will be HTML.
Have a look at HttpWebRequest; you can find lots of examples on the internet that show how to create an HTTP POST to a website, for example:
http://www.terminally-incoherent.com/blog/2008/05/05/send-a-https-post-request-with-c/
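If you end up scripting this in JavaScript rather than following the C# HttpWebRequest route above, a minimal Node.js sketch of the same idea might look like this (the site URLs and query parameters are placeholders; each real site will have its own form fields, and some will need a POST rather than a GET):

    // Send the same product code to several sites and collect the raw HTML.
    var https = require('https');

    var productCode = 'ABC123';
    var sites = [
      'https://shop-one.example.com/search?code=', // placeholder URLs
      'https://shop-two.example.com/products?q='
    ];

    sites.forEach(function (base) {
      https.get(base + encodeURIComponent(productCode), function (res) {
        var html = '';
        res.on('data', function (chunk) { html += chunk; });
        res.on('end', function () {
          // Parse `html` per site (e.g. with an HTML parser) to pull out
          // the product details you need.
          console.log(base, '->', html.length, 'bytes of HTML');
        });
      }).on('error', function (err) {
        console.error('Request to ' + base + ' failed: ' + err.message);
      });
    });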