Retrieving information from Yelp API for Excel file of restaurants

Total API newbie here. I have an Excel file of restaurants in my city. For each restaurant, I have its name, address, city, state, zip code, and coordinates. I would like to retrieve additional information about these restaurants, like their ratings and price levels, and add those variables to my data file. Is this something I could do through the Yelp Fusion API? I've tried Googling my question, but I'm still unclear on whether this is possible. I'm also unsure how to code this, because I only know R and there is very little example code for the Yelp Fusion API in R.

The answer to your question is yes: you can use the Yelp Fusion API to pull Yelp information for the businesses in your Excel file.
With that said, I have no experience with the R language, but I do know this can be done in JavaScript without any fancy libraries (except for accessing the Excel file), so my guess is you could probably pull it off in R as well.
Here is my non-professional-programmer algorithm for solving your problem...
Extract each business from the spreadsheet and place them in a data structure.
Construct the parameters for your API call (Yelp Fusion returns its results as JSON).
Loop through each business and separately send each request to Yelp.
Save each response from the Yelp API to a new data structure, such as an array.
Parse the desired information from each response and then either: 1) add it to your source spreadsheet or 2) combine the existing spreadsheet data with the new Yelp data and save to a new spreadsheet.
There is probably a better way to do this, but this is the approach I would take.
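My guess at what the request loop might look like in R, using the httr and jsonlite packages (untested, since I don't use R; the CSV export of your sheet and the column names "name", "latitude", and "longitude" are assumptions, and you could read the .xlsx directly with the readxl package instead):

```r
# Untested sketch: look up each restaurant on Yelp and collect its
# rating and price level. Assumes the spreadsheet was exported to CSV
# with columns "name", "latitude", and "longitude".
library(httr)
library(jsonlite)

api_key <- "YOUR_YELP_API_KEY"  # placeholder: get a free key from Yelp
restaurants <- read.csv("restaurants.csv", stringsAsFactors = FALSE)

results <- lapply(seq_len(nrow(restaurants)), function(i) {
  resp <- GET(
    "https://api.yelp.com/v3/businesses/search",
    add_headers(Authorization = paste("Bearer", api_key)),
    query = list(term      = restaurants$name[i],
                 latitude  = restaurants$latitude[i],
                 longitude = restaurants$longitude[i],
                 limit     = 1)
  )
  biz <- fromJSON(content(resp, as = "text", encoding = "UTF-8"))$businesses
  if (length(biz) == 0) return(data.frame(rating = NA, price = NA))
  data.frame(rating = biz$rating[1],
             price  = if (is.null(biz$price)) NA else biz$price[1])
})

# Combine the original columns with the new Yelp columns and save.
write.csv(cbind(restaurants, do.call(rbind, results)),
          "restaurants_with_yelp.csv", row.names = FALSE)
```

Yelp also offers a business-match endpoint keyed on name and address, which may give more precise matches than a plain search; check the Fusion docs for details.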

Related

How do you scrape json from APIs but from multiple pages? (scrapy)

I'm trying to get JSON user information from the Mastodon API https://mastodon.online/api/v1/accounts/1 (user ID number). The problem is that each page only returns one user's info at a time, but I want to collect all of it at once. Is there a way to fetch the JSON in ID order (https://mastodon.online/api/v1/accounts/{1,2,3,4,...}) and then store it all in one JSON file?
I've been looking around for answers, and every time I tried one similar to my question it wouldn't work. If anyone can help, that would be really great; I've been stuck all day trying things out.
Documentation: https://docs.joinmastodon.org/methods/accounts/#retrieve-information
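As far as I know there is no bulk endpoint for a range of account IDs, so the usual pattern is a simple loop. Here is an untested sketch in R (the same loop translates directly to scrapy or plain Python requests); the ID range 1:50 and the output filename are placeholders:

```r
# Untested sketch: fetch accounts 1..50 one by one and write the combined
# results to a single JSON file. The range and filename are placeholders;
# nonexistent IDs return 404 and are simply skipped.
library(httr)
library(jsonlite)

accounts <- list()
for (id in 1:50) {
  resp <- GET(paste0("https://mastodon.online/api/v1/accounts/", id))
  if (status_code(resp) == 200) {
    accounts[[length(accounts) + 1]] <-
      fromJSON(content(resp, as = "text", encoding = "UTF-8"))
  }
  Sys.sleep(0.5)  # stay politely under the instance's rate limit
}
write_json(accounts, "accounts.json", auto_unbox = TRUE)
```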

How can I save or get data about places near me without breaking policies

This is more of a general programming question.
I'm trying to create an app; think of it as a Yelp clone. I have most of it working, but I'm missing one important feature: the data for the places around me. For now I'm only focused on food, so if I search for something like "Pizza", it should show me all the pizza joints near me.
I was originally planning to use the Google Places API. However, in case you haven't heard, they're changing their pricing: lowering the free tier and raising the cost by a huge margin.
There's also the problem of saving the data. One workaround I saw a user suggest was to keep using Google's API but store the results in your own DB with every query (I only need the address, name, latitude, and longitude), so eventually you'd have what you need, in a sense. However, I also want a simple rating system for each place, like Yelp's, and Google (and all the other providers like Mapbox, HERE Maps, etc.) states something along the lines of "info from their API should not be stored or cached for more than 24 hrs", which is very broad and not specific.
So my plan was to call the Google API, grab the three fields I need (address, name, lat/lng), add more fields to store the rating, likes, and whatever else the user adds, and then store it all in my database. That doesn't seem like a viable solution now.
So does anyone have any ideas or advice? Or know of a service where I can get the details of all the food places? And if possible, can anyone confirm whether storing the name, address, and lat/lng violates their policy? In my eyes it's public data, but something like the ratings or the pictures that Google provides, that's Google's property.
For obtaining places you can use OpenStreetMap, e.g. via the Overpass API. Since you can expect significant traffic, you should run your own instance(s) instead of using the public APIs.
However, OSM doesn't contain ratings, so you would have to combine this data with some other publicly available rating system.
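For example, a minimal Overpass query for pizza places near a point, sketched in R (the coordinates and the 2 km radius are placeholders, and the public endpoint is shown only for illustration; as noted above, run your own instance for real traffic):

```r
# Untested sketch: find pizza places near a point via the Overpass API.
library(httr)
library(jsonlite)

query <- '
[out:json];
node["amenity"="restaurant"]["cuisine"="pizza"](around:2000,40.7128,-74.0060);
out;'

resp <- POST("https://overpass-api.de/api/interpreter",
             body = list(data = query), encode = "form")
places <- fromJSON(content(resp, as = "text", encoding = "UTF-8"))$elements
# Each element carries lat, lon, and a tags object with fields like "name".
head(places)
```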

Kapow Robot - Extract business Operating hours from Google Search Results

Is it possible to create a Kapow robot that searches Google for the operating hours of the businesses in our list/database and updates the timings when they change?
Please also share any approaches more efficient than a Kapow robot that can be implemented with minimal effort and remain cost-effective.
That's what the Google Places API is there for. While you could in theory just open Google Maps in a Load Page action, enter the query string and then parse the results, I would advise against it. Here's why:
The API will be faster, returning results in a structured manner (JSON)
Kapow has actions for calling RESTful services and parsing/modifying JSON
Google does not like robots parsing their pages, and most likely will lock you out (i.e. present you with Captchas sooner or later)
If you decide to go for the API, here's what you should do:
Get your API key first; see this page for details: https://developers.google.com/places/web-service/get-api-key. Note that the free plan allows 1,000 requests within a 24-hour window (https://developers.google.com/places/web-service/usage).
Maintain the place IDs for all the businesses you'd like to query regularly, and keep that list up to date.
For each place, retrieve the details as described in the API documentation; the opening hours will be in the JSON response (see the sketch after this list): https://developers.google.com/places/web-service/details
Update your list. I'd recommend using a definite type in Kapow for that, and using the actions Store in Database and Query Database. In case you need the data elsewhere, you may create additional robots (e.g. for Excel files, sending data per email, etc.).
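For reference, here is an untested sketch of the raw Place Details request from step 3, written outside Kapow in R (in Kapow itself you would use the REST and JSON actions mentioned above). PLACE_ID and the key are placeholders, and the parameter and field names follow the linked docs, so verify them against the current documentation:

```r
# Untested sketch: fetch one place's details and pull out its hours.
library(httr)
library(jsonlite)

resp <- GET("https://maps.googleapis.com/maps/api/place/details/json",
            query = list(place_id = "PLACE_ID", key = "YOUR_API_KEY"))
details <- fromJSON(content(resp, as = "text", encoding = "UTF-8"))$result
details$opening_hours$weekday_text  # e.g. "Monday: 9:00 AM - 5:00 PM"
```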

Is there a way to get the survey count totals using the SurveyMonkey API?

I have been working with the SurveyMonkey API for a few days now.
My ultimate goal is to be able to gather the voting results for each question in a survey.
For example... if I have a 5 question survey and each question has 3 options/answers... I'd like to gather the results of each question/option.
From what I'm finding in the API documentation... this is not possible.
Can this really not be possible?
Is there a way to gather the results of each question/answer combo using the API?
I hope I'm simply overlooking something.
Thanks!
It is definitely possible to get this kind of information: the API gives you the survey's metadata and all of the response data. How you process and parse that is up to you.
The most common way to gather survey results is the following (sketched in code after these steps):
Get a list of respondent_ids via get_respondent_list
Send these respondent_ids to get_responses to get the raw response data
Match up the ids from this data with the ids described in the survey's metadata, which you get from get_survey_details
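A rough, untested sketch of that three-step flow against the v2 API. The key, token, and survey ID are placeholders, and the endpoint and field names follow the v2 docs of the time, so verify everything against the current API reference:

```r
# Untested sketch: respondent list -> responses -> survey metadata.
library(httr)
library(jsonlite)

base <- "https://api.surveymonkey.net/v2/surveys"
auth <- add_headers(Authorization = "bearer YOUR_ACCESS_TOKEN",
                    "Content-Type" = "application/json")

call_api <- function(method, body) {
  resp <- POST(paste0(base, "/", method), auth,
               query = list(api_key = "YOUR_API_KEY"),
               body = toJSON(body, auto_unbox = TRUE))
  fromJSON(content(resp, as = "text", encoding = "UTF-8"))$data
}

respondents <- call_api("get_respondent_list", list(survey_id = "SURVEY_ID"))
responses   <- call_api("get_responses",
                        list(survey_id = "SURVEY_ID",
                             respondent_ids = respondents$respondents$respondent_id))
details     <- call_api("get_survey_details", list(survey_id = "SURVEY_ID"))
# Match the answer ids in `responses` against the question/answer metadata
# in `details` to tally per-option counts.
```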

RSS Feeds or API to access REIT information?

I have a Web application that needs to display up-to-date information on REITs and tickers like AX.UN, BEI.UN, CAR.UN, etc.
For example, I need to automate consumption of information on pages such as
http://ca.finance.yahoo.com/q?s=AX-UN.TO
Are there RSS feeds or APIs I can use to import this kind of data? I don't want to copy and paste this information into my website on a daily basis.
On the very site you link to, there's a small link that says "download data". If you have a database with the symbols you want to track, it would be pretty easy to construct that download URL on the fly, query it, parse the CSV, and load that data into your database for display on your website.
ETA: Here's the link:
http://ca.finance.yahoo.com/d/quotes.csv?s=AX-UN.TO&f=sl1d1t1c1ohgv&e=.csv
Just have your program replace the "AX-UN.TO" with whatever symbol you want, and it should grab the data for that symbol.
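For example, in R the whole round trip is only a few lines, assuming the download endpoint still behaves as described (the column names below just spell out the f=sl1d1t1c1ohgv format string used in the URL):

```r
# Sketch: build the quotes.csv URL for any symbol and read it straight
# into a data frame.
quote_url <- function(symbol) {
  paste0("http://ca.finance.yahoo.com/d/quotes.csv?s=", symbol,
         "&f=sl1d1t1c1ohgv&e=.csv")
}

quotes <- read.csv(quote_url("AX-UN.TO"), header = FALSE,
                   col.names = c("symbol", "last", "date", "time",
                                 "change", "open", "high", "low", "volume"))
```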
Take a look at http://www.mergent.com/servius - particularly the Historical Securities Data API. Not sure if it has REIT data - if it doesn't, it may be available by special arrangement from them.