Amadeus Flight Inspiration Search not working properly

I've already tested the Flight Inspiration Search in test mode and it worked fine with the restricted data, so I decided to move to the live environment with the API. However, it looks like I'm still getting restricted data, because I get errors (code 500 - origin and destination not supported) for airports like JFK, LTN, etc.
I've changed the base URL and the credentials, so it should retrieve the proper data. Do you have any suggestions?

The Flight Inspiration Search API is built on top of a pre-computed cache; this is why, even in production, it doesn't return all possible options. Every day the API computes the most trending options based on past searches and bookings and refills the cache, which means that the cache is dynamic. That's why you don't get anything back for your criteria.
For real-time data you should use the Flight Offers Search API.
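For example, a minimal sketch in Python of a live Flight Offers Search call might look like this (assuming the production base URL api.amadeus.com, the v2 flight-offers endpoint, and placeholder credentials; double-check the parameter names against the current docs):

import requests

# Obtain an OAuth2 access token from the production environment
token_resp = requests.post(
    "https://api.amadeus.com/v1/security/oauth2/token",
    data={
        "grant_type": "client_credentials",
        "client_id": "YOUR_CLIENT_ID",          # placeholder
        "client_secret": "YOUR_CLIENT_SECRET",  # placeholder
    },
)
access_token = token_resp.json()["access_token"]

# Real-time availability and pricing for a specific route and date
offers = requests.get(
    "https://api.amadeus.com/v2/shopping/flight-offers",
    headers={"Authorization": f"Bearer {access_token}"},
    params={
        "originLocationCode": "JFK",
        "destinationLocationCode": "LTN",
        "departureDate": "2024-06-01",
        "adults": 1,
    },
)
print(offers.json())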

Related

How can I save or get data about places near me without breaking policies

This is more of a general programming question.
I'm trying to create an app; think of it as a Yelp clone. I have most of it working, but I'm missing one important feature: the data for the places around me. For now I'm only focused on food, so I'd like it so that if I search for something like "Pizza", it shows me all the pizza joints near me.
I was originally planning to use the Google Places API. However, if you haven't heard, they're changing their pricing, lowering the free tier, and upping the cost by a huge margin.
There's also the problem of saving the data. One workaround I saw a user suggest was to just keep using Google's API, but every time you make the query, store the data in your own DB as well (I only need address, name, latitude, and longitude), so eventually you'd have what you need, in a sense. However, I also want to have a simple rating system for each place, like Yelp, but Google (and all the other providers like Mapbox, HERE Maps, etc.) states something along the lines of "info from their API should not be stored or cached for more than 24hrs", which is very broad and not specific.
So what I was planning to do was: call the Google API, grab the three pieces of info I need (address, name, lat/lng), add more fields to store the rating, likes, and whatever else the user will add, and then store it in my database. But that doesn't seem like a solution now.
So does anyone have any ideas or advice? Or know of a service where I can get the details of all the food places? And if possible, can anyone confirm whether storing the name, address, and lat/lng is a violation of their policy? In my eyes it's public data, but something like the rating or the pictures that Google provides, now that's Google's property.
For obtaining places you can use OpenStreetMap, e.g. via the Overpass API. Since larger traffic is to be expected, you should run your own database(s) instead of using the public API endpoints.
However, OSM doesn't contain ratings, so you would have to combine this data with some other publicly available rating system.
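For illustration, a minimal Overpass sketch in Python for pizza places near a coordinate could look like this (the public overpass-api.de endpoint and the example coordinate are placeholders; in production you'd point it at your own instance as noted above):

import requests

# Overpass QL: pizza restaurants and fast food within 2 km of a coordinate
# (40.7580, -73.9855 is just an example location)
query = """
[out:json];
(
  node["amenity"="restaurant"]["cuisine"~"pizza"](around:2000,40.7580,-73.9855);
  node["amenity"="fast_food"]["cuisine"~"pizza"](around:2000,40.7580,-73.9855);
);
out;
"""

resp = requests.post("https://overpass-api.de/api/interpreter", data={"data": query})
for element in resp.json()["elements"]:
    tags = element.get("tags", {})
    print(tags.get("name"), element["lat"], element["lon"])

This gives you the name and coordinates (the data you wanted to store), and you can attach your own rating fields on top of it.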

Kapow Robot - Extract business Operating hours from Google Search Results

Is it possible to create a Kapow Robot that can search Google for the operating hours of the businesses in our list/database and update the timings if changes are made?
Please also share if there are any other ways that would be more efficient than a Kapow robot and could be implemented with minimal effort, cost-effectively.
That's what the Google Places API is there for. While you could in theory just open Google Maps in a Load Page action, enter the query string and then parse the results, I would advise against it. Here's why:
The API will be faster, returning results in a structured manner (JSON)
Kapow has actions for calling RESTful services and parsing/modifying JSON
Google does not like robots parsing their pages, and most likely will lock you out (i.e. present you with Captchas sooner or later)
If you decide to go for the API, here's what you should do:
Get your API key first, see this page for details: https://developers.google.com/places/web-service/get-api-key. Note that the free plan allows for 1,000 requests within a 24-hour period (https://developers.google.com/places/web-service/usage)
Maintain the place ids for all the businesses you'd like to query regularly, and update your list.
For each place, retrieve the details as described in the API documentation; the opening hours will be within the JSON response: https://developers.google.com/places/web-service/details (a minimal request sketch follows this list)
Update your list. I'd recommend defining a type in Kapow for that, and using the actions Store in Database and Query Database. In case you need the data elsewhere, you can create additional robots (e.g. for Excel files, sending data by email, et cetera).
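As a minimal sketch of step 3 outside of Kapow (placeholder API key and place ID; field names as I recall them from the Place Details docs), the request could look like this in Python:

import requests

# Request only the opening hours for a known place ID
resp = requests.get(
    "https://maps.googleapis.com/maps/api/place/details/json",
    params={
        "place_id": "YOUR_PLACE_ID",   # placeholder
        "fields": "name,opening_hours",
        "key": "YOUR_API_KEY",         # placeholder
    },
)
result = resp.json().get("result", {})
# weekday_text is a human-readable list such as "Monday: 9:00 AM to 5:00 PM"
for line in result.get("opening_hours", {}).get("weekday_text", []):
    print(line)

In Kapow, the same request maps onto the REST call and JSON extraction actions mentioned above.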

What is the maximum results returned for YouTube Data API v3 call

Context
I am in the process of providing some consultancy on doing an HTTP GET using the YouTube Data API v3, in order to develop a Windows-based application to GET a list of results from YouTube for, say, a specific CATEGORY or a specific TAG.
We are open to using any programming language (I'm from a C++ background and am hoping YouTube will support direct HTTP connections without using the Google client SDK and so on) to connect to YouTube and (HTTP) GET data. (This would run once a month or so, so YouTube API quotas should not be a problem.)
The Issue
We are being told by some of my client's web developers that the YouTube API v3 will only return a maximum of 500 records/results, even for a query that returns just the total viewers, the video's link, and basic metadata such as that.
So, say I wish to find 5,000 results for the category "House music" or "basketball", and I have the Developer Key etc. all set up - would that be possible?
If so, what GET fields would I need to populate (such as "max_results_per_page")?
Thank you.
The API won't provide more than ~500 search results for any arbitrary query. It's by design. Technically, it means that the nextPageToken field won't be returned once you hit ~500 results. No additional parameter can change that.
If you want more than ~500 results for a query, you have to split it into more specific sub-queries. I'd suggest using the publishedAfter and publishedBefore parameters to achieve that, but feel free to experiment with the other ones here.
This only holds for the search query. Other queries like PlaylistItems.list deliver more results; I have tested with 100,000 items to get the videos of a playlist.
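As a rough sketch of the sub-query splitting suggested above (placeholder API key; parameter names as documented for search.list, but verify against the current reference), plain HTTP GETs in Python could look like this:

import requests

API_KEY = "YOUR_API_KEY"  # placeholder
SEARCH_URL = "https://www.googleapis.com/youtube/v3/search"

def search_slice(query, published_after, published_before):
    """Collect up to ~500 results for one date slice, following nextPageToken."""
    items, page_token = [], None
    while True:
        params = {
            "part": "snippet",
            "q": query,
            "type": "video",
            "maxResults": 50,                      # per-page maximum
            "publishedAfter": published_after,     # RFC 3339 timestamps
            "publishedBefore": published_before,
            "key": API_KEY,
        }
        if page_token:
            params["pageToken"] = page_token
        data = requests.get(SEARCH_URL, params=params).json()
        items.extend(data.get("items", []))
        page_token = data.get("nextPageToken")
        if not page_token:   # around ~500 results, no further token is returned
            return items

# Split "house music" into date slices to get past the ~500-result ceiling
results = []
results += search_slice("house music", "2023-01-01T00:00:00Z", "2023-07-01T00:00:00Z")
results += search_slice("house music", "2023-07-01T00:00:00Z", "2024-01-01T00:00:00Z")
print(len(results))

Each date slice is subject to the same ~500-result ceiling, so you would narrow the slices until each one returns fewer results than that.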

Programmatic Querying of Google and Other Search Engines With Domain and Keywords

I'm trying to find out if there is a programmatic way to determine how far down in a search engine's results my site shows up for given keywords. For example, my query would provide my domain name and keywords, and the result would return, say, 94, indicating that my site was the 94th result. I'm specifically interested in how to do this with Google, but also interested in Bing and Yahoo.
No.
There is no programmatic access to such data. People generally roll their own version of such trackers: get the Google search results page and use regexes to find your position. But nowadays different results are shown in different geographies, and results are personalized.
The gl=us parameter will help you get results from the US; you can change it accordingly to get results for other geographies.
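A very rough sketch of that roll-your-own approach in Python might look like this (the URL extraction is naive and hypothetical, and Google may block automated requests or change its markup at any time):

import re
import requests

def serp_position(domain, keywords, country="us"):
    """Return the 1-based position of `domain` in the first 100 results, or None."""
    resp = requests.get(
        "https://www.google.com/search",
        params={"q": keywords, "gl": country, "num": 100},
        headers={"User-Agent": "Mozilla/5.0"},  # bare requests are usually rejected
    )
    # Naive extraction of outbound result URLs; the real markup changes frequently
    urls = [u for u in re.findall(r'href="(https?://[^"]+)"', resp.text)
            if "google." not in u]
    for position, url in enumerate(urls, start=1):
        if domain in url:
            return position
    return None

print(serp_position("example.com", "my keywords"))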
Before creating this from scratch, you may want to save yourself some time (and money) by using a service that does exactly that [and more]: Ginzametrics.
They have a free plan (so you can test if it fits your requirements and check if it's really worth creating your own tool), an API and can even import data from Google Analytics.

Google Analytics retrieve custom variables statistics

Edit: reworked the question, which was not clear.
I'm new to GA and looking for a way to automatically retrieve statistics about custom variables.
The query would have
a start and an end date (possibly equal)
a variable name
For instance, a Page-level variable Brand takes only three possible values, that are set by the web server, and seen by the client.
The values are Apple, Google and Microsoft.
The query to Google Analytics could be something like this (pseudo-code), provided that I use a previously acquired authentication token:
...getstatistics?myToken=123&variable=Brand&datefrom=20110121&dateto=20110121
And the result could be some XML-like data:
<variable>Brand</variable><value>Apple</value><count>3214</count>
<variable>Brand</variable><value>Google</value><count>4321</count>
<variable>Brand</variable><value>Microsoft</value><count>1345</count>
Meaning for instance that the page-level custom variable Brand was set to the value Apple by the web server (and thus seen by the client / sent to GA) 3214 times.
What is the correct way/protocol to query values/statistics from GA, in order to get statistics related to custom variables?
So, this is my understanding of what you're doing:
You're setting page-level custom variables (important technical note: _setCustomVar needs to be called before _trackPageview or whichever tracking call you use, else the variable won't be tracked.)
Your code might look something like this:
_gaq.push(['_setCustomVar', 2, 'Brand', 'Apple', 3]); // slot 2, name 'Brand', value 'Apple', scope 3 = page-level
Now, when querying the Google Analytics API, it's important to note that the slot number matters, since the slot you're accessing is explicitly named in the query.
So, to do this, you'd set your dimensions to ga:customVarName2 and ga:customVarValue2, and decide what metric you're interested in getting. You mention page views, so you'd use ga:pageviews. (You're by no means limited to pageviews; you can use any metric besides a couple of the AdWords-specific ones.)
This query would return all of the custom variable values from this slot, and the number of pageviews associated with them.
You also mentioned you'd want to be able to filter by value.
You'd do that by setting the filter value to something like ga:customVarValue2==Apple.
You can see what a query like that would look like here in the query explorer.
Finally, all Google Analytics API queries require you to set a date range, so you would supply the start and end dates yourself.
All you need to do is decide which library you want to use as interface, and you're set to go.
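For reference, a bare HTTP sketch of such a query in Python (assuming the Core Reporting API v3, a previously obtained OAuth access token, and a placeholder profile ID) might look like this:

import requests

# Profile (view) ID and OAuth 2.0 access token are placeholders
params = {
    "ids": "ga:12345678",
    "start-date": "2011-01-21",
    "end-date": "2011-01-21",
    "metrics": "ga:pageviews",
    "dimensions": "ga:customVarName2,ga:customVarValue2",
    "filters": "ga:customVarValue2==Apple",   # optional: restrict to one value
    "access_token": "YOUR_ACCESS_TOKEN",
}
resp = requests.get("https://www.googleapis.com/analytics/v3/data/ga", params=params)
# Each row is [customVarName2, customVarValue2, pageviews], e.g. ["Brand", "Apple", "3214"]
for row in resp.json().get("rows", []):
    print(row)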
Google has a handy resource, called the Google Analytics Data Explorer, that can help answer a lot of your questions by letting you experiment through an interface, as long as you log in with your Google Analytics credentials.
As you add parameters using their tools, the system will automatically build your URL/Query.
If that's not enough, Google also has some interactive examples using JavaScript. Like the Data Explorer, you can log in with your Google Analytics credentials and run the examples to see what data would be returned.
These tools are awesome because they help take the guesswork out of figuring out how to target the exact data you're searching for.