I'm integrating with the Sportradar API. It's a pretty good API, but I'm noticing it doesn't have a particular endpoint I was hoping to leverage.
The API does not have a players index call. To get all the players in the league, you have to go through this process (confirmed by the API team):
1) Call an endpoint that lists the ids of each team in the league.
2) For each Team id call an endpoint that gets each team and lists each player for each team.
All in all, that's over 30 API requests every time I need to run a fairly common function.
MY CURRENT SOLUTION:
Store the players in the database. Instead of reaching out through the API every time I need to list the players, I can just make a DB query. This solution seems heavy-handed, though; I'd rather my application not have to keep track of the players itself.
MY QUESTION
What are some ways of solving this problem? To restate: what are ways of avoiding many API calls to fetch common data? Is the best solution just to make the 30 API calls each time? Thank you!
I am working with the Sportradar API and we had a similar problem. To solve it, we use a cache so we have this information faster and already filtered: basically, we connect Redis and run a cron job in a service that updates the data (in our case, every 24 hours).
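A minimal sketch of that approach, with hypothetical stand-ins for the two Sportradar endpoints described above and a plain dict standing in for Redis so the example is self-contained:

```python
import json
import time

# Hypothetical stand-ins for the two endpoints described above.
def fetch_team_ids():
    return ["team-1", "team-2"]

def fetch_team_roster(team_id):
    return [f"{team_id}-player-{n}" for n in (1, 2)]

CACHE = {}                    # stand-in for Redis in this sketch
TTL_SECONDS = 24 * 60 * 60    # refresh once a day, as in the answer above

def refresh_players():
    """Run from a cron job: do the ~30 calls once, cache the combined result."""
    players = []
    for team_id in fetch_team_ids():
        players.extend(fetch_team_roster(team_id))
    CACHE["players"] = (time.time(), json.dumps(players))

def get_players():
    """Serve the common query from cache; refresh only if stale or missing."""
    entry = CACHE.get("players")
    if entry is None or time.time() - entry[0] > TTL_SECONDS:
        refresh_players()
        entry = CACHE["players"]
    return json.loads(entry[1])
```

With a real Redis instance you would use `SETEX` with the TTL instead of the tuple, but the shape is the same: the 30-request fan-out happens once per refresh window, and every other caller reads the cached list.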
Related
I am developing an app in Flutter and React Native to display realtime weather for a list of cities. I am retrieving the weather data from OpenWeatherMap. I am finding it hard to understand the concept of calls per minute. There is a limit of 60 calls/minute on the free plan, 600 on the Startup plan, and so on.
My question is: if by some miracle my app has 100,000 users in the future, and they all have 5 cities in their app, does this mean that to get realtime weather data the number of API calls would be 5 × 100,000 = 500,000 at any given moment for my single registered API key?
Even with an enterprise plan (200,000 calls/minute), this seems not achievable.
Am I missing something here? I am interested in realtime data, not historical. The analogy extends to a stock trading app as well, where I would loop fetchData() on an interval of 1 or 2 seconds. But if there are thousands of users, I'm not sure how to handle the calls.
Please help me understand how I can achieve this or if I'm wrong somewhere.
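One common answer (a sketch under assumptions, not official OpenWeatherMap guidance): don't let each client call the API directly. Route all clients through your own backend, which polls OpenWeatherMap once per city per refresh interval and serves every user from that shared cache. Upstream call volume then scales with the number of distinct cities, not users × cities. The function below is purely illustrative:

```python
# Sketch: with a shared backend cache, upstream API calls scale with
# distinct cities per refresh interval, not with the number of users.

def calls_per_interval(user_city_lists):
    """One upstream call per distinct city per refresh interval."""
    distinct = set()
    for cities in user_city_lists:
        distinct.update(cities)
    return len(distinct)

# 100,000 users all watching the same 5 popular cities:
users = [["London", "Paris", "Tokyo", "Delhi", "New York"]] * 100_000

# Naive per-client polling: 100,000 * 5 = 500,000 calls per interval.
# Shared cache: only 5 calls per interval, one per distinct city.
```

In the worst case (every user watches different cities) the cost is the number of distinct cities, which is still bounded and far below users × cities in practice.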
Is it possible to create multiple API keys for the YouTube Data API?
The majority of Live YouTube Subscriber Counters use loads of different API keys for their counters (as can be seen in their JavaScript code).
The aim of doing so is to avoid exceeding the daily quota limit of 1,000,000; sending requests every few seconds per page visited would mean the limit is reached very quickly.
How are they able to get away with this?
Here is a SO post to answer your question:
Technically you can run your application using different API keys; it should work fine. There is nothing wrong with creating additional projects on the Google Developer Console. You don't need to go as far as creating another Google account.
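For illustration only, key rotation is typically just a round-robin over several keys, one per project. The keys below are placeholders, and the URL follows the shape of the public YouTube Data API v3 channels.list endpoint; whether rotating keys this way complies with Google's quota policies is a separate question from the mechanics:

```python
from itertools import cycle

# Placeholder keys, one per Developer Console project.
API_KEYS = ["key-project-a", "key-project-b", "key-project-c"]
_key_iter = cycle(API_KEYS)

def next_api_key():
    """Round-robin over the configured keys, one per request."""
    return next(_key_iter)

def build_request_url(channel_id):
    """Build a channels.list URL (statistics part) using the next key."""
    return ("https://www.googleapis.com/youtube/v3/channels"
            f"?part=statistics&id={channel_id}&key={next_api_key()}")
```

Each request draws against a different project's quota, which is what the subscriber-counter pages appear to be doing in their client-side JavaScript.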
I have been looking online and saw many similar posts, but all were extremely old (the latest I found was from 2011), so since technology changes, I thought I'd ask too.
I wonder how a flight comparison website (where you cannot book flights and are only redirected to other websites) gets its data.
Is it all through APIs by now, or is it through scraping data (which would not be so reliable)? I've been reading online trying to find out, but it doesn't really seem that EVERY airline and EVERY flight search website (with a booking option) provides an API. So how do sites like Kayak get their data if not every airline and every flight booking website provides one?
Also, I came across some APIs like:
QPX Express API
Skyscanner Travel API (which I checked out on a website that uses it, and the data does seem quite limited?!)
Travelport API
Amadeus API
Sabre Travel API
Wego Affiliate Network (which seems really great, but search takes super long)
I wonder if anyone has experience with the mentioned APIs, how good they are, and whether using them is "the way" of doing it, or whether it's actually more reliable to request data directly from each airline and booking website (if that's possible)?
Thanks a lot!
If we take Kayak as the example, as that is who you mentioned, they approach the data in two forms.
They have API PULL connections to GDS companies (i.e. Sabre), some airlines and large online travel companies such as Expedia etc.
Smaller airlines in particular PUSH their inventory and fares to companies such as Kayak.
Aggregation companies generally provide PUSH access, though companies who want to PUSH their data have to comply with the aggregator's requirements/standards.
It is a supply and demand service. Aggregation companies will generally request access to large established companies, however, will also allow companies to push their data to them if they wish.
The data is not normally scraped; it comes through API and web service platforms.
If I may ask, I was wondering how the API request limit handles calls to multiple stores?
Scenario:
We have one backend service polling a store every 5 seconds with one core request plus 'x' requests for images by productId for each item in LineItems (I doubt 'x' will be an extraordinary figure). I'm curious: if we had five stores and five background services polling their respective stores, would they all count toward one request limit? I.e., how is this tracked? By IP?
I'm hoping it's on a per-store basis, so each store has its own "bucket". I have read through some of the documentation, but I'm not sure it gives me the knowledge I require.
There are a lot of older posts and articles, all with conflicting info.
I fully appreciate this is not a programming issue per se, but I was hoping this was the place for such a question. As ever, appreciate everyone's time.
Regards,
All shops are provided with a limit of two API calls per second, which kicks in only after you've burned through a burst bucket of 40 calls. If you are polling a shop every 5 seconds, you are in no danger of hitting any limits. Any one app can call N stores; the limit is per store, not per app.
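The limit described above is a leaky bucket: a burst capacity that drains at a steady rate. A minimal client-side model, with the 40-call / 2-per-second numbers from the answer as defaults (one bucket per store, since the limit is per store):

```python
import time

class LeakyBucket:
    """Client-side model of a leaky-bucket limit: a burst capacity
    (e.g. 40 calls) that drains at a steady rate (e.g. 2 calls/sec)."""

    def __init__(self, capacity=40, leak_per_sec=2.0):
        self.capacity = capacity
        self.leak_per_sec = leak_per_sec
        self.level = 0.0
        self.last = time.monotonic()

    def _drain(self):
        """Lower the level by the amount leaked since the last check."""
        now = time.monotonic()
        self.level = max(0.0, self.level - (now - self.last) * self.leak_per_sec)
        self.last = now

    def try_acquire(self):
        """Return True if one more call fits in the bucket right now."""
        self._drain()
        if self.level + 1 <= self.capacity:
            self.level += 1
            return True
        return False
```

Keeping one `LeakyBucket` per store and checking `try_acquire()` before each request lets a polling service stay under the limit even when several background services run side by side.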
The problem is that my application sends a large amount of user data, around 1000+ entries. The API works in a linear fashion where one user's grade data is inserted at a time, which results in my product's session timing out. While we can always increase the session timeout on our end, I wanted to check whether D2L provides any API that PUTs/pushes multiple user grades. Any alternative approach would also be appreciated.
Currently, the Valence Learning Framework API does not provide a way to do bulk-creation of grade objects or values. This is a feature that several clients have asked for and it is on the development roadmap for the platform; however, D2L does not yet have a firm estimate on delivery for this functionality.
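In the absence of a bulk endpoint, a common workaround is to move the per-user loop out of the web request into a background job, so the product's session timeout no longer applies, and to retry transient failures. The sketch below uses a hypothetical `put_grade` placeholder for the per-user Valence grade-value PUT:

```python
import time

def put_grade(user_id, grade):
    """Placeholder for the per-user Valence grade-value PUT call."""
    return True

def push_grades(grades, max_retries=3):
    """Send each (user_id, grade) pair one at a time from a background
    worker, retrying transient failures with exponential backoff.
    Returns the user ids that still failed after all retries."""
    failed = []
    for user_id, grade in grades:
        for attempt in range(max_retries):
            if put_grade(user_id, grade):
                break
            time.sleep(2 ** attempt)  # simple exponential backoff
        else:
            failed.append(user_id)
    return failed
```

Run from a queue or cron-driven worker, 1000+ sequential PUTs become a throughput problem for the worker rather than a timeout problem for the user's session, and the returned list gives you a natural retry set for a follow-up pass.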