rails heroku: bouncing/masking ip addresses while making requests in heroku - ruby-on-rails-3

I want to be able to pull how many followers Twitter accounts have in Rails. However, I want to do this for 10,000+ accounts each day, and I am only allowed around 150 requests per IP address.
I am a newb to Rails, but I have heard of solutions like IP masking, bouncing, and proxy servers to get around this problem.
I have also heard that Heroku IPs change all the time for your app, so this may not be a problem.
My main question is... can anyone explain what strategies are possible for making more calls to a rate-limited API from a Rails app on Heroku?

Trying to circumvent the restrictions of the API is a very bad idea. You can require users to authorize with Twitter in order to get certain requests to count against their individual API limits instead of yours.
Also, not all calls are rate limited. Some have individual limits, others are limited as part of a group. Look into more creative ways to use the API in ways that reduce the number of requests you need to make.
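One of the "more creative" approaches suggested above is batching: Twitter's v1.1 `users/lookup` endpoint accepts up to 100 screen names per request, so 10,000 accounts can be covered in 100 requests rather than 10,000. A minimal sketch of the batching arithmetic, where the screen names and the `fetch_batch` helper are illustrative placeholders rather than a real client:

```ruby
# Hypothetical list of accounts to check each day.
accounts = (1..10_000).map { |i| "user#{i}" }

def fetch_batch(screen_names)
  # Placeholder: a real implementation would make an authenticated GET to
  # https://api.twitter.com/1.1/users/lookup.json?screen_name=a,b,c
  # and read followers_count from each returned user object.
  screen_names.map { |name| [name, nil] }.to_h
end

# Split the accounts into groups of up to 100 per API call.
batches = accounts.each_slice(100).to_a
puts batches.size  # => 100 requests cover all 10,000 accounts
```

Combined with per-user authorization (each authorized user brings their own rate-limit window), this can reduce the request count dramatically without circumventing anything.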

Related

Time out request with Here API

I would like to know whether HERE servers automatically reject IPs associated with Tor exit nodes. I ask because I've tried many times to make a request to the API, which never answered.
When checking the HERE FAQ, it does not say anything about blocking certain IPs (for example, from Tor), but it does tell you a bit about the limits they enforce, such as 250,000 transactions per month.
That's probably a good starting point for checking the limits of the HERE API.

API call request limit

I have been looking into various different APIs which can provide the weather data I need in JSON format. A lot of these APIs have certain limits: in order to get more requests per minute, you need to pay more money per month.
However, a lot of these APIs also have free accounts which give you limited access.
So what I was thinking is, wouldn't it be possible for a developer to just make lots of different developer accounts with an API provider and then just make lots of different API keys?
That way, they wouldn't have to pay anything as they could stick with the free accounts. Whenever one of the API keys has reached the maximum daily request calls, the developer could just put a switch statement in their code which gets their software to use a different API key.
I see no reason why this wouldn't work from a technical point of view... but, is such a thing allowed?
Thanks, Dan.
This would technically be possible, and it happens.
It is also probably against the service's terms, a good reason for the service to ban all your sock-puppet accounts, and perhaps even illegal.
If the service that offers the API has spent time and money implementing a per-developer limit for their API, they have almost certainly enforced that in their terms of service, and you would be wise to respect those.
(relevant xkcd)

Foursquare API limitation and design advice

Foursquare encourages developers to use caching aggressively rather than making repetitive calls to the Foursquare API, in order not to exceed the hourly usage limit (5,000 requests/hour).
So, does that mean it is a bad idea to access the Venues API directly from a mobile app?
Do we need to make our mobile app retrieve results from our server instead of calling Foursquare directly?
Thanks
Accessing Foursquare using your own servers as a proxy is generally encouraged. As you stated, it's better for caching, but it also has the advantages of:
- being able to make changes quickly without waiting for an app submission (changes therefore propagate to all your users, even ones who don't actively upgrade)
- being able to collect and aggregate information about call volume, errors, etc. more easily
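The caching half of that proxy can be sketched in a few lines. This is a hedged illustration, not Foursquare's API: `VenueCache` and the block passed to `fetch` are invented names, and a production version would use something like Redis or Rails.cache with expiry instead of an in-memory hash.

```ruby
# Server-side cache sitting between the mobile app and Foursquare:
# repeated requests for the same venue are served from the cache and
# don't count against the 5,000/hour limit.
class VenueCache
  def initialize(ttl_seconds)
    @ttl = ttl_seconds
    @store = {}   # venue_id => [fetched_at, data]
  end

  def fetch(venue_id)
    fetched_at, data = @store[venue_id]
    return data if data && (Time.now - fetched_at) < @ttl
    data = yield(venue_id)                # only hit Foursquare on a miss
    @store[venue_id] = [Time.now, data]
    data
  end
end

cache = VenueCache.new(3600)              # cache venues for one hour
upstream_calls = 0
2.times do
  cache.fetch("venue123") { |id| upstream_calls += 1; { id: id, name: "Cafe" } }
end
puts upstream_calls  # => 1 (second request served from cache)
```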

Get many geo maps for addresses

How can I get maps for addresses without request limits? Google provides only 2,500 requests per day. First of all, I want to use free services. Thank you.
You left a ton of info out... What the heck is "maps for addresses"? Do you mean map tiles? Or are you talking about geocoding, like getting addresses for maps?
Is it a website making the calls, or mobile? Where are you executing the code from?
If you are talking about reverse geocoding (getting an address from a GPS coordinate), then there are tricks you can use to get around those limits. If the limit is based on a key, then it's a 2,500 limit for that key. However, there are APIs that limit based on the calling IP (Google is one). If you make the client make the call, then unless a single client is making 2,500 calls, you're good to go.
You will notice here that the geocoding call doesn't require an API key, so the usage limit is going to be based on the calling IP:
https://developers.google.com/maps/documentation/geocoding/#GeocodingRequests
Here's a similar question: Usage limit on Bing geocoding vs Google geocoding?.
Google will start denying your requests at around 2,500. Bing has a much higher daily limit (it used to be 30k; I think it's up to 50k now).
There are a number of free geocoding services. I recommend staggering your requests across multiple services if you need a large number of addresses coded daily. Here's a list of 54 providers: http://www.programmableweb.com/apitag/geocoding.
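The staggering idea can be sketched as a simple quota-aware rotation. The provider names and quota numbers below are illustrative assumptions (quotas change over time), and `pick_provider` stands in for whatever dispatch your real geocoding client would do:

```ruby
# Rotate across providers so no single free daily quota is exhausted.
providers = [
  { name: "google", daily_limit: 2_500,  used: 0 },
  { name: "bing",   daily_limit: 50_000, used: 0 }
]

def pick_provider(providers)
  # Use the first provider with quota remaining today.
  p = providers.find { |x| x[:used] < x[:daily_limit] }
  raise "all daily quotas exhausted" unless p
  p[:used] += 1
  p
end

3_000.times { pick_provider(providers) }
counts = providers.map { |p| [p[:name], p[:used]] }.to_h
puts counts  # => {"google"=>2500, "bing"=>500}
```

A real version would also reset the counters daily and handle per-provider error responses, but the routing logic is this simple.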

Conditional Rate Limiting (Nginx or Webapp)?

I am implementing a REST API which requires throttling. I know that, ideally, you would place this logic in nginx. However, I have some unique constraints.
Namely, I have one class of users who should NOT be rate limited, so implementing the limit on a per-IP basis (the usual nginx way) would not be useful.
Users of the API are differentiated by API key. Using a caching system, I could count requests per API key and throttle accordingly, but that involves more setup and, I would imagine, is not as scalable.
Any suggestions?
You could set up multiple virtual hosts that are individually throttled at different limits, do your count, and then redirect selected users to those virtual hosts to be throttled.
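The application-level alternative the asker describes (count per API key, exempt one class of users) can be sketched as Rack middleware. This is a minimal illustration under assumptions: the class name and `X-Api-Key` header are invented, and a production version would use Redis with expiring keys and a sliding window rather than an in-memory hash that never resets.

```ruby
# Rack middleware: throttle per API key, skipping exempt (unlimited) keys.
class ApiKeyThrottle
  def initialize(app, limit:, exempt: [])
    @app    = app
    @limit  = limit        # max requests per key per window
    @exempt = exempt       # keys that are never throttled
    @counts = Hash.new(0)  # api_key => request count
  end

  def call(env)
    key = env["HTTP_X_API_KEY"]
    unless @exempt.include?(key)
      @counts[key] += 1
      if @counts[key] > @limit
        return [429, { "Content-Type" => "text/plain" }, ["rate limited"]]
      end
    end
    @app.call(env)
  end
end

# Usage sketch: wrap any Rack app.
app = ->(env) { [200, { "Content-Type" => "text/plain" }, ["ok"]] }
throttled = ApiKeyThrottle.new(app, limit: 2, exempt: ["vip-key"])
```

Whether this lives in the app or in nginx is the trade-off the question is about: the middleware version is simpler to make key-aware, while nginx keeps the rejected traffic out of your app servers entirely.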