There is a CRM system that provides webhooks to notify about events happening in it (for instance, a user creates an entity or moves it to the next 'status'). On such an event, a webhook triggers my script, which creates a new entity in the CRM through its API. The API has a rate limit of 7 requests per second; violating it may get the account's API access restricted for some time.
The problem is that if a user changes the 'status' of 1000 entities, the webhook triggers my script 1000 times, so it calls the API 1000 times, which may violate the rate limit. Is there any way to temporarily 'store' all the requests coming from the webhook and then launch the script no more than n times per second?
The script is written in PHP and currently lives on my Apache server, but later it may be moved to a client's server or elsewhere.
I've read about RabbitMQ and Kafka, but they seem like overkill for this task. Or maybe they're fine? I just don't have enough experience with these systems.
Any help would be appreciated.
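One lightweight option, before reaching for RabbitMQ, is to have the webhook handler do nothing but append the incoming payload to a queue, and have a separate worker drain that queue at no more than 7 calls per second. Here is a minimal sketch of the pacing logic in Python (the same pattern ports to PHP; persistence of the queue, e.g. a file, database table, or Redis list, is deliberately left out, and all names are illustrative):

```python
import time
from collections import deque

class RateLimitedQueue:
    """Buffer incoming webhook jobs and release at most `rate` per second."""

    def __init__(self, rate=7):
        self.rate = rate
        self.jobs = deque()        # pending jobs, oldest first
        self.sent_times = deque()  # timestamps of recent API calls

    def enqueue(self, job):
        """Called by the webhook handler: just store, never call the API."""
        self.jobs.append(job)

    def _can_send(self, now):
        # Drop call timestamps that fell out of the one-second window.
        while self.sent_times and now - self.sent_times[0] >= 1.0:
            self.sent_times.popleft()
        return len(self.sent_times) < self.rate

    def drain(self, call_api, now=None):
        """Called by the worker loop: send as many queued jobs as the
        current one-second window allows, and return how many were sent."""
        now = time.monotonic() if now is None else now
        sent = 0
        while self.jobs and self._can_send(now):
            call_api(self.jobs.popleft())
            self.sent_times.append(now)
            sent += 1
        return sent
```

The worker would call `drain()` in a loop (e.g. from a long-running process or a frequent cron job); bursts of 1000 webhook hits then simply sit in the queue and trickle out at 7 per second.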
This is my issue:
I have an API that updates the result displayed on an endpoint every 30 seconds
I want to display that result on my website for all visitors, updating it automatically every 30 seconds
I don't want each visitor to send a request to my API, as that would overwhelm it (and it is clearly not the right way to implement this, I guess).
Is there a way to send one request every 30 seconds from my website's backend to my API in order to display the result for all visitors on the website?
Or maybe there is another smarter/more efficient way to do it?
Another question: I want exactly the same "front" website content for all users. There would be requests from the frontend to the backend to fetch the information, but I don't want some users to get it earlier just because their requests landed a few seconds before other users'. I was thinking of sending requests based on a fixed clock (GMT+1, for example); I don't know if that makes sense or if there is another way.
N.B.: I'm using Wix services; maybe I would have to switch and build my own website
Thanks a lot for your answers
Hugo
Is there a way to send one request every 30 seconds from my website's backend to my API in order to display the result for all visitors on the website?
You could use realtime communication between your website and your backend server, for example Socket.IO or SignalR, depending on the stack you are using. Instead of each client sending a call every 30 seconds, the API server would dispatch an event to all clients telling them to update (or send the data along when you dispatch the event).
I want exactly the same "front" website content for all users.
As for your second question: if you opt for realtime communication to sync data between backend and frontend, there should be no need for requests on a timer, since the data is always pushed actively from the server side.
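Whichever library you pick (Socket.IO, SignalR, plain WebSockets), the underlying pattern is the same: a single poller fetches the API once per interval and fans the result out to every connected client at the same moment, so nobody gets the data early. A library-free sketch of that fan-out using Python's asyncio (class and function names here are illustrative, not part of any framework):

```python
import asyncio

class Broadcaster:
    """One poller fetches the API result; every connected client
    gets the same update pushed at the same moment."""

    def __init__(self):
        self.subscribers = set()

    def subscribe(self):
        """Register a client connection; returns its personal inbox."""
        q = asyncio.Queue()
        self.subscribers.add(q)
        return q

    def publish(self, data):
        """Push one update to every connected client at once."""
        for q in self.subscribers:
            q.put_nowait(data)

async def poll_loop(broadcaster, fetch, interval=30, cycles=None):
    """Fetch from the upstream API once per interval and fan out.
    `cycles` bounds the loop for testing; None means run forever."""
    n = 0
    while cycles is None or n < cycles:
        broadcaster.publish(fetch())
        n += 1
        if cycles is None or n < cycles:
            await asyncio.sleep(interval)
```

In a real deployment each subscriber queue would feed one WebSocket connection; the point is that the upstream API sees exactly one request per 30 seconds no matter how many visitors are connected.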
I have successfully set up a DM bot with the Account Activity API. Everything works very well, except that sometimes a message sent to the bot (through Twitter's web interface or mobile application) doesn't fire a webhook to my server. The messages may be quick-reply responses or plain text.
The reason is obviously not downtime on my server: I set up a conversation between two webhook-registered users (so my server receives the webhooks for both), and for the same message I successfully received the sender's (the user's) webhook but not the recipient's (the bot's).
As the bot isn't in production yet, the reason is not an overload of messages; there are currently only two users having conversations. In my experience, around 10% of messages are "lost".
I'm using the free (sandbox) Account Activity API tier, but as I understand it, the only differences between the free and paid versions are a higher number of subscriptions (I'm fine with 15) and the "Retries" feature. Regarding that feature, the docs specify that "The Account Activity API provides a retry feature when the client's web app does not return a 'success' 200 response for an account activity webhook event."
It clearly states that the failure concerns the client's side, not Twitter's. Given this issue (my server doesn't receive the webhook at all), there is no guarantee that every event will be delivered, even on a paid plan.
This is a big inconvenience for bots, since a button can only be clicked once, so the user must restart the conversation from the beginning (besides the fact that the bot "doesn't work"...).
So my questions are:
Has anyone here experienced this issue?
Is this a "bug or a feature" of the free Account Activity API? I mean, does the free tier randomly not fire the webhook on purpose (even if that's not specified in the docs)?
Is there a way to see or measure webhook failures on Twitter's side, via the dashboard for instance?
A guess: could events be more reliable if the account is verified (with a blue badge) or passes a follower-count threshold? Such accounts could be treated differently because of the potential surge of events, monitored with more resources, and thus be more reliable?
I already created a topic on the official Twitter forum, and there is at least one other person in the same situation, but no official answer from Twitter so far.
Thanks a lot !
BR,
Simon
I got an official answer from Twitter:
Unfortunately it is not possible to achieve 100% delivery rate when there is only 1 delivery attempt for an event, which is why we have retries (and even then, retries are not a guarantee either). Things can go wrong; maybe internal issues in Twitter Data Centers, routing issues in the internet, hosting issues at your webhook, etc.
So for the time being, it seems there is no way to achieve 100% delivery when you build a bot on Twitter.
Full answer can be read here.
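Since retries (where available) work by redelivering any event your server did not answer with a 200, they can also deliver the same event twice. A practical mitigation is to make the webhook handler idempotent: always answer 200 quickly, and dedupe by event id so a redelivery never causes a duplicate action. A minimal sketch in Python (the payload shape and the `id` field are assumptions here; check them against the Account Activity payload documentation):

```python
class DedupingHandler:
    """Process each webhook event at most once, so a retry/redelivery
    of the same event cannot trigger a duplicate bot action."""

    def __init__(self):
        self.seen_ids = set()   # in production: a persistent store
        self.processed = []     # events actually acted upon

    def handle(self, event: dict) -> int:
        """Return the HTTP status code the webhook endpoint should send."""
        event_id = event.get("id")
        if event_id in self.seen_ids:
            return 200  # duplicate delivery: acknowledge, do nothing
        self.seen_ids.add(event_id)
        self.processed.append(event)
        return 200  # always 200, so the sender does not keep retrying
```

This does not fix events Twitter never sends, but it makes the retry feature safe to rely on when it does fire.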
Does anyone know how many API calls PayPal allows each day? I'm using GetRecurringPaymentsProfileDetails to check whether a payment has been made successfully, and I might be making that API call many times a day. They list the error code in their docs but don't specify the rate limit.
Thanks
I've never run into any daily limit issues with API calls. That said, I try to avoid hitting the API that many times.
I would recommend you take a look at Instant Payment Notification (IPN). It automatically POSTs details about transactions to a listener URL on your server. That script receives the data and updates your system accordingly. This happens in real time, so everything updates automatically and immediately. This way you don't have to hit the API to pull details, because IPN feeds them to you as they happen.
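For reference, the IPN handshake itself is simple: your listener reads the POSTed message, echoes the exact same body back to PayPal with `cmd=_notify-validate` prepended, and PayPal answers VERIFIED or INVALID. A sketch of that step in Python (the endpoint URL reflects PayPal's documented live IPN host, but verify it against the current docs before relying on it):

```python
from urllib.parse import parse_qsl

# PayPal's live IPN verification endpoint (sandbox uses ipnpb.sandbox.paypal.com).
IPN_VERIFY_URL = "https://ipnpb.paypal.com/cgi-bin/webscr"

def build_verify_body(raw_ipn_body: str) -> str:
    """PayPal requires the listener to echo the message back unchanged,
    with cmd=_notify-validate prepended, to prove it really came from them."""
    return "cmd=_notify-validate&" + raw_ipn_body

def parse_ipn(raw_ipn_body: str) -> dict:
    """Decode the url-encoded fields of the notification for local use."""
    return dict(parse_qsl(raw_ipn_body, keep_blank_values=True))
```

Your listener would POST `build_verify_body(...)` to `IPN_VERIFY_URL` and only trust the transaction fields if the response body is `VERIFIED`.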
We already have a system in place that uses RESTful APIs to send, let's say, SMS. All of our clients use our server to send their requests to the REST API, so we drop all connections except those from our server's IP to handle authentication.
Now the policy has changed: we want to expose our APIs to the outside world, and we want to be able to push to users under specific circumstances. Say I want to send a delivery report to the user when an SMS has been delivered, or to notify the user when something scheduled for a specific time comes due.
How should these notifications be handled? Has anyone used the same or a similar approach?
Assuming you can reach your clients back via HTTP, the model for this is callbacks: when someone posts a scheduled job to your server, they should also post a callback URI where your server can notify them when the job is complete.
Sample below:
https://schedulingServer.com/runSchedule?callback=http://clientserver.com/reportStatusHere
So when the job is done, your callback call will look like
http://clientserver.com/reportStatusHere?jobId=12345&status=complete
Or, if your clients are mobile apps on Android, you can use Google push notifications.
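The callback itself is just a URL your server assembles and requests when the job finishes. A small Python helper (names are illustrative, not an existing API) that builds it while also coping with client callback URIs that already carry a query string:

```python
from urllib.parse import urlencode, urlsplit

def build_callback_url(callback_base: str, job_id: str, status: str) -> str:
    """Append the job outcome to the client-supplied callback URI.
    The scheduling server would then GET/POST this URL to notify the client."""
    # Use '&' if the client's URI already has query parameters, '?' otherwise.
    sep = "&" if urlsplit(callback_base).query else "?"
    return callback_base + sep + urlencode({"jobId": job_id, "status": status})
```

Passing the parameters through `urlencode` keeps the URL valid even if a status value ever contains spaces or other reserved characters.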
I am getting this error whenever I try to follow someone on Instagram via API no matter how many follows have been done before:
{"meta":{"error_type":"APIError","code":400,"error_message":"Client request limit reached"}}
My app allows authenticated users to follow interesting people. I know there is a 5,000 calls/hour limit per authenticated user, but it fails even with new users.
Is my app hitting some kind of client-level limit?
APIs like follow, unfollow, and comment are limited to 350 requests per hour. Sending those requests from the client side will mitigate the problem to some extent, but it exposes your API token to users.
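If you stay server-side, you can at least stop issuing calls before the API starts rejecting them, by budgeting each user's write actions against that hourly window. A sketch in Python (the 350/hour figure is taken from the answer above, not verified against current Instagram documentation; names are illustrative):

```python
import time

HOURLY_LIMIT = 350  # per-user write limit cited above (follow/unfollow/comment)

class HourlyBudget:
    """Track write calls per authenticated user so the app can defer work
    instead of triggering 'Client request limit reached' errors."""

    def __init__(self, limit=HOURLY_LIMIT):
        self.limit = limit
        self.calls = {}  # user_id -> timestamps of calls in the last hour

    def try_call(self, user_id, now=None):
        """Return True and record the call if the user is under budget;
        return False (caller should queue/defer) if the hour is spent."""
        now = time.time() if now is None else now
        window = [t for t in self.calls.get(user_id, []) if now - t < 3600]
        if len(window) >= self.limit:
            self.calls[user_id] = window
            return False
        window.append(now)
        self.calls[user_id] = window
        return True
```

Deferred follows can then be retried once the hour rolls over, instead of surfacing API errors to the user.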
In this case it looks like it would be beneficial to gather some more data from your users. You could use Google Analytics to track the "follow" action:
https://developers.google.com/analytics/devguides/collection/gajs/eventTrackerGuide
This would give you a timestamp and information about user behavior.
Even with an advertised rate of X requests per hour, one user hammering the service with your API key can cause everyone to get throttled. (Not guaranteed, but it's common practice for companies to do this to keep their services alive.)
It might be a good idea to reset your API key; it's possible (though unlikely) that someone has acquired it and is using it.