This is my issue:
I have an API that updates the result displayed on an endpoint every 30 seconds.
I want to display that result on my website for all visitors, and therefore refresh it automatically every 30 seconds.
I don't want each visitor to send a request to my API, as that would overwhelm the API (and it is clearly not the right way to implement this, I guess).
Is there a way to send one request every 30 seconds from my website backend to my API in order to display it for all visitors on the website?
Or maybe there is a smarter/more efficient way to do it?
Another question: I want exactly the same "front" website content for all users. I mean, there would be requests from the frontend to the backend to get the information, but I don't want some users to have the information earlier just because their requests arrive a few seconds before other users'. I was thinking of sending requests based on a fixed clock (GMT+1, for example); I don't know if that makes sense or if there is another way?
N.B.: I'm using Wix services; maybe I would have to switch and build my own website.
Thanks a lot for your answers
Hugo
Is there a way to send one request every 30 seconds from my website backend to my API in order to display it for all visitors on the website?
You could use realtime communication between your website and your backend server, for example Socket.IO or SignalR, depending on the stack you are using. Instead of every client sending a call every 30 seconds, the server dispatches an event to all connected clients telling them to update (or sends the data along with the event).
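For example, a minimal sketch of that push model with Socket.IO might look like this (Node/TypeScript; the API URL, port, and event name are made up, and the built-in fetch assumes Node 18+):

```typescript
import { Server } from "socket.io";

// One Socket.IO server that every visitor's browser connects to.
const io = new Server(3000, { cors: { origin: "*" } });

const API_URL = "https://example.com/api/result"; // hypothetical endpoint

// Poll the API once every 30 seconds on the backend only...
setInterval(async () => {
  const res = await fetch(API_URL);
  const result = await res.json();
  // ...and push the fresh data to all connected clients at once.
  io.emit("result-updated", result);
}, 30_000);
```

On the browser side, the page would connect with socket.io-client and listen for that event instead of polling the backend itself.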
I want exactly the same "front" website content for all users.
As for your second question, if you opt for realtime communication to sync data between backend and frontend, there is no need for requests sent on a timer: the data is always pushed actively from the server side, so every connected visitor receives the update at the same moment.
Related
There is a CRM system that provides webhooks to notify about certain events happening in it (for instance, a user creates an entity or moves it to the next 'status'). On such an event a webhook triggers my script, which creates a new entity in the CRM using its API. The API has a rate limit of 7 requests per second; violating this rule may get the account's API access restricted for some time.
The problem is that if a user changes the 'status' of 1000 entities, the webhook triggers my script 1000 times, so it calls the API 1000 times, and that may violate the rate limit. Is there any way to temporarily 'store' all the requests that come from the webhook and then run the script no more than n times per second?
The script is written in PHP and currently lives on my Apache server, but later it may be moved to a client's server or somewhere else.
I've read about RabbitMQ and Kafka, but they seem like overkill for this task. Or maybe they're OK? I just don't have enough experience with these systems.
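To illustrate what I mean by 'storing' the webhook calls, here is a rough sketch of the pattern I have in mind (Node/TypeScript rather than PHP, just to show the idea; the endpoint path and CRM URL are made up):

```typescript
import express from "express";

const RATE_LIMIT = 7; // the CRM allows 7 requests per second
const queue: Array<() => Promise<void>> = [];

const app = express();
app.use(express.json());

// The webhook only enqueues work; it never calls the CRM API directly.
app.post("/webhook", (req, res) => {
  queue.push(() => createEntityInCrm(req.body));
  res.sendStatus(202); // accepted, will be processed later
});

// Every second, drain at most RATE_LIMIT items from the queue.
setInterval(() => {
  for (const task of queue.splice(0, RATE_LIMIT)) {
    task().catch(console.error);
  }
}, 1000);

// Placeholder for the actual CRM API call.
async function createEntityInCrm(payload: unknown): Promise<void> {
  await fetch("https://crm.example.com/api/entities", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
}

app.listen(8080);
```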
Any help would be appreciated.
On the page https://developers.soundcloud.com/
the "Register a new app" option says "Currently unavailable".
A question for the developers: how soon will app registration be fixed?
Is it possible to register my application manually?
I need my clientId!
Apparently they haven't given out new client IDs for months (another user says so).
To answer your question though, no. As Soundcloud says when you attempt to register an app, "...we will no longer be processing API application requests at this time."
I'm trying to use the API too. It seems we either need to use client IDs belonging to other people, wait an unspecified period of time, or rely on the select uses of their API that don't require a client ID.
It depends on what kind of content you want to access.
If you only need public content, you can obtain a client ID from the AJAX requests your browser makes to SoundCloud. Open the network debugger in your browser and explore the AJAX requests; you should see some queries made to api-v2.soundcloud.com, and the query string should contain the client ID:
https://api-v2.soundcloud.com/tracks/687584815/playlists_without_albums?offset=85&limit=5&client_id=xxxxxxxxxxxxxxxxxxxx
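For example, replaying that same query from your own code with the extracted client ID could look roughly like this (just a sketch; the client_id value is a placeholder you must replace with the one you copied from the network tab):

```typescript
// Sketch: replay the api-v2 query seen in the browser's network tab.
const CLIENT_ID = "xxxxxxxxxxxxxxxxxxxx"; // placeholder

const url =
  "https://api-v2.soundcloud.com/tracks/687584815/playlists_without_albums" +
  `?offset=85&limit=5&client_id=${CLIENT_ID}`;

fetch(url)
  .then((res) => res.json())
  .then((data) => console.log(data))
  .catch(console.error);
```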
Hope that helps.
We are trying to call one of Google's APIs, analyze the data, and then display it for the user on our site. We realize we can make calls to the API through our server using PHP or Python, but because of Google's limits, we are looking for an option where the API call comes from the user instead of from our server. Is there a way for a web app to distribute the requests among the users instead of making all the calls from our server?
Thank you very much as this has been a tricky one to Google.
I highly doubt that this is possible. The API limit is per API key, and you can't provide every user with their own API key.
If the scenario you are trying to achieve may request the same data for multiple users, I highly suggest you save/cache the API response. This minimizes how often your server sends API requests to Google, since the response is saved on your server. You can then set an expiration on the saved/cached response so it is renewed every X days (a sketch follows the scenario below).
Scenario:
user 1: requests data for item 123 from the API.
server: sends the request to the API server and saves the response.
user 2: requests data for item 123.
server: returns the saved API response.
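A minimal sketch of that caching idea (Node/TypeScript; the Google endpoint, API key, and the one-day expiration are only example values):

```typescript
// In-memory cache: item id -> { data, fetchedAt }.
// A database or Redis would follow the same pattern across server processes.
const cache = new Map<string, { data: unknown; fetchedAt: number }>();
const TTL_MS = 24 * 60 * 60 * 1000; // renew after 1 day (example value)

async function getItem(id: string): Promise<unknown> {
  const hit = cache.get(id);
  if (hit && Date.now() - hit.fetchedAt < TTL_MS) {
    return hit.data; // users 2..n get the saved response, no Google call
  }
  // Only the first (or an expired) request actually hits the API.
  const res = await fetch(
    `https://www.googleapis.com/example/v1/items/${id}?key=YOUR_API_KEY` // hypothetical endpoint
  );
  const data = await res.json();
  cache.set(id, { data, fetchedAt: Date.now() });
  return data;
}
```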
We already have a system in place that uses RESTful APIs to send, let's say, SMS. All of our clients use our server to send their requests to the REST API, so we drop all connections except those from our server IP to handle authentication.
Now the policy has changed: we want to expose our APIs to the outside world. We also want to be able to push to the user under specific circumstances. Let's say I want to send a delivery report to the user when an SMS has been delivered, or notify the user when something scheduled for a specific time comes due.
How should we handle these notifications? Has anyone used the same or a similar approach?
Assuming you can reach your clients back via HTTP, the model for this is callbacks. When someone posts a scheduled job to your server, they should also provide a callback URI that your server can notify when the job is complete.
Sample below:
https://schedulingServer.com/runSchedule?callback=http://clientserver.com/reportStatusHere
So when the job is done, the call to your callback will look like
http://clientserver.com/reportStatusHere?jobId=12345&status=complete
Or, if your clients are mobile apps on Android, you can use Google push notifications.
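A sketch of the server side of that callback flow (Node/TypeScript; the job shape and error handling are simplified, and the URLs are the same example ones as above):

```typescript
// Remember the callback URI the client supplied when it posted the job.
interface Job {
  id: string;
  callback: string;
}

// Call the client's callback when the job finishes, e.g.
// http://clientserver.com/reportStatusHere?jobId=12345&status=complete
async function notifyClient(job: Job, status: string): Promise<void> {
  const url = `${job.callback}?jobId=${encodeURIComponent(job.id)}&status=${status}`;
  try {
    await fetch(url);
  } catch (err) {
    // The client may be unreachable; real code would retry with backoff.
    console.error(`callback to ${job.callback} failed`, err);
  }
}

// Usage: after the SMS is delivered or the scheduled time arrives.
notifyClient(
  { id: "12345", callback: "http://clientserver.com/reportStatusHere" },
  "complete"
);
```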
Our site uses Facebook Connect. When a new user signs up, we ask for permission to pull their interest data, their list of friends, and their friends' interests. Fetching this data used to be a very quick process (a couple of seconds). Over the last week or so, the time to fetch this data has increased to 10+ seconds. According to Facebook Insights, our site is not being throttled, and we didn't make any changes to our site.
Anyone else experiencing this issue with Facebook? Have any ideas for how to address it?
Thanks!
As of 1/26 at 7:55 PM EST, the live status page doesn't indicate any irregular activity.
Sometimes this occurs simply because a user has a lot of likes and interests. I would recommend making this operation asynchronous, following a flow something like this (see the sketch after this list):
User connects with your app
Get the access token and store it in a queue that a background process can access.
Get all the information you need immediately to make the app work.
Some time later
In a background process, grab an access token from the queue, parse it and handle it however you'd like.
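A rough sketch of that background flow (Node/TypeScript; the in-memory queue and the Graph API endpoint are only illustrative, and the exact fields depend on the permissions your app requested):

```typescript
// Tokens pushed here by the web request handler as soon as the user connects.
const tokenQueue: string[] = [];

// Signup handler only enqueues the token; the page never waits on the slow fetch.
function onUserConnected(accessToken: string): void {
  tokenQueue.push(accessToken);
}

// Background worker: drain the queue outside the request/response cycle.
setInterval(async () => {
  const token = tokenQueue.shift();
  if (!token) return;
  // Illustrative Graph API call; adjust the endpoint and fields to
  // whatever data your app actually has permission to read.
  const res = await fetch(
    `https://graph.facebook.com/me/interests?access_token=${encodeURIComponent(token)}`
  );
  const interests = await res.json();
  // ...store the interests in your own database here.
  console.log("fetched", interests);
}, 1000);
```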
A simpler, although less stable, option is to redirect the user to a page upon installation that makes an AJAX request telling your server to download the information from the Graph API. This keeps the response time low, but it does require the user to have JavaScript enabled and to stay on the destination page long enough for the request to be made.