Distributed Computing: Cache user-based messages for X minutes and then persist - Redis

I have a use case in which I receive notifications for users from other users.
Most of the time these notifications are consumed by other users within X minutes.
Once consumed, I don't need to save the notification data on my backend. Ordered delivery of notifications is important to users.
I'm looking for a caching-based solution, or a store, which can keep the notifications in memory for X minutes, persist them afterwards, and at the same time deliver ordered notifications per user.

Use distributed messaging to enable real-time messaging; the Application-Initiated Custom Events feature fits this use case.
By the way, TayzGrid is an open-source in-memory data grid, also known as a distributed cache in your case.
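For illustration, here is a minimal in-memory sketch of the pattern being asked for: per-user queues that preserve arrival order, hand pending items to consumers, and persist anything that outlives the TTL. With Redis you could get the same behavior from a per-user list (LPUSH/RPOP) or a sorted set scored by timestamp, combined with EXPIRE; the class and callback names below are made up for the sketch.

```python
import time
from collections import defaultdict, deque

class NotificationCache:
    """Keeps per-user notifications in arrival order for ttl_seconds.
    Anything not consumed in time is handed to a persist callback."""

    def __init__(self, ttl_seconds, persist):
        self.ttl = ttl_seconds
        self.persist = persist            # called with (user, payload) on expiry
        self.queues = defaultdict(deque)  # user -> deque of (timestamp, payload)

    def publish(self, user, payload):
        self.queues[user].append((time.monotonic(), payload))

    def consume(self, user):
        """Pop all pending notifications for a user, oldest first."""
        return [payload for _, payload in self.queues.pop(user, deque())]

    def evict_expired(self, now=None):
        """Persist anything older than ttl; call this periodically."""
        now = time.monotonic() if now is None else now
        for user, q in list(self.queues.items()):
            while q and now - q[0][0] > self.ttl:
                _, payload = q.popleft()
                self.persist(user, payload)
            if not q:
                del self.queues[user]
```

Because each user has their own FIFO queue, ordering is preserved per user, which is the requirement stated in the question.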

Notifications for inactive users

I’m implementing a solution that will notify users in a scenario very similar to a chat.
I’ll be using SignalR and Azure Notifications Hub for the job.
There are two scenarios that I need to address:
Notifying users that are currently connected and using my app - either web or mobile
Notifying users who are NOT currently using the app
I think SignalR will work just fine for the first scenario which is fairly easy.
My question is handling the second scenario which will require Azure Notifications Hub.
In a typical chat app, even though delivery isn't real-time, there's only a little delay before an inactive user is notified of a new message.
How do I “trigger” a process that will send a notification through Azure Notifications Hub?
I can think of two ways to handle this:
1. Figure out a way to keep a list of users who currently have an active connection to my app. These users can be notified through SignalR. So after each new message, I could trigger a notification process that will use Azure Notifications Hub for users who are NOT in the active users list, i.e. those who are NOT actively connected to my app.
2. Another approach would be to assume no one is connected. I could create an Azure Functions app that runs every minute to check on messages that are NOT opened. It would then compile a list of users who need to be notified and call an Azure Notifications Hub process to notify them.
I don’t want to reinvent the wheel and want to tap into the experience of those who’ve already implemented a solution like this.
I’m currently leaning towards the second approach. I’d appreciate your suggestions, especially on approaches that I haven’t thought of. Thanks!
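The first approach described above can be sketched in a few lines: track live connections, then fan each message out to the real-time channel for connected users and to push for everyone else. The `send_realtime` and `send_push` callables below are hypothetical adapters standing in for the SignalR and Azure Notification Hubs calls.

```python
class NotificationRouter:
    """Routes each message via the live connection when one exists,
    and via push notification otherwise."""

    def __init__(self, send_realtime, send_push):
        self.active = set()   # user IDs with a live connection
        self.send_realtime = send_realtime
        self.send_push = send_push

    def on_connect(self, user_id):
        self.active.add(user_id)

    def on_disconnect(self, user_id):
        self.active.discard(user_id)

    def notify(self, recipients, message):
        for user_id in recipients:
            if user_id in self.active:
                self.send_realtime(user_id, message)
            else:
                self.send_push(user_id, message)
```

The main operational cost of this approach is keeping the active set accurate across disconnects and multiple server instances (in practice you would back it with a shared store rather than process memory).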

Firebase Cloud Messaging sending messages one by one

I am developing an app where I want to send notifications to multiple users from my backend server. These notifications will contain 4 different contents; three of them will go to 3 different users, but the fourth one will go to more than 1000 users. The frequency of these messages will be 3 to 7 times a week. Is it OK to send this amount of messages one by one using the Cloud Messaging API? Or should I group the messages before sending?
I thought about creating a topic in Cloud Messaging, subscribing the users to it, and then sending a message, but I don't know if it is correct to keep subscribing/unsubscribing users just to send one message.
NOTE: I've never used any push notifications service before.
The Firebase Cloud Messaging infrastructure delivers billions of messages per day. The volume you're describing sounds well within reason for it.
Without knowing more about the use-case it's hard to say whether using a topic would be a better approach, so I recommend reading the documentation on topic messages to get a better understanding for it.
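If you do end up sending to the 1000+ device tokens directly rather than via a topic, batching them is the usual approach; the Firebase Admin SDK's multicast send accepts a limited number of tokens per request (500 at the time of writing, so check the current docs). A hedged sketch of the batching logic, where `send_batch` is a hypothetical wrapper around the actual FCM call:

```python
def chunk_tokens(tokens, batch_size=500):
    """Split a list of device tokens into fixed-size batches."""
    return [tokens[i:i + batch_size] for i in range(0, len(tokens), batch_size)]

def send_to_all(tokens, message, send_batch, batch_size=500):
    """Deliver `message` to every token, making one API call per batch
    instead of one call per user."""
    for batch in chunk_tokens(tokens, batch_size):
        send_batch(batch, message)
```

For 1000 recipients this turns roughly a thousand individual requests into a handful of batched ones, which is gentler on both your server and your request quota.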

How to throttle the amount of request to an API?

There is a CRM system which provides webhooks to notify about events happening in it (for instance, a user creates some entity or moves it to the next 'status'). On some events a webhook triggers my script, which creates a new entity in this CRM system using its API. The system's API has a rate limit of 7 requests per second. If this rule is violated, access to the API may be restricted for the account for some time.
The problem is, that if user changes 'status' of 1000 entities the webhook triggers my script 1000 times, so it calls the API 1000 times and that may violate the rate limit. Is there any way to temporarily 'store' all requests that came from the webhook and then launch the script no more than n-times per second?
The script is written in php and is located on my Apache server now, but later it may be put on client's server or somewhere else.
I've read about RabbitMQ and Kafka, but they seem like overkill for this task. Or maybe they're OK? I just don't have enough experience with these systems.
Any help would be appreciated.
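You don't necessarily need RabbitMQ or Kafka for a 7-requests-per-second cap; buffering the webhook work and draining it through a sliding-window limiter can be enough. A minimal sketch of the limiter (in Python rather than PHP, purely to illustrate the logic; in PHP the same idea works with a shared store such as Redis or a database holding the recent call timestamps):

```python
import time
from collections import deque

class RateLimiter:
    """Allow at most `max_calls` within any sliding `period` seconds.
    Queue incoming webhook jobs and call try_acquire() before each
    API request; on False, retry the job shortly afterwards."""

    def __init__(self, max_calls=7, period=1.0):
        self.max_calls = max_calls
        self.period = period
        self.calls = deque()  # timestamps of recent calls

    def try_acquire(self, now=None):
        now = time.monotonic() if now is None else now
        # Drop timestamps that have aged out of the window.
        while self.calls and now - self.calls[0] >= self.period:
            self.calls.popleft()
        if len(self.calls) < self.max_calls:
            self.calls.append(now)
            return True
        return False
```

The essential design point is that requests are never dropped, only delayed: a burst of 1000 webhook calls becomes a queue that drains at 7 per second.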

Strategy for notification checking

Is there a recommended strategy for checking of notifications within my AngularJS app?
By 'notification' I'm talking about message alerts that are to be displayed to a user when they're logged into the application.
My plan is to notify the user of unread notifications in the app's NavBar.
My app communicates with my RESTful API (written using Node.js, Express, and MongoDB), so I anticipate that new notifications will be written to a MongoDB collection with details of the user each notification is intended for.
What I'm unsure about is how the AngularJS application will check for notifications once a user is logged on. I could call my API for unread notifications every time the user navigates from one path to another, but that seems simplistic, and it wouldn't work if a new notification arrives while a user is viewing a page.
Another way would be some sort of timer system that checked, say, every 30 seconds. But this would result in unnecessary polling of my API when there aren't any new notifications for a user.
So, wondering if there is a recommended strategy. Thanks for your help.
Polling is a solution, but it is very inefficient. The solution to your problem is WebSockets: a technology that provides full-duplex, bidirectional communication between your clients and your server, so you can send messages from your server to your connected clients. Your server maintains a list of connected clients, and you just have to know which client ID to send a message to.
For your stack, the best solution I have come across is Socket.IO: http://socket.io
It also has cool features. For example, you can "observe" models, so when a model changes in your database (say, an update to a user profile), you can trigger an event and automagically send a message to your client. The client receives and handles the notification and does something, like putting a badge on your alerts icon.
Hope this is useful for you.
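To make the push model concrete, here is a toy stand-in for the pattern described above: the server keeps connected clients by ID and emits an event to a specific one, and the offline result could trigger a fallback such as storing the notification as unread. This is illustrative only, not the Socket.IO API.

```python
class SocketServer:
    """Minimal model of server-side push: a registry of connected
    clients, each with a callback that delivers events to it."""

    def __init__(self):
        self.clients = {}  # user_id -> callback(event, data)

    def connect(self, user_id, on_event):
        self.clients[user_id] = on_event

    def disconnect(self, user_id):
        self.clients.pop(user_id, None)

    def emit_to(self, user_id, event, data):
        """Push an event to one user; returns False if they are offline,
        in which case you might store the notification as unread instead."""
        callback = self.clients.get(user_id)
        if callback is None:
            return False
        callback(event, data)
        return True
```

Compared with polling every 30 seconds, the server only does work when a notification actually exists, and the client sees it immediately.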

Quickblox chat data replication

We are going to build a mobile app that will simplify communications between our customers and our workers. QuickBlox seems to be a good choice for us, but we have a question:
We would like to save all text messages that are sent between customers and workers in our database. What is the best way to implement this? Is it possible to automatically add a server bot to every conversation? Is it possible to poll QuickBlox and request all conversations/messages of all users for some date?
We have a Kafka (http://kafka.apache.org) integration with our Chat: each message also goes to a Kafka queue, so you can connect to Kafka, consume all messages on your side, and save them to your DB, for example.
This feature is available starting from the Enterprise plan: http://quickblox.com/plans/
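The consumer side of that setup can be sketched as follows, assuming hypothetical `consume_batch` and `save` adapters for your Kafka consumer and database; each raw message is decoded and persisted:

```python
import json

def archive_messages(consume_batch, save):
    """Drain one batch of chat messages from the queue and persist each.
    `consume_batch` yields raw JSON strings (e.g. from a Kafka consumer
    poll); `save` writes one decoded message to your database."""
    count = 0
    for raw in consume_batch():
        message = json.loads(raw)
        save(message)
        count += 1
    return count
```

Running this in a loop (or on a schedule) gives you a complete archive of all conversations without any bot joining each dialog.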