What's the most efficient way to build WhatsApp-like chat functionality with QuickBlox?

What's the most efficient way to build WhatsApp-like chat functionality with QuickBlox, in particular these behaviors:
receive a visible/audible notification of a new message from user B while in a chat with user A
update a counter with the number of unread messages
receive a visible/audible notification of a new message while not in a chat (e.g., the list of ongoing conversations)
while in a chat, receive messages from the opponent without showing remote notifications for those same messages
So far, I'm inclined towards a solution like this:
use chat rooms for 1:1 chat, for the history functionality
register each 1:1 chat room in Custom Objects at the time of creation, with meta information including a dateOfLastReceivedMessage field (1 extra call to QB for each user)
every time chatRoomDidReceiveMessage is called, update the dateOfLastReceivedMessage field (2 extra calls to QB; search the record and update it)
with every message sent, also send a push notification (1 extra call to QB)
every time didReceivedRemoteNotification is called, compare the date of the message (a) with dateOfLastReceivedMessage in Custom Objects (b) (1 extra call to QB):
if a > b -> the notification is a new/unread message: increment the counter of new messages and show a visual clue/play a sound.
if a <= b -> the notification is not a new/unread message: do nothing.
For someone familiar with QuickBlox, does this look correct, or is there a better way to achieve the same behavior?

1) use chat rooms for 1:1 chat for the history functionality
2) register each 1:1 chat room in Custom Objects at the time of creation
with meta information including a dateOfLastReceivedMessage field (1
extra call to QB for each user)
3) every time chatRoomDidReceiveMessage is called, update the dateOfLastReceivedMessage field (2 extra calls to QB; search the record and update it)
You can use 1-1 chat (not a chat room) and still have chat history; QuickBlox has released a plugin for this: http://quickblox.com/developers/Chat/1:1_Chat_history
All chat history will be stored in the Custom Objects module.
You will be able to use the powerful search API to request chat history.
with every message sent, also send a push notification (1 extra call
to QB)
Correct
every time didReceivedRemoteNotification is called, compare the date of the message (a) with dateOfLastReceivedMessage in Custom Objects (b) (1 extra call to QB): if a > b -> the notification is a new/unread message: increment the counter of new messages and show a visual clue/play a sound. if a <= b -> the notification is not a new/unread message: do nothing.
Yes, it should work this way.
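A minimal sketch of that check, in TypeScript-style pseudocode (the actual QuickBlox delegate callbacks are Objective-C), assuming the push payload carries the message date and using hypothetical helpers for the Custom Objects lookup and the UI side:

    let unreadCount = 0;

    // Hypothetical helpers standing in for the Custom Objects query and the UI code.
    declare function fetchDateOfLastReceivedMessage(chatRoomId: string): Promise<number>;
    declare function showBadgeAndPlaySound(): void;

    // Called from the app's remote-notification handler (didReceivedRemoteNotification).
    async function onRemoteNotification(push: { chatRoomId: string; messageDate: number }) {
      const lastReceived = await fetchDateOfLastReceivedMessage(push.chatRoomId); // (b)
      if (push.messageDate > lastReceived) {
        // a > b: the message was not yet delivered via chatRoomDidReceiveMessage
        unreadCount += 1;
        showBadgeAndPlaySound();
      }
      // a <= b: the message was already received in the open chat, do nothing
    }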


Cumulocity - managedObject Event - detect device first connection

Looking to understand whether there is a bulletproof event from the managedObject side of c8y where we know the device has just connected.
I have a microservice that listens for events in real time and I want to trigger a process once we know a device has connected to send its payload.
We have used:
"c8y_Connection": {"status":"CONNECTED"}
We had the microservice log all managedObject events to Slack, and for three days we saw the "status":"CONNECTED" value in the payload of our demo devices at reporting times.
But after three days we no longer see this "CONNECTED" state (all payloads show "DISCONNECTED").
What I am trying to achieve from the inventoryObject event is to understand when a device has connected and sent its payload, so I know when data has arrived. I then go get the data and process it externally. This is post-registration and part of the daily data-send cycle for my type of device.
What would be the best way to understand when a device has sent payload in a microservice? I want to notify an external application with either “data is arriving for id 35213” or even better, “data has arrived for device 35213, and here’s the $payload”.
Just some general information ahead:
The c8y_Connection fragment showing CONNECTED indicates an active MQTT connection or an active long-polling connection, and it is only evaluated once every minute.
So if the client just sends data and immediately disconnects afterwards, this might not be picked up.
If you want to see that the device has sent something to Cumulocity, the c8y_Availability fragment is a better fit, as it holds the timestamp of when the device last sent something.
{ "lastMessage": "2022-10-11T14:49:50.201+09:00", "status": "UNAVAILABLE"}
Here too, the evaluation (or rather the update to the database) only happens every minute.
Both c8y_Availability and c8y_Connection, however, are only generated if availability monitoring has been activated for the device (by defining a required interval for the device).
So if you have activated availability monitoring and you see a "lastMessage", you can reliably say that the device has already sent something to Cumulocity.
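If it helps, here is a minimal sketch (TypeScript, Node 18+ for fetch) of how a microservice could read that fragment from the inventory API; the tenant URL and credentials are placeholders, and it assumes availability monitoring is enabled so c8y_Availability is present on the managed object:

    const baseUrl = 'https://<tenant>.cumulocity.com';           // hypothetical tenant URL
    const auth = Buffer.from('tenant/user:password').toString('base64');

    // Returns the timestamp of the last message the device sent, if known.
    async function getLastMessage(deviceId: string): Promise<string | undefined> {
      const res = await fetch(`${baseUrl}/inventory/managedObjects/${deviceId}`, {
        headers: { Authorization: `Basic ${auth}`, Accept: 'application/json' },
      });
      const managedObject = await res.json();
      return managedObject.c8y_Availability?.lastMessage;        // e.g. "2022-10-11T14:49:50.201+09:00"
    }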

Retrieving messages in chat application

I am working on an application for chatting. I am wondering how to design the API for getting messages from a server.
When the application loads the chat window, I display the last 20 messages from the server using the endpoint:
/messages?user1={user1Id}&user2={user2Id}&page=0
Secondly, I allow the user to load and display previous messages when they scroll to the top, using the same endpoint but with a different page (1):
/messages?user1={user1Id}&user2={user2Id}&page=1
But this design doesn't work correctly when users start to send messages to each other. The reason is that the endpoint returns the messages in descending order, so invoking it gives different result sets before and after a message is sent/received.
My goal is to always get the 20 previous messages when a user scrolls through the conversation history.
How would you implement this in a clean way (including the REST API design)? I can imagine some solutions, but they seem dirty to me.
REST does not scale well for real-time applications; it takes too many resources to open an HTTP connection again and again. Better to use WebSockets for it.
As for the problem above: if you start pagination from the latest message rather than the first, it's no surprise that the pages change. To keep the pages stable you need to send the message id you started with. For example: GET ...&count=20&latest=1, after which you get a list of 20 messages; you take the id of the latest one and do the pagination with GET ...&count=20&basePoint={messageId}&page=0 to always get exactly the same page no matter how many new messages arrived. After that you continue with GET ...&count=20&basePoint={messageId}&page=1 and so on. You will have negative pages, GET ...&count=20&basePoint={messageId}&page=-1, if there are newer messages. Though I don't know any chat application that uses pagination this way.
What I would do instead is GET ...&count=20&latest=1, take the 20th message and do GET ...&count=20&before={messageId_20th}; then take the last message of that list (the 40th) and do GET ...&count=20&before={messageId_40th}, and so on. To get new messages you can do GET ...&count=20&latest=1 again, or GET ...&count=20&after={messageId_1st}.
None of the above is really good if you use a caching mechanism in the client. I would add something like key frames in videos: key messages. For example, every 20th message could be a key message. I would do caching based on the key message ids so the responses are cacheable; something like GET ...&count=20&latest=1 would return messages, and one in the middle of the list would have a property of keyMessage=true. I would start the next request from the last key message, something like GET ...&count=20&before={messageId_lastKey}, so the response is cacheable.
Another way of doing this is starting pagination from the very first message. So when you do GET ...&count=20&latest=1 it returns the index of the message, something like "1234th message". You can do caching based on every 20th message just like in the key-message solution, but instead of message ids you can do GET ...&count=20&to=1220 and merge the pages on the client. Or if you really want to save on data: GET ...&from=1201&to=1220&before=1215 or GET ...&from=1201&to=1220&last=1214 or GET ...&from=1201&to=1214, etc. And continue normal pagination with GET ...&count=20&to=1200 or GET ...&from=1181&to=1200.
What is good about the fixed-pages approach above is that you can use range headers too. For example GET .../latest would return the last 20 messages with the header Content-Range: messages 1215-1234/1234. After that you can do GET .../ Range: messages=1201-1214 and GET .../ Range: messages=1181-1200 and so on. When the total message count is updated in the response, Content-Range: messages 1181-1200/1245, you can automagically download the new messages with GET .../ Range: messages=1235-1245, or with GET .../ Range: messages=1235- if you expect the 1245 to change meanwhile. You can do the same thing with the URI too, but if you have a standard solution like range headers it is better to use it instead of reinventing the wheel.
As for the &user1={user1Id}&user2={user2Id} part, I would order it alphabetically, so user1 always comes earlier in alphabetical order than user2, e.g. &user1=B&user2=A -> &user1=A&user2=B. That way it is properly cacheable too, not necessarily on the client, but you can add a server-side cache so that if the two participating users request the same conversation recently it won't be queried from the database again. Another way of doing this is adding a conversation id, which can be randomly unique or generated from the two user ids, but it must always be the same for the two users, e.g. /conversations/A+B or /conversations/x23542asda. I would do the latter, because if you start supporting conversations with multiple users it is better to have a dedicated conversation identifier. Another thing you can support is having multiple topic-related conversations between the two users, so you won't need /conversations/A+B/{topic}; just use a unique id and search for conversations before creating a new one, so /conversations?participants[]=A&participants[]=B&topic={topic} will give you either an empty list or a list with a single conversation. This could also be used to list conversations of a single user, /conversations?participants[]=A, or conversations of a single user on a certain topic, /conversations?participants[]=A&topic={topic}.
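As a concrete illustration of the before={messageId} variant, here is a minimal server-side sketch (TypeScript with Express and node-postgres, both assumptions, against a hypothetical messages table with a monotonically increasing id):

    import express from 'express';
    import { Pool } from 'pg';

    const app = express();
    const db = new Pool();                                     // connection settings omitted

    app.get('/conversations/:conversationId/messages', async (req, res) => {
      const count = Math.min(Number(req.query.count ?? 20), 100);
      const before = req.query.before as string | undefined;  // message id to paginate from
      const params: any[] = [req.params.conversationId];
      let sql = 'SELECT id, sender_id, body, sent_at FROM messages WHERE conversation_id = $1';
      if (before) {
        params.push(before);
        sql += ' AND id < $2';                                 // keyset: strictly older than the anchor
      }
      sql += ` ORDER BY id DESC LIMIT ${count}`;               // newest-first page of older messages
      const { rows } = await db.query(sql, params);
      res.json(rows);
    });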
Generally speaking, for "system design" questions, there isn't any one correct answer. Multiple approaches can be considered, based on their pros and cons. You've mentioned you want a clean way to retrieve messages, so your expectation might be different from mine.
Here is a simple approach/idea to get you started:
Have a field in your database that can be used to order messages. It could be a timestamp, a message_id, or something else that you want. Let's call this m_id for now.
On your client, have a field that tracks the m_id of the earliest message currently available locally on the client. This key (let's call it earliest_m_id) will be used to query the database, because you can simply fetch x number of messages before the earliest_m_id.
This would solve your issue of incorrect messages being fetched. Using paging will not work, since pages will continuously change with the exchange of messages.
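On the client, the earliest_m_id idea can be as simple as the following sketch (TypeScript; the endpoint name and the before_m_id parameter are assumptions, not part of the original API):

    let earliestMId: string | null = null;    // m_id of the oldest message loaded so far

    async function loadOlderMessages(user1Id: string, user2Id: string, count = 20) {
      const params = new URLSearchParams({ user1: user1Id, user2: user2Id, count: String(count) });
      if (earliestMId) params.set('before_m_id', earliestMId);
      const res = await fetch(`/messages?${params}`);
      const messages: Array<{ m_id: string; text: string }> = await res.json();
      if (messages.length > 0) {
        // The server returns messages newest-first, so the last element is the oldest.
        earliestMId = messages[messages.length - 1].m_id;
      }
      return messages;
    }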

Scalable and best way to get notifications not read count and mentions of a chat

Imagine an application like WhatsApp that, for each chat, has a count of mentions and unread messages:
I want to implement a scalable system to handle the notification counts of an app. Here is what I've thought about as possible solutions and their problems:
1) Create a counter for each user in each group collection and increase it by 1 for each new message:
➜ Problem: if I have chats with 500, 1000, 10000 users, I will have to do 500, 1000, 10000 field updates.
➜ Test: I've created a new collection with 50M documents. Update time for 6000 users = 0.15 seconds. Update time for 100000 users = 14.2 seconds. It's not scalable.
Notifications Model: (compound index: roomId: 1, channelId: 1, userId: 1)
{
roomId: string,
channelId: string,
userId: string,
unread_messages: int,
unread_mentions: int,
last_read: date
}
2) Save the last message read by each user and, when doing the initial data GET, count for each chat from the last read message to the latest, with a limit.
➜ Problem: if you have 200 chats and you limit the number of notifications to 100, and it has been a while since the user last logged into the application, you will have to count 100 * 200 rooms, and the "count" operation is quite expensive for databases.
➜ Test: I've counted 100 messages per chat across 200 chats = 8.4 seconds, with messages indexed by id and timestamp. That's a lot of time for client login.
3) Set up a pub/sub using, for example, ActiveMQ, RabbitMQ or Kafka, and create a queue for each chat.
➜ Problem: you duplicate messages in the database and in queues/topics. In addition, since the queues are shared, you would have to query up to where user X last read, and when you connect as a subscriber those messages are consumed and are no longer available to other consumers.
In Kafka, if each topic is a chat, I can't count pending notifications without fetching and consuming all pending messages. So if I consume these messages and don't enter the chat, there will be no notifications the next time I log in.
Can you think of any more options, or are any of the ones I mentioned scalable?
Thank you very much in advance.
In order to solve this, you can keep the count of written messages in every chat and the count of read messages for every user in every chat. Essentially, the difference between these numbers is the number of unread messages of a user for a specific chat.
Let's say there are 1000 online users, all in 100 chat rooms, with 10 users active and 990 inactive in each room. Each active user suddenly writes one message in the chat. This will produce 1000 messages and only 1000 count updates (10 per chat). Users who are inactive will only receive the new count for each chat, while their own count of read messages stays the same. For those active in a chat there is nothing to count, since the number of their read messages will equal the count of the chat.
If a user is offline and comes online in one chat, he will get 10 messages and one update for the number of read messages. If he is enrolled in all 100 chats, he will get 1000 messages and 100 updates if he reads them all.
If a user is online but not active in any chat (app in the background), he will get the new count for every chat that is written into. Since there is a read-message count for every chat in his profile, the client will have to do the math and display the difference.
This can be further optimised by letting the client do some work and update the backend with the number of read messages. This basically offloads the backend for half of the operations in the example above, so the effective number of operations done in the backend will be 1000.
Of course, there can be further optimisations, like bidirectional asynchronous updates that are sent at controlled time intervals or message counts. This allows both the client and the backend to send bulk notifications and control the use of resources.
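A minimal sketch of that bookkeeping (TypeScript, with a hypothetical in-process structure standing in for whatever store you use):

    interface ChatCounters {
      written: number;                          // total messages written in this chat
      readBy: Map<string, number>;              // messages read so far, per user
    }

    function onMessageWritten(chat: ChatCounters, authorId: string): void {
      chat.written += 1;                        // one counter update per chat, not per member
      chat.readBy.set(authorId, chat.written);  // the author has implicitly read everything
    }

    function onChatRead(chat: ChatCounters, userId: string): void {
      chat.readBy.set(userId, chat.written);    // client reports it has caught up
    }

    function unreadFor(chat: ChatCounters, userId: string): number {
      return chat.written - (chat.readBy.get(userId) ?? 0);
    }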
Given the context you provided, I think solution 1) is perfectly viable, but decouple the counter update from the visualization and keep this info in memory.
Now imagine the following process:
application start
during start, a separate thread runs and computes the counters for the first time (14.2 seconds is acceptable at start)
this information is loaded into some kind of in-memory database (for example Redis) for quick access -> this is your "user in-memory notification counter cache", a simple map (uid, [c]) where uid is the userId and [c] is an array of counters
you can limit this map for each user, for example to at most 255 chats/groups; otherwise your application needs to compute and update/extend the map (like the limit you mentioned)
periodically you can "compact" this map and purge unused counters from memory (each night, for example, or every 2 hours, depending on your requirements) to keep memory in check and avoid it exploding
user1 accesses the application for the first time
the application fires a request and gets the unread message notifications from the cache (in memory, so really quick)
user2 sends a message to user1; now some scenarios:
user1 is not online (app closed), so a "slow" refresh of user1's (and only user1's) unread notification counter can be triggered to update the in-memory cache (and a few seconds are acceptable)
user1 is online, the chat is open and the message is delivered; in this case the counter cache doesn't require a refresh
user1 is online but not in that specific chat, for example in the chat list; I suppose some kind of trigger can be fired asking for an updated/refreshed list of notification messages for the user, BUT only for the chat with user2, not for all of them --> I think this is the key: you update/refresh it both in the app and in the central in-memory cache
I think this will solve your problem and give you more speed, but it requires that:
the application knows the status of each user (online/offline) and stores it for quick access (another map in the in-memory database, maybe?)
the local user app knows when a new message in a specific chat is available
I suppose these two requirements are already part of your system, for a chat-messaging platform of this kind.
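A minimal sketch of that in-memory counter cache (TypeScript with ioredis as an assumed choice; one Redis hash per user, keyed by chat id):

    import Redis from 'ioredis';

    const redis = new Redis();                                 // connection settings omitted

    // Called when a message arrives for a recipient who is not inside that chat.
    async function incrementUnread(userId: string, chatId: string): Promise<void> {
      await redis.hincrby(`unread:${userId}`, chatId, 1);
    }

    // Called when the client reports the chat has been opened/read.
    async function clearUnread(userId: string, chatId: string): Promise<void> {
      await redis.hdel(`unread:${userId}`, chatId);
    }

    // Called on login/app start to populate the badge counters quickly.
    async function getUnreadCounters(userId: string): Promise<Record<string, string>> {
      return redis.hgetall(`unread:${userId}`);
    }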

Prevent device notification client side with React Native Firebase when a FCM notification arrives (based on user preferences)

I'd like to know if there is any way to prevent the device from displaying a notification when an FCM message with a notification payload arrives, in case the user has decided to mute notifications and only have them arrive silently (making them act as data-only messages, even though they are notification messages). I imagine something like calling a method in the handler/callback to prevent the device notification and doing extra processing afterwards.
I know I can use data-only messages, but that approach would be harder since I would have to use multiple topics or token lists and somewhat more backend logic to achieve it.
Thanks
If the message contains a notification property, it is automatically handled by the OS when the app is not active. There is nothing you can do to prevent that.
As you said: if the message only contains a data property, it is always passed to your code for handling, and you can decide what to do with it. So that's the way to go if you want full client-side control over whether a notification is displayed for a message.
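For completeness, a minimal sketch of the data-only route (TypeScript with @react-native-firebase/messaging; notifee and the notificationsMuted preference key are assumptions used for illustration):

    import messaging from '@react-native-firebase/messaging';
    import notifee from '@notifee/react-native';
    import AsyncStorage from '@react-native-async-storage/async-storage';

    async function onDataMessage(remoteMessage: any): Promise<void> {
      const muted = (await AsyncStorage.getItem('notificationsMuted')) === 'true';
      // Always process the payload here (update local chat state, unread counters, etc.)
      if (muted) return;                                       // silent: no visible notification
      const channelId = await notifee.createChannel({ id: 'chat', name: 'Chat' });
      await notifee.displayNotification({
        title: remoteMessage.data?.title ?? 'New message',
        body: remoteMessage.data?.body ?? '',
        android: { channelId },
      });
    }

    messaging().onMessage(onDataMessage);                      // app in the foreground
    messaging().setBackgroundMessageHandler(onDataMessage);    // app in the background/quit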

Apple Wallet Passes : Update Calls trigger

There are already numerous articles on how to implement the update service. However, I have a serious question below:
When does a pass technically get updated? What is the trigger for updating?
When will Passbook's update service be called?
when the update push notification is clicked by the user
when the pass is opened by the user
silently, in the background
when automatic updates are off and the user opens a pass
Please help
Pass updates can be initiated in one of two ways:
The user does a pull-to-refresh on the pass.
You send a push notification to the device.
In the case of number 2, the device responds to the push notification by reaching out to the service specified by the pass's webServiceURL.
It sends the device identifier and pass type identifier. The web service is responsible for determining which passes need updating, and it returns a set of serial numbers. It does this by looking at the update tag and applying whatever mechanism makes sense for the pass.
The device will then request an updated pass for each of these serial numbers.
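A minimal sketch of the two endpoints involved in that exchange (TypeScript with Express as an assumed framework; the lookup and pass-building helpers are hypothetical, while the URL paths follow Apple's PassKit web service specification):

    import express from 'express';

    // Hypothetical helpers: which passes changed since a tag, and how to build a .pkpass.
    declare function findUpdatedSerialNumbers(deviceId: string, passTypeId: string, since?: string):
      Promise<{ tag: string; serialNumbers: string[] }>;
    declare function buildPass(passTypeId: string, serialNumber: string): Promise<Buffer>;

    const app = express();

    // 1) After the push, the device asks which of its registered passes have changed.
    app.get('/v1/devices/:deviceLibraryIdentifier/registrations/:passTypeIdentifier', async (req, res) => {
      const since = req.query.passesUpdatedSince as string | undefined;
      const updated = await findUpdatedSerialNumbers(
        req.params.deviceLibraryIdentifier, req.params.passTypeIdentifier, since);
      if (updated.serialNumbers.length === 0) return res.sendStatus(204);   // nothing changed
      res.json({ lastUpdated: updated.tag, serialNumbers: updated.serialNumbers });
    });

    // 2) The device then requests the latest pass for each returned serial number.
    app.get('/v1/passes/:passTypeIdentifier/:serialNumber', async (req, res) => {
      const pkpass = await buildPass(req.params.passTypeIdentifier, req.params.serialNumber);
      res.type('application/vnd.apple.pkpass').send(pkpass);
    });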
The process is described in far more detail in Apple's Documentation:
https://developer.apple.com/library/content/documentation/UserExperience/Conceptual/PassKit_PG/Updating.html#//apple_ref/doc/uid/TP40012195-CH5-SW1