Redis Cache Pub Sub delete after receive - redis

Is there a way to delete a message from a subscriber, once it is received by that subscriber so as to prevent it from being read by another subscriber to the same channel?
Note: this behavior is supported on Azure Service Bus.

No, you cannot do that with Redis Pub/Sub.
However, you can achieve the goal with Redis Streams. You can create a consumer group with the XGROUP CREATE command; each message in a stream is delivered to only one consumer in the group. See the Redis Streams documentation for details.
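A minimal redis-cli sketch of the consumer-group approach (the stream name `mystream`, group `mygroup`, and consumer name `consumer-1` are placeholders, not from the question):

```shell
# Create a consumer group on the stream, starting from new messages ($);
# MKSTREAM creates the stream if it does not exist yet (Redis 5+):
redis-cli XGROUP CREATE mystream mygroup '$' MKSTREAM

# Producer appends a message (* lets Redis assign the entry id):
redis-cli XADD mystream '*' payload "hello"

# Each consumer in the group receives distinct, never-before-delivered
# entries (the special '>' id):
redis-cli XREADGROUP GROUP mygroup consumer-1 COUNT 1 STREAMS mystream '>'

# Acknowledge after processing so the entry is not redelivered on recovery
# (replace <message-id> with the id returned by XREADGROUP):
redis-cli XACK mystream mygroup <message-id>
```

Unlike Pub/Sub, once one consumer in the group reads and acknowledges an entry, the other consumers in that group will never see it.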

Related

Redis Subscribing to a channel ( key space notifications should be enabled ??)

I am working on a Node.js app that connects to a Redis server and subscribes to a channel to receive messages.
There is some confusion: do we really need to enable "keyspace notifications" in the Redis config for the client to receive events?
I tried the same scenario with redis-cli: keyspace notifications are not enabled, yet after subscribing to a channel with a pattern, whenever I publish a message from another client, the subscribed client receives the event.
Are keyspace notifications mandatory? My proof of concept suggests otherwise.
Does anyone know the right approach here: is subscribing to a channel sufficient to receive messages, with nothing to do with keyspace notifications?
From Redis Keyspace Notifications
Keyspace notifications allow clients to subscribe to Pub/Sub channels in order to receive events affecting the Redis data set in some way.
Examples of events that can be received are:
All the commands affecting a given key.
All the keys receiving an LPUSH operation.
All the keys expiring in the database 0.
Events are delivered using the normal Pub/Sub layer of Redis, so clients implementing Pub/Sub are able to use this feature without modifications.
So, if you just need Pub/Sub, no extra configuration for keyspace notifications is required.
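A quick redis-cli sketch of the distinction (the channel name `news` is an arbitrary placeholder):

```shell
# Plain Pub/Sub works with no special configuration.
# Terminal 1 - subscribe to a channel:
redis-cli SUBSCRIBE news

# Terminal 2 - publish; terminal 1 receives the message immediately:
redis-cli PUBLISH news "hello"

# Keyspace notifications are only needed when you want events about the
# data set itself, e.g. key changes or expirations. They must be enabled
# explicitly (K = keyspace events, E = keyevent events, A = all classes):
redis-cli CONFIG SET notify-keyspace-events KEA
redis-cli PSUBSCRIBE '__keyevent@0__:*'
```

The first pair of commands is ordinary Pub/Sub; only the second pair involves keyspace notifications.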

Redis Pub/Sub - Publisher also a subscriber?

I'm new to Redis, and I've been messing around with Pub/Sub. Due to dependency factors, I would like the publisher to also be a subscriber of a channel, such that when the publisher sends a message through the channel, it also receives the message. Is this possible?
No, it is not possible with Pub/Sub because there is no persistence. When the publisher publishes a message to a channel, only the clients currently subscribed to that channel receive it; no message is saved. Since your publisher is not connected as a subscriber, it cannot receive what it published earlier. Even if a subscriber loses its connection and reconnects, it will not receive the messages published while it was disconnected.
There are some workarounds: whenever you publish a message, you can also push it to a sorted set or list and read it back later.
Another way might be to use keyspace notifications, but I haven't tried it. You may check the details here
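A redis-cli sketch of the first workaround (the channel `chat` and list key `chat:history` are placeholder names):

```shell
# Publish for live subscribers, and keep a copy in a list so the publisher
# (or any late-joining client) can read it back afterwards:
redis-cli PUBLISH chat "hello"
redis-cli LPUSH chat:history "hello"

# Later, read the stored copies back (newest first with LPUSH):
redis-cli LRANGE chat:history 0 -1
```

The list is what provides persistence here; Pub/Sub alone delivers only to clients subscribed at the moment of publication.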

Is there a way to publish a message as record is added to a key in redis?

Here goes my use case:
We use a Redis appender to write our log messages to Redis. These messages carry MDC data (a trace ID) to track individual requests. We want another application to subscribe to a trace ID and receive all the messages logged for it, as they are inserted. Can we have some sort of trigger that publishes the message as it is being added?
The appender does not give us the ability to publish to a channel, and we don't want to create a custom publisher for this use case. I am sure this use case is not unique, and I am hoping for a recommendation; basically I am looking for something like the ON INSERT triggers that RDBMSs have.
Redis Keyspace Notifications sound like they might fit your use case: https://redis.io/topics/notifications
You can subscribe to a variety of notification types and I would guess that one of those would fit your need.
Consider using the Stream data type (introduced in Redis 5) for storing your log, and having consumers read that stream for incoming updates.

Redis PubSub message order in cluster is not guaranteed?

Is the message order of pubsub messages in a redis cluster in any way guaranteed?
We are using a Redis cluster (v3.2.8) with 5 master nodes, each with one connected slave, and we noticed that we sometimes receive Pub/Sub messages in the wrong order when publishing to one specific master on one specific channel while being subscribed to slave nodes for that channel.
I could not find any statements related to pubsub message order in cluster on redis.io nor on the redis-github repo.
First of all, if you are using PUBLISH, it is blocking and returns only after the messages have been delivered, so yes, the order is guaranteed.
There are 2 problematic cases that I see: Pipelining and Client disconnection.
Pipelining
From the documentation:
While the client sends commands using pipelining, the server will be forced to queue the replies, using memory.
So, if a queue is used, the order should be guaranteed.
Client disconnection
I can't find it in the documentation, but if the client is not connected or subscribed when the message is published, it won't receive anything. So in this case, there is no guarantee.
If you need to persist messages, you should use a list instead.
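A minimal redis-cli sketch of the list alternative (the key `jobs` is a placeholder):

```shell
# A list survives subscriber disconnects, unlike Pub/Sub: items pushed while
# no consumer is connected simply wait in the list.
redis-cli LPUSH jobs "msg-1"
redis-cli LPUSH jobs "msg-2"

# Consumer blocks until an item is available and pops in FIFO order
# (LPUSH + BRPOP; timeout 0 means block indefinitely):
redis-cli BRPOP jobs 0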

Redis Channel configuration for Publish subscribe

How long will messages published to a Redis channel stay there? Also, is there a way to configure the maximum lifetime of a message per channel? Is there a way to control the channel size, or does the channel keep storing messages as long as the Redis server has free memory?
Redis Pub/Sub doesn't persist published messages at all: a message is delivered to the clients subscribed at that moment and then discarded. What you are looking for sounds more like a message queue, which can be implemented using a combination of Pub/Sub and lists. For more information, see the pattern section of the RPOPLPUSH command documentation.
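A redis-cli sketch of the reliable-queue pattern described in the RPOPLPUSH documentation (the keys `queue` and `processing` are placeholder names):

```shell
# Producer pushes work onto the main queue:
redis-cli LPUSH queue "task-1"

# Consumer atomically moves the item to a per-worker processing list,
# so it is not lost if the worker crashes mid-task:
redis-cli RPOPLPUSH queue processing

# After the work completes successfully, remove it from the processing list:
redis-cli LREM processing 1 "task-1"
```

Items left in the processing list after a crash can be re-queued by a recovery job, which is the guarantee plain Pub/Sub cannot give.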