Is it possible to get a notification when someone subscribes to a certain topic in Redis? For example, A can subscribe to B using the SUBSCRIBE B command, but what if B does not want A to subscribe to it? Is there a way for B to get a notification when A subscribes to it?
I'm working on a library for interacting with the Discord API. My current setup is:
Gateways, each handling x amount of shards, so that I can spin up as many of these as I like to scale well. These gateways publish the events they receive to a Redis message queue.
A client, which subscribes to the message queue, and responds to events received.
However, there are some scenarios - working with message components - where I want a specific client to handle events related to that message. This client will then use the node.js event emitter to emit an event in itself which is then received by a 'collector' in my code.
Does anyone have any recommendations on how I might stop other clients from picking up the event from the message queue, so that only this specific client picks it up? Is it possible for a subscriber to 'read' an event before it accepts it? That way, all clients could read an event to see whether it matches a list of events they are waiting for.
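A rough sketch of that last idea, using redis-py: every client receives every event, but only acts on the ones whose identifier it is waiting for. The "custom_id" field, the "events" channel name, and the handler are made-up names for illustration, not part of any Discord library.

```python
# Sketch only: every client subscribes, reads each event, and handles it
# only if it matches something in its own waiting list.
import json
import redis

r = redis.Redis(decode_responses=True)
p = r.pubsub()
p.subscribe("events")  # hypothetical channel the gateways publish to

# Component IDs this particular client is waiting for.
waiting_for = {"button:confirm-123", "select:role-456"}

def handle_component_event(event: dict) -> None:
    print("handling component interaction:", event["custom_id"])

for message in p.listen():
    if message["type"] != "message":
        continue
    event = json.loads(message["data"])
    # Every client sees every event; each one only acts on those it expects.
    if event.get("custom_id") in waiting_for:
        handle_component_event(event)
```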
I need help: how can I make sure that the events of one queue are processed by both of two listeners (listener 1 and listener 2)? Is that possible? Example: my data is stored in two databases in different services; when I create a queue event, I need both services to delete information in their databases.
No; RabbitMQ doesn't work that way; each consumer needs its own queue. Bind two queues to the exchange and make sure both get all messages, either by binding with the same routing key or by using a fanout exchange.
See the tutorials https://www.rabbitmq.com/getstarted.html
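If it helps, here is a minimal sketch of that setup using pika in Python. The exchange name "events" and the queue names are made up for illustration, and it assumes a broker running on localhost.

```python
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()

# One fanout exchange; every queue bound to it receives a copy of each message.
channel.exchange_declare(exchange="events", exchange_type="fanout")

# Each consumer (service) gets its own queue bound to the exchange.
for queue in ("service-a-queue", "service-b-queue"):
    channel.queue_declare(queue=queue, durable=True)
    channel.queue_bind(exchange="events", queue=queue)

# Publishing once delivers the message to both queues.
channel.basic_publish(exchange="events", routing_key="", body=b"delete record 42")
connection.close()
```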
Now I want to use Redis to implement a pub/sub system. I have 1000+ RSS channels in my system, and a Scrapy RSS channel update app written in Python 3. There are 10000+ users who subscribe to the channels they like.
When an article arrives on the Scrapy side, I just want to send the article to its channel in Redis. Right now I store each user's list of subscribed articles in Redis like this:
cruise:user:1234:subscrible 1,2,3(article id list data structure.....)
Now I want the article id to be pushed automatically onto the head of each subscribed user's list when an article is sent to a channel. Is it possible to implement this in Redis?
Currently I only know how to consume the article on the client side, find the users who subscribe to the channel, and use LPUSH to insert the article id into Redis. But the problem is:
when a channel has 10,000,000 subscribers, I would have to invoke the LPUSH command 10,000,000+ times. Is there a better solution? Is it possible to use Redis to maintain the pub/sub relationship and have Redis itself append the article id?
I would like something like this model, but where the client is a list in Redis: I want the channel to push the article id onto the head of the client's list.
I think you do not need to bind users to articles. Just bind users to channels, then bind channels to articles.
When a new article arrives, first find the channel, then push the article id onto that channel's article list.
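A rough sketch of that layout with redis-py; the key names (user:&lt;id&gt;:channels and channel:&lt;id&gt;:articles) are purely illustrative:

```python
import redis

r = redis.Redis(decode_responses=True)

def subscribe_user(user_id: int, channel_id: int) -> None:
    # Bind the user to the channel, not to individual articles.
    r.sadd(f"user:{user_id}:channels", channel_id)

def publish_article(channel_id: int, article_id: int) -> None:
    # One LPUSH per channel, regardless of how many users follow it.
    r.lpush(f"channel:{channel_id}:articles", article_id)

def read_feed(user_id: int, per_channel: int = 20) -> list:
    # A reader merges the article lists of the channels it follows.
    articles = []
    for channel_id in r.smembers(f"user:{user_id}:channels"):
        articles.extend(r.lrange(f"channel:{channel_id}:articles", 0, per_channel - 1))
    return articles
```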
I have a use case in which I want to enable notification only for a certain set of keys, so that when those keys expire I can get a notification from redis.
I have followed this answer to implement this.
I have set parameter notify-keyspace-events to "Ex"
To accomplish this I am adding the keys that I want notifications for to DB-0 and the other keys to DB-1, but I am receiving notifications for both DBs. Is there any way to get notifications from a particular DB only?
According to the Redis documentation:
"Redis can notify Pub/Sub clients about events happening in the key space.
This feature is documented at http://redis.io/topics/notifications
For instance if keyspace events notification is enabled, and a client
performs a DEL operation on key "foo" stored in the Database 0, two
messages will be published via Pub/Sub:
PUBLISH __keyspace@0__:foo del
PUBLISH __keyevent@0__:del foo
"
But I am receiving notification from both DB-0 and DB-1.
PS: I know I can filter the keys in my application, but I store too many expiring keys in Redis, and sending notifications for all of them would increase the load on my Redis server.
I think you subscribed to a pattern that matches all DBs' notification messages, e.g. PSUBSCRIBE __key*__:*.
In fact, you can specify the DB index in the subscribed pattern: PSUBSCRIBE __keyspace@0__:* and PSUBSCRIBE __keyevent@0__:*. This way, you'll only receive notifications for DB 0.
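For example, a minimal redis-py sketch that only sees expired-key events from DB 0 (matching the "Ex" setting from the question) might look like this:

```python
import redis

r = redis.Redis()
# "Ex" = keyevent notifications for expired keys, as set in the question.
r.config_set("notify-keyspace-events", "Ex")

p = r.pubsub()
p.psubscribe("__keyevent@0__:expired")   # keys expiring in DB 1 will not match

for message in p.listen():
    if message["type"] == "pmessage":
        print("expired key:", message["data"])
```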
I am using RabbitMQ to send notifications to the user. The user can read his queue at any time.
The problem I am facing is that the queue is filled with lots of notifications during the night, and when the user returns in the morning, he has to process these messages sequentially. A lot of these notifications are even duplicates.
I guess it would make sense to improve this on the publisher side. That is, before adding a new notification, we check whether there are already pending notifications in the queue, and only enqueue the new one if it is genuinely new, thus avoiding duplicates.
We might even go further and extend this by combining notifications: instead of simply queuing a new notification, we could replace the pending notifications in the queue with a single new one that aggregates them together with the new notification (for example, in an array of inner notifications).
Is this possible with AMQP/RabbitMQ?
The rabbitmq-message-deduplication plugin has been written to tackle this issue.
You can enable de-duplication on a queue by setting its x-message-deduplication argument to true.
Then, your publishers will need to provide the x-deduplication-header message header with a value meaningful for de-duplication. The value could be a unique message ID or the MD5/SHA1 hash of the body for example.
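As a rough illustration, assuming the plugin is installed and enabled on the broker, declaring the queue and publishing with pika could look like this (the queue name "notifications" is just an example):

```python
import hashlib
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()

# The queue is declared with de-duplication turned on.
channel.queue_declare(
    queue="notifications",
    durable=True,
    arguments={"x-message-deduplication": True},
)

body = b'{"user": 42, "event": "report-ready"}'
# The header value identifies duplicates; here, a hash of the body.
channel.basic_publish(
    exchange="",
    routing_key="notifications",
    body=body,
    properties=pika.BasicProperties(
        headers={"x-deduplication-header": hashlib.sha1(body).hexdigest()}
    ),
)
connection.close()
```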
No, by default you can't replace an existing message.