How to create custom key-space events on Redis

I am using the Redisson library and want to create an ObjectListener implementation using a key-space event for the TOUCH operation on streams.
How can I add a custom listener?

Events for the TOUCH operation aren't supported by Redis.
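For the key-space events that Redis does publish (for example del and expired), Redisson lets you attach an ObjectListener directly to the object. A minimal sketch, assuming a local Redis with notify-keyspace-events configured for the matching event classes; the key name and address are illustrative:

import org.redisson.Redisson;
import org.redisson.api.DeletedObjectListener;
import org.redisson.api.ExpiredObjectListener;
import org.redisson.api.RBucket;
import org.redisson.api.RedissonClient;
import org.redisson.config.Config;

public class KeySpaceListenerExample {
    public static void main(String[] args) {
        Config config = new Config();
        config.useSingleServer().setAddress("redis://127.0.0.1:6379"); // assumed local instance

        RedissonClient redisson = Redisson.create(config);
        RBucket<String> bucket = redisson.getBucket("mykey");

        // Fired when the key is deleted (requires the generic "g" notification class)
        bucket.addListener((DeletedObjectListener) name ->
                System.out.println("deleted: " + name));

        // Fired when the key expires (requires the expired "x" notification class)
        bucket.addListener((ExpiredObjectListener) name ->
                System.out.println("expired: " + name));
    }
}

Since TOUCH never generates a notification, there is no listener type you can register for it.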

Related

RabbitMQ create listener dynamically

I'd like to know if there's a way to create a RabbitListener such that I can set programmatically which queue it should listen to for messages. In short, our business case requires that users be able to create queues dynamically through an API (we managed to do this). Since users can create queues with arbitrary names, I don't know whether it's possible with Spring and the RabbitMQ library to create listeners dynamically.
I know that I can create a @RabbitListener and use the queues = "my_queue_name" attribute of the annotation to create a listener for my_queue_name, but how am I supposed to do this when I don't know the queue name?
I can provide code snippets of my attempts, but they're not working properly (or not at all!) and I'm worried that this isn't the best way to handle the problem.
Thanks in advance for the support.
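One programmatic option, sketched here under the assumption that Spring AMQP is on the classpath and a ConnectionFactory can be injected (the class and queue names are illustrative), is to build a SimpleMessageListenerContainer at runtime instead of relying on the @RabbitListener annotation:

import org.springframework.amqp.core.Message;
import org.springframework.amqp.rabbit.connection.ConnectionFactory;
import org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer;

public class DynamicListenerFactory {

    private final ConnectionFactory connectionFactory;

    public DynamicListenerFactory(ConnectionFactory connectionFactory) {
        this.connectionFactory = connectionFactory;
    }

    // Creates and starts a listener container for a queue whose name is only known at runtime.
    public SimpleMessageListenerContainer listenTo(String queueName) {
        SimpleMessageListenerContainer container =
                new SimpleMessageListenerContainer(connectionFactory);
        container.setQueueNames(queueName);
        container.setMessageListener((Message message) ->
                System.out.println("Received: " + new String(message.getBody())));
        container.start();
        return container;
    }
}

Each call produces an independent container for the queue name supplied at runtime; keep a reference so you can stop() the container if the queue is later removed.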

Serverless framework for trigger

I am looking for a (free) serverless framework where I can create a Kafka trigger that, when fired, invokes a Kubernetes function (Python).
I have tried Nuclio, but the problem is that my Kafka version is higher than it supports (it does not support versions above 2.4).
I want something like:
apiVersion: "nuclio.io/v1beta1"
kind: "NuclioFunction"
spec:
  runtime: "python:3.6"
  handler: NuclioKafkaHandler:consumer
  minReplicas: 1
  maxReplicas: 1
  triggers:
    myKafkaTrigger:
      kind: kafka-cluster
      attributes:
        initialOffset: earliest
        topics:
          - nuclio
        brokers:
          - kafka-bootstrap:9092
        consumerGroup: Consumer
And a kube function like:
def consumer(context, event):
    context.logger.debug(event.body)
    print(event.trigger.kind)
It's as simple as these two files. I already have an existing Kafka cluster, so I just want a trigger on it.
What are the possible alternatives apart from Nuclio? I looked into Kubeless, but it seemed complicated. Fission does not support Python.
I don't know much about Nuclio but the scenario you described looks possible with Knative.
The simplest way is to create a Knative Service for your consumer. For the Kafka part, you can use a KafkaSource to get the events into the Knative Eventing system. In the KafkaSource you tell it to call your Knative Service whenever an event arrives from Kafka.
That is the simplest setup. If you need more advanced features, there is also support for filtering based on event types, having multiple consumers subscribed to the events, and more.
Red Hat's Knative Tutorial has a section for serverless eventing with Kafka.
The exact same use case is possible with Fission which is an open source serverless framework for Kubernetes.
You can create a Message Queue trigger for Kafka and associate it with a serverless function like this:
fission mqt create --name kafkatest --function consumer --mqtype kafka --mqtkind keda --topic request-topic --resptopic response-topic --errortopic error-topic
This would trigger a function called consumer whenever there's a message in the request-topic queue of Kafka.
You can also associate metadata such as authentication information (as secrets) or flags like polling intervals, max retries, etc.
Reference: https://fission.io/docs/usage/triggers/message-queue-trigger-kind-keda/kafka/

What is the difference between creating a queue using Spring's Queue class and creating it directly from the RabbitMQ console?

We have to choose the best way of implementing a RabbitMQ queue.
We have two approaches:
1. Create a queue and binding using @Bean and the Queue class in Spring (see the sketch after this list).
2. Create the queue in the RabbitMQ web console itself.
We need to know which is the better way, the programming way or the console way, and why.
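For approach 1, a minimal sketch of declaring the queue and its binding with Spring AMQP @Bean definitions (the queue, exchange, and routing-key names are illustrative):

import org.springframework.amqp.core.Binding;
import org.springframework.amqp.core.BindingBuilder;
import org.springframework.amqp.core.Queue;
import org.springframework.amqp.core.TopicExchange;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class RabbitConfig {

    // A durable queue; with Spring Boot's auto-configured RabbitAdmin it is declared
    // on the broker at startup if it does not already exist.
    @Bean
    public Queue myQueue() {
        return new Queue("my-queue", true);
    }

    @Bean
    public TopicExchange myExchange() {
        return new TopicExchange("my-exchange");
    }

    // Binds the queue to the exchange with a routing key.
    @Bean
    public Binding myBinding(Queue myQueue, TopicExchange myExchange) {
        return BindingBuilder.bind(myQueue).to(myExchange).with("my.routing.key");
    }
}

Declaring the topology in code keeps it in version control and recreates it on a fresh broker, whereas the console approach keeps that responsibility with whoever administers RabbitMQ.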
IMHO, the better way is using the web console. A queue is infrastructure and will be used by many applications; you should not give applications full control of the infrastructure. It should be maintained by the admin.
Also please consider the following aspects:
Security
Ease of use
Threats

Can Redis keyspace notifications be pushed to a Redis stream instead of a pub/sub channel?

We have a requirement to be notified of changes to a Redis data structure. Based on my research, I found that I can use Redis keyspace notifications for this. However, Redis keyspace notifications send the events to a Redis pub/sub channel, which is fire-and-forget: once a client loses its connection, all events published until the connection is up again are lost.
Redis Streams solve this problem, and I also want to use the consumer group feature of Redis Streams. So is there any way that Redis keyspace notifications can be pushed to a Redis Stream instead of a Redis pub/sub channel?
The only way this can be done, AFAIK, with current Redis (v5.0.3) is to use the Modules API to develop a module that registers for keyspace notifications, processes them, and adds the relevant messages to a Stream.
With RedisGears it's pretty simple to register a listener that will automatically write each event to a Stream.
For example, the following register.py script writes an event to the mystream Stream for every HSET or HMSET call on keys with the person:* prefix.
register.py:
GearsBuilder() \
    .foreach(lambda x: execute('XADD', "mystream", '*', *sum([[k, v] for k, v in x.items()], []))) \
    .register(prefix="person:*", eventTypes=['HSET', 'HMSET'])
To run it all you need to do is call:
$ gears-cli run register.py

Can wifi / geo triggers be invoked even if the Worklight app is not running, including not in the background?

In Android, an app that is not currently running can be notified when certain events happen (like wifi scan results becoming available or the boot process completing) through the Broadcast Receivers mechanism. Is this possible in any way, so that the wifi/geo triggers can be invoked even if the Worklight app is not running, including not running in the background?
Regarding wifi/connectivity changes notifying your app, that looks possible since it is a standard system event. It would likely take custom native code, since you'd need to implement a broadcast receiver and register it in your app's AndroidManifest.xml file. Take a look at http://developer.android.com/reference/android/net/ConnectivityManager.html and http://www.grokkingandroid.com/android-getting-notified-of-connectivity-changes/
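As a rough sketch of such a receiver (the class name is illustrative, and it assumes a <receiver> entry in AndroidManifest.xml with an intent-filter for android.net.conn.CONNECTIVITY_CHANGE):

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.net.ConnectivityManager;
import android.net.NetworkInfo;
import android.util.Log;

public class ConnectivityReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        // Query the current connectivity state when the system broadcasts a change.
        ConnectivityManager cm =
                (ConnectivityManager) context.getSystemService(Context.CONNECTIVITY_SERVICE);
        NetworkInfo info = cm.getActiveNetworkInfo();
        boolean connected = info != null && info.isConnected();
        Log.d("ConnectivityReceiver", "connected=" + connected);
    }
}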
Regarding geolocation triggers, it is unclear what sort of triggers you are looking for. This is all I see in the Android docs: http://developer.android.com/reference/android/location/GpsStatus.html and the standard broadcast actions at http://developer.android.com/reference/android/content/Intent.html#constants
If you are looking for something like geofencing, it would take application logic to determine when to fire events, so that means an app or service needs to be running. So although your broadcast receiver's onReceive() method can get called upon a geo event, who is going to fire that event?
Having the triggers activate when the application is not running at all (not even in background) isn't supported through the Worklight APIs.
You could try to use the Worklight Android Native SDK together with cmarcelk's suggestions. Or you could use the Worklight Android triggers within a native service, together with the Broadcast Receivers mechanism, so that it runs automatically on boot. You could then use an Intent to open the application from the trigger callback.