Subscribe to a topic using Node.js - Cumulocity

I am attempting to subscribe to a topic from a node js file.
client.publish('s/us/mqttjs_8303dec3', '210,' + Math.random() * 10)
I am using this line to publish a signal strength measurement, run on a timer.
All good, the data arrives in Cumulocity.
My question is how do I now subscribe to that topic?
How would I do that in MQTTLens as a backup?
I am new to Cumulocity, so any help is much appreciated.
Fred

In Cumulocity, MQTT is not used as a generic broker where you can publish and subscribe to the same topics.
You currently cannot subscribe via MQTT to the data the devices publish.
If you (as a device) want to receive your operations, you can subscribe to e.g. s/ds for the static operation templates.
For publishing, the topic can simply be s/us.
I am not sure what the mqttjs_8303dec3 part is in your case, but if it is the client identification, you can just put it into the MQTT ClientID to associate with the correct device. No need to send it with every publish.
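For the device side in Node.js, a minimal sketch with the mqtt package could look like the following. The host, tenant and credentials are placeholders, and the clientId is taken from your example; the device publishes the signal strength template (210) to s/us and subscribes to s/ds for static operations.

const mqtt = require('mqtt');

// host and credentials are placeholders; the clientId identifies the device,
// so it does not need to be repeated in the topic
const client = mqtt.connect('mqtts://<your-tenant>.cumulocity.com', {
  clientId: 'mqttjs_8303dec3',
  username: '<tenant>/<user>',
  password: '<password>'
});

client.on('connect', () => {
  // receive static operation templates addressed to this device
  client.subscribe('s/ds');

  // publish a signal strength measurement (template 210) on a timer
  setInterval(() => {
    client.publish('s/us', '210,' + Math.random() * 10);
  }, 10000);
});

client.on('message', (topic, message) => {
  console.log('operation received on %s: %s', topic, message.toString());
});

In MQTTLens the idea would be the same: connect with the device's ClientID and your tenant credentials, then add a subscription to s/ds rather than to the measurement topic.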

Related

Where can I find azure IoT device messages?

I have sent messages to an Azure IoT Hub device called dev1. I could not see the messages in IoT Hub; I can only read the messages when the client application is online while the sender is sending them. Does Azure IoT Hub support only online messaging and no offline messaging? If there is offline message support, where are these messages stored? I couldn't see them in IoT Hub.
When I configure the custom endpoint as Blob storage, I can see messages are stored in blobs.
Please help me on this.
Thanks in advance
If I understand correctly, you are looking to read the messages directly in the IoT Hub portal UI. If that is the case, then one thing you can check about D2C messages in the IoT Hub portal (from a UI perspective) is the Metrics chart. For reading the actual payload you have to make use of the built-in Event Hub endpoint or route to other supported endpoints (you have already mentioned client/sender applications in your scenario, so I think you already know this method of reading messages).
The Metrics chart at least tells you that the messages were received by IoT Hub; you can't read them in the portal UI.
IoT Hub is built on top of Event Hubs, and that's where your messages will be until you start reading them. They will be stored there for 1 day by default, although you can change that up to 7 days. For more information on retention, please read this page.
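As a rough sketch of reading from that built-in endpoint with Node.js and the @azure/event-hubs package: the consumer group and connection string below are placeholders, and the connection string is assumed to be the Event Hub-compatible one taken from the IoT Hub's built-in endpoints settings.

const { EventHubConsumerClient } = require('@azure/event-hubs');

// Event Hub-compatible connection string from the IoT Hub built-in endpoint
const connectionString = '<event-hub-compatible-connection-string>';
const consumerClient = new EventHubConsumerClient('$Default', connectionString);

consumerClient.subscribe({
  processEvents: async (events) => {
    for (const event of events) {
      console.log('device-to-cloud message:', event.body);
    }
  },
  processError: async (err) => {
    console.error('error while reading events:', err);
  }
});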

RabbitMQ and IoT device: keep queue open?

We're using RabbitMQ in a new project. We'll have IoT devices communicating with queues.
For the devices to send info to the cloud we don't see any issues; however, sometimes we need to deliver messages from our backend to the IoT devices. For this we let the devices open an exclusive queue. This works perfectly, as long as the devices are online. When they aren't, the queue is closed and no messages can be sent to it anymore.
Is there a way to keep the queue open, so messages are kept until the IoT device comes back online?
Vice versa: is there some way to have guaranteed delivery starting at the IoT device? For example: energy measurements every 15 minutes. If the connection drops, messages should be stored on disk (to prevent message loss in case of a power cut) and sent later when the connection comes back online. Does a service or client library exist that implements this, or do we need to develop this ourselves?
Is there a way to keep the queue open, so messages are kept until the IoT device comes back online?
Use a regular queue, and make sure it's durable if you'd like it to survive RabbitMQ restarts.
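A minimal sketch with amqplib, assuming a queue named device-42-inbox: a durable queue survives a broker restart and holds the backlog in RabbitMQ while the device is offline (combine it with persistent messages, as shown further below).

const amqp = require('amqplib');

async function declareDeviceQueue() {
  const conn = await amqp.connect('amqp://localhost');
  const ch = await conn.createChannel();
  // durable: the queue definition survives a broker restart
  await ch.assertQueue('device-42-inbox', { durable: true });
  return { conn, ch };
}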
Is there some way to have guaranteed delivery starting at the IoT device?
That depends on the library you are using, but you don't tell us what library or what protocol you're using (AMQP vs MQTT, for instance).
Some libraries offer automatic reconnect and re-creation of topology (queues, exchanges, etc) but I'm not aware of any that offer local storage of messages until the broker is available again. You'll have to code that yourself.
Please carefully read the documentation with regard to publisher confirmations and consumer acknowledgements, as those are both necessary for reliable messaging.
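As a rough sketch of both mechanisms with amqplib (the queue name and payload are placeholders): publisher confirms tell you the broker has taken responsibility for a message, and a manual consumer acknowledgement tells the broker the message has actually been processed.

const amqp = require('amqplib');

async function reliablePublishAndConsume() {
  const conn = await amqp.connect('amqp://localhost');

  // publisher side: a confirm channel lets the broker confirm each publish
  const pub = await conn.createConfirmChannel();
  await pub.assertQueue('measurements', { durable: true });
  pub.sendToQueue('measurements', Buffer.from('energy,42'), { persistent: true }, (err) => {
    if (err) console.error('message was nacked by the broker', err);
    else console.log('message confirmed by the broker');
  });

  // consumer side: ack only after the message has been processed
  const sub = await conn.createChannel();
  await sub.consume('measurements', (msg) => {
    // ... process msg.content here ...
    sub.ack(msg);
  }, { noAck: false });
}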
NOTE: the RabbitMQ team monitors the rabbitmq-users mailing list and only sometimes answers questions on StackOverflow.
Our Cloud has several exchanges and credentials, called CredentialsBucket, assigned to a set of IoT devices. When an IoT device registers, we provide it with these credentials, which include a durable queue and exchange. When the IoT device pushes messages, they go to the Cloud through the exchange, where we do an additional security check using HMAC.
When the Cloud sends a message, it sends it directly to the device's queue (no persistent messages in our case), and the IoT device does the same kind of security check.
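The HMAC check itself is not shown in the answer; a hypothetical version using Node's built-in crypto module could look like this (the signature encoding and the way the shared secret is distributed are assumptions):

const crypto = require('crypto');

function verifyHmac(payload, receivedSignature, sharedSecret) {
  // recompute the signature over the payload with the device's shared secret
  const expected = crypto.createHmac('sha256', sharedSecret).update(payload).digest('hex');
  // timingSafeEqual requires equal-length buffers and avoids timing leaks
  return expected.length === receivedSignature.length &&
    crypto.timingSafeEqual(Buffer.from(expected), Buffer.from(receivedSignature));
}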

Redis PUB/SUB: how to ignore own messages?

The idea is:
I have N WCF services which are connected and subscribed to the same Redis message channel. These services use this channel to exchange messages to sync some caches and other data.
How each service can ignore its own messages? I.e. how to publish to all but me?
It looks like Redis PUB/SUB doesn't support such filtering. So the solution is to use a set of individual channels, one for every publisher, and a common channel for subscription synchronization between them. Here is a Golang example of a no-echo chat application.
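A rough Node.js sketch of the same idea with the node-redis client (v4): every service publishes on its own channel and pattern-subscribes to all of them, skipping its own. The channel prefix and service id are assumptions.

const { createClient } = require('redis');

async function startSyncChannel(serviceId) {
  const publisher = createClient();
  const subscriber = publisher.duplicate(); // pub/sub needs its own connection
  await publisher.connect();
  await subscriber.connect();

  const myChannel = 'cache-sync:' + serviceId;

  // listen on every service's channel, but ignore our own messages
  await subscriber.pSubscribe('cache-sync:*', (message, channel) => {
    if (channel === myChannel) return;
    console.log('sync message from ' + channel + ':', message);
  });

  // publish only on our own channel
  await publisher.publish(myChannel, JSON.stringify({ op: 'invalidate', key: 'users' }));
}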

How to handle RabbitMQ with mobile apps

I am looking to implement RabbitMQ on Google Compute Engine to handle messages for my Android and iOS messaging app. I have heard that RabbitMQ can be quite power hungry, so I am wondering what the best solution to combat this is.
Do I use a different protocol like MQTT, or do I use something like GCM to handle the connection to and from the apps and let RabbitMQ just handle queuing the messages?
You would never want to make a direct connection from a mobile device to your RabbitMQ server, especially if the app on the device is a consumer. RabbitMQ consumers have to poll RabbitMQ continuously to check if there are messages pending for them. You would want a web server to handle the actual HTTP POST/GET of messages from devices. The webserver will do two things:
Save the message to DB (along with the source and intended destination info)
Queue APN/GCM push messages to a RabbitMQ exchange (RabbitMQ being the broker here)
You will need to build a daemon to monitor RabbitMQ for these queued push messages. The daemon's sole task would be to connect, or maintain a connection, to Apple's or Google's push messaging services and notify your apps that they have a message pending. If a device is notified of a pending message, it contacts the webserver to consume the message.
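A minimal sketch of such a daemon with amqplib; the queue name and the sendPushNotification helper (the part that talks to APNs/GCM) are hypothetical:

const amqp = require('amqplib');

async function runPushDaemon(sendPushNotification) {
  const conn = await amqp.connect('amqp://localhost');
  const ch = await conn.createChannel();
  await ch.assertQueue('push-notifications', { durable: true });

  await ch.consume('push-notifications', async (msg) => {
    const job = JSON.parse(msg.content.toString());
    // sendPushNotification is a hypothetical helper that keeps a connection
    // to Apple's or Google's push service and notifies the device
    await sendPushNotification(job.deviceToken, job.platform, job.payload);
    ch.ack(msg); // ack only once the push provider accepted the notification
  });
}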

RabbitMQ - How to save message for new consumer

We are working on a new chat project, and we want to use RabbitMQ to transfer our messages.
So, can RabbitMQ save all the messages in a queue or some other place, so that when a new person (consumer) comes, RabbitMQ can flush the saved messages to them?
If you use a persistent queue, RabbitMQ can store the messages (the messages are stored under the same Mnesia DB path).
So suppose that each user has their own queue; when the user comes online, they can download the messages, as in the sketch below.
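A rough sketch of that per-user-queue idea with amqplib; the queue naming scheme is an assumption, and messages are published as persistent so they survive until the user reconnects:

const amqp = require('amqplib');

async function sendToUser(ch, userId, text) {
  const queue = 'chat.user.' + userId;
  await ch.assertQueue(queue, { durable: true });
  // persistent: the message is written to disk and kept until consumed
  ch.sendToQueue(queue, Buffer.from(text), { persistent: true });
}

async function drainOnLogin(ch, userId, onMessage) {
  const queue = 'chat.user.' + userId;
  await ch.assertQueue(queue, { durable: true });
  // deliver the backlog that accumulated while the user was offline
  await ch.consume(queue, (msg) => {
    onMessage(msg.content.toString());
    ch.ack(msg);
  });
}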
Anyway, I don't think it is a good idea to use RabbitMQ to push messages for chat. There are other, more appropriate technologies, like MQTT or XMPP.
I suggest reading this post:
using rabbitmq in android for chat
Please read the tutorial to understand what RabbitMQ is.
RabbitMQ is a message broker: it accepts and forwards messages. You can think about it as a post office: when you put the mail that you want posting in a post box, you can be sure that Mr. or Ms. Mailperson will eventually deliver the mail to your recipient. In this analogy, RabbitMQ is a post box, a post office and a postman.