Azure IoT Hub monitoring usage and history - azure-iot-hub

I recently started a project with Azure IoT Edge on the IoT Hub free tier, so I'm a total beginner. I set up a device sensor and a module, and I am successfully sending data to my module and IoT Hub. I can
monitor the messages sent from the sensor with the Azure IoT Hub extension for Visual Studio Code.
I can see the messages I'm sending, but I am having an issue with the number of messages reported.
I use Azure portal metrics to monitor the number of messages sent, and very often the portal shows me different numbers as I refresh: for example "1000" messages, then "800" after a refresh, and so on.
Another issue is that the Metrics view shows messages being sent during periods when my sensors weren't sending anything.
Is there a way to get a detailed report with a history of the messages that the hub receives?
Any help or advice would be highly appreciated! Thank you

As far as I know there is no "nice and simple" report which will show you what you need. However, if you want to get historical events which the IoT hub processed, it can be done. Note, though, that history is limited to at most 7 days. The current retention period can be seen in the Azure portal under "Built-in endpoints": there is a "Retain for" setting with a value from 1 day (the default) to 7 days.
If events are within this range, you can use the Azure CLI command "az iot hub monitor-events" with the --enqueued-time parameter to "look back" in history. The time is specified in milliseconds since the Unix epoch. Example command:
az iot hub monitor-events --hub-name 'your_iot_hub_name' --device-id 'your_registered_device_id' --properties all --enqueued-time 1593734400000
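The --enqueued-time value in the example above (1593734400000) corresponds to 2020-07-03 00:00:00 UTC. A small Python sketch for computing such a timestamp (the hub and device names in the commented command are placeholders):

```python
from datetime import datetime, timezone

def enqueued_time_ms(dt: datetime) -> int:
    """Convert a timezone-aware datetime to milliseconds since the Unix epoch."""
    return int(dt.timestamp() * 1000)

# Look back to 2020-07-03 00:00:00 UTC
start = datetime(2020, 7, 3, tzinfo=timezone.utc)
print(enqueued_time_ms(start))  # 1593734400000

# The value can then be passed to the CLI, e.g.:
# az iot hub monitor-events --hub-name <your-hub> --device-id <your-device> \
#     --properties all --enqueued-time 1593734400000
```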

Related

Using Azure IoT - telemetry from a Windows desktop application

I work for a company that manufactures large scientific instruments, with a single instrument having 100+ components: pumps, temperature sensors, valves, switches and so on. I write the WPF desktop software that customers use to control their instrument, which is connected to the PC via a serial or TCP connection. The concept is the same though - to change a pump's speed for example, I would send a "command" to the instrument, where an FPGA and custom firmware would take care of handling that command. The desktop software also needs to display dozens of "readback" values (temperatures, pressures, valve states, etc.), and these are again retrieved by issuing a "command" to request a particular readback value from the instrument.
We're considering implementing some kind of telemetry service, whereby the desktop application will record maybe a couple of dozen readback values, each having its own interval - weekly, daily, hourly, per minute or per second.
Now, I could write my own telemetry solution, whereby I record the data locally to disk then upload to a server (say) once a week, but I've been wondering if I could utilise Azure IoT for collecting the data instead. After wading through the documentation and concepts I'm still none the wiser! I get the feeling it is designed for "physical" IoT devices that are directly connected to the internet, rather than data being sent from a desktop application?
Assuming this is feasible, I'd be grateful for any pointers to the relevant areas of Azure IoT. Also, how would I map a single instrument and all its components (valves, pumps, etc) to an Azure IoT "device"? I'm assuming each component would be a device, in which case is it possible to group multiple devices together to represent one customer instrument?
Finally, how is the collected data reported on? Is there something built-in to Azure, or is it essentially a glorified database that would require bespoke software to analyse the recorded data?
Azure IoT would give you:
Device SDKs for connecting (MQTT or AMQP), sending telemetry, receiving commands, receiving messages, reporting properties, and receiving property update requests.
An HA/DR service (IoT Hub) for managing devices and their authentication, configuring telemetry routes (where to route the incoming messages).
Service SDKs for managing devices, sending commands, requesting property updates, and sending messages.
If it matches your solution, you could also make use of the Device Provisioning Service, where devices connect and are assigned an IoT hub. This would make sense, for instance, if you have devices around the world and wish to have them connect to the closest IoT hub you have deployed.
Those are the building blocks. You'd integrate the device SDK into your WPF app. It doesn't have to be a physical device, but the fact it has access to sensor data makes it behave like one, and that seems like a good fit. Then you'd build a service app using the Service SDKs to manage the fleet of WPF apps (each of which represents an instrument with components, right?). For monitoring telemetry, it would depend on how you choose to route it. By default, it goes to an Event Hub instance created for you, and you'd use the Event Hubs SDK to subscribe to those messages. Alternatively, or in addition, those telemetry messages could be routed to Azure Storage where you could perform historical analysis. There are other routing options.
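As a sketch of the device side, here is roughly what sending readback telemetry looks like with the Python device SDK (azure-iot-device); your WPF app would use the equivalent .NET SDK. The payload field names (instrumentId, component, value) are made-up examples, not a required schema - mapping one instrument to one IoT device and listing its components in the payload is just one possible design:

```python
import json

def build_readback_payload(instrument_id, readbacks):
    """Package a batch of readback values (hypothetical field names) as JSON."""
    return json.dumps({
        "instrumentId": instrument_id,
        "readbacks": [
            {"component": name, "value": value} for name, value in readbacks.items()
        ],
    })

def send_readbacks(connection_string, instrument_id, readbacks):
    # Import inside the function so the sketch can be read without the SDK installed.
    from azure.iot.device import IoTHubDeviceClient, Message

    client = IoTHubDeviceClient.create_from_connection_string(connection_string)
    msg = Message(build_readback_payload(instrument_id, readbacks))
    msg.content_type = "application/json"
    msg.content_encoding = "utf-8"
    client.send_message(msg)  # telemetry lands on the hub's built-in endpoint
    client.shutdown()
```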
Does that help?

Creating multiple subs on same topic to implement load sharing (pub/sub)

I spent almost a day on the Google Pub/Sub documentation to create a small app. I am thinking of switching from RabbitMQ to Google Pub/Sub. Here is my question:
I have an app that pushes messages to a topic (T). I wanted to do load sharing via subscribers, so I created 3 subscribers to T. I have kept the name of all 3 subs the same (S), so that I don't get the same message 3 times.
I have 2 issues:
Nowhere in the console do I see 3 subscribers to T. It shows 1.
If I try to start all 3 instances of subscribers at the same time, I get "A service error has occurred." The error disappears if I start them sequentially.
Lastly, is Google serious about Pub/Sub? Looking at the documentation and public participation, I am not sure if I should switch to Google Pub/Sub.
Thanks,
In Pub/Sub, each subscription gets a copy of every message. So to load-balance message handling, you don't want 3 different subscriptions, but a single subscription that distributes messages to 3 workers.
If you are using pull delivery, simply create a single subscription (as a one-time action when you set up the system), and have each worker pull from that same subscription.
If you are using push delivery, have a single subscription pushing to a single endpoint that provides load balancing (e.g. push to an HTTP load balancer with multiple instances in a backend service).
Google is serious about Pub/Sub: it is deeply integrated into many products (GCS, BigQuery, Dataflow, Stackdriver, Cloud Functions, etc.) and Google uses it internally.
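To illustrate the difference, here is a toy in-memory model (not the real Pub/Sub API) of the two delivery semantics: every subscription receives its own copy of each message, while workers pulling from the same subscription share the messages between them:

```python
from collections import deque

class Topic:
    def __init__(self):
        self.subscriptions = []

    def publish(self, message):
        # Fan-out: every *subscription* gets its own copy of the message.
        for sub in self.subscriptions:
            sub.queue.append(message)

class Subscription:
    def __init__(self, topic):
        self.queue = deque()
        topic.subscriptions.append(self)

    def pull(self):
        # Load sharing: workers pulling the same subscription split the messages.
        return self.queue.popleft() if self.queue else None

topic = Topic()
shared = Subscription(topic)                  # one subscription S ...
for i in range(3):
    topic.publish(f"msg-{i}")
workers = [shared.pull() for _ in range(3)]   # ... three workers pull from it
print(workers)  # each message is delivered to exactly one worker
```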
As per the documentation on GCP (https://cloud.google.com/pubsub/architecture), load-balanced subscribers are possible, but all of them have to use the same subscription. I don't have a code sample or POC ready, but I'm working on one.

Azure IoT Hub blocked for two consecutive days, will not let me change to a paid tier

I have a free-tier subscription to Azure IoT Hub with only two edge devices connected to it, one of them mostly off. Yesterday, my hub recorded a slew of messages: within 45 minutes (5 to 5:45 pm PST), 25K messages were recorded by the hub. A few related issues:
I'm not sure what these messages were. I'll add message storage for the future, but I'm wondering if there's a way to debug this.
Ever since then, I haven't been able to use the IoT hub. I get a "message count exceeded" error. That made sense until around 5 pm PST today (the same day UTC), but I'm not sure why it is still blocking me after that.
I tried to change my F1 hub to the Basic tier, but that wasn't allowed because I am apparently "not allowed to downgrade".
Any help with any of these?
1. I'm not sure what these messages were. I'll add message storage for the future, but wondering if there's a way to debug this.
IoT Hub operations monitoring enables you to monitor the status of operations on your IoT hub in real time. You can use it to monitor device identity operations, device telemetry, cloud-to-device messages, connections, file uploads and message routing.
2. Ever since then, I haven't been able to use the IoT hub. I get a "message count exceeded" error. That made sense till around 5 pm PST today (same day UTC), but not sure why it is still blocking me after that.
The IoT Hub free edition enables you to transmit up to a total of 8,000 messages per day, and register up to 500 device identities. The device identity limit is only present for the free edition.
3. I tried to change my F1 to the Basic tier, but that wasn't allowed because I am apparently "not allowed to downgrade".
You cannot switch from the free edition to one of the paid editions. The free edition is meant for testing out proof-of-concept solutions only.
Confirming the earlier answer, the only solution is to delete the old hub and create a new free one, which is simple enough.
I still haven't figured out what those specific messages were, but I do notice that when there are errors such as CA certificate auth failures, lots of messages get sent up. I'm still working with MSFT support on the CA certificate signing issues, but this one is a side effect.
For future reference, look at your hub's metrics, and note that (i) the quota gets reset at midnight UTC, but (ii) the violation count does not.
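One possible explanation for the confusing timing: messages sent at 5-5:45 pm PST land just after midnight UTC, so they count against the *following* UTC day's quota. A small sketch of the reset arithmetic (assuming the daily quota resets exactly at midnight UTC):

```python
from datetime import datetime, timedelta, timezone

def next_quota_reset(now: datetime) -> datetime:
    """Next midnight UTC, when the free-tier daily message quota resets."""
    midnight = now.astimezone(timezone.utc).replace(
        hour=0, minute=0, second=0, microsecond=0
    )
    return midnight + timedelta(days=1)

# 5:45 pm PST on 2020-01-15 is 01:45 UTC on 2020-01-16,
# so the quota does not reset until midnight UTC on 2020-01-17.
pst = timezone(timedelta(hours=-8))
blocked_at = datetime(2020, 1, 15, 17, 45, tzinfo=pst)
print(next_quota_reset(blocked_at).isoformat())  # 2020-01-17T00:00:00+00:00
```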

Stream Analytics job not receiving inputs from IoT Hub

I followed an IoT Hub tutorial and got it working. I then created a Stream Analytics job and used the above as an input (which, upon testing the connection, works).
However, I do not see any inputs being received. When running a sample test I get the following error:
Description: Error code: ServiceUnavailable. Error message: Unable to connect to input source at the moment. Please check if the input source is available and if it has not hit connection limits.
I can see telemetry messages being received in the IoT Hub. Any help would be appreciated.
Is the Stream Analytics job running?
I had a similar problem where I wasn't getting any events from Stream Analytics, and it turned out I had forgotten to turn it on.
Click on the Stream Analytics job > Overview > Start.
I had the same problem (using Event Hubs in my case). The root cause was that I had too many queries within my job running against the same input. I solved it by splitting my input into several inputs across multiple consumer groups.
From the documentation (emphasis added):
Each Stream Analytics IoT Hub input should be configured to have its own consumer group. When a job contains a self-join or multiple inputs, some input may be read by more than one reader downstream, which impacts the number of readers in a single consumer group. To avoid exceeding IoT Hub limit of 5 readers per consumer group per partition, it is a best practice to designate a consumer group for each Stream Analytics job.
I have exactly the same problem, though the modules on my Raspberry Pi are running without failure.
SA says: "Make sure the input has recently received data and the correct format of those events has been selected."

Duplex messaging or Azure Queue Service

All,
We have a requirement to develop an Azure-based platform in which the user can configure multiple pharmaceutical instruments, start measurements on them and analyze the measured data. The typical components of the Azure-based platform will be the following:
1 - A .NET 4-based client application running on the computer connected to each instrument. This client application should receive the start-measurement command from the Azure platform, perform the measurement and update the result back to Azure.
2 - A set of services [probably REST-based] which will get the results from the client application and update the database on the cloud.
3 - A set of services and business logic which can be used to perform analysis on the data.
4 - An ASP.NET web application where the user can view instrument details, start measurements etc.
There is two-way communication between the Azure platform and the client application, i.e. the client needs to update results to Azure, and Azure needs to initiate measurement on the instrument via the client application.
In such a scenario, what is the recommended approach for the Azure platform to communicate with the clients? Is it any of the following:
1 - Create a duplex service between the client and server and provide a callback interface to start the measurement.
2 - Create a command queue using Azure message queues for each client. When a measurement needs to be started, a message will be put on the queue. The client app will always read from the queue and execute the command.
Or do we have any other ways to do this? Any help is appreciated.
We do not fully understand your scenario and the constraints around it, but as pointers: we have seen a lot of customers use Azure storage queues to implement a master-worker scenario, where some component adds a message to the appropriate queue to get work done (take measurements in your case) and workers poll the queue to process this work (the client computer connected to your instrument in this case).
In terms of storing the results back, your master component could provide SAS access so the client can write results back to a specific blob in an Azure storage account, and have your service and business logic monitor the existence of that blob to start your analysis.
The above approach will decouple your client from the server and make communication asynchronous via storage. Again, these are just pointers and you would be the best person to pick the right approach that suits your requirements.
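The master-worker pattern described above could be sketched like this with the azure-storage-queue Python SDK; the command schema, function names and one-queue-per-client layout are illustrative choices, not a prescribed design:

```python
import json

def encode_command(instrument_id, action, params=None):
    """Serialize a 'start measurement' command (hypothetical schema) for the queue."""
    return json.dumps({
        "instrumentId": instrument_id,
        "action": action,
        "params": params or {},
    })

def enqueue_command(connection_string, queue_name, command_json):
    # Master side: put a command on the per-client queue.
    from azure.storage.queue import QueueClient

    queue = QueueClient.from_connection_string(connection_string, queue_name)
    queue.send_message(command_json)

def poll_commands(connection_string, queue_name, handle):
    # Worker side: the client app polls its queue and executes each command.
    from azure.storage.queue import QueueClient

    queue = QueueClient.from_connection_string(connection_string, queue_name)
    for msg in queue.receive_messages():
        handle(json.loads(msg.content))
        queue.delete_message(msg)  # remove only after successful handling
```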
For communication between the server and the client, you could use SignalR (http://signalr.net/). There are also two forms of messaging systems supported "as a service" on Azure: Service Bus and Message Queues - see this link: http://msdn.microsoft.com/en-us/library/hh767287.aspx