How to check logs between OPC Publisher and IoT Hub to confirm the data transfer - azure-iot-hub

I have set up IoT Edge on one of our machines and installed OPC Publisher, connecting it to one of our OPC UA servers, which sends data through OPC Publisher to IoT Hub. We received no data in our IoT Hub for the last 10 days, and suddenly today the data arrived. How can we troubleshoot why the data was missing for those 10 days?

You can generate a support bundle on your edge device that will collect the logs of all deployed modules as well as the edge runtime logs.
sudo iotedge support-bundle --since 11d
More details on troubleshooting IoT Edge can be found here
You can first look into the logs of the publisher and validate whether the connection to the OPC UA server was (and is) active. If that looks fine, have a look at the edgeHub logs and validate whether the upstream connectivity to IoT Hub was affected.
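For example, you can pull the most recent log lines of each module directly on the device (assuming the publisher module is named OPCPublisher, as in the deployment snippet below):
sudo iotedge logs OPCPublisher --tail 200
sudo iotedge logs edgeHub --tail 200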

One of the most powerful tools to monitor your edge deployments is the integration with Azure Monitor. It will collect metrics from the edgeHub and edgeAgent, which combined will give you an overview of where your messages are going. It can show you how many messages are sent to your upstream endpoint and when.
For a full overview of the capabilities, you can check out this blog post. Installation steps are here
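As a sketch of what such a query can look like once the metrics-collector module is forwarding to a Log Analytics workspace (assuming the default setup, where metrics land in the InsightsMetrics table under the prometheus namespace; edgehub_messages_sent_total is one of edgeHub's built-in counters):
InsightsMetrics
| where Namespace == "prometheus"
| where Name == "edgehub_messages_sent_total"
| project TimeGenerated, Name, Val, Tags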
Edit:
OPC Publisher also supports diagnostic logging, which will give you more information about the connections to OPC UA servers. To enable it, set the diagnostic interval by specifying the --di command-line argument in your createOptions:
"OPCPublisher":{
"settings":{
"image":"<image>",
"createOptions":{
"Cmd":["di=60"]
}
},
"type":"docker",
"version":"1.0",
"status":"running",
"restartPolicy":"always"
}
The example above will log diagnostic metrics every 60 seconds. You can then upload the logs using the support bundle command from Cristian's answer, or use the UploadSupportBundle direct method to do the same without needing access to the device.
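As an illustrative sketch of the direct-method route using the Azure CLI (the payload fields follow the built-in edgeAgent method schema; the hub, device, and SAS URL values are placeholders you would substitute):
az iot hub invoke-module-method \
  --hub-name <your-hub> --device-id <your-edge-device> --module-id '$edgeAgent' \
  --method-name UploadSupportBundle \
  --method-payload '{"schemaVersion": "1.0", "sasUrl": "<blob-container-SAS-URL>", "since": "10d"}'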

Related

Azure IoT on Edge - IoTSDK - Read batch of messages from ModuleClient

I'm trying to develop a high-frequency message dispatching application, and I'm observing the behavior of the SDK when reading messages from a ModuleClient connected to the edgeHub using the "MQTT on TCP Only" transport settings.
It seems there is no way to read multiple messages at a time (as a batch) from the edgeHub (I think this is related to the underlying protocol).
The result is that one must sequentially read a message, process it, and acknowledge it to the hub.
Is there a way to process multiple messages at a time without waiting for the processing of the previous one?
Is this "limitation" tied to the underlying protocol?
I'm using Microsoft.Azure.Devices.Client 1.37.2 in a .NET Core 3.1 application deployed on Azure Kubernetes Service (AKS) using the Azure IoT Edge on Kubernetes workload.
You are correct: you cannot batch messages over the MQTT protocol. This is a limitation of IoT Hub when using MQTT:
IoT Hub only supports batch send over AMQP and HTTPS at the moment. The MQTT implementation loops over the batch and sends each message individually.
Ref: https://github.com/Azure/azure-iot-sdk-csharp
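As a minimal sketch of the batch-capable path in the same SDK (switching the module connection to AMQP, which does support SendEventBatchAsync; the payloads are placeholders):
using System.Collections.Generic;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;

class BatchSender
{
    static async Task Main()
    {
        // Open the module connection over AMQP instead of MQTT.
        ModuleClient client = await ModuleClient.CreateFromEnvironmentAsync(TransportType.Amqp_Tcp_Only);

        // Build a batch and send it in a single AMQP operation.
        var batch = new List<Message>
        {
            new Message(Encoding.UTF8.GetBytes("{\"reading\": 1}")),
            new Message(Encoding.UTF8.GetBytes("{\"reading\": 2}")),
        };
        await client.SendEventBatchAsync(batch);
    }
}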
If you need IoT Hub to support batch send when connecting over MQTT, I suggest you add a new feature request: https://feedback.azure.com/forums/321918-azure-iot-hub-dps-sdks

Where can I find Azure IoT device messages?

I have sent messages to an Azure IoT Hub device called dev1. I could not see the messages in IoT Hub; I can read them only when the client application is online while the sender is sending. Does Azure IoT Hub support only online messaging and no offline messaging? If there is offline message support, where are these messages stored? I couldn't see them in IoT Hub.
When I configure a custom endpoint to Blob storage, I can see the messages stored in blobs.
Please help me on this.
Thanks in advance
If I understand correctly, you are looking to read the messages directly in the IoT Hub portal UI. If that is the case, one thing you can check about D2C messages in the IoT Hub portal is the Metrics chart. For reading the actual payload, you have to use the built-in Event Hubs-compatible endpoint or route to other supported endpoints. (You have already mentioned client/sender applications in your scenario, so I think you already know this method of reading messages.)
The Metrics chart at least tells you that the messages are reaching IoT Hub; you can't read them in the portal UI.
IoT Hub is built on top of Event Hubs, and that's where your messages will be until you start reading them. They will be stored there for 1 day by default, although you can change that up to 7 days. For more information on retention, please read this page.
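For a quick look at what is flowing through that built-in endpoint, the Azure CLI (with the azure-iot extension installed) can read it directly, for example:
az iot hub monitor-events --hub-name <your-hub> --device-id dev1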

Gridgain console load balance

I have a three-node GridGain cluster and am also running the GridGain Web Console agent and Web Console on all three nodes, all hosted on Windows Server.
I would like to load balance my Web Console. The problem is I don't know how to share the user registration database, which it stores in a work directory. Can I use an external database to store all that information so that my cluster uses the same database?
There is a problem with the Web Console Agent as well: how do I share the tokens stored in default.properties?
There is no definitive guide on how to create a Web Console cluster for high availability.
Can someone please guide me on how to form a Web Console cluster sharing its user store and tokens?
Thanks
If you are looking for multi-cluster support, take a look at documentation:
https://www.gridgain.com/docs/web-console/latest/multi-cluster-support
If you are looking for agent fault tolerance: just start several agents. The first agent will process all messages; the others will be in hot-standby mode.
If you are looking for connection fault tolerance between the agent and the cluster (if the cluster node that serves as the agent's connection point fails, Web Console will lose its connection to the cluster), just specify several node addresses as a comma-separated list for the "node-uri" parameter (in default.properties or as a command-line argument).
For example:
node-uri=http://192.168.0.1:8080,http://192.168.0.2:8080,http://192.168.0.3:8080
Hope this helps.

Multiple MQTT connections on a single IOT device

Using the azure-iot-sdk for Python, I have a program that opens a connection to the IoT Hub and continually listens for direct methods, using the MQTT protocol. This is working as expected. I have a second Python program, invoked hourly from cron, that connects to the IoT Hub and updates the device twin for my device. Again this is using MQTT. Everything is working fine.
However, I've come across a note in the documentation that a device can only have one MQTT connection at a time, and that opening a second will cause the first to drop. I'm not seeing this, but is what I'm doing unsupported?
Should I have a single program doing both tasks and sharing a single connection?
Yes, that is correct: you can't have more than one connection with the same device ID to the IoT Hub. Over time you will see inconsistent behavior, and that scenario is unsupported. You should use a single program with a unique device ID doing both tasks.
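A minimal sketch of that single-program approach with the Python SDK (azure-iot-device; the connection string, handler body, and reported property are placeholders):
import time
from azure.iot.device import IoTHubDeviceClient, MethodResponse

# One client means one MQTT connection shared by both tasks.
client = IoTHubDeviceClient.create_from_connection_string("<device connection string>")

def on_method_request(request):
    # ... handle the direct method here ...
    client.send_method_response(MethodResponse.create_from_method_request(request, 200))

client.on_method_request_received = on_method_request

# Replace the hourly cron job with a loop (or timer) in the same process,
# so the twin update reuses the existing connection.
while True:
    client.patch_twin_reported_properties({"last_report": time.time()})
    time.sleep(3600)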
Depending on the scenario, you may want to consider using an iothubowner connection string for service-side operations: managing your IoT hub and, optionally, sending messages, scheduling jobs, invoking direct methods, or sending desired-property updates to your IoT devices or modules.

Free device logs for analysis

I am setting up a home SIEM lab using Splunk. I am looking for sources that can provide logs for various devices, including but not limited to the ones below.
Windows Logs
IIS Logs
IDS/IPS Logs
Based on these logs, I am planning to build search queries for various events and then use them to build rules.
It is not clear why you need pre-made logs when you can generate them yourself. For example, you can set up a VM with Windows Server and install an agent like NXLog (or any log collection agent that can forward logs via TCP, UDP, TLS, or HTTP) to collect logs into Splunk.
Check out the Montgomery County Data Portal. It's free:
https://data.montgomerycountymd.gov/
You could also connect to a crypto exchange API and have lots of data flowing in real time.