Get the creation date of an Azure IoT Hub device / twin - azure-iot-hub

I would like to be able to sort the devices registered on a given Azure IoT Hub by creation date.
I tried fetching this device list with several tools:
using the official CLI with az iot hub device-identity list … and inspecting the JSON-formatted output of this command
using the Node.js Azure SDK
The returned objects have various date-related fields:
lastActivityTime
statusUpdateTime
But none of them is set to the creation date.
To work around this apparent lack of data, I currently have to listen to Azure EventGrid DeviceCreated events and manually set an arbitrary createdAt field in the targeted digital twin's tags.
This solution is quite cumbersome; I feel like I'm missing something more obvious.
Is there any simpler way to get the creation date of an Azure IoT Hub device / twin?
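For reference, here is roughly what that workaround looks like on my side (a minimal sketch using the azure-iothub service SDK; the connection string variable, the handler name and the event shape are assumptions, and the EventGrid subscription itself is configured elsewhere):

```typescript
// Sketch only: stamp a createdAt tag on the device twin when an EventGrid
// DeviceCreated event is received. Assumes a service connection string in
// IOTHUB_CONNECTION_STRING; onDeviceCreated is a hypothetical handler invoked
// by whatever receives the EventGrid event.
import { Registry } from 'azure-iothub';

const registry = Registry.fromConnectionString(process.env.IOTHUB_CONNECTION_STRING!);

async function onDeviceCreated(deviceId: string, eventTime: string): Promise<void> {
  // Fetch the twin first so its etag can be used for the conditional update.
  const { responseBody: twin } = await registry.getTwin(deviceId);
  await registry.updateTwin(
    deviceId,
    { tags: { createdAt: eventTime } }, // the arbitrary field mentioned above
    twin.etag
  );
}
```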

We can retrieve the date and time of the IoT Hub's creation both from the portal and from the JSON file generated when the IoT Hub is created, as well as programmatically using the subscription ID and the specific resource. Follow the procedure below to retrieve the information.
Create the IoT Hub instance.
The date and time of creation are shown on the screen.
The same date and time of creation are visible in the JSON file.
Copy the JSON request and follow the steps from the link mentioned to retrieve the data. The same procedure applies to both digital twins and individual entities.
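If you want the creation time from code, a rough sketch with the @azure/arm-resources and @azure/identity packages could look like the following; the $expand=createdTime option and the Microsoft.Devices/IotHubs filter are my assumptions here, and note that this returns the creation time of the hub resource itself, not of individual devices:

```typescript
// Sketch: list IoT Hub resources in a subscription together with their
// creation timestamps, via Azure Resource Manager.
import { DefaultAzureCredential } from '@azure/identity';
import { ResourceManagementClient } from '@azure/arm-resources';

const subscriptionId = process.env.AZURE_SUBSCRIPTION_ID!;
const client = new ResourceManagementClient(new DefaultAzureCredential(), subscriptionId);

async function listIotHubCreationTimes(): Promise<void> {
  for await (const resource of client.resources.list({
    filter: "resourceType eq 'Microsoft.Devices/IotHubs'",
    expand: 'createdTime', // ask ARM to include the creation timestamp
  })) {
    console.log(resource.name, resource.createdTime);
  }
}

listIotHubCreationTimes().catch(console.error);
```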

Related

On Azure IoT Hub, is it possible to create a device which acts as an alias for an already deployed device?

For Azure IoT Hub: I am working on a project where I would like to create some scripts based around the device name of an IoT device in the field. The device name already contains details about the location and the device number at that location (which is what the script will pull).
Example: H4C1F5Device3, where H4C1F5 is the postal code and Device3 refers to the 3rd device at this location.
However, due to a mix-up in deployment, some devices don't match this naming convention, which breaks the scripting somewhat.
So my question is: is there a way to create an "alias"/dummy device in IoT Hub with the correct name which can point to the previously deployed device with the "incorrect" name?
So far I have only looked at solutions in the Portal (no CLI, Node.js, etc.). I have tried making changes in the "Device Twins" page, but it does not accept the changes when I attempt to save.
I would appreciate any help on this. The Azure documentation is quite good, but I'm having trouble finding the solution to this exact query.
There is no such feature in Azure IoT Hub. You could create the new device IDs and reprovision your devices with the new credentials.
It's also good practice not to store this kind of information in the device ID. You could use tags on the device twin (or desired properties if the device needs to know); unlike the device ID, they are mutable. A sketch of this follows below.
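A minimal sketch of that suggestion, using the azure-iothub Node.js SDK rather than the Portal (the device ID, tag names and connection string variable are illustrative):

```typescript
// Sketch: create a clean device identity and keep the location metadata in
// mutable twin tags instead of encoding it in the device ID.
import { Registry } from 'azure-iothub';

const registry = Registry.fromConnectionString(process.env.IOTHUB_CONNECTION_STRING!);

async function provisionWithTags(deviceId: string): Promise<void> {
  await registry.create({ deviceId }); // plain, stable identifier

  const { responseBody: twin } = await registry.getTwin(deviceId);
  await registry.updateTwin(
    deviceId,
    { tags: { postalCode: 'H4C1F5', deviceNumber: 3 } }, // mutable metadata
    twin.etag
  );
}

provisionWithTags('device-0003').catch(console.error);
```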

Is there any way to get real-time data from the BigQuery SDK / API?

I am working on a project in which I'm facing a problem: I want to get real-time data from a BigQuery dataset.
I researched this and learned about websocket solutions such as Pusher and laravel-websockets, and I tested them as well,
but one thing I didn't understand is what to do once I connect the BigQuery SDK to my Laravel project: I couldn't find any event or method that BigQuery provides for this. In my scenario most of the data comes from IoT devices, and the devices use the BigQuery API to feed data; I want that device data in real time as well.
Then I found that there is a way to connect a BigQuery dataset with Firebase, and Firebase then raises an event whenever a row is fed into the dataset.
I just want to know: is there any way I can do this without the Firebase solution,
since that solution is not free and I would have to pay Firebase to use this feature?
Thank you.
IoT devices can publish their data to Pub/Sub instead of writing directly to BigQuery. From Pub/Sub, one consumer (e.g. Dataflow) can read the messages and insert the data into BigQuery, and you can create additional consumers for your own real-time needs, as in the sketch below.
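A minimal sketch of the consuming side with the @google-cloud/pubsub Node.js client (the subscription name is a placeholder; the devices would publish to the matching topic instead of calling the BigQuery API directly):

```typescript
// Sketch: react to each device message in real time while another consumer
// (e.g. Dataflow) handles the insert into BigQuery.
import { PubSub, Message } from '@google-cloud/pubsub';

const pubsub = new PubSub();
const subscription = pubsub.subscription('device-data-sub'); // placeholder name

subscription.on('message', (message: Message) => {
  const reading = JSON.parse(message.data.toString());
  // Push to your websocket layer, dashboard, etc. here.
  console.log('new device reading:', reading);
  message.ack();
});

subscription.on('error', (err) => console.error('subscription error:', err));
```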

How can I connect a community connector (Apps Script) to a GA4 property so that it functions like it did with Advanced Services and UA?

I am currently trying to connect a community connector (Apps Script) to GA4 in order to retrieve data and modify it before sending it to Data Studio. I have done this easily for Universal Analytics properties with Advanced Services; however, since the Advanced Services Analytics option does not work for GA4 properties, I have been looking into retrieving data from Analytics using fetchUrl. I am wondering if this is the best/only way to connect to GA4.
I have received feedback on my current code (basic fetchUrl code) which suggested that I would need to access the Cloud API for authentication. Now I am wondering if I actually have to take these extra steps to connect to a Google source from Apps Script. The reason I am unsure is that Supermetrics has a GA4 connector which does not require any extra steps and, of course, connects to a GA4 property with a simple authentication method. I would like to essentially create that same connection in Apps Script. Any advice would be appreciated.
I would also love to hear if there is any information on when we can expect Advanced Services to work for GA4.
Currently you will need to use the new Google Analytics Data API (GA4); however, at the moment it is an early preview version of the API and is subject to change.
I don't know what the Supermetrics plugin does specifically, but I recommend linking the automatic export of GA4 data to BigQuery and connecting to that.
Google has not announced any date for the release of this Advanced Service. I believe that before doing so it will have to stabilize this whole new system, since it undergoes updates and adjustments every week.
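For what it's worth, a rough sketch of the fetchUrl approach against the Data API could look like this (written as Apps Script in TypeScript via clasp; the property ID, dimensions and metrics are placeholders, the endpoint was v1alpha at the time of the question and is v1beta now, and the analytics.readonly OAuth scope has to be declared in the manifest for ScriptApp.getOAuthToken() to cover it):

```typescript
// Sketch: call the GA4 Data API runReport method from Apps Script using the
// built-in UrlFetchApp and ScriptApp services.
function runGa4Report(propertyId: string): unknown {
  const url = `https://analyticsdata.googleapis.com/v1beta/properties/${propertyId}:runReport`;
  const body = {
    dateRanges: [{ startDate: '7daysAgo', endDate: 'today' }],
    dimensions: [{ name: 'country' }],  // placeholder dimension
    metrics: [{ name: 'activeUsers' }], // placeholder metric
  };
  const response = UrlFetchApp.fetch(url, {
    method: 'post',
    contentType: 'application/json',
    headers: { Authorization: `Bearer ${ScriptApp.getOAuthToken()}` },
    payload: JSON.stringify(body),
  });
  return JSON.parse(response.getContentText());
}
```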

How to connect to Cumulocity services in order to send data for multiple devices

I'm currently building a .NET Core project that receives data from multiple devices and has to send that data over to Cumulocity.
So the flow of the app would be something like:
the .NET Core app receives new data
it connects to the Cumulocity account
it creates a device with a name according to the data received
it sends the data
repeat for multiple devices
I've been doing this for the past week using this code.
It works fine, but then I noticed that calling client.CreateDevice(...) was not creating new devices when presented with different data; it always hit the same one.
This tells me that I'm probably doing this wrong, and that this library is meant to be used only on the devices themselves rather than this way.
So my question is: to handle/send data for multiple devices to the platform, which library should I be using (if any)?
Thank you.
I figured it out: when connecting to Cumulocity, the ClientId value identifies the device, so to create new devices I just map that property to my device's ID.
Problem solved.
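For anyone landing here, a rough illustration of that mapping with MQTT (using mqtt.js here rather than the .NET client from the question; the host, credentials and SmartREST templates are placeholders based on my understanding of Cumulocity's static templates):

```typescript
// Sketch: one MQTT connection per device, where the clientId is the device ID
// and static template 100 creates the device on first use.
import mqtt from 'mqtt';

function connectDevice(deviceId: string, deviceName: string): mqtt.MqttClient {
  const client = mqtt.connect('mqtts://<tenant>.cumulocity.com:8883', {
    clientId: deviceId,     // maps my device ID to the platform device
    username: '<tenant>/<user>',
    password: '<password>',
  });

  client.on('connect', () => {
    // Create the device if it does not exist yet (static SmartREST template 100).
    client.publish('s/us', `100,${deviceName},c8y_MQTTDevice`);
    // Example payload: a temperature measurement (static template 211).
    client.publish('s/us', '211,25');
  });

  return client;
}

connectDevice('device-42', 'Device 42');
```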

Create rules on TimescaleDB

How can I generate alerts based on rules in TimescaleDB? I need to create a rule, and when this rule is broken I want to generate a post notification. For example: I want to create a rule that verifies whether the average temperature of a device D over the last 5 minutes exceeds X; when it does, I want to detect it so that I can react. Is this possible?
Thanks!
TimescaleDB supports PostgreSQL triggers that can be configured to fire on various changes to the database. See here: http://docs.timescale.com/using-timescaledb/schema-management#triggers
and here for the PostgreSQL docs:
https://www.postgresql.org/docs/current/static/sql-createtrigger.html
That should provide a good starting point, but the details of averaging the temperature over a past time window you'll have to work out depending on how you want to proceed; a sketch of one approach follows below.
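Building on that suggestion, a minimal sketch, assuming a conditions hypertable with (time, device, temperature) columns and the node-postgres (pg) client: the trigger recomputes the 5-minute average on every insert and raises a NOTIFY that the application reacts to (threshold, names and channel are placeholders).

```typescript
// Sketch: install a plpgsql trigger that checks the rolling 5-minute average
// per device and LISTEN for the resulting notifications from Node.js.
import { Client } from 'pg';

const ddl = `
CREATE OR REPLACE FUNCTION check_avg_temperature() RETURNS trigger AS $$
DECLARE
  avg_temp double precision;
BEGIN
  SELECT avg(temperature) INTO avg_temp
  FROM conditions
  WHERE device = NEW.device
    AND time > now() - interval '5 minutes';
  IF avg_temp > 30 THEN  -- X = 30 is an arbitrary threshold
    PERFORM pg_notify('temperature_alert',
                      json_build_object('device', NEW.device, 'avg', avg_temp)::text);
  END IF;
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

DROP TRIGGER IF EXISTS temperature_alert_trigger ON conditions;
CREATE TRIGGER temperature_alert_trigger
  AFTER INSERT ON conditions
  FOR EACH ROW EXECUTE FUNCTION check_avg_temperature();  -- PostgreSQL 11+
`;

async function main(): Promise<void> {
  const client = new Client();  // connection settings come from the PG* env vars
  await client.connect();
  await client.query(ddl);
  await client.query('LISTEN temperature_alert');
  client.on('notification', (msg) => {
    // React here: send the post notification, call a webhook, etc.
    console.log('alert:', msg.payload);
  });
}

main().catch(console.error);
```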
According to the official TimescaleDB documentation, the best method is to use Grafana and define alert rules:
Grafana is a great way to visualize and explore time-series data and has a first-class integration with TimescaleDB. Beyond data visualization, Grafana also provides alerting functionality to keep you notified of anomalies.
[...]
Grafana will send a message via the chosen notification channel. Grafana provides integration with webhooks, email and more than a dozen external services including Slack and PagerDuty.
You may also use other alerting tools:
DataDog
Nagios
Zabbix