Create rules on TimescaleDB - sql

How can I generate alerts based on rules in TimescaleDB? I need to create a rule, and when this rule is broken I want to generate a POST notification. For example: I want to create a rule that verifies whether the average temperature of a device D over the last 5 minutes exceeds X, and detect when it does so that I can react. Is this possible?
Thanks!

TimescaleDB supports PostgreSQL triggers that can be configured to fire on various changes to the database. See here: http://docs.timescale.com/using-timescaledb/schema-management#triggers
and here for PostgreSQL docs:
https://www.postgresql.org/docs/current/static/sql-createtrigger.html
That should provide a good starting point, but you'll have to work out the details of averaging the temperature over a past time window depending on how you want to proceed.
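If a trigger feels heavyweight, another common approach is a small watcher script that polls the database and fires the POST notification when the rule is violated. Here's a minimal sketch in Python, assuming a hypertable named conditions with time, device_id and temperature columns, a threshold X of 30, and a hypothetical webhook URL (it uses psycopg2 and requests):

```python
import time
import psycopg2
import requests

# Assumed names: hypertable "conditions" with columns (time, device_id, temperature).
THRESHOLD = 30.0                             # X: alert if the 5-minute average exceeds this
WEBHOOK_URL = "https://example.com/alerts"   # hypothetical notification endpoint
DEVICE_ID = "D"

QUERY = """
    SELECT avg(temperature)
    FROM conditions
    WHERE device_id = %s
      AND time > now() - interval '5 minutes';
"""

def check_once(conn):
    with conn.cursor() as cur:
        cur.execute(QUERY, (DEVICE_ID,))
        avg_temp = cur.fetchone()[0]   # None if no rows fell in the window
    if avg_temp is not None and avg_temp > THRESHOLD:
        # Rule broken: send the POST notification the question asks for.
        requests.post(WEBHOOK_URL, json={
            "device": DEVICE_ID,
            "avg_temperature_5m": float(avg_temp),
            "threshold": THRESHOLD,
        })

if __name__ == "__main__":
    conn = psycopg2.connect("dbname=tsdb user=postgres")
    while True:
        check_once(conn)
        time.sleep(60)  # evaluate the rule once per minute
```

A cron job or a loop with a sleep is usually enough for minute-level rules like this one.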

According to the official TimescaleDB documentation, the best method is to use Grafana and define alert rules:
Grafana is a great way to visualize and explore time-series data and has a first-class integration with TimescaleDB. Beyond data visualization, Grafana also provides alerting functionality to keep you notified of anomalies.
[...]
Grafana will send a message via the chosen notification channel. Grafana provides integration with webhooks, email and more than a dozen external services including Slack and PagerDuty.
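If you pick the webhook channel, the receiving side can be a tiny HTTP handler that reacts to the alert. Here's a minimal sketch using Flask; the endpoint path and reaction logic are placeholders, and the exact JSON payload Grafana sends depends on your Grafana version:

```python
from flask import Flask, request

app = Flask(__name__)

@app.route("/grafana-alert", methods=["POST"])  # hypothetical endpoint path
def grafana_alert():
    payload = request.get_json(force=True)
    # The webhook body includes fields such as the rule name and state;
    # the exact schema depends on the Grafana version in use.
    print("Alert received:", payload.get("ruleName"), payload.get("state"))
    # React here: page someone, restart a device, write to a log, etc.
    return "", 200

if __name__ == "__main__":
    app.run(port=8080)
```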
You may also use other alerting tools:
DataDog
Nagios
Zabbix

Related

How to get route information with traffic flow information included for a past date?

I am trying to use the Azure Maps API. It would be nice to have route information which includes the locations, of course, and a speed profile. As you can understand, the speed profile is not an easy one. A free-flow speed profile is OK, but we want to simulate real-world conditions, meaning that we want to select the date and time of departure to get speed information as close as possible to real-world traffic influence.
Is there any feature of Azure that provides this? If not, which API can provide it?
I don't have any code to show at this moment, since I don't know which API to use.
Historical traffic data is not currently available in Azure Maps but is being investigated as a potential future feature.

How to build Google Analytics 'collect' like api using Google Cloud services

I'm trying to build a data collection web endpoint. The use case is similar to the Google Analytics collect API. I want to add this endpoint (GET method) to all pages on the website and, on page load, collect page info through this API.
Actually, I'm thinking of doing this by using Google Cloud services like Endpoints and BigQuery (for storing the data). I don't want to host it on any dedicated servers; otherwise, I will end up doing a lot of work managing/monitoring the service.
Please suggest how I can achieve this with Google Cloud services, or point me in the right direction if my idea is wrong.
I suggest focusing on deciding where you want your code to run. There are several GCP options that don't require dedicated servers:
Google App Engine
Cloud Functions/Firebase Functions
Cloud Run (new!)
Look here to see which support Cloud Endpoints.
All of these products can support running code that takes the data from the request and sends it to the BigQuery API.
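As an illustration, here is a minimal sketch of an HTTP Cloud Function that takes query parameters from the GET request and streams a row into BigQuery. The project/dataset/table names are hypothetical, and the table must already exist with a matching schema:

```python
from google.cloud import bigquery

client = bigquery.Client()
# Hypothetical destination table; create it beforehand with a matching schema.
TABLE_ID = "my-project.analytics.page_views"

def collect(request):
    """HTTP Cloud Function, called as e.g. /collect?page=/home&uid=abc123."""
    row = {
        "page": request.args.get("page"),
        "user_id": request.args.get("uid"),
        "user_agent": request.headers.get("User-Agent"),
    }
    errors = client.insert_rows_json(TABLE_ID, [row])  # streaming insert
    if errors:
        return ("insert failed", 500)
    # An empty 204 response (or a 1x1 GIF) is typical for analytics beacons.
    return ("", 204)
```

Streaming inserts avoid batch load jobs, which fits the one-row-per-page-view pattern, though they come with their own quotas and pricing.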
There are various ways of achieving what you want. David's answer is absolutely valid, but I would like to introduce Stackdriver Custom Metrics to the discussion.
Custom metrics are similar to regular Stackdriver Monitoring metrics, but you create your own time series (Stackdriver lingo, described here) to keep track of whatever you want, and clients can send in their data through an API.
You could achieve the same thing with a compute solution (Google Cloud Functions, for example) and a database (Google BigTable, for example) and writing your own logic, but Custom Metrics is an already-built solution that includes dashboards and alerting policies while being a more managed solution.
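For reference, writing a point to a custom metric looks roughly like this with the google-cloud-monitoring client; the project ID and metric type below are assumptions:

```python
import time
from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
project_name = "projects/my-project"  # hypothetical project ID

series = monitoring_v3.TimeSeries()
series.metric.type = "custom.googleapis.com/page_views"  # hypothetical metric type
series.resource.type = "global"

# Each call to create_time_series may write one point per series.
now = time.time()
interval = monitoring_v3.TimeInterval(
    {"end_time": {"seconds": int(now), "nanos": int((now % 1) * 1e9)}}
)
point = monitoring_v3.Point({"interval": interval, "value": {"int64_value": 1}})
series.points = [point]

client.create_time_series(name=project_name, time_series=[series])
```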

Cumulocity extend API

We're working with Cumulocity and we'd like to offer services to our customers that are not currently possible to implement with Cumulocity. As an example, we'd like to be able to retrieve a list of devices located within x kilometers of a given point.
Currently there are two limitations that prevent us from doing so:
the impossibility of extending the Cumulocity API with custom route/parameters
the impossibility of implementing custom functions for specific API GET calls
I can think of a workaround to achieve this, like a POST request of an event that would be processed by an Esper rule, generating another event/measurement that could then be accessed by a GET. But I think we can agree this is not a suitable mechanism.
Please note that the use case I described above is just an example. Our needs are not limited to this, and we need a standardized way to expand our services without requiring updates on the Cumulocity side.
There are two topics here, I believe:
Geo-querying: Some geographical querying and aggregation use cases can be handled through CEL. A general geo-querying API is on the Cumulocity roadmap. Note: This use case is not only related to extending the API, as such queries go right down into the database.
Extending the API: That is actually possible. Cumulocity has a microservices API in which you can expose other APIs under the URL /services/.... This is, for example, how connectivity platforms are interfaced. The API is not on the web site because it's not GA yet, but you can certainly discuss it with your Cumulocity contact or open a ticket. This btw includes also adding permissions for the new microservices, so that you can do proper A&A.
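To make the geo-query example concrete, a microservice exposed under /services/... could implement the distance filter itself on top of the inventory REST API. Here's a rough Python/Flask sketch; the tenant URL, credentials and route are placeholders, and note that it filters in application code rather than in the database, which is exactly the limitation the roadmap item addresses:

```python
import math
import requests
from flask import Flask, request, jsonify

app = Flask(__name__)
C8Y_BASE = "https://tenant.cumulocity.com"  # hypothetical tenant URL
AUTH = ("tenant/user", "password")          # replace with real credentials

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

@app.route("/nearby")  # would be reachable under /services/<microservice>/nearby
def nearby():
    lat = float(request.args["lat"])
    lon = float(request.args["lon"])
    radius = float(request.args.get("km", 10))
    # Fetch devices carrying a position fragment from the inventory API
    # (real code would page through results beyond one page).
    resp = requests.get(
        f"{C8Y_BASE}/inventory/managedObjects",
        params={"fragmentType": "c8y_Position", "pageSize": 2000},
        auth=AUTH,
    )
    devices = resp.json().get("managedObjects", [])
    hits = [
        d for d in devices
        if haversine_km(lat, lon,
                        d["c8y_Position"]["lat"], d["c8y_Position"]["lng"]) <= radius
    ]
    return jsonify(hits)
```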

Splunk Graphite Integration

I want to know if Graphite can pull log data from Splunk to draw graphs. I know Graphite can read data from Nagios, but I want to know if it can pull from Splunk also.
You can also pull data via one of the Splunk SDKs - http://dev.splunk.com/view/sdks/SP-CAAADP7
There is an example on the developer site that shows pulling data from Splunk and pushing it to Leftronic - http://dev.splunk.com/view/SP-CAAADSR
There are also a number of visual examples in the JavaScript SDK showing how to pull data from Splunk and visualize it with other libraries - http://dev.splunk.com/view/javascript-sdk/SP-CAAAECM
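With the Python SDK, for example, a blocking one-shot search is only a few lines; the host, credentials and search string below are placeholders:

```python
import splunklib.client as client
import splunklib.results as results

# Connect to splunkd's management port (8089 by default).
service = client.connect(
    host="localhost", port=8089,
    username="admin", password="changeme",  # placeholder credentials
)

# Run a one-shot search that returns once the search completes.
rr = results.ResultsReader(
    service.jobs.oneshot("search index=main error | head 10")
)
for item in rr:
    if isinstance(item, dict):
        print(item)  # each result is a dict of field -> value
```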
Here's an app I wrote for Splunk that does exactly this: https://github.com/OnBeep/splunk_graphite
If the goal is to chart the data in Splunk, you can use the chart or timechart command in Splunk.
If the goal is to chart the Splunk data in carbon/graphite, then depending on the data that you wish to pull out of Splunk you should be able to:
Create a saved search in Splunk
Use the CLI or REST API to execute & gather the results of the saved search
Parse the results, then push them into carbon (see the sketch below).
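A rough sketch of those three steps with the Python SDK and carbon's plaintext port; the saved-search name, metric naming scheme and Graphite host are all assumptions:

```python
import socket
import time
import splunklib.client as client
import splunklib.results as results

service = client.connect(host="localhost", port=8089,
                         username="admin", password="changeme")

# Dispatch the saved search and wait for it to finish.
saved = service.saved_searches["errors_per_host"]  # hypothetical saved search
job = saved.dispatch()
while not job.is_done():
    time.sleep(1)

# Parse the results and push one metric per host to carbon's plaintext port.
sock = socket.create_connection(("graphite.example.com", 2003))
now = int(time.time())
for row in results.ResultsReader(job.results()):
    if isinstance(row, dict):
        line = f"splunk.errors.{row['host']} {row['count']} {now}\n"
        sock.sendall(line.encode())
sock.close()
```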
This is how it works:
Carbon listens to receive data.
Carbon receives data and stores it in Whisper.
Graphite reads from Whisper and the carbon cache and shows graphs.
There's no pull at all. Submitting data to carbon is damn easy. It has two ports: one for a simple TCP connection where you submit one metric per line (metric.name metric.value metric.timestamp), plus a pickle port.
Usually you will use Logstash or Logster to parse application logs with regular expressions, and either of those will take care of submitting the resulting metrics to carbon.
Also, if you have software that is able to submit real-time metrics over UDP, you can use statsd, which will listen on UDP and, on a configured interval, sum or average the values and submit them to carbon with a lot of nice settings (like getting the 95th percentile, etc.).
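For example, with the statsd Python client (the server address and metric names are placeholders):

```python
import statsd

# statsd listens on UDP 8125 by default and flushes aggregates to carbon.
c = statsd.StatsClient("localhost", 8125)

c.incr("web.requests")            # counter: summed per flush interval
c.timing("web.response_ms", 124)  # timer: yields mean, percentiles, etc.
c.gauge("web.queue_depth", 7)     # gauge: last value wins
```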
In summary, I bet that whatever logs Splunk leaves you, you will be able to submit the data to Graphite.

Website Session Analysis

I'd like to know the best way to deep-dive into the flow of my users. For example, I have 4 pages in my flow; how can I analyze which users abandon at which step? I can definitely do it by hand with logging, etc., but I'd rather use an off-the-shelf solution.
I have apache request logs, as well as google analytics. Can these analyze users as sessions?
You can do this with Google Analytics.
The flow of the users is called a funnel, which you have to set up in Google Analytics; the metric you get out is called Target Conversion Rate.
You can learn more about it at http://www.google.com/support/conversionuniversity/