Splunk Graphite Integration

I want to know if Graphite can pull log data from Splunk to draw graphs. I know Graphite can read data from Nagios, but I want to know if it can pull from Splunk as well.

You can also pull data via one of the Splunk SDKs - http://dev.splunk.com/view/sdks/SP-CAAADP7
There is an example on the developer site that shows pulling data from Splunk and pushing it to Leftronic - http://dev.splunk.com/view/SP-CAAADSR
There are also a number of visual examples in the JavaScript SDK showing how to pull data from Splunk and visualize it with other libraries - http://dev.splunk.com/view/javascript-sdk/SP-CAAAECM
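For instance, here's a minimal sketch using the Python SDK (splunklib, assuming a reasonably recent splunk-sdk-python) that runs a one-shot search and reads the results; the host, credentials, and SPL query are placeholders:

```python
import splunklib.client as client
import splunklib.results as results

# Connect to splunkd's management port (8089 by default).
service = client.connect(
    host="localhost", port=8089,
    username="admin", password="changeme")  # placeholder credentials

# Run a blocking one-shot search; the SPL query is just an example.
stream = service.jobs.oneshot(
    "search index=_internal | timechart span=1m count",
    output_mode="json")

# Iterate over the result rows.
for row in results.JSONResultsReader(stream):
    if isinstance(row, dict):
        print(row)
```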

Here's an app I wrote for Splunk that does exactly this: https://github.com/OnBeep/splunk_graphite

If the goal is to chart the data in Splunk, you can use the chart or timechart command in Splunk.
If the goal is to chart the Splunk data in Carbon/Graphite, then depending on the data you wish to pull out of Splunk you should be able to:
Create a saved search in Splunk.
Use the CLI or REST API to execute and gather the results of the saved search.
Parse the results, then push them into Carbon (roughly as in the sketch below).
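Putting those steps together, here's a rough sketch using the REST API with Python's requests; the saved search name, hosts, credentials, and field names are all assumptions (the saved search is assumed to emit one metric per row):

```python
import socket
import time

import requests

SPLUNK = "https://localhost:8089"      # splunkd management port
AUTH = ("admin", "changeme")           # placeholder credentials
SAVED_SEARCH = "graphite_metrics"      # hypothetical saved search name

# 1. Dispatch the saved search and grab the search job's sid.
resp = requests.post(
    f"{SPLUNK}/services/saved/searches/{SAVED_SEARCH}/dispatch",
    auth=AUTH, verify=False, data={"output_mode": "json"})
sid = resp.json()["sid"]

# 2. Poll until the job is done, then fetch its results as JSON.
while True:
    job = requests.get(f"{SPLUNK}/services/search/jobs/{sid}",
                       auth=AUTH, verify=False,
                       params={"output_mode": "json"}).json()
    if job["entry"][0]["content"]["isDone"]:
        break
    time.sleep(2)

rows = requests.get(f"{SPLUNK}/services/search/jobs/{sid}/results",
                    auth=AUTH, verify=False,
                    params={"output_mode": "json"}).json()["results"]

# 3. Push each row to Carbon's plaintext port (2003 by default).
now = int(time.time())
with socket.create_connection(("graphite.example.com", 2003)) as sock:
    for row in rows:
        # Assumes the saved search emits fields named "metric" and "value".
        sock.sendall(f"{row['metric']} {row['value']} {now}\n".encode())
```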

This is how it works:
Carbon listens to receive data.
Carbon receives data and stores it in Whisper.
Graphite reads from Whisper and the carbon cache and shows graphs.
There's no pull at all. Submitting data to Carbon is damn easy. It has two ports: one for a simple TCP connection where you submit one metric per line (metric.name metric.value metric.timestamp), and a pickle port too.
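For example, a minimal sketch of the pickle port (2004 by default); the Carbon host and metric name are made up:

```python
import pickle
import socket
import struct
import time

# The pickle port takes a length-prefixed pickled list of
# (metric, (timestamp, value)) tuples.
tuples = [("servers.web01.requests", (int(time.time()), 42))]  # made-up metric
payload = pickle.dumps(tuples, protocol=2)
header = struct.pack("!L", len(payload))
with socket.create_connection(("carbon.example.com", 2004)) as sock:
    sock.sendall(header + payload)
```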
Usually you will use Logstash or logster to parse application logs with regular expressions, and either of those will take care of submitting the resulting metrics to Carbon.
Also, if you have software able to submit real-time metrics over UDP, you can use StatsD, which listens on UDP and, on a configured interval, sums or averages the values and submits them to Carbon, with a lot of nice settings (like getting the 95th percentile, etc.).
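A StatsD datagram is just a short text line over UDP; here's a sketch with made-up metric names and host:

```python
import socket

# StatsD datagrams are plain text: "<metric>:<value>|<type>".
# "c" = counter, "ms" = timing, "g" = gauge; the names here are made up.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(b"web.requests:1|c", ("statsd.example.com", 8125))
sock.sendto(b"web.response_time:320|ms", ("statsd.example.com", 8125))
```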
In summary, I bet that whatever logs Splunk leaves you, you will for sure be able to submit the data to Graphite.

Related

Send Google Home data to Splunk

I'm trying to send data generated by a Google Home mesh network to a Splunk instance. I'd especially like to capture which devices are connected to which points throughout the day. This information is available in the app, but there doesn't seem to be a way to stream it to a centralized logging platform. Is there a way I'm not seeing?
There are two methods of ingesting Google Cloud data supported by Splunk:
--Push-based method: data is sent to a Splunk HTTP Event Collector (HEC) through a Pub/Sub to Splunk Dataflow job (a minimal HEC sketch follows below).
--Pull-based method: data is fetched from the Google Cloud APIs through the Splunk Add-on for Google Cloud Platform.
It is generally recommended that you use the push-based method to ingest Google Cloud data into Splunk. Only in certain cases is the pull-based method recommended instead. These circumstances are as follows:
--Your Splunk deployment does not offer a Splunk HEC endpoint.
--Your log volume is low.
--You want to pull Cloud Monitoring metrics, Cloud Storage objects, or low-volume logs.
--You are already managing one or more Splunk heavy forwarders or are using a hosted Inputs Data Manager for Splunk Cloud.
More information on setting this up and working through this problem can be found at the following link:
https://cloud.google.com/architecture/exporting-stackdriver-logging-for-splunk
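To illustrate the push-based side, here's a minimal sketch of posting one event to a Splunk HEC endpoint; the host, token, sourcetype, and event fields are placeholders:

```python
import requests

HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"  # placeholder token

event = {
    "event": {"message": "device connected", "device": "living-room-point"},
    "sourcetype": "google:home",   # made-up sourcetype
    "index": "main",
}
resp = requests.post(
    HEC_URL,
    headers={"Authorization": f"Splunk {HEC_TOKEN}"},
    json=event,
    verify=False)  # only for self-signed certs in a lab setup
resp.raise_for_status()
```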

How to monitor data traffic in a WAN using Splunk?

My task is actually for self-learning. I know the basics of Splunk and how it works; overall, it needs logs first, and further analysis is a later part.
Right now, I am working in a small company without a dedicated firewall device; there is just a router. I am planning to keep real-time track of internet speed plus data usage by each host and similar things. Overall, I want to monitor all the network usage in this small WAN environment.
My question is: since Splunk needs logs, how can I get a log of all the network flow when there is just a router and no external device to create logs that can be fed to Splunk later?
I tried Google, but there I found only preset software that automatically captures logs and shows analysis, while I just need the logs.
What you need is a way to get logs off your router to somewhere Splunk can parse them.
Most typically, this is done by sending the router's syslog messages to a syslog collector like SC4S.
From there, Splunk can get the messages processed and ingested for analysis.
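To make the data path concrete, here's a toy sketch of the collector step: a bare UDP listener that appends incoming syslog datagrams to a file that a Splunk monitor input could watch. This is only an illustration; in practice you'd use SC4S or Splunk's native UDP input rather than this script, and the log path is made up:

```python
import socketserver

LOG_PATH = "/var/log/router/syslog.log"  # point a Splunk monitor input here

class SyslogHandler(socketserver.BaseRequestHandler):
    """Toy syslog sink: append each UDP datagram as one line."""
    def handle(self):
        data = self.request[0].strip()  # for UDP, request is (data, socket)
        with open(LOG_PATH, "ab") as f:
            f.write(data + b"\n")

if __name__ == "__main__":
    # Routers typically send syslog to UDP 514 (binding <1024 needs root).
    with socketserver.UDPServer(("0.0.0.0", 514), SyslogHandler) as server:
        server.serve_forever()
```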

Need a REST API which should provide server monitoring information?

I need a server monitoring REST API that provides the points below. Can anyone suggest which one is best? I have found some tools like Nagios, Zabbix and Grafana, but I'm not sure whether they provide a REST API.
1) Server response time monitoring
2) Ping monitoring
3) Port monitoring
4) Graph event presentation & logs APIs
5) CPU, hard disk, memory, Apache monitoring, etc.
Purpose of the required API:
This API will be integrated into the A application to gather information from the C application, so we can present a consolidated custom graph in the A application based on the JSON results.
Any suggestions would be great.
Both Nagios and Zabbix do actual data collection, while Grafana only visualizes the data, so you'll be looking at the former two for this API. Both have a JSON API:
https://www.nagios.org/ncpa/help/2.2/api.html
https://www.zabbix.com/documentation/current/manual/api
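For example, here's a minimal sketch of calling the Zabbix JSON-RPC API from Python; the host and credentials are placeholders, and the login field name depends on your Zabbix version:

```python
import requests

ZABBIX_API = "https://zabbix.example.com/api_jsonrpc.php"  # placeholder host

def call(method, params, auth=None):
    """Minimal JSON-RPC 2.0 wrapper for the Zabbix API."""
    payload = {"jsonrpc": "2.0", "method": method,
               "params": params, "id": 1, "auth": auth}
    resp = requests.post(ZABBIX_API, json=payload)
    resp.raise_for_status()
    return resp.json()["result"]

# Log in to get an auth token (the field is "username" on recent Zabbix
# releases; older ones use "user" instead).
token = call("user.login", {"username": "Admin", "password": "zabbix"})

# List monitored hosts with their IDs and names.
hosts = call("host.get", {"output": ["hostid", "name"]}, auth=token)
print(hosts)
```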

Is there any free map and GPS logging service for IoT devices? (like Azure Maps in Azure IoT Central)

I want to visualize latitude and longitude data taken from IoT devices, but I couldn't find a service that can plot GPS log data.
For example, AT&T M2X can log and visualize plain sensor data (like humidity, temperature, and so on), but it can't visualize that data on a map.
At last I found Azure Maps, but it requires registering a credit card.
I'm fine paying for usage later if needed, but I want to start map visualization with no payment option set up first.
I'm looking for a service that meets these three points: 1. no need to set up a payment option first, 2. it can receive data posted over HTTP (GET/POST), 3. any kind of map is fine (Google Maps, Bing Maps, OpenStreetMap, and so on).
I'm sorry that my English is so bad.
I look forward to your reply.
Thanks.
You might want to take a look at Data Studio (Google Cloud Platform). To get you started (no money), you should be able to create a spreadsheet with comma-separated lat/long values, import it into Data Studio, and get a visualization of those positions on a map.
Check it out here.
I know you said being able to post via HTTP is a requirement (which makes it harder to use this), but I wanted to point it out as it's cool and I just learned about it too.
There's also the Maps API, which can do it (I'm actually unsure of how "free" it is, but most of our APIs have free tiers for, e.g., a number of requests). This is doable via HTTP.
Azure Maps is a part of Azure, and the credit card is for Azure as a whole, not just Azure Maps. That said, Azure Maps also provides free monthly usage limits. Also, in Azure you can set a spending limit on your account: after adding your credit card, simply set your limit to 0 and you don't need to worry about paying. If you exceed the free limits, your account will simply stop working until the next month, when you get a new set of free limits. Here is some documentation on how to set a spending limit in Azure: https://learn.microsoft.com/en-us/azure/billing/billing-spending-limit

Create rules on TimescaleDB

How can I generate alerts from rules in TimescaleDB? I need to create a rule, and when this rule is broken I want to generate a POST notification. For example: I want to create a rule that verifies whether the average temperature of a device D over the last 5 minutes exceeds X, and I want to detect that in order to be able to react. Is this possible?
Thanks!
TimescaleDB supports PostgreSQL triggers that can be configured to fire on various changes to the database. See here: http://docs.timescale.com/using-timescaledb/schema-management#triggers
and here for PostgreSQL docs:
https://www.postgresql.org/docs/current/static/sql-createtrigger.html
That should provide a good starting point, but the details of averaging the temperature over a past time window you'll have to work out depending on how you want to proceed; one polling-based alternative is sketched below.
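If a periodic check is acceptable instead of a trigger, a small polling script (run from cron, for example) can evaluate the rule and fire the POST notification. A minimal sketch, assuming a hypothetical hypertable `conditions` with columns `time`, `device_id`, and `temperature`, and a placeholder webhook URL:

```python
import psycopg2
import requests

THRESHOLD = 30.0                        # your X, in the data's units
WEBHOOK = "https://example.com/notify"  # placeholder notification endpoint

conn = psycopg2.connect("dbname=tsdb user=postgres")  # adjust to your setup
with conn, conn.cursor() as cur:
    # Average temperature for device "D" over the last 5 minutes.
    cur.execute("""
        SELECT avg(temperature)
        FROM conditions
        WHERE device_id = %s
          AND time > now() - interval '5 minutes'
    """, ("D",))
    avg_temp = cur.fetchone()[0]

if avg_temp is not None and avg_temp > THRESHOLD:
    # Rule broken: fire the POST notification.
    requests.post(WEBHOOK, json={"device": "D", "avg_temp": float(avg_temp)})
```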
According to the official TimescaleDB documentation, the best method is to use Grafana and define alert rules:
Grafana is a great way to visualize and explore time-series data and has a first-class integration with TimescaleDB. Beyond data visualization, Grafana also provides alerting functionality to keep you notified of anomalies.
[...]
Grafana will send a message via the chosen notification channel. Grafana provides integration with webhooks, email and more than a dozen external services including Slack and PagerDuty.
You may also use other alerting tools:
DataDog
Nagios
Zabbix