We are currently using ADO (pipeline artifacts) to Splunk for build step reviews. I am working on a project to migrate Splunk dashboards to Power BI. I need a query to list the dashboards and reports from Splunk based on the criteria below, so we can identify which ones need to be created in Power BI:
List of dashboards / reports (possibly with author details)
Frequency of usage, i.e. how many times the dashboard / report was viewed in the last 30 days
I tried a few queries in my Splunk instance but they did not give the results I need. Thanks for any inputs / suggestions.
It would help to know what you've tried so far.
To get a list of dashboards, try this query:
| rest /servicesNS/-/-/data/ui/views
To see which dashboards have been viewed, search the Splunk UI access log.
index=_internal sourcetype=splunkd_ui_access
| rex "\\/data\\/ui\\/views\\/(?<dashboard>[^\?]+)"
| stats count by dashboard
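The original ask was for views in the last 30 days, so here is a sketch that scopes the same search to that window and also counts distinct viewers. It assumes the same sourcetype and URL pattern as above, and that the user field is extracted for that sourcetype in your environment:
index=_internal sourcetype=splunkd_ui_access earliest=-30d@d
| rex "\/data\/ui\/views\/(?<dashboard>[^\s\?\"]+)"
| stats count as views dc(user) as viewers by dashboard
| sort - views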
The queries below worked for me.
To list the dashboards:
| rest /servicesNS/-/-/data/ui/views | table title,app,owner,eai:data,description,updated,version
To list the reports:
| rest /servicesNS/-/-/saved/searches | table title,app,owner
Please note this will also list the default dashboards that ship with Splunk, which you may want to filter out.
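One possible way to filter those out is to exclude dashboards belonging to Splunk's built-in apps. This is only a sketch, and the app names listed here are examples that may differ in your environment:
| rest /servicesNS/-/-/data/ui/views
| search isDashboard=1 eai:acl.app!="splunk_monitoring_console" eai:acl.app!="splunk_instrumentation" eai:acl.app!="launcher"
| table title, eai:acl.app, eai:acl.owner, updated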
I am still working on the query for usage frequency. Thanks.
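For report usage, one starting point (just a sketch, and it only covers scheduled runs, not ad-hoc runs or report views from the UI) is the scheduler log in the _internal index:
index=_internal sourcetype=scheduler status=success earliest=-30d@d
| stats count as runs latest(_time) as last_run by savedsearch_name, app, user
| convert ctime(last_run)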
Related
I am getting data into Splunk from Snowflake using Splunk DB Connect. This is just simple orders data. In Splunk Search & Reporting I am running the following query on my table to get a visualization.
source="big_data_table_inner_join" "UNITS_SOLD" | top COUNTRY
What I am seeing is that each time I run the query, the number of events in Splunk increases quite heavily. For example, after running it the first time there were 342000 events, and when I ran the same query again there were 67445 events. Any idea why this is happening?
I'm facing a problem in Splunk: if I choose the current session (2020) from the filter, then I should get the data of the previous session (2019).
I wrote a Splunk query like:
index="entab_due" Session=2019 ClassName="* *"
| eval n=(tonumber(Session)-1)
| where totalBalance > 0 and Session = n
but I didn't get any results.
Problem: get the data of the previous session after selecting a Session from the filter.
Please help me find a solution.
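As written, the query filters to Session=2019 and then requires Session to equal Session-1 on the same events, which can never be true. One sketch of an alternative, assuming the selected session comes from a dashboard input token (the token name $session_tok$ is made up for this example) and that Session is numeric:
index="entab_due" ClassName="* *"
| where totalBalance > 0 AND tonumber(Session) = $session_tok$ - 1
With $session_tok$ set to 2020, the where clause keeps Session 2019 records that have a positive balance.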
If two different panels in your dashboard need different data then they probably should use different searches. Or use a base search that gathers the data needed for both and use post-processing to filter the data needed by each panel.
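As a rough illustration of the base search + post-process pattern in Simple XML (the index, field names, and queries below are made-up placeholders, not anything from your environment):
<dashboard>
  <label>Base search example</label>
  <!-- Global base search shared by the panels below -->
  <search id="base">
    <query>index=my_orders sourcetype=orders | fields COUNTRY UNITS_SOLD</query>
    <earliest>-24h@h</earliest>
    <latest>now</latest>
  </search>
  <row>
    <panel>
      <table>
        <!-- Post-process: top countries by event count -->
        <search base="base">
          <query>| top COUNTRY</query>
        </search>
      </table>
    </panel>
    <panel>
      <table>
        <!-- Post-process: total units sold per country -->
        <search base="base">
          <query>| stats sum(UNITS_SOLD) as units by COUNTRY</query>
        </search>
      </table>
    </panel>
  </row>
</dashboard>
Both panels share the results of the one base search, and each applies its own filtering or aggregation on top.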
From Splunk, I am trying to get the user, saved search name, and the last time a query ran.
A single Splunk query would be nice.
I am very new to Splunk and I have tried these queries:
index=_audit action=search info=granted search=*
| search IsNotNull(savedsearch_name) user!="splunk-system-user"
| table user savedserach_name user search _time
The above query is always empty for savedsearch_name.
Splunk's audit log leaves a bit to be desired. For better results, search the internal index.
index=_internal savedsearch_name=* NOT user="splunk-system-user"
| table user savedsearch_name _time
You won't see the search query, however. For that, use REST.
| rest /services/saved/searches | fields title search
Combine them with something like this (there may be other ways):
index=_internal savedsearch_name=* NOT user="splunk-system-user"
| fields user savedsearch_name _time
| join savedsearch_name [| rest /services/saved/searches
| fields title search | rename title as savedsearch_name]
| table user savedsearch_name search _time
Note that you have a typo in your query. "savedserach_name" should be "savedsearch_name".
But I also recommend a free app that has a dedicated search tool for this purpose.
https://splunkbase.splunk.com/app/6449/
Specifically the "user activity" view within that app.
Why it's a complex problem: part of the puzzle is in the audit log's info="granted" event, another part is in the audit log's info="completed" event, and even more of it is over in the introspection index. You need those three stitched together, and the audit log is plagued with parsing problems; autokv compounds the problem by extracting fields from the SPL itself.
That User Activity view will do all of this for you, sidestep the thorny autokv problems in the audit data, and not just give you all of this per search, but also present stats and rollups by user, app, dashboard, and even by the sourcetypes that were actually searched.
It also has a macro called "calculate pain" that scores a "pain" number for each search and then sums up all the "pain" in the by-user, by-app, by-sourcetype rollups, etc., so that admins can try to pick off the worst offenders first.
It's up on Splunkbase here and approved for both Cloud and on-prem: https://splunkbase.splunk.com/app/6449/
(There's also a #sideview_ui channel for it in the community Slack.)
I need to locate data that has become stale in our Splunk instance, so that I can remove it.
I need a way to find all the dashboards and sort them by usage. From the audit logs I've been able to find all the actively used ones, but since my goal is to remove data, I most need the dashboards that are not in use.
Any ideas?
You can get a list of all dashboards using | rest /services/data/ui/views | search isDashboard=1. Try combining that with your search for active dashboards to get those that are not active.
| rest /services/data/ui/views | search isDashboard=1 NOT [<your audit search> | fields id | format]
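Fleshed out, that might look something like the sketch below. It reuses the UI access log pattern shown earlier on this page, and both the sourcetype and the 90-day window are assumptions to adjust for your environment:
| rest /services/data/ui/views
| search isDashboard=1
| rename title as dashboard
| search NOT
    [ search index=_internal sourcetype=splunkd_ui_access "/data/ui/views/" earliest=-90d
      | rex "\/data\/ui\/views\/(?<dashboard>[^\s\?\"]+)"
      | stats count by dashboard
      | fields dashboard ]
| table dashboard, eai:acl.app, eai:acl.owner, updated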
So I was just wondering if it is possible to create a Simple XML or HTML dashboard that searches for all other recently modified or updated dashboards in Splunk?
And if so, when I look up these updated dashboards, I would like to know the indexes and datasets that these dashboards use.
Requested table format:
Dashboard Name, Index, Timestamp (shows when the dashboard was last updated)
Hopefully that makes sense. Please let me know if it's possible, or similar ways I can find this! Thanks
You could search the "_audit" index:
index=_audit | table _time user action info
or the "_internal" index:
index=_internal
The "_internal" index also has some sources on which to do username analytics, e.g. searches.log.