In Splunk, I need to get the count of events whose msg field value (sample below) matches factType=COMMERCIAL and has filters.
A basic Splunk query with wildcards does not work efficiently. Could you please assist?
app_name="ABC" cf_space_name=prod msg="*/facts?factType=COMMERCIAL&sourceSystem=ADMIN&sourceOwner=ABC&filters*"
msg: abc.asia - [2021-08-23T00:27:08.152+0000] "GET
/facts?factType=COMMERCIAL&sourceSystem=ADMIN&sourceOwner=ABC&filters=%257B%2522stringMatchFilters%2522:%255B%257B%2522key%2522:%2522BFEESCE((json_data-%253E%253E'isNotSearchable')::boolean,%2520false)%2522,%2522value%2522:%2522false%2522,%2522operator%2522:%2522EQ%2522%257D%255D,%2522multiStringMatchFilters%2522:%255B%257B%2522key%2522:%2522json_data-%253E%253E'id'%2522,%2522values%2522:%255B%25224970111%2522%255D%257D%255D,%2522containmentFilters%2522:%255B%255D,%2522nestedMultiStringMatchFilter%2522:%255B%255D,%2522nestedStringMatchFilters%2522:%255B%255D%257D&sorts=%257B%2522sortOrders%2522:%255B%257B%2522key%2522:%2522id%2522,%2522order%2522:%2522DESC%2522%257D%255D%257D&pagination=null
Try this:
index=ndx sourcetype=srctp msg=*
| rex field=msg "factType=(?<facttype>\w+).(?<params>.+)"
| stats count by facttype params
| fields - count
| search facttype="commercial"
The rex will extract the facttype and any following parameters (note: if the URL is submitted with the arguments in a different order, you'll need to adjust the regular expression).
Then use a | stats count by to bin them together.
Lastly, search only where facttype="commercial" and the URL has additional parameters.
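Adapted to the search from the question, a minimal sketch might look like this (the field and value names are taken from the question, and the rex assumes factType appears before the remaining query arguments, so adjust it if the order differs):
app_name="ABC" cf_space_name=prod msg="*/facts?factType=COMMERCIAL*"
| rex field=msg "factType=(?<facttype>\w+)&(?<params>\S+)"
| where facttype="COMMERCIAL" AND like(params, "%filters=%")
| stats count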
I have multiple log messages, each containing a list of JobIds, e.g.:
1. `{"JobIds":["661ce07c-b5f3-4b37-8b4c-a0b76d890039","db7a18ae-ea59-4987-87d5-c80adefa4475"]}`
2. `{"JobIds":["661ce07c-b5f3-4b37-8b4c-a0b76d890040","db7a18ae-ea59-4987-87d5-c80adefa4489"]}`
3. `{"JobIds":["661ce07c-b5f3-4b37-8b4c-a0b76d890070"]}`
I have a rex to get those jobIds. Next, I want to count the number of jobIds.
My query looks like this:
| rex field=message "\"(?<job_ids>(?:\w+-\w+-\w+-\w+-\w+)+),?\""
| stats count(job_ids)
But this only gives me a count of 3 when I am looking for 5. How can I get a count of all jobIds? I am not sure if this is a Splunk limitation or if I am missing something in my regex.
Here is my regex - https://regex101.com/r/vqlq5j/1
Also with max_match=0, but with mvcount() instead of mvexpand():
| makeresults count=3 | streamstats count
| eval message=case(count=1, "{\"JobIds\":[\"a1a2a2-b23-b34-d4d4d4\", \"x1a2a2-y23-y34-z4z4z4\"]}", count=2, "{\"JobIds\":[\"a1a9a9-b93-b04-d4d4d4\", \"x1a9a9-y93-y34-z4z4z4\"]}", count=3, "{\"JobIds\":[\"a1a9a9-b93-b04-d14d14d14\"]}")
``` above is test data setup ```
``` below is the actual query ```
| rex field=message max_match=0 "\"(?<id>[\w\d]+\-[\w\d]+\-[\w\d]+\-[\w\d]+)\""
| eval cnt=mvcount(id)
| stats sum(cnt)
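With max_match=0, mvcount(id) comes out to 2, 2 and 1 for the three test events above, so the final stats sum(cnt) should return 5, which matches the total the question expects.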
In Splunk, to capture multiple matches from a single event, you need to add max_match=0 to your rex, per the Splunk docs.
But to then get single values out of the [potential] multivalue field job_ids that you made, you need mvexpand or similar.
So this should get you closer:
| rex field=message max_match=0 "\"(?<job_id>(?:\w+-\w+-\w+-\w+-\w+)+),?\""
| mvexpand job_id
| stats dc(job_id)
I also changed from count to dc, as it seems you're looking for a unique count of job IDs, and not just a count of how many in total you've seen
Note: if this is JSON data (and not JSON-inside-JSON) coming into Splunk, and the sourcetype is configured correctly, you shouldn't have to manually extract the multivalue field, as Splunk will do it automatically
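For example, if the events are plain JSON like the samples in the question, a minimal sketch with spath avoids the rex entirely (makeresults is only used here to simulate one of the sample events):
| makeresults
| eval _raw="{\"JobIds\":[\"661ce07c-b5f3-4b37-8b4c-a0b76d890039\",\"db7a18ae-ea59-4987-87d5-c80adefa4475\"]}"
| spath path=JobIds{} output=job_id
| stats dc(job_id)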
Do you have a full set of sample data (a few entire events) you can share?
I need a Splunk query to fetch the counts of each field used in my dashboard.
The Splunk sample data for each search looks like this:
timestamp="2022-11-07 02:06:38.427"
loglevel="INFO" pid="1"
thread="http-nio-8080-exec-10"
appname="my-test-app"
URI="/testapp/v1/mytest-app/dashboard-service"
RequestPayload="{\"name\":\"test\",\"number\":\"\"}"
What would a search look like to print a table with the number of times name and number are used to search data? (At any one time the user can supply either name or number, not both.)
Expected output: a table with counts for Name and Number.
#Hanuman
Can you please try this? You can adjust the regular expression to match your events and JSON data.
YOUR_SEARCH | rex field=_raw "RequestPayload=\"(?<data>.*[}])\""
| spath input=data
| table name number
My Sample Search:
| makeresults | eval _raw="*timestamp=\"2022-11-07 02:06:38.427\" loglevel=\"INFO\" pid=\"1\" thread=\"http-nio-8080-exec-10\" appname=\"my-test-app\" URI=\"/testapp/v1/mytest-app/dashboard-service\" RequestPayload=\"{\"name\":\"test\",\"number\":\"1\"}\"*"
| rex field=_raw "RequestPayload=\"(?<data>.*[}])\""
| spath input=data
| table name number
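If you need the usage counts rather than the raw values, a follow-on stats could look like this (a sketch, assuming an empty string in the payload means the field was not supplied for that request):
YOUR_SEARCH | rex field=_raw "RequestPayload=\"(?<data>.*[}])\""
| spath input=data
| stats count(eval(name!="")) as Name, count(eval(number!="")) as Number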
Thanks
I am creating a dashboard for our service. And I want to create metrics for url requests.
Let's say I have a URL similar to this one:
/api/v1/users/{userId}/settings
And I have following query in Splunk
url=*/api/v1/users/*/settings
| stats avg(timeTaken) as avg_latency, p99(timeTaken) as "p99(ms)", perc75(timeTaken) as "p75(ms)", count as total_requests, count(eval(responseStatus=500)) as failed_requests by url
| eval "success_rate"=round((total_requests - failed_requests)/total_requests*100,2)
| eval avg = round(avg)
| sort success_rate
All I want is to have a table with one common url showing all the metrics. But instead, I get a table with a list of all urls with different parameters.
You want to create a field which is the URL minus the userId part, so that the stats are grouped by which URL is called.
You can do this by using split(url,"/") to make a multivalue field from the URL, and take out the userId in one of two ways, depending on the URLs.
mvfilter, e.g. mvfilter(x!=userId)
Or create a new multivalue field with the userId removed by its index in the multivalue field, using this: Add/Edit/Delete mvfield
Instead of removing the userId, you could also replace it with "{userId}", so long as you do the same for all URLs (a sketch of this variant is below).
You can then rejoin the URL using mvjoin(url,"/").
I hope I understood your question correctly and this helps you!
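A minimal sketch of the replace-with-{userId} variant, assuming the userId is the only purely numeric path segment (mvmap requires Splunk 8.0 or later):
| eval parts=split(url,"/")
| eval parts=mvmap(parts, if(match(parts,"^\d+$"), "{userId}", parts))
| eval url=mvjoin(parts,"/")
``` then run the original stats ... by url ```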
You could try doing a replace() on your URL field with eval before calling stats:
| eval url=replace(url,"\/\d+\/settings","/settings")
If it turns out the userid is important to hold onto, pull it into its own field prior to running replace():
| rex field=url "\/(?<userid>\d+)\/settings"
Expansion, in response to a comment:
For multiple possible endings of your URL, try something like this:
index=ndx sourcetype=srctp URL IN("*/api/v1/users/*/settings","*/api/v1/users/*/logout","*/api/v1/users/*/profile")
| rex field=url "\/(?<url_type>\w+)$"
| eval url=replace(url,"\/\d+\/\w+$","")
| stats avg(timeTaken) as avg_latency, p99(timeTaken) as "p99(ms)", perc75(timeTaken) as "p75(ms)", count as total_requests, count(eval(responseStatus=500)) as failed_requests by url url_type
| eval "success_rate"=round((total_requests - failed_requests)/total_requests*100,2)
| eval avg = round(avg)
| sort success_rate
This will extract the "type" (logout, profile, settings) into a new field, url_type, then clean up the URL by removing everything from the userid to the end.
I have 3 fields in my Splunk result: message, id and docId.
I need to group the results by id and docId for events which have specific messages.
message="successfully added" id=1234 docId =1345
message="removed someUniqueId" id=1234 docId =1345
I have to group the results by both ids for events which have the specific message.
search query | rex "message=(?<message[\S\s]*>)" | where message="successfully added"
That gives results for the first search, but when I tried the second search query it gives no results, because of the someUniqueId:
search query | rex "message=(?<message[\S\s]*>)" | where match(message, "removed *")
Could you please help me filter the results which have the two messages and group them by id and docId?
The match function expects a regular expression, not a pattern, as the second argument. Try search query | rex "message=(?<message>[\S\s]*)" | where match(message, "removed .*").
BTW, the regex strings in the rex commands are invalid, but that may be a typing error in the question.
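Putting that together, a sketch that keeps both messages and groups by both ids might look like this (it assumes id and docId are already extracted as fields, and the rex is tightened to stop at the closing quote of the message value):
search query
| rex "message=\"(?<message>[\S\s]*?)\""
| where message="successfully added" OR match(message, "^removed ")
| stats values(message) as messages, count by id docId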
I am trying to write a Splunk SPL query that will show me the most popular search terms that users are looking for in one of my web apps. I have the logs in Splunk already, but I am having a hard time extracting the search parameter from the event. The event contains the full SQL select statement, which looks like the query below:
select result from table where search_term = 'searched for this text'
How can I have this:
index=my_app search_term | top result
How do I actually capture the search term?
Thank you
You can use rex to extract the search term. Something like this:
index=my_app | rex "search_term = '(?<search_term>[^']+)"
If you want individual words, then use the split function followed by mvexpand to make each word a separate event.
... | eval words=split(search_term, " ") | mvexpand words
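End to end, a sketch for the "most popular search terms" table might look like this (the limit and the choice to count individual words rather than whole phrases are assumptions):
index=my_app
| rex "search_term = '(?<search_term>[^']+)'"
| eval words=split(search_term, " ")
| mvexpand words
| top limit=20 words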