Extracting a particular value using regex in Splunk

In the event below, the "status" key has a value of either "1" or "0".
I am looking to extract the "status" values that are "0" and put them in a field.
Please help me with a regular expression for this.
- 2017-02-14 18:47:28.572 INFO SomePlaceHolder-5 [.abc.def.nothingishere] - string response: <200 OK,{"clips":[{"myid":"123456","historyid":"777-888-999","provider":"somecompany","status":1,"userType":1}]},{X-Backside-Transport=[OK OK], Connection=[Keep-Alive], Transfer-Encoding=[chunked], Content-Type=[application/json], X-Powered-By=[ARR/3.0,ASP.NET], Date=[Tue, 14 Feb 2017 18:47:28 GMT], X-Client-IP=[10.0.0.0.], X-Global-Transaction-ID=[9876543]}>

Presuming Splunk hasn't already extracted these automatically (it looks close to JSON, perhaps), this will do it:
index=ndx sourcetype=srctp
| rex field=_raw "status\":(?<status>\d+)"
| search status=0
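If you want to try the extraction without real data, here is a minimal sketch against a trimmed copy of the sample event (the status value is flipped to 0 here so the filter keeps the row, and the response headers are omitted):
| makeresults
| eval _raw="string response: <200 OK,{\"clips\":[{\"myid\":\"123456\",\"historyid\":\"777-888-999\",\"provider\":\"somecompany\",\"status\":0,\"userType\":1}]}>"
| rex field=_raw "status\":(?<status>\d+)"
| search status=0
| table status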

Related

CloudWatch Logs Insights: display a field from the JSON in the log message

This is my log entry from AWS API Gateway:
(8d036972-0445) Method request body before transformations: {"TransactionAmount":225.00,"OrderID":"1545623982","PayInfo":{"Method":"ec","TransactionAmount":225.00},"CFeeProcess":0}
I want to write a CloudWatch Logs Insights query which can display the AWS request id, present in the first set of parentheses, and the order id present in the JSON.
I'm able to get the AWS request id by parsing the message. How can I get the OrderID JSON field?
Any help is greatly appreciated.
| parse #message "(*) Method request body before transformations: *" as awsReqId,JsonBody
#| filter OrderID = "1545623982" This did not work
| display awsReqId,OrderID
| limit 20
You can do it with two parse steps, like this:
fields #message
| parse #message "(*) Method request body before transformations: *" as awsReqId, JsonBody
| parse JsonBody "\"OrderID\":\"*\"" as OrderID
| filter OrderID = "1545623982"
| display awsReqId,OrderID
| limit 20
Edit:
Actually, the way you're doing it should also work. I think it doesn't work because you have two space characters between the closing bracket and the word Method here: (*) Method. Try removing one space.

How to extract a value from Splunk and generate a line graph

My log messages
.o.s.c.PaymentMethodInstrumentController : Exiting ServiceController.getMyServiceDetails() : elapsedTime(ms):34, xrfRequestId:c3b5878d-8795-49cb-b6a7-51ab02789f46, xCorrelationId:786d68ea-ze46-42b9-966f-124f2eb444f6, xForwardedFor:10.242.79.96
.o.s.c.PaymentMethodInstrumentController : Exiting ServiceController.getMyServiceDetails() : elapsedTime(ms):39, xrfRequestId:c3b2c08d-6c6d-49cb-b6a7-51a89897446, xCorrelationId:78676yt64-ze46-42b9-966f-124f2eb444f6, xForwardedFor:10.242.79.96
I am looking to extract elapsedTime(ms):34 and generate a line graph of these values.
Assuming you already have _time, something like this:
<your search>
| rex "elapsedTime(ms):(?<elapsedTime>\d+),"
| table _time elapsedTime
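If you want the line graph directly rather than a table, here is a sketch using timechart (the 5-minute span and the avg aggregation are assumptions; adjust them to your data):
<your search>
| rex "elapsedTime\(ms\):(?<elapsedTime>\d+)"
| timechart span=5m avg(elapsedTime) AS avg_elapsed_ms
Switching the result to a line chart in the Visualization tab then gives the graph.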

Splunk: extract a value from a string which begins with a particular value

Could you help me extract the file name in table format?
The part of the field just before the file name is always constant: "Put File /test/abc/test/test/test to /test/test/test/test/test/test/test/test/test/test destFolderPath: /test/test/test/test/test/test/test/abc/def/hij"
This is an event from Splunk:
2021-04-08T01:03:40.155069+00:00 somedata||someotherdata||..|||Put File /test/abc/test/test/test to /test/test/test/test/test/test/test/test/test/test destFolderPath: /test/test/test/test/test/test/test/abc/def/hij/CHARGEBACK_20210407_060334_customer.csv
The result should be in table format (font/format doesn't matter):
File Name
CHARGEBACK_20210407_060334_customer.csv
Assuming the original event/field ends with the file name, you should use this regular expression:
(?<file_name>[^\/]+)$
This will extract the text between the last "/" and the end of the event/field ("$").
You can test it here: https://regex101.com/r/J6bU3m/1
Now you can use Splunk's rex command to extract fields at search-time:
| makeresults
| eval _raw="2021-04-08T01:03:40.155069+00:00 somedata||someotherdata||..|||Put File /test/abc/test/test/test to /test/test/test/test/test/test/test/test/test/test destFolderPath: /test/test/test/test/test/test/test/abc/def/hij/CHARGEBACK_20210407_060334_customer.csv"
| fields - _time
| rex field=_raw "(?<file_name>[^\/]+)$"
Alternatively, you could also use this regular expression since you mentioned that the file path is always the same:
| rex field=_raw "abc\/def\/hij\/(?<file_name>.+)"

How to reference an eval variable in a query

I am trying to access a variable (in this example, sampleFromDate and sampleToDate) from a sub-query. I have defined the variables with the syntax eval variableName = value and would like to access them with the syntax filterName=$variableName$. See the example below, where I am trying to access the values using earliest=$sampleFromDate$ latest=$sampleToDate$:
index=*
earliest=-8d latest=-1d
| eval sampleToDate=now()
| eval sampleFromDate=relative_time(now(), "-1d")
| appendcols [
search (index=*)
earliest=$sampleFromDate$ latest=$sampleToDate$
]
This produces the error:
Invalid value "$sampleFromDate$" for time term 'earliest'
The value of sampleFromDate is in the format seconds since epoch time, e.g.
1612251236.000000
I know I can do earliest=-d latest=now(), but I don't want to do this because I want to reference the variables in several locations and output them at the end.
Why are you trying to eval those time values?
Just do:
index=* earliest=-8d latest=-1d
| <rest of search>
| appendcols [
search (index=*) earliest=-1d
| <rest of appended search>
]
There's no need to explicitly set latest unless you want something other than now().
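If the only reason for the eval was to show the sample window in the final output, one way to do that while keeping the hard-coded time terms is sketched below (the field names and the -1d window are taken from the question; add your own output fields to the table):
index=* earliest=-8d latest=-1d
| <rest of search>
| appendcols [
    search (index=*) earliest=-1d
    | <rest of appended search>
]
| eval sampleFromDate=relative_time(now(), "-1d"), sampleToDate=now()
| table sampleFromDate sampleToDate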

How can we write the Splunk query to find whether subField2 is present or not and, if present, get the count of all subField2 values

{
index:"myIndex",
field1: "myfield1",
field2: {"subField1":"mySubField1","subField2":145,"subField3":500},
...
..
.
}
SPL: index:"myIndex" eval result = if(field.subField2) .....
Does the dot operator work in SPL?
I am assuming your data is in JSON format. If so, you can use spath to extract fields from your structured data, then just check whether the field is present with isnotnull. Note the single quotes around the dotted field name, which eval-style commands such as where require:
index="myIndex" | spath | where isnotnull('field2.subField2')
Presuming your data is in JSON format, this should do it:
index=myIndex sourcetype=srctp field2{}.subField2=*
If those are multivalue fields, you'll need to do an mvexpand first.
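And if all you need from this second approach is the number of matching events, a one-line follow-up sketch (it keeps the field name from the search above, which assumes field2 is auto-extracted as an array of objects):
index=myIndex sourcetype=srctp field2{}.subField2=*
| stats count AS events_with_subField2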