Splunk chart function displaying zero values when trying to round off input

I have been trying to display a chart in Splunk. I uploaded my JSON data through the Splunk HTTP Forwarder, and after uploading I have fields such as:
{"message":{"acplbuild":"ACPL 1.20.1","coresyncbuild":"4.3.10.25","testregion":"EU_Stage","client":"EU_Mac","date":"2019-08-27","iteration":"20","localCreateTime":"6.672","createSyncTime":"135.768","createSearchTime":"0.679","filetype":"CPSD","filesize":"690_MB","filename":"690MB_NissPoetry.cpsd","operation":"upload","upload_DcxTime":"133.196","upload_manifest_time":"133.141","upload_journal_time":"1.753","upload_coresync_time":"135.225","upload_total_time":142.44},"severity":"info"}
I am trying to run the following query
index="coresync-ue1" host="acpsync_allacpl_7" message.testregion=EU_STAGE message.client=EU_Mac message.operation="upload" |eval roundVal = round(message.upload_total_time, 2) | chart median(roundVal) by message.acplbuild
I am getting no values. It should display the rounded-off median values as a chart. Can someone point out what I am doing wrong here?

I used the same data as you specified and faced an issue while rounding off the upload_total_time value, so I first converted it to a number, and then the Splunk search query worked.
Input Data Set
{"message":{"acplbuild":"ACPL 1.20.1","coresyncbuild":"4.3.10.25","testregion":"EU_Stage","client":"EU_Mac","date":"2019-08-27","iteration":"20","localCreateTime":"6.672","createSyncTime":"135.768","createSearchTime":"0.679","filetype":"CPSD","filesize":"690_MB","filename":"690MB_NissPoetry.cpsd","operation":"upload","upload_DcxTime":"133.196","upload_manifest_time":"133.141","upload_journal_time":"1.753","upload_coresync_time":"135.225","upload_total_time":142.44},"severity":"info"}
{ "message":{"acplbuild":"ACPL 1.20.2","coresyncbuild":"4.3.10.25","testregion":"EU_Stage","client":"EU_Mac","date":"2019-08-27","iteration":"20","localCreateTime":"6.672","createSyncTime":"135.768","createSearchTime":"0.679","filetype":"CPSD","filesize":"690_MB","filename":"690MB_NissPoetry.cpsd","operation":"upload","upload_DcxTime":"133.196","upload_manifest_time":"133.141","upload_journal_time":"1.753","upload_coresync_time":"135.225","upload_total_time":152.44123},"severity":"info"}
{ "message":{"acplbuild":"ACPL 1.20.3","coresyncbuild":"4.3.10.25","testregion":"EU_Stage","client":"EU_Mac","date":"2019-08-27","iteration":"20","localCreateTime":"6.672","createSyncTime":"135.768","createSearchTime":"0.679","filetype":"CPSD","filesize":"690_MB","filename":"690MB_NissPoetry.cpsd","operation":"upload","upload_DcxTime":"133.196","upload_manifest_time":"133.141","upload_journal_time":"1.753","upload_coresync_time":"135.225","upload_total_time":160.456},"severity":"info"}
Splunk Search Query
source="sample.json" index="splunk_answers" sourcetype="_json"
| convert num(message.upload_total_time) as total_upld_time
| eval roundVal = round(total_upld_time,2)
| chart median(roundVal) by message.acplbuild
Statistics View
Visualization View
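Alternatively, the conversion can be done inline with eval's tonumber() — a sketch assuming the same sample data. Note that field names containing dots must be wrapped in single quotes inside eval, which is likely why the original round() produced no values:

```
source="sample.json" index="splunk_answers" sourcetype="_json"
| eval roundVal = round(tonumber('message.upload_total_time'), 2)
| chart median(roundVal) by message.acplbuild
```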

Related

How to make pie chart of these values in Splunk

Have the following query index=app (splunk_server_group=bex OR splunk_server_group=default) sourcetype=rpm-web* host=rpm-web* "CACHE_NAME=RATE_SHOPPER" method=GET | stats count(eval(searchmatch("found=true"))) as Hit, count(eval(searchmatch("found=false"))) as Miss
I need to make a pie chart of the two values, Hit and Miss rates.
The field where it is possible to distinguish the values is Message=[CACHE_NAME=RATE_SHOPPER some_other_strings method=GET found=false], where found can also be true.
Without knowing the structure of your data it's hard to say exactly what you need to do, but:
A pie chart is a single data series, so you need to use a transforming command to generate that single series. PieChart Doc
If you have a field that denotes a hit or miss (you could use an eval statement to create one if you don't already have it), you can use it to create the single series like this.
Let's say this field is called result.
| stats count by result
Here is a link to the documentation for the Eval Command
Good luck, hope you get the results you're looking for.
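A hedged sketch of that eval, assuming hits and misses can be told apart by the found=... text in the raw event (as in the question):

```
... | eval result = if(searchmatch("found=false"), "Miss", "Hit")
| stats count by result
```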
Since you seem to be concerned only about whether "found" equals either "hit" or "miss", try this:
index=app (splunk_server_group=bex OR splunk_server_group=default) sourcetype=rpm-web* host=rpm-web* "CACHE_NAME=RATE_SHOPPER" method=GET found IN("hit","miss")
| stats count by found
Pie charts require a single field so it's not possible to graph the Hit and Miss fields in a pie. However, if the two fields are combined into one field with two possible values, then it will work.
index=app (splunk_server_group=bex OR splunk_server_group=default) sourcetype=rpm-web* host=rpm-web* "CACHE_NAME=RATE_SHOPPER" method = GET
| eval result=if(searchmatch("found=true"), "Hit", "Miss")
| stats count by result

How can I put several extracted values from a Json in an array in Kusto?

I'm trying to write a query that returns the vulnerabilities found by "Built-in Qualys vulnerability assessment" in log analytics.
It was all going smoothly: I was getting the values from the properties JSON and turning them into separate strings, but I found out that some of the fields have more than one value, and I need to get all of them in a single cell.
My query is like this right now
securityresources | where type =~ "microsoft.security/assessments/subassessments"
| extend assessmentKey=extract(@"(?i)providers/Microsoft.Security/assessments/([^/]*)", 1, id), IdAzure=tostring(properties.id)
| extend IdRecurso = tostring(properties.resourceDetails.id)
| extend NomeVulnerabilidade=tostring(properties.displayName),
Correcao=tostring(properties.remediation),
Categoria=tostring(properties.category),
Impacto=tostring(properties.impact),
Ameaca=tostring(properties.additionalData.threat),
severidade=tostring(properties.status.severity),
status=tostring(properties.status.code),
Referencia=tostring(properties.additionalData.vendorReferences[0].link),
CVE=tostring(properties.additionalData.cve[0].link)
| where assessmentKey == "1195afff-c881-495e-9bc5-1486211ae03f"
| where status == "Unhealthy"
| project IdRecurso, IdAzure, NomeVulnerabilidade, severidade, Categoria, CVE, Referencia, status, Impacto, Ameaca, Correcao
Ignore the awkward names of the columns, for they are in Portuguese.
As you can see in the "Referencia" and "CVE" columns, I'm able to extract the values from a specific index of the array, but I want all links of the whole array
Without sample input and expected output it's hard to understand what you need, so trying to guess here...
I think that summarize make_list(...) by ... will help you (see this to learn how to use make_list)
If this is not what you're looking for, please delete the question, and post a new one with minimal sample input (using datatable operator), and expected output, and we'll gladly help.
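For the query above, one possible shape — assuming properties.additionalData.vendorReferences is an array of objects with a link property — is to mv-expand the array and re-aggregate the links with make_list:

```
securityresources
| where type =~ "microsoft.security/assessments/subassessments"
| extend IdAzure = tostring(properties.id)
| mv-expand ref = properties.additionalData.vendorReferences
| extend link = tostring(ref.link)
| summarize Referencias = make_list(link) by IdAzure
```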

splunk date time difference

I am new to Splunk. My goal is to optimize the API call, since that particular API method is taking more than 5 minutes to execute.
In Splunk I searched using a context ID and got all the functions and sub-functions called by the main API function for that particular execution. Now I want to figure out which sub-function took the most time. In the list of fields on the left side in Splunk, I see the fields CallStartUtcTime (e.g. "2021-02-12T20:17:42.3308285Z") and CallEndUtcTime (e.g. "2021-02-12T20:18:02.3702937Z"). How can I write a search that gives me the difference between these two times? I googled and found that we can use the eval() function, but for me it returns a null value.
Additional Info:
search:
I clicked on "create table view" and checked the start, end, and diff fields in the fields list on the left side, but all three come back null.
I'm not sure what I am doing wrong. I want to find out the time taken by each function.
Splunk cannot compare timestamps in string form; they must be converted to epoch (numeric) form first. Use the strptime() function for that.
...
| eval start = strptime(CallStartUtcTime, "%Y-%m-%dT%H:%M:%S.%7N%Z")
| eval end = strptime(CallEndUtcTime, "%Y-%m-%dT%H:%M:%S.%7N%Z")
| eval diff = end - start
...
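If a human-readable duration is preferred over raw seconds, the difference can also be formatted with tostring(..., "duration") — a small extension of the evals above:

```
| eval diff_readable = tostring(end - start, "duration")
```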

NASA API into table in excel

I'm trying to download weather-related data from the NASA API into Excel using Power Query.
I'm trying to query wind speed at 50 metres; the request string is:
https://power.larc.nasa.gov/cgi-bin/v1/DataAccess.py?&request=execute&tempAverage=DAILY&identifier=SinglePoint&parameters=WS50M&userCommunity=SB&lon=142&lat=-38&startDate=20170101&endDate=20201231&outputList=JSON&user=DOCUMENTATION
I know this is the correct string because when I paste this as a url into my chrome browser, I get the desired output in JSON. However, when I try to get the output into a table in excel, I get a mere 2 records. Something is clearly amiss.
Any help on this will be appreciated.
Hi, use Excel's Power Query.
let
    Fonte = Json.Document(Web.Contents("https://power.larc.nasa.gov/cgi-bin/v1/DataAccess.py?&request=execute&tempAverage=DAILY&identifier=SinglePoint&parameters=WS50M&userCommunity=SB&lon=142&lat=-38&startDate=20170101&endDate=20201231&outputList=JSON&user=DOCUMENTATION")),
    features = Fonte[features],
    features1 = features{0},
    properties = features1[properties],
    parameter = properties[parameter],
    WS50M = parameter[WS50M]
in
    WS50M
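The WS50M value in that response is a single record keyed by date, which is why loading it directly yields so few rows. To get one row per date, the record can be expanded with Record.ToTable — a sketch assuming the same response shape; the column names are arbitrary:

```
let
    Fonte = Json.Document(Web.Contents("https://power.larc.nasa.gov/cgi-bin/v1/DataAccess.py?&request=execute&tempAverage=DAILY&identifier=SinglePoint&parameters=WS50M&userCommunity=SB&lon=142&lat=-38&startDate=20170101&endDate=20201231&outputList=JSON&user=DOCUMENTATION")),
    WS50M = Fonte[features]{0}[properties][parameter][WS50M],
    Tabela = Record.ToTable(WS50M),
    Renomeada = Table.RenameColumns(Tabela, {{"Name", "Date"}, {"Value", "WS50M"}})
in
    Renomeada
```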

Search with original text that was replaced earlier

I am gathering performance metrics for each API that we have. With the below query I get results such as:
method response_time
Create Billing 2343.2323
index="dev-uw2" logger_name="*Aspect*" message="*ApiImpl*" | rex field=message "PerformanceMetrics - method='(?<method>.*)' execution_time=(?<response_time>.*)" | table method, response_time | replace "public com.xyz.services.billingservice.model.Billing com.xyz.services.billingservice.api.BillingApiImpl.createBilling(java.lang.String)" WITH "Create Billing" IN method
If the user clicks on the API text in a table cell to drill down further, it will open a new search for "Create Billing"; obviously that will give zero results, since we don't have any log with that string.
I want splunk to search with original text that was replaced earlier.
You can use click.value to get around this.
http://docs.splunk.com/Documentation/SplunkCloud/6.6.3/Viz/tokens
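As a minimal Simple XML sketch (panel and search details omitted), the clicked value can be captured into a token via the table's drilldown and then fed to another search instead of the default drilldown; the token name here is an arbitrary placeholder:

```xml
<drilldown>
  <set token="selected_method">$click.value$</set>
</drilldown>
```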