How to configure PagerDuty alerts in Splunk Cloud?

I've run into a few different issues with the PagerDuty integration in Splunk Cloud.
The documentation on PagerDuty's site is either outdated, not applicable to Splunk Cloud, or else there's something wrong with the way my Splunk Cloud account is configured (it could be a permissions issue): https://www.pagerduty.com/docs/guides/splunk-integration-guide/. I don't see an Alert Actions page in Splunk Cloud, though I do have a Searches, Reports and Alerts page.
I've configured PD alerts in Splunk using the alert_logevent app, but it's not clear if I should be using some other app instead. These alerts do fire when there are search hits, but I'm seeing another issue (below). The alert_webhook app type seems like it might be appropriate, but I was unable to get it to work correctly. I cannot create an alert type using the pagerduty_incident app, although I can set it as a Trigger Action (I guess this is how it's supposed to work; I don't find the UI too intuitive here).
When my alerts fire and create incidents in PagerDuty, I do not see a way to set the PagerDuty incident severity.
Also, the PD incidents include a link back to Splunk, which I believe should open the query with the search hits that generated the alert. However, the link brings me to a page with a Page Not Found! error. It contains a link to "more information about my request", which brings up a Splunk query with no hits. This query looks like "index=_internal, host=SOME_HOST_ON_SPLUNK_CLOUD, source=*web_service.log, log_level=ERROR, requestid=A_REQUEST_ID". It is not clear to me if this is a config issue, a bug in Splunk Cloud, or possibly even a permissions issue for my account.
Any help is appreciated.

I'm also a Splunk Cloud + PagerDuty customer and ran into the same issue. The PagerDuty App for Splunk seems to create all incidents as Critical, but you can set different severities with event rules.
One way to do this dynamically is to rename your Splunk alerts with the desired severity level and then create a PagerDuty event rule for each level that looks for the keyword in the Summary. For example:
If the following condition is met:
Summary contains "TEST"
Then perform the following actions:
Set severity = Info
[screenshot of the example in the event rule edit screen]
It's a bit of a pain to rename your existing alerts in Splunk, but it works.
If your severity levels in Splunk are set programmatically, as in Enterprise Security, then another method would be to modify the PagerDuty App for Splunk to send the $alert.severity$ custom alert action token as a custom detail in the webhook payload and use that as the event rule condition instead of the Summary... but that seems harder.
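For what it's worth, here is a rough sketch of what an event could look like if the severity were carried both in the payload and as a custom detail that an event rule can match on, sent to the PagerDuty Events API v2 directly. The routing key, summary, and custom detail field name are placeholders, and this is not what the PagerDuty App for Splunk sends verbatim; it's only meant to show where the severity value could live.

```python
import requests

# Placeholder values: the real routing key comes from your PagerDuty service
# integration, and the severity would be mapped from the Splunk alert
# (e.g. the $alert.severity$ token) to one of PagerDuty's severity strings.
event = {
    "routing_key": "YOUR_INTEGRATION_ROUTING_KEY",
    "event_action": "trigger",
    "payload": {
        "summary": "TEST: disk usage above 90% on web-01",
        "source": "splunk-cloud",
        "severity": "info",  # one of: critical, error, warning, info
        "custom_details": {
            # Hypothetical field an event rule could match on instead of Summary
            "splunk_alert_severity": "2",  # e.g. the $alert.severity$ token value
        },
    },
}

resp = requests.post("https://events.pagerduty.com/v2/enqueue", json=event, timeout=10)
print(resp.status_code, resp.text)
```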

Related

What search terms should I use when creating an alert that is triggered when there are no logs coming from a service in Splunk?

I want to trigger an alert when there are no logs coming from our services in Splunk, but I'm not sure how to do that.
I can search our logs using this [| inputlookup app | search app=app_name env=prod service=app_name], where app is the CSV lookup table with app, env, and service properties that provide lookup values for our search.
One other thing to note is that I have access to the sourcetype and the source, where
sourcetype=kube:container:app_name_env
source=*k8s_app_name_env_*
But again, I'm not sure what search query I should base the alert on. I know how to create alerts in Splunk, but I'm not sure how to trigger one when there are no logs coming from the source above. Any suggestions? Thanks!
Schedule a search over a recent window (for example, sourcetype=kube:container:app_name_env over the last 15 minutes) and set the alert's trigger condition to fire when the number of results is zero; then, in the Alert actions, have it send a message.
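If you prefer to set this up outside the UI, here is a rough sketch of creating such an alert through Splunk's saved/searches REST endpoint. The host, credentials, index, schedule, and e-mail address are placeholders, and the alert_* parameter names assume the standard saved-search alert settings, so treat this as an outline rather than a drop-in script.

```python
import requests

SPLUNK_BASE = "https://your-splunk-host:8089"   # management port; placeholder
AUTH = ("admin", "changeme")                     # placeholder credentials

# Search that normally returns events; the alert should fire when it returns none.
spl = "index=your_index sourcetype=kube:container:app_name_env"

resp = requests.post(
    f"{SPLUNK_BASE}/services/saved/searches",
    auth=AUTH,
    verify=False,  # only for a quick test against a self-signed cert
    data={
        "name": "no_logs_from_app_name",
        "search": spl,
        "dispatch.earliest_time": "-15m",
        "dispatch.latest_time": "now",
        "is_scheduled": "1",
        "cron_schedule": "*/15 * * * *",
        # Trigger when the scheduled search returns zero results.
        "alert_type": "number of events",
        "alert_comparator": "equal to",
        "alert_threshold": "0",
        "actions": "email",
        "action.email.to": "oncall@example.com",
    },
)
print(resp.status_code, resp.text)
```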

Is it possible to link to a job in the bigquery console?

If I execute a BigQuery job using the REST API (i.e. bigquery.googleapis.com), the response includes a selfLink that looks something like this:
https://bigquery.googleapis.com/bigquery/v2/projects/my-project/jobs/job_0123456789ABCDEF?location=EU
In the UI (i.e. console.cloud.google.com) I can see the very same job in the project's query history.
Is it possible to use the information within that API response and construct a URL that will allow a person to visit that URL in the browser and be taken directly to the information about that query in the UI? This would be really useful because we could log a message containing that URL so that anyone viewing the logs can see a user-friendly UI regarding that job.
I suspect the answer is "no" but just thought I'd ask.
I believe you can share this link:
https://console.cloud.google.com/bigquery?project=<my-project>&j=bq:<location>:<job_id>&page=queryresults
For example: https://console.cloud.google.com/bigquery?project=my-project&j=bq:US:2846160a-9a13-4192-9bff-e691ff2adab6&page=queryresults
If a user has BQ Job List permission in that project, then when they open the link they will be able to see the query that was run in the UI, along with the job information.
But they can't see the query results, which is intended behavior. Instead they will get a warning:
Access Denied: User does not have permission to access results of another user's job.
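If you want to build that console URL automatically, here is a small sketch that derives it from the selfLink in the API response. It assumes the selfLink format shown in the question; if you have the job's jobReference object instead, its projectId, jobId, and location fields can be used directly.

```python
from urllib.parse import urlparse, parse_qs

def console_url_from_self_link(self_link: str) -> str:
    """Build a BigQuery console link from a job's selfLink."""
    parsed = urlparse(self_link)
    # The selfLink path looks like /bigquery/v2/projects/<project>/jobs/<job_id>
    parts = parsed.path.split("/")
    project = parts[parts.index("projects") + 1]
    job_id = parts[parts.index("jobs") + 1]
    location = parse_qs(parsed.query).get("location", ["US"])[0]
    return (
        "https://console.cloud.google.com/bigquery"
        f"?project={project}&j=bq:{location}:{job_id}&page=queryresults"
    )

print(console_url_from_self_link(
    "https://bigquery.googleapis.com/bigquery/v2/projects/my-project"
    "/jobs/job_0123456789ABCDEF?location=EU"
))
```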

Problem seeing Data from reference input (SQL) Azure Stream Analytics

I am trying to preview the data coming from a SQL DB that is set as a reference input for my Azure Stream Analytics job.
I am getting this error:
We cannot locate the resource for the selected input. Please make sure its subscription has been selected in the global subscription filter.
I have looked online but can't seem to find how to resolve it.
I also have data coming from an event hub and that is working; the error occurs only when previewing the DB data.
"We cannot locate the resource for the selected input. Please make sure its subscription has been selected in the global subscription filter."
I can't reproduce your issue on my side, but based on the error above, I think you have multiple subscriptions. Make sure the subscription your SQL DB resides in has been selected in the global subscription filter.
To do that, navigate to the subscription management page in the portal and check whether the subscription is selected in the global subscription filter. If it isn't, select it in the right-hand box.
For reference: https://www.kunal-chowdhury.com/2018/06/azure-global-subscription-filter.html

How to delete Google Calendar Event from API (Using Bubble.io)

I am trying to use the Google Calendar API to enable users of my application to create events, retrieve event info, and finally delete events in their Google Calendar directly from my app, which, by the way, is built in Bubble.io.
I have successfully set up the first two use cases, but I am having issues setting up the last one.
I have tried to follow the API documentation from Google, but without any luck so far. See the attached screenshot of how my call looks right now. I have tried various variations of this call, but always get an error with "code: 404, Message: not found".
I think you should add parameters like the following:
[screenshot of the suggested call parameters]
You can find more info here : https://developers.google.com/calendar/v3/reference/events/delete
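For reference, outside of Bubble the delete call is just an HTTP DELETE against the event's calendars/{calendarId}/events/{eventId} path; in Bubble's API Connector you would supply the same pieces. Here is a minimal sketch (the calendar ID, event ID, and access token are placeholders); a 404 usually means one of those two path segments is wrong or missing.

```python
import requests

# Placeholder values: use the calendar ID ("primary" for the user's default
# calendar), the event ID returned when the event was created, and a valid
# OAuth 2.0 access token with a Google Calendar scope.
CALENDAR_ID = "primary"
EVENT_ID = "abc123def456"
ACCESS_TOKEN = "ya29.EXAMPLE_TOKEN"

resp = requests.delete(
    f"https://www.googleapis.com/calendar/v3/calendars/{CALENDAR_ID}/events/{EVENT_ID}",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)

# A successful delete returns an empty body (HTTP 204).
print(resp.status_code)
```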
Let me know if this was of any help.
A proud member of the Bubble community.

Worklight api console request for Push

I need to get a list of users for a specific Push adapter/event source, and I'm trying to use the API console requests, which say the format is:
http://{hostname}:{port}/{context-root}/console/api/{api-context}/{action}/{parameters}
and I'm using:
http://192.168.1.106:10080/Module_07_04_nativeAPIForiOSPush/console/api/Push/get/PushAdapter/PushEventSource
to search the demo project, which has one subscriber. However, I get a 404 returned for the browser request.
The first column of the docs is the "api-context", but it lists "Push" and "Event Sources", which obviously seems invalid.
What is the correct format to find users subscribed to a push for a specific adapter/event source?
The WL server does not provide an API for listing subscribed users. By design, you should maintain your own DB of subscribed users; this is why you have the onSubscribe and onUnsubscribe callbacks in the event source. As an alternative, you can look into WL's DB tables to find this info.