How can I create an alert in OMS when a Linux service is stopped? - azure-log-analytics

I am trying to create an alert in OMS when a Linux service is stopped.

AFAIK, you have the following options to accomplish this requirement.
Option I:
If the service/daemon uses its default configuration, its log information is written under /var/log/messages.
If the service-stop event is logged to the /var/log/messages file, follow these steps to get alerted:
Go to Azure portal -> YOURLOGANALYTICSWORKSPACE -> Advanced settings -> Data -> Syslog -> type 'daemon' -> click '+' -> click 'save'. For more information, see https://learn.microsoft.com/en-us/azure/azure-monitor/platform/data-sources-syslog.
Go to Azure portal -> YOURLOGANALYTICSWORKSPACE -> Logs -> type 'Syslog' -> click 'Run'. Check the 'SyslogMessage' column in the output. The output also has other useful columns, such as SeverityLevel, ProcessName, and ProcessID, which you can use while developing the query for your needs.
The query would look something like this:
Syslog
| where Facility == "daemon"
| where SyslogMessage has "xxxxxxx" and SyslogMessage has "stopping"
| summarize AggregatedValue = any(SyslogMessage) by Computer, bin(TimeGenerated, 30s)
Create and configure a custom log alert in the Log Analytics workspace Alerts tile using the above query. Set the threshold, frequency, and period while configuring the alert, and attach the intended action group to be notified when the alert fires.
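If you want to sanity-check the query outside the portal before wiring up the alert, here is a minimal Python sketch using the azure-monitor-query package (the workspace ID and the placeholder service name are assumptions you must supply):

# pip install azure-monitor-query azure-identity
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

WORKSPACE_ID = "<workspace-id>"  # placeholder: your Log Analytics workspace ID

client = LogsQueryClient(DefaultAzureCredential())
query = """
Syslog
| where Facility == "daemon"
| where SyslogMessage has "xxxxxxx" and SyslogMessage has "stopping"
| summarize AggregatedValue = any(SyslogMessage) by Computer, bin(TimeGenerated, 30s)
"""
response = client.query_workspace(WORKSPACE_ID, query, timespan=timedelta(hours=1))
for table in response.tables:
    for row in table.rows:
        print(row)  # each row is one Computer/time-bin with a sample message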
Option II:
If the service/daemon is custom-configured, its log information is written to that particular custom path.
If the service-stop event is logged to a file such as /xxxx/yyyy/zzzz.txt (other examples: /aaaa/bbbb/jenkins/jenkins.log, cccc/dddd/tomcat/catalina.out), follow these steps to get alerted:
Go to Azure portal -> YOURLOGANALYTICSWORKSPACE -> Advanced settings -> Data -> Custom Logs -> click 'Add +' -> .... For more information, see https://learn.microsoft.com/en-us/azure/azure-monitor/platform/data-sources-custom-logs.
Go to Azure portal -> YOURLOGANALYTICSWORKSPACE -> Logs -> type 'CUSTOMLOGNAME_CL' -> click 'Run'. Check the 'RawData' column in the output.
The query would look something like this:
CUSTOMLOGNAME_CL
| where RawData has "xxxxxxx" and RawData has "stopping"
| summarize AggregatedValue = any(RawData) by Computer, bin(TimeGenerated, 30s)
As in Option I, create and configure a custom log alert using the above query, with the appropriate threshold, frequency, period, and action group.
Option III:
If your service log data can't be collected with custom logs either, send the data directly to Azure Monitor using the HTTP Data Collector API, which is explained here -> https://learn.microsoft.com/en-us/azure/azure-monitor/platform/data-collector-api.
An example using runbooks in Azure Automation is provided in 'Collect log data in Azure Monitor with an Azure Automation runbook' here -> https://learn.microsoft.com/en-us/azure/azure-monitor/platform/runbook-datacollect.
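As a rough illustration of the Data Collector API call, here is a Python sketch based on the documented HMAC-SHA256 signing scheme (the workspace ID, key, log type, and record fields are placeholders/assumptions):

import base64, datetime, hashlib, hmac, json
import requests

WORKSPACE_ID = "<workspace-id>"   # placeholder
SHARED_KEY = "<primary-key>"      # placeholder: workspace primary key
LOG_TYPE = "ServiceStatus"        # hypothetical; appears in Log Analytics as ServiceStatus_CL

body = json.dumps([{"Computer": "myhost", "Service": "xxxxxxx", "State": "stopping"}])
rfc1123date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")

# Build the signature exactly as the Data Collector API docs describe.
string_to_hash = f"POST\n{len(body)}\napplication/json\nx-ms-date:{rfc1123date}\n/api/logs"
signature = base64.b64encode(
    hmac.new(base64.b64decode(SHARED_KEY), string_to_hash.encode("utf-8"), hashlib.sha256).digest()
).decode()

resp = requests.post(
    f"https://{WORKSPACE_ID}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01",
    data=body,
    headers={
        "Content-Type": "application/json",
        "Authorization": f"SharedKey {WORKSPACE_ID}:{signature}",
        "Log-Type": LOG_TYPE,
        "x-ms-date": rfc1123date,
    },
)
print(resp.status_code)  # 200 means the payload was accepted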
Hope this helps!! Cheers!! :)

Related

Is it possible to have custom fields in CloudWatch Logs from my application's logs?

I have several NodeJS applications running on ECS Fargate, and the logs are being shipped to CloudWatch. I'd like to have custom fields show up in CloudWatch for each log message, such as application.name and application.version, and possibly ones created depending on the content of the log message. Say my log message is [ERROR] Prototype: flabebes-flower crashed, and I'd like to pull out the log level ERROR and the name of the prototype flabebes-flower. Is it possible to have these fields in CloudWatch? If so, how can I accomplish this? I know how to achieve this using Filebeat processors and shipping the logs to Elasticsearch (I already have that solution), but I'd like to explore the possibility of moving away from Elasticsearch and just using CloudWatch, without having to write my own parsers.
There are basically two options:
If your log messages always have the same format, you can use the parse feature of CloudWatch Logs Insights to extract these fields, e.g.,
parse @message "[*] Prototype: * crashed" as level, prototype
If the metadata that you want to extract into custom fields is not in a parsable format, you can configure your application to log in JSON format and add the metadata to the JSON log within your application (how depends on the logging library that you use). Your JSON format can then look something like this:
{"prototype":"flabebes-flower","level":"error","message":"[ERROR] Prototype: flabebes-flower crashed","timestamp":"27.05.2022, 18:09:46"}
Again with CloudWatch Logs Insights, you can access the custom fields prototype and level. CloudWatch automatically parses the JSON for you; there is no need to use the parse command as in the above method.
This allows you, e.g., to run the query
fields @timestamp, message
| filter level = "error"
to get all error messages.
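The same query can also be run outside the console. Here is a hedged Python sketch with boto3 (the log group name and the one-hour window are assumptions for illustration):

import time
import boto3

logs = boto3.client("logs")

end = int(time.time())
query_id = logs.start_query(
    logGroupName="/ecs/my-app",  # hypothetical log group name
    startTime=end - 3600,
    endTime=end,
    queryString='fields @timestamp, message | filter level = "error"',
)["queryId"]

# Poll until the Logs Insights query finishes.
result = logs.get_query_results(queryId=query_id)
while result["status"] in ("Scheduled", "Running"):
    time.sleep(1)
    result = logs.get_query_results(queryId=query_id)

for row in result["results"]:
    print({f["field"]: f["value"] for f in row})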

Download scheduled Webi report from File Repository Server

Having launched a scheduled report in SAP BO, is it possible to somehow download it from the File Repository Server?
I am working with the Web Intelligence RESTful API. While it is possible to export a report synchronously using the GET /documents/<documentID>?<optional_parameters> request, I have not found any non-blocking asynchronous way except for using schedules.
Here's the intended workflow:
Create a scheduled report ("now") using POST /documents/<documentID>/schedules. Use a custom unique <ReportName> and store the scheduleID.
Poll the schedule status using GET /documents/<documentID>/schedules/<scheduleID>
If the schedule status is 1 (success), find the file using a CMS query
Send a POST /cmsquery with content {query: "select * from ci_infoObjects where si_instance=1 and si_schedule_status in (1) and si_name = '<ReportName>'"}
From the result, read "SI_FILES": {"SI_FILE1": "<generatedName>.pdf","SI_VALUE1": 205168,"SI_NUM_FILES":1,"SI_PATH": "frs://Output/<Path>"}
Using the browser or the RESTful API, download the file
Is step 4 possible at all? What would be the URL?
The internal base path can be configured in the CMC, and the file location would be <Path>/<generatedName>.pdf. But how can this file be accessed programmatically OR using an URL without the need to log into the BO BI interface?
As a workaround, it is possible to use the openReport method, passing the scheduleID (which is equal to the SI_ID from the infostore) as a parameter.
GET /BOE/OpenDocument/opendoc/openDocument.jsp?iDocID=<scheduleID>&sIDType=InfoObjectID&token=<token>
For file type PDF, the browser's internal PDF viewer is displayed. For XLS, the download is initiated immediately.
Another option is to generate the report directly into a shared location, for example an FTP server. Here is how:
In the "Folders" management area of the CMC, select an object.
Click Actions > Schedule, and access the "Destination" page.
If you are scheduling a Web Intelligence document, click Formats and Destinations.
Select FTP Server as the destination.
For a Web Intelligence document, select FTP Server under "Output Format Details" and then click Destination Options and Settings.
Here is the admin guide where this is explained in more detail (p. 858):
https://help.sap.com/doc/24e00820a014406495980dea5d768d52/XI.3.1/en-US/xi31_sp3_bip_admin_en.pdf
Or you can also check the exact steps from someone who has already done this:
https://blogs.sap.com/2015/06/10/scheduling-webi-report-output-to-ftp-shared-file-location/
After that you can expose your FTP server to the internet and construct a URL for download.
I used the steps below to retrieve the scheduled instance of a Webi report in any format.
Get the list of all schedule instances with their IDs.
Method: GET
Headers: X-SAP-LogonToken: <token>
API: <base_url>/raylight/v1/documents/<Report ID>/schedules
From the response of the step 1 API, select the instance ID of the instance you want to download, and pass it to the API below.
Method: GET
Headers: X-SAP-LogonToken: <token>
API: <base_url>/infostore/folder/<Instance ID>/file
Save the response in .wid/.xlsx/.pdf format using the 'Save response -> Save to a file' option on the response body of the step 2 API.
I tried it and this works :)
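Putting the two calls together, a minimal Python sketch (the base URL, token, and IDs are placeholders; the endpoints are the ones listed above):

import requests

BASE_URL = "http://<server>:6405/biprws"  # assumed standard RESTful web services root
HEADERS = {"X-SAP-LogonToken": "<token>", "Accept": "application/json"}

# Step 1: list the schedule instances of the report.
doc_id = "<ReportID>"  # placeholder
schedules = requests.get(
    f"{BASE_URL}/raylight/v1/documents/{doc_id}/schedules", headers=HEADERS
).json()

# Step 2: download a chosen instance and save it to disk.
instance_id = "<InstanceID>"  # picked from the step 1 response
resp = requests.get(f"{BASE_URL}/infostore/folder/{instance_id}/file", headers=HEADERS)
with open("report.pdf", "wb") as f:
    f.write(resp.content)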

event notification for documentDB instances

I get an alert like this every time I create or delete a DocumentDB instance. I do not remember how I set it up. I checked the DocumentDB console as well as the CloudWatch console, but cannot see any relevant entry.
Event Source : db-instance
Identifier Link: https://console.aws.amazon.com/docdb/home?region=us-east-1#dbinstance:id=docdb-2019-08-30-10-48-09
SourceId: docdb-2019-08-30-10-48-09
Notification time : 2019-08-30 10:58:10.508
Message : DB instance created
Event ID : https://docs.aws.amazon.com/amazondocdb/latest/developerguide/events.html#RDS-EVENT-0005
Can anyone guide me on how to get a notification like this for DocumentDB instances?
This is an RDS instance alert created in CloudWatch -> Events -> Rules.
If you created the alert as the "root" user, you cannot view or modify it using an IAM user, who usually has limited privileges!
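To help track the rule down programmatically, a hedged boto3 sketch that lists CloudWatch Events rules and their targets (assumes default region and credentials; pagination is omitted for brevity):

import boto3

events = boto3.client("events")
for rule in events.list_rules()["Rules"]:
    targets = events.list_targets_by_rule(Rule=rule["Name"])["Targets"]
    # Look for a rule whose event pattern mentions your DocumentDB/RDS instances.
    print(rule["Name"], rule.get("EventPattern"), [t["Arn"] for t in targets])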

In VSTS, how do I add an email notification task to the build and release definitions?

I am new to VSTS Continuous Integration and configuring one of my first sets of build and release definitions.
Is there a task I can add that will allow me to notify certain team members of events such as a build failed, or a release is ready?
In the latest iteration of the notification settings https://account.visualstudio.com/_notifications (where there is a single global settings page), if (like me) you don't run everything full screen:
Check the add button isn't off the side of the screen!
For build failure notifications, you can set them up at https://account.visualstudio.com/_notifications -> New -> select 'Build' as the category -> select 'A build fails' as the template -> Next -> select 'Other email' for 'Deliver to' -> add the email addresses, separated by semicolons (;) for multiple addresses -> optionally filter to a specific team project -> Finish.
For release success notifications, there is currently no built-in setting. You can create your own extension for release success email notification. For more detail, refer to 'sending email notification'.

Azure Web Job Console App With Parameter

I have a console application project and I publish it to Azure Web Jobs.
I want to schedule consoleapp.exe with parameters.
For Example :
First Schedule: consoleapp.exe ImportProducts
Second Schedule: consoleapp.exe OrderTransfer
Is it possible?
To do that, create a scheduled WebJob in the portal.
Then go to your scheduled job (under Azure Scheduler); there should be a link to it from the WebJobs screen.
Update the invoked URL from /api/triggeredwebjobs/{job name}/run to /api/triggeredwebjobs/{job name}/run?arguments={arguments}.
For reference, see the WebJobs API here - https://github.com/projectkudu/kudu/wiki/WebJobs-API#invoke-a-triggered-job
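Invoking that Kudu endpoint outside the portal might look like the following Python sketch (the site name, WebJob name, and deployment credentials are placeholders):

import requests

KUDU = "https://mywebaddress.scm.azurewebsites.net"  # the app's SCM endpoint
resp = requests.post(
    f"{KUDU}/api/triggeredwebjobs/consoleapp/run",
    params={"arguments": "ImportProducts"},
    auth=("$sitename", "<deployment-password>"),  # deployment (publish profile) credentials
)
print(resp.status_code)  # 202 means the run was accepted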
This is an approach I used:
Create an Azure WebJob: a console app with Main(string[] args).
Create an Azure Scheduler scheduled task.
Use HTTP "POST" (read this: http://blog.davidebbo.com/2015/05/scheduled-webjob.html).
Note: remember to put in credentials. My URL looks something like this: https://[[UID]]:[[PWD]]@mywebaddress.scm.azurewebsites.net/api/triggeredwebjobs/WebJobTest/run?arguments=one
Run the scheduled task, and in the console app you now have access to the arguments.
My log file:
[03/16/2016 01:11:35 > 486e37: SYS INFO] Status changed to Running
[03/16/2016 01:11:35 > 486e37: INFO] Arguments passed = 1
[03/16/2016 01:11:35 > 486e37: INFO] Arguments received = **one**
[03/16/2016 01:11:35 > 486e37: SYS INFO] Status changed to Success
Woop! Please note that I haven't tested this with competing schedules to check whether any locks occur. I assume not, but you know what assuming did.
I got an answer from Azure Forum.
This is not possible using the portal scheduling features alone. I have done something similar but needed two web jobs that called the .exe (with parameters) by way of a .bat file.
Another alternative is to have a single continuous WebJob that uses a QueueTrigger. This WebJob will listen on a queue for requests to "ImportProducts" and "OrderTransfer". But then you'd have to schedule the enqueuing of the requests separately, using Azure Scheduler with an ActionType of "Storage Queue" or something similar.
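Scheduling the enqueuing could then be as simple as a small script. A sketch using the azure-storage-queue Python package (the connection string and queue name are assumptions):

# pip install azure-storage-queue
from azure.storage.queue import QueueClient

# Placeholders: your storage connection string and the queue the WebJob listens on.
queue = QueueClient.from_connection_string("<storage-connection-string>", "webjob-requests")
queue.send_message("ImportProducts")  # the continuous WebJob's QueueTrigger picks this up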