Get sample data with stream analytics - azure-stream-analytics

I am using continuous export and stream analytics for a Power BI report. But I have some problems configuring the inputs for stream analytics.
Continuous export writes data to a blob storage.
But when I click "Sample data" I get the error message:
"No events were found in 'input' for the specified time range".
But I am sure there is data stored for this time range.
My path prefix pattern looks like:
application insights resource_keywithoutdashes/Requests/{date}/{time}
My Application Insights resource name contains a space, and I have everything in lowercase.
I followed these steps: https://azure.microsoft.com/en-us/documentation/articles/app-insights-export-power-bi/
When I inspect the blob storage where the data lands, I see a container named "stats". This container has a folder named "application insights resource_keywithoutdashes", which in turn has a folder named Requests; in that folder there are folders for every day since I started Continuous Export. These folders are named like "2016-10-04", and each of those contains folders named 00, 01, 02 through 23.
I tried to change the path prefix pattern to "stats/application insights resource_keywithoutdashes/Requests/{date}/{time}", but then I got the error message: "Failed to sample data from 'export-input'." and in the details I got: "Operation Failed Unexpectedly. Activity Id: 'xxx'".
When I click test connection I always get the message "Successfully connected to 'input'"
Why can't I get sample data? What am I doing wrong? Do I miss something?
Does anyone have tips on how to configure these settings?

This page helped me out:
http://fabriccontroller.net/connecting-auth0-to-power-bi-with-stream-analytics-and-a-webtask/
On that page I found the full URL below and realized I was looking at the wrong storage account.
https://{YOUR_STORAGE_ACCOUNT_NAME}.blob.core.windows.net/{YOUR_STORAGE_CONTAINER_NAME}/YYYY/MM/DD/HH/{LOG_ID}.json
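One common pitfall with this setup (an assumption on my part, not stated in the pages quoted above) is putting the container name into the path prefix: in the Stream Analytics blob input, the container is its own field, so the settings for the layout described in the question might be split like this:

```
Storage account:     <your storage account>
Container:           stats
Path prefix pattern: application insights resource_keywithoutdashes/Requests/{date}/{time}
Date format:         YYYY-MM-DD
Time format:         HH
```

With the container held separately, the "stats/" prefix that triggered the "Failed to sample data" error should not be needed.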

Related

Is it possible to have custom fields in CloudWatch Logs from my application's logs?

I have several NodeJS applications running on ECS Fargate and the logs are being shipped to CloudWatch. I'd like to have custom fields show up in CloudWatch for each log message such as application.name and application.version and possibly ones created depending on the content of the log message. Say my log message is [ERROR] Prototype: flabebes-flower crashed and I'd like to pull out the log level ERROR and the name of the prototype flabebes-flower. Is it possible to have these fields in CloudWatch? If so, how can I accomplish this? I know how to achieve this using Filebeat processors and shipping the logs to Elasticsearch, I have that solution already but I'd like to explore the possibility of moving away from Elasticsearch and just using CloudWatch without having to write my own parsers.
There are basically two options:
If your log messages always have the same format, you can use the parse feature of CloudWatch Log Insights to extract these fields, e.g.,
parse @message "[*] Prototype: * crashed" as level, prototype
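For intuition, the same extraction can be mirrored client-side with a plain JavaScript regex; the `[*] Prototype: * crashed` wildcards become capture groups:

```javascript
// Mirror of the Logs Insights parse pattern "[*] Prototype: * crashed":
// each * becomes a capture group, anchored to the full line.
const line = "[ERROR] Prototype: flabebes-flower crashed";
const match = line.match(/^\[(\w+)\] Prototype: (.+) crashed$/);
const [, level, prototype] = match;
console.log(level, prototype); // ERROR flabebes-flower
```

Logs Insights applies the equivalent pattern to every log event, so fields extracted this way are only available at query time, not stored on the event.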
If the metadata that you want to extract into custom fields is not in a parsable format, you can configure your application to log in JSON format and add the metadata to the JSON log within your application (how depends on the logging library that you use). Your JSON format can then look something like this:
{"prototype":"flabebes-flower","level":"error","message":"[ERROR] Prototype: flabebes-flower crashed","timestamp":"27.05.2022, 18:09:46"}
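A minimal sketch of emitting such a line from a Node.js app; `logJson` is a hypothetical helper, not part of any logging library (in practice a library like winston or pino would do this for you):

```javascript
// Emit one JSON object per line; on ECS Fargate with the awslogs driver,
// each console.log line becomes one CloudWatch log event.
function logJson(level, prototype, message) {
  const entry = {
    prototype,
    level,
    message,
    timestamp: new Date().toISOString(),
  };
  console.log(JSON.stringify(entry));
  return entry;
}

const entry = logJson(
  "error",
  "flabebes-flower",
  "[ERROR] Prototype: flabebes-flower crashed"
);
```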
Again with CloudWatch Log Insights, you can access the custom fields prototype and level. CloudWatch will automatically parse the JSON for you, there is no need to use the parse command as in the above method.
This allows you, e.g., to run the query
fields @timestamp, message
| filter level = "error"
to get all error messages.

Download scheduled Webi report from File Repository Server

Having launched a scheduled report in SAP BO, is it possible to somehow download it from the File Repository Server?
I am working with the Web Intelligence RESTful API. While it is possible to export a report synchronously using the GET /documents/<documentID>?<optional_parameters> request, I have not found any non-blocking asynchronous way except for using schedules.
Here's the intended workflow:
1. Create a scheduled report ("now") using POST /documents/<documentID>/schedules. Use a custom unique <ReportName> and store the scheduleID.
2. Poll the schedule status using GET /documents/<documentID>/schedules/<scheduleID>.
3. If the schedule status is 1 (success), find the file using a CMS query: send a POST /cmsquery with content {query: "select * from ci_infoObjects where si_instance=1 and si_schedule_status in (1) and si_name = '<ReportName>'"} and, from the result, read "SI_FILES": {"SI_FILE1": "<generatedName>.pdf","SI_VALUE1": 205168,"SI_NUM_FILES":1,"SI_PATH": "frs://Output/<Path>"}
4. Using the browser or the RESTful API, download the file.
Is step 4 possible at all? What would be the URL?
The internal base path can be configured in the CMC, and the file location would be <Path>/<generatedName>.pdf. But how can this file be accessed programmatically, or via a URL, without the need to log into the BO BI interface?
As a workaround, it is possible to use the openReport method, passing the scheduleID (which equals the SI_ID from the infostore) as a parameter.
GET /BOE/OpenDocument/opendoc/openDocument.jsp?iDocID=<scheduleID>&sIDType=InfoObjectID&token=<token>
For file type PDF, the browser internal PDF viewer is displayed. For XLS, the download is immediately initiated.
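The workaround URL above can be assembled with a small helper; this is a hypothetical sketch, and the host and token values are placeholders:

```javascript
// Build the openDocument workaround URL; scheduleId equals the
// instance's SI_ID from the infostore.
function openDocUrl(host, scheduleId, token) {
  return (
    `${host}/BOE/OpenDocument/opendoc/openDocument.jsp` +
    `?iDocID=${scheduleId}&sIDType=InfoObjectID&token=${encodeURIComponent(token)}`
  );
}

const url = openDocUrl("http://boserver:8080", 12345, "abc==");
console.log(url);
```

Note the token is URL-encoded, since BO logon tokens often contain characters such as `=` that are not safe in a query string.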
Another option is to generate the report directly into a shared location, for example an FTP server. Here is how:
In the "Folders" management area of the CMC, select an object.
Click Actions > Schedule, and access the "Destination" page.
If you are scheduling a Web Intelligence document, click Formats and Destinations.
Select FTP Server as the destination.
For a Web Intelligence document, select FTP Server under "Output Format Details" and then click Destination Options and Settings.
Here is the admin guide where this is explained in more detail (p. 858):
https://help.sap.com/doc/24e00820a014406495980dea5d768d52/XI.3.1/en-US/xi31_sp3_bip_admin_en.pdf
Or you can also check the exact steps from someone who has already done this:
https://blogs.sap.com/2015/06/10/scheduling-webi-report-output-to-ftp-shared-file-location/
After that you can expose your FTP server to the internet and construct a URL for download.
I used the steps below to retrieve a scheduled instance of a Webi report in any format.
Get the list of all the scheduled instances with their IDs.
Method: Get
Headers: X-SAP-LogonToken: <token>
API: <base_url>/raylight/v1/documents/<Report ID>/schedules
From the response of the first API call, select the instance ID you want to download and pass it to the API below.
Method: Get
Headers: X-SAP-LogonToken: <token>
API: <base_url>/infostore/folder/<Instance ID>/file
Save the response as a .wid/.xlsx/.pdf file using the "Save response" -> "Save to a file" option on the response body of the second API call.
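The two calls above can be sketched as request descriptors (no network I/O here; the base URL and port are placeholders for your BO RESTful web service endpoint):

```javascript
// Hypothetical base URL of the BO RESTful web service.
const baseUrl = "http://bo-server:6405/biprws";

// Step 1: list schedule instances of a report.
function scheduleListRequest(reportId, token) {
  return {
    method: "GET",
    url: `${baseUrl}/raylight/v1/documents/${reportId}/schedules`,
    headers: { "X-SAP-LogonToken": token },
  };
}

// Step 2: fetch the stored instance file from the infostore.
function instanceFileRequest(instanceId, token) {
  return {
    method: "GET",
    url: `${baseUrl}/infostore/folder/${instanceId}/file`,
    headers: { "X-SAP-LogonToken": token },
  };
}

const listReq = scheduleListRequest(1234, "tok");
const fileReq = instanceFileRequest(5678, "tok");
```

Any HTTP client (fetch, curl, Postman) can then execute these descriptors; the second response body is the binary report file itself.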
I tried it and this works :)

Nagios - Raise notification for second time occurrence of an error in log file

I have a log file "sample.log" on Windows. The log file is rotated every 4 hours, and each line in it has a timestamp. The log file contains the error message "Connection with Server ABC Failed". I have to ignore this error message on its first occurrence and suppress the notification alert, but raise a notification alert when the same error message occurs in "sample.log" a second time or more. How can this be achieved? Please assist.
There's a large learning curve with this plugin, but I'd use the check_logfiles plugin for this.
https://labs.consol.de/nagios/check_logfiles/
I haven't had to use this particular feature yet, but from the documentation it seems you can achieve your goal by setting criticalthreshold to 2.
Of course you'll also need to specify your criticalpatterns regex, and I'd recommend setting okpatterns if possible.
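A hypothetical search definition along those lines (the tag and log path are placeholders; option names follow the plugin's documented plural spellings):

```perl
# check_logfiles config file, e.g. passed via --config sample.cfg
@searches = ({
  tag               => 'server_abc',
  logfile           => 'C:\logs\sample.log',
  criticalpatterns  => ['Connection with Server ABC Failed'],
  criticalthreshold => 2,   # only every 2nd match raises CRITICAL,
                            # so the first occurrence is ignored
});
```

The plugin keeps a seek/state file between runs, which is what lets it count occurrences across invocations and across the 4-hour rotations.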
In my service definition I also had to be sure to include:
is_volatile 1
Installation instructions are included in the link above.

How to get Yii2 formatted error message in production systems?

When Yii2 is used in debug mode and an error occurs, it shows an error message along with the call trace and session, cookie, and server info.
Sample image shown below.
In production-ready systems, this will not be shown. However, is there a way to pull this formatted HTML into a variable in production systems, so that it can be emailed to the developer to ease debugging? If anyone has any ideas, please let me know.
I tried using \Yii::$app->mailer->render(), passing #vendor/yiisoft/yii2/views/errorHandler/exception.php as the view, ['exception' => $ex] as the data, and a layout file as parameters. I'm getting an "undefined variable handler" error.
Configure log targets for this purpose:
http://www.yiiframework.com/doc-2.0/guide-runtime-logging.html
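For example, Yii2 ships an email log target that mails error-level logs (with stack trace information) to the developer; a minimal sketch for config/web.php, with the recipient address as a placeholder:

```php
'components' => [
    'log' => [
        'targets' => [
            [
                // built-in target that sends log messages via the mailer component
                'class' => 'yii\log\EmailTarget',
                'levels' => ['error'],
                'message' => ['to' => 'dev@example.com'],
            ],
        ],
    ],
],
```

This avoids rendering the debug exception view manually, which depends on variables the error handler injects (hence the "undefined variable handler" error).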

error writing mime multipart body part to output stream

I have code that does async file uploads which works fine on my dev vm but after I deployed it to the client system, I keep getting this error:
"error writing mime multipart body part to output stream"
I know this is the line that is throwing the error but I can't seem to figure out why:
//Read the form data and return an async task.
await Request.Content.ReadAsMultipartAsync(provider);
The file size was only 1 MB, and I even tried different file types with much smaller sizes. Why would this occur? I need ideas.
Since the error message mentions an error while writing to the output stream, check whether the folder the response is being written to has the necessary permissions for your application to write.
You can also get this error if a file with the same name already exists in the destination folder.
I had this issue but I had already set permissions on the destination folder.
I fixed the problem by setting permissions on the App_Data folder (I think this is where the file gets temporarily stored after being uploaded).