I am facing an issue while running the "AzureActivity" query in Azure Monitor Logs. I am using a free trial subscription; I created a VM and am trying to run the query, but it fails. Please find the screenshot below.
You can create a Log Analytics workspace. Then go to the Azure portal -> your VM -> on the Activity log page, click the Diagnostic settings button -> in Diagnostic settings, click the Add diagnostic setting button -> then you can send all the logs to the Log Analytics workspace. Finally, you can run the query against that Log Analytics workspace.
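If you want to run the same query from a script once the logs are flowing, here is a minimal sketch using the azure-monitor-query and azure-identity packages (the workspace ID is a placeholder you need to replace):

from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

# Authenticate with whatever credential DefaultAzureCredential resolves
# (Azure CLI login, environment variables, managed identity, ...).
client = LogsQueryClient(DefaultAzureCredential())

response = client.query_workspace(
    workspace_id="<your-workspace-id>",  # from the workspace Overview blade
    query="AzureActivity | take 10",     # the same query as in the portal
    timespan=timedelta(days=1),          # look back one day
)

# Print every row of every returned table.
for table in response.tables:
    for row in table.rows:
        print(row)

Note that AzureActivity only returns rows after the diagnostic setting above has had time to ship some activity log data into the workspace.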
I am trying to run a job scheduler in Azure Databricks. While it is running various notebooks, it fails and shows the error below.
As mentioned by @santoznma in a comment, you don't have jobs access control enabled.
Enabling access control for jobs allows job owners to control who can view job results or manage runs of a job.
To enable it follow below steps:
1. Go to the Admin Console.
2. Click the Workspace Settings tab.
3. Click the Cluster, Pool and Jobs Access Control toggle.
4. Click Confirm.
Refer to: Enable jobs access control for your workspace - Azure Databricks.
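Once the toggle is on, individual job permissions can also be managed over the Databricks Permissions REST API. Here is a hedged sketch using the requests library; the workspace URL, token, job ID, and user are all placeholders:

import requests

host = "https://<your-workspace>.azuredatabricks.net"  # hypothetical workspace URL
token = "<personal-access-token>"
job_id = 123  # hypothetical job ID

# Grant a user permission to manage runs of this job.
resp = requests.patch(
    f"{host}/api/2.0/permissions/jobs/{job_id}",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "access_control_list": [
            {"user_name": "someone@example.com", "permission_level": "CAN_MANAGE_RUN"}
        ]
    },
)
resp.raise_for_status()
print(resp.json())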
I have set up a Log Analytics workspace and installed the MMA on a few computers with the correct workspace ID and workspace key (heartbeats are logged). The location of the workspace is set to North Europe.
I cannot add data sources, as shown here: https://learn.microsoft.com/en-us/azure/azure-monitor/platform/data-sources-windows-events
I have this view in "Advanced settings" for the workspace: [screenshot: advanced settings in the Azure portal for the workspace]
Provided that you have the required access as per this document, I believe the Advanced settings tile opens up as expected after you wait a little longer, say ~10 seconds or so.
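If the portal tile keeps misbehaving, the same Windows event data source can also be created programmatically. This is only a sketch assuming the azure-mgmt-loganalytics SDK; the subscription, resource group, and workspace names are placeholders, and you should verify the model shape against your SDK version:

from azure.identity import DefaultAzureCredential
from azure.mgmt.loganalytics import LogAnalyticsManagementClient

client = LogAnalyticsManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Collect Error and Warning events from the Application event log.
client.data_sources.create_or_update(
    resource_group_name="my-rg",          # hypothetical
    workspace_name="my-workspace",        # hypothetical
    data_source_name="applicationEvents",
    parameters={
        "kind": "WindowsEvent",
        "properties": {
            "eventLogName": "Application",
            "eventTypes": [{"eventType": "Error"}, {"eventType": "Warning"}],
        },
    },
)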
Hope this helps! Cheers!
I'm trying to develop a custom authentication for Power BI Report Server and followed the following guide:
https://github.com/Microsoft/Reporting-Services/tree/master/CustomSecuritySample
It's OK, I can authenticate my own way, creating a kind of single sign-on application. However, I get an error when I try to save a report to Power BI Report Server from Power BI Desktop: "Unexpected Error Encountered".
Does anyone know if this is really a problem with my authentication? On another server with the same configuration but without custom authentication, everything works fine.
EDIT:
This is the error I get
I faced a similar issue: I was unable to publish Power BI reports to the report server and got the same "Unexpected Error Encountered" message.
To resolve it, launch Report Server Configuration Manager on the server where Power BI Report Server is configured and go to the "Web Portal URL" tab. Copy the URL configured there, then in Power BI Desktop go to File -> Save As -> Power BI Report Server and paste the copied URL into the "New report server address" box.
Sample URL: http://IPaddress/ReportsPower or http://fqdn/ReportsPower
Click the OK button.
It will list all the reports that are already available; if it is a fresh report, the list will be blank and you will get a success message.
Please note that I tried this on Power BI Desktop Version 2.81.5831.1181 64-bit (May 2020).
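Before pasting the URL into Power BI Desktop, you can sanity-check it against the Report Server REST API, which should answer under <portal-url>/api/v2.0. A hedged sketch with the requests and requests_ntlm packages; the URL and Windows credentials are placeholders:

import requests
from requests_ntlm import HttpNtlmAuth

portal_url = "http://fqdn/ReportsPower"  # the URL copied from Configuration Manager

# List the Power BI reports the server already knows about.
resp = requests.get(
    f"{portal_url}/api/v2.0/PowerBIReports",
    auth=HttpNtlmAuth("DOMAIN\\user", "password"),  # hypothetical account
)
resp.raise_for_status()
print([r["Name"] for r in resp.json().get("value", [])])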
Add your Power BI credentials in Credential Manager under Windows Credentials with the Power BI server URL.
I think this will resolve your problem; I resolved the issue this way.
Thanks
BennyRaja.A
First of all, add only your server's IP/domain name (like 1.1.1.1 or example.com) in Windows Credential Manager along with the username and password; see screenshot 1.
Secondly, click File => Save As => Power BI Report Server and give the full address of the report server, like http://example.com/reports; see screenshot 2.
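If you prefer to script the Credential Manager entry rather than add it by hand, the built-in cmdkey tool can do it. A small sketch via Python's subprocess; the server name, user, and password are placeholders:

import subprocess

# Store a Windows credential for the report server (IP/domain name only).
subprocess.run(
    [
        "cmdkey",
        "/add:example.com",        # server IP/domain name, as noted above
        "/user:DOMAIN\\pbi_user",  # hypothetical account
        "/pass:s3cret",            # consider prompting instead of hard-coding
    ],
    check=True,  # raise if cmdkey reports an error
)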
I am using IBM Worklight 6 and, for auditing purposes, would like to know if it is possible to log the details of tasks performed in the Worklight Console, i.e. log the details when deploying a new version of an app/adapter.
Regards,
Tom
Worklight, or more specifically the application server that Worklight Server is deployed to (WAS, WAS Liberty profile, Tomcat), does not have the ability to filter logs into separate files for the purpose you have mentioned.
What you might be able to do is take the server log and create your own manual filtering (by script, of course). You will need to find the prefix for each action performed and filter on it.
In Eclipse (or your production environment) open server.xml > Logging and change the Console log level from Audit (default) to Info.
This in turn will produce the following log lines in the server log:
[INFO ] FWLSE0084I: Adapter 'aaa' was deployed successfully. [project test]
In your script you can now filter for FWLSE0084I for adapters, and likely similar prefixes for other actions. I am not sure whether all Worklight Console actions have message prefixes, but actions that require a connection to the server likely do.
(Screenshot: http://i.stack.imgur.com/sZ0fj.png)
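Here is a minimal sketch of that manual filtering in Python: scan the server log and keep only lines whose Worklight message ID is in an allow-list. The log path and the set of message IDs are assumptions; extend the set as you discover the IDs for other console actions:

import re

AUDIT_IDS = {"FWLSE0084I"}  # adapter deployed successfully; add other IDs here
LOG_PATH = "/path/to/liberty/usr/servers/worklight/logs/messages.log"  # hypothetical

# Worklight message IDs look like FWLSE plus four digits and a severity letter.
pattern = re.compile(r"\b(FWLSE\d{4}[IWE])\b")

with open(LOG_PATH, encoding="utf-8") as log:
    for line in log:
        match = pattern.search(line)
        if match and match.group(1) in AUDIT_IDS:
            print(line.rstrip())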
I have an issue with the data sources that are created through the Pentaho Admin Console. My MySQL IP has changed, and I made the modification in my BI server.
But the Pentaho Admin Console is not up!
From the logs I found the following:
01:24:40,370 ERROR [Logger] misc-org.pentaho.platform.engine.services.connection.datasource.dbcp.PooledDatasourceSystemListener: PooledDatasourceSystemListener.ERROR_0003 - Unable to pool datasource object: MyLocalDatabase caused by com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
01:24:40,857 WARN [PersistenceEngine] Falling back to built-in config
The MyLocalDatabase data source created in the Pentaho Admin Console has to be updated with the new IP. Can anyone help me find out which file contains the data sources that are created through the Pentaho Admin Console?
My PAC is down, and there is no error in the server.log file.
Datasource information is kept in the hibernate database.
By default, this is kept in a Hypersonic (HSQLDB) database that is launched when you start the BI server. Check context.xml in webapps/pentaho/META-INF to make sure.
There's a DATASOURCE table in there that stores the data source definitions.
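If you want to fix the stored IP directly, you can connect to that hibernate database over JDBC and update the row. A hedged sketch with the jaydebeapi package; the JDBC URL, credentials, driver jar path, and the URL column name are all assumptions taken from a default Pentaho install, so verify them against your context.xml first:

import jaydebeapi

conn = jaydebeapi.connect(
    "org.hsqldb.jdbcDriver",
    "jdbc:hsqldb:hsql://localhost:9001/hibernate",  # URL as found in context.xml
    ["hibuser", "password"],                        # default Pentaho credentials
    "/path/to/hsqldb.jar",                          # hypothetical driver path
)
try:
    cur = conn.cursor()
    # Inspect the current definition first.
    cur.execute("SELECT * FROM DATASOURCE WHERE NAME = 'MyLocalDatabase'")
    print(cur.fetchall())
    # Then point the JDBC URL at the new MySQL IP (column name is an assumption).
    cur.execute(
        "UPDATE DATASOURCE SET URL = ? WHERE NAME = 'MyLocalDatabase'",
        ("jdbc:mysql://NEW_IP:3306/mydb",),
    )
    conn.commit()
    cur.close()
finally:
    conn.close()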