Azure Log Analytics - Cannot add data source - azure-log-analytics

I have set up a Log Analytics workspace and installed the MMA on a few computers with the correct workspace ID and workspace key (heartbeats are logged). The location of the workspace is set to North Europe.
I cannot add data sources, as shown here: https://learn.microsoft.com/en-us/azure/azure-monitor/platform/data-sources-windows-events
I have this view in "Advanced settings" for the workspace (screenshot: Advanced settings in the Azure portal for the workspace).

Provided that you have the required access as per this document, I believe the Advanced settings tile opens up as expected after waiting a little longer, say ~10 seconds or so.
Hope this helps! Cheers!

Related

Not able to get Azure SQL Server Extended Events to work when Blob Storage is set to Enabled from selected virtual networks and IP addresses

So I have an Azure Database and want to test extended events with the database.
I was able to set up my Blob Storage container and get Extended Events from the Azure database working as long as the Blob Storage Public network access setting is Enabled from all networks. If I instead set Enabled from selected virtual networks and IP addresses, with Microsoft network routing checked and Resource type set to Microsoft.Sql/servers with the value All in current subscription, it still doesn't work.
I'm not exactly sure what I'm doing wrong and I'm not able to find any documentation on how to make it work without opening up to all networks.
The error I'm getting is:
The target, "5B2DA06D-898A-43C8-9309-39BBBE93EBBD.package0.event_file", encountered a configuration error during initialization. Object cannot be added to the event session. (null) (Microsoft SQL Server, Error: 25602)
Edit - Steps to fix the issue
@Imran: Your answer led me to getting everything working. The information you gave and the link provided were enough for me to figure it out.
However, for anyone in the future I want to give better instructions.
The first step was to run:
Set-AzSqlServer -ResourceGroupName [ResourceGroupName] -ServerName [AzureSQLServerName] -AssignIdentity
This assigns the SQL server an Azure Active Directory identity. After running the above command, you can see your new identity in Azure Active Directory under Enterprise applications: where you see the "Application type == Enterprise Applications" filter, click it, change it to "Managed Identities", and click apply. You should see your new identity.
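If you want to confirm the assignment from PowerShell instead of the portal, something like this should work (names are placeholders):

# The Identity property (with its PrincipalId) should now be populated
(Get-AzSqlServer -ResourceGroupName '[ResourceGroupName]' -ServerName '[AzureSQLServerName]').Identity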
The next step is to give your new identity the Storage Blob Data Contributor role on your container in Blob Storage. Go to your new container and click Access Control (IAM) => Role assignments => Add => Add role assignment => Storage Blob Data Contributor => Managed identity => Select member => click your new identity, click select, and then Review + assign.
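If you prefer to script that role assignment, here is a minimal PowerShell sketch (all names are placeholders; it assumes the Az module and that the identity shows up as a service principal named after the SQL server):

# Look up the SQL server's managed identity (a service principal named after the server)
$identity = Get-AzADServicePrincipal -DisplayName '<AzureSQLServerName>'

# Scope the assignment to the single blob container used for Extended Events
$scope = '/subscriptions/<SubscriptionId>/resourceGroups/<ResourceGroupName>' +
         '/providers/Microsoft.Storage/storageAccounts/<mystorageaccountname>' +
         '/blobServices/default/containers/<mystorageaccountcontainername>'

New-AzRoleAssignment -ObjectId $identity.Id `
    -RoleDefinitionName 'Storage Blob Data Contributor' `
    -Scope $scope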
The last step is to get SQL Server to use the identity when connecting to Blob Storage.
You do that by running the command below on your Azure SQL Server database.
CREATE DATABASE SCOPED CREDENTIAL [https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>]
WITH IDENTITY = 'Managed Identity';
GO
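Note: if the CREATE DATABASE SCOPED CREDENTIAL statement complains about a missing master key, you may first need to create one with CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPassword>'; as far as I know, a database master key is required before database scoped credentials can be created.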
You can see your new credentials when running
SELECT * FROM sys.database_scoped_credentials
The last thing I want to mention: when creating Extended Events for an Azure SQL Server using SSMS, it gives you this link, which only works if you want your Blob Storage wide open. I think this is a disservice, and I wish they would provide instructions for the case where you don't want your Blob Storage wide open, using RBAC instead of SAS.
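For completeness, here is a rough sketch of what creating the event session itself can then look like, run as T-SQL via PowerShell's Invoke-Sqlcmd (the session name, event, and connection details are hypothetical; the storage URL must match the database scoped credential created above):

# Requires the SqlServer PowerShell module (Install-Module SqlServer)
$query = @"
CREATE EVENT SESSION [xe_demo] ON DATABASE
ADD EVENT sqlserver.sql_statement_completed
ADD TARGET package0.event_file
    (SET filename = 'https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>/xe_demo.xel');
ALTER EVENT SESSION [xe_demo] ON DATABASE STATE = START;
"@
Invoke-Sqlcmd -ServerInstance '<AzureSQLServerName>.database.windows.net' `
    -Database '<DatabaseName>' -Username '<AdminUser>' -Password '<AdminPassword>' `
    -Query $query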
I tried to reproduce the same in my environment and got a successful result, as below:
To resolve this issue, check whether your account type is StorageV2 (general-purpose v2). If you have a general-purpose v1 or blob storage account, try upgrading it, as below:
In the storage account -> under Settings, Configuration -> Upgrade
Check whether you have chosen Allow trusted Microsoft services to access this storage account under Exceptions. I also added my client IP address range and VNet to the firewall, as below.
Make sure you have the Microsoft.Authorization/roleAssignments/write permission on your storage account.
After enabling the firewall, write access to the storage account for audit logs is lost; re-saving the audit settings from the portal is required for auditing to function, as below.
Note: Auditing to storage behind firewalls using the user-assigned managed identity authentication type is not presently supported.
When I tried to connect, I got a successful result:
Reference:
Configure extended events in SQL Azure to the blob storage with Private Endpoint - Microsoft Community Hub by Sakshi Gupta

The document creation or update failed because of invalid reference

I am having trouble completing an exercise on the Microsoft Learn platform.
https://learn.microsoft.com/en-us/learn/modules/examine-components-of-modern-data-warehouse/5-exercise-azure-synapse
I have followed the instructions, but get the following error. (Screenshots: error message; source settings.)
Does anyone know what's causing this, and how I can fix the issue?
Regards,
Anders
I ran into this issue too. Unless I missed a step in the Explore Azure Synapse Analytics exercise (https://learn.microsoft.com/en-us/learn/modules/examine-components-of-modern-data-warehouse/5-exercise-azure-synapse), the issue was that the source linked service was not being created for some reason in steps 3 and 4. I suggest doing the following in a different tab, so you can come back to the Ingest Data wizard later and pick up where you left off.
I created the source linked service by going to my workspace and clicking on the "Manage" icon (the one with the wrench on the briefcase). After that, click on "Linked services", then "+ New". At this point, a number of connection icons should appear. In the search bar type "HTTP", click on it, and click "Continue". This brings up the same form you saw in step 3 of the exercise. Fill it out as directed there, and you should have a source linked service of type "HTTP" created.
Screenshot of Buttons to Click on to Create Source Linked Service
As long as you've completed the rest of the steps of the exercise, going back to the Deployment step and proceeding should now allow it to complete successfully.
I followed the same document. When I tried it in my environment, I got the same error.
Solution :
I was able to solve it by connecting with GitHub and troubleshooting the Git integration.
Note: Disconnect from the existing Git repo, reconnect to the same repo, and create a new Git branch. Then use Git to create more commits on top of that branch.
(Screenshots: sample dataset; deployment status; linked service.)
For more detail, please refer to the links below:
https://learn.microsoft.com/en-us/azure/data-factory/source-control#connect-to-a-git-repository
https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-delivery#best-practices-for-cicd

Azure SQL DB Error, This location is not available for subscription

I have a pay-as-you-go subscription and I am creating an Azure SQL server.
While adding the server, on selecting a location, I get this error:
This location is not available for subscriptions
Please help.
There's an actual issue with Microsoft servers. They have too many Azure SQL database creation requests. They're currently trying to handle the situation. This seems to affect all types of subscriptions even paid ones. I have a Visual Studio Enterprise Subscription and I get the same error (This location is not available for subscriptions) for all locations.
See following Microsoft forum thread for more information:
https://social.msdn.microsoft.com/Forums/en-US/ac0376cb-2a0e-4dc2-a52c-d986989e6801/ongoing-issue-unable-to-create-sql-database-server?forum=ssdsgetstarted
As the other answer states, this is a (poorly handled) restriction on Azure as of now, and there seems to be no ETA on when it will be lifted.
In the meantime, you can still get an SQL database up and running in Azure, if you don't mind doing a bit of extra work and don't want to wait - just set up a Docker instance and put MSSQL on it!
In the Azure Portal, create a container instance, using the following Docker image: https://hub.docker.com/r/microsoft/mssql-server-windows-express/
While creating it, you might have to set the ACCEPT_EULA environment variable to "Y".
After it boots up (10-20 minutes for me), connect to it in the portal with the "sqlcmd" command and set up your login. In my case, I just needed a quick demo DB, so I took the "sa" login and ran "alter login SA with password ='{insert your password}'" and "alter login SA enable". See here for details: https://learn.microsoft.com/en-us/sql/t-sql/statements/alter-login-transact-sql?view=sql-server-ver15#examples
And voila, you have a SQL instance on Azure. Although it's unmanaged and poorly monitored, it might be enough for a short-term solution. The IP address of the Docker instance can be found in the Properties section of the container instance blade.
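For reference, the login setup described above looks roughly like this (run from the container's console in the portal; the password is a placeholder):

# Enable the 'sa' login and set its password from inside the container's console
sqlcmd -Q "ALTER LOGIN sa WITH PASSWORD = '<YourStrongPassword>'; ALTER LOGIN sa ENABLE;"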
Maybe you can reference this blog: Azure / SQL Server / This location is not available for subscription. It covers the same error you have.
Run this PowerShell command to check whether the location you chose is available:
Get-AzureRmLocation | Select-Object DisplayName
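Note that Get-AzureRmLocation comes from the now-deprecated AzureRM module; if you are on the newer Az module, the equivalent is:

# Az module equivalent of the AzureRM command above
Get-AzLocation | Select-Object DisplayName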
If the location is available, the best way to resolve this issue is to contact Azure support and have it enabled for you. You can do this for free using the support page in your Azure portal.
They will contact you and help you solve it.
Hope this helps.
This is how I solved it myself. Let me tell you the problem first, then the solution.
Problem: I created a brand new free Azure account (comes with $250 free credit) for a client. Then upgraded to pay-as-you-go subscription. I was unable to create Azure SQL db. The error was 'location is not available'.
How I solved: I created another pay-as-you-go subscription in the same account. Guess what - I was able to create SQL db in my new subscription right away. Then I deleted the first subscription from my account. And yes, I lost the free credit.
If your situation is similar to mine, you can try this.
PS: I have 3 clients with their own Azure accounts. I was able to create SQL Db in all of their accounts. I think the problem arises only for free accounts and/or for free accounts that upgraded to pay-as-you-go accounts.
EDIT - 2020/04/22
This is still an ongoing problem as of today, but I was told by Microsoft support that on April 24th a new Azure cluster will be available in Europe, so it might finally become possible to deploy SQL Server instances on free accounts around then.
Deploy a Docker container running SQL Server
To complement @Filip's answer, and given that the problem still remains with Azure SQL Server, a Docker container running SQL Server is a great alternative. You can set one up very easily by running the following command in the Cloud Shell:
az container create --image microsoft/mssql-server-windows-express --os-type Windows --name <ContainerName> --resource-group <ResourceGroupName> --cpu <NumberOfCPUs> --memory <Memory> --port 1433 --ip-address public --environment-variables ACCEPT_EULA=Y SA_PASSWORD=<Password> MSSQL_PID=Developer --location <SomeLocationNearYou>
<ContainerName> : A container name of your choice
<ResourceGroupName> : The name of a previously created Resource Group
<NumberOfCPUs> : Number of CPUs you want to use
<Memory> : Memory you want to use
<Password> : Your password
<SomeLocationNearYou> : A location near you, for example westeurope
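For example, a hypothetical invocation with the placeholders filled in (PowerShell-style line continuations):

az container create `
    --image microsoft/mssql-server-windows-express `
    --os-type Windows `
    --name demo-mssql `
    --resource-group demo-rg `
    --cpu 2 `
    --memory 4 `
    --port 1433 `
    --ip-address public `
    --environment-variables ACCEPT_EULA=Y SA_PASSWORD=<Password> MSSQL_PID=Developer `
    --location westeurope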
Access SQL Server
Once the container instance is deployed, you will find an IP address in the Overview. Use that IP address and the password you chose in the az container command to connect to the SQL Server, using either Microsoft's SSMS or the sqlcmd utility.
Some documentation regarding the image I have used can be found here.
More information on the command I have used here.

SAP Cloud Platform Trial - The HDI container could not be opened

I'm following the SAP tutorial Create an application using SAP HANA and the Cloud Application Programming model. Steps 1-4 have been successfully completed - I even get the output "1:44:30 PM (HDB) Build of /APP/db completed successfully." at the end of step 4.
When I right-click on the db folder and choose Open HDI Container, the error below occurs.
The applications running in my space are below.
Info about my space.
Below are the service instances in my dev space.
Am I doing something wrong or missing some prerequisite, or does this simply not work in the trial account?
Thanks,
Mike
Assuming your HDI container is indeed there (look under service instances, not applications), it may be a problem with the region. Change the region of your trial to Europe (Frankfurt): https://blogs.sap.com/2019/04/16/how-to-change-the-region-in-your-cloud-foundry-trial/
Cheers,
Lucia.

Chat History and Monitoring Plugin Openfire

I'm new to Openfire. I have a chat application running on Node.js, and a separate chat server with Openfire installed.
I wanted to know:
1. How is chat history for a chat group handled?
2. How do I progressively load chat history in the client from the Openfire server? Should I write a custom routine for this, or does the Monitoring plugin provide one?
3. What is the format in which chats are archived? Is there a way to retrieve them in any given format?
4. Are there any APIs that can be used to access the database? (I doubt it, though.)
I have installed the Monitoring plugin for this. However, I'm not able to find any documentation for Openfire or the Monitoring plugin regarding chat history.
Any help would be much appreciated.
If you have installed the Monitoring plugin, you can read the following in its readme file under the configuration section:
Chat archiving is enabled by default. However, only information about who is communicating and at what time is stored unless chat transcript archiving is enabled. To enable chat transcript archiving or group chat archiving, you will need to log into the admin console and go to:
Server --> Archiving --> Archiving Settings
To enable group chat archiving, you will need to log into the admin console, go to Server --> Archiving --> Archiving Settings, and make sure 'Message Archiving' is enabled for 'Archive one-to-one chats' and/or 'Archive group chats'.
Then the messages get stored in the external database table 'ofMessageArchive'.
The history can be further loaded from the database.
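For question #4: there is no dedicated API for the archive, but you can query the table directly. A rough sketch, assuming the Openfire database happens to run on SQL Server and using PowerShell's Invoke-Sqlcmd (host, database, and conversation ID are hypothetical; adjust the paging syntax for MySQL/PostgreSQL):

# Page through one conversation's history, oldest first, 50 messages at a time
$query = @"
SELECT fromJID, toJID, sentDate, body
FROM ofMessageArchive
WHERE conversationID = 123
ORDER BY sentDate
OFFSET 0 ROWS FETCH NEXT 50 ROWS ONLY;
"@
Invoke-Sqlcmd -ServerInstance '<OpenfireDbHost>' -Database '<OpenfireDb>' -Query $query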
This might help you!
Please log in to your server.
Go to Plugins, find the "Monitoring Service" plugin, and install/enable it if you have not already.
Now go to the "Server" menu, go to "Archiving" then "Archiving Settings"
Check "Archive one-to-one chats"
Check "Archive group chats"
Now click on "Update Settings".
Then, most importantly, you need to click on "Rebuild Index" (the last button on this settings screen).
Then send a message from your app and check your database table "ofMessageArchive".
You will have magic :)
It's working for me; let's see whether it works for you!
Answering question #3:
Chats are archived in the following format in the ofMessageArchive table:
conversationID | fromJID | fromJIDResource | toJID | toJIDResource | sentDate | body