Azure SQL MI - track users logging in to instance using Log Analytics

I'm looking for a way to list who has logged in to a SQL MI instance.
I have configured Log Analytics to receive all logs and metrics, but I see no login information in the logs. It looks like there is a flag in Azure SQL, but it does not apply to SQL MI; this
is where I found the flag.
Is there a way to enable this type of log collection in SQL MI so this information is sent to Log Analytics?
Can I get this info directly from SQL MI, perhaps from a system table?
Thanks for any help you may provide.

Looks like in addition to setting up Log Analytics, a server audit needs to be created in the master database; a sketch of the T-SQL is below. Here is a link to the MSFT article.
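A minimal T-SQL sketch of that setup, assuming the instance's diagnostic settings already send the SQLSecurityAuditEvents category to the Log Analytics workspace; the audit and specification names are just examples:

-- Run in the master database of the managed instance.
-- TO EXTERNAL_MONITOR routes audit records to the diagnostic pipeline
-- (Log Analytics / Event Hubs) instead of a local file target.
USE [master];
GO
CREATE SERVER AUDIT [LoginAuditToLogAnalytics] TO EXTERNAL_MONITOR;
GO
-- Capture successful and failed logins.
CREATE SERVER AUDIT SPECIFICATION [LoginAuditSpec]
FOR SERVER AUDIT [LoginAuditToLogAnalytics]
ADD (SUCCESSFUL_LOGIN_GROUP),
ADD (FAILED_LOGIN_GROUP)
WITH (STATE = ON);
GO
ALTER SERVER AUDIT [LoginAuditToLogAnalytics] WITH (STATE = ON);
GO

Once records start flowing, the login events should show up in the workspace under the SQLSecurityAuditEvents category.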

Related

'sa' user is "pinging" my Azure SQL Database

I have an Azure SQL Database with auditing turned on. I noticed that my database comes online after a pause when it shouldn't. I checked the audit logs and they show strange entries of an 'sa' login trying to do something. I'm not sure what these entries mean. Is this normal activity from Azure, or is somebody trying to connect to my database? I believe there is no such user as 'sa' on Azure SQL databases, or am I wrong? Attaching a screenshot of the audit logs.
The Additional_info column shows these values (they repeat for every event):
<action_info xmlns="http://schemas.microsoft.com/sqlserver/2008/sqlaudit_data">destroyed</action_info>
<action_info xmlns="http://schemas.microsoft.com/sqlserver/2008/sqlaudit_data">event disabled</action_info>
<action_info xmlns="http://schemas.microsoft.com/sqlserver/2008/sqlaudit_data">event enabled<startup_type>automatic</startup_type></action_info>
Tried Google, found nothing.
I created an Azure SQL database in the Azure portal and enabled auditing at the server level with a storage account as the destination.
After that, I enabled auditing at the database level with the same storage account destination.
It enabled successfully, and the containers were created successfully in the storage account.
Looking at the audit records in my log, I am not getting any error related to the 'sa' user.
As per my knowledge, the sa user is the server admin you created during setup of the Azure SQL server.
According to this, once the Azure database is in a paused state, it resumes automatically under the following conditions:
Database connection
Database export or copy
Viewing auditing records
Viewing or applying performance recommendations
Vulnerability assessment
Modifying or viewing data masking rules
Viewing state for transparent data encryption
Modifying serverless configuration such as max vCores, min vCores, or auto-pause delay
Maybe for one of the above reasons the database still comes back online after you pause it.

Azure Log Analytics - SQL Managed Instance logs

I need to retrieve executed SQL queries by using Log Analytics.
I have the Log Analytics workspace and have also configured diagnostic settings (for both the SQL managed instance and the database).
But there is no data when I try to execute the following Kusto query:
AzureDiagnostics | where Category == "QueryStoreRuntimeStatistics"
Should I set up something more to see this data (e.g. enable Query Store, read and write, directly on the database)?
In the AzureDiagnostics table I only have UsageStats and Errors.
We can check SQL diagnostics information with the help of a Log Analytics workspace; below are a few steps to follow:
Create a Log Analytics workspace.
Add a few configurations to SQL, such as enabling auditing and selecting the Log Analytics workspace.
Enable diagnostic settings to Log Analytics.
Select the checkbox to send data to Log Analytics (including the QueryStoreRuntimeStatistics category for the database).
We can understand more about this from one of the Microsoft blogs.
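On the Query Store part of the question: QueryStoreRuntimeStatistics only has something to emit if Query Store is actually collecting data in the database. A minimal T-SQL sketch, with the database name as a placeholder:

-- Turn on Query Store in read-write mode so runtime statistics are collected.
ALTER DATABASE [YourDatabase]   -- placeholder name
SET QUERY_STORE = ON (OPERATION_MODE = READ_WRITE);

-- Verify the current state.
SELECT actual_state_desc, desired_state_desc
FROM sys.database_query_store_options;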

Unable to get the user-id/identity details from Log Analytics workspace captured from Serverless Pool SQLs

I have an Azure Synapse workspace on which I have Synapse Administrator access.
Through the Azure portal I have configured Log Analytics, and I am running a KQL statement to find the list of queries executed, along with the data processed in MB and the user-id/identity of the SQL.
I am not able to find the identity here - is there a setting to fetch it?
Below is the KQL statement
SynapseBuiltinSqlPoolRequestsEnded
| where TimeGenerated > ago(24h)
| evaluate bag_unpack(Properties)
| project startTime, endTime, error, Identity, queryText, command, dataProcessedMB = dataProcessedBytes / 1024 / 1024
Let me know if this is a limitation at the moment or if I am missing anything here.
Thanks,
Aravind
Yes, there is a limitation for this.
Log Analytics does not capture any PII (personally identifiable information), for data compliance reasons.
That said, check the Identity column in SynapseBuiltinSqlPoolRequestsEnded; it has the login name of the user running the query.
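If Log Analytics keeps coming up empty, an alternative worth trying is the serverless SQL pool's own request-history DMV - a sketch only; sys.dm_exec_requests_history exists in the built-in serverless pool, but the exact column names (login_name, data_processed_mb, etc.) are from memory of the docs and may differ:

-- Run against the built-in serverless SQL pool (e.g. from Synapse Studio).
SELECT login_name, start_time, end_time, command, data_processed_mb
FROM sys.dm_exec_requests_history
ORDER BY start_time DESC;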

How to get Azure SQL transactional log

How do I get the transaction logs for an Azure SQL DB? I'm trying to find the log in the Azure portal, but I'm not having any luck.
If there is no way to get the log, where does it say that in the Microsoft docs? Any help is appreciated.
You don't, as it is not exposed in the service. Please step back and describe what problem you'd like to solve. If you want a DR solution, for example, then active geo-replication can solve this for you as part of the service offering.
The log format in Azure SQL DB is constantly changing and is "ahead" of the most recent version of SQL Server. So, it is probably not useful to expose the log (the format is not documented). Your use case will likely determine the alternative question you can ask instead.
Azure SQL Database auditing tracks database events and writes them to an audit log in your Azure storage account, or sends them to Event Hub or Log Analytics for downstream processing and analysis.
Blob audit
Audit logs stored in Azure Blob storage are kept in a container named sqldbauditlogs in the Azure storage account. The directory hierarchy within the container is of the form <ServerName>/<DatabaseName>/<AuditName>/<Date>/. The blob file name format is <CreationTime>_<FileNumberInSession>.xel, where CreationTime is in UTC hh_mm_ss_ms format, and FileNumberInSession is a running index in case session logs span multiple blob files.
For example, for database Database1 on Server1 the following is a possible valid path:
Server1/Database1/SqlDbAuditing_ServerAudit_NoRetention/2019-02-03/12_23_30_794_0.xel
The audit logs of read-only replicas are stored in the same container. The directory hierarchy within the container is of the form <ServerName>/<DatabaseName>/<AuditName>/<Date>/RO/. The blob file name shares the same format.
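To read those .xel files back with T-SQL, a minimal sketch using sys.fn_get_audit_file; the storage account name is a placeholder and the path reuses the example above:

-- Reads all audit files under the given folder; DEFAULT, DEFAULT means
-- start from the first file and the first record.
SELECT event_time, action_id, server_principal_name, database_name, statement
FROM sys.fn_get_audit_file(
    'https://<storageaccount>.blob.core.windows.net/sqldbauditlogs/Server1/Database1/SqlDbAuditing_ServerAudit_NoRetention/2019-02-03/',
    DEFAULT, DEFAULT);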

Working out which BigQuery query I am paying for?

I am new to BigQuery and I have a question regarding billing. I have a recurring (almost daily) charge on my account, and I think it is related to a query I have embedded in a published Tableau report - people are viewing the report and I am being charged - however, the charge is more than I am expecting. How can I track the charge back to the specific query to confirm which one is raising the charge?
Thank you for your help,
Ben
I would start by enabling audit logs and inspecting the logs.
Audit logs are available via Google Cloud Logging, where they can be immediately filtered to provide insights on specific jobs or queries, or exported to Google Cloud Pub/Sub, Google Cloud Storage, or BigQuery.
To analyze your aggregated BigQuery usage using SQL, set up export of audit logs back to BigQuery. For more information about setting up exports from Cloud Logging, see Overview of Logs Export in the Cloud Logging documentation.
Analyzing Audit Logs Using BigQuery: https://cloud.google.com/bigquery/audit-logs
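As a sketch of what that analysis can look like once the export to BigQuery is in place - the project and dataset names here are placeholders, and the field paths follow the legacy AuditData export schema, so they may differ in your setup:

-- Aggregate billed bytes per user and query text from exported audit logs.
SELECT
  protopayload_auditlog.authenticationInfo.principalEmail AS user_email,
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobConfiguration.query.query AS query_text,
  SUM(protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.totalBilledBytes) AS total_billed_bytes
FROM `my-project.my_audit_dataset.cloudaudit_googleapis_com_data_access_*`  -- placeholder export table
WHERE protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.eventName = 'query_job_completed'
GROUP BY user_email, query_text
ORDER BY total_billed_bytes DESC;

This should let you match the recurring charge to the query text that the Tableau report is issuing and to the account it runs under.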