How do I get the transaction logs for an Azure SQL DB? I'm trying to find the log in the Azure portal but am not having any luck.
If there is no way to get the log, is that stated anywhere in the Microsoft docs? Any help is appreciated.
You don't, as it is not exposed in the service. Please step back and describe the problem you'd like to solve. If you want a DR solution, for example, then active geo-replication can solve this for you as part of the service offering.
The log format in Azure SQL DB is constantly changing and is "ahead" of the most recent version of SQL Server. So, it is probably not useful to expose the log (the format is not documented). Your use case will likely determine the alternative question you can ask instead.
Azure SQL Database auditing tracks database events and writes them to an audit log in your Azure storage account, or sends them to Event Hub or Log Analytics for downstream processing and analysis.
Blob audit
Audit logs stored in Azure Blob storage are kept in a container named sqldbauditlogs in the Azure storage account. The directory hierarchy within the container is of the form <ServerName>/<DatabaseName>/<AuditName>/<Date>/. The blob file name format is <CreationTime>_<FileNumberInSession>.xel, where CreationTime is in UTC hh_mm_ss_ms format and FileNumberInSession is a running index in case a session's logs span multiple blob files.
For example, for database Database1 on Server1 the following is a possible valid path:
Server1/Database1/SqlDbAuditing_ServerAudit_NoRetention/2019-02-03/12_23_30_794_0.xel
Audit logs for read-only replicas are stored in the same container. The directory hierarchy within the container is of the form <ServerName>/<DatabaseName>/<AuditName>/<Date>/RO/, and the blob file name shares the same format.
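If you want to inspect these .xel files with T-SQL, sys.fn_get_audit_file can read them directly from Blob storage. A minimal sketch, assuming a storage account named mystorageaccount (a placeholder) and the example path above:

-- Read the audit events for Database1 on Server1 from 2019-02-03.
SELECT event_time, action_id, server_principal_name, database_name, statement
FROM sys.fn_get_audit_file(
    'https://mystorageaccount.blob.core.windows.net/sqldbauditlogs/Server1/Database1/SqlDbAuditing_ServerAudit_NoRetention/2019-02-03/',
    DEFAULT, DEFAULT);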
I'm looking for a way to list out who logged in to a SQL MI instance.
I have configured Log Analytics to receive all logs and metrics, but I see no login information in the logs. It looks like there is a flag in Azure SQL, but it does not apply to SQL MI; this is where I found the flag.
Is there a way to enable this type of log collection in SQL MI so that this information is sent to Log Analytics?
Can I get this info directly from SQL MI, perhaps from a system table?
Thanks for any help you may provide.
It looks like, in addition to setting up Log Analytics, a server audit needs to be created in the master database. Here is a link to the MSFT article.
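For reference, a minimal T-SQL sketch of that setup, assuming the audit and specification names LoginAudit and LoginAuditSpec (placeholders); run it in the master database of the managed instance:

-- TO EXTERNAL_MONITOR routes audit events to the diagnostic pipeline
-- (Log Analytics / Event Hub, per the instance's diagnostic settings).
CREATE SERVER AUDIT [LoginAudit] TO EXTERNAL_MONITOR;
GO
CREATE SERVER AUDIT SPECIFICATION [LoginAuditSpec]
FOR SERVER AUDIT [LoginAudit]
    ADD (SUCCESSFUL_LOGIN_GROUP),
    ADD (FAILED_LOGIN_GROUP)
WITH (STATE = ON);
GO
-- Finally, turn the audit itself on.
ALTER SERVER AUDIT [LoginAudit] WITH (STATE = ON);
GO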
I have an Azure SQL Database with auditing turned on. I noticed that my database comes back online after a pause when it shouldn't. I checked the audit logs and they show strange entries of an 'sa' login trying to do something. I'm not sure what these entries mean. Is this normal activity from Azure, or is somebody trying to connect to my database? I believe there is no 'sa' user on Azure SQL databases, or am I wrong? I'm attaching a screenshot of the audit logs.
The Additional_info column shows these values (they repeat for every event):
<action_info xmlns="http://schemas.microsoft.com/sqlserver/2008/sqlaudit_data">destroyed</action_info>
<action_info xmlns="http://schemas.microsoft.com/sqlserver/2008/sqlaudit_data">event disabled</action_info>
<action_info xmlns="http://schemas.microsoft.com/sqlserver/2008/sqlaudit_data">event enabled<startup_type>automatic</startup_type></action_info>
Tried Google, found nothing.
I created an Azure SQL database in the Azure portal and enabled auditing at the server level with a storage account as the destination.
After that, I enabled auditing at the database level with the same storage account destination.
It enabled successfully, and the containers were created successfully in the storage account.
Audit records: here is my log. In this way, I am not getting any errors related to the sa user.
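If you want to isolate such entries with T-SQL instead of scrolling the portal viewer, a hedged sketch (the storage URL is a placeholder for your audit container):

SELECT event_time, action_id, succeeded, additional_information
FROM sys.fn_get_audit_file(
    'https://mystorageaccount.blob.core.windows.net/sqldbauditlogs/',
    DEFAULT, DEFAULT)
WHERE server_principal_name = 'sa'
ORDER BY event_time;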
As far as I know, the sa user is the admin you created during setup of the Azure SQL server.
According to this, once the Azure database is in a paused state, it resumes automatically under the following conditions:
Database connection
Database export or copy
Viewing auditing records
Viewing or applying performance recommendations
Vulnerability assessment
Modifying or viewing data masking rules
Viewing the state of transparent data encryption
Modifying the serverless configuration, such as max vCores, min vCores, or auto-pause delay
Maybe for one of the reasons above, your database still comes back online after you pause it.
I need to load data into Azure SQL Hyperscale incrementally.
The source data is in an Azure VM that has SQL Server installed.
The source database is about 6 TB in size and has about 370 tables.
We need a way to get the incremental changes from the last X hours and sync them into the same database in Hyperscale.
Ideally, we would extend our database with an availability group setup, but since Hyperscale does not support that, we need to find a way to keep these in sync.
Source database does have change data capture enabled.
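Since CDC is already enabled on the source, one building block for pulling the last X hours of changes is the CDC net-changes function. A sketch, assuming a capture instance named dbo_MyTable (a placeholder) that was enabled with @supports_net_changes = 1:

DECLARE @from_lsn binary(10), @to_lsn binary(10);
-- Map the last 6 hours to an LSN range.
SET @from_lsn = sys.fn_cdc_map_time_to_lsn(
    'smallest greater than or equal', DATEADD(HOUR, -6, GETDATE()));
SET @to_lsn = sys.fn_cdc_map_time_to_lsn(
    'largest less than or equal', GETDATE());
-- One net row per changed key; __$operation distinguishes
-- insert, update, and delete.
SELECT *
FROM cdc.fn_cdc_get_net_changes_dbo_MyTable(@from_lsn, @to_lsn, N'all');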
The best online migration option is to use the Azure Database Migration Service (link), which supports the online (continuous sync) migration scenario (link) you need:
The sync essentially runs in the background until completed, while the data that has already been migrated remains accessible. I believe this is a continuous-copy scenario, not an incremental one. With PaaS database services, you do not have access to perform snapshot replication operations from external data sources. The Hyperscale instance is built upon snapshot replication, but it currently only serves the hosted database functionality.
Regards,
Mike
We want to use Azure servers to run our Power Apps applications; however, we have local SQL servers that contain our data warehouse. We want only certain tables to be on Azure, and we want to create data feeds between the two, with information flowing from one to the other.
Does anyone have any insight into how I can achieve this?
I have googled but there doesn't appear to be a wealth of information on this topic.
It depends on how quickly after a change in your source (the on-premises SQL Server) you need that change reflected in your sink (Azure SQL).
If you have a few minutes of leeway, or even only need to update once a day, I would suggest a basic Data Factory pipeline (search for "data factory upsert"). How you achieve this depends on your data.
If you need it faster, or it is impossible to extract an incremental update from your source, you would need to either use triggers to write the changes from one database to the other, or get a tool that does change data capture for you; a trigger-based sketch follows.
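A minimal sketch of the trigger option; dbo.Customer and dbo.Customer_Changes are hypothetical names, and a separate job would push the logged keys to Azure SQL:

CREATE TRIGGER trg_Customer_Track
ON dbo.Customer
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- Record the keys touched by this statement; UNION removes
    -- duplicates when a row appears in both inserted and deleted
    -- (i.e., an update).
    INSERT INTO dbo.Customer_Changes (CustomerId, ChangedAt)
    SELECT CustomerId, SYSUTCDATETIME() FROM inserted
    UNION
    SELECT CustomerId, SYSUTCDATETIME() FROM deleted;
END;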
It looks like you just want to sync the data in some tables between a local SQL Server and an Azure SQL database.
You can use Azure SQL Data Sync.
Summary:
SQL Data Sync is a service built on Azure SQL Database that lets you synchronize the data you select bi-directionally across multiple SQL databases and SQL Server instances.
With Data Sync, you can keep data synchronized between your on-premises databases and Azure SQL databases to enable hybrid applications.
A Sync Group has the following properties:
The Sync Schema describes which data is being synchronized.
The Sync Direction can be bi-directional or can flow in only one direction. That is, the Sync Direction can be Hub to Member, Member to Hub, or both.
The Sync Interval describes how often synchronization occurs.
The Conflict Resolution Policy is a group-level policy, which can be Hub wins or Member wins.
As a next step, you need to learn how to configure Data Sync. Please refer to this Azure document: Tutorial: Set up SQL Data Sync between Azure SQL Database and SQL Server on-premises.
In this tutorial, you learn how to set up Azure SQL Data Sync by creating a sync group that contains both Azure SQL Database and SQL Server instances. The sync group is custom configured and synchronizes on the schedule you set.
Hope this helps.
The most robust solution here is transactional replication. You can also use SSIS or Azure Data Factory to copy tables to or from Azure SQL Database. Azure SQL Data Sync also exists.
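For the transactional replication route, note that Azure SQL Database can only be a push subscriber; the publisher and distributor stay on premises. A sketch of the subscription step with placeholder names, assuming the publication already exists:

-- Run at the on-premises publisher, in the published database.
EXEC sp_addsubscription
    @publication = N'MyPublication',
    @subscriber = N'myserver.database.windows.net',
    @destination_db = N'MyAzureDb',
    @subscription_type = N'Push',
    @sync_type = N'automatic';
-- A distribution agent job with SQL authentication credentials for the
-- Azure SQL endpoint is still needed (see sp_addpushsubscription_agent).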
I have a Cloud hosted Database, maintained by a different company. Here is my scenario:
I would like to bring down the entire database, schema and data, and keep my local database (SQL Server 2008 R2) updated in near real time.
I cannot setup replication inside SQL server, I do not have permissions on the cloud server.
I cannot use triggers.
I cannot use linked servers
They would provide me a copy of the backup (nightly) and access to the transaction logs every hour.
How can I use these to update my entire database?
Thanks in advance
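For what it's worth, with a nightly full backup plus hourly log backups, the usual pattern is a manual log-shipping restore chain. A sketch with placeholder names and paths:

-- Restore the nightly full backup without recovering,
-- so that log backups can still be applied.
RESTORE DATABASE [MyDb]
    FROM DISK = N'C:\Backups\MyDb_full.bak'
    WITH NORECOVERY, REPLACE;
-- Apply each hourly log backup in sequence; STANDBY keeps the
-- database readable between restores.
RESTORE LOG [MyDb]
    FROM DISK = N'C:\Backups\MyDb_log_01.trn'
    WITH STANDBY = N'C:\Backups\MyDb_undo.dat';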