Unable to create Azure SQL Data Warehouse, resource does not exist - azure-sql-database

Applied for preview access on behalf of my client roughly a month ago. Preview access was approved yesterday. Followed the instructions in the email, namely to create a logical SQL server and reply with the name. Was notified today that the approval process for the logical SQL server was completed.
Now, when trying to create a new Azure SQL Data Warehouse, it sits there for about an hour before failing on an Update SQL database event. The properties mention statusCode NotFound, and the statusMessage contains code 45181 and the message "Resource with the name '' does not exist. To continue, specify a valid resource name."
What am I missing here?

Thanks for alerting us to this issue. We have investigated it and have unblocked the preview access for your subscription. You should be able to proceed and provision an Azure SQL Data Warehouse. You can follow the Get Started guide to provision your first database.
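For reference, provisioning can also be scripted. The following is a minimal sketch with the Azure CLI; the resource group, server, database name and service objective are placeholders, not values from this case:

# Create an Azure SQL Data Warehouse database on the approved logical server.
# All names below are placeholders.
az sql dw create \
  --resource-group MyResourceGroup \
  --server myserver \
  --name mydw \
  --service-objective DW100c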

Related

'sa' user is "pinging" my Azure SQL Database

I have an Azure SQL Database with Auditing turned on. I noticed that my database comes online after a pause when it shouldn't. I checked the audit logs and they show strange entries of an 'sa' login trying to do something. I'm not sure what these entries mean. Is this normal activity from Azure, or is somebody trying to connect to my database? I believe there is no such user as 'sa' on Azure SQL databases, or am I wrong? Attaching the screenshot of audit logs.
Additional_info column shows these values (they repeat for every event).
<action_info xmlns="http://schemas.microsoft.com/sqlserver/2008/sqlaudit_data">destroyed</action_info>
<action_info xmlns="http://schemas.microsoft.com/sqlserver/2008/sqlaudit_data">event disabled</action_info>
<action_info xmlns="http://schemas.microsoft.com/sqlserver/2008/sqlaudit_data">event enabled<startup_type>automatic</startup_type></action_info>
Tried Google, found nothing.
I created an Azure SQL database in the Azure portal, and I enabled auditing at the server level with a storage account as the destination.
Image for reference:
After that, I enabled auditing at the database level with the same storage account as the destination.
Image for reference:
It enabled successfully, and the containers were created successfully in the storage account.
Image for reference:
Audit Records:
Here is my log
In this way, I am not getting any error related to the 'sa' user.
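For comparison, the same server-level and database-level audit configuration can be scripted. This is only a rough sketch with the Azure CLI, and every name below is a placeholder rather than a value from the question:

# Enable server-level auditing to a storage account (placeholder names).
az sql server audit-policy update \
  --resource-group MyResourceGroup \
  --name myserver \
  --state Enabled \
  --blob-storage-target-state Enabled \
  --storage-account mystorageaccount

# Enable database-level auditing with the same storage destination.
az sql db audit-policy update \
  --resource-group MyResourceGroup \
  --server myserver \
  --name MyDatabase \
  --state Enabled \
  --blob-storage-target-state Enabled \
  --storage-account mystorageaccount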
As far as I know, the 'sa' user is the admin you created during setup of the Azure SQL server.
According to this, once the Azure SQL database is in a paused state, it resumes automatically under the following conditions:
Database connection
Database export or copy
Viewing auditing records
Viewing or applying performance recommendations
Vulnerability assessment
Modifying or viewing data masking rules
Viewing the state of transparent data encryption
Modifying serverless configuration such as max vCores, min vCores, or auto-pause delay
Maybe for one of the reasons above, the database still comes back online when you pause it.
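If the database is serverless, you can check its current state and adjust the auto-pause behaviour from the Azure CLI. A minimal sketch, with placeholder names:

# Show the current status and auto-pause delay of a serverless database.
az sql db show \
  --resource-group MyResourceGroup \
  --server myserver \
  --name MyDatabase \
  --query "{name:name, status:status, autoPauseDelay:autoPauseDelay}"

# Change the auto-pause delay (in minutes); -1 disables auto-pausing.
az sql db update \
  --resource-group MyResourceGroup \
  --server myserver \
  --name MyDatabase \
  --auto-pause-delay 60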

Azure Data Factory fails to create Azure SQL linked service utilising Managed private endpoint

I have created and approved a managed private endpoint in Azure Data Factory, targeting my Azure SQL server (which has public network access disabled).
I have also created a database user for the System Assigned Managed Identity.
When attempting to add a new linked service in the Data Factory portal, I am able to select my Azure subscription and the Server name, as shown in the screenshot below. However, the Database name dropdown never moves beyond "Loading..."
Attempting to create the linked service via Bicep instead seems to succeed - but reviewing the linked services blade, the linked service is not "Using private endpoint" - and my data pipeline fails.
Fix: Ensure the SQL server name is all lowercase.
Checking my browser console whilst the above screen was displayed, I noticed an error relating to validation of the Server name, specifically "Servername cannot be empty or null. It can only be made up of lowercase letters, the numbers 0-9 and the hyphen. The hyphen may not lead or trail in the name."
My server name contained capital letters (although the Data Factory UI was rendering it in all lower case).
After recreating my Azure SQL server, with a name complying with the requirements above, I was able to set up the linked service without issue (both through the UI and through Bicep).
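For anyone recreating the server from a script rather than the portal, here is a minimal Azure CLI sketch using a compliant, all-lowercase placeholder name; the resource group, location and credentials are placeholders as well:

# Create an Azure SQL logical server whose name uses only lowercase letters,
# digits and non-leading/trailing hyphens; all values are placeholders.
az sql server create \
  --resource-group MyResourceGroup \
  --name my-sql-server-01 \
  --location westeurope \
  --admin-user sqladminuser \
  --admin-password '<strong-password>'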
Try enabling the interactive authoring option.
https://learn.microsoft.com/en-us/azure/data-factory/tutorial-copy-data-portal-private

How to get Azure SQL transactional log

How do I get the transaction logs for an Azure SQL DB? I'm trying to find the log in the Azure portal but not having any luck.
If there is no way to get the log, where is that stated in the Microsoft docs? Any help is appreciated.
You don't, as it is not exposed in the service. Please step back and describe what problem you'd like to solve. If you want a DR solution, for example, then active geo-replication can solve this for you as part of the service offering.
The log format in Azure SQL DB is constantly changing and is "ahead" of the most recent version of SQL Server. So, it is probably not useful to expose the log (the format is not documented). Your use case will likely determine the alternative question you can ask instead.
Azure SQL Database auditing tracks database events and writes them to an audit log in your Azure storage account, or sends them to Event Hub or Log Analytics for downstream processing and analysis.
Blob audit
Audit logs stored in Azure Blob storage are stored in a container named sqldbauditlogs in the Azure storage account. The directory hierarchy within the container is of the form <ServerName>/<DatabaseName>/<AuditName>/<Date>/. The blob file name format is <CreationTime>_<FileNumberInSession>.xel, where CreationTime is in UTC hh_mm_ss_ms format, and FileNumberInSession is a running index in case session logs span multiple blob files.
For example, for database Database1 on Server1 the following is a possible valid path:
Server1/Database1/SqlDbAuditing_ServerAudit_NoRetention/2019-02-03/12_23_30_794_0.xel
Audit logs of read-only replicas are stored in the same container. The directory hierarchy within the container is of the form <ServerName>/<DatabaseName>/<AuditName>/<Date>/RO/. The blob file name shares the same format.
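As an illustration of that layout, the following Azure CLI sketch lists and downloads the audit blobs for the example path above; the storage account name is a placeholder, and authentication flags (for example --auth-mode login or --account-key) are omitted:

# List the audit .xel blobs for Database1 on Server1 (placeholder account name).
az storage blob list \
  --account-name mystorageaccount \
  --container-name sqldbauditlogs \
  --prefix "Server1/Database1/SqlDbAuditing_ServerAudit_NoRetention/2019-02-03/" \
  --output table

# Download one log file; .xel files can then be opened in SSMS
# or read with sys.fn_get_audit_file.
az storage blob download \
  --account-name mystorageaccount \
  --container-name sqldbauditlogs \
  --name "Server1/Database1/SqlDbAuditing_ServerAudit_NoRetention/2019-02-03/12_23_30_794_0.xel" \
  --file 12_23_30_794_0.xel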

How do I avoid "Error creating scheduled query: The caller does not have permission" when setting up a BigQuery scheduled query?

I am trying to set up a scheduled query in BigQuery through the web UI in GCP but I am getting:
Error creating scheduled query: The caller does not have permission
I am creating this query outside of my organisation (under another organisation), but my email address has been given bigquery.transfers.update permission within the other organisation. Previously the error message specified that I needed that permission, now it is much more generic - as above. The query runs as expected without scheduling.
Any help would be really appreciated!
In order to schedule queries, you also need to add bigquery.datasets.update. Since you are already able to run queries against that table, you shouldn't have any further problem achieving that.
These are the minimum permissions required to schedule queries in BigQuery (assuming that you already have enough permissions to read the tables and dataset data).
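One way to grant exactly those permissions is a small custom role bound to the user who creates the scheduled query. This is only a sketch; the project ID, role ID and email address are placeholders:

# Create a custom role bundling the permissions mentioned above.
gcloud iam roles create scheduledQueryUser \
  --project my-project \
  --title "Scheduled query user" \
  --permissions bigquery.transfers.update,bigquery.datasets.update

# Grant the custom role to the user who will create the scheduled query.
gcloud projects add-iam-policy-binding my-project \
  --member "user:someone@example.com" \
  --role "projects/my-project/roles/scheduledQueryUser"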

How to get notified upon database updates in SQL Azure

We host data in SQL Azure. We used to have Query Notifications when we hosted data on-premises; how do we address this in the case of SQL Azure, i.e. get notified of update and insert events other than by polling from code?
Or do any other Azure services support this, e.g. Mobile Services? Notification Hubs? Newbie to the Azure offering here...
Thanks for the help!
You can create a trigger (a special type of stored procedure) that is automatically executed after an insert happens. Documentation for triggers is here: https://technet.microsoft.com/en-us/library/ms189799(v=sql.120).aspx
As Joe states, you will not be able to send an email out of SQL Database, though.
Depending on how quickly you need the notification after the insert, maybe you could make an insert into yet another table from within the trigger and pull the data as Joe says.
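To make the trigger-plus-polling idea concrete, here is a rough sketch run through sqlcmd; the server, database, credentials and the dbo.Orders source table are assumptions for illustration, not objects from the question:

# Create a small queue table that the trigger will write to (placeholder names).
sqlcmd -S tcp:myserver.database.windows.net -d MyDatabase -U sqladminuser -P '<password>' -Q "
CREATE TABLE dbo.ChangeQueue (
    Id         INT IDENTITY(1,1) PRIMARY KEY,
    OrderId    INT NOT NULL,
    InsertedAt DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
);"

# Create an AFTER INSERT trigger on the assumed dbo.Orders table that records
# each newly inserted row in the queue table.
sqlcmd -S tcp:myserver.database.windows.net -d MyDatabase -U sqladminuser -P '<password>' -Q "
CREATE TRIGGER dbo.trg_Orders_Insert ON dbo.Orders
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.ChangeQueue (OrderId)
    SELECT i.OrderId FROM inserted AS i;
END;"

A worker process can then poll dbo.ChangeQueue (and delete the rows it has handled) instead of scanning the full source table.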