DACPAC deployment fails on schemas using an EXTERNAL DATA SOURCE (SQL Azure)

(Here is a related question with a similar issue:
Publish to SQL Azure fails with 'Cannot drop the external data source' message.)
There is this new pattern called Azure SQL Database Elastic Query.
The gist of it is captured here:
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-elastic-query-overview
Now, I have a SqlProj that defines:
an External Data Source (EDS), and
a Database Scoped Credential (DSC).
To keep passwords out of the script, I am using SqlCmd variables (sketched below).
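For reference, the credential piece looks roughly like this with SqlCmd variables (a sketch; the variable and credential names are hypothetical):

-- A master key is required before a database scoped credential can be created
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '$(MasterKeyPassword)';
GO
-- The secret is supplied at publish time via SQLCMD variables, not stored in the script
CREATE DATABASE SCOPED CREDENTIAL [ElasticQueryCred]
WITH IDENTITY = '$(CredentialIdentity)',
     SECRET = '$(CredentialSecret)';
GO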
I probably have a gazillion views on "external" tables, based on the elastic query pattern.
And during DACPAC deployments to SQL Azure, I always get this error:
Error SQL72014: .Net SqlClient Data Provider: Msg 33165, Level 16, State 1, Line 1 Cannot drop the external data source 'XXXX' because it is used by an external table.
Error SQL72045: Script execution error. The executed script:
DROP EXTERNAL DATA SOURCE [XXXX];
Checking the logs, I realize there are all these views/tables that exist and use this EDS/DSC combo.
The workaround comes at a price that keeps getting steeper.
So the question is: has anyone else hit this problem and found the root cause?
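For context, the workaround amounts to manually dropping the dependent external objects before the generated script tries to drop the data source, e.g. in a pre-deployment step (a sketch; the table name is hypothetical):

-- Drop the external tables bound to the data source so DROP EXTERNAL DATA SOURCE can succeed
IF EXISTS (SELECT 1 FROM sys.external_tables WHERE name = N'MyExternalTable')
    DROP EXTERNAL TABLE [dbo].[MyExternalTable];
GO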

Related

Backup database - SQL Server

I need to make a backup of my SQL Server database. When I try, I get this error:
System.Data.SqlClient.SqlError: Read on "c:..." failed: 23(...)(Data error (cyclic redundancy error))
Now, I'm trying to run this command:
DBCC CheckDB ('MYDATABASE') WITH NO_INFOMSGS, ALL_ERRORMSGS
But I get this error
Msg 8921, Level 16, State 1, Line 18
Check terminated. A failure was detected while collecting facts. Possibly tempdb out of space or a system table is inconsistent. Check previous errors.
What can I do? I just need to make a backup.
I'm using Microsoft SQL Server Management Studio.
First of all, check the service account used for the SQL Server instance in Services.
Ensure the service account has enough permission to read/write at the exact backup location on the physical disk.
Ensure that the user (the login you use to connect to the SQL instance) has enough permission to perform a backup.
The final option to recover the data is to create another database with the same tables (blank) on a different machine and SQL instance, then export the whole database to the new one using Management Studio (right-click the database > Tasks > Export Data).
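If read errors keep blocking BACKUP itself, one documented last resort (not mentioned above) is a damage-tolerant backup; a sketch, with a hypothetical target path:

-- CONTINUE_AFTER_ERROR lets the backup proceed past damaged pages;
-- the path must be writable by the SQL Server service account
BACKUP DATABASE [MYDATABASE]
TO DISK = N'D:\Backups\MYDATABASE.bak'
WITH CONTINUE_AFTER_ERROR, STATS = 10;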

Azure SQL Database sys.fn_get_audit_file or sys.fn_xe_file_target_read_file troubles

I'm having trouble on an Azure SQL Database where I'm trying to read DB audit logs.
Both functions, sys.fn_get_audit_file and sys.fn_xe_file_target_read_file, should be able to read a file.
But whatever I do, I get blank tables. Even if I specify a non-existent file, I receive a table with zero records instead of an error.
So I'm afraid it's something else.
My login is in the db_owner group.
Any suggestions?
I found that I could only read XEL files by using the same server and same database context that they were created for. So for example, consider the following scenario:
ServerA is the Azure Synapse instance I was creating the audit XEL files from, all related to DatabaseA
ServerB is a normal SQL instance that I want to read the XEL files on
Test 1:
Using ServerB, try to read file directly from blob storage
Result: 0 rows returned, no error message
Test 2:
Using ServerB, download the XEL files locally, and try to read from the local copy
Result: 0 rows returned, no error message
Test 3:
Using ServerA, with the current DB = 'master', try to read file directly from blob storage
Result: 0 rows returned, no error message
Test 4:
Using ServerA, with the current DB = 'DatabaseA', try to read file directly from blob storage
Result: works perfectly
Because I really wanted to read the files from ServerB, I also tried doing a CREATE CREDENTIAL there that was able to read and write to my blob storage account. Unfortunately, that didn't make any difference; a repeat of Test 1 gave the same result as before.
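For reference, the call pattern that worked in Test 4 looked roughly like this (the storage path is hypothetical):

-- Run on ServerA with DatabaseA as the current database
SELECT TOP (100) event_time, action_id, succeeded, statement
FROM sys.fn_get_audit_file(
    N'https://myaccount.blob.core.windows.net/sqldbauditlogs/ServerA/DatabaseA/',
    DEFAULT, DEFAULT);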

How do I add external tables to Azure SQL DB through Visual Studio/SSDT

My question is: how do you develop and deploy a database project to Azure SQL DB that uses external tables?
I'm using Visual Studio 2017 database projects to manage Azure SQL databases. I've been following the path of building the database in VS and then hitting the Publish button, and it works great.
Now I am trying to add external tables for doing Elastic Queries. In SSMS, I created the external data source and credential for it, and a remote table:
CREATE EXTERNAL DATA SOURCE [RemoteServer]
WITH (TYPE = RDBMS, LOCATION = N'myserver.database.windows.net',
CREDENTIAL = [RemoteUser], DATABASE_NAME = N'MyRemoteDb')
GO
CREATE EXTERNAL TABLE dbo.MyTable (
ID int not null,
MyColumn varchar(10) not null
) WITH (
DATA_SOURCE = [RemoteServer]
);
GO
(Credential for RemoteUser also exists.)
SELECT * FROM MyTable
produces the expected result and all is fine in the world.
Now I want to add this to my database project so it can be deployed via publishing (in VS) for any future changes (and saved in source control).
I am getting the error:
SQL71501: External Data Source: [RemoteServer] has an unresolved reference to Database Scoped Credential [RemoteUser].
I can remove the error by setting the Build Action of all the external objects to None, but that defeats the purpose of being able to publish from Visual Studio.
I've also tried removing the credential reference from the data source definition, so it no longer appears there. That removed the build error, but publish then fails because the required credential option is missing:
CREATE EXTERNAL DATA SOURCE [RemoteServer]
WITH (TYPE = RDBMS, LOCATION = N'myserver.database.windows.net',
DATABASE_NAME = N'MyRemoteDb')
GO
SQL72014: .Net SqlClient Data Provider: Msg 46505, Level 16, State 15, Line 1 Missing required external DDL option 'CREDENTIAL'.
Any suggestions?
Elastic query in Azure is still in preview at the time of writing, and Microsoft lists a number of limitations on the overview page. Unfortunately, one of them concerns scripting of external tables and external data sources via SSDT; it is noted in the limitations list on the overview page linked in the first question above (see the second item regarding SSDT).
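Until that limitation is lifted, a common workaround is to set the external objects' Build Action to None and create them from an idempotent post-deployment script instead; a minimal sketch using the names from the question:

-- Post-deployment: create the data source only if it doesn't exist yet
IF NOT EXISTS (SELECT 1 FROM sys.external_data_sources WHERE name = N'RemoteServer')
    EXEC (N'CREATE EXTERNAL DATA SOURCE [RemoteServer]
            WITH (TYPE = RDBMS, LOCATION = N''myserver.database.windows.net'',
                  CREDENTIAL = [RemoteUser], DATABASE_NAME = N''MyRemoteDb'')');
GO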

Dataset error message - PdwManagedToNativeInteropException

We currently have a pipeline running in our production environment with an activity that copies data from an on-premises SQL database to an Azure SQL database. This pipeline is replicated in the dev and QA environments but doesn't fail there. I wanted to get a bit more insight into what this error means.
Message=A database operation failed with the following error: 'PdwManagedToNativeInteropException ErrorNumber: 46724,
"PDW" is short for Parallel Data Warehouse and suggests you might be using the MPP product Azure SQL Data Warehouse, rather than a SQL DB as you mentioned. Is that correct?
This error shows up when a value overflows the defined size of a column (varchar, int, etc.).
Try increasing the sizes of the affected columns/data types and rerun the pipeline.
That is how I recreated and fixed it in my Data Factory.
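If that diagnosis fits, widening the offending column on the sink table is the usual fix (a sketch; the table and column names are hypothetical):

-- Widen the column that overflows during the copy activity
ALTER TABLE dbo.StagingTable
ALTER COLUMN SourceValue varchar(200) NULL;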

Duplicate Schema

I am creating and maintaining my SQL Server 2008 warehouse database using a Visual Studio 2008 project.
When I try to deploy the project, I get the error below:
Creating DataWarehouseReports...
Company.HigherEducation.DataWarehouse.dbschema(0,0)Error TSD01268: .Net SqlClient Data Provider: Msg 2714, Level 16, State 6, Line 2 There is already an object named 'DataWarehouseReports' in the database.
Company.HigherEducation.DataWarehouse.dbschema(0,0)Error TSD01268: .Net SqlClient Data Provider: Msg 2759, Level 16, State 0, Line 2 CREATE SCHEMA failed due to previous errors.
An error occurred while the batch was being executed.
Done executing task "SqlDeployTask" -- FAILED.
Done building target "DspDeploy" in project "Company.HigherEducation.DataWarehouse.dbproj" -- FAILED.
Done executing task "CallTarget" -- FAILED.
Done building target "DBDeploy" in project "Company.HigherEducation.DataWarehouse.dbproj" -- FAILED.
Done building project "Company.HigherEducation.DataWarehouse.dbproj" -- FAILED.
What I found is that the DataWarehouseReports schema exists in the master database (granted, by mistake), but I get the "duplicate" error when I try to create it in the OLAP database.
In development, my OLAP and OLTP databases are on the same server, and I encounter the same problem with schema names. For now, I have named the schemas differently in the OLTP vs. the OLAP database, but for consistency's sake I would rather have the same name.
How do I debug this?
edit: PEBCAC Alert
I found out that I have a pre-deployment SQL script that changes the database context to master. My script was failing, so the context was never changed back.
So when the rest of the script executed, it was actually trying to create everything in master.
Lesson learned: read the output when the project deploys.
Basically, the VS2008 project generates a single .SQL file, and the only database-context change it emits is where the database gets created. If you change context in the pre-deployment script, it is up to you to change it back.
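In other words, a pre-deployment script that switches to master should switch back itself; a sketch (assuming the project exposes the usual $(DatabaseName) SQLCMD variable):

USE [master];
-- ... setup work that genuinely needs master ...
GO
-- Switch back so the rest of the generated script targets the intended database
USE [$(DatabaseName)];
GO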