Query multiple Azure SQL databases - Azure SQL Database cannot be used as a Central Management Server

I need to run a query against all databases (they share the same schema); the problem is these are Azure databases within an Elastic Pool. I read this can be done using the "Central Management Servers" feature in SQL Server Management Studio, and I have installed the latest version (18.3), but when I try to expand the Azure SQL server under "Central Management Servers" I get the following error:
Azure SQL Database cannot be used as a Central Management Server
The type of query I am trying to run against all the databases is as follows; it works fine on a local SQL Server instance but does not work against Azure SQL Database.
SET NOCOUNT ON;

IF OBJECT_ID(N'tempdb.dbo.#temp') IS NOT NULL
    DROP TABLE #temp;

CREATE TABLE #temp
(
    [COUNT] INT
    , DB SYSNAME
);

DECLARE @TableName NVARCHAR(50) = N'[dbo].[CustomAttributes]';
DECLARE @SQL NVARCHAR(MAX);

-- Build one SELECT per database that contains the table
SELECT @SQL = STUFF((
    SELECT CHAR(13) + 'SELECT ''' + name + ''', COUNT(1) FROM [' + name + '].' + @TableName + ' WHERE dataType = 2'
    FROM sys.databases
    WHERE OBJECT_ID('[' + name + '].' + @TableName) IS NOT NULL
    FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'), 1, 1, '');

INSERT INTO #temp (DB, [COUNT])
EXEC sys.sp_executesql @SQL;

SELECT *
FROM #temp t;

Azure SQL Database doesn't support administering multiple servers using Central Management Servers.
Since your databases are in the same Elastic Pool, you can use the Elastic Query feature to run a query against all of them.
The elastic query feature (in preview) enables you to run a Transact-SQL query that spans multiple databases in Azure SQL Database. It allows you to perform cross-database queries to access remote tables, and to connect Microsoft and third-party tools (Excel, Power BI, Tableau, etc.) to query across data tiers with multiple databases. Using this feature, you can scale out queries to large data tiers in SQL Database and visualize the results in business intelligence (BI) reports.
For more details, see:
Reporting across scaled-out cloud databases (preview)
Query across cloud databases with different schemas (preview)
Hope this helps.
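As a rough sketch of what setting up an elastic query involves (all names, credentials, and the server address below are placeholders; the exact steps are in the linked docs), you create a database scoped credential, an external data source pointing at the other database, and an external table mirroring the remote schema:

-- Run in the "head" database you will query from; all names here are placeholders.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

CREATE DATABASE SCOPED CREDENTIAL ElasticCred
WITH IDENTITY = '<sql login>', SECRET = '<password>';

CREATE EXTERNAL DATA SOURCE RemoteDb WITH
(
    TYPE = RDBMS,
    LOCATION = 'yourserver.database.windows.net',
    DATABASE_NAME = 'OtherDatabase',
    CREDENTIAL = ElasticCred
);

-- Must mirror the schema of the remote table; queries against it are forwarded.
-- (Column list is illustrative; only dataType comes from the question.)
CREATE EXTERNAL TABLE [dbo].[CustomAttributes_Remote]
(
    id INT,
    dataType INT
)
WITH (DATA_SOURCE = RemoteDb);

SELECT COUNT(1) FROM [dbo].[CustomAttributes_Remote] WHERE dataType = 2;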

Related

How to Remove All Views from Azure Synapse SQL Serverless Inbuilt

Can someone let me know how to remove views/tables from the Azure Synapse built-in SQL serverless pool?
It's easy enough to remove individual tables/views using the following:
USE [DatabaseName]
GO
DROP EXTERNAL TABLE schemaname.tablename
But I would like to remove all the views/tables shown here:
Run the following T-SQL query which builds a dynamic SQL statement to drop all views:
DECLARE @sql NVARCHAR(MAX) = (
    SELECT STRING_AGG('DROP VIEW [' + s.name + '].[' + v.name + '];', ' ')
    FROM sys.views v
    JOIN sys.schemas s ON s.schema_id = v.schema_id
    WHERE v.is_ms_shipped = 0
);
EXEC (@sql);
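The question also asks about external tables; assuming the same pattern applies, an analogous statement can be generated from the sys.external_tables catalog view (a sketch along the same lines, not tested against Synapse):

DECLARE @sql NVARCHAR(MAX) = (
    -- One DROP EXTERNAL TABLE per external table in the database.
    SELECT STRING_AGG('DROP EXTERNAL TABLE [' + s.name + '].[' + t.name + '];', ' ')
    FROM sys.external_tables t
    JOIN sys.schemas s ON s.schema_id = t.schema_id
);
EXEC (@sql);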

Copy local SQL database onto Azure

I want to do the - should be simple - task of copying a database from local to live...
I have all the data and table structure I want in my local database.
I have a currently running live database that has all the backup stuff assigned to it etc., so I don't just want to create a brand new database (not that I can find how to do this either)...
I just want to remove all tables from that database and then copy all my data and tables from the local into the live azure sql database.
How do I do this???
You can try to achieve this with SQL Server Management Studio.
If you go to SQL Server Management Studio, right-click on the database you want to copy data from...
Go to Tasks > Generate Scripts
Select just your tables, not the entire database object
Open up the Azure database that you want to create the copy into in Visual Studio
Open a new query for that database and paste the generated script in
Execute and pray to the gods
Jump around because it worked; now run the following command on the Azure database to disable foreign key constraints:
DECLARE @sql NVARCHAR(MAX) = N'';
;WITH x AS
(
    SELECT DISTINCT obj =
        QUOTENAME(OBJECT_SCHEMA_NAME(parent_object_id)) + '.'
        + QUOTENAME(OBJECT_NAME(parent_object_id))
    FROM sys.foreign_keys
)
SELECT @sql += N'ALTER TABLE ' + obj + ' NOCHECK CONSTRAINT ALL;
' FROM x;
EXEC sp_executesql @sql;
Now go into SQL Server Management Studio, right-click on your local database, and go to Tasks > Export Data.
Export to your Azure database, but make sure to edit the mappings and tick the identity insert box.
Once the data is moved, re-enable the foreign keys using this on your Azure database:
DECLARE @sql NVARCHAR(MAX) = N'';
;WITH x AS
(
    SELECT DISTINCT obj =
        QUOTENAME(OBJECT_SCHEMA_NAME(parent_object_id)) + '.'
        + QUOTENAME(OBJECT_NAME(parent_object_id))
    FROM sys.foreign_keys
)
SELECT @sql += N'ALTER TABLE ' + obj + ' WITH CHECK CHECK CONSTRAINT ALL;
' FROM x;
EXEC sp_executesql @sql;

SQL: Executing a query to dynamic remote server

I want to be able to execute remote queries based on the results of a local query.
For instance:
DECLARE @REMOTESERVER VARCHAR(10)
SELECT TOP 1 @REMOTESERVER = RemoteServer FROM TABLE
--Execute the next query on a remote server from the value I retrieved above
Select * from tblCustomers
What RDBMS are you using? Some will not support a pure SQL way of doing this. Others, like SQL Server, might support this scenario. Is the remote server accessible via a linked server that you can access? You could then use dynamic SQL to create your SQL string. Something like this should work in SQL Server:
DECLARE @Sql NVARCHAR(MAX);
SET @Sql = N'SELECT * FROM [' + @RemoteServer + '].dbname.schema.tblCustomers';
EXEC (@Sql);
Here is a post about linked servers: https://stackoverflow.com/a/4091984/1073631
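Putting the two pieces together, the whole lookup-then-query flow can be sketched like this (the table name dbo.ServerList and the database/schema parts are placeholders, and the linked server must already be configured):

DECLARE @RemoteServer sysname, @Sql NVARCHAR(MAX);

-- Pick the target server from a local table (hypothetical table name).
SELECT TOP 1 @RemoteServer = RemoteServer FROM dbo.ServerList;

-- Build the remote query via the four-part name and run it.
SET @Sql = N'SELECT * FROM ' + QUOTENAME(@RemoteServer) + N'.dbname.schema.tblCustomers';
EXEC sys.sp_executesql @Sql;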

Drop multiple databases in SQl Azure

I would like to run a script to drop multiple databases from SQL Azure as soon as I finish using them. When I tried the following,
DECLARE @dbname varchar(100);
DECLARE @stmt nvarchar(3000);
SET @dbname = '6A732E0B';
SELECT @stmt = (SELECT 'DROP DATABASE [' + name + ']; ' FROM sys.databases
                WHERE name LIKE '%' + @dbname + '%');
EXEC sp_executesql @stmt;
SQL Azure throws error message as “The DROP DATABASE statement must be the only statement in the batch”
Can somebody help me on this?
This is a known limitation in SQL Azure - certain statements need to be in a batch by themselves to be executed. This includes CREATE DATABASE, ALTER DATABASE, DROP DATABASE and a few more.
To solve your problem, you can create a loop in your application where you iterate over all the databases and drop them by issuing DROP DATABASE statements in separate batches.
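If you want to stay in T-SQL, one sketch of that approach is to have the script merely generate the DROP statements as a result set, and then have the client run each returned row as its own batch:

DECLARE @dbname varchar(100) = '6A732E0B';

-- One DROP DATABASE statement per matching database;
-- execute each returned row separately from the client.
SELECT 'DROP DATABASE [' + name + '];' AS stmt
FROM sys.databases
WHERE name LIKE '%' + @dbname + '%';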
I believe this is a bug of SQL Azure. I've recently reported it to Microsoft:
https://connect.microsoft.com/SQLServer/feedback/details/684160/sp-executesql-the-drop-database-statement-must-be-the-only-statement-in-the-batch

SQL 2005 - Linked Server to Oracle Queries Extremely Slow

On my SQL 2005 server, I have a linked server connecting to Oracle via the OraOLEDB.Oracle provider.
If I run a query through the 4 part identifier like so:
SELECT * FROM [SERVER]...[TABLE] WHERE COLUMN = 12345
It takes over a minute to complete. If I run the same query like so:
SELECT * FROM OPENQUERY(SERVER, 'SELECT * FROM TABLE WHERE COLUMN = 12345')
It completes instantly. Is there a setting I'm missing somewhere to get the first query to run in a decent period of time? Or am I stuck using openquery?
In your first example using "dot" notation, the client cursor engine is used and most things are evaluated locally. If you're selecting from a large table and using a WHERE clause, the records will be pulled down locally from the remote db. Once the data has been pulled across the linked server, only then is the WHERE clause applied, locally. Often this sequence is a performance hit. Indexes on the remote db are basically rendered useless.
Alternately when you use OPENQUERY, SQL Server sends the sql statement to the target database for processing. During processing any indexes on the tables are leveraged. Also the where clause is applied on the Oracle side before sending the resultset back to SQL Server.
In my experience, except for the simplest of queries, OPENQUERY is going to give you better performance.
I would recommend using OpenQuery for everything for the above reasons.
One of the pain points when using OPENQUERY that you may have already encountered is single quotes. If the SQL string being sent to the remote db requires single quotes around a string or date, they need to be escaped. Otherwise they inadvertently terminate the SQL string.
Here is a template that I use whenever I'm dealing with variables in an openquery statement to a linked server to take care of the single quote problem:
DECLARE @UniqueId int
    , @sql varchar(500)
    , @linkedserver varchar(30)
    , @statement varchar(600)

SET @UniqueId = 2
SET @linkedserver = 'LINKSERV'
-- Quotes around values inside the remote query must be doubled twice:
-- once for this string literal, and once more for the OPENQUERY string literal.
SET @sql = 'SELECT DummyFunction(''''' + CAST(@UniqueId AS VARCHAR(10)) + ''''') FROM DUAL'
SET @statement = 'SELECT * FROM OPENQUERY(' + @linkedserver + ', '
SET @statement = @statement + '''' + @sql + ''')'
EXEC (@statement)
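With the values above, the string finally handed to EXEC should expand to something like this (assuming the template is used as written):

SELECT * FROM OPENQUERY(LINKSERV, 'SELECT DummyFunction(''2'') FROM DUAL')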