Retrieving the Name of Running Stored Procedures Across Multiple Databases - sql

I'm trying to write a query that reports current database activity. The query joins various DMVs such as sys.dm_exec_connections, sys.dm_exec_sessions, and sys.dm_exec_requests, and it pulls the text of the queries being run via the sys.dm_exec_sql_text function.
(I'm aware of Activity Monitor and SQL Profiler. I need to gather this information in a query, so neither of those tools is relevant here.)
Much of the activity in our systems takes place in stored procedures and functions. It would be nice to see the names of these procedures in this query.
My question is:
How do I reliably display the name of the stored procedures or functions being executed?
I'm aware that the sys.dm_exec_sql_text function returns an objectid, and that I can join this objectid to sys.objects. The problem is, there are multiple databases on this server, and sys.objects only applies to the current database. I want this query to be able to show the running object name no matter what database the query happened to be run against.
So far the only solution I have is to use sp_msforeachdb to build a temp table containing all the object IDs and names from every database, and join to this table from the result of the dm_exec_sql_text function.
Is there a better solution to the temp table approach? I feel like I'm missing something.
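One thing worth noting: sys.dm_exec_sql_text also returns the dbid alongside the objectid, and OBJECT_NAME accepts a second database_id argument (SQL Server 2005 SP2 and later), so the name can usually be resolved with no temp table at all. A minimal sketch:

```sql
-- Sketch: current requests with the owning database and object name,
-- using the two-argument OBJECT_NAME(object_id, database_id) overload.
SELECT r.session_id,
       DB_NAME(st.dbid)                  AS database_name,
       OBJECT_NAME(st.objectid, st.dbid) AS object_name,  -- NULL for ad hoc batches
       st.text                           AS batch_text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS st
WHERE r.session_id <> @@SPID;
```

OBJECT_NAME returns NULL for ad hoc batches (and for databases the caller cannot see), so it is worth keeping the raw text column as a fallback.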

I would recommend Adam Machanic's excellent sp_WhoIsActive. It doesn't return the exact object name, but it does return the SQL command being executed in a nice clickable form.
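Calling it is a plain EXEC; the @get_plans parameter name below is from the versions I've used, so check your copy's documentation if it differs:

```sql
-- Default call: one row per active request, with sql_text, waits, blocking info, etc.
EXEC dbo.sp_WhoIsActive;

-- Optionally also pull the query plan for each request.
EXEC dbo.sp_WhoIsActive @get_plans = 1;
```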

I use the following proc:
USE [master]
GO
CREATE PROC [dbo].[sp_who3]
AS
SET NOCOUNT ON
DECLARE @LoginName varchar(128)
DECLARE @AppName varchar(128)
SELECT [SPID] = s.[spid]
, [CPU] = s.[cpu]
, [Physical_IO] = s.[physical_io]
, [Blocked] = s.[blocked]
, [LoginName] = CONVERT([sysname], RTRIM(s.[Loginame]))
, [Database] = d.[name]
, [AppName] = s.[program_name]
, [HostName] = s.[hostname]
, [Status] = s.[Status]
, [Cmd] = s.[cmd]
, [Last Batch] = s.[last_batch]
, [Kill Command] = 'Kill ' + CAST(s.[spid] AS varchar(10))
, [Buffer Command] = 'DBCC InputBuffer(' + CAST(s.[spid] AS varchar(10))
+ ')'
FROM [master].[dbo].[sysprocesses] s WITH(NOLOCK)
JOIN [master].[sys].[databases] d WITH(NOLOCK)
ON s.[dbid] = d.[database_id]
WHERE s.[Status] <> 'background'
AND s.[spid] <> @@SPID -- exclude the current spid
ORDER BY s.[blocked] DESC, s.[physical_io] DESC, s.[cpu] DESC, CONVERT([sysname], RTRIM(s.[Loginame]))
BEGIN
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED
SELECT [Spid] = er.[session_Id]
, [ECID] = sp.[ECID]
, [Database] = DB_NAME(sp.[dbid])
, [User] = [nt_username]
, [Status] = er.[status]
, [Wait] = [wait_type]
, [Individual Query] = SUBSTRING(qt.[text], er.[statement_start_offset] / 2, (CASE WHEN er.[statement_end_offset] = - 1 THEN LEN(CONVERT(VARCHAR(MAX), qt.[text])) * 2
ELSE er.[statement_end_offset] END - er.[statement_start_offset]) / 2)
, [Parent Query] = qt.[text]
, [Program] = sp.[program_name]
, [Hostname] = sp.[Hostname]
, [Domain] = sp.[nt_domain]
, [Start_time] = er.[Start_time]
FROM [sys].[dm_exec_requests] er WITH(NOLOCK)
INNER JOIN [sys].[sysprocesses] sp WITH(NOLOCK)
ON er.[session_id] = sp.[spid]
CROSS APPLY [sys].[dm_exec_sql_text](er.[sql_handle]) qt
WHERE er.[session_Id] > 50 -- Ignore system spids.
AND er.[session_Id] NOT IN (@@SPID) -- Ignore the current statement.
ORDER BY er.[session_Id], sp.[ECID]
END
GO

Related

Running SQL script within R

I set up a connection from R to Microsoft SQL Server using the RODBC package. While I am able to run simple SQL queries directly from R, I find that more complex queries containing special characters, such as "#" when creating a temp table, tend to return an error from R. I have tried to escape the character within R itself (by placing it in quotes); however, this fails, as SQL cannot interpret the escape characters (I guess).
My goal is to perform soundex/fuzzy matching of some client records against the clients in the database (~3M rows). I tried doing this directly with the stringdist package in R, but the matching process blows out my RAM (16GB), which is why I have resorted to matching the data within SQL itself. I could easily have done this in SQL alone; however, I need to set this up in R so that non-technical individuals can run the R script, query the database, and perform further work on the resulting dataset.
I have tried the suggestion in this post, but it did not resolve the issue.
Any tips on how to escape SQL special characters like the # symbol would be useful.
I get this error in R:
1 "42000 102 [Microsoft][ODBC SQL Server Driver][SQL Server]Incorrect syntax near 'go'."
2 "[RODBC] ERROR: Could not SQLExecDirect '\nSET DATEFORMAT dmy; \ngo\nDECLARE @VerifyClientID TABLE (firstname varchar(100), middlename varchar(100), lastname varchar(100)~
The script:
SET DATEFORMAT dmy;
go
DECLARE @VerifyClientID TABLE (firstname varchar(100), middlename varchar(100), lastname varchar(100), dob date, mobile varchar(100), ID int)
INSERT INTO @VerifyClientID (firstname, lastname, mobile, ID)
VALUES
('JOHN','DOE','0444 444 444',1)
drop table if exists #clientTABLE
select v.ID, p.ABC_NCMID, P.ABC_RowID, P.ABC_FirstName, P.ABC_GuardianName, P.ABC_LastName, P.ABC_SexCode_ID, P.ABC_CellPhone, P.ABC_DOB, ABC_StreetAddress, ABC_City, ABC_Zip, ABC_SSN, ABC_HomePhone
into #clientTABLE
from @VerifyClientID V
inner join dbo.DV_Person P on soundex(V.firstname) = soundex(P.ABC_FirstName)
and soundex(V.lastname) = soundex(P.ABC_LastName)
and (convert(varchar,replace(replace(P.ABC_CELLPHONE,' ',''),0,'')) = convert(varchar,replace(replace(V.mobile,' ',''),0,''))
or convert(varchar,replace(replace(P.ABC_HomePhone,' ',''),0,'')) = convert(varchar,replace(replace(V.mobile,' ',''),0,''))
)
where 1=1
and p.ABC_NCMID in (select Per.PER_CLIENTID from AtlasPublic.View_UODS_Person Per)
and P.ABC_IsPatient = 1
select distinct ID
,firstname
,middlename
,lastname
,dob
,mobile
, (SELECT TOP 1 ABC_NCMID FROM #clientTABLE WHERE id = v.id) as MatchedID
, (SELECT TOP 1 convert(varchar, ABC_DOB, 103) FROM #clientTABLE WHERE id = v.id) as MatchedDOB
, (SELECT TOP 1 case when ABC_SexCode_ID = 8089 then 'Male'
when ABC_SexCode_ID = 8088 then 'Female'
else null end FROM #clientTABLE WHERE id = v.id) as MatchedSEX
, (SELECT TOP 1 ABC_StreetAddress FROM #clientTABLE WHERE id = v.id) as MatchedStreet
, (SELECT TOP 1 ABC_City FROM #clientTABLE WHERE id = v.id) as MatchedCity
, (SELECT TOP 1 ABC_Zip FROM #clientTABLE WHERE id = v.id) as MatchedZip
from @VerifyClientID V
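The error itself points at the real culprit: GO is a batch separator understood by client tools like SSMS and sqlcmd, not a T-SQL statement, and RODBC sends the whole string to the server as one batch. One way to get the script running under RODBC (assuming the rest stays as-is) is simply to drop the separator:

```sql
-- 'go' removed: everything below is sent and executed as one batch,
-- so the table variable stays in scope for the whole script.
SET DATEFORMAT dmy;

DECLARE @VerifyClientID TABLE (firstname varchar(100), middlename varchar(100),
                               lastname varchar(100), dob date,
                               mobile varchar(100), ID int);

INSERT INTO @VerifyClientID (firstname, lastname, mobile, ID)
VALUES ('JOHN', 'DOE', '0444 444 444', 1);

-- ...the rest of the script, unchanged...
```

Alternatively, split the text on each GO within R and send the pieces through separate sqlQuery() calls.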

SQL insert into suspended - Read uncommitted or split query?

I have a stored procedure that contains two INSERT INTO queries. While running the SP I often see both of them suspended. The SP was tested on a test server with almost no traffic, but now that we have moved it to the production server, these suspended states have appeared, along with 1-2 deadlocks.
I assume SQL Server takes table locks while running these queries, but I don't know the preferred way to solve this.
The INSERT INTO queries move 30,000 records per iteration into another database. These are archive data, so queries coming from the normal production processes have nothing to do with the data being archived; the records are 2-3 years old.
Can I add WITH NOLOCK to the selects to avoid suspended states and deadlocks?
Or should I set ISOLATION LEVEL to READ UNCOMMITTED? (these records are old, they won't change)
What other options do I have? A cursor to run through the ids to archive one by one? (I have tried to avoid cursors until now.)
These are the two queries. @workitemIds and @workstepIds are table variables containing one int field.
insert into archive_****.archive.workitems
select * from ****.dbo.WorkItems where ****.dbo.workitems.Id in (select Id from @workitemIds);
insert into archive_****.archive.worksteps([Id], [Timestamp], [Description], [WorkPlace_Id], [WorkItemState_Id], [UserId], [WorkItem_Id], [Technology_Id], [Failcodes_Id], [DrawingNo], [ManualData], [Deleted], [WorkItemState_Arrival_Id], Workstepdatas)
select [Id], [Timestamp], [Description], [WorkPlace_Id], [WorkItemState_Id], [UserId], [WorkItem_Id], [Technology_Id], [Failcodes_Id], [DrawingNo], [ManualData], [Deleted], [WorkItemState_Arrival_Id],
(select Fieldname Field, Value [Value], Unit [Unit] from ****.dbo.workstepdatas wsd
left join ****.dbo.technologydatafields tdf on tdf.Id = wsd.TechnologyDatafields_Id
where tdf.fieldname is not null and wsd.WorkStep_Id = ws.Id
and value NOT LIKE '%[' + CHAR(0)+ '-' +CHAR(31)+']%' COLLATE Latin1_General_100_BIN2
for xml auto,type)
from ****.dbo.worksteps ws
where ws.Id in (select Id from @workstepIds);
Try rewriting the nested query as a CTE, as below, and let us know if it helps. You will need to change the database names.
insert into archive_db.archive.workitems with (tablock)
select w.*
from db.dbo.WorkItems as w
inner join @workitemIds as wi
on w.Id = wi.id;
with xmlcte
(ID, xmlRow)
as (
select ws.id
, (
select Fieldname as Field
, [Value]
, Unit
from db.dbo.workstepdatas wsd
left join db.dbo.technologydatafields tdf
on tdf.Id = wsd.TechnologyDatafields_Id
where
tdf.fieldname is not null
and wsd.WorkStep_Id = ws.Id
and [value] not like '%[' + char(0) + '-' + char(31) + ']%' collate Latin1_General_100_BIN2
for xml auto, type
) as xmlRow
from db.dbo.worksteps as ws
)
insert into archive_db.archive.worksteps with (tablock)
(
[Id]
, [Timestamp]
, [Description]
, [WorkPlace_Id]
, [WorkItemState_Id]
, [UserId]
, [WorkItem_Id]
, [Technology_Id]
, [Failcodes_Id]
, [DrawingNo]
, [ManualData]
, [Deleted]
, [WorkItemState_Arrival_Id]
, Workstepdatas
)
select ws.[Id]
, [Timestamp]
, [Description]
, [WorkPlace_Id]
, [WorkItemState_Id]
, [UserId]
, [WorkItem_Id]
, [Technology_Id]
, [Failcodes_Id]
, [DrawingNo]
, [ManualData]
, [Deleted]
, [WorkItemState_Arrival_Id]
, [xmlRow]
from db.dbo.worksteps ws
inner join @workstepIds as wsi
on ws.Id = wsi.id
inner join xmlcte -- I assume inner join is OK
on ws.id = xmlcte.id;

SQL Server - Select max

I have a SQL Server stored procedure with an update statement:
UPDATE Sale_Sheet
SET concluded = 'True'
, concluded_time = GETDATE()
, saleNumber = (SELECT MAX(saleNumber) + 1 FROM Sale_Sheet)
WHERE vatNumber = @vatNumber
AND [user_id] = @user_id
It can happen that two users initiate the procedure at the same time, and both users receive the same maximum value from (SELECT MAX(saleNumber) + 1 FROM Sale_Sheet). How can I fix this?
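The race exists because two sessions can both read the same MAX before either one writes. One common fix is to serialize the readers with locking hints; on SQL Server 2012 or later, a SEQUENCE sidesteps the problem entirely. A sketch (the sequence name is illustrative):

```sql
-- Option 1: UPDLOCK + HOLDLOCK makes concurrent readers of MAX queue up
-- behind this transaction, so no two sessions see the same value.
UPDATE Sale_Sheet
SET concluded = 'True',
    concluded_time = GETDATE(),
    saleNumber = (SELECT MAX(saleNumber) + 1
                  FROM Sale_Sheet WITH (UPDLOCK, HOLDLOCK))
WHERE vatNumber = @vatNumber
  AND [user_id] = @user_id;

-- Option 2 (SQL Server 2012+): let the engine hand out unique numbers.
-- CREATE SEQUENCE dbo.SaleNumberSeq START WITH 1 INCREMENT BY 1;
UPDATE Sale_Sheet
SET concluded = 'True',
    concluded_time = GETDATE(),
    saleNumber = NEXT VALUE FOR dbo.SaleNumberSeq
WHERE vatNumber = @vatNumber
  AND [user_id] = @user_id;
```

Note that the sequence approach can leave gaps (values are consumed even if the transaction rolls back), which is usually acceptable for a sale number.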

What part of this short SQL script, run on a production database, is not optimal?

I am running a script on our production database involving two tables: our table of users (3,700 of them) and the table of quotes they have made (280,000 of them). A quote is the main object in our application, a very large object for which many data tables are created and filled. My goal is to clean the database of all quotes except those made by a small group of users.
I first create a temp table containing the ids of those users (it is also used elsewhere in the script), and then a cursor that runs through the main quotes table and, for each quote that has to go, does the necessary cleansing.
I see that this script is going to execute for approximately 26 hours, which seems odd, since restoring the whole database takes about 15 minutes, and I would guess that involves the heaviest SQL. The database, though, weighs more than 100GB.
Is there some part of the script that is terribly non-optimal, or do you have suggestions for how this could be done with a much shorter execution time?
We are running SQL Server 2008 R2.
Here's the sketch of the script.
CREATE table #UsersIdsToStay(user_id int)
INSERT INTO #UsersIdsToStay
select user_id
from users
where user_name like '%SOMESTRING '
-----
declare @QuoteId int
declare @UserId int
declare QuoteCursor cursor for
select DISTINCT QuoteId, UserId
from QuotesTable
where UserId not in
(
select * from #UsersIdsToStay
)
open QuoteCursor
while 1=1
begin
fetch QuoteCursor into @QuoteId, @UserId
if @@fetch_status != 0 break
-- all the deletions from related tables are executed here using @QuoteId and @UserId
exec('delete from QuoteHistory where QuoteId = ' + @QuoteId + ' and UserId = ' + @UserId )
exec('delete from QuoteRevisions where QuoteId = ' + @QuoteId + ' and UserId = ' + @UserId )
exec('delete from QuoteItems where QuoteId = ' + @QuoteId + ' and UserId = ' + @UserId )
....
end
close QuoteCursor;
deallocate QuoteCursor
The cursor restricts you to deleting a single User_Id/Quote_Id combination at a time on each related table. By using joins you will be able to delete in bulk.
You could also switch out the temp table for a Common Table Expression (CTE). If this is a one-off script, the temp table should be fine, but for production code I would use a CTE.
if OBJECT_ID('tempdb..#quotesToDelete') is not null
drop table #quotesToDelete
select distinct
ut.user_id,
qt.quote_id
into #quotesToDelete
from dbo.QuotesTable qt (nolock)
inner join dbo.UsersTable ut (nolock)
on qt.user_id = ut.user_id
where ut.user_name not like '%SOMESTRING '
-- all the deletions from related tables are executed here by joining to #quotesToDelete
-- relatedtableA
delete a
from relatedtableA a
inner join #quotesToDelete b
on a.user_id = b.user_id
and a.quote_id = b.quote_id
-- relatedtableB
...
Since you don't show the deletes, I can't show you how to avoid a cursor.
But you could do this without a temp table pretty easily:
select DISTINCT QuoteId, UserId
from QuotesTable
where UserId not in
(
select user_id
from users
where user_name like '%SOMESTRING '
)
or
select DISTINCT QuoteId, UserId
from QuotesTable
left join users
on users.user_id = QuotesTable.UserId
and user_name like '%SOMESTRING '
where users.user_id is null
The problem is the cursor, and you don't need it:
CREATE table #QuotesToDelete(QuoteId int, UserID int)
insert into #QuotesToDelete
select DISTINCT QuoteId, UserId
from QuotesTable
left join users
on users.user_id = QuotesTable.UserId
and user_name like '%SOMESTRING '
where users.user_id is null
delete QH
from QuoteHistory QH
join #QuotesToDelete
on #QuotesToDelete.QuoteId = QH.QuoteId
and #QuotesToDelete.UserID = QH.UserID
delete QR
from QuoteRevisions QR
join #QuotesToDelete
on #QuotesToDelete.QuoteId = QR.QuoteId
and #QuotesToDelete.UserID = QR.UserID

SQL Query Optimization

This report used to take about 16 seconds when there were 8000 rows to process. Now there are 50000 rows and the report takes 2:30 minutes.
This was my first pass at this and the client needed it yesterday, so I wrote this code in the logical order of what needed to be done, but without optimization in mind.
Now with the report taking longer as the data increases, I need to take a second look at this and optimize it. I'm thinking indexed views, table functions, etc.
I think the biggest bottleneck is looping through the temp table, making 4 select statements, and updating the temp table...50,000 times.
I think I can condense ALL of this into one large SELECT with either (a) 4 joins to the same table to get the 4 statuses, but then I am not sure how to get the TOP 1 in there, or I can try (b) using nested subqueries, but both seem really messy compared to the current code.
I'm not expecting anyone to write code for me, but if some SQL experts can peruse this code and tell me about any obvious inefficiencies and alternate methods, or ways to speed this up, or techniques I should be using instead, it would be appreciated.
PS: Assume that this DB is for the most part normalized, but poorly designed, and that I am not able to add indexes. I basically have to work with it, as is.
Thanks!
CREATE PROCEDURE RptCollectionAccountStatusReport AS
SET NOCOUNT ON;
DECLARE @Accounts TABLE
(
[AccountKey] INT IDENTITY(1,1) NOT NULL,
[ManagementCompany] NVARCHAR(50),
[Association] NVARCHAR(100),
[AccountNo] INT UNIQUE,
[StreetAddress] NVARCHAR(65),
[State] NVARCHAR(50),
[PrimaryStatus] NVARCHAR(100),
[PrimaryStatusDate] SMALLDATETIME,
[PrimaryDaysRemaining] INT,
[SecondaryStatus] NVARCHAR(100),
[SecondaryStatusDate] SMALLDATETIME,
[SecondaryDaysRemaining] INT,
[TertiaryStatus] NVARCHAR(100),
[TertiaryStatusDate] SMALLDATETIME,
[TertiaryDaysRemaining] INT,
[ExternalStatus] NVARCHAR(100),
[ExternalStatusDate] SMALLDATETIME,
[ExternalDaysRemaining] INT
);
INSERT INTO
@Accounts (
[ManagementCompany],
[Association],
[AccountNo],
[StreetAddress],
[State])
SELECT
mc.Name AS [ManagementCompany],
a.LegalName AS [Association],
c.CollectionKey AS [AccountNo],
u.StreetNumber + ' ' + u.StreetName AS [StreetAddress],
CASE WHEN c.InheritedAccount = 1 THEN 'ZZ' ELSE u.State END AS [State]
FROM
ManagementCompany mc WITH (NOLOCK)
JOIN
Association a WITH (NOLOCK) ON a.ManagementCompanyKey = mc.ManagementCompanyKey
JOIN
Unit u WITH (NOLOCK) ON u.AssociationKey = a.AssociationKey
JOIN
Collection c WITH (NOLOCK) ON c.UnitKey = u.UnitKey
WHERE
c.Closed IS NULL;
DECLARE @MaxAccountKey INT;
SELECT @MaxAccountKey = MAX([AccountKey]) FROM @Accounts;
DECLARE @index INT;
SET @index = 1;
WHILE @index < @MaxAccountKey BEGIN
DECLARE @CollectionKey INT;
SELECT @CollectionKey = [AccountNo] FROM @Accounts WHERE [AccountKey] = @index;
DECLARE @PrimaryStatus NVARCHAR(100) = NULL;
DECLARE @PrimaryStatusDate SMALLDATETIME = NULL;
DECLARE @PrimaryDaysRemaining INT = NULL;
DECLARE @SecondaryStatus NVARCHAR(100) = NULL;
DECLARE @SecondaryStatusDate SMALLDATETIME = NULL;
DECLARE @SecondaryDaysRemaining INT = NULL;
DECLARE @TertiaryStatus NVARCHAR(100) = NULL;
DECLARE @TertiaryStatusDate SMALLDATETIME = NULL;
DECLARE @TertiaryDaysRemaining INT = NULL;
DECLARE @ExternalStatus NVARCHAR(100) = NULL;
DECLARE @ExternalStatusDate SMALLDATETIME = NULL;
DECLARE @ExternalDaysRemaining INT = NULL;
SELECT TOP 1
@PrimaryStatus = a.StatusName, @PrimaryStatusDate = c.StatusDate, @PrimaryDaysRemaining = c.DaysRemaining
FROM CollectionAccountStatus c WITH (NOLOCK) JOIN AccountStatus a WITH (NOLOCK) ON c.AccountStatusKey = a.AccountStatusKey
WHERE c.CollectionKey = @CollectionKey AND a.StatusType = 'Primary Status' AND a.StatusName <> 'Cleared'
ORDER BY c.sysCreated DESC;
SELECT TOP 1
@SecondaryStatus = a.StatusName, @SecondaryStatusDate = c.StatusDate, @SecondaryDaysRemaining = c.DaysRemaining
FROM CollectionAccountStatus c WITH (NOLOCK) JOIN AccountStatus a WITH (NOLOCK) ON c.AccountStatusKey = a.AccountStatusKey
WHERE c.CollectionKey = @CollectionKey AND a.StatusType = 'Secondary Status' AND a.StatusName <> 'Cleared'
ORDER BY c.sysCreated DESC;
SELECT TOP 1
@TertiaryStatus = a.StatusName, @TertiaryStatusDate = c.StatusDate, @TertiaryDaysRemaining = c.DaysRemaining
FROM CollectionAccountStatus c WITH (NOLOCK) JOIN AccountStatus a WITH (NOLOCK) ON c.AccountStatusKey = a.AccountStatusKey
WHERE c.CollectionKey = @CollectionKey AND a.StatusType = 'Tertiary Status' AND a.StatusName <> 'Cleared'
ORDER BY c.sysCreated DESC;
SELECT TOP 1
@ExternalStatus = a.StatusName, @ExternalStatusDate = c.StatusDate, @ExternalDaysRemaining = c.DaysRemaining
FROM CollectionAccountStatus c WITH (NOLOCK) JOIN AccountStatus a WITH (NOLOCK) ON c.AccountStatusKey = a.AccountStatusKey
WHERE c.CollectionKey = @CollectionKey AND a.StatusType = 'External Status' AND a.StatusName <> 'Cleared'
ORDER BY c.sysCreated DESC;
UPDATE
@Accounts
SET
[PrimaryStatus] = @PrimaryStatus,
[PrimaryStatusDate] = @PrimaryStatusDate,
[PrimaryDaysRemaining] = @PrimaryDaysRemaining,
[SecondaryStatus] = @SecondaryStatus,
[SecondaryStatusDate] = @SecondaryStatusDate,
[SecondaryDaysRemaining] = @SecondaryDaysRemaining,
[TertiaryStatus] = @TertiaryStatus,
[TertiaryStatusDate] = @TertiaryStatusDate,
[TertiaryDaysRemaining] = @TertiaryDaysRemaining,
[ExternalStatus] = @ExternalStatus,
[ExternalStatusDate] = @ExternalStatusDate,
[ExternalDaysRemaining] = @ExternalDaysRemaining
WHERE
[AccountNo] = @CollectionKey;
SET @index = @index + 1;
END;
SELECT
[ManagementCompany],
[Association],
[AccountNo],
[StreetAddress],
[State],
[PrimaryStatus],
CONVERT(VARCHAR, [PrimaryStatusDate], 101) AS [PrimaryStatusDate],
[PrimaryDaysRemaining],
[SecondaryStatus],
CONVERT(VARCHAR, [SecondaryStatusDate], 101) AS [SecondaryStatusDate],
[SecondaryDaysRemaining],
[TertiaryStatus],
CONVERT(VARCHAR, [TertiaryStatusDate], 101) AS [TertiaryStatusDate],
[TertiaryDaysRemaining],
[ExternalStatus],
CONVERT(VARCHAR, [ExternalStatusDate], 101) AS [ExternalStatusDate],
[ExternalDaysRemaining]
FROM
@Accounts
ORDER BY
[ManagementCompany],
[Association],
[StreetAddress]
ASC;
Don't try to guess where the query is going wrong - look at the execution plan. It will tell you what's chewing up your resources.
You can update directly from another table, even from a table variable: SQL update from one Table to another based on a ID match
That would allow you to combine everything in your loop into a single (massive) statement. You can join to the same tables for the secondary and tertiary statuses using different aliases, e.g.,
JOIN AccountStatus As TertiaryAccountStatus...AND a.StatusType = 'Tertiary Status'
JOIN AccountStatus AS SecondaryAccountStatus...AND a.StatusType = 'Secondary Status'
I'll bet you don't have an index on the AccountStatus.StatusType field. You might try using the PK of that table instead.
HTH.
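As for "how to get the TOP 1 in there" with a single statement: ROW_NUMBER() partitioned by account and status type gives the latest non-cleared row of each type in one pass. A sketch using the column names from the procedure above:

```sql
;WITH LatestStatus AS (
    SELECT c.CollectionKey,
           a.StatusType,
           a.StatusName,
           c.StatusDate,
           c.DaysRemaining,
           -- rn = 1 marks the most recent row per account per status type
           ROW_NUMBER() OVER (PARTITION BY c.CollectionKey, a.StatusType
                              ORDER BY c.sysCreated DESC) AS rn
    FROM CollectionAccountStatus c
    JOIN AccountStatus a ON a.AccountStatusKey = c.AccountStatusKey
    WHERE a.StatusName <> 'Cleared'
)
SELECT *
FROM LatestStatus
WHERE rn = 1;
```

Joining this result back to the account list (once per status type, or pivoted) replaces all 50,000 iterations of the loop with one set-based pass.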
First, use a temp table instead of a table variable. Temp tables can be indexed.
Next, do not loop! Looping is bad for performance in virtually every case. This loop ran 50,000 times rather than once for 50,000 records; it will be horrible when you have a million records. Here is a link that will help you understand how to do set-based processing instead. It is written to help you avoid cursors, but loops are similar to cursors, so it should help.
http://wiki.lessthandot.com/index.php/Cursors_and_How_to_Avoid_Them
And (NOLOCK) gives dirty reads, which can be very bad for reporting. If you are on a version of SQL Server newer than 2000, there are better choices.
SELECT @CollectionKey = [AccountNo] FROM @Accounts WHERE [AccountKey] = @index;
This query would benefit from a PRIMARY KEY declaration on your table variable.
When you say IDENTITY, you are asking the database to auto-populate the column.
When you say PRIMARY KEY, you are asking the database to organize the data into a clustered index.
These two concepts are very different. Typically, you should use both of them.
DECLARE @Accounts TABLE
(
[AccountKey] INT IDENTITY(1,1) PRIMARY KEY,
I am not able to add indexes.
In that case, copy the data to a database where you may add indexes. And use: SET STATISTICS IO ON