Query for Multiple Users - Best Practices - sql

I currently have about 10 users that use their own personalized query for an internal process at my workplace. The user inputs a few values at the top of the query, hits execute, and voila, their report shows up in the grid. The source data tables they access are the same, but the created tables within are personalized with the suffix _User1, _User2...User10. Each time they run the query, the previously created tables are dropped and created again. The entire query takes about 1 second to run.
The majority of the structure looks like this repeated 5 times for the 5 steps to get to their desired output:
DROP TABLE z
SELECT *
INTO z
FROM y
Now, the number of users is multiplying to 50, and that means that each tweak in the master query code will result in me changing 50 user-specific queries and sending them back out. Manageable and annoying with 10 users, completely unmanageable with 50.
My question is, what is the best way to go about structuring the database/query? Ideally I'd like to just have one query, one set of created tables (not 50). Since it only takes 1 second to run, would we run the risk of two or more users (with different inputs) running the query simultaneously, accessing the same tables and somehow getting bad data because they ran it at the exact same time?
Is there a specific way this is normally done? Hoping someone can shed some light.
Thanks

Disclaimer: As I've indicated in my comments, giving a bunch of users access directly to SSMS to run reports is a very bad idea. Get some sort of front-end, even a simple MS Access database - you would only need a single license to develop the database, and you could give the rest of the users Access Runtime, for instance. There are so many ways a user could really mess you up if they don't know what they're doing. I will offer some ideas below, but I don't recommend doing this.
One solution: use temp tables so you don't have to worry about each user's tables overlapping:
-- drop the table if it already exists
if object_id('tempdb..#z') is not null
DROP TABLE #z
SELECT *
INTO #z
FROM y
When you prefix a table name with #, it becomes a connection-scoped temporary table, which means separate sessions will not see the temporary tables in other sessions even if they have the same name.
Often it is not necessary to create a temp table unless you have some really complicated scenario. You should be able to make use of subqueries, views, CTEs, and stored procedures to generate the output in real time without any new tables being involved. You can even build views and procedures that reference other views so you can organize your complicated logic. For example, you might encapsulate the logic into a stored procedure like this:
CREATE PROCEDURE TheReport
(
@ReportID int,
@Name varchar(50),
@SomeField varchar(10)
)
AS
BEGIN
-- do some complicated query here
SELECT field1, field2 FROM Result Q
END
Then you don't even have to send updates to your users (unless the fields change). Just have their query call the stored procedure, and you can update the procedure directly at your convenience:
DECLARE @ReportID int
DECLARE @Name varchar(50)
DECLARE @SomeField varchar(10)
-- YOU CAN MODIFY THIS --
SET @ReportID = 5
SET @Name = 'MyName'
SET @SomeField = 'abc'
-- DON'T MODIFY BELOW THIS LINE --
EXEC [TheReport] @ReportID, @Name, @SomeField;
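If the intermediate steps don't really need to be materialized at all, a CTE can often replace the SELECT ... INTO chain entirely. A minimal sketch, reusing the y source table from the question; the step names and the filtering comment are placeholders:
;WITH step1 AS
(
    SELECT * FROM y
),
step2 AS
(
    SELECT * FROM step1 -- further joins/filters for the next step would go here
)
SELECT * FROM step2;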

Related

Stop execution of sql script

I have a huge SQL script with many batches (using GO).
On certain conditions, I want to stop the whole execution of this script.
I tried using NOEXEC, but I keep on getting invalid object errors, since my script includes creating and adding data to new tables and columns.
I do not want to use RAISERROR, is there any other way to do this?
There are no good solutions to this problem. One way you can share data between batches is via tables (persistent or temporary). So you can have some logic that depends on the state of that table, or on the value of particular columns in that table, like this:
--CREATE TABLE debug(col int)
--INSERT INTO debug VALUES(1)
--DELETE FROM debug
SELECT 1
GO
SELECT 2
SELECT 3
GO
IF EXISTS(SELECT * FROM debug)
SELECT 6
GO
Just add an IF EXISTS(SELECT * FROM debug) check to every place in the script that you want to be able to skip; depending on whether or not the table has rows, those code blocks will execute or not.
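For example, a minimal sketch of the same pattern applied to a multi-batch script (the SELECT statements stand in for your real work):
IF EXISTS(SELECT * FROM debug)
BEGIN
    -- batch 1 work here
    SELECT 1
END
GO
IF EXISTS(SELECT * FROM debug)
BEGIN
    -- batch 2 work here
    SELECT 2
END
GO
Empty the debug table (DELETE FROM debug) to skip every guarded block, or insert a row to let them run.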

Create table from nth result set of stored procedure

I have a stored procedure that returns 4 result sets. The results sets have lots of columns.
What's the best way to create a table out of each result set? The data types and schema in the tables should be the same as the ones from the result sets.
I know I can do this to create a table from a selection:
CREATE TABLE TABLE_NAME
AS SELECT * FROM USERS
So is there a way to select a result set from a stored procedure execution??
While this was mostly covered in chat, it should still have an answer for others who might be wondering the same thing.
The only way to access a specific result set out of multiple result sets, while staying within the context of T-SQL, is via SQLCLR. Using C#, VB.Net, or any .Net language, you would use a SqlDataReader, which can access the result sets separately.
The SQLCLR proc would simply exec the existing T-SQL proc and can either spit out a single result set (assuming an input param would specify which one to return as a result set) or could do a separate connection and directly call INSERT statements to do all 4 at the same time (although at that point it could just as well be a Console App or Windows Form or whatever).

Real time application/benefit of Denali's With Result Set

What are the real-world uses of Denali's WITH RESULT SETS, as far as SQL stored procedures are concerned, apart from renaming the column names and data types at runtime?
And what is the benefit of changing the data types at runtime with WITH RESULT SETS?
e.g.
Alter PROCEDURE test_Proc
AS
BEGIN
SELECT * FROM tbl_Test
END
GO
EXEC test_Proc
WITH RESULT SETS
(
( Id int,
EmpName varchar(50),
PNo varchar(50)
)
)
Even if the column datatypes have been changed, what will we do with that?
However, this article gives some idea about its benefit in SSIS. But I am more interested in the perspective of a SQL Server stored proc talking to a front-end application (e.g. C#) and the like.
Well, for one, say your application is calling sp_who2, and it is storing SPID in an int32. sp_who2 returns SPID as a char, requiring you to perform special handling in all of your apps to convert the output to an int32. If you create a wrapper procedure, you can do this in one place, and without having to dump the results into a temp table first. One more curious case with sp_who2 is that it returns two identical SPID columns - with WITH RESULT SETS you can rename one of them (say, to redundant_SPID) so that your apps never see multiple columns with the same name.
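As a rough sketch of such a wrapper (using a hypothetical dbo.LegacyReport procedure instead of sp_who2's full column list, which is much longer), WITH RESULT SETS lets you re-type and rename the columns in one place:
-- assumes dbo.LegacyReport returns (RecordID char(10), RecordID char(10), Name varchar(50))
CREATE PROCEDURE dbo.LegacyReportTyped
AS
BEGIN
    EXEC dbo.LegacyReport
    WITH RESULT SETS
    (
        (
            RecordID           int,         -- re-typed from char to int
            redundant_RecordID int,         -- duplicate column renamed
            Name               varchar(50)
        )
    );
END
Your apps then call dbo.LegacyReportTyped and always see a single, properly typed RecordID column.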
Another use case is say you are changing a data type from int64 to int32 or int32 to varchar, but you can't change all of your apps at once. You can change the "modern" apps to use the new data type while leaving the other "not changeable right now" apps to use the old data type. This means you can split out the deployment and testing of your apps one by one instead of making a wholesale data type change across all of the apps.

Build temporary table with dynamic sql in SQL Server 2008

To make a long story short...
I'm building a web app in which the user can select any combination of about 40 parameters. However, for one of the results they want (investment experience), I have to extract information from a different table and compare the values in six different columns (stock exp, mutual funds exp, etc.) and return only the highest value of the six for that specific record.
This is not the issue. The issue is that at runtime, my query to find the investment exp doesn't necessarily know the account id. Considering a table scan would bring well over half a million clients, this is not an option. So what I'm trying to do is edit a copy of my main dynamically built query, but instead of returning 30+ columns, it'll just return 2, the accountid and experienceid (which is the PK for the experience table) so I can do the filtering deal.
Some of you may define dynamic SQL a little differently than I do. My query is a string; depending on the arguments sent to my procedure, portions of the WHERE clause are turned on or off by switches. In the end I execute it; it's all done on the server side, and all the web app does is send an array of arguments to my proc.
My over simplified code looks essentially like this:
declare @sql varchar(8000)
set @sql =
'select [columns]
into #tempTable
from [table]
[table joins]' + @dynamicallyBuiltWhereClause
exec(#sql)
After this part I try to use #tempTable for the investment experience filtering process, but I get an error telling me #tempTable doesn't exist.
Any and all help would be greatly appreciated.
The problem is that the scope of your temp table only exists within the exec() statement. You can transform your temp table into a "global" temp table by using 2 hash signs -> ##tempTable. However, I wonder why you are using a variable @dynamicallyBuiltWhereClause to generate your SQL statement.
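A minimal sketch of the global temp table variant, keeping the question's placeholders for the column list, source tables, and joins:
declare @sql varchar(8000)
set @sql =
'select accountid, experienceid
into ##tempTable
from [table]
[table joins]' + @dynamicallyBuiltWhereClause
exec(@sql)
-- visible here because ##tempTable is global rather than scoped to the exec()
select * from ##tempTable
drop table ##tempTable -- clean up; global temp tables are shared across sessions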
I have done what you are doing in the past, but have had better success generating SQL from the application (using C# to generate my SQL).
Also, you may want to look into Table Variables. I have seen some strange instances using temp tables where an application re-uses a connection and the temp table from the last query is still there.

Access to Result sets from within Stored procedures Transact-SQL SQL Server

I'm using SQL Server 2005, and I would like to know how to access different result sets from within transact-sql. The following stored procedure returns two result sets, how do I access them from, for example, another stored procedure?
CREATE PROCEDURE getOrder (@orderId as numeric) AS
BEGIN
select order_address, order_number from order_table where order_id = @orderId
select item, number_of_items, cost from order_line where order_id = @orderId
END
I need to be able to iterate through both result sets individually.
EDIT: Just to clarify the question, I want to test the stored procedures. I have a set of stored procedures which are used from a VB.NET client, which return multiple result sets. These are not going to be changed to a table valued function, I can't in fact change the procedures at all. Changing the procedure is not an option.
The result sets returned by the procedures are not the same data types or number of columns.
The short answer is: you can't do it.
From T-SQL there is no way to access multiple results of a nested stored procedure call, without changing the stored procedure as others have suggested.
To be complete, if the procedure were returning a single result, you could insert it into a temp table or table variable with the following syntax:
INSERT INTO #Table (...columns...)
EXEC MySproc ...parameters...
You can use the same syntax for a procedure that returns multiple results, but every result set must match the column list of the target table, and the rows from all of them land in that one table; there is no way to direct each result set to its own table.
I was easily able to do this by creating a SQL2005 CLR stored procedure which contained an internal dataset.
You see, a new SqlDataAdapter will .Fill a multiple-result-set sproc into a multiple-table dataset by default. The data in these tables can in turn be inserted into #Temp tables in the calling sproc you wish to write. dataset.ReadXmlSchema will show you the schema of each result set.
Step 1: Begin writing the sproc which will read the data from the multi-result-set sproc
a. Create a separate table for each result set according to the schema.
CREATE PROCEDURE [dbo].[usp_SF_Read] AS
SET NOCOUNT ON;
CREATE TABLE #Table01 (Document_ID VARCHAR(100)
, Document_status_definition_uid INT
, Document_status_Code VARCHAR(100)
, Attachment_count INT
, PRIMARY KEY (Document_ID));
b. At this point you may need to declare a cursor to repetitively call the CLR sproc you will create here:
Step 2: Make the CLR Sproc
Partial Public Class StoredProcedures
<Microsoft.SqlServer.Server.SqlProcedure()> _
Public Shared Sub usp_SF_ReadSFIntoTables()
End Sub
End Class
a. Connect using New SqlConnection("context connection=true").
b. Set up a command object (cmd) to contain the multiple-result-set sproc.
c. Get all the data using the following:
Dim dataset As DataSet = New DataSet
With New SqlDataAdapter(cmd)
.Fill(dataset) ' get all the data.
End With
'you can use dataset.ReadXmlSchema at this point...
d. Iterate over each table and insert every row into the appropriate temp table (which you created in step one above).
Final note:
In my experience, you may wish to enforce some relationships between your tables so you know which batch each record came from.
That's all there was to it!
~ Shaun, Near Seattle
There is a kludge that you can do as well. Add an optional parameter N int to your sproc. Default the value of N to -1. If the value of N is -1, then do every one of your selects. Otherwise, do the Nth select and only the Nth select.
For example,
if (N = -1 or N = 0)
select ...
if (N = -1 or N = 1)
select ...
The callers of your sproc who do not specify N will get a result set with more than one table. If you need to extract one or more of these tables from another sproc, simply call your sproc specifying a value for N. You'll have to call the sproc one time for each table you wish to extract. Inefficient if you need more than one table from the result set, but it does work in pure T-SQL.
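A minimal sketch of this kludge applied to the getOrder procedure from the question (the temp table column types are guesses for illustration):
CREATE PROCEDURE getOrderSets (@orderId as numeric, @N int = -1) AS
BEGIN
    if (@N = -1 or @N = 0)
        select order_address, order_number from order_table where order_id = @orderId
    if (@N = -1 or @N = 1)
        select item, number_of_items, cost from order_line where order_id = @orderId
END
GO
-- capture just the second result set
CREATE TABLE #lines (item varchar(100), number_of_items int, cost money)
INSERT INTO #lines
EXEC getOrderSets @orderId = 1, @N = 1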
Note that there's an extra, undocumented limitation to the INSERT INTO ... EXEC statement: it cannot be nested. That is, the stored proc that the EXEC calls (or any that it calls in turn) cannot itself do an INSERT INTO ... EXEC. It appears that there's a single scratchpad per process that accumulates the result, and if they're nested you'll get an error when the caller opens this up, and then the callee tries to open it again.
Matthieu, you'd need to maintain separate temp tables for each "type" of result. Also, if you're executing the same one multiple times, you might need to add an extra column to that result to indicate which call it resulted from.
Sadly it is impossible to do this. The problem is, of course, that there is no SQL Syntax to allow it. It happens 'beneath the hood' of course, but you can't get at these other results in TSQL, only from the application via ODBC or whatever.
There is a way round it, as with most things. The trick is to use OLE Automation in T-SQL to create an ADODB object which opens each resultset in turn and writes the results to the tables you nominate (or does whatever you want with the resultsets). You can also do it in DMO if you enjoy pain.
There are two ways to do this easily. Either stick the results in a temp table and then reference the temp table from your sproc. The other alternative is to put the results into an XML variable that is used as an OUTPUT variable.
There are, however, pros and cons to both of these options. With a temporary table, you'll need to add code to the script that creates the calling procedure to create the temporary table before modifying the procedure. Also, you should clean up the temp table at the end of the procedure.
With the XML, it can be memory intensive and slow.
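A minimal sketch of the XML OUTPUT approach, again based on the getOrder example (the procedure name and element names are arbitrary):
CREATE PROCEDURE getOrderXml (@orderId as numeric, @result xml OUTPUT) AS
BEGIN
    set @result =
    (
        select order_address, order_number
        from order_table
        where order_id = @orderId
        for xml path('order'), root('orders'), type
    )
END
GO
-- caller
DECLARE @x xml
EXEC getOrderXml @orderId = 1, @result = @x OUTPUT
SELECT @x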
You could select them into temp tables or write table-valued functions to return result sets. Are you asking how to iterate through the result sets?