How can I break in my stored procedure code that is under test in SSMS?

I am using tSQLt to test my SQL stored procedures. I would like to be able to debug my stored procedures within the context of the test (with the faked tables etc.).
I'm having issues setting/hitting a breakpoint in SSMS while debugging.
The code I'm starting the debugger in looks something like:
EXEC tSQLt.Run @TestName = '[someTestClass].[test something]'
I am able to step into this call and the various subsequent calls, but this is tedious. I'd like to be able to set a breakpoint in my stored procedure under test, but I cannot find any documentation on how to accomplish this besides this, which seems to be outdated for SSMS 2017.
Is there a way to set a breakpoint in a stored proc using the Object Explorer in SSMS? Is there a method in tSQLt that makes it easier to step into a test/code under test? Is there a better tool for debugging something like this other than SSMS?
Any help or advice would be much appreciated.

This is how I debug my tests. I came to this question looking for best practices / how other people debug their tests, so hopefully someone can provide a better answer.
My object code is stored in DB projects and deployed via dacpacs. When developing the tests I deploy the DB from Visual Studio, then execute tSQLt.Run via SSMS.
In my tsqlt databases I have a custom table
CREATE TABLE [tSQLt].[DebugTests]
(
    [IsDebug] BIT NOT NULL DEFAULT 1,
    CONSTRAINT PK_T1 PRIMARY KEY ([IsDebug]),
    CONSTRAINT CK_T1_Locked CHECK ([IsDebug] = 1)
)
If I am actively debugging my tests, I insert into that table and my tests will know that they are in Debug mode and should print/return stuff.
INSERT INTO tSQLt.DebugTests(IsDebug)
VALUES(1);
Inside my tests, I include this snippet at the beginning of the sproc. It sets a variable that I can then use throughout the sproc to surface various info as I need it.
CREATE PROCEDURE [tests_schema].[test myStuff]
AS
DECLARE @Debug BIT =
(
    SELECT CASE
               WHEN COALESCE(IsDebug, 0) = 1 THEN 1
               ELSE 0
           END
    FROM tSQLt.DebugTests
);
For instance, I might want to see what my fake table looks like:
EXEC tSQLt.FakeTable '[dbo].[MyTable]';
INSERT INTO [dbo].[MyTable] ([Column1], [Column5])
VALUES
     (1, 'Value')
    ,(2, 'AnotherValue');
IF (@Debug = 1)
    SELECT 'FakeDataArrange' AS [ResultType], '[dbo].[MyTable]' AS [TableName], * FROM [dbo].[MyTable];
I might also want to see what the actual results of my stored proc/view are, or surface the #Expected values so that I do not have to jump back to the source code to remember what this test is expecting:
IF (@Debug = 1)
    SELECT '#Actual' AS [ResultType], *
    FROM #Actual;
Then truncate [tSQLt].[DebugTests] when you are done. Spitting out all of these debug messages/results every time is not performant, so you really want to make sure that you only have a value in that table while you are actively developing tests.
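For completeness, that cleanup is a single statement against the table defined above:
TRUNCATE TABLE tSQLt.DebugTests;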
I have found this to be pretty good for my workflow. Normally I only need to see the values of my faked tables to be able to figure out what is wrong. That being said, I am not testing any complex code, just basic stored procs that are used for reports.

Related

SQL Unit Testing: Creating a test for a stored procedure that does not necessarily return a value

I am new to SQL unit testing, but I have written my first test and am trying to make sure it is a sensible one. I have a stored procedure that does a simple update (if the record exists) or insert (if it does not exist). Using TSQLUnit, I wrote the test below for my stored procedure called spModifyData. The test is designed to verify that when an existing ID is passed, a new record is not created in the database.
ALTER PROCEDURE [dbo].[ut_TestspModifyData]
AS
BEGIN
    SET NOCOUNT ON;

    -- Setup the test conditions by inserting test data
    DECLARE
        @newidone   uniqueidentifier,
        @newidtwo   uniqueidentifier,
        @newidthree uniqueidentifier,
        @ExpectedID uniqueidentifier,
        @ActualID   uniqueidentifier

    SET @ActualID   = '13E7C741-9A04-4E84-B604-141874A6A9B4'
    SET @ExpectedID = '13E7C741-9A04-4E84-B604-141874A6A9B4'
    SET @newidone   = NEWID()
    SET @newidtwo   = NEWID()
    SET @newidthree = NEWID()

    INSERT INTO DataSource ([DataSourcePrimarySource], [DataSourceName], [DataSourceRecordCreateDate],
                            [DataSourceStatus], [DataSourceIsActive])
    VALUES ('PRIMARY SOURCE ONE', 'XYZ', GETDATE(), @newidone, 1)

    -- Exercise the test
    EXEC spModifyDataSource @ActualID, 'PRIMARY SOURCE ONETWO', 'BBB', @newidone, 0

    -- Assert expectations
    IF (@ExpectedID != @ActualID)
        EXEC dbo.tsu_Failure 'ModifyData failed.'

    -- Teardown
    -- Implicitly done via ROLLBACK TRAN
END
As this test is testing a functional part of your stored procedure, yes, it is a "sensible" unit test - it doesn't have to test parameters to be valid.
That being said, I'm not sure it's the best way of testing what you are trying to test. You appear to be using @ActualID as a variable which you set early on but don't specify as an OUTPUT parameter, so I would always expect this test to pass (either I'm reading that wrong, or there's a bug in the test).
Your question stated that your test aim was: "verify that when an existing ID is passed, a new record is not created in the database."
I would actually approach the assert section of this test slightly differently - I'd check the result in the DataSource table directly rather than checking the returned parameter - as you are then checking what's created in the database. I would additionally check any output parameter, but usually I'd regard output parameters as a separate test to data checks, or at least a separate assert - and therefore separate message - in the test.
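As a rough sketch of that kind of table-level assert, reusing the variables and the insert from the test above (the exact predicate would depend on what spModifyDataSource actually does), the check could simply count rows before and after the call:
DECLARE @CountBefore int, @CountAfter int;

SELECT @CountBefore = COUNT(*) FROM DataSource;

-- Exercise the test
EXEC spModifyDataSource @ActualID, 'PRIMARY SOURCE ONETWO', 'BBB', @newidone, 0;

SELECT @CountAfter = COUNT(*) FROM DataSource;

-- Assert: passing an existing ID must not create a new record
IF (@CountAfter <> @CountBefore)
    EXEC dbo.tsu_Failure 'spModifyData created a new record when an existing ID was passed.';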
I think you will find it helpful if you can use more descriptive messages - as per the example in the TSQLUnit cookbook
I find it easier to resolve tests that only test one thing - and therefore I know what needs to be fixed. They are often simpler to write, too!
Does that help?

Logging detailed information to a file in SQL Server

Does anyone have some code to simply log some detailed information to a file within a SQL query (or stored procedure or trigger)? I'm not looking for anything fancy. All I want is to quickly put debug information into my SQL, much like folks do for JavaScript debugging using alerts. I looked at using Lumigent, but that seems like overkill for what I want to do. I don't care what format the logging is in. Below is a simple example of what I'd like to do.
Example:
DECLARE @x int;
SET @x = '123'
-- log the value of @x
============
9/6/2011 @ 4:01pm update
I tried the sqlcmd below, which works well. But it doesn't work well when the stored procedure I want to debug has 100 parameters. In that case, I need to put a breakpoint in my client code, get the value of each argument, type out the exec command, and then look at the output file. All I want to do is put one simple line of code into my SQL (perhaps calling another stored procedure if it takes more than one line of code) that writes a variable value to a file. That's it. I'm just using this for debugging purposes.
One pretty easy method is to use either OSQL or SQLCMD to run your procedure. These are command-line methods for executing SQL commands/procedures/scripts.
With those utilities you can pipe the output (what would normally appear in the "Messages" tab in SSMS) to a text file.
If you do this, in your example the code would be:
DECLARE @x int;
SET @x = '123'
PRINT @x
If you are running the same procedure multiple times, you can just save it as a one-line batch file to make it very easy to test.
Now with more background I think I can promote my comment to an answer:
Why does it have to be a file? If this is just during debugging, can't you just as easily log to a table, and when you want to see the recent results:
SELECT TOP (n) module, parameters, etc.
FROM logTable
ORDER BY DateCreated DESC;
You can simplify the logging, or at least make it easier to replicate from procedure to procedure, by having a stored procedure that takes various arguments such as @@PROCID and others to centralize the logging. See this article I wrote for some ideas there - it's geared to logging just once per stored procedure call, but you could certainly call it as often as you like inside any stored procedure.
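If it helps, a minimal sketch of that kind of centralized logging (the table and procedure names here are made up for illustration, not taken from the linked article) might look like:
CREATE TABLE dbo.DebugLog
(
    LogId       int IDENTITY(1,1) PRIMARY KEY,
    DateCreated datetime      NOT NULL DEFAULT GETDATE(),
    Module      sysname       NOT NULL,
    Parameters  nvarchar(max) NULL
);
GO
CREATE PROCEDURE dbo.LogDebug
    @ProcId     int,
    @Parameters nvarchar(max) = NULL
AS
BEGIN
    SET NOCOUNT ON;
    -- OBJECT_NAME resolves @@PROCID (passed in by the caller) to the procedure name
    INSERT INTO dbo.DebugLog (Module, Parameters)
    VALUES (OBJECT_NAME(@ProcId), @Parameters);
END
GO
Then inside any stored procedure you can call it as often as you like:
EXEC dbo.LogDebug @@PROCID, N'@x = 123';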
This seems like much less hassle than using an archaic file-based log approach. You're already using a database, take advantage!
If you're committed to using a file for whatever reason (it might help to understand or counter if you enumerate those reasons), then the next best choice would likely be CLR, as already mentioned. A complete solution in this case might be beyond the scope of this single question, but there are tons of examples online.

How can I tell what the parameter values are for a problem stored procedure?

I have a stored procedure that causes blocking on my SQL Server database. Whenever it blocks for more than X seconds, we get notified with the query that is being run, and it looks similar to the below.
CREATE PROC [dbo].[sp_problemprocedure] (
@orderid INT
--procedure code
How can I tell what the value is for @orderid? I'd like to know the value because this procedure will run 100+ times a day but only causes blocking a handful of times, and if we can find some sort of pattern between the order IDs, maybe I'd be able to track down the problem.
The procedure is being called from a .NET application if that helps.
Have you tried printing it from inside the procedure?
http://msdn.microsoft.com/en-us/library/ms176047.aspx
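For example, a hedged sketch based on the procedure shown in the question (the message text is just illustrative):
ALTER PROC [dbo].[sp_problemprocedure] (
    @orderid INT
)
AS
    -- surface the parameter value in the Messages output
    PRINT 'sp_problemprocedure called with @orderid = '
        + ISNULL(CAST(@orderid AS varchar(20)), 'NULL');
    --procedure code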
If it's being called from a .NET application, you could easily log the parameter being passed from the .NET app. If you don't have access to that code, you can also use SQL Server Profiler. Filters can be set on the command type (i.e. procedures only) as well as on the database that is being hit, otherwise you will be overwhelmed with all the information a trace can produce.
Link: Using SQL Server Profiler
1. Rename the procedure.
2. Create a logging table.
3. Create a new procedure (same signature/params) which first logs the params and a starting timestamp, calls the original, and then logs the ending timestamp after the call finishes.
4. Create a synonym for this new proc with the name of the original.
Now you have a log for all calls made by whatever app...
You can disable/enable the logging at any time by simply redefining the synonym to point to the logging wrapper or to the original...
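A rough sketch of those steps, using the procedure from the question (the wrapper and log table names are invented for illustration):
-- 1) rename the original procedure
EXEC sp_rename 'dbo.sp_problemprocedure', 'sp_problemprocedure_original';
GO
-- 2) logging table
CREATE TABLE dbo.ProcCallLog
(
    LogId     int IDENTITY(1,1) PRIMARY KEY,
    OrderId   int,
    StartTime datetime NOT NULL,
    EndTime   datetime NULL
);
GO
-- 3) wrapper with the same signature: log, call the original, log again
CREATE PROCEDURE dbo.sp_problemprocedure_logged (
    @orderid INT
)
AS
BEGIN
    DECLARE @logId int;

    INSERT INTO dbo.ProcCallLog (OrderId, StartTime)
    VALUES (@orderid, GETDATE());
    SET @logId = SCOPE_IDENTITY();

    EXEC dbo.sp_problemprocedure_original @orderid;

    UPDATE dbo.ProcCallLog SET EndTime = GETDATE() WHERE LogId = @logId;
END
GO
-- 4) point the original name at the wrapper so callers are unaffected
CREATE SYNONYM dbo.sp_problemprocedure FOR dbo.sp_problemprocedure_logged;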
The easiest way would be to run a profiler trace. You'll want to capture calls to the stored procedure.
Really though, that is only going to tell you part of the story. Personally I would start with the code. Try to batch big updates into smaller batches, and try to avoid long-running explicit transactions if they're not necessary. Look at your triggers (if any) and cascading foreign keys and make sure those are efficient.
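For the batching point, a generic pattern (table, column, and predicate are placeholders) is to loop over small chunks so locks are only held briefly:
WHILE 1 = 1
BEGIN
    UPDATE TOP (5000) dbo.BigTable
    SET    StatusCode = 2
    WHERE  StatusCode = 1;   -- predicate that shrinks as rows are processed

    IF @@ROWCOUNT = 0
        BREAK;               -- nothing left to update
END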
The easiest way is to do the following:
1) In .NET, grab the date-time just before running the procedure.
2) In .NET, after the procedure is complete, grab the date-time.
3) In .NET, do some date-time math, and if it is "slow", write to a file (log) those start and end date-times, user info, all the parameters, etc.

How to troubleshoot a stored procedure?

What is the best way of troubleshooting a stored procedure in SQL Server? I mean, where do you start, etc.?
Test each SELECT statement (if any) outside of your stored procedure to see whether it returns the expected results;
Make INSERT and UPDATE statements as simple as possible;
Try to test the inserts and updates outside of your SP so that you can check they give the expected results;
Use the debugger provided with SSMS Express 2008.
Visual Studio 2008 / 2010 has a debug facility. Simply connect to your SQL Server instance in 'Server Explorer' and browse to your stored procedure.
Visual Studio 'Test Edition' also can generate Unit Tests around your stored procedures.
Troubleshooting a complex stored proc is far more than just determining whether you can get it to run and finding the step which won't run. What is most critical is whether it actually returns the correct results or performs the correct actions.
There are two kinds of stored procs that need extensive abilities to troubleshoot. First is the proc which creates dynamic SQL. I never create one of these without an input parameter of @debug. When this parameter is set, I have the proc print the SQL statement as it would have been run, instead of running it. Almost every time, this leads you straight to the problem, as you can then see the syntax error in the generated SQL code. You can also run this SQL code yourself to see if it is returning the records you expect.
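A simplified sketch of that @debug pattern (the procedure, table, and column names here are invented):
CREATE PROCEDURE dbo.SearchOrders
    @OrderBy sysname,
    @debug   bit = 0
AS
BEGIN
    DECLARE @sql nvarchar(max);

    SET @sql = N'SELECT OrderId, OrderDate FROM dbo.Orders ORDER BY '
             + QUOTENAME(@OrderBy) + N';';

    IF @debug = 1
        PRINT @sql;                  -- show the generated statement instead of running it
    ELSE
        EXEC sys.sp_executesql @sql;
END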
Now with complex procs that have many steps that affect data, I always use an @test input parameter. There are two things I do with the @test parameter: first, I make it roll back the actions so that a mistake in development won't mess up the data; second, I have it display the data before it rolls back, to see what the results would have been. (These actually appear in the reverse order in the proc; I just think of them in this order.)
Now I can see what would have gone into the table or been deleted from the tables without affecting the data permanently. Sometimes I might start with a select of the data as it was before any actions and then compare it to a select run afterwards.
Finally, I often want to log the actions of a complex proc and see exactly what steps happened. I don't want those logs to get rolled back if the proc hits an error, so I set up a table variable for the logging information I want at the start of the proc. After each step (or after an error, depending on what I want to log), I insert into this table variable. After the rollback or commit statement, I select the results of the table variable, or use those results to log to a permanent logging table. This can be especially nice if you are using dynamic SQL, because you can log the SQL that was run; then when something strange fails on prod, you have a record of which statement was run when it failed. You do this in a table variable because table variables do not go out of scope in a rollback.
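Putting the @test flag and the table-variable log together, a stripped-down sketch (all object names here are illustrative) could look like:
CREATE PROCEDURE dbo.DoComplexWork
    @test bit = 0
AS
BEGIN
    DECLARE @log table (StepTime datetime     NOT NULL DEFAULT GETDATE(),
                        Step     varchar(200) NOT NULL);

    BEGIN TRANSACTION;

    UPDATE dbo.SomeTable SET SomeColumn = 1 WHERE SomeKey = 42;
    INSERT INTO @log (Step) VALUES ('Updated SomeTable');

    IF @test = 1
    BEGIN
        -- show what the results would have been, then undo them
        SELECT * FROM dbo.SomeTable WHERE SomeKey = 42;
        ROLLBACK TRANSACTION;
    END
    ELSE
        COMMIT TRANSACTION;

    -- the table variable keeps its rows even after the rollback
    SELECT StepTime, Step FROM @log;
END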
In SSMS, you can simply start by opening the proc and clicking the check mark button (Parse) next to the Execute button on the toolbar. It reports any errors it finds.
If there are no errors there and your stored procedure is harmless to run (you're not inserting into tables, just creating a temp table for example), then comment out the CREATE PROCEDURE x (or ALTER PROCEDURE x), declare all the parameters by copying that part, and define them with valid values. Then run it to see what happens.
Maybe this is simple, but it's a place to start.

Using a table just after creating it: object does not exist

I have a script in T-SQL that goes like this:
create table TableName (...)
SET IDENTITY_INSERT TableName ON
And on the second line I get the error:
Cannot find the object "TableName" because it does not exist or you do not have permissions.
I execute it from Management Studio 2005. When I put "GO" between these two lines, it works. But what I would like to accomplish is not to use "GO", because I would like to place this code in my application when it is finished.
So my question is how to make this work without using "GO", so that I can run it programmatically from my C# application.
Without using GO, programmatically, you would need to make 2 separate database calls.
Run the two scripts one after the other - using two calls from your application.
You should only run the second once the first has successfully run anyway, so you could run the first script and on success run the second script. The table has to have been created before you can use it, which is why you need the GO in management studio.
From the BOL: "SQL Server utilities interpret GO as a signal that they should send the current batch of Transact-SQL statements to SQL Server". Therefore, as Jose Basilio already pointed out, you have to make separate database calls.
If this can help: I was faced with the same problem, and I had to write a little (very basic) parser to split every single script into a bunch of mini-scripts, which are sent one at a time to the database.
Something even better than tpdi's temp table is a table variable. They run lightning fast and are dropped automatically once out of scope.
This is how you make one:
DECLARE @TableName TABLE (ColumnName int, ColumnName2 nvarchar(50))
Then to insert you just do this:
INSERT INTO @TableName (ColumnName, ColumnName2)
SELECT 1, 'A'
Consider writing a stored proc that creates a temporary table and does whatever it needs to with that. If you create a real table, your app won't be able to run the script more than once, unless it also drops the table -- in which case, you have exactly the functionality of a temp table.