SQL Server Stored Procedure WHERE Clause IF CASE No Value - sql

I have a stored procedure whose parameters all default to NULL. In the WHERE clause, for each parameter that is actually supplied (i.e. not NULL and not omitted, which also leaves it NULL), I would like the corresponding column to equal that value; otherwise, I would like to search as if that column were not in the WHERE clause at all. The basic structure looks something like this:
-- Set up the stored procedure
USE Table;
GO
CREATE PROCEDURE dbo.SearchTable
    -- Specify value(s)
    @Name varchar(50) = NULL
AS
BEGIN
    -- Set up the query
    IF (@Name IS NOT NULL)
        SELECT * FROM Table WHERE Name = @Name;
    ELSE
        SELECT * FROM Table;
END
I have more than one parameter, and so, with this logic, I would need two IFs for every parameter. As you can imagine, this gets tedious, time-consuming, and error-prone fast.
I would like to move this logic into the query, preferably into the WHERE clause, but every way I have found causes errors (besides exceptions, which would require just as many IFs). SQL Server doesn't allow IF inside a WHERE clause as far as I know, and with CASE I would have to name the column, which I do not want to do.
What should I do?
Edit:
I am using SQL Server 2012, so please focus on that or more recent versions in your answer.

If you don't care about performance, you can do:
SELECT *
FROM Table
WHERE @Name is null or Name = @Name;
Often, having an or condition gets in the way of efficient use of indexes. Perhaps this isn't a problem in your case, though.

You could do something like this. The downside to this is that indexes may not be used properly and thus the performance may not be great.
SELECT * FROM Table
WHERE (@Name Is Null Or Name = @Name)
  And (@Col2 Is Null Or Col2 = @Col2)
  And (@Col3 Is Null Or Col3 = @Col3)
Each column condition is ANDed together, and the OR inside each pair applies that column's condition only when the corresponding @variable is not null. For example, if this is called with just @Name populated, it is equivalent to WHERE Name = @Name. If both @Name and @Col2 are populated, it is equivalent to WHERE Name = @Name AND Col2 = @Col2.
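For reference, here is a minimal sketch of how the whole procedure from the question might look with this pattern, with an OPTION (RECOMPILE) hint added (my addition, not part of the answer) so the optimizer can build a plan for the parameter values actually supplied. Col2 and Col3 are the answer's placeholder columns, and dbo.MyTable stands in for the question's table:
CREATE PROCEDURE dbo.SearchTable
    @Name varchar(50) = NULL,
    @Col2 int = NULL,          -- placeholder type; adjust to the real column types
    @Col3 int = NULL
AS
BEGIN
    SELECT *
    FROM dbo.MyTable           -- stands in for the question's table
    WHERE (@Name IS NULL OR Name = @Name)
      AND (@Col2 IS NULL OR Col2 = @Col2)
      AND (@Col3 IS NULL OR Col3 = @Col3)
    OPTION (RECOMPILE);        -- lets the optimizer simplify away predicates whose parameter is NULL
END
GO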

Related

Test for a column within a Select statement

Is it possible to test for a column before selecting it within a select statement?
This may be rough for me to explain; I have had to teach myself dynamic SQL over the past 4 months. I am using a dynamically generated parameter (@TableName) to hold individual table names within a loop (apologies for the vagueness, but the details aren't relevant).
I then want to be able to conditionally select a column from the table (I will not know whether each table has certain columns). I have figured out how to check for a column outside of a SELECT statement...
SET @SQLQuery2 = 'Select @OPFolderIDColumnCheck = Column_Name From INFORMATION_SCHEMA.COLUMNS Where Table_Name = @TABLENAME And Column_Name = ''OP__FolderID'''
SET @ParameterDefinition2 = N'@TABLENAME VARCHAR(100), @OPFolderIDColumnCheck VARCHAR(100) OUTPUT'
EXECUTE sp_executesql @SQLQuery2, @ParameterDefinition2, @TABLENAME, @OPFolderIDColumnCheck OUTPUT
IF @OPFolderIDColumnCheck IS NULL
BEGIN
    SET @OP__FOLDERID = NULL
END
ELSE IF @OPFolderIDColumnCheck IS NOT NULL
BEGIN
    ...etc
but I'd like to be able to do it inside of a SELECT statement. Is there a way to check and see if OP__FOLDERID exists in the table?
I'd like to be able to do something like this:
SELECT IF 'OP__FOLDERID' EXISTS IN [TABLE] THEN 'OP__FOLDERID' FROM [TABLE]
Thank you for any help or direction you can offer.
I'm afraid there isn't any direct way to do this within a SELECT statement at all. You can determine if a column exists in a table, however, and construct your dynamic SQL accordingly. To do this, use something like this:
IF COL_LENGTH('schemaName.tableName', 'columnName') IS NOT NULL
BEGIN
-- Column Exists
END
You could then set a variable as a flag, and the code that builds the dynamic SQL would construct the expression with or without the column, as desired. Another approach is to use a string variable and set it to the column name when the column is present (perhaps with a leading or trailing comma, as appropriate to the expression). This saves writing conditionals while building the expression, and is particularly helpful when you have more than one or two of these maybe-columns in a dynamic expression.
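A rough sketch of that second approach, under the assumption that @TableName holds a bare table name and that the table always has at least one known column (SomeKnownColumn here is purely illustrative):
DECLARE @TableName sysname = N'SomeTable';    -- hypothetical; in the question this comes from a loop
DECLARE @OptionalCols nvarchar(200) = N'';

IF COL_LENGTH(@TableName, 'OP__FolderID') IS NOT NULL
    SET @OptionalCols = N', OP__FolderID';    -- leading comma so it can be appended to the column list

DECLARE @SQLQuery nvarchar(max) =
    N'SELECT SomeKnownColumn' + @OptionalCols +
    N' FROM ' + QUOTENAME(@TableName) + N';';

EXEC sp_executesql @SQLQuery;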

Adding to WHERE clause conditions using CASE

I have a stored procedure that accepts an optional @ID parameter. When the parameter is passed in, I want the WHERE clause to include something like id = @ID; otherwise, when @ID is null, I don't want the result to be filtered on it.
For example:
@ID BIGINT = NULL
SELECT * from myTable
WHERE
CASE
    WHEN @ID IS NOT NULL THEN mytable.id = @ID
END
I am running this in SQL Server 2016 and it reports bad syntax near mytable.id = @ID. Can CASE be used this way, or should I try a different approach?
The only other option I considered was using IF conditions in my stored procedure, but based on my searches that didn't seem possible either.
CASE is an expression, not a statement. It is not used to control flow like this and it will not work.
Your logic would need to be something like this.
Where mytable.id = ISNULL(@ID, mytable.id)
I should caution you that this pattern can lead to some poor performance. For a more detailed explanation and some other options you should check out this article. http://www.sqlinthewild.co.za/index.php/2009/03/19/catch-all-queries/
A bad-performance approach would be:
WHERE ISNULL(@ID, mytable.id) = mytable.id
A better-performance approach would be:
IF (@ID IS NULL)
    select * from ... without the WHERE condition
ELSE
    do your query with the condition mytable.id = @ID
Or build the query dynamically in the stored proc and execute it through sp_executesql passing parameters
Note: If the table is small enough, stick to simplicity and use the first option.
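A minimal sketch of that dynamic option, passing @ID as a real parameter through sp_executesql rather than concatenating it into the string (myTable and id are the question's names):
DECLARE @sql nvarchar(max) = N'SELECT * FROM dbo.myTable';

IF @ID IS NOT NULL
    SET @sql = @sql + N' WHERE id = @ID';

-- The parameter list may include @ID even when the statement text does not reference it.
EXEC sp_executesql @sql, N'@ID bigint', @ID = @ID;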

Correct usage of WHERE and AND in procedure with optional parameters

Pre-question info:
I'm writing a stored procedure that takes some parameters, and depending on those parameters (if they are filled in, because they don't have to be) I add a few WHERE conditions. The thing is, I don't know whether I will even need a WHERE clause at all, because I don't know in advance whether any of my parameters will be non-empty/not-null.
The inside of the procedure looks roughly like this:
BEGIN
    DECLARE @strMySelect varchar(max)
    SET @strMySelect = 'SELECT myparams FROM mytable'
    -- add some WHERE statement(*)
    IF (ISNULL(@myParamDate1, '') <> '')
    BEGIN
        SET @strMySelect = @strMySelect + '
        AND param1 >= ''' + CAST(@myParamDate1 AS varchar(30)) + ''''
    END
    IF (ISNULL(@myParamDate2, '') <> '')
    BEGIN
        SET @strMySelect = @strMySelect + '
        AND param1 <= ''' + CAST(@myParamDate2 AS varchar(30)) + ''''
    END
    -- ... a few more of these "AND"s
    EXECUTE (@strMySelect)
QUESTION:
Is it OK (i.e. the correct way of doing this) to put a WHERE condition in my query that I know will always be true, so that every parameter case can simply append AND? Or do I have to check, for each parameter, whether it is the first one that is filled in, or is there an easy way in SQL to check that at least one of my parameters isn't NULL/empty?
I handle optional parameters like this:
where (
    (@optionalParameter is not null and someField = @optionalParameter)
    or
    @optionalParameter is null
)
etc
I find it simpler.
Your extra where clause is not a problem from a performance point-of-view, since the query optimizer will (likely) remove the 1 = 1 condition anyway.
However, I would recommend a solution along the lines of what Dan Bracuk suggested for two reasons:
It is easier to read, write and debug.
You avoid the possibility of SQL injection attacks.
There are cases where you have to custom-build your query-string (e.g. when given a table name as parameter), but I would avoid it whenever possible.
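As a sketch of what that would look like for the date filters above, using sp_executesql with real parameters instead of concatenated CASTs (the inner parameter names @p1 and @p2 are mine; myparams, mytable, and param1 come from the question):
DECLARE @strMySelect nvarchar(max) = N'SELECT myparams FROM mytable WHERE 1 = 1';

IF @myParamDate1 IS NOT NULL
    SET @strMySelect = @strMySelect + N' AND param1 >= @p1';
IF @myParamDate2 IS NOT NULL
    SET @strMySelect = @strMySelect + N' AND param1 <= @p2';

EXEC sp_executesql @strMySelect,
     N'@p1 datetime, @p2 datetime',
     @p1 = @myParamDate1,
     @p2 = @myParamDate2;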
You don't need to use EXEC to check for parameters. A good practice is to use CASE to check for the parameter value, for example:
CREATE PROCEDURE MyProc
    @Param1 int = 0
AS
BEGIN
    SELECT * FROM MyTable WHERE CASE @Param1 WHEN 0 THEN @Param1 ELSE MyField END = @Param1
END
GO
If @Param1 has no value (the default 0), the condition becomes @Param1 = @Param1, which is always true; if @Param1 has a value, the condition becomes MyField = @Param1.
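For illustration, calling it both ways (MyProc, MyTable, and MyField are the answer's placeholders):
EXEC MyProc;               -- @Param1 keeps its default 0, so the condition is @Param1 = @Param1 and nothing is filtered
EXEC MyProc @Param1 = 42;  -- the condition becomes MyField = 42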

WHERE condition has a declared variable in an odd place, and I don't know if it's valid

I have a statement like the one below in one of my big stored procedures, and I am wondering whether it is valid to write it this way.
SELECT @PVDate = pv.Date,
       @PVdMBeginDate = dbo.fPVoidDate(pv.myID)
FROM PVMeter pv (NOLOCK)
WHERE pv.PR_ID = @PR_ID
  AND @VCommen BETWEEN pv.PVDMDate AND dbo.fPVoidDate(pv.myID)
My question is about @VCommen: it is a declared date variable with a value already set on it. It is not a column of PVMeter at all, while PVDMDate is a column in PVMeter and dbo.fPVoidDate returns a datetime.
While debugging the stored procedure, I do not see any value assigned to @PVDate and @PVdMBeginDate.
The original query is equivalent to:
SELECT
    @PVDate = pv.Date,
    @PVdMBeginDate = dbo.fPVoidDate(pv.myID)
FROM PVMeter pv (NOLOCK)
WHERE
    pv.PR_ID = @PR_ID
    AND pv.PVDMDate <= @VCommen
    AND @VCommen <= dbo.fPVoidDate(pv.myID)
This style should be more familiar. In general, you can put any expression in the WHERE clause; it can be made of variables or constants without referring to table columns at all.
Examples
A classic example is WHERE 1=1 when the query text is generated dynamically. It makes it easy to add as many expressions as needed, in no particular order, each prefixed with AND.
DECLARE @VarSQL nvarchar(max);
SET @VarSQL = 'SELECT ... FROM ... WHERE 1=1 ';
IF ... SET @VarSQL = @VarSQL + ' AND expression1';
IF ... SET @VarSQL = @VarSQL + ' AND expression2';
IF ... SET @VarSQL = @VarSQL + ' AND expression3';
EXEC (@VarSQL);
Thus you don't need to have complex logic determining whether you need to add an AND before each expression or not.
Another example.
You have a stored procedure with parameter @ParamID int.
You have a complex query in the procedure that usually returns many rows and one column of the result set is some unique ID.
SELECT ID, ...
FROM ...
WHERE
expression1
AND expression2
AND expression3
...
You want to return all rows if @ParamID is NULL and only the one row with the given ID if @ParamID is not NULL. I personally use this approach: when I open a screen with the results of a query for the first time, I want to show all rows to the user, so I pass NULL as the parameter. Then the user makes changes to a selected row, which is done through a separate UPDATE statement. Then I want to refresh the results the user sees on the screen. I know the ID of the row that was just changed, so I need to requery just that row; I pass this ID to the procedure and fetch only one row instead of the whole table again.
The final query would look like this:
SELECT ID, ...
FROM ...
WHERE
(@ParamID IS NULL OR ID = @ParamID)
AND expression1
AND expression2
AND expression3
...
OPTION (RECOMPILE);
Thus I don't have to repeat the complex code of the query twice.

Best way to filter queries by parameter?

I have been using this method to filter my queries:
CREATE PROCEDURE [dbo].[pGetTask]
    @showCompletedTasks bit = 1
    ,@showInProgressTasks bit = 1
    ,@taskID int = null
    ,@projectID int = null
    ,@applicationID int = null
    ,@clientID int = null
... Snip ...
where
    a.clientID = isnull(@clientID, a.clientID)
    and a.applicationID = isnull(@applicationID, a.applicationID)
    and p.projectID = isnull(@projectID, p.projectID)
    and t.taskID = isnull(@taskID, t.taskID)
    and curr.complete = case @showCompletedTasks when 0 then 0 else curr.complete end
    and curr.complete = case @showInProgressTasks when 0 then 1 else curr.complete end
This actually slows my queries by 2 seconds on a 664-row result set. The SQL Tuning Advisor isn't much help, so I figure this is not the right way to do this. Is there a right way, besides a ton of IF statements?
Assuming you have properly indexed the table the SELECT runs against, and these fields are part of the index, my guess is that the culprit is the calls to ISNULL. I would change them to this:
(@clientID is null or a.clientID = @clientID) and ...
As for the case statements, indexes on bit fields are pointless, so there's not much to do there.
Check your indexes and statistics; that seems a little slow. The other option would be a dynamic query: essentially, build a string representing your SQL and execute it using sp_executesql (or an EXEC statement).
Edit
You could try combining your two CASEs, but I doubt it will have an effect on the performance of the query. It would look better, though.
Also, I'm not sure your query is right (which is hard to say without more info), but shouldn't there be an OR between the two CASEs? You're trying to provide two states to return, and since they are separate parameters I assume I can ask for only complete, only not complete, or both; in that case you need an OR.
Your best bet is to use this stored procedure to call a series of more specific procedures. You have two issues:
1. The use of the CASE statement causes a table scan, which (obviously) ignores any indexes you might have.
2. Even if you break the statement out into several that are called conditionally, you'll still end up with a compiled execution plan that is specific to the first call to this procedure.
If you create specific procedures, like pGetTask_Completed and pGetTask_InProgress and call them conditionally from within this proc, you shouldn't have any issues.
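A minimal sketch of that layout, with the outer procedure doing nothing but dispatching (pGetTask_Both is a hypothetical name for the "show everything" case, and the remaining filter parameters are omitted for brevity):
CREATE PROCEDURE dbo.pGetTask
    @showCompletedTasks bit = 1,
    @showInProgressTasks bit = 1
    -- ... the other filter parameters would be passed straight through ...
AS
BEGIN
    IF @showCompletedTasks = 1 AND @showInProgressTasks = 0
        EXEC dbo.pGetTask_Completed;
    ELSE IF @showInProgressTasks = 1 AND @showCompletedTasks = 0
        EXEC dbo.pGetTask_InProgress;
    ELSE
        EXEC dbo.pGetTask_Both;    -- hypothetical: both flags set (or neither)
END
GO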
You could be a victim of the "parameter sniffing" problem. SQL Server takes the parameters from the first run of your stored procedure as the representative sample for building the query plan, and your query could be slow because of this.
To verify this, try running the body of your query directly, simulating the populated parameters as local variables. If it is much faster, then you do indeed have the "parameter sniffing" problem.
One workaround is to make SQL Server think that your parameters are only being used to be assigned to other variables. Example:
create proc ManyParams
(
    @pcol1 int,
    @pcol2 int,
    @pcol3 int
)
as
declare
    @col1 int,
    @col2 int,
    @col3 int
select
    @col1 = @pcol1,
    @col2 = @pcol2,
    @col3 = @pcol3
select
    col1,
    col2,
    col3
from
    tbl
where
    1 = case when @col1 is null then 1 else case when col1 = @col1 then 1 else 0 end end
    and 1 = case when @col2 is null then 1 else case when col2 = @col2 then 1 else 0 end end
    and 1 = case when @col3 is null then 1 else case when col3 = @col3 then 1 else 0 end end
Here is a great article on this topic:
http://www.sommarskog.se/dyn-search-2005.html
It will give you a lot of ideas to try out.
I tend to do a mix of things to make these "search"-type queries go fast. Here are some that I seem to use all the time:
I try to make certain search parameters required, so you can hit the index on those.
If possible (it depends on the number of rows), split up the query using temp tables. If you only have a few hundred ClientID values, create a #ClientID temp table. Put in the one the user wants, or all of them. You can then make this the FROM table and/or inner join other tables to it to make the query much faster.
If you have dates that are optional, don't use anything like (@startDate is null or a.date >= @startDate). Just do something like SET @startDate = COALESCE(@startDate, '01/01/1970'). This gives you a value, eliminates the "OR", and lets the query use an index.
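Hedged sketches of the last two tips; the variable declarations stand in for the procedure's parameters, and the table and column names (dbo.Client, dbo.Task, createdDate) are illustrative, not from the question:
DECLARE @clientID int = NULL, @startDate datetime = NULL;   -- stand-ins for the procedure parameters

-- Temp-table tip: materialize either the one requested client or all of them,
-- then join to it instead of using an optional predicate.
CREATE TABLE #ClientID (clientID int PRIMARY KEY);

IF @clientID IS NOT NULL
    INSERT INTO #ClientID (clientID) VALUES (@clientID);
ELSE
    INSERT INTO #ClientID (clientID)
    SELECT clientID FROM dbo.Client;        -- hypothetical source of all client IDs

-- Date tip: replace NULL with a sentinel earlier than any real data,
-- so the predicate stays a plain, index-friendly range comparison.
SET @startDate = COALESCE(@startDate, '19700101');

SELECT t.taskID
FROM dbo.Task AS t                          -- hypothetical table
JOIN #ClientID AS c ON c.clientID = t.clientID
WHERE t.createdDate >= @startDate;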
casperOne's suggestion is what I would start with.
One other possibility is this:
WHERE
    (1 =
        CASE
            WHEN @clientID IS NULL THEN 1
            WHEN a.clientID = @clientID THEN 1
            ELSE 0
        END) AND
    ...
I found that in SQL Server (at least 2005), using the CASE statement like this can cause the query plan to short-circuit the rest of the logic. For simple comparisons that's not really a big deal, but if your logic includes a subquery or some other expensive operation, being able to short-circuit it can be a big help. In your example, though, I would just go with casperOne's suggestion.
Also, if you use the CASE method above, you'll need to add the RECOMPILE option to your SELECT and to your stored procedure.
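For reference, a sketch of where those recompile hints go (the body is abbreviated to the single CASE predicate from above; dbo.Task is a hypothetical table name):
CREATE PROCEDURE dbo.pGetTask
    @clientID int = NULL
    -- ... remaining parameters ...
WITH RECOMPILE                  -- recompiles the whole procedure on every execution
AS
BEGIN
    SELECT t.taskID
    FROM dbo.Task AS t          -- abbreviated; the real query joins several tables
    WHERE 1 = CASE
                  WHEN @clientID IS NULL THEN 1
                  WHEN t.clientID = @clientID THEN 1
                  ELSE 0
              END
    OPTION (RECOMPILE);         -- or recompile just this statement instead
END
GO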