Optional Reporting Parameters (is there a better way?) - sql

I have a report with 4 parameters. I would like to make them not required. The problem is that the conventional approach to doing this creates 16 OR/AND conditions. If I had 10 optional parameters the SQL statement would be out of control. This works, but is there an easier way?
Here is what I have:
MAIN DATA SET:
select *
from table
where
(table.one = @param1 OR @param1 IS NULL)
AND ..... (etc. etc.)
@param1, @param2, @param3, @param4: (default value null/blank)
SELECT some_column FROM any_table UNION SELECT '' AS Nothing

The way I've always done it is
WHERE
col1 = ISNULL(@col1, col1)
and col2 = ISNULL(@col2, col2)
...etc
So pretty much what you have, with some semantic corrections.
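For illustration, a minimal sketch of that pattern applied to the question's four parameters (the extra column names are assumed placeholders):
SELECT *
FROM table
WHERE table.one = ISNULL(@param1, table.one)
AND table.two = ISNULL(@param2, table.two) -- "two".."four" are assumed column names
AND table.three = ISNULL(@param3, table.three)
AND table.four = ISNULL(@param4, table.four)
-- Caveat: rows where the column itself is NULL never match with this pattern,
-- even when the parameter is left blank.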

Related

Using Common Table Expression and IF EXISTS

I am running a query similar to
DECLARE @VARIABLE NVARCHAR(50) = 'VALUE';
WITH MYCTE_TABLE (Column1, Column2)
AS
(
SELECT ColumnA, ColumnB
FROM SomeTable
WHERE ColumnA = SomeValue
)
IF EXISTS(SELECT ColumnZ FROM AnotherTable WHERE Columnz = SomeNumbers)
BEGIN
SELECT * FROM MYCTE_TABLE
END
ELSE
BEGIN
MYSUBQUERY2
END
...
However, I keep getting the following error:
Incorrect syntax near the keyword 'IF'.
Each subquery works well when run independently. It seems the use of a common table expression before the IF EXISTS is causing the issue.
Any help please?
I really doubt that this is the best approach... You tried to clean and shorten this for brevity (thumbs up for that!), but the given information may not be enough.
You cannot use a CTE across different queries. A CTE is fully inlined as part of a single query...
But you could write your values into a table variable like this:
DECLARE #tbl TABLE(Column1 INT, Column2 VARCHAR(100)); --Choose appropriate types
INSERT INTO #tbl
SELECT ColumnA, ColumnB FROM SomeTable WHERE ColumnA=SomeValue;
This table variable can be used in later queries (but only within the same batch!) like any other table:
SELECT *
FROM SomeTable AS st
INNER JOIN #tbl AS tbl ON ...
... or similar usages...
Another approach might be this
SELECT Column1,Column2 INTO #SomeTempTable FROM SomeWhere
This will write the result of the SELECT into a temp table (which is session wide).
I'm quite sure there might be a better (set-based) approach... Are the two sub-queries identical in their result set's structure? If so, you might use UNION ALL and place your "IF EXISTS" as a WHERE clause on each sub-query, as sketched below.
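For example, a rough sketch of that set-based idea, assuming both branches return two columns of compatible types (the second SELECT's table and columns are placeholders for MYSUBQUERY2):
SELECT ColumnA, ColumnB
FROM SomeTable
WHERE ColumnA = SomeValue
AND EXISTS (SELECT 1 FROM AnotherTable WHERE ColumnZ = SomeNumbers)
UNION ALL
SELECT ColumnC, ColumnD -- placeholder columns for MYSUBQUERY2
FROM SomeOtherTable -- placeholder table for MYSUBQUERY2
WHERE NOT EXISTS (SELECT 1 FROM AnotherTable WHERE ColumnZ = SomeNumbers);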
IF is a control-flow statement; WITH belongs to a single query. You can do:
IF EXISTS (SELECT ColumnZ FROM AnotherTable WHERE ColumnZ = SomeNumbers)
BEGIN
WITH MYCTE_TABLE (Column1, Column2) AS
(SELECT ColumnA, ColumnB FROM SomeTable WHERE ColumnA = SomeValue)
MYSUBQUERY1
END
ELSE
BEGIN
WITH MYCTE_TABLE (Column1, Column2) AS
(SELECT ColumnA, ColumnB FROM SomeTable WHERE ColumnA = SomeValue)
MYSUBQUERY2
END;
Or you could use a temporary table or table variable to store the values.
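For instance, a sketch using a temp table so the rows are computed once and both branches can reuse them (MYSUBQUERY2 again stands in for the question's second query):
SELECT ColumnA AS Column1, ColumnB AS Column2
INTO #MyCteRows
FROM SomeTable
WHERE ColumnA = SomeValue;

IF EXISTS (SELECT 1 FROM AnotherTable WHERE ColumnZ = SomeNumbers)
BEGIN
SELECT Column1, Column2 FROM #MyCteRows;
END
ELSE
BEGIN
-- MYSUBQUERY2 goes here; it can also read from #MyCteRows if it needs those rows
SELECT Column1, Column2 FROM #MyCteRows;
END

DROP TABLE #MyCteRows;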

CASE WHEN in WHERE with LIKE condition instead of 1

I have a query with a bunch of ORs inside an AND in the WHERE clause, and I'm trying to replace them with CASE WHEN to see if it improves performance.
The select query inside the stored procedure is something like:
DECLARE @word VARCHAR(50) = '%word%' -- these are inputs (types assumed)
DECLARE @type VARCHAR(10) = 'type'
SELECT * FROM table1
WHERE SomeCondition1
AND ( (@type = 'CS' AND col1 LIKE @word)
OR
(@type = 'ED' AND col2 LIKE @word)
....
)
I'm trying to write this query as:
SELECT * FROM table1
WHERE SomeCondition1
AND ( 1 = CASE WHEN @type = 'CS'
THEN col1 LIKE @word
WHEN @type = 'ED'
THEN col2 LIKE @word
END )
But SQL Server 2012 gives the error 'Incorrect syntax near LIKE' for THEN col1 LIKE @word. If I replace THEN col1 LIKE @word with 1 then there are no complaints, but LIKE should return a 0 or 1 anyway.
I tried SELECT (col1 LIKE @word), extra parentheses, etc., with no success.
Is there a way to include LIKE in CASE WHEN in WHERE or should I just not bother if using CASE WHEN instead of the original IF's won't make any performance difference?
UPDATE:
This actually didn't make any difference performance wise.
There is a lot of info online about these 'optional parameter' type stored procedures and how to avoid parameter sniffing performance issues.
This syntax should get you closer though:
AND CASE
WHEN @type = 'CS' THEN col1
WHEN @type = 'ED' THEN col2
END LIKE @word
Just make sure the col1 and col2 datatypes are similar (don't mix INT and VARCHAR)
You should compare query plans between the two syntaxes to ascertain whether it even makes a difference. Your performance issue might be due more to parameter sniffing.
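As a side note, a commonly used way to sidestep parameter sniffing for this kind of catch-all predicate is OPTION (RECOMPILE), which compiles the plan for the actual parameter values at the cost of a recompile on every execution. A sketch using the question's placeholders:
SELECT * FROM table1
WHERE SomeCondition1
AND ( (@type = 'CS' AND col1 LIKE @word)
OR (@type = 'ED' AND col2 LIKE @word) )
OPTION (RECOMPILE);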
You can also try nested case statements. e.g. based on your latest post, something like:
1 = CASE WHEN @type = 'CandidateStatus'
THEN (CASE WHEN co.[Description] LIKE @text THEN 1 END)
...
END
Here's how I got it to work; now I just need to test whether it makes any difference to performance. @Nick.McDermaid's point about parameter sniffing is worth looking at.
1 = CASE WHEN @type = 'CandidateStatus'
THEN (SELECT 1 WHERE co.[Description] LIKE @text)
END

Best practice to avoid code replication?

Currently I have to change a view; a big part of the view code looks like this:
CAST('My_Tag_' + CASE
WHEN FieldX IS NULL
THEN ''
WHEN FieldX = '123'
THEN 'some_number'
WHEN FieldX = 'abc'
THEN 'some_text'
ELSE 'strange'
END AS VARCHAR(128)) AS myField
Just a chunk of code that puts together a string (the code itself doesn't even matter right now; I have like 50 other examples with a lot of code replication). Now I have exactly the same code for 30 more fields in the view; only the 'My_Tag_' prefix and FieldX change. If this were C#, I would just write a little helper function.
Of course I could write a function here, too. But as this is a bigger project with a lot of tables, views, etc., I would soon have hundreds of functions.
Now I am pretty new to SQL; normally my home is the OOP world. But there has to be a way to avoid code replication without ending up with hundreds of helper functions in the database?
What's best practice in this case?
The best practice may be to create a user defined function.
The arguments would be the fields that change and it would return the intended value.
You can use a CTE to add a field to a table:
; with TableWithExtraField as
(
select case ... end as NewField
from table1
)
select NewField
from TableWithExtraField
Or a subquery also works:
select NewField
from (
select case ... end as NewField
from table1
) as TableWithExtraField
CREATE FUNCTION dbo.MyTag(@myfield VARCHAR(MAX))
RETURNS VARCHAR(128)
AS
BEGIN
RETURN CAST('My_Tag_' + CASE
WHEN @myfield IS NULL
THEN ''
WHEN @myfield = '123'
THEN 'some_number'
WHEN @myfield = 'abc'
THEN 'some_text'
ELSE 'strange' END AS VARCHAR(128))
END
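A short sketch of how the view might then call it (FieldX/FieldY and dbo.SomeBaseTable are placeholder names):
SELECT dbo.MyTag(FieldX) AS myField, -- each repeated CASE block becomes one call
dbo.MyTag(FieldY) AS myOtherField
FROM dbo.SomeBaseTable;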

Referring to column values directly without using variables in T-SQL

Is there a way in T-SQL (SQL Server 2005) to assign a whole record to a record variable and then refer to the particular values using column names?
I mean, instead of:
select @var1 = col1,
@var2 = col2
from mytable
where ID = 1;
and referring to them as @var1 and @var2, something like
@record =
select col1, col2
from mytable
where ID = 1;
and referring to them like @record.col1 and @record.col2.
I am beginner in t-sql, so hopefully the question is not too trivial.
You can create a table variable and select the whole resultset into it:
DECLARE #tt TABLE (col1 INT, col2 INT)
INSERT
INTO #tt
SELECT col1, col2
FROM mytable
WHERE id = 1
, but you cannot access its data other than through another SELECT query.
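For completeness, a small sketch of how you would still pull individual values out of that table variable, which ends up back at named scalar variables anyway:
DECLARE @var1 INT, @var2 INT;
SELECT @var1 = col1,
@var2 = col2
FROM @tt;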
With pure T-SQL (that is, without custom data types) what you ask is impossible.
Sounds like you are a programmer... maybe look at LINQ, as it does what you want.
You can use a temporary table and SELECT...INTO to avoid specifying the column names at the beginning:
SELECT Field1, Field2
INTO #TempTable
FROM MyTable
WHERE MyTable.MyID = 1
but of course you'll still need the FROM #TempTable part when referring to the column names.
SELECT Field1, Field2
FROM #TempTable
and of course remember to drop the table at the end:
DROP TABLE #TempTable
The app code is where you'd normally refer to a single row at a time as a variable.
You could use XML, but you'd have to play with this...
DECLARE @MyRecord xml
DECLARE @Mytable TABLE (col1 int NOT NULL, col2 varchar(30) NOT NULL)
INSERT @Mytable (col1, col2) VALUES (1, 'bob')
select @MyRecord =
(SELECT *
from @Mytable
where col1 = 1
FOR XML AUTO)
SELECT @MyRecord.value('./@col', 'int') --also @MyRecord.value('@col', 'int')
--gives error
Msg 2390, Level 16, State 1, Line 12
XQuery [value()]: Top-level attribute nodes are not supported
Buried in the Transact SQL documentation I came across this restriction on variables:
Variables can be used only in expressions, not in place of object names or keywords.
Since you'd need to use an object name to qualify a column I don't believe that this is allowed.

Oracle: Is there a simple way to say "if null keep the current value" in merge/update statements?

I have a rather weak understanding of Oracle's more advanced functionality, but I think this should be possible.
Say I have a table with the following schema:
MyTable
Id INTEGER,
Col1 VARCHAR2(100),
Col2 VARCHAR2(100)
I would like to write a stored procedure with the following signature:
PROCEDURE InsertOrUpdateMyTable(p_id in integer, p_col1 in varchar2, p_col2 in varchar2)
which, in the case of an update, will not overwrite Col1 or Col2 if the corresponding value in p_col1 or p_col2 is null.
So If I have a record:
id=123, Col1='ABC', Col2='DEF'
exec InsertOrUpdateMyTable(123, 'XYZ', '098'); --results in id=123, Col1='XYZ', Col2='098'
exec InsertOrUpdateMyTable(123, NULL, '098'); --results in id=123, Col1='ABC', Col2='098'
exec InsertOrUpdateMyTable(123, NULL, NULL); --results in id=123, Col1='ABC', Col2='DEF'
Is there any simple way of doing this without having multiple SQL statements?
I am thinking there might be a way to do this with the Merge statement though I am only mildly familiar with it.
EDIT:
Cade Roux below suggests using COALESCE, which works great! Here are some examples of using the COALESCE keyword.
And here is the solution for my problem:
MERGE INTO MyTable mt
USING (SELECT 1 FROM DUAL) a
ON (mt.ID = p_id)
WHEN MATCHED THEN
UPDATE
SET mt.Col1 = coalesce(p_col1, mt.Col1), mt.Col2 = coalesce(p_col2, mt.Col2)
WHEN NOT MATCHED THEN
INSERT (ID, Col1, Col2)
VALUES (p_id, p_col1, p_col2);
Change the call or the update statement to use
nvl(newValue, oldValue)
for the new field value.
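A minimal sketch of that suggestion as the UPDATE inside the procedure (p_id, p_col1 and p_col2 are the parameters from the question):
UPDATE MyTable
SET Col1 = NVL(p_col1, Col1), -- keep the existing value when the parameter is null
Col2 = NVL(p_col2, Col2)
WHERE Id = p_id;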
Using MERGE and COALESCE? Try this link for an example
with
SET a.Col1 = COALESCE(incoming.Col1, a.Col1)
,a.Col2 = COALESCE(incoming.Col2, a.Col2)