If the values in MyTable.MyColumn and @MyVar are NULL, what does the following statement return?
ISNULL(@MyVar, MyTable.MyColumn)
create table MyTable
(
MyColumn int
)
insert into MyTable default values
declare @MyVar int
select ISNULL(@MyVar, MyTable.MyColumn)
from MyTable
Result:
-----------
NULL
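For contrast, a quick sketch (reusing the table above) of how ISNULL picks its result: it returns the first argument unless that argument is NULL.
DECLARE @MyVar int = 7;
SELECT ISNULL(@MyVar, MyTable.MyColumn) FROM MyTable;   -- 7: the first argument is not NULL
SELECT ISNULL(NULL, 42);                                -- 42: falls back to the second argument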
Others have explained what happens; here are some solutions in case you don't want those NULLs.
If the expression is in the SELECT list and you don't want NULL to appear in the result, you can use the COALESCE() function, which can take more than two parameters:
COALESCE(@MyVar, MyTable.MyColumn, another_value)
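A minimal sketch against the table from the question (0 here is just an arbitrary fallback value):
DECLARE @MyVar int;   -- stays NULL
SELECT COALESCE(@MyVar, MyTable.MyColumn, 0) AS Result   -- 0, because the first two arguments are NULL
FROM MyTable;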
If you have it in a condition, like you mention in the comments:
WHERE MyTable.MyColumn = ISNULL(@MyVar, MyTable.MyColumn)
you can replace it with the following (assuming you want the condition to be TRUE when @MyVar is NULL, regardless of whether MyTable.MyColumn is NULL or not):
WHERE (MyTable.MyColumn = @MyVar OR @MyVar IS NULL)
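A quick sketch of how that rewritten condition behaves, again reusing MyTable from the question:
DECLARE @MyVar int;   -- NULL: the condition is TRUE for every row
SELECT * FROM MyTable
WHERE (MyTable.MyColumn = @MyVar OR @MyVar IS NULL);

SET @MyVar = 1;       -- non-NULL: only rows with MyColumn = 1 qualify
SELECT * FROM MyTable
WHERE (MyTable.MyColumn = @MyVar OR @MyVar IS NULL);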
It will return NULL, of course. :) Quite easy to test, too. What's the overall need for this? If you need, for example, to ensure that at least some value is returned, you have a few options. For example, CASE:
SELECT CASE WHEN @MyVar IS NULL
            THEN ISNULL(MyTable.MyColumn, 'replacementvalue')
            ELSE @MyVar
       END AS RESULT
FROM MyTable
We need to know more about the real issue here.
Try this :-)
select ISNULL(null,null)
In T-SQL, I declare a local variable for use with some query like so:
DECLARE @var_last datetime;
SET @var_last = (SELECT TOP(1) col_date FROM tbl_dates ORDER BY col_date);
In an application I'm testing, it would be an error for this query to return NULL, and it's desirable for the query to raise an error if it does.
I'd like to declare @var_last as NOT NULL, but the syntax...
DECLARE @var_last datetime NOT NULL;
...is invalid. I can write a simple check on the return of the query to see if it's NULL, and error if it is, but my question is, is it not possible to declare a local variable as NOT NULL?
That's right. According to the documentation for DECLARE @local_variable, available at http://technet.microsoft.com/en-us/library/ms188927.aspx, it doesn't accept a NULL | NOT NULL option -- those are only valid for column definitions.
If you want to stop execution when the query returns NULL, test for NULL and, if it is NULL, call RAISERROR; see http://technet.microsoft.com/en-us/library/ms178592.aspx.
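A minimal sketch of that check, reusing the query from the question (the error message text is made up for illustration):
DECLARE @var_last datetime;
SELECT TOP(1) @var_last = col_date FROM tbl_dates ORDER BY col_date;

IF @var_last IS NULL
    RAISERROR('tbl_dates returned no rows or a NULL col_date', 16, 1);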
You can do something like this ...
Using ISNULL()
SELECT TOP(1) @var_last = ISNULL(col_date,'19000101') --<-- Some default value
FROM tbl_dates
ORDER BY col_date;
Using COALESCE()
SELECT TOP(1) @var_last = COALESCE(col_date,'19000101') --<-- Some default value
FROM tbl_dates
ORDER BY col_date;
It is not possible to declare a variable as NOT NULL, because NULL and NOT NULL are states, not values (even though you can set a variable to NULL, which looks like assigning a value, you are actually erasing the existing value).
NULL, by definition, means "has no value".
I inherited some old stored procedures today, and came across several examples that followed this general pattern, where @Test is some INT value:
IF @Test IS NOT NULL AND @Test > 0
-- do something...
My understanding is that if @Test is NULL, then it has no value, and is not greater than, less than or even equal to zero. Therefore, testing for NULL is redundant in the above code:
IF @Test > 0
-- do something...
This second version seems to work just fine, and is far more readable IMHO.
So, my question: Is my understanding of NULL being unnecessary in this instance correct, or is there some obvious use-case I'm overlooking here where it could all go horribly wrong?
Note: In some cases, it was obvious that the intent was checking for the existence of a value, and I've changed those to IF EXISTS... my question is more concerned with the general case outlined above.
In SQL, any comparison with a NULL value evaluates to UNKNOWN, which a condition treats the same as false.
So you only have to check for NULL explicitly (with IS NULL or IS NOT NULL) if you want to act on it.
So, in this case, the additional test is not necessary.
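A quick way to convince yourself, assuming @Test is the INT variable from the question:
DECLARE @Test int = NULL;

IF @Test IS NOT NULL AND @Test > 0 PRINT 'long form fired';
IF @Test > 0                       PRINT 'short form fired';
-- neither PRINT runs: @Test > 0 evaluates to UNKNOWN, which IF treats the same as false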
@FlorianHeer is right on. NULL > 0 will effectively be treated as false, but as @Pred points out, that is because NULL > 0 actually evaluates to NULL (UNKNOWN), and a condition that is UNKNOWN is treated as false.
A NULL is an unknown, and therefore any comparison with it is also unknown. Think of arithmetic operations such as addition, 1 + NULL = NULL, or concatenation, 'A' + NULL = NULL. NULL means the database engine cannot know what the value is, so any function or comparison involving it is also unknown.
@MikkaRin pointed out that it is the assumption in the ELSE portion of a CASE or IF statement where this can become problematic, but let's also think about it in the context of a join, and how you may or may not want to see the results.
DECLARE @Table1 AS TABLE (Col INT)
DECLARE @Table2 AS TABLE (Col INT)
INSERT INTO @Table1 VALUES (1),(2),(3)
INSERT INTO @Table2 VALUES (1),(NULL),(3),(4)
SELECT *
FROM
@Table1 t1
INNER JOIN @Table2 t2
ON t1.Col <> t2.Col
Naturally you might think that because NULL is not equal to 1, 2, or 3, it should be included in the result set. But NULL is unknown, so SQL is effectively saying "I don't know whether NULL could be 1, 2, or 3, so I cannot return it as a match."
Now let's do the same thing, but add a NULL to the first table:
DECLARE @Table1 AS TABLE (Col INT)
DECLARE @Table2 AS TABLE (Col INT)
INSERT INTO @Table1 VALUES (1),(2),(3),(NULL)
INSERT INTO @Table2 VALUES (1),(NULL),(3),(4)
SELECT *
FROM
@Table1 t1
INNER JOIN @Table2 t2
ON t1.Col = t2.Col
Again, you might think that NULL equals NULL, but any comparison with NULL is unknown, so even though both tables contain a NULL, that row will not be returned in the result set.
Now consider:
DECLARE @Table1 AS TABLE (Col INT)
INSERT INTO @Table1 VALUES (1),(2),(3),(NULL)
SELECT *, CASE WHEN Col < 2 THEN Col ELSE 1000 END as ColCase
FROM
@Table1 t1
This turns even the NULL into 1000. The question is: should NULL, an unknown, become 1000? If NULL is unknown, how do we know it isn't less than 2?
For a lot of your operations it may simply be enough to compare @Value > 1, but especially when you start dealing with the ELSE branch of CASE or IF statements, or joining on a negated condition, you should consider handling the NULLs explicitly, for example with ISNULL() or COALESCE() as @GuidoG points out.
IMHO, being explicit about your intentions and accounting for NULL values appropriately outweighs the minimal savings in typing.
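For instance, one hedged variation of the last CASE above that decides explicitly what an unknown Col should mean (here, arbitrarily, that a NULL counts as 0 and therefore falls into the < 2 branch):
DECLARE @Table1 AS TABLE (Col INT)
INSERT INTO @Table1 VALUES (1),(2),(3),(NULL)

SELECT *,
       CASE WHEN ISNULL(Col, 0) < 2 THEN ISNULL(Col, 0) ELSE 1000 END AS ColCase
FROM @Table1 t1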
Comparing with NULL becomes necessary when you use ELSE branches.
For example:
declare @t int
set @t = null
if (@t > 0) print '1' -- works fine
if (@t < 0) print '2' -- works fine
if (@t > 0)
    print '3' -- works fine
else print '4' -- here we start getting problems: this branch implies @t <= 0, which is obviously not true
You could replace it with
if isnull(@t, 0) > 0
This way it is shorter and you have still covered every case.
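Put together, a small sketch of the difference; only the second ELSE branch states something that is actually true for a NULL @t:
declare @t int
set @t = null

if (@t > 0)
    print 'positive'
else
    print 'not positive?'            -- fires, even though @t <= 0 is not actually known

if isnull(@t, 0) > 0
    print 'positive'
else
    print 'zero, negative or NULL'   -- fires, and now the ELSE branch means what it says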
Another interesting example: neither a comparison with NULL nor its negation yields TRUE, because both evaluate to UNKNOWN:
SELECT CASE WHEN NULL > 0 THEN 1 END AS a,
       CASE WHEN NOT (NULL > 0) THEN 1 END AS b
The value of both a and b will be NULL.
From my understanding, in some cases NULL checks are added to short-circuit OR logic. For example, consider the following:
select * from tbl where (@id is null or @id > id)
If you pass in a value for @id, the first condition (@id is null) is false, but since it's part of an OR, the @id > id comparison still has to be evaluated to see what it returns. An OR only needs one of its conditions to be true for the whole thing to resolve to true, so it keeps testing until it finds one that is.
Whereas if you pass in NULL for the @id parameter, the first condition is true as soon as it is evaluated. Since it's part of an OR, SQL knows it doesn't have to perform the following comparison at all, because the entire OR has already resolved to true, so the @id > id comparison is never run. This can save a ton of processing if it's a huge table or a complex join, etc.
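A sketch of that pattern against a hypothetical table variable, just to show both call paths:
DECLARE @tbl TABLE (id INT);
INSERT INTO @tbl VALUES (1), (5), (10);

DECLARE @id int = 5;
SELECT * FROM @tbl WHERE (@id IS NULL OR @id > id);   -- only rows with id below 5

SET @id = NULL;
SELECT * FROM @tbl WHERE (@id IS NULL OR @id > id);   -- every row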
I need to check whether two values, X and Y, are different. If both are NULL, they must be considered equal.
The only way I found is:
select 1 as valueExists
where (@X is null and @Y is not null)
or (@Y is null and @X is not null)
or (@X <> @Y)
Is there a smart way to write this expression?
Thanks!
I think you could use COALESCE for that
WHERE coalesce(@X, '') <> coalesce(@Y, '')
What it does is return an empty string when a variable is NULL, so if both variables are NULL the two empty strings compare as equal.
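A quick sanity check, assuming @X and @Y are string-compatible. Note that the empty-string sentinel is an assumption: if '' is itself a legitimate value, a NULL and an empty string would wrongly compare as equal.
DECLARE @X varchar(10) = NULL, @Y varchar(10) = NULL;

SELECT 1 AS valuesDifferent
WHERE COALESCE(@X, '') <> COALESCE(@Y, '');   -- no row: both NULLs count as equal

SET @Y = 'abc';
SELECT 1 AS valuesDifferent
WHERE COALESCE(@X, '') <> COALESCE(@Y, '');   -- returns a row: the values differ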
I typically use a technique I picked up from here
SELECT 1 AS valuesDifferent
WHERE EXISTS (SELECT @X
              EXCEPT
              SELECT @Y)
WHERE EXISTS returns true if the subquery it contains returns a row, which happens here only when the two values are distinct. NULL is treated like any other value for the purposes of this distinctness test, so two NULLs count as equal and a NULL counts as different from any non-NULL value.
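A few test cases, using INT variables purely for illustration:
DECLARE @X int = NULL, @Y int = NULL;

SELECT 1 AS valuesDifferent
WHERE EXISTS (SELECT @X EXCEPT SELECT @Y);    -- no row: two NULLs are not distinct

SET @Y = 5;
SELECT 1 AS valuesDifferent
WHERE EXISTS (SELECT @X EXCEPT SELECT @Y);    -- returns a row: NULL is distinct from 5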
You could try using NULLIF like this:
WHERE NULLIF(@X, @Y) IS NOT NULL OR NULLIF(@Y, @X) IS NOT NULL
You can use ISNULL
WHERE ISNULL(@X,'') <> ISNULL(@Y,'')
What is the best way to include an input param in the WHERE clause but exclude it if it is null?
There are a number of ways, I believe, but I can't seem to remember them.
Also, could I use COALESCE()? But I think that is only for SELECTing values?
Edit
To clarify, let's say a variable called @code = '1'; then my WHERE would be WHERE type='B' AND code = @code, but if @code is null then I only want WHERE type='B' - notice the missing code = @code.
You can use IsNull
where some_column = IsNull(@yourvariable, 'valueifnull')
EDIT:
What you described in the comment can be done like this:
where (@code is null or code = @code)
Here's another approach
SELECT * FROM Thingies WHERE ( @thingId IS NULL OR ThingID = @thingId )
How about
WHERE (Column1 = @Var1 OR @Var1 IS NULL)
AND (Column2 = @Var2 OR @Var2 IS NULL)
I’d like to suggest a solution which I found on another site:
SELECT * FROM Thingies
WHERE ThingID = ISNULL(@ThingId, ThingID)
With this solution, if the user passes NULL for the parameter, the query returns all the rows.
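One caveat worth knowing: because the comparison is still ThingID = ..., rows whose ThingID column is itself NULL are never returned, even when the parameter is NULL. A small sketch with a hypothetical table variable:
DECLARE @Thingies TABLE (ThingID INT);
INSERT INTO @Thingies VALUES (1), (2), (NULL);

DECLARE @ThingId int = NULL;
SELECT * FROM @Thingies
WHERE ThingID = ISNULL(@ThingId, ThingID);    -- returns 1 and 2, but never the NULL row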
This question really helped me with a similar issue that had a few of us scratching our heads for a bit. I only write it up in case somebody else tries the same approach and cannot figure out why it does not work.
I was trying to evaluate part of a multipart WHERE clause only if @Parameter was not null. I tried to do this as below, but always had no rows returned when @Parameter was null.
DECLARE @Parameter int = null;
SELECT * FROM TABLE
WHERE [AlternateID] is not null
AND (@Parameter is not null AND [AlternateID] = @Parameter)
I incorrectly thought that (@Parameter is not null AND [AlternateID] = @Parameter) would simply drop out of the full WHERE clause if @Parameter was null. However, it was making the entire WHERE clause evaluate to false. The workaround I used at the time was to add an OR 1=1 as below:
WHERE [AlternateID] is not null
AND (@Parameter is not null AND [AlternateID] = @Parameter OR 1=1)
Note, though, that because AND binds tighter than OR, the bracketed condition is now always true, so the parameter no longer filters anything even when it has a value. The approach outlined by Ali (not enough reputation to upvote) solves this properly.
WHERE [AlternateID] is not null
AND [Partner_Customer_ID] = ISNULL(@Parameter, [Partner_Customer_ID])
You can use ISNULL(), or check for NULLs explicitly as others have mentioned. This should be OK as long as you have no more than one or two optional input parameters. But if there are more parameters, this approach becomes very inefficient, as the indexes you create on those columns won't be used as you would expect. In such a case I would recommend using dynamic SQL. Here is an excellent article that explains why: http://sqlinthewild.co.za/index.php/2009/03/19/catch-all-queries/
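For reference, a minimal sketch of that dynamic approach with sp_executesql, appending the filter only when the parameter has a value (the table and column names are placeholders based on the earlier type/code example):
DECLARE @code varchar(10) = NULL;   -- optional filter
DECLARE @sql nvarchar(max) = N'SELECT * FROM dbo.SomeTable WHERE type = ''B''';

IF @code IS NOT NULL
    SET @sql = @sql + N' AND code = @code';

EXEC sp_executesql @sql, N'@code varchar(10)', @code = @code;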
I think this will help. Given parameters @id and @name:
SELECT [Id], [Name] FROM [Person]
WHERE Id = @id AND Name = ISNULL(@name, Name)
This lets you ignore the Name condition when @name is NULL.