CAST/CONVERT empty string to INT in SQL Server

I came across a bug where I was using CAST(Col1 AS INT) + CAST(Col2 AS INT), where both Col1 and Col2 are VARCHAR, and I was getting valid results even when Col1 or Col2 was blank, which I didn't expect. I checked, and both CAST and CONVERT have this default behavior of replacing a blank string with 0:
SELECT CAST('' AS INT)
SELECT CONVERT(INT, '')
I checked the documentation and I can't see any reference explaining why this is the behavior (or a server setting to change it). I can of course work around this, but I wanted to ask why this is the behavior, as I don't think it is intuitive.
I'd actually rather this CAST failed or gave NULL; is there a server setting somewhere which affects this?

Consider an INT in SQL Server. It can be one of three values:
NULL
0
Not 0
So if you're casting/converting an empty string, which you are assuming is a number, then 0 is the most logical value. It allows for a distinction between NULL and 0.
SELECT CAST(NULL AS INT) -- NULL
SELECT CAST('' AS INT) -- 0
SELECT CAST('42' AS INT) -- 42
I'd say that's logical.
If you did:
SELECT CAST('abc' AS INT)
You'd get:
Conversion failed when converting the varchar value 'abc' to data type int.
If you do wish to handle empty strings as NULL, use NULLIF as Bogdan suggests in his answer:
DECLARE @val VARCHAR(2) = ''
SELECT CAST(NULLIF(@val, '') AS INT) -- produces NULL
NULLIF returns the first expression if the two expressions are not equal. If the expressions are equal, NULLIF returns a null value of the type of the first expression.
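On SQL Server 2012 and later, TRY_CAST/TRY_CONVERT are also worth knowing here, though note they do not change the empty-string behavior, because '' converts successfully. A quick sketch:

```sql
SELECT TRY_CAST('abc' AS INT);          -- NULL: conversion fails, but no error
SELECT TRY_CAST('' AS INT);             -- 0: '' still converts successfully
SELECT TRY_CAST(NULLIF('', '') AS INT); -- NULL: NULLIF is still needed for ''
```

So TRY_CAST only helps with genuinely non-numeric strings; for empty strings you still need NULLIF.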
Finally, if your columns are storing INT values, then consider changing their data type to INT if you can.

As you probably know, NULL is a marker that indicates that a data value does not exist, while '' is a value: empty, but still a value.
So SQL Server casts (or converts) an empty string to 0 by default. To overcome this and get NULL instead, you can use NULLIF.
Simple example:
SELECT int_as_varchars AS actual,
       CAST(NULLIF(int_as_varchars, '') AS INT) AS with_nullif,
       CAST(int_as_varchars AS INT) AS just_cast
FROM (VALUES
    ('1'),
    (''),
    (NULL),
    ('0')
) AS t (int_as_varchars)
Output:
actual  with_nullif  just_cast
1       1            1
        NULL         0
NULL    NULL         NULL
0       0            0
As you can see, NULLIF in that case will help you get NULL instead of 0.

What about this?
declare @t table(bucket bigint);
INSERT INTO @t VALUES (1);
INSERT INTO @t VALUES (2);
INSERT INTO @t VALUES (-1);
INSERT INTO @t VALUES (5);
INSERT INTO @t VALUES (0);
declare @Bucket bigint = 0 --filter by 0
select * from @t
where 1=1
AND ((@Bucket is null or cast(@Bucket as nvarchar) = '') or bucket = @Bucket)

Related

Conversion failed when converting the varchar value '$400,000.00' to data type int

I have the following part of a query that works fine:
CONVERT(varchar(15), CONVERT(money, AmountOfInsurance), 1) AS AmountOfInsurance
I want to prevent amounts that are equal to 0 from showing up formatted; they should just show up as 0. So I added this CASE statement, but I get the following error:
CASE WHEN AmountOfInsurance > 0 THEN '$' + CONVERT(varchar(15), CONVERT(money, AmountOfInsurance), 1) ELSE 0 END AS AmountOfInsurance
Any idea?
Your ELSE should be '' because you want to return a varchar. As it stands, the CASE expression mixes two data types, and INT takes precedence over varchar, which is why it tries to convert the varchar branch back to INT.
Added for reference:
Data Type Precedence
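Putting that together, a corrected version of the expression from the question; since the asker wants a literal 0 shown, the ELSE can return '0' (any varchar literal keeps the result type consistent):

```sql
CASE WHEN AmountOfInsurance > 0
     THEN '$' + CONVERT(varchar(15), CONVERT(money, AmountOfInsurance), 1)
     ELSE '0'  -- varchar, so the CASE has a single result type
END AS AmountOfInsurance
```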
Just another option (if 2012+) is Format() with a conditional format.
Declare #YourTable table (AmountOfInsurance money)
Insert Into #YourTable values
(400000),
(2500),
(0)
Select format(AmountOfInsurance,IIF(AmountOfInsurance>0,'$#,##0.00','0'))
From #YourTable
Returns
$400,000.00
$2,500.00
0
Use the CAST function to cast:
DECLARE @text AS NVARCHAR(10)
SET @text = '100$'
SELECT CASE WHEN AmountOfInsurance > 0 THEN CAST(@text AS MONEY) ELSE 0 END

Using ISNUMERIC fails in condition

I have a table like this (simplified):
CREATE TABLE #table (Id INT, Field NVARCHAR(MAX))
INSERT INTO #table VALUES (1, 'SomeText')
INSERT INTO #table VALUES (2, '1234')
For some reason I need to query this table and get the sum of Field if it is numeric, and return '' if it is not. I tried it like this:
SELECT CASE WHEN ISNUMERIC(Field) = 1 THEN SUM(CONVERT(MONEY, Field)) ELSE '' END
FROM #table
GROUP BY Field
But this query leads to the following exception:
Cannot convert a char value to money. The char value has incorrect syntax.
I even changed the ELSE case from '' to 0 but I still get the same message.
Why do I get the exception? As far as I know, SUM(...) should not be executed when ISNUMERIC(Field) returns 0.
Aggregates are computed before the outer CASE is evaluated, so the CONVERT inside SUM still runs against the non-numeric rows; moving the CASE inside the SUM avoids that:
Select sum(case when ISNUMERIC(Field)=1 then cast(Field as money) else 0 end)
from #table
Group By Field
Returns
(No column name)
1234.00
0.00
Working with mixed datatypes can be a real pain. Where possible, consider table designs that avoid this. To further complicate matters, IsNumeric does not always return what you might expect.
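A few of the classic surprises: ISNUMERIC returns 1 for several strings that still fail a cast to INT.

```sql
SELECT ISNUMERIC('$');      -- 1 (a currency symbol alone counts as numeric)
SELECT ISNUMERIC('1e4');    -- 1 (valid float notation, but CAST('1e4' AS INT) fails)
SELECT ISNUMERIC('1,000');  -- 1 (valid money, but CAST('1,000' AS INT) fails)
SELECT ISNUMERIC('.');      -- 1
```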
Filtering out the non-numerics before aggregating is one way to go:
SELECT
SUM(CONVERT(MONEY, Field))
FROM
#table
WHERE
ISNUMERIC(Field) = 1
GROUP BY
Field
;

Is it necessary to test for NULL if also testing for greater than?

I inherited some old stored procedures today, and came across several examples that followed this general pattern, where @Test is some INT value:
IF @Test IS NOT NULL AND @Test > 0
-- do something...
My understanding is that if @Test is NULL, then it has no value, and is not greater than, less than or even equal to zero. Therefore, testing for NULL is redundant in the above code:
IF @Test > 0
-- do something...
This second version seems to work just fine, and is far more readable IMHO.
So, my question: Is my understanding of NULL being unnecessary in this instance correct, or is there some obvious use-case I'm overlooking here where it could all go horribly wrong?
Note: In some cases, it was obvious that the intent was checking for the existence of a value, and I've changed those to IF EXISTS... my question is more concerned with the general case outlined above.
In SQL, any comparison to a NULL value evaluates to UNKNOWN, which an IF or WHERE treats the same as false.
You only have to check explicitly for NULL if you wish to act on it.
So, in this case, the additional test is not necessary.
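A minimal sketch of why the extra check is redundant in an IF:

```sql
DECLARE @Test INT = NULL;

IF @Test > 0                       -- NULL > 0 is UNKNOWN, treated as not true
    PRINT 'greater than zero'      -- never reached when @Test is NULL
ELSE
    PRINT 'NULL or not positive';  -- this branch runs
```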
@FlorianHeer is right on. NULL > 0 will end up treated as false, but as @Pred points out, that is because NULL > 0 actually evaluates to UNKNOWN, and a condition that is UNKNOWN is treated as false....
A NULL is an unknown, and therefore any comparison with it is also unknown. Think of arithmetic such as addition, 1 + NULL = NULL, or concatenation, 'A' + NULL = NULL. NULL means the SQL database engine cannot interpret what its value is, so any function or comparison involving it is also unknown.
@MikkaRin pointed out that it is the assumption in the ELSE portion of a CASE expression or IF statement where that can become problematic, but let's also think about this in the context of a join and how you may or may not want to see the results.
DECLARE @Table1 AS TABLE (Col INT)
DECLARE @Table2 AS TABLE (Col INT)
INSERT INTO @Table1 VALUES (1),(2),(3)
INSERT INTO @Table2 VALUES (1),(NULL),(3),(4)
SELECT *
FROM
@Table1 t1
INNER JOIN @Table2 t2
ON t1.Col <> t2.Col
Naturally you might think that because NULL is not equal to 1, 2 or 3, it should be included in the result set. But NULL is unknown, so SQL is saying: I don't know whether NULL could be 1, 2 or 3, so I cannot return it as a result.
Now let's do the same thing, but add a NULL to the first table:
DECLARE @Table1 AS TABLE (Col INT)
DECLARE @Table2 AS TABLE (Col INT)
INSERT INTO @Table1 VALUES (1),(2),(3),(NULL)
INSERT INTO @Table2 VALUES (1),(NULL),(3),(4)
SELECT *
FROM
@Table1 t1
INNER JOIN @Table2 t2
ON t1.Col = t2.Col
Again you might think that NULL = NULL, but any comparison with NULL is unknown, so even though both tables contain a NULL, that row will not be returned in the result set.
Now consider:
DECLARE @Table1 AS TABLE (Col INT)
INSERT INTO @Table1 VALUES (1),(2),(3),(NULL)
SELECT *, CASE WHEN Col < 2 THEN Col ELSE 1000 END as ColCase
FROM
@Table1 t1
This will turn even the NULL into 1000. But should NULL, an unknown, become 1000? If NULL is unknown, how do we know that it isn't less than 2?
For a lot of your operations it may simply be enough to compare @Value > 1, but especially when you start dealing with ELSE branches in CASE or IF statements, or joining on an inequality, you should consider dealing with the NULLs, for example with ISNULL() or COALESCE() as @GuidoG points out.
IMHO, being explicit in your operations so that null values are appropriately accounted for outweighs the minimal savings in typing.
Comparison with NULL matters when you use ELSE branches:
for example:
declare @t int
set @t = null
if (@t > 0) print '1' -- works fine
if (@t < 0) print '2' -- works fine
if (@t > 0)
print '3' -- works fine
else print '4' -- here we start getting problems: the ELSE branch runs as if we were sure that @t <= 0, which is obviously not true
you could replace it with
if isnull(@Test, 0) > 0
This way it is shorter and you have still checked everything
Another interesting example (in a dialect such as MySQL that allows boolean expressions in a SELECT list):
SELECT (NULL > 0) AS a, NOT (NULL > 0) AS b
The value of both a and b will be NULL.
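In T-SQL, which does not allow boolean expressions in a SELECT list, the same point can be sketched with CASE:

```sql
SELECT CASE WHEN NULL > 0       THEN 1 ELSE 0 END AS a,  -- 0: UNKNOWN falls through to ELSE
       CASE WHEN NOT (NULL > 0) THEN 1 ELSE 0 END AS b;  -- 0: NOT UNKNOWN is still UNKNOWN
```

Both columns fall through to the ELSE branch, because neither the condition nor its negation is true.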
From my understanding, null checks are sometimes added to short-circuit OR logic. For example, consider the following:
select * from tbl where (@id is null or @id > id)
If you pass in a value for @id, the first condition (@id is null) is false, but since it is part of an OR, the @id > id comparison is evaluated as well. An OR only needs one true branch for the whole predicate to be true, so evaluation continues until one is found.
Whereas if you pass NULL for the @id parameter, the first condition is true, and because it is part of an OR, the engine can skip the remaining comparison entirely: the whole OR has already resolved to true, so the @id > id comparison need not even run. This can save a lot of processing on a huge table or complex join.

Error while cast data from nvarchar to numeric

I have table1 with two columns:
col1 - nvarchar(510)
col2 - nvarchar(510)
I want to take all the values from table1 and put it to table2 where data type is different:
col1_A - numeric(22,10)
col2_A - int
I'm doing like:
insert into table2
select cast(col1 as numeric), cast(col2 as int)
but I'm getting error:
What is wrong?
You probably have a non-numeric character in table1.col1. Also, it's important to point out that table1.col1 is nvarchar(510), but numeric(22,10) can only hold 12 digits before the decimal point, so longer values will give you an arithmetic overflow error.
declare @char nvarchar(510)
set @char = '123456789101'
--set @char = '1234567891011' --will cause an arithmetic overflow error since you are using numeric(22,10)
--set @char = '123abc456' --will cause: Error converting data type nvarchar to numeric
declare @num numeric(22,10)
set @num = cast(@char as numeric(22,10))
select @num
It means you need to sanitize your data. Find which values are causing the issue, then manually correct or exclude them first. To find which values are not numeric, use the ISNUMERIC function.
select col1
from yourtable
where ISNUMERIC(col1) = 0
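On SQL Server 2012+, TRY_CAST is an alternative that avoids ISNUMERIC's edge cases ('$', '1e4' and the like count as numeric but still fail the cast):

```sql
-- Rows whose value cannot actually be converted to the target type:
SELECT col1
FROM table1
WHERE col1 IS NOT NULL
  AND TRY_CAST(col1 AS numeric(22,10)) IS NULL
```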
Thank you Eric for helping to find the wrong records!
I did a CASE as below:
case
when [col1] LIKE '%,%' then CAST(REPLACE([col1], ',', '') as Numeric)
else CAST([col1] as Numeric)
end
and it's working!

Can not select rows where column values are empty string in SQL Server?

I have a column which has varchars like "172.54". I am trying to insert into another table where this column's datatype is float. I am getting an error saying it cannot convert datatype varchar to float. So I do:
SELECT *
FROM TBL
WHERE ISNUMERIC(COLNAME) <> 1
And I get no results. But casting is still not working. So I look, and I have empty strings in that column. So I try:
SELECT *
FROM TBL
WHERE COLNAME = ''
And also with every different number of spaces.
I ultimately just want to convert the empty strings to NULL.
Also len(colname) = 1
declare @test varchar(10) = ' ' -- any number of spaces is equivalent to ''
select try_convert( float, @test ) as floatval -- '' gives you 0
select case when @test = '' then NULL else try_convert( float, @test ) end as floatval -- '' returns NULL instead of 0
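Since the asker ultimately wants the empty strings turned into NULL in place, a NULLIF-style update is one sketch (TBL/COLNAME as named in the question):

```sql
-- Blank out empty / all-space values before inserting into the float column:
UPDATE TBL
SET COLNAME = NULL
WHERE LTRIM(RTRIM(COLNAME)) = '';
```

After that, the insert can cast (or TRY_CONVERT) COLNAME directly, and blanks come through as NULL rather than 0.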
I guess your column has some characters other than numeric data. Also, an empty string will be converted to zero; it will not throw an error.
To filter Numeric data use
COLNAME not like '%[^0-9]%'
Try something like this:
insert into tablename (col1, col2)
SELECT col1, col2 FROM TBL
WHERE COLNAME not like '%[^0-9]%'
