MS SQL XQuery xs:base64Binary returns NULL

I (have to) use base64Binary to convert my base64-encoded string into bytes. In most cases it works well enough, but from time to time it returns NULL.
For example, this works like a charm:
DECLARE @Base64String VARCHAR(MAX)
SET @Base64String = 'qwerqwerqwerqwer'
DECLARE @Base64Binary VARBINARY(MAX)
SET @Base64Binary = CAST('' AS XML).value('xs:base64Binary(sql:variable("@Base64String"))', 'VARBINARY(MAX)');
SELECT @Base64Binary AS 'base64'
The result is 0xAB07ABAB07ABAB07ABAB07AB, and that's fine for me.
But if I set SET @Base64String = 'qwerqwerqwerqwe=' then I get NULL as the result. Why? I pass a perfectly valid base64 string and expect a non-null value. I've tried to find a workaround, but no luck. How can I make xs:base64Binary return a valid VARBINARY value for such input strings?

Having had a little look at this, I would suggest that qwerqwerqwerqwe= is not a valid base64 string.
Decoding qwerqwerqwerqwe= using a base64 conversion tool in C# renders the following:
0xAB07ABAB07ABAB07ABAB07
Encoding this in SQL Server actually gives the output qwerqwerqwerqwc=:
DECLARE @Base64String VARCHAR(MAX)
DECLARE @Base64Binary VARBINARY(MAX)
SET @Base64Binary = 0xAB07ABAB07ABAB07ABAB07
PRINT @Base64Binary
SET @Base64String = CAST('' AS XML).value('xs:base64Binary(sql:variable("@Base64Binary"))', 'VARCHAR(MAX)');
PRINT @Base64String
I would suggest that the reason SQL Server is returning NULL to you is that the base64 string you are working with is not actually valid. With a single = of padding, the last data character contributes only its high four bits, and canonical base64 requires its low two bits to be zero; that holds for c (011100) but not for e (011110). SQL Server's XQuery implementation rejects the non-canonical string, while the C# decoder silently discards the extra bits.
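To confirm this on the SQL Server side, a minimal sketch; the corrected string comes from the C# round-trip above:
DECLARE @good VARCHAR(MAX) = 'qwerqwerqwerqwc='  -- canonical: trailing bits are zero
DECLARE @bad  VARCHAR(MAX) = 'qwerqwerqwerqwe='  -- non-canonical: trailing bits are 10
SELECT CAST('' AS XML).value('xs:base64Binary(sql:variable("@good"))', 'VARBINARY(MAX)') AS good_result -- 0xAB07ABAB07ABAB07ABAB07
SELECT CAST('' AS XML).value('xs:base64Binary(sql:variable("@bad"))', 'VARBINARY(MAX)')  AS bad_result  -- NULL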

Related

SQL string to varbinary through XML different than nvarchar

We currently have a function in SQL which I simply do not understand.
Currently we convert an nvarchar to XML, then select the XML value and convert that to a varbinary.
When I try to simplify this by converting the nvarchar directly to varbinary, the output is different... Why?
--- Current situation:
DECLARE @inputString nvarchar(max) = '4d95605d1b8f3bca5ea3e0d2af26027004d17218152e726da0622d669a71f85c'
--1: input to XML
DECLARE @inputXML XML = CONVERT(varchar(max), @inputString)
--2: input XML to binary
DECLARE @inputBinary varbinary(max) = @inputXML.value('(/)[1]', 'varbinary(max)')
SELECT @inputString -- 4d95605d1b8f3bca5ea3e0d2af26027004d17218152e726da0622d669a71f85c
SELECT @inputXML -- 4d95605d1b8f3bca5ea3e0d2af26027004d17218152e726da0622d669a71f85c
SELECT @inputBinary -- 0xE1DF79EB4E5DD5BF1FDDB71AE5E6B77B477669FDBAD36EF4D38775EF6D7CD79D9EEF6E9D6B4EB6D9DEBAF5AEF57FCE5C
--- New situation
--1: Input to binary
DECLARE @inputString2 varbinary(max) = CAST(@inputString AS varbinary(max));
SELECT @inputString2 -- 0x3400640039003500360030003500640031006200380066003300620063006100350065006100330065003000640032006100660032003600300032003700300030003400640031003700320031003800310035003200650037003200360064006100300036003200320064003600360039006100370031006600380035006300
Using the value() function to get an XML value specified as varbinary(max) will read the data as if it were Base64-encoded. Casting a string to varbinary(max) does not; it treats it as just any string.
If you use the input string QQA=, which is the letter A in UTF-16 LE encoded to Base64, you will see more clearly what is happening.
XML gives you 0x4100, the varbinary of the letter A, while a direct cast on the string gives you 0x5100510041003D00, where you have two 5100 = "Q", one 4100 = "A", and a trailing 3D00 = "=".
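A small repro of that comparison, as a sketch built on the QQA= example:
DECLARE @s NVARCHAR(MAX) = N'QQA='
DECLARE @x XML = CONVERT(VARCHAR(MAX), @s)
-- value() decodes the text content as Base64
SELECT @x.value('(/)[1]', 'VARBINARY(MAX)') AS via_xml   -- 0x4100
-- CAST just reinterprets the UTF-16 bytes of the string
SELECT CAST(@s AS VARBINARY(MAX)) AS via_cast            -- 0x5100510041003D00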
I might be getting something wrong, but, if I understand you correctly, I think you simply want to get a real binary from a HEX string which merely looks like a binary. Correct?
I wrote "simply", but a while ago this was not simple at all.
I'm not sure at the moment, but I think it was SQL Server 2012 that enhanced CONVERT() (read about binary values and how the third parameter works). Try this:
DECLARE @hexString VARCHAR(max) = '4d95605d1b8f3bca5ea3e0d2af26027004d17218152e726da0622d669a71f85c';
SELECT CONVERT(varbinary(max), @hexString, 2);
The result is a real binary:
0x4D95605D1B8F3BCA5EA3E0D2AF26027004D17218152E726DA0622D669A71F85C
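The third parameter also works in the other direction; as far as I recall, style 1 keeps the 0x prefix while style 2 drops it:
DECLARE @bin VARBINARY(MAX) = 0x4D95605D
SELECT CONVERT(varchar(max), @bin, 1)  -- '0x4D95605D'
SELECT CONVERT(varchar(max), @bin, 2)  -- '4D95605D'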
What might be the reason for your issue:
Very long ago, I think until SQL Server 2005, the default encoding of varbinaries in XML was a HEX string. Later this was changed to base64. Might it be that your code was used in a very old environment and was then upgraded to a higher version?
Today we use XML in a similar way to create and to read base64, which is not supported otherwise (a sketch of this trick follows below). Maybe your code did something similar with HEX strings...?
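As a sketch, the XML trick for base64 in both directions looks roughly like this:
DECLARE @bin VARBINARY(MAX) = 0x4D95605D
-- binary -> base64 string
DECLARE @b64 VARCHAR(MAX) = (SELECT @bin FOR XML PATH(''), BINARY BASE64)
SELECT @b64  -- 'TZVgXQ=='
-- base64 string -> binary
SELECT CAST('' AS XML).value('xs:base64Binary(sql:variable("@b64"))', 'VARBINARY(MAX)')  -- 0x4D95605D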
One more hint: the many 00 bytes in your New Situation example clearly show that this is a two-byte encoded NVARCHAR string. Your Current Situation, by contrast, shows a simple HEX string.
Your final result is just the binary pattern of your input as a string.

Convert time zone offset value to varchar

Query:
DECLARE @TimeZoneOffset datetimeoffset
SELECT @TimeZoneOffset = Time_Zone_Offset
FROM OFFSET_TABLE WHERE Active = 1
The Time_Zone_Offset column contains values like -6:00 (only the offset).
When I do SELECT @TimeZoneOffset it throws an error:
Conversion failed when converting date and/or time from character string.
I know I am doing something wrong. I may need to CONVERT/CAST but can't get the output right so far.
Any help?
To visualize what is happening here, try this:
DECLARE @x VARCHAR;
SET @x = 'abcdefghijklmnop';
SELECT @x;
Result:
----------
a
You have silently lost data from your variable, because you didn't bother declaring a length for your VARCHAR. In your case, I think you are ending up trying to use the string '-' somewhere, as opposed to the string '-6:00'.
I'm not sure how a simple SELECT yielded the error you mentioned; I suspect you are using it in some other context you haven't shown. But please try it again once your variable has been declared correctly.
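For comparison, here is the same snippet with a proper length declared:
DECLARE @x VARCHAR(10);  -- explicit length this time
SET @x = '-6:00';
SELECT @x;  -- '-6:00', nothing silently lost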
Now I see why: your question wasn't correct. You said you were converting to VARCHAR, but you weren't. This is not really unexpected, as -6:00 is not a valid DATETIMEOFFSET value; date and time components are expected as well, otherwise the data type would just be called OFFSET. A valid DATETIMEOFFSET, according to the documentation, is:
DECLARE @d DATETIMEOFFSET = '1998-09-20 7:45:50.71345 -05:00';
So perhaps you have some datetime value and you want to apply the offset; you can use SWITCHOFFSET() for that. However, -6:00 is not a valid value; it needs to be in [+/-]hh:mm format (notice the leading 0 above, which is missing from your sample data). So this would be valid:
DECLARE @datetime DATETIME = GETDATE(), @offset VARCHAR(6) = '-06:00';
SELECT SWITCHOFFSET(@datetime, @offset);
You need to correct the data in your offsets table, and you need to change the way you are using the output. Personally, I've found it easier to stay away from DATETIMEOFFSET and SWITCHOFFSET(), especially since they are not DST-aware. I've had much better luck using a calendar table for offsets, storing the offsets in minutes, and using DATEADD to switch between time zones (a sketch follows below). YMMV.
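A minimal sketch of that minutes-based approach; the variable names here are just for illustration:
DECLARE @utc DATETIME = GETUTCDATE()
DECLARE @offsetMinutes INT = -360  -- e.g. -6:00 stored as -360 in the calendar table
SELECT DATEADD(MINUTE, @offsetMinutes, @utc) AS local_time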

SQL Geometry return type

In a SQL query I can say:
select *
from location
where geoLocation.STDistance(0xE6100000010CDDB5847CD0EB42C033333333331F6240) < 10000
but how do I pass that 0xE61... bit in programmatically?
When returning it from the database, it appears to be in a binary format, so just putting it into the query doesn't work. If I put it in as type binary, it doesn't work either.
Is there a way to retrieve the geolocation from the database that retains it in the above format? I've tried casting it to varchar but it ends up like POINT(....
SQL Server has a bunch of static methods for creating geospatial data; check the documentation, specifically the OGC Static and Extended Static method lists. For example:
DECLARE @g geography;
SET @g = geography::STPointFromText('POINT(-122.34900 47.65100)', 4326);
SELECT @g.ToString();
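Tying that back to the original query, a sketch, assuming geoLocation is a geography column as in the question:
DECLARE @g geography = geography::STPointFromText('POINT(-122.34900 47.65100)', 4326);
SELECT *
FROM location
WHERE geoLocation.STDistance(@g) < 10000;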

Using HashBytes in SQL Server returns different results from DB

I'm trying to calculate an MD5 hash of a certain value, but I get a weird result.
I run it in two different ways:
SELECT HASHBYTES('md5',ZLA_PASSWORD),ZLA_PASSWORD, len(ZLA_PASSWORD) FROM ZLA_PASSWORD;
SELECT HASHBYTES('md5', '123456');
I get two different results, where only the second one is valid:
0xCE0BFD15059B68D67688884D7A3D3E8C 123456 6
0xE10ADC3949BA59ABBE56E057F20F883E
This is done on SQL Server 2005.
Checking the MD5 of 123456 online matches the second result.
Any ideas?
Thanks!
You have different data types:
DECLARE @str1 AS varchar(10)
DECLARE @str2 AS nvarchar(10)
SET @str1 = '123456'
SET @str2 = '123456'
SELECT
    hashbytes('md5', @str1) AS 'varchar',
    hashbytes('md5', @str2) AS 'nvarchar'
Result:
varchar                             nvarchar
0xE10ADC3949BA59ABBE56E057F20F883E  0xCE0BFD15059B68D67688884D7A3D3E8C
LEN trims trailing spaces before returning the length (that is, it returns the length of the trimmed string).
Most likely your password field is a CHAR column and has trailing whitespace in it.
Try doing an RTRIM before hashing:
SELECT HASHBYTES('md5',RTRIM(ZLA_PASSWORD))
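To check whether trailing whitespace is the culprit, compare LEN with DATALENGTH, which does not trim:
DECLARE @p CHAR(10) = '123456'       -- CHAR pads with trailing spaces
SELECT LEN(@p) AS len_trimmed,       -- 6
       DATALENGTH(@p) AS bytes_used  -- 10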
More precisely, this should solve the issue:
SELECT HASHBYTES('md5',CAST(ZLA_PASSWORD AS varchar)),ZLA_PASSWORD, len(ZLA_PASSWORD) FROM ZLA_PASSWORD;
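Combining both suggestions, and giving the cast an explicit length so nothing is truncated (varchar defaults to 30 characters in a CAST), a sketch:
SELECT HASHBYTES('md5', RTRIM(CAST(ZLA_PASSWORD AS varchar(8000)))), ZLA_PASSWORD, LEN(ZLA_PASSWORD)
FROM ZLA_PASSWORD;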

Text encodings in .NET, SQL Server processing

I have an application that gets terms from a DB to use as a list of string terms. The DB table was set up with nvarchar for that column to include all foreign characters. In some cases, characters like ä come through clearly when getting the terms from the DB and even show that way in the table.
When importing Japanese or Arabic characters, all I see are ????????.
I have tried converting them using different methods: first converting into UTF-8 encoding and back, and secondly using HttpUtility.HtmlEncode, which works perfectly for these characters but then converts quotes and other things which I don't need it to do.
I accused the DB designer of needing to do something on his part, but am I wrong in thinking that the DB should display all these characters and make it easy to just query it and add to my search list? If not, is there a consistent way of getting all international characters to display correctly in SQL and VB.NET?
I know when I have read from text files I just used the Microsoft.VisualBasic.TextFieldParser reader tool with encoding set to UTF-8, and this was not an issue.
If the database field is nvarchar, then it will store the data correctly, as you have seen.
Somewhere before it gets to the database, the data is being lost or changed to varchar: a stored procedure, parameters, file encoding, ODBC translation, etc.
DECLARE @foo nvarchar(100), @foo2 varchar(100)
--with arabic and japanese and a proper N literal
SELECT @foo = N'العربي 日本語', @foo2 = N'العربي 日本語'
SELECT @foo, @foo2 -- gives العربي 日本語
--now a varchar literal
SELECT @foo = 'العربي 日本語', @foo2 = 'العربي 日本語'
SELECT @foo, @foo2 --gives ?????? ???
--from my Swiss German keyboard. These are part of my code page.
SELECT @foo = 'öéäàüè', @foo2 = 'öéäàüè'
SELECT @foo, @foo2 --gives öéäàüè öéäàüè
So, apologise to the nice DB monkey... :-)
Always try to use NVARCHAR or NTEXT to store foreign characters.
You cannot store Unicode in the varchar or text data types.
Also, put an N before the string value, like:
UPDATE [USER]
SET Name = N'日本語'
WHERE ID = XXXX;
