Decrypting password correctly after EncryptByPassPhrase - sql-server

I am trying to Encrypt and then Decrypt text in SQL Server 2012. I was expecting the third print to give me back customer_abc:
DECLARE
@var_customer VARCHAR(25),
@var_password VARBINARY(8000)
SET @var_customer = 'customer_abc'
SET @var_password = EncryptByPassPhrase('secret', @var_customer)
PRINT @var_customer
PRINT @var_password
PRINT DecryptByPassPhrase('secret', @var_password)
Result:
customer_abc
0x01000000398F9A0D3FE98D29E8F56D6B1908EA87C08706786319DD1BBB3F150FFC5B7F3C
0x637573746F6D65725F616263

Your code is fine, except that you have to explicitly convert to get the actual varchar value. The output of the decrypt functions is still binary. Change the last line to:
PRINT CONVERT(VARCHAR(25), DecryptByPassPhrase('secret', @var_password));
As an aside, I strongly suggest you support Unicode (so NVARCHAR instead of VARCHAR), unless you want to look like you don't take your users' password security seriously.
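For completeness, here is a minimal sketch of the round trip with that NVARCHAR suggestion applied (the variable names are just the ones from the question, widened to NVARCHAR):
DECLARE
@var_customer NVARCHAR(25),
@var_password VARBINARY(8000)
SET @var_customer = N'customer_abc'
SET @var_password = EncryptByPassPhrase('secret', @var_customer)
-- convert back to the same type that was encrypted, otherwise the bytes are misinterpreted
PRINT CONVERT(NVARCHAR(25), DecryptByPassPhrase('secret', @var_password))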

Related

SQL string to varbinary through XML different than nvarchar

We currently have a function in SQL which I simply do not understand.
It converts an nvarchar to XML, then selects the XML value and converts that to a varbinary.
When I try to simplify this and convert the nvarchar directly to varbinary, the output is different... Why?
--- Current situation:
Declare @inputString nvarchar(max) = '4d95605d1b8f3bca5ea3e0d2af26027004d17218152e726da0622d669a71f85c'
--1: input to XML
declare @inputXML XML = convert(varchar(max), @inputString)
--2: input XML to binary
declare @inputBinary varbinary(max) = @inputXML.value('(/)[1]', 'varbinary(max)')
select @inputString -- 4d95605d1b8f3bca5ea3e0d2af26027004d17218152e726da0622d669a71f85c
select @inputXML -- 4d95605d1b8f3bca5ea3e0d2af26027004d17218152e726da0622d669a71f85c
select @inputBinary -- 0xE1DF79EB4E5DD5BF1FDDB71AE5E6B77B477669FDBAD36EF4D38775EF6D7CD79D9EEF6E9D6B4EB6D9DEBAF5AEF57FCE5C
--- New situation
--1: Input to binary
declare @inputString2 varbinary(max) = CAST(@inputString as varbinary(max));
select @inputString2 -- 0x3400640039003500360030003500640031006200380066003300620063006100350065006100330065003000640032006100660032003600300032003700300030003400640031003700320031003800310035003200650037003200360064006100300036003200320064003600360039006100370031006600380035006300
Using the value() function to get an XML value specified as varbinary(max) will read the data as if it were Base64 encoded. Casting a string to varbinary(max) does not; it treats it as just any string.
If you use the input string QQA=, which is the letter A in UTF-16 LE encoded as Base64, you will see more clearly what is happening.
The XML path gives you 0x4100, the binary of the letter A, while a direct cast on the string gives you 0x5100510041003D00: two 5100 = "Q", one 4100 = "A", and a trailing 3D00 = "=".
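If you want to reproduce that quickly, here is a small sketch along the lines of the example above (variable names are illustrative):
DECLARE @s NVARCHAR(10) = N'QQA=';
DECLARE @x XML = CONVERT(VARCHAR(10), @s);
-- value() decodes the element text as Base64 when you ask for varbinary(max)
SELECT @x.value('(/)[1]', 'varbinary(max)');   -- 0x4100
-- a plain cast just reinterprets the string's own bytes
SELECT CAST(@s AS VARBINARY(20));              -- 0x5100510041003D00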
I might be getting something wrong, but - if I understand you correctly - I think you simply want to get a real binary from a HEX string which just looks like a binary. Correct?
I wrote "simply" above, but this was not simple at all a while ago.
I'm not sure of the exact version, but I think it was SQL Server 2012 that enhanced CONVERT() for binary values (read about how the third, style, parameter works). Try this:
DECLARE @hexString VARCHAR(max)='4d95605d1b8f3bca5ea3e0d2af26027004d17218152e726da0622d669a71f85c';
SELECT CONVERT(varbinary(max),@hexString,2);
The result is a real binary
0x4D95605D1B8F3BCA5EA3E0D2AF26027004D17218152E726DA0622D669A71F85C
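If it helps, the same style parameter also works in the other direction, so you can check the round trip (a sketch, reusing the @hexString variable declared above; the hex digits come back in upper case):
SELECT CONVERT(VARCHAR(max), CONVERT(varbinary(max), @hexString, 2), 2);
-- 4D95605D1B8F3BCA5EA3E0D2AF26027004D17218152E726DA0622D669A71F85C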
What might be the reason for your issue:
Quite a long time ago, I think up to v2005, the default encoding of varbinary values in XML was a HEX string. Later this was changed to Base64. Could it be that your code was written for a very old environment and was later upgraded to a higher version?
Today we use XML in a similar way to create and read Base64, which is not supported otherwise. Maybe your code did something similar with HEX strings...?
One more hint: the many 00 bytes in your New situation example clearly show that this is a two-byte encoded NVARCHAR string. In contrast, your Current situation shows a simple HEX string.
Your final result is just the binary pattern of your input treated as a string.

MS SQL Server EncryptByKey - String or binary data would be truncated

In theory, varchar(max) and varbinary(max) columns should be capable of storing up to 2 GB of data, but I cannot store a Unicode string 5000 characters long.
I've looked through other questions on this topic and they all suggest checking column sizes. I've done this and see that all related columns are declared with max size.
The key difference from similar questions is that, when storing, I'm encrypting the data using EncryptByKey, and I think that is the bottleneck I'm looking for. From MSDN I know that the return type of EncryptByKey has a maximum size of 8000 bytes, and it is not clear what the maximum size of the @cleartext argument is, but I suspect it's the same.
The following code gives me an error:
OPEN SYMMETRIC KEY SK1 DECRYPTION BY CERTIFICATE Cert1;
DECLARE @tmp5k AS NVARCHAR(max);
SET @tmp5k = N'...5000 characters...';
SELECT EncryptByKey(Key_GUID('SK1'), @tmp5k);
GO
[22001][8152] String or binary data would be truncated.
How to encrypt and store big strings (around 5k unicode characters)?
So I ran into this issue when using C# and trying to encrypt and insert a long JSON string into SQL. What ended up working was converting the plain-text string to binary and then using the same SQL EncryptByKey function to insert that instead.
If you're doing this in pure SQL, I think you can use this conversion:
CONVERT(VARBINARY(MAX), @tmp5k) AS ToBinary
So using our example:
OPEN SYMMETRIC KEY SK1 DECRYPTION BY CERTIFICATE Cert1;
DECLARE @tmp5k AS NVARCHAR(max);
SET @tmp5k = N'...5000 characters...';
SELECT EncryptByKey(Key_GUID('SK1'), CONVERT(VARBINARY(MAX), @tmp5k));
GO
And here's an example of using SQL to convert the binary back to a string:
CONVERT(VARCHAR(100), CONVERT(VARBINARY(100), @TestString)) AS StringFromBinaryFromString;
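Putting the pieces together, here is a hedged sketch of the whole round trip, assuming the SK1 key and Cert1 certificate from the question (and keeping in mind the 8000-byte cap on EncryptByKey mentioned above):
OPEN SYMMETRIC KEY SK1 DECRYPTION BY CERTIFICATE Cert1;

DECLARE @tmp5k NVARCHAR(MAX) = N'...5000 characters...';
-- encrypt the binary form of the string
DECLARE @cipher VARBINARY(8000) = EncryptByKey(Key_GUID('SK1'), CONVERT(VARBINARY(MAX), @tmp5k));
-- decrypt and convert back to NVARCHAR to recover the original text
SELECT CONVERT(NVARCHAR(MAX), DecryptByKey(@cipher)) AS RoundTripped;

CLOSE SYMMETRIC KEY SK1;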

How to ensure specific character encoding in Microsoft SQL Server?

What I need is to ensure that a string gets encoded in a known character encoding. So far, my research and testing with MS SQL Server have revealed that the documented encoding is 'UCS-2'; however, the actual encoding (on the server in question) is 'UCS-2LE'.
That doesn't seem very reliable. What I would love is an ENCODE function, as found in PERL, Node, or most anything else, so that regardless of upgrades or settings changes, my hash function will be working on known input.
We can limit the hashing string to HEX, so at worst, we could manually map the 16 possible input characters to the proper bytes. Anyone have a recommendation on this?
Here's the PERL I'm using:
use Digest::SHA qw/sha256/;
use Encode qw/encode/;
$seed = 'DDFF5D36-F14D-495D-BAA6-3688786D6CFA';
$string = '123456789';
$target = '57392CD6A5192B6185C5999EB23D240BB7CEFD26E377D904F6FEF262ED176F97';
$encoded = encode('UCS-2LE', $seed.$string);
$sha256 = uc(unpack("H*", sha256($encoded)));
print "$target\n$sha256\n";
Which matches MS SQL:
HASHBYTES('SHA_256', 'DDFF5D36-F14D-495D-BAA6-3688786D6CFA123456789')
But what I really want is:
HASHBYTES('SHA_256', ENCODE('UCS2-LE', 'DDFF5D36-F14D-495D-BAA6-3688786D6CFA123456789'))
So that no matter what MS SQL happens to be encoding the input string as, the HASHBYTES will always operate on a known byte array.
SQL Server uses UCS-2 only for columns, variables and literals that are declared as nvarchar. In all other cases it uses an 8-bit character set based on the collation of the current database, unless specified otherwise (using the collate clause, for example).
So, you either have to specify a Unicode literal:
select HASHBYTES('SHA_256', N'DDFF5D36-F14D-495D-BAA6-3688786D6CFA123456789');
Or, you can use a variable or table column of the nvarchar data type:
-- Variable
declare @var nvarchar(128) = N'DDFF5D36-F14D-495D-BAA6-3688786D6CFA123456789';
select HASHBYTES('SHA_256', @var);
-- Table column
declare @t table(
Value nvarchar(128)
);
insert into @t
select @var;
select HASHBYTES('SHA_256', t.Value)
from @t t;
P.S. Of course, since Wintel is a little-endian platform, SQL Server uses the same flavour of the encoding as the OS / hardware. Unless something new comes out in SQL Server 2017, there is no native way to get a big-endian representation.
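One way to convince yourself which byte array HASHBYTES actually operates on is to hash an nvarchar value and its explicit varbinary form side by side (a small sketch):
DECLARE @s NVARCHAR(10) = N'A';
SELECT CONVERT(VARBINARY(MAX), @s);                         -- 0x4100, the UCS-2LE bytes
SELECT HASHBYTES('SHA_256', @s);                            -- hashes exactly those bytes
SELECT HASHBYTES('SHA_256', CONVERT(VARBINARY(MAX), @s));   -- same hash as the line above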

Why can I store an Ukrainian string in a varchar column?

I was a little surprised to find that I can store a Ukrainian string in a varchar column.
My table is:
create table delete_collation
(
text1 varchar(100) collate SQL_Ukrainian_CP1251_CI_AS
)
and using this query I am able to insert:
insert into delete_collation
values(N'використовується для вирішення квитки')
but when I remove the 'N' it shows ?????? in the select statement.
Is this okay, or am I missing something in my understanding of Unicode and non-Unicode with collate?
From MSDN:
Prefix Unicode character string constants with the letter N. Without the N prefix, the string is converted to the default code page of the database. This default code page may not recognize certain characters.
UPDATE:
Please see similar questions:
What is the meaning of the prefix N in T-SQL statements?
Cyrillic symbols in SQL code are not correctly after insert
sql server 2012 express do not understand Russian letters
To expand on MegaTron's answer:
Using collate SQL_Ukrainian_CP1251_CI_AS, SQL Server is able to store Ukrainian characters in a varchar column by using code page 1251.
However, when you specify a string without the N prefix, that string will be converted to the default non-unicode codepage before it is sent to the database, and that is why you see ??????.
So it is completely fine to use varchar and collate as you do, but you must always include the N prefix when sending strings to the database, to avoid the intermediate conversion to default (non-ukrainian) codepage.
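A small sketch to see both behaviours side by side, reusing the delete_collation table from the question (this assumes the database's default code page is not a Cyrillic one):
insert into delete_collation values (N'використовується')  -- stays Unicode until the column's CP1251 collation stores it: OK
insert into delete_collation values ('використовується')   -- converted to the database default code page first: becomes ??????
select text1 from delete_collation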

How to Show Eastern Letter(Chinese Character) on SQL Server/SQL Reporting Services?

I need to insert Chinese characters into my database, but it always shows ????.
Example:
Insert this record.
微波室外单元-Apple
Then it became ???
Result:
??????-Apple
I really need help... thanks in advance.
I am using MSSQL Server 2008
Make sure you specify a Unicode string with a capital N prefix when you insert, like:
INSERT INTO Table1 (Col1) SELECT N'微波室外单元-Apple' AS [Col1]
and that Table1 (Col1) is an NVARCHAR data type.
Make sure the column you're inserting to is nchar, nvarchar, or ntext. If you insert a Unicode string into an ANSI column, you really will get question marks in the data.
Also, be careful to check that when you pull the data back out you're not just seeing a client display problem but are actually getting the question marks back:
SELECT Unicode(YourColumn), YourColumn FROM YourTable
Note that the Unicode function returns the code of only the first character in the string.
Once you've determined whether the column is really storing the data correctly, post back and we'll help you more.
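As a minimal, self-contained check (the table name here is just a placeholder):
CREATE TABLE dbo.DemoChinese (Col1 NVARCHAR(100));
INSERT INTO dbo.DemoChinese (Col1) SELECT N'微波室外单元-Apple' AS [Col1];
-- UNICODE() returns the code point of the first character; a value like 24494 (U+5FAE) shows the data really is stored, not just displayed wrong
SELECT Unicode(Col1), Col1 FROM dbo.DemoChinese;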
Try adding the appropriate languages to your Windows locale settings. You'll have to make sure your development machine is set to display non-Unicode characters in the appropriate language.
And of course you need to use NVARCHAR for foreign-language fields.
Make sure that you have set the database's encoding to one that supports these characters. UTF-8 is the de facto encoding, as it is ASCII compatible but supports all Unicode code points (up to U+10FFFF).
SELECT 'UPDATE table SET msg=UNISTR('''||ASCIISTR(msg)||''') WHERE id='''||id||''' FROM table WHERE id= '123344556' ;
