I have a table in SQL Server, with binary data stored as a string in a varchar(max) field.
The table name is attachment, and the field is named "documentbody".
select id, mimetype, documentbody
from attachment
The files stored in the table are mostly PDF, but also include JPG and PNG and probably some other file types too.
Here is a sample of what one of the "files" looks like, when queried (first 100 characters only):
JVBERi0xLjQKJeLjz9MNCjEgMCBvYmoKPDwvVHlwZSAvUGFnZQovUGFyZW50IDIgMCBSCi9NZWRpYUJveCBbIDAgMCA2MTIuMDAw
How can I convert this data into actual binary data?
When one wishes to convert data from one datatype to another, and no implicit conversion exists, one uses either CAST or CONVERT.
e.g.
select cast(MyColumn as varbinary(max)), convert(varbinary(max), MyColumn)
from MyTable;
CAST is ANSI SQL, for what it is worth, whereas CONVERT is SQL Server specific. However, CONVERT handles many other cases, including specific formatting, which CAST doesn't.
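To illustrate the formatting point, a small sketch with a made-up binary literal (the binary-to-string styles shown here exist in SQL Server 2008 and later; CAST has no equivalent argument):
SELECT CONVERT(varchar(max), 0x48656C6C6F);     -- default style 0: bytes reinterpreted as characters -> 'Hello'
SELECT CONVERT(varchar(max), 0x48656C6C6F, 1);  -- style 1: hex string with the 0x prefix -> '0x48656C6C6F'
SELECT CONVERT(varchar(max), 0x48656C6C6F, 2);  -- style 2: hex string without the prefix -> '48656C6C6F'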
OK, taking a total guess here: many people encode binary data as base64, so try this:
SELECT CAST(N'' AS XML).value('xs:base64Binary(sql:column("MyColumn"))', 'VARBINARY(MAX)')
FROM MyTable;
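Applied to the table from the question, that would look something like the following. Treat it as a sketch; I have not run it against your data:
-- Decode the base64 text stored in documentbody back into the original bytes.
SELECT id,
       mimetype,
       CAST(N'' AS XML).value(
           'xs:base64Binary(sql:column("documentbody"))',
           'VARBINARY(MAX)') AS documentbody_binary
FROM attachment;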
I am trying to figure out how to pass an XML value to a stored procedure using the MSSQL Node driver. From the documentation I can see that the driver supports stored procedures, and that you can define custom data types like this:
sql.map.register(MyClass, sql.Text);
but so far I haven't found an example of how this can be done for XML.
I did find a similar question, but for a .NET SQL driver; I'm trying to figure out whether anyone has done this for Node.
UPDATE
I was able to send an XML to a stored procedure and parse it in DB, here's the example:
var request = new sql.Request(connection);
var xml = '<root><stock><id>3</id><name>Test3</name><ask>91011</ask></stock></root>'
request.input('XStock', sql.Xml, xml);
request.execute('StockUpdateTest', function (err, recordsets, returnValue, affected) {
});
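The stored procedure side is not shown above. Here is a minimal sketch of what StockUpdateTest could look like on the SQL Server side; only the parameter name XStock and the XML shape come from the snippet above, the column types and the SELECT are assumptions for illustration:
CREATE PROCEDURE StockUpdateTest
    @XStock XML
AS
BEGIN
    -- Shred the XML passed from Node into rows and columns.
    SELECT
        s.value('(id/text())[1]',   'INT')           AS id,
        s.value('(name/text())[1]', 'NVARCHAR(50)')  AS name,
        s.value('(ask/text())[1]',  'DECIMAL(18,2)') AS ask
    FROM @XStock.nodes('/root/stock') AS t(s);
END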
I do not know this specific case, but there are some general ideas:
The input parameter of a stored procedure that should take some XML can be typed as XML, VARCHAR or NVARCHAR. (Just to mention it, VARBINARY might work too, but why would one do that...)
A string in SQL Server is either 8-bit encoded (VARCHAR) or 16-bit encoded (NVARCHAR); XML is, in any case, handled as NVARCHAR internally.
Most cases will be cast implicitly. You can pass valid XML as VARCHAR or as NVARCHAR and assign it to a variable of type XML; both will work. But you will get into trouble if your XML includes special characters...
Important: if the XML includes a declaration like <?xml ... encoding="utf-8"?>, it must be handed over to the XML variable as VARCHAR, while a utf-16 declaration requires NVARCHAR. SQL Server omits this declaration in any case, so the easiest approach is to pass the XML as a string without such a declaration.
The clear advice is to pass the XML as a 16-bit Unicode string without an <?xml ...?> declaration. Doing so, there will be no implicit casting and you will not run into trouble with special characters and/or encoding issues.
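A quick demo of the declaration issue (a sketch; the element names are arbitrary):
-- Fails with Msg 9402, "unable to switch the encoding": a UTF-16 (NVARCHAR) literal
-- carrying a utf-8 declaration cannot be cast to XML.
-- SELECT CAST(N'<?xml version="1.0" encoding="utf-8"?><root/>' AS XML);

-- Works: the same literal as VARCHAR, or any literal without a declaration.
SELECT CAST('<?xml version="1.0" encoding="utf-8"?><root/>' AS XML);
SELECT CAST(N'<root/>' AS XML);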
Your SP can either define the parameter as XML or as NVARCHAR(MAX) and assign it to a typed variable internally.
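For the NVARCHAR(MAX) variant, a minimal sketch (the procedure and parameter names here are made up, and the XPath just reuses the element names from the question):
CREATE PROCEDURE dbo.ImportXml
    @Payload NVARCHAR(MAX)  -- plain Unicode string, no <?xml ...?> declaration
AS
BEGIN
    DECLARE @X XML = @Payload;  -- cast to the typed variable internally
    SELECT @X.value('(/root/stock/name/text())[1]', 'NVARCHAR(50)');
END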
Hope this helps!
I have XML docs stored in a TEXT column (collation_name French_CI_AS, character_set_name iso_1).
I want to move them to a new table, in an XML column with the following SQL...
INSERT INTO Signature(JustifId, SignedJustif)
SELECT JustifID, CONVERT(XML, Justif.SignedJustif,2)
FROM Justif
When I do this, I get character encoding errors that point to the high-ASCII character in this fragment: "presentación, OU=CERES, O=FNMT-RCM, C=ES" - a Spanish accented o in an X.509 certificate.
This ó started life in UTF-8, became UTF-16 as a .NET string, then became iso_1 when inserted into the TEXT column. I can copy and paste it into a web page no problem. How, then, do I move it from a TEXT column to an XML column in the same DB (and why is this so difficult?)?
The CONVERT idea came from this post. This MS page covers creating XML from varchar and nvarchar.
This is tricky... A conversion on byte-level might lead to unexpected results...
Try this
INSERT INTO Signature(JustifId, SignedJustif)
SELECT JustifID, CONVERT(XML, CONVERT(VARCHAR(MAX),Justif.SignedJustif))
FROM Justif
If you still get issues, try specifying the collation explicitly together with the conversion, and/or converting to NVARCHAR(MAX).
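For example, something along these lines (a sketch only; the collation is the one from your question, and the exact nesting may need adjusting to your schema):
INSERT INTO Signature (JustifId, SignedJustif)
SELECT JustifID,
       CONVERT(XML,
               CONVERT(NVARCHAR(MAX),
                       CONVERT(VARCHAR(MAX), Justif.SignedJustif) COLLATE French_CI_AS))
FROM Justif;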
If this doesn't help, please edit your question and post a (reduced) example. Best would be a test scenario with a minimal XML where one can reproduce the error.
I have an MSSQL database with IDs in the form of hex values.
For example, when viewed in Management studio, a typical id column looks like
id                                 | userName
0x8189CF203DEA4A44B8ADEFF1C8246866 | John
0xAF4845C8A34A48EF8B6D481F2D20D561 | Peter
0x70B1F5E3B3F8417BBB99912640C54520 | Alan
To query the user table, I need to write something like
SELECT * FROM users Where Id = 0x8189CF203DEA4A44B8ADEFF1C8246866
I use sequelize.query a lot to run a bunch of SQL statements directly.
When such a table is read in sequelize, the id gets converted into a Buffer type. So my question is: how can I keep this hex value? Is there a config that keeps the string hex value of these ids? Or do I have to convert these Buffers to hex strings by hand and attach a 0x at the front?
For example, when viewed in Management studio, a typical id column looks like
For me, a typical ID does not look like this, but I'm quite sure that your hex values are actually UNIQUEIDENTIFIERs (= GUIDs) (see option 2).
Option 1: HEX-string
You might store the hex string as its string representation:
SELECT sys.fn_varbintohexstr(0x8189CF203DEA4A44B8ADEFF1C8246866)
returns "0x8189cf203dea4a44b8adeff1c8246866" (which is a string now)
However, the function meant to do the opposite truncates part of this:
select sys.fn_cdc_hexstrtobin(N'0x8189CF203DEA4A44B8ADEFF1C8246866')
returns 0x8189CF203DEA4A44B8AD (which is too short!)
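A possible workaround (my own addition, not part of that function pair): the binary styles of CONVERT round-trip the full value without truncation:
SELECT CONVERT(varchar(max), 0x8189CF203DEA4A44B8ADEFF1C8246866, 1);
-- returns '0x8189CF203DEA4A44B8ADEFF1C8246866' (style 1 keeps the 0x prefix)
SELECT CONVERT(varbinary(max), '0x8189CF203DEA4A44B8ADEFF1C8246866', 1);
-- returns 0x8189CF203DEA4A44B8ADEFF1C8246866 (full length, nothing cut off)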
Option 2: GUID
I would cast these values to GUIDs (if none of them is wider than 16 bytes!) and store them type-safely. It is easy and works fully out of the box to get a GUID as its string representation (e.g. to write it into XML) and to cast it back to a GUID.
SELECT CAST(0x8189CF203DEA4A44B8ADEFF1C8246866 AS uniqueidentifier);
returns 20CF8981-EA3D-444A-B8AD-EFF1C8246866
SELECT CAST('20CF8981-EA3D-444A-B8AD-EFF1C8246866' AS uniqueidentifier)
returns the same as above, just to show that this string value is cast back to a real GUID
SELECT CAST(CAST('20CF8981-EA3D-444A-B8AD-EFF1C8246866' AS uniqueidentifier) AS varbinary(max))
returns 0x8189CF203DEA4A44B8ADEFF1C8246866
Now you have your original HEX-string back again.
How to convert binary data to text?
I have a column called "File names" in a test table with the image datatype, so when I select the data from the test table, the data shown for the "File names" column is binary data, i.e. '0x433A5C55736535'.
Regards
Anji
I can't believe nobody answered this. If it's image data you won't get anything readable, but in case it is text data in a binary field, you can do this:
-- image cannot be cast straight to varchar, so go via varbinary first
select cast(cast(DataColumn as varbinary(MAX)) as varchar(MAX)) as DataAsText from [TableWithData];
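For what it's worth, the sample value from the question decodes to a path fragment that way (the value in the question looks truncated, so only the visible bytes are shown):
SELECT CAST(0x433A5C55736535 AS varchar(max));  -- returns 'C:\Use5'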
'0x433A5C55736535' is not binary data as such; binary data is composed only of 0s and 1s.
I think you need to use the functions defined for your test table to get the data you want; if you are in MySQL, it's "Select * from file names".
So I decided, for the fun of it, to read a text file and store the contents in an NVARCHAR using T-SQL and Microsoft SQL Server Management Studio 2008 R2. I found an example for doing this at https://www.simple-talk.com/sql/t-sql-programming/the-tsql-of-text-files/
So I tried this with my ABC.txt file and its contents are:
ABCDEFGHIJKLMNOPQRSTUVWXYZ
abcdefghijklmnopqrstuvwxyz
When I first tried to store the contents of this file into @myString I used this code:
declare @myString nvarchar(max);
Select @myString = BulkColumn
from OPENROWSET(Bulk 'C:\Users\<myComputer'sNameHere>\Documents\How2\FilesForTestingStuff\ABC.txt', SINGLE_BLOB) as x
print @myString;
I got this as my output when I printed the string:
䉁䑃䙅䡇䩉䱋乍偏剑呓噕塗婙扡摣晥桧橩汫湭灯牱瑳癵硷穹
I changed nvarchar to varchar and I got the correct contents of the file.
Anyone know why this happened? I didn't think there was a conversion difference other than that nvarchar has more space available than varchar and is able to hold Unicode characters.
Also, how do you normally go about reading from a file and inserting the contents into an nvarchar?
I suppose it depends on the encoding of the input file.
You used SINGLE_BLOB, and according to MSDN it causes the data to be returned as varbinary(MAX). Your file was probably saved with a non-Unicode encoding, so when the data was imported into the nvarchar variable, SQL Server interpreted it incorrectly. Changing the type allowed the characters to be read correctly. Try encoding the file as UTF-16 and then importing the data into an nvarchar(MAX) variable.
Update
I tried to recreate the issue you described. I saved a text file with ANSI encoding, ran the import script, and got output similar to the one you posted in your question. Then I converted the file to UCS-2 Little Endian encoding, and after running the script I got the correct output.
To sum it up: if you want to import with the SINGLE_BLOB option, just convert the data file to UCS-2 Little Endian encoding and it should work correctly with the nvarchar SQL type.
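Alternatively - and this is just a sketch with a placeholder path - if you want to keep the file in its original ANSI encoding, SINGLE_CLOB reads it as varchar(max), which then converts cleanly into an nvarchar variable:
declare @myString nvarchar(max);

Select @myString = BulkColumn
from OPENROWSET(Bulk 'C:\temp\ABC.txt', SINGLE_CLOB) as x  -- SINGLE_CLOB: read as character data, not as a binary blob

print @myString;  -- prints both lines of the sample file correctly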
Reference links:
OPENROWSET
nchar and varchar