I have data in XML like this:
<Values>
<Id>7f8a5d20-d171-42f5-a222-a01b5186a048</Id>
<DealAttachment>
<AttachmentId>deefff3f-f63e-4b4c-8e76-68b6db476628</AttachmentId>
<IsNew>true</IsNew>
<IsDeleted>false</IsDeleted>
<DealId>7f8a5d20-d171-42f5-a222-a01b5186a048</DealId>
<AttachmentName>changes2</AttachmentName>
<AttachmentFile>991049711010310132116104101321011099710510832115117981061019911645100111110101131011711210097116101329811110012132116101120116321151171031031011151161011003298121326610510810845100111110101131010011132110111116321151011101003210110997105108321051103299971151013211110232117112100971161011003299108105101110116115329811711632115101110100321051103299971151013211110232100101108101116101100329910810510111011611545321001111101011310991141019711610132973211510111297114971161013211697981081013211611132991111101169710511032117115101114115321161113211910411110932101109971051083211910510810832981013211510111011645100111110101131011611411711099971161013211610410132108111103321169798108101329710211610111432111110101321091111101161044510011111010113108411497110115108971161051111101154532100111110101131097100100321129711497109101116101114321161113210611198321161113211411711032115105108101110116108121451001011089712110110013101310131013101310131010511010211111410932981051081083210511032999711510132981111161043211711210097116105111110471001011081011161051111104432117112100971161051111103211910510810832110111116329810132105110102111114109101100</AttachmentFile>
<AttachmentType>.txt</AttachmentType>
</DealAttachment>
</Values>
where AttachmentFile is varbinary(max)
DECLARE @AttachmentId uniqueidentifier,
@DealId uniqueidentifier,
@IsNew bit,
@IsDeleted bit,
@AttachmentName varchar(100),
@AttachmentFile varbinary(max),
@AttachmentType varchar(50)
SET @DealId = @SubmitXml.value('(Values/Id/node())[1]', 'uniqueidentifier')
SET @AttachmentId = @SubmitXml.value('(Values/DealAttachment/AttachmentId/node())[1]', 'uniqueidentifier')
SET @IsNew = @SubmitXml.value('(Values/DealAttachment/IsNew/node())[1]', 'bit')
SET @IsDeleted = @SubmitXml.value('(Values/DealAttachment/IsDeleted/node())[1]', 'bit')
SET @AttachmentName = @SubmitXml.value('(Values/DealAttachment/AttachmentName/node())[1]', 'varchar(100)')
SET @AttachmentFile = @SubmitXml.value('(Values/DealAttachment/AttachmentFile/node())[1]', 'varbinary(max)')
SET @AttachmentType = @SubmitXml.value('(Values/DealAttachment/AttachmentType/node())[1]', 'varchar(50)')
But after the statements above, @AttachmentFile is NULL or blank.
Binary data types in SQL Server (including varbinary) are represented as hexadecimal in queries which read and write them.
I think the problem here is that, rather than writing to the database directly from the byte stream (as in the example you linked to in your comment, which would implicitly cast the byte array to a hexadecimal value), it's being written to an intermediate XML block. When the data is written to the XML block, it looks like the byte stream is being converted to a string made up of a concatenated list of the byte values in decimal. Because the byte values are not delimited, it might not be possible to reconstruct the original data from this stream.
If you need to use an intermediate XML file, a more conventional approach would be to encode the file data as Base64 in the XML block (as discussed in this question - and doubtless many others). You could then decode it using the xs:base64Binary function:
SET @AttachmentFile = @SubmitXml.value('xs:base64Binary((Values/DealAttachment/AttachmentFile/node())[1])','varbinary(max)')
@Ed is correct in that you have somehow stored the decimal ASCII values for each character instead of the hex value for each. You can see that by decoding each one:
SELECT CHAR(99) + CHAR(104) + CHAR(97) + CHAR(110) + CHAR(103) + CHAR(101) +
CHAR(32) + CHAR(116) + CHAR(104) + CHAR(101);
-- change the
But as you can also see, there is no way to decode that programmatically, because it is a mix of 2-digit and 3-digit values.
If you really were storing hex bytes in that XML element, you could turn it into VARBINARY(MAX) having the same binary bytes by first extracting it from the XML as a plain VARCHAR(MAX) and then converting it to VARBINARY(MAX) using the CONVERT built-in function, specifying a "style" of 2.
SELECT CONVERT(VARBINARY(MAX), 'this is a test, yo!');
-- 0x74686973206973206120746573742C20796F21
DECLARE @SomeXML XML = N'
<Values>
<DealAttachment>
<AttachmentFile>74686973206973206120746573742C20796F21</AttachmentFile>
</DealAttachment>
</Values>';
SELECT CONVERT(VARBINARY(MAX),
@SomeXML.value('(/Values/DealAttachment/AttachmentFile/text())[1]',
'VARCHAR(MAX)'),
2)
-- 0x74686973206973206120746573742C20796F21 (but as VARBINARY)
However, that all being said, ideally you would just Base64 encode the binary file on the way in (also as mentioned by @Ed) using Convert.ToBase64String (since you already have a byte[]).
Related
Is there a simple way to convert a UTF-8 encoded varbinary(max) column to varchar(max) in T-SQL? Something like CONVERT(varchar(max), [MyDataColumn]). Best would be a solution that does not need custom functions.
Currently I convert the data on the client side, but this has the downside that filtering and sorting cannot be done as efficiently server-side.
XML trick
The following solution should work for any encoding.
There is a tricky way of doing exactly what the OP asks. Edit: I found the same method discussed on SO (SQL - UTF-8 to varchar/nvarchar Encoding issue)
The process goes like this:
SELECT
CAST(
'<?xml version=''1.0'' encoding=''utf-8''?><![CDATA[' --start CDATA
+ REPLACE(
LB.LongBinary,
']]>', --we need only to escape ]]>, which ends CDATA section
']]]]><![CDATA[>' --we simply split it into two CDATA sections
) + ']]>' AS XML --finish CDATA
).value('.', 'nvarchar(max)')
Why it works: varbinary and varchar are the same string of bits - only the interpretation differs - so the resulting XML truly is a UTF-8 encoded bit stream, and the XML parser is then able to reconstruct the correct UTF-8 encoded characters.
BEWARE the 'nvarchar(max)' in the value function. If you used varchar, it would destroy multi-byte characters (depending on your collation).
BEWARE 2: XML cannot handle some characters, e.g. 0x02. When your string contains such characters, this trick will fail.
Database trick (SQL Server 2019 and newer)
This is simple. Create another database with UTF8 collation as the default one. Create function that converts VARBINARY to VARCHAR. The returned VARCHAR will have that UTF8 collation of the database.
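A minimal sketch of this trick; the database and function names match the `FUNCTIONS_ONLY.dbo.VarBinaryToUTF8` call used in the benchmark below, and the collation mirrors the `Czech_100_CI_AI_SC_UTF8` used elsewhere in this answer:

```sql
-- Helper database whose default collation is a UTF8 one
CREATE DATABASE FUNCTIONS_ONLY COLLATE Czech_100_CI_AI_SC_UTF8;
GO
USE FUNCTIONS_ONLY;
GO
-- The VARCHAR(MAX) return type picks up the database's UTF8 collation,
-- so the cast reinterprets the bytes as UTF-8 text
CREATE FUNCTION dbo.VarBinaryToUTF8 (@Input VARBINARY(MAX))
RETURNS VARCHAR(MAX)
AS
BEGIN
    RETURN CAST(@Input AS VARCHAR(MAX));
END;
```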
Insert trick (SQL Server 2019 and newer)
This is another simple trick. Create a table with one column VARCHAR COLLATE ...UTF8. Insert the VARBINARY data into this table. It will get saved correctly as UTF8 VARCHAR. It is sad that memory optimized tables cannot use UTF8 collations...
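A sketch of the insert trick (the temp table name and the sample hex literal are assumptions of this example; `0xC5BD6C75C5A5` is the UTF-8 encoding of 'Žluť'):

```sql
-- A UTF8-collated VARCHAR column converts the bytes on insert
DROP TABLE IF EXISTS #Utf8Convert;
CREATE TABLE #Utf8Convert (UTF8 VARCHAR(MAX) COLLATE Czech_100_CI_AI_SC_UTF8);

INSERT INTO #Utf8Convert (UTF8) VALUES (0xC5BD6C75C5A5);

SELECT UTF8 FROM #Utf8Convert; -- 'Žluť'
```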
Alter table trick (SQL Server 2019 and newer)
(don't use this, it is unnecessary, see Plain insert trick)
I was trying to come up with an approach using SQL Server 2019's Utf8 collation and I have found one possible method so far, that should be faster than the XML trick (see below).
Create temporary table with varbinary column.
Insert varbinary values into the table
Alter table alter column to varchar with utf8 collation
drop table if exists
#bin,
#utf8;
create table #utf8 (UTF8 VARCHAR(MAX) COLLATE Czech_100_CI_AI_SC_UTF8);
create table #bin (BIN VARBINARY(MAX));
insert into #utf8 (UTF8) values ('Žluťoučký kůň říčně pěl ďábelské ódy za svitu měsíce.');
insert into #bin (BIN) select CAST(UTF8 AS varbinary(max)) from #utf8;
select * from #utf8; --here you can see the utf8 string is stored correctly and that
select BIN, CAST(BIN AS VARCHAR(MAX)) from #bin; --utf8 binary is converted into gibberish
alter table #bin alter column BIN varchar(max) collate Czech_100_CI_AI_SC_UTF8;
select * from #bin; --voilà, correctly converted varchar
alter table #bin alter column BIN nvarchar(max);
select * from #bin; --finally, correctly converted nvarchar
Speed difference
The Database trick together with the Insert trick are the fastest ones.
The XML trick is slower.
The Alter table trick is a bad idea; don't do it. It loses out when you convert lots of short texts at once (the altered table is large).
The test:
first string contains one replace for the XML trick
second string is plain ASCII with no replaces for the XML trick
@TextLengthMultiplier determines the length of the converted text
@TextAmount determines how many of them will be converted at once
------------------
--TEST SETUP
--DECLARE @LongText NVARCHAR(MAX) = N'český jazyk, Tiếng Việt, русский язык, 漢語, ]]>';
--DECLARE @LongText NVARCHAR(MAX) = N'JUST ASCII, for LOLZ------------------------------------------------------';
DECLARE
@TextLengthMultiplier INTEGER = 100000,
@TextAmount INTEGER = 10;
---------------------
-- TECHNICALITIES
DECLARE
@StartCDATA DATETIME2(7), @EndCDATA DATETIME2(7),
@StartTable DATETIME2(7), @EndTable DATETIME2(7),
@StartDB DATETIME2(7), @EndDB DATETIME2(7),
@StartInsert DATETIME2(7), @EndInsert DATETIME2(7);
drop table if exists
#longTexts,
#longBinaries,
#CDATAConverts,
#DBConverts,
#InsertConverts;
CREATE TABLE #longTexts (LongText VARCHAR (MAX) COLLATE Czech_100_CI_AI_SC_UTF8 NOT NULL);
CREATE TABLE #longBinaries (LongBinary VARBINARY(MAX) NOT NULL);
CREATE TABLE #CDATAConverts (LongText VARCHAR (MAX) COLLATE Czech_100_CI_AI_SC_UTF8 NOT NULL);
CREATE TABLE #DBConverts (LongText VARCHAR (MAX) COLLATE Czech_100_CI_AI_SC_UTF8 NOT NULL);
CREATE TABLE #InsertConverts (LongText VARCHAR (MAX) COLLATE Czech_100_CI_AI_SC_UTF8 NOT NULL);
insert into #longTexts --make the long text longer
(LongText)
select
REPLICATE(@LongText, @TextLengthMultiplier)
from
TESTES.dbo.Numbers --use a WHILE loop if you don't have a numbers table
WHERE
Number BETWEEN 1 AND @TextAmount; --make more of them
insert into #longBinaries (LongBinary) select CAST(LongText AS varbinary(max)) from #longTexts;
--sanity check...
SELECT TOP(1) * FROM #longTexts;
------------------------------
--MEASURE CDATA--
SET @StartCDATA = SYSDATETIME();
INSERT INTO #CDATAConverts
(
LongText
)
SELECT
CAST(
'<?xml version=''1.0'' encoding=''utf-8''?><![CDATA['
+ REPLACE(
LB.LongBinary,
']]>',
']]]]><![CDATA[>'
) + ']]>' AS XML
).value('.', 'nvarchar(max)')
FROM
#longBinaries AS LB;
SET @EndCDATA = SYSDATETIME();
--------------------------------------------
--MEASURE ALTER TABLE--
SET @StartTable = SYSDATETIME();
DROP TABLE IF EXISTS #AlterConverts;
CREATE TABLE #AlterConverts (UTF8 VARBINARY(MAX));
INSERT INTO #AlterConverts
(
UTF8
)
SELECT
LB.LongBinary
FROM
#longBinaries AS LB;
ALTER TABLE #AlterConverts ALTER COLUMN UTF8 VARCHAR(MAX) COLLATE Czech_100_CI_AI_SC_UTF8;
--ALTER TABLE #AlterConverts ALTER COLUMN UTF8 NVARCHAR(MAX);
SET @EndTable = SYSDATETIME();
--------------------------------------------
--MEASURE DB--
SET @StartDB = SYSDATETIME();
INSERT INTO #DBConverts
(
LongText
)
SELECT
FUNCTIONS_ONLY.dbo.VarBinaryToUTF8(LB.LongBinary)
FROM
#longBinaries AS LB;
SET @EndDB = SYSDATETIME();
--------------------------------------------
--MEASURE Insert--
SET @StartInsert = SYSDATETIME();
INSERT INTO #InsertConverts
(
LongText
)
SELECT
LB.LongBinary
FROM
#longBinaries AS LB;
SET @EndInsert = SYSDATETIME();
--------------------------------------------
-- RESULTS
SELECT
DATEDIFF(MILLISECOND, @StartCDATA, @EndCDATA) AS CDATA_MS,
DATEDIFF(MILLISECOND, @StartTable, @EndTable) AS ALTER_MS,
DATEDIFF(MILLISECOND, @StartDB, @EndDB) AS DB_MS,
DATEDIFF(MILLISECOND, @StartInsert, @EndInsert) AS Insert_MS;
SELECT TOP(1) '#CDATAConverts ', * FROM #CDATAConverts ;
SELECT TOP(1) '#DBConverts ', * FROM #DBConverts ;
SELECT TOP(1) '#InsertConverts', * FROM #InsertConverts;
SELECT TOP(1) '#AlterConverts ', * FROM #AlterConverts ;
SQL Server does not know UTF-8 (at least not in any version you can use productively). There is limited support starting with v2014 SP2 (see the details about the supported versions) for reading a UTF-8 encoded file from disk via BCP (the same goes for writing content to disk).
Important to know:
VARCHAR(x) is not utf-8. It is 1-byte-encoded extended ASCII, using a codepage (living in the collation) as character map.
NVARCHAR(x) is not utf-16 (but very close to it, it's ucs-2). This is a 2-byte-encoded string covering almost any known characters (but exceptions exist).
utf-8 will use 1 byte for plain Latin characters, but 2 or even more bytes to encode foreign charsets.
A VARBINARY(x) will hold the utf-8 as a meaningless chain of bytes.
A simple CAST or CONVERT will not work: VARCHAR will take each single byte as a character. For sure this is not the result you would expect. NVARCHAR would take each chunk of 2 bytes as one character. Again not the thing you need.
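A quick sketch to see both failure modes (the hex literal, an assumption of this example, is the UTF-8 encoding of 'Žluť'):

```sql
DECLARE @utf8 VARBINARY(MAX) = 0xC5BD6C75C5A5; -- UTF-8 bytes of 'Žluť'

-- each single byte read as one codepage character: mojibake
SELECT CAST(@utf8 AS VARCHAR(MAX));
-- each 2-byte chunk read as one UCS-2 character: also wrong
SELECT CAST(@utf8 AS NVARCHAR(MAX));
```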
You might try writing this out to a file and reading it back with BCP (v2014 SP2 or higher). But the better option I see for you is a CLR function.
You can use the following to post a string into a varbinary field:
Encoding.Unicode.GetBytes(Item.VALUE)
then use the following to retrieve the data as a string:
public string ReadCString(byte[] cString)
{
    var nullIndex = Array.IndexOf(cString, (byte)0);
    nullIndex = (nullIndex == -1) ? cString.Length : nullIndex;
    // use the computed length so bytes after the null terminator are dropped
    return System.Text.Encoding.Unicode.GetString(cString, 0, nullIndex);
}
In the same vein as this question, what is the equivalent in SQL Server to the following Postgres statement?
select encode(some_field, 'escape') from only some_table
As you were told already, SQL-Server is not the best with such issues.
The most important advice to avoid such issues is: use the appropriate data type to store your values. Storing binary data as a HEX-string runs against this best practice. But there are some workarounds:
I use the HEX-string taken from the linked question:
DECLARE @str VARCHAR(100)='0x61736461640061736461736400';
--here I use dynamically created SQL to get the HEX-string as a real binary:
DECLARE @convBin VARBINARY(MAX);
DECLARE @cmd NVARCHAR(MAX)=N'SELECT @bin=' + @str;
EXEC sp_executesql @cmd
,N'@bin VARBINARY(MAX) OUTPUT'
,@bin=@convBin OUTPUT;
--This real binary can be converted to a VARCHAR(MAX).
--Be aware that in this case the input contains 00, as this is an array.
--It is possible to split the input at the 00s, but this is going too far...
SELECT @convBin AS HexStringAsRealBinary
,CAST(@convBin AS VARCHAR(MAX)) AS CastedToString; --You will see the first "asda" only
--If your HEX-string is not longer than 10 bytes, there is an undocumented function:
--You'll see that the final AA is cut away, while a shorter string would be filled with zeros.
SELECT sys.fn_cdc_hexstrtobin('0x00112233445566778899AA');
SELECT CAST(sys.fn_cdc_hexstrtobin(@str) AS VARCHAR(100));
UPDATE: An inlinable approach
The following recursive CTE will read the HEX-string character by character.
Furthermore it will group the result and return two rows in this case.
This solution is very specific to the given input.
DECLARE @str VARCHAR(100)='0x61736461640061736461736400';
WITH recCTE AS
(
SELECT 1 AS position
,1 AS GroupingKey
,SUBSTRING(@str,3,2) AS HEXCode
,CHAR(SUBSTRING(sys.fn_cdc_hexstrtobin('0x' + SUBSTRING(@str,3,2)),1,1)) AS TheLetter
UNION ALL
SELECT r.position+1
,r.GroupingKey + CASE WHEN SUBSTRING(@str,2+(r.position)*2+1,2)='00' THEN 1 ELSE 0 END
,SUBSTRING(@str,2+(r.position)*2+1,2)
,CHAR(SUBSTRING(sys.fn_cdc_hexstrtobin('0x' + SUBSTRING(@str,2+(r.position)*2+1,2)),1,1)) AS TheLetter
FROM recCTE r
WHERE position<LEN(@str)/2
)
)
SELECT r.GroupingKey
,(
SELECT x.TheLetter AS [*]
FROM recCTE x
WHERE x.GroupingKey=r.GroupingKey
AND x.HEXCode<>'00'
AND LEN(x.HEXCode)>0
ORDER BY x.position
FOR XML PATH(''),TYPE
).value('.','varchar(max)')
FROM recCTE r
GROUP BY r.GroupingKey;
The result
1 asdad
2 asdasd
Hint: Starting with SQL Server 2017 there is STRING_AGG(), which would reduce the final SELECT...
If you need this functionality, it's going to be up to you to implement it. Assuming you just need the escape variant, you can try to implement it as a T-SQL UDF. But pulling strings apart, working character by character, and building up a new string just isn't a T-SQL strength. You'd be looking at a WHILE loop over the input byte length, SUBSTRING to extract the individual bytes, and CHAR to directly convert the bytes that don't need to be octal-encoded.1
If you're going to start down this route (and especially if you want to support the other formats), I'd be looking at using the CLR support in SQL Server, to create the function in a .NET language (C# usually preferred) and use the richer string manipulation functionality there.
Both of the above assume that what you really want is to replicate the escape format of encode. If you just want "take this binary data and give me a safe string to represent it", just use CONVERT to get the binary hex-encoded.
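For the hex route, a short sketch (style 1 keeps the '0x' prefix, style 2 drops it):

```sql
DECLARE @data VARBINARY(MAX) = CAST('a test' AS VARBINARY(MAX));

SELECT CONVERT(VARCHAR(MAX), @data, 1); -- '0x612074657374'
SELECT CONVERT(VARCHAR(MAX), @data, 2); -- '612074657374'
```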
1 Here's my attempt at it. I'd suggest a lot of testing and tweaking before you use it in anger:
create function Postgresql_encode_escape (@input varbinary(max))
returns varchar(max)
as
begin
declare @i int
declare @len int
declare @out varchar(max)
declare @chr int
select @i = 1, @out = '', @len = DATALENGTH(@input)
while @i <= @len
begin
set @chr = SUBSTRING(@input,@i,1)
if @chr > 31 and @chr < 128
begin
set @out = @out + CHAR(@chr)
end
else
begin
set @out = @out + '\' +
RIGHT('000' + CONVERT(varchar(3),
(@chr / 64)*100 +
((@chr / 8)%8)*10 +
(@chr % 8))
,3)
end
set @i = @i + 1
end
return @out
end
I'm currently storing a list of ids in a column as a CSV string value ('1;2;3').
I'd like to optimize with a better approach (I believe), which would use varbinary(max).
I'm looking for T-SQL functions:
1 . That would merge side by side a set of integer rows into a varbinary(max)
2 . That would split the varbinary field into a set of int rows
Any tips appreciated, thanks
The solution is very questionable. I'd also suggest normalizing the data.
However, if you still want to store your data as VARBINARY, here is the solution:
CREATE FUNCTION dbo.fn_String_to_Varbinary(@Input VARCHAR(MAX))
RETURNS VARBINARY(MAX) AS
BEGIN
DECLARE @VB VARBINARY(MAX);
WITH CTE as (
SELECT CAST(CAST(LEFT(IT,CHARINDEX(';',IT)-1) AS INT) as VARBINARY(MAX)) as VB, RIGHT(IT,LEN(IT) - CHARINDEX(';',IT)) AS IT
FROM (VALUES (@Input)) as X(IT) union all
SELECT VB + CAST(CAST(LEFT(IT,CHARINDEX(';',IT)-1) AS INT) as VARBINARY(MAX)) as VB, RIGHT(IT,LEN(IT) - CHARINDEX(';',IT)) AS IT FROM CTE WHERE LEN(IT) > 1
)
SELECT TOP 1 @VB = VB FROM CTE
ORDER BY LEN(VB) DESC
RETURN @VB
END
GO
DECLARE @Input VARCHAR(MAX) = '421;1;2;3;5000;576;842;375;34654322;18;67;000001;1232142334;'
DECLARE @Position INT = 9
DECLARE @VB VARBINARY(MAX)
SELECT @VB = dbo.fn_String_to_Varbinary(@Input)
SELECT @VB, CAST(SUBSTRING(@VB,4*(@Position-1)+1,4) AS INT)
GO
The function converts the string into VARBINARY, and the script then extracts the 9th number from that VARBINARY value.
Do not run this function against a data set with a million records and a million numbers in each line.
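The question's second part (splitting the varbinary back into int rows) is not covered above; here is a sketch under the same 4-bytes-per-int layout, using sys.all_objects as an ad-hoc row source (an assumption of this example; any numbers table works):

```sql
DECLARE @VB VARBINARY(MAX) = CAST(421 AS VARBINARY(4)) + CAST(5000 AS VARBINARY(4));

-- one row per 4-byte chunk
SELECT CAST(SUBSTRING(@VB, 4*(n-1)+1, 4) AS INT) AS Value
FROM (SELECT TOP (DATALENGTH(@VB)/4)
             ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
      FROM sys.all_objects) AS Tally; -- returns 421 and 5000
```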
I understand the vagaries of Unicode in SQL Server - varchar vs nvarchar, etc. I don't have a problem storing and retrieving Unicode data. However, there are some fields we have chosen to keep as varchar, since a non-ASCII character in those fields is considered anomalous.
When a Unicode character makes it into one of those non-Unicode fields, SQL Server converts it to a question mark: "?". BUT, sometimes it's hard to tell when a substitution has occurred because a question mark is a valid character in those fields.
My question: Can I get SQL Server to use a different substitution character, rather than a question mark? For instance, an underscore or even an empty string ('')?
The straight answer to your question is: you cannot 'set' that character. As others suggested, and as you probably already knew, you need to validate the data going into your 'special' varchar fields.
Because I was bored. I'm near positive this won't be useful in application, but it does do what you asked for. You could create a function with this if you really wanted to...
Declare @Nvarchar Nvarchar(25) = N'Hɶppy',
@NVbinary Varchar(128),
@parse Int = 3,
@NVunit Varchar(4),
@result Varchar(64) = '0x',
@SQL Nvarchar(Max);
Select @NVbinary = master.sys.fn_varbintohexstr(Convert(Varbinary(128),@Nvarchar))
While (@parse < Len(@NVbinary))
Begin
Select @NVunit = Substring(@NVbinary,@parse,4),
@parse = @parse + 4
If Substring(@NVunit,3,2) = '00'
Begin
Set @result = @result + Substring(@NVunit,1,2)
End
Else
Begin
Set @result = @result + '22' -- 22 is the hex value for a quotation mark; use Select Convert(Varbinary(8),'"') to get the value for whatever non-unicode character you want.
End
End
Set @SQL = 'Select Convert(Varchar(128),' + @result + '), ''' + @result + ''''
Select @Nvarchar, @NVbinary
Exec sp_executesql @SQL
You are right, any unicode character that does not have an ASCII equivalent leads to data loss when you put it into a varchar, and leaves behind a question mark:
select ascii(cast(nchar(1000) as varchar));
I agree with R. Martinho Fernandes, you need to solve this at the application layer. You could have the app replace any 2-byte unicode pair that has value over 255 with whatever you want. Maybe you can change your application-layer encoding to accept ASCII and Extended ASCII data only. But trying to fault the data layer in this case is like saying, "Our data field only accepts 'M' or 'F.' Why is the database complaining when the user sends us a 'Z'?"
I have a database that is a result of an import. The database is a deliverable, I did not do the import myself, nor do I have access to the original data to do it myself. That being said, there is an integer value that was imported to a text datatype. All of the stored values are valid integers. I keep getting:
Explicit conversion from data type text to int is not allowed.
if I try to change the field data type in the table. I have also created a new INT field in the table and tried to update it based upon the value in the TEXT field, but I receive the same error. Lastly I tried to create a new table and tried to insert the old values but cannot convert or cast to the int successfully.
This seems to work: CONVERT(INT, CONVERT(VARCHAR(MAX),myText))
Edit:
I'm not totally sure of what's the best choice for the inner conversion... Choosing either VARCHAR(N) with N > 10 or VARCHAR(MAX) has the advantage of not preventing an overflow by truncating (assuming the overflow is the preferred behavior in that case).
Also, the conversion to INT seems to treat leading spaces as zero. So VARCHAR(MAX) reduces the chance of erroneously getting zero. E.g.:
CREATE TABLE #foo ( bar TEXT )
INSERT INTO #foo
VALUES (' 10')
SELECT CONVERT (INT, CONVERT(VARCHAR(MAX),bar)) FROM #foo -- 10
SELECT CONVERT (INT, CONVERT(VARCHAR(10),bar)) FROM #foo -- 0
Probably the best thing is to do some validation to make sure the input meets whatever your requirements are.
A direct conversion from TEXT to INT is not possible, so:
convert the TEXT value to varchar(max)
then varchar(max) to int or bigint
Example:
convert(int, (convert( varchar(max),'TextValue'))) as ColumnName
You need to convert the text type to varchar(x); after that you can cast or convert to int. To avoid a double convert, I prefer cast.
CAST(CONVERT(VARCHAR(50),CONFIG_VALUE) AS INT)
Full example:
DECLARE @age int;
SET @age = (SELECT CAST(CONVERT(varchar(50),@yourTextVarHere) AS INT))
SELECT @age