How to convert to UTF-8 in SQL Server database? - sql-server

I'm working on MS SQL Server 2017.
I have a table with a 'name' nvarchar column whose values look like '\u039b \u041e \u0422 \u0422 \u0410'
(sequences of '\u0' followed by three more hex digits).
How do I convert them to the correct characters?

This is actually the same escaping that JSON uses.
You can use JSON_VALUE to get the value out by wrapping it in a JSON array:
SELECT JSON_VALUE('["' + t.name + '"]', '$[0]')
FROM YourTable t;
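A quick standalone check of the idea, using the sample value from the question (the behaviour described in the comments is what the JSON escapes map to):
SELECT JSON_VALUE(N'["\u039b \u041e \u0422 \u0422 \u0410"]', '$[0]') AS decoded;
-- the \uXXXX escapes decode to the actual Unicode characters (a Greek capital
-- lambda followed by Cyrillic letters for this sample value)
-- caveat: building JSON this way assumes the stored text contains no unescaped
-- '"' or '\' characters, which would make the constructed array invalid;
-- JSON_VALUE is available from SQL Server 2016 onwards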

Related

How to convert Oracle TO_NUMBER to SQL Server

I am migrating some SQL scripts from Oracle to SQL Server and have come across an issue when trying to convert Oracle's TO_NUMBER into T-SQL code. I've looked around the web for some time but haven't found the answer; the suggestions all say to convert hex using CAST or CONVERT with VARBINARY.
An example of the issue I am getting is below.
In Oracle:
select
to_number( '000000000000000000001111', 'XXXXXXXXXXXXXXXXXXXXXXXX' )
from
dual
returns 4369
When trying to use T-SQL CONVERT:
select
CONVERT(VARBINARY, '000000000000000000001111')
returns 0x303030303030303030303030303030303030303031313131
and
select
CONVERT(INT, CONVERT(VARBINARY, '000000000000000000001111'))
returns 825307441
Any help would be greatly appreciated.
Thanks
T-SQL has binary literals, so you can write it like this:
select CONVERT(INT, 0x000000000000000000001111)
outputs
4369
If you are starting with a hex string, you can use CONVERT with binary style 1 or 2 (style 1 expects a leading '0x' in the string, style 2 expects bare hex digits), so
select convert(int, convert(varbinary(1024),'000000000000000000001111', 2))
outputs
4369
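A quick side-by-side check of the two styles (both return the same value; style 1 requires the string to carry the 0x prefix, style 2 rejects it):
select convert(int, convert(varbinary(16), '0x00001111', 1)) as style_1_with_prefix,
       convert(int, convert(varbinary(16), '00001111', 2)) as style_2_bare_hex;
-- both columns return 4369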
This should do the job:
select convert(bigint, convert(Varbinary(MAX), '000000000000000000001111',2))
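If the hex strings live in a table column, the same conversion can be applied per row; a sketch, where dbo.HexValues and its hex_string column are hypothetical names:
select h.hex_string,
       convert(bigint, convert(varbinary(16), h.hex_string, 2)) as numeric_value
from dbo.HexValues as h;
-- style 2 reads the string as bare hex digits; varbinary(16) comfortably holds the
-- 24-digit (12-byte) values from the question, and bigint keeps the result exact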

Convert number to varchar using to_varchar with leading zeros in Snowflake

In Snowflake there is a number column storing values like 8,123,456. I am struggling to work out how to structure a select statement that returns a value like 00008123456.
In SQL SERVER you can do something like this:
right(replicate('0', 11) + convert(varchar(11), job_no), 11) AS job_no
I understand Snowflake's to_varchar function can be used, but I'm not sure how to structure it like the SQL Server version.
Just add a format string on to_varchar():
select to_varchar(8123456, '00000000000');
Give it a try with: lpad(to_char(nos), 11, 0)
More examples & details: https://docs.snowflake.com/en/sql-reference/functions/lpad.html#examples
The solution provided by Greg Pavlik saved me.
I just switched from DB2 to Snowflake,
so TO_VARCHAR( <numeric_expr> [, '<format>' ] ) worked.
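Both approaches side by side, applied to the job_no column from the question (a sketch; the table name jobs is an assumption):
select to_varchar(job_no, '00000000000') as padded_with_format,
       lpad(to_char(job_no), 11, '0') as padded_with_lpad
from jobs;
-- for job_no = 8123456 both columns return '00008123456'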

Casting DB2 column to accept multi lingual characters in Open query

I am using OPENQUERY to retrieve data from IBM DB2 into SQL Server.
Below is the sample query used
select top 10 * from OpenQuery(LinkServer, 'Select columnName from table where column2=15')
The columnName needs to be converted / cast to a Unicode format to accept multilingual characters. How do I use the cast in the inner query?
My issue is similar to the one described at https://www.sqlservercentral.com/forums/997384/linker-server-to-as400-db2-character-translation-problems
I want to retrieve data containing Thai and Chinese characters, but the data comes back garbled when I use the command above. The column type in SQL Server is defined as nvarchar.
Cast your column with the CCSID code xxxx corresponding to your language:
select cast(columnName as char(14) CCSID xxxx) from ...
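Put back into the OPENQUERY from the question, that looks roughly like this (a sketch; CCSID 1208 is UTF-8 on DB2 for i and is only an example value, so pick the CCSID that matches your data):
select top 10 *
from OpenQuery(LinkServer,
    'select cast(columnName as char(14) CCSID 1208) as columnName
     from table where column2 = 15');
-- the cast runs on the DB2 side, inside the pass-through query, so the linked
-- server provider already receives the text in a Unicode-capable encoding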

Convert oracle date string to SQL Server datetime

In a SQL Server 2000 DB, I have a table which holds string representations of Oracle DB dates. They are formatted like "16-MAY-12". I need to convert these to datetime. I cannot seem to find a conversion style number that matches, nor can I find a function that will let me specify the input format. Any ideas?
This seems to work for me:
SELECT CONVERT(DATETIME, '16-MAY-12');
You can also try using TO_CHAR() to convert the Oracle values to a more SQL Server-friendly format (best is YYYYMMDD) before pulling them out of the darker side.
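One caveat about the implicit conversion above: it depends on the session's language and DATEFORMAT settings, so an explicit style is safer (a sketch, assuming English month abbreviations like 'MAY'):
SELECT CONVERT(DATETIME, REPLACE('16-MAY-12', '-', ' '), 6);
-- style 6 is 'dd mon yy'; the REPLACE turns the Oracle-style dashes into the
-- space-separated form that style 6 documents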
I followed Aaron's advice and cast to string on the Oracle side, then did a check/recast on the SQL Server side. See the example below:
;WITH SOURCE AS (
SELECT * FROM openquery(lnk,
'SELECT
TO_CHAR(OFFDATE, ''YYYY-MM-DD HH24:MI:SS'') AS OFFDATE
FROM
ORACLE_SOURCE')),
SOURCE_TRANSFORM AS
(
SELECT
CASE
WHEN ISDATE(OFFDATE) = 1 THEN CAST(OFFDATE AS DATETIME)
ELSE NULL END AS OFFDATE
FROM
SOURCE
)
SELECT * FROM SOURCE_TRANSFORM

Search for comma values inside query?

How can I search for a value, say
23,000
in a VARBINARY(MAX) filestream column in SQL Server 2008 R2? This won't work:
SELECT * FROM dbo.tbl_Files WHERE CONTAINS(SystemFile, '%[23,000]%');
I think it's just that you have %, while full-text search uses *:
select
*
from tbl_Files
Where contains(SystemFile, '"*23,000*"')
I have a full-text index with phone numbers in it, and this works too:
select
*
from tbl_Files
Where contains(SystemFile, '0116')
SELECT * FROM dbo.tbl_Files WHERE CAST(SystemFile AS NVARCHAR(MAX)) LIKE '%23,000%'
Please try this one:
select * from dbo.tbl_Files where CAST(SystemFile as varchar(max)) like '%23,000%'
If you have a VARBINARY datatype for a column then you have to CAST that value, because it is stored as a binary value in the table.
Try this one:
SELECT * FROM dbo.tbl_Files WHERE cast(SystemFile as varchar(max)) like '%23,000%'
