Overcoming the 255-character limitation in Sybase ASE when concatenating multiple columns

I have a table with 39 columns and 30 rows in Sybase. I am trying to concatenate all 39 columns into a single column, keeping the 30 rows.
Tools used:
WinSQL Professional 4.5, connected to the Sybase DB
table1 holds the actual data.
I created a temp table2 with a text column: Create table #temp2 (Line text)
I inserted into temp table2 from table1, trimming spaces and null values, and concatenated the columns using the + operator.
Result: the data gets truncated at 256 characters.
Findings: the Sybase ASE text data type supports only 255 characters.
Can someone suggest how to overcome this issue?

Sybase ASE's text data type is not limited to 256 characters, but there are a few tricks to using it successfully, such as specifying textsize and being aware that this setting can be session- and stored-procedure-specific.
Consider the following example, run on Sybase ASE 16.0 GA on Linux.
First, create the 39-column table:
create table table_1 (
col_1 Varchar(255) null,
col_2 Varchar(255) null,
col_3 Varchar(255) null,
col_4 Varchar(255) null,
col_5 Varchar(255) null,
col_6 Varchar(255) null,
col_7 Varchar(255) null,
col_8 Varchar(255) null,
col_9 Varchar(255) null,
col_10 Varchar(255) null,
col_11 Varchar(255) null,
col_12 Varchar(255) null,
col_13 Varchar(255) null,
col_14 Varchar(255) null,
col_15 Varchar(255) null,
col_16 Varchar(255) null,
col_17 Varchar(255) null,
col_18 Varchar(255) null,
col_19 Varchar(255) null,
col_20 Varchar(255) null,
col_21 Varchar(255) null,
col_22 Varchar(255) null,
col_23 Varchar(255) null,
col_24 Varchar(255) null,
col_25 Varchar(255) null,
col_26 Varchar(255) null,
col_27 Varchar(255) null,
col_28 Varchar(255) null,
col_29 Varchar(255) null,
col_30 Varchar(255) null,
col_31 Varchar(255) null,
col_32 Varchar(255) null,
col_33 Varchar(255) null,
col_34 Varchar(255) null,
col_35 Varchar(255) null,
col_36 Varchar(255) null,
col_37 Varchar(255) null,
col_38 Varchar(255) null,
col_39 Varchar(255) null)
go
I receive a warning that the potential row size may not fit on a page. My Sybase ASE instance is configured for 2K pages; instances with a 16K page size will not receive this warning. Truncation only occurs if the row size actually exceeds the page size:
Warning: Row size (10028 bytes) could exceed row size limit, which is 1962 bytes.
Insert the rows into table_1. Ideally, with 16K pages and 255 characters per column, this insert statement could be used:
insert into table_1 values (
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL01',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL02',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL03',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL04',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL05',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL06',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL07',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL08',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL09',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL10',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL11',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL12',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL13',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL14',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL15',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL16',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL17',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL18',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL19',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL20',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL21',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL22',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL23',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL24',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL25',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL26',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL27',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL28',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL29',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL30',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL31',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL32',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL33',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL34',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL35',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL36',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL37',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL38',
'0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789COL39')
go 30
The "go 30" at the end, submits the SQL batch 30 times, inserting 30 rows.
Since my own instance only has 2K pages, I'm limiting myself to 45 characters per column which is 1755 characters.
insert into table_1 values (
'0123456789012345678901234567890123456789COL01',
'0123456789012345678901234567890123456789COL02',
'0123456789012345678901234567890123456789COL03',
'0123456789012345678901234567890123456789COL04',
'0123456789012345678901234567890123456789COL05',
'0123456789012345678901234567890123456789COL06',
'0123456789012345678901234567890123456789COL07',
'0123456789012345678901234567890123456789COL08',
'0123456789012345678901234567890123456789COL09',
'0123456789012345678901234567890123456789COL10',
'0123456789012345678901234567890123456789COL11',
'0123456789012345678901234567890123456789COL12',
'0123456789012345678901234567890123456789COL13',
'0123456789012345678901234567890123456789COL14',
'0123456789012345678901234567890123456789COL15',
'0123456789012345678901234567890123456789COL16',
'0123456789012345678901234567890123456789COL17',
'0123456789012345678901234567890123456789COL18',
'0123456789012345678901234567890123456789COL19',
'0123456789012345678901234567890123456789COL20',
'0123456789012345678901234567890123456789COL21',
'0123456789012345678901234567890123456789COL22',
'0123456789012345678901234567890123456789COL23',
'0123456789012345678901234567890123456789COL24',
'0123456789012345678901234567890123456789COL25',
'0123456789012345678901234567890123456789COL26',
'0123456789012345678901234567890123456789COL27',
'0123456789012345678901234567890123456789COL28',
'0123456789012345678901234567890123456789COL29',
'0123456789012345678901234567890123456789COL30',
'0123456789012345678901234567890123456789COL31',
'0123456789012345678901234567890123456789COL32',
'0123456789012345678901234567890123456789COL33',
'0123456789012345678901234567890123456789COL34',
'0123456789012345678901234567890123456789COL35',
'0123456789012345678901234567890123456789COL36',
'0123456789012345678901234567890123456789COL37',
'0123456789012345678901234567890123456789COL38',
'0123456789012345678901234567890123456789COL39')
go 30
Check that the correct number of characters has been entered; this should be the length of the string in each column, 45 in my case.
select
char_length(col_1),
char_length(col_2),
char_length(col_3),
char_length(col_4),
char_length(col_5),
char_length(col_6),
char_length(col_7),
char_length(col_8),
char_length(col_9),
char_length(col_10),
char_length(col_11),
char_length(col_12),
char_length(col_13),
char_length(col_14),
char_length(col_15),
char_length(col_16),
char_length(col_17),
char_length(col_18),
char_length(col_19),
char_length(col_20),
char_length(col_21),
char_length(col_22),
char_length(col_23),
char_length(col_24),
char_length(col_25),
char_length(col_26),
char_length(col_27),
char_length(col_28),
char_length(col_29),
char_length(col_30),
char_length(col_31),
char_length(col_32),
char_length(col_33),
char_length(col_34),
char_length(col_35),
char_length(col_36),
char_length(col_37),
char_length(col_38),
char_length(col_39)
from table_1
go
Now create table_2 with the text column.
create table table_2 (col_1 text null)
go
Insert the rows into table_2 from the concatenated values of the columns in table_1. There will be one row in table_2 for each row in table_1.
insert into table_2 select
col_1 +
col_2 +
col_3 +
col_4 +
col_5 +
col_6 +
col_7 +
col_8 +
col_9 +
col_10 +
col_11 +
col_12 +
col_13 +
col_14 +
col_15 +
col_16 +
col_17 +
col_18 +
col_19 +
col_20 +
col_21 +
col_22 +
col_23 +
col_24 +
col_25 +
col_26 +
col_27 +
col_28 +
col_29 +
col_30 +
col_31 +
col_32 +
col_33 +
col_34 +
col_35 +
col_36 +
col_37 +
col_38 +
col_39 as result
from table_1
go
Check the length of the column in table_2. If using 45 characters it should be 1755; if using 255 characters it should be 9945.
select char_length(col_1) from table_2
go
Confirm the last value of the last row ends in "COL39".
select col_1 from table_2
go
Something like...
0123456789012345678901234567890123456789COL39
Given that the above test case was run with Sybase's isql utility, we can show that Sybase ASE does correctly concatenate the values and store them in a text column. You are using WinSQL, a tool I am not familiar with and do not have access to; I imagine it may be imposing a limit on what is displayed. I suspect that somewhere it may be running:
set textsize 255
or simply truncating the data. The above test case should be able to confirm this: the values returned by char_length() will not be subject to display truncation unless the input data itself was truncated.
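If the client is the culprit, checking and raising the session's textsize is one way to confirm it. A minimal sketch (the value 100000 is only an example):
select @@textsize        -- current session limit, in bytes, for returned text/image data
go
set textsize 100000      -- raise the limit for this session
go
select col_1 from table_2
go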
After installing Sybase ASE 12.5.1 on Windows 2000 and configuring it for 16K pages, the commands above were executed in a database called "rwc". The commands worked as advertised.

Related

Number of bytes used for Unicode characters in varchar

A common misconception is to think that with CHAR(n) and VARCHAR(n), the n defines the number of characters. But in CHAR(n) and VARCHAR(n), the n defines the string length in bytes (0-8,000); n never defines the number of characters that can be stored.
Based on this statement from Microsoft, I assume n is the data length of the string, so when we store Unicode characters in a varchar, a single character should take 2 bytes. But when I try the sample below, I see the varchar data taking 1 byte instead of 2.
declare @varchar varchar(6), @nvarchar nvarchar(6)
set @varchar = 'Ø'
select @varchar as VarcharString, len(@varchar) as VarcharStringLength, DATALENGTH(@varchar) as VarcharStringDataLength
Could someone explain the reason behind it?
Found time to test the assumptions of my first answer:
Create UTF8-enabled database
CREATE DATABASE [test-sc] COLLATE Latin1_General_100_CI_AI_KS_SC_UTF8
Create table with all kinds of N/VARCHAR columns
CREATE TABLE [dbo].[UTF8Test](
[Id] [int] IDENTITY(1,1) NOT NULL,
[VarcharText] [varchar](50) COLLATE Latin1_General_100_CI_AI NULL,
[VarcharTextSC] [varchar](50) COLLATE Latin1_General_100_CI_AI_KS_SC NULL,
[VarcharUTF8] [varchar](50) COLLATE Latin1_General_100_CI_AI_KS_SC_UTF8 NULL,
[NVarcharText] [nvarchar](50) COLLATE Latin1_General_100_CI_AI_KS NULL,
[NVarcharTextSC] [nvarchar](50) COLLATE Latin1_General_100_CI_AI_KS_SC NULL,
[NVarcharUTF8] [nvarchar](50) COLLATE Latin1_General_100_CI_AI_KS_SC_UTF8 NULL)
Insert test data from various Unicode ranges
INSERT INTO [dbo].[UTF8Test] ([VarcharText],[VarcharTextSC],[VarcharUTF8],[NVarcharText],[NVarcharTextSC],[NVarcharUTF8])
VALUES ('a','a','a','a','a','a')
INSERT INTO [dbo].[UTF8Test] ([VarcharText],[VarcharTextSC],[VarcharUTF8],[NVarcharText],[NVarcharTextSC],[NVarcharUTF8])
VALUES ('ö','ö','ö',N'ö',N'ö',N'ö')
-- U+56D7
INSERT INTO [dbo].[UTF8Test] ([VarcharText],[VarcharTextSC],[VarcharUTF8],[NVarcharText],[NVarcharTextSC],[NVarcharUTF8])
VALUES (N'囗',N'囗',N'囗',N'囗',N'囗',N'囗')
-- U+2000B
INSERT INTO [dbo].[UTF8Test] ([VarcharText],[VarcharTextSC],[VarcharUTF8],[NVarcharText],[NVarcharTextSC],[NVarcharUTF8])
VALUES (N'𠀋',N'𠀋',N'𠀋',N'𠀋',N'𠀋',N'𠀋')
SELECT lengths
SELECT TOP (1000) [Id]
,[VarcharText]
,[VarcharTextSC]
,[VarcharUTF8]
,[NVarcharText]
,[NVarcharTextSC]
,[NVarcharUTF8]
FROM [test-sc].[dbo].[UTF8Test]
SELECT TOP (1000) [Id]
,LEN([VarcharText]) VT
,LEN([VarcharTextSC]) VTSC
,LEN([VarcharUTF8]) VU
,LEN([NVarcharText]) NVT
,LEN([NVarcharTextSC]) NVTSC
,LEN([NVarcharUTF8]) NVU
FROM [test-sc].[dbo].[UTF8Test]
SELECT TOP (1000) [Id]
,DATALENGTH([VarcharText]) VT
,DATALENGTH([VarcharTextSC]) VTSC
,DATALENGTH([VarcharUTF8]) VU
,DATALENGTH([NVarcharText]) NVT
,DATALENGTH([NVarcharTextSC]) NVTSC
,DATALENGTH([NVarcharUTF8]) NVU
FROM [test-sc].[dbo].[UTF8Test]
I was surprised to find that the old mantra "a VARCHAR only stores single byte characters" needs to be revised when using UTF8 collations.
Note that only table columns are associated with collations, but not T-SQL variables:
DECLARE @VarcharText varchar(50), @NVarcharText nvarchar(50)  -- variables take the database default collation

SELECT @VarcharText = [VarcharText],
@NVarcharText = [NVarcharText]
FROM [test-sc].[dbo].[UTF8Test]
WHERE [Id] = 4
SELECT @VarcharText, LEN(@VarcharText), DATALENGTH(@VarcharText), @NVarcharText, LEN(@NVarcharText), DATALENGTH(@NVarcharText)

SELECT @VarcharText = [VarcharTextSC],
@NVarcharText = [NVarcharTextSC]
FROM [test-sc].[dbo].[UTF8Test]
WHERE [Id] = 4
SELECT @VarcharText, LEN(@VarcharText), DATALENGTH(@VarcharText), @NVarcharText, LEN(@NVarcharText), DATALENGTH(@NVarcharText)

SELECT @VarcharText = [VarcharUTF8],
@NVarcharText = [NVarcharUTF8]
FROM [test-sc].[dbo].[UTF8Test]
WHERE [Id] = 4
SELECT @VarcharText, LEN(@VarcharText), DATALENGTH(@VarcharText), @NVarcharText, LEN(@NVarcharText), DATALENGTH(@NVarcharText)
I thought the original quote was a bit confusing, as it continues:
The misconception happens because when using single-byte encoding, the storage size of CHAR and VARCHAR is n bytes and the number of characters is also n.
but since it mentions encodings, my guess is that the statement refers to the UTF-8 encodings supported in SQL Server 2019 and higher, which seem to allow storing Unicode in VARCHAR columns (I haven't tried it yet).
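A minimal standalone sketch of such a test, assuming SQL Server 2019 or later (no UTF-8 database is needed, since the collation can also be applied per column; the temp table name is illustrative):
CREATE TABLE #utf8_demo (txt varchar(20) COLLATE Latin1_General_100_CI_AI_KS_SC_UTF8)
INSERT INTO #utf8_demo VALUES (N'囗')   -- U+56D7, a 3-byte character in UTF-8
SELECT txt, LEN(txt) AS Chars, DATALENGTH(txt) AS Bytes FROM #utf8_demo
-- expected: 1 character, 3 bytes, i.e. a VARCHAR column holding a multi-byte character
DROP TABLE #utf8_demo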
declare @char varchar(4)
declare @nvarchar nvarchar(4)
Set @char = '#'
Set @nvarchar = '#'
select @char as charString,
LEN(@char) as charStringLength,
DATALENGTH(@char) as charStringDataLength
select @nvarchar as nvarcharString,
LEN(@nvarchar) as nvarcharStringLength,
DATALENGTH(@nvarchar) as nvarcharStringDataLength
You can store Unicode in a varchar (if you want to); however, every byte is interpreted as a single character, while Unicode (for SQL Server: UTF-16/UCS-2) uses 2 bytes for a single character, and you have to account for that when displaying Unicode stored in a varchar.
declare @nv nvarchar(10) = N'❤'
select @nv;
declare @v varchar(10) = cast(cast(@nv as varbinary(10)) as varchar(10))
select @v, len(@v); --two chars
select cast(@nv as varbinary(10)), cast(@v as varbinary(10)); --same bytes in both n/var char
--display nchar from char
select cast(cast(@v as varbinary(10)) as nvarchar(10));

How to store different collation text in SQL Server sql_variant type?

SQL Server stores its own collation with each sql_variant text value, so for test purposes I tried to store strings ranging from German to French in sql_variant columns.
CREATE TABLE [dbo].[VarCollation]
(
[uid] [INT] IDENTITY (1, 1) NOT NULL,
[comment] NVARCHAR(100),
[variant_ger] [sql_variant] NULL,
[variant_rus] [sql_variant] NULL,
[variant_jap] [sql_variant] NULL,
[variant_ser] [sql_variant] NULL,
[variant_kor] [sql_variant] NULL,
[variant_fre] [sql_variant] NULL
) ON [PRIMARY]
GO
INSERT INTO VarCollation(comment, variant_ger, variant_rus, variant_jap, variant_ser, variant_kor, variant_fre)
VALUES('NVarChar',
CONVERT(NVARCHAR, N'Öl fließt') COLLATE SQL_Latin1_General_CP1_CI_AS,
CONVERT(NVARCHAR, N'Москва') COLLATE Cyrillic_General_CI_AS,
CONVERT(NVARCHAR, N' ♪リンゴ可愛いや可愛いやリンゴ。半世紀も前に流行した「リンゴの') COLLATE Japanese_CI_AS,
CONVERT(NVARCHAR, N'ŠšĐđČčĆ掞') COLLATE Serbian_Latin_100_CI_AS,
CONVERT(NVARCHAR, N'향찰/鄕札 구결/口訣 이두/吏讀') COLLATE Korean_100_CI_AS,
CONVERT(NVARCHAR, N'le caractère') COLLATE French_CS_AS);
GO
INSERT INTO VarCollation (comment, variant_ger, variant_rus, variant_jap, variant_ser, variant_kor, variant_fre)
VALUES('VarChar',
CONVERT(VARCHAR, N'Öl fließt') COLLATE SQL_Latin1_General_CP1_CI_AS,
CONVERT(VARCHAR, N'Москва') COLLATE Cyrillic_General_CI_AS,
CONVERT(VARCHAR, N' ♪リンゴ可愛いや可愛いやリンゴ。半世紀も前に流行した「リンゴの') COLLATE Japanese_CI_AS,
CONVERT(VARCHAR, N'ŠšĐđČčĆ掞') COLLATE Serbian_Latin_100_CI_AS,
CONVERT(VARCHAR, N'향찰/鄕札 구결/口訣 이두/吏讀') COLLATE Korean_100_CI_AS,
CONVERT(VARCHAR, N'le caractère') COLLATE French_CS_AS);
GO
By analyzing the data of each sql_variant, I can see that every value was stored with the exact collation assigned, for both NVARCHAR and VARCHAR (a way to read the stored collation is sketched after the list below):
German    collationId 0x3400d008    codepage 0x000004e4
Russian   collationId 0x0000d015    codepage 0x000004e3
Japanese  collationId 0x0000d010    codepage 0x000003a4
Serbian   collationId 0x0004d04c    codepage 0x000004e2
Korean    collationId 0x0004d040    codepage 0x000003b5
French    collationId 0x0000c00b    codepage 0x000004e4
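One way to read the collation recorded inside each sql_variant value is the built-in SQL_VARIANT_PROPERTY function; a small sketch against the table above (it returns the collation name rather than the raw hex id):
SELECT uid,
       SQL_VARIANT_PROPERTY(variant_jap, 'Collation') AS JapCollation,
       SQL_VARIANT_PROPERTY(variant_jap, 'BaseType')  AS JapBaseType
FROM dbo.VarCollation;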
However, SSMS shows proper values for NVARCHAR and garbage for VARCHAR:
uid comment variant_ger variant_rus variant_jap variant_ser variant_kor variant_fre
1 NVarChar Öl fließt Москва  ♪リンゴ可愛いや可愛いやリンゴ。半世紀も前に流行した「リン ŠšĐđČčĆ掞 향찰/鄕札 구결/口訣 이두/吏讀 le caractère
2 VarChar Ol flie?t Москва ?d???????????????????????????? SsDdCcCcZz ??/?? ??/?? ??/?? le caractere
From what I see in the sql_variant data for VARCHAR, the Japanese text is stored with some characters already replaced by 0x3f ('?'). I tried the INSERT without CONVERT and without the N prefix, but the result is the same. Is it possible to insert such text into a sql_variant, and how?
To answer your question: yes, you can store different collations in a sql_variant; however, your COLLATE clause is in the wrong place. You are changing the collation of the value after the nvarchar has been converted to a varchar, so the characters have already been lost. Converting the varchar back to an nvarchar, or changing its collation afterwards, doesn't restore the "lost" data; it is already gone.
Even if you fix that, you'll notice you still don't get the results you want:
USE Sandbox;
GO
CREATE TABLE TestT (TheVarchar sql_variant)
INSERT INTO dbo.TestT (TheVarchar)
SELECT CONVERT(varchar, N'향찰/鄕札 구결/口訣 이두/吏讀' COLLATE Korean_100_CI_AS)
INSERT INTO dbo.TestT (TheVarchar)
SELECT CONVERT(varchar, N' ♪リンゴ可愛いや可愛いやリンゴ。半世紀も前に流行した「リンゴの' COLLATE Japanese_CI_AS);
SELECT *
FROM dbo.TestT;
GO
DROP TABLE dbo.TestT;
Notice that the second string has the value ' ♪リンゴ可愛いや可愛いやリン' (it has been truncated). That's because you haven't declared a length for the varchar. Always declare your lengths, precisions, scales, etc. You know your data better than I do, so you will know an appropriate value.
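A corrected form might look like this (length declared, and COLLATE applied to the N'...' literal before the conversion so that the Korean code page is used; the length of 100 is just an example):
INSERT INTO dbo.TestT (TheVarchar)
SELECT CONVERT(varchar(100), N'향찰/鄕札 구결/口訣 이두/吏讀' COLLATE Korean_100_CI_AS);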

Combine two columns and put the result in a different column using SQL Server

Can anyone please help me with the INSERT SQL statement below? I am trying to create a SampleID by combining the ID column (auto-generated by the database) and the MBID column. I am getting the error: 'CONCAT' is not a recognized built-in function name.
Thanks
SqlCommand sc = new SqlCommand(@"insert into Sample (MBID, SampleType,SampleDate,ConsultantName,Comments,FirstSample, SampleID)
values(@MBID , @SampleType , @SampleDate , @ConsultantName , @Comments, CONCAT(ID +'-'+ MBID) ;", con);
Table Design
CREATE TABLE [dbo].[Sample] (
[ID] INT IDENTITY (5, 1) NOT NULL,
[SampleID] NVARCHAR (50) NOT NULL,
[SampleType] NVARCHAR (50) NULL,
[SampleDate] DATE NULL,
[ConsultantName] NVARCHAR (50) NULL,
[Comments] NVARCHAR (MAX) NULL,
[FirstSample] NVARCHAR (MAX) NULL,
[MBID] INT NULL,
CONSTRAINT [PK_Sample] PRIMARY KEY CLUSTERED ([SampleID] ASC)
);
Firstly, CONCAT was introduced in SQL Server 2012.
Using the CONCAT function:
SELECT CONCAT ( 'Welcome ', 'World ', '!' ) AS Result;
Secondly, you want the auto-generated ID value concatenated into the SampleID column. The query below can be used to get that value:
SELECT IDENT_CURRENT('table_name')+1;
Now, alter your query as below:
SqlCommand sc = new SqlCommand(@"insert into Sample (MBID, SampleType,SampleDate,ConsultantName,Comments,FirstSample, SampleID)
values(@MBID , @SampleType , @SampleDate , @ConsultantName , @Comments, cast((IDENT_CURRENT('Sample')+1) as VARCHAR(max)) +'-'+ CAST(@MBID AS VARCHAR(10)));", con);
CONCAT is available from SQL Server 2012 onwards; use + instead.
Also remember, you might have to use ISNULL to avoid NULLs, since CONCAT ignores NULLs but + does not (illustrated below).
To concatenate strings in SQL Server, you can use either the CONCAT function or +. You are trying to do both.
The CONCAT function takes at least 2 comma-separated arguments.
So, either
ID +'-'+ MBID
Or
CONCAT(ID, '-', MBID)
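A small illustration of the NULL-handling difference mentioned above (the variable names and values are only examples; note that with + the int values also need an explicit CAST to varchar):
DECLARE @ID int = 5, @MBID int = NULL;
SELECT CAST(@ID AS varchar(10)) + '-' + CAST(@MBID AS varchar(10)) AS PlusResult,  -- NULL: + propagates NULL
       CONCAT(@ID, '-', @MBID) AS ConcatResult;                                    -- '5-': CONCAT converts its arguments and treats NULL as ''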

How to store Arabic in SQL server?

I am trying to store Arabic strings in my database. It works fine using COLLATE Arabic_CI_AI_KS_WS, but some of the Arabic records are missing some Arabic letters. I have tried some other collations, but the result is the same. How do I fix it?
Table structure:
CREATE TABLE [dbo].[Ayyat_Translation_Language_old_20131209] (
[Ayat_Translation_Language_ID] INT IDENTITY (1, 1) NOT NULL,
[Translation_Laanguage_ID] INT NULL,
[Juz_ID] INT NULL,
[Surah_ID] INT NOT NULL,
[Ayat_Description] VARCHAR (2000) COLLATE Arabic_CI_AI_KS_WS NOT NULL
)
Insertion code:
string query = "insert into Ayyat_Translation_Language_old_20131209 values(null,null," + surah + ",N'" + verse + "')";
where verse contains the Arabic content. It stores the data like this (with question marks):
?بِسْمِ اللَّهِ الرَّحْمَ?نِ الرَّحِيمِ
I have read this link: store arabic in SQL database
To store Unicode string data, use NVARCHAR(2000) rather than VARCHAR(2000) for the [Ayat_Description] column.
Ref.: nchar and nvarchar
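A minimal comparison sketch (the temp table and the literal are illustrative): characters that the collation's code page cannot represent come back as '?' from the varchar column, while the nvarchar column keeps the full string.
CREATE TABLE #arabic_demo (
    v  VARCHAR(200) COLLATE Arabic_CI_AI_KS_WS,
    nv NVARCHAR(200)
);
INSERT INTO #arabic_demo VALUES (N'بِسْمِ اللَّهِ الرَّحْمَٰنِ الرَّحِيمِ', N'بِسْمِ اللَّهِ الرَّحْمَٰنِ الرَّحِيمِ');
SELECT v, nv FROM #arabic_demo;   -- v may show '?' for characters outside the Arabic code page
DROP TABLE #arabic_demo;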
All you have to do is make sure that the column data type is nvarchar(); after that I inserted Arabic with no problems, even with Tashkeel. This was in SQL Server Management Studio 2012. The following table definition may help:
CREATE TABLE [dbo].[Ayyat_Translation_Language_old_20131209] (
[Ayat_Translation_Language_ID] INT IDENTITY (1, 1) NOT NULL,
[Translation_Laanguage_ID] INT NULL,
[Juz_ID] INT NULL,
[Surah_ID] INT NOT NULL,
[Ayat_Description] NVARCHAR (2000) COLLATE Arabic_CI_AI_KS_WS NOT NULL
Use an Arabic collation, or use Unicode (nvarchar(max)).
Here is an example table for your answer:
CREATE TABLE #t11
(
column1 NVARCHAR(100)
)
INSERT INTO #t11 VALUES(N'لا أتكلم العربية')
SELECT * FROM #t11

Index on calculated column triggers 900 byte index size limit

I have a table with a calculated VARCHAR column that will contain up to 106 characters:
CREATE TABLE report (
report_id INT IDENTITY(1, 1) NOT NULL,
name VARCHAR(100) COLLATE Modern_Spanish_CI_AI NOT NULL,
city_id VARCHAR(6) COLLATE Modern_Spanish_CI_AI,
unique_name AS
CASE
WHEN city_id IS NULL
THEN name
ELSE name + REPLICATE(' ', 100 - LEN(name)) + city_id
END COLLATE Modern_Spanish_CI_AI,
CONSTRAINT report_pk PRIMARY KEY (report_id)
);
/* Report name is unique per city (and among city-less rows) */
CREATE UNIQUE INDEX report_idx1 ON report (unique_name);
But when I run the statement I get this warning:
Warning! The maximum key length is 900 bytes. The index 'report_idx1' has maximum length of 8000 bytes. For some combination of large values, the insert/update operation will fail.
Is there a way to tell SQL Server that the column will not go beyond 106 characters so I get rid of the warning?
Try CAST(CASE ... END AS VARCHAR(106))...
CAST(CASE
WHEN city_id IS NULL
THEN name
ELSE name + REPLICATE(' ', 100 - LEN(name)) + city_id
END AS VARCHAR(106)) COLLATE Modern_Spanish_CI_AI
or simply ignore it... It's only a warning.
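One way to check that the CAST actually changes the declared size of the computed column is to look at sys.columns (a quick sketch; with the CAST in place, max_length should report 106 rather than the 8000 implied by the warning):
SELECT c.name, TYPE_NAME(c.system_type_id) AS type_name, c.max_length
FROM sys.columns AS c
WHERE c.object_id = OBJECT_ID('dbo.report')
  AND c.name = 'unique_name';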
