XML input getting truncated - sql-server

I have an XML doc (size: 3.59 MB) with 3,765,815 total characters in it. My SQL Server 2008 database table has a column with the xml data type. When I try to insert this XML into the column, it seems to get truncated.
I thought the xml data type could handle 2 GB of data. Is that a correct understanding, or am I missing something?
Thanks
Here is the query I am using:
declare @printxml nvarchar(max)
select @printxml = cast(inputxml as nvarchar(max))
from TableA
where SomeKey = '<some key>'
print @printxml

Select the data directly instead of printing it to the messages window:
SELECT
inputxml
FROM TableA
WHERE SomeKey = '<somekey>'
The caveat is that you have to set up Management Studio to be able to return all the data to the results window. You do that under Tools > Options > Query Results > SQL Server > Results to Grid > Maximum Characters Retrieved > XML data (the default setting is 2 MB; set it to Unlimited).
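The reason the original approach appears to truncate the document is that PRINT only emits up to 4,000 nvarchar characters per call. If you really do need the output in the Messages window, here is a minimal sketch that prints the value in chunks, assuming the same TableA / inputxml / SomeKey names from the question:
DECLARE @printxml nvarchar(max), @pos int = 1, @chunk int = 4000;

SELECT @printxml = CAST(inputxml AS nvarchar(max))
FROM TableA
WHERE SomeKey = '<some key>';

-- PRINT truncates nvarchar at 4,000 characters, so walk the
-- string and print it one chunk at a time.
WHILE @pos <= LEN(@printxml)
BEGIN
    PRINT SUBSTRING(@printxml, @pos, @chunk);
    SET @pos = @pos + @chunk;
END;
Note that each PRINT call appends a line break, so this is only suitable for eyeballing the content, not for round-tripping the document.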

Related

How to parse string into multiple tables in SQL Server 2017

I have a text file that was created by dumping 8 SQL tables into it. Now I need to import this data back into SQL Server.
Using a bulk insert I was able to load the data into one table with a single column, 'FileData'.
DECLARE @FileTable TABLE (FileData NVARCHAR(MAX))
INSERT INTO @FileTable
SELECT BulkColumn
FROM OPENROWSET( BULK N'C:\My\Path\Name\FileName.txt', SINGLE_CLOB) AS Contents
SELECT * FROM @FileTable
So now I have this huge string that I need to organize into different tables.
For example, this part of the string corresponds to the table below:
FileData
00001 00000009716496000000000331001700000115200000000000
Table:
It also seems like all fields have a fixed length, and I can get those lengths.
I can see doing something like this:
select SUBSTRING('00001 00000009716496000000000331001700000115200000000000 ', 1,5) as RecordKey
select SUBSTRING('00001 00000009716496000000000331001700000115200000000000 ', 6,17) as Filler
select SUBSTRING('00001 00000009716496000000000331001700000115200000000000 ', 23,16) as BundleAnnualPremium
But is there a faster and better way to load this data into different tables?
You could just bulk insert with a format file right from the start. But since the data is already loaded into a big table, if you'd rather use pure T-SQL, you can pull elements out of a string using LEFT(), RIGHT(), and SUBSTRING().
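A minimal sketch of that approach, using the offsets from the question. The target table and its column types are made up for illustration, and it assumes the staged data has already been split into one row per record (the SINGLE_CLOB load above produces a single row, so a split on line breaks would be needed first):
-- Hypothetical target table matching the fixed-width layout above.
CREATE TABLE dbo.BundleRecords (
    RecordKey           CHAR(5),
    Filler              CHAR(17),
    BundleAnnualPremium CHAR(16)
);

-- Carve each fixed-width field out of the staged string with SUBSTRING().
INSERT INTO dbo.BundleRecords (RecordKey, Filler, BundleAnnualPremium)
SELECT
    SUBSTRING(FileData, 1, 5),
    SUBSTRING(FileData, 6, 17),
    SUBSTRING(FileData, 23, 16)
FROM @FileTable;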

Error in insert to varbinary(max): String or binary data would be truncated. The statement has been terminated

I have a table with this definition:
I have image files of around 80 KB. When I try to insert data into the table Usluga like this:
INSERT [dbo].[Usluga] (Nazvanie, Cena_za_poseshenie, Image)
SELECT N'Персональный тренинг', 50, ThumbnailPhoto.*
FROM OPENROWSET
(BULK 'MyFilePathToImage.jpg', SINGLE_BLOB) ThumbnailPhoto
go
INSERT [dbo].[Usluga] (Nazvanie, Cena_za_poseshenie, Image)
SELECT N'Бокс', 90, ThumbnailPhoto.*
FROM OPENROWSET
(BULK 'MyFilePathToImage.jpg', SINGLE_BLOB) ThumbnailPhoto
go
I get the error:
String or binary data would be truncated.
The statement has been terminated.
But varbinary(max) can store from 0 through 2^31-1 (2,147,483,647) bytes.
How do I fix this?
Perhaps you're looking at the wrong column: the varbinary(max) column is not what is overflowing here. Try resizing your nvarchar(20) column so it can accept more than 20 characters.
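A minimal sketch of that fix, assuming the nvarchar(20) column is Nazvanie (the column name comes from the INSERT above; the new width and the NOT NULL setting are guesses that should be adjusted to the real table definition):
-- Widen the string column so longer values no longer trigger
-- "String or binary data would be truncated".
-- Restate the column's existing NULL / NOT NULL setting, since
-- ALTER COLUMN defaults to NULL when it is omitted.
ALTER TABLE dbo.Usluga
    ALTER COLUMN Nazvanie NVARCHAR(100) NOT NULL;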
I was getting a similar error when trying to insert an image file as a blob into a SQL Server database; it would throw a "RIGHT TRUNCATION" error.
I solved my problem by changing the blob column's data type from VARBINARY(MAX) to IMAGE. The varbinary(max) data type is supposed to replace the (now deprecated) image data type in newer SQL Server versions; my instance was SQL Server 2012. If using the image data type works for you, keep in mind that its maximum size is 2 GB.

Mysterious: Selecting a large XML in SQL Server

I have a column in my table which stores XML data as varchar(MAX). For example, one of my values has around one lakh (100,000) characters. When selecting it from the table, I end up getting only 43,679 characters for all samples.
What is the reason behind this mystery? Please help if there is any way to retrieve the complete data.
Try adjusting the settings in SQL Server Management Studio, under Tools > Options > Query Results > SQL Server > Results to Grid > Maximum Characters Retrieved.
Try to select with a cast:
select top 1 cast(ColumnName as xml) from Table

How to show CLOB type in a SELECT in SQL Server?

I have a table with one column of CLOB-type data. The values are all very short, no more than 20 bytes, but I cannot see the actual string in the CLOB data.
For example, if I use SELECT *, every value in the CLOB column shows up as:
CLOB, 8 Bytes
CLOB, 15 Bytes
CLOB, 9 Bytes
But I just want to see the content of the CLOB data.
I tried:
SELECT DBMS_LOB.SUBSTR(ClobColumnName, 20 ,1)
And it doesn't work; the error is:
Error Code: 4121, SQL State: S1000
Cannot find either column "DBMS_LOB" or the user-defined function or aggregate "DBMS_LOB.SUBSTR", or the name is ambiguous.
So can I ask what the syntax is to directly display CLOB data in a query?
I'm using SQL Server with dbVisualizer.
I figured out one solution. There should be better ways; please share more possible solutions in the comments.
SELECT CAST(ClobColumnName AS VARCHAR(50)) AS ClobColumnName ;
I have a table with one column of CLOB data type (1000K). After storing message data in the CLOB column, I found this solution to see the actual data in it:
SELECT CAST(T.CLOB_COLUMNNAME AS VARCHAR(1000)) AS SAMPLEDATA
FROM TABLE_NAME AS T
The above query CASTs the CLOB (Character Large Object) to a normal string.
To see it in DbVis you just have to change it in the options.
There is an entry for the display of CLOB columns.
I presume you are using the jTDS driver to connect to SQL Server.
In the driver properties of the connection you can set "useLOBs" to false to automatically cast them to strings.
I had the same problem and solved it by using DBeaver (http://dbeaver.jkiss.org/) instead of dbVisualizer.
When I use DBeaver and do a select * against my SQL Server, I can just double-click the CLOB in the result set and it opens in a new window with the content. Very slick.

SQL Server - trying to convert column to XML fails

I'm in the process of importing data from a legacy MySQL database into SQL Server 2005.
I have one table in particular that's causing me grief. I've imported it from MySQL using a linked server and the MySQL ODBC driver, and I end up with this:
Col Name Datatype MaxLen
OrderItem_ID bigint 8
PDM_Structure_ID int 4
LastModifiedDate datetime 8
LastModifiedUser varchar 20
CreationDate datetime 8
CreationUser varchar 20
XMLData text -1
OrderHeader_ID bigint 8
Contract_Action varchar 10
ContractItem int 4
My main focus is on the XMLData column - I need to clean it up and make it so that I can convert it to an XML datatype to use XQuery on it.
So I set the table option "large value types out of row" to 1:
EXEC sp_tableoption 'OrderItem', 'large value types out of row', 1
and then I go ahead and convert XMLData to VARCHAR(MAX) and do some cleanup of the XML stored in that field. All fine so far.
But when I now try to convert that column to XML datatype:
ALTER TABLE dbo.OrderItem
ALTER COLUMN XMLData XML
I get this message here:
Msg 511, Level 16, State 1, Line 1
Cannot create a row of size 8077 which is greater than the allowable maximum row size of 8060. The statement has been terminated.
which is rather surprising, seeing that the columns besides XMLData only make up roughly 90 bytes, and I specifically instructed SQL Server to store all "large data" off-row.
So why does SQL Server refuse to convert that column to the XML data type? Any ideas? Thoughts? Things I can check or change in my approach?
Update: I don't know what changed, but on a second attempt to import the raw data from MySQL into SQL Server, I was able to successfully convert that NTEXT -> VARCHAR(MAX) column to XML in the end. Odd. Anyhow, it works now - thanks, everyone, for all your input and recommendations! Highly appreciated!
If you have sufficient storage space, you could try selecting from the VARCHAR(MAX) version of the table into a new table with the same schema but with XMLData set up as XML - either using SELECT INTO or by explicitly creating the table before you begin.
PS - it's a side issue unrelated to your problem, but you might want to check that you're not losing Unicode characters in the original MySQL XMLData field by this conversion since the text/varchar data types won't support them.
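A rough sketch of the SELECT INTO variant, assuming the cleaned-up column is still called XMLData and using a made-up name (OrderItem_New) for the new table; the column list comes from the schema shown in the question:
-- Copy everything into a new table, converting the cleaned-up
-- varchar(max) column to xml on the way in.
SELECT OrderItem_ID, PDM_Structure_ID, LastModifiedDate, LastModifiedUser,
       CreationDate, CreationUser,
       CAST(XMLData AS xml) AS XMLData,
       OrderHeader_ID, Contract_Action, ContractItem
INTO dbo.OrderItem_New
FROM dbo.OrderItem;
After verifying the copy, you can drop or rename the original table and rename the new one into its place.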
Can you ADD a new column of type xml?
If so, add the new xml column, update the table to set the new column equal to the XmlData column and then drop the XmlData column.
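A sketch of that approach, again using the OrderItem table and XMLData column from the question (the temporary column name XMLData_xml is made up). The GO separators matter, because a newly added column generally cannot be referenced later in the same batch:
-- Add a new xml column alongside the old one.
ALTER TABLE dbo.OrderItem ADD XMLData_xml xml NULL;
GO

-- Copy and convert the existing data.
UPDATE dbo.OrderItem
SET XMLData_xml = CAST(XMLData AS xml);
GO

-- Drop the old column and rename the new one into its place.
ALTER TABLE dbo.OrderItem DROP COLUMN XMLData;
GO
EXEC sp_rename 'dbo.OrderItem.XMLData_xml', 'XMLData', 'COLUMN';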
Edit
I have a table "TestTable" with a "nvarchar(max)" column.
select * from sys.tables where name = 'TestTable'
This gives a result containing:
[lob_data_space_id] [text_in_row_limit] [large_value_types_out_of_row]
1 0 0
yet I can happily save 500k characters in my nvarchar(max) field.
What do you get if you query sys.tables for your OrderItem table?
If your [text_in_row_limit] is not zero, try this, which should convert any existing in-row strings into BLOBs:
exec sp_tableoption 'OrderItem', 'text in row', 0
and then try to switch the column from (n)varchar(max) to xml.
From BOL:
Disabling the text in row option or reducing the limit of the option will require the conversion of all BLOBs; therefore, the process can be long, depending on the number of BLOB strings that must be converted. The table is locked during the conversion process.
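Putting the edit together, a rough sequence to try, assuming the OrderItem table from the question and that XMLData is currently stored as (n)varchar(max):
-- 1. Check the current LOB-related settings for the table.
SELECT name, text_in_row_limit, large_value_types_out_of_row
FROM sys.tables
WHERE name = 'OrderItem';

-- 2. If text_in_row_limit is non-zero, push existing in-row values off-row.
--    This can take a while and locks the table, per the BOL note above.
EXEC sp_tableoption 'OrderItem', 'text in row', 0;

-- 3. Then retry the conversion to the xml data type.
ALTER TABLE dbo.OrderItem
    ALTER COLUMN XMLData XML;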
