I take XML from a web service and it's very long (> 1 million characters). I put the XML in an SSIS variable.
I want to put the raw XML from the variable into a SQL Server 2012 table. Table column is nvarchar(max).
From the Execute SQL Task I use a simple
INSERT INTO MyTable (xml) VALUES (@variable)
However, when I look at the column length in SQL Server, only 500k characters are there!
Why is this?
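Before assuming the insert truncated the data, it may be worth checking whether only the SSMS results grid is truncating the display, since it caps the number of characters it shows per column. A quick check, with MyTable standing in for the real table name:

SELECT LEN(xml) AS char_count,        -- characters actually stored
       DATALENGTH(xml) AS byte_count  -- bytes (2 per character for nvarchar)
FROM MyTable;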
There is a transfer happening between two databases. The source is SQL Server; the destination is PostgreSQL.
I have a UNIQUEIDENTIFIER column in SQL Server that sends its data to a VARCHAR column in PostgreSQL. The way the code is implemented expects that column to be a varchar/string.
The data gets transferred to that column, but with a formatting problem.
The SQL Server UNIQUEIDENTIFIER value: 27E66FD9-79B8-4342-92A9-3CA87E497E69
The Postgresql VARCHAR value: b'27E66FD9-79B8-4342-92A9-3CA87E497E69'
Obviously, I don't want the extra b'' in there. Is there a way to change this in the database without modifying the data type?
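The b'…' wrapper looks like a Python bytes value that was stringified on the way in, so the cleanest fix is to decode the value to text in the transfer code before it reaches PostgreSQL. For rows that are already loaded, a one-off cleanup in PostgreSQL itself could look like this (table and column names are hypothetical):

-- Strip a leading b' and a trailing ' from affected rows only:
UPDATE transfer_table
SET guid_col = regexp_replace(guid_col, '^b''(.*)''$', '\1')
WHERE guid_col LIKE 'b''%''';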
I have a strange problem. In my SQL Server database there is a table containing an nchar(8) column. I have inserted several rows into it with different Unicode data for the nchar(8) column.
Now when I fire a query like this
select *
from table
where ncharColumnName = N'㜲㤵㠱㠷㔳'
It gives me a row which contains N'㤱㔱〴㐰' as the Unicode data for the nchar(8) column.
How does SQL Server compare unicode data?
You should configure your column (or entire database) collation so that the equality and LIKE operators work as expected: Simplified Chinese or whatever matches your data.
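A minimal sketch, with hypothetical names and assuming a Simplified Chinese collation fits the data:

-- Set an explicit collation on the column:
ALTER TABLE MyTable
ALTER COLUMN ncharColumnName NCHAR(8) COLLATE Chinese_PRC_CI_AS;

-- Or override the collation for a single comparison:
SELECT *
FROM MyTable
WHERE ncharColumnName = N'㜲㤵㠱㠷㔳' COLLATE Chinese_PRC_CI_AS;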
I have done several SSIS packages over the past few months to move data from a legacy database to a SQL Server database. It normally takes 10-20 minutes to process around 5 million records, depending on the transformation.
The issue I am experiencing with one of my packages is very poor performance, because one of the columns in my destination is of the SQL Server XML data type.
Data comes in like this: 5
A script creates a Unicode string like this: <XmlData><Value>5</Value></XmlData>
Destination is simply a column with XML data type
This is really slow. Any advice?
I did a SQL Trace and noticed that behind the scenes SSIS is executing a convert on each row before the insert:
declare @p as xml
set @p = convert(xml, N'<XmlData><Value>5</Value></XmlData>')
Try using a temporary table to store the resulting 5 million records without the XML transformation and then use SQL Server itself to move them from tempDB to the final destination:
INSERT INTO final_destination (...)
SELECT cast(N'<XmlData><Value>5</Value></XmlData>' AS XML) AS batch_converted_xml, col1, col2, colX
FROM #tempTable
If 5,000,000 rows turn out to be too much data for a single batch, you can do it in smaller batches (100k rows should work like a charm), as sketched below.
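A minimal batching sketch, assuming #tempTable has a unique ascending integer key Id and carries the raw value in an nvarchar column RawValue (both names hypothetical):

-- Move rows in key windows of @batchSize:
DECLARE @lastId BIGINT = 0, @maxId BIGINT, @batchSize INT = 100000;
SELECT @maxId = MAX(Id) FROM #tempTable;

WHILE @lastId < @maxId
BEGIN
    INSERT INTO final_destination (batch_converted_xml, col1, col2)
    SELECT CAST(N'<XmlData><Value>' + RawValue + N'</Value></XmlData>' AS XML),
           col1, col2
    FROM #tempTable
    WHERE Id > @lastId AND Id <= @lastId + @batchSize;

    SET @lastId = @lastId + @batchSize;  -- advance the window
END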
What the profiler captured looks like an OLE DB transformation issuing one command per row.
I need to generate an SQL insert script to copy data from one SQL Server to another.
So with .NET, I'm reading the data from a given SQL Server table and writing it to a new text file, which can then be executed to insert this data into other databases.
One of the columns is a VARBINARY(MAX).
How should and can I transform the obtained byte[] into text for the script so that it can still be inserted on the other databases?
SSMS shows this data as a hex string. Is this the format to use?
I can get this same format with the following
BitConverter.ToString(<MyByteArray>).Replace("-", "")
But how can this be inserted again?
I tried
CONVERT(VARBINARY(MAX), '0xMyHexString')
This does an insert, but the value is not the same as in the source table.
It turned out you can just directly insert the hex string, no need to convert anything:
INSERT TableName (VarBinColumnName)
VALUES (0xMyHexString)
Just don't ask why I didn't test this directly...
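For what it's worth, the CONVERT attempt likely failed because without a style argument CONVERT stores the characters' own bytes; style 1 (SQL Server 2008 and later) instead parses a '0x…' string as a hex literal, and the same style works in the other direction. A sketch with a hypothetical value:

-- Source side: render the binary as a '0x...' string (style 1 keeps the prefix):
SELECT CONVERT(VARCHAR(MAX), VarBinColumnName, 1) AS hex_literal
FROM TableName;

-- Target side: style 1 parses the string as hex:
INSERT INTO TableName (VarBinColumnName)
VALUES (CONVERT(VARBINARY(MAX), '0x48656C6C6F', 1));  -- 0x48656C6C6F = 'Hello'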
There are two questions on SO that may help:
What is the fastest way to get varbinary data from SQL Server into a C# Byte array?
and
How Do I Insert A Byte[] Into an SQL Server VARBINARY column?
I'm in the process of importing data from a legacy MySQL database into SQL Server 2005.
I have one table in particular that's causing me grief. I've imported it from MySQL using a linked server and the MySQL ODBC driver, and I end up with this:
Col Name          Datatype  MaxLen
OrderItem_ID      bigint    8
PDM_Structure_ID  int       4
LastModifiedDate  datetime  8
LastModifiedUser  varchar   20
CreationDate      datetime  8
CreationUser      varchar   20
XMLData           text      -1
OrderHeader_ID    bigint    8
Contract_Action   varchar   10
ContractItem      int       4
My main focus is on the XMLData column - I need to clean it up and make it so that I can convert it to an XML datatype to use XQuery on it.
So I set the table option "large value types out of row" to 1:
EXEC sp_tableoption 'OrderItem', 'large value types out of row', 1
and then I go ahead and convert XMLData to VARCHAR(MAX) and do some cleanup of the XML stored in that field. All fine so far.
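The type change for that step is a one-liner; a sketch, assuming a direct conversion of the legacy column:

-- Convert the TEXT column so string functions can run against it:
ALTER TABLE dbo.OrderItem
ALTER COLUMN XMLData VARCHAR(MAX);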
But when I now try to convert that column to XML datatype:
ALTER TABLE dbo.OrderItem
ALTER COLUMN XMLData XML
I get this message here:
Msg 511, Level 16, State 1, Line 1
Cannot create a row of size 8077 which is greater than the allowable maximum row size of 8060. The statement has been terminated.
which is rather surprising, seeing that the columns besides XMLData only make up roughly 90 bytes, and I specifically instructed SQL Server to store all "large data" off-row.
So why does SQL Server refuse to convert that column to XML? Any ideas? Things I can check or change in my approach?
Update: I don't know what changed, but on a second attempt to import the raw data from MySQL into SQL Server, I was able to convert that NTEXT -> VARCHAR(MAX) column to XML in the end. Odd, but it works now. Thanks for all your input and recommendations, highly appreciated!
If you have sufficient storage space, you could try selecting from the VARCHAR(MAX) version of the table into a new table with the same schema but with XMLData set up as XML - either using SELECT INTO or by explicitly creating the table before you begin.
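A sketch of that approach with a hypothetical target name; SELECT ... INTO derives the XML column type from the CAST:

-- Copy everything into a fresh table, converting XMLData on the way:
SELECT OrderItem_ID, PDM_Structure_ID,
       LastModifiedDate, LastModifiedUser,
       CreationDate, CreationUser,
       CAST(XMLData AS XML) AS XMLData,
       OrderHeader_ID, Contract_Action, ContractItem
INTO dbo.OrderItem_New
FROM dbo.OrderItem;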
PS - it's a side issue unrelated to your problem, but you might want to check that you're not losing Unicode characters in the original MySQL XMLData field by this conversion since the text/varchar data types won't support them.
Can you ADD a new column of type xml?
If so, add the new xml column, update the table to set the new column equal to the XmlData column and then drop the XmlData column.
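Something along these lines, as an untested sketch (sp_rename restores the original column name at the end):

ALTER TABLE dbo.OrderItem ADD XMLData_new XML NULL;

-- Copy the cleaned-up data across, converting it to XML:
UPDATE dbo.OrderItem
SET XMLData_new = CAST(XMLData AS XML);

ALTER TABLE dbo.OrderItem DROP COLUMN XMLData;
EXEC sp_rename 'dbo.OrderItem.XMLData_new', 'XMLData', 'COLUMN';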
Edit
I have a table "TestTable" with an "nvarchar(max)" column.
select * from sys.tables where name = 'TestTable'
This gives a result containing:
[lob_data_space_id]  [text_in_row_limit]  [large_value_types_out_of_row]
1                    0                    0
yet I can happily save 500k characters in my nvarchar(max) field.
What do you get if you query sys.tables for your OrderItem table?
If your [text_in_row_limit] is not zero, try this, which should convert any existing in-row strings into BLOBs:
exec sp_tableoption 'OrderItem', 'text in row', 0
and then try to switch from nvarchar(max) to xml.
From BOL,
Disabling the text in row option or reducing the limit of the option will require the conversion of all BLOBs; therefore, the process can be long, depending on the number of BLOB strings that must be converted. The table is locked during the conversion process.