I am generating XML from 60 tables and storing this XML in a table.
Table name: Final_XML_Table
PK FK XML_Content (type xml)
1 1 "XML that I am generating from 60 tables"
When I run the query below, it gives a memory exception:
Select * from Final_XML_Table
Things I have tried:
1. Results to Text: I get only a few lines of the XML as text in the output window.
2. Results to File: I get only a few lines of the XML in the file.
Please suggest a fix, and also tell me whether, if any setting needs to change, I will have to change it on the server's SQL Server as well at deployment.
I have also set the XML data option to Unlimited:
This is not an answer, but too much for a comment...
The fact that you are able to store the XML shows clearly that the XML is not too big for the database.
The fact that you get an out-of-memory exception with Select * from Final_XML_Table shows clearly that SSMS has a problem reading/displaying your XML.
You might try to check the length like this:
DECLARE @tbl TABLE (x XML);
INSERT INTO @tbl VALUES('<root><test>blah</test><test /><test2><x/></test2></root>');
SELECT * FROM @tbl; --This does not work for you
SELECT DATALENGTH(x) FROM @tbl; --This returns just "82" in this case
It might be that, due to a logical error in your XML's creation (a wrong join?), the XML contains multiple/repeated elements. You might try a query like this to get a count of nodes, in order to check whether this number is realistic:
SELECT x.value('count(//*)','int') FROM @tbl
For the example above this returns "5".
You might do the same with your original XML.
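Applied to the table from the question, that check might look like this (just a sketch, reusing the PK and XML_Content names from the post):

SELECT PK,
       DATALENGTH(XML_Content) AS xml_bytes,
       XML_Content.value('count(//*)','int') AS node_count
FROM Final_XML_Table;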
With a query like the following you can retrieve all node names of the first level, the second level and so on. You can check if this looks okay:
SELECT firstLevel.value('local-name(.)','varchar(max)') AS l1_node
,SecondLevel.value('local-name(.)','varchar(max)') AS l2_node
--add more
FROM @tbl
OUTER APPLY x.nodes('/*') AS A(firstLevel)
OUTER APPLY A.firstLevel.nodes('*') AS B(SecondLevel)
--add more
And - of course - you might open Resource Monitor to look at the actual memory usage...
Come back with more details...
That error isn't a SQL Server error, it's from SSMS. It means that SSMS has run out of memory.
SSMS is only a 32-bit application, so it can only address 2 GB of RAM. If it tries to address more than that, the error will occur. If you've had SSMS open and returned some very large datasets, that RAM is going to get used up.
In all honesty, if you're running a query like SELECT * FROM Final_XML_Table then I would hazard a guess that the dataset is huge. Add a WHERE clause, or don't return the dataset on screen. If you really need to view the data (all of it), export it to something else. But I very much doubt you need to look at every row if you're returning around 2 GB of data.
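For example (assuming PK from the question is the key you would filter on), pull one document at a time rather than the whole table:

SELECT XML_Content
FROM Final_XML_Table
WHERE PK = 1; -- one row instead of the whole table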
I am working on a VB.NET application, and management wants me to change the application's data source from SQL Server to XML.
I have a class called WebData.vb in the old application; I need to somehow find a way to replace the stored procedures in it and make it read XML. So I was thinking of getting the XML structure from the result set returned by the stored procedure. I looked online, and for a normal select statement you can do something like this:
FOR xml path ('Get_Order'),ROOT ('Get_Orders')
I am looking for something like
EXEC dbo.spMML_GET_ORDERS_FOR_EXPORT
FOR xml path ('Get_Order'),ROOT ('Get_Orders')
so that, once I have the structure, I can pass that data to a DataTable and then return that DataTable to the method.
Also, if there is an alternative way of creating an XML stored procedure, please let me know. Thanks, coders.
Assuming you can't modify the stored proc (due to other dependencies or some other reason) so that the SELECT inside it uses the FOR XML syntax, you can use INSERT ... EXEC to insert the results of the stored proc into a temp table or table variable, and then apply your FOR XML to a query of those results.
Something like this should work:
DECLARE @Data TABLE (...) -- Define table to match results of stored proc
INSERT @Data
EXEC dbo.spMML_GET_ORDERS_FOR_EXPORT
SELECT * FROM @Data FOR xml path ('Get_Order'),ROOT ('Get_Orders')
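For illustration only (the column list below is invented, so match it to whatever the proc actually returns):

DECLARE @Data TABLE (OrderID int, CustomerID int, OrderDate datetime, Total money);
INSERT @Data
EXEC dbo.spMML_GET_ORDERS_FOR_EXPORT;
SELECT * FROM @Data FOR XML PATH ('Get_Order'), ROOT ('Get_Orders');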
There are a few methods. One is adding namespaces using WITH XMLNAMESPACES('<uri>' AS <prefix>). XMLNAMESPACES embeds the appropriate namespace declarations into the XML generated from your tables, for use with other applications (which hopefully is a factor here), making documentation a little easier.
Depending on your application's needs, you can use FOR XML {RAW, PATH, AUTO, or EXPLICIT} in your query, as well as XQuery methods... but for your needs, stick to the simpler methods like XML PATH or XML AUTO.
XML PATH is very flexible; however, you lose the straightforward identification of the column data types.
XMLNAMESPACES
WITH XMLNAMESPACES('dbo.MyTableName' AS SQL)
SELECT DENSE_RANK() OVER (ORDER BY Name ASC) AS 'Management_ID'
, Name AS [Name]
, Began AS [Team/@Began]
, Ended AS [Team/@Ended]
, Team AS [Team]
, [Role]
FROM dbo.SSIS_Owners
FOR XML PATH, ELEMENTS, ROOT('SQL')
XML AUTO
Because you might want to bring the data back into the database, I suggest using XML AUTO with XMLSCHEMA, where the SQL data types are kept in the XML.
SELECT DENSE_RANK() OVER (ORDER BY Name ASC) AS 'Management_ID'
, Name AS [Name]
, Began AS [Team/@Began]
, Ended AS [Team/@Ended]
, Team AS [Team]
, [Role]
FROM dbo.SSIS_Owners
FOR XML AUTO, ELEMENTS, XMLSCHEMA('SSIS_Owners')
The downside is that XMLNAMESPACES is not an option here, but you can get around that with solutions like XML SCHEMA COLLECTIONS, or in the query itself as I showed.
You can also just use XML PATH directly without the namespace, but again, that depends on your application use as you are transforming everything to XML files.
Also note how I defined the embedded attributes. A learning point here: think about the query in the same order that the XML will appear. That is why I defined the attributes first, before stating what the text for that node was.
Lastly, I think you'll find Paparazzi has a question that covers this topic quite well: TSQL FOR XML PATH Attribute On , Type
I have the following scenario/requirements, and I am not sure of the best way to address them so the process runs as fast as possible. I'm looking for guidance on which features to use, with examples if available.
I will receive anywhere between 10k and 100k entities (in XML format) from a web service, which I want to upsert (some rows might exist, others might not).
Here are some of the requirements:
The source of the XML is a web service that I'm calling from C# code - actually two different methods. For one of the methods, the return schema will be something flat that I can map directly to one of my tables. For the other, it will return an XML representation that I might need to work with in C# in order to map it to flat entities for my tables. In that scenario, would it be best to do the modifications needed and then write the result out as an XML file to use as the source?
The returned XML can contain up to 150k entities that may or may not exist in my tables yet, so I'm looking to upsert them. The files, when written to disk, can weigh up to 20 megabytes. I asked if they could do JSON instead of XML, but apparently that's not a choice.
The SQL database is on a different server than my IIS server, so I would rather avoid having SQL Server retrieve the XML from a file; I would rather pass it from C# as a string or as a Table-Valued Parameter.
The tables are rather simple and don't have indexes other than the PK ones.
I've never been big on XML, although it got way easier with LINQ to XML, which I was initially using to parse each record and send individual inserts, but the performance was just bad. So, based on some research I've been doing, I'm thinking I could use:
Upserts from SQL server through MERGE statements.
Pass the whole XML as a parameter and use OPENXML to use as source in the MERGE statement.
Or, somehow generate a Table Value Parameter in C# and pass that to SQL to use on the MERGE.
I read in this similar question (which didn't have access to upsert/merge) that instead of trying to upsert directly from the XML, it might be better to insert everything into a temporary table and do the merge/upsert against that temporary table.
Would this work and be considerably fast?
If anyone has had a similar scenario, can you share your thoughts/ideas about what combination of features would be best?
Thanks.
You are on the right track. I have a similar setup using XML to transfer data between an online portal and the client-server application. The rest of the setup is very similar to what you have.
The fact that your tables are not indexed is a bit of a concern if you are comparing any fields that are not PK fields, regardless of how you index the temp tables. It is important to have either one index with all of the fields used in the merge match clause, or an index on each of them - I find the former yields better performance. Beyond that, using an XML parameter, OpenXML and temp tables is the way to go.
The following code has not been tested, so it may need a bit of debugging, but it will put you on the right track. A couple of notes: if all of the fields in the OpenXML WITH clause are attributes, then you can drop the last parameter (i.e. ", 2") and the field source specifiers (i.e. "@id" for the detail table). Although the data in your description is flat, in which case you will only need one table, I often need to import into linked records. I have included a simple master-detail relationship example in the code below, just for the sake of completeness.
CREATE PROCEDURE usp_ImportFromXML (@data XML) AS
BEGIN
/*
<root>
<data>
<match_field_1>1</match_field_1>
<match_field_2>val2</match_field_2>
<data_1>val3</data_1>
<data_2>val4</data_2>
<detail_records>
<detail_data id="detailID1">
<detail_1>blah1</detail_1>
<detail_2>blah2</detail_2>
</detail_data>
<detail_data id="detailID2">
<detail_1>blah3</detail_1>
<detail_2>blah4</detail_2>
</detail_data>
</detail_records>
</data>
<data>
...
</data>
</root>
*/
DECLARE @iDoc INT
EXEC sp_xml_preparedocument @iDoc OUTPUT, @data
SELECT * INTO #temp
FROM OpenXML(@iDoc, '/root/data', 2) WITH (
match_field_1 INT,
match_field_2 VARCHAR(50),
data_1 VARCHAR(50),
data_2 VARCHAR(50)
)
SELECT * INTO #detail
FROM OpenXML(@iDoc, '/root/data/detail_data', 2) WITH (
match_field_1 INT '../../match_field_1',
match_field_2 VARCHAR(50) '../../match_field_2',
detail_id VARCHAR(50) '@id',
detail_1 VARCHAR(50),
detail_2 VARCHAR(50)
)
EXEC sp_xml_removedocument @iDoc
CREATE INDEX #IX_temp ON #temp(match_field_1, match_field_2)
CREATE INDEX #IX_detail ON #detail(match_field_1, match_field_2, detail_id)
MERGE data_table a
USING #temp ta
ON ta.match_field_1 = a.match_field_1 AND ta.match_field_2 = a.match_field_2
WHEN MATCHED THEN
UPDATE SET data_1 = ta.data_1, data_2 = ta.data_2
WHEN NOT MATCHED THEN
INSERT (match_field_1, match_field_2, data_1, data_2) VALUES (ta.match_field_1, ta.match_field_2, ta.data_1, ta.data_2);
MERGE detail_table a
USING (SELECT d.*, p._key FROM #detail d, data_table p WHERE d.match_field_1 = p.match_field_1 AND d.match_field_2 = p.match_field_2) ta
ON a.id = ta.detail_id AND a.parent_key = ta._key
WHEN MATCHED THEN
UPDATE SET detail_1 = ta.detail_1, detail_2 = ta.detail_2
WHEN NOT MATCHED THEN
INSERT (parent_key, id, detail_1, detail_2) VALUES (ta._key, ta.detail_id, ta.detail_1, ta.detail_2);
DROP TABLE #temp
DROP TABLE #detail
END
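A quick smoke test is to call the proc with a small literal document shaped like the comment above (the values are made up):

EXEC usp_ImportFromXML @data = N'<root>
<data>
<match_field_1>1</match_field_1>
<match_field_2>val2</match_field_2>
<data_1>val3</data_1>
<data_2>val4</data_2>
<detail_records>
<detail_data id="detailID1">
<detail_1>blah1</detail_1>
<detail_2>blah2</detail_2>
</detail_data>
</detail_records>
</data>
</root>';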
Use (3). Process the data ready for upsert in C#. C# is made for this kind of algorithmic work. It is both the right programming language and the faster one. T-SQL is not the right tool. You do not want to use XML with T-SQL for very high-performance work because it burns CPU like crazy. Instead, use the fast TDS protocol to send a TVP or bulk data.
Then, send the data to the server using either a TVP or a bulk insert (SqlBulkCopy) into a temp table. The latter technique is great for very many rows (>10k?). Bulk insert uses special TDS features; it does not use SQL batches to transfer the data. It does not get faster than this.
Then use the MERGE statement as you described. Use big batch sizes, potentially all rows in one batch.
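If you go the TVP route, the server side might look roughly like this (type, table and column names are placeholders, not your real schema); the C# side then passes a DataTable or DbDataReader as the structured parameter:

-- Table type matching the flattened entities (placeholder columns)
CREATE TYPE dbo.EntityTableType AS TABLE
(
    entity_id INT PRIMARY KEY,
    data_1    VARCHAR(50),
    data_2    VARCHAR(50)
);
GO
CREATE PROCEDURE dbo.usp_UpsertEntities (@entities dbo.EntityTableType READONLY) AS
BEGIN
    MERGE dbo.entity_table AS tgt
    USING @entities AS src
        ON src.entity_id = tgt.entity_id
    WHEN MATCHED THEN
        UPDATE SET data_1 = src.data_1, data_2 = src.data_2
    WHEN NOT MATCHED THEN
        INSERT (entity_id, data_1, data_2)
        VALUES (src.entity_id, src.data_1, src.data_2);
END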
The best way I've found is to bulk insert into a temp table from your C# code, then issue the merge once the data is in SQL Server. I have an example here on my blog SQL Server Bulk Upsert
I use this in production to insert millions of rows daily, and have yet to find a faster way to do it. Give it a try, I think you will be impressed with the performance of the solution.
In this live SQL Server 2008 (build 10.0.1600) database, there's an Events table, which contains a text column named Details. (Yes, I realize this should actually be a varchar(MAX) column, but whoever set this database up did not do it that way.)
This column contains very large logs of exceptions and associated JSON data that I'm trying to access through SQL Server Management Studio, but whenever I copy the results from the grid to a text editor, it truncates it at 43679 characters.
I've read on various locations on the Internet that you can set your Maximum Characters Retrieved for XML Data in Tools > Options > Query Results > SQL Server > Results To Grid to Unlimited, and then perform a query such as this:
select Convert(xml, Details) from Events
where EventID = 13920
(Note that the data in the column is not XML at all. CONVERTing the column to XML is merely a workaround I found from Googling that someone else has used to get around the limit SSMS has on retrieving data from a text or varchar(MAX) column.)
However, after setting the option above, running the query, and clicking on the link in the result, I still get the following error:
Unable to show XML. The following error happened:
Unexpected end of file has occurred. Line 5, position 220160.
One solution is to increase the number of characters retrieved from the server for XML data. To change this setting, on the Tools menu, click Options.
So, any idea on how to access this data? Would converting the column to varchar(MAX) fix my woes?
SSMS only allows unlimited data for XML data. This is not the default and needs to be set in the options.
One trick which might work in quite limited circumstances is simply naming the column in a special manner as below so it gets treated as XML data.
DECLARE @S varchar(max) = 'A'
SET @S = REPLICATE(@S,100000) + 'B'
SELECT @S as [XML_F52E2B61-18A1-11d1-B105-00805F49916B]
In SSMS (at least versions 2012 through the current 18.3) this displays the results as below.
Clicking on it opens the full results in the XML viewer. Scrolling to the right shows the last character, B, is preserved.
However, this does have some significant problems. Adding extra columns to the query breaks the effect, and extra rows all become concatenated with the first one. Finally, if the string contains characters such as <, opening the XML viewer fails with a parsing error.
A more robust way of doing this, which avoids issues of SQL Server converting < to &lt; etc. or failing due to these characters, is below (credit Adam Machanic here).
DECLARE @S varchar(max)
SELECT @S = ''
SELECT @S = @S + '
' + OBJECT_DEFINITION(OBJECT_ID) FROM SYS.PROCEDURES
SELECT @S AS [processing-instruction(x)] FOR XML PATH('')
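Applied to the column from the question, the same trick might look like this (the CAST is only needed because Details is a text column):

SELECT CAST(Details AS varchar(max)) AS [processing-instruction(x)]
FROM Events
WHERE EventID = 13920
FOR XML PATH(''), TYPE;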
I was able to get this to work...
SELECT CAST('<![CDATA[' + LargeTextColumn + ']]>' AS XML) FROM TableName;
One work-around is to right-click on the result set and select "Save Results As...". This exports it to a CSV file with the entire contents of the column. Not perfect but worked well enough for me.
Did you try this simple solution? It's only two clicks away!
At the query window:
set query options to "Results to Grid", then run your query
right-click on the results tab at the grid corner, and save the results as any file
You will get all the text you want to see in the file! I can see 130,556 characters for my result of a varchar(MAX) field.
The simplest workaround I found is to script out the table and view the script. To do this:
1. Right-click your database and choose Tasks > Generate Scripts...
2. On the "Introduction" page, click Next.
3. On the "Choose Objects" page:
   1. Choose Select specific database objects and select your table.
   2. Click Next.
4. On the "Set Scripting Options" page:
   1. Set the output type to Save scripts to a specific location.
   2. Select Save to file and fill in the related options.
   3. Click the Advanced button.
   4. Set General > Types of data to script to Data only or Schema and Data, and click OK.
   5. Click Next.
5. On the "Summary" page, click Next.
Your SQL script should be generated based on the options you set in step 4.2. Open this file and view your data.
The data type TEXT is old and should not be used anymore; it is a pain to select data out of a TEXT column.
ntext, text, and image (Transact-SQL)
ntext, text, and image data types will be removed in a future version of Microsoft SQL Server. Avoid using these data types in new development work, and plan to modify applications that currently use them. Use nvarchar(max), varchar(max), and varbinary(max) instead.
You need to use TEXTPTR (Transact-SQL) to retrieve the text data.
Also see this article on Handling The Text Data Type.
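For the table in the question, a TEXTPTR-based read might look roughly like this (the offset and chunk size are up to you):

DECLARE @ptr varbinary(16);
SELECT @ptr = TEXTPTR(Details)
FROM Events
WHERE EventID = 13920;
READTEXT Events.Details @ptr 0 8000; -- read the first 8000 characters starting at offset 0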
It sounds like the XML may not be well formed. If that is the case, then you will not be able to cast it as XML, and given that, you are limited in how much text you can return in Management Studio. However, you could break up the text into smaller chunks like so:
With Tally As
(
Select ROW_NUMBER() OVER ( ORDER BY s1.id ) - 1 As Num
From sys.sysobjects As s1
Cross Join sys.sysobjects As s2
)
Select Substring(T1.textCol, T2.Num * 8000 + 1, 8000)
From Table As T1
Cross Join Tally As T2
Where T2.Num < Ceiling(DataLength(T1.textCol) / 8000.0)
Order By T2.Num
You would then need to manually combine them again.
EDIT
It sounds like there are some characters in the text data that the XML parser does not like. You could try converting those values to entities and then try the Convert(xml, data) trick. So something like:
Update Table
Set Data = Replace(Cast(Data As varchar(max)),'<','&lt;')
(I needed to cast to varchar(max) because the replace function will not work on text columns. There should not be any reason you couldn't convert those text columns to varchar(max).)
You are out of luck, I think. The problem is not a SQL-level problem, as all the other answers seem to assume, but simply one of user interface. Management Studio is not meant to be a general-purpose/generic data access interface. It is not there to be your interface, but your administrative area, and it has serious limitations handling binary data and large text data, because people using it within the intended usage profile will not run into this problem.
Presenting large text data is simply not the planned usage.
Your only choice would be a table-valued function that takes the text input and cuts it into rows, one per line, so that Management Studio gets a list of rows, not a single row.
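A rough sketch of such a function, splitting on line feeds (the names are invented, and you may also want to strip CHAR(13)):

CREATE FUNCTION dbo.SplitIntoLines (@txt varchar(max))
RETURNS @lines TABLE (LineNo int IDENTITY(1,1), LineText varchar(max))
AS
BEGIN
    DECLARE @pos int = CHARINDEX(CHAR(10), @txt);
    WHILE @pos > 0
    BEGIN
        INSERT INTO @lines (LineText) VALUES (LEFT(@txt, @pos - 1));
        SET @txt = STUFF(@txt, 1, @pos, '');
        SET @pos = CHARINDEX(CHAR(10), @txt);
    END;
    INSERT INTO @lines (LineText) VALUES (@txt); -- last (or only) line
    RETURN;
END

Used against the table from the question, it would be called per row:

SELECT l.LineNo, l.LineText
FROM Events e
CROSS APPLY dbo.SplitIntoLines(CAST(e.Details AS varchar(max))) AS l
WHERE e.EventID = 13920
ORDER BY l.LineNo;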
I prefer this simple XML hack which makes columns clickable in SSMS on a cell-by-cell basis. With this method, you can view your data quickly in SSMS’s tabular view and click on particular cells to see the full value when they are interesting. This is identical to the OP’s technique except that it avoids the XML errors.
SELECT
e.EventID
,CAST(REPLACE(REPLACE(e.Details, '&', '&amp;'), '<', '&lt;') AS XML) Details
FROM Events e
WHERE 1=1
AND e.EventID BETWEEN 13920 AND 13930
;
Starting from SSMS 18.2, you can now view up to 2 million characters in the grid results. Source
Allow more data to be displayed (Result to Text) and stored in cells
(Result to Grid). SSMS now allows up to 2M characters for both.
I verified this with the code below.
DECLARE @S varchar(max) = 'A'
SET @S = REPLICATE(@S,2000000) + 'B'
SELECT @S as a
declare @takeOver table(details nvarchar(max))
declare @json_auto nvarchar(max)
select @json_auto = (select distinct
                            cg.Id, cg.column_1, c.column_2  -- column list is a placeholder; pick the columns you actually need
                     from table_1 cg
                     inner join table_2 c
                        on cg.column_1 = c.column_1 and cg.isDeleted = 0 and c.isDeleted = 0
                     inner join table_3 d
                        on c.column_2 = d.column_2 and d.isDeleted = 0
                     where cg.Id = 1017
                     for json auto)
insert into @takeOver
values(@json_auto)
select * from @takeOver
Somehow some records in my table are getting updated with a value of xyz in a certain column. Out of hundreds of stored procedures, functions and triggers, how can I determine which piece of code is doing this? Is there a way to search through every script of code in the database?
Please help.
One approach is to check syscomments
Contains entries for each view, rule, default, trigger, CHECK constraint, DEFAULT constraint, and stored procedure within the database. The text column contains the original SQL definition statements.
e.g. select text from syscomments
If you are having trouble finding that literal string, the values could be coming from a table, or they could be being concatenated within a routine.
Try this
Select text from syscomments
where CharIndex('x', text) > 0
and CharIndex('y', text) > 0
and CharIndex('z', text) > 0
That might help you either find the right routine, or further indicate that the values are coming from a table.
This is going to be nearly impossible to do in SQL Server 2000, because the update might very well come from a variable that has that value, or from a join to another table that has that value, rather than being hard-coded into the stored proc, trigger, etc. The update could also be coming from a DTS package, a job, a piece of dynamic code run by the app, or even from Query Analyzer, so the code itself may not be recorded in the database anywhere.
Perhaps a better approach might be to create an audit table for the table in question and have it record the user and the code from the spid that generated the change as well as the old and new values. You'll have to wait until it happens again, but then you would know exactly what changed the value and what value to put it back to if need be.
Alternatively you could run profiler on the system until it happens but profiler tends to hurt performance and is not usually a good idea to run on a production system. If it is happening very often, it might be an acceptable alternative.
Here's a hint as to how you might get some of the info you want for the eventual trigger code you write:
create table #temp (eventtype nvarchar (1000), parameters int, eventinfo nvarchar (4000), myspid int)
declare @myspid int
select @myspid = @@spid
insert #temp (eventtype,parameters, eventinfo)
exec ('dbcc inputbuffer (@@spid)')
update #temp
set myspid = @myspid
select hostname, program_name, eventinfo
from #temp t
join sysprocesses s on t.myspid = s.spid
WHERE spid = @myspid
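Putting that together, a very rough audit-trigger skeleton might look like this (the table and column names are placeholders for your real table; the inputbuffer capture above could be folded into the trigger in the same way):

CREATE TABLE dbo.MyTable_Audit
(
    AuditID     int IDENTITY(1,1) PRIMARY KEY,
    ChangedAt   datetime NOT NULL DEFAULT GETDATE(),
    LoginName   sysname  NOT NULL DEFAULT SUSER_SNAME(),
    HostName    nvarchar(128) NULL,
    ProgramName nvarchar(128) NULL,
    OldValue    varchar(100) NULL,
    NewValue    varchar(100) NULL
)
GO
CREATE TRIGGER trg_MyTable_Audit ON dbo.MyTable
AFTER UPDATE
AS
BEGIN
    INSERT dbo.MyTable_Audit (HostName, ProgramName, OldValue, NewValue)
    SELECT HOST_NAME(), APP_NAME(), d.SomeColumn, i.SomeColumn
    FROM inserted i
    JOIN deleted d ON d.PKColumn = i.PKColumn
    WHERE ISNULL(d.SomeColumn, '') <> ISNULL(i.SomeColumn, '')
END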
You might use SQL Profiler to trace updates of a given table/column.