I have a query that returns approximately 400,000 records, up to 18 MB of data.
I formatted the output as XML using:
FOR XML PATH('url'), ROOT('urlset')
Now the query result doesn't show the complete XML, and when I try to view it in the SSMS XML window, I am not able to export it to an XML file.
ERROR: Exception of type 'System.OutOfMemoryException' was thrown.
I have already tried the practices suggested in other posts:
Ran the script to enable xp_cmdshell.
Increased the XML data capacity to Unlimited under Options >> Results to Grid.
Still the same error! How do I resolve this?
After a while, I tried what @gofr1 suggested in his comment.
That change does not take effect when you open a new query window or refresh the database connection; it works ONLY AFTER restarting SSMS.
UPDATE NOTE: With this solution you can see the results in the grid, but if you try to export them to an XML file, you still get the same error:
Exception of type 'System.OutOfMemoryException' was thrown.
UPDATE:
So I decided to pass the XML output to an application and let C# generate the XML file in the desired folder.
Since the Google sitemap XML limit is 50,000 URLs per file, one should create multiple sitemaps, each containing at most 50k records, as in the sketch below.
Note: Google allows 1,000 sitemap files per domain.
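Here is a minimal C# sketch of that approach; the dbo.SiteUrls table, its Url column, and the connection string are hypothetical placeholders, not the actual schema:

using System.Data.SqlClient;
using System.Xml;

class SitemapSplitter
{
    // Google's per-file limit at the time of the original post.
    const int MaxUrlsPerFile = 50000;

    static void Main()
    {
        using (var conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
        using (var cmd = new SqlCommand("SELECT Url FROM dbo.SiteUrls", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                int fileIndex = 0, countInFile = 0;
                XmlWriter writer = null;
                while (reader.Read())
                {
                    if (countInFile == 0)
                    {
                        // Start a new numbered sitemap file.
                        fileIndex++;
                        writer = XmlWriter.Create("sitemap" + fileIndex + ".xml",
                                                  new XmlWriterSettings { Indent = true });
                        writer.WriteStartElement("urlset",
                            "http://www.sitemaps.org/schemas/sitemap/0.9");
                    }
                    writer.WriteStartElement("url");
                    writer.WriteElementString("loc", reader.GetString(0));
                    writer.WriteEndElement();

                    if (++countInFile == MaxUrlsPerFile)
                    {
                        // File is full: close </urlset> and move on to the next one.
                        writer.WriteEndElement();
                        writer.Close();
                        countInFile = 0;
                    }
                }
                if (countInFile > 0) { writer.WriteEndElement(); writer.Close(); }
            }
        }
    }
}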
EDIT:
Google has since increased the maximum from 1,000 sitemaps to 50,000 sitemaps per domain, and a sitemap file can now be up to 50 MB. This is a huge increase in capacity.
References:
seroundtable/archives/021559
searchengineland/google-bing-increase-file-size-limit-sitemaps-files
searchenginejournal/google-bing-increase-sitemap-file-size-limit-50
This is because SSMS is not able to show a large amount of data in the results window.
You can write a utility that dumps the data into a file for better readability.
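For example, a bare-bones version of such a utility, sketched in C# with ADO.NET; the connection string, table, and query are placeholders:

using System.Data.SqlClient;
using System.Xml;

class XmlDump
{
    static void Main()
    {
        using (var conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
        using (var cmd = new SqlCommand(
            "SELECT * FROM dbo.MyTable FOR XML PATH('row'), ROOT('rows')", conn))
        {
            conn.Open();
            // Stream the XML from the server straight to disk instead of
            // rendering it in the SSMS grid.
            using (XmlReader source = cmd.ExecuteXmlReader())
            using (XmlWriter target = XmlWriter.Create("dump.xml",
                       new XmlWriterSettings { Indent = true }))
            {
                target.WriteNode(source, false);   // copies node by node, no full buffering
            }
        }
    }
}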
I'd like to try to gauge my users' internet speeds based on the download of a dataset of known size (1 MB).
Using T-SQL only, how can I quickly create a table with exactly 1 MB of data in it?
I want to be able to run EXEC sp_spaceused N'dbo.myTableName' to verify the data size.
The SO search keywords ended up being "numbers table". Searching for that term, I found a great post.
Closing this as a duplicate of: What is the best way to create and populate a numbers table?
I think I'm going to move in a different direction for my use case with:
fsutil file createnew C:\Desktop\testFile.png 1000000
We have these requirements:
Show object data on a Visualforce page using pagination.
An Export button to export all records to an XLS or CSV file.
The issue is that the data size is too large, i.e. more than 100,000 records.
How can we write more than 100,000 records to an XLS file using Apex?
I know for sure that writing to a Document could work, and maybe even the newer Files feature; you can write to and append an existing Document. Using the @ReadOnly annotation you can query more than 10,000 records. You might run into heap size errors, though. Another option could be to use the Bulk API v2.
I have searched Google and this site for about two hours trying to work out how to do this, with no luck finding a way that fits or that I understand. As the title says, I need to export table data to an XML file. I have an Azure SQL database with table data.
Table name: District
Table Columns: Id, name, organizationType, address, etc.
I need to take this data and create an XML file that I can save so that it can be given to others.
I have tried using:
SELECT *
FROM dbo.District
FOR XML PATH('districtEntry'), ROOT('leaID')
It gives me the data in XML format, but I don't see a way to save it.
Also, there are some operations I need to be able to perform with the data:
The program should have these options:
1) Export all data.
2) Export all rows created or updated since a specified date.
Files should be named in format ENTITY.DATE.XML, as in
DISTRICT.20150521.XML (use date in YYYYMMDD format).
This leads me to believe I need to write code other than SQL, since a requirement is to query the table for certain data elements as well.
I was wondering whether I would need to download something like SQL Server Data Tools, write code, and if so in what language, etc. The XML file creation would, I believe, need to be automated after every update of the table or after a query.
I am very confused and in need of guidance as I now have almost given up hope. Please let me know if I need to clarify anything. Thank you.
P.S. I would have given pictures but I do not have enough reputation to supply them.
I would imagine you're looking to write a program in VB.NET or C#, using ADO.NET in either case. Here's an MSDN article with a complete sample of how to connect to and query SQL Azure:
https://msdn.microsoft.com/en-us/library/azure/ee336243.aspx
The example shows how to write the output to the Console, but you could also write the output similarly using something like a StreamWriter to write it to a file.
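As a sketch of how those pieces could fit together for the District table from the question: the ModifiedDate audit column and the connection string details are assumptions, since the question only lists Id, name, organizationType, and address:

using System;
using System.Data.SqlClient;
using System.IO;
using System.Xml;

class DistrictExport
{
    static void Main()
    {
        // Export rows created or updated since a given date;
        // use DateTime.MinValue to export everything.
        DateTime since = new DateTime(2015, 5, 21);

        using (var conn = new SqlConnection(
            "Server=yourserver.database.windows.net;Database=YourDb;User ID=...;Password=..."))
        using (var cmd = new SqlCommand(
            @"SELECT * FROM dbo.District
              WHERE ModifiedDate >= @since  -- assumed audit column
              FOR XML PATH('districtEntry'), ROOT('leaID')", conn))
        {
            cmd.Parameters.AddWithValue("@since", since);
            conn.Open();

            // ENTITY.DATE.XML naming from the requirements, e.g. DISTRICT.20150521.XML.
            string fileName = "DISTRICT." + DateTime.Today.ToString("yyyyMMdd") + ".XML";
            using (XmlReader reader = cmd.ExecuteXmlReader())
            using (var writer = new StreamWriter(fileName))
            {
                reader.MoveToContent();               // position on the <leaID> root
                writer.Write(reader.ReadOuterXml());  // write the whole document out
            }
        }
    }
}

Run on a schedule (Windows Task Scheduler, an Azure WebJob, or similar), something like this would also cover the automation requirement.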
You could also create a sqlcmd script to do this, following the guidelines here to connect using sqlcmd:
https://msdn.microsoft.com/en-us/library/azure/ee336280.aspx
Alternatively, if this is a process that does not need to be automated or repeated frequently, you could do it using SSMS:
http://azure.microsoft.com/en-us/documentation/articles/sql-database-manage-azure-ssms/
Running your query through SSMS will produce an XML document, which can be saved using File -> Save As.
I'm converting a client's DTS packages to SSIS. In one of their packages they have an Execute SQL task with a query similar to this:
SELECT * FROM [SOME_TABLE] AS ReturnValues
ORDER BY IDNumber
FOR XML AUTO, ELEMENTS
This query returned in a decent amount of time on the old system, but on the new box it takes up to 18 minutes to run in SSMS. Sometimes it generates an XML link, and if I click on it to view the XML it throws a 'System.OutOfMemoryException' and suggests increasing the number of characters retrieved from the server for XML data. I increased that option to Unlimited and am still getting the error.
The table itself contains 220,500 rows, but the query shows 129,810 rows returned before it stops. Is this simply a matter of not having enough memory available to the system? The box has 48 GB (Windows Server 2008 R2 EE x64), with the instance capped at 18 GB because it's a shared dev environment. Any help/insight would be greatly appreciated, as I don't really know XML!
When you run FOR XML queries in SSMS, it generates all the XML, puts it into the grid, and lets you click on it. There are limits to how much data it can bring back, and 220,000 rows, depending on how wide the table is, produce a huge amount of text.
The out-of-memory condition comes from SSMS trying to parse all of it, which consumes a lot of memory.
You can try executing to a file and see what size you get. But the major reason for running out of memory is that this is a lot of XML, and when returning it to the grid you will not always get the complete results with a result set of this size.
DBADuck (Ben)
The out-of-memory exception you're hitting is due to the amount of text a .NET grid control can handle. 220k lines is huge! The setting in SSMS to show unlimited data is only as good as the .NET control's memory cap.
You could look at removing the ELEMENTS option and viewing the data in attribute format. That will decrease the amount of XML "string space" returned. Personally, I prefer attributes over elements for that reason alone. Context is king, so it depends on what you're trying to accomplish (look at the data or use the data). Could you pipe the data into an XML variable? When all is said and done, DBADuck is 100% correct in his statement.
SqlNightOwl
Our customer is complaining that our export file is too long; they would like us to split the export into many files with no more than "n" records per file. Is there a way of doing this with SELECT ... FOR XML?
At present we are using SQL Server 2005 for this project.
(If this is too hard, I can always post-process the single large file to split it up.)
I don't think there's anything simple'n'easy you can do here.
My approach would probably be to limit the number of rows returned by each SELECT statement (by partitioning the data by some criterion, e.g. date or location), and then put those smaller XML streams into files one by one, as in the sketch below. Doable, but not very elegant or sophisticated.
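A rough C# sketch of that partitioning idea, paging the rows with ROW_NUMBER() (available on SQL Server 2005) and writing one file per page; the table name, key column, and connection string are placeholders:

using System.Data.SqlClient;
using System.IO;

class ChunkedXmlExport
{
    static void Main()
    {
        const int rowsPerFile = 10000;   // the customer's "n" records per file

        using (var conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
        {
            conn.Open();
            for (int page = 0; ; page++)
            {
                using (var cmd = new SqlCommand(
                    @"SELECT * FROM
                          (SELECT *, ROW_NUMBER() OVER (ORDER BY Id) AS rn
                           FROM dbo.ExportTable) AS t
                      WHERE rn > @lo AND rn <= @hi
                      FOR XML PATH('row'), ROOT('rows')", conn))
                {
                    cmd.Parameters.AddWithValue("@lo", page * rowsPerFile);
                    cmd.Parameters.AddWithValue("@hi", (page + 1) * rowsPerFile);

                    string xml;
                    using (var reader = cmd.ExecuteXmlReader())
                    {
                        if (!reader.Read())              // empty page: past the last row, stop
                            return;
                        xml = reader.ReadOuterXml();     // the <rows> document for this page
                    }
                    File.WriteAllText("export_" + page + ".xml", xml);
                }
            }
        }
    }
}

Listing explicit columns in the inner SELECT would keep the helper rn column out of the XML; SELECT * is used here only to keep the sketch short.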