How to definitively fix this error: ".NET Framework execution was aborted by escalation policy because of out of memory"? - sql-server

First of all, here is our setup:
SQL Server 2014 SP2 Standard Edition 64-bit
128GB of RAM (the maximum allowed by Standard Edition), of which 120GB is allocated to SQL Server
The server currently hosts ~5000 databases which are all similar (same tables, stored procedures, etc.) for a total of 690GB of data (mdf files only)
Now what happens:
Every now and then, after the server has been up for some time, we receive this error when executing queries on some databases:
.NET Framework execution was aborted by escalation policy because of out of memory
This error happens more often when we perform an update of all client databases (when we launch a feature) using Red Gate SQL Multi Script. Of the ~5000 DBs, we get the error on 70 of them. Running the update script again, the error happens on a smaller portion of them, and so on, until all databases are updated correctly. This is just annoying.
We have had this error for a long time. Our server used to have 64GB of RAM, so we added more memory to max out SQL Server Standard Edition, but the error still came back a few days later. We think the error might be a symptom of something else.
A few things that might help to get an answer:
Our version of SQL Server is 64-bit, so we think we don't have to deal with Virtual Address Space Reservation
The error also happens when running from a client app written in PHP/Linux, so we're not talking about the .NET Framework of the client code
In our databases, the only use we make of the .NET Framework is GROUP_CONCAT, a .NET CLR assembly with user-defined functions that help us simulate MySQL's GROUP_CONCAT aggregate. We have a copy of the assembly in each of our 5000 client databases.
We already tried to lower the max server memory setting (to 96GB), but we were still getting those errors
If more info is needed I will update my question.
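One hedged diagnostic, not from the original post: track how much memory the SQLCLR memory clerks hold as the errors start to appear, so you can see whether CLR memory really is the constrained resource (column names are as of SQL Server 2012+):

SELECT type, SUM(pages_kb) / 1024 AS size_mb
FROM sys.dm_os_memory_clerks
WHERE type IN (N'MEMORYCLERK_SQLCLR', N'MEMORYCLERK_SQLCLRASSEMBLY')
GROUP BY type;
-- DBCC MEMORYSTATUS gives a more detailed (but harder to read) breakdown.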

It's been 4 months since I tried a fix, and I have not experienced the bug again. I still don't have an exact explanation for the bug, but here is what I tried, and it seems to work:
My guess was that having the same .NET CLR assembly in each of our 5000+ databases might be the problem, increasing memory usage for .NET in some way.
I created a new database named something like DotNetClrUtils
I copied the .NET CLR Assembly for GROUP_CONCAT in this database
I changed all usage of GROUP_CONCAT in all our client code and stored procedures to reference the single instance in the DotNetClrUtils database (calling it like this: DotNetClrUtils.dbo.GROUP_CONCAT_D(col, ','))
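For illustration, a minimal sketch of that consolidation, assuming the aggregate is implemented by an assembly I'll call GroupConcat here (the assembly, class, and file names are illustrative, not necessarily the exact ones we use):

CREATE DATABASE DotNetClrUtils;
GO
USE DotNetClrUtils;
GO
-- CLR is presumably already enabled on the instance, since the assembly was in use.
CREATE ASSEMBLY GroupConcat
    FROM N'C:\Deploy\GroupConcat.dll'          -- illustrative path
    WITH PERMISSION_SET = SAFE;
GO
CREATE AGGREGATE dbo.GROUP_CONCAT_D (@value NVARCHAR(4000), @delimiter NVARCHAR(4))
RETURNS NVARCHAR(MAX)
EXTERNAL NAME GroupConcat.[GroupConcatDelimiter];  -- class name is an assumption
GO
-- Client databases then reference the single copy with a three-part name:
-- SELECT SomeKey, DotNetClrUtils.dbo.GROUP_CONCAT_D(SomeColumn, N',')
-- FROM dbo.SomeTable GROUP BY SomeKey;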
That's all, and now this problem is gone!

Related

SQL Server temporary connection failure from particular clients

We have a client deployment of our software that is showing intermittent SQL server connection failures, and we are struggling to understand them.
Our system consists of a SQL Server DB (2012) and 14 identical engines, each installed on a Windows 2012 VM. Each of these was created from the same template, so they should be identical. The engines consist of a Windows service that connects to the DB on startup by reading a single row from a table. If the connection fails, they wait a few seconds and try again until they get a connection.
In this particular case, the VMs were all rebooted due to a Windows Update. (The SQL server had the update/reboot about 12 hours before). They came online within a few minutes of each other. 12 of the engines started up without any problem. Two of them, however, failed to connect to the DB with:
"The underlying provider failed on Open."
Those two engines then started to poll, and continued to get this error for many hours. The rest of the engines had started up and were fine. We have a broker service too that was accessing the DB throughout and showed no connection issues.
When the client noticed this issue, they restarted the engine services on the two problem VMs, and the two engines connected to the DB just fine.
We are trying to understand what could have happened here. I guess my main questions are:
What could be an explanation of why 12 connections succeed and two fail? There's absolutely no difference as far as we know between the engines. The query itself is very simple.
Why did the connection continue to fail for those two engines until the service was restarted? This suggests to me that there is some process-level failed state that is only cleared when restarting the services. I've looked at the code to see if it was reusing the connections. It uses Entity Framework to read the single table row, and we create a fresh DbContext each time. I don't understand how this could go wrong.
We noted that there was a CheckDb operation proceeding on the DB around the time the services were coming up, and we wondered if this could be related to the issue. However, the client says that this runs every night and hasn't caused problems in the past. And it wouldn't explain why the engines didn't come back up again.
Thanks in advance for any help.
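One hedged diagnostic sketch for this kind of intermittent failure, assuming you have access to the SQL Server error log: check whether the server actually recorded failed logins from those two VMs during the window (the search string below is illustrative):

-- Read the current SQL Server error log for login failures.
EXEC sp_readerrorlog 0, 1, N'Login failed';
-- The connectivity ring buffer also records recent connection/login errors:
SELECT record
FROM sys.dm_os_ring_buffers
WHERE ring_buffer_type = N'RING_BUFFER_CONNECTIVITY';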

Queries slow when run by specific Windows account

Running SQL Server 2014 Express on our domain. We use Windows Authentication to log on. All queries are performed in stored procedures.
Now, the system runs fine for all our users - except one. When he logs on (using our software), all queries take around 10 times longer (e.g. 30 ms instead of 2 ms). The queries are identical, the database is the same, the network speed is the same, the operating system is the same, the SQL Server drivers are the same, connection pooling is the same, DNS is the same. Changing computer does not help. The problem seems to be linked to the account being used.
What on Earth may be the cause for this huge performance hit?
Please advise!
I would try rebuilding the SP (by running an ALTER statement that duplicates its existing structure) to force SQL Server to recompile it. I don't know every way SQL Server caches things, but it can definitely create distinct execution plans for connections with different settings, so I wouldn't be surprised if your slow user is running a version with an inefficient execution plan.
http://www.sommarskog.se/query-plan-mysteries.html
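A hedged sketch of both ideas, forcing a recompile and comparing the SET options baked into each cached plan for the procedure (the procedure name is hypothetical):

-- Mark the procedure so the next execution compiles a fresh plan.
EXEC sp_recompile N'dbo.YourSlowProc';

-- Compare the SET options (e.g. ARITHABORT) attached to each cached plan:
SELECT cp.plan_handle, cp.usecounts, pa.value AS set_options
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_plan_attributes(cp.plan_handle) AS pa
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE pa.attribute = 'set_options'
  AND st.objectid = OBJECT_ID(N'dbo.YourSlowProc');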

Classic ASP pages using COM object are extremely slow

First of all, I'd like to say upfront that some of the details on this question may be a little vague as it relates to commercial code that I obviously cannot divulge here. But anyway, here goes.
I am currently dealing with a .NET-based web application that also has pages of classic ASP in certain areas - this application has a backend MSSQL database. The data access for the classic ASP areas of the application is handled by a 32-bit DLL added into Component Services on the IIS server. The application has recently been changed so that all components (IIS authentication, the .NET application pool and a data access DLL added to Component Services) now run under a single Windows account.
The problem I have found recently is that while .NET data access to the database runs at a perfectly normal speed, the classic ASP data access that goes via the COM+ component is extremely slow. Unfortunately, this doesn't seem to be throwing any errors anywhere, not in the IIS logs and not in Event Viewer.
The COM+ component is instantiated in a standard fashion, and this is done in an include file that is referenced on any ASP page needing data access, e.g.
var objDataProvider = Server.CreateObject("DataAccess.DataProvider");
The ASP pages then use methods in the COM+ to execute queries such as;
objDataProvider.Query(...)
objDataProvider.Exec(...)
I have enabled failed request tracing in IIS for any ASP pages that are taking longer than 20s to process and can see that the majority of the calls are dealing with record sets as seen in the examples below.
if(rsri.EOF)
and
theReturn = rsData(0);
The two examples above both take over 9s to run. Profiling the SQL that runs in the background shows it completes in a matter of milliseconds, which suggests the problem lies with the COM+ component returning the data back to the ASP, or with the ASP being slow to process it.
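One hedged check, not from the original post: if the time is being spent handing rows back to the client rather than inside the query, sessions from the ASP application should show ASYNC_NETWORK_IO waits while a slow page is running:

SELECT r.session_id, r.wait_type, r.wait_time, r.total_elapsed_time
FROM sys.dm_exec_requests AS r
WHERE r.wait_type = N'ASYNC_NETWORK_IO';
-- High ASYNC_NETWORK_IO here points at the COM+/ASP side consuming the
-- recordset slowly, not at the query itself.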
Windows Server 2008 R2 SP1 (64 bit)
Intel Xeon CPU
2GB RAM
Has anyone experienced a similar situation to this before or able to point me in the direction of any further diagnostics that might help?
After days of messing around with this, I've finally found the answer and it's the simplest of issues solved by going back to basics.
The web server hosting the application was in a DMZ and required a hosts file entry for the SQL server. Once this was added, the application was flying along again.
You'll have to go deeper by yourself. I have used Ants Performance Profiler before to improve the performance of .NET applications. It seems it can profile COM+ as well.
http://www.red-gate.com/products/dotnet-development/ants-performance-profiler/walkthrough
This or some other similar profiling tool is the only option I see to solve your problems.
I'm having the same issue. I have migrated an IIS server from Windows 2003 / SQL Server 2005 Express to Windows 2012 R2 / SQL Server 2014 Express, and the ASP scripts are twice as slow.
The SQL queries inside execute at the same or better speed, and I tested the server locally to rule out internet bandwidth problems. I think it may be a DNS issue.
What change did you make in the hosts file? And what SQL connection string or connection type did you use?
Thanks

ADO.NET database access

I have written a program in VB.NET, and one of the things this program does is insert records into a Microsoft Access database. The backend of my program that accesses the database is written as an interchangeable layer. If I "swap" this layer out for a layer that uses a Microsoft SQL Server database, my program flies. If I use MS Access, it's still pretty quick, but it is much slower. Does anyone have any hints or tips on how to speed up ADO.NET transactions using Microsoft Access? I would really rather use MS Access over SQL Server so that I can distribute my database with my program (rather than connecting to some remote SQL Server). Any suggestions? Also, when I created the MS Access database, I created it in Access 2000 compatible mode. Would it be faster to use 2003 compatible mode?
Thanks in advance
Although you need to install it, SQL Server Express supports "XCopy file deployment" where all you need to do to deploy the application is ship an .mdf file and your executables.
Details are here on MSDN.
This does support stored procedures: I've used it in our unit tests to dynamically create a mocked-out database on the fly.
Access is, as you're experiencing, less than optimal.
Have you taken a look at SQL Server Compact Edition? It can be embedded and distributed with your application, and should perform much better than Access.
SQL Server Compact 3.5 will give you the same benefit - a single database file that you can deploy and distribute (as long as you include the runtime assemblies in your app).
It has reduced query capabilities compared to a full SQL Server instance, but it is definitely faster than the Access engine.
I have used it with a mobile app that has a desktop component and it did everything I needed it to do.
Did you also have the Access backend open in Access at the same time? If so, try your program without having it open. If that speeds things up, then you should open either a database connection or a recordset (against a table with few records) and leave it open while processing the data.
The problem is that if you open and close objects or recordsets against an Access database file while someone else is in the Access database file, Jet wastes a lot of time taking locks against the LDB file. So keeping a permanent connection to the Access database file solves this problem.
In my experience, ADO.NET is not particularly optimized for MS Access. Using the older ADO or DAO interfaces (which are available in VB.NET via COM) can bring performance improvements of about a factor of 20 or more in some cases. But it all depends a lot on what SQL statements your program actually runs (lots of batch updates/inserts, lots of queries with large result sets, or lots of interactive load-transform-store cycles).
The MSDN features an Article on how to speed up ADO.NET: http://msdn.microsoft.com/en-us/library/ms998569.aspx
Even though the article is a bit dusty, it still makes a few good points :)
Other than that, using MS Access myself, I found that a few techniques such as caching data, selecting without the source schema, or optimizing queries are suitable to keep the performance at a halfway decent level.

Performance problems with SQL Server Management Studio

I'm running SQL Server Management Studio 2008 on a decent machine. Even if it is the only thing open, with no other connections to the database, anything that has to do with the database diagram designer or simple schema changes in a designer takes up to 10 minutes to complete, and SQL Server Management Studio is unresponsive during that time. The same SQL code takes less than a second. This entirely defeats the purpose of the designers and diagrammers.
------------------
System Information
------------------
Operating System: Windows Vista™ Ultimate (6.0, Build 6001) Service Pack 1 (6001.vistasp1_gdr.080917-1612)
Processor: Intel(R) Core(TM)2 Quad CPU Q6700 @ 2.66GHz (4 CPUs), ~2.7GHz
Memory: 6142MB RAM
Please tell me this isn't a WOW64 problem; if it is, I love MS, but step up your 64-bit support in development tools.
Is there anything I can do to get the performance anywhere near acceptable?
Edit:
I've got version 10.0.1600.22 of SQL Server Management Studio installed. Is this not the latest release? I'm sure I installed it from an MSDN CD and I pretty much rely on Windows Update these days. Is there any place I can quickly see what the latest release version number is for tools like this?
Edit:
Every time I go to open a database diagram I get the message "This database does not have one or more of the support objects required to use database diagramming. Do you wish to create them?" I say yes every time. Is this part of the problem? Also, if I press the copy icon, I get the message "Current thread must be set to single thread apartment (STA) mode before OLE calls can be made." Database corruption?
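One hedged check for that repeated diagram-support prompt: it is sometimes caused by the database owner not resolving to a valid login (for example after a restore), so it may be worth confirming the owner and whether the support objects were actually created (the database name in the comment is hypothetical):

-- Does the owner SID resolve to a real login, and does dbo.sysdiagrams exist?
SELECT name, SUSER_SNAME(owner_sid) AS owner_login
FROM sys.databases
WHERE name = DB_NAME();

SELECT OBJECT_ID(N'dbo.sysdiagrams') AS sysdiagrams_object_id;

-- If the owner does not resolve, setting a valid one often stops the prompt:
-- ALTER AUTHORIZATION ON DATABASE::YourDatabase TO sa;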
I'm running in a similar environment and not having that problem.
As with any performance problem, you'll have to analyze it a bit. Just saying "it takes 10 minutes" gives no information about why it takes so long, and therefore nothing you can use to solve the problem.
Here are some tools to play around with; "play around" is about all I've learned to do with them, but I'd recommend trying to learn a little more about them than I have. http://technet.microsoft.com is a good source on performance issues.
Start with Task Manager, believe it or not. It's been enhanced in Vista and Server 2008, and now has a better Performance tab, and a Services tab. Be sure to click "Show processes from all users", or you'll miss nasty things done by services.
The bottom of the Performance tab has a "Resource Monitor" button. Click it, watch it, learn what it can do for you.
The Resource Monitor is actually part of a larger "Reliability and Performance Monitor" tool in Administrative Tools. Try it. It even includes the new version of perfmon, which will be more useful when you have a better idea what counters to look at.
I will also suggest the Process Explorer and Process Monitor tools from Sysinternals. See http://technet.microsoft.com/en-us/sysinternals/default.aspx.
Do your simple schema changes possibly mean that you're reordering the columns of a table?
In that case, what SQL Management Studio does behind the scenes is create a new table, move all the data from the old table to the newly created table, and then drop the old table.
Thus, if you reorder columns on a table with lots of data, lots of indices or both, you CAN incur a massive amount of "reorganization" work without really realizing it.
Marc
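For illustration, the change script the designer generates for a column reorder looks roughly like this (table and column names are hypothetical), which is why it can take minutes on a large table:

BEGIN TRANSACTION;
CREATE TABLE dbo.Tmp_Orders
(
    OrderId    INT   NOT NULL,
    CustomerId INT   NOT NULL,
    Amount     MONEY NULL
);
IF EXISTS (SELECT 1 FROM dbo.Orders)
    INSERT INTO dbo.Tmp_Orders (OrderId, CustomerId, Amount)
    SELECT OrderId, CustomerId, Amount
    FROM dbo.Orders WITH (HOLDLOCK TABLOCKX);
DROP TABLE dbo.Orders;
EXECUTE sp_rename N'dbo.Tmp_Orders', N'Orders', 'OBJECT';
COMMIT;
-- Indexes, constraints and foreign keys then have to be recreated as well.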
Can you try connecting your SQL Management Studio to a different instance of SQL Server or, better, an instance on a remote machine (and try to make similar changes)?
Are there any entries in the System or Application Event Logs (or SQL logs for that matter)? Have you tried uninstalling and reinstalling SQL Server on your machine? What version of SQL Server (database) are you running?
Lastly, can you open the Activity Monitor successfully? Right click on the server (machine name) - top of the three in the object explorer window - and click on 'Activity Monitor'.
Do you have problems with other software on your machine or only with SQL Server & Management Studio?
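If Activity Monitor itself is sluggish, a hedged T-SQL alternative that shows roughly the same information about running requests and blocking:

SELECT r.session_id, r.status, r.command, r.wait_type, r.blocking_session_id
FROM sys.dm_exec_requests AS r
WHERE r.session_id <> @@SPID;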
When you open SSMS it attempts to validate itself with Microsoft. You can speed this process by performing the second of the recommendations at the following link.
http://www.sql-server-performance.com/faq/sql_server_management_studio_load_time_p1.aspx
Also, are you using the registered servers feature? If so, SSMS will attempt to validate all of these.
It seems as though it was a network configuration problem. Never trust a developer (myself) to set up a haphazard domain at his office.
The DNS setting on my computer pointed to my ISP's DNS server (the default, because the ISP-provided wireless router we're using doesn't allow me to override the DNS server with my own) instead of my local DNS server, so I have to remember to configure it manually on each computer, which I forgot to do for this particular computer.
I only discovered it when I tried to connect for the first time to a remote SQL Server instance from this PC. It was trying to resolve an actual sub-domain of mycompany.com instead of my DNS server's authority of COMPUTERNAME.corp.mycompany.com.
I can't say why this was an issue for the designers in SQL Server but not anything else, but my only hypothesis is that when I established a connection to my own computer locally using the computer name instead of "." or "localhost", SQL queries executed immediately, knowing it was local, but the designers still waited for a timeout from the external IP address before trying the local one.
Whatever the explanation is, changing my DNS server for my network card on the local machine to my DNS server's IP made it all work very quickly.
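A hedged way to sanity-check this kind of name-resolution guess from the affected machine: see which transport and addresses the current connection actually ended up using:

SELECT session_id, net_transport, local_net_address, client_net_address
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;
-- A local connection over shared memory shows NULL addresses; an external IP
-- here suggests the name resolved somewhere unexpected.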
I had a similar issue with mine. Turned out to be some interference with the biometrics login service running on my laptop. Disabled the service and now it works.
