How to release memory of SQL Server? - sql-server

I have a server that saves 1,000 records every minute. I wrote a trigger that does a series of jobs for every row, and it uses a lot of table variables and other variables. Now my SQL Server memory usage grows very fast and I can't manage it.
I set min server memory and max server memory, but that didn't help: when I lowered max server memory, SQL Server performance dropped and the save process was interrupted.
I consulted these resources, but they didn't help:
Great SQL Server Debates: Lock Pages in Memory
How to: Set a Fixed Amount of Memory (SQL Server Management)
I have to restart the SQL Server service every 4 hours, but that is not a proper solution.
How to release memory of SQL Server?

Related

SSIS Server Maintenance Job ends up using all of server memory

For some reason, the SSIS Server Maintenance Job ends up having the SQL Server instance use all available server memory after a few runs (it runs every midnight). When that happens, my SSIS packages no longer have memory to run in and start swapping to disk, which leads to unacceptable execution times or, at worst, a total hang.
So far I've been restarting the SQL Server service through Configuration Manager every morning, but that's not a viable long-term solution. I have not set a maximum memory limit for the SQL Server instance. Would that help? If not, what can I do?
Server information: Azure VM, 32 GB RAM, no other purpose for the server than running SSIS.
You should always set a maximum memory limit for SQL Server instances.
A simple rule of thumb is to leave 4GB or 10% of total memory free, whichever is larger, and tweak as necessary.
If your SQL Server instance is running as a VM, then you also need to set a memory reservation at the host for your VM. Otherwise, the host's 'balloon memory manager' might kick in and steal memory back from your instance.
Ref:
Server memory configuration options
Understanding Memory Resource Management in VMware
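As a rough sketch of the rule of thumb above (leave the larger of 4 GB or 10% of total memory free), you could compute a candidate max server memory value from the RAM that SQL Server itself reports. This is illustrative only and assumes SQL Server 2008 or later, where sys.dm_os_sys_memory is available:
DECLARE @total_mb bigint, @reserve_mb bigint;

-- Total physical RAM as seen by SQL Server, in MB
SELECT @total_mb = total_physical_memory_kb / 1024
FROM sys.dm_os_sys_memory;

-- Reserve the larger of 4 GB or 10% of total RAM for the OS and other processes
SET @reserve_mb = CASE WHEN @total_mb / 10 > 4096 THEN @total_mb / 10 ELSE 4096 END;

SELECT @total_mb                AS total_physical_memory_mb,
       @reserve_mb              AS reserve_for_os_mb,
       @total_mb - @reserve_mb  AS candidate_max_server_memory_mb;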

SQL Server is not releasing memory after Daily Load

We have SQL Server 2012 Enterprise Edition, 128 GB of RAM, Windows 2008 R2. A SQL Server job runs every day at 3 AM and takes 5 hours to load data into the database. During this process, SQL Server utilizes 123 GB (the max memory allocated).
After the job completes, SQL Server is not releasing the RAM.
I queried memory utilization; the buffer pool shows 97 GB. Users don't access the database during this time. I restarted the SQL Server services to bring RAM usage down, but I didn't find a proper answer to this issue. Why is it not releasing the RAM? How can we bring RAM utilization down?
SQL Server Job -> SSIS package -> Import data from MySQL to SQL Server database
Thanks
This is by design: once SQL Server acquires memory, it holds on to it and does not release it back to the OS.
Task Manager may show all or nearly all memory used by SQL Server, but if you want to see how much memory SQL Server is actually using, you can use the following query.
SELECT (physical_memory_in_use_kb/1024) AS Memory_usedby_Sqlserver_MB
FROM sys.dm_os_process_memory;
By design, SQL Server holds on to the RAM that it has allocated. Much of the RAM is used for the buffer pool. The buffer pool is a cache that holds database pages in memory for fast retrieval.
If SQL Server were to release some memory, and someone were to run a query that requests it right afterwards, the query would have to wait for expensive physical I/O to produce the data. Therefore, SQL Server tries to hold as much memory as possible (and as configured) for as long as possible.
The RAM settings here specify the min server memory and the max server memory. Careful setting of the max memory setting allows room for other processes to run. The article quotes a complicated formula for determining how much room to leave:
From the total OS memory, reserve 1-4 GB for the OS itself.
Then subtract the equivalent of potential SQL Server memory allocations
outside the max server memory control, which is comprised of stack
size * calculated max worker threads + the -g startup parameter (or
256 MB by default if -g is not set). What remains should be the
max_server_memory setting for a single instance setup.
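As a worked example of that formula, with purely illustrative numbers (not taken from any question here): a 64-bit server with 128 GB of RAM and 16 logical CPUs, a 4 GB OS reservation, the default 2 MB thread stack, the default max worker threads of 512 + (16 - 4) * 16 = 704, and the default 256 MB for -g:
-- 131072 MB total RAM
-- minus 4096 MB reserved for the OS
-- minus 704 worker threads * 2 MB stack = 1408 MB
-- minus 256 MB for -g (default)
SELECT 131072 - 4096 - (704 * 2) - 256 AS candidate_max_server_memory_mb;  -- = 125312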
In our servers, we usually just wing it and set the max memory option to several GB below the total physical memory. This leaves plenty of room for the OS and other applications.
If SQL Server memory is over the min server memory, and the OS is under memory pressure, SQL Server can release memory until it is at the min server memory setting.
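To see what the instance is currently configured with, you can query sys.configurations, for example:
SELECT name, value_in_use
FROM sys.configurations
WHERE name IN ('min server memory (MB)', 'max server memory (MB)');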
Reference: Memory Management Architecture Guide.
One of the primary design goals of all database software is to
minimize disk I/O because disk reads and writes are among the most
resource-intensive operations. SQL Server builds a buffer pool in
memory to hold pages read from the database. Much of the code in SQL
Server is dedicated to minimizing the number of physical reads and
writes between the disk and the buffer pool. SQL Server tries to reach
a balance between two goals:
Keep the buffer pool from becoming so big that the entire system is low on memory.
Minimize physical I/O to the database files by maximizing the size of the buffer pool.
When SQL Server is using memory dynamically, it queries the system
periodically to determine the amount of free memory. Maintaining this
free memory prevents the operating system (OS) from paging. If less
memory is free, SQL Server releases memory to the OS. If more memory
is free, SQL Server may allocate more memory. SQL Server adds memory
only when its workload requires more memory; a server at rest does not
increase the size of its virtual address space.
...
As more users connect and run queries, SQL Server acquires the
additional physical memory on demand. A SQL Server instance continues
to acquire physical memory until it either reaches its max server
memory allocation target or Windows indicates there is no longer an
excess of free memory; it frees memory when it has more than the min
server memory setting, and Windows indicates that there is a shortage
of free memory.
As other applications are started on a computer running an instance of
SQL Server, they consume memory and the amount of free physical memory
drops below the SQL Server target. The instance of SQL Server adjusts
its memory consumption. If another application is stopped and more
memory becomes available, the instance of SQL Server increases the
size of its memory allocation. SQL Server can free and acquire several
megabytes of memory each second, allowing it to quickly adjust to
memory allocation changes.
If, for some reason:
You absolutely MUST have that memory back
You know you do not need it for a while
You are willing to pay a penalty for virtual memory allocation and physical I/O to retrieve data from disk the next time you need that memory
Then you can temporarily reconfigure the max server memory setting to a lower value. This can be done through the SSMS user interface, or you can use sp_configure 'max server memory' followed by RECONFIGURE to make the change programmatically.
Full disclosure: I did not try it myself.
You should not try it on your production environment before testing it somewhere else.
This is from a DBA answer:
sp_configure 'show advanced options', 1;
GO
RECONFIGURE;
GO
sp_configure 'max server memory', 4096;
GO
RECONFIGURE;
GO
4096 should be replaced by the value that you find acceptable as the minimum.
This should be followed by a similar command to raise the memory back to your original maximum.
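For example (the 120000 below is only a placeholder; use whatever your original maximum was, in MB):
sp_configure 'max server memory', 120000;  -- placeholder: your original value in MB
GO
RECONFIGURE;
GO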

SQL Server does not consume memory up to the max limit

Here is the situation:
We have 2 TB of memory installed on the server, and 'max server memory (MB)'
is set to 1800000 MB (we also tried the default limit, 32 TB). The memory available in Task Manager is always 1.7 TB; it seems SQL Server will not consume memory beyond a certain value.
As far as I know, SQL Server normally keeps consuming memory.
We have a data warehouse with multiple data marts that we fully process every hour. Some cube fact tables have about 100 million records with 20-30 columns.
I want to be sure that SQL Server and Analysis Services can use memory without limitation. Do they simply not need more than 300 GB? I installed a Windows application for consuming memory; that application can consume memory, but SQL Server will not. Is there any test scenario I can run to verify whether SQL Server can use memory up to 1 TB?
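One quick check (a sketch, assuming SQL Server 2012 or later, where these columns exist) is to compare how much memory the instance has committed with its commit target; if committed stays far below the target, the workload is simply not asking for more memory:
SELECT committed_kb / 1024        AS committed_mb,
       committed_target_kb / 1024 AS committed_target_mb
FROM sys.dm_os_sys_info;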

How to reduce Physical Memory usage in SQL Server

I have a server with 32 GB of physical memory. When I start the server, it takes 18 GB of memory once the server and SQL Server 2008 R2 are up. But after a few hours SQL Server is taking 23 GB or more, with the cached size going to 4939 or more. What is the cause of this problem, and how can I see which queries are causing it?
SQL Server loves memory; it will use what it needs to, especially when caching data. The very nature of caching data is using memory. If you're concerned about leaving some memory for the OS or other processes running on your server, then set a max memory amount for SQL Server.
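As a starting point for seeing where the memory goes, a minimal sketch is to check which databases the cached pages in the buffer pool belong to (each page is 8 KB); for query-level detail you would then move on to the plan cache and query DMVs:
SELECT DB_NAME(database_id)  AS database_name,
       COUNT(*) * 8 / 1024   AS buffer_pool_mb
FROM sys.dm_os_buffer_descriptors
GROUP BY database_id
ORDER BY buffer_pool_mb DESC;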

SQL Server 2014 Standard Edition slows the machine when database size grows

I have a scenario where an application server saves 15k rows per second into a SQL Server database. For the first few hours the machine is still usable, but once the database size grows to roughly 20 GB, the machine becomes unusable.
I saw some topics/forums/answers/blogs suggesting to limit the max memory usage of SQL Server. Any thoughts on this?
Btw, I am using SqlBulkCopy to insert the rows into the database.
I have two suggestions for you:
1 - Database settings:
When you create the database, use a large initial size, and consider a larger autogrowth increment (size or percentage); see the sketch after these suggestions.
You will want to minimize how often your filegroups need to grow.
2 - Server settings:
In your SQL Server settings, I would recommend removing one logical processor from SQL Server (via the processor affinity settings). The OS will use this processor when SQL Server is busy with heavy loads on the other processors. In my experience, this usually gives a nice boost to the OS.
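For suggestion 1, a hypothetical sketch of pre-sizing the data file and using a fixed autogrowth increment (the database name MyDb and logical file name MyDb_data are placeholders for your own):
-- Pre-size the data file and grow in fixed 4 GB steps instead of small increments
ALTER DATABASE MyDb
MODIFY FILE (NAME = MyDb_data, SIZE = 50GB, FILEGROWTH = 4GB);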
