Two questions on RAM upgrades and SQL Server's memory usage:
Does upgrading RAM automatically increase the memory available to SQL Server, or does it have to be manually configured via sp_configure?
Can we find out what SQL Server was using before configuring it manually through sp_configure? I could have captured the values beforehand with SELECT object_name, cntr_value FROM sys.dm_os_performance_counters WHERE counter_name IN ('Total Server Memory (KB)', 'Target Server Memory (KB)'), but I didn't.
So, is there any way to find the historical config?
You can't get the historical configuration values from any standard tables (apart from something pretty esoteric).
Whether or not SQL Server will automatically use all available memory depends on the max server memory configuration. It's designed to use whatever memory is available, up to the configured maximum, while leaving some memory for the operating system.
If the operating system flags that it's running low on memory, SQL Server will release memory from its process when necessary. Usually it's pretty graceful, but the OS can also urgently require memory, and SQL Server will respond to that too. It starts dumping pages rapidly, and you'll usually see that in the SQL Server error log.
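To make the first question concrete: if 'max server memory' was left at its default (2147483647 MB, i.e. effectively unlimited), SQL Server will grow into newly added RAM on its own; if a cap was set, it has to be raised manually. A sketch, where the 28672 value is purely illustrative:

```sql
-- Show the current cap ('max server memory (MB)' is an advanced option).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)';

-- Raise the cap after a RAM upgrade (value in MB; 28672 = 28GB, illustrative).
EXEC sp_configure 'max server memory (MB)', 28672;
RECONFIGURE;
```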
Related
I understand from a search here that SQL Server 2012 will continue to use memory until it meets the limit set for it, but the usage I see is hard to believe. The database is about 26MB at the moment, but the memory usage is over 30GB. Is this to be expected or is there some other problem lurking somewhere?
There is nothing wrong. The memory allocated to SQL Server can be changed to almost any value you'd like, and 30GB is certainly overkill for a 26MB database if it is the only one on the instance. The memory allocated to SQL Server serves numerous functions: sorting query results, caching execution plans and data pages, and so on. The 30GB you're seeing means that 30GB of your system memory is reserved for SQL Server.
For a better understanding, you'll want to look into your target memory too. Target memory is how much memory is needed for SQL Server work, based on your configuration. In your case, I bet target memory equals max memory and SQL Server is trying to consume all the memory. Here is how you can check that:
SELECT *
FROM sys.dm_os_performance_counters
WHERE counter_name in ('Total Server Memory (KB)', 'Target Server Memory (KB)')
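A slightly more readable variant of the same query (a sketch; the DMV reports values in KB, converted to GB here). If Total is well below Target, SQL Server is still trying to grab more memory:

```sql
SELECT counter_name,
       cntr_value / 1024.0 / 1024.0 AS value_gb   -- KB -> GB
FROM sys.dm_os_performance_counters
WHERE counter_name IN ('Total Server Memory (KB)', 'Target Server Memory (KB)');
```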
More information from Brent Ozar:
So how much memory is SQL using? I’ll make this easy for you. SQL
Server is using all of the memory. Period.
No matter how much memory you put in a system, SQL Server will use all
it can get until it’s caching entire databases in memory and then
some. This isn’t an accident, and there’s a good reason for it. SQL
Server is a database: programmers store data in SQL Server, and then
SQL Server manages writing that data to files on the hard drive.
Programmers issue SELECT statements (yes, usually SELECT *) and SQL
Server fetches the data back from the drives. The organization of
files and drives is abstracted away from the programmers.
To improve performance, SQL Server caches data in memory. SQL Server
doesn’t have a shared-disk model: only one server’s SQLserver.exe can
touch the data files at any given time. SQL Server knows that once it
reads a piece of data from the drives, that data isn’t changing unless
SQL Server itself needs to update it. Data can be read into memory
once and safely kept around forever. And I do mean forever – as long
as SQL Server’s up, it can keep that same data in memory. If you have
a server with enough memory to cache the entire database, SQL Server
will do just that.
Why Doesn’t SQL Server Release Memory?
Memory makes up for a lot of database sins like:
Slow, cheap storage (like SATA hard drives and 1Gb iSCSI)
Programs that needlessly retrieve too much data
Databases that don’t have good indexes
CPUs that can’t build query plans fast enough
Throw enough memory at these problems and they go away, so SQL Server
wants to use all the memory it can get. It also assumes that more
queries could come in at any moment, so it never lets go or releases
memory unless the server comes under memory pressure (like if other
apps need memory and Windows sends out a memory pressure
notification).
By default, SQL Server assumes that its server exists for the sole
purpose of hosting databases, so the default setting for memory is an
unlimited maximum. (There are some version/edition restrictions, but
let’s keep things simple for now.) This is a good thing; it means the
default setting is covering up for sins. To find out if the server’s
memory is effectively covering up sins, we have to do some
investigation.
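One way to check whether your own instance is caching entire databases, as described above, is to ask the buffer pool what it holds per database (a sketch; requires the VIEW SERVER STATE permission, and a NULL database_name row corresponds to internal pages):

```sql
-- Buffer pool contents per database; each cached page is 8 KB.
SELECT DB_NAME(database_id) AS database_name,
       COUNT(*) * 8 / 1024 AS cached_mb
FROM sys.dm_os_buffer_descriptors
GROUP BY database_id
ORDER BY cached_mb DESC;
```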
Docs on SQL Server memory configuration: https://learn.microsoft.com/en-us/sql/database-engine/configure-windows/server-memory-server-configuration-options
Here is how you can set a fixed amount for Min/Max memory on SQL Server: https://technet.microsoft.com/en-us/library/ms191144(v=sql.105).aspx
We have an 8 Core, 16GB RAM server that has SQL Server 2008 running on it. When we perform large queries on millions of rows the RAM usage goes up to 15.7GB and then even file browsing, opening excel etc gets really slow.
So does SQL Server really release memory when another process needs it or am I having another issue? We don't have any other major programs running on this server.
We've set a max memory usage of 14GB for SQL Server.
Thanks all for any enlightenment or troubleshooting ideas.
Yes, it does. See SQLOS's memory manager: responding to memory pressure for details on how this works. But what exactly counts as 'memory pressure' varies from machine to machine and from OS version to OS version; see Q & A: Does SQL Server always respond to memory pressure?. If you want to reserve more memory for applications (I won't even bother asking why you browse files and use Excel on a machine dedicated to SQL Server...) then you should lower max server memory until it leaves enough for your other activities.
SQL Server does NOT release memory. It takes all the memory it can get, up to the max server memory setting, and it stays there.
I need some help very badly.
I'm working on a project where a bulk of data is entered all the time. It's a reporting software.
On average, 10 million records are stored per day, and that could keep increasing as users increase.
As of now, SQL Server consumes 5GB of RAM according to Task Manager. I have 8GB of RAM on my server.
How do other enterprises manage such situations?
SQL Server uses memory efficiently and takes as much as it can. It's also usually clever enough to release memory when needed.
Using 5GB means:
SQL Server is configured to 5GB or SQL Server has simply reserved this memory during normal usage
It's left 3GB because it doesn't need to use it
Nothing is wrong... and I'd probably configure the SQL Server max mem to 6.5GB...
Late addition: Jonathan Kehayias blog entry
SQL Server typically uses as much memory as it can get its hands on, since it stores the more frequently accessed data in memory to be more efficient; disk access is slower than memory access.
So nothing is wrong with it using 5GB of memory.
To be honest, it's leaving 3GB of memory for other applications and the operating system, so there might not be anything wrong with this (if this is all that server is designed to do).
To configure the memory limit, do the following:
In SQL Server Enterprise Manager, right-click on the server name and go to Properties.
Click on the Memory option
Reduce the maximum server memory to what you think is appropriate.
Click ok.
I highly doubt that this is in fact a memory leak. The increase of SQL Server's memory usage is by design, simply because it caches a lot of stuff (queries, procedures).
What you will most likely see is that if the remaining available memory runs low, SQL Server will 'flush' its memory, and you would in fact see memory being freed in the end.
I want to install SQL Server 2008 Express on my laptop, which has 1GB of memory, but my database contains lots of binary data that I don't want consuming all my RAM. I would much rather sacrifice SQL performance (make it page) in favor of other applications.
Is it possible to limit the memory footprint of sql server?
Look here and here.
Essentially: sp_configure 'max server memory', I think.
I've only got SQL Server 2005 Express, not 2008, but from SQL Server Management Studio Express, if I right-click on the root node in the tree (the server node) and select Properties, there's a "Memory" page with both minimum and maximum amounts of memory available to be set.
From the docs for these options:
Minimum server memory (in MB)
Specifies that SQL Server should start
with at least the minimum amount of
allocated memory and not release
memory below this value. Set this
value based on the size and activity
of your instance of SQL Server. Always
set the option to a reasonable value
to ensure that the operating system
does not request too much memory from
SQL Server and inhibit Windows
performance.
Maximum server memory (in MB)
Specifies the maximum amount of memory
SQL Server can allocate when it starts
and while it runs. This configuration
option can be set to a specific value
if you know there are multiple
applications running at the same time
as SQL Server and you want to
guarantee that these applications have
sufficient memory to run. If these
other applications, such as Web or
e-mail servers, request memory only as
needed, then do not set the option,
because SQL Server will release memory
to them as needed. However,
applications often use whatever memory
is available when they start and do
not request more if needed. If an
application that behaves in this
manner runs on the same computer at
the same time as SQL Server, set the
option to a value that guarantees that
the memory required by the application
is not allocated by SQL Server.
I'd be surprised if these options weren't in 2008, but you could always just install it and try.
You can do it w/ osql:
http://kb.hs-lab.com/content/7/113/en/how-to-limit-ram-usage-for-sql-2005-express-database.html
osql -E -S YOURSERVERNAME\PRINTLOGGER
sp_configure 'show advanced options',1
RECONFIGURE WITH OVERRIDE
GO
then
sp_configure 'max server memory',70 -- value in MB
RECONFIGURE WITH OVERRIDE
GO
You might also try giving cpu priority to your favored applications and letting SQL manage memory dynamically. It will release memory as needed by other apps, regardless of priority.
Hopefully you're not trying to run Visual Studio on that machine. It won't be much fun.
I have a development VM running SQL Server as well as some other apps for my stack, and I found that the other apps were performing awfully. After doing some digging, I found that SQL Server was hogging the memory. A quick web search revealed that by default it will consume as much memory as it can in order to cache data, giving it back to the system as other apps request it, but this process often doesn't happen fast enough. Apparently my situation is a common problem.
There is, however, a way to limit the memory SQL Server is allowed to have. My question is: how should I set this limit? Obviously I'm going to need to do some guess-and-check, but is there an absolute minimum threshold? Any recommendations are appreciated.
Edit:
I'll note that our developer machines have 2GB of memory, so I'd like to be able to run the VM on 768MB or less if possible. This VM will only be used for local dev and testing, so the load will be very minimal. After code has been tested locally it goes to another environment where the SQL Server box is dedicated. What I'm really looking for here is recommendations on minimums.
Extracted from the SQL Server documentation:
Maximum server memory (in MB)
Specifies the maximum amount of memory
SQL Server can allocate when it starts
and while it runs. This configuration
option can be set to a specific value
if you know there are multiple
applications running at the same time
as SQL Server and you want to
guarantee that these applications have
sufficient memory to run. If these
other applications, such as Web or
e-mail servers, request memory only as
needed, then do not set the option,
because SQL Server will release memory
to them as needed. However,
applications often use whatever memory
is available when they start and do
not request more if needed. If an
application that behaves in this
manner runs on the same computer at
the same time as SQL Server, set the
option to a value that guarantees that
the memory required by the application
is not allocated by SQL Server.
The recommendation on minimum is: no such thing. The more memory the better. SQL Server needs as much memory as it can get or it will thrash your I/O.
Stop SQL Server. Run your other applications and take note of the amount of memory they need. Subtract that from your total available RAM, and use that number for the max server memory setting in SQL Server.
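As a worked example of that subtraction (all numbers hypothetical): on a 16GB machine where the OS and other applications were observed to need about 4GB, the cap would be 12GB:

```sql
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- 16 GB total - ~4 GB for OS and other apps = 12 GB = 12288 MB.
EXEC sp_configure 'max server memory (MB)', 12288;
RECONFIGURE;
```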
Since this is a development environment, I agree with Greg, just use trial and error. It's not that crucial to get it perfectly right.
But if you do a lot of work in the VM, why not give it at least half of the 2GB?
so I'd like to be able to run the VM on 768MB or less if possible.
That will depend on your data and the size of your database. But I usually like to give SQL server at least a GB
It really depends on what else is going on on the machine. Get things running under a typical load and have a look at Task Manager to see what you need for everything else. Try that number to start with.
For production machines, of course, it is best to give control of the machine to SQL Server (Processors -> Boost SQL Server Priority) and let it have all the RAM it wants.
Since you are using VMs, maybe you could create a dedicated one just for SQL Server and run everything else on a different VM.