SQL Server does not use the assigned memory

We use
Microsoft SQL Server 2008 R2 (RTM) - 10.50.1600.1 (X64) Apr 2 2010 15:48:46 Copyright (c) Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 6.2 (Build 9200: ) (Hypervisor)
The machine has 70 GB of memory.
The SQL Server has
minimum Server memory: 20480 MB
maximum Server memory: 51200 MB
But when I open Resource Monitor and check the memory of the sqlservr.exe process, I see that the committed memory is about 51 GB, while the working set and private memory are only about 1 GB.
The SQL Server is under full load and has been running for three months without a restart.
The page life expectancy is 14,146 s (about 4 hours).
For testing purposes, I selected from a table with 3.5 million rows (storage size: 4,600 MB), but the values for working set and private memory in Resource Monitor did not change.
Now my questions:
Is there something wrong, or are the Resource Monitor values simply not correct?
If the values are wrong, where can I get the real memory usage?
If the values are right, where should I start to solve the problem or get more information?
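For reference, SQL Server's own DMVs report memory usage that Resource Monitor can miss: when the Locked Pages in Memory privilege is in use, buffer pool memory is allocated outside the normal working set, so a tiny working set next to a large committed size is expected. A minimal sketch of a check from inside the instance (these DMVs exist on SQL Server 2008 R2 and later):

    -- Physical memory in use by the instance, including locked pages
    SELECT physical_memory_in_use_kb / 1024  AS physical_memory_in_use_mb,
           locked_page_allocations_kb / 1024 AS locked_page_allocations_mb,
           memory_utilization_percentage
    FROM   sys.dm_os_process_memory;

    -- How much memory SQL Server currently holds vs. wants
    SELECT counter_name, cntr_value / 1024 AS value_mb
    FROM   sys.dm_os_performance_counters
    WHERE  counter_name IN (N'Total Server Memory (KB)', N'Target Server Memory (KB)');

If locked_page_allocations_kb is large, the small working set shown by Resource Monitor is normal rather than a sign of a problem.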

Related

Enable FILESTREAM on SQL Server 2017

I am trying to enable FILESTREAM on Microsoft SQL Server 2017, on a Windows Server 2019 Core installation.
Since there is no GUI, I changed the registry setting at the following location to enable it, setting the value to 3:
ServerName\HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL14.InstanceName\MSSQLServer\Filestream.EnabledLevel
After this I restarted.
I have also installed the latest Cumulative Update (CU20), as I know there were issues with driver signing.
After this I restarted again.
I am seeing the following errors in the log file at C:\Program Files\Microsoft SQL Server\MSSQL14.InstanceName\MSSQL\Log:
<{781FAE78-09AB-4EE5-B051-67747BDB19E3}>RsFxMgmtInitialize failed (the RsFx device is not found. Check if RsFx driver is started.): Error 0x80070002 (-2147024894)
2020-04-21 08:56:03.94 spid5s FILESTREAM: failed to connect to kernel driver RsFx0503.
<{1038F43D-3391-45F7-B1B3-BADF26459429}>Failed to initialize CFsaShareFilter: Error 0x80070002 (-2147024894)
<{1038F43D-3391-45F7-B1B3-BADF26459429}>Failed to initialize CFsaShareFilter: Error 0x80070002 (-2147024894)
2020-04-21 08:56:03.94 spid5s FILESTREAM: effective level = 0, configured level = 2.
2020-04-21 08:56:03.94 spid5s FILESTREAM feature could not be initialized. The operating system Administrator must enable FILESTREAM on the instance using Configuration Manager.
SELECT @@VERSION
Microsoft SQL Server 2017 (RTM-CU20) (KB4541283) - 14.0.3294.2 (X64) Mar 13 2020 14:53:45 Copyright (C) 2017 Microsoft Corporation Standard Edition (64-bit) on Windows Server 2019 Standard 10.0 <X64> (Build 17763: ) (Hypervisor)
I got this working on one server but cannot get it working on the second one.
There is also an instance of SQL Server 2019 on this box.
Any help? Also, any advice on getting this working on a cluster?
I used "computer management" and remotly connect to your core server. From there, you can enable FileStream and configure it.
This seemed to work better that the registry method
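Note that the Windows-level FILESTREAM setting (what Configuration Manager or Computer Management toggles) is separate from the instance-level access level, which is configured in T-SQL. A minimal sketch of the instance-level part:

    -- 0 = disabled, 1 = T-SQL access only, 2 = T-SQL plus Win32 streaming access
    EXEC sp_configure 'filestream access level', 2;
    RECONFIGURE;

The instance-level setting only takes effect once the operating-system side is enabled, which is why the log above reports "effective level = 0, configured level = 2".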

Too slow ALTER INDEX REORGANIZE/REBUILD (with ONLINE)

I have 2 similar SQL Server installations on 2 similar GCP projects.
Everything is the same: configuration, CPU, RAM, disk drive layout, and a similar (but not identical) database with similar data and workload.
When I run ALTER INDEX [IndexName] ON dbo.TableName REBUILD WITH (ONLINE = ON) on the 1st server, it takes about 30 minutes to rebuild the whole index. On the 2nd server it ran for more than 3.5 hours before I stopped the rebuild.
All disk metrics (throughput, queue length, etc.) look reasonable. The rebuild is being performed during a night maintenance window with no significant transaction load on the server.
My question is: how can I "debug" the rebuild process to see what is going on, and why two similar databases on two similar servers behave so differently? Is there any trace flag, extended event, etc. that may help to investigate the problem?
Microsoft SQL Server 2016 (SP2) (KB4052908) - 13.0.5026.0 (X64)
Mar 18 2018 09:11:49
Copyright (c) Microsoft Corporation
Enterprise Edition: Core-based Licensing (64-bit) on Windows Server 2016 Datacenter 10.0 (Build 14393: ) (Hypervisor)
26 CPUs / 20 GB of RAM
Index size before rebuilding: ~110 GB; after rebuilding: ~30 GB
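Extended Events can expose the rebuild's progress directly. A minimal sketch of a session around the progress_report_online_index_operation event (the session and database names are placeholders):

    -- Capture progress reports from online index operations into a ring buffer
    CREATE EVENT SESSION [IndexRebuildProgress] ON SERVER
    ADD EVENT sqlserver.progress_report_online_index_operation
        (WHERE (sqlserver.database_name = N'YourDatabase'))
    ADD TARGET package0.ring_buffer;
    GO
    ALTER EVENT SESSION [IndexRebuildProgress] ON SERVER STATE = START;

Watching the progress counts this event emits on both servers should show at which stage the slow rebuild stalls; comparing the waits of the rebuilding session in sys.dm_exec_requests is another cheap first check.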

Importing into SQL Server directly from Zipped File

I have a simple four-column text file with billions of rows (say, Huge_File.txt). Compressed, Huge_File.txt is around 1 TB (roughly an 11% compression ratio); after unzipping, the size will balloon up to 10 TB or so.
Is there a way to load Huge_File.txt without unzipping it (space issues on the server)? Something similar to how Redshift imports a zipped file from S3 without unzipping it.
P.S. I am using SQL Server
Version : Microsoft SQL Server 2016 (SP1) (KB3182545) - 13.0.4001.0 (X64)
Oct 28 2016 18:17:30
Copyright (c) Microsoft Corporation
Enterprise Edition: Core-based Licensing (64-bit) on Windows Server 2016 Standard 6.3 (Build 14393: )
Any help is appreciated. Thank you.
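SQL Server 2016 has no built-in way to bulk-load directly from a zip archive, so any approach will involve decompressing as a stream outside the engine (for example, extracting in chunks to a staging path). For the load side itself, a minimal BULK INSERT sketch, with hypothetical table and staging-file names:

    -- Load the staged (uncompressed) text file in committed batches
    BULK INSERT dbo.HugeStaging
    FROM 'D:\staging\Huge_File.txt'
    WITH (FIELDTERMINATOR = '\t',    -- adjust to the file's actual delimiter
          ROWTERMINATOR  = '\n',
          BATCHSIZE      = 1000000,  -- commit every 1M rows to keep the log manageable
          TABLOCK);

Note that the DECOMPRESS() function added in SQL Server 2016 only handles GZIP payloads already stored in a varbinary value, so it does not help with a multi-terabyte archive on disk.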

SQL Server - Plan cache cleared very often

There are some performance issues on our SQL Server. When we began to analyze them, we found several problems, including that the plan cache is being cleared very often for no apparent reason (5-10 times per hour).
We also used the sp_BlitzFirst script for analysis, and it also showed that we have a problem: "Plan Cache Erased Recently".
However, we do not have any jobs that could perform such clearing, and no one clears it manually, either.
We would like to know what the reasons for this behavior might be.
Microsoft SQL Server 2012 - 11.0.2100.60 (X64) Feb 10 2012 19:39:15
Enterprise Edition: Core-based Licensing (64-bit) on Windows NT 6.2 <X64> (Build 9200: ) (Hypervisor)
Total RAM: 32 GB
SQL Server RAM: 29 GB
Average RPS (requests per second): ~250
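A full plan cache flush has a limited set of known triggers: memory pressure, RECONFIGURE after certain sp_configure changes (max server memory among them), explicit DBCC FREEPROCCACHE / FREESYSTEMCACHE calls, and some database-level operations such as detach or auto-close. A sketch of a query to check for memory-pressure notifications, which SQL Server records in a ring buffer:

    -- Recent resource-monitor notifications; frequent RESOURCE_MEMPHYSICAL_LOW
    -- entries around the flush times point to memory pressure
    SELECT x.record.value('(/Record/@time)[1]', 'bigint') AS record_time,
           x.record.value('(/Record/ResourceMonitor/Notification)[1]', 'varchar(64)') AS notification
    FROM (SELECT CAST(record AS xml) AS record
          FROM sys.dm_os_ring_buffers
          WHERE ring_buffer_type = N'RING_BUFFER_RESOURCE_MONITOR') AS x
    ORDER BY record_time DESC;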

How to get SSAS to Use More Memory

I'm trying to process a cube on a development server, which processes data from a different server. The process took a long time to run the first time, so I figured it was partly because the development server only had 4 GB of RAM. So I bumped it up to 20 GB of RAM, hoping to see some improvement in performance.
However, when I checked perfmon, I noticed that total memory usage would not go beyond 4 GB, even though I now have 20 GB.
How do I get SSAS to use more RAM?
Is there something else I should do after installing RAM? I know it's recognized, and the computer as a whole is actually working better.
Some info:
SQL Server version: Microsoft SQL Server 2014 (SP2) (KB3171021) - 12.0.5000.0 (X64) Developer Edition (64-bit) on Windows NT 6.1 (Build 7601: Service Pack 1)
Windows version: Windows 7 pro 64-bit
Visual Studio version: Community 2015
Here's a screenshot of the memory usage. At this time, the current step it's running is "Processing Partition 'MyCube' - In Progress - 450000 of 100."
Here's a screenshot of the SSAS server settings:
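For reference, the SSAS memory limits (LowMemoryLimit, TotalMemoryLimit, and HardMemoryLimit, set in msmdsrv.ini or in the server properties dialog) are interpreted as a percentage of total physical RAM when the value is 100 or less, and as an absolute number of bytes when it is larger. For example, TotalMemoryLimit = 80 on the upgraded machine allows roughly 0.80 × 20 GB = 16 GB, whereas a limit that was entered as an absolute byte count back when the server had 4 GB would keep the old ceiling in place even after adding RAM.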
