I'm trying to process a cube on a development server that pulls its data from a different server. The first processing run took a long time, and I figured that was partly because the development server only had 4 GB of RAM, so I bumped it up to 20 GB hoping to see some improvement in performance.
However, when I checked "perfmon" I noticed that total memory usage would not go beyond 4 GB of RAM even though I now have 20 GB.
How do I get SSAS to use more RAM?
Is there something else I should do after installing the RAM? I know it's recognized, and the machine as a whole is actually working better.
Some info:
SQL Server version: Microsoft SQL Server 2014 (SP2) (KB3171021) - 12.0.5000.0 (X64) Developer Edition (64-bit) on Windows NT 6.1 (Build 7601: Service Pack 1)
Windows version: Windows 7 pro 64-bit
Visual Studio version: Community 2015
Here's a screen shot of the memory usage. At this time, the current step it's running is "Processing Partition 'MyCube' - In Progress - 450000 of 100."
Here's a screen shot of the SSAS Server settings:
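For what it's worth, one way to see how much memory the SSAS instance itself reports using, independently of perfmon, is to query its memory DMVs from an MDX query window in SSMS connected to the Analysis Services instance. This is just a generic sketch, not something from the original post; run the statements one at a time:

-- memory currently tracked by the SSAS memory manager, per allocator
SELECT * FROM $SYSTEM.DISCOVER_MEMORYUSAGE

-- memory attributed to individual objects such as dimensions and partitions
SELECT * FROM $SYSTEM.DISCOVER_OBJECT_MEMORY_USAGE

The properties that cap this usage are Memory\LowMemoryLimit and Memory\TotalMemoryLimit in the Analysis Services server properties (values of 100 or less are interpreted as a percentage of total physical memory), visible in the server properties dialog referenced above.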
I have 2 similar SQL Server installations on 2 similar GCP projects.
Everything is the same - configuration, CPU, RAM, disk drive layout, similar (but not the same) database with similar data and workload.
When I run ALTER INDEX [IndexName] ON dbo.Tablename REBUILD WITH (ONLINE = ON) on the 1st server, it takes about 30 minutes to rebuild the whole index. On the 2nd server it ran for more than 3.5 hours before I stopped the rebuild.
All disk metrics (throughput, queue length, etc.) look reasonable. The rebuild is being performed during a night maintenance window with no significant transaction load on the server.
My question is: how can I "debug" the rebuild process to see what is going on and why two similar databases on two similar servers behave so differently? Is there any trace flag, extended event, etc. that might help investigate the problem?
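One general option (a sketch only; the session name, file name and database filter below are placeholders, not from the post) is an Extended Events session on the progress_report_online_index_operation event, which fires repeatedly while an online index operation runs and reports per-thread progress, so you can see whether the slow server is actually advancing and how the work is distributed:

-- placeholders: adjust the session name, database filter and target file path
CREATE EVENT SESSION TrackOnlineRebuild ON SERVER
ADD EVENT sqlserver.progress_report_online_index_operation
(
    ACTION (sqlserver.session_id, sqlserver.sql_text)
    WHERE (sqlserver.database_name = N'MyDatabase')
)
ADD TARGET package0.event_file (SET filename = N'TrackOnlineRebuild.xel')
WITH (MAX_MEMORY = 4096 KB, EVENT_RETENTION_MODE = ALLOW_SINGLE_EVENT_LOSS);
GO

ALTER EVENT SESSION TrackOnlineRebuild ON SERVER STATE = START;

While the rebuild runs, it is also worth comparing the waits of the rebuilding session on both servers via sys.dm_exec_requests and sys.dm_os_waiting_tasks; a clear difference in the dominant wait types usually points at the bottleneck.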
Microsoft SQL Server 2016 (SP2) (KB4052908) - 13.0.5026.0 (X64)
Mar 18 2018 09:11:49
Copyright (c) Microsoft Corporation
Enterprise Edition: Core-based Licensing (64-bit) on Windows Server 2016 Datacenter 10.0 (Build 14393: ) (Hypervisor)
26 CPUs / 20 GB of RAM
Index size before rebuilding: ~110 GB, after rebuilding: ~30 GB
I need some help choosing the right database version to use with several applications, such as SQL Developer, in conjunction with the DB. Yesterday I installed Oracle 18c, which consumes around 3-4 GB of my RAM when it is running. What's the ideal version of Oracle DB to run SQL Developer, Jaspersoft's iReport, etc.? Below I'll attach the specifications of my laptop.
Processor: i3 6100u
Memory: 8 Gigs of DDR3 Memory
Storage: 1TB 5400 RPM HDD
Operating System: Windows 10 Pro (64-Bit)
I'm currently using version 18.4.0-376.1900 (64-bit); it would be awesome if you could suggest a DB that is compatible with this version and also supports things like SQL*Plus, Jaspersoft's iReport, PL/SQL, etc. Thanks in advance.
I'd suggest Oracle 11g Express Edition (XE), as you're on Windows; there's 18c XE, but only for Linux. Here's the link: https://www.oracle.com/technetwork/database/database-technologies/express-edition/downloads/xe-prior-releases-5172097.html
It is a fully functional, free to use database. True, it has some limitations, but you shouldn't worry about these for what you need. SQL Developer works with it, you can connect other tools to it, it has Oracle Application Express (Apex) installed (version 4.x, but you can upgrade it to the most recent version) ...
In short, that's an option you should think about.
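If you go that route, a quick way to double-check what you ended up with, once connected from SQL Developer or SQL*Plus (just a generic query, nothing specific to this setup), is:

SELECT banner FROM v$version;

which prints the exact edition and version of the database you're connected to.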
I have a Lenovo T480s:
Intel i7 8th Gen
16 GB Ram
Windows 10
The problem is that I'm having some issues with Microsoft SQL Server Management Studio v17 (it is up to date).
When I use SSMS, for example to view a database or a table, or to edit a table, I continuously get the message
Not Responding
and it often loses the connection.
Does anyone know how to fix this problem?
Thanks!
We use
Microsoft SQL Server 2008 R2 (RTM) - 10.50.1600.1 (X64) Apr 2 2010 15:48:46 Copyright (c) Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 6.2 (Build 9200: ) (Hypervisor)
The machine has 70 GB of memory.
The SQL Server has
minimum server memory: 20480 MB
maximum server memory: 51200 MB
But when I open Resource Monitor and check the memory of the sqlserver.exe process, I see that the committed memory ("Zugesichert") is about 51 GB, while the working set ("Arbeitssatz") and private memory ("Privat") are only about 1 GB.
The SQL Server is under full load and has been running for 3 months without a restart.
The page life expectancy is 14,146 s (about 4 h).
For testing purposes, I selected from a table with 3.5 million rows (storage size: 4,600 MB), but the values for working set and private memory in Resource Monitor did not change.
Now my questions:
Is there something wrong, or are the values in Resource Monitor just not accurate?
If they are not accurate, where can I get the real memory usage?
If they are accurate, where should I start in order to solve the problem or get more information?
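Regarding where the real memory usage can be seen: a generic sketch (not from the post) using a standard DMV that is available on 2008 R2 and reports what the engine itself has allocated:

-- memory actually used by this SQL Server process, as tracked by the engine
SELECT
    physical_memory_in_use_kb / 1024          AS physical_memory_in_use_mb,
    locked_page_allocations_kb / 1024         AS locked_pages_mb,
    virtual_address_space_committed_kb / 1024 AS committed_mb,
    memory_utilization_percentage
FROM sys.dm_os_process_memory;

A large locked_page_allocations_kb combined with a small working set typically means the "Lock Pages in Memory" privilege is in use; pages locked this way are not counted in the working set that Resource Monitor shows, which would explain the 51 GB committed vs. 1 GB working set discrepancy.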
There are some performance issues on our SQL Server. When we began to analyze them, we found several problems, including that the plan cache is being cleared very often for no apparent reason (5-10 times per hour).
We also used the "sp_BlitzFirst" script for analysis, and it also showed that we have a problem: "Plan Cache Erased Recently".
However, we do not have any jobs that could be clearing the cache, and no one clears it manually either.
What might be the reasons for this behavior?
Microsoft SQL Server 2012 - 11.0.2100.60 (X64) Feb 10 2012 19:39:15
Enterprise Edition: Core-based Licensing (64-bit) on Windows NT 6.2 <X64> (Build 9200: ) (Hypervisor)
Total RAM: 32GB
SQL Server RAM: 29GB
Average RPS (requests per second): ~250
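One generic thing to check (a sketch, not specific to this server) is whether SQLOS recorded memory-pressure notifications around the times the cache was emptied, since internal or external memory pressure is a common cause of plan cache flushes; the resource monitor ring buffer keeps a recent history:

-- recent memory-pressure notifications kept by SQLOS
SELECT
    DATEADD(SECOND,
            CAST((rb.[timestamp] - osi.ms_ticks) / 1000 AS int),
            SYSDATETIME())                                    AS notification_time,
    CAST(rb.record AS xml).value('(Record/ResourceMonitor/Notification)[1]', 'varchar(60)') AS notification
FROM sys.dm_os_ring_buffers AS rb
CROSS JOIN sys.dm_os_sys_info AS osi
WHERE rb.ring_buffer_type = N'RING_BUFFER_RESOURCE_MONITOR'
ORDER BY notification_time DESC;

Beyond memory pressure, other operations that silently clear all or part of the plan cache include changing certain sp_configure settings (max server memory, for example), detaching or taking databases offline, and explicit DBCC FREEPROCCACHE calls, so those are worth ruling out as well.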