Does updating statistics recompile stored procedures in SQL Server?

Does updating statistics (auto or manual) recompile stored procedures in SQL Server, or do the procedures keep running with the same execution plan they were first compiled with?

MSDN has a lengthy article on that. To sum it up:
Therefore, plan optimality-related reasons have close association with the statistics.
Looks like it depends on how much the statistics changed. So updating statistics may lead to a recompile, but it does not have to. To force removal of all cached query plans, you can run:
DBCC FREEPROCCACHE
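For illustration, a minimal sketch of the "may lead to a recompile" case (the table, index, and procedure names below are made up):
-- Hypothetical names, for illustration only.
UPDATE STATISTICS dbo.Orders;                                    -- sampled refresh of all statistics on the table
UPDATE STATISTICS dbo.Orders IX_Orders_CustomerId WITH FULLSCAN; -- or one statistics object, with a full scan

EXEC dbo.GetOrdersByCustomer @CustomerId = 42;                   -- the next execution may compile a fresh plan
                                                                 -- if the optimizer deems the change significant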

Related

How to clear SQL Server cache to get correct execution plan

I had a query that was running slow (2.5 mins) on SQL Server.
I got the actual execution plan, and there was a suggestion for an index. I created the index and now execution time is < 2 seconds.
Then we had to restart SQL Server.
The query went back to being slow (2.5 mins). I looked at the execution plan again, and this time there was a suggestion for a different index!
It would appear that the first execution plan's index suggestion was taking into account some sort of cached index, maybe?
How can I clear cache (if this is the issue) before looking at execution plan?
The symptoms suggest parameter sniffing, where the query plan was generated for the initially supplied parameter values but is suboptimal for subsequent executions with different values. You can invalidate the currently cached plan for a specific query by providing the plan handle to DBCC FREEPROCCACHE:
DBCC FREEPROCCACHE(plan_handle);
There are a number of ways to avoid parameter sniffing. If the query is not executed frequently, a RECOMPILE query hint will provide the optimal plan for the parameter values supplied. Otherwise, you could specify an OPTIMIZE FOR UNKNOWN hint, or use the Query Store (depending on your SQL Server version) to force a specific plan or have SQL Server automatically identify plan regression and select a plan.
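As a rough sketch of those options (the table, procedure, and column names are invented, and the plan_handle lookup assumes SQL Server 2008 or later):
-- @CustomerId is assumed to be the procedure's parameter.
-- Option 1: compile a fresh plan on every execution of this statement.
SELECT OrderId, Total
FROM dbo.Orders
WHERE CustomerId = @CustomerId
OPTION (RECOMPILE);

-- Option 2: optimize for average density instead of the sniffed value.
SELECT OrderId, Total
FROM dbo.Orders
WHERE CustomerId = @CustomerId
OPTION (OPTIMIZE FOR UNKNOWN);

-- Option 3: locate the cached plan and evict only that one.
SELECT cp.plan_handle, st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.text LIKE '%GetOrdersByCustomer%';
-- then: DBCC FREEPROCCACHE(<plan_handle copied from the result above>);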
Don't clear the cache in a production environment. It will lead to serious performance issues.
If you want to generate a new plan instead of reusing the existing one, you can use the RECOMPILE option as part of the stored procedure execution to see whether the new index is considered in the new plan.
EXEC dbo.Procedure WITH RECOMPILE;
Or you can mark the procedure for recompilation using the command below; the next execution will generate and use a new plan.
EXEC sp_recompile 'dbo.Procedure';
If you want to measure the performance improvement repeatedly in a test environment, you can use the clearing approaches below:
DBCC FREEPROCCACHE    -- clears the plan cache completely
DBCC DROPCLEANBUFFERS -- removes clean (unmodified) pages that were read from disk into memory
A more elegant approach is to first write the dirty pages to disk and then drop the clean buffers:
CHECKPOINT;
GO
DBCC DROPCLEANBUFFERS;
GO
DBCC FREEPROCCACHE;
GO

query hangs, but ok if I update statistics

I have an issue where a complicated SQL query occasionally hangs and doesn't execute on MS SQL Server. However, when I run UPDATE STATISTICS on the tables involved in the query, the query executes normally.
Any ideas or pointers on the cause?
Thanks!
SQL Server creates an "execution plan" that uses the statistics to determine an optimal order in which to filter the data and reduce access to the database tables.
This execution plan is stored in the plan cache and is reused as long as the database stays online, the statistics are not rebuilt, and the query is not modified.
When you update the statistics (rebuilding the indexes updates them as well), the stored execution plan for your query is considered out of date and will not be used any more.
I expect SQL Server also closes unused locks and transactions for the table before rebuilding the index. That is an undocumented feature.
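If updating statistics reliably unblocks the query, a minimal sketch (the table names are placeholders, and the catalog query assumes SQL Server 2005 or later) is to refresh the statistics on the tables the query touches and check when they were last updated:
-- Placeholder table names; use the tables referenced by the hanging query.
UPDATE STATISTICS dbo.BigTable WITH FULLSCAN;
UPDATE STATISTICS dbo.OtherTable WITH FULLSCAN;

-- When were the statistics last updated? (SQL Server 2005+)
SELECT name AS stats_name, STATS_DATE(object_id, stats_id) AS last_updated
FROM sys.stats
WHERE object_id = OBJECT_ID('dbo.BigTable');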

How to let SQL Server know not to use Cache in Queries?

Just a general question:
Is there a query/command I can pass to SQL Server to tell it not to use the cache when executing a particular query?
I am looking for a query/command that I can issue rather than a configuration setting. Or is there no need to do this?
DBCC FREEPROCCACHE
Will remove all cached procedures' execution plans. This will cause all subsequent procedure calls to be recompiled.
Adding WITH RECOMPILE to a procedure definition would cause the procedure to be recompiled every time it was called.
In SQL Server 2005 or earlier, I do not believe there is any way to clear the procedure cache of a single procedure's execution plan; SQL Server 2008 added the ability to pass a plan_handle to DBCC FREEPROCCACHE for exactly that.
If you want to force a query to not use the data cache, the best approach is to clear the cache before you run the query:
CHECKPOINT
DBCC DROPCLEANBUFFERS
Note that forcing a recompile will have no effect on the query's use of the data cache.
One reason you can't simply mark an individual query to avoid the cache is that cache use is an integral part of executing the query. If the data is already in the cache, what would SQL Server do with a second copy read from disk? Not to mention synchronization issues, etc.
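For completeness, a sketch of the procedure-level options mentioned above (the procedure and table names are invented):
-- WITH RECOMPILE in the definition: a plan is built on every call and never cached.
CREATE PROCEDURE dbo.usp_GetOrders    -- hypothetical procedure
    @CustomerId int
WITH RECOMPILE
AS
BEGIN
    SELECT OrderId, Total
    FROM dbo.Orders                   -- hypothetical table
    WHERE CustomerId = @CustomerId;
END;
GO

-- Or force a one-off recompile of an existing procedure for a single call:
EXEC dbo.usp_GetOrders @CustomerId = 42 WITH RECOMPILE;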
Another, more localized, way to avoid the SQL Server plan cache is to use the OPTION (RECOMPILE) hint at the end of your statement.
E.g.
SELECT Columnname
FROM TableName
OPTION(RECOMPILE)
For more information about this and other similar query hints that help identify problems with a query, Pinal Dave (no affiliation) has some helpful material on the subject.
Use
WITH RECOMPILE

How do you fix queries that only run slow until they're cached

I have some queries that are causing timeouts in our live environment. (>30 seconds)
If I run Profiler, grab the exact SQL being run, and run it from Management Studio, it takes a long time the first run and then drops to a few hundred milliseconds on each run after that.
This is obviously SQL Server caching the data and getting it all into memory.
I'm sure there are optimisations that can be made to the SQL that will make it run faster.
My question is, how can I "fix" these queries when the second time I run it the data has already been cached and is fast?
May I suggest that you inspect the execution plan for the queries that are responsible for your performance problems.
You need to identify, within the execution plan, which steps have the highest cost and why. It could be that your queries are performing a table scan, or that an inappropriate index is being used, for example.
There is a very detailed, free ebook available from the RedGate website that concentrates specifically on understanding the contents of execution plans.
https://www.red-gate.com/Dynamic/Downloads/DownloadForm.aspx?download=ebook1
You may find that there is a particular execution plan that you would like to be used for your query. You can force which execution plan is used for a query in SQL Server using query hints. This is quite an advanced concept however and should be used with discretion. See the following Microsoft White Paper for more details.
http://www.microsoft.com/technet/prodtechnol/sql/2005/frcqupln.mspx
I would also not recommend that you clear the procedure cache on your production environment, as this will be detrimental to the performance of all the other queries on the platform that are not currently experiencing performance issues.
If you are executing a stored procedure, for example, you can ensure that a new execution plan is calculated for each execution of the procedure by using the WITH RECOMPILE option.
For overall performance tuning information, there are some excellent resources over at Brent Ozar’s blog.
http://www.brentozar.com/sql-server-performance-tuning/
Hope this helps. Cheers.
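As a starting point for inspecting the plan from a query window, here is a minimal sketch (the SELECT is a placeholder for your slow statement; SHOWPLAN_XML assumes SQL Server 2005 or later):
-- Show the estimated plan without executing (must be alone in its batch):
SET SHOWPLAN_XML ON;
GO
SELECT OrderId, Total FROM dbo.Orders WHERE CustomerId = 42;   -- placeholder query
GO
SET SHOWPLAN_XML OFF;
GO

-- Or execute the query and return per-operator actual row counts alongside the results:
SET STATISTICS PROFILE ON;
SELECT OrderId, Total FROM dbo.Orders WHERE CustomerId = 42;   -- placeholder query
SET STATISTICS PROFILE OFF;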
According to http://morten.lyhr.dk/2007/10/how-to-clear-sql-server-query-cache.html, you can run the following to clear the cache:
DBCC DROPCLEANBUFFERS
DBCC FREEPROCCACHE
EDIT: I checked with the SQL Server documentation I have and this is at least true for SQL Server 2000.
You can use
DBCC DROPCLEANBUFFERS
DBCC FREEPROCCACHE
But only use this in your development environment whilst tuning the queries for deployment to a live server.
I think people are running off in the wrong direction. If I understand correctly, you want the performance to be good all the time? The queries run fast on the second (and subsequent) executions and are only slow the first time?
The DBCC commands above clear out the cache, causing WORSE performance.
What you want, I think, is to prime the pump and cache the data. You can do this with some startup procedures that execute the queries and load data into memory.
Memory is a finite resource, so you can't load all data, likely, into memory, but you can find a balance. Brent has some good references above to help learn what you can do here.
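One sketch of "priming the pump" (the names are hypothetical; startup procedures must live in the master database, and this assumes you are comfortable enabling them on your instance):
-- Hypothetical warm-up procedure that touches the hot tables so their pages are read into memory.
USE master;
GO
CREATE PROCEDURE dbo.usp_WarmCache
AS
BEGIN
    SELECT COUNT(*) FROM MyDatabase.dbo.Orders;      -- placeholder database/table names
    SELECT COUNT(*) FROM MyDatabase.dbo.Customers;
END;
GO

-- Mark it to run automatically every time the instance starts.
EXEC sp_procoption @ProcName = 'dbo.usp_WarmCache', @OptionName = 'startup', @OptionValue = 'on';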
Query optimisation is a large subject, there is no single answer to your question. The clues as to what to do are all in the query plan which should be the same regardless of whether the results are cached or not.
Look for the usual things such as table scans, indexes not being used when you expect them to be used, etc. Ultimately you may have to review your data model and perhaps implement a denormalisation strategy.
From MSDN:
"Use DBCC DROPCLEANBUFFERS to test queries with a cold buffer cache without shutting down and restarting the server."

How often do you update statistics in SQL Server 2000?

I'm wondering if updating statistics has helped you before and how did you know to update them?
EXEC sp_updatestats;
Yes, updating statistics can be very helpful if you find that your queries are not performing as well as they should. This is evidenced by inspecting the query plan and noticing when, for example, table scans or index scans are being performed instead of index seeks. All of this assumes that you have set up your indexes correctly.
There is also the UPDATE STATISTICS command, but I've personally never used that.
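For reference, a minimal sketch of that command (the table name and sampling options are placeholders):
-- Refresh all statistics on a table using the default sampling:
UPDATE STATISTICS dbo.Orders;

-- Or control the scan, e.g. sample 50 percent of rows, or scan everything:
UPDATE STATISTICS dbo.Orders WITH SAMPLE 50 PERCENT;
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;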
It's common to add your statistics update to a maintenance plan (as in an Enterprise Manager-defined Maintenance plan). That way it happens on a schedule - daily, weekly, whatever.
SQL Server 2000 uses statistics to make good decisions about query execution so they definitely help.
It's a good idea to rebuild your indexes at the same time (DBCC DBREINDEX and DBCC INDEXDEFRAG).
If you rebuild indexes, then the statistics for those indexes are automatically rebuilt.
If your timeframes allow, then running UPDATE STATISTICS as part of a maintenance plan is a good idea, as frequently as nightly (if your indexes are being rebuilt less frequently than that).
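A sketch of the SQL Server 2000-era maintenance commands mentioned above (table, index, and fill-factor values are placeholders):
-- Rebuild all indexes on a table; statistics for those indexes are rebuilt as a side effect.
DBCC DBREINDEX ('dbo.Orders', '', 90);                          -- '' = all indexes, 90 = fill factor

-- Lighter-weight, online defragmentation of one index (statistics are NOT rebuilt).
DBCC INDEXDEFRAG (0, 'dbo.Orders', 'IX_Orders_CustomerId');     -- 0 = current database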
SQL Server: To determine whether out-of-date statistics are the cause of a query performing poorly, turn on 'Query -> Display Estimated Execution Plan' (Ctrl+L) in Management Studio and run the query. Open another window, paste in the same query, turn on 'Query -> Display Actual Execution Plan' (Ctrl+M), and re-run the query. If the execution plans are different, then the statistics are most likely out of date.
Updating statistics becomes necessary after the following events:
- Records are inserted into your table
- Records are deleted from your table
- Records are updated in your table
If you have a large database with millions of records that gets lots of writes per day, you should probably determine an off-peak time to schedule the statistics and index updates.
Also, you need to consider your type of traffic. If you have a lot (millions) of records in tables with many foreign key dependencies and a larger proportion of writes to reads, you might want to consider turning off automatic statistics recomputation (NOTE: this feature will be removed in a future version of SQL Server, but for SQL Server 2000 you should be OK). This tells the engine not to recompute statistics on every INSERT, DELETE, or UPDATE, and makes those actions much more performant.
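If you do decide to turn automatic recomputation off, a sketch for SQL Server 2000 (the table and database names are placeholders):
-- Disable auto-update of statistics for one table's statistics objects:
EXEC sp_autostats 'dbo.Orders', 'OFF';

-- Or database-wide (then schedule UPDATE STATISTICS yourself):
ALTER DATABASE MyDatabase SET AUTO_UPDATE_STATISTICS OFF;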
Indexes are no laughing matter. They are the heart and soul of a performant database.
