I noticed strange behavior with stored procedure execution in SQL Server. Suddenly it started taking longer and longer, with no end in sight. The SP is called from another server through an SSIS package. The SP has no input parameters, so we cannot expect parameter sniffing here. However, the SP uses a table variable, and it is possible that missing statistics on the table variable cause a sudden change in the execution plan that makes the SP run slowly.
But then why does only recompiling the SP help here? Every day I have to recompile the SP before it runs; otherwise it shows the same behavior and runs longer and longer (with no end).
My question is: why is sp_recompile required every day to make the SP run quickly?
You can use the WITH RECOMPILE hint, but I would strongly recommend first trying to observe how the execution plan changes, to find out why it becomes inefficient. (Recompiling a stored procedure puts load on the server, and if the execution plan cache is full, every recompilation causes one of the cached plans to be dropped, which can mean losing a good plan.)
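For reference, a minimal sketch of both options, using a hypothetical procedure name dbo.MyProc:

-- Flag the procedure so a new plan is built on its next execution:
EXEC sp_recompile N'dbo.MyProc';

-- Or force a fresh plan for a single call, without replacing the cached one:
EXEC dbo.MyProc WITH RECOMPILE;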
Related
I am running an SP on SQL Server 2017.
The first time I executed the SP, it took 43 seconds,
but the second time it took only 1 second.
How can I execute the SP so that every run behaves like the first time, with no cache or any learning from the previous execution?
I am not asking why; this question already covers why it happens:
First run slowness in a sql server stored procedure
I am asking how to make it the same as the first time.
I want my procedure, every time it executes, to behave as if it were the first time.
If you want to remove stored plan caches, you can execute DBCC FREEPROCCACHE. Just beware that it wipes the slate clean for all stored procedures.
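For example (run with care on shared servers; the second statement assumes SQL Server 2016 or later, which applies to your 2017 instance):

-- Clears every cached plan on the whole instance:
DBCC FREEPROCCACHE;

-- Scoped alternative: clears only the current database's plans:
ALTER DATABASE SCOPED CONFIGURATION CLEAR PROCEDURE_CACHE;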
Remember that once the plan is compiled, it will be stored for future calls until either the SP is altered or a dependent object is modified. So most of the time, what you want to test is actually the performance of the already compiled plan, unless you are constantly clearing these caches, restarting the server, or triggering a recompile somehow.
Run:
CHECKPOINT;
DBCC DROPCLEANBUFFERS;
prior to each execution. This will clear the buffer pool so that the playing field is leveled and each iteration will incur roughly the same IO overhead.
Also see Performance testing with DBCC DROPCLEANBUFFERS for additional considerations for measuring performance with this method.
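Putting the pieces together, a minimal cold-cache test harness might look like this (dbo.MyProc is a hypothetical procedure under test; do not run this on a production instance):

DBCC FREEPROCCACHE;      -- discard cached plans so a compile happens
CHECKPOINT;              -- flush dirty pages so they count as clean
DBCC DROPCLEANBUFFERS;   -- empty the buffer pool, forcing physical reads
SET STATISTICS TIME ON;  -- report parse/compile and execution times
EXEC dbo.MyProc;
SET STATISTICS TIME OFF;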
I have a question: let's say I have a procedure whose definition contains dynamic SQL. When I execute the procedure for the first time, it obviously compiles the procedure and stores the plan.
What happens during the second run? Will the same plan be used, or will the procedure go for a recompile because it contains dynamic SQL?
Dynamic SQL is always compiled as its own batch. It may result in the same execution plan as the first run (this depends entirely on the parameters).
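As an illustration, a minimal sketch using a hypothetical dbo.Orders table: with sp_executesql, the plan compiled for the exact statement text below is eligible for reuse on later calls, whereas concatenating the value into the string would produce a new statement text (and a new compile) for each distinct value.

DECLARE @sql nvarchar(max);
SET @sql = N'SELECT * FROM dbo.Orders WHERE CustomerId = @cust;';
-- Parameterized dynamic SQL: identical text, so the plan can be reused
EXEC sp_executesql @sql, N'@cust int', @cust = 42;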
I would suggest reading this article from MS. Relevant quotes:
Recompilations: Definition
Before a query, batch, stored procedure, trigger, prepared statement, or dynamic SQL statement (henceforth, "batch") begins execution on a SQL Server, the batch gets compiled into a plan. The plan is then executed for its effects or to produce results.
and
Compiled plans are stored into a part of SQL Server's memory called plan cache. Plan cache is searched for possible plan reuse opportunities. If a plan reuse for a batch happens, its compilation costs are avoided.
A similar question has already been answered in Stack Exchange for Database Administrators. Please refer: https://dba.stackexchange.com/questions/47283/when-does-sp-executesql-refresh-the-query-plan
Where is the execution plan for a stored procedure stored? When you create a stored procedure, does it create an execution plan then, or do you have to run it once before anything is created?
The execution plan is created the first time the stored procedure is run - and it's stored in the (volatile) plan cache.
If the server is too busy and needs space for more recent execution plans to be cached, or if the SQL Server service is shut down, that cache goes away and the next time around, the procedure needs to be parsed and an execution plan determined again.
Just by creating the stored procedure, you are not storing any execution plan. Stored procedures aren't pre-compiled or anything like that, as folklore often claims; that is simply not the case.
As a matter of fact, SQL Server doesn't even check for object existence when you create the stored procedure. You can create a stored procedure that selects from a non-existent table, and it will be created just fine. The error only occurs at runtime, once an execution plan is constructed for the first time (and at that point, of course, the missing table causes an error).
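A quick way to see this for yourself (both names here are hypothetical):

-- This CREATE succeeds even though the table does not exist:
CREATE PROCEDURE dbo.DemoDeferredResolution
AS
    SELECT * FROM dbo.TableThatDoesNotExist;
GO
-- The error only appears now, when a plan is first built:
EXEC dbo.DemoDeferredResolution;   -- Msg 208: Invalid object name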
The stored procedure body is stored in a system table.
The plan is cached and reused every time the procedure is executed, though yes, it first has to be created at the first execution.
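If you want to see the cached plan for yourself, you can query the plan cache DMVs; a minimal sketch, filtering on a hypothetical procedure name:

SELECT cp.usecounts, cp.objtype, st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.text LIKE '%MyProc%';   -- hypothetical procedure name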
I was wondering if you guys could help me get to the bottom of a weird problem I have recently had on SQL Server.
I have a stored procedure (let's call it SPold) which is reasonably large, with a lot of calculations (this can't possibly be done in the app, as info for around 6000 users needs to come back in one go; I reduce this to 1000 based on surname). The stored procedure usually executes in a couple of seconds and is called once every couple of minutes.
Now this morning, the stored procedure was suddenly taking 4-10 times as long to execute, causing a number of timeouts. I discovered that by making a copy of the procedure with a new name (SPnew) and executing that, I would get the fast execution times again. This indicated to me that the execution plan was the problem with the original SPold, so I decided to execute it with recompile. This returned the results more quickly (although not as fast as SPnew), but subsequent calls from users to SPold were once again slow. It was as if the new plan wasn't being kept.
What I have done to fix this is put Exec SPnew into SPold, and now calls to SPold are returning fast again.
Does anyone have any idea what is going on here? The only thing that updated overnight was the statistics, although I think that this should affect both SPold and SPnew.
Sounds like you are experiencing an incorrectly cached query plan due to parameter sniffing.
Can you post the stored procedure?
Batch Compilation, Recompilation, and Plan Caching Issues in SQL Server 2005
I Smell a Parameter!
In SQL Server 2005, you can use the OPTIMIZE FOR query hint for preferred values of parameters to remedy some of the problems associated with parameter sniffing:
OPTIMIZE FOR: Instructs the query optimizer to use a particular value for a local variable when the query is compiled and optimized. The value is used only during query optimization, and not during query execution. OPTIMIZE FOR can counteract the parameter detection behavior of the optimizer or can be used when you create plan guides. For more information, see Recompiling Stored Procedures and Optimizing Queries in Deployed Applications by Using Plan Guides.
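For illustration, a minimal sketch of the hint pinning the optimizer to a representative value (the procedure, table, and value are hypothetical):

CREATE PROCEDURE dbo.GetOrdersByCustomer
    @CustomerId int
AS
    SELECT * FROM dbo.Orders
    WHERE CustomerId = @CustomerId
    OPTION (OPTIMIZE FOR (@CustomerId = 42));
GO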
SQL Server 2005 does not support OPTIMIZE FOR UNKNOWN (introduced in SQL Server 2008), which eliminates parameter sniffing for a given parameter:
OPTION (OPTIMIZE FOR (@myParam UNKNOWN))
You can achieve the same effect in SQL Server 2005 by copying the parameter into a local variable and then using the local variable in the query.
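A minimal sketch of that local-variable workaround, on the same hypothetical table (note the 2005-compatible DECLARE then SET; inline initialization only arrived in 2008):

CREATE PROCEDURE dbo.GetOrdersByCustomerNoSniff
    @CustomerId int
AS
    DECLARE @LocalCustomerId int;
    SET @LocalCustomerId = @CustomerId;   -- the optimizer cannot sniff a local variable's value
    SELECT * FROM dbo.Orders
    WHERE CustomerId = @LocalCustomerId;
GO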
I've also encountered two "strange" cases with SQL Server 2005, which might relate to your problem as well.
In the first case, my procedure executed pretty fast when run as dbo, and it was slow when run from the application under a different user account.
In the second case, the query plan of the procedure got optimized for the parameter values with which the procedure was called the first time, and this plan was then reused later for other parameter values as well, resulting in slow execution.
For this second case, the solution was to copy the parameter values into local variables in the procedure, and then use the variables in the queries instead of the parameters.
I think the question says it all. I have several monthly processes in stored procedures which take anywhere from a minute to an hour. If I declare them WITH RECOMPILE, an execution plan will be generated each time.
If the underlying indexes, statistics, or views are changed by the DBA, I don't want anyone to have to go in and force a recompile of the SPs with an ALTER or whatever.
Is there any downside to this?
Under the circumstances, it would be completely harmless, and probably a good idea.
As I understand it, an SP will be recompiled automatically when needed, so your concern about underlying changes doesn't really matter.
However, the server tries to cache compiled SP plans. With WITH RECOMPILE, the plan is never cached, so you avoid consuming memory that would otherwise hold the compiled plan (at least until the next time the cache is cleared). Since they're only run monthly, this seems like a good idea.
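For reference, a minimal sketch of the directive (procedure name and body are hypothetical):

CREATE PROCEDURE dbo.MonthlyRollup
WITH RECOMPILE   -- a fresh plan is built on every call and never cached
AS
    SELECT COUNT(*) FROM dbo.Orders;   -- hypothetical monthly workload
GO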
Also, you might want to look at this article for other reasons to use that directive:
https://web.archive.org/web/1/http://articles.techrepublic%2ecom%2ecom/5100-10878_11-5662581.html
If each stored procedure is only run once per month, it is highly unlikely that the compiled plan will still be in the procedure cache; effectively, it will be recompiled anyway.
Even if you run the same stored procedure 100 times on your reporting day, each compile will only take 0-2 seconds (depending on the complexity of the stored procedure), so it's not a massive overhead. I'd feel comfortable setting WITH RECOMPILE on those stored procedures.