Executing a stored procedure takes much longer than executing the same TSQL - sql-server

I have a stored procedure, and when I execute it using exec proc_name it takes 1 minute.
If I copy the code from the stored procedure, declare the parameters as variables and then execute the code, it takes 10 seconds.
What's wrong?
Am I missing something here?
I am asking because I use ADO.NET and I get a timeout error when I execute that stored procedure using ExecuteNonQuery.
Thank you

It's caused by suboptimal plans being used.
You mention that the s.p. has parameters; I've had similar issues due to 'parameter sniffing'.
The quickest check to see if this is the issue is, inside the SP, to copy the input parameters into local variables and then use only the local variables.
This stops, e.g., optimisation for certain parameter values at the expense of others.
I've had this before in an s.p. which had int parameters where certain parameter values changed the control flow (as well as how queries would be executed) a bit.
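A minimal sketch of that check, using hypothetical procedure, parameter and table names:

ALTER PROCEDURE dbo.proc_name
    @CustomerId int
AS
BEGIN
    -- Copy the input parameter into a local variable and reference only the local,
    -- so the optimizer cannot sniff the caller's value.
    DECLARE @LocalCustomerId int
    SET @LocalCustomerId = @CustomerId

    SELECT *
    FROM dbo.Orders                      -- hypothetical table
    WHERE CustomerId = @LocalCustomerId
END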

Start SQL Profiler and compare the two executions: is the extra 50 seconds spent on the server? Are the queries really the same?
You can then copy the actual query text, run it manually and check the execution plan.

Try executing the proc with the Include Actual Execution Plan option switched on.
It will tell you exactly which part takes the time, and you/we can probably take it from there (suggestions).
Thanks
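If you prefer a scriptable equivalent of that button, SET STATISTICS XML returns the actual plan as XML alongside the results (dbo.proc_name is a placeholder; add your own parameters):

SET STATISTICS XML ON    -- returns the actual execution plan as XML with each result set
EXEC dbo.proc_name       -- hypothetical procedure call
SET STATISTICS XML OFF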

As a general idea, query plans are cached differently for ad-hoc statements vs stored procedures, so the execution time can differ because the chosen query plan can differ.
As suggestions, I would try the following (a sketch of the first three is shown after this list):
1/ Invalidate the query plan associated with that stored procedure:
sp_recompile <procname>
2/ Delete all query plans from the cache (the hard way, not recommended in PROD unless you understand the consequences very well):
DBCC FREEPROCCACHE
3/ Update statistics for the involved tables.
4/ Have a look at the actual execution plan for both cases and isolate where the performance bottleneck is. Post some code and we'll provide more details.
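A minimal sketch of suggestions 1-3, with hypothetical procedure and table names:

EXEC sp_recompile N'dbo.proc_name'        -- 1/ invalidate the cached plan for this proc
-- DBCC FREEPROCCACHE                     -- 2/ clears the entire plan cache; avoid in PROD
UPDATE STATISTICS dbo.Orders              -- 3/ refresh statistics on an involved table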

Option 1 : re-run the ALTER PROCEDURE statement for the SP and try again with parameters.
Option 2 : EXEC sp_updatestats
Option 3 : Failing option 1, add "option(recompile)" at the end of your query.
Eg : Select Id from Table1 Order By Id desc option(recompile)
If this runs faster, the slowdown was due to the execution plans built by SQL Server.

Related

SQL Server different time duration between stmtcompleted and direct query

I have the following trace of one query in my application:
As you can see, the query reads all records in the table, which has a big impact on the duration.
But when I try to execute the query directly the result is quite different... What is wrong?
You executed ANOTHER query from SSMS.
The query shown in Profiler is part of a stored procedure and has 8 parameters.
What you executed is a query with constants, which has a different execution plan because all the constants are known and the estimation was done based on these known values.
When your sp's statement was executed the plan was created for god-knows-what sniffed parameters, and this plan is different from what you have in SSMS.
From the thickness of the arrows in SSMS it's clear that your query does not do 7.954.449 reads.
If you want to see your actual execution plan in Profiler you should select the corresponding event (Showplan XML Statistics Profile).
Yes, there are two different queries. Axapta uses placeholders by default. Your query uses literal constants.
The forceLiterals hint in the query makes the Axapta query similar to your SSMS sample. The default Axapta hint is forcePlaceholders.
The main goal of placeholders is to optimize a massive stream of similar queries with different constants from multiple clients, mainly through the reuse of the query plan cache.
See also the injection warning for forceLiterals:
https://msdn.microsoft.com/en-us/library/aa861766.aspx
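In T-SQL terms the difference roughly looks like this (table and column names are hypothetical; the first form is what placeholders produce, the second is what forceLiterals produces):

-- Placeholder form: one cached plan is reused for every value
EXEC sp_executesql
    N'SELECT * FROM dbo.CustTable WHERE AccountNum = @p1',
    N'@p1 nvarchar(20)',
    @p1 = N'1101'

-- Literal form: the plan is compiled for this specific value
SELECT * FROM dbo.CustTable WHERE AccountNum = N'1101'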

Will TSQL return faster results than stored procedure in SQL Server

I have a stored procedure that worked fine previously. It took 4 to 5 seconds to get the results.
I haven't used this stored procedure for the past two months. When I call the same procedure now it takes more than 5 minutes to produce the result.
(No records have been added to my source tables in the past two months.)
When I run the procedure's contents as a TSQL block it is back to normal, but when I convert it back to a stored procedure it again takes more than 5 minutes.
I am wondering why it behaves like this. I use 6 table variables; I just populate those table variables and join them all to get the desired results.
I have already tried the options below:
WITH RECOMPILE at the stored procedure level
DBCC FREEPROCCACHE
DBCC DROPCLEANBUFFERS
sp_updatestats
but there is no improvement. When I execute it as TSQL it works fine.
Please suggest any ideas to optimize the stored procedure.
In your queries, add OPTION(OPTIMIZE FOR UNKNOWN) (as the last clause) to prevent parameter sniffing. For syntax and explanation, see the documentation on Query Hints.
What SQL Server does the first time it runs a Stored Procedure is optimize the execution plan(s) for the parameters that were passed to it. This is done in a process that is called Parameter Sniffing.
In general, execution plans are cached by SQL Server so that SQL Server doesn't have to recompile each time for the same query. The next time the procedure is run, SQL Server will re-use the execution plan(s) for the queries in it... However, the execution plan(s) might be totally inefficient if you call it (them) with different parameters.
The option I gave you tells the SQL compiler that the execution plan should not be optimized for specific parameters, but rather for any parameter that is passed to the Stored Procedure.
To quote the documentation:
OPTIMIZE FOR UNKNOWN
Instructs the query optimizer to use statistical data instead of the initial values for all local variables when the query is compiled and optimized, including parameters created with forced parameterization.
In some cases Stored Procedures benefit from Parameter Sniffing, and in some cases they don't. For the Stored Procedures that don't benefit from Parameter Sniffing, you can add the option to each query that uses any of the parameters of the Stored Procedure.
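A minimal sketch of where the hint goes, with hypothetical procedure, parameter and table names:

CREATE PROCEDURE dbo.uspGetOrdersByCustomer
    @CustomerId int
AS
BEGIN
    SELECT OrderId, OrderDate
    FROM dbo.Orders                      -- hypothetical table
    WHERE CustomerId = @CustomerId
    OPTION (OPTIMIZE FOR UNKNOWN)        -- plan built from density statistics, not the sniffed value
END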
You may have a bad execution plan associated with that proc.
Try this one:
DBCC FREESYSTEMCACHE ('ALL') WITH MARK_IN_USE_FOR_REMOVAL;
You may also find this interesting to read
http://www.sqlpointers.com/2006/11/parameter-sniffing-stored-procedures.html

Non-optimal execution plan using sp_executesql

I'm having problems with slow performance in a SQL select statement with some parameters: for the same query, executing the select via sp_executesql takes double the time of the inline version.
The problem is that with sp_executesql, SQL Server is not using the optimal execution plan. Although the plans are different, it seems that in both cases the tables' indexes are being used correctly. I really don't understand why performance is so different.
My original query is more complex, but to try to figure out what's happening I have simplified it to a select over 3 tables with 2 joins. The main difference is the use of a Hash Match in the optimal plan; I don't really know what this means, but it is the only difference I can see.
Optimal plan (hash match, over 3 seconds)
Wrong plan (no hash match, same indexes as above, over 12 seconds)
I don't think my problem is "parameter sniffing"; in my case the query is always slow for all distinct parameter values because the execution plan is always incorrect.
OPTION (RECOMPILE) doesn't help: sp_executesql stays slow, and the inline version takes more time (because the query then always compiles an execution plan).
Statistics for the tables are up to date.
I have to use the sp_executesql approach because Reporting Services appears to wrap the select in sp_executesql calls.
Does anybody know why sp_executesql generates a different (wrong) execution plan than the inline query?
EDIT: The queries weren't using the same indexes. I guess that's because the execution trees are not the same and SQL Server chooses indexes as it pleases. Attached you can find new execution plans that force the same indexes to be used; performance is now even worse, going from 12 seconds to more than 15 minutes (I cancelled it) for the slow query. I'm not really interested in making this specific query run faster; as I said, this is not the real query I'm dealing with. What I'm trying to figure out is why the execution plans are so different between the inline query and the sp_executesql query.
Is there any magic option in sp_executesql that makes this work properly? :)
Optimal
Slow
My understanding is that sp_executesql keeps a cached plan after the first execution. Subsequent queries may be using a bad cached plan. You can use the following command to clear out the ENTIRE SQL Server procedure cache.
DBCC FREEPROCCACHE
http://msdn.microsoft.com/en-us/library/ms174283.aspx
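If clearing the whole cache is too heavy-handed, a narrower alternative is to evict just the one plan by its plan handle; this is only a sketch, and the LIKE filter is an illustrative way to find the statement:

DECLARE @plan_handle varbinary(64)

SELECT TOP (1) @plan_handle = cp.plan_handle
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.text LIKE N'%MySlowSelect%'     -- hypothetical marker identifying the statement

IF @plan_handle IS NOT NULL
    DBCC FREEPROCCACHE (@plan_handle)    -- removes only that plan, not the entire cache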

SQL performance: Recompile versus drop and re-apply for stored procedure

I have a stored procedure used for a DI report that contains 62 sub-queries combined with UNION ALL. Recently, performance went from under 1 minute to over 8 minutes, and SQL Profiler was showing very high CPU and reads. The procedure currently copies the passed-in parameters into local variables to prevent parameter sniffing.
Running the contents of the procedure as a SELECT statement, performance was back to under a minute.
Calling the procedure via EXEC in Management Studio, performance was horrible at over 8 minutes.
Calling the procedure via EXEC WITH RECOMPILE did not improve performance. I ran DBCC FREEPROCCACHE and DBCC DROPCLEANBUFFERS and there was still no improvement.
In the end, I dropped the procedure and re-applied it, and performance is now back.
Can anyone explain why the initial steps did not correct the performance of the procedure but dropping and re-applying it did?
Sounds like blocking parameter sniffing produced a bad plan. When you use local variables, the query optimizer uses the density of each column to come up with cardinality estimates, essentially optimizing for the average value. If your data distribution is significantly skewed, this estimate will be significantly off for some values. This theory explains why your initial steps did not work: using WITH RECOMPILE or running DBCC FREEPROCCACHE will not help if parameter sniffing is blocked, it will just produce the same plan every time. The fact that running the contents of the procedure as a SELECT statement made it faster makes me think you actually need parameter sniffing. However, you should also try WITH RECOMPILE if the compilation time is acceptable; otherwise there's a risk of getting stuck with a bad plan based on atypical sniffed values.
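A minimal sketch contrasting the two estimation modes discussed above (the procedure, dbo.Sales and Region are hypothetical names):

CREATE PROCEDURE dbo.uspSalesByRegion
    @Region nvarchar(50)
AS
BEGIN
    -- Local-variable copy: the optimizer uses average density and ignores the actual value
    DECLARE @LocalRegion nvarchar(50)
    SET @LocalRegion = @Region
    SELECT COUNT(*) FROM dbo.Sales WHERE Region = @LocalRegion

    -- Direct parameter use with RECOMPILE: a fresh plan is built for the sniffed value on each call
    SELECT COUNT(*) FROM dbo.Sales WHERE Region = @Region
    OPTION (RECOMPILE)
END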

Parameter Sniffing (or Spoofing) in SQL Server

A while ago I had a query that I ran quite a lot for one of my users. It was still being evolved and tweaked, but eventually it stabilised and ran quite quickly, so we created a stored procedure from it.
So far, so normal.
The stored procedure, though, was dog slow. No material difference between the query and the proc, but the speed change was massive.
[Background, we're running SQL Server 2005.]
A friendly local DBA (who no longer works here) took one look at the stored procedure and said "parameter spoofing!" (Edit: although it seems that it is possibly also known as 'parameter sniffing', which might explain the paucity of Google hits when I tried to search it out.)
We abstracted some of the stored procedure to a second one, wrapped the call to this new inner proc into the pre-existing outer one, called the outer one and, hey presto, it was as quick as the original query.
So, what gives? Can someone explain parameter spoofing?
Bonus credit for
highlighting how to avoid it
suggesting how to recognise a possible cause
discussing alternative strategies, e.g. stats, indices, keys, for mitigating the situation
FYI - you need to be aware of something else when you're working with SQL 2005 and stored procs with parameters.
SQL Server will compile the stored proc's execution plan with the first parameter that's used. So if you run this:
usp_QueryMyDataByState 'Rhode Island'
The execution plan will work best with a small state's data. But if someone turns around and runs:
usp_QueryMyDataByState 'Texas'
The execution plan designed for Rhode-Island-sized data may not be as efficient with Texas-sized data. This can produce surprising results when the server is restarted, because the newly generated execution plan will be targeted at whatever parameter is used first - not necessarily the best one. The plan won't be recompiled until there's a big reason to do it, like if statistics are rebuilt.
This is where query plans come in, and SQL Server 2008 offers a lot of new features that help DBAs pin a particular query plan in place long-term no matter what parameters get called first.
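One way to pin a plan-shaping hint onto a procedure's statement is a plan guide; a rough sketch with hypothetical object names (the OPTIMIZE FOR value is only an example) might look like:

EXEC sp_create_plan_guide
    @name = N'PG_QueryMyDataByState',
    @stmt = N'SELECT * FROM dbo.MyData WHERE State = @State',   -- must match a statement in the proc exactly
    @type = N'OBJECT',
    @module_or_batch = N'dbo.usp_QueryMyDataByState',
    @params = NULL,
    @hints = N'OPTION (OPTIMIZE FOR (@State = N''Texas''))'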
My concern is that when you rebuilt your stored proc, you forced the execution plan to recompile. You called it with your favorite parameter, and then of course it was fast - but the problem may not have been the stored proc. It might have been that the stored proc was recompiled at some point with an unusual set of parameters and thus, an inefficient query plan. You might not have fixed anything, and you might face the same problem the next time the server restarts or the query plan gets recompiled.
Yes, I think you mean parameter sniffing, which is a technique the SQL Server optimizer uses to try to figure out parameter values/ranges so it can choose the best execution plan for your query. In some instances SQL Server does a poor job at parameter sniffing & doesn't pick the best execution plan for the query.
I believe this blog article http://blogs.msdn.com/queryoptteam/archive/2006/03/31/565991.aspx has a good explanation.
It seems that the DBA in your example chose option #4 to move the query to another sproc to a separate procedural context.
You could also have used WITH RECOMPILE on the original sproc, or used the OPTIMIZE FOR option on the parameter.
A simple way to speed that up is to reassign the input parameters to local variables at the very beginning of the sproc, e.g.
CREATE PROCEDURE uspParameterSniffingAvoidance
    @SniffedFormalParameter int
AS
BEGIN
    DECLARE @SniffAvoidingLocalParameter int
    SET @SniffAvoidingLocalParameter = @SniffedFormalParameter
    -- Work with @SniffAvoidingLocalParameter in the sproc body
    -- ...
END
In my experience, the best solution for parameter sniffing is 'Dynamic SQL'. Two important things to note are that 1. you should use parameters in your dynamic SQL query, and 2. you should use sp_executesql (and not sp_execute), which saves the execution plan for the parameterized query.
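A minimal sketch of that approach, with hypothetical table and column names:

DECLARE @sql nvarchar(max)
SET @sql = N'SELECT OrderId, OrderDate FROM dbo.Orders WHERE CustomerId = @CustomerId'

EXEC sp_executesql
    @sql,
    N'@CustomerId int',      -- parameter definition keeps the query parameterized
    @CustomerId = 42         -- value supplied at execution time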
Parameter sniffing is a technique SQL Server uses to optimize the query execution plan for a stored procedure. When you first call the stored procedure, SQL Server looks at the given parameter values of your call and decides which indices to use based on the parameter values.
So when the first call contains not very typical parameters, SQL Server might select and store a sub-optimal execution plan with regard to the following calls of the stored procedure.
You can work around this by either of the following (a sketch of the WITH RECOMPILE option is shown after this list):
using WITH RECOMPILE
copying the parameter values to local variables inside the stored procedure and using the locals in your queries.
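A minimal sketch of WITH RECOMPILE at the procedure level, with hypothetical names:

CREATE PROCEDURE dbo.uspGetCustomersByState
    @State nvarchar(50)
WITH RECOMPILE      -- a fresh plan is compiled on every call and never cached
AS
BEGIN
    SELECT CustomerId, Name FROM dbo.Customers WHERE State = @State
END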
I even heard that it's better to not use stored procedures at all but to send your queries directly to the server.
I recently came across the same problem and have no real solution yet.
For some queries the copy to local vars helps get back to the right execution plan; for other queries, performance degrades with local vars.
I still have to do more research on how SQL Server caches and reuses (sub-optimal) execution plans.
I had a similar problem. My stored procedure took 30-40 seconds to execute. I tried running the SP's statements in a query window and they took a few milliseconds.
Then I tried declaring local variables within the stored procedure and transferring the parameter values to those local variables. This made the SP execution very fast, and the same SP now executes within a few milliseconds instead of 30-40 seconds.
Put very simply and shortly: the query optimizer reuses an old query plan for frequently-run queries, but the size of the data keeps increasing, so at some point a newly optimized plan is required while the optimizer is still using the old plan. This is called parameter sniffing.
I have also written a detailed post on this. Please visit this URL:
http://www.dbrnd.com/2015/05/sql-server-parameter-sniffing/
Changing your stored procedure to execute as a batch should increase the speed.
Batch select, i.e.:
exec ('select * from [order] where [order id] = ''' + @ordersID + '''')
Instead of the normal stored procedure select:
select * from [order] where [order id] = @ordersID
Just pass in the parameter as nvarchar and you should get quicker results.
