SQL Server proc running 5x slower than plain query

I have the following query:
DECLARE @DaysNotUsed int = 14
DECLARE @DaysNotPhoned int = 7
--Total Unique Students
DECLARE @totalStudents TABLE (SchoolID uniqueidentifier, TotalUniqueStudents int)
INSERT INTO @totalStudents
SELECT
SSGG.School,
COUNT(DISTINCT S.StudentID)
FROM Student S
INNER JOIN StudentStudents_GroupGroups SSGG ON (SSGG.Students = S.StudentID AND SSGG.School = S.School)
INNER JOIN [Group] G ON (G.GroupID = SSGG.Groups AND G.School = SSGG.School)
INNER JOIN SessionHistory SH ON (SH.Student = S.StudentID AND SH.School = S.School AND SH.StartDateTime > GETDATE() - @DaysNotUsed)
WHERE G.IsBuiltIn = 0
AND S.HasStartedProduct = 1
GROUP BY SSGG.School
--Last Used On
DECLARE @lastUsed TABLE (SchoolID uniqueidentifier, LastUsedOn datetime)
INSERT INTO @lastUsed
SELECT
vi.SchoolID,
MAX(sh.StartDateTime)
FROM View_Installation as vi
INNER JOIN SessionHistory as sh on sh.School = vi.SchoolID
GROUP BY vi.SchoolID
SELECT
VI.SchoolID,
INS.DateAdded,
INS.Removed,
INS.DateRemoved,
INS.DateToInclude,
VI.SchoolName AS [School Name],
VI.UsersLicensed AS [Licenses],
ISNULL(TS.TotalUniqueStudents, 0) as [Total Unique Students],
ISNULL(TS.TotalUniqueStudents, 0) * 100 / VI.UsersLicensed as [% of Students Using],
S.State,
LU.LastUsedOn,
DATEDIFF(DAY, LU.LastUsedOn, GETDATE()) AS [Days Not Used],
SI.AreaSalesManager AS [Sales Rep],
SI.CaseNumber AS [Case #],
SI.RenewalDate AS [Renewal Date],
SI.AssignedTo AS [Assigned To],
SI.Notes AS [Notes]
FROM View_Installation VI
INNER JOIN School S ON S.SchoolID = VI.SchoolID
LEFT OUTER JOIN @totalStudents TS on TS.SchoolID = VI.SchoolID
INNER JOIN @lastUsed LU on LU.SchoolID = VI.SchoolID
LEFT OUTER JOIN InactiveReports..SchoolInfo SI ON S.SchoolID = SI.SchoolID
LEFT OUTER JOIN InactiveReports..InactiveSchools INS ON S.SchoolID = INS.SchoolID
WHERE VI.UsersLicensed > 0
AND VI.LastPhoneHome > GETDATE() - @DaysNotPhoned
AND
(
(
SELECT COUNT(DISTINCT S.StudentID)
FROM Student S
INNER JOIN StudentStudents_GroupGroups SSGG ON (SSGG.Students = S.StudentID AND SSGG.School = S.School)
INNER JOIN [Group] G ON (G.GroupID = SSGG.Groups AND G.School = SSGG.School)
WHERE G.IsBuiltIn = 0
AND S.School = VI.SchoolID
) * 100 / VI.UsersLicensed < 50
OR
VI.SchoolID NOT IN
(
SELECT DISTINCT SH1.School
FROM SessionHistory SH1
WHERE SH1.StartDateTime > GETDATE() - @DaysNotUsed
)
)
ORDER BY [Days Not Used] DESC
Running just the plain SQL like this in SSMS takes about 10 seconds. When I created a stored procedure with exactly the same code, the query takes 50 seconds instead. The only difference in the actual code of the proc is a SET NOCOUNT ON that the IDE put in by default, but adding that line to the query doesn't have any impact. Any idea what would cause such a dramatic slowdown like this?
EDIT: I neglected to mention that the DECLARE statements at the beginning are not in the proc; they are its parameters. Could this be the difference?

I agree about the potential parameter sniffing issue, but I would also check these settings.
For the procedure:
SELECT uses_ansi_nulls, uses_quoted_identifier
FROM sys.sql_modules
WHERE [object_id] = OBJECT_ID('dbo.procedure_name');
For the SSMS query window where the query is running fast:
SELECT [ansi_nulls], [quoted_identifier]
FROM sys.dm_exec_sessions
WHERE session_id = @@SPID;
If either of these don't match, you might consider dropping the stored procedure and re-creating it with those two settings matching. For example, if the procedure has uses_quoted_identifier = 0 and the session has quoted_identifier = 1, you could try:
DROP PROCEDURE dbo.procedure_name;
GO
SET QUOTED_IDENTIFIER ON;
GO
CREATE PROCEDURE dbo.procedure_name
AS
BEGIN
SET NOCOUNT ON;
...
END
GO
Ideally all of your modules will be created with the exact same QUOTED_IDENTIFIER and ANSI_NULLS settings. It's possible the procedure was created when the settings were off (the default is on for both), or it's possible that where you are executing the query, one or both options are off (you can change this behavior in SSMS under Tools/Options/Query Execution/SQL Server/ANSI).
I'm not going to make any disclaimers about the behavior of the stored procedure with the different settings (for example you may have wanted ANSI_NULLS off so you could compare NULL = NULL), that you'll have to test, but at least you'll be comparing queries that are being run with the same options, and it will help narrow down potential parameter sniffing issues. If you're intentionally using SET ANSI_NULLS OFF, however, I caution you to find other approaches as that behavior will eventually be unsupported.
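For example, a quick way to see the behavioral difference with NULL comparisons (a minimal illustration):
SET ANSI_NULLS OFF;
SELECT CASE WHEN NULL = NULL THEN 1 ELSE 0 END; -- 1: the comparison evaluates to true
SET ANSI_NULLS ON;
SELECT CASE WHEN NULL = NULL THEN 1 ELSE 0 END; -- 0: the comparison evaluates to unknown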
Other ways around parameter sniffing:
make sure you don't inadvertently compile the procedure with atypical parameters
use the recompile option either on the procedure or on the statement that seems to be the victim (I'm not sure if all of these are valid, because I can only tell that you are using SQL Server 2005 or greater, and some of these were introduced in 2008)
declare local variables similar to your input parameters, assign the input parameter values to them, and use the local variables later in the procedure, ignoring the input parameters
The last option is my least favorite, but it's the quickest/easiest fix in the midst of troubleshooting and when users are complaining; a minimal sketch follows.
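A minimal sketch of that local-variable pattern, reusing the parameter names from the question (the body is otherwise hypothetical):
CREATE PROCEDURE dbo.procedure_name
    @DaysNotUsed int,
    @DaysNotPhoned int
AS
BEGIN
    SET NOCOUNT ON;
    -- Copy the inputs into local variables; the optimizer can't sniff local
    -- variables, so the plan is built for average density rather than for the
    -- specific values the proc happened to be compiled with.
    DECLARE @LocalDaysNotUsed int, @LocalDaysNotPhoned int;
    SET @LocalDaysNotUsed = @DaysNotUsed;
    SET @LocalDaysNotPhoned = @DaysNotPhoned;
    -- ...reference @LocalDaysNotUsed / @LocalDaysNotPhoned in the queries below...
END
GO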

Also, in addition to everything else mentioned, if you are on SQL Server 2008 and up, have a look at OPTIMIZE FOR UNKNOWN http://blogs.msdn.com/b/sqlprogrammability/archive/2008/11/26/optimize-for-unknown-a-little-known-sql-server-2008-feature.aspx
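A minimal sketch of the hint, reusing names from the question above:
SELECT VI.SchoolID
FROM View_Installation VI
WHERE VI.LastPhoneHome > GETDATE() - @DaysNotPhoned
OPTION (OPTIMIZE FOR UNKNOWN); -- compile for average density instead of the sniffed parameter value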

I would recommend recompiling the execution plan for the stored procedure.
usage: sp_recompile '[target]'
example: sp_recompile 'dbo.GetObject'
When you execute a query from SSMS, the query plan is redone every time it's executed. With stored procs, however, SQL Server caches execution plans, and it's that cached execution plan that gets used every time the stored proc is called.
You can also change the proc to use the WITH RECOMPILE clause within the stored proc.
Example:
CREATE PROCEDURE dbo.GetObject
(
@parm1 VARCHAR(20)
)
WITH RECOMPILE
AS
BEGIN
-- Queries/work here.
END
However this will force the execution plan to be recompiled every time the stored proc is called. This is good for dev/testing where the proc and/or data changes quite frequently. Make sure you remove it when you deploy it to production, as this can have a performance hit.
sp_recompile only marks the plan for recompilation once (it takes effect the next time the proc runs). If you need to do it again at a later date, you will need to make the call again.
Good luck!

OK, thank you all for your help. Turns out it was a terribly stupid rookie mistake. The first time I created the proc, it created it under my user's schema instead of the dbo schema. When I called the proc I was simply doing 'exec proc_name', which I'm realizing now was using the version of the proc under my user's schema. Running 'exec dbo.proc_name' ran as expected.

Related

SQL - Replacing connection strings for multiple stored procedures

I'm looking to replace connection strings across multiple stored procedures. The development database I'm working with is restored from production, and the stored procedures within it contain connection strings to a production linked server. My aim is to replace the linked-server connection strings to point at a development linked server for testing purposes. I'm looking to automate this as a step in the SQL Agent restore job, to run immediately after the restore.
The issue I'm having is setting the stored proc definition as a variable while keeping the formatting. When selecting the definition text, the stored proc is all on one line, so after replacing CREATE with ALTER along with the connection strings I cannot execute the SQL due to the formatting. I've tried playing around with STRING_SPLIT, but to no avail. Is there a way to do this, or is it not possible?
Here is an example of part of one of the procedures
SELECT
a.AgreementNumber,
a.AgreementProposalID,
c.CustomerNumber,
pc.Id CustomerID,
a.AgreementCreateDate
INTO
#a
FROM
SENTINEL.DotDot_S3DB01_Replica.dbo.AgreementTable a
INNER JOIN
SENTINEL.DotDot_S3CUSTDB_Replica.dbo.CustomerTable c ON a.AgreementCustomerNumber = c.CustomerNumber
INNER JOIN
SENTINEL.DotDot_Proposal_Replica.dbo.Customer pc ON pc.Code = c.CustomerNumber
WHERE
a.AgreementCreateDate > @LastEnteredDate;
After my changes it needs to look like this to reference the test database names
SELECT
a.AgreementNumber,
a.AgreementProposalID,
c.CustomerNumber,
pc.Id CustomerID,
a.AgreementCreateDate
INTO
#a
FROM
SENTINEL.DotDotUAT_S3DB01.dbo.AgreementTable a
INNER JOIN
SENTINEL.DotDotUAT_S3CUSTDB.dbo.CustomerTable c ON a.AgreementCustomerNumber = c.CustomerNumber
INNER JOIN
SENTINEL.DotDotUAT_Proposal.dbo.Customer pc ON pc.Code = c.CustomerNumber
WHERE
a.AgreementCreateDate > @LastEnteredDate;
I've tried using this code to select the stored procedure code into a variable so I can execute the ALTER PROCEDURE command with the changes; however, the object definition selected from sys.procedures is displayed all on one line.
DECLARE @SQL VARCHAR(MAX)
SET @SQL = ( SELECT OBJECT_DEFINITION(object_id)
FROM sys.procedures
WHERE name = 'Usp_Proc')
SET @SQL = REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(@SQL,'CREATE','ALTER'),'OriginalString','ReplacementString'),'OriginalString','ReplacementString'),'OriginalString','ReplacementString'),'OriginalString','ReplacementString')
EXEC (@SQL)
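One thing worth checking first: OBJECT_DEFINITION keeps the original line breaks in the returned string; the one-line appearance is just how the SSMS results grid renders it, so EXEC should still see properly formatted SQL. A sketch, with the REPLACE targets taken from the database names shown above (everything else hypothetical):
DECLARE @SQL nvarchar(max) = OBJECT_DEFINITION(OBJECT_ID(N'dbo.Usp_Proc'));
SET @SQL = REPLACE(@SQL, 'CREATE PROCEDURE', 'ALTER PROCEDURE');
SET @SQL = REPLACE(@SQL, 'DotDot_S3DB01_Replica', 'DotDotUAT_S3DB01');
SET @SQL = REPLACE(@SQL, 'DotDot_S3CUSTDB_Replica', 'DotDotUAT_S3CUSTDB');
SET @SQL = REPLACE(@SQL, 'DotDot_Proposal_Replica', 'DotDotUAT_Proposal');
PRINT @SQL; -- PRINT (or Results to Text) shows the line breaks are intact
EXEC (@SQL);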

How to put all my selected columns into a dummy variable?

Background
This question is a follow-up to a previous question. To give you context here as well, let me summarize the previous question: I wanted a methodology for executing selections without sending their results to the client. The goal was to measure performance without eating up a lot of resources by sending millions of rows. I am only interested in the time needed to execute those queries, not in the time needed to send the results to the client app, since I intend to optimize queries: the results of the queries will not change at all, but the methodology will, and I want to be able to compare the methodologies.
Current knowledge
In my other question several ideas were presented. One idea was to select the count of the records into a variable; however, that changed the query plan significantly and the results were not accurate in terms of performance. The idea of using a temporary table was presented as well, but creating a temporary table and inserting into it is difficult if we do not know what query will be our input, and it also introduces a lot of white noise, so even though the idea was creative, it was not ideal for my problem. Finally Vladimir Baranov came up with the idea of creating as many variables as there are columns in the selection. This was a great idea, but I refined it further, by creating a single variable of nvarchar(max) and selecting all my columns into it. The idea works great, except for a few problems. I have the solution for most of the problems, but I would like to share them, so I will describe them regardless; do not misunderstand me, I have a single question.
Problem1
If I have a @container variable and I do a @container = columnname inside each selection, then I will have conversion problems.
Solution1
Instead of just doing @container = columnname, I need to do @container = cast(columnname as nvarchar(max))
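A minimal illustration with hypothetical column/table names; assigning every column into the single container forces each row to be consumed server-side without returning a result set:
DECLARE @container nvarchar(max);
SELECT @container = CAST(SomeIntColumn AS nvarchar(max)),
       @container = CAST(SomeDateColumn AS nvarchar(max))
FROM dbo.SomeTable; -- hypothetical table: nothing is sent to the client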
Problem2
I will need to convert <whatever> as something into @container = cast(<whatever> as nvarchar(max)) for each column in the selection, but not for subselections, and I will need a general solution handling CASE WHEN and parentheses; I do not want to have any instances of @container = anywhere, except to the left of the main selection.
Solution2
Since I am clueless about regular expressions, I can solve this by iterating over the query string until I find the FROM of the main query; each time I find a parenthesis, I will do nothing until that parenthesis is closed, find the indexes where @container = should be put in and as [customname] should be taken out, and do all that in the query string from right to left. This will be long and inelegant code.
Question
Is it possible to make sure that all my main columns, but nothing else, start with @container = and end without as [Customname]?
This is much too long for a comment but I'd like to add my $.02 to the other answers and share the scripts I used to test the suggested methods.
I like @MartinSmith's TOP 0 solution but am concerned that it could result in a different execution plan shape in some cases. I didn't see this in the tests I ran, but I think you'll need to verify the plan is about the same as the unmolested query for each query you test. However, the results of my tests suggest the number of columns and/or data types might skew performance with this method.
The SQLCLR method in @VladimirBaranov's answer should produce the exact plan the app code generates (assuming identical SET options for the test). There will still be some slight overhead (YMMV) with SqlClient consuming results within the SQLCLR, but there will be less server overhead with this method compared to returning results back to the calling application.
The SSMS discard results method I suggested in my first comment will incur more overhead than the other methods but does include the server-side work SQL Server will perform not only in running the query, but also filling buffers for the returned result. Whether or not this additional SQL Server work should be taken into account depends on the purpose of the test. For unit-level performance tests, I prefer to execute tests using the same API as the app code.
I captured server-side performance with these 3 methods using @MartinSmith's original query. The averages over 1,000 iterations on my machine were:
test method cpu_time duration logical_reads
SSMS discard 53031.000000 55358.844000 7190.512000
TOP 0 52374.000000 52432.936000 7190.527000
SQLCLR 49110.000000 48838.532000 7190.578000
I did the same with a trivial query returning 10,000 rows and 2 columns (int and nvarchar(100)) from a user table:
test method cpu_time duration logical_reads
SSMS discard 4204.000000 9245.426000 402.004000
TOP 0 2641.000000 2752.695000 402.008000
SQLCLR 1921.000000 1878.579000 402.000000
Repeating the same test but with a varchar(100) column instead of nvarchar(100):
test method cpu_time duration logical_reads
SSMS discard 3078.000000 5901.023000 402.004000
TOP 0 2672.000000 2616.359000 402.008000
SQLCLR 1750.000000 1798.098000 402.000000
Below are the scripts I used for testing:
Source code for a SQLCLR proc like @VladimirBaranov suggested:
using System.Data.SqlClient;
using Microsoft.SqlServer.Server;

public partial class StoredProcedures
{
    [SqlProcedure]
    public static void ExecuteNonQuery(string sql)
    {
        // Execute the supplied batch on the context connection;
        // the results are discarded server-side.
        using (var connection = new SqlConnection("Context Connection=true"))
        {
            connection.Open();
            using (var command = new SqlCommand(sql, connection))
                command.ExecuteNonQuery();
        }
    }
}
Extended Events trace to capture the actual server-side timings and resource usage:
CREATE EVENT SESSION [test] ON SERVER
ADD EVENT sqlserver.sql_batch_completed(SET collect_batch_text=(1))
ADD TARGET package0.event_file(SET filename=N'QueryTimes')
WITH (MAX_MEMORY=4096 KB,EVENT_RETENTION_MODE=ALLOW_SINGLE_EVENT_LOSS,MAX_DISPATCH_LATENCY=30 SECONDS,MAX_EVENT_SIZE=0 KB,MEMORY_PARTITION_MODE=NONE,TRACK_CAUSALITY=OFF,STARTUP_STATE=OFF);
GO
User table create and load:
CREATE TABLE dbo.Foo(
FooID int NOT NULL CONSTRAINT PK_Foo PRIMARY KEY
, Bar1 nvarchar(100)
, Bar2 varchar(100)
);
WITH
t10 AS (SELECT n FROM (VALUES(0),(0),(0),(0),(0),(0),(0),(0),(0),(0)) t(n))
,t10k AS (SELECT ROW_NUMBER() OVER (ORDER BY (SELECT 0)) AS num FROM t10 AS a CROSS JOIN t10 AS b CROSS JOIN t10 AS c CROSS JOIN t10 AS d)
INSERT INTO dbo.Foo WITH (TABLOCKX)
SELECT num, REPLICATE(N'X', 100), REPLICATE('X', 100)
FROM t10k;
GO
SQL script run from SSMS with the discard results query option to run 1000 iterations of the test with the 3 different methods:
SET NOCOUNT ON;
GO
--return and discard results
SELECT v.*,
o.name
FROM master..spt_values AS v
JOIN sys.objects o
ON o.object_id % NULLIF(v.number, 0) = 0;
GO 1000
--TOP 0
DECLARE @X NVARCHAR(MAX);
SELECT @X = (SELECT TOP 0 v.*,
o.name
FOR XML PATH(''))
FROM master..spt_values AS v
JOIN sys.objects o
ON o.object_id % NULLIF(v.number, 0) = 0;
GO 1000
--SQLCLR ExecuteNonQuery
EXEC dbo.ExecuteNonQuery @sql = N'
SELECT v.*,
o.name
FROM master..spt_values AS v
JOIN sys.objects o
ON o.object_id % NULLIF(v.number, 0) = 0;
'
GO 1000
--return and discard results
SELECT FooID, Bar1
FROM dbo.Foo;
GO 1000
--TOP 0
DECLARE @X NVARCHAR(MAX);
SELECT @X = (SELECT TOP 0 FooID, Bar1
FOR XML PATH(''))
FROM dbo.Foo;
GO 1000
--SQLCLR ExecuteNonQuery
EXEC dbo.ExecuteNonQuery @sql = N'
SELECT FooID, Bar1
FROM dbo.Foo
';
GO 1000
--return and discard results
SELECT FooID, Bar2
FROM dbo.Foo;
GO 1000
--TOP 0
DECLARE @X NVARCHAR(MAX);
SELECT @X = (SELECT TOP 0 FooID, Bar2
FOR XML PATH(''))
FROM dbo.Foo;
GO 1000
--SQLCLR ExecuteNonQuery
EXEC dbo.ExecuteNonQuery @sql = N'
SELECT FooID, Bar2
FROM dbo.Foo
';
GO 1000
I would try to write a single CLR function that runs as many queries as needed to measure. It may have a parameter with the text(s) of queries to run, or names of stored procedures to run.
You have a single request to the server; everything is done locally on the server, with no network overhead. You discard the query results in the .NET CLR code, without explicit temp tables, by using ExecuteNonQuery for each query that you need to measure.
Don't change the query that you are measuring. The optimizer is complex, and changes to the query may have various effects on performance.
Also, use SET STATISTICS TIME ON and let the server measure the time for you. Fetch what the server has to say, parse it, and send it back in the format that suits you.
I think the results of SET STATISTICS TIME ON / OFF are the most reliable and accurate, with the least amount of noise.
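A minimal sketch of that measurement pattern (procedure name is hypothetical):
SET STATISTICS TIME ON;
EXEC dbo.ProcedureUnderTest;
SET STATISTICS TIME OFF;
-- The Messages tab then reports, per statement:
-- SQL Server Execution Times: CPU time = n ms, elapsed time = n ms.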

Query runs fast, but runs slow in stored procedure

I am doing some tests using the SQL 2005 profiler.
I have a stored procedure which simply runs one SQL query.
When I run the stored procedure, it takes a long time and performs 800,000 disk reads.
When I run the same query separately from the stored procedure, it does 14,000 disk reads.
I found that if I run the same query with OPTION(recompile), it takes 800,000 disk reads.
From this, I make the (possibly erroneous) assumption that the stored procedure is recompiling each time, and that's causing the problem.
Can anyone shed some light onto this?
I have set ARITHABORT ON. (This solved a similar problem on stackoverflow, but didn't solve mine)
Here is the entire stored procedure:
CREATE PROCEDURE [dbo].[GET_IF_SETTLEMENT_ADJUSTMENT_REQUIRED]
@Contract_ID int,
@dt_From smalldatetime,
@dt_To smalldatetime,
@Last_Run_Date datetime
AS
BEGIN
DECLARE @rv int
SELECT @rv = (CASE WHEN EXISTS
(
select * from
view_contract_version_last_volume_update
inner join contract_version
on contract_version.contract_version_id = view_contract_version_last_volume_update.contract_version_id
where contract_version.contract_id=@Contract_ID
and volume_date >= @dt_From
and volume_date < @dt_To
and last_write_date > @Last_Run_Date
)
THEN 1 else 0 end)
-- Note that we are RETURNING a value rather than SELECTING it.
-- This means we can invoke this function from other stored procedures
return @rv
END
Here's a script I run that demonstrates the problem:
DECLARE
@Contract_ID INT,
@dt_From smalldatetime,
@dt_To smalldatetime,
@Last_Run_Date datetime,
@rv int
SET @Contract_ID=38
SET @dt_From='2010-09-01'
SET @dt_To='2010-10-01'
SET @Last_Run_Date='2010-10-08 10:59:59:070'
-- This takes over fifteen seconds
exec GET_IF_SETTLEMENT_ADJUSTMENT_REQUIRED @Contract_ID=@Contract_ID,@dt_From=@dt_From,@dt_To=@dt_To,@Last_Run_Date=@Last_Run_Date
-- This takes less than one second!
SELECT @rv = (CASE WHEN EXISTS
(
select * from
view_contract_version_last_volume_update
inner join contract_version
on contract_version.contract_version_id = view_contract_version_last_volume_update.contract_version_id
where contract_version.contract_id=@Contract_ID
and volume_date >= @dt_From
and volume_date < @dt_To
and last_write_date > @Last_Run_Date
)
THEN 1 else 0 end)
-- With recompile option. Takes 15 seconds again!
SELECT @rv = (CASE WHEN EXISTS
(
select * from
view_contract_version_last_volume_update
inner join contract_version
on contract_version.contract_version_id = view_contract_version_last_volume_update.contract_version_id
where contract_version.contract_id=@Contract_ID
and volume_date >= @dt_From
and volume_date < @dt_To
and last_write_date > @Last_Run_Date
)
THEN 1 else 0 end) OPTION(recompile)
OK, we have had similar issues like this before.
The way we fixed this was by declaring local parameters inside the SP, like so:
DECLARE @LOCAL_Contract_ID int,
@LOCAL_dt_From smalldatetime,
@LOCAL_dt_To smalldatetime,
@LOCAL_Last_Run_Date datetime
SELECT @LOCAL_Contract_ID = @Contract_ID,
@LOCAL_dt_From = @dt_From,
@LOCAL_dt_To = @dt_To,
@LOCAL_Last_Run_Date = @Last_Run_Date
We then use the local parameters inside the SP rather than the parameters that was passed in.
This typically fixed the issue for Us.
We believe this to be due to parameter sniffing, but do not have any proof, sorry... X-)
EDIT:
Have a look at Different Approaches to Correct SQL Server Parameter Sniffing for some insightful examples, explanations and fixes.
As others have mentioned, this could be a 'parameter sniffing' problem. Try including the line:
OPTION (RECOMPILE)
at the end of your SQL query.
There is an article here explaining what parameter sniffing is: http://blogs.technet.com/b/mdegre/archive/2012/03/19/what-is-parameter-sniffing.aspx
I guess this is caused by parameter sniffing.
The issue of why a batch takes forever to run inside a SQL stored procedure yet runs instantaneously in SSMS has to do with SQL parameter sniffing, especially with datetime parameters.
There are several excellent articles on parameter sniffing out there.
Here's one of them ( I didn't write it, just passing it on).
http://www.sommarskog.se/query-plan-mysteries.html
On my issue I've run:
exec sp_updatestats
and this sped up my SP from 120 s to just 3 s. More info about updating statistics can be found here: https://msdn.microsoft.com/en-us/library/ms173804.aspx
I too got the same problem today. I dropped and recreated the SP and it worked. This is something to do with the SP plan cache; when the SP was dropped, the cached plan was removed. You can try the same, or use DBCC FREEPROCCACHE to clear the cache.
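A word of caution on that last suggestion: plain DBCC FREEPROCCACHE clears every plan on the instance. On SQL Server 2008 and later you can evict a single procedure's plan instead; a sketch, reusing the proc name from the question:
-- Find the cached plan for just this procedure and evict only that plan
DECLARE @handle varbinary(64);
SELECT @handle = plan_handle
FROM sys.dm_exec_procedure_stats
WHERE database_id = DB_ID()
AND object_id = OBJECT_ID(N'dbo.GET_IF_SETTLEMENT_ADJUSTMENT_REQUIRED');
DBCC FREEPROCCACHE (@handle);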

Why would a stored procedure perform differently when executed remotely to locally?

We've a stored procedure that happens to build up some dynamic SQL and execute via a parametrised call to sp_executesql.
Under normal conditions this works wonderfully and has greatly improved execution times for the procedure (~8 seconds down to ~1 second). However, under some unknown conditions something strange happens and performance goes completely the other way (~31 seconds), but only when executed via RPC (i.e. a call from a .NET app with SqlCommand.CommandType set to CommandType.StoredProcedure, or as a remote query from a linked server). If executed as a SQL batch using SQL Server Management Studio, we do not see the degradation in performance.
Altering the white-space in the generated SQL and recompiling the stored procedure seems to resolve the issue, at least in the short term, but we'd like to understand the cause, or find ways to force the execution plans to be rebuilt for the generated SQL; at the moment I'm not sure how to proceed with either.
To illustrate, the Stored Procedure, looks a little like:
CREATE PROCEDURE [dbo].[usp_MyObject_Search]
@IsActive AS BIT = NULL,
@IsTemplate AS BIT = NULL
AS
DECLARE @WhereClause NVARCHAR(MAX) = ''
IF @IsActive IS NOT NULL
BEGIN
SET @WhereClause += ' AND (svc.IsActive = @xIsActive) '
END
IF @IsTemplate IS NOT NULL
BEGIN
SET @WhereClause += ' AND (svc.IsTemplate = @xIsTemplate) '
END
DECLARE @Sql NVARCHAR(MAX) = '
SELECT svc.[MyObjectId],
svc.[Name],
svc.[IsActive],
svc.[IsTemplate]
FROM dbo.MyObject svc WITH (NOLOCK)
WHERE 1=1 ' + @WhereClause + '
ORDER BY svc.[Name] Asc'
EXEC sp_executesql @Sql, N'@xIsActive BIT, @xIsTemplate BIT',
@xIsActive = @IsActive, @xIsTemplate = @IsTemplate
With this approach, the query plan will be cached for the permutations of NULL/not-NULL, and we're getting the benefit of cached query plans. What I don't understand is why it would use a different query plan when executed remotely vs. locally after "something happens"; I also don't understand what the "something" might be?
I realise I could move away from parametrisation, but then we'd lose the benefit of caching what are normally good execution plans.
I would suspect parameter sniffing. If you are on SQL Server 2008 you could try including OPTIMIZE FOR UNKNOWN to minimise the chance that when it generates a plan it does so for atypical parameter values.
RE: What I don't understand is why it would use a different query plan when executed remotely vs. locally after "something happens"
When you execute in SSMS it won't use the same bad plan because of different SET options (e.g. SET ARITHABORT ON) so it will compile a new plan that works well for the parameter values you are currently testing.
You can see these plans with
SELECT usecounts, cacheobjtype, objtype, text, query_plan, value as set_options
FROM sys.dm_exec_cached_plans
CROSS APPLY sys.dm_exec_sql_text(plan_handle)
CROSS APPLY sys.dm_exec_query_plan(plan_handle)
cross APPLY sys.dm_exec_plan_attributes(plan_handle) AS epa
where text like '%FROM dbo.MyObject svc WITH (NOLOCK)%'
and attribute='set_options'
Edit
The following bit is just in response to badbod99's answer
create proc #foo @mode bit, @date datetime
as
declare @Sql nvarchar(max)
if(@mode=1)
set @Sql = 'select top 0 * from sys.objects where create_date < @date /*44FC79BD-2AF5-4774-9674-04D6C3D4B228*/'
else
set @Sql = 'select top 0 * from sys.objects where modify_date < @date /*44FC79BD-2AF5-4774-9674-04D6C3D4B228*/'
EXEC sp_executesql @Sql, N'@date datetime',
@date = @date
go
declare @d datetime
set @d = getdate()
exec #foo 0, @d
exec #foo 1, @d
SELECT usecounts, cacheobjtype, objtype, text, query_plan, value as set_options
FROM sys.dm_exec_cached_plans
CROSS APPLY sys.dm_exec_sql_text(plan_handle)
CROSS APPLY sys.dm_exec_query_plan(plan_handle)
cross APPLY sys.dm_exec_plan_attributes(plan_handle) AS epa
where text like '%44FC79BD-2AF5-4774-9674-04D6C3D4B228%'
and attribute='set_options'
Returns
Recompilation
Any time the execution of the SP would be significantly different due to conditional statements, the execution plan cached from the last request may not be optimal for this one.
It's all about when SQL compiles the execution plan for the SP. The key section regarding SP compilation in the Microsoft docs is this:
... this optimization occurs automatically the first time a stored procedure is run after SQL Server is restarted. It also occurs if an underlying table that is used by the stored procedure changes. But if a new index is added from which the stored procedure might benefit, optimization does not occur until the next time that the stored procedure is run after SQL Server is restarted. In this situation, it can be useful to force the stored procedure to recompile the next time that it executes
SQL does recompile execution plans at times, from Microsoft docs
SQL Server automatically recompiles stored procedures and triggers when it is advantageous to do this.
... but it will not do this with each call (unless using WITH RECOMPILE), so if each execution could be resulting in different SQL, you may be stuck with the same old plan for at least one call.
RECOMPILE query hint
The RECOMPILE query hint takes into account your parameter values when checking what needs to be recompiled at the statement level.
WITH RECOMPILE option
WITH RECOMPILE (see section F) will cause the execution plan to be compiled with each call, so you will never have a sub-optimal plan, but you will have the compilation overhead.
Restructure into multiple SPs
Looking at your specific case, the execution plan for the proc itself never changes, and the two SQL statements it generates should each have prepared execution plans.
I would suggest that restructuring the code to split the SPs, rather than having this conditional SQL generation, would simplify things and ensure you always have the optimal execution plan without any SQL magic sauce. A sketch of this split follows.
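For concreteness, a minimal sketch of that restructuring, assuming one procedure per filter (names are hypothetical and only two of the permutations are shown):
CREATE PROCEDURE dbo.usp_MyObject_Search_ByActive
    @IsActive BIT
AS
    SELECT svc.[MyObjectId], svc.[Name], svc.[IsActive], svc.[IsTemplate]
    FROM dbo.MyObject svc
    WHERE svc.IsActive = @IsActive
    ORDER BY svc.[Name] ASC;
GO
CREATE PROCEDURE dbo.usp_MyObject_Search_ByTemplate
    @IsTemplate BIT
AS
    SELECT svc.[MyObjectId], svc.[Name], svc.[IsActive], svc.[IsTemplate]
    FROM dbo.MyObject svc
    WHERE svc.IsTemplate = @IsTemplate
    ORDER BY svc.[Name] ASC;
GO
-- Each procedure now contains one static statement, so each gets its own
-- stable cached plan with no dynamic SQL involved.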

How can I display the execution plan for a stored procedure?

I am able to view the Estimated Execution Plan (Management Studio 9.0) for a query without a problem, but when it comes to stored procedures I do not see an easy way to do this without copying the code from the ALTER screen and pasting it into a query window; otherwise it will show the plan for the ALTER and not for the procedure. Even after doing this, any inputs are missing and I would need to DECLARE them as such.
Is there an easier way to do this on stored procedures?
Edit:
I just thought of something that might work but I am not sure.
Could I do the estimated execution plan on
exec myStoredProc 234
SET SHOWPLAN_ALL ON
GO
-- FMTONLY will not exec stored proc
SET FMTONLY ON
GO
exec yourproc
GO
SET FMTONLY OFF
GO
SET SHOWPLAN_ALL OFF
GO
Select the stored procedure name (just type it in a query window), right-click, and choose 'Display Estimated Execution Plan' (it is also a toolbar button in SQL Server Management Studio).
Note that you don't have to have the stored procedure code open. Just the procedure name has to be selected.
The plans for any stored procedures called from within the procedure will also be displayed in graphical form.
When executing a stored procedure in SQL Server Management Studio 2008 you can click Query -> Include Actual Execution Plan from the menu; it's also on the toolbar.
After reading through the comments, actually executing the procedure seems to be the sticking point. To solve this, I would recommend wrapping the execution of the stored procedure in a transaction and rolling it back at the end.
Use
SET SHOWPLAN_ALL ON
Go
exec myStoredProc 234
GO
SET SHOWPLAN_ALL OFF
GO
See http://msdn.microsoft.com/en-us/library/aa259203.aspx
As long as you aren't using temp tables, I think this will work.
I know the answer was submitted a while ago, but I find the query below useful:
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
SELECT [ProcedureName] = OBJECT_NAME([ps].[object_id], [ps].[database_id])
,[ProcedureExecutes] = [ps].[execution_count]
,[VersionOfPlan] = [qs].[plan_generation_num]
,[ExecutionsOfCurrentPlan] = [qs].[execution_count]
,[Query Plan XML] = [qp].[query_plan]
FROM [sys].[dm_exec_procedure_stats] AS [ps]
JOIN [sys].[dm_exec_query_stats] AS [qs] ON [ps].[plan_handle] = [qs].[plan_handle]
CROSS APPLY [sys].[dm_exec_query_plan]([qs].[plan_handle]) AS [qp]
WHERE [ps].[database_id] = DB_ID()
AND OBJECT_NAME([ps].[object_id], [ps].[database_id]) = 'TEST'
There are quite a few ways to get the actual execution plan of a stored procedure.
SELECT
qp.query_plan,
SQLText.text
FROM sys.dm_exec_cached_plans AS CP
CROSS APPLY sys.dm_exec_sql_text( plan_handle)AS SQLText
CROSS APPLY sys.dm_exec_query_plan( plan_handle)AS QP
WHERE objtype = 'Proc' and cp.cacheobjtype = 'Compiled Plan'
Looking at the plans on a production server, with the statistics of the production data, may show a different plan than on a dev box with a smaller dataset.
There is a lot more data to look at, such as how often a procedure is executed according to the plan cache:
SELECT
qp.query_plan,
CP.usecounts as [Executed],
DB_name(QP.dbid) as [Database],
OBJECT_NAME(QP.objectid) as [Procedure],
SQLText.text as [TSQL],
so.create_date as [Procedure Created],
so.modify_date as [Procedure Modified]
FROM sys.dm_exec_cached_plans AS CP
CROSS APPLY sys.dm_exec_sql_text( plan_handle)AS SQLText
CROSS APPLY sys.dm_exec_query_plan( plan_handle)AS QP
join sys.objects as so on so.[object_id]=QP.objectid
WHERE objtype = 'Proc' and cp.cacheobjtype = 'Compiled Plan'
The XML query plan (the first column in both queries) contains the XML of the execution plan, allowing you, in SSMS, to click on it and view the actual plan, but also allowing you to scan for things you do not like to have, such as index scans or "god forbid" table scans.
SELECT
qp.query_plan,
CP.usecounts as [Executed],
DB_name(QP.dbid) as [Database],
OBJECT_NAME(QP.objectid) as [Procedure],
SQLText.text as [TSQL],
so.create_date as [Procedure Created],
so.modify_date as [Procedure Modified]
FROM sys.dm_exec_cached_plans AS CP
CROSS APPLY sys.dm_exec_sql_text( plan_handle)AS SQLText
CROSS APPLY sys.dm_exec_query_plan( plan_handle)AS QP
join sys.objects as so on so.[object_id]=QP.objectid
WHERE objtype = 'Proc' and cp.cacheobjtype = 'Compiled Plan'
and cast(qp.query_plan as nvarchar(max)) like '%loop%'
I sample this in a really crude way, by casting the XML to a string and then doing a wildcard search; XML queries are not something most people write every day, and string wildcards are easy for everyone.
Running the stored procedure in Management Studio (or Query Analyzer) with Include Actual Execution Plan enabled (from the Query menu) will show you the plan for the stored procedure after you have run it. If you can't run it, there is Display Estimated Execution Plan (though in my experience that is often less accurate).
You can also use Profiler to see the execution plan. You'll want to include the Performance: Showplan Statistics Profile event and be sure to include Binary Data in your columns.
You can then run any query or procedure and see the execution plan.
Edit
If you can't use Profiler, and you don't want to open another window, I suggest that you include a comment block at the beginning of your stored procs. For example, imagine the following:
/*
Description: This procedure does XYZ etc...
DevelopedBy: Josh
Created On: 4/27/09
Execution: exec my_procName N'sampleparam', N'sampleparam'
*/
ALTER PROCEDURE my_procName
@p1 nvarchar(20),
@p2 nvarchar(20)
AS
This lets you highlight just the example execution line in the comment, turn on the execution plan display, and run it.
