I have an SSRS report whose TimeDataRetrieval (as per the ExecutionLog3 view in the report server database) increased by 60 seconds overnight, and I can't figure out why.
The report has two parameters and contains a single Dataset which passes one of those report parameters to a SQL stored procedure. I can run the stored procedure standalone in SSMS and it completes in seconds, in line with the previous report performance.
I have read many threads and articles about how parameter sniffing can make SQL build a different execution plan for a stored procedure when it is called from an SSRS report versus when it is run directly. I tried adding a local variable to the stored procedure, assigning the incoming parameter value to that variable, and using the variable in the query instead of the parameter, but this made no difference. I even tried adding OPTION (RECOMPILE) to the stored procedure, but again this had no impact.
The issue began right after we upgraded our Dynamics CRM 2015 system (whose database resides on the same SQL Server as this instance of SSRS - probably a bad idea, I know) to Dynamics 365, so I'm wondering if that could somehow be related. I'm at a loss as to how to troubleshoot this one, so any suggestions would be most welcome!
Do the tables that this SP reads from steadily grow in size? Sometimes you hit a 'threshold' effect where the number of rows suddenly causes performance issues. I suggest you update statistics on all the tables in use, add OPTION (RECOMPILE), and retest.
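For example, a minimal sketch of both steps (table, column, and parameter names are placeholders):

UPDATE STATISTICS dbo.Orders WITH FULLSCAN;  -- refresh stats on each table the SP reads
EXEC sp_updatestats;                         -- or refresh statistics database-wide

-- inside the procedure, force a fresh plan for the expensive statement
-- (@CustomerId stands in for the procedure's parameter):
SELECT OrderId, OrderDate, Total
FROM dbo.Orders
WHERE CustomerId = @CustomerId
OPTION (RECOMPILE);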
Also, when trying to recreate the problem in SSMS, you must make sure you include all the SET options. Capture the SQL using Profiler and use exactly that, including the four or five SET statements that precede it (e.g. SET ARITHABORT).
You might find you can then reproduce the slowness in SSMS, in which case it is definitely a parameter sniffing issue (although RECOMPILE usually fixes that).
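For example, SSRS and most client libraries connect with SET ARITHABORT OFF, while SSMS defaults it to ON, which is enough to give the two callers separate cached plans. A repro sketch (procedure and parameter names are placeholders):

-- match the connection settings the report server uses, then call the proc
SET ARITHABORT OFF;
SET ANSI_NULLS ON;
SET ANSI_WARNINGS ON;
SET CONCAT_NULL_YIELDS_NULL ON;
SET QUOTED_IDENTIFIER ON;

EXEC dbo.YourReportProc @ReportParam = N'some value';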
Related
We have a main stored procedure that returns around 1,000 records, varying by the user's permissions.
Lately the procedure's performance became very bad, but only from the web service: more than a minute!
Running the same SP with the same parameters from SSMS took only 3 seconds.
When I tried to investigate the problem I added writes to a log table, and that change alone immediately brought the web-service performance back to 3 seconds.
This is a mystery to me:
1. The difference between running from web-service and ssms
2. The change after adding the logging
Your issue is called parameter sniffing. There were two execution plans for this procedure: one created the first time you launched it from the web server, and another created when you launched it from SSMS, and these plans were compiled for different parameter values. On each subsequent execution one of those plans was reused: the second when you executed from SSMS, the first when you executed from the web service. The parameters passed to the proc were atypical when it was executed from the web service, and typical when executed from SSMS.
When you altered your procedure, both plans were invalidated because the procedure had changed. New execution plans were then built for SSMS and for the web service, and this time both plans were compiled for the same or similar parameters.
If you could extract the old plans from the plan cache you'd see that they were different and that the sniffed parameters also differed, whereas now the plans are the same and the sniffed parameters are the same or similar.
Here you can read more on parameter sniffing: Slow in the Application, Fast in SSMS? Understanding Performance Mysteries
Please do not use functions in TOP, on the result set, or in WHERE/JOIN clauses. When you call an SP from SSMS, the server optimizes it, but when it is called from the front end this becomes a huge problem. So eliminate functions if possible. If you want to see what I'm talking about, start Profiler and log the RPC:Starting/Completed and SQL statement events. The number of calls matches the size of the result set: a function used in a statement that returns a result set is effectively called once per row, so a 1,000-row result calls it 1,000 times.
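For example, a sketch of the problem and a set-based rewrite (all object names are illustrative):

-- slow: the scalar function runs once for every row examined
SELECT OrderId, Total
FROM dbo.Orders
WHERE dbo.fn_GetRegion(CustomerId) = 'EMEA';

-- faster: express the lookup as a join so it is evaluated as a set
SELECT o.OrderId, o.Total
FROM dbo.Orders o
JOIN dbo.Customers c ON c.CustomerId = o.CustomerId
WHERE c.Region = 'EMEA';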
I have a long-standing issue that keeps popping up.
I created an SSRS report with a select query. When I try to run the report it takes around 20 seconds to render.
I've checked SQL Profiler and indeed the query runs for more than 20 seconds.
When I copy the query into Management Studio, it runs in 0 seconds.
As written in earlier posts, I've tried the workaround of declaring variables in the query and setting their values from the SSRS parameters. Sometimes it works; currently it doesn't...
Any other workaround?
Configure your report to run from the cache.
Caching keeps a copy of the last executed report. It is not a persisted copy; it has a lifetime (caching for 30 minutes, say), and it is stored in the report server's temp database (ReportServerTempDB). You can have only one "instance" per report (if you have parameters, you will have one per combination of parameter values).
You can do that on the Execution tab of the report in Report Manager.
Make the SQL statement into a stored procedure and use the WITH RECOMPILE option in the SP.
E.g.
CREATE PROCEDURE dbo.spname
    @ParamName varchar(30)
WITH RECOMPILE
AS
This will help counteract the "parameter sniffing" during the procedure execution and help improve performance.
I was wondering if you guys could help me get to the bottom of a weird problem I have recently had on SQL Server.
I have a stored procedure (let's call it SPold) which is reasonably large, with a lot of calculations (this can't possibly be done in the app, as info for around 6,000 users needs to come back in one go; I reduce this to 1,000 based on surname). The stored procedure usually executes in a couple of seconds and is called once every couple of minutes.
Now, this morning the stored procedure was suddenly taking 4-10 times as long to execute, causing a number of timeouts. I discovered that by making a copy of the procedure with a new name (SPnew) and executing that, I would get the fast execution times again. This indicated to me that the execution plan for the original SPold was the problem, so I decided to execute it with RECOMPILE. That returned the results a little quicker (although not as fast as SPnew), but subsequent calls from users to SPold were once again slow. It was as if the new plan wasn't being kept.
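For reference, the recompile call was along these lines (the parameter is a placeholder); a plan built by EXEC ... WITH RECOMPILE is used once and not cached, which matches the behaviour I saw:

EXEC dbo.SPold @Surname = N'Smith' WITH RECOMPILE;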
To fix this, what I have done is put EXEC SPnew inside SPold, and now calls to SPold return quickly again.
Does anyone have any idea what is going on here? The only thing that updated overnight was the statistics, although I think that this should affect both SPold and SPnew.
Sounds like you are experiencing an incorrectly cached query plan due to parameter sniffing.
Can you post the stored procedure?
Batch Compilation, Recompilation, and Plan Caching Issues in SQL Server 2005
I Smell a Parameter!
In SQL Server 2005, you can use the OPTIMIZE FOR query hint for preferred values of parameters to remedy some of the problems associated with parameter sniffing:
OPTIMIZE FOR instructs the query optimizer to use a particular value for a local variable when the query is compiled and optimized. The value is used only during query optimization, and not during query execution. OPTIMIZE FOR can counteract the parameter detection behavior of the optimizer or can be used when you create plan guides. For more information, see Recompiling Stored Procedures and Optimizing Queries in Deployed Applications by Using Plan Guides.
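For instance (the parameter name and the value 42 are placeholders for whatever is typical in your workload):

SELECT OrderId, OrderDate, Total
FROM dbo.Orders
WHERE CustomerId = @CustomerId
OPTION (OPTIMIZE FOR (@CustomerId = 42));  -- compile the plan for a typical value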
SQL Server 2005 does not support OPTIMIZE FOR UNKNOWN (introduced in SQL Server 2008), which eliminates parameter sniffing for a given parameter:
OPTION (OPTIMIZE FOR (@myParam UNKNOWN))
However, you can achieve the same effect in SQL Server 2005 by copying the parameter into a local variable and then using the local variable in the query.
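A minimal sketch of that workaround (object names are assumptions; note the separate DECLARE and SET, since SQL Server 2005 does not allow initializing a variable in its declaration):

CREATE PROCEDURE dbo.GetOrders
    @CustomerId int
AS
BEGIN
    -- the optimizer cannot sniff a local variable's value, so the plan
    -- is built for the column's average density instead
    DECLARE @LocalCustomerId int;
    SET @LocalCustomerId = @CustomerId;

    SELECT OrderId, OrderDate, Total
    FROM dbo.Orders
    WHERE CustomerId = @LocalCustomerId;
END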
I've also encountered two "strange" cases with SQL Server 2005 which might relate to your problem as well.
In the first case, my procedure executed pretty fast when run as dbo, and it was slow when run from the application under a different user account.
In the second case, the query plan of the procedure was optimized for the parameter values with which the procedure was called the first time, and this plan was then reused later for other parameter values as well, resulting in slow execution.
For this second case, the solution was to copy the parameter values into local variables in the procedure and then use the variables in the queries instead of the parameters.
I have a problem with this one stored procedure that works 99% of the time throughout our application, but will time out when called from a particular part of the application.
The table only has 3 columns and contains about 300 records. The stored proc only brings back one record and looks like this:
SELECT * FROM Table WHERE Column = @parameter
When the SP is executed in Management Studio it takes 0 seconds.
The stored procedure is used a lot in our application, but only seems to time out in one particular part of our program. I can't think of any reason why such a simple sp would time out. Any ideas?
This is a VB.NET desktop application using SQL Server 2005.
You've got some code that's already holding a lock on the table, so it can't be read.
try
SELECT * FROM Table WITH (NOLOCK) WHERE Column = @parameter
We had a very similar problem: several stored procedures kept timing out in the application (~30 sec) but ran fine in SSMS.
The short-term solution we used was to re-run the stored procedures, which fixed the problem temporarily. If this also fixes the problem temporarily for you, then you should investigate parameter sniffing problems.
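If re-running the procedures does clear it temporarily, sp_recompile gives the same effect without redeploying the code; it marks the procedure so a fresh plan is compiled on the next call (the procedure name is a placeholder):

EXEC sp_recompile N'dbo.YourProcName';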
For further information see http://dannykendrick.blogspot.co.nz/2012/08/sql-parameter-sniffing.html
You need to get performance metrics. Use SQL Profiler to confirm whether it is the SP that is slow at that time or something else. If the SQL is slow at that point, consider things like locks that may be forcing your query to wait. Let us know and we might be able to give more specific information.
If it's not the SP but, say, the VB code, a decent profiler like Red Gate's ANTS or JetBrains' dotTrace may help.
I work with legacy systems that have tens of thousands of lines of stored procedure code, where many of the stored procedures are obsolete and no longer used. There doesn't seem to be a way to check execution history, so my question is whether it might be a good idea to start each stored procedure by inserting a row into a table that keeps a record of each execution.
It could be very simple, like:
INSERT INTO executionHistory (name, date)
VALUES ('spName', GETDATE())
-- then rest of procedure
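A possible shape for that table (a sketch; the types are assumptions):

CREATE TABLE dbo.executionHistory (
    name sysname NOT NULL,   -- procedure name
    date datetime NOT NULL   -- execution time
);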
I imagine this could be very useful for cleaning up old unused code, and might also be handy when deciding where to optimize. After all, it's better to shave 10 seconds off the execution time of a procedure that runs 50 times a day than to save 10 minutes on a procedure that runs only once a year.
There is a tracing option (SQL Profiler) in SQL Server. You could take a trace of a day's SQL activity and see which sprocs are executed in it.
This will give you a good idea of where to focus your optimisations.
Because you're using SQL Server 2008, I wouldn't do what rwmnau suggests, since that would mean modifying all your stored procedures.
SQL Server 2008 introduces a feature called Extended Events, with SQL Server Auditing built on top of them. Extended Events are a high-performance tracing system.
By using SQL Server Auditing you can trace your system without the overhead of a SQL trace.
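A minimal sketch of such a session on SQL Server 2008 (the session name and file paths are assumptions):

-- capture completed stored-procedure calls server-wide
CREATE EVENT SESSION ProcUsage ON SERVER
ADD EVENT sqlserver.rpc_completed
    (ACTION (sqlserver.database_id, sqlserver.sql_text))
ADD TARGET package0.asynchronous_file_target
    (SET filename = N'C:\Traces\ProcUsage.xel',
         metadatafile = N'C:\Traces\ProcUsage.xem');

ALTER EVENT SESSION ProcUsage ON SERVER STATE = START;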
I think your idea is simple enough and would accomplish your goal. Though it would involve modifying every SP, it's the route I would choose, since you can then be sure you're getting an accurate record of all activity in the database.
Another poster suggested you run a trace. While that works for short periods, it only catches the window you're watching. You'd have to make sure you trace across any important high-traffic periods, like month-end financial closing, and even then you'd miss other times you don't think are a big deal, so you're being subjective.