I'm trying to optimize a report that uses multiple stored procedures on the same table. Unfortunately, each procedure is reading millions of records and aggregating the results. It's a very intense read for a report, but each stored procedure is optimized to run pretty fast within SSMS.
I can run each stored procedure and get a result set within 10 to 20 seconds. When I put them all into one report within SSRS, the report times out.
There is a total of 4 parameters per stored procedure. All target the same table, just aggregating the data in different ways. The indexes on that table are in line with the queries. The aggregation is based on time, user, and the one dimension I'm using with COUNT(), both DISTINCT and non-DISTINCT.
I'm thinking the issue is that SSRS is running all 4 procedures at the same time against the same table, as opposed to one after the other. Is this true? If so, is there any way to ensure SSRS does not run them in parallel?
My only other option is to create a summary table that is already pre-aggregated and run the report off that. Otherwise, I guess parameter sniffing is possible too.
By default, datasets in SSRS are executed in parallel.
If all of your datasets refer to the same datasource, then you can configure for serialized execution of the datasets on a single connection this way:
open the data source dialog in report designer
ensure that the Use Single Transaction checkbox is checked
Once that checkbox is selected, datasets that use the same data source are no longer executed in parallel.
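The same setting is persisted in the report's RDL as the data source's Transaction element. A minimal sketch of what that looks like (the data source name and connection string here are placeholders, not from the original report):

```xml
<DataSource Name="MainDataSource">
  <!-- Corresponds to the "Use Single Transaction" checkbox in the designer -->
  <Transaction>true</Transaction>
  <ConnectionProperties>
    <DataProvider>SQL</DataProvider>
    <ConnectString>Data Source=MYSERVER;Initial Catalog=MyDb</ConnectString>
  </ConnectionProperties>
</DataSource>
```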
I hope that solves your problem.
Related
Consider a Java application reading/modifying data in a SQL Server database using only stored procedures.
I am interested in knowing exactly what rows were inserted/updated after execution of some code.
The executing code could trigger multiple stored procedures, and in the general case these procedures work with different tables.
My current solution is to debug the low-level Java code executed before any of the stored procedures is called, inspecting the parameters passed, in order to manually reconstruct the impact.
This seems ineffective and unreliable.
Is there a better approach?
To know exactly what rows were inserted/updated after execution of some code, you can implement triggers for UPDATE, DELETE and INSERT operations for the tables involved. These triggers should be almost the same for every table, changing just the name and the association with its table.
For this suggestion, these tables should have audit columns, like one for the datetime when the rows were inserted and one for the datetime when the rows were updated - at least. You can search for more audit ideas if you want (and need), like a column to record which user triggered the insert/update, or how many times the row was altered, and so on.
You should work out a different approach depending on how much data you intend to generate with these triggers.
I'm assuming you know how to do this with best practices (for example, you can [and should, IMHO] create these triggers dynamically to facilitate maintenance).
Finally, you will be able to write a query against the system tables that hold information about tables and rows and return only the rows involved, ordered by these new columns (just an idea that I hope fits your particular case).
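A minimal sketch of one such audit trigger. All names here (dbo.Orders, OrderId, the audit columns) are hypothetical, since the question doesn't give a schema:

```sql
-- Add audit columns to an existing table (names are illustrative)
ALTER TABLE dbo.Orders
    ADD CreatedAt DATETIME2 NOT NULL DEFAULT SYSDATETIME(),
        UpdatedAt DATETIME2 NULL,
        UpdatedBy SYSNAME NULL;
GO

CREATE TRIGGER dbo.trg_Orders_AuditUpdate
ON dbo.Orders
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- Stamp only the rows touched by this statement, using the
    -- "inserted" pseudo-table to find them
    UPDATE o
    SET UpdatedAt = SYSDATETIME(),
        UpdatedBy = SUSER_SNAME()
    FROM dbo.Orders AS o
    INNER JOIN inserted AS i ON i.OrderId = o.OrderId;
END
GO
```

With columns like these in place, a query ordered by UpdatedAt (or filtered to UpdatedAt after a known timestamp) returns exactly the rows a given piece of code touched. Note that direct trigger recursion is off by default (RECURSIVE_TRIGGERS), so the trigger's own UPDATE does not re-fire it.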
I have an SSRS report which has many tables. These tables fetch data from database tables through the defined datasets. Before the report fetches data from the datasets, it is really important to run a procedure based on the user's input to a report parameter. How can I achieve this?
Note: as of now I put an EXEC PROC statement in one of my datasets, and I have to click 'View Report' twice to see the expected results.
By default, SSRS processes all the datasets in parallel. There is an option called "Use single transaction when processing the queries" which makes it evaluate the datasets one at a time, so the first procedure has to finish before it moves on. This setting is in the Data Source properties. The datasets are processed in the order you added them, so you may have to rearrange them so that your procedure runs first.
I've got an SSIS package whose primary function is to precalculate some data and invoke a parameterized SSRS report. The SSRS report has multiple datasets that it retrieves through stored procedures. It takes around 2-2.5 seconds to generate.
When I loop through the report within the package, the loop obviously executes one report at a time. To speed up this process, I split up the dataset into two and tried passing each dataset into its own loop container. The problem is that although the loops process simultaneously, the step at which the report is generated (script task) halts the process for the other loop - that is, while one report is generating, the other waits.
Given this, it seems that SSRS locks and only allows one execution at a time. The profiler showed "sp_WriteLockSession" being invoked, and from what I've read that appears to be by design. I've also read up on the "no lock" hint, but I'm not sure that's the route I want to go down either.
I'm not sure if I'm approaching this the right way. Am I missing something? The only other thing I can think of is to create a second report and invoke that instead, but if it's locking due to the underlying datasets, then I'm really not sure what else to do. The datasets are primarily just select statements, with a couple of them inserting one row into a single table at the very end.
I'd appreciate any advice, thanks in advance!
I have one complex report which fetches records from multiple tables.
I saw in many places that SSRS does not allow multiple data tables to be returned from a single stored procedure, which is why I created one stored procedure and six datasets for the report, filtered from a shared dataset. But when I ran the query below, it showed that my procedure was executed six times, which might be causing the performance issue.
SELECT TOP 100
    ItemPath, Parameters,
    TimeDataRetrieval + TimeProcessing + TimeRendering AS [total time],
    TimeDataRetrieval, TimeProcessing, TimeRendering,
    ByteCount, [RowCount], Source, AdditionalInfo
FROM ExecutionLog3
WHERE ItemPath LIKE '%GetServiceCalls%'
ORDER BY TimeStart DESC
To get around this, I removed all the dataset filters and applied the filter on the tablix instead. After that I could see the procedure was called only once, but it did not improve performance much.
So the question I'm left with is how exactly I can improve the performance of this SSRS report.
Note: my query executes in 13 seconds, but the report takes almost 20 minutes to render.
Please help me resolve this issue.
Regards,
Dhaval
I have always found SSRS filters on large tables to take forever, and any text wildcards perform even more poorly.
My advice would be to do all the "grunt work" except sorting in SQL, and then do any sorting in SSRS.
Part of your problem may be that you have a large dataset and are performing wildcard searches, which don't play well with indexes when the wildcard is at the start of the LIKE pattern (e.g. LIKE '%...').
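To illustrate the point above (the table and index names here are hypothetical, and an index on CustomerName is assumed):

```sql
-- A leading wildcard cannot use the index for a seek; SQL Server has to
-- scan every entry because the prefix of the value is unknown:
SELECT CustomerId FROM dbo.Customers WHERE CustomerName LIKE '%son';

-- An anchored prefix lets the optimizer perform an index seek on the
-- range of keys starting with 'John':
SELECT CustomerId FROM dbo.Customers WHERE CustomerName LIKE 'John%';
```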
I have a report that renders data returned from a stored procedure. Using profiler I can catch the call to the stored procedure from the reporting services.
The report fails stating the report timed out yet I can execute the stored procedure from SSMS and it returns the data back in five to six seconds.
Note, in the example test run only two rows are returned to the report for rendering though within the stored procedure it may have been working over thousands or even millions of records in order to collate the result passed back to reporting services.
I know the stored procedure could be optimised more but I do not understand why SSRS would be timing out when the execution only seems to take a few seconds to execute from SSMS.
Also another issue has surfaced. If I recreate the stored procedure, the report starts to render perfectly fine again. That is fine except after a short period of time, the report starts timing out again.
The return of the time out seems to be related to new data being added into the main table the report is running against. In the example I was testing, just one hundred new records being inserted was enough to screw up the report.
More accurately, I imagine it's not the report that is the root cause; it is the stored procedure that causes the time out when executed from SSRS.
Once it starts timing out again, the best fix I have so far is to recreate the stored procedure. This doesn't seem an ideal solution.
The problem also only seems to occur in our production environment. Our test and development platforms do not exhibit it, though dev and test do not have the same volume of records as production.
The problem, as you describe it, seems to come from variations in the execution plan of some parts of your stored procedure. Look at what statistics are kept on the tables used and how adding new rows affects them.
If you're adding a lot of rows at the end of the range of a column (think of autonumbers or timestamps), the histogram for that column will become outdated rapidly. You can force an immediate update from T-SQL by executing the UPDATE STATISTICS statement.
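For example (the table and index names are placeholders for whatever your report queries):

```sql
-- Rebuild all statistics on the table with a full scan after a large load,
-- so the histogram reflects the newly inserted range of values
UPDATE STATISTICS dbo.SalesFact WITH FULLSCAN;

-- Or target just the statistics object on the ascending date column
UPDATE STATISTICS dbo.SalesFact IX_SalesFact_SaleDate;
```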
I have also had this issue where the SPROC takes seconds to run yet SSRS simply times out.
I have found from my own experience that there are a couple of different methods to overcome this issue.
The first is parameter sniffing. When your stored procedure is executed from SSRS, SQL Server "sniffs" the parameter values to see how your SPROC uses them, and produces an execution plan based on its findings. This is good the first time you execute your SPROC, but you don't want a plan built for one set of values to be reused for every report run. So I declare a new set of variables at the top of my SPROCs which simply store the parameters passed in, and use these new variables throughout the query.
Example:
CREATE PROCEDURE [dbo].[usp_REPORT_ITD001]
    @StartDate DATETIME,
    @EndDate DATETIME,
    @ReportTab INT
AS
-- Deter parameter sniffing
DECLARE @snf_StartDate DATETIME = @StartDate
DECLARE @snf_EndDate DATETIME = @EndDate
DECLARE @snf_ReportTab INT = @ReportTab
...this means that when your SPROC is executed by SSRS, the optimizer compiles the plan against the local variables, using average column statistics rather than the specific sniffed parameter values, so a plan built for one atypical parameter set is not reused for every run. In my case this cut execution time in SSRS considerably.
If your SPROC declares a lot of table variables (DECLARE @MyTable AS TABLE), these can be hard on the server when generating reports: SQL Server keeps no statistics on table variables, so the optimizer can badly misestimate their row counts. By using local temp tables (SELECT MyCol1, MyCol2 INTO #MyTable) instead, the data lives in tempdb with statistics, which usually produces better plans and makes report generation less intensive.
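A sketch of that conversion, using hypothetical table and column names:

```sql
-- Before: a table variable -- no statistics, so the optimizer often
-- assumes very few rows regardless of what is actually inserted
DECLARE @MyTable TABLE (MyCol1 INT, MyCol2 VARCHAR(50));
INSERT INTO @MyTable (MyCol1, MyCol2)
SELECT MyCol1, MyCol2 FROM dbo.SourceTable;

-- After: a local temp table -- created in tempdb and carrying
-- statistics, so joins against it get realistic row estimates
SELECT MyCol1, MyCol2
INTO #MyTable
FROM dbo.SourceTable;
```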
Sometimes adding the WITH RECOMPILE option to the CREATE statement of the stored procedure helps.
This is effective when the number of records examined by the procedure changes enough that the original execution plan is no longer optimal.
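For example (the procedure and table names are hypothetical):

```sql
-- WITH RECOMPILE discards the cached plan and compiles a fresh one on
-- every call, so the plan always reflects the current parameter values
CREATE PROCEDURE dbo.usp_MyReport
    @StartDate DATETIME,
    @EndDate DATETIME
WITH RECOMPILE
AS
SELECT COUNT(*) AS Total
FROM dbo.SalesFact
WHERE SaleDate BETWEEN @StartDate AND @EndDate;
GO
```

A lighter-weight alternative is to leave the procedure as-is and append OPTION (RECOMPILE) only to the statement whose plan is sensitive to the parameters, so the rest of the procedure keeps its cached plan.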
Basically, all I've done so far is optimise the sproc a bit more, and that seems to at least temporarily solve the problem.
I would still like to know what the difference is between calling the sproc from SSMS and SSRS.