Reporting Services Timeout with Range Chart (Gantt)

I have a chart report using a range chart with SSRS 2012. It was working great until last week. Now, it won't finish running in the IDE or on the report server. I get rsInternalError, and when I look at the log files I get a ReportServerStorageException which to me indicates timeout or another problem from SQL Server.
"An error occurred within the report server database. This may be due to a connection failure, timeout or low disk condition within the database."
However, the query is lightning fast, and I am aware of slow parameter queries and the parameter sniffing problem. I've confirmed that is not what is happening here, though.
If I change the report so that it groups on a field that yields only 10 or 20 rows, it works fine. When the row series is grouped on a field that returns 100 to 200+ rows, it fails, but only in some cases; a difference of one or two returned rows can make the difference between working and failing.
In either case, the SQL command sent to the database would be exactly the same. The only difference is on the report rendering side, as there would be more rows on the y-axis.
Anyone have any idea what's happening here? I don't understand why it's failing on the SQL Side. It also shouldn't be failing on the rendering side.
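One way to confirm which side it is failing on is the report server's execution log view, which breaks out query time from processing and rendering time. A minimal sketch, assuming the default catalog name of ReportServer (adjust if yours differs):

    -- Compare where the time goes for recent runs (times are in milliseconds):
    SELECT TOP (20)
           ItemPath,
           TimeStart,
           Status,
           TimeDataRetrieval,  -- running the dataset queries
           TimeProcessing,     -- grouping and aggregating the data
           TimeRendering       -- drawing the chart
    FROM   ReportServer.dbo.ExecutionLog3
    ORDER BY TimeStart DESC;

If TimeRendering balloons on the failing runs while TimeDataRetrieval stays flat, that would point at chart rendering rather than SQL Server.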

Related

SQL Server 2016 Mobile Report Datasets Expire about 30 Days

We are in the process of creating a suite of SQL Server 2016 Reporting Services Mobile Reports for our company's Cloud offering to customers; however, we keep running into a situation where all the datasets expire after a certain time.
We have found that all the datasets on the server seem to stop working 30 days after they have been created, and an error message ("The data set could not be processed. There was a problem getting data from the Report Server Web Service.") is displayed.
To resolve this, all the datasets need to be opened manually and re-saved to the server. As you can imagine, this isn't really a suitable solution, as we have a large number of reports and datasets for each customer.
After a bit of investigation, we have managed to pinpoint a "SnapshotData" table in the report server database which has an "ExpirationDate" column that seems to be linked to the issue.
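For reference, we looked at it with a query along these lines (this is an internal, undocumented table, so treat it as exploratory only; the column names are just what we observed):

    -- Peek at upcoming dataset expirations in the report server catalog:
    SELECT SnapshotDataID,
           ExpirationDate
    FROM   ReportServer.dbo.SnapshotData
    WHERE  ExpirationDate IS NOT NULL
    ORDER BY ExpirationDate;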
Has anyone else come across this before, and could you please advise a possible solution to the datasets expiring? Why would the datasets have an expiration date on them anyway?
A dataset should not expire once it has been created.
In your scenario, did you create a cache for those datasets? Was anything changed in the datasets?
You said the mobile report prompted the "dataset could not be processed" error. Please go to the dataset properties pane and check whether it returns data successfully by clicking Load Data. If not, change to another account and try again.
Besides, please check whether the account used to connect to the data source expired after 30 days, which might have caused the failure of data retrieval.

Changing report parameters intermittently causes loading screen to appear for minutes at a time

I'm using SQL Server and SSRS 2012. Intermittently, when running reports on live environments, changing a single parameter can cause the entire report to lock up, show the loading icon, and not allow other parameter changes for minutes at a time.
I found a similar ticket on Microsoft Connect that said it was fixed in a cumulative update for SQL Server 2008 R2, but I'm experiencing it in SSRS 2012. I'm not sure what to do: because the problem is intermittent, it's difficult to replicate, and I haven't been able to find any solutions online.
EDIT: This happens only when changing the parameter; the loading occurs before I get the chance to hit 'View Report'. It can occur with several of the parameters, most of which have dependencies, and it can be on the parent or the child parameter.
I have also checked the execution log: the time taken to retrieve and process the parameters from the shared datasets is much less than the time the 'loading' box stays on the screen. Max data retrieval time is 20 seconds total, while the loading box lasts for minutes at a time.
Do you mean when you re-run the report after changing a parameter, or just when changing the parameter without hitting View Report? If you are just changing the parameter, is the parameter used to refresh other related parameters? Basically, we need to determine whether the issue is with a query that's executing.
If it is, then it could be a parameter sniffing issue, where the query optimizer has used previous parameters to build a query plan that is not suitable. You can test this quickly by adding OPTION (RECOMPILE) to the end of the affected dataset query (assuming it's just a SQL script), as in the sketch below.
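For example (the table, column, and parameter names here are invented; only the OPTION clause matters):

    -- Hypothetical dataset query; the hint forces a fresh plan on every
    -- execution, which rules parameter sniffing in or out:
    SELECT DISTINCT Region
    FROM   dbo.SalesOrders
    WHERE  OrderDate >= @StartDate
    OPTION (RECOMPILE);

If the lock-ups stop with the hint in place, a stale cached plan was the culprit.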

What's changed on Azure to slow my SQL sproc down to a crawl?

In December 2015 I deployed a small Azure web app (Web API, 1 controller, 2 REST endpoints) along with an Azure SQL DB (1 table, 1.7M rows, 3 stored procedures).
I could call my REST endpoints and get data back within a few seconds. Happy days.
Now I make the same call and my app throws a 500 error. Closer examination shows the SQL access timed out.
I can open the DB (using Visual Studio data tools), run the queries, and call the stored procedures. For my main sproc, execution time is about 50 seconds - way too long for the app to wait.
The data in the table has not changed since deployment, and the app and db have been untouched for the last few months, so how come it ran OK back in December but fails miserably now?
All help greatly appreciated.
The Query Store is available in SQL Server 2016 and Azure SQL Database. It is a sort of "flight recorder" which records a history of query executions.
Its purpose is to identify what has gone wrong when a query execution plan suddenly becomes slow. Unlike DMVs, Query Store data is persisted in tables, so it isn't lost when SQL Server is restarted and can be retained for months.
It has four reports in SSMS. One of them is Top Resource Consuming Queries: its top left pane shows a bar graph in which each bar represents a query, ordered by descending resource usage.
You can select a particular query of interest, and the top right pane then shows a timeline with a point for each execution. In my example, you could see that the query had got much worse, because the second dot showed much higher resource usage. (Actually, I forced this to happen by deliberately dropping a covering index.)
Then you can click on a particular dot and the graphical execution plan is displayed in the lower pane. So in my example, I could compare the two plans to see what had changed. The execution plan flagged a missing index (this feature in itself is not new), and clicking the previous dot showed no such message. So that's a pretty good clue as to what's gone wrong!
The Regressed Queries report has the same format, but it shows only queries that have "regressed" or got worse. So it is ideal for troubleshooting.
I know this doesn't resolve your present situation, unless you happened to have Query Store enabled. However, it could be very useful in the future and for other people reading this.
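For reference, enabling it and pulling the heaviest queries looks roughly like this (the database name is a placeholder; the catalog views ship with the feature):

    -- Turn on Query Store for a database (SQL Server 2016+ / Azure SQL Database):
    ALTER DATABASE [YourDb]
    SET QUERY_STORE = ON (OPERATION_MODE = READ_WRITE);

    -- A rough equivalent of the Top Resource Consuming Queries report:
    SELECT TOP (10)
           qt.query_sql_text,
           SUM(rs.avg_duration * rs.count_executions) AS total_duration_us
    FROM   sys.query_store_query_text AS qt
    JOIN   sys.query_store_query AS q ON q.query_text_id = qt.query_text_id
    JOIN   sys.query_store_plan AS p ON p.query_id = q.query_id
    JOIN   sys.query_store_runtime_stats AS rs ON rs.plan_id = p.plan_id
    GROUP BY qt.query_sql_text
    ORDER BY total_duration_us DESC;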
See MSDN > Monitoring Performance By Using the Query Store: https://msdn.microsoft.com/en-GB/library/dn817826.aspx

System.Data.SqlClient.SqlException: Timeout expired on commit

I am writing some code that imports a large amount of data into three tables, currently around 6 million rows across the three tables. I want to do this in a transaction so that if there are any issues, or the user cancels the import, nothing is imported. This works fine on my own development machine; however, on a slower Amazon EC2 instance with a micro SQL instance, I get the following exception:
System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding
Now, I know that the commit is finishing eventually, because the data is present in the tables when I look. So my question is: can this be easily avoided without adding the connection timeout property to my connection string (I only want this one operation to not time out), or is this a really hard/dangerous thing to be doing?
I am not sure if I should instead import into holding tables and then call stored procedures to move the data when I am ready, because I would assume that would result in a shorter transaction (see the sketch below).
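Something like this is what I have in mind; the table and column names are made up for illustration:

    -- Bulk-load into a staging table first (outside any long transaction),
    -- then move the rows in one short, set-based transaction.
    BEGIN TRANSACTION;

    INSERT INTO dbo.Orders (OrderId, CustomerId, Amount)
    SELECT OrderId, CustomerId, Amount
    FROM   dbo.Orders_Staging;

    TRUNCATE TABLE dbo.Orders_Staging;

    COMMIT TRANSACTION;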
I am using MS SQL Server 2012.
Do comment if I need to add more data.
Many thanks for your help
Check which SP is timing out. If you have a third-party tool like Redgate or Avicode you can figure it out that way, or use Profiler. Then look at the execution plan for the SP or query; if you find any Key Lookups or RID Lookups, resolve those first and try again.
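If you don't have Profiler handy, a quick look at the DMVs while the import is stuck can also show what is running and what it is waiting on. A generic sketch, not specific to this import:

    -- What is each active session running, and what is it waiting on?
    SELECT r.session_id,
           r.status,
           r.command,
           r.wait_type,
           r.total_elapsed_time,  -- milliseconds
           t.text AS running_sql
    FROM   sys.dm_exec_requests AS r
    CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
    WHERE  r.session_id <> @@SPID;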

SSRS Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding

I have a reporting solution with several reports. Up to now, I have been able to add a dataset based on a SPROC with no problems. However, when I try to add the latest dataset and use a SPROC for its query type, clicking Refresh Fields gives me the following error:
Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
I have tested the database connection in Data Source properties>Edit>Test Connection, and it's working fine.
I have increased the timeout to 100 in the following areas:
- The connection string for the data source, which is Connect Timeout=100.
- Tools > Options > Database Tools > Query and View Designers: 'Cancel long running query' is set to 100.
- Tools > Options > Database Tools > Table and Database Designers: checked 'Override connection string time-out value for table designer updates', with 'Transaction time-out after' set to 100.
The SPROC runs fine in the SQL database. It takes about 55 seconds.
Any other ideas?
Thanks.
UPDATE: I now can't add any dataset with a SPROC, even though the SPROCs are all working fine in SQL!
If you are using Report Builder, you can also increase the timeout in your dataset.
I also faced this problem, after adding a new column to a stored procedure.
I overcame the issue in the following way:
Alter the stored procedure, commenting out all of the query except the final SELECT command.
Once the new column has been picked up, un-comment those queries in the SP.
The thing to remember with your report is that when it is run, it will attempt to run ALL the datasets, just to make sure they are runnable and that the data they request can be returned. So running each proc separately does not in fact duplicate what SSRS is trying to do, and to be honest, don't bother.
What you could try is running sp_who while the report is running, or even just manually going through the procedures to see what tables they have in common. Since your proc takes about 55 seconds to return its dataset, I'm going to assume it is doing some heavy lifting. Without the queries, nobody will be able to tell what the exact problem is.
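If sp_who is too terse, sp_who2 adds a BlkBy column that makes blocking chains easy to spot:

    EXEC sp_who2;  -- run while the report executes; check the BlkBy column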
I suggest trying NOLOCK to see if that resolves your issue. If it does, then your procs are fighting over data and blocking each other, possibly in an endless loop. Using NOLOCK is NOT a fix; read what it does and judge for yourself.
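The hint looks like this when applied to a query (table and column names are placeholders):

    -- Read without taking shared locks, at the cost of possible dirty reads:
    SELECT OrderId, Amount
    FROM   dbo.Orders WITH (NOLOCK)
    WHERE  OrderDate >= '2016-01-01';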
My solution was to go to the Dataset Properties for the problem dataset, paste the query into the Query field, click Refresh Fields, and click OK.
