I'm trying to recreate an Access report as a Crystal report and I'm having trouble.
Basically, the Access report runs a stored procedure (which returns nothing, but populates tables), then those tables are queried to display the data on the report.
So I'm trying to figure out how to run the procedure with parameters from a Crystal report. I've got the second part working fine: if I run the procedure manually and then display the report, I get the appropriate data.
How can I execute the stored procedure before querying those tables from Crystal Reports?
Have you tried combining the stored procedure and the queries into one stored procedure? Execute the one that populates the tables, and then run the queries that return the data?
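Something along these lines might work. This is only a rough sketch with made-up procedure, table and parameter names, since I don't know your schema:

    -- Hypothetical wrapper: run the populate step, then return the rows in one call.
    CREATE PROCEDURE dbo.usp_ReportData
        @StartDate date,
        @EndDate   date
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Step 1: the existing procedure that only populates the work tables
        EXEC dbo.usp_PopulateReportTables @StartDate, @EndDate;

        -- Step 2: the query the report needs, returned as this procedure's result set
        SELECT w.*
        FROM dbo.ReportWorkTable AS w;
    END

Crystal can then use the wrapper procedure as its data source and prompt for the two parameters itself.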
Use Visual Studio (assuming that since you were using Access, you may be a Microsoft shop) to create a form that triggers the stored procedure and then opens the Crystal Report.
This way users don't have to have Crystal Reports installed, just your app.
Can you get away with a combined Access & Crystal approach?
The Access db prompts the user for parameters and does whatever it needs to get & filter the data from the db.
Crystal generates its report using the Access db as its source.
This would be like taking the original Access file, deleting just the report, and recreating just the report in Crystal, using the original Access queries as your source.
Firstly, some background on the environment:
The RDLs are designed in Report Builder 3.0 (the pre-2016 one).
The RDLs are hosted on what appears to be an SSRS 2014 Reporting Services server (I say "appears to be" because the alias of the Report Manager and Web Service URL is "SSRS_2014", and the DBAs told me so).
Our database server is either running SQL Server 2014 or 2016. I am running SQL Server 2017 on my workstation.
The problem:
I have an SSRS report data set that retrieves information from a very standard stored procedure. Recently I had to change one line of code and add a column related to this change to the result set. The stored procedure works as expected when I test it in SSMS Query Analyser. Here's an excerpt of the intended results:
But after refreshing the dataset (which adds the new column), the dataset now returns inconsistent values under two columns for all the records returned, whilst the stored procedure that retrieves the data returns the correct values when I run it in SSMS. Even when I run the query in Query Designer it still returns inconsistent values.
Here’s a screenshot:
This is not a shared data set and, from what I can gather, the report does not have any caching applied. When trying to check whether there is, or when trying to set up caching, Report Manager returns the following error:
An error occurred within the report server database. This may be due to a connection failure, timeout or low disk condition within the database. (rsReportServerDatabaseError)
Get Online Help For more information about this error navigate to the report server on the local server machine, or enable remote Errors.
As below:
This behaviour is occurring even under the following circumstances:
I deleted and added the data set.
I deleted and added the data source.
I created a new data source.
I created a new RDL.
I created a copy of the stored procedure (Renamed slightly).
I added copies of these columns (Renamed slightly) to the stored procedure (They behave the same).
I fiddled with the “Use single transaction when processing the queries” option (it was off).
The clinchers:
(Bear in mind that the stored procedure returns the data correctly in SSMS).
- When I hard-code the values in the stored procedure, these values appear when executing the stored procedure via SSMS Query Analyser but not when running it via Query Designer (Same inconsistent values).
- I have another (summary) report that obtains data from this stored procedure (Embedded) via another procedure. That report returns the data correctly.
- I tried to create a second stored procedure that executes this stored procedure (Similarly to what the “summary” one does) and it still misbehaves.
- When I take the script that I use to test the stored procedure in SSMS Query Analyser and use it as the dataset query, it returns the correct values! But this is not ideal because the parameter values are hard-coded in that script, and when I remove those parameters it goes back to the normal (mis)behaviour.
I was considering adding a type of snapshot table, where the stored procedure would first build the data and then select from this table to return the data. The problem here is that this report is run by multiple users and I do not have the time (3 days before Christmas and I am already on Christmas leave) to go and design a whole snapshot system.
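Roughly what I had in mind, with every object name below being just a placeholder, was a snapshot table keyed per user so concurrent runs don't clash:

    -- Placeholder names throughout; just a sketch of the idea.
    CREATE TABLE dbo.ReportSnapshot
    (
        RunBy sysname     NOT NULL,  -- login that ran the report
        ColA  int         NULL,
        ColB  varchar(50) NULL
    );
    GO

    CREATE PROCEDURE dbo.usp_ReportWithSnapshot
    AS
    BEGIN
        SET NOCOUNT ON;

        -- clear this user's previous snapshot
        DELETE FROM dbo.ReportSnapshot WHERE RunBy = SUSER_SNAME();

        -- build the data (this stands in for the existing report logic)
        INSERT INTO dbo.ReportSnapshot (RunBy, ColA, ColB)
        SELECT SUSER_SNAME(), s.ColA, s.ColB
        FROM dbo.SourceTable AS s;

        -- return only this user's rows
        SELECT ColA, ColB
        FROM dbo.ReportSnapshot
        WHERE RunBy = SUSER_SNAME();
    END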
I did a lot of research on the internet yesterday and went through all the suggestions given on the following sites/forums, to no avail:
Why is my SSRS report showing old data?
SSRS: field shows correct in query but wrong in report preview
How to clear cache of 1 stored procedure in sql server
https://jazz.net/forum/questions/243993/report-builder-not-showing-updated-data
https://learn.microsoft.com/en-us/sql/t-sql/database-console-commands/dbcc-freeproccache-transact-sql?view=sql-server-2017
https://social.msdn.microsoft.com/Forums/sqlserver/en-US/e6f86f15-420c-4b64-bc70-01dea93a0995/report-results-and-query-results-returning-different-number-of-rows-why?forum=sqlreportingservices
https://social.msdn.microsoft.com/Forums/sqlserver/en-US/9bc13904-a727-4786-9a11-693570f5f8d4/diable-the-cache-for-all-reports-in-ssrs-2008?forum=sqlreportingservices
And various other ones about caching and so on.
I even got the DBAs to restart that SSRS server to see if that might refresh any form of caching.
I also added the WITH RECOMPILE hint to the stored procedure, to no avail.
So I am at a loss… At this stage I am thinking of taking the stored procedure's code and running it as an embedded query, but that is not best practice.
We are also in the process of handing this project over to another company, so I don't want to leave them with this (I have my pride).
I have a stored procedure which runs several EXEC commands. As a result it returns more than one table. In SQL Server Report Builder or SQL Server Data Tools (SSDT) I can only access the first table it retrieves from this stored procedure. But I need to access the last table, which contains the merged columns from the different tables produced by the different stored procedures.
I have tried hiding the tables other than the last table, but failed. Are there any suggestions you can offer to solve this problem? I appreciate and thank with all my heart anyone who tries to contribute to the solution of my problem.
I found a solution for this problem. It is not quite what I asked for, but it solves the issue. Here is the solution:
I edited all the sub-stored procedures that I use so that they end with "RETURN 0" and no longer produce an output. So when I call them from the main stored procedure they have no visible output in the "Results" window; only the main stored procedure returns a single output. Thus I can use it in Report Builder or SSDT like a normal stored procedure without any further modification.
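In outline it looks something like this (a simplified sketch with made-up names):

    -- Each sub-procedure just writes into a work table and ends with RETURN 0,
    -- so it produces no visible output of its own.
    CREATE PROCEDURE dbo.usp_SubStep1
    AS
    BEGIN
        INSERT INTO dbo.MergedResults (Col1, Col2)
        SELECT t.Col1, t.Col2
        FROM dbo.Table1 AS t;

        RETURN 0;   -- no final SELECT, so nothing shows in the Results window
    END
    GO

    -- Only the main procedure does a SELECT, so Report Builder / SSDT
    -- sees exactly one result set: the merged one.
    CREATE PROCEDURE dbo.usp_MainReport
    AS
    BEGIN
        SET NOCOUNT ON;

        TRUNCATE TABLE dbo.MergedResults;

        EXEC dbo.usp_SubStep1;
        -- further sub-steps as needed

        SELECT Col1, Col2
        FROM dbo.MergedResults;   -- the single output the report uses
    END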
Is there a way to connect a crystal report to multiple databases?
My database (SQL Server) is periodically archived and sometimes, I need to access data from an older "partition"/archive, let's call them DB109 and DB110. I need to produce one report with data from both DB109 and DB110. They have the same structure, same query, etc.
Is there a way to run the report for both DBs without running them separately and ending up with multiple files?
It seems like you could probably make two subreports, with one linking to one db and the other linking to the second. Add a parameter on whether to run the first, second, or both. And then conditionally display the subreports based on the parameter.
A second option would be to have a linked server in your main database to your archive database, and then write a procedure that pulls from both (perhaps also based on a parameter). Then use that procedure as your Crystal source.
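A rough sketch of that second option, assuming DB109 is local and DB110 sits behind a linked server called ARCHIVESRV; the table and column names here are just placeholders:

    -- Placeholder names; DB109 local, DB110 reached through the linked server ARCHIVESRV.
    CREATE PROCEDURE dbo.usp_CombinedArchiveReport
        @IncludeArchive bit = 1
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT o.OrderID, o.OrderDate, o.Amount, 'DB109' AS SourceDb
        FROM DB109.dbo.Orders AS o

        UNION ALL

        SELECT a.OrderID, a.OrderDate, a.Amount, 'DB110' AS SourceDb
        FROM ARCHIVESRV.DB110.dbo.Orders AS a
        WHERE @IncludeArchive = 1;
    END

Point the Crystal report at this one procedure and you get a single report covering both databases.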
I can't find anything on this so I don't think I am asking the question right but here is my situation. I have a stored procedure which the end user passes a list of filter criteria, since I don't know what the filter criteria will be I used dynamic SQL. Further, to allow for more than one user to run the stored procedure concurrently I used all dynamically named temp tables so there would be no collisions. That all works.
Now my problem is how to output the report. Right now I have an SSRS report pointing at a single output table that the stored procedure dumps its results into. When the stored procedure finishes, the report is displayed. This works for one user, but if two users run the stored procedure at the same time I have no way of knowing which output data will show on the report. Complicating matters, the dynamic user filter criteria can greatly affect the time the stored procedure takes to complete. I can see the report loading data just as the other stored procedure session is truncating or loading data into the output table.
I can queue up requests and run them one at a time, but ideally I want them to be able to run concurrently, as several users have to run this report many times at the beginning of each month. Is there a way to ensure that the data displayed on the report to the end user matches the data output from the stored procedure session the user ran?
EDIT:
The following is not a requirement, but to clarify how this works now: the end user goes to a web site and enters some filter criteria into a bunch of text boxes, one for each filterable database field, using a third party's search wildcard format that they already know. I then take that input, clean it up and parse it into a SQL WHERE string, which is then passed to the stored procedure that gets the data. When the stored procedure finishes, control is passed back to the website, which then displays the report as an embedded object. So the website is calling the parsing method, the stored procedure and the report.
SQL Server will create a separate session for each report user. It will invoke the stored procedure in its own session (SQL Server is a multi-session product), run the proc with the selected parameters, and produce unique results, which are then passed back to the report user who invoked it and shown in the report. If ten users simultaneously invoke the report with unique parameters, they will each see different data in the body of the report.
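In other words, the pattern assumed here is that the procedure takes the filter as a parameter and returns its rows directly as a result set, instead of writing them into a shared output table. A minimal sketch, with made-up table/column names and assuming the WHERE fragment has already been built and sanitised by your parsing code:

    -- Made-up names; @WhereClause is assumed to be built and sanitised by the web page.
    CREATE PROCEDURE dbo.usp_FilteredReport
        @WhereClause nvarchar(max)
    AS
    BEGIN
        SET NOCOUNT ON;

        DECLARE @sql nvarchar(max) =
            N'SELECT c.CustomerID, c.Name, c.Balance
              FROM dbo.Customers AS c
              WHERE ' + @WhereClause + N';';

        -- The rows go straight back to whichever session called the procedure,
        -- so two users running the report at the same time never see each other's data.
        EXEC sys.sp_executesql @sql;
    END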
I ended up reworking the whole thing as Benjamin suggested and it works fine, but that really was not the answer I was looking for. I finally stumbled upon it while researching something else: you can use the report viewer's DataSources.Add() method to add your own dataset. This would have allowed me to run the stored procedure in the web page so I could catch and handle the errors at the web page level and give more useful feedback to the end users. See this other post, "setting the datasource for a local report net report viewer", for more detail.
The example is for a local report, but I was able to get it to work on a remote report (though I am on a trusted, internal-only network). As I understand it, you can also do it with the newer report viewers.
I have a couple of stored procedures that run for about 2-3 minutes apiece (lots of data). When I run the stored procedures inside SQL Server Management Studio, the queries run fine and return the appropriate data; however, when I run my SSRS report, it errors out with "Object has been disconnected or does not exist at the server."
Any suggestions? I think it has to do with the time it takes to run all the queries.
I have tried setting WITH RECOMPILE with no luck.
http://social.msdn.microsoft.com/Forums/en-US/sqlreportingservices/thread/ed0ad78d-be17-475f-b8d1-9b2c642c1835
Looks like this may actually be a bug.
Here is a workaround you can do for the report: inside the stored procedure, use a temporary table to store all the data, and then run the filters against the temporary table.
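Something along these lines (just a sketch with made-up names):

    -- Made-up names: load everything into a temp table once, then filter against it.
    CREATE PROCEDURE dbo.usp_LongRunningReport
        @Region varchar(20)
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT s.SaleID, s.Region, s.Amount
        INTO #AllData
        FROM dbo.Sales AS s;            -- the expensive part, done once

        SELECT SaleID, Region, Amount
        FROM #AllData
        WHERE Region = @Region;         -- filters run against the temp table
    END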