I'm new to CR (this is my first experience in fact). I'm trying to create a report based on a stored procedure I created in SQL Server 2012.
The SP is relatively simple and it runs just fine within SQL Server: all the data is right there in the results.
However, when I create a new standard report in Crystal Reports (2013), I can access my server and database, select the SP, and pick the fields I want to use. It all goes smoothly until I select "Finish". When the report loads, there are only the field headers from the SP, with no data.
I'm lost. I've tried it many times and continue to get no data. When I right-click to check the connection, it confirms the connection. When I right-click on a field in the Field Explorer (I believe it's called) to view the data, there's nothing there.
The strange thing is, I created a view in SQL Server with the same query and when I added that view to CR, it worked fine. All my data was right there.
I also tried using a few other SPs in the database and had the same issue: headers with no data, so I'm pretty confident it's not the SP itself.
Note: after selecting my SP and fields when starting a new Crystal report, I'm presented with a window to choose the date range (which I assume is based on a datetime parameter I have built into the SP). I didn't choose a date range because the end user will be selecting the range they need, so I checked the null boxes. I doubt this plays any part, but I figured I'd mention it.
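For reference, a null-tolerant date filter of the kind I'm describing typically looks something like the sketch below; the procedure, parameter, and column names are placeholders rather than my actual objects:

-- Minimal sketch (placeholder names): a procedure whose date parameters
-- may be passed as NULL, in which case the date filter is skipped.
CREATE PROCEDURE dbo.usp_GetInvoiceData
    @StartDate DATETIME = NULL,
    @EndDate   DATETIME = NULL
AS
BEGIN
    -- Suppress "rows affected" messages so only the result set is returned.
    SET NOCOUNT ON;

    SELECT i.InvoiceID,
           i.InvoiceDate,
           i.Amount
    FROM   dbo.Invoices AS i
    WHERE  (@StartDate IS NULL OR i.InvoiceDate >= @StartDate)
      AND  (@EndDate   IS NULL OR i.InvoiceDate <  DATEADD(DAY, 1, @EndDate));
END;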
There must be something simple I'm missing here. I just don't get it. Any ideas? Thanks for taking the time to help.
Please post your code and the SP so that I can identify the problem.
In SSRS I have changed some table names and want to change the queries in all the reports that use those tables.
I have no trouble changing the queries. But when I run the reports in Report Manager they still use the old code.
If I edit the report in Report Builder I can see that the code has changed.
If I save the report in Report Builder then Report Manager uses the correct code.
How do I make the change through a query without having to open each report and save it again?
You can create a SYNONYM for the old table name to reference the new table name and your reports will still run correctly.
CREATE SYNONYM OldTableName FOR NewTableName
For more information please see this excellent introduction
http://www.sqlservertutorial.net/sql-server-basics/sql-server-synonym/
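If several tables were renamed, the same idea can be scripted per table with a guard so it can be re-run safely. This is only a sketch with placeholder names; substitute your own schema and table names:

-- Placeholder names; repeat this pattern for each renamed table.
IF OBJECT_ID('dbo.OldTableName', 'SN') IS NOT NULL
    DROP SYNONYM dbo.OldTableName;
GO

CREATE SYNONYM dbo.OldTableName FOR dbo.NewTableName;
GO

-- Report queries that still reference dbo.OldTableName now resolve
-- to dbo.NewTableName at run time, with no change to the RDLs.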
Firstly, some background on the environment:
RDLs are designed in Report Builder 3.0 (the pre-2016 one).
RDLs are hosted on what appears to be an SSRS 2014 Reporting Services server (I say "appears to be" because the alias of the Report Manager and Web Service URL is "SSRS_2014", and the DBAs told me so).
Our database server is either running SQL Server 2014 or 2016. I am running SQL Server 2017 on my workstation.
The problem:
I have an SSRS report data set that retrieves information from a very standard stored procedure. Recently I had to change one line of code and add a column related to this change to the result set. The stored procedure works as expected when testing it in SSMS Query Analyser. Here's an excerpt of the intended results:
But after refreshing the dataset (which adds the new column), the dataset now returns inconsistent values in two columns for all the records returned, even though the stored procedure that retrieves the data returns the correct values when I run it in SSMS. Even when I run it in Query Designer it still returns inconsistent values.
Here’s a screenshot:
This is not a shared data set, and from what I can gather the report does not have any caching applied. When trying to see if there is, or when trying to set up caching, Report Manager returns the following error:
An error occurred within the report server database. This may be due to a connection failure, timeout or low disk condition within the database. (rsReportServerDatabaseError)
Get Online Help For more information about this error navigate to the report server on the local server machine, or enable remote Errors.
As below:
This behaviour is occurring even under the following circumstances:
I deleted and added the data set.
I deleted and added the data source.
I created a new data source.
I created a new RDL.
I created a copy of the stored procedure (Renamed slightly).
I added copies of these columns (Renamed slightly) to the stored procedure (They behave the same).
Fiddled with the “Use single transaction when processing the queries” option (Was off).
The clinchers:
(Bear in mind that the stored procedure returns the data correctly in SSMS.)
- When I hard-code the values in the stored procedure, these values appear when executing the stored procedure via SSMS Query Analyser but not when running it via Query Designer (Same inconsistent values).
- I have another (summary) report that obtains data from this stored procedure (Embedded) via another procedure. That report returns the data correctly.
- I tried to create a second stored procedure that executes this stored procedure (Similarly to what the “summary” one does) and it still misbehaves.
- When I take the script that I use to test the stored procedure in SSMS Query Analyser and use it as the dataset query, it returns the values! But this is not ideal because the parameters are declared with fixed values in that script (sketched below), and when I remove those parameter declarations it goes back to the normal (mis)behaviour.
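For clarity, that test script is essentially of this shape; the parameter names and values here are hypothetical:

-- Hypothetical parameter names and values; the real test script
-- declares and sets each parameter before calling the procedure.
DECLARE @FromDate DATE = '20181201',
        @ToDate   DATE = '20181221',
        @BranchID INT  = 42;

EXEC dbo.usp_MyReportProc
     @FromDate = @FromDate,
     @ToDate   = @ToDate,
     @BranchID = @BranchID;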
I was considering adding a type of snapshot table, where the stored procedure would first build the data and then select from this table to return it. The problem here is that this report is run by multiple users and I do not have the time (3 days before Christmas and I am already on Christmas leave) to go and design a whole snapshot system.
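Had there been time, the rough shape of that snapshot idea would be something like the sketch below; every object and column name here is invented for illustration:

-- Invented names: a snapshot table keyed by a run identifier so that
-- concurrent users only ever read back their own run.
CREATE TABLE dbo.ReportSnapshot
(
    RunID      UNIQUEIDENTIFIER NOT NULL,
    CapturedAt DATETIME2        NOT NULL DEFAULT SYSDATETIME(),
    CustomerID INT              NOT NULL,
    Amount     DECIMAL(18, 2)   NOT NULL
);
GO

-- Inside the procedure: build the data under a new RunID, then return
-- only that run's rows.
DECLARE @RunID UNIQUEIDENTIFIER = NEWID();

INSERT INTO dbo.ReportSnapshot (RunID, CustomerID, Amount)
SELECT @RunID, c.CustomerID, SUM(o.Amount)
FROM   dbo.Customers AS c
JOIN   dbo.Orders    AS o ON o.CustomerID = c.CustomerID
GROUP BY c.CustomerID;

SELECT CustomerID, Amount
FROM   dbo.ReportSnapshot
WHERE  RunID = @RunID;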
I did a lot of research on the internet yesterday and went through all the suggestions given in the following sites/forums, to no avail:
Why is my SSRS report showing old data?
SSRS: field shows correct in query but wrong in report preview
How to clear cache of 1 stored procedure in sql server
https://jazz.net/forum/questions/243993/report-builder-not-showing-updated-data
https://learn.microsoft.com/en-us/sql/t-sql/database-console-commands/dbcc-freeproccache-transact-sql?view=sql-server-2017
https://social.msdn.microsoft.com/Forums/sqlserver/en-US/e6f86f15-420c-4b64-bc70-01dea93a0995/report-results-and-query-results-returning-different-number-of-rows-why?forum=sqlreportingservices
https://social.msdn.microsoft.com/Forums/sqlserver/en-US/9bc13904-a727-4786-9a11-693570f5f8d4/diable-the-cache-for-all-reports-in-ssrs-2008?forum=sqlreportingservices
And various other ones about caching and so on.
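For reference, one of the suggestions from those threads, clearing the plan cache for just the one procedure rather than the whole server, boils down to roughly this (the procedure name is a placeholder):

-- Placeholder procedure name: evict only the cached plan belonging to
-- the one procedure instead of flushing the whole plan cache.
DECLARE @PlanHandle VARBINARY(64);

SELECT @PlanHandle = ps.plan_handle
FROM   sys.dm_exec_procedure_stats AS ps
WHERE  ps.object_id   = OBJECT_ID('dbo.usp_MyReportProc')
  AND  ps.database_id = DB_ID();

IF @PlanHandle IS NOT NULL
    DBCC FREEPROCCACHE (@PlanHandle);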
I even got the DBAs to restart that SSRS server to see if that might refresh any form of caching.
I also added the "WITH RECOMPILE" hint to the stored procedure, to no avail.
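For anyone trying the same thing, this is roughly where the hint sits; the procedure, parameters, and query here are placeholders, and OPTION (RECOMPILE) is shown as the statement-level alternative:

-- Placeholder procedure: WITH RECOMPILE forces a fresh plan for the
-- whole procedure on every call; OPTION (RECOMPILE) does the same for
-- a single statement.
ALTER PROCEDURE dbo.usp_MyReportProc
    @FromDate DATE,
    @ToDate   DATE
WITH RECOMPILE
AS
BEGIN
    SET NOCOUNT ON;

    SELECT o.OrderID, o.OrderDate, o.Amount
    FROM   dbo.Orders AS o
    WHERE  o.OrderDate >= @FromDate
      AND  o.OrderDate <  DATEADD(DAY, 1, @ToDate)
    OPTION (RECOMPILE);
END;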
So I am at a loss… At this stage I am thinking of taking the stored procedure's code and running it as an embedded query, but that is not best practice.
We are also in the process of handing this project over to another company, so I don't want to leave them with this (I have my pride).
I can't find anything on this, so I don't think I am asking the question right, but here is my situation. I have a stored procedure to which the end user passes a list of filter criteria; since I don't know what the filter criteria will be, I used dynamic SQL. Further, to allow more than one user to run the stored procedure concurrently, I used dynamically named temp tables throughout so there would be no collisions. That all works.
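To give a feel for that part (the real code differs), the per-run work table is built through dynamic SQL roughly like this; all names here are invented:

-- Invented names: a work table whose name is unique per execution.
-- (A global ##name is used because a local #table created inside one
-- sp_executesql batch is dropped as soon as that batch ends.)
DECLARE @RunTag    NVARCHAR(40)  = REPLACE(CONVERT(NVARCHAR(36), NEWID()), N'-', N'');
DECLARE @WorkTable SYSNAME       = N'##Work_' + @RunTag;
DECLARE @Where     NVARCHAR(MAX) = N'WHERE o.Amount > 100';  -- built from the user's filters
DECLARE @Sql       NVARCHAR(MAX);

SET @Sql = N'SELECT o.OrderID, o.CustomerID, o.Amount
             INTO ' + QUOTENAME(@WorkTable) + N'
             FROM dbo.Orders AS o ' + @Where + N';';

EXEC sys.sp_executesql @Sql;

-- Later dynamic statements reference QUOTENAME(@WorkTable) the same way.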
Now my problem is how to output the report. Right now I have an SSRS report pointing to a single output table that the stored procedure dumps its results to. When the stored procedure finishes, the report is displayed. This works for one user, but if two users run the stored procedure at the same time I have no way of knowing which output data will show on the report. Complicating matters, the dynamic user filter criteria can greatly affect the time the stored procedure takes to complete. I can see the report loading data just as another stored procedure session is truncating or loading data into the output table.
I can queue up requests and run them one at a time, but ideally I want them to be able to run concurrently, as several users have to run this report many times at the beginning of each month. Is there a way to ensure that the data displayed on the report matches the data output by the stored procedure session that user ran?
EDIT:
The following is not a requirement, but to clarify how this works now: the end user goes to a web site and enters filter criteria into a bunch of text boxes, one for each filterable database field, using a third party's search-wildcard format that they already know. I then take that input, clean it up, and parse it into a SQL WHERE string which is then passed to the stored procedure that gets the data. When the stored procedure finishes, control is passed back to the website, which then displays the report as an embedded object. So the website is calling the parsing method, the stored procedure, and the report.
SQL Server will create a separate session for each report user. It will invoke the stored procedure in its own session (SQL Server is a multi-session product) and, using the selected parameters, run the proc and produce its own results, which are then passed back to the report user who invoked it, and the report is shown to that user. If ten users simultaneously invoke the report with unique parameters, they will each see different data in the body of the report.
I ended up reworking the whole thing as Benjamin suggested and it works fine, but that really was not the answer I was looking for. I finally stumbled upon it while researching something else: you can call the report viewer's DataSources.Add() method to add your own dataset. This would have allowed me to run the stored procedure in the web page so I could catch and handle errors at the web page level and give more useful feedback to the end users. See this other post, setting the datasource for a local report net report viewer, for more detail.
The example is for a local report, but I was able to get it to work on a remote report; I am on a trusted, internal-only network though. As I understand it, you can also do it with the newer report viewers.
In the Visual Studio data source designer, is there any way to refresh a table and its relations/foreign key constraints while keeping the custom queries?
The way I am doing it at the moment is removing the table and adding it again. This adds all the relations and refreshes all fields.
Also, if I change a field's data type, is there a way to automatically refresh all the fields in the data source, again without deleting the table and adding it again?
The reason for this is that some of my TableAdapters have quite a number of complex queries attached to them, and when I remove the table the adapter gets removed as well, including all its queries.
I am using Visual Studio 2008 and connecting to a MySQL database.
Anyone have an idea?
Each table has a default query (the one on top with the check on it). When you dragged your tables into the dataset to create the query, it wrote a SQL statement which it uses to determine your table's schema. Keep that query simple; you might not actually use it in code, and you can always edit it to update the table schema.
Every time you open the default query it connects to your data source and allows you to select new columns that weren't in there before. If you want to update your existing columns, delete all the columns out of the table before you attempt to open the query. When you save the query, your updated columns get added back.
Make sure your connection string has permissions to view column information.
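As a concrete example of keeping the default query simple, something of this shape is enough for the designer to read the schema; the table and column names are placeholders:

-- Placeholder table and columns: the default (top) query only needs to
-- expose the table's columns so the designer can read the schema.
SELECT CustomerID, CompanyName, CreatedDate
FROM   Customers;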
I reported this to MSFT but got no response. The designer hangs all the time on the simplest of SQL statements. What I found works for me is:
1. Add a new table to the designer.
2. Save it.
3. Shut down Visual Studio 2010.
4. Start VS 2010.
5. Add one or two more SQL statements and follow steps 2-4 again.
This is a pain in the neck, but it's the only thing that stops the Visual Studio dataset designer from hanging. I experienced this same issue in VS 2008. I am connecting to Oracle, but shutting down VS and starting it back up still works; really, though, this is nonsense.
You can add/change/remove fields and relationships, but I would suggest looking into NHibernate.
You should be able to right-click the dataset in Solution Explorer and select "Run Custom Tool" to refresh the table and its query/relationships.
If that command is not there, check that the dataset properties has "MSDataSetGenerator" in the Custom Tool field.
Right click on your DataSet name and select Dataset Properties
Below the Query box you will see a button for Refresh Fields.
Click on Query Designer and the new field should show in your table list.
I'm using SQL Server Reporting Services 2008 to produce an invoice. The layout of this invoice is fairly standard - a page header/footer, then some address details at the top, followed by a single table for the invoice lines, and a set of rectangles for the totals below the table.
This report worked absolutely fine in SSRS 2005, but since moving to SSRS 2008 I've found a problem with invoices of a certain length. The problematic length is when there are too many rows to fit on page 1, but few enough that the entire table fits on page 2 (i.e. without the address details being displayed at the top). This means that page 1 contains ONLY the address information, whereas it used to also contain the start of the table.
Screenshot of working report (SSRS 2005):
Working Report http://img225.imageshack.us/img225/1439/invoicessrs2005.png
Screenshot of broken report (SSRS 2008):
Broken Report http://img225.imageshack.us/img225/69/invoicessrs2008.png
I've played with the KeepTogether property of the table (which was set to False anyway), with no effect.
Does anyone have any suggestions how I can make this work?
Richard,
I ran into this incredibly annoying issue before when I was placing graphs on a report that I redid as well. I got so fed up with it, and couldn't find anything that explained why it was happening, that I just shrank the actual size of the table (squished it to the left a bit) until it printed correctly in the preview. It was meant to serve as a temporary solution until I found out why this happened, but it is still used like that and it's been about 6 months now.
I recently upgraded to SQL Server 2008 R2, and decided to revisit this problem. It seems to have gone away now with the R2 update :)