I built my code in SSMS. It returns 900 rows. I ported the same code over to SSRS, ran it through Query Builder, and it returned plenty of records. Great!
For simplicity's sake, I just pull a table onto the report and point it at my newly created dataset, which shows rows upon rows of data in the Query Builder. I drag one column over (OrderID) just to make sure it's working, then preview the report.
Nothing shows up in my table.
For troubleshooting purposes, I write a simple SQL statement to pull the top 100 rows from just one table, change the dataset to this new test dataset, and run it. It runs fine: I get many pages of a single column. The table, dataset, and data source are all connected and working.
Why does my statement, which works in SSMS and in the Query Builder within SSRS, return no data in the SSRS table?
No, I haven't changed any properties or anything of the sort.
It's the exact same data source in SSRS that I'm testing against in SSMS.
If this comes up again, it could very well be a corrupt data source, be it embedded or shared.
In my case it was shared. I deleted it altogether, recreated it, and tested it, and when I previewed the report, all my data was there.
Have a nice day.
In SSRS I have changed some table names and want to change the queries in all the reports that use those tables.
I have no trouble changing the queries. But when I run the reports in Report Manager, they still use the old code.
If I edit the report in Report Builder, I can see that the code has changed.
If I save the report in Report Builder, then Report Manager uses the correct code.
How do I make the change through a query, without having to open each report and save it again?
You can create a SYNONYM that maps the old table name to the new table name, and your reports will still run correctly.
CREATE SYNONYM OldTableName FOR NewTableName
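If your tables are schema-qualified (an assumption, since the question doesn't say), include the schema on both sides, and drop any stale synonym first; a sketch:

-- Drop the synonym if one already exists, then point the old name
-- at the new table. The dbo schema here is an assumption.
IF OBJECT_ID('dbo.OldTableName', 'SN') IS NOT NULL
    DROP SYNONYM dbo.OldTableName;

CREATE SYNONYM dbo.OldTableName FOR dbo.NewTableName;

Note that a synonym can only be created once no table named OldTableName exists, so this works after the old table has been renamed or dropped.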
For more information, please see this excellent introduction:
http://www.sqlservertutorial.net/sql-server-basics/sql-server-synonym/
I'm trying to design a Matrix report in SSRS to aggregate one column across a dynamic range of values in another column (i.e. a pivot). The data consists of just over 13 million rows, so it's a large dataset.
When doing a PIVOT on this data via T-SQL, it aggregates all of these rows in roughly a minute; however, when I get SSRS to do the pivoting for me through a Matrix report, I get an OutOfMemory exception while trying to preview the report on my PC.
The query returning the dataset itself isn't complicated; it's as simple as:
SELECT
    ID,
    Test_Ref,
    Data_issue_indicator
FROM MyTable
We're trying to sum Data_issue_indicator (which is either 1 or 0) for each value of Test_Ref, and the range of Test_Ref values is dynamic. In other words, we cannot use a standard Tablix report, because the number of columns can grow at any time if a new Test_Ref value is introduced into the dataset.
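For reference, the T-SQL PIVOT I compared against looked roughly like this (a sketch reconstructed from the columns above; STRING_AGG needs SQL Server 2017 or later, older versions would need FOR XML PATH instead):

DECLARE @cols NVARCHAR(MAX), @sql NVARCHAR(MAX);

-- Build the pivot column list from whatever Test_Ref values exist today
SELECT @cols = STRING_AGG(CAST(QUOTENAME(Test_Ref) AS NVARCHAR(MAX)), N', ')
FROM (SELECT DISTINCT Test_Ref FROM MyTable) AS t;

SET @sql = N'
SELECT ID, ' + @cols + N'
FROM (SELECT ID, Test_Ref, Data_issue_indicator FROM MyTable) AS src
PIVOT (SUM(Data_issue_indicator) FOR Test_Ref IN (' + @cols + N')) AS p;';

EXEC sys.sp_executesql @sql;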
I'm using Visual Studio Enterprise 2019, and my PC is a Windows 10, i7-8850H, with 16GB memory.
Is there a suggestion on getting around this issue?
When using SSRS, the usual advice is to grab more data in one go in case the dataset is used multiple times. But with a larger dataset it becomes a trade-off between what you want to achieve and whether you actually need all that data.
So in this situation I would suggest using a stored procedure to restrict the amount of data you pull into the report; see the sketch below.
I have been through this sort of scenario and had to do the same, because it's not the query that times out but the huge amount of data loaded into the report that makes it fail.
If you watch with SQL Server Profiler, you will see the SQL execute and complete, yet the report still times out while rendering.
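As a rough sketch of that idea (table and column names come from the question; the procedure name and the GROUP BY are my assumptions), pre-aggregating in the procedure hands the Matrix one row per (ID, Test_Ref) pair instead of every detail row:

-- Pre-aggregate in the database so SSRS only lays out groups
-- rather than pivoting 13 million detail rows itself.
CREATE PROCEDURE dbo.GetDataIssueSummary
AS
BEGIN
    SET NOCOUNT ON;

    SELECT ID,
           Test_Ref,
           SUM(Data_issue_indicator) AS Issue_Count
    FROM MyTable
    GROUP BY ID, Test_Ref;
END;

The Matrix can then aggregate with SUM(Fields!Issue_Count.Value) and still produce the same totals.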
Two ideas, assuming that you plan to deploy the report to a server with enough memory to handle this, and that you'd prefer to do the processing on the report server rather than the SQL Server for some reason:
1. Don't test the functionality on your PC in Visual Studio. Design the report, deploy it to your Report Server, and test it there to see if it works.
2. When testing on your PC, force the report to use a much smaller dataset: one just large enough to verify that the pivoting Matrix works, but small enough that your PC's memory can handle it.
Or better yet, do option 2 first, and then option 1.
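For option 2, one low-tech way to do that (my suggestion, not something from the original answer) is to cap the dataset query while designing and remove the cap before deploying:

-- Design-time only: keep Preview within your PC's memory.
-- Delete the TOP clause (or raise the limit) before deployment.
SELECT TOP (5000)
       ID,
       Test_Ref,
       Data_issue_indicator
FROM MyTable;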
I'm new to CR (this is my first experience in fact). I'm trying to create a report based on a stored procedure I created in SQL Server 2012.
The SP is relatively simple and it runs just fine within SQL Server: all the data is right there in the results.
However, when I create a new standard report in Crystal Reports (2013), I can access my server and database, select the SP, and pick the fields I want to use. It goes smoothly until I select "Finish": when the report loads, only the field headers from the SP appear.
I'm lost. I've tried it many times and continue to get no data. When I right-click to check the connection, it confirms the connection. When I right-click on a field in the Field Explorer (I believe it's called) to browse the data, there's nothing there.
The strange thing is, I created a view in SQL Server with the same query and when I added that view to CR, it worked fine. All my data was right there.
I also tried a few other SPs in the database and had the same issue (headers with no data), so I'm pretty confident it's not the SP itself.
Note: after selecting my SP and fields when starting the new report, I'm presented with a window to choose the date range (which I assume is based on a datetime parameter I have built into the SP). I didn't choose a date range because the end users will select the range they need, so I checked the null boxes. I doubt this plays any part, but I figured I'd mention it.
There must be something simple I'm missing here. I just don't get it. Any ideas? Thanks for taking the time to help.
Please post your code and the SP so I can identify the problem.
I'm pretty new to creating Dynamic Data websites - I created one a year ago that is working fine, but now I'm building a second one and having a strange problem.
I'm using VS2010 and VB.NET, and I followed Microsoft's walkthrough for creating a web site using LINQ to SQL. Things seem to be working, with one MAJOR exception. I've pointed it at a SQL Server in my domain, but it won't display rows from the DB that were not created through the website. Records that I manually enter through SSMS do not show in the Dynamic Data list for the table, but rows I inserted through the website do appear.
I have cleared the table and refreshed the list view, and all rows disappear. Then if I add rows via SSMS they do NOT show up, but website-added ones do. This is important because I'm going to be importing data from an old (non-SQL) DB that I'm converting.
I really don't know where to look to resolve this - Does anyone have any ideas?
Thanks!
The records I entered through SSMS had bad values for a foreign key, and evidently Dynamic Data does inner joins instead of left outer joins. I don't know if that is a bug or a feature.
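For anyone hitting the same thing, a quick way to spot those orphaned rows (a sketch; the table and column names are placeholders) is to left-join the child table back to its parent:

-- Child rows whose foreign key has no matching parent row.
-- Dynamic Data's inner join silently filters these out of its lists.
SELECT c.*
FROM ChildTable AS c
LEFT JOIN ParentTable AS p
    ON p.ParentID = c.ParentID
WHERE p.ParentID IS NULL;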
In the Visual Studio data source designer, is there any way to refresh a table and its relations/foreign-key constraints while keeping the custom queries?
The way I am doing it at the moment is removing the table and adding it again. This adds all the relations and refreshes all fields.
Also, if I change a field's data type, is there a way to automatically refresh all the fields in the data source? Again, without deleting the table and adding it back.
The reason is that some of my TableAdapters have quite a number of complex queries attached to them, and when I remove the table, the adapter gets removed as well, including all its queries.
I am using Visual Studio 2008 and connecting to a MySQL database.
Anyone have an idea?
Each table has a default query (the one on top with the check mark on it). When you dragged your tables into the dataset to create the query, the designer wrote a SQL statement which it uses to infer your table's schema. Keep that query simple; you might not actually use it in code, and you can always edit it to update the table schema.
Every time you open the default query, it connects to your data source and lets you select new columns that weren't there before. If you want to update your existing columns, delete all the columns from the table before you open the query. When you save the query, your updated columns get added back.
Make sure your connection string has permissions to view column information.
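If it doesn't, granting VIEW DEFINITION to the account from your connection string (a sketch; the object and login names are placeholders) covers the column metadata without widening data access:

-- Lets the login read column metadata for the table
-- without granting any additional data access.
GRANT VIEW DEFINITION ON OBJECT::dbo.MyTable TO [MyAppLogin];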
I reported this to Microsoft but got no response. The designer hangs all the time on the simplest of SQL statements. What I found works for me is:
1. Add a new table to the designer.
2. Save it.
3. Shut down Visual Studio 2010.
4. Start VS 2010 again.
5. Add one or two more SQL statements, then follow steps 2-4 again.
This is a pain in the neck, but it's the only thing that stops the Visual Studio dataset designer from hanging. I experienced this same issue in VS 2008. I am connecting to Oracle, but shutting down VS and starting it back up still works. Really, though, this is nonsense.
You can add/change/remove fields and relationships, but I would suggest looking into NHibernate.
You should be able to right-click the dataset in Solution Explorer and select "Run Custom Tool" to refresh the table and its queries/relationships.
If that command is not there, check that the dataset's properties have "MSDataSetGenerator" in the Custom Tool field.
Right-click on your DataSet name and select Dataset Properties.
Below the Query box you will see a Refresh Fields button.
Click on Query Designer and the new field should show up in your table list.