Crystal Reports "Show SQL Query" - how accurate is it? - sql-server

The business wants new Crystal Reports to be built in the designer only, with fewer database objects such as procedures and views, the aim being more transparent maintenance in the future. But this is making tasks more difficult for me. I am wondering how accurate Show SQL Query actually is: I have little control over the actual SQL query being run on the server and feel like I am flying blind.
Obviously it makes more sense to build as much of the logic as possible server-side, but I am trying to keep the client happy. My latest foray highlights some glaring issues with Show SQL Query:
Multiple table hits are not shown. For example, three different tables each need to hit the same base table. I have linked the base table multiple times and renamed the aliases to suit, but the query shown by Show SQL Query only mentions one of the three, even though the report works off all three repeated tables just fine (see the sketch after these two points for the kind of SQL I would expect).
Nested logic is problematic. I have to copy and paste into a text editor and do a bit of formatting to make the SQL readable, and even then it just doesn't look right logic-wise.
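For context, when the same base table is linked several times in the designer, the generated SQL should contain one aliased join per link. A minimal sketch of what I would expect Show SQL Query to report (all table, alias, and column names here are made up for illustration):

    -- Three tables each joining back to the same Customer base table,
    -- which the designer would normally alias (e.g. Customer, Customer_1, Customer_2).
    SELECT  a.OrderID          AS OrderA,
            b.OrderID          AS OrderB,
            c.OrderID          AS OrderC,
            cust1.CustomerName AS CustomerA,
            cust2.CustomerName AS CustomerB,
            cust3.CustomerName AS CustomerC
    FROM    Orders_A AS a
    JOIN    Customer AS cust1 ON cust1.CustomerID = a.CustomerID
    JOIN    Orders_B AS b     ON b.OrderID        = a.OrderID
    JOIN    Customer AS cust2 ON cust2.CustomerID = b.CustomerID
    JOIN    Orders_C AS c     ON c.OrderID        = a.OrderID
    JOIN    Customer AS cust3 ON cust3.CustomerID = c.CustomerID;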
Is there another way to keep everyone happy?

Related

SQL Server tables connection

I have to join multiple tables that live in one or more databases. Approximately 10-15 tables have to be joined in each query to generate the data for analysis in SQL Server 2014.
I don't have access to a database diagram or the architecture, and these reports are to be sent out weekly. I want to understand how to approach writing these kinds of queries, both basic and advanced, how to identify the relationships between the tables, and which advanced techniques I could learn or use here, such as CTEs, RANK/PARTITION BY, subqueries, etc.
A rough flow diagram or outline of the approach would be really helpful.
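For reference, here is a minimal, hedged sketch of two of the techniques mentioned above, a CTE combined with ROW_NUMBER() OVER (PARTITION BY ...); the table and column names are invented for illustration:

    -- The CTE ranks each customer's orders by date; the outer query keeps the latest one.
    WITH RankedOrders AS
    (
        SELECT  o.CustomerID,
                o.OrderID,
                o.OrderDate,
                ROW_NUMBER() OVER (PARTITION BY o.CustomerID
                                   ORDER BY o.OrderDate DESC) AS rn
        FROM    dbo.Orders AS o
    )
    SELECT  c.CustomerName,
            r.OrderID,
            r.OrderDate
    FROM    RankedOrders AS r
    JOIN    dbo.Customers AS c ON c.CustomerID = r.CustomerID
    WHERE   r.rn = 1;   -- latest order per customer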
It's very unlikely that the owners of those source systems want them queried directly every time someone runs a report. Since you already have access to SQL Server, I would suggest building a data warehouse there.
You haven't provided a whole lot of information to go on, but SSIS packages could be created to connect to the source systems and load the data into your data warehouse. Those packages can then be scheduled through SQL Server Agent.
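As a rough illustration of the scheduling side (the job name, package path, and schedule are assumptions, and the package would need to be deployed first):

    USE msdb;
    GO
    -- Hypothetical Agent job that runs a deployed SSIS package every night at 02:00.
    EXEC dbo.sp_add_job         @job_name  = N'Load_DW_Nightly';
    EXEC dbo.sp_add_jobstep     @job_name  = N'Load_DW_Nightly',
                                @step_name = N'Run warehouse load package',
                                @subsystem = N'SSIS',
                                @command   = N'/SQL "\LoadWarehouse" /SERVER "(local)"';
    EXEC dbo.sp_add_jobschedule @job_name  = N'Load_DW_Nightly',
                                @name      = N'Nightly at 2am',
                                @freq_type = 4,              -- daily
                                @freq_interval = 1,
                                @active_start_time = 20000;  -- 02:00:00
    EXEC dbo.sp_add_jobserver   @job_name  = N'Load_DW_Nightly';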
As for modeling: again it is difficult with the lack of information, but the star schema generally works great for reporting, that is, a fact table surrounded by dimension (or attribute) tables.
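A hedged sketch of what a typical star-schema report query looks like (all table and column names are illustrative):

    -- FactSales is the fact table; DimDate, DimCustomer, DimProduct are dimensions.
    SELECT  d.CalendarYear,
            d.MonthName,
            c.CustomerRegion,
            p.ProductCategory,
            SUM(f.SalesAmount) AS TotalSales,
            COUNT(*)           AS OrderLines
    FROM    dbo.FactSales   AS f
    JOIN    dbo.DimDate     AS d ON d.DateKey     = f.DateKey
    JOIN    dbo.DimCustomer AS c ON c.CustomerKey = f.CustomerKey
    JOIN    dbo.DimProduct  AS p ON p.ProductKey  = f.ProductKey
    GROUP BY d.CalendarYear, d.MonthName, c.CustomerRegion, p.ProductCategory;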
As for figuring out relationships without a diagram, this will have to be done through experimentation and by tying the results back to existing reports to make sure your joins aren't dropping records or cascading (multiplying) them.
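If the source databases happen to have foreign keys declared, the system catalog can shortcut some of that experimentation; a read-only sketch:

    -- Lists declared foreign-key relationships: which child column references which parent column.
    SELECT  fk.name AS ForeignKeyName,
            OBJECT_NAME(fkc.parent_object_id)                             AS ChildTable,
            COL_NAME(fkc.parent_object_id, fkc.parent_column_id)          AS ChildColumn,
            OBJECT_NAME(fkc.referenced_object_id)                         AS ParentTable,
            COL_NAME(fkc.referenced_object_id, fkc.referenced_column_id)  AS ParentColumn
    FROM    sys.foreign_keys        AS fk
    JOIN    sys.foreign_key_columns AS fkc ON fkc.constraint_object_id = fk.object_id
    ORDER BY ChildTable, ForeignKeyName;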
Good luck.

Update Multiple SSRS Reports in bulk

I need to make identical changes to hundreds of reports, and I was hoping to do this via SQL instead of editing each individual report and its query. I can extract the report query from the XML and generate a list of reports, their locations, and the queries being used. What I cannot figure out is how to update the report query and get that update back into the Catalog database so that the report itself reflects the changes when executed. I have never seen where this is possible, but maybe someone here has tried it or knows that it's flat out not possible.
I could use SSIS for this, but I would prefer not to download all the RDLs, update them, and then redeploy/upload the reports. I was hoping to update the reports/RDLs in place.
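For reference, the read side described above, pulling each report's dataset command text out of the ReportServer Catalog table, usually looks something like the sketch below. Treat it as an assumption-laden example: the database name and RDL structure vary by version, and writing changes back into Content this way is not supported.

    -- Read-only: extract each report's dataset command text from the report server catalog.
    -- The local-name() predicates sidestep the RDL namespace, which differs between versions.
    WITH ReportXml AS
    (
        SELECT  c.[Path],
                c.Name,
                CONVERT(XML, CONVERT(VARBINARY(MAX), c.Content)) AS Rdl
        FROM    ReportServer.dbo.[Catalog] AS c
        WHERE   c.[Type] = 2    -- 2 = report
    )
    SELECT  r.[Path],
            r.Name,
            q.n.value('.', 'NVARCHAR(MAX)') AS CommandText
    FROM    ReportXml AS r
    CROSS APPLY r.Rdl.nodes('//*[local-name()="CommandText"]') AS q(n);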
You shouldn't have to download the RDLs; they should already be in your source control system, ideally collected and grouped into projects. If so, you are in luck: you can use the global search/replace capabilities of Visual Studio (BIDS) or Notepad++ to make your change.
If your change were to the structure of the report, then you could simply write a quick and nasty console app to load the RDL and manipulate the XML structure. But things like the report query are held as free-form text in a node, which makes it harder to apply mass updates in a reliable way.
You could look at refactoring the report queries into stored procedures and/or functions; this will make future updates a bit easier. In any case, if you change the report RDLs you have no option but to republish the modified ones; there is no such thing as an in-place change on the server (having your queries in stored procedures would have avoided this issue).
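A minimal sketch of that refactoring, with made-up procedure and parameter names (CREATE OR ALTER needs SQL Server 2016 SP1 or later; use separate CREATE/ALTER statements on older versions). The report dataset is reduced to a one-line EXEC, so later query changes only need an ALTER PROCEDURE rather than a republish of the RDL:

    -- The query currently embedded in the RDL moves into a procedure...
    CREATE OR ALTER PROCEDURE rpt.SalesSummary
        @StartDate DATE,
        @EndDate   DATE
    AS
    BEGIN
        SET NOCOUNT ON;
        SELECT  s.Region,
                SUM(s.Amount) AS TotalAmount
        FROM    dbo.Sales AS s
        WHERE   s.SaleDate >= @StartDate
          AND   s.SaleDate <  @EndDate
        GROUP BY s.Region;
    END;
    GO
    -- ...and the report dataset's command text becomes just:
    --   EXEC rpt.SalesSummary @StartDate, @EndDate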

Reporting Solution for Data Validation Queries

I'd like to get some advice on a reporting situation that I have. I am working in SQL Server. I have a ton of data validation queries that I run against a database. In general, for each query, I return two things -- one is the count of the offending records, and the other is the offending records themselves.
My goal is to produce a report that gives the counts of the offending records for all data validation queries (ideally on one sheet of an Excel workbook) and the offending records themselves (ideally on separate sheets of the same workbook).
How is this best achieved? That is, what technology is best for this situation? For example, in the past I have prototyped the queries in SSMS, copied them into a Windows batch file that calls the sqlcmd utility (with code added to write the results to separate text files), and run the batch file from the command prompt. However, I know that other solutions exist (e.g., SSRS). Would something like SSRS be a better tool for this situation? I'm hesitant to go the SSRS route, since I'm only giving metrics on one issue (i.e., counts of offending records) and the rest of the report consists of the offending records themselves.
This might get closed because it is a matter of opinion, but SSRS would be a good solution for this requirement. I think SSRS is a good fit if you have the following criteria:
You need to visualize the data in some kind of a table, chart, or graph
You want to send out automated emails every morning / week / month to a group of users (as opposed to just individual consumption)
You want to be able to export the report to other formats (Excel or PDF) for additional analysis or sharing.
Otherwise, if it's just for you and you currently don't have SSRS running on the server, save yourself the overhead of running another service and just keep doing it in batch files.
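Whichever route you take, it helps to shape the validation checks so the summary comes back as one result set and each detail query feeds its own sheet; a hedged sketch with made-up check names and tables:

    -- One row per validation check: this feeds the summary sheet.
    SELECT 'Orders missing customer' AS CheckName, COUNT(*) AS OffendingRows
    FROM   dbo.Orders AS o
    WHERE  NOT EXISTS (SELECT 1 FROM dbo.Customers AS c
                       WHERE c.CustomerID = o.CustomerID)
    UNION ALL
    SELECT 'Negative order amounts', COUNT(*)
    FROM   dbo.Orders
    WHERE  TotalAmount < 0;

    -- One detail query per check then feeds its own sheet, e.g.:
    SELECT o.*
    FROM   dbo.Orders AS o
    WHERE  NOT EXISTS (SELECT 1 FROM dbo.Customers AS c
                       WHERE c.CustomerID = o.CustomerID);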

Database cleanup

I inherited a SQL Server database that is not well designed (a consulting company came in to do the project and left without completing it).
The main issues I have with this database are:
Data types: a lot of tinyint and text columns.
Tables are not normalized: some of the keys are names instead of sequential IDs.
A lot of tables that I am not sure are being used.
A lot of stored procedures that I am not sure are being used.
Badly named tables and stored procedures.
I also inherited the asp.net application that runs against this database.
I would like to clean this database up. I understand that changing the data types will have to happen table by table. For getting rid of all the extra tables and stored procedures, what is the easiest way to do so?
Any other tips to make it cleaner and smaller are appreciated.
I also want to mention that I have Redgate tools installed (if that helps).
Thank you
Check out SQL Server Data Tools (SSDT); it lets you create a project from a live database. Among other things, you can right-click 'Find Usages' on tables, views, and functions.
As long as the previous developer used stored procedures and views rather than querying the tables directly, you should be able to find references within the project that way without breaking anything.
Also, to find stored procedures that are not used, put some basic logging at the top of each stored procedure in your application; after X days, the ones that haven't shown up in your log table are likely safe to remove. Otherwise, a tedious search through your .NET code will find them.
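A minimal sketch of that logging idea (the table and procedure names are made up):

    -- One-time setup: a table that records each procedure call.
    CREATE TABLE dbo.ProcUsageLog
    (
        ProcName SYSNAME  NOT NULL,
        CalledAt DATETIME NOT NULL DEFAULT GETDATE()
    );
    GO
    -- Then, at the top of every stored procedure you suspect may be unused:
    ALTER PROCEDURE dbo.SomeSuspectProc
    AS
    BEGIN
        INSERT INTO dbo.ProcUsageLog (ProcName)
        VALUES (OBJECT_NAME(@@PROCID));   -- logs this procedure's own name

        -- ... original body of the procedure, unchanged ...
    END;
    GO
    -- After X days, procedures never seen in the log are candidates for removal:
    SELECT p.name
    FROM   sys.procedures AS p
    WHERE  NOT EXISTS (SELECT 1 FROM dbo.ProcUsageLog AS l WHERE l.ProcName = p.name);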

Preparing to move to a single database

We have an application that has 1000+ databases and 600+ sprocs. Each database represents a different client.
Problem: we need to move this to a single database while affecting the UI as little as possible, meaning we don't want to change all the sproc signatures at one time.
The connection string currently sets the database attribute; a proposal is to move the client distinction to the user attribute instead. That user (read via SYSTEM_USER) could be used to determine the site identifier, which would then be used in the WHERE clause.
The above would not be the final solution, but it allows us to change the sproc signatures at a slow, controlled pace. Once all are done we can correct the connection string and get some connection pooling.
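A hedged sketch of what that interim pattern could look like inside one of the procs (the mapping table, columns, and procedure are placeholders, and the signature stays unchanged):

    -- Hypothetical mapping table: one row per client login.
    -- CREATE TABLE dbo.SiteLogin (LoginName SYSNAME PRIMARY KEY, SiteID INT NOT NULL);

    ALTER PROCEDURE dbo.GetOpenOrders
    AS
    BEGIN
        SET NOCOUNT ON;

        DECLARE @SiteID INT;
        SELECT @SiteID = SiteID
        FROM   dbo.SiteLogin
        WHERE  LoginName = SYSTEM_USER;    -- the login set in the connection string

        SELECT o.OrderID, o.OrderDate, o.Status
        FROM   dbo.Orders AS o
        WHERE  o.SiteID = @SiteID          -- client filter added without touching the signature
          AND  o.Status = 'Open';
    END;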
Are there any limitations on the number of logins/users that we can have on SQL Server 2005/2008? Or has anyone been down this path who could shed some light on a better option?
See my answer here
Ideas for Combining Thousand Databases into One Database
Sounds like you two are working on the same project. You will need to change every proc before you can move to one database, or each client will see the others' data.
As for the number of logins on SQL Server 2005 / 08 - I don't think anyone has ever run into a hard limit here. A few thousand will NOT be any problem at all.
What you could consider for this scenario is one schema per customer inside your single DB, e.g. customer "Miller" gets a "miller" schema with its objects inside, and customer "Brown" gets a "brown" schema.
And contrary to what HLGEM just responded: no, customers won't see each other's data if you set up proper permissions, restricting each customer (and its users) to its own schema only. That should work just fine.
Marc
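To make the schema-per-customer idea concrete, a minimal sketch (the customer, login, and user names are all hypothetical):

    -- One schema and one database user per customer.
    CREATE SCHEMA miller;
    GO
    CREATE SCHEMA brown;
    GO
    CREATE USER miller_user FOR LOGIN miller_login WITH DEFAULT_SCHEMA = miller;
    CREATE USER brown_user  FOR LOGIN brown_login  WITH DEFAULT_SCHEMA = brown;
    GO
    -- Each customer's user can only touch objects in its own schema.
    GRANT SELECT, INSERT, UPDATE, DELETE ON SCHEMA::miller TO miller_user;
    GRANT SELECT, INSERT, UPDATE, DELETE ON SCHEMA::brown  TO brown_user;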
You might also consider setting a distinctive application name in the connection string rather than using a distinctive user; you can then read it in your WHERE clause using APP_NAME(). I'm sure that SQL Server won't have a problem with thousands of logins, but you may prefer not to have to create them.
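A brief sketch of the APP_NAME() variant (the connection-string value and mapping table are assumptions):

    -- Each client's connection string sets, e.g.:  Application Name=Client_Miller
    -- Inside the query, APP_NAME() returns that value, so the filter becomes:
    SELECT o.OrderID, o.OrderDate
    FROM   dbo.Orders  AS o
    JOIN   dbo.SiteApp AS s ON s.AppName = APP_NAME()   -- hypothetical app-to-site mapping table
    WHERE  o.SiteID = s.SiteID;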
