SQL Server Report Models - sql-server

I am currently working on a project which uses SQL Server Reporting Services 2012 to create a (large) set of reports.
We would like to enable some business users to create reports from a live Oracle database; however, these are staff with no SQL or data-modelling skills and no expectation to learn them - this is repeatedly stated to be outside their remit (they are analysts recruited as critical thinkers, not technical staff).
They need to be able to create ad-hoc queries and reports against the database to answer questions as and when they arise, building queries with and/or-type clauses to reach record-level data and generally produce record sets for reading and review.
Currently the only option looks like using the legacy Report Model to pre-define the most commonly used business models on top of the live database, as I cannot prove that the Tabular Model provides the querying capability required. We do not have data that forms into a dimensional model very easily, and even then we often have questions that ask for multiple null values to be returned due to significant accepted data gaps.
Is anyone able to shed any light on how the current Microsoft BI stack would let non-technical users ask the following type of query and return a single data set in SSRS Report Builder:
Select all records
where
created between two dates and match two keywords in text field 1
or
Updated between two dates and match three keywords in text field 2 and have a status of X
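In T-SQL terms, the shape of that request is roughly the following (a minimal sketch only - the table, column, and parameter names are hypothetical stand-ins for whatever the real schema uses):

    -- Illustrative only: table, column, and parameter names are made up.
    SELECT *
    FROM dbo.Records
    WHERE (CreatedDate BETWEEN @CreatedFrom AND @CreatedTo
           AND TextField1 LIKE '%keyword1%'
           AND TextField1 LIKE '%keyword2%')
       OR (UpdatedDate BETWEEN @UpdatedFrom AND @UpdatedTo
           AND TextField2 LIKE '%keyword1%'
           AND TextField2 LIKE '%keyword2%'
           AND TextField2 LIKE '%keyword3%'
           AND StatusCode = 'X');

The users would never write this themselves; the question is which Microsoft tool lets them compose that kind of mixed and/or predicate through a model or UI.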
I know that tools such as Business Objects provide this sort of interface but I feel that I must be missing something within the MS solution as they had this so well covered with the Report Model.

Related

SQL Server tables connection

I have to connect multiple tables that are part of single or multiple databases. Approximately 10-15 tables in each query have to be connected to generate data for the analysis in SQL Server 2014.
I don't have access to the database diagram or architecture, and these reports are to be sent out weekly. I want to understand how to approach writing these kinds of queries, from basic to advanced, how to identify the relationships between tables, and which advanced techniques I could learn or use, such as CTEs, RANK with PARTITION BY, subqueries, etc.
A rough flow diagram or outline of the approach would be really helpful.
It's very unlikely that the owners of those source systems want them queried directly every time someone runs a report. Since you already have access to SQL Server, I would suggest building a data warehouse with it.
You haven't provided a whole lot of information to go on, but SSIS packages could be created to connect to the source systems and load the data into your data warehouse, and those packages can be scheduled through SQL Server Agent.
As for modeling... again it is difficult with the lack of information, but generally the star schema works well for reporting: a fact table surrounded by dimension (or attribute) tables.
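As a minimal sketch of that shape (table and column names are hypothetical):

    -- Hypothetical star schema: a central fact table keyed to its dimensions.
    CREATE TABLE dbo.DimDate     (DateKey     INT PRIMARY KEY, CalendarDate DATE);
    CREATE TABLE dbo.DimCustomer (CustomerKey INT PRIMARY KEY, CustomerName NVARCHAR(100));
    CREATE TABLE dbo.DimProduct  (ProductKey  INT PRIMARY KEY, ProductName  NVARCHAR(100));

    CREATE TABLE dbo.FactSales
    (
        DateKey     INT NOT NULL REFERENCES dbo.DimDate(DateKey),
        CustomerKey INT NOT NULL REFERENCES dbo.DimCustomer(CustomerKey),
        ProductKey  INT NOT NULL REFERENCES dbo.DimProduct(ProductKey),
        Quantity    INT NOT NULL,
        SalesAmount DECIMAL(18, 2) NOT NULL
    );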
As for figuring out relationships without a diagram, this will have to be done via experimentation and by tying back to existing reports to make sure your joins aren't dropping or multiplying records.
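If the source databases declare foreign keys, the catalog views give you a starting point before you resort to trial and error. Something like this (standard system views, run per database) lists every declared relationship:

    -- List declared foreign-key relationships: child table/column -> parent table/column.
    SELECT fk.name                                                      AS ForeignKeyName,
           OBJECT_NAME(fkc.parent_object_id)                            AS ChildTable,
           COL_NAME(fkc.parent_object_id, fkc.parent_column_id)         AS ChildColumn,
           OBJECT_NAME(fkc.referenced_object_id)                        AS ParentTable,
           COL_NAME(fkc.referenced_object_id, fkc.referenced_column_id) AS ParentColumn
    FROM sys.foreign_keys AS fk
    JOIN sys.foreign_key_columns AS fkc
        ON fk.object_id = fkc.constraint_object_id
    ORDER BY ChildTable, ForeignKeyName;

If no foreign keys are declared (common in reporting copies), matching column names and data types, plus tying back to known-good report numbers, is the usual fallback.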
Good luck.

Generating documentation for Power BI

Is there a native solution/application/script for creating documentation in Power BI? I am especially interested in documenting all relationships.
Power BI models (and the new Tabular models) have DMVs that are separate from the MDSCHEMA rowsets for SSAS Multidimensional. While some of the SSAS MD DMVs mostly work, the new TMSchema DMVs work well since they are made specifically for this type of model. The trick is that you must know the connection info: the port and database name change each time you open Power BI Desktop. But generating documentation can be done.
There are a couple of ways to go about it. You can use DAX Studio to get your connection info (a la Chris Webb). Or you can get that info dynamically from Power BI (a la The BIccountant). Using DAX Studio works as a one-time way to get documentation, or if you are ok updating the connection and database info each time you want to run it. The BIccountant way is more dynamic. I haven't tried it, but it looks promising.
To get relationships, you can get your connection info for your Power BI model and then run queries against the following DMVs:
$System.TMSCHEMA_RELATIONSHIPS
$System.TMSCHEMA_TABLES
$System.TMSCHEMA_COLUMNS
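Each DMV is queried with a plain SELECT against the Power BI Desktop instance (for example localhost:51542 - the port changes every session, which is why you need DAX Studio or a dynamic lookup). Run each statement separately; the DMV interface only supports simple SELECTs, not joins:

    SELECT * FROM $System.TMSCHEMA_RELATIONSHIPS
    SELECT * FROM $System.TMSCHEMA_TABLES
    SELECT * FROM $System.TMSCHEMA_COLUMNS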
Pull those down into Power BI (either in the same model you are documenting, or in a different model). Then either
A) Use the Edit Queries functionality to merge the queries to add the Name column from the Tables DMV and the Explicit Name column from the Columns DMV based upon FromTableID, FromColumnID, ToTableID, ToColumnID.
B) Create relationships between these columns using the modeling functionality of Power BI to achieve the same effect.
Once you've done this, cleaned up column names, and hidden or deleted unused fields, you can use Power BI to create your documentation. You can create plain tables and/or use something like a force-directed graph to show relationships. Here's a screenshot of one that I made.

Automatically or easily updating my database

I have available to me a report that is generated in Microsoft SharePoint, and it holds the quantities for certain items. The reports can be exported as Excel documents, but if possible I would like to avoid that.
In my Access database I have all the same items but with additional data concerning special requests and item identification in the item's respective documentation folders.
I am looking for a way to have the few columns that represent the quantities, and some other factors, automatically updated in my database.
How can I go about this? Is there specific terminology for what I am attempting to do? I am unable to find it on Google.
So to clarify ... you have item data exported from SharePoint and item data in Access and ideally you'd like to merge both and store the results in Access.
Or, to put it another way, you would like to complement the data in Access with the data from SharePoint.
If the database that powered the SharePoint report ran in Access as well, the word you are looking for is replication. You want to automatically replicate the data from one server/database to another.
Unfortunately I don't know of any software that replicates data to Access.
Your best bet would be to write a program that schedules the running of the SharePoint report and then imports that data into Access.
I'm happy to give you the terminology of what to Google for. Just don't make me use SharePoint and Access. :)
If you have the same items in a report in SharePoint and in Access, hopefully there is a field that uniquely identifies each item and is used in each table (a unique key). If these items (typically called 'records' or 'tuples' in database circles) are inventory items, then SKUs or product numbers would be examples of potential unique keys. If you're taking the information in two tables and merging them together using a unique key, that is a join (a 'natural join' when the matching columns share the same name). I know Access and SharePoint both support SQL, and using SQL this would be done with a SELECT statement.
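As a rough sketch of what that SELECT could look like in Access once the SharePoint list is linked or imported (the table and field names here are made up - substitute your own):

    -- Join the SharePoint quantities to the local Access items on the unique key.
    SELECT li.ItemNumber,
           li.SpecialRequests,
           li.DocumentationFolder,
           sp.Quantity
    FROM LocalItems AS li
    INNER JOIN SharePointQuantities AS sp
        ON li.ItemNumber = sp.ItemNumber;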
I would try googling: natural join tables in SharePoint and Access
Or: SQL SELECT between SharePoint and Access
Hope this helps.
If you choose tables linked to SharePoint (as opposed to importing them locally), then you will always have a live copy of the data; in fact this is a replicated model in Access 2010. A query could then be used that joins in the additional table columns with the quantity etc. Replication needs caution, since any changes to the local Access table would go back up to SharePoint, and that may not be desired or even allowed.
In this case I would thus simply import the SharePoint tables locally and again use a join, based on a PK, to the local tables holding the quantity etc. Note that the local copy + cache runs very fast in Access 2010; prior to Access 2010 + SharePoint 2010 the speed of such a setup was not as good.
If you are using an older version of Access + SharePoint, then I would suggest you continue your approach of importing the SharePoint tables (as opposed to linking to the live tables on SharePoint). You then again simply use a query that joins in the additional columns you wish to display in your reports.
Such a results query would not only be of use for reports, but you could also export that query to Excel or Word.
Best regards.

Best Practice: presenting some data (making a report) from SQL Server

I have many SQL Server databases, each with a few tables containing important (from my point of view) information. I check the data (retrieving for example maximum or minimum value) within those tables using T-SQL queries.
Since I don't want to create views for each of the databases, I'm thinking about the most convenient, easiest, and simply best way to prepare a summary which will update each time it is opened.
The output file (or web page) will be used internally within technical team. All members can log into database using Windows authentication.
My idea was:
Excel + dynamic T-SQL --> I want to connect to the database and execute T-SQL (a cursor will go through all database names; see the sketch after this list)
PowerShell (showing table using Out-GridView cmdlet)
PHP - first I will ask for all database names (executing select name from sys.databases) and then execute the query for each DB
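For option 1, a minimal sketch of the cursor approach (MyTable/MyColumn are placeholders for whatever you actually check, and this assumes the same table exists in every database):

    -- Loop over every user database and run the same summary query in each,
    -- collecting the results into one result set.
    DECLARE @Results TABLE (DatabaseName SYSNAME, MaxValue INT);
    DECLARE @DbName SYSNAME, @Sql NVARCHAR(MAX);

    DECLARE db_cursor CURSOR FOR
        SELECT name FROM sys.databases WHERE database_id > 4;  -- skip system databases

    OPEN db_cursor;
    FETCH NEXT FROM db_cursor INTO @DbName;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @Sql = N'SELECT @db AS DatabaseName, MAX(MyColumn) FROM '
                 + QUOTENAME(@DbName) + N'.dbo.MyTable;';

        INSERT INTO @Results (DatabaseName, MaxValue)
        EXEC sp_executesql @Sql, N'@db SYSNAME', @db = @DbName;

        FETCH NEXT FROM db_cursor INTO @DbName;
    END

    CLOSE db_cursor;
    DEALLOCATE db_cursor;

    SELECT * FROM @Results;

Excel can then use this as the command text behind a refreshable data connection.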
What is, in your opinion, the best way? Do you have a better (from a programmer's point of view) way of getting such a report/data?
You can use SSRS reports. You have the option of exporting the report data to several formats such as PDF, Excel, and Word, and you can create a dataset for each of your databases. Since you are interested in showing aggregations and summaries of values, SSRS reports will be pretty useful in these cases.

Dimensional Level Security / Per User Data Security in SSAS Cube?

I'm part of a team looking to move from our relational data warehouse to a SSAS cube. With our current setup we have an "EmployeeCache" table (basically a fact) which is a mapping from each of our employee ids to their viewable employee ids. This table is joined in our model to our DimEmployee table so that for every query that needs personally identifiable information the DimEmployee records are filtered. The filter is applied from a session variable that is the user id which is making the query.
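Roughly, the current filter looks like this (illustrative names only - the real fact and key columns differ):

    -- Current row-level filter in the relational warehouse (hypothetical names).
    -- EmployeeCache maps each querying user to the employee ids they may see.
    SELECT f.*
    FROM dbo.FactSomething AS f
    JOIN dbo.DimEmployee   AS e ON e.EmployeeKey = f.EmployeeKey
    JOIN dbo.EmployeeCache AS c ON c.ViewableEmployeeId = e.EmployeeId
    WHERE c.UserId = @SessionUserId;  -- session variable set per querying user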
All of the examples we researched to provide dimension level security in a SSAS cube have required the use of Windows managed security. The systems that create the data that is being analyzed handle their own security. Our ETLs map the security structure into the aforementioned EmployeeCache and DimEmployee tables. We would like to keep this simple structure of security.
As we see it, there is no way to pass session values to the cube (aside from using the query string, which does not appear to be possible with Cognos 10.1). We're also not seeing any examples out there of security that does not require Windows authentication.
Can someone explain whether there is a way to achieve the dimensional security I have described in an SSAS cube? If it is not possible, could another cube provider offer this functionality?
Two thoughts. Firstly, SSAS only supports Windows authentication (see Analysis Services Only Windows Authentication), and this is unchanged in SQL Server 2012, but you can pass credentials in the connection string to Analysis Services. Secondly, could you alter the MDX of every query and add a slicer to restrict the data to only what a user should see?
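For the first point, the property usually mentioned is EffectiveUserName, a standard Analysis Services connection string property (the server and database names below are placeholders). The caveats: the connecting account must be an SSAS administrator for it to be honored, the impersonated user must still be a Windows account, and whether Cognos 10.1 can inject a per-user value into the connection string is a separate question:

    Provider=MSOLAP;Data Source=YourSsasServer;Initial Catalog=YourCube;
    EffectiveUserName=YOURDOMAIN\current.report.user;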
