(Warning: SQL Server expert, TFS Dashboard Widget Noob)
My workplace has a TFS setup (version 15.117.27024.0) with dashboards whose widgets can use Shared Queries as a data source. No problems here.
Question: Is it possible to configure a widget so that the source query is a custom Stored Procedure in TFS_Warehouse?
Thanks in advance.
Jim
I am not aware of a way to return the results of a stored proc in the TFS_Warehouse or other DB directly from a widget.
We use Otto Streifel's Wiql Editor extension (source), which allows us to write WIQL queries instead of clicking our way through the GUI that is provided to create our shared queries. The syntax is similar to SQL, but it is more limited than a SQL stored proc would be. Even so, we have been able to write some robust queries to create some very useful widgets (for example, to break out how many hours we have planned/worked based on tags applied to our work items).
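A trimmed-down sketch of that kind of query (the field names are standard TFS work item fields; the tag value is just an example, not our actual query):

SELECT [System.Id],
       [System.Title],
       [Microsoft.VSTS.Scheduling.OriginalEstimate],
       [Microsoft.VSTS.Scheduling.CompletedWork]
FROM WorkItems
WHERE [System.TeamProject] = @project
  AND [System.WorkItemType] = 'Task'
  AND [System.Tags] CONTAINS 'Maintenance'
ORDER BY [System.ChangedDate] DESC

Save something like that as a shared query and it can back the query tile and chart widgets just like a query built in the GUI.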
The only other solution I could think of would be to use an Embedded Webpage widget that would call the stored proc and display the results.
I think it is a common issue: you have a large and complex database, and a few hundred reports built on top of it. Every time, before the DB schema is changed, someone needs to look at the reports to make sure they are not affected.
Are there any tools to check which reports (Reporting Services) depend on the parts of the database being changed?
-- update
We are using MS Reporting Services for the reports, and a SQL Server database.
Thanks in advance
You don't need to buy a special tool for this. SSMS has a nice UI for viewing dependencies, or alternatively you can query the report server catalog to check all report definitions at once.
Here's an MSDN link with some good samples:
http://msdn.microsoft.com/en-us/library/bb677168.aspx
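If you go the query route, something along these lines works against the report server catalog (this assumes the default ReportServer database name, and the table name being searched for is only an example):

-- list every report whose definition mentions a given object
SELECT Name, Path
FROM ReportServer.dbo.Catalog
WHERE Type = 2  -- 2 = report
  AND CONVERT(VARCHAR(MAX), CONVERT(VARBINARY(MAX), Content)) LIKE '%dbo.MyChangedTable%'
ORDER BY Path;

Run that for each table or column you plan to change and you get the list of reports that need a closer look.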
I have a master database where we define all the information for our software.
It contains
tables
queries
triggers
stored procedures
stored functions
meta data in the tables (content)
At the moment, with every change I manually (with some support from SQL Server Management Studio) edit files where I have all the CREATE, UPDATE, and INSERT statements for the stuff mentioned above. When I have to create a new database, I fire up all the xyz.sql files, which contain my SQL statements.
I know there is a database creation script wizard in Management Studio, but this, for example, doesn't include the content data. I also need to make sure everything is executed in the right order during creation (e.g. queries, functions, etc. last, once the underlying tables are available).
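To make the ordering concrete, the manual run boils down to something like this master include list (the file names are simplified placeholders; this form assumes running via sqlcmd or SSMS in SQLCMD mode):

-- build.sql: run with sqlcmd -i build.sql, or in SSMS with SQLCMD mode enabled
:r .\01_tables.sql
:r .\02_functions.sql
:r .\03_views_and_queries.sql
:r .\04_stored_procedures.sql
:r .\05_triggers.sql
:r .\06_content_data.sql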
At the moment I am thinking about a .NET project where I read all the schema tables and then create the files automatically. In Ruby on Rails the system creates a schema.rb, and YAML files for the data. I tried to work with this, but since many tables are not created by ActiveRecord (old C++ stuff is also running), this won't work for me.
So does anyone have a hint for how best to do this, or a tool that fits my needs?
You can do this very easily in .NET using the SMO frameworks.
There are integrated tools for scripting out in dependency order, and you can script out data as well if you desire.
See my answer here for some info and links.
SQL Compare Pro should be able to load up your DDL creation scripts and deploy them to a target in the correct order. In the Edit Project dialog make sure you load your scripts as a Scripts Folder. For the data you'll need to use SQL Data Compare Pro. If you have any trouble or have questions, let me know, as I work for Red Gate and will be able to help you with these tools.
I'm a little confused about why you've got UPDATEs given that these scripts create a database from scratch. Shouldn't they all be INSERTs?
SSMS does have the ability to create data scripts as well. You need SSMS 2008: go to Tasks > Generate Scripts, and in the Choose Script Options pane make sure Script Data is set to True.
If you're looking to maintain these scripts as a sensible way to source control your SQL Server objects, you might want to consider SQL Source Control. This will maintain your schema objects AND static data tables as individual .sql files.
"I know there is a database creation script wizard in management studio, but this for example doesn't include the content data."
You have to look carefully! Of course this built-in script engine can include the content data. You just have to click the button labeled "Properties" (or something like that), and there you can change all the SMO script options, including a full data dump.
This results in a script with many INSERT INTO ... statements.
In-depth description
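For illustration, the data portion of such a script looks roughly like this (the table and columns are made up):

SET IDENTITY_INSERT [dbo].[MetaData] ON
INSERT INTO [dbo].[MetaData] ([Id], [Key], [Value]) VALUES (1, N'SchemaVersion', N'2.4')
INSERT INTO [dbo].[MetaData] ([Id], [Key], [Value]) VALUES (2, N'DefaultLanguage', N'de-DE')
SET IDENTITY_INSERT [dbo].[MetaData] OFF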
Try DbSourceTools.
It is a SQL management tool designed specifically to script SQL databases to disk (including data), and then re-create them using "Deployment Targets".
We are using it for database source control in an agile project.
We have several SQL Server databases containing measurements from generators that we build. However, this useful data is only accessible to a few engineers, since most are unfamiliar with SQL (including me). Are there any tools that would allow an engineer to extract chosen subsets of the data in order to analyze it in Excel or another environment? The ideal tool would
protect the database from any accidental changes,
require no SQL knowledge to extract data,
be very easy to use, for example with a GUI to select fields and the chosen time range,
allow export of the data values into a file that could be read by Excel,
require no participation/input from the database manager for the extraction task to run, and
be easy for a newbie database manager to set up.
Thanks for any recommendations or suggestions.
First off, I would never let users run their own queries on a production machine. They could run table scans or some other performance killer all day.
We have a similar situation, and we generally create custom stored procedures for the users to "call", and only allow access to a backup server running "almost live" data.
Our users are familiar with Excel, so I create a stored procedure with ample parameters for filtering/customization, and they can easily call it by using something like:
EXEC YourProcedureName '01/01/2010','12/31/2010','Y',null,1234
I document exactly what the parameters do, and they generally are good to go from there.
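For illustration, the shape of such a procedure is roughly the following (the table, columns, and parameter meanings here are invented, not the actual procedure above):

CREATE PROCEDURE dbo.YourProcedureName
    @StartDate     DATETIME,
    @EndDate       DATETIME,
    @IncludeClosed CHAR(1)     = 'N',   -- 'Y' to include closed records
    @Region        VARCHAR(50) = NULL,  -- NULL = all regions
    @GeneratorId   INT         = NULL   -- NULL = all generators
AS
BEGIN
    SET NOCOUNT ON;

    SELECT m.MeasurementDate, m.GeneratorId, m.MeasurementValue
    FROM dbo.Measurements AS m
    WHERE m.MeasurementDate BETWEEN @StartDate AND @EndDate
      AND (@Region IS NULL OR m.Region = @Region)
      AND (@GeneratorId IS NULL OR m.GeneratorId = @GeneratorId)
      AND (@IncludeClosed = 'Y' OR m.IsClosed = 0);
END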
To set up an Excel query you'll need to set up the data source on the user's PC (Control Panel - Data Sources - ODBC), which will vary slightly depending on your version of Windows.
From within Excel, you need to set up the "query", which is just the EXEC command from above. Depending on the version of Excel, it should be something like: menu - Data - Import External Data - New Database Query. Then choose the data source, connect, skip the table diagram maker, and enter the above SQL. Also, don't try to make one procedure do everything; make different ones based on what they do.
Once the data is on the Excel sheet, our users pull it to other sheets and manipulate it at will.
Some users are a little advanced and "try" to write their own SQL, but that is a pain. I end up debugging and fixing their incorrect queries. Also, once you do correct the query, they always tinker with it and break it again. Using a stored procedure means that they can't change it, and I can put it with our other procedures in the source code repository.
I would recommend you build your own in Excel. Excel can make queries to your SQL Server Database through an ODBC connection. If you do it right, the end user has to do little more than click a "get data" button. Then they have access to all the GUI power of Excel to view the data.
Excel allows you to load the output of stored procedures directly into a tab. That, IMO, is the best way: users need no knowledge of SQL, they just invoke a procedure, and there are no extra moving parts besides Excel and your database.
Depending on your version of SQL Server, I would be looking at some of the excellent self-service BI tools that come with the later editions, such as Report Builder. This is like a stripped-down version of Visual Studio with all the complex bits taken out and just the simple reporting bits left in.
If you set up a shared data source that logs into the server with quite low access rights, then the users can build reports but not edit anything.
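Setting up that low-rights account is just a login plus read-only database membership, along these lines (the names are placeholders):

-- minimal sketch of a read-only account for the shared data source
CREATE LOGIN ReportReader WITH PASSWORD = 'choose-a-strong-password';
GO
USE YourReportingDatabase;
GO
CREATE USER ReportReader FOR LOGIN ReportReader;
-- read-only: member of db_datareader, no write roles
EXEC sp_addrolemember 'db_datareader', 'ReportReader';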
I would echo the comments by KM that letting the great unwashed run queries on a production system can lead to some interesting results, with either the wrong query being used, or massive table scans, or Cartesian joins, etc.
Our centralized IT department has suggested two primary ad hoc query tools for our general user base of approximately 200 staff members:
Microsoft SQL Server Management Studio 2008 (SSMS)
Microsoft Access 2003
Environment
The backend database is a read-only Microsoft SQL Server 2005 database.
The schema is 400+ tables; allowing access to the raw data for our general staff would be a disaster.
We will be building an "abstraction layer" over the raw data for our general staff to run ad hoc queries against.
The abstraction layer will most likely contain a number of views.
A number of users have basic knowledge in Microsoft Access; none have used SSMS.
Which of the above tools (or alternative) would be best for a decidedly non-techie user base of approximately 200 people? What are the pros and cons of each?
Also, the IT department has suggested teaching people T-SQL so they may use SSMS. Is this reasonable?
How about this one? i-net Clear Reports (used to be called i-net Crystal-Clear) has a powerful ad hoc reporting component that is made to be easy to use for non-technical users. Your users won't have to know anything about reporting at all. They simply select the kind of report and the data, and voilà, there is a report that suits their needs.
The data abstraction can be done easily by creating so-called data views, which can be designed by, e.g., your administrators. There are various ways to access the ad hoc reporting GUI. We have a web GUI, a Java applet, and a standalone Java program.
The end users will not need any training since the GUI is highly intuitive.
The views can easily be built by drag and drop, in addition to setting data types, formats, and so on.
All reports (depending on security settings) can be accessed via DAV or a report repository GUI.
The server supports different security settings on a per user or per group basis.
The standalone report designer is free and fully functional.
Disclosure: Yep. I work for the company who built this.
Your "abstraction layer" is the right approach to take with Access. Create an MDB with the basic views required linked into it and distribute to the users. Allow them to create new queries and reports in their own MDB as required.
Now how you are going to stop them from running a Cartesian join on tables with a million records or more I'm not quite sure.
Microsoft has a free tool for business and end users called "Report Builder". It supports the full capabilities of SQL Server Reporting Services. The good thing is that it provides a Microsoft Office-like user interface.
You can download the latest version, "Report Builder 3.0", from here:
http://www.microsoft.com/download/en/details.aspx?DisplayLang=en&id=6116
And for more information about MS Report Builder check this link
http://technet.microsoft.com/en-us/library/dd207008.aspx
Attempting to teach "non-techie" people T-SQL to query a schema with 400+ tables probably isn't going to go well, unless they are limited to querying the views only, and the views hide all the ugly complexities of the various joins, grouping, etc.
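If the views route is taken, each view can flatten the joins and grouping so the end user only ever writes a simple SELECT against one object. A minimal sketch (schema, object, and role names are invented for illustration):

-- hides the join/grouping complexity behind one flat, read-only object
CREATE VIEW reporting.CustomerOrderSummary
AS
SELECT c.CustomerName,
       o.OrderDate,
       SUM(ol.Quantity * ol.UnitPrice) AS OrderTotal
FROM dbo.Customer    AS c
JOIN dbo.OrderHeader AS o  ON o.CustomerId = c.CustomerId
JOIN dbo.OrderLine   AS ol ON ol.OrderId   = o.OrderId
GROUP BY c.CustomerName, o.OrderDate;
GO
-- end users get SELECT on the reporting schema only, nothing else
GRANT SELECT ON SCHEMA::reporting TO ReportUsers;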
Our company was in a similar situation where Access was used early on, and then we switched everyone over to use T-SQL and SSMS. IMO, this is the approach you'd want to take.
Again though, the success of this will depend on the quality of your views, or better yet, reports you provide your end-users.
Randy
I would look more into something like Stonefieldquery.com, which is designed for non-developers to build reports. Not that the report writer or query builder in Access is bad, but it may be too much. I think they also provide a way to centralize reports and queries so they can be shared. Multiple people are not going to be able to open a single Access file and create a report (I think query building is OK).
Most will use the drag-and-drop capability, but about 5-10% will run into a need for SQL, and then you can take advantage of the "teachable moment" and get them some training.
Cons for Access certainly would be cost; SSMS should be free assuming you're properly licensed for the SQL server.
Depending on the actual needs, some users might actually be better off with Crystal Reports (never thought I'd say that), or Reporting Services.
You could create a series of SQL Server Analysis Services cubes and have the users connect to those using Excel, so that they can use Excel's pivot tables.
Being a newbie at ad hoc reporting and doing the work myself, I used Izenda.com ad hoc reporting. It was very straightforward, and I could do it myself rather than outsourcing.
Check SQLS*Plus - http://www.sqlsplus.com
I found SQLS*Plus to be a very effective command-line SQL Server reporting tool. It is free (for personal use) and allows me to generate reports with titles and headers in HTML and CSV formats, format columns with custom masks, set report length, page size, etc. As I understand it, it is very similar to the well-known Oracle SQL*Plus reporting tool.
We are trying to create reports programmatically in ASP.NET using Microsoft Reporting Services, but we are not sure if it is possible. We have several queries for our reports. Instead of creating a separate .rdlc report for each of those queries, we are looking for a way to feed the query / stored procedure to the Reporting Services engine so that it will create a report and display it on the web. Is that possible?
Thanks
Not the way you are describing it.
Reporting Services needs a report definition to generate the report (the .rdlc). It can't just guess. You can create the definition programmatically; after all, the .rdlc is just XML conforming to the Report Definition Language schema.
You could loop through the result set, creating new report table columns for each column in the result set, or something like that. I've never tried this, but I think it will be damn near impossible to get reliable formatting if the columns, sizes, etc. are not known ahead of time. I don't know how many different queries you are talking about, but the effort to do something like this may not be worth it.
Are these queries radically different? Do they return the same basic type of data?