I have many SQL Server databases, each with a few tables containing important (from my point of view) information. I check the data (retrieving for example maximum or minimum value) within those tables using T-SQL queries.
Since I don't want to create views in each of the databases, I'm wondering about the most convenient and simplest way to prepare a summary that refreshes each time it is opened.
The output file (or web page) will be used internally within technical team. All members can log into database using Windows authentication.
My idea was:
Excel + dynamic T-SQL --> I want to connect to the database and execute T-SQL (a cursor will go through all database names)
PowerShell (showing the table using the Out-GridView cmdlet)
PHP - first I will ask for all database names (executing `SELECT name FROM sys.databases`) and then execute a query for each DB
What is, in your opinion, the best way? Do you have any better (from a programmer's point of view) way of getting such a report/data?
You can use SSRS reports. You have the option of exporting the report data to several formats such as PDF, Excel, and Word. You can create a dataset for all your databases. Since you are interested in showing aggregations and summations of values, SSRS reports will be pretty useful in these cases.
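For reference, the cursor idea from the question could be sketched roughly like this (the `ImportantData` table and `ImportantValue` column are placeholders for your own objects):

```sql
-- Sketch: iterate over the user databases and collect min/max values
-- from a table that exists in each of them.
DECLARE @db sysname, @sql nvarchar(max);
DECLARE @summary TABLE (DbName sysname, MinValue int, MaxValue int);

DECLARE db_cursor CURSOR FOR
    SELECT name FROM sys.databases WHERE database_id > 4;  -- skip system DBs

OPEN db_cursor;
FETCH NEXT FROM db_cursor INTO @db;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'SELECT ''' + @db + N''', MIN(ImportantValue), MAX(ImportantValue)
                FROM ' + QUOTENAME(@db) + N'.dbo.ImportantData';
    INSERT INTO @summary EXEC (@sql);
    FETCH NEXT FROM db_cursor INTO @db;
END
CLOSE db_cursor;
DEALLOCATE db_cursor;

SELECT * FROM @summary;
```

An SSRS report could use a query like this as its dataset, giving you the always-fresh summary without creating views in each database.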
I would like to know how I can get data from SQL Server tables into my ASP.NET MVC application without knowing the data structure in advance.
Example:
a user uploads a .csv file into the application with an unknown data structure (it can be 3 fields or 50 fields, with varying data types)
the .csv file gets stored into a SQL Server table
now I want the application to be able to display the data from these uploads in e.g. an HTML table without having to use a hardcoded model
Is it possible to display the data using a connection string and e.g. LINQ to SQL or EF? Best case would be where I can dynamically assign table names etc. into queries.
The models will still be used to access data belonging to the application logic, it's just the displaying of data from user-uploads that is not clear to me at this time.
EF and LINQ to SQL will always require you to have an existing model associated with a database table.
If you really need that "dynamic" query, you can use a micro-ORM like Dapper.NET to return your query results as a dynamic type.
This solution won't save you from having to generate the SELECT SQL query by retrieving the list of fields from the table. Maybe you can use the sys tables from SQL Server (if that is the database) for that. Like in here
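As a sketch of that last point, the column list can be read from `sys.columns` and concatenated into a SELECT statement; the `dbo.UserUpload` table name here is a placeholder:

```sql
-- Build a SELECT statement for a table whose structure is unknown at compile time.
DECLARE @cols nvarchar(max), @sql nvarchar(max);

-- Concatenate the quoted column names in ordinal order.
SELECT @cols = STUFF((
    SELECT ', ' + QUOTENAME(c.name)
    FROM sys.columns c
    WHERE c.object_id = OBJECT_ID('dbo.UserUpload')
    ORDER BY c.column_id
    FOR XML PATH('')), 1, 2, '');

SET @sql = N'SELECT ' + @cols + N' FROM dbo.UserUpload';
EXEC sp_executesql @sql;
```

The result set can then be consumed in the application through a dynamic type (e.g. Dapper's `Query<dynamic>`) and rendered as an HTML table.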
Is there any handy tool that can make updating tables easier? Usually I get an Excel file with the original value in one column and the new value in another column. Then I write a formula in Excel to create the UPDATE statements. Is there any way to simplify this task?
I believe the approach in SQL server 2000 and 2005 would be different, so could we discuss them both? Thanks.
In addition, these updates are usually requested by "non-programmers" (who don't understand SQL, so it may not be feasible to let them write queries). Is there any tool that would let them update the table directly without having DBAs do this task? That tool would also need to limit their privileges to modifying only certain tables, and ideally provide a way to roll back changes.
Create a DTS package that imports the csv file, makes the updates, and then archives the file. The user can drop the file in a specific folder designated for the task, or this can be done by an ops person. Schedule the DTS package to run every hour, day, etc.
In case your users would insist that they keep using Excel, you've got several different possibilities of getting the data transferred to SQL Server. My preferred one would be to use DTS/SSIS, as mentioned by buckbova.
However, another method is by using OPENROWSET(), which makes it possible to query your Excel file as if it was a table. I wrote a small article about it here: http://blog.hoegaerden.be/2010/03/29/retrieving-data-from-excel/
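A minimal OPENROWSET() sketch looks like this; the provider name, file path, and sheet name are assumptions you'd adjust to your environment (e.g. the ACE OLE DB provider for .xlsx files), and the server must have ad hoc distributed queries enabled:

```sql
-- Query an Excel worksheet as if it were a table.
-- 'C:\Uploads\Updates.xlsx' and 'Sheet1' are placeholders.
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\Uploads\Updates.xlsx;HDR=YES',
                'SELECT * FROM [Sheet1$]');
```

From there you can join the spreadsheet rows against the target table and run the UPDATE in one statement instead of generating one statement per row in Excel.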
Another approach that hasn't been mentioned yet (I'm not a big fan of letting regular users edit data directly in the DB): any possibility of creating a small custom application for them?
There you go, a couple more possible solutions :-)
Valentino.
I think the best approach is to expose a view on your data accessible to users who are allowed to do updates, and set up triggers on the view to perform the actual updates on the underlying data. Restrict change to only the columns they should be changing.
This technique can work on SQL Server 2000 and 2005.
I would add audit triggers on the underlying tables so you can always track changes.
You'll have complete control, and they can connect to it with Access or whatever and perform their maintenance.
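A minimal sketch of the view-plus-trigger idea, with placeholder table and column names:

```sql
-- Expose only the columns the users are allowed to change.
CREATE VIEW dbo.EditableCustomer
AS
SELECT Id, PhoneNumber, Notes
FROM dbo.Customer;
GO

-- Route updates back to the base table via an INSTEAD OF trigger,
-- so you control exactly what gets written.
CREATE TRIGGER dbo.trg_EditableCustomer_Update
ON dbo.EditableCustomer
INSTEAD OF UPDATE
AS
BEGIN
    UPDATE c
    SET c.PhoneNumber = i.PhoneNumber,
        c.Notes       = i.Notes
    FROM dbo.Customer c
    JOIN inserted i ON i.Id = c.Id;
END
GO
```

Audit triggers on `dbo.Customer` itself would then record every change regardless of which path it came through.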
You could create some accounts in SQL Server for these users and limit their access to only certain tables and columns, along with only SELECT/UPDATE/INSERT privileges. Then you could create an Access database with linked tables to these.
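Limiting the privileges could look roughly like this; the login name, table, and columns are placeholders:

```sql
-- A login/user with only the permissions needed for this maintenance task.
CREATE LOGIN DataEditor WITH PASSWORD = 'use-a-strong-password-here';
CREATE USER DataEditor FOR LOGIN DataEditor;

-- Table-level read and insert:
GRANT SELECT, INSERT ON dbo.Customer TO DataEditor;
-- Column-level update, so they can only modify certain columns:
GRANT UPDATE (PhoneNumber, Notes) ON dbo.Customer TO DataEditor;
```

Access linked tables connecting with this login would then inherit exactly these restrictions.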
We have been trying out SQL server reporting services. We are using SQL 2008 and Visual Studio 2008.
I have a couple of linked reports, along the lines of:
Report1: Summary of tickets that are still open past the date that they should have closed. Click through on a line to:
Report2: Details of a ticket and its dependent history data.
We have about ten databases, one for each client, with names like “TicketsDatabase_ClientA” “TicketsDatabase_BCustomer”.
I have deployed these reports to a SSRS server for internal review and testing, using “TicketsDatabase_ClientA” as the DB on the data source.
Table structures on all the databases are the same. There are other databases as well, including one that can provide us with a list of client databases.
But I’d like to roll them out for all clients’ data. What is the simplest way to deploy these reports so that we can look at all customers’ data? Ideally the report would start with a drop-down to select customer by name and then proceed on the appropriate database.
If this is not possible, I’d settle for a page that lists customers. It looked like we could upload multiple copies of Report1, and just put a different connection string on each one. But Report1 connects to the linked report Report2 by name, so I’d need multiple copies of Report2, each with a different name, and multiple edited copies of Report1, each edited to link to a different version of Report2.
This is looking like a really unpalatable, lengthy manual process that will need to be repeated whenever there’s a new report or a new customer.
Is there a way to choose the connection to use for a set of reports? What is best practice with this kind of case?
Update:
We have ended up with dynamic SQL - the reports have a parameter bound to a dataset to select the database, and the main dataset uses a SQL EXECUTE statement, e.g.
declare @dbName varchar(64)
SET @dbName =
(SELECT 'TicketsDatabase_' + db.[Name]
FROM MainDb.dbo.Clients db (nolock)
WHERE db.Id = @clientId)
EXECUTE ('USE ' + @dbName + '
SELECT Datatable.*
FROM ...
WHERE ...')
Disclaimer: I have really only used SQL Server Reporting Services 2005
I think you have three options.
1) Use dynamic connection strings ala sqlservercentral.com
2) Get a stored procedure to select the data based on input for you.
3) Get SSRS to get the data from a web service.
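Option 2 could be sketched as a procedure that takes the client as a parameter and builds the cross-database query dynamically; it reuses the `MainDb.dbo.Clients` lookup from the question's update, while the `Tickets` table and `ClosedDate` column are assumptions:

```sql
CREATE PROCEDURE dbo.GetOpenTickets
    @clientId int
AS
BEGIN
    DECLARE @dbName sysname, @sql nvarchar(max);

    -- Resolve the client's database name from the central table.
    SELECT @dbName = 'TicketsDatabase_' + [Name]
    FROM MainDb.dbo.Clients
    WHERE Id = @clientId;

    -- Query the client database using three-part naming.
    SET @sql = N'SELECT t.* FROM ' + QUOTENAME(@dbName) + N'.dbo.Tickets t
                WHERE t.ClosedDate IS NULL';
    EXEC sp_executesql @sql;
END
```

The SSRS datasets then just call the procedure, so the database-selection logic lives in one place instead of being duplicated across report definitions.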
I'd consider using a ReportViewer control in local mode
This allows the database connection to be managed by ASP.NET rather than Reporting Services.
Using a pure SSRS solution here is awkward as you mentioned.
I don't know if this fits in with your request for "best practice" or whether it could be better described as "nasty hack" :) However, here's what we do in this situation:
We have a central database. In it is a table called Databases with a list of client and database names.
For example:
ClientName    DatabaseName
Client A      TicketsDatabase_ClientA
B Customer    TicketsDatabase_BCustomer
We add a dataset to the report called Databases which has the following Sql statement:
SELECT ClientName, DatabaseName FROM Databases
We add a string parameter to the report called Database which uses this dataset as its "From Query" setting, with the Value field being DatabaseName and the Label field being ClientName. We also add a small 6pt label in the report header with the expression =Parameters!Database.Label so we know which database the report is using.
So now we have a way to choose the database and as we create more databases we can add them to our centralised table and all reports that allow choosing databases will automatically have the new database as an option.
Now we simply have to update the Sql statement for our main dataset that the report is based on to take information from the correct database, like so (remember, the Sql statement is just a string expression so, like everything else in Reporting Services, you can build it dynamically):
="SELECT Field1, Field2, Field3 "
&"FROM " & Parameters!Database.Value & ".dbo.MyTable "
SQL Server doesn't mind jumping out of the current database to look in another database as long as the credentials are acceptable, so doing this lets you dynamically select data from whatever database you want, regardless of where the actual data source is connected.
Of course, this assumes that your table structures are the same for the data you are reporting on.
You could always consume an SSIS package from SSRS.
This could perhaps have a looping container which could dynamically pick up all data for "each database name held in a table"
That way you would only need to insert a new row into the table and the SSIS package would automatically pick up the data.
We have a Visual C++ 6 app that stores data in an Access database using DAO. The database classes have been made using the ClassWizard, basing them on CDaoRecordset.
We need to move from Access to SQL Server because some clients have huge (1.5Gb+) databases that are really slow to run reports on (using Crystal Reports and a different app).
We're not too worried about performance on this VC++ app - it is downloading data from data recorders and putting it in the database.
I used the "Microsoft SQL Server Migration Assistant 2008 for Access" to migrate my database from Access into SQL Server - it then linked the tables in the original Access database. If I open the Access database then I can browse the data in the SQL Server database.
I've then tried to use that database with my app and keep running into problems.
I've changed all my recordsets to be dbOpenDynaset instead of dbOpenTable. I also changed the myrecordsetptr->open() calls to be myrecordsetptr->open(dbOpenDynaset, NULL, dbSeeChanges) so that I don't get an exception.
But... I'm now stuck getting exception 3251 - 'Operation is not supported for this type of object' - for one of my tables when I try to set the current index using myrecordsetptr->SetCurrentIndex(_T("PrimaryKey"));
Are there any tricks to getting the linked tables to work without rewriting all the database access code?
[UPDATE 17/7/09 - thanks for the tips - I'll change all the Seek() references to FindFirst() / FindNext() and update this based on how I go]
Yes, but I don't think you can set/change the index of a linked table in the recordset, so you'll have to change the code accordingly.
For instance: If your code is expecting to set an index & call seek, you'll basically have to rewrite it use the Find method instead.
Why are you using SetCurrentIndex when you have moved your table from Access to SQL Server?
I mean - you are now using Access only for the linked tables.
Also, as per this page, SetCurrentIndex can only be used with table-type recordsets - and a linked table opens as a dynaset, not a table-type recordset.
In what context are you using the command SetCurrentIndex? If it's a subroutine that uses SEEK you can't use it with linked tables.
Also, it's Jet-only and isn't going to be of any value with a different back end.
I advise against the use of SEEK (even in Access with Jet tables) except for the most unusual situations where you need to jump around a single table thousands of times in a loop. In all other DAO circumstances, you should either be retrieving a limited number of records by using a restrictive WHERE clause (if you're using SEEK to get to a single record), or you should be using .FindFirst/FindNext. Yes, the latter two are proportionally much slower than SEEK, but they are much more portable, and also the absolute performance difference is only going to be relevant if you're doing thousands of them.
Also, if your SEEK is on an ordered field, you can optimize your navigation by checking whether the sought value is greater or lesser than the value of the current record, and choosing .FindPrevious or .FindNext, accordingly (because the DAO recordset Find operations work sequentially through the index).
I am trying to reconcile data from a website and a database programmatically. Right now my process is manual. I download data from the website, download data from my database, and reconcile using an Excel vlookup. Within Excel, I am only reconciling 1 date for many items.
I'd like to programmatically reconcile the data for multiple dates and multiple items. The problem is that I have to download the data from the website manually. I have heard of people doing "outer joins" and "table joins" but I do not know where to begin. Is this something that I code in VBA or notepad?
Generally I do this by bulk inserting the website data into a staging table and then writing SELECT statements to join that table to my data in the database. You may need to do some clean-up first to be able to match the records if they are stored differently.
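As a sketch, the reconciliation query might look like this; the staging table, item/date keys, and `Amount` column are placeholders for your own schema:

```sql
-- Full outer join the staged website data against the database data,
-- keeping rows that are missing on either side or whose values differ.
SELECT COALESCE(s.ItemId, d.ItemId)       AS ItemId,
       COALESCE(s.TradeDate, d.TradeDate) AS TradeDate,
       s.Amount AS WebsiteAmount,
       d.Amount AS DatabaseAmount
FROM Staging_WebsiteData s
FULL OUTER JOIN dbo.MyData d
       ON d.ItemId = s.ItemId
      AND d.TradeDate = s.TradeDate
WHERE s.ItemId IS NULL       -- in the database but not on the website
   OR d.ItemId IS NULL       -- on the website but not in the database
   OR s.Amount <> d.Amount;  -- present in both but with different values
```

This covers multiple dates and multiple items in one pass, replacing the per-date VLOOKUP.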
Python is a scripting language. http://www.python.org
There are tools to allow you to read Excel spreadsheets. For example:
http://michaelangela.wordpress.com/2008/07/06/python-excel-file-reader/
You can also use Python to talk to your database server.
http://pymssql.sourceforge.net/
http://www.oracle.com/technology/pub/articles/devlin-python-oracle.html
http://sourceforge.net/projects/pydb2/
Probably the easiest way to automate this is to save the Excel files you get to disk, and use Python to read them, comparing that data with what is in your database.
This will not be a trivial project, but it is very flexible and straight forward. Trying to do it all in SQL will be, IMHO, a recipe for frustration, especially if you are new to SQL.
Alternatively:
You could also do this by using VBA to read in your Excel files and generate SQL INSERT statements that are compatible with your DB schema. Then use SQL to compare them.