Alternative to SQL Server Resource Governor

I have a Report Server running SSRS 2008 R2 Standard. This server retrieves data from another SQL Server 2008 R2 instance (again Standard). I want to manage query priority on these Standard edition servers, in a manner similar to Resource Governor.
Resource Governor is an Enterprise-only feature and I cannot afford to upgrade to that edition at the moment. Can you suggest an approach or tool to achieve this, even just to some basic degree: per-account limits, CPU usage, query time-outs?

It's possible to run a job that scans currently running tasks and tracks which ones are blocking or hogging resources. You can then kill individual SPIDs based on your own criteria; what those criteria are depends on your business needs.
I hate to give the advice of "Google this", but there are several tools that can do this, each with varying levels of customization, or you can roll your own. The code for rolling your own can get a little complex. Anyway, if you Google "sql block monitor" or "sql spid monitor" you should find several possible solutions.
These solutions typically do not take CPU and/or memory into account the way Resource Governor does. You can, however, look at the actual code being run (hopefully stored procedure calls) and base your decisions on that. From the description of your problem, that seems to be what you're looking to do.
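To make that concrete, here is a minimal sketch of what such a watchdog job step could look like, assuming SQL Server 2008 R2; the thresholds and the exempt login are placeholders to tune for your own environment:

    -- Hypothetical watchdog: kill user sessions that exceed placeholder thresholds.
    DECLARE @spid INT, @sql NVARCHAR(30);

    DECLARE offenders CURSOR LOCAL FAST_FORWARD FOR
        SELECT r.session_id
        FROM sys.dm_exec_requests AS r
        JOIN sys.dm_exec_sessions AS s ON s.session_id = r.session_id
        WHERE s.is_user_process = 1
          AND r.session_id <> @@SPID               -- never kill this job's own session
          AND s.login_name <> N'sa'                -- logins you never want to kill
          AND (r.cpu_time > 60000                  -- more than 60 s of CPU, or
               OR r.total_elapsed_time > 300000);  -- more than 5 min elapsed

    OPEN offenders;
    FETCH NEXT FROM offenders INTO @spid;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @sql = N'KILL ' + CAST(@spid AS NVARCHAR(10));
        EXEC (@sql);  -- KILL does not accept a variable, so build the statement dynamically
        FETCH NEXT FROM offenders INTO @spid;
    END
    CLOSE offenders;
    DEALLOCATE offenders;

Schedule something like this as a SQL Agent job running every minute or so, and log what you kill so you can review the criteria later.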
EDIT: Would this be better as a wiki, since it doesn't have a definitive answer? It could then serve as a collection of the various solutions which exist. Perhaps someone more involved in the SO site can comment on that.

Related

Migrating an MS Access 2013 DB to a better work-from-home solution

I have a number of MS Access databases that range from the simple to the fairly complex. They exist as split databases on a shared drive on an on-site LAN, and the users all have .accdr front ends to work with.
Until Covid-19 this worked quite well; now we all have to work from home. While I expected some performance issues, I did not expect performance to take quite that much of a hit. So I am looking for ways to migrate to something that will work well with everyone trying to work via a VPN.
An additional fly in the ointment is that there is no budget to work with and getting IT support is akin to summoning the great old ones (it's difficult and you are likely to die insane).
So I have begun to research some different options. MS SQL Server has come up, but I don't know very much about how to implement it. I do not have a dedicated machine to put this on.
I have looked at SharePoint, but some of the material I have been reading makes it seem like this is not a great option, as some of my queries are complicated and I have some pretty large tables (45k records, 100 fields per record). In the most complex DB, I have to add several thousand records each day and run several update queries on the freshly added records.
MS Azure looks promising, but again I don't know if that will put me at odds with the malevolent IT gods.
I started looking at Office 365 PowerApps, but I don't need any mobile device support and it doesn't look like it has the oomph I need.
Google and DuckDuckGo haven't thrown up anything useful that I could find among the dross. I'm certain what I need is out there, I just can't find it. I have found that OneDrive is right out, and likewise SharePoint for anything other than the simplest DB I have built.
What I am looking for is any solutions, articles, books or even papyrus scrolls and stone tablets that might get me pointed in the right direction. Any ideas? Any other information you need?
Edit: After some looking, I have found that I may be able to get MS SQL Server on a virtual server without angering the IT demons. Azure as a solution is out unless I find a suitable sacrifice. Any good places to look for information on how to use SQL Server from a standing start?
Consider migrating the backend databases to SQL Server. There's a SQL Server Migration Assistant that will do this for you. Your frontend will contain links to the resulting SQL Server tables.
The last time I did this I got an immediate 2X performance improvement on a LAN. Over a VPN you should expect similar, possibly better, improvements. Quite a good win for something so simple to do, without having to do a full rewrite. Don't expect miracles, however; Access is by nature a very thick client.
You don't necessarily need a full-blown SQL Server; SQL Server Express should suffice, and you can run that on any machine on your LAN. SQL Server Express Edition is a free download from Microsoft.
You can read up on the migration process in the SQL Server Migration Assistant documentation.
You should consider using Remote Desktop Connection first!
As you have already managed to connect employees to the LAN by VPN, you just need to enable remote access to their machines at work. That is the simplest solution, and it doesn't need a fast connection or any changes to the application. You can enable Wake-on-LAN so they can turn on the machines themselves.
Of course, you should also consider migrating to an RDBMS, as Robert advised!

Third Party Tools for Monitoring SQL Server Performance

I'm in a situation where I came into a new job and I have to support several legacy systems. The original developer is no longer around. These legacy systems are really hammering away at our SQL Server and killing performance. I know that there are a lot of things that can be done in the code, but rewriting code is really my last resort.
What I'm looking for is some sort of tool that will monitor the queries coming into the server and give recommendations on indexing solutions. I know I can use SQL Server Profiler, but I'm looking for something a little more user-friendly and something that can help me make the indexing decisions.
I know I didn't explain it very well, but I'm sure this is a common request. I'd like to make informed decisions on what to index and avoid "shooting from the hip" and indexing everything in sight. Thanks for any recommendations!
You don't need a third party tool for this.
Assuming SQL Server 2005+: as long as you can use SQL Profiler (actually SQL Trace; don't use the Profiler GUI for this, to keep tracing overhead as low as possible) to collect a representative workload, you can use the Database Tuning Advisor to automate analysis of the workload and make indexing recommendations.
You can also use the Missing Index DMVs for a quick overview of areas to investigate but the DTA will do more holistic analysis and take into account possible adverse effects of indexes on data modification statements.
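For a quick taste of the Missing Index DMVs, here is a commonly used sketch; the "improvement measure" is a rough weighting heuristic, not an official metric, and any candidate it surfaces still deserves the more holistic DTA analysis described above:

    -- Candidate missing indexes, roughly weighted by estimated benefit.
    SELECT TOP (25)
        d.statement AS table_name,
        d.equality_columns,
        d.inequality_columns,
        d.included_columns,
        s.user_seeks,
        s.avg_user_impact,
        s.avg_total_user_cost * s.avg_user_impact * (s.user_seeks + s.user_scans)
            AS improvement_measure
    FROM sys.dm_db_missing_index_details AS d
    JOIN sys.dm_db_missing_index_groups AS g
      ON g.index_handle = d.index_handle
    JOIN sys.dm_db_missing_index_group_stats AS s
      ON s.group_handle = g.index_group_handle
    ORDER BY improvement_measure DESC;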
+1 for Martin's answer, but since you asked about 3rd party tools, I'll mention one of my favorites (and no, I don't work for the company). Ignite for SQL Server does an excellent job of analyzing server activity in terms of wait time analysis. It won't make recommendations for you, but it will quickly identify the worst performing queries where you need to focus your effort.
SQL Server 2005+ has a lot of DMVs (dynamic management views) that you can query to get server info, as well as the Profiler / SQL Trace tooling.
We administer several large database servers.
Idera is a good tool to manage multiple database servers easily.
I think you'd make a much better DBA if you learn more about the built-in functionality of SQL Server.
Have a browse of
http://msdn.microsoft.com/en-us/library/ms188754.aspx
to find out more about DMVs and functions.
Another common issue with performance could be your indexes.
There's a great tutorial that combines the DMVs with improving indexes here:
http://searchsqlserver.techtarget.com/tip/Using-dynamic-management-views-to-improve-SQL-Server-index-effectiveness
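In the same spirit, here is a sketch of the index-usage DMV (assuming SQL Server 2005+), which surfaces indexes that cost writes but return little read benefit; these are often the first candidates for review:

    -- Nonclustered indexes in the current database, busiest writers first.
    SELECT OBJECT_NAME(s.object_id) AS table_name,
           i.name AS index_name,
           s.user_seeks + s.user_scans + s.user_lookups AS reads,
           s.user_updates AS writes
    FROM sys.dm_db_index_usage_stats AS s
    JOIN sys.indexes AS i
      ON i.object_id = s.object_id AND i.index_id = s.index_id
    WHERE s.database_id = DB_ID()
      AND i.index_id > 1              -- skip heaps and clustered indexes
    ORDER BY writes DESC, reads ASC;

Note that these counters reset when the instance restarts, so judge them against a reasonable uptime.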
Idera is really worth checking out, though, as a good starting point. Combined with the DMVs and SQL Trace, there shouldn't be much you won't be able to fix.
Idera just takes most of the legwork out of doing things.
http://www.idera.com/Content/Home.aspx
Idera: SQL Diagnostic Manager

Database on file server (Windows)

I am working for a company and I need to create a program really fast. My program will run with 100 users, and they will make approximately 100 transactions each per day. As I am under time pressure and various other constraints, it is not possible to set up a proper database running on a server. I am therefore looking for alternatives that have some sort of transaction support without running on a server.
I believe this could be solved using Microsoft Access, which is an alright solution, though I believe I will run into locking problems. Isn't it so that a whole table is locked as soon as one user attempts to read from it? Anyway, my question is what other alternatives there are.
The real answer is likely to vary significantly depending on what quantity of data is being talked about here.
I'd take a look at SQLite. It supports transactions, triggers, etc., and is supported by things like NHibernate, which may make your database mapping life much easier.
Check out SQLite.
Is SQLite a proper solution? I'm not sure how remote storage is supported, though; that's not a common feature.
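Since transaction support is the stated requirement, here is a minimal sketch of SQLite's transaction syntax; the table names are hypothetical. BEGIN IMMEDIATE takes the write lock up front, which matters when several users share one database file:

    -- Hypothetical schema: a shared ledger updated by many users.
    BEGIN IMMEDIATE;              -- acquire the write lock now, not at the first write
    INSERT INTO ledger (user_id, amount) VALUES (42, 19.95);
    UPDATE accounts SET balance = balance - 19.95 WHERE user_id = 42;
    COMMIT;                       -- or ROLLBACK; to undo both statements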
You could look into SQL CE, it's a very good local database from Microsoft.
There are many options. As others have stated, you can be set up and running with SQLite, SQL Server Express, or any of a number of other small, light, and free databases.
Assuming you need this today, I would go with the one you know most about. Further, I would stay away from anything resembling Access. If you don't already have experience in using it for multi user access, you are going to burn too much time figuring out the problems.
That said, I'd lean towards SQL Server Express first. It's free and can scale up to full SQL Server with no code changes.
"I believe this could be solved using Microsoft Access, which is an alright solution, though I believe I will run into locking problems."
I'd say locking and queuing would be the least of your worries. With 100 concurrent users, Access will probably corrupt itself in minutes. With 10k+ records/day, it will likely bog down your entire network in a month or so.
"As I am under time pressure and various other constraints, it is not possible to set up a proper database running on a server."
You can bring a database server up in an hour, which is much less time than you'll spend hacking away at Access. There are open-source virtual machine images, MSSQL Express, hosted solutions, etc. Time and cost should be non-issues.
About the only thing I can think of that would have you using Access is the Forms support (which can be hooked to MSSQL Server) or DBA maintenance. Frankly, though, at 100 users Access will take so much babysitting that you can afford a hosted SQL instance and still come out ahead.
I think that Firebird can be a very good alternative.
Firebird is available as an embedded engine and can also run in client/server mode. It has many features.

Can SQL Server 2008 handle 300 transactions a second?

In my current project the DB is SQL 2005 and the load is around 35 transactions/second. The client is expecting more business and is planning for 300 transactions/second. Currently, even with good infrastructure, the DB is having performance issues. A typical transaction will have at least one update/insert and a couple of selects.
Have you worked on any systems that handled more than 300 txn/s running on SQL 2005 or 2008? If so, what kind of infrastructure did you use, and how complex were the transactions? Please share your experience. Someone has already suggested using Teradata, and I want to know if this is really needed or not. Not my job exactly, but I'm curious about how much SQL Server can handle.
It's impossible to tell without performance testing; it depends too much on your environment (the data in your tables, your hardware, the queries being run).
According to tpc.org it's possible for SQL Server 2005 to get 1,379 transactions per second. Here is a link to a system that's done it. (There are SQL Server based systems on that site that have far more transactions; the one I linked was just the first one I looked at.)
Of course, as Kragen mentioned, whether you can achieve these results is impossible for anyone here to say.
Infrastructure needs for high-performance SQL Servers may be very different from your current setup.
But if you are currently having issues, it is very possible that the main part of your problem is bad database design and bad query design. There are many ways to write poorly performing queries. In a high-transaction system you can't afford any of them: no SELECT *, no cursors, no correlated subqueries, no badly performing functions, no WHERE clauses that aren't sargable, and on and on. A small example of the sargability point follows below.
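To illustrate (the table and column names here are hypothetical):

    -- Wrapping the column in a function defeats an index seek on OrderDate:
    SELECT OrderID
    FROM dbo.Orders
    WHERE YEAR(OrderDate) = 2010;      -- non-sargable: forces a scan

    -- The same filter written as a range on the bare column can use a seek:
    SELECT OrderID
    FROM dbo.Orders
    WHERE OrderDate >= '20100101'
      AND OrderDate <  '20110101';     -- sargable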
The very first thing I'd suggest is to get yourself several books on SQL Server performance tuning and read them. Then you will know where your system problems are likely to be and how to actually determine that.
An interesting article:
http://sqlblog.com/blogs/paul_nielsen/archive/2007/12/12/10-lessons-from-35k-tps.aspx

Instrumenting Database Access

Jeff mentioned in one of the podcasts that one of the things he always does is put in instrumentation for database calls, so that he can tell what queries are causing slowness etc. This is something I've measured in the past using SQL Profiler, but I'm interested in what strategies other people have used to include this as part of the application.
Is it simply a case of including a timer across each database call and logging the result, or is there a 'neater' way of doing it? Maybe there's a framework that does this for you already, or is there a flag I could enable in e.g. LINQ to SQL that would provide similar functionality?
I mainly use C#, but I would also be interested in seeing methods from different languages, and I'd be more interested in a 'code' way of doing this over a DB platform method like SQL Profiler.
If a query is more than just a simple SELECT on a single table, I always run it through EXPLAIN if I am on MySQL or PostgreSQL. If you are using SQL Server, then Management Studio has a Display Estimated Execution Plan option, which is essentially the same. It is useful to see how the engine will access each table and what indexes it will use. Sometimes it will surprise you.
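For reference, a rough T-SQL equivalent of that Management Studio option is the following sketch (the sample SELECT is arbitrary; note that each SET SHOWPLAN_XML must be alone in its batch):

    SET SHOWPLAN_XML ON;
    GO
    SELECT o.name
    FROM sys.objects AS o
    JOIN sys.columns AS c ON c.object_id = o.object_id;
    GO
    SET SHOWPLAN_XML OFF;
    GO
    -- The SELECT is not executed; the server returns its estimated plan as XML instead.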
Recording the database calls, the gross timing and the number of records (bytes) returned in the application is useful, but it's not going to give you all the information you need.
It might show you usage patterns you were not expecting. It might show where you're using "row-by-row" access instead of "set-based" operations.
The best tool to use is SQL Profiler: analyse the number of "Reads" vs the CPU and duration. You want to avoid high-CPU queries, high reads, and long durations (duh!).
The "group by reads" is a useful feature to bring to the top the nastiest queries.
If you're writing queries in SQL Server Management Studio you can enter SET STATISTICS TIME ON, and SQL Server will tell you how long the individual parts of a query took to parse, compile and execute.
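For example (the sample query is arbitrary):

    SET STATISTICS TIME ON;
    SELECT COUNT(*) FROM sys.objects;
    SET STATISTICS TIME OFF;
    -- The Messages pane then reports something along the lines of:
    --   SQL Server parse and compile time: CPU time = 0 ms, elapsed time = 1 ms.
    --   SQL Server Execution Times: CPU time = 0 ms, elapsed time = 2 ms.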
You might be able to log this information by handling the InfoMessage event of the SqlConnection class (but I think using SQL Profiler is much easier).
I would have thought that the important thing to ask here is "what database platform are you using?"
For example, in Sybase, installing the MDA tables might solve your problem: they provide a whole bunch of statistics, from procedure call usage to average logical I/O, CPU time and index coverage. It can be as clever as you want it to be.
I definitely see the value in using SQL Profiler while your app is running, and EXPLAIN or SET STATISTICS will give you information about individual queries, but does anyone routinely put measurement points into their code to gather ongoing information about database queries? That would pick up, for example, a query on a table that performs fine initially but becomes slower and slower as the number of rows grows.
If you're using MySQL or Postgres there are various tools for seeing query activity in real time, but I haven't found a tool as good as SQL Profiler for measuring query performance over time.
I'm wondering if there is (or should be?) something similar to ELMAH, in the way it just plugs in and gives you information without much additional effort.
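One server-side way to approximate that, assuming SQL Server 2005+, is to snapshot sys.dm_exec_query_stats on a schedule and diff the counters over time; a query whose average cost creeps up as its table grows will show in the deltas. A minimal sketch of the underlying query:

    -- Top cached statements by cumulative CPU (times are in microseconds).
    SELECT TOP (20)
        qs.execution_count,
        qs.total_worker_time  / qs.execution_count AS avg_cpu,
        qs.total_elapsed_time / qs.execution_count AS avg_elapsed,
        qs.total_logical_reads / qs.execution_count AS avg_logical_reads,
        SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
            ((CASE qs.statement_end_offset
                  WHEN -1 THEN DATALENGTH(st.text)
                  ELSE qs.statement_end_offset END
              - qs.statement_start_offset) / 2) + 1) AS statement_text
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    ORDER BY qs.total_worker_time DESC;

Bear in mind the counters only cover plans still in cache and reset on restart, so periodic snapshots are what make the trend visible.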
If you're into Firebird you may want to watch sinatica.com.
We'll soon launch a real-time monitoring tool for Firebird DBAs.
</shameless plug>
If you use Hibernate (I use the Java version, I'd imagine NHibernate has something similar), you can have Hibernate collect statistics about lots of different things. See, for example:
http://www.javalobby.org/java/forums/t19807.html
