SQL Server DBA Simulation

I'm looking into creating a SQL Server database and then writing some T-SQL scripts that simulate stress/load on the database, which I can then investigate using Windows Performance Monitor to highlight the causes of the load and how to fix them. The aim of this is to train my ability to investigate database performance problems and fix them.
I'm new to database administration, but I can't seem to find many resources/information on performance or on creating fake loads to test performance.
Any help would be appreciated.

Take a look at:
SQLQueryStress - allows you to simulate load by executing queries repeatedly and concurrently (a sample query is sketched below).
Diskspd - tests the I/O subsystem.
If you need some data to test against, you are also free to download the StackOverflow database and put it on your test server. Here's a link to a post on how to download the database.
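As a starting point, here is a minimal sketch of the kind of deliberately expensive query you could run on many concurrent threads in SQLQueryStress. It assumes the StackOverflow sample database mentioned above (dbo.Posts and the column names come from that schema); the sort on a column with no supporting index forces a large memory grant and, under concurrency, spills to tempdb, which shows up nicely in Performance Monitor.

    -- A deliberately expensive query for SQLQueryStress (assumes the
    -- StackOverflow sample database; dbo.Posts is one of its largest tables).
    -- The ORDER BY on an unindexed column forces a large sort that spills
    -- to tempdb under concurrency.
    SELECT TOP (1000)
           p.Id, p.Title, p.Score
    FROM dbo.Posts AS p
    ORDER BY p.LastActivityDate DESC;

While it runs, counters such as SQLServer:SQL Statistics - Batch Requests/sec and PhysicalDisk - Avg. Disk sec/Read are good ones to watch in Performance Monitor.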

Related

SQL Server 2005 - Investigate what caused tempdb to grow huge

The tempdb of my instance grew huge, eating up all the available disk space and causing applications to go down. I had to restart the instance in an emergency. However, I want to investigate and dig deep into what caused tempdb to grow huge all of a sudden. What were the queries and processes that caused this? Can someone help me pull the required info? I know I won't get much historical data from SQL Server. I do have Idera SQL Diagnostic Manager (a third-party tool) deployed. Any help on using the tool would be really appreciated.
For postmortem analysis, you can use the tools already installed on your server. For future proactive analysis, you can run SQL traces directly in SQL Profiler, or query the traces using T-SQL statements such as:
sys.fn_trace_gettable
sys.trace_events
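For example, a minimal sketch that reads the default trace (enabled out of the box on SQL Server 2005) and resolves event class IDs to names; the column selection is just illustrative:

    -- Read the default trace and resolve event class IDs to names.
    DECLARE @path nvarchar(260);
    SELECT @path = path FROM sys.traces WHERE is_default = 1;

    SELECT te.name AS event_name,
           t.DatabaseName,
           t.LoginName,
           t.TextData,
           t.StartTime
    FROM sys.fn_trace_gettable(@path, DEFAULT) AS t
    JOIN sys.trace_events AS te
      ON t.EventClass = te.trace_event_id
    ORDER BY t.StartTime DESC;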
You can also use an auditing tool that tracks every event that happens on a SQL Server instance and its databases, such as ApexSQL Comply. It also uses SQL traces, configures them automatically, and processes the captured information. It tracks object and data access and changes, failed and successful logins, security changes, etc. ApexSQL Comply loads all captured information into a centralized repository.
There are several reasons that might cause your tempdb to get very big.
A lot of sorting – if a sort requires more memory than SQL Server can grant, the intermediate results are stored in tempdb
DBCC commands – if you're frequently running commands such as DBCC CHECKDB, this might be the cause, since these commands store their intermediate results in tempdb
Very large result sets – these also use tempdb to run properly
A lot of heavy transactions, such as bulk inserts
Check out this article for more details on how to troubleshoot this: http://msdn.microsoft.com/en-us/library/ms176029.aspx. A query for spotting the current tempdb consumers is sketched below.
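To see who is consuming tempdb right now (rather than historically), here is a minimal sketch against the space-usage DMVs available from SQL Server 2005 onward; run it in the context of tempdb:

    -- Sessions ranked by tempdb page allocations. user_objects covers temp
    -- tables and table variables; internal_objects covers sorts, hashes,
    -- and spools.
    USE tempdb;
    SELECT s.session_id,
           s.login_name,
           u.user_objects_alloc_page_count,
           u.internal_objects_alloc_page_count
    FROM sys.dm_db_session_space_usage AS u
    JOIN sys.dm_exec_sessions AS s
      ON s.session_id = u.session_id
    ORDER BY u.user_objects_alloc_page_count
           + u.internal_objects_alloc_page_count DESC;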
AK2,
We have the Idera DM tool as well. If you know the time frame around which your tempdb was used heavily, you can go to History in the Idera tool to see what query was running at that time and what led the server to hang. On the "Tempdb Space used OverTime" chart you would usually see a straight line or a gentle graph, but at the time of heavy tempdb use there's a spike and a sharp drop. Referring to this time frame, you can check Sessions > Details to see the exact query and who was running it.
In our server this usually happens when there is a long query doing lots of joins, or when there is an expensive query dumping into a temp table or table variable.
Hope this helps.
You can use SQL Profiler. Please try the link below.
SQL Profiler

Application Hangs on SQL Server - restart required every time

We have an application with a SQL Server 2000 database attached to it. Every couple of days the application hangs, and we have to restart the SQL Server service, after which it works fine. The SQL Server logs show nothing about the problem. Can anyone tell me how to identify this issue? Is it an application problem or a SQL Server problem?
Thanks.
Is it an application problem or a SQL Server problem?
Is it possible to connect to MS SQL Server using Query Analyzer or another instance of your application?
General tips:
Use Activity Monitor to find information about concurrent processes, locks and resource utilization.
Use SQL Server Profiler to trace server and database activity, and to capture and save data to a table or file for later analysis.
You can use Dynamic Management Views (under the \Database name\Views\System Views folder in Management Studio) to get more detailed information about SQL Server internals.
If you have problems with performance (not your case), you can use Performance Monitor and Data Collector Sets to gather performance information. When the application hangs, a quick first check is to look for blocking, as sketched below.
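Since the question involves SQL Server 2000, which predates the DMVs, the sysprocesses system table is the quickest way to check for blocking while the application hangs; a minimal sketch:

    -- List blocked sessions and who is blocking them (SQL Server 2000).
    SELECT spid,
           blocked AS blocking_spid,
           waittime,
           lastwaittype,
           status,
           cmd,
           loginame
    FROM master..sysprocesses
    WHERE blocked <> 0;

If this returns rows every time the application hangs, follow the blocking_spid chain to the head blocker and use DBCC INPUTBUFFER(spid) to see what it is running.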
It's hard to predict the issue; I suggest you check your application first. Check what operations you are performing against the database, and whether you are taking care of connection pooling; unused open connections can create issues.
Check if you can get any log output from your application. Without any log information we can hardly suggest anything.
Read this
The application may be hanging due to a deadlock:
Check which stored procedures run at that time using Profiler.
Check the table manipulation (consider NOLOCK hints where dirty reads are acceptable).
Check the buffer size, and consider segregating the database into two or three modules. You can also log deadlock details on SQL Server 2000, as sketched below.
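A minimal sketch for making SQL Server 2000 record deadlock details in its error log (trace flag 1204 is the classic option on that version; 1222 replaced it in later releases):

    -- Write deadlock chains and the chosen victim to the SQL Server
    -- error log; -1 applies the flag to all connections.
    DBCC TRACEON (1204, -1);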

ADO.NET database access

I have written a program in VB.NET, and one of the things this program does is insert records into a Microsoft Access database. The backend of my program that accesses the database is written as an interchangeable layer. If I swap this layer out for one that uses a Microsoft SQL Server database, my program flies. If I use MS Access, it's still pretty quick, but much slower. Does anyone have any hints or tips on how to speed up ADO.NET transactions using Microsoft Access? I would really rather use MS Access over SQL Server so that I can distribute my database with my program (rather than connecting to some remote SQL Server). Any suggestions? Also, when I created the MS Access database, I created it in Access 2000 compatible mode. Would it be faster to use 2003 compatible mode?
Thanks in advance
Although you need to install it, SQL Server Express supports "XCopy file deployment" where all you need to do to deploy the application is ship an .mdf file and your executables.
Details are here on MSDN.
This does support stored procedures: I've used it in our unit tests to dynamically create a mocked-out database on the fly.
Access is, as you're experiencing, less than optimal.
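If you prefer to attach the shipped .mdf explicitly on the target instance rather than relying on a user-instance connection string, a minimal T-SQL sketch (the database name and path here are hypothetical placeholders):

    -- Attach a shipped .mdf on the target SQL Server Express instance.
    -- MyApp and the file path are placeholders; the log file is rebuilt
    -- if it is missing.
    CREATE DATABASE MyApp
    ON (FILENAME = 'C:\MyApp\Data\MyApp.mdf')
    FOR ATTACH_REBUILD_LOG;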
Have you taken a look at SQL Server Compact Edition? It can be embedded and distributed with your application, and should perform much better than Access.
SQL Server Compact 3.5 will give you the same benefit - a single database file that you can deploy and distribute (as long as you include the runtime assemblies in your app).
It has reduced query capabilities compared to a full SQL Server instance, but it is definitely faster than the Access engine.
I have used it with a mobile app that has a desktop component and it did everything I needed it to do.
Did you also have the Access backend open in Access at the same time? If so, try your program without having it open. If that speeds things up, then you should open either a database connection or a recordset (against a table with few records) and leave it open while processing the data.
The problem is that if you open and close objects or recordsets against an Access database file while someone else is in the file, Jet wastes a lot of time taking locks against the LDB file. Keeping a permanent connection to the Access database file solves this problem.
In my experience, ADO.NET is not very well optimized for MS Access. Using the older ADO or DAO interfaces (which are available in VB.NET via COM) can bring you performance improvements of a factor of 20 or more in some cases. But it all depends a lot on what SQL statements your program actually runs (lots of batch updates/inserts, lots of queries with large result sets, or lots of interactive load-transform-store cycles).
MSDN features an article on how to speed up ADO.NET: http://msdn.microsoft.com/en-us/library/ms998569.aspx
Even though the article is a bit dusty, it still makes a few good points :)
Other than that, using MS Access myself, I found that a few techniques, such as caching data, selecting without the full source schema, or optimizing queries, are suitable for keeping performance at a halfway decent level.

SQL commands to get performance statistics

Are there SQL commands that I could use to extract performance monitoring data from MS SQL 2005, such as:
transactions per second
page reads/writes
connections (@@CONNECTIONS gives the cumulative total of connection attempts since startup, but what about current?)
physical reads
locks and blocks
other counters that might be interesting?
You want to look at Dynamic Management Views (DMVs), introduced with SQL Server 2005.
This is a really great document from MS that gives you an overview of how to use DMVs to troubleshoot performance issues:
http://download.microsoft.com/download/1/3/4/134644fd-05ad-4ee8-8b5a-0aed1c18a31e/TShootPerfProbs.doc
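As a concrete starting point, here is a minimal sketch that pulls several of the counters you listed from sys.dm_os_performance_counters, the same data Performance Monitor shows. Note that the '/sec' counters are cumulative totals in this view, so sample twice and take the delta to get a rate; the counter names below are standard but worth verifying against your instance:

    -- Selected counters from the performance-counter DMV (SQL Server 2005+).
    -- 'User Connections' is a point-in-time value; the '/sec' counters
    -- are cumulative, so diff two samples to compute a rate.
    SELECT RTRIM(object_name)   AS object_name,
           RTRIM(counter_name)  AS counter_name,
           RTRIM(instance_name) AS instance_name,
           cntr_value
    FROM sys.dm_os_performance_counters
    WHERE counter_name IN (N'Transactions/sec',
                           N'Page reads/sec',
                           N'Page writes/sec',
                           N'User Connections',
                           N'Lock Waits/sec');

For current connections specifically, counting rows in sys.dm_exec_connections is another option.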
The best way of seeing what's going on under the hood in SQL Server is to use the Performance Monitor built into Windows: click Admin Tools -> Performance. If you haven't used it before, the trick is to start it, then click the + icon at the centre top of the window; a dialog opens with hundreds of different measures that you can then chart, watch, or log.
SQL Server exposes loads of counters you can check out; what all the data means is of course a different question. This solution doesn't integrate with T-SQL or Management Studio, but it is the best way of finding out what's going on.
A great place to learn how to performance tune SQL Server is Brent Ozar's website.
It includes details of how to use Performance Monitor and DMVs, and how to data-mine and interpret the results.
http://www.brentozar.com/sql-server-performance-tuning/

Automatically measure all SQL queries

In Maybe Normalizing Isn't Normal, Jeff Atwood says, "You're automatically measuring all the queries that flow through your software, right?" I'm not, but I'd like to.
Some features of the application in question:
ASP.NET
a data access layer which depends on the MS Enterprise Library Data Access Application Block
MS SQL Server
In addition to Brad's mention of SQL Profiler, if you want to do this in code, then all your database calls need to be funnelled through a common library. You insert the timing code there, and voila, you know how long every query in your system takes.
A single point of entry to the database is a fairly standard feature of any ORM or database layer -- or at least it has been in any project I've worked on so far!
SQL Profiler is the tool I use to monitor traffic flowing to my SQL Server. It allows you to gather detailed data about your SQL Server. SQL Profiler has been distributed with SQL Server since at least SQL Server 2000 (but probably before that also).
Highly recommended.
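If you can't funnel every call through one library, SQL Server 2005 itself keeps aggregate timings per cached plan. Here is a minimal sketch using sys.dm_exec_query_stats; the offset arithmetic extracts the individual statement text from the batch:

    -- Top 20 cached statements by total CPU, with execution counts.
    -- Times are reported in microseconds; shown here as milliseconds.
    SELECT TOP (20)
           qs.execution_count,
           qs.total_worker_time / 1000  AS total_cpu_ms,
           qs.total_elapsed_time / 1000 AS total_elapsed_ms,
           SUBSTRING(st.text,
                     (qs.statement_start_offset / 2) + 1,
                     ((CASE qs.statement_end_offset
                         WHEN -1 THEN DATALENGTH(st.text)
                         ELSE qs.statement_end_offset
                       END - qs.statement_start_offset) / 2) + 1) AS statement_text
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    ORDER BY qs.total_worker_time DESC;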
Take a look at this chapter Jeff Atwood and I wrote about performance optimizations for websites. We cover a lot of stuff, but there's a lot of stuff about database tracing and optimization:
Speed Up Your Site: 8 ASP.NET Performance Tips
The Dropthings project on CodePlex has a class for timing blocks of code.
The class is named TimedLog. It implements IDisposable. You wrap the block of code you wish to time in a using statement.
If you use Rails, it automatically logs all the SQL queries, and the time they took to execute, in your development log file.
I find this very useful, because if you do see one that's taking a while, it's one step to copy and paste it straight off the screen/logfile and put 'explain' in front of it in MySQL.
You don't have to go digging through your code and reconstructing what's happening.
Needless to say, this doesn't happen in production, as it'd run you out of disk space in about an hour.
If you define a factory that creates SqlCommands for you and always call it when you need a new command, you can return a RealProxy to an SqlCommand.
This proxy can then measure how long ExecuteReader / ExecuteScalar etc. take using a Stopwatch and log it somewhere. The advantage of this method over SQL Server Profiler is that you can get full stack traces for each executed piece of SQL.
