We have an application with a SQL Server 2000 database attached to it. Every couple of days the application hangs, and we have to restart the SQL Server service before it works again. The SQL Server logs show nothing about the problem. Can anyone tell me how to identify this issue? Is it an application problem or a SQL Server problem?
Thanks.
Is it an application problem or a SQL Server problem?
Is it possible to connect to MS SQL Server using Query Analyzer or another instance of your application?
General tips:
Use Activity Monitor to find information about concurrent processes, locks and resource utilization.
Use SQL Server Profiler to trace server and database activity, and to capture and save data to a table or file for later analysis.
Use Dynamic Management Views (found in Management Studio under Database name > Views > System Views) to get more detailed information about SQL Server internals; a small sketch follows this list.
If you have performance problems (not your case), you can use Performance Monitor and Data Collector Sets to gather performance information.
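For example, here is a minimal sketch of a DMV query to spot blocked sessions and what they are running (assumes SQL Server 2005 or later; DMVs do not exist on SQL Server 2000):

    -- List sessions that are currently blocked, who is blocking them,
    -- what they are waiting on, and the statement text.
    SELECT r.session_id,
           r.blocking_session_id,
           r.wait_type,
           r.wait_time,
           t.text AS sql_text
    FROM sys.dm_exec_requests AS r
    CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
    WHERE r.blocking_session_id <> 0;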
It's hard to predict the issue; I would suggest checking your application first. Check which operations you are performing against the database, and whether you are taking care of connection pooling; unused open connections can create issues.
Check whether you can get any log output from your application. Without any log information, we can hardly suggest anything.
The application may be hanging due to a deadlock.
Check which stored procedures are running at that time using Profiler, and review how the tables are being accessed (consider NOLOCK hints where dirty reads are acceptable). Also check the buffer size, and consider segregating the database into two or three modules.
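If deadlocks are the suspect, one low-effort check is to turn on deadlock reporting in the error log. A minimal sketch; trace flag 1204 is the SQL Server 2000-era flag, while 1222 gives richer output on 2005 and later:

    -- Write deadlock chains to the SQL Server error log.
    -- 1204 works on SQL Server 2000; on 2005+ prefer 1222 for more detail.
    DBCC TRACEON (1204, -1);  -- -1 applies the flag globally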
I want to run SQL Profiler to see the performance of my SQL Server 2008 database, but I'm afraid that running the profiler on the same machine will affect the performance of the server, and I don't want to slow the server down.
A long time ago I heard from a DBA that he ran the profiler from his laptop connected to the SQL Server, in a way that did not affect the performance of the server.
So basically my question is: how do I run SQL Profiler from an external computer without slowing down the SQL Server?
Any database being profiled has to do work in order for profiling to be possible; there is no way around it. Generally speaking, observing a system always places load on that system. However, SQL Server Profiler and other similar tools also do ADDITIONAL work outside of the target database, and this additional work can be offloaded to another computer.
To offload what you can, just run SQL Server Profiler from ANY machine that is not the database server. When you start a New Trace, you tell it to connect to the database on whatever server the database is running on. That's all there is to it. Your target database will incur some additional load (unavoidable), but you will be offloading as much work as you can to whatever machine you run Profiler on.
If you are able to connect to the server from an external computer, then there should be no issue running Profiler remotely as well.
So basically my question is: how do I run SQL Profiler from an external computer without slowing down the SQL Server?
When you run Profiler for long periods of time, it affects performance, since it has to keep track of all events in memory and log them before discarding them. So running Profiler for long periods of time is not recommended.
You can also use Extended Events, available starting from SQL Server 2008 (very lightweight relative to Profiler), to track events similar to Profiler; see the sketch below.
http://www.sqlteam.com/article/introduction-to-sql-server-2008-extended-events
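For illustration, a minimal Extended Events session sketch (SQL Server 2008 syntax; the session name, file paths, and 1-second threshold are placeholders, and on SQL Server 2012+ the file target is named package0.event_file instead):

    -- Capture statements that take longer than 1 second (duration is in microseconds).
    CREATE EVENT SESSION LongStatements ON SERVER
    ADD EVENT sqlserver.sql_statement_completed
        (WHERE duration > 1000000)
    ADD TARGET package0.asynchronous_file_target
        (SET filename     = N'C:\Traces\LongStatements.xel',   -- placeholder path
             metadatafile = N'C:\Traces\LongStatements.xem');

    -- Start collecting.
    ALTER EVENT SESSION LongStatements ON SERVER STATE = START;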
Profiler can be initiated from any computer with appropriate permissions and access, but the trace itself ALWAYS runs on the actual SQL Server instance. There is no way around this. You can minimize the operations that are logged and filter by a specific user to mitigate performance issues, but that's about it.
The DBA in question may have run a server-side trace, which can be less impactful, but it's still initiated ON the applicable instance.
I am a DBA, and I am not aware of any performance issues from running SQL Server Profiler on the server itself. That said, when you run SQL Server Profiler it loads just like SSMS, and you can select which server to use.
If you have a query that is running so long that it's killing SQL resources, then running it at all will still use up resources, regardless of where Profiler is run from.
If you are concerned about performance on the SQL Server instance don't run Profiler in production during peak hours.
If you want to minimize the impact of SQL Trace, it is best to use server-side tracing:
https://msdn.microsoft.com/en-us/library/cc293613.aspx
That way you can record the SQL commands into a trace file while SQL Profiler is closed. When you are done collecting SQL commands, you can copy the trace file and open it with SQL Profiler on some other computer. This is much better than running SQL Trace directly through SQL Profiler (which is called client-side tracing); a minimal sketch is below.
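A rough sketch of what such a server-side trace looks like (the event and column IDs come from the sp_trace_setevent documentation; the file path is a placeholder):

    DECLARE @TraceID int, @maxfilesize bigint, @on bit;
    SET @maxfilesize = 50;  -- max trace file size in MB
    SET @on = 1;

    -- Create the trace; SQL Server appends .trc to the file name.
    EXEC sp_trace_create @TraceID OUTPUT, 0, N'C:\Traces\MyServerSideTrace', @maxfilesize, NULL;

    -- Capture SQL:BatchCompleted (event 12): TextData (1), SPID (12), Duration (13).
    EXEC sp_trace_setevent @TraceID, 12, 1,  @on;
    EXEC sp_trace_setevent @TraceID, 12, 12, @on;
    EXEC sp_trace_setevent @TraceID, 12, 13, @on;

    -- Start the trace.
    EXEC sp_trace_setstatus @TraceID, 1;

    -- When done: stop (0) and close (2) it, then open the .trc file in Profiler elsewhere.
    -- EXEC sp_trace_setstatus @TraceID, 0;
    -- EXEC sp_trace_setstatus @TraceID, 2;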
I'm maintaining a legacy server app that generates DMO files from SQL Server views.
Sometimes the server crashes because SQL Server consumes all CPU resources.
Using the SQL Server monitor, I can see that the problem is in SQLDMO connections that are consuming all CPU time and blocking the server.
I don't understand the reason for this, because the DMO connection uses TRANSACTION ISOLATION LEVEL READ UNCOMMITTED, and these SQL statements never finish, even after weeks. The only solution is to shut down the server.
I would suggest looking into the code to find out why these connections are not closed. I'm guessing there's no proper close at the end, or something along those lines.
If that is not an option, you could consider running a scheduled job that kills these specific sessions every so often if they have run for longer than, say, 24 hours; a rough sketch follows.
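Something along these lines could work as the job step (a rough sketch only; the program_name filter is a guess at how the DMO connections identify themselves, so adjust it to what you actually see in sysprocesses):

    -- Kill sessions from a given application whose last batch started > 24 hours ago.
    DECLARE @spid int, @sql nvarchar(50);

    DECLARE kill_cursor CURSOR FOR
        SELECT spid
        FROM master.dbo.sysprocesses
        WHERE program_name LIKE '%DMO%'                       -- hypothetical filter
          AND last_batch < DATEADD(hour, -24, GETDATE())
          AND spid > 50;                                      -- skip system sessions

    OPEN kill_cursor;
    FETCH NEXT FROM kill_cursor INTO @spid;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @sql = N'KILL ' + CAST(@spid AS nvarchar(10));
        EXEC (@sql);
        FETCH NEXT FROM kill_cursor INTO @spid;
    END
    CLOSE kill_cursor;
    DEALLOCATE kill_cursor;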
I just realized that my application was needlessly making 50+ database calls per user request due to some hidden coding -- hidden in the sense that between LINQ, persistence frameworks and events it just so turned out that a huge number of calls were being made without me being aware.
Is there a recommended way to analyze individual transactions going to my SQL 2008 database, preferably with some integration to my Visual Studio 2010 environment? I want to be able to 'spy' on individual transactions being made, but only for certain pieces of my code, and without making serious changes to either the code or database.
In addition to SQL Server Profiler, there are a number of performance counters you can look at to see both a real-time evaluation and a historic trend:
Batch Requests/sec: Effectively measures the number of actual calls made to the SQL Server
Transactions/sec: Number of transactions in each database.
Connection resets/sec: number of new connections started from the connection pool by your site.
There are many more performance counters you can monitor, especially if you want to measure performance, but going through them all is beyond the scope here. A good starting point is Monitoring Resource Usage. You can also read these counters from T-SQL, as sketched below.
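A minimal sketch of reading the counters straight from the server (SQL Server 2005+; note that the */sec counters are cumulative in this view, so you have to sample twice and take the difference to get an actual rate):

    -- Read raw counter values from the server.
    SELECT object_name, counter_name, instance_name, cntr_value
    FROM sys.dm_os_performance_counters
    WHERE counter_name IN (N'Batch Requests/sec', N'Transactions/sec');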
You can use the SQL Profiler tool that comes with SQL Server Management Studio.
Microsoft SQL Server Profiler is a graphical user interface to SQL Trace for monitoring an instance of the Database Engine or Analysis Services. You can capture and save data about each event to a file or table to analyze later. For example, you can monitor a production environment to see which stored procedures are affecting performance by executing too slowly.
As mentioned, SQL Profiler is useful at the SQL Server level. Note, however, that it is not available in SQL Server Management Studio Express.
At the .NET level, LINQ to SQL and the Entity Framework both support logging. See Logging every data change with Entity Framework, http://msdn.microsoft.com/en-us/magazine/gg490349.aspx, http://peterkellner.net/2008/12/04/linq-debug-output-vs2008/.
What are my options for achieving a cold backup server for SQL Server Express instance running a single database?
I have an SQL Server 2008 Express instance in production that currently represents a single point of failure for my application. I have a second physical box sitting at the installation that is currently doing nothing. I want to somehow replicate my database in near real time (a little bit of data loss is acceptable) to the second box. The database is very small and resources are utilized very lightly.
In the case that the production server dies, I would manually reconfigure my application to point to the backup server instead.
Although Express doesn't support log shipping, I am thinking that I could manually script a poor man's version of it, where I use batch files to take the logs and copy them across the network and apply them to the second server at 5 minute intervals.
Does anyone have any advice on whether this is technically achievable, or if there is a better way to do what I am trying to do?
Note that I want to avoid having to pay for the full version of SQL Server and configure mirroring, as I think it is overkill for this application. I understand that other DB platforms may present suitable options (e.g. a MySQL Cluster), but for the purposes of this discussion, let's assume we have to stick to SQL Server.
I would also advise script-based log shipping; after all, this is how log shipping started. All you need is a time-based scheduler to run the scripts (i.e. Task Scheduler) and a smart(er) file copy (robocopy).
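A minimal sketch of the two halves (the paths and database name are placeholders; the secondary must first be seeded by restoring a full backup WITH NORECOVERY, and in practice each log backup should get a unique, timestamped file name so the chain is preserved):

    -- On the primary, every 5 minutes (run via sqlcmd from Task Scheduler):
    BACKUP LOG MyDatabase
    TO DISK = N'C:\LogShip\MyDatabase.trn'   -- placeholder; robocopy this file across
    WITH INIT;

    -- On the secondary, after the copy arrives:
    RESTORE LOG MyDatabase
    FROM DISK = N'C:\LogShip\MyDatabase.trn'
    WITH NORECOVERY;   -- stays in restoring state so the next log can be applied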
I need to determine if a specific application is running from a SQL Server 2005 job. The issue is that one of our applications we use to send data will hang, causing problems with any subsequent jobs that invokes it. If I can also obtain the CPU time, I can determine if it's likely a hung process.
A list of running applications would be good, but being able to lookup a specific executable name with the CPU time would be fantastic!
Any application launched by a job step will show as being run by the same logon account as the SQL Server Agent. Use a dedicated service account for the SQL Server Agent that isn't used for any other services. This will allow you to monitor the applications launched by a job using Task Manager, Performance Monitor, etc.
Try opening the SQL Server Activity Monitor. You can also get some of the information from the stored proc sp_who2.
Have the job run an external script (batch file, KSH script) instead of a TSQL script.
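If you'd rather check from T-SQL directly, one option is to shell out and ask Windows (a rough sketch; 'MyApp.exe' is a placeholder, and xp_cmdshell has to be enabled on the instance):

    -- List the executable, if running, with its accumulated CPU time (/V verbose output).
    EXEC master.dbo.xp_cmdshell 'tasklist /FI "IMAGENAME eq MyApp.exe" /V';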
I think the best approach is to run SQL Server Profiler as well as Performance Monitor and wait for the specified job to run. Then import the perfmon stats into Profiler. You can do this from SQL Server Profiler by going to File -> Import Performance Data... and pointing it to your Performance Monitor logs.
You should be able to choose the Process(all) counter to get a list of all running processes, as well as CPU time for each process. You can then correlate this with the application name and/or hostname in the Profiler logs to see what's going on.
I use the (free) Task Manager replacement "Process Explorer" to get a better look at exes and their dependencies.
Might be worth monitoring your issue with this.
http://technet.microsoft.com/en-us/sysinternals/bb896653.aspx