How can I get the whole statement text from a running session?
I'd like to check its actual execution plan, and for that I need the complete query, but sometimes the text is too long to be shown in full by Management Studio's Activity Monitor or the dynamic management views.
Any advice would be so helpful.
Use SQL Profiler and run a trace against the related database.
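Alternatively (this is only a sketch, not part of the answer above), the DMVs discussed elsewhere on this page can return the full statement text and the cached plan for a running session; the SPID 54 is only an example:
-- Sketch: full statement text and cached plan for one running session.
-- Replace 54 with the session_id you are interested in.
SELECT r.session_id,
       t.text       AS full_statement_text,
       p.query_plan AS cached_plan
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle)    AS t
OUTER APPLY sys.dm_exec_query_plan(r.plan_handle) AS p
WHERE r.session_id = 54;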
I want to run SQL Profiler to look at the performance of my SQL Server 2008 database, but I'm afraid that running Profiler on the same machine will affect the performance of the server, and I don't want to slow the server down.
A long time ago a DBA told me that he ran Profiler from his laptop, connected to the SQL Server, in a way that did not affect the performance of the server.
So basically my question is: how do I run SQL Profiler from an external computer without slowing down the SQL Server?
Any database being profiled has to do work in order for profiling to be possible - there is no way around it. Generally speaking, observation of a system always induces a load to that system. However, SQL Server Profiler and other similar tools also do ADDITIONAL work outside of the target db, and this additional work can be offloaded to another computer.
To offload what you can, you just run SQL Server Profiler from ANY machine that is not the database server. When you start a New Trace, you tell it to connect to the database on whatever server the database is running on. That's all there is to it. Your target db will incur some additional load (unavoidable), but you will be offloading as much work as you can to whatever machine you run Profiler on.
If you are able to connect to the server from an external computer, then there would be no issues running Profiler remotely as well.
When you run Profiler for long periods of time it affects performance, since it has to keep track of all events in memory and log them before discarding them, so running Profiler for long periods is not recommended.
Starting with SQL Server 2008 you can also use Extended Events (very lightweight relative to Profiler) to track events similar to what Profiler captures:
http://www.sqlteam.com/article/introduction-to-sql-server-2008-extended-events
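A minimal sketch of such a session (the session name and the database_id filter are placeholders to adapt; the ring_buffer target is used here because it exists in SQL Server 2008):
-- Sketch: lightweight Extended Events session capturing completed statements.
-- "LightweightTrace" and the database_id value are placeholders.
CREATE EVENT SESSION LightweightTrace ON SERVER
ADD EVENT sqlserver.sql_statement_completed
    (ACTION (sqlserver.session_id, sqlserver.sql_text)
     WHERE sqlserver.database_id = 5)
ADD TARGET package0.ring_buffer;

ALTER EVENT SESSION LightweightTrace ON SERVER STATE = START;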
Profiler can be initiated from any computer with appropriate permissions and access, but the trace itself ALWAYS runs on the actual SQL Server instance. There is no way around this. You can minimize the events that are logged and filter by a specific user to mitigate performance issues, but that's about it.
The DBA in question may have run a server-side trace, which can be less impactful, but it is still initiated ON the applicable instance.
I am a DBA and I am not aware of any performance issues caused by running SQL Server Profiler on the server itself. That said, when you run SQL Server Profiler it loads just like SSMS, and you can select which server to connect to.
If you have a query that is running so long that it's killing SQL resources, then tracing it will still use up resources regardless of where Profiler is launched from.
If you are concerned about performance on the SQL Server instance don't run Profiler in production during peak hours.
If you want to minimize the impact of SQL Trace, it is best to use server-side tracing:
https://msdn.microsoft.com/en-us/library/cc293613.aspx
That way you can record the SQL commands into a trace file while SQL Profiler stays closed. When you are done collecting the SQL commands, you can copy the trace file and open it with SQL Profiler on some other computer. That is much better than running SQL Trace directly through SQL Profiler (which is called client-side tracing).
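For reference, a bare-bones server-side trace looks roughly like this (the file path and the choice of events and columns are only an illustration):
-- Sketch: create and start a server-side trace that logs batch text,
-- SPID and duration to a .trc file. The path is an assumption.
DECLARE @TraceID int, @maxfilesize bigint, @on bit;
SET @maxfilesize = 50;   -- MB
SET @on = 1;

EXEC sp_trace_create @TraceID OUTPUT, 0, N'C:\Traces\MyTrace', @maxfilesize;

-- Event 12 = SQL:BatchCompleted; columns 1 = TextData, 12 = SPID, 13 = Duration
EXEC sp_trace_setevent @TraceID, 12, 1,  @on;
EXEC sp_trace_setevent @TraceID, 12, 12, @on;
EXEC sp_trace_setevent @TraceID, 12, 13, @on;

EXEC sp_trace_setstatus @TraceID, 1;   -- 1 = start

-- When finished: stop (0) and close (2), then open C:\Traces\MyTrace.trc in Profiler
-- EXEC sp_trace_setstatus @TraceID, 0;
-- EXEC sp_trace_setstatus @TraceID, 2;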
How do I see the currently executing SQL statements in SQL Server? I've poked around in SQL Server Management Studio, but I don't see anything "canned".
Profiler will log, and allow you to view, all activity on the server, if that's what you're looking for.
http://msdn.microsoft.com/en-us/library/ms187929.aspx
The active connections can be listed with the built-in stored procedures sp_who and sp_who2. At least one of them (I don't remember which one right now) shows the command each connection is executing at the moment the procedure is run.
As mentioned in another answer, SQL Server Profiler is a great tool which gives much more detail and logging of activity. The stored procedures just provide a quick overview.
Activity Monitor (in SSMS under Management) is a GUI version of sp_who2. To identify the T-SQL being executed, run DBCC INPUTBUFFER with the SPID, e.g.:
DBCC INPUTBUFFER(54)
SQL Server DMVs are great for finding information like this. For example, sys.dm_exec_connections will show you a lot about the users connected to your database.
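For example, a query along these lines (only a sketch; pick whatever columns you need) shows each connection together with the most recent statement it sent:
-- Sketch: who is connected, from where, and the last statement on each connection.
SELECT c.session_id,
       c.client_net_address,
       s.login_name,
       s.status,
       t.text AS most_recent_sql
FROM sys.dm_exec_connections AS c
JOIN sys.dm_exec_sessions    AS s ON s.session_id = c.session_id
OUTER APPLY sys.dm_exec_sql_text(c.most_recent_sql_handle) AS t;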
If it is of interest, our Cotega service can analyze your database and show you this, as well as a lot of other things (such as top queries by CPU, IO, etc.), and it is available even to free accounts.
I hope that helps.
When I execute a T-SQL query directly, it executes in 15 seconds on SQL Server 2005.
The same query through SSRS was working fine until yesterday; I had to kill it after 30 minutes.
I made no changes to anything in SSRS.
Any ideas? Where do I start looking?
Start your query from SSRS, then look at the Activity Monitor in Management Studio. See if the query is currently blocked, and in that case, what it is blocked on.
Alternatively you can use sys.dm_exec_requests and check the same thing, without the user interface getting in the way. Look at the session executing the query from SSRS and check its blocking_session_id, wait_type, wait_time and wait_resource columns. If you find that the query is blocked, then SSRS is probably not at fault and something in your environment is blocking the query execution. If on the other hand the query is making progress (the wait_resource changes), then it is simply executing slowly and it's time to check its execution plan.
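Something along these lines works (a sketch using the columns mentioned above; 54 stands in for whatever session is running the report query):
-- Sketch: is the report query blocked, and on what?
SELECT session_id,
       status,
       blocking_session_id,
       wait_type,
       wait_time,
       wait_resource,
       total_elapsed_time
FROM sys.dm_exec_requests
WHERE session_id = 54;   -- example SPID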
Have you tried making the query a stored procedure to see if that helps? This way execution plans are cached.
Update: you could also make the query a view to achieve the same effect.
Also, SQL Profiler can help you determine what is being executed. This will allow you to see whether the SQL is the cause of the issue or whether it is Reporting Services rendering the report (i.e. not fetching the data).
There are a number of connection-specific things that can vastly change performance - for example the SET options that are active.
In particular, some of these can play havoc if you have a computed+persisted (and possibly indexed) column. If the settings are a match for how the column was created, it can use the stored value; otherwise, it has to recalculate it per row. This is especially expensive if the column is a promoted column from xml.
Does any of that apply?
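If you want to compare the SET options of the SSRS connection with those of your own SSMS session, one illustrative way is to query sys.dm_exec_sessions:
-- Sketch: SET options per session; compare the SSRS connection with your own.
SELECT session_id,
       program_name,
       ansi_nulls,
       ansi_padding,
       ansi_warnings,
       arithabort,
       concat_null_yields_null,
       quoted_identifier
FROM sys.dm_exec_sessions
WHERE is_user_process = 1;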
Are you sure the problem is your query? There could be SQL Server problems. Don't forget about the ReportServer and ReportServerTempDB databases. Maybe they need some maintenance.
The first port of call for any performance problem like this is to get an execution plan. You can either get this by running a SQL Profiler trace with the Showplan XML event, or, if that isn't possible (you probably shouldn't do this on loaded production servers), you can extract the cached execution plan that is being used from the DMVs.
Getting the plan from a trace is preferable, however, as that plan will include statistics about how long the different nodes took to execute. (The trace won't cripple your server or anything, but it will have some performance impact.)
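If you do go the DMV route, a query roughly like this (the LIKE filter is just a placeholder for something that identifies the report query) pulls the text and the cached plan:
-- Sketch: find the cached plan for statements matching a search string.
SELECT qs.execution_count,
       qs.total_elapsed_time / qs.execution_count AS avg_elapsed_time,
       st.text,
       qp.query_plan
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle)    AS st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
WHERE st.text LIKE N'%MyReportQuery%';   -- hypothetical search string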
I'd like to know what stored procedures are currently running to diagnose some performance problems. How can I find that out?
Very useful script for analyzing locks and deadlocks: http://www.sommarskog.se/sqlutil/aba_lockinfo.html
It shows procedure or trigger and current statement.
I think you can execute sp_who2 to get the list of connections, but then you'll need to run a trace through SQL Profiler on the specific connection to see what it's executing. I don't think that works with queries that are already running, though.
DBCC INPUTBUFFER will show you the first 255 characters of input on a spid (you can use sp_who2 to determine the spids you're interested in). To see the whole command, you can use ::fn_get_sql().
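A sketch of the ::fn_get_sql() approach (54 is just an example spid taken from sp_who2):
-- Sketch: full batch text for one spid via ::fn_get_sql() (SQL 2000 SP3 and later).
DECLARE @handle binary(20);
SELECT @handle = sql_handle FROM master..sysprocesses WHERE spid = 54;
SELECT [text] FROM ::fn_get_sql(@handle);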
You can use SQL Profiler to find that out.
EDIT:
If you can stop the app you are running, you can start SQL Profiler, run the app and look at what's running including stored procedures.
Using Enterprise Manager, you can open the Management tree section and choose Current Activity -> Process Info. Double-clicking a Process ID will show you what that process is running. If it's a stored procedure, it will not show you the parameters; for that it would be better to use Brian Kim's suggestion of using SQL Profiler.
I am trying to optimize some stored procedures on a SQL Server 2000 database, and when I try to use SQL Profiler I get the error message "In order to run a trace against SQL Server you have to be a member of sysadmin fixed server role." It seems that only members of the sysadmin role can run traces on the server (something that was fixed in SQL Server 2005), and there is no way in hell that I will be granted that server role (company policy).
What I'm doing now is inserting the current time minus the time the procedure started at various stages of the code, but I find this very tedious.
I was also thinking of replicating the database to a local installation of SQL Server, but the stored procedure uses data from so many different databases that I would spend a lot of time copying data locally.
So I was wondering: is there some other way of profiling SQL code? (Third-party tools, different practices, something else?)
Your hands are kind of tied without profiler.
You can, however, start with tuning your existing queries using Query Analyzer or any query tool and examining the execution plans. With QA, you can use the Show Execution Plan option. From other tools you can use:
SET STATISTICS PROFILE ON / OFF
In Query Analyzer:
SET STATISTICS TIME ON
SET STATISTICS IO ON
Run the query and look in the Messages tab.
It occurs to me this may require the same privileges, but it's worth a try.
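For example (dbo.MyStoredProc is a stand-in for whatever procedure you are tuning):
-- Sketch: wrap the statement you are investigating with the statistics options.
SET STATISTICS TIME ON;
SET STATISTICS IO ON;
SET STATISTICS PROFILE ON;

EXEC dbo.MyStoredProc;   -- hypothetical procedure under investigation

SET STATISTICS PROFILE OFF;
SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;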
There is a workaround on SQL 2000 to obfuscate the Profiler connection dialogue box to limit the sysadmin connection to running traces only.
SQLTeam
Blog