I am interested in profiling SQL Server database transactions that are being executed by a java application.
Particularly I am interested in finding out what queries (query strings) get executed as a part of the transaction.
Database I am using is SQL Server 2008.
Right now I am selecting the Exec Prepared SQL and Prepare SQL events in the Profiler, but that does not give me the query strings being executed by the application. I see a bunch of exec statements but nothing more; there are no details about which query was actually executed.
Would anyone have an idea of how can I get access to the query strings?
Regards,
My typical events are:
RPC:Completed
SP:Completed
SP:StmtCompleted
Exec Prepared SQL
SQL:BatchCompleted
SQL:StmtCompleted
And I pick the TextData column (and some others).
Do you know about DMV queries?
http://sqlserverperformance.wordpress.com/2008/01/21/five-dmv-queries-that-will-make-you-a-superhero/
http://sqlserverperformance.wordpress.com/2011/02/04/five-dmv-queries-that-will-make-you-a-superhero-in-2011/
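For instance, a minimal DMV query of the kind those posts cover, pulling the statement text of currently executing requests (SQL Server 2005 and later):
-- SQL text of currently executing requests.
SELECT r.session_id,
       r.status,
       r.start_time,
       t.text AS sql_text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id <> @@SPID;  -- leave out this query itself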
Related
I am trying to insert rows into a Microsoft SQL Server 2014 table from a query that hits a linked Oracle 11g server. I have read-only access rights on the linked server. I have traditionally used OPENQUERY to do something like the following:
INSERT INTO <TABLE> SELECT * FROM OPENQUERY(LINKED_SERVER, <SQL>)
Over time the SQL queries I run have been getting progressively more complex and recently surpassed the OPENQUERY limit of 8000 characters. The general consensus on the web appears to be to switch to something like the following:
INSERT INTO <TABLE> EXECUTE(<SQL>) AT LINKED_SERVER
However, this seems to require that distributed transactions are enabled on the linked server, which isn't an option for this project. Are there any other possible solutions I am missing?
Can you get your second method to work if you disable the "remote proc transaction promotion" linked server option?
EXEC master.dbo.sp_serveroption
    @server = 'YourLinkedServerName',
    @optname = 'remote proc transaction promotion',
    @optvalue = 'false'
If SQL Server Integration Services is installed/available, you could do this with an SSIS package. SQL Server Import/Export Wizard can automate a lot of the package configuration/setup for you.
Here's a previous question with some useful links on SSIS to Oracle:
Connecting to Oracle Database using Sql Server Integration Services
If you're interested in running it via T-SQL, here's an article on executing SSIS packages from a stored proc:
http://www.databasejournal.com/features/mssql/executing-a-ssis-package-from-stored-procedure-in-sql-server.html
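If the package ends up deployed to the SSISDB catalog (SQL Server 2012 and later), a rough sketch of kicking it off from T-SQL looks like this; the folder, project, and package names are placeholders:
-- Start a package deployed to the SSISDB catalog.
-- 'MyFolder', 'MyProject', and 'LoadFromOracle.dtsx' are placeholder names.
DECLARE @execution_id BIGINT;
EXEC SSISDB.catalog.create_execution
    @folder_name     = N'MyFolder',
    @project_name    = N'MyProject',
    @package_name    = N'LoadFromOracle.dtsx',
    @use32bitruntime = 0,
    @execution_id    = @execution_id OUTPUT;
EXEC SSISDB.catalog.start_execution @execution_id;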
I've been in a similar situation before; what worked for me was to decompose the large query string while still using the query method below (I did not have the luxury of SSIS).
FROM OPENQUERY(LINKED_SERVER, <SQL>)
Instead of inserting directly into your table, move your main result set into a local temporary landing table first (this could be a physical or temp table).
Decompose your <SQL> query by moving the transformation and business logic code out of the <SQL> query and into the SQL Server boundary.
If you have joins in your <SQL> query, bring those result sets across to SQL Server as well, then join them locally to your main result set.
Finally, perform your insert locally.
Clear your staging area.
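Putting those steps together, a rough sketch with placeholder names for the linked server, remote queries, and tables:
-- 1. Land the main remote result set locally (keep the remote query small and simple).
SELECT *
INTO   #staging_main
FROM   OPENQUERY(LINKED_SERVER, 'SELECT key_col, col1, col2 FROM remote_main_table');
-- 2. Bring any lookup/join data across separately.
SELECT *
INTO   #staging_lookup
FROM   OPENQUERY(LINKED_SERVER, 'SELECT key_col, lookup_val FROM remote_lookup_table');
-- 3. Join and apply transformation/business logic on the SQL Server side, then insert locally.
INSERT INTO dbo.TargetTable (col1, col2, lookup_val)
SELECT m.col1, m.col2, l.lookup_val
FROM   #staging_main   AS m
JOIN   #staging_lookup AS l ON l.key_col = m.key_col;
-- 4. Clear the staging area.
DROP TABLE #staging_main;
DROP TABLE #staging_lookup;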
There are various approaches (like wrapping your open queries in views), but I like flexibility and found that reducing my open queries to the minimum size, then storing and transforming locally, yielded better results.
Hope this helps.
I was told by our IT team that we cannot use the OPENQUERY command in MS SQL Server anymore. They claimed it could slow down the server because every query requires a full table load, all queries slow it down, etc.
I was somewhat puzzled by this as I thought an OpenQuery command was similar to the 'passthrough' query in Access. The query goes to the IBM server, which executes the command and only sends the results back to SQL Server. I have read through OpenQuery on the internet and nothing I've read makes me believe that it loads or sends a whole table and then SQL Server filters the results.
I assume it's possible for them to lock down the DB2 servers and prevent linked servers from SQL Server, but for my future knowledge, can someone explain any perils of using OPENQUERY when connecting to IBM DB2?
Thanks,
Please read this. Can you avoid OPENQUERY? The best alternative would be to use a stored procedure call, or at least craft an EXECSQL with a stored procedure target.
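If the stored-procedure route is available, a hedged sketch of a parameterized pass-through call to a DB2 procedure on the linked server (LINKED_DB2, MYLIB, and MY_PROC are placeholder names, and the linked server's RPC Out option has to be enabled):
-- Parameterized pass-through call to a DB2 stored procedure on the linked server.
DECLARE @cust_id INT = 42;
EXEC ('CALL MYLIB.MY_PROC(?)', @cust_id) AT LINKED_DB2;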
I'm using SQL Server 2008. A useful link is also fine. I've found some links, but since I'm not an expert with SQL Profiler, I can't seem to figure out how I would do this.
By the way, some data is retrieved with stored procedures, but other data is retrieved with SQL in the .NET server layer.
In Management Studio, go to the Query menu and select "Include Actual Execution Plan"; this will create a new tab when you execute a query.
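If you'd rather get the same thing from a script than from the menu, one option is the SET STATISTICS XML switch (the SELECT below is just a placeholder query):
-- Return the actual execution plan as XML alongside the query results.
SET STATISTICS XML ON;
SELECT TOP (10) name, object_id FROM sys.objects;  -- placeholder query
SET STATISTICS XML OFF;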
Edit for SQL Server 2008
If you're using SQL Profiler, turn on the Scan:Started and Scan:Stopped trace events.
We are using ODBC from C on Linux. We can successfully execute direct statements ("INSERT ...", "SELECT ...", etc) using ODBC for both SQL Server 2008 and MySQL. We are migrating to stored procedures, so we first developed MySQL stored procedures. Calling MySQL stored procedures using ODBC works. Life is good.
The stored procedures are translated into T-SQL. We verify that they function by executing queries directly from Visual Studio. The database is filled, queries work. Huzzah.
We have a test program allowing us to use MySQL or SQL Server, direct execution or calling stored procedures. We call the T-SQL stored procedures from a C test program. Log output indicates that tables are being filled with data, queries are working, etc. Until the end, where a statement fails. The program exits (taking several seconds longer than normal). The other 3 cases work (direct MySQL, direct SQL Server, stored proc MySQL).
We examine the SQL Server database. It's empty. We have autocommit turned on, so I don't think it's a commit problem. The stored procs are bog simple, being copies of the direct SQL. Any ideas?
It sounds like the query is running, then errors out for some reason; everything is wrapped up as a single transaction, which rolls back. Hence the empty tables.
Does the stored procedure have any error trapping within it? SQL Server 2005 and later improved error handling enormously with TRY...CATCH.
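For reference, a minimal TRY...CATCH skeleton of the kind meant here (the procedure and table names are made up):
-- Minimal TRY...CATCH skeleton (SQL Server 2005+); dbo.MyTable is a placeholder.
CREATE PROCEDURE dbo.usp_InsertRow
    @Value INT
AS
BEGIN
    BEGIN TRY
        INSERT INTO dbo.MyTable (SomeColumn) VALUES (@Value);
    END TRY
    BEGIN CATCH
        -- Report the error instead of letting it disappear silently.
        SELECT ERROR_NUMBER() AS ErrorNumber, ERROR_MESSAGE() AS ErrorMessage;
    END CATCH
END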
How can we get information regarding poorly performing SQL statements (those taking too much time to execute)?
Does MS SQL Server maintain tables/views (similar to v$sql in Oracle) for storing SQL queries?
I use SQL Profiler to collect statistical data, which I can then use to nail down where work is needed, tweak indexes, and so on.
Here are some tips about monitoring with Profiler:
http://www.developer.com/db/article.php/3490086
http://vyaskn.tripod.com/analyzing_profiler_output.htm
Take a look at sys.dm_exec_query_stats and this page; SQL Server 2005+ only.
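For example, a small query against sys.dm_exec_query_stats for the statements with the highest cumulative elapsed time since their plans were cached:
-- Top 10 statements by total elapsed time since their plans were cached.
SELECT TOP (10)
    qs.total_elapsed_time,
    qs.execution_count,
    qs.total_elapsed_time / qs.execution_count AS avg_elapsed_time,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_elapsed_time DESC;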