I am trying to trace SQL commands. I read this post: How can I monitor the SQL commands sent over my ADO connection?
It works for SELECT but not for DELETE/INSERT/UPDATE...
Configuration: a TADOConnection (MS SQL Server), a TADOTable, a TDataSource, and a TDBGrid with a TDBNavigator.
So I can trace the SELECT that occurs when the table is opened, but nothing shows up when I use the DBNavigator to UPDATE, INSERT, or DELETE records.
When I use a TADOCommand to delete a record, the trace works too. It only seems to fail when I go through the DBNavigator, so maybe that's a clue, but I haven't found anything about it.
Thanks in advance
Hopefully someone will be able to point you in the direction of a pre-existing library that does your logging for you. In particular, if FireDAC is an option, you might take a look at what it says here:
http://docwiki.embarcadero.com/RADStudio/XE8/en/Database_Alerts_%28FireDAC%29
Of course, converting your app from ADO to FireDAC may not be an option for you, but depending on how great your need is, you could conceivably extract the Sql-Server-specific method of event alerting that FireDAC uses into an ADO application. I looked into this briefly a while ago and it looked fairly straightforward.
Prior to FireDAC, I implemented a server-side solution that caught Inserts, Updates and Deletes. I had to do this about 10 years ago (for Sql Server 2000) and it was quite a performance to set up.
In outline it worked like this:
Sql Server supports what MS used to call "extended stored procedures" which are implemented in custom DLLs (MS may refer to them by a different name these days or have even stopped supporting them). There are Delphi libraries around that provide a wrapper to enable these to be written in Delphi. Of course, these days, if your Sql Server is 64-bit, you need to generate a 64-bit DLL.
You write extended stored procedures to log the changes any way you want, then write custom triggers in the database for Inserts, Updates and Deletes that feed the data of the affected rows to your XSPs.
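As a very rough sketch of the trigger side of this (xp_logchange, MyAuditXsp.dll and dbo.Customer are purely hypothetical names; the real XSP would be whatever your custom DLL implements):

-- register the DLL-based extended stored procedure in master
USE master;
EXEC sp_addextendedproc 'xp_logchange', 'MyAuditXsp.dll';
GO
-- in the user database: a trigger that feeds change info to the XSP
CREATE TRIGGER trg_Customer_LogChange
ON dbo.Customer
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @ins int, @del int;
    SELECT @ins = COUNT(*) FROM inserted;
    SELECT @del = COUNT(*) FROM deleted;
    -- pass whatever row data you need; here just the table name and row counts
    EXEC master..xp_logchange N'dbo.Customer', @ins, @del;
END;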
As luck would have it, my need for this fell away just as I was completing the project, before I got to stress-testing and performance-profiling it, but it did work.
Of course, not every environment will allow you to install software and trigger code on the Sql Server.
For interest, you might also take a look at https://msdn.microsoft.com/en-us/library/ms162565.aspx, which provides an SMO object for tracing Sql Server activity, though it seems to be 32-bit only at the moment.
For amusement, I might have a go at implementing an event handler for the recordset object that underlies a TAdoTable/TAdoQuery, which should be able to catch the changes you're after, but don't hold your breath ...
And, of course, if you're only interested in client-side logging, one way to do it is to write handlers for your dataset's AfterEdit, AfterInsert and AfterDelete events. Those wouldn't guarantee that the changes are ever actually applied at the server, of course, but could provide an accurate record of the user's activity, if that's sufficient for your needs.
I have a database ("DatabaseA") that I cannot modify in any way, but I need to detect the addition of rows to a table in it and then add a log record to a table in a separate database ("DatabaseB") along with some info about the user who added the row to DatabaseA. (So it needs to be event-driven, not merely a periodic scan of the DatabaseA table.)
I know that normally, I could add a trigger to DatabaseA and run, say, a stored procedure to add log records to the DatabaseB table. But how can I do this without modifying DatabaseA?
I have free rein to do whatever I like in DatabaseB.
EDIT in response to questions/comments ...
Databases A and B are MS SQL 2008/R2 databases (as tagged), users are interacting with the DB via a proprietary Windows desktop application (not my own) and each user has a SQL login associated with their application session.
Any ideas?
Ok, so I have not put together a proof of concept, but this might work.
You can configure an Extended Events session on databaseB that watches for all the procedures on databaseA that can insert into the table, or for any SQL statements that run against the table on databaseA (using a LIKE '%your table name here%' filter).
This is a custom solution that writes the XE session output to a table:
https://github.com/spaghettidba/XESmartTarget
You could probably mimic that functionality by writing the XE event data to a custom user table every minute or so using a SQL Server Agent job.
Your session would monitor databaseA and write the XE output to databaseB. You would then write a trigger so that, on each XE output write, it compares the two tables and, if there are differences, writes them to your log table. This would be a nonstop running process, but it is still a periodic scan of sorts: the XE session only writes when the event happens, but the comparison still runs every couple of seconds.
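By way of illustration, a session along these lines might do it. This is only a sketch: the exact event, action, and target names need checking against your SQL Server version (on 2008/R2 the file target is package0.asynchronous_file_target rather than package0.event_file), and 'YourTableName' is a placeholder.

CREATE EVENT SESSION WatchDatabaseA ON SERVER
ADD EVENT sqlserver.sql_statement_completed (
    ACTION (sqlserver.sql_text, sqlserver.username, sqlserver.database_name)
    WHERE sqlserver.database_name = N'DatabaseA'
      AND sqlserver.like_i_sql_unicode_string(sqlserver.sql_text, N'%YourTableName%')
)
ADD TARGET package0.event_file (SET filename = N'WatchDatabaseA.xel');
GO
-- start collecting
ALTER EVENT SESSION WatchDatabaseA ON SERVER STATE = START;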
I recommend you look at a data integration tool that can mine the transaction log for Change Data Capture events. We have recently been using StreamSets Data Collector for Oracle CDC, but it also supports SQL Server CDC. There are many other competing technologies, including Oracle GoldenGate and Informatica PowerExchange (not PowerCenter). We like StreamSets because it is open source and is designed to build real-time data pipelines between databases at the schema level. Until now we have used batch ETL tools like Informatica PowerCenter and Pentaho Data Integration. I can copy all the tables in a schema in near real time in one StreamSets pipeline, provided I have already deployed the DDL in the target. I use this approach between Oracle and Vertica. You can add additional columns to the target and populate them as part of the pipeline.
The only catch might be identifying which user made the change. I don't know whether that is in the SQL Server transaction log. Seems probable but I am not a SQL Server DBA.
I looked at both solutions provided by the time of writing this answer (see the answers from Dan Flippo and dfundaka) but found that the first - using Change Data Capture - required modification to the database, and the second - using Extended Events - wasn't really a complete answer, though it got me thinking about other options.
The option that seems cleanest, and doesn't require any database modification, is to use SQL Server Dynamic Management Views. These views, which reside in the system database, expose server process history - in this case INSERTs and UPDATEs - through objects such as sys.dm_exec_sql_text and sys.dm_exec_query_stats, which hold records of recently executed statements (and are, in fact, what Extended Events seems to be based on).
Though it's quite an involved process initially to extract the required information, the queries can be tuned and generalized to a degree.
There are restrictions on transaction history retention, etc but for the purposes of this particular exercise, this wasn't an issue.
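As a starting point, a query along these lines can surface the relevant statements (YourTable is a placeholder and the LIKE filters are deliberately crude; they would need refining for real use):

SELECT TOP (50)
       qs.last_execution_time,
       qs.execution_count,
       st.text AS sql_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE st.text LIKE '%INSERT%YourTable%'
   OR st.text LIKE '%UPDATE%YourTable%'
ORDER BY qs.last_execution_time DESC;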
I'm not going to select this answer as the correct one yet partly because it's a matter of preference as to how you approach the problem and also because I'm yet to provide a complete solution. Hopefully, I'll post back with that later. But if anyone cares to comment on this approach - good or bad - I'd be interested in your views.
I'm looking to implement some "Asynchronous Triggers" in Azure SQL Database. There was this other question that asked the same thing, with pretty much the same needs as mine, but for SQL Server 2005/2008. The answer was to use the Service Broker, and it's a great answer that would serve my needs perfectly if it were supported in Azure SQL Database, but it's not.
My specific need is that we have a fairly small set of inputs selected and stored by a user. A couple of those inputs identify specific algorithms, and the rest is aggregate-level data, all stored in a single record of a single table. Once it is saved, we want a trigger to execute the selected algorithms and process the aggregate-level data, breaking it down into tens of thousands of records across a few different tables. This takes 2-8 seconds to process, depending on the algorithms. (I'm sure I could optimize this a bit more, but I don't think I can get it below 2-5 seconds just because of the logic that must be built into it.)
I am not interested in installing SQL Server inside a VM in Azure - I specifically want to continue using Azure SQL Database for many reasons I'm not going to get into in this post.
So my question is: Is there a good/obvious way to do this in Azure SQL Database alone? I can't think of one. The most obvious options that I can see are either not inside Azure SQL Database or are non-starters:
Use real triggers rather than asynchronous triggers, but that's a problem because it takes many seconds for these triggers to process as they crunch numbers based on the stored inputs.
Use a poor man's queueing system in the database (i.e. a new table that is treated as a queue, inserting records into it as messages) and poll it from an external/outside source (Functions or Web Jobs or something). I'd really like to avoid this because of the added complexity and effort, but frankly, this is what I'm leaning towards if I can't get a better idea from the smart people here! (A rough sketch of what I mean is below.)
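For reference, the sort of thing I mean by a poor man's queue is roughly this (table and column names are just placeholders):

CREATE TABLE dbo.AlgorithmQueue (
    Id         bigint IDENTITY(1,1) PRIMARY KEY,
    InputId    int       NOT NULL,   -- key of the saved input record
    EnqueuedAt datetime2 NOT NULL DEFAULT SYSUTCDATETIME()
);
GO
-- the external poller (Function / Web Job) dequeues atomically:
WITH nextItem AS (
    SELECT TOP (1) *
    FROM dbo.AlgorithmQueue WITH (ROWLOCK, READPAST, UPDLOCK)
    ORDER BY Id
)
DELETE FROM nextItem
OUTPUT deleted.Id, deleted.InputId;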
Thanks for the help!
(I am posting this here and not on DBA.StackExchange because this is more of an architectural problem than a database problem. You may disagree but because my current best option involves non-database development and the above question I referenced that was almost perfect for me was also located here, I chose to post here instead of there.)
As far as I know, it's not possible to do directly in Azure SQL Database, but there are a few options:
As @gotqn mentioned in a comment, you can use Azure Automation runbooks, applied to Azure SQL specifically.
You can also check out database jobs.
You can use Logic Apps. It has a SQL connector that implements an asynchronous trigger...
https://azure.microsoft.com/en-us/services/logic-apps/
I have a SQL script that refreshes the dependent views of a table once the table has been modified, for example by adding new fields. The script is run through ExecuteNonQuery; see the example below.
' connection is an open SqlConnection and transaction is the active SqlTransaction
Using refreshCommand As New SqlClient.SqlCommand("EXEC RefreshDependentViews 'Customer','admin',0", connection, transaction)
    refreshCommand.ExecuteNonQuery()
End Using
The code above takes 4-5 seconds to execute, but when I copy just the script and run it directly in SQL Server Management Studio, it only takes 2-3 seconds.
My question is: why do they take different amounts of time?
Please note that the SQL Server instance is on my PC itself, and so is the code.
Thanks
SqlClient and SSMS have different connection-level options (SET options) by default, which can sometimes be a factor. I also wonder what the isolation level is for the two, which could be compounded if you are using TransactionScope etc. in your code. There could also simply be different system load at the time. Basically, it's hard to say from just that, but there are indeed some things that can impact this.
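One quick way to check the SET-option side of this (a sketch, not a diagnosis) is to compare the options each connection actually has:

-- run from both SSMS and from a command issued by your application, then compare
DBCC USEROPTIONS;

-- or inspect another session's options via the DMVs (replace 53 with the session id)
SELECT session_id, transaction_isolation_level, ansi_nulls, arithabort, quoted_identifier
FROM sys.dm_exec_sessions
WHERE session_id = 53;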
Note for the bounty: please answer only if you know of a tool that can monitor what changes in the same db; don't mention tools that compare two dbs. A visual tool, like Embarcadero Change Manager, is appreciated.
I'd like to have a tool that allows me to see only "what changed" in a db, given a specific action.
Scenario can be:
1) start monitoring (with the tool)
2) user performs an action on GUI (like clicking the button "apply" after having changed the telephone number of a customer)
3) stop monitoring: show the changes (with the tool) (in this case I should only see that the telephone number field has been changed)
Embarcadero's Change Manager does this, but it also does many other things, and it is expensive. I am looking for a simpler tool that does only this.
Note: I don't need schema comparison, just simple data comparison.
If you have SQL Server 2008 Enterprise Edition you can use Change Data Capture to expose the transaction log in a more usable format than DBCC LOG dbname,3. See http://msdn.microsoft.com/en-us/library/bb522489.aspx for more information.
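Very roughly, enabling it looks like this (schema and table names are placeholders; the exact options are covered in the linked documentation):

-- run inside the database you want to track
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'Customer',
     @role_name     = NULL;

-- later, read the captured changes for that table
DECLARE @from binary(10), @to binary(10);
SET @from = sys.fn_cdc_get_min_lsn('dbo_Customer');
SET @to   = sys.fn_cdc_get_max_lsn();
SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_Customer(@from, @to, N'all');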
Check out the Lite/Free version of the various xSQL tools:
http://www.xsqlsoftware.com/LiteEdition.aspx
They have an object-level compare tool, as well as a data compare tool.
Those don't work "on the fly", but you can always have one database as a reference, and compare your current one against that baseline.
You can create database snapshots (quickly enough) and then compare them with the working DB. Both issues can be solved with free tools.
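A minimal sketch of that approach (database name, logical file name, path, and table are placeholders; check sys.database_files for your logical file names, and note that snapshots require Enterprise Edition on older SQL Server versions):

-- capture the current state
CREATE DATABASE MyDb_Before
ON (NAME = MyDb_Data, FILENAME = 'C:\Snapshots\MyDb_Before.ss')
AS SNAPSHOT OF MyDb;

-- after the user action, rows added or changed in the live DB show up here
SELECT * FROM MyDb.dbo.Customer
EXCEPT
SELECT * FROM MyDb_Before.dbo.Customer;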
For free, one option is to use DBCC LOG dbname,3 to dump the transaction log.
Interpreting the output will probably be fun, but it's certainly doable.
The other free alternative is to put audit tables in place and put auditing triggers on all your tables. This is a bit more flexible than Change Data Capture because you can capture some additional things that Change Data Capture doesn't.
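A bare-bones example of such a trigger, assuming a hypothetical dbo.Customer table with a CustomerId key (a real audit trigger would also record the changed column values):

CREATE TABLE dbo.Customer_Audit (
    AuditId    int IDENTITY(1,1) PRIMARY KEY,
    CustomerId int       NOT NULL,
    Operation  char(1)   NOT NULL,                        -- I / U / D
    ChangedBy  sysname   NOT NULL DEFAULT SUSER_SNAME(),
    ChangedAt  datetime2 NOT NULL DEFAULT SYSUTCDATETIME()
);
GO
CREATE TRIGGER trg_Customer_Audit
ON dbo.Customer
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    INSERT dbo.Customer_Audit (CustomerId, Operation)
    SELECT COALESCE(i.CustomerId, d.CustomerId),
           CASE WHEN i.CustomerId IS NOT NULL AND d.CustomerId IS NOT NULL THEN 'U'
                WHEN i.CustomerId IS NOT NULL THEN 'I'
                ELSE 'D' END
    FROM inserted AS i
    FULL OUTER JOIN deleted AS d ON d.CustomerId = i.CustomerId;
END;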
Profiler is also a tool that you can use to see what query the app sent to the database, which should tell you what was changed in many cases. This is a less permanent solution than auditing, but consider whether auditing is something that you would find useful in the long run.
I can't imagine ever managing a database without setting up auditing. It's just too useful for fixing bad data changes or finding out who made a particular change.
The most obvious tool would be SQL Profiler. It will monitor every SQL statement, and thus every data change, that gets sent to the server and show you metrics about that statement, the account under which the statement was executed, and a host of other pieces of information, and it comes free with most SQL Server editions. If you only want to see data changes, you can add filters to show only Insert, Update and Delete statements.
If what you are trying to do is compare two databases to see what data is different between them, then I'd recommend something like Red-Gate's Data Compare (no I do not work for them). It is not free but invaluable for this type of column value by column value comparison.
In Maybe Normalizing Isn't Normal Jeff Atwood says, "You're automatically measuring all the queries that flow through your software, right?" I'm not but I'd like to.
Some features of the application in question:
ASP.NET
a data access layer which depends on the MS Enterprise Library Data Access Application Block
MS SQL Server
In addition to Brad's mention of SQL Profiler, if you want to do this in code, then all your database calls need to be funnelled through a common library. You insert the timing code there, and voila, you know how long every query in your system takes.
A single point of entry to the database is a fairly standard feature of any ORM or database layer -- or at least it has been in any project I've worked on so far!
SQL Profiler is the tool I use to monitor traffic flowing to my SQL Server. It allows you to gather detailed data about your SQL Server. SQL Profiler has been distributed with SQL Server since at least SQL Server 2000 (but probably before that also).
Highly recommended.
Take a look at this chapter Jeff Atwood and I wrote about performance optimizations for websites. We cover a lot of stuff, but there's a lot of stuff about database tracing and optimization:
Speed Up Your Site: 8 ASP.NET Performance Tips
The Dropthings project on CodePlex has a class for timing blocks of code.
The class is named TimedLog. It implements IDisposable. You wrap the block of code you wish to time in a using statement.
If you use Rails, it automatically logs all the SQL queries, and the time they took to execute, in your development log file.
I find this very useful because if you do see one that's taking a while, it's one step to just copy and paste it straight off the screen/log file and put 'explain' in front of it in MySQL.
You don't have to go digging through your code to reconstruct what's happening.
Needless to say, this doesn't happen in production, as it would run you out of disk space in about an hour.
If you define a factory that creates SqlCommands for you and always call it when you need a new command, you can return a RealProxy to an SqlCommand.
This proxy can then measure how long ExecuteReader / ExecuteScalar etc. take, using a Stopwatch, and log it somewhere. The advantage of this kind of method over SQL Server Profiler is that you can get full stack traces for each executed piece of SQL.