I am looking for a way to capture the name of a table on CREATE TABLE, DROP TABLE and other operations in my Postgres database.
I looked into event triggers, and they seem to be able to capture these events only on ddl_command_end (https://www.postgresql.org/docs/current/functions-event-triggers.html#PG-EVENT-TRIGGER-SQL-DROP-FUNCTIONS), which should work for the CREATE case but not for all of the others.
So I wanted to ask whether there is a way either to get the data from a dropped table (as I would need it) or to get the information before the event happens.
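For reference, a minimal sketch of the event-trigger setup I have in mind (the trigger, function, and log-table names are just placeholders); the sql_drop event with pg_event_trigger_dropped_objects() at least gives me the names of dropped tables, but not their data:
-- Placeholder log table
CREATE TABLE ddl_audit (
    happened_at  timestamptz DEFAULT now(),
    command_tag  text,
    object_name  text
);

-- Fires after CREATE TABLE (and other DDL) completes
CREATE FUNCTION log_ddl_end() RETURNS event_trigger
LANGUAGE plpgsql AS $$
DECLARE
    r record;
BEGIN
    FOR r IN SELECT * FROM pg_event_trigger_ddl_commands() LOOP
        INSERT INTO ddl_audit (command_tag, object_name)
        VALUES (r.command_tag, r.object_identity);
    END LOOP;
END;
$$;

-- On PostgreSQL versions before 11, use EXECUTE PROCEDURE instead of EXECUTE FUNCTION
CREATE EVENT TRIGGER audit_ddl_end ON ddl_command_end
    EXECUTE FUNCTION log_ddl_end();

-- Fires for DROP TABLE (and other drops): names only, no row data
CREATE FUNCTION log_sql_drop() RETURNS event_trigger
LANGUAGE plpgsql AS $$
DECLARE
    r record;
BEGIN
    FOR r IN SELECT * FROM pg_event_trigger_dropped_objects() LOOP
        INSERT INTO ddl_audit (command_tag, object_name)
        VALUES (tg_tag, r.object_identity);
    END LOOP;
END;
$$;

CREATE EVENT TRIGGER audit_sql_drop ON sql_drop
    EXECUTE FUNCTION log_sql_drop();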
Thank you for your help!
In Oracle we have an option to recover a table, but in PostgreSQL we do not have that option. The only thing you can do in this kind of situation is to enable archiving and follow the PITR (point-in-time recovery) steps. This could be done on another server or on the server your database is running on; it depends on the significance of the dropped table and of the database.
Someone keeps dropping tables on one of our databases as soon as I gain access to the server. I don't know who this someone is. I nearly lost my job once because of this person.
So I was wondering: is there a way I can check which user ran a query for DROP TABLE my_table, so that I can prove to my boss that I am innocent?
I found this article, which may help you.
On SQL Server 2005 or newer, you could also investigate DDL triggers, which would even allow you to prohibit certain DROP TABLE statements:
-- Database-scoped DDL trigger that blocks any DROP TABLE in this database
CREATE TRIGGER safety
ON DATABASE
FOR DROP_TABLE
AS
    PRINT 'You must disable Trigger "safety" to drop tables!';
    ROLLBACK;
This would basically just prevent anyone from dropping a table until the trigger is disabled.
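If the goal is also to find out who issued the drop rather than just block it, a similar database-scoped trigger can read the login name from EVENTDATA(); a rough sketch, with a made-up log table:
CREATE TABLE dbo.DropLog (
    EventTime   datetime       DEFAULT GETDATE(),
    LoginName   nvarchar(128),
    ObjectName  nvarchar(256),
    CommandText nvarchar(max)
);
GO
-- Records who dropped which table, and the exact statement used
CREATE TRIGGER who_dropped_it
ON DATABASE
FOR DROP_TABLE
AS
    DECLARE @e xml;
    SET @e = EVENTDATA();
    INSERT INTO dbo.DropLog (LoginName, ObjectName, CommandText)
    VALUES (
        @e.value('(/EVENT_INSTANCE/LoginName)[1]', 'nvarchar(128)'),
        @e.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(256)'),
        @e.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'nvarchar(max)')
    );
GO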
When the table's rows are changed, I want the changed rows to be written to XML, and I want to be notified that the table has been changed.
How can I do this?
If you're looking for a strict TSQL or SQL Server solution:
write a stored procedure to handle UPDATE, DELETE and INSERT functionality.
deny UPDATE, DELETE and INSERT to users
allow EXEC to users on this new stored proc
on each call to the stored proc, make an entry into another table, specifically built for auditing.
write a SQL Job to poll this audit table for new records. Use SQL Mail to send email. You weren't clear about what kind of notification you wanted, but I assumed email.
A second, less attractive solution: you could also use triggers on the table to capture the UPDATE, DELETE and INSERT activity. Strongly consider the stored proc solution over triggers.
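A minimal sketch of the stored-procedure approach from the list above; the table, column, and procedure names are all hypothetical:
-- Hypothetical audit table; the polling job emails new rows and then flips Notified
CREATE TABLE dbo.CustomerAudit (
    AuditId    int IDENTITY(1,1) PRIMARY KEY,
    CustomerId int,
    Action     varchar(10),
    ChangedBy  sysname  DEFAULT SUSER_SNAME(),
    ChangedAt  datetime DEFAULT GETDATE(),
    Notified   bit      DEFAULT 0
);
GO
-- Users get EXECUTE on this proc; direct UPDATE/INSERT/DELETE on dbo.Customer is denied
CREATE PROCEDURE dbo.UpdateCustomerName
    @CustomerId int,
    @NewName    nvarchar(100)
AS
BEGIN
    UPDATE dbo.Customer SET Name = @NewName WHERE CustomerId = @CustomerId;

    -- Every call leaves a trail in the audit table
    INSERT INTO dbo.CustomerAudit (CustomerId, Action)
    VALUES (@CustomerId, 'UPDATE');
END
GO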
If you can't alter how data is being changed in your table, the best solution would be to set up a trigger to capture changes in a separate table, and then write some code to periodically poll this table and build your XML file.
It's worth noting that this will potentially slow down your DB performance when editing data in this table (good for auditing when users are making changes, bad for programmatically changed data), and any errors coming from the trigger lead to quite misleading messages thrown back out of SQL Server.
See this question for some pointers on setting up the trigger.
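As a rough sketch of the polling step, assuming the trigger writes into a hypothetical dbo.CustomerChanges table, FOR XML PATH can turn the unprocessed rows into an XML document:
-- Pick up rows captured since the last poll and emit them as XML
SELECT ChangeId, CustomerId, Action, ChangedAt
FROM dbo.CustomerChanges
WHERE Processed = 0
FOR XML PATH('row'), ROOT('changes');

-- Mark them as handled so the next poll only sees new changes
UPDATE dbo.CustomerChanges SET Processed = 1 WHERE Processed = 0;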
Is there a way to see the history or any other information of insertions into a specific table of an SQL Server database?
Unless you are recording this information somewhere using a trigger, you would need some way of looking at the information in the transaction log. There are commercial tools like Lumigent for this.
You could use a trigger
Create a trigger on the table watching for inserts, updates, and deletes. The trigger would insert into another table (a history table).
This adds extra overhead, though, so I wouldn't do this on a really heavily updated table.
Look at this page for an example of how this is done.
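In the meantime, a rough sketch of such a history-table trigger (table and column names are purely illustrative):
-- Copies every changed row into a history table
CREATE TRIGGER trg_Orders_History
ON dbo.Orders
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- After-images: rows as they look following an INSERT or UPDATE
    INSERT INTO dbo.OrdersHistory (OrderId, Amount, RowImage, ChangedAt)
    SELECT OrderId, Amount, 'AFTER', GETDATE() FROM inserted;

    -- Before-images: rows as they looked before an UPDATE or DELETE
    INSERT INTO dbo.OrdersHistory (OrderId, Amount, RowImage, ChangedAt)
    SELECT OrderId, Amount, 'BEFORE', GETDATE() FROM deleted;
END
GO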
This page has some code that generates the audit trail code for you.
Here is another Stack Overflow question about doing this using triggers.
If you are using SQL Server 2008, you can use the new Change Data Capture feature. This saves you from having to write triggers on all your tables.
For 2005, use triggers; for 2008, you can use Change Data Capture.
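For reference, enabling Change Data Capture looks roughly like this (the table name is just an example, and SQL Server Agent must be running for the capture jobs):
-- Enable CDC for the database, then for one table
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Orders',
    @role_name     = NULL;  -- no gating role

-- Read everything captured so far for dbo.Orders
DECLARE @from binary(10) = sys.fn_cdc_get_min_lsn('dbo_Orders');
DECLARE @to   binary(10) = sys.fn_cdc_get_max_lsn();
SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_Orders(@from, @to, N'all');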
Aside from using a trigger, you could do something like add a column named "InsertedDate" and record the current date there. This would require you to do your insertions through a stored procedure, though.
I am creating a server level trigger in SQL 2008 to log table creation and drops. I need to log the database that the table was created in/dropped from. First I created a column with a default value of db_name(), but this always recorded master. Next I tried using this in my insert statement:
EVENTDATA().value('(/EVENT_INSTANCE/DatabaseName)[1]','nvarchar(max)')
This worked for a while, but suddenly it started recording master for all table creations and drops regardless of the database the table was in. All of the table drops have been done using SSMS. Does anyone know why I am seeing this behavior? Even more important, does anyone know how to log the correct database?
EDIT: I found an article which makes me think what I'm doing is incorrect. Apparently you should only capture create_table and drop_table from a database-scoped trigger and not from a server-scoped trigger. I would still like to leave the question open, though, in case someone knows how to work around this.
Hi,
You are correct, CREATE TABLE and DROP TABLE events should be recorded from within DDL Triggers that are defined at the database level.
Server-level triggers are intended for server-wide events, for example, when a login occurs.
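A rough sketch of the database-scoped version, assuming a central log table such as AuditDB.dbo.TableDDLLog (the name is made up); the trigger would need to be created in each database you want audited:
-- Logs table creation and drops, including the database they happened in
CREATE TRIGGER audit_table_ddl
ON DATABASE
FOR CREATE_TABLE, DROP_TABLE
AS
    DECLARE @e xml;
    SET @e = EVENTDATA();
    INSERT INTO AuditDB.dbo.TableDDLLog (EventType, DatabaseName, ObjectName, LoginName)
    VALUES (
        @e.value('(/EVENT_INSTANCE/EventType)[1]',    'nvarchar(100)'),
        @e.value('(/EVENT_INSTANCE/DatabaseName)[1]', 'nvarchar(128)'),
        @e.value('(/EVENT_INSTANCE/ObjectName)[1]',   'nvarchar(256)'),
        @e.value('(/EVENT_INSTANCE/LoginName)[1]',    'nvarchar(128)')
    );
GO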
Here is an excellent article that may assist you in your developments.
http://www.developer.com/db/article.php/3552096
The following reference details which DDL events can be fired at either database or server scope.
http://msdn.microsoft.com/en-us/library/ms189871(SQL.90).aspx
Cheers,
I have a problem with a database at my work. There is currently auditing in place, but it's clunky, requires a lot of maintenance, and falls short in a few regards. So I am replacing it.
I want to do this in as generic a way as possible, and I have designed the tables and how everything will link and be updated.
Now, that's all fine and good, but I want to be able to write a generic way to insert records into these audit tables (without having to enter a command for each column in each table being changed).
Is there any way within a stored procedure to iterate over all the columns in a table? I would like to write this in such a way that it will work with several tables and will automatically pick up and audit added columns and such.
Any ideas?
EDIT: I guess I should clarify. I will be auditing data that is in the tables, but I will be using the same table(s) to store the audited data for every table in the database.
And I cannot use triggers because, usually, when an update occurs it occurs across multiple tables, and I would like all of these updates to be part of a single change set.
That is not a problem, because I can do all the updates from within a single stored proc. I would just prefer some way, like a loop, to get all the updated fields, figure out which ones changed, and then insert those changed ones into the audit table.
And I would like to do this without having a long list of if statements and insert statements for each column. (By doing this in a generic loop, it will handle added columns automatically and not be bothered by deleted columns.)
By "added columns" I guess you are looking to audit DDL. If you use SQL 2005, then you want this link.
If you don't use SQL 2005, then you probably want to use one of the many SQL schema comparison tools; Red Gate's SQL tool set probably has something in there.
If you don't have $ for tools, then you might just want to run periodic queries against information_schema.tables and information_schema.columns. By periodically capturing these in permanent tables, you can identify when they have gained or lost rows (and hence a schema change occurred).
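A sketch of that snapshot-and-compare idea, assuming a hypothetical dbo.SchemaSnapshot table with TakenAt, TableName and ColumnName columns:
DECLARE @now datetime;
SET @now = GETDATE();

-- Run periodically: capture the current column metadata with a timestamp
INSERT INTO dbo.SchemaSnapshot (TakenAt, TableName, ColumnName)
SELECT @now, table_name, column_name
FROM information_schema.columns;

-- Compare against the previous snapshot: anything returned here was added in between
DECLARE @earlier datetime;
SET @earlier = (SELECT MAX(TakenAt) FROM dbo.SchemaSnapshot WHERE TakenAt < @now);

SELECT TableName, ColumnName
FROM dbo.SchemaSnapshot
WHERE TakenAt = @now
EXCEPT
SELECT TableName, ColumnName
FROM dbo.SchemaSnapshot
WHERE TakenAt = @earlier;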
If you are doing data audit instead, then you'll want to code-generate some triggers, again using information_schema.tables and information_schema.columns.
There would be performance considerations, but you could add insert and update triggers to all of your tables, and have the triggers insert into your audit tables.
Use DDL Triggers (assuming you have SQL Server 2005+)!
http://www.sqlteam.com/article/using-ddl-triggers-in-sql-server-2005-to-capture-schema-changes
http://technet.microsoft.com/en-us/library/ms189871.aspx
That could be done if you were using a data access layer that could trap which tables and columns are being updated and generate the insert statements for the audit table. In a stored procedure? Which stored procedure? Do you have a single one that does updates? Or are you creating one per table?
If it's an option for you, just upgrade to SQL Server 2008 and turn on Change Data Capture.