Create a Trigger Automatically Every Day - sql-server

I have an MSSQL GPS system that creates a new database every month and a new table every day. It is a real-time database that receives data every 3 seconds, and all operations are performed by a GPS program for which I have no source code or access.
The databases and tables look like this:
-Comms201502 (Database)
-Comms201503 (Database)
-Comms201504 (Database)
-GPS20150401 (Database's Table)
-GPS20150402 (Database's Table)
-GPS20150403 (Database's Table)
-GPS20150404 (Database's Table)
-...
I have a trigger that reads the data arriving in the daily table and writes it to my own database, but I have to create that trigger on every new table each day.
Is there any way to create a single trigger, or to have the trigger created automatically every day?
Best regards,

I would create a DDL trigger. This fires each time a CREATE TABLE statement is executed and adds a trigger to the new table that logs its records to your table, such as:
CREATE TRIGGER trg_DDLCreateTrigger ON DATABASE
FOR CREATE_TABLE
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @EventData XML = EVENTDATA();
    DECLARE @Schema SYSNAME;
    DECLARE @TableName SYSNAME;
    SELECT @Schema    = @EventData.value('(/EVENT_INSTANCE/SchemaName)[1]', 'nvarchar(max)'),
           @TableName = @EventData.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(max)');
    DECLARE @sql NVARCHAR(MAX);
    SET @sql = 'CREATE TRIGGER [trg_' + @TableName + '] ON [' + @Schema + '].[' + @TableName + ']' +
               ' AFTER INSERT AS ' +
               ' INSERT INTO MyTable SELECT * FROM inserted'; --Change this as required
    EXEC (@sql);
END
GO
You should probably add some validation in the trigger to check that the table is one that you are interested in capturing.
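For example, assuming the daily tables follow the GPSyyyymmdd pattern from the question, a guard placed right after @TableName is read from EVENTDATA() could skip everything else:
IF @TableName NOT LIKE 'GPS[0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9]'
    RETURN; -- not one of the daily GPS tables, so do nothing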
You can probably extend this with a server-level DDL trigger on the CREATE DATABASE statement, which adds this CREATE TABLE trigger to each new database.
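A minimal sketch of that extension (untested; it assumes the monthly databases follow the CommsYYYYMM naming from the question, and that the body of the CREATE TABLE trigger is the one shown above):
CREATE TRIGGER trg_DDLCreateDatabase ON ALL SERVER
FOR CREATE_DATABASE
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @DbName SYSNAME = EVENTDATA().value('(/EVENT_INSTANCE/DatabaseName)[1]', 'nvarchar(128)');

    IF @DbName LIKE 'Comms[0-9][0-9][0-9][0-9][0-9][0-9]' -- only the monthly GPS databases
    BEGIN
        -- Calling the new database's sys.sp_executesql runs the batch in that
        -- database, so the DDL trigger created below is scoped to it.
        DECLARE @proc NVARCHAR(400) = QUOTENAME(@DbName) + N'.sys.sp_executesql';
        EXEC @proc N'CREATE TRIGGER trg_DDLCreateTrigger ON DATABASE
FOR CREATE_TABLE
AS
BEGIN
    SET NOCOUNT ON;
    -- body of the CREATE TABLE trigger from the answer above goes here
END';
    END
END
GO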

Related

SQL Server 2014 Create DDL trigger on Create Table that creates a trigger for the table

I'm working with a third-party tool that creates databases and tables. I would like a trigger on one of the tables it creates. I thought I would try a server DDL trigger that fires when the table is created in a new database and, in turn, creates a trigger on that table. I cannot add the trigger to the 'model' database because this table is created dynamically by the tool.
I've tried the following:
CREATE TRIGGER ddl_trig_createTable
ON ALL SERVER
FOR CREATE_TABLE
AS
    DECLARE @databaseName varchar(255)
    DECLARE @AffectedTables varchar(255)
    SELECT @AffectedTables = EVENTDATA().value('(/EVENT_INSTANCE/ObjectName)[1]','nvarchar(100)')
    IF (@AffectedTables IN ('DynamicallyCreatedTable'))
    BEGIN
        SELECT @databaseName = CAST(eventdata().query('/EVENT_INSTANCE/DatabaseName[1]/text()') as NVarchar(128))
        EXEC('CREATE TRIGGER ' + @databaseName + '.[dbo].[tgrDynamicTableTrigger]
        ON
        ' + @databaseName + '.[dbo].[DynamicallyCreatedTable]
        AFTER UPDATE
        AS
        BEGIN
            SET NOCOUNT ON
            -- trigger code here
        END')
    END
GO
Which produces the following error when the table is created:
Msg 166, Level 15, State 1, Line 1 'CREATE/ALTER TRIGGER' does not
allow specifying the database name as a prefix to the object name.
I tried changing the dynamic SQL, replacing the fully qualified table name with a USE statement:
--- 8< ---
EXEC('use ' + @databaseName + '
CREATE TRIGGER [dbo].[tgrDynamicTableTrigger]
ON
--- 8< ---
However, that produced the following error when the table was created:
Msg 111, Level 15, State 1, Line 2 'CREATE TRIGGER' must be the first
statement in a query batch.
Any ideas?
I'm using SQL Server 2014.
I believe that I have figured it out, thanks mostly to this answer.
Here's the code:
CREATE TRIGGER ddl_trig_createTable
ON ALL SERVER
FOR CREATE_TABLE
AS
    DECLARE @statement nvarchar(max) = 'CREATE TRIGGER [dbo].[tgrDynamicTableTrigger]
    ON
    [dbo].[DynamicallyCreatedTable]
    AFTER UPDATE
    AS
    BEGIN
        -- trigger code here
    END'
    DECLARE @databaseName varchar(255)
    DECLARE @AffectedTables varchar(255)
    SELECT @AffectedTables = EVENTDATA().value('(/EVENT_INSTANCE/ObjectName)[1]','nvarchar(100)')
    IF (@AffectedTables IN ('DynamicallyCreatedTable'))
    BEGIN
        SET @databaseName = CAST(eventdata().query('/EVENT_INSTANCE/DatabaseName[1]/text()') as NVarchar(128))
        DECLARE @sql NVARCHAR(MAX) = QUOTENAME(@databaseName) + '.sys.sp_executesql';
        EXEC @sql @statement;
    END
GO
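The key point is that calling the target database's sys.sp_executesql runs the batch in that database's context, and it also keeps CREATE TRIGGER as the first statement of its own batch. A quick way to see the context switch in isolation (SomeOtherDb is just an example name):
DECLARE @proc NVARCHAR(400) = QUOTENAME(N'SomeOtherDb') + N'.sys.sp_executesql';
EXEC @proc N'SELECT DB_NAME() AS CurrentDatabase;'; -- returns SomeOtherDb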

procedure to rollback a transaction from audit table in SQL Server

I have created an audit table that is populated by an audit trail (triggers after every update, delete, and insert) on different tables in my database. I have now been asked to create a stored procedure (script) to roll back a data change using the audit id. How do I go about doing so? I wrote a script which seems fine, and the command is accepted by SQL Server (command completed successfully). Unfortunately, when I test it by passing the Audit_id, the command completes but the data is not rolled back. This is the procedure I developed. Any help will be greatly appreciated.
create PROCEDURE [dbo].[spAudit_Rollback_2]
    @AUDIT_ID NVARCHAR(MAX)
AS
SET NOCOUNT ON
BEGIN
    DECLARE
        @TABLE_NAME VARCHAR(100),
        @COLUMN VARCHAR(100),
        @OLD_VALUE VARCHAR(200),
        @ID varchar(50)
    SELECT @TABLE_NAME = TABLE_NAME FROM AUDIT;
    SELECT @COLUMN = [COLUMN] FROM AUDIT;
    SELECT @AUDIT_ID = AUDIT_ID FROM AUDIT;
    SELECT @OLD_VALUE = OLD_VALUE FROM AUDIT
    SELECT @ID = ROW_DESCRIPTION FROM AUDIT;
    update [Production].[UnitMeasure]
    set @COLUMN = @OLD_VALUE
    WHERE [Production].[UnitMeasure].[UnitMeasureCode] = @ID
END
EXEC [dbo].[spAudit_Rollback_2] '130F0598-EB89-44E5-A64A-ABDFF56809B5'
This is the same script, but using the AdventureWorks2017 database and data.
If possible I would even prefer to use a variable to retrieve the table name from AUDIT and use that in the procedure, but that too is giving me another error.
Any help with this procedure will be awesome.
This needs to be dynamic SQL because you're updating a column whose name is held in a variable. Use the following in place of your current UPDATE statement.
DECLARE @sql VARCHAR(1000) = ''
SET @sql = 'UPDATE [Production].[UnitMeasure] ' +
           'SET ' + @COLUMN + ' = ''' + @OLD_VALUE + ''' ' +
           'WHERE [Production].[UnitMeasure].[UnitMeasureCode] = ''' + @ID + ''''
EXEC(@sql)

sp_rename failing when called from inside another stored procedure

First post from a self-taught data warehouse guy. I've done lots of searching and reading to get where I am now, but can't get past this sticking point.
Background: as part of our nightly ETL job, we have to copy many tables from many remote DBs (linked servers) into staging-area DBs. After table copies have finished, I continue with the transformation from the staging area DBs into production tables.
Since the remote DBs all have identical schema, I made a stored procedure in the production DB to do the work. The stored procedure accepts parameters of the remote database name and the table name. In the nightly job, SQL Server Agent runs an SSIS package; the package contains one (retry-looping) SSIS task for each remote database; all the tasks run concurrently; each task uses a variable to pass the DB name to SQL file; then the SQL file calls the stored procedure once for each table.
Example remote table and local staging-area table:
Remote: [FLTA].[cstone].[csdbo].[CLIENT]
Local: [FLTAL].dbo.[FLTA CLIENT]
The stored procedure is pretty simple, dropping the old table and using SELECT to make a fresh copy from the remote DB. It looks approximately like this:
CREATE PROCEDURE dbo.spTableCopyNew
    (@p VARCHAR(50), @Tablename VARCHAR(50))
AS
    -- Drop the existing table
    EXEC('IF OBJECT_ID(''[' + @p + 'L].dbo.[' + @p + ' ' + @Tablename + ']'', ''U'') IS NOT NULL
          DROP TABLE [' + @p + 'L].dbo.[' + @p + ' ' + @Tablename + ']'
    )
    -- Copy the new table
    EXEC('SELECT * INTO [' + @p + 'L].dbo.[' + @p + ' ' + @Tablename + ']
          FROM [' + @p + '].[cstone].[csdbo].[' + @Tablename + ']'
    )
GO
The SQL looks roughly like this:
-- Set local variables for the remote server connection, the local database name, and the table prefix
DECLARE @Prefix varchar(50)
-- Accept the variables passed in from the SSIS task
SET @Prefix = ?
-- Copy the two tables
EXEC Datawarehouse.dbo.spTableCopy @Prefix, 'CLIENT'
EXEC Datawarehouse.dbo.spTableCopy @Prefix, 'PATIENT'
Maintenance is a breeze: when we need to grab a new table from all the remote databases, I just add it to the "productionLoad.sql" file.
It works really well...except when it doesn't.
Due to un-figured-out-yet reasons, sometimes a table fails to copy. And since I'm dropping the existing table before copying the new one, this will sometimes break things further down the line. My SSIS tasks will retry up to three times per remote DB, so occasional failures are no big deal. But if the same remote DB has three failures in one night, I'm gonna have a bad time.
My current attempt at a solution is to copy the remote table to a temp table, then ONLY AFTER that copy is successful, drop the local table and rename the temp table to the "real" table. Which brings me to the problem:
I can't get sp_rename to work when it is called from a stored procedure to rename tables that live in a different database than the stored procedure. I've created new variables to resolve the expressions and then passed those variables to sp_rename, since I can't pass expressions into that stored procedure.
Here's my attempt at a new stored procedure:
CREATE PROCEDURE dbo.spTableCopy
    (@p VARCHAR(50), @Tablename VARCHAR(50))
AS
BEGIN
    EXEC('USE [' + @p + 'L]')
    -- Create variables for schema and table names,
    -- since sp_rename can accept variables, but not expressions containing variables.
    DECLARE @RemoteTable VARCHAR(50) = '[' + @p + '].[cstone].[csdbo].[' + @Tablename + ']'
    DECLARE @LocalTableTemp VARCHAR(50) = '[' + @p + 'L].dbo.[' + @p + ' ' + @Tablename + '_temp]'
    DECLARE @LocalTable VARCHAR(50) = '' + @p + ' ' + @Tablename + ''
    -- Check for a previous temp table and drop it
    EXEC('IF OBJECT_ID(''[' + @p + 'L].dbo.[' + @p + ' ' + @Tablename + '_temp]'', ''U'') IS NOT NULL
          DROP TABLE [' + @p + 'L].dbo.[' + @p + ' ' + @Tablename + '_temp]'
    )
    -- Copy the new table
    EXEC('SELECT * INTO ' + @LocalTableTemp + '
          FROM ' + @RemoteTable + ''
    )
    -- Drop the existing table
    EXEC('IF OBJECT_ID(''[' + @p + 'L].dbo.[' + @p + ' ' + @Tablename + ']'', ''U'') IS NOT NULL
          DROP TABLE [' + @p + 'L].dbo.[' + @p + ' ' + @Tablename + ']'
    )
    -- Rename the temp table to the real table
    EXEC sp_rename @LocalTableTemp, @LocalTable
END
GO
This all works when executing it as normal SQL code, but when I make it into a stored procedure, sp_rename fails (everything else works). The final table [FLTAL CLIENT_temp] is there and contains the right data.
sp_rename returns the following error:
Msg 290, Level 16, State 2, Procedure sp_rename, Line 318
Invalid EXECUTE statement using object "Object", method "LockMatchID".
I've fought with this way too long.
Am I just screwing up the syntax?
Can I get sp_rename to work on other DBs with "USE"?
If not, will it work if I make a copy of my sp_tableCopy in every staging-area DB?
If not, will TRY...CATCH work inside this stored procedure, even if I call this stored procedure many times concurrently?
What else can I do to recover from failed table copies?
My alternate solution, which I haven't pursued yet: after the temp table is successfully created, TRUNCATE the existing table and insert everything from the temp table into the real table. That seems messy, though.
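A sketch of that alternative, using the same naming convention as the procedure above (it assumes the temp and real tables have matching column lists, which they should since both come from the same remote SELECT *):
EXEC('TRUNCATE TABLE [' + @p + 'L].dbo.[' + @p + ' ' + @Tablename + '];
      INSERT INTO [' + @p + 'L].dbo.[' + @p + ' ' + @Tablename + ']
      SELECT * FROM [' + @p + 'L].dbo.[' + @p + ' ' + @Tablename + '_temp];')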
P.S. Our IT guys are "looking into" the nature of the copy failures.
Try this:
USE <source_db>
EXEC <target_db>..sp_rename '<schema>.<old_table>', '<target_table>'
For example:
USE sourcedb
EXEC targetdatabase..sp_rename 'schema.oldtable', 'target_table'
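Inside the stored procedure, the same idea can be expressed with a variable procedure name, so that only plain variables (not expressions) are passed to sp_rename; a sketch using the @p/@Tablename parameters from the procedure above:
DECLARE @renameProc NVARCHAR(200) = QUOTENAME(@p + 'L') + N'..sp_rename';
DECLARE @oldName NVARCHAR(300) = N'dbo.' + QUOTENAME(@p + ' ' + @Tablename + '_temp');
DECLARE @newName SYSNAME = @p + ' ' + @Tablename;
EXEC @renameProc @oldName, @newName; -- runs sp_rename in the staging database's context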

SQL Server : update records in dynamically generated tables using parameters in stored procedure

I have to create a stored procedure to which I will pass tableName, columnName, and id as parameters. The task is to select records from the passed table where the passed column matches the passed id. If records are found, update them with some fixed data. Also implement a transaction so that we can roll back in case of any error.
There are hundreds of tables in the database and each table has a different schema, which is why I have to pass the column name.
I don't know what the best approach for this is. I am trying to select records into a temp table so that I can manipulate them as required, but it's not working.
I am using this code:
ALTER PROCEDURE [dbo].[GetRecordsFromTable]
    @tblName nvarchar(128),
    @keyCol varchar(100),
    @key int = 0
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        --DROP TABLE #TempTable;
        DECLARE @sqlQuery nvarchar(4000);
        SET @sqlQuery = 'SELECT * FROM ' + @tblName + ' WHERE ' + @keyCol + ' = 2';
        PRINT @sqlQuery;
        INSERT INTO #TempTable
        EXEC sp_executesql @sqlQuery,
             N'@keyCol varchar(100), @key int', @keyCol, @key;
        SELECT * FROM #TempTable;
    END TRY
    BEGIN CATCH
        EXECUTE [dbo].[uspPrintError];
    END CATCH;
END
I get an error:
Invalid object name '#TempTable'
I'm also not sure whether this is the best approach to get the data and then update it.
If you absolutely must make that work, then I think you'll have to use a global temp table. You'll need to check whether it exists before running your dynamic SQL and clean it up afterwards, and with a fixed table name you'll run into problems with other connections. Inside the dynamic SQL you'd add SELECT * INTO ##TempTable FROM .... Actually, I'm not even sure why you want the temp table in the first place; can't the dynamic SQL just return the results?
On the surface it seems like a solid idea to have one generic procedure for returning data with a couple of parameters to drive it, but, without a lot of explanation, it's just not the way databases are designed to work.
You should create the temp table first.
IF OBJECT_ID('tempdb..##TempTable') IS NOT NULL
    DROP TABLE ##TempTable
CREATE TABLE ##TempTable (...) -- column definitions must match the columns returned by the SELECT
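Going back to the first suggestion, a minimal sketch of doing the SELECT ... INTO the global temp table inside the dynamic SQL itself, with the identifiers quoted and the key passed as a real parameter (##TempTable is just an example name):
DECLARE @sqlQuery NVARCHAR(MAX) =
    N'IF OBJECT_ID(''tempdb..##TempTable'') IS NOT NULL DROP TABLE ##TempTable; ' +
    N'SELECT * INTO ##TempTable FROM ' + QUOTENAME(@tblName) +
    N' WHERE ' + QUOTENAME(@keyCol) + N' = @key;';
EXEC sp_executesql @sqlQuery, N'@key int', @key = @key;
SELECT * FROM ##TempTable; -- visible here because it is a global temp table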

SQL Server - How to copy data from dropped table when DROP_TABLE TRIGGER fires after table is dropped?

I do not wish to prevent table drops, but when certain tables are dropped in a database I would like to back up either the entire table or query the rows and select specific rows into another table before the drop.
With a normal trigger on a table, if a row was deleted you could access the 'Deleted' table and access those deleted rows.
The DROP_TABLE trigger fires after the table is dropped.
Is there an equivalent to the Deleted table for a DROP_TABLE trigger?
Is there a different approach I could use?
Or am I going to have to re-code the business logic in the windows service which creates and drops these tables?
(I REALLY don't want to write a trigger which rolls-back the drop, accesses and copies-out the data, then re-drops the table without firing the trigger recursively. I like inventiveness, but this is too mucky a solution for me)
I am running this on Microsoft SQL Server Enterprise Edition (64-bit)
and Microsoft SQL Server Developer Edition (64-bit)
Thanks for the help guys, but to directly answer my own questions:
For DDL triggers (which fire for DROP TABLE), there is no equivalent to the Deleted table available in DELETE/UPDATE triggers.
There is no equivalent solution short of rolling back the drop, copying out the data, and re-issuing the drop.
The only appropriate and correct approach is to re-code the business logic in the windows service which creates and drops these tables, to permit a soft delete/move/rename when required.
If it's the recursive firing of the trigger that bothers you, that can be checked for. This will only run for the initial DROP TABLE.
alter Trigger ddlt_ProcessDropTable
on all server for drop_table
AS
begin
    if( trigger_nestlevel() = 1 ) -- only run for the top-level drop table
    begin
        declare @data XML
        set @data = EVENTDATA()

        -- rollback the drop
        rollback;

        -- get table name
        declare @TableName sysname, @SchemaName sysname, @DataBaseName sysname, @Sql nvarchar(1000);
        select
            @TableName    = @data.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(2000)'),
            @SchemaName   = @data.value('(/EVENT_INSTANCE/SchemaName)[1]', 'nvarchar(2000)'),
            @DataBaseName = @data.value('(/EVENT_INSTANCE/DatabaseName)[1]', 'nvarchar(2000)')

        /****
            Do stuff with the dropped table...
        ****/

        -- re-drop the table
        set @Sql = 'Drop Table ' +
            QuoteName(@DataBaseName) + '.' + QuoteName(@SchemaName) + '.' + QuoteName(@TableName)
        exec(@Sql)
    end
end
GO
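As an illustration of the "Do stuff with the dropped table" section, one option is to copy the rows out before re-dropping the table; a sketch (the Archive database is made up, and the table name gets a yyyymmdd suffix to avoid collisions):
set @Sql = 'SELECT * INTO [Archive].dbo.' + QuoteName(@TableName + '_' + convert(varchar(8), getdate(), 112)) +
           ' FROM ' + QuoteName(@DataBaseName) + '.' + QuoteName(@SchemaName) + '.' + QuoteName(@TableName)
exec(@Sql)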
