I have a table with 2 columns: id and name. I need a trigger that, whenever records in this table change, writes the post-update values into another table.
Something like:
CREATE TRIGGER [dbo].[trigger_tablename] -- replace 'tablename' with your table name
ON [dbo].[tablename] FOR UPDATE -- replace 'tablename' with your table name
AS
BEGIN
    INSERT INTO T_tablename_Monitor (Row_ID, ID, Name, Action, UserName, CTime) -- replace 'tablename' with your table name
    SELECT NEWID(), ID, Name, 'After Update', SUSER_SNAME(), GETDATE()
    FROM inserted
END
The monitor table might look like:
CREATE TABLE [dbo].[T_tablename_Monitor]( -- replace 'tablename' with your table name
[Row_ID] [varchar](36) NOT NULL,
[ID] [varchar](30) NOT NULL, -- replace with your type
[Name] [varchar](50) NULL, -- replace with your type
[Action] [varchar](50) NOT NULL,
[UserName] [varchar](100) NOT NULL,
[CTime] [datetime] NOT NULL
) ON [PRIMARY]
That is the basic pattern for creating an UPDATE trigger that logs the post-update values.
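To see it in action, a hypothetical run might look like this (the table name, ID value, and new name are made up for illustration):
UPDATE [dbo].[tablename] SET Name = 'New name' WHERE ID = '42'
-- Each updated row now has a matching 'After Update' entry in the monitor table
SELECT Row_ID, ID, Name, Action, UserName, CTime
FROM [dbo].[T_tablename_Monitor]
ORDER BY CTime DESC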
I have a table created like this:
CREATE TABLE address_user
(
[username] VARCHAR(13) NOT NULL,
[address] CHAR(58) NOT NULL,
[id] BIGINT NOT NULL,
CONSTRAINT [PK_ address_user]
PRIMARY KEY CLUSTERED ([id] ASC)
);
Now I want to keep the modification history of this table, so I want to turn it into a temporal table. I know the script to create a temporal table; the final result should be:
CREATE TABLE address_user
(
[username] VARCHAR(13) NOT NULL,
[address] CHAR(58) NOT NULL,
[id] BIGINT NOT NULL,
[sys_start_time] DATETIME2(7)
GENERATED ALWAYS AS ROW START HIDDEN NOT NULL,
[sys_end_time] DATETIME2 (7)
GENERATED ALWAYS AS ROW END HIDDEN NOT NULL,
PERIOD FOR SYSTEM_TIME ([sys_start_time], [sys_end_time]),
CONSTRAINT [PK_ address_user]
PRIMARY KEY CLUSTERED ([id] ASC)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE=[dbo].[address_user_history], DATA_CONSISTENCY_CHECK=ON));
The easy way to do that is to just drop the previous table and recreate it with the right schema.
However, I have a lot of data in this table; saving the data, dropping the table, recreating it, and re-inserting the data makes me uncomfortable.
So if you have a solution to transform the first table into a temporal table without deleting everything and recreating it, it would be a great help!
Create the new table address_user_new, insert the data, then use sp_rename to rename address_user to address_user_old and address_user_new to address_user. This can all be done in a transaction to ensure that the transition is atomic and appears instantaneous. E.g.:
if object_id('address_user') is not null
ALTER TABLE address_user SET ( SYSTEM_VERSIONING = OFF)
go
if object_id('address_user_new') is not null
ALTER TABLE address_user_new SET ( SYSTEM_VERSIONING = OFF)
go
drop table if exists address_user
drop table if exists address_user_history
drop table if exists address_user_new
drop table if exists address_user_old
go
CREATE TABLE address_user
(
[username] VARCHAR(13) NOT NULL,
[address] CHAR(58) NOT NULL,
[id] BIGINT NOT NULL,
CONSTRAINT [PK_address_user]
PRIMARY KEY CLUSTERED ([id] ASC)
);
go
CREATE TABLE address_user_new
(
[username] VARCHAR(13) NOT NULL,
[address] CHAR(58) NOT NULL,
[id] BIGINT NOT NULL,
[sys_start_time] DATETIME2(7)
GENERATED ALWAYS AS ROW START HIDDEN NOT NULL,
[sys_end_time] DATETIME2 (7)
GENERATED ALWAYS AS ROW END HIDDEN NOT NULL,
PERIOD FOR SYSTEM_TIME ([sys_start_time], [sys_end_time]),
CONSTRAINT [PK_address_user_new]
PRIMARY KEY CLUSTERED ([id] ASC)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE=[dbo].[address_user_history], DATA_CONSISTENCY_CHECK=ON));
go
set xact_abort on
begin transaction
insert into address_user_new(username,address,id)
select username,address,id
from address_user with (tablockx)
exec sp_rename 'address_user', 'address_user_old', 'OBJECT'
exec sp_rename 'PK_address_user', 'PK_address_user_old', 'OBJECT'
exec sp_rename 'address_user_new', 'address_user', 'OBJECT'
exec sp_rename 'PK_address_user_new', 'PK_address_user', 'OBJECT'
commit transaction
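After the swap, address_user behaves like any other temporal table. A quick way to confirm history is being captured (the id value and address below are made up for illustration; the period columns must be listed explicitly because they are HIDDEN):
UPDATE address_user SET address = '1 New Street' WHERE id = 1
SELECT id, username, address, sys_start_time, sys_end_time
FROM address_user
FOR SYSTEM_TIME ALL              -- current row plus every historical version
WHERE id = 1
ORDER BY sys_start_time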
I am working on an audit trail using SQL Server triggers to identify inserts, updates and deletes on tables. Below are my tables and trigger:
CREATE TABLE [dbo].[AuditTrail]
(
[AuditId] [INT] IDENTITY(1,1) NOT NULL,
[DateTime] [DATETIME] NOT NULL,
[TableName] [NVARCHAR](255) NOT NULL,
[AuditEntry] [XML] NULL,
CONSTRAINT [PK_AuditTrail] PRIMARY KEY CLUSTERED ([AuditId] ASC)
)
CREATE TABLE [dbo].[Employee]
(
[ID] [UNIQUEIDENTIFIER] NOT NULL DEFAULT (newid()),
[NameEmployee] [NVARCHAR](255) NOT NULL,
CONSTRAINT [PK_Employee] PRIMARY KEY CLUSTERED ([ID] ASC)
)
CREATE TABLE [dbo].[Transaction]
(
[ID] [UNIQUEIDENTIFIER] NOT NULL DEFAULT (newid()),
[NameTransaction] [NVARCHAR](255) NOT NULL,
CONSTRAINT [PK_Transaction] PRIMARY KEY CLUSTERED ([ID] ASC)
)
CREATE TRIGGER AuditEmployee
ON [dbo].[Employee]
AFTER INSERT, DELETE, UPDATE
AS
BEGIN
SET NOCOUNT ON;
IF (SELECT COUNT(*) FROM deleted) > 0
BEGIN
DECLARE @AuditMessage XML
SET @AuditMessage = (SELECT * FROM deleted FOR XML AUTO)
INSERT INTO AuditTrail (DateTime, TableName, AuditEntry)
VALUES (GETDATE(), 'Employee', @AuditMessage) -- log the affected table's name rather than a fixed label
END
END
GO
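To see what the trigger records, a quick smoke test along these lines can be run (the employee names are made up; note that plain INSERTs produce no audit row because only deleted is inspected):
INSERT INTO dbo.Employee (NameEmployee) VALUES (N'Alice')                        -- not logged: deleted is empty
UPDATE dbo.Employee SET NameEmployee = N'Alicia' WHERE NameEmployee = N'Alice'   -- logs the pre-update row
DELETE FROM dbo.Employee WHERE NameEmployee = N'Alicia'                          -- logs the deleted row
SELECT AuditId, [DateTime], TableName, AuditEntry
FROM dbo.AuditTrail
ORDER BY AuditId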
I have created a trigger for the Employee table, but that is a static trigger. How can I create the triggers dynamically with a stored procedure, depending on however many tables the database has, for every table except the AuditTrail table?
I've created a trigger on a SQL Server 2012 table to log all INSERT, UPDATE and DELETE actions against the table to another table. I got my initial code from the blog post at http://sqlblog.com/blogs/jonathan_kehayias/archive/2010/01/11/tsql2sday-using-sys-dm-exec-sql-text-to-get-the-calling-statement.aspx.
Here are the scripts for a test table, the table I'm logging changes on the test table to, and the trigger on the test table:
CREATE TABLE [dbo].[TestAddresses](
[RecNo] [int] IDENTITY(1,1) NOT NULL,
[FirstName] [varchar](50) NULL,
[LastName] [varchar](50) NULL,
[Address] [varchar](50) NULL,
[City] [varchar](50) NULL,
[State] [varchar](2) NULL,
[Zip] [varchar](12) NULL
) ON [PRIMARY]
GO
-------------------------------------
CREATE TABLE [dbo].[TestAddressesLog](
[RecNo] [int] IDENTITY(1,1) NOT NULL,
[Sql] [nvarchar](max) NULL,
[Timestamp] [datetime] NOT NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
ALTER TABLE [dbo].[TestAddressesLog] ADD CONSTRAINT [DF_TestAddressesLog_Sql] DEFAULT (N'') FOR [Sql]
GO
ALTER TABLE [dbo].[TestAddressesLog] ADD CONSTRAINT [DF_TestAddressesLog_Timestamp] DEFAULT (getdate()) FOR [Timestamp]
GO
------------------------------------
ALTER TRIGGER [dbo].[TestAddressLogTrigger]
ON [dbo].[TestAddresses]
FOR INSERT, UPDATE, DELETE
AS
BEGIN
DECLARE @TEMP TABLE (EventType NVARCHAR(30), Parameters INT, EventInfo NVARCHAR(4000))
INSERT INTO @TEMP EXEC('DBCC INPUTBUFFER(@@SPID)')
INSERT INTO TestAddressesLog (Sql)
SELECT EventInfo FROM @TEMP
END
This works great for standard CRUD operations where there are no parameters. However, when my .NET code executes a parameterized operation such as the following (assigning a value such as 'CA' to @NewState):
UPDATE TestAddresses SET State = @NewState WHERE State = 'TX'
I get the following result:
(@NewState int)UPDATE TestAddresses SET State = @NewState WHERE State = 'TX'
I need a way to get and record the value of the passed parameter. Anyone have a solution for this?
That's not going to work with DBCC INPUTBUFFER; the trigger only ever sees the parameterized statement text, not the values. The only way is to capture rpc_completed events using Extended Events, parse them, and persist them.
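A minimal sketch of such a session (the session name, file name, and database filter are placeholders you would adjust): the statement field of rpc_completed contains the full RPC call, including the parameter values the client sent.
CREATE EVENT SESSION CaptureRpc ON SERVER
ADD EVENT sqlserver.rpc_completed (
    ACTION (sqlserver.client_app_name, sqlserver.username)
    WHERE (sqlserver.database_name = N'MyDatabase')   -- assumption: filter to your database
)
ADD TARGET package0.event_file (SET filename = N'CaptureRpc.xel')
WITH (STARTUP_STATE = ON)
GO
ALTER EVENT SESSION CaptureRpc ON SERVER STATE = START
-- The captured statement will look something like:
-- exec sp_executesql N'UPDATE TestAddresses SET State = @NewState WHERE State = ''TX''', N'@NewState varchar(2)', @NewState = 'CA'
-- Later, read the captured events with sys.fn_xe_file_target_read_file(N'CaptureRpc*.xel', NULL, NULL, NULL)
GO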
The table:
CREATE TABLE [dbo].[User] (
[UserName] NVARCHAR (100) NOT NULL,
[Pasword] NVARCHAR (100) NOT NULL,
[Name] TEXT NOT NULL,
[LastName] TEXT NOT NULL,
[Location] TEXT NOT NULL,
[profesion] TEXT NOT NULL,
[Email] NVARCHAR (50) NOT NULL,
[Gender] TEXT NOT NULL,
PRIMARY KEY CLUSTERED ([UserName] ASC)
);
I want to update it to:
CREATE TABLE [dbo].[User] (
[UserName] NVARCHAR (100) NOT NULL,
[Pasword] NVARCHAR (100) NOT NULL,
[Name] TEXT NOT NULL,
[LastName] TEXT NOT NULL,
[Location] TEXT NOT NULL,
[profesion] TEXT NOT NULL,
[Email] NVARCHAR (50) NOT NULL,
[Gender] TEXT NOT NULL,
[moneyinmillions] INT NOT NULL,
PRIMARY KEY CLUSTERED ([UserName] ASC)
);
The problem is that when I try to apply this change, I get the error:
an error occurred while the batch was being executed
Thanks for the help.
In the interest of answering your question, here is the code you would use to add the moneyinmillions column to the User table:
ALTER TABLE [User]
ADD [moneyinmillions] INT NOT NULL;
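One caveat worth noting: if [User] already contains rows, adding a NOT NULL column without a default fails, which may be what is causing the error above. A common variant (the default value 0 and the constraint name are just examples) is:
ALTER TABLE [dbo].[User]
ADD [moneyinmillions] INT NOT NULL
    CONSTRAINT [DF_User_moneyinmillions] DEFAULT (0)   -- existing rows are filled with the default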
Ways to Insert a Column into Your Existing Table
Use the ALTER TABLE Statement
Do the following:
ALTER TABLE [dbo].[User]
ADD [moneyinmillions] INT NOT NULL
Using the Table Designer
In Object Explorer, right-click the table to which you want to add columns (here, the User table) and choose Design.
Click in the first blank cell in the Column Name column and type moneyinmillions.
Press the TAB key to go to the Data Type cell and select a data type (here, int) from the dropdown.
When you are finished adding columns, from the File menu, choose Save table name (here, User).
Using DROP TABLE and Re-Creating the Table
DROP TABLE [dbo].[User]
and then execute the statements below:
CREATE TABLE [dbo].[User] (
[UserName] NVARCHAR (100) NOT NULL,
[Pasword] NVARCHAR (100) NOT NULL,
[Name] TEXT NOT NULL,
[LastName] TEXT NOT NULL,
[Location] TEXT NOT NULL,
[profesion] TEXT NOT NULL,
[Email] NVARCHAR (50) NOT NULL,
[Gender] TEXT NOT NULL,
[moneyinmillions] INT NOT NULL,
PRIMARY KEY CLUSTERED ([UserName] ASC));
(Note: The DROP TABLE statement removes the table definition and all the data, indexes, triggers, constraints, and permission specifications for that table. So, if you already have data in some fields/columns, do not use the DROP TABLE approach, because you'll lose all of it.)
Did you know that you can right-click on the table and open the design view to add or remove columns?
I have 2 databases, DB1 and DB2. I need to write a query to copy data from DB2 to DB1.
Both databases have the same table structure.
For Example:
CREATE TABLE DB1.Group(
GroupID [int] IDENTITY(1,1) NOT NULL,
[Company] [varchar](10) NOT NULL,
[Description] [varchar](1000) NOT NULL
)
CREATE TABLE DB1.Instance(
[InstanceID] [int] IDENTITY(1,1) NOT NULL,
[Description] [varchar](1000) NOT NULL,
[GroupID] [int] NOT NULL,
)
I read the data from DB2.Group and insert it into DB1.Group:
Insert into DB1.Group (Company,Description)
select Company,Description from DB2.Group
GroupID is auto-incremented in DB1, and I do not want to turn that off, as it would conflict with the existing data.
Now, while inserting data into DB1.Instance, I need to provide the new auto-incremented IDs (GroupID) generated by the DB1.Group table:
Insert into DB1.Instance (Description,GroupID)
select Description, GroupID from DB2.Instance
Please guide me on how I can do that.
Thanks.
Insert into your first table with the new keys (leave the PK blank on insert) and add a temporary column in the DB1 table for the old key. Then join your second insert on the old-key column to look up the new foreign key. When you're done, drop the old-key column, and you're finished.
Here is the SQL:
-- DB1/DB2 here follow the question's naming; in practice these would be three-part names such as DB1.dbo.[Group]
CREATE TABLE DB1.[Group](
[GroupID] [int] IDENTITY(1,1) NOT NULL,
[Company] [varchar](10) NOT NULL,
[Description] [varchar](1000) NOT NULL,
[old_key] [int]  -- temporary column holding the source (DB2) GroupID
)
CREATE TABLE DB1.Instance(
[InstanceID] [int] IDENTITY(1,1) NOT NULL,
[Description] [varchar](1000) NOT NULL,
[GroupID] [int] NOT NULL
)
-- Copy the groups, remembering each row's original DB2 key
Insert into DB1.[Group] (Company,Description,old_key)
select Company,Description,GroupID from DB2.[Group]
-- Copy the instances, translating the old GroupID to the newly generated one
Insert into DB1.Instance (Description,GroupID)
select Description, DB1.[Group].GroupID
from DB2.Instance join DB1.[Group] ON DB1.[Group].old_key = DB2.Instance.GroupID
-- Drop the helper column once the copy is complete
ALTER TABLE DB1.[Group] DROP COLUMN old_key
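Before that final DROP COLUMN, a sanity check like the following (hypothetical, using the same DB1/DB2 naming as above) can confirm every copied instance found its new group:
-- Expect zero unmatched rows before dropping old_key
SELECT COUNT(*) AS unmatched_rows
FROM DB2.Instance AS src
LEFT JOIN DB1.[Group] AS g ON g.old_key = src.GroupID
WHERE g.GroupID IS NULL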