Using SQL to send emails based on new entries in tables - sql-server

I'm looking for a way to send emails (or reminders/confirmations) to users who create a new record in a web application, which is then inserted into a table. Basically a dynamic way of sending emails.
I've been reading online about triggers and DB Mail, and there seem to be a lot of disadvantages to going with this approach.
Does anyone have any recommendations on the best way to achieve this?
Flow: new record inserted into DB table -> at this point the email address of the user who created the record in the application should receive a mail (basically a confirmation mail).
What I've tried already:
DB mail is already configured and working.
I've made the trigger below (very basic), but from reading online, using a trigger in this way will lead to load/performance issues on my DB.
I'm also unsure how to generate the emails dynamically and pick up the last record inserted.
CREATE TRIGGER [dbo].[INSERT_Trigger] ON [dbo].[Entity]
FOR INSERT
AS
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'DBMail',
    @recipients = 'admni@domain.com', -- here I would need dynamic emails based on which user entered the new record (stored in the same table)
    @query = 'SELECT Description FROM Entity WHERE <last inserted record>', -- here I would need the last record that user entered
    @subject = 'Subject',
    @body = 'Message Body',
    @importance = 'High'

IMHO this approach is a design flaw: the DB tier should be one of the leaves of the tier tree. The fact that MS SQL Server is actually an application server with support for such things is legacy, but I don't think it should be used.
At first look:
- you might need to switch to another RDBMS some day
- your production environment might not support SMTP for some reason
- your attempt to send the mail could fail for various reasons, resulting in the user never being notified and no retry ever happening
Yes, you can use SQL Server even as a message bus, but it would not be an efficient one. This concept is really dispatching events of the "notification needed" kind: the event is implemented as an insert and the trigger is the consumer. But the event is produced inside your application, in a higher tier. Why not react to it there? Or use the database only as a queue: store the details there, but process them somewhere you have more control.
You have not told us about the application you are creating, but I would create a separate background task (the implementation could vary depending on the application design: an OS-level scheduled task, a Windows service, or a background worker in your application) that periodically checks for mails not yet sent, tries to send them, and stores the result in the record. Things might of course get more complicated depending on load, but this way you can retry, and you take load off the DB server. A minimal sketch of the idea follows.
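As a rough illustration of that outbox pattern (table and column names here are hypothetical; the profile name is taken from the question):

-- The application (or, at worst, a lightweight trigger) only records that a mail is due:
CREATE TABLE dbo.EmailOutbox (
    Id        INT IDENTITY(1,1) PRIMARY KEY,
    Recipient NVARCHAR(256) NOT NULL,
    Subject   NVARCHAR(256) NOT NULL,
    Body      NVARCHAR(MAX) NOT NULL,
    CreatedAt DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME(),
    SentAt    DATETIME2 NULL,
    LastError NVARCHAR(MAX) NULL
);

-- The background task periodically runs something like this:
DECLARE @Id INT, @Recipient NVARCHAR(256), @Subject NVARCHAR(256), @Body NVARCHAR(MAX);
SELECT TOP (1) @Id = Id, @Recipient = Recipient, @Subject = Subject, @Body = Body
FROM dbo.EmailOutbox
WHERE SentAt IS NULL
ORDER BY Id;

IF @Id IS NOT NULL
BEGIN
    BEGIN TRY
        EXEC msdb.dbo.sp_send_dbmail
            @profile_name = 'DBMail',
            @recipients = @Recipient,
            @subject = @Subject,
            @body = @Body;
        UPDATE dbo.EmailOutbox SET SentAt = SYSUTCDATETIME() WHERE Id = @Id;
    END TRY
    BEGIN CATCH
        -- record the failure so the next run can retry
        UPDATE dbo.EmailOutbox SET LastError = ERROR_MESSAGE() WHERE Id = @Id;
    END CATCH
END

Rows whose SentAt is still NULL are simply picked up again on the next run, which gives you the retry behaviour the trigger approach lacks.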

I have a trigger that works as you asked.
First, insert records into a temporary table from the inserted table.
Second, declare the parameters that you need.
Third, add a cursor to your trigger and fetch the parameters you need from the temporary table.
Inside the trigger you can declare the recipients, the query, and the other pieces as you need them.
CREATE TRIGGER [dbo].[Entity_Insert_Mail] ON [Entity]
FOR INSERT
NOT FOR REPLICATION
AS
SELECT ins.* INTO #temp FROM inserted ins

DECLARE @Param1 INT, @Param2 INT
DECLARE mail_cursor CURSOR FORWARD_ONLY FOR
    SELECT Col1, Col2 FROM #temp ORDER BY Col1

OPEN mail_cursor
FETCH NEXT FROM mail_cursor INTO @Param1, @Param2
WHILE @@FETCH_STATUS = 0
BEGIN
    DECLARE @recipient VARCHAR(MAX), @query VARCHAR(MAX)
    SELECT @recipient = Col1 -- or whatever column contains the recipient address
    FROM #temp
    WHERE Col1 = @Param1
    SELECT @query = Description
    FROM #temp
    WHERE Col2 = @Param2 -- or whatever condition gives you the description for the parameter
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'profile',
        @recipients = @recipient,
        @subject = 'Subject',
        @body = @query,
        @importance = 'High'
    FETCH NEXT FROM mail_cursor INTO @Param1, @Param2
END
CLOSE mail_cursor
DEALLOCATE mail_cursor
DROP TABLE #temp
--- note that the body can be formatted as HTML, like below
DECLARE @body VARCHAR(MAX)
SELECT @body = '<html><body>'
SELECT @body = @body + 'Hello' + '<br><br>'
SELECT @body = @body + 'Here is the description: ' + @query + '<br><br>'
SELECT @body = @body + 'Regards'
SELECT @body = @body + '</body></html>'
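One detail worth adding: for the HTML to actually render, sp_send_dbmail has to be told the body is HTML via its @body_format parameter (the default is 'TEXT', which would show the tags literally), for example:

EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'profile',
    @recipients = @recipient,
    @subject = 'Subject',
    @body = @body,
    @body_format = 'HTML',
    @importance = 'High'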

AFTER INSERT trigger calling stored procedure which calls API causes problem with @@IDENTITY

I am replacing a classic ASP website and VB6 back-end process for a client. It's a pretty high-traffic site where various processes insert a record into a queue table. A couple of the processes are different web services and one process is a form on the classic ASP website. For the new process, I created an AFTER INSERT trigger that calls a stored procedure, which makes an API call to another server. That API call will initiate code to process the new record. Here's the trigger (disclaimer: code modified to mask client identity... and make me look as intelligent as possible):
-- =============================================
-- Author: Gary Jorgenson, RN
-- Create date: 05/17/2021
-- Description: notify system of new record
-- =============================================
CREATE TRIGGER [dbo].[tr_notify_new_record]
ON [dbo].[TableName]
AFTER INSERT
AS
BEGIN
    DECLARE @id AS INT;
    SELECT @id = I.id FROM INSERTED I
    EXEC [dbo].[core_api_notify_new_record]
        @record_id = @id
END
The stored procedure is as follows:
-- =============================================
-- Author: Gary Jorgenson, RN
-- Create date: 5/16/2021
-- Description: Notify system of new record
-- =============================================
CREATE PROCEDURE [dbo].[usp_api_newrecord_notify]
    @record_id AS INT
AS
BEGIN
    DECLARE @URL NVARCHAR(128) = 'https://ourwebsite.com/api/newrecord';
    DECLARE @recID AS VARCHAR(50) = CONVERT(VARCHAR(50), @record_id);
    SET @URL = CONCAT(@URL, '/', @recID);
    DECLARE @Object AS INT;
    DECLARE @ResponseText AS VARCHAR(8000);
    EXEC sp_OACreate 'MSXML2.XMLHTTP', @Object OUT;
    EXEC sp_OAMethod @Object, 'open', NULL, 'post', @URL, 'false'
    EXEC sp_OAMethod @Object, 'setRequestHeader', NULL, 'Content-Type', 'application/json'
    EXEC sp_OAMethod @Object, 'send' -- issues the request; without this the call never fires
    EXEC sp_OAMethod @Object, 'responseText', @ResponseText OUTPUT
    SELECT @ResponseText AS [Details]
    EXEC sp_OADestroy @Object
END
To do initial testing, I made the API controller just do a simple insert into a log table that records the record ID of the newly created record. I tested the API locally using POSTMAN and that was successful. I then locked down the sp_OA... methods to provide only the permissions necessary, and tested the stored procedure by executing it in SSMS using the same credentials as will be used in production, and that was successful. Lastly, I enabled the AFTER INSERT trigger having a pretty high level of confidence that nothing could go wrong.
I was not correct.
A few minutes after turning on the trigger, a few customers called reporting that the website crashed when trying to submit the form. Working with the original website developer, we determined that his code was performing a multi-statement SQL insert where the last statement called @@IDENTITY to get the new record ID. Somehow, the stored procedure and/or API call was affecting @@IDENTITY such that it returned a null or zero value.
This makes no sense, as the only other INSERT made in the process is behind the API controller, which inserts a log record on a different machine, in a different instance of SQL Server.
The original website developer is changing @@IDENTITY to use SCOPE_IDENTITY() instead. We're going to test and see if that alleviates the problem. The whole thing makes me nervous, though, as I never imagined this process would have any effect on @@IDENTITY since I'm not inserting any records into any tables on the local machine. I'd like to have a better understanding of what happened in this process.
Ultimately, my question is: how could my process have affected @@IDENTITY since I didn't insert any records?
Any advice is much appreciated!
As per the docs:
@@IDENTITY and SCOPE_IDENTITY return the last identity value generated in any table in the current session. However, SCOPE_IDENTITY returns the value only within the current scope; @@IDENTITY is not limited to a specific scope.
So the likely scenario when using @@IDENTITY is that the trigger is also doing an insert (maybe into a system table in your case), and therefore this new id is returned in @@IDENTITY back to the user, which of course is not the id they want.
I can't actually think of a case where you would want to use @@IDENTITY; these days you would normally use the OUTPUT clause to ensure you get back exactly the id(s) you are looking for. If for some reason that is not an option, then SCOPE_IDENTITY() is a much better alternative to @@IDENTITY.
IDENT_CURRENT('TableName') is just as bad, in that the value returned is across ALL sessions and ALL scopes; you're restricting the table at the cost of widening the scope.
Note: as an aside, your trigger is broken, because it's assuming that inserted will only contain a single record when in fact it can contain 0-N.
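For illustration, a minimal sketch of the OUTPUT approach against a hypothetical queue table; the ids come from this INSERT alone, so inserts performed inside triggers cannot pollute the result the way they do with @@IDENTITY:

DECLARE @NewIds TABLE (Id INT);

INSERT INTO dbo.QueueTable (Payload)  -- hypothetical table and column
OUTPUT INSERTED.Id INTO @NewIds (Id)  -- captures only the ids generated by this statement
VALUES (N'example payload');

SELECT Id FROM @NewIds;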
It turned out to be my own fault. In the stored procedure, I had a SELECT at the end of the procedure that returned the API response:
SELECT @ResponseText AS [Message]
The previous developer had code where he performed an INSERT, then queried @@IDENTITY for the new record id. Instead of getting an integer record id, he was getting the message from the API call. He changed his code to use SCOPE_IDENTITY() instead of @@IDENTITY, and we tested to find it still didn't work. Digging a little deeper, we found the problem.

SQL Server Service Broker - Ways to improve SQL execution framework

Below is an outline of a SQL execution framework design using Service Broker that I have been playing with. I've outlined the process, asked some questions throughout (highlighted with block quotes), and would be interested in hearing any advice on the design.
Overview
I have an ETL operation that needs to take data out of 5 databases and move it into 150 using select/insert statements or stored procedures. The result is about 2,000 individual queries, taking between 1 second and 1 hour each.
Each SQL query inserts data only. There is no need for data to be returned.
The operation can be broken up into 3 steps:
Pre-ETL
ETL
Post-ETL
The queries in each step can be executed in any order, but the steps have to stay in order.
Method
I am using Service Broker for asynchronous/parallel execution.
Any advice on how to tune Service Broker (e.g. specific options to look at, or guidance for setting the number of queue workers)?
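As one concrete knob (a sketch; queue and procedure names follow the design described below): the number of parallel activation workers per queue is capped by MAX_QUEUE_READERS, set via ALTER QUEUE:

ALTER QUEUE dbo.Unprocessed
WITH ACTIVATION (
    STATUS = ON,
    PROCEDURE_NAME = dbo.ProcessUnprocessedQueue,
    MAX_QUEUE_READERS = 8,  -- upper bound on concurrent activation procedures
    EXECUTE AS OWNER
);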
Service Broker Design
Initiator
The initiator sends an XML message containing the SQL query to the Unprocessed queue, with an activation stored procedure called ProcessUnprocessedQueue. This process is wrapped in a try/catch in a transaction, rolling back the transaction when there is an exception.
ProcessUnprocessedQueue
ProcessUnprocessedQueue passes the XML to procedure ExecSql
ExecSql - SQL Execution and Logging
ExecSql then handles the SQL execution and logging:
The XML is parsed, along with any other data about the execution that is going to be logged
Before the execution, a logging entry is inserted
If the transaction is started in the initiator, can I ensure the log entry insert is always committed even if the outer transaction in the initiator is rolled back?
Something like SAVE TRANSACTION is not valid here, correct?
Or should I not manipulate the transaction here, but execute the query in a try/catch and, if it goes to the catch, insert a log entry for the exception and rethrow it, since it is in the middle of the transaction?
The query is executed
Alternative Logging Solution?
I need to log:
The SQL query executed
Metadata about the operation
The time it takes for each process to finish
This is why I insert one row at the start and one at the end of the process
Any exceptions, if they exist
Would it be better to have an In-Memory OLTP table that contains the query information? I would INSERT a row before the start of an operation and then do an UPDATE or INSERT to log exceptions and execution times (a sketch follows below). After the batch is done, I would archive the data into an on-disk table to prevent the In-Memory table from getting too big.
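A sketch of that logging shape (names are hypothetical; shown as an ordinary disk table, though the same shape works as an In-Memory OLTP table):

CREATE TABLE dbo.EtlExecutionLog (
    Id         BIGINT IDENTITY(1,1) PRIMARY KEY,
    QueryText  NVARCHAR(MAX) NOT NULL,
    StartedAt  DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME(),
    FinishedAt DATETIME2 NULL,
    ErrorText  NVARCHAR(MAX) NULL
);

-- before executing a query (@sql holds the query text):
DECLARE @sql NVARCHAR(MAX) = N'/* one of the ETL queries */';
INSERT INTO dbo.EtlExecutionLog (QueryText) VALUES (@sql);
DECLARE @logId BIGINT = SCOPE_IDENTITY();

-- after it finishes (set ErrorText = ERROR_MESSAGE() in the CATCH block instead):
UPDATE dbo.EtlExecutionLog
SET FinishedAt = SYSUTCDATETIME()
WHERE Id = @logId;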
ProcessUnprocessedQueue - Manually process the results
After the execution, ProcessUnprocessedQueue gets back an updated version of the XML (to determine whether the execution was successful, plus other data about the transaction for post-processing) and sends that message to the ProcessedQueue, which does not have an activation procedure, so it can be processed manually (I need to know when a batch of queries has finished executing).
Processing the Queries
Since the ETL can be broken out into 3 steps, I create 3 XML variables to which I add all of the queries needed in the ETL operation, so I will have something like this:
@preEtlQueue xml -- 200 queries
@etlQueue xml -- 1500 queries
@postEtlQueue xml -- 300 queries
Why XML?
The XML queue variable is passed between different stored procedures as an OUTPUT parameter that updates its values and/or adds SQL queries to it. This variable needs to be written and read, so an alternative could be something like a global temp table or a persistent table.
I then process the XML variables:
Use a cursor to loop through the queries and send them to the service broker service.
Each group of queries contained in the XML variable is sent under the same conversation_group_id.
Values such as the to/from service, message type, etc. are all stored in the XML variable.
After the messages are sent to Service Broker, use a while loop to continuously check the ProcessedQueue until all the messages have been processed.
This implements a timeout to avoid an infinite loop
I'm thinking of redesigning this. Should I add an activation procedure on ProcessedQueue and then have that procedure insert the processed results into a physical table? If I do it this way, I couldn't use RECEIVE to check for processed items; I'd be polling the physical table with the WHILE loop instead. Does that have any disadvantages?
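For reference, the manual check in the current design would typically be a blocking WAITFOR (RECEIVE ...) with a timeout rather than a tight polling loop; a sketch against the ProcessedQueue described above:

DECLARE @msgType SYSNAME, @msgBody XML;

WAITFOR (
    RECEIVE TOP (1)
        @msgType = message_type_name,
        @msgBody = CAST(message_body AS XML)
    FROM ProcessedQueue
), TIMEOUT 5000;  -- returns after 5s with @msgType NULL, so the loop can re-check without spinning

IF @msgType IS NULL
    PRINT 'no message yet; loop around and check the batch-completion condition';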
I haven't built anything as massive as what you are doing now, but I will share what worked for me, along with some general opinions...
My preference is to avoid In-Memory OLTP, write everything to durable tables, and keep the message queue as clean as possible.
Use the fastest possible hard drives in the server: write speeds equivalent to NVMe or faster, RAID 10, etc.
I grab every message off the queue as soon as it arrives and write it to a table named "mqMessagesReceived" (see the code below; my all-purpose MQ handler is named mqAsyncQueueMessageOnCreate).
I use a trigger on the "mqMessagesReceived" table that does a lookup to find which stored procedure to execute to process each unique message (see code below).
Each message has an identifier (in my case, the name of the originating table that wrote the message to the queue), and this identifier is used as the key for a lookup query run inside the trigger on the mqMessagesReceived table, to figure out which subsequent stored procedure needs to run to process each received message correctly.
Before sending a message on the MQ, you can build a generic identifier from the calling side (e.g. if a trigger is putting messages onto the MQ):
SELECT @tThisTableName = OBJECT_NAME(parent_object_id) FROM sys.objects
WHERE sys.objects.name = OBJECT_NAME(@@PROCID)
AND SCHEMA_NAME(sys.objects.schema_id) = OBJECT_SCHEMA_NAME(@@PROCID);
A configuration table holds the lookup data matching each table name to the stored procedure that needs to run to process the MQ data that arrived and was written to the mqMessagesReceived table.
Here is the definition of that lookup table
CREATE TABLE [dbo].[mqMessagesConfig](
[ID] [int] IDENTITY(1,1) NOT NULL,
[tSourceTableReceived] [nvarchar](128) NOT NULL,
[tTriggeredStoredProcedure] [nvarchar](128) NOT NULL,
CONSTRAINT [PK_mqMessagesConfig] PRIMARY KEY CLUSTERED
(
[ID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
Here is the activation stored procedure that gets run as a message hits the queue
CREATE PROCEDURE [dbo].[mqAsyncQueueMessageOnCreate]
AS
BEGIN
SET NOCOUNT ON
DECLARE
    @h UNIQUEIDENTIFIER,
    @t sysname,
    @b varbinary(200),
    @hand VARCHAR(36),
    @body VARCHAR(2000),
    @sqlcleanup nvarchar(MAX)
-- Get all of the messages on the queue
-- the WHILE loop is infinite, until BREAK is reached when we get a null handle
WHILE 1=1
BEGIN
    SET @h = NULL
    --Note the semicolon..!
    ;RECEIVE TOP(1)
        @h = conversation_handle,
        @t = message_type_name,
        @b = message_body
    FROM mqAsyncQueue
    --No message found (handle is now null)
    IF @h IS NULL
    BEGIN
        -- all messages are now processed, but we still have the @hand variable saved from processing the last message
        SET @sqlcleanup = 'EXEC [mqConversationsClearOne] @handle = N' + char(39) + @hand + char(39) + ';';
        EXECUTE(@sqlcleanup);
        BREAK
    END
    --mqAsyncMessage message type received
    ELSE IF @t = 'mqAsyncMessage'
    BEGIN
        SET @hand = CONVERT(varchar(36), @h);
        SET @body = CONVERT(varchar(2000), @b);
        INSERT mqMessagesReceived (tMessageType, tMessageBody, tMessageBinary, tConversationHandle)
        VALUES (@t, @body, @b, @hand);
    END
    --unknown message type was received that we don't understand
    ELSE
    BEGIN
        INSERT mqMessagesReceived (tMessageBody, tMessageBinary)
        VALUES ('Unknown message type received', CONVERT(varbinary(MAX), 'Unknown message type received'))
    END
END
END
CREATE PROCEDURE [dbo].[mqConversationsClearOne]
    @handle varchar(36)
AS
-- Note: you can check the queue by running this query
-- SELECT * FROM sys.conversation_endpoints
-- SELECT * FROM sys.conversation_endpoints WHERE NOT([State] = 'CO')
-- CO = conversing [State]
DECLARE @getid CURSOR
    ,@sql NVARCHAR(MAX)
    ,@conv_id NVARCHAR(100)
    ,@conv_handle NVARCHAR(100)
-- want to create a chain of statements like this, one per conversation
-- END CONVERSATION 'FE851F37-218C-EA11-B698-4CCC6AD00AE9' WITH CLEANUP;
-- END CONVERSATION 'A4B4F603-208C-EA11-B698-4CCC6AD00AE9' WITH CLEANUP;
SET @getid = CURSOR FOR
    SELECT [conversation_id], [conversation_handle]
    FROM sys.conversation_endpoints
    WHERE conversation_handle = @handle;
OPEN @getid
FETCH NEXT
FROM @getid INTO @conv_id, @conv_handle
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = 'END CONVERSATION ' + char(39) + @conv_handle + char(39) + ' WITH CLEANUP;'
    EXEC sys.sp_executesql @stmt = @sql;
    FETCH NEXT
    FROM @getid INTO @conv_id, @conv_handle --, @conv_service
END
CLOSE @getid
DEALLOCATE @getid
and the table named "mqMessagesReceived" has this trigger
CREATE TRIGGER [dbo].[mqMessagesReceived_TriggerUpdate]
ON [dbo].[mqMessagesReceived]
AFTER INSERT
AS
BEGIN
DECLARE
    @strMessageBody nvarchar(4000),
    @strSourceTable nvarchar(128),
    @strSourceKey nvarchar(128),
    @strConfigStoredProcedure nvarchar(4000),
    @sqlRunStoredProcedure nvarchar(4000),
    @strErr nvarchar(4000)
SELECT @strMessageBody = ins.tMessageBody FROM INSERTED ins;
SELECT @strSourceTable = (SELECT txt_Value FROM dbo.fn_ParseText2Table(@strMessageBody, '|') WHERE Position = 2);
SELECT @strSourceKey = (SELECT txt_Value FROM dbo.fn_ParseText2Table(@strMessageBody, '|') WHERE Position = 3);
-- look in mqMessagesConfig to find the name of the final stored procedure
-- to run against the SourceTable
-- e.g. @strConfigStoredProcedure = mqProcess-tblLabDaySchedEventsMQ
SELECT @strConfigStoredProcedure =
    (SELECT tTriggeredStoredProcedure FROM dbo.mqMessagesConfig WHERE tSourceTableReceived = @strSourceTable);
SET @sqlRunStoredProcedure = 'EXEC [' + @strConfigStoredProcedure + '] @iKey = ' + @strSourceKey + ';';
EXECUTE(@sqlRunStoredProcedure);
INSERT INTO [mqMessagesProcessed]
(
    [tMessageBody],
    [tSourceTable],
    [tSourceKey],
    [tTriggerStoredProcedure]
)
VALUES
(
    @strMessageBody,
    @strSourceTable,
    @strSourceKey,
    @sqlRunStoredProcedure
);
END
Also, some general SQL Server tuning advice that I found I had to apply (for a busy database):
By default there is just one TempDB file per SQL Server, and TempDB has an initial size of 8MB.
However, TempDB gets reset back to that initial 8MB size every time the server reboots, and this company was rebooting the server every weekend via cron/task scheduler.
The problem we saw was a slow database and lots of record locks, but only first thing Monday morning when everyone was hammering the database at once as they began their work week.
When TempDB gets automatically resized, it is locked, so nobody at all can use the single TempDB (which is why the SQL Server was regularly becoming unresponsive).
By Friday the TempDB had grown to over 300MB.
So, following the usual best-practice recommendation, I created one TempDB file per vCPU (8 TempDB files in my case), distributed them across the two available hard drives on that server, and, most importantly, set their initial size to more than we need (200MB each is what I chose); the commands are sketched below.
This fixed the SQL Server slowdown and record locking experienced every Monday morning.
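The file sizing itself is plain ALTER DATABASE work; a sketch (logical name tempdev is SQL Server's default first tempdb file; the second file's name, path, and both sizes are illustrative):

-- pre-size the existing TempDB file so the 8MB reset stops hurting
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, SIZE = 200MB);

-- add more files, ideally one per vCPU, spread across the available drives
ALTER DATABASE tempdb ADD FILE
    (NAME = tempdev2, FILENAME = 'E:\SQLData\tempdev2.ndf', SIZE = 200MB);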

Is there a way I can get an alert whenever a table gets records in SQL Server?

I have a table in SQL Server which occasionally gets data from a linked server, and then I have to perform activities on it.
The problem is that there is no way to check whether data has been inserted into the table (the table is always truncated after performing the activity, so the next time data is pushed the table is already empty). I manually check for data daily.
What I want is an automatic alert to my email (I already have DB Mail configured and working) whenever data is pushed into the table.
I have sa admin and complete privileges on the database and also on Windows Server 2012 R2.
You can do this with a trigger, but you will have to do some preparation with privileges so the executor (the login that's inserting the records into your tracking table) can send email correctly:
CREATE TRIGGER dbo.TrackingTableNameAfterInsert ON TrackingTable
AFTER INSERT
AS
BEGIN
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'YourConfiguredProfile',
        @recipients = 'youremail@mail.com',
        @subject = 'Records were inserted on TrackingTable',
        @body = ''
END
You might want to encapsulate the email sending in an SP and configure its permissions there; a sketch follows.
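A sketch of that encapsulation (the procedure name is hypothetical): callers only need EXECUTE on the wrapper, while the mail permissions live with the wrapper's security context. The owner must map to an msdb user in the DatabaseMailUserRole role, and depending on your setup the cross-database call may additionally require module signing or TRUSTWORTHY:

CREATE PROCEDURE dbo.usp_SendTrackingMail
    @subject NVARCHAR(255),
    @body NVARCHAR(MAX)
WITH EXECUTE AS OWNER  -- owner needs rights on msdb.dbo.sp_send_dbmail
AS
BEGIN
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'YourConfiguredProfile',
        @recipients = 'youremail@mail.com',
        @subject = @subject,
        @body = @body;
END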
In regards to the following:
...table is always truncated after performing the activity so next time
when data is pushed table is already empty...
You can create a historical table and use a trigger to also copy inserted records into it, so a TRUNCATE or DROP of the original table won't affect the copied records.
CREATE TABLE TrackingTableMirror (
/*Same columns and types*/
InsertedDate DATETIME DEFAULT GETDATE())
GO
CREATE TRIGGER dbo.TrackingTableInsertMirror ON TrackingTable
AFTER INSERT
AS
BEGIN
INSERT INTO TrackingTableMirror (
/*Column list*/)
SELECT
/*Column list*/
FROM
inserted AS I
END
This way you can check all records in this mirrored table rather than the volatile one (and avoid all the email sending).
1) Create Profile and Account
You need to create a profile and account using the Configure Database Mail wizard, which can be accessed from the Configure Database Mail context menu of the Database Mail node under Management in Object Explorer. This wizard is used to manage accounts, profiles, and Database Mail global settings.
2) Run Query
sp_CONFIGURE 'show advanced', 1
GO
RECONFIGURE
GO
sp_CONFIGURE 'Database Mail XPs', 1
GO
RECONFIGURE
GO
3) Send a test mail
USE msdb
GO
EXEC sp_send_dbmail @profile_name='yourprofilename',
@recipients='test@Example.com',
@subject='Test message',
@body='This is the body of the test message.
Congrats! Database Mail was received by you successfully.'
Then, to send through a table of email addresses:
DECLARE @email_id NVARCHAR(450), @id BIGINT, @max_id BIGINT, @query NVARCHAR(1000)
SELECT @id=MIN(id), @max_id=MAX(id) FROM [email_adresses]
WHILE @id<=@max_id
BEGIN
    SELECT @email_id=email_id
    FROM [email_adresses]
    WHERE id=@id
    SET @query='sp_send_dbmail @profile_name=''yourprofilename'',
@recipients='''+@email_id+''',
@subject=''Test message'',
@body=''This is the body of the test message.
Congrats! Database Mail was received by you successfully.'''
    EXEC (@query)
    SELECT @id=MIN(id) FROM [email_adresses] WHERE id>@id
END
4) Trigger Code
CREATE TRIGGER [dbo].[Customer_INSERT_Notification]
ON [dbo].[Customers]
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    -- note: this assumes single-row inserts; inserted can contain multiple rows
    DECLARE @CustomerId INT
    SELECT @CustomerId = INSERTED.CustomerId
    FROM INSERTED
    DECLARE @body VARCHAR(500) = 'Customer with ID: ' + CAST(@CustomerId AS VARCHAR(5)) + ' inserted.'
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'Email_Profile'
        ,@recipients = 'recipient@gmail.com'
        ,@subject = 'New Customer Record'
        ,@body = @body
        ,@importance = 'HIGH'
END

Writing a SQL trigger for the first time

I am trying to write a SQL trigger for the first time but am running into a problem.
I have a database called mytest, and within it is a table called customer. When someone registers, a customer record is created. On the signup form that creates the customer record there is an option to add a doctor's name.
What I am trying to do within the trigger is send the doctor an email with a simple message: "This customer says you are their doctor, please can you reply to this email stating yes or no". Once I get this part working, my plan is to automate updating the SQL table from the response!
Here is what I have written so far
CREATE TRIGGER dbo.SEND_MAIL_TO_PRACTITIONER
ON dbo.mytest
AFTER INSERT
AS
BEGIN
    DECLARE @PractitionerName VARCHAR(100)
    DECLARE @body VARCHAR(100)
    SET @PractitionerName = (SELECT PractitionerName FROM customer)
    SET @body = (SELECT customername FROM customer) + ' emailed us saying you were his doctor, please can you confirm yes or no'
    IF @PractitionerName IS NOT NULL
    BEGIN
        EXEC msdb.dbo.sp_send_dbmail
            @recipients = @PractitionerName,
            @subject = 'TEST',
            @body = @body;
    END
END
GO
The SQL executes but no emails are being sent. I'd also like to change customername to a combination of the FirstName and LastName fields.
Can anybody help point me in the right direction?
You need to start the mail service in SQL Server first; consider sending a test mail. If it fails, check the mail log by right-clicking the Database Mail node.
Configure it as per the docs:
http://msdn.microsoft.com/en-IN/library/ms190307.aspx
http://technet.microsoft.com/en-us/library/ms176087.aspx
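Both the test send and the log check can also be done from T-SQL; these msdb objects are the standard Database Mail diagnostics:

-- send a test message through the configured profile
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'YourProfile',   -- substitute your configured profile name
    @recipients = 'you@example.com',
    @subject = 'DB Mail test',
    @body = 'Test.';

-- then inspect queued/sent items and any errors
SELECT TOP (20) * FROM msdb.dbo.sysmail_allitems ORDER BY send_request_date DESC;
SELECT TOP (20) * FROM msdb.dbo.sysmail_event_log ORDER BY log_date DESC;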

Update website when data in SQL Server changes?

I am stuck on a problem. There is an application that adds data to a database, and it is closed source.
I am creating its web interface. The functionality I want is that, if the value of some field in a column is greater than the value in another field in a column, SQL Server should HTTP-post a message to my site.
Is this possible in Microsoft SQL Server?
And if yes, how?
OK, if another piece of software is doing the inserts, you could do it like this...
ALTER TRIGGER [dbo].[ABCD] ON [dbo].[XXX]
FOR INSERT
AS
DECLARE @A INT, @B INT -- populated from columns 1 and 2 of the inserted row
SELECT @A = Col1, @B = Col2 FROM inserted -- adjust column names to your table
IF (@A > @B)
BEGIN
    -- @email and @message would be declared and populated elsewhere in the trigger
    EXEC msdb.dbo.sp_send_dbmail
        @recipients = @email,
        @body = @message,
        @subject = 'Latest record has column value A greater than column value B'
END
ELSE
BEGIN
    --do whatever
END
P.S. sp_send_dbmail is a stored procedure that sends email messages.
