I want to send an email from a trigger, using msdb.dbo.sp_send_dbmail. My problem is how to set up the permissions properly.
I do not want to add the msdb DatabaseMailUserRole to the database user that has access to the data.
Therefore, I have created another user, mailer, who has DatabaseMailUserRole.
The trigger definition is:
CREATE TRIGGER MyTrigger
ON MyTable
WITH EXECUTE AS 'mailer'
AFTER INSERT
AS
BEGIN
exec msdb.dbo.sp_send_dbmail ...
END
I am getting the following error:
The EXECUTE permission was denied on the object 'sp_send_dbmail', database 'msdb', schema 'dbo'.
(When I remove the WITH EXECUTE AS 'mailer' and add DatabaseMailUserRole to the user who is accessing the database, it works correctly.)
I would recommend not doing any heavy lifting or time-consuming tasks (like sending an e-mail) in a trigger.
The trigger gets executed in the context of the transaction that caused it to fire - and that transaction will have to wait until the trigger and all its operations are done. This is a sure-fire way to kill off any performance in your system.
What I'd rather recommend is this:
inside your trigger, just record a "note" in a table that something needs to be done (like sending an e-mail; store all the relevant info you need in that table) - and that's all the trigger does
create a separate, stand-alone task (e.g. a scheduled job, run by the SQL Server Agent) that checks that "command" table periodically, and then actually does the action needed - like sending the e-mail, updating a large number of rows, exporting data - whatever it might be
With such a setup, you make sure your triggers stay lean and nimble and don't cause more slowdown on your system than absolutely necessary.
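A minimal sketch of that setup, assuming an AFTER INSERT trigger; the table, column and trigger names (EmailQueue, MyTable, ID and so on) are only illustrative placeholders:

-- A "command" table plus a lean trigger that only records work to be done.
CREATE TABLE dbo.EmailQueue
(
    QueueID    INT IDENTITY(1,1) PRIMARY KEY,
    Recipients NVARCHAR(500) NOT NULL,
    Subject    NVARCHAR(255) NOT NULL,
    Body       NVARCHAR(MAX) NULL,
    QueuedAt   DATETIME      NOT NULL DEFAULT GETDATE(),
    SentAt     DATETIME      NULL
);
GO

CREATE TRIGGER dbo.MyTrigger
ON dbo.MyTable
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- Set-based insert: handles single- and multi-row INSERTs alike,
    -- and does nothing more expensive than writing a few rows.
    INSERT INTO dbo.EmailQueue (Recipients, Subject, Body)
    SELECT 'someone@example.com',
           'Row inserted into MyTable',
           'New row with ID ' + CAST(i.ID AS NVARCHAR(20))
    FROM inserted AS i;
END;
GO

A SQL Server Agent job can then poll EmailQueue every few minutes, call sp_send_dbmail for each row where SentAt is NULL, and stamp SentAt afterwards - which also sidesteps the original permission question, since the job can run under an account that has DatabaseMailUserRole instead of the data user.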
I have an Oracle 11 database with 3 instances. I need event 31156 to be traced forever,
so I run this ALTER statement on each instance:
alter system set events '31156 trace name context forever, level 0x400';
The problem is that this setting resets every time the database is restarted or goes into and out of restricted mode.
I want to create two database triggers that will run this command: one after startup and the other after restricted mode is disabled.
I created the startup trigger like this:
create trigger on_startup_trig
after startup on database
begin
  -- ALTER SYSTEM is not valid directly inside a PL/SQL trigger body, so run it dynamically
  execute immediate 'alter system set events ''31156 trace name context forever, level 0x400''';
end;
/
So my first question is: will this trigger run on each of my instances when the database is started, or only once?
My second question is: is there a way to create a similar trigger that fires after restricted session is disabled?
I disable restricted session like this:
alter system disable restricted session;
I need to be able to send an XMPP message when a row gets inserted into a particular table in our SQL Server database (and have it not make the insert fail if the XMPP server or code isn't available/fails/etc).
Is this possible without causing the insert to fail in some circumstances?
To avoid potentially blocking your database application, I'd recommend NOT sending any external messages directly from a trigger. After all, the trigger executes in the context of the SQL statement that caused it to fire, and if the trigger is delayed, your statement will have to wait until the trigger is done (or has timed out).
Instead, what I'd do is this:
insert a row into a "command" table with enough information to be able to later send your XMPP message - this can be done in the trigger
have a separate piece of code, e.g. a scheduled SQL Server job, that checks that "command" table every x minutes or hours - however frequently (or infrequently) you need. This job, running separately and independently from your application, should then attempt to send out those messages and handle any error situations, while your main application happily works along, unbothered by any delays or timeouts (a rough sketch of such a polling step follows below)
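The claiming query such a job might run could be as simple as this; the table and column names (dbo.XmppQueue, Processed and so on) are made up:

-- Atomically claim a batch of pending commands and hand them to the caller;
-- the calling code then sends the XMPP messages and is responsible for
-- resetting Processed back to 0 for any rows that could not be sent.
UPDATE TOP (50) q
SET    q.Processed = 1
OUTPUT deleted.QueueID, deleted.Recipient, deleted.MessageText
FROM   dbo.XmppQueue AS q WITH (ROWLOCK, READPAST)
WHERE  q.Processed = 0;

The READPAST hint lets two pollers run concurrently without blocking each other on the same rows.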
I was planning to use SSIS logging to get task-level details (run duration, any error message thrown, the user who triggered the job) for my package.
SSIS was creating the dbo.sysssislog table under System Tables and it was working just fine. Suddenly it stopped creating the table under System Tables and started creating it under User Tables. It is also no longer logging some events which were logged previously when the table was created under System Tables - events like PackageStart and the User:PackageStart/User:PackageEnd events for some tasks.
Can anyone please tell me what's going wrong here?
Whether the table shows up under System Tables or User Tables is fairly meaningless, but if you want it to appear the same as before, mark it as an MS-shipped (system) object:
EXECUTE sys.sp_MS_marksystemobject 'sysssislog'
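If you want to confirm the change took effect, one way (to the best of my knowledge this is what drives the System Tables grouping in Management Studio) is to check the IsMSShipped property:

-- Returns 1 once the table is marked as a Microsoft-shipped (system) object
SELECT OBJECTPROPERTY(OBJECT_ID('dbo.sysssislog'), 'IsMSShipped') AS IsMSShipped;

-- Or, equivalently, look at the catalog view
SELECT name, is_ms_shipped
FROM sys.tables
WHERE name = 'sysssislog';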
The way database logging works in the package deployment model is that SSIS will attempt to log to dbo.sysdtslog90/dbo.sysssislog (depending on your version), but if that table doesn't exist, it will create it for you. There is a copy of that table in the msdb catalog which is marked as a system object. When SSIS creates its own copy, it just uses DDL it has somewhere in the bowels of the code that does logging. You'll notice it also creates a stored procedure, sp_ssis_addlogentry, to assist in the logging.
As for your observation of inconsistent logging behaviour, all I can say is I've never seen that. The only reason it won't log an event is if the event doesn't occur - either a precursor condition didn't happen or the package errors out. If you can provide a reproducible scenario where it does and then doesn't log events, I'll be happy to tell you why it does/doesn't do it.
I am trying to configure auditing on my SQL Server using Service Broker. I did all the configuration needed to capture the DDL events (queue, routes, endpoints, event notification). It is working properly, except that I am not able to get the hostname of the client from which the DDL event originated.
In the Service Broker activation procedure, I tried reading the value from the message_body, but there is no XML element that contains the hostname. I can see a value for the SPID but am unable to make use of it. Exec'ing sp_who and querying sys.sysprocesses against this SPID doesn't return any value, and running sp_who without parameters shows only one process (I think it's the background process used by Service Broker). Is this all because the message was sent asynchronously? But why would that cause the activation context to see different data in the processes view?
I am aware that there are DDL triggers that can achieve the same goal, but a DDL trigger is tightly coupled to the command that causes it to fire, so if the trigger fails, the command will also fail.
UPDATE: I managed to retrieve the hostname by using a combination of xp_cmdshell and sqlcmd (the command-line app). But I also realized that, since the message is asynchronous, it is not always reliable (the SPID that issued the DDL command might already have disconnected before the message is read from the queue).
I'm not exactly sure what you're trying to implement here, but it's expected that an activated procedure will only see a subset of rows in the DMVs. This has to do with the activation context, which often impersonates a different user than the one you use when debugging the procedure. That impersonated user will only see the rows of server-level views and DMVs to which it has permissions. See here and here for more info.
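For what it's worth, the lookup I would expect inside the activation procedure looks roughly like this - a sketch only: it assumes the event notification XML has already been read from the queue, it needs VIEW SERVER STATE granted to the activation context to see other sessions, and it returns nothing if the originating session has already disconnected.

DECLARE @message_body XML;  -- in the real procedure this comes from RECEIVE ... CAST(message_body AS XML)
DECLARE @spid INT;

-- The DDL event notification carries the originating session id in <SPID>
SET @spid = @message_body.value('(/EVENT_INSTANCE/SPID)[1]', 'int');

-- Map the SPID to the client host name (plus a few other useful columns)
SELECT s.session_id, s.host_name, s.program_name, s.login_name
FROM sys.dm_exec_sessions AS s
WHERE s.session_id = @spid;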
I need to signal a running application (a Windows service) when certain things happen in SQL Server (2005). Is there a way to send a message from a trigger to an external application on the same system?
You can use a SQL Service Broker queue to do what you want.
The trigger can create a conversation and send a message on the queue.
When it starts, the external process should connect to the database and issue a WAITFOR (RECEIVE) statement on this queue. It will receive the message when the trigger sends it.
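A minimal sketch of both halves, assuming the message type, contract, queue and service objects already exist (all the //Demo/... names and dbo.ListenerQueue below are placeholders):

-- Inside the trigger: open a conversation and send a message.
DECLARE @handle UNIQUEIDENTIFIER;

BEGIN DIALOG CONVERSATION @handle
    FROM SERVICE [//Demo/NotifierService]
    TO SERVICE '//Demo/ListenerService'
    ON CONTRACT [//Demo/NotifyContract]
    WITH ENCRYPTION = OFF;

SEND ON CONVERSATION @handle
    MESSAGE TYPE [//Demo/NotifyMessage] (N'Something happened');

-- In the external process (an ordinary batch on an open connection):
-- blocks for up to 60 seconds waiting for a message to arrive.
WAITFOR (
    RECEIVE TOP (1)
        CONVERT(NVARCHAR(MAX), message_body) AS message_body,
        conversation_handle
    FROM dbo.ListenerQueue
), TIMEOUT 60000;

The service loops around the WAITFOR, so it sits idle until the trigger actually sends something instead of polling on a timer.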
Not sure DBAs would approve of this, but there is a way to run operating-system commands using xp_cmdshell.
"Executes a given command string as an operating-system command shell and returns any output as rows of text. Grants nonadministrative users permissions to execute xp_cmdshell."
Example from MS's site:
CREATE PROC shutdown10
AS
-- Warn connected users, pause SQL Server, then stop it via OS commands
EXEC xp_cmdshell 'net send /domain:SQL_USERS "SQL Server shutting down in 10 minutes. No more connections allowed."', no_output
EXEC xp_cmdshell 'net pause sqlserver'
WAITFOR DELAY '00:05:00'
EXEC xp_cmdshell 'net send /domain:SQL_USERS "SQL Server shutting down in 5 minutes."', no_output
WAITFOR DELAY '00:04:00'
EXEC xp_cmdshell 'net send /domain:SQL_USERS "SQL Server shutting down in 1 minute. Log off now."', no_output
WAITFOR DELAY '00:01:00'
EXEC xp_cmdshell 'net stop sqlserver', no_output
Either:
Use RAISERROR (severity 10) to fire a SQL agent alert and job.
Load a separate table that is polled periodically by a separate mail handling process. (as HLGEM suggested)
Use a stored procedure to send the message and write to the table.
Each solution decouples the transactional trigger from a potentially long messaging call.
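A rough sketch of the first option (the message number 60001 and the job name are made up; note the message has to reach the Windows application log - hence @with_log / WITH LOG - for an Agent alert to pick it up):

-- One-time setup: a user-defined message and an Agent alert tied to a job
EXEC sp_addmessage @msgnum = 60001, @severity = 10,
     @msgtext = N'Mail requested for order %d', @with_log = 'TRUE';

EXEC msdb.dbo.sp_add_alert @name = N'Send pending mail',
     @message_id = 60001,
     @job_name = N'Process mail queue';

-- In the trigger: raise the message; the alert starts the job asynchronously
RAISERROR (60001, 10, 1, 42) WITH LOG;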
You can send an email from a trigger, but it isn't a recommended practice because if the email system is down, no data changes can be made to the table.
Personally, if you can live with less than real time, I would write information about the event you are interested in to another table (so the real data change can go through smoothly even if email is down for some reason). Then I would have a job that checks that table every 5-10 minutes for any new entries and emails those out.
You can use a Database Mail message. It should not slow the trigger down even if the mail server is down, because the message is queued and then sent by an external (to SQL) process.
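For example (the profile name and recipient are placeholders for whatever you have configured; the call just drops the message on the Database Mail queue and returns without waiting for delivery):

EXEC msdb.dbo.sp_send_dbmail
     @profile_name = 'MyMailProfile',        -- an existing Database Mail profile
     @recipients   = 'someone@example.com',
     @subject      = 'Row inserted',
     @body         = 'A new row was inserted into MyTable.';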
The table idea sounds good if the application can access SQL Server.
You could also give access to that same table via SQL Server 2005 native XML Web Services, which expose stored procedures over XML:
http://msdn.microsoft.com/en-us/library/ms345123(SQL.90).aspx
Depending on what sort of message you want to send, you could use a CLR stored procedure to connect to a socket on the running process and write the message to that. If you have no control over the process (i.e. can't modify it) you could build a bridge or use a library that can issue a message in a suitable format.
For reliable delivery, you could do something that uses MSMQ to deliver the message.
A reminder that triggers can be problematic for stuff like this because they are embedded in set-based operations, and, being associated with tables, they aren't very sensitive to the context in which they fire. The problem comes when they fire on an operation that involves multiple rows, because it's easy to end up invoking your action once for every record in the operation - several hundred emails are not unlikely, for instance.
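To make that concrete, a quick sketch of the difference (table and column names are invented): the first form silently assumes one row per firing, while the second handles any batch size and only queues work instead of sending anything itself.

-- Risky: assumes exactly one row was inserted; with a multi-row INSERT this
-- picks an arbitrary single ID and silently ignores the rest.
DECLARE @id INT;
SELECT @id = ID FROM inserted;

-- Safer: set-based, one queued action per inserted row, no per-row loop.
INSERT INTO dbo.NotificationQueue (SourceID, QueuedAt)
SELECT ID, GETDATE()
FROM inserted;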
Hopefully "things that happen" can be detected in closer association with the context in which they happen (which also can be interesting to try to backtrack from a trigger.)