Best Method to Spawn Process from SQL Server Trigger - sql-server

How would I go about spawning a separate process using a SQL Server 05/08 trigger? Ideally, I would like to spawn the process and have SQL Server not wait on the process to finish execution. I need to pass a couple parameters from the insert that is triggering the process, but the executable would take care of the rest.

A bit of CLR integration, combined with SQL Service Broker, can help you here.
http://microsoft.apress.com/feature/70/asynchronous-stored-procedures-in-sql-server-2005

You want to use the extended stored procedure xp_cmdshell.
Info here: http://msdn.microsoft.com/en-us/library/ms175046.aspx

I saw that particular article, but didn't see an option to 'spawn and forget'. It seems like it waits for the output to be returned.
xp_cmdshell operates synchronously. Control is not returned to the caller until the command-shell command is completed.
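Since xp_cmdshell blocks, one pattern that does give fire-and-forget behaviour from a trigger is to have the trigger write its parameters to a work table and then start an unscheduled SQL Agent job with sp_start_job, which returns as soon as the job is started rather than when it finishes; the job (or the executable it launches) then reads the parameters from the work table. A rough sketch, with all table, column, and job names invented for illustration:
-- Work table the trigger fills; the spawned process reads and clears it
CREATE TABLE dbo.ProcessQueue (
    QueueId  INT IDENTITY(1,1) PRIMARY KEY,
    Param1   INT          NOT NULL,
    Param2   VARCHAR(100) NOT NULL,
    QueuedAt DATETIME     NOT NULL DEFAULT GETDATE()
);
GO
CREATE TRIGGER trg_SpawnProcess ON dbo.SourceTable
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    -- Save the parameters for every inserted row (a trigger fires once per statement)
    INSERT INTO dbo.ProcessQueue (Param1, Param2)
    SELECT i.Param1, i.Param2
    FROM inserted AS i;
    -- Start the unscheduled job that launches the executable; returns immediately
    EXEC msdb.dbo.sp_start_job @job_name = N'LaunchMyProcess';
END;
The caller needs rights to start jobs in msdb, and sp_start_job raises an error if the job is already running, so in practice the EXEC is usually wrapped in error handling.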

Related

SQL Server - schedule a one-time task to run after a currently running task finishes?

I'm running a stored procedure right now that will likely take 12 hours to complete, but the exact time is unknown. I have another stored procedure that I need to run once the currently running one is finished. Is it possible in SQL Server to schedule something to run, one time, e.g. "after dbo.storedProcedureName completes execution"?
Yesterday I tried the SQL Server Agent schedule option "Start whenever the CPUs become idle", but this did not work.
This may be possible with some add-in, or with a loop that monitors running processes, but that isn't really the best way. Instead, set up a SQL script that runs the two in sequence, e.g.:
EXEC dbo.MyLongRunningProc;
EXEC dbo.MyOtherProc;
These are guaranteed to run in sequence, one right after the other, when run from inside SSMS or a similar tool.
You can also create and execute (or schedule) a procedure that runs both in order:
create procedure runBothInOrder
as
begin tran
    exec procedure1;   -- the long-running procedure
    exec procedure2;   -- runs only after the first one completes
commit

Problems in executing an SP that has a while(true) loop in SQL Server

I know it is not a good way and may have many issues, but can anyone tell me what the problems are with executing an SP that contains a while loop?
CREATE PROC udpParseData
AS
BEGIN
    WHILE (1 = 1)   -- T-SQL has no true literal; 1 = 1 loops until the session is killed
    BEGIN
        --logic goes here
    END
END
GO
EXEC udpParseData
I want to run it like a service, instead of running a Windows service that checks the DB continuously using SqlDependency. Are there any problems with this? My main concern is how to stop the running SP udpParseData once it has started. One more option I have is to run the same SP from a scheduled job. I want to know the restrictions and disadvantages of each approach.
You don't want to do this - even a Windows service doesn't work like that. A Windows service would leverage a Timer that wakes up on an interval and starts the message pump.
Write the work you want to execute in a stored procedure, then create a job in SQL Server Agent that executes that stored procedure, and set up the schedule you want it to run on. The job takes the place of the Timer in the Windows service.
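A rough sketch of that setup, assuming the WHILE loop is removed from dbo.udpParseData so each run does a single pass over the data, and that running it every 5 minutes is acceptable (the job, schedule, and database names are placeholders):
USE msdb;
GO
-- Create the job and point a T-SQL step at the stored procedure
EXEC dbo.sp_add_job @job_name = N'ParseDataJob';
EXEC dbo.sp_add_jobstep
    @job_name      = N'ParseDataJob',
    @step_name     = N'Run parse procedure',
    @subsystem     = N'TSQL',
    @database_name = N'MyDatabase',
    @command       = N'EXEC dbo.udpParseData;';
-- Run every 5 minutes instead of looping forever inside the procedure
EXEC dbo.sp_add_jobschedule
    @job_name = N'ParseDataJob',
    @name     = N'Every5Minutes',
    @freq_type = 4,                 -- daily
    @freq_interval = 1,
    @freq_subday_type = 4,          -- interval unit: minutes
    @freq_subday_interval = 5;
-- Register the job with this server so the Agent will run it
EXEC dbo.sp_add_jobserver @job_name = N'ParseDataJob', @server_name = N'(local)';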

Continue to run a SQL query after the connection closes

In SQL Server, is it possible to call a long-running stored procedure, close the connection and let it finish? I don't care about the result. Essentially I want to say "Hey SQL, do this work on your own server."
I want to avoid having to make some service to sit with the open connection and wait for it.
You'll want to use BeginExecuteNonQuery(AsyncCallback, Object), and then have a callback method that effectively does nothing. It's worth mentioning that the connection won't actually be closed, however.
MSDN SqlCommand.BeginExecuteNonQuery Method
Alternatively, you could use Service Broker to queue the request to run a stored procedure. There's a fair bit of plumbing to set up, though. It might not be worth it in your situation. Especially if you aren't using Service Broker for anything else. The advantage here is that the connection can be closed immediately after queuing with Service Broker, and placing messages into a queue is a very quick operation.
You could also use a SQL Agent job with no schedule to run the procedure, then issue a SQL command to start the job with "EXEC msdb.dbo.sp_start_job @job_name = ''". If you have to change the command frequently it may not be ideal, but that can be done through SQL calls if needed.
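For the Service Broker route mentioned above, a skeletal version of the plumbing might look like this; all names are placeholders, and error handling, poison-message handling, and conversation cleanup are left out:
-- One-off setup: message type, contract, queues, and services
CREATE MESSAGE TYPE RunProcRequest VALIDATION = WELL_FORMED_XML;
CREATE CONTRACT RunProcContract (RunProcRequest SENT BY INITIATOR);
CREATE QUEUE dbo.RunProcInitiatorQueue;
CREATE QUEUE dbo.RunProcTargetQueue;
CREATE SERVICE RunProcInitiatorService ON QUEUE dbo.RunProcInitiatorQueue;
CREATE SERVICE RunProcTargetService ON QUEUE dbo.RunProcTargetQueue (RunProcContract);
GO
-- Activated procedure: dequeues one request and runs the long procedure
CREATE PROCEDURE dbo.RunProcActivation
AS
BEGIN
    DECLARE @handle UNIQUEIDENTIFIER;
    RECEIVE TOP (1) @handle = conversation_handle
    FROM dbo.RunProcTargetQueue;
    IF @handle IS NOT NULL
    BEGIN
        EXEC dbo.MyLongRunningProc;   -- the actual work
        END CONVERSATION @handle;
    END
END;
GO
-- Let SQL Server run the procedure whenever a message arrives on the queue
ALTER QUEUE dbo.RunProcTargetQueue
WITH ACTIVATION (
    STATUS = ON,
    PROCEDURE_NAME = dbo.RunProcActivation,
    MAX_QUEUE_READERS = 1,
    EXECUTE AS OWNER
);
GO
-- The caller just queues a request; it can close its connection immediately
DECLARE @h UNIQUEIDENTIFIER;
BEGIN DIALOG CONVERSATION @h
    FROM SERVICE RunProcInitiatorService
    TO SERVICE 'RunProcTargetService'
    ON CONTRACT RunProcContract
    WITH ENCRYPTION = OFF;
SEND ON CONVERSATION @h MESSAGE TYPE RunProcRequest (N'<run/>');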

Get hostname when reading Service Broker Queue (SQL Server 2005)

I am trying to configure auditing on my SQL Server using Service Broker. I did all the configuration needed to capture the DDL Events (queue, routes, endpoints, event notification). It is working properly except that I am not able to get the hostname of the client from where the DDL event originated from.
Using the Service Broker activation procedure, I tried reading the value from the message_body, but there is no XML element that contains the hostname. I can see a value for the SPID but am unable to make use of it. Exec'ing sp_who and querying sys.sysprocesses for this SPID doesn't return any value, and running sp_who without parameters shows only one process (I think it's the background process used by Service Broker). Is this all because the message was sent asynchronously? But why would that cause the activation context to see different data in the sys.sysprocesses view?
I am aware that there are DDL triggers that can achieve the same goal, but it seems it is tightly coupled to the command that causes it to fire. So if the triggers fails, the command will also fail.
UPDATE: I managed to retrieve the hostname by using a combination of xp_cmdshell and sqlcmd (the command-line app). But I also realized that, since the message is asynchronous, this is not always reliable (the SPID that issued the DDL command might already have disconnected before the message is read from the queue).
I'm not exactly sure what you're trying to implement here, but it is expected that the activated procedure will only see a subset of rows in the DMVs. This has to do with the activation context, which often impersonates a different user than the one you use when debugging the procedure. That impersonated user will only see the rows of server-level views and DMVs to which it has permission. See here and here for more info.
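If the activation procedure genuinely needs to see other sessions, one common fix (sketched below with invented names and file paths) is to sign it with a certificate, create a login from that certificate in master, and grant the login VIEW SERVER STATE:
-- In the database that owns the activated procedure
CREATE CERTIFICATE AuditSigningCert
    ENCRYPTION BY PASSWORD = 'Str0ng!Passw0rd'
    WITH SUBJECT = 'Server-level visibility for audit activation proc';
ADD SIGNATURE TO dbo.AuditActivationProc
    BY CERTIFICATE AuditSigningCert WITH PASSWORD = 'Str0ng!Passw0rd';
BACKUP CERTIFICATE AuditSigningCert TO FILE = 'C:\temp\AuditSigningCert.cer';
GO
-- In master: recreate the (public) certificate, map a login to it, grant the permission
USE master;
CREATE CERTIFICATE AuditSigningCert FROM FILE = 'C:\temp\AuditSigningCert.cer';
CREATE LOGIN AuditSigningLogin FROM CERTIFICATE AuditSigningCert;
GRANT VIEW SERVER STATE TO AuditSigningLogin;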

Send message from SQL Server trigger

I need to signal a running application (Windows service) when certain things happen in SQL Server (2005). Is there a possibility to send a message from a trigger to an external application on the same system?
You can use a SQL Service Broker queue to do what you want.
The trigger can create a conversation and send a message on the queue.
When it starts, the external process should connect to the database and issue a WAITFOR (RECEIVE) statement on this queue. It will receive the message when the trigger sends it.
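A minimal sketch of the two sides, assuming the message type (NotifyMessage), contract (NotifyContract), queues, and services have already been created, and with all object names invented for illustration:
-- Trigger side: open a dialog and send the interesting rows as XML
CREATE TRIGGER trg_NotifyApp ON dbo.WatchedTable
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @h UNIQUEIDENTIFIER;
    DECLARE @body XML;
    SET @body = (SELECT * FROM inserted FOR XML AUTO, ROOT('rows'));
    BEGIN DIALOG CONVERSATION @h
        FROM SERVICE NotifySenderService
        TO SERVICE 'NotifyTargetService'
        ON CONTRACT NotifyContract
        WITH ENCRYPTION = OFF;
    SEND ON CONVERSATION @h MESSAGE TYPE NotifyMessage (@body);
END;
GO
-- External process side: block on the queue until the trigger sends something
WAITFOR (
    RECEIVE TOP (1)
        conversation_handle,
        message_type_name,
        message_body
    FROM dbo.NotifyTargetQueue
), TIMEOUT 60000;   -- wake up at least once a minute even if nothing arrives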
Not sure DBAs would approve of this, but there is a way to run commands using xp_cmdshell:
"Executes a given command string as an operating-system command shell and returns any output as rows of text. Grants nonadministrative users permissions to execute xp_cmdshell."
Example from MS's site:
CREATE PROC shutdown10
AS
EXEC xp_cmdshell 'net send /domain:SQL_USERS ''SQL Server shutting down in 10 minutes. No more connections allowed.''', no_output
EXEC xp_cmdshell 'net pause sqlserver'
WAITFOR DELAY '00:05:00'
EXEC xp_cmdshell 'net send /domain:SQL_USERS ''SQL Server shutting down in 5 minutes.''', no_output
WAITFOR DELAY '00:04:00'
EXEC xp_cmdshell 'net send /domain:SQL_USERS ''SQL Server shutting down in 1 minute. Log off now.''', no_output
WAITFOR DELAY '00:01:00'
EXEC xp_cmdshell 'net stop sqlserver', no_output
Either:
Use RAISERROR (severity 10) to fire a SQL agent alert and job.
Load a separate table that is polled periodically by a separate mail handling process. (as HLGEM suggested)
Use a stored procedure to send the message and write to the table.
Each solution decouples the transactional trigger from a potentially long messaging call.
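As a sketch of the first of those options (all names and numbers invented): define a user message, create an alert on it that starts an existing Agent job, and raise the message from the trigger WITH LOG so the alert can see it:
-- One-off setup: a user-defined message and an alert that starts a job
EXEC sp_addmessage @msgnum = 50001, @severity = 10,
    @msgtext = N'Row inserted that needs external processing';
EXEC msdb.dbo.sp_add_alert
    @name = N'ProcessInsertedRows',
    @message_id = 50001,
    @severity = 0,
    @job_name = N'ProcessInsertedRowsJob';   -- existing Agent job that does the work
GO
-- In the trigger: WITH LOG writes the message to the error/application log
-- so SQL Agent can fire the alert (requires ALTER TRACE permission or sysadmin)
RAISERROR (50001, 10, 1) WITH LOG;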
You can send an email from a trigger, but it isn't a recommended practice because if the email system is down, no data changes can be made to the table.
Personally, if you can live with less than real time, I would write information about the event you are interested in to another table (so the real change of data can go through smoothly even if email is down for some reason). Then I would have a job that checks that table every 5-10 minutes for any new entries and emails those out.
You can use a dbmail email message. It should not slow the trigger down if the mail server is down, because the message is queued and then sent by an external (to SQL) process.
The table idea sounds good if the application can access SQL Server.
You could also give access to that same table via SQL Server 2005 Native XML Web Services, which exposes procs through XML.
http://msdn.microsoft.com/en-us/library/ms345123(SQL.90).aspx
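For reference, a minimal Database Mail call looks roughly like this; the profile name and recipient are placeholders, and Database Mail must already be configured:
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = N'DefaultMailProfile',
    @recipients   = N'ops-team@example.com',
    @subject      = N'Row inserted in WatchedTable',
    @body         = N'A row that needs attention was inserted.';
-- sp_send_dbmail only queues the message; the external DatabaseMail process sends it,
-- so the trigger is not blocked if the mail server is down.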
Depending on what sort of message you want to send, you could use a CLR stored procedure to connect to a socket on the running process and write the message to that. If you have no control over the process (i.e. can't modify it) you could build a bridge or use a library that can issue a message in a suitable format.
For reliable delivery, you could do something that uses MSMQ to deliver the message.
A reminder that triggers can be problematic for stuff like this because they are embedded in set operations, and, being associated with tables, they aren't very sensitive to the context in which they are fired. The problem arises when they fire on an operation that involves multiple rows, because it is hard to avoid invoking as many instances of your action as there are records in the operation; several hundred emails from one statement is not unlikely, for instance.
Hopefully the "things that happen" can be detected in closer association with the context in which they happen (which can also be interesting to try to backtrack from a trigger).
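For instance, a trigger that queues one summary row per statement (however many rows were inserted), rather than acting once per row, keeps the set-based nature intact; the table and column names here are invented:
CREATE TRIGGER trg_QueueEvents ON dbo.Orders
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    -- One queue entry per statement, not one per inserted row
    INSERT INTO dbo.EventQueue (EventType, RowsAffected, QueuedAt)
    SELECT 'OrderInsert', COUNT(*), GETDATE()
    FROM inserted
    HAVING COUNT(*) > 0;   -- skip statements that touched no rows
END;
A polling job or an activation procedure can then fan those queued entries out to email or to an external process at whatever rate is safe.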
