Execute stored procedure from a Trigger after a time delay - sql-server

I want to call a stored procedure from a trigger. How can I execute that stored procedure after x minutes?
I'm looking for something other than WAITFOR DELAY.
Thanks

Have a SQL Agent job that runs regularly and pulls stored procedure parameters from a table. The rows should also indicate when their run of the stored procedure is due, so the SQL Agent job only picks up rows that are due or slightly overdue. It should delete the rows, or mark them, after calling the stored procedure.
Then, in the trigger, just insert a new row into this same table.
You do not want to be putting anything in a trigger that will affect the execution of the original transaction in any way - you definitely don't want to be causing any delays, or interacting with anything outside of the same database.
E.g., if the stored procedure is
CREATE PROCEDURE DoMagic
    @Name varchar(20),
    @Thing int
AS
    ...
Then we'd create a table:
CREATE TABLE MagicDue (
MagicID int IDENTITY(1,1) not null, --May not be needed if other columns uniquely identify
Name varchar(20) not null,
Thing int not null,
DoMagicAt datetime not null
)
And the SQL Agent job would do:
WHILE EXISTS(SELECT * FROM MagicDue WHERE DoMagicAt < CURRENT_TIMESTAMP)
BEGIN
    DECLARE @Name varchar(20)
    DECLARE @Thing int
    DECLARE @MagicID int
    SELECT TOP 1 @Name = Name, @Thing = Thing, @MagicID = MagicID FROM MagicDue WHERE DoMagicAt < CURRENT_TIMESTAMP
    EXEC DoMagic @Name, @Thing
    DELETE FROM MagicDue WHERE MagicID = @MagicID
END
And the trigger would just have:
CREATE TRIGGER Xyz ON TabY after insert
AS
/*Do stuff, maybe calculate some values, or just a direct insert?*/
insert into MagicDue (Name,Thing,DoMagicAt)
select YName,YThing+1,DATEADD(minute,30,CURRENT_TIMESTAMP) from inserted
If you're running an edition that doesn't support Agent, then you may have to fake it. What I've done in the past is to create a stored procedure that contains the "poor man's agent job", something like:
CREATE PROCEDURE DoBackgroundTask
AS
WHILE 1=1
BEGIN
/* Add whatever SQL you would have put in an agent job here */
WAITFOR DELAY '00:05:00'
END
Then, create a second stored procedure, this time in the master database, which waits 30 seconds and then calls the first procedure:
CREATE PROCEDURE BootstrapBackgroundTask
AS
WAITFOR DELAY '00:00:30'
EXEC YourDB..DoBackgroundTask
And then, mark this procedure as a startup procedure, using sp_procoption:
EXEC sp_procoption N'BootstrapBackgroundTask', 'startup', 'on'
And restart the service - you'll now have a continuously running query.

I had a similar situation where, before I processed the records inserted into the table by the trigger, I wanted to make sure all the relevant related data in the relational tables was also there.
My solution was to create a scratch table which was populated by the insert trigger on the first table.
The scratch table had an updated flag (default set to 0), an insert date field defaulting to GETDATE(), and the relevant identifier from the main table.
I then created a scheduled process to loop over the scratch table and perform whatever processing I wanted against each record individually, updating the updated flag as each record was processed.
BUT, here is where I was a wee bit clever: in the loop-over process looking for records in the scratch table with update flag = 0, I also added the clause AND DATEDIFF(mi, Updated_Date, GETDATE()) > 5. So a record would not actually be processed until 5 minutes AFTER it was inserted into the scratch table. A rough sketch of the idea is below.
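Something along these lines (all object names here are invented for illustration, not my actual tables):
CREATE TABLE ScratchQueue (
    MainTableID int not null,
    Updated_Flag bit not null default 0,
    Updated_Date datetime not null default GETDATE()
)
GO
-- Insert trigger on the main table just queues the row
CREATE TRIGGER trMainTable_Insert ON MainTable AFTER INSERT
AS
    INSERT INTO ScratchQueue (MainTableID)
    SELECT ID FROM inserted
GO
-- Scheduled process: only pick up rows that have sat for at least 5 minutes
SELECT MainTableID
FROM ScratchQueue
WHERE Updated_Flag = 0
  AND DATEDIFF(mi, Updated_Date, GETDATE()) > 5
-- ...process each record, then mark it:
-- UPDATE ScratchQueue SET Updated_Flag = 1 WHERE MainTableID = @CurrentID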


Stored procedure - truncate table

I've created a stored procedure to add data to a table. In mock fashion the steps are:
truncate original table
Select data into the original table
The query that selects data into the original table is quite long (it can take almost a minute to complete), which means that the table is then empty of data for over a minute.
To fix this empty table I changed the stored procedure to the following (a rough sketch of the new shape follows the list):
select data into #temp table
truncate Original table
insert * from #temp into Original
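Roughly, in T-SQL (table and column names are placeholders for the real ones):
SELECT col1, col2
INTO #temp
FROM dbo.SourceTables       -- the long-running select

TRUNCATE TABLE dbo.OriginalTable

INSERT INTO dbo.OriginalTable (col1, col2)
SELECT col1, col2 FROM #temp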
While the stored procedure was running, I did a select * on the original table and it was empty (refreshing, it stayed empty until the stored procedure completed).
Does the truncate happen at the beginning of the procedure no matter where it actually is in the code? If so is there something else I can do to control when the data is deleted?
A very interesting method to move data into a table very quickly is to use partition switching.
Create two staging tables, myStaging1 and myStaging2, with the new data in myStaging2. They must be in the same DB and the same filegroup (so not temp tables or table variables), with the EXACT same columns, PKs, FKs and indexes.
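For illustration, suppose the target table were just (Id int primary key, Val varchar(50)); that shape is made up, but the staging tables would get exactly the same definition:
CREATE TABLE myStaging1 (Id int NOT NULL PRIMARY KEY, Val varchar(50) NULL);
CREATE TABLE myStaging2 (Id int NOT NULL PRIMARY KEY, Val varchar(50) NULL);

-- load the fresh data into myStaging2 here, e.g.:
-- INSERT INTO myStaging2 (Id, Val) SELECT Id, Val FROM SomeSourceQuery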
Then run this:
SET XACT_ABORT, NOCOUNT ON; -- force immediate rollback if session is killed
BEGIN TRAN;
ALTER TABLE myTargetTable SWITCH TO myStaging1
WITH ( WAIT_AT_LOW_PRIORITY ( MAX_DURATION = 1 MINUTES, ABORT_AFTER_WAIT = BLOCKERS ));
-- not strictly necessary to use WAIT_AT_LOW_PRIORITY but better for blocking
-- use SELF instead of BLOCKERS to kill your own session
ALTER TABLE myStaging2 SWITCH TO myTargetTable
WITH (WAIT_AT_LOW_PRIORITY (MAX_DURATION = 0 MINUTES, ABORT_AFTER_WAIT = BLOCKERS));
-- force blockers off immediately
COMMIT TRAN;
TRUNCATE TABLE myStaging1;
This is extremely fast, as it's just a metadata change.
You might ask: partitions are only supported on Enterprise Edition (or Developer), so how does that help?
Switching non-partitioned tables between each other is still allowed even in Standard or Express Editions.
See this article by Kendra Little for further info on this technique.
The sp is being called by code in an HTTP GET, so I didn't want the table to be empty for over a minute during the refresh. When I asked the question I was using a select * from the table to test, but just now I tested by hitting the endpoint in Postman and I never received an empty response. So it appears that putting the truncate later in the sp did work.

Run job after trigger - SQL Server

Each time a column changes (for each row in the table), I would like to run a job after a trigger.
The job should wait for 5 minutes and then run a stored procedure. I have made something, but when running it, it looks like the whole database is being locked, and I don't want the database to be locked while there are thousands of requests at the same time.
CREATE TRIGGER AfterUPDATETrigger
ON [TmpTable]
FOR UPDATE
AS
DECLARE @EmpID INT, @EmpName VARCHAR(50)
SELECT @EmpID = ID FROM foo;
SELECT @EmpName = Name FROM foo;
IF UPDATE(TimeSpan)
BEGIN
    EXEC io_sp_delete_reservation @EmpID
    WAITFOR DELAY '00:05:00.000';
END

Stored procedure that creates #table - unable to find it in list of tables

I'm trying to run a stored procedure that creates a local table - #table1
The stored procedure is supposed to look for values and create the table and insert the values into it...
INSERT INTO #table1
I execute the stored procedure and it shows '(1 row(s) affected)'; however, I am unable to find this table in the list of my tables. Why am I not able to see it or access it?
EDIT: I'm running the stored procedure inside SQL Server against a database. At the end of the stored procedure, the last line is:
Select * from #table1
Thanks.
The #table is a local temp table. It does not exist as a permanent table that you can look for outside the scope of the stored proc. Once the stored proc has run, the temp table is dropped because it is no longer in scope. Temp tables are stored temporarily in the tempdb database, but under a different name, because two people running the stored procedure at the same time would each have a table that can be referenced in the proc as #table, yet they would be two separate tables in tempdb.
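You can see this for yourself in an ad-hoc query window (a small illustration, separate from the original proc):
CREATE TABLE #table1 (id int);

-- The table lives in tempdb under a padded, session-unique name
SELECT name
FROM tempdb.sys.objects
WHERE name LIKE '#table1%';
-- e.g. #table1________________ ... _______00000000001D

DROP TABLE #table1;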
Now if what you are doing is looking to see what is in #table at a point in the stored proc in order to troubleshoot the proc, then you need to set things up in the proc so that you can see the results at different stages, or when you hit a certain state such as an error.
This could be something like adding a @debug parameter to the proc, so that when you are in debug mode you can select the results to the screen, running something like:
CREATE PROC test_proc (@Id INT, @debug BIT = 0)
AS
CREATE TABLE #temp(id INT)

INSERT INTO #temp
VALUES (@Id), (1), (2)

IF @debug = 1
BEGIN
    SELECT * FROM #temp
END

UPDATE #temp
SET Id = id - 1

IF @debug = 1
BEGIN
    SELECT * FROM #temp
END
GO
You would then execute the proc without debugging like so (note that since I am not returning anything or inserting into permanent tables, this proc will insert into #temp but you won't see anything; I just didn't want to get complicated here. The steps of the proc will vary depending on what you want to do; the concept I am trying to show is how to use the debug parameter):
EXEC test_proc @Id = 5
and with debugging as:
EXEC test_proc @Id = 5, @debug = 1
Or it might involve using a table variable instead (because table variables don't get rolled back on error) and then inserting the data from that table variable into a logging table after the rollback occurs in the CATCH block, so that you can see the values at the time the error occurred. A rough sketch of that pattern is below.
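Something like this (the proc, table variable, and ErrorLog table are all invented names, just to show the shape):
CREATE PROC risky_proc (@Id INT)
AS
DECLARE @audit TABLE (id INT, note VARCHAR(100))
BEGIN TRY
    BEGIN TRAN
    INSERT INTO @audit VALUES (@Id, 'before risky work')
    -- ...risky work against permanent tables goes here...
    COMMIT TRAN
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRAN;              -- permanent changes are undone, @audit keeps its rows
    INSERT INTO ErrorLog (id, note) -- hypothetical logging table
    SELECT id, note FROM @audit;
END CATCH
GO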
Without knowing more about why you are looking for #temp and what the data means and is used for, it is hard to say what you need to do.
Did you try refreshing the tables after executing the stored procedure?

TSQL: use global ##temp for multiple runs of same sp, how to reuse?

I have to run the same sp with 100 different @params once a month. It references data that is hard to get (the view runs 2 min, and I need only a 2% subset from this view). I want to create some ##temp table so that all instances of my sp will refer to it. How do people do this in the T-SQL arena?
Can I include:
If exists then do nothing
at the top of the sp code, so it will run only once? And then delete the table in a separate cleanup job. Or do some ##temp tables; I'm a bit new to this. Not sure whether a ## global temp table will work. Also, do I need to go with some special isolation level (serializable?) to do this.
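Something like this is what I had in mind (just a sketch; ##RefData and the view name are made up):
IF OBJECT_ID('tempdb..##RefData') IS NULL
BEGIN
    SELECT *
    INTO ##RefData
    FROM Expensive_View      -- the 2-minute view
    WHERE SomeFilter = 1     -- the 2% subset I actually need
END
-- ...rest of the sp reads from ##RefData...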
Tx for help.
Mario
If the procedure only runs once a month, and the view takes its time to run (expensive query), why not just create a table (maybe create some indexes too, to aid the following queries) and then create another procedure to populate that table each time you execute your original procedure.
-- Create a holding table
SELECT * INTO Holding_Table
FROM Your_View
WHERE 1 = 2
-- Procedure to populate data into that holding table
CREATE PROCEDURE populate_data
AS
BEGIN
TRUNCATE TABLE Holding_Table
INSERT INTO Holding_Table
SELECT * FROM Your_View
END
Now call this procedure from your existing procedure to populate the data and carry on working with the holding table as usual ...
ALTER PROCEDURE your_existing_Proc
AS
BEGIN
Exec populate_data
..... rest of the procedure definition

TSQL : Timeouts on High traffic table

I'm having issues with timeouts on a table of mine.
Example table:
Id BIGINT,
Token uniqueidentifier,
status smallint,
createdate datetime,
updatedate datetime
I'm inserting data into this table from 2 different stored procedures that are wrapped in a transaction (with specific escalation), and also from 1 job that executes once every 30 secs.
I'm getting timeouts from only 1 of them, and the weird thing is that it's from the simple one:
BEGIN TRY
    BEGIN TRAN
    INSERT INTO [dbo].[TempTable](Id, AppToken, [Status], [CreateDate], [UpdateDate])
    VALUES(@Id, NEWID(), @Status, GETUTCDATE(), GETUTCDATE())
    COMMIT TRAN
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRAN;
END CATCH
When there is some traffic on this table (TempTable), this procedure keeps getting timeouts.
I checked the execution plans and it seems I haven't missed any indexes in either stored procedure.
Also, the only index on TempTable is the clustered PK on Id.
Any ideas?
If more information is needed, do tell.
The 2nd stored procedure using this table isn't causing any big IO or anything like that.
The job, however, does an atomic UPDATE on this table and at the end DELETEs from it, but even when I checked during high IO on this table, the job takes no longer than 3 secs.
Thanks.
It is most probably because some other process is blocking your insert operation. It could be another insert, delete, or update, some trigger, or any other SQL statement.
To find out who is blocking your operation, you can use some easily available stored procedures like:
sp_who2
sp_WhoIsActive (my preferred)
While your insert statement is being executed/hung up, execute one of these procedures and see who is blocking you.
In sp_who2 you will see a column named BlkBy; get the SPID from that column and execute the following query:
DBCC INPUTBUFFER(71);
GO
This will return the last query executed by that process id. The SQL statement is not very well formatted: the whole query will be on one single line, and you will need to format it in SSMS to actually be able to read it.
On the other hand, sp_WhoIsActive will only return the queries that are blocking other processes, and will have the query formatted just as the user executed it. It will also give you the execution plan for that query.
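A minimal example of calling it (this assumes sp_WhoIsActive is installed on the server; the blocking_session_id column and the @get_plans parameter are part of that procedure, if I recall its options correctly):
-- Run from another session while the insert is hung
EXEC sp_who2;            -- check the BlkBy column for the blocking SPID

EXEC sp_WhoIsActive;     -- blocking_session_id shows who blocks whom
-- Optionally include the execution plans of the running statements:
-- EXEC sp_WhoIsActive @get_plans = 1;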
