How to run a database procedure after Siebel EIM job - database

I would like to run a database procedure after a successful EIM job run to collect information and record historical changes.
How can I call this database procedure automatically, without manual intervention?
BR
Hani

It depends on the solution approach you take. If everything has to be handled within Siebel, then a workflow that creates the job, monitors its result, and then proceeds with the second step seems feasible to me. An alternative would be to use a CI system like Jenkins/Automic to handle the orchestration part.
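If the orchestration ends up outside Siebel (for example a scheduler step that runs after the EIM batch), the database-side piece can be a simple conditional call. A minimal T-SQL sketch, assuming SQL Server; the status table and procedure names below are hypothetical and do not come from the question:

-- Sketch only: all object names below are hypothetical.
IF EXISTS (SELECT 1
           FROM dbo.EimBatchStatus            -- hypothetical table the load updates per batch
           WHERE batch_num = 100
             AND status = 'COMPLETED')
BEGIN
    EXEC dbo.usp_CollectEimHistory @BatchNum = 100;   -- hypothetical history-collection procedure
END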

Related

SSRS - Using ReportServer AddEvent, subscriptions not always processed

I'm trying to create a homegrown subscription system for our ReportServer for QA purposes. I'm creating subscriptions on the front end and setting them on a schedule to execute once in the past (1/1/2019 for example) so the only way they would be executed is manually.
I looked at the jobs and they are simply executing a procedure "ReportServer.dbo.AddEvent" with EventType and EventData as parameters. When I execute that procedure, it adds a record to the Event table and then the Notifications table for the RS to pick up and process the subscription. Most of the time it works but sometimes it just hangs. I'm finding this for data-driven subscriptions especially. They are stuck in pending/processing and never execute fully.
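For reference, the manual call described above is usually of this shape (a sketch: the GUID is only a placeholder SubscriptionID, and data-driven subscriptions use a different EventType):

-- Manually queue a standard (timed) subscription for processing.
EXEC ReportServer.dbo.AddEvent
    @EventType = N'TimedSubscription',
    @EventData = N'00000000-0000-0000-0000-000000000000';   -- placeholder SubscriptionID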
Has anyone else had issues with this before? I'm building out a homegrown system because we have 200+ reports that we want to execute on a nightly basis. They are usually pretty quick but I want to only execute a handful at a time and I'm taking care of this through a custom Queue table.
Figured it out within the log files... It was data-driven but each version had the same file name and it was trying to overwrite a file that was still open.

SQL server jobs precedence

I have a situation where I have a job that runs every day (Job A), a job that runs every 2 days (Job B), and another job that runs every weekend (Job C). I need to make sure that Job A runs before Job B. If Job A does not run appropriately then I don't want Job B to run. The same thing applies to Job C. Anyone have any thoughts on how to go about this?
Appreciate any help
I have used a product called SQL Sentry to do what you are trying to do. SQL Sentry has a lot of other advanced monitoring and control functionality (like killing jobs that hang, queuing low priority jobs, etc). Here is their website https://sentryone.com/platform/sql-server-performance-monitoring.
This is a quote from one of their advertisements:
19. Chaining and Queuing
Did you ever wish you could find just a few more hours in your maintenance window, or need to have jobs run in a particular sequence? The advanced chaining features in SQL Sentry Event Manager can assure that interdependent jobs run in the proper order without wasting time or resources.
Chaining
SQL Sentry Event Manager allows you to chain SQL Agent Jobs, Windows Tasks, or Oracle Jobs across servers. You can enforce dependencies and automate workflow throughout your entire enterprise, even across platforms! The graphical chaining interface allows you to design the workflow using variables such as completion, success, or failure. More details are available in the User Guide, but take a few minutes to watch our two video tutorials on chaining - Graphical Chaining Interface and Advanced Chaining.
I need to make sure that Job A runs before Job B. If Job A does not run appropriately then I don't want Job B to run. The same thing applies to Job C.
Create all jobs A, B, C and schedule only job A. At the end of job A, on the success event, call job B like below:
EXEC msdb.dbo.sp_start_job N'Weekly Sales Data Backup';
GO
Now the same thing applies to job C: call job C on the success event of job B.
I would go with this approach. You can also go with an approach of inserting success/failure values into a table and ensuring job B or C reads those values before starting.
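If you go the table route instead, the first step of job B (or C) can simply read the value and fail the step when job A did not succeed. A rough sketch, with hypothetical table and column names:

-- First step of job B (sketch): abort unless job A recorded success today.
-- dbo.JobStatus and its columns are hypothetical.
IF NOT EXISTS (SELECT 1
               FROM dbo.JobStatus
               WHERE job_name = N'Job A'
                 AND run_date = CAST(GETDATE() AS date)
                 AND outcome = 'Success')
BEGIN
    -- Severity 16 makes the job step fail, so the rest of job B never runs.
    RAISERROR (N'Job A has not completed successfully; aborting Job B.', 16, 1);
    RETURN;
END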

Control a trigger in SQL

I would like to know how we can control a trigger in SQL when the process is in progress.
Example: if any update, delete, or insert operation is performed on a table (the Employee table), I execute a batch file located on a Windows drive. The batch file takes around 5 minutes to complete the whole process. My question is this: suppose I have multiple applications connected to the same table (the Employee table), and an operation from one application fires the trigger and starts the batch process. If, in the meantime, another operation is performed from a second application, it triggers the batch file again. Because of this, performance degrades or the process crashes.
So I would like to know whether there is any way to control the trigger, such that until the batch file completes, the second trigger is kept on hold, and after completion the process is started again.
You could have a helper table where your stored procedure checks whether running = 0; if so, it writes 1 and the batch starts. When the batch ends, set running = 0.
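A rough sketch of that helper-table idea (the table and column names are just examples); doing the check and the flag update in a single statement avoids two sessions both seeing 0:

-- dbo.BatchFlag is a hypothetical one-row helper table with a single column: running (0 or 1).
UPDATE dbo.BatchFlag
SET running = 1
WHERE running = 0;

IF @@ROWCOUNT = 1
BEGIN
    -- This session "won" the flag, so it is safe to kick off the batch file here.
    -- When the batch finishes: UPDATE dbo.BatchFlag SET running = 0;
    PRINT 'Batch started';
END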
Ugh - Having a database trigger execute a batch script on the server sounds like a horrible idea. And it takes 5 minutes!!!???? You've pretty much abandoned the idea of concurrency, and it does not sound like a very stable system.
But assuming you really want to move forward with this design. I suggest that any process that performs DML on the table should establish an exclusive lock on the table before the DML is executed. Your trigger(s) are always executed after the DML is complete on SQL Server, which I suspect is too late.
I recommend that you force all DML on that table to be through stored procedures that always establish the table lock before proceeding. Your actions are thus guaranteed to be serialized, your batch script never executes more than once at any given time, and concurrent requests will be queued.
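A minimal sketch of such a wrapper procedure, assuming SQL Server; all object names are made up for illustration:

-- Hypothetical wrapper procedure that serializes writes to dbo.Employee.
CREATE PROCEDURE dbo.usp_UpdateEmployeeSalary
    @EmployeeId int,
    @NewSalary  decimal(10, 2)
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @dummy int;

    BEGIN TRANSACTION;

    -- Take an exclusive table lock before the DML; concurrent callers queue behind it.
    SELECT @dummy = COUNT(*) FROM dbo.Employee WITH (TABLOCKX, HOLDLOCK);

    UPDATE dbo.Employee
    SET Salary = @NewSalary
    WHERE EmployeeId = @EmployeeId;   -- the AFTER trigger (and your batch file) fires here

    COMMIT TRANSACTION;               -- the table lock is released only now
END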
You could restrict write access to the table by giving write access to a single account that owns the stored procedures, and then grant execute privilege on the procedures to appropriate users/roles.
Actually, I recommend that you abandon your design completely. But I can't suggest an alternative because I have no idea what led you to your current design.

Get SQL Agent job status without polling?

I'm trying to find a way to have the SQL Server 'SQL Agent' run a particular piece of code on job step events. I had hoped that there was a way using SMO to register a callback method so that when job steps begin or change status, my code is called. I'm not having any success. Is there any way to have these events pushed to me, rather than polling?
There is no SMO, DDL, or trace event exposed for job execution (as far as I can see from Books Online), so I don't think you can do what you want directly. It would be helpful if you could explain exactly what your goal is (and your MSSQL version) and someone may have a useful suggestion. For example, a trace may be a better idea if you want to gather audit or performance data.
In the meantime, here are some ideas (mostly not very 'nice' ones):
Convert your jobs into SSIS packages (they have a full event model)
Build something into the job steps themselves
Log job step completion to a table, and use a trigger on the table to run your code (a rough sketch of this follows below)
Run a trace with logging to a table and use a trigger on the table to run your code
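For the table-plus-trigger idea above, a minimal sketch might look like this (the log table, the callback procedure, and how the job steps write to the table are all assumptions):

-- Hypothetical log table that each job step writes a row into as its final action.
CREATE TABLE dbo.JobStepLog
(
    job_name  sysname     NOT NULL,
    step_name sysname     NOT NULL,
    outcome   varchar(20) NOT NULL,              -- 'Succeeded' / 'Failed'
    logged_at datetime    NOT NULL DEFAULT (GETDATE())
);
GO

CREATE TRIGGER trg_JobStepLog_Notify ON dbo.JobStepLog
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    -- React to the new row here instead of polling msdb.
    EXEC dbo.usp_OnJobStepEvent;   -- hypothetical callback procedure
END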

How do I check job status from SSIS control flow?

Here's my scenario - I have an SSIS job that depends on another prior SSIS job to run. I need to be able to check the first job's status before I kick off the second one. It's not feasible to add the 2nd job into the workflow of the first one, as it is already way too complex. I want to be able to check the first job's status (Failed, Successful, Currently Executing) from the second one, and use this as a condition to decide whether the second one should run, or wait for a retry. I know this can be done by querying the MSDB database on the SQL Server running the job. I'm wondering if there is an easier way, such as possibly using the WMI Data Reader Task? Anyone had this experience?
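For reference, the msdb route mentioned above is usually a query along these lines (a sketch; the job name is a placeholder):

-- Last recorded outcome of the first job.
-- run_status: 0 = failed, 1 = succeeded, 2 = retry, 3 = canceled, 4 = in progress
SELECT TOP (1) h.run_status
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobhistory AS h
  ON h.job_id = j.job_id
WHERE j.name = N'First SSIS Job'   -- placeholder job name
  AND h.step_id = 0                -- step 0 = the job outcome row
ORDER BY h.instance_id DESC;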
You may want to create a third package that runs packageA and then packageB. The third package would only contain two Execute Package tasks.
http://msdn.microsoft.com/en-us/library/ms137609.aspx
#Craig
A status table is an option but you will have to keep monitoring it.
Here is an article about events in SSIS for your original question.
http://www.databasejournal.com/features/mssql/article.php/3558006
Why not use a table? Just have the first job update the table with its status. The second job can use the table to check the status. That should do the trick if I am reading the question correctly. The table would (should) only have one row so it won't kill performance and shouldn't cause any deadlocking (of course, now that I write it, it will happen) :)
#Jason: Yeah, you could monitor it or you could have a trigger start the second job when the end status is received. :)
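Putting those two comments together, a rough sketch could be a trigger that starts the second job as soon as the first one records success (the status table and job name below are made up):

-- dbo.PackageStatus is a hypothetical one-row table the first job updates when it finishes.
CREATE TRIGGER trg_PackageStatus_StartNext ON dbo.PackageStatus
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    IF EXISTS (SELECT 1 FROM inserted WHERE status = 'Succeeded')
    BEGIN
        EXEC msdb.dbo.sp_start_job N'Second SSIS Job';   -- placeholder job name
    END
END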
