How do I check job status from SSIS control flow? - sql-server

Here's my scenario - I have an SSIS job that depends on another SSIS job having run first. I need to be able to check the first job's status before I kick off the second one. It's not feasible to add the second job into the workflow of the first one, as it is already far too complex. I want to be able to check the first job's status (Failed, Successful, Currently Executing) from the second one, and use that as a condition to decide whether the second one should run or wait for a retry. I know this can be done by querying the msdb database on the SQL Server running the job. I'm wondering if there is an easier way, such as using the WMI Data Reader Task. Has anyone had experience with this?
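For illustration, the msdb check mentioned above might look roughly like this (a sketch only; 'First SSIS Job' is a placeholder, and the joins may need adjusting for your version):

-- Rough sketch: derive the first job's state from msdb system tables.
SELECT TOP (1)
       j.name,
       CASE
           WHEN ja.start_execution_date IS NOT NULL
                AND ja.stop_execution_date IS NULL THEN 'Currently Executing'
           WHEN jh.run_status = 1                  THEN 'Successful'
           ELSE 'Failed'
       END AS job_state
FROM msdb.dbo.sysjobs            AS j
JOIN msdb.dbo.sysjobactivity     AS ja ON ja.job_id      = j.job_id
LEFT JOIN msdb.dbo.sysjobhistory AS jh ON jh.instance_id = ja.job_history_id
WHERE j.name = N'First SSIS Job'   -- placeholder job name
ORDER BY ja.start_execution_date DESC;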

You may want to create a third package that runs packageA and then packageB. The third package would only contain two Execute Package tasks.
http://msdn.microsoft.com/en-us/library/ms137609.aspx
#Craig
A status table is an option, but you will have to keep monitoring it.
Here is an article about events in SSIS for your original question.
http://www.databasejournal.com/features/mssql/article.php/3558006

Why not use a table? Just have the first job update the table with its status. The second job can use the table to check the status. That should do the trick if I am reading the question correctly. The table would (should) only have one row, so it won't kill performance and shouldn't cause any deadlocking (of course, now that I write it, it will happen) :)
#Jason: Yeah, you could monitor it, or you could have a trigger start the second job when the end status is received. :)
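A rough sketch of that status-table approach (all object and job names here are hypothetical):

CREATE TABLE dbo.JobStatus
(
    job_name   sysname     NOT NULL PRIMARY KEY,
    status     varchar(20) NOT NULL,   -- 'Running', 'Succeeded', 'Failed'
    updated_at datetime    NOT NULL DEFAULT (GETDATE())
);

-- Seed the single row once.
INSERT INTO dbo.JobStatus (job_name, status) VALUES (N'First SSIS Job', 'Running');

-- First job: set its status at the start and end of its workflow.
UPDATE dbo.JobStatus
SET status = 'Succeeded', updated_at = GETDATE()
WHERE job_name = N'First SSIS Job';

-- Second job: check the status before doing any real work.
IF EXISTS (SELECT 1 FROM dbo.JobStatus
           WHERE job_name = N'First SSIS Job' AND status = 'Succeeded')
    PRINT 'OK to run';   -- otherwise wait and retry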

Related

SSIS - what is being processed right now?

For a report, I need to know which sub-components (parts of the control flow) of an SSIS package are being processed right now. I know about the catalog view catalog.executable_statistics, but it seems a record is added there only after the execution of a given execution path has finished.
Is there a way to check what execution paths already entered pre-execute phase and not yet entered the post-execute phase? In other words, what is the package working on right now?
We use SQL Server 2016 Enterprise Edition.
Edit:
I prefer a solution that would work with the default logging level.
One option is querying catalog.execution_component_phases, which will display the most recently run execution phase of each sub-component within a Data Flow Task while the package is executing. This lets you see which component has started a phase such as PreExecute but has not yet begun a subsequent one like PostExecute or ReleaseConnections. To use this, you will also need to set the logging level to either Performance or Verbose.
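For example, something along these lines (a sketch; the execution_id value is a placeholder and would come from catalog.executions for the running package):

DECLARE @execution_id bigint = 12345;   -- placeholder: the running execution's ID

SELECT package_name,
       task_name,
       subcomponent_name,
       phase,
       start_time,
       end_time
FROM   SSISDB.catalog.execution_component_phases
WHERE  execution_id = @execution_id
  AND  end_time IS NULL      -- phase has begun but has not yet completed
ORDER  BY start_time;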
As far as I know there isn't any logging out of the box that will tell you the exact state of all subcomponents in the package when executing it from SQL server.
I simply use an Execute SQL Task at the start of some steps that inserts a row, and another when the step is done that updates it with specifics like start/end datetime, package name, and number of rows processed. You could add a column that specifies the subcomponents affected in this way.
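A rough sketch of that pattern (table and column names are just examples; the ? markers stand for the Execute SQL Task's parameter placeholders):

CREATE TABLE dbo.PackageStepLog
(
    log_id         int IDENTITY(1,1) PRIMARY KEY,
    package_name   nvarchar(260) NOT NULL,
    step_name      nvarchar(260) NOT NULL,
    started_at     datetime2     NOT NULL DEFAULT (SYSDATETIME()),
    finished_at    datetime2     NULL,
    rows_processed int           NULL
);

-- Execute SQL Task at the start of the step:
INSERT INTO dbo.PackageStepLog (package_name, step_name) VALUES (?, ?);

-- Execute SQL Task at the end of the step:
UPDATE dbo.PackageStepLog
SET    finished_at = SYSDATETIME(), rows_processed = ?
WHERE  log_id = ?;   -- log_id captured into a package variable after the insert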

How to run a database procedure after Siebel EIM job

I would like to run a database procedure after a successful EIM job run to collect information and record historical changes.
How can I call this database procedure automatically, without manual intervention?
BR
Hani
It depends on the solution approach you take. If everything has to be handled within Siebel, then a workflow that creates the job, monitors its result, and then proceeds with the second step seems feasible to me. An alternative would be to use a CI system like Jenkins/Automic to handle the orchestration part.

Need to split two SQL Server Agent Jobs into different schedules

I have two SQL Agent jobs that share the same schedule due to an error I made during the creation of the second job. I generated a script in SSMS and edited some values, but I left the schedule_uid the same. Now it turns out that when those two jobs run at the same time, they corrupt each other's data.
What I need to do is leave the original job alone, but create a new schedule and have the second job use this new schedule. However, my searches for the correct way to do this have all resulted in dead-ends.
None of this can be done using a UI; it all must be scripted so it can be run during a maintenance window without me present.
Thanks in advance for any assistance.
Use msdb.dbo.sp_detach_schedule followed by msdb.dbo.sp_add_jobschedule.
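A rough sketch of what that could look like (job and schedule names are placeholders; adjust the frequency settings to whatever you need):

USE msdb;
GO

-- 1. Detach the shared schedule from the second job only
--    (the first job and the schedule itself are left untouched).
EXEC dbo.sp_detach_schedule
     @job_name               = N'SecondJob',        -- placeholder
     @schedule_name          = N'SharedSchedule',   -- placeholder
     @delete_unused_schedule = 0;

-- 2. Create a new schedule attached directly to the second job,
--    e.g. daily at 02:30 so the two jobs no longer overlap.
EXEC dbo.sp_add_jobschedule
     @job_name          = N'SecondJob',
     @name              = N'SecondJob - daily 02:30',
     @enabled           = 1,
     @freq_type         = 4,        -- daily
     @freq_interval     = 1,        -- every 1 day
     @active_start_time = 023000;   -- HHMMSS
GO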

Get SQL Agent job status without polling?

I'm trying to find a way to have the SQL Server 'SQL Agent' run a particular piece of code on job step events. I had hoped that there was a way using SMO to register a callback method so that when job steps begin or change status, my code is called. I'm not having any success. Is there any way to have these events pushed to me, rather than polling?
There is no SMO, DDL, or trace event exposed for job execution (as far as I can see from Books Online), so I don't think you can do what you want directly. It would help if you could explain exactly what your goal is (and your MSSQL version); someone may have a useful suggestion. For example, a trace may be a better idea if you want to gather audit or performance data.
In the meantime, here are some ideas (mostly not very 'nice' ones):
Convert your jobs into SSIS packages (they have a full event model)
Build something into the job steps themselves
Log job step completion to a table, and use a trigger on the table to run your code (sketched below)
Run a trace with logging to a table and use a trigger on the table to run your code
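As a rough illustration of the log-table idea (every name here is hypothetical, and your own code would live in the procedure the trigger calls):

-- A final T-SQL step in each job inserts a row when the step completes.
CREATE TABLE dbo.JobStepLog
(
    log_id    int IDENTITY(1,1) PRIMARY KEY,
    job_name  sysname  NOT NULL,
    step_name sysname  NOT NULL,
    succeeded bit      NOT NULL,
    logged_at datetime NOT NULL DEFAULT (GETDATE())
);
GO

CREATE TRIGGER dbo.trg_JobStepLog_Notify
ON dbo.JobStepLog
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    -- React as soon as a completion row arrives instead of polling.
    EXEC dbo.HandleJobStepEvent;   -- hypothetical procedure containing your code
END;
GO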

Sequential Scheduling of Jobs

We have scheduled a number of jobs in SQL Server 2000. We want these jobs to be executed in sequential order, i.e. the failure of one job should prevent the next job from running. Can someone help me with doing this, or with creating dependencies between scheduled jobs?
You could define your jobs as steps of one single job. That way you can specify, for every step, whether the next step should be executed in case of error.
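A rough sketch using sp_add_jobstep (job, step, and command names are placeholders):

-- Step 1: the former first job; on failure the whole job stops and reports failure.
EXEC msdb.dbo.sp_add_jobstep
     @job_name          = N'Nightly load',            -- placeholder job
     @step_name         = N'Step 1 - former Job A',
     @subsystem         = N'TSQL',
     @command           = N'EXEC dbo.LoadStaging;',   -- placeholder command
     @on_success_action = 3,   -- go to the next step
     @on_fail_action    = 2;   -- quit the job, reporting failure

-- Step 2: the former second job; it only runs if step 1 succeeded.
EXEC msdb.dbo.sp_add_jobstep
     @job_name          = N'Nightly load',
     @step_name         = N'Step 2 - former Job B',
     @subsystem         = N'TSQL',
     @command           = N'EXEC dbo.LoadWarehouse;', -- placeholder command
     @on_success_action = 1,   -- quit the job, reporting success
     @on_fail_action    = 2;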
Rather than combining the jobs into one single block, it is better to divide them into pieces to simplify error detection and make management easier. It lets you control your process step by step. If your SQL jobs can be executed via batch files, you can use Windows Task Scheduler and define dependencies. But if the subject is more complex ETL process management, it is better to manage the process with a job scheduler.
I've done this in a queue system to cache data where there were 4 or 5 steps involved, and I had to allow delays for replication between the steps.
It was rather time-consuming to implement, as there were parent tasks which spawned 1 to n child steps that sometimes needed to be executed in order and sometimes in no particular order.
If you go down this path, you then need to create a location for error messages and process logs.
I highly recommend that, if it can in any way be created as one job with multiple steps, you use the existing SQL Server Agent job mechanism. Each individual step can be configured to exit on fail, continue on fail, email on fail, etc. It's rather flexible.
