Sequential Scheduling of Jobs - sql-server

We have scheduled a number of jobs in SQL Server 2000. We want these jobs to be executed in sequential order, i.e. the failure of one job should prevent the next job from running. Can someone help me with doing this, or with creating a dependency between scheduled jobs?

You could define your jobs as steps of one single job. That way you can specify, on every step, whether the next step should be executed in case of error.
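As a rough sketch (the job, step, database, and procedure names here are all hypothetical), the whole chain could be built as one Agent job whose steps stop the job on failure:

USE msdb;
GO
-- One job whose steps run in order; any failure stops the chain.
EXEC dbo.sp_add_job @job_name = N'NightlyLoad';
EXEC dbo.sp_add_jobstep
    @job_name          = N'NightlyLoad',
    @step_name         = N'Step 1 - Extract',
    @subsystem         = N'TSQL',
    @database_name     = N'Staging',
    @command           = N'EXEC dbo.usp_Extract;',
    @on_success_action = 3,   -- go to the next step
    @on_fail_action    = 2;   -- quit the job reporting failure
EXEC dbo.sp_add_jobstep
    @job_name          = N'NightlyLoad',
    @step_name         = N'Step 2 - Load',
    @subsystem         = N'TSQL',
    @database_name     = N'DWH',
    @command           = N'EXEC dbo.usp_Load;',
    @on_success_action = 1,   -- quit the job reporting success
    @on_fail_action    = 2;   -- quit the job reporting failure
-- Target the local server so the job can actually run.
EXEC dbo.sp_add_jobserver @job_name = N'NightlyLoad';
GO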

Rather than combining the jobs into one single block, it is better to divide the work into pieces to simplify error detection and make management easier. It lets you control your process step by step. If your SQL jobs can be executed via batch files, you can use Windows Task Scheduler and define dependencies there. But if the subject is a more complex ETL process, it is better to manage it with a dedicated job scheduler.

I've done this in a queue system to cache data, where there were 4 or 5 steps involved and I had to allow delays for replication between the steps.
It was rather time-consuming to implement, as there were parent tasks which spawned 1 to n child steps that sometimes needed to be executed in order and sometimes did not.
If you go down this path, you then need to create a location for error messages and process logs.
I highly recommend that, if it can in any way be created as one job with multiple steps, you use the existing SQL Server Agent. Each individual step can be configured to exit on fail, continue on fail, email on fail, etc. It's rather flexible.

Related

SQL Server Agent job dependency (not step)

We have 2 Jobs created in SQL Server Agent.
PreLoad
DWHLoad
Both jobs have several steps in their step lists.
DWHLoad needs to be run after successful completion of PreLoad Job.
As of now, I've scheduled PreLoad to run at 1:00AM and it finishes at 5:00AM.
DWHLoad is scheduled to run at 6:00AM to avoid issues if PreLoad is delayed for any reason.
I could gather the PreLoad steps into DWHLoad and run them as one job to maintain the dependency.
However, there are occasions where I need to run PreLoad separately, and the same is true for DWHLoad.
Is there a way to create dependency on Job and not on Job step?
i.e. Start DWHLoad only after successful completion of the PreLoad job?
Keep the 2 jobs you have and remove their schedules. This will allow you to right-click and start each job whenever you want to run it manually. You mentioned that each job has multiple steps, so you will need to create a 3rd job with the combined steps from both jobs in the order you need. Add a schedule to the 3rd job and you will have the dependency you want, with a scheduled job.
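For instance, a rough sketch of the schedule on the third job (its name and the 1:00AM start time are assumptions taken from the question; PreLoad and DWHLoad themselves keep no schedule so they can still be started by hand):

USE msdb;
GO
EXEC dbo.sp_add_jobschedule
    @job_name          = N'PreLoad_then_DWHLoad',
    @name              = N'Combined nightly load',
    @enabled           = 1,
    @freq_type         = 4,        -- daily
    @freq_interval     = 1,        -- every day
    @active_start_time = 010000;   -- HHMMSS, i.e. 1:00AM
GO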

SQL Server jobs precedence

I have a situation where I have a job that runs every day (Job A), a job that runs every 2 days (Job B) and another job that runs every weekend (Job C). I need to make sure that Job A runs before Job B. If Job A does not run appropriately then I don't want Job B to run. The same thing applies to Job C. Does anyone have any thoughts on how to go about this?
Appreciate any help
I have used a product called SQL Sentry to do what you are trying to do. SQL Sentry has a lot of other advanced monitoring and control functionality (like killing jobs that hang, queuing low priority jobs, etc). Here is their website https://sentryone.com/platform/sql-server-performance-monitoring.
This is a quote from one of their advertisements:
19. Chaining and Queuing
Did you ever wish you could find just a few more hours in your maintenance window, or need to have jobs run in a particular sequence? The advanced chaining features in SQL Sentry Event Manager can assure that interdependent jobs run in the proper order without wasting time or resources.
Chaining
SQL Sentry Event Manager allows you to chain SQL Agent Jobs, Windows Tasks, or Oracle Jobs across servers. You can enforce dependencies and automate workflow throughout your entire enterprise, even across platforms! The graphical chaining interface allows you to design the workflow using variables such as completion, success, or failure. More details are available in the User Guide, but take a few minutes to watch our two video tutorials on chaining - Graphical Chaining Interface and Advanced Chaining.
I need to make sure that Job A runs before Job B. If Job A does not run appropriately then I don't want Job B to run. The same thing applies to Job C.
Create all three jobs A, B, C and schedule only Job A. At the end of Job A, on the success event, call Job B like below:
EXEC msdb.dbo.sp_start_job N'Weekly Sales Data Backup';
GO
Now the same thing applies to Job C: call Job C on the success event of Job B.
I would go with this approach. You could also go with an approach of inserting success/failure values into a table and making sure Job B or C reads those values before starting, as sketched below.
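A minimal sketch of that table-based variant (the table, its columns, and the job names are all hypothetical):

-- Shared status table; one row per job per day.
CREATE TABLE dbo.JobRunStatus
(
    JobName   sysname NOT NULL,
    RunDate   date    NOT NULL,
    Succeeded bit     NOT NULL,
    CONSTRAINT PK_JobRunStatus PRIMARY KEY (JobName, RunDate)
);
GO
-- Final step of Job A: record today's outcome.
INSERT INTO dbo.JobRunStatus (JobName, RunDate, Succeeded)
VALUES (N'Job A', CAST(GETDATE() AS date), 1);
GO
-- First step of Job B: fail the step (and therefore the job, if the step
-- is set to quit on failure) unless Job A completed successfully today.
IF NOT EXISTS (SELECT 1 FROM dbo.JobRunStatus
               WHERE JobName = N'Job A'
                 AND RunDate = CAST(GETDATE() AS date)
                 AND Succeeded = 1)
    RAISERROR(N'Job A has not completed successfully today; stopping.', 16, 1);
GO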

How do you run SQL Server Merge Replication Jobs sequentially?

I work with an environment that uses Merge Replication to publish a dozen publications to half a dozen subscribers every 10 minutes. When certain jobs are running simultaneously, deadlocks and blocking are encountered and the replication process is not efficient.
I want to create a SQL Server Agent Job that runs a group of Merge Replication Jobs in a particular order waiting for one to finish before the next starts.
I created an SSIS package that starts the jobs in sequence, but it uses sp_start_job, which returns as soon as the job has been requested to start, so when the package runs it immediately kicks off all the jobs and they end up running together again.
A side purpose is to be able to disable replication to a particular server instead of individually disabling a dozen jobs or temporarily disabling replication completely to avoid 70+ individual disablings.
Right now, if I disable a Merge Replication job, the SSIS package will still start and run it anyway.
I have now tried creating an SSIS package for each Replication Job and then creating a SQL Server Agent job that calls these packages in sequence. That job takes 8 seconds to finish while the individual packages it is calling (starting a replication job) takes at least a minute to finish. In other words, that doesn't work either.
The SQL Server Agent knows when a Replication job finishes! Why doesn't an SSIS package or job step know? What is the point of having a control flow if it doesn't work?
Inserting waits is useless; the individual jobs can take anywhere from 1 second to an hour depending on what needs replicating.
Maybe I didn't see the real problem, but it is natural that you need a synchronization point, and there are many ways to create one.
For example, you could still run the jobs simultaneously but let the first job lock a resource that the second job needs, so the second waits until the resource is unlocked. Or the second job could poll a log table in a loop (waiting a "minute" between checks and cancelling itself after "an hour")...
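A sketch of the resource-lock variant using sp_getapplock (the resource name is arbitrary, and it assumes the second job only reaches this point after the first job has already taken the lock):

-- In the first job's step: hold an application lock while the work runs.
EXEC sp_getapplock @Resource = N'LoadChain', @LockMode = N'Exclusive',
                   @LockOwner = N'Session', @LockTimeout = 0;
-- ... the first job's real work here ...
EXEC sp_releaseapplock @Resource = N'LoadChain', @LockOwner = N'Session';

-- In the second job's step: block until the lock is free, give up after an hour.
DECLARE @rc int;
EXEC @rc = sp_getapplock @Resource = N'LoadChain', @LockMode = N'Exclusive',
                         @LockOwner = N'Session', @LockTimeout = 3600000;  -- milliseconds
IF @rc < 0
    RAISERROR(N'Timed out waiting for the first job; aborting.', 16, 1);
ELSE
BEGIN
    -- ... the second job's real work here ...
    EXEC sp_releaseapplock @Resource = N'LoadChain', @LockOwner = N'Session';
END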

Need to split two SQL Server Agent Jobs into different schedules

I have two SQL Agent jobs that share the same schedule due to an error I made during the creation of the second job. I generated a script in SSMS and edited some values, but I left the schedule_uid the same. Now it turns out that when those two jobs run at the same time, they corrupt each other's data.
What I need to do is leave the original job alone, but create a new schedule and have the second job use this new schedule. However, my searches for the correct way to do this have all resulted in dead-ends.
None of this can be done using a UI; it all must be scripted so it can be run during a maintenance window without me present.
Thanks in advance for any assistance.
Use msdb.dbo.sp_detach_schedule followed by msdb.dbo.sp_add_jobschedule.
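Something along these lines should work (the job and schedule names are placeholders, and the new daily 02:30 schedule is just an example):

USE msdb;
GO
-- Detach the shared schedule from the second job only; @delete_unused_schedule = 0
-- leaves the schedule in place for the first job.
EXEC dbo.sp_detach_schedule
    @job_name               = N'SecondJob',
    @schedule_name          = N'SharedSchedule',
    @delete_unused_schedule = 0;
-- Give the second job its own schedule.
EXEC dbo.sp_add_jobschedule
    @job_name          = N'SecondJob',
    @name              = N'SecondJob daily 0230',
    @enabled           = 1,
    @freq_type         = 4,        -- daily
    @freq_interval     = 1,        -- every day
    @active_start_time = 023000;   -- HHMMSS
GO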

Running SQL Agent job for concurrent databases

I have a job that will create a job for each of the databases in the SQL instance. I don't want the jobs to run sequentially. I need multiple databases to run at once, but I also want to make sure that I don't have too many databases running at the same time, which might hinder performance on the server.
Is there a way to specify the number of concurrent jobs that can run at the same time or manage the jobs in a way that new jobs won't get started until the number of active jobs is less than what I specify?
You could create a stored procedure that accepts the SQL command(s) and the number n of jobs you want to run in parallel, and let it create and start n jobs. It then goes into a loop that polls the msdb tables to see whether those jobs are still running; each time it notices there are fewer than n jobs active (or left), it starts a new job, until the entire set of databases has been handled. It should then wait for the last one to finish so the calling process knows everything has been processed. A sketch of such a loop follows the tips below.
Tips:
- Use the @job_id values returned by sp_add_job (via its @job_id OUTPUT parameter) to check whether a job is still running
- Use WITH (NOLOCK) on the system tables to avoid unnecessary locking; it doesn't really matter if you use 'dirty reads' when looking up the state of the jobs
- Use WAITFOR DELAY so you only check the tables every 5 seconds or so, otherwise your loop will eat too many resources!
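Putting those tips together, a rough sketch of such a loop (the #jobs_to_run temp table holding the per-database job names, and the limit of 4, are assumptions):

DECLARE @max_parallel int = 4;
DECLARE @job_name sysname;

DECLARE job_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT job_name FROM #jobs_to_run;          -- one row per per-database job
OPEN job_cursor;
FETCH NEXT FROM job_cursor INTO @job_name;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Wait while the number of our jobs still running is at the limit.
    WHILE (SELECT COUNT(*)
           FROM msdb.dbo.sysjobactivity AS ja WITH (NOLOCK)
           JOIN msdb.dbo.sysjobs        AS j  WITH (NOLOCK) ON j.job_id = ja.job_id
           WHERE j.name IN (SELECT job_name FROM #jobs_to_run)
             AND ja.session_id = (SELECT MAX(session_id) FROM msdb.dbo.syssessions WITH (NOLOCK))
             AND ja.run_requested_date IS NOT NULL
             AND ja.stop_execution_date IS NULL) >= @max_parallel
    BEGIN
        WAITFOR DELAY '00:00:05';               -- check every 5 seconds
    END

    EXEC msdb.dbo.sp_start_job @job_name = @job_name;
    WAITFOR DELAY '00:00:02';                   -- give the Agent a moment to register the start

    FETCH NEXT FROM job_cursor INTO @job_name;
END

CLOSE job_cursor;
DEALLOCATE job_cursor;

-- Finally, wait until the last of the jobs has stopped so the caller knows everything finished.
WHILE EXISTS (SELECT 1
              FROM msdb.dbo.sysjobactivity AS ja WITH (NOLOCK)
              JOIN msdb.dbo.sysjobs        AS j  WITH (NOLOCK) ON j.job_id = ja.job_id
              WHERE j.name IN (SELECT job_name FROM #jobs_to_run)
                AND ja.session_id = (SELECT MAX(session_id) FROM msdb.dbo.syssessions WITH (NOLOCK))
                AND ja.run_requested_date IS NOT NULL
                AND ja.stop_execution_date IS NULL)
BEGIN
    WAITFOR DELAY '00:00:05';
END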
