Is it possible in SQL Server to run several tasks simultaneously, in different sessions, under the same job?
For example, I have N stored procedures to run. They all have to run in different sessions and start at the same time. I don't want to create N jobs; I want all of them to start at the same time under one job.
In the past I've had one job create and start several other jobs using the sp_add_job command. If you set the delete level to 3, each job is automatically deleted once it has completed.
The disadvantages are security and the difficulty of monitoring all the spawned jobs.
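In case it's useful, here is a minimal sketch of that approach for a single procedure; the job, step, and procedure names are made up, and you would repeat this per procedure to fan out N simultaneous jobs:

-- Sketch: create and start a self-deleting job for one stored procedure.
-- @delete_level = 3 tells Agent to delete the job whenever it completes.
DECLARE @job_id UNIQUEIDENTIFIER;

EXEC msdb.dbo.sp_add_job
    @job_name     = N'OneOff_RunProc1',      -- hypothetical name
    @delete_level = 3,                       -- always delete on completion
    @job_id       = @job_id OUTPUT;

EXEC msdb.dbo.sp_add_jobstep
    @job_id    = @job_id,
    @step_name = N'run the proc',
    @subsystem = N'TSQL',
    @command   = N'EXEC dbo.Proc1;';         -- hypothetical procedure

-- A job must be targeted at a server before it can run (defaults to local).
EXEC msdb.dbo.sp_add_jobserver @job_id = @job_id;

EXEC msdb.dbo.sp_start_job @job_id = @job_id;  -- returns immediately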
I don't see any other option than using SSIS with one Execute SQL task per script, with no precedence constraints between them, and executing the package. This lets the different stored procedures or SQL scripts run in parallel. Thanks!
I have two SSIS ETL packages that I need to schedule to run on a daily basis. The two packages load data into two different staging databases, so they can be run in parallel. However, after both have finished executing, I need to call a separate job (stored procedures) to load data from the staging databases into the final database.
Does SQL Server Agent's job scheduling provide any features for tracking whether the previous two jobs completed successfully?
Any help is highly appreciated.
Thanks!
Your best bet is to wrap your two SSIS packages in a master package; they can be run in parallel within it.
Then create a job with this master package as step 1.
Step 2 can be EXEC msdb.dbo.sp_start_job for the job you need to run.
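A sketch of that second step, with a placeholder job name; note that sp_start_job only starts the job and returns immediately, it does not wait for the started job to finish:

-- Job step 2: kick off the follow-on load job.
EXEC msdb.dbo.sp_start_job @job_name = N'Load Final DB';  -- hypothetical job name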
I would recommend including a third 'control' package in your SSIS project. It contains Execute Package tasks that run the two packages in parallel inside a Sequence Container, followed by an Execute SQL Task, wired to the successful completion of the Sequence Container, that kicks off the stored procedure once both packages complete.
Doing this, you only need one Agent job, which runs the 'control' package.
I work with an environment that uses merge replication to publish a dozen publications to half a dozen subscribers every 10 minutes. When certain jobs run simultaneously, deadlocks and blocking are encountered and the replication process is not efficient.
I want to create a SQL Server Agent job that runs a group of merge replication jobs in a particular order, waiting for one to finish before the next starts.
I created an SSIS package that starts the jobs in sequence, but it uses sp_start_job, which returns as soon as a job is started; so when run, it immediately starts all the jobs and they end up running together again.
A side purpose is to be able to disable replication to a particular server instead of individually disabling a dozen jobs or temporarily disabling replication completely to avoid 70+ individual disablings.
Right now, if I disable a Merge Replication job, the SSIS package will still start and run it anyway.
I have now tried creating an SSIS package for each replication job and then a SQL Server Agent job that calls these packages in sequence. That job finishes in 8 seconds, while each of the individual packages it is calling (each starting a replication job) takes at least a minute. In other words, that doesn't work either.
The SQL Server Agent knows when a Replication job finishes! Why doesn't an SSIS package or job step know? What is the point of having a control flow if it doesn't work?
Inserting waits is useless. The individual jobs can take anywhere from 1 second to an hour depending on what needs replicating.
Maybe I'm not seeing the real problem, but it is natural that you need a synchronization point, and there are many ways to create one.
For example, you could still run the jobs simultaneously but have the first job lock a resource that the second job needs, so the second waits until the resource is unlocked. Or the second job could poll a log table in a loop (waiting a minute between checks and cancelling itself after an hour)...
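A minimal sketch of the lock-based variant, using an application lock; the resource name is made up, and both jobs must agree on it:

-- Job 1's step: hold an application lock for the duration of its work.
BEGIN TRAN;
EXEC sp_getapplock @Resource  = 'ReplSyncPoint',     -- agreed-on, made-up name
                   @LockMode  = 'Exclusive',
                   @LockOwner = 'Transaction';
-- ... job 1's replication work here ...
COMMIT;                                              -- releases the lock

-- Job 2's step: block until job 1's lock is gone, then proceed.
DECLARE @rc INT;
BEGIN TRAN;
EXEC @rc = sp_getapplock @Resource    = 'ReplSyncPoint',
                         @LockMode    = 'Exclusive',
                         @LockOwner   = 'Transaction',
                         @LockTimeout = 3600000;     -- give up after an hour
IF @rc >= 0   -- 0 or 1 means the lock was granted
BEGIN
    -- ... job 2's work here ...
END
COMMIT;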
I have a job that creates a job for each database on the SQL instance. I don't want the jobs to run sequentially; I need multiple databases processed at once. But I also want to make sure that I don't have so many databases running at the same time that it hinders performance on the server.
Is there a way to specify the number of concurrent jobs that can run at the same time, or to manage the jobs so that new ones aren't started until the number of active jobs drops below what I specify?
You could create a stored procedure that accepts the SQL command(s) and the number n of jobs you want to run in parallel. It creates and starts n jobs, then goes into a loop that polls the msdb tables to see whether those jobs are still running; each time it notices fewer than n jobs active (or left), it starts a new one, until the entire set of databases has been handled. It should then wait for the last one to finish so that the calling process knows everything has been processed. A sketch follows the tips below.
tips:
- Use the @job_id values returned by sp_add_job to check whether a job is still running
- Use WITH (NOLOCK) on the system tables to avoid unnecessary locking; it doesn't really matter if you use 'dirty reads' when checking the state of the jobs
- Use WAITFOR DELAY so you only check the tables every 5 seconds or so; otherwise your loop will eat far too many resources!
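Here is a minimal sketch of that polling loop, assuming the @job_id values from sp_add_job have been collected in a table variable (the population step is omitted):

-- Sketch: keep at most @max_parallel of our spawned jobs running at once.
DECLARE @max_parallel INT = 4, @running INT, @next UNIQUEIDENTIFIER;
DECLARE @jobs TABLE (job_id UNIQUEIDENTIFIER PRIMARY KEY, started BIT NOT NULL DEFAULT 0);
-- (INSERT of the job_ids returned by sp_add_job into @jobs omitted)

WHILE EXISTS (SELECT 1 FROM @jobs WHERE started = 0)
BEGIN
    -- Count our jobs that have started but not yet stopped,
    -- looking only at the current Agent session.
    SELECT @running = COUNT(*)
    FROM msdb.dbo.sysjobactivity AS ja WITH (NOLOCK)
    JOIN @jobs AS j ON j.job_id = ja.job_id
    WHERE ja.session_id = (SELECT MAX(session_id) FROM msdb.dbo.syssessions)
      AND ja.start_execution_date IS NOT NULL
      AND ja.stop_execution_date IS NULL;

    IF @running < @max_parallel
    BEGIN
        SELECT TOP (1) @next = job_id FROM @jobs WHERE started = 0;
        EXEC msdb.dbo.sp_start_job @job_id = @next;    -- returns immediately
        UPDATE @jobs SET started = 1 WHERE job_id = @next;
    END
    ELSE
        WAITFOR DELAY '00:00:05';                      -- poll every 5 seconds
END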
I am executing a stored procedure via a SQL Server Agent job in SQL Server 2005.
This job ran fast until yesterday; since then it has been taking more than an hour instead of about 2 minutes.
I executed the stored procedure in SSMS and it took less than a minute.
I cannot figure out why it takes more than an hour when executed as a SQL Server Agent job.
After some back-and-forth in the comments, and assuming that the SP performs well with the same input parameters and data when executed in SSMS, I can finally give one last tip:
Depending on what actions are performed within the SP (e.g. inserting/updating/deleting a lot of data within a loop or cursor), you should set NOCOUNT ON at the beginning of your code:
SET NOCOUNT ON;
If this is not the case, or it does not help, please add the information already requested in the comments (e.g. all settings of the job and of each job step, what has been logged, what is in the job history; check the SQL error logs, event logs, ...).
Also take a look at the SQL Server logs; maybe you can gather some information there. A look into the Application/System event log of the database server is always a good idea as well.
To get a basic overview you can use Activity Monitor in SSMS: right-click the database server, select 'Activity Monitor' from the context menu, and look for the SQL Agent session.
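As a quick alternative to Activity Monitor, a query along these lines (a sketch; the filter relies on the program_name that Agent T-SQL job steps report) shows what the Agent sessions are currently doing:

-- Sessions opened by Agent T-SQL job steps identify themselves in
-- program_name, so this lists their current requests, if any.
SELECT s.session_id, s.program_name,
       r.status, r.command, r.wait_type, r.total_elapsed_time
FROM sys.dm_exec_sessions AS s
LEFT JOIN sys.dm_exec_requests AS r ON r.session_id = s.session_id
WHERE s.program_name LIKE N'SQLAgent - TSQL JobStep%';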
My last suggestion would be to run a SQL trace for the Agent: start a trace and filter, for example, on the account the SQL Agent service runs as. There are many options you can set for traces, so I would recommend googling for them, searching MSDN, or asking another question here on Stack Overflow.
We have a large proc that runs in 88 seconds in SSMS and 30-45 minutes as a SQL Server Agent job. I added the dbo. prefix to all the table names, and now it runs just as fast as in SSMS.
I've noticed that SQL Agent jobs ignore the server's MAXDOP setting and run everything with a MAXDOP of 1. If I run a stored procedure in a query window, it obeys the server settings and uses 4 processors. If I run it from SQL Agent, any stored procedure uses only one processor.
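For reference, a per-statement override looks like the sketch below (the table name is a placeholder); whether Agent honors it is exactly what is in question here:

-- Hypothetical query forcing a degree of parallelism of 4 for this
-- statement only, overriding the server-level MAXDOP setting.
SELECT COUNT(*)
FROM dbo.BigTable          -- placeholder table
OPTION (MAXDOP 4);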
I have a similar issue with a script that calls a number of UDFs I created. The UDFs themselves normally run subsecond in SSMS. Likewise, the reports I generate with them run in bearable time in SSMS (30 days of data in 8 seconds, 365 days in 22 seconds). I've always used SET NOCOUNT ON in my SQL Agent jobs, as they normally generate text files for pickup by other processes or Excel and I don't want the extra output at the end, so that was not a solution for me.
In this case, when we run the exact same script under SQL Agent as a job, my times grow dramatically: the 8-second script takes 2 m 30 s and the 22-second script takes 2 h 20 m. This is the same whether I run it midday alongside other user activity and jobs, or after hours with no user activity, jobs, or backups running. The server is idle, and at best one of the 8 cores is utilized during the run. The DB is only about 10 GB, running on SSD behind a cached RAID card, and 16 of 32 GB of RAM is free. Since my SQL runs efficiently in SSMS, I am fairly convinced I am hitting a threading limit of some sort. I have researched and tried adjusting MAXDOP just before the scripts in the SQL Agent step, with no luck.
Since this is an activity I want to schedule, it needs to be automated one way or another. I could let these scripts take the hours they need as T-SQL steps in SQL Agent jobs, but instead I decided to run them from the command line, where I get the same performance I see in SSMS:
sqlcmd -S SQLSRVRHost -i "C:\My Script Loc With Spaces.sql" -v MyVar="VarValue" >"C:\MyOutputFile.txt"
So I created a batch script that runs the SQL via sqlcmd, and I run that batch script from a SQL Agent job, so I still have the same management and control in place. My 4 SQL jobs, which collectively took over 3 hours to run, now complete in a minute and a few seconds from a single batch script executed by SQL Agent.
I hope this helps...
I am working with SQL Server 2008. Using the Agent, I have created a job and scheduled it to execute every minute.
The job executes a stored procedure that moves data from table XXX to a temp table, and then eventually into table YYY.
The execution of the job may take more than one minute, since the data volume is rather large.
Will a second instance of the job be started even though the first instance is still running?
If so, should I mark records in the temp table (status = 1) to indicate that those records are being processed by a previous instance of the job?
Is there a way for me to check that an instance of the job is currently running, so that I don't initiate a second instance of the job?
Is there another solution for this that I am unaware of? (throughput is important)
Only one instance of a particular job can run at any one time.
So there is no need to take any particular precautions against another execution of the same job beginning before the first one has stopped.
Check this post:
How to Prevent Sql Server Jobs to Run simultaneously
as well as this one:
Running Jobs
http://technet.microsoft.com/en-us/library/aa213815(v=sql.80).aspx
If a job has started according to its schedule, you cannot start another instance of that job on the same server until the scheduled job has completed. In multiserver environments, every target server can run one instance of the same job simultaneously.
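If you still want to check from T-SQL whether an instance of the job is currently running (as asked above), a sketch along these lines against msdb works; the job name is a placeholder:

-- A job is running if its activity row for the latest Agent session
-- has a start time but no stop time.
DECLARE @job_name SYSNAME = N'MyMoveDataJob';   -- placeholder name

SELECT ja.job_id, ja.start_execution_date
FROM msdb.dbo.sysjobactivity AS ja
JOIN msdb.dbo.sysjobs AS j ON j.job_id = ja.job_id
WHERE j.name = @job_name
  AND ja.session_id = (SELECT MAX(session_id) FROM msdb.dbo.syssessions)
  AND ja.start_execution_date IS NOT NULL
  AND ja.stop_execution_date IS NULL;
-- If this returns a row, the job is currently executing.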