I want to get the latest raw job history and save it in my own format every time a job finishes.
I have written a stored procedure that gets the history with sp_help_jobhistory, formats the result, and saves it into a new table.
But, when to call this stored procedure?
Is there some event fired when the job finishes?
Maybe there are other solutions.
Thanks for your comments/answers.
As Akhil said, simply add a step in your job and ensure that it is chained correctly (assuming your job chains steps based on success only, on the final step success, execute your stored proc).
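Adding that final step can be done through SSMS or scripted with sp_add_jobstep. A minimal sketch, assuming hypothetical job and procedure names ('My Job' and dbo.SaveJobHistory are placeholders, not from the question):

```sql
-- Append a final step that runs the history-saving proc after the real work.
-- Job name, step name and proc name here are assumptions for illustration.
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = 'My Job',
    @step_name = 'Save formatted history',
    @subsystem = 'TSQL',
    @command   = 'EXEC dbo.SaveJobHistory;';
```

Make sure the preceding step's on-success action points at this new step so it only runs when the real work succeeded.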
I have tried the solution, and it works well.
Step 1: Calls SP BeginHistoryLog. This SP gets the job's information from the [msdb].[dbo].[sysjobs] table by job name, and writes the initial data into the JobHistory table, which logs the history message.
Step 2: Calls the SP which does the actual work.
Step 3: Calls SP EndHistoryLog. This SP gets the step 2 execution information from msdb.dbo.sysjobhistory INNER JOINed with [msdb].[dbo].[sysjobs] by job id and step id, and writes the execution information into the JobHistory table.
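The query inside EndHistoryLog might look something like the sketch below (the JobHistory table and the job name are the OP's; the exact column choices are assumptions):

```sql
-- Pull the most recent execution record for step 2 of the job.
-- sysjobhistory stores run_date/run_time as integers (yyyymmdd / hhmmss).
SELECT TOP (1)
       j.name,
       h.step_id,
       h.run_status,      -- 0 = failed, 1 = succeeded, 3 = canceled
       h.run_date,
       h.run_time,
       h.message
FROM msdb.dbo.sysjobhistory AS h
INNER JOIN msdb.dbo.sysjobs AS j
        ON j.job_id = h.job_id
WHERE j.name = 'My Job'    -- assumed job name
  AND h.step_id = 2
ORDER BY h.instance_id DESC;
```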
Related
I have a batch load process that loads data into a staging database. I have a number of tasks that execute stored procedures which move the data to a different database. The tasks are executed when the SYSTEM$STREAM_HAS_DATA condition is satisfied on a given table.
I have a separate stored procedure that I want to execute only after the tasks have completed moving the data.
However, I have no way to know which tables will receive data and therefore do not know which tasks will be executed.
How can I know when all the tasks that satisfied the SYSTEM$STREAM_HAS_DATA condition are finished and I can now kick off the other stored procedure? Is there a way to orchestrate this step by step process similar to how you would in a SQL job?
There is no automated way but you can do it with some coding.
You may create a stored procedure that checks the STATE column returned by the TASK_HISTORY table function to see whether the tasks have completed or been skipped:
https://docs.snowflake.com/en/sql-reference/functions/task_history.html
You can call this stored procedure periodically using a task (like every 5 minutes etc).
Based on the checks inside your stored procedure (all tasks succeeded, the target SP hasn't been executed today yet, etc.), you can execute the target stored procedure that needs to run only after all tasks have completed.
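The task_history check could be sketched like this (the one-hour lookback window is an assumption; adjust it to your schedule):

```sql
-- List recently scheduled tasks that have not yet finished.
-- An empty result suggests every task in the window has SUCCEEDED or
-- been SKIPPED, so the follow-up procedure can be kicked off.
SELECT name, state, scheduled_time
FROM TABLE(information_schema.task_history(
       scheduled_time_range_start => DATEADD('hour', -1, CURRENT_TIMESTAMP())))
WHERE state NOT IN ('SUCCEEDED', 'SKIPPED');
```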
You can also check the status of a stream via SELECT SYSTEM$STREAM_HAS_DATA('<stream_name>'), which does not consume the stream, or via SELECT COUNT(*) FROM <stream_name>.
Look into using IDENTIFIER for dynamic queries.
We have a requirement where an SSIS job should trigger based on the availability of a value in a status table we maintain. The point to remember is that we are not sure about the exact time when the status will become available, so the SSIS process must continuously look for the value in the status table; if a value (e.g. 'success') is available, the job should trigger. We have 20 different SSIS batch processes, each of which should be invoked when its respective status value is available.
What you can do is:
Schedule the SSIS package to run frequently.
In that scheduled package, assign the value from the table to a package variable.
Use either an expression to disable the task, or a precedence constraint expression to let the package proceed.
Starting a SSIS package takes some time. So I would recommend to create a package with the following structure:
Package variable Check_run type int, initial value 1440 (to stop run after 24 hours if we run check every minute). This is to avoid infinite package run.
Set For Loop, check if Check_run is greater than zero and decrement it on each loop run.
In For loop check your flag variable in Exec SQL task, select single result value and assign its result to a variable, say, Flag.
Create conditional execution branches based on the Flag variable value. If Flag is set to run, start the other packages. Otherwise, wait for a minute with the Exec SQL command WAITFOR DELAY '00:01:00'.
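The flag check run by the Exec SQL task inside the loop could be sketched as follows (the table and column names are assumptions, not from the original post):

```sql
-- Single-row, single-value result: map it to the package variable Flag.
-- 1 = a pending 'success' status exists, 0 = keep waiting.
SELECT CASE WHEN EXISTS (
           SELECT 1
           FROM dbo.StatusTable          -- assumed status table
           WHERE StatusValue = 'success'
             AND Processed = 0           -- assumed "not yet handled" marker
       ) THEN 1 ELSE 0 END AS Flag;
```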
You mentioned the word trigger. How about creating a trigger that runs the packages when that status column meets the criteria?
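A minimal sketch of such a trigger, assuming hypothetical table, column and job names (sp_start_job is asynchronous, so the trigger returns immediately):

```sql
-- Fire whenever the status table changes; if any affected row now reads
-- 'success', start the Agent job that runs the SSIS package.
CREATE TRIGGER trg_StatusSuccess
ON dbo.StatusTable            -- assumed status table
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    IF EXISTS (SELECT 1 FROM inserted WHERE StatusValue = 'success')
        EXEC msdb.dbo.sp_start_job @job_name = 'Run SSIS Batch';  -- assumed job
END;
```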
Also this is how to run a package from T-SQL:
https://www.timmitchell.net/post/2016/11/28/a-better-way-to-execute-ssis-packages-with-t-sql/
You might want to consider creating a master package that runs all the packages associated with this trigger.
I would take @Long's approach, but enhance it by doing the following:
1.) use Execute SQL Task to query the status table for all records that pertain to the specific job function and load the results into a recordset. Note: the variable that you are loading the recordset into must be of type object.
2.) Create a Foreach Loop enumerator of type ADO to loop over the recordset.
3.) Do stuff.
4.) When the job is complete, go back to the status table and mark the record complete so that it is not processed again.
5.) Set the job to run periodically (e.g., minute, hourly, daily, etc.).
The enhancement here is that no flags are needed to govern the job. If a record exists, the foreach loop does its job. If no records exist in the recordset, the job exits successfully. This simplifies the design.
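The Execute SQL Task in step 1 might run a query like this sketch (table and column names are assumptions) with the full result set mapped to the Object-typed variable:

```sql
-- All pending records for this job function; zero rows means the
-- foreach loop simply has nothing to iterate and the job exits cleanly.
SELECT RecordId, StatusValue
FROM dbo.StatusTable              -- assumed status table
WHERE JobFunction = 'BatchProcess01'   -- assumed job-function key
  AND IsComplete = 0;                  -- assumed completion marker
```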
I have an issue where a stored procedure I built to process some data doesn't seem to work correctly when being run by the SQL Server Agent. If I manually execute the job everything works fine. I also get no errors.
The basic process is I have three source tables (A,B,C), each are for a different type of service. They are used to calculate a value, and since the calculation process is different each year, there are multiple results, so ID B43 will have result15, and result16, corresponding to the 2014/15 and 2015/16 financial years. The final result is stored in a master table that contains the ID, Type, and both result values for each record.
To make this work, I have six stored procedures that run sequentially, first for the 14/15 calculation then for the 15/16 calculation.
The general structure of each stored procedure is to take the data, manipulate it based on some mapping tables, and create a CalculationInput table, which exists simply so the calculation input can be reviewed for audit checks. Once this is done, the actual calculation occurs, with the result stored in a temp table #Results. After the temp table is built, I create some indexes and then call another stored procedure, passing in the financial year as a parameter; it takes the temp table and does an insert/update on the master table. After that, the stored procedure ends and the next one is called.
StoredProcA15 -> InsUpdProc
StoredProcB15 -> InsUpdProc
StoredProcC15 -> InsUpdProc
StoredProcA16 -> InsUpdProc
StoredProcB16 -> InsUpdProc
StoredProcC16 -> InsUpdProc
As I said before, this works perfectly fine if I right click on the job and execute it. However if I then wait a week and look at the master table, new records won't be there, despite them being in the source table, as well as the CalculationInput table. If I then run the section of code that builds the temp table, the result appears there.
So my best guess at the source of the failure is the call to the insert/update procedure. Without knowing more about SQL Server concurrency, I am wondering if the parent stored procedure isn't waiting for the insert/update procedure to finish before it ends and moves on to the next stored procedure.
Is this a possibility? And if so, what is the best way to fix it?
Sounds like a rights issue to me. Check the User Id that the Windows Service "SQL Server Agent" is running under. To troubleshoot change it to a windows login that has rights on your db and make sure that the password never expires.
To do that :
from control panel --> Administrative tools --> Services ---> SQL Agent --> Right click --> Properties --> Logon tab --> This Account --> your windows id / pswd --> click ok to save
Try this on a job that has a schedule requiring it to run every 5 minutes (so you don't have to wait for days lol) and exec a dummy SP, then observe whether it executes a few times through the scheduler.
Log off from Windows and it should still continue.
Hope that helps
A little bit stuck on a problem we are having, let me try my best to explain the scenario, issues and what we have tried.
We have an SQL job which logs to a specific file tagged with a date appended to the end.
The first step checks what our log file should be, e.g. log_01012012.log; if this is incorrect, we update the sysjobsteps table to hold the new file location for all steps, e.g. log_02012012.log. We do this by calling a stored proc in the first job step.
Although the table gets updated, the remaining steps keep using the old log, presumably because the table is only read once when the job starts.
We have tried to restart the job within the stored proc with the following code:
EXEC msdb.dbo.sp_stop_job @job_name = 'My Job'
WAITFOR DELAY '00:00:15'
EXEC msdb.dbo.sp_start_job @job_name = 'My Job'
However, when we kill the job it appears to kill the stored procedure as well (which I guess is a child of the job), which means it never reaches the step that restarts the job.
Is there a way in which a job can restart itself so it looks at the sysjobsteps table again and uses the correct location for the logging?
Things which might solve the issue would be
Being able to restart a job from the job itself
Being able to refresh the job in some respect.
Anything I need to clarify I will do my best but we are currently stuck and all input will be greatly appreciated!
Go to SQL Server Agent in SSMS.
Expand Jobs.
Create a job.
Define the steps (an SP or a simple query).
Define the schedule (how often to run, or even start the job when SQL Server restarts).
Set notification to email you when the job completes (succeeded/failed).
Hope this helps.
You could do something fancy with Service Broker. Something like:
1) Put a message into a broker queue as the last step of your job. The content can be empty; it's just a token to say "hey... the job needs to be resubmitted".
2) Write an activation stored procedure for the queue that puts in a delay (like you're doing in your already existing procedure) and then resubmit the job
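A rough sketch of such an activation procedure, assuming a hypothetical queue name (dbo.ResubmitQueue) and the job name from the question; the surrounding Service Broker setup (message type, contract, queue, service) is omitted:

```sql
-- Activated when a token message lands on the queue: drain it,
-- wait out the delay, then resubmit the job from outside the job itself.
CREATE PROCEDURE dbo.ResubmitJobActivation
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @handle UNIQUEIDENTIFIER;

    RECEIVE TOP (1) @handle = conversation_handle
    FROM dbo.ResubmitQueue;           -- assumed queue name

    IF @handle IS NOT NULL
    BEGIN
        END CONVERSATION @handle;
        WAITFOR DELAY '00:00:15';     -- the delay from the existing proc
        EXEC msdb.dbo.sp_start_job @job_name = 'My Job';
    END
END;
```

Because the activation runs outside the job's own process, stopping the job no longer kills the code that restarts it.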
Or, instead of hardcoding the log file name in the job steps themselves, put that data in a table somewhere. Then, in the step that changes the location, define the success condition as "go to next step" and the failure condition as "go to step 1". Then modify the code that changes the file location to return an error (thus triggering the job step's failure condition).
Realise nearly a decade old, but I had the same problem and solved it as follows. Using the code that the OP suggested:
EXEC msdb.dbo.sp_start_job @job_name = 'My Job A'
Simply create a matching pair of jobs, each of which has an identical first step (doing the actual work) and a final step containing the above code but pointing at the opposite job name (as far as I can tell, you can't restart a job from within itself). Just make sure you set the step behaviour in the Advanced tab to execute this final step in each case, and the jobs will rumble away indefinitely.
Is there a way to query the current status (executing, idle, etc) and the last result (successfull, failed, etc), and the last run time for a specific job name? The end result I am looking for is being able to display this information in an internal web application for various SSIS packages.
You should be able to find this information in msdb - there are tables sysjobs, sysjobhistory and sysjobsteps which give the information that you are looking for.
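For example, a sketch that pulls the last outcome and run time for one job (the job name is a placeholder):

```sql
-- Step 0 in sysjobhistory is the overall job-outcome row.
-- run_date/run_time are integers: yyyymmdd and hhmmss.
SELECT TOP (1)
       j.name,
       h.run_status,    -- 0 = failed, 1 = succeeded, 3 = canceled
       h.run_date,
       h.run_time
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobhistory AS h
  ON h.job_id = j.job_id
WHERE j.name = 'TheJobName'
  AND h.step_id = 0
ORDER BY h.instance_id DESC;
```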
exec msdb.dbo.sp_help_job @job_name = 'TheJobName'
gives the information I want. So then I can just use a SqlDataReader to get the information. Note that this stored procedure returns multiple result sets.
The Microsoft documentation for this stored procedure is:
http://msdn.microsoft.com/en-us/library/ms186722(SQL.90).aspx
Another solution I have used is to update a reference table with the current status. It's quick and easy and usually very easy to retrieve the values you need.
For example, as soon as a package kicks off, insert a record with date and time, package name, etc.
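A minimal sketch of that pattern (the reference table and package name are assumptions):

```sql
-- Log the start of a package run...
INSERT INTO dbo.PackageRunStatus (PackageName, Status, StartedAt)
VALUES ('LoadStaging', 'Running', GETDATE());

-- ...and close the record out when the package finishes.
UPDATE dbo.PackageRunStatus
SET Status = 'Succeeded', FinishedAt = GETDATE()
WHERE PackageName = 'LoadStaging'
  AND FinishedAt IS NULL;
```

The web application then only has to read this one table rather than digging through msdb.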