How can I measure the duration of a package execution? - sql-server

I have an SSIS job that I would like to run from a procedure, and then, using the job start and end dates, execute a select statement.
Getting the job start time is easy: just save the current time right before you call the job.
How can I get the end time? Can I use @endTime = GETDATE()? Does starting the job wait for it to end?
Is this true in general for calls inside SQL procedures?
EDIT:
As people asked, I wanted to call an SSIS job using this code, which I found here:
declare @execution_id bigint
exec ssisdb.catalog.create_execution
    @folder_name = 'mssqltips'
    ,@project_name = 'exec-ssis-stored-proc-ssis-sample'
    ,@package_name = 'Sample.dtsx'
    ,@execution_id = @execution_id output
exec ssisdb.catalog.start_execution @execution_id

SSIS already logs package execution durations and events, including step durations, so you don't need GETDATE().
You can query the catalog.executions view of the SSISDB database to retrieve the execution status, start time and end time, e.g.:
select status, start_time, end_time, datediff(s, start_time, end_time) as duration
from catalog.executions
where execution_id = @execution_id
Or, for historical data:
select status, start_time, end_time, datediff(s, start_time, end_time) as duration
from catalog.executions
where project_name = 'exec-ssis-stored-proc-ssis-sample'
and package_name = 'Sample.dtsx'
order by execution_id
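If you start the execution asynchronously (the default), end_time stays NULL until the package finishes, so you may need to wait before running your select. A minimal sketch of one way to do that, reusing @execution_id from the snippet above; the 5-second delay is an arbitrary choice, and the status codes come from the catalog.executions documentation:
declare @status int = 1
-- 1 = created, 2 = running, 5 = pending, 8 = stopping: still in flight
while @status in (1, 2, 5, 8)
begin
    waitfor delay '00:00:05'
    select @status = status
    from catalog.executions
    where execution_id = @execution_id
end
-- end_time is populated once the loop exits
select status, start_time, end_time
from catalog.executions
where execution_id = @execution_id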

It depends on how you run the SSIS package from the SP: is it a SQL Agent job or a catalog (SSISDB) package execution?
If you run it as a package execution, it can run synchronously or asynchronously.
In asynchronous mode, the SP just starts the SSIS package and doesn't wait.
In synchronous mode, it waits.
The mode depends on the SYNCHRONIZED execution parameter. This parameter must be set BEFORE the execution starts; see the link below for how to set it.
https://learn.microsoft.com/en-us/sql/integration-services/system-stored-procedures/catalog-set-execution-parameter-value-ssisdb-database
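For example, a sketch reusing the snippet from the question: setting SYNCHRONIZED before calling start_execution makes the call block until the package finishes, so an @endTime = GETDATE() taken right after it is meaningful:
declare @execution_id bigint
exec ssisdb.catalog.create_execution
    @folder_name = 'mssqltips'
    ,@project_name = 'exec-ssis-stored-proc-ssis-sample'
    ,@package_name = 'Sample.dtsx'
    ,@execution_id = @execution_id output
-- object_type 50 = execution-level (system) parameter
exec ssisdb.catalog.set_execution_parameter_value @execution_id
    ,@object_type = 50
    ,@parameter_name = N'SYNCHRONIZED'
    ,@parameter_value = 1
-- now blocks until the package completes
exec ssisdb.catalog.start_execution @execution_id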
If you run a SQL Agent job from the SP and that job executes the SSIS package, the SP does not wait; it just activates the SQL Agent job.

Related

Run parallel transaction while executing code in another session

Is there any possibility to run an update at a specific point in time, but in a different/parallel session? In the example below, I want a specific update to run at the moment the WAITFOR executes. Currently the WAITFOR block gives me time to switch to another SSMS (or other tool) window/tab and run the update manually while it waits for 10 seconds. Logically, the only thing that matters is that the transaction is started at this point in time.
EXEC dbo.p_sync_from_accounts_ext_test @enable_snapshot_isolation = 1
    , @run_update_flag = NULL
    , @run_wait_for_10 = NULL
    , @acc = @acc;
WAITFOR DELAY '00:00:10'; -- Execute update in parallel transaction
-- table update should be performed in that parallel transaction
EXEC dbo.p_finish_sync_attributes;
Yes, you can do it.
Method 1: a loop-back linked server (a linked server that points back at your current server) that does not have DTC enabled. Call your SP through that linked server.
Method 2: create a SQL Server Agent job and start the job programmatically.
Note that in the first case your update statement must be wrapped in an SP; in the second case that is advisable but not necessary.
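A sketch of method 1, assuming your database is named MyDb and the update is wrapped in dbo.p_do_update (both names hypothetical). Turning off 'remote proc transaction promotion' prevents the call from being promoted into a DTC transaction, so the remote call runs as its own, independent transaction:
-- one-time setup: linked server that points back at the current instance
-- (provider name may vary by version, e.g. MSOLEDBSQL instead of SQLNCLI)
declare @srv sysname = @@SERVERNAME;
exec sp_addlinkedserver @server = N'LOOPBACK', @srvproduct = N'', @provider = N'SQLNCLI', @datasrc = @srv;
exec sp_serveroption @server = N'LOOPBACK', @optname = N'remote proc transaction promotion', @optvalue = N'false';

-- the SP now runs in a separate session/transaction, even when called mid-transaction
exec LOOPBACK.MyDb.dbo.p_do_update;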

SSIS package execution succeeds but doesn't do its job

I use SQL Server Agent to fill tables in a data warehouse. I have about 50 steps in the job, and every step runs with a proxy account. Every step works correctly besides one:
an SSIS package containing about 20 Execute SQL Tasks, each of which executes a procedure. This is what I have in the Execute SQL Task:
DECLARE @RC int
DECLARE @fordate datetime = null
DECLARE @tablename sysname = 'D_ENTITY'
DECLARE @dataFolder varchar(1024) = 'C:\MountPoints1\src_etl\'
DECLARE @logFolder varchar(1024) = 'C:\MountPoints1\src_etl\'
DECLARE @debug bit = 0
EXECUTE @RC = [dbo].[ETL1A_HR]
    @fordate
    ,@tablename
    ,@dataFolder
    ,@logFolder
    ,@debug
GO
The thing is, if I execute the package from the SSIS catalog, it works fine. But when it is run by the job, it succeeds, yet it only deletes from the tables and doesn't fill them. It seems like the procedure stops somewhere in the middle.
Any ideas?
Please advise; I've spent days trying to solve this...
I think it may be related to permissions. Executing the SSIS package yourself uses your security context, but running it from the Agent impersonates the credentials defined in the proxy and then runs the job step under that security context.
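To confirm, check what the execution actually logged when the Agent ran it; permission problems with file paths like C:\MountPoints1\src_etl\ often surface only as warnings or errors in the SSISDB catalog. A sketch (the execution id value is hypothetical; substitute the failing run's id):
-- most recent executions of the package
select top (10) execution_id, package_name, status, start_time
from ssisdb.catalog.executions
order by execution_id desc;

-- errors/warnings for a given execution (operation_id = execution_id here)
declare @execution_id bigint = 12345; -- hypothetical: the failing run's id
select message_time, event_name, message
from ssisdb.catalog.event_messages
where operation_id = @execution_id
and event_name in (N'OnError', N'OnWarning');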

Status of Stored Procedure call from Agent job, when job is stopped

We have a clean-up job, which calls a stored procedure, which in turn deletes one day's worth of records from a log table. This job runs every five minutes and usually completes in less than 10 seconds. Sometimes it takes much longer, as long as 15 minutes. During such instances, the log table gets locked and subsequent transactions time out until the job completes.
In order to address this, we came up with this solution:
1) Remove the scheduling of the existing job
2) Create a new job, to call the original job
3) Schedule the new job to run every 5 minutes
4) See below code of the new job
DECLARE @JobToRun NVARCHAR(128) = N'OM_EDU_Purge logs'
EXEC msdb.dbo.sp_start_job @JobToRun
WAITFOR DELAY '00:00:20'
IF EXISTS(SELECT 1
          FROM msdb.dbo.sysjobs J
          JOIN msdb.dbo.sysjobactivity A
            ON A.job_id = J.job_id
          WHERE J.name = @JobToRun
            AND A.run_requested_date IS NOT NULL
            AND A.stop_execution_date IS NULL
         )
BEGIN -- job is running or finishing (not idle)
    EXEC msdb.dbo.sp_stop_job @job_name = @JobToRun
    -- could log info, raise error, send email etc here
END
This seems to work fine and stops the job if it is still running after 20 seconds. However, since the job calls a stored procedure, here is my question:
When the job is stopped, will it also terminate the stored procedure that is executing?
I think your query gets stuck because the log table is being updated or having records inserted concurrently with your delete statement. You might try locking the table for the duration of the delete by changing the statement inside your procedure, for example: delete from logs with (tablock)
Here, a stored proc is just calling another, nested stored proc. So no, the stored proc won't be stopped; control will return to the calling stored proc. You should have sufficient error handling in the proc to take care of scenarios where the called proc errors out.
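A minimal sketch of that kind of error handling, with hypothetical proc names:
CREATE PROCEDURE dbo.p_purge_wrapper
AS
BEGIN
    BEGIN TRY
        EXEC dbo.p_purge_logs; -- the nested proc that does the delete
    END TRY
    BEGIN CATCH
        -- control lands here if the called proc errors out
        DECLARE @msg NVARCHAR(2048) = ERROR_MESSAGE();
        -- log it, send an email, etc., then optionally re-raise
        RAISERROR(@msg, 16, 1);
    END CATCH;
END;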

How to execute in parallel using Transact-SQL?

I need to call a stored procedure with hundreds of different parameter sets in a scheduled SQL Agent job. Right now they are executed sequentially. I want to execute the stored procedure with N (e.g. N = 8) different parameter sets at the same time.
Is there a good way to implement this in Transact-SQL? Can SQL Server Service Broker be used for this purpose? Any other options?
There is mention in a comment on the question of a table that holds the various parameters to call the proc with, and that the execution times vary a lot across the parameter values.
If you are able to add two fields to the table of parameters (StartTime DATETIME and EndTime DATETIME), then you can create 7 more SQL Agent Jobs and have them scheduled to run at the same time.
The Job Step of each Job should be the same and should be similar to the following:
DECLARE @Params TABLE (ParamID INT, Param1 DataType, Param2 DataType, ...);
DECLARE @ParamID INT,
        @Param1Variable DataType,
        @Param2Variable DataType,
        ...;

WHILE (1 = 1)
BEGIN
    -- claim the next unstarted parameter set and capture it in one atomic step
    UPDATE TOP (1) param
    SET    param.StartTime = GETDATE() -- or GETUTCDATE()
    OUTPUT INSERTED.ParamID, INSERTED.Param1, INSERTED.Param2, ...
    INTO   @Params (ParamID, Param1, Param2, ...)
    FROM   Schema.ParameterTable param
    WHERE  param.StartTime IS NULL;

    IF (@@ROWCOUNT = 0)
    BEGIN
        BREAK; -- no rows left to process so just exit
    END;

    SELECT @ParamID = tmp.ParamID,
           @Param1Variable = tmp.Param1,
           @Param2Variable = tmp.Param2,
           ...
    FROM   @Params tmp;

    BEGIN TRY
        EXEC Schema.MyProc @Param1Variable, @Param2Variable, ... ;

        UPDATE param
        SET    param.EndTime = GETDATE() -- or GETUTCDATE()
        FROM   Schema.ParameterTable param
        WHERE  param.ParamID = @ParamID;
    END TRY
    BEGIN CATCH
        ... do something here...
    END CATCH;

    DELETE FROM @Params; -- clear out last set of params
END;
That general structure should allow the 8 SQL Agent jobs to run until all of the parameter value sets have been executed. It accounts for the fact that some sets run faster than others, as each job just picks the next available one off the queue until there are none left, at which time the job cleanly exits.
Two things to consider adding to the above structure:
A way of resetting the StartTime field to NULL so that the row can be re-run later
A way of handling errors, i.e. cleaning up rows where StartTime IS NOT NULL AND EndTime IS NULL and the DATEDIFF between StartTime and GETDATE / GETUTCDATE is too large. A TRY / CATCH could handle it by either setting StartTime back to NULL so the row gets re-run, or maybe by adding a third field, ErrorTime DATETIME, that is reset to NULL at the start of the run (like the other two fields) but only set if an error happens. Those are just some thoughts; a sketch of the clean-up query follows this list.
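A sketch of that clean-up, assuming 30 minutes counts as "too much" (tune for your workload):
-- re-queue rows that started but never finished (e.g. the job was killed mid-run)
UPDATE param
SET    param.StartTime = NULL
FROM   Schema.ParameterTable param
WHERE  param.StartTime IS NOT NULL
  AND  param.EndTime IS NULL
  AND  DATEDIFF(MINUTE, param.StartTime, GETDATE()) > 30;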
SQL Server has nothing native built in to issue parallel queries from a T-SQL batch. You need an external driver: something that connects on N connections.
SQL Agent can do that if you create N jobs and start them manually. It is a hack, but it will work.
It is probably easier to write a small C# app to do this and put it into Windows Task Scheduler.
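If you do go the SQL Agent route, the "manual start" can itself be one small T-SQL step, since msdb.dbo.sp_start_job returns as soon as the start is requested rather than waiting for the job to finish. A sketch, assuming eight worker jobs named MyProc_Worker_1 through MyProc_Worker_8 (hypothetical names), each running the queue-draining step shown above:
DECLARE @i INT = 1,
        @job sysname;
WHILE @i <= 8
BEGIN
    SET @job = N'MyProc_Worker_' + CAST(@i AS NVARCHAR(2));
    EXEC msdb.dbo.sp_start_job @job_name = @job; -- asynchronous: returns immediately
    SET @i += 1;
END;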

Create a CmdExec Job Step with t-sql inside

I want to execute a .exe from SQL Server. For security reasons I can't use xp_cmdshell, so I decided to create a job with a CmdExec step.
The .exe file must receive 2 parameters. The problem is I never know the parameters in advance.
Ex: I want to pass 2 dates, today and today + 1 day.
That is easy to do in T-SQL, so is it possible to use T-SQL INSIDE a CmdExec step?
First, create a job with a CmdExec step whose command contains placeholder tokens, like this:
EXEC test.exe @Parm1, @Parm2
After that, in your code, substitute the real parameters and start the job:
-- Update job step with real parameter values
UPDATE msdb.dbo.sysjobsteps
SET command = REPLACE(REPLACE(command, '@Parm1', 'NewParm1'), '@Parm2', 'NewParm2')
WHERE job_id = @YourJobIDHere
  AND step_id = @StepId
-- start job
EXEC msdb.dbo.sp_start_job @job_name = @CustomJobName
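For the dates example, the replacement values can be computed in the same batch. A sketch, reusing the hypothetical placeholders above:
-- today and today + 1 day as yyyymmdd strings
DECLARE @today CHAR(8) = CONVERT(CHAR(8), GETDATE(), 112);
DECLARE @tomorrow CHAR(8) = CONVERT(CHAR(8), DATEADD(DAY, 1, GETDATE()), 112);

UPDATE msdb.dbo.sysjobsteps
SET command = REPLACE(REPLACE(command, '@Parm1', @today), '@Parm2', @tomorrow)
WHERE job_id = @YourJobIDHere
  AND step_id = @StepId;

EXEC msdb.dbo.sp_start_job @job_name = @CustomJobName;
Note that updating msdb.dbo.sysjobsteps directly is a hack; msdb.dbo.sp_update_jobstep is the supported way to change a step's command and also refreshes the Agent's cached copy of the job.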
