Execute failure precedence constraint without showing task failure - SQL Server

I have three tasks in an SSIS package. The first is an Execute Process Task that runs a PowerShell command to download a file from a website. If the command fails, the task fails, so I have two connections from it to two other tasks: one for success and one for failure.
The package works as expected: if the Execute Process Task succeeds, it runs the associated success task; if it fails, it runs the associated failure task.
The problem is that when the Execute Process Task fails, it shows as failed even though it correctly ran the failure task connected to it.
Is there any way to stop it showing as a failure but still run the associated failure task when the Execute Process Task fails?

Update 1
If you need the package not to report failure, set FailPackageOnFailure and FailParentOnFailure to False on the task, and also set MaximumErrorCount on the package itself to a value greater than 1. If the Execute Process Task fails, it increments the package's error count, and if the count exceeds MaximumErrorCount the package can still fail. If, on the other hand, you only need the task itself not to show as failed, that cannot be done.
SSIS -- Allow a task to fail but have the package succeed? (See all answers, not only the accepted one)
Initial Answer
You cannot use a failure precedence constraint if the preceding task always succeeds
I don't think you can do this, because the failure precedence constraint (connector) only fires if the preceding task fails. You can use the ForceExecutionResult property to make a task always succeed, but then the failure connector will never be used.
Workaround
I am not sure if this can help, but instead of using a failure precedence constraint, store the Execution Value in a variable using the ExecValueVariable property, and add expressions to the precedence constraints (both connectors get the Success constraint plus a similar expression):
@[User::ResultValue] == 1
OR
@[User::ResultValue] == 0
Side note: ExecValueVariable, ForceExecutionResult, and the other properties are found in the Properties pane; click on the task and press F4 to show it.

So it seems that you want the task to be shown as a success even though the Execute Process Task fails and the task on the failure path completes successfully.
To do this, set the Execute Process Task property FailTaskIfReturnCodeIsNotSuccessValue to False, so that even when the process does not return the SuccessValue, the task is not shown as failed. Hope this helps!

Related

Is it possible to unset the WHEN condition on a Snowflake task?

I have an existing Snowflake task that is scheduled to run once an hour and is using the WHEN condition to check if a particular stream has data. I'd like to ALTER this task so that it no longer uses the when condition, however, unset does not seem to work and I cannot modify it to NULL. The one workaround we've found is you can modify it to TRUE which seems to work (the statement executes and the task does run), but I'd prefer to have it unset like the other tasks that do not have a when condition set. I also recognize I could drop the task and recreate it, however, then I'd lose the history. This is not a significant problem, but seems strange that it cannot be modified after the fact and I've not seen any documentation that indicates this is expected behavior.
Note: I've made sure the root task is suspended before trying to make any changes.
alter task example_task unset when;
SQL compilation error: invalid property 'when' for 'TASK'
alter task example_task modify when NULL;
Invalid expression for task condition expression. Expecting one of the following: [SYSTEM$STREAM_HAS_DATA]
alter task example_task unset condition;
SQL compilation error: invalid property 'condition' for 'TASK'
(Note: tested ^ because the column is labeled 'condition' when you run show tasks)
alter task example_task remove when;
SQL compilation error: syntax error line 1 at position 31 unexpected 'when'.
(Note: tested ^ because you use 'remove' to change the 'after' parameter)
alter task example_task modify when TRUE;
Statement executed successfully.
As a side note, this came to our attention while trying to use the Snowflake Terraform provider, which unsuccessfully tried to update the WHEN condition. Now I think I know why.
As an alternative approach, why not replace the whole task using CREATE OR REPLACE TASK...?
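A minimal sketch of that approach (the warehouse name and task body are illustrative; note that CREATE OR REPLACE drops and recreates the task, so its run history is lost, which the asker wanted to avoid):

```sql
-- Recreate the task without a WHEN clause.
CREATE OR REPLACE TASK example_task
  WAREHOUSE = my_wh           -- hypothetical warehouse
  SCHEDULE  = '60 MINUTE'     -- hourly, as in the question
AS
  INSERT INTO target_table SELECT * FROM my_stream;  -- hypothetical body

-- Tasks are created (or recreated) in a suspended state.
ALTER TASK example_task RESUME;
```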

MSSQL agent: Failed notification if 0 rows exported

I am running a daily job in MSSQL Agent that runs a package to export a certain table to an Excel file. After completion it sends me an email.
It has happened a couple of times that the table was empty and it exported 0 rows.
I did not receive a job failed notification, because the job didn't fail.
I would like to receive a failed notification when it's exporting 0 rows.
How would you achieve this?
Thanks
There are a number of ways to force a package to return a failure (which will then cause the calling Agent job to report a failure).
One simple way would be to put a Row Count transformation between the source and destination in the Data Flow Task that populates the spreadsheet, and assign the value to a variable, say RowsExported.
Back on the Control Flow tab, if there's more to the package, put a condition on the precedence constraint leading to the rest of the work, @[User::RowsExported] > 0, so the rest of the package only continues if rows were sent. Whether or not there's more to the package, add a new precedence constraint coming off the Data Flow Task with the condition @[User::RowsExported] == 0, and point it at an Execute SQL Task containing just the script SELECT 1/0;.
Now, if zero rows are exported, the package will throw a division by zero error, and the calling job will fail.
There are other ways, of course, but this one can be implemented very quickly.
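The failure branch can be a single statement in that Execute SQL Task. The RAISERROR variant below is my suggestion, not part of the original answer; it produces a clearer message in the Agent job history:

```sql
-- Script for the Execute SQL Task on the RowsExported == 0 branch.
-- Any error raised here fails the package, which in turn fails the Agent job.
SELECT 1/0;   -- deliberate divide-by-zero error

-- Alternative with a clearer error message:
-- RAISERROR('Export produced 0 rows; failing package.', 16, 1);
```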

SQL script stuck with status: runnable and wait: preemptive_os_reportevent

An index defrag was executed and was later terminated. It caused two other processes to run long: one process cancelled itself and the other was terminated. While re-running, I was checking the status from sys.dm_exec_requests and noticed that during the last part of the query, which inserts the data into a table, the status kept changing from running to runnable with the wait type PREEMPTIVE_OS_REPORTEVENT, etc. Later on, the job once again cancelled itself.
I want to understand why the script is changing status like that. Is that expected? And if something else is causing it to run long, what else should I check?
Note: I was also checking other active scripts at the time it was running and none was using the same target table.
This was resolved yesterday. Apparently the syslogs were full, which prevented the script from writing logs, so it was stuck and couldn't complete. Feel free to add input if you have some.

SSIS - How do I check job status in a table continuously from SSIS control flow?

We have a requirement where an SSIS job should trigger based on the availability of a value in a status table we maintain. A point to remember here is that we are not sure about the exact time when the status will become available, so the SSIS process must continuously look for the value in the status table; if a value (e.g. success) is available, the job should trigger. We have 20 different SSIS batch processes, each of which should be invoked when its related status value is available.
What you can do is:
Schedule the SSIS package to run frequently.
In that scheduled package, assign the value from the table to a package variable.
Use either an expression to disable the task, or a constraint expression to let the package proceed.
Starting an SSIS package takes some time, so I would recommend creating a package with the following structure:
Package variable Check_run of type Int, initial value 1440 (to stop the run after 24 hours if the check runs every minute). This avoids an infinite package run.
Set up a For Loop that checks whether Check_run is greater than zero and decrements it on each iteration.
Inside the For Loop, check your flag variable in an Execute SQL Task: select a single-row result set and assign the result to a variable, say, Flag.
Create conditional execution branches based on the Flag variable's value. If Flag is set to run, start the other packages; otherwise wait for a minute with the Execute SQL command WAITFOR DELAY '00:01:00'.
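A sketch of the two Execute SQL statements involved (the status table and column names are hypothetical):

```sql
-- Inside the For Loop: read the flag (Execute SQL Task, ResultSet = Single row),
-- assigning status_value to the SSIS variable Flag.
SELECT TOP (1) status_value
FROM dbo.BatchStatus            -- hypothetical status table
WHERE batch_name = 'MyBatch';   -- hypothetical batch identifier

-- On the "not ready" branch: pause before the next loop iteration.
WAITFOR DELAY '00:01:00';       -- one minute
```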
You mentioned the word trigger. How about creating a trigger that runs the packages when that status column meets the criteria?
Also this is how to run a package from T-SQL:
https://www.timmitchell.net/post/2016/11/28/a-better-way-to-execute-ssis-packages-with-t-sql/
You might want to consider creating a master package that runs all the packages associated with this trigger.
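For reference, the pattern from that article runs a deployed package through the SSISDB catalog procedures; the folder, project, and package names below are placeholders:

```sql
DECLARE @execution_id BIGINT;

-- Create an execution instance for the deployed package...
EXEC SSISDB.catalog.create_execution
     @folder_name  = N'MyFolder',
     @project_name = N'MyProject',
     @package_name = N'Master.dtsx',
     @execution_id = @execution_id OUTPUT;

-- ...then start it (returns immediately; the package runs asynchronously).
EXEC SSISDB.catalog.start_execution @execution_id;
```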
I would take @Long's approach, but enhance it by doing the following:
1.) Use an Execute SQL Task to query the status table for all records that pertain to the specific job function, and load the results into a recordset. Note: the variable that you load the recordset into must be of type Object.
2.) Create a Foreach Loop enumerator of type ADO to loop over the recordset.
3.) Do stuff.
4.) When the job is complete, go back to the status table and mark the record complete so that it is not processed again.
5.) Set the job to run periodically (e.g., minute, hourly, daily, etc.).
The enhancement here is that no flags are needed to govern the job. If a record exists, the Foreach Loop does its job; if no records exist in the recordset, the job exits successfully. This simplifies the design.

How do I raise an error in an Execute SQL Task in Integration Services?

Let me also back up a step - I'm trying to implement a sanity check inside an IS package. The idea is that the entire package runs in a read uncommitted transaction, with the final step being a check that determines that certain row counts are present, that kind of stuff. If they are NOT, I want to raise an exception and rollback the transaction.
If you can tell me how to do this, or, even better, suggest a better way to implement a sanity check, that would be great.
In order to fail the package if your observed rowcount differs from your expected rowcount:
Create a Package Global Variable to hold your expected rowcount. This could be derived from a RowCount step in your DFT, set manually, etc.
Edit your Execute SQL Task that provides the observed rowcount, and set the Result Set to Single Row.
In the Result Set tab of your Execute SQL Task, assign this Result Set to a variable.
Edit the constraint condition prior to your final step: set the Evaluation Operation to Expression and Constraint, and set the Value to Failure. In your expression, evaluate ResultSetVariable != ExpectedRowCountVariable (SSIS expressions use != rather than <>).
If the observed rowcount does not equal the expected rowcount, the package will fail.
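Assuming variables named User::ObservedRows and User::ExpectedRows (the names are illustrative), the constraint's expression would look like this in SSIS expression syntax:

```
@[User::ObservedRows] != @[User::ExpectedRows]
```

With Evaluation Operation set to Expression and Constraint and Value set to Failure, both conditions must be satisfied for that path to fire.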
You can raise an error and roll back the transaction in an SSIS "Execute SQL" task with the following SQL:
RAISERROR('Something went wrong', 16, 1);
This will cause the "Execute SQL" task to return the error to the SSIS package and the task will follow the "red" (failure) path. You can then use this path to rollback the transaction and do any tidying-up.
This approach has the advantage you can do all your processing in the Execute SQL task then call Raiserror if you need to.
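A sketch of such a sanity check inside the Execute SQL Task (the table name and expected count are hypothetical):

```sql
-- Hypothetical sanity check: raise a severity-16 error if the count is off,
-- sending the task down its failure (red) path, where the transaction
-- can be rolled back.
DECLARE @ActualRows INT = (SELECT COUNT(*) FROM dbo.StagingTable);

IF @ActualRows <> 100000   -- expected row count, illustrative
    RAISERROR('Sanity check failed: found %d rows.', 16, 1, @ActualRows);
```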
