Execute one SSIS Package from Another, but using a DIFFERENT proxy user. Is it possible? - sql-server

I have one SSIS Package that must run as Proxy A and another that must run as Proxy B. I would love to have the first package run, and, as one of its tasks, execute the second package. Is this possible?
Thanks a lot!

You could have the first package use sp_start_job to kick off a job that is set up to run the second package. If this is "fire-and-forget", that's all you need to do. If you need to wait until the job has completed, things get messier: you'd have to loop, calling (and parsing the output of) sp_help_jobactivity with a WAITFOR DELAY between checks, until the run completes.
This is also more complex if you need to determine the actual outcome of running the second package.
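A rough T-SQL sketch of that pattern, querying msdb.dbo.sysjobactivity directly rather than parsing the result set of sp_help_jobactivity; the job name Run_PackageB is a placeholder:

```sql
-- Start the Agent job that runs the second package (job name is assumed).
EXEC msdb.dbo.sp_start_job @job_name = N'Run_PackageB';

DECLARE @running INT = 1;
WHILE @running = 1
BEGIN
    WAITFOR DELAY '00:00:10';   -- poll every 10 seconds

    -- A job is still running while it has an activity row with a start
    -- time but no stop time. (A stricter check would also filter on the
    -- most recent Agent session in msdb.dbo.syssessions.)
    SELECT @running = CASE WHEN EXISTS (
        SELECT 1
        FROM msdb.dbo.sysjobactivity ja
        JOIN msdb.dbo.sysjobs j ON j.job_id = ja.job_id
        WHERE j.name = N'Run_PackageB'
          AND ja.start_execution_date IS NOT NULL
          AND ja.stop_execution_date IS NULL
    ) THEN 1 ELSE 0 END;
END;

-- Outcome of the last completed run (run_status: 1 = succeeded, 0 = failed)
SELECT TOP (1) h.run_status
FROM msdb.dbo.sysjobhistory h
JOIN msdb.dbo.sysjobs j ON j.job_id = h.job_id
WHERE j.name = N'Run_PackageB' AND h.step_id = 0
ORDER BY h.instance_id DESC;
```

The calling account needs rights in msdb to start and inspect the job, which is worth checking first since the whole question is about proxies.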

Related

How do I run occasional tasks that update data in a database?

Occasionally I need to run tasks that update data in a database. I might never need to run them again, even on a new server; I can't know in advance. For now, I need to run them once, in a certain release only, and they should stay outside the git index.
Some tutorials suggest running them as "custom migrations": a second migrations directory called "custom_migrations" is created, and they're run from there via Ecto.Migrator. But this causes a problem: I run all of the custom_migrations, then delete the migration files (because I won't need them anywhere else, not even on a new server, once I've run them), then create new ones when a need arises, and then Ecto.Migrator complains about the absence of the migrations I've deleted.
I'm also aware of ./bin/my_app eval MyApp.Tasks.custom_task1, but it's not convenient because I'd have to call it manually, and passing arguments to a function via the command line isn't convenient either.
What I want is: create several files that should be run once, in the current release. Store them in a certain directory of the application. Deploy the application. They get run automatically, probably on application boot, and then I remove them. Then, after some time, I may create new ones, and only those new ones will need to be run.
How do I do this? What's the recommended way in Elixir/Phoenix?

Running two SSIS loop containers in parallel

In a package I have two loop containers that run fine one after the other. Each has its own variable name used to iterate over and load two different sets of Excel files to the same table. As far as I can tell there is no overlap between the packages so I thought to speed things up by running them in parallel.
When starting the package however (manually in SSIS), the containers look like they execute but then after a few seconds the entire package shows as complete without any errors, and none of the loop containers or subsequent tasks did anything.
The package log only shows validation completed for each of the loop containers.
Is there some switch somewhere to make two loop containers play nicely?
Here is what it looks like:
Place the two loops and their corresponding script tasks (connected via precedence constraints) in a sequence container. Connect the Create Table script task to the sequence container, then connect the sequence container to the D Product Family data flow.
Note: disabling a task won't affect operation as SSIS will just skip over the disabled task(s) and go to the next one until all tasks have been completed.

How do I force ssis package step to fail to test error handling

I have an ssis package with multiple different steps and types of steps, most are not script tasks. I want to make sure my error handling and error logging are working properly and I am wondering if there is a way which I can selectively force the package to fail at specific points to make sure it acts appropriately.
Thank you
There are properties called ForceExecutionValue and ForceExecutionResult on every task. To test your error handling, set ForceExecutionResult to Failure on the task where you want the package to fail (the other values are None, Success, and Completion), then run the package.

SSIS Package - track calling job

I'm looking for ideas on how to automatically track the job that calls the package.
We have some generic packages that are called from different jobs; each job passes in different file paths as parameters and therefore processes very different file sizes depending on the path.
In the package I have some custom auditing setup which basically tracks the package start time and end time, and therefore the duration of execution. I want to be able to also track the job that called the package so if the package is running long, I can determine which job called it.
Also note I would prefer this automatic using possibly some sort of system variable or such, so that human error is not an issue. I also want these auditing tasks built into all of our packages as a template, so I would prefer not to use a user variable either - as different packages may use different variables.
Just looking for some ideas - appreciate any input
We use parent and child packages instead of different jobs calling the same package. You could send information about which parent called it to the child package, and then have the child package record that data to a table along with the start date and end date.
Our solution has a whole meta database that records all the details of each step through logging. The parent tells the child which configuration to use and logs details against that configuration. The jobs call the parent package, never the child package (which doesn't have an entry in the config table, as it is always configured through variables sent in by the parent package). No human intervention is necessary, except initial development or research when a failure occurs.
Edit for existing jobs.
Consider that jobs can have multiple steps. Make the first step a SQL script that inserts the auditing information into a table, including the start time of the package, the name of the job that called it, and the name of the SSIS package being called. Then have the second step call the SSIS package, and make the last step a SQL script that inserts the same data, only with the end datetime.
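A sketch of those bracketing job steps; the table dbo.JobPackageAudit and all names in it are assumptions, not an existing schema:

```sql
-- Assumed audit table:
-- CREATE TABLE dbo.JobPackageAudit (
--     JobName     SYSNAME       NOT NULL,
--     PackageName NVARCHAR(260) NOT NULL,
--     StartTime   DATETIME2     NOT NULL,
--     EndTime     DATETIME2     NULL
-- );

-- Job step 1: record the start of this run.
INSERT INTO dbo.JobPackageAudit (JobName, PackageName, StartTime)
VALUES (N'Load_DailyFiles',   -- name of this Agent job (placeholder)
        N'GenericLoad.dtsx',  -- package the next step runs (placeholder)
        SYSDATETIME());

-- Job step 2: the SSIS package itself (an Agent
-- "SQL Server Integration Services Package" step, not a script).

-- Job step 3: close out the row opened in step 1.
UPDATE dbo.JobPackageAudit
SET EndTime = SYSDATETIME()
WHERE JobName = N'Load_DailyFiles'
  AND EndTime IS NULL;
```

Since the job name is hard-coded per job, this stays outside the package and works even when many jobs share one generic package.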
A simple way to do this is to set up a variable on your SSIS package as a varchar. Set its value to #[System::ParentContainerGUID] using an expression when it starts. SQL Agent won't set the value, so when run as an individual job it will be an empty string. But if called by another package it will contain the GUID of the calling package. You can test for that value and use a precedence constraint to control the program logic.
We have packages that run as a part of a big program but sometimes we need to run them individually. Each package has an email on failure task but we only want that to execute when the package is run individually. When it is part of the big run we collect the names of all packages that error and send them as one email from the master package. We don't want individual emails and a summary email going out on the same run.
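A sketch of that wiring in SSIS expression terms; the variable name User::ParentGuid is an assumption:

```
Variable User::ParentGuid (String), EvaluateAsExpression = True:
    @[System::ParentContainerGUID]

Precedence constraint (evaluation operation: Expression) before the
email-on-failure task, so it fires only on a standalone run:
    @[User::ParentGuid] == ""

Opposite constraint for the "called by a parent package" path:
    @[User::ParentGuid] != ""
```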

How can I have my polling service call SSIS sequentially?

I have a polling service that checks a directory for new files, if there is a new file I call SSIS.
There are instances where I can't have SSIS run if another instance of SSIS is already processing another file.
How can I make SSIS run sequentially during these situations?
Note: parallel SSIS runs are fine in some circumstances but not in others; how can I achieve both?
Note: I don't want to go into when/why it can't run in parallel at times. Just assume that sometimes it can and sometimes it can't; the main idea is, how can I prevent an SSIS call if it has to run in sequence?
If you want to control the flow sequentially, consider a design where you enqueue requests (for invoking SSIS) into a queue data structure. Only the request at the head of the queue is processed at any time; as soon as that request completes, the next one can be dequeued.
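One way to back such a queue is a table the polling service inserts into, with a dispatcher that claims requests. This is a simplified sketch; every table, column, and path name is an assumption, and a production dispatcher would need a fuller rule for mixing serial and parallel requests:

```sql
-- Assumed FIFO work queue.
CREATE TABLE dbo.SsisRequestQueue (
    RequestId    INT IDENTITY PRIMARY KEY,
    FilePath     NVARCHAR(260) NOT NULL,
    MustBeSerial BIT NOT NULL,      -- 1 = must not overlap other runs
    StartedAt    DATETIME2 NULL,
    FinishedAt   DATETIME2 NULL
);

-- The polling service enqueues instead of calling SSIS directly:
INSERT INTO dbo.SsisRequestQueue (FilePath, MustBeSerial)
VALUES (N'\\share\in\file1.csv', 1);

-- Dispatcher: atomically claim the oldest unstarted request, but only
-- while no serial run is still in flight.
WITH next AS (
    SELECT TOP (1) RequestId, FilePath, StartedAt
    FROM dbo.SsisRequestQueue WITH (UPDLOCK, READPAST)
    WHERE StartedAt IS NULL
      AND NOT EXISTS (SELECT 1
                      FROM dbo.SsisRequestQueue
                      WHERE StartedAt IS NOT NULL
                        AND FinishedAt IS NULL
                        AND MustBeSerial = 1)
    ORDER BY RequestId
)
UPDATE next
SET StartedAt = SYSDATETIME()
OUTPUT inserted.RequestId, inserted.FilePath;  -- hand this to SSIS
```

An alternative for the serial-only case is taking an application lock (sp_getapplock) around the SSIS invocation, so a second run simply waits or bails out.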