I found a post claiming that it is possible to pass parameters to a SQL Agent Job, but no "how to" was included. Can someone explain how this is done?
In my scenario, I need a stored proc to invoke the SQL Agent job and pass it one or more parameters. The job, in turn, must pass those parameters to an SSIS package step.
The alternative I've heard is to have the stored proc write values to a table, and then have the SQL Agent job (or the SSIS package it invokes) read those values from the table. I will take this latter approach if I must, though it is kludgy.
Update:
The motive for this exercise is to form an integration test consisting of (a) the SQL Agent job which provides a package configuration file and (b) the SSIS package which needs the values in the package configuration file. Thus I do not want to invoke the SSIS package directly. Furthermore, testers have neither permission to launch the SQL Agent job directly, nor should they be allowed to cause SQL Agent jobs to be created dynamically. The sproc (legally) circumvents the permission issue. Finally, the integration test can target one of dozens of SSIS packages, but it is not practical in my environment to have dozens of SQL Agent Job definitions. Ergo, a sproc launches the SQL Agent job, and the parameter indicates which SSIS package to launch.
Seems like you should be using the SSIS package configurations feature. Your stored proc can update the configuration value(s).
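For example, if the package uses a SQL Server-based package configuration, the configured values live in a table (named [SSIS Configurations] by default) that the proc can update before starting the job. A minimal sketch, with an illustrative configuration filter and package variable:

    -- Update a SQL Server-based SSIS package configuration value.
    -- 'PackageSelector' and User::PackageName are illustrative names.
    UPDATE dbo.[SSIS Configurations]
    SET    ConfiguredValue     = N'LoadCustomers.dtsx'
    WHERE  ConfigurationFilter = N'PackageSelector'
      AND  PackagePath         = N'\Package.Variables[User::PackageName].Properties[Value]';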
A different approach could be to dynamically create the job with the SSIS parameters set in the job step. A possible advantage of this is the job is persisted so if there are issues or questions about what a parameter was, the job is around and you can debug the job/job step creation logic.
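A minimal sketch of that approach, with illustrative names throughout (the /SET switch in the step command is how an SSIS job step overrides a package variable):

    DECLARE @jobId UNIQUEIDENTIFIER;
    DECLARE @cmd   NVARCHAR(MAX) =
          N'/FILE "\\server\packages\LoadCustomers.dtsx" '
        + N'/SET "\Package.Variables[User::SourceFile].Properties[Value]";"\\server\in\customers.csv"';

    EXEC msdb.dbo.sp_add_job     @job_name = N'LoadCustomers_adhoc', @job_id = @jobId OUTPUT;
    EXEC msdb.dbo.sp_add_jobstep @job_id = @jobId, @step_name = N'Run package',
                                 @subsystem = N'SSIS', @command = @cmd;
    EXEC msdb.dbo.sp_add_jobserver @job_id = @jobId;  -- target the local server
    EXEC msdb.dbo.sp_start_job     @job_id = @jobId;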
If the Agent is simply a means to running an SSIS package, the DBAs at my last gig created 2 procs, RunSSISPackage and RunSSISPackage32 that did nothing more than call out to dtexec to run a package passing all the supplied parameters.
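Those procs aren't shown here, but a wrapper along those lines might look like the sketch below, assuming xp_cmdshell is enabled on the instance (RunSSISPackage32 would simply point at the 32-bit dtexec instead):

    -- Sketch of a dtexec wrapper; the names and the xp_cmdshell dependency are assumptions.
    CREATE PROCEDURE dbo.RunSSISPackage
        @PackagePath NVARCHAR(260),          -- full path to the .dtsx file
        @SetOptions  NVARCHAR(4000) = N''    -- optional extra /SET switches
    AS
    BEGIN
        DECLARE @cmd NVARCHAR(4000) =
            N'dtexec /FILE "' + @PackagePath + N'" ' + @SetOptions;
        EXEC master..xp_cmdshell @cmd;
    END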
There's also GiiM's approach which can work just as well but until we know what problem you are attempting to solve, it's hard to state what is a better approach.
I would like to create a stored procedure that calls a SQL Agent job, which will in turn call an SSIS package. The job and the SSIS package will reside on the same database server. However, I would prefer that the stored procedure resides on a different database server. The reason behind this is that we have an app that will be calling the sproc. I don't want to give it access to the database where the SQL Agent job and SSIS package reside.
How would I go about doing this?
If it's in a different database on the same instance, it's very simple. Just use a dotted name (db_name.schema_name.object_name).
In a different instance, you must create a linked server and use a dotted name like:
instance_name.db_name.schema_name.object_name
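For this scenario, that means calling sp_start_job in the remote msdb through the linked server; the linked server name below is illustrative, and RPC Out must be enabled on it:

    -- Start the Agent job on the remote server through a linked server.
    EXEC [JobServer].msdb.dbo.sp_start_job @job_name = N'MyAgentJob';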
I want to copy all data from all tables from one SQL server database to another existing SQL server database of the same structure. I have a script to initially delete all contents of all tables in my output database before proceeding so that it is 'fresh' for the copy.
I understand that 'select into' statements can get this done but I want to be able to do it in bulk. I want to emulate the behavior that works very well in Management Studio of:
Right-click a DB
Select 'Tasks'
Select 'Export Data...'
In here, I can select an output DB and then select all tables. The transfer goes straight through without issue. I cannot find a command line way to achieve this.
The reason I am after this is that we want a daily copy of the prod database in a testing environment, so need to task schedule this process to run each night.
Due to some constraints, I can't use a bacpac in this case.
Using the import/export task in SSMS, the last step has 2 options: run immediately, or save as an SSIS package. So - save it as an SSIS package. You can then run this package whenever you want. And yes - you will need to do this twice. Once for export, once for import. You can also do exactly the same thing using SSIS directly, btw.
So how do you execute a package from the command line? As with any question, a search will turn up plenty of suggestions and examples; the short answer is dtexec.
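A minimal sketch, assuming the package was saved to the file system at an illustrative path:

    dtexec /FILE "C:\Packages\ProdToTestCopy.dtsx"

(Packages saved to msdb or the SSIS catalog use /SQL or /ISSERVER instead of /FILE.)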
And if needed, you can schedule this using the agent.
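For the nightly copy in the question, the schedule could be created like this (job and schedule names are illustrative):

    -- Run the job every night at 01:00.
    EXEC msdb.dbo.sp_add_jobschedule
         @job_name          = N'NightlyProdToTestCopy',
         @name              = N'Nightly 1am',
         @freq_type         = 4,         -- daily
         @freq_interval     = 1,         -- every day
         @active_start_time = 010000;    -- HHMMSS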
I have an SSIS package (SQL Server 2016) that includes an Execute DDL Task. The task is to backup a SSAS tabular cube (so it can be replicated on a 2nd server that's on a different domain).
Generally when I write SSIS packages, I parameterize everything that could change over time. In this case, I want to parameterize where I back the file up to as well as the password to use.
In a regular SSIS Script task, I can reference project parameters directly. I don't see any way to do this with an Execute DDL Task. The best I can do is source the entire script from a variable, which then references the parameters in question.
The challenge I have is that I'm working with a password, so I want to mark the password project parameter as sensitive. As soon as I do this, I cannot reference the parameter in a variable.
My options appear to be:
Leave the password project parameter not marked as sensitive (meaning it is visible to anyone who opens the package and/or has access to the SSISDB environment variables)
Hard-code the script inside the DDL task and not parameterize the password or file name at all. Further, encrypt the entire package (rather than just the sensitive properties) so that anyone who opens the package cannot see it.
The second option sounds the best, since it doesn't involve a password visible in plain text. However, it doesn't allow for any parameterization of the password. (Which means every password change would require a redeployment of the package.)
Have I missed something? Is there another way to reference a parameterized password inside an Execute DDL Task in SSIS that I don't know about? Any other recommendations for this scenario?
This is the syntax for referencing a sensitive parameter in an SSIS Script Task; the sketch below assumes an illustrative project parameter named CubePassword, listed in the task's ReadOnlyVariables:
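    // Inside the Script Task's Main(); "CubePassword" is an illustrative
    // project parameter listed in the task's ReadOnlyVariables. Sensitive
    // parameters must be read with GetSensitiveValue(), not the Value property.
    string password = Dts.Variables["$Project::CubePassword"].GetSensitiveValue().ToString();

From there you can build the DDL script string in code and write it to the (non-sensitive) variable the Execute DDL Task reads from; just be aware the value is then held in plain text in that variable at run time.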
I have an automated SSRS report that runs off a SQL Server SP. A subscription then exports this to a folder each morning. I'm not sure if this next step is possible, but maybe someone knows:
Another Excel report from an outside source will be dropped into the same folder each morning. Is it possible to then automatically compare the two reports and delete records from my automated report that do not match?
I know I can manually import the new Excel file into SQL Server, do a join and then delete the records. But is it possible to do this automatically?
This sounds like a job for SSIS. SSIS is designed to handle ETL workflows like this. I would create a package that pulls in the outside Excel and either the output from your SSRS subscription or preferably calls the stored procedure itself. You can do all the data comparison you need and then output in whatever format you need or run T-SQL statements against your source database based on the results of your comparison as appropriate to your use case. You'd deploy your package to a SQL Server setup for SSIS and create a SQL Agent job to run it at the appropriate interval.
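For the comparison/delete step itself, the T-SQL an Execute SQL Task (or your manual approach) would run might look like this; the staging table and key column are illustrative:

    -- Delete report rows that have no match in the imported Excel data.
    DELETE r
    FROM   dbo.ReportOutput AS r
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dbo.ExcelStaging AS x
                       WHERE  x.RecordId = r.RecordId);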
After reviewing all the different options I am still confused.
Here is the scenario. We have multiple databases on the same server, and we would like a single SSIS job to handle imports (or exports) into (or from) a table from a file. We are calling this from vb.net, and the job is running on SSIS on the server. We don't have xp_cmdshell available.
We need to pass the job unique run information (it is possible that 2 people could be running the same job on the same db, or on a different db on the same server), the database connection information (this cannot be stored and selected in the job, as dbs may be added/removed as needed and we don't want to reconfigure the job), and the file name/path (on the server or a permitted UNC path available to SSIS).
We have looked at the option of declaring the job/job steps and then directly executing the job. We like this idea in that the jobs would be unique, and we could have the SQL proc that the job calls report issues back to a common log table by the job id, which would then be available to review.
What I don't really follow is how to pass the information that this job needs.
In http://code.msdn.microsoft.com/Calling-a-SSIS-Package-a35afefb I see them passing parameters using the SET command, but I get confused by the explanation that the call causes things to be processed twice. Also, in that example, would I be changing the Master DB reference to my DB in the Add Job Step?
My issue is that no example is really clean and simple about passing parameters and changing DBs. A lot use different options, like a list of dbs to process from a data source, and none really cleanly shows me what to do with a variable that will be passed on down to a called stored procedure.
I don't have time to delve deep and experiment; I need to see how it is done, as I am trying to understand it at a level back so I know how we can utilize it and fit in the information we need to use (i.e., what do I need for connection information to dynamically assign it?), since I need to know where in the grand scheme I am getting that information. (We don't store that in the actual DB doing the job; we have a repository in a central DB for that, but I don't know exactly what I need to store!)
Brian
Parameters that are dynamic to a single run of a job can be passed in to the SSIS package through a config table. The process that starts the job sets any necessary parameters in the config table before starting the job. The job kicks off the SSIS package, which has a connection manager to read the values out of the config table and into parameter values within the SSIS package.
You mentioned that you have database connection information; if you choose to pass in parameters through a table, keep in mind that storing SQL login information in a database is bad practice. The connection manager in the SSIS package should use Windows authentication, and any permissions the SSIS package needs can be granted to the SQL Agent service account.
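A minimal sketch of the pattern, with illustrative names (the package reads the row back through a connection manager pointed at this table):

    -- One row per run, keyed by a run ID, written just before the job starts.
    CREATE TABLE dbo.JobRunConfig
    (
        RunId        UNIQUEIDENTIFIER NOT NULL PRIMARY KEY,
        DatabaseName SYSNAME          NOT NULL,
        FilePath     NVARCHAR(260)    NOT NULL,
        CreatedAt    DATETIME2        NOT NULL DEFAULT SYSUTCDATETIME()
    );

    -- The process that starts the job:
    DECLARE @RunId UNIQUEIDENTIFIER = NEWID();
    INSERT dbo.JobRunConfig (RunId, DatabaseName, FilePath)
    VALUES (@RunId, N'CustomerDb07', N'\\server\in\customers.csv');

    EXEC msdb.dbo.sp_start_job @job_name = N'ImportFromFile';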
From what I understand, you want to run a package (or packages) via a SQL Agent job, and the database it will run against is subject to change.
As supergrady says, you can pass in specific parameters to the package through a config table.
What I did was to create a config table and add a status column (a bit that indicates on/off, true/false). This allows me to run a SQL script setting the status on for the specific databases that I want and off for those that I don't. For me this is easier than opening up the job and fiddling with the command line values, which is another way of getting what you want. I hope this helps.
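Something like this, with illustrative table and database names:

    -- Turn on the databases this run should process and turn off the rest.
    UPDATE dbo.PackageConfig SET IsActive = 1
    WHERE  DatabaseName IN (N'SalesDb', N'OrdersDb');

    UPDATE dbo.PackageConfig SET IsActive = 0
    WHERE  DatabaseName NOT IN (N'SalesDb', N'OrdersDb');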