Use a parameterized password in an Execute DDL Task - sql-server

I have an SSIS package (SQL Server 2016) that includes an Execute DDL Task. The task is to back up an SSAS tabular cube (so it can be replicated on a second server that's on a different domain).
Generally when I write SSIS packages, I parameterize everything that could change over time. In this case, I want to parameterize where I back the file up to, as well as the password to use.
In a regular SSIS Script task, I can reference project parameters directly. I don't see any way to do this with an Execute DDL Task. The best I can do is source the entire script from a variable, which then references the parameters in question.
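For reference, the expression on that variable looks roughly like this (database and parameter names simplified), concatenating the project parameters into the XMLA Backup command:

    "<Backup xmlns=\"http://schemas.microsoft.com/analysisservices/2003/engine\"><Object><DatabaseID>MyTabularDb</DatabaseID></Object><File>"
    + @[$Project::BackupPath]
    + "</File><Password>" + @[$Project::BackupPassword] + "</Password></Backup>"
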
The challenge is that I'm working with a password, so I want to mark the password project parameter as sensitive. As soon as I do, I can no longer reference the parameter in the variable's expression.
My options appear to be:
1. Leave the password project parameter unmarked as sensitive (meaning it is visible to anyone who opens the package and/or has access to the SSISDB environment variables).
2. Hard-code the script inside the DDL task and not parameterize the password or file name at all. Further, encrypt the entire package (rather than just the sensitive properties) so that anyone who opens the package cannot see it.
The second option sounds best, since it doesn't leave a password visible in plain text. However, it doesn't allow for any parameterization of the password (which means every password change would require a redeployment of the package).
Have I missed something? Is there another way to reference a parameterized password inside an Execute DDL Task in SSIS that I don't know about? Any other recommendations for this scenario?

This is the syntax for referencing a sensitive parameter in an SSIS Script Task (a sketch; the parameter name is illustrative):
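
    // The sensitive parameter must be listed in the Script Task's
    // ReadOnlyVariables for this to work (parameter name is illustrative).
    string password = Dts.Variables["$Project::CubePassword"].GetSensitiveValue().ToString();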

Best approach to pass SSIS parameter/variable value

There are multiple ways to pass values to an SSIS package: we could use variables, package parameters, or project parameters, or we could save values in a table and have SSIS pick the values up from it. My way of coding is to use project parameters and read them into variables for use. When the package is deployed to the SSIS catalog, an environment (ENV) can be set up to overwrite parameter values as the user requires. I am now weighing the risks/ease of having the user set up an ENV to pass parameter values versus setting up a table to hold the values and coding the SSIS package to pick them up. Please share your thoughts on the pros and cons of both approaches.
For example, let's assume we have an SSIS package that saves data to a CSV file, and the folder path where the CSV files must be saved varies depending on the server (DEV/UA/Prod). Is it best to save the folder path value in a table along with the server name, or to set the folder path as a parameter and ask the user who executes the package to set the folder value in the ENV at execution time, depending on the server?
Update on 23 Mar 2022 - Based on all the valuable inputs, I decided to use parameters and variables rather than a SQL table to pick values from.
In my experience, variables are best served by using an Execute SQL Task and returning the results to a variable. It's modular, and it means certain steps can easily be disabled if need be.
For managing connections (without outright hard-coding a connection string), I'd advise saving the CSV file location via a parameter. A parameter can be modified in the deployment environment via SQL Server Agent and doesn't require changes to a source table. If I can avoid it, I wouldn't ever put file location information in a source table, as it makes the table less portable.
As mentioned in the official documentation:
Integration Services (SSIS) parameters allow you to assign values to properties within packages at the time of package execution.
Parameters were introduced in SQL Server 2012. They were added to avoid using variables and external configuration files to pass arguments at package execution time.
If the CSV files' directory changes based on the package environment, parameters are best suited to this situation. Still, other options can be used.
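As a rough sketch of how an environment can overwrite a parameter value in the SSIS catalog (folder, project, and variable names here are made up):

    DECLARE @ref_id BIGINT;
    EXEC SSISDB.catalog.create_environment
         @folder_name = N'ETL', @environment_name = N'Prod';
    EXEC SSISDB.catalog.create_environment_variable
         @folder_name = N'ETL', @environment_name = N'Prod',
         @variable_name = N'CsvFolder', @data_type = N'String',
         @sensitive = 0, @value = N'\\prodshare\csv';
    EXEC SSISDB.catalog.create_environment_reference
         @folder_name = N'ETL', @project_name = N'MyProject',
         @environment_name = N'Prod', @reference_location = 'R',
         @reference_id = @ref_id OUTPUT;
    -- value_type 'R' binds the project parameter to the environment variable
    EXEC SSISDB.catalog.set_object_parameter_value
         @object_type = 20, @folder_name = N'ETL', @project_name = N'MyProject',
         @parameter_name = N'FolderPath', @parameter_value = N'CsvFolder',
         @value_type = 'R';
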
References
SSIS Parameters vs. Variables
There are several methods available, and each has its pros and cons.
Utilizing parameters, whether at package or project level, is good for anything that needs to change regularly at execution time, as those can be changed with a script and then the package can be started at the end of the script. This also means whoever needs to execute the packages must have the appropriate security levels and knowledge.
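For example, a T-SQL script can set a parameter for a single execution and then start the package (folder, project, and parameter names are made up):

    DECLARE @execution_id BIGINT;
    EXEC SSISDB.catalog.create_execution
         @folder_name = N'ETL', @project_name = N'MyProject',
         @package_name = N'Export.dtsx', @execution_id = @execution_id OUTPUT;
    -- object_type 30 = package parameter (20 = project parameter)
    EXEC SSISDB.catalog.set_execution_parameter_value
         @execution_id = @execution_id, @object_type = 30,
         @parameter_name = N'FolderPath', @parameter_value = N'\\share\csv';
    EXEC SSISDB.catalog.start_execution @execution_id;
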
Setting up environments is good for static content such as connection strings or email addresses for errors. It is possible to set up one master environment that other folders utilize, or you can set one up for each folder. The downside is that the person deploying the package needs to know how the environments are used, and if they are outside the catalog folder, an extra step is required for mapping in SQL Server Agent.
My preferred method is to create one table that holds the information; the package connects to the table as its first step and loads the values into the variables. I have implemented this at my current position and it has become standard on all packages. It allows some content to be defaulted, and we have tables for DEV, QA, and Prod, so as the packages are migrated the values are filled in. The table contains the package name, variable name, variable value, and audit columns to see when rows were added or updated. The table is temporal, so it tracks all changes.
The packages execute an SP that pivots the rows to return a single row. The pivot is dynamic, so it adds columns to the result set as needed. When a value is marked as the default it applies to all packages, but if the same variable name is listed under a specific package name, that value shows instead of the default. For example, when testing I may want to send all error messages to my email instead of the group inbox, so I add a record of my package name, Email_Alert, and my email address. This then shows as my email address for testing; when going to QA or Prod I do not include that record in the other tables, so it uses the default inbox.
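As a rough sketch (exact column names will vary; SQL Server 2016+ temporal syntax), the table looks something like this:

    CREATE TABLE dbo.PackageVariable
    (
        PackageName   sysname        NOT NULL,  -- a 'DEFAULT' row applies to all packages
        VariableName  sysname        NOT NULL,
        VariableValue nvarchar(1000) NOT NULL,
        ModifiedBy    sysname        NOT NULL DEFAULT (SUSER_SNAME()),
        ValidFrom     datetime2 GENERATED ALWAYS AS ROW START NOT NULL,
        ValidTo       datetime2 GENERATED ALWAYS AS ROW END NOT NULL,
        PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo),
        CONSTRAINT PK_PackageVariable PRIMARY KEY (PackageName, VariableName)
    )
    WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.PackageVariableHistory));
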
Utilizing the table also gives me the ability to have an SSRS report that shows the variables used for each package, and it allows me to change the values as needed while keeping an audit log of who changed what value and when. This is useful when something needs to change for backdating or anything else, as I can make the change, execute the package, and then change the value back. If the department is ever audited on anything, I have a full audit trail that I can provide in a matter of minutes, not days. There is also a rule that no values are allowed to be hard-coded into variables anymore; they must be in the table. Stored procedure names are also saved in the table and passed to the package, so if we need to update an SP we do not need to redeploy the package.
We try to build all SSIS packages so we can adjust to changes without needing to redeploy, as that is when mistakes are often made.

SSMS Query to Text File

I have a complicated query that marshals data into a temporary table, which I then marshal into a further output temporary table before finally selecting from it to display to screen. This gets saved out from Grid view to text, and I get the file I need for processing off site.
What I want to do is have this query be runnable and create that file on the local disk without any need for the operator to change the "Results to" option or fiddle with anything.
What command or functionality might be available to me to do this?
I cannot install any stored procedures or similar on the server involved.
Since you can't do anything on the server, I would suggest writing an SSIS package. Create a data flow, and put your script in your source object. Your destination object will then point to the file you want. You have a fair number of options for output.
The SSIS package can then be run by
A SQL Job (assuming you are allowed even that)
A non-SQL job running a bat file with a DTEXEC command (see the example below)
The DTEXECUI GUI.
Also you can store your SSIS package in the instance or on any fileshare you choose.
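For the bat file option, the DTEXEC call might look something like this (package path and variable name are made up):

    dtexec /FILE "C:\SSIS\ExportQuery.dtsx" /SET "\Package.Variables[User::OutputFile].Properties[Value]";"C:\Exports\results.txt"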

Pass connection information and parameters to server based SSIS job from vb.net

After reviewing all the different options I am still confused.
Here is the scenario. We have multiple databases on the same server, and we would like a single SSIS job to handle imports into (or exports from) a table, from (or to) a file. We are calling this from vb.net, and the job runs in SSIS on the server. We don't have xp_cmdshell available.
We need to pass the job unique job information (it is possible that two people could be running the same job on the same DB, or on a different DB on the same server), the database connection information (this cannot be stored and selected in the job, as DBs may be added/removed as needed and we don't want to reconfigure the job), and the file name/path (on the server, or on a permitted UNC path available to SSIS).
We have looked at the option of declaring the job/job steps and then directly executing the job. We like this idea in that the jobs would be unique, and we could have the SQL proc that the job calls report issues back to a common log table by the job ID, which would then be available to review.
What I don't really follow is how to pass the information that this job needs.
In http://code.msdn.microsoft.com/Calling-a-SSIS-Package-a35afefb I see them passing parameters using the SET command, but I get confused by the explanation that the call is processed twice. Also, in that example, would I be changing the Master DB reference to my DB in the Add Job Step?
My issue is that no example is really clean and simple about passing parameters and changing DBs; a lot use different options, like a list of DBs to process from a data source, and none cleanly show me what to do with a variable that will be passed on down to a called stored procedure.
I don't have time to delve deep and experiment; I need to see how it is done, as I am trying to understand it one level back so I know how we can utilize it and what information we need to use (i.e., what do I need for connection information in order to assign it dynamically?), since I need to know that to understand where in the grand scheme I am getting that information. (We don't store that in the actual DB doing the job; we have a repository in a central DB for that, but I don't know exactly what I need to store!)
Brian
Parameters that are dynamic to a single run of a job can be passed in to the SSIS package through a config table. The process that starts the job sets any necessary parameters in the config table before starting the job. The job kicks off the SSIS package, which has a connection manager to read the values out of the config table and into parameter values within the SSIS package.
You mentioned that you have database connection information, and if you choose to pass in parameters through a table keep in mind that storing SQL login information in a database is bad practice. The connection manager in the SSIS package should use windows authentication, and permissions needed by the SSIS package can be granted to the SQLAgent service account.
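A minimal sketch of that pattern (table, column, and job names are hypothetical):

    -- Config table keyed by a caller-generated run ID
    CREATE TABLE dbo.ImportJobConfig
    (
        RunId        uniqueidentifier NOT NULL PRIMARY KEY,
        DatabaseName sysname          NOT NULL,
        FilePath     nvarchar(260)    NOT NULL,
        RequestedAt  datetime2        NOT NULL DEFAULT SYSUTCDATETIME()
    );

    -- The caller inserts a row and then starts the job; the package's first
    -- step reads the row and maps the columns to its parameter values.
    INSERT dbo.ImportJobConfig (RunId, DatabaseName, FilePath)
    VALUES (NEWID(), N'CustomerDb17', N'\\fileshare\imports\customers.csv');

    EXEC msdb.dbo.sp_start_job @job_name = N'Import From File';
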
From what I understand, you want to run a package (or packages) via a SQL Agent job, and the database it will run against is subject to change.
As supergrady says, you can pass in specific parameters to the package through a config table.
What I did was to create a config table and add a status column (a bit that indicates on/off, true/false). This allows me to run a SQL script setting the status on for the specific databases that I want and turning it off for those that I don't. For me this is easier than opening up the job and fiddling with the command-line values, which is another way of getting what you want. I hope this helps.
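Something along these lines (assuming a config table with an added status bit; names are illustrative):

    -- Turn the wanted databases on and everything else off
    UPDATE dbo.PackageConfig
    SET    IsActive = CASE WHEN DatabaseName IN (N'SalesDb', N'HRDb') THEN 1 ELSE 0 END;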

Pass a Parameter(s) to SQL Agent Job

I found a post claiming that it is possible to pass parameters to a SQL Agent Job, but no "how to" was included. Can someone explain how this is done?
In my scenario, I need a stored proc to invoke the SQL Agent and pass a parameter(s). The SQL Agent job, in turn, must pass the parameter(s) to an SSIS package step.
The alternative I've heard is to have the stored proc write values to a table, and then have the SQL Agent job (or the SSIS package it invokes) read those values from the table. I will take this latter approach if I must, though it is kludgey.
Update:
The motive for this exercise is to form an integration test consisting of (a) the SQL Agent job which provides a package configuration file and (b) the SSIS package which needs the values in the package configuration file. Thus I do not want to invoke the SSIS package directly. Furthermore, testers have neither permission to launch the SQL Agent job directly, nor should they be allowed to cause SQL Agent jobs to be created dynamically. The sproc (legally) circumvents the permission issue. Finally, the integration test can target one of dozens of SSIS packages, but it is not practical in my environment to have dozens of SQL Agent Job definitions. Ergo, a sproc launches the SQL Agent job, and the parameter indicates which SSIS package to launch.
Seems like you should be using the SSIS package configurations feature. Your stored proc can update the configuration value(s).
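For example, if the package uses a SQL Server package configuration, the proc can update the configuration table directly (the default table name is shown; your table, filter, and variable names may differ):

    UPDATE dbo.[SSIS Configurations]
    SET    ConfiguredValue = N'\\fileshare\exports'
    WHERE  ConfigurationFilter = N'MyPackageConfig'
      AND  PackagePath = N'\Package.Variables[User::OutputFolder].Properties[Value]';
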
A different approach could be to dynamically create the job with the SSIS parameters set in the job step. A possible advantage of this is that the job is persisted, so if there are issues or questions about what a parameter was, the job is still around and you can debug the job/job-step creation logic.
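A rough sketch of that approach (job name, package path, and variable name are made up; the step shells out to dtexec via CmdExec):

    DECLARE @job sysname = N'RunExport_' + CONVERT(nvarchar(36), NEWID());
    EXEC msdb.dbo.sp_add_job @job_name = @job;
    EXEC msdb.dbo.sp_add_jobstep
         @job_name = @job, @step_name = N'Run package',
         @subsystem = N'CmdExec',
         @command = N'dtexec /FILE "C:\SSIS\Export.dtsx" /SET "\Package.Variables[User::Db].Properties[Value]";"CustomerDb17"';
    EXEC msdb.dbo.sp_add_jobserver @job_name = @job;  -- target the local server
    EXEC msdb.dbo.sp_start_job @job_name = @job;
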
If the Agent is simply a means to running an SSIS package, the DBAs at my last gig created two procs, RunSSISPackage and RunSSISPackage32, that did nothing more than call out to dtexec to run a package, passing all the supplied parameters.
There's also GiiM's approach, which can work just as well, but until we know what problem you are attempting to solve, it's hard to say which approach is better.

Creating a New Database from Within a Stored Procedure

Due to an employee quitting, I've been given a project that is outside my area of expertise.
I have a product where each customer will have their own copy of a database. The UI for creating the database (licensing, basic info collection, etc.) is being outsourced, so I was hoping to just have a single stored procedure they can call, providing a few parameters, and have the SP create the database. I have a script for creating the database, but I'm not sure of the best way to actually execute the script.
From what I've found, this seems to be outside the scope of what an SP can easily do. Is there any sort of "best practice" for handling this sort of program flow?
Generally speaking, SQL scripts - both DML and DDL - are what you use for database creation and population. SQL Server has a command-line interface called SQLCMD that these scripts can be run through - here's a link to the MSDN tutorial.
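For example (server, file, and variable names are hypothetical; -E uses Windows auth, and -v passes a scripting variable that the script references as $(DbName), e.g. CREATE DATABASE [$(DbName)];):

    sqlcmd -S MyServer -E -i CreateCustomerDb.sql -v DbName="Customer42"
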
Assuming there's no customization to the tables or columns involved, you could get away with using either detach/attach or backup/restore. These would require that a baseline database exist, with no customer data. Then you use either of those methods to capture the database as-is. Backup/restore is preferable because detach/attach requires the database to be taken offline. But users need to be synced before they can access the restored database.
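For instance, restoring the baseline backup as a new customer database (paths and logical file names are illustrative):

    RESTORE DATABASE [Customer42]
    FROM DISK = N'D:\Baseline\Baseline.bak'
    WITH MOVE N'Baseline_Data' TO N'D:\Data\Customer42.mdf',
         MOVE N'Baseline_Log'  TO N'D:\Logs\Customer42_log.ldf';
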
If you already have the script to create the database, it is easy for them to run it from within their program. If you have any specific prerequisites for creating the database and setting permissions accordingly, you can wrap all the scripts up into one script file to execute.
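One possible shape for that wrapper proc, using dynamic SQL so the caller supplies the name (a sketch; proc and parameter names are made up):

    CREATE PROCEDURE dbo.CreateCustomerDatabase
        @DbName sysname
    AS
    BEGIN
        -- QUOTENAME guards against injection through the database name
        DECLARE @sql nvarchar(max) = N'CREATE DATABASE ' + QUOTENAME(@DbName) + N';';
        EXEC (@sql);
        -- ...then run the table/permission scripts against the new database
    END;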
