SSIS package executed in SQL Server Agent doesn't do its work (even while reporting success)

I have to say that I hate asking such a general question as "What am I doing wrong?", but I simply have no idea what the problem could be:
I've created an SSIS package that takes data from flat files (CSV), computes the average of one of the columns grouped by date, writes the result to the database, and deletes the original file. Everything works fine when executed within SSIS, but when I schedule it with SQL Server Agent it simply doesn't work: the log reports success, but there is no new data in the database and the .csv file is still in its original location.
I know about the problem with the ProtectionLevel setting in SSIS, so I've changed it to "EncryptAllWithPassword" and I use the same password in the Agent job.
Here is a link to the Server Agent Job script (created as "script job as DROP and CREATE")
Edit: Just to make things weirder, using
dtexec /f {filepath} /de {password}
executes the package without a problem. I know I could schedule such a command in Windows itself, but I'd like to keep all scheduled jobs in one place: SQL Server Agent.
EDIT: Solved by changing the path to a UNC path.

There are two important things to remember when setting up packages to run via a SQL Server Agent job.
Use UNC paths for all file locations, no matter how simple. There is a high probability that the server will have a different view of the file structure than your development machine, so UNC paths ensure that both machines are referencing the same paths.
Use a proxy account to execute that package, as described here http://www.mssqltips.com/sqlservertip/2163/running-a-ssis-package-from-sql-server-agent-using-a-proxy-account/.
The proxy account must have access to the physical paths and the server objects.
This also allows for security stratification on your various packages (not all packages need access to everything).
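The linked article walks through the GUI setup; purely as a hedged illustration, the same proxy can be scripted in T-SQL along these lines (the credential, proxy and account names below are placeholders, not anything from the question):

USE master;
-- Credential storing the Windows account that has rights on the UNC share
CREATE CREDENTIAL SSISFileAccessCredential
    WITH IDENTITY = N'DOMAIN\ssis_file_user',
         SECRET   = N'StrongPasswordHere';
GO
USE msdb;
-- Agent proxy bound to that credential, allowed to run SSIS job steps
EXEC dbo.sp_add_proxy
     @proxy_name      = N'SSISFileAccessProxy',
     @credential_name = N'SSISFileAccessCredential',
     @enabled         = 1;
EXEC dbo.sp_grant_proxy_to_subsystem
     @proxy_name     = N'SSISFileAccessProxy',
     @subsystem_name = N'SSIS';
GO

If the job owner is not a sysadmin, the owner's login also has to be granted use of the proxy (sp_grant_login_to_proxy) before the step will accept it.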

Related

How best to map a network drive from within SSIS

I've inherited an SSIS package which loads CSV files into a SQL database.
The first thing the package does is call a .BAT file which maps a network drive. The .BAT file contains the username and password used to map the drive in plain text, so it urgently needs to be replaced.
I've written a solution which uses New-PSDrive in a PowerShell script and creates a credentials XML file with the password encrypted. If I execute the .ps1 script directly, it works OK.
If I call the ps1 script from within SSIS and run it through VS then it also works fine.
When I call the SSIS package that calls the script task through a SQL Agent job (executed as the SQL Server Agent user account) the drive mapping doesn't seem to work, and the package complains it cannot access the file from the file share.
I presume this is not working because the SQL Server Agent user account can't run PowerShell scripts? The package only errors because it cannot access the share, not during execution of the PowerShell script.
So, two questions:
Is there an obvious issue with the SQL Agent deployment idea?
or
Is there a different way of mapping a network drive from within SSIS without storing the password in plain text anywhere?
This is the pattern I suggest using for accessing files on a remote server:
Access the files via a UNC path: \\MyServer\Files
This means that the package needs to run under a context that has access to the path. There are a couple of options:
Run the package under a proxy, i.e. the account whose credentials were stored in the .bat file. This means that you also have to grant that account access to the SQL instance (a T-SQL sketch of a job step run under a proxy follows after this answer).
Use a group managed service account (gMSA). In this case, the SQL Agent service account is replaced with the gMSA and the job runs under SQL Agent. The gMSA needs to be granted access to the remote share as well as the SQL instance. This is a much more secure way of addressing the whole process because there is no password to manage (AD takes care of that) and the account cannot be used to log in anywhere. But there is setup work to do to get it created, so it's not the fastest option.
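Purely as a sketch of the first option (job, package path and proxy names are placeholders): once a proxy exists for the account that has rights on the share, the SSIS job step is pointed at it, so the step no longer runs as the Agent service account.

USE msdb;
-- The @proxy_name argument is what makes the step run under the proxy
EXEC dbo.sp_add_jobstep
     @job_name   = N'Load remote files',
     @step_name  = N'Run SSIS package',
     @subsystem  = N'SSIS',
     @command    = N'/FILE "\\MyServer\Packages\LoadFiles.dtsx" /REPORTING E',
     @proxy_name = N'SSISFileAccessProxy';

In the GUI this is the "Run as" drop-down on the job step; the proxy also has to be granted to the SSIS subsystem, as in the answer to the first question above.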

SSIS file exists check works in SSDT but not when running from SSISDB

This is one that has me stumped, and I've been doing this a long while.
Migrating to SQL Server 2016, a large number of ETL packages. Easy enough.
One of the ETL packages has a simple script task that takes a table of files and runs a file-exists check in a foreach loop.
It uses a project parameter to create the UNC path (\\servername\share) and then binds that to the file name in the script task.
It uses an environment configuration set up in SSISDB.
Executing in SSDT works fine; deploy to the catalog and it can't see the file. I know you'll say permissions, but I've granted the Everyone group permissions on the share and the drive in case it's that. SSISDB execution means it should be running under my security context, and I'm domain admin, local admin and creator owner of the share.
Even stranger, I have created a simple package to grab the contents of one of the files and import it into a dump table, in case the permissions or the path were duff (even though they work in SSDT, it might be the environment config in SSISDB). This works fine, therefore it can't be the environment setup of SSISDB being referenced.
Please note this is not running from an Agent job yet, so it won't be due to an Agent service account issue. I need to get it running from SSISDB first, then I'll create an Agent job.
So: a script task can't see a UNC share, built from two variables, that works in SSDT, and it's running under the same credentials...
Go
For what it's worth, the script task code is:
Dts.Variables("BolFileExists").Value = File.Exists(Dts.Variables("StrLoadFileLocation").Value.ToString & Dts.Variables("StrCurrentFile").Value.ToString)
This is a slightly different answer as it shows a different approach and removes the script task: I use a Foreach Loop to check whether the file exists, using the GUI tools provided by SSIS.
Well I found the answer and I deserve to punch myself in the face.
Tried everything. It was a file variable and a path variable being pulled together in the script task, so I tried concatenating them before the script task and pumped the result into a table to make sure the value being built was right.
Literally everything was fine and it still didn't work.
The issue....
Building it as a 2017 package and deploying it onto a 2016 SQL Server.
I've not found what was missing DLL-wise, but it must have been one of those that meant the script task couldn't find the files. Weird that it didn't break and just said the files weren't there!
Thanks all for input, I’m going to go put my head in the door and slam it

SQL Agent cannot access UNC share

I have an SSIS package which creates a folder in a UNC share and then creates a file there (using a script task).
The domain account used by SSIS and the Agent has all possible permissions on the database computer and the share computer.
It always fails there.
I created a test SQL Agent job which creates a backup of the database in the same location and it fails too (Operating system error 5 - access denied).
EDIT: The above test example may be irrelevant since the backup operation is executed by SQL Server Database Engine and not the SQL Agent itself (Agent still schedules the task).
I cannot debug the script task in SSIS and therefore I'm not sure what the problem is.
I have managed to fix this problem. The first problem was lack of sufficient task activation/execution permissions in the DCOM config node in Component Services. The permissions had to be set for SQL Server Integration Services.
The second problem was the fact that the UNC path looked like this:
\\192.168.250.51\C$\Folder\
I needed to create another share (visible) like that:
\\COMPUTER-NAME\Folder\
Also don't try to map any drives to the folders. It won't work.
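As a quick sanity check once the visible share exists, a test like the backup mentioned in the question can be scripted (database and share names are placeholders), keeping in mind the caveat from the edit that a backup is written by the Database Engine service account rather than the Agent account:

-- Test that the service account can write to the visible share
BACKUP DATABASE [MyDatabase]
TO DISK = N'\\COMPUTER-NAME\Folder\MyDatabase_test.bak'
WITH INIT, COPY_ONLY;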

TSQL and UNC Paths

I'm not an expert with T-SQL, so please have patience with me. Recently I was doing a project in T-SQL on my local server using SQL Server 2008 R2 Management Studio. I was reading my files from a temp folder on my C: drive and bulk inserting them into tables.
Then I went and moved to a regular server instead of my local server on my machine.
It took me a bit to realize that I no longer had access to my local machine folders and files, and that is causing me issues.
I've read that one solution is to create a mapped drive on the server, but this is not an option for me.
So my question is what are other options for me? Could I use UNC paths to access my files or anything else?
The files I want to access are regular text files that are comma-delimited and newline terminated.
(I saw somewhat similar questions to mine, but theirs seemed server-specific or specific to their particular issues. Also, none of them were answered.)
Actually a mapped drive won't work either because the account SQL runs under by default (local system if I recall) will not have network access.
So, the more reliable way to do this is definitely with a UNC path BUT there is more! (I've done this several times when I've needed to move database backups and log backups across servers for mirroring).
How?
On the SQL Server machine AND the other server that will host the share, create a new user (same username and password on both machines), assuming you're not using AD. The user need not be in any groups at all other than the Users group, but it must have the same name on both servers and the passwords must match.
On the SQL Server machine, change the account that SQL Server is running under. This is done in the SQL Server Configuration Manager; do not try to do this yourself via Windows Services. Choose the user that you created in step 1 above. Note you have to enter the password. Restart SQL Server after you've changed it and verify SQL Server still runs fine. It should run just as before, but now it is running as a particular user with all the permissions of that user (which are actually very limited anyhow, but at least the user can access network resources).
On the remote server, make sure the new user has NTFS permissions on the folders that will host your share. Read/write perhaps or just read if SQL is only reading data.
On the remote server, create a share pointing to the appropriate folder that you set permissions for above. Make sure if you're using share permissions that the new user also has permissions on the share (not just on NTFS on the drive).
Once all of this is set up, your SQL scripts simply use the UNC path that points to the remote share, and since SQL Server is running "as" a user with access to that share, SQL Server will see the files just fine!
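For example, with the service account switched over, a bulk load of the comma-delimited, newline-terminated files described in the question might look like this (table, share and file names are placeholders):

BULK INSERT dbo.ImportedRows
FROM '\\REMOTE-SERVER\ImportShare\data.txt'
WITH (
    FIELDTERMINATOR = ',',   -- comma-delimited columns
    ROWTERMINATOR   = '\n',  -- newline-terminated rows
    FIRSTROW        = 1      -- change to 2 if the file has a header row
);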

How do I use a different database connection for package configuration?

I have an SSIS package that sets some variable data from a SQL Server package configuration table. (Selecting the "Specify configuration settings directly" option.)
This works well when I'm using the database connection that I specified when developing the package. However, when I run it on a server (64-bit) in the testing environment (either as an Agent job or running the package directly) and I specify the new connection string in the connection managers, the package still reads the settings from the DB server that I specified in development.
All the other Connections take up the correct connection strings, it only seems to be the Package Configuration that reads from the wrong place.
Any ideas or am I doing something really wrong?
The only way I was able to do this was to use Windows Environment Variables. You can specify things like connection strings and user preferences in environment variables, and then pick up those environment variables from your SSIS Task.
I prefer to use Server Aliases in the SQL Client Configuration. That way, when you decide to point the package to another SQL Server it is as simple as editing the alias to point to the new server, no editing necessary in the SSIS package. When moving the package to a live server, you need to add the aliases, and it works.
This also helps when you have a real painful naming convention for servers, the alias can be a more descriptive name than the actual machine name.
I didn't actually understand your question completely, but I store my connection settings in configuration files, usually one for each environment (dev, production, etc.). The packages read the connection settings from the config files when they are run.
When you're creating a job to call the SSIS package and you're setting up the step, there is a tabbed area. The default tab is where you set the package name, and the next tab over is where you can set the configuration file. Have a config file for each package and change it for the server (dev, test, prod). The config files can be put directly on the dev, test, and prod servers, and then you point to them when setting up the job.
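Under the covers, choosing a configuration file on that tab just adds a /CONFIGFILE switch to the step's command line. A hedged sketch of the equivalent scripted step (job, package and config file paths are placeholders):

USE msdb;
-- The /CONFIGFILE switch is what the "Configurations" tab of the job step sets for you
EXEC dbo.sp_add_jobstep
     @job_name  = N'Nightly load',
     @step_name = N'Run package with test config',
     @subsystem = N'SSIS',
     @command   = N'/FILE "D:\Packages\Load.dtsx" /CONFIGFILE "D:\Configs\Load.Test.dtsConfig"';

Pointing the dev, test and prod jobs at different .dtsConfig files is then just a matter of changing that path.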
If you are using a SQL Server package configuration, then all the properties of the packages will come from the SQL Server table; please check that.
SSIS security the way it stands is terrible. No one will be able to support things when I am out of the office. The job never reads from the configuration file... I give up. It only works when I edit the string in the Data Sources tab. However, the password gets lost if you happen to go into the job a second time. Terrible design, absolutely horrible. You would think that when you specify an XML file in the job step it would read the connection string that is defined there, but it does not. Does this really work for anyone else?
Go to the package properties and set the deployment property to True. This should work for what you have done.
I had the identical question, and got the same answer, i.e. you cannot edit the connection string used for package configurations hosted in SQL Server, except if you specify that the SQL Server connection string should be in an environment variable.
This unfortunately does not work in my dev setup, where two environments are hosted on the same machine. I ended up following Scott Coleman's approach as detailed on SQL Server Central [Free sign-up and a good site]. The trick is that you create a view to store your configuration settings on one central server, and then use the machine that connects to it to determine which environment is active.
I used that approach, but also used the User connecting to the environment to make a determination, because my test and dev setups run on the same SSIS instance, but as different user names. Scott suggests in the comments that the application name should be set, but this cannot be changed in the package execution job step, so it was not an option.
One other caveat that I found was that I had to add "Instead of" triggers to my view to do the inserts, updates and deletes for configuration variables.
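To make the approach concrete, here is a rough sketch of what such a view can look like, assuming the default column layout of the "SSIS Configurations" table; the base table, the environment-mapping table and all names below are hypothetical, not Scott Coleman's exact code:

-- Base table holding every environment's settings
CREATE TABLE dbo.PackageConfigurations (
    Environment         nvarchar(50)  NOT NULL,  -- 'DEV', 'TEST', 'PROD'
    ConfigurationFilter nvarchar(255) NOT NULL,
    ConfiguredValue     nvarchar(255) NULL,
    PackagePath         nvarchar(255) NOT NULL,
    ConfiguredValueType nvarchar(20)  NOT NULL
);
-- Which machine/login combination belongs to which environment
CREATE TABLE dbo.EnvironmentMap (
    HostName    nvarchar(128) NOT NULL,
    LoginName   nvarchar(128) NOT NULL,
    Environment nvarchar(50)  NOT NULL
);
GO
-- The packages point at this view instead of a real configuration table;
-- it picks the environment from the connecting machine and login
CREATE VIEW dbo.[SSIS Configurations]
AS
SELECT c.ConfigurationFilter, c.ConfiguredValue, c.PackagePath, c.ConfiguredValueType
FROM dbo.PackageConfigurations AS c
JOIN dbo.EnvironmentMap AS m ON m.Environment = c.Environment
WHERE m.HostName = HOST_NAME()
  AND m.LoginName = SUSER_SNAME();
GO

Because the packages write to the view as well, it needs the INSTEAD OF triggers mentioned above to pass inserts, updates and deletes through to the base table.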
We want to keep our package configs in a database table, we know it gets backuped with our other data and we know where to find it. Just a preference.
I have found that to get this to work I can use an environment variable configuration to set the connection string of the connection manager that I am reading my package config from. (Although I had to restart the SQL Server agent before it could find the new environment variable. Not ideal when I deploy this to Production)
Looks like when you run an SSIS package as a step in a scheduled job, it works in this order:
Load each of the package configurations in the order they appear in the Package Configurations Organizer.
Set the connection strings from the Data Sources tab in the job step properties of the scheduled job.
Start running the package.
I would have expected the first 2 to be the other way around so that I can set the data source for my package config from the scheduled job. That is where I would expect other people to look for it when maintaining the package.
