My SSIS package uses a Script Task that calls File.Exists() to determine whether a file is present.
I have two servers on the same domain, e.g. THISDOMAIN: SERVER1 and SERVER2.
I have a user THISDOMAIN\ADMIN that exists on both servers.
SERVER1 needs to access SERVER2's folder C:\, which is mapped on SERVER1 as D:\.
On SERVER1 there is SQL Server 2008 R2, whose SQL Server Agent runs under the account THISDOMAIN\ADMIN.
If I log on to SSIS on SERVER1 with Windows Authentication as THISDOMAIN\ADMIN, the package executes successfully.
But if I execute the same package through SQL Server Agent, it does not see the network drive D:\ on SERVER1.
What I tried:
Why wouldn't SSIS package running in SQL Server Agent job transfer files to a network folder?
How do I create a step in my SQL Server Agent Job which will run my SSIS package?
When going across the network to access files/folders, it's always recommended to use the UNC path.
This matters especially in a situation like this, where the drive is mapped via Windows Explorer and works while you're logged in, but fails when the same path is accessed from a service that isn't logged in: drive mappings belong to an interactive logon session, so a service such as SQL Server Agent never sees them.
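In the Script Task, that means checking the share path directly rather than the drive letter. A minimal sketch, assuming the folder is exposed as the administrative share \\SERVER2\C$ and that a boolean package variable User::FileExists is listed in the task's ReadWriteVariables (the file name is a placeholder):

```csharp
// Check the file over UNC instead of the mapped drive letter.
// Mapped drives (D:\) exist only in an interactive logon session,
// so SQL Server Agent will not see them.
using System.IO;

public void Main()
{
    string uncPath = @"\\SERVER2\C$\Incoming\data.txt"; // hypothetical path

    Dts.Variables["User::FileExists"].Value = File.Exists(uncPath);
    Dts.TaskResult = (int)ScriptResults.Success;
}
```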
Assuming my SSIS package has:
Network file process (copy task)
Network flat file source task (that reads the file)
OLE DB SQL source task that reads data from a table located on a remote server (Windows Authentication)
Suppose I run the package using either of:
Visual Studio (runs with credentials of Visual Studio user)
SSIS catalog (right click package and execute, runs with credentials of SSMS user)
SQL Server Agent job step (runs under the credentials of the SQL Server service user)
SQL Server Agent job with a step using Run As based on proxy credentials (runs under the credentials of the proxy)
My understanding is that in all four situations only one hop is involved. But articles I have been reading, and some colleagues, suggest that two hops are involved and that I will have to deal with Kerberos, without being able to explain where the second hop is in this situation. Can someone advise?
I have an SSIS package that creates two text files using a data flow component. It connects to a SQL database, and if the query returns data, the two files are created.
After that, a Script Task loops through the folder the two files are written to, determines the oldest file, and moves it to another folder, roughly as in the sketch below.
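The Script Task logic is essentially the following (a sketch; the folder paths and file filter are placeholders, and ScriptResults is the enum from the SSIS Script Task template):

```csharp
// Find the oldest file in the output folder and move it to the archive
// folder. Both paths and the *.txt filter are placeholders.
using System.IO;
using System.Linq;

public void Main()
{
    string sourceDir = @"D:\Output";   // hypothetical
    string archiveDir = @"D:\Archive"; // hypothetical

    FileInfo oldest = new DirectoryInfo(sourceDir)
        .GetFiles("*.txt")
        .OrderBy(f => f.LastWriteTime)
        .FirstOrDefault();

    if (oldest != null)
        oldest.MoveTo(Path.Combine(archiveDir, oldest.Name));

    Dts.TaskResult = (int)ScriptResults.Success;
}
```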
Everything runs smoothly when I execute the package on the server: no errors, and all functions execute. Perfect.
However, when I schedule the SSIS package in a job, that Script Task only executes when the SQL query (the data flow component) returns no results and therefore no files are created. The script then moves the second file from the last run to the other folder.
If the data flow does create two new files, the Script Task does nothing.
Any ideas how to change this behavior?
Again, this only happens when executed through a job, not when run locally.
Thanks,
Daniel
When SSIS packages are executed from SQL Server, they access the file system using the SQL Server service account NT SERVICE\MSSQL$<Instance Name> (where <Instance Name> should be replaced by the installed instance name). You have to grant this account access to the relevant directories, or run the SQL job using a proxy account:
SQL Server service account permissions:
Configure File System Permissions for Database Engine Access
Setting proxy account:
Running a SSIS Package from SQL Server Agent Using a Proxy Account
Create a SQL Server Agent Proxy
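If you're not sure which account the package is actually executing under, a small diagnostic Script Task can log the current Windows identity (a sketch):

```csharp
// Fire an information event showing the Windows identity the package
// runs under; compare the output between SSDT and the Agent job.
using System.Security.Principal;

public void Main()
{
    bool fireAgain = true;
    Dts.Events.FireInformation(0, "WhoAmI",
        "Running as: " + WindowsIdentity.GetCurrent().Name,
        string.Empty, 0, ref fireAgain);
    Dts.TaskResult = (int)ScriptResults.Success;
}
```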
The package is running using a proxy account. All the other packages run and are saved in the same folder. In this one, the file saved in that folder has to be moved to a different folder, which feeds a different application that uses a third-party DLL.
I am calling the library using reflection, roughly as in the sketch below. But the DLL is not registered in SQL Server. Will that be an issue?
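The reflection call is along these lines (a sketch; the DLL path, type name, and method name are all placeholders):

```csharp
// Load the third-party DLL by file path (it is not in the GAC) and
// invoke one of its methods via reflection. All names are placeholders.
using System.Reflection;

public void Main()
{
    Assembly asm = Assembly.LoadFrom(@"D:\Apps\ThirdParty.dll");
    object processor = asm.CreateInstance("ThirdParty.Processor");
    MethodInfo process = processor.GetType().GetMethod("ProcessFile");
    process.Invoke(processor, new object[] { @"D:\Apps\Input\file.txt" });
    Dts.TaskResult = (int)ScriptResults.Success;
}
```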
I am trying to move the contents of a folder to another folder.
Trying to keep this efficient, I decided to use an Execute Process Task.
In SSDT, it works perfectly.
But when deployed to SSIS on SQL Server on the same machine, it returns PROCESS EXIT 1 and the package fails.
Is there anything here that stands out that could prevent this from working? Thanks.
The same answer applies here: grant the SQL Server service account (NT SERVICE\MSSQL$<Instance Name>) access to the directories involved, or run the job step using a proxy account, as described above.
I have an SSIS package that pulls some data out of SQL Server into a .csv file, then copies the file to a network location (using Robocopy). It runs fine as a SQL Agent job. When I use DtExec from my own machine, Robocopy fails with 'Access is denied.'
The same error occurs regardless of which network location I enter. If I use Robocopy to copy to a local drive on the server instead, it works fine. DtExec is running as my own account. If I use Remote Desktop to log in to the server (again with my own account), I can access all the network shares and run Robocopy directly, no problem. I am using UNC names throughout; no drive letters anywhere.
Is it possible DtExec is running under some low-privilege account on the server that cannot access any network resources? If so, any ideas on how to work around it?
I have two servers. Server A is the application server, and Server B is the database server. Normally I use SQL Server Management Studio on Server A to query. I intend to take a database backup (.bak), but whenever I do this through SSMS, the file is created on Server B. I don't have access to make a remote connection to Server B.
How do I get the backup file?
You can specify a UNC path for the target of the backup, as in the sketch below. Keep in mind, however, that the account the SQL Server service runs under will need network access to that path. This is one good reason why the SQL installer asks you which account to run the services under.
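A minimal sketch, assuming a share \\SERVERA\Backups on Server A that the SQL Server service account on Server B can write to (server, share, and database names are placeholders; the same BACKUP statement can also be run directly from SSMS):

```csharp
// Run a BACKUP DATABASE statement that targets a UNC path on Server A,
// so the .bak file lands on the machine you can access.
using System.Data.SqlClient;

class BackupToUnc
{
    static void Main()
    {
        const string connStr =
            "Server=SERVERB;Database=master;Integrated Security=true;";
        const string sql =
            @"BACKUP DATABASE [MyDb] TO DISK = '\\SERVERA\Backups\MyDb.bak' WITH INIT;";

        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.CommandTimeout = 0; // backups can run long
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```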
I ended up using the Generate Scripts feature. But instead of running everything in a single script, I ran three different scripts (create tables, create stored procedures, and data), one at a time. I did this because when I ran it as a single script, I ended up with an error. By the way, the script can be compressed (zipped) to give a smaller file size.