I have created SSIS packages that create a new folder at the output location using a File System Task and then create data files in that folder using a Data Flow Task.
I have created a network drive mapped to Azure blob storage. I pass the network drive path (e.g. Z:\) to the package, and the folder and files are created as expected; they are also reflected in the Azure blob.
Now, when I schedule this package through a SQL Agent job, the File System Task fails with an error that it cannot find part of the path Z:\folderName. I thought this was because the SQL Server Agent service was not running under my user ID, so I started SQL Server Agent with my credentials, but it still gives the same error.
Note: my ID doesn't have direct access to the Azure blob, and the network drive is only accessible by my ID.
We are currently using Azure blob storage for dev, but we may move to a separate server to store files, so I cannot use the Flexible File Task available in the SSIS Azure Feature Pack.
We had reasons we could not use UNC paths (\\server\share\file.txt) and struggled with mapped drives as well. The approach we ended up using was to explicitly map and unmap drives as part of our ETL process.
Execute Process Task - Mount drive
We used a batch file to mount the drive, as it made things easier on us, but the command takes a form something like
net use Z: \\server\share mypassword /user:azure\username
Execute Process Task - Unmount drive
Again, an Execute Process Task:
net use Z: /delete
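If you'd rather keep this inside the package than in separate batch files, the same commands can be shelled out from a Script Task instead. A minimal C# sketch (drive letter, share, and credentials are placeholders, not the real values):

    // Inside an SSIS Script Task (C#): shells out to net.exe to mount
    // the drive; a twin task running "use Z: /delete" unmounts it at
    // the end of the package. All names and credentials are placeholders.
    using System.Diagnostics;

    public void Main()
    {
        var psi = new ProcessStartInfo("net.exe",
            @"use Z: \\server\share mypassword /user:azure\username")
        {
            UseShellExecute = false,
            CreateNoWindow = true
        };
        using (var p = Process.Start(psi))
        {
            p.WaitForExit();
            // Non-zero exit code means the mount failed; fail the task.
            Dts.TaskResult = p.ExitCode == 0
                ? (int)ScriptResults.Success
                : (int)ScriptResults.Failure;
        }
    }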
We ended up with a precursor step as well. I don't remember exactly how we accomplished it, but we would test whether the drive was still mounted before we attempted to mount it. I think it was a File System Task to see if the drive existed, and if so, we ran a duplicate of the Unmount task first.
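That precursor check could also be done in a Script Task. A minimal C# sketch that records whether Z: is already mapped in a package variable (the variable name is an assumption), which a precedence constraint can then use to decide whether to run the Unmount task:

    // Inside an SSIS Script Task (C#). ReadWriteVariables must include
    // User::DriveIsMapped (name assumed). A precedence constraint on
    // this variable decides whether the Unmount task runs first.
    using System.IO;

    public void Main()
    {
        // Directory.Exists returns false for an unmapped drive letter.
        Dts.Variables["User::DriveIsMapped"].Value = Directory.Exists(@"Z:\");
        Dts.TaskResult = (int)ScriptResults.Success;
    }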
I was able to resolve this using Credential Manager. I logged in with a domain user and added the Azure path (\\server), user ID, and password to Windows Credentials. Then I started the SQL Server Agent service under that same domain user ID, and in the SSIS package configurations set the file path to \\server\share, matching what I had entered in Windows Credentials. After doing this I was able to execute the package successfully through the SQL Agent job.
I was able to solve this issue using a Script Task (C# code) to move the data. I created a project parameter for the target share folder and used it as a ReadOnlyVariable in the Script Task. The target share folder had already been mapped as a network drive on my local PC as well as on the application server.
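A minimal sketch of such a Script Task, assuming a project parameter named TargetShareFolder and a hypothetical local staging folder (both names illustrative; a project parameter appears as $Project::Name in the Script Task's variable collection):

    // Inside an SSIS Script Task (C#). ReadOnlyVariables must include
    // $Project::TargetShareFolder. The source folder and *.csv pattern
    // are assumptions for illustration only.
    using System.IO;

    public void Main()
    {
        string target = Dts.Variables["$Project::TargetShareFolder"].Value.ToString();
        string source = @"C:\Staging";  // hypothetical source folder

        foreach (string file in Directory.GetFiles(source, "*.csv"))
        {
            string dest = Path.Combine(target, Path.GetFileName(file));
            File.Copy(file, dest, true);  // overwrite if it already exists
        }
        Dts.TaskResult = (int)ScriptResults.Success;
    }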
I've inherited an SSIS package which loads CSV files into a SQL database.
The first thing the package does is call a .BAT file which maps a network drive. The .BAT file contains the username and password used to map the drive in plain text, so it urgently needs to be replaced.
I've written a solution which uses New-PSDrive in a PowerShell script and creates a credentials XML file with the password encrypted. If I execute the .ps1 script directly, it works OK.
If I call the .ps1 script from within SSIS and run the package through Visual Studio, it also works fine.
When I call the SSIS package that runs the script task through a SQL Agent job (executed as the SQL Server Agent user account), the drive mapping doesn't seem to take effect, and the package complains it cannot access the file on the file share.
I presume this is not working because the SQL Server Agent user account can't run PowerShell scripts? The package only errors when it tries to access the share, not during execution of the PowerShell script itself.
So, two questions:
Is there an obvious issue with the SQL Agent deployment idea?
or
Is there a different way of mapping a network drive from within SSIS without storing the password in plain text anywhere?
This is the pattern I suggest using for accessing files on a remote server:
Access the files via UNC path: \\MyServer\Files
This means that the package needs to run under a context that has access to the path (a drive letter mapped in one logon session is not visible to processes in another session, which is why mapping a drive from a child script under the Agent does not help). There are a couple of options:
Run the package under a proxy, i.e. the account credentials that were stored in the bat file. This means you also have to grant that account access on the SQL instance.
Use a group managed service account (gMSA). In this case, the SQL Agent service account is replaced with the gMSA and the job runs under SQL Agent. The gMSA needs to be granted access to the remote share as well as to the SQL instance. This is a much more secure way of addressing the whole process, because there is no password to manage (AD takes care of that) and the account cannot be used to log in anywhere. But there is setup work to get it created, so it's not the fastest option.
I'm new to SSIS and stuck with a problem, and I hope someone here has already run into something like this.
Task:
Copying files from a remote server to a local machine folder using a File System Task and a For Each Loop container.
Problem:
The job executes successfully (i.e. files get copied) when I run it from the SSIS designer, but when the project is deployed to the SQL Server instance it doesn't copy any files; in fact the target folder is completely empty.
I don't understand this strange behavior. Any input would be of great help!
Regards-
Santosh G.
The For Each loop will not error out if it doesn't find any files.
The SQL Agent account may not have access to read the directory contents.
Check whether your path is a variable - is it being set by a configuration or a /SET statement?
Can you log the path before starting the loop? (See the sketch after this list.)
Can you copy a dummy file into the folder and check whether SSIS can see it?
How are you running the job? cmd_exec() can give spurious results with file I/O tasks.
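On the logging point: a small Script Task placed just before the loop can write the resolved path into the SSIS log and the Agent job history. A minimal C# sketch, assuming a variable named User::SourcePath (the name is illustrative):

    // Inside an SSIS Script Task (C#) placed just before the For Each
    // loop. ReadOnlyVariables must include User::SourcePath (name
    // assumed). FireInformation writes to the SSIS log / job history.
    public void Main()
    {
        string path = Dts.Variables["User::SourcePath"].Value.ToString();
        bool fireAgain = true;
        Dts.Events.FireInformation(0, "PathCheck",
            "Resolved source path: " + path,
            string.Empty, 0, ref fireAgain);
        Dts.TaskResult = (int)ScriptResults.Success;
    }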
The issue was related to the user authorizations of the SQL Server Agent service.
When I execute the job from SQL Server, it runs under the Agent service, and that service must be assigned a service user who has access rights to the desired file path.
Our objective is as follows:
a) Pick up a file "Test.csv" from a Secure FTP location.
b) After picking up the file we need to insert the contents of the file into an object in Salesforce.
I created the following connection for the remote SFTP location (the location which will contain "Test.csv"):
Step 1: the SFTP connection setup (screenshot omitted).
Step 2: I then started to build a Data Synchronization Task (screenshot omitted).
What we want is for Informatica Cloud to connect to the secure FTP location and extract the contents of a .csv file from that location into our object in Salesforce.
But as you can see in Step 2, it does not allow me to choose a .csv file from that remote location.
Instead, the wizard prompts me to choose a file from a local directory (my machine, where the Secure Agent is running), and this is not what I want.
What should I do in this scenario? Can someone help?
You can write a UNIX script to transfer the file to your Secure Agent machine and then use Informatica to read the file. Although I have never tried using SFTP in the cloud, I have used Informatica Cloud, and I do know that all files are tied to the location of the Secure Agent (either a server or a local computer).
The local directory is used for template files. The idea is that you set up the task using a local template, and then Informatica Cloud will connect to the FTP site when you actually run the task.
The Informatica video below shows how this works at around 1:10:
http://videos.informaticacloud.com/2FQjj/secure-ftp-and-salesforececom-using-informatica-cloud/
Can you clarify the Secure Agent OS: Windows or Linux?
For a Windows environment you will have to call the script using the WinSCP or Cygwin utility; I recommend the former.
For Linux, the basic commands in a script should work.
I have an SSIS package which creates a folder in a UNC share and then creates a file there (using a Script Task).
The domain account used by SSIS and the Agent has all possible permissions on both the database machine and the share machine.
It always fails there.
I created a test SQL Agent job which creates a backup of the database in the same location, and it fails too (operating system error 5 - access denied).
EDIT: The above test example may be irrelevant, since the backup operation is executed by the SQL Server Database Engine and not by SQL Agent itself (the Agent still schedules the task).
I cannot debug the Script Task in SSIS, and therefore I'm not sure what the problem is.
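One way to surface the underlying exception without a debugger is to catch it inside the script and report it via FireError, so the full error text appears in the job history. A minimal C# sketch (the UNC path and file name are illustrative only):

    // Inside the SSIS Script Task (C#). Wraps the folder/file creation
    // and reports the real exception through FireError so it shows up
    // in the Agent job history. The path below is illustrative only.
    using System;
    using System.IO;

    public void Main()
    {
        try
        {
            string folder = @"\\COMPUTER-NAME\Folder\Output";  // assumed path
            Directory.CreateDirectory(folder);
            File.WriteAllText(Path.Combine(folder, "test.txt"), "hello");
            Dts.TaskResult = (int)ScriptResults.Success;
        }
        catch (Exception ex)
        {
            // Surfaces e.g. UnauthorizedAccessException with full detail.
            Dts.Events.FireError(0, "CreateFolder", ex.ToString(), string.Empty, 0);
            Dts.TaskResult = (int)ScriptResults.Failure;
        }
    }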
I have managed to fix this problem. The first problem was a lack of sufficient task activation/execution permissions under the DCOM Config node in Component Services. The permissions had to be set for SQL Server Integration Services.
The second problem was the fact that the UNC path looked like this:
\\192.168.250.51\C$\Folder\
I needed to create another (visible) share instead, like this:
\\COMPUTER-NAME\Folder\
Also don't try to map any drives to the folders. It won't work.
I have an SSIS 2005 package which is executed by a SQL Agent job. This package has a Web Service Task that runs with different credentials than those used to execute the package.
I am having the following problem when my package is executed on the Server:
"Microsoft.SqlServer.Dts.Tasks.WebServiceTask.WebserviceTaskException: System.UnauthorizedAccessException: Access to the path ..... is denied.
The package is executed in a clustered environment. Both accounts have been created on the server so that the package is able to write its TEMP files, but the problem still persists.
Based on this thread, you should be able to set the OutputLocation property of the Web Service Task. This will need to be configured to point to a location that the credentials of the user invoking the web service have access to. This might require an admin modifying ACLs on a directory path.
In a clustered environment, you should ensure the location being written to is a clustered resource. For example, you can create a file share, but the file share must be a cluster resource so that it is available when the cluster fails over. If you are writing to a drive, ensure the drive is a cluster drive dependent on the proper SQL Server resource.