I am trying to automate the account creation process in Active Directory. I want to create the user's home directory on a server and then turn it into a shared folder with certain user permissions.
I can create a folder on that remote machine, but I cannot convert it to a shared folder. Is there a way I can do that? Note: I can create a shared folder locally and set permissions, but I cannot do that when the machine is remote.
Which language are you using to script?
You can do exactly what you are doing locally on a remote computer using PsExec from SysInternals. At the end of this post you'll find how to do it with LDAP mixed in with WMI.
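For example, a minimal batch sketch using PsExec and net share (the server, path, domain, and account names are placeholders I've made up):

rem Create the home directory on the remote file server
psexec \\FILESERVER cmd /c mkdir D:\Homes\jdoe
rem Share it and grant the user full access on the share
psexec \\FILESERVER net share jdoe$=D:\Homes\jdoe /grant:MYDOMAIN\jdoe,FULL
rem Optionally grant matching NTFS permissions on the folder itself
psexec \\FILESERVER icacls D:\Homes\jdoe /grant MYDOMAIN\jdoe:(OI)(CI)M

The /grant switch of net share controls the share-level permission (READ, CHANGE, or FULL); icacls sets the NTFS ACL underneath it.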
Related
I have created SSIS packages that create a new folder at the output location using a File System Task and then create data files in that folder using a Data Flow Task.
I have created a network drive mapped to an Azure blob. I pass the network drive path (e.g. Z:\) to the package, and the folder and files are created as expected, which is also reflected on the Azure blob.
Now, when I schedule this package through a SQL Agent job, the File System Task fails with an error that it cannot find part of the path Z:\folderName. I thought it was because the SQL Server Agent service was not running under my user ID, so I started SQL Server Agent with my credentials, but it still gives me the same error.
Note: my ID doesn't have direct access to the Azure blob, and the network drive is only accessible by my ID.
We are currently using Azure blob storage for dev, but we may use a separate server to store files, which is why I cannot use the Flexible File Task available in the SSIS Azure Feature Pack.
We had reasons we could not use UNC paths (\\server\share\file.txt) and struggled with mapped drives as well. The approach we ended up using was to explicitly map and unmap drives as part of our ETL process.
Execute Process Task - Mount drive
We used a batch file to mount the drive as it made things easier on us, but the command takes a form something like
net use Z: \\server\share mypassword /user:user@azure
Execute process task - Unmount Drive
Again, execute process task
net use /delete Z:
We ended up with a precursor step (I don't remember exactly how we accomplished it) that would test whether the drive had been unmounted before we attempted to mount it. I think it was a File System Task to check whether the drive existed, and if so, we ran a duplicate of the Unmount task first.
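A batch sketch of that precursor step might look like this (the Z: drive letter and credentials are carried over from the example above):

rem If Z: is already mapped from a previous run, drop it first
if exist Z:\ net use Z: /delete /y
rem Now map the drive fresh
net use Z: \\server\share mypassword /user:user@azure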
I was able to resolve this using Credential Manager. I logged in using a domain user and then added the Azure path (\\server), user ID, and password to Windows Credentials. Then I started the SQL Server Agent service under the same domain user ID, and set up the file path as \\server\share in the SSIS package configurations, matching what I entered in Windows Credentials. After doing this I was able to successfully execute the package through the SQL Agent job.
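For reference, the same stored credential can be created from the command line with cmdkey (the server and account names are placeholders):

cmdkey /add:myserver /user:MYDOMAIN\svc_agent /pass:MyP@ssw0rd

Run this while logged on as the account the SQL Server Agent service uses, since Windows credentials are stored per user.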
I was able to solve this issue using a Script Task (C# code) to move the data. I created a project parameter for the target share folder and used it as a ReadOnlyVariable in the Script Task. The target share folder was already mapped as a network drive on my local PC as well as on the application server.
I have a network folder and two machine accounts, node1$ and node2$, both of which have Full Control permission on the folder.
I log in to a SQL Server instance on the node1 machine and back up a SQL Server certificate to the network folder, which generates a .crt and a .pvk file.
Then I log in to the node2 machine and try to restore the certificate in another SQL Server instance. But I can't, because node2$ has no permissions on the .crt and .pvk files.
Even more, if I check the created files, even the node1$ machine account has no explicit permissions on them. Instead, I find an "Owner Rights" ACE.
So, the files don't seem to inherit the permissions that node1$ and node2$ have on the folder.
This issue doesn't occur with other kinds of files.
I can solve this by manually assigning explicit permissions on the files to node2$.
But my question is: why don't .crt and .pvk files inherit the permissions the way other file types do?
The documentation explains it:
When performing a backup, the files will be ACLd to the service account of the SQL Server instance. If you need to restore the certificate to a server running under a different account, you will need to adjust the permissions on the files so that they are able to be read by the new account.
As you already figured out: inheritance is effectively bypassed because the T-SQL command BACKUP CERTIFICATE explicitly removes all permissions except for the service account's.
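So the manual fix you found is the expected workaround; it can be scripted with icacls (the share path, file names, and domain are placeholders):

icacls \\fileserver\certs\MyCert.crt /grant MYDOMAIN\node2$:R
icacls \\fileserver\certs\MyCert.pvk /grant MYDOMAIN\node2$:R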
I'm looking for a one-click solution, such as a .vbs or batch file, that would copy 2 CSV files from my local machine to a particular location on a remote server. This is primarily what I'm looking for.
Also, I would like to know if the same is possible when my account is not in the server's users list (I don't have access). Is there some way to share just that particular location?
You can try sharing the specific folder on the server with your network. After that, go to your client and open Explorer. Navigate to 'Computer'; at the bottom you should find your network resources. Right-click and add the server by its IP address. If done correctly, you should have access to the server's folder as if it were one of your client's own. You should now be able to create a desktop shortcut to the server's folder and add files to it by drag and drop. Of course, you could now write a simple batch file that copies all the contents of one of your folders to the server's folder.
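A minimal batch sketch of that last step (the file and share names are made up):

copy /Y C:\data\first.csv \\SERVER\SharedFolder\
copy /Y C:\data\second.csv \\SERVER\SharedFolder\

The /Y switch suppresses the overwrite prompt, so the batch file really is one click.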
Our objective is as follows
a) Pick up a file "Test.csv" from a Secure FTP location.
b) After picking up the file we need to insert the contents of the file into an object in Salesforce.
I created the following connection for the Remote SFTP (the location which will contain "Test.csv")
Step 1: the SFTP connection setup (screenshot omitted).
Step 2: I then started to build a Data Synchronization Task (screenshot omitted).
What we want is for the Informatica Cloud to connect to the secure FTP location and extract the contents from a .csv from that location into our object in Salesforce.
But as you can see in Step 2, it does not allow me to choose a .csv from that remote location.
Instead, the wizard prompts me to choose a file from a local directory (my machine, where the Secure Agent is running), which is not what I want.
What should I do in this scenario? Can someone help?
You can write a UNIX script to transfer the file to your Secure Agent machine and then use Informatica to read the file. Although I have never tried using SFTP in Cloud, I have used Cloud, and I do know that all files are tied to the location of the Secure Agent (either a server or a local computer).
The local directory is used for template files. The idea is that you set up the task using a local template and then IC will connect to the FTP site when you actually run the task.
The Informatica video below shows how this works at around 1:10:
http://videos.informaticacloud.com/2FQjj/secure-ftp-and-salesforececom-using-informatica-cloud/
Can you clarify the Secure Agent OS: Windows or Linux?
For a Windows environment, you will have to call the script using the WinSCP or Cygwin utility; I recommend the former.
For Linux, the basic commands in the script should work.
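On Windows, a scripted WinSCP download could look roughly like this (the host, credentials, and paths are placeholders; outside of testing, pin the real host key instead of accepting any with -hostkey=*):

winscp.com /command "open sftp://user:password@sftp.example.com/ -hostkey=*" "get /outbound/Test.csv C:\SecureAgent\files\" "exit"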
I am trying to run a batch file to copy a backup file from one location to another.
I keep on getting the error:
Invalid drive specification
My path is as follows:
\\server\drive:\folder\folder\folder\*.bak drive:\folder\*.bak
That typically doesn't work out too well. You'll want this:
copy \\server\C$\folder\folder\folder\copy.bak C:\folder\copied.bak
This presupposes that you actually have access to the folder \\server\C$\folder\folder\folder from your box. If you do not, then you need to configure permissions correctly on the server to give you access.
You only have access to administrative shares (\\server\C$; the $ denotes an admin share) if you have administrative rights on the server. If you don't, you need to actively share the folder in question: on the server, navigate to drive:\folder\folder\folder and share it (context menu of the folder, menu item 'Sharing and Security'). Note that you need at least temporary admin rights on the server in order to create a share.
Do not forget to configure the permissions for the share you create, so that the limited account you are using for the copy process has read rights.
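Creating the share can also be done from an elevated command prompt on the server instead of the context menu (the share name, path, and account are placeholders):

net share NewShareName=D:\folder\folder\folder /grant:MYDOMAIN\copyuser,READ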
Once this is set up, you should be able to copy the files using
copy \\Server\NewShareName\*.bak C:\folder\*.bak
If you have problems with the files being in use by another process, have a look at robocopy instead of the copy command.
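A robocopy equivalent with a few retries, in case a file is briefly locked (paths as above):

robocopy \\Server\NewShareName C:\folder *.bak /R:3 /W:5

/R:3 retries each failed copy three times and /W:5 waits five seconds between retries.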