How to copy a file to a remote server in LotusScript

I want to create a Lotus Notes agent that will run on the server to generate a text file. Once the file is created, I need to send it to a remote server.
What is the best/easiest way to send the file to a remote server?
Thanks

If your "remote" server is on a local Windows network, you can simply copy the file from the server's file system to a UNC path (\\myserver\folder\file.txt) using the FileCopy statement. If not, you may want to look at using a Java agent, which makes more file transfer protocols easily accessible.
In either case, be sure to understand the security restrictions on Notes agents - for your agent to run on the server and create a file on the server's file system, the agent will need to be flagged with a runtime security level of 2 or 3, and signed by an appropriately authorized ID.
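If you do go the Java agent route on a Windows network, the UNC copy itself is a one-liner. Here is a minimal sketch, assuming hypothetical paths and share names, and assuming the Domino server's account has rights on the share:

import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class CopyToUncShare {
    public static void main(String[] args) throws Exception {
        // Placeholder paths: the file generated by the agent, and the UNC target.
        Path source = Paths.get("D:\\exports\\report.txt");
        Path target = Paths.get("\\\\myserver\\folder\\report.txt");
        // Overwrite any previous copy on the share.
        Files.copy(source, target, StandardCopyOption.REPLACE_EXISTING);
    }
}

In a real Java agent this would sit inside NotesMain() rather than main(), but the copy call is the same; the LotusScript FileCopy statement does the equivalent in one line.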

Sending or copying files to a remote server using OS-level commands requires that the destination server also be mapped as a drive on your source server. As Ed rightly said, security needs to allow you to save files onto the server before you can try to copy them.
You can generate the file locally on the server and then use FTP commands in a script to send the file. Or, if you're a Java guru, you can try using Java FTP to send the file as well. I had some trouble with it, but it should be possible provided an FTP account is set up on the destination server. FTP-related material by a well-known Notes developer can be found here and here.
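If you try the Java route, one option (my own choice here, not one the poster names) is the Apache Commons Net FTPClient. A rough sketch, with server, account, and paths as placeholders:

import java.io.FileInputStream;
import java.io.InputStream;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

public class FtpPushFile {
    public static void main(String[] args) throws Exception {
        FTPClient ftp = new FTPClient();
        ftp.connect("ftp.destination.example.com");   // placeholder host
        ftp.login("ftpaccount", "password");          // the FTP account set up on the destination
        ftp.enterLocalPassiveMode();                  // usually friendlier to firewalls
        ftp.setFileType(FTP.BINARY_FILE_TYPE);
        try (InputStream in = new FileInputStream("D:\\exports\\report.txt")) {
            ftp.storeFile("report.txt", in);          // upload under the remote file name
        }
        ftp.logout();
        ftp.disconnect();
    }
}

The same logic can be wrapped in a scheduled Java agent so the upload happens right after the text file is generated.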
I have done it using a script, and it's clumsy but effective for simply pushing files around. Ideally, if the server at the other end is a Domino server as well, you could attach the file to an email and send it to a mail-in database on the destination server. I have done that before, and it's great because you can just hand the whole problem of getting the file across off to the SMTP/mail routing process.
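The mail-in idea is simple to script. Here is a rough sketch as a Java agent using the lotus.domino classes (the mail-in name and file path are placeholders; LotusScript offers the same calls):

import lotus.domino.AgentBase;
import lotus.domino.Database;
import lotus.domino.Document;
import lotus.domino.EmbedObject;
import lotus.domino.RichTextItem;
import lotus.domino.Session;

public class MailExportAgent extends AgentBase {
    public void NotesMain() {
        try {
            Session session = getSession();
            Database db = session.getAgentContext().getCurrentDatabase();
            Document memo = db.createDocument();
            memo.replaceItemValue("Form", "Memo");
            memo.replaceItemValue("Subject", "Nightly export file");
            RichTextItem body = memo.createRichTextItem("Body");
            // Attach the generated text file (placeholder path).
            body.embedObject(EmbedObject.EMBED_ATTACHMENT, "", "D:\\exports\\report.txt", null);
            // Route it to the mail-in database on the destination server (placeholder name).
            memo.send(false, "File Drop/YourOrg");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}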

Related

Informatica Cloud - Picking up files from SFTP and inserting records in Salesforce

Our objective is as follows
a) Pick up a file "Test.csv" from a Secure FTP location.
b) After picking up the file we need to insert the contents of the file into an object in Salesforce.
I created the following connection for the remote SFTP location (the one that will contain "Test.csv"), shown in Step 1.
Then I started to build a Data Synchronization Task, shown in Step 2.
What we want is for the Informatica Cloud to connect to the secure FTP location and extract the contents from a .csv from that location into our object in Salesforce.
But as you can see in Step 2, it does not allow me to choose .csv from that remote location.
Instead, the wizard prompts me to choose a file from a local directory (my machine, where the secure agent is running), and this is not what I want.
What should I do in this scenario? Can someone help?
You can write a UNIX script to transfer the file to your secure agent and then use Informatica to read the file. Although I have never tried SFTP in the Cloud, I have used the Cloud and I do know that all files are tied to the location of the secure agent (either a server or a local computer).
The local directory is used for template files. The idea is that you set up the task using a local template and then IC will connect to the FTP site when you actually run the task.
The Informatica video below shows how this works at around 1:10:
http://videos.informaticacloud.com/2FQjj/secure-ftp-and-salesforececom-using-informatica-cloud/
Can you say which OS the secure agent runs on, Windows or Linux?
For a Windows environment you will have to call the script using the WinSCP or Cygwin utility; I recommend the former.
For Linux, the basic commands in a script should work.
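If you would rather avoid platform-specific scripts altogether, a small cross-platform Java helper can do the SFTP pull onto the secure agent's machine. A sketch using the JSch library (my own choice, an assumption; host, credentials, and paths are placeholders):

import com.jcraft.jsch.ChannelSftp;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;

public class PullCsvToAgent {
    public static void main(String[] args) throws Exception {
        JSch jsch = new JSch();
        Session session = jsch.getSession("sftpuser", "sftp.example.com", 22);
        session.setPassword("secret");
        // Quick-and-dirty for a sketch; verify the host key properly in real use.
        session.setConfig("StrictHostKeyChecking", "no");
        session.connect();

        ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
        sftp.connect();
        // Download Test.csv into the directory the secure agent reads from.
        sftp.get("/outgoing/Test.csv", "C:/InformaticaFiles/Test.csv");
        sftp.exit();
        session.disconnect();
    }
}

Schedule it (Task Scheduler or cron) to run just before the Data Synchronization Task so the local copy is always fresh.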

Transferring files with SFTP

So I'm trying to transfer files to a remote computer over SSH. I've used the sftp command, used lls to confirm the presence of the file on the local computer, and then ran the put filename command. However, I receive the same result each time:
stat filename: No such file or directory
I just don't know what's going wrong! Any help or troubleshooting tips would be appreciated.
If you're currently using Windows you can download WinSCP and use that to transfer files. It has a nice graphical interface that is easy to work with.
Well, assuming that you are in a Linux/Unix environment, you could use scp. Typically, the syntax for an scp command looks like this:
$ scp foobar.txt your_username@remotehost.net:/some/remote/directory
The above command copies the file foobar.txt, which resides on the local computer, to a specific directory on the remote machine, using a username (you will be asked for a password).
The sftp command line client uses the ssh transport and will tunnel your connections using your key. So if you have ssh access, you should also have sftp access. This is a secure option for people who are more comfortable with ftp. Most GUI ftp clients should also support sftp.
I was also facing this issue when trying to upload files from my local machine to the remote server. My commands were fine; the mistake I was making was that I had logged into the remote server with ssh and then started sftp from there. That way, sftp treats the remote server as the local side (since I logged in to it first via ssh) when using a command like the one below:
put /c/path/to/file.txt
So, the thing to do is to log in to the server directly via sftp from your local machine and put your local files from there.

TSQL and UNC Paths

I'm not an expert with TSQL, so have patience with me please. Recently I was doing a project in TSQL on my local server using SQL Server 2008 R2 Management Studio. I was reading files from a temp folder on my C: drive and bulk inserting them into tables at the time.
Then I went and moved to a regular server instead of my local server on my machine.
It took me a bit to realize that I no longer had access to my local machine folders and files, and that is causing me issues.
I've read that one solution is to create a mapped drive on the server, but this is not an option for me.
So my question is what are other options for me? Could I use UNC paths to access my files or anything else?
The files I want to access are regular text files that are comma-delimited and newline terminated.
(I saw questions somewhat similar to mine, but theirs seemed server-specific or specific to their particular issues. Also, none of their questions were answered.)
Actually a mapped drive won't work either because the account SQL runs under by default (local system if I recall) will not have network access.
So, the more reliable way to do this is definitely with a UNC path BUT there is more! (I've done this several times when I've needed to move database backups and log backups across servers for mirroring).
How?
1. On the SQL server machine AND the other server that will host the share, create a new user (same username and password on both machines) - assuming you're not using AD. The user doesn't need to be in any groups at all other than the Users group, but it must have the same name on both servers and the passwords must match.
2. On the SQL server machine, change the account that SQL Server runs under. This is done in the SQL Server configuration tool - do not try to do this yourself via Windows services. Choose the user you created in step 1 above. Note that you have to enter the password. Restart SQL after you've changed it and verify SQL still runs fine. It should run just as before, but now it is running as a particular user with all the permissions of that user (which are actually very limited anyhow, but at least the user can access network resources).
3. On the remote server, make sure the new user has NTFS permissions on the folders that will host your share - read/write, or just read if SQL is only reading data.
4. On the remote server, create a share pointing to the appropriate folder that you set permissions for above. If you're using share permissions, make sure the new user also has permissions on the share (not just NTFS permissions on the drive).
Once all of this is set up, your SQL scripts simply use the UNC path that points to the remote share, and since SQL is running "as" a user with access to that share, SQL will see the files just fine!
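For illustration, this is roughly what the statement looks like for comma-delimited, newline-terminated files, shown here issued from application code over JDBC (the Microsoft JDBC driver, table, share, and file names are all placeholders; the key point is that SQL Server resolves the UNC path as its own service account, not as the client):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class BulkInsertFromUnc {
    public static void main(String[] args) throws Exception {
        // Placeholder connection string; adjust server, database, and authentication.
        String url = "jdbc:sqlserver://SQLHOST;databaseName=MyDb;integratedSecurity=true";
        try (Connection con = DriverManager.getConnection(url);
             Statement stmt = con.createStatement()) {
            // The UNC path below is read by the SQL Server service account.
            String sql =
                "BULK INSERT dbo.ImportTable " +
                "FROM '\\\\remoteserver\\share\\data.txt' " +
                "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n')";
            stmt.execute(sql);
        }
    }
}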

Importing Data to SAS Remotely

I am trying to begin a project using a version of SAS I have remote access to. Ideally I would be able to type infile 'file_name.txt' and use the file I need. However, the directory I am in when I start SAS up is one associated with my account on this remote server. Hence I get the error that says essentially 'I have no idea what file you're talking about there isn't one here.' How can I get SAS to take a file from my hard drive instead?
Unless you have a file system that has mapped your local hard drive onto the remote server, your server connection can only tunnel keystrokes and graphics. This is probably a job for your sysadmin, but applications like scp or ftp can be used to transfer file_name.txt from your local machine to the remote server. Alternately, if the file is short, you can copy and paste the data in a universal format (like CSV) into a text editor in your terminal or virtual desktop.
Otherwise, SAS lives on your remote machine and is unaware of your local machine's filesystem. Which is good insofar as data privacy is concerned.

Changing connection to SQL Server in Microsoft Access 2003 ade file

I developed an Access 2003 application that is connected to SQL Server.
My problem is that I developed the software on my server, and the application runs on the client network on a different (identical) server.
As a result, my executable file (i.e. the .ADE) does not open on the client's computer because of a bad SQL Server connection.
My solution so far was to open the application file (.ADP) on the client's computer, change the connection path from there, and then create the executable file there.
Now my client has only the Access runtime environment, so I cannot do that.
I wonder if there is a way to set the connection in an ADE file in this situation.
(I know I can change it through VBA, but when the connection is broken to begin with, I don't even get to the VBA code stage.)
In the interest of keeping things simple, I'll say you need to set up a testing environment you control that mimics your client's environment. For instance, if they have a SQL Server 2008 server named "SQL1", then you should install SQL Server 2008 Express on your machine and rename your machine to "SQL1" so you can test. You'd also need to copy the schema of their database tables, put that same schema in your own test database, and fill it with test data similar to theirs. And you'll want to create duplicate logins as well.
With all that in place, I wouldn't think you'd need to update anything. Just copy the ADE file over to your client when you're done making changes. You could try to code your way through this scenario, but I've been there and done that. Having a test environment that apes your client's takes a lot of headaches out of the equation.
