I have an FTP site created inside the Default FTP Site on server A. I am trying to copy files from server B and place them on server C through a batch script; server C's folder is configured in A as the physical path. I can see the transfer happening, but the files end up in the root folder of A, which is not correct. Please suggest what code I need to write in the batch script so that the files land on server C.
If you want to keep the directory structure, you can compress the files to a zip (including directories) and unzip at the target. This may also reduce the transfer load. But if you want to send them one by one, store the full path in the batch file and check that path at the target: create the full path at the target if it does not exist, then copy the file.
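The zip approach above can be sketched in Python (shown here purely for illustration; the thread itself uses batch scripts). The key point is storing paths relative to the source folder, so the directory structure is recreated intact at the target:

```python
import os
import zipfile

def pack_tree(src_dir, zip_path):
    """Zip src_dir recursively, storing relative paths so the
    directory structure survives the transfer."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(src_dir):
            for name in files:
                full = os.path.join(root, name)
                # arcname keeps the path relative to src_dir
                zf.write(full, arcname=os.path.relpath(full, src_dir))

def unpack_tree(zip_path, dest_dir):
    """Recreate the original tree under dest_dir."""
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest_dir)
```

Transfer only the single zip file over FTP, then run the unpack step on the target side.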
Suppose the IP address of my FTP server is xx.xxx.xx.xx and I need the output file to be stored in D:/example. I need to ensure that the path I give is on my FTP server. How can I include that in my FOPEN call, i.e. a path which points to the example directory on my FTP server?
Generally speaking, this is how it goes:
there's a database server
there is a directory on one of its disks
that directory will be used in the CREATE DIRECTORY command, which creates a directory (an Oracle object)
it will be used as the target for your file-related operations. For example:
it'll contain CSV files which are the source of external tables
.dmp files, the result of Data Pump export, will be stored there (the same goes for import)
UTL_FILE will create files in that directory
All that means your idea of creating a file on an FTP server might not work quite so easily.
However, there's a way: if you create the directory (the Oracle object) using a UNC (Universal Naming Convention) path which points to a directory on the FTP server, the file might be created there. Do some research on it; I know I once did that (put files onto an application server), but that was a long time ago and I don't remember everything I did.
Another option you might consider is the DBMS_SCHEDULER package. Suppose you create the file on the database server (which is the simplest option; done right, it is more or less trivial). Once the procedure which creates the file is done, call DBMS_SCHEDULER.CREATE_JOB with the EXECUTABLE job type to run an operating system batch file that copies the file from the database server to the FTP server.
That's all I can say about it; at least, you have something to research & think about.
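The last step of the DBMS_SCHEDULER suggestion, the OS-level copy from the database server to the FTP server, could be as small as the following Python sketch. The thread suggests a batch file; Python is used here only for illustration, and `target_dir` standing in for a mounted share or UNC path (e.g. \\ftphost\share) is an assumption about your environment:

```python
import shutil
from pathlib import Path

def ship_export(export_path, target_dir):
    """Copy the file the database procedure produced to the folder
    the FTP server serves. target_dir would typically be a mounted
    share or UNC path (placeholder, adjust for your setup)."""
    target = Path(target_dir)
    target.mkdir(parents=True, exist_ok=True)
    dest = target / Path(export_path).name
    shutil.copy2(export_path, dest)  # copy2 also preserves timestamps
    return dest
```

The scheduler job would simply invoke this script (or the equivalent batch file) after the export procedure completes.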
I have an FTP server where people put some txt files that I need to process. I have a Foreach Loop with an FTP Task that downloads each file from the FTP server to a local directory. Next, I need to process the files from that directory in the Foreach Loop, and my question is: can I somehow protect my loop so it doesn't iterate over files from previous runs?
Example: someone put files on the FTP server yesterday. I processed them and they are in my local directory. Today someone puts a new file on the FTP server; the FTP Task downloads it to the local directory, and my loop will iterate over all files, yesterday's and today's. How can I avoid this without deleting them manually?
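Not part of the thread, but one minimal way to skip files handled in earlier runs is a ledger file recording every name already processed. This Python sketch assumes the ledger lives outside the download directory; `new_files` and `ledger_path` are hypothetical names:

```python
import os

def new_files(download_dir, ledger_path):
    """Return files not yet processed, using a ledger file that
    records every file name handled in earlier runs. Keep the
    ledger outside download_dir so it isn't picked up itself."""
    seen = set()
    if os.path.exists(ledger_path):
        with open(ledger_path) as fh:
            seen = {line.strip() for line in fh}
    fresh = [f for f in sorted(os.listdir(download_dir)) if f not in seen]
    # record the newly handled names so the next run skips them
    with open(ledger_path, "a") as fh:
        for name in fresh:
            fh.write(name + "\n")
    return fresh
```

An alternative with the same effect is moving each file to an archive subfolder once processed, so the loop's source folder only ever contains unprocessed files.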
In BW Designer, the File Poller can look for files on the local server where the Designer is installed.
But if I want the File Poller to look at another server instead of the local server, I don't see any option to do that.
Is there any other way I can use the File Poller to look at another server and poll files from it, instead of the server where the Designer is installed?
Thanks
The File Poller can only access files and directories visible to the system the engine is running on, so you need to mount a network drive to poll files sitting on a remote server.
Another option would be to build a custom polling mechanism using the FTP Palette. You could use a Timer instead of a File Poller. On the first execution, the process would list the files in the remote folder using the FTP Dir activity then store this list in a Shared Variable. This list would contain the file names, last modification dates, etc.
Every time the process is triggered, it would run FTP Dir to compare the current list of files against the previous one to detect any changes (new files, modified files, etc.) then update the Shared Variable to keep the latest image of the remote folder. You could then run FTP Get to retrieve any new or modified file.
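The compare step above can be sketched as a pure function (Python used only for illustration; in BW this logic would live in the mappings around the Shared Variable). Each snapshot is assumed to be a map from file name to last-modified timestamp, built from FTP Dir output:

```python
def detect_changes(previous, current):
    """Compare two snapshots of a remote folder, each a dict mapping
    file name -> last-modified timestamp. Returns the names that are
    new or modified since the previous poll."""
    changed = []
    for name, mtime in current.items():
        if name not in previous or previous[name] != mtime:
            changed.append(name)
    return sorted(changed)
```

After computing the changed names, the process would run FTP Get for each of them and store `current` back into the Shared Variable as the new baseline.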
I have a folder on a web server that users can upload files to. I need a script that monitors that folder, and whenever a new file is uploaded or changed, copies that file to another folder on my server. The script should monitor the source folder continuously. Is this possible on Windows Server 2008 R2?
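Not from the thread, but as a sketch of what such a monitor might do, here is a single Python polling pass that copies any file whose modification time changed since the previous scan (on Windows you might instead use a FileSystemWatcher-based script; all names here are illustrative):

```python
import shutil
from pathlib import Path

def sync_changed(src_dir, dst_dir, last_seen):
    """One polling pass: copy any file whose mtime differs from what
    we recorded last time. last_seen maps name -> mtime and is updated
    in place; run this in a loop or scheduled task to keep watching."""
    src, dst = Path(src_dir), Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for f in src.iterdir():
        if not f.is_file():
            continue
        mtime = f.stat().st_mtime
        if last_seen.get(f.name) != mtime:
            shutil.copy2(f, dst / f.name)
            last_seen[f.name] = mtime
            copied.append(f.name)
    return sorted(copied)
```

Calling this every few seconds from a loop approximates continuous monitoring; an event-driven watcher avoids the polling delay but is more involved.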
I am trying to set up an FTP task in SSIS. The source is not variable. The destination folder is predefined. However, I need the received file to be renamed.
After fetching, LDAP.txt needs to become LDAP_20140204.txt in my target folder. When I use variables, it throws the error
'Directory is not specified in the file connection manager'
Any workaround for this?
Have you considered the File System Task? It will allow you to rename a file: http://technet.microsoft.com/en-us/library/ms140185.aspx
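For comparison, the renaming rule itself is simple to express outside SSIS. A Python sketch (`stamp_name` is a hypothetical helper, not an SSIS API) that turns LDAP.txt into a LDAP_20140204.txt-style name:

```python
import os
import time

def stamp_name(path, when=None, fmt="%Y%m%d"):
    """Insert a date stamp before the extension, e.g.
    LDAP.txt -> LDAP_20140204.txt. 'when' is an optional
    struct_time; defaults to the current local time."""
    base, ext = os.path.splitext(path)
    return "%s_%s%s" % (base, time.strftime(fmt, when or time.localtime()), ext)
```

In SSIS, the equivalent is an expression on a variable (building the dated name) that feeds the File System Task's destination connection, which avoids the 'Directory is not specified' error by keeping the directory part fixed.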