How to batch move/delete files on FTP using scripts - batch-file

I currently use Globalscape's CuteFTP as my FTP client and am in the process of cleaning up old, unused files. I use a script to upload new files to the FTP server, but it is based on a wildcard: it uploads everything in a specific folder.
Now I want to do the opposite and delete files, but only specific ones. I have a list of over 1,000 file names that I need to remove (or, ideally, move to a designated folder), but I am not sure how to write a script to do this. Could someone help me create a batch relocate script, or at least point me in the right direction?

You'll have better luck looking for an FTP client that supports scriptable actions. A quick search turned up WinSCP's scripting documentation at http://winscp.net/eng/docs/scripting which might be helpful.
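For example, since the file names are already in a list, a small batch file can turn that list into a WinSCP script and run it. This is a minimal sketch, not a tested solution: the host, credentials, remote folders, and the list file name (files-to-move.txt) are placeholders.

    @echo off
    rem Sketch: build a WinSCP script from a list of file names
    rem (files-to-move.txt, one name per line) and run it. The host,
    rem credentials, and remote folders below are placeholders.

    > move-script.txt echo open ftp://user:password@ftp.example.com/
    for /f "usebackq delims=" %%f in ("files-to-move.txt") do (
        >> move-script.txt echo mv "/public_html/%%f" "/old-files/%%f"
    )
    >> move-script.txt echo exit

    winscp.com /ini=nul /script=move-script.txt

To delete instead of move, emit rm lines instead of mv lines.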

Related

Auto check in files uploaded to SharePoint using a batch file

Could you please help me upload files to SharePoint and automatically check them in, using a batch script on a Windows server?
Whenever I upload files to SharePoint through a batch file, they are left checked out and are invisible to every user except me, so I have to check each one in manually.
Use PowerShell with the SharePoint Client Object Model (CSOM). Ask your favorite search engine; there are plenty of examples out there.
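For instance, a minimal sketch along those lines, assuming SharePoint Online and that the CSOM assemblies are available locally (the site URL, credentials, library name, and file paths are all placeholders):

    # Sketch: upload a file via CSOM, then check it in so it becomes
    # visible to other users. Paths and credentials are placeholders.
    Add-Type -Path "C:\Path\To\Microsoft.SharePoint.Client.dll"
    Add-Type -Path "C:\Path\To\Microsoft.SharePoint.Client.Runtime.dll"

    $ctx = New-Object Microsoft.SharePoint.Client.ClientContext("https://contoso.sharepoint.com/sites/team")
    $ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials(
        "user@contoso.com",
        (ConvertTo-SecureString "password" -AsPlainText -Force))

    # Upload the file into the target document library.
    $fileInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
    $fileInfo.Content = [System.IO.File]::ReadAllBytes("C:\upload\report.csv")
    $fileInfo.Url = "report.csv"
    $fileInfo.Overwrite = $true

    $library = $ctx.Web.Lists.GetByTitle("Documents")
    $file = $library.RootFolder.Files.Add($fileInfo)
    $ctx.Load($file)
    $ctx.ExecuteQuery()

    # Check the file in so it is no longer held checked out by the script.
    $file.CheckIn("Uploaded by batch script", [Microsoft.SharePoint.Client.CheckinType]::MajorCheckIn)
    $ctx.ExecuteQuery()

For an on-premises farm you would swap the SharePointOnlineCredentials line for default or network credentials.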

AEM CQ5 Move option for a file to another location

I have a website configured in CQ5 AEM. I am trying to move some files to a folder in the same location. I have a large number of files, and I have to select each one and click Move individually.
Is there any way to move multiple files at once to a folder in the same location in Adobe Experience Manager? It took me a day to move 150 files, and I have almost 10,000 files left. Please suggest a way to reduce this work.
I found a reference,
http://docs.adobe.com/content/docs/en/cq/5-6-1/wcm/page_create_edit.html
but it only covers moving a single file with the Move option.
Thanks in advance; any help is appreciated.
If it is really about files, you can try connecting to the CQ instance via WebDAV and working with the files as in any file manager (though this did not work for me with pages and some other node types).
Otherwise, I would suggest creating a package in Package Manager containing all the files you want to move, downloading it, unzipping it, moving the required files to the new folder, packing it back up, then uploading and installing it. After that you can delete the files in the old location.
In the second case you will probably have to modify filter.xml to match the files' new location, as sketched below.
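For reference, filter.xml (under META-INF/vault/ inside the package) simply lists the repository paths the package covers, so after moving the files you would point the filter roots at the new folder. A minimal sketch with a placeholder path:

    <?xml version="1.0" encoding="UTF-8"?>
    <workspaceFilter version="1.0">
        <!-- Placeholder path: point the filter at the files' new folder -->
        <filter root="/content/dam/mysite/new-folder"/>
    </workspaceFilter>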

Batch Processing Design Patterns

A partner who cannot support a real-time web service interface must SFTP CSV files to my Linux environment.
The file is zipped and encrypted. The SFTP server is a different virtual server from the one that will process the CSV data into my application's database.
I don't need help with the technical steps (bash script, etc.), but I'm looking for file-management conventions that support the following requirements:
Good auditability
Non-destructive
Recoverable
Basically, I'm trying to figure out when it makes sense to make copies of a file, when to rename it to indicate that a processing step has been completed, and so on (e.g. do I keep the zip files, or delete them once unzipped?).
There is going to be personal preference in the answers, but that is what I'm looking for: to learn from someone who has more experience working with this type of interface. That seems better than inventing something myself.
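One common convention, sketched in bash below: files move through incoming/, processing/, and archive/ stages, nothing is deleted, and every step is appended to an audit log. The directory names and the decrypt/unzip commands are illustrative assumptions, not something prescribed by the question.

    #!/usr/bin/env bash
    # Sketch of a staged-directory convention; directory names and the
    # decrypt/unzip commands are illustrative placeholders.
    set -euo pipefail

    BASE=/data/partner-feed
    STAMP=$(date +%Y%m%dT%H%M%S)

    for f in "$BASE"/incoming/*.zip.gpg; do
        [ -e "$f" ] || continue
        name=$(basename "$f")
        work="$BASE/processing/$STAMP-$name"

        # Copy (never move) out of incoming first, so the original
        # survives a crash mid-run.
        cp "$f" "$work"
        echo "$(date -Is) RECEIVED $name" >> "$BASE/audit.log"

        # Decrypt, then unzip into a per-batch working directory.
        gpg --decrypt "$work" > "${work%.gpg}"
        unzip -o "${work%.gpg}" -d "$BASE/processing/$STAMP-${name%.zip.gpg}"

        # Only after the working copy exists does the original leave
        # incoming/, and it goes to archive/ rather than being deleted.
        mv "$f" "$BASE/archive/$STAMP-$name"
        echo "$(date -Is) ARCHIVED $name" >> "$BASE/audit.log"
    done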
If the files are encrypted both on the network and in the file's settings, then they cannot be successfully transmitted onward unless the file is parsed within another file. You could try to make the SFTP server forward the file onto a separate machine, but this would only cause more issues because of the encryption type applied to the files.

Creating A Log Of Files In A Folder and update into table

Can anyone help me build a table that lists all files in a specified folder, so that whenever a file is copied to that folder the table updates and keeps a log of the files?
I need the list to retain the names even if a file is later moved from the folder or deleted. The data would eventually be deleted by a scheduler.
I also need the table to record exactly when the file was copied into the folder, not its modification or creation time.
I am using Windows 7; how can I build a system with this behaviour?
Just turn on Windows file auditing for that folder; video walkthroughs of the process are easy to find.
Microsoft provides information on its TechNet site on how to use the LogParser tool to extract Security events from the event log.
Note: admin questions like this should really be posted on the Super User site.
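As an alternative to LogParser, PowerShell's Get-WinEvent can pull the same Security events. A minimal sketch, assuming object-access auditing is already enabled for the folder (event ID 4663); the folder and output paths are placeholders:

    # Collect file-access audit events for a watched folder into a CSV.
    # Assumes auditing is enabled; paths below are placeholders.
    $folder = 'C:\Watched'

    Get-WinEvent -FilterHashtable @{ LogName = 'Security'; Id = 4663 } |
        Where-Object { $_.Message -like "*$folder*" } |
        ForEach-Object {
            [pscustomobject]@{
                # TimeCreated is when the access happened, independent of
                # the file's own modification/creation timestamps.
                Time   = $_.TimeCreated
                Detail = ($_.Message -split "`n")[0]
            }
        } |
        Export-Csv -Path 'C:\Logs\file-log.csv' -NoTypeInformation

Note that reading the Security log requires an elevated prompt.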

I have a huge music archive on a remote server. How can I set up a page that lets people interact with it?

What I want is some sort of file tree that people can interact with to download songs or folders of songs. I kinda got started with a file-listing PHP script (which you can see here), but it requires me to copy the same index.php into every individual folder. I would be okay with that if I knew how to copy the same file to every subdirectory in the archive.
What do you guys think? Any ideas?
What you are doing now is letting your webserver browse the file hierarchy, but what you want is to use your script (the index.php) everywhere. So instead of creating links that point deeper into the file system, create links that point back to your index.php and add a parameter that tells the script which (relative) folder to list. Beware that you have to verify that only files you actually want to make public can be shown (i.e. no files above your "entry" directory).
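A minimal PHP sketch of that idea; the archive directory name, the dir parameter, and the link markup are assumptions, not taken from the original script:

    <?php
    // Sketch of the single-index.php approach: one script lists any
    // folder under a fixed entry directory. Names are placeholders.
    $base = realpath(__DIR__ . '/archive');   // the public "entry" directory
    $requested = isset($_GET['dir']) ? $_GET['dir'] : '';
    $path = realpath($base . '/' . $requested);

    // Refuse anything that resolves outside the entry directory
    // (e.g. dir=../../etc fails this check).
    if ($path === false || strpos($path . '/', $base . '/') !== 0) {
        http_response_code(404);
        exit('Not found');
    }

    foreach (scandir($path) as $entry) {
        if ($entry === '.' || $entry === '..') continue;
        $rel  = ltrim($requested . '/' . $entry, '/');
        $href = implode('/', array_map('rawurlencode', explode('/', $rel)));
        if (is_dir("$path/$entry")) {
            // Folders link back to this same script, one level deeper.
            echo '<a href="index.php?dir=' . rawurlencode($rel) . '">'
               . htmlspecialchars($entry) . "/</a><br>\n";
        } else {
            // Files link straight to the archive path for download.
            echo '<a href="archive/' . $href . '">'
               . htmlspecialchars($entry) . "</a><br>\n";
        }
    }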
