Windows batch file to download latest file in S3 bucket folder

I am trying to write a Windows batch script that gets me the name of the latest file in a folder in an S3 bucket, so I can use that name to download the file to the local machine.
Here is the current output from the aws s3 ls command:
2017-12-19 13:28:51 0
2018-01-09 16:13:03 380093544960 FULL-BK-OLD.bak
2019-09-10 23:44:37 101958829056 Backup_FULL_20190910_164900.bak
2019-09-23 23:45:15 76693135360 Backup_FULL_20190923_120027.bak
2019-09-30 23:45:10 76976559104 Backup_FULL_20190930_120025.bak
2019-10-05 23:45:11 79225411584 Backup_FULL_20191005_120025.bak
The goal:
aws s3 cp s3://mybucket/backups/%latestFile% .
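One way to get there (a minimal, untested sketch: it relies on each line of the listing starting with the timestamp, so a plain lexical sort orders the lines chronologically, and the last line seen is the newest file):
@echo off
setlocal
rem "tokens=3*" skips the date, time, and size columns and puts the rest of
rem the line (the file name, even if it contains spaces) into %%G.
rem Each loop pass overwrites latestFile, so the newest name wins.
for /f "tokens=3*" %%F in ('aws s3 ls s3://mybucket/backups/ ^| sort') do set "latestFile=%%G"
aws s3 cp "s3://mybucket/backups/%latestFile%" .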

Related

How to upload react build folder to my remote server?

I'm trying to deploy my React build folder to my server. I'm using index.html and the static directory configured in my settings.py file to do that. (https://create-react-app.dev/docs/deployment/)
Since my backend is running on Ubuntu, I can't just copy the folder from my Windows side and paste it. For now, I upload my build folder to my Google Drive and download it on Ubuntu. But I still can't just copy and paste it in my PyCharm IDE; I can only copy the contents of each file, create a new file on my server, and paste the contents in. This is just so time-consuming.
Is there any better way to do this?
You can use scp to upload the folder to the remote server.
This link may help you:
https://linuxhandbook.com/transfer-files-ssh/
Use the scp command:
# upload the local build folder to the remote deploy path:
scp -r build/ username@remote_address:/path/for/deploy

Downloading many files at once in SSH from Google Cloud?

I just started working with Google Cloud on a project (using VM instances). I connected over SSH straight from the browser.
I will have thousands of .txt files in a few directories, and the "Download file" option only allows me to download 1 file at a time.
What's the easiest way to download all those files (or the whole directory) straight to my computer? Or, what method should I use/learn?
The easiest way will be to install the Cloud SDK on your local machine (see installation instructions here) and use the gcloud compute scp command to download your files or directories. For example:
gcloud compute scp --recurse vm-instance:~/remote-directory ~/local-directory
This will copy a remote directory, ~/remote-directory, from vm-instance to the ~/local-directory directory of your local host. You'll find more details about this command usage here.
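With thousands of small files, the per-file overhead of scp adds up; one alternative (a sketch, assuming tar is available on the VM and using the same instance and directory names as above) is to bundle everything into a single archive first:
# on the VM:
tar czf ~/files.tgz -C ~ remote-directory
# on your local machine:
gcloud compute scp vm-instance:~/files.tgz ~/local-directory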

How to upload a large number of files to a Google Storage bucket?

Which is the right way to upload a specific folder from a PC with thousands of files and subfolders to a Google Storage Bucket?
I tried with gsutil command:
gsutil -m cp -r myfolder gs://my-bucket
But the transfer stops after uploading only a few hundred files, and then it drops a Python error.
Is this the right way to do this?
Microsoft Azure Storage has a (wonderful) graphical tool called Microsoft Azure Storage Explorer, and with the azcopy command it works perfectly, uploading thousands of files very quickly.
You can use gsutil with the -R recursive flag to upload all the files and sub-directories.
gsutil -m cp -R dir gs://my_bucket
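If the transfer keeps dying partway through, re-running a copy starts over from scratch; gsutil rsync (a standard gsutil command, shown here with the hypothetical names from the question) only transfers files that are missing or changed at the destination, so it can simply be re-run until the upload completes:
gsutil -m rsync -r myfolder gs://my-bucket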

PSFTP rename file extension

I'm using a PSFTP script to upload files to a remote server for processing, and there are specific steps I need to take for a file to be processed correctly by their system. Specifically, I have to upload the file with a ".u01" extension, then rename it to ".r01" after it has been uploaded.
Is there a way to replace just the extension of the uploaded file? The rest of the filename can't be modified, because the filename corresponds to the contents of the file.
And no, I can't just upload it with a ".r" extension. Tried that already :P
The script is simple. It uploads all files from the "to_upload" directory into "ul" on the remote server:
cd ul
put -r to_upload/
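PSFTP does have a mv command for renaming remote files, so the extension can be swapped right after the upload. A minimal sketch for a single file (data123 is a placeholder name; since put -r uploads everything at once, you would need one mv per file, which means generating the script):
cd ul
put to_upload/data123.u01
mv data123.u01 data123.r01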

Batch script to upload file using FTP to a particular folder in remote server

I am using this script to upload files over FTP. However, it uploads to the root folder on the server.
Can anyone tell me how to upload the file to a particular path on the remote server, say '/CurrentQA'?
Before this line:
ECHO binary >> %Commands%
Try adding this line:
ECHO cd /some/directory >> %Commands%
This should change into the desired directory on the remote server (I haven't tested it, though).
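For context, a sketch of what the complete command file might look like once built (host, credentials, and file name are placeholders; ftp is then run with -n to suppress auto-login and -s: to read the commands):
ECHO open ftp.example.com >> %Commands%
ECHO user myuser mypassword >> %Commands%
ECHO cd /CurrentQA >> %Commands%
ECHO binary >> %Commands%
ECHO put myfile.dat >> %Commands%
ECHO bye >> %Commands%
ftp -n -s:%Commands%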
