How to upload a large number of files to a Google Storage bucket?

What is the right way to upload a specific folder, containing thousands of files and subfolders, from a PC to a Google Storage bucket?
I tried with the gsutil command:
gsutil -m cp -r myfolder gs://my-bucket
But the transfer stops after uploading only a few hundred files and then drops a Python error.
Is this the right way to do this?
Microsoft Azure Storage has a (wonderful) graphical tool called Microsoft Azure Storage Explorer, and its azcopy command works perfectly, uploading all those thousands of files very quickly.

You can use gsutil with the -R recursive flag to upload all the files and sub-directories.
gsutil -m cp -R dir gs://my_bucket
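If the -m cp transfer keeps dying partway through, a restartable variant (a sketch, reusing the folder and bucket names from the question) is rsync-style copying, which only re-copies what is missing, so the command can simply be re-run after an error; with a recent Cloud SDK, the newer gcloud storage command is another option:
# re-runnable: rsync only copies objects that are missing or changed
gsutil -m rsync -r myfolder gs://my-bucket
# or, with a recent Cloud SDK, the newer command family
gcloud storage cp -r myfolder gs://my-bucket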

Related

How to upload react build folder to my remote server?

I'm trying to deploy my React build folder to my server. I'm using the index.html and static paths configured in my settings.py file to do that. (https://create-react-app.dev/docs/deployment/)
Since my backend is running on Ubuntu, I can't just copy from my Windows side and paste it. For now, I upload my build folder to my Google Drive and download it on Ubuntu. But I still can't just copy and paste it in my PyCharm IDE; I can only copy the content of each file, create a new file on my server, and paste the content into it. This is just so time-consuming.
Is there a better way to do this?
You can use scp to upload the folder to the remote server.
This link may help you:
https://linuxhandbook.com/transfer-files-ssh/
Use the scp command:
# run from your local machine, in the directory containing the build folder:
scp -r ./build username@remote_address:/path/for/deploy
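For a build folder with many files, rsync is a hedged alternative to scp (the host and paths below are placeholders); it compresses the transfer and, when re-run, only sends files that changed:
# -a preserves the directory structure and permissions, -z compresses in transit
rsync -avz ./build/ username@remote_address:/path/for/deploy/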

Download files from multiple links in Google Cloud Storage

I have 1000 files in Google Cloud Storage, but these files are in multiple directories. How can I download them all at the same time?
I put all the links in an Excel file and used this command:
cat C:/Users/tm/files.xlsx | C:/Users/tm/AppData/Local/Google/Cloud_SDK/google-cloud-sdk/bin/gsutil.cmd -m cp -I C:/Users/tm/Desktop/files
Then I got this message:
stat: embedded null character in path
CommandException: 1 file/object could not be transferred.
Thanks in advance
gsutil -m cp -R gs://your-bucket-name/path/to/directory/ C:/Users/tma/Desktop/files
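The stat: embedded null character in path error most likely comes from piping the binary .xlsx file itself into gsutil. A sketch of the original per-link approach, assuming the links are first exported from Excel to a plain-text file with one gs:// URL per line (the file name below is hypothetical): the -I flag makes gsutil read the list of objects to copy from stdin.
type C:\Users\tm\files.txt | gsutil -m cp -I C:\Users\tm\Desktop\files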

Downloading many files at once in SSH from Google Cloud?

I just started working with Google Cloud on a project (using VM instances). I connected to SSH straight from the browser.
I will have thousands of .txt files in a few directories, and the "Download file" option only allows me to download 1 file at a time.
What's the easiest way to download all those files (or the whole directory) straight to my computer? Or, what method should I use/learn?
The easiest way will be to install the Cloud SDK on your local machine (see installation instructions here) and use the gcloud compute scp command to download your files or directories. For example:
gcloud compute scp --recurse vm-instance:~/remote-directory ~/local-directory
This will copy a remote directory, ~/remote-directory, from vm-instance to the ~/local-directory directory of your local host. You'll find more details about this command usage here.
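When the directory holds thousands of small files, copying them one by one over scp can be slow; a hedged variation (reusing the names from the example above) is to pack them into a single archive on the VM first and copy that:
# on the VM: pack the directory into one archive
tar czf remote-directory.tar.gz -C ~ remote-directory
# on your local machine: copy the single archive, then unpack it
gcloud compute scp vm-instance:~/remote-directory.tar.gz ~/local-directory/
tar xzf ~/local-directory/remote-directory.tar.gz -C ~/local-directory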

Windows batch file to download the latest file in an S3 bucket folder

I am trying to make a Windows batch script that will get me the name of the latest file in a folder in an S3 bucket. I want to use the output to download that file to the local machine.
Here is the current output of the aws s3 ls command:
2017-12-19 13:28:51 0
2018-01-09 16:13:03 380093544960 FULL-BK-OLD.bak
2019-09-10 23:44:37 101958829056 Backup_FULL_20190910_164900.bak
2019-09-23 23:45:15 76693135360 Backup_FULL_20190923_120027.bak
2019-09-30 23:45:10 76976559104 Backup_FULL_20190930_120025.bak
2019-10-05 23:45:11 79225411584 Backup_FULL_20191005_120025.bak
The goal:
aws s3 cp s3://mybucket/backups/%latestFile% .
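One way to script this (a sketch rather than a tested production script; it reuses the bucket and prefix from the goal above and assumes the backup file names contain no spaces) is to sort the aws s3 ls output by its leading timestamp and keep the file name from the last line:
@echo off
rem list the backups, sort by the leading date/time, keep the last file name
for /f "tokens=4" %%i in ('aws s3 ls s3://mybucket/backups/ ^| sort') do set "latestFile=%%i"
rem download the newest backup to the current directory
aws s3 cp s3://mybucket/backups/%latestFile% .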

What "option" to use with "WGET" for selecting only few files with particular extension from a FTP directory

I am trying to download files with a particular datestamp as an extension from a folder on an FTP server. Since the folder contains all the other files as well, I want to download only the files with a particular datestamp.
I tried using wget files_datestamp*.extension, which didn't work.
I also tried using wget -i files_datestamp*.extension, which downloads everything.
My question is: what option should I use with wget to download only the particular files I am interested in?
wget http://collaboration.cmc.ec.gc.ca/cmc/CMOI/NetCDF/NMME/1p0deg/#%23%23/CanCM3_201904_r4i1p1_20190501*.nc4
The link you've shared is over HTTP, not FTP. As a result, it is not possible to glob over the filenames; that is feasible only over FTP.
With HTTP, you need access to a directory listing page that tells you which files are available. Then use -r --accept-regex=<regex here> to download your files.
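A hedged sketch of both cases (the FTP host and path are placeholders; the regex reuses the file pattern from the attempted URL, and -r -np -nd recurses from the listing page without ascending to parent directories or recreating the directory tree):
# over FTP, globbing in the URL works directly
wget "ftp://ftp.example.com/path/to/files/CanCM3_201904_r4i1p1_20190501*.nc4"
# over HTTP, recurse from the directory listing and filter by a regex on the full URL
wget -r -np -nd --accept-regex=".*CanCM3_201904_r4i1p1_20190501.*\.nc4" http://collaboration.cmc.ec.gc.ca/cmc/CMOI/NetCDF/NMME/1p0deg/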
