I have a .bat file that calls the AWS sync command:
aws s3 sync D:\Users\backup s3://mybucket
It syncs my local data to an S3 bucket. Then I created a Windows Task Scheduler task for that .bat file, and every day at 2300hr it runs and syncs my local data to the S3 bucket.
If a failure happens during the sync, either on the local PC (for example a network failure or an S3 authentication failure) or on the remote S3 side, I want to receive a failure notification via email.
What is the best and most efficient way to get this notification?
The AWS Command-Line Interface (CLI) can return an error code to indicate whether the command succeeded.
See: AWS CLI Return Codes
Therefore, you could do the following:
When running a CLI command, redirect output to a temporary file
Check the return code. If it is non-zero, send an email with the error code and the contents of the temporary file
It would be something like:
# Capture both stdout and stderr so the email includes the actual error message
aws s3 cp ./foo s3://bar > output 2>&1
code=$?
if [ $code -ne 0 ]; then
    <email stuff with output file>
fi
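Since your scheduled job is a Windows .bat file, here is a minimal sketch of the same idea in batch, assuming you have an SMTP server you can send through (smtp.example.com and both email addresses below are placeholders); it checks %ERRORLEVEL% and mails the captured log with PowerShell's Send-MailMessage:
rem Minimal sketch - SMTP server and addresses are placeholders
set "LOG=%TEMP%\s3sync.log"
aws s3 sync D:\Users\backup s3://mybucket > "%LOG%" 2>&1
if %ERRORLEVEL% neq 0 powershell -NoProfile -Command "Send-MailMessage -SmtpServer 'smtp.example.com' -From 'backup@example.com' -To 'me@example.com' -Subject 'S3 sync failed' -Body (Get-Content -Raw $env:LOG)"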
I am attempting to send song metadata to a v2.6 Shoutcast server. Obviously, I can’t share the username and password here, but if I enter the following format in a web browser, it works:
http://<serverusername>:<serverpassword>#<IPaddress>:<port>/admin.cgi?<streampassword>&mode=updinfo&song=XXXXXXX – XXXXXXX
However, I want to send updated metadata from a Windows batch file. I have used curl successfully before to send HTTP requests, but I can't get it to work in this instance. I have tried various curl invocations, such as:
CURL http://<serverusername>:<serverpassword>#<IPaddress>:<port>/admin.cgi -d <streampassword>&mode=updinfo&song=XXXXXXX – XXXXXXX
It does appear to be connecting to the server since it returns quite a bit of log information, but no errors. If I change the username or password, then it fails authentication.
Any suggestions?
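One thing worth checking, as a hedged suggestion rather than a confirmed fix: in a batch file, cmd.exe treats an unquoted & as a command separator, so everything after the first & in the query string never reaches curl. Quoting the whole URL (keeping the question's placeholder format, and URL-encoding the spaces in the song title) keeps it intact:
curl "http://<serverusername>:<serverpassword>#<IPaddress>:<port>/admin.cgi?<streampassword>&mode=updinfo&song=XXXXXXX%20-%20XXXXXXX"
If authentication still fails from curl, note that # normally introduces a URL fragment; passing the credentials with curl's -u <serverusername>:<serverpassword> option instead may behave more like the browser does.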
Situation:
I just created an Azure Storage file share, and I have figured out how to upload files via script (C#) to Azure Blob storage.
Problem:
I want to upload files via script to my Azure Storage file share (not Blob storage). I already installed the Azure CLI, but the problem is that I have to log in first (az login) before I can take any action.
Is there any way to upload files from a folder on my PC to the Azure Storage file share (testuser.file.core.windows.net\test) without mounting it?
Thanks a lot.
There are multiple ways to accomplish this. If your only problem is the login part, then you can use a service principal and certificate to automate the login.
For example:
az login --service-principal -u http://azure-cli-2016-08-05-14-31-15 -p ~/mycertfile.pem --tenant contoso.onmicrosoft.com
You can see more options by using
az login -h
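Once the non-interactive login works, a rough sketch of the upload step itself (the share name test and account name testuser come from the question; the account key and local file path are placeholders you would substitute) could use az storage file upload:
az storage file upload --account-name testuser --account-key <storage-account-key> --share-name test --source "C:\data\report.csv" --path report.csv
For pushing a whole local folder at once, az storage file upload-batch takes --source and --destination in a similar way.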
Hope this helps.
Is there any way to upload files from a folder on my PC to the Azure Storage file share (testuser.file.core.windows.net\test) without mounting it?
Yes, you could also use C# code to upload a file to Azure File storage. Please try the following code:
// Requires the WindowsAzure.Storage NuGet package
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.File;

CloudStorageAccount storageAccount = CloudStorageAccount.Parse("storage connection string");
// Create a CloudFileClient object for credentialed access to File storage.
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
// Get a reference to the file share we created previously.
CloudFileShare share = fileClient.GetShareReference("test");
share.CreateIfNotExists();
// Get a reference to the target file in the share's root directory.
CloudFile cloudFile = share.GetRootDirectoryReference().GetFileReference("fileName");
// Open the local file and upload its contents.
Stream fileStream = File.OpenRead(@"localpath");
cloudFile.UploadFromStream(fileStream);
Some files get uploaded to an FTP server on a daily basis, and I need those files in Google Cloud Storage. I don't want to bug the users who upload the files to install any additional software; I'd rather let them keep using their FTP client.
Is there a way to use GCS as an FTP server? If not, how can I create a job that periodically picks up the files from an FTP location and puts them in GCS?
In other words: what's the best and simplest way to do it?
You could write yourself an FTP server which uploads to GCS, for example based on pyftpdlib.
Define a custom handler which stores the file to GCS when it is received:
import os

from pyftpdlib.handlers import FTPHandler
from pyftpdlib.servers import FTPServer
from pyftpdlib.authorizers import DummyAuthorizer
from google.cloud import storage


class MyHandler(FTPHandler):
    def on_file_received(self, file):
        # Upload the received file to GCS, then delete the local copy
        storage_client = storage.Client()
        bucket = storage_client.get_bucket('your_gcs_bucket')
        blob = bucket.blob(file[5:])  # strip leading /tmp/
        blob.upload_from_filename(file)
        os.remove(file)

    # def on_... : implement other events as needed


def main():
    authorizer = DummyAuthorizer()
    authorizer.add_user('user', 'password', homedir='/tmp', perm='elradfmw')

    handler = MyHandler
    handler.authorizer = authorizer
    handler.masquerade_address = 'add.your.public.ip'
    handler.passive_ports = range(60000, 60999)

    server = FTPServer(("127.0.0.1", 21), handler)
    server.serve_forever()


if __name__ == "__main__":
    main()
I've successfully run this on Google Container Engine (it requires some effort getting passive FTP working properly) but it should be pretty simple to do on Compute Engine. According to the above configuration, open port 21 and ports 60000 - 60999 on the firewall.
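For opening those ports, a hedged sketch using gcloud (the rule name ftp-passive and the wide-open 0.0.0.0/0 source range are assumptions; narrow the source range if you can):
gcloud compute firewall-rules create ftp-passive --allow tcp:21,tcp:60000-60999 --source-ranges 0.0.0.0/0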
To run it, python my_ftp_server.py - if you want to listen on port 21 you'll need root privileges.
You could set up a cron job and rsync between the FTP server and Google Cloud Storage using gsutil rsync or the open-source rclone tool.
If you can't run those commands on the FTP server periodically, you could mount the FTP server as a local filesystem or drive (Linux, Windows).
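As a hedged sketch of the cron approach (the local FTP directory /srv/ftp and the bucket name are assumptions), a nightly crontab entry on the FTP server might look like this:
# Mirror the FTP upload directory to GCS every night at 02:00
0 2 * * * gsutil -m rsync -r /srv/ftp gs://your-gcs-bucket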
I have successfully set up an FTP proxy to GCS using gcsfs in a VM in Google Compute (mentioned by jkff in the comment to my question), with these instructions:
http://ilyapimenov.com/blog/2015/01/19/ftp-proxy-to-gcs.html
Some changes are needed though:
In /etc/vsftpd.conf change #write_enable=YES to
write_enable=YES
Add firewall rules in your GC project to allow access to port 21 and passive ports 15393 to 15592 (https://console.cloud.google.com/networking/firewalls/list)
Some possible problems:
If you can access the FTP server using the local ip, but not the remote ip, it's probably because you haven't set up the firewall rules
If you can access the ftp server, but are unable to write, it's probably because you need the write_enable=YES
If you are trying to read the folder you created under /mnt but get an I/O error, it's probably because the bucket in gcsfs_config is not right.
Also, your FTP client needs to use passive transfer mode.
Set up a VM in Google Cloud, using some *nix flavor. Set up FTP on it, and point it to a folder abc. Use google fuse to mount abc as a GCS bucket. Voila: back and forth between GCS and FTP without writing any software.
(Small print: fuse rolls up and dies if you push too much data, so bounce it periodically, once a week or once a day; also you might need to set the mount or fuse to allow permissions for all users)
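A minimal sketch of the mount step, assuming gcsfuse is the "google fuse" tool being referred to, and that the bucket and mount point names below are placeholders:
# Mount the bucket at the folder the FTP server serves
mkdir -p /srv/ftp/abc
gcsfuse --implicit-dirs my-gcs-bucket /srv/ftp/abc
To "bounce it periodically" as suggested above, unmount with fusermount -u /srv/ftp/abc and mount it again, for example from cron.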
When we upload something to Google Drive, it starts syncing automatically, so I need output after the upload completes successfully.
Normally we get a message that the sync is complete; I want that output through a batch file.
Basically, I am automating file uploads through batch files:
first I upload some files to Google Drive using robocopy,
then wait till the sync completes; when it does, I need that "sync complete" message written to a text file so I can verify that the sync finished successfully.
So is there any switch available which gives that kind of output?
I'm unable to see images via the URL returned from getPublicUrl(). The images are uploaded directly to Google Cloud Storage via HTTP POST, and I can see them in the Datastore/Blobstore Viewer at http://localhost:8000.
try {
    echo CloudStorageTools::getPublicUrl("gs://qbucket_first/sample.jpeg", true);
} catch (CloudStorageException $e) {
    echo 'There was an exception creating the Image Public URL, details ' . $e->getMessage();
}
What I get is this: http://localhost:8080/_ah/gcs/qbucket_first/sample.jpeg. But I receive a 404 error when entering that URL. How can I fix this problem? I am using Chrome, Mac OS X, and GAELauncher v1.8.9 (PHP).
Edit 1
If I run this code:
file_put_contents("gs://qbucket_first/hello.txt", "Hello");
echo CloudStorageTools::getPublicUrl("gs://qbucket_first/hello.txt", true);
I get
http://localhost:8080/_ah/gcs/qbucket_first/hello.txt
And when entering this URL, I can download the file and read the content. So it works with text files which means that Datastore is working on local dev.
The same thing happens in the Python version of Google App Engine.
dev_appserver.py creates a fake Cloud Storage for local development. When you write something to Google Cloud Storage in the local environment, it is actually written into that fake space, and getPublicUrl() in the local environment only returns the fake URL from your local Google Cloud Storage.
The first time, you uploaded the file to the real Google Cloud Storage, so in the local environment you get a 404.
The second time, you saved the file from your local runtime, so it was actually written into the local fake GCS environment, which is why it works.
You are using the dev server, which seems to be the problem. Run it in production.
Does the image filename have spaces?
If yes, remove the spaces or replace with _.