Google Cloud Send Email Upon File Upload - google-app-engine

I'm new to Google Cloud. I'm trying to find out whether there is a way, every time I upload a file to Cloud Storage, to have an instance send an email to the user. I already have my device uploading files to Cloud Storage without any issues; however, the device is also sending the emails, and since it's an embedded application I'd prefer to offload that task.

Check out GCS Object Change Notifications. It's the more generic answer to "how do I take action when a file changes in GCS", but you could certainly implement a notification handler in App Engine to handle your email notifications.
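For reference, such a handler on App Engine can be quite small. The sketch below assumes a Flask app and a hypothetical `/gcs-notify` endpoint; `send_upload_email` is a placeholder for whatever mail mechanism you choose (e.g. the App Engine Mail API or a third-party sender), not part of the question:

```python
import json

# Pure helper so the notification-parsing logic is testable on its own.
def parse_notification(resource_state, payload):
    """Return (event, object_name) from an Object Change Notification.

    `resource_state` comes from the X-Goog-Resource-State header
    ("sync", "exists", or "not_exists"); for "exists"/"not_exists"
    deliveries the JSON body is the object resource itself.
    """
    if resource_state == "sync":
        return ("sync", None)  # initial handshake, nothing to act on
    return (resource_state, payload.get("name"))

# Hypothetical App Engine (Flask) webhook wiring.
try:
    from flask import Flask, request

    app = Flask(__name__)

    @app.route("/gcs-notify", methods=["POST"])
    def gcs_notify():
        state = request.headers.get("X-Goog-Resource-State", "")
        event, name = parse_notification(state, request.get_json(silent=True) or {})
        if event == "exists" and name:
            # send_upload_email(name)  # placeholder: your mail-sending code
            pass
        return "", 200
except ImportError:
    pass  # Flask not installed; parse_notification still works standalone
```

Returning 200 quickly matters here, since the notification service retries deliveries that don't get a success response.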

Related

Use Cloud Pub/Sub to trigger sending of email

I'm trying to figure out how to use Cloud Pub/Sub to trigger the sending of an email when a file is added to a storage bucket.
Currently using PHP72 in the Google App Engine standard environment. First I created a topic that receives a message when a file is added to the storage bucket. Then I created a pull subscription that reads the message. I can view the messages in the GCP console, but what I would like is to be notified by email, preferably with the added file attached to the email. Is this possible? I tried looking for a solution or tutorial but came up empty.
You can implement the send-mail logic in a Cloud Function that is triggered by Pub/Sub (Node.js, Python, or Go).
Using Pub/Sub to trigger a Cloud Function
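As a rough illustration, a Python background function triggered by Pub/Sub receives the message payload base64-encoded in `event['data']`. The sketch below only decodes the GCS notification; where the comment sits, you would hand off to your mail provider (the function name `notify` is an arbitrary choice):

```python
import base64
import json

def decode_pubsub_event(event):
    """Decode the base64 payload of a Pub/Sub-triggered function event."""
    return base64.b64decode(event.get("data", "")).decode("utf-8")

def notify(event, context):
    """Entry point for a Pub/Sub-triggered Cloud Function (Python runtime).

    GCS notifications publish the object metadata as JSON, so the decoded
    payload can be parsed for the bucket and object name.
    """
    payload = decode_pubsub_event(event)
    info = json.loads(payload) if payload else {}
    # Hypothetical: call a mail provider's client library here, using
    # info.get("bucket") and info.get("name") to describe the new file.
    return info
```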
Instead of using a pull subscription, you should probably use a push subscription with App Engine, combined with one of the third-party mail services such as SendGrid or Mailjet.
The upload of an object to GCS triggers a message to be sent to the topic, and the push subscription delivers that message to App Engine.
Unfortunately, there aren't any full tutorials covering exactly what you want, but hopefully this helps. Feel free to request a community tutorial for this by filing an issue on the GCP community tutorial repo.
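With a push subscription, Pub/Sub POSTs a JSON envelope of the form `{"message": {"data": "<base64>", "attributes": {...}}, "subscription": ...}` to your App Engine endpoint (conventionally under `/_ah/push-handlers/` so App Engine can restrict access). A small helper to unpack that envelope, which your handler would then pass to the mail service:

```python
import base64

def decode_push_body(body):
    """Extract (data, attributes) from a Pub/Sub push delivery.

    `body` is the parsed JSON envelope POSTed by the push subscription.
    For GCS notifications, the attributes include things like the
    object ID, and `data` carries the object metadata as JSON.
    """
    message = body.get("message", {})
    data = base64.b64decode(message.get("data", "")).decode("utf-8")
    return data, message.get("attributes", {})
```

Note that for the attachment part of your question, the push message only carries the object's metadata, not its contents, so the handler would still need to fetch the file from the bucket before attaching it to an email.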

Apache Beam/Google Dataflow - Exporting Data from Google Datastore to File in Cloud Storage

I need to create a file report on user request. Each user selects the filters for the report, and my application should generate a file in Cloud Storage and send a notification with a link to the generated file.
This is the application workflow:
The client selects a filter and requests a report file
The application receives this request and creates a record in Datastore with the user's selected filter
It stores the new record's URL-safe Datastore key string in Pub/Sub
The Dataflow pipeline reads the key stored in Pub/Sub
It generates a file in Google Cloud Storage
It notifies the client with the storage file URL
Is it possible to create a file for each Pub/Sub entry?
How do I create a file with a custom name?
Is this architecture correct?
Your use case sounds as if it would be more applicable to Google Cloud Storage than Cloud Datastore. Google Cloud Storage is meant for opaque, file-like blobs of data, and it provides a method to receive Pub/Sub notifications on file updates: https://cloud.google.com/storage/docs/pubsub-notifications.
However, it's a bit unclear why you're using the indirection of Pub/Sub and Datastore in this case. Could the server handling the client request instead call the Google Cloud Storage API directly?
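To make the suggestion concrete, and to answer the custom-name question: object names in GCS are just strings you choose, so you can build them however you like and write the file straight from the request handler. A sketch, assuming the `google-cloud-storage` client library and a timestamp-based naming scheme (both assumptions, not requirements):

```python
import datetime

def report_object_name(user_id, when=None, prefix="reports"):
    """Build a custom, collision-resistant object name for a report."""
    when = when or datetime.datetime.utcnow()
    return "%s/%s/%s.csv" % (prefix, user_id, when.strftime("%Y%m%dT%H%M%S"))

def upload_report(bucket_name, object_name, contents):
    """Write the generated report straight to GCS from the request handler.

    Requires the google-cloud-storage client library and default
    application credentials; returns the gs:// URL of the new object.
    """
    from google.cloud import storage  # deferred so the helper above stays testable

    client = storage.Client()
    blob = client.bucket(bucket_name).blob(object_name)
    blob.upload_from_string(contents, content_type="text/csv")
    return "gs://%s/%s" % (bucket_name, object_name)
```

Whether Dataflow is warranted depends on how heavy the report generation is; for a simple per-request export, writing directly as above is the shorter path.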

Notification for Read Access for a Cloud Storage File

I am serving a file through Google Cloud Storage. If a user accesses it, is it possible to get a notification?
I have checked Cloud Pub/Sub notifications, but they cover objects being created or deleted inside a bucket.
I am looking for read access to a file. Is this possible to achieve?
There is no support for real-time notifications on object reads. You can, however, enable hourly access logs that you can scan over to see who's been reading your files recently, and you can get notifications when those access logs are available.
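Once usage logs are enabled on the bucket, each hourly log is a CSV file you can scan for GET-object requests. A sketch of that scan; the field names (`cs_method`, `cs_bucket`, `cs_object`, `c_ip`) follow the documented usage-log format but should be checked against the header row of your own log files:

```python
import csv
import io

def read_accesses(log_csv, bucket=None):
    """Yield (object_name, client_ip) for GET-object rows in a usage log.

    `log_csv` is the text of one usage-log CSV file, header row included.
    Pass `bucket` to restrict results to a single bucket.
    """
    for row in csv.DictReader(io.StringIO(log_csv)):
        if row.get("cs_method") == "GET" and row.get("cs_object"):
            if bucket is None or row.get("cs_bucket") == bucket:
                yield row["cs_object"], row.get("c_ip")
```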

Is it possible to send text files as request message to Google PubSub

I am trying to store files on Google Cloud Storage. However, I do not want my primary GAE app to spend time on this task, so I am pushing it onto Google Pub/Sub, which forwards it to another GAE app. I know that it's possible to send an encoded string via Google Pub/Sub. Is it possible to achieve the same for text files?
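A text file's contents are just a string of bytes, so they can travel in a Pub/Sub message the same way any encoded string does, subject to Pub/Sub's per-message size limit (10 MB at the time of writing; files larger than that are better published as a GCS object path). A sketch:

```python
import base64

PUBSUB_MAX_BYTES = 10 * 1024 * 1024  # Pub/Sub's per-message limit

def file_to_message_data(text, encoding="utf-8"):
    """Turn a text file's contents into Pub/Sub message data (bytes).

    The client libraries accept raw bytes and handle wire encoding
    themselves; an explicit base64 step is only needed when building
    a REST publish body by hand.
    """
    data = text.encode(encoding)
    if len(data) > PUBSUB_MAX_BYTES:
        raise ValueError("message exceeds Pub/Sub's size limit; "
                         "publish a GCS object path instead")
    return data

def rest_payload(text):
    """Base64 form used in the Pub/Sub REST API's publish body."""
    return base64.b64encode(file_to_message_data(text)).decode("ascii")
```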

Is there a limit on how many files can be dynamically generated on Google Cloud Storage?

I have an app that uses appengine and Twilio to send voice messages.
In order to send a voice message you have to have your message content in a publicly available XML file. My app has a form that allows the user to create and post a message. Once the user posts the message, my app will dynamically create the XML file and save it to a bucket in Google Cloud Storage (GCS).
It appears that this works only up to ten times. On the eleventh attempt, GCS does not generate and save the file. When Twilio tries to make the voice call, it fails with a 404 error because GCS never created the file. If I delete some of the files in my bucket, it starts working again.
I can't find any documentation on Google regarding this kind of limitation. Does anyone know if there is such a limitation?
Thanks
