Notification for Read Access for a Cloud Storage File - google-app-engine

I am serving a file through Google Cloud Storage. If a user accesses it, is it possible to get a notification?
I have checked Cloud Pub/Sub Notifications, but those deal with objects being created or deleted inside a bucket.
I am looking for notifications on read access to a file. Is that possible to achieve?

There is no support for real-time notifications on object reads. You can, however, enable hourly access logs that you can scan over to see who's been reading your files recently, and you can get notifications when those access logs are available.
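To illustrate the access-log approach, here is a minimal sketch that scans a GCS usage log for successful object reads. The column names follow the documented usage-log CSV format, but the sample row below is fabricated for illustration and uses only a subset of the real columns:

```python
import csv
import io

# Fabricated sample of a GCS usage log (real logs have more columns).
SAMPLE_LOG = (
    '"time_micros","c_ip","cs_method","cs_uri","sc_status","cs_bucket","cs_object"\n'
    '"1700000000000000","203.0.113.5","GET","/my-bucket/report.json","200","my-bucket","report.json"\n'
    '"1700000001000000","203.0.113.9","PUT","/my-bucket/new.json","200","my-bucket","new.json"\n'
)

def object_reads(log_text):
    """Return (client_ip, object_name) pairs for successful GET requests on objects."""
    reader = csv.DictReader(io.StringIO(log_text))
    return [
        (row["c_ip"], row["cs_object"])
        for row in reader
        if row["cs_method"] == "GET" and row["sc_status"] == "200" and row["cs_object"]
    ]

print(object_reads(SAMPLE_LOG))  # [('203.0.113.5', 'report.json')]
```

Since logs are delivered hourly (and at best-effort latency), this only tells you who read a file recently, not in real time.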

Related

Is using Google Pub/Sub possible on the frontend (React)

I'm fairly new to things that aren't strictly front end, so after reading the Google Pub/Sub docs and doing a few searches, it's not clear to me whether using it with React is possible.
My use case is I (hypothetically) have tens of thousands of people on my webpage at a time that all need to be told at the same time that some external event occurred (the message would be very small).
I know Google Firestore has a listener feature, but based on this specification it would no longer fall within the free tier usage. I've seen libraries that allow Google Pub/Sub to be used with IoT devices, so I'm confused about why I can't find any resources on using it in the browser.
Creating a Cloud Pub/Sub subscriber in the frontend would be an anti-pattern for several reasons. First of all, the quota limits only allow 10,000 subscriptions per topic. Since you say you have tens of thousands of people on the web page at a time, you would not be able to create enough subscriptions for this case.
Additionally, subscriptions created when users come to the website would not be able to get any notifications from before the time the subscription was created; Cloud Pub/Sub only guarantees delivery of messages published after the subscription was successfully created.
Finally, you'd have the issue of security and authentication. In order to start a subscriber from the client, you'd need to pass it credentials that it could use. If you use separate credentials for each webpage viewer, then you'd have to create these credentials on the fly and revoke them when the user disappears. If you use the same credentials across all of the subscribers, then one subscriber could intercept the feed of another subscriber.
Overall, Cloud Pub/Sub is designed for the torrents use case: fewer feeds with a lot of data that has to be processed by fewer subscribers. What you are talking about is the trickles use case: a small number of messages that need to be distributed among a large number of subscribers with individual ACLs. Firebase Cloud Messaging is the product designed for this latter case.
While it is true that Cloud Pub/Sub is on the path for Google Cloud IoT, it is used on the publish side: many devices send their events to a topic that can be processed by subscribers. Note that these messages from devices don't come directly into Cloud Pub/Sub; they go through a Cloud IoT server and that server is what publishes the messages to Cloud Pub/Sub. Device authentication is done via Cloud IoT and not via permissions on Cloud Pub/Sub topics. The delivery of messages to IoT devices is not done with Cloud Pub/Sub.
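The "trickles" pattern the answer recommends can be sketched in miniature: rather than one Pub/Sub subscription per browser, a single backend subscriber relays each message to every connected client. In production the relay would be Firebase Cloud Messaging or a WebSocket server; the `Broker` class here is a hypothetical in-memory stand-in for that fan-out step:

```python
class Broker:
    """Hypothetical stand-in for a fan-out layer (FCM, WebSockets, ...)."""

    def __init__(self):
        self.clients = {}  # client_id -> list of delivered messages

    def connect(self, client_id):
        self.clients[client_id] = []

    def disconnect(self, client_id):
        self.clients.pop(client_id, None)

    def on_pubsub_message(self, message):
        # The callback of ONE backend subscription fans the message
        # out to every currently connected client.
        for inbox in self.clients.values():
            inbox.append(message)

broker = Broker()
for cid in ("alice", "bob"):
    broker.connect(cid)
broker.on_pubsub_message("external event occurred")
print(broker.clients["alice"])  # ['external event occurred']
```

This keeps exactly one Pub/Sub subscription regardless of how many viewers are on the page, sidestepping both the quota and the credential problems described above.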

How can I enforce rate limit for users downloading from Google Cloud Storage bucket?

I am implementing a dictionary website using App Engine and Cloud Storage. App Engine handles the backend (user authentication and so on), and Cloud Storage stores a JSON file for each dictionary entry.
I would like to rate limit how much a user can download in a given time period so they can't bulk download the JSON files and result in a big charge for me. Ideally, the dictionary would display a captcha if a user downloads too much at once, and allow them to keep downloading if they pass the captcha. What is the best way to achieve this?
Is there a specific service for rate limiting based on IP address or authenticated user? Should I do this through App Engine and only access Cloud Storage through App Engine (perhaps slower since it's using some of my dynamic resources to serve static content)? Or is it possible to have the frontend access Cloud Storage and implement the rate limiting on Cloud Storage directly? Is a Cloud bucket the right service for storage, here? And how can I allow search engine indexing bots to bypass the rate limiting?
As explained by Doug Stevenson in this post:
"There is no configuration for limiting the volume of downloads for files stored in Cloud Storage."
and explaining further:
"If you want to limit what end users can do, you will need to route them through some middleware component that you build that tracks how they're using your provided API to download files, and restrict what they can do based on their prior behavior. This is obviously nontrivial to implement, but it's possible."
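A minimal sketch of that middleware idea: a sliding-window rate limiter keyed by user ID (or IP), consulted by the App Engine handler before it serves a file from Cloud Storage. The limits and class names are hypothetical, and a real deployment would back the history with shared storage (e.g. Memorystore) rather than process memory; a `False` result is where you would serve the captcha instead of the file:

```python
import time
from collections import defaultdict

class DownloadLimiter:
    """Allow at most `max_downloads` per `window_seconds` per key."""

    def __init__(self, max_downloads, window_seconds, clock=time.monotonic):
        self.max_downloads = max_downloads
        self.window = window_seconds
        self.clock = clock  # injectable for testing
        self.history = defaultdict(list)  # key -> recent request timestamps

    def allow(self, key):
        """True if this download may proceed; False means 'show a captcha'."""
        now = self.clock()
        recent = [t for t in self.history[key] if now - t < self.window]
        self.history[key] = recent
        if len(recent) >= self.max_downloads:
            return False
        recent.append(now)
        return True

# Usage: 3 downloads per 60 seconds per user.
limiter = DownloadLimiter(max_downloads=3, window_seconds=60, clock=lambda: 0.0)
print([limiter.allow("user-1") for _ in range(4)])  # [True, True, True, False]
```

Passing the captcha could simply clear `history[key]` so the user may continue; search engine bots could be allow-listed by user agent plus reverse-DNS verification before the limiter is consulted.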

Apache Beam/Google Dataflow - Exporting Data from Google Datastore to File in Cloud Storage

I need to create a report file for each user request. Each user selects the filters for their report, and my application should generate a file in Cloud Storage and send a notification with a link to the generated file.
This is the application workflow:
The client selects a filter and requests a report file.
The application receives this request and creates a record in Datastore with the user's selected filter.
It stores the URL-safe string of the new record's Datastore key in Pub/Sub.
The Dataflow pipeline reads the key stored in Pub/Sub.
It generates the file in Google Cloud Storage.
It notifies the client with the Storage file URL.
Is it possible to create a file for each Pub/Sub message?
How do I create a file with a custom name?
Is this architecture correct?
Your use case sounds as if it would be more applicable to Google Cloud Storage than Cloud Datastore. Google Cloud Storage is meant for opaque, file-like blobs of data, and it provides a way to receive Pub/Sub notifications on file updates: https://cloud.google.com/storage/docs/pubsub-notifications.
However, it's a bit unclear why you're using the indirection of Pub/Sub and Datastore in this case. Could the server handling the client request instead make a call to the Google Cloud Storage API directly?
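On the "one file per message, with a custom name" part of the question: yes, that is possible, since nothing ties a GCS object name to the message. A sketch of that handler, with the upload injected as a callable so the naming logic stands alone (the bucket name and path convention here are hypothetical):

```python
def report_object_name(datastore_key, timestamp):
    """One object per message; the key makes the name unique per request."""
    return f"reports/{datastore_key}/{timestamp}.json"

def handle_message(datastore_key, timestamp, report_json, upload):
    """`upload(object_name, data)` stands in for a GCS blob upload."""
    name = report_object_name(datastore_key, timestamp)
    upload(name, report_json)
    # The client would then be notified with this URL:
    return f"https://storage.googleapis.com/my-report-bucket/{name}"

written = {}
url = handle_message("ahFkZXYta2V5", "20240101T000000",
                     '{"rows": []}', lambda n, d: written.update({n: d}))
print(url)
```

The URL-safe Datastore key from the message is a convenient unique component for the custom file name, exactly because each request creates its own record.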

Google Cloud Send Email Upon File Upload

I'm new to Google Cloud. I'm trying to find out whether there is a way, every time I upload a file to Cloud Storage, to have an instance send an email to the user. I already have my device uploading files to Cloud Storage without any issues; however, the device is also sending the emails too, and since it's an embedded application I'd prefer to offload that task.
Check out GCS Object Change Notifications. It's the more generic answer to "how do I take action when a file changes in GCS," but you could certainly implement a notification handler in App Engine to handle your email notifications.
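A sketch of such a handler: given the payload of an object-change notification (a dict carrying at least the object's `bucket` and `name`), build the email to send. The actual delivery (App Engine Mail API, a third-party provider, etc.) is left as an injected callable, since that choice is up to you:

```python
def build_upload_email(event, recipient):
    """Turn a GCS object-change payload into an email message dict."""
    bucket = event["bucket"]
    name = event["name"]
    return {
        "to": recipient,
        "subject": f"New file uploaded: {name}",
        "body": f"The file gs://{bucket}/{name} was just uploaded.",
    }

def on_object_change(event, recipient, send):
    # `send` is whatever email backend you wire in on App Engine.
    send(build_upload_email(event, recipient))

sent = []
on_object_change({"bucket": "device-uploads", "name": "photo.jpg"},
                 "user@example.com", sent.append)
print(sent[0]["subject"])  # New file uploaded: photo.jpg
```

This moves the email step entirely off the embedded device: the device only uploads, and the notification-driven handler does the rest.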

Google Cloud Storage file access control list public-read-write is not working in app engine java

I am uploading an image to a Google Cloud Storage bucket. I want to give all users READ permission. Can anyone tell me what I should pass to setAcl()?
I want to get that image with http://storage.googleapis.com/BUCKET_NAME/IMAGE_NAME.jpg.
public-read-write is not a valid predefined ACL for objects, since objects, unlike buckets, do not really have a write permission. You are looking for public-read.
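For illustration, here is what that looks like with the google-cloud-storage Python client (the question uses the Java API, so this is an assumed equivalent). The upload call needs real credentials and is shown only as a comment; the runnable part is the public URL format the question asks about:

```python
def public_url(bucket_name, object_name):
    """URL at which a public-read object is served."""
    return f"http://storage.googleapis.com/{bucket_name}/{object_name}"

# Illustrative only (requires credentials; not run here):
#   from google.cloud import storage
#   blob = storage.Client().bucket("BUCKET_NAME").blob("IMAGE_NAME.jpg")
#   blob.upload_from_filename("local.jpg", predefined_acl="public-read")

print(public_url("BUCKET_NAME", "IMAGE_NAME.jpg"))
```

With the `public-read` predefined ACL set, any user can fetch the object at that URL without authentication.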
