Drive Realtime API not granting permission to realtime document; normal drive API freaking out

My app uses the Drive REST API and the Drive Realtime API in combination. We set the file's permissions so that the public has view access, and then emailed a link to the file to a few thousand customers.
Even though the file's permissions give the public view access:
When a user tries to open the realtime document, we get: Drive Realtime API Error: not_found: File not found.
When a user tries to copy the non-realtime file, we get: The authenticated user has not granted the app 689742286244 write access to the file 0B-NHh5QARZiUUFctN0Zjc3RKdWs (of course, we are not asking for write access).
You can see the effects for yourself at https://peardeck.com/editor/0B-NHh5QARZiUUFctN0Zjc3RKdWs , along with our embarrassing attempts to cover for the errors.
Interesting notes:
Sharing the file directly with a particular Google account seems to lift the curse, and that Google account can then find the file as normal. No extra permissions, just an explicit reference to a Google account.
Setting the file so that the public has full write access seems to have no effect.
Other files with the exact same settings in the exact same Drive folder can be opened successfully (but presumably have not been opened by so many people in the past). This makes me think there is some broken state within Google.
How can I avoid this? What's going on? Thanks for any help!

The realtime API does not support anonymous access. All users must have a Google account and be explicitly authorized to view or write to the file.
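For what it's worth, the workaround observed in the question (sharing the file explicitly with a Google account) can be automated with the Drive REST API. Below is a minimal Python sketch, assuming the google-api-python-client and google-auth libraries and a previously authorized token file; the token path and email address are placeholders.

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Load previously authorized user credentials ('token.json' is a placeholder path).
creds = Credentials.from_authorized_user_file(
    'token.json', scopes=['https://www.googleapis.com/auth/drive'])
drive = build('drive', 'v3', credentials=creds)

# Explicitly grant one Google account read access to the file, instead of
# relying on "anyone with the link", which the Realtime API does not honor.
drive.permissions().create(
    fileId='0B-NHh5QARZiUUFctN0Zjc3RKdWs',           # the file from the question
    body={'type': 'user', 'role': 'reader',
          'emailAddress': 'customer@example.com'},   # placeholder address
    sendNotificationEmail=False,
).execute()
```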

Related

Granting directory level permissions for Google Cloud Storage REST APIs

My aim is to be able to call the Google Storage REST APIs with an OAuth token granting an authenticated Google user the read/write permissions on a directory called "directoryName" inside my storage bucket.
So far I have successfully managed to use the Storage APIs after adding the user to the ACL for the bucket. However, I do not want to grant the user READ or WRITE permissions on the complete bucket, just on the user's directory inside the bucket, e.g. bucket/directoryName.
For example, I want to be able to call storage.objects.list for a directory inside the bucket without giving the user permissions on the whole bucket, only on that directory (and its subdirectories).
What I've tried so far: when I call the GET method on https://www.googleapis.com/storage/v1/b/bucket/o?fields=kind%2Citems%28name%29&maxResults=150&prefix=directoryName with the user added to the directory's ACL (as Owner), I get the error response "code":403,"message":"myEmail@gmail.com does not have storage.objects.list access to myBucketName.appspot.com."
Is it possible to provide directory level permissions with Google Cloud Storage and list the contents of that directory only?
As explained in the documentation, there is no such thing as a directory in Cloud Storage. As far as Storage is concerned, there are only buckets and, inside them, objects/files that may or may not have "/" in their names.
Because of this design choice, there is no option to set permissions on a "directory" in Cloud Storage. Note, however, that you can create as many buckets as you want at no extra charge, so you could create one bucket per user to fit your requirement.
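If per-object access is enough, you can also grant the user read access on each object under the prefix, or follow the one-bucket-per-user suggestion above. A rough Python sketch with the google-cloud-storage client; the bucket names, prefix, and email address are placeholders, and it assumes the buckets use fine-grained (ACL) access control rather than uniform bucket-level access.

```python
from google.cloud import storage

client = storage.Client()

# Option A: grant read access on every existing object under a "directory" prefix.
# New uploads under the prefix need the same treatment.
bucket = client.bucket('myBucketName.appspot.com')          # placeholder bucket
for blob in bucket.list_blobs(prefix='directoryName/'):
    blob.acl.user('user@example.com').grant_read()          # placeholder address
    blob.acl.save()

# Option B: one bucket per user, with the ACL set once on the bucket itself.
user_bucket = client.create_bucket('my-app-user-12345')     # placeholder name
user_bucket.acl.user('user@example.com').grant_read()
user_bucket.acl.save()
```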

Yii2 restrict access to files

I have Yii2 application where users can upload and share files of different types. Once a file is uploaded, it could be downloaded only by certain other users and there are a whole bunch of checks that go behind this process.
My problem is that the files are stored on the server, and anyone who has a direct link to a file can download it without going through any authorization or security checks. How can I prevent this?
P.S. It could be any kind of solution, not one related to Yii2.
The following approach comes to mind:
Store the files at a location in the file system that is not made publicly accessible by the web server.
Make them available by reading them from the file system and sending them to the browser from the URL handler that also performs the security checks. Avoid redirecting to another URL that skips those checks.
If you give more details about a more specific problem or question, people can give you more specific information.
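As a framework-agnostic sketch of that approach (Python/Flask rather than Yii2), the handler below streams a file from a directory outside the web root only after an authorization check; the storage path, session handling, and can_access rule are hypothetical placeholders.

```python
from pathlib import Path
from flask import Flask, abort, send_file, session

app = Flask(__name__)
app.secret_key = 'replace-me'                        # placeholder

# Directory outside the web server's document root, so files have no public URL.
STORAGE_DIR = Path('/var/app/protected-uploads')     # placeholder path

def can_access(user_id, filename):
    # Hypothetical check: consult your own sharing rules / database here.
    return user_id is not None

@app.route('/download/<path:filename>')
def download(filename):
    if not can_access(session.get('user_id'), filename):
        abort(403)                                   # fail the security check
    target = (STORAGE_DIR / filename).resolve()
    # Reject path traversal (e.g. ../../etc/passwd) and missing files.
    if STORAGE_DIR.resolve() not in target.parents or not target.is_file():
        abort(404)
    # Stream the file from disk instead of redirecting to a public URL.
    return send_file(target, as_attachment=True)
```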

The authenticated user has not granted the app xxx write access to the child file xxx, which would be affected by the operation on the parent

I hit the error below when trying to insert a permission:
"code": 403,
"message": "The authenticated user has not granted the app xxx write access to the child file xxx, which would be affected by the operation on the parent"
Here is what I am doing:
We have two Google Accounts:
1. API Account - used to create the folder and transfer its ownership to the Business Account
2. Business Account - uploads a file
Now we try to share the folder with a customer and hit the above error.
I am using OAuth 2.0 for installed applications to get the access token.
Please advise what I am doing wrong here.
I had the same issue, but I realized it was because the scope for the credential wasn't set up properly. I only had DriveScopes.DRIVE_METADATA set, which was not enough for downloading files. Once I added DriveScopes.DRIVE_FILE, DriveScopes.DRIVE, and DriveScopes.DRIVE_APPDATA, I was able to download the file without any problem. Hope this helps.
P.S. If you are changing credentials, you have to delete the previously saved credential file.
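In Python the equivalent fix is simply to request broader scopes when building the credential (and, as the P.S. says, to delete the cached token so the consent prompt runs again). A hedged sketch with google-auth-oauthlib and google-api-python-client; the client secrets file name is a placeholder.

```python
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

# Broader scopes than drive.metadata alone; the user must re-consent,
# so delete any previously cached token file first.
SCOPES = [
    'https://www.googleapis.com/auth/drive',
    'https://www.googleapis.com/auth/drive.file',
    'https://www.googleapis.com/auth/drive.appdata',
]

flow = InstalledAppFlow.from_client_secrets_file('client_secret.json', SCOPES)
creds = flow.run_local_server(port=0)
drive = build('drive', 'v3', credentials=creds)
```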
According to the official Google documentation, you received '403: The user has not granted the app...' because the app is not on the ACL for the file; the user never explicitly opened the file with this Drive app.
It is highly recommended to use the Google Picker and prompt the user to open the file, or to direct the user to Drive to open the file with the app.

Google Drive SDK: Modify application-owned file as user

I have a Google App Engine application that:
Authenticates a user and authorizes the drive.file scope;
Creates and stores a file on behalf of a user via an application-owned 'regular' Google account;
Shares that file with the user (grants write access).
However, when a user attempts to update one of these files via an authorized Drive service created by the app, the following exception is raised:
403: The authenticated user has not granted the app {appId} access to
the file {fileId}.
What am I missing? Given that the file was both initially created by and is still owned by the application, why is it necessary for the user to specifically grant the application access to the file?
My goal is for users to modify files (to which they have write access, that are stored in/owned by an application-owned account) as themselves in order to maintain appropriate 'last modifying user' attribution.
Is there anything I can do to work around this, other than (a) authorizing the 'drive' scope, (b) using the Google Picker or Drive UI to 'explicitly' open files with my app (does this imply the file must live in the user's Drive account?), or (c) having my application-owned account perform all file update operations?
File scope authorization is currently done as a user-app pair. Each user must individually authorize the app to access the file.
Given that, I think you've identified the possible solutions. For (b), the file doesn't need to be owned by the user; they just need access to it. Having shared it with them should be sufficient.
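To make the user-app pairing concrete: even after the file has been shared with the user, an update made with the user's drive.file-scoped credential will return 403 until that user has authorized the file for this app (for example by opening it through the Picker). A hedged Python sketch of such an update; the token path, file ID, and upload path are placeholders.

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build
from googleapiclient.errors import HttpError
from googleapiclient.http import MediaFileUpload

# Credentials for the end user, authorized only for the drive.file scope.
user_creds = Credentials.from_authorized_user_file(
    'user_token.json', scopes=['https://www.googleapis.com/auth/drive.file'])
drive = build('drive', 'v3', credentials=user_creds)

try:
    drive.files().update(
        fileId='APP_OWNED_FILE_ID',                   # placeholder
        media_body=MediaFileUpload('drawing.svg'),    # placeholder upload
    ).execute()
except HttpError as err:
    if err.resp.status == 403:
        # The user-app pair is not on the file's app ACL yet: have the user
        # open the file via the Google Picker or the Drive UI first.
        raise
```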

Edge caching secure content on Google App Engine

In my app, I have to serve large secure files (SVG drawings etc.) that I want to show only to logged-in users. The files do not change much, and when one does change it gets a different URL, so I would like to use edge caching on Google App Engine for faster loading for users who are already logged in.
My question is: how do I make this secure? I.e., if the user logs out and someone else uses their browser, can that person see the content? If so, how do I prevent it?
Related: how do I prevent the browser from remembering the URL of content on my website?
One solution could be to use Google Cloud Storage with ACLs, i.e. set up so that only a particular logged-in user has access to the file. This solution is limited to Google Accounts, though.
UPDATE: Google Cloud Storage now has short-lived signed URLs:
"...you could provide a signed URL to a user which would allow the user to access that resource for a limited time. Anyone who knows the URL can access the resource for a limited time. (You specify the expiration time in the query string to be signed.)"
- so, that could be even closer to what you need.
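For reference, a signed URL can be generated server-side per request, so the link handed to a logged-in user stops working shortly afterwards. A minimal sketch with the google-cloud-storage Python client; the bucket and object names are placeholders, and a service account key with signing rights is assumed.

```python
from datetime import timedelta
from google.cloud import storage

client = storage.Client()   # assumes service-account credentials with a private key
blob = client.bucket('my-drawings-bucket').blob('plans/drawing-42.svg')  # placeholders

# The URL is valid for 10 minutes only; anyone using it later gets an error.
url = blob.generate_signed_url(expiration=timedelta(minutes=10), version='v4')
print(url)
```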
Another solution is to serve your huge files from your dynamic handlers. This, however, will consume a lot of CPU and bandwidth, and you'll still be limited by GAE quotas.
Related: you can't prevent the browser from remembering visited URLs or any other kind of history. That depends solely on the specific browser and the user's preferences (not accessible to your app / JavaScript / whatever). The only thing you can do is ask your users to clear their history, cookies and whatnot when they log out.
You could set an expiration in the app config.
Sorry, I only have a German link (just try it; maybe Google switches to your language):
http://code.google.com/intl/de-DE/appengine/docs/java/config/appconfig.html
As long as the file has not expired, the browser can cache it locally, which gives a short load time. However, if the cache is too small, the browser will request the file again.
A good browser will only make a secured file available to the logged-in user.
However, you have no guarantee which browser is in use. Your users could always download the secured files and publish them anywhere.
If a user gives their login to third parties, those parties can access the secured files at any time.
I don't think you can prevent the browser from remembering any links; in a way that contradicts the above.
