Google-App-Engine and Cloud-Storage, Access Denied - google-app-engine

I have created a bucket in Cloud Storage and granted my-gae-app (GAE) FULL_CONTROL permission on it. I also configured CORS (Cross-Origin Resource Sharing) on the bucket for my-gae-app, and I set the bucket's default ACL so that my-gae-app is the owner.
In the my-gae-app application, I have a form that allows users to upload a PDF or image to my-bucket (I use the GCS Client Library functions). The upload process works fine: when the upload button is clicked, the file is written to my-bucket without error, and I can verify from the Cloud Storage console that the files are there. I checked the files' (object) permissions, and I can see my-gae-app is the owner.
However, another form that displays those uploaded files is not working, even something as simple as <img src="https://storage.cloud.google.com/mybucket/my-uploaded-image.jpg"/>.
The console shows "GET ... 403 Forbidden", and the page meant to show the PDF displays "Access Denied".
I have tried marking the object as "Shared Publicly" in the Cloud Storage console, and then everything works fine. But this is not the correct design. I need a solution that makes my-bucket accessible by my-gae-app only, not publicly :o(
Can anyone please shed some light? Much appreciated.

Whatever the "other form which displays those uploaded files" authenticates as needs to be on the access control list for that object with READ or FULL_CONTROL permission. If you want that to be the case for all objects you create in the bucket, the easiest way is to set a Default Object Access Control on the bucket, as described here: https://developers.google.com/storage/docs/accesscontrol#default
With the default object access control set, you can grant READ by default to your form; you just need to find out which user, service account, or group your form is authenticating as.
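As a concrete sketch of the grant described above (the service-account email and bucket name are placeholders, substitute your own), here is the default object ACL entry in the form the Cloud Storage JSON API expects, with the equivalent gsutil command as a comment:

```python
import json

# Hypothetical App Engine service-account email -- substitute your app's own.
APP_SA = "my-gae-app@appspot.gserviceaccount.com"

# Equivalent gsutil command (run once per bucket):
#   gsutil defacl ch -u my-gae-app@appspot.gserviceaccount.com:READ gs://my-bucket

# The same grant expressed as a defaultObjectAcl entry for the JSON API:
default_acl_entry = {
    "entity": "user-" + APP_SA,
    "role": "READER",
}
print(json.dumps(default_acl_entry))
```

Every new object created in the bucket after this is set will carry the READ grant automatically; objects that already exist keep their old ACLs and need to be updated individually.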


Can't access image files in Google Cloud storage from App Engine

Our GAE app has been running fine for years. I'm trying to switch from the default fine-grained access to IAM roles to manage buckets, and it doesn't work.
After switching to uniform access, I grant the Storage Admin role to the GAE service account. At that point our code fails in getServingUrl():
String filename = "/gs/" + bucketName + "/" + fileName;
String url = service.getServingUrl(ServingUrlOptions.Builder.withGoogleStorageFileName(filename));
An IllegalArgumentException is thrown with no detailed error message.
So I play around with the permissions a bit more and add allUsers with Storage Admin permissions to the bucket. Two interesting things to note: 1) I can access the image directly from a browser at https://storage.googleapis.com/bucket/filename.png. 2) Nothing changes in our app; I still get the same behavior as described above!
To me, this makes no sense. Doesn't allUsers mean anyone or any service can access the files? And why doesn't adding the GAE service account work?
Cloud Storage allows two types of permissions for access to any bucket or object: IAM and ACLs. If you are using IAM to control access to buckets, make sure you follow the guidance in the documentation:
In most cases, IAM is the recommended method for controlling access to your resources. IAM controls permissioning throughout Google Cloud and allows you to grant permissions at the bucket and project levels. You should use IAM for any permissions that apply to multiple objects in a bucket to reduce the risks of unintended exposure. To use IAM exclusively, enable uniform bucket-level access to disallow ACLs for all Cloud Storage resources.
If you use IAM and ACLs on the same resource, Cloud Storage grants the broader permission set on the resource. For example, if your IAM permissions only allow a few users to access my-object, but your ACLs make my-object public, then my-object is exposed to the public. In general, IAM cannot detect permissions granted by ACLs, and ACLs cannot detect permissions granted by IAM.
You can also refer to the Stack Overflow question above, where the OP faced a similar issue and resolved it by granting READ or FULL_CONTROL in the object's access control list.
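As a sketch of the IAM route (the project and service-account names are placeholders, substitute your own), here is the policy binding a uniform-access bucket would carry, in the form the Cloud Storage JSON API represents it; roles/storage.objectViewer is enough for read-only serving:

```python
# Hypothetical service-account email -- substitute your app's own.
SERVICE_ACCOUNT = "my-project@appspot.gserviceaccount.com"

# IAM policy binding as the Cloud Storage JSON API represents it.
# roles/storage.objectViewer grants read access to objects in the bucket;
# roles/storage.objectAdmin would also allow writes.
binding = {
    "role": "roles/storage.objectViewer",
    "members": ["serviceAccount:" + SERVICE_ACCOUNT],
}

# Equivalent gsutil command:
#   gsutil iam ch serviceAccount:my-project@appspot.gserviceaccount.com:objectViewer gs://my-bucket
```

Note that a binding like this only governs direct reads through the Storage APIs; it does not by itself explain getServingUrl() failures, which go through the App Engine Images service.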

Unable to use "Query Editor" in developer console

While trying to use the Query Editor in the Developer Console with
"SELECT ID FROM ACCOUNT"
it throws an error saying "This session is not valid for use with the REST API".
Any idea what exactly the issue is here? Earlier it worked fine.
The same problem occurs when I click the "Open" dialog of the Developer Console and select objects:
"Cannot load objects. This session is not valid for use with the REST API"
I ran into this same issue: I could open Visualforce pages, Apex classes, etc., but I would get the error "This session is not valid for use with the REST API" any time I tried to use the Query Editor or create a new Trigger (which needs the object list) via the Developer Console. I tried every browser, clearing cookies, logging in as other users, etc., with the same results.
I found that our org has API Client Whitelisting enabled (https://help.salesforce.com/HTViewHelpDoc?id=security_control_client_access.htm&language=en_US), which blocks API access unless it is explicitly granted.
From the Salesforce Documentation
"Contact Salesforce to enable API Client Whitelisting. After it’s enabled, all client access is restricted until explicitly allowed by the administrator. This restriction might block access to applications that your users are already using. Before you enable this feature, you should configure and approve connected apps for any client applications you want users to continue using, or give the users a profile or permission set with “Use Any API Client” enabled."
So granting the "Use Any API Client" permission to your user, via their profile or a permission set, should fix the issue.
To do this via a permission set, go to Setup > Users > Permission Sets and create a new one, then add the System Permission 'Use Any API Client'.
There may be a way to enable API access for the Developer Console specifically via whitelisting or the app settings, but I was not able to find one easily. You will definitely want to test whatever functionality you are building with a user that does not have this permission enabled.
This can also happen when the connected app is not configured correctly.
On the 'API (Enable OAuth Settings)' panel, move the 'Access and manage your data (api)' option from the left to the right of the Selected OAuth Scopes field, then save the setting.
Wait a few minutes and then try again. It should work.
It looks like the session has simply expired. Did you try closing the Developer Console, logging back in to your SF sandbox, and opening the Developer Console again?
Create a Permission Set, add the System Permission 'Use Any API Client', and assign this permission set to the affected users. It will solve the problem.
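For background on why the error mentions the REST API at all: the Developer Console's Query Editor issues an ordinary REST query call with your current session ID, roughly like the sketch below (the instance URL, API version, and session ID are placeholders, not values from this question):

```python
import urllib.parse

# Placeholder values -- substitute your instance URL, API version, and session ID.
INSTANCE = "https://yourInstance.salesforce.com"
API_VERSION = "v52.0"
SESSION_ID = "00D...SESSION"

# Build the REST query request the Developer Console makes under the hood.
soql = "SELECT Id FROM Account"
url = "{}/services/data/{}/query/?q={}".format(
    INSTANCE, API_VERSION, urllib.parse.quote(soql)
)
headers = {"Authorization": "Bearer " + SESSION_ID}

# If the org has API Client Whitelisting enabled and the user lacks
# "Use Any API Client", this call is rejected -- which surfaces as
# "This session is not valid for use with the REST API".
print(url)
```

This is why fixes aimed at the REST API layer (permissions, OAuth scopes, a fresh session) resolve what looks like a Developer Console bug.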

The authenticated user has not granted the app xxx write access to the child file xxx, which would be affected by the operation on the parent

I hit the error below when trying to insert a permission:
"code": 403,
"message": "The authenticated user has not granted the app xxx write access to the child file xxx, which would be affected by the operation on the parent"
Here is what I am doing.
We have two Google accounts:
1. API account - used to create the folder and change its ownership to the business account
2. Business account - uploads a file
Now we try to share the folder with a customer and hit the above error.
I am using OAuth 2.0 for installed applications to get the access token.
Please advise what I am doing wrong here.
I had the same issue, but I realized it was because the scope for the credential wasn't set up properly. I only had DriveScopes.DRIVE_METADATA set, which was not enough for downloading files. Once I added DriveScopes.DRIVE_FILE, DriveScopes.DRIVE, and DriveScopes.DRIVE_APPDATA, I was able to download the file without any problem. Hope this helps.
P.S. If you are changing credentials, you have to delete the previously saved credential file.
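For reference, the scope URLs behind the Java DriveScopes constants mentioned above are the standard Drive OAuth scopes (the URLs are real; how you feed them into your OAuth flow depends on your client library):

```python
# Standard Google Drive OAuth scope URLs. These correspond to
# DriveScopes.DRIVE, DriveScopes.DRIVE_FILE, and DriveScopes.DRIVE_APPDATA
# in the Java client library.
SCOPES = [
    "https://www.googleapis.com/auth/drive",
    "https://www.googleapis.com/auth/drive.file",
    "https://www.googleapis.com/auth/drive.appdata",
]

# The metadata-only scope that was NOT enough on its own
# (DriveScopes.DRIVE_METADATA):
METADATA_ONLY = "https://www.googleapis.com/auth/drive.metadata"

# Remember: after widening scopes, delete the previously saved credential
# file so a fresh consent/token with the new scopes is obtained.
```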
Based on the official Google documentation, you received '403: The user has not granted the app' because the requesting app is not on the ACL for the file: the user never explicitly opened the file with this Drive app.
It is highly recommended to use the Google Picker and prompt the user to open the file, or to direct the user to Drive to open the file with the app.

Drive Realtime API not granting permission to realtime document; normal drive API freaking out

My app uses the Drive rest API and the Drive Realtime API in combination. We set the file's permissions so that the public has view access, and then emailed a link to it to a few thousand customers.
The file's permissions are set so that the public has view access, but:
When a user tries to open the realtime document, we get Drive Realtime API Error: not_found: File not found.
When a user tries to copy the non-realtime file, we get The authenticated user has not granted the app 689742286244 write access to the file 0B-NHh5QARZiUUFctN0Zjc3RKdWs (of course we are not asking for write access).
You can see the effects for yourself at https://peardeck.com/editor/0B-NHh5QARZiUUFctN0Zjc3RKdWs , and our embarrassing attempts to cover for the errors.
Interesting notes:
Sharing the file directly with a particular google account seems to lift the curse, and then that google account can find the file like normal. No extra permissions, just an explicit reference to a google account.
Setting the file so that the public has full write access seems to have no effect
Other files with the exact same settings in the exact same Drive folder can be opened successfully (but presumably have not been opened by so many people in the past). This makes me think there is some broken state within Google.
How can I avoid this? What's going on?(!?!?) Thanks for any help!
The realtime API does not support anonymous access. All users must have a Google account and be explicitly authorized to view or write to the file.

gsutil cors set command returns 403 AccessDeniedException

I'm following these instructions on how to set a CORS configuration on a Google Cloud Storage bucket and when I run the gsutil cors set command it returns the following error message:
AccessDeniedException: 403 The account for bucket "[REDACTED]" has been disabled.
For the record, I have access to the bucket. I have owner privileges for this project in the Developer Console. Running gsutil cp and gsutil ls work just fine.
Any ideas on what might be wrong here?
I'm answering my own question because I found the solution for this issue. I hope this helps anyone else who runs into it, because at the time there was little info on the web describing how to solve it.
It turns out that my user account did not have "owner" access to the bucket. Here are the steps I took to grant myself access:
1) First, navigate to your project's Cloud Storage Browser in the Developer Console.
2) Once you see a listing of the buckets that are linked to your project, check the box next to the bucket(s) you'd like to modify the permissions for and then click the "Bucket Permissions" button.
3) Next, add your user account to the list of permitted users. Set the permission level to "owner". Click the "Save" button when you're done.
You should have access to the bucket now, which means you won't run into any 403 errors. If you still do, you did not set the entity correctly, or you authenticated with gsutil using a different account. Double-check your work and try again.
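For completeness, once the permissions are sorted out, a minimal configuration file for `gsutil cors set cors.json gs://my-bucket` might look like the one generated below (the origin, methods, and max age are illustrative placeholders; adjust them for your app):

```python
import json

# Example CORS configuration for `gsutil cors set cors.json gs://my-bucket`.
# The origin, methods, and max age below are illustrative placeholders.
cors_config = [
    {
        "origin": ["https://my-gae-app.appspot.com"],
        "method": ["GET", "HEAD"],
        "responseHeader": ["Content-Type"],
        "maxAgeSeconds": 3600,
    }
]

# Write the file that gsutil expects as its first argument.
with open("cors.json", "w") as f:
    json.dump(cors_config, f, indent=2)
```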
Just to provide another tip in case others find themselves in the same situation I did: make sure to log in with the correct Google account using gcloud auth login. This can be a tricky detail when multiple Google accounts are in play.
