I have a piece of code in my Python application:
import boto

data = SOME_BYTES
dst_uri = boto.storage_uri(MY_PATH, "gs")
# Create a new object at that URI and upload the bytes.
dst_uri.new_key().set_contents_from_string(data)
After I deployed it to App Engine, the code failed with this error:
<?xml version='1.0' encoding='UTF-8'?><Error><Code>AccessDenied</Code><Message>Access denied.</Message><Details>Caller does not have storage.objects.create access to bucket retained-media.</Details></Error>
Can anyone help me with this? Thanks!
"Caller does not have storage.objects.create access to bucket retained-media." says this is permission issue. You need to grant the robot account for the app permission to write to the bucket that you're using.
I am following the link below.
https://github.com/GoogleCloudPlatform/getting-started-java/tree/master/bookshelf-standard/3-binary-data
I created a new Google Cloud project, followed the instructions above, and everything works fine on the remote server.
When I try using an existing, old App Engine project (created 4-5 years ago), I get the following error at the code below:
"Caller does not have storage.objects.create access to bucket ..."
storage.create(
    BlobInfo.newBuilder(bucketName, fileName)
        // Modify access list to allow all users with link to read file
        .setAcl(new ArrayList<>(Arrays.asList(Acl.of(User.ofAllUsers(), Role.READER))))
        .build(),
    fileStream.openStream());
Here is the stack trace:
Uncaught exception from servlet
com.google.cloud.storage.StorageException: Caller does not have storage.objects.create access to bucket asw12.
at com.google.cloud.storage.spi.v1.HttpStorageRpc.translate(HttpStorageRpc.java:189)
at com.google.cloud.storage.spi.v1.HttpStorageRpc.create(HttpStorageRpc.java:240)
at com.google.cloud.storage.StorageImpl$3.call(StorageImpl.java:151)
at com.google.cloud.storage.StorageImpl$3.call(StorageImpl.java:148)
at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:94)
at com.google.cloud.RetryHelper.runWithRetries(RetryHelper.java:54)
at com.google.cloud.storage.StorageImpl.create(StorageImpl.java:148)
at com.google.cloud.storage.StorageImpl.create(StorageImpl.java:141)
at com.example.getstarted.util.CloudStorageHelper.uploadFile(CloudStorageHelper.java:65)
at com.example.getstarted.basicactions.CreateBookServlet.doPost(CreateBookServlet.java:70)
I checked the Google service accounts in my old project and the default account exists. How do I find out who the 'Caller' is?
If you use the google-cloud libraries from App Engine and don't otherwise specify, you will be acting as your project's App Engine default service account. Its name is probably something like your-project-id@appspot.gserviceaccount.com.
To get the service account name, open the Service Accounts page in the console, or check the settings on your App Engine page.
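If you would rather check from code, here is a minimal sketch for the Python runtime (the Java App Identity API has a similar getServiceAccountName() call):

from google.appengine.api import app_identity

# Prints the e-mail of the service account the app is running as,
# e.g. your-project-id@appspot.gserviceaccount.com.
print(app_identity.get_service_account_name())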
I am using the default bucket name, but whenever I try to write a file, I get a 403 Forbidden. It tries to write to a bucket named: app_default_bucket.
This is the default bucket retrieved by file.DefaultBucketName(ctx).
Local file permissions also seem to be okay.
In production everything works as expected.
It's trying to write to your remote Google Cloud Storage account. This seems to be a current bug with using the client library against the dev server (in Go, in this case). For now you might have to create/reconfigure the default bucket on your account.
I'm working with two separate projects. One is for production, the other dev.
I have backed up the production datastore into a bucket. Now I want to import that into the dev datastore. But when I try, I get the message:
Failed to read bucket: Bucket "the.bucket.name" is not accessible
I thought it might be a permissions issue, so I added owners-devid and editors-devid (for the dev project) and my-email as owners of the bucket, but I still got the same error.
gsutil ls works for me, so I don't think the problem is how I'm specifying the bucket.
The issue I had was that I was adding the dev project to the bucket permissions as
Project editors-############## Editor
instead of
User [project name]@appspot.gserviceaccount.com Editor
The datastore import runs under that user account (the app's service account).
I have the same problem. Setting the permissions as:
User [project name]@appspot.gserviceaccount.com Writer
allows the backup to be written to another project's bucket, but it doesn't allow importing from that bucket. I also tried setting the Owner permission, but the result was the same.
The error reported is:
Requested path https://storage.googleapis.com/[bucket_name]/[id_backup_info].info is not accessible/access denied
TransformationError
This error keeps coming up for a specific image.
There are no problems with other images and I'm wondering what the reason for this exception could be.
From Google:
"Error while attempting to transform the image."
Update:
On the development server it works fine; it only fails live.
Thanks
Without more information I'd say either the image is corrupted, or it's in a format that cannot be used with get_serving_url (an animated GIF, for example).
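If you want to handle the failure instead of crashing, a minimal sketch for the Python runtime could look like this (blob_key is whatever key you already pass to get_serving_url):

from google.appengine.api import images

def serving_url_or_none(blob_key):
    # Returns a serving URL for the image, or None if it cannot be transformed
    # (corrupt/unsupported image, or a permission problem on the underlying file).
    try:
        return images.get_serving_url(blob_key, secure_url=True)
    except images.TransformationError:
        return None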
I fought this error forever, and in case anyone else gets the dreaded TransformationError, please note that you need to make sure your app has owner permissions on the files you want to generate a URL for.
It'll look something like this in your IAM tab:
App Engine app default service account
your-project-name-here@appspot.gserviceaccount.com
In IAM, on that member, scroll down to Storage and grant "Storage Object Admin" to that user. That works as long as your storage bucket is under the same project; if not, I'm not sure how...
This TransformationError exception also seems to show up for permission errors, so it is a bit misleading.
I was getting this error because I had enabled Bucket Policy Only permissions on a bucket in a different project.
However, after changing this back to object-level permissions and giving my App Engine app (from a different project) access, I was able to perform the App Engine Standard Images operation (google.appengine.api.images.get_serving_url) that I was trying to implement.
Make sure that you set your permissions correctly either in the Console UI or via gsutil like so:
gsutil acl ch -u my-project-a@appspot.gserviceaccount.com:OWNER gs://my-project-b
I'm trying to back up the GAE datastore to a GS bucket as described here: https://developers.google.com/appengine/docs/adminconsole/datastoreadmin#Backup_And_Restore. I've tried supplying the bucket name in these forms:
bucket
/gs/bucket
/gs/bucket/path
but none of them work.
Every time I get a message:
There was a problem kicking some off the jobs/tasks:
Invalid bucket name: 'bucket'
What am I doing wrong? Is it possible at all to back up all data (including blob files) to GS without writing custom code for this?
I got it to work by adding the service account e-mail as a privileged user with write permission.
Here's what I did:
Create the bucket via the web interface (Storage > Cloud Storage > Storage Browser > New Bucket)
Add APPID@appspot.gserviceaccount.com as a privileged user with edit permission (Permissions > Add Member)
Even though it was part of the same project, for some reason I still had to add the project e-mail as a privileged user.
I suspect the bucket does not exist, or else App Engine does not have permission to write to the bucket.
Make sure the following are true:
You have created BUCKET. Use something like gsutil to create the bucket if necessary.
gsutil mb gs://BUCKET
Make sure your App Engine service account has WRITE access to BUCKET.
The service account is of the form APP_NAME@appspot.gserviceaccount.com.
Add the service account to your project team with "can edit" access.
Alternatively, you can change the bucket ACL and add the service account there. This option is more complicated.
Now start the backup using the form /gs/BUCKET
If you get a Bucket "/gs/BUCKET" is not accessible message, then your bucket does not exist, or APP_NAME@appspot.gserviceaccount.com does not have access to your bucket.
NOTE: the form is /gs/BUCKET. The following are wrong: BUCKET, gs://BUCKET, gs/BUCKET etc.
Check that the bucket exists with the right permissions using the following command:
gsutil getacl gs://BUCKET # Note the URI form here instead of a path.
Look for an entry like the following:
<Entry>
<Scope type="UserByEmail">
<EmailAddress>APP_NAME@appspot.gserviceaccount.com</EmailAddress>
</Scope>
<Permission>WRITE</Permission>
</Entry>
If you don't see one you can add one in the following manner:
gsutil getacl gs://BUCKET > acl.xml
vim acl.xml # Or your favorite editor
# Add the xml above
gsutil setacl acl.xml gs://BUCKET
Now the steps above will work.
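If you prefer to do this from code rather than editing the XML by hand, a rough boto sketch (BUCKET and APP_NAME are the same placeholders as above) is:

import boto

bucket = boto.storage_uri("BUCKET", "gs").get_bucket()
email = "APP_NAME@appspot.gserviceaccount.com"

# Check whether the app's service account already appears in the bucket ACL.
has_grant = any(
    getattr(entry.scope, "email_address", None) == email
    for entry in bucket.get_acl().entries.entry_list
)

# If not, add the WRITE grant (the same change as editing the XML above).
if not has_grant:
    bucket.add_email_grant("WRITE", email)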
Make sure to follow closely the instructions here:
https://cloud.google.com/appengine/docs/standard/python/console/datastore-backing-up-restoring#restoring_data_to_another_app
Things to make sure:
add ACL permission to your target application
if the backup was already created before you added permission to the bucket, find the existing backup files and add permission on them (see the sketch after this list)
add [PROJECT_ID]@appspot.gserviceaccount.com as a member of your source application with the Editor role
the path to import in your source application is /gs/bucket
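For the already-created-backup case, a rough boto sketch of granting read access on the existing backup files (BUCKET, BACKUP_PREFIX, and TARGET_PROJECT_ID are placeholders for your own values):

import boto

bucket = boto.storage_uri("BUCKET", "gs").get_bucket()
target_sa = "TARGET_PROJECT_ID@appspot.gserviceaccount.com"

# Grant READ on each object of the existing backup (the .backup_info files
# and the output files) so the target app can import them.
for key in bucket.list(prefix="BACKUP_PREFIX"):
    key.add_email_grant("READ", target_sa)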
I just spent a while wrestling with this myself. Thank you @fejta for your assistance.
I could not figure this out. I had added my user to the project, verified that I could write, manually updated the ACL (which should not have been required), ...
In the end, creating a bucket from the command line via:
gsutil mb gs://BUCKET
instead of the web user interface worked for me. Multiple buckets created through the web UI, either before or after adding the user to the team, all resulted in 'Invalid bucket name'.
I addressed it with:
/gs/BUCKET