Cloud Storage backend for Google Cloud CDN is currently in alpha. When I try to enable it, I get the following error:
$ gcloud alpha compute backend-buckets update static-bucket --gcs-bucket-name my-bucket --enable-cdn
ERROR: (gcloud.alpha.compute.backend-buckets.update) There was a problem fetching the resource:
- Required 'Alpha Access' permission for 'Compute API'
How do I get Alpha Access? I also can't find any application form for Alpha Access.
On the documentation page "HTTP(S) Load Balancing with Google Cloud Storage Contents", there's a box with a link to a page where you can request access.
I'm using a Cloud Storage bucket to store images for my Next.js web app.
I refer to the images like so:
... some code
image="https://storage.cloud.google.com/{my-project}/myimage.jpg"
... some more code
I get the following error:
GET http://localhost:3000/_next/image?url=https%3A%2F%2Fstorage.cloud.google.com{image link to Cloud Storage} 400 (Bad Request)
After some reading of the docs, I found that I have my Firestore settings set to non-public, so only certain authenticated users can access cloud content.
I can't specify a user, since my local development environment runs on localhost, and I don't know how Firebase will see my requests given that I'm already logged in with gcloud auth login on my local machine.
I have added the following next.config.js and restarted my server:
module.exports = {
  images: {
    domains: ["storage.cloud.google.com"],
  },
};
I see there's an answer Frontend Authenticated request Google Cloud Storage which explains using a token but I'd like to not have another token to manage. Is there a solution where I can manage the access using a Service Account?
I'd like to avoid making all the images public with this solution.
I do not want to download the image as mentioned in the docs.
I found that the bucket can be restricted, but if the link with the access token is used, the image can still be accessed.
https://firebasestorage.googleapis.com/v0/b/{project-id}/o/{image path}?alt=media&token={token}
This is not ideal, since the token is visible in the page source, which effectively makes the image bucket public.
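For reference, the token-based download URL quoted above can be assembled like this (a minimal sketch; the project ID, object path, and token here are placeholders, and the URL pattern is the one shown above):

```python
from urllib.parse import quote

def firebase_download_url(project_id: str, object_path: str, token: str) -> str:
    """Build a Firebase Storage download URL of the form quoted above.

    The object path must be percent-encoded, including '/' as %2F.
    Note that the token is part of the URL itself, which is why it
    ends up visible in the page source.
    """
    encoded_path = quote(object_path, safe="")
    return (
        f"https://firebasestorage.googleapis.com/v0/b/{project_id}"
        f"/o/{encoded_path}?alt=media&token={token}"
    )

print(firebase_download_url("my-project", "images/photo.jpg", "abc123"))
```

Because the token travels in the URL, anyone who can read the page source can fetch the object, which is exactly the concern described above.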
I am following the steps here: https://docs.snowflake.com/en/user-guide/data-load-snowpipe-auto-gcs.html and am having trouble with step 6.
I ran:
create storage integration my_integration
  type = external_stage
  storage_provider = gcs
  enabled = TRUE
  STORAGE_ALLOWED_LOCATIONS = ('gcs://<my-bucket>');
which completed successfully. Then DESC STORAGE INTEGRATION MY_INTEGRATION; successfully describes the storage integration and lists a STORAGE_GCP_SERVICE_ACCOUNT
However, I cannot find this service account in the Google Cloud Platform console of the project that owns the bucket.
My Snowflake account is on AWS, though according to that tutorial page I am allowed to use either AWS or GCP for this integration.
Is there somewhere in the create integration command where I should indicate which Google Cloud project I am referring to, so that Snowflake knows where to create the service account?
Any advice on performing this step?
This issue was because my bucket was not publicly accessible. Once I made it publicly accessible, I was able to find that service account when adding roles.
It appears that the service account is not a service account "on" the Google Cloud Platform account that hosts the bucket, but rather one set up by Snowflake on their own managed services.
So it's like granting permissions to an external service account rather than an internal one.
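In other words, the step boils down to granting that external, Snowflake-managed service account access to the bucket on the GCS side. A hedged sketch of that grant with gsutil, where the service-account address must be copied from the STORAGE_GCP_SERVICE_ACCOUNT column of DESC STORAGE INTEGRATION (the placeholder below is not a real address), the bucket name is the same placeholder as above, and the exact role the tutorial prescribes may differ:

```shell
# SERVICE_ACCOUNT_FROM_DESC_OUTPUT is a placeholder: paste the address
# shown in STORAGE_GCP_SERVICE_ACCOUNT from DESC STORAGE INTEGRATION.
gsutil iam ch \
  serviceAccount:SERVICE_ACCOUNT_FROM_DESC_OUTPUT:roles/storage.objectViewer \
  gs://<my-bucket>
```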
I created an API using Python + FastAPI and deployed it to Google App Engine, and I would like to measure the cost of each request made.
I saw there is a header, "x-appengine-estimated-cpm-us-dollars", that shows up when logged in with the owner account on GAE, but I didn't see it when accessing the API from the browser at "https://example.uc.r.appspot.com/api".
Any idea how I can see this header, or another way to get an estimated cost for each request made?
Note: the deployed script is an API, not a website with auth like the one mentioned here (Usage of X-AppEngine-Estimated-CPM-US-Dollars in AppEngine).
According to the documentation:
If you access dynamic pages on your site while signed in using an administrator account, App Engine includes per-request statistics in the response headers
It then shows the description for this particular header; therefore, this is not something that is available for APIs hosted in App Engine.
You could alternatively use the Cloud Billing API to gather some information, although it does not provide exactly the same thing.
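If you do manage to see the header (for example, while signed in as an administrator and hitting the endpoint from a browser), keep in mind that CPM means cost per thousand requests. A small sketch for converting the value, assuming it looks like "$0.000121" (the exact format is not guaranteed):

```python
def cost_per_request(cpm_header: str) -> float:
    """Convert an X-AppEngine-Estimated-CPM-US-Dollars value to a
    per-request cost in dollars.

    CPM is the estimated cost per 1000 requests; the header value is
    assumed to look like '$0.000121'.
    """
    dollars_per_thousand = float(cpm_header.lstrip("$"))
    return dollars_per_thousand / 1000.0

print(cost_per_request("$0.000121"))
```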
Does anyone have an example of a GAE/Cloud Endpoints API method (in Java) that can take in an image from an Android app and upload it to Google Cloud Storage?
I cannot seem to find any samples on how to do this, but from what I understand it is possible.
EDIT:
The tutorial here shows how to add a Google App Engine dependency in Eclipse and upload/download an image to Google Cloud Storage. Is it possible to do this with Cloud Endpoints somehow? After all, they both run on Google App Engine.
I want to offload as much of the upload/download code as possible into my Cloud Endpoints API method(s), rather than coding everything inside the Android app. This would allow me to reuse my Cloud Endpoints API from other clients.
More info I found: https://developers.google.com/api-client-library/java/apis/storage/v1#sample
It looks like this is the Gradle dependency for the Cloud Endpoints backend:
dependencies {
  compile 'com.google.apis:google-api-services-storage:v1-rev66-1.21.0'
}
EDIT:
You should use this dependency inside your Cloud Endpoints backend:
compile 'com.google.appengine.tools:appengine-gcs-client:0.5'
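With that dependency in place, an Endpoints backend could write bytes to GCS using the App Engine GCS client library along these lines. This is a sketch, not a complete Endpoints API method: the class, bucket, and object names are illustrative, and it assumes the code runs inside App Engine where the GCS client is available.

```java
import com.google.appengine.tools.cloudstorage.GcsFileOptions;
import com.google.appengine.tools.cloudstorage.GcsFilename;
import com.google.appengine.tools.cloudstorage.GcsService;
import com.google.appengine.tools.cloudstorage.GcsServiceFactory;
import java.io.IOException;
import java.nio.ByteBuffer;

public class ImageUploader {
    private final GcsService gcsService = GcsServiceFactory.createGcsService();

    /** Writes the given image bytes to bucket/objectName in Cloud Storage. */
    public void uploadImage(String bucket, String objectName, byte[] imageBytes)
            throws IOException {
        GcsFilename filename = new GcsFilename(bucket, objectName);
        GcsFileOptions options = new GcsFileOptions.Builder()
                .mimeType("image/jpeg")
                .build();
        gcsService.createOrReplace(filename, options, ByteBuffer.wrap(imageBytes));
    }
}
```

An Endpoints method could accept the image bytes (for small files) and delegate to a helper like this; for large files, see the Blobstore approach described below in this thread.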
You can upload files to Google Cloud Storage using the JSON API.
You may or may not want to store file metadata in the Datastore through Endpoints.
You may or may not want to authenticate your users through Endpoints before giving them the ability to store files in Storage.
What I want to say is that Storage, Endpoints, and Datastore are three different things, and you are not required to use them all together.
Useful link: https://github.com/pliablematter/simple-cloud-storage
You cannot directly upload (large) files to an Endpoints API method; instead, you need to receive them using the Blobstore (or GCS) (https://cloud.google.com/appengine/docs/python/blobstore/ ). This requires the following:
Set up a blobstore upload handler on your server (which is just a regular webapp2 handler).
Expose an Endpoints method that calls blobstore.create_upload_url(), and then returns the upload URL to your App.
Within the App, upload the picture to that upload URL; the file will then be accessible within your upload handler, where you can move it to GCS, Datastore or somewhere else.
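On the legacy App Engine Python runtime, the three steps above looked roughly like this (a sketch assuming the webapp2/Blobstore APIs from the linked docs; handler path and names are illustrative, and it only runs inside the App Engine environment):

```python
# Legacy GAE Python (first-generation runtime) sketch.
from google.appengine.ext import blobstore
from google.appengine.ext.webapp import blobstore_handlers
import webapp2

class PhotoUploadHandler(blobstore_handlers.BlobstoreUploadHandler):
    """Step 1: a regular webapp2 handler that receives the uploaded file."""
    def post(self):
        upload = self.get_uploads("file")[0]  # BlobInfo for the uploaded file
        # Step 3: the blob key can now be stored, or the data moved to GCS.
        self.response.write(upload.key())

def make_upload_url():
    """Step 2: an Endpoints method would call this and return the URL
    to the Android app, which then POSTs the picture to it."""
    return blobstore.create_upload_url("/upload_photo")

app = webapp2.WSGIApplication([("/upload_photo", PhotoUploadHandler)])
```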
Solution : https://github.com/thorrism/GoogleCloudExample
Enable Google Cloud Storage : => https://console.developers.google.com/apis
Generate and download a P12 key : => https://console.developers.google.com/iam-admin/iam
Create a folder named "assets" and place your key there :
=> app/src/main/assets/"yourKey.P12"
So that everyone can read your uploaded files, do not forget to add the appropriate permissions to your bucket.
I searched the web for a while, but can't seem to find the right answer.
I have created a VM instance on Google Compute Engine and I am running Jenkins on it. This instance watches a code repository of mine, and when a change occurs I want to run the following command:
gcloud --project=test preview app deploy -q app.yaml --version=dev
When I trigger a build, or it is triggered automatically, I get the following error:
Beginning deployment...
ERROR: Error Response: [403] Request had insufficient authentication scopes.
ERROR: (gcloud.preview.app.deploy) Could not retrieve the default Google Cloud Storage bucket for [test].
Please try again or use the [bucket] argument.
The VM instance does have access to Cloud Storage, as you can see in the following image, so I don't understand why this error pops up.
Go to the App Engine developer console (https://appengine.google.com/) and, under
Application Settings > Cloud integration, turn ON Google Cloud Storage;
this will set up the default bucket for you. The new Google Developers Console does this automatically. If the project was created on the old console, you have to do it manually.