I have a web project: the pipeline runs in Azure DevOps and the instance runs in GCP. If I cancel the pipeline, Google App Engine still creates a version, and I have to pay for it. How can I set up the pipeline or Google App Engine so that a version is only created in Google when the pipeline completes successfully?
When you deploy a version of your app to Google App Engine using a pipeline on Azure DevOps, and the deployment step itself fails, the new version is normally not deployed to App Engine. You just need to retry the pipeline.
However, if the deployment step completed successfully but one of the subsequent steps or jobs failed and caused the pipeline to fail, the new version has already been deployed to App Engine. In that situation, you may not be able to redeploy the same version to App Engine.
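One way to get the behavior the question asks for is to make the gcloud deploy the last step in the pipeline and gate it on everything before it succeeding, so a cancelled or failed run never reaches the deploy at all. A minimal Azure Pipelines sketch (the build script and file names are illustrative, not from the original post):

```yaml
steps:
  - script: npm ci && npm test
    displayName: Build and test

  # Runs only if every previous step succeeded; a cancelled or failed
  # run never reaches gcloud, so no App Engine version is created.
  - script: gcloud app deploy app.yaml --quiet
    displayName: Deploy to App Engine
    condition: succeeded()
```

`condition: succeeded()` is already the default for steps, but writing it out makes the intent explicit; the key point is that nothing billable happens after the deploy step.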
Related
I'm using Bitbucket Pipelines to deploy my project to Google App Engine via gcloud app deploy which does a great job accomplishing what I want.
gcloud app deploy takes an extra 6-8 minutes in the "Updating service..." phase, which is the time taken to update the backend infrastructure, primarily the load balancer.
This is not an issue by itself except that it eats up my monthly Build Time from Bitbucket.
I would rather have the pipeline return as soon as it has sent off the build commands, and I'll check on them myself in the Google Cloud logs.
The question is: is there any flag for gcloud app deploy to tell it not to wait for the "Updating service..." phase?
These are all the gcloud app deploy flags. There's no such flag.
To see if there's a possible 'hack' you could use, you could try manually deploying your app yourself using gcloud app deploy --log-http. The --log-http flag will produce an output of all http requests made during the deploy. You'll see the endpoints being called, the http method, the headers, payload and the duration for each call. Examining those (especially around the 'updating' bit) might show you something that could potentially be of help.
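As a concrete sketch of that suggestion (the `--log-http` flag is real; the grep pattern assumes the timing-line format gcloud currently emits, which may vary by SDK version):

```sh
# Deploy while logging every HTTP request/response the CLI makes
gcloud app deploy app.yaml --log-http 2>&1 | tee deploy-http.log

# Afterwards, skim which calls dominate the wall-clock time
grep -i "total round trip time" deploy-http.log
```

The calls made repeatedly during the "Updating service..." phase are the ones worth examining.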
On GCP, I run Cloud Build from one project and deploy code to App Engine in another project. It looks like the project the build runs from needs to have the App Engine Admin API enabled. Is this a real requirement, or have I missed some configuration?
The App Engine Admin API is required, as it is used for all App Engine-related management operations.
Also, based on the link above:
The Admin API provides you with:
An integration point for your development and build tools.
Tighter control around deploying new versions, including the ability to automate traffic migration between two versions or traffic splitting across one or more versions.
The ability to programmatically manage applications across multiple Google Cloud projects.
So yes, if you plan on deploying code to App Engine using Cloud Build, you need GAE Admin API enabled.
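Assuming the build runs in project A and deploys to App Engine in project B, the setup can be sketched as follows (the project IDs, account, and role choice are placeholders for illustration):

```sh
# Enable the App Engine Admin API in the project that runs Cloud Build
gcloud services enable appengine.googleapis.com --project=build-project-a

# Allow project A's Cloud Build service account to deploy to project B
gcloud projects add-iam-policy-binding deploy-project-b \
  --member="serviceAccount:PROJECT_A_NUMBER@cloudbuild.gserviceaccount.com" \
  --role="roles/appengine.appAdmin"
```

Depending on what the build does, the service account may also need storage and service-account-user permissions in the target project.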
I'm using the gcloud SDK to try to deploy a Dockerfile and app.yaml to app engine. My App Engine service account has Project Editor and Storage Admin and SQL Client roles, and the Cloud Build, App Engine, and App Engine Flexible APIs are all enabled. Nonetheless, I keep encountering this error with gcloud beta:
ERROR: (gcloud.beta.app.deploy) Error Response: [7] Unable to write to staging bucket staging.
<project name>.appspot.com. Please grant access to the App Engine service account on your project
by visiting https://console.developers.google.com/storage/browser?project=<project name>
and this (less clear) error with vanilla gcloud:
ERROR: (gcloud.app.deploy) INVALID_ARGUMENT: unable to resolve source
In a different sandbox/dev project, I didn't have this issue, but in this production project, that hasn't been the case. Adding the SAs manually makes no difference. And weirdly, the deploy is writing the files to the staging bucket, so I don't trust this error. Where it seems to be failing is the step immediately after that: maybe the App Engine Flexible service agent wants to access the files, or maybe the Dockerfile needs to be passed on to the Cloud Build SA. In any case, I have tried every combination of granting bucket access to both the App Engine and Cloud Build SAs and their service agents, and it hasn't worked.
My user account has app engine deployer role assigned to it, and even impersonating the App Engine account with its editor role didn't make a difference. Cloud Build has App Deployer role as well.
The error from vanilla gcloud indicates that your service account is likely missing some permissions. Try disabling and re-enabling the Cloud Build API in your project so that a new service account is created.
This ensures that Cloud Build has permission to start builds.
Also, wait a few minutes (10 or so) before trying to deploy again, so that the permissions can propagate to all systems.
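The disable/re-enable cycle described above can be done from the CLI (the project ID is a placeholder; note that disabling may prompt for confirmation or require `--force` if other services depend on it):

```sh
# Toggle the Cloud Build API so its service account is re-created
gcloud services disable cloudbuild.googleapis.com --project=my-project
gcloud services enable cloudbuild.googleapis.com --project=my-project
```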
I would like to know how to deploy the application from bitbucket using pipelines to multiple Google Cloud projects.
Here is our current set up and it is working fine.
On Bitbucket, the application repo with development/UAT test/production branches, once the pull request approved and merged into development/production, it shall deploy to the GCP app engine through the pipelines.
The problem now is that we want to isolate each client in GCP, which means each client will have its own GCP project, Cloud SQL instance, App Engine app, storage bucket, etc.
I need some advice on how to change the deployment workflow in Bitbucket and Pipelines so it will work for the new setup.
For the branch setup on Bitbucket, I'm thinking of something like the options below, but if I go with Option 2, it seems like too much once we have more clients.
Option 1 (repo branches)
development/
UAT test/
validation/
production
Option 2 (repo branches)
development/
UAT test client1/
UAT test client2/
validation_client1/
validation_client2/
production_client1/
production_client2/
As a first step, I know I have to create a different app.yaml for each client's App Engine service, so each service can be deployed to a different GCP project/bucket/SQL instance.
I also just found out that bitbucket-pipelines.yml only supports 10 steps, so if I create that many branches, it will exceed the limit for sure.
Does anyone have any suggestions about how should be set up?
Thanks,
You could create Cloud Build triggers for specific Bitbucket branches or repos (depending on how your branching model is defined) and deploy the App Engine implementation to the App Engine service in the same project. If you need to customize other steps, you can use custom steps as described here. Finally, take a look at how to create a basic configuration file for Cloud Build if you are not very familiar with this product.
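One way to make the per-client trigger idea concrete is a shared cloudbuild.yaml parameterized by a substitution, with one trigger per client/branch supplying its own value. A sketch (the `_CLIENT` substitution, file naming, and project naming convention are all assumptions for illustration):

```yaml
# cloudbuild.yaml -- shared by all clients; each Cloud Build trigger
# sets _CLIENT to select that client's app.yaml and GCP project
steps:
  - name: gcr.io/google.com/cloudsdktool/cloud-sdk
    args: ["gcloud", "app", "deploy", "app-${_CLIENT}.yaml",
           "--project=${_CLIENT}-project", "--quiet"]
substitutions:
  _CLIENT: client1
```

This keeps the branch count fixed (one per environment) while the per-client fan-out lives in the triggers instead, sidestepping the 10-step limit.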
I've been trying hard to deploy a Spring Boot application on Google Cloud App Engine; after reading so many documents, here I am.
When I deploy a simple Spring Boot application like the examples, it works, but with my app the deploy process takes a long time and I get a 502 error. Looking at the App Engine logs, my application is stuck in an infinite reboot loop.
my app.yaml :
runtime: custom
env: flex
My Dockerfile:
FROM gcr.io/google_appengine/openjdk
VOLUME /tmp
ADD vaptuber-jjaerp-0.0.1-SNAPSHOT.jar app.jar
CMD [ "java","-Djava.security.egd=file:/dev/./urandom","-jar","/app.jar"]
and here is the app log:
https://drive.google.com/file/d/0B1gG3dVgi0WoTzM2RXlaQjJka0E/view?usp=sharing
I deployed a fat jar to test; on my local machine it works fine.
Spring Cloud GCP was just recently announced. It should help to better integrate your Spring application with GCP. Take a look at the following resources.
Announcing Spring Cloud GCP
Spring Cloud GCP
It sounds like your application is running out of memory and being killed and restarted by the OOM Killer. The solution is as simple as increasing the memory on the App Engine VM. See my answer here:
Deploy a SpringBoot / Angular 4 on Google App Engine with maven
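For the OOM diagnosis above, the memory increase is done in the app.yaml. A sketch extending the asker's own runtime/env settings (the specific numbers are illustrative; flexible environment defaults to roughly 0.6 GB per core, which is often too little for a JVM fat jar):

```yaml
runtime: custom
env: flex

# Give the JVM enough headroom to avoid the OOM killer
resources:
  cpu: 1
  memory_gb: 2
```

If the loop persists after raising memory, capping the JVM heap explicitly (e.g. `-Xmx` in the Dockerfile's CMD) is worth trying as well.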