Don't wait for "Updating service..." - google-app-engine

I'm using Bitbucket Pipelines to deploy my project to Google App Engine via gcloud app deploy which does a great job accomplishing what I want.
gcloud app deploy takes an extra 6-8 minutes on "Updating service...", which is the time Google needs to update its backend infrastructure, primarily the load balancer.
This is not an issue in itself, except that it eats up my monthly build time from Bitbucket.
I would rather have the pipeline return as soon as it has sent off the build commands; I'll check the results myself in Google Cloud Logging.
The question is: is there any flag for gcloud app deploy to tell it not to wait for "Updating service..."?

These are all the gcloud app deploy flags. There's no such flag.
To see if there's a possible 'hack' you could use, try manually deploying your app yourself with gcloud app deploy --log-http. The --log-http flag produces an output of all HTTP requests made during the deploy: you'll see the endpoints being called, the HTTP method, the headers, the payload, and the duration of each call. Examining those (especially around the 'updating' bit) might show you something that could potentially help.

Related

which GCP component to use to fetch data from an API

I'm a little confused between GCP components. Here is my use case:
Daily, I need to fetch data from an external API (the API returns JSON data), store it in GCS, then load it into BigQuery. I have already created the Python script that fetches the data and stores it in GCS, and I'm not sure which component to use for deployment:
Cloud Run: from the docs it is used for deploying services, so I think it's a bad choice
Cloud Functions: I think it would work, but it is meant for event-based processing (through single-purpose functions...)
Composer: (I'll use Composer to orchestrate tasks such as preprocessing files in GCS, loading them into BQ, and transferring them to an archive bucket) through KubernetesPodOperator, create a task that triggers the script to fetch the data
Compute Engine: I don't think it's the best choice since there are better options
App Engine: I also don't think it's a good idea since it is used to deploy and scale web apps...
(Correct me if I'm wrong in any of the above.) So my question is: which GCP component is suited for this kind of task?
Cloud Run: from the docs it is used for deploying services
App Engine: I also don't think it's a good idea since it is used to deploy and scale web apps...
I think you've misunderstood. Both Cloud Run and Google App Engine (GAE) are serverless offerings from Google Cloud. You deploy your code to either of them, and you can invoke their URLs, which in turn causes your code to execute and do things like fetch data from somewhere and save it somewhere else.
Google App Engine has a shorter timeout than Cloud Run (Cloud Run's request timeout is configurable). So if your code takes a long time to run, you don't want to use Google App Engine (unless you make it a background task), and if you don't need a UI, you don't need GAE.
For your specific scenario, you can deploy your code to Cloud Run and use Cloud Scheduler to invoke it at specific times. We have that architecture running in a similar scenario (a task that runs once daily; it's deployed to Cloud Run; Cloud Scheduler invokes the endpoint, and it runs and saves data to Datastore linked to an App Engine app). We wrote a blog article on deploying to Cloud Run and another on securing your Cloud Run service (based on our experience in the scenario described above).
GAE Timeout:
Every request to Google App Engine (standard environment) must complete within 1-10 minutes for automatic scaling and up to 24 hours for basic scaling (see documentation). For App Engine flexible, the timeout is 60 minutes (documentation).
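The fetch-and-stage step the asker describes can be sketched roughly as follows. This is a minimal sketch, not the asker's actual script: the endpoint, record shapes, and object names are hypothetical, and the GCS upload and BigQuery load are shown only as comments since they depend on the google-cloud client libraries.

```python
import json

def to_ndjson(records):
    """Serialize a list of dicts to newline-delimited JSON, the
    format BigQuery load jobs expect for JSON source files."""
    return "\n".join(json.dumps(r, sort_keys=True) for r in records)

def handler():
    # Hypothetical stand-in for the real API call, e.g. with requests:
    #   records = requests.get("https://api.example.com/daily").json()
    records = [{"id": 1, "value": "a"}, {"id": 2, "value": "b"}]
    payload = to_ndjson(records)
    # In the real service you would then upload `payload` to GCS with
    # google-cloud-storage and trigger a BigQuery load job, e.g.:
    #   bucket.blob("daily/2024-01-01.json").upload_from_string(payload)
    return payload

if __name__ == "__main__":
    print(handler())
```

Deployed to Cloud Run (or Cloud Functions) behind an HTTP endpoint, this is exactly the kind of handler Cloud Scheduler can hit once a day.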

How to set up and build previous image working in GCP

I'm new to Google Cloud.
Our project is deployed on Google Cloud. We are using gcloud deploy commands to deploy builds, and we are happy with that.
My question is: what if my current build fails during a production deployment while lots of users are using the application?
How can I immediately redeploy the previous image/build in GCP? I tried pulling the Docker image, pushing it, and then submitting it via "gcloud builds submit", but that doesn't deploy the specified Docker image; it deploys the folder in which I'm running the "gcloud builds submit" command.
Please share your suggestions.
Your concern about not breaking all your users when you deploy a broken version is legitimate, and you aren't alone!
That's why a handy feature exists on App Engine (and also on Cloud Run): traffic splitting.
To use it effectively, deploy your new version to production like this:
gcloud app deploy --no-promote
Here the new version is deployed, but 0% of the traffic is routed to it. Now use this command to increase the traffic, say to 1%:
gcloud app services set-traffic <YOUR_SERVICE> --splits <OldVersionName>=99,<NewVersionName>=1
Monitor your application for a while and, if there are no errors, continue to increase the traffic gradually until you are confident and route 100% to the new version.
If the new version is bad, set its traffic back to 0%, with little or no impact on your users.
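The gradual increase can be scripted. Here is a minimal Python sketch that generates the set-traffic commands for a rollout; the percentage schedule and the service/version names are assumptions for illustration, not anything App Engine mandates.

```python
def rollout_steps(steps=(1, 10, 25, 50, 100)):
    """Yield (old_pct, new_pct) traffic splits for a gradual rollout."""
    for new_pct in steps:
        yield 100 - new_pct, new_pct

def set_traffic_command(service, old_version, new_version, old_pct, new_pct):
    # Builds the same gcloud invocation shown in the answer above.
    return (f"gcloud app services set-traffic {service} "
            f"--splits {old_version}={old_pct},{new_version}={new_pct}")

if __name__ == "__main__":
    # Print each step; a real script would run the command, then pause
    # and check error rates before moving to the next split.
    for old_pct, new_pct in rollout_steps():
        print(set_traffic_command("default", "v1", "v2", old_pct, new_pct))
```

Between steps you would watch your monitoring for elevated error rates before promoting further.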

How to trigger a Google Cloud Build build steps with a Pull Request?

I have configured a CI pipeline using a cloudbuild.yaml file, and I'm trying to launch this pipeline on pull requests. It seems that the provided build triggers (https://cloud.google.com/cloud-build/docs/running-builds/automate-builds) do not allow this option. Is there a way to use webhooks to overcome this limitation? For example, sending an HTTP request after a pull request event to a Cloud Build topic and configuring a Cloud Function as a subscriber to launch the pipeline.
Thanks,
The Cloud Build Github App does builds on pull request: https://cloud.google.com/cloud-build/docs/run-builds-on-github
There are three ways to run builds:
Manually: through the API/gcloud
(Beta) Build triggers: configurable through the Google Cloud Console
(Alpha) GitHub App: builds automatically on changes to the repo and on pull requests
These can all be used independently or in combination with each other.
A completed pull request is merged into an upstream branch (master, release, or another name). https://help.github.com/en/articles/merging-a-pull-request
You can set the Google Cloud Build trigger type (in the Google Cloud Console) to "Branch" and enter the relevant branch. Choose "Cloud Build configuration file" as your build configuration and enter your cloudbuild.yaml file location.
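For reference, a minimal cloudbuild.yaml that such a trigger would run might look like this; the builder images, test command, and image name are placeholders, not anything from the asker's project:

```yaml
steps:
  # Run the project's tests (placeholder command)
  - name: 'gcr.io/cloud-builders/npm'
    args: ['test']
  # Build the container image (placeholder image name)
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-app:$COMMIT_SHA', '.']
images:
  - 'gcr.io/$PROJECT_ID/my-app:$COMMIT_SHA'
```

$PROJECT_ID and $COMMIT_SHA are substitutions Cloud Build fills in at build time.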

Start/Stop Google app engine custom runtime instances

I made a small web service using Node.js and PhantomJS and deployed it to Google App Engine using its flexible environment.
The problem is that the service is used for only half an hour each day, but the VM instances run all the time, and I pay for that.
So I need to be able to automatically start the instance(s) before my app's scheduled run time and then automatically stop them afterwards.
I tried using cron jobs to call start/stop via the API, as in here, but it failed.
Thanks in advance.
We don't seem to currently expose the version stop method in the rest API:
https://cloud.google.com/appengine/docs/admin-api/
However - you can stop a version by running this command:
gcloud app versions list
gcloud app versions stop <version>
That will make sure the VMs get shut down. When you're ready to turn them back on...
gcloud app versions start <version>
Hope this helps!
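To automate the listing step above, you can parse `gcloud app versions list --format=json` and pick the versions to pass to `gcloud app versions stop`. The sketch below uses a simplified, hypothetical sample of that JSON; the field names shown are an assumption, so check your gcloud version's actual output before relying on them.

```python
import json

# Simplified, hypothetical sample of `gcloud app versions list
# --format=json` output (field names are an assumption).
SAMPLE = '''
[
  {"service": "default", "id": "v1", "servingStatus": "STOPPED"},
  {"service": "default", "id": "v2", "servingStatus": "SERVING"},
  {"service": "worker",  "id": "v1", "servingStatus": "SERVING"}
]
'''

def versions_to_stop(listing_json, service):
    """Return IDs of currently serving versions of `service`, i.e.
    the ones you would pass to `gcloud app versions stop`."""
    versions = json.loads(listing_json)
    return [v["id"] for v in versions
            if v["service"] == service and v["servingStatus"] == "SERVING"]

if __name__ == "__main__":
    print(versions_to_stop(SAMPLE, "default"))  # ['v2']
```

A scheduled job could run this, then shell out to `gcloud app versions stop` for each ID it returns.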

Application Default Credentials not working locally with App Engine

I'm having a tough time getting Application Default Credentials to load in the Dataflow SDK when running locally in a Java App Engine project (developing on OS X). It runs fine when deployed.
According to this, the dev app server doesn't support them, and you're meant to use the gcloud command-line tool's gcloud preview app run command - but according to the official Google Group for the SDK here, that command was deprecated in Jan 2016.
So I seem to be stuck between a rock and a hard place... Does anyone know how to get Application Default Credentials to work locally with an App Engine app?
I'm trying to use the Dataflow API, and it throws as soon as it starts using the Cloud Storage API (the first thing the pipeline does), because it can't load the correct credentials either from the environment variables (which are definitely set in the environment and in the appengine-web.xml <env-variables> element) or from the ~/.config/gcloud/application_default_credentials.json file.
Cheers!
Can you try running the following command and see if it solves it?
gcloud auth application-default login
This is fully supported (but poorly documented) in the dev appserver. There is a well-answered question with step-by-step instructions here: Unable to access BigQuery from local App Engine development server
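After running that command, the credentials land in a well-known file. A quick way to check what ADC would pick up on a Linux/macOS developer machine is sketched below; this only covers the first two steps of the ADC lookup order (env var, then the gcloud-written file), not the full chain.

```python
import os
from pathlib import Path

def adc_candidate():
    """Return the credentials path ADC tries first on a developer
    machine: the GOOGLE_APPLICATION_CREDENTIALS env var if set,
    otherwise the file written by
    `gcloud auth application-default login` (Linux/macOS location)."""
    env = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    if env:
        return Path(env)
    return (Path.home() / ".config" / "gcloud"
            / "application_default_credentials.json")

if __name__ == "__main__":
    path = adc_candidate()
    print(path, "exists" if path.exists() else "missing")
```

If this prints "missing" for both candidates, that matches the symptom described in the question.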