How to create Preemptive VM using Google Cloud SDK for java? - google-app-engine

I need to create preemptive VMs programmatically. I'm trying to make VMs using the Google Cloud SDK for Java but I'm unable to find any documentation about creation of Preemptive VMs.

I suspect you're not finding the answer because you're searching on "preemptive", when the term used in the Google documentation is "preemptible". Searching for that term should turn up the relevant documentation.
In answer to your question, though: if you are using the Google Cloud SDK for Java, when you create the instance you need to add a "preemptible" property under "scheduling" and set it to true.
The equivalent REST request, sent over HTTPS, would be:
POST https://www.googleapis.com/compute/v1/projects/[PROJECT_ID]/zones/[ZONE]/instances
{
  "machineType": "zones/[ZONE]/machineTypes/[MACHINE_TYPE]",
  "name": "[INSTANCE_NAME]",
  "scheduling":
  {
    "preemptible": true
  },
  ...
}
... and if you are using the Google Cloud SDK CLI, then it's a matter of using the --preemptible flag:
gcloud compute instances create [INSTANCE_NAME] --preemptible
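Whatever client you use ends up sending the same JSON body shown above; a minimal Python sketch of assembling it (the project, zone, and instance names here are placeholders, not from the question):

```python
import json

def preemptible_instance_body(zone, machine_type, name):
    """Build the instances.insert request body for a preemptible VM.

    The only addition over a regular instance is the "scheduling" block
    with "preemptible" set to true; disks, networkInterfaces, etc. are
    filled in as for any other instance.
    """
    return {
        "name": name,
        "machineType": "zones/%s/machineTypes/%s" % (zone, machine_type),
        "scheduling": {"preemptible": True},
    }

body = preemptible_instance_body("us-central1-a", "n1-standard-1",
                                 "my-preemptible-vm")
print(json.dumps(body, indent=2))
```

The Java client builds the same structure with its model classes; the key step in either case is the scheduling/preemptible flag, which must be set at creation time (it cannot be changed on a running instance).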

Related

How can I connect Google App Engine to Elastic Cloud?

I'm trying to connect to my cluster on Elastic Cloud using elasticsearch-py on GAE, but I'm running into the following error:
ConnectionError: ConnectionError('VerifiedHTTPSConnection' object has no attribute '_tunnel_host') caused by: AttributeError('VerifiedHTTPSConnection' object has no attribute '_tunnel_host')
I've tried this fix that I've seen in a number of places already that reference the '_tunnel_host' error, but it's not resolving my issue:
from requests_toolbelt.adapters import appengine
appengine.monkeypatch()
I've also tried a few variations that I've seen for the es declaration, but none of them have worked; for example:
es = Elasticsearch(["https://elastic:password@xxxxx.us-central1.gcp.cloud.es.io:9243"],
                   send_get_body_as='POST',
                   use_ssl=True,
                   verify_certs=True)
I'd like to be able to establish the connection and begin sending and consuming data from my cluster, but can't find a way to do this. Any help would be much appreciated!
There is an article with an example of a real-world app: Elasticsearch on Google Cloud with Firebase Functions.
Alternatively, the Google Cloud Marketplace offers many ready-made Elasticsearch solutions, for example:
1. You can deploy and configure an Elasticsearch cluster that runs on Kubernetes, using Google Click to Deploy containers.
2. Or a complete Elasticsearch solution running on virtual machines provided by Google.
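One detail worth checking before anything else (a sketch under assumptions, not a confirmed fix for the '_tunnel_host' error): credentials embedded in the URL can be passed separately via the client's http_auth parameter instead. Splitting them out needs nothing beyond the standard library:

```python
from urllib.parse import urlparse

# Placeholder endpoint; "xxxxx" stands in for the real cluster ID.
url = "https://elastic:password@xxxxx.us-central1.gcp.cloud.es.io:9243"

parts = urlparse(url)
# Rebuild the endpoint without the embedded credentials.
endpoint = "{0}://{1}:{2}".format(parts.scheme, parts.hostname, parts.port)
auth = (parts.username, parts.password)

# These could then be handed to the client, e.g.:
# es = Elasticsearch([endpoint], http_auth=auth,
#                    use_ssl=True, verify_certs=True)
print(endpoint)
print(auth)
```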

AppEngine Standard Environment Pub/Sub Context in Go

Trying to get Pub/Sub working in AppEngine Standard Environment. Having problems getting the right context. The Pub/Sub client wants a context.Context but AppEngine only has appengine.Context. Can't find any examples or anything related to this, except for flexible environment (using context.Background) which I don't want to use. Am I the only person on the planet wanting to use Pub/Sub with AppEngine Standard Environment?
Ultimately I was using the wrong appengine package. As of now, I have to import google.golang.org/appengine, as in the examples for Go 1.9. The problem was that I was passing an appengine.Context where a context.Context was needed.
context.Context was introduced in Go 1.7 (2016), and appengine.NewContext was changed to return context.Context in 2017.

How to create tasks in default queue of appengine while runtime is php72

I am trying to create a task in the default queue. For this, I wrote the following code, but it is not working.
// Including
use google\appengine\api\taskqueue\PushTask;
use google\appengine\api\taskqueue\PushQueue;
// Initialising
$task = new PushTask('/worker', [$values], ['header' => "Host: https://-myserviceurl"]);
$queue = new PushQueue('default');
$queue->addTasks([$task]);
My questions are:
Can we create tasks from the flexible environment if the PHP runtime is 7.2?
If the above method won't help me create tasks, then how can I create them when all my services are in the flexible environment?
You can use the Google Cloud Client Library for PHP, which supports PHP 7.2 and works in the App Engine flexible environment.
What you are looking for is the Cloud Tasks API, which does the same as the App Engine task queues, but as a separate API.
You can check this documentation for installing the library, along with sample code, and this documentation as a reference for the methods and usage of the v2beta3 version of the API, which is the latest and the one I recommend you use.
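For orientation, the task you build with Cloud Tasks plays the same role as the PushTask above; a Python sketch of the v2beta3 REST task shape (field names are from the REST reference; the handler path, payload, and service name are placeholders):

```python
import base64
import json

def app_engine_task(relative_uri, payload, service=None):
    """Build a Cloud Tasks (v2beta3) task targeting an App Engine handler.

    Equivalent in spirit to PushTask('/worker', ...): the body is
    base64-encoded bytes, and routing to a specific service goes under
    "appEngineRouting" rather than a Host header.
    """
    request = {
        "httpMethod": "POST",
        "relativeUri": relative_uri,
        "body": base64.b64encode(payload).decode("ascii"),
    }
    if service:
        request["appEngineRouting"] = {"service": service}
    return {"appEngineHttpRequest": request}

task = app_engine_task("/worker", b"key=value", service="my-service")
print(json.dumps(task, indent=2))
```

The PHP client library wraps this same structure in its own classes, so the snippet is only meant to show what ends up on the wire.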

Combine the conditions to filter ancestor and property by GQL

I am creating an endpoint to query my Datastore with GQL, and I want to filter by both ancestor and a property, but GQL seems unable to combine those two conditions. Am I missing anything?
My GQL is:
select * from Product where __key__ HAS ANCESTOR Key(modle1, '0') AND timestamp > 0
And I used the library: com.google.cloud.datastore
Is there any other way to achieve this?
Thanks, everyone.
OK, I finally figured out my problem.
I MUST upload an index definition for my query.
So I followed this guide - Cloud Datastore Indexes - and uploaded my index.
I succeeded in the end, but it took me a whole night.
If you use the Google Cloud SDK to develop your App Engine app, here is a small hint for uploading indexes.
Many Q&As and documents say you should upload indexes with appcfg.cmd/appcfg.py/appcfg.sh, but if you are developing with the Google Cloud SDK instead of the Google App Engine SDK, you should not use those commands. Replace them with
gcloud app deploy indexes.yaml
or
gcloud datastore create-indexes indexes.yaml
Ref: Migrating from AppCfg to gcloud Command Line
No matter what you develop your App Engine app with, you should write indexes.yaml. (If you develop in Java, you will find some documents asking you to write datastore-indexes.xml instead; drop it, as gcloud does not seem to support that format.)
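For the query above (an ancestor condition plus an inequality on timestamp), the indexes.yaml entry would look roughly like this (kind and property names taken from the question; treat it as a sketch):

```yaml
indexes:
- kind: Product
  ancestor: yes
  properties:
  - name: timestamp
    direction: asc
```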

How can we use Google Datastore Objectify remotely?

I am trying to use Google Objectify for Datastore (https://github.com/objectify/objectify). My app is not hosted on GAE, but I still make use of Datastore, so I need to use the remote API. Right now, I use the low-level API and connect successfully like this:
DatastoreOptions options = DatastoreOptions.builder()
    .projectId("PROJECT_NAME")
    .authCredentials(AuthCredentials.createApplicationDefaults())
    .build();
Datastore client = options.service();
And the library used is http://googlecloudplatform.github.io/gcloud-java/0.2.0/index.html. My application default credentials for AuthCredentials.createApplicationDefaults() are in my home folder, in development as well as on the server.
In the docs I saw for Objectify, I did not see any way of specifying the connection like above, and thus no way of telling it to use the credentials file in the home folder. The code I see for Objectify is mostly like Objectify.ofy(), so with this method I see no way of pointing it at the application default credentials.
Thank you very much.
Use the Google App Engine remote API:
https://cloud.google.com/appengine/docs/java/tools/remoteapi
You could also try the gcloud-java datastore module:
http://googlecloudplatform.github.io/gcloud-java/0.2.0/index.html
But I encountered some performance issues using it outside of the Google sandbox (i.e. on Compute Engine rather than GAE).
