Cloud Pub/Sub: Pull subscriber with GAE/Go Standard

Can I create a Cloud Pub/Sub pull subscriber with the GAE/Go Standard Environment?
I read this page, but it covers the GAE Flexible environment.
I would like to create a pull subscriber with GAE/Go Standard.
Can I do it?

No, you can't. The proper way to use Pub/Sub with GAE Standard is to create a push subscription that delivers each message to an App Engine endpoint you have defined.
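For illustration, here is a minimal sketch of such a push endpoint in Go. It assumes the standard Pub/Sub push envelope, in which the message payload arrives base64-encoded; the route and handler name are hypothetical.

package main

import (
	"encoding/base64"
	"encoding/json"
	"log"
	"net/http"
	"os"
)

// pushEnvelope mirrors the JSON body Pub/Sub sends to push endpoints.
type pushEnvelope struct {
	Message struct {
		Data      string `json:"data"` // base64-encoded payload
		MessageID string `json:"messageId"`
	} `json:"message"`
	Subscription string `json:"subscription"`
}

// handlePush receives one Pub/Sub message per request. Responding
// with a 2xx status acknowledges the message; anything else makes
// Pub/Sub redeliver it.
func handlePush(w http.ResponseWriter, r *http.Request) {
	var env pushEnvelope
	if err := json.NewDecoder(r.Body).Decode(&env); err != nil {
		http.Error(w, "bad request", http.StatusBadRequest)
		return
	}
	payload, err := base64.StdEncoding.DecodeString(env.Message.Data)
	if err != nil {
		http.Error(w, "bad payload", http.StatusBadRequest)
		return
	}
	log.Printf("message %s: %s", env.Message.MessageID, payload)
	w.WriteHeader(http.StatusNoContent) // ack
}

func main() {
	// The path must match the push endpoint configured on the
	// subscription (a hypothetical route here).
	http.HandleFunc("/pubsub/push", handlePush)
	port := os.Getenv("PORT")
	if port == "" {
		port = "8080"
	}
	log.Fatal(http.ListenAndServe(":"+port, nil))
}

Pub/Sub retries delivery until the endpoint returns a success status, so the handler should be idempotent.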

Related

Can Google Cloud functions share a datastore with Appengine?

I have an old AppEngine Java application using the AppEngine datastore. Is this what the marketing renamers at Google now (2019) call "Cloud Datastore"?
Can I create Google Cloud Functions that interact with the same datastore, and what are the steps needed to do so?
Yes, it's the same datastore, also called (and soon to become) Cloud Firestore in Datastore mode, which all older apps will eventually be converted to.
Yes, you can access it from anywhere, even from outside Google Cloud. From Cloud Datastore (emphasis mine):
You can access Cloud Datastore from anywhere using the Cloud Datastore API. Use the Google Cloud client libraries to store and retrieve data from Cloud Datastore. The same Cloud Datastore data is available regardless of whether you use the App Engine libraries, the Google Cloud client libraries, or call the API directly.
The major steps to access the datastore from a Cloud Function:
- you can't use the GAE-specific client libraries like the one you likely used in your old app; you'll have to use one of the generic client libraries (or the REST or RPC APIs), as in the sketch below
- you'll have to give your CF's identity/service account the proper access permissions; see Setting up authentication and Accessing your database from another platform.
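As a hedged illustration, a Cloud Function written in Go (one of the supported runtimes) can reach the shared datastore through the generic cloud.google.com/go/datastore client. The project ID lookup, the Task kind, and its fields are placeholders; adapt them to what your old GAE app stored.

package function

import (
	"context"
	"fmt"
	"net/http"
	"os"

	"cloud.google.com/go/datastore"
)

// Task is a hypothetical kind; match it to whatever your
// old GAE app stored.
type Task struct {
	Description string
	Done        bool
}

// ReadTask is an HTTP Cloud Function that looks up one entity.
// It authenticates as the function's service account, which must
// have a Datastore role (e.g. roles/datastore.user) on the project.
func ReadTask(w http.ResponseWriter, r *http.Request) {
	ctx := context.Background()
	// The project ID is read from an environment variable here;
	// set or hard-code it as your runtime requires.
	client, err := datastore.NewClient(ctx, os.Getenv("GCP_PROJECT"))
	if err != nil {
		http.Error(w, err.Error(), http.StatusInternalServerError)
		return
	}
	defer client.Close()

	var t Task
	key := datastore.NameKey("Task", r.URL.Query().Get("name"), nil)
	if err := client.Get(ctx, key, &t); err != nil {
		http.Error(w, err.Error(), http.StatusNotFound)
		return
	}
	fmt.Fprintf(w, "%s done=%v\n", t.Description, t.Done)
}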

Google Cloud Pub/Sub, publishing with a HTTP PUSH request

I am trying to use Google Cloud's Pub/Sub feature to store incoming data from an IoT device. I have an event callback which should send a JSON string to the Pub/Sub topic from the IoT device's back-end. The callback looks like this (where {project}, {topic} and {YOUR_API_KEY} are filled in as required):
POST https://pubsub.googleapis.com/v1/projects/{project}/topics/{topic}:publish?key={YOUR_API_KEY}
{"messages":[{"data":"test"}]}
I invariably get a 403 error with this setup. I have tried various slight variations and run into other errors. I am very new to this topic; is there an obvious mistake I am making?
API keys are not sufficient for authenticating to the Google Cloud Pub/Sub API. Only a subset of GCP services allows access using only an API key, as detailed in the Using API Keys documentation. You will need to use a service account to authenticate for Cloud Pub/Sub. You might also want to consider Google Cloud IoT, which sends telemetry into Cloud Pub/Sub for you.
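A minimal sketch of publishing with service-account credentials, using the Go client library and Application Default Credentials (point GOOGLE_APPLICATION_CREDENTIALS at the service account's key file); the project and topic IDs are placeholders:

package main

import (
	"context"
	"log"

	"cloud.google.com/go/pubsub"
)

func main() {
	ctx := context.Background()

	// Credentials are picked up from the environment
	// (GOOGLE_APPLICATION_CREDENTIALS) instead of an API key.
	client, err := pubsub.NewClient(ctx, "my-project") // placeholder project ID
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// The client library base64-encodes the payload for you;
	// with the raw REST API you must do that yourself.
	res := client.Topic("my-topic").Publish(ctx, &pubsub.Message{
		Data: []byte(`{"reading": 42}`),
	})
	id, err := res.Get(ctx) // blocks until the server acks
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("published message %s", id)
}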

Google Cloud Datastore: how to create, update and delete an entity with an HTTP request

Google Cloud Datastore client libraries are available for some languages only. How can I create, update, and delete entities without the client libraries, using plain HTTP requests?
The Datastore API is built on HTTP and JSON, so any standard HTTP client can send requests to it and parse the responses. You can find more about building a flexible runtime here.
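As a hedged sketch of that approach, the REST API's commit method accepts upsert, insert, update and delete mutations. The example below obtains an OAuth token via Application Default Credentials and posts one upsert; the project ID, kind, and properties are placeholders.

package main

import (
	"bytes"
	"context"
	"fmt"
	"io"
	"log"

	"golang.org/x/oauth2/google"
)

func main() {
	ctx := context.Background()

	// An HTTP client authorized for the Datastore OAuth scope;
	// API keys will not work here.
	httpClient, err := google.DefaultClient(ctx,
		"https://www.googleapis.com/auth/datastore")
	if err != nil {
		log.Fatal(err)
	}

	// One upsert mutation; delete mutations go through the same
	// commit endpoint with a "delete" key instead of "upsert".
	body := []byte(`{
	  "mode": "NON_TRANSACTIONAL",
	  "mutations": [{
	    "upsert": {
	      "key": {"path": [{"kind": "Task", "name": "task1"}]},
	      "properties": {"done": {"booleanValue": false}}
	    }
	  }]
	}`)

	url := "https://datastore.googleapis.com/v1/projects/my-project:commit"
	resp, err := httpClient.Post(url, "application/json", bytes.NewReader(body))
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()
	out, _ := io.ReadAll(resp.Body)
	fmt.Printf("%s: %s\n", resp.Status, out)
}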
One of the classic ways to do this is to expose the datastore CRUD operations through a set of REST APIs. Google offers Cloud Endpoints, a set of "tools, libraries and capabilities that allow you to generate APIs and client libraries from an App Engine application": https://cloud.google.com/appengine/docs/java/endpoints/
You can have a look at this tutorial: https://rominirani.com/google-cloud-endpoints-tutorial-part-1-b571ad6c7cd2#.1j9holpdt

Ingesting logs into BigQuery from a Python script

How can I ingest logs from an App Engine app into BigQuery without using App Engine MapReduce?
We've open-sourced a Java implementation that exports App Engine logs to BigQuery: http://blog.streak.com/2012/07/export-your-google-app-engine-logs-to.html
See the BigQuery docs here. You can POST a multipart HTTP request containing the data you want to add to the table; if you are appending to an existing table, you won't need to provide the schema. A sketch follows below.
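A hedged sketch of such an append using the Go client library, which wraps the load-job HTTP request for you; the dataset, table, and newline-delimited JSON payload are placeholders:

package main

import (
	"context"
	"log"
	"strings"

	"cloud.google.com/go/bigquery"
)

func main() {
	ctx := context.Background()
	client, err := bigquery.NewClient(ctx, "my-project") // placeholder
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Two example log records as newline-delimited JSON.
	logs := strings.NewReader(
		`{"ts":"2019-01-01T00:00:00Z","msg":"request start"}
{"ts":"2019-01-01T00:00:01Z","msg":"request end"}
`)
	src := bigquery.NewReaderSource(logs)
	src.SourceFormat = bigquery.JSON

	// Appending to an existing table, so no schema is supplied.
	loader := client.Dataset("app_logs").Table("requests").LoaderFrom(src)
	loader.WriteDisposition = bigquery.WriteAppend

	job, err := loader.Run(ctx)
	if err != nil {
		log.Fatal(err)
	}
	status, err := job.Wait(ctx)
	if err != nil {
		log.Fatal(err)
	}
	if status.Err() != nil {
		log.Fatal(status.Err())
	}
}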
There's also a Python runtime implementation called "log2bq" that demonstrates how to ingest App Engine logs into BigQuery: http://code.google.com/p/log2bq/
Google has recently released a (BETA) feature called "Google Cloud Logging: Logs Export"
https://cloud.google.com/logging/docs/install/logs_export
They summarize it as:
Export your Google Compute Engine logs and your Google App Engine logs to a Google Cloud Storage bucket, a Google BigQuery dataset, a Google Cloud Pub/Sub topic, or any combination of the three.
I haven't tried out all of the functionality provided by this new service, but we have recently started using the "Stream App Engine Logs to BigQuery" feature in our Python GAE project. It sends our app's logs directly to BigQuery as they occur, providing near-real-time log records in a BigQuery dataset.
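For comparison with the batch load above, streaming rows shows up in the table within seconds. A hedged sketch using the Go client's streaming inserter; the table layout is assumed:

package main

import (
	"context"
	"log"
	"time"

	"cloud.google.com/go/bigquery"
)

// LogRow's fields map to the columns of an assumed logs table.
type LogRow struct {
	Timestamp time.Time `bigquery:"ts"`
	Message   string    `bigquery:"msg"`
}

func main() {
	ctx := context.Background()
	client, err := bigquery.NewClient(ctx, "my-project") // placeholder
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Streaming inserts arrive almost immediately, unlike load jobs.
	ins := client.Dataset("app_logs").Table("requests").Inserter()
	rows := []*LogRow{
		{Timestamp: time.Now(), Message: "request handled"},
	}
	if err := ins.Put(ctx, rows); err != nil {
		log.Fatal(err)
	}
}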

App Engine Datastore access

Is it possible to query App Engine's Datastore from outside the cloud, i.e. from a client application?
I could possibly write an App Engine app that queries the Datastore and returns XML-formatted data; I want to know, however, whether there are any Datastore endpoints which would allow me to do it directly.
Also, if it is possible, can I do so via SSL?
Yes. The remote_api library supports exactly this use case. If you're using Java, there's a Java remote_api handler available, and the client will be available at some point in the future.
You can use this over SSL in the same way as any other handler.
There's no reason you couldn't create your own App Engine application that exposes the datastore as a web service (either HTTP or HTTPS). In fact, here is a link to a Python version.
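A hedged sketch of that approach in Go: a plain App Engine handler that runs a datastore query and returns JSON over whatever scheme (HTTP or HTTPS) the app is served on, since App Engine terminates TLS in front of the app. The kind, fields, route, and project-ID environment variable are assumptions:

package main

import (
	"context"
	"encoding/json"
	"log"
	"net/http"
	"os"

	"cloud.google.com/go/datastore"
)

// Task is a placeholder kind.
type Task struct {
	Description string `json:"description"`
	Done        bool   `json:"done"`
}

func main() {
	ctx := context.Background()
	client, err := datastore.NewClient(ctx, os.Getenv("GOOGLE_CLOUD_PROJECT"))
	if err != nil {
		log.Fatal(err)
	}

	// Expose a read-only query endpoint; the same handler serves
	// both http:// and https:// clients on App Engine.
	http.HandleFunc("/tasks", func(w http.ResponseWriter, r *http.Request) {
		var tasks []Task
		q := datastore.NewQuery("Task").Limit(20)
		if _, err := client.GetAll(r.Context(), q, &tasks); err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
			return
		}
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(tasks)
	})

	port := os.Getenv("PORT")
	if port == "" {
		port = "8080"
	}
	log.Fatal(http.ListenAndServe(":"+port, nil))
}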
