I read all the Pub/Sub docs and saw no mention of Snapshots.
But when I read the docs for the Node.js Client Libraries, I find references to Snapshot objects: https://googlecloudplatform.github.io/google-cloud-node/#/docs/pubsub/0.11.0/pubsub/snapshot
Huh?
It's an invite-only alpha feature that seems to provide some sort of replay functionality: https://cloud.google.com/sdk/gcloud/reference/alpha/pubsub/.
Related
I'm using Google Pub/Sub to receive status changes from a build workflow I have in Google Cloud Build. There's a pre-defined topic: if you create a topic named cloud-builds, Cloud Build updates will automatically populate it.
The Pub/Sub topic to which Cloud Build publishes these build update messages is called cloud-builds.
https://cloud.google.com/build/docs/subscribe-build-notifications
I'm curious whether there is a complete list of pre-defined Pub/Sub topics that are automatically populated by different services.
I looked around the docs for Pub/Sub but couldn't find a complete list.
Sorry, there's no such list. Some other services also offer Pub/Sub notifications, for example Google Cloud Storage, but there's no centralized place where they are all documented. You will need to refer to the documentation of the particular service you are interested in.
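Since the cloud-builds thread above turns on what those notification messages look like, here is a minimal decoding sketch. It assumes the shape described in the linked docs (attributes carrying buildId and status, and data holding the base64-encoded JSON Build resource); the sample message itself is fabricated for illustration.

```python
import base64
import json

def build_status(message):
    # message: the JSON envelope as delivered by a Pub/Sub push request.
    # attributes carry buildId/status; data is the base64-encoded
    # Build resource (a JSON object).
    attrs = message.get("attributes", {})
    build = json.loads(base64.b64decode(message["data"]))
    return attrs.get("buildId"), attrs.get("status"), build

# Fabricated sample shaped like a Cloud Build notification.
sample = {
    "attributes": {"buildId": "build-123", "status": "SUCCESS"},
    "data": base64.b64encode(
        json.dumps({"id": "build-123", "status": "SUCCESS"}).encode()
    ).decode(),
}

build_id, status, build = build_status(sample)
```

A pull subscriber would apply the same decoding; only the transport differs.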
I'm trying to get Pub/Sub working in the App Engine Standard Environment, but I'm having problems getting the right context: the Pub/Sub client wants a context.Context, while App Engine only provides an appengine.Context. I can't find any examples covering this, except for the flexible environment (which uses context.Background), and I don't want to use that. Am I the only person on the planet wanting to use Pub/Sub with the App Engine Standard Environment?
Ultimately I was importing the wrong appengine package. As of now, I have to import google.golang.org/appengine, as in the examples for Go 1.9. The problem was that I was providing an appengine.Context when I needed a context.Context.
context.Context was introduced in Go 1.7 (2016), and appengine.NewContext was changed to return a context.Context in 2017.
The documentation I have read only covers asynchronous pull; I'd like to verify that this is the only option in the Python API.
The Cloud Pub/Sub client library only supports asynchronous subscribing, which is the recommended way to run a subscriber. For specific use cases where a synchronous pull is needed, use the REST/HTTP pull method or the gRPC pull method, which requires generating the service code.
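To make the synchronous option above concrete, here is a sketch of the request a REST/HTTP pull involves, using the v1 projects.subscriptions:pull endpoint. The project and subscription names are placeholders, and actually sending the request additionally requires an OAuth 2.0 bearer token.

```python
import json

PUBSUB_HOST = "https://pubsub.googleapis.com/v1"

def pull_request(project, subscription, max_messages=10):
    # Build the URL and JSON body for a synchronous REST pull.
    # POSTing this body to the URL (with an Authorization: Bearer
    # header) returns up to max_messages; each received message must
    # then be acked via the matching :acknowledge endpoint.
    url = f"{PUBSUB_HOST}/projects/{project}/subscriptions/{subscription}:pull"
    body = json.dumps({"returnImmediately": False, "maxMessages": max_messages})
    return url, body

url, body = pull_request("my-project", "my-sub", max_messages=5)
```

With returnImmediately set to false, the call blocks until at least one message is available, which approximates a blocking synchronous pull loop.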
We have a large dataset from an App Engine app in our Datastore. Now I want to do some ETL on it to push it to BigQuery, and I thought of using a Dataflow batch job.
All examples I find are using this class to query the Datastore:
import com.google.api.services.datastore.DatastoreV1.Query;
And that does work. However, I'm not familiar with this DatastoreV1 API and would like to use the API provided with the App Engine SDK, like this:
import com.google.appengine.api.datastore.Query;
The problem is that the DatastoreIO doesn't accept these queries:
PCollection<Entity> projects = p.apply(Read.from(DatastoreIO.source().withQuery(q).withDataset(DATASET_ID)));
It will only take DatastoreV1.Query objects. Is there any way to use the App Engine-provided APIs? I'm much more familiar with those calls. Better yet, if we could use Objectify, that would be awesome :)
Thanks!
This isn't possible with the current implementation of the API. We can look at adding it as a feature, and would gladly accept a pull request expanding the current functionality. The App Engine team is also actively working on increasing interoperability between their SDK and the Datastore API.
Are there any docs on how to flush the Google App Engine memcache using Go?
I can see flush_all() in the Python docs: https://developers.google.com/appengine/docs/python/memcache/functions
The Go memcache package's source lists a Flush function that is not in the official docs. I suspect it works and is just undocumented.