memcache and a custom DoFn with Dataflow

I'm trying to use Google memcache with Dataflow. Essentially, I'd like to transform data and write it into memcache. Is it possible to use the Google memcache API inside of Dataflow?
I get the following error:
java.util.concurrent.ExecutionException: com.google.apphosting.api.ApiProxy$CallNotFoundException: The API package 'memcache' or call 'Set()' was not found. com.google.appengine.api.utils.FutureWrapper.setExceptionResult(FutureWrapper.java:65)
This is the code in question:
AsyncMemcacheService asyncCache = MemcacheServiceFactory.getAsyncMemcacheService("namespace");
asyncCache.put("key", "value", Expiration.byDeltaSeconds(100000)).get();

I think memcache is part of App Engine and not directly accessible outside of App Engine, so you won't be able to access it from Dataflow directly. What you could do is create an App Engine service that acts as a proxy and send requests to it from Dataflow.
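For instance, here is a minimal sketch of the App Engine side of such a proxy, assuming a hypothetical /memcache handler that the Dataflow workers would POST JSON key/value pairs to (the route and payload shape are made up for illustration):

import json
import webapp2
from google.appengine.api import memcache

class MemcacheProxyHandler(webapp2.RequestHandler):
    def post(self):
        # Dataflow POSTs a JSON body like {"key": ..., "value": ..., "ttl": ...}.
        body = json.loads(self.request.body)
        memcache.set(body['key'], body['value'], time=body.get('ttl', 0))
        self.response.set_status(204)

app = webapp2.WSGIApplication([('/memcache', MemcacheProxyHandler)])

Your Dataflow DoFn would then issue a plain HTTP POST to that handler for each element instead of calling the memcache API directly.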

403 error when trying to list metrics in different project through monitoring api from app engine

I am using an App Engine application written in Python 2.7 to collect monitoring metrics from different projects. I am running this from Cloud Shell. The App Engine application is deployed in ProjectA, and it is able to collect the metrics from ProjectA through the call below:
service = build('monitoring', 'v3', cache_discovery=True)
project_name = 'projects/{project_id}'.format(
    project_id=project_id
)
metrics = service.projects().metricDescriptors().list(
    name=project_name,
    pageSize=config.PAGE_SIZE,
    pageToken=next_page_token
).execute()
This call is made in a loop. Now I need to collect ProjectB's metrics, and I have Owner access to ProjectB. When ProjectB is passed as the project_id parameter, I get the error below:
logMessage: "Error: <HttpError 403 when requesting https://monitoring.googleapis.com/v3/projects/ProjectB/metricDescriptors?pageToken=&alt=json&pageSize=500 returned "Permission monitoring.metricDescriptors.list denied (or the resource may not exist).">"
severity: "ERROR"
sourceLocation: {
    file: "/base/data/home/apps/s~ProjectA/list-metrics:20200706t123743.427891295940019389/main.py"
    functionName: "post"
    line: "665"
}
time: "2020-07-06T16:10:43.724399Z"
I am not sure what should be done to make this work. I am very new to Google Cloud and its APIs, and also new to App Engine and Python 2.7. Kindly help, thanks.
I have solved this by adding the App Engine default service account to the monitoring.viewer role on ProjectB, and now I am able to get the metrics from ProjectB as well.
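For the record, a grant along those lines can be done with gcloud; the project IDs below are placeholders, and the App Engine default service account is usually PROJECT_ID@appspot.gserviceaccount.com:

gcloud projects add-iam-policy-binding ProjectB \
    --member="serviceAccount:ProjectA@appspot.gserviceaccount.com" \
    --role="roles/monitoring.viewer"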

Cloud Endpoints and App Engine

I've just started on Google Cloud and I'm creating an iOS app to interact with Google Cloud services via a mobile backend. I'm using Python to write the backend for App Engine. I've gone through the tutorials in creating an API based on endpoints - but I have a question.
Do I have to create a Cloud Endpoints API, and then an app on App Engine? Basically, I want to be able to register accounts on my iOS app and call an API which then makes use of Google Datastore to store the account details. From looking at the tutorials (both the Cloud Endpoints one and the guestbook one), am I meant to expose Google Datastore, Cloud Storage, etc. within the Endpoints API? Or does that link into another app where all of that is done?
Sorry if this sounds a bit silly, but I just want to make sure!
Thanks in advance.
In a nutshell, your Cloud Endpoints API is your application. Some of the documentation regarding Cloud Endpoints can be a bit confusing (or vague), but on the server side it's essentially a bunch of Python decorators or Java annotations that allow you to expose your application logic as a REST API.
I find the Java implementation of Cloud Endpoints more intuitive than the Python one, which requires a bit more work to (de-)serialise your objects. You could look at endpoints_proto_datastore.ndb.EndpointsModel which might take some of the boilerplate stuff out of the equation (defining messages).
Essentially, when you write your API, each endpoint maps to a Python function. Inside that function you can do what you like, but typically it will be either:
Deserialise your POSTed JSON, validate it, and write some entities to Datastore (or Cloud SQL, Bigtable, wherever); a sketch of this write path appears after the examples below.
Read one or more entities from Datastore, serialise them to JSON, and return them to the client.
For example, you might define your API (the whole collection of endpoint functions) as
@endpoints.api(name='cafeApi', version='v1', description='Cafe API',
               audiences=[endpoints.API_EXPLORER_CLIENT_ID])
class CafeApi(remote.Service):
    # endpoints here
For example, you might have an endpoint to get nearby cafes:
@endpoints.method(GEO_RESOURCE, CafeListResponse, path='cafes/nearby',
                  http_method='GET', name='cafes.nearby')
def get_nearby_cafes(self, request):
    """Get cafes close to specified lat,long"""
    cafes = list()
    for c in search.get_nearby_cafes(request.lat, request.lon):
        cafes.append(c.response_message())
    return CafeListResponse(cafes=cafes)
A couple of things to highlight here. With the Python Endpoints implementation, you need to define your resource and message classes - these are used to encapsulate request data and response bodies.
So, in the above example, GEO_RESOURCE encapsulates the fields required to make a GeoPoint (so we can search by location using Search API, but you might just search Datastore for Cafes with a 5-star rating):
GEO_RESOURCE = endpoints.ResourceContainer(
    message_types.VoidMessage,
    lat=messages.FloatField(1, required=True),
    lon=messages.FloatField(2, required=True)
)
and the CafeListResponse would just encapsulate a list of CafeResponse objects (with Cloud Endpoints you return a single object):
class CafeListResponse(messages.Message):
    cafes = messages.MessageField(CafeResponse, 1, required=False, repeated=True)
where CafeResponse is the message that defines how you want your objects (typically Datastore entities) serialised by your API, e.g.:
class CafeResponse(messages.Message):
    id = messages.StringField(1, required=False)
    coordinates = messages.MessageField(GeoMessage, 3, required=True)
    name = messages.StringField(4, required=False)
With that endpoint signature, you can access it via an HTTP GET at /cafeApi/v1/cafes/nearby?lat=...&lon=... or via, say, the JavaScript API client with cafeApi.cafes.nearby(...).
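To round out the two patterns listed above, here is a minimal sketch of the write path; the Cafe model and the CafeRequest/CafeCreatedResponse messages are invented for illustration:

from google.appengine.ext import ndb

class CafeRequest(messages.Message):
    # Hypothetical request message.
    name = messages.StringField(1, required=True)
    rating = messages.IntegerField(2, required=False)

class CafeCreatedResponse(messages.Message):
    # Hypothetical response message.
    id = messages.StringField(1, required=False)

class Cafe(ndb.Model):
    # Hypothetical Datastore model.
    name = ndb.StringProperty()
    rating = ndb.IntegerProperty()

# This method would sit inside the CafeApi class shown earlier:
@endpoints.method(CafeRequest, CafeCreatedResponse, path='cafes',
                  http_method='POST', name='cafes.create')
def create_cafe(self, request):
    """Deserialise the POSTed cafe, validate it, and write it to Datastore."""
    if not request.name:
        raise endpoints.BadRequestException('name is required')
    cafe = Cafe(name=request.name, rating=request.rating)
    key = cafe.put()  # single Datastore write
    return CafeCreatedResponse(id=key.urlsafe())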
Personally, I found Flask a bit more flexible for creating a REST API in Python.

Is it possible to execute a task inside Google App Engine taskqueue?

I want to write a unit test for a function that runs inside the Google App Engine task queue. Is there any method by which we can execute the tasks in the task queue, so as to test whether they yield the desired output?
You need to manually dequeue the task from the queue and POST to its URL with all the params. Check these docs:
https://cloud.google.com/appengine/docs/python/tools/localunittesting#Python_Writing_task_queue_tests
http://googleappengine.googlecode.com/svn/trunk/python/google/appengine/api/taskqueue/taskqueue_stub.py
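Following the linked unit-testing page, here is a minimal sketch using the testbed task queue stub together with the deferred library; run_my_function is a stand-in for whatever you actually enqueue:

import unittest

from google.appengine.ext import deferred
from google.appengine.ext import testbed

CALLS = []

def run_my_function(arg):
    # Stand-in for the function under test.
    CALLS.append(arg)

class TaskQueueTest(unittest.TestCase):
    def setUp(self):
        self.testbed = testbed.Testbed()
        self.testbed.activate()
        # root_path should point at the directory holding queue.yaml, if any.
        self.testbed.init_taskqueue_stub(root_path='.')
        self.stub = self.testbed.get_stub(testbed.TASKQUEUE_SERVICE_NAME)

    def tearDown(self):
        self.testbed.deactivate()

    def test_deferred_task_runs(self):
        deferred.defer(run_my_function, 'some-arg')
        tasks = self.stub.get_filtered_tasks()
        self.assertEqual(1, len(tasks))
        # Execute the task body ourselves and assert on its effects.
        deferred.run(tasks[0].payload)
        self.assertEqual(['some-arg'], CALLS)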

Search support for Google App Engine Go runtime

There is (experimental) search support for Python and Java, and eventually Go may also be supported. Until then, how can I do a minimal search on my records?
Through the mailing list, I got the idea of proxying search requests to a Python backend. I am still evaluating GAE and have not used backends yet. To set up search with a Python backend, do I have to send all the requests (from Go) to the datastore through this backend? How practical is that, and what are the disadvantages? Any tutorial on this?
Thanks.
You could make a RESTful Python app with a few handlers, and your Go app would make urlfetches to it. You can run the Python app as either a backend or a frontend (with a different version than your Go app). The first handler would receive a key as input, fetch that entity from the datastore, and store the relevant info in the search index. The second handler would receive a query, run a search against the index, and return the results. You would also need a handler for removing documents from the search index, plus any other operations you want.
Instead of the first handler receiving a key and fetching from the datastore, you could also just send it the entity data directly in the fetch.
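Here is a minimal sketch of those two Python handlers using the Search API; the index name, routes, and field names are made up for illustration:

import webapp2
from google.appengine.api import search

INDEX_NAME = 'records'  # hypothetical index name

class IndexHandler(webapp2.RequestHandler):
    def post(self):
        # The Go app POSTs the entity data (or a key to fetch) here.
        doc = search.Document(
            doc_id=self.request.get('key'),
            fields=[search.TextField(name='body',
                                     value=self.request.get('body'))])
        search.Index(name=INDEX_NAME).put(doc)

class SearchHandler(webapp2.RequestHandler):
    def get(self):
        # The Go app urlfetches this with a ?q= query string.
        results = search.Index(name=INDEX_NAME).search(self.request.get('q'))
        for doc in results:
            self.response.write(doc.doc_id + '\n')

app = webapp2.WSGIApplication([
    ('/index', IndexHandler),
    ('/search', SearchHandler),
])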
You could also use a service like IndexDen for now (especially if you don't have many entities to index):
http://indexden.com/
When making urlfetches, keep in mind that the quotas currently apply even when requesting URLs from your own app. There are two issues in the tracker requesting that these quotas be removed or increased when communicating with your own apps, but there is no guarantee that will happen. See here:
http://code.google.com/p/googleappengine/issues/detail?id=8051
http://code.google.com/p/googleappengine/issues/detail?id=8052
There is full text search coming for the Go runtime very very very soon.

Multiple writes in Google App Engine

In PHP I could build a very long SQL statement to write a lot of data in one DB call.
Is there something similar in Google App Engine?
Can I build up the request somehow and then do just one mydata.put()?
db.put() can accept a list, so you can do db.put([entity1, entity2, entity3]).
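A minimal sketch of a batched write; the Greeting model is hypothetical:

from google.appengine.ext import db

class Greeting(db.Model):
    # Hypothetical model, just for illustration.
    content = db.StringProperty()

# Build the entities in memory, then write them all in a single batch call.
entities = [Greeting(content='hello %d' % i) for i in range(100)]
db.put(entities)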
