Is it possible to execute a task inside Google App Engine taskqueue? - google-app-engine

I want to unit-test a function that runs inside a Google App Engine task queue. I would like to know if there is any method by which we can execute the tasks in the task queue, to test whether they yield the desired output.

You need to manually dequeue the task from the queue and POST to its URL with all the params. Check these docs:
https://cloud.google.com/appengine/docs/python/tools/localunittesting#Python_Writing_task_queue_tests
http://googleappengine.googlecode.com/svn/trunk/python/google/appengine/api/taskqueue/taskqueue_stub.py
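For illustration, a hedged sketch of a task queue test using the SDK's testbed; the '/worker' URL and its params are assumptions, and executing the task body is left to your own test client:

import unittest

from google.appengine.api import taskqueue
from google.appengine.ext import testbed

class TaskQueueTest(unittest.TestCase):
    def setUp(self):
        self.testbed = testbed.Testbed()
        self.testbed.activate()
        self.testbed.init_taskqueue_stub()
        self.stub = self.testbed.get_stub(testbed.TASKQUEUE_SERVICE_NAME)

    def tearDown(self):
        self.testbed.deactivate()

    def test_worker_task(self):
        # Enqueue a task, then pull it back out of the stub.
        taskqueue.add(url='/worker', params={'key': 'value'})
        tasks = self.stub.get_filtered_tasks()
        self.assertEqual(1, len(tasks))
        # To execute it, POST tasks[0].payload to tasks[0].url yourself,
        # e.g. with a webtest TestApp wrapping your WSGI application.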

Related

How can I check for how long the instance has been running on google app engine?

I scheduled a cron job to hit an empty endpoint just to keep the instance running. How do I check how long the instance has been running, to make sure it's the same instance that has been running since the start?
Got it: the Instances menu under App Engine shows information about each instance, including its start time.
You have three ways to check how long the instance has been running.
The first is through the Instances menu, as you said.
The second is through the REST API; the start time is one of the JSON attributes in the response (see the example below).
The third is through the gcloud command line, with the command:
gcloud app instances describe --service=SERVICE --version=VERSION NAME
That returns information similar to the API response, including the start time.
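For the REST route, the call is the App Engine Admin API's instances.get method; a sketch, where APP_ID, SERVICE, VERSION, and INSTANCE are placeholders you fill in:

curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://appengine.googleapis.com/v1/apps/APP_ID/services/SERVICE/versions/VERSION/instances/INSTANCE"

The startTime attribute in the JSON response is the value you want.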

Setting dataflow job dependencies using Cloud function and App engine

Just a thought: my first job will be triggered by a Cloud Function on any file-arrival event, and I will capture its job ID in the Cloud Function itself. Once I have the job ID, I will pass it to App Engine code, which will monitor that particular job for completion. Once the App Engine code sees the 'success' status of that job, it will trigger another job that depends on the prior job's successful completion.
Any idea whether this can be made possible? If yes, any help with code samples would be highly appreciated.
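A minimal sketch of the monitoring step described above, in Python, assuming the google-api-python-client library; launch_second_job is a hypothetical trigger for the dependent job:

from googleapiclient.discovery import build

def check_and_chain(project_id, job_id):
    # Look up the first job's state via the Dataflow REST API (v1b3).
    dataflow = build('dataflow', 'v1b3')
    job = dataflow.projects().jobs().get(
        projectId=project_id, jobId=job_id).execute()
    if job['currentState'] == 'JOB_STATE_DONE':
        launch_second_job()  # hypothetical: kick off the dependent job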

memcache and a custom DoFn with dataflow

I'm trying to use Google memcache with Dataflow. Essentially, I'd like to write transformed data into memcache. Is it possible to use the Google memcache API inside of Dataflow?
I get the following error:
java.util.concurrent.ExecutionException: com.google.apphosting.api.ApiProxy$CallNotFoundException: The API package 'memcache' or call 'Set()' was not found. com.google.appengine.api.utils.FutureWrapper.setExceptionResult(FutureWrapper.java:65)
This is the line of code:
import com.google.appengine.api.memcache.*; // requires the App Engine API jar on the classpath
AsyncMemcacheService asyncCache = MemcacheServiceFactory.getAsyncMemcacheService("namespace");
asyncCache.put("key", "value", Expiration.byDeltaSeconds(100000)).get();
I think memcache is part of App Engine and not directly accessible outside of it. As a result you won't be able to access it from Dataflow directly. What you could do is create an App Engine service that acts as a proxy and send requests to it from Dataflow.
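A hedged sketch of such a proxy as a Python App Engine handler; the '/cache' route and parameter names are assumptions, and Dataflow workers would POST to it over HTTP:

import webapp2
from google.appengine.api import memcache

class CacheProxy(webapp2.RequestHandler):
    def post(self):
        # Store the posted key/value in memcache on behalf of the caller.
        memcache.set(self.request.get('key'), self.request.get('value'),
                     time=100000)
        self.response.write('ok')

app = webapp2.WSGIApplication([('/cache', CacheProxy)])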

How to add initial or default data in App Engine

Hey guys, kind of a n00b in App Engine and I have been struggling with this: is there a way I can bulk-add default data to the Datastore?
I would like to create catalogs or example data, as well as users and permissions. I am not using the default App Engine users; instead I am using webapp2's session-based User auth model.
Thanks
You can use the bulkloader: https://developers.google.com/appengine/docs/python/tools/uploadingdata
Or upload data to the blobstore and move it to the datastore.
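Going the bulkloader route, the upload itself is one command once you have a config; a sketch, where bulkloader.yaml, data.csv, and the kind Catalog are assumptions:

appcfg.py upload_data --config_file=bulkloader.yaml --filename=data.csv --kind=Catalog --url=http://your-app-id.appspot.com/_ah/remote_api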
This is a large topic, but I am using Java code running in task queues to do this.
It is much easier to create random test and demo data through code.
It is much more friendly to unit testing.
It requires no dependencies; it is just code running and accessing the datastore.
It is sometimes easier to manipulate the datastore through code instead of scripts when logic is involved in the changes.
It allows us to upload new task definitions (Java classes) embedded in a new app version. We then trigger the task executions by calling a servlet URL, and these task classes are removed from the next app version.
And using tasks, you get around the request execution timeout: if a task is long-running, we split it into sequential tasks, and when a task completes, it queues the next one automatically (see the sketch below).
Of course, this requires a fair amount of coding, but it is really simple and flexible at the same time.
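As an illustration of the chaining pattern in Python (the answer above uses Java, but the idea is the same), a sketch using the deferred library; the Catalog kind and batch sizes are made up:

from google.appengine.ext import deferred, ndb

class Catalog(ndb.Model):  # hypothetical demo kind
    name = ndb.StringProperty()

def seed_catalogs(start=0, batch=100, total=1000):
    # Write one batch of demo entities.
    ndb.put_multi([Catalog(name='demo-%d' % i)
                   for i in range(start, min(start + batch, total))])
    if start + batch < total:
        # Queue the next batch as a new task to stay under the request timeout.
        deferred.defer(seed_catalogs, start + batch, batch, total)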

Multiple writes in Google App Engine

In PHP I could make a very long SQL statement to write a lot of data in one DB call.
Is there something similar in Google App Engine?
Can I build the request somehow and then do just one mydata.put()?
db.put can accept a list, so you can do db.put([entity1, entity2, entity3])
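For example, where Greeting is a hypothetical model:

from google.appengine.ext import db

class Greeting(db.Model):
    content = db.StringProperty()

# One RPC writes the whole batch, instead of one put() per entity.
db.put([Greeting(content='hello %d' % i) for i in range(100)])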
