I'm migrating an existing Python 2.7 repo, currently running on Google App Engine, to Python 3.7. I found that a runtime library (the Runtime Utilities API) is used extensively in the project:
from google.appengine.api.runtime import runtime
import logging
logging.info(runtime.memory_usage())
This will output memory usage statistics, where the numbers are expressed in MB. For example:
current: 464.0859375
average1m: 464
average10m: 379.575
I'm trying to find an alternative library compatible with Python 3.7, but haven't found any from GAE. Can somebody please help with this?
Is there a replacement from the Google side that I'm not aware of?
Unfortunately, the google.appengine.api.runtime.runtime module has been deprecated since version 1.8.1.
I also couldn't find any similar or equivalent official App Engine API for Python 3.
As an alternative, you can implement this feature directly in your own code. For instance, look at the answers to this question about how to get RAM & CPU stats with Python; some of them use the psutil library.
You can also consider using a Stackdriver agent, which can transmit data for the metric types listed on this page to Stackdriver, such as CPU (load, usage, etc.), disk (bytes used, io_time, etc.), and other metrics.
The following gets exactly the same memory usage values that the dashboard shows:
import psutil

def current():
    # Sum the active, inactive and buffered memory, converted from bytes to MB
    vm = psutil.virtual_memory()
    return (vm.active + vm.inactive + vm.buffers) / 1024 ** 2
If you want to minimize the cost of the transition, the following can be put in a new module and imported in place of the original Google interface:
import psutil


class MemoryUsage:
    # Minimal stand-in for the object returned by the original memory_usage() call

    @staticmethod
    def current():
        # Current memory usage in MB, matching what the dashboard reports
        vm = psutil.virtual_memory()
        return (vm.active + vm.inactive + vm.buffers) / 1024 ** 2


def memory_usage():
    return MemoryUsage()
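With that in place, call sites barely change. Assuming the module above is saved as, say, runtime_compat.py (the name is just an example), the original logging call only needs its import swapped:
# Hypothetical usage of the wrapper module above; "runtime_compat" is a placeholder name
import logging

from runtime_compat import memory_usage

logging.info(memory_usage().current())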
Related
Using Python 3.4 on Google App Engine Flex.
Google documentation on using pull queues with Python says to "from google.appengine.api import taskqueue", but does not explain how to make taskqueue available to the Python runtime.
They do link to "Easily Access Google API's from Python", which explains how to install the API client via "pip install google-api-python-client".
This does not install the taskqueue lib.
From the previous doc, there is a link to "Installation", where it says:
Because the Python client libraries are not installed in the App Engine Python runtime environment, they must be vendored into your application just like third-party libraries.
This links to another page, "Using third-party libraries", which states that you need to either install a lib into /lib or use requirements.txt. Neither of these makes taskqueue available.
Searching for taskqueue.py in Google's GitHub shows only an example module with the same name.
There is a documentation page on the module, but no information on how to install it.
There is a Python 2.7 example that Google points to here, but it doesn't work. There is no setup of taskqueue, no requirements.txt, no instructions.
There is a Stack Overflow question on this topic here, and the accepted answer says to install the SDK. That takes you here, which takes you here, which takes you here, which takes you here, which provides the gcloud SDK download for deploying and managing gcloud. This does not include the Python lib for taskqueue.
There is another similar Stack Overflow question here, which says:
... this is now starting to feel like an infinite loop. Yes, it's been made crystal clear you need to import the taskqueue. But how do you make it available?
I've asked Google Support this question and they haven't been able to answer in 4 days.
I've opened two issues, one here and another here. No answers yet.
Do not want to use Python < 3.4.
Do not want to use HTTP REST API.
Just want a simple pull queue.
Many of the docs you mentioned are standard environment docs and do not apply to the flexible environment.
From the Task Queue section in Migrating Services from the Standard Environment to the Flexible Environment:
The Task Queue service has limited availability outside of the standard environment. If you want to use the service outside of the standard environment, you can sign up for the Cloud Tasks alpha.
Outside of the standard environment, you can't add tasks to push queues, but a service running in the flexible environment can be the target of a push task. You can specify this using the target parameter when adding a task to a queue or by specifying the default target for the queue in queue.yaml.
In many cases where you might use pull queues, such as queuing up tasks or messages that will be pulled and processed by separate workers, Cloud Pub/Sub can be a good alternative as it offers similar functionality and delivery guarantees.
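If you go the Pub/Sub route, a pull-queue-style workflow looks roughly like this. This is only a sketch, assuming the google-cloud-pubsub Python client; the project, topic and subscription names are placeholders, and exact method signatures vary between client library versions:
# Minimal sketch of a pull-queue-style workflow on Cloud Pub/Sub.
# Assumes `pip install google-cloud-pubsub`; all names below are placeholders.
from google.cloud import pubsub_v1

project_id = "my-project"

# Producer side: publish a "task" as a message.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(project_id, "work-topic")
publisher.publish(topic_path, data=b"process-order-1234")

# Worker side: pull a batch of messages, handle them, then acknowledge.
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(project_id, "work-subscription")
response = subscriber.pull(subscription_path, max_messages=10)
for received in response.received_messages:
    print(received.message.data)  # replace with your own processing
if response.received_messages:
    subscriber.acknowledge(
        subscription_path,
        [m.ack_id for m in response.received_messages],
    )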
I have a question about using TensorFlow on Google Cloud Platform.
I heard that Google Cloud's TensorFlow doesn't support Keras (keras.io). However, I can now see that TensorFlow has its own API to access Keras (https://www.tensorflow.org/api_docs/python/tf/contrib/keras).
Given this, can I use the above-mentioned API inside Google Cloud, since it ships along with the TensorFlow package?
I am able to access this API from the TensorFlow installed on an Anaconda machine.
Option 1: try the --package-path option.
As per the docs:
--package-path=PACKAGE_PATH
"Path to a Python package to build. This should point to a directory containing the Python source for the job"
Try giving a relative path to keras from your main script.
More details here:
https://cloud.google.com/sdk/gcloud/reference/beta/ml-engine/local/train
Option 2: if you have a setup.py file
Inside your setup.py file, pass the argument install_requires=['keras'] in the setup() call.
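For example, a minimal setup.py along these lines (the package name 'trainer' is just an example):
# Sketch of a trainer package setup.py declaring Keras as a dependency,
# so it gets installed on the training workers.
from setuptools import setup, find_packages

setup(
    name='trainer',              # example package name
    version='0.1',
    packages=find_packages(),
    install_requires=['keras'],
)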
Google Cloud Machine Learning Engine does support Keras (keras.io), but you have to list it as a dependency when starting a training job. For some specific instructions, see this SO post, or a longer exposition on this blog page. If you'd like to serve your model on Google Cloud Machine Learning or using TensorFlow Serving, then see this SO post about exporting your model.
That said, you can also use tf.contrib.keras, as long as you use the --runtime-version=1.2 flag. Just keep in mind that packages in contrib are experimental and may introduce breaking API changes between versions.
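As a rough illustration, with --runtime-version=1.2 the bundled Keras API can be used like this (just a sketch; the layer sizes and input shape here are arbitrary):
import tensorflow as tf

# Keras as bundled with TensorFlow 1.2, under tf.contrib
layers = tf.contrib.keras.layers
models = tf.contrib.keras.models

model = models.Sequential()
model.add(layers.Dense(64, activation='relu', input_shape=(20,)))
model.add(layers.Dense(1, activation='sigmoid'))
model.compile(optimizer='adam', loss='binary_crossentropy')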
Have a look at this example on GitHub, which I saw was recently added:
Keras Cloud ML Example
We are considering using Google Cloud Storage as an alternative to AWS, and so are planning to do some performance testing on GCS. One of the features we would like to test is searching for files at a certain path. Unfortunately, the SDK does not have the ability to search for a prefix. Instead, we are forced to use the Java client API. Here is the relevant code which is failing:
GcsService gcsService = GcsServiceFactory.createGcsService(RetryParams.getDefaultInstance());
AppIdentityService appIdentity = AppIdentityServiceFactory.getAppIdentityService();
ListOptions.Builder b = new ListOptions.Builder();
b.setRecursive(true);
b.setPrefix("folder/");
ListResult result = gcsService.list("rms-test-bucket", b.build());
Specifically, the code fails on the call to gcsService.list() with a NullPointerException. I attached all sources in IntelliJ, stepped through the code, and found that the cause was a call to ApiProxy.getDelegate() returning null when it should have returned a non-null value.
We suspect that there is a configuration problem somewhere although it is not clear what it might be.
Where are you running that code from? That code should be run in App Engine standard or App Engine flexible compat (as that API is App Engine specific). For all other cases you should use the google-cloud-java client. In fact, I would suggest using that client even on App Engine, as it is supported on all platforms and is much richer in functionality. For more information see here.
I'm not entirely sure what's wrong with your example, but if your goal is strictly to test GCS performance with searching for files at a certain path, the gsutil command-line utility contains a solid implementation of that logic. You could use it to evaluate performance. If you're testing from a GCE instance, it's already preinstalled.
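For example, something like the following (reusing the bucket and prefix from the question) exercises the same recursive prefix listing; prefixing it with the shell's time command gives a rough performance number:
# Recursive listing of everything under the "folder/" prefix
time gsutil ls -r gs://rms-test-bucket/folder/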
I need to run some statistical tests in my app, which needs functions from scipy.stats. However, I found that Google App Engine doesn't trust SciPy. So are there any GAE-supported libraries that can do some stats calculations, e.g. generate random numbers, estimate CDFs, run t-tests, check normality, etc.? Thanks!
The Python 2.7 runtime includes NumPy. It's a scientific library that can help you with what you want to do.
NumPy is a language extension that defines the numerical array and matrix types and basic operations on them.
More info on NumPy here
By the way:
However I found Google App Engine doesn't trust SciPy.
I think that should be:
GAE only supports pure Python code
Edit
For random numbers you can use Python's random module.
For CDFs, there is http://pysclint.sourceforge.net/pycdf/ or http://code.google.com/p/netcdf4-python/, but I am not sure whether they are pure Python. You can try them if you like.
Also take a look at http://www.astro.cornell.edu/staff/loredo/statpy/
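To illustrate what is possible without SciPy, here is a minimal sketch using only NumPy (untested on GAE itself):
import numpy as np

# Random numbers: 1000 draws from a standard normal distribution
samples = np.random.normal(loc=0.0, scale=1.0, size=1000)

# Empirical CDF: fraction of samples <= x
def ecdf(data, x):
    data = np.sort(data)
    return np.searchsorted(data, x, side='right') / float(len(data))

# Welch's t-statistic for two independent samples. Note that without SciPy
# you don't get a p-value, since that needs the CDF of the t-distribution.
def welch_t(a, b):
    va, vb = a.var(ddof=1), b.var(ddof=1)
    return (a.mean() - b.mean()) / np.sqrt(va / len(a) + vb / len(b))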
Another approach could be to have a home server running any Python module you please, then use a pull queue to communicate via REST with your "home" server and process the calculations there.
I am developing an application on Google App Engine using Python. I want to use the editdist module, so I am importing the editdist C Python module in my program, but it reports that the module editdist does not exist.
When I import editdist in my local application it works fine, but not in the Google App Engine application.
Can anyone suggest a way to import this module?
App Engine is a "pure Python" environment, and you cannot use any C extensions or other code that must be compiled.
There is therefore no way to use this program on App Engine, and all of the competing "production quality" python libraries I found were implemented as C modules.
There do exist alternate implementations of the Levenshtein distance algorithm, though none are as nearly as fast as editdist. These more naïve implementations might still be acceptable depending upon your needs.
Here are a couple of alternatives implemented in Python (I haven't tested them myself); a minimal sketch of the algorithm also follows the links:
http://www.korokithakis.net/node/87
http://code.activestate.com/recipes/576874-levenshtein-distance/
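For reference, this is the naïve dynamic-programming version of the Levenshtein distance in pure Python, much slower than the C editdist module but runnable inside App Engine's sandbox:
def levenshtein(a, b):
    # Classic dynamic-programming edit distance, keeping only two rows
    if len(a) < len(b):
        a, b = b, a
    previous = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        current = [i]
        for j, cb in enumerate(b, 1):
            insert_cost = current[j - 1] + 1
            delete_cost = previous[j] + 1
            substitute_cost = previous[j - 1] + (ca != cb)
            current.append(min(insert_cost, delete_cost, substitute_cost))
        previous = current
    return previous[-1]

# Example: levenshtein("kitten", "sitting") == 3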