I'm testing Google App Engine and trying to run a simple function to upload files to either the Blobstore or Cloud Storage. I'm typing the Python code directly in the Cloud Shell of my instance. The code fails at this import:
from google.appengine.ext import blobstore
I get this error:
Traceback (most recent call last):
File "upload_test.py", line 1, in <module>
from google.appengine.api import users
ImportError: No module named 'google.appengine'
The documentation says: "You can use Google Cloud Shell, which comes with git and Cloud SDK already installed." Even so, I've tried installing a number of libraries:
gcloud components install app-engine-python
pip install google-cloud-datastore
pip install google-cloud-storage
pip install --upgrade google-api-python-client
I'm still getting the same error. How can I get the appengine library to work? Alternatively, is this the wrong method for creating an app that allows the user to upload files?
The google.appengine module is baked into the first-generation Python (2.7) runtime. It's not available to install via pip, in the second-generation (3.7) runtime, or in Cloud Shell.
The only way to use it is by writing and deploying a first-generation App Engine app.
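For reference, a minimal sketch of what that first-generation flow looks like (the webapp2 routes and handler names here are illustrative, not from the question):

import webapp2
from google.appengine.ext import blobstore
from google.appengine.ext.webapp import blobstore_handlers

class UploadFormHandler(webapp2.RequestHandler):
    def get(self):
        # The browser POSTs the file straight to this one-time URL,
        # so the bytes never pass through your handler code.
        upload_url = blobstore.create_upload_url('/upload')
        self.response.write(
            '<form action="%s" method="POST" enctype="multipart/form-data">'
            '<input type="file" name="file"><input type="submit"></form>'
            % upload_url)

class UploadHandler(blobstore_handlers.BlobstoreUploadHandler):
    def post(self):
        # get_uploads returns the BlobInfo records for the uploaded files.
        blob_info = self.get_uploads('file')[0]
        self.redirect('/serve/%s' % blob_info.key())

app = webapp2.WSGIApplication([
    ('/', UploadFormHandler),
    ('/upload', UploadHandler),
])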
Thanks @Dustin Ingram
I found the answer on this page.
The current "correct" way of uploading to Cloud Storage is to use google.cloud.storage. The tutorial I linked above explains how to implement it.
The impression I have, however, is that this uses twice the bandwidth of the solution via google.appengine. Originally, the front-end would receive an upload URL and send the file directly to the Blobstore (or to Cloud Storage). Now the front-end uploads to the back-end, which in turn uploads to Cloud Storage.
I'm not too worried, as I will not be dealing with excessively large files, but it seems strange that the ability to upload directly has been discontinued.
In any case, my problem has been solved.
Related
I am trying to generate a signed URL in Python 2.7 using the v4 signing process, as given here.
Below is the code given on that page:
import datetime

from google.cloud import storage

def generate_signed_url(bucket_name, blob_name):
    """Generates a v4 signed URL for downloading a blob.

    Note that this method requires a service account key file. You can not use
    this if you are using Application Default Credentials from Google Compute
    Engine or from the Google Cloud SDK.
    """
    # bucket_name = 'your-bucket-name'
    # blob_name = 'your-object-name'

    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(blob_name)

    url = blob.generate_signed_url(
        version="v4",
        # This URL is valid for 15 minutes
        expiration=datetime.timedelta(minutes=15),
        # Allow GET requests using this URL.
        method="GET",
    )

    print("Generated GET signed URL:")
    print(url)
    print("You can use this URL with any user agent, for example:")
    print("curl '{}'".format(url))
    return url
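Called with placeholder names, for example:

generate_signed_url('my-bucket', 'my-object.txt')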
This is how I am trying to import storage:
from google.cloud import storage
But I am getting this error:
File "<input>", line 1, in <module>
File "/Applications/PyCharm CE.app/Contents/plugins/python-ce/helpers/pydev/_pydev_bundle/pydev_import_hook.py", line 21, in do_import
module = self._system_import(name, *args, **kwargs)
ImportError: No module named google.cloud
I tried installing the google-cloud-storage library, and lots of other Google-specific libraries as well, but it's still giving the same import error.
I also tried uninstalling the libraries and installing them again, and still get:
ImportError: No module named google.cloud
ModuleNotFoundError: No module named 'google.cloud'
Edit: how can I generate a signed URL using Python 2.7 and App Engine?
I was running two Python versions on my Mac: python2.7 and python3. The libraries were already installed for python3 but were missing for python2.7.
Used below command to install the library:
python2.7 -m pip install --upgrade google-cloud-storage --user
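To confirm the package landed under the interpreter you intend to use, a quick sanity check:

python2.7 -m pip show google-cloud-storage
python2.7 -c "from google.cloud import storage; print(storage.__file__)"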
Answering based on your comments...
If you were using Python 3, you would have a virtual env and a requirements.txt file, and when you deploy your project to production, GAE would first install the contents of your requirements.txt file before running your app.
Since you're running Python 2, you don't have that requirements.txt file and virtual env concept with GAE. Your app has to be uploaded together with any third-party library you need (i.e. any library outside of the built-in ones has to be uploaded with your app). You do this via the 'lib' folder that I mentioned in the comments (instructions for creating the folder can be found here). I believe the instructions are simple enough to follow.
I would say to first try using the lib concept on your local machine (this would mean installing the cloud storage library to that folder). It also means you have to run your app with dev_appserver.py.
Note that when you install the google-cloud-storage client to the lib folder and run your app with dev_appserver.py, GAE will use the package in your lib folder instead of the one installed globally on your laptop.
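As a concrete sketch of that setup (assuming your app.yaml sits next to a folder named lib): install the client into the folder with

pip install -t lib google-cloud-storage

and then add an appengine_config.py beside app.yaml so the runtime picks the folder up:

# appengine_config.py
from google.appengine.ext import vendor

# Add any libraries installed in the "lib" folder.
vendor.add('lib')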
If it works (i.e. you're able to create signed urls on your local machine), then go ahead and deploy to production.
If you have problems creating the lib folder and installing packages to it, let me know.
I am trying to deploy a simple "hello world" application in the App Engine standard Go environment. There is one source file, hello.go, with one import: "google.golang.org/appengine".
I followed the documentation here to use the Admin API for deployment, but I got a "can't find import: google.golang.org/appengine" error when I checked the status of the deployment.
So I uploaded the google.golang.org library folder to Cloud Storage, but the documentation only seems to show how to include individual files from GCS buckets, which is not feasible for big imported libraries.
I know all this will work with "gcloud app deploy", but I have to use the Admin API and standard environment.
Is there a way to tell App Engine to "go get" imported libraries?
Is there a way to include folders (not single files) from a GCS bucket in the deployment config files?
The deployment works with gcloud because the appengine library is present in your GOPATH. gcloud fetches them from there.
In the case of deployment via the Admin API, to avoid including every file of your libraries one by one, the packages need to be present in Google Cloud Storage for a ZIP deployment. Alternatively, you can use Cloud Source Repositories, build the app with the CREATE call, and then deploy it from a container with the Cloud Build image.
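A rough sketch of the ZIP route using the Python client for the Admin API (project, service, runtime, and bucket names are placeholders, and the version body is trimmed to the relevant fields):

from googleapiclient import discovery

# Placeholders: 'my-project', 'default', 'my-bucket', and the version id.
appengine = discovery.build('appengine', 'v1')
version_body = {
    'id': 'v1',
    'runtime': 'go111',
    'deployment': {
        # The ZIP must contain your app *and* its vendored dependencies,
        # uploaded to Cloud Storage beforehand.
        'zip': {
            'sourceUrl': 'https://storage.googleapis.com/my-bucket/app.zip',
        },
    },
}
request = appengine.apps().services().versions().create(
    appsId='my-project', servicesId='default', body=version_body)
print(request.execute())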
I am building a RESTful API using Python 3.6, the Falcon Framework, Google App Engine, and Firebase Cloud Firestore. At runtime I am receiving the following error ...
File "E:\Bill\Documents\GitHubProjects\LetsHang-BackEnd\lib\google\cloud\firestore_v1beta1\_helpers.py", line 24, in <module> import grpc
File "E:\Bill\Documents\GitHubProjects\LetsHang-BackEnd\lib\grpc\__init__.py", line 22, in <module>
from grpc._cython import cygrpc as _cygrpc
ImportError: cannot import name cygrpc
When researching StackOverflow, I found an article regarding an AWS Lambda deployment, but it suggests a solution based on Docker, and Docker is not a viable solution for us. I also found an article off StackOverflow that suggests running "pip install grpcio". We tried that, without luck.
We build the App Engine dependencies using a requirements.txt file. This file has the following contents ...
falcon==1.4.1
google-api-python-client
google-cloud-firestore
firebase-admin
enum34
grpcio
We apply the requirements file using the command ...
pip install -t lib -r requirements.txt
The App Engine server is started with the command ...
dev_appserver.py .
The development environment is Windows 10.
You seem to be mixing up the GAE standard and flexible environments:
using Python 3.6 is only possible in the flexible environment (which, BTW, is fundamentally Docker-based)
installing app dependencies in the lib directory and using dev_appserver.py for local development are only applicable to the standard environment
Somewhat related: How to tell if a Google App Engine documentation page applies to the standard or the flexible environment
Ok. I will write up my findings just in case there's another fool like me.
First, Dan's response is correct. I was mixing standard and flexible environments. I had looked up a method for using the Falcon Framework with App Engine; as it turns out, the only article uses the standard environment. So that's how I wound up using dev_appserver.py. My app, however, is Python 3.6 and has dependencies that prevent stepping down to 2.7.
To develop locally for the flexible environment, you simply need to run as you normally would. In the case of Falcon Framework, that means using the Waitress wsgi server.
I find that it is a good practice to build and use a Python virtual environment. You use the virtualenv command for that. At deployment time, Google builds a docker container for the app in the cloud. To reconstruct all the necessary Python packages, you have to supply a requirements.txt file. If you have a virtual environment, then the requirements file is easily produced using pip freeze.
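A minimal sketch of that workflow (the package list is illustrative):

virtualenv env
source env/bin/activate
pip install falcon waitress google-cloud-firestore
pip freeze > requirements.txt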
I'm being told the Issuer attribute can't be found on the endpoints object, even though according to Google it should be there: Authenticating Users (Frameworks)
import endpoints

firebase_issuer = endpoints.Issuer(
    issuer='https://securetoken.google.com/YOUR-PROJECT-ID',
    jwks_uri='https://www.googleapis.com/service_accounts/v1/metadata/x509/securetoken@system.gserviceaccount.com')

@endpoints.api(
    name='echo',
    version='v1',
    issuers=[firebase_issuer])
This is in my backend API, which I want to allow Firebase authentication for.
I'm using Eclipse and the PyDev Google App Engine library to write this backend. I'm seeing the error message:
Undefined variable from import: Issuer
or
firebase_issuer = endpoints.Issuer(
AttributeError: 'module' object has no attribute 'Issuer'
when I run it
I had this problem as well. Basically, my interpreter was referencing the gcloud SDK files, which only have endpoints-1.0. The quick start has you install endpoints-2.0 to a lib directory that's uploaded with deployment, but the appengine_config.py script that includes it doesn't fire until deployment (I think). I.e. it's undefined locally.
To fix, I just installed endpoints-2.0 by running pip install -r requirements.txt, which points at your local interpreter (mine happens to be a virtualenv). This assumes you use the requirements.txt from the repo in the quickstart: google-endpoints==2.0.4.
This was easiest for me, but I believe there's a way to point a virtualenv at the lib directory that the quickstart has you create and target. This way, your local interpreter would be running off the same package list that GAE would be.
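If you go the lib-folder route instead, a hypothetical way to keep the two in sync is simply to install the same pinned list twice:

pip install -r requirements.txt          # local interpreter / virtualenv
pip install -t lib -r requirements.txt   # lib folder uploaded with the app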
Please help me solve this error. I'm doing this exercise on App Engine (https://developers.google.com/appengine/articles/prediction_service_accounts), but I'm stuck at step 6.2 because I get the error below (the deploy operation in step 6.1 is successful):
Traceback (most recent call last):
File "/base/data/home/apps/s~01prediction/1.367567721220366691/main.py", line 29, in <module>
from oauth2client.appengine import AppAssertionCredentials
ImportError: No module named appengine
The error on line 29 is:
from oauth2client.appengine import AppAssertionCredentials
Did you run step 3.2? That should have copied some folders into prediction-demo-skeleton. You should have a folder called oauth2client inside prediction-demo-skeleton. Take a look at the folders that are inside prediction-demo-full.
PS: a good practice before deploying is to run your app locally using dev_appserver.py.
The Google API Python Client now has a pre-packaged ZIP containing all dependencies that might make installation easier. See:
https://code.google.com/p/google-api-python-client/downloads/list
Select google-api-python-client-gae-1.1.zip for download. Unzip this file inside of your AppEngine app directory.
Along the lines of Sebastian's suggestion, it generally is a good idea to test locally using dev_appserver.py. In this case you should be able to get past the import issue; however, AppAssertionCredentials won't actually be able to generate any access tokens until it is deployed into a production environment, so it will be of limited use for you.