I am trying to use the Google Cloud Client Library on Google App Engine. However, the Cloud Client Library and the App Engine SDK use google as an import name, and there are naming conflicts. How do I get them to work together?
When I try importing a Google Cloud Client Library module, I get the following error:
>>> import google.cloud.datastore
Traceback (most recent call last):
File "<console>", line 1, in <module>
File "C:\[...]\libs\google\cloud\datastore\__init__.py", line 52, in <module>
from google.cloud.datastore.batch import Batch
ImportError: No module named cloud.datastore.batch
This import error occurs because the name google has already been imported from the App Engine SDK. This can be confirmed by running the command:
>>> print google.__path__
['C:\\Program Files (x86)\\Google\\Cloud SDK\\google-cloud-sdk\\platform\\google_appengine\\google']
Notice that the path points to the SDK.
Any ideas on how to resolve this name conflict?
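A common workaround is to vendor the Cloud Client Library under a lib/ folder and, in appengine_config.py, append lib/google to google.__path__ so that both distributions contribute submodules to the same package name. Since the real google packages aren't installable everywhere, here is a stdlib-only sketch of that mechanism; every name in it (demo_ns, appengine_like, cloud_like) is made up for the demo:

```python
import pathlib
import sys
import types

# Two install locations that both contribute submodules to one package
# name -- analogous to the SDK's "google" and pip's "google".
base = pathlib.Path("ns_demo")
sdk_dir = base / "sdk" / "demo_ns"
lib_dir = base / "lib" / "demo_ns"
for d in (sdk_dir, lib_dir):
    d.mkdir(parents=True, exist_ok=True)
(sdk_dir / "appengine_like.py").write_text("NAME = 'sdk'\n")
(lib_dir / "cloud_like.py").write_text("NAME = 'lib'\n")

# Register a package whose __path__ spans both directories; this is the
# same trick an appengine_config.py can apply to google.__path__.
pkg = types.ModuleType("demo_ns")
pkg.__path__ = [str(sdk_dir), str(lib_dir)]
sys.modules["demo_ns"] = pkg

# Submodules from both locations are now importable side by side.
from demo_ns import appengine_like, cloud_like
print(appengine_like.NAME, cloud_like.NAME)  # -> sdk lib
```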
I have access to AWS. Is it possible to do the following?
Can a Google Cloud Function run aws-cli commands?
Can Google App Engine kick off aws-cli commands?
For example:
#!/usr/bin/env python
import subprocess
import sys
import awscli.clidriver

def aws_demo(request):
    cmd = 'aws s3 ls'
    result = subprocess.run(
        cmd.split(" "),
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT
    )
    print(result.stdout.decode())
    return str(result.stdout.decode())
Error:
Traceback (most recent call last):
File "/env/bin/aws", line 19, in <module>
import awscli.clidriver
ModuleNotFoundError: No module named 'awscli'
Can a Google Cloud Function run aws-cli commands?
The command-line AWS tools aren't available or installable in Google Cloud Functions. Your best bet is to use the boto3 Python package instead as John suggested.
As far as managing credentials within boto3 is concerned, there are a number of ways to configure credentials, as described here.
Can Google App Engine kick off aws-cli commands?
As with Cloud Functions, I don't think it is possible to run AWS commands on App Engine using the command-line tool awscli.
I would highly recommend using boto3 here as well, since it keeps your approach consistent across the different GCP products.
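For reference, a boto3 equivalent of the handler above could look roughly like this. It's only a sketch, not tested against AWS: the import is deferred into the function body, boto3 must be pip-installed, and credentials come from environment variables or a shared config file as usual:

```python
def aws_demo(request):
    """List S3 buckets with boto3 instead of shelling out to awscli."""
    import boto3  # pip install boto3; reads credentials from env/config
    s3 = boto3.client("s3")
    names = [b["Name"] for b in s3.list_buckets()["Buckets"]]
    return "\n".join(names)
```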
I hope it helps.
I'm testing Google App Engine and trying to run a simple function to upload files to either the Blobstore or Cloud Storage. I'm typing the Python code directly in the Cloud Shell of my instance. The code is failing when I call:
from google.appengine.ext import blobstore
I get the error code:
Traceback (most recent call last):
File "upload_test.py", line 1, in <module>
from google.appengine.api import users
ImportError: No module named 'google.appengine'
Even though the documentation says "You can use Google Cloud Shell, which comes with git and Cloud SDK already installed," I've tried installing a bunch of libraries:
gcloud components install app-engine-python
pip install google-cloud-datastore
pip install google-cloud-storage
pip install --upgrade google-api-python-client
I'm still getting the same error. How can I get the appengine library to work? Alternatively, is this the wrong method for creating an app that allows the user to upload files?
The google.appengine module is baked into the first-generation Python (2.7) runtime. It's not available to install via pip, in the second-generation (3.7) runtime, or in Cloud Shell.
The only way to use it is by writing and deploying a first-generation App Engine app.
Thanks @Dustin Ingram
I found the answer in this page.
The current "correct" way of uploading to Cloud Storage is to use google.cloud.storage. The tutorial I linked above explains how to implement it.
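As an illustration, the server-side upload could look roughly like this (a sketch, assuming google-cloud-storage is pip-installed and that the bucket and blob names are placeholders):

```python
def upload_to_gcs(bucket_name, blob_name, data):
    """Upload a string/bytes payload to Cloud Storage from the back-end."""
    from google.cloud import storage  # pip install google-cloud-storage
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)
    blob.upload_from_string(data)
    return "gs://{}/{}".format(bucket_name, blob_name)
```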
The impression I have, however, is that this uses twice the bandwidth of the solution via google.appengine. Originally, the front-end would receive an upload URL and send the file directly to the Blobstore (or to Cloud Storage). Now the application uploads to the back-end, which in turn uploads to Cloud Storage.
I'm not too worried, as I will not be dealing with excessively large files, but it seems strange that the ability to upload directly has been discontinued.
In any case, my problem has been solved.
I am currently using appengine (standard environment) and datastore, we also have some CRON scripts running on our local servers that connect remotely to datastore and insert daily updates.
The main problem I am facing is with the new SDK:
https://cloud.google.com/appengine/docs/standard/python/download
After running the specified commands:
gcloud components install app-engine-python
The following script still does not work:
try:
    import dev_appserver
    dev_appserver.fix_sys_path()
except ImportError:
    pass

from google.appengine.ext.remote_api import remote_api_stub

PROJECT_ID = "lipexdb-test"

remote_api_stub.ConfigureRemoteApiForOAuth(
    '{}.appspot.com'.format(PROJECT_ID),
    '/_ah/remote_api/')  # notasecret

from google.appengine.ext import ndb

try:
    from google.cloud import bigquery as bq
except Exception, e:
    print "Cannot import bigquery"
    print e
Failing with the following message:
Traceback (most recent call last):
File "testenv.py", line 7, in <module>
from google.appengine.ext.remote_api import remote_api_stub
ImportError: No module named google.appengine.ext.remote_api
I used to be able to make it work using the "Original App Engine SDK for Python" (available at the same link above), but that causes a few issues (mainly two "google" libraries in the path, which then conflict with all google.cloud.* packages).
So does the new App Engine SDK not include the Python packages? And how can I make them work alongside the google.cloud (bigquery/pubsub) packages?
I'm trying to get google app engine to work on my Raspberry Pi. I keep getting this error.
Traceback (most recent call last):
File "main.py", line 26, in <module>
from google.appengine.ext.webapp.mail_handlers import InboundMailHandler
ImportError: No module named google.appengine.ext.webapp.mail_handlers
I downloaded google app engine and then ran these commands:
unzip google_appengine_1.9.40.zip
export PATH=$PATH:/home/pi/google_appengine/
The most trivial solution for such errors is to copy the required package into your project directory, but to be honest that is not the best way to resolve this one. You may use the Google App Engine SDK, which will take care of all that headache, or you can follow another way:
Create a folder in your project directory and call it lib
Add all required packages into this folder.
Create a .py file and name it appengine_config.py
Add the code snippet below into this file:
import sys
import os.path
sys.path.insert(0, os.path.join(os.path.dirname(__file__), 'lib'))
appengine_config.py gets loaded every time a new instance is started, and should take care of importing your modules.
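As a quick stdlib-only illustration of why this works (the directory and module names below are made up for the demo):

```python
import pathlib
import sys

# Build a throwaway lib/ directory containing one "vendored" module.
lib = pathlib.Path("demo_lib")
lib.mkdir(exist_ok=True)
(lib / "vendored_mod.py").write_text("GREETING = 'hello from lib'\n")

# The same insert appengine_config.py performs: lib/ is now searched
# first, so the vendored module becomes importable.
sys.path.insert(0, str(lib))
import vendored_mod
print(vendored_mod.GREETING)  # -> hello from lib
```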
Regards.
It appears you're trying to directly execute your main.py as a standalone application, which is not how GAE app code works.
You're supposed to get the development server (from the SDK you downloaded) to execute your app code on your development machine (on GAE it's the GAE infra doing that). See Using the Local Development Server.
Please help me solve this error. I'm doing this exercise on App Engine (https://developers.google.com/appengine/articles/prediction_service_accounts), but I'm stuck at step 6.2 because of the error below (the deploy operation in step 6.1 is successful):
Traceback (most recent call last):
File "/base/data/home/apps/s~01prediction/1.367567721220366691/main.py", line 29, in <module>
from oauth2client.appengine import AppAssertionCredentials
ImportError: No module named appengine
The error is on line 29:
from oauth2client.appengine import AppAssertionCredentials
Did you run step 3.2? That should have copied some folders into prediction-demo-skeleton. You should have a folder called oauth2client inside prediction-demo-skeleton. Take a look at the folders that are inside prediction-demo-full.
PS: a good practice before deploying is to run your app using the devappserver.
The Google API Python Client now has a pre-packaged ZIP containing all dependencies that might make installation easier. See:
https://code.google.com/p/google-api-python-client/downloads/list
Select google-api-python-client-gae-1.1.zip for download. Unzip this file inside of your AppEngine app directory.
Along the lines of Sebastian's suggestion, it is generally a good idea to test locally using the devappserver. In this case you should be able to get past the import issue; however, AppAssertionCredentials won't actually be able to generate any access tokens until the app is deployed to a production environment, so it will be of limited use for you.
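For completeness, once the oauth2client folder is vendored next to your app, the pattern from that article looks roughly like this. This is only a sketch: oauth2client is long deprecated, the imports are deferred into the function body, and the scope and API name/version follow the article rather than anything I have run:

```python
def make_prediction_service():
    """Build a Prediction API client using App Engine identity creds."""
    import httplib2
    from oauth2client.appengine import AppAssertionCredentials
    from apiclient.discovery import build

    # App Engine's service account asserts its own identity; no key file.
    credentials = AppAssertionCredentials(
        scope="https://www.googleapis.com/auth/prediction")
    http = credentials.authorize(httplib2.Http())
    return build("prediction", "v1.5", http=http)
```

Note that, as said above, the token minting inside AppAssertionCredentials only works once deployed, not under the devappserver.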