Using App Engine Datastore outside of main.py - google-app-engine

I'm trying to use the App Engine datastore in my application outside of one of the request handler scripts (like main.py or the other files you can specify in app.yaml). Is this possible? When I try to run it, it says my model class has no "put" method, but the same code works fine when I run it from a CGI request handler script (main.py or any other .py file I declare in app.yaml). Is there a way to give those modules datastore access without adding request handling to them? Thanks!
Edit my code:
users.py file:
from google.appengine.ext import db, webapp
class User(db.Model):
    email = db.EmailProperty()
    password = db.StringProperty()
main.py file:

from user import *

class CreateHandler(webapp.RequestHandler):
    def get(self):
        u = User()
        u.email = "email@email.com"
        u.password = "mypass"
        u.put()
It gives me this error:
File "........./main.py", line 75, in get
u.put()
AttributeError: User instance has no attribute 'put'

Yes, you can access the datastore from other scripts. You don't have to add request handling to them; that can stay in your main script. E.g., you can do something like this:
app.yaml:
- url: /.*
  script: main.py
main.py:
from SectionHandlers import *  # This imports classes from SectionHandlers.py

application = webapp.WSGIApplication([
    ("/section1/.*", Section1Handler),  # Map requests to handlers
    ("/section2/.*", Section2Handler),
], debug=True)
SectionHandlers.py:
from google.appengine.ext import db, webapp

class Section1Handler(webapp.RequestHandler):
    def get(self):
        # Code using 'db' here
        pass
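For example, a plain helper module can use the datastore without any request handling of its own, as long as it is called from code running inside a request. This is a minimal sketch; the file and function names are illustrative, and it assumes the User model lives in users.py as in the question:

# user_utils.py - no request handlers here at all
from users import User

def create_user(email, password):
    # Store a new User entity; callable from any handler that imports this module.
    u = User(email=email, password=password)
    u.put()
    return u

def count_users():
    # Run a datastore query from outside main.py.
    return User.all().count()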

Related

Google AppEngine Getting 403 forbidden trying to update cron.yaml

I am following the docs on how to back up the datastore using App Engine.
I am running a gcloud app deploy cron.yaml command on a GCE VM that is meant to update a cron job in App Engine. The GCE VM and the App Engine cron are in the same project, and I have granted App Engine Admin to the GCE VM via the default service account. When I run this on my local machine, it updates fine. However, on the GCE instance is where the issues arise.
Here are the files:

app.yaml:

runtime: python27
api_version: 1
threadsafe: true
service: cloud-datastore-admin

libraries:
- name: webapp2
  version: "latest"

handlers:
- url: /cloud-datastore-export
  script: cloud_datastore_admin.app
  login: admin
cron.yaml:

cron:
- description: "Daily Cloud Datastore Export"
  url: /cloud-datastore-export?namespace_id=&output_url_prefix=gs://<my-project-id>-bucket
  target: cloud-datastore-admin
  schedule: every 24 hours
cloud_datastore_admin.py (the handler module referenced in app.yaml):
import datetime
import httplib
import json
import logging
import webapp2

from google.appengine.api import app_identity
from google.appengine.api import urlfetch


class Export(webapp2.RequestHandler):
    def get(self):
        access_token, _ = app_identity.get_access_token(
            'https://www.googleapis.com/auth/datastore')
        app_id = app_identity.get_application_id()
        timestamp = datetime.datetime.now().strftime('%Y%m%d-%H%M%S')

        output_url_prefix = self.request.get('output_url_prefix')
        assert output_url_prefix and output_url_prefix.startswith('gs://')
        if '/' not in output_url_prefix[5:]:
            # Only a bucket name has been provided - no prefix or trailing slash
            output_url_prefix += '/' + timestamp
        else:
            output_url_prefix += timestamp

        entity_filter = {
            'kinds': self.request.get_all('kind'),
            'namespace_ids': self.request.get_all('namespace_id')
        }
        request = {
            'project_id': app_id,
            'output_url_prefix': output_url_prefix,
            'entity_filter': entity_filter
        }
        headers = {
            'Content-Type': 'application/json',
            'Authorization': 'Bearer ' + access_token
        }
        url = 'https://datastore.googleapis.com/v1/projects/%s:export' % app_id
        try:
            result = urlfetch.fetch(
                url=url,
                payload=json.dumps(request),
                method=urlfetch.POST,
                deadline=60,
                headers=headers)
            if result.status_code == httplib.OK:
                logging.info(result.content)
            elif result.status_code >= 500:
                logging.error(result.content)
            else:
                logging.warning(result.content)
            self.response.status_int = result.status_code
        except urlfetch.Error:
            logging.exception('Failed to initiate export.')
            self.response.status_int = httplib.INTERNAL_SERVER_ERROR


app = webapp2.WSGIApplication(
    [
        ('/cloud-datastore-export', Export),
    ], debug=True)
The error I'm getting is:
Configurations to update:
descriptor: [/usr/local/sbin/pluto/<my-project-id>/datastore/cron.yaml]
type: [cron jobs]
target project: [<my-project-id>]
Do you want to continue (Y/n)?
Updating config [cron]...
failed.
ERROR: (gcloud.app.deploy) Server responded with code [403]:
Forbidden Unexpected HTTP status 403.
You do not have permission to modify this app (app_id=u'e~<my-project-id>').
I have checked other posts related to this; however, they seem to deal with an older version/deployment of App Engine.
Service Accounts!
From Deploying using IAM roles:
To grant a user account the ability to deploy to App Engine:
Click Add member to add the user account to the project and then select all of the roles for that account by using the dropdown menu:
Required roles to allow an account to deploy to App Engine:
a. Set one of the following roles:
   Use the App Engine > App Engine Deployer role to allow the account to deploy a version of an app.
   To also allow the dos.yaml or dispatch.yaml files to be deployed with an app, use the App Engine > App Engine Admin role instead.
   The user account now has adequate permission to use the Admin API to deploy apps.
b. To allow use of App Engine tooling to deploy apps, you must also give the user account the Storage > Storage Admin role so that the tooling has permission to upload to Cloud Storage.
Optional. Give the user account the following roles to grant permission for uploading additional configuration files:
   Cloud Scheduler > Cloud Scheduler Admin role: Permissions for uploading cron.yaml files.
Potentially of interest:
Deployments with predefined roles
Predefined roles comparison matrix
Okay, after some tinkering, I added the Project Editor role to the service account linked to the GCE instance running my server. I am not sure whether this is the least-privileged role that makes this work.

Local unit testing Google Cloud Storage signed URL

I am writing a new application using App Engine and, as the docs suggest not using the Blobstore API, I'm using the Google Cloud Storage client (GCS). All is good, but I want to be able to return "signed URLs" to clients so they can get the GCS resources without passing through the application. I believe that is what signed URLs are for.
But how do I test that? I can successfully test GCS calls from the client, but I have no idea how to test the client's HTTP calls using urlfetch.
Below is a full test case that illustrates my issue:
import base64
import mimetypes
import urllib
import urllib2
from datetime import datetime, timedelta
import time

from google.appengine.api import app_identity
from google.appengine.datastore import datastore_stub_util
from google.appengine.ext import testbed
from google.appengine.ext import ndb

import unittest
import cloudstorage

# IS THIS RIGHT ?
GCS_API_ACCESS_ENDPOINT = 'http://localhost:8000/_ah/gcs'


def sign_url(bucket_object, expires_after_seconds=60):
    """ cloudstorage signed url to download cloudstorage object without login
        Docs : https://cloud.google.com/storage/docs/access-control?hl=bg#Signed-URLs
        API : https://cloud.google.com/storage/docs/reference-methods?hl=bg#getobject
    """
    # source: https://github.com/voscausa/appengine-gcs-signed-url/blob/05b8a93e2777679d40af62cc5ffce933216e6a85/sign_url.py
    method = 'GET'
    gcs_filename = urllib.quote(bucket_object)
    content_md5, content_type = None, None

    # expiration : number of seconds since epoch
    expiration_dt = datetime.utcnow() + timedelta(seconds=expires_after_seconds)
    expiration = int(time.mktime(expiration_dt.timetuple()))

    # Generate the string to sign.
    signature_string = '\n'.join([
        method,
        content_md5 or '',
        content_type or '',
        str(expiration),
        gcs_filename])

    signature_bytes = app_identity.sign_blob(signature_string)[1]
    google_access_id = app_identity.get_service_account_name()

    # Set the right query parameters. we use a gae service account for the id
    query_params = {'GoogleAccessId': google_access_id,
                    'Expires': str(expiration),
                    'Signature': base64.b64encode(signature_bytes)}

    # Return the built URL.
    result = '{endpoint}{resource}?{querystring}'.format(
        endpoint=GCS_API_ACCESS_ENDPOINT,
        resource=gcs_filename,
        querystring=urllib.urlencode(query_params))
    return result


FILE_DATA = "This is file contents."
MIME = "text/plain"


class TestGCS(unittest.TestCase):
    def setUp(self):
        self.testbed = testbed.Testbed()
        self.testbed.activate()
        self.policy = datastore_stub_util.PseudoRandomHRConsistencyPolicy(probability=0)
        self.testbed.init_datastore_v3_stub(consistency_policy=self.policy)
        self.testbed.init_app_identity_stub()
        self.testbed.init_memcache_stub()
        self.testbed.init_urlfetch_stub()
        self.testbed.init_blobstore_stub()
        ndb.get_context().clear_cache()

    def tearDown(self):
        self.testbed.deactivate()

    def test_gcs_works(self):
        with cloudstorage.open('/mybucket/test.txt', 'w', content_type=MIME) as f:
            f.write(FILE_DATA)
        with cloudstorage.open('/mybucket/test.txt', 'r') as f:
            data = f.read()
        print(data)
        self.assertEqual(data, FILE_DATA)

    def test_signurl(self):
        url = sign_url('/mybucket/test.txt')
        # FIXME: Not yet working as we have no idea on how to access local GCS during the test.
        result = urllib2.urlopen(url)
        self.assertEqual(200, result.code)
        self.assertEqual(FILE_DATA, result.read())
You can test GCS and service accounts in the SDK, but there is no local App Engine GCS service when you use a signed URL.
You can, however, test your local app with service accounts and Google Cloud services.
Service accounts make it very easy to authorize App Engine requests to other Google APIs and services.
To use a service account in the App Engine SDK, you have to add two undocumented options when you run the development server:
--appidentity_email_address=<SERVICE_ACCOUNT_EMAIL_ADDRESS>
--appidentity_private_key_path=<PEM_KEY_PATH>
More info in this request-for-documentation issue.
You can create or find the service account in the permissions section of the Developers Console for your App Engine cloud project.
You can also create and download a p12 key for the service account.
Use OpenSSL to convert this p12 key into an RSA PEM key.
I used this OpenSSL installer for Windows.
To create the pem key file in Windows use:
openssl pkcs12 -in <P12_KEY_PATH> -nocerts -nodes -passin pass:notasecret | openssl rsa -out <PEM_KEY_PATH>
Now you can use your cloud app service accounts in the development server and use app_identity to sign and authorize requests.
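A quick way to check that the flags took effect is a tiny handler that logs what app_identity reports while the development server is running. This is a minimal sketch; the handler name and route are illustrative:

import logging
import webapp2
from google.appengine.api import app_identity

class WhoAmI(webapp2.RequestHandler):
    def get(self):
        # With --appidentity_email_address and --appidentity_private_key_path set,
        # this should be the real service account e-mail rather than the
        # development stub account.
        account = app_identity.get_service_account_name()
        logging.info('service account: %s', account)
        self.response.write(account)

app = webapp2.WSGIApplication([('/whoami', WhoAmI)], debug=True)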

Google App Engine Admin SDK Reports API returning 403 Insufficient Permission error

I started with some sample code for App Engine from Google.
My app needs to use the Directory API and the Reports API from the Google Admin SDK.
I have created a project in the API Console and turned on the Admin SDK in Services.
I added the scopes (the same ones as used in the code below) to the "Manage API client access" section of Advanced Tools in my domain's Google cpanel.
The call to the Directory API works.
After that, the call to the Reports API fails with the error message:
"HttpError: https://www.googleapis.com/admin/reports/v1/activity/users/all/applications/admin?alt=json returned "Insufficient Permission">"
Thanks much for the assistance.
import webapp2
import os
from apiclient.discovery import build
from oauth2client.appengine import OAuth2Decorator
from oauth2client.appengine import OAuth2DecoratorFromClientSecrets
from apiclient import errors
import logging
import json

decorator = OAuth2DecoratorFromClientSecrets(
    os.path.join(os.path.dirname(__file__), 'client_secrets.json'),
    'https://www.googleapis.com/auth/admin.directory.user.readonly')

directoryauthdecorator = OAuth2Decorator(
    client_id='123.apps.googleusercontent.com',
    client_secret='456-abc',
    callback_path='/oauth2callback',
    scope='https://www.googleapis.com/auth/admin.directory.user.readonly '
          'https://www.googleapis.com/auth/admin.reports.audit.readonly '
          'https://www.googleapis.com/auth/admin.reports.usage.readonly'
)


class MainHandler(webapp2.RequestHandler):
    def get(self):
        self.response.write('Hello world!')


class OAuthHandler(webapp2.RequestHandler):
    @directoryauthdecorator.oauth_required
    def get(self):
        users = []
        # Get the authorized Http object created by the decorator.
        auth_http = directoryauthdecorator.http()
        # Get the directory service
        service = build("admin", "directory_v1", http=auth_http)
        result = []
        page_token = None
        while True:
            try:
                param = {}
                param['domain'] = 'mydomain.com'
                if page_token:
                    param['pageToken'] = page_token
                files = service.users().list(**param).execute()
                result.extend(files['users'])
                page_token = files.get('nextPageToken')
                if not page_token:
                    break
            except errors.HttpError, error:
                print 'An error occurred: %s' % error
                break
        users = []
        for user in result:
            logging.info(user['primaryEmail'])
            users.append(user['primaryEmail'])
        param = {}
        param['userKey'] = 'all'
        param['applicationName'] = 'admin'
        service = build('admin', 'reports_v1', http=auth_http)
        # this call fails with the 403 Insufficient Permissions error
        results = service.activities().list(**param).execute()
        logging.info(results)


app = webapp2.WSGIApplication([
    ('/', MainHandler),
    ('/users', OAuthHandler),
    (directoryauthdecorator.callback_path, directoryauthdecorator.callback_handler()),
], debug=True)
I read this post and cleared the credentials from the datastore.
Hitting the /users URL again, I got the redirect_uri error message.
I went back to the API project, fixed the redirect URIs, and downloaded the client_secrets.json file.
Now both calls work (one to the Directory API, the other to the Reports API).
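For reference, here is a minimal sketch of clearing the stored OAuth credentials, assuming the decorator saved its tokens under oauth2client's default CredentialsModel kind (the function name is illustrative). Doing this forces the next request to re-run the OAuth flow with the updated scopes:

from google.appengine.ext import db
from oauth2client.appengine import CredentialsModel

def clear_stored_oauth_credentials():
    # Delete every cached credential entity so users re-authorize with the new
    # scope list on their next visit (assumes the default CredentialsModel kind).
    db.delete(CredentialsModel.all(keys_only=True))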

Error in deployed GAE RequestHandler using Webapp2

I am using the webapp2 framework on Google App Engine, and I'm getting a basic error in one of my request handlers.
The app runs fine on the local instance, but the deployed version on Google App Engine produces the traceback shown below.
Here's the code:
import os
from google.appengine.ext.webapp import template
import webapp2
import logging


class MainHandler(webapp2.RequestHandler):
    def get(self):
        logging.info('hi there 34')
        template_values = {}
        self.response.out.write('hello world 4')
        path = os.path.join(os.path.dirname(__file__), 'index.html')
        ## This is the code that causes the bug ##
        self.response.out.write(template.render(path, template_values))
        ## ## ## ##


debug = os.environ.get('SERVER_SOFTWARE', '').startswith('Dev')

app = webapp2.WSGIApplication(
    [(r'/main', MainHandler)],
    debug=debug)


def main():
    app.run()
traceback error:

Traceback (most recent call last):
  File "/base/python27_runtime/python27_dist/lib/python2.7/wsgiref/handlers.py", line 86, in run
    self.finish_response()
  File "/base/python27_runtime/python27_dist/lib/python2.7/wsgiref/handlers.py", line 127, in finish_response
    self.write(data)
  File "/base/python27_runtime/python27_dist/lib/python2.7/wsgiref/handlers.py", line 202, in write
    assert type(data) is StringType,"write() argument must be string"
AssertionError: write() argument must be string
What does this error mean?
I think response does not take unicode data, so you have to encode it first:
content = template.render(path, template_values)
self.response.out.write(content.encode('utf-8'))
Also, I recommend Werkzeug. It works well on App Engine and makes life much easier: it helps deal with request and response data and URL routing, provides HTTP exceptions, has a great debugger for offline development, and more. I think Werkzeug is a must-have in every Python web developer's toolbox.

Why the "Server not found" error when I enter the address - http://localhost:8080/ on my browser

I am new to Google App Engine. I am able to start the app server from the Windows command prompt and it shows "Info: The server is running at http://localhost:8080/", but when I enter that address in my browser it shows a "Server not found" error.
There are quite a few reasons this could be happening. If you have already checked your firewall settings, I'm willing to bet there is an issue with your app.yaml file.
If your main Python script is called main.py and is located in the root directory of your application code, a working example of an app.yaml file is:

application: yourapplicationname
version: 1
runtime: python
api_version: 1

handlers:
- url: .*
  script: main.py
Also make sure that the routes within your main.py file are correct, e.g.:

from google.appengine.ext import webapp
from google.appengine.ext.webapp import util


class Main(webapp.RequestHandler):
    def get(self):
        self.response.out.write('Hello World')


def main():
    application = webapp.WSGIApplication([('/', Main)],
                                         debug=True)
    util.run_wsgi_app(application)


if __name__ == '__main__':
    main()
