Our production app (Python 2.7 standard environment) running on Google App Engine suddenly lost permission to write (create objects) to Google Cloud Storage, without any change to the code on our side.
The code is able to create new buckets, but not new objects within them.
It seems that the default App Engine service account is not being granted the permission.
Needless to say, the service account has the Storage Object Creator role, as well as the Editor role at the project level.
Strangely, the exact same code running on the test environment project, continues to work perfectly.
We are using the API client library to obtain credentials, like so:
import httplib2
from oauth2client.appengine import AppAssertionCredentials
from apiclient.discovery import build as discovery_build

scope = 'https://www.googleapis.com/auth/devstorage.read_write'  # e.g. the GCS read/write scope
credentials = AppAssertionCredentials(scope=scope)
http = credentials.authorize(httplib2.Http())
service = discovery_build('storage', 'v1', http=http)
We then use the service to make the API calls.
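For reference, a create-object call through that service object looks roughly like this (the bucket and object names below are placeholders, not the real ones):
from apiclient.http import MediaIoBaseUpload
import io

# Upload a small object into an existing bucket.
media = MediaIoBaseUpload(io.BytesIO(b'hello'), mimetype='text/plain')
service.objects().insert(
    bucket='my-bucket',       # placeholder bucket name
    name='some/object.txt',   # placeholder object name
    media_body=media
).execute()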
All calls to create objects are suddenly failing with the message:
"Anonymous caller does not have storage.objects.create access to /"
Any ideas what could suddenly have gone wrong?
This turned out to be an issue with Google Cloud Storage (GCS).
A paid support ticket was opened. After approximately 90 hours, Google's GCS engineers performed a rollback that solved the issue; however, the root cause was never found or reported.
It is very troubling that a production app can be affected this way for such a long time and, in the end, receive no explanation.
In Google App Engine Flexible Environment, I'd like to see which Version of my Service is the default.
The list operation of the Admin API does not show it:
https://appengine.googleapis.com/v1/apps/myapp/services/myservice/versions
Neither does the get operation:
https://appengine.googleapis.com/v1/apps/myapp/services/myservice/versions/myversion
In the Google App Engine API, I can use
ModulesService modulesService = ModulesServiceFactory.getModulesService();
String currentModuleName = modulesService.getCurrentModule();
String version = modulesService.getDefaultVersion(currentModuleName);
But that is not available in Flexible Environment.
How do I do this in the Admin API?
I believe it is because App Engine has stopped referring to a specific version as the default one.
App Engine no longer seems to have a way to set a default version of a service, since that feature was replaced by "Migrate traffic".
You can use the apps.services.get method to get the current traffic split for a service.
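For example, here is a hedged sketch using the Python discovery client (the app and service IDs are the placeholders from the question, and the choice of Application Default Credentials is my own assumption; any credentials allowed to call the Admin API will do):
import httplib2
from apiclient.discovery import build
from oauth2client.client import GoogleCredentials

# Assumed auth: Application Default Credentials of an account that may call the Admin API.
credentials = GoogleCredentials.get_application_default()
http = credentials.authorize(httplib2.Http())

admin = build('appengine', 'v1', http=http)
svc = admin.apps().services().get(appsId='myapp', servicesId='myservice').execute()

# 'split.allocations' maps version IDs to their share of traffic,
# e.g. {u'20170601t123456': 1.0} means that version receives all requests.
print(svc['split']['allocations'])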
I've created a project and was following tutorials to enable the Drive API and Drive SDK, but the Drive SDK isn't showing in the console.
You can use Google APIs such as Google Drive, Calendar, etc. in Google App Engine (GAE). Unfortunately, they are not directly linked, but you can use the google-api-python-client library (which you have to install).
You can follow the Python Quickstart to start integrating GAE with the Google APIs.
Here is the code given in the answer to a previous SO question:
import httplib2
from apiclient import discovery

credentials = get_credentials()  # your function to request / access stored credentials

# Authorize access to Drive using the user's credentials
http = credentials.authorize(httplib2.Http())

# The service object is the gateway to your API functions
service = discovery.build('drive', 'v2', http=http)

# Run your requests using the service object, e.g. list the first 10 files:
results = service.files().list(maxResults=10).execute()
# ... do something with results
Also included in the SO question is the API reference for Google Drive.
I hope this helps you. :)
This is specifically a question about server-to-server authorisation between a Python Google App Engine app and Google's BigQuery, but it could be relevant for other cloud services.
tldr; Is it possible to get the App Engine local development server to authenticate with the remote BigQuery service? Better yet is there a local BigQuery?
I understand that AppAssertionCredentials does not currently work on the local development server, though that in itself is very frustrating.
The alternative method detailed here, which works for standard Python code outside of the local development server sandbox, does not work on the local development server, because even with PyCrypto enabled the sandbox does not allow some POSIX modules, e.g. 'pwd'.
I have got AppAssertionCredentials working on the remote server and the SignedJwtAssertionCredentials method working in native Python locally, so the service accounts are set up properly.
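For reference, the locally working native-Python path looks roughly like this (a sketch only; the service-account email and key path are placeholders):
import httplib2
from oauth2client.client import SignedJwtAssertionCredentials
from apiclient.discovery import build

# Placeholder key file and service-account address.
with open('/path/to/privatekey.pem') as f:
    key = f.read()

credentials = SignedJwtAssertionCredentials(
    'xxxx@developer.gserviceaccount.com',
    key,
    scope='https://www.googleapis.com/auth/bigquery')
http = credentials.authorize(httplib2.Http())
bigquery = build('bigquery', 'v2', http=http)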
The imports fail within oauth2client/crypt.py within the try/except blocks - after commenting them out the sandbox whitelist exceptions are easily seen.
I've fiddled around with adding 'pwd' to the whitelist, then another problem crops up, so I scurried back out of that rabbit hole.
I've tried including PyCrypto directly into the project with similar results.
I've also tried with OpenSSL with similar results.
I have looked for a local, App Engine-specific PyCrypto to no avail; have I missed one? I should say this is on Mac OS X - perhaps I should fire up a Linux box and give that a go?
A recent release of the Google App Engine SDK added support for the AppAssertionCredentials method on the development server. To use this method locally, add the following arguments to dev_appserver.py:
$ dev_appserver.py --help
...
Application Identity:
  --appidentity_email_address APPIDENTITY_EMAIL_ADDRESS
                        email address associated with a service account that
                        has a downloadable key. May be None for no local
                        application identity. (default: None)
  --appidentity_private_key_path APPIDENTITY_PRIVATE_KEY_PATH
                        path to private key file associated with service
                        account (.pem format). Must be set if
                        appidentity_email_address is set. (default: None)
To use these:
1. In Google Developer Console, select a project, then navigate to "API & auth" -> "Credentials" -> "Create new client ID".
2. Select "Service account" and follow the prompts to download the private key in PKCS12 (.p12) format. Take note of the email address for the service account.
3. Make sure you add that service account email address to the "Permissions" tab of any project containing data it needs to access; by default it is added to the project team in which it was created.
4. Convert the PKCS12 format to PKCS1 format using the following command:
   $ cat /path/to/xxxx-privatekey.p12 | openssl pkcs12 -nodes -nocerts -passin pass:notasecret | openssl rsa > /path/to/secret.pem
5. Start dev_appserver.py as:
   $ dev_appserver.py --appidentity_email_address xxxx@developer.gserviceaccount.com --appidentity_private_key_path /path/to/secret.pem ...
Use the appidentity module and AppAssertionCredentials locally in the same manner as you normally would in production.
Please ensure that /path/to/secret.pem is outside of your application source directory so that it is not accidentally deployed as part of your application.
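With those flags set, something like the following sketch should run unchanged on the dev server and in production (the project ID is a placeholder, and BigQuery is just used here as an example service):
import httplib2
from oauth2client.appengine import AppAssertionCredentials
from apiclient.discovery import build

# The app identity (real in production, simulated locally via the flags above)
# is exchanged for an access token with the requested scope.
credentials = AppAssertionCredentials(scope='https://www.googleapis.com/auth/bigquery')
http = credentials.authorize(httplib2.Http())

bigquery = build('bigquery', 'v2', http=http)
datasets = bigquery.datasets().list(projectId='my-project').execute()  # placeholder project ID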
Searching deeper for PyCrypto and the local App Engine sandbox led me to this thread, and this response specifically...
https://code.google.com/p/googleappengine/issues/detail?id=1627#c22
This is fixed in 1.7.4. However, you must use easy_install -Z
(--always-unzip) to install PyCrypto. The default zipfile option in
OSX 10.8 is incompatible with the sandbox emulation in the
dev_appserver.
The solution turns out to be very straight forward...
I used:
sudo easy_install pycrypto
and it should have been:
sudo easy_install -Z pycrypto
as per the thread above. Using pip will work as well:
pip install pycrypto
or a manual download and install of pycrypto will also work. I tested all three.
If you have installed pycrypto with easy_install and without the -Z flag, then you may want to install pip just so you can easily uninstall pycrypto...
easy_install pip
For the record, I built and installed libgmp, as pip and the manual install showed this warning...
warning: GMP or MPIR library not found; Not building
Crypto.PublicKey._fastmath.
Although this gave me fastmath, it was not essential to solve the problem, as the Crypto libs gracefully fall back to slowmath.
Another point that tripped me up for a bit was that I had removed pycrypto from app.yaml whilst testing to see if OpenSSL might give me all I need.
So don't forget to add...
- name: pycrypto
  version: latest
into app.yaml under the libraries: section.
With this missing, the native _counter library was not imported, hence Counter failed, etc.
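As a quick sanity check (my own addition, not from the original thread), imports like these should succeed inside the dev server sandbox once pycrypto is listed in app.yaml and installed with -Z:
# If pycrypto is wired up correctly, both the pure-Python and native pieces load.
from Crypto.Cipher import AES
from Crypto.Util import Counter

ctr = Counter.new(128)                                  # uses the native _counter module
cipher = AES.new(b'0' * 32, AES.MODE_CTR, counter=ctr)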
Also, for the record, any talk of having to move Crypto into the app folders themselves, or out of the default Mac OS X location of /Library/Python/2.7/site-packages/Crypto, was only valid in earlier versions of the dev server.
Similarly, there is now no need to edit any _WHITE_LIST_C_MODULES lists (the list lives in sandbox.py from App Engine 1.8 onwards and already includes the regex that allows Crypto.Util._counter etc.).
The other bit of the puzzle, in case you get here before discovering the key issue, is that the key file you download from the console is PKCS12 and is downloaded as hex text; I converted that to binary and then converted that to a PEM so I could include it in the source code.
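That hex-to-binary step looks roughly like this (the file names are hypothetical; the binary .p12 is then turned into a PEM with the openssl command from the answer above):
import binascii

# Read the hex text downloaded from the console and write out the raw .p12 bytes.
with open('privatekey.p12.hex') as f:
    hex_text = ''.join(f.read().split())   # strip any whitespace/newlines
with open('privatekey.p12', 'wb') as f:
    f.write(binascii.unhexlify(hex_text))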
I struggled with this one for a day or two. And I was finally able to get localhost working with server to server authentication, a service account and a .p12 cert.
If it's at all helpful to anyone, here's a simple gist: https://gist.github.com/dandelauro/7836962
I agree with the first post - the localhost/production impedance is a real pain in the a**. AppAssertionCredentials is the right way to go in production, and I don't want to have two different code paths between production and localhost. So the development environments need to be adjusted to perform the required authentication without affecting the main code path.
E.g., perhaps a developer could log in with their own Google account using appcfg.py, and that auth would then be cached for a period such that AppAssertionCredentials would work out. The developer's Google account could be granted permissions on the appropriate environments (dev and test for us, e.g.).
Re: "local BigQuery" - we have some initial stuff in place that uses SQLite to simulate BigQuery interactions for unit tests and other offline/local testing, but of course it's not a great simulation. I agree that all the Cloud Platform products need to spend as much time thinking about the development-time experience as App Engine has.
Is it possible to get the App Engine local development server to authenticate with the remote BigQuery service?
I think it's currently impossible to use AppAssertionCredentials as the authentication method between the BigQuery service and your local App Engine server.
Alternatively, I'm using OAuth2 authentication associated with a specific user (this user must be registered in your project in the Google API Console) to access BigQuery from the local App Engine server.
To get the user's OAuth2 authentication, I use the oauth2client.client module in the app code.
I hope this will be helpful to your problem.
Updated:
This is what I'm doing for getting the user OAuth2 authorization.
Edited:
Added missing import statement.
Thanks mattes!
import os
import webapp2
import httplib2
from oauth2client.client import OAuth2Credentials
from oauth2client.appengine import StorageByKeyName, CredentialsModel, OAuth2DecoratorFromClientSecrets
from google.appengine.api import users

oauth2_decorator = OAuth2DecoratorFromClientSecrets(
    os.path.join(os.path.dirname(__file__), 'client_secrets.json'),
    scope='https://www.googleapis.com/auth/bigquery')
oauth2_decorator._kwargs = {'approval_prompt': 'force'}

class TestPage(webapp2.RequestHandler):
    @oauth2_decorator.oauth_required
    def get(self):
        user_id = users.get_current_user().user_id()
        credentials = StorageByKeyName(CredentialsModel, user_id, 'credentials').locked_get()
        http = credentials.authorize(httplib2.Http())  # now you can use this http object to access the BigQuery service

application = webapp2.WSGIApplication([
    ('/', TestPage),
    (oauth2_decorator.callback_path, oauth2_decorator.callback_handler()),
], debug=True)
Suppose I uploaded another version (say, "version: 2" in the app.yaml file) of my Google App Engine application. Version 1 is still the default and version 2 is for testing. How do I run it then?
Once you upload a version to App Engine, you can switch between versions easily.
Say that your app name is myapp, currently running version 1. You also have uploaded a version called 2-testing. Your default app (with version 1) can be reached by accessing myapp.appspot.com
If you want to access your versions explicitly, you just need to access <version_name>-dot-myapp.appspot.com. Following the example, it would be:
1-dot-myapp.appspot.com or 2-testing-dot-myapp.appspot.com
The -dot- is equivalent to <version>.<appname> but allows you to correctly serve a secure application with SSL.
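As a hedged aside, if you need a version's hostname from inside the app itself, the Modules API can look it up:
from google.appengine.api import modules

# Hostname for a specific version of the current module (the version name is the example from above).
host = modules.get_hostname(version='2-testing')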
You can mark any version you want as default (serving myapp.appspot.com) using the admin console
Edit: this is the official documentation page talking about domains and subdomains in App Engine.
Under Versions in your admin console, you can find the live URI of a version if you select the version.
And you can use traffic splitting, where you can use your own client IP or a cookie to test a version.
Docs: https://developers.google.com/appengine/docs/adminconsole/trafficsplitting