GAE tasks: are the URLs secure by design? - google-app-engine

I'm trying to wrap my head around Google App Engine and, more specifically, Task Queues.
My question is about security. If I define a task handler in app.yaml like this:
- url: /queues/long-task
  script: urlhandlers.QueueLongTask.app
  login: admin
can I be sure that /queues/long-task can only be accessed by an admin AND the task system? I was not able to find a reference about this in the Google documentation.
Thank you in advance

You are correct, login: admin takes care of it.
Here you can find more info in the documentation:
https://cloud.google.com/appengine/docs/python/taskqueue/overview-push#Python_Securing_URLs_for_tasks
You can also check headers like X-AppEngine-QueueName if you want to do specific things only when the handler is called from a task:
"These headers are set internally by Google App Engine. If your request handler finds any of these headers, it can trust that the request is a Task Queue request. If any of the above headers are present in an external user request to your app, they are stripped."

Related

Is a single Cookie Based API for multiple frontends possible from a CORS perspective?

I originally wrote a REST API to work with a previously written mobile app. The mobile programmer asked me to generate an auth_token on login that he would pass as a header on each request that needed authentication. This API runs at api.example.com.
Later on, I was commissioned to write an AngularJS app that communicates with this API, so I had to add Access-Control-Allow-* headers to the backend's OPTIONS responses to be CORS compatible so my browser allows the connection (it looks like iOS does not check these headers). This app runs at one.example.com.
Now, I have to write a second AngularJS app that will run at two.example.com and there's a third being planned for the near future at three.example.com.
My problem is that my Access-Control-Allow-Origin header looks like this:
Access-Control-Allow-Origin: http://one.example.com:80
A wildcard (*) is not allowed with credentialed (cookie-based) requests, nor am I able to set this header to more than one origin. So as far as I can see I have two options:
Implement token-based authentication in parallel to the current cookie-based one. I'm considering this, but it will of course take time that I would rather save.
Have each app identify itself with a header or parameter on the OPTIONS request and, server-side, produce the CORS headers accordingly. I don't even know if that's possible, and it feels hacky.
Any better ideas?
If they share the same parent domain, for example example.com itself or sibling subdomains such as 1.ex.example.com and 2.ex.example.com, they can share the same cookie, because cookies are scoped to the domain they are set for.
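As a footnote on the question's second option: the usual pattern is to keep a server-side whitelist of origins and echo the matching one back rather than using a wildcard. A rough sketch (the request/response objects and names here are illustrative, not from any particular framework):
ALLOWED_ORIGINS = frozenset([
    'http://one.example.com',
    'http://two.example.com',
    'http://three.example.com',
])
def add_cors_headers(request, response):
    # Echo the Origin back only if it is whitelisted; never use '*' with cookies.
    origin = request.headers.get('Origin', '')
    if origin in ALLOWED_ORIGINS:
        response.headers['Access-Control-Allow-Origin'] = origin
        response.headers['Access-Control-Allow-Credentials'] = 'true'
        response.headers['Vary'] = 'Origin'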

pull queues authorization from compute

I'm trying to access a pull queue from Google Compute Engine with the instance's OAuth token, using Python:
from oauth2client import gce
from apiclient.discovery import build
import httplib2
credentials = gce.AppAssertionCredentials('')
http = httplib2.Http()
http = credentials.authorize(http)
credentials.refresh(http)
service = build('taskqueue', 'v1beta2', http=http)
tq = service.taskqueues()
tq.get(project=MY_APPENGINE_PROJECT, taskqueue=PULL_QUEUE_NAME, getStats=True).execute()
I keep getting HttpError 403, "you are not allowed to make this api call".
Please help, what configuration am I missing?
thanks,
Shay
UPDATE: Thanks to @Shay for asking this question; the issue he encountered is no longer an issue, as we have allowed aliases to work (when relevant) in the Task Queue API.
For posterity here is the original answer below:
Two of the most common mistakes I have seen are:
Forgetting to include the s~ prefix on your App Engine project ID. For example, if your application ID is my-awesome-app, then you are calling
tq.get(project='my-awesome-app', taskqueue=PULL_QUEUE_NAME...
when you should be calling
tq.get(project='s~my-awesome-app', taskqueue=PULL_QUEUE_NAME...
Forgetting to add the Compute service account to the task queue ACL in queue.yaml. To do this, you need to get the service account associated with your project and add it to the acl:
queue:
- name: pull-queue
  mode: pull
  acl:
  - writer_email: 123845678986@project.gserviceaccount.com  # can do all
and of course this would mean PULL_QUEUE_NAME = 'pull-queue' here. Also note, 123845678986@project.gserviceaccount.com should be replaced with the service account of your Compute Engine instance.
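Putting both fixes together, the call from the question would then look something like this (a sketch assuming the application ID my-awesome-app and the queue above; the tasks().lease() call is shown only as an example of a typical follow-up):
tq = service.taskqueues()
stats = tq.get(project='s~my-awesome-app', taskqueue='pull-queue',
               getStats=True).execute()
# Leasing work items from the pull queue goes through the tasks() collection.
lease = service.tasks().lease(project='s~my-awesome-app', taskqueue='pull-queue',
                              numTasks=10, leaseSecs=60).execute()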

Program access admin on GAE - oauth2

I have a GAE app, with a URL I restrict to admin:
- url: /admin
  script: _go_app
  login: admin
I want to PUT or POST to this URL from another Go program. What code do I need to write for the client to authenticate to GAE and to dev_appserver.py? Is there a more sensible way than just mocking a web browser and logging in? I don't need to authenticate or authorise other users, just the admin account for that app.
Is this OAuth? OAuth2? OpenID? Federated? Something else?
I realise this is a bit of an awkward question, since I'm not even sure what the right way to ask it is. However, I am able to post to (in this example) /admin using a web browser after logging in with my (admin) Gmail account. In that case the request (sent by Chrome) contains the cookies __cfduid and ACSID (plus what I think are Google Analytics IDs). Presumably one of those is responsible for my authentication. How do I get one of those?
And as a side question, if someone MITMs my connection (over http), can they hijack my admin session by reusing that cookie?
GAE likes OAuth2
Have a look at goauth2. It seems to be the canonical OAuth2 library for Go. They provide a fairly comprehensive example at https://code.google.com/p/goauth2/source/browse/oauth/example/oauthreq.go.
With regards to your question "Presumably one of those is responsible for my authentication. How do I get one of those?", they state:
To obtain Client ID and Secret, see the "OAuth 2 Credentials" section under
the "API Access" tab on this page: https://code.google.com/apis/console/
And, finally, my humble opinion on "if someone MITMs my connection (over http), can they hijack my admin session by reusing that cookie?" is that you should never provide any authenticated connection (nor the connection that does the authentication) over plain http. Especially an admin section.
EDIT: To elaborate on the MITM question, make sure you use HTTPS for any login requests and subsequent requests for the same session, and make sure to set Secure and HttpOnly flags on your cookies.
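To make that EDIT concrete, here is a minimal sketch of setting those flags. It is shown in Python/webapp2 purely for illustration (the question's app is Go, where net/http's cookie type has equivalent Secure and HttpOnly fields), and the cookie name and token are hypothetical:
import webapp2
class AdminLogin(webapp2.RequestHandler):
    def post(self):
        # 'session' and the token value are placeholders; the point is the flags.
        self.response.set_cookie('session', 'opaque-session-token',
                                 secure=True,    # only ever sent over HTTPS
                                 httponly=True)  # not readable from JavaScript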
OAuth2 if you want to use Google Accounts.
See here for details: https://developers.google.com/appengine/docs/go/users/overview (this section specifically deals with admin views)

Cloud Endpoints HTTP Cookies

I am implementing Cloud Endpoints with a Python app that uses custom authentication (GAE Sessions) instead of Google Accounts. I need to authenticate the requests coming from the Javascript client, so I would like to have access to the cookie information.
Reading this other question leads me to believe that it is possible, but perhaps not documented. I'm not familiar with the Java side of App Engine, so I'm not quite sure how to translate that snippet into Python. Here is an example of one of my methods:
class EndpointsAPI(remote.Service):
    @endpoints.method(Query_In, Donations_Out, path='get/donations',
                      http_method='GET', name='get.donations')
    def get_donations(self, req):
        # Authenticate request via cookie
where Query_In and Donations_Out are both ProtoRPC messages (messages.Message). The parameter req in the function is just an instance of Query_In, and I didn't find any properties related to HTTP data on it, though I could be wrong.
First, I would encourage you to try to use OAuth 2.0 from your client as is done in the Tic Tac Toe sample.
Cookies are sent to the server in the Cookie header, and header values typically appear in the WSGI environment under keys of the form 'HTTP_...', where ... corresponds to the header name:
http = {key: value for key, value in os.environ.iteritems()
        if key.lower().startswith('http')}
For cookies, os.getenv('HTTP_COOKIE') will give you the header value you seek. Unfortunately, this doesn't get passed along through Google's API Infrastructure by default.
UPDATE: This has been enabled for Python applications as of version 1.8.0. To send cookies through, specify the following:
from google.appengine.ext.endpoints import api_config
AUTH_CONFIG = api_config.ApiAuth(allow_cookie_auth=True)
@endpoints.api(name='myapi', version='v1', auth=AUTH_CONFIG, ...)
class MyApi(remote.Service):
    ...
This is a (not necessarily comprehensive) list of headers that make it through:
HTTP_AUTHORIZATION
HTTP_REFERER
HTTP_X_APPENGINE_COUNTRY
HTTP_X_APPENGINE_CITYLATLONG
HTTP_ORIGIN
HTTP_ACCEPT_CHARSET
HTTP_ORIGINALMETHOD
HTTP_X_APPENGINE_REGION
HTTP_X_ORIGIN
HTTP_X_REFERER
HTTP_X_JAVASCRIPT_USER_AGENT
HTTP_METHOD
HTTP_HOST
HTTP_CONTENT_TYPE
HTTP_CONTENT_LENGTH
HTTP_X_APPENGINE_PEER
HTTP_ACCEPT
HTTP_USER_AGENT
HTTP_X_APPENGINE_CITY
HTTP_X_CLIENTDETAILS
HTTP_ACCEPT_LANGUAGE
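Putting the update together with the HTTP_COOKIE hint above, a rough sketch of the question's method reading a session cookie might look like this (untested; the 'session' cookie name is an assumption, and Query_In/Donations_Out are the question's own message classes):
import Cookie
import os
import endpoints
from protorpc import remote
class EndpointsAPI(remote.Service):
    @endpoints.method(Query_In, Donations_Out, path='get/donations',
                      http_method='GET', name='get.donations')
    def get_donations(self, req):
        # HTTP_COOKIE is only populated when the API uses allow_cookie_auth=True.
        cookies = Cookie.SimpleCookie(os.getenv('HTTP_COOKIE', ''))
        if 'session' not in cookies:
            raise endpoints.UnauthorizedException('Missing session cookie')
        session_id = cookies['session'].value
        # ... look up the GAE Sessions entry for session_id, then build Donations_Out ...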
For the Java people who land here: you need to add the following annotation in order to use cookies in Endpoints:
@Api(auth = @ApiAuth(allowCookieAuth = AnnotationBoolean.TRUE))
(Without that it will work on the local dev server but not on the real GAE instance.)

App Engine remote_api with OpenID

I've recently tried to switch my App Engine app to using OpenID, but I'm having an issue authenticating with remote_api. The old authentication mechanism for remote_api doesn't seem to work (which makes sense) - I'm getting a 'urllib2.HTTPError: HTTP Error 302: Found', which I assume is App Engine redirecting me to the OpenID login page I've set up.
I guess I'm missing something fairly obvious. Currently my remote_api script has the following in it -
remote_api_stub.ConfigureRemoteDatastore(app_id=app_id, path='/remote_api', auth_func=auth_func, servername=host, secure=secure)
where auth_func is
def auth_func():
    return raw_input('Username:'), getpass.getpass('Password:')
Any ideas what I need to supply to remote_api? I guess similar issues would be encountered with bulkloader too. Cheers,
Colin
This was a fun one.
Looking at remote_api, the flow for authentication seems to be something like this:
Prompt the user for Google credentials
Post the credentials to https://www.google.com/accounts/ClientLogin
Parse the auth token out of the response body
Pass the token to https://myapp.appspot.com/_ah/login
Grab ACSID cookie set in the response
Pass the ACSID cookie in subsequent requests that require authorization
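Roughly, in code, the flow above looks like this (a simplified Python 2 sketch with no error handling; ClientLogin is long since deprecated, so treat this purely as an illustration of what appengine_rpc does, not something to reuse):
import cookielib
import urllib
import urllib2
def fetch_acsid_cookie(app_host, email, password):
    # Steps 1-3: trade Google credentials for a ClientLogin "Auth" token.
    body = urllib.urlencode({'Email': email, 'Passwd': password,
                             'service': 'ah', 'accountType': 'HOSTED_OR_GOOGLE',
                             'source': 'remote_api-example'})
    resp = urllib2.urlopen('https://www.google.com/accounts/ClientLogin', body).read()
    auth_token = dict(line.split('=', 1) for line in resp.splitlines())['Auth']
    # Steps 4-6: exchange the token at /_ah/login; the ACSID cookie lands in the jar.
    jar = cookielib.CookieJar()
    opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(jar))
    opener.open('https://%s/_ah/login?%s' % (
        app_host, urllib.urlencode({'auth': auth_token,
                                    'continue': 'http://%s/' % app_host})))
    return [c for c in jar if c.name == 'ACSID']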
I couldn't find a lot of documentation on the new OpenID support, though Nick's blog entry was informative.
Here's the test app I wrote to see how things work:
app.yaml:
handlers:
- url: /remote_api
  script: $PYTHON_LIB/google/appengine/ext/remote_api/handler.py
  login: admin
- url: /.*
  script: test.py
test.py:
from google.appengine.api import users
from google.appengine.ext import webapp
class MainPage(webapp.RequestHandler):
    def get(self):
        user = users.get_current_user()
        if user:
            self.response.out.write("Hi, %s!<hr>admin is %s" % (user.user_id(),
                                                                users.is_current_user_admin()))
        else:
            self.redirect(users.create_login_url('/', None,
                                                 'https://www.google.com/accounts/o8/id'))
Flipping my auth mode between Google Accounts and Federated Login, I noticed a few things:
Admin users are correctly recognized by is_current_user_admin() with OpenID
Mixing modes doesn't work. With authentication set to Google Accounts, calling create_login_url with a federated_identity throws a NotAllowedError
An ACSID cookie is still produced at the end of the login process, only it comes from /_ah/openid_verify instead of /_ah/login
So what's happening with remote_api under Federated Login? If we're using the default appengine_rpc.HttpRpcServer, it dutifully follows the same Google Accounts authentication process described at the top, but the app no longer considers the ACSID cookie returned by /_ah/login to be valid. Since you're still unauthenticated, you get a 302 redirect to the OpenID login page, /_ah/login_required.
I dunno what the right solution is here. Seems like it would require an API update. Maybe Nick or one of the other Googlers can weigh in.
For now, here's a hacky workaround:
Turn on Federated Login for your app
Make sure you're passing save_cookies=True when calling remote_api_stub.ConfigureRemoteDatastore for your console script
Attempt console authentication and get the 302 error
Login as an admin via your app's web interface
In your browser cookies, find the ACSID cookie for myapp.appspot.com
Find and edit your local ~/.appcfg_cookies file
Replace the ACSID cookie for myapp.appspot.com with the one from your browser
The next time you try to use remote_api, it should work without prompting for credentials. You'll have to repeat the last 4 steps every time the cookie expires, though. You can bump the expiration from 1 day to as high as 2 weeks in the admin console to minimize the annoyance. Have fun!
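For reference, step 2 just means the ConfigureRemoteDatastore call from the question grows a save_cookies flag, roughly:
remote_api_stub.ConfigureRemoteDatastore(app_id=app_id,
                                         path='/remote_api',
                                         auth_func=auth_func,
                                         servername=host,
                                         secure=secure,
                                         save_cookies=True)  # persists to ~/.appcfg_cookies so it can be patched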
This is definitely an issue... mark your interest in getting Google to fix this by starring the ticket at http://code.google.com/p/googleappengine/issues/detail?id=3258 and feel free to add any of your workarounds there.
On a related note, we also recognize that the docs are somewhat sparse, so I'm working on an article which hopefully fills-in some of those holes... stay tuned and keep your eyes open at http://code.google.com/appengine/articles
Here's a workaround you can use until there's a more permanent solution in place.
