GAE Python + Google Cloud Storage authorization problems - google-app-engine

I'm trying to get a Google Cloud Storage + Google App Engine (Python) "Hello world" app running.
I have followed the instructions to activate Google Cloud Storage and created a bucket from within my App Engine app's console. I've written a trivial program to write a test file, but when I run it, the cloud storage library throws a 403, apparently because my App Engine app's credentials aren't being presented, aren't valid, or don't have rights to this bucket.
Here's the code:
BUCKET = '/pcj-info-testing'

class TestGCS(webapp2.RequestHandler):
    def create_file(self, filename):
        self.response.write("Creating\n")
        gcs_file = gcs.open(filename, 'w', content_type='text/plain')
        gcs_file.write('Testing, 1, 2, 3\n')
        gcs_file.write('At ' + datetime.now().isoformat())
        gcs_file.close()

    def get(self):
        self.response.headers['Content-Type'] = 'text/plain'
        filename = BUCKET + "/demo-file" + str(random.randrange(10**10, 10**11 - 1))
        self.create_file(filename)
        self.response.write('File stat:\n')
        stat = gcs.stat(filename)
        self.response.write(repr(stat))
and here is the relevant portion of the stack trace and error:
File "/base/data/home/apps/s~pcj-info/1.373629132682543570/gcstest.py", line 14, in create_file
gcs_file = gcs.open(filename, 'w', content_type = 'text/plain');
File "/base/data/home/apps/s~pcj-info/1.373629132682543570/cloudstorage/cloudstorage_api.py", line 74, in open
return storage_api.StreamingBuffer(api, filename, content_type, options)
File "/base/data/home/apps/s~pcj-info/1.373629132682543570/cloudstorage/storage_api.py", line 597, in __init__
errors.check_status(status, [201], path, headers, resp_headers)
File "/base/data/home/apps/s~pcj-info/1.373629132682543570/cloudstorage/errors.py", line 106, in check_status
raise ForbiddenError(msg)
ForbiddenError: Expect status [201] from Google Storage. But got status 403.
Path: '/pcj-info-testing/demo-file16955619179'.
Request headers: {'x-goog-resumable': 'start', 'x-goog-api-version': '2', 'content-type': 'text/plain', 'accept-encoding': 'gzip, *'}.
Response headers: {'alternate-protocol': '443:quic', 'content-length': '146', 'via': 'HTTP/1.1 GWA', 'x-google-cache-control': 'remote-fetch', 'vary': 'Origin', 'server': 'HTTP Upload Server Built on Jan 23 2014 15:07:07 (1390518427)', 'date': 'Sat, 08 Feb 2014 16:49:56 GMT', 'content-type': 'application/xml; charset=UTF-8'}.
Extra info: None.
I've looked at the access privileges on the bucket, and they look correct (although they should have been created correctly by default, according to the documentation)—the IDs listed as mine in the Google APIs Console are the same ones in the bucket permissions, with Owner permissions.
I can see the reasons that Google might return a 403 in their Status Codes documentation, but the urlfetch library in App Engine that makes the call doesn't return these error names, as far as I can tell (I assume they are the text strings returned in the HTTP response right after the error code, but the library only gives you the integer result, I think). So I'm at a loss for what to do next.
Is there any straightforward way to capture the error being returned by the API? My troubleshooting would vary dramatically if I knew the error was AccountProblem vs. InvalidSecurity or something. It's very frustrating that the API returns error codes in a way that's not accessible to users using the recommended library on a supported cloud development platform.
If I give "All Authenticated Users" access to write, it works. So it looks like I'm authenticating as someone, but that someone isn't on the permissions list. Adding the Gmail account I'm logged in as while testing the app doesn't help. (The app requires admin login to hit all URLs.)

You need to add the App Engine service account as having write permission on the bucket.
You can find the service account's email address in the App Engine console under "Application Settings", listed as "Service Account Name".

Related

AppEngine service account cannot be generated but it exists

So I created my first App Engine app and I think I deleted the wrong service account. I was able to recover it following ideas here.
However, I still can't redeploy the App Engine app (or deploy any new app in the project). Here is the error I'm getting. As I said, I managed to run the projects.serviceAccounts.undelete endpoint, but it doesn't help. Any idea what could be done to get this working again?
This may be due to network connectivity issues. Please check your network settings, and the status of the service you are trying to reach.
ERROR: (gcloud.app.deploy) HttpError accessing <https://appengine.googleapis.com/v1/apps/{project}/services/orch/versions?alt=json>: response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Mon, 17 May 2021 19:47:03 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'alt-svc': 'h3-29=":443"; ma=2592000,h3-T051=":443"; ma=2592000,h3-Q050=":443"; ma=2592000,h3-Q046=":443"; ma=2592000,h3-Q043=":443"; ma=2592000,quic=":443"; ma=2592000; v="46,43"', 'transfer-encoding': 'chunked', 'status': '500', 'content-length': '147', '-content-encoding': 'gzip'}>, content <{
  "error": {
    "code": 500,
    "message": "AppEngine service account cannot be generated for p~{project}.",
    "status": "INTERNAL"
  }
}
edit 1:
I went to the App Engine source code, and under the debugger there is this info:
Please fix the following issue and try again:
App Engine service account
An error has occurred while adding the App Engine default service account. Visit this page and manually add the <project-name>@appspot.gserviceaccount.com account with the Editor role.
however when I try to add it I get
Email addresses and domains must be associated with an active Google Account, Google Workspace account, or Cloud Identity account.
I see that you have inadvertently deleted the default service account. This can cause products which rely on App Engine (including Cloud Functions and Cloud Scheduler) to not work as expected, as you are experiencing now.
My best recommendation is that you deploy your resources in a different project to work around this issue.
However, if there is significant investment made in the existing deployment, you can reach out to Google Cloud Platform Support using the following link [1]; depending on the severity or type of your deployment, it may be possible to undelete the account completely [2]. This process can be done up to 30 days after deletion.
If 30 days have elapsed since the deletion of the account, it is unfortunately not possible to recover it.
[1] https://cloud.google.com/contact
[2] https://cloud.google.com/iam/docs/creating-managing-service-accounts#undeleting

Google Cloud Tasks cannot authenticate to Cloud Run

I am trying to invoke a Cloud Run service using Cloud Tasks as described in the docs here.
I have a running Cloud Run service. If I make the service publicly accessible, it behaves as expected.
I have created a cloud queue and I schedule the cloud task with a local script. This one is using my own account. The script looks like this
from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()

project = 'my-project'
queue = 'my-queue'
location = 'europe-west1'
url = 'https://url_to_my_service'

parent = client.queue_path(project, location, queue)

task = {
    'http_request': {
        'http_method': 'GET',
        'url': url,
        'oidc_token': {
            'service_account_email': 'my-service-account@my-project.iam.gserviceaccount.com'
        }
    }
}

response = client.create_task(parent, task)
print('Created task {}'.format(response.name))
I see the task appear in the queue, but it fails and retries immediately. The reason for this (by checking the logs) is that the Cloud Run service returns a 401 response.
My own user has the roles "Service Account Token Creator" and "Service Account User". It doesn't have the "Cloud Tasks Enqueuer" explicitly, but since I am able to create the task in the queue, I guess I have inherited the required permissions.
The service account "my-service-account@my-project.iam.gserviceaccount.com" (which I use in the task to get the OIDC token) has - amongst others - the following roles:
Cloud Tasks Enqueuer (Although I don't think it needs this one as I'm creating the task with my own account)
Cloud Tasks Task Runner
Cloud Tasks Viewer
Service Account Token Creator (I'm not sure whether this should be added to my own account - the one who schedules the task - or to the service account that should perform the call to Cloud Run)
Service Account User (same here)
Cloud Run Invoker
So I did a dirty trick: I created a key file for the service account, downloaded it locally and impersonated locally by adding an account to my gcloud config with the key file. Next, I run
curl -H "Authorization: Bearer $(gcloud auth print-identity-token)" https://url_to_my_service
That works! (By the way, it also works when I switch back to my own account)
Final tests: if I remove the oidc_token from the task when creating the task, I get a 403 response from Cloud Run! Not a 401...
If I remove the "Cloud Run Invoker" role from the service account and try again locally with curl, I also get a 403 instead of a 401.
If I finally make the Cloud Run service publicly accessible, everything works.
So, it seems that the Cloud Task fails to generate a token for the service account to authenticate properly at the Cloud Run service.
What am I missing?
I had the same issue; here was my fix:
Diagnosis: Generating OIDC tokens currently does not support custom domains in the audience parameter. I was using a custom domain for my Cloud Run service (https://my-service.my-domain.com) instead of the Cloud Run generated url (found in the Cloud Run service dashboard), which looks like this: https://XXXXXX.run.app
Masking behavior: in the task being enqueued to Cloud Tasks, if the audience field for the oidc_token is not explicitly set, then the target url from the task is used to set the audience in the request for the OIDC token.
In my case this meant that when enqueueing a task to be sent to the target https://my-service.my-domain.com/resource, the audience for generating the OIDC token was set to my custom domain https://my-service.my-domain.com/resource. Since custom domains are not supported when generating OIDC tokens, I was receiving 401 not authorized responses from the target service.
My fix: explicitly populate the audience with the Cloud Run generated URL, so that a valid token is issued. In my client I was able to globally set the audience for all tasks targeting a given service with the base url: 'audience' : 'https://XXXXXX.run.app'. This generated a valid token. I did not need to change the url of the target resource itself. The resource stayed the same: 'url' : 'https://my-service.my-domain.com/resource'
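Concretely, that fix can be sketched as follows. The URLs and the service account are placeholders, and the helper function is just for illustration:

```python
def make_task(target_url, service_account_email, audience):
    """Build a task body for CloudTasksClient.create_task() whose
    OIDC audience is pinned to the run.app URL while the request
    itself still targets the custom domain."""
    return {
        'http_request': {
            'http_method': 'GET',
            'url': target_url,  # the request still goes to the custom domain
            'oidc_token': {
                'service_account_email': service_account_email,
                # Explicit audience: the Cloud Run generated URL, so a
                # token that Cloud Run will accept gets issued.
                'audience': audience,
            },
        }
    }

task = make_task(
    'https://my-service.my-domain.com/resource',
    'my-service-account@my-project.iam.gserviceaccount.com',
    'https://XXXXXX.run.app',
)
```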
More Reading:
I've run into this problem before when setting up service-to-service authentication: Google Cloud Run Authentication Service-to-Service
1. I created a private Cloud Run service using this code:

import os

from flask import Flask
from flask import request

app = Flask(__name__)

@app.route('/index', methods=['GET', 'POST'])
def hello_world():
    target = os.environ.get('TARGET', 'World')
    print(target)
    return str(request.data)

if __name__ == "__main__":
    app.run(debug=True, host='0.0.0.0', port=int(os.environ.get('PORT', 8080)))
2. I created a service account with --role=roles/run.invoker that I will associate with the cloud task:

gcloud iam service-accounts create SERVICE-ACCOUNT_NAME \
    --display-name "DISPLAYED-SERVICE-ACCOUNT_NAME"

gcloud iam service-accounts list

gcloud run services add-iam-policy-binding SERVICE \
    --member=serviceAccount:SERVICE-ACCOUNT_NAME@PROJECT-ID.iam.gserviceaccount.com \
    --role=roles/run.invoker
3. I created a queue:
gcloud tasks queues create my-queue
4. I created a test.py:

from google.cloud import tasks_v2
from google.protobuf import timestamp_pb2
import datetime

# Create a client.
client = tasks_v2.CloudTasksClient()

# TODO(developer): Uncomment these lines and replace with your values.
project = 'your-project'
queue = 'your-queue'
location = 'europe-west2'  # App Engine location
url = 'https://helloworld/index'
payload = 'Hello from the Cloud Task'

# Construct the fully qualified queue name.
parent = client.queue_path(project, location, queue)

# Construct the request body.
task = {
    'http_request': {  # Specify the type of request.
        'http_method': 'POST',
        'url': url,  # The full url path that the task will be sent to.
        'oidc_token': {
            'service_account_email': "your-service-account"
        },
        'headers': {
            'Content-Type': 'application/json',
        }
    }
}

# Convert "seconds from now" into an rfc3339 datetime string.
d = datetime.datetime.utcnow() + datetime.timedelta(seconds=60)

# Create Timestamp protobuf.
timestamp = timestamp_pb2.Timestamp()
timestamp.FromDatetime(d)

# Add the timestamp to the tasks.
task['schedule_time'] = timestamp
task['name'] = 'projects/your-project/locations/app-engine-location/queues/your-queue/tasks/your-task'

converted_payload = payload.encode()

# Add the payload to the request.
task['http_request']['body'] = converted_payload

# Use the client to build and send the task.
response = client.create_task(parent, task)
print('Created task {}'.format(response.name))
# return response
5. I ran the code in Google Cloud Shell with my user account, which has the Owner role.
6. The response received has the form:
Created task projects/your-project/locations/app-engine-location/queues/your-queue/tasks/your-task
7. I checked the logs: success.
The next day I am no longer able to reproduce this issue. I can reproduce the 403 responses by removing the Cloud Run Invoker role, but I no longer get 401 responses with exactly the same code as yesterday.
I guess this was a temporary issue on Google's side?
Also, I noticed that it takes some time before updated policies are actually in place (1 to 2 minutes).
For those like me, struggling through documentation and stackoverflow when getting continuous UNAUTHORIZED responses on Cloud Tasks HTTP requests:
As was written in this thread, you should provide an audience for the oidcToken you send to Cloud Tasks, and ensure it exactly equals your resource.
For instance, if you have a Cloud Function named my-awesome-cloud-function and your task request url is https://REGION-PROJECT-ID.cloudfunctions.net/my-awesome-cloud-function/api/v1/hello, you need to ensure that you set the function url itself as the audience:

{
    serviceAccountEmail: SERVICE-ACCOUNT_NAME@PROJECT-ID.iam.gserviceaccount.com,
    audience: https://REGION-PROJECT-ID.cloudfunctions.net/my-awesome-cloud-function
}

Otherwise the full request url seems to be used as the audience, which leads to an error.

Authenticating between a Google Cloud Scheduler job and Google App Engine endpoint [duplicate]

What's the process for verifying the HTTP request from Google Cloud scheduler? The docs (https://cloud.google.com/scheduler/docs/creating) mention you can create a job with a target of any publicly available HTTP endpoint but do not mention how the server verifies the cron/scheduler request.
[Update May 28, 2019]
Google Cloud Scheduler now has two command line options:
--oidc-service-account-email=<service_account_email>
--oidc-token-audience=<service_endpoint_being_called>
These options add an additional header to the request that Cloud Scheduler makes:
Authorization: Bearer ID_TOKEN
You can process the ID_TOKEN inside your endpoint code to verify who is calling your endpoint.
For example, you can make an HTTP request to decode the ID Token:
https://oauth2.googleapis.com/tokeninfo?id_token=ID_TOKEN
This will return JSON like this:
{
  "aud": "https://cloudtask-abcdefabcdef-uc.a.run.app",
  "azp": "0123456789077420983142",
  "email": "cloudtask@development.iam.gserviceaccount.com",
  "email_verified": "true",
  "exp": "1559029789",
  "iat": "1559026189",
  "iss": "https://accounts.google.com",
  "sub": "012345678901234567892",
  "alg": "RS256",
  "kid": "0123456789012345678901234567890123456789c3",
  "typ": "JWT"
}
Then you can check that the service account email matches the one that you authorized Cloud Scheduler to use and that the token has not expired.
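As a sketch of that check, here is a standard-library-only decoder that inspects the token's payload. Note this is for illustration only: it does not verify the signature, so production code must still validate the token (via the tokeninfo endpoint above or the google-auth library). The helper names are my own:

```python
import base64
import json
import time

def decode_jwt_payload(id_token):
    """Decode a JWT's payload segment without verifying its signature.
    Illustration only: real code must verify the signature, e.g. via
    the tokeninfo endpoint or google-auth."""
    payload_b64 = id_token.split('.')[1]
    # JWT segments drop base64 padding; restore it before decoding.
    payload_b64 += '=' * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))

def is_authorized(id_token, expected_email):
    """True when the token's email claim matches the service account
    you authorized for Cloud Scheduler and the token is unexpired."""
    claims = decode_jwt_payload(id_token)
    return (claims.get('email') == expected_email
            and int(claims.get('exp', 0)) > time.time())
```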
[End Update]
You will need to verify the request yourself.
Google Cloud Scheduler includes several Google specific headers such as User-Agent: Google-Cloud-Scheduler. Refer to the documentation link below.
However, anyone can forge HTTP headers. You need to create a custom something that you include as an HTTP Header or in the HTTP body that you know how to verify. Using a signed JWT would be secure and easy to create and verify.
When you create a Google Cloud Scheduler Job you have some control over the headers and body fields. You can embed your custom something in either one.
Scheduler Jobs
[Update]
Here is an example (Windows command line) using gcloud so that you can set HTTP headers and the body. This example calls Cloud Functions on each trigger showing how to include an APIKEY. The Google Console does not have this level of support yet.
gcloud beta scheduler ^
--project production ^
jobs create http myfunction ^
--time-zone "America/Los_Angeles" ^
--schedule="0 0 * * 0" ^
--uri="https://us-central1-production.cloudfunctions.net/myfunction" ^
--description="Job Description" ^
--headers="{ \"Authorization\": \"APIKEY=AUTHKEY\", \"Content-Type\": \"application/json\" }" ^
--http-method="POST" ^
--message-body="{\"to\":\"/topics/allDevices\",\"priority\":\"low\",\"data\":{\"success\":\"ok\"}}"
Short answer
If you host your app in Google Cloud, just check if header X-Appengine-Queuename equals __scheduler. However, this is undocumented behaviour, for more information read below.
Furthermore, if possible use Pub/Sub instead of HTTP requests, as Pub/Sub is internally sent (therefore of implicitly verified origin).
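A minimal sketch of that header check, framework-agnostic so it works with any request-headers mapping (the helper name is my own; `headers` stands in for e.g. Flask's request.headers):

```python
def is_from_cloud_scheduler(headers):
    """Accept only requests that App Engine tagged as coming from
    Cloud Scheduler. This relies on GAE stripping the header from
    external requests, which is undocumented behaviour."""
    return headers.get('X-Appengine-Queuename') == '__scheduler'

print(is_from_cloud_scheduler({'X-Appengine-Queuename': '__scheduler'}))  # True
print(is_from_cloud_scheduler({'User-Agent': 'curl/7.79'}))               # False
```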
Experiment
As I've found here, Google strips requests of certain headers [1], but not all [2]. Let's find out whether there are such headers for Cloud Scheduler.
[1] E.g. you can't send any X-Google-* headers (found experimentally, read more)
[2] E.g. you can send X-Appengine-* headers (found experimentally)
Flask app used in the experiment:
@app.route('/echo_headers')
def echo_headers():
    headers = {h[0]: h[1] for h in request.headers}
    print(headers)
    return jsonify(headers)
Request headers sent by Cloud Scheduler
{
  "Host": [],
  "X-Forwarded-For": "0.1.0.2, 169.254.1.1",
  "X-Forwarded-Proto": "http",
  "User-Agent": "AppEngine-Google; (+http://code.google.com/appengine)",
  "X-Appengine-Queuename": "__scheduler",
  "X-Appengine-Taskname": [private],
  "X-Appengine-Taskretrycount": "1",
  "X-Appengine-Taskexecutioncount": "0",
  "X-Appengine-Tasketa": [private],
  "X-Appengine-Taskpreviousresponse": "0",
  "X-Appengine-Taskretryreason": "",
  "X-Appengine-Country": "ZZ",
  "X-Cloud-Trace-Context": [private],
  "X-Appengine-Https": "off",
  "X-Appengine-User-Ip": [private],
  "X-Appengine-Api-Ticket": [private],
  "X-Appengine-Request-Log-Id": [private],
  "X-Appengine-Default-Version-Hostname": [private]
}
Proof that header X-Appengine-Queuename is stripped by GAE
Limitations
This method is most likely not covered by Google's SLAs and deprecation policies, since it's not documented. Also, I'm not sure whether the header can be forged when the request source is within Google Cloud (maybe the headers are only stripped at the outside layer). I've tested with an app in GAE; results may or may not vary for other deployment options. In short, use at your own risk.
This header should work:
map (key: string, value: string)
HTTP request headers. This map contains the header field names and values. Headers can be set when the job is created.
Cloud Scheduler sets some headers to default values:
User-Agent: By default, this header is "AppEngine-Google; (+http://code.google.com/appengine)". This header can be modified, but Cloud Scheduler will append "AppEngine-Google; (+http://code.google.com/appengine)" to the modified User-Agent.
X-CloudScheduler: This header will be set to true.
X-CloudScheduler-JobName: This header will contain the job name.
X-CloudScheduler-ScheduleTime: For Cloud Scheduler jobs specified in the unix-cron format, this header will contain the job schedule time in RFC3339 UTC "Zulu" format.
If the job has a body, Cloud Scheduler sets the following headers:
Content-Type: By default, the Content-Type header is set to "application/octet-stream". The default can be overridden by explicitly setting Content-Type to a particular media type when the job is created. For example, Content-Type can be set to "application/json".
Content-Length: This is computed by Cloud Scheduler. This value is output only. It cannot be changed.
The headers below are output only. They cannot be set or overridden:
X-Google-*: For Google internal use only.
X-AppEngine-*: For Google internal use only.
In addition, some App Engine headers, which contain job-specific information, are also sent to the job handler.
An object containing a list of "key": value pairs. Example: { "name": "wrench", "mass": "1.3kg", "count": "3" }.
https://cloud.google.com/scheduler/docs/reference/rest/v1/projects.locations.jobs#appenginehttptarget
if request.META['HTTP_X_CLOUDSCHEDULER'] == 'true':
    print("True")
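The same check, written framework-agnostically over a plain headers mapping (the helper name is my own):

```python
def is_scheduler_job(headers):
    """Per the documentation quoted above, Cloud Scheduler sets the
    header X-CloudScheduler: true on every job request it sends."""
    return headers.get('X-CloudScheduler') == 'true'

print(is_scheduler_job({'X-CloudScheduler': 'true'}))  # True
print(is_scheduler_job({'User-Agent': 'curl/8.0'}))    # False
```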

Not being able to authenticate service accounts with AppAssertionCredentials on App Engine for a Gmail service

I am trying to build a Gmail service which will read a user's emails, once their IT admin has authenticated the App on the Apps marketplace. From the documentation, it seemed service accounts would be the right fit, for which I tried both:
scope = "https://www.googleapis.com/auth/gmail.readonly"
project_number = "c****io"
authorization_token, _ = app_identity.get_access_token(scope)
logging.info("Using token %s to represent identity %s",
             authorization_token, app_identity.get_service_account_name())
# authorization_token = "OAuth code pasted from playground"
response = urlfetch.fetch(
    "https://www.googleapis.com/gmail/v1/users/me/messages",
    method=urlfetch.GET,
    headers={"Content-Type": "application/json",
             "Authorization": "OAuth " + authorization_token})
and
credentials = AppAssertionCredentials(scope=scope)
http = credentials.authorize(httplib2.Http(memcache))
service = build(serviceName='gmail', version='v1', http=http)
listReply = service.users().messages().list(userId='me', q='').execute()
I then started dev_appserver.py as per Unable to access BigQuery from local App Engine development server
However, I get an HTTP error code 500 "Backend Error". Same code, but when I paste the access_token from the OAuth playground, it works fine (HTTP 200). I'm on my local machine in case that makes any difference. Wondering if I'm missing anything? I'm trying to find all emails for all users of a particular domain where their IT admin has installed my Google Marketplace App.
Thanks for the help!
To do this type of impersonation, you should create a JWT and set the "sub" field to the email address of the user whose mailbox you want to access. Developer documentation: Using OAuth 2.0 for Server to Server Applications: Additional claims.
The python code to construct the credentials will look something like
credentials = SignedJwtAssertionCredentials(
    "<service account email>@developer.gserviceaccount.com",
    file("secret-privatekey.pem", "rb").read(),
    scope=["https://www.googleapis.com/auth/gmail.readonly"],
    sub="<user to impersonate>@your-domain.com"
)

Accessing a Google Drive spreadsheet from Appengine

I have an appengine app that needs to access a single, hard-coded spreadsheet on Google Drive.
Up until now I have been achieving this as follows:
SpreadsheetService service = new SpreadsheetService("myapp");
service.setUserCredentials("myusername@gmail.com", "myhardcodedpassword");
When I tried this today with a new user, I got InvalidCredentialsException even though the username and password were definitely correct. I got an email in my inbox saying suspicious sign-ins had been prevented, and there seems to be no way to enable them again.
I am also aware that hardcoding passwords in source is bad practice.
However, I have read very widely online for how to enable OAuth/OAuth2 for this, and have ended up wasting hours and hours piecing fragments of information from blogs, stackoverflow answers etc, to no avail.
Ideally the solution would involve an initial process to generate a long-lived access token, which could then be hard-coded in to the app.
What I want is a definitive list of steps for how to achieve this.
EDIT: As Google have redesigned the API Console, the details of the steps below have changed - see comments
OK here goes, step by step
Go to Google Cloud Console and register your project (aka application)
You need to note the Client ID, and Client Secret
Go to the OAuth Playground, click the gear icon, choose to use your own credentials, and enter the Client ID and Client Secret
You will be reminded that you need to go back to the Cloud Console and add the OAuth Playground as a valid callback url. So do that.
Do Step 1, choosing the spreadsheet scope and click authorize
Choose your Google account if prompted and grant auth when prompted
Do Step 2, Click 'Exchange auth code for tokens'
You will see an input box containing a refresh token
The refresh token is the equivalent of your long lived username/password, so this is what you'll hard code (or store someplace secure your app can retrieve it).
When you need to access Google Spreadsheets, you will call
POST https://accounts.google.com/o/oauth2/token
content-type: application/x-www-form-urlencoded
client_secret=************&grant_type=refresh_token&refresh_token=1%2xxxxxxxxxx&client_id=999999999999.apps.googleusercontent.com
which will return you an access token
{
  "access_token": "ya29.yyyyyyyyyyyyyyyyyy",
  "token_type": "Bearer",
  "expires_in": 3600
}
Put the access token into an http header for whenever you access the spreadsheet API
Authorization: Bearer ya29.yyyyyyyyyyyyyyyyy
And you're done
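For completeness, the refresh call above can be sketched in Python using only the standard library to build the request; the client id, secret, and refresh token below are placeholders:

```python
from urllib.parse import urlencode

TOKEN_URI = 'https://accounts.google.com/o/oauth2/token'

def build_refresh_request(client_id, client_secret, refresh_token):
    """Return (url, headers, body) for the access-token refresh POST
    described above. The credentials passed in are placeholders."""
    body = urlencode({
        'client_id': client_id,
        'client_secret': client_secret,
        'refresh_token': refresh_token,
        'grant_type': 'refresh_token',
    })
    headers = {'Content-Type': 'application/x-www-form-urlencoded'}
    return TOKEN_URI, headers, body

# POSTing this (e.g. with requests.post(url, data=body, headers=headers))
# returns the JSON with the short-lived access_token shown above.
url, headers, body = build_refresh_request(
    '999999999999.apps.googleusercontent.com', 'SECRET', '1/xxxxxxxxxx')
```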
Pinoyyid has indeed provided wonderful help. I wanted to follow up with Python code that will allow access to a personal (web-accessible) Google Drive.
This runs fine within Google App Engine (as part of a web app) and also standalone on the desktop (assuming the Google App Engine SDK is installed on your machine [available from: https://developers.google.com/appengine/downloads]).
In my case I added the https://www.googleapis.com/auth/drive scope during Pinoyyid's process because I wanted access to all my Google Drive files.
After following Pinoyyid's instructions to get your refresh token etc., this little Python script will get a listing of all the files on your Google drive:
import httplib2
import datetime
from oauth2client.client import OAuth2Credentials
from apiclient.discovery import build

API_KEY = 'AIz...'  # from "Key for Server Applications" in the "Public API Access" section of the Google Developers Console
access_token = "ya29..."  # from Pinoyyid's instructions
refresh_token = "1/V..."  # from Pinoyyid's instructions
client_id = '654....apps.googleusercontent.com'  # from "Client ID for web application" in the "OAuth" section of the Google Developers Console
client_secret = '6Cl...'  # from "Client ID for web application" in the "OAuth" section of the Google Developers Console
token_expiry = datetime.datetime.utcnow() - datetime.timedelta(days=1)
token_uri = 'https://accounts.google.com/o/oauth2/token'
user_agent = 'python urllib (I reckon)'

def main():
    service = createDrive()
    dirlist = service.files().list(maxResults=30)
    print 'dirlist', dirlist
    result = dirlist.execute()
    print 'result', result

def createDrive():
    credentials = OAuth2Credentials(access_token, client_id, client_secret, refresh_token,
                                    token_expiry, token_uri, user_agent)
    http = httplib2.Http()
    http = credentials.authorize(http)
    return build('drive', 'v2', http=http, developerKey=API_KEY)

main()
I'm grateful to all who have provided the steps along the way to solving this.