I studied and successfully replicated the quickstart.py example at https://developers.google.com/drive/web/quickstart/quickstart-python to upload a file to my Google Drive from the command line.
However, I wish to write an app that does the same thing through a cron job, i.e. uploads a file every day at 8 am, say, without needing to authenticate each time. Are there sample code or examples I can look at to implement the OAuth steps without command-line intervention?
Thanks!
You can use your App Engine app's built-in Service Account to authorize requests to the Google Drive API.
https://cloud.google.com/appengine/docs/python/appidentity/
https://developers.google.com/identity/protocols/OAuth2ServiceAccount
Your app will need to have an embedded Refresh Token, or some way of fetching it from a secure server. The Refresh Token acts a bit like a stored username/password, albeit with constrained access. Therefore you need to consider the security implications. For example, since it's uploading, it will only need the drive.file scope, so your corpus of Drive files remains inaccessible.
If you're happy with the security implications, then the steps you need are described in How do I authorise an app (web or installed) without user intervention? (canonical ?)
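For example, here is a minimal sketch of the refresh-token approach, suitable for running from cron. It assumes you have already done the one-time authorization with the drive.file scope and saved the client ID, client secret, and refresh token somewhere secure; those values and the file name below are placeholders. It uses the current google-auth and google-api-python-client libraries rather than the oauth2client code from the old quickstart.

    from google.oauth2.credentials import Credentials
    from googleapiclient.discovery import build
    from googleapiclient.http import MediaFileUpload

    SCOPES = ["https://www.googleapis.com/auth/drive.file"]

    # Placeholder values obtained from your one-time authorization flow.
    creds = Credentials(
        token=None,
        refresh_token="YOUR_REFRESH_TOKEN",
        token_uri="https://oauth2.googleapis.com/token",
        client_id="YOUR_CLIENT_ID",
        client_secret="YOUR_CLIENT_SECRET",
        scopes=SCOPES,
    )

    # The client refreshes the access token automatically before each request,
    # so no interactive login is needed.
    drive = build("drive", "v3", credentials=creds)
    media = MediaFileUpload("daily_report.csv", mimetype="text/csv")
    created = drive.files().create(
        body={"name": "daily_report.csv"},
        media_body=media,
        fields="id",
    ).execute()
    print("Uploaded file id:", created["id"])

A cron entry such as 0 8 * * * python3 /path/to/upload.py would then run the upload every morning at 8 am.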
I am working on a web application that serves as an interface to Google Cloud Storage (GCS).
I am using a backend service to retrieve the list of files I stored on GCS and their URLs with the JSON API, and to return that to my web application. However, I was not really able to load the files through those URLs, which always came back with 403 Forbidden.
I am not sure how GCS authentication works behind the scenes and whether it is possible to grant access directly to the web application. I am not sure how I could attach application authentication information to an HTTP request. What I know is that I can do this via the backend service, but for simplicity's sake I wonder if it is possible to get around that. One of the things I tried was adding the web application's domain (which is sent via the Referer header) to the ACL of that bucket, which didn't work at all.
And thanks to what @Brandon pointed out below: I am OK with granting anyone who has access to the application the ability to view the contents of the GCS bucket, since it is an internal app and I have already checked their authentication when I first serve the web application.
====
Solution
I ended up using signed URLs that expire in 5 minutes, and I highly recommend interacting with GCS using gcloud (their Python documentation is really good). Thanks again for the thorough answer!
You have a user on a web browser who wants to download an object that only your application's service account has read access for. You have a few options:
1. Expand access: make these objects publicly readable. Probably not the best choice if this info is sensitive, but if it's not, this is the easiest solution.
2. Give your app's credentials to the user so that they can authenticate as your app. This is a REALLY bad idea, and I probably shouldn't even list it here.
3. When a user wants to download a file, have them ask your app for it, and then have your app fetch the file and stream its contents to the user. This is the easiest solution for the client-side code, but it makes your app responsible for streaming file contents, which isn't really great.
4. When a user wants to download a file, have them ask your app for permission, and reply to them with some sort of token they can use to fetch the data directly from GCS.
#4 is what you want. Your users will ask your app for a file, your app will decide whether they are allowed to access that file via whatever you're doing (passwords? IP checks? Cookies? Whatever.) Then, your app will respond with a URL the user can use to fetch the file directly from GCS.
This URL is called a "signed URL." Your app uses its own private key to add a signature to a URL that indicates which object may be downloaded by the bearer and for how long the URL is valid. The procedure for signing URLs is somewhat tricky, but fortunately the gcloud storage libraries have helper functions that can generate them.
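For example, with the google-cloud-storage Python library (the bucket and object names below are just placeholders), generating a short-lived signed URL looks roughly like this:

    from datetime import timedelta
    from google.cloud import storage  # pip install google-cloud-storage

    # Assumes the app's service-account credentials are available, e.g. via
    # the GOOGLE_APPLICATION_CREDENTIALS environment variable.
    client = storage.Client()
    bucket = client.bucket("my-app-bucket")       # placeholder bucket name
    blob = bucket.blob("reports/result.txt")      # placeholder object name

    # URL that lets the bearer GET this one object for the next 5 minutes.
    url = blob.generate_signed_url(
        version="v4",
        expiration=timedelta(minutes=5),
        method="GET",
    )
    print(url)  # return this to the user your app has just authorized

Your app checks the user's permissions first and then hands back the URL; GCS validates the signature and expiry itself, so no further round trip through your app is needed.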
I'm working on a project with Google App Engine. I am using continuous integration via Travis, and wish to be able to deploy directly from it. Due to a bug that will not be resolved directly, I can't rely on Travis' built-in GAE deployment, so I basically have to use mvn appengine:update manually. This requires me to navigate to a generated URL and manually paste an authentication code into the terminal, which I can't do in automated builds.
It was suggested to me, however, that I do some Unix magic instead. While I can easily pick out the URL I need to navigate to with grep, I still need to log in to Google with my credentials in order to actually get the authentication code (which I can then grep out and pipe to the deployment program).
Given that, how do I log in to Google with my credentials, using only curl or similar command-line utilities?
I've accomplished similar things in the past using Service Accounts. These are likely a good fit for your problem.
Service Accounts will allow you to authenticate and upload your app without manual intervention.
Overview
A Service Account will allow you to do "passwordless" authentication, much as you may already do with ssh, git, etc. by setting up your keys. This will remove the requirement that you log in manually, or follow the road to madness by trying to automate a "manual" login.
There are basically two steps:
Create your service account and key (with the right permissions)
Use that credential instead of what you're doing now
Resources
I think it's better to give a list of resources than concrete instructions: the process is basically impossible to express concisely (even though it's simple, there are bound to be a lot of little things that annoy), everyone's requirements will be slightly different, and Google is likely to change the process at some point.
Using the Google Cloud Platform Console for App Engine | Permissions
Using OAuth 2.0 for Server to Server Applications
Setting up OAuth 2.0 | Service Accounts
gcloud auth activate-service-account
Hopefully that's enough to get you headed in the right direction.
Note
You'll likely have to spend some time looking at your .appcfg_oauth2_tokens_java and sorting out a variety of other annoyances, but I believe that this approach is the best way to solve your problem.
It sounds like you have a pretty straightforward setup and that a Service Account alone will get you there, but if you need to get a little weird, the App Engine Admin API is always there.
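As a rough illustration of what the CI step ends up looking like (the key-file name, service-account address, and exact deploy command are placeholders, and depend on your Maven plugin version and project setup):

    # Sketch of a non-interactive deploy step run by the CI build.
    # Assumes the service-account key file has been added to the build
    # environment (e.g. as an encrypted Travis file).
    import subprocess

    KEY_FILE = "travis-deploy-key.json"                              # placeholder
    SERVICE_ACCOUNT = "deploy@my-project.iam.gserviceaccount.com"    # placeholder

    # Authenticate as the service account instead of a human user;
    # no browser, no pasted authentication code.
    subprocess.check_call([
        "gcloud", "auth", "activate-service-account",
        SERVICE_ACCOUNT, "--key-file", KEY_FILE,
    ])

    # Deploy. Whether mvn appengine:update picks up these credentials
    # directly depends on the plugin version, which is why the token
    # files mentioned in the Note above may still need attention.
    subprocess.check_call(["mvn", "appengine:update"])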
I was handed an assignment but I don't know where to start.
The aim is to have two pieces of code running. One will run in an OpenStack private cloud and perform the task of indexing two sets of text, with the other running in EC2 and performing the task of matching the two indexed texts.
I want to access them via Google App Engine.
Ideally, I would like to click a button or perform an action on Google App Engine, which then sends a request to OpenStack to run the code and retrieves the output txt file.
That output txt file will then be forwarded on to EC2, where the matching will occur and the results will be sent back to Google App Engine.
My question is, how can I send the files between the systems using REST requests?
FrankN --
EC2, GAE and OpenStack are disparate cloud computing platforms. Integrating them might mean, say, using one platform while saving backups to another.
CloudU.Rackspace.com is a vendor-neutral education site about cloud computing (note: It'll take six or so hours to finish it all). This might help.
Disclaimer: I work for Rackspace.
Firstly, I'm not really sure what your requirements are, what your code does, or why you are trying to mix cloud providers in that way.
That said, I would suggest taking the upload from GAE and pushing it to AWS S3, where you can then retrieve it and use it as you please from EC2 (a rough sketch follows below).
I'm not sure what functionality you are trying to get out of OpenStack that is not present in AWS; however, I would suggest building whatever you are building on EC2 first, then duplicating it on OpenStack services to avoid future vendor lock-in.
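For example, the S3 hand-off suggested above could look roughly like this with boto3 (the bucket and key names are placeholders, and the two halves would of course run on different machines):

    import boto3  # pip install boto3; assumes AWS credentials are configured

    s3 = boto3.client("s3")

    # On the producing side (triggered from your GAE front end via a
    # backend that holds the AWS credentials):
    s3.upload_file("indexed_output.txt", "my-transfer-bucket",
                   "jobs/indexed_output.txt")

    # On the EC2 instance that does the matching:
    s3.download_file("my-transfer-bucket", "jobs/indexed_output.txt",
                     "/tmp/indexed_output.txt")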
I am building an Android app that connects to Google Cloud Storage. I want to allow access to GCS from my Android app ONLY.
Google offers three solutions to securely connect to GCS:
OAuth 2.0 (so with a Google account)
Cookie-based authentication (with a Google account too)
Service account authentication (with a private key, but stored locally in the Android app: very bad if someone decompiles my .apk)
Source: https://developers.google.com/storage/docs/authentication?hl=FR
Is there any other solution for connecting securely to GCS? I would like to connect to GCS this way (restricted to an Android client ID, i.e. the SHA1 of your .apk): https://developers.google.com/appengine/docs/java/endpoints/auth
Is it possible with GCS? Should I use Blobstore to do that?
Thanks in advance
This is something of a fundamental problem with computing. You can never completely trust that an application running on hardware that is under the total control of an unknown third party has not been somehow tampered with. There are many, many techniques to make tampering much more difficult, but remote systems will never be completely secure. There are several ways to verify that a user has a particular Google account, but you can't easily trust with certainty that a certain app is exactly your app.
That said, there are plenty of ways to design a secure application without trusting the client. What does your app need to be authorized to do? Upload objects? Download secure objects? Is there something bad that a user masquerading as your application could do?
I think you can use option 1 (OAuth 2.0) to handle authentication. The app will forward the authentication request to your server (with your own app login token), and once the user is validated by your own services, the app will receive the OAuth token it can send to Google Cloud Storage to retrieve the desired file.
I believe there is an undocumented Google API available to create and manage Google Cloud Console (and App Engine) projects on behalf of third party users.
Does anyone know how to use it?
I think older versions of the Google Eclipse Plugin obtained an OAuth2 token in the (undocumented) scope https://www.googleapis.com/auth/appengine.admin, and this allowed it to generate a Cloud Console project on your behalf. The latest version doesn't seem to do this. App Engine's own appcfg.py also uses this scope, but doesn't seem to do much more than deploy the code - I'm looking to change core settings for the project, such as Name, Redirect URLs, and Web Origins.
Any information would be appreciated.
I maintain a WordPress plugin providing secure Google Apps Login for end users, and currently have to give detailed instructions to admins for creating a new Cloud Console project manually, and entering settings such as Redirect URL. Ideally, I would create a simple on-line service to do all of this for them.
Thank you!
It is possible to programmatically create a new Developer Console project on behalf of a Google Account (yes, you read that right). You do so in a very roundabout way:
Request the https://www.googleapis.com/auth/drive.scripts scope from the user (standard OAuth 2.0 flow).
Use the Drive API's drive.insert method to create a new file with a mimetype of application/vnd.google-apps.script.
Somehow try to get the project ID, maybe by uploading some Apps Script code? This is the part that I was never able to figure out.
A little known fact is that every Google Apps Script project has a hidden Developer Console project associated with it. This project is not shown in the list of projects, but it does exist. It is created automatically when the user starts a new Apps Script project, and the drive.insert method is enough to cause this to happen.
How do you get to the hidden project? Well, the only way I know of is to open the Apps Script project from the Drive website, open the "Resources > Advanced Google Services" dialog, and click the link to the Developer Console. You'll find the project ID in the URL.
Aside from not being shown in your list of projects and not being able to use App Engine, this is a normal Developer Console project. You can add additional OAuth client credentials, service accounts, Compute Engine instances, etc. And of course once you have a project ID, all of the various management APIs will work: creating new virtual machines, making use of a service account's impersonation ability, etc.
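For what it's worth, step 2 above looks roughly like this with the Python client. This just mirrors the answer's description: the scope, the metadata-only insert, and whether the hidden project actually materializes are all undocumented behaviour, so treat it as a sketch rather than a supported API.

    from googleapiclient.discovery import build

    def create_apps_script_file(creds, title="seed-project"):
        # creds: OAuth credentials carrying the drive.scripts scope from step 1.
        drive = build("drive", "v2", credentials=creds)
        body = {
            "title": title,
            "mimeType": "application/vnd.google-apps.script",
        }
        # Drive API v2 files.insert -- the drive.insert call referred to above.
        return drive.files().insert(body=body).execute()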