This is specifically a question about server-to-server authorisation between a Python Google App Engine app and Google's BigQuery, but it could be relevant for other cloud services.
tl;dr: Is it possible to get the App Engine local development server to authenticate with the remote BigQuery service? Better yet, is there a local BigQuery?
I understand that AppAssertionCredentials does not currently work on the local development server, though that in itself is very frustrating.
The alternative method detailed here, which works for standard Python code outside of the local development server sandbox, does not work for the local development server because even with PyCrypto enabled the sandbox does not allow some POSIX modules, e.g. 'pwd'.
I have got AppAssertionCredentials working on the remote server and the SignedJwtAssertionCredentials method working in native Python locally, so the service accounts are set up properly.
The imports fail within oauth2client/crypt.py within the try/except blocks - after commenting them out the sandbox whitelist exceptions are easily seen.
I've fiddled around with adding 'pwd' to the whitelist, but then another problem cropped up, so I scurried back out of that rabbit hole.
I've tried including PyCrypto directly into the project with similar results.
I've also tried with OpenSSL with similar results.
I have looked for a local App Engine-specific PyCrypto to no avail; have I missed one? I should say this is on Mac OS X; perhaps I should fire up a Linux box and give that a go?
A recent release of Google App Engine SDK added support for the AppAssertionCredentials method on the development server. To use this method locally, add the following arguments to dev_appserver.py:
$ dev_appserver.py --help
...
Application Identity:
  --appidentity_email_address APPIDENTITY_EMAIL_ADDRESS
                        email address associated with a service account that
                        has a downloadable key. May be None for no local
                        application identity. (default: None)
  --appidentity_private_key_path APPIDENTITY_PRIVATE_KEY_PATH
                        path to private key file associated with service
                        account (.pem format). Must be set if
                        appidentity_email_address is set. (default: None)
To use these:
In Google Developer Console, select a project then navigate to "API & auth" -> "Credentials" -> "Create new client ID".
Select "Service account" and follow the prompts to download the private key in PKCS12 (.p12) format. Take note of the email address for the service account.
Make sure you add that service account email address to the "Permissions" tab for any project that contains data it needs to access; by default it is added to the project team in which it was created.
Convert the PKCS12 format to PKCS1 format using the following command:
$ cat /path/to/xxxx-privatekey.p12 | openssl pkcs12 -nodes -nocerts -passin pass:notasecret | openssl rsa > /path/to/secret.pem
Start dev_appserver.py as:
$ dev_appserver.py --appidentity_email_address xxxx@developer.gserviceaccount.com --appidentity_private_key_path /path/to/secret.pem ...
Use the appidentity module and AppAssertionCredentials locally in the same manner as you normally would in production.
Please ensure that /path/to/secret.pem is outside of your application source directory so that it is not accidentally deployed as part of your application.
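With those flags in place, application code can use AppAssertionCredentials unchanged between the local server and production. A minimal sketch (the project ID is a placeholder, and the oauth2client/apiclient import paths are the ones current at the time of writing):

import httplib2
from apiclient.discovery import build
from oauth2client.appengine import AppAssertionCredentials

# the same code path works locally (with the flags above) and in production
credentials = AppAssertionCredentials(scope='https://www.googleapis.com/auth/bigquery')
http = credentials.authorize(httplib2.Http())
bigquery = build('bigquery', 'v2', http=http)

# 'my-project-id' is a placeholder
datasets = bigquery.datasets().list(projectId='my-project-id').execute()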
So searching deeper for PyCrypto and the local App Engine sandbox led me to this thread, and this response specifically...
https://code.google.com/p/googleappengine/issues/detail?id=1627#c22
This is fixed in 1.7.4. However, you must use easy_install -Z
(--always-unzip) to install PyCrypto. The default zipfile option in
OSX 10.8 is incompatible with the sandbox emulation in the
dev_appserver.
The solution turns out to be very straight forward...
I used:
sudo easy_install pycrypto
and it should have been:
sudo easy_install -Z pycrypto
as per the thread above. Using pip will work as well:
pip install pycrypto
or a manual download and install of pycrypto will also work. I tested all three.
If you have installed pycrypto with easy_install and without the -Z flag, then you may want to install pip just so you can easily uninstall pycrypto...
easy_install pip
For the record, I built and installed libgmp, as pip and the manual install showed this warning...
warning: GMP or MPIR library not found; Not building
Crypto.PublicKey._fastmath.
Although this gave me fastmath, it was not essential to solve the problem as the Crypto libs gracefully fall back to slowmath.
Another point that tripped me up for a bit was that I had removed pycrypto from app.yaml whilst testing to see if OpenSSL might give me all I needed.
So don't forget to add...
- name: pycrypto
  version: latest

into app.yaml under the libraries: section.
With this missing, the native _counter library was not imported, hence Crypto.Util.Counter failed, etc.
Also for the record, any talk of having to move Crypto into the app folders themselves, or out of the default Mac OS X location of /Library/Python/2.7/site-packages/Crypto, was only valid in earlier versions of the dev server.
Similarly, there is now no need to edit any _WHITE_LIST_C_MODULES lists (these live in sandbox.py in App Engine 1.8 onwards, which also includes the regex that allows Crypto.Util._counter etc.).
The other bit of the puzzle, in case you get here before discovering the key issue, is that the key file you download from the console is PKCS12 and arrives as hex text, so I converted it to binary and then to a PEM so I could include it in the source code.
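For reference, a minimal sketch of how the resulting PEM can be used locally with SignedJwtAssertionCredentials (the file name and service account address are placeholders):

import httplib2
from oauth2client.client import SignedJwtAssertionCredentials

# read the PEM private key converted from the downloaded .p12 file
with open('privatekey.pem', 'rb') as f:
    private_key = f.read()

credentials = SignedJwtAssertionCredentials(
    'xxxx@developer.gserviceaccount.com',  # placeholder service account address
    private_key,
    scope='https://www.googleapis.com/auth/bigquery')
http = credentials.authorize(httplib2.Http())  # requires PyCrypto (or OpenSSL) to sign the JWT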
I struggled with this one for a day or two, and I was finally able to get localhost working with server-to-server authentication, a service account and a .p12 cert.
If it's at all helpful to anyone, here's a simple gist: https://gist.github.com/dandelauro/7836962
I agree with the first post: the localhost/production impedance mismatch is a real pain in the a**. AppAssertionCredentials is the right way to go in production, and I don't want two different code paths between production and localhost. So the development environments need to be adjusted to be able to perform the required authentication without affecting the main code path.
E.g., perhaps a developer could log in with their own Google account using appcfg.py and then that auth would be cached for a period such that AppAssertionCredentials would work out. The developer's Google account could be granted permissions on the appropriate environments (dev and test for us, e.g.)
re: "local BigQuery" - we have some initial stuff in place that uses SQLLite to simulate BigQuery interactions for unit tests and other offline/local testing, but of course, it's not a great simulation. I agree that all the Cloud Platform products need to spend as much time thinking about the development-time experience as App Engine has.
Is it possible to get the App Engine local development server to authenticate with the remote BigQuery service?
I think it's currently impossible to use AppAssertionCredentials as the authentication method between the BigQuery service and your local App Engine server.
Alternatively, I'm using OAuth2 authentication associated with a specific user (this user must be registered in your project at the Google API console) to access BigQuery from the local App Engine server.
To obtain the user's OAuth2 authorization, I use the oauth2client.client module in the app code.
I hope this will be helpful to your problem.
Updated:
This is what I'm doing for getting the user OAuth2 authorization.
Edited:
Added missing import statement.
Thanks mattes!
import os
import webapp2
import httplib2
from oauth2client.client import OAuth2Credentials
from oauth2client.appengine import StorageByKeyName, CredentialsModel, OAuth2DecoratorFromClientSecrets
from google.appengine.api import users

oauth2_decorator = OAuth2DecoratorFromClientSecrets(
    os.path.join(os.path.dirname(__file__), 'client_secrets.json'),
    scope='https://www.googleapis.com/auth/bigquery')
oauth2_decorator._kwargs = {'approval_prompt': 'force'}


class TestPage(webapp2.RequestHandler):
    @oauth2_decorator.oauth_required
    def get(self):
        user_id = users.get_current_user().user_id()
        credentials = StorageByKeyName(CredentialsModel, user_id, 'credentials').locked_get()
        http = credentials.authorize(httplib2.Http())  # now you can use this http object to access the BigQuery service


application = webapp2.WSGIApplication([
    ('/', TestPage),
    (oauth2_decorator.callback_path, oauth2_decorator.callback_handler()),
], debug=True)
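For completeness, here's a rough sketch of how that http object might be used to issue a query (inside get(), after the authorize call); the project ID, the query and the use of apiclient.discovery here are illustrative assumptions, not part of the original answer:

from apiclient.discovery import build

bigquery = build('bigquery', 'v2', http=http)
result = bigquery.jobs().query(
    projectId='my-project-id',  # placeholder project ID
    body={'query': 'SELECT 17'}).execute()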
Related
I'd like to use the Google Talent Solution (GTS).
The set-up docs explain how to set up a Standard Environment App Engine project using a service account. I've enabled GTS in my App Engine project, enabled Data Logging, and added the Service Account Token Creator role to the App Engine default service account that was created when I enabled GTS: [app-id]@appspot.gserviceaccount.com.
I've read the docs for a Python App Engine project, but they use the deprecated oauth2client API and I'm trying to use google_auth instead (I've installed and vendored google_api and google_auth).
In my vendored appengine_config.py:
from google.appengine.ext import vendor
import os

# add the vendored google-api-python-client directory to sys.path
google_api_path = "%s%s" % (os.path.dirname(os.path.realpath(__file__)), '/applications/[app-id]/modules/google_api')
vendor.add(google_api_path)

# add the vendored google-auth directory (installed into 'google') to sys.path
google_auth = "%s%s" % (os.path.dirname(os.path.realpath(__file__)), '/applications/[app-id]/modules/google')
vendor.add(google_auth)
I installed google_auth into a directory named google, under the path /applications/[app-id]/modules/, which works well with Web2py, a Python framework.
My code:
from google.auth import app_engine
credentials = app_engine.Credentials()
print(credentials.token)
Alas, credentials.token is None
In all this set-up, config and code, what have I missed?
Possibly because oauth2client has been deprecated. From googleapis/oauth2client:
Note: oauth2client is now deprecated. No more features will be added to the libraries and the core team is turning down support. We
recommend you use google-auth and oauthlib. For more details
on the deprecation, see oauth2client deprecation.
But I see google-auth uses gRPC, which at least not long ago wasn't compatible with standard environment GAE apps, see GRPC and types import error in App Engine Datastore, so YMMV.
After making the changes (which I've added to my question), the calls work!
Because I'm using a service account, credentials.token is None, and I can still proceed to call the Google Talent Solution API to add, for example, companies.
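For reference, a minimal sketch of what the working call can look like with google-auth; the 'jobs'/'v3' discovery name and the googleapiclient usage are my assumptions about the Talent Solution API, not details from the original post:

from google.auth import app_engine
from googleapiclient.discovery import build

credentials = app_engine.Credentials()
# credentials.token stays None until the first authorized request triggers a refresh
talent = build('jobs', 'v3', credentials=credentials)
companies = talent.projects().companies().list(
    parent='projects/my-project-id').execute()  # placeholder project ID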
Apologies for the seemingly obvious question, but I figure the answer might help others. I can't for the life of me find documentation on the file path within the Google App Engine VM (Cloud Shell) where the static files are being served from. I need to pull the latest upstream changes from a private GitHub repo.
Note that I navigated elsewhere in the VM and even restarting the session didn't put me in a default project root path within the VM as I expected it to.
There are several issues to address here:
The Cloud Shell is a virtual shell
Google Cloud Shell is an interactive shell environment for Google
Cloud Platform.
The environment where you're working is a container running in a VM in a Google-owned project inside GCP.
You can verify this by checking the metadata server (only available for GCP VMs):
curl -H 'Metadata-Flavor:Google' "http://metadata.google.internal/computeMetadata/v1/?recursive=true&alt=text"
In the metadata provided you'll see how this container is created and configured.
The Cloud Shell is tied to the user, so you'll always access the same environment if you access it with the same credentials, no matter the project. However, if you access with a different user, you'll get a different environment.
You can't access GAE standard instances
GAE is a fully managed environment, and you won't be able to access it. In this way, you won't be able to find the root of the running app engine project.
However, because of the way GAE deploys your code, it uses a staging bucket to gather the code before compiling. You can find your staging bucket through the App Engine Admin API. This is usually staging.<PROJECT_ID>.appspot.com, although you can change this configuration. You can get your files from there.
You can access GAE flex apps
However, the deployment in flex gets your files, builds a Docker container with them, and then deploys this container inside a VM.
As per the docs, you can connect directly to your container by running:
gcloud app instances ssh [INSTANCE-NAME] --service [SERVICE] --version [VERSION]
docker exec -it gaeapp /bin/bash
Regarding your issue
According to what you say in the comments of the question, your issue could come from a myriad of places: from changing the shell you're connecting to, to resetting your shell environment (deleting all the files), to a thousand different possible problems.
The best way to think about it is to regard the Cloud Shell as a temporary environment to run commands, but not as a virtual machine.
Knowing that, you could mount a persistent filesystem (GCS through GCSFuse, Cloud Filestore, ...) to persist your work, or simply use Git to have your work always synced on a repo.
GAE Flex has some nice CI integrations, so that's a plus for going the Git route.
I would like to debug my Google App Engine (GAE) app locally but without using localhost. Since my application is made up of microservices, the urls in a production environment would be along the lines of:
https://my-service.myapp.appspot.com/
But code in one service can call another service, and that means the URLs are hardcoded. I could of course use a mechanism in code to determine whether the app is running locally or on GAE and use different URLs, although I don't see how a local URL would handle this, since the only way to run an app locally is to use localhost. Hence:
http://localhost:8080/some-service
Notice that "some-service" maps to a servlet, whereas "my-service" is a name assigned to a service when the app is uploaded. These are really two different things.
The only possible solution I was able to find was to use a reverse proxy which would map one url to a different one. Still, it isn't clear whether the GAE development SDK even supports this.
Personally I chose to detect the local development vs GAE environment and build my inter-service URLs accordingly. I feel it was a worthwhile effort; I've been (re)using it a lot. No reverse proxy or any other additional ops necessary, it just works.
Granted, I'm using Python, so I'm not 100% sure a complete similar Java solution exists. But maybe it can point you in the right direction.
To build the per-service URLs I used modules.get_hostname() (the implementation is presented in Resolve Discovery path on App Engine Module). I believe the Java equivalent would be getInstanceHostname() from com.google.appengine.api.modules.
This method, when executed on the local server, automatically provides the particular port the server listens to for each service.
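A rough Python sketch of the idea (the service name and path are placeholders):

from google.appengine.api import modules

def service_url(service_name, path='/'):
    # on the local dev server this resolves to 'localhost:<port>' for that
    # service; in production to '<service>.<app-id>.appspot.com'
    hostname = modules.get_hostname(module=service_name)
    scheme = 'http' if hostname.startswith('localhost') else 'https'
    return '%s://%s%s' % (scheme, hostname, path)

url = service_url('my-service', '/some-endpoint')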
BTW, all my services for an app are executed by a single development server process, which listens on multiple ports (this is, I guess, how it can provide the modules.get_hostname() info). See Running multiple services using dev_appserver.py on different ports. This is part I'm unsure about: if/how the java local dev server can simultaneously run multiple services. Apparently this used to be supported some time ago (when services were still called modules):
Serving multiple GAE modules from one development server?
GAE modules on development server
This can be accomplished with the following steps:
Create an entry in the hosts file
Run the App Engine Dev server from a Terminal using certain options
Use IntelliJ with Remote debugging to attach to the App Engine Dev server.
To edit the hosts file on a Mac, edit the file /etc/hosts and supply the domain that corresponds to your service. Example:
127.0.0.1 my-service.myapp.com
After you save this, you need to restart your computer for the changes to take effect.
Run the App Engine Dev server manually:
dev_appserver.sh --address=0.0.0.0 --jvm_flag=-Xdebug
--jvm_flag=-Xrunjdwp:transport=dt_socket,server=y,suspend=n,address=8000
[path_to_exploded_war_directory]
In IntelliJ, create a debug configuration. Use the Remote template to create this configuration. Set the host to the url you set in the hosts file and set the port to 8000.
You can set a breakpoint and run the app in IntelliJ. IntelliJ will attach to the running instance of App Engine Dev server.
Because you are using a port during debugging, and no port is actually used when the app is uploaded to GAE for production, you need to add code that identifies when the app is running locally and when it's running on GAE. This can be done as follows:
private String mServiceUrl = "my-service.my-app.appspot.com";
...
if (SystemProperty.environment.value() != SystemProperty.Environment.Value.Production) {
    mServiceUrl += ":8000";
}
See https://cloud.google.com/appengine/docs/standard/java/tools/using-local-server
An improved solution is to avoid including the port altogether and not having to use code to determine whether your app is running locally or on the production server. One way to do this is to use Charles (an application for monitoring and interacting with requests) and use a feature called Remote Mapping, which lets you map one URL to another. When enabled, you could map something like:
https://my-service.my-app.appspot.com/
to
https://localhost:8080
You would then enable the option to include the original host, so that this gets delivered to the local dev server. As far as your code is concerned it only sees:
https://my-service.my-app.appspot.com/
although the IP address will be 127.0.0.1:8080 when remote mapping is enabled. Using https on localhost, however, requires that you enable SSL certificates for Charles.
For a complete overview on how to setup and debug microservices for a GAE Java app in IntelliJ, see:
https://github.com/JohannBlake/gae-microservices
Suppose I uploaded another version (say, "version: 2" in app.yaml) of my Google App Engine application. Version 1 is still the default and version 2 is for testing. How do I run it then?
Once you upload a version on Appengine, you can switch between them easily.
Say that your app name is myapp, currently running version 1. You also have uploaded a version called 2-testing. Your default app (with version 1) can be reached by accessing myapp.appspot.com
If you wanted to access your versions explicitly, you just need to access <version_name>-dot-myapp.appspot.com. Following the example it would be:
1-dot-myapp.appspot.com or 2-testing-dot-myapp.appspot.com
The -dot- is equivalent to <version>.<appname> but allows you to correctly serve a secure application with SSL.
You can mark any version you want as default (serving myapp.appspot.com) using the admin console
Edit: this is the official documentation page about domains and subdomains in App Engine.
Under Versions in your admin console you can find the live URI of a version, if you select the version.
And you can use traffic splitting, where you can use your own client IP or a cookie to test a version.
Docs: https://developers.google.com/appengine/docs/adminconsole/trafficsplitting
I've been researching a solution to this all week, and while there have been solutions to similar problems, none address and rectify this problem directly.
I have created a web application project using Google App Engine and Google Cloud SQL.
Running the GAE application using the Eclipse Google plugin and a local MySQL server, the application works great.
When running the application from the command line using:-
sudo /opt/appengine-java-sdk-1.6.1/bin/dev_appserver.sh --jvm_flag=-Drdbms.server=local --jvm_flag=-Drdbms.driver=com.mysql.jdbc.Driver --jvm_flag=-Drdbms.url=jdbc:mysql://localhost:3306/twincam?user=root --port=7070 /home/ben/workspace/Twincam/war
I get the following:-
java.lang.IllegalStateException: java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at com.google.appengine.api.rdbms.dev.LocalRdbmsServiceLocalDriver.registerDriver(LocalRdbmsServiceLocalDriver.java:95)
I have the classpath referencing the mysql-connector.jar located at /Twincam/war/WEB-INF/lib/mysql-connector-java-5.1.18-bin.jar, referenced by my user library, as in the following .classpath file and directory structure:-
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
    <classpathentry kind="src" path="src"/>
    <classpathentry kind="con" path="com.google.appengine.eclipse.core.GAE_CONTAINER"/>
    <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER"/>
    <classpathentry kind="lib" path="war/WEB-INF/lib/mysql-connector-java-5.1.18-bin.jar"/>
    <classpathentry kind="lib" path="war/WEB-INF/lib/gson-2.1-javadoc.jar"/>
    <classpathentry kind="lib" path="war/WEB-INF/lib/gson-2.1-sources.jar"/>
    <classpathentry kind="lib" path="war/WEB-INF/lib/gson-2.1.jar"/>
    <classpathentry kind="output" path="war/WEB-INF/classes"/>
</classpath>
Update: I checked file permissions and all are set at the default 664 so I'm confident that this is not the problem.
I had the same problem.
I solved it by dropping the mysql .jar in appengine-java-sdk-x.x.x/lib/impl.
Matt's answer helped me a lot; I believe this is a more complete explanation.
I can verify that you can get a local MySQL instance to work with Google App Engine running in development mode, so as not to incur the upcoming costs associated with the Cloud SQL option from Google while developing.
First, as Matt said you have to put the mysql-connector jar into the APPENGINE_HOME/lib/impl.
I am on Windows. I did this by first finding where my SDK resides. In my project in Eclipse, in the Package Explorer, I right-click on "App Engine SDK [App Engine - 1.6.4]", select Properties from the drop-down menu, and in the resulting pop-up click the blue "Configure SDKs..." link.
This reveals the location of my App Engine SDK. Go to that folder in a Windows Explorer window, open lib/impl and drop in your mysql-connector jar, copied from your GAE Eclipse project. My path was:
C:\Software\eclipse\plugins\com.google.appengine.eclipse.sdkbundle_1.6.4.v201203300216r37\appengine-java-sdk-1.6.4\lib\impl
Per the instructions here (https://developers.google.com/eclipse/docs/cloudsql-createapp), you are supposed to leave your Java code's connection string pointed at prod (jdbc:google:rdbms://... instead of jdbc:mysql://...), BUT you need to go into Eclipse project properties, Google, App Engine, Google Cloud SQL and, under "Development SQL instance (used by local development server)", select the radio button "Use MySQL instance". The next time you launch GAE, the connection string in your Java code will be ignored in favor of your local MySQL host.
Make sure the MySQL service is running and you are good to go.
This took me longer to figure out than it should have. I think the key is this passage from the Google docs; if you don't understand and use this information, you will try to use a MySQL JDBC string and run into socket permission errors, because GAE can't go to port 3306 unless you do as I describe:
You do not need to explicitly connect to the Development SQL instance in your code - this is done for you automatically when you run your application in the development server. The development SQL instance to connect to is passed automatically to your development server via VM arguments by GPE at runtime.
(I uploaded 4 images to help with this explanation and only after I finished does it tell me you need 10 reputation points to load images--jeesh)
AFAIK, if you are using GAE then you cannot use the MySQL JDBC driver. I don't know how it could work in Eclipse while the Socket class is restricted in GAE.
You should use the internal GAE driver instead of the MySQL JDBC one:
com.google.appengine.api.rdbms.AppEngineDriver
Then, in Eclipse you can configure the connection to your local MySQL DB as in this example.
See also this example of how it is configured.
Compiling as pure GWT and putting it into Tomcat with JDBC should work, but not as a GAE application.
I just installed Eclipse + GPE 4.2 on a new machine using GAE SDK 1.7.5 on Juno Service Release 1 (Build id: 20121004-1855) and was forced to copy the JDBC driver into appengine-java-sdk-x.x.x/lib/impl.
Initially I had forgotten about this and I spent about 2 days debugging my code.
It would be nice if there was more information about issues like this available on the GPE page; it would be nice to be able to Google more information.
Well, after reading everything you guys have written, and some other material, I can finally list data from a table in my Google Cloud SQL instance with GAE. I just followed these steps and it works!
1.- I followed this example: Using Google Cloud SQL with App Engine Java SDK,
but in the guestbook.jsp I added this line:
<%@ page import="com.google.appengine.api.rdbms.AppEngineDriver" %>
Then I changed the URL connection to this, with no username in my case.
url = "jdbc:google:rdbms:INSTANCE_NAME/DATABASE_NAME";
Also, I commented out this line:
// Class.forName("com.mysql.jdbc.GoogleDriver");
2.- As described above, I copied the mysql-connector.jar, dropping it into appengine-java-sdk-x.x.x/lib/impl.
3.- I configured the project as described here: Create a New Web App with Cloud SQL Support.
4.- I configured my Google Cloud SQL instance as described here: Configuring Access.
5.- Finally, I deployed the project as described here: Uploading Your Application.
I can run it locally and it also works well.
Well, I hope this helps, thanks!