Could someone help me access Big Query from an App Engine application?
I have completed the following steps -
Created an App Engine project.
Installed google-api-client, oauth2client dependencies (etc) into /lib.
Enabled the Big Query API for the App Engine project via the cloud console.
Created 'Application Default Credentials' (a 'Service Account Key', in JSON form) and saved the file to the root of the App Engine application.
Created a 'Big Query Service Resource' as per the following -
def get_bigquery_service():
    from googleapiclient.discovery import build
    from oauth2client.client import GoogleCredentials

    credentials = GoogleCredentials.get_application_default()
    bigquery_service = build('bigquery', 'v2', credentials=credentials)
    return bigquery_service
Verified that the resource exists -
<googleapiclient.discovery.Resource object at 0x7fe758496090>
Tried to query the resource with the following (ProjectId is the short name of the App Engine application) -
bigquery = get_bigquery_service()
bigquery.tables().list(projectId=#{ProjectId},
                       datasetId=#{DatasetId}).execute()
Returns the following -
<HttpError 401 when requesting https://www.googleapis.com/bigquery/v2/projects/#{ProjectId}/datasets/#{DatasetId}/tables?alt=json returned "Invalid Credentials">
Any ideas as to which steps I might have wrong or be missing here? The whole auth process seems like a nightmare, quite at odds with the App Engine/PaaS ease-of-use ethos :-(
Thank you.
OK, so despite being a Google Cloud fan in general, I have to say this is the worst thing I have been unfortunate enough to work on in a while. Poor/inconsistent/nonexistent documentation, complexity, bugs, etc. Avoid it if you can!
1) Ensure your App Engine 'Default Service Account' exists
https://console.cloud.google.com/apis/dashboard?project=XXX&duration=PTH1
You get the option to create the Default Service Account only if it doesn't already exist. If you've deleted it by accident you will need a new project; you can't recreate it.
How to recover Google App Engine's "default service account"
You should probably create the default set of JSON credentials, but you won't need to include them as part of your project.
You shouldn't need to create any other Service Accounts, for Big Query or otherwise.
2) Install google-api-python-client and apply a fix
pip install -t lib google-api-python-client
Assuming this installs oauth2client 3.0.x, on testing you'll get the following complaint:
File "~/oauth2client/client.py", line 1392, in _get_well_known_file
default_config_dir = os.path.join(os.path.expanduser('~'),
File "/usr/lib/python2.7/posixpath.py", line 268, in expanduser
import pwd
File "~/google_appengine-1.9.40/google/appengine/tools/devappserver2/python/sandbox.py", line 963, in load_module
raise ImportError('No module named %s' % fullname)
ImportError: No module named pwd
which you can fix by changing ~/oauth2client/client.py [line 1392] from:
os.path.expanduser('~')
to:
os.env("HOME")
and adding the following to app.yaml:
env_variables:
  HOME: '/tmp'
Ugly but works.
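A quick way to sanity-check the workaround from inside any handler (a sketch; assumes the env_variables entry above has been deployed):
import logging
import os

# With the app.yaml env_variables entry above, HOME should now resolve to
# /tmp inside the sandbox instead of tripping the pwd import.
logging.info('HOME=%s', os.environ.get('HOME'))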
3) Download GCloud SDK and login from console
https://cloud.google.com/sdk/
gcloud auth login
The issue here is that App Engine's dev_appserver.py doesn't include any local Big Query emulation (natch); so when you're interacting with Big Query tables it's the production data you're playing with, and you need to log in to get access.
Obvious in retrospect, but poorly documented.
4) Enable the Big Query API in the App Engine console; create a Big Query ProjectID
https://console.cloud.google.com/apis/dashboard?project=XXX&duration=PTH1
https://bigquery.cloud.google.com/welcome/XXX
5) Test
from googleapiclient.discovery import build
from oauth2client.client import GoogleCredentials

credentials = GoogleCredentials.get_application_default()
bigquery = build('bigquery', 'v2', credentials=credentials)
print bigquery.datasets().list(projectId=#{ProjectId}).execute()
[or similar]
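If the dataset listing works, a next sanity check is to run an actual query; here's a minimal sketch (the project id and SQL below are placeholders, substitute your own):
from googleapiclient.discovery import build
from oauth2client.client import GoogleCredentials

credentials = GoogleCredentials.get_application_default()
bigquery = build('bigquery', 'v2', credentials=credentials)

# Run a synchronous query via the BigQuery v2 API.
response = bigquery.jobs().query(
    projectId='my-project',        # placeholder project id
    body={'query': 'SELECT 1'}     # placeholder query
).execute()
print response.get('rows')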
Good luck!
I am getting the following error when I run "goapp serve myapp/" from the MyProject folder inside src.
go-app-builder: Failed parsing input: parser: bad import "unsafe" in
github.com/gorilla/websocket/client.go from GOPATH
My file structure is something like this:
$GOPATH/src
|- github.com/gorilla/websocket
|- MyProject
   |- myapp
      |- app.yaml
      |- mainApp.go (which contains the init function and is part of the app package)
Please let me know how to correct this.
On a related subject, I read that Google App Engine provides websocket support only in paid apps. Is there a way for me to test my website before getting into payment mode? Or is there a better option than Google App Engine?
I am developing an API in golang directly on the "App Engine flexible environment" (formerly known as "Managed VMs").
So far, I have been using this kind of import in my .go files:
import (
    "appengine"
    "appengine/datastore"
    ...
)
Recently I decided to use Google Cloud Storage to store images. It requires the import of "cloud.google.com/go/storage". My problem is that I'm unable to deploy the app with this import (not found), or with any other short version ("go/storage") like the one I use for the appengine import.
After much research, I found this : https://github.com/golang/appengine#user-content-3-update-code-using-deprecated-removed-or-modified-apis
It specifies how to migrate an application from short imports (deprecated, like mine) to full imports (with the repository made explicit, like "google.golang.org/appengine").
I followed the procedure and used the script they provide (aefix) to update my code. They also say to add this line to my app.yaml file:
vm : true
If I do, I get this error message when running 'gcloud app deploy':
ERROR: (gcloud.app.deploy) Your application does not satisfy all of the requirements for a runtime of type [go]. Please correct the errors and try again.
If I don't, none of my imports work and I get the following error:
can't find import: "google.golang.org/appengine/datastore"
Here is my app.yaml file:
runtime: go
api_version: go2
#vm : true
handlers:
- url: /.*
  script: _go_app
Of course, all the imports are on the server under $GOPATH/src/, so they're not really missing; they're badly referenced, I guess.
I've been stuck on this problem for several days, so any help of any kind would be appreciated!
Thanks
So sorry - we have some docs to go update. You cannot use the golang/appengine package with the App Engine flexible environment. The aefix tool won't work here either. Instead of the App Engine Go SDK, you want to use the Go client library here:
https://github.com/GoogleCloudPlatform/google-cloud-go
If you were previously using vm:true, you will need to upgrade to env:flex - the instructions (and the note on the go app engine library) are here:
https://cloud.google.com/appengine/docs/flexible/go/upgrading
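For reference, after the upgrade a minimal app.yaml for a flex Go app looks roughly like this (a sketch based on the upgrade guide; the api_version and script handler lines from the standard-environment file are not needed here):
runtime: go
env: flex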
Let me know if you have any questions!
I'm creating a GAE application.
When I set my GAE PHP application as an authorized application to access my
Cloud SQL instance, I get the following warning:
App Engine regions must be the same as Cloud SQL instance regions!
How can I verify or change the region of my GAE application?
Thx
You can use the gcloud app describe --project <projectId> command to get the location.
You cannot change an app's region after you set it.
Refer here.
For example:
$ gcloud app describe --project myapp-1337
authDomain: gmail.com
codeBucket: staging.myapp-1337.appspot.com
defaultBucket: myapp-1337.appspot.com
defaultHostname: myapp-1337.appspot.com
featureSettings:
  splitHealthChecks: true
gcrDomain: us.gcr.io
id: myapp-1337
locationId: us-central
name: apps/myapp-1337
servingStatus: SERVING
You can see the location of your application at [1].
Regarding changing the region, please see [2] for more information.
[1] - https://appengine.google.com/
[2] - Change GAE application location
Update: EU app creation is now possible from the new Developers Console and doesn't require whitelist / Premier status. It looks like the Location tab will only show in the GAE console if the account was whitelisted / Premier. A way to find the app location is still available in the old GAE console: from the list, click on the app to go to its dashboard; if you see e~ in the link after app_id=, your app is in the EU, and if you see s~ it is in the US.
Another alternative is to use the gcloud command suggested by Ilya Zakreuski below.
You can get the AppId from the runtime environment:
Java:
ApiProxy.getCurrentEnvironment().getAppId()
or Python:
os.environ['APPLICATION_ID']
where the prefixes mentioned by @Ilya and @Nikita still apply:
an app id prefixed with e~ means EU, and s~ means US.
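In Python, that check might look like this (a minimal sketch; the e~/s~ partition prefix applies to the runtime application id, which is what the snippet reads):
import os

# Infer the region from the partition prefix on the runtime application id.
app_id = os.environ['APPLICATION_ID']   # e.g. 's~myapp-1337' or 'e~myapp-1337'
if app_id.startswith('e~'):
    region = 'EU'
elif app_id.startswith('s~'):
    region = 'US'
else:
    region = 'unknown (no partition prefix)'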
P.S. It looks like both the dev consoles have been updated to get the App ID from elsewhere, so they don't have this prefix.
I want to try the Google App Engine SDK with Go.
I was getting the following error:
administrator#jadehol725:~/Documents/softwares/go$ svn checkout http://googleappengine.googlecode.com/svn/trunk/ googleappengine-read-only
svn: E175002: Unable to connect to a repository at URL 'http://googleappengine.googlecode.com/svn/trunk'
svn: E175002: OPTIONS request on '/svn/trunk' failed: 408 Request Time-o
Could you please tell me how to download the Google App Engine SDK for Go?
It may be easier if you go to https://developers.google.com/appengine/downloads, pick "Google App Engine SDK for Go", and select the appropriate platform. There are also install instructions.
I had trouble setting up my first app engine projects. They don't necessarily play nice with source control when set up in the idiomatic go style. Check out this starter project for tips: https://github.com/SellJamHere/Go-AppEngineStarter. (Full disclosure, I made it.)
In case it helps, this thread mentions disabling http-compression:
Make a backup of the file %APPDATA%\Subversion\servers
In a text editor, open the file %APPDATA%\Subversion\servers
Under [groups] add this line:
googleappengine = googleappengine.googlecode.com
Add this section:
[googleappengine]
http-compression = no
Save the file
Retry your SVN / TortoiseSVN operation
There's a similar question that was recently answered on Stack Overflow here: Google Cloud Storage Client not working on dev appserver
The solution was to either upgrade the SDK to 1.8.8 or instead use the previous revision of the GCS client library, which didn't have the bug.
I'm currently using 1.8.8 and have tried downloading multiple revisions, but /_ah/gcs doesn't load for me. After using up a significant number of my backend instances trying to understand how GCS and app engine work together, it'd be great if I could just test it on my local server instead!
When I visit localhost:port/_ah/gcs I get a 404 not found error.
Just a heads up, to install the library all I did was drag and drop the code into my app folder. I'm wondering if maybe I skipped a setup step? I wasn't able to find the answer in the documentation!
thanks!!
Note
To clarify, this is my first week using GCS, so this is my first time trying to use the dev server to host it.
I was able to find the google cloud storage files I wrote to a bucket locally at:
localhost:port/_ah/gcs/bucket_name/file_suffix
Where port is by default 8080, and the file was written to: /bucket_name/file_suffix
For those trying to understand the full process of setting up a simple python GAE app and testing local writes to google cloud storage:
1. Follow the google app engine "quickstart":
https://cloud.google.com/appengine/docs/standard/python/quickstart
2. Run a local dev server with:
dev_appserver.py app.yaml
3. If using python, follow "App Engine and Google Cloud Storage Sample":
https://cloud.google.com/appengine/docs/standard/python/googlecloudstorageclient/app-engine-cloud-storage-sample
If you run into "ImportError: No module named cloudstorage" you need to create a file named appengine_config.py
touch appengine_config.py
and add to it:
from google.appengine.ext import vendor
vendor.add('lib')
GAE runs this script automatically when starting your local dev server with dev_appserver.py app.yaml, and running it is necessary for GAE to find the cloudstorage library in your lib/ folder.
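As an aside, instead of dragging the library code into the app folder by hand, you can install it into lib/ with pip. I believe the GCS client is published on PyPI as GoogleAppEngineCloudStorageClient, so something like this should work:
pip install -t lib GoogleAppEngineCloudStorageClient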
4. "Writing a file to cloud storage" from the same tutorial:
def create_file(self, filename):
    """Create a file."""
    self.response.write('Creating file {}\n'.format(filename))

    # The retry_params specified in the open call will override the default
    # retry params for this particular file handle.
    write_retry_params = cloudstorage.RetryParams(backoff_factor=1.1)
    with cloudstorage.open(
        filename, 'w', content_type='text/plain', options={
            'x-goog-meta-foo': 'foo', 'x-goog-meta-bar': 'bar'},
        retry_params=write_retry_params) as cloudstorage_file:
        cloudstorage_file.write('abcde\n')
        cloudstorage_file.write('f'*1024*4 + '\n')
    self.tmp_filenames_to_clean_up.append(filename)
Where filename is /bucket_name/file_suffix
5. After calling create_file via a route in your WSGI app, your file will be available at:
localhost:port/_ah/gcs/bucket_name/file_suffix
Where port is by default 8080, and the file was written to: /bucket_name/file_suffix
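Reading the file back from a handler is symmetric; here's a minimal sketch modelled on the same tutorial's read example (error handling omitted):
def read_file(self, filename):
    """Read the file back from Google Cloud Storage and echo its contents."""
    self.response.write('Reading file {}\n'.format(filename))
    with cloudstorage.open(filename) as cloudstorage_file:
        self.response.write(cloudstorage_file.read())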
Postscript
Unfortunately, I did not find either 3) or 5) in their docs, so I hope this helps someone get set up more easily in the future.
To access gcs objects on dev_appserver, you must specify the bucket & object name, i.e. /_ah/gcs/[bucket]/[object].
The storage simulator for the local server works in later versions of the SDK. For Java, one may choose to follow the dedicated tutorial: “App Engine and Google Cloud Storage Sample”.