There's a similar question that was recently answered on Stack Overflow here: Google Cloud Storage Client not working on dev appserver
The solution there was to either upgrade the SDK to 1.8.8 or fall back to the previous revision of the GCS client library, which didn't have the bug.
I'm currently on 1.8.8 and have tried several revisions of the client library, but /_ah/gcs still doesn't load for me. After using up a significant number of my backend instances trying to understand how GCS and App Engine work together, it would be great if I could just test this on my local server instead!
When I visit localhost:port/_ah/gcs I get a 404 not found error.
Just a heads up: to install the library, all I did was drag and drop the code into my app folder. I'm wondering whether I skipped a setup step? I wasn't able to find the answer in the documentation!
thanks!!
Note
To clarify, this is my first week using GCS, so it's also my first time trying to use the dev server to host it.
I was able to find the Google Cloud Storage files I had written to a bucket locally at:
localhost:port/_ah/gcs/bucket_name/file_suffix
Where port is by default 8080, and the file was written to: /bucket_name/file_suffix
For those trying to understand the full process of setting up a simple Python GAE app and testing local writes to Google Cloud Storage:
1. Follow the Google App Engine "quickstart":
https://cloud.google.com/appengine/docs/standard/python/quickstart
2. Run a local dev server with:
dev_appserver.py app.yaml
3. If using Python, follow "App Engine and Google Cloud Storage Sample":
https://cloud.google.com/appengine/docs/standard/python/googlecloudstorageclient/app-engine-cloud-storage-sample
If you run into "ImportError: No module named cloudstorage", you need to create a file named appengine_config.py:
touch appengine_config.py
and add to it:
from google.appengine.ext import vendor
vendor.add('lib')
GAE runs this script automatically when you start the local dev server with dev_appserver.py app.yaml; it is what lets GAE find the cloudstorage library in your lib/ folder.
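If dev_appserver.py is ever started from a different working directory, a bare 'lib' may not resolve; a slightly more defensive variant of appengine_config.py (my own tweak, not from the docs) anchors the path to the file itself:
import os
from google.appengine.ext import vendor

# Resolve lib/ relative to this file so the vendored packages are found
# regardless of the directory dev_appserver.py is launched from.
vendor.add(os.path.join(os.path.dirname(os.path.realpath(__file__)), 'lib'))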
4. "Writing a file to cloud storage" from the same tutorial:
def create_file(self, filename):
    """Create a file."""
    self.response.write('Creating file {}\n'.format(filename))
    # The retry_params specified in the open call will override the default
    # retry params for this particular file handle.
    write_retry_params = cloudstorage.RetryParams(backoff_factor=1.1)
    with cloudstorage.open(
            filename, 'w', content_type='text/plain', options={
                'x-goog-meta-foo': 'foo', 'x-goog-meta-bar': 'bar'},
            retry_params=write_retry_params) as cloudstorage_file:
        cloudstorage_file.write('abcde\n')
        cloudstorage_file.write('f'*1024*4 + '\n')
    self.tmp_filenames_to_clean_up.append(filename)
Where filename is /bucket_name/file_suffix
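As an aside, reading the file back works the same way; here is a minimal sketch in the spirit of the tutorial's read example (the method name is mine), assuming the same cloudstorage import:
def read_file(self, filename):
    """Read the file back and echo its contents."""
    self.response.write('Reading file {}\n'.format(filename))
    # 'r' is the default mode, so no extra arguments are needed.
    with cloudstorage.open(filename) as cloudstorage_file:
        self.response.write(cloudstorage_file.read())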
5. After calling create_file via a route in your WSGI app, your file will be available at:
localhost:port/_ah/gcs/bucket_name/file_suffix
Where port is by default 8080, and the file was written to: /bucket_name/file_suffix
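For completeness, here is a minimal sketch of what such WSGI wiring can look like; the /create_file route, handler name, and use of the default dev bucket are my own illustration, assuming the cloudstorage library from step 3 is vendored in lib/:
import cloudstorage
import webapp2
from google.appengine.api import app_identity


class CreateFileHandler(webapp2.RequestHandler):
    def get(self):
        # Any /bucket_name/file_suffix path works; the dev server's default
        # bucket keeps the example self-contained.
        bucket_name = app_identity.get_default_gcs_bucket_name()
        filename = '/' + bucket_name + '/file_suffix'
        with cloudstorage.open(filename, 'w', content_type='text/plain') as gcs_file:
            gcs_file.write('hello from dev_appserver\n')
        self.response.write(
            'Wrote {}. Browse it at /_ah/gcs{}\n'.format(filename, filename))


app = webapp2.WSGIApplication([('/create_file', CreateFileHandler)], debug=True)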
Postscript
Unfortunately, I did not find either 3) or 4) in their docs, so I hope this helps someone get set up more easily in the future.
To access GCS objects on the dev_appserver, you must specify both the bucket and object name, i.e. /_ah/gcs/[bucket]/[object].
The storage simulator for the local server works in later versions of the SDK. For Java, there is a dedicated tutorial to follow: "App Engine and Google Cloud Storage Sample".
Related
I am developing an API in golang directly on the "App Engine flexible environment" (formerly known as "Managed VMs").
So far, I have been using this kind of import in my .go files:
import (
    "appengine"
    "appengine/datastore"
    ...)
Recently I decided to use Google Cloud Storage to store images. It requires importing "cloud.google.com/go/storage". My problem is that I'm unable to deploy the app with this import (not found), or with any shortened version ("go/storage") like the ones I use for the appengine imports.
After much research, I found this: https://github.com/golang/appengine#user-content-3-update-code-using-deprecated-removed-or-modified-apis
It explains how to migrate an application from the short imports (deprecated, like mine) to full imports with the repository made explicit (like "google.golang.org/appengine").
I followed the procedure and used the script they provide (aefix) to update my code. They also say to add this line to my app.yaml file:
vm : true
If I do, I get this error message when running 'gcloud app deploy':
ERROR: (gcloud.app.deploy) Your application does not satisfy all of the requirements for a runtime of type [go]. Please correct the errors and try again.
If I don't, none of my imports work and I get the following error:
can't find import: "google.golang.org/appengine/datastore"
Here is my app.yaml file:
runtime: go
api_version: go2
#vm : true
handlers:
- url: /.*
script: _go_app
Of course, all the imports are on the server under $GOPATH/src/, so they're not really missing; they're just badly referenced, I guess.
I've been stuck on this problem for several days; any help of any kind would be appreciated!
Thanks
So sorry - we have some docs to go update. You cannot use the golang/appengine package with the App Engine flexible environment. The aefix tool won't work here either. Instead of the App Engine Go SDK, you want to use the Go client library here:
https://github.com/GoogleCloudPlatform/google-cloud-go
If you were previously using vm:true, you will need to upgrade to env:flex - the instructions (and the note on the go app engine library) are here:
https://cloud.google.com/appengine/docs/flexible/go/upgrading
Let me know if you have any questions!
Could someone help me access Big Query from an App Engine application?
I have completed the following steps -
Created an App Engine project.
Installed google-api-client, oauth2client dependencies (etc) into /lib.
Enabled the Big Query API for the App Engine project via the cloud console.
Created some 'Application Default Credentials' (a 'Service Account Key') [JSON] and saved it/them to the root of the App Engine application.
Created a 'Big Query Service Resource' as per the following -
def get_bigquery_service():
    from googleapiclient.discovery import build
    from oauth2client.client import GoogleCredentials
    credentials = GoogleCredentials.get_application_default()
    bigquery_service = build('bigquery', 'v2', credentials=credentials)
    return bigquery_service
Verified that the resource exists -
<googleapiclient.discovery.Resource object at 0x7fe758496090>
Tried to query the resource with the following (ProjectId is the short name of the App Engine application) -
bigquery=get_bigquery_service()
bigquery.tables().list(projectId=#{ProjectId},
                       datasetId=#{DatasetId}).execute()
Returns the following -
<HttpError 401 when requesting https://www.googleapis.com/bigquery/v2/projects/#{ProjectId}/datasets/#{DatasetId}/tables?alt=json returned "Invalid Credentials">
Any ideas as to which steps I might have wrong or be missing here? The whole auth process seems a nightmare, quite at odds with the App Engine/PaaS ease-of-use ethos :-(
Thank you.
OK so despite being a Google Cloud fan in general, this is definitely the worst thing I have been unfortunate enough to have to work on in a while. Poor/inconsistent/nonexistent documentation, complexity, bugs etc. Avoid if you can!
1) Ensure your App Engine 'Default Service Account' exists
https://console.cloud.google.com/apis/dashboard?project=XXX&duration=PTH1
You get the option to create the Default Service Account only if it doesn't already exist. If you've deleted it by accident you will need a new project; you can't recreate it.
How to recover Google App Engine's "default service account"
You should probably create the default set of JSON credentials, but you won't need to include them as part of your project.
You shouldn't need to create any other Service Accounts, for Big Query or otherwise.
2) Install google-api-python-client and apply fix
pip install -t lib google-api-python-client
Assuming this installs oauth2client 3.0.x, on testing you'll get the following complaint:
File "~/oauth2client/client.py", line 1392, in _get_well_known_file
default_config_dir = os.path.join(os.path.expanduser('~'),
File "/usr/lib/python2.7/posixpath.py", line 268, in expanduser
import pwd
File "~/google_appengine-1.9.40/google/appengine/tools/devappserver2/python/sandbox.py", line 963, in load_module
raise ImportError('No module named %s' % fullname)
ImportError: No module named pwd
which you can fix by changing ~/oauth2client/client.py [line 1392] from:
os.path.expanduser('~')
to:
os.environ.get('HOME')
and adding the following to app.yaml:
env_variables:
HOME: '/tmp'
Ugly but works.
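If you'd rather not patch a vendored library, an alternative sketch (my own workaround, not from any docs) is to make sure HOME is set before oauth2client is imported, for example at the top of appengine_config.py:
import os

# posixpath.expanduser('~') only falls back to the sandboxed pwd module when
# HOME is missing from the environment, so setting it here avoids the
# ImportError without editing oauth2client.
os.environ.setdefault('HOME', '/tmp')

from google.appengine.ext import vendor
vendor.add('lib')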
3) Download GCloud SDK and login from console
https://cloud.google.com/sdk/
gcloud auth login
The issue here is that App Engine's dev_appserver.py doesn't include any Big Query replication (natch), so when you're interacting with Big Query tables it's the production data you're playing with; you need to log in to get access.
Obvious in retrospect, but poorly documented.
4) Enable Big Query API in App Engine console; create a Big Query ProjectID
https://console.cloud.google.com/apis/dashboard?project=XXX&duration=PTH1
https://bigquery.cloud.google.com/welcome/XXX
5) Test
from oauth2client.client import GoogleCredentials
credentials=GoogleCredentials.get_application_default()
from googleapiclient.discovery import build
bigquery=build('bigquery', 'v2', credentials=credentials)
print bigquery.datasets().list(projectId=#{ProjectId}).execute()
[or similar]
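Once that works, running an actual query goes through the jobs endpoint of the same service object; here is a minimal sketch continuing from the step 5 snippet (the SQL and placeholder project id are mine):
# `bigquery` comes from the step 5 snippet above.
response = bigquery.jobs().query(
    projectId='your-project-id',  # placeholder: the same #{ProjectId} as above
    body={'query': 'SELECT 1 AS answer'}
).execute()

# Rows come back as {'f': [{'v': value}, ...]} dicts.
for row in response.get('rows', []):
    print [field['v'] for field in row['f']]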
Good luck!
I'm building a Google App Engine application with a Go backend + Polymer frontend. As a result, I'm using a dispatch.yaml file to serve both at the same time.
The problem I'm facing is that the datastore is empty when I restart my computer. I've tested this on both OSX 10.9.5 and 10.10.4. Both exhibit the same response upon a system reboot. Windows 7, however, seems to hold on to the data.
The documentation suggests that data should persist, since I'm not explicitly calling a clear. It's not. I've tried to set the datastore location myself using this:
dev_appserver.py --datastore_path=~/go_apps/data ~/go_apps/my_app
I'm receiving this error:
google.appengine.tools.devappserver2.errors.AppConfigNotFoundError: "/Users/anthony/go_apps/my_app is a directory but does not contain app.yaml or app.yml
Obviously, since I'm using a dispatch.yaml file, it wouldn't. So, since the backend, which handles the data, does have an app.yaml file, I tried setting it there, using this command:
dev_appserver.py --datastore_path=~/go_apps/data ~/go_apps/my_app/backend
That doesn't seem to work either, as I get this error:
sqlite3.OperationalError: unable to open database file
Okay? Well, I'm not sure where to turn now. From what I could gather from other posts, that data is stored in a temporary location. But I can't seem to set a custom, non-temporary location for the data. So now I'm repopulating the datastore every time I reboot, which seems ridiculous.
* Edit *
I've tried the following, which seems like it tries to launch the app, and creates a datastore.db file at the correct location:
dev_appserver.py --datastore_path ~/go_apps/my_app/data/datastore.db ~/go_apps/my_app/dispatch.yaml ~/go_apps/my_app/backend/app.yaml ~/go_apps/my_app/frontend/app.yaml
However, I'm getting a weird error now:
/var/folders/04/3hxnpxc15wj2k4v40lkdncd00000gn/T/tmpkcQYnFappengine-go-bin/backend.go:13: can't find import: "github.com/gorilla/mux"
Does Go build to that folder temporarily? That import is definitely available and always builds fine when calling goapp serve.
Here is what my imports look like in backend.go:
import (
    //standard library
    "fmt"
    "net/http"
    "time"
    "log"

    //third party
    "github.com/gorilla/mux"
    "github.com/gorilla/securecookie"
    "github.com/dgrijalva/jwt-go"
    "golang.org/x/crypto/bcrypt"

    //my imports
    "github.com/section14/go_polymer_comm_pkg/controller"
)
You have to pass the name of the file to be used as the persisted datastore, not a folder.
Next, provide the folder of your app (the one that contains app.yaml). Don't mix the two. So it should be something like:
dev_appserver.py --datastore_path=~/my_app/my_app.db ~/go_apps/my_app
Details can be found here:
The Go Development Server / Using the Datastore
Notes:
The default datastore file is in the temp folder, and OS X most likely clears that on system restart; that's why it is not preserved for you. Windows 7, on the other hand, does not clear the temp folder on system restarts.
Got it up and running by adding both the GOPATH and GOROOT environment variables to my .bash_profile. In total, these three paths (the first was already set) are needed for it to run:
# Add Google AppEngine path
export PATH=/Users/anthony/go_appengine:$PATH
# GOPATH
export GOPATH=/Users/anthony/go_appengine/gopath
export PATH=$PATH:$GOPATH
# GOROOT
export GOROOT=/Users/anthony/go_appengine/goroot
export PATH=$PATH:$GOROOT
This command is run from inside the project folder (mine resides outside of the appengine folder) to launch it:
dev_appserver.py --datastore_path data/datastore.db dispatch.yaml backend/app.yaml frontend/app.yaml
Notice that the .yaml files are still there. It builds fine with them, and probably builds fine without them if you don't need a dispatch.yaml file.
Thanks @icza for the direction. I wanted to organize the steps in a post for easier reading.
I'm using Google Cloud Endpoints as the back end of a mobile application.
Now I want to implement push notifications for the iOS client, but I can't load the .p12 certificate from an @ApiMethod; I get this error message:
Invalid keystore reference. File does not exist:
/base/data/home/apps/s~my-ws/1.379168523188882449/MyCert.p12"
I added the certificate under the /src directory, but it does not seem to be recognized.
I'm using IntelliJ IDEA, Appengine API 1.9.12, javapns (for Push Notifications) and Maven.
Edit
Maybe I made a step forward.
I put the .p12 file under /src/webapp/WEB-INF/ and added
<configuration>...<webResources><resource><includes><include>*.p12</include>
in my pom.xml.
Then I ran mvn clean install && mvn appengine:endpoints_get_discovery_doc and inspected the generated myws-1.0-SNAPSHOT.war. The .war file contains my MyCert.p12 certificate, but now I get this error message:
java.security.AccessControlException: access denied (\"java.io.FilePermission\" \"/WEB-INF/MyCert.p12\" \"read\")"
Could you check the code that is actually loading the cert? It may be that you need to remove a leading slash from the File constructor. This is not an App Engine thing but a Java File thing.
Also, unrelated to your solution but helpful: there is a thread here on the best ways to store .p12 files on App Engine.
I want to try the Google App Engine SDK with Go.
I was getting the following error:
administrator#jadehol725:~/Documents/softwares/go$ svn checkout http://googleappengine.googlecode.com/svn/trunk/ googleappengine-read-only
svn: E175002: Unable to connect to a repository at URL 'http://googleappengine.googlecode.com/svn/trunk'
svn: E175002: OPTIONS request on '/svn/trunk' failed: 408 Request Time-o
Could you please tell me how to download the Google App Engine SDK for Go?
It may be easier if you go to https://developers.google.com/appengine/downloads, pick "Google App Engine SDK for Go", and select the appropriate platform. There are also install instructions.
I had trouble setting up my first App Engine projects. They don't necessarily play nice with source control when set up in the idiomatic Go style. Check out this starter project for tips: https://github.com/SellJamHere/Go-AppEngineStarter. (Full disclosure, I made it.)
In case it helps, this thread mentions disabling http-compression:
1. Make a backup of the file %APPDATA%\Subversion\servers
2. In a text editor, open the file %APPDATA%\Subversion\servers
3. Under [groups] add this line:
googleappengine = googleappengine.googlecode.com
4. Add this section:
[googleappengine]
http-compression = no
5. Save the file
6. Retry your SVN / TortoiseSVN operation