I'm building an App Engine application that stores data in Google Cloud Storage. I use the Google Cloud Storage Client (GCS) library, as suggested.
My app works when deployed on App Engine (reading/writing/listing objects), but I cannot make it work on the development server. The development server keeps returning error 404 and the GCS library raises NotFoundError. The dev_appserver is supposed to emulate the Cloud Storage functionality without any special configuration. I see in the log files that the dev server is accepting requests at "/_ah/gcs", yet it seems there is no handler for that URL. I've tried with versions 1.8.5 and 1.8.6. Apart from my app, not even the demo app provided by Google works.
Is there something I'm missing here, e.g. a special configuration for the dev_appserver?
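For context, the reading/writing/listing I'm doing is nothing exotic; it looks roughly like this minimal sketch (the object name is a placeholder and the default bucket is assumed):
# Minimal sketch of GCS client library usage on App Engine (names are placeholders).
import cloudstorage as gcs
from google.appengine.api import app_identity

bucket = "/" + app_identity.get_default_gcs_bucket_name()
filename = bucket + "/demo.txt"

# Write an object; on the dev server this request should be handled by the /_ah/gcs stub.
with gcs.open(filename, "w", content_type="text/plain") as f:
    f.write("hello")

# Read it back and list the bucket.
with gcs.open(filename) as f:
    print(f.read())
for stat in gcs.listbucket(bucket):
    print(stat.filename)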
Sorry, the following change was pushed out too early by mistake. It only works with the 1.8.8 SDK. We are streamlining the GCS client release process to align with the SDK.
https://code.google.com/p/appengine-gcs-client/source/detail?r=125
Without this change, it works on the 1.8.7 SDK.
I have deployed source code via the gcloud command line with no issue. However, I am currently away from my desktop and there is a critical change to my app.yaml file that I would like to make.
Is this possible to do via my Google Cloud account?
You can use the App Engine Admin API to patch the specific version of your service and update the instance type since your app is using App Engine Standard. You can use the "Try this API" feature to update it right from your browser.
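If you prefer to script it instead of using "Try this API", a rough sketch with the Google API Python client might look like the following. All identifiers are placeholders, the call assumes Application Default Credentials are available, and note that instanceClass is only an updatable field for versions that use basic or manual scaling:
# Hedged sketch: patching a deployed version through the App Engine Admin API.
from googleapiclient import discovery

# Uses Application Default Credentials by default.
appengine = discovery.build("appengine", "v1")

request = appengine.apps().services().versions().patch(
    appsId="my-project-id",        # placeholder project ID
    servicesId="default",          # placeholder service name
    versionsId="v1",               # placeholder version ID
    updateMask="instanceClass",
    body={"instanceClass": "B4"},  # example class for basic/manual scaling versions
)
print(request.execute())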
We are planning the migration of an internal app running on Google App Engine Standard Environment for Java 8 from the now superseded App Engine APIs to the recommended client library for Cloud Firestore in Datastore Mode, also in order to enable porting to other execution environments.
Besides a host of issues with missing IN/OR query operators, we are also struggling with the setup for local testing: according to Using the Java 8 Local Development Server,
The development web server simulates Datastore using a local file-backed Datastore on your computer. The Datastore is named local_db.bin, and it is created in your application's WAR directory, in the WEB-INF/appengine-generated/ directory.
but we have no clue about how to connect the Google Cloud Client Library for Datastore to the local emulator.
Defining default credentials with
gcloud auth application-default login
or setting the GOOGLE_APPLICATION_CREDENTIALS environment variable after obtaining the credentials for the service account with something like
gcloud iam service-accounts keys create key.json \
--iam-account=project-id@appspot.gserviceaccount.com
as suggested elsewhere, just results in the client library connecting to the actual cloud server, rather than to the local emulator, as per the ADC policy.
I'd expect the development server to automatically provide connection hints to the client library, but that's apparently not the case.
Any suggestion for setting up a local testing environment, taking into account that we can't just migrate to the standalone Datastore Emulator, as we need other services currently provided only by the App Engine development server (e.g. email submission)?
Edit: After further tinkering, we are working around the issue by using both the Local Development Server and the standalone Datastore Emulator, as follows:
gcloud beta emulators datastore start \
--project=project-id \
--host-port=localhost:8081 \
--data-dir=target/war
DATASTORE_EMULATOR_HOST=localhost:8081 java_dev_appserver.sh \
--port=8080 \
target/war
However, the process is quite cumbersome and difficult to automate: what we are looking for is a way to automatically connect the Google Cloud Client Library to the Datastore Emulator managed by the Java 8 Local Development Server when launching the app with something like the App Engine Maven plugin, e.g. mvn appengine:run.
This GitHub issue was closed with the confirmation that the Datastore Client Library is not compatible with the local Web Server Datastore Emulator.
I actually tried it, to see if it was possible to force a connection to the local Web Server. The code below sets a custom builder with the desired host configuration:
import com.google.cloud.datastore.Datastore;
import com.google.cloud.datastore.DatastoreOptions;
import com.google.cloud.datastore.Entity;
import com.google.cloud.datastore.Key;

// Point the client at the local Web Server instead of the production endpoint.
DatastoreOptions.Builder builder = DatastoreOptions.newBuilder();
builder.setHost("http://localhost:8080");
builder.setProjectId("<PROJECT_ID>");
Datastore ds = builder.build().getService();

// Write an entity, then read it back.
Key key = ds.newKeyFactory().setKind("MyEntity").newKey("mykey");
Entity entity = Entity.newBuilder(key).set("p1", "Hello World!").build();
entity = ds.put(entity);
System.out.println(entity);
entity = ds.get(key);
System.out.println(entity);
After running the local Web Server I noticed a connection was indeed possible; however, the Datastore Client Library returned the following error when trying to store new entities:
[INFO] GCLOUD: com.google.cloud.datastore.DatastoreException: Non-protobuf error: <html><head><title>Error 404</title></head>|<body><h2>Error 404</h2></body>|</html>. HTTP status code was 404.
With the following output by the Web Server:
Oct 02, 2019 3:05:59 PM com.google.appengine.tools.development.jetty9.LocalResourceFileServlet doGet
WARNING: No file found for: /v1/projects/<PROJECT_ID>:commit
I believe this adds to the confirmation that the new library is just not compatible with the old emulator.
The workaround you found is probably the best solution while you work on, or wait for, the full migration to the Datastore mode Emulator.
In my previous App Engine projects I used the Cloud Datastore, and during development I could debug my app on the local server and it would use a local database, stored in a file I could wipe out if I wanted to start from scratch.
With Cloud Firestore, even when I'm running locally it's talking to my real cloud database. Is there still a local option? Note that I'm not talking about client-side persistence, I'm talking about a mock development DB.
Google recommends setting up multiple projects if you want dev/staging/production, and I'm guessing that's the answer, but I'd like to know before adjusting my workflow.
I think (now, only a few months later) that this is supported. When I run my app using dev_appserver.py, I see this message:
INFO 2019-02-14 00:08:56,030 admin_server.py:150] Starting admin server at: http://localhost:8000
Going to that URL shows me all the instances I have been seeing. These seem to persist even when the dev_appserver is restarted. Reading this and other posts, I was convinced that my development server was using my actual cloud database, but going to https://console.firebase.google.com/project/myproject was showing completely different content.
Just to be sure (because Google is Google and everything is named the same), I'm using an App Engine app and a Google Cloud project, storing things to Firestore using ndb.Models...
Oh, but be careful: my app was also using Cloud Storage (Blobstore?), and even though localhost:8000 showed those objects, THEY WERE THE REMOTE INSTANCES.
There is a local emulator for Firestore when using the Firebase CLI:
https://firebase.google.com/docs/rules/emulator-setup
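With the emulator from the Firebase CLI running, the Python client can be pointed at it through the FIRESTORE_EMULATOR_HOST environment variable. A hedged sketch (the host, port, and project ID are placeholders for whatever the emulator prints at startup, and depending on the client version you may also need to pass anonymous credentials explicitly):
# Hedged sketch: talking to the local Firestore emulator instead of the real database.
import os

# Must be set before the client is created; use the host:port the emulator reports.
os.environ["FIRESTORE_EMULATOR_HOST"] = "localhost:8080"

from google.cloud import firestore

db = firestore.Client(project="my-dev-project")  # placeholder project ID
db.collection("smoke_test").document("hello").set({"message": "local only"})
print(db.collection("smoke_test").document("hello").get().to_dict())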
I use the Google App Engine Standard environment to develop my Python app using Development SDK 1.9.61.
I'm trying to learn to use Google Cloud Storage in my app by following these instructions. I verified that my default and staging buckets do exist via the cloud console, and manually uploaded a sample file to each bucket using my browser.
Next, I programmatically uploaded some files to a bucket (so I thought) via my local development app instance per Google's instructions.
However, when I checked my cloud storage buckets via my GCP Console in my browser, I could not find the files. After searching my local development SDK console, I eventually found the files located in the local "Blobstore Viewer".
I'm confused, based on Google's instructions I expected to find the files in my project's cloud storage bucket.
I searched the App Engine Python Release Notes for some potential SDK version changes to explain this behavior, but couldn't find anything relevant.
Is this the way it's supposed to work? Are Google's instructions in error?
If you upload files to a local development server, those files exist only in the development server's simulated storage on your machine. The GCP Console doesn't interact with your local development server; it interacts with the public (production) Google Cloud Storage API.
So in essence, the files on your local dev server are in a completely different namespace. If you want to interact with the production version of Google Cloud Storage and see the results in the GCP console, you'll need to use a non-dev-server deployment of your application.
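If it helps to make that distinction visible at runtime, the dev server identifies itself through the SERVER_SOFTWARE environment variable, so your code can branch on it. A small sketch (not part of Google's instructions, just a diagnostic aid):
# On dev_appserver, SERVER_SOFTWARE starts with "Development"; in production it starts with "Google App Engine".
import os

def running_on_dev_server():
    return os.environ.get("SERVER_SOFTWARE", "").startswith("Development")

if running_on_dev_server():
    print("Uploads land in the local simulated storage (visible in the Blobstore Viewer).")
else:
    print("Uploads go to the real Google Cloud Storage bucket (visible in the GCP Console).")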
We are currently running a combined AppEngine / GCE app and thus far have kept all of our datastore access on the AppEngine side of things. Now we are exploring also allowing our GCE instance to make some queries into the (shared) datastore. To start, I'm trying to figure out how to run things locally. What we have so far:
A Go devappserver running
A Go standalone binary that wants to issue queries to the devappserver datastore.
We installed ('go get') google-api-go-client/datastore/v1beta2 so that we can use an API instead of issuing direct HTTP calls. However we are definitely willing to issue direct HTTP calls if this API library won't work in development.
We have service accounts set up (we already access GCS from GCE) but I doubt that's relevant for running locally...
I've seen some docs but they (a) only talk about Python & Java, and (b) discuss connecting to the (standalone) development datastore server, as opposed to the datastore embedded in AppEngine's devappserver (if those are even different?). There is also the following answer here on StackOverflow, but again it discusses connecting to the standalone development datastore server:
How to connect to the local google cloud Datastore db?
Any pointers would be much appreciated!
Ian
Currently this is not possible in the development environment, for several reasons. The Google Cloud Datastore tool (gcd.sh) uses the Java development server. However, when developing Go for App Engine you use the Python development server, which has different underlying storage. There is a bug to track this issue on the GitHub page.
You can still develop a Google Cloud Datastore application in Go; however, there are many bugs in the current Go client library. Unfortunately, the development server does not currently support the JSON API, which the Go library uses (see the note at the top of the page).
Update: I wanted to make sure proppy's comment was seen as part of the answer. His suggestion does provide a way to use the protocol version of the API, which is probably more stable than the Go client library above. It could also let you use the gcd.sh tool to test this in the development server. You will have to craft the HTTP requests yourself, though, and you won't be able to share the data in the datastore between your application and the Cloud Datastore in development. However, it is definitely a good workaround and lets you use the Cloud Datastore API, which as it develops will be easier to work with than other workarounds.
From proppy:
Note that you can still use the Cloud Datastore Protobuf HTTP API with Go. The protobuf definition is available on GitHub; you can compile it to Go code using the Go protobuf compiler plugin and then send POST HTTP requests to /datastore/{version}/datasets/{datasetId}/{method}.
If the use case from your Go app server is straightforward enough, you may want to implement access by using an API call to your GAE service (perhaps extending the service to receive the API calls).
This has the added benefit of only having to make changes in one place if your datastore definitions or functions change.