In my previous App Engine projects I used the Cloud Datastore, and during development I could debug my app on the local server and it would use a local database, stored in a file I could wipe out if I wanted to start from scratch.
With Cloud Firestore, even when I'm running locally it's talking to my real cloud database. Is there still a local option? Note that I'm not talking about client-side persistence, I'm talking about a mock development DB.
Google recommends setting up multiple projects if you want dev/staging/production, and I'm guessing that's the answer, but I'd like to know before adjusting my workflow.
I think (now, only a few months later) that this is supported. When I run my app using dev_appserver.py, I see a message:
INFO 2019-02-14 00:08:56,030 admin_server.py:150] Starting admin server at: http://localhost:8000
Going to that URL shows me all the instances I have been seeing. These seem to persist even when the dev_appserver is restarted. Reading this and other posts I was convinced that my development server was using my actual cloud database, but going to https://console.firebase.google.com/project/myproject was showing completely different content.
Just to be sure (because Google is Google and everything is named the same): I'm using an App Engine app and a gcloud project, storing things to Firestore using ndb.Models...
Oh, but be careful. My app was also using Cloud Storage (Blobstore?), and even though localhost:8000 showed these, THESE WERE THE REMOTE INSTANCES.
There is a local emulator for Firestore when using the Firebase CLI:
https://firebase.google.com/docs/rules/emulator-setup
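For what it's worth, once the emulator is running (started with the Firebase CLI), the server-side client libraries can be pointed at it through the FIRESTORE_EMULATOR_HOST environment variable. A minimal Python sketch, assuming the google-cloud-firestore package and an emulator listening on localhost:8080 (adjust the host/port and project id to your setup):

    import os

    # Point the client at the local emulator instead of the real project.
    # Assumes the emulator was started separately with the Firebase CLI
    # and is listening on localhost:8080 (adjust to match your setup).
    os.environ["FIRESTORE_EMULATOR_HOST"] = "localhost:8080"

    from google.cloud import firestore

    db = firestore.Client(project="my-project")  # project id is a placeholder

    # Writes and reads now go to the throwaway local database, not the cloud.
    db.collection("users").document("alice").set({"visits": 1})
    print(db.collection("users").document("alice").get().to_dict())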
Related
I use the Google App Engine Standard environment to develop my Python app using Development SDK 1.9.61.
I'm trying to learn to use Google Cloud Storage in my app by following these instructions. I verified that my default and staging buckets do exist via the cloud console, and manually uploaded a sample file to each bucket using my browser.
Next, I programmatically uploaded some files to a bucket (so I thought) via my local development app instance per Google's instructions.
However, when I checked my cloud storage buckets via my GCP Console in my browser, I could not find the files. After searching my local development SDK console, I eventually found the files located in the local "Blobstore Viewer".
I'm confused; based on Google's instructions, I expected to find the files in my project's Cloud Storage bucket.
I searched the App Engine Python Release Notes for some potential SDK version changes to explain this behavior, but couldn't find anything relevant.
Is this the way it's supposed to work? Are Google's instructions in error?
If you upload files to a local development server, those files exist in memory on your machine. The GCP Console doesn't interact with your local development server, it interacts with the public (production) Google Cloud Storage API.
So in essence, the files on your local dev server are in a completely different namespace. If you want to interact with the production version of Google Cloud Storage and see the results in the GCP console, you'll need to use a non-dev-server deployment of your application.
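To make the contrast concrete, here is a rough sketch of talking to the real Cloud Storage API from local code using the standalone google-cloud-storage library, assuming it is installed and credentials are configured; the project, bucket, and object names are placeholders:

    from google.cloud import storage

    # This call goes to the production Cloud Storage API, so the object will be
    # visible in the GCP Console, unlike files written through the dev server's
    # local GCS/Blobstore emulation.
    client = storage.Client(project="my-project")        # placeholder project
    bucket = client.bucket("my-project.appspot.com")     # placeholder bucket
    blob = bucket.blob("uploads/sample.txt")
    blob.upload_from_string("hello from outside the dev server")
    print(blob.public_url)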
I'm creating a Node.js website that probably won't have loads of traffic, and was looking into cheap ways to host the site. I came across Google Cloud offering free usage of their services, with limits. An f1-micro is more than enough for my needs, but I'll happily pay for some usage if it goes over by any chance.
I wanted to set up a CentOS 7 Linux instance on GCE (which I already did) and run my application and REST API on it. Now here comes the problem.
I tried to use Google's Datastore service, but it spun up an App Engine app, and without it Datastore won't work.
Does Datastore rely entirely on App Engine to function? The docs say that if you use any of the client APIs, App Engine is required. How can I query data without using the client API, then? I don't want to use App Engine at the moment, or is Datastore just not for me then?
Thanks for any help!
Some of the underlying infrastructure of Cloud Datastore and App Engine is still tied together for creation, etc. So while creating a Cloud Datastore database also defines an App Engine app for the project, it doesn't require you to use it. You don't get charged for App Engine either, unless you decide to deploy an app using it.
You should be totally fine using the Google Cloud Node client library on the f1-micro instance.
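The answer refers to the Node client, but the pattern is the same across the client libraries. As a rough illustration, a Python sketch with google-cloud-datastore, assuming the VM's service account has Datastore access (project id, kind, and values are placeholders):

    from google.cloud import datastore

    # On a GCE instance, the client picks up the VM's default credentials,
    # so no App Engine runtime is involved at all.
    client = datastore.Client(project="my-project")  # placeholder project id

    key = client.key("Task", "sample-task")          # placeholder kind/name
    entity = datastore.Entity(key=key)
    entity.update({"description": "hello from an f1-micro", "done": False})
    client.put(entity)

    print(client.get(key))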
I've built systems on top of Google's App Engine and leveraged Google's Datastore, but for my new project I'm considering a containerized solution (using Google's Container Engine). Does anyone with experience using both technologies together know:
whether it's possible to use Container Engine with Datastore?
whether it's easy to set up a local containerized dev environment with gcd?
whether there are any serious headaches I should consider before going down this route?
Absolutely! You can run any code you want in Container Engine, and if you add the datastore scope to your cluster when you create it, authentication to the Datastore API will be automatic if you're using Datastore's client libraries or tools.
I'm not familiar with the local gcd environment, so I can't help much here. Testing Docker containers locally before pushing them to the cloud works great, so the only question will be making sure the gcd dev environment can be exposed to your local containerized app.
The dev environment is the one issue I'm not sure of. Using Datastore from Container Engine should work fine.
What I've done is just create a service account and use its JSON key to access Datastore when I'm working locally. It seems to work pretty well.
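For the local case, a sketch of that approach with the Python client library; the key path, project id, and kind are placeholders:

    from google.cloud import datastore

    # Outside of GCP there are no ambient credentials, so load the service
    # account key explicitly (or set GOOGLE_APPLICATION_CREDENTIALS instead).
    client = datastore.Client.from_service_account_json(
        "/path/to/service-account-key.json",   # placeholder path
        project="my-project",                  # placeholder project id
    )

    query = client.query(kind="Task")          # placeholder kind
    for entity in query.fetch(limit=5):
        print(entity)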
I'm building an App Engine application that stores data in Google Cloud Storage. I use the Google Cloud Storage client (GCS) library as suggested.
My app works when deployed on App Engine (reading/writing/listing objects), but I cannot make it work on the development server. The development server keeps returning error 404, and GCS raises NotFoundError. The dev_appserver is supposed to emulate the Cloud Storage functionality without any specific configuration. I see in the log files that the dev server is accepting requests at "/_ah/gcs", yet it seems that there is no handler for that URL. I've tried with versions 1.8.5 and 1.8.6. Apart from my app, not even the demo app provided by Google works.
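For context, the failing operations are ordinary GCS client calls along these lines (the bucket name is a placeholder), which the dev server routes to the /_ah/gcs handler:

    import cloudstorage as gcs

    bucket = "/my-bucket"   # placeholder bucket name

    # Write an object, then list the bucket; both 404 on the dev server.
    gcs_file = gcs.open(bucket + "/hello.txt", "w", content_type="text/plain")
    gcs_file.write("hello world")
    gcs_file.close()

    for obj in gcs.listbucket(bucket):
        print(obj.filename)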
Is there something that I'm missing here, e.g. a special configuration for the dev_appserver?
Sorry, the following change was pushed out too early by mistake. It only works with the 1.8.8 SDK. We are streamlining the release process of the GCS client to align with the SDK.
https://code.google.com/p/appengine-gcs-client/source/detail?r=125
Without this change, the client works on the 1.8.7 SDK.
I need to change the local datastore path of App Engine. I have followed the methods specified here: How can I persist the local datastore for GoogleAppEngineLauncher between reboots?. I tried changing the datastore path, but it didn't work. I am using App Engine SDK 1.6.4, Python 2.7, and NDB as the datastore on Windows 7. Also, I could not find the default datastore location on my computer as stated in the dev_appserver.py --help output (it is a temp location, but I searched while the application was running and the datastore was serving).
My objective is to stop the local datastore from wiping itself on each launch (I am using the Launcher).
The datastore SHOULD persist between reboots.
When you shut down the server, you should see the message "Applying all pending transactions and saving the datastore" in the console. If not, you may be seeing the following issue (two questions, same issue; the first one has a workaround):
GAE SDK 1.6.4 dev_appserver datastore flush
App Engine local datastore content does not persist