When I re-deploy an app to App Engine, the existing data in the datastore is cleaned up.
How do I prevent this from happening?
Update 1: I am using Java, SDK 1.6.3. This is on the production system.
The datastore in production does not get flushed when you redeploy. If you change model definitions and structures, it may be that the data you need no longer matches the new(er) model definition...
If you are speaking about the datastore on your local machine, start the dev server with the --datastore_path option:
dev_appserver.py --datastore_path=/tmp/myapp_datastore myapp
That is, if you are using Python; I don't know what the equivalent for the Java SDK is.
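For the Java SDK, one workaround that is often suggested (I haven't verified it here, so treat the property name as an assumption) is to point the dev server's backing store at a fixed file via a JVM flag:

# Assumed Java equivalent: persist the local datastore to a known file
dev_appserver.sh --jvm_flag=-Ddatastore.backing_store=/tmp/myapp_datastore/local_db.bin myapp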
You need to provide more details.
It might be that you are changing the app name just prior to deployment (so you're deploying to the correct app). But previous data in your local datastore was stored under the prior name. If you don't change the app name back before using the local datastore again, it can appear as though the data has disappeared.
Is there a way to update selected files when using the App Engine Flexible env?
I'm facing an issue: whenever I make a small change in the app.yaml file, testing it requires deploying the whole application, which takes ~5 minutes.
Is there a way to update only the config file? Or is there a way to test these files locally?
Thanks!
The safe/blanket answer would be no, as the flex env docker image would need to be updated regardless of how tiny the changes are; see How can I speed up Rails Docker deployments on Google Cloud Platform?
However, there might be something to try (YMMV).
From App Engine Flexible Environment:
You always have root access to Compute Engine VM instances. SSH access to VM instances in the flexible environment is disabled by default. If you choose, you can enable root access to your app's VM instances.
So you might be able to login as root on your GAE instance VM and try to manually modify a particular app artifact. Of course, you'd need to locate the artifact first.
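A rough sketch of how that access might look with the gcloud CLI (the service, version, and instance names below are placeholders):

# List the running flex instances to find the one to inspect
gcloud app instances list --service=default --version=v1
# Put the instance into debug mode so SSH (and root) access is allowed
gcloud app instances enable-debug my-instance --service=default --version=v1
# SSH in and locate the artifact you want to modify
gcloud app instances ssh my-instance --service=default --version=v1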
Some artifacts might not even be present in the VM image itself (those used exclusively by the GAE infra, queue definitions for example), but it should be possible to update such artifacts without updating the docker image, since they aren't part of the flex env service itself.
Other artifacts might be read-only and it might not be possible to change them to read-write.
Even if possible, such manual changes would be volatile: they would not survive an instance reload (which would use the unmodified docker image), and a reload might be required for some changes to take effect.
Lots of "might"s and lots of risks (manually fiddling with the app code could negatively impact its functionality), so it's up to you to determine whether a try is really worthwhile.
Update: it seems this is actually documented and supported, see Accessing Google App Engine Python App code in production
I am looking into using Google App Engine for a project and would like to make sure I have a way to export all my data if I ever decide to leave GAE (or GAE shuts down).
Everything I search about exporting data from GAE points to https://developers.google.com/appengine/docs/python/tools/uploadingdata. However, that page contains this note:
Note: This document applies to apps that use the master/slave datastore. If your app uses the High Replication datastore, it is possible to copy data from the app, but Google does not currently support this use case. If you attempt to copy from a High Replication datastore, you'll see a high_replication_warning error in the Admin Console, and the downloaded data might not include recently saved entities.
The problem is that the master/slave datastore was recently deprecated in favor of the High Replication datastore. I understand that the master/slave datastore is still supported for a little while, but I don't feel comfortable using something that has officially been deprecated and is on its way out. That leaves me with the High Replication datastore, and the only way to export the data seems to be the method above, which is not officially supported (and thus does not guarantee that I can get my data out).
Is there any other (officially supported) way of exporting data from the High Replication datastore? I don't feel comfortable using Google App Engine if it means my data could be locked in there forever.
It took me quite a long time to set up the download of data from GAE, as the documentation is not as clear as it should be.
If you are extracting data from a Unix machine, you may be able to reuse the script below.
Also, if you do not provide the config_file parameter, it will extract all your data for the given kind, but in a proprietary format which can only be used for restoring the data afterwards.
#!/bin/sh
#------------------------------------------------------------------
#-- Param 1 : Namespace
#-- Param 2 : Kind (table id)
#-- Param 3 : Directory in which the csv file should be stored
#-- Param 4 : Output file name
#-- Requires BACKUP_USERID, BACKUP_PASSWORD and BACKUP_WEBSITE to be
#-- exported in the environment before running.
#------------------------------------------------------------------
appcfg.py download_data --secure --email=$BACKUP_USERID \
  --config_file=configClientExtract.yml --filename=$3/$4.csv \
  --kind=$2 --url=$BACKUP_WEBSITE/remote_api --namespace=$1 --passin <<EOF
$BACKUP_PASSWORD
EOF
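A hypothetical invocation, assuming the script is saved as exportKind.sh and the BACKUP_* variables are already exported (the namespace, kind, and paths below are illustrative):

# Export the Album kind from the default namespace into /tmp/backups/album.csv
./exportKind.sh "" Album /tmp/backups album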
Currently the App Engine datastore supports another option as well: the built-in backup feature can copy selected data into the Blobstore or Google Cloud Storage. This function is available under the Datastore Admin area in the App Engine console. If required, the backed-up data can then be downloaded from the Blob Viewer or from Cloud Storage. When backing up a High Replication datastore, it is recommended that datastore writes be disabled before taking the backup.
You need to configure a builtin called remote_api. This article has all the information and guidance you need to be able to download all your data, today and in the future.
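For a Python app, enabling that builtin is a small app.yaml addition (a sketch based on the documented builtins syntax; double-check against your runtime version):

# app.yaml: expose /_ah/remote_api so the bulkloader tools can reach the datastore
builtins:
- remote_api: on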
My friend and I are working on a GWT / Google App Engine project, using TortoiseSVN and Google Code to synchronize the code.
We also synchronize the local_db.bin file in the appengine-generated folder, but we can't get it to work: after synchronizing the db file, our local datastore is not updated as we expected.
That is a pain, and I'm worried about our future, when our database gets bigger and more complicated.
Can anyone give me some advice? What should I do to synchronize our local datastores?
I have two suggestions:
1) Use the remote API (https://developers.google.com/appengine/articles/remote_api) to share a GAE-hosted db locally.
2) Maybe you can use Google Drive to sync the folders.
This is a really bad idea. Even if you weren't having trouble making both ends read from the same datastore file, the local datastore is in a binary format, and thus you won't both be able to work on the app at the same time, or you'll get merge conflicts you will be unable to resolve.
Instead, both for collaboration purposes and for testing and deployment, you should provide a set of test data you can easily load into the datastore. Store the test data in version control, and load it in using the bulkloader or your own code; a sketch of the loading step follows below.
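One hedged way to do that loading step with the SDK tools, mirroring the album_loader.py example shown later on this page (the config file, CSV name, and kind are illustrative):

# Load version-controlled test fixtures into the local dev server's datastore
appcfg.py upload_data --config_file=album_loader.py --filename=test_fixtures.csv \
  --kind=Album --url=http://localhost:8080/remote_api <app-directory>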
I am developing an app locally in Google App Engine. I have built a small datastore for development purposes. Rebuilding it after every power cycle on my Mac got tedious, so I made it permanent. Now I run my app locally with the following command:
/usr/local/bin/dev_appserver.py "--datastore_path=./permanent.datastore" appengine_prototype
Life is good. I have decided to deploy my app so I can test HTTP POST commands from a different machine. When I tried to register my current application id (example), I found that it was unavailable (shocker!). So I registered a different application id and planned to change my local application id to match. However, when I changed the
application: *app-id*
line in my app.yaml file, my app stopped recognizing my permanent datastore.
So, how can I change my application id to the one I registered, maintain the connection to the permanent datastore and then push the whole shebang online? I tried running the app twice locally, first with the permanent datastore referenced in the command and then without, hoping that the default temporary datastore would inherit from the previous permanent datastore. That didn't work. Do I need to start by copying the permanent datastore to the default temporary datastore? How would I do that? Any help would be much appreciated.
Thanks,
Dessie
If your intention is to eventually push your local data to your live environment anyway, then your best bet is to:
use bulkloader.py to back up your local data (while using oldid in your config)
then change your config to your newid
then use bulkloader.py to push your data to your new development server (run with --datastore_path=./permanent.datastore2 or something)
then use bulkloader.py to push your data to the GAE production server
Details of bulkloader.py can be found in the docs and an example here; a rough sketch of the whole cycle follows below.
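A hedged sketch of that cycle using the bulkloader's dump/restore mode (app ids, port, and file names are placeholders):

# 1. Dump from the dev server while it is still running under the old app id
bulkloader.py --dump --url=http://localhost:8080/remote_api --filename=backup.sql3
# 2. After switching app.yaml to the new id, restore into a fresh dev datastore
bulkloader.py --restore --url=http://localhost:8080/remote_api --filename=backup.sql3 --app_id=newid
# 3. Finally, restore the same dump to the GAE production server
bulkloader.py --restore --url=http://newid.appspot.com/remote_api --filename=backup.sql3 --app_id=newid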
Is there a way to export the data on my AppEngine database to the development server (for testing purposes etc.) ?
Yes! Check out Google's "Uploading and Downloading Data"
If you'd like to test how your data works with the app before uploading it, you can load it into the development server. Use the --url option to point the tool at the development server URL. For example:
appcfg.py upload_data --config_file=album_loader.py --filename=album_data.csv --kind=Album --url=http://localhost:8080/remote_api <app-directory>
The subsection on uploading and downloading all data is also worth looking at.
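For the download direction, the same tool can pull every entity from production into one archive file (a sketch; the appspot URL and file name are placeholders, and remote_api must be enabled in the deployed app):

# Download all entities from the live app into a local archive
appcfg.py download_data --url=http://yourapp.appspot.com/remote_api --filename=all_data.sql3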
Not yet, it seems.
Of course you can go pulling the data yourself, one batch at a time...
Yes, we can download all data from Google App Engine and upload it to the datastore, but uploading data to the local development server is sometimes painful because of errors. App Engine SDK version differences cause problems like this. For example, I developed an app a year ago and today I want to update it. I downloaded all the data from the real Google App Engine servers, but I can't upload it to the local development server. We use the EntityLoader class for this operation; the entity class imports the db module, but the SDK throws "no module named db".
My suggestion for App Engine lovers: save your initial test data for the future. Don't assume you will always be able to download all your data for testing later. Save your own test data (with SQLite support), and keep a record of the development environment version you used, because SDK version updates can sometimes cause painful times for developers.
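As a hedged illustration of keeping such a SQLite-backed fixture with the old Python SDK (the --use_sqlite flag existed in older dev_appserver.py releases; verify against your SDK version):

# Run the dev server against a persistent SQLite-format datastore file
dev_appserver.py --use_sqlite --datastore_path=./test_fixtures.sqlite myapp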