Wiping the datastore? - google-app-engine

I'm working on an app engine project (java). I'm using the jdo interface. I haven't pushed the application yet (just running at localhost). Is there a way I can totally wipe my datastore after I publish? In eclipse, when working locally, I can just wipe the datastore by deleting the local file:
appengine-generated/local_db.bin
Is there any facility like that once published?
I'm using jdo right now, but I might switch to objectify or slim3, and would want a convenient way to wipe my datastore should I switch over, or otherwise make heavy modifications to my classes.
Otherwise it seems like I have to set up methods to delete instances myself, right?
Thanks

You can delete entities from the admin console if there aren't many stored in your app. Go to http://appengine.google.com and do it manually. This is easy for fewer than 2,000-5,000 entities.

This question addressed the same topic. There is no single command to drop an entire datastore's worth of data. The only suggestion I have beyond those given in that previous question would be to try out the new Mapper functionality, which makes it easy to map over an entire set of entities, deleting them as you go.
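For what it's worth, the delete-as-you-map pattern is also straightforward to hand-roll. Here is a minimal sketch using the Python SDK's db API for brevity (your project is Java, but the low-level datastore API offers the same keys-only query and batch delete; MyEntity is a placeholder kind):

    from google.appengine.ext import db

    class MyEntity(db.Model):
        pass  # stands in for whichever kind you want to wipe

    def wipe_kind(batch_size=500):
        """Delete every MyEntity in batches; returns the number deleted."""
        deleted = 0
        while True:
            # keys_only avoids fetching payloads we are about to delete
            keys = MyEntity.all(keys_only=True).fetch(batch_size)
            if not keys:
                break
            db.delete(keys)  # one batch RPC per loop iteration
            deleted += len(keys)
        return deleted

On a large datastore you would need to spread this over several requests (or task queue tasks) to stay inside the request deadline, which is exactly the bookkeeping the Mapper framework handles for you.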

Related

Making changes to flask app without db.drop_all(), db.create_all()

I have a flask app that is deployed on Google's App Engine. I have noticed a minor bug and I would like to fix it but my database is already populated.
How can I make this minor code change and deploy it back to my app without losing all my data? (This is probably a basic question, but I'm not finding much; all the tutorials online focus on creating and deploying the app, not on updating it.)
Thus far, I have been dropping and re-creating the tables whenever I redeploy, mostly out of ignorance. Here are the steps I have followed:
1. Make the change in my app
2. Commit and push the changes to the Bitbucket source code repository
3. In the Google Cloud SDK: git pull
4. In the Google Cloud SDK: gcloud app deploy
These steps result in an empty database because the directory I am pushing from my local computer has an empty database. Is this where I should be using git merge?
Is this a database "migration" or is this a "git merge"? I'm not sure what the right terms are to use to research this further. Thanks.
There are a couple of angles to your question. I'm going to try to give you some information, but let me warn you, this isn't going to be a trivial change to your workflow, you'll have to change some things.
First of all, based on the way you worded your question, I get the idea that you commit your database to git along with your code. If I got this right, then this is something you need to stop doing. The database is not code, so it should not be committed to source control.
You should have a completely independent database on each installation of your application. For example, you will have a database on your own machine to do development. You will also need another database in your gcloud deployment. You may need more databases if you have other uses for your application. A very common third database for many people is one that is used for automated tests, which could also be located in your local development machine, but is not the same database that you use for day to day development.
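As a concrete illustration, a common way to keep those databases independent is to read the connection string from the environment, so each installation supplies its own. A minimal Flask-SQLAlchemy sketch (DATABASE_URL is just a conventional variable name here, not something App Engine sets for you):

    import os
    from flask import Flask
    from flask_sqlalchemy import SQLAlchemy

    app = Flask(__name__)
    # Each installation (dev laptop, gcloud deployment, test runner)
    # exports its own DATABASE_URL; the SQLite file is only a local
    # development fallback and is never committed to git.
    app.config['SQLALCHEMY_DATABASE_URI'] = os.environ.get(
        'DATABASE_URL', 'sqlite:///dev.db')
    db = SQLAlchemy(app)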
To make changes to your database schema, you will not drop and recreate tables anymore; that is clearly something you already realized needs improvement. A good approach to making these changes is to use a database migration framework. These tools let you generate short scripts that apply each change to the database in a focused way, without destroying and recreating everything, so in general no data is lost. For Flask-SQLAlchemy, the best option for database migrations is Flask-Migrate, which is a lightweight wrapper around the Alembic migration framework. (I might be biased here, as I'm the author of the Flask-Migrate extension!)
Documentation for Flask-Migrate: https://flask-migrate.readthedocs.io/en/latest/.
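To give an idea of the workflow, here is a minimal sketch of wiring Flask-Migrate into an app like the one above (recent versions expose the commands through the flask CLI; older releases used a manage.py script with Flask-Script):

    from flask import Flask
    from flask_sqlalchemy import SQLAlchemy
    from flask_migrate import Migrate

    app = Flask(__name__)
    app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///app.db'
    db = SQLAlchemy(app)
    migrate = Migrate(app, db)  # registers the "flask db" commands

    # Typical workflow from the command line:
    #   flask db init                          # once: creates migrations/
    #   flask db migrate -m "describe change"  # generate a migration script
    #   flask db upgrade                       # apply it without data loss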

How to back up all Salesforce metadata

I'm trying to figure out the best way to back up all of our Salesforce metadata in our full sandbox.
We've had a large team working on numerous areas of Salesforce (configuration and development) and we've promoted all that code to our full sandbox. Before moving to production, we want to back up all the metadata. We are not concerned about the actual data. We just want to make sure we back up all the metadata in our full sandbox, then promote to our production instance, and finally do a refresh of our full sandbox.
We thought about using a change set, but that would be horribly tedious and time-consuming, and we're not sure it would indeed grab all the metadata.
Would creating an unmanaged package be an option? I've never done anything with packages, so I'm in the dark on that process. Would it be easy to grab all the metadata?
I've read about options using the Ant Migration Tool, which I have no experience using, and it seems to be a little tricky to set up and configure.
I use Eclipse regularly, but I don't believe Eclipse can grab all the metadata (approval processes, etc.)?
Any insight and help on solving this would be greatly appreciated.
Thank you.
I'd like to suggest using version control. It's the best way to manage all the changes in your project, store a history of changes, and work comfortably as a team. I prefer git, but you can pick any other.
For managing changes that can't be retrieved via the Eclipse/Ant migration tool (or any other tool), I use a file named "NonMetaDataChanges" that stores all the configuration steps to be performed on a fresh org to set up the application before and after deployment of the metadata. Usually these manual changes take no more than half an hour.
Also, I've just checked that approval processes can be retrieved via Eclipse.
Isn't the easiest way to create a metadata backup to create or refresh a sandbox? Other than custom settings, what will it miss? With the sandbox, change sets could be created to return production to a happier place.
The best path, I agree, is to put all changes under version control. But until all metadata can be extracted and re-imported, some kind of all-of-the-above approach must do.

syncdb and updating database schema in Django

I have been working on developing the models for a Django app in my spare time. I have an issue when it comes to testing. Whenever I realize I have a mistake in my models, I have to go through the trivial but annoying process of dropping the corresponding database, recreating it, and running python manage.py syncdb. Apparently this is because Django's syncdb cannot change the database schema. I do not fully understand what that means, but I do know it is a pain to keep deleting and recreating my database every time I change something in my models.
Is there a better way of doing this? Is something wrong with my install? I feel like this is such a simple thing to fix that I must be doing something wrong.
No, nothing is wrong with your installation. It is a limitation of Django.
There is a third-party app called django-south, which can be used for managing "migrations" (changes to your database models). It is a very widely used app for this purpose.
There is plenty of documentation online to help you understand how South works and how to use it.
This is a good tutorial on south
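For a flavor of what South replaces the drop-and-recreate cycle with, here is a sketch ('myapp' and the model below are placeholders; South needs to be in INSTALLED_APPS, with one syncdb run to create its bookkeeping table):

    from django.db import models

    class Article(models.Model):
        title = models.CharField(max_length=200)
        # Later you add this field instead of dropping the database:
        summary = models.TextField(blank=True)

    # Initial migration, created once per app:
    #   python manage.py schemamigration myapp --initial
    #   python manage.py migrate myapp
    # After each model change, generate and apply a new migration:
    #   python manage.py schemamigration myapp --auto
    #   python manage.py migrate myapp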

Wordpress distributed development and database management

I am looking for a way to handle distributed development for WordPress. For the moment I have set up a shared git repository in which all the code of the website is versioned. The problem I'm having is how to handle the database. Clearly the site needs to keep running while we (me and the other developers) improve the website locally. This means that users of the website (which is not live yet) will be able to modify our database (user registration, etc.) while we are working on the development of the site locally, using a dump of the database.
What I am trying to understand is the best practice to handle a shared development like this, while the site is running and thus the database can change.
I'm not sure whether you are developing a theme or plugins, but with WordPress a change in the database should not have an effect on your development, unless you set something up where the user can create new custom posts. By that I mean a new "custom post", not a new post based on a custom post, which could potentially change the behavior of what you are developing.
If the user runs into something odd because of what they did, well, that's called bug fixing. The good news is that you can just export and import the database to fix whatever they run into.
Changes to the database's data aren't your problem (exchanging dumps, if needed, solves most of that).
Changes to the structure are another big question; you can take a look at LiquiBase for a solution.

Play! Framework templates on GAE

As far as I've learned, there is no way to write directly to the filesystem on GAE.
Since the templates are stored in the app/view directory, how do you provide an interface for editing the templates in use? Is it possible to give template editors access to the new version's template files, or must they work on a separate development server so that, when everyone is done, the app admin can upload the new version to GAE?
IMO the best solution is to store the templates directly in GAE, in the datastore itself or in the blobstore. I think nothing prevents Play! from loading its VirtualFile from the datastore instead of the file system, but you need to tweak that part of Play! a bit. I don't think it's really terrible, but it's not trivial.
You could save them in the blobstore, but that will be slow. You could build a hybrid solution using Memcache + the datastore, or play around with the Files API.
