I'm looking for the updated Django 1.5 command that can do the following action.
python manage.py reset <app>
What I basically want to do is DROP the tables and rebuild the database structure with a manage.py command. The thing is that the reset command no longer works, and
manage.py flush
or
manage.py sqlclear <app>
just drop the database/table contents.
What's the updated reset version for Django 1.5?
I think what you are looking for is South. South is a 3rd-party tool, which may soon be integrated into Django, that assists you with database migrations and schema changes. As it stands, Django 1.5 does not deal with schema changes very well, if at all. The only way to adjust a schema in Django 1.5 is to add new models, and you wouldn't want to engage in the practice of adding a new model just to achieve a table alteration or deletion. Most developers turn to a 3rd-party solution when they need to make schema adjustments.
See http://south.readthedocs.org/en/latest/about.html
See Tutorial http://south.readthedocs.org/en/latest/tutorial/part1.html#tutorial-part-1
South is database-agnostic and handles database migrations for you automatically: if you change your schema, it will detect the change in models.py and generate the appropriate migrations. You can include South as an app in your Django project and install it through pip.
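Conceptually, South's change detection boils down to comparing the fields your models declare now against the fields recorded in the last "frozen" migration, and emitting schema operations for the difference. A rough sketch of that idea (the function and dict names here are illustrative, not South's actual API):

```python
# Illustrative sketch of migration autodetection: diff the previously
# recorded field set against the current one. Not South's real API.

def diff_fields(old, new):
    """Return (added, removed) column names between two field dicts."""
    added = sorted(set(new) - set(old))
    removed = sorted(set(old) - set(new))
    return added, removed

frozen = {"id": "AutoField", "title": "CharField"}
current = {"id": "AutoField", "title": "CharField", "slug": "SlugField"}

added, removed = diff_fields(frozen, current)
print(added)    # ['slug'] -- columns a new migration would ADD
print(removed)  # []       -- columns it would DROP
```

South's real autodetector also handles renames, type changes, and index changes, but the add/remove diff above is the core of it.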
Hope this helps
As a MySQL alternative, you can create an application.py next to manage.py with the following code. This will DROP and CREATE your database, then rebuild it from your models.py:
#!/usr/bin/python
import MySQLdb
import subprocess

dbname = "mydbname"
db = MySQLdb.connect(host="127.0.0.1", user="username", passwd="superpassword", db=dbname)
cur = db.cursor()

# Drop the whole database to drop all its tables at once
cur.execute("DROP DATABASE " + dbname)

# Recreate the empty database
cur.execute("CREATE DATABASE " + dbname)

# Let manage.py syncdb rebuild the tables from models.py
proc = subprocess.call(['python', 'manage.py', 'syncdb'])
print "\n\nFinished!"
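A variant of the same idea is to drop the tables individually instead of the whole database, so database-level settings and user grants survive. The sketch below demonstrates the loop with an in-memory SQLite database so it is self-contained; for MySQL you would list the tables from information_schema and use MySQLdb as in the script above:

```python
# Sketch: drop every table rather than the whole database.
# Demonstrated against an in-memory SQLite DB for self-containment;
# the table names are made up, and with MySQL you would query
# information_schema.tables instead of sqlite_master.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE myapp_author (id INTEGER PRIMARY KEY)")
cur.execute("CREATE TABLE myapp_book (id INTEGER PRIMARY KEY)")

# List the user tables, then drop each one individually.
cur.execute("SELECT name FROM sqlite_master WHERE type = 'table'")
for (name,) in cur.fetchall():
    cur.execute('DROP TABLE "%s"' % name)

cur.execute("SELECT count(*) FROM sqlite_master WHERE type = 'table'")
remaining = cur.fetchone()[0]
print(remaining)  # 0 -- an empty schema, ready for manage.py syncdb
```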
I use the doctrine migrations bundle to track changes in my database structure. I would like to ensure that when I'm deploying / adding a new server for my application that:
(A) the database schema is up to date (doctrine:migrations:migrate)
(B) the database always contains a pre-defined set of data
For (B) a good example is roles. I want a certain set of roles to always be present. I realize it is possible with database migrations, but I don't like the idea of mixing schema changes with data changes. Also, if I use MySQL migrations I would have to create an equivalent SQLite migration for my test database.
Another option I'm aware of is data fixtures. However from reading the documentation I get the feeling that fixtures are more for loading test data. Also if I changed a role name I don't know how that would be updated using fixtures (since they either delete all data in the database before loading or append to it). If I use append then unique keys would also be a problem.
I'm considering creating some sort of command that takes a set of configuration files and ensures that certain tables are always in a consistent state matching the config files - but if another option exists I'd like to use it of course.
What is the best way to handle loading and managing required data into a database?
If you're using Doctrine Migrations, you can generate an initial migration with the whole database schema; after that you should generate a migration (doctrine:migrations:generate or doctrine:migrations:diff) for every change made to the database structure, AND also add queries there that migrate the existing data.
Fixtures are designed to pre-populate data (with doctrine:fixtures:load) and, in my opinion, they should be kept up-to-date with latest database schema and executed after doctrine:migrations:migrate / doctrine:schema:create.
So finally:
Create a base migration with the initial database schema (instead of executing doctrine:schema:create, just generate a migration file and migrate it)
Create new migrations for each database schema change AND for migrating existing data (such as role name changing)
Keep fixtures up-to-date with latest schema (you can use --append option and only update fixtures instead of deleting all database data first)
Then, when deploying new instance you can run doctrine:schema:create, then doctrine:migrations:version --add --all --no-interaction (mark all migrations as migrated, because you have already created latest schema) and doctrine:fixtures:load which will populate data to the database (also latest version, so data migrations from Doctrine migrations files are not required).
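For requirement (B), the key property is that the seeding step must be idempotent: running it on every deploy inserts any missing required rows and never duplicates existing ones. The sketch below shows the idea language-agnostically in Python with SQLite (the role names and table layout are made up); in a Symfony project the same logic would live in a fixture class or a console command:

```python
# Idempotent seeding of required data, keyed on a unique column.
# Table layout and role names are illustrative.
import sqlite3

ROLES = ["ROLE_USER", "ROLE_ADMIN", "ROLE_MODERATOR"]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE role (id INTEGER PRIMARY KEY, name TEXT UNIQUE)")

def seed_roles(conn):
    """Insert any missing role; safe to run on every deploy."""
    for name in ROLES:
        # INSERT OR IGNORE skips rows whose unique key already exists
        conn.execute("INSERT OR IGNORE INTO role (name) VALUES (?)", (name,))

seed_roles(conn)
seed_roles(conn)  # running it twice adds nothing -- idempotent

count = conn.execute("SELECT count(*) FROM role").fetchone()[0]
print(count)  # 3
```

Renaming a role is then a data migration (an UPDATE in a Doctrine migration), while the seeder only guarantees presence.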
Note: Existing instances should NOT use doctrine:schema:update, but only doctrine:migrations:migrate. In our app we even block usage of this command, in app/console:
use Symfony\Component\Console\Output\ConsoleOutput;
use Symfony\Component\Console\Helper\FormatterHelper;

// Deny using the doctrine:schema:update command
if (in_array(trim($input->getFirstArgument()), ['doctrine:schema:update'])) {
    $formatter = new FormatterHelper();
    $output = new ConsoleOutput(ConsoleOutput::VERBOSITY_NORMAL, true);
    $formattedBlock = $formatter->formatBlock(
        ['[[ WARNING! ]]', 'You should not use this command! Use doctrine:migrations:migrate instead!'],
        'error',
        true
    );
    $output->writeln($formattedBlock);
    die();
}
This is what I figured out from my experience. Hope you will find it useful :-)
I have a strange problem.
My django project has myapp module/application. My project uses south to do the schema migrations.
On localhost I ran ./manage.py schemamigration myapp --initial, then I ran the migrate command.
But when I execute the migrate command in the production environment, it doesn't create the corresponding tables (for myapp's models) in the database.
It's strange, because if I execute migrate --list, myapp has two migrations and they are all marked as applied (with the * symbol).
So, I'm thinking about deleting myapp and recreating it from scratch (with the corresponding migrations). Is there a better solution?
EDIT:
I have tried deleting myapp and recreating it from scratch. I also deleted the myapp tables in the database (on localhost and on the production server), and then executed:
schemamigration myapp --initial command on localhost
migrate myapp command on localhost
migrate myapp 0001 --fake on production server
but South still does not create the myapp tables in the database on the production server.
If you accidentally or intentionally dropped a table in your DB and you are trying to run ./manage.py migrate myapp, this will not recreate the dropped table, because South does not inspect your actual DB; it only consults its migration history table.
In case you want to re-create your table, fake a migration back to a previous version and then migrate forward to the latest. Adjust the commands below accordingly:
manage.py migrate myapp 0002 --fake
manage.py migrate myapp
note: 0002 is your previous migration version.
If you deleted your tables, you shouldn't be running --fake unless you ran manage.py syncdb first. With no table, you should be able to run python manage.py migrate myapp and be done with it (or run manage.py syncdb). The first migration created by --initial has the CREATE TABLE statements in it.
--fake explicitly tells South to do nothing but pretend it migrated (performed the DB changes) and mark the history table as such.
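The bookkeeping behind this can be sketched in a few lines: South records each applied migration in its history table and only executes the ones not yet recorded, while --fake records a migration without executing it. The function and names below are mine, a simplified illustration rather than South's real code:

```python
# Simplified model of South's migrate / --fake bookkeeping.
# 'applied' stands in for the south_migrationhistory table.

def migrate(migrations, applied, fake=False):
    """Return the migrations whose DDL actually runs; mark all as applied."""
    ran = []
    for name in migrations:
        if name in applied:
            continue            # already in history: skipped entirely
        if not fake:
            ran.append(name)    # here South would execute the schema DDL
        applied.add(name)       # recorded in history either way
    return ran

applied = set()
print(migrate(["0001_initial", "0002_add_field"], applied, fake=True))  # []
# History now claims both ran, so a real migrate does nothing -- which is
# exactly why faking after dropping tables leaves the tables missing.
print(migrate(["0001_initial", "0002_add_field"], applied))  # []
```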
I know it's a bit late, but I had this same problem, and I found that my manage.py was pointing to the wrong settings file, and hence to the wrong DB. Ensure your manage.py points to the correct settings file and that migrations are being made against the correct DB. This can arise if you are using multiple manage.py files or multiple settings files.
I've been reading up today on database synchronization in Magento.
One thing I am currently struggling with is what needs to be synced during development and during uploads to production. Now assuming that a batch of changes will consist of changes to the DB and code alike, below would be my understanding of a model workflow (I do not currently use a 'stage' server so that is bypassed in this example):
Sync dev DB from production DB
Checkout working copy of code to dev machine
Make changes and test them on dev server
Accept changes and commit them to svn repository
Touch Maintenance.flag on the production server and prepare for upgrades (this altogether eliminates sync issues from users interacting with live data that is about to change, right?)
Merge branches to trunk and deploy repository to production server
Sync dev DB back to production DB and test changes
So items # 1 & 7 I don't fully understand when working with Magento:
What needs to be synced and what doesn't?
It seems ridiculous to me to sync order and customer info, so I wouldn't do it.
I would want product schema and data synced though obviously, and any admin changes, module changes, etc. How to handle that?
What about HOW to sync? (MySql dumps, import/export, etc)
Currently I'm using Navicat 10 Premium which has structure and data sync features (I haven't experimented yet but they look like a huge help)
So I don't necessarily need specifics here (but they would help). More or less I want to know what works for you and how happy you are with that system.
If you are using the CE version, then:
ditch svn and use Git :)
never sync a database; prepare your database upgrades as extension upgrade scripts
have 3 sites: dev, stage, live
the live database is copied over to stage and dev when needed
make all your admin changes on live and just copy the whole database down the line
This way you never have to sync a database, and if you do all config changes via extension upgrade scripts you can cold-boot your Magento onto a new database structure wherever you want without losing the data structure.
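The mechanism underlying "extension upgrade scripts" is version-gated: the platform records the installed version of each module and, on boot, runs every upgrade step whose version is newer than the recorded one, in order. A small Python sketch of that idea (the version numbers and script bodies below are made up, and Magento's actual implementation stores versions in its core_resource table):

```python
# Illustrative sketch of version-gated upgrade scripts: run every step
# newer than the currently installed version, in version order.

def pending_upgrades(installed, scripts):
    """Return the upgrade versions newer than 'installed', sorted."""
    def key(v):
        return tuple(int(p) for p in v.split("."))
    return [v for v in sorted(scripts, key=key) if key(v) > key(installed)]

# Hypothetical upgrade steps keyed by the module version they bring you to.
scripts = {
    "0.1.0": "CREATE TABLE my_table (...)",
    "0.2.0": "ALTER TABLE my_table ADD COLUMN note TEXT",
}

print(pending_upgrades("0.1.0", scripts))  # ['0.2.0']
print(pending_upgrades("0.0.0", scripts))  # ['0.1.0', '0.2.0']
```

Because each environment records its own installed version, copying the live database down to stage/dev and booting simply replays whatever steps that copy hasn't seen yet.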
I use PHPUnit to build a dev DB. I wrote a short script which dumps XML data from the live database, and I used it table by table, munging anything sensitive and deleting what I didn't need. The schema for my dev database never changes and never gets rebuilt; only the data gets dropped and recreated on each PHPUnit run.
May not be the right solution for everyone because it's never going to be good for syncing dev up to stage/production, but I don't need to do that.
The main benefit is how little data I need for the dev db. It's about 12000 lines of xml, and handles populating maybe 30 different tables. Some small core tables persist as I don't write to them and many tables are empty because I do not use them.
The database is a representative sample, and is very small. Small enough to edit as a text file, and only a few seconds to populate each time I run tests.
Here's what it looks like at the top of each PHPUnit test. There's good documentation for PHPUnit and DbUnit
<?php
require_once dirname(dirname(__FILE__)) . DIRECTORY_SEPARATOR . 'top.php';
require_once "PHPUnit/Extensions/Database/TestCase.php";

class SomeTest extends PHPUnit_Extensions_Database_TestCase
{
    /**
     * @return PHPUnit_Extensions_Database_DB_IDatabaseConnection
     */
    public function getConnection() {
        $database = MY_DB;
        $hostname = MY_HOST;
        $user     = MY_USER;
        $password = MY_PASS;
        $pdo = new PDO("mysql:host=$hostname;dbname=$database", $user, $password);
        return $this->createDefaultDBConnection($pdo, $database);
    }

    /**
     * @return PHPUnit_Extensions_Database_DataSet_IDataSet
     */
    public function getDataSet() {
        return $this->createXMLDataSet(dirname(dirname(__FILE__)) . DIRECTORY_SEPARATOR . 'Tests/_files/seed.xml');
    }
}
So, now you just need a seed file that DbUnit reads from to repopulate your database each time Unit tests are invoked.
Start by copying your complete database twice. One will be your dev database and the second will be your "pristine" database, that you can use to dump xml data in case you start having key issues.
Then, use something like my XML dumper against the "pristine" database to get your XML dumps and begin building your seed file.
generate_flat_xml.php -tcatalog_product_entity -centity_id,entity_type_id,attribute_set_id,type_id,sku,has_options,required_options -oentity_id >> my_seed_file.xml
Edit the seed file to use only what you need. The small size of the dev db means you can examine differences just by looking at your database versus what's in the text files. Not to mention it is much faster having less data.
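The dumper script itself isn't shown here, but DbUnit's flat-XML dataset format is simple enough to sketch: a root dataset element containing one element per row, named after the table, with columns as attributes. A minimal Python illustration (the table and column names are just examples):

```python
# Render rows in DbUnit's flat XML format: <dataset> root, one element
# per row named after the table, columns as attributes.
import xml.etree.ElementTree as ET

def rows_to_flat_xml(table, rows):
    """Return a flat-XML dataset string for the given table rows."""
    dataset = ET.Element("dataset")
    for row in rows:
        ET.SubElement(dataset, table, {k: str(v) for k, v in row.items()})
    return ET.tostring(dataset, encoding="unicode")

rows = [
    {"entity_id": 1, "type_id": "simple", "sku": "ABC-123"},
    {"entity_id": 2, "type_id": "simple", "sku": "ABC-124"},
]
print(rows_to_flat_xml("catalog_product_entity", rows))
```

Because each row is one line of attributes, the seed file stays readable and diffable as plain text, which is what makes hand-editing it practical.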
I would like to completely empty the entire database, restoring it to the way it was when I just created it, using Django's manage.py. Possible?
What you can do to flush the DB and avoid any migration (South) problems afterwards is:
first, reset the data from the DB:
python manage.py flush
second, fake the migrations that are already applied:
python manage.py migrate --fake
third, if you have some fixture to load:
python manage.py loaddata my_sweet_json_file
Yes, you can use flush.
That will reset and restore everything in your entire database, regardless of which app or project the models are in. If you have multiple databases, you can specify a single one in particular using the --database switch.
Examples:
python manage.py flush
python manage.py flush --database mydatabase
Is there a tool for Django to install some static data (data necessary to have the application running) into the database?
./manage.py syncdb will create the database schema in the DB; is there some tool for pushing static data into the DB?
Thanks
Fixtures.
EDIT: Better example.
Fill your database with static data you need, then execute command:
./manage.py dumpdata --indent=4 > initial_data.json
After that, every ./manage.py syncdb will insert the data that you entered the first time into the database.
If you are familiar with the JSON format (or XML, or YAML) you can edit the initial data by hand. For details, check out the official documentation.
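For reference, a Django JSON fixture is a list of serialized objects, each naming its model, primary key, and a fields mapping. A tiny example of what you'd find inside initial_data.json (the app and field names here are hypothetical):

```python
# Shape of a Django JSON fixture, built and pretty-printed in Python.
import json

fixture = [
    {
        "model": "myapp.category",   # hypothetical app_label.model_name
        "pk": 1,
        "fields": {"name": "News", "slug": "news"},
    },
    {
        "model": "myapp.category",
        "pk": 2,
        "fields": {"name": "Sports", "slug": "sports"},
    },
]

print(json.dumps(fixture, indent=4))
```

Editing the fixture by hand is just editing this list; loaddata (or syncdb picking up initial_data.json) deserializes each entry back into a model instance.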