Issues with makemigrations and upgrading to Django 1.7

My team had a project running on Django 1.6 with South and we just upgraded to 1.7. I followed the guide, cleared my old South migrations, ran makemigrations and migrate, and ultimately got my server running so I pushed up the changes to our repository.
Now a second developer on my team pulled down from the repo, upgraded to 1.7, and attempted to run migrate (my migrations from makemigrations were in the repo, so there didn't seem to be a need to run makemigrations). However, he's getting "Models aren't loaded yet" whenever he attempts to migrate (even with --fake). How can he get his environment up and running without deleting all my migrations and running makemigrations?
Also, looking ahead, we will have to make new schema migrations in 1.7 before pushing the code to our production server which is still on 1.6. Basically, we'll need to upgrade to 1.7 and then immediately apply new schema changes right after. Will there be any issues if we move off of South and apply new 1.7 migrations at the same time? Will Django know the difference between the initial past migrations that South originally applied vs. the new migrations that were created after moving off of South?

We ended up figuring out how to get this to work.
Many of the "0002" migrations from the initial makemigrations were failing because the tables already existed, so we had to run migrate, fake the failing "0002" migration, run migrate again, and repeat the process until everything went through. It was a pain, but it worked.
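For anyone hitting the same error, the loop we ran looked roughly like this (the app and migration names below are placeholders; substitute whichever migration migrate reports as failing because its table already exists):
python manage.py migrate
python manage.py migrate myapp 0002_auto --fake
python manage.py migrate
...and so on, faking each migration that fails on an already-existing table, until migrate finishes cleanly.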

Related

From staging to existing production Laravel application

Hey folks and moderators,
Let's say I have deployed a Laravel application to a production server. How do I modify and update the application without affecting production data?
Assume that I want to release the next version of the application with a few additional columns in the Users table.
The question is: should I clone the database from live to staging?
What is the right way to modify the staging application and deploy to production without affecting the production database, even though staging has additional tables/columns?
Currently I run two separate environments, drop the production tables, and import from staging. That doesn't seem very efficient.
Is there a better way to go from staging to production?
Thank you!
I've tried searching around, but unfortunately with no luck.
Assume that I want to release the next version of the application with a few additional columns in the Users table.
Yes, write a new migration like this:
Schema::table('users', function (Blueprint $table) {
    $table->string('email');              // Adding
    $table->string('name', 50)->change(); // Changing
});
That covers adding columns; for changing types, renaming, and dropping columns you will also have to install: composer require doctrine/dbal
See more info in the docs: https://laravel.com/docs/5.6/migrations#creating-columns
The question is: should I clone the database from live to staging? What is the right way to modify the staging application and deploy to production without affecting the production database, even though staging has additional tables/columns?
...
Is there a better way to go from staging to production?
I'm assuming you are using Git, a single server, and that you can take your app down for a while, perhaps doing the update when fewer users are on the system.
Push to repository
ssh to production server
php artisan down --message="Your Awesome Message About your update" --retry=60
BACK UP YOUR DATABASE (Just in case something goes wrong!)
git pull
composer update
php artisan migrate
php artisan up
Just be sure that your migrations always have rollbacks!
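If something goes wrong after the migrate step, the last batch of migrations can be reverted (assuming every migration actually implements its down() method):
php artisan migrate:rollback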
You can make another backup/dump after migrating if you want to use real data in staging or development.

Strategy to auto update database on new app version deploy

We have a Java EE web application built with Maven, using JSF 2.2, Tomcat 7 as our server, and MySQL 5.5 as our database. As we develop new features, we sometimes need to change our database structure. At the moment we do all of this manually:
Wait until we have no clients online (around midnight)
Go to Tomcat manager
Undeploy context
Deploy new context
Go to phpMyAdmin and execute the SQL scripts
While our application is still "small", this process is viable, but we are looking to automate it. We already know about Jenkins, which can read our Git repository, build the .war using Maven, and (we're not sure yet) deploy it to Tomcat.
But I am not sure how to automate our SQL scripts so they execute when we deploy a new version. It needs to be robust, so it doesn't mess up our database by, for example, running the same script twice.
My question is whether there is a better deployment process, focused on database changes, that can help me.
Just to add to the other answer about Liquibase: you could also use Flyway.
There are solutions for this out there. One of them is called Liquibase.
You can use Liquibase to apply incremental database changes, along with Jenkins to automate the build process.
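To sketch how these tools avoid the "running it twice" problem, taking Flyway as the example (the file name below is purely a hypothetical illustration): migrations are plain SQL files named by version, such as V2__add_user_email.sql; Flyway records every applied version in a history table inside the target database and, on the next run, only executes versions it has not applied yet. With the flyway-maven-plugin configured, a Jenkins job can apply any pending migrations with:
mvn flyway:migrate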

Entity Framework code-first migrations in multiple development environments

I'm working on a project that uses EF6 with code-first migrations.
All the work until now was done in a Dev environment, including the DB migrations.
I need to deploy the code to another environment (QA), however I've stumbled into a problem:
The DB exists, however there are no tables (I've created the DB manually).
Currently, the code in QA throws Invalid object name 'dbo.__MigrationHistory'. and truly this table doesn't exist in QA, it only exists in DEV where migrations were first enabled.
What is the best practice to work with migrations in multiple development environments (DEV => QA => STG => PROD)?
What is my best course of action?
UPDATE:
I've created the dbo.__MigrationHistory and the schema manually and populated the migrations table from the DEV table. The question still stands since I'll have to deploy to STG and PROD later this month.
If you utilize some kind of continuous integration, you could proceed as follows:
create build configurations in your migrations project that transform connection strings (with XDT) on build to fit each environment's requirements,
build the project with MSBuild, passing the respective configuration parameter,
use migrate.exe (e.g. from the EF NuGet package's tools folder) to run migrations against the built DLL.
If you don't do CI, you can just follow the first step and always deploy with your migrations project built with the respective build configuration.
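A hedged example of the migrate.exe step (the assembly and config paths are placeholders, not from the question): copy Migrate.exe from the EF package's tools folder next to the built migrations assembly and point it at the configuration file that holds the target connection string, e.g.
Migrate.exe MyApp.Migrations.dll /startUpConfigurationFile="..\MyApp.Web\Web.config"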
Note: messing with the __MigrationHistory table manually is never good practice.
To create the database you can use the CreateDatabaseIfNotExists initializer.
There are a few options you can try:
1- use database initializers:
Database.SetInitializer<UserDbContext>(new MigrateDatabaseToLatestVersion<UserDbContext, Configuration>());
2- use SSDT to migrate tables and data between environments (QA, dev, prod, etc.)

Grails database migration changelog "rebase"

Is there an easy way to do a "git rebase"-like operation for Grails Database Migration plugin changelog scripts?
I already have several changelog scripts on top of the initial changelog from an old domain model. Now I'm deploying the application to a new environment and there's no need to migrate the database contents.
I could delete the scripts and generate a fresh initial script from the current domain model but then I'd have to install Grails to the old environment and execute dbm-clear-checksums there, right?
Is there an easier way to tell dbm that I don't want to create an old domain and patch it to current level?
Run the dbm-changelog-sync script - it marks everything as having been run.
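For reference, that is a single command run from the application directory (exact command names may vary slightly between plugin versions):
grails dbm-changelog-sync
It records every changeset in the DATABASECHANGELOG table without executing the changes, so subsequent dbm-update runs treat the current schema as already migrated.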

What is the best website/web-app upgrade process?

We have a great process, using Subversion, for upgrading our clients' websites as far as updating HTML/JS code and assets is concerned, and we are very happy with it.
However, when it comes to upgrading databases, we are without any formal process.
If we add new tables/fields to our development database, when it comes to rolling it out to the production server we have to remember our changes and replicate them. We cannot simply copy the development database on top of the production database as client data would be lost (e.g. blog posts, account info etc).
We are also now in the process of building a web-app which is going to come across the same issues.
Does anyone have a solution that makes this process easier and less prone to error? How do big web-apps get round the problem?
Thanks.
I think that adding controls to the development process is paramount. At one of my past jobs, we had to script out all database changes. These scripts were then passed to the DBA with instructions on which environment to deploy them in. At the end of the day, you can implement technical solutions, but if the project is properly documented (IF!!!), then when it comes time for deployment the developers should remember to ship the migration scripts along with the code files. My $.02
In my opinion your code should always be able to create your database from scratch, so it should handle upgrades too. It should check a field in the database to see what version the schema is at and apply upgrades until it reaches the latest version.
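A minimal sketch of that idea in Python (the table name, upgrade steps, and SQLite driver are all illustrative assumptions, not anything from the question; a real implementation would need proper transactions and error handling):
import sqlite3  # stand-in for whatever database driver you actually use

# Ordered upgrade steps keyed by schema version; in practice these could be .sql files on disk.
UPGRADES = {
    1: "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)",
    2: "ALTER TABLE users ADD COLUMN email TEXT",
}

def current_version(conn):
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER)")
    row = conn.execute("SELECT MAX(version) FROM schema_version").fetchone()
    return row[0] or 0  # an empty table means version 0

def upgrade(conn):
    version = current_version(conn)
    for target in sorted(v for v in UPGRADES if v > version):
        conn.execute(UPGRADES[target])                                    # apply the schema change
        conn.execute("INSERT INTO schema_version VALUES (?)", (target,))  # record the new version
        conn.commit()

if __name__ == "__main__":
    upgrade(sqlite3.connect("app.db"))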
I had some good luck with: http://anantgarg.com/2009/04/22/bulletproof-subversion-web-workflow/
The author has a database versioning workflow (with PHP script), which is decent.
Some frameworks have tools which deal with database upgrades. For example, Rails migrations are pretty nice.
If no convenient tool is available for your platform you could try scripting modifications to your development database.
In my company we use this model for some of our largest projects:
Say X is the version of our application that has just been deployed, and it is no different from the latest development version.
We create a new directory for the scripts, naming it, for example, version X+1, and add it to the Subversion repository.
When a developer wants to modify the development database, he creates an .sql script with a name like "1 - does something.sql" that makes the modifications (they must not be destructive), saves it, and runs it against the development database. He commits the web app code and the SQL scripts together. Each developer does the same and maintains the order in which the scripts are executed.
When we need to deploy version X+1, we copy the X+1 web app code and the scripts to the production server, back up the database, run the SQL scripts one by one against the production database, and deploy the new web application code.
After that we open a new (X+2) SQL script directory and repeat the process ...
We basically take a similar approach to Senad's: we maintain a changes.sql file in our repo that developers put their changes into. When we deploy to production, we:
Run a test deployment to the QA server:
first reproduce the production environment (app & db) in the QA server
run changes.sql against the qa db
deploy the app to qa
run integration tests.
When we are sure the app runs fine in QA with the scripted changes to the db (i.e. nobody forgot to include their db changes in changes.sql, references, etc.), we:
backup the production database
run the scripts in the changes.sql file against the production db
deploy the app
clear the changes.sql file
All of the deployment is run through automated scripts, so we know we can reproduce it.
Hope this helps.
We have a migrations/ folder inside almost every project, and in it are so-called "up" and "down" scripts (SQL). Every developer is obliged to write his own up/down scripts and to verify them against the testing environment.
There are other tools and frameworks for migrations, but we haven't had the time to test them...
Some are: Doctrine, Rails migrations, Propel (I think...), and Capistrano can do it as well.
