I'm using Django with Google App Engine. I've amended my models in my development environment, and when I run manage.py makemigrations and manage.py migrate I'm told no changes have been made, but when I run my app I get a 1054 unknown column error.
Here's how I've solved it in the development environment; I'm not sure what I'll do in production once it is live.
I cleared the database completely using MySQL:
drop database mydb;
create database mydb;
I deleted the migrations folder from my directory structure, then ran:
manage.py makemigrations myapp
manage.py migrate myapp
manage.py migrate does not check that the tables are correct and match the models; it only consults the migration history (the django_migrations table).
If that history says everything has been applied, Django will not change the database, even if some tables are wrong.
Since it's your development environment, you can simply drop the database, create it again, and run manage.py migrate.
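For what it's worth, you can inspect that history before dropping anything; showmigrations (available in Django 1.9+) lists what Django believes has been applied:
manage.py showmigrations myapp   # [X] marks migrations recorded as applied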
Hey folks and moderators,
Let's say I have deployed a Laravel application to a production server. How do I modify and update the application without affecting production data?
Assume I want to release the next version of the application with a few additional columns on the users table.
The question is: should I clone the database from live to staging?
What is the right way to modify the staging application and deploy it to production without affecting the production database, even though staging has additional tables/columns?
Currently I run two different environments, and I drop the production tables and import them from staging. That does not sound efficient.
Is there a better way to promote changes from staging to production?
Thank you!
I've tried searching around, but unfortunately with no luck.
Assume I want to release the next version of the application with a few additional columns on the users table.
Yes, write a new migration like this:
Schema::table('users', function (Blueprint $table) {
    $table->string('email');               // adding a column
    $table->string('name', 50)->change();  // changing a column
});
That's for adding columns. For changing types, renaming, and dropping columns, you will also have to install: composer require doctrine/dbal
See more info in the docs: https://laravel.com/docs/5.6/migrations#creating-columns
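For completeness, a minimal sketch of what the full migration class might look like (the class name and the nullable() call are my assumptions, so the column can be added to a table that already has rows):

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

class AddEmailToUsersTable extends Migration
{
    // Runs on `php artisan migrate`
    public function up()
    {
        Schema::table('users', function (Blueprint $table) {
            $table->string('email')->nullable(); // hypothetical new column
        });
    }

    // Runs on `php artisan migrate:rollback`
    public function down()
    {
        Schema::table('users', function (Blueprint $table) {
            $table->dropColumn('email');
        });
    }
}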
The question is: should I clone the database from live to staging? What is the right way to modify the staging application and deploy it to production without affecting the production database, even though staging has additional tables/columns?
...
Is there a better way to promote changes from staging to production?
I'm assuming you are using Git, that you have a single server, and that you can take your app down for a while; maybe do the update when fewer users are on the system.
Push to repository
ssh to production server
php artisan down --message="Your Awesome Message About your update" --retry=60
BACK UP YOUR DATABASE (Just in case something goes wrong!)
git pull
composer update
php artisan migrate
php artisan up
Just be sure that your migrations always have rollbacks!
You can make another backup/dump after migrating if you want to use real data in staging or development.
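Put together, a deploy session might look like this on the server (a sketch assuming MySQL; the mysqldump line and the --force flag are my additions, since artisan asks for confirmation before migrating in production):

php artisan down --message="Your Awesome Message About your update" --retry=60
mysqldump -u dbuser -p mydb > backup-$(date +%F).sql   # safety backup first
git pull
composer update
php artisan migrate --force   # --force skips the production confirmation prompt
php artisan up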
I have a SQL Server CE database on my 'live' host that I deployed a few weeks ago. It has a migration history of two old migrations. Then I have my dev database, that has gone through umpteen migrations, and several delete and recreate moments.
Now I would like to use EF migrations to build a migration that will update the production db to match my code-first model on dev. I thought that if I cleared the prod migration history, and ran Add-Migration, EF would compare database and model, and generate a migration class to bring the db up to date with the model.
What really happens is that the migration that gets generated tries to create the whole db: all tables, FKs, and indexes. How do I get a proper update-only migration using EF migrations?
If you still have the deployed migration on your Dev box, you can create a script that will bring the deployed version up to date:
Update-Database -Script -SourceMigration: VersionDeployed -TargetMigration: CurrentMigration
You could also try pulling the PROD database down locally with its migration history intact (don't clear it). EF should then compare the model stored in the last migration to the current model based on code.
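As a usage sketch (run in the Package Manager Console; the migration name is a placeholder), note that if you omit -TargetMigration, EF targets the latest migration by default:

# Emits the SQL needed to move a database at VersionDeployed up to the
# latest migration, instead of applying the changes directly
Update-Database -Script -SourceMigration: VersionDeployed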
http://cpratt.co/migrating-production-database-with-entity-framework-code-first/
Is there an easy way to do a "git rebase"-like operation for Grails Database Migration plugin changelog scripts?
I have already several changelog scripts on top of the initial changelog from an old domain model. Now I'm deploying the application to a new environment and there's no need to migrate the database contents.
I could delete the scripts and generate a fresh initial script from the current domain model, but then I'd have to install Grails in the old environment and execute dbm-clear-checksums there, right?
Is there an easier way to tell dbm that I don't want to create an old domain and patch it to current level?
Run the dbm-changelog-sync script; it marks everything as having been run.
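For reference, a sketch of the call (assuming the plugin's default changelog location):

grails dbm-changelog-sync   # records every changeset as executed without running it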
This might sound like a noob question, but I have a project that is using PSQL and Rails.
It is hosted on heroku.
My friend would like to help me with the development of this project.
Let's say he clones my heroku project and sets up his own database locally, makes changes to its schema, etc., and pushes them.
Meanwhile I am also making changes to my local database, maybe working on another table, updating fields etc and push my code.
How can we have the database in sync? How can we each get the most recent version of the database with its most recent data?
Do we have to import/export the database schema all the time? Wouldn't this override our changes and data?
Any ideas?
I would suggest that you never push your local database to Heroku; instead, write migrations and run them against your remote database (heroku run rake db:migrate), plus rake tasks that make any changes to the data you need.
To retrieve your database you can use heroku db:pull, although I would be more inclined to use the pg:transfer plugin (https://github.com/ddollar/heroku-pg-transfer), which does a PSQL backup on Heroku and then restores it locally.
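A sketch of that workflow (the bare pg:transfer invocation follows the plugin's README; exact flags may vary by version):

heroku run rake db:migrate   # apply pending migrations on the remote database
heroku pg:transfer           # pull the remote database into your local Postgres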
You'll need to get into the good habit of pushing your changes to GitHub and then frequently pulling to minimise the risk, but git does a pretty good job of merging changes. Also, consider implementing some kind of CI server, and have a chat channel (HipChat, Campfire, etc.) that shares commit messages between yourselves so you know what the other has been up to.
I have a PostgreSQL database on Heroku. I've had some problems: I needed to delete some local migration files and create brand-new schema migration files (which worked) that had all migrations included.
A relation from this new schema migration file already exists in my Heroku database, and when I try to migrate I get this error:
django.db.utils.DatabaseError: relation "quizzer_speaker" already exists
How can I make Heroku actually do the migration? Or how can I make it go to a previous version where that relation didn't exist, so I can just migrate without problems?