This might sound like a noob question, but I have a project that uses PostgreSQL and Rails.
It is hosted on Heroku.
My friend would like to help me with the development of this project.
Let's say he clones my Heroku project, sets up his own database locally, makes changes to its schema, and pushes his code.
Meanwhile, I am also making changes to my local database, maybe working on another table or updating fields, and I push my code too.
How can we have the database in sync? How can we each get the most recent version of the database with its most recent data?
Do we have to import/export the database schema all the time? Wouldn't this overwrite our changes and data?
Any ideas?
I would suggest that you never push your local database to Heroku. Instead, write migrations and run them against your remote database with heroku run rake db:migrate, and write rake tasks for any changes to the data you need.
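For example, the cycle for a schema change might look like this (the migration name is illustrative):

    rails generate migration AddBioToUsers bio:text   # create the migration locally
    rake db:migrate                                   # apply it to your local database
    git push heroku master                            # deploy the code, including the migration
    heroku run rake db:migrate                        # apply it to the remote database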
To retrieve your database you can use heroku db:pull, although I would be more inclined to use the pg:transfer plugin (https://github.com/ddollar/heroku-pg-transfer), which takes a PostgreSQL backup on Heroku and then restores it locally.
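A rough sketch of that flow (the plugin's exact options are assumptions, so check its README):

    heroku plugins:install https://github.com/ddollar/heroku-pg-transfer
    heroku pg:transfer --from DATABASE_URL --to postgres://localhost/myapp_development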
You'll need to get into a good habit of pushing your changes to GitHub and pulling frequently to minimise the risk of conflicts, but Git does a pretty good job of merging changes. Also, consider implementing some kind of CI server, and have a chat channel (HipChat, Campfire, etc.) that shares commit messages between yourselves so you each know what the other has been up to.
I've developed a web platform that uses a PostgreSQL database along with Hasura to provide a GraphQL interface. The platform is deployed on Google Cloud: the database runs in a Google Cloud SQL instance, and Hasura and a simple Node.js server run on Cloud Run instances.
Since the database will keep growing, I need a secure and reliable way to keep track of changes made in the development environment so I can later deploy them to the production database.
The bulk of the edits to the database schema are done using the Hasura Console, and for now I just need a way to track schema changes made in the development environment so that only the needed changes are deployed to production.
Reading about migrations, I found Flyway as a solution for tracking these changes. However, I still have some concerns about introducing Flyway into the project, and a couple of questions arise:
Is it possible to use the PostgreSQL (pgAdmin) backup files as migrations?
How could I migrate from the development to the production database? Just by pointing Flyway at the remote URL of the Google Cloud SQL instance and running the migration?
There's not much need to keep track of changes to the data in production.
Is there a better option to control changes between the development and production databases?
If I make frequent schema backups (using the pgAdmin backup tool) and restore them on the production database, would that do what I want?
Is it possible to use the PostgreSQL (pgAdmin) backup files as migrations?
I think you are going the wrong way. Flyway is about migration scripts that you execute to move the database from one version to the next. A backup file, by contrast, contains the whole database. If you want to replace the whole database with a new version of it, you could simply drop the old one and create the new one, but you would lose data that way. You can of course use Flyway to restore the backup for you, but that way you'll gain only the version-tracking table. And if you update over several versions, multiple restores will be performed, which is not needed.
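In other words, instead of full backups, Flyway expects a series of small versioned scripts, each describing one incremental change, along these lines (names illustrative):

    sql/V1__initial_schema.sql       # e.g. the CREATE TABLE statements
    sql/V2__add_orders_table.sql     # one incremental change per script
    sql/V3__add_orders_index.sql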
How could I migrate from the development to the production database? Just by pointing Flyway at the remote URL of the Google Cloud SQL instance and running the migration?
I tried googling (I entered "Google Cloud SQL flyway"), and the first result pointed me to Umberto D'Ovido's post "Setup Flyway with Google Cloud SQL". I'm sure with a little effort you'll find the instructions.
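As a rough sketch, once the instance is reachable (for example through the Cloud SQL proxy), it should mostly be a matter of pointing Flyway's JDBC URL at it; the connection details below are placeholders:

    flyway -url=jdbc:postgresql://127.0.0.1:5432/mydb \
           -user=myuser -password=mypassword \
           -locations=filesystem:./sql \
           migrate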
I am looking for a solution to sync the DB between multiple developers (us at the office...).
We use WordPress and MAMP (for now; MAMP/headless WP and NPM/React in the future), and we want to use AppVeyor (or similar) to deploy to a dev server and a live server. We want the DB to be synced everywhere, or at least among us and the dev server, with a secondary, free-standing one on the live server.
Can this be done with Liquidbase or is there a better option?
Thanks :)
I don't know a whole lot about WordPress and how it uses the database, but in theory this should be possible as long as you are talking about syncing the schema changes. If you are also trying to sync the data, then Liquibase is not the right tool for the job.
To do this with Liquibase, try installing it using the installer and working through some of the examples to get a feel for how the tool works. The examples use a local H2 in-memory database, so it is pretty painless to try things and start over if you mess things up.
After getting a feel for things, you will want to use the Liquibase generateChangeLog command to create the initial changelog that contains all the instructions for creating the schema as it exists on the database you are using when you run generateChangeLog. Then test that you can run liquibase update on a separate database and have WordPress use that database successfully.
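A command-level sketch of that first step (URLs and credentials are placeholders, and the flags assume the pre-4.x CLI style):

    # Capture the existing WordPress schema as a changelog ...
    liquibase --changeLogFile=changelog.xml \
              --url=jdbc:mysql://localhost:3306/wordpress \
              --username=wp --password=secret \
              generateChangeLog
    # ... then verify it can recreate the schema on a separate database
    liquibase --changeLogFile=changelog.xml \
              --url=jdbc:mysql://localhost:3306/wordpress_copy \
              --username=wp --password=secret \
              update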
Once you have proven that workflow, you can continue by following this pattern (a command-level sketch follows the list):
Before making changes to the WordPress schema, run liquibase snapshot to create a JSON formatted snapshot of the "DEV" schema - the schema you are changing in development mode. You will need additional options to generate the JSON format snapshot.
Make the desired changes to the WordPress "DEV" schema, most likely by using the WordPress app itself.
Use liquibase diffChangeLog to compare the JSON snapshot to the newly-altered "DEV" schema. This will add changesets to the existing changelog file that describe how to alter the schema to create the desired changes.
Use liquibase changelogSync on the "DEV" schema to update the Liquibase tracking tables so that Liquibase knows the changes in the changelog already exist in that database.
Use liquibase update against the "PROD" database to have the new schema changes show up in that environment.
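Here is what that pattern might look like at the command line (URLs, credentials, and file names are placeholders; exact flags may vary by Liquibase version):

    # 1. Snapshot DEV before making changes
    liquibase --url=$DEV_URL --outputFile=dev_before.json \
              --snapshotFormat=json snapshot
    # 2. Make the schema changes through WordPress itself
    # 3. Append the differences to the changelog, comparing the changed
    #    DEV schema (reference) against the earlier snapshot
    liquibase --changeLogFile=changelog.xml \
              --url=offline:mysql?snapshot=dev_before.json \
              --referenceUrl=$DEV_URL diffChangeLog
    # 4. Tell Liquibase that DEV already contains these changes
    liquibase --changeLogFile=changelog.xml --url=$DEV_URL changelogSync
    # 5. Apply the new changesets to PROD
    liquibase --changeLogFile=changelog.xml --url=$PROD_URL update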
This workflow is described in the Liquibase docs for the snapshot command.
ps - there is no d in Liquibase :-)
Hey folks and moderators,
Let's say I have deployed a Laravel application on a production server. How do I modify and update the application without affecting production data?
Assume that I want to release the next version of the application with a few additional columns in the users table.
The question is: should I clone the database from live to staging?
What is the right way to modify the staging application and deploy it to production without affecting the production database, even though there are additional tables/columns in staging?
Currently I run two different environments, and I drop the production tables and import them from staging. That doesn't sound efficient.
Any better idea for promoting changes from staging to production?
Thank you!
I've tried searching around, but unfortunately with no luck.
Assume that I want to release the next version of the application with a few additional columns in the users table.
Yes, write a new migration like this:
Schema::table('users', function (Blueprint $table) {
    $table->string('email');               // adding a column
    $table->string('name', 50)->change();  // changing an existing column
});
That's for adding columns; for changing a column's type, renaming, and dropping columns you will also have to install: composer require doctrine/dbal
See more info in the docs: https://laravel.com/docs/5.6/migrations#creating-columns
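For completeness, a minimal sketch of the surrounding workflow (the migration name is illustrative):

    php artisan make:migration add_email_to_users_table --table=users
    php artisan migrate    # run on staging first, then on production after deploying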
The question is: should I clone the database from live to staging? What is the right way to modify the staging application and deploy it to production without affecting the production database, even though there are additional tables/columns in staging?
...
Any better idea for promoting changes from staging to production?
I'm assuming you are using Git, that you have a single server, and that you can take your app down for a while, perhaps doing the update when fewer users are on the system.
Push to repository
ssh to production server
php artisan down --message="Your Awesome Message About your update" --retry=60
BACK UP YOUR DATABASE (Just in case something goes wrong!)
git pull
composer update
php artisan migrate
php artisan up
Just be sure that your migrations always have rollbacks!
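If something does go wrong, a migration with a proper down() method lets you step back, for example:

    php artisan migrate:rollback   # revert the last batch of migrations
    php artisan up                 # bring the app back up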
You can make another backup/dump after migrating if you want to use real data in staging or development.
I have connected my WordPress code to Bitbucket; when I push changes through SourceTree, they are reflected on the Azure Web App service where my code lives. The problem is the database: the team works locally but uses an online database, and when we push changes the database is not updated. How can I resolve this problem?
Databases are generally not stored in your Git repository, nor would this be desirable. If you'd like to easily sync your WordPress database between a local and a remote install, consider a tool like wordmove or a plugin like WP Migrate DB Pro.
We have a great process for upgrading our clients' websites as far as updating html/js code and assets is concerned (by using Subversion) that we are very happy with.
However, when it comes to upgrading databases, we are without any formal process.
If we add new tables/fields to our development database, when it comes to rolling it out to the production server we have to remember our changes and replicate them. We cannot simply copy the development database on top of the production database as client data would be lost (e.g. blog posts, account info etc).
We are also now in the process of building a web-app which is going to come across the same issues.
Does anyone have a solution that makes this process easier and less prone to error? How do big web-apps get round the problem?
Thanks.
I think that adding controls to the development process is paramount. At one of my past jobs, we had to script out all database changes. These scripts were then passed to the DBA with instructions on which environment to deploy them in. At the end of the day, you can implement technical solutions, but if the project is properly documented (IF!!!), then when it comes time for deployment the developers should remember to include the migration scripts along with the code files. My $.02
In my opinion your code should always be able to create your database from scratch, and therefore it should handle upgrades too. It should check a field in the database to see what version the schema is at, and handle upgrades to the latest version.
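A minimal sketch of that idea, assuming PostgreSQL and a schema_version table (all names here are assumptions):

    #!/bin/sh
    # Apply numbered migration scripts (migrations/1.sql, migrations/2.sql, ...)
    # until the database is at the latest version.
    DB=myapp
    current=$(psql -tA -d "$DB" -c "SELECT version FROM schema_version")
    next=$((current + 1))
    while [ -f "migrations/$next.sql" ]; do
        psql -d "$DB" -f "migrations/$next.sql"
        psql -d "$DB" -c "UPDATE schema_version SET version = $next"
        next=$((next + 1))
    done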
I had some good luck with: http://anantgarg.com/2009/04/22/bulletproof-subversion-web-workflow/
The author has a database versioning workflow (with PHP script), which is decent.
Some frameworks have tools that deal with database upgrades. For example, Rails migrations are pretty nice.
If no convenient tool is available for your platform you could try scripting modifications to your development database.
In my company we use this model for some of our largest projects:
Say X is the version of our application that was just deployed, and that it is no different from the latest development version.
We create a new directory for the scripts, naming it, for example, version X+1, and add it to the Subversion repository.
When a developer wants to make a modification to the development database, he creates an .sql script with a name like "1 - does something.sql" that makes the modifications (they must be non-destructive), saves it, and then runs it on the development database. He commits the web app code together with the SQL scripts. Each developer does the same and maintains the order of execution of the scripts.
When we need to deploy version X+1, we copy the X+1 web app code and the scripts to the production server, back up the database, run the SQL scripts one by one on the production database, and deploy the new web application code.
After that we open a new (X+2) SQL script directory and repeat the process...
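A condensed sketch of the deployment step, assuming PostgreSQL (the version number and paths are illustrative):

    V=12                                        # the version being deployed
    pg_dump proddb > backup_before_v$V.sql      # back up production first
    for script in "scripts/v$V"/*.sql; do       # numbered scripts run in order
        psql -d proddb -f "$script"
    done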
We basically have a similar approach to Senad's: we maintain a changes.sql file in our repo that developers put their changes in. When we deploy to production, we:
Run a test deployment to the QA server:
first reproduce the production environment (app & db) in the QA server
run changes.sql against the qa db
deploy the app to qa
run integration tests.
When we are sure the app runs fine in QA with the scripted changes to the db (i.e. nobody forgot to include their db changes in the changes.sql, references, etc.), we:
backup the production database
run the scripts in the changes.sql file against the production db
deploy the app
clear the changes.sql file
All of the deployment is run through automated scripts, so we know we can reproduce it.
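A condensed sketch of what such a script might do, assuming PostgreSQL (names are illustrative, including the deploy helper):

    pg_dump proddb > backup_$(date +%F).sql   # back up the production database
    psql -d proddb -f changes.sql             # apply the accumulated changes
    ./deploy_app.sh                           # deploy the application
    : > changes.sql                           # clear the file for the next cycle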
Hope this helps.
We have a migrations/ folder inside almost every project, and it contains so-called "up" and "down" scripts (SQL). Every developer is obliged to write his own up/down scripts and to verify them against the testing environment.
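As an illustration (file names are assumptions), each change ships as a pair, and verifying against the testing environment means applying and then reverting it:

    # migrations/002_add_email_index.up.sql    -- applies the change
    # migrations/002_add_email_index.down.sql  -- reverts it
    psql -d testdb -f migrations/002_add_email_index.up.sql    # apply
    psql -d testdb -f migrations/002_add_email_index.down.sql  # revert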
There are other tools and frameworks for migrations, but we haven't had the time to test them...
Some are: DoctrineDB, Rails migrations, Propel (I think...); Capistrano can do it as well.