I would like to know how to update an existing database from another database in MongoDB ... but first, a little context.
I've created an app with Strapi and MongoDB (more precisely, with MongoDB Atlas) and I'm using two environments, development and production (with a development and a production database). At the beginning, I created the development database and copied/cloned it to another database with this command (having previously created the dump file with mongodump):
mongorestore --uri="mongodb+srv://<username>:<password>@<cluster-url>" --archive="app-development.db" --nsFrom='app-development.*' --nsTo='app-production.*'
After creating this database I carried on developing against the development database, and now I would like to import the development database into the production database... but I get the error: continuing through error: E11000 duplicate key error collection.
So I don't know how I can re-import the database, and I would also like to know whether this approach is good practice or whether I should do it a different way.
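From what I can tell, mongorestore also has a --drop option that drops each target collection before restoring into it, which might be the way to avoid the duplicate key errors; a sketch of the re-import with the same placeholders (assuming --drop is appropriate here):
mongorestore --uri="mongodb+srv://<username>:<password>@<cluster-url>" --archive="app-development.db" --nsFrom='app-development.*' --nsTo='app-production.*' --drop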
Related
I am looking for a solution to sync the DB between multiple developers (us at the office).
We use Wordpress and MAMP (for now; MAMP/Headless WP and NPM/React in the future) and we want to use Appveyor (or similar) to deploy to the dev server and the live server. We want the DB to be synced everywhere, or at least among us and the dev server, with a secondary (free-standing) one on the live server.
Can this be done with Liquidbase or is there a better option?
Thanks :)
I don't know a whole lot about WordPress and how it uses the database, but in theory this should be possible as long as you are talking about syncing the schema changes. If you are also trying to sync the data, then Liquibase is not the right tool for the job.
To do this with Liquibase, try installing it using the installer and working through some of the examples to get an idea of how the tool works. The examples use a local H2 in-memory database, so it is pretty painless to try things and start over if you mess things up.
After getting a feel for things, you will want to use the Liquibase generateChangeLog command to create the initial changelog that contains all the instructions for creating the schema as it exists on the database you are using when you run generateChangeLog. Then test that you can run liquibase update on a separate database and have WordPress use that database successfully.
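A minimal sketch of those two steps, assuming a MySQL-backed WordPress install (the URLs, database names, and credentials are placeholders):
liquibase --changeLogFile=wordpress-changelog.xml --url="jdbc:mysql://localhost:3306/wordpress_dev" --username=<user> --password=<password> generateChangeLog
liquibase --changeLogFile=wordpress-changelog.xml --url="jdbc:mysql://localhost:3306/wordpress_test" --username=<user> --password=<password> update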
Once you have proven that workflow, you can continue by following this pattern (a command-line sketch of these steps follows below):
Before making changes to the WordPress schema, run liquibase snapshot to create a JSON formatted snapshot of the "DEV" schema - the schema you are changing in development mode. You will need additional options to generate the JSON format snapshot.
Make the desired changes to the WordPress "DEV" schema, most likely by using the WordPress app itself.
Use liquibase diffChangeLog to compare the JSON snapshot to the newly-altered "DEV" schema. This will add changesets to the existing changelog file that describe how to alter the schema to create the desired changes.
Use liquibase changelogSync on the "DEV" schema to update the Liquibase tracking tables so that Liquibase knows that the changes in the changelog already exist in that database.
Use liquibase update against the "PROD" database to have the new schema changes show up in that environment.
This workflow is described in the Liquibase docs for the snapshot command.
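Put together, one pass through that loop might look roughly like this on the command line (connection values are placeholders, and the offline: URL is how Liquibase points a command at a saved JSON snapshot instead of a live database):
liquibase --url="jdbc:mysql://localhost:3306/wordpress_dev" --username=<user> --password=<password> --outputFile=dev_snapshot.json snapshot --snapshotFormat=json
(make the schema changes through WordPress)
liquibase --changeLogFile=wordpress-changelog.xml --url="offline:mysql?snapshot=dev_snapshot.json" --referenceUrl="jdbc:mysql://localhost:3306/wordpress_dev" --referenceUsername=<user> --referencePassword=<password> diffChangeLog
liquibase --changeLogFile=wordpress-changelog.xml --url="jdbc:mysql://localhost:3306/wordpress_dev" --username=<user> --password=<password> changelogSync
liquibase --changeLogFile=wordpress-changelog.xml --url="jdbc:mysql://localhost:3306/wordpress_prod" --username=<user> --password=<password> update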
ps - there is no d in Liquibase :-)
I have a SQL Server CE database on my 'live' host that I deployed a few weeks ago. It has a migration history of two old migrations. Then I have my dev database, that has gone through umpteen migrations, and several delete and recreate moments.
Now I would like to use EF migrations to build a migration that will update the production db to match my code-first model on dev. I thought that if I cleared the prod migration history, and ran Add-Migration, EF would compare database and model, and generate a migration class to bring the db up to date with the model.
What really happens is that the generated migration tries to create the whole db: all tables, FKs, and indexes. How do I get a proper update only, using EF migrations?
If you still have the deployed migration on your Dev box, you can create a script that will bring the deployed version up to date:
Update-Database -Script -SourceMigration: VersionDeployed -TargetMigration: CurrentMigration
You could also try bringing the PROD database down to your dev box along with its migration history (don't clear it). EF should then compare the model in the last migration to the current model based on code.
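If the production connection isn't the project's default, the script generation can also be pointed at it explicitly; a sketch with placeholder connection details (-ConnectionString and -ConnectionProviderName are standard Update-Database parameters in EF6):
Update-Database -Script -SourceMigration: VersionDeployed -ConnectionString: "Data Source=|DataDirectory|\Production.sdf" -ConnectionProviderName: "System.Data.SqlServerCe.4.0"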
http://cpratt.co/migrating-production-database-with-entity-framework-code-first/
I have a Web application that I usually deployed using Web Deploy directly from Visual Studio (whatever branch I am currently using in VS - normally master). But now I'm introducing a second web app on Azure that will be built from the same repo but different branch. To make things simpler I will be configuring both Web apps on Azure to integrate directly with GitHub and associate them with specific branch.
I also added two additional web.config files: Web.Primary.config and Web.Secondary.config and configured app settings on Azure portal of each web app by adding additional value SCM_BUILD_ARGS and set them to
SCM_BUILD_ARGS=-p:PublishProfile=Primary // in primary web app
SCM_BUILD_ARGS=-p:PublishProfile=Secondary // in secondary web app
which I understand will transform the correct config file with the specific external services' configurations (DB connection, mail server, etc.).
Now the additional step that I would like to include in continuous deployment is to run a set of SQL scripts that I have in my repo and that I used to run manually to upgrade the database during Web Deploy from VS. The individual scripts each perform a specific database upgrade step:
backup current tables - backup creates a set of Backup_OriginalTableName tables that are copied from existing ones and populated with existing data
drop whole DB model - all non-backup objects are dropped: procedures, functions, types, views, tables...
create model - creates all tables, views and indices
create user types
create user functions
create stored procedures
restore data to new tables from backup tables - this step may occasionally break if the new model introduces new non-nullable columns that don't have defaults defined on them; I will somehow have to mitigate this problem by adding an additional script that adds the missing columns to the backup tables and gives them some defaults, but that's a completely different issue.
I used to also have a set of batch files (BAT) in my VS solution that simply executed sqlcmd against a specific database instance and ran these scripts in the predefined order above (a simplified sketch of such a batch follows the list below). Hence I had these batches:
Recreate Local.bat - this one used additional SQL scripts to not restore from backup but rather to recreate an empty DB with only lookup tables being populated and some default data for development purposes (like predefined test users)
Restore Local.bat - I used this script to simply restore database from backup tables discarding any invalid data I may have created while debugging/testing since last DB recreate/upgrade/restore
Upgrade Local.bat - upgrade local development DB executing scripts mentioned above
Upgrade Production.bat - upgrade production DB on Azure executing scripts mentioned above
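Roughly, each of these batches is just sqlcmd run once per script, in order; a simplified sketch of the local upgrade batch (the server, database, and script file names are illustrative, -E uses Windows authentication, and -b makes sqlcmd stop on the first error):
sqlcmd -S (localdb)\MSSQLLocalDB -d MyAppDb -E -b -i "01_backup_tables.sql"
sqlcmd -S (localdb)\MSSQLLocalDB -d MyAppDb -E -b -i "02_drop_model.sql"
sqlcmd -S (localdb)\MSSQLLocalDB -d MyAppDb -E -b -i "03_create_model.sql"
sqlcmd -S (localdb)\MSSQLLocalDB -d MyAppDb -E -b -i "04_create_types.sql"
sqlcmd -S (localdb)\MSSQLLocalDB -d MyAppDb -E -b -i "05_create_functions.sql"
sqlcmd -S (localdb)\MSSQLLocalDB -d MyAppDb -E -b -i "06_create_procedures.sql"
sqlcmd -S (localdb)\MSSQLLocalDB -d MyAppDb -E -b -i "07_restore_data.sql"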
So, to support the whole deployment process I have been doing manually from VS, I would now like to also execute these scripts against the specific Azure SQL DB during continuous deployment. I suppose I should run them right after code deployment, because if that fails, the DB shouldn't be upgraded either.
I'm a bit confused about where and how to do this. Can I configure this somewhere in the Azure portal? I was looking for resources on the web but I can't seem to find any relevant information on how to add deployment steps that execute these scripts. I think this is a fairly everyday scenario, as it's hard to think of web apps not requiring databases these days.
Maybe it's just my process for DB upgrade/deployment that is wrong, so let me also know if there is another usual way to do DB upgrade/migration with continuous deployment on Azure... I may change my process to accommodate it.
Note 1: I'm not using Entity Framework or any other full blown ORM. I'm rather using NPoco and all my DB logic is built in SPs that DAL is using.
Note 2: I'm aware of the recently introduced staging capabilities of Azure, but my apps are on a cheaper plan that doesn't support staging, and I want to keep it this way as I may be introducing additional web apps along the way that will be using additional code branches and resources (DB, mail, etc.).
It sounds to me like your db project is a good candidate for SSDT and inclusion in source control. You can create a MyDB.sqlproj that builds your db as a dacpac, and then you can use SqlPackage.exe Publish to accomplish your deployment to Azure.
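A rough sketch of that publish step against an Azure SQL database, with placeholder server and credential values (/Action:Publish deploys the compiled dacpac's schema to the target database):
SqlPackage.exe /Action:Publish /SourceFile:"MyDB.dacpac" /TargetServerName:"yourserver.database.windows.net" /TargetDatabaseName:"MyAppDb" /TargetUser:"<user>" /TargetPassword:"<password>"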
We recently brought our databases under source control and follow a similar process to build and automatically deploy them (but not to a SQL Azure DB). We've found the source control, SSDT tooling support, and automated deployment options to be worth the effort of setting up and maintaining our project this way.
This SO question has some good notes for Azure deployment of a dacpac using SSDT:
How to publish DACPAC file to a SQL Server database project via SQLPackage.exe of SSDT?
I am assigned the task of continuous deployment from a development server to a production server.
In my development server all the database objects are created under the dbo schema. But in the production server there are different schemas, based on each tenant's company list.
For example, in my development server tables are created like
dbo.ABC
dbo.XYZ
And when I create a tenant (Omkar --- db) with schemas (Sarkur, Mathur), the database objects will be like
Sarkur.ABC, Sarkur.XYZ
Mathur.ABC, Mathur.XYZ
Now, I have to compare these two databases to check whether there are any changes in the structure of the database objects, or any addition/deletion of database objects. If so, those changes have to be synchronized to the production database.
If anyone knows how to compare these two different schemas' objects, please let me know.
One option that I know is suitable:
Flyway :
It is easy to set up and simple to master. Flyway lets you regain control of your database migrations with pleasure and plain SQL.
Solves only one problem and solves it well. Flyway migrates your database, so you don't have to worry about it anymore.
Made for continuous delivery. Let Flyway migrate your database on application startup. Releases have never been this easy.
Big plus: it's an open-source framework!
http://flywaydb.org/
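A rough sketch of typical usage from the command line, with placeholder connection values: put versioned scripts such as V1__create_tables.sql and V2__add_tenant_schemas.sql in a folder and run
flyway -url="jdbc:sqlserver://<server>;databaseName=<database>" -user=<user> -password=<password> -locations=filesystem:sql migrate
Flyway records which versions have been applied in its own metadata table, so re-running the command only applies the new scripts.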
I have the following scenario for my application:
1 Production Server
1 Test Server
n Development Computers
For database migration we use Hibernate schema update for the schema and DBUnit for filling in all the production data (on all servers/computers). When the schema update is done, I generate a new DTD file for the new schema, so I can do a fresh import of the DBUnit XML. The application updates the database at startup with the XML file (only on development and test servers/computers!).
Of course this approach is not optimal and fragile. So I looked at Liquibase and Flyway. Both seem to be great tools, but what I do not get is: How do I migrate the data? In my case, I dump the data of the production system once a week and add it to the applications source control as a DBUnit XML file, so all developers have "fresh" data and the test server has current production data, too.
The problem I see with Liquibase and Flyway is that there is no solution for doing automated diffs of the database data and generating the migration changes automatically.
So my idea involves the following steps:
Set Hibernate to validate instead of update.
When a STRUCTURAL database change is needed, I add it to the migration script for the major version
No database inserts are in the migration script.
Generate a new DTD for DBunit based on the new database structure
Generate the DBUnit XML from the production database.
Another idea would be to utilize Flyway's JavaMigration and provide an initial database dump based on DBUnit. All other changes to database data would be handled in migration scripts. But still there is the problem: how do I make diffs between the current migration script state and the production database state?
It would be awesome if anyone could provide me hints how to handle my scenario :)
If your goal is to use dumps of the PROD database in DEV and TEST environments, I would:
Configure the DB migration tool to run on application startup (both Flyway and Liquibase support this through their respective APIs)
Package all the DB structure migrations together with the app
Dump both data and structure from PROD
This way, when the PROD database is restored to DEV or TEST, the old metadata table of the migration tool is restored as well.
When the app starts, the migration tool will discover that the db structure is outdated and upgrade it to the newest version. Done.
No need to use DBUnit for this.
The short answer is that all your changes would be done through Liquibase or Flyway.
We use Flyway, with the same prod/test/development setup.
We make all db changes (structure or metadata) using Flyway migration scripts, stored in source control. Each time we do a new deployment to an environment, we first run the migration scripts there (using either the command line tool or the maven plugin). The code first goes to development environment, gets integration tested there and keeps going to test and production.
The main thing to watch out for is that Flyway requires a linear versioning to the files, so if two developers check in migrations at the same time, one of them will have to rename theirs.
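For example (hypothetical file names): if developer A has already checked in V42__add_order_index.sql, a second migration committed as V42__add_customer_column.sql will be rejected as a duplicate version and would need to be renamed to something like V43__add_customer_column.sql.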