We have a SQL Server 2014 database and we are planning a blue-green deployment for it, so that downtime during the deployment window can be reduced.
Are there any options that can be leveraged for implementing this solution?
I would advise following the approach from Martin Fowler's BlueGreenDeployment article:
Databases can often be a challenge with this technique, particularly when you need to change the schema to support a new version of the software. The trick is to separate the deployment of schema changes from application upgrades. So first apply a database refactoring to change the schema to support both the new and old version of the application, deploy that, check everything is working fine so you have a rollback point, then deploy the new version of the application. (And when the upgrade has bedded down remove the database support for the old version.)
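As a hedged sketch of that expand/contract approach (the table and column names here are invented for illustration, not part of Fowler's text), the first deployment adds the new schema alongside the old one so both application versions keep working:

    -- Phase 1 (expand): deploy before the new application version goes live.
    -- The old version keeps using FirstName/LastName; the new one uses FullName.
    ALTER TABLE dbo.Customers ADD FullName NVARCHAR(200) NULL;
    GO
    UPDATE dbo.Customers
    SET    FullName = FirstName + N' ' + LastName
    WHERE  FullName IS NULL;
    GO
    -- Phase 2 (contract): run only after the upgrade has bedded down and
    -- nothing reads the old columns any more.
    -- ALTER TABLE dbo.Customers DROP COLUMN FirstName, LastName;

This gives you the rollback point Fowler mentions: if the new application version misbehaves, the old one still runs against the expanded schema.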
We have a Live database project which we currently update via manual deployments only. As part of our process improvement initiatives, however, we're proposing to switch to SSDT to enable us to improve our deployment process and introduce some automation into it.
To facilitate a Proof of Concept, we've provisioned a test environment where we can implement and test the SSDT deployments to a clone or like-for-like representation of the existing Live database which is currently maintained in a Git repo on VSTS.
I do, however, have a few concerns and queries relating to this.
1. What would be the recommended way to clone or re-create the Live database for our SSDT deployments to the test environment, without impacting the Live environment?
2. Will importing the Live database into our SSDT project and then publishing to the test environment have any adverse effect on the Live environment, and can we expect any inconsistencies between the two databases following the import?
3. Should our Proof of Concept prove successful, will we need to migrate all our database assets to a new SSDT project, or can the SSDT project run in parallel with our existing Live database project, and if so, how?
4. Will any switch or migration to SSDT result in downtime or an outage for our project team?
5. If a migration of our existing database assets to a new SSDT project is not required, could we then integrate the SSDT functionality into the existing project?
Let's go through all your questions:
1. What do you mean by clone: copying just the structure, or the data as well? In either case you'll need to import the schema: create a new project and import the schema by right-clicking on the project. If you need the data too, the best way is backup/restore (see the sketch after this list).
2. SSDT doesn't natively work with data; it deals only in DDL. So when you sync the structure, the performance impact depends on the changes being made: re-creating an index can take a lot of time, whereas creating a table will not use many resources. Again, if you are talking about backups, you need to measure how much they affect you.
3. SSDT is the way you store and develop your code; it's up to you whether to adopt it. My opinion is that SSDT is the best tool for SQL Server development, but it can be a challenge to set it up correctly.
4. As I said, it all depends on the changes. Some changes will incur no downtime; others can be very complex. SSDT just generates a SQL script, and it makes no difference whether that script is run manually or automatically by SSDT.
5. I'm not quite sure what you mean by migration. SSDT is the way you store and develop your code. Any SQL Server database can be put into SSDT (for some existing databases you'll need to put in some effort to get them into SSDT, but it's entirely possible).
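As a minimal sketch of the backup/restore route from answer 1 (the database names, logical file names, and paths here are hypothetical), a COPY_ONLY backup lets you clone Live without disturbing its backup chain:

    -- On the Live server: copy-only backup, so the Live backup chain is untouched.
    BACKUP DATABASE LiveDb
        TO DISK = N'\\backupshare\LiveDb_clone.bak'
        WITH COPY_ONLY, INIT;

    -- On the test server: restore under a new name, relocating the files.
    RESTORE DATABASE TestDb
        FROM DISK = N'\\backupshare\LiveDb_clone.bak'
        WITH MOVE N'LiveDb'     TO N'D:\Data\TestDb.mdf',
             MOVE N'LiveDb_log' TO N'D:\Logs\TestDb_log.ldf',
             RECOVERY;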
I have, admittedly, done very little searching on my own, so feel free to insult and/or harass me.
I have two databases that I use, one for development and one for production. In Visual Studio 2010, I have simple overrides that change the connection string based on whether I'm building for Debug or for Release.
Right now, I manually change the database schemas when needed. I've thought about creating scripts, but I'm wondering if there is a better way? Can I create a DB in Visual Studio on my local computer, migrate those changes to the development SQL and then, finally, migrate up to Production so that the schema version matches the release?
Thanks in advance!
.NET 4 | Visual Studio 2010 | MS SQL Server 2008
Are you familiar with Visual Studio Database Edition? Its reason for being is to help you with exactly what you just described. Not only does it allow you to version control your database schema, it allows you to build deployment T-SQL scripts to update a database from one version to another. It has built-in schema compare and data compare. I highly recommend it. We use it to version control our database schema and to do deployments. I would never attempt to update database schemas by creating manual scripts anymore.
This used to be a pain in the ass for me. Then I bought this: http://www.red-gate.com/products/sql-development/sql-source-control/
Provided you're using a compatible source-control solution (such as SVN), you can make changes in one database, commit them, and then update other databases with the same structure.
As Randy mentions, the Database Edition of VS.NET will do this.
I use SQL Delta. It's a third party tool and very well priced for what it does. I've used it for a number of years on extremely large projects (that is the database has hundreds of tables and thousands of stored procedures) and it has never failed us in what it does.
The way I used it is this:

1. Get it to produce a blank database schema as per the development database.
2. Move this database over to the production server.
3. Sync the production database with this database.
You can also produce scripts and run the scripts and all of this can be automated.
I'm only slightly familiar with the features of Data Tier Applications and with what Database Projects did in the Database Edition of Visual Studio.
Are these two different, overlapping solutions for database version control? Or does Data Tier Application functionality outright replace the need to use Visual Studio Database Edition and database projects?
DAC provides an application model which can be used as an interface between developers and DBAs. The developer edits the model, the DBA manages/deploys from the model. For example, once the model is built or extracted, it can be deployed to multiple servers.
Imagine the .dacpac as an .exe. The developer builds an .exe and hands it off to someone. At this point, it would be nice if the developer doesn't have to worry about where that .exe runs, because the .exe is internally consistent - it either runs or it doesn't. Why should the developer need to worry about targeting 2008, 2005, or Azure specifically? Just develop the app model and let DAC take care of the rest...
Having this deployment artifact also provides some new capabilities. Examples include versioned deployments, the ability to determine whether someone has changed the database since the last deployment or upgrade, and the ability to create the same database on different target servers.
Do you like having to manage a library of upgrade scripts for your various databases? Wouldn't it be nice if the entire state of your database could be built or captured (extracted) at any point in time?
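As one hedged illustration (this assumes the database was registered as a data-tier application; the view below exists from SQL Server 2008 R2 onwards), the server records deployed DAC versions in msdb, which is what makes those versioned-deployment and change-detection checks possible:

    -- List registered data-tier applications and their deployed versions.
    SELECT instance_name,
           database_name,
           type_name,
           type_version
    FROM   msdb.dbo.sysdac_instances;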
The database application project mashup in VS 2010 will be resolved in an upcoming release of database-centric developer tools. Investing in .dbschema or DAC won't affect forward compatibility.
Right now the difference between Database Projects and Data Tier Projects is at the point of deployment. If you want to create a .dacpac, you'd use the Data Tier Project. If you want to create a .dbschema and SQL migration file, you'd use the conventional Database Project.
As far as I'm aware, Data Tier Applications are expected to be important in future for SQL Azure deployment.
Unless you're specifically looking at SQL Azure, I'd use Database Projects for now. It all depends on what you're trying to achieve. It could be that SQL Source Control (by Red Gate, the company I work for) is more suited to your needs.
I believe Visual Studio database projects are targeted at developers.
Data Tier Applications are targeted at DBAs. See this blog for details.
For the last few years I was the only developer that handled the databases we created for our web projects. That meant that I got full control of version management. I can't keep up with doing all the database work anymore and I want to bring some other developers into the cycle.
We use Tortoise SVN and store all repositories on a dedicated server in-house. Some clients require us not to have their real data on our office servers so we only keep scripts that can generate the structure of their database along with scripts to create useful fake data. Other times our clients want us to have their most up to date information on our development machines.
So what workflow do larger development teams use to handle version management and sharing of databases? Most developers prefer to deploy the database to an instance of SQL Server on their development machine. Should we:
1. Keep the scripts for each database in SVN and make developers export new scripts if they make even minor changes;
2. Detach databases after changes have been made and commit the MDF file to SVN;
3. Put all development copies on a server on the in-house network and force developers to connect via remote desktop to make modifications; or
4. Some other option I haven't thought of?
Never have an MDF file in the development source tree. MDFs are a result of deploying an application, not part of the application sources. Thinking of the database in terms of development source files is a shortcut to hell.
All the development deliverables should be scripts that deploy or upgrade the database. Any change, no matter how small, takes the form of a script. Some recommend using diff tools, but I think they are a rat hole. I champion versioning the database metadata and having scripts to upgrade from version N to version N+1. At deployment the application checks the currently deployed version, then runs all the upgrade scripts that bring it up to current. There is no script that deploys the current version directly: a new deployment first deploys v0 of the database, then goes through all the version upgrades, including dropping objects that are no longer used. While this may sound a bit extreme, it is exactly how SQL Server itself keeps track of the various changes occurring in the database between releases.
As simple text scripts, all the database upgrade scripts are stored in version control just like any other sources, with tracking of changes, diff-ing and check-in reviews.
For a more detailed discussion and some examples, see Version Control and your Database.
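A minimal sketch of that version-N-to-N+1 scheme (the SchemaVersion table and the objects it guards are invented names for the example, not something the linked article prescribes):

    -- v0 bootstrap: create the version-tracking table if it is missing.
    IF OBJECT_ID(N'dbo.SchemaVersion') IS NULL
    BEGIN
        CREATE TABLE dbo.SchemaVersion (Version INT NOT NULL);
        INSERT INTO dbo.SchemaVersion (Version) VALUES (0);
    END
    GO
    -- Upgrade script v0 -> v1: runs only if the database is still at v0.
    IF (SELECT Version FROM dbo.SchemaVersion) = 0
    BEGIN
        CREATE TABLE dbo.Orders (OrderId INT IDENTITY(1,1) PRIMARY KEY);
        UPDATE dbo.SchemaVersion SET Version = 1;
    END
    GO

At deployment time the application (or a deployment script) simply runs every upgrade script in order; the version guard makes each one a no-op once it has been applied.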
Option (1). Each developer can have their own up-to-date local copy of the DB ("up to date" meaning recreated from the latest version-controlled scripts: base + incremental changes + base data + run data). In order to make this work you should have the ability to "one-click" deploy any database locally.
You really cannot go wrong with a tool like Visual Studio Database Edition. This is a version of VS that manages database schemas and much more, including deployments (updates) to target server(s).
VSDE integrates with TFS so all your database schema is under TFS version control. This becomes the "source of truth" for your schema management.
Typically developers will work against a local development database, and keep its schema up to date by synchronizing it with the schema in the VSDE project. Then, when the developer is satisfied with his/her changes, they are checked into TFS, and a build and then deployment can be done.
VSDE also supports refactoring, schema compares, data compares, test data generation and more. It's a great tool, and we use it to manage our schemas.
In a previous company (which used Agile in monthly iterations), .sql files were checked into version control, and an optional part of the full build process was to rebuild the database from production and then apply each .sql file in order.
At the end of the iteration, the .sql instructions were merged into the script that creates the production build of the database, and the script files were moved out. So you're only applying updates from the current iteration, not going back to the beginning of the project.
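A small sketch of that "apply each .sql file in order" step using SQLCMD mode, whose :r directive includes and runs scripts in sequence (the file names are made up for the example):

    -- master.sql: run with  sqlcmd -S <server> -d <database> -i master.sql
    :r .\001_add_customer_table.sql
    :r .\002_add_orders_table.sql
    :r .\003_add_order_index.sql

At the end of the iteration these numbered files would be merged into the production build script and removed, exactly as described above.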
Have you looked at a product called DB Ghost? I have not personally used it, but it looks comprehensive and may offer an alternative, as per point 4 in your question.
Let's suppose I want to add a new feature to my ASP.NET MVC application running SQL Server 2008 as a data source. In order to implement this new feature, I need to add a few new columns to existing database tables.
After performing these changes on my development server and implementing the new features, what's the easiest way to perform the same database changes on the production server while deploying the new version of my application? Is there any way to automate this?
Edit: As I just found out, Visual Studio 2008's Server Explorer seems to be able to extract the necessary changes for me by comparing two different database layouts (Right-click database, click on "Compare Schema"). Does this usually cover my requirements or is there any big gotcha when using this feature?
I believe versioning the database using manually generated scripts, similar to the approach described by K. Scott Allen, is well worth the investment in time, but it's not the automated solution you're asking for.
Red Gate's SQL Compare utility might do it for you if your needs are relatively straightforward. If not, a tool like ER-Win or ER-Studio can handle hard-core schema management and migrations.
You should have db and app layer versioning. Period.
If you have db version 1.0 and app layer 1.0 in production, all the changes performed afterwards for versions 1.1 and 1.1.5 should be "upgradable" via scripts.
All "alter table" , and "alter proc" statements are runnable via scripts.
Or alternatively:
Restore the 1.0 db to a db_old database. Create the production db from scripts and just copy the data (if you don't have a very complicated database this should not be difficult).
Automatic deployment for app layer 1.0.
Once again: you must rehearse the whole process in DEV, test it in TEST, verify it in QA, and only lastly perform it in the PROD environment.
Edit: I personally think that if the team is not able to upgrade smoothly from version 1.0 to 1.1 at the same time in DEV, it smells like bad design and mixed responsibilities between what should be in the app layer and what on the db server.