I'm building a web app using EF 4.0 and taking the "model first" approach - so I define all my entities, generate a DDL script, and create a database structure based on the model.
Now, every time something changes in the model, I regenerate the DDL and the database structure is recreated from scratch - tables are dropped and recreated. In this process, I lose all configuration data that was already there. This is fine for now, but going forward, once the app goes to production, how will I be able to upgrade the database if I decide to change something?
To simplify, (how) can I upgrade the database from the model and keep all the data?
Thanks!
You need another workflow or T4 template for DB generation. One is already available in the Entity Designer Database Generation Power Pack extension for Visual Studio 2010. The only problem is that these workflows use database tools from Visual Studio which are only available in the Premium and Ultimate editions.
If you don't have VS 2010 Premium or Ultimate, you must deploy the new DB to a test environment first and write the diff script yourself, or buy a diff script generator - I recommend the DB tools from Red Gate.
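A hand-written diff script is usually just a series of guarded ALTER statements, so it can be re-run safely. A minimal sketch of the idea - the table and column names here (dbo.Customers, LoyaltyPoints) are invented for illustration:

    -- Add the new column only if it isn't there yet, so the script is re-runnable.
    -- dbo.Customers and LoyaltyPoints are hypothetical names.
    IF NOT EXISTS (SELECT 1 FROM sys.columns
                   WHERE object_id = OBJECT_ID(N'dbo.Customers')
                     AND name = N'LoyaltyPoints')
    BEGIN
        ALTER TABLE dbo.Customers
            ADD LoyaltyPoints int NOT NULL
                CONSTRAINT DF_Customers_LoyaltyPoints DEFAULT (0);
    END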
I have, admittedly, done very little searching on my own, so feel free to insult and/or harass me.
I have two databases that I use, one for development and one for production. In Visual Studio 2010, I have simple overrides that change the connection string based on whether I'm building for Debug or for Release.
Right now, I manually change the database schemas when needed. I've thought about creating scripts, but I'm wondering if there is a better way? Can I create a DB in Visual Studio on my local computer, migrate those changes to the development SQL and then, finally, migrate up to Production so that the schema version matches the release?
Thanks in advance!
.NET 4 | Visual Studio 2010 | MS SQL Server 2008
Are you familiar with Visual Studio Database Edition? Its reason for being is to help you with exactly what you just described. Not only does it allow you to version control your database schema, it allows you to build deployment T-SQL scripts to update a database from one version to another. It has built-in schema compare and data compare. I highly recommend it. We use it to version control our database schema and to do deployments. I would never attempt to update database schemas by creating manual scripts anymore.
This used to be a pain in the ass for me. Then I bought this: http://www.red-gate.com/products/sql-development/sql-source-control/
Provided you're using a compatible source-control solution (such as SVN), you can make changes in one database, commit them, and then update other databases with the same structure.
As Randy mentions, the Database Edition of VS.NET will do this.
I use SQL Delta. It's a third-party tool and very well priced for what it does. I've used it for a number of years on extremely large projects (that is, databases with hundreds of tables and thousands of stored procedures) and it has never failed us in what it does.
The way I used it is this:

1. Get it to produce a blank database schema as per the development database.
2. Move this database over to the production server.
3. Sync the production database with this database.
You can also produce scripts and run them, and all of this can be automated.
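The generated sync scripts tend to have the same shape: the changes wrapped in a transaction that rolls back on the first error. A rough sketch of what one might contain (the object names are invented):

    -- Typical shape of a generated sync script; dbo.Orders is illustrative.
    SET XACT_ABORT ON;   -- abort and roll back on the first error
    BEGIN TRANSACTION;

    ALTER TABLE dbo.Orders
        ADD ShippedDate datetime NULL;

    EXEC sys.sp_rename N'dbo.Orders.Status', N'OrderStatus', N'COLUMN';

    COMMIT TRANSACTION;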
I'm only slightly familiar with the features of Data Tier Applications and with what Database Projects did in the Database Edition of Visual Studio.
Are these two different, overlapping solutions for database version control? Or does the Data Tier Application functionality outright replace the need for Visual Studio Database Edition and database projects?
DAC provides an application model which can be used as an interface between developers and DBAs. The developer edits the model, the DBA manages/deploys from the model. For example, once the model is built or extracted, it can be deployed to multiple servers.
Imagine the .dacpac as an .exe. The developer builds an .exe and hands it off to someone. At this point, it would be nice if the developer doesn't have to worry about where that .exe runs, because the .exe is internally consistent - it either runs or it doesn't. Why should the developer need to worry about targeting 2008, 2005, or Azure specifically? Just develop the app model and let DAC take care of the rest...
Having this deployment artifact also provides some new capabilities. Examples include versioned deployments, the ability to determine if someone has changed the database since the last deployment or upgrade, and the ability to create the same database on different target servers.
Do you like having to manage a library of upgrade scripts for your various databases? Wouldn't it be nice if the entire state of your database could be built or captured (extracted) at any point in time?
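For instance, once a dacpac is deployed and registered, the server records which DAC version each database came from. Assuming SQL Server 2008 R2, which exposes registrations through a view in msdb (column list quoted from memory, so verify it on your build):

    -- List registered data-tier applications and their deployed versions.
    SELECT instance_name,
           type_name,
           type_version,   -- the version number baked into the dacpac
           database_name,
           date_created
    FROM msdb.dbo.sysdac_instances;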
The database application project mashup in VS 2010 will be resolved in an upcoming release of database-centric developer tools. Investing in .dbschema or DAC won't affect forward compatibility.
Right now, the difference between Database Projects and Data Tier Projects is at the point of deployment. If you want to create a dacpac, you'd use the Data Tier Project. If you want to create a .dbschema and SQL migration file, you'd use the conventional Database Project.
As far as I'm aware, Data Tier Applications are expected to be important in future for SQL Azure deployment.
Unless you're specifically looking at SQL Azure, I'd use Database Projects for now. It all depends on what you're trying to achieve. It could be that SQL Source Control (by Red Gate, the company I work for) is more suited to your needs.
I believe Visual Studio database projects are targeted at developers.
Data Tier Applications are targeted at DBAs. See this blog for details.
What is the best way to version database objects (triggers, SPs, and other elements) in VSS, in a similar fashion to the way that we store source code in VSS and access it in Microsoft Visual Studio?
We would like to check database elements in and out in VSS or a similar tool so that we can store database objects in a central location, and also so that we can have versioning of database elements, for example, version history of stored procedures.
We currently use SQL Server 2005 as our database engine.
If you are using VS, the easiest way to version your database objects is to create a database project using the "Database" project template in Visual Studio.
The entire database project can be associated with source control (VSS in your case) and then all your DB object scripts are versioned.
A very important point: make sure that developers get out of their old habit of directly updating or changing objects in the DB, because the project alone will not stop them from doing so. An easy way out of that dilemma is to ensure that the DB project is built and deployed periodically (continuous integration), just like your code is.
This will ensure that changes made directly to the DB are lost, which automatically inculcates the behavioural change in developers.
Refer to this link for a step-by-step tutorial on how to get started with a DB project.
The screenshots are for VS 2010, but DB projects have been around since VS 2005 and work in more or less the same way. Very easy to use.
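If you need to seed the project from an existing database, or just want an interim dump of the programmable objects for check-in, the catalog views will hand you every definition. A minimal sketch that should work on SQL Server 2005 and later:

    -- Pull the source of every proc, view, function and trigger so each
    -- definition can be saved to its own file and checked into VSS.
    SELECT OBJECT_SCHEMA_NAME(m.object_id) AS schema_name,
           OBJECT_NAME(m.object_id)        AS object_name,
           o.type_desc,
           m.definition
    FROM sys.sql_modules AS m
    JOIN sys.objects AS o ON o.object_id = m.object_id
    ORDER BY schema_name, object_name;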
If you're willing to use Subversion or TFS, and are using SSMS to make your database changes, Red Gate's SQL Source Control may be a tool that would work for you.
http://www.red-gate.com/products/SQL_Source_Control/
[Edit]
We've now added VSS and SourceGear Vault support. Try the early access build:
http://www.red-gate.com/MessageBoard/viewtopic.php?t=12265
I am looking for options to get our vast collection of DB objects across many DBs into source control (TFS 2010). Once we succeed here, we will work toward generating our alter scripts for a particular DB change via TFS build.
The problem is, our data architecture group is responsible for maintaining the DB objects (excluding SPs), and they work within a model-centric process, via ERWin. What this means is that they maintain the DBs via ERWin models and generate alter scripts from them that are used to release changes.
In order to achieve our goal of getting the DB objects (not just the ERWin models) into TFS, I believe the best option is to do this via Visual Studio DB projects. From what I can tell, there is very little urgency for CA to continue supporting the integration between ERWin and Visual Studio, which no longer works as of Visual Studio 2008 DB Ed. GDR. If I have been misled in this regard, please feel free to set me straight.
One potential solution is to:
1. Perform the changes in the ERWin model.
2. Take the alter script generated from ERWin and import it into the appropriate Visual Studio DB project, updating the objects in the DB project.
3. Check the changed objects in the DB project into TFS.
4. TFS Build executes to generate the alter scripts that will be used to push the changes through our release process.
My question is, is this solution viable, or are there any other options?
Your solution sounds quite cumbersome to me, as you've essentially got a clash between two different ways of working, which you're trying to resolve.
How do the data architecture group work? Do they use the version control in ERWin, or what process do they have for managing versions?
I would look at integrating ERWin into your build process, so the alter scripts are retrieved from it automatically instead of being manually merged into the DB objects.
This could be expanded further, to integrate the two teams source control systems, allowing you to see one view of database objects going forward.
Let's suppose I want to add a new feature to my ASP.NET MVC application running SQL Server 2008 as a data source. In order to implement this new feature, I need to add a few new columns to existing database tables.
After performing these changes on my development server and implementing the new features, what's the easiest way to perform the same database changes on the production server while deploying the new version of my application? Is there any way to automate this?
Edit: As I just found out, Visual Studio 2008's Server Explorer seems to be able to extract the necessary changes for me by comparing two different database layouts (Right-click database, click on "Compare Schema"). Does this usually cover my requirements or is there any big gotcha when using this feature?
I believe versioning the database using manually generated scripts, similar to the approach described by K Scott Allen, is well worth the investment in time. But it's not the automated solution you're asking for.
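The heart of that approach is a version table plus numbered, run-once change scripts. A minimal sketch - the SchemaVersion table and the dbo.Orders column are invented here, not prescribed by the article:

    -- 0001_add_shipped_date.sql: applied exactly once, in order.
    -- dbo.SchemaVersion is a hypothetical bookkeeping table.
    IF NOT EXISTS (SELECT 1 FROM dbo.SchemaVersion WHERE Version = 1)
    BEGIN
        ALTER TABLE dbo.Orders ADD ShippedDate datetime NULL;

        INSERT INTO dbo.SchemaVersion (Version, AppliedOn)
        VALUES (1, GETDATE());
    END

Each script bumps the version once it runs, so a deployment tool only has to apply, in order, the scripts whose number is higher than the target database's current version.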
Red Gate's SQL Compare utility might do it for you if your needs are relatively straightforward. If not, a tool like ER-Win or ER-Studio can handle hard-core schema work and migrations.
You should have db and app layer versioning. Period.
If you have db version 1.0 and app layer version 1.0 in production, all the changes performed afterwards for versions 1.1 and 1.1.5 should be "upgradable" via scripts.
All "alter table" and "alter proc" statements are runnable via scripts.
Or alternatively:
Restore the 1.0 db to a db_old database. Create the production db from scripts and just copy the data over (if you don't have a very complicated database, this should not be difficult).
Automated deployment for app layer 1.0.
Yet again: you must rehearse the whole process in DEV, test it in TEST, verify it in QA, and only then perform it in the PROD environment.
Edit: I personally think that if the team is not able to upgrade smoothly from version 1.0 to 1.1 on DEV, it smells like bad design and a mix-up in responsibilities between what should be in the app layer and what should be on the db server.