A New BI / Database project: how to put databases under version control? - sql-server

We are starting a new BI project in our company. We have at least three developers working on database design and development. Our tools include Sparx EA, SQL Server 2008 EE and reporting tools yet to be determined. What kind of tools can one use for database version control in SQL Server? What kind of version control systems are available for database development (managing versions of the database schema, tables, stored procedures, etc.)?

Visual Studio Team System includes the Database Edition features for database source control, deployment, schema comparison and more. You don't necessarily need a tool to do database versioning but if you do want a tool then VSTS is one option.

Often it's easier to export a set of scripts that would create your database from scratch, and put those in version control instead.
Then, if you want to upgrade a database to a "version" from source control, you could use a tool like SQL Compare or "Data Dude" to compare two databases (or creation scripts and a database) and apply the changes from one to the other.
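As a rough illustration of such create-from-scratch scripts (the table and procedure here are invented placeholders), each object typically gets its own file of plain CREATE statements that can be diffed and versioned like any other code:

    -- 020_Customer.sql: creates the Customer table from scratch
    CREATE TABLE dbo.Customer (
        CustomerId int IDENTITY(1,1) NOT NULL PRIMARY KEY,
        Name       nvarchar(100)     NOT NULL,
        CreatedAt  datetime          NOT NULL DEFAULT (GETDATE())
    );
    GO

    -- 200_usp_GetCustomer.sql: read procedure for the Customer table
    CREATE PROCEDURE dbo.usp_GetCustomer
        @CustomerId int
    AS
        SELECT CustomerId, Name, CreatedAt
        FROM dbo.Customer
        WHERE CustomerId = @CustomerId;
    GO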

We create all database changes in scripts and never use the GUI. We then save the scripts in folders that are part of our Subversion store, just like any other piece of code. Since our configuration management people will not push to prod without a script, we have no trouble at all enforcing this. The beauty is that all the scripts you need for a particular change are located together, and those belonging to another change that is not yet ready to go to prod are not pushed by accident.
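As a sketch of what one such checked-in change script might look like (the table and column names are hypothetical), we write them so they can be re-run safely:

    -- Change 1234: add PreferredLanguage to Customer (safe to re-run)
    IF NOT EXISTS (SELECT 1 FROM sys.columns
                   WHERE object_id = OBJECT_ID('dbo.Customer')
                     AND name = 'PreferredLanguage')
    BEGIN
        ALTER TABLE dbo.Customer
            ADD PreferredLanguage nvarchar(10) NULL;
    END
    GO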
We store all our SSIS packages as files as well, and we store our configurations in scripts and push them the same way.
Should you want to do a comparison between databases to make sure nothing was missed, I highly recommend SQL Compare by Red Gate.

Related

How do you deal with multiple developers and database changes?

I would like to know how you guys deal with development database changes in groups of two or more devs. Do you have a global DB everyone accesses, or does everyone keep a local copy and apply script changes manually? It would be nice to see the pros and cons you've noticed for each approach and the number of devs on your team.
Start with "Evolutionary Database Design" by Martin Fowler. This sums it up nicely.
There have been other questions about DB development that may be useful too, for example Is RedGate SQL Source Control for me?
Our approach is that everyone has their own DB, the complete DB can be created from create scripts with base data if required. All the scripts required for this are in source control.
All scripts are CREATE scripts and they reflect the current state of the database schema. Upgrades are in separate SQL files which can upgrade existing DBs from a specific version to a newer one (run sequentially). After all the updates have been applied, the schema must be identical to what you would get from running the setup scripts.
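A minimal sketch of one such sequential upgrade script, assuming a single-row version-tracking table (the SchemaVersion table, version numbers and column are illustrative, not part of the toolset described below):

    -- Upgrade_007.sql: brings a version-6 database up to version 7
    IF NOT EXISTS (SELECT 1 FROM dbo.SchemaVersion WHERE Version = 6)
    BEGIN
        RAISERROR('Expected schema version 6; aborting upgrade.', 16, 1);
        RETURN;  -- exits the batch so the changes below never run
    END

    ALTER TABLE dbo.Customer ADD IsActive bit NOT NULL DEFAULT (1);

    UPDATE dbo.SchemaVersion SET Version = 7;
    GO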
We have some tools to do this (we use SQL Server and .NET):
Scripting is done with a tool which also applies a standard formatting so that the changes are well traceable with text diff tools (and by the SCM)
A runtime module takes care of comparing the existing DB objects, running updates if required, automatically applying "non-destructive" changes, and then checking the DB objects again to ensure a correct migration before committing the changes
The toolset is available as open-source project (licensed under LGPL), it's called the bsn ModuleStore (note that it is limited to SQL Server 2005/2008/Azure and to .NET for the runtime part).
We use what was code-named "Data Dude" - the database features in TFS and Visual Studio - to deal with this. When you "get latest" and bring in code that relies on a schema change, you also bring in the revised schemas, stored procedures, etc. You right-click the database project and Deploy; that gets your local schema and stored procedures in sync but doesn't overwrite your data. The job of working out the script to get you from your old schema to the new one falls to Visual Studio, not to you or your DBA. We also have "populate" scripts for things like lists of provinces, and a deploy runs them for you.
So much better than the old way, which always fell apart at high-stress times, with people checking in code and then going home, and nobody knowing what columns to add to make the code work, etc.
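For what it's worth, a re-runnable "populate" script for a lookup table like the provinces mentioned above could be a simple MERGE (the table and values here are just an example, not the actual script):

    -- Populate script for the Province lookup table (safe to re-run on each deploy)
    MERGE dbo.Province AS target
    USING (VALUES
        ('ON', 'Ontario'),
        ('QC', 'Quebec'),
        ('BC', 'British Columbia')
    ) AS source (Code, Name)
        ON target.Code = source.Code
    WHEN MATCHED THEN
        UPDATE SET Name = source.Name
    WHEN NOT MATCHED THEN
        INSERT (Code, Name) VALUES (source.Code, source.Name);
    GO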

Will Visual Studio 2010 Deploy SQL DB Schema Updates?

I have, admittedly, done very little searching on my own, so feel free to insult and/or harass me.
I have two databases that I use, one for development and one for production. In Visual Studio 2010, I have simple overrides that change the connection string based on whether I'm building for Debug or for Release.
Right now, I manually change the database schemas when needed. I've thought about creating scripts, but I'm wondering if there is a better way? Can I create a DB in Visual Studio on my local computer, migrate those changes to the development SQL and then, finally, migrate up to Production so that the schema version matches the release?
Thanks in advance!
.NET 4 | Visual Studio 2010 | MS SQL Server 2008
Are you familiar with Visual Studio Database Edition? Its reason for being is to help you with what you just described. Not only does it allow you to version control your database schema, it allows you to build deployment T-SQL scripts to update a database from one version to another. It has built-in schema compare and data compare. I highly recommend it. We use it to version control our database schema and to do deployments. I would never attempt to update database schemas by creating manual scripts anymore.
This used to be a pain in the ass for me. Then I bought this: http://www.red-gate.com/products/sql-development/sql-source-control/
Provided you're using a compatible source control solution (such as SVN), you can make changes in one database, commit them, and then update other databases with the same structure.
As Randy mentions, the Database Edition of VS.NET will do this.
I use SQL Delta. It's a third party tool and very well priced for what it does. I've used it for a number of years on extremely large projects (that is, the database has hundreds of tables and thousands of stored procedures) and it has never failed us in what it does.
The way I used it is this:
Step 1) Get it to produce a blank database schema as per the development database.
Step 2) Move this database over to the production server.
Step 3) Sync the production database with this database.
You can also produce scripts and run the scripts, and all of this can be automated.

How do you Move Dev Database Changes to Production Database?

I have been working on a project and gotten it through the first stage. However, the requirements ended up changing, and I have to add new tables and redo some of the foreign key references in the DB.
The problem I have is my lack of knowledge about applying this kind of change to a staging and then a production database once I get the development done on the dev database.
What are some strategies for migrating database schema changes and maintaining data in the database?
About as far as my knowledge goes is to open up SQL Server Management Studio and start adding tables manually. I know this is probably a bad way to do it, so I'm looking for how to do it properly while realizing I probably started out wrong.
For maintaining schema changes you can use ApexSQL Diff, a SQL Server and SQL Azure schema comparison and synchronization tool, and for maintaining data in the database you can use ApexSQL Data Diff, a SQL Server and SQL Azure data comparison and synchronization tool.
Hope this helps
Disclaimer: I work for ApexSQL as a Support Engineer
You need something called a "kit". Obviously, if you are maintaining some kind of source control, all the scripts for the changes that you make in the development environment should be kept in the source control tool.
Once you are done with all the scripts/changes that you deem certified to move to the next higher environment, prepare the kit with all these scripts in folders (ideally categorized as Procedures, Tables, Functions, Bootstraps), and then have a batch file that executes the scripts in the kit in a particular order using the OSQL command line utility.
Have separate batch files for UAT/staging/production so that you can just double-click the batch file to execute the kit on the appropriate server. Check the OSQL options.
This way all your environments are in sync!
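As a variation on the batch-file-per-environment approach (a sketch only, with invented file names), the kit's execution order can also live in a single driver script using the :r include directive, which requires sqlcmd or SSMS's SQLCMD mode rather than the older OSQL:

    -- RunKit.sql: run with sqlcmd -S <server> -d <database> -i RunKit.sql
    -- The file paths below are illustrative; they mirror the kit's folder layout.
    :r .\Tables\001_Customer.sql
    :r .\Tables\002_Province.sql
    :r .\Functions\001_fn_CustomerAge.sql
    :r .\Procedures\001_usp_GetCustomer.sql
    :r .\Bootstraps\001_PopulateProvinces.sql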
I typically use something like the SQL Server Publishing Wizard to produce SQL scripts of the changes. That is a rather simple and easy approach. The major downside with that tool is that the produced script will drop and recreate tables that are not changed but are used by procedures that have changed (and I can't understand why), so there is some manual labour involved in going through the script and removing things that don't need to be there.
Note that you don't need to download and install this tool; you can launch it from within Visual Studio. Right-click on a connection in the Server Explorer and select "Publish to Provider" in the context menu.
Red Gate SQL Compare and SQL Data Compare all the way. Since my company bought it, it saved me tons of time staging our databases from DEV to TEST to ACCEPTANCE to PRODUCTION.
And you can have it synchronize with a scripts folder too for easy integration in a source control system.
http://www.red-gate.com
You might want to check out a tool like Liquibase: http://liquibase.org/
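Liquibase also supports plain SQL changelogs, which suits a SQL Server shop; a minimal "formatted SQL" changeset looks roughly like the sketch below (author, id and table are made-up examples):

    --liquibase formatted sql

    --changeset jdoe:1
    CREATE TABLE dbo.Customer (
        CustomerId int NOT NULL PRIMARY KEY,
        Name nvarchar(100) NOT NULL
    );
    --rollback DROP TABLE dbo.Customer;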
You can use Visual Studio 2015. Go to Tools => SQL Server => New Schema Comparison.
Step 1) Select the source and target database, then click the Compare option.
Step 2) Once the comparison has completed, click the Generate Script icon (Shift+Alt+G). This will generate the commit script.
Step 3) To generate a rollback script for the database changes, just swap the databases from step 1.
There are some tools available to help you with that.
If you have Visual Studio Team Edition, check out database projects (aka Data Dude, aka Visual Studio Team Edition for Database Professionals). See here and here.
It allows you to generate a model from the dev/integration database and then (for many, but not all cases) automatically create scripts which update your prod database with the changes you made to dev/integration.
For VS 2008, make sure you get the GDR2 patches.
We have found that the best way to push changes is to treat database changes like code. All changes are in scripts, they are in source control, and they are part of a version. Nothing is ever, under any circumstances, pushed to prod that is not scripted and in source control. That way you don't accidentally push changes that are in dev but not yet ready to be pushed to prod. Further, you can restore prod data to the dev box, rerun all the scripts not yet pushed, and you have fresh data with all the dev work preserved. This also works great when you have lookup values in tables that are changing that you don't want pushed to prod until other things move as well. Script the insert and put it with the rest of the code for the version.
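For instance, a scripted lookup-value change that ships with the rest of that version's code might be written so it can be re-run safely (the table and values here are hypothetical):

    -- Lookup data change, part of this version's scripts
    IF NOT EXISTS (SELECT 1 FROM dbo.OrderStatus WHERE StatusCode = 'BACKORDERED')
    BEGIN
        INSERT INTO dbo.OrderStatus (StatusCode, Description)
        VALUES ('BACKORDERED', 'Item is on back order');
    END
    GO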
It's nice to use those tools to do a compare to see if something is missed in the scripts, but I would NEVER rely on them alone. Far too much risk of pushing something "not yet ready for prime time" to prod.
A good database design tool (such as Sybase Powerdesigner) will allow you to create the design changes to the data model, then generate the code to implement those changes. You can then store and run the code as you choose. This tool should also be able to do reverse engineering when you inherit a database you didn't build.
Finding all the changes between development and production is often difficult even in an organized, well-documented environment. Idera has a tool for SQL Server which will detect structural differences between your development and production database and another tool which detects changes in the data. In fact, I often use these to go the other direction and sync development with production to start a new project.

How do you manage your sqlserver database projects for new builds and migrations?

How do you manage your SQL Server database build/deploy/migrate for Visual Studio projects?
We have a product that includes a reasonable database part (~100 tables, ~500 procs/functions/views), so we need to be able to deploy new databases of the current version as well as upgrade older databases up to the current version. Currently we maintain separate scripts for creation of new databases and migration between versions. Clearly not ideal, but how is anyone else dealing with this?
This is complicated for us by having many customers who each have their own db instance, rather than say just having dev/test/live instances on our own web servers, but the processes around managing dev/test/live for others must be similar.
UPDATE: I'd prefer not to use any proprietary products like RedGate's (although I have always heard they're really good and will look into that as a solution).
We use Red-Gate SQLCompare and SQLDataCompare to handle this. The idea is simple. Both compare products let you maintain a complete image of the schema or data from selected tables (e.g. configuration tables) as scripts. You can then compare any database to the scripts and get a change script. We keep the scripts in our Mercurial source control and tag (label) each release. Support can then go get the script for any version and use the Redgate tools to either create from scratch or upgrade.
Redgate also has an API product that allows you to do the compare function from your code. For example, this would allow you to have an automatic upgrade function in your installer or in the product itself. We often use this for our hosted web apps as it allows us to more fully automate the rollout process. In our case, we have an MSBuild task that support can execute to do an automatic rollout and upgrade. If you distribute to third-parties, you have to pay a small additional license fee for each distribution that includes the API.
Redgate also has a tool that automatically packages a database install or upgrade. We don't use that one as we have found that the compare against scripts for a version gives us more flexibility.
The Redgate tools also help us in development because they make it trivial to source control the schema and configuration data in a very granular way (each database object can be placed in its own file)
The question was asked before SSDT projects appeared, but that's definitely the way I'd go nowadays, along with hand-crafting migration scripts for structural db changes where there is data that would be affected.
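As an example of the kind of hand-crafted migration meant here (column name and backfill value are placeholders), a structural change that affects existing data is often staged so no rows are lost: add the column as nullable, backfill, then tighten the constraint.

    -- Add a NOT NULL column to a table that already contains rows
    ALTER TABLE dbo.Customer ADD CountryCode char(2) NULL;
    GO

    -- Backfill existing rows before enforcing the constraint
    UPDATE dbo.Customer SET CountryCode = 'US' WHERE CountryCode IS NULL;
    GO

    ALTER TABLE dbo.Customer ALTER COLUMN CountryCode char(2) NOT NULL;
    GO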
There's also the MS VSTS method (2008 description here), anyone got a good article on doing this with 2010 and the pros/cons of using these tools?

Versioning SQL Server DDL code

I'd like to have all DB DDL code under CVS.
We are using Subversion for our .NET code but all database code remains still unversioned.
All we know is how important DB logic can be. I've googled but found only a few (expensive) tools. I believe other (cheaper) solutions exist.
What approach do you advise to follow? What tools are most appropriate?
SQL Server 2005, VS 2008 TS, TSVN
UPDATE
Our coding scenario is that developers cannot access the PROD DB directly. It is changed only by scripts (so this is not a problem).
I'm mostly interested in the DEV environment, where all developers have full access.
So it happens that a developer overwrites a USP previously changed by another.
I'd like to be able to restore a lost version, compare USP revisions, etc.
UPDATE-2
To create deployment scripts we are using Red Gate SQL Compare.
It works perfectly, so deployment scripts are not an issue.
If you haven't already read it, Martin Fowler's article Evolutionary Database Design is a great place to start.
The article is hard to summarize, but it describes how his team dealt with database versioning in a rapidly changing development process. They created their own tools to facilitate things: scripts to bring users up to the current master, to copy any version of the schema so users could debug one another's working copies, and so on.
For a solid low-tech solution, I've found it helpful to keep two kinds of DDL scripts in source control:
A master version that can create the database objects from scratch.
'Version upgrade' scripts for each development iteration.
They're redundant to a degree, but extremely useful (particularly when it comes to deployment).
If you haven't already looked at the Visual Studio Database Edition GDR (a.k.a. "Data Dude"), you should definitely download it and try it out:
http://www.microsoft.com/downloads/details.aspx?FamilyID=bb3ad767-5f69-4db9-b1c9-8f55759846ed&displaylang=en
Among other things, the GDR will facilitate team development by making it easy for each developer to maintain their own local copy of a database, version scripts, create deployment scripts to move a database schema to a new version, and even support database rollback.
It's free if you are using Team System Developer Edition. Check it out.
If you are using Visual Studio Team Suite or Visual Studio Developer Edition, you are entitled to a copy of Visual Studio Database Professional. This is designed to do exactly what you describe, and much more. We use it to manage our database schema (code).
Randy
We use Subversion for all our database code as well. Since nothing is allowed to go to prod unless it is in a script, there seems to be no problem getting people to put all the scripts into Subversion. We tend to write ALTER TABLE scripts to change tables with existing data, and we also keep scripts that recreate the whole table structure in case we need to create a new database from scratch (we often have the same database structure on multiple servers, as some of our clients are very large, do not want their data accidentally available to the competition, and so pay for separate servers; we may therefore need to create the whole database again with no data). For objects that don't directly store data we drop the original object and recreate it with a CREATE statement. Each project has its own home in the repository and each database does too, so a script may be in more than one place to facilitate deployment.
But the real key is that no one can load to Prod without a script. We don't give our devs direct rights to prod, so they have no problem doing things in scripts as opposed to using SSMS.
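A sketch of the drop-and-recreate pattern described above for objects that don't store data (the procedure and columns are placeholders):

    -- Drop the original object if it exists, then recreate it
    IF OBJECT_ID('dbo.usp_GetCustomer', 'P') IS NOT NULL
        DROP PROCEDURE dbo.usp_GetCustomer;
    GO

    CREATE PROCEDURE dbo.usp_GetCustomer
        @CustomerId int
    AS
        SELECT CustomerId, Name
        FROM dbo.Customer
        WHERE CustomerId = @CustomerId;
    GO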
I wrote SMOscript, which generates a CREATE script for each object in a database.
Use this tool to generate the scripts into a directory covered by CVS, and update your repository.
Finally, I found this tool and approach extremely useful and very easy to introduce (at least at the beginning, where there is no versioning solution in place):
http://www.codeproject.com/KB/database/SQLScripter.aspx
You can run it out of the box.
For a final solution I'd lean towards the GDR.
This also sounds interesting:
Freeware:
http://dbsourcetools.codeplex.com/
http://www.codeproject.com/KB/database/ScriptDB4Svn.aspx
http://www.codeproject.com/KB/database/SQLScripter.aspx
http://blog.boxedbits.com/archives/133
Commercial:
http://www.nobhillsoft.com/Randolph.aspx
You should use Management Studio (SSMS) and place the .sql files under source control, possibly separating schema objects into folders.
Hope this helps
See if Wizardby fits your needs.
