I am trying to automate our build process. So far I have been able to create the code packages using CruiseControl.NET, NAnt, and MSBuild.
But I am completely stuck when it comes to database comparison. I want to compare the development database to our QA database and create a script with the changes.
I currently use the VS2010 database project, and I can compare them manually. Is there any way I can make this comparison automatic?
If I deploy my project, I get a .sql file generated from the dev database, but it scripts the whole database. I would like to have only the differences between dev and QA.
Is there any way to do this? Is there any tool that allows me to compare the schemas and that I can run from MSBuild or the command line?
Thanks!
I've had a lot of luck with RedGate's SQLCompare product, and it comes with an SDK.
http://www.red-gate.com/supportcenter/Content?c=knowledgebase%5CSQL_Comparison_SDK%5CKB200801000220.htm&p=SQL%20Comparison%20SDK
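If all you need is a change script rather than the full SDK, the sqlcompare.exe command line can be wired into CCNet or MSBuild directly. A minimal sketch, assuming the default install path and placeholder server/database names:

```powershell
# Rough sketch: script out the dev -> QA schema differences with the
# SQL Compare command line. Path, server and database names are placeholders.
$sqlCompare = "C:\Program Files (x86)\Red Gate\SQL Compare 10\sqlcompare.exe"

# /Server1 + /Database1 is the source (dev); /Server2 + /Database2 is the
# target (QA). /ScriptFile writes the change script instead of synchronizing.
& $sqlCompare /Server1:DEVSQL /Database1:MyAppDb `
              /Server2:QASQL /Database2:MyAppDb `
              /ScriptFile:"C:\Builds\Upgrade_MyAppDb.sql"

if ($LASTEXITCODE -ne 0) { throw "sqlcompare failed with exit code $LASTEXITCODE" }
```

Writing the script out (rather than using the /sync switch to synchronize directly) keeps a human in the loop before anything touches QA.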
You can use xSQL Software's SQL Schema Compare command line utility to automate the process with very little effort. And, the tool is free for SQL Server Express.
Let's say we have a Visual Studio 2015 solution with a related database project (SQL Server 2014), and each developer wants their own local copy of the database on their machine. I want to create some kind of script in TFS so that developers can regenerate their local database from TFS, including the dictionary and mandatory data in some tables, to make it runnable.
I know there is something like pre- and post-deployment scripts. Is there any good document or video on this for someone who has never done it before?
It is pretty straightforward:
get your schema into SSDT, i.e. into .sql files
create a script to deploy to the local instance using SSDT or something like the command-line version of Redgate SQL Compare (see the sketch after this list)
find a way to deploy static data using pre/post-deployment scripts or some other way
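For the deploy step, a minimal sketch using SqlPackage.exe; the install path, server, and database names are placeholders, and any pre/post-deployment scripts embedded in the dacpac run as part of the same publish:

```powershell
# Rough sketch: publish the dacpac built from the SSDT project to a
# developer's local instance. Paths and names below are placeholders.
$sqlPackage = "C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe"

& $sqlPackage /Action:Publish `
              /SourceFile:"C:\Builds\MyDb.dacpac" `
              /TargetServerName:"(local)" `
              /TargetDatabaseName:"MyDb"

if ($LASTEXITCODE -ne 0) { throw "SqlPackage publish failed" }
```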
I tried to document how to do it with SSDT here:
https://the.agilesql.club/taxonomy/term/34
There are lots of tools to help; just pick one and go with it. If you need help, shout :)
Ed
We are using TFS (service, not server) to manage the versions of our database schema. Periodically TFS generates a build and saves it in a drop folder.
Following best practices (http://martinfowler.com/articles/evodb.html) and recommendations, and above all based on our own experience (and suffering), we want to automatically generate a daily/weekly diff script with the changes checked in.
There are a lot of tools (Redgate, Visual Studio, open source) that help with this job, and I've tried them all. But in every case it has to be done manually.
We are doing it as often as we can… but since it is manual, it is not as often as it should be ;)
Is there a way to do it automatically (unattended)? Can it be done between the deployment scripts of two builds? Or between two databases? Is it also possible to compare data automatically?
Yes, it is possible to automatically generate the difference script. We have done that in my company.
We're using the vsdbcmd command-line tool from Visual Studio to generate the deployment script from the build, and we later use that script to deploy to the test servers. We do that with Visual Studio 2010.
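A rough sketch of the kind of call we make. Paths, server, and database names are placeholders, and option spellings vary between versions, so check vsdbcmd /? on your machine:

```powershell
# Generate (but do not execute) the deployment script from the .dbschema
# produced by a VS2010 database project build.
$vsdbcmd = "C:\Program Files (x86)\Microsoft Visual Studio 10.0\VSTSDB\Deploy\vsdbcmd.exe"

# /dd:- writes the deployment script without actually deploying it.
& $vsdbcmd /a:Deploy /dd:- `
           /model:"C:\Builds\MyDb.dbschema" `
           /cs:"Data Source=TESTSQL;Integrated Security=True" `
           /p:TargetDatabase=MyDb `
           /DeploymentScriptFile:"C:\Builds\Update_MyDb.sql"
```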
I'm currently in the process of redesigning our department's source control strategy using Team Foundation Server (TFS) with regard to database objects. Essentially, we store nothing in TFS at this time. I have discovered SSDT and really enjoy its integration with Visual Studio, and I think it will make our transition into TFS much easier.
So, does SSDT have the capability of generating scripts based on the deltas between my SSDT project and what is on our server? From what I have researched, it seems I will only be able to generate a script for the entire database.
Requirements (mind you, our developers do not have DDL access to production):
I cannot drop a database to re-create it
I cannot drop ALL objects (e.g., all stored procs) to re-create them, only what I need
Tables will need to be altered, not dropped, and only when they have changed
Dacpacs are out of the question
Our best option based on our environment at this time is to use scripts for updates
Our database environment is currently SQL Server 2008 R2. My SSDT version is the latest 2013 that was published in June.
Yes. If you do a publish from the project, you will pretty much meet all of these requirements, though dacpacs are built as part of the process. The schema model and pre/post-deploy scripts are stored in the dacpac, and the publish compares what "should" be present against what is currently in the database. It then generates a change script of all the changes necessary to bring the database in line with the project.
Make sure you use Refactor/Rename when renaming objects - that will cut down on table drop/recreate operations. You may want to be careful with the "Drop objects not in the project" option. If you haven't been careful about making sure all objects created on your production server are in your project, you could accidentally drop something important just because someone didn't get it checked in.
There are command-line options for the SqlPackage command that can generate change reports and change scripts for you. The scripts need to be run through SQLCMD or in SQLCMD mode, but you can definitely produce scripts pretty easily.
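For example, a minimal sketch with placeholder paths and server/database names; DeployReport shows what a deployment would change, and Script writes the SQLCMD-mode upgrade script:

```powershell
# Rough sketch: produce a change report and an upgrade script from the dacpac
# built by the project, against a target database.
$sqlPackage = "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\SqlPackage.exe"

# XML report of the changes a deployment would make:
& $sqlPackage /Action:DeployReport /SourceFile:"MyDb.dacpac" `
              /TargetServerName:PRODSQL /TargetDatabaseName:MyDb `
              /OutputPath:"DeployReport.xml"

# SQLCMD-mode script containing only the necessary changes:
& $sqlPackage /Action:Script /SourceFile:"MyDb.dacpac" `
              /TargetServerName:PRODSQL /TargetDatabaseName:MyDb `
              /OutputPath:"Update_MyDb.sql"
```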
So I am in the throes of developing our continuous integration practices. We are a .NET/MSSQL shop. We will all soon be on VS2012. We have settled on CruiseControl.NET as our CI server, using msbuild to compile our projects. We use SVN (possibly switching to Git later, but that's another discussion) for source control. I'm leaning towards using InstallShield to deploy code packages (usually web apps and/or batch executables) to our QA and production servers. (CCNet would build these MSIs as part of our CI.) We are also starting to include unit testing in our projects, and will use NUnit integrated with CCNet to run the tests automatically upon check-in.
So far this works for our standard web app/exe development. Where it does not fit in (yet) is with our MSSQL change management, or lack thereof. It's been pretty cowboy how we've done this. Some folks have used Migrator.Net. Others just do a SQL Compare with Redgate and generate a script. Still others have hand-written sql scripts. It may or may not be in SVN. "Source control" at the db level is basically "we have backups of our databases." Boo, hiss. Needless to say that if we want some consistency with our CI and with our deployments, we need to settle on something. So far I am leaning towards using VS SQL projects to handle the change management and deployment.
Note: we (developers) are not supposed to push changes. Sys admins do that. So we can't run anything to deploy code or sql.
So, 2 problems to solve (I think):
What "technique" to use so that our CI server blows away a CI version of the database that unit tests can be run against. I've settled that VS2012 SQL projects can do that: CCNet can run msbuild against the db project, which recreates the database. This is fairly easy.
How to generate change scripts for our QA and prod environments? This one I'm stuck on.
VS can do a schema compare and then generate the sql script -- but it is dependent on sqlcmd. So our sys admins would have to run sqlcmd from the command prompt to deploy it... probably not ideal. Right?
I could run msbuild again to deploy... but I don't want the database re-created, I just want changes deployed.
So what are the options here? I need something self-contained for the admins to run -- and check-in to SVN. Should I make another msi for database deployments? Can CCNet/msbuild make some other kind of "deployment package" for database changes (not re-creation) where the sys admins can double-click and go?
How do you all handle this?
Thanks
Tom
Check out the SQL Server Data Tools package from the Microsoft site.
This will register a new SQL Server 2012 Database project type to contain the definitions of all your database structures. Upon build, this will generate a create script that you can use to deploy your database.
Then, for upgrading your database, use the SQLPACKAGE.EXE tool with the build output (the dacpac) and the target database server name to generate an Update.sql script.
Update: Also, on the issue of how you're running unit tests, you could create supplemental methods that invoke the create script by launching a process and passing the path to the output create.sql script, then have your tests 'tear down' the database using the same method but with a drop database statement.
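A hypothetical helper along those lines; server, database name, and script path are all placeholders, and sqlcmd must be on the PATH:

```powershell
# Hypothetical test fixture helper: drop any leftover copy of the CI database,
# then rebuild it from the create script produced by the database project build.
function Reset-TestDatabase {
    param(
        [string]$Server = "(local)",
        [string]$Database = "MyDb_CI",
        [string]$CreateScript = "C:\Builds\create.sql"
    )

    # Tear down (assumes no other open connections to the database).
    sqlcmd -S $Server -Q "IF DB_ID('$Database') IS NOT NULL DROP DATABASE [$Database];"

    # Rebuild from the generated script.
    sqlcmd -S $Server -i $CreateScript
}
```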
We've got a SQL Server instance with some 15-20 databases, which we check into TFS with the help of RedGate. I'm working on a script to be able to replicate the instance (so a developer could run a local instance when needed, for example) with the help of these scripts. What I'm worried about is the dependencies between these scripts.
In TFS, RedGate has created these folders with .sql files for each database:
Functions
Security
Stored Procedures
Tables
Triggers
Types
Views
I did a quick test with PowerShell, just looping over these folders to execute the .sql files, but I think that might not always work. Is there a strict ordering I can follow? Or is there some simpler way to do this? To clarify, I want to be able to start with a completely empty SQL Server instance and end up with a fully configured one according to what is in TFS (without data, but that is OK). Using PowerShell is not a requirement, so if it is simpler to do it some other way, that is preferable.
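For reference, my quick test looked roughly like this (folder layout as above; Invoke-Sqlcmd comes from the SQL Server PowerShell module, and the root path and names are placeholders). It has no real dependency ordering, which is exactly what worries me:

```powershell
# Naive loop: run every .sql file folder by folder. The folder order below is
# a guess and does not resolve cross-object dependencies.
$root = "C:\tfs\Databases\MyDb"
$folders = "Types", "Security", "Tables", "Functions", "Views", "Stored Procedures", "Triggers"

foreach ($folder in $folders) {
    Get-ChildItem -Path (Join-Path $root $folder) -Filter *.sql |
        ForEach-Object {
            Invoke-Sqlcmd -ServerInstance "(local)" -Database "MyDb" -InputFile $_.FullName
        }
}
```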
If you're already using RedGate, they have a ton of articles on how to move changes from source control to the database. Here's one which describes moving database code from TFS using the sqlcompare command line:
http://www.codeproject.com/Articles/168595/Continuous-Integration-for-Database-Development
If you compare against an empty database, it will create the script you are looking for.
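Something like this, assuming SQL Compare Professional (which is the edition that supports scripts folders as a comparison source); paths, server, and database names are placeholders:

```powershell
# Compare the source-controlled scripts folder against an empty database and
# write out the full build script.
& "C:\Program Files (x86)\Red Gate\SQL Compare 10\sqlcompare.exe" `
    /Scripts1:"C:\tfs\Databases\MyDb" `
    /Server2:"(local)" /Database2:MyDb_Empty `
    /ScriptFile:"C:\Builds\Create_MyDb.sql"
```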
The only reliable way to deploy the database from scripts folders would be to use Red Gate SQL Compare. If you run the .sql files using PowerShell, the objects may not be created in the right order. Even if you run them in an order that makes sense (functions, then tables, then views...), you still may have dependency issues.
SQL Compare reads all of the scripts and uses them to construct a "virtual" database in memory, then it calculates a dependency matrix for it so when the deployment script is created, things are done in the correct order. That will prevent SQL Server from throwing dependency-related errors.
If you are using Visual Studio with the database option, it includes a Schema Compare that will allow you to compare what is in the database project in TFS to the local instance. It will create a script for you to have those objects created in the local instance. I have not tried doing this for a complete instance.
At most, you might have to create the (empty) databases in the local instance first and then let Visual Studio see that the tables and other objects are not there.
You could also just take the last backup of each database and let the developer restore them to their local instance. However, this can vary by environment, depending on security policy and what type of data is in the database.
I tend to just use PowerShell to build the scripts for me. I have more control over what is scripted out and when, so when I rerun the scripts on the local instance I can do it in the order it needs to be done in. It may take a little more time, but I get scripts that work better for me, and PS is just my preference. There are some good scripts already written in the SQL community that can help you with this. Jen McCown did a blog post of all the posts her husband has written for doing just this, right here.
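As a rough illustration of the PowerShell/SMO approach (not the scripts from that post; server name, database name, and output path are placeholders):

```powershell
# Rough SMO-based sketch: script each table out to its own file, so you
# control what gets scripted and the order in which you replay it.
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")

$server = New-Object Microsoft.SqlServer.Management.Smo.Server "(local)"
$db = $server.Databases["MyDb"]

$scripter = New-Object Microsoft.SqlServer.Management.Smo.Scripter $server
$scripter.Options.WithDependencies = $true   # walk dependencies when scripting

New-Item -ItemType Directory -Force -Path "C:\Scripts\Tables" | Out-Null
foreach ($table in $db.Tables) {
    $scripter.Script(@($table)) | Out-File "C:\Scripts\Tables\$($table.Name).sql"
}
```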
I've blogged about how to build a database from a set of .sql files using the SQL Compare command line.
http://geekswithblogs.net/SQLDev/archive/2012/04/16/how-to-build-a-database-from-source-control.aspx
The post is more from the point of view of setting up continuous integration, but the principles are the same.