SQL Server Schema and data comparison using SSDT - sql-server

I'm trying to run a data comparison and automate database upgrade script generation using SQL Server Data Tools > Schema Comparison / Data Comparison.
Using the VS Command Window, I am able to run
> Tools.NewDataComparison [args...]
> SQL.DataCompareExportToFile [file]
which produces a .sql file containing the required inserts/updates/deletes.
Now I would like to go a step further and automate this.
I have been looking at Microsoft.VisualStudio.Data.Tools.DataCompare, Microsoft.SqlServer.Dac, and similar assemblies, but haven't found any of the methods above.
Which DLL exposes these methods?
I have also tried to run these methods with devenv.exe (which only starts VS and executes the arguments in the Command Window), but was only successful at running Tools.NewDataComparison. Is there a way to chain the commands or reuse the context, so I can run SQL.DataCompareExportToFile afterwards?
I know I can use SqlPackage.exe to export a dacpac or bacpac, but comparison is only possible for schema; data comparison is not. Is there a way to use a bacpac for comparison?
I would accept any solution that would allow me to compare the databases and generate the upgrade .sql script, either in C# or any other script.

Unfortunately I cannot comment to ask for clarification. If this is a database project in Visual Studio, you can set up pre- and post-deployment scripts to manage data updates.
We break ours down into:
- Reference data: static lookup data. We kill and fill, so reference data is always clean and controlled.
- Migration data: copies data from another source, is tied to the release being run, and executes only once.
- Sandbox data: aligns data for repeatable tests by the front end (think unit testing). Runs only on non-prod environments.
Once you have this in place, we use SqlPackage to upgrade the DAC, which includes running the pre- and post-deployment scripts.
If you are talking about actually comparing two databases, then I only know client tools like VS Data Compare and Red Gate that can do that. But you could try using a MERGE between tables, or maybe explore an SSIS solution.
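As a sketch of the MERGE-between-tables idea, assuming a hypothetical lookup table dbo.Status that exists both in the target database and in a reachable source database SourceDb (names are illustrative only):

```sql
-- Sketch only: database, table, and column names are hypothetical.
-- Brings dbo.Status in the target in line with the copy in SourceDb:
-- updates changed rows, inserts missing rows, deletes extras.
MERGE INTO dbo.Status AS target
USING SourceDb.dbo.Status AS source
    ON target.StatusId = source.StatusId
WHEN MATCHED AND target.StatusName <> source.StatusName THEN
    UPDATE SET StatusName = source.StatusName
WHEN NOT MATCHED BY TARGET THEN
    INSERT (StatusId, StatusName) VALUES (source.StatusId, source.StatusName)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;
```

This covers one table at a time; for a whole database you would still need tooling (or generated MERGE statements per table) to handle dependencies.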

Related

Is there a way I can get a script / schema of my SQL Server Database with SSMS?

I just lost some stored procedures in a test database. So that I don't have to go through this again, is there a way I can generate a script that I could use to recreate ALL my database objects? I am using SSMS, so I am hoping there's some option that will allow me to get a script for recreating everything.
Right-click (the database in Object Explorer)
Tasks
Generate Scripts
or use a version control tool or documentation tool from someone like Red Gate
Finally, you could simply take regular backups.
As suggested, you can generate scripts using Management Studio for any views or functions, but the ideal way is to keep a repository of every object in SVN, TFS, or something similar.
We do it for our dev and test environments (and of course for prod). We do not treat non-prod environments as unimportant.

Execute SQL script after SSIS build or check-in

I am trying to find a better way to test our SSIS application, which is used to load data from a file into SQL Server and validate the file data.
I have created a SQL script which can be run to insert 'bad' data into the database tables, and ensure that our validations are performing correctly.
The SQL script:
- loads 'bad' data
- executes the SSIS validations
- ensures the errors in the data were detected
- outputs a PASS or FAIL
- deletes the test data if passed
Is there any way I can get this script to run automatically, for example after someone checks in some code? Should I add it as a stored proc?
I had a look at the Default template Build definition but I couldn't see how to run this SQL script.
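For reference, the kind of PASS/FAIL script described in the question could be sketched roughly like this (all table and column names are hypothetical; the SSIS validation package itself would be invoked separately, e.g. via dtexec):

```sql
-- Sketch only: dbo.StagingInput and dbo.ValidationErrors are hypothetical.
-- 1. Load deliberately bad data.
INSERT INTO dbo.StagingInput (Id, Amount) VALUES (1, -999);

-- 2. ... run the SSIS validation package here ...

-- 3. Check that the bad row was caught, and report PASS or FAIL.
IF EXISTS (SELECT 1 FROM dbo.ValidationErrors WHERE SourceId = 1)
    PRINT 'PASS';
ELSE
    PRINT 'FAIL';

-- 4. Clean up the test data.
DELETE FROM dbo.ValidationErrors WHERE SourceId = 1;
DELETE FROM dbo.StagingInput WHERE Id = 1;
```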
The wrong way to do this is using a build server. There are lots of parts to continuous integration, and a build server is really designed for compilation and validation that does not require an instance of your environment. There are two good practices that you can adopt:
Create a test harness that allows you to load a single package from your SSIS scripts and test the inputs and outputs; essentially unit tests. I did this for a customer a few years ago and it worked well.
Use a release management tool to push out your packages, the data, and everything else that you need. Release Management for Visual Studio can do this easily.
A rule of thumb that I always go by: if I do not need an instance of anything but in-memory objects, then all I need is a build server. However, if I need an instance of my app, then I want to add release management tools to my continuous integration strategy.
It doesn't appear that you can do it as part of a check-in, based on my 2 minutes of searching.
How to modify the default Check-in Action in TFS?
http://msdn.microsoft.com/en-us/library/ms243849(VS.80).aspx
Can I modify the default Check-in Action in TFS 2012? (pretty picture)
http://www.codeproject.com/Tips/562994/Perform-a-custom-action-for-Check-in-event-in-Micr
Instead, you'll need to set up a build server to handle the action. Out of the box, MSBuild doesn't have the ability to run a SQL script, so you'll need to snag the MSBuild Extension Pack and leverage its SqlExecute task:
http://msdn.microsoft.com/en-us/library/ms181712.aspx
http://msdn.microsoft.com/en-us/library/ms181710(v=vs.90).aspx
http://msdn.microsoft.com/en-us/library/cc668755(v=vs.90).aspx
http://www.msbuildextensionpack.com/help/4.0.5.0/html/0d864b98-649a-5454-76ea-bd3069fde8bd.htm
You can use SQL Server Agent to embed your SQL code in a job, for example, and then run that job from your application.
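A minimal sketch of that Agent approach, using the msdb job procedures (the job name, stored procedure, and database here are hypothetical):

```sql
-- Sketch: wrap the test script in an Agent job, then start it on demand.
USE msdb;

EXEC dbo.sp_add_job @job_name = N'RunSsisValidationTests';

EXEC dbo.sp_add_jobstep
    @job_name      = N'RunSsisValidationTests',
    @step_name     = N'Run test script',
    @subsystem     = N'TSQL',
    @command       = N'EXEC dbo.usp_RunValidationTests;',  -- hypothetical proc
    @database_name = N'MyTestDb';                          -- hypothetical database

-- Target the local server so the job can actually run.
EXEC dbo.sp_add_jobserver @job_name = N'RunSsisValidationTests';

-- From the application (or any hook), kick the job off:
EXEC dbo.sp_start_job @job_name = N'RunSsisValidationTests';
```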

Executing Database Job as part of TFS Build

I have a request to perform a "smoke test" for database deployments that I am doing within TFS. I would like to be able to execute a database job after deployment in TFS that would be used to populate data, and after that, run some SQL statements that report how many records were inserted, things like that.
I have looked into Unit Testing with SSDT, but was wondering if there are any other options (plus it seems you can only kick off SPROCs with that method).
Any advice would be much appreciated.
I'm assuming you already have an SSDT project, you're deploying the DACPAC as part of your build, and you are just wondering how to include the population of data as part of the deploy. What I usually do is use a post-deploy script with a whole bunch of INSERTs. I typically split these into separate files with the INSERTs for each table, then include them in the main post-deploy script using the :r syntax. To generate the insert scripts I'll usually use the functionality in SSMS to script out a database, tell it in the advanced options to script data only, then copy-paste that into my post-deploy script.
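The :r pattern described above might look like this (file, table, and column names are hypothetical; the post-deploy script runs in SQLCMD mode as part of the DACPAC publish):

```sql
-- Script.PostDeployment.sql (SQLCMD mode; paths are hypothetical)
:r .\ReferenceData\Status.sql

-- ReferenceData\Status.sql: one file per table, data scripted out via SSMS.
-- Kill and fill so the lookup table always matches source control.
DELETE FROM dbo.Status;
SET IDENTITY_INSERT dbo.Status ON;
INSERT INTO dbo.Status (StatusId, StatusName) VALUES (1, N'Active');
INSERT INTO dbo.Status (StatusId, StatusName) VALUES (2, N'Inactive');
SET IDENTITY_INSERT dbo.Status OFF;
```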
Personally, I would use either sqlcmd EXEC tasks or, more likely, this sqlcmd MSBuild task:
http://www.msbuildextensionpack.com/help/4.0.5.0/html/3b72c130-7fc9-8b8a-132c-62999e5b1183.htm
You can have INSERT scripts.
You could also incorporate some 'unit testing' methods that Andy Leonard discusses.
http://www.dotnetrocks.com/default.aspx?showNum=312
I would recommend that you do not do this as part of a build.
You want your build to be fast, and that likely means that you do not have, nor should you need, an instance of your application. You should use a release management tool to deploy your application to an environment and then run instance tests there.
You can use Release Management for Visual Studio 2013 to create a release pipeline that does this. These tools contain drag-and-drop elements for creating and updating databases.
http://nakedalm.com/building-release-pipeline-release-management-visual-studio-2013/
Have you considered tSQLt? tSQLt is open source and probably the most-used SQL Server unit test framework available. To use it correctly you'd insert the data within the test itself (INSERTs). This is a particularly cunning way of dealing with test set up, as tSQLt rolls back the transaction after the test has run, bringing your test environment back to a known state.
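A small illustration of the tSQLt pattern described above (the test class, table, and expected value are hypothetical; tSQLt rolls the transaction back after each test):

```sql
-- Hypothetical test: set-up data lives inside the test itself.
EXEC tSQLt.NewTestClass 'OrderTests';
GO
CREATE PROCEDURE OrderTests.[test amounts are summed per customer]
AS
BEGIN
    -- Isolate the table under test from real data and constraints.
    EXEC tSQLt.FakeTable 'dbo.Orders';
    INSERT INTO dbo.Orders (CustomerId, Amount) VALUES (1, 10), (1, 15);

    DECLARE @actual MONEY =
        (SELECT SUM(Amount) FROM dbo.Orders WHERE CustomerId = 1);

    EXEC tSQLt.AssertEquals @Expected = 25, @Actual = @actual;
END;
GO
EXEC tSQLt.Run 'OrderTests';
```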

Automatically generate a database schema diff script

We are using TFS (service, not server) to manage the versions of our database schema. Periodically TFS generates a build and saves it in a drop folder.
Following best practices (http://martinfowler.com/articles/evodb.html), recommendations, and above all our own experience (and suffering), we want to automatically generate a daily/weekly diff script with the changes checked in.
There are a lot of tools (Red Gate, Visual Studio, open source) that help do this job, and I've tried them all. But in every case it needs to be done manually.
We do it as often as we can… but since it is manual, it is not as often as it should be ;)
Is there a way to do it automatically (unattended)? Can it be done between the deployment scripts of two builds? Or between two databases? Is it also possible to compare data automatically?
Yes, it is possible to automatically generate the difference script. We have done that in my company.
We're using the vsdbcmd command-line tool from Visual Studio to generate the deployment script from the build, and later use that script to deploy to the test servers. We do that using Visual Studio 2010.

Recreate database from RedGate checked-in scripts

We've got a SQL Server instance with some 15-20 databases, which we check into TFS with the help of Red Gate. I'm working on a script to replicate the instance (so a developer could run a local instance when needed, for example) with the help of these scripts. What I'm worried about is the dependencies between these scripts.
In TFS, RedGate has created these folders with .sql files for each database:
Functions
Security
Stored Procedures
Tables
Triggers
Types
Views
I did a quick test with PowerShell, just looping over these folders to execute the SQL, but I think that might not always work. Is there a strict ordering I can follow? Or is there some simpler way to do this? To clarify, I want to be able to start with a completely empty SQL Server instance and end up with a fully configured one according to what is in TFS (without data, but that is OK). Using PowerShell is not a requirement, so if it is simpler to do it some other way, that is preferable.
If you're already using Red Gate, they have a ton of articles on how to move changes from source control to the database. Here's one which describes moving database code from TFS using the SQL Compare command line:
http://www.codeproject.com/Articles/168595/Continuous-Integration-for-Database-Development
If you compare against an empty database, it will create the script you are looking for.
The only reliable way to deploy the database from scripts folders would be to use Red Gate SQL Compare. If you run the .sql files using PowerShell, the objects may not be created in the right order. Even if you run them in an order that makes sense (functions, then tables, then views...), you still may have dependency issues.
SQL Compare reads all of the scripts and uses them to construct a "virtual" database in memory, then it calculates a dependency matrix for it so when the deployment script is created, things are done in the correct order. That will prevent SQL Server from throwing dependency-related errors.
If you are using Visual Studio with the database project option, it includes a Schema Compare that will allow you to compare what is in the database project in TFS to the local instance. It will create a script for you to have those objects created in the local instance. I have not tried doing this for a complete instance.
You might, at most, have to create the databases in the local instance and then let Visual Studio see that the tables and other objects are not there.
You could also just take the last backup of each database and let the developer restore them to their local instance. However, this can vary in each environment depending on security policy and what type of data is in the database.
I tend to just use PowerShell to build the scripts for me. I have more control over what is scripted out and when, so when I rerun the scripts on the local instance I can do it in the order it needs to be done. It may take a little more time, but I get better-functioning scripts to work with, and PS is just my preference. There are some good scripts already written in the SQL community that can help you with this: Jen McCown did a blog post collecting all the posts her husband has written for doing just this, right here.
I've blogged about how to build a database from a set of .sql files using the SQL Compare command line.
http://geekswithblogs.net/SQLDev/archive/2012/04/16/how-to-build-a-database-from-source-control.aspx
The post is more from the point of view of setting up continuous integration, but the principles are the same.
