Execute SQL script after SSIS build or check-in - sql-server

I am trying to find a better way to test our SSIS application, which is used to load data from a file into SQL Server and validate the file data.
I have created a SQL script which can be run to insert 'bad' data into the database tables, and ensure that our validations are performing correctly.
The SQL script:
- loads 'bad' data
- executes the SSIS validations
- ensures the errors in the data were detected
- Outputs a PASS or FAIL
- Deletes the TEST data if passed
Is there any way to have this script run automatically, for example after someone checks in some code? Should I add it as a stored procedure?
I had a look at the Default template Build definition but I couldn't see how to run this SQL script.
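For reference, a pass/fail harness like the one described above might look roughly like this. All table and procedure names here are hypothetical placeholders for whatever your validation layer actually uses:

```sql
-- Hypothetical smoke-test harness: load bad rows, run validation, check results.
BEGIN TRY
    BEGIN TRANSACTION;

    -- 1. Load deliberately bad data (hypothetical staging table).
    INSERT INTO dbo.StagingCustomer (CustomerId, Email)
    VALUES (1, 'not-an-email');

    -- 2. Run the same validation logic the SSIS package uses (hypothetical proc).
    EXEC dbo.usp_ValidateStagingData;

    -- 3. Assert that the bad row was flagged.
    IF EXISTS (SELECT 1 FROM dbo.ValidationErrors WHERE CustomerId = 1)
        PRINT 'PASS';
    ELSE
        PRINT 'FAIL';

    -- 4. Clean up the test data.
    ROLLBACK TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
    PRINT 'FAIL: ' + ERROR_MESSAGE();
END CATCH;
```

Rolling back at the end takes care of the "delete the TEST data if passed" step without a separate cleanup script.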

A build server alone is the wrong tool for this. Continuous integration has many moving parts, and a build server is really designed for compilation and for validation that does not require a running instance of your environment. There are two good practices that you can adopt:
Create a test harness that allows you to load a single package from your SSIS project and test its inputs and outputs. Essentially unit tests. I did this for a customer a few years ago and it worked well.
Use a release management tool to push out your packages, the data, and everything else that you need. Release Management for Visual Studio can do this easily.
A rule of thumb that I always go by: if I need nothing but in-memory objects, then all I need is a build server. If I need a running instance of my application, then I add release management tools to my continuous integration strategy.

It doesn't appear that you can do it as part of a check-in, based on my 2 minutes of searching.
How to modify the default Check-in Action in TFS?
http://msdn.microsoft.com/en-us/library/ms243849(VS.80).aspx
Can I modify the default Check-in Action in TFS 2012? (pretty picture)
http://www.codeproject.com/Tips/562994/Perform-a-custom-action-for-Check-in-event-in-Micr
Instead, you'll need to set up a build server to handle the action. Out of the box, MSBuild doesn't have the ability to run a SQL script, so you'll need to snag the MSBuild Extension Pack and leverage its SqlExecute task:
http://msdn.microsoft.com/en-us/library/ms181712.aspx
http://msdn.microsoft.com/en-us/library/ms181710(v=vs.90).aspx
http://msdn.microsoft.com/en-us/library/cc668755(v=vs.90).aspx
http://www.msbuildextensionpack.com/help/4.0.5.0/html/0d864b98-649a-5454-76ea-bd3069fde8bd.htm
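As a sketch (not verified against a specific Extension Pack version; the import path, server name, and connection string are placeholders), a build target that runs your test script with SqlExecute could look something like:

```xml
<Project ToolsVersion="4.0" DefaultTargets="RunTests"
         xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <!-- Assumes the MSBuild Extension Pack is installed; adjust the path to your version. -->
  <Import Project="$(MSBuildExtensionsPath)\ExtensionPack\4.0\MSBuild.ExtensionPack.tasks" />

  <Target Name="RunTests">
    <!-- Run the validation smoke-test script against the test database. -->
    <MSBuild.ExtensionPack.SqlServer.SqlExecute
        TaskAction="Execute"
        Files="Tests\ValidationSmokeTest.sql"
        ConnectionString="Data Source=TESTSERVER;Initial Catalog=StagingDb;Integrated Security=True" />
  </Target>
</Project>
```

You would then wire this target into your build definition so it runs after compilation.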

You can embed your SQL code in a SQL Server Agent job, and then run that job from your application.
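For example, assuming an Agent job named 'Validation Smoke Test' has already been created (the name is a placeholder), you can start it from any connection using msdb's sp_start_job:

```sql
-- Start an existing SQL Server Agent job by name.
EXEC msdb.dbo.sp_start_job @job_name = N'Validation Smoke Test';

-- Note: sp_start_job returns as soon as the job is queued;
-- it does not wait for the job to finish.
```

Because it only queues the job, you would check the job history (or a result table your script writes to) for the PASS/FAIL outcome.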

Related

SQL Server Schema and data comparison using SSDT

I'm trying to do data comparison and automate database upgrade script generation by using SQL Server Database tools > SchemaComparison/DataComparison.
Using VS command window, I am able to run
> Tools.NewDataComparison [args...]
> SQL.DataCompareExportToFile [file]
which produces .sql file containing required inserts/updates/deletes.
Now I would like to go a step further and automate this.
I have been looking at Microsoft.VisualStudio.Data.Tools.DataCompare, Microsoft.SqlServer.Dac and similar assemblies but haven't found any of the methods above.
Which dll exposes these?
I have also tried to run these methods with devenv.exe (which only starts VS and executes the arguments in command window) but was only successful at running Tools.NewDataComparison. Is there a way to chain or reuse the context, so I can run the SQL.DataCompareExportToFile afterwards?
I know I can use SqlPackage.exe to export dacpac and bacpac, but comparison is possible only for schema. Data comparison is not. Is there a way to use bacpac for comparison?
I would accept any solution which would allow me to compare the databases and generate the upgrade .sql script. Either in C# or any other script.
Unfortunately I cannot comment to ask for clarification. If this is a database project in Visual Studio, you can set up pre- and post-deployment scripts to manage data updates.
We break ours down into:
- Reference data: static lookup data. We kill and fill, so reference data is always clean and controlled.
- Migration data: copies data from another source, is tied to the release being run, and only executes once.
- Sandbox data: aligns data for repeatable tests by the front end. Think of unit testing. Only runs on non-production environments.
Once you have this in place, we use SqlPackage to upgrade the DAC, which includes running the pre- and post-deployment scripts.
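A sketch of the SqlPackage publish step (server, database, and file names are placeholders); pre- and post-deployment scripts baked into the dacpac run automatically as part of the publish:

```bat
REM Publish the dacpac; embedded Pre/Post deployment scripts run automatically.
SqlPackage.exe /Action:Publish ^
    /SourceFile:"MyDatabase.dacpac" ^
    /TargetServerName:"TESTSERVER" ^
    /TargetDatabaseName:"MyDatabase" ^
    /p:BlockOnPossibleDataLoss=True
```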
If you are talking about actually comparing two databases, then I only know of client tools like Visual Studio's Data Compare and Red Gate's that can do that. But you could try using a MERGE between tables, or maybe explore an SSIS solution.

Automating Database Deployment with TeamCity

We are currently using TeamCity and I am wondering if it is possible to have it handle our database process. Here is what I am trying to accomplish.
- User runs a build
- TeamCity remotes into the database server (or tells a program to via the command line)
- A SQL script is run that updates record(s)
- The mdf/ldf files are copied back to TeamCity for manipulation in the build
Alternatively it could work like this if this is easier
User logs in to the database server and runs a batch file which does the following:
- A SQL script is run that updates record(s)
- The mdf/ldf files are copied and then uploaded to the repository
- The build process is called through a webhook with a parameter
I can't seem to find anything that even gets me started. Any help getting pointed in the right direction would be appreciated.
From your description above I am going to guess you are trying to make a copy of a shared (development) database, which you then want to modify and run tests against on the CI server.
There is nothing to stop you doing what you describe with TeamCity (as it can run any arbitrary code as a build step) but it is obviously a bit clunky and it provides no specific support for what you are trying to do.
Some alternative approaches:
Consider connecting directly to your shared database, but place all your operations within a transaction so you can discard all changes. If your database offers the capability, consider database snapshots.
Deploy a completely new database on the CI server when you need one. Automate the schema deployment and populate it with test data. Use a lightweight database such as SQL Server LocalDB or SQLite.
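If you are on a SQL Server edition that supports them, the snapshot approach mentioned above looks like this (database and file names are placeholders):

```sql
-- Create a snapshot of the shared dev database before the test run.
CREATE DATABASE DevDb_Snapshot ON
    (NAME = DevDb_Data, FILENAME = 'C:\Snapshots\DevDb_Snapshot.ss')
AS SNAPSHOT OF DevDb;

-- ... run tests that modify DevDb ...

-- Revert the database to its pre-test state and drop the snapshot.
USE master;
RESTORE DATABASE DevDb FROM DATABASE_SNAPSHOT = 'DevDb_Snapshot';
DROP DATABASE DevDb_Snapshot;
```

Reverting from a snapshot is usually much faster than restoring a full backup, which makes it a good fit for repeated CI runs.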

How to map data with VS2012 SSDT project

I am trying to investigate how to work with SSDT properly. I have publishing working, but I cannot understand how to use this tool for day-to-day development.
What I mean is: I can change the code, but how do I see the actual result? I can add the database to Object Explorer and execute the statement, but how can I save my changes back to the project?
Typically the way I work with SSDT is to have a local database instance (SQL Express, Developer, or LocalDB), push changes to it to test against locally, and then, when you are ready, push the changes to the dev/test/prod server.
To actually run the code you will need to deploy it to a SQL Server instance; SSDT basically gives you what a compiler would give you, and to run the actual code you need SQL Server.
I would also investigate using tSQLt to write unit tests then you can push the project and the tests to your local instance and use that to check the data against the model.
When you are working in SSDT, the changes that you make are to a model. You then deploy these changes by doing a diff using schema compare, which diffs the model against the target and prepares an upgrade script for you.
Try the following link: https://msdn.microsoft.com/en-us/library/dd193250(v=vs.100).aspx

Executing Database Job as part of TFS Build

I have a request to perform a "smoke test" for database deployments that I am doing within TFS. I would like to be able to execute a database job after deployment in TFS that would populate data, and after that run some SQL statements that report results such as how many records were inserted.
I have looked into Unit Testing with SSDT, but was wondering if there are any other options (plus it seems you can only kick off SPROCs with that method).
Any advice would be much appreciated.
Assuming you already have an SSDT project, you're deploying the DACPAC as part of your build, and you are just wondering how to include the population of data in the deployment: what I usually do is use a post-deployment script with a whole bunch of INSERTs. I typically split it up into separate files with INSERTs for each table, then include these in the main post-deployment script using the :r syntax. To generate the insert scripts I'll usually use the scripting functionality in SSMS: script out the database, tell it in the advanced options to script data only, then copy and paste that into my post-deployment script.
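A sketch of that layout (file and table names are placeholders); the main script must have its Build Action set to PostDeploy so the publish includes it:

```sql
-- Script.PostDeploy.sql: appended automatically to the end of the publish script.
-- :r pulls in one file of INSERTs per table (SQLCMD-mode syntax).
:r .\Data\dbo.Country.data.sql
:r .\Data\dbo.Currency.data.sql
:r .\Data\dbo.OrderStatus.data.sql
```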
Personally, I would run sqlcmd via Exec tasks, but more likely I'd use this sqlcmd MSBuild task:
http://www.msbuildextensionpack.com/help/4.0.5.0/html/3b72c130-7fc9-8b8a-132c-62999e5b1183.htm
You can have INSERT scripts.
You could also incorporate some 'unit testing' methods that Andy Leonard discusses.
http://www.dotnetrocks.com/default.aspx?showNum=312
I would recommend that you do not do this as part of a build.
You want your build to be fast, and that likely means that you neither have, nor should you need, an instance of your application. You should use a release management tool to deploy your application to an environment and then run instance tests there.
You can use Release Management for Visual Studio 2013 to create a release pipeline that does this. These tools contain drag-and-drop elements for creating and updating databases.
http://nakedalm.com/building-release-pipeline-release-management-visual-studio-2013/
Have you considered tSQLt? tSQLt is open source and probably the most widely used SQL Server unit-test framework available. To use it correctly you'd insert the data within the test itself (INSERTs). This is a particularly cunning way of dealing with test set-up, as tSQLt rolls back the transaction after each test has run, bringing your test environment back to a known state.
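A minimal tSQLt test following that pattern (the schema, table, procedure, and expected values are hypothetical):

```sql
-- Create a test class (a schema) to hold related tests.
EXEC tSQLt.NewTestClass 'OrderTests';
GO
CREATE PROCEDURE OrderTests.[test discount is applied to large orders]
AS
BEGIN
    -- Isolate the table: FakeTable swaps in an empty copy with no constraints.
    EXEC tSQLt.FakeTable 'dbo.Orders';

    -- Arrange: insert the test data inside the test itself.
    INSERT INTO dbo.Orders (OrderId, Amount) VALUES (1, 1000);

    -- Act (hypothetical procedure under test).
    EXEC dbo.usp_ApplyDiscounts;

    -- Assert.
    DECLARE @actual MONEY = (SELECT Amount FROM dbo.Orders WHERE OrderId = 1);
    EXEC tSQLt.AssertEquals @Expected = 900, @Actual = @actual;
END;
GO
-- Each test runs in a transaction that tSQLt rolls back afterwards.
EXEC tSQLt.Run 'OrderTests';
```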

Automatically generate a database schema diff script

We are using TFS (service, not server) to manage the versions of our database schema. Periodically TFS generates a build and saves it in a drop folder.
Following best practices (http://martinfowler.com/articles/evodb.html), recommendations, and above all our own experience (and suffering), we want to automatically generate a daily/weekly diff script of the changes checked in.
There are a lot of tools (Red Gate, Visual Studio, open source) that help do this job, and I've tried them all. But in every case the comparison has to be run manually.
We do it as often as we can… but since it is manual, it is not as often as it needs to be ;)
Is there a way to do it automatically (unattended)? Can it be done between the deployment scripts of two builds? Or between two databases? Is it also possible to compare data automatically?
Yes, it is possible to automatically generate the difference script. We have done that in my company.
We're using the vsdbcmd command-line tool from Visual Studio to generate the deployment script from the build, and we later use that script to deploy to the test servers. We do this with Visual Studio 2010.
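From memory (the exact switches are version-specific, so treat this as a sketch with placeholder names), a vsdbcmd invocation that generates the diff script without executing it looks something like:

```bat
REM Generate (but do not execute, /dd:-) the deployment diff script
REM from the build's database model.
vsdbcmd.exe /a:Deploy /dd:- ^
    /model:"Drop\MyDatabase.dbschema" ^
    /cs:"Data Source=TESTSERVER;Integrated Security=True" ^
    /p:TargetDatabase=MyDatabase ^
    /DeploymentScriptFile:"diff.sql"
```

Running this as a scheduled build step gives you the unattended daily/weekly diff the question asks for.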
