I am trying to work out how to use SSDT properly. I have publishing working, but I cannot understand how to use the tool for day-to-day development.
What I mean is that I can change the code, but how do I see the actual result? I can add the database to Object Explorer and execute statements there, but how do I save those changes back to the project afterwards?
Typically the way I work with SSDT is to have a local database instance (SQL Express, Developer or LocalDB), push my changes to that locally to test against, and then, when ready, push them on to the dev/test/prod server.
To actually run the code you will need to deploy it to a SQL Server instance; SSDT basically gives you what a compiler would give you - to run the actual code you need SQL Server.
I would also investigate using tSQLt to write unit tests; then you can push the project and the tests to your local instance and use them to check the code and data against the model.
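To illustrate, a minimal tSQLt test might look something like this (the dbo.Customer table and dbo.ActiveCustomers view are made-up names, not anything from the question):

    -- Assumes tSQLt is installed in the local dev database.
    -- dbo.Customer and dbo.ActiveCustomers are hypothetical objects.
    EXEC tSQLt.NewTestClass 'CustomerTests';
    GO
    CREATE PROCEDURE CustomerTests.[test ActiveCustomers returns only active rows]
    AS
    BEGIN
        -- Replace the real table with an empty fake so the test controls its contents
        EXEC tSQLt.FakeTable 'dbo.Customer';
        INSERT INTO dbo.Customer (CustomerId, IsActive) VALUES (1, 1), (2, 0);

        SELECT CustomerId INTO #Expected FROM (VALUES (1)) AS e(CustomerId);
        SELECT CustomerId INTO #Actual FROM dbo.ActiveCustomers;

        EXEC tSQLt.AssertEqualsTable '#Expected', '#Actual';
    END;
    GO
    EXEC tSQLt.Run 'CustomerTests';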
When you are working in SSDT, the changes that you make are to the model. You then need to deploy those changes by doing a diff using the schema compare. The schema compare does a diff between the model and the target and prepares an upgrade script for you.
Try the following link: https://msdn.microsoft.com/en-us/library/dd193250(v=vs.100).aspx
I'm trying to do data comparison and automate database upgrade script generation using SQL Server Data Tools > Schema Comparison / Data Comparison.
Using the VS command window, I am able to run
> Tools.NewDataComparison [args...]
> SQL.DataCompareExportToFile [file]
which produces a .sql file containing the required inserts/updates/deletes.
Now I would like to go a step further and automate this.
I have been looking at Microsoft.VisualStudio.Data.Tools.DataCompare, Microsoft.SqlServer.Dac and similar assemblies but haven't found any of the methods above.
Which DLL exposes these methods?
I have also tried to run these methods with devenv.exe (which only starts VS and executes the arguments in the command window), but I was only successful at running Tools.NewDataComparison. Is there a way to chain the commands or reuse the context, so I can run SQL.DataCompareExportToFile afterwards?
I know I can use SqlPackage.exe to export a dacpac or bacpac, but comparison is possible only for the schema; data comparison is not. Is there a way to use a bacpac for comparison?
I would accept any solution which would allow me to compare the databases and generate the upgrade .sql script. Either in C# or any other script.
Unfortunately I cannot comment to ask for clarification. If this is a DB project in Visual Studio, you can set up Pre- and Post-Deployment scripts to manage data updates.
We break ours down into:
Reference Data - static lookup data; we kill and fill so reference data is always clean and controlled (a sketch follows this list).
Migration Data - copies data from another source, is tied to the release being run, and only executes once.
Sandbox Data - aligns data for repeatable front-end tests (think unit testing); only runs on non-Prod environments.
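As a rough sketch, a kill-and-fill block for one reference table in a post-deployment script might look like this (dbo.OrderStatus and its columns are invented for illustration):

    -- Post-Deployment script fragment: kill and fill a reference table.
    -- dbo.OrderStatus is a hypothetical lookup table.
    DELETE FROM dbo.OrderStatus;

    INSERT INTO dbo.OrderStatus (StatusId, StatusName)
    VALUES (1, N'Open'),
           (2, N'Shipped'),
           (3, N'Closed');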
Once this is in place, we use SqlPackage to upgrade the DAC, which runs the Pre- and Post-Deployment scripts as part of the publish.
If you are talking about actually comparing two databases, then the only things I know of are client tools like VS Data Compare and Red Gate. But you could try using MERGE between tables (see the sketch below) or maybe explore an SSIS solution.
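For the MERGE route, a hand-rolled one-way sync of a single table between two databases on the same server could look roughly like this (all names are placeholders):

    -- Make TargetDb.dbo.Product match SourceDb.dbo.Product (names are placeholders).
    MERGE TargetDb.dbo.Product AS t
    USING SourceDb.dbo.Product AS s
        ON t.ProductId = s.ProductId
    WHEN MATCHED AND (t.Name <> s.Name OR t.Price <> s.Price) THEN
        UPDATE SET t.Name = s.Name, t.Price = s.Price
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (ProductId, Name, Price) VALUES (s.ProductId, s.Name, s.Price)
    WHEN NOT MATCHED BY SOURCE THEN
        DELETE;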
I am trying to find a better way to test our SSIS application, which is used to load data from a file into SQL Server and validate the file data.
I have created a SQL script which can be run to insert 'bad' data into the database tables, and ensure that our validations are performing correctly.
The SQL script:
- loads 'bad' data
- executes the SSIS validations
- ensures the errors in the data were detected
- outputs a PASS or FAIL
- deletes the test data if it passed
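The PASS/FAIL check at the end is along these lines (a trimmed-down sketch; the tables and batch marker are placeholders, and the step that runs the SSIS validation is omitted):

    -- Assumes the 'bad' rows were loaded with a known marker value
    -- and the SSIS validation has already run.
    DECLARE @TestBatchId INT = -999;  -- marker used by the inserted test rows

    IF EXISTS (SELECT 1 FROM dbo.ImportError WHERE BatchId = @TestBatchId)
    BEGIN
        PRINT 'PASS';
        -- Clean up the test data on success
        DELETE FROM dbo.ImportError   WHERE BatchId = @TestBatchId;
        DELETE FROM dbo.StagingImport WHERE BatchId = @TestBatchId;
    END
    ELSE
        PRINT 'FAIL';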
Is there any way I can get this script to run automatically, for example after someone checks in some code? Should I add it as a stored procedure?
I had a look at the Default template Build definition but I couldn't see how to run this SQL script.
The wrong way to do this is with a build server. There are lots of parts to continuous integration, and a build server is really designed for compilation and validation that does not require an instance of your environment. There are two good practices that you can adopt:
Create a test harness that allows you to load a single package from your SSIS project and test the inputs and outputs - essentially unit tests. I did this for a customer a few years ago and it worked well.
Use a release management tool to push out your packages, the data, and everything else that you need. Release Management for Visual Studio can do this easily.
A rule of thumb that I always go by: if I do not need an instance of anything but in-memory objects, then all I need is a build server. However, if I need an instance of my app, then I want to add release management tools to my continuous integration strategy.
It doesn't appear that you can do it as part of a check-in, based on my 2 minutes of searching.
How to modify the default Check-in Action in TFS?
http://msdn.microsoft.com/en-us/library/ms243849(VS.80).aspx
Can I modify the default Check-in Action in TFS 2012? (pretty picture)
http://www.codeproject.com/Tips/562994/Perform-a-custom-action-for-Check-in-event-in-Micr
Instead, you'll need to set up a build server to handle the action. Out of the box, MSBuild doesn't have the ability to run a SQL script, so you'll need to snag the MSBuild Extension Pack and leverage its SqlExecute task:
http://msdn.microsoft.com/en-us/library/ms181712.aspx
http://msdn.microsoft.com/en-us/library/ms181710(v=vs.90).aspx
http://msdn.microsoft.com/en-us/library/cc668755(v=vs.90).aspx
http://www.msbuildextensionpack.com/help/4.0.5.0/html/0d864b98-649a-5454-76ea-bd3069fde8bd.htm
You can also create a SQL Server Agent job that contains your SQL code, and then run that job from your application.
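For example, creating and starting such a job in T-SQL might look roughly like this (the job name, database and procedure are made up for illustration):

    USE msdb;
    GO
    -- Create a job with a single T-SQL step (all names are hypothetical).
    EXEC dbo.sp_add_job @job_name = N'Run validation tests';
    EXEC dbo.sp_add_jobstep
        @job_name      = N'Run validation tests',
        @step_name     = N'Execute test script',
        @subsystem     = N'TSQL',
        @database_name = N'MyTestDb',
        @command       = N'EXEC dbo.usp_RunValidationTests;';
    EXEC dbo.sp_add_jobserver @job_name = N'Run validation tests';
    GO
    -- Later, from your application or another script:
    EXEC msdb.dbo.sp_start_job @job_name = N'Run validation tests';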
We originally dismissed using database projects in conjunction with TFS as the solution for our deployment and source control needs. However, in the interest of thoroughness, I'm exploring and prototyping it.
I've set up my database project (with add to source control checked). I've checked in the changes. Now, where do you develop from?
I've tried ...
connecting to the remote development server to make changes
syncing schema to (localdb)\Projects and making changes there
directly in the Source Control Explorer
With options 1 and 2 I don't see an automated way to add code to source control. Am I supposed to be working in the Source Control Explorer? (This seems a little silly.) Is there a way to commit the entire solution to source control? My apologies in advance; I'm a database developer and this concept of a "solution" is very foreign to me.
Also, there was a lot of chatter about Visual Studio doing a lot of ugly things in the background that turned a lot of development shops off of database projects. Can someone share your experiences with me - some of the pitfalls and gotchas?
And yes, we have looked at Redgate SourceControl (very nice tool).
Generally people do one of two things:
Develop in Visual Studio, via the Solution Explorer. Just open the project like you would any other project, add tables, indexes, etc. You even get the same GUI for editing DB objects as you get in SSMS. All changes will automatically be added to TFS Pending changes (just like any other code change), and can be checked in when you're ready.
Deploy the latest DB (using Publish in VS) to any SQL Server, make your changes in SSMS, then do a Schema Compare in Visual Studio to bring your changes back into your DB project so they can be checked into TFS.
I've been using DB projects for many years and I LOVE them! Every developer I've introduced them to refuses to develop without them from that point on.
I'm going to explain briefly how we use DB projects with TFS.
We basically have one DB already built, and if we need any changes or new tables we create or alter them directly in SQL Server (each developer has their own dev SQL Server instance).
Then, in VS, we drag the tables we want from the SQL Server Object Explorer into the DB project, so when we check in the changes every user in TFS can get them and then publish the project, which generates and executes an upgrade script against the DB.
This is how we develop when we need to add specific tables or records to the DB, so we don't have to send emails with scripts or keep them in a specific location (even one under source control). This way we can get the latest version of the project and publish it to ensure we have the latest DB version, although it does require the user who made the changes to add them to the DB project.
Another way is to make all the changes directly in the DB project (which works without any problem) and then publish it. That is arguably the more correct approach, since every change is made directly in a source-controlled project, but as you know, it is often more comfortable to work directly in SSMS.
Hope this helps somehow.
We use the SSDT tools and have implemented the SQL Server Database Project Type to develop our databases:
http://www.techrepublic.com/blog/data-center/auto-deploy-and-version-your-sql-server-database-with-ssdt/
The definitions of database objects and peripheral SQL code (e.g. functions, sprocs, triggers) sit within the Visual Studio project, and all changes are managed through VS. The interface is very similar to SSMS and, at this point, doesn't cause any issues.
The benefits of this approach for us are as follows:
An existing SQL database can be imported into the SQL Server Project and managed through Visual Studio.
SQL object definitions & code can be managed through the same version control system as the rest of the application code.
SQL Code can be checked for errors within Visual Studio in much the same way as you'd check your C# / VB for compilation / reference errors.
You can compare database schemas (within Visual Studio) between environments and easily identify key changes that you need to be aware of.
The SQL project can be compiled into a DACPAC file for automating deployment to different servers using a CI / Build Server (using the sqlpackage.exe utility without any custom scripts or code).
In essence, developers have a local copy of the database to work on, manage any changes through VS, and publish those changes to their local database. Once the changes are complete, they are committed to your version control system and then built centrally and automatically through a CI / build server to ensure that all changes integrate and play nicely, in much the same way as your other code.
Hope that helps :)
I have been doing some research on the correct procedure to follow when working with both a development and a live production database. The best article I have found is this one: Strategies for Database Development and Deployment, but I cannot accept the idea that I have to manually maintain a Word document listing every change I make to the DB. That seems ridiculous to me...
I use SQL Server Management Studio to manage my SQL databases both in dev and in prod. Is there a way to deploy the latest changes to production WITHOUT destroying tables and data? Can someone please point me to a good procedural article on how this is done in SSMS?
Thanks
It is irresponsible to make changes to a database design without creating change scripts that are put into source control.
However, if you are already in this circumstance, I suggest buying Red Gate's SQL Compare. It will look at the two databases and script the differences. You still can't run this willy-nilly, though - sometimes you have made changes to the dev database that are not yet part of the current version being pushed to prod, and SQL Compare has no way to know this. It is far better to create the scripts as you go (using ALTER TABLE when the table already exists, so as not to disrupt existing data) and keep them in source control with the rest of the code that you will be pushing at the same time.
The only right way to do it in production - with or without Management Studio - is to prepare, check, test and run the scripts manually.
WITH A FRESH BACKUP!
A common strategy is to keep an ordered set of change scripts, e.g. prefixed with date or database version, which can easily be tested on the development database by starting with a fresh backup from production. The change scripts can often be generated from SQL Server Management Studio when making changes, or could be crafted manually in case of more complex changes.
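For example, a single script in such an ordered set might look like this (the table, column and version-tracking names are only illustrative):

    -- 2015-06-01_001_add_email_to_customer.sql (illustrative file name)
    IF NOT EXISTS (SELECT 1 FROM sys.columns
                   WHERE object_id = OBJECT_ID(N'dbo.Customer')
                     AND name = N'Email')
    BEGIN
        ALTER TABLE dbo.Customer ADD Email NVARCHAR(256) NULL;
    END;
    GO
    -- Record that the script has been applied (dbo.SchemaVersion is hypothetical)
    INSERT INTO dbo.SchemaVersion (ScriptName, AppliedAt)
    VALUES (N'2015-06-01_001_add_email_to_customer', SYSUTCDATETIME());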
Another suggestion would be to version control the database definitions (tables, procedures etc). This can be easily achieved by using SQL Server Management Studio to generate create scripts for all objects after each update. This way you can easily compare changes made over time, or between different environments.
We're trying out VS2010 database projects for a new development, using the following dev cycle:
Use Management Studio to develop changes on a local DB instance (using the designers etc)
Use VS2010 schema compare to sync / import these changes to the VSDB project
Check in the VSDB project and run automated build / test etc
When I want to 'get latest' from source control, I then:
Update the VSDB project files from source control
Use Schema Compare to push the changes from the project to my local database instance
This is where it starts to break down... Because schema compare is trying to synchronise the two versions, it attempts to undo any changes I've made to my local database as part of my own feature development.
Obviously, you can tell schema compare to skip changes to the objects I've modified, but sadly this doesn't always work correctly: http://connect.microsoft.com/VisualStudio/feedback/details/564026/strange-schema-compare-behavior-sql-2008-database-projects.
Fundamentally, the problem exists because the definitions in the VSDB project are not automatically synchronised with my local database; thus I need to use Schema Compare to do a 'poor man's merge' every time I get a change.
One possible solution could be to:
Use Schema Compare to sync any changes from my local DB to the VSDB project first
Update the VSDB project from source control (therefore using the source control tooling to do the merge, rather than Schema Compare)
Schema Compare the changes from source control into my local DB instance
...which is far from ideal.
Is RedGate SQL Source Control better in this regard?
What about the new 'Juneau' SQL toolset?
You use 'Deploy' to push source changes to the database - either Deploy Solution from the top-row Build menu, or right-click on the project in the Solution Explorer and select Deploy.
Deploy is configurable in the Project properties.
HTH
Your process is backwards, which is why this is difficult. Changes should flow from VSDB to your database, not the other way around. Try this:
Use the designers in Management Studio if you like them, but script out any changes you make and add them into your VSDB project.
Instead of using Schema Compare, use the built-in Deploy functionality. This will automatically script and deploy the incremental changes to your local database in a single click.
Since you mention other possible solutions, I'll elaborate on how our shop manages data structure changes and their propagation to dev DBs.
For tracking and applying differences, we've written a C# app that effectively abstracts database actions out to classes that we append to an Action list. The engine dynamically loads modules that represent database versions, and adds each item in the module to a list of actions to be performed for that version upgrade, then processes the list. Actions include DataRowInsertAction, TableCreateAction, ColumnModifyAction, etc.
One benefit of this approach is that we can commit standard .cs files to Subversion, and users can bring their own dev databases up to date simply by checking out the latest version and running it. Another huge advantage is that we can target multiple database engines, since the Actions themselves know what SQL to render based on which engine is being targeted.
As a side note, we use AdeptSQL to compare databases, and love it. It'll create a complete list of differences, and you can generate a script to go either direction (given Database 'A' and Database 'B', upgrade A to B, or downgrade B to A.)
For a small additional charge, they offer extended functionality to perform a data diff as well.
http://www.adeptsql.com/
The idea behind SQL Source Control is basically to turn the development process on its head - instead of working with database scripts and pushing the changes to a database, you make the changes in the database and SQL Source Control calculates the deltas, updates the local scripts, and allows you to commit the changes to your source control system.
SQL Source Control currently only integrates with SQL Server Management Studio, but there is now a VS package called SQL Connect that you can use in VS 2010 to work in much the same way as in SQL Source Control. http://www.red-gate.com/products/sql-development/sql-connect/index-2