How to distribute SQL database with application? - sql-server

I'm developing a WPF desktop application, and the database currently resides in SQL Server Express 2012. The app is undergoing acceptance testing, and periodically I have to backup my database and restore it on the tester's PC. This is setting alarm bells ringing that distribution is going to be a pain when we come to start working with customers - I don't like the idea of them having to use SSMS.
I wondered if there was any way to distribute the mdf file and have my application connect to that "directly"? I've seen examples of connection strings that specify an MDF file/path, and wondered if it was really that simple or does it still need attaching in SSMS first? (I'm guessing so, otherwise how would you configure things like users and roles?).
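For reference, the connection strings I've seen look something like this (the path and instance name here are just examples, not something I've tested):

    Data Source=(LocalDB)\v11.0;AttachDbFilename=C:\MyApp\Data\MyApp.mdf;Integrated Security=True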

Most applications I've encountered run SQL CREATE scripts during the installation process. I'd prefer that way, because when you have to change your schema at some point in the future (and you will), you will need a similar mechanism: you can't simply swap the old MDF (containing your users' data) for an empty, updated one.
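A minimal sketch of such an install/upgrade script, written with guards so it can be re-run safely (all object names here are illustrative):

    -- Create the table only if it does not already exist,
    -- so the script can be re-run during upgrades.
    IF OBJECT_ID(N'dbo.Customers', N'U') IS NULL
    BEGIN
        CREATE TABLE dbo.Customers
        (
            CustomerId INT IDENTITY(1,1) PRIMARY KEY,
            Name       NVARCHAR(200) NOT NULL
        );
    END;

    -- Later schema changes follow the same guarded pattern.
    IF COL_LENGTH(N'dbo.Customers', N'Email') IS NULL
    BEGIN
        ALTER TABLE dbo.Customers ADD Email NVARCHAR(320) NULL;
    END;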

Related

Sharing database project in TFS

I created a database project as part of my solution with scripts for my tables. I'm using database first, so all I do is run the project to build/deploy my tables to the database.
I'm working with a few others so I checked the SQL project into TFS.
That way, the other developers can get the solution, run the SQL project, and generate the local database for themselves.
The problem is that it might generate the database under a different local instance. For instance, on my home computer, it generated it under (localdb)\Projects, but on my laptop, under (localdb)\ProjectsV12.
This breaks the connection strings (which of course can be fixed). But this leaves me wondering, is there a better way to develop the SQL project collaboratively?
If you have the SQL code under source control then everyone can open that solution to edit/create copies of the database. Ideally you have an automated process but that does not work for local dev.
Since sharing code directly is a bad idea, and you have said that the database is used by more than one solution, I would consider packaging and distributing it.
If you create a SSDT database project you can compile your database into a package that can upgrade any instance. You can then share that .dacpac output easily.
You might even want to share it with NuGet so that each dependency is automatically updated.
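As a rough sketch, publishing the compiled .dacpac to any instance looks something like this (file, server, and database names are placeholders):

    SqlPackage.exe /Action:Publish ^
        /SourceFile:MyDatabase.dacpac ^
        /TargetServerName:(localdb)\MSSQLLocalDB ^
        /TargetDatabaseName:MyDatabase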
You can set up a SQL Server alias to standardize the connection string across developer machines, for example aliasing .\SQLEXPRESS to (LocalDB)\MSSQLLocalDB.

Update data in localDB

We are using Visual Studio 2012 and have a database project that deploys to an instance of localDB on each developer's machine. All development work uses the localDB instance until we are ready to check in code and publish to our development SQL Server. My problem is that after a certain amount of time, the data/primary key values in each person's localDB become outdated. I would like to periodically "sync" everyone's instance of localDB, so that we are all using the same data for various lookup/dependency tables/userIds/etc.
Is there an easy way to include a data script in the database project that would push data to each instance automatically, or perhaps to grab a backup of our main dev DB and use it to copy over each developer's localDB? I just need a way to periodically update everyone to make sure we are all working off of the same data.
There's no easy way to write these scripts, but I highly recommend that you write post-deploy scripts and include them in your project so when the project is pushed, everyone gets those changes. More importantly, they should be there so the changes will be released when you publish your project. You'll need to write them in such a manner that they can be repeated (if not exists ... insert type commands).
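A sketch of the repeatable pattern (the table and values are illustrative):

    -- Post-deploy seed data: safe to run on every publish.
    IF NOT EXISTS (SELECT 1 FROM dbo.OrderStatus WHERE StatusId = 1)
        INSERT INTO dbo.OrderStatus (StatusId, Name) VALUES (1, N'Pending');

    IF NOT EXISTS (SELECT 1 FROM dbo.OrderStatus WHERE StatusId = 2)
        INSERT INTO dbo.OrderStatus (StatusId, Name) VALUES (2, N'Shipped');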
You can definitely backup and restore the shared database regularly as long as you know you've got the local changes checked in. Nothing is worse than losing a bunch of work because you've restored over your database.
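A minimal backup/restore sketch (paths and logical file names are placeholders):

    -- On the shared dev server:
    BACKUP DATABASE DevDb TO DISK = N'C:\Backups\DevDb.bak' WITH INIT;

    -- On a developer's localDB instance (overwrites local data):
    RESTORE DATABASE DevDb FROM DISK = N'C:\Backups\DevDb.bak'
        WITH REPLACE,
        MOVE N'DevDb' TO N'C:\Data\DevDb.mdf',
        MOVE N'DevDb_log' TO N'C:\Data\DevDb_log.ldf';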
You can use the "Data Compare" option under the SQL menu as well, but you want to understand what you're doing before just running it or you'll likely experience data loss.

Proper structure of asp.net website and database in visual studio

My main problem is: where does the database go?
The project will be in SVN and is developed using the ASP.NET MVC repository pattern. Where do I put the SQL Server database (MDF file)? If I put it in App_Data, then my teammates can check out the source and database together and run it with the database deployed in the Visual Studio instance.
The problems with this method are:
I cannot use SQL Management Studio with this database.
Most web hosts require me to deploy the database using their UI or SQL Management Studio, so putting it in App_Data makes no sense.
The connection string has to be edited each time I move from testing locally to testing on the web host.
If I create the database using SQL Management Studio, my problems are:
How do I keep it consistent with source control (teammates have to re-script the DB if the schema changes)?
The connection string again (I'd like to automatically use the right string when on the production server).
Is there a solution to all my problems above? Maybe some pattern or tool that I am missing?
Basically your two points are correct - unless you're working off a central database, everyone will have to update their database when changes are made by someone else. If you're working off a central database you can also run into situations where a database change is made (i.e. a column dropped) and the corresponding source code isn't checked in. Then you're all dead in the water until the source code is checked in, or the database is rolled back. Using a central database also means developers have no control over when database schema changes are pushed to them.
We have the database installed on each developer's machine (especially good since we target different DBs; each developer has one of the supported databases, giving us really good cross-platform testing as we go).
Then there is the central 'development' database, which the 'development' environment points to. It is built by continuous integration on each check-in, and upon a successful build/test it is published to development.
Changes that developers make to the database schema on their local machine need to be checked into source control. They are database upgrade scripts that make the required changes to the database from version X to version Y. The database is versioned. When a customer upgrades, these database scripts are run on their database to bring it up from their current version to the required version they're installing.
These dbpatch files are stored in the following structure:
    ./dbpatches
        ./23
            ./common
                ./CONV-2345.dbpatch
            ./pgsql
                ./CONV-2323.dbpatch
            ./oracle
                ./CONV-2323.dbpatch
            ./mssql
                ./CONV-2323.dbpatch
In the above tree, version 23 has one common dbpatch that is run on any database (it is ANSI SQL), and a specific dbpatch for each of the three databases that require vendor-specific SQL.
We have a database update script that developers can run which runs any dbpatch that hasn't been run on their development machine yet (irrespective of version - since multiple dbpatches may be committed to source control during a single version's development).
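A sketch of the tracking mechanism such an update script can rely on (the table and patch names here are illustrative, not our actual implementation):

    -- Record which dbpatches have already been applied.
    IF OBJECT_ID(N'dbo.AppliedDbPatch', N'U') IS NULL
        CREATE TABLE dbo.AppliedDbPatch
        (
            PatchName NVARCHAR(100) NOT NULL PRIMARY KEY,
            AppliedAt DATETIME NOT NULL DEFAULT GETDATE()
        );

    -- For each patch file, run it only if it has not been recorded yet.
    IF NOT EXISTS (SELECT 1 FROM dbo.AppliedDbPatch WHERE PatchName = N'CONV-2323')
    BEGIN
        -- ... contents of CONV-2323.dbpatch go here ...
        INSERT INTO dbo.AppliedDbPatch (PatchName) VALUES (N'CONV-2323');
    END;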
Connection strings are maintained in NHibernate.config; if present, NHibernate.User.config is used instead, and NHibernate.User.config is excluded from source control. Each developer has their own NHibernate.User.config, which points to their local database and sets the appropriate dialects etc.
When being pushed to development we have a NAnt script which does variable substitution in the config templates for us. This same script is used when going to staging as well as when doing packages for release. The NAnt script populates a templates config file with variable values from the environment's settings file.
Use Management Studio or Visual Studio's Server Explorer. App_Data isn't used much "in the real world".
This is always a problem. Use a tool like SQL Compare from Red Gate or the built-in Database Compare tools of Visual Studio 2010.
Use Web.Config transformations to automatically update the connection string.
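For example, a Web.Release.config transform along these lines swaps in the production connection string (the connection name and server here are placeholders):

    <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
      <connectionStrings>
        <!-- "DefaultConnection" and the server name are illustrative. -->
        <add name="DefaultConnection"
             connectionString="Server=ProdServer;Database=MyDb;Integrated Security=True"
             xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
      </connectionStrings>
    </configuration>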
I'm not an expert by any means but here's what my partner and I did for our most recent ASP.NET MVC project:
Connection strings were always the same since we were both running SQL Server Express on our development machines, as were our staging and production servers. You can just use a dot instead of the computer name (e.g. ".\SQLEXPRESS" or ".\SQL_Named_Instance").
Alternatively you could also use web.config transformations for deploying to different machines.
As far as the database itself, we just created a "Database Updates" folder in the SVN repository and added new SQL scripts when updates needed to be made. I always thought it was a good idea to have an organized collection of database change scripts anyway.
A common solution to this type of problem is to have the database versioning handled in code rather than storing the database itself in version control. The code is typically executed on app_start, but it could be triggered in other ways (build/deploy process). Then developers can run their own local databases or use a shared development database. The common term for this is database migrations (migrating from one version to the next). Here is a Stack Overflow question for .NET tools/libraries to make this easier: https://stackoverflow.com/questions/8033/database-migration-library-for-net
This is the only way I would handle this on projects with multiple developers. I've used this successfully with teams of over 50 developers and it's worked great.
The Red Gate solution would be to use SQL Source Control, which integrates into SSMS. It maintains a SQL scripts folder structure in source control, which you can keep in the same folder/repository that you keep your app code in.
http://www.red-gate.com/products/SQL_Source_Control/

Transferring changes from a dev DB to a production DB

Say I have a website and its database hosted locally on my computer (for development), and another database hosted elsewhere (for production); i.e. first I make the changes on the dev DB, and then I apply the changes to the prod DB.
What is the best way to transfer the changes that I did on the local database to the hosted database?
If it matters, I am using MS Sql Server (2008)
The correct way to do this with Visual Studio and SQL Server is to add a Database Project to the web app solution. The database project should have SQL files that can recreate the entire database on a new server, along with all the necessary tables, procedures, users, and roles.
That way, they are included in the source control for all the rest of the code as well.
There is a Changes sub-folder in the Database Project where I put the SQL files that apply any new alterations or additions to the database for subsequent versions.
The SQL in the files should be written with proper "if exists" blocks such that it can be run safely multiple times on an already updated database without error.
As a rule, you should never make your changes directly in the database - instead modify the SQL script in the project and apply it to the database to make sure your source code (the SQL files) is always up to date.
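A sketch of the guarded pattern those change files use (object names are illustrative):

    -- Safe to run repeatedly: only adds the column if it is missing.
    IF NOT EXISTS (SELECT 1 FROM sys.columns
                   WHERE object_id = OBJECT_ID(N'dbo.Orders')
                     AND name = N'TrackingNumber')
    BEGIN
        ALTER TABLE dbo.Orders ADD TrackingNumber NVARCHAR(50) NULL;
    END;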
We do this in the (Ruby on) Rails world by writing "migrations," which capture the changes you make to the DB structure at each point. These are run with a migration tool (a task for rake), which also writes to a DB table so it knows whether a particular migration has been run or not.
You could build a structure like this for your dev platform (.NET?), but I think that in other answers to this question people will suggest available tools for handling database versioning on your development platform, or perhaps for your specific DB.
I don't know any of these, but check out this list. I see a lot of paid tools out there, but there must be something free. Also check this out.
I migrate changes via change scripts written by developers once they have tested and verified their changes. (The exception being moving large data.) All scripts are stored in a source control system and can be verified by DBAs.
It is a manual, sometimes time-consuming, but effective, safe, and controlled process.
Databases are too vital to copy from dev.
There are tools to help create/verify these scripts.
See http://www.red-gate.com/
I have used their tools to compare 2 databases to create scripts.
Brian
If the changes are small, I sometimes make them by hand. For larger changes, I use Red Gate's SQL Compare to generate change scripts. These are hand-verified and run in the QA environment first to make sure they don't break anything. For large changes, we run a special backup prior to making the change both in QA and in production.
We used to use the approach provided by Ron. It makes sense for a big project with a dedicated team of DBAs. But if you do not have dedicated developers who write code only for the DB, this approach is expensive in time and resources.
The approach of using Red Gate DB compare is also not ideal: you still have to do a lot of manual work, and you can skip a step by mistake.
Something better is needed, which is why we built the "Agile DB Recreation/Import/Reverse/Export" tool.
The tool is free.
Advantages: your developers can use any preferred tools to develop the dev DB. Then they run DB RIRE, which reverse-engineers the DB (tables, views, stored procedures, etc.) and exports the data into XML files. The XML files can be kept in any code repository system.
The second step is to run DB RIRE again to generate difference scripts between the structure and data in the XML files and those in the production DB.
Of course you can make as many iterations as you need.

Script for pushing database change to servers

This is probably the most classic database problem.
I have an E-commerce software solution hosted on a SQL server for data, and a web server for the frontend. Every instance/customer has its own database on SQL Server 2008.
During development of the next version, I might change or add tables, views, stored procedures etc.
How do I publish these changes to all the databases without losing data? It should be done via a script or something similar. Centralized management is the key...
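Roughly the kind of script I imagine, iterating over every customer database on the instance (the naming filter and the change itself are placeholders I made up):

    -- Run a guarded change against every customer database on this instance.
    DECLARE @db SYSNAME, @sql NVARCHAR(MAX);
    DECLARE dbs CURSOR FOR
        SELECT name FROM sys.databases
        WHERE name LIKE N'Customer_%';   -- placeholder naming convention

    OPEN dbs;
    FETCH NEXT FROM dbs INTO @db;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @sql = N'USE ' + QUOTENAME(@db) + N';
            IF COL_LENGTH(N''dbo.Products'', N''Sku'') IS NULL
                ALTER TABLE dbo.Products ADD Sku NVARCHAR(50) NULL;';
        EXEC sys.sp_executesql @sql;
        FETCH NEXT FROM dbs INTO @db;
    END
    CLOSE dbs;
    DEALLOCATE dbs;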
Perhaps it's something you've already considered, but my company uses software from Red Gate (http://www.red-gate.com/) which compares our development version of the DB and the production one, generates and executes scripts to bring production on par with development.
(I'm not a sales person from Red Gate, but I think this might be what you're looking for)
I use SQL Compare for schema changes and SQL Data Compare for data changes. Works like a charm!
This problem is essentially one of automating the manual process of logging on to a SQL Server, and running a script against one or more databases, that does the modifications you need.
It's made worse, of course, if the instances of SQL Server that you need to update are remote from you, and therefore not directly accessible.
It's also vital to ensure that the scripts are applied in sequence - there would be no point running the "add index" script before the "create table" script.
The way we've solved this is with a web service that packages script files as datasets, and delivers them in the correct sequence to the remote systems when they call home.
On the remote SQL Server, we have a .NET application which calls the web service, downloads the script files, unpacks them and applies them to the database.
When the remote system calls in, it supplies the ID of the most recent upgrade it has. When the web service completes, it knows the last one it delivered. It's therefore trivial to know what level the remote systems are at.
The only manual intervention required is to create the scripts in the first place, and upload them to the central server.
A script should be executed on the SQL Server machine by a DB admin. The main algorithm of such a script is: create a backup, then for each table in a loop, lock it, alter it, and release the lock.
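Roughly, as a sketch (the database and table names are placeholders):

    -- 1. Back up first.
    BACKUP DATABASE MyDb TO DISK = N'C:\Backups\MyDb_pre_upgrade.bak' WITH INIT;

    -- 2. Apply each change inside a transaction so the table lock is
    --    taken, the alteration made, and the lock released on commit.
    BEGIN TRANSACTION;
        ALTER TABLE dbo.Orders ADD Notes NVARCHAR(MAX) NULL;
    COMMIT TRANSACTION;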
Another poster mentioned the Red Gate products, and I'll throw another commercial product out there - Quest Change Director:
http://www.quest.com/change-director-for-sql-server/
Disclaimer: I work for Quest, although I'm not in sales. Change Director does comparisons, syncing, links to a change management system, can use your dev/qa server as a source or use T-SQL scripts, has an audit trail and rollback capabilities, etc.
Like you said, central management is key, and this product focuses on that.
