I recently executed an incorrect UPDATE query (yes, ouch), and it turns out my company doesn't have any recent backup I could use to restore the data. Are there any version control tools or automated backup techniques I can use so this doesn't happen again?
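For the immediate mistake itself, one habit that helps regardless of tooling is to run risky statements inside an explicit transaction and sanity-check the result before committing. A minimal sketch (PostgreSQL syntax, with a made-up table and columns; adjust to your own schema):

    BEGIN;

    UPDATE customers
    SET    status = 'inactive'
    WHERE  last_order_date < DATE '2020-01-01';

    -- Inspect the reported row count and spot-check the data.
    SELECT status, count(*) FROM customers GROUP BY status;

    -- If it looks wrong, ROLLBACK; otherwise COMMIT.
    ROLLBACK;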
Liquibase, while mainly oriented toward managed schema changes, can also be used for data changes, with the same concept of rollback-able changesets.
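For example, a data fix can be written as a Liquibase formatted-SQL changeset with an explicit rollback statement. A minimal sketch (author, id, table, and values are placeholders):

    --liquibase formatted sql

    --changeset alice:fix-country-codes-1
    UPDATE customers SET country_code = 'GB' WHERE country_code = 'UK';
    --rollback UPDATE customers SET country_code = 'UK' WHERE country_code = 'GB';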
Version control tools and automated backup techniques are different things.
Regarding automated backup tools, here are some examples:
pgbarman: we use it in our organization
pgbackrest
RedGate ReadyRoll is a wonderful DB versioning tool that works like a charm. Its only drawback is the price.
I've researched what to use and found DBUp, Envolve, and others. These tools are good, but:
They don't let you define Up and Down scripts, so you can't both apply and roll back migrations.
They apply migrations based on name ordering rather than creation time. That's not a big issue, because developers could name the scripts by timestamp, but...
These tools don't support development in different branches, where each developer uses their own DB rather than a shared one and changes the schema separately. How to handle the cases where the team lead has to switch between branches, or QA has to test a feature before it goes to master, is an open question for me.
SSDT is another option, but it makes life very difficult if you need to do something with data and not only with the schema. What is very simple with a migration-based approach becomes a really big issue when you try to do it with a state-based one.
Can somebody suggest a DB versioning tool with a migration-based approach that works with the .NET stack and is either free, or paid but cheaper than RedGate?
Thanks
P.S.
What if we used Entity Framework only for migrations, without entities, etc.? Create a DB context and use it solely to create migration files with Up/Down methods, then either write the scripts there or point to SQL files that apply the up/down changes. The only question is the snapshots: how would that work with them?
Has somebody used this approach?
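If the actual changes live in plain SQL files that the Up/Down methods execute, the pair might look something like this (hypothetical file names and objects, SQL Server syntax):

    -- 202401151030_AddCustomerEmail.up.sql
    ALTER TABLE dbo.Customers ADD Email NVARCHAR(256) NULL;

    -- 202401151030_AddCustomerEmail.down.sql
    ALTER TABLE dbo.Customers DROP COLUMN Email;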
ReadyRoll also comes in a Core Edition, which is a free entitlement if you also own a Visual Studio Enterprise license. I admit this information isn't too helpful if you have VS Community or Professional.
I'm updating this answer because Redgate no longer supports the ReadyRoll Core edition. Instead, Redgate's supported migrations technology is Flyway, which comes with a free Community edition.
We have decided to use EF Code First migrations to do what we need. Compared with DbUp, EF Code First migrations contain a Down script, which allows us to roll the DB back to any target migration.
Yes, it's not an alternative to ReadyRoll, and I have to accept that there is no similar product, which surprises me.
But it does what we need, and after more than a month of use I can say it fits our needs.
Because we use EF Code First only for migration scripts, there is no issue with the team collaboration problem described here: the snapshot is always the same if your context doesn't have any entities.
Our project has about 20 developers, but our application makes relatively light use of databases. We have a collection of about 5 databases, all of which are very small and would have less than 20 tables each, none of which have millions of rows or anything large.
We have two options on the table for how to manage the evolution of the databases over time:
Some kind of tool. Currently we're using Visual Studio database projects, which contain the current definition of the schema, and look at a reference database to generate a diff script. We then use this diff script to bring the reference database up to date.
Use version scripts to build the database from a baseline. The scripts are manually placed in source control. Any data migration to move data from old columns/tables to new ones would be part of these scripts. A version number would be recorded in the DB somewhere, and upgrading would run all scripts between the DB's recorded version and the current version.
The second option seems to be widely used, and I have found an in-depth discussion here: http://odetocode.com/blogs/scott/archive/2008/01/31/versioning-databases-the-baseline.aspx
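The core of the second option is just a version table plus a folder of numbered scripts. A minimal sketch of the idea (SQL Server syntax; table, column, and script names are illustrative):

    -- Created once, in the baseline script.
    CREATE TABLE dbo.SchemaVersion (
        Version   INT       NOT NULL,
        AppliedOn DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
    );
    INSERT INTO dbo.SchemaVersion (Version) VALUES (0);

    -- 0001_AddOrderNotes.sql: each upgrade script ends by bumping the version.
    ALTER TABLE dbo.Orders ADD Notes NVARCHAR(MAX) NULL;
    INSERT INTO dbo.SchemaVersion (Version) VALUES (1);

The deployment runner then executes, in order and ideally in a transaction, every script whose number is greater than the highest value in dbo.SchemaVersion.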
The problem we have with what we've got at the moment is that we don't have access to our Production databases. This means that to create a release package, we have to restore a backup of Production into another location, generate a diff against that reference DB, and give the script to the production DB team. So our release to production is different from our other environments.
This makes the idea of running versioned scripts appealing, because we would use the same scripts in all environments and there's no ad-hoc work in deployment (e.g. manually restoring prod to a reference DB). But given that we have such a small-scale DB situation, I feel like we can hardly be a difficult case for the DB tools out there. What we want is something as simple as possible that is easy to understand.
Do the tools such as RedGate's suite make sense for this kind of scenario, or should we go with versioned scripts? Cost isn't so much of an issue, it's more about creating a Pit of Success where maintaining and deploying the DB is as basic and automated as possible.
I'm the product manager at Red Gate for SQL Compare, which generates diff scripts between two databases. I'd like you to take a look at our SQL Source Control tool, which will allow you to track schema changes as and when they're made in development. When it comes to deployment, if you know which schema version is in production, you can generate a deployment script from your source controlled versions. Of course you should always be testing this out in a staging environment before running on production.
Scott's article makes an excellent point about migration scripts, and Denis alludes to more complex changes that can't realistically be second-guessed by comparison tools and would therefore require custom migration scripts to be managed and used appropriately. The next version of SQL Compare, in conjunction with SQL Source Control, will therefore manage both your schema versions and your migration scripts, allowing you to get the best of both worlds. If you'd like to see early screenshots of this, please email me at David dot Atkinson at red-gate dot com. I'd really love to discuss your requirements so we can better design the tool.
In my experience there is always more to it than mere schema changes. If you split a column in two, or shift a column to a separate table, or other such things, you need to migrate both the schema and the data.
No tool or script will migrate the actual data automatically. At the very most you'll get a diff for the schema, which your devs may find useful as a reminder/checklist for DB version migration scripts (sequences of create/alter/drop and insert/update/delete done in a single transaction).
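For instance, splitting a full-name column into first/last needs a hand-written data step between the schema changes, which no diff tool can infer. A rough sketch (SQL Server syntax, made-up table and columns):

    BEGIN TRANSACTION;
    GO

    -- Schema: add the new columns.
    ALTER TABLE dbo.People ADD FirstName NVARCHAR(100) NULL, LastName NVARCHAR(100) NULL;
    GO

    -- Data: split the existing values into the new columns.
    UPDATE dbo.People
    SET FirstName = LEFT(FullName, CHARINDEX(' ', FullName + ' ') - 1),
        LastName  = LTRIM(SUBSTRING(FullName, CHARINDEX(' ', FullName + ' '), 4000));
    GO

    -- Schema: drop the old column once the data has been moved.
    ALTER TABLE dbo.People DROP COLUMN FullName;
    GO

    COMMIT TRANSACTION;
    GO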
How do you manage your SQL Server database build/deploy/migrate for Visual Studio projects?
We have a product that includes a reasonable database part (~100 tables, ~500 procs/functions/views), so we need to be able to deploy new databases of the current version as well as upgrade older databases up to the current version. Currently we maintain separate scripts for creation of new databases and migration between versions. Clearly not ideal, but how is anyone else dealing with this?
This is complicated for us by having many customers who each have their own db instance, rather than say just having dev/test/live instances on our own web servers, but the processes around managing dev/test/live for others must be similar.
UPDATE: I'd prefer not to use any proprietary products like RedGate's (although I have always heard they're really good and will look into that as a solution).
We use Red-Gate SQLCompare and SQLDataCompare to handle this. The idea is simple. Both compare products let you maintain a complete image of the schema or data from selected tables (e.g. configuration tables) as scripts. You can then compare any database to the scripts and get a change script. We keep the scripts in our Mercurial source control and tag (label) each release. Support can then go get the script for any version and use the Redgate tools to either create from scratch or upgrade.
Redgate also has an API product that allows you to do the compare function from your code. For example, this would allow you to have an automatic upgrade function in your installer or in the product itself. We often use this for our hosted web apps as it allows us to more fully automate the rollout process. In our case, we have an MSBuild task that support can execute to do an automatic rollout and upgrade. If you distribute to third-parties, you have to pay a small additional license fee for each distribution that includes the API.
Redgate also has a tool that automatically packages a database install or upgrade. We don't use that one as we have found that the compare against scripts for a version gives us more flexibility.
The Redgate tools also help us in development, because they make it trivial to source control the schema and configuration data in a very granular way (each database object can be placed in its own file).
The question was asked before SSDT projects appeared, but that's definitely the way I'd go nowadays, along with hand-crafting migration scripts for structural db changes where there is data that would be affected.
There's also the MS VSTS method (2008 description here), anyone got a good article on doing this with 2010 and the pros/cons of using these tools?
Interested if anyone has used VSTS Database Edition extensively and, if so, which features did you find the most useful over the standard Visual Studio database projects?
What are the most compelling features as opposed to alternative schema management options or tools like RedGate's SqlCompare etc?
Edit: Microsoft just released the RTM version of Database Edition (GDR) which adds support for SQL Server 2008 - link is here. I've previously blogged (briefly) about it here.
Has anyone had a chance to do any real work with the GDR? It looks like there are some real enhancements including refactoring support. I'd be really interested to hear if people are using it with SQL Server 2008...
Download from: http://www.microsoft.com/downloads/details.aspx?FamilyID=bb3ad767-5f69-4db9-b1c9-8f55759846ed&displaylang=en
We use the database edition functionality of Team Suite on Stack Overflow. As Vaibhav said, mostly it is useful because it gives you a one-click way to reverse engineer a database into source control, and keep it up to date.
Note that it also has decent Data and Schema compare tools as well. You can compare projects to physical databases and vice-versa. This makes it pretty easy to keep your database up to date, no matter where you make changes -- in the filesystem database project, or in the physical database itself.
If you compare it to tools like RedGate's, which are specifically tailored for SQL Server, the benefit is that with the proper MSDN subscription you do not have to spend more money on other tools (though keep in mind that the RedGate tools are much more mature). It also covers some areas (like regression tests and unit tests at the DB level) that other tools do not, and it does so in an integrated manner with the other VSTS testing tools, so that you can record results in Team System.
Compared to a tool like Embarcadero ErStudio (my solution of choice), it misses the cross-database features, and that is a big problem, at least for me.
If you are an "all Microsoft" shop with the proper MSDN subscription, it could be worth spending time on it.
We are currently using the GDR 2008 projects for managing our entire database development and deployment on a greenfield system. We use a TFS build script to call out to the MSBuild task for deploying the databases, along with executing the data generation plans for pre-populating the testing environment with data.
The key with the data generation plans was finding the build task to use, which is:
TaskName="DataGeneratorTask"
AssemblyName="Microsoft.Data.Schema.Tasks, Version=9.1.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
All of this GDR project work has been unbelievably helpful, and I think it is well worth the learning curve to get to know these project types. The value they provide in productivity and visibility is astronomical.
It allows us all to view the entire system in a single visual studio solution along with allowing us to start with a clean slate of our system at any point in time with either a click of the deploy command or a custom build configuration.
This blog will help with getting the TFSBuild script to run, if you're interested.
The VSDB test integration is so painful to configure that we abandoned it, and that's the only thing it's got that Red-Gate doesn't.
Red-Gate's tool is miles more useful. It does live DB and scripts in folders, but also has "snapshots." The aspect of Red-Gate SQL Compare that gives it the win is its Snapshot ability and the fact that your license allows you to deploy their assemblies and use them to perform database maintenance at customer run-time.
It has made upgrades in the COTS application that I develop a breeze. A Snapshot is a binary schema representation. You can package them as resources in an assembly, then use the snapshot in a customer run-time schema compare to bring an existing database up to the current rev.
Probably the best advantages are being able to version control individual DB schema objects (which you could do with the older "Database Projects"), while also having the power to "build"/deploy the project and turn those individual scripts into a complete database.
The ability to import scripts and have the wizard convert individual schema items into separate files is very handy if you've inherited a DB schema.
Given that the licensing model recently changed, it is even more enticing, because it's included with the Developer edition SKU. It also provides support for "Database Unit Tests", which might be useful.
From the 2008 GDR, I understand that they now support SQL Server 2008.
You can do database versioning for one. That is useful.
The other thing that is really useful is the ability to define the type of seed data for testing. Through this, Visual Studio will populate the database with random data, which is great for testing purposes.
There are other benefits as well of course.
It is always useful to put everything under the same source control, so your data dude can shelve, check in, compare with history, and even resolve work items and bugs using the same tools that other team members are using.
It also lets you have one versioning mechanism across the whole application; in other words, it doesn't make sense to say that my source control has all the versions of my project while the database can't match any of those old versions, unless you take a backup or a snapshot of the database with each build.
My company has a number of relatively small Access databases (2-5MB) that control our user assisted design tools. Naturally these databases evolve over time as data bugs are found and fixed and as the schema changes to support new features in the tools. Can anyone recommend a database diff tool to compare both the data and schema from one version of the database to the next? Any suggestions will be appreciated: free, open source, or commercial.
I use Red Gate SQL Compare for comparing schemas. It also has an interesting feature that allows you to save a snapshot of the schema, which you can then use in later diffs, for example comparing today's schema with the schema of a month ago.
I use ApexSQL Diff. It is an excellent tool for doing just what you're describing: compare schema, compare data, generate change scripts. It's not free, but it works well.
NOTE: ApexSQL Diff only works with SQL Server.
We never actually purchased it as we ended up using SQL Server 2005, but DBDiff seemed to do the trick: http://www.dkgas.com/downdbdiff.cgi
It works with any ODBC-compatible DB.
I've used Total Access Detective in the past and it did the trick. That was a while ago, though, so you might want to investigate first...
If you're looking for a free alternative to Red Gate's most excellent SQL Compare, you might want to check SQLDBDigg made by SQLDBTools. It's what I used until I caved and bought SQL Compare.
It's not a perfect solution, but I often export both databases as txt/SQL files and then use a diff program, such as the one that comes with TortoiseSVN. You can then see all of the differences. It doesn't automatically create the SQL though to sync the dbs.
http://www.diffkit.org
Features
High performance, for large datasets (+10MM rows).
Very low memory overhead, even on very large datasets.
High quality-- comprehensive embedded regression test suite for the application/framework.
Java run everywhere (tm) — Linux, Solaris, OS X, Windows, etc.
Cross database-- Oracle, MySQL, DB2, and any JDBC datasource.
Command-line driven; no GUI needed; can run in headless environments.
XML configuration file driven.
Free Open Source Software.
Apache License, Version 2.0.
Clean Object Oriented Design makes extension easy.
Easily embeddable as a Java library (jar).