I am using Microsoft SQL Server 2017 and I want to implement CI/CD using Jenkins.
If I change a stored procedure or rename a column in my dev database, I want those changes to be reflected in the QA database after I push my dev database changes.
I am aware of Redgate, but I don't want to use it. I'm looking to implement this without any paid software.
Redgate (who I work for) offers Flyway Community, which is a free tier. It uses a migrations-based model in which you author each change as a SQL migration script. You then invoke flyway migrate (its command line) from Jenkins; Flyway records what has already been applied to each environment in its schema history table, so only the pending scripts are run against the environments managed by your Jenkins pipeline.
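A minimal sketch of what that can look like from a Jenkins build step (the server, database, credentials, and migration file name below are all placeholders):

    rem Hypothetical example: sql\V2__rename_customer_column.sql is a versioned migration in the repo
    flyway -url="jdbc:sqlserver://qa-sql-server:1433;databaseName=MyAppDb" ^
           -user="deploy_user" -password="%FLYWAY_PASSWORD%" ^
           -locations="filesystem:sql" migrate

Running the same command with a different -url per pipeline stage is what promotes the changes from dev to QA and beyond.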
Related
I'm trying to build CI/CD for my Microsoft SQL Server database projects, working with Azure DevOps pipelines.
I have all databases in Visual Studio database projects with Git as source control. My objective is to be able to release databases, using DevOps pipelines, to the different environments:
DEV
UAT
PROD
I was thinking of using DBGhost (http://www.innovartis.co.uk/), but I can't find up-to-date information about this tool (only very old info), and there is very little on the internet about it and how to use it (is it still in use?).
I would like to use a mix of DBGhost and DevOps: DBGhost for scripting the source, building, comparing, synchronizing, creating delta scripts, and upgrading, and DevOps for making releases (which would call the builds created by DBGhost).
I'd be grateful for any ideas using this or other methods, because currently all releases are manual, which is not advisable.
We have this configured in our environment using just DevOps. Our database is in a Visual Studio database project. The MSBuild task builds the project and generates a DACPAC file as an artifact, and the Release uses the "SQL Server Database Deploy" task to deploy this to the database. The deploy task needs to use an account with enough privileges to create the database, logins, etc., but it takes care of performing the schema compare, generating the delta scripts, and executing them. If your deploy is going to make changes that could result in data loss, such as removing columns, you will need to include the additional argument /p:BlockOnPossibleDataLoss=false in the deploy task. This flag is not recommended unless you know there will be changes that will cause data loss; without the flag, any deploy that would result in data loss will fail.
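Under the hood that deploy task drives SqlPackage, so if you ever need to run the same publish outside the pipeline, it looks roughly like this (server, database, and file names are placeholders):

    rem Publishes the built DACPAC; compares against the target and applies the delta
    SqlPackage.exe /Action:Publish ^
        /SourceFile:"MyDatabase.dacpac" ^
        /TargetServerName:"qa-sql-server" ^
        /TargetDatabaseName:"MyDatabase" ^
        /p:BlockOnPossibleDataLoss=false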
Hi,
We are currently working on a .net core project that will use multiple databases with the same structure.
In short, this is a multi tenant project and each tenant will use the same web application (multiple instances behind a load balancer) BUT each tenant will have its own database.
We are searching for the best solution to ease our deployment process.
When the application (and DB) is updated, we need to update the database structure on all SQL servers and all databases (one SQL can contain x databases).
FYI, application and SQL server are hosted on AWS, our CI/CD is Azure DevOps.
And last (but not least) limitation: we are working in VSCode only (Mac & Linux laptops).
So, we looked at some solutions:
Using database projects (.sqlproj) + DACPAC generation deployed using DevOps, but that's not available in VSCode
Using Migrations: doesn't work with multiple databases and dynamic connection strings
Using SQL scripts: too complicated to maintain by hand a SQL script that takes care of all possible cases
So could someone give us some advice to solve this problem?
The general solution here is to generate SQL Scripts for each deployment, and integrate those into your CI/CD process.
You could use EF Migrations to generate a SQL script that is then tested, checked into your repo as a first-class asset, and deployed by your CI/CD pipeline. Or you could use SSDT to manage the schema and generate change scripts. But those aren't the only reasonable ways.
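As a sketch of the EF Core route (project and output paths below are placeholders), the script can be generated at build time and published as a pipeline artifact; the --idempotent flag makes it safe to run against databases that are at different migration versions, which suits a one-database-per-tenant setup:

    rem Produces one idempotent SQL script covering all migrations; safe to re-run
    rem against tenant databases that are at different migration versions
    dotnet ef migrations script --idempotent --output artifacts\migrate.sql --project src\MyApp.Data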
If you are modifying the schema by hand without using SSDT, you would normally just use a tool to generate the change script. And go from there.
There are many tools (including SSDT) that help you diff a development environment against a target production schema and generate the change scripts, e.g. Redgate ReadyRoll.
Note that if you intend to perform online schema updates you need to review the change scripts manually for offline DDL operations, and to ensure that your code/database changes have the correct forward and backward compatibility to support a rollout while the application is online.
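To illustrate that compatibility point (the table and column names here are made up), a column rename is typically split across two releases rather than done in one step, so old and new application versions can run against the same schema during the rollout:

    -- Release N: additive, online-safe change; the old and new columns coexist
    ALTER TABLE dbo.Customer ADD EmailAddress NVARCHAR(256) NULL;
    UPDATE dbo.Customer SET EmailAddress = Email WHERE EmailAddress IS NULL;

    -- Release N+1, once every application instance reads/writes EmailAddress: drop the old column
    ALTER TABLE dbo.Customer DROP COLUMN Email;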
And preparing, reviewing, testing, and editing the database change scripts is not something that everyone on the dev team needs to do. So you can always consider jumping onto a Windows VM for that task.
So I am in the throes of developing our Continuous Integration practices. We are a .Net/MSSQL shop. We will all soon be on VS2012. We have settled on CruiseControl.Net as our CI server, using msbuild to compile our projects. We use SVN (possibly switching to Git later, but that's another discussion) for source control. I'm leaning towards using InstallShield to deploy code packages (usually web apps and/or batch executables) to our QA and production servers. (CCNet would build these MSIs as part of our CI.) We are also starting to include unit testing in our projects, and will use NUnit integrated with CCNet to run them automatically upon check-in.
So far this works for our standard web app/exe development. Where it does not fit in (yet) is with our MSSQL change management, or lack thereof. It's been pretty cowboy how we've done this. Some folks have used Migrator.Net. Others just do a SQL Compare with Redgate and generate a script. Still others have hand-written sql scripts. It may or may not be in SVN. "Source control" at the db level is basically "we have backups of our databases." Boo, hiss. Needless to say that if we want some consistency with our CI and with our deployments, we need to settle on something. So far I am leaning towards using VS SQL projects to handle the change management and deployment.
Note: we (developers) are not supposed to push changes. Sys admins do that. So we can't run anything to deploy code or sql.
So, 2 problems to solve (I think):
What "technique" to use so that our CI server blows away a CI version of the database so that unit tests can be tested against it. I've settled that VS2012 SQL projects can do that. CCNet can run msbuild against the db project, which recreates the database. This is fairly easy.
How to generate change scripts for our QA and prod environments? This one I'm stuck on.
VS can do a schema compare and then generate the sql script -- but it is dependent on sqlcmd. So our sys admins would have to run sqlcmd from the command prompt to deploy it... probably not ideal. Right?
I could run msbuild again to deploy... but I don't want the database re-created, I just want changes deployed.
So what are the options here? I need something self-contained for the admins to run -- and check-in to SVN. Should I make another msi for database deployments? Can CCNet/msbuild make some other kind of "deployment package" for database changes (not re-creation) where the sys admins can double-click and go?
How do you all handle this?
Thanks
Tom
Check out the SQL Server Data Tools package from the Microsoft site.
This will register a new SQL Server 2012 Database project type to contain the definition of all your database structures. Upon build, this produces a DACPAC (and, optionally, a create script) that you can use to deploy your database.
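The build itself is plain MSBuild against the .sqlproj, so it slots straight into a CCNet build; the project name below is a placeholder:

    rem Builds the database project and produces bin\Release\MyDatabase.dacpac
    msbuild MyDatabase.sqlproj /p:Configuration=Release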
Then, for upgrading your database, use the SQLPACKAGE.EXE tool with the built DACPAC and the target database server name to generate an Update.sql script.
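A rough sketch of that step, with placeholder server, database, and file names; the resulting Update.sql is the self-contained artifact your sys admins can check into SVN and run:

    rem Compares the built DACPAC against the target database and writes the delta script
    SqlPackage.exe /Action:Script ^
        /SourceFile:"bin\Release\MyDatabase.dacpac" ^
        /TargetServerName:"prod-sql-server" ^
        /TargetDatabaseName:"MyDatabase" ^
        /OutputPath:"Update.sql"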
Update: Also, on the issue of how you're running unit tests, you could create helper methods that invoke the create scripts by launching a process and passing the path to the output create.sql script, then have your tests 'tear down' the database using the same method but with a DROP DATABASE statement.
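The tear-down can be as simple as the following, where the database name is a placeholder:

    -- Drop the CI test database if it exists; SINGLE_USER kicks out any lingering test connections
    IF DB_ID(N'UnitTestDb') IS NOT NULL
    BEGIN
        ALTER DATABASE UnitTestDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
        DROP DATABASE UnitTestDb;
    END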
Let's suppose I want to add a new feature to my ASP.NET MVC application running SQL Server 2008 as a data source. In order to implement this new feature, I need to add a few new columns to existing database tables.
After performing these changes on my development server and implementing the new features, what's the easiest way to perform the same database changes on the production server while deploying the new version of my application? Is there any way to automate this?
Edit: As I just found out, Visual Studio 2008's Server Explorer seems to be able to extract the necessary changes for me by comparing two different database layouts (Right-click database, click on "Compare Schema"). Does this usually cover my requirements or is there any big gotcha when using this feature?
I believe versioning the database using manually generated scripts, similar to the approach described by K. Scott Allen, is well worth the investment in time, but it's not the automated solution you're asking for.
Red Gate's SQL Compare utility might do it for you if your needs are relatively straightforward. If not, a tool like ER-Win or ER-Studio can handle hard-core schema management and migrations.
You should have db and app layer versioning. Period.
If you have db version 1.0 and app layer 1.0 in production, all the changes performed afterwards for versions 1.1 and 1.1.5 should be "upgradable" via scripts.
All "alter table" , and "alter proc" statements are runnable via scripts.
Or alternatively:
Restore the 1.0 db to a db_old database, create the production db from scripts, and just copy the data (if you don't have a very complicated database, this should not be difficult).
Automatic deployment for app layer 1.0.
As always, you must rehearse the whole process in DEV, test it in TEST, verify it in QA, and only then perform it in the PROD environment.
Edit: I personally think that if the team is not able to smoothly upgrade from version 1.0 to 1.1 at the same time on DEV, it smells like bad design and a mix-up of responsibilities between what should be in the app layer and what should be on the db server.
Are there any tools, or 'best practices' for creating Migrations on MSSQL? I have a Dev & Production database, and the Dev one often has new SPROCs created, and occasionally the structure is added to. I'd like to be able to write a set of scripts during each iteration which will update the Dev server, then execute all the scripts at release time to update Production. In ruby I can do this with migrations - is there an equivalent?
There are a few:
Rails Migrations running on SQL Server
RikMigrations
Tarantino
Migrator.NET
Machine Migrations
Subsonic Migrations
dbDeploy.NET
Fluent Migrator
For what it's worth, my favourite of these is Fluent Migrator.
Here at Red Gate we've now got a migrations solution that uses SQL Compare and SQL Source Control.
http://www.red-gate.com/MessageBoard/viewtopic.php?t=14107
It's currently an early access build, due out by the end of the year, so we're keen to get as much feedback as possible.