SQL Server Data Tools 2012 - deployment package for multiple databases

In our company we have a database solution that spans three SQL Server instances, each with different databases. Each instance has some jobs and replication.
For now we maintain creation and update scripts manually and execute them with bat files.
Our deployment package contains scripts for all objects, including jobs and replication.
We want to automate the process so that deployment packages are built and tested after every svn commit (continuous integration). We also have a branch for every release, and each release corresponds to a database version. Different clients have different releases/versions installed, so we need to be able to create a deployment package from any branch.
Can we use SQL Server Data Tools 2012 for our needs? I have only seen tutorials for a single database and I don't know how to use it in a more complex environment.
Alternatively, we could use Data Tools to maintain the schema scripts and write the jobs/replication scripts manually. But can we use the build process to combine it all into one package?

You should be able to use SSDT for this, by way of Publish Profiles. Create a publish profile for each instance and set up your CI jobs accordingly.
Standardizing your database names across instances (especially if they're all for the same product) would help.
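As a rough sketch, assuming the project builds a single dacpac and you keep one publish profile per instance (the file names here are placeholders, not something SSDT creates for you):
rem each profile holds the connection string and deployment options for one instance
SqlPackage.exe /Action:Publish /SourceFile:OurDatabase.dacpac /Profile:Instance1.publish.xml
SqlPackage.exe /Action:Publish /SourceFile:OurDatabase.dacpac /Profile:Instance2.publish.xml
SqlPackage.exe /Action:Publish /SourceFile:OurDatabase.dacpac /Profile:Instance3.publish.xml
Your CI job can run the same commands against a build from any branch, since the dacpac it publishes is whatever that branch produced.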

Related

How to build CI/CD for MS SQL Server?

I'm trying to build CI/CD for my Microsoft SQL Server database projects. It will work with Azure DevOps pipelines.
I have all the databases in Visual Studio database projects with Git as source control. My objective is to be able to release the databases with DevOps pipelines to the different environments:
DEV
UAT
PROD
I was thinking of using DBGhost: http://www.innovartis.co.uk/ but I can't find up-to-date information about this tool (only very old material), and there is very little on the internet about it and how to use it (is it still in use?).
I would like to use a mix of DBGhost and DevOps: DBGhost for source scripting, building, comparing, synchronizing, creating delta scripts and upgrading, and DevOps to make the releases (which would call the builds created by DBGhost).
If you have any ideas using this or other methods I'd be grateful, because currently all releases are manual and that is not a good place to be.
We have this configured in our environment using just DevOps. Our database is in a Visual Studio database project. The MSBuild task builds the project and generates a DACPAC file as an artifact, and the Release uses the "SQL Server Database Deploy" task to deploy it to the database. The deploy task needs an account with enough privileges to create the database, logins, etc., but it takes care of performing the schema compare, generating the delta scripts, and executing them. If your deploy is going to make changes that could result in data loss, such as removing columns, you will need to include the additional argument /p:BlockOnPossibleDataLoss=false in the deploy task. This flag is not recommended unless you know there will be changes that cause data loss; without it, any deploy that would result in data loss will fail.
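If it helps to see it outside the pipeline UI, the deploy task effectively boils down to a SqlPackage.exe publish along these lines (the server, database and dacpac names here are placeholders):
rem publish the built dacpac, allowing operations that may lose data (e.g. dropped columns)
SqlPackage.exe /Action:Publish /SourceFile:MyDatabase.dacpac /TargetServerName:MyDbServer /TargetDatabaseName:MyDatabase /p:BlockOnPossibleDataLoss=false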

Azure continuous deployment from GitHub and database upgrades

I have a web application that I usually deploy using Web Deploy directly from Visual Studio (whatever branch I am currently using in VS - normally master). But now I'm introducing a second web app on Azure that will be built from the same repo but a different branch. To make things simpler I will configure both web apps on Azure to integrate directly with GitHub and associate each with a specific branch.
I also added two additional web.config files, Web.Primary.config and Web.Secondary.config, and configured the app settings of each web app in the Azure portal by adding the value SCM_BUILD_ARGS, set to
SCM_BUILD_ARGS=-p:PublishProfile=Primary // in primary web app
SCM_BUILD_ARGS=-p:PublishProfile=Secondary // in secondary web app
which, as I understand it, will transform the correct config file with the specific external services' configuration (DB connection, mail server, etc.).
Now the additional step that I would like to include in continuous deployment is to run a set of SQL scripts that I have in my repo and that I used to run manually to upgrade the database during Web Deploy in VS. The individual scripts each perform a specific database upgrade step:
back up current tables - this creates a set of Backup_OriginalTableName tables copied from the existing ones and populated with the existing data
drop the whole DB model - all non-backup objects are dropped: procedures, functions, types, views, tables...
create model - creates all tables, views and indices
create user types
create user functions
create stored procedures
restore data to the new tables from the backup tables - this step may occasionally break if the new model introduces non-nullable columns that don't have defaults defined on them; I will have to mitigate that by adding an additional script that adds the missing columns to the backup tables and gives them some defaults, but that's a completely different issue.
I also used to have a set of batch files (BAT) in my VS solution that simply ran sqlcmd against a specific database instance and executed these scripts in the predefined order (as above; there is a stripped-down sketch after this list). The batches were:
Recreate Local.bat - this one used additional SQL scripts to not restore from backup but rather to recreate an empty DB with only lookup tables being populated and some default data for development purposes (like predefined test users)
Restore Local.bat - I used this script to simply restore database from backup tables discarding any invalid data I may have created while debugging/testing since last DB recreate/upgrade/restore
Upgrade Local.bat - upgrade local development DB executing scripts mentioned above
Upgrade Production.bat - upgrade production DB on Azure executing scripts mentioned above
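For illustration, a stripped-down sketch of what such an upgrade batch looks like (the instance, database and script names here are made up, not my real ones):
rem Upgrade Local.bat - run the upgrade scripts in the predefined order; -b aborts on the first error
sqlcmd -S .\SQLEXPRESS -d MyAppDb -E -b -i "01 Backup tables.sql"
sqlcmd -S .\SQLEXPRESS -d MyAppDb -E -b -i "02 Drop model.sql"
sqlcmd -S .\SQLEXPRESS -d MyAppDb -E -b -i "03 Create model.sql"
sqlcmd -S .\SQLEXPRESS -d MyAppDb -E -b -i "04 Create user types.sql"
sqlcmd -S .\SQLEXPRESS -d MyAppDb -E -b -i "05 Create user functions.sql"
sqlcmd -S .\SQLEXPRESS -d MyAppDb -E -b -i "06 Create stored procedures.sql"
sqlcmd -S .\SQLEXPRESS -d MyAppDb -E -b -i "07 Restore data from backup.sql"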
So, to support the whole deployment process that I have been doing manually in VS, I would now like to also execute these scripts against the specific Azure SQL DB during continuous deployment. I suppose I should be running them right after code deployment, because if that fails, the DB shouldn't be upgraded either.
I'm a bit confused about where and how to do this. Can I configure it somewhere in the Azure portal? I was looking for resources on the web but I can't seem to find any relevant information on how to add deployment steps that execute these scripts. I'd think this is an everyday scenario, as it's hard to imagine web apps not requiring databases these days.
Maybe it's just my DB upgrade/deployment process that is wrong, so let me also know if there is another, more usual way to do DB upgrade/migration with continuous deployment on Azure... I may change my process to accommodate it.
Note 1: I'm not using Entity Framework or any other full-blown ORM. I'm using NPoco instead, and all my DB logic is built into stored procedures that the DAL uses.
Note 2: I'm aware of the recently introduced staging capabilities of Azure, but my apps are on a cheaper plan that doesn't support staging, and I want to keep it that way as I may be introducing additional web apps along the way that will use additional code branches and resources (DB, mail, etc.).
It sounds to me like your db project is a good candidate for SSDT and inclusion in source control. You can create a MyDB.sqlproj that builds your db as a dacpac, and then you can use SqlPackage.exe Publish to accomplish your deployment to Azure.
We recently brought our databases under source control and follow a similar process to build and automatically deploy them (but not to a SQL Azure DB). We've found the source control, SSDT tooling support, and automated deployment options to be worth the effort of setting up and maintaining our project this way.
This SO question has some good notes for Azure deployment of a dacpac using SSDT:
How to publish DACPAC file to a SQL Server database project via SQLPackage.exe of SSDT?
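For reference, a minimal sketch of such a SqlPackage.exe publish against an Azure SQL DB (the server, database and credentials below are placeholders):
rem publish the dacpac produced by building MyDB.sqlproj straight to the Azure SQL database
SqlPackage.exe /Action:Publish /SourceFile:MyDB.dacpac /TargetServerName:myserver.database.windows.net /TargetDatabaseName:MyDb /TargetUser:deployuser /TargetPassword:********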

Store SSIS package in named database?

I have a server hosting a number of different databases, all with the same structure, and a library of SSIS packages, some of which are common across the client databases and others are client-specific.
I'm aware that you can store packages in MSDB, but this is a shared store across the whole SQL instance - is it possible to attach a package to a specific database?
Actually, when you store packages in msdb, they are stored in a specific instance's msdb. Run either SELECT * FROM dbo.sysdtspackages90 (2005) or SELECT * FROM dbo.sysssispackages (2008) across all your instances and you'll see which one is currently hosting your packages. If you are using folders, I have a fancier version of these queries available.
What I believe you are observing is an artifact of the tools. There is only one instance of the SQL Server Integration Services service. This service doesn't stop you from storing packages in a specific instance, it just makes it a little more complex to do so. Or as I see it, by ditching the GUI (SSMS) you free yourself from the fetters of non-automated administration.
How do you push packages into the other named instances? You can either edit the service's .ini file and then reconnect to Integration Services in SSMS, or use a command-line or query-based approach to managing your packages. We used the SSISDeployManifest in my previous shops with success to handle deployments. There is a GUI associated with the .ssisDeploymentManifest and you can use that to handle your deploys, or you're welcome to work with the .NET object model to handle deployments. I was happy with my PowerShell SSIS deployment and maintenance, but your mileage may vary.
Finally, to give concrete examples of a command-line deployment, the following would take a package named MyPackage.dtsx sitting in the root of C: and deploy it to two named instances on the current machine, into the root of MSDB.
dtutil.exe /file "C:\MyPackage.dtsx" /DestServer .\Dev2008 /COPY SQL;"MyPackage"
dtutil.exe /file "C:\MyPackage.dtsx" /DestServer .\Test2008 /COPY SQL;"MyPackage"
I have an earlier version of my PowerShell script for generating calls to dtexec instead of using the object library directly.
Let me know if you have further questions.

Distribute OLAP cubes as part of application setup

We currently have a custom application that is distributed together with our database (SQL 2005/2008). It is an easy task: before we release a new version we just pack our database into SQL initialization scripts (these create tables and populate data). We use SQL Server Management Studio to generate these scripts.
As a next step we would like to deploy an OLAP cube (along with the ETL commands made with Integration Services) that would be used to analyze the data in the original database.
We know how to create and design a cube, but I do not even know how to generalize all these packages and deploy them as a solution, script or something else that our customers could install on their servers. Customers do not have Visual Studio, and we need to create "something" wizard-like (with some input required from the customer, e.g. OLAP cube name, server, etc.) for them to deploy it.
How do you do that?
From Here:
Microsoft SQL Server 2005 Analysis Services (SSAS) provides three tools for deploying an Analysis Services database onto an Analysis Services server in the production environment:
Using an XML Script - Use SQL Server Management Studio to generate an XML script of the metadata of an existing Analysis Services database, and then run that script on another server to recreate the initial database.
Using the Analysis Services Deployment Wizard - Use the Analysis Services Deployment Wizard to use the XMLA output files generated by an Analysis Services project to deploy the project's metadata to a destination server.
Synchronizing Analysis Services Databases - Use the Synchronize Database Wizard to synchronize the metadata and data between any two Analysis Services databases.
In addition to using one of the deployment tools, you can deploy Analysis Services by using the backup and restore functionality. For more information, see Backing Up and Restoring an Analysis Services Database.
The Analysis Services Deployment Wizard can be found in your Start menu under SQL 2005, Analysis Services, Deployment Wizard. This takes the .asdatabase file in your bin directory and creates an XMLA script that creates the SSAS database.
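If you want to avoid clicking through the wizard on the customer's machine, it can also be driven from the command line to emit that XMLA script; as far as I recall the switches are as below, but double-check them, and the file names here are placeholders:
rem generate the deployment XMLA from the .asdatabase without connecting to a server
Microsoft.AnalysisServices.Deployment.exe "MyCubeProject\bin\MyCube.asdatabase" /o:"MyCube.xmla" /d
The resulting MyCube.xmla can then be executed on the customer's server, for example with the Ascmd sample utility linked below or from SSMS.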
Links:
Using the Analysis Services Deployment Wizard
Readme for Ascmd Command-line Utility Sample
Alternatively, you can use a tool to build the cubes and schemas that provides a simple mechanism for deploying initial implementations and a smooth upgrade path.
As you know, deployment isn't just a case of implementing a database, even an OLAP database, in the target environment. There's also the ETL and the tables to consider, which means ensuring that at every step of the way you're creating table/SQL scripts. All of this is fine and dandy until you come to provide an upgrade to your product and need to upgrade the SSIS/DW relational schema tables and SSAS cube structures.
What you find is that MS is no help at all here. It's helpful for initial deployments, but doesn't provide much in the way of in-situ upgrades.
This is a problem that we have faced and developed a tool to address, so that we're able to do the things you are trying to do, but do them smoothly, leaving our technicians to focus on building high-quality data warehouses rather than on technologies for mundane, annoying, fraught-with-danger but necessary things like "upgrades".
Check out http://www.dataacademy.com, this is the product we've developed to do successfully, just what you are trying to do. Drop me a mail, if you'd like to discuss further.
Cheers and the best of luck.

Transfer objects and data between SQL 2005 databases

I want to transfer objects (tables, stored procedures, data, etc.) between two servers (a dev box and a live box) and was wondering what the best approach for doing this is.
In SQL Server 2000, you could transfer all objects and data between databases. Now all there is is 'copy data' and 'write a query'. Where has the second option gone?
Both databases are SQL 2005 (with service pack 2). When transferring, primary keys and relationships should be kept intact as well as all the views and other associated data with regards to ASP.NET authentication. Integration Services is not setup up on the live server, so that is not an option.
The only way I can think of is generating scripts, then running them on the other server, but that is more time consuming than the old way (this is how I am doing it now).
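For reference, this is roughly what I do now to apply the generated scripts to the live box (the server, database and file names are placeholders):
rem run the script generated on the dev box against the live database; -b stops on the first error
sqlcmd -S LIVESERVER -d MyDatabase -E -b -i "GeneratedObjectsAndData.sql"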
If you are willing to pay, I recommend Sql Compare and Sql Data Compare from Red Gate.
Very useful products.
Database Publishing Wizard
http://sqlhost.codeplex.com/
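If I remember its command-line form correctly (worth verifying; the database name and output path here are placeholders), it can also script out a whole database in one go:
rem script schema and data for the whole database into a single .sql file
sqlpubwiz script -d MyDatabase "C:\MyDatabase.sql"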
It's a shame you haven't got Integration Services installed as you could use the "Copy Database Wizard". I believe this creates an SSIS package that runs on the destination server.
If you have Visual Studio 2008, you could try the Data comparison and Schema comparison tools.
Your best bet is probably a schema & data comparison tool; there's various tools listed at http://www.mssqltips.com/tip.asp?tip=1069
You don't mention the scope of your application or the number of developers, etc., so it is a little hard to make any recommendations. However, if your development consists of multiple concurrent projects and multiple developers and you are copying from Development to Production, I would recommend something like the following:
implement 3 "areas": dev, qa, production.
develop all changes in dev, create all changes in scripts, use something like cvs or sourcesafe to track changes on all objects
when changes are ready and tested, run your scripts in qa; this will validate your scripts and install procedure
when ready run your scripts and install procedure on production
note: qa is almost identical to production, except for applied changes waiting for their final production install. dev contains any work-in-progress changes, extra debug junk, etc. You can periodically restore a production backup onto qa and dev to resync them (just make sure all developers are aware of this and plan accordingly), because (depending on the number of developers) production, qa and dev will start to diverge over time.
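As a rough sketch of that promotion flow (the server, database and script names are made up), the same tested change script is applied to each area in turn:
rem promote one change script through the areas; -b aborts on the first error
sqlcmd -S DevServer -d MyAppDb -E -b -i 0042_add_invoice_table.sql
sqlcmd -S QaServer -d MyAppDb -E -b -i 0042_add_invoice_table.sql
sqlcmd -S ProdServer -d MyAppDb -E -b -i 0042_add_invoice_table.sql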
