SQL Server and Continuous Integration

We have a huge SQL Server database (~1 TB). Our DB deploys only target the objects that changed or were created; i.e., out of, say, 500 stored procedures, if 2-3 procedures changed, we deploy only those procedures to production.
We maintain our DB objects in a source code repository, so if we need to roll back a change, we can do so easily.
I need suggestions on how we can automate our DB deploys, taking the rollback process into consideration (we might need it at some point). With the database being this large, backup and restore is not an option.
We use Jenkins as primary CI tool.
Thoughts?
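For context, the per-object scripts this kind of pipeline deploys are usually written to be re-runnable, so a rollback is just redeploying the previous revision of the file from source control. A minimal sketch of one such file, with a hypothetical procedure name (CREATE OR ALTER requires SQL Server 2016 SP1 or later):
-- dbo.usp_GetOrders.sql - one file per object in source control (name is hypothetical)
-- Re-runnable: deploying an older revision of this file is the rollback.
CREATE OR ALTER PROCEDURE dbo.usp_GetOrders
    @CustomerId int
AS
BEGIN
    SET NOCOUNT ON;
    SELECT OrderId, OrderDate
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId;
END;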

Related

Data Migration from On Prem to Azure SQL (PaaS)

We have an on-prem SQL Server DB (SQL Server 2017, compatibility level 140) that is about 1.2 TB. We need to do a repeatable migration of just the data to a cloud SQL database (PaaS). The on-prem DB has procedures and functions that do cross-DB queries, which rules out the Data Migration Assistant. Many of the tables that we need to migrate are system-versioned tables (just to make this more fun). Ideally we would like to move the data into a different schema of a different DB so we can avoid the use of external tables (we're worried about performance).
Moving the data is just the first step, as we also need to run an ETL job on the data to massage it into the new table structure.
We are looking at using Azure Data Factory (ADF), but it has trouble with system-versioned tables unless we turn versioning off first.
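(For reference, turning system versioning off around a load and back on afterwards looks roughly like this; the table and history-table names are placeholders:)
-- assumes dbo.Orders is system-versioned with history table dbo.OrdersHistory
ALTER TABLE dbo.Orders SET (SYSTEM_VERSIONING = OFF);
-- ...load or copy data here...
ALTER TABLE dbo.Orders
    SET (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.OrdersHistory,
                                 DATA_CONSISTENCY_CHECK = ON));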
What other options can we look at and try in order to do this quickly and repeatably? Do we need to change to IaaS or use a third-party tool? Did we miss options in ADF that would handle this?
To summarize your requirements: you are not just migrating a database to the cloud but a complete SQL Server architecture, which includes:
1.2 TB of data,
Continuous data migration afterwards,
Procedures and functions for cross DB queries,
Versioned tables
Points 1, 3, and 4 can be done easily by creating a .bacpac file using SQL Server Management Studio (SSMS), exporting it from on premises to Azure Blob Storage, and then importing that file into Azure SQL Database. The .bacpac file created in SSMS can include all system-versioned tables, which can then be imported into the destination database.
Follow this third-party tutorial by sqlshack to migrate data to Azure SQL Database.
The stored procedures can also be moved using SQL scripts. Follow the steps below:
Go to the server in Management Studio.
Select the database and right-click it, then go to Tasks.
Select the Generate Scripts option under Tasks.
Once the wizard starts, select the stored procedures you want to copy and generate a script file from them, then run the script from that file against the Azure SQL DB, which you can log into from SSMS.
The repeatable migration of data is the challenging part. You can try Change Data Capture (CDC), but I'm not sure whether that is exactly what your requirement calls for. You can enable CDC at the database level using the command below:
USE <databasename>;
EXEC sys.sp_cdc_enable_db;
To learn more, refer to https://www.qlik.com/us/change-data-capture/cdc-change-data-capture
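Note that after enabling CDC at the database level, each table you want tracked must be enabled as well; a minimal sketch with a placeholder table name:
-- dbo.Orders is a placeholder; run once per table to track
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Orders',
    @role_name     = NULL;  -- NULL means no gating role is required to read changes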

Restore Azure SQL DB over an existing DB to maintain backup history

I'm setting up an Azure SQL DB for our Web App. We have enabled Point In Time Retention (PITR) and Long Term Retention (LTR). Our process is to keep backups for 1 year.
Periodically, we need to upgrade the DB by applying SQL scripts. Sometimes there is a problem with the upgrade scripts and the upgrade fails. We need to rollback the database to the previous version.
To rollback the DB I tried the restore feature. However, the restore feature seems to only create new DBs; therein lies the problem. Restoring to a new DB and removing the old one works great, but we lose all our backup history. It appears backups are tied to the DB (probably to the ResourceId).
So, how can I use Azure SQL DB and periodically restore a DB while still maintaining all the backup history?
Unfortunately, restoring from a backup in Azure SQL Database always creates a new database. The trick here may be to rename the newly restored database to the name of the original database. Once renamed, the restored database will even show all the security recommendations and automatic tuning recommendations of the original database.
So: delete the existing database, restore the backup to a new database, and rename the restored database to the original name.
You can reference the document Recover an Azure SQL database using automated database backups; it confirms that every recovery option creates a new database.
By default, SQL Database backups are stored in geo-replicated blob storage (RA-GRS). The following options are available for database recovery using automated database backups:
Create a new database on the same SQL Database server, recovered to a specified point in time within the retention period.
Create a database on the same SQL Database server, recovered to the deletion time for a deleted database.
Create a new database on any SQL Database server in the same region, recovered to the point of the most recent backups.
Create a new database on any SQL Database server in any other region, recovered to the point of the most recent replicated backups.
If you configured backup long-term retention, you can also create a new database from any LTR backup on any SQL Database server.
Important:
You cannot overwrite an existing database during restore.
"So, how can I use Azure SQL DB and periodically restore a DB and still maintain all the back up history?"
You can use Database replacement:
If the restored database is intended as a replacement for the original database, you should specify the original database's compute size and service tier. You can then rename the original database and give the restored database the original name using the ALTER DATABASE command in T-SQL.
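As a sketch, with placeholder database names, the swap is just two renames run against the master database:
-- run in master; MyAppDb / MyAppDb_restored are placeholder names
ALTER DATABASE [MyAppDb] MODIFY NAME = [MyAppDb_old];
ALTER DATABASE [MyAppDb_restored] MODIFY NAME = [MyAppDb];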
Hope this helps.

Restore SQL Server database without deleting existing database tables

I want to copy the production server database to the development server, and I am using backup and restore to bring the production DB over.
I restored it successfully and then added tables and stored procedures to the restored DB.
The next day I have to restore the same database again, and after the restore my added tables and stored procedures are deleted.
I cannot put the stored procedures and extra tables on the production server DB. I want to copy it to the development server with real-time data, and on the development server I can do anything without impacting the production DB.
Can anyone suggest a better way of doing this?
You are saying you want to merge changes.
Use a database diff tool (such as the database compare tool in Visual Studio 2013) to generate a 'difference' script between your dev and prod databases. For example, you run this tool against dev and prod and it spits out a bunch of CREATE PROCEDURE, VIEW, TABLE, etc. scripts
Generate INSERT scripts for all dev-only tables
Restore prod over dev
Execute the scripts from step 1 and 2 in dev
There is often some human intervention required in any kind of merge. For example, what if your prod tables have a column that is a different data type in dev, and your SPs are expecting this?
The other option is to build integrations that just load data from prod into dev, but this requires maintenance as changes occur in dev and prod
Yes, you could take an incremental (differential) backup rather than a full backup, since a full backup has already been restored to your dev database. Then, when you restore it, you should get the newly created DB objects and data.
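For reference, the differential pattern looks roughly like this (database names and paths are placeholders); note that a differential can only be applied if the dev copy's full restore used WITH NORECOVERY, and it will still overwrite objects added in dev:
-- on production: back up only the changes since the last full backup
BACKUP DATABASE [ProdDb]
TO DISK = 'C:\temp\ProdDb_diff.bak'
WITH DIFFERENTIAL;
-- on dev: only valid if the preceding full restore used WITH NORECOVERY
RESTORE DATABASE [DevDb]
FROM DISK = 'C:\temp\ProdDb_diff.bak'
WITH RECOVERY;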

How to sync schema continuously in two databases in SQL Server (for unit testing)

We have a database against which we run unit tests for components that require a database (for several reasons we are not mocking the DAL everywhere).
We are using SQL Server 2008 R2, and on the development DB server we have our development database (ApplicationName_Dev) and our testing DB (ApplicationName_UT).
The unit tests create the test data they need and delete it afterwards so the tables could/should be empty when no tests are running.
The problem is keeping the schema of the unit test database up to date.
The best solution for me (to my limited knowledge) would be a SQL Server Agent job that runs once a night (or when manually started), drops all the tables in the UT database, generates create scripts for all tables, indexes, and relationships in the Dev database, and runs those create scripts against the UT database. Note that we don't need to insert any data.
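As a sketch of that drop step, the DROP statements can be built from the catalog views, dropping foreign keys before tables (nothing here is specific to one database):
-- drop all foreign keys first, then all tables
DECLARE @sql nvarchar(max) = N'';
SELECT @sql += N'ALTER TABLE '
    + QUOTENAME(OBJECT_SCHEMA_NAME(parent_object_id)) + N'.'
    + QUOTENAME(OBJECT_NAME(parent_object_id))
    + N' DROP CONSTRAINT ' + QUOTENAME(name) + N';'
FROM sys.foreign_keys;
EXEC sys.sp_executesql @sql;
SET @sql = N'';
SELECT @sql += N'DROP TABLE '
    + QUOTENAME(SCHEMA_NAME(schema_id)) + N'.' + QUOTENAME(name) + N';'
FROM sys.tables;
EXEC sys.sp_executesql @sql;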
Is there any way of programmatically (T-SQL, SMO, etc.) generating CREATE scripts for all tables, including indexes and relationships?
In Management Studio I can right-click the database->Tasks->Generate scripts...->Choose Objects->Tables and I get just the scripts that I want (except for the "Use [ApplicationName_Dev]" on the first line).
Please help.
Regards,
Mathias
I'd create an SSIS package - there's a task called "Transfer SQL Server Objects Task". Specify your source and destination connections and databases, set DropObjectsFirst to True, and set CopyAllObjects (or just CopyAllTables and CopyAllViews) to True as well, and you should be set. (And obviously, don't set CopyData to True.)
You also need to set CopyIndexes and the other such table options for the table structures you want.
Setting up a job to run an SSIS package is also quite easy.
You could use a tool like SQL Delta. You create a "script" (a SQL Delta-specific script) using SQL Delta, and essentially you can get it to sync the source database with the destination database. It can also pump data into some or all tables if needed.
The whole process can be automated as a scheduled job using Windows Task Scheduler.

Best way to copy a database (SQL Server 2008)

Dumb question - what's the best way to copy instances in an environment where I want to refresh a development server with instances from a production server?
I've done backup-restore, but I've heard detach-copy-attach and one guy even told me he would just copy the datafiles between the filesystems....
Are these the three (or two, the last one sounds kind of suspect) accepted methods?
My understanding is that the second method is faster but requires downtime on the source because of the detach aspect.
Also, in this situation (wanting an exact copy of production on a dev server), what's the accepted practice for transferring logins, etc.? Should I just back up and restore the user databases + master + msdb?
Easiest way is actually a script.
Run this on production:
USE MASTER;
BACKUP DATABASE [MyDatabase]
TO DISK = 'C:\temp\MyDatabase1.bak' -- some writeable folder.
WITH COPY_ONLY
This one command makes a complete backup copy of the database onto a single file, without interfering with production availability or backup schedule, etc.
To restore, just run this on your dev or test SQL Server:
USE MASTER;
RESTORE DATABASE [MyDatabase]
FROM DISK = 'C:\temp\MyDatabase1.bak'
WITH
MOVE 'MyDatabase' TO 'C:\Sql\MyDatabase.mdf', -- or wherever these live on target
MOVE 'MyDatabase_log' TO 'C:\Sql\MyDatabase_log.ldf',
REPLACE, RECOVERY
Then save these scripts on each server. One-click convenience.
Edit:
If you get an error when restoring that the logical names don't match, you can get them like this:
RESTORE FILELISTONLY
FROM DISK = 'C:\temp\MyDatabaseName1.bak'
If you use SQL Server logins (not windows authentication) you can run this after restoring each time (on the dev/test machine):
USE MyDatabaseName;
EXEC sp_change_users_login 'Auto_Fix', 'userloginname', NULL, 'userpassword';
The fastest way to copy a database is the detach-copy-attach method, but the production users will not have database access while the prod DB is detached. You can do something like this if your production DB is, for example, a point-of-sale system that nobody uses during the night.
If you cannot detach the production DB, you should use backup and restore.
You will have to create the logins if they do not exist in the new instance. I do not recommend copying the system databases.
You can use the SQL Server Management Studio to create the scripts that create the logins you need. Right click on the login you need to create and select Script Login As / Create.
This will list the orphaned users:
EXEC sp_change_users_login 'Report'
If you already have a login id and password for this user, fix it by doing:
EXEC sp_change_users_login 'Auto_Fix', 'user'
If you want to create a new login id and password for this user, fix it by doing:
EXEC sp_change_users_login 'Auto_Fix', 'user', 'login', 'password'
UPDATE:
My advice below tells you how to script a DB using SQL Server Management Studio, but the default settings in SSMS miss out all sorts of crucial parts of a database (like indexes and triggers!) for some reason. So, I created my own program to properly script a database including just about every type of DB object you may have added. I recommend using this instead. It's called SQL Server Scripter and it can be found here:
https://bitbucket.org/jez9999/sqlserverscripter
I'm surprised no-one has mentioned this, because it's really useful: you can dump out a database (its schema and data) to a script, using SQL Server Management Studio.
Right-click the database, choose "Tasks | Generate Scripts...", and then select to script specific database objects. Select the ones you want to copy over to the new DB (you probably want to select at least the Tables and Schemas). Then, for the "Set Scripting Options" screen, click "Advanced", scroll down to "Types of data to script" and select "Schema and data". Click OK, and finish generating the script. You'll see that this has now generated a long script for you that creates the database's tables and inserts the data into them! You can then create a new database, and change the USE [DbName] statement at the top of the script to reflect the name of the new database you want to copy the old one to. Run the script and the old database's schema and data will be copied to the new one!
This allows you to do the whole thing from within SQL Server Management studio, and there's no need to touch the file system.
Below is what I do to copy a database from production env to my local env:
Create an empty database on your local SQL Server.
Right-click the new database -> Tasks -> Import Data.
In the SQL Server Import and Export Wizard, select the production environment's server name as the data source, and select your new database as the destination.
It's hard to detach your production DB or other running DBs and deal with that downtime, so I almost always use a backup/restore method.
If you also want to make sure to keep your logins in sync, check out the MS KB article on using the stored proc sp_help_revlogin to do this.
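Once you've created sp_help_revlogin in master from that article, usage is just:
-- outputs CREATE LOGIN statements (with original SIDs and password hashes)
-- that can be replayed on the dev server
EXEC sp_help_revlogin;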
The detach/copy/attach method will take down the database. That's not something you'd want in production.
The backup/restore will only work if you have write permissions to the production server. I work with Amazon RDS and I don't.
The import/export method doesn't really work because of foreign keys - unless you do tables one by one in the order they reference one another. You can do an import/export to a new database. That will copy all the tables and data, but not the foreign keys.
This sounds like a common operation one needs to do with a database. Why isn't SQL Server handling this properly? Every time I had to do this it was frustrating.
That being said, the only painless solution I've encountered was the SQL Azure Migration Tool, which is maintained by the community. It works with SQL Server too.
I run an SP to DROP the table(s) and then use a DTS package to import the most recent production table(s) onto my development box.
Then I go home and come back the following morning. It's not elegant; but it works for me.
If you want to take a copy of a live database, do the Backup/Restore method.
[In SQL Server 2000, not sure about 2008:] Just keep in mind that if you are using SQL Server accounts in this database, as opposed to Windows accounts, and the master DB is different or out of sync on the development server, the user accounts will not translate when you do the restore. I've heard about an SP to remap them, but I can't remember which one it was.
