Automatically replace dev database with live database? - sql-server

I have live and dev versions of an ASP.NET website, with corresponding live and dev versions of the SQL Server back end. Each is on its own server (so 4 total) on the same local network.
Every so often, we've been manually replacing the dev DB with the live DB, using backup & restore, so that the dev DB is a reasonably recent snapshot of the live DB. What's the best way to automate that process?
I'd be interested in either having it run on a schedule, or making it a pushbutton process where we still control the timing.
(For what it's worth, we use CruiseControl.net; so an option that would let us use the CC.net dashboard to launch this process would be nice.)

1- You can use replication (transactional or snapshot) to synchronize the two databases.
2- Write an application that takes a backup of the live DB and restores it to dev (use SMO).
3- Write an application that deletes all data from the dev DB and copies all data from the live DB into it. You can use bcp (SqlBulkCopy) to do this work.

I think scheduling or automating a restore would be relatively simple with the RESTORE command in T-SQL. Schedule your backups to go to a designated location and set whatever schedule / script that refreshes the dev database to look in the same location.
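As a minimal sketch of what that scheduled T-SQL might look like (database names, logical file names, and paths here are assumptions; adjust them to your environment):

```sql
-- Kick everyone out of the dev database so the restore can take exclusive access
ALTER DATABASE DevDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

-- Restore the latest live backup over the dev database
RESTORE DATABASE DevDB
FROM DISK = N'\\backupshare\LiveDB.bak'
WITH REPLACE,
     MOVE N'LiveDB_Data' TO N'D:\Data\DevDB.mdf',
     MOVE N'LiveDB_Log'  TO N'L:\Logs\DevDB.ldf',
     RECOVERY;

-- Reopen the database for normal use
ALTER DATABASE DevDB SET MULTI_USER;
```

Put that in a SQL Agent job and you have the scheduled variant; have CC.net invoke the same script (e.g. via sqlcmd) and you have the pushbutton variant.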

Why not set up snapshot replication between your live and dev databases? This is how we keep our dev database in sync with our production database.
http://msdn.microsoft.com/en-us/library/ms151198.aspx

Related

Synchronize data from the production Azure SQL database with the staging database allowing changes to be made in the second

I have two databases in Azure SQL Database that are almost identical in structure (unless one of the two is modified). I need to synchronize the data from the production database to the staging database, while being able to make changes in staging without harming production, unless I need to do a production restore (that's another topic).
If there is no solution for that, I at least want to be able to make staging equal to production when my developers need it. Is there a way to overwrite one database with a backup of another without having to create a new database (since creating a new one would mean changing the server name in the app)?
The process I currently employ across my environments is:
Start with a baseline where STAGING and PROD are exactly the same. This you can achieve by restoring the latest backup(s) from your PROD environment to your STAGING environment.
As changes occur in DEV and need to be released to STAGING, create the appropriate release scripts to apply to STAGING.
When you need to refresh the data in STAGING, restore the latest backup(s) from PROD to STAGING again.
Optional: Run any post-refresh STAGING-specific scripts that are needed, e.g. if you obfuscate any of the data, or change some of the data to signify it's the STAGING environment.
Run any release scripts in STAGING, that haven't yet been released to PROD.
Over time, the release scripts that were used in STAGING should get used to release to PROD, which is what will help keep the two environments in sync.
Repeat steps 2, 3, and 4 at the frequency acceptable to your goals.
I have the above in a SQL Agent job, so it's essentially a one-click process, but my database is SQL Server on-prem. I also use a schema comparison tool called SQL Examiner to automate generating the release scripts I need, but I'm not sure whether it's applicable to Azure SQL Database.
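As a hedged illustration of the optional post-refresh step above, a STAGING-specific script might look like this (the table and column names are hypothetical placeholders, not from any real schema):

```sql
-- Example post-refresh script for STAGING: obfuscate customer emails
-- (dbo.Customers and dbo.AppSettings are placeholder names)
UPDATE dbo.Customers
SET    Email = CONCAT('user', CustomerId, '@staging.example.com');

-- Mark the environment so it is obvious this copy is not PROD
UPDATE dbo.AppSettings
SET    SettingValue = 'STAGING'
WHERE  SettingName = 'EnvironmentName';
```

Running this as the last step of the refresh job means a restored PROD backup can never be mistaken for the live environment.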

Automating / scripting TFS database refresh

I am attempting to refresh a TFS 2015 test environment on a regular basis. The TFS production environment consists of three servers, and the test environment has the exact same layout (separate App, Build, and SQL servers; no SharePoint involved).
TFS version : 14.0.23128.0 (Tfs2015)
SQL Server 2012 SP2 CU-1
OS : Windows 2008 R2 Enterprise
Seeing that the refresh will occur regularly, and to minimize downtime, I want to automate / script the process to be executed at night.
Getting a production backup from SQL Server seems to be no problem, as this can be scheduled from within the TFS Admin Console (Scheduled Backups).
At this stage I am trying to figure out how the "scheduled restore" will work. I quickly ran into trouble trying to stop all collections before the restore starts on the test environment...
My question: Is such an automated style of TFS backup and restore doable, and is there by any chance a product out there that supports this out of the box?
You can set up an automated refresh for any MSSQL database. I would do it using a SQL Agent job. The RESTORE command should be preceded by the following statement to ensure all connections are terminated; otherwise the restore will fail:
ALTER DATABASE TFS_POC
SET RESTRICTED_USER
WITH ROLLBACK IMMEDIATE
GO
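Picking up after that statement, the rest of the job body might look like the following sketch (the backup path is an assumption; adjust to wherever your scheduled backups land):

```sql
-- With all connections rolled back, restore the latest backup over the test DB
RESTORE DATABASE TFS_POC
FROM DISK = N'\\restoreshare\Tfs_Collection.bak'
WITH REPLACE, RECOVERY;

-- Return the database to normal multi-user access
ALTER DATABASE TFS_POC SET MULTI_USER;
```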
There is no way to have a totally automated TFS restore: whether you are using TFSBackup.exe or Restore Databases in Scheduled Backups, selecting your database backup is required. Also, stopping the services is necessary, as it helps protect against data loss or corruption during the restoration process.
To provide a bit of feedback on the issue.
The requirements were fairly simple in our case in the sense that we only wanted the version control databases restored.
This simplified things a bit as restoring functionality such as reporting, builds, sharepoint etc. were left out of scope.
In the end I did succeed in getting the process automated to an extent, by making use of an amalgamation of SQL Server stored procedures, PowerShell, and some standard TFS commands.
The first step was to get the production Tfs databases backed up.
This was achieved by making use of SQL Server stored procedures.
Note: You must back up all of the databases to the same time stamp to help ensure against data loss!
https://www.visualstudio.com/en-us/docs/setup-admin/tfs/admin/backup/backup-db-architecture
These database backups are then copied to a shared restore location where the test environment’s Tfs SQL Server can access them during the restore process.
This was also done via some SQL stored procedures.
These steps are scheduled within SQL Server to execute nightly, with the previous day’s backup in the shared location being overwritten each time.
For the orchestration of the restore actions involved, TeamCity was used.
When a restore is requested you can initiate (or schedule) it via TeamCity.
The first step will be to stop all the test environment’s Tfs collections.
This was done by using the TfsServiceControl.exe - https://www.visualstudio.com/en-us/docs/setup-admin/tfs/command-line/tfsservicecontrol-cmd
Next we will execute some stored procedures on the test environment’s SQL Server that will start the DB restore process of the test environment’s Tfs DB’s.
The next steps will prepare the Tfs databases to run in the test environment and among other things, clear away the production values that are embedded in them.
This was done by a PowerShell script that in essence just calls the commands that will be run when a manual restore is done. Most of the commands reside within the TfsConfig command.
https://www.visualstudio.com/en-us/docs/setup-admin/tfs/commandline/tfsconfig-cmd
TfsConfig PrepareClone
TfsConfig ChangeServerID
TfsConfig RemapDBs
TfsConfig Accounts /ResetOwner
TfsConfig Accounts /add
TfsConfig registerDB
TfsConfig ConfigureMail
TfsConfig RebuildWarehouse
Clearing cache
After these steps were completed the test environment’s Tfs collections are started.
This is done by using the TfsServiceControl.exe - https://www.visualstudio.com/en-us/docs/setup-admin/tfs/command-line/tfsservicecontrol-cmd
It is now possible to request or schedule a restore. A bit primitive, but so far it works fine.

Database mirroring for report database

Our production database is under heavy load.
So we have decided to set up a second SQL Server running a copy of the production database that doesn't need to be 100% up to date with production.
After searching, I found that asynchronous mirroring of the production database might do the trick.
The mirror database would be read-only, used for reports and such.
So I managed to set it up, but found that I can't read any data from the mirror database because it is in recovery mode.
Now I would like to know whether my problem is solvable with mirroring, or whether we should use an alternative?
Thanks to Blim, we have decided on transactional replication. It works great on our development database (so it should in production too).
Step by step article: http://www.sql-server-performance.com/2010/transactional-replication-2008-r2/
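For orientation, the core of a transactional replication setup on the publisher boils down to a handful of system stored procedures; the sketch below is a heavily simplified, hedged outline (database, publication, article, and server names are all assumptions, and real setups need a Distributor configured first):

```sql
-- On the publisher: enable the database for publication
EXEC sp_replicationdboption
     @dbname = N'ProdDB', @optname = N'publish', @value = N'true';

-- Create a transactional publication (name is an assumption)
EXEC sp_addpublication
     @publication = N'ProdReports', @status = N'active';

-- Add an article for each table that should flow to the report server
EXEC sp_addarticle
     @publication = N'ProdReports',
     @article = N'Orders',
     @source_object = N'Orders';

-- Point a push subscription at the report server
EXEC sp_addsubscription
     @publication = N'ProdReports',
     @subscriber = N'REPORTSRV',
     @destination_db = N'ProdDB_Reports';
```

The linked article walks through the full wizard-driven version of the same steps.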

Best SQL Server 2005 database transfer method

We currently have Live and Dev environments in our offices; at regular intervals we need to copy the live DB to Dev to make sure the data is up to date for the dev team to work with.
However, the live DB is becoming very difficult to manage as it has almost hit 100 GB. We currently run a DB backup, copy the file to the other server, and restore it; however, this is becoming a major headache that can take upwards of 4-5 hours.
Does anyone have any good recommendations for how we can move the DB in a more efficient manner?
We are using MS SQL Server 2005 Standard Edition.
The best way to update your dev server from production is to implement a log shipping strategy. Perform a transaction log (incremental) backup of your production database daily and place it in a location where the development server can see it. Then, once a week, take all of the log backups (there should be five) and roll the development database forward so that it matches production.
The process can be automated with the SQL Server tools out there if you want, or you can write a little program that generates the restore scripts for you from the file names in the directory where you put the log backups. After you do the operation a few times and see the T-SQL that SQL Server generates for you each time, you will get a good idea of how to write the script-generator utility. You can even automate the restore on your dev box with the same utility: just connect to the dev server, run the scripts it generates, and then schedule the utility itself. Most programmers could whip up this utility in a day or two at best, as long as they have a decent understanding of SQL Server and T-SQL.
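The script-generator idea above can be sketched in a few lines; this is a minimal illustration, assuming a file-naming convention with an embedded date stamp (the names `ProdDB_log_YYYYMMDD.trn` and the database names are placeholders, not from the original setup):

```python
import re

def generate_restore_script(db_name, backup_files):
    """Generate T-SQL that applies a batch of log backups in order.

    backup_files: names like 'ProdDB_log_20240105.trn'; the embedded
    8-digit date stamp determines the restore order. (The naming
    convention is an assumption -- adjust the regex to match yours.)
    """
    def stamp(name):
        m = re.search(r"(\d{8})", name)
        if not m:
            raise ValueError(f"no date stamp in {name!r}")
        return m.group(1)

    lines = []
    for f in sorted(backup_files, key=stamp):
        # NORECOVERY keeps the DB restoring so the next log can apply
        lines.append(
            f"RESTORE LOG [{db_name}] FROM DISK = N'{f}' WITH NORECOVERY;"
        )
    # Bring the database online after the last log is applied
    lines.append(f"RESTORE DATABASE [{db_name}] WITH RECOVERY;")
    return "\n".join(lines)
```

Point it at the directory listing of your backup share, pipe the output to sqlcmd against the dev server, and the weekly roll-forward becomes a single scheduled task.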
You have other options as well, but this one would probably solve most of your issues:
You get incremental (log) backups of your production database in addition to whatever full backups you may or may not already do.
The utility you write will save time and automate the process; all you have to do is check whether it succeeded, and you can have the utility email you the success/failure. If you are cloud-based, use an Amazon tool for email; if you are Azure-based, use sendgrid.com.
Your time spent producing the utility is not great.

Restart log shipping when out of sync

The scenario: a secondary database server is, for one reason or another, out of sync, or is suspected of not being in sync. Someone has brought the secondary databases online by mistake, or some other mishap has occurred. If you now want to make sure they are set back on track, how do you do that? Preferably swiftly, and for many databases at once.
When you set up log shipping between two servers using the wizard, it takes care of the initial backup, the copying of the backup file, and the initial restore.
If I have to redo that, I have to disable/enable and redo the log shipping and fill in all the parameters again. Is there another way? Can I use the sqllogship application?
Is there something like: "C:\Program Files\Microsoft SQL Server\100\Tools\Binn\sqllogship.exe" -Restart -server SQLServ\PROD2
Or is there something that could be done easily with powershell and SQL Server Management Objects - SMO?
I want to use all the parameters that are already in tables like log_shipping_secondary.
I have not found any scripts for doing this. I looked at the script generated when I used the wizard, but it does not contain the initial backup and copy. I can write my own script; I am just afraid someone will say: why did you not just run $smoLogShipping.Redo?
If you bring a standby database online (i.e. restore it WITH RECOVERY), this will break log shipping. The only way to re-establish log shipping is to restore the standby database from a full backup of the source again, using NORECOVERY or STANDBY mode.
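In T-SQL, re-seeding the secondary looks roughly like this (a sketch only; database names, share paths, and the undo-file location are assumptions):

```sql
-- On the secondary: re-seed from a full backup of the source, leaving the
-- database in STANDBY (read-only, but still able to accept log restores)
RESTORE DATABASE SalesDB
FROM DISK = N'\\share\SalesDB_full.bak'
WITH REPLACE,
     STANDBY = N'L:\Standby\SalesDB_undo.dat';

-- Subsequent log backups shipped from the primary can then be applied again
RESTORE LOG SalesDB
FROM DISK = N'\\share\SalesDB_log1.trn'
WITH STANDBY = N'L:\Standby\SalesDB_undo.dat';
```

Once the database is back in STANDBY mode, the existing log shipping jobs can resume applying logs from where the new full backup left off.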
I do not know of any community-supported script to do what you ask, but it can be scripted easily enough. The GUI can handle most of the process; you would then just need to tweak it to be parameterized and customized to the workflow that you are after. The link below gives an example of what I'm talking about.
Scripting Log Shipping Automation
