Clarification on proper practices for backing up SQL Server databases? - sql-server

Recently I found myself needing to back up a client database running on SQL Server 2008 R2.
Normally I would do this by selecting the "Backup" task in SQL Server Management Studio, which produces a single, portable archive of the database. However, a co-worker said that some customers' standard backup practice is to simply create a copy of the .MDF and .LOG files instead. Another then spoke up recommending against such methods, stating that the only 'proper' procedure is to use the Backup task to produce a file as described above, and then back that file up instead. In the event of a problem, restoring this backup file and applying transaction logs allows you to restore service without losing data.
I agree with the recommendation of using the provided task in combination with transaction logs, but I wasn't entirely sure of myself so I kept my mouth shut. Let's be kind and assume that the .MDF/.LOG files were copied when SQL Server itself had been shut down cleanly and completely - is there actually a good reason to use the Backup task instead of copying the raw files, or are we mistaken?
I did some reading on MSDN and found that copying the files and restoring them can result in issues in some cases (see Limitations when using Xcopy deployment), but is that the only difference?

Backing up the transaction log will truncate it, whilst simply copying the files will not: this keeps the log from growing and helps with database performance.
Always use the backup task: this can also be scheduled.
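If it helps, here is a minimal sketch of the full-plus-log approach being recommended, in T-SQL (the database name and paths are made up; adjust for your environment):

    -- Full database backup (the same statement the SSMS Backup task generates)
    BACKUP DATABASE [ClientDb]
        TO DISK = N'D:\Backups\ClientDb_full.bak'
        WITH INIT, CHECKSUM;

    -- Transaction log backup, taken on a schedule between full backups
    -- (requires the FULL or BULK_LOGGED recovery model)
    BACKUP LOG [ClientDb]
        TO DISK = N'D:\Backups\ClientDb_log.trn'
        WITH CHECKSUM;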

Related

Transferring MDF/LDF files to target server

Background:
I have a medium-sized database (900GB) that needs to be copied onto another server (driven via code, not scheduled). Currently we take a backup (to .bak), move it to a staging server, and restore it to the target server. The target server does not have enough space to hold both the backup file and the restored database simultaneously, hence the staging server. These transfers (backup to staging, restore from staging) happen over SMB2. The staging server needs to go away due to business requirements, however. It is worth mentioning that the target server will be taken offline (and used offline) after the transfer, so I'm not sure the mirroring or replication options are valid.
I have identified two options. One is to back up the database on the primary server and open up firewall/SMB rules to serve the backup file to the target server over SMB ("RESTORE FROM \\x.x.x.x\blah\db.bak"). Security isn't a fan, though.
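For reference, that first option would boil down to something like the following on the target server (the logical file names and local paths here are placeholders, not taken from the real database):

    -- Restore directly from the UNC share exposed by the primary server;
    -- the SQL Server service account needs read access to the share.
    RESTORE DATABASE [BigDb]
        FROM DISK = N'\\x.x.x.x\blah\db.bak'
        WITH MOVE N'BigDb_Data' TO N'E:\Data\BigDb.mdf',
             MOVE N'BigDb_Log'  TO N'F:\Logs\BigDb_log.ldf',
             REPLACE;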
The ideal solution (and one that could easily be implemented in every other database I've worked with), is to quiesce the database and transfer the datafiles (in the case of ms-sql, mdf and ldf files). However, upon research I see there is no such functionality available out of the box. I know I can take the database offline to copy the mdf/ldf safely, but that's not an acceptable solution (database must remain online).
I have read LOTS of posts and Microsoft documentation regarding VSS / shadow copy, but I have also read lots of conflicting information about the reliability of using VSS/sqlwriter to copy the mdf/ldf file to the target server, and simply re-attaching the database.
I am looking for documentation or advice (or even backup software that can be programmatically driven via an API) to accomplish this goal of transferring the database without requiring a secondary holding place. Currently I'm researching how to drive this copying process with PowerShell, using VSS (vssadmin/vshadow from the SDK), but I'm not confident in what I'm reading, and it's not even clear to me whether VSS/sqlwriter is a supported method for copying online LDF/MDF files. Any advice is appreciated.
Thanks,

Make a SQL Server .bak file protected and known only within my application

I'm using VB.net 2013 and SQL Server 2008R2.
In my application I have 2 subs, Backup and Restore (created with SMO), that create a backup of the database and restore a backup file to SQL Server.
These are working, but I have a problem:
The .bak file is not protected, so someone can restore it to a SQL Server even without my program.
Is there anything I can do when creating and restoring the .bak file so that:
The file can be restored only within my application
When I do a restore, the application first checks whether the selected file was created by my application; otherwise an error is displayed and no restore is made.
Thank you!
Starting with SQL Server 2014, you can encrypt your backups. And prior to SQL Server 2012, you could protect the media set with a password (but it wasn't really good protection from my understanding). If you're worried about it, you're going to have to write your own backup/restore mechanism. Just make sure you don't miss anything! :)
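If moving to SQL Server 2014 or later is an option, an encrypted backup looks roughly like this (the certificate, database name, and path are hypothetical, and the certificate has to already exist in master with a database master key behind it):

    -- Backup encrypted with a server certificate; it cannot be restored
    -- on an instance that does not have this certificate.
    BACKUP DATABASE [AppDb]
        TO DISK = N'D:\Backups\AppDb.bak'
        WITH COMPRESSION,
             ENCRYPTION (ALGORITHM = AES_256, SERVER CERTIFICATE = AppBackupCert);

Since the restore requires the certificate, a backup made this way cannot be restored on an arbitrary SQL Server instance, which gets you part of the way toward what you're asking for.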
If you intend to use symmetric encryption, the password can easily be extracted from your app's binaries. The same is true for any asymmetric algorithm, since you need both the private and public keys in the same place (otherwise there would be no question to begin with).
In short - impossible to do the way you stated it, short of creating your own backup file format. And even if you do, where's the guarantee it will be as reliable and well-tested as Microsoft's?
A bit of advice for the future: never cross paths with DBAs, especially corporate ones. The first time your app fails to restore its backup (or is it absolutely bug-free?) or introduces corruption, you might learn about pneuma, soma and sarx a little more than you would like to...

Restart log shipping when out of sync

The scenario: a secondary log shipping server is, for one reason or another, out of sync, or is suspected of not being in sync. Someone has brought the secondary databases online by mistake, or some other mishap has occurred. If you now want to make sure that they are set back on track, how do you do that? Preferably swiftly and for many databases at once.
When you set up log shipping between two servers using the wizard, it takes care of the initial backup, the copying of the backup file, and the initial restore.
If I have to redo that, I have to disable/enable and redo the log shipping and fill in all the parameters again. Is there another way? Can I use the sqllogship application?
Is there a "C:\Program Files\Microsoft SQL Server\100\Tools\Binn\sqllogship.exe" -Restart -server SQLServ\PROD2?
Or is there something that could be done easily with PowerShell and SQL Server Management Objects (SMO)?
I want to use all the parameters that are already in tables like log_shipping_secondary.
I have not found any scripts for doing this. I looked at the script generated when I used the wizard, but it does not contain the initial backup and copy. I can write my own script; I am just afraid someone will say: why did you not just run $smoLogShipping.Redo?
If you bring a standby database online (i.e. restore it WITH RECOVERY), this will break the log shipping. The only way to re-establish log shipping is to restore the standby database from a full backup of the source again, using NORECOVERY or STANDBY mode.
I do not know of any community-supported script to do what you ask, but it can be scripted easily enough. The GUI can handle most of the process; you would then just need to tweak it to be parameterized and customized to the workflow that you are after. The link below gives an example of what I'm talking about.
Scripting Log Shipping Automation
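For reference, the restore that re-initializes the secondary boils down to something like this (the database name, share, and undo file path are placeholders):

    -- Leave the database restoring so subsequent log backups can be applied...
    RESTORE DATABASE [ShippedDb]
        FROM DISK = N'\\primary\backups\ShippedDb_full.bak'
        WITH NORECOVERY, REPLACE;

    -- ...or use STANDBY instead if the secondary must stay readable between restores:
    -- WITH STANDBY = N'D:\LogShip\ShippedDb_undo.tuf', REPLACE;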

SQL Server online backup with MozyPro

Anyone using MozyPro to back up SQL Server databases?
I'm concerned about the way it does the backup. It just copies the data files as they are, not using the BACKUP DATABASE command.
Is it safe?
MozyPro uses the Volume Shadow Copy Service (VSS) to create backups for SQL Server. SQL Server 2005 has been engineered so that VSS backups are consistent. So this is definitely a valid way to back up SQL Server databases.
Here is a white paper on how the SQL Server 2005 SQL Writer works with VSS.
Microsoft® SQL Server™ 2005 provides support for creating snapshots from SQL Server data using Volume Shadow Copy Service (VSS). This is accomplished by providing a VSS compliant writer (the SQL writer) so that a third-party backup application can use the VSS framework to back up database files. This paper describes the SQL writer component and its role in the VSS snapshot creation and restore process for SQL Server databases. It also captures details on how to configure and use the SQL writer to work with backup applications in the context of the VSS framework.
Here is the MozyPro manual (PDF), which describes how to restore SQL Server backups that were made using VSS.
That being said, if you don't trust this method, there is nothing stopping you from setting up a backup job and just having Mozy back up your *.bak files.
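For example, a nightly job along these lines leaves a native .bak on disk that Mozy can then pick up (database name and path are made up):

    BACKUP DATABASE [MyDb]
        TO DISK = N'D:\SQLBackups\MyDb.bak'
        WITH INIT, CHECKSUM;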
Judging by the hell I am currently going through with Mozy... NO NO NO!
The backups work, in theory, just not the restore part. Mozy's extreme incremental backup system results in restores that can take weeks, apparently. I'm still waiting despite talking to their top-level tech support; over 10 days have passed.
https://github.com/candera/hobocopy
WHY DOES HOBOCOPY USE THE VOLUME SHADOW SERVICE?
Because HoboCopy copies from a VSS snapshot, it is able to copy even files that are locked by some other program. Further, certain programs (such as SQL Server 2005) are VSS-aware, and will write their state to disk in a consistent state before the snapshot is taken, allowing a sort of "live backup". Files locked by VSS-unaware programs will still be copied in a "crash consistent" state (i.e. whatever happens to be on the disk). This is generally a lot better than not being able to copy the file at all.

Nightly importable or attachable copies of production database

We would like to make a nightly copy/backup/snapshot of a production database so that we can import it into the dev environment.
We don't want to log ship to the dev environment because it needs to be something we can reset, whenever we like, to the last copy taken of the production database.
We need to be able to clear certain logging and/or otherwise useless or heavy tables that would just bloat the copy.
We prefer the attach/detach method as opposed to something like the SQL Server Publishing Wizard because of how much faster an attach is than an import.
I should mention we only have SQL Server Standard, so some features won't be available.
What's the best way to do this?
MSDN
I'd say use those procedures inside a SQL Agent job (use master..xp_cmdshell to perform the copy).
You might want to put the big huge tables on their own partition and have this partition belong to a different filegroup. You would then back up and restore just the main filegroup.
You might also want to consider doing differential backups: say, a full backup every weekend and a differential every night. I haven't done filegroup backups, so I don't know if these work well together.
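As a rough sketch of that schedule, assuming a database called ProdDb and made-up paths:

    -- Weekend job: full backup
    BACKUP DATABASE [ProdDb]
        TO DISK = N'D:\Backups\ProdDb_full.bak'
        WITH INIT, CHECKSUM;

    -- Nightly job: differential backup (everything changed since the last full)
    BACKUP DATABASE [ProdDb]
        TO DISK = N'D:\Backups\ProdDb_diff.bak'
        WITH DIFFERENTIAL, INIT, CHECKSUM;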
I'm guessing that you are already doing regular backups of your production database? If you aren't, stop reading this reply and go set it up right now.
I'd recommend that you write a script that automatically runs, say once a day, that:
Drops your current test database.
Restores your current production backup to your test environment.
You can write a simple script to do this and execute it using the isql.exe command line tool.
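A rough T-SQL sketch of those two steps, with hypothetical database, path, and logical file names (wrap it in whatever command-line tool you use):

    -- 1. Drop the current test database if it exists
    IF DB_ID(N'MyDb_Test') IS NOT NULL
    BEGIN
        ALTER DATABASE [MyDb_Test] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
        DROP DATABASE [MyDb_Test];
    END;

    -- 2. Restore the latest production backup over it
    RESTORE DATABASE [MyDb_Test]
        FROM DISK = N'D:\Backups\MyDb_Prod.bak'
        WITH MOVE N'MyDb_Data' TO N'E:\Data\MyDb_Test.mdf',
             MOVE N'MyDb_Log'  TO N'F:\Logs\MyDb_Test_log.ldf',
             REPLACE;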
