SQL Server online backup with MozyPro

Anyone using MozyPro to backup SQL Server databases?
I'm concerned about the way it does the backup: it just copies the data files as they are, rather than using the BACKUP DATABASE command.
Is it safe?

MozyPro uses the Volume Shadow Copy Service (VSS) to create backups for SQL Server. SQL Server 2005 has been engineered so that VSS backups are consistent, so this is definitely a valid way to back up SQL Server databases.
Here is a white paper on how the SQL Server 2005 SQL Writer works with VSS.
Microsoft® SQL Server™ 2005 provides support for creating snapshots from SQL Server data using Volume Shadow Copy Service (VSS). This is accomplished by providing a VSS compliant writer (the SQL writer) so that a third-party backup application can use the VSS framework to back up database files. This paper describes the SQL writer component and its role in the VSS snapshot creation and restore process for SQL Server databases. It also captures details on how to configure and use the SQL writer to work with backup applications in the context of the VSS framework.
Here is the MozyPro manual (PDF), which describes how to restore SQL Server backups that were made using VSS.
That being said, if you don't trust this method, there is nothing stopping you from setting up a backup job and just having Mozy backup your *.bak files.
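If you go that route, the job itself only needs a plain native backup. A minimal sketch (the database name and target path are placeholders, not anything Mozy-specific):

    -- Hypothetical full backup to a .bak file that a file-level backup
    -- tool such as Mozy can then pick up on its next pass.
    BACKUP DATABASE [YourDatabase]
    TO DISK = N'D:\Backups\YourDatabase_Full.bak'
    WITH INIT,       -- overwrite the previous file in this simple sketch
         CHECKSUM,   -- verify page checksums while writing the backup
         STATS = 10; -- progress messages every 10 percent

Schedule it with SQL Server Agent (or Task Scheduler plus sqlcmd on Express) so a fresh .bak is always waiting when the file-level backup runs.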

Judging by the hell I am currently going through with Mozy.. NO NO NO!
The backups work, in theory; it's the restore part that doesn't. Mozy's extreme incremental backup system results in restores that can take weeks, apparently. I'm still waiting despite talking to their top-level tech support; over 10 days have passed.

https://github.com/candera/hobocopy
WHY DOES HOBOCOPY USE THE VOLUME SHADOW SERVICE?
Because HoboCopy copies from a VSS snapshot, it is able to copy even files that are locked by some other program. Further, certain programs (such as SQL Server 2005) are VSS-aware, and will write their state to disk in a consistent state before the snapshot is taken, allowing a sort of "live backup". Files locked by VSS-unaware programs will still be copied in a "crash consistent" state (i.e. whatever happens to be on the disk). This is generally a lot better than not being able to copy the file at all.

Related

Azure VM with SQL Server database - backup and file recovery

I have an Azure VM running Windows (Windows Server 2008 R2 Datacenter). It has Microsoft SQL Server 2008 R2 running on it (version 10.50.6549).
The Azure VM has backups running according to a policy - and I can see in the backups blade for the VM that they are running nightly.
If I have an issue with the SQL Server, and need to roll back to a prior version of the database, will the File Recovery option from the VM backup be adequate?
Or should I also be running SQL Server backups via a maintenance plan on the server on the VM?
If I have an issue with the SQL Server, and need to roll back to a prior version of the database, will the File Recovery option from the VM backup be adequate?
Maybe. VM backups don't always give you consistent SQL backups. They usually work, but not always. If you have everything set up just right and get consistent VM backups, it might be ok-- but you are running a fairly old OS on that VM, so I'd be nervous. Very nervous. If the data is really important to you, then you should back up the data, not just the VM. Sometimes you want to restore just the data to another VM to investigate, not the entire server. I also hope you have more than just "last night's VM backup" at any given time. Sometimes bad things happen on Friday and you don't notice until Monday.
Or should I also be running SQL backups via a maintenance plan on the server on the VM?
Yes, you should be running SQL backups if your data is important. If your data is really important (you don't want to lose half a day of it), you should be doing full backups periodically (e.g. nightly), transaction log backups many times per hour, and keeping a few weeks' worth of backups in rotation. If your data is super-important (you don't want to lose more than a few seconds), you should be mirroring it over to another database server in near real time (asynchronously). If it is critical (you don't want to lose any data), then you want to mirror to another server in real time (synchronously).
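As a rough sketch of what those two jobs boil down to in T-SQL (database name and paths are placeholders; the actual scheduling would live in SQL Server Agent or a maintenance plan):

    -- Hypothetical nightly full backup.
    BACKUP DATABASE [YourDatabase]
    TO DISK = N'E:\SQLBackups\YourDatabase_Full.bak'
    WITH COMPRESSION, CHECKSUM;

    -- Hypothetical transaction log backup, run every few minutes.
    -- Requires the database to use the FULL recovery model.
    BACKUP LOG [YourDatabase]
    TO DISK = N'E:\SQLBackups\YourDatabase_Log.trn'
    WITH CHECKSUM;

A real job would put a timestamp in each file name and prune old files to keep the rotation window.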
Of course, if you are already running in Azure and don't have a DBA, managing a database is a lot easier, safer, more available, and generally cheaper if you use Azure SQL rather than trying to manage your own SQL Server instance in a VM-- oh yeah, and backups are handled for you, with millisecond point-in-time recovery for up to 45 days-- and they handle the mirroring for you too. If you want to mirror to another region across the country, you do have to pay extra for that, though.

SharePoint 2013 Backup and Restore from database side only

Does SharePoint 2013 restore only from Database?
I have a scheduled script in MS SQL Server to run all database backups daily, and my SharePoint site also requires a daily differential/weekly full backup, which usually happens in Central Administration. I am aware that multiple backup jobs running would break the log chain in this case.
If I stop doing backups in Central Administration and let the database do the backups only, would I still be able to restore my SharePoint site (contents and configuration)?
Does SharePoint 2013 restore only from Database?
The short answer is no. A full-fidelity SharePoint farm backup is mostly databases, but there is also configuration information and data that is stored outside of the databases. The Central Admin backup facility (as well as the Backup-SPFarm PowerShell command) initiates SQL backups as well as backups of all the stuff that isn't in SQL. That is the only point-and-click (or type a single command) solution.
Could you get away with only having some of the databases to recreate your environment? Sure, but then you'd have to have a documented and tested (and ideally automated) process for recreating the farm from the databases.

Clarification on proper practices for backing up SQL Server databases?

Recently I found myself needing to back up a client database running on SQL Server 2008 R2.
Normally I would do this by selecting the "Backup" task in SQL Server Management Studio, which produces a single, portable archive of the database. However, a co-worker said that some customers' standard backup practice is to simply create a copy of the .MDF and .LOG files instead. Another then spoke up recommending against such methods, stating that the only 'proper' procedure is to use the Backup task to produce a file as described above, and back that up instead. In the event of a problem, restoring this backup file and applying transaction logs allows you to restore service without losing data.
I agree with the recommendation of using the provided task in combination with transaction logs, but I wasn't entirely sure of myself so I kept my mouth shut. Let's be kind and assume that the .MDF/.LOG files were copied when SQL Server itself had been shut down cleanly and completely - is there actually a good reason to use the Backup task instead of copying the raw files, or are we mistaken?
I did some reading on MSDN and found that copying the files and restoring them can result in issues in some cases (see Limitations when using Xcopy deployment), but is that the only difference?
Using the backup task (specifically, taking transaction log backups) also truncates the log, whereas simply copying the files does not; an ever-growing log file hurts disk usage and database performance.
Always use the backup task: it can also be scheduled.
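To illustrate the restore-plus-logs sequence the question describes, here is a hedged T-SQL sketch (database name, paths and the timestamp are placeholders):

    -- Restore the most recent full backup but leave the database
    -- non-operational so that log backups can still be applied.
    RESTORE DATABASE [ClientDb]
    FROM DISK = N'D:\Backups\ClientDb_Full.bak'
    WITH NORECOVERY;

    -- Apply a transaction log backup, optionally stopping at a
    -- specific moment, then bring the database online.
    RESTORE LOG [ClientDb]
    FROM DISK = N'D:\Backups\ClientDb_Log1.trn'
    WITH STOPAT = '2012-06-01 14:30:00', RECOVERY;

Copying raw .MDF/.LDF files gives you none of this: no log chain, no point-in-time recovery, and no guarantee the copy is usable unless the service was cleanly stopped.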

Some data loss with backup and restore in SQL Server 2008

We developed and delivered a database-driven application about a year and a half ago. During this time, the customer has backed up the database, re-installed the software and restored the database a few times. They have also sent us their database a few times so we could update the DB structure. They have used the built-in backup and restore capability of the software, which uses SMO (SQL Server Management Objects) to perform the backup/restore operations.
They now claim that some of their data has been lost during recent backup and restores.
Is such a claim plausible, or is it just their data-entry fault?
I have checked their DB manually and the data they say they added was not there.
Has there been any report of an SMO bug like this?
Backup and restore are critical facilities in SQL Server. They always, always, always back up the database consistently (point-in-time). You cannot back up or restore in a way that loses part of the data or introduces other inconsistencies; SQL Server prevents you from doing that with an error message.
(The myth that data loss might be possible comes from other RDBMSs like MySQL where backups were a challenge a few years ago).
The problem is elsewhere: In the DML executed on the database.
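If the customer still doubts the backup files themselves, a quick sanity check is cheap (a sketch; the path and database name are placeholders):

    -- Confirm the backup file is readable and complete (does not restore it).
    RESTORE VERIFYONLY
    FROM DISK = N'D:\Backups\CustomerDb.bak';

    -- After restoring a copy under another name, check its integrity.
    DBCC CHECKDB (N'CustomerDb_Copy') WITH NO_INFOMSGS;

If both pass and the rows still aren't there, the data was never committed in the first place.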

Advice needed: cold backup for SQL Server 2008 Express?

What are my options for achieving a cold backup server for an SQL Server Express instance running a single database?
I have an SQL Server 2008 Express instance in production that currently represents a single point of failure for my application. I have a second physical box sitting at the installation that is currently doing nothing. I want to somehow replicate my database in near real time (a little bit of data loss is acceptable) to the second box. The database is very small and resources are utilized very lightly.
In the case that the production server dies, I would manually reconfigure my application to point to the backup server instead.
Although Express doesn't support log shipping, I am thinking that I could manually script a poor man's version of it, using batch files to take the log backups, copy them across the network, and apply them to the second server at 5-minute intervals.
Does anyone have any advice on whether this is technically achievable, or if there is a better way to do what I am trying to do?
Note that I want to avoid having to pay for the full version of SQL Server and configure mirroring, as I think it is overkill for this application. I understand that other DB platforms may present suitable options (e.g. a MySQL cluster), but for the purposes of this discussion, let's assume we have to stick to SQL Server.
I would also advise script-based log shipping; after all, this is how log shipping started. All you need is a time-based scheduler to run the scripts (i.e. Task Scheduler) and a smart(er) file copy (robocopy).
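The two halves of such a script could be as small as this sketch (database name and paths are placeholders; the copy step in between is robocopy or similar, and the standby was seeded from a full backup restored WITH NORECOVERY):

    -- On the primary, every 5 minutes via Task Scheduler + sqlcmd:
    BACKUP LOG [AppDb]
    TO DISK = N'C:\LogShip\AppDb_Log.trn'
    WITH INIT;  -- a real script would timestamp each file instead

    -- On the standby, after the file has been copied across;
    -- NORECOVERY keeps the database ready to accept the next log.
    RESTORE LOG [AppDb]
    FROM DISK = N'D:\LogShip\AppDb_Log.trn'
    WITH NORECOVERY;

At failover time you restore the last log WITH RECOVERY and repoint the application.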
