SQL Database Backup with minimum storage size

I want to take a full backup of a SQL Server DB, but the concern is that the other PC where I want to restore it does not have enough disk space. I tried zipping the .bak file to reduce the size, but even that doesn't help. Is there a way to take a FULL backup in SQL Server with minimum storage usage? I know we can take a differential backup, but that's not an option, as I am restoring the database to the new PC for the very first time. I want to know of any configurable setting, while taking the backup, that might help me reduce the size of the backup file. Any help is appreciated.
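One setting that genuinely helps here, assuming an edition that supports it (Enterprise, or Standard from SQL Server 2008 R2 onward), is native backup compression. A minimal sketch, with a hypothetical database name and path:

BACKUP DATABASE MyDatabase
TO DISK = 'D:\Backups\MyDatabase.bak'
WITH COMPRESSION, INIT, STATS = 10;  -- INIT overwrites any existing file; STATS prints progress

Note that this shrinks the backup file itself, not the restored database: on restore, the data and log files are recreated at their original sizes, so the target PC still needs room for those.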

Related

How the unallocated space in SQL Server affects copying to another SQL Server

I am a newbie in SQL Server, and I have a task to move a whole SQL Server database to another server.
I am trying to estimate how much space I need on the new SQL Server.
I ran EXEC sp_spaceused and looked at the output: the database is using ~122 GB (reserved), but the total database size on disk (mdf + ldf) is ~1.8 TB.
Does that mean that when I copy the database from the existing SQL Server to a new one, I will need ~1.8 TB on the new server?
I am thinking about creating a backup and copying the backup to the new server. How does the backup take the unallocated space into account? Does the backup size end up closer to the reserved size or to the database_size? I understand this is without taking backup compression into account, which will improve the file size.
Thanks for the help.
The backup file will be much smaller than 1.8TB, since unallocated pages are not backed up. But the log and data files themselves will be restored to an identical size, so you will need 1.8TB on the target in order to restore the database in its current state.
Did you check to see if your log file is large due to some uncontrolled growth that happened at some point and is maybe no longer necessary? If this is where all your size is, it's quite possible you can fix this before the move. Make sure you have a full backup and take at least one log backup, then use DBCC SHRINKFILE to bring the log file back to a reasonable size, especially if the growth was caused by either a one-time abnormal event or prolonged log backup neglect that has since been addressed.
I really don't recommend copying/moving the mdf/ldf files, or using the SSMS UI to "do a copy," since you have much greater control over what you're actually doing by using proper BACKUP DATABASE and RESTORE DATABASE commands, as sketched below.
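For reference, the basic shape of that, with hypothetical names and paths; the MOVE clauses are only needed when the file layout differs on the target server, and the logical names come from RESTORE FILELISTONLY:

-- on the source server
BACKUP DATABASE MyDB TO DISK = 'D:\Backups\MyDB.bak' WITH INIT;

-- on the target server
RESTORE DATABASE MyDB
FROM DISK = 'D:\Backups\MyDB.bak'
WITH MOVE 'MyDB' TO 'E:\Data\MyDB.mdf',
     MOVE 'MyDB_log' TO 'E:\Logs\MyDB_log.ldf',
     RECOVERY;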
How do I verify how much of the log data is being used?
If you're taking regular log backups (or are in simple recovery), it should usually be a very small % of the file. DBCC SQLPERF(LogSpace); will tell you % in use for all log files.
To minimize the space the log will require in the backup file itself (see the sketch below):
if in full recovery, back up the log first;
if in simple recovery, run a couple of CHECKPOINT; commands first.
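Putting those pieces together, a minimal sketch, assuming full recovery and a hypothetical logical log file name MyDB_log (check sys.database_files for yours):

USE MyDB;
DBCC SQLPERF(LogSpace);  -- shows % of each log file actually in use
BACKUP LOG MyDB TO DISK = 'D:\Backups\MyDB_log.trn';  -- frees the inactive portion of the log
DBCC SHRINKFILE (MyDB_log, 4096);  -- shrink the log back to 4 GB (target size is in MB)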

SQL Server Copy database to different server

I'm trying to take a copy of a database onto another server. I usually make a backup, copy the backup file to the other server, and restore it there. But the backup is 90GB, and the space left after copying the backup to the destination folder is only 26GB. As you probably understand by now, I'm not able to restore the database, as there isn't enough space. So my question is: is it possible to restore the database by replacing the backup file? Any other suggestions? Increasing the disk space is not an option, as this is just a testing server and the space will be enough after restoring. Thank you.
Depending on your SQL Server version, have you tried enabling compression during backup? You would be surprised by how small the backup file can get after compression. Also, if your database is set to SIMPLE recovery, you could look at reducing the log file size before you back up.
You can find some steps on how to enable compression during backup here:
https://sqlbak.com/blog/how-to-configure-backup-compression/
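If you don't want to remember WITH COMPRESSION on every command, compression can also be made the server-wide default (SQL Server 2008 or later; a sketch, assuming you have permission to change server-level options):

EXEC sp_configure 'backup compression default', 1;
RECONFIGURE;

After that, every BACKUP statement compresses unless you explicitly specify NO_COMPRESSION.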

Restored database taking massive amount of space

I restored a database from a 1.5GB .bak file. Everything works fine, except the restored database now takes 64GB of space.
I've heard about shrinking databases and log files, but how do I find out what it is that takes so much space, and what I can "shrink" without changing the data itself? I need this production backup data in my development environment as it is.
I don't need full logs in the development environment where I'm doing the restore. How do I find out whether it is the data or the logs that take more space?
I'm using SQL Server Management Studio 2017.
Maybe logs?
I suggest you analyze whether this fits your case. To make the backups much smaller:
BACKUP DATABASE XXXXX TO DISK = 'C:\XXX.bak' WITH COPY_ONLY;
You can also change the recovery model from Full (the default) to Simple after the restore.
Then shrink it (see the sketch below).
I'm honestly not sure all of those steps are necessary, but that works for me to reduce space. Maybe shrinking before changing the recovery model is better, or maybe some of these are not best practices.
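Spelled out, that sequence might look like this; the database and logical file names here are hypothetical, and switching to SIMPLE first lets the log be truncated before you shrink it:

ALTER DATABASE MyDevDB SET RECOVERY SIMPLE;
USE MyDevDB;
-- the logical log file name comes from: SELECT name FROM sys.database_files WHERE type_desc = 'LOG';
DBCC SHRINKFILE (MyDevDB_log, 1024);  -- target size is in MB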
You can see the size of each file inside a backup by using
RESTORE FILELISTONLY FROM DISK = 'here_the_full_path_to_your_backup_including_file_name';
so you can plan how much space it will need.
How do I find out whether it is the data or the logs that take more space?
Please update your question with the results of
use MyDB;
exec sp_spaceused;
Your question: "How do I find out whether it is the data or the logs that take more space?"
Answer: here is one way. Right-click your database in SQL Server Management Studio, then click Reports => Standard Reports => Disk Usage.
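As a scripted alternative to that report, this query (run in the database in question) breaks the size down per file; sys.database_files stores size in 8 KB pages:

USE MyDB;
SELECT name, type_desc, size * 8 / 1024 AS size_mb  -- pages -> KB -> MB
FROM sys.database_files;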
If you don't understand the differences between the full and simple recovery models, I encourage you to do some reading. Also understand the consequences of shrinking files and auto-grow. Shrinking files won't cause you to lose committed data, but it will cause a performance hit later on if and when SQL Server needs to auto-grow the files.
If you don't need the full recovery model and aren't concerned about auto-grow, change it to simple or bulk-logged and then shrink the log file(s).
If you are not concerned about auto-grow, you can also shrink the data file(s).

Creating a local database (SQL Server) from a remote server for a local test environment

I have to create a script that will create the database locally (from a database which is on a server). The database is 20GB+ and gets bigger every day.
What is your advice on how to do that? I can generate a script with all database objects in SSMS, but how do I handle the data? A full script (one which includes the data) is not a good option. What about one full backup plus differential backups?
Run the generated scripts for all DB objects
Restore the full backup
Restore the differential backup
This work is required to make it possible to test the app locally, on my (or another dev's) machine. Thanks for any advice!
My advice is to take a full backup first and then differential backups afterward.
A differential backup is not independent; it must be based on the most recent full backup of the data, so there has to be a full backup as a base. A differential backup contains only the data that has changed since that differential base. Typically, differential backups are smaller and faster to create than full backups, and they also require less disk space to store.
Therefore, using differential backups can save space and speed up the restore process. At restore time, the full backup is restored first, followed by the most recent differential backup.
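As a sketch of the whole round trip (paths and names are hypothetical): every restore except the last one must use NORECOVERY so that further backups can still be applied on top.

-- on the source server
BACKUP DATABASE MyDB TO DISK = 'D:\Backups\MyDB_full.bak' WITH INIT;
-- later, and much smaller:
BACKUP DATABASE MyDB TO DISK = 'D:\Backups\MyDB_diff.bak' WITH DIFFERENTIAL, INIT;

-- on the local machine (add MOVE clauses if the file paths differ)
RESTORE DATABASE MyDB FROM DISK = 'C:\Backups\MyDB_full.bak' WITH NORECOVERY;
RESTORE DATABASE MyDB FROM DISK = 'C:\Backups\MyDB_diff.bak' WITH RECOVERY;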

Backup SQL Server while minimizing bandwidth

I want to implement an automated backup system for my site's SQL Server 2005 DB that will back up nightly to Amazon's S3 service. Since S3 charges for both storage space and bandwidth used, I would like to minimize the size of the files that I transfer. What is the best way to achieve this?
I should clarify that I'm not really talking about compression, which is pretty straightforward, but about backup strategies: whether to do differential backups all the time, whether I need to copy transaction logs, etc.
Differential backups will be smaller than full backups, of course. However, you should consider the restore side as well: you'll need your last full backup plus the most recent differential to perform the restore, which can add up to a lot of bandwidth/transfer time. One option is to perform a full backup weekly and differentials daily (or a similar schedule).
As for transaction logs, it depends on what granularity you're looking for when restoring your data. If restoring to the last full or differential backup is sufficient, then you don't need to worry about taking transaction log backups. If you need point-in-time recovery, then transaction log backups will be necessary (see the sketch below).
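If you do need that finer granularity, log backups require the full (or bulk-logged) recovery model and are typically small; a sketch with a hypothetical name and path:

BACKUP LOG MySiteDB TO DISK = 'D:\Backups\MySiteDB_log.trn';

Taken regularly between differentials, these are usually the cheapest files to transfer.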
Either use a commercial product to compress the backups, like Red Gate SQL Backup Pro, or just zip-compress them after you're done.
Write a batch or PowerShell script that finds the file(s) created in the past day and zips them up, then FTP them (or whatever transfer you use).
