Best SQL Server 2005 database transfer method

We currently have a Live and a Dev environment in our offices, and at regular intervals we need to move the live DB to Dev so the dev team has up-to-date data to work with.
However, the live DB is becoming very difficult to manage as it has almost hit 100 GB. We currently run a DB backup, copy the file to the other server, and restore it, but this has become a major headache that can take upwards of 4-5 hours.
Does anyone have any good recommendations for how we can move the DB in a more efficient manner?
We are using MS SQL Server 2005 Standard Edition.

The best way to update your dev server from production is to implement a log shipping strategy. Perform an incremental (transaction log) backup of your production database daily and place it in a location the development server can see. Then, once a week, take all of the incremental backups (there should be five) and roll the development database forward so it matches production. The process can be automated with SQL Server tools if you want, or you can write a small program that generates the scripts for you from the file names in the directory where you put the log backups. After you do the operation a few times and see the TSQL that SQL Server generates for each step, you will get a good idea of how to write the script-generator utility. You can even automate the restore process on your dev box with the same utility: connect to the dev server, run the scripts it generates, and then schedule the utility itself. Most programmers could whip up this utility in a day or two at most, as long as they have a decent understanding of SQL Server and TSQL. (A rough sketch of the TSQL involved appears after the list below.)
You have other options as well, but this one would probably solve most of your issues:
- You get incremental backups of your production database in addition to whatever full backups you may or may not already take.
- The utility you write will save time and automate the process; all you have to do is check whether it succeeded, and you can have the utility email you the success/failure. If you are cloud based, use an Amazon email service; if you are Azure based, use sendgrid.com.
- The time needed to produce the utility is not great.
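As a rough illustration of the backup-and-restore cycle described above, here is a minimal TSQL sketch; the database name, logical file names, share, and paths are all placeholders you would swap for your own:

    -- On the production server: nightly transaction log backup to a share
    -- the dev box can read. Database name and paths are placeholders.
    BACKUP LOG LiveDB
        TO DISK = N'\\devserver\logship\LiveDB_day1.trn'
        WITH INIT, CHECKSUM;

    -- On the dev server: restore the last full backup without recovering,
    -- then apply each log backup in order, recovering only after the last one.
    RESTORE DATABASE LiveDB_Dev
        FROM DISK = N'\\devserver\logship\LiveDB_full.bak'
        WITH NORECOVERY, REPLACE,
             MOVE N'LiveDB_Data' TO N'D:\Data\LiveDB_Dev.mdf',
             MOVE N'LiveDB_Log'  TO N'D:\Data\LiveDB_Dev.ldf';

    RESTORE LOG LiveDB_Dev
        FROM DISK = N'\\devserver\logship\LiveDB_day1.trn'
        WITH NORECOVERY;

    -- ...repeat RESTORE LOG for each remaining file, then bring the database online:
    RESTORE DATABASE LiveDB_Dev WITH RECOVERY;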

Related

Moving more than 200 SQL Server 2008 Databases to another machine

I have a server running SQL Server 2008 R2 SP1 with more than 200 online databases, each with its own specific logins, and I want to move all of these databases to another machine running the same software, SQL Server 2008 R2 SP1.
I can afford downtime of up to 8 hours, so I'm not looking for a complex solution like database mirroring or log shipping.
I have tried backup & restore; it works fine but troubles me in two ways:
1 - I couldn't transfer the logins to the new server, so all of my users would have to recreate their users, which will make them angry. (I tried to script all of the logins from the security section, but since I realized the hash algorithm between the two systems may be different, I cannot transfer my logins this way.)
2 - I have to back up & restore each database one by one, which is very time consuming.
I have also tried the Copy Database Wizard, but I encountered the following error message at the final stage:
"SQL Server schedule job" job failed. The job was invoked by user sa. The step to run was step 1.
Since my source server is running Windows Server 2003, I cannot use the PowerShell v3 solution.
I have done many searches, but all I found were solutions for transferring one database to another server.
Since I'm looking for a solution to transfer databases in bulk, this situation is hair-pulling and difficult, so I would be very appreciative of an easy and practical solution.
For moving logins between servers, see https://support.microsoft.com/en-us/kb/918992.
Then, on migration day, back up and restore all databases using a pre-built restore script if the DB sizes are small; later restore the differential backups and bring them online. If size is an issue, implement log shipping or mirroring.
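To avoid hand-writing 200 backup and restore commands, you can generate them from sys.databases. A rough sketch (the share path is a placeholder, and the generated RESTORE statements assume the same file layout exists on the target; add MOVE clauses if it does not):

    -- Generate one BACKUP statement per user database on the source server.
    -- The UNC path is a placeholder; save the output and run it as a script.
    SELECT 'BACKUP DATABASE [' + name + '] '
         + 'TO DISK = N''\\newserver\migration\' + name + '.bak'' '
         + 'WITH INIT;'
    FROM sys.databases
    WHERE database_id > 4;   -- skip master, tempdb, model, msdb

    -- Generate the matching RESTORE statements to run on the new server.
    -- This assumes the same drive/folder layout exists there.
    SELECT 'RESTORE DATABASE [' + name + '] '
         + 'FROM DISK = N''\\newserver\migration\' + name + '.bak'' '
         + 'WITH RECOVERY;'
    FROM sys.databases
    WHERE database_id > 4;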

Tips for manually writing SQL Server upgrade scripts

We have some large schema changes coming down the pipe and are in need of some tips on writing upgrade scripts manually. We're using SQL Server 2000 and do not have access to automated tools, nor are they an option at this point in time. The only database tool we have is SQL Server Management Studio.
You can import the database to a local machine which has a newer version of SQL Server, then use the 'Generate Scripts' feature to script out a lot of the database objects.
Make sure to set the option in the Advanced Settings to script for SQL Server 2000.
If you have problems with the generated script, try breaking it up into chunks and running it in small batches. That way, if any particular generated piece fails, you can just write that SQL manually to get it to run.
While not quite what you had in mind, you can use schema-comparison tools like SQL Compare and then script the changes to a .sql file, which you can edit by hand before running it. I guess that's as close to writing it manually as you can get without actually writing it manually.
If you need to write it all manually, I would suggest getting some IntelliSense-type tooling to speed things up.
Your upgrade strategy is probably going to be somewhat customized for your deployment scenario, but here are a few points that might help.
You're going to want to test early and often (not that you wouldn't do this anyway), so be sure to have a test DB at your initial schema, with a backup so you can revert to "start" and test your upgrade any number of times.
Backups & restores can be time-consuming, so it might be helpful to have a DB with no data rows (schema-only) to test your upgrade script. Remember to get a "start" backup so you can go back there on-demand.
Consider stringing a series of scripts together - you can use one per build, or feature, or whatever. This way, once you've got part of the script working, you can leave it alone.
Large data migrations can get tricky. If you're doing data transformations, copying or moving rows to new tables, etc., be sure to check row counts before the move and account for all rows afterwards.
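For the row-count check, something as simple as the following is usually enough; the table names here are made up for illustration:

    -- Capture the source row count before the migration step...
    DECLARE @before INT, @after INT;
    SELECT @before = COUNT(*) FROM dbo.OldOrders;

    -- ...run the data move here...

    -- ...then verify that every row is accounted for afterwards.
    SELECT @after = COUNT(*) FROM dbo.NewOrders;
    IF @before <> @after
        RAISERROR('Row count mismatch: expected %d, got %d.', 16, 1, @before, @after);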
Plan for failure. If something goes wrong, have a plan to fix it -- whether that's rolling everything back to a backup taken at the beginning of the deployment, or whatever. Just be sure you've got a plan and you understand where your go / no-go points are.
Good luck!

Advice needed: cold backup for SQL Server 2008 Express?

What are my options for achieving a cold backup server for a SQL Server Express instance running a single database?
I have an SQL Server 2008 Express instance in production that currently represents a single point of failure for my application. I have a second physical box sitting at the installation that is currently doing nothing. I want to somehow replicate my database in near real time (a little bit of data loss is acceptable) to the second box. The database is very small and resources are utilized very lightly.
In the case that the production server dies, I would manually reconfigure my application to point to the backup server instead.
Although Express doesn't support log shipping, I am thinking that I could manually script a poor man's version of it, using batch files to take the log backups, copy them across the network, and apply them to the second server at 5-minute intervals.
Does anyone have any advice on whether this is technically achievable, or if there is a better way to do what I am trying to do?
Note that I want to avoid having to pay for the full version of SQL Server and configure mirroring as I think it is an overkill for this application. I understand that other DB platforms may present suitable options (eg. a MySQL Cluster), but for the purposes of this discussion, let's assume we have to stick to SQL Server.
I would also advise script-based log shipping; after all, this is how log shipping started. All you need is a time-based scheduler to run the scripts (i.e. Task Scheduler) and a smart(er) file copy (robocopy).
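A bare-bones sketch of those two scripts, with made-up database names, file names, and paths, might look like this; Task Scheduler would run each file through sqlcmd, with robocopy moving the .trn files in between:

    -- backup_log.sql, run every 5 minutes on the primary (Express has no SQL Agent,
    -- so Task Scheduler + sqlcmd handles the scheduling). Paths are placeholders.
    DECLARE @file NVARCHAR(260);
    SET @file = N'C:\LogShip\AppDb_'
              + CONVERT(NVARCHAR(8), GETDATE(), 112)
              + REPLACE(CONVERT(NVARCHAR(8), GETDATE(), 108), ':', '')
              + N'.trn';
    BACKUP LOG AppDb TO DISK = @file WITH INIT;

    -- restore_log.sql, run on the standby after the files have been copied across.
    -- The standby database stays unrecovered between restores so further logs can be applied.
    RESTORE LOG AppDb
        FROM DISK = N'C:\LogShip\Incoming\AppDb_20240101_1205.trn'  -- hypothetical file name
        WITH NORECOVERY;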

Warm Standby SQL Server/Web Server

This question might fall into the IT category but as the lead developer I must come up with a solution to integration as well as pure software issues.
We just moved our web server off site and now I would like to keep a warm standby of both the website and database (SQL 2005) on site.
What are the best practices for going about doing this with the following environment?
Offsite: our website and database (SQL 2005) are on one Windows 2003 server. There is a firewall in front of the server, which makes replication or database mirroring not an option. A VPN is also not an option.
My thoughts as a developer were to write a service that runs at the remote site to zip up and FTP the database backup to an FTP server on site. Then another process would unzip the backup and restore it to the warm standby database here.
I assume I could do this with the web site as well.
I would be open to all ideas including third party solutions.
If you want a remote standby you probably want to look into a log shipping solution.
This article may help you out. In the past I had to develop one of these solutions for exactly the same problem; writing it from scratch is not too hard. The advantage you get with log shipping is that you can restore to any point in time, and you avoid shipping big, heavy full backups around, instead shipping light transaction log backups and only the occasional full backup.
You have to keep in mind that transaction log backups are useless unless you have both the entire sequence of transaction log backups and a full backup.
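If you want the on-site copy to stay readable between log restores, restoring WITH STANDBY is one way to do it; a minimal sketch with made-up names and paths:

    -- Initial seed on the warm standby: restore the shipped full backup,
    -- leaving the database readable but still able to accept log restores.
    RESTORE DATABASE SiteDb
        FROM DISK = N'D:\Standby\SiteDb_full.bak'
        WITH STANDBY = N'D:\Standby\SiteDb_undo.dat', REPLACE;

    -- As each FTP'd transaction log backup arrives, apply it the same way.
    RESTORE LOG SiteDb
        FROM DISK = N'D:\Standby\SiteDb_001.trn'
        WITH STANDBY = N'D:\Standby\SiteDb_undo.dat';

    -- When the primary fails, recover the standby and repoint the application.
    RESTORE DATABASE SiteDb WITH RECOVERY;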
You have exactly the right idea. You could maybe write a script that would insert the names of the files that you moved into a table that your warm server could read. Your job could then just poll this table at intervals and not worry about timing.
Forget about that - just found this. Sounds like what you are setting out to do.
http://www.sqlservercentral.com/articles/Administering/customlogshipping/1201/
I've heard good things about Syncback:
http://www.2brightsparks.com/syncback/sbpro-features.html
Thanks for the link to the article, sambo99. Transaction log shipping was my original idea, but I dismissed it because the servers are not in the same domain, not even in the same time zone. My only method of moving the files from point A to point B is via FTP. I am going to experiment with just shipping the transaction logs and see if there is a way to fire off a restore job at given intervals.
www.FolderShare.com is a good tool from Microsoft. You could log ship to a local directory and then synchronize the directory to any other machine. You could also synchronize the website folders.
It's a "set it and forget it" type of solution. Set up your SQL jobs to clear older files and you'll never have to edit anything after the initial install.
FolderShare (free, in beta) is currently limited to 10,000 files per library.
For all interested, the following question also ties into my overall plan to implement log shipping:
SQL Server sp_cmdshell

Nightly importable or attachable copies of production database

We would like to be able to nightly make a copy/backup/snapshot of a production database so that we can import it in the dev environment.
We don't want to log ship to the dev environment because it needs to be something we can reset whenever we like to the last taken copy of the production database.
We need to be able to clear certain logging and/or otherwise useless or heavy tables that would just bloat the copy.
We prefer the attach/detach method over something like the SQL Server Publishing Wizard because an attach is so much faster than an import.
I should mention we only have SQL Server Standard, so some features won't be available.
What's the best way to do this?
MSDN
I'd say use those procedures inside a SQL Agent job (use master.xp_cmdshell to perform the copy).
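If you stick with the detach/attach approach, the dev-side job step could look roughly like this; every name and path is a placeholder, xp_cmdshell must be enabled, and it assumes the production side has already dropped a detached (or restored) copy of the files on the share:

    -- On the dev server: drop the old copy, pull the new files across, re-attach,
    -- then clear the heavy logging tables. All names and paths are placeholders.
    EXEC master.dbo.sp_detach_db @dbname = N'AppDb_Dev';

    EXEC master..xp_cmdshell 'copy /Y \\prodshare\nightly\AppDb.mdf D:\Data\AppDb_Dev.mdf';
    EXEC master..xp_cmdshell 'copy /Y \\prodshare\nightly\AppDb_log.ldf D:\Data\AppDb_Dev_log.ldf';

    CREATE DATABASE AppDb_Dev
        ON (FILENAME = N'D:\Data\AppDb_Dev.mdf'),
           (FILENAME = N'D:\Data\AppDb_Dev_log.ldf')
        FOR ATTACH;

    TRUNCATE TABLE AppDb_Dev.dbo.AuditLog;   -- hypothetical logging table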
You might want to put the big, heavy tables on their own partition and have that partition belong to a different filegroup. You would then back up and restore only the main filegroup.
You might also want to consider doing incremental backups: say, a full backup every weekend and an incremental every night. I haven't done filegroup backups, so I don't know how well these work together.
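For reference, a filegroup-only backup looks something like this (all names are made up); just note that restoring filegroup backups on their own has extra requirements, so test the restore side before relying on it:

    -- Back up only the PRIMARY filegroup, leaving the filegroup that holds the
    -- huge logging tables out of the nightly copy. All names are placeholders.
    BACKUP DATABASE ProdDb
        FILEGROUP = 'PRIMARY'
        TO DISK = N'\\devserver\nightly\ProdDb_primary.bak'
        WITH INIT;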
I'm guessing that you are already doing regular backups of your production database? If you aren't, stop reading this reply and go set it up right now.
I'd recommend that you write a script that runs automatically, say once a day, that:
1. Drops your current test database.
2. Restores your current production backup into your test environment.
You can write a simple script to do this and execute it using the isql.exe command line tool.
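A bare-bones version of that script, with every name a placeholder, might be:

    -- refresh_test.sql: drop the old test copy and restore last night's production backup.
    -- Run it from a scheduled task with something like:
    --   isql -S TESTSERVER -E -i refresh_test.sql
    -- (osql or sqlcmd on newer versions)
    USE master;

    IF DB_ID('AppDb_Test') IS NOT NULL
        DROP DATABASE AppDb_Test;

    RESTORE DATABASE AppDb_Test
        FROM DISK = N'\\backupshare\AppDb\AppDb_full.bak'
        WITH MOVE N'AppDb_Data' TO N'D:\Data\AppDb_Test.mdf',
             MOVE N'AppDb_Log'  TO N'D:\Data\AppDb_Test_log.ldf',
             REPLACE;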
