how to repair master database in sql server 2005 - sql-server

For some unknown reason, all of a sudden my SQL Server's master database has been corrupted and the SQL service won't start. I have spent hours and tried various things, like running the service under different accounts and checking that no compression is set on the data folder, but nothing seems to work. I copied the master database from another instance; the service would then start and I could connect to the instance via Management Studio, but I can't see my databases. I have a backup of the corrupted master database (mdf and log file) - how can I fix this database so that I can see all my databases? Thanks.

Just restore from the backup (good to have one!); that's the recommended way:
http://blogs.technet.com/b/fort_sql/archive/2011/02/01/the-easiest-way-to-rebuild-the-sql-server-master-database.aspx
In case you don't have a good backup, you will have to rebuild master database:
http://msdn.microsoft.com/en-us/library/ms144259%28v=sql.90%29.aspx
Copying over from another existing instance is not recommended.
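If you do have a valid backup of master, the usual sequence is to start the instance in single-user mode and restore master from there. The sketch below assumes the default instance; the backup path is a placeholder:

```sql
-- Start the instance in single-user mode first, e.g.:
--   net stop MSSQLSERVER
--   net start MSSQLSERVER /m
-- Then connect with sqlcmd (sqlcmd -E) and run:

RESTORE DATABASE master
FROM DISK = N'C:\Backups\master_full.bak'  -- placeholder path
WITH REPLACE;
-- The instance shuts down automatically once master is restored;
-- restart the service normally afterwards.
```

Because the restored master contains the original sysdatabases entries, your user databases should reappear, provided their files are still at the recorded locations.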

Related

Restore from SQL Server database and replicate to another database

We have a SQL Server database that is hosted by a 3rd party. We have access to the backup files of that SQL Server database which we grab daily and restore to an "in-house" SQL Server for the purpose of creating reports and looking at the data without the chance of affecting the "live" data on the hosted server.
At the moment our restore process only allows us to access day old data, but we would like to increase the frequency of backup/restore processes so our "sandbox" database is more up to date, like maybe 2 or 3 hours old. One issue is that during the restore process everyone has to be out of the database so restoring every couple of hours might be cumbersome.
Is it possible to set up a 3rd SQL Server for the "sandbox" database to replicate to, so that when we restore the data to it, that data will then replicate to the "user" database?
Is there a better way to get the data from the actual production server?
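One low-tech variant of the scheduled-restore idea is to force sessions out immediately before each restore rather than waiting for users to leave. A sketch, with placeholder database and file names:

```sql
-- Kick everyone out of the sandbox copy, rolling back open transactions.
ALTER DATABASE SandboxDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

-- Overwrite it with the latest vendor backup.
RESTORE DATABASE SandboxDB
FROM DISK = N'D:\Drops\vendor_backup.bak'  -- placeholder path
WITH REPLACE;

ALTER DATABASE SandboxDB SET MULTI_USER;
```

Scheduled as an Agent job every few hours, this shortens the outage to the restore duration itself, though report users are still briefly disconnected.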

Migrating massive databases in SQL Server

One of the tasks in a project that I'm working on is to migrate an existing database on SQL Server 2000, to a new server which runs SQL Server 2008. This database is extremely huge, with 23 million rows and a 78GB mdf file.
What is the best way to migrate a database of this size?
My current approach would be to:
allow for application downtime so that the application doesn't write records to the database
perform a full backup on SQL Server 2000.
move backup file over to new server across the network.
restore full backup on SQL Server 2008.
configure the application to refer to the database on the new server
restart application.
decommission the database on SQL Server 2000.
However, I'm not sure how much application downtime that would involve.
Are there any easier approaches, or an approach that involves very little downtime? Can a backup be taken while the application is running? Obviously I would need to stop the application when the backup file is transferred and the restore is completed. Interested to hear your approaches to a task like this.
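The backup/restore steps above boil down to something like the following (file paths and logical file names are placeholders; WITH MOVE is only needed if the new server uses different data/log locations):

```sql
-- On the SQL Server 2000 box:
BACKUP DATABASE BigDb TO DISK = N'E:\Backups\BigDb_full.bak';

-- After copying the file to the 2008 box:
RESTORE DATABASE BigDb
FROM DISK = N'E:\Backups\BigDb_full.bak'
WITH MOVE 'BigDb_Data' TO N'F:\Data\BigDb.mdf',   -- logical names are placeholders
     MOVE 'BigDb_Log'  TO N'G:\Log\BigDb_log.ldf';
```

On the downtime question: a full backup in SQL Server is an online operation, so it can be taken while the application is running; only the file transfer, restore, and application cutover need the application stopped.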
If you're open to downtime:
Detach the database
Copy data file(s) and log file(s) to the new server
Attach the database on the new server instance
Detaching closes the database and finalizes the files so they safely can be moved (or backed up via filesystem backup). It will no longer be accessible on the server instance until you reattach it.
Don't cut and paste / move the data and log files, just in case something bombs during the copy.
There are several other things to keep in mind when migrating to a new server instance, like making sure logins exist on the new instance, features in use that might be deprecated, etc.
Here's the reference for detach/attach.
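The detach/copy/attach sequence looks roughly like this (paths are placeholders):

```sql
-- On the old instance:
EXEC sp_detach_db @dbname = N'BigDb';

-- Copy BigDb.mdf and BigDb_log.ldf to the new server, then on the new instance:
CREATE DATABASE BigDb
ON (FILENAME = N'F:\Data\BigDb.mdf'),
   (FILENAME = N'G:\Log\BigDb_log.ldf')
FOR ATTACH;
```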

Restore to remote database from my local machine

I am registered at Go Daddy and want to restore the database there from my local machine. The tool they provide doesn't work unless the backup came from them. When I try to restore from my local SQL Server, I can't browse to my local backup files from the remote database.
They're intentionally preventing users from restoring backups that are "foreign" to them in order to satisfy an obscure Microsoft security recommendation.
You will have to perform a schema comparison and a data comparison between your local machine and the empty database on the hosting to generate the scripts to re-create all of the objects and data. (Having those scripts available in a source control storage would also be helpful.)
There are basically two ways to restore a database in SQL Server:
Through the SSMS restore utility.
Running a manual database restore script.

how to check a SQL Azure bacpac is not corrupt

I just lost hours to a corrupt SQL Azure bacpac backup. None of the restore mechanisms I used reported anything wrong when restoring the backup, and the schema and data seemed to be there, but there most definitely was something wrong with that backup.
I'm using Redgate's SQL Azure Backup, but as far as I know all it is doing is creating a database copy (CREATE DATABASE ... AS COPY OF), polling until the copy completes, and then using Azure's bacpac export feature to take the backup.
Is there a way to confirm a SQL Azure bacpac backup is not corrupt?
More information on the symptoms of the backup:
Doing a restore through the DAC Framework Client Side Tools or the Azure Management Portal doesn't report any error
Taking a quick look into the list of tables and the top 100 records of 1-2 tables looked well
SQL users were left in a state where they couldn't be mapped to a SQL login (as if those users had been created with the 'without login' option in a local database). This didn't happen in the other backups of the database.
Here's a link to someone else in a similar situation, with an on-premises backup that was corrupted by running out of disk space: http://www.sqlmag.com/forums/aft/96868
We're going to look into more automated checking of the bacpac output, probably involving a temporary restore - that is really the only way to check the file is complete. Before that, we are looking at ensuring the .bacpac file (really just a .zip file) is intact and contains the data we're expecting.
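Since a .bacpac is just a zip archive, a first-pass automated check (short of the full temporary restore) can verify the archive's CRCs and the presence of the core manifest files. The member names below (model.xml, Origin.xml) are an assumption based on the typical bacpac layout, and a passing check does not prove the data payload is complete:

```python
import zipfile

def bacpac_seems_intact(path):
    """Best-effort sanity check of a .bacpac file.

    Verifies that the file is a readable zip archive, that every
    member passes its CRC check, and that the expected manifest
    files are present (names assumed from the usual bacpac layout).
    """
    try:
        with zipfile.ZipFile(path) as zf:
            if zf.testzip() is not None:  # first member with a bad CRC, if any
                return False
            names = set(zf.namelist())
            return "model.xml" in names and "Origin.xml" in names
    except zipfile.BadZipFile:
        return False
```

This would catch a truncated or garbage file, such as the out-of-disk-space corruption linked above, but only an actual restore proves the backup is usable.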
If you use the other mechanism of backing up to a local SQL Server we have much more control over that.
Feel free to drop me an e-mail if you have any more ideas or requests. richard.mitchell#red-gate.com

deploy the database easily

What is the easiest way to deploy an MS SQL Server database to third-party hosting? Nothing comes to mind except generating lots of SQL scripts and running them on the database, or calling LINQ's DataContext.CreateDatabase .
You can deploy a backup of the database to the server and restore it.
You can deploy the data files and transaction log of the database to the server and attach them.
And, of course, you can generate a lot of SQL scripts to create the database structure and fill it with data.