How to manage environment-specific values in the build process? - sql-server

I want to automate the build process of a server instance that I maintain. In version control I have scripts containing every single command and configuration I used to build the instance in production.
Now I want to write a master build script that applies all of these scripts to a target instance.
While I try to keep my development environment as production-like as possible, there are some values that will always be different. To handle this, the build script should accept environment-specific values and pass the values to the relevant build steps.
The server instance has one user database. In production, the user database files are created on a drive that does not exist in my development environment, and the files are larger than I have free space for in development.
When I set up the instance in production, I used this script. This is what I currently have in version control:
USE [master]
GO
CREATE DATABASE [QuoteProcessor] ON PRIMARY (
NAME = N'System_Data',
FILENAME = N'G:\SQLData\QuoteProcessor\System_Data.mdf',
SIZE = 500 MB,
MAXSIZE = UNLIMITED,
FILEGROWTH = 10%
),
FILEGROUP [DATA] DEFAULT (
NAME = N'QuoteProcessor_Data',
FILENAME = N'G:\SQLData\QuoteProcessor\QuoteProcessor_Data.ndf',
SIZE = 600 GB,
MAXSIZE = UNLIMITED,
FILEGROWTH = 10%
)
LOG ON (
NAME = N'QuoteProcessor_Log',
FILENAME = N'G:\SQLLogs\QuoteProcessor\QuoteProcessor_Log.ldf',
SIZE = 100 GB,
MAXSIZE = UNLIMITED,
FILEGROWTH = 10%
);
ALTER DATABASE [QuoteProcessor] SET COMPATIBILITY_LEVEL = 100
GO
ALTER DATABASE [QuoteProcessor] SET ANSI_NULL_DEFAULT OFF
GO
ALTER DATABASE [QuoteProcessor] SET ANSI_NULLS OFF
GO
ALTER DATABASE [QuoteProcessor] SET ANSI_PADDING OFF
GO
ALTER DATABASE [QuoteProcessor] SET ANSI_WARNINGS OFF
GO
ALTER DATABASE [QuoteProcessor] SET ARITHABORT OFF
GO
ALTER DATABASE [QuoteProcessor] SET AUTO_CLOSE OFF
GO
ALTER DATABASE [QuoteProcessor] SET AUTO_CREATE_STATISTICS ON
GO
ALTER DATABASE [QuoteProcessor] SET AUTO_SHRINK OFF
GO
ALTER DATABASE [QuoteProcessor] SET AUTO_UPDATE_STATISTICS ON
GO
ALTER DATABASE [QuoteProcessor] SET CURSOR_CLOSE_ON_COMMIT OFF
GO
ALTER DATABASE [QuoteProcessor] SET CURSOR_DEFAULT GLOBAL
GO
ALTER DATABASE [QuoteProcessor] SET CONCAT_NULL_YIELDS_NULL OFF
GO
ALTER DATABASE [QuoteProcessor] SET NUMERIC_ROUNDABORT OFF
GO
ALTER DATABASE [QuoteProcessor] SET QUOTED_IDENTIFIER OFF
GO
ALTER DATABASE [QuoteProcessor] SET RECURSIVE_TRIGGERS OFF
GO
ALTER DATABASE [QuoteProcessor] SET DISABLE_BROKER
GO
ALTER DATABASE [QuoteProcessor] SET AUTO_UPDATE_STATISTICS_ASYNC OFF
GO
ALTER DATABASE [QuoteProcessor] SET DATE_CORRELATION_OPTIMIZATION OFF
GO
ALTER DATABASE [QuoteProcessor] SET TRUSTWORTHY OFF
GO
ALTER DATABASE [QuoteProcessor] SET ALLOW_SNAPSHOT_ISOLATION OFF
GO
ALTER DATABASE [QuoteProcessor] SET PARAMETERIZATION SIMPLE
GO
ALTER DATABASE [QuoteProcessor] SET READ_COMMITTED_SNAPSHOT ON
GO
ALTER DATABASE [QuoteProcessor] SET HONOR_BROKER_PRIORITY OFF
GO
ALTER DATABASE [QuoteProcessor] SET RECOVERY SIMPLE
GO
ALTER DATABASE [QuoteProcessor] SET MULTI_USER
GO
ALTER DATABASE [QuoteProcessor] SET PAGE_VERIFY CHECKSUM
GO
ALTER DATABASE [QuoteProcessor] SET DB_CHAINING OFF
GO
USE [master]
GO
ALTER DATABASE [QuoteProcessor] SET READ_WRITE
GO
In the development environment, I can use the same filegroups, but I have to use different paths and different sizes for the database files.
I see several solutions:
Edit the script by hand for every environment. I can't really automate this, or use it to track changes to environment-specific values.
Make one copy of the script for each environment. I could automate the selection of script depending on environment. This would duplicate the specification of things that should never change independently, like all the ALTER DATABASE statements.
Abstract away environment-specific values using scripting variables and define those values in another place, like an environment configuration file.
I think option 3 is the cleanest solution. It's the one I explore here.
For example, I could use sqlcmd scripting variables to replace the CREATE DATABASE statement with this:
CREATE DATABASE [QuoteProcessor] ON PRIMARY (
NAME = N'System_Data',
FILENAME = N'$(PrimaryDataFileFullPath)',
SIZE = $(PrimaryDataFileSize),
MAXSIZE = UNLIMITED,
FILEGROWTH = 10%
),
FILEGROUP [DATA] DEFAULT (
NAME = N'QuoteProcessor_Data',
FILENAME = N'$(UserDataFileFullPath)',
SIZE = $(UserDataFileSize),
MAXSIZE = UNLIMITED,
FILEGROWTH = 10%
)
LOG ON (
NAME = N'QuoteProcessor_Log',
FILENAME = N'$(LogFileFullPath)',
SIZE = $(LogFileSize),
MAXSIZE = UNLIMITED,
FILEGROWTH = 10%
);
And to create the database in production, I could invoke the script like this:
sqlcmd -i QuoteProcessor.sql -v PrimaryDataFileFullPath="G:\SQLData\QuoteProcessor\System_Data.mdf" -v PrimaryDataFileSize="500 MB" -v UserDataFileFullPath="G:\SQLData\QuoteProcessor\QuoteProcessor_Data.ndf" -v UserDataFileSize="600 GB" -v LogFileFullPath="G:\SQLLogs\QuoteProcessor\QuoteProcessor_Log.ldf" -v LogFileSize="100 GB"
The master build script would read the values from a configuration file and pass them to sqlcmd.
There would be one configuration file for production, one for development; one for every distinct environment in my organization.
I haven't decided how to store the environment-specific values yet, but I was thinking that an INI file or an XML file would make it easy.
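One sqlcmd-native alternative I'm considering (just a sketch; the file name and the sizes below are invented for illustration) is to keep a small per-environment variable file in version control and pass it to sqlcmd ahead of the main script, since sqlcmd reads multiple input files in order:
-- Development.sql (hypothetical per-environment variable file)
:setvar PrimaryDataFileFullPath "C:\DevSQLData\QuoteProcessor\System_Data.mdf"
:setvar PrimaryDataFileSize     "500 MB"
:setvar UserDataFileFullPath    "C:\DevSQLData\QuoteProcessor\QuoteProcessor_Data.ndf"
:setvar UserDataFileSize        "10 GB"
:setvar LogFileFullPath         "C:\DevSQLLogs\QuoteProcessor\QuoteProcessor_Log.ldf"
:setvar LogFileSize             "1 GB"
The master build script would then only need to pick the variable file matching the target environment, for example:
sqlcmd -i Development.sql,QuoteProcessor.sql
It would be worth verifying that variables set in the first input file remain visible to the second on the sqlcmd version in use; if they don't, the -v approach above still works with a separate configuration file.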
Can anyone else offer advice on solving a similar problem? I'm not sure if this is the best way to do what I want. Is there an easier or better-supported way of managing environment-specific values for this problem? Should I be using some tool that manages this kind of thing for me?

This is just my take on these options.
1. Edit the script by hand for every environment. I can't really automate this, or use it to track changes to environment-specific values.
I would recommend against this. It allows people to accidentally make changes to code that you didn't intend for them to touch. Not that the others prevent it, but this option carries the most risk.
2. Make one copy of the script for each environment. I could automate the selection of script depending on environment. This would duplicate the specification of things that should never change independently, like all the ALTER DATABASE statements.
This works, but you run into problems when servers change: depending on the criteria you use to decide what is a dev server and what is a prod server, the script may be outdated.
3. Abstract away environment-specific values using scripting variables and define those values in another place, like an environment configuration file.
This is how SSDT (Microsoft SQL Server Data Tools) projects do it.
There's also a hybrid approach where you abstract away the environment-specific values but don't have an environment configuration file, by using template parameters (again, in SQL Server at least):
http://msdn.microsoft.com/en-us/library/hh230912.aspx

Related

How can I change Database creation SQL script auto-generated by Visual Studio

I am using Visual Studio Enterprise 2015 for my database project, and using it to generate a Database_Create.sql script which I deploy and run against different servers.
There is some Visual Studio-generated code, which sets SQLCMD variables for the database name and the default data and log directories, that needs to be manually edited.
I can get the default data and default log directories programmatically in T-SQL using SERVERPROPERTY('InstanceDefaultDataPath'), but I can't mix this with the SQLCMD commands.
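For reference, this is the part I can already do in plain T-SQL (the property names below are the documented ones, available from SQL Server 2012 onwards):
-- Instance default data and log directories
SELECT SERVERPROPERTY('InstanceDefaultDataPath') AS DefaultDataPath,
       SERVERPROPERTY('InstanceDefaultLogPath')  AS DefaultLogPath;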
I have a solution which I added manually to the Database_Create.sql script, but would like to bake it into the VS project and avoid a manual edit.
Can I change VisualStudio's code generation process so that the generated Database_Create.sql script will contain my code?
Specifically, Visual Studio generated a change script with the following lines:
GO
SET ANSI_NULLS, ANSI_PADDING --etc.;
SET NUMERIC_ROUNDABORT OFF;
GO
:setvar DatabaseName "Foo"
:setvar DefaultFilePrefix "Foo"
:setvar DefaultDataPath ""
:setvar DefaultLogPath ""
GO
:on error exit
GO
:setvar __IsSqlCmdEnabled "True"
GO
IF N'$(__IsSqlCmdEnabled)' NOT LIKE N'True'
BEGIN
PRINT N'SQLCMD mode must be enabled to successfully execute this script.';
SET NOEXEC ON;
END
GO
USE [master];
GO
PRINT N'Creating $(DatabaseName)...'
GO
CREATE DATABASE [$(DatabaseName)] COLLATE Latin1_General_CI_AS
GO
PRINT N'Creating [MediaFileGroup]...';
GO
ALTER DATABASE [$(DatabaseName)]
ADD FILEGROUP [MediaFileGroup] CONTAINS FILESTREAM;
GO
ALTER DATABASE [$(DatabaseName)]
ADD FILE (NAME = [MediaFileGroup_12345678], FILENAME = N'$(DefaultDataPath)$(DefaultFilePrefix)_MediaFileGroup_12345678.mdf') TO FILEGROUP [MediaFileGroup];
GO
USE [$(DatabaseName)];
I want to add a section of SQL code which will replace the
ALTER DATABASE [$(DatabaseName)]
ADD FILE (NAME = [MediaFileGroup_12345678], FILENAME = N'$(DefaultDataPath)$(DefaultFilePrefix)_MediaFileGroup_12345678.mdf') TO FILEGROUP [MediaFileGroup];
(since this is the line that uses DefaultDataPath), so that my computed DefaultDataPath is used.
My question is not about obtaining the Default Data Path programmatically - I can do that. My question is: how can I change the Visual Studio-generated code section?

SQL Server SET AUTOGROW_ALL_FILES fails

I try to execute this:
USE [MyDB]
GO
declare @autogrow bit
SELECT @autogrow=convert(bit, is_autogrow_all_files) FROM sys.filegroups WHERE name=N'PRIMARY'
if(@autogrow=0)
ALTER DATABASE [MyDB] MODIFY FILEGROUP [PRIMARY] AUTOGROW_ALL_FILES
GO
And it fails with:
Database state cannot be changed while other users are using the database 'HistoryDBTest'
How can I go around it?
You'll need to change the database to single-user mode. Use this with care: since the error is "other users are using the database", those users will have their connections to the database cut off and their transactions rolled back.
USE master;
GO
ALTER DATABASE MyDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
GO
USE [MyDB];
GO
DECLARE @autogrow bit;
SELECT @autogrow = CONVERT(bit, is_autogrow_all_files)
FROM sys.filegroups
WHERE name = N'PRIMARY';
IF (@autogrow = 0) ALTER DATABASE [MyDB] MODIFY FILEGROUP [PRIMARY] AUTOGROW_ALL_FILES;
GO
USE master;
GO
ALTER DATABASE MyDB SET MULTI_USER;

How to completely remove FILESTREAM and all attached files

I have tried out the FILESTREAM feature of MSSQL (2008 R2 Data Center) on a local database, to experiment. The real database is running on a server. I set up the whole FILESTREAM using this query:
/* CREATE FILESTREAM AND FILESTREAM TABLE */
USE [master]
GO
ALTER DATABASE SenONew
ADD FILEGROUP [FileStream]
CONTAINS FILESTREAM
GO
ALTER DATABASE SenONew
ADD FILE
(
NAME = 'fsSenONew',
FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQLEXPRESS\MSSQL\DATA\SenONew.ndf'
)
TO FILEGROUP [FileStream]
GO
USE [SenONew]
GO
CREATE TABLE Filestore(
FileID int PRIMARY KEY,
RowID uniqueidentifier ROWGUIDCOL NOT NULL UNIQUE DEFAULT NEWSEQUENTIALID(),
FileDescr nvarchar(max),
FileIndex varbinary(max) FILESTREAM NULL)
GO
And I was experimenting with adding a few files and then deleting them.
Now, since this was only meant to be an experiment, I also want to get rid of it. I'm using my local server to develop the database that will be used on the real server, so I create backups on my local server and restore them on the real server to keep it updated (the software is in development, so the database structure and data change a lot, and I need to do a full restore to the real server, where the software is being tested).
After hours of searching, I couldn't find anything on my problem.
I understand that I need to:
Remove the database table storing the FILESTREAM information
I need to remove the FILE of the FILESTREAM
Remove the filegroup
So I'm using this query to get rid of everything I set up in the first place:
/* DROP FILESTREAM TABLE AND FILEGROUP */
USE SenONew
DROP TABLE Filestore
GO
ALTER DATABASE SenONew
REMOVE FILE fsSenONew
ALTER DATABASE SenONew
REMOVE FILEGROUP [FileStream]
GO
So I do everything as I should, and it completes without error. When I look at my filegroups, files and file locations, I see they are all completely removed.
But when I take a backup of my local database (which included the deleted FILESTREAM table, file path and filegroup) and try to restore it on the server, I get errors.
SQL to create a BACKUP:
/* CREATE BACKUP OF DATABASE WITHIN CURRENT CONNECTION */
DECLARE @FileName2 nvarchar(250)
SELECT @FileName2 = (SELECT 'C:\SenO BackUp\' + convert(nvarchar(200),GetDate(),112) + ' SenONew.BAK')
BACKUP DATABASE SenONew TO DISK=@FileName2
GO
Then do the Restore on the server:
/* CREATE RESTORE OF DATABASE WITHIN REAL SERVER CONNECTION */
use master
alter database SenONew set offline with rollback immediate;
DECLARE @FileName2 nvarchar(250)
SELECT @FileName2 = (SELECT '' + convert(nvarchar(200),GetDate(),112) + ' SenONew.BAK')
RESTORE DATABASE SenONew
FROM DISK = @FileName2
alter database SenONew set online with rollback immediate;
I get the following error:
Msg 5121, Level 16, State 2, Line 7
The path specified by "C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQLEXPRESS\MSSQL\DATA\SenONew.ndf" is not in a valid directory.
Msg 3156, Level 16, State 3, Line 7
File 'fsSenONew' cannot be restored to 'C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQLEXPRESS\MSSQL\DATA\SenONew.ndf'. Use WITH MOVE to identify a valid location for the file.
Msg 3119, Level 16, State 1, Line 7
Problems were identified while planning for the RESTORE statement. Previous messages provide details.
Msg 3013, Level 16, State 1, Line 7
RESTORE DATABASE is terminating abnormally.
I deleted the .ndf FILESTREAM file, so why is its path still specified? Also, why is fsSenONew still being restored? I can't get my head around it. Are there internal paths that I need to delete?
You can check:
SELECT * FROM SenONew.sys.data_spaces WHERE name = 'FileStream'
It should return 0 rows.
There is a procedure to remove FILESTREAM features from a SQL Server 2008 database:
ALTER TABLE Filestore DROP column FileIndex
GO
ALTER TABLE Filestore SET (FILESTREAM_ON="NULL")
GO
ALTER Database SenONew REMOVE FILE fsSenONew
GO
ALTER Database SenONew REMOVE FILEGROUP [FileStream]
GO
as described in this article. But the steps you did should do the same thing.
Your problem is certainly strange, but I suggest that you try running the following:
USE SenONew
EXEC Sp_help
EXEC Sp_helpfile
EXEC Sp_helpfilegroup
You may find something suspicious there, like another table using that FILEGROUP.
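A quick way to check for that (a sketch; adjust the database name to yours) is to look for tables that still have a FILESTREAM data space assigned:
-- Tables in SenONew that still reference a FILESTREAM filegroup
SELECT name, filestream_data_space_id
FROM SenONew.sys.tables
WHERE filestream_data_space_id IS NOT NULL;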
I have done exactly the steps you describe and cannot reproduce your problem. Check what your Restore Database screen looks like.
1. Remove the FILESTREAM attribute from columns and tables. You'll need to move data to a new column.
ALTER TABLE MyTable
ADD FileData varbinary(max) NULL;
GO
update MyTable
set FileData = FileStreamData
GO
ALTER TABLE MyTable
DROP column FileStreamData
GO
ALTER TABLE MyTable SET (FILESTREAM_ON="NULL")
GO
EXEC sp_RENAME 'MyTable.FileData', 'FileStreamData', 'COLUMN'
GO
2. Remove files from the FILESTREAM and drop the FILE and FILESTREAM.
ALTER DATABASE [MyDatabase] SET RECOVERY Simple
GO
EXEC SP_FILESTREAM_FORCE_GARBAGE_COLLECTION
ALTER DATABASE [MyDatabase] REMOVE FILE [MyFile]
GO
ALTER DATABASE [MyDatabase] REMOVE FILEGROUP [MyFileGroup]
GO
ALTER DATABASE [MyDatabase] SET RECOVERY FULL
GO
This is the script that worked for me. There was a command missing to empty the file:
ALTER TABLE FilesTable DROP column FileContent
GO
ALTER TABLE FilesTable SET (FILESTREAM_ON="NULL")
GO
USE mydbname
GO
DBCC SHRINKFILE (N'filestreamfile', EMPTYFILE)
GO
EXEC sp_filestream_force_garbage_collection @dbname = N'mydbname'
ALTER Database mydbname REMOVE FILE filestreamfile
GO
ALTER Database mydbname REMOVE FILEGROUP FILESTREAMGROUP
GO
IF COL_LENGTH('FilesTable','FileContent') IS NULL
BEGIN
ALTER TABLE FilesTable ADD FileContent [varbinary](max) NULL
END
GO

Database script not executing in SQL Server 2008

In SQL Server 2008 R2, I have a database script which generates successfully, but when I try to execute that script it only shows the Executing query message and nothing happens.
I waited at least 10 minutes for a result, but eventually I had to forcefully stop the query.
Note: All other queries work normally; only the database script fails to execute, as explained above.
I don't know what's going on...
More details: this is not happening on one particular database; it is a problem with every database on my SQL Server.
Let's see it by example.
In SQL Server 2008 R2, I have following type of script.
USE [master]
GO
/****** Object: Database [BillingApplication] Script Date: 01/22/2013 17:42:04 ******/
IF NOT EXISTS (SELECT name FROM sys.databases WHERE name = N'BillingApplication')
BEGIN
CREATE DATABASE [BillingApplication] ON PRIMARY
( NAME = N'BillingApplication', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA\BillingApplication.mdf' , SIZE = 3072KB , MAXSIZE = UNLIMITED, FILEGROWTH = 1024KB )
LOG ON
( NAME = N'BillingApplication_log', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA\BillingApplication_log.ldf' , SIZE = 1024KB , MAXSIZE = 2048GB , FILEGROWTH = 10%)
END
GO
ALTER DATABASE [BillingApplication] SET COMPATIBILITY_LEVEL = 100
GO
IF (1 = FULLTEXTSERVICEPROPERTY('IsFullTextInstalled'))
begin
EXEC [BillingApplication].[dbo].[sp_fulltext_database] @action = 'enable'
end
GO
ALTER DATABASE [BillingApplication] SET ANSI_NULL_DEFAULT OFF
GO
ALTER DATABASE [BillingApplication] SET ANSI_NULLS OFF
GO
ALTER DATABASE [BillingApplication] SET ANSI_PADDING OFF
GO
ALTER DATABASE [BillingApplication] SET ANSI_WARNINGS OFF
GO
ALTER DATABASE [BillingApplication] SET ARITHABORT OFF
GO
ALTER DATABASE [BillingApplication] SET AUTO_CLOSE OFF
GO
ALTER DATABASE [BillingApplication] SET AUTO_CREATE_STATISTICS ON
GO
ALTER DATABASE [BillingApplication] SET AUTO_SHRINK OFF
GO
ALTER DATABASE [BillingApplication] SET AUTO_UPDATE_STATISTICS ON
GO
ALTER DATABASE [BillingApplication] SET CURSOR_CLOSE_ON_COMMIT OFF
GO
ALTER DATABASE [BillingApplication] SET CURSOR_DEFAULT GLOBAL
GO
ALTER DATABASE [BillingApplication] SET CONCAT_NULL_YIELDS_NULL OFF
GO
ALTER DATABASE [BillingApplication] SET NUMERIC_ROUNDABORT OFF
GO
ALTER DATABASE [BillingApplication] SET QUOTED_IDENTIFIER OFF
GO
ALTER DATABASE [BillingApplication] SET RECURSIVE_TRIGGERS OFF
GO
ALTER DATABASE [BillingApplication] SET DISABLE_BROKER
GO
ALTER DATABASE [BillingApplication] SET AUTO_UPDATE_STATISTICS_ASYNC OFF
GO
ALTER DATABASE [BillingApplication] SET DATE_CORRELATION_OPTIMIZATION OFF
GO
ALTER DATABASE [BillingApplication] SET TRUSTWORTHY OFF
GO
ALTER DATABASE [BillingApplication] SET ALLOW_SNAPSHOT_ISOLATION OFF
GO
ALTER DATABASE [BillingApplication] SET PARAMETERIZATION SIMPLE
GO
ALTER DATABASE [BillingApplication] SET READ_COMMITTED_SNAPSHOT OFF
GO
ALTER DATABASE [BillingApplication] SET HONOR_BROKER_PRIORITY OFF
GO
ALTER DATABASE [BillingApplication] SET READ_WRITE
GO
ALTER DATABASE [BillingApplication] SET RECOVERY FULL
GO
ALTER DATABASE [BillingApplication] SET MULTI_USER
GO
ALTER DATABASE [BillingApplication] SET PAGE_VERIFY CHECKSUM
GO
ALTER DATABASE [BillingApplication] SET DB_CHAINING OFF
GO
EXEC sys.sp_db_vardecimal_storage_format N'BillingApplication', N'ON'
GO
USE [BillingApplication]
GO
/****** Object: Table [dbo].[tbCustBill] Script Date: 01/22/2013 17:42:07 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
IF NOT EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[tbCustBill]') AND type in (N'U'))
BEGIN
-- And continue the entire script
Now, as I saw in SQL Server Profiler, execution up to the following statement works perfectly.
ALTER DATABASE [BillingApplication] SET RECURSIVE_TRIGGERS OFF
GO
And on the next line, execution stops.
I don't know what's going on....
On forcefully stopping the execution, it generates the error below:
Msg 5069, Level 16, State 1, Line 1
ALTER DATABASE statement failed.
So it may be some SQL Server configuration problem...
I think you need to ensure you can place an exclusive lock on the database to change this setting. Even though you just created the database, you may have established more than one connection to it. Make sure you disable IntelliSense in Management Studio, that you have no other windows connected to this database, and that you switch context to another database. Then set the database to single user, make your changes, and set it back:
USE master;
GO
ALTER DATABASE BillingApplication SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
GO
ALTER DATABASE BillingApplication SET RECURSIVE_TRIGGERS OFF;
GO
ALTER DATABASE BillingApplication SET DISABLE_BROKER;
GO
... other changes ...
ALTER DATABASE BillingApplication SET MULTI_USER;
GO
If this still waits then you may be waiting on some very large transaction to roll back, and you can check in another window what you are waiting for by looking at sys.dm_exec_requests and/or sys.dm_os_waiting_tasks.
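For example, something like this minimal sketch, run from a separate query window while the script is stuck (the columns are the standard ones exposed by those DMVs):
-- What is the stuck session waiting on, and who is blocking it?
SELECT session_id, status, command, wait_type, wait_time, blocking_session_id
FROM sys.dm_exec_requests
WHERE command LIKE 'ALTER DATABASE%';

SELECT session_id, wait_type, wait_duration_ms, blocking_session_id
FROM sys.dm_os_waiting_tasks
WHERE blocking_session_id IS NOT NULL;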
Though, if you are creating the database, why do you think you need to explicitly disable broker? It's not enabled by default...
You have
ALTER DATABASE [BillingApplication] SET DISABLE_BROKER
without an ON or OFF. Set that and it should work.
Nope, that's not it. I'd guess it has something to do with the Service Broker sub-system, but that's not something I'm familiar with. Hopefully someone else who knows that system can answer this...

Error restoring database backup

I am getting an error using SQL Server 2012 when restoring a backup made with a previous version (SQL Server 2008). I actually have several backup files of the same database (taken at different times in the past). The newest ones are restored without any problems; however, one of them gives the following error:
System.Data.SqlClient.SqlError: Directory lookup for the file "C:\PROGRAM FILES\MICROSOFT SQL SERVER\MSSQL.1\MSSQL\DATA\MYDB_ABC.MDF" failed with the operating system error 3 (The system cannot find the path specified.). (Microsoft.SqlServer.SmoExtended)
This is an x64 machine, and my database files are in this location: c:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL.
I do not understand why it tries to restore on MSSQL.1 and not MSSQL11.MSSQLSERVER.
Sounds like the backup was taken on a machine whose paths do not match yours. Try performing the backup using T-SQL instead of the UI. Also make sure that the paths you're specifying actually exist and that there isn't already a copy of these mdf/ldf files in there.
RESTORE DATABASE MYDB_ABC FROM DISK = 'C:\path\file.bak'
WITH MOVE 'mydb' TO 'c:\valid_data_path\MYDB_ABC.mdf',
MOVE 'mydb_log' TO 'c:\valid_log_path\MYDB_ABC.ldf';
When restoring, under Files, check 'Relocate all files to folder'
The backup stores the original location of the database files and, by default, attempts to restore to the same location. Since your new server installation is in new directories and, presumably, the old directories no longer exist, you need to alter the directories from the defaults to match the location you wish it to use.
Depending on how you are restoring the database, the way to do this will differ. If you're using SSMS, look through the tabs and lists until you find the list of files and their associated disk locations - you can then edit those locations before restoring.
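If you'd rather script it, a useful first step (just a sketch; the path is a placeholder) is to ask the backup file itself which logical files it contains, and then feed those names to the MOVE clauses shown above:
-- List the logical and physical file names stored in the backup
RESTORE FILELISTONLY FROM DISK = N'C:\path\file.bak';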
I managed to do this from code. This alone was not enough:
Restore bkp = new Restore();
bkp.PercentCompleteNotification = 1;
bkp.Action = RestoreActionType.Database;
bkp.Database = sDatabase;
bkp.ReplaceDatabase = true;
The RelocateFiles property must be filled with the names and paths of the files to be relocated. For each file you must specify the name of the file and the new physical path. So what I did was look at the PrimaryFilePath of the database I was restoring to and use that as the physical location. Something like this:
if (!string.IsNullOrEmpty(sDataFileName) && !File.Exists(sDataFileName))
{
if (originaldb != null)
{
if (string.Compare(Path.GetDirectoryName(sDataFileName), originaldb.PrimaryFilePath, true) != 0)
{
string sPhysicalDataFileName = Path.Combine(originaldb.PrimaryFilePath, sDatabase + ".MDF");
bkp.RelocateFiles.Add(new RelocateFile(sLogicalDataFileName, sPhysicalDataFileName));
}
}
}
Same for the log file.
I had the same problem, and this fixed it without any C# code:
USE [master]
ALTER DATABASE [MyDb]
SET SINGLE_USER WITH ROLLBACK IMMEDIATE
RESTORE DATABASE [MyDb]
FROM DISK = N'D:\backups\mydb.bak'
WITH FILE = 1,
MOVE N'MyDb' TO N'c:\valid_data_path\MyDb.mdf',
MOVE N'MyDb_log' TO N'c:\valid_log_path\MyDb.ldf',
NOUNLOAD,
REPLACE,
STATS = 5
ALTER DATABASE [MyDb] SET MULTI_USER
GO
As has already been said a few times, restoring a backup where the new and old paths for the mdf and ldf files don't match can cause this error. There are several good examples here already of how to deal with that in SQL; none of them, however, worked for me until I realised that in my case I needed to include the '.mdf' and '.ldf' extensions in the 'from' part of the MOVE statement, e.g.:
RESTORE DATABASE [SomeDB]
FROM DISK = N'D:\SomeDB.bak'
WITH MOVE N'SomeDB.mdf' TO N'D:\SQL Server\MSSQL12.MyInstance\MSSQL\DATA\SomeDB.mdf',
MOVE N'SomeDb_log.ldf' TO N'D:\SQL Server\MSSQL12.MyInstance\MSSQL\DATA\SomeDB_log.ldf'
Hope that saves someone some pain, I could not understand why SQL was suggesting I needed to use the WITH MOVE option when I already was doing so.
Please try to uncheck the “Tail-Log Backup” option on the Options page of the Restore Database dialog
There may be a version issue here. You can migrate your database to 2012 by two other methods:
1) Take the database offline, copy the .mdf and .ldf files to the target server's data folder, and attach the database (see the attach sketch after this list). Refer to this:
https://dba.stackexchange.com/questions/30440/how-do-i-attach-a-database-in-sql-server
2) Create a script of the whole database with schema and data and run it on the target server (a very slow process that takes time). Refer to this:
Generate script in SQL Server Management Studio
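For method 1, the attach step could look something like this minimal sketch (database name and file paths are placeholders; adjust them to wherever you copied the files):
-- Attach the copied data and log files on the target server
CREATE DATABASE MYDB_ABC
    ON (FILENAME = N'D:\Data\MYDB_ABC.mdf'),
       (FILENAME = N'D:\Data\MYDB_ABC_log.ldf')
    FOR ATTACH;
GO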
Try restarting the SQL Service. Worked for me.
Just in case this is useful for someone working directly with PowerShell (using the SMO library): in this particular case there were secondary data files as well. I enhanced the script a little by killing any open processes and then doing the restore.
Import-module SQLPS
$svr = New-Object ("Microsoft.SqlServer.Management.Smo.Server") "server name";
$svr.KillAllProcesses("database_name");
$RelocateData1 = New-Object "Microsoft.SqlServer.Management.Smo.RelocateFile, Microsoft.SqlServer.SmoExtended, Version=13.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91" ("primary_logical_name","C:\...\SQLDATA\DATA\database_name.mdf")
$RelocateData2 = New-Object "Microsoft.SqlServer.Management.Smo.RelocateFile, Microsoft.SqlServer.SmoExtended, Version=13.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91" ("secondary_logical_name_2","C:\...\SQLDATA\DATA\secondary_file_2.mdf")
$RelocateData3 = New-Object "Microsoft.SqlServer.Management.Smo.RelocateFile, Microsoft.SqlServer.SmoExtended, Version=13.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91" ("secondary_logical_name_3","C:\...\SQLDATA\DATA\secondary_file_3.mdf")
$RelocateLog = New-Object "Microsoft.SqlServer.Management.Smo.RelocateFile, Microsoft.SqlServer.SmoExtended, Version=13.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91" ("database_name_log","C:\...\SQLDATA\LOGS\database_name_log.ldf")
Restore-SqlDatabase -ServerInstance "server-name" -Database "database_name" -BackupFile "\\BACKUPS\\database_name.bak" -RelocateFile @($RelocateData1, $RelocateData2, $RelocateData3, $RelocateLog) -ReplaceDatabase
You should remove these lines from your script.
CONTAINMENT = NONE
ON PRIMARY
( NAME = N'StudentManagement', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL11.SQLEXPRESS\MSSQL\DATA\StudentManagement.mdf' , SIZE = 10240KB , MAXSIZE = UNLIMITED, FILEGROWTH = 1024KB )
LOG ON
( NAME = N'StudentManagement_log', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL11.SQLEXPRESS\MSSQL\DATA\StudentManagement_log.ldf' , SIZE = 5696KB , MAXSIZE = 2048GB , FILEGROWTH = 10%)
GO
ALTER DATABASE [StudentManagement] SET COMPATIBILITY_LEVEL = 110
GO
IF (1 = FULLTEXTSERVICEPROPERTY('IsFullTextInstalled'))
begin
EXEC [StudentManagement].[dbo].[sp_fulltext_database] @action = 'enable'
end
GO
ALTER DATABASE [StudentManagement] SET ANSI_NULL_DEFAULT OFF
GO
ALTER DATABASE [StudentManagement] SET ANSI_NULLS OFF
GO
ALTER DATABASE [StudentManagement] SET ANSI_PADDING OFF
GO
ALTER DATABASE [StudentManagement] SET ANSI_WARNINGS OFF
GO
ALTER DATABASE [StudentManagement] SET ARITHABORT OFF
GO
ALTER DATABASE [StudentManagement] SET AUTO_CLOSE OFF
GO
ALTER DATABASE [StudentManagement] SET AUTO_CREATE_STATISTICS ON
GO
ALTER DATABASE [StudentManagement] SET AUTO_SHRINK OFF
GO
ALTER DATABASE [StudentManagement] SET AUTO_UPDATE_STATISTICS ON
GO
ALTER DATABASE [StudentManagement] SET CURSOR_CLOSE_ON_COMMIT OFF
GO
ALTER DATABASE [StudentManagement] SET CURSOR_DEFAULT GLOBAL
GO
ALTER DATABASE [StudentManagement] SET CONCAT_NULL_YIELDS_NULL OFF
GO
ALTER DATABASE [StudentManagement] SET NUMERIC_ROUNDABORT OFF
GO
ALTER DATABASE [StudentManagement] SET QUOTED_IDENTIFIER OFF
GO
ALTER DATABASE [StudentManagement] SET RECURSIVE_TRIGGERS OFF
GO
ALTER DATABASE [StudentManagement] SET DISABLE_BROKER
GO
ALTER DATABASE [StudentManagement] SET AUTO_UPDATE_STATISTICS_ASYNC OFF
GO
ALTER DATABASE [StudentManagement] SET DATE_CORRELATION_OPTIMIZATION OFF
GO
ALTER DATABASE [StudentManagement] SET TRUSTWORTHY OFF
GO
ALTER DATABASE [StudentManagement] SET ALLOW_SNAPSHOT_ISOLATION OFF
GO
ALTER DATABASE [StudentManagement] SET PARAMETERIZATION SIMPLE
GO
ALTER DATABASE [StudentManagement] SET READ_COMMITTED_SNAPSHOT OFF
GO
ALTER DATABASE [StudentManagement] SET HONOR_BROKER_PRIORITY OFF
GO
ALTER DATABASE [StudentManagement] SET RECOVERY SIMPLE
GO
ALTER DATABASE [StudentManagement] SET MULTI_USER
GO
ALTER DATABASE [StudentManagement] SET PAGE_VERIFY CHECKSUM
GO
ALTER DATABASE [StudentManagement] SET DB_CHAINING OFF
GO
ALTER DATABASE [StudentManagement] SET FILESTREAM( NON_TRANSACTED_ACCESS = OFF )
GO
ALTER DATABASE [StudentManagement] SET TARGET_RECOVERY_TIME = 0 SECONDS
This usually happens when you are using one Management Studio window for the backup (connected to the old server) and another for the restore (connected to the new one). Just make sure you are executing the restore on the correct server. Either check the server name and IP in the left pane of the UI, or double-check which server your query window is connected to.
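A one-line sanity check before running the restore (nothing project-specific about it):
-- Which instance is this query window actually connected to?
SELECT @@SERVERNAME AS connected_instance;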
If you're doing this with C#, and the physical paths are not the same, you need to use RelocateFiles, as one answer here also mentioned.
For most cases, the below code will work, assuming:
You're just restoring a backup of a database from elsewhere, otherwise meant to be identical. For example, a copy of production to a local Db.
You aren't using an atypical database layout, for example one where the rows files are spread across multiple files on multiple disks.
In addition, the below is only necessary on first restore. Once a single successful restore occurs, the below file mapping will already be setup for you in Sql Server. But, the first time - restoring a bak file to a blank db - you basically have to say, "Yes, use the Db files in their default, local locations, instead of freaking out" and you need to tell it to keep things in the same place by, oddly enough, telling it to relocate them:
var dbDataFile = db.FileGroups[0].Files[0];
restore.RelocateFiles.Add(new RelocateFile(dbDataFile.Name, dbDataFile.FileName));
var dbLogFile = db.LogFiles[0];
restore.RelocateFiles.Add(new RelocateFile(dbLogFile.Name, dbLogFile.FileName));
To better clarify what a typical case would be, and how you'd do the restore, here's the full code for a typical restore of a .bak file to a local machine:
var smoServer = new Microsoft.SqlServer.Management.Smo.Server(
new Microsoft.SqlServer.Management.Common.ServerConnection(sqlServerInstanceName));
var db = smoServer.Databases[dbName];
if (db == null)
{
db = new Microsoft.SqlServer.Management.Smo.Database(smoServer, dbName);
db.Create();
}
var restore = new Restore();  // the SMO Restore object used below
restore.Devices.AddDevice(backupFileName, DeviceType.File);
restore.Database = dbName;
restore.FileNumber = 0;
restore.Action = RestoreActionType.Database;
restore.ReplaceDatabase = true;
var dbDataFile = db.FileGroups[0].Files[0];
restore.RelocateFiles.Add(new RelocateFile(dbDataFile.Name, dbDataFile.FileName));
var dbLogFile = db.LogFiles[0];
restore.RelocateFiles.Add(new RelocateFile(dbLogFile.Name, dbLogFile.FileName));
restore.SqlRestore(smoServer);
db.SetOnline();
smoServer.Refresh();
db.Refresh();
This code will work whether you've manually restored this Db before, created one manually with just the name and no data, or done nothing - started with a totally blank machine, with just Sql Server installed and no databases whatsoever.
Please change the .mdf file path. Just create a folder on any drive, e.g. on the "D" drive, create a folder with a custom name (dbase) and point the path to the new folder; MSSQL will automatically create the files.
"C:\PROGRAM FILES\MICROSOFT SQL SERVER\MSSQL.1\MSSQL\DATA\MYDB_ABC.MDF"
to
"D:\dbase\MYDB_ABC.MDF"
