PowerShell restore SQL Server database to new database - sql-server

I have a database $CurrentDB and I want to restore a backup of $CurrentDB to $NewDB. The T-SQL command looks like this:
USE [master]
ALTER DATABASE [NewDB]
SET SINGLE_USER WITH ROLLBACK IMMEDIATE
RESTORE DATABASE [NewDB]
FROM DISK = N'D:\Backups\CurrentDB.bak'
WITH FILE = 1,
MOVE N'CurrentDB' TO N'D:\Databases\NewDB.mdf',
MOVE N'CurrentDB_log' TO N'D:\Logs\NewDB_log.ldf',
NOUNLOAD, REPLACE, STATS = 5
ALTER DATABASE [NewDB]
SET MULTI_USER
GO
I am attempting to use Restore-SqlDatabase, but I don't know how to properly use -RelocateFile:
$CurrentDB = "CurrentDB"
$NewDB = "NewDB"
$NewDBmdf = "NewDB.mdf"
$CurrentDBlog = "CurrentDB_log"
$NewDBldf = "NewDB_log.ldf"
$backupfile = $CurrentDB + "ToNewDB.bak"
$RelocateData = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile($CurrentDB, $NewDBmdf)
$RelocateLog = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile($CurrentDBlog, $NewDBldf)
Restore-SqlDatabase -ServerInstance $SQLServer -Database $NewDB -BackupFile $backupfile `
    -ReplaceDatabase -NoRecovery -RelocateFile @($RelocateData, $RelocateLog)
I can't seem to locate an example of what I am attempting to do. I have seen plenty of examples of restoring databases with the same name but different files. I want a different name and different file names. I am open to suggestions.

You don't have to use SMO just because you're in PowerShell.
import-module sqlps
$database = "NewDb"
$backupLocation = "D:\Backups\CurrentDB.bak"
$dataFileLocation = "D:\Databases\NewDB.mdf"
$logFileLocation = "D:\Logs\NewDB_log.ldf"
$sql = @"
USE [master]
ALTER DATABASE [$database]
SET SINGLE_USER WITH ROLLBACK IMMEDIATE
RESTORE DATABASE [$database]
FROM DISK = N'$backupLocation'
WITH FILE = 1,
MOVE N'CurrentDB' TO N'$dataFileLocation',
MOVE N'CurrentDB_log' TO N'$logFileLocation',
NOUNLOAD, REPLACE, STATS = 5
ALTER DATABASE [$database]
SET MULTI_USER
"#
invoke-sqlcmd $sql
And if you don't have sqlps installed, you can use System.Data.SqlClient from PowerShell to run the T-SQL.
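For example, here is a minimal sketch of running the same restore batch through System.Data.SqlClient; the connection string is an assumption (adjust the server and authentication), and it reuses the $sql here-string built above:
# Minimal sketch: run the restore T-SQL via System.Data.SqlClient instead of sqlps.
$connectionString = "Data Source=.;Integrated Security=SSPI;Initial Catalog=master"  # assumption: local default instance, Windows auth
$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
try {
    $connection.Open()
    $command = $connection.CreateCommand()
    $command.CommandText = $sql       # the same restore batch built above (no GO separators)
    $command.CommandTimeout = 0       # restores can outlast the 30-second default
    [void]$command.ExecuteNonQuery()
}
finally {
    $connection.Close()
}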

$RelocateData = [Microsoft.SqlServer.Management.Smo.RelocateFile]::new($CurrentDB, $NewDBmdf)
$RelocateLog = [Microsoft.SqlServer.Management.Smo.RelocateFile]::new($CurrentDBlog, $NewDBldf)
Restore-SqlDatabase -ServerInstance $SQLServer -Database $NewDB -BackupFile $backupfile `
    -ReplaceDatabase -NoRecovery -RelocateFile @($RelocateData, $RelocateLog)
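Note that -NoRecovery leaves the restored database in the RESTORING state; if this backup is the only one you plan to apply, either drop -NoRecovery or bring the database online afterwards, for example:
# Only needed after a -NoRecovery restore, once no further backups will be applied.
Invoke-Sqlcmd -ServerInstance $SQLServer -Query "RESTORE DATABASE [$NewDB] WITH RECOVERY"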

Related

Problem sending ftp file using SQL Server stored procedure

I am trying to send a file by ftp using a stored procedure in SQL Server.
When I run the procedure I get:
'OPTS':command not implemented.
after the first line of the script, and then:
PORT address does not match originator.
after the 'put' command.
The stored procedure is:
ALTER PROCEDURE [dbo].[TestFTP]
    @FTPScriptFile nvarchar(128)
AS
SET NOCOUNT ON;
BEGIN
    DECLARE @FTPCommand nvarchar(256)
    SET @FTPCommand = 'ftp -s:' + @FTPScriptFile
    EXEC master..xp_cmdshell @FTPCommand
END
RETURN
The ftp script file contains:
open ftp.jht.co.uk 21
username
password
binary
put "D:\TestFiles\SampleFile.csv"
disconnect
bye
Any idea as to the problem?
Thanks for taking the time to look and any help would be greatly appreciated.
I decided to take a look at the first suggestion, which was to use PowerShell instead, and that worked.
I created a PowerShell script file:
$Directory = "D:\TestFiles"
$filename = "Example.csv"
$fullfile = "D:\TestFiles\Example.csv"
$ftpserver = "ftp://ftp.jht.co.uk/"
$username = "user"
$password = "pwd"
$ftpserverURI = New-Object -TypeName System.Uri -ArgumentList $ftpserver, [System.UriKind]::Absolute
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object -TypeName System.Net.NetworkCredential -ArgumentList $username, $password
$uri = New-Object -TypeName System.Uri -ArgumentList $ftpserverURI, $filename
$webclient.UploadFile($uri, [System.Net.WebRequestMethods+Ftp]::UploadFile, $fullfile)
And then ran the script with PowerShell from the stored procedure:
DECLARE @Command nvarchar(256)
SET @Command = 'powershell "' + @ScriptFile + '"'
EXEC master..xp_cmdshell @Command
Thanks to everyone who took a look and suggested a solution.

Invoke-sqlcmd wait for Restoration to complete before going to next task

I'm running this PowerShell script in order to restore a database on SQL Server.
$Query = @"
EXEC msdb.dbo.sp_start_job N'Job';
GO
"@
Invoke-Sqlcmd -ServerInstance .... -Query $Query -Verbose
I get this output => VERBOSE: 'job' started successfully.
When I check SQL Server, I find the database in a restoring state, but the restore hasn't finished.
My problem is that I need to make sure that the database was successfully restored before passing to the next tasks.
I did a lot of research on other methods, like using dbo.restorehistory to get the result, but it's not really efficient.
Is there a way to make the Invoke-Sqlcmd command wait for the job to finish before continuing the execution of the script?
I found a solution to make sure that the restoration completed successfully by using the stored procedure sp_help_jobhistory; here's the link to the documentation: https://learn.microsoft.com/en-us/sql/relational-databases/system-stored-procedures/sp-help-jobhistory-transact-sql?view=sql-server-ver15
Here's the script:
[string]$output;
$Query = @"
USE msdb;
GO
declare @temp table
(
job_id varchar(255),
job_name varchar(255),
run_status varchar(255),
run_date varchar(255),
run_time varchar(255),
run_duration varchar(255),
operator_emailed varchar(255),
operator_netsent varchar(255),
operator_paged varchar(255),
retries_attempted varchar(255),
server varchar(255)
);
INSERT @temp Exec dbo.sp_help_jobhistory @job_name = N'JobName';
select top 1 run_status from @temp order by run_date Desc, run_time Desc;
GO
"#
$output = Invoke-Sqlcmd -ServerInstance xxxxxx -Database xxxxx -Username xxx -Password xxxx -Query $Query
$result = $output.ItemArray
if ($result -eq '1') {
    Write-Host "Restoration succeeded"
}
else {
    Write-Host "Restoration failed"
}
I used a table variable to capture the output of the stored procedure, then took the last job executed by ordering the result of the select query.
I also added the following script in a prior step to test that the restore had completed, because I can't access the database while it is still restoring.
[string]$success = 'false'
$i = 0;
$ErrorActionPreference = 'SilentlyContinue'
$query = @"
SELECT * FROM tiers Where id = 1;
GO
"@
do {
    Invoke-Sqlcmd -ServerInstance xxxxxx -Database xxxxx -Username xxx -Password xxxx -Query $query
    if ($?) {
        $success = 'true';
        break;
    }
    Write-Host "Failed"
    Start-Sleep -Seconds 60
    $i++
} while ($i -le 2)
if ($success -eq 'true') {
    Write-Host "Success"
}
Else {
    Write-Host "Waited too long, you need to manually check the restoration steps"
}
You can simply add the do...while loop to the first script; in my case I needed to test the completion of the restore separately from the script that checks its status.
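For reference, here is a minimal combined sketch based on the two scripts above (the instance, credentials, job name, and probe query are placeholders, as before): poll until the restored database answers a lightweight query, then read the last run status from the job history.
# Combined sketch: wait until the restored database is reachable, then check the job outcome.
# Instance names, credentials, and the job name below are placeholders.
$ErrorActionPreference = 'SilentlyContinue'
$probe = "SELECT 1;"
$statusQuery = @"
declare @temp table
(
    job_id varchar(255), job_name varchar(255), run_status varchar(255),
    run_date varchar(255), run_time varchar(255), run_duration varchar(255),
    operator_emailed varchar(255), operator_netsent varchar(255), operator_paged varchar(255),
    retries_attempted varchar(255), server varchar(255)
);
INSERT @temp EXEC dbo.sp_help_jobhistory @job_name = N'JobName';
SELECT TOP 1 run_status FROM @temp ORDER BY run_date DESC, run_time DESC;
"@
$i = 0
do {
    # Any lightweight query against the restored database fails while it is still in the RESTORING state.
    Invoke-Sqlcmd -ServerInstance xxxxxx -Database xxxxx -Username xxx -Password xxxx -Query $probe
    if ($?) { break }
    Start-Sleep -Seconds 60
    $i++
} while ($i -le 2)
$run = Invoke-Sqlcmd -ServerInstance xxxxxx -Database msdb -Username xxx -Password xxxx -Query $statusQuery
if ($run.run_status -eq '1') { Write-Host "Restoration succeeded" } else { Write-Host "Restoration failed" }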

SQL Server: drop all databases except the system ones

In PowerShell I am using the following code to delete all non system SQL Server databases:
invoke-sqlcmd -ServerInstance $sqlInstanceName -U $sqlUser -P $sqlPass -Query "
EXEC sp_MSforeachdb
'IF DB_ID(''?'') > 4
BEGIN
ALTER DATABASE [?] SET SINGLE_USER WITH ROLLBACK IMMEDIATE
DROP DATABASE [?]
END'
"
And it seems to do the job. But when I re-run it I get:
invoke-sqlcmd : Option 'SINGLE_USER' cannot be set in database 'master'.
Option 'SINGLE_USER' cannot be set in database 'tempdb'.
At C:\tmp\drop.ps1:19 char:5
+ invoke-sqlcmd -ServerInstance $sqlInstanceName -U $sqlUser -P $sq ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [Invoke-Sqlcmd], SqlPowerShellSqlExecutionException
+ FullyQualifiedErrorId : SqlError,Microsoft.SqlServer.Management.PowerShell.GetScriptCommand
I thought IF DB_ID(''?'') > 4 would skip any system dbs:
How can I omit system databases and allow SQL Server 2008 agent job to move past ERROR_NUMBER 208?
How do I make it terminate gracefully if only system dbs (master, model, msdb, tempdb) are found?
I suspect what is happening here is syntax checking before the actual evaluation of IF. You need to introduce another level of "dynamism" to your query.
EXEC sp_MSforeachdb
'IF DB_ID(''?'') > 4
BEGIN
EXEC (''ALTER DATABASE [?] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
DROP DATABASE [?]'' )
END'
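If you also want the script to terminate gracefully when only system databases exist, a sketch of the same fix filtering by name instead of by DB_ID is shown below; it reuses the question's connection variables and simply matches nothing when only master, model, msdb, and tempdb are present:
# Variant sketch: skip system databases by name instead of by DB_ID.
# When only the system databases exist, nothing matches and the script does nothing.
invoke-sqlcmd -ServerInstance $sqlInstanceName -Username $sqlUser -Password $sqlPass -Query "
EXEC sp_MSforeachdb
'IF ''?'' NOT IN (''master'', ''model'', ''msdb'', ''tempdb'')
BEGIN
    -- dynamic SQL again, so the ALTER is not validated against system databases at parse time
    EXEC (''ALTER DATABASE [?] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    DROP DATABASE [?]'')
END'
"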

How to delete SQL Server databases through a vNext build task properly

We have a PowerShell cleanup script for our test machines:
$sqlConnection = new-object system.data.SqlClient.SqlConnection("Data Source=.\SQLExpress;Integrated Security=SSPI;Initial Catalog=master")
try {
$sqlConnection.Open()
$commandText = @"
exec sp_msforeachdb 'IF ''?'' NOT IN (''master'', ''model'', ''msdb'', ''tempdb'')
BEGIN
drop database [?]
END'
"#
$sqlCommand = New-Object System.Data.SqlClient.SqlCommand
$sqlCommand.CommandText = $commandText
$sqlCommand.Connection = $sqlConnection
$SQLCommand.CommandTimeout = 0
$sqlCommand.ExecuteNonQuery()
}
finally{
$sqlConnection.Close()
}
Normally it works, but sometimes it cannot delete databases, since there seem to be some open connections and the build task fails to delete the databases as they are in use.
This also seems to occur at "some point", seemingly at random.
Any advice to enhance the script?
(using the latest TFS 2017 on-prem and SQL Server 2014)
If you need to cut off all users with no warning, set the database offline before dropping it.
$commandText = @"
exec sp_msforeachdb 'IF ''?'' NOT IN (''master'', ''model'', ''msdb'', ''tempdb'')
BEGIN
alter database [?] set offline with rollback immediate;drop database [?];
END';
"#
I found a script here:
Drop all databases from server
-- drops all user databases
DECLARE @command nvarchar(max)
SET @command = ''
SELECT @command = @command
    + 'ALTER DATABASE [' + [name] + '] SET single_user with rollback immediate;' + CHAR(13) + CHAR(10)
    + 'DROP DATABASE [' + [name] + '];' + CHAR(13) + CHAR(10)
FROM [master].[sys].[databases]
WHERE [name] not in ('master', 'model', 'msdb', 'tempdb');
SELECT @command
EXECUTE sp_executesql @command
It works as intended; still, thanks for your help.
Might I suggest using SMO?
push-location;
import-module sqlps -disablenamechecking;
pop-location
$serverName = '.';
$server = new-object microsoft.sqlserver.management.smo.server $servername;
foreach ($db in $server.Databases | where {$_.IsSystemObject -eq $false}) {
$server.killDatabase($db.Name);
}

What is the proper way to move a database from one drive to another in SQL Server 2005?

I'm not looking to relocate the database to another server entirely, but just move the data file(s) and log file to another drive with more space. I've seen conflicting directions on how to do this, so I'm looking for the recommended proper way of doing it.
Detach the Database:
use master
go
sp_detach_db 'mydb'
Move the Database files (Xcopy through xp_cmdshell shown):
DECLARE @SRCData nvarchar(1000)
SET @SRCData = N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\mydb.mdf';
DECLARE @SRCLog nvarchar(1000)
SET @SRCLog = N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\mydb_log.ldf';
DECLARE @FILEPATH nvarchar(1000);
DECLARE @LOGPATH nvarchar(1000);
-- quote the source paths because they contain spaces
SET @FILEPATH = N'xcopy /Y "' + @SRCData + N'" D:\Data';
SET @LOGPATH = N'xcopy /Y "' + @SRCLog + N'" E:\Log';
exec xp_cmdshell @FILEPATH;
exec xp_cmdshell @LOGPATH;
ReAttach Database:
sp_attach_db 'mydb', 'D:\Data\mydb.mdf', 'E:\Log\mydb_log.ldf'
There's more detail at this Microsoft KB article.
Another way: detach the database files (Database -> Tasks -> Detach), move them to the new drive, and then attach them again.
But the way described by Jay S is the simplest.
To be absolutely safe, I would do the following:
1. Back up the database to a BAK file.
2. Take the current database offline, or delete it if you want to.
3. Restore the database and change the location of the MDF and LDF files.
Sample scripts:
-- Get the file list from a backup file.
-- This will show you current logical names and paths in the BAK file
RESTORE FILELISTONLY FROM disk = N'C:\Backups\MyDatabaseName.bak'
-- Perform the restore of the database from the backup file.
-- Replace 'move' names (MDFLogicalName, LDFLogicalName) with those found in
-- the previous filelistonly command
restore database MyDatabaseName
from disk = N'C:\Backups\MyDatabaseName.bak'
with move 'MDFLogicalName' to 'D:\SQLData\MyDatabaseName.mdf',
move 'LDFLogicalName' to 'D:\SQLLogs\MyDatabaseName_log.ldf',
replace, stats=10;
Notes
The first script will get you the current logical names and paths that you'll need in the second script. The second script restores the database back to the name you want it to have, but lets you change where the files are stored. In the example above, it moves the MDF and LDF files to the D: drive.
I'd rather not enable xp_cmdshell on my SQL Server instance, so I wrote a function to do this using PowerShell instead; it was especially useful when I had to move a large number of databases.
function Move-Database
{
param ($database, $newPath)
$paths = Invoke-SqlCmd "SELECT master_files.physical_name as Path
FROM sys.databases
JOIN sys.master_files ON master_files.database_id = databases.database_id
WHERE databases.name = '$database';";
$paths = $paths | % { $_.Path };
if (!$paths)
{
throw "Unknown database '$database'";
}
Write-Host "Setting $database to single-user mode...";
Invoke-SqlCmd "ALTER DATABASE [$database] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;";
Write-Host "Detaching $database";
Invoke-SqlCmd "EXEC sp_detach_db '$database';";
if (!(test-path $newPath))
{
[void](mkdir $newPath);
}
$clauses = @();
foreach ($oldFile in $paths)
{
$filename = [System.IO.Path]::GetFileName($oldFile);
$newFile = [System.IO.Path]::Combine($newPath, $filename);
$clauses += "(FILENAME = '$newFile')";
Write-Host "Moving $oldFile to $newFile";
mv $oldFile $newFile;
}
$clauses = $clauses -join ", ";
Write-Host "Re-attaching $database";
Invoke-SqlCmd "CREATE DATABASE [$database] ON $clauses FOR ATTACH;";
Write-Host "All done!";
}
You can use it like so:
Move-Database -database "MyDatabase" -newPath "D:\SqlData";
I also think this method is a bit more robust than the others: what if your database is split into many files, or you have an unusual naming convention for your log files, for example?
I needed to move multiple databases within the same server, so I expanded the accepted solution a bit to avoid copying and pasting or retyping commands. This lets you move the data files in one script run, changing only the database name. Note this assumes that advanced options and xp_cmdshell are enabled; if not, enable them with sp_configure. The data and log files are assumed to be in the same directory.
use master
DECLARE @DBName nvarchar(50)
SET @DBName = 'YOUR_DB_NAME'
DECLARE @RC int
EXEC @RC = sp_detach_db @DBName
DECLARE @NewPath nvarchar(1000)
SET @NewPath = 'E:\Data\Microsoft SQL Server\Data\';
DECLARE @OldPath nvarchar(1000)
SET @OldPath = 'C:\Program Files\Microsoft SQL Server\MSSQL11.SQLEXPRESS\MSSQL\DATA\';
DECLARE @DBFileName nvarchar(100)
SET @DBFileName = @DBName + '.mdf';
DECLARE @LogFileName nvarchar(100)
SET @LogFileName = @DBName + '_log.ldf';
DECLARE @SRCData nvarchar(1000)
SET @SRCData = @OldPath + @DBFileName;
DECLARE @SRCLog nvarchar(1000)
SET @SRCLog = @OldPath + @LogFileName;
DECLARE @DESTData nvarchar(1000)
SET @DESTData = @NewPath + @DBFileName;
DECLARE @DESTLog nvarchar(1000)
SET @DESTLog = @NewPath + @LogFileName;
DECLARE @FILEPATH nvarchar(1000);
DECLARE @LOGPATH nvarchar(1000);
SET @FILEPATH = N'xcopy /Y "' + @SRCData + N'" "' + @NewPath + '"';
SET @LOGPATH = N'xcopy /Y "' + @SRCLog + N'" "' + @NewPath + '"';
exec xp_cmdshell @FILEPATH;
exec xp_cmdshell @LOGPATH;
EXEC @RC = sp_attach_db @DBName, @DESTData, @DESTLog
go
You also need to make sure the account the SQL Server service runs under has access to the new folder. For a named instance called SQL2014, the default service account is "NT Service\MSSQL$SQL2014".
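If you need to grant that access, here is a minimal sketch from PowerShell; the folder path and account name are placeholders taken from the examples above, so substitute your own path and whatever account your instance actually runs as:
# Give the SQL Server service account modify rights on the new data folder.
# Both values below are placeholders; adjust to your environment.
$folder  = 'E:\Data\Microsoft SQL Server\Data'
$account = 'NT Service\MSSQL$SQL2014'
icacls $folder /grant "$($account):(OI)(CI)M"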
