Microsoft.SqlServer.Management.Smo.Server.SqlRestore timeout issues - database

I created a PowerShell script based on a script I found, Start-Migration. It is a great script and gave me a lot of really good ideas; however, I am running into issues when attempting to restore large databases, or any database whose restore takes longer than 10 minutes. I have attempted to use both the Invoke-Sqlcmd2 function I found and the Restore class from the Microsoft.SqlServer.Management.Smo namespace, both of which time out after 10 minutes. I have also tried increasing the connection timeout, even setting it to 1200. Any suggestions would be welcome.
Function Restore-SQLDatabase {
<#
.SYNOPSIS
Restores .bak file to SQL database. Creates db if it doesn't exist. $filestructure is
a custom object that contains logical and physical file locations.
.EXAMPLE
$filestructure = Get-SQLFileStructures $sourceserver $destserver $ReuseFolderstructure
Restore-SQLDatabase $destserver $dbname $backupfile $filestructure
.OUTPUTS
$true if success
$false if failure
#>
[CmdletBinding()]
param(
[Parameter(Mandatory = $true)]
[ValidateNotNullOrEmpty()]
[object]$server,
[Parameter(Mandatory = $true)]
[ValidateNotNullOrEmpty()]
[string]$dbname,
[Parameter(Mandatory = $true)]
[ValidateNotNullOrEmpty()]
[string]$backupfile,
[Parameter(Mandatory = $true)]
[ValidateNotNullOrEmpty()]
[object]$filestructure
)
$servername = $server.Name
$server.ConnectionContext.StatementTimeout = 0
$restore = New-Object "Microsoft.SqlServer.Management.Smo.Restore"
foreach ($file in $filestructure.databases[$dbname].destination.values) {
$movefile = New-Object "Microsoft.SqlServer.Management.Smo.RelocateFile"
$movefile.LogicalFileName = $file.logical
$movefile.PhysicalFileName = $file.physical
$null = $restore.RelocateFiles.Add($movefile)
}
Write-Host "Restoring $dbname to $servername" -ForegroundColor Yellow
try{
$Percent = [Microsoft.SqlServer.Management.Smo.PercentCompleteEventHandler]{
Write-Progress -Id 1 -Activity "Restoring $dbname to $ServerName" -PercentComplete $_.Percent -Status ([System.String]::Format("Progress: {0}%",$_.Percent))
}
$restore.add_PercentComplete($Percent)
$restore.PercentCompleteNotification = 1 # fire the progress event every 1 percent
# $Complete must be defined before it is wired up below (it was previously undefined)
$Complete = [Microsoft.SqlServer.Management.Common.ServerMessageEventHandler]{
Write-Host "Restore of $dbname complete." -ForegroundColor Green
}
$restore.add_Complete($Complete)
$restore.ReplaceDatabase = $true
$restore.Database = $dbname
$restore.Action = "Database"
$restore.NoRecovery = $false
$device = New-Object -TypeName Microsoft.SqlServer.Management.Smo.BackupDeviceItem
$device.Name = $backupfile
$device.DeviceType = "File"
$restore.Devices.Add($device)
Write-Progress -Id 1 -Activity "Restoring $dbname to $servername" -PercentComplete 0 -Status([System.String]::Format("Progress: {0}%",0))
$restore.SqlRestore($servername)
# $query = $restore.Script($ServerName)
# Write-Host $query
# Invoke-Sqlcmd2 -ServerInstance $servername -Database master -Query $query -ConnectionTimeout 1200
# Write-Host "Restoring $dbname to $servername from " $restore.Devices.ToString() -ForegroundColor Magenta
Write-Progress -Id 1 -Activity "Restore $dbname to $servername" -Status "Complete" -Completed
return $true
}
catch{
Write-Error $_.Exception.ToString()
Write-Warning "Restore failed: $($_.Exception.InnerException.Message)"
return $false
}
}
When the restore process takes place, $restore.SqlRestore($ServerName), on my larger databases it returns saying that the script timed out. I am trying to figure out how to correct this. I have tried increasing StatementTimeout to 1200 and it still stops after 10 minutes. I even attempted to use Invoke-Sqlcmd; as you can see, I commented it out while trying different options. I am at wits' end right now.

If you have to use the script you found
I think that this line `$server.ConnectionContext.StatementTimeout = 0` is not actually doing anything or changing the timeout threshold.
Try changing the line to this instead:
$server = New-Object("Microsoft.SqlServer.Management.Smo.Server") $server
$server.ConnectionContext.StatementTimeout = 0
$server.ConnectionContext.Connect()
However, I would recommend avoiding this approach entirely, because you can use the much, much easier SQLPS cmdlets. The approach you're using dates from a long time ago, when we didn't have actual PowerShell cmdlets to work with SQL Server.
Nowadays we have the Restore-SqlDatabase cmdlet, which lets you do in one line what your code is currently doing in ~30 lines!
How to Use Restore-SQLDatabase instead
This is so much easier, and you'll really thank me, I think.
Restore-SqlDatabase -ServerInstance "$server\InstanceName" `
    -Database $dbname `
    -BackupFile $BackupFile
and...that's it. One single line! And you can also specify the timeout easily here, unlike before. To specify the maximum timeout:
Restore-SqlDatabase -ServerInstance "$server\InstanceName" `
    -Database $dbname -BackupFile $BackupFile -ConnectionTimeout 0
That should get you started. Trust me, using the cmdlets is so much easier than using some random script you find online.

Okay, I got it to work. I am not completely sure why it worked, but here is what I did. I created a variable $Server as follows:
$Server = New-Object Microsoft.SqlServer.Management.Smo.Server $Source
$Server.ConnectionContext.ConnectTimeout = 0
Now, when I call the function that actually does the restore, I was doing something that I didn't need to do:
$ServerName = $server.Name
Instead of using $ServerName, I used the object $server:
$restore.SqlRestore($servername) # Original Script
$restore.SqlRestore($server) # Change
and now it stays open long enough to complete the restore process.
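For what it's worth, this would explain the 10-minute cutoff: SqlRestore() expects an Smo.Server object, so when you pass a string, PowerShell silently constructs a brand-new Server from that name, and that new connection gets the default 600-second StatementTimeout instead of the one you configured. Putting the pieces together, a minimal sketch of the working pattern (assuming the SMO assemblies are loaded and $Source, $dbname, and $backupfile are set as above):
$Server = New-Object Microsoft.SqlServer.Management.Smo.Server $Source
$Server.ConnectionContext.ConnectTimeout = 0
$Server.ConnectionContext.StatementTimeout = 0 # no statement timeout
$restore = New-Object Microsoft.SqlServer.Management.Smo.Restore
$restore.Database = $dbname
$restore.Action = "Database"
$restore.Devices.AddDevice($backupfile, "File")
# Pass the Server object itself so its ConnectionContext (and timeouts) are used
$restore.SqlRestore($Server)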

Related

How to read SQL rows as PowerShell parameters

On Azure, I am running multiple .sql files from a container against hundreds of Azure SQL databases via a PowerShell runbook.
I want PowerShell to read the server name and database name pairs to run the scripts against from my SQL Server table, which looks like this:
Servername    Databasename    Status
Server-01     DB-01           Process
Server-01     DB-02           Skip
Server-02     DB-03           Process
In my current version of the Powershell script, it can read the files in the container and run them in a given server and database:
# Get the blob container
$blobs = Get-AzStorageContainer -Name $containerName -Context $ctx | Get-AzStorageBlob
# Download the blob content to localhost and execute each one
foreach ($blob in $blobs)
{
$file = Get-AzStorageBlobContent -Container $containerName -Blob $blob.Name -Destination "." -Context $ctx
Write-Output ("Processing file :" + $file.Name)
$query = Get-Content -Path $file.Name
Invoke-Sqlcmd -ServerInstance "Server-01.database.windows.net" -Database "DB-01" -Query $query -AccessToken $access_token
Write-Output ("This file is executed :" + $file.Name)
}
I am looking for a method that will read the rows from the table and feed them into the -ServerInstance and -Database fields in the Invoke-Sqlcmd. Ideally it can filter out the Skip rows.
One method is to load the database list into a DataTable and iterate over the list for each query. Change the $databaseListConnectionString in the example code below per your authentication method and set the connection AccessToken if/as needed.
# get database list
$databaseListConnectionString = "Data Source=YourServer;Initial Catalog=YourDatabase"
$databaseListQuery = "SELECT ServerName, DatabaseName FROM dbo.DatabaseList WHERE Status = 'Process';"
$dataAdapter = New-Object System.Data.SqlClient.SqlDataAdapter($databaseListQuery, $databaseListConnectionString)
$dataAdapter.SelectCommand.Connection.AccessToken = $access_token
$databaseList = New-Object System.Data.DataTable
[void]$dataAdapter.Fill($databaseList)
# Get the blob container
$blobs = Get-AzStorageContainer -Name $containerName -Context $ctx | Get-AzStorageBlob
# Download the blob content to localhost and execute each one
foreach ($blob in $blobs) {
$file = Get-AzStorageBlobContent -Container $containerName -Blob $blob.Name -Destination "." -Context $ctx
Write-Output ("Processing file :" + $file.Name)
$query = Get-Content -Path $file.Name
foreach($database in $databaseList.Rows) {
Invoke-Sqlcmd -ServerInstance "$($database.ServerName)" -Database "$($database.DatabaseName)" -Query $query -AccessToken $access_token
Write-Output ("This file is executed :" + $file.Name)
}
}
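As a side note (an alternative, not part of the original answer): Invoke-Sqlcmd can also read the script file directly via its -InputFile parameter, which skips the Get-Content step and keeps multi-line batches intact:
foreach ($database in $databaseList.Rows) {
    # -InputFile hands the whole .sql file to Invoke-Sqlcmd as one script
    Invoke-Sqlcmd -ServerInstance $database.ServerName -Database $database.DatabaseName -InputFile $file.Name -AccessToken $access_token
}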

Generate Scripts for Tables and all Extended Properties

Looking for a way to generate individual scripts for each table, including any relationships or extended properties. (After getting this working I will try to generate scripts for stored procedures and functions.)
I am aware of the SSMS Generate Scripts wizard (right-click the database => Tasks => Generate Scripts) but that produces one big file, not individual ones. If there is a way to make it produce individual files (without doing them one at a time) that would be best.
I have used the PowerShell module dbatools with some limited success. The following will make a file that contains create scripts for the table and the table's extended property, but not the column extended properties.
$server = "sql01"
$database = "MyDatabase"
$table = "MyTable"
Get-DbaDbTable -SqlInstance $server -Database $database -Table $table | Export-DbaScript -FilePath ($database + "\" + $table +".sql")
Get-DbaDbTable -SqlInstance $server -Database $database -Table $table | Get-DbaExtendedProperty | ForEach-Object { Export-DbaScript -InputObject $_ -FilePath ($database + "\" + $table +".sql") -Append }
The answer was given by Larnu in the comments.
The comment pointed out the wizard's option to save each object to its own file ('Single file per object' under Save to file on the Set Scripting Options page), which I had been overlooking for years.
While you found the option in SSMS, I wanted to say that this is also possible using the dbatools approach you tried. The "secret sauce" is specifying a ScriptingOptions object that controls the scripting behavior.
$so = New-DbaScriptingOption;
$so.ExtendedProperties = $true;
foreach ($table in Get-DbaDbTable -SqlInstance . -Database AdventureWorks2019){
$path = '{0}.{1}.sql' -f $table.Schema, $table.Name;
Export-DbaScript -InputObject $table -FilePath $path -ScriptingOptionsObject $so;
}
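The asker mentioned stored procedures and functions as the next step; the same pattern should extend to those. A sketch, assuming the dbatools cmdlets Get-DbaDbStoredProcedure and Get-DbaDbUdf (which take the same instance/database parameters):
$so = New-DbaScriptingOption;
$so.ExtendedProperties = $true;
# Collect user stored procedures and user-defined functions, then export one file per object
$objects = @(Get-DbaDbStoredProcedure -SqlInstance . -Database AdventureWorks2019 -ExcludeSystemSp) +
           @(Get-DbaDbUdf -SqlInstance . -Database AdventureWorks2019 -ExcludeSystemUdf);
foreach ($obj in $objects){
    $path = '{0}.{1}.sql' -f $obj.Schema, $obj.Name;
    Export-DbaScript -InputObject $obj -FilePath $path -ScriptingOptionsObject $so;
}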

How may I export the output of SQL query in Excel fetched through PowerShell?

I am using this PowerShell script to fetch the versions of all SQL Servers in a list.
How may I export the result columns (only the query output, not error messages) to Excel and send them by email after the script is run?
Can someone help me add the required script, please?
Import-Module SQLPS -DisableNameChecking
$ServerInstences = Get-Content "D:\DBA\All_Server_monitoring\ServerList.txt"
$SQLQuery = #"
Select ##Servername 'Server Name' ,##version 'Version'
"#
$DBName = "master"
$ServerInstences |
ForEach-Object {
$ServerObject = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server -ArgumentList $_
Invoke-Sqlcmd -ServerInstance $_ -Database $DBName -Query $SQLQuery
}
The easiest way to export data to a CSV file is with Export-Csv, which takes an input object (or object array) and a path and fills out the CSV file from that object/array. For you, it would look like this:
$results = Invoke-Sqlcmd -ServerInstance $_ -Database $DBName -Query $SQLQuery
# Pipe the rows in; Export-Csv creates the file itself, and -NoTypeInformation drops the type header
$results | Export-Csv -Path "MYPATH.csv" -NoTypeInformation -Append
CSV files are versatile and can be opened with the most lightweight text editors. They also can be easily emailed and opened with Excel.
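For the emailing part of the question, the built-in Send-MailMessage cmdlet can attach the CSV once it is written. A sketch; the addresses and SMTP server are placeholders you would replace:
# Hypothetical mail settings; adjust for your environment
Send-MailMessage -From "dba@example.com" -To "team@example.com" `
    -Subject "SQL Server versions" -Body "Version report attached." `
    -Attachments "MYPATH.csv" -SmtpServer "smtp.example.com"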

How to create a copy of existing SQL DB on same server

I need to perform an operation, part of which has me looking for a way to create a copy of a SQL DB on the same server. I tried the suggestion given at Copy SQL Server database with PowerShell script. However, the resulting copy is about a quarter the size of the actual DB.
Ideas anyone?
Thanks
If your PowerShell solution is working except you are noticing a file-size discrepancy with the newly copied database compared to the source database, it may not be an actual problem.
SQL Server database and log sizes are variable and are typically not an exact indication of the amount of data they contain. The copied database may be "optimized" in terms of its disk file usage in a way that the source database is currently not.
There are three things you can do to convince yourself you have a working solution.
1. Run the shrink command on both of the databases to free space and see if the resulting disk files are more similar in size (see the sketch right after this item): https://learn.microsoft.com/en-us/sql/t-sql/database-console-commands/dbcc-shrinkdatabase-transact-sql?view=sql-server-ver15
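For example, a sketch using Invoke-Sqlcmd with placeholder names that match the benchmark script below:
# Shrink both databases so their disk files reflect the data they actually contain
Invoke-Sqlcmd -ServerInstance "YourServer" -Query "DBCC SHRINKDATABASE (SourceDb);"
Invoke-Sqlcmd -ServerInstance "YourServer" -Query "DBCC SHRINKDATABASE (CopyDb);"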
2. Write a benchmark script that compares record counts on all the tables. If you see a discrepancy in the record counts, you know you have a problem.
Example benchmark script:
declare @sourceCount int;
declare @copyCount int;
set @sourceCount = (select count(*) from SourceDb.dbo.SomeTable);
set @copyCount = (select count(*) from CopyDb.dbo.SomeTable);
if @sourceCount <> @copyCount
begin
    select 'PROBLEM!'
end
-- Now repeat for all the other tables
3. Use a SQL data comparison tool to automate the method in step 2 above. There are many such tools, including the one built into Visual Studio 2019 (Tools menu): https://learn.microsoft.com/en-us/sql/ssdt/how-to-compare-and-synchronize-the-data-of-two-databases?view=sql-server-ver15
However, all these methods will only work reliably if you can ensure that neither database receives updates during the copy/benchmarking process. If someone accesses one or the other of the databases independently and alters the data while you are measuring, your results will be invalidated, possibly without your knowing.
EDIT
I managed to make it work as soon as I started using the SqlServer module instead of the SQLPS module, because the latter has long been deprecated. I edited the answer I am referring to in my initial post, below.
I was having a similar error implementing this. I tried literally everything; it just wouldn't work. What did work for me was generating a script through the ScriptTransfer method, creating the new database, and then applying the script to the new database through Invoke-SqlCmd. I have posted a detailed explanation and code in this answer.
Okay, I managed to implement this. Anybody who needs to do this in the future, please try this:
Import-Module SqlServer -DisableNameChecking
$SQLInstanceName = "$env:COMPUTERNAME\sqlexpress"
$SourceDBName = "xxx"
$CopyDBName = "${SourceDBName}_copy"
$Server = New-Object -TypeName 'Microsoft.SqlServer.Management.Smo.Server' -ArgumentList $SQLInstanceName
$SourceDB = $Server.Databases[$SourceDBName]
$CopyDB = New-Object -TypeName 'Microsoft.SqlServer.Management.SMO.Database' -ArgumentList $Server , $CopyDBName
# Delete any existing copy
Try
{
Invoke-Sqlcmd -ServerInstance "$SQLInstanceName" -Query "Drop database $CopyDBName;" -Username "***" -Password "****" -Verbose
}
Catch
{
Write-Output 'Failed to delete database'
}
$CopyDB.create()
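# Use an SMO Transfer object to script the selected object types and copy the data into the new database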
$ObjTransfer = New-Object -TypeName Microsoft.SqlServer.Management.SMO.Transfer -ArgumentList $SourceDB
$ObjTransfer.DestinationDatabase = $CopyDBName
$ObjTransfer.DestinationServer = $Server.Name
$ObjTransfer.DestinationLoginSecure = $true
$ObjTransfer.CopyData = $true
$ObjTransfer.CopyAllObjects = $false
$ObjTransfer.CopyAllDatabaseTriggers = $true
$ObjTransfer.CopyAllDefaults = $true
$ObjTransfer.CopyAllRoles = $true
$ObjTransfer.CopyAllRules = $true
$ObjTransfer.CopyAllSchemas = $true
$ObjTransfer.CopyAllSequences = $true
$ObjTransfer.CopyAllSqlAssemblies = $true
$ObjTransfer.CopyAllSynonyms = $true
$ObjTransfer.CopyAllTables = $true
$ObjTransfer.CopyAllViews = $true
$ObjTransfer.CopyAllStoredProcedures = $true
$ObjTransfer.CopyAllUserDefinedAggregates = $true
$ObjTransfer.CopyAllUserDefinedDataTypes = $true
$ObjTransfer.CopyAllUserDefinedTableTypes = $true
$ObjTransfer.CopyAllUserDefinedTypes = $true
$ObjTransfer.CopyAllUserDefinedFunctions = $true
$ObjTransfer.CopyAllUsers = $true
$ObjTransfer.PreserveDbo = $true
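# Scripting options that control what the transfer includes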
$ObjTransfer.Options.AllowSystemObjects = $false
$ObjTransfer.Options.ContinueScriptingOnError = $true
$ObjTransfer.Options.IncludeDatabaseRoleMemberships = $true
$ObjTransfer.Options.Indexes = $true
$ObjTransfer.Options.Permissions = $true
$ObjTransfer.Options.WithDependencies = $true
$ObjTransfer.TransferData()
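Given the original question was about a size discrepancy, a quick sanity check after TransferData() (my addition, not part of the original answer) is to compare table counts through the same SMO objects:
# Refresh SMO's cached metadata, then compare table counts between source and copy
$Server.Databases.Refresh()
"{0} tables in {1}, {2} tables in {3}" -f $SourceDB.Tables.Count, $SourceDBName, $Server.Databases[$CopyDBName].Tables.Count, $CopyDBName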

Storing output of Stored Procedure in file after calling it from Powershell

I am trying to call a stored procedure (Ola's maintenance script!) on a remote server (that part of the code works), and the output of the SP is the result of DBCC CHECKDB (so it's in the Messages tab).
I tried to put together some code to capture this message output into a file on the remote server, but the file is not being created, though the SP completes fine.
$OutputFile = "\\XXX\E$\SQLAdmin\DatabaseCheckDB\ScriptOutput\ScriptOutput.txt"
$handler = [System.Data.SqlClient.SqlInfoMessageEventHandler] {param($sender, $event) Out-File -filepath $OutputFile -inputobject $event.Message };
$SqlConnection.add_InfoMessage($handler);
$SqlConnection.FireInfoMessageEventOnUserErrors = $true;
$SqlConnection = new-Object System.Data.SqlClient.SqlConnection("Server=XXX;DataBase=master;Integrated Security=SSPI")
$SqlConnection.Open() | Out-Null
$cmd = new-Object System.Data.SqlClient.SqlCommand("dbo.DatabaseIntegrityCheck", $SqlConnection)
$cmd.CommandType = [System.Data.CommandType]'StoredProcedure'
$cmd.Parameters.Add("#Databases","ALL_DATABASES") | Out-Null
$cmd.ExecuteNonQuery() | Out-Null
$SqlConnection.Close()
Can anyone see what I'm doing wrong here? Thanks in advance!
Do you have the SQL PowerShell module (SQLPS) installed? If so, you can use it and pipe the output from the Verbose stream (which contains printed messages from SQL Server) to your file.
Invoke-Sqlcmd -Query 'DBCC CHECKDB' `
-ServerInstance '(local)' `
-Database 'tempdb' `
-Verbose 4>&1 |
Out-File c:\temp\test.txt
If that isn't an option, then I think I have spotted the problem in your original code - you wire up the InfoMessage event, but you then proceed to create a brand new SqlConnection. This new SqlConnection doesn't have the event handler on it, and so won't respond to any of the printed messages.
Try replacing
$SqlConnection.add_InfoMessage($handler);
$SqlConnection.FireInfoMessageEventOnUserErrors = $true;
$SqlConnection = new-Object System.Data.SqlClient.SqlConnection("Server=XXX;DataBase=master;Integrated Security=SSPI")
with
$SqlConnection = new-Object System.Data.SqlClient.SqlConnection("Server=XXX;DataBase=master;Integrated Security=SSPI")
$SqlConnection.add_InfoMessage($handler);
$SqlConnection.FireInfoMessageEventOnUserErrors = $true;
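For completeness, the reordered flow end to end (a sketch of the corrected original code; I have added -Append so successive messages don't overwrite each other, and disabled CommandTimeout since DBCC CHECKDB can run well past the 30-second default):
$OutputFile = "\\XXX\E$\SQLAdmin\DatabaseCheckDB\ScriptOutput\ScriptOutput.txt"
$handler = [System.Data.SqlClient.SqlInfoMessageEventHandler] { param($sender, $event) Out-File -FilePath $OutputFile -InputObject $event.Message -Append }
# Create the connection first, then attach the handler to that same connection
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection("Server=XXX;Database=master;Integrated Security=SSPI")
$SqlConnection.add_InfoMessage($handler)
$SqlConnection.FireInfoMessageEventOnUserErrors = $true
$SqlConnection.Open() | Out-Null
$cmd = New-Object System.Data.SqlClient.SqlCommand("dbo.DatabaseIntegrityCheck", $SqlConnection)
$cmd.CommandType = [System.Data.CommandType]'StoredProcedure'
$cmd.CommandTimeout = 0 # no statement timeout for long-running CHECKDB
$cmd.Parameters.AddWithValue("@Databases", "ALL_DATABASES") | Out-Null
$cmd.ExecuteNonQuery() | Out-Null
$SqlConnection.Close()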
