Troubleshooting DSC configuration - sql-server

I'm looking for advice on how to troubleshoot a DSC configuration. I'm using the Azure DSC Extension, and the logs are:
VERBOSE: [2017-09-08 02:56:48Z] [VERBOSE] [sql-sql-0]:
[[xSqlServer]ConfigureSqlServerWithAlwaysOn] The file "MSDBLog" has been
modified in the system catalog. The new path will be used the next time the
database is started.
VERBOSE: [2017-09-08 02:56:50Z] [ERROR] Exception setting "StartupParameters":
"STARTUPPARAMETERS: unknown property."
VERBOSE: [2017-09-08 02:56:50Z] [VERBOSE] [sql-sql-0]:
[[xSqlServer]ConfigureSqlServerWithAlwaysOn] Stopping SQL server instance
'MSSQLSERVER' ...
VERBOSE: [2017-09-08 02:57:07Z] [VERBOSE] [sql-sql-0]:
[[xSqlServer]ConfigureSqlServerWithAlwaysOn] Starting SQL server instance
'MSSQLSERVER' ...
From there it just freezes and then times out.
This is the only part of my DSC configuration that uses xSQLServer:
xSqlServer ConfigureSqlServerWithAlwaysOn
{
    InstanceName = $env:COMPUTERNAME
    SqlAdministratorCredential = $Admincreds
    ServiceCredential = $SQLCreds
    MaxDegreeOfParallelism = 1
    FilePath = "F:\DATA"
    LogPath = "G:\LOG"
    DomainAdministratorCredential = $DomainFQDNCreds
    DependsOn = "[xSqlLogin]AddSqlServerServiceAccountToSysadminServerRole"
}
This is the part of MicrosoftAzure_xSqlServer.psm1 that contains mentions of "startup":
function Alter-SystemDatabaseLocation([string]$FilePath, [string]$LogPath, [PSCredential]$ServiceCredential)
{
    $permissionString = $ServiceCredential.UserName+":(OI)(CI)(F)"
    icacls $FilePath /grant $permissionString
    icacls $LogPath /grant $permissionString
    Invoke-Sqlcmd "Use master"
    Invoke-Sqlcmd "ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = '$FilePath\tempdb.mdf');"
    Invoke-Sqlcmd "ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = '$LogPath\templog.ldf');"
    Invoke-Sqlcmd "ALTER DATABASE model MODIFY FILE (NAME = modeldev, FILENAME = '$FilePath\model.mdf');"
    Invoke-Sqlcmd "ALTER DATABASE model MODIFY FILE (NAME = modellog, FILENAME = '$LogPath\modellog.ldf');"
    Invoke-Sqlcmd "ALTER DATABASE msdb MODIFY FILE (NAME = MSDBData, FILENAME = '$FilePath\msdbdata.mdf');"
    Invoke-Sqlcmd "ALTER DATABASE msdb MODIFY FILE (NAME = MSDBLog, FILENAME = '$LogPath\msdblog.ldf');"
    [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SqlWmiManagement') | Out-Null
    $smowmi = New-Object Microsoft.SqlServer.Management.Smo.Wmi.ManagedComputer
    $sqlsvc = $smowmi.Services | Where-Object {$_.Name -like 'MSSQL*'}
    $OldStartupParameters = $sqlsvc.StartupParameters
    $params = '-d'+$FilePath+'\master.mdf;-e'+$LogPath+'\ERRORLOG;-l'+$LogPath+'\mastlog.ldf'
    $sqlsvc[1].StartupParameters = $params
    $sqlsvc[1].Alter()
}
What should be my next steps to understand the problem?
If it makes any difference, I'm trying to create a SQL Server Always On availability group on Windows Server 2016 and SQL Server 2016 SP1, using templates and DSC code that work with Windows Server 2012 R2 and SQL Server 2014.

My advice is to use existing working code ;) I don't think it makes sense to copy-paste the blog post, but I'll leave links to the relevant files here.
Links:
1. DSC modules
2. ARM template. Note: the ARM template that deploys SQL needs to have the VNet and domain already in place. You will also have a hard time trying to deploy this template outside of that framework, so it's probably best to copy out the DSC extension\DSC scripts and use them in your own deployments.
3. Parameters file example
The xSqlServer module (the new one) is not capable of moving SQL to another disk, so you are stuck with a custom script or this old xSql module. Also, please note that the DSC modules inside the packages are modified; these DSC configurations won't work with unmodified versions of the modules (xSqlCreateVirtualDataDisk, xDatabase, xSQLServerAlwaysOnAvailabilityGroupListener, and maybe some others I can't recall at this point).
PS: Working that configuration out and patching the relevant parts wasn't exactly a pleasant journey...
PPS: That repo also contains a DSC configuration for AD DS that can run in parallel (unlike the official example).

Try this. The -like 'MSSQL*' filter can match more than one service, so $sqlsvc[1] is not guaranteed to be the database engine service; filter on the exact service name instead:
$sqlsvc = $smowmi.Services | Where-Object {$_.Name -eq 'MSSQLSERVER'}
$OldStartupParameters = $sqlsvc.StartupParameters
$params = '-d'+$FilePath+'\master.mdf;-e'+$LogPath+'\ERRORLOG;-l'+$LogPath+'\mastlog.ldf'
$sqlsvc[0].StartupParameters = $params
$sqlsvc[0].Alter()
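If you want to confirm the new value before the resource restarts the service, the same SMO WMI objects can read it back. A quick sketch, assuming the SqlWmiManagement assembly is already loaded as in the module code above:
# Read back the startup parameters for the default instance
$smowmi = New-Object Microsoft.SqlServer.Management.Smo.Wmi.ManagedComputer
($smowmi.Services | Where-Object {$_.Name -eq 'MSSQLSERVER'}).StartupParameters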

Related

Automating Database Counts for All SQL Servers

I need database counts for every SQL Server instance (PROD/non-PROD) in an environment.
Logging into each and every SQL Server would be a very tedious task, so I need to automate it.
Is there a T-SQL or PowerShell script that can give me a consolidated report of database counts for all servers in one place?
You can use an Azure Function on a timer schedule to automate what you want.
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-timer
And you can call stored procedures from Azure Functions.
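For illustration, a minimal run.ps1 sketch for a timer-triggered PowerShell function; the server list and query are placeholders, and the SqlServer module providing Invoke-Sqlcmd would have to be available to the function app:
# run.ps1 - the timer schedule itself is defined in function.json
param($Timer)
$servers = @('server1', 'server2')   # hypothetical server list
foreach ($s in $servers) {
    # sys.databases has one row per database on the instance
    $row = Invoke-Sqlcmd -ServerInstance $s -Query "SELECT COUNT(*) AS DbCount FROM sys.databases"
    Write-Host "$s has $($row.DbCount) databases"
}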
Take a look at the SqlServer module documentation, specifically the Get-SqlDatabase cmdlet. You can import your server names from a file or define them in an array and then iterate through them.
#get your credential
$credential = Get-Credential
#read server names from a file
$servers = Get-Content "C:\Some\Path\servers.txt"
#use calculated properties
$servers | Select-Object `
    @{Name='ServerName';Expression={$_}},
    @{Name='DatabaseCount';Expression={
        #force to an array to ensure .Count exists
        @(Get-SqlInstance -MachineName $_ -Credential $credential | Get-SqlDatabase).Count
    }}
If you are unable to install the SqlServer module, you can do it with SMO:
#load the assembly
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO")
#read server names from a file
$servers = Get-Content "C:\Some\Path\servers.txt"
#use calculated properties
$servers | Select-Object `
    @{Name='ServerName';Expression={$_}},
    @{Name='DatabaseCount';Expression={
        (New-Object Microsoft.SqlServer.Management.SMO.Server $_).Databases.Count
    }}
Output:
ServerName DatabaseCount
---------- -------------
ServerOne 12
ServerTwo 22
ServerThree 6
Note that there is no error checking and I am assuming that you are running in a context that has rights to the server and DB engine.
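If you do want basic error handling, here is a hedged variant of the SMO approach (it assumes the SMO assembly is loaded as above; connection failures generally surface when .Databases is enumerated, not when the Server object is created):
#read server names from a file
$servers = Get-Content "C:\Some\Path\servers.txt"
foreach ($server in $servers) {
    try {
        $srv = New-Object Microsoft.SqlServer.Management.SMO.Server $server
        [pscustomobject]@{ ServerName = $server; DatabaseCount = $srv.Databases.Count }
    }
    catch {
        Write-Warning "Could not query ${server}: $($_.Exception.Message)"
    }
}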

Powershell error setting Environment variable for an SSIS package execution

My PowerShell script executes an SSIS package, but first overrides an environment variable. The Alter method on the EnvironmentInfo object fails with a generic error message: "Operation 'Alter' on object 'EnvironmentInfo[@Name='MyVariable']' failed during execution."
I also tried removing the environment variable and changing the Project parameter, but received the same error on the Alter method for the Project object.
I suspect this is either 1) a shortcoming of using the 32-bit version of SQL Server 2012, or 2) a permissions issue.
I've made sure the executing Windows Account has full privileges on the SSISDB database and the SSIS Catalog project, and the child folder, environment, etc.
Any ideas on the error or how I can get more details? I don't see anything in the Windows Event Logs.
Here's my code:
# Variables
$ServerInstance = "MyServer"
$32bitSQLServer = $false #use $null for 32-bit
$SSISNamespace = "Microsoft.SqlServer.Management.IntegrationServices"
$FolderName = "MyFolder"
$ProjectName = "MyProject"
$PackageName = "MyPackage.dtsx"
$EnvironmentName = "MyEnvironment"
$VariableName = "MyVariable"
$VariableValue = Read-Host "What is the new environment variable value? "
# Create a connection to the server - Have to use Windows Authentication in order to Execute the Package
$sqlConnectionString = `
"Data Source=" + $ServerInstance + ";Initial Catalog=master;Integrated Security=SSPI;"
$sqlConnection = New-Object System.Data.SqlClient.SqlConnection $sqlConnectionString
# Create the Integration Services object
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.IntegrationServices")
$integrationServices = New-Object $SSISNamespace".IntegrationServices" $sqlConnection
# Get the Integration Services catalog
$catalog = $integrationServices.Catalogs["SSISDB"]
# Get the folder
$folder = $catalog.Folders[$FolderName]
# Get the project
$project = $folder.Projects[$ProjectName]
# Get the environment
$environment = $folder.Environments[$EnvironmentName]
# Get the environment reference
$environmentReference = $project.References.Item($EnvironmentName, $FolderName)
$environmentReference.Refresh()
# Get the package
$package = $project.Packages[$PackageName]
# Set the Environment Variable
$environment.Variables[$VariableName].Value = $VariableValue
$environment.Alter()
# Execute the package
Write-Host "Running Package " $PackageName "..."
$result = $package.Execute($32bitSQLServer, $environmentReference)
# Alternate approach, also not working
# $project.Parameters[$VariableName].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Literal,$VariableValue)
# $project.Alter()
# $result = $package.Execute($32bitSQLServer, $environmentReference)
Write-Host "Done."
Just alter the parameter on the package and don't worry about the environment variable. I am sure it doesn't change the value stored in the package on the server, just the object held in your $package variable. Something like this:
$package.Parameters[$VariableName].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Literal, $VariableValue)
$package.Alter()
$execution = $integrationServices.Catalogs['SSISDB'].Executions[$package.Execute($false, ($package.Parent.References[$PackageName, $FolderName]))]
do {
    Start-Sleep -Seconds 1   # poll politely instead of spinning
    $execution.Refresh()
} until ($execution.Completed)
So with some digging I found the answer to my error. It turned out I had multiple versions (11.0.0.0, 12.0.0.0, 13.0.0.0) of the assembly Microsoft.SqlServer.Management.IntegrationServices in my GAC (Windows\assembly). Examining the schema of the SSISDB catalog showed it was version 12.0.5000.0, which meant I needed the 12.0.0.0 version of the assembly. The code I was using:
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.IntegrationServices")
was loading the wrong version (probably 13.0.0.0), so I needed to explicitly load the assembly version matching this installation of SSIS, which was 12.0.0.0:
[System.Reflection.Assembly]::Load("Microsoft.SqlServer.Management.IntegrationServices, Version=12.0.0.0, Culture=neutral, processorArchitecture=MSIL, PublicKeyToken=89845dcd8080cc91")
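As an aside, a quick way to check which versions are registered in the .NET 4 GAC (the folder layout below is an assumption based on the standard GAC structure; the classic GAC lives under C:\Windows\assembly):
# List the registered versions of the IntegrationServices assembly
Get-ChildItem "$env:windir\Microsoft.NET\assembly\GAC_MSIL\Microsoft.SqlServer.Management.IntegrationServices" |
    Select-Object -ExpandProperty Name   # e.g. v4.0_12.0.0.0__89845dcd8080cc91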

Reading an SSIS package using powershell when the package is stored in an Integration Service server

I'm using the PowerShell script below to read VersionBuild from a dtsx package (read as XML). I'm able to read the file as long as it's on a local or shared path. However, one of the packages is on an Integration Services server, and I'm not able to simply use Get-Content on that path. Below is the code I'm using for local/shared-path files.
$xml = [xml](get-content *filepath*)
$value = $xml.Executable.Property | Where-Object {$_.Name -like 'VersionBuild'}
$ver_dev = $value.'#text'
I read on the internet about net use, but I guess that's just for mapping shared paths; the Integration Services path is not really a shared path, as we can only access it using SQL Server (as per my understanding). The path as displayed in SQL Server is as follows: Server_path
I also came across some code invoking packages in SQL Server using PowerShell, but I'm a novice in PowerShell (I only started it for this version-compare purpose) and could hardly understand anything.
The code in itself is simple, and frankly that's what motivated me to automate this version compare, but I'm stuck on reading from this server path. Any help would be greatly appreciated. Thanks
If your packages are stored in an Integration Services catalog, the code in the link below will save an .ispac file (which is a zip file) to local storage and extract the contents. The code below, based on that, goes as far as downloading a specific .ispac project file. From there, you'll need to extract the dtsx package (as in the linked example); a sketch of that step follows the code.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
$SsisServer = 'SERVERNAME'
$Folder = 'FOLDER'
$ProjectName = 'PROJECT'
$DownloadFolder = 'C:\Temp'   # local folder to save the .ispac to
# (Optional) SMO connection to the server
$smo = New-Object Microsoft.SqlServer.Management.Smo.Server $SsisServer
$SsisNamespace = "Microsoft.SqlServer.Management.IntegrationServices"
[System.Reflection.Assembly]::LoadWithPartialName($SsisNamespace) | Out-Null
$SqlConnectionstring = "Data Source=$SsisServer;Initial Catalog=master;Integrated Security=SSPI;"
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection $SqlConnectionstring
# Create the Integration Services object
$IntegrationServices = New-Object "$SsisNamespace`.IntegrationServices" $SqlConnection
$Catalog = $IntegrationServices.Catalogs["SSISDB"]
$oFolder = $Catalog.Folders[$Folder]
$oProject = $oFolder.Projects[$ProjectName]
$ISPAC = $oProject.GetProjectBytes()
[System.IO.File]::WriteAllBytes(($DownloadFolder + "\" + $oProject.Name + ".ispac"),$ISPAC)
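Continuing from there, since the .ispac is just a zip, here is a hedged sketch of the extraction and version read. It reuses the parsing from the question; 'MyPackage.dtsx' is a placeholder, and note that newer dtsx formats store VersionBuild as a DTS:VersionBuild attribute rather than a property element:
# Extract the downloaded .ispac (a zip) and read VersionBuild from a package
Add-Type -AssemblyName System.IO.Compression.FileSystem
$IspacPath = Join-Path $DownloadFolder ($oProject.Name + ".ispac")
$ExtractDir = Join-Path $DownloadFolder $oProject.Name
[System.IO.Compression.ZipFile]::ExtractToDirectory($IspacPath, $ExtractDir)
$xml = [xml](Get-Content (Join-Path $ExtractDir 'MyPackage.dtsx'))
# Older package format: VersionBuild as a property element (as in the question)
$value = $xml.Executable.Property | Where-Object {$_.Name -like 'VersionBuild'}
$ver = $value.'#text'
# Newer format fallback: VersionBuild as an attribute
if (-not $ver) { $ver = $xml.Executable.GetAttribute('DTS:VersionBuild') }
$ver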
If your packages are stored in msdb, the code below, from Jamie Thomson, will download all packages to local storage.
Param($SQLInstance = "localhost")
#####Add all the SQL goodies (including Invoke-Sqlcmd)#####
add-pssnapin sqlserverprovidersnapin100 -ErrorAction SilentlyContinue
add-pssnapin sqlservercmdletsnapin100 -ErrorAction SilentlyContinue
cls
$ssisSQL = "WITH cte AS (
SELECT cast(foldername as varchar(max)) as folderpath, folderid
FROM msdb..sysssispackagefolders
WHERE parentfolderid = '00000000-0000-0000-0000-000000000000'
UNION ALL
SELECT cast(c.folderpath + '\' + f.foldername as varchar(max)), f.folderid
FROM msdb..sysssispackagefolders f
INNER JOIN cte c ON c.folderid = f.parentfolderid
)
SELECT c.folderpath,p.name,CAST(CAST(packagedata AS VARBINARY(MAX)) AS VARCHAR(MAX)) as pkg
FROM cte c
INNER JOIN msdb..sysssispackages p ON c.folderid = p.folderid
WHERE c.folderpath NOT LIKE 'Data Collector%'"
$Packages = Invoke-Sqlcmd -MaxCharLength 10000000 -ServerInstance $SQLInstance -Query $ssisSQL
Foreach ($pkg in $Packages)
{
    $pkgName = $Pkg.name
    $folderPath = $Pkg.folderpath
    $fullfolderPath = "c:\temp\$folderPath\"
    if(!(test-path -path $fullfolderPath))
    {
        mkdir $fullfolderPath | Out-Null
    }
    $pkg.pkg | Out-File -Force -encoding ascii -FilePath "$fullfolderPath\$pkgName.dtsx"
}

How to bring SQL Server database out of "restoring" mode using SMO

I'm bringing databases out of log shipping, trying to use SMO to do it. I'm attempting to mimic the following T-SQL using SMO:
restore database <database name> with recovery
Here's my code:
# select secondary_database from msdb.dbo.log_shipping_secondary_databases
$dsSecLSDB = $secInst.Databases["MSDB"].ExecuteWithResults("select secondary_database from log_shipping_secondary_databases")
$secLSDB = $dsSecLSDB.Tables.Rows
foreach($db in $secLSDB.secondary_database) {
    write-host "Restoring database (bringing online)..."
    $secRestrObj = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore -Property @{
        Action = 'Database';
        Database = $db;
        NoRecovery = $FALSE;
    }
    $secRestrObj.SqlRestore($secInst);
    write-host "Done with restore."
}
The error:
Microsoft.SqlServer.Management.Smo.PropertyNotSetException: To accomplish this action, set property Devices
The available options for DeviceType (from https://msdn.microsoft.com/en-us/library/microsoft.sqlserver.management.smo.devicetype(v=sql.105).aspx) are:
LogicalDevice
Tape
File
Pipe
VirtualDevice
The problem is, I don't know which DeviceType to create. My guess is LogicalDevice but I don't know the value of it. Has anyone done this before?
I'm facing the same trouble.
I guess the SqlRestore method is limited to a proper database restore from a backup file/device.
Bringing a database in the "restoring" state back online (i.e. a standby DB from log shipping or from a broken AG) seems to be possible only by executing "restore database with recovery" with the Invoke-Sqlcmd cmdlet; see the sketch below.
Or if anybody has another solution...
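For reference, a minimal sketch of that Invoke-Sqlcmd approach, assuming $secInst is the SMO server object from the question (so $secInst.Name is the instance name) and $db the database name:
# Bring a database in "restoring" state online without restoring from a device
Invoke-Sqlcmd -ServerInstance $secInst.Name -Query "RESTORE DATABASE [$db] WITH RECOVERY"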

How to transfer data from a newer to an older MS SQL server using cmd?

I have a database server (let's call it S) with some data, running MS SQL 2012. On the other end of the world there are tablets running MS SQL 2012 Express (let's call them TS). A part of the data must be transferred from S to TS. For many reasons, the only way to communicate is to send files from S to proxy servers to be picked up by TS later.
Currently I'm using backup files (.BAK), so I can simply create a partial backup of S's database and restore it on TS.
Now here is the problem:
It might happen that in a while S will switch to MS SQL 2014. It's not possible to install 2014 on TS because I have no direct access to them. In this case S will still be able to restore data created with SQL 2012 on TS, but not the other way around, because it's impossible to restore a backup from a newer SQL version on an older one. So this is the point I need help with.
I've tried to export the data using the server management tool on S instead of creating a backup, and to import everything on TS. This works well, but the export must happen automatically from the command line. The good old sqlpubwiz is not an option because it was discontinued and only works up to version 2008 R2.
So I need ideas on how to export from S using cmd/PowerShell and import on TS, using only files to communicate, even if they are running different versions of MS SQL Server.
I've used detached databases to distribute databases to laptops before. They are much quicker to import than a backup, but I'm not sure about inter-version compatibility.
You could write a tool that you package up as an exe, containing CSV files and a loader script that uses bcp.exe; for example:
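A minimal sketch of that idea (database, table, and file paths are hypothetical):
# Export on S (character mode, comma-separated; -T = Windows authentication)
bcp MyDb.dbo.MyTable out C:\export\MyTable.csv -S S -T -c -t ","
# Ship the file via the proxy servers, then import on TS:
bcp MyDb.dbo.MyTable in C:\import\MyTable.csv -S "TS\SQLEXPRESS" -T -c -t ","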
I would look at using the built-in replication features, setting up S as a snapshot replication publisher and TS as a subscriber via FTP. SQL Server Express can act as a subscriber (but not a publisher) in a replication scenario.
http://msdn.microsoft.com/en-us/library/ms151832.aspx
If you want to write a more intricate scenario, you may want to consider Sync Framework
Problem solved! PowerShell does the trick, but it was way too slow. In the end I still used BCP. BCP means a lot of scripting, but it works about 30 times faster than PowerShell. However, here is the PowerShell script I used:
set-psdebug -strict
$DirectoryToSaveTo='...' #path where to save the data
$ServerName='...' #Hostname\Instance
$Database='...' #database to export
$ErrorActionPreference = "stop"
Trap {
    $err = $_.Exception
    write-host $err.Message
    while( $err.InnerException ) {
        $err = $err.InnerException
        write-host $err.Message
    };
    break
}
$v = [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO')
if ((($v.FullName.Split(','))[1].Split('='))[1].Split('.')[0] -ne '9') {
    [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMOExtended') | out-null
}
$My='Microsoft.SqlServer.Management.Smo'
$s = new-object ("$My.Server") $ServerName
$Server=$s.netname -replace '[\\\/\:\.]',' '
$instance = $s.instanceName -replace '[\\\/\:\.]',' '
$DatabaseName =$database -replace '[\\\/\:\.]',' '
$DirectoryToSaveTo=$DirectoryToSaveTo+'\'+$Instance+'\'
$CreationScriptOptions = new-object ("$My.ScriptingOptions")
$CreationScriptOptions.ExtendedProperties= $true
$CreationScriptOptions.DRIAll= $true
$CreationScriptOptions.Indexes= $true
$CreationScriptOptions.Triggers= $true
$CreationScriptOptions.ScriptBatchTerminator = $true
$CreationScriptOptions.Filename = "$($DatabaseName)_Schema.sql";
$CreationScriptOptions.IncludeHeaders = $true;
$CreationScriptOptions.ToFileOnly = $true # no need of string output as well
$CreationScriptOptions.IncludeIfNotExists = $true # not necessary but it means the script can be more versatile
$transfer = new-object ("$My.Transfer") $s.Databases[$Database]
$transfer.options=$CreationScriptOptions # tell the transfer object of our preferences
$scripter = new-object ("$My.Scripter") $s # script out the database creation
$scripter.options=$CreationScriptOptions # with the same options
$scripter.Script($s.Databases[$Database]) # do it
"USE $Database" | Out-File -Append -FilePath "$($DatabaseName)_Schema.sql"
"GO" | Out-File -Append -FilePath "$($DatabaseName)_Schema.sql"
$transfer.options.AppendToFile=$true
$transfer.options.ScriptDrops=$true
$transfer.EnumScriptTransfer()
$transfer.options.ScriptDrops=$false
$transfer.EnumScriptTransfer()
"All written to $($DatabaseName)_Schema.sql"
This script exports only the schema, but it should be quite obvious how to also export all the data.
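In case it isn't obvious, here is a hedged sketch of the data part, assuming the same $transfer object and options as above; SMO's ScriptingOptions exposes a ScriptData flag that makes EnumScriptTransfer emit INSERT statements as well:
# Also script out the data; output lands in the same _Schema.sql file
# because Filename, ToFileOnly and AppendToFile are already set above
$transfer.Options.ScriptData = $true
$transfer.EnumScriptTransfer()
"Data appended to $($DatabaseName)_Schema.sql"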
