My PowerShell script executes an SSIS package, but first overrides an environment variable. The Alter method on the EnvironmentInfo object fails with a generic error message: "Operation 'Alter' on object [EnvironmentInfo[@Name='MyVariable']' failed during execution."
I also tried removing the environment variable and changing the Project parameter, but received the same error on the Alter method for the Project object.
I suspect this is either 1) a shortcoming of using the 32-bit version of SQL Server 2012, or 2) a permissions issue.
I've made sure the executing Windows account has full privileges on the SSISDB database and on the SSIS catalog project, child folder, environment, etc.
Any ideas on the error or how I can get more details? I don't see anything in the Windows Event Logs.
Here's my code:
# Variables
$ServerInstance = "MyServer"
$32bitSQLServer = "false" #use #null for 32-bit
$SSISNamespace = "Microsoft.SqlServer.Management.IntegrationServices"
$FolderName = "MyFolder"
$ProjectName = "MyProject"
$PackageName = "MyPackage.dtsx"
$EnvironmentName = "MyEnvironment"
$VariableName = "MyVariable"
$VariableValue = Read-Host "What is the new environment variable value? "
# Create a connection to the server - Have to use Windows Authentication in order to Execute the Package
$sqlConnectionString = `
"Data Source=" + $ServerInstance + ";Initial Catalog=master;Integrated Security=SSPI;"
$sqlConnection = New-Object System.Data.SqlClient.SqlConnection $sqlConnectionString
# Create the Integration Services object
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.IntegrationServices")
$integrationServices = New-Object $SSISNamespace".IntegrationServices" $sqlConnection
# Get the Integration Services catalog
$catalog = $integrationServices.Catalogs["SSISDB"]
# Get the folder
$folder = $catalog.Folders[$FolderName]
# Get the project
$project = $folder.Projects[$ProjectName]
# Get the environment
$environment = $folder.Environments[$EnvironmentName]
# Get the environment reference
$environmentReference = $project.References.Item($EnvironmentName, $FolderName)
$environmentReference.Refresh()
# Get the package
$package = $project.Packages[$PackageName]
# Set the Environment Variable
$environment.Variables[$VariableName].Value = $VariableValue
$environment.Alter()
# Execute the package
Write-Host "Running Package " $PackageName "..."
$result = $package.Execute($32bitSQLServer, $environmentReference)
# Alternate approach, also not working
# $project.Parameters[$VariableName].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Literal,$VariableValue)
# $project.Alter()
# $result = $package.Execute($32bitSQLServer, $environmentReference)
Write-Host "Done."
Just alter the parameter on the package and don't worry about the environment variable. I am sure it doesn't change the value stored in the package on the server, just the object held in your $package variable. Something like this:
$package.Parameters[$VariableName].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Literal, $VariableValue)
$package.Alter()
$operationId = $package.Execute($false, $package.Parent.References[$EnvironmentName, $FolderName])
$execution = $integrationServices.Catalogs['SSISDB'].Executions[$operationId]
do {
    Start-Sleep -Seconds 1 # poll politely instead of spinning
    $execution.Refresh()
} until ($execution.Completed)
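Once the loop exits, the outcome can be inspected on the same object; a small sketch (assuming 'Success' is among the Status values exposed by the execution operation object):
if ($execution.Status -ne 'Success') {
    Write-Host "Package finished with status: $($execution.Status)"
}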
So with some digging I found the answer to my error. It turned out I had multiple versions (11.0.0.0, 12.0.0.0, 13.0.0.0) of the assembly Microsoft.SqlServer.Management.IntegrationServices in my GAC (Windows\assembly). Examining the schema of the SSISDB catalog showed it was version 12.0.5000.0, which meant I needed the 12.0.0.0 version of the assembly. The code I was using:
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.IntegrationServices")
was loading the wrong version (probably 13.0.0.0), so I needed to explicitly load the assembly version matching this installation of SSIS, which was 12.0.0.0:
[System.Reflection.Assembly]::Load("Microsoft.SqlServer.Management.IntegrationServices, Version=12.0.0.0, Culture=neutral, processorArchitecture=MSIL, PublicKeyToken=89845dcd8080cc91")
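To confirm which version actually got loaded into the session, you can list the matching assemblies in the current AppDomain:
[AppDomain]::CurrentDomain.GetAssemblies() |
    Where-Object { $_.FullName -like 'Microsoft.SqlServer.Management.IntegrationServices*' } |
    Select-Object -ExpandProperty FullName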
Related
I need to use ApplicationIntent=ReadOnly with my SQL command in PowerShell, which connects to a replica database. Can anyone help?
Since replica servers cannot be accessed directly, I need to use this option. I know how to do it manually but need help with the code.
$SQLQuery = "SELECT x.SCode, x.DatabaseName FROM dbo.Logins x ORDER BY x.SCode"
$auth = @{Username = $SQLUserName; Password = $SQLAdminPassword}
try
{
$allTenants = Invoke-Sqlcmd -Query $SQLQuery -ServerInstance $SQLServerName -Database 'SShared' -QueryTimeout 0 @Auth -ErrorAction Stop
Write-Log -LogFileName $logfile -LogEntry ("Found {0} tenants" -f $allTenants.Count)
}
I am getting the below error using this:
Exception Message A network-related or instance-specific error
occurred while establishing a connection to SQL Server
The server was not found or was not accessible. Verify that the
instance name is correct and that SQL Server is configured to allow
remote connections. (provider: Named Pipes Provider, error: 40
- Could not open a connection to SQL Server)
There are a few ways that you can do this.
Easy way
dbatools
There is a PowerShell module for interacting with SQL Server created by the SQL Server community called dbatools.
In the module, there is a function called Invoke-DbaQuery which is essentially a wrapper for Invoke-Sqlcmd.
This function has a parameter, -ReadOnly, that was created exactly for this scenario.
# Changing your $auth to a PSCredential object.
$cred = [System.Management.Automation.PSCredential]::New(
$SqlUserName,
(ConvertTo-SecureString -String $SqlAdminPassword -AsPlainText -Force))
# Splatting the parameters for read-ability.
$QueryParams = @{
Query = $SQLQuery
SqlInstance = $SQLServerName
Database = 'SShared'
QueryTimeout = 0
SqlCredential = $cred
ReadOnly = $true # <-- Specifying read-only intent.
ErrorAction = 'Stop'
}
$allTenants = Invoke-DbaQuery @QueryParams
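If dbatools isn't installed yet, it's available from the PowerShell Gallery:
Install-Module -Name dbatools -Scope CurrentUser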
Other way
Invoke-Sqlcmd
If you can't, won't, or don't want to use dbatools, you can still use Invoke-Sqlcmd. The latest release at the time of writing has the option to specify the parameter -ConnectionString.
You can state that it's read-only there.
# Splatting again for read-ability.
$SqlcmdParams = @{
Query = $SQLQuery
QueryTimeout = 0
ConnectionString = "Data Source=$SQLServerName;Initial Catalog=SShared;User ID=$SqlUserName;Password=$SqlAdminPassword;Integrated Security=false;ApplicationIntent=ReadOnly" # <-- Specifying read-only intent.
ErrorAction = 'Stop'
}
Invoke-Sqlcmd @SqlcmdParams
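Either way, you can verify that the read-only intent was honored, since a database on a readable secondary reports itself as read-only; a quick check reusing the splatted parameters:
$CheckParams = $SqlcmdParams.Clone()
$CheckParams['Query'] = "SELECT DATABASEPROPERTYEX(DB_NAME(), 'Updateability') AS Updateability"
Invoke-Sqlcmd @CheckParams # should return READ_ONLY when routed to a replica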
I'm looking for advice on how to troubleshoot a DSC configuration. I'm using the Azure DSC Extension, and the logs are:
VERBOSE: [2017-09-08 02:56:48Z] [VERBOSE] [sql-sql-0]:
[[xSqlServer]ConfigureSqlServerWithAlwaysOn] The file "MSDBLog" has been
modified in the system catalog. The new path will be used the next time the
database is started.
VERBOSE: [2017-09-08 02:56:50Z] [ERROR] Exception setting "StartupParameters":
"STARTUPPARAMETERS: unknown property."
VERBOSE: [2017-09-08 02:56:50Z] [VERBOSE] [sql-sql-0]:
[[xSqlServer]ConfigureSqlServerWithAlwaysOn] Stopping SQL server instance
'MSSQLSERVER' ...
VERBOSE: [2017-09-08 02:57:07Z] [VERBOSE] [sql-sql-0]:
[[xSqlServer]ConfigureSqlServerWithAlwaysOn] Starting SQL server instance
'MSSQLSERVER' ...
From there it just freezes then times out.
This is the only part of my DSC configuration that uses xSQLServer:
xSqlServer ConfigureSqlServerWithAlwaysOn
{
InstanceName = $env:COMPUTERNAME
SqlAdministratorCredential = $Admincreds
ServiceCredential = $SQLCreds
MaxDegreeOfParallelism = 1
FilePath = "F:\DATA"
LogPath = "G:\LOG"
DomainAdministratorCredential = $DomainFQDNCreds
DependsOn = "[xSqlLogin]AddSqlServerServiceAccountToSysadminServerRole"
}
This is the part of MicrosoftAzure_xSqlServer.psm1 that mentions "startup":
function Alter-SystemDatabaseLocation([string]$FilePath, [string]$LogPath,[PSCredential]$ServiceCredential )
{
$permissionString = $ServiceCredential.UserName+":(OI)(CI)(F)"
icacls $FilePath /grant $permissionString
icacls $LogPath /grant $permissionString
Invoke-Sqlcmd "Use master"
Invoke-sqlCmd "ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = '$FilePath\tempdb.mdf');"
Invoke-sqlCmd "ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = '$LogPath\templog.ldf');"
Invoke-sqlCmd "ALTER DATABASE model MODIFY FILE (NAME = modeldev, FILENAME = '$FilePath\model.mdf');"
Invoke-sqlCmd "ALTER DATABASE model MODIFY FILE (NAME = modellog, FILENAME = '$LogPath\modellog.ldf');"
Invoke-sqlCmd "ALTER DATABASE msdb MODIFY FILE (NAME = MSDBData, FILENAME = '$FilePath\msdbdata.mdf');"
Invoke-sqlCmd "ALTER DATABASE msdb MODIFY FILE (NAME = MSDBLog, FILENAME = '$LogPath\msdblog.ldf');"
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SqlWmiManagement')| Out-Null
$smowmi = New-Object Microsoft.SqlServer.Management.Smo.Wmi.ManagedComputer
$sqlsvc = $smowmi.Services | Where-Object {$_.Name -like 'MSSQL*'}
$OldStartupParameters = $sqlsvc.StartupParameters
$params = '-d'+$FilePath+'\master.mdf;-e'+$LogPath+'\ERRORLOG;-l'+$LogPath+'\mastlog.ldf'
$sqlsvc[1].StartupParameters = $params
$sqlsvc[1].Alter()
}
What should be my next steps to understand the problem?
If it makes any difference, I'm trying to create SQL Server Always On Availability group on Windows Server 2016 and SQL Server 2016 SP1, by using templates and DSC code that work with Windows Server 2012 R2 and SQL Server 2014.
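One general way to dig deeper before touching the configuration is the xDscDiagnostics module, which collects and decodes the DSC event logs for a given run; a hedged sketch, run on the affected node:
Install-Module xDscDiagnostics -Scope CurrentUser
Get-xDscOperation -Newest 5        # list recent DSC runs and their status
Trace-xDscOperation -SequenceId 2  # detailed events for one specific run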
My advice is to use existing working code. ;) I don't think it makes sense to copy-paste the blog post, but I'll leave links to the relevant files here.
Links:
1. DSC modules
2. ARM Template. Note: the ARM template that deploys SQL needs the VNet and domain already in place. You will have a hard time trying to deploy this template outside of that framework, so probably just copy out the DSC extension\DSC scripts and use them in your own deployments.
3. Parameters file example
The xSqlServer (the new one) is not capable of moving SQL to another disk, so you are stuck with a custom script or this old xSql module. Also, please note that the DSC modules inside the packages are modified; these DSC configurations won't work with unmodified versions of the modules (xSqlCreateVirtualDataDisk, xDatabase, xSQLServerAlwaysOnAvailabilityGroupListener, and maybe some other modules I can't recall at this point).
P.S. Working that configuration out and patching the relevant parts wasn't exactly a pleasant journey...
P.P.S. That repo also contains a DSC configuration for AD DS that can also run in parallel (compared to the official example).
Try this:
$sqlsvc = $smowmi.Services | Where-Object {$_.Name -eq 'MSSQLSERVER'}
$OldStartupParameters = $sqlsvc.StartupParameters
$params = '-d'+$FilePath+'\master.mdf;-e'+$LogPath+'\ERRORLOG;-l'+$LogPath+'\mastlog.ldf'
$sqlsvc[0].StartupParameters = $params
$sqlsvc[0].Alter()
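The stricter -eq 'MSSQLSERVER' filter matters because -like 'MSSQL*' can match more than one service (the full-text launcher, for example), so $sqlsvc[1] in the original module isn't guaranteed to be the database engine. Note that changed startup parameters only take effect once the instance restarts:
Restart-Service -Name 'MSSQLSERVER' -Force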
We use an Azure Elastic Pool, resulting in multiple client databases and one master database with references to the client databases.
We already have multiple databases and are working on a new version of the code. We use EF6 Code-First.
When we make a change to our model (add a property) we create the migration file and need to call Update-Database for all existing client databases.
This is monkey work we want to skip.
I already have a PowerShell script to connect to the master database and execute a query on a table. This returns the names of the child databases.
With it I can change the Web.config and replace the Template database name with the proper name of the child database.
Now I need to call Update-Database to execute the migration scripts. I'm struggling with this last part because I'm running the .ps1 script outside Visual Studio, and thus the Update-Database command is unknown. I tried using migrate.exe, but then I get lots of errors.
I think the easiest solution is to run my script within the Package Manager Console, but I can't figure out how to do that.
I managed to get it working. After I placed the .ps1 file in the root of my code folder, I could run it in the Package Manager Console using .\UpdateDatabases.ps1.
For completeness here's the script I created. I'm new to PowerShell so some optimizations might be possible.
cls
$currentPath = (Get-Item -Path ".\" -Verbose).FullName
#Read Web.config
$webConfig = $currentPath + "\<your project>\Web.config"
$doc = (Get-Content $webConfig) -as [Xml]
$DatabaseNamePrefix = $doc.configuration.appSettings.add | where {$_.Key -eq 'DatabaseNamePrefix'}
#Get Master connectionstring
$root = $doc.get_DocumentElement();
foreach($connString in $root.connectionStrings.add | where {$_.Name -eq "Master"})
{
$masterConn = $connString.connectionString
}
#Connect to master database
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = $masterConn
#Query Client table for the child database names
$SqlQuery = "select Code from Clients"
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.CommandText = $SqlQuery
$SqlCmd.Connection = $SqlConnection
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $SqlCmd
#Put query result in dataset
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$SqlConnection.Close()
foreach ($row in $DataSet.Tables[0].Rows)
{
$clientDbName = $row[0].ToString().Trim()
#Change Web.Config
foreach($connString in $root.connectionStrings.add | where {$_.Name -eq "DevelopmentDb"})
{
$newDatabaseName = "Database=" + $DatabaseNamePrefix.value + $clientDbName + ";";
$newConn = $connString.connectionString -replace "(Database=.*?;)",$newDatabaseName
$connString.connectionString = $newConn;
}
$doc.Save($webConfig)
#Update database
Update-Database -ConfigurationTypeName Application
}
"Finished"
You may want to take a look at Azure Elastic Database Jobs, which is designed to work with elastic database pools.
The Elastic Database Jobs SDK also includes PowerShell components.
I'm trying to call a maintenance SP from within an Azure runbook:
inlinescript {
.........
$Cmd = New-object System.Data.SqlClient.SqlCommand
$Cmd.Connection = $Conn
$Cmd.CommandText = "EXEC [dbo].[BackupLogTable] #tableName, #olderThan"
$Cmd.Parameters.AddWithValue("#tableName", $TableName)
$Cmd.Parameters.AddWithValue("#olderThan", $OlderThan)
$Cmd.ExecuteNonQuery()
.....
}
The SP is declared like this:
alter procedure [dbo].[BackupLogTable] (
@tableName nvarchar(512),
@olderThan int
)
with execute as owner as
and I can successfully run it from SSMS under the same user my runbook uses. But when testing it in the Azure portal, I'm getting the following error:
Exception calling "ExecuteNonQuery" with "0" argument(s): "The
parameterized query '(#tableName nvarchar(4000),#olderThan
nvarchar(4000))EXEC [dbo].' expects the parameter '#tableName', which
was not supplied."
I tried every other variant of passing the parameters found on the net, like this one:
$Cmd.CommandText = "[BackupLogTable]"
$Cmd.CommandType = [System.Data.CommandType]::StoredProcedure
$Cmd.Parameters.Add("#tableName", [System.Data.SqlDbType]::NVarChar, 512) | Out-Null
$Cmd.Parameters["#tableName"].Value = $TableName
$Cmd.Parameters.Add("#olderThan", [System.Data.SqlDbType]::Int) | Out-Null
$Cmd.Parameters["#olderThan"].Value = $OlderThan
and many others but it always fails:
Exception calling "ExecuteNonQuery" with "0" argument(s): "Procedure
or function 'BackupLogTable' expects parameter '@tableName', which
was not supplied."
What am I doing wrong?
How are you passing your parameters to your inline script?
There are some limitations; e.g. Get-AutomationVariable / AutomationCredential are not available in the InlineScript.
"The InlineScript activity runs a block of commands in a separate, non-workflow session and returns its output to the workflow. While commands in a workflow are sent to Windows Workflow Foundation for processing, commands in an InlineScript block are processed by Windows PowerShell. The activity uses the standard workflow common parameters including PSComputerName and PSCredential which allow you to specify that the code block be run on another computer or using alternate credentials."
So there are some limitations on how to pass and get variables.
However, you can pass values into the InlineScript using $Using.
E.g.: InlineScript { Write-Output $using:TableName }
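Applied to the runbook above, a minimal sketch (assuming the connection is built inside the InlineScript; $ConnectionString here is a hypothetical workflow variable):
InlineScript {
    # Hypothetical connection setup; adjust to however $Conn is created in your runbook.
    $Conn = New-Object System.Data.SqlClient.SqlConnection $using:ConnectionString
    $Conn.Open()
    $Cmd = New-Object System.Data.SqlClient.SqlCommand
    $Cmd.Connection = $Conn
    $Cmd.CommandText = "EXEC [dbo].[BackupLogTable] @tableName, @olderThan"
    # $using: pulls the workflow-scope values into the InlineScript session;
    # without it, $TableName would be $null here and the parameter would be "not supplied".
    $Cmd.Parameters.AddWithValue("@tableName", $using:TableName) | Out-Null
    $Cmd.Parameters.AddWithValue("@olderThan", $using:OlderThan) | Out-Null
    $Cmd.ExecuteNonQuery()
    $Conn.Close()
}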
Hopefully that'd do the trick.
See also the recommendations for inlinescript on: https://technet.microsoft.com/en-us/library/dn469257(v=sc.16).aspx#bkmk_InlineScript
I have a database server (let's call it S) with some data running MS SQL 2012. On the other end of the world there are tablets running MS SQL 2012 EXPRESS (let's call them TS). A part of the data must be transferred from S to TS. Due to many reasons the only way to communicate is to send files from S to proxy servers to have them picked up by TS later.
Currently I'm using backup files (.BAK) so I can simply create a partial backup of S' database and restore it on TS.
Now here is the problem:
It might happen that in a while S will switch to MS SQL 2014. It's not possible to install 2014 on TS because I have no direct access to them. In this case, S will still be able to restore data created with SQL 2012 on TS, but not the other way around, because it's impossible to restore backups from a newer SQL version on an older one. So this is the point I need help with.
I've tried to export the data using the server management tool on S instead of creating a backup, and import everything on TS. This works well, but the export must happen automatically via the command line. The good old sqlpubwiz is not an option because it was discontinued and only works up to version 2008 R2.
So I need ideas on how to export from S using cmd/PowerShell and import on TS, using only files to communicate, even if they are running different versions of MS SQL Server.
I've used detached databases to distribute databases to laptops before. They are much quicker to import than a backup, but I'm not sure about inter-version compatibility.
You could write a tool that you package up as an exe containing CSV files and a loader script that uses bcp.exe.
I would look at using the built-in replication features, setting up S as a snapshot replication publisher and TS as a subscriber via FTP. SQL Server Express can act as a subscriber (but not a publisher) in a replication scenario.
http://msdn.microsoft.com/en-us/library/ms151832.aspx
If you want to write a more intricate scenario, you may want to consider Sync Framework
Problem solved! PowerShell does the trick, but it was way too slow. In the end I still used BCP. BCP means a lot of scripting, but it works about 30 times faster than using PowerShell. However, here is the PowerShell script I used:
set-psdebug -strict
$DirectoryToSaveTo='...' #path where to save the data
$ServerName='...' #Hostname\Instance
$Database='...' #database to export
$ErrorActionPreference = "stop"
Trap {
$err = $_.Exception
write-host $err.Message
while( $err.InnerException ) {
$err = $err.InnerException
write-host $err.Message
};
break
}
$v = [System.Reflection.Assembly]::LoadWithPartialName( 'Microsoft.SqlServer.SMO')
if ((($v.FullName.Split(','))[1].Split('='))[1].Split('.')[0] -ne '9') {
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMOExtended') | out-null
}
$My='Microsoft.SqlServer.Management.Smo'
$s = new-object ("$My.Server") $ServerName
$Server=$s.netname -replace '[\\\/\:\.]',' '
$instance = $s.instanceName -replace '[\\\/\:\.]',' '
$DatabaseName =$database -replace '[\\\/\:\.]',' '
$DirectoryToSaveTo=$DirectoryToSaveTo+'\'+$Instance+'\'
if (-not (Test-Path $DirectoryToSaveTo)) { New-Item $DirectoryToSaveTo -ItemType Directory | Out-Null } # make sure the target folder exists
$CreationScriptOptions = new-object ("$My.ScriptingOptions")
$CreationScriptOptions.ExtendedProperties= $true
$CreationScriptOptions.DRIAll= $true
$CreationScriptOptions.Indexes= $true
$CreationScriptOptions.Triggers= $true
$CreationScriptOptions.ScriptBatchTerminator = $true
$CreationScriptOptions.Filename = "$DirectoryToSaveTo$($DatabaseName)_Schema.sql";
$CreationScriptOptions.IncludeHeaders = $true;
$CreationScriptOptions.ToFileOnly = $true # no need of string output as well
$CreationScriptOptions.IncludeIfNotExists = $true # not necessary but it means the script can be more versatile
$transfer = new-object ("$My.Transfer") $s.Databases[$Database]
$transfer.options=$CreationScriptOptions # tell the transfer object of our preferences
$scripter = new-object ("$My.Scripter") $s # script out the database creation
$scripter.options=$CreationScriptOptions # with the same options
$scripter.Script($s.Databases[$Database]) # do it
"USE $Database" | Out-File -Append -FilePath "$($DatabaseName)_Schema.sql"
"GO" | Out-File -Append -FilePath "$($DatabaseName)_Schema.sql"
$transfer.options.AppendToFile=$true
$transfer.options.ScriptDrops=$true
$transfer.EnumScriptTransfer()
$transfer.options.ScriptDrops=$false
$transfer.EnumScriptTransfer()
"All written to $($DatabaseName)_Schema.sql"
This script exports only the schema, but it should be quite obvious how to also export all the data.
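For the data itself, bcp with native format is the fast path; a minimal sketch with hypothetical server, database, table, and path names:
# On S: export one table in native format
bcp MyDatabase.dbo.MyTable out C:\transfer\MyTable.bcp -S MyServer -T -n
# On TS: import the file once the schema script has been applied
bcp MyDatabase.dbo.MyTable in C:\transfer\MyTable.bcp -S TabletServer\SQLEXPRESS -T -n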