I'm using TFS 2015 Update 2 to create a release. One of my release steps is a "PowerShell on Target Machines" task that I'm using to apply a DSC configuration.
I'd like to use the Script Arguments field to pass in parameters from TFS to the DSC script.
My script looks like this:
Param(
    [string]$data
)

configuration ApplyConfig
{
    Script Backup {
        SetScript = {
            # do some stuff with $data
        }
        TestScript = {
            Write-Output "Print param"
            Write-Output $data
            return $true
        }
        GetScript = {
            return @{"Test" = "test data"}
        }
    }
}

ApplyConfig
The Script Arguments field contains this:
-data "$(ApplicationPath)"
However, at this point, $data seems to always be null. How can I get the argument defined in the Script Arguments field into my Script Resource?
When you reference $data in the TestScript, you need the $using: scope modifier:
TestScript = {
    Write-Output "Print param"
    Write-Output $using:data
    return $true
}
The TestScript block executes in a different PowerShell context; the $using: scope modifier copies the value of $data across to that context.
For flexibility, my recommendation is to declare a configuration hash table in your DSC script and pass parameters in to configure it. My blog post Continuous Delivery with TFS / VSTS – Server Configuration and Application Deployment with Release Management has a complete walkthrough of how to use DSC and Release Management in TFS 2015 Update 2.
Getting the parameters in then becomes a case of declaring your parameters as follows:
param(
    [Parameter(Position=1)]
    [string]$myFirstParameter,

    [Parameter(Position=2)]
    [string]$mySecondParameter
)
and then passing the values in, either directly:
Script Arguments field contains 'myFirstValue' 'mySecondValue'
or better as variables:
Script Arguments field contains $(myFirstValue) $(mySecondValue)
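Putting the two pieces together, here is a minimal end-to-end sketch; the parameter names and the $(EnvironmentName) variable are hypothetical, not part of the original question, but it shows values flowing from the Script Arguments field into the Script resource via $using::

param(
    [Parameter(Position=1)]
    [string]$applicationPath,
    [Parameter(Position=2)]
    [string]$environmentName
)

configuration ApplyConfig
{
    Script Backup {
        SetScript = {
            # $using: copies the parameter value into the resource's execution context
            Write-Verbose "Deploying to $using:applicationPath for $using:environmentName"
        }
        TestScript = {
            Write-Verbose "Checking $using:applicationPath"
            return $false
        }
        GetScript = {
            return @{ Result = 'test data' }
        }
    }
}

ApplyConfig

The Script Arguments field would then contain -applicationPath "$(ApplicationPath)" -environmentName "$(EnvironmentName)".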
Related
I have a module with a lot of advanced functions.
I need to use a long list of ValidateSet parameters.
I would like to put the whole list of possible parameters in an array and then use that array in the functions themselves.
How can I pull the list of the whole set from an array?
New-Variable -Name vars3 -Option Constant -Value @("Banana","Apple","PineApple")
function TEST123 {
    param (
        [ValidateScript({$vars3})]
        $Fruit
    )
    Write-Host "$Fruit"
}
The problem is that when I use the function it doesn't pull the content from the constant.
TEST123 -Fruit
If I specify the indexed value of the constant then it works.
TEST123 -Fruit $vars3[1]
It returns Apple.
You are misunderstanding how ValidateScript ...
ValidateScript Validation Attribute
The ValidateScript attribute specifies a script that is used to
validate a parameter or variable value. PowerShell pipes the value to
the script, and generates an error if the script returns $false or if
the script throws an exception.
When you use the ValidateScript attribute, the value that is being
validated is mapped to the $_ variable. You can use the $_ variable to refer to the value in the script.
... works, as the others have pointed out thus far. You are not using a script; you are using a static variable.
To get what I believe you are after, you would do it this way.
(Note that Write-Host is also not needed, since output to the screen is the default in PowerShell. Even so, avoid using Write-Host except in targeted scenarios, like colored screen output. Yet even then you don't need it, as there are several cmdlets that can be used, and ways of getting color with more flexibility. See the modules listed on powershellgallery.com:)
Find-Module -Name '*Color*'
Tweaking the code you posted, and incorporating what Ansgar Wiechers is showing you:
$ValidateSet = @('Banana','Apple','PineApple') # or: (Get-Content -Path 'E:\Temp\FruitValidationSet.txt')

function Test-LongValidateSet
{
    [CmdletBinding()]
    [Alias('tlfvs')]
    Param
    (
        [ValidateScript({
            if ($ValidateSet -contains $PSItem) { $true }
            else { throw $ValidateSet }
        })]
        [String]$Fruit
    )

    "The selected fruit was: $Fruit"
}
# Results - will provide intellisense for the target $ValidateSet
Test-LongValidateSet -Fruit Apple
Test-LongValidateSet -Fruit Dog
# Results
The selected fruit was: Apple
# and on failure, it spits that list out, so you'll want to decide how to handle that
Test-LongValidateSet -Fruit Dog
Test-LongValidateSet : Cannot validate argument on parameter 'Fruit'. Banana Apple PineApple
At line:1 char:29
Just add to the text array / file, but this also means that the file has to be on every host you use this code on, or the host must at least be able to reach a UNC share to get to it.
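If you go the file route, here is a rough sketch of loading the set from a central share with a local fallback (both paths here are examples only, not anything from your environment):

# Example paths only - adjust to your environment
$sharePath = '\\fileserver\config\FruitValidationSet.txt'
$localPath = 'E:\Temp\FruitValidationSet.txt'

# Prefer the central copy on the UNC share; fall back to the local file
$ValidateSet = if (Test-Path -Path $sharePath) {
    Get-Content -Path $sharePath
}
else {
    Get-Content -Path $localPath
}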
Now, you can also use the other documented "dynamic parameter ValidateSet" approach that Lee_Daily points you to look up, but that is a bit more involved to get going.
Example:
function Test-LongValidateSet
{
    [CmdletBinding()]
    [Alias('tlfvs')]
    Param
    (
        # Any other parameters can go here
    )

    DynamicParam
    {
        # Set the dynamic parameter's name
        $ParameterName = 'Fruit'

        # Create the dictionary
        $RuntimeParameterDictionary = New-Object System.Management.Automation.RuntimeDefinedParameterDictionary

        # Create the collection of attributes
        $AttributeCollection = New-Object System.Collections.ObjectModel.Collection[System.Attribute]

        # Create and set the parameter's attributes
        $ParameterAttribute = New-Object System.Management.Automation.ParameterAttribute
        $ParameterAttribute.Mandatory = $true
        $ParameterAttribute.Position = 1

        # Add the attributes to the attributes collection
        $AttributeCollection.Add($ParameterAttribute)

        # Generate and set the ValidateSet
        $arrSet = Get-Content -Path 'E:\Temp\FruitValidationSet.txt'
        $ValidateSetAttribute = New-Object System.Management.Automation.ValidateSetAttribute($arrSet)

        # Add the ValidateSet to the attributes collection
        $AttributeCollection.Add($ValidateSetAttribute)

        # Create and return the dynamic parameter
        $RuntimeParameter = New-Object System.Management.Automation.RuntimeDefinedParameter($ParameterName, [string], $AttributeCollection)
        $RuntimeParameterDictionary.Add($ParameterName, $RuntimeParameter)
        return $RuntimeParameterDictionary
    }

    begin
    {
        # Bind the parameter to a friendly variable
        $Fruit = $PsBoundParameters[$ParameterName]
    }

    process
    {
        # Your code goes here
        $Fruit
    }
}
# Results - provide intellisense for the target $arrSet
Test-LongValidateSet -Fruit Banana
Test-LongValidateSet -Fruit Cat
# Results
Test-LongValidateSet -Fruit Banana
Banana
Test-LongValidateSet -Fruit Cat
Test-LongValidateSet : Cannot validate argument on parameter 'Fruit'. The argument "Cat" does not belong to the set "Banana,Apple,PineApple"
specified by the ValidateSet attribute. Supply an argument that is in the set and then try the command again.
At line:1 char:29
Again, just add the text to the file, and again, this also means that the file has to be on every host you use this code on, or the host must at least be able to reach a UNC share to get to it.
I am not sure exactly what your use case is, but another possibility, if you're using PowerShell 5.x or newer, is to create a class; or, if you're using an older version, you could embed a little C# in your code to create an enum that you can use:
Add-Type -TypeDefinition #"
public enum Fruit
{
Strawberry,
Orange,
Apple,
Pineapple,
Kiwi,
Blueberry,
Raspberry
}
"#
Function TestMe {
    Param(
        [Fruit]$Fruit
    )
    Write-Output $Fruit
}
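For the PowerShell 5.x route mentioned above, the native enum keyword gives the same effect without Add-Type. A minimal sketch (different names are used here to avoid clashing with the C# Fruit enum above):

# PowerShell 5.0+ only: declare the enum directly in PowerShell
enum FruitKind {
    Strawberry
    Orange
    Apple
    Pineapple
}

Function TestMe2 {
    Param(
        [FruitKind]$Fruit
    )
    Write-Output $Fruit
}

# Tab completion offers the enum values; an invalid value fails parameter binding
TestMe2 -Fruit Apple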
I had prepared a script to pull a report on SQL Server, and the output is pushed to different CSV files. After the output is generated, all the CSVs are merged into a single Excel file with the help of a custom function, and that Excel file is sent to my email address.
While running through powershell_ise.exe, it runs fine and I receive the email successfully. When I schedule the same script, I receive the email but without the Excel attachment. I suspect the custom function is not being used, because I don't see any converted Excel files in the desired location.
I tried all possible ways, like dot sourcing and pasting the function in the script itself, but no luck.
I am a beginner in PowerShell; can someone please help me if I am missing something?
Thanks,
Anil
Function Merge-CSVFiles
{
    Param(
        $CSVPath = "D:\Anil\Missing_Indexes",      ## Source CSV folder
        $XLOutput = "D:\Anil\Missing_Indexes.xls"  ## Output file name
    )
    $csvFiles = Get-ChildItem ("$CSVPath\*") -Include *.csv
    $Excel = New-Object -ComObject excel.application
    $Excel.visible = $false
    $Excel.sheetsInNewWorkbook = $csvFiles.Count
    $workbooks = $excel.Workbooks.Add()
    $CSVSheet = 1
    Foreach ($CSV in $csvFiles)
    {
        $worksheets = $workbooks.worksheets
        $CSVFullPath = $CSV.FullName
        $SheetName = ($CSV.name -split "\.")[0]
        $worksheet = $worksheets.Item($CSVSheet)
        $worksheet.Name = $SheetName
        $TxtConnector = ("TEXT;" + $CSVFullPath)
        $CellRef = $worksheet.Range("A1")
        $Connector = $worksheet.QueryTables.add($TxtConnector, $CellRef)
        $worksheet.QueryTables.item($Connector.name).TextFileCommaDelimiter = $True
        $worksheet.QueryTables.item($Connector.name).TextFileParseType = 1
        $worksheet.QueryTables.item($Connector.name).Refresh()
        $worksheet.QueryTables.item($Connector.name).delete()
        $worksheet.UsedRange.EntireColumn.AutoFit()
        $CSVSheet++
    }
    $workbooks.SaveAs($XLOutput, 51)
    $workbooks.Saved = $true
    $workbooks.Close()
    [System.Runtime.Interopservices.Marshal]::ReleaseComObject($workbooks) | Out-Null
    $excel.Quit()
    [System.Runtime.Interopservices.Marshal]::ReleaseComObject($excel) | Out-Null
    [System.GC]::Collect()
    [System.GC]::WaitForPendingFinalizers()
}
While running through powershell_ise.exe, it is running fine and I am receiving the email successfully
This is because the ISE on your box is able to load your custom function during runtime and generate the report.
But when this is scheduled through a job, the script will run on a different server than your box, hence the script will not be able to find the function and you don't get the report.
I have faced this kind of issue before while using custom functions.
The workaround that I can suggest is wrapping your custom functions in a separate module and importing the module in your main script (preferably save the module in the same location as your script for easy troubleshooting).
Example:
Save your function in a .psm1 module file
Function ScriptExample {
    Param (
        [string] $script,
        [string] $jobname,
        [string] $jobcategory,
        [hashtable] $config,
        [string] $deletelogfilepath,
        [string] $servername
    )

    # your function body goes here

    Return $script
}
Now call this module in your main script as follows,
$importmoduleroot = "C:\temp\SO_Example.psm1"

###### LOAD MODULES ######
# Import all related modules written to support this process
$modules = Get-ChildItem -Path $importmoduleroot -Include SO*.psm1 -Recurse
$modules | ForEach-Object { Import-Module $_.FullName -Force }
You can then call your function and pass in the parameters within the main script,
ScriptExample -script $script `
-jobname $jobname `
-jobcategory $jobcategory `
-config $config `
-servername $ServerName `
-deletelogfilepath $deletelogfilepath
I'm looking to combine data from two sources in an existing script.
I have one list of servers which is pulled via a Citrix command called Get-XAServer. Using this cmdlet, an array is created, with two properties, server and logonmode. Running $1stList looks like this:
SERVER LOGONMODE
Server1 AllowLogOns
Server2 AllowLogOns
Now, I want to add to this list some servers that can't be pulled via the Get-XAServer cmdlet. So, inside the script, I've just got an array variable with a list of servers in the following format:
$2ndList = "Server3", "Server4", "Server5"
Problem is, the SERVER property isn't attached to the 2nd list, so when I try to combine the arrays, they aren't parsed properly.
How do I iterate through the 2nd list so that the SERVER and LOGONMODE properties are added to every server in the $2ndList array?
You could use a foreach:
foreach ($server in $2ndList) {
    $1stList += [pscustomobject]@{
        SERVER    = $server
        LOGONMODE = ""
    }
}
Or a ForEach-Object loop:
$2ndList | % {
    $1stList += [pscustomobject]@{
        SERVER    = $_
        LOGONMODE = ""
    }
}
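As a side note, += rebuilds the array on every iteration. If the lists get large, a variant (same names as above) that collects the new objects in one assignment avoids that:

# foreach used as an expression gathers every emitted object into an array
$newEntries = foreach ($server in $2ndList) {
    [pscustomobject]@{
        SERVER    = $server
        LOGONMODE = ""
    }
}

# Combine the original Get-XAServer output with the new entries
$1stList = @($1stList) + @($newEntries)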
I have a report in SSRS, which has a parameter in it. For each possibility in the parameter, I need an Excel file. This comes down to 50 Excel files. The only way I know to schedule a report is to go to the Reporting Services home page, go to my report, click Manage, click Subscriptions > New subscription, and then enter a file name, path, user name, password, schedule, and parameter, and finally press OK.
Is there a quicker way to do this, or is there a way which allows me to create the 50 reports more quickly, like copying a subscription or something like that?
Try creating an SSIS package and running the report for all values of the parameter. I had seen someone do this at my previous company.
Data-driven subscriptions are available only in Enterprise and Developer editions; yours could be Standard.
You could also write a script in PowerShell or write an app in C#/VB. Here is an example done in PowerShell. Here is an example done in C#. Using either of these approaches, you can programmatically render the reports as you see fit, and you can create subscriptions this way as well.
PowerShell solution to the OP:
# Create a proxy to the SSRS server and give it the namespace of 'RS' to use for
# instantiating objects later. This class will also be used to create a report
# object.
$reportServerURI = "http://<SERVER>/ReportServer/ReportExecution2005.asmx?WSDL"
$RS = New-WebServiceProxy -Class 'RS' -NameSpace 'RS' -Uri $reportServerURI -UseDefaultCredential
$RS.Url = $reportServerURI
# Set up some variables to hold referenced results from Render
$deviceInfo = "<DeviceInfo><NoHeader>True</NoHeader></DeviceInfo>"
$extension = ""
$mimeType = ""
$encoding = ""
$warnings = $null
$streamIDs = $null
# Next we need to load the report. Since Powershell cannot pass a null string
# (it instead just passes ""), we have to use GetMethod / Invoke to call the
# function that returns the report object. This will load the report in the
# report server object, as well as create a report object that can be used to
# discover information about the report. It's not used in this code, but it can
# be used to discover information about what parameters are needed to execute
# the report.
$reportPath = "/PathTo/Report"
$Report = $RS.GetType().GetMethod("LoadReport").Invoke($RS, @($reportPath, $null))
# Report parameters are handled by creating an array of ParameterValue objects.
# $excelInput: either pass in as a parameter and run 50 times, or reset
# this value and run it each time with the updated excel file
$excelInput = "<ExcelFile>";
$parameters = @()
$parameters += New-Object RS.ParameterValue
$parameters[0].Name = "Excel Input File"
$parameters[0].Value = $excelInput
# Add the parameter array to the service. Note that this returns some
# information about the report that is about to be executed.
$RS.SetExecutionParameters($parameters, "en-us") > $null
# Render the report to a byte array. The first argument is the report format.
# The formats I've tested are: PDF, XML, CSV, WORD (.doc), EXCEL (.xls),
# IMAGE (.tif), MHTML (.mhtml).
$RenderOutput = $RS.Render('PDF',
$deviceInfo,
[ref] $extension,
[ref] $mimeType,
[ref] $encoding,
[ref] $warnings,
[ref] $streamIDs
)
# Convert array bytes to file and write
$OutputFile = $excelInput + ".pdf"
$Stream = New-Object System.IO.FileStream($OutputFile, [System.IO.FileMode]::Create, [System.IO.FileAccess]::Write)
$Stream.Write($RenderOutput, 0, $RenderOutput.Length)
$Stream.Close()
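To produce all 50 files, one way (a sketch only; the value list, output folder, and the EXCEL render format are assumptions on my part) is to wrap the parameter setup and Render call in a loop over the parameter values:

# Assumed: one parameter value per line in a text file, plus an output folder
$parameterValues = Get-Content -Path 'C:\Reports\ParameterValues.txt'
$outputFolder    = 'C:\Reports\Output'

foreach ($value in $parameterValues) {
    # Re-apply the parameter for this iteration
    $parameters = @()
    $parameters += New-Object RS.ParameterValue
    $parameters[0].Name  = "Excel Input File"
    $parameters[0].Value = $value
    $RS.SetExecutionParameters($parameters, "en-us") > $null

    # Render to Excel this time and write the bytes straight to disk
    $RenderOutput = $RS.Render('EXCEL',
        $deviceInfo,
        [ref] $extension,
        [ref] $mimeType,
        [ref] $encoding,
        [ref] $warnings,
        [ref] $streamIDs
    )
    $OutputFile = Join-Path $outputFolder ("$value.xls")
    [System.IO.File]::WriteAllBytes($OutputFile, $RenderOutput)
}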
Using SQL Server 2008 R2 and Powershell 2.0
I have created a hash of options that I want to set in a SQL Server database like so:
$dbopts = @{
    "AutoCreateStatistics" = $true;
    "AutoUpdateStatistics" = $true;
    "AutoShrink"           = $false;
};
I want to set these options on a SQL Server database using SMO. I created a function which accepts a SMO Database object and the DatabaseOptions hash as inputs.
function setDatabaseOptions {
    param ($db, $opts);
    foreach ($opt in $db.DatabaseOptions) {
        # what to write here?
    }
}
I want to apply the options in the hash to the database, but I'm not sure of the best way to do this. Any suggestions? As an example, to set an option explicitly you would do:
$db.DatabaseOptions.AutoCreateStatistics = $true
With no disrespect to Shay: as in his last comment, try enumerating the hashtable.
function setDatabaseOptions {
    param ($db, $opts)
    foreach ($key in $opts.Keys) { $db.DatabaseOptions.$key = $opts[$key] }
    $db.DatabaseOptions.Alter()
}
You assign the correct hash item to the property of the database. I don't work with SMO but you may also need to invoke a method to apply the updates.
function setDatabaseOptions {
    param ($db, $opts)
    $opt = $db.DatabaseOptions
    $opt.AutoCreateStatistics = $opts['AutoCreateStatistics']
    $opt.AutoUpdateStatistics = $opts['AutoUpdateStatistics']
    $opt.AutoShrink           = $opts['AutoShrink']
}