I have a hashtable that I add to when I am creating jobs, with the following code:
$jobtimer = @{}
1..10 | foreach {
Start-Job -name $_ -ScriptBlock {
Start-Sleep -Seconds (Get-Random -Minimum 456 -Maximum 46546520)
}
$jobtimer[$_] = [System.Diagnostics.Stopwatch]::StartNew()
}
(thank you @mjolinor for help with this by the way!!)
I then loop through my jobs and fetch time elapsed on each job. But I'm having problems reading the values of the array. If I directly reference an index it works:
$jobtimer[4].elapsed.totalseconds
But if I try to loop through my jobs and output the elapsed time I get nothing:
get-job | foreach {
echo $jobtimer[$_.name].elapsed.totalseconds
}
How do I loop through and reference my array?
This is because you are using an integer key when setting, but a string key when getting: a job's name is a string, so the jobs have names like "1", "2", etc., not the integers 1, 2.
Either do the following when setting:
$jobtimer["$_"] = [System.Diagnostics.Stopwatch]::StartNew()
Or the following when getting:
$jobtimer[[int]$_.name]
Add a cast from string to int when indexing into the hashtable:
get-job | foreach { $jobtimer[[int]$_.name] }
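Putting it together, a minimal sketch of the retrieval loop from the question with the cast added:
get-job | foreach {
    # job names are strings, so cast back to the integer key used when the timer was stored
    $elapsed = $jobtimer[[int]$_.Name].Elapsed.TotalSeconds
    "Job $($_.Name): $elapsed seconds"
}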
I am trying to use the $a variable in this script for intermediate steps so that I don't have to use $array[$array.Count-1] repeatedly, and similarly for $prop. However, the values are being overwritten by the last value in the loop.
$guests = Import-Csv -Path C:\Users\shant_000\Desktop\UploadGuest_test.csv
$output = gc '.\Sample Json.json' | ConvertFrom-Json
$array = New-Object System.Collections.ArrayList;
foreach ($g in $guests) {
$array.Add($output);
$a = $array[$array.Count-1];
$a.Username = $g.'EmailAddress';
$a.DisplayName = $g.'FirstName' + ' ' + $g.'LastName';
$a.Password = $g.'LastName' + '123';
$a.Email = $g.'EmailAddress';
foreach ($i in $a.ProfileProperties.Count) {
$j = $i - 1;
$prop = $a.ProfileProperties[$j];
if ($prop.PropertyName -eq "FirstName") {
$prop.PropertyValue = $g.'FirstName';
} elseif ($prop.PropertyName -eq "LastName") {
$prop.PropertyValue = $g.'LastName';
}
$a.ProfileProperties[$j] = $prop;
}
$array[$array.Count-1] = $a;
}
$array;
All of the array elements reference the same single object: $output.
Create an entirely new object each time by repeating the JSON parsing:
$jsontext = gc '.\Sample Json.json'
..........
foreach ($g in $guests) {
$a = $jsontext | ConvertFrom-Json
# process $a
# ............
$array.Add($a) >$null
}
In case the JSON file is very big and you change only a few parts of it you can use a faster cloning technique on the changed parts (and their entire parent chain) via .PSObject.Copy():
foreach ($g in $guests) {
$a = $output.PSObject.Copy()
# ............
$a.ProfileProperties = $a.ProfileProperties.PSObject.Copy()
# ............
foreach ($i in $a.ProfileProperties.Count) {
# ............
$prop = $a.ProfileProperties[$j].PSObject.Copy();
# ............
}
$array.Add($a) >$null
}
As others have pointed out, appending $output appends a reference to the same single object, so you keep changing the values for all elements in the list. Unfortunately the approach @wOxxOm suggested (which I thought would work at first, too) doesn't work if your JSON data structure has nested objects, because Copy() only clones the topmost object while the nested objects remain references to the originals.
Demonstration:
PS C:\> $o = '{"foo":{"bar":42},"baz":23}' | ConvertFrom-Json
PS C:\> $o | Format-Custom *
class PSCustomObject
{
foo =
class PSCustomObject
{
bar = 42
}
baz = 23
}
PS C:\> $o1 = $o
PS C:\> $o2 = $o.PSObject.Copy()
If you change the nested property bar on $o1 and $o2, both objects end up with whichever value was set last:
PS C:\> $o1.foo.bar = 23
PS C:\> $o2.foo.bar = 24
PS C:\> $o1.foo.bar
24
PS C:\> $o2.foo.bar
24
Only if you change a property of the topmost object do you get a difference between $o1 and $o2:
PS C:\> $o1.baz = 5
PS C:\> $o.baz
5
PS C:\> $o1.baz
5
PS C:\> $o2.baz
23
While you could do a deep copy, it's not as simple and straightforward as one would like to think. Usually it takes less effort (and simpler code) to just create the object multiple times, as @PetSerAl suggested in the comments to your question.
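If you really do need a deep copy, one blunt but simple option is a serialization round trip; this is just a sketch, and it assumes the structure survives ConvertTo-Json's depth limit:
# re-serialize and re-parse to get an independent copy, nested objects included
$deepCopy = $o | ConvertTo-Json -Depth 10 | ConvertFrom-Json
$deepCopy.foo.bar = 99   # no longer affects $o.foo.bar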
I'd also recommend avoiding appends to an array (or ArrayList) in a loop. You can simply echo your objects inside the loop and collect the entire output as a list/array by assigning the loop to a variable:
$json = Get-Content '.\Sample Json.json' -Raw
$array = foreach ($g in $guests) {
$a = $json | ConvertFrom-Json # create new object
$a.Username = $g.'EmailAddress'
...
$a # echo object, so it can be collected in $array
}
Use Get-Content -Raw on PowerShell v3 and newer (or Get-Content | Out-String on earlier versions) to avoid issues with multiline JSON data in the JSON file.
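For example (using the path from the question):
$json = Get-Content '.\Sample Json.json' -Raw          # PowerShell v3 and newer
$json = Get-Content '.\Sample Json.json' | Out-String  # PowerShell v2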
My cmdlet get-objects returns an array of MyObject with public properties:
public class MyObject{
public string testString = "test";
}
I want users without programming skills to be able to modify public properties (like testString in this example) from all objects of the array.
Then feed the modified array to my second cmdlet which saves the object to the database.
That means the syntax of the "editing code" must be as simple as possible.
It should look somewhat like this:
> get-objects | foreach{$_.testString = "newValue"} | set-objects
I know that this is not possible, because $_ just returns a copy of the element from the array.
So you'd need to access the elements by index in a loop and then modify the property. This gets complicated really quickly for people who are not familiar with programming.
Is there any "user-friendly" built-in way of doing this? It shouldn't be more "complex" than a simple foreach {property = value}
I know that this is not possible, because $_ just returns a copy of the element from the array (https://social.technet.microsoft.com/forums/scriptcenter/en-US/a0a92149-d257-4751-8c2c-4c1622e78aa2/powershell-modifying-array-elements)
I think you're misinterpreting the answer in that thread.
$_ is indeed a local copy of the value returned by whatever enumerator you're currently iterating over - but you can still return your modified copy of that value (as pointed out in the comments):
Get-Objects | ForEach-Object {
# modify the current item
$_.propertyname = "value"
# drop the modified object back into the pipeline
$_
} | Set-Objects
In (allegedly impossible) situations where you need to modify a stored array of objects, you can use the same technique to overwrite the array with the new values:
PS C:\> $myArray = 1,2,3,4,5
PS C:\> $myArray = $myArray |ForEach-Object {
>>> $_ *= 10
>>> $_
>>>}
>>>
PS C:\> $myArray
10
20
30
40
50
That means the syntax of the "editing code" must be as simple as possible.
Thankfully, PowerShell is very powerful in terms of introspection. You could implement a wrapper function that adds the $_; statement to the end of the loop body, in case the user forgets:
function Add-PsItem
{
[CmdletBinding()]
param(
[Parameter(Mandatory,ValueFromPipeline,ValueFromRemainingArguments)]
[psobject[]]$InputObject,
[Parameter(Mandatory)]
[scriptblock]$Process
)
begin {
$InputArray = @()
# fetch the last statement in the scriptblock
$EndBlock = $Process.Ast.EndBlock
$LastStatement = $EndBlock.Statements[-1].Extent.Text.Trim()
# check if the last statement is `$_`
if($LastStatement -ne '$_'){
# if not, add it
$Process = [scriptblock]::Create('{0};$_' -f $Process.ToString())
}
}
process {
# collect all the input
$InputArray += $InputObject
}
end {
# pipe input to foreach-object with the new scriptblock
$InputArray | ForEach-Object -Process $Process
}
}
Now the users can do:
Get-Objects | Add-PsItem {$_.testString = "newValue"} | Set-Objects
The ValueFromRemainingArguments attribute also lets users supply input as unbounded parameter values:
PS C:\> Add-PsItem { $_ *= 10 } 1 2 3
10
20
30
This might be helpful if the user is not used to working with the pipeline.
Here's a more general approach, arguably easier to understand, and less fragile:
# $dataSource would be get-object in the OP
# $dataUpdater is the script the user supplies to modify properties
# $dataSink would be set-object in the OP
function Update-Data {
param(
[scriptblock] $dataSource,
[scriptblock] $dataUpdater,
[scriptblock] $dataSink
)
& $dataSource |
% {
$updaterOutput = & $dataUpdater
# This "if" allows $dataUpdater to create an entirely new object, or
# modify the properties of an existing object
if ($updaterOutput -eq $null) {
$_
} else {
$updaterOutput
}
} |
% $dataSink
}
Here are a couple of examples of use. The first example isn't applicable to the OP, but it's being used to create a data set that is applicable (a set of objects with properties).
# Use update-data to create a set of data with properties
#
$theDataSource = @() # will be filled in by first update-data
update-data {
# data source
0..4
} {
# data updater: creates a new object with properties
New-Object psobject |
# add-member uses hash table created on the fly to add properties
# to a psobject
add-member -passthru -NotePropertyMembers @{
room = @('living','dining','kitchen','bed')[$_];
size = @(320, 200, 250, 424 )[$_]}
} {
# data sink
$global:theDataSource += $_
}
$theDataSource | ft -AutoSize
# Now use update-data to modify properties in the data set
# this $dataUpdater updates the 'size' property
#
$theDataSink = @()
update-data { $theDataSource } { $_.size *= 2} { $global:theDataSink += $_}
$theDataSink | ft -AutoSize
And then the output:
room size
---- ----
living 320
dining 200
kitchen 250
bed 424
room size
---- ----
living 640
dining 400
kitchen 500
bed 848
As described above update-data relies on a "streaming" data source and sink. There is no notion of whether the first or fifteenth element is being modified. Or if the data source uses a key (rather than an index) to access each element, the data sink wouldn't have access to the key. To handle this case a "context" (for example an index or a key) could be passed through the pipeline along with the data item. The $dataUpdater wouldn't (necessarily) need to see the context. Here's a revised version with this concept added:
# $dataSource and $dataSink scripts need to be changed to output/input an
# object that contains both the object to modify, as well as the context.
# To keep it simple, $dataSource will output an array with two elements:
# the value and the context. And $dataSink will accept an array (via $_)
# containing the value and the context.
function Update-Data {
param(
[scriptblock] $dataSource,
[scriptblock] $dataUpdater,
[scriptblock] $dataSink
)
& $dataSource |
% {
$saved_ = $_
# Set $_ to the data object
$_ = $_[0]
$updaterOutput = & $dataUpdater
if ($updaterOutput -eq $null) { $updaterOutput = $_ }
# emit the (value, context) pair as a single pipeline object
,($updaterOutput, $saved_[1])
} |
% $dataSink
}
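A hypothetical usage sketch (the variable names and the doubling logic are illustrations, not part of the original answer): the source emits (value, key) pairs, the updater doubles each value, and the sink writes the result back under the same key.
$data    = [ordered]@{ a = 1; b = 2; c = 3 }
$updated = @{}
Update-Data {
    # data source: emit each (value, key) pair as a single pipeline object
    $data.GetEnumerator() | % { ,($_.Value, $_.Key) }
} {
    # data updater: sees only the value in $_
    $_ * 2
} {
    # data sink: $_ is (updated value, key)
    $updated[$_[1]] = $_[0]
}
$updated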
I have a set of strings gathered from logs that I'm trying to parse into unique entries:
function Scan ($path, $logPaths, $pattern)
{
$logPaths | % `
{
$file = $_.FullName
Write-Host "`n[$file]"
Get-Content $file | Select-String -Pattern $pattern -CaseSensitive -AllMatches | % `
{
$regexDateTime = New-Object System.Text.RegularExpressions.Regex "((?:\d{4})-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}(,\d{3})?)"
$matchDate = $regexDateTime.match($_)
if($matchDate.success)
{
$loglinedate = [System.DateTime]::ParseExact($matchDate, "yyyy-MM-dd HH:mm:ss,FFF", [System.Globalization.CultureInfo]::InvariantCulture)
if ($loglinedate -gt $laterThan)
{
$date = $($_.toString().TrimStart() -split ']')[0]
$message = $($_.toString().TrimStart() -split ']')[1]
$messageArr += ,$date,$message
}
}
}
$messageArr | sort $message -Unique | foreach { Write-Host -f Green $date$message}
}
}
So for this input:
2015-09-04 07:50:06 [20] WARN Core.Ports.Services.ReferenceDataCheckers.SharedCheckers.DocumentLibraryMustExistService - A DocumentLibrary 3 could not be found.
2015-09-04 07:50:06 [20] WARN Core.Ports.Services.ReferenceDataCheckers.SharedCheckers.DocumentLibraryMustExistService - A DocumentLibrary 3 could not be found.
2015-09-04 07:50:16 [20] WARN Brighter - The message abc123 has been marked as obsolete by the consumer as the entity has a higher version on the consumer side.
Only two entries should be returned, since the first two are duplicates.
I'm having trouble filtering out duplicates of $message: currently all entries are being returned (sort -Unique is not behaving as I would expect it to). I also need the correct $date to be returned against the filtered $message.
I'm pretty stuck with this, can anyone help?
We can do what you want, but first let's backup just a little bit to help us do this better. Right now you have an array of arrays, and that's difficult to work with in general. What would be better is if you had an array of objects, and those objects had properties such as Date and Message. Let's start there.
if ($loglinedate -gt $laterThan)
{
$date = $($_.toString().TrimStart() -split ']')[0]
$message = $($_.toString().TrimStart() -split ']')[1]
$messageArr += ,$date,$message
}
is going to become...
if ($loglinedate -gt $laterThan)
{
[Array]$messageArr += [PSCustomObject]@{
'date' = $($_.toString().TrimStart() -split ']')[0]
'message' = $($_.toString().TrimStart() -split ']')[1]
}
}
That produces an array of objects, and each object has two properties, Date and Message. That will be much easier to work with.
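Incidentally, once the entries are objects with a real message property, the sort -Unique idea from the question also works if you sort on the property name (though it keeps an arbitrary one of the duplicate rows, not necessarily the one with the latest date):
$messageArr | Sort-Object message -Unique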
If you only want the latest version of any message that's easily done with the Group-Object command as such:
$FilteredArr = $messageArr | Group Message | ForEach{$_.Group|sort Date|Select -Last 1}
Then if you want to display it to screen like you are, you could do:
$FilteredArr|ForEach{Write-Host -f Green ("{0}`t{1}" -f $_.Date, $_.Message)}
My take (not tested):
function Scan ($path, $logPaths, $pattern)
{
$regex = '(\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2})\s(.+)'
$ht = @{}
$logPaths | % `
{
$file = $_.FullName
Write-Host "`n[$file]"
Get-Content $file | Select-String -Pattern $pattern -CaseSensitive -AllMatches | % `
{
if ($_.line -match $regex -and $matches[1] -gt $ht[$matches[2]])
{ $ht[$matches[2]] = $matches[1] }
}
$ht.GetEnumerator() |
sort Value |
foreach { Write-Host -f Green "$($_.Value)$($_.Name)" }
}
}
This splits each matching line at the timestamp and loads the parts into a hash table, using the error message as the key and the timestamp as the data (this will de-dupe the messages in-stream).
The timestamps are already in string-sortable format (yyyy-MM-dd HH:mm:ss), so there's really no need to cast them to [datetime] to find the latest one. Just do a straight string compare, and if the incoming timestamp is greater than an existing value for that message, replace the existing value with the new one.
When you're done, you should have a hash table with a key for each unique message found, having a value of the latest timestamp found for that message.
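For example, these sortable timestamp strings compare correctly with plain string operators, which is what the hash table update relies on:
'2015-09-04 07:50:16' -gt '2015-09-04 07:50:06'   # True
'2015-09-04 07:50:06' -gt '2015-09-04 07:50:16'   # False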
Is there a function that reverses elements passed via pipeline?
E.g.:
PS C:\> 10, 20, 30 | Reverse
30
20
10
You can cast $input within the function directly to an array, and then reverse that:
function reverse
{
$arr = @($input)
[array]::reverse($arr)
$arr
}
Another one-liner: sort on a counter that is incremented for each element, in descending order, which reverses the pipeline input:
10, 20, 30 | Sort-Object -Descending {(++$script:i)}
Here's one approach:
function Reverse ()
{
$arr = $input | ForEach-Object { $_ }
[array]::Reverse($arr)
return $arr
}
Using $input works for pipeline input, but not for a parameter. Try this:
function Get-ReverseArray {
Param(
[Parameter(Mandatory = $true, Position = 0, ValueFromPipeLine = $true)]
$Array
)
begin{
$build = @()
}
process{
$build += @($Array)
}
end{
[array]::reverse($build)
$build
}
}
#set alias to use non-standard verb, not required.. just looks nicer
New-Alias -Name Reverse-Array -Value Get-ReverseArray
Test:
$a = "quick","brown","fox"
#--- these all work
$a | Reverse-Array
Reverse-Array -Array $a
Reverse-Array $a
#--- Output for each
fox
brown
quick
Here's a remarkably compact solution (a Stack enumerates in last-in, first-out order, so constructing one from the input yields the elements reversed):
function Reverse
{
[System.Collections.Stack]::new(@($input))
}
I realize this doesn't use a pipe, but if you just want to do this inline there is a simple way: put your values into an array and call the existing Reverse method like this:
$list = 30,20,10
[array]::Reverse($list)
# print the output.
$list
Put that in a PowerShell ISE window and run it; the output will be 10, 20, 30.
Again, this is only if you want to do it inline; the built-in [array]::Reverse() does the work for you.
$array = 10,20,30; (($array.Length - 1)..0) | %{ $array[$_] }
You can use Linq.Enumerable.Reverse, but it's hard work. Either of the following works:
,(10, 20, 30) | % { [Linq.Enumerable]::Reverse([int[]]$_) }
,([int[]]10, 20, 30) | % { [Linq.Enumerable]::Reverse($_) }
The two challenges are putting the whole array into the pipeline rather than one element at a time (solved by making it an array of arrays with the initial comma) and hitting the type signature of Reverse (solved by the [int[]]).
Try (note that this sorts the values in descending order, so it only reverses input that is already in ascending order):
10, 20, 30 | Sort-Object -Descending
The script below reads my Outlook emails, but how do I access the output? I'm new to PowerShell and I'm still getting used to certain things. I just want to get the body of 10 unread Outlook emails and store them in an array called $Body.
$olFolderInbox = 6
$outlook = new-object -com outlook.application;
$ns = $outlook.GetNameSpace("MAPI");
$inbox = $ns.GetDefaultFolder($olFolderInbox)
#checks 10 newest messages
$inbox.items | select -first 10 | foreach {
if($_.unread -eq $True) {
$mBody = $_.body
#Splits the line before any previous replies are loaded
$mBodySplit = $mBody -split "From:"
#Assigns only the first message in the chain
$mBodyLeft = $mbodySplit[0]
#build a string using the –f operator
$q = "From: " + $_.SenderName + ("`n") + " Message: " + $mBodyLeft
#create the COM object and invoke the Speak() method
(New-Object -ComObject SAPI.SPVoice).Speak($q) | Out-Null
}
}
This may not be a factor here, since you're looping through only ten elements, but using += to add elements to an array is very slow.
Another approach would be to output each element within the loop, and assign the results of the loop to $body. Here's a simplified example, assuming that you want $_.body:
$body = $inbox.items | select -first 10 | foreach {
if($_.unread -eq $True) {
$_.body
}
}
This works because anything that is output during the loop will be assigned to $body. And it can be much faster than using +=. You can verify this for yourself. Compare the two methods of creating an array with 10,000 elements:
Measure-Command {
$arr = @()
1..10000 | % {
$arr += $_
}
}
On my system, this takes just over 14 seconds.
Measure-Command {
$arr = 1..10000 | % {
$_
}
}
On my system, this takes 0.97 seconds, which makes it over 14 times faster. Again, probably not a factor if you are just looping through 10 items, but something to keep in mind if you ever need to create larger arrays.
Define $body = @(); before your loop.
Then just use += to add the elements.
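A minimal sketch of that approach, reusing the loop from the question:
$body = @()
$inbox.items | select -first 10 | foreach {
    if ($_.unread -eq $True) {
        # append each unread message body to the array
        $body += $_.body
    }
}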
Here's another way:
$body = $inbox.Items.Restrict('[Unread]=true') | Select-Object -First 10 -ExpandProperty Body