The script below reads my Outlook emails, but how do I access the output? I'm new to PowerShell and still getting used to certain things. I just want to get the body of 10 unread Outlook emails and store them in an array called $Body.
$olFolderInbox = 6
$outlook = new-object -com outlook.application;
$ns = $outlook.GetNameSpace("MAPI");
$inbox = $ns.GetDefaultFolder($olFolderInbox)
#checks 10 newest messages
$inbox.items | select -first 10 | foreach {
if($_.unread -eq $True) {
$mBody = $_.body
#Splits the line before any previous replies are loaded
$mBodySplit = $mBody -split "From:"
#Assigns only the first message in the chain
$mBodyLeft = $mbodySplit[0]
#build the string that will be spoken
$q = "From: " + $_.SenderName + ("`n") + " Message: " + $mBodyLeft
#create the COM object and invoke the Speak() method
(New-Object -ComObject SAPI.SPVoice).Speak($q) | Out-Null
}
}
This may not be a factor here, since you're looping through only ten elements, but using += to add elements to an array is very slow.
Another approach would be to output each element within the loop, and assign the results of the loop to $body. Here's a simplified example, assuming that you want $_.body:
$body = $inbox.items | select -first 10 | foreach {
if($_.unread -eq $True) {
$_.body
}
}
This works because anything that is output during the loop will be assigned to $body. And it can be much faster than using +=. You can verify this for yourself. Compare the two methods of creating an array with 10,000 elements:
Measure-Command {
$arr = @()
1..10000 | % {
$arr += $_
}
}
On my system, this takes just over 14 seconds.
Measure-Command {
$arr = 1..10000 | % {
$_
}
}
On my system, this takes 0.97 seconds, which makes it over 14 times faster. Again, probably not a factor if you are just looping through 10 items, but something to keep in mind if you ever need to create larger arrays.
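If you ever do need to append inside a loop (say, because the loop produces output you don't want collected automatically), a generic List avoids the repeated array copying behind +=. A minimal sketch of the same 10,000-element test, assuming integer elements:
Measure-Command {
    # List[T] grows in place instead of rebuilding the array on every add
    $list = New-Object 'System.Collections.Generic.List[int]'
    1..10000 | % {
        $list.Add($_)
    }
}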
Define $body = @() before your loop, then just use += to add the elements.
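Applied to the loop from the question, that might look something like this (only the collection logic is shown; for ten messages the overhead of += is negligible):
$body = @()
$inbox.items | select -first 10 | foreach {
    if ($_.unread -eq $True) {
        $body += $_.body
    }
}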
Here's another way:
$body = $inbox.Items.Restrict('[Unread]=true') | Select-Object -First 10 -ExpandProperty Body
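Dropped into the setup from the question, the whole thing might look something like this; Restrict asks Outlook to filter the Items collection before anything reaches PowerShell, so only unread items come back:
$olFolderInbox = 6
$outlook = New-Object -ComObject Outlook.Application
$ns = $outlook.GetNameSpace("MAPI")
$inbox = $ns.GetDefaultFolder($olFolderInbox)
$Body = $inbox.Items.Restrict('[Unread]=true') | Select-Object -First 10 -ExpandProperty Body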
I'm trying to find the row whose attribute is larger than the other rows' attributes. Example:
$Array
Name Value
---- ----
test1 105
test2 101
test3 512 <--- Selects this row as it is the largest value
Here is my attempt to one-line this, but it doesn't work.
$Array | % { If($_.value -gt $Array[0..($Array.Count)].value){write-host "$_.name is the largest row"}}
Currently it outputs nothing.
Desired Output:
"test1 is the largest row"
I'm having trouble visualizing how to do this efficiently without some serious spaghetti code.
You could take advantage of Sort-Object to rank them by the property "Value", like this:
$array = @(
[PSCustomObject]@{Name='test1';Value=105}
[PSCustomObject]@{Name='test2';Value=101}
[PSCustomObject]@{Name='test3';Value=512}
)
$array | Sort-Object -Property value -Descending | Select-Object -First 1
Output
Name Value
---- -----
test3 512
To incorporate your Write-Host, you can just run the one you select through a ForEach-Object.
$array | Sort-Object -Property value -Descending |
Select-Object -First 1 | Foreach-Object {Write-host $_.name,"has the highest value"}
test3 has the highest value
Or capture to a variable
$Largest = $array | Sort-Object -Property value -Descending | Select-Object -First 1
Write-host $Largest.name,"has the highest value"
test3 has the highest value
PowerShell has many built in features to make tasks like this easier.
If this is really an array of PSCustomObjects you can do something like:
$Array =
@(
[PSCustomObject]@{ Name = 'test1'; Value = 105 }
[PSCustomObject]@{ Name = 'test2'; Value = 101 }
[PSCustomObject]@{ Name = 'test3'; Value = 512 }
)
$Largest = ($Array | Sort-Object Value)[-1].Name
Write-host $Largest,"has the highest value"
This will sort your array according to the Value property. Then reference the last element using the [-1] syntax, then return the name property of that object.
Or if you're a purist you can assign the variable like:
$Largest = $Array | Sort-Object Value | Select-Object -Last 1 -ExpandProperty Name
If you want the whole object, just remove .Name and -ExpandProperty Name respectively.
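For example, keeping the whole object (in a variable I'll call $LargestRow here, just for illustration) lets you report both properties:
$LargestRow = $Array | Sort-Object Value | Select-Object -Last 1
Write-Host $LargestRow.Name,"has the highest value of",$LargestRow.Value
test3 has the highest value of 512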
Update:
As noted PowerShell has some great tools to help with common tasks like sorting & selecting data. However, that doesn't mean there's never a need for looping constructs. So, I wanted to make a couple of points about the OP's own answer.
First, if you do need to reference array elements by index use a traditional For loop, which might look something like:
For( $i = 0; $i -lt $Array.Count; ++$i )
{
If( $array[$i].Value -gt $LargestValue )
{
$LargestName = $array[$i].Name
$LargestValue = $array[$i].Value
}
}
$i is commonly used as an iteration variable, and within the script block is used as the array index.
Second, even the traditional loop is unnecessary in this case. You can stick with the ForEach loop and track the largest value as and when it's encountered. That might look something like:
ForEach( $Row in $array )
{
If( $Row.Value -gt $LargestValue )
{
$LargestName = $Row.Name
$LargestValue = $Row.Value
}
}
Strictly speaking you don't need to assign the variables beforehand, though it may be a good practice to precede either of these with:
$LargestName = ""
$LargestValue = 0
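The reason it still works without the initialization is that on the first pass $LargestValue is $null, and a number compared against $null is effectively compared against 0, so any positive Value wins that first comparison (if the values could be negative, neither $null nor 0 is a safe seed; you'd want to start from the first row's value instead). A quick console check:
PS C:\> 105 -gt $null
True
PS C:\> -5 -gt $null
False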
In these examples you'd have to follow with a slightly modified Write-Host command
Write-host $LargestName,"has the highest value"
Note: Borrowed some of the test code from Doug Maurer's Fine Answer. Considering our answers were similar, this was just to make my examples more clear to the question and easier to test.
Figured it out, hopefully this isn't awful:
$Count = 1
$CurrentLargest = 0
Foreach($Row in $Array) {
# Compare This iteration vs the next to find the largest
If($Row.value -gt $Array.Value[$Count]){$CurrentLargest = $Row}
Else {$CurrentLargest = $Array[$Count]}
# Replace the existing largest value with the new one if it is larger than it.
If($CurrentLargest.Value -gt $Largest.Value){ $Largest = $CurrentLargest }
$Count += 1
}
Write-host $Largest.name,"has the highest value"
Edit: it's awful, look at the other answers for a better way.
I'm trying to get the encryption status of all drives on a Windows system and sort that list into custom formatted output. I need this because the output is going to a Nagios server, which mangles the formatting of the standard Get-BitLockerVolume output, and that output is too long anyway.
Here's what I have so far. I'm trying to sort the output in such a manner that the system drive is listed first and gives the mount point (drive letter) along with the percentage.
[array]$DriveTypes = Get-BitLockerVolume | Sort-Object VolumeType | Select-Object VolumeType
[array]$DriveMounts = Get-BitLockerVolume | Sort-Object VolumeType | Select-Object MountPoint
[array]$WDEPercent = Get-BitLockerVolume | Sort-Object VolumeType | Select-Object EncryptionPercentage
for ($i = 0; $i -lt $DriveTypes.Count; $i++) {
if ($DriveIndex -eq $DriveTypes.Count) {
$TextDriveListing = $TextDriveListing + $DriveMounts.MountPoint+" ("+$DriveTypes.VolumeType+") at "+$WDEPercent.EncryptionPercentage+"%."
}
else {
$TextDriveListing = $TextDriveListing + $DriveMounts.MountPoint+" ("+$DriveTypes.VolumeType+") at "+$WDEPercent.EncryptionPercentage+"%, "
}
if ($WDEPercent.EncryptionPercentage -lt $ReqValue) {
$NoEncryptFlag = 1
}
}
My desired output, for example, is this:
C: (OperatingSystem) at 100%, D: (Data) at 0%.
What I actually end up with is this:
C: D: (OperatingSystem Data) at 100 0%, C: D: (OperatingSystem Data) at 100 0%,
I did try something derived from an answer to "How to sort a Multi Dimensional Array in Powershell" to test it out, commenting out my aforementioned for block and putting in:
$ListDrives | ForEach-Object {
Get-BitLockerVolume @{
MountPoint = $_[0]
EncryptionPercentage = $_[1]
}
} | Sort-Object VolumeType
Write-Host $ListDrives
That spit out this error:
Cannot index into a null array.
At C:****************.ps1:142 char:3
Get-BitLockerVolume @{
~~~~~~~~~~~~~~~~~~~~~~
CategoryInfo : InvalidOperation: (:) [], RuntimeException
FullyQualifiedErrorId : NullArray
What am I doing wrong? Any suggestions?
Thanks so much in advance!
Try this:
for ($i = 0; $i -lt $DriveTypes.Count; $i++) {
if ($i -eq ($DriveTypes.Count - 1)) {
$TextDriveListing = $TextDriveListing + $DriveMounts[$i].MountPoint+" ("+$DriveTypes[$i].VolumeType+") at "+$WDEPercent[$i].EncryptionPercentage+"%."
}
else {
$TextDriveListing = $TextDriveListing + $DriveMounts[$i].MountPoint+" ("+$DriveTypes[$i].VolumeType+") at "+$WDEPercent[$i].EncryptionPercentage+"%, "
}
if ($WDEPercent[$i].EncryptionPercentage -lt $ReqValue) {
$NoEncryptFlag = 1
}
}
You weren't using the $i from your for loop to access specific indexes in your collections (I've added [$i] to each of your collection variables to do so). You were also comparing against a variable called $DriveIndex that was never populated; I think it was meant to be $i, but even then the condition would never be true, because the loop ends before $i reaches $DriveTypes.Count, so I've changed the test to ($i -eq ($DriveTypes.Count - 1)).
Here's a tidier version that I think also gets you the same result:
$TextDriveListing = ''
$Drives = Get-BitLockerVolume | Sort-Object VolumeType | Select VolumeType,MountPoint,EncryptionPercentage
$Drives | ForEach-Object {
$TextDriveListing += "$($_.MountPoint) ($($_.VolumeType)) at $($_.EncryptionPercentage)%,"
If ($_.EncryptionPercentage -lt $ReqValue) { $NoEncryptFlag = 1 }
} -End { $TextDriveListing -Replace ',$','.' }
Uses a single variable for the three properties you wanted to access, rather than putting them into separate variables, which was unnecessary.
Uses a ForEach-Object loop to access each item (and their properties) in that collection via the special token $_.
Uses a single double quoted string for output, with the object/properties accessed via the subexpression operator $().
Puts a comma on the end of each entry, then in the End block of the ForEach-Object uses a regex to replace the trailing comma (the $ anchor marks the end of the string) with a full stop.
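For example, applied to the desired output string from the question:
"C: (OperatingSystem) at 100%, D: (Data) at 0%," -replace ',$','.'
C: (OperatingSystem) at 100%, D: (Data) at 0%.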
Both sets of code are untested, so may need tweaking.
I am trying to use the $a variable in this script as an intermediate step so that I don't have to write $array[$array.Count-1] repeatedly, and similarly for $prop. However, the values are being overwritten by the last value in the loop.
$guests = Import-Csv -Path C:\Users\shant_000\Desktop\UploadGuest_test.csv
$output = gc '.\Sample Json.json' | ConvertFrom-Json
$array = New-Object System.Collections.ArrayList;
foreach ($g in $guests) {
$array.Add($output);
$a = $array[$array.Count-1];
$a.Username = $g.'EmailAddress';
$a.DisplayName = $g.'FirstName' + ' ' + $g.'LastName';
$a.Password = $g.'LastName' + '123';
$a.Email = $g.'EmailAddress';
foreach ($i in $a.ProfileProperties.Count) {
$j = $i - 1;
$prop = $a.ProfileProperties[$j];
if ($prop.PropertyName -eq "FirstName") {
$prop.PropertyValue = $g.'FirstName';
} elseif ($prop.PropertyName -eq "LastName") {
$prop.PropertyValue = $g.'LastName';
}
$a.ProfileProperties[$j] = $prop;
}
$array[$array.Count-1] = $a;
}
$array;
All array elements are referencing one actual variable: $output.
Create an entirely new object each time by repeating JSON-parsing:
$jsontext = gc '.\Sample Json.json'
..........
foreach ($g in $guests) {
$a = $jsontext | ConvertFrom-Json
# process $a
# ............
$array.Add($a) >$null
}
In case the JSON file is very big and you change only a few parts of it you can use a faster cloning technique on the changed parts (and their entire parent chain) via .PSObject.Copy():
foreach ($g in $guests) {
$a = $output.PSObject.Copy()
# ............
$a.ProfileProperties = $a.ProfileProperties.PSObject.Copy()
# ............
foreach ($i in $a.ProfileProperties.Count) {
# ............
$prop = $a.ProfileProperties[$j].PSObject.Copy();
# ............
}
$array.Add($a) >$null
}
As others have pointed out, appending $output appends a reference to the same single object, so you keep changing the values for all elements in the list. Unfortunately the approach @wOxxOm suggested (which I thought would work at first too) doesn't work if your JSON data structure has nested objects, because Copy() only clones the topmost object while the nested objects remain references to their originals.
Demonstration:
PS C:\> $o = '{"foo":{"bar":42},"baz":23}' | ConvertFrom-Json
PS C:\> $o | Format-Custom *
class PSCustomObject
{
foo =
class PSCustomObject
{
bar = 42
}
baz = 23
}
PS C:\> $o1 = $o
PS C:\> $o2 = $o.PSObject.Copy()
If you change the nested property bar on either $o1 or $o2, both objects end up with whichever value was last set on either of them:
PS C:\> $o1.foo.bar = 23
PS C:\> $o2.foo.bar = 24
PS C:\> $o1.foo.bar
24
PS C:\> $o2.foo.bar
24
Only if you change a property of the topmost object will you get a difference between $o1 and $o2:
PS C:\> $o1.baz = 5
PS C:\> $o.baz
5
PS C:\> $o1.baz
5
PS C:\> $o2.baz
23
While you could do a deep copy, it's not as simple and straightforward as one would like to think. Usually it takes less effort (and simpler code) to just create the object multiple times, as @PetSerAl suggested in the comments to your question.
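If you really do need a deep copy and re-parsing the JSON file isn't convenient, one common workaround (just a sketch; it only preserves plain data, not methods or special .NET types, and needs the v3+ JSON cmdlets) is to round-trip the object through a serializer:
# serialize and re-parse so nested objects become independent copies
$deepCopy = $output | ConvertTo-Json -Depth 10 | ConvertFrom-Json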
I'd also recommend avoiding appending to an array (or ArrayList) in a loop. You can simply echo your objects inside the loop and collect the entire output as a list/array by assigning the loop to a variable:
$json = Get-Content '.\Sample Json.json' -Raw
$array = foreach ($g in $guests) {
$a = $json | ConvertFrom-Json # create new object
$a.Username = $g.'EmailAddress'
...
$a # echo object, so it can be collected in $array
}
Use Get-Content -Raw on PowerShell v3 and newer (or Get-Content | Out-String on earlier versions) to avoid issues with multiline JSON data in the JSON file.
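The two variants side by side:
# PowerShell v3 and newer
$json = Get-Content '.\Sample Json.json' -Raw
# PowerShell v2 and earlier
$json = Get-Content '.\Sample Json.json' | Out-String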
My cmdlet get-objects returns an array of MyObject with public properties:
public class MyObject{
public string testString = "test";
}
I want users without programming skills to be able to modify public properties (like testString in this example) from all objects of the array.
Then feed the modified array to my second cmdlet which saves the object to the database.
That means the syntax of the "editing code" must be as simple as possible.
It should look somewhat like this:
> get-objects | foreach{$_.testString = "newValue"} | set-objects
I know that this is not possible, because $_ just returns a copy of the element from the array.
So you'd need to access the elements by index in a loop and then modify the property. This gets complicated really quickly for people who are not familiar with programming.
Is there any "user-friendly" built-in way of doing this? It shouldn't be more "complex" than a simple foreach {property = value}
I know that this is not possible, because $_ just returns a copy of the element from the array (https://social.technet.microsoft.com/forums/scriptcenter/en-US/a0a92149-d257-4751-8c2c-4c1622e78aa2/powershell-modifying-array-elements)
I think you're misinterpreting the answer in that thread.
$_ is indeed a local copy of the value returned by whatever enumerator you're currently iterating over - but you can still return your modified copy of that value (as pointed out in the comments):
Get-Objects | ForEach-Object {
# modify the current item
$_.propertyname = "value"
# drop the modified object back into the pipeline
$_
} | Set-Objects
In (allegedly impossible) situations where you need to modify a stored array of objects, you can use the same technique to overwrite the array with the new values:
PS C:\> $myArray = 1,2,3,4,5
PS C:\> $myArray = $myArray |ForEach-Object {
>>> $_ *= 10
>>> $_
>>>}
>>>
PS C:\> $myArray
10
20
30
40
50
That means the syntax of the "editing code" must be as simple as possible.
Thankfully, PowerShell is very powerful in terms of introspection. You could implement a wrapper function that adds the $_; statement to the end of the loop body, in case the user forgets:
function Add-PsItem
{
[CmdletBinding()]
param(
[Parameter(Mandatory,ValueFromPipeline,ValueFromRemainingArguments)]
[psobject[]]$InputObject,
[Parameter(Mandatory)]
[scriptblock]$Process
)
begin {
$InputArray = @()
# fetch the last statement in the scriptblock
$EndBlock = $Process.Ast.EndBlock
$LastStatement = $EndBlock.Statements[-1].Extent.Text.Trim()
# check if the last statement is `$_`
if($LastStatement -ne '$_'){
# if not, add it
$Process = [scriptblock]::Create('{0};$_' -f $Process.ToString())
}
}
process {
# collect all the input
$InputArray += $InputObject
}
end {
# pipe input to foreach-object with the new scriptblock
$InputArray | ForEach-Object -Process $Process
}
}
Now the users can do:
Get-Objects | Add-PsItem {$_.testString = "newValue"} | Set-Objects
The ValueFromRemainingArguments attribute also lets users supply input as unbounded parameter values:
PS C:\> Add-PsItem { $_ *= 10 } 1 2 3
10
20
30
This might be helpful if the user is not used to working with the pipeline
Here's a more general approach, arguably easier to understand, and less fragile:
# $dataSource would be get-object in the OP
# $dataUpdater is the script the user supplies to modify properties
# $dataSink would be set-object in the OP
function Update-Data {
param(
[scriptblock] $dataSource,
[scriptblock] $dataUpdater,
[scriptblock] $dataSink
)
& $dataSource |
% {
$updaterOutput = & $dataUpdater
# This "if" allows $dataUpdater to create an entirely new object, or
# modify the properties of an existing object
if ($updaterOutput -eq $null) {
$_
} else {
$updaterOutput
}
} |
% $dataSink
}
Here are a couple of examples of use. The first example isn't applicable to the OP, but it's being used to create a data set that is applicable (a set of objects with properties).
# Use update-data to create a set of data with properties
#
$theDataSource = @() # will be filled in by first update-data
update-data {
# data source
0..4
} {
# data updater: creates a new object with properties
New-Object psobject |
# add-member uses hash table created on the fly to add properties
# to a psobject
add-member -passthru -NotePropertyMembers @{
room = @('living','dining','kitchen','bed')[$_];
size = @(320, 200, 250, 424 )[$_]}
} {
# data sink
$global:theDataSource += $_
}
$theDataSource | ft -AutoSize
# Now use update-data to modify properties in data set
# this $dataUpdater updates the 'size' property
#
$theDataSink = @()
update-data { $theDataSource } { $_.size *= 2} { $global:theDataSink += $_}
$theDataSink | ft -AutoSize
And then the output:
room size
---- ----
living 320
dining 200
kitchen 250
bed 424
room size
---- ----
living 640
dining 400
kitchen 500
bed 848
As described above update-data relies on a "streaming" data source and sink. There is no notion of whether the first or fifteenth element is being modified. Or if the data source uses a key (rather than an index) to access each element, the data sink wouldn't have access to the key. To handle this case a "context" (for example an index or a key) could be passed through the pipeline along with the data item. The $dataUpdater wouldn't (necessarily) need to see the context. Here's a revised version with this concept added:
# $dataSource and $dataSink scripts need to be changed to output/input an
# object that contains both the object to modify, as well as the context.
# To keep it simple, $dataSource will output an array with two elements:
# the value and the context. And $dataSink will accept an array (via $_)
# containing the value and the context.
function Update-Data {
param(
[scriptblock] $dataSource,
[scriptblock] $dataUpdater,
[scriptblock] $dataSink
)
& $dataSource |
% {
$saved_ = $_
# Set $_ to the data object
$_ = $_[0]
$updaterOutput = & $dataUpdater
if ($updaterOutput -eq $null) { $updaterOutput = $_}
$_ = $updaterOutput, $saved_[1]
# emit the (value, context) pair as one pipeline item for the sink
,$_
} |
% $dataSink
}
I have a set of strings gathered from logs that I'm trying to parse into unique entries:
function Scan ($path, $logPaths, $pattern)
{
$logPaths | % `
{
$file = $_.FullName
Write-Host "`n[$file]"
Get-Content $file | Select-String -Pattern $pattern -CaseSensitive -AllMatches | % `
{
$regexDateTime = New-Object System.Text.RegularExpressions.Regex "((?:\d{4})-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}(,\d{3})?)"
$matchDate = $regexDateTime.match($_)
if($matchDate.success)
{
$loglinedate = [System.DateTime]::ParseExact($matchDate, "yyyy-MM-dd HH:mm:ss,FFF", [System.Globalization.CultureInfo]::InvariantCulture)
if ($loglinedate -gt $laterThan)
{
$date = $($_.toString().TrimStart() -split ']')[0]
$message = $($_.toString().TrimStart() -split ']')[1]
$messageArr += ,$date,$message
}
}
}
$messageArr | sort $message -Unique | foreach { Write-Host -f Green $date$message}
}
}
So for this input:
2015-09-04 07:50:06 [20] WARN Core.Ports.Services.ReferenceDataCheckers.SharedCheckers.DocumentLibraryMustExistService - A DocumentLibrary 3 could not be found.
2015-09-04 07:50:06 [20] WARN Core.Ports.Services.ReferenceDataCheckers.SharedCheckers.DocumentLibraryMustExistService - A DocumentLibrary 3 could not be found.
2015-09-04 07:50:16 [20] WARN Brighter - The message abc123 has been marked as obsolete by the consumer as the entity has a higher version on the consumer side.
Only the last two entries should be returned.
I'm having trouble filtering out duplicates of $message: currently all entries are being returned (sort -Unique is not behaving as I would expect it to). I also need the correct $date to be returned against the filtered $message.
I'm pretty stuck with this, can anyone help?
We can do what you want, but first let's back up just a little bit to help us do this better. Right now you have an array of arrays, and that's difficult to work with in general. What would be better is if you had an array of objects, and those objects had properties such as Date and Message. Let's start there.
if ($loglinedate -gt $laterThan)
{
$date = $($_.toString().TrimStart() -split ']')[0]
$message = $($_.toString().TrimStart() -split ']')[1]
$messageArr += ,$date,$message
}
is going to become...
if ($loglinedate -gt $laterThan)
{
[Array]$messageArr += [PSCustomObject]@{
'date' = $($_.toString().TrimStart() -split ']')[0]
'message' = $($_.toString().TrimStart() -split ']')[1]
}
}
That produces an array of objects, and each object has two properties, Date and Message. That will be much easier to work with.
If you only want the latest version of any message, that's easily done with the Group-Object command, like so:
$FilteredArr = $messageArr | Group Message | ForEach{$_.Group|sort Date|Select -Last 1}
Then if you want to display it to screen like you are, you could do:
$FilteredArr | ForEach{Write-Host -f Green ("{0}`t{1}" -f $_.Date, $_.Message)}
My take (not tested):
function Scan ($path, $logPaths, $pattern)
{
$regex = '(\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2})\s(.+)'
$ht = @{}
$logPaths | % `
{
$file = $_.FullName
Write-Host "`n[$file]"
Get-Content $file | Select-String -Pattern $pattern -CaseSensitive -AllMatches | % `
{
if ($_.line -match $regex -and $ht[$matches[2]] -lt $matches[1])
{ $ht[$matches[2]] = $matches[1] }
}
$ht.GetEnumerator() |
sort Value |
foreach { Write-Host -f Green "$($_.Value)$($_.Name)" }
}
}
This splits each log line at the timestamp and loads the parts into a hash table, using the error message as the key and the timestamp as the data (this will de-dupe the messages in-stream).
The timestamps are already in string-sortable format (yyyy-MM-dd HH:mm:ss), so there's really no need to cast them to [datetime] to find the latest one. Just do a straight string compare, and if the incoming timestamp is greater than an existing value for that message, replace the existing value with the new one.
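A quick console check with two timestamps from the sample input shows why the plain string comparison is enough:
PS C:\> '2015-09-04 07:50:16' -gt '2015-09-04 07:50:06'
True
PS C:\> '2015-09-04 07:50:06' -gt '2015-09-04 07:50:16'
False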
When you're done, you should have a hash table with a key for each unique message found, having a value of the latest timestamp found for that message.