Exchange Powershell Script consumes all system resources on local pc - arrays

I'm back!
Anyway, I'm running an Exchange script to find emails that contain a specific list of keywords for a specific set of users, defined as $users and $terms in the script below, and it works. However, after about an hour or so of running, it's consuming obnoxious amounts of memory (12 to 14 GB) and running very slowly.
It does fluctuate between 3 GB and 14 GB, so I don't know if this is simply normal, expected behavior or if it's something wrong with my script. I am aware that I'm using a sort-of(?) deprecated cmdlet in Search-Mailbox, but I'm only searching about 300 users and 21 terms, so I don't think I need to use the New-MailboxSearch cmdlet.
Script for Reference
$users = Get-Content x:\xxx\xxx\users.txt
$terms = Get-Content x:\xxx\xxx\Terms.txt
ForEach ($term in $Terms) {
    ForEach ($line in $users) {
        $Results = Search-Mailbox -Identity $line -SearchQuery $term -TargetMailbox SearchResults2 -TargetFolder $term -LogLevel Full | Select Identity,TargetFolder,ResultItemsCount
        Add-Content -Path x:\xxx\xxx\outputfile.txt -Value "$($term);$($line);$($Results.TargetFolder);$($Results.ResultItemsCount)"
    }
}
Anyway, any help is, as always, greatly appreciated.
Thanks!

Does ForEach-Object fare better?
$terms | ForEach-Object { $term = $_
    $users | ForEach-Object { $line = $_
        # same Search-Mailbox / Add-Content body as above
    }
}

The problem wasn't with the script itself; it was the environment we were running it in.
For some reason, running the script inside the Integrated Scripting Environment (PowerShell ISE) was causing it to suck up crazy amounts of memory, eventually halting the system. By simply launching it outside of the ISE we were able to get the script to behave normally.
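For example, a quick sketch of launching the script from a plain console instead of the ISE (the script path and filename here are placeholders, not from the question):

```powershell
# Run from a regular PowerShell console or cmd.exe rather than the ISE.
# "SearchMailboxes.ps1" is a hypothetical name for wherever you saved the script.
powershell.exe -NoProfile -File "C:\Scripts\SearchMailboxes.ps1"
```

The ISE keeps the session's console output and objects around in memory, which may explain why a long-running, output-heavy script balloons there but behaves normally in a plain console.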
Thanks to everyone who replied!

Related

Using Powershell to select only one file in a folder

I've been trying to figure out a way to use Powershell to select only one file (doesn't matter which one) out of a folder so I can do a rename on it. After that I want it to just exit. Please be patient with me, I'm new to PS, most of my coding experience is in other languages.
So I've been doing something like this so it just executes once and exits...
$a = 0
do {
    $a
    $a++
} while ($a -le 0)
Would I put a Get-ChildItem in there somehow, or how do I reference the file so I can rename it to something like "newfile.txt"? Plus the file location is not local on my machine, it's on a UNC path so not sure how to point it to another location.
Thanks in advance.
Try this:
Get-ChildItem "\\Server\path" -file | select -first 1 | Rename-Item -NewName "newname.txt"
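One note on that one-liner: the -File switch requires PowerShell 3.0 or later. On older versions you can filter out directories explicitly instead (a sketch using the same UNC path and target name):

```powershell
# PowerShell 2.0-compatible version: exclude containers, take the first file,
# and rename it in place on the UNC share.
Get-ChildItem "\\Server\path" |
    Where-Object { -not $_.PSIsContainer } |
    Select-Object -First 1 |
    Rename-Item -NewName "newfile.txt"
```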

Nested foreach loops to install printers

I have a need to install multiple printers to multiple servers and was wanting to script this so that it would only need to be configured once.
I am using PowerShell to query 2 CSV files: one to get a list of computers to install printers to, and the other to list the Name, IP, Driver, and Location of the needed printers. I am using 2 foreach loops to accomplish this and a break to try to get it to loop correctly.
Currently, with the break :outer where it is, the first PC gets all printers installed. If I move it inside the foreach ($printer in $printers) loop, it will only install the first printer to all computers.
Thank you for any assistance that anyone can provide.
$computerfile = Import-Csv C:\computerlist.csv
$printers = Import-Csv C:\printers2.csv
foreach ($computer in $computerfile) {
    "computer: $computer"
    :outer
    foreach ($printer in $printers) {
        Add-PrinterPort -ComputerName $computer.Name -Name $printer.IP -PrinterHostAddress $printer.IP
        Add-Printer -ComputerName $computer.Name -Name $printer.Name -DriverName $printer.Driver -PortName $printer.IP -ShareName $printer.Name -Location $printer.Location
        Set-Printer -ComputerName $computer.Name -Name $printer.Name -Shared $false -Published $true
    }
    break :outer
}
Please remove the break :outer from your code.
What is happening is that the outer loop runs the inner loop for the first computer, and then the break terminates the outer loop, so no other computers are processed.
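With the break removed, no label is needed at all; the nested loops do the right thing on their own (a sketch based on the code in the question):

```powershell
foreach ($computer in $computerfile) {
    "computer: $($computer.Name)"
    foreach ($printer in $printers) {
        # Create the port first, then the printer that uses it.
        Add-PrinterPort -ComputerName $computer.Name -Name $printer.IP -PrinterHostAddress $printer.IP
        Add-Printer -ComputerName $computer.Name -Name $printer.Name -DriverName $printer.Driver -PortName $printer.IP -ShareName $printer.Name -Location $printer.Location
        Set-Printer -ComputerName $computer.Name -Name $printer.Name -Shared $false -Published $true
    }
}
```

Every computer now gets every printer: the inner loop runs to completion for each iteration of the outer loop.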

Optimizing Powershell Script to find old files and delete them on DFS replicated folders

Here is the story. I have a fileshare that is replicated between 2 servers located in different places in the world. DFS will not replicate a file if it has only been viewed, but I wouldn't want to delete that file/folder because it was used within the time period I have set (7 days). So to make sure that I don't remove still used files I have to check both locations for their LastAccessTime.
I currently have this
Set-ExecutionPolicy RemoteSigned
$limit = (Get-Date).AddDays(-7)
$PathOne = "FirstPath"
$PathTwo = "SecondPath"
$ToBeDeletedPathOne = Get-ChildItem -Path $PathOne -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.LastAccessTime -lt $limit }
$ToBeDeletedPathTwo = Get-ChildItem -Path $PathTwo -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.LastAccessTime -lt $limit }
$DiffObjects = Compare-Object -ReferenceObject $ToBeDeletedPathOne -DifferenceObject $ToBeDeletedPathTwo -IncludeEqual
$ToBeDeletedOverall = $DiffObjects | where { $_.SideIndicator -eq "==" }
After this, I loop through and delete the files that are marked for deletion by both locations.
Part of the problem I have is that there are a tremendous amount of files and this can take a very long time. So I wanted to make it better/faster. My idea is to have this script run the scan as a different script on each FS server and wait for them to return the output. That way it can scan on the local machine easier than remotely and it would scan both locations simultaneously.
The other part of the problem comes in with the fact that I have no idea how to do this. I will continue to work on this and if I solve it, I will post here in case anyone in the future finds this useful.
You could run everything locally. Copy the script to the machines you want (make a script to copy them if you need to) then use something like PSTools to kick them off on the local machines. This should essentially run the script simultaneously on all machines.
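Alternatively, if PowerShell remoting is enabled, Invoke-Command can run the scan on both file servers at once and return the results to the coordinating machine. A sketch, where the server names (FS01, FS02) and local share path are hypothetical:

```powershell
$scan = {
    param($path, $days)
    $limit = (Get-Date).AddDays(-$days)
    # Same filter as the original script, but running locally on each file server.
    Get-ChildItem -Path $path -Recurse -Force |
        Where-Object { !$_.PSIsContainer -and $_.LastAccessTime -lt $limit } |
        Select-Object -ExpandProperty FullName
}

# Both servers scan in parallel; each result is tagged with PSComputerName.
$old = Invoke-Command -ComputerName FS01, FS02 -ScriptBlock $scan -ArgumentList 'D:\Share', 7

# A path reported by both servers (count of 2) is safe to delete.
$deletable = $old | Group-Object | Where-Object { $_.Count -eq 2 } |
    ForEach-Object { $_.Name }
```

This assumes the share lives at the same local path on both servers, so the full paths compare as equal strings.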

Powershell from SQL Server Agent Job doesn't recognize ConvertTo-CSV

UPDATE: Modified the script to work within the bounds of PowerShell v1 as required by SQLPS.
Changed:
IF ($property.Value -match $regex) {
    $currentBadLine = (ConvertTo-Csv $_ -NoTypeInformation -Delimiter $delimiter);
    $badLines += $currentBadLine[1,2]
};
To:
IF ($property.Value -match $regex) {
    $badLines += $_ | Select-Object | ft -autoSize;
};
This prints a new header for each bad line, but it's not the end of the world and not worth the effort to prevent.
I have a Powershell script that pre-processes CSV files before they have a chance to screw up my data imports.
On two servers in a row now I have confirmed that the PS major version is at least 2, and that the following code snippet runs fine in the PowerShell ISE. The purpose of the code is to read in each line of the CSV, and then loop through the columns looking for the regex pattern in the $regex variable. When it finds one, I want it to keep track of the error before fixing it, so I can write an error log later before outputting a cleaned-up file ready for import.
% {
    Foreach ($Property in $_.PSObject.Properties) {
        IF ($property.Value -match $regex) {
            $currentBadLine = (ConvertTo-Csv $_ -NoTypeInformation -Delimiter $delimiter);
            $badLines += $currentBadLine[1,2]
        };
        $property.Value = $property.Value -replace $regex;
    }
    $_
}
But once I put that code into an agent job the agent complains:
'The term 'ConvertTo-Csv' is not recognized as the name of a cmdlet,
function, script file, or operable program. Check the spelling of the
name, or if a path was included, verify that the path is correct and
try again. '
The question then is this: is the Agent's PowerShell subsystem using a different version of PowerShell than the rest of the system? If so, how do I find out which version the subsystem is using and, if possible, upgrade it so I can fix this?
The server is running:
Windows Server 2008 R2 Enterprise
PS Major version 2, Minor 0 Build -1 revision -1
SQL Server 2008 R2 SP1 10.5.2500.0 64 bit
Thanks!
Yes: proper PowerShell support isn't really implemented until SQL Server 2012 (and even that is a bit flaky as to which cmdlets it supports).
In 2008 and 2008 R2, the agent's PowerShell implementation is actually a minishell, created by the now (thankfully) deprecated make-shell.exe utility, which only allows v1 cmdlets to run and disallows the use of Add-PSSnapin, so you can't add any more cmdlets.
To get proper PowerShell support, you either need to shell out and call an external script, or run the job as a Windows scheduled task rather than an agent job.
The following article explains a lot about why PowerShell support in 2008 R2 doesn't work the way you'd expect:
The Truth about SQLPS and PowerShell V2
One work-around: Export-CSV to a file, then Get-Content from the file.
$rows = ,(New-Object PSObject -prop @{a=7; b='stuff';});
$rows += ,(New-Object PSObject -prop @{a=77; b='more';});

# To run from a SQL job, change from this:
$csvrows = $rows | ConvertTo-CSV -NoType | % { $_.Replace('"','') };
write-output $csvrows;

# to this:
$rows | Export-CSV -NoType "C:\Temp\T.csv"
$csvrows = (Get-Content "C:\Temp\T.csv") | % { $_.Replace('"','') };
write-output $csvrows;

powershell condition not processing all values in an array

I am a newb in PowerShell but keen to put it to good use. I am working on a script which should do the following:
Check for the existence of a specific folder in a specific location (mapped drive)
If the folder exists, then return a listing
If the folder does not exist, then create it.
Ideally, I would like to improve it in terms of check-if exists-remove-item (subdir); check-if not exists-create
This is to facilitate the automation of an archiving process for a specific piece of software. What I have right now sort of works but I cannot figure out how to make it do exactly what I want.
Here is the code:
$X = @("Server1", "Server2", "Server3")
$ChkFile = "f:\archive\$server\AABackup"
$myFolder = "f:\archive\$server"
$DirExists = Test-Path $ChkFile
Foreach ($server in $x) {
    IF ($DirExists -eq $True) {
        Remove-Item $ChkFile -recurse
        import-Module "AppAssurePowerShellModule"
        start-archive -protectedserver $server -startdate "18/03/2013 5:30 PM" -path "f:\archive\$server"
    }
    Elseif ($DirExists -ne $True) {
        New-Item -path $myFolder -itemType "directory"
        import-Module "AppAssurePowerShellModule"
        start-archive -protectedserver $server -startdate "18/03/2013 5:30 PM" -path "f:\archive\$server"
    }
}
Yes I know it is rough... It's my first attempt though so I could definitely benefit from the more experienced scripters here.
Thanks in advance.
You're setting $ChkFile, $myFolder and $DirExists before the loop, which means that $server doesn't have a value yet, and -- since variables are evaluated immediately -- these variables will contain garbage.
You need to move those three statements inside the foreach loop.
You also don't need to compare -eq $true; this would be simpler:
if ($dirExists) {
# ...
}
else {
# ...
}
Oh, and you only need to import the module once -- do it at the top of the script.
Also, in terms of style: PowerShell keywords should generally be in lowercase (foreach, if, elseif), and be consistent when invoking cmdlets (you have a mixture of lower-case, Upper-Case, and mixed-case). These don't make any real difference, but a consistent style makes the code easier to read for someone else coming to it. I'm basing those rules on what I've seen on TechNet, PoshCode, and here, by the way; they're definitely subjective.
And, personally, I use $lowerCase for local variables, and $UpperCase for parameters (because it makes the auto-generated help text look nicer).
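Putting those fixes together, a sketch of the corrected script (server names, module name, and dates are taken from the question):

```powershell
Import-Module "AppAssurePowerShellModule"   # import once, at the top

$servers = @("Server1", "Server2", "Server3")
foreach ($server in $servers) {
    # Evaluate the per-server paths inside the loop, after $server has a value.
    $chkFile  = "f:\archive\$server\AABackup"
    $myFolder = "f:\archive\$server"

    if (Test-Path $chkFile) {
        Remove-Item $chkFile -Recurse
    }
    else {
        New-Item -Path $myFolder -ItemType Directory
    }
    Start-Archive -ProtectedServer $server -StartDate "18/03/2013 5:30 PM" -Path $myFolder
}
```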
Give this a shot.
$specificPath = "Z:\MYDir\MySubDir"
if (!(Test-Path $specificPath)) {
    mkdir $specificPath
}
else {
    Get-ChildItem $specificPath
}
Explanation:
This checks for the existence of the path contained in $specificPath using Test-Path, which returns a Boolean value.
Since I used the !() syntax in my if statement, the condition is negated: if the path DOES NOT exist, the first block of code runs.
mkdir is a built-in wrapper around New-Item: if you pass just a path to mkdir it will make a directory, similar to the Windows mkdir command.
If the condition in the if statement is not met (i.e., the path exists), the else block runs and executes Get-ChildItem on the $specificPath variable.
