I have a program that checks two paths for files and, if any files are present, sends one mail to one person per path.
The requirements have changed: four paths now need to be checked, and multiple people need to receive a mail for each path.
e.g.
Path 1 - Mail to x, y and z
Path 2 - Mail to a and b
Path 3 - Mail to x and a
Path 4 - Mail to s, r and w
How can I do this simply and efficiently?
$folders = @()
$folders += "\\server\files\Info\test"
$folders += "\\server\files\Info\test2"
$receiver = @()
$receiver += "person1@test.com"
$receiver += "person2@test.com"
$i = 0
$folders | ForEach-Object {
$checkforfiles = $_
$directoryInfo = Get-ChildItem $checkforfiles | Measure-Object
$directoryInfo.count #Returns the count of all of the objects in the directory
$Numberone = $directoryInfo.Count
if ($directoryInfo.count -ge 1){
send-mailmessage -subject "Subject" -Body "There are $checkforfiles files " -from foldercheck@test.com -to $receiver[$i] `
-smtpserver smtp.ser.com
$i = $i+1
}
else{
write-host "nothing to process"
}
}
You could simply extend the array of folders to test and use a switch to determine which users should get an email.
Also, I would advise using splatting on cmdlets with a lot of parameters to make for cleaner code.
# an array of folder paths to test
$folders = "\\server\files\Info\test", "\\server\files\Info\test2", "\\server\files\Info\test3", "\\server\files\Info\test4"
for ($i = 0; $i -lt $folders.Count; $i++) {
# get the file (and folder) count inside
$count = @(Get-ChildItem -Path $folders[$i]).Count
if ($count -gt 0) {
# determine who to email
$to = switch ($i) {
0 { 'personX@test.com', 'personY@test.com', 'personZ@test.com' ; break }
1 { 'personA@test.com', 'personB@test.com' ; break }
2 { 'personX@test.com', 'personA@test.com' ; break }
3 { 'personS@test.com', 'personR@test.com', 'personW@test.com' }
}
# set up a hashtable with parameters for splatting to Send-MailMessage
$mailParams = @{
To = $to
From = 'foldercheck@test.com'
Subject = "Subject"
Body = "There are $count files in folder '$($folders[$i])'"
SmtpServer = 'smtp.ser.com'
# add more parameters here if needed
}
# send the email
Send-MailMessage @mailParams
}
else {
Write-Host "Empty folder '$folders[$i]'; Nothing to process"
}
}
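As a variant on the switch above, if the folder-to-recipient mapping changes often, you could also key a hashtable on the folder path itself, so adding a path and its recipients is a one-line change. This is only a sketch; the paths and addresses below are examples, not real values.
$notify = @{
    '\\server\files\Info\test'  = 'personX@test.com', 'personY@test.com', 'personZ@test.com'
    '\\server\files\Info\test2' = 'personA@test.com', 'personB@test.com'
}
foreach ($folder in $notify.Keys) {
    # count the items inside the folder
    $count = @(Get-ChildItem -Path $folder).Count
    if ($count -gt 0) {
        $mailParams = @{
            To         = $notify[$folder]
            From       = 'foldercheck@test.com'
            Subject    = 'Subject'
            Body       = "There are $count files in folder '$folder'"
            SmtpServer = 'smtp.ser.com'
        }
        Send-MailMessage @mailParams
    }
    else {
        Write-Host "Empty folder '$folder'; nothing to process"
    }
}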
I have a script from here; this is the job:
function CaptureWeight {
Start-Job -Name WeightLog -ScriptBlock {
filter timestamp {
$sw.WriteLine("$(Get-Date -Format MM/dd/yyyy_HH:mm:ss) $_")
}
try {
$sw = [System.IO.StreamWriter]::new("$using:LogDir\$FileName$(Get-Date -f MM-dd-yyyy).txt")
& "$using:PlinkDir\plink.exe" -telnet $using:SerialIP -P $using:SerialPort | TimeStamp
}
finally {
$sw.ForEach('Flush')
$sw.ForEach('Dispose')
}
}
}
I'd like to get this to run against a list of IP addresses, while also having a name associated with each IP to set the file name for each file. I was thinking of something like $Name = MyFileName and $Name.IP = 1.1.1.1, and using those in place of $FileName and $SerialIP, but I have yet to get anything close to working or find an example close enough to what I'm trying to do.
Thanks
Here is one way you could do it with a hash table, as Theo mentioned in his helpful comment. Be aware that Jobs don't have a Throttle / ThrottleLimit parameter, as opposed to Start-ThreadJob or ForEach-Object -Parallel. Since Jobs run in a different process (as you have already commented) instead of instances / runspaces, there is no built-in way to control how many Jobs can run at the same time. If you want control over this, you would need to code it yourself (a rough throttling sketch follows the code below).
# define IPs as Key and FileName as Value
$lookup = @{
'1.2.3.4' = 'FileNameForThisIP'
'192.168.1.15' = 'AnotherFileNameForTHatIP'
}
# path to the directory containing plink.exe
$plink = 'path\to\plinkdirectory'
# path to log directory
$LogDir = 'path\to\logDirectory'
# serial port
$serialport = 123
$jobs = foreach($i in $lookup.GetEnumerator()) {
Start-Job -Name WeightLog -ScriptBlock {
filter timestamp {
$sw.WriteLine("$(Get-Date -Format MM/dd/yyyy_HH:mm:ss) $_")
}
try {
$path = Join-Path $using:LogDir -ChildPath ('{0}{1}.txt' -f $using:i.Value, (Get-Date -f MM-dd-yyyy))
$sw = [System.IO.StreamWriter]::new($path)
$sw.AutoFlush = $true
& "$using:plink\plink.exe" -telnet $using:i.Key -P $using:serialPort | TimeStamp
}
finally {
$sw.ForEach('Dispose')
}
}
}
$jobs | Receive-Job -AutoRemoveJob -Wait
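As mentioned above, if you need to cap how many Jobs run at once you have to code the throttling yourself. A rough sketch of one way to do it; the $maxJobs value and the placeholder job body are assumptions, not part of the answer above:
$maxJobs = 5
$jobs = foreach ($i in $lookup.GetEnumerator()) {
    # wait until fewer than $maxJobs of our jobs are still running before starting the next one
    while (@(Get-Job -State Running | Where-Object Name -eq 'WeightLog').Count -ge $maxJobs) {
        Start-Sleep -Milliseconds 500
    }
    Start-Job -Name WeightLog -ArgumentList $i -ScriptBlock {
        param($pair)
        # placeholder body - substitute the plink / StreamWriter logic from the job above
        'would log {0} to file {1}' -f $pair.Key, $pair.Value
    }
}
$jobs | Receive-Job -AutoRemoveJob -Wait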
The other alternative to the hash table could be to use a Csv (either from a file with Import-Csv or hardcoded with ConvertFrom-Csv).
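A minimal sketch of that Csv alternative, hardcoded with ConvertFrom-Csv; the column names IP and FileName, and the placeholder job body, are assumptions:
$lookup = @'
IP,FileName
1.2.3.4,FileNameForThisIP
192.168.1.15,AnotherFileNameForThatIP
'@ | ConvertFrom-Csv

$jobs = foreach ($entry in $lookup) {
    Start-Job -Name WeightLog -ScriptBlock {
        # $using:entry.IP and $using:entry.FileName take the place of $using:i.Key and $using:i.Value
        'would log {0} to file {1}' -f $using:entry.IP, $using:entry.FileName
    }
}
$jobs | Receive-Job -AutoRemoveJob -Wait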
Adding here another alternative to my previous answer, using a RunspacePool instance, which has a built-in way of handling concurrency and queuing.
using namespace System.Management.Automation.Runspaces
try {
# define number of threads that can run at the same time
$threads = 10
# define IPs as Key and FileName as Value
$lookup = @{
'1.2.3.4' = 'FileNameForThisIP'
'192.168.1.15' = 'AnotherFileNameForTHatIP'
}
# path to the directory containing plink.exe
$plink = 'path\to\plinkdirectory\'
# path to log directory
$LogDir = 'path\to\logDirectory'
# serial port
$port = 123
$iss = [initialsessionstate]::CreateDefault2()
$rspool = [runspacefactory]::CreateRunspacePool(1, $threads, $iss, $Host)
$rspool.ApartmentState = 'STA'
$rspool.ThreadOptions = 'ReuseThread'
# session variables that will be initialized with the runspaces
$rspool.InitialSessionState.Variables.Add([SessionStateVariableEntry[]]@(
[SessionStateVariableEntry]::new('plink', $plink, '')
[SessionStateVariableEntry]::new('serialport', $port, '')
[SessionStateVariableEntry]::new('logDir', $LogDir, '')
))
$rspool.Open()
$rs = foreach($i in $lookup.GetEnumerator()) {
$ps = [powershell]::Create().AddScript({
param($pair)
filter timestamp {
$sw.WriteLine("$(Get-Date -Format MM/dd/yyyy_HH:mm:ss) $_")
}
try {
$path = Join-Path $LogDir -ChildPath ('{0}{1}.txt' -f $pair.Value, (Get-Date -f MM-dd-yyyy))
$sw = [System.IO.StreamWriter]::new($path)
$sw.AutoFlush = $true
& "$plink\plink.exe" -telnet $pair.Key -P $serialPort | TimeStamp
}
finally {
$sw.ForEach('Dispose')
}
}).AddParameter('pair', $i)
$ps.RunspacePool = $rspool
@{
Instance = $ps
AsyncResult = $ps.BeginInvoke()
}
}
foreach($r in $rs) {
try {
$r.Instance.EndInvoke($r.AsyncResult)
$r.Instance.Dispose()
}
catch {
Write-Error $_
}
}
}
finally {
$rspool.ForEach('Dispose')
}
Background
I am trying to number each item in a WBS with PowerShell. The WBS is on a spreadsheet. For example, if you have the following WBS (4-level depth) from Wikipedia:
The result should be:
1
1.1
1.1.1
1.1.1.1
1.1.1.2
1.1.1.3
1.1.1.4
1.1.1.5
1.1.1.6
1.1.2
1.1.3
1.1.4
1.2
1.3
1.4
1.5
1.6
1.7
1.8
1.9
1.10
1.11
Problem
I decided to export the WBS to CSV and read it with PowerShell:
Import-Csv -LiteralPath .\WBS.csv -Header Lv1,Lv2,Lv3,Lv4 |
ForEach-Object {
$_ | Add-Member -MemberType NoteProperty -Name Lv1i -Value $null
$_ | Add-Member -MemberType NoteProperty -Name Lv2i -Value $null
$_ | Add-Member -MemberType NoteProperty -Name Lv3i -Value $null
$_ | Add-Member -MemberType NoteProperty -Name Lv4i -Value $null
$_
} | Set-Variable wbs
$Lv1i = 0;
$wbs | ForEach-Object {
if ($_.Lv1 -ne "") {
$Lv1i = $Lv1i + 1;
$_.Lv1i = $Lv1i;
} else {
$_.Lv1i = $Lv1i;
}
}
$Lv2i = 0;
$wbs | ForEach-Object {
if ($_.Lv2 -ne "") {
$Lv2 = $_.Lv2;
$Lv2i = $Lv2i + 1;
$_.Lv2i = $Lv2i;
} else {
if ($_.Lv1 -eq "") {
$_.Lv2i = $Lv2i;
} else {
$Lv2i = 0;
}
}
}
$Lv3i = 0;
$wbs | ForEach-Object {
if ($_.Lv3 -ne "") {
$Lv3 = $_.Lv3;
$Lv3i = $Lv3i + 1;
$_.Lv3i = $Lv3i;
} else {
if (($_.Lv1 -ne "") -or ($_.Lv2 -ne "")) {
$Lv3i = 0;
} else {
$_.Lv3i = $Lv3i;
}
}
}
$Lv4i = 0;
$wbs | ForEach-Object {
if ($_.Lv4 -ne "") {
$Lv4 = $_.Lv4;
$Lv4i = $Lv4i + 1;
$_.Lv4i = $Lv4i;
} else {
if (($_.Lv1 -ne "") -or ($_.Lv2 -ne "") -or ($_.Lv3 -ne "")) {
$Lv4i = 0;
} else {
$_.Lv4i = $Lv4i;
}
}
}
$wbs | ForEach-Object { "{0} {1} {2} {3} `t {4}.{5}.{6}.{7}" -F $_.Lv1, $_.Lv2, $_.Lv3, $_.Lv4,$_.Lv1i, $_.Lv2i, $_.Lv3i, $_.Lv4i } `
| ForEach-Object { $_.Trim(".") }
The code above works for me, but it only supports a 4-level-deep WBS. I want to improve it to handle any depth. To implement this requirement, I think it has to read the CSV file into a variable-size two-dimensional array, but I could not find a robust way (one that supports commas and line breaks inside a cell) to do this in PowerShell.
Question
Is there any way to import a CSV into a variable-size two-dimensional array with PowerShell? Cells in the CSV could contain commas, double quotes, or line breaks.
Given this sample input data (sample.csv):
Aircraft System;;;
;Air Vehicle;;
;;Airframe;
;;;Airfram Integration
;;;Fuselage
;;;Wing
;;Propulsion;
;;Vehicle Subsystems;
;;Avionics;
;System Engineering;;
Other;;;
the following PowerShell script
$cols = 5
$data = Import-Csv .\sample.csv -Delimiter ";" -Encoding UTF8 -Header (1..$cols)
$stack = @()
$prev = 0
foreach ($row in $data) {
for ($i = 0; $i -lt $cols; $i++) {
$value = $row.$i
if (-not $value) { continue }
if ($i -eq $prev) {
$stack[$stack.Count-1]++
} elseif ($i -eq $prev + 1) {
$stack += 1
} elseif ($i -lt $prev) {
$stack = $stack[0..($i-1)]
$stack[$stack.Count-1]++
}
$prev = $i
Write-Host $($stack -join ".") $value
}
}
outputs
1 Aircraft System
1.1 Air Vehicle
1.1.1 Airframe
1.1.1.1 Airfram Integration
1.1.1.2 Fuselage
1.1.1.3 Wing
1.1.2 Propulsion
1.1.3 Vehicle Subsystems
1.1.4 Avionics
1.2 System Engineering
2 Other
To save the result instead of printing it out to the console, use e.g. this:
$result = foreach ($row in $data) {
for ($i = 0; $i -lt $cols; $i++) {
# ...
[pscustomobject]@{outline = $($stack -join "."); text = $value}
}
}
would give $result as
outline text
------- ----
1 Aircraft System
1.1 Air Vehicle
1.1.1 Airframe
1.1.1.1 Airfram Integration
1.1.1.2 Fuselage
1.1.1.3 Wing
1.1.2 Propulsion
1.1.3 Vehicle Subsystems
1.1.4 Avionics
1.2 System Engineering
2 Other
Not quite as succinct as @Tomalek's answer, but doesn't use an inner loop and accumulates the results into a variable...
Given:
$csv = #"
"Aircraft System"
, "Air Vehicle"
,, "Airframe"
,,, "Airframe Integration, Assembly, Test and Checkout"
,,, "Fuselage"
,,, "Wing"
,,, "Empennage"
,,, "Nacelle"
,,, "Other Airframe Components 1..n (Specify)"
,, "Propulsion"
,, "Vehicle Subsystems"
,, "Avionics"
, "System Engineering"
, "Program Management"
, "System Test and Evaluation"
, "Training"
, "Data"
, "Peculiar Support Equipment"
, "Common Support Equipment"
, "Operational/Site Activation"
, "Industrial Facilities"
, "Initial Spares and Repair Parts"
"#
the code:
# increase "1..9" to , e.g. "1..99" if you want to handle deeper hierarchies
$headers = 1..9;
$data = $csv | ConvertFrom-Csv -Header $headers;
# this variable does the magic - it tracks the index of the current node
# at each level in the hierarchy - e.g. 1.1.1.5 => #( 1, 1, 1, 5 ). each
# time we find a sibling or a new child we edit this array to append or
# increment the last item.
$indexes = new-object System.Collections.ArrayList;
$depth = 0;
$results = new-object System.Collections.ArrayList;
foreach( $item in $data )
{
# we can't nest by more than one level at a time, so this row must have
# a value at either the same depth as the previous if it's a sibling,
# the next depth if it's the first child, or a shallower index if we've
# reached the end of a nested list.
if( $item.($depth + 1) )
{
# this is the first child node of the previous node
$null = $indexes.Add(1);
$depth += 1;
}
elseif( $item.$depth )
{
# this is a sibling of the previous node, so increment the last index
$indexes[$depth - 1] += 1;
}
else
{
# this is the first item after a list of siblings (e.g. 1.1.2), so we
# need to look at shallower properties until we find a value
while( ($depth -gt 0) -and -not $item.$depth )
{
$indexes.RemoveAt($depth - 1);
$depth -= 1;
}
if( $depth -lt 1 )
{
throw "error - no shallower values found"
}
# don't forget this item is a sibling of the previous node at this level
# since it's not the *first* child, so we need to increment the counter
$indexes[$depth - 1] += 1;
}
$results += $indexes -join ".";
}
produces output:
$results
#1
#1.1
#1.1.1
#1.1.1.1
#1.1.1.2
#1.1.1.3
#1.1.1.4
#1.1.1.5
#1.1.1.6
#1.1.2
#1.1.3
#1.1.4
#1.2
#1.3
#1.4
#1.5
#1.6
#1.7
#1.8
#1.9
#1.10
#1.11
I'm trying to write a script that will grab the fortune 100 URLs from here, put those into an array, and then write a runspace that uses Invoke-WebRequest to get the content of those URLs and writes that content to a file. This is the code that I have so far:
#Importing Modules
Import-Module PoshRSJob
#variable declaration
$page = Invoke-WebRequest https://www.zyxware.com/articles/4344/list-of-fortune-500-companies-and-their-websites
$links = $page.Links
$tables = @($page.ParsedHtml.GetElementsByTagName("TABLE"))
$tableRows = $tables[0].Rows
#loops through the table to get only the top 100 urls.
$urlArray = @()
foreach ($tablerow in $tablerows) {
$urlArray += New-Object PSObject -Property @{'URLName' = $tablerow.InnerHTML.Split('"')[1]}
#Write-Host ($tablerow.innerHTML).Split('"')[1]
$i++
if ($i -eq 101) {break}
}
#Number of Runspaces to use
#$RunspaceThreads = 1
#Declaring Variables
$ParamList = @($urlArray)
$webRequest = @()
$urlArray | start-rsjob -ScriptBlock {
#$webRequest = (Invoke-WebRequest $using:ParamList)
#Invoke-WebRequest $urlArray
#Invoke-WebRequest {$urlArray}
#Get-Content $urlArray
}
The problem that I'm running into right now is that I can't get Invoke-WebRequest or Get-Content to give me the contents of the URLs that are actually contained in the array. You can see that in the scriptblock, I commented out some lines that didn't work.
My question is: using a runspace, what do I need to do to pull the data from all the URLs in the array using Get-Content, and then write that to a file?
You can adjust your current query to get the first 100 company names; this skips the empty company at the front. Consider using [PSCustomObject] @{ URLName = $url }, which replaces the legacy New-Object PSObject.
$urlArray = @()
$i = 0
foreach ($tablerow in $tablerows) {
$url = $tablerow.InnerHTML.Split('"')[1]
if ($url) {
# Only add an object when the url exists
$urlArray += [PSCustomObject] @{ URLName = $url }
$i++
if ($i -eq 100) {break}
}
}
To run the requests in parallel, use Start-RSJob with a script block; Invoke-WebRequest is then run in parallel. Note that in this example $_ refers to the current array element that is piped in, which is an object with a URLName property. Be a little careful which variables you use inside the scriptblock, because they might not be resolved the way you expect them to be (see the $Using: sketch after the code below).
# Run the webrequests in parallel
# $_ refers to a PSCustomObject with the @{ URLName = $url } property
$requests = ($urlArray | start-rsjob -ScriptBlock { Invoke-WebRequest -Uri $_.URLName })
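For instance, if an outer variable is needed inside the scriptblock, PoshRSJob supports the $Using: scope; $timeoutSec below is purely an illustrative value, not something from the question:
$timeoutSec = 30
$requests = $urlArray | Start-RSJob -ScriptBlock {
    # $Using:timeoutSec pulls the value from the calling scope into the runspace
    Invoke-WebRequest -Uri $_.URLName -TimeoutSec $Using:timeoutSec
}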
You can then wait for all the jobs to complete and do some post processing of the results.
Here only the length of the website content is written, because the pages themselves are lengthy.
# Get the results
# $_.Content.Length gets the length of the content to not spam the output with garbage
$result = Get-RSjob | Receive-RSJob | ForEach { $_.Content.Length }
Write-Host $result
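Since the question also asks to write the content to a file, the post-processing step above could instead write each page to disk. A sketch, where the output folder and the numbered file names are assumptions:
$outDir = 'C:\temp\pages'
$null = New-Item -Path $outDir -ItemType Directory -Force
$i = 0
Get-RSJob | Receive-RSJob | ForEach-Object {
    $i++
    # save the raw content of each response to its own file
    Set-Content -Path (Join-Path $outDir ('page{0:d3}.html' -f $i)) -Value $_.Content
}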
Below is only an example. I have seen a lot of scripts to break a .CSV file down into smaller files, but I am struggling with this.
How can we, with PowerShell, find the header indicated by ALPH, take each subsequent line, stop when it reaches ALPT (inclusive), and put this text into another file?
The operation will need to run through the whole file, and the number of ALPD or ALPC lines will vary.
ALPH can be considered a header, and the information it contains is needed because some field values can differ. The only constants are ALPH and ALPT.
ALPH;8102014
ALPC;PK
ALPD;50
ALPD;40
ALPT;5
ALPH;15102014
ALPC;PK
ALPD;50
ALPD;50
ALPD;70
ALPD;70
ALPD;71
ALPD;72
ALPD;40
ALPT;6
ALPH;15102014
ALPC;PK
ALPD;50
ALPD;50
ALPD;40
ALPT;6
If I understood your question correctly, something like this should work:
$csv = 'C:\path\to\your.csv'
$pattern = 'ALPH[\s\S]*?ALPT.*'
$cnt = 0
[IO.File]::ReadAllText($csv) | Select-String $pattern -AllMatches |
select -Expand Matches | select -Expand Groups |
% {
$cnt++
$outfile = Join-Path (Split-Path $csv -Parent) "split${cnt}.csv"
[IO.File]::WriteAllText($outfile, $_.Value)
}
Here is a way using switch. Assuming your original file is in C:\temp\ALPH.CSV, here is how I imagine finding the beginning and the end.
$n = 1
switch -File 'C:\temp\ALPH.CSV' -Regex
{
'^ALPH.*' {
Write-Host "Begin $n"
}
'^ALPT.*' {
Write-Host "End $n"
$n++
}
}
Now, saving the lines to a variable and exporting the files:
$n = 1
$csvTmp = @()
switch -File 'C:\temp\ALPH.CSV' -Regex
{
'^ALPH.*' {
Write-Host "Begin $n"
$csvTmp += $_
}
'^ALPT.*' {
Write-Host "End $n"
$csvTmp += $_
$csvTmp | Set-Content "c:\temp\file$n.csv"
$csvTmp = @()
$n++
}
default {
$csvTmp += $_
}
}
I'm looking to use PowerShell to see if there is a specific file in an FTP folder. More specifically, there are roughly 15 different folders that need to be checked for a specific file name. Any ideas on how I would do this?
There is a PowerShell ftp module here.
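If the module is not an option, a bare-bones sketch with .NET's FtpWebRequest can list each folder and look for the file. The server, credentials, folder list and file name below are placeholders, and the listing format returned by ListDirectory can vary between FTP servers:
$server   = 'ftp://ftp.example.com'
$folders  = '/inbound/folder1', '/inbound/folder2'   # extend this to all 15 folders
$fileName = 'expected-file.txt'
$cred     = Get-Credential

foreach ($folder in $folders) {
    # request a simple directory listing for this folder
    $request = [System.Net.FtpWebRequest]::Create("$server$folder/")
    $request.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
    $request.Credentials = $cred.GetNetworkCredential()

    $response = $request.GetResponse()
    $reader   = New-Object System.IO.StreamReader($response.GetResponseStream())
    $listing  = $reader.ReadToEnd() -split '\r?\n'
    $reader.Dispose()
    $response.Close()

    if ($listing -contains $fileName) {
        Write-Host "Found '$fileName' in '$folder'"
    }
}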
$DEBUG = 1
# Machines
$MachineNames = @("machine1","machine2" )
$MachineIPs = @("192.168.1.1","192.168.1.2" )
# Websites
$WebsiteNames = @("website1","website2" )
$WebsiteURLs = @("http://yahoo.com","http://google.com" )
#====== check websites
$i = 0;
foreach ($WebsiteURL in $WebsiteURLs){
# First we create the request.
$HTTP_Request = [System.Net.WebRequest]::Create($WebsiteURL)
# We then get a response from the site.
$HTTP_Response = $HTTP_Request.GetResponse()
# We then get the HTTP code as an integer.
$HTTP_Status = [int]$HTTP_Response.StatusCode
#$HTTP_Response
If ($HTTP_Status -eq 200) {
if ($DEBUG -eq 1) {Write-Host "== " $WebsiteNames[$i] " is OK!" }
}
Else {
if ($DEBUG -eq 1) {Write-Host "==Error: "$WebsiteNames[$i] " may be down!" }
SendEmails $WebsiteNames[$i]
}
# Finally, we clean up the http request by closing it.
$HTTP_Response.Close()
Clear-Variable HTTP_Response
$i = $i + 1
}
#====== check IP
$i = 0;
foreach ($MachineIP in $MachineIPs){
$isValidIP = Test-Connection $MachineIP -Count 1 -Quiet
if ($DEBUG -eq 1) {
$hostn = [System.Net.Dns]::GetHostEntry($MachineIP).HostName
New-Object -TypeName PSObject -Property @{'Host'=$hostn;'IP'=$MachineIP}
}
if (-not($isValidIP)) {
if ($DEBUG -eq 1) {Write-Host "==Error: " $MachineNames[$i] " ("$MachineIPs[$i]") may be down!" }
SendEmails $MachineNames[$i]
}
$i = $i + 1
}