I found a script that works great. The only change I want to make is to list just the files, not the new folders. This script should monitor a folder and its subfolders and notify only when a new file has been created. How can I filter this down to just the files? Can I add an exclude on the Get-ChildItem?
Param (
[string]$Path = "\\path\share",
[string]$SMTPServer = "smtp server",
[string]$From = "email",
[string]$To = "email",
[string]$Subject = "New File(s) Received on the FTP site"
)
$SMTPMessage = @{
To = $To
From = $From
Subject = "$Subject at $Path"
Smtpserver = $SMTPServer
}
$File = Get-ChildItem $Path | Where { $_.LastWriteTime -ge (Get-Date).AddMinutes(-10) }
If ($File)
{ $SMTPBody = "`nTo view these new files, click the link(s) below:`n`n "
$File | ForEach { $SMTPBody += "$($_.FullName)`n" }
Send-MailMessage @SMTPMessage -Body $SMTPBody
}
Thanks
To list only files, you can pass the -File parameter to Get-ChildItem.
From Get-ChildItem for FileSystem:
To get only files, use the File parameter and omit the Directory parameter.
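Applied to your script, the Get-ChildItem line would then look something like this (a minimal sketch; -Recurse is added only because you mention monitoring subfolders, and note that -File requires PowerShell 3.0 or later):
$File = Get-ChildItem $Path -File -Recurse | Where { $_.LastWriteTime -ge (Get-Date).AddMinutes(-10) }
No exclude is needed; -File already filters directories out.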
I have a script from here; this is the job:
function CaptureWeight {
Start-Job -Name WeightLog -ScriptBlock {
filter timestamp {
$sw.WriteLine("$(Get-Date -Format MM/dd/yyyy_HH:mm:ss) $_")
}
try {
$sw = [System.IO.StreamWriter]::new("$using:LogDir\$FileName$(Get-Date -f MM-dd-yyyy).txt")
& "$using:PlinkDir\plink.exe" -telnet $using:SerialIP -P $using:SerialPort | TimeStamp
}
finally {
$sw.ForEach('Flush')
$sw.ForEach('Dispose')
}
}
}
I'd like to get this to run against a list of IP addresses, while also having a name associated with each IP to set the file name for each log file. I was thinking something like $Name = Myfilename and $name.IP = 1.1.1.1 and using those in place of $FileName and $SerialIP, but I have yet to be able to get anything close to working or find an example close enough to what I'm trying to do.
Thanks
Here is one way you could do it with a hash table, as Theo mentioned in his helpful comment. Be aware that, unlike Start-ThreadJob or ForEach-Object -Parallel, Start-Job has no ThrottleLimit parameter: since jobs run in separate processes rather than instances / runspaces, there is no built-in way to control how many jobs can run at the same time. If you want control over this, you would need to code it yourself.
# define IPs as Key and FileName as Value
$lookup = @{
'1.2.3.4' = 'FileNameForThisIP'
'192.168.1.15' = 'AnotherFileNameForThatIP'
}
# path to the directory containing the plink executable
$plink = 'path\to\plinkdirectory'
# path to log directory
$LogDir = 'path\to\logDirectory'
# serial port
$serialport = 123
$jobs = foreach($i in $lookup.GetEnumerator()) {
Start-Job -Name WeightLog -ScriptBlock {
filter timestamp {
$sw.WriteLine("$(Get-Date -Format MM/dd/yyyy_HH:mm:ss) $_")
}
try {
$path = Join-Path $using:LogDir -ChildPath ('{0}{1}.txt' -f $using:i.Value, (Get-Date -f MM-dd-yyyy))
$sw = [System.IO.StreamWriter]::new($path)
$sw.AutoFlush = $true
& "$using:plink\plink.exe" -telnet $using:i.Key -P $using:serialPort | TimeStamp
}
finally {
$sw.ForEach('Dispose')
}
}
}
$jobs | Receive-Job -AutoRemoveJob -Wait
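If you did need to cap how many jobs run at once, a minimal hand-rolled throttle could be wrapped around the loop like this (a sketch only; $maxJobs is a name introduced here, and it simply polls with Start-Sleep):
$maxJobs = 5
$jobs = foreach($i in $lookup.GetEnumerator()) {
    # block until fewer than $maxJobs WeightLog jobs are still running
    while(@(Get-Job -Name WeightLog -ErrorAction SilentlyContinue | Where-Object State -eq 'Running').Count -ge $maxJobs) {
        Start-Sleep -Milliseconds 200
    }
    Start-Job -Name WeightLog -ScriptBlock { <# same script block as above #> }
}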
The other alternative to the hash table could be to use a Csv (either from a file with Import-Csv or hardcoded with ConvertFrom-Csv).
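For example, the hardcoded variant could look like this (a sketch; the column names Ip and FileName are invented here, the loop body would reference $i.Ip and $i.FileName instead of $i.Key and $i.Value, and you would iterate $lookup directly rather than calling GetEnumerator()):
$lookup = @'
Ip,FileName
1.2.3.4,FileNameForThisIP
192.168.1.15,AnotherFileNameForThatIP
'@ | ConvertFrom-Csv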
Adding here another alternative to my previous answer, using a RunspacePool instance, which has a built-in way of handling concurrency and queuing.
using namespace System.Management.Automation.Runspaces
try {
# define number of threads that can run at the same time
$threads = 10
# define IPs as Key and FileName as Value
$lookup = @{
'1.2.3.4' = 'FileNameForThisIP'
'192.168.1.15' = 'AnotherFileNameForThatIP'
}
# path to the directory containing the plink executable
$plink = 'path\to\plinkdirectory\'
# path to log directory
$LogDir = 'path\to\logDirectory'
# serial port
$port = 123
$iss = [initialsessionstate]::CreateDefault2()
$rspool = [runspacefactory]::CreateRunspacePool(1, $threads, $iss, $Host)
$rspool.ApartmentState = 'STA'
$rspool.ThreadOptions = 'ReuseThread'
# session variables that will be initialized with the runspaces
$rspool.InitialSessionState.Variables.Add([SessionStateVariableEntry[]]@(
[SessionStateVariableEntry]::new('plink', $plink, '')
[SessionStateVariableEntry]::new('serialport', $port, '')
[SessionStateVariableEntry]::new('logDir', $LogDir, '')
))
$rspool.Open()
$rs = foreach($i in $lookup.GetEnumerator()) {
$ps = [powershell]::Create().AddScript({
param($pair)
filter timestamp {
$sw.WriteLine("$(Get-Date -Format MM/dd/yyyy_HH:mm:ss) $_")
}
try {
$path = Join-Path $LogDir -ChildPath ('{0}{1}.txt' -f $pair.Value, (Get-Date -f MM-dd-yyyy))
$sw = [System.IO.StreamWriter]::new($path)
$sw.AutoFlush = $true
& "$plink\plink.exe" -telnet $pair.Key -P $serialPort | TimeStamp
}
finally {
$sw.ForEach('Dispose')
}
}).AddParameter('pair', $i)
$ps.RunspacePool = $rspool
@{
Instance = $ps
AsyncResult = $ps.BeginInvoke()
}
}
foreach($r in $rs) {
try {
$r.Instance.EndInvoke($r.AsyncResult)
$r.Instance.Dispose()
}
catch {
Write-Error $_
}
}
}
finally {
$rspool.ForEach('Dispose')
}
I have a program which checks two paths for files and if there are any files it sends one mail for one person each.
They now need four paths checked and multiple person need a mail for a path.
e.g.
Path 1 - Mail to x, y and z
Path 2 - Mail to a and b
Path 3 - Mail to x and a
Path 4 - Mail to s, r and w
How can I make it easy and most efficient?
$folders = @()
$folders += "\\server\files\Info\test"
$folders += "\\server\files\Info\test2"
$receiver = @()
$receiver += "person1@test.com"
$receiver += "person2@test.com"
$i = 0
$folders | ForEach-Object {
$checkforfiles = $_
$directoryInfo = Get-ChildItem $checkforfiles | Measure-Object
$directoryInfo.count #Returns the count of all of the objects in the directory
$Numberone = $directoryInfo.Count
if ($directoryInfo.count -ge 1){
send-mailmessage -subject "Subject" -Body "There are $checkforfiles files " -from foldercheck@test.com -to $receiver[$i] `
-smtpserver smtp.ser.com
$i = $i+1
}
else{
write-host "nothing to process"
}
}
You could simply extend the array of folders to test and use a switch to determine which users should get an email.
Also, I would advise using splatting on cmdlets with a lot of parameters to make for cleaner code.
# an array of folder paths to test
$folders = "\\server\files\Info\test", "\\server\files\Info\test2", "\\server\files\Info\test3", "\\server\files\Info\test4"
for ($i = 0; $i -lt $folders.Count; $i++) {
# get the file (and folder) count inside
$count = @(Get-ChildItem -Path $folders[$i]).Count
if ($count -gt 0) {
# determine who to email
$to = switch ($i) {
0 { 'personX@test.com', 'personY@test.com', 'personZ@test.com' ; break }
1 { 'personA@test.com', 'personB@test.com' ; break }
2 { 'personX@test.com', 'personA@test.com' ; break }
3 { 'personS@test.com', 'personR@test.com', 'personW@test.com' }
}
# set up a hashtable with parameters for splatting to Send-MailMessage
$mailParams = @{
To = $to
From = 'foldercheck@test.com'
Subject = "Subject"
Body = "There are $count files in folder '$($folders[$i])'"
SmtpServer = 'smtp.ser.com'
# add more parameters here if needed
}
# send the email
Send-MailMessage @mailParams
}
else {
Write-Host "Empty folder '$folders[$i]'; Nothing to process"
}
}
I've been working on a little project in Powershell.
My task was to create a script that will collect all files from mail attachments, merge all .pdf files into one and send the generated file to my email.
The script works completely fine in Powershell ISE, but when I try to run it from task scheduler, the merged .pdf file is corrupted without any data in it.
Keep in mind I am new to this stuff.
This is my main code that does all the heavy work:
function getAttachments
{
#########################################################
##-----------------------------------------------------##
## GET ATTACHMENTS ##
##-----------------------------------------------------##
#########################################################
##PATH TO CREDENTIAL
$credpath = "C:\Users\" + $env:UserName + "\Documents\myCred_${env:USERNAME}_${env:COMPUTERNAME}.xml"
#test variable
$test = Test-Path $credpath
##TEST IF CREDENTIAL EXISTS
if(!$test){
## USER PROMPT PSW CREDENTIAL ##
$cred = Get-Credential
#save credential in documents
$cred | Export-CliXml -Path $credpath
}else{
##READ USER CREDENTIAL FROM FILE
$cred = Import-CliXml -Path $credpath
}
##url and date variables
$url = "https://outlook.office365.com/api/v1.0/me/messages"
$d = [DateTime]::Today.AddDays(-1)
$global:date = $d.ToString("yyyy-MM-dd")
## Get all messages that have attachments where received date is greater than $date
$messageQuery = "" + $url + "?`$select=Id&`$filter=HasAttachments eq true and DateTimeReceived ge " + $date
$messages = Invoke-RestMethod $messageQuery -Credential $cred
## Loop through each results
foreach ($message in $messages.value)
{
# get attachments and save to file system
$query = $url + "/" + $message.Id + "/attachments"
$attachments = Invoke-RestMethod $query -Credential $cred
# in case of multiple attachments in email
foreach ($attachment in $attachments.value)
{
Write-Host "Found File :- " $attachment.Name
$path = "c:\Attachments\" + $attachment.Name
$Content = [System.Convert]::FromBase64String($attachment.ContentBytes)
Set-Content -Path $path -Value $Content -Encoding Byte
}
}
}
function sendAttachments
{
#############################################################
##---------------------------------------------------------##
## SEND ATTACHMENTS AND DELETE FILES ##
##---------------------------------------------------------##
#############################################################
#Connection Details
#PATH TO CREDENTIAL
$credpath = "C:\Users\" + $env:UserName + "\Documents\myCred_${env:USERNAME}_${env:COMPUTERNAME}.xml"
$cred = Import-CliXml -Path $credpath
$smtpServer = "smtp.office365.com"
$msg = new-object Net.Mail.MailMessage
#Change port number for SSL to 587
$smtp = New-Object Net.Mail.SmtpClient($SmtpServer, 25)
#Uncomment Next line for SSL
$smtp.EnableSsl = $true
$smtp.Credentials = $cred
$msg.IsBodyHtml = $true
#From Address
$msg.From = $cred.UserName
#To Address, Copy the below line for multiple recipients
$msg.To.Add("email@gmail.com")
#Message Body
$msg.Body = "<h2>All the attachments together can be found attached to this email</h2> <br/><br/>"
#Message Subject
$msg.Subject = "no-reply: Email with all attachments"
#your file location
$files = Get-ChildItem "C:\Attachments\"
#attach the right file
$file = $global:pname
Write-Host "Attaching File :- " $file
$attachment = New-Object System.Net.Mail.Attachment -ArgumentList C:\Attachments\$file
$msg.Attachments.Add($attachment)
#send email
$smtp.Send($msg)
$attachment.Dispose();
$msg.Dispose();
#delete the files from the folder
Get-ChildItem -Path C:\Attachments -Include * -File -Recurse | foreach { $_.Delete()}
}
function mergePDF
{
#############################################################
##---------------------------------------------------------##
## MERGE ALL PDF FILES ##
##---------------------------------------------------------##
#############################################################
$workingDirectory = "C:\Attachments"
$itspath = $PSScriptRoot
$global:pname = $global:date + "_pdfAttachments.pdf"
$pdfs = ls $workingDirectory -recurse | where {-not $_.PSIsContainer -and $_.Extension -imatch "^\.pdf$"};
[void] [System.Reflection.Assembly]::LoadFrom([System.IO.Path]::Combine($itspath, 'itextsharp.dll'));
$output = [System.IO.Path]::Combine($workingDirectory, $pname);
$fileStream = New-Object System.IO.FileStream($output, [System.IO.FileMode]::OpenOrCreate);
$document = New-Object iTextSharp.text.Document;
$pdfCopy = New-Object iTextSharp.text.pdf.PdfCopy($document, $fileStream);
$document.Open();
foreach ($pdf in $pdfs) {
$reader = New-Object iTextSharp.text.pdf.PdfReader($pdf.FullName);
[iTextSharp.text.pdf.PdfReader]::unethicalreading = $true
$pdfCopy.AddDocument($reader);
$reader.Dispose();
}
$document.Close()
$pdfCopy.Dispose();
$document.Dispose();
$fileStream.Dispose();
}
getAttachments
Start-Sleep -s 10
mergePDF
Start-Sleep -s 10
sendAttachments
In this piece of code that I run in another script file, I create a new task:
#############################################################
##---------------------------------------------------------##
## SCHEDULE SCRIPTS IN WINDOWS TASKS ##
##---------------------------------------------------------##
#############################################################
##PATH TO CREDENTIAL
$credpath = "C:\Users\" + $env:UserName + "\Documents\myCred_${env:USERNAME}_${env:COMPUTERNAME}.xml"
#test variable
$test = Test-Path $credpath
##TEST IF CREDENTIAL EXISTS
if(!$test){
## USER PROMPT PSW CREDENTIAL ##
$cred = Get-Credential
#save credential in documents
$cred | Export-CliXml -Path $credpath
}
$taskName = "ManageEmailAttachments"
$taskExists = Get-ScheduledTask | Where-Object {$_.TaskName -like $taskName }
if($taskExists)
{
Get-ScheduledJob ManageEmailAttachments
Unregister-ScheduledJob ManageEmailAttachments
$wshell = New-Object -ComObject Wscript.Shell
$wshell.Popup("Task successfully deleted, run the script again to schedule the task",0,"Done",0x0)
}
else
{
$tt = Get-Date
$tt = $tt.AddMinutes(1)
$testtime = $tt.ToString("HH:mm:ss")
#set trigger
$trigger = New-JobTrigger -Daily -At "1:00"
$testtrigger = New-JobTrigger -Daily -At $testtime
#path to the scripts
$scriptPath = $PSScriptRoot + "\wps_manage_pdf_attachments.ps1"
#options(optional)
$option = New-ScheduledJobOption -WakeToRun: $true
#create a new task
Register-ScheduledJob -Name ManageEmailAttachments -FilePath $scriptPath -Trigger $testtrigger -ScheduledJobOption $option
}
The script when run in Powershell works great, it gets all the attachments from mailbox, merges them into 1 .pdf file and sends them to the requested email address. But when scheduled in windows task scheduler it does the first step fine, but when merged, the .pdf file is corrupted without any content.
I couldn't figure out how to make it work so I posted a question on the forum.
Hope you guys find a way to figure it out.
Thanks in advance
Use the function below to get the script root directory.
Function Get-ScriptDirectory
{
$Invocation = (Get-Variable MyInvocation -scope 1).Value
Split-path $Invocation.MyCommand.Path
}
$scriptPath = Join-Path (Get-ScriptDirectory) 'wps_manage_pdf_attachments.ps1'
Apparently the problem was in the main code itself. I used:
Try{...}
Catch{$_ | Out-File C:\errors.txt}
In the mergePDF function to find out what the error was. It turned out the path to my itextsharp.dll was incorrect: the $PSScriptRoot I used resolved to "C:\windows\windows32" instead of the directory the script actually lives in.
So what I did instead was add a line in my batch file to copy the itextsharp.dll to %Temp%:
xcopy Scripts\itextsharp.dll %Temp% /D >NUL 2>NUL
and then read the file from there with:
$itsPath = [System.IO.Path]::GetTempPath()
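so that the existing LoadFrom call in mergePDF resolves to the copy in %Temp% (the same Combine/LoadFrom line as before, just with the new base path):
[void][System.Reflection.Assembly]::LoadFrom([System.IO.Path]::Combine($itsPath, 'itextsharp.dll'))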
And everything works as it should. I know this isn't the best way to do it, but I had this batch file before to make the script run by just double-clicking it.
So adding one little line won't hurt.
I hope this helps anyone with the same problem.
I am trying to get all files within a directory that have the extension ".rtf". I have a working script, but it takes a while, as there is a foreach loop within a foreach loop. Is there a faster way to handle this? The goal of the script is to get all files within a directory ending in .rtf and use MSWord to open each file and save it as a ".DOC". The conversion functionality works fine. The issue is the length of time it takes to search through all of the folders.
Function Convert-Dir($path)
{
$subFolders = get-childitem $path -Recurse | Where-Object {$_.PSIsContainer -eq $True}
if($subFolders)
{
foreach($folder in $subFolders)
{
if($folder.PSisContainer)
{
$Files=Get-ChildItem $folder.fullname -Filter "*.rtf"
$Word=New-Object -ComObject WORD.APPLICATION
if($Files)
{
foreach ($File in $Files)
{
$Doc=$Word.Documents.Open($File.fullname)
$Name=($Doc.name).replace("rtf","doc")
if (Test-Path $Name)
{
} else
{
# Use WORD
$fullName = ($Doc.path + "\" + "Converted_" + $Name)
$Doc.saveas([ref] $fullName, [ref] 0)
$Doc.close()
$fileToRemove = $File.fullName
Remove-Item $fileToRemove
$Word.Quit()
}
}
}
}
}
}
}
I guess the performance is lost by creating a new Word instance for each subfolder. You should use only one instance of Word for the whole run: move the line $Word=New-Object -ComObject WORD.APPLICATION to the top of your function and the line $Word.Quit() to the very end.
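A minimal sketch of that restructuring (untested here; it also flattens the two nested loops into a single recursive Get-ChildItem and gives Test-Path the full target path, which are extra changes beyond just moving the two lines):
Function Convert-Dir($path)
{
    # one Word instance for the whole run
    $Word = New-Object -ComObject WORD.APPLICATION
    try
    {
        foreach ($File in Get-ChildItem $path -Recurse -Filter "*.rtf")
        {
            $Doc = $Word.Documents.Open($File.FullName)
            $Name = ($Doc.Name).Replace("rtf", "doc")
            $fullName = ($Doc.Path + "\" + "Converted_" + $Name)
            if (-not (Test-Path $fullName))
            {
                $Doc.SaveAs([ref] $fullName, [ref] 0)
                $Doc.Close()
                Remove-Item $File.FullName
            }
            else
            {
                $Doc.Close()
            }
        }
    }
    finally
    {
        # quit Word once, at the very end
        $Word.Quit()
    }
}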
I'm trying to create a PS script that will read multiple file directories from an XML file and then, depending on the file type and a last-modified date older than 10 days, delete the files.
I'm not sure how I should be accessing the config values so that I can set up a foreach loop. At the minute I have this in my script:
#Read XML Config File to get settings
[xml]$configfile = Get-Content "C:\Project\TestForXML.xml"
#Declare and set variables from Config values
$dirs = @($configfile.Settings.DirectoryName)
$scanSubDirectories = $configfile.Settings.ScanSubDirectories
$deleteAllFiles = $configfile.Settings.deleteAllFiles
$fileTypesToDelete = $configfile.Settings.FileTypesToDelete
$liveSiteLogs = $configfile.Settings.LiveSiteLogs
$fileExclusions = $configfile.Settings.FileExclusions
$retentionPeriod = $configfile.Settings.RetentionPeriod
$aicLogs = $configfile.Settings.AICLogs
$aicLogsRententionPeriod = $configfile.Settings.AICLogsRententionPeriod
$Now = Get-Date
$Days = "10"
$LastWrite = $Now.AddDays(-$retentionPeriod)
$aicLastWrite = $Now.AddDays(-$aicLogsRententionPeriod)
$Logs = Get-Childitem $dirs -Include $fileTypesToDelete,$liveSiteLogs -Recurse | Where {$_.LastWriteTime -le "$LastWrite"}
foreach ($Log in $Logs)
{
Remove-Item $Log.FullName
$Msg =Write-Output "Deleting File $Log"
$Msg | out-file "C:\Desktop\Output.txt" -append
}
My XML Config files looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<Settings>
<DirectoryName>"ServerA\C$\logs", "ServerB\C$\logfiles",
"ServerC\C$\logfiles", "ServerD\C$\logs"</DirectoryName>
<AICLogs>D:\DebugXML</AICLogs>
<AICLogsRententionPeriod>30</AICLogsRententionPeriod>
<ScanSubDirectories>True</ScanSubDirectories>
<DeleteAllFiles>False</DeleteAllFiles>
<FileTypesToDelete>.txt;.log;.mf</FileTypesToDelete>
<LiveSiteLogs>livesite.runtime.*</LiveSiteLogs>
<FileExclusions>fdlogfile.log</FileExclusions>
<RetentionPeriod>13</RetentionPeriod>
</Settings>
I had to change the following line:
$dirs = #($configfile.Settings.DirectoryName)
to something like this:
$dirs = @($configfile.Settings.DirectoryName.Split(",").Trim().Trim('"'))
Let me know if that gets you closer.
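With the directories in a proper array, the loop you asked about could then be shaped like this (a sketch reusing the variables above; note that $fileTypesToDelete as read from the XML is a single ';'-separated string, so it also needs splitting into wildcard patterns such as '*.txt' before -Include will match anything):
$patterns = $fileTypesToDelete.Split(';') | ForEach-Object { "*$_" }
foreach ($dir in $dirs) {
    $Logs = Get-Childitem $dir -Include ($patterns + $liveSiteLogs) -Recurse |
        Where { $_.LastWriteTime -le $LastWrite }
    foreach ($Log in $Logs) {
        Remove-Item $Log.FullName
        "Deleting File $Log" | Out-File "C:\Desktop\Output.txt" -Append
    }
}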