Read XML Config into PowerShell array

I'm trying to create a PowerShell script that will read multiple file directories from an XML file and then, depending on the file type and a last-modified date older than 10 days, delete the files.
I'm not sure how I should be accessing the config values so that I can set up a foreach loop. At the minute I have this in my script:
#Read XML Config File to get settings
[xml]$configfile = Get-Content "C:\Project\TestForXML.xml"
#Declare and set variables from Config values
$dirs = @($configfile.Settings.DirectoryName)
$scanSubDirectories = $configfile.Settings.ScanSubDirectories
$deleteAllFiles = $configfile.Settings.deleteAllFiles
$fileTypesToDelete = $configfile.Settings.FileTypesToDelete
$liveSiteLogs = $configfile.Settings.LiveSiteLogs
$fileExclusions = $configfile.Settings.FileExclusions
$retentionPeriod = $configfile.Settings.RetentionPeriod
$aicLogs = $configfile.Settings.AICLogs
$aicLogsRententionPeriod = $configfile.Settings.AICLogsRententionPeriod
$Now = Get-Date
$Days = "10"
$LastWrite = $Now.AddDays(-$retentionPeriod)
$aicLastWrite = $Now.AddDays(-$aicLogsRententionPeriod)
$Logs = Get-Childitem $dirs -Include $fileTypesToDelete,$liveSiteLogs -Recurse | Where {$_.LastWriteTime -le "$LastWrite"}
foreach ($Log in $Logs)
{
Remove-Item $Log.FullName
$Msg = Write-Output "Deleting File $Log"
$Msg | out-file "C:\Desktop\Output.txt" -append
}
My XML config file looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<Settings>
<DirectoryName>"ServerA\C$\logs", "ServerB\C$\logfiles",
"ServerC\C$\logfiles", "ServerD\C$\logs"</DirectoryName>
<AICLogs>D:\DebugXML</AICLogs>
<AICLogsRententionPeriod>30</AICLogsRententionPeriod>
<ScanSubDirectories>True</ScanSubDirectories>
<DeleteAllFiles>False</DeleteAllFiles>
<FileTypesToDelete>.txt;.log;.mf</FileTypesToDelete>
<LiveSiteLogs>livesite.runtime.*</LiveSiteLogs>
<FileExclusions>fdlogfile.log</FileExclusions>
<RetentionPeriod>13</RetentionPeriod>
</Settings>

I had to change out the following line.
$dirs = @($configfile.Settings.DirectoryName)
To something like this.
$dirs = @($configfile.Settings.DirectoryName.Split(",").Trim())
Let me know if that gets you closer.
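Putting that together, here is a minimal sketch of the whole loop, assuming the config file path above and that FileTypesToDelete also needs splitting on ';' and converting to wildcard patterns:
#Read the XML config
[xml]$configfile = Get-Content "C:\Project\TestForXML.xml"
#Split the comma-separated directory list, trimming whitespace and surrounding quotes
$dirs = $configfile.Settings.DirectoryName.Split(",") | ForEach-Object { $_.Trim().Trim('"') }
#Turn ".txt;.log;.mf" into "*.txt","*.log","*.mf" for -Include
$fileTypesToDelete = $configfile.Settings.FileTypesToDelete.Split(";") | ForEach-Object { "*$($_.Trim())" }
$liveSiteLogs = $configfile.Settings.LiveSiteLogs
$retentionPeriod = [int]$configfile.Settings.RetentionPeriod
$lastWrite = (Get-Date).AddDays(-$retentionPeriod)
foreach ($dir in $dirs)
{
    Get-ChildItem $dir -Include ($fileTypesToDelete + $liveSiteLogs) -Recurse |
        Where-Object { $_.LastWriteTime -le $lastWrite } |
        ForEach-Object {
            "Deleting File $($_.FullName)" | Out-File "C:\Desktop\Output.txt" -Append
            Remove-Item $_.FullName
        }
}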

Related

Cannot remove a string from an array in powershell

I'm trying to populate an array with the paths of the files in whatever folder the script is located in, but I don't want
the array to include the path of the script itself, only the other files in that folder. I have tried removing it after the array is populated, but then I get an error that the array is a fixed size.
#To get path in which the script is located
$mypath = $MyInvocation.MyCommand.Path
$myStringPath=$mypath.ToString().Replace("TestingScriptPath.ps1", "")
#Populates files inside the folder
$array = @()
(Get-ChildItem -Path $myStringPath ).FullName |
foreach{
$array += $_
}
#display paths
for($i = 0; $i -lt $array.length; $i++)
{
$array[$i]
}
You're better off not putting it in the array in the first place.
When you update an array, the whole array has to be rewritten, so the performance tends to be terrible.
Use a different datatype if you're going to be removing items one by one.
#To get path in which the script is located
$mypath = $MyInvocation.MyCommand.Path
$myStringPath=$mypath.ToString().Replace("TestingScriptPath.ps1", "")
#Populates files inside the folder
$array = Get-ChildItem -Path $myStringPath | Where-Object {$_.fullname -ne $mypath}
$array
If you definitely want to do it the way suggested in the question (slower):
$ArrayWithFile = Get-ChildItem -Path $myStringPath
$ArrayWithoutFile = $ArrayWithFile | Where-Object {$_.fullName -ne $mypath}
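If you really do need to remove entries after the collection is built, a generic list supports Remove(), unlike a plain array. A minimal sketch of that approach, assuming the same script name as in the question and PowerShell 3.0 or later:
#To get the path in which the script is located
$mypath = $MyInvocation.MyCommand.Path
$myStringPath = Split-Path -Parent $mypath
#A List[string] can grow and shrink, so Remove() works where a fixed-size array fails
$list = New-Object System.Collections.Generic.List[string]
(Get-ChildItem -Path $myStringPath).FullName | ForEach-Object { $list.Add($_) }
#Drop the script's own path, then display what is left
[void]$list.Remove($mypath)
$list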

Task scheduler creates corrupted version of generated by script file

I've been working on a little project in PowerShell.
My task was to create a script that collects all files from mail attachments, merges all .pdf files into one, and sends the generated file to my email.
The script works completely fine in PowerShell ISE, but when I try to run it from Task Scheduler, the merged .pdf file is corrupted, without any data in it.
Keep in mind I am new to this stuff.
This is my main code that does all the heavy work:
function getAttachments
{
#########################################################
##-----------------------------------------------------##
## GET ATTACHMENTS ##
##-----------------------------------------------------##
#########################################################
##PATH TO CREDENTIAL
$credpath = "C:\Users\" + $env:UserName + "\Documents\myCred_${env:USERNAME}_${env:COMPUTERNAME}.xml"
#test variable
$test = Test-Path $credpath
##TEST IF CREDENTIAL EXISTS
if(!$test){
## USER PROMPT PSW CREDENTIAL ##
$cred = Get-Credential
#save credential in documents
$cred | Export-CliXml -Path $credpath
}else{
##READ USER CREDENTIAL FROM FILE
$cred = Import-CliXml -Path $credpath
}
##url and date variables
$url = "https://outlook.office365.com/api/v1.0/me/messages"
$d = [DateTime]::Today.AddDays(-1)
$global:date = $d.ToString("yyyy-MM-dd")
## Get all messages that have attachments where received date is greater than $date
$messageQuery = "" + $url + "?`$select=Id&`$filter=HasAttachments eq true and DateTimeReceived ge " + $date
$messages = Invoke-RestMethod $messageQuery -Credential $cred
## Loop through each results
foreach ($message in $messages.value)
{
# get attachments and save to file system
$query = $url + "/" + $message.Id + "/attachments"
$attachments = Invoke-RestMethod $query -Credential $cred
# in case of multiple attachments in email
foreach ($attachment in $attachments.value)
{
Write-Host "Found File :- " $attachment.Name
$path = "c:\Attachments\" + $attachment.Name
$Content = [System.Convert]::FromBase64String($attachment.ContentBytes)
Set-Content -Path $path -Value $Content -Encoding Byte
}
}
}
function sendAttachments
{
#############################################################
##---------------------------------------------------------##
## SEND ATTACHMENTS AND DELETE FILES ##
##---------------------------------------------------------##
#############################################################
#Connection Details
#PATH TO CREDENTIAL
$credpath = "C:\Users\" + $env:UserName + "\Documents\myCred_${env:USERNAME}_${env:COMPUTERNAME}.xml"
$cred = Import-CliXml -Path $credpath
$smtpServer = "smtp.office365.com"
$msg = new-object Net.Mail.MailMessage
#Change port number for SSL to 587
$smtp = New-Object Net.Mail.SmtpClient($SmtpServer, 25)
#Uncomment Next line for SSL
$smtp.EnableSsl = $true
$smtp.Credentials = $cred
$msg.IsBodyHtml = $true
#From Address
$msg.From = $cred.UserName
#To Address, Copy the below line for multiple recipients
$msg.To.Add("email@gmail.com")
#Message Body
$msg.Body = "<h2>All attachments are included together in the attachment of this email</h2> <br/><br/>"
#Message Subject
$msg.Subject = "no-reply: Email with all attachments"
#your file location
$files = Get-ChildItem "C:\Attachments\"
#attach the right file
$file = $global:pname
Write-Host "Attaching File :- " $file
$attachment = New-Object System.Net.Mail.Attachment -ArgumentList C:\Attachments\$file
$msg.Attachments.Add($attachment)
#send email
$smtp.Send($msg)
$attachment.Dispose();
$msg.Dispose();
#delete the files from the folder
Get-ChildItem -Path C:\Attachments -Include * -File -Recurse | foreach { $_.Delete()}
}
function mergePDF
{
#############################################################
##---------------------------------------------------------##
## MERGE ALL PDF FILES ##
##---------------------------------------------------------##
#############################################################
$workingDirectory = "C:\Attachments"
$itspath = $PSScriptRoot
$global:pname = $global:date + "_pdfAttachments.pdf"
$pdfs = ls $workingDirectory -recurse | where {-not $_.PSIsContainer -and $_.Extension -imatch "^\.pdf$"};
[void] [System.Reflection.Assembly]::LoadFrom([System.IO.Path]::Combine($itspath, 'itextsharp.dll'));
$output = [System.IO.Path]::Combine($workingDirectory, $pname);
$fileStream = New-Object System.IO.FileStream($output, [System.IO.FileMode]::OpenOrCreate);
$document = New-Object iTextSharp.text.Document;
$pdfCopy = New-Object iTextSharp.text.pdf.PdfCopy($document, $fileStream);
$document.Open();
foreach ($pdf in $pdfs) {
$reader = New-Object iTextSharp.text.pdf.PdfReader($pdf.FullName);
[iTextSharp.text.pdf.PdfReader]::unethicalreading = $true
$pdfCopy.AddDocument($reader);
$reader.Dispose();
}
$document.Close()
$pdfCopy.Dispose();
$document.Dispose();
$fileStream.Dispose();
}
getAttachments
Start-Sleep -s 10
mergePDF
Start-Sleep -s 10
sendAttachments
In this piece of code that I run in another script file, I create a new task:
#############################################################
##---------------------------------------------------------##
## SCHEDULE SCRIPTS IN WINDOWS TASKS ##
##---------------------------------------------------------##
#############################################################
##PATH TO CREDENTIAL
$credpath = "C:\Users\" + $env:UserName + "\Documents\myCred_${env:USERNAME}_${env:COMPUTERNAME}.xml"
#test variable
$test = Test-Path $credpath
##TEST IF CREDENTIAL EXISTS
if(!$test){
## USER PROMPT PSW CREDENTIAL ##
$cred = Get-Credential
#save credential in documents
$cred | Export-CliXml -Path $credpath
}
$taskName = "ManageEmailAttachments"
$taskExists = Get-ScheduledTask | Where-Object {$_.TaskName -like $taskName }
if($taskExists)
{
Get-ScheduledJob ManageEmailAttachments
Unregister-ScheduledJob ManageEmailAttachments
$wshell = New-Object -ComObject Wscript.Shell
$wshell.Popup("Task successfully deleted, run the script again to schedule the task",0,"Done",0x0)
}
else
{
$tt = Get-Date
$tt = $tt.AddMinutes(1)
$testtime = $tt.ToString("HH:mm:ss")
#set trigger
$trigger = New-JobTrigger -Daily -At "1:00"
$testtrigger = New-JobTrigger -Daily -At $testtime
#path to the scripts
$scriptPath = $PSScriptRoot + "\wps_manage_pdf_attachments.ps1"
#options(optional)
$option = New-ScheduledJobOption -WakeToRun: $true
#create a new task
Register-ScheduledJob -Name ManageEmailAttachments -FilePath $scriptPath -Trigger $testtrigger -ScheduledJobOption $option
}
The script works great when run in PowerShell: it gets all the attachments from the mailbox, merges them into one .pdf file, and sends it to the requested email address. But when scheduled in Windows Task Scheduler it does the first step fine, yet the merged .pdf file has no content.
I couldn't figure out how to make it work, so I posted a question on the forum.
Hope you guys find a way to figure it out.
Thanks in advance
Use the function below to get the script root directory.
Function Get-ScriptDirectory
{
$Invocation = (Get-Variable MyInvocation -scope 1).Value
Split-Path $Invocation.MyCommand.Path
}
$scriptPath = Join-Path (Get-ScriptDirectory) 'wps_manage_pdf_attachments.ps1'
Apparently the problem was nested in the main code. I used:
Try{...}
Catch{$_ | Out-File C:\errors.txt}
in the mergePDF function to find out what the error was. It turned out the path to my itextsharp.dll was incorrect: the $PSScriptRoot I used resolved to "C:\Windows\System32" instead of the folder where the script actually is.
So what I did instead was add a line in my batch file to copy the itextsharp.dll to %Temp%:
xcopy Scripts\itextsharp.dll %Temp% /D >NUL 2>NUL
and then read the file from there with:
$itsPath = [System.IO.Path]::GetTempPath()
And everything works as it should. I know this isn't the best way to do it, but I already had this batch file to make the script run by just double-clicking it,
so adding one more line won't hurt.
I hope this helps anyone with the same problem.
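If you would rather not copy the DLL around, another option along the lines of the Get-ScriptDirectory helper above is to resolve the script's own folder at run time and load itextsharp.dll from there. A rough sketch, assuming the DLL sits next to the script and the code runs at script scope:
#Resolve the folder this script file lives in, even when started by Task Scheduler
$scriptDir = Split-Path -Parent $MyInvocation.MyCommand.Path
$itsPath = Join-Path $scriptDir 'itextsharp.dll'
[void][System.Reflection.Assembly]::LoadFrom($itsPath)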

Find a value in XML and access its siblings from the same parent group

I have an XML file with parent node hospitalStore and a few child nodes I need to process:
<?xml version="1.0" encoding="UTF-8"?><!--Hospital Store -->
<hospitalStore><!-- Hospital Config -->
<src><!--Search Path For Files To Move-->
<path>C:\Users\Desktop\move\*.pdf</path>
</src>
<hospital>
<criteria>Beth Israel requirements</criteria>
<name>Beth Israel</name>
<category>OPBEIH</category>
<destination>C:\Users\Desktop\dest\1\</destination>
<hospcode>1101</hospcode>
</hospital><!-- End Hospital Config --><!-- Hospital Config -->
<hospital>
<criteria>Beth Israel CCC - WEST requirements</criteria>
<name>Beth Israel CCC WEST</name>
<category>OPBICC</category>
<destination>C:\Users\Desktop\dest\2\</destination>
<hospcode>1107</hospcode>
</hospital><!-- End Hospital Config --><!-- Hospital Config -->
<hospital>
<criteria>Beth Israel KHD requirements</criteria>
<name>Beth Israel KHD</name>
<category>OPBIKD</category>
<destination>C:\Users\Desktop\dest\3\</destination>
<hospcode>1102</hospcode>
</hospital><!-- End Hospital Config --><!-- Hospital Config -->
<hospital>
<criteria>Beth Israel KHD requirements</criteria>
<name>Beth Israel KHD</name>
<category>OPBIKE</category>
<destination>C:\Users\Desktop\dest\3\</destination>
<hospcode>1102</hospcode>
</hospital><!-- End Hospital Config -->
</hospitalStore><!-- End Hospital Store -->
So far my code retrieves the files in a directory and then matches each file's first six base-name characters against the XML child node category.
I want to point a file that matches a value in category to the corresponding destination, and then test destination with a few other functions. I'm unsure how to associate the file with that group of nodes.
$cPath="C:\move\outpatient.xml"
$xml = New-Object -TypeName XML
$xml.Load($cPath)
$hName = $xml.hospitalstore.hospital
$src = $xml.hospitalstore.src.path
$fileList= get-childitem -path $src -File
ForEach ($file in $fileList ){
$category = ($file.BaseName.Substring(0,6))
$fileName = $file.Name
if ($category -in $hName.category){
$file.Name
$hName.category
}
}
To find a hospital that has a child node category matching the condition:
PowerShell 4+:
$hospital = $hospitals.Where({ $_.category -eq $foo }, 'first')[0]
PowerShell 3+:
$hospital = $hospitals | Where category -eq $foo
Aliases: Where = Where-Object = ?
PowerShell 1+:
$hospital = $hospitals | Where { $_.category -eq $foo }
Aliases: Where = Where-Object = ?
XPath (available in all PS versions for Windows, but not yet in PowerShell for Linux AFAIK):
$hospital = $xml.SelectSingleNode("//hospital[category='$foo']")
The PowerShell 1, 2, and 3 methods iterate the entire array and return either an array of all matching nodes, a single matching node (not an array), or nothing. The PowerShell 4 method and SelectSingleNode stop at the first match.
So the boilerplate code could be something like this:
$xml = New-Object XML
$xml.Load('C:\move\outpatient.xml')
$hospitals = $xml.hospitalStore.hospital
$src = $xml.hospitalStore.src.path
Get-ChildItem $src -File | ForEach {
$file = $_
$fileCategory = $file.BaseName.Substring(0,6)
$fileName = $file.Name
$hospital = $hospitals | Where { $_.category -eq $fileCategory }
if (!$hospital) {
Write-Error "$fileCategory not found in XML"
return # return because we're inside a function-like ScriptBlock
}
# use the values
echo $hospital.criteria
echo $hospital.name
echo $hospital.destination
echo $hospital.hospcode
echo ''
}
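Once $hospital is resolved, routing the file to its destination is one more line inside the loop, after the echo statements. A sketch only, assuming Move-Item is what "point the file" should do and that the destination folders already exist:
# move the matched file to the destination configured for its hospital
Move-Item -Path $file.FullName -Destination $hospital.destination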

List just files in a Powershell script

I found a script that does great. The only change I want to make is to list just the files, not the new folders. This script should monitor a folder and subfolders and notify only when a new file has been created. How can I filter this down to just the file? Can I add an exclude on the get-childitem?
Param (
[string]$Path = "\\path\share",
[string]$SMTPServer = "smtp server",
[string]$From = "email",
[string]$To = "email",
[string]$Subject = "New File(s) Received on the FTP site"
)
$SMTPMessage = @{
To = $To
From = $From
Subject = "$Subject at $Path"
Smtpserver = $SMTPServer
}
$File = Get-ChildItem $Path | Where { $_.LastWriteTime -ge (Get-Date).Addminutes(-10) }
If ($File)
{ $SMTPBody = "`nTo view these new files, click the link(s) below:`n`n "
$File | ForEach { $SMTPBody += "$($_.FullName)`n" }
Send-MailMessage @SMTPMessage -Body $SMTPBody
}
Thanks
To list only files, you can pass the -File parameter to Get-ChildItem.
From Get-ChildItem for FileSystem:
To get only files, use the File parameter and omit the Directory parameter.
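Applied to the script above, the change is a single line (assuming PowerShell 3.0 or later, where -File is available):
#-File returns only files, so newly created folders no longer trigger the email
$File = Get-ChildItem $Path -File | Where { $_.LastWriteTime -ge (Get-Date).AddMinutes(-10) }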

how to check if a specific file extension exists in a folder using powershell?

I have a root directory that consists of many folders and subfolders. I need to check whether particular files, like *.sln or *.designer.vb, exist in the folders or subfolders, and output the result to a text file.
For Eg:
$root = "C:\Root\"
$FileType = ".sln",".designer.vb"
the text file will have result somewhat like below:
.sln ---> 2 files
.sln files path ---->
c:\Root\Application1\subfolder1\Test.sln
c:\Root\Application2\subfolder1\Test2.sln
Any help will be highly appreciated!
Regards,
Ashish
Try this:
function Get-ExtensionCount {
param(
$Root = "C:\Root\",
$FileType = @(".sln", ".designer.vb"),
$Outfile = "C:\Root\rootext.txt"
)
$output = @()
Foreach ($type in $FileType) {
$files = Get-ChildItem $Root -Filter *$type -Recurse | ? { !$_.PSIsContainer }
$output += "$type ---> $($files.Count) files"
foreach ($file in $files) {
$output += $file.FullName
}
}
$output | Set-Content $Outfile
}
I turned it into a function with your values as the default parameter values. Call it by using
Get-ExtensionCount #for default values
Or
Get-ExtensionCount -Root "d:\test" -FileType ".txt", ".bmp" -Outfile "D:\output.txt"
Example of the output saved to the file:
.txt ---> 3 files
D:\Test\as.txt
D:\Test\ddddd.txt
D:\Test\sss.txt
.bmp ---> 2 files
D:\Test\dsadsa.bmp
D:\Test\New Bitmap Image.bmp
To get all the file counts listed at the start, try:
function Get-ExtensionCount {
param(
$Root = "C:\Root\",
$FileType = @(".sln", ".designer.vb"),
$Outfile = "C:\Root\rootext.txt"
)
#Filecount per type
$header = @()
#All the filepaths
$filelist = @()
Foreach ($type in $FileType) {
$files = Get-ChildItem $Root -Filter *$type -Recurse | ? { !$_.PSIsContainer }
$header += "$type ---> $($files.Count) files"
foreach ($file in $files) {
$filelist += $file.FullName
}
}
#Collect to single output
$output = @($header, $filelist)
$output | Set-Content $Outfile
}
Here's a one-liner to determine if at least one file with extension .txt or .ps1 exists in the directory $OutputPath:
(Get-ChildItem -Path $OutputPath -force | Where-Object Extension -in ('.txt','.ps1') | Measure-Object).Count
Explanation: the command tells you the number of files in the specified directory matching any of the listed extensions. You can append -ne 0 to the end, which returns true or false to be used in an if block.
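For example, a sketch of that if block:
#True when at least one .txt or .ps1 file exists directly under $OutputPath
if ((Get-ChildItem -Path $OutputPath -Force | Where-Object Extension -in ('.txt','.ps1') | Measure-Object).Count -ne 0)
{
    Write-Host "Matching files found in $OutputPath"
}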
This will search the directory $root and its subdirectories for files of type $FileType, including hidden files and excluding directories:
$root = "C:\Root\";
$FileType = "*.sln", "*.designer.vb";
$results = Get-ChildItem -Path $root -Force -Recurse `
| Where-Object {
if ($_ -isnot [System.IO.DirectoryInfo])
{
foreach ($pattern in $FileType)
{
if ($_.Name -like $pattern)
{
return $true;
}
}
}
return $false;
}
Note that I've modified the strings in $FileType to be formatted as wildcard patterns. Then group the files by extension:
$resultGroups = $results | Group-Object -Property 'Extension';
Then loop through each group and print the file count and paths:
foreach ($group in $resultGroups)
{
# $group.Count: The number of files with that extension
# $group.Group: A collection of FileInfo objects
# $group.Name: The file extension with leading period
Write-Host "$($group.Name) ---> $($group.Count) files";
Write-Host "$($group.Name) files path ---->";
foreach ($file in $group.Group)
{
Write-Host $file.FullName;
}
}
To write the results to a file instead of the console, use the Out-File cmdlet instead of the Write-Host cmdlet.
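For example, the same loop can collect its lines and write them in one go; a sketch, where the output path is only an example:
# Collect the report lines, then write them to a file instead of the console
$report = foreach ($group in $resultGroups)
{
    "$($group.Name) ---> $($group.Count) files";
    "$($group.Name) files path ---->";
    foreach ($file in $group.Group)
    {
        $file.FullName;
    }
}
$report | Out-File -FilePath "C:\Root\rootext.txt";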
