Powershell Remove Oldest Items - loops

I am trying to use a do while loop to remove old files in multiple directories (starting from the oldest) until there is one left, at which point the program should end. The program should also only run if there is more than one file in the directory at runtime.
Here is my environment:
Top Folder
  Folder 1
  Folder 2
  etc
In Folder 1, Folder 2, etc there should only be one file. The script should delete everything but the latest and nothing at all if there is only one file in there to begin with.
I have semi-accomplished this using the following code:
$basedir = "C:\Test"
Set-Location -Path C:\Test
$a = Get-ChildItem -Recurse $basedir
if ($a.Count -gt 1) {
    do {
        $a | Sort-Object LastWriteTime -Descending | Select-Object -Last 1 | Remove-Item
    } while ($a.Count -gt 1)
}
It will only run when there is more than one file present, which is correct.
It correctly deletes the oldest file, but then it keeps on trying to delete the same file rather than rechecking the directory.
All I need help with at this point is getting it to re-run the loop once it has deleted a file, rather than trying to delete the same file over and over.
Thank you, sincerely, for any help and I apologise if this has been answered before. I did a lot of searching but couldn't find something with my situation.
Brad

To fix yours, you should re-define $a on every pass through the loop, or avoid the intermediate variable altogether. $a doesn't change just because the child items it was built from have.
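A minimal sketch of that fix, assuming PowerShell 3+ for the -File switch (the function name is mine, and the folder is re-queried on every pass):

```powershell
function Remove-AllButNewest {
    param([string]$BaseDir)
    # Re-query the folder on every pass so the count and the oldest
    # file reflect the deletions made so far.
    while (@(Get-ChildItem -File -Recurse $BaseDir).Count -gt 1) {
        Get-ChildItem -File -Recurse $BaseDir |
            Sort-Object LastWriteTime -Descending |
            Select-Object -Last 1 |
            Remove-Item
    }
}

# e.g. Remove-AllButNewest -BaseDir "C:\Test"
```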
Another way to do it would be to just select the last n-1 items in the folder.
#Get items that aren't folders
$items = Get-ChildItem "c:\test" | ? {!($_.PSIsContainer)}
#Get a count - this is easier in PoSh 3
$itemCount = ($items | Measure-Object).Count
if ($itemCount -gt 1){
$items | Sort-Object LastWriteTime -Descending | Select-Object -Last ($itemCount - 1) | Remove-Item
}
To do this recursively is also fairly easy. First we get all of the folders we'll need to do the work in, then we'll move the code above into a foreach loop:
$folders = Get-ChildItem "C:\test" -recurse | ? {$_.PSIsContainer}
foreach ($folder in $folders) {
#Get Items that aren't folders
$items = Get-ChildItem $folder.FullName | ? {!($_.PSIsContainer)}
#Get a count - this is easier in PoSh 3
$itemCount = ($items | Measure-Object).Count
if ($itemCount -gt 1){
$items | Sort-Object LastWriteTime -Descending | Select-Object -Last ($itemCount - 1) | Remove-Item
}
}
You might need to run both scripts to prune the root as well, or add something like $folders += Get-Item "C:\test" before the foreach loop.

Related

PowerShell find files one by one

I have a script that examines a folder and finds the oldest file (by LastWriteTime) and writes the found file's LastWriteTime to a log file.
If I run this script again I would like it to find the next oldest file, which has LastWriteTime greater than the one previously written to the log file.
But there is a problem. My script can find only the oldest file in the folder each time and ignores the one in the log file.
My script:
$programdir = "C:\Data\PowerShell\Learning"
$folder = "C:\Data\PowerShell\Learning\folder" #there is the files
$TimeLog = "$programdir\LastFileDate.log" #this file contains the last found file's LastWriteTime attribute
$LastWriteTime = Get-Content $TimeLog
$File = Get-ChildItem -Path $folder | Sort-Object LastWriteTime -Descending | Select-Object -Last 1 | Where-Object {$_.LastWriteTime -gt $LastWriteTime}
Clear-Content $TimeLog
$File.LastWriteTime | Set-Content $TimeLog
You immediately cripple your selection set with this line, specifically the Select-Object -Last 1:
$File = Get-ChildItem -Path $folder | Sort-Object LastWriteTime -Descending | Select-Object -Last 1 | Where-Object {$_.LastWriteTime -gt $LastWriteTime}
In the second-last pipe statement you limit your selection set to one file, and only afterwards apply your date logic. You need to filter on your dates first, then grab the appropriate entry.
Get-ChildItem -Path $folder -File |
Sort-Object LastWriteTime -Descending |
Where-Object {$_.LastWriteTime -gt $LastWriteTime} |
Select-Object -Last 1
There are other similar approaches that would work as well.
Also...
Clear-Content $TimeLog
$File.LastWriteTime | Set-Content $TimeLog
That is redundant since Set-Content will overwrite by default. You can remove the Clear-Content.
While this is not an issue in your code, be aware that $LastWriteTime, as returned by Get-Content, is a string and not a datetime object. Since it is on the RHS of the comparison in your where clause, it is cast to [datetime] for the purpose of evaluating the clause.
Also be careful that your code could act differently if there is more than one line in your $TimeLog.
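A defensive sketch of the read step under those two caveats: only the first log line is read, and it is cast explicitly (the sample log written here is fabricated so the snippet stands alone):

```powershell
# Fabricated stand-in for the $TimeLog file the script above writes.
$TimeLog = Join-Path ([IO.Path]::GetTempPath()) "LastFileDate.log"
Set-Content -Path $TimeLog -Value "2024-05-01 10:30:00"

# Read only the first line (-TotalCount 1) and cast it to [datetime]
# explicitly, so extra lines or string comparison cannot skew -gt/-lt.
$LastWriteTime = [datetime](Get-Content $TimeLog -TotalCount 1)
```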

Using powershell to count number of files in subfolder with specific name

So I've started working on a problem where I need to know how many files are in a subfolder of a certain name, which is repeated multiple times throughout the directory. All folders I want to count have the same name. For example:
Main Folder
  Subfolder
    Folder I want to count
    Folder A
    Folder B
  Subfolder
    Folder I want to count
    Folder C
    Folder D
I'm able to count the number of files in all subfolders recursively, but I don't know how to only look at folders named "Folder I want to count".
This is where I've gotten so far to count everything. What do I need to add/modify to only look at and count the area I want? I'm not familiar with PowerShell, and have been working to make sense of various questions and cobble this together.
Get-ChildItem -recurse | Where {!$_.PSIsContainer} | Group Directory | Format-Table Name, Count -autosize
I would probably do something like this.
First get all the folders, then run through them. And if the folder is the "folder_I_want", get the count.
$folders = Get-ChildItem C:\Users\David\Documents\SAPIEN\test -Recurse | ?{ $_.PSIsContainer } | select name, fullname
foreach ($folder in $folders)
{
#Write-Host "$folder"
if ($folder.name -eq "folder_I_want")
{
$fullname = $folder.fullname
#Gets the count of the files in "Folder I want". It will filter out folders.
$count = (Get-ChildItem $fullname | where { $_.PSIsContainer -EQ $false }).count
Write-Host "The amount of files in the folder I want: $count"
}
}
Something like this?
(gci -Path c:\temp\ -Recurse -File | where fullname -like "*\yourname\*").Count
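If a per-instance breakdown is wanted rather than one grand total, something along these lines should work (the function name and output shape are my own):

```powershell
function Get-FolderFileCounts {
    param([string]$Root, [string]$FolderName)
    # Find every folder with the given name, then count the
    # non-folder items directly inside each one.
    Get-ChildItem $Root -Recurse | Where-Object {
        $_.PSIsContainer -and $_.Name -eq $FolderName
    } | ForEach-Object {
        [pscustomobject]@{
            Path  = $_.FullName
            Count = @(Get-ChildItem $_.FullName | Where-Object { -not $_.PSIsContainer }).Count
        }
    }
}

# e.g. Get-FolderFileCounts -Root "C:\Main Folder" -FolderName "Folder I want to count"
```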

How to search Powershell array for specific strings and use this result in a copy-item -include script

I'm trying to create a synchronization script in Powershell so that my applications in MDT are being copied on a regular basis to our main file server, based on the folder name (in MDT, applications are in one folder, where our main server has applications split depending on the department who uses them).
From what I read on the web, the best way would be to populate an array with "Get-ChildItem", which I kinda figured how to do (see code below).
After the array is populated though, I don't know how to search that array for specific results, nor do I know how to use those results with copy-item.
In a nutshell, here's what I need to do: Build an array using "Get-ChildItem", query the resulting array for specific folders, and have those folders be copied to specific destinations.
Here's the code I have so far:
$arr = Get-ChildItem \\slmtl-wds02.domain.inc\deploymentshare$\applications |
Where-Object {$_.PSIsContainer} |
Foreach-Object {$_.Name}
$sourcepath = '\\slmtl-wds02.domain.inc\deploymentshare$\applications'
$destSLARC = '\\slmtl-fs01.domain.inc\folder\it_services\private\software\service_desk\pc\SLARCMTL'
$destSLMTL = '\\slmtl-fs01.domain.inc\folder\it_services\private\software\service_desk\pc\SLMTL'
$destSLGLB = '\\slmtl-fs01.domain.inc\folder\it_services\private\software\service_desk\pc\SLGLB'
$destSLTECH = '\\slmtl-fs01.domain.inc\folder\it_services\private\software\service_desk\pc\SLTECH'
Thanks in advance for your help :)
$sourceLocation = "c:\analysis\"
$targetLocation = "c:\analysisCopy\"
$included = @("folder1", "folder2")
$result = @()
foreach ($i in $included){
$result += get-ChildItem $sourceLocation -filter $i | Where-Object {$_.PSIsContainer}
}
$result | foreach-Object { copy-item $_.FullName -Destination $targetLocation -Recurse}
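To route each matched folder to its own destination (the question's $destSLARC and friends), one option is a lookup table keyed on folder name. The function and mapping below are illustrative; the real keys would be the MDT application folder names:

```powershell
function Copy-MappedFolders {
    param([string]$SourcePath, [hashtable]$DestinationMap)
    # Copy each subfolder whose name appears in the map
    # to the destination recorded for that name.
    Get-ChildItem $SourcePath | Where-Object { $_.PSIsContainer } | ForEach-Object {
        if ($DestinationMap.ContainsKey($_.Name)) {
            Copy-Item $_.FullName -Destination $DestinationMap[$_.Name] -Recurse -Force
        }
    }
}

# e.g. Copy-MappedFolders -SourcePath '\\slmtl-wds02.domain.inc\deploymentshare$\applications' `
#          -DestinationMap @{ "SLARCMTL" = $destSLARC; "SLMTL" = $destSLMTL }
```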
Hope this works. Change the path D:\ to your desired path, and enter the name of the folder you are looking for.
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.VisualBasic') | Out-Null
$Keyword = [Microsoft.VisualBasic.Interaction]::InputBox("Enter your Query")
Get-ChildItem D:\ -Recurse | Where-Object {$_.PSIsContainer -eq $false -and $_.Name -match "$Keyword"} | Copy-Item -Destination d:\test

Powershell script to Get-Content of last line in multiple files (loop array)

I have a folder with a list of files (*.log files) which all end with the following line, showing the output (different files will have different amounts) e.g.:
Total MB moved to Archive = 369.06444644928
I have the following query which will get the result of the first file in the folder:
Get-ChildItem $path| Where-Object {$_.Name} | Get-Content | select -last 1
I also have the following query which will output a list of all the log files:
Get-ChildItem $Path | ForEach {
Write-Host $_.Name}
I just want to create a loop to print the last line of each log, not the log name or the last line of only one log. And then the complex part: pull out the number and add these together. Any tips?
If I'm reading the question correctly, this should work:
Get-ChildItem $Path |
ForEach {
Write-Host $_.Name
[double]$MB_Moved += (Get-Content $_.FullName -Tail 1) -replace '.+?([\d.]+)\s*','$1'
}
Write-Host "Total MB moved: $MB_Moved"
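An alternative sketch that parses the trailing number with -match and sums via Measure-Object, under the same assumption that every log ends with a "Total MB moved to Archive = …" line (the function name is mine):

```powershell
function Get-ArchiveTotal {
    param([string]$Path)
    # Take the last line of each log, keep the trailing number, sum them.
    $values = Get-ChildItem $Path -Filter *.log | ForEach-Object {
        $last = Get-Content $_.FullName -Tail 1
        if ($last -match '([\d.]+)\s*$') { [double]$Matches[1] }
    }
    ($values | Measure-Object -Sum).Sum
}
```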

Loop/Cycle through Subdirectories and determine file count in each

I have the directory E:\NugetRoot\NugetServer where I need to cycle through the subdirectories on this path, and within the packages folder of each subdirectory I need to count the files ending in .nupkg and output them to a csv file named d:\monitoring\NugetStatistics; each time the script is run, it should append to the file.
Count the files ending in .nupkg in "C:\NugetRoot\NugetServer\**\Packages" for each folder. (I need to loop through the ** folders and count each file ending in .nupkg.)
Output in a csv file with two columns: one showing the "**" folder name and the other showing the file count.
First find all the *.nupkg files using Get-ChildItem with the recurse flag to get all files in subfolders, then filter the results using a regex to exclude any where the final folder is not called Packages. Then use another regex to extract the previous folder name, feed that into Group-Object to get the count, and then into Export-Csv, which includes the append flag.
cd E:\NugetRoot\NugetServer
Get-ChildItem -Filter *.nupkg -Recurse | ? {
$_.DirectoryName -match '\\Packages$'
} | % {
$_.DirectoryName -Replace '^.*\\([^\\]+)\\Packages$', '$1'
} | Group-Object | Select Name, Count | Export-Csv outfile.csv -Append -NoTypeInformation
cd "C:\NugetRoot\NugetServer"
$a = Get-ChildItem -Name
foreach ($i in $a) {
    $b = (Get-ChildItem -Recurse -Force -Include *.nupkg -Path $i | Select-Object -ExpandProperty Name).Count
    $i + "`t" + $b
}
Here's what I have so far. It displays the server name and ProjectGroupID (or folder name), but I get an error for the package count. Also, I am having trouble getting the average file size; I commented those parts out:
$folders = gci C:\NuGetRoot\NugetServer -Directory
foreach($folder in $folders){
[pscustomobject]@{ServerName=$env:COMPUTERNAME;
ProjectGroupID = $folder.Name;
NuGetPackageCount = (gci $folder.FullName\packages -Include '*.nupkg') | %{$_.Size}.Count;
#AverageSize= Measure-Object (listof sizes) -Average
} #| Export-Csv -Path c:\temp -NoTypeInformation -Append
}
Measure-Object -Average