PowerShell nested loop for renaming files in directory - loops

I am trying to do a nested loop in PowerShell. I am renaming images in several subdirectories, but I need the loop to reset when it finds a new directory.
Here is the code I wrote, but I can't get it to work: it gives me an error saying the file doesn't exist.
function rename-cruise {
    Param([string]$parkname, [string]$extension = ".jpg")
    if (!$parkname) {
        $park = $args[0]
    } else {
        $park = $parkname
    }
    $dirs = Get-ChildItem -Recurse | ?{ $_.PSIsContainer }
    foreach ($dir in $dirs) {
        if ($dir -cnotlike "177") {
            $i = 0
            $currentDir = Get-Location
            $location = "$currentDir\$dir"
            Write-Host $currentDir
            Write-Host $location
            Get-ChildItem -Path $location -Filter "*$extension" |
                ForEach-Object {
                    if ($i -lt 10) {
                        $newName = "$park-0$i$extension"
                        Rename-Item $_.Name -NewName $newName
                    } else {
                        $newName = "$park-$i$extension"
                        Rename-Item $_.Name -NewName $newName
                    }
                    $i++
                }
        }
    }
}
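The "file doesn't exist" error is most likely because Rename-Item receives only $_.Name while the shell's current location is still the parent folder, so the relative path cannot be resolved. A minimal sketch of a fix, assuming the same folder layout (and leaving out the "177" exclusion for brevity), passes the full path instead and formats the counter with -f rather than branching on $i:
function rename-cruise {
    Param([string]$park, [string]$extension = ".jpg")
    # Every subdirectory below the current location
    $dirs = Get-ChildItem -Recurse | Where-Object { $_.PSIsContainer }
    foreach ($dir in $dirs) {
        $i = 0   # counter resets for each new directory
        Get-ChildItem -Path $dir.FullName -Filter "*$extension" | ForEach-Object {
            # Using FullName avoids the relative-path lookup that caused the error
            $newName = "{0}-{1:D2}{2}" -f $park, $i, $extension
            Rename-Item -Path $_.FullName -NewName $newName
            $i++
        }
    }
}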

Related

Powershell script for reporting old folders on multiple shares

I am hoping for some guidance. I need to analyze multiple shares for folders that have not been modified in 6 months. For each share, I have defined specific paths that need to be examined, and a CSV should be generated for each share. The PowerShell script works fine for checking a single share, but I can't get it to work correctly for multiple shares. Here's a sample of the script.
$age = (Get-Date).AddMonths(-6)
# Export path
$outputPath = "D:\Scripting\xPowershell_Results\"
# Export file name
$FilePath1 = "OldMediaPath1.csv"
$FilePath2 = "OldMediaPath2.csv"
# Paths to scan
$CheckPath1 = @(
    "\\servername\share1\Projects"
    "\\servername\share1\Restored"
)
$CheckPath2 = @(
    "\\servername\share2\Graphics"
)
$folders = dir $CheckPath1 | Where-Object { $_.PSIsContainer -eq $true } | Where-Object { $_.LastWriteTime -lt $age }
$results = foreach ($folder in $folders) {
    $bytecount = (dir $folder.FullName -Recurse | Measure-Object -Property Length -Sum).Sum
    switch ($bytecount)
    {
        {$_ -lt 1KB} { $size = "{0:N0} Bytes" -f $_; break }
        {$_ -lt 1MB} { $size = "{0:N2} KB" -f ($_ / 1KB); break }
        {$_ -lt 1GB} { $size = "{0:N2} MB" -f ($_ / 1MB); break }
        {$_ -lt 1TB} { $size = "{0:N2} GB" -f ($_ / 1GB); break }
        {$_ -lt 1PB} { $size = "{0:N2} TB" -f ($_ / 1TB); break }
        default      { $size = "{0:N2} PB" -f ($_ / 1PB) }
    }
    [PSCustomObject]@{
        Name          = $folder.FullName
        Size          = $size
        LastWriteTime = $folder.LastWriteTime
    }
}
$results | Export-Csv $outputPath$FilePath1 -NoTypeInformation
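One way to make this work for multiple shares, sketched below as a suggestion rather than the original script, is to pair each set of check paths with its CSV name and loop over the pairs; the size-bucket switch from above is reduced here to a plain MB figure to keep the sketch short:
# Pair each set of check paths with its output file (variable names reused from above)
$shares = @(
    @{ Paths = $CheckPath1; Csv = $FilePath1 },
    @{ Paths = $CheckPath2; Csv = $FilePath2 }
)
foreach ($share in $shares) {
    $folders = dir $share.Paths | Where-Object { $_.PSIsContainer -and $_.LastWriteTime -lt $age }
    $results = foreach ($folder in $folders) {
        $bytecount = (dir $folder.FullName -Recurse | Measure-Object -Property Length -Sum).Sum
        [PSCustomObject]@{
            Name          = $folder.FullName
            SizeMB        = "{0:N2}" -f ($bytecount / 1MB)
            LastWriteTime = $folder.LastWriteTime
        }
    }
    $results | Export-Csv (Join-Path $outputPath $share.Csv) -NoTypeInformation
}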

File sorting based on file content (string)

I want to build a modular script that sorts files based on content (strings/Get-Content in PowerShell).
Requirements:
Define a directory ($Directory).
Start a foreach loop that lists the items in the directory with their full paths:
foreach ($FilePath in Get-ChildItem $Directory | Select-Object -ExpandProperty FullName)
Load the content of one file at a time into memory:
$content = Get-Content $FilePath
Search for the keyword and copy the file once a particular keyword is found:
if ($content -match 'keyword1') { Copy-Item $FilePath $OutputPath }
While I am able to do this in a static manner using the code below, I want to modularise it for reuse.
[string] $Directory = "path to source directory"
[string] $OutputPath1 = "outpath for keyword1"
[string] $OutputPath2 = "outpath for keyword2"
[string] $OutputPath3 = "outpath for keyword3"
foreach ($FilePath in Get-ChildItem $Directory | Select-Object -ExpandProperty FullName) {
    [string] $content = Get-Content $FilePath
    if ($content -match 'keyword1') {
        Copy-Item $FilePath $OutputPath1
    } elseif ($content -match 'keyword2') {
        Copy-Item $FilePath $OutputPath2
    } else {
        Copy-Item $FilePath $OutputPath3
    }
}
My questions:
Is it possible to define the keywords in a single array (keyword1, keyword2, keyword3)? If so, how do I do that in PowerShell?
Can the keywords be checked sequentially against each file so that, whenever one keyword is detected, the file is copied to its designated folder? Can this be done in a modular fashion, or will I have to define a directory for each keyword?
The reason I am doing this is that while the script is only used for 2 or 3 keywords right now, it will eventually be used for over 50 keywords, so making it reusable would help.
What you describe could be achieved with a hashtable and a nested loop:
$outpath = @{
    'keyword1' = 'outpath for keyword1'
    'keyword2' = 'outpath for keyword2'
    'keyword3' = 'outpath for keyword3'
}
foreach ($FilePath in Get-ChildItem $Directory | Select-Object -Expand FullName) {
    $content = Get-Content $FilePath
    foreach ($keyword in $outpath.Keys) {
        if ($content -match $keyword) {
            Copy-Item $FilePath $outpath[$keyword]
            break
        }
    }
}
Alternatively you could use a switch statement:
$outpath = @{
    'keyword1' = 'outpath for keyword1'
    'keyword2' = 'outpath for keyword2'
    'keyword3' = 'outpath for keyword3'
}
$pattern = ($outpath.Keys | ForEach-Object { [regex]::Escape($_) }) -join '|'
foreach ($FilePath in Get-ChildItem $Directory | Select-Object -Expand FullName) {
    $content = Get-Content $FilePath
    switch -regex ($content) {
        $pattern {
            # $Matches[0] holds whichever keyword from the combined pattern matched
            Copy-Item $FilePath $outpath[$Matches[0]]
            break
        }
    }
}
The latter would also give you a simple way of specifying a fallback destination path if you also want to handle files with no matching keyword.
$fallbackpath = '...'
foreach ($FilePath in Get-ChildItem $Directory | Select-Object -Expand FullName) {
    $content = Get-Content $FilePath
    switch -regex ($content) {
        $pattern {
            Copy-Item $FilePath $outpath[$Matches[0]]
            break
        }
        default {
            Copy-Item $FilePath $fallbackpath
            break
        }
    }
}
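To cover the reuse question for the 50-keyword case, the nested-loop version can also be wrapped in a function that takes the directory and the keyword-to-destination map as parameters. The function name, parameter names, and usage paths below are placeholders, not part of the original answer:
function Copy-FileByKeyword {
    param(
        [Parameter(Mandatory=$true)][string]$Directory,
        [Parameter(Mandatory=$true)][hashtable]$KeywordMap,  # keyword -> destination folder
        [string]$FallbackPath                                 # optional: destination for files with no match
    )
    $files = Get-ChildItem $Directory | Where-Object { -not $_.PSIsContainer }
    foreach ($FilePath in $files | Select-Object -ExpandProperty FullName) {
        $content = Get-Content $FilePath
        $copied = $false
        foreach ($keyword in $KeywordMap.Keys) {
            if ($content -match [regex]::Escape($keyword)) {
                Copy-Item $FilePath $KeywordMap[$keyword]
                $copied = $true
                break
            }
        }
        if (-not $copied -and $FallbackPath) {
            Copy-Item $FilePath $FallbackPath
        }
    }
}

# Usage (paths are placeholders):
Copy-FileByKeyword -Directory 'C:\Input' -KeywordMap @{ 'keyword1' = 'C:\Out1'; 'keyword2' = 'C:\Out2' }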

How to adapt Get-ChildItemToDepth to return depth in an array?

I am using PowerShell 2.0 and a function to list all directories ending in "_S" down to a depth of 2 levels.
For example, where Push-Location is "\\MyServer\Shared\toto\", the result is:
\\MyServer\Shared\toto\Folder1_S
\\MyServer\Shared\toto\Folder1_S\Folder2_S
\\MyServer\Shared\toto\Folder1_S\Folder3_S
Now I would like the function to also return the depth level in the array, for example:
1; \\MyServer\Shared\toto\Folder1_S
2; \\MyServer\Shared\toto\Folder1_S\Folder2_S
2; \\MyServer\Shared\toto\Folder1_S\Folder3_S
function Get-ChildItemToDepth {
    param(
        [String]$Path = $PWD,
        [String]$Filter = "*_S",
        [Byte]$ToDepth = 2,
        [Byte]$CurrentDepth = 0,
        [Switch]$DebugMode
    )
    $CurrentDepth++
    if ($DebugMode) { $DebugPreference = "Continue" }
    Get-ChildItem $Path | ForEach-Object {
        $_ | Where-Object { ($_.Attributes -match "Directory") -and ($_.Name -like $Filter) } | Select-Object -ExpandProperty FullName
        #Write-Host $CurrentDepth
        if ($_.PsIsContainer) {
            if ($CurrentDepth -le $ToDepth) {
                # Callback to this function
                Get-ChildItemToDepth -Path $_.FullName -Filter $Filter -ToDepth $ToDepth -CurrentDepth $CurrentDepth
            } else {
                Write-Host $("Skipping GCI for Folder: $($_.FullName) " +
                    "(Why: Current depth $CurrentDepth vs limit depth $ToDepth)")
            }
        }
    }
}
How do I adapt the function to return an array with the depth?
Instead of outputting just the FullName value, create a string with your current depth and the full name. Example (I've also cleaned it up a bit):
function Get-ChildItemToDepth {
    param(
        [String]$Path = $PWD,
        [String]$Filter = "*_S",
        [int]$MaxDepth = 2,
        [int]$CurrentDepth = 1,
        [Switch]$DebugMode
    )
    if ($DebugMode) { $DebugPreference = "Continue" }
    Get-ChildItem $Path | Where-Object { $_.PSIsContainer } | ForEach-Object {
        #Write-Host $CurrentDepth
        if ($_.Name -like $Filter) {
            # Match found. Output "Level; Path"
            "$CurrentDepth; $($_.FullName)"
        }
        # Recursion
        if ($CurrentDepth -lt $MaxDepth) {
            # Callback to this function
            Get-ChildItemToDepth -Path $_.FullName -Filter $Filter -MaxDepth $MaxDepth -CurrentDepth ($CurrentDepth + 1)
        } else {
            Write-Host $("Skipping GCI for Folder: $($_.FullName) " +
                "(Why: Current depth $CurrentDepth vs limit depth $MaxDepth)")
        }
    }
}
Get-ChildItemToDepth -Path \\MyServer\Shared\toto\
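If an array of objects is more convenient than "depth; path" strings (for sorting or exporting later), the output can be parsed into objects with Depth and FullName properties. This post-processing step is a suggestion on top of the answer above, not part of it; New-Object is used so it also runs on PowerShell 2.0:
$result = Get-ChildItemToDepth -Path \\MyServer\Shared\toto\ | ForEach-Object {
    # Split "2; \\server\share\folder" into its depth and path parts
    $depth, $fullName = $_ -split ';\s*', 2
    New-Object PSObject -Property @{ Depth = [int]$depth; FullName = $fullName }
}
$result | Sort-Object Depth | Format-Table Depth, FullName -AutoSize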

Check if file is in use before exporting

I'm writing a script that will export an Excel file to a PDF. I got that part working. However, because I'm saving on top of an existing PDF, that PDF cannot be open when the export happens. I'm looking for a way to have PowerShell check whether the file is currently open and, if so, wait X seconds and then check again; if not, it can proceed.
It currently works, but it breaks if the PDF is open; I need it to loop instead.
Here's what I have so far:
$path = "c:\users\XXXXX\documents"
$xlFixedFormat = "Microsoft.Office.Interop.Excel.xlFixedFormatType" -as [type]
$excelFiles = Get-ChildItem -Path $path -Include spreadsheet.xlsx -Recurse
$File = "c:\users\XXXXX\documents\Exported.pdf"
try {
[IO.File]::OpenWrite($File).Close();
$true
} catch {
break
}
$objExcel = New-Object -ComObject Excel.Application
$objExcel.Visible = $false
foreach ($wb in $excelFiles) {
$filepath = Join-Path -Path $path -ChildPath ($wb.BaseName + ".pdf")
$workbook = $objExcel.Workbooks.Open($wb.FullName, 3)
$workbook.Saved = $true
"saving $filepath"
$workbook.ExportAsFixedFormat($xlFixedFormat::xlTypePDF, $filepath)
$objExcel.Workbooks.Close()
}
$objExcel.Quit()
This should do what you want; thanks to Ben Baird for the Test-FileLock function.
function Test-FileLock {
    param ([parameter(Mandatory=$true)][string]$Path)
    $oFile = New-Object System.IO.FileInfo $Path
    if ((Test-Path -Path $Path) -eq $false) {
        return $false
    }
    try {
        $oStream = $oFile.Open([System.IO.FileMode]::Open, [System.IO.FileAccess]::ReadWrite, [System.IO.FileShare]::None)
        if ($oStream) {
            $oStream.Close()
        }
        # file is unlocked.
        $false
    }
    catch {
        # file is locked by a process.
        return $true
    }
}
$path = "c:\users\XXXXX\documents"
$xlFixedFormat = "Microsoft.Office.Interop.Excel.xlFixedFormatType" -as [type]
$excelFiles = Get-ChildItem -Path $path -Include spreadsheet.xlsx -Recurse
$File = "c:\users\XXXXX\documents\Exported.pdf"
while ((Test-FileLock $File) -eq $true) {
    Start-Sleep -Seconds 3
}
$objExcel = New-Object -ComObject Excel.Application
$objExcel.Visible = $false
foreach ($wb in $excelFiles) {
    $filepath = Join-Path -Path $path -ChildPath ($wb.BaseName + ".pdf")
    $workbook = $objExcel.Workbooks.Open($wb.FullName, 3)
    $workbook.Saved = $true
    "saving $filepath"
    $workbook.ExportAsFixedFormat($xlFixedFormat::xlTypePDF, $filepath)
    $objExcel.Workbooks.Close()
}
$objExcel.Quit()
The code will check for a file lock and if it is detected wait 3 seconds and then try again. Once the lock is cleared the PDF export code will run.
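If you would rather not wait indefinitely when the PDF stays open, the wait loop can also be given a retry cap; the 3-second interval and 20-attempt limit below are arbitrary choices, not part of the original answer:
$attempts = 0
while ((Test-FileLock $File) -and $attempts -lt 20) {
    Start-Sleep -Seconds 3
    $attempts++
}
if (Test-FileLock $File) {
    # Still locked after roughly a minute; skip the export instead of blocking forever
    Write-Warning "$File is still locked after $attempts checks; aborting export."
    return
}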

Find file extension in folder and sum total size

I am trying to find all file extensions in a folder and its subfolders and generate a list. I found a one-liner, but it does not generate the list the way I want. I have multiple paths, so I do this:
$date = Get-Date -Format d
$File = "C:\NoBackup\FolderPaths.txt"
foreach ($Folder in (Get-Content $File)) {
    Get-ChildItem $Folder -Recurse -ErrorAction SilentlyContinue |
        Group-Object Extension |
        Select-Object @{Name = "Folder"; Expression = { $Folder }}, Name,
            @{n = 'TotalSize'; e = { $_.Group | ForEach-Object -Begin { $size = 0 } -Process { $size += [decimal]::Round($_.Length / 1MB) } -End { "$size MB" } }} |
        Sort-Object -Property 'TotalSize' -Descending |
        Format-Table -AutoSize
}
This gives a new header for each folder in FolderPaths.txt, and I need the result to look like this:
            .ext1  .ext2  .ext3  .ext4
D:\Folder1  5MB    12MB   20MB   8MB
D:\Folder2  10MB   54MB   12MB   3MB
D:\Folder3  2MB    12MB   20MB   100MB
I can't figure out how to rewrite the code to get what I need. Hope you can help me out with this.
The script works now. I needed to change
foreach ($folder in $folders)
to
foreach ($folder in (Get-Content $file))
It is not short or sweet, but try this:
function ConvertTo-Units($decimal)
{
    $value = [decimal]::Round($decimal / 1MB, 2)
    $units = "MB"
    if ($value -eq 0)
    {
        $value = [decimal]::Round($decimal / 1KB, 2)
        $units = "KB"
    }
    return "{0} {1}" -f $value, $units
}
$File = "C:\NoBackup\FolderPaths.txt"
$fileData = @{}
foreach ($folder in (Get-Content $File))
{
    $files = Get-ChildItem $folder -Recurse -ErrorAction SilentlyContinue -File
    $fileData[$folder] = $files | Select-Object extension, length | % { $h = @{} } { $h[$_.extension] += $_.length } { $h }
}
$extensions = $fileData.Keys | % { $fileData[$_].Keys } | % ToLower | Select-Object -Unique | ? { $_ }
$properties = @(
    @{Name = "Folder"; Expression = { $_ }}
)
$extensions | % { $properties += @{Name = "$_"; Expression = [scriptblock]::Create("ConvertTo-Units `$fileData[`$folder][""$_""]") } }
$result = @()
foreach ($folder in $folders)
{
    $result += $folder | Select-Object $properties
}
$result | ft * -AutoSize -Force
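A shorter alternative, sketched here under the assumption that sizes rounded to MB are acceptable, groups each folder's files by extension and builds one row object per folder with one property per extension. It is not from the original answer, and Format-Table takes its columns from the first row, so folders with differing extension sets may still need the explicit column list built above:
$File = "C:\NoBackup\FolderPaths.txt"
$rows = foreach ($folder in (Get-Content $File)) {
    $row = [ordered]@{ Folder = $folder }
    Get-ChildItem $folder -Recurse -File -ErrorAction SilentlyContinue |
        Group-Object Extension |
        ForEach-Object {
            $ext = if ($_.Name) { $_.Name } else { '(none)' }   # files without an extension
            $mb  = ($_.Group | Measure-Object -Property Length -Sum).Sum / 1MB
            $row[$ext] = "{0:N1} MB" -f $mb
        }
    [PSCustomObject]$row
}
$rows | Format-Table -AutoSize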
