I am new to PowerShell and I am having an issue within a loop that I need assistance with. I am attempting to rename some files that are created as part of the process within the loop.
I have tested the code OUTSIDE of the loop and it works fine. However, when I try to put it in the loop, nothing seems to happen.
The files I need to rename are in the following locations…
(1) "\\MYSERVER\MYCOMPANY\MYFOLDER\MyPrintouts\EstimateImport\ImportPrintout.txt"
(2) "\\MYSERVER\MYCOMPANY\MYFOLDER\MyPrintouts\PostEntries\ImportPostEntries.txt"
I need to tack the date and time on the end. This code works for me OUTSIDE of the loop. I put it in a file I named RenameFiles.ps1
#File location
$ImportPrintout = "\\MYSERVER\MYCOMPANY\MYFOLDER\MyPrintouts\EstimateImport\ImportPrintout.txt"
$ImportPostEntries = "\\MYSERVER\MYCOMPANY\MYFOLDER\MyPrintouts\PostEntries\ImportPostEntries.txt"
#Find and rename the import printouts
Get-ChildItem $ImportPrintout -Filter "ImportPrintout.txt" | ForEach-Object {
    Rename-Item $_.FullName "$BackupFolder$($_.BaseName -replace " ", "_" -replace '\..*?$')$(Get-Date -Format "MMddyyyy-HHmmss").txt" }
Get-ChildItem $ImportPostEntries -Filter "ImportPostEntries.txt" | ForEach-Object {
    Rename-Item $_.FullName "$BackupFolder$($_.BaseName -replace " ", "_" -replace '\..*?$')$(Get-Date -Format "MMddyyyy-HHmmss").txt" }
This is how I added it to the loop as I want the files renamed BEFORE the next file is processed…
#Define actions after an Event is Detected
$action = {
    $files = Get-ChildItem -Path $watcher.Path -Filter $watcher.Filter #-Recurse
    foreach ($file in $files)
    {
        #Define variables for log file
        $changeType = $Event.SourceEventArgs.ChangeType #Your event trigger "Created"
        $fileDate = $file.LastWriteTime #Date/Time Estimate was created
        #logline contains = Date/Time of import, Created, Date/Time Created, filename
        $logline = "$(Get-Date), $changeType, $fileDate, $file"
        #Actions to take ==============================================================
        #Write details to the log
        Add-Content "\\MYSERVER\MYCOMPANY\MYFOLDER\EstimateImportLog.txt" -Value $logline
        #Copy the estimate to the "ToBeProcessed" folder
        Copy-Item $file.FullName -Destination $copyTo
        #Move the estimate to the "EstimateHistory" folder
        Move-Item $file.FullName -Destination $moveTo -Force
        #Run the PowerShell script that launches the .bat file that launches the macro
        Invoke-Expression "& '\\MYSERVER\MYCOMPANY\MYFOLDER\PSscriptToRunBATfile.ps1'"
        #Pause the script for 30 seconds to allow the estimate to finish posting
        Start-Sleep -Seconds 30
        Invoke-Expression "& '\\MYSERVER\MYCOMPANY\MYFOLDER\RenameFiles.ps1'"
    }
}
This seems to "break" my loop. However, I need the renaming to happen BEFORE going to the next file. How can I accomplish renaming these files before moving on? Any assistance would be greatly appreciated. Thank you!
As far as the loop failing goes, you're probably encountering an error. Either set your $ErrorActionPreference to Continue, or set it to Stop and wrap try/catch blocks around your Copy-Item and Move-Item calls to detect and handle errors.
That probably also explains why Copy-Item/Move-Item fail to change the file name: they're running into an error trying to perform that action and failing.
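For example, here is a minimal sketch of that try/catch approach inside the event action (the log path is the one from the question; how you handle the failure is up to you):
$ErrorActionPreference = 'Stop'
try {
    Copy-Item $file.FullName -Destination $copyTo
    Move-Item $file.FullName -Destination $moveTo -Force
}
catch {
    #Log the failure so errors inside the event action aren't silently swallowed
    Add-Content "\\MYSERVER\MYCOMPANY\MYFOLDER\EstimateImportLog.txt" -Value "$(Get-Date), ERROR, $_"
}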
Related
I have some code which deletes a folder, then copies files from a temporary directory to where that folder had been.
Remove-Item -Path '.\index.html' -Force
Remove-Item -Path '.\generated' -Force -Recurse #folder containing generated files
#Start-Sleep -Seconds 10 #uncommenting this line fixes the issue
#$tempDir contains index.html and a sub folder, "generated", which contains additional files.
#i.e. we're replacing the content we just deleted with new versions.
Get-ChildItem -Path $tempDir | % {
    Move-Item -Path $_.FullName -Destination $RelativePath -Force
}
I get an intermittent error on the Move-Item line for the generated path: Move-Item : Cannot create a file when that file already exists.
I've been able to prevent this by adding a hacky Start-Sleep -Seconds 10 after the second Remove-Item statement; though that's not a great solution.
I assume the issue is that the Remove-Item statement completes and the code moves on to the next line before the OS has caught up with the actual file deletion; though that seems odd/worrying. NB: There are ~2,500 files in the generated folder (all between 1-100 KB).
There are no other processes accessing the folders (i.e. I've even closed my explorer windows & tested with this directory being excluded from my AV).
I've considered other options:
using Copy-Item instead of Move-Item. I don't like this as it requires creating new files when they're not required (i.e. a copy is slower than a move)... It's faster than my current sleep hack; but still not ideal.
deleting the files & not the folder, then iterating through the subfolders & copying files to the new locations. This would work, but is a lot more code for something that should be simple; so I don't want to pursue that option.
Robocopy would do the trick; but I'd prefer a pure PowerShell solution. This is the option I'll eventually pick if there is no clean solution though.
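(For reference, the Robocopy fallback would be something along these lines, reusing the $tempDir and $RelativePath variables from the code above; /E copies subfolders and /MOVE deletes the source after copying:)
robocopy $tempDir $RelativePath /E /MOVE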
Question
Has anyone seen this before?
Is it a bug, or have I missed something?
Is anyone aware of a fix / good workaround?
Update
Running the remove in a separate job (i.e. using the code below) did not resolve the issue.
Start-Job -ScriptBlock {
    Remove-Item -Path '.\index.html' -Force
    Remove-Item -Path '.\generated' -Force -Recurse #folder containing generated files
} | Wait-Job | Out-Null
#$tempDir contains index.html and a sub folder, "generated", which contains additional files.
#i.e. we're replacing the content we just deleted with new versions.
Get-ChildItem -Path $tempDir | % {
    Move-Item -Path $_.FullName -Destination $RelativePath -Force
}
Update #2
Adding this works; i.e. rather than waiting a fixed time, we check every second until the path is removed. If it's not removed after 30 seconds we assume it's not going to be, so we carry on regardless (which will cause the Move-Item to throw an error that gets handled elsewhere).
# ... remove-item code ...
Start-Job -ScriptBlock {
    param($Path)
    while (Test-Path $Path) { Start-Sleep -Seconds 1 }
} -ArgumentList '.\generated' | Wait-Job -Timeout 30 | Out-Null
# ... move-item code ...
In the end I settled for this solution; not perfect, but it works.
Remove-Item -Path '.\index.html' -Force
Remove-Item -Path '.\generated' -Force -Recurse #folder containing generated files
#wait until the .\generated directory is fully removed, or until ~30 seconds have elapsed
#(a while loop rather than 1..30 | % { break }, since break inside ForEach-Object
#doesn't just exit the pipeline - it can terminate the surrounding script)
$waited = 0
while ((Test-Path -Path '.\generated' -PathType Container) -and ($waited -lt 30)) {
    Start-Sleep -Seconds 1
    $waited++
}
Get-ChildItem -Path $tempDir | %{
Move-Item -Path $_.FullName -Destination $RelativePath -Force
}
This does the same as the job in update #2 of the question, only without the overhead of a job; it just loops until the item is removed.
Here's the above logic wrapped as a reusable cmdlet:
function Wait-Item {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory, ValueFromPipeline, HelpMessage = 'The path of the item you wish to wait for')]
        [string]$Path,

        [Parameter(HelpMessage = 'How many seconds to wait for the item before giving up')]
        [ValidateRange(1, [int]::MaxValue)]
        [int]$TimeoutSeconds = 30,

        [Parameter(HelpMessage = 'By default the function waits for an item to appear. Adding this switch causes us to wait for the item to be removed.')]
        [switch]$Remove
    )
    process {
        [bool]$timedOut = $true
        for ($i = 0; $i -lt $TimeoutSeconds; $i++) {
            if ((Test-Path -Path $Path) -ne $Remove.IsPresent) { $timedOut = $false; break }
            Start-Sleep -Seconds 1
        }
        if ($timedOut) {
            Write-Error "Wait-Item timed out after $TimeoutSeconds seconds waiting for item '$Path'"
        }
    }
}
#example usage:
Wait-Item -Path '.\generated' -TimeoutSeconds 30 -Remove
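And since -Remove is a switch, the same cmdlet can also wait for an item to appear (the default behaviour):
#wait up to 30 seconds for the new index.html to show up
Wait-Item -Path '.\index.html' -TimeoutSeconds 30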
I have a piece of code to search for all .txt and .csv files in a bunch of submit folders inside a main folder.
#Folder to check for files
$path="\\Myfolder-DEV\RI*"
#variable which contains the number of files to be processed
$NumOfFiles = 0
Get-ChildItem $path -Recurse | % {
    if (($_.Attributes -eq "Directory") -and ($_.FullName -match "submit")) {
        $myPath = $_.FullName -replace "\\", "/"
        Write-Host $myPath;
        $fileEntries = Get-ChildItem -Path $_.FullName -include *.txt, *.csv -recurse;
        foreach ($fileName in $fileEntries) {
            $myDfile = $fileName.FullName -replace "\\", "/"
            Write-Host $myDfile;
            $myOfile = $myDfile -replace '\.[^.\\/]+$'
            Write-Host $myOfile;
            $NumOfFiles = $NumOfFiles + 1
        }
    }
}
echo "`n$NumOfFiles files were processed`n"
When I run this code in DEV, it works fine, but when I run it in QA by modifying the path like this, it does not go through the folders at all.
$path="\\Myfolder-QA\RI*"
It even fails to write a value to the variable here:
Write-Host $myPath;
I've tried modifying the scope, clearing the variables at the end of the previous script, and restarting ISE, but nothing seems to work.
Edit:
When I change the main directory to $path="\\Myfolder-QA\", it seems to work, but it searches through folders that I don't want it to. I need to search through only those folders which start with RI, and that doesn't seem to work. Any idea why this could happen?
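For what it's worth, one untested workaround sketch (assuming the RI* folders sit directly under \\Myfolder-QA\ and that root is listable) is to enumerate the root first and filter the directory names, rather than relying on the wildcard in the path:
#enumerate only the top-level folders whose names start with RI,
#then recurse into each one with the existing logic
$riFolders = Get-ChildItem -Path "\\Myfolder-QA\" |
    Where-Object { $_.PSIsContainer -and $_.Name -like "RI*" }
foreach ($folder in $riFolders) {
    Get-ChildItem $folder.FullName -Recurse |
        Where-Object { $_.PSIsContainer -and $_.FullName -match "submit" } |
        ForEach-Object { Write-Host ($_.FullName -replace "\\", "/") }
}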
Any help is appreciated
Thanks,
Sree
I'm trying to use PowerShell ISE to help me do the following:
Perform a search for many files (with an extension of *props.tmpl) under a certain folder and to include all sub-directories.
When found, I want to copy that file to its current location, but with an extension of *.tmpl2 (what I really want is to skip this step and copy *props.tmpl to a file called *props)
Then rename all *.tmpl2 files and remove the tmpl2 entirely, leaving just the *.props extension.
Ideally, what I want is to copy existing files to the same directory with a new name. It seems like all of the searches I've run on PowerShell ISE are not coming up with the right info I need (or I'm not searching the right way; trying 'powershell ise copy many files with new names' didn't help).
I had the replacement piece down and working, but I no longer want to eliminate the original tmpl files (they are templates so I may want to review them later for their original content).
What I was doing to replace them was this:
Get-ChildItem -Filter "*props.tmpl" -Recurse |
Rename-Item -NewName { $_.name -replace '.tmpl',''}
Which works great other than completely removing the original file.
I started trying to piece something together, but I'm not understanding how to properly name the copy, and I stopped at this point with just an error (this was an attempt to skip the extra copy and simply rename the copy instead of adding the extra step of '*.tmpl2'):
# Get all *props.tmpl files
Get-ChildItem -Filter "*props.tmpl" -Recurse |
# Iterate through each found file
ForEach-Object {
Copy-Item $_.name |
Rename-Item -NewName { $_.name -replace '.props.tmpl','.props' }
}
Any help would be really appreciated (not much of a PowerShell guy, but I'm trying to learn, since PowerShell tends to be a little more dynamic than old-school batch scripts).
Thanks in advance
Final version of this script per help from @ssennett
Here's my final version:
# Get all *props.tmpl files
Get-ChildItem -Filter "*props.tmpl" -Recurse |
    # Iterate through each found file and copy it to non-template form in same location
    ForEach-Object {
        Copy-Item $_.FullName (Join-Path $_.DirectoryName ($_.Name -replace '\.tmpl$',''))
    }
You're not too far from the answer! It's just how Copy-Item is being handled.
Without a Destination being specified, the Copy-Item will effectively try and copy the file onto itself. Instead of piping it to Rename-Item, you can handle the renaming with the -Destination parameter, as below.
$files = Get-ChildItem -Filter "*props.tmpl" -Recurse
$files | % { Copy-Item -Path $_.FullName -Destination (Join-Path $_.DirectoryName ($_.Name -replace 'props\.tmpl$','.props')) }
This would copy a file called RandomFileprops.tmpl into another file RandomFile.props. If you want to remove the original, you can use the Move-Item cmdlet with the same parameters, which effectively renames the original file.
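To rename in place instead (a sketch under the same assumptions as above), swap in Move-Item:
#same pattern, but renames the originals instead of copying them
$files | % { Move-Item -Path $_.FullName -Destination (Join-Path $_.DirectoryName ($_.Name -replace 'props\.tmpl$','.props')) }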
I have a script that I've been working on which reads a specified directory, locates .csv files, executes some logic for each of the .csv files, and ultimately renames them to .csv.archived.
The code was working fine last night; however, this morning when I execute it, it only loops through once. For example, last night I had 5 .csv files in the directory and it would return the file names for all 5 files in a single run. Now, each time I execute my script, it grabs the first file, performs the actions intended, and then exits, forcing me to manually initiate the script for each file.
I've gutted the irrelevant code for testing purposes, and would be happy if someone could tell me that I am doing something wrong, and that I am not crazy.
Here's the code:
$iterations = 1
#set the location where the .CSV files will be pulled from
$Filecsv = get-childitem "\\SERVERPATH\Audit Test\" -recurse | where {$_.extension -eq ".csv"} | % {
$filename = $_.Name
}
#for each file found in the directory
ForEach ($Item in $Filecsv) {
#spit out the file name
"File Name: " + $filename
#count the times we've looped through
"Iterations : " + $iterations
# get the date and time from the system
$datetime = get-date -f MMddyy-hhmmtt
# rename the file
rename-item -path ("\\SERVERPATH\Audit Test\"+ $filename) -newname ($filename + $datetime + ".csv.archived")
$iterations ++
}
...and here is the output:
For the example I showed you, I had four .csv files in the directory. I had to manually execute my script, and each time it performed as expected, but only for the first item it encountered in the directory. It doesn't actually loop through. What am I missing here?
Right here (folded at the pipe for readability):
$Filecsv = get-childitem "\\SERVERPATH\Audit Test\" -recurse |
where {$_.extension -eq ".csv"} |
% {$filename = $_.Name}
You're looping through your files and, for each one, setting $filename to the name of that file, instead of letting the filenames accumulate in $Filecsv. Collect the names instead:
$Filecsv = get-childitem "\\SERVERPATH\Audit Test\" -recurse |
where {$_.extension -eq ".csv"} |
% {$_.Name}
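With that change $Filecsv holds all of the names, and the loop body should reference $Item rather than the now-stale $filename. A sketch of the full corrected script (same share path as the question):
$iterations = 1
$Filecsv = get-childitem "\\SERVERPATH\Audit Test\" -recurse |
    where {$_.extension -eq ".csv"} |
    % {$_.Name}
ForEach ($Item in $Filecsv) {
    #spit out the file name and the iteration count
    "File Name: " + $Item
    "Iterations : " + $iterations
    #get the date and time from the system
    $datetime = get-date -f MMddyy-hhmmtt
    #rename the file
    rename-item -path ("\\SERVERPATH\Audit Test\" + $Item) -newname ($Item + $datetime + ".csv.archived")
    $iterations++
}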
Going out of my mind over here.
I have a script where I'm parsing a folder full of tifs, and breaking the files up into sub-folders to limit the number of pages to around 60 per folder. If a document is very large it gets its own folder.
The problem is that the process is locking up the files, so they cannot be deleted. Not every file, though; most of them work fine, and the end clean-up portion of my script gets rid of everything else.
I wrote a lot of work-around sections into my code to fix this issue, and now it is pretty bad looking.
#Large Documents
Add-Type -AssemblyName System.Drawing #load System.Drawing for [System.Drawing.Bitmap]
Get-ChildItem -Path "$directory" -recurse -filter "*.tif" | foreach {
    $file = [System.Drawing.Bitmap]::FromFile($_.Fullname);
    $pagecount = $file.GetFrameCount($file.FrameDimensionsList[0]);
    if ($pagecount -gt $MaxSize){
        $total = $total + $pagecount;
        $name = $_.Basename;
        New-Item $name -ItemType directory;
        Copy-Item $_.fullname -Destination $name;
        #Copy-Item $name".DS" -Destination $processingDir;
        Write-Host "Sleeping in large doc loop";
        $file.Dispose;
        Write-Host "Dispose file object";
        Write-Host $_.Fullname
        $storename = $_.Fullname
        $largeFiles = $largeFiles + $storename
        Write-Host "Storing to array: " $largeFiles[$index];
        $index = $index + 1;
        sleep(15);
    }
}
while ($delInd -lt $largeFiles.Count){
    Write-Host "Deleting: " $largeFiles[$delInd];
    Remove-Item $largeFiles[$delInd] -Force;
    $delInd = $delInd + 1;
}
I'm absolutely perplexed by this. Any help is greatly appreciated.
As far as I understand it, $file.Dispose does not force the underlying object to close the file. Dispose is a method, and in PowerShell (as in C#), you have to use () to invoke a method. So try $file.Dispose().
A piece of advice: you can omit the ; at the end of your lines.
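A minimal sketch of the pattern (try/finally is optional, but it guarantees the handle is released even if the page-count logic throws; variable names match the question):
Add-Type -AssemblyName System.Drawing
Get-ChildItem -Path $directory -Recurse -Filter "*.tif" | ForEach-Object {
    $file = [System.Drawing.Bitmap]::FromFile($_.FullName)
    try {
        $pagecount = $file.GetFrameCount($file.FrameDimensionsList[0])
        #...per-file logic from the question goes here...
    }
    finally {
        $file.Dispose() #note the parentheses: this actually invokes the method
    }
}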