Ok, here is the scenario. I have a folder that people can use to share data around. Quick and easy, no USB needed. However, they tend to leave the data in there, and it is filling up the HDD. I want to delete everything over 30 days old. I was going to do this with PowerShell:
$limit = (Get-Date).AddDays(-30)
$path = "C:\temp"
Get-ChildItem -Path $path -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Remove-Item -Force -WhatIf
Get-ChildItem -Path $path -Recurse -Force | Where-Object { $_.PSIsContainer -and $null -eq (Get-ChildItem -Path $_.FullName -Recurse -Force | Where-Object { !$_.PSIsContainer }) } | Remove-Item -Force -Recurse -WhatIf
I left the -WhatIf there because that makes it easier to test.
So I tested it, and it works fine, but during the testing I ran into a small problem: moved files retain their original creation date. Why is this important? Because a user might move something into the folder 5 minutes before the script runs, and it will get deleted if the creation date is older than 30 days. Copying a file is no problem; it gets a new creation date when copied. It's only the moved files.
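You can see the behavior for yourself (hypothetical paths; a move on the same volume):
# Move a months-old file into the share, then inspect its timestamps
Move-Item 'C:\Users\alice\report.xlsx' 'C:\temp\report.xlsx'
Get-Item 'C:\temp\report.xlsx' | Select-Object Name, CreationTime, LastWriteTime
# CreationTime still shows the original date, not the time of the move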
So, question: how do I find the date the file was moved into the folder, so that it isn't accidentally deleted?
This topic may have already been covered here: Finding Modified Date of a file/folder
Is this more or less what you are looking for, or are you looking specifically for the creation date? I would imagine that some of the files may have been created on 1/1/2016, but the user might be modifying them every so often, so one could be last modified on 2/15/2016. I'm sure you wouldn't want to remove that, otherwise you would have one very angry end user.
Ok, so the short answer is it can't be done. In the end I have had to copy the folder to an archive folder on another server as a backup, and then delete everything in the folder. The backup then gets deleted 10 days later.
But some things I found out along the way.
Using Robocopy to move the files also deletes the root folder. This might be OK in some cases, but the root folder has permissions on it that aren't recreated with a simple "make directory" command.
Using Move-Item in PowerShell doesn't work reliably between servers.
PowerShell is bound by the same path and file name length limitations that Windows has.
What I ended up doing was copying all the files to an archive first
robocopy $SourceFolder $DestinationFolder /e
Then I wanted to delete all the files, which Remove-Item can't do if the path is longer than 248 characters, because of the file and path length limitations. Again, robocopy to the rescue. I have a blank folder, nothing in it, and to delete all the items in the other folder, regardless of path and file length, I mirror the blank folder.
robocopy /mir $EmptyFolder $SourceFolder
I already had a PowerShell script, so I stuck these commands in there. But as these aren't PowerShell commands, you could replace the variables with the actual folder paths and put the commands straight into a .bat file.
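For example, the whole monthly cleanup reduces to two lines in a .bat file (the paths here are made up; substitute your own):
rem Archive the share to another server, then purge it by mirroring an empty folder over it
robocopy "D:\Share" "\\archiveserver\Backups\Share" /e
robocopy "D:\EmptyFolder" "D:\Share" /mir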
From here I just put in a scheduled task to run the script once a month to keep the folder free of built-up clutter.
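Creating the task can be scripted too. A sketch, with a made-up task name, script path and start time:
schtasks /Create /TN "Purge shared folder" /TR "C:\Scripts\PurgeShare.bat" /SC MONTHLY /D 1 /ST 03:00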
I hope this helps anyone else who is trying to do something similar.
As it turns out, there's a hidden fourth timestamp in Windows: ChangeTime. As far as I understand it, it's stored in a separate place and isn't returned by the normal APIs - it's part of the Master File Table, I think. (Perhaps it depends on your file system type?) Whether that description is accurate or not, consider the following:
C:\Users\erhannis\Downloads\nsx220en>stat nsx-2.2.0
File: nsx-2.2.0
Size: 0 Blocks: 4 IO Block: 65536 directory
Device: c6194f59h/3323547481d Inode: 7318349397564576 Links: 1
Access: (0755/drwxr-xr-x) Uid: (1056947/ erhannis) Gid: (1049089/ UNKNOWN)
Access: 2016-03-24 14:41:17.880026600 -0400
Modify: 2016-03-24 14:41:17.880026600 -0400
Change: 2020-12-01 17:03:41.697541400 -0500
Birth: 2016-03-24 14:35:09.540397700 -0400
(See links for relevant programs.)
The time listed next to Change: is the time I moved the folder from one place to another. When I move a file, the Change date is updated. Note: I tried moving a folder full of files, and it appears only the root object's ChangeTime is updated. So, for your purposes, you would consider the effective ChangeTime of a given file to be the newest of the ChangeTime of the file and of any parent directory. (Or just stop recursing into directories when you hit a ChangeTime that is too new; it depends on how you're doing it.)
Perhaps more relevantly, the Linux find command has a parameter -cmin that lets you filter by "metadata more recently changed than X". So, for instance, find -cmin -5 recursively finds all items whose metadata (e.g., when it was last moved) changed less than five minutes ago. (find -cmin +5 would find files changed MORE than 5 minutes ago.) (On Windows, I rename the find command to something else so it doesn't conflict with the default Windows find command.)
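For the 30-day scenario above, something like this should work from PowerShell (the find.exe path assumes a default Git for Windows install; see the link below):
# List files whose metadata (including the time they were last moved) changed more than 30 days ago
& 'C:\Program Files\Git\usr\bin\find.exe' C:\temp -type f -ctime +30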
Windows ports of both stat and find are available, incidentally, as part of a Git installation:
https://git-scm.com/download/win
There's also this PowerShell script I found - I haven't tried it, but it purports to get the ChangeTime of a file.
https://gallery.technet.microsoft.com/scriptcenter/Get-MFT-Timestamp-of-a-file-9227f399
I'm working on creating a batch script which will allow me to delete files with the same name and an appended ascending number; for example:
fileName.txt
fileName (1).txt
fileName (2).txt
fileName (3).txt
fileName (4).txt
fileName (5).txt
fileName (6).txt
fileName (7).txt
etc....
Below is the code I came up with, but it only deletes the base file, not the copies with the appended number in parentheses.
@echo off
:: Change to the Downloads directory
cd %UserProfile%\Downloads
:: Deletes files
Del Awesome-Kicks-Test_File-Run.txt
All of these files are saved in my Downloads directory.
Could you please direct me on how best to go about deleting these files?
Since you are on Windows, you should have access to PowerShell. PS has much better support for these types of operations, and is available on all versions of Windows in common use.
One method is to use a regular expression to find all of the files that match the pattern. Here is a basic one that should work for the example you gave: fileName \(\d+\)
Using PowerShell, you then combine that regular expression with the Get-ChildItem, Where-Object and Remove-Item cmdlets.
Get-ChildItem | Where-Object { $_.Name -match 'fileName \(\d+\)' } | Remove-Item
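A slightly hardened variant (the Downloads path, the anchored regex and the -WhatIf dry run are my additions):
Get-ChildItem -Path "$env:UserProfile\Downloads" -File |
    Where-Object { $_.Name -match '^fileName \(\d+\)\.txt$' } |
    Remove-Item -WhatIf   # drop -WhatIf once the preview looks right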
I've posted this question before, yet was not able to find a suitable solution for my problem, hence I have been testing some more myself and have some new findings and theories as to why it might not work as intended. I hope a respectable time has passed for me to bump my question with the new info attached.
For quite a while now, in my free time, I have been tackling a script that can batch-replace external link addresses in multiple Excel files within the script's folder. I have learned that you can't change external links via the usual PowerShell-to-Excel interaction, as these values are forced to read-only. However, there is a clever way to bypass that by renaming the Excel file to a .zip archive, reading/changing the files inside, and then renaming it back to the Excel format.
Through learning and digging around the web, I have compiled this script function that should create a backup, rename the file to an archive, replace the desired text within, and rename the file back afterwards.
function Update-ExcelLinks($xlsxFile, $oldText, $newText) {
    # Build BAK file name
    $bakFile = $xlsxFile -ireplace [regex]::Escape(".xlsb"), ".bak"
    # Build ZIP file name
    $zipFile = $xlsxFile -ireplace [regex]::Escape(".xlsb"), ".zip"
    # Create temporary folder
    $parent = [System.IO.Path]::GetTempPath();
    [string] $guid = [System.Guid]::NewGuid();
    $tempFolder = Join-Path $parent $guid;
    New-Item -ItemType Directory -Path $tempFolder;
    # Uncomment the next line to create a backup before processing the XLSB file
    # Copy-Item $xlsxFile $bakFile
    # Rename file to ZIP
    Rename-Item -Path $xlsxFile -NewName $zipFile
    # Not using Expand-Archive because it changes the ZIP format
    C:\7z\7za.exe x "$zipFile" -o"$tempFolder"
    # Replace old text with new text
    $fileNames = Get-ChildItem -Path $tempFolder -Recurse -Force -Include *.xml,*.bin.rels
    foreach ($file in $fileNames)
    {
        (Get-Content -ErrorAction SilentlyContinue $file.PSPath) |
            Foreach-Object { $_ -replace $oldText, $newText } |
            Set-Content $file.PSPath
    }
    # Changing the working folder because the 7-Zip option -w doesn't work
    Set-Location -Path $tempFolder
    # Not using Compress-Archive because it changes the ZIP format
    C:\7z\7za.exe u -r "$zipFile" *.*
    # Rename file back to XLSB
    Rename-Item -Path $zipFile -NewName $xlsxFile
}
So far, I am able to find the desired file, rename it to .zip, and extract it to a temporary folder, and according to the PowerShell window prompts, it updates the archive with the new files. Afterwards I am left with the .zip file without any of the desired changes inside. The files that are responsible for the external links data in the Excel files are located at
wk33\Gross Qty_wk33.zip\xl\externalLinks\_rels
and are presented in the form of files named externalLink1.bin.rels, numbered onwards.
These files are essentially identical to .xml files and can be opened with either Notepad or Internet Explorer on Windows; they contain the relationship entries whose "Target=" attributes hold the linked paths.
The aim of my script is to update the week number within the "Target=" parameters from last week to the current week (for example, wk32 to wk33). The thing is that no changes happen even though no errors are displayed, and 7-Zip indicates that the files are packed into the zip successfully.
I have tested what happens if I unpack the .bin.rels file, change the week number inside manually through Notepad, and repeat the intended script process, and I can confirm that this works: when I open the file, the link is correctly updated.
The last 4 steps seem to be not working as intended, even though they are correct as far as I am aware. The changes are not made, and the file is not subsequently renamed back to its original .xlsb extension.
I've been trying for several weeks to make it work, but it seems nothing substantial changes, no matter what I try. I would appreciate any additional insights or alternatives to my code to achieve the same task.
EDIT: The function is intended to be called from within the working directory, so it is supposed to be used as Update-ExcelLinks 'Gross Qty_wk33.xlsb' 'wk32' 'wk33', although I have also tried calling the file via its full path, as Update-ExcelLinks 'C:\Test\Gross Qty_wk33.xlsb' 'wk32' 'wk33'.
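One thing worth checking (an assumption on my part, not something confirmed above): when the function is called with a bare file name, $zipFile is a relative path, so after Set-Location $tempFolder both the 7-Zip update and the final Rename-Item resolve against the temp folder instead of the original directory. A sketch of the last steps with the paths resolved up front:
# Resolve the archive to an absolute path before changing the working directory
$zipFile = (Resolve-Path $zipFile).Path
$originalLocation = Get-Location
Set-Location -Path $tempFolder
C:\7z\7za.exe u -r "$zipFile" *.*
# Return to where we started so the rename targets the right file
Set-Location -Path $originalLocation
Rename-Item -Path $zipFile -NewName $xlsxFile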
I am trying to come up with a PowerShell script to dynamically do a 'Restore Database' in SQL Server 2019 with multiple TRN (or BAK, in my case) files that are located in one folder, on a daily basis.
I will manually do the full backup first, and this task will be scheduled to run after (once on a daily basis).
So, a Python script will grab only yesterday's files from another folder into this folder, and then this PowerShell script will execute to restore a database using these TRN / BAK files (located in this folder).
The plan is to go through each TRN file (located in the same folder) sequentially (ordered not by the time the files were created, but by file name).
For example, it will start from "..04" --> "..12" in this case.
I found some examples on this site, but I was not sure how to code it so that it recognizes the sequence ("..04" --> "..12") to run.
PS C:\> $File = Get-ChildItem c:\backups, \\server1\backups -recurse
PS C:\> $File | Restore-DbaDatabase -SqlInstance Server1\Instance -UseDestinationDefaultDirectories
So, by default, I think Get-ChildItem should already be displaying the files from lowest to highest, but if you want to make sure, you could try something like this and see if the output fits your case.
To start the test, I'll create files using the same names as yours:
1..12 | ForEach-Object {
    $null > "LOG_us_bcan_multi_replica_20210427$($_.ToString('0#')).bak"
    $null > "LOG_us_bcan_multi_replica_20200327$($_.ToString('0#')).bak"
}
This creates 24 files with the same naming convention you have.
From ...multi_replica_2021042701.bak to ...multi_replica_2021042712.bak
From ...multi_replica_2020032701.bak to ...multi_replica_2020032712.bak
We know sorting by DateTime is possible, so we can use string manipulation to get the date out of your file names and use the ParseExact method on it.
Example:
# Assuming your current directory is the directory where the .bak files are
$expression = {
    [datetime]::ParseExact(
        ($_.BaseName -split 'replica_')[1], 'yyyyMMddHH', [cultureinfo]::InvariantCulture
    )
}
Get-ChildItem | Select-Object BaseName,@{n='DateFromFileName';e=$expression}
# This will return each BaseName side by side with the date parsed from the file name
BaseName DateFromFileName
-------- ----------------
LOG_us_bcan_multi_replica_2020032701 3/27/2020 1:00:00 AM
LOG_us_bcan_multi_replica_2020032702 3/27/2020 2:00:00 AM
LOG_us_bcan_multi_replica_2020032703 3/27/2020 3:00:00 AM
LOG_us_bcan_multi_replica_2020032704 3/27/2020 4:00:00 AM
LOG_us_bcan_multi_replica_2020032705 3/27/2020 5:00:00 AM
LOG_us_bcan_multi_replica_2020032706 3/27/2020 6:00:00 AM
.....
Now we can use the same $expression with Sort-Object instead of Select-Object
Get-ChildItem | Sort-Object $expression
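Putting it together with the dbatools commands from your question (a sketch; it assumes the dbatools module, your instance name, and that the .bak files sit in C:\backups):
# Sort by the date parsed from the file name, then hand the ordered files to dbatools
Get-ChildItem C:\backups -Filter *.bak |
    Sort-Object $expression |
    Restore-DbaDatabase -SqlInstance Server1\Instance -UseDestinationDefaultDirectories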
I use an old program from a company that went out of business. It creates a "root" folder; inside of that root folder it creates other folders, and inside each of those it creates a rar file of the output from the program that ran.
Root
    Folder
        zipfile
    Folder
        zipfile
Note that inside of the Root is where there are over 400 folders, and each has a zip file inside that I need to unrar.
Each folder is named for the time the program was run, and this program runs anywhere from 2 to 6 times per minute, so it creates a new folder in the "root" folder based on the time it ran.
To see the information, I need to unrar the file it made in that folder.
The problem: I have over 400 folders inside of a central folder, and every archive has to be unrared into the folder it is currently in.
I know WinRAR has an "unrar here" function, and I have generated a text file that has the directory listing of each folder I need to unrar, one folder per line.
Basically, I need a batch file that will go to each folder, do an "unrar here", and move on to the next folder.
Any help available? I am on a Win10 Home system and have my registered version 5.80 of WinRAR. I am not a PowerShell user; I am learning it starting today. I did use PowerShell to generate the txt file with the directory listing, though.
(Get-ChildItem -Recurse -Filter "*.zip").FullName | Out-File d:\pattern.txt
So I am not totally clueless.
Thank you.
Responding to Campo: I use WinRAR to handle Zips as well as Rars; I do not find I need a zip extractor when I have WinRAR installed.
The reason I asked about the extension in my comments is because you said that you'd already used PowerShell to create a file list using Get-ChildItem, and that you want to start learning it.
As you said you were using PowerShell in Windows 10, I expect that you have PowerShell v5.0+. This, therefore, is a single line for a PowerShell script, or to enter directly at the PowerShell prompt, which should unzip each of your .zip files directly without introducing a third-party utility like WinRAR.
Please change the -Path string according to your specific location.
Get-ChildItem -Path "C:\Users\Frank\Root" -Filter "*.zip" -File -Recurse | ForEach-Object { Expand-Archive -Path $_.FullName -DestinationPath $_.DirectoryName -Force }
You didn't clarify whether you wanted to delete the zip file once you've unarchived it, so if you want that functionality, for the purpose of this exercise, you could simply append a deletion command:
Get-ChildItem -Path "C:\Users\Frank\Root" -Filter "*.zip" -File -Recurse | ForEach-Object { Expand-Archive -Path $_.FullName -DestinationPath $_.DirectoryName -Force; Remove-Item $_.FullName }
Please note that the example is only a starting point for you. When you learn more, you can add robustness to it - to prevent, for example, deleting a zip file if its extraction failed - and decide what to do in the event of errors, etc.
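As a sketch of that robustness (the try/catch structure is my addition), this only deletes an archive when its extraction succeeded:
Get-ChildItem -Path "C:\Users\Frank\Root" -Filter "*.zip" -File -Recurse | ForEach-Object {
    $zip = $_
    try {
        # Stop on any extraction error so we never delete an unexpanded archive
        Expand-Archive -Path $zip.FullName -DestinationPath $zip.DirectoryName -Force -ErrorAction Stop
        Remove-Item $zip.FullName
    }
    catch {
        Write-Warning "Left $($zip.FullName) in place: $_"
    }
}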
Code below works when I am in a directory:
gci | % {rni $_.Name ($_.Name -replace '120', '121')}
How do I create a batch file with PowerShell code that will work in every directory without errors? I just want to start file.bat and have it rename all files in the folder where file.bat is located.
powershell -C "gci | % {rni $_.Name ($_.Name -replace '120', '121')}"
That doesn't work; here is the error:
Expressions are only permitted as the first element of a pipeline.
At line:1 char:52 + gci | {rni $_.Name ($_.Name -replace '501', '121')} <<<<
Comedy answer:
echo Hi
is a batch file that will "work in any directory without errors", which is literally what you asked for.
Half-serious answer: here is a batch file which will do the equivalent renaming of 120 to 121 in files in the current folder - what your PowerShell does - which is plausibly what you are asking for:
@echo off
setlocal ENABLEDELAYEDEXPANSION
for %%f in (*120*) do (
set _f=%%f
ren "%%f" "!_f:120=121!"
)
Although you asked about batch files and tagged the question batch-file, I suspect that's not what you want. Maybe you mean "how can I specify the directory for it to run against?" Then maybe this:
powershell -C "gci '%1' | % {rni $_.Name ($_.Name -replace '120', '121')}"
Save this as a batch file to run from a command prompt; %1 is the parameter you give to the batch file, which is embedded in the PowerShell code, quoted, as the input to gci. Then you could run myscript.bat "c:\data files", which would launch PowerShell and rename files in "c:\data files\".
Or maybe, if you are asking more about the "with no errors" part of your title, then you need to accept that you might not have permission to list folder contents or to rename files, or renaming might lead to clashing names. In that case, what you need is to add
-ErrorAction Ignore
to gci and to rni, which will make them "run without errors", in a sense.
Serious answer: % is a special character in batch files, you need to escape it by writing it twice.
powershell -C "gci | %% {rni $_.Name ($_.Name -replace '120', '121')}"
Otherwise it gets swallowed and you get the error "Expressions are only allowed as the first element of a pipeline."
NB. You are trying to rename every file in the folder, regardless of whether it has 120 in the name or not; this is horrible. Follow Nikhil Gupta's suggestions on how to avoid doing that.
You could have been a little more specific. I'm making assumptions now to try and solve your issue -
You want to rename all the files with 120 in the name to now have 121.
You want to do it in a specific directory but want to run the script from any location on the cmd.
The current script would run in any directory, get all the child items in that directory, and replace 120 with 121 in their names. It wouldn't work for a name that doesn't have 120 in it.
Based on this I suggest 2 modifications -
Add a filter while running gci to get only the items with 120 in the name.
Add a first line to go to the directory you need the script to run under.
Here is the sample code -
Set-Location <dir>
gci -Filter "*120*" | % {rni $_.Name ($_.Name -replace '120', '121')}
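And if "where file.bat is located" matters, a variant of the above as a complete batch file, using %~dp0 (which expands to the folder containing the batch file):
@echo off
powershell -NoProfile -Command "Set-Location -LiteralPath '%~dp0'; Get-ChildItem -Filter '*120*' | Rename-Item -NewName { $_.Name -replace '120', '121' }"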