I have written a simple PowerShell script to help my wife organize some imported iPhone video files by allowing her to manually specify the correct creation date. It seems to work as expected, except that if I make a copy of one of the files, I cannot set the creation date on the copy.
Note: The time of day is irrelevant in this situation.
$movFiles = Get-ChildItem "C:\Users\jg\Desktop\videos" -Filter *.MOV
Foreach ($file in $movFiles) {
$fileName = $file.Name
$userDate = Read-Host -Prompt "Date wanted for ${fileName} (format--> 1970-02-13) "
$newDateObj = Get-Date -Date $userDate
$item.CreationTime = $newDateObj
}
When verifying the results, all files have been correctly modified, except a file that I just made a copy of. Is there a way to force the modification of the creation date (short of changing it before making a copy)? Specifically, what mechanism is getting in the way?
As autosvet pointed out in their comment, I simply had a mistake in my variable name. The correction is changing the variable $item to $file. Here is the correct code:
$movFiles = Get-ChildItem "C:\Users\jg\Desktop\videos" -Filter *.MOV
Foreach ($file in $movFiles) {
$fileName = $file.Name
$userDate = Read-Host -Prompt "Date wanted for ${fileName} (format--> 1970-02-13) "
$newDateObj = Get-Date -Date $userDate
$file.CreationTime = $newDateObj
}
Thanks again autosvet.
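As an optional hardening step (not part of the accepted fix), the typed date could be validated before assignment so a typo re-prompts instead of throwing. A sketch using a hypothetical helper function:

```powershell
# Hypothetical helper: returns a [datetime] for a yyyy-MM-dd string, or $null.
function ConvertTo-DateOrNull {
    param([string]$Text)
    $parsed = [datetime]::MinValue
    $ok = [datetime]::TryParseExact(
        $Text, 'yyyy-MM-dd',
        [Globalization.CultureInfo]::InvariantCulture,
        [Globalization.DateTimeStyles]::None,
        [ref]$parsed)
    if ($ok) { $parsed } else { $null }
}

# In the loop, re-prompt until the input parses, then assign as before:
# do {
#     $newDateObj = ConvertTo-DateOrNull (Read-Host -Prompt "Date wanted for $($file.Name) (format--> 1970-02-13) ")
# } until ($newDateObj)
# $file.CreationTime = $newDateObj
```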
I'm trying to search through a number of log files with different filenames. I want to search for a hostname in each log and, when a match is found, copy that entire line to summary_[date].log, appending matching lines as they are found. So something I've started with is:
$captured = Get-ChildItem -recurse -Path \\nas1\share1 -Include *.log |
where { ($_.Name | Select-String -pattern ('PC1','PC2','PC3') -SimpleMatch) }
Now copy the line from each log file which contains the pattern and append it to a file with today's date stamp, so each week I'll have a file like \\nas1\share1\summary_03-07-2020.log
But this is not quite what I want, as this will capture the filenames and append them to the $captured variable. It's also missing the code to copy any matching lines to a date-stamped summary_[date].log.
Each text file will contain, among other lines that start with a time stamp, something like this:
03-07-2020_14-36-17 - Backup of computer [PC1] is successfully created.
So what I want is to search several text files on a share for several hostnames. If a text file contains the hostname have it append the line which contains the hostname to summary_[date].log. Lastly, since the matching lines will all start with a date/time stamp I need to keep the contents of summary_[date].log file sorted from newest date to oldest.
Essentially I should end up with a summary_[date].log every week that will look similar to this:
03-07-2020_14-36-17 - Backup of computer [PC1] is successfully created.
03-07-2020_13-21-12 - Backup of computer [PC3] is successfully created.
03-07-2020_11-36-29 - Backup of computer [PC2] is successfully created.
By doing this I get a summary of all log files from that day in a single file which I will then automatically email to a specific email address.
How do I accomplish this?
Your current code runs Select-String against the file name, not the file's content.
To do what you are after, you can use something like this:
$logFolder = '\\nas1\share1'
$outFile = Join-Path -Path $logFolder -ChildPath ('summary_{0:dd-MM-yyyy}.log' -f (Get-Date))
$captured = Get-ChildItem -Recurse -Path $logFolder -Include *.log | ForEach-Object {
($_ | Select-String -Pattern 'PC1','PC2','PC3' -SimpleMatch).Line
} | Sort-Object # sorting is optional of course
#output on screen
$captured
# output to new log file
$captured | Set-Content -Path $outFile
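The question also asks for newest-first ordering, and because each line starts with a dd-MM-yyyy_HH-mm-ss stamp, a plain text sort isn't chronological. A sketch of a descending sort keyed on the parsed stamp (sample lines taken from the question):

```powershell
# Sample captured lines, as shown in the question.
$lines = @(
    '03-07-2020_11-36-29 - Backup of computer [PC2] is successfully created.',
    '03-07-2020_14-36-17 - Backup of computer [PC1] is successfully created.',
    '03-07-2020_13-21-12 - Backup of computer [PC3] is successfully created.'
)

# Parse the leading 19-character stamp and sort newest-first.
$sorted = $lines | Sort-Object -Descending -Property {
    [datetime]::ParseExact($_.Substring(0, 19), 'dd-MM-yyyy_HH-mm-ss',
        [Globalization.CultureInfo]::InvariantCulture)
}
```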
Next, send this file as an attachment:
# use splatting for cmdlets that take a lot of parameters
$params = @{
SmtpServer = 'yourmailserver'
From = 'logtester@yourcompany.com'
To = 'someone@yourcompany.com'
Subject = 'Summary'
Body = 'Hi, in the attachment a summary of the backup logs'
Attachments = $outFile
# etc.
}
Send-MailMessage @params
How big are your log files?
This will work, but it loads the entire content of all the files into memory and may be inefficient with large files and/or a large number of files.
If there is a large number of files, or the files are large, you could instead use nested foreach loops and loop through each computer name for each file.
This also uses -match rather than -like in Where-Object; you can switch to -like if you get false positives.
$FileContent = Get-Content -Path "\\nas1\share1\*.log"
$ComputerArray = 'PC1','PC2','PC3'
$DateStamp = get-date -Format dd-MM-yyyy
$OutFile = 'summary_' + $DateStamp + '.log'
foreach ($ComputerName in $ComputerArray)
{
$FileContent | Where {$_ -match $ComputerName} | Add-Content -Path $OutFile
}
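If memory is a concern, Select-String can also stream matches file-by-file instead of reading everything up front with Get-Content. A sketch wrapped in a hypothetical helper function (the folder and name list are parameters, not part of the original answer):

```powershell
function Get-BackupSummaryLines {
    param(
        [string]$LogFolder,
        [string[]]$Names
    )
    # Select-String streams matching lines as it reads each file,
    # so whole file contents are never held in memory at once.
    Get-ChildItem -Path $LogFolder -Filter *.log -Recurse |
        Select-String -SimpleMatch -Pattern $Names |
        ForEach-Object { $_.Line }
}

# Usage against the share from the question:
# Get-BackupSummaryLines -LogFolder '\\nas1\share1' -Names 'PC1','PC2','PC3' |
#     Add-Content -Path $OutFile
```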
I would like to create a PowerShell script that can import a CSV file (details.csv) with two headers (FileName and FileCreationTime). Ideally, the script would look for details.csv in the current location the script is saved.
It would create folders in the script's current location with the same name as FileName, and the creation date of said folder would then be changed to match FileCreationTime.
Example chunk of my CSV [made in A & B columns of Excel then saved as CSV (comma delimited)(*.csv)]:
FileName FileCreationTime
Alpha 5/17/2017
Bravo 12/23/2013
Charlie 11/8/2015
I have been searching for a solution, but nothing I do seems to be quite right. I currently have this:
Import-Csv -Path 'K:\Users\eschlitz\Thesis\details.csv' -Delimiter "," |
ForEach-Object {
$path = 'K:\Users\eschlitz\Thesis'
# Again, didn't want a definite path here, but I was trying different
# tweaks to see if I could get at least one instance to work correctly.
New-Item -Path $path -Name $$_.Filename -Type Directory
(Get-Item $_.Filename).CreationTime = (Get-Date $_.FileCreationTime)
}
My current error message:
Get-Item : Cannot find path 'K:\Users\eschlitz\Thesis\Alpha' because it does not exist.
I do not care about whether or not the hh:mm:ss part of the creation time is edited for the new folders, but it would be a bonus if I could standardize them all to 12:00:00 AM.
~~~~~~~~~~~~~~~~~~~~~~~Question Duplication Edit~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Suggested edit to show how my question differs from PowerShell: Change the timestamp (Date created) of a folder or file:
Everything I was able to find related to this either (A) only created folders from a CSV, or (B) edited the creation date of a single folder, or batch-edited the creation dates of multiple folders but only with a single new creation time. I wanted the script to fail if it could not correctly find the new creation time unique to each new folder, eliminating the need to manually delete wrong folders or edit creation times by hand.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~Edit~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Just wanted to post the complete, working versions in case anyone needs them in the future.
#Adds folders to specified directory and modifies their creation date
Import-Csv -Path 'K:\Users\eschlitz\Thesis\details.csv' -Delimiter "," |
ForEach-Object {
$path = 'K:\Users\eschlitz\Thesis'
$dir = New-Item -Path $path -Name $_.Filename -Type Directory
$dir.CreationTime = [DateTime]::ParseExact($_.FileCreationTime,
'M\/d\/yyyy', [Globalization.CultureInfo]::InvariantCulture)
}
And a slightly different version depending on needs:
#Adds folders and then modifies their creation date where script+csv
#currently are
Set-Location -Path "$PSScriptRoot"
Import-Csv -Path ".\details.csv" -Delimiter ',' |
ForEach-Object {
New-Item -Path "$PSScriptRoot" -Name $_.FileName -Type Directory
(Get-Item $_.Filename).CreationTime =
([DateTime]::ParseExact($_.FileCreationTime, 'M\/d\/yyyy',
[Globalization.CultureInfo]::InvariantCulture))
}
The folder is not created because you have a typo in the New-Item statement ($$_.Filename → $_.Filename), and even if it were created, Get-Item most likely wouldn't be able to find it, because it looks in the current working directory, whereas you create the folder in $path. You can avoid the latter issue by capturing the DirectoryInfo object that New-Item returns in a variable:
$dir = New-Item -Path $path -Name $_.Filename -Type Directory
$dir.CreationTime = (Get-Date $_.FileCreationTime)
You may also need to actually parse the date string into a DateTime value (depending on your locale):
[DateTime]::ParseExact($_.FileCreationTime, 'M\/d\/yyyy', [Globalization.CultureInfo]::InvariantCulture)
If you defined the date in ISO format (yyyy-MM-dd) Get-Date should be able to digest it regardless of the system's locale.
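To illustrate with the hypothetical dates from the CSV sample above, both routes end up at the same value:

```powershell
# Culture-invariant parsing of the CSV's M/d/yyyy dates
# (backslashes force literal slashes in the .NET format string).
$fromCsv = [DateTime]::ParseExact('5/17/2017', 'M\/d\/yyyy',
    [Globalization.CultureInfo]::InvariantCulture)

# ISO format is unambiguous, so Get-Date digests it on any locale.
$fromIso = Get-Date -Date '2017-05-17'
```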
I am writing a batch file that executes a Powershell script that at one point loops items with UNC paths as attributes and uses Get-ChildItem on those paths. In a minimal version, this is what is happening in my scripts:
Master.bat
powershell -ExecutionPolicy ByPass -File "Slave.ps1"
Slave.ps1
$foo = @{Name = "Foo"}
$foo.Path = "\\remote-server\foothing"
$bar = @{Name = "Bar"}
$bar.Path = "\\remote-server\barthing"
@( $foo, $bar ) | ForEach-Object {
$item = Get-ChildItem $_.Path
# Do things with item
}
The problem I'm running into is that when I run Master.bat, it fails at Get-ChildItem with an error along the lines of
get-childitem : Cannot find path '\\remote-server\foothing' because it does not exist.
However, it seems to work perfectly fine if I run the Slave.ps1 file directly using Powershell. Why might this be happening only when the Master.bat file is run?
Things I have tried
Prepending the UNC paths with FileSystem:: as described at http://powershell.org/wp/2014/02/20/powershell-gotcha-unc-paths-and-providers/
Making sure there are no strange characters in the actual paths
Using the -literalPath parameter instead of the plain -path parameter for Get-ChildItem
Running Get-ChildItem \\remote-server\foothing in PowerShell and succeeding to verify connection to the remote server
I have found this issue when running scripts referring to UNC paths - but the error only occurs when the script's current location is a non-filesystem provider location, e.g. PS SQLSERVER:\
So the following fails with the same error:
cd env:
$foo = @{Name = "Foo"}
$foo.Path = "\\remote-server\foothing"
$bar = @{Name = "Bar"}
$bar.Path = "\\remote-server\barthing"
@( $foo, $bar ) | ForEach-Object {
$item = Get-ChildItem $_.Path
# Do things with item
Write-Host $item
}
So my resolution was to ensure that the PS prompt was returned to a file system location before executing this code. e.g.
cd env:
$foo = @{Name = "Foo"}
$foo.Path = "\\remote-server\foothing"
$bar = @{Name = "Bar"}
$bar.Path = "\\remote-server\barthing"
cd c: #THIS IS THE CRITICAL LINE
@( $foo, $bar ) | ForEach-Object {
$item = Get-ChildItem $_.Path
# Do things with item
Write-Host $item
}
I hope this helps - I would be very happy with the bounty, as this is my first answer on Stack Overflow.
P.S. I forgot to add - the PS prompt's root may be set by auto-loaded modules in your machine's configuration. I would check with Get-Location to see if you are actually executing from a non-FileSystem location.
Rory's answer provides an effective workaround, but there's a solution that doesn't require changing the current location to a FileSystem provider location first:
Prefix your UNC paths with FileSystem:: to ensure that they are recognized correctly, irrespective of the current location:
$foo = @{
Name = "Foo"
Path = "FileSystem::\\remote-server\foothing"
}
$bar = @{
Name = "Bar"
Path = "FileSystem::\\remote-server\barthing"
}
Alternatively, here is a tweak to Rory's answer to avoid changing the current location session-globally (to preserve whatever the current location is), using Push-Location and Pop-Location:
try {
# Switch to the *filesystem provider's* current location, whatever it is.
Push-Location (Get-Location -PSProvider FileSystem)
# Process the paths.
$foo, $bar | ForEach-Object {
$item = Get-ChildItem $_.Path
# Do things with item
}
} finally {
# Restore the previous location.
Pop-Location
}
Optional background information
This excellent blog post explains the underlying problem (emphasis added):
PowerShell doesn't recognize [UNC paths] as "rooted" because they're not on a PSDrive; as such, whatever provider is associated with PowerShell's current location will attempt to handle them.
Adding prefix FileSystem:: unambiguously identifies the path as being a FileSystem provider path, irrespective of the provider underlying the current location.
I read somewhere else about the Push-Location and Pop-Location commands to counter this kind of problem. I landed on your question while manually testing, step by step, a new routine whose script has push/pop, but I forgot to run them in my PS window. After checking @Rory's answer I noticed I was at a PS SQLServer:\ prompt instead of a PS C:\ prompt.
So a way to use this on your "slave" script would be:
$foo = @{Name = "Foo"}
$foo.Path = "\\remote-server\foothing"
$bar = @{Name = "Bar"}
$bar.Path = "\\remote-server\barthing"
@( $foo, $bar ) | ForEach-Object {
$item = Get-ChildItem $_.Path
Push-Location
# Do things with item
Pop-Location
}
I thought of adding the Push/Pop before and after the # Do things section because it seems it's those things that change the location.
I am working on a PowerShell 2 script that will read a CD-ROM or DVD disk and copy its contents - the caveat is that I need to check for certain file types and NOT copy anything over with that filetype. So far I have this working:
$user = read-host "Enter owner's username:"
$drv = read-host "Enter Optical Drive letter (no colons or slashes):"
$list = Import-Csv badtypes.csv
$badlist = @()
$Extns = @()
ForEach($xt in $list)
{
$Extns += "." + $xt.Extention
}
$filepath = $drv +":\"
$cnts = Get-ChildItem $filepath -r
ForEach($itm in $cnts)
{
if($itm.PSis.Container)
{
#write new folder name in user's temporary folder
}
else
{
CheckFile $itm
}
}
Function CheckFile($fl)
{
$fildextension = [System.IO.Path]::GetExtention($fl)
$badfound = 0
ForEach($a in $Extns)
{
if($a -eq $fildextension)
{
$badfound = 1
}
}
if(badfound -eq 0)
{
write-host "File Type Acceptable:" $fl
# write file to proper place in the user's temporary folder
}
}
I'm having a problem getting an empty folder created in the proper place and copying the (acceptable) file to the proper place.
Any help would be appreciated!
Sticking with a powershell solution, try this on for size:
$user = read-host "Enter owner's username:"
$drv = read-host "Enter Optical Drive letter (no colons or slashes):"
$list = Import-Csv badtypes.csv
$Exclusions = "*.$($list.Extension -join ",*.")"
$DestFolder = New-Item -path "\\BAAC\homedir\$user\transfer\$(get-date -f MMddyyyy.HH.mm)" -ItemType Directory
$FilePath = $drv +":\"
Write-Host "Copying files from $FilePath to $DestFolder`:"
Copy-Item "$FilePath*" -Destination $DestFolder -Exclude $Exclusions -Recurse -PassThru
explorer $DestFolder
Don't really need the user's name, but I left it in there. Here's what the script will do:
It gets the drive letter
Imports the list of extensions to exclude, builds a string from them (resultant string would be something like "*.bat,*.exe,*.com" if your CSV had 3 entries being bat, exe, and com).
It creates a date/time formatted folder in the desired folder
Then it recursively copies files and folders to new folder excluding anything on the exclusion list.
Lastly it opens up a Windows Explorer window to the destination folder that was created containing all the recently copied files and folders.
Edit: Updated to use the path specified in the comment. If this answer provided a solution to your question please mark it as the selected answer so future users can find it and use it without having to repeat questions.
I need a PowerShell script that can access a file's properties and discover the LastWriteTime property and compare it with the current date and return the date difference.
I have something like this...
$writedate = Get-ItemProperty -Path $source -Name LastWriteTime
...but I cannot cast the LastWriteTime to a "DateTime" datatype. It says, "Cannot convert "@{LastWriteTime=...date...}" to "System.DateTime".
Try the following.
$d = [datetime](Get-ItemProperty -Path $source -Name LastWriteTime).lastwritetime
This is part of the item property weirdness. When you run Get-ItemProperty it does not return the value but instead the property. You have to use one more level of indirection to get to the value.
(ls $source).LastWriteTime
("ls", "dir", or "gci" are the default aliases for Get-ChildItem.)
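To complete the original task of comparing against the current date and returning the difference, a minimal sketch (the temp-file path is hypothetical, created only so the example is self-contained):

```powershell
# Hypothetical file, created so the example can run anywhere.
$source = Join-Path ([IO.Path]::GetTempPath()) 'lastwrite-demo.txt'
Set-Content -Path $source -Value 'demo'

# Difference between now and the file's last write time, as a TimeSpan.
$age = New-TimeSpan -Start (Get-Item $source).LastWriteTime -End (Get-Date)
"File is $([math]::Round($age.TotalSeconds)) seconds old"
```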
I have an example I would like to share
$File = "C:\Foo.txt"
#retrieves the Systems current Date and Time in a DateTime Format
$today = Get-Date
#subtracts 12 hours from the date to ensure the file has been written to recently
$today = $today.AddHours(-12)
#gets the last time the $file was written in a DateTime Format
$lastWriteTime = (Get-Item $File).LastWriteTime
#If $File doesn't exist we will loop indefinitely until it does exist.
# also loops until the $File that exists was written to in the last twelve hours
while((!(Test-Path $File)) -or ($lastWriteTime -lt $today))
{
#if a file exists then the write time is wrong so update it
if (Test-Path $File)
{
$lastWriteTime = (Get-Item $File).LastWriteTime
}
#Sleep for 5 minutes
$time = Get-Date
Write-Host "Sleep" $time
Start-Sleep -s 300;
}
(Get-Item $source).LastWriteTime is my preferred way to do it.
I can't fault any of the answers here, since the OP accepted one of them as resolving their problem. However, I found them flawed in one respect: when you output the result of the assignment to the variable, it contains numerous blank lines, not just the sought-after answer. Example:
PS C:\brh> [datetime](Get-ItemProperty -Path .\deploy.ps1 -Name LastWriteTime).LastWriteTime
Friday, December 12, 2014 2:33:09 PM
PS C:\brh>
I'm a fan of two things in code: succinctness and correctness. brianary has the right of it for succinctness, with a tip of the hat to Roger Lipscombe, but both miss correctness due to the extra lines in the result. Here's what I think the OP was looking for, since it's what got me over the finish line.
PS C:\brh> (ls .\deploy.ps1).LastWriteTime.DateTime
Friday, December 12, 2014 2:33:09 PM
PS C:\brh>
Note the lack of extra lines, only the one that PowerShell uses to separate prompts. Now this can be assigned to a variable for comparison or, as in my case, stored in a file for reading and comparison in a later session.
Slightly easier - use the new-timespan cmdlet, which creates a time interval from the current time.
ls | where-object {(new-timespan $_.LastWriteTime).days -ge 1}
shows all files not written to today.
Use
ls | % {(get-date) - $_.LastWriteTime }
to retrieve the difference. You can replace ls with a single file.