I need a PowerShell script that can read a file's LastWriteTime property, compare it with the current date, and return the difference.
I have something like this...
$writedate = Get-ItemProperty -Path $source -Name LastWriteTime
...but I cannot cast the LastWriteTime to a "DateTime" datatype. It says, "Cannot convert "@{LastWriteTime=...date...}" to "System.DateTime".
Try the following.
$d = [datetime](Get-ItemProperty -Path $source -Name LastWriteTime).lastwritetime
This is part of the item property weirdness. When you run Get-ItemProperty it does not return the value but instead the property. You have to use one more level of indirection to get to the value.
(ls $source).LastWriteTime
("ls", "dir", or "gci" are the default aliases for Get-ChildItem.)
I have an example I would like to share:
$File = "C:\Foo.txt"
#retrieves the Systems current Date and Time in a DateTime Format
$today = Get-Date
#subtracts 12 hours from the date to ensure the file has been written to recently
$today = $today.AddHours(-12)
#gets the last time the $file was written in a DateTime Format
$lastWriteTime = (Get-Item $File).LastWriteTime
#If $File doesn't exist we will loop indefinetely until it does exist.
# also loops until the $File that exists was written to in the last twelve hours
while((!(Test-Path $File)) -or ($lastWriteTime -lt $today))
{
#if a file exists then the write time is wrong so update it
if (Test-Path $File)
{
$lastWriteTime = (Get-Item $File).LastWriteTime
}
#Sleep for 5 minutes
$time = Get-Date
Write-Host "Sleep" $time
Start-Sleep -s 300;
}
(Get-Item $source).LastWriteTime is my preferred way to do it.
I can't fault any of the answers here, since the OP accepted one of them as resolving their problem. However, I found them flawed in one respect. When you output the result of the assignment to the variable, it contains numerous blank lines, not just the sought-after answer. Example:
PS C:\brh> [datetime](Get-ItemProperty -Path .\deploy.ps1 -Name LastWriteTime).LastWriteTime
Friday, December 12, 2014 2:33:09 PM
PS C:\brh>
I'm a fan of two things in code: succinctness and correctness. brianary has the right of it for succinctness, with a tip of the hat to Roger Lipscombe, but both miss correctness due to the extra lines in the result. Here's what I think the OP was looking for, since it's what got me over the finish line.
PS C:\brh> (ls .\deploy.ps1).LastWriteTime.DateTime
Friday, December 12, 2014 2:33:09 PM
PS C:\brh>
Note the lack of extra lines, only the one that PowerShell uses to separate prompts. Now this can be assigned to a variable for comparison or, as in my case, stored in a file for reading and comparison in a later session.
Slightly easier - use the new-timespan cmdlet, which creates a time interval from the current time.
ls | where-object {(new-timespan $_.LastWriteTime).days -ge 1}
shows all files not written to today.
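The same cmdlet works for a single file; a sketch assuming C:\Foo.txt exists (when -End is omitted, New-TimeSpan uses the current time):
New-TimeSpan -Start (Get-Item C:\Foo.txt).LastWriteTime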
Use
ls | % {(get-date) - $_.LastWriteTime }
to retrieve the difference for each file. You can replace ls with Get-Item on a single file.
I have written a simple PowerShell script to help my wife organize some imported iPhone video files by allowing her to manually specify the correct creation date. It seems to work as expected, except that if I make a copy of one of the files, I cannot set the creation date on the copy.
Note: The time of day is negligible in this situation.
$movFiles = Get-ChildItem "C:\Users\jg\Desktop\videos" -Filter *.MOV
Foreach ($file in $movFiles) {
    $fileName = $file.Name
    $userDate = Read-Host -Prompt "Date wanted for ${fileName} (format--> 1970-02-13) "
    $newDateObj = Get-Date -Date $userDate
    $item.CreationTime = $newDateObj
}
When verifying the results, all files have been correctly modified, except a file that I just made a copy of. Is there a way to force the modification of the creation date (short of changing it before making a copy)? Specifically, what mechanism is getting in the way?
As autosvet pointed out in their comment, I simply had a mistake in my variable name. The correction is changing the variable $item to $file. Here is the correct code:
$movFiles = Get-ChildItem "C:\Users\jg\Desktop\videos" -Filter *.MOV
Foreach ($file in $movFiles) {
    $fileName = $file.Name
    $userDate = Read-Host -Prompt "Date wanted for ${fileName} (format--> 1970-02-13) "
    $newDateObj = Get-Date -Date $userDate
    $file.CreationTime = $newDateObj
}
Thanks again autosvet.
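As a side note, this kind of typo is easy to catch by enabling strict mode at the top of the script; a minimal sketch:
Set-StrictMode -Version Latest
With strict mode enabled, referencing the undefined $item raises an error instead of silently doing nothing.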
I have PowerShell scripts that need to start other PowerShell scripts in a new session. The first script passes a set of arguments to the second script as an array. Everything works fine when all of the arguments have values, but when I try passing a $null, the parameter is stripped and the list of arguments gets messed up.
To better understand the issue, you can do the following (this is just an example):
Define C:\Test.ps1 as:
param($a,$b,$c)
" a $a " | Out-File C:\temp.txt
" b $b " | Out-File C:\temp.txt -append
" c $c " | Out-File C:\temp.txt -append
Run in any PowerShell console:
PowerShell.exe -WindowStyle Hidden -NonInteractive -file C:\Test.ps1 @(1,2,3)
This works as expected; temp.txt contains:
a 1
b 2
c 3
PowerShell.exe -WindowStyle Hidden -NonInteractive -file C:\Test.ps1 @(1,$null,5)
This strips the $null and writes the following to temp.txt:
a 1
b 5
c
I need to preserve the $null when creating the new session; the desired temp.txt would contain:
a 1
b
c 5
The problem seems to be that the $null gets stripped from the array directly and is already @(1,5) when being interpreted by the new session.
I've tried declaring an empty array and adding elements to it one by one; I've also tried replacing the array with a System.Collections.Generic.List[System.Object] and using the Add method, but the $null still gets stripped.
My last ideas are to test for $null and assign a default value to the argument in the calling script, then test for the default value in the second script and reassign $null; or to create a hash with all the arguments, pass that as a single argument, and split it apart in the called script. I really don't like these ideas, as they feel like overkill for the task at hand.
Any help in understanding the basic problem (why $null gets stripped from the array and how to preserve it), or in altering the creation of the new session to pick up the $null, is greatly appreciated.
When I've needed to serialize data between scripts, or preserve objects across script executions, I tend to use Export-Clixml and Import-Clixml, sometimes combined with splatting. The Clixml cmdlets preserve objects in their entirety, as they existed previously.
For example, in your sending script:
$a = 1;
$b = $null;
$c = 3;
@($a, $b, $c) | Export-Clixml 'C:\params.xml'
And then to retrieve the data:
$params = Import-Clixml 'C:\params.xml';
$x = $params[0];
$y = $params[1];
$z = $params[2];
Write-Output "$x,$y,$z";
You should get 1,,3.
Often you can even use hash tables to help organize:
@{'a'=$a; 'b'=$b; 'c'=$c} | Export-Clixml 'C:\params.xml'
And then, to retrieve:
$params = Import-Clixml 'C:\params.xml';
$x = $params.a;
$y = $params.b;
$z = $params.c;
But hashtables are a bit funky sometimes. Be sure to test.
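The hashtable form also pairs naturally with splatting if you invoke the second script in the same session rather than through PowerShell.exe; a minimal sketch, assuming Test.ps1's param($a,$b,$c):
$params = Import-Clixml 'C:\params.xml';
& C:\Test.ps1 @params
Each key in the hashtable binds to the parameter of the same name, and a $null value stays $null.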
As for what's going on, it looks like PowerShell skips null values when binding positional parameters from an array like you're doing. The null value is in the array (@(1, $null, 3)[1] -eq $null is True); it's just PowerShell skipping it.
If you specify the param names, then PowerShell knows which parameters you're giving it.
PowerShell.exe -WindowStyle Hidden -NonInteractive -file C:\Test.ps1 -a 1 -c 5
gives you
a 1
b
c 5
More on PowerShell parameters can be found in the about_Parameters help topic (Get-Help about_Parameters).
I have two questions, or a two-part problem:
I have a CSV file with a list of files including full path, e.g.:
C:\file1.txt
C:\file2.xls
C:\file3.doc
etc.
I need to check the modified date for each file and, if it's within a specific timespan, output it to a new CSV file, e.g.:
C:\file1.txt has modified time and date 01/01/2014 10:00
C:\file2.xls has modified time and date 02/01/2014 12:00
C:\file3.doc has modified time and date 03/01/2014 14:00
C:\file4.ppt has modified time and date 04/01/2014 16:00
Timespan = 02/01/2014 8:00 to 03/01/2014 20:00
The output.csv file should contain:
C:\file2.xls
C:\file3.doc
I've tried to modify existing PowerShell scripts I found online, but honestly I don't understand the code. I've tried using ForEach-Object:
$names = get-content .\test.csv
$names | foreach-object {(get-item $_.name).lastwritetime}
but I'm falling over trying to get this to work.
I then need to pass the output.csv file to robocopy, which doesn't support that natively, so I might need a loop that calls robocopy to copy the files from one location to another.
Hope this makes sense.
Observe that the solution below will not work if there are several files with the same file name. The only thing you'll need to add in that case is some form of source base path (from which the file structure is built) and a destination base path, then build the new destination path from these.
$copyChangesSince = Get-Date "2014-01-08"
$doNotCopyChangesNewerThan = Get-Date "2014-02-02"
Get-Content .\files.txt |
    Foreach { Get-Item $_ } |
    Where { $_.LastWriteTime -gt $copyChangesSince -and $_.LastWriteTime -lt $doNotCopyChangesNewerThan } |
    Select FullName, Name |
    Export-Csv C:\temp\filelist.csv -NoTypeInformation
Since you only want to copy a single item at a time, I'm not sure I see the benefit of using robocopy. In my example below, I've just used Copy-Item. If you really want to use robocopy, it shouldn't be hard to replace that call with a call to robocopy.
Import-Csv C:\temp\filelist.csv |
Foreach { Copy-Item -Path $_.FullName -Destination "C:\temp\output\$($_.Name)" }
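If you do prefer robocopy, a minimal sketch of the per-file loop (robocopy takes a source directory, a destination directory, and a file name, so the path has to be split first; the destination folder C:\temp\output is assumed):
Import-Csv C:\temp\filelist.csv |
    Foreach {
        $sourceDir = Split-Path $_.FullName -Parent
        robocopy $sourceDir 'C:\temp\output' $_.Name
    }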
If you don't actually want to use the .csv file afterwards, you could just as easily skip that part and replace the Select and Export-Csv with the Foreach statement from the import part (and of course skip the Import-Csv as well).
Edit: Split the process into two parts, as was requested in the question.
I'm trying to write a PowerShell script to build a list of files, from several directories. After all directories have been added to the main list, I'd like to do the same processing on all files.
This is what I have:
$items = New-Object Collections.Generic.List[IO.FileInfo]
$loc1 = @(Get-ChildItem -Path "\\server\C$\Program Files (x86)\Data1\" -Recurse)
$loc2 = @(Get-ChildItem -Path "\\server\C$\Web\DataStorage\" -Recurse)
$items.Add($loc1) # This line fails (the next also fails)
$items.Add($loc2)
# Processing code is here
which fails with this error:
Cannot convert argument "0", with value: "System.Object[]", for "Add" to type "System.IO.FileInfo": "Cannot convert the "System.Object[]" value of type "System.Object[]" to type "System.IO.FileInfo"."
I am mostly interested in the correct approach for this type of situation. I realize that my code is a very C# way of doing it -- if there is a more PowerShell way to accomplish the same task, I'm all for it. The key is that the number of $loc variables may change over time, so adding and removing one or two should be easy in the resulting code.
Not sure you need a generic list here. You can just use a PowerShell array e.g.:
$items = @(Get-ChildItem '\\server\C$\Program Files (x86)\Data1\' -r)
$items += @(Get-ChildItem '\\server\C$\Web\DataStorage\' -r)
PowerShell arrays can be concatenated using +=.
From get-help get-childitem:
-Path
Specifies a path to one or more locations. Wildcards are permitted. The default location is the current directory (.).
$items = get-childitem '\\server\C$\Program Files (x86)\Data1\','\\server\C$\Web\DataStorage\' -Recurse
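Since the set of locations may change over time, it can also help to keep the paths in a single array variable, so adding or removing one is a one-line change; a minimal sketch:
$paths = '\\server\C$\Program Files (x86)\Data1\',
         '\\server\C$\Web\DataStorage\'
$items = Get-ChildItem -Path $paths -Recurse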
Here is a perhaps even more PowerShell-ish way that needs no concatenation or explicit adding of items to the result at all:
# Collect the results by two or more calls of Get-ChildItem
# and perhaps do some other job (but avoid unwanted output!)
$result = .{
# Output items
Get-ChildItem C:\TEMP\_100715_103408 -Recurse
# Some other job
$x = 1 + 1
# Output some more items
Get-ChildItem C:\TEMP\_100715_110341 -Recurse
#...
}
# Process the result items
$result
But the code inside the script block should be written slightly more carefully to avoid unwanted output mixed together with file system items.
EDIT: Alternatively, and perhaps more effectively, instead of .{ ... } we can use @( ... ) or $( ... ), where ... stands for the code containing several calls of Get-ChildItem.
Keith's answer is the PowerShell way: just use @(...)+@(...).
If you actually do want a typesafe List[IO.FileInfo], then you need to use AddRange, and cast the object array to a FileInfo array -- you also need to make sure you don't get any DirectoryInfo objects, or else you need to use IO.FileSystemInfo as your list type:
So, avoid directories:
$items = New-Object Collections.Generic.List[IO.FileInfo]
$items.AddRange( ([IO.FileInfo[]](ls '\\server\C$\Program Files (x86)\Data1\' -r | Where { -not $_.PSIsContainer } )) )
$items.AddRange( ([IO.FileInfo[]](ls '\\server\C$\Web\DataStorage\' -r | Where { -not $_.PSIsContainer } )) )
Or use FileSystemInfo (the common base class of FileInfo and DirectoryInfo):
$items = New-Object Collections.Generic.List[IO.FileSystemInfo]
$items.AddRange( ([IO.FileSystemInfo[]](ls '\\server\C$\Program Files (x86)\Data1\' -r)) )
$items.AddRange( ([IO.FileSystemInfo[]](ls '\\server\C$\Web\DataStorage\' -r)) )
-Filter is more performant than -Include, so if you don't have a lot of different extensions, simply concatenating two filtered lists might be faster.
$files = Get-ChildItem -Path "H:\stash\" -Filter *.rdlc -Recurse
$files += Get-ChildItem -Path "H:\stash\" -Filter *.rdl -Recurse
I compared the output with a timer like this:
$stopwatch = [System.Diagnostics.Stopwatch]::StartNew()
# Do Stuff Here
$stopwatch.Stop()
Write-Host "$([Math]::Round($stopwatch.Elapsed.TotalSeconds)) seconds elapsed"
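PowerShell also has a built-in cmdlet for this kind of timing; a sketch using one of the calls above:
Measure-Command { Get-ChildItem -Path "H:\stash\" -Filter *.rdlc -Recurse }
Measure-Command runs the script block and returns a TimeSpan.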
I am a PowerShell newbie. I took a sample script and substituted Get-Content for Get-Item in the first line.
The modified script looks like below:
$file = get-content "c:\temp\test.txt"
if ($file.IsReadOnly -eq $true)
{
    $file.IsReadOnly = $false
}
So in essence I am trying to act on the items contained in test.txt, stored as UNC paths:
\\testserver\testshare\doc1.doc
\\testserver2\testshare2\doc2.doc
When running the script, no errors are reported and no action is performed, even on the first entry.
Short answer:
sp (gc test.txt) IsReadOnly $false
Long answer below
Well, some things are wrong with this.
$file is actually a string[], containing the lines of your file. So the IsReadOnly property applies to the string[] and not to the actual files represented by those strings, which happen to be file names.
So, if I'm understanding you correctly you are trying to read a file, containing other file names, one on each line. And clear the read-only attribute on those files.
Starting with Get-Content isn't wrong here. We definitely are going to need it:
$filenames = Get-Content test.txt
Now we have a list of file names. To access the files' attributes, we either need to convert those file names into actual FileInfo objects and operate on those, or pass the file names to the -Path parameter of Set-ItemProperty.
I will take the first approach first and then get to the other one. So we have a bunch of file names and want FileInfo objects from them. This can be done with a foreach loop (since we need to do this for every file in the list):
$files = (foreach ($name in $filenames) { Get-Item $name })
You can then loop over those file objects and set the IsReadOnly property on each of them:
foreach ($file in $files) {
$file.IsReadOnly = $false
}
This was the long and cumbersome variant, but the one that probably suits people with no prior PowerShell experience best. You can reduce the need for having multiple collections of things lying around by using the pipeline. The pipeline transports objects from one cmdlet to another, and those objects still have types.
So by writing
Get-Content test.txt | Get-Item | ForEach-Object { $_.IsReadOnly = $false }
we're achieving exactly the same result. We read the contents of the file, getting a bunch of strings. Those are passed to Get-Item which happens to know what to do with pipeline input: It treats those objects as file paths; exactly what we need here. Get-Item then sends FileInfo objects further down the pipeline, at which point we are looping over them and setting the read-only property to false.
Now, that was shorter and, with a little practice, maybe even easier. But it's still far from ideal. As I said before, we can use Set-ItemProperty to set the read-only property on the files. And we can take advantage of the fact that Set-ItemProperty can take an array of strings as input for its -Path parameter.
$files = Get-Content test.txt
Set-ItemProperty -Path $files -Name IsReadOnly -Value $false
We are using a temporary variable here, since Set-ItemProperty won't accept incoming strings as values for -Path directly. But we can inline this temporary variable:
Set-ItemProperty -Path (Get-Content test.txt) -Name IsReadOnly -Value $false
The parentheses around the Get-Content call are needed to tell PowerShell that this is a single argument and should be evaluated first.
We can then take advantage of the fact that each of those parameters is used in the position where Set-ItemProperty expects it to be, so we can leave out the parameter names and stick just to the values:
Set-ItemProperty (Get-Content test.txt) IsReadOnly $false
And then we can shorten the cmdlet names to their default aliases:
sp (gc test.txt) IsReadOnly $false
We could actually write $false as 0 to save even more space, since 0 is converted to $false when used as a boolean value. But I think the shortening suffices here.
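If you want to preview the change before committing to it, Set-ItemProperty also supports the common -WhatIf switch:
sp (gc test.txt) IsReadOnly $false -WhatIf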
Johannes has the scoop on the theory behind the problem you are running into. I just wanted to point out that if you happen to be using the PowerShell Community Extensions, you can do this with the Set-Writable and Set-ReadOnly commands, which are pipeline-aware, e.g.:
Get-Content "c:\temp\test.txt" | Set-Writable
or the short, aliased form:
gc "c:\temp\test.txt" | swr
The alias for Set-ReadOnly is sro. I use these commands weekly if not daily.