PowerShell script to create folders and then modify creation date from csv - arrays

I would like to create a PowerShell script that can import a CSV file (details.csv) with two headers (FileName and FileCreationTime). Ideally, the script would look for details.csv in the location where the script is saved.
It would create folders in the script's current location with the same name as FileName, and the creation date of said folder would then be changed to match FileCreationTime.
Example chunk of my CSV [made in A & B columns of Excel then saved as CSV (comma delimited)(*.csv)]:
FileName FileCreationTime
Alpha 5/17/2017
Bravo 12/23/2013
Charlie 11/8/2015
I have been searching for a solution, but nothing I do seems to be quite right. I currently have this:
Import-Csv -Path 'K:\Users\eschlitz\Thesis\details.csv' -Delimiter "," |
    ForEach-Object {
        $path = 'K:\Users\eschlitz\Thesis'
        # Again, didn't want a definite path here, but I was trying different
        # tweaks to see if I could get at least one instance to work correctly.
        New-Item -Path $path -Name $$_.Filename -Type Directory
        (Get-Item $_.Filename).CreationTime = (Get-Date $_.FileCreationTime)
    }
My current error message:
Get-Item : Cannot find path 'K:\Users\eschlitz\Thesis\Alpha' because it does not exist.
I do not care about whether or not the hh:mm:ss part of the creation time is edited for the new folders, but it would be a bonus if I could standardize them all to 12:00:00 AM.
~~~~~~~~~~~~~~~~~~~~~~~Question Duplication Edit~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Suggested edit to show how my question differs from PowerShell: Change the timestamp (Date created) of a folder or file:
Everything I was able to find either A) created folders from a CSV, or B) edited the creation date of a single folder (or batch-edited the creation dates of multiple folders, but only with a single new creation time). I wanted the script to fail if it could not find the new creation time unique to each new folder, eliminating the need to manually delete wrong folders or edit creation times by hand.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~Edit~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Just wanted to post the complete, working versions in case anyone needs them in the future.
#Adds folders to specified directory and modifies their creation date
Import-Csv -Path 'K:\Users\eschlitz\Thesis\details.csv' -Delimiter "," |
    ForEach-Object {
        $path = 'K:\Users\eschlitz\Thesis'
        $dir = New-Item -Path $path -Name $_.Filename -Type Directory
        $dir.CreationTime = [DateTime]::ParseExact($_.FileCreationTime,
            'M\/d\/yyyy', [Globalization.CultureInfo]::InvariantCulture)
    }
And a slightly different version depending on needs:
#Adds folders and then modifies their creation date where script+csv
#currently are
Set-Location -Path "$PSScriptRoot"
Import-Csv -Path ".\details.csv" -Delimiter ',' |
    ForEach-Object {
        New-Item -Path "$PSScriptRoot" -Name $_.FileName -Type Directory
        (Get-Item $_.FileName).CreationTime =
            [DateTime]::ParseExact($_.FileCreationTime, 'M\/d\/yyyy',
                [Globalization.CultureInfo]::InvariantCulture)
    }

The folder is not created because you have a typo in the New-Item statement ($$_.Filename → $_.Filename), and even if it were created, Get-Item most likely wouldn't be able to find it, because it looks in the current working directory, whereas you create the folder in $path. You can avoid the latter issue by capturing the DirectoryInfo object that New-Item returns in a variable:
$dir = New-Item -Path $path -Name $_.Filename -Type Directory
$dir.CreationTime = (Get-Date $_.FileCreationTime)
You may also need to actually parse the date string into a DateTime value (depending on your locale):
[DateTime]::ParseExact($_.FileCreationTime, 'M\/d\/yyyy', [Globalization.CultureInfo]::InvariantCulture)
If you defined the date in ISO format (yyyy-MM-dd) Get-Date should be able to digest it regardless of the system's locale.
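As a quick check (a sketch using the sample date from the question): parsing a date-only string yields midnight, which also covers the bonus wish of standardizing all creation times to 12:00:00 AM:

# Both calls below yield a DateTime at midnight (12:00:00 AM), since no time component is parsed
[DateTime]::ParseExact('5/17/2017', 'M\/d\/yyyy', [Globalization.CultureInfo]::InvariantCulture)

# ISO format parses the same way regardless of the system locale
Get-Date '2017-05-17'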

Related

Search multiple text files for string and copy line by appending it to summary_[date].log

I'm trying to search through a number of log files with different filenames. I want to search for a hostname in each log and, when a match is found, have it copy that entire line to summary_[date].log and keep appending matching lines to it. So something I've started with is:
$captured = Get-ChildItem -recurse -Path \\nas1\share1 -Include *.log |
where { ($_.Name | Select-String -pattern ('PC1','PC2','PC3') -SimpleMatch) }
Now copy the line from each log file which contains the pattern and append it to a file with today's date stamp, so each week I'll have a file like \\nas1\share1\summary_03-07-2020.log
But this is not quite what I want, as this captures the filenames into the $captured variable. It's also missing the code to copy any matching lines to a date-stamped summary_[date].log.
Each text file will contain, among other lines that start with a time stamp, something like this:
03-07-2020_14-36-17 - Backup of computer [PC1] is successfully created.
So what I want is to search several text files on a share for several hostnames. If a text file contains the hostname have it append the line which contains the hostname to summary_[date].log. Lastly, since the matching lines will all start with a date/time stamp I need to keep the contents of summary_[date].log file sorted from newest date to oldest.
Essentially I should end up with a summary_[date].log every week that will look similar to this:
03-07-2020_14-36-17 - Backup of computer [PC1] is successfully created.
03-07-2020_13-21-12 - Backup of computer [PC3] is successfully created.
03-07-2020_11-36-29 - Backup of computer [PC2] is successfully created.
By doing this I get a summary of all log files from that day in a single file which I will then automatically email to a specific email address.
How do I accomplish this?
Your current code selects a string from the file name, not the content.
To do what you are after, you can use something like this:
$logFolder = '\\nas1\share1'
$outFile = Join-Path -Path $logFolder -ChildPath ('summary_{0:dd-MM-yyyy}.log' -f (Get-Date))

$captured = Get-ChildItem -Recurse -Path $logFolder -Include *.log | ForEach-Object {
    ($_ | Select-String -Pattern 'PC1','PC2','PC3' -SimpleMatch).Line
} | Sort-Object  # sorting is optional of course

# output on screen
$captured

# output to new log file
$captured | Set-Content -Path $outFile
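One caveat: the question asks for newest-to-oldest order, and the dd-MM-yyyy_HH-mm-ss stamps don't sort chronologically as plain strings. A sketch, assuming every matched line starts with such a stamp as in the example, is to replace the plain Sort-Object above with a descending sort on the parsed stamp:

$captured = Get-ChildItem -Recurse -Path $logFolder -Include *.log | ForEach-Object {
    ($_ | Select-String -Pattern 'PC1','PC2','PC3' -SimpleMatch).Line
} | Sort-Object -Descending {
    # parse the leading 'dd-MM-yyyy_HH-mm-ss' stamp into a DateTime for sorting
    [DateTime]::ParseExact($_.Substring(0, 19), 'dd-MM-yyyy_HH-mm-ss',
        [Globalization.CultureInfo]::InvariantCulture)
}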
Next, send this file as an attachment:
# use splatting for cmdlets that take a lot of parameters
$params = @{
    SmtpServer  = 'yourmailserver'
    From        = 'logtester@yourcompany.com'
    To          = 'someone@yourcompany.com'
    Subject     = 'Summary'
    Body        = 'Hi, in the attachment a summary of the backup logs'
    Attachments = $outFile
    # etc.
}
Send-MailMessage @params
How big are your log files?
This will work, but it loads the entire content of all files into memory and may be inefficient with large files and/or a large number of files.
If there is a large number of files or large files, you could use nested foreach loops and loop through each computer name for each file.
This also uses -match rather than -like in Where-Object; you can use -like if you get false positives.
$FileContent = Get-Content -Path "\\nas1\share1\*.log"
$ComputerArray = 'PC1','PC2','PC3'
$DateStamp = Get-Date -Format dd-MM-yyyy
$OutFile = 'summary_' + $DateStamp + '.log'

foreach ($ComputerName in $ComputerArray)
{
    $FileContent | Where-Object { $_ -match $ComputerName } | Add-Content -Path $OutFile
}
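If memory does become a problem, a streaming variant (a sketch, assuming the same share and computer names) lets Select-String read the files line by line instead of loading everything up front:

$ComputerArray = 'PC1','PC2','PC3'
$OutFile = 'summary_{0:dd-MM-yyyy}.log' -f (Get-Date)

# Select-String streams each file, so the full file contents never sit in memory at once
Get-ChildItem -Path '\\nas1\share1' -Filter *.log -Recurse |
    Select-String -Pattern $ComputerArray -SimpleMatch |
    ForEach-Object { $_.Line } |
    Add-Content -Path $OutFile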

Copy-Item and rename to a specific parent directory in Powershell

I've got a script that copies a list of directories I've filtered into a new location, renaming each with its name plus the LastWriteTime:
$srcdir = "Z:\Production500\000600"
$destdir = "X:\Standards\Water Resources\GIS\_Water Resources GIS Database\_Unprocessed_Raw_data"
$folders = Get-ChildItem -Recurse "Z:\Production500\000600" -Filter "Water Resources","GIS"
$i=1
$folders | % ($_) {cp $_.FullName -Destination "$destdir\$($_.Name + $_.LastWriteTime.toString("_yyyy_MM_dd_") + $i)" -recurse; $i++}
The script works great copying the exact folders and files I need into the following format:
.\GIS_2017_07_09_1
.\GIS_2017_07_10_2
.\GIS_2017_07_10_3
Instead of the counter, I'd prefer to have a name on the end from one of the parent directories. For Example if $folders is this list:
Directory: Z:\Production500\000600\B000676\Design\004\Chisholm Park\Water Resources
Directory: Z:\Production500\000600\B000667\Design\001\Water Resources
Directory: Z:\Production500\000600\B000663\Design\001\Water Resources
I'd like the copied items renamed to this:
.\GIS_2017_07_09_B000676
.\GIS_2017_07_10_B000667
.\GIS_2017_07_10_B000663
thus eliminating the need for the counter and also making the destination more organized. Notably, the B000### is always the 3rd folder deep.
Let's assume your path is in a variable; you can proceed with this, since you have already mentioned that it will always be the 3rd-level folder.
You can use Split to get the folder name and put it inside the loop:
$path="Z:\Production500\000600\B000676\Design\004\Chisholm Park\Water Resources"
$srcdir = "Z:\Production500\000600"
$destdir = "X:\Standards\Water Resources\GIS\_Water Resources GIS Database\_Unprocessed_Raw_data"
$folders = Get-ChildItem "Z:\Production500\000600" -Filter "Water Resources","GIS" -Recurse
$folders | % ($_) {cp $_.FullName -Destination "$destdir\$($_.Name + $_.LastWriteTime.toString("_yyyy_MM_dd_") + "$($path.split('\')[3])")" -recurse}
You do not need the $i variable or the incremental operation.
Make sure you pass the path properly so that it can be split. This is just a sample with the path hardcoded, since you have not specified whether it will come from the destination path or the source path.
Hope it helps.
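Since the B000### segment is always the 3rd folder deep, a per-item variant (a sketch reusing $destdir and $folders from the question, splitting each folder's own FullName rather than a hardcoded $path) avoids the hardcoding entirely:

$folders | ForEach-Object {
    # e.g. Z:\Production500\000600\B000676\... -> index 3 is 'B000676'
    $tag = $_.FullName.Split('\')[3]
    Copy-Item $_.FullName -Recurse -Destination "$destdir\$($_.Name + $_.LastWriteTime.ToString('_yyyy_MM_dd_') + $tag)"
}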

-Filter not working when input file has more than once line [duplicate]

This question already has answers here:
How to properly -filter multiple strings in a PowerShell copy script
(5 answers)
Closed 6 years ago.
I have this script and it's working 100%, but only for a single item.
I want to loop the script and get content from a txt file.
You see, my script searches for a specific file and copies it to an existing folder with the same name as the file.
So what I want is to get the folder's name and the file's name from 2 txt files and loop the script.
I have managed to get the content from the txt files, but I can't loop the script if I add a second line with new values in my txt files.
I always get the error:
Get-ChildItem : Cannot convert 'System.Object[]' to the type 'System.String'
required by parameter 'Filter'. Specified method is not supported.
OK, this is my script:
Set-ExecutionPolicy -Scope Process -ExecutionPolicy RemoteSigned
# Setup source and destination paths
$Src = '\\192.168.0.216\home\'
$Dst = 'C:\TEST\120629B\'
# Wildcard for filter
$Extension = '120629B.jpg'
# Get file objects recursively
Get-ChildItem -Path $Src -Filter $Extension -Recurse |
    # Skip directories, because XXXReadMe.txt is a valid directory name
    Where-Object { !$_.PsIsContainer } |
    # For each file
    ForEach-Object {
        # If the file exists in the destination folder, rename it with a directory tag
        if (Test-Path -Path (Join-Path -Path $Dst -ChildPath $_.Name))
        {
            # Get the full path to the file without the drive letter and replace '\' with '-'
            # [regex]::Escape is needed because -replace uses regex, so we should escape '\'
            $NameWithDirTag = (Split-Path -Path $_.FullName -NoQualifier) -replace [regex]::Escape('\'), '-'
            # Join the new file name with the destination directory
            $NewPath = Join-Path -Path $Dst -ChildPath $NameWithDirTag
        }
        # Don't modify the new file path if the file doesn't exist in the target dir
        else
        {
            $NewPath = $Dst
        }
        # Copy the file
        Copy-Item -Path $_.FullName -Destination $NewPath
    }
OK, this is what I changed, and it worked, but only with one record:
$Src = '\\192.168.0.216\home\'
$Dst = Get-Content 'C:\TEST\path.txt'
# Wildcard for filter
$Extension = Get-Content 'C:\TEST\file.txt'
The error message is telling you the problem: you can't use an array as the filter for Get-ChildItem. You could probably nest a Where-Object filter inside of a foreach loop, but the easiest way to accomplish what you are trying to do is to loop through your extension filters and run your existing loop inside that loop. So wrap your entire Get-ChildItem pipeline in a foreach loop, as below.
Foreach($e in $extension){
*Your Code Here*
}
Of course, make sure to change the -Filter parameter of your Get-ChildItem from $Extension to $e.
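A minimal sketch of that wrapper, using the variables from the question (the inner logic is the script you already have):

$Src = '\\192.168.0.216\home\'
$Extension = Get-Content 'C:\TEST\file.txt'   # one filter pattern per line

foreach ($e in $Extension) {
    Get-ChildItem -Path $Src -Filter $e -Recurse |
        Where-Object { !$_.PsIsContainer } |
        ForEach-Object {
            # existing rename/copy logic from the script goes here
            $_.FullName
        }
}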
As the error says, -Filter expects a single string, and Get-Content returns an object array for files with more than one line.
Since you are also using -Recurse, consider using -Include instead of -Filter, since it supports arrays of strings. This should work without changing your input file or adding any other post-processing. From MSDN:
Specifies, as a string array, an item or items that this cmdlet includes in the operation. The value of this parameter qualifies the Path parameter. Enter a path element or pattern, such as *.txt. Wildcards are permitted.
Get-ChildItem -Path $Src -Include $Extension -Recurse
Note:
The Include parameter is effective only when the command includes the Recurse parameter or the path leads to the contents of a directory, such as C:\Windows*, where the wildcard character specifies the contents of the C:\Windows directory.
The same goes for -Exclude.
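Putting it together with the question's input file (a sketch; -Include takes the whole array, so no extra loop is needed):

$Src = '\\192.168.0.216\home\'
$Extension = Get-Content 'C:\TEST\file.txt'   # may contain several lines

Get-ChildItem -Path $Src -Include $Extension -Recurse |
    Where-Object { !$_.PsIsContainer }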

Powershell Match Filename with same basename but different file extensions and copy them to another folder

Powershell Version 2
Need some PowerShell help. I've got a directory like the example below, where there will be lots of files; their common property will be the name, but with different extensions. I need to recurse through the directory, match any files with the same base name, and copy them to a location. The files need to be copied to the destination together.
Example of files. 40127.wav , 40127.txt , 40127.ini , 40128.wav , 40128.txt , 40128.ini
My example script works, but only for one member of the array, as I call the member of the array using $test[0] to match the base name. Not sure how to get it to go through the whole array, as I seem to be breaking the foreach command.
$source = 'c:\audio'
$test = Get-ChildItem -Path $source
foreach ($item in $test)
{get-childitem -path $source -recurse | where {$_.basename -eq $test[0].basename } | Move-Item -Destination C:\Backup }
Result
3 files in C:\Backup: 40127.wav, 40127.txt, 40127.ini.
Any help much appreciated.
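One way to process the whole array instead of indexing $test[0] is to group on BaseName (a sketch, assuming the flat layout above; swap Move-Item for Copy-Item if the originals should stay in place):

$source = 'c:\audio'
$destination = 'C:\Backup'

# Group-Object collects all files sharing a base name, so each set moves together
Get-ChildItem -Path $source -Recurse |
    Where-Object { !$_.PsIsContainer } |
    Group-Object -Property BaseName |
    ForEach-Object { $_.Group | Move-Item -Destination $destination }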

Copy files from source directory to target directory and exclude specific file types from specified directories

I have created a simple PowerShell script to copy files during a deployment from a source directory to a target directory, and I would like to exclude a list of files. The caveat, however, is that I would like the ability to exclude files from a sub-directory only when specified. This is the snippet I'm using to perform the copy and exclude a list of files:
$SourceDirectory = "C:\Source"
$DestinationDirectory = "C:\Destination"
$Exclude = @("*.txt*", "*.xml*")
Get-ChildItem $SourceDirectory -Recurse -Exclude $Exclude | Copy-Item -Destination {Join-Path $DestinationDirectory $_.FullName.Substring($SourceDirectory.length)}
This will exclude the specified files wherever they appear in the directory tree. Where I would like to get to with the Exclude list is something like this:
$Exclude = @("*Sub1\.txt*", "*.xml*")
This would exclude .txt files only under the Sub1 folder while .xml files would be excluded throughout. I know this doesn't work, but I hope that it helps to better demonstrate the problem I'm trying to solve.
I have considered using a multidimensional array, but I'm not sure if that might be overkill. Any help would be appreciated.
This is one way to do it:
$SourceDirectory = 'C:\Source'
$DestinationDirectory = 'C:\Destination'
$ExcludeExtentions = '*.txt*', '*.xml*'
$ExcludeSubDirectory = 'C:\Source\bad_directory1', 'C:\Source\bad_directory2'

Get-ChildItem $SourceDirectory -Recurse -Exclude $ExcludeExtentions |
    Where-Object { $ExcludeSubDirectory -notcontains $_.DirectoryName } |
    Copy-Item -Destination $DestinationDirectory
Your best friend here is Where-Object (or its alias where). It takes a scriptblock as a parameter and uses that scriptblock to validate each object that goes through the pipeline. Only objects for which the scriptblock returns $true are passed through Where-Object.
Also, take a look at the object that represents a file you get from Get-ChildItem. It has Name, Directory, and DirectoryName properties containing pieces of the file's FullName already split. Directory is actually an object that represents the parent directory, while DirectoryName is a string. The Get-Member cmdlet will help you discover hidden gems like these.
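For example, a quick way to inspect those properties on a file object:

# list the properties (Directory, DirectoryName, FullName, ...) of the first file found
Get-ChildItem 'C:\Source' |
    Where-Object { !$_.PsIsContainer } |
    Select-Object -First 1 |
    Get-Member -MemberType Property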
$SourceDirectory = 'C:\Source'
$DestinationDirectory = 'C:\Destination'
$ExcludeExtentions1 = "^(?=.*?(SubDirectory1))(?=.*?(.xml)).*$"
$ExcludeExtentions2 = "^(?=.*?(SubDirectory2))(?=.*?(.config)).*$"
$ExcludeExtentions3 = "^(?=.*?(.ps1))((?!SubDirectory1|SubDirectory2).)*$"
$ExcludeExtentions4 = ".txt|.datasource"

$files = Get-ChildItem $SourceDirectory -Recurse
foreach ($file in $files)
{
    if ($file.FullName -notmatch $ExcludeExtentions1 -and
        $file.FullName -notmatch $ExcludeExtentions2 -and
        $file.FullName -notmatch $ExcludeExtentions3 -and
        $file.FullName -notmatch $ExcludeExtentions4)
    {
        $CopyPath = Join-Path $DestinationDirectory $file.FullName.Substring($SourceDirectory.Length)
        Copy-Item $file.FullName -Destination $CopyPath
    }
}
In this solution, using regex and -notmatch I am able to exclude specific file types from specific directories. $ExcludeExtentions1 will exclude xml files only from SubDirectory1, $ExcludeExtentions2 will exclude config files only from SubDirectory2, $ExcludeExtentions3 will exclude ps1 files as long as they are not in either of the two SubDirectories, $ExcludeExtentions4 will exclude txt and datasource files throughout the entire tree.
We are not actually using all of these matches in our solution, but since I was working on this, I thought I would add multiple conditions in case others could benefit from this approach.
Here are a couple of links that also helped:
http://www.tjrobinson.net/?p=109
http://dominounlimited.blogspot.com/2007/09/using-regex-for-matching-multiple-words.html
