I work for a large organization with many Windows 2012 file servers that we redirect AD users' profile folders to.
Ever since CryptoWall variants started plaguing us, we have been running searches for the various telltale signatures that someone's account is infected.
The last campaign left files named HELP_YOUR_FILES* behind, so they were easy to look for using:
Get-ChildItem -Recurse -Force $filePath -ErrorAction SilentlyContinue | Where-Object { ($_.PSIsContainer -eq $false) -and ($_.Name -like "*$fileName*") } | Select-Object Name, Directory | Format-Table -AutoSize *
Starting this week we've been hit with a new CryptoWall campaign that leaves affected user files named like this:
<original_filename>.extension.mp3
Searching with a parameter of *.mp3, or even *.*.mp3 (in an attempt to catch the double-barreled file extensions), just keeps returning all MP3 files, which is too large a list to sort through to find any infection.
Does anyone have suggestions about how to find only
<original_filename>.extension.mp3
Thank you in advance
You could use a RegEx match to help with this. It would not be perfect, but it would limit the results quite a bit. I think I would do something like:
Get-ChildItem -Recurse -Force $filePath -ErrorAction SilentlyContinue | Where-Object { ($_.PSIsContainer -eq $false) -and ( $_.Name -match "\.[^\.]+\.mp3$") } | Select-Object Name,Directory| Format-Table -AutoSize *
That will search for a pattern where there's a period, at least one non-period character, another period, and then mp3 with nothing after it. Here's a RegEx101 explanation.
As the last lines of the RegEx101 example show, this is not foolproof and would require manual review, but it should reduce false positives to a minimum.
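To sanity-check the pattern before turning it loose on the file servers, you can try it against a few representative names first (these sample names are made up for illustration):

```powershell
# The pattern from the answer: a dot, one or more non-dot characters,
# then ".mp3" anchored to the end of the name.
$pattern = '\.[^\.]+\.mp3$'

'report.docx.mp3' -match $pattern   # encrypted file (double extension): True
'song.mp3'        -match $pattern   # ordinary mp3, only one extension: False
'my.favorite.mp3' -match $pattern   # legitimately dotted name: True (a false positive)
```

The third line shows the false-positive case the answer warns about: any legitimately dotted mp3 name will also match.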
Related
I am just starting to use PowerShell.
I need to compare each file name with a list of names in order to exclude some files from further transfer.
This is the script I wrote:
$E_list = @("*a8e4022a-41c4-4480-b627-02fd6ca03413*","*042d7f01-46da-4f64-b0a8-1fa90e2386d8*")
foreach ($file in (Get-ChildItem $path)) {
    $sname = $file.Name.Substring(0,36)
    if ($sname -notcontains $E_list) {
        Write-Output $file.Name
    }
}
This is what I get :
042d7f01-46da-4f64-b0a8-1fa90e2386d8.20220807T043155.239-0500.2B2DAA77BB5DBC120EC7.log
042d7f01-46da-4f64-b0a8-1fa90e2386d8.20220807T053153.074-0500.7BBEE35804E5257E9322.log
042d7f01-46da-4f64-b0a8-1fa90e2386d8.20220807T063152.902-0500.CFE11DD033449E5C566C.log
042d7f01-46da-4f64-b0a8-1fa90e2386d8.20220807T073153.266-0500.62C3B67C51A3298D7E41.log
514e18c3-73ec-4203-bd6b-b7088f35f86a.20220807T011525.607-0500.28EC734139367578396D.log
77414eee-054c-4251-b564-d7dc8f18c074.20220807T012552.070-0500.98E353CC0CB50F1901C2.log
77414eee-054c-4251-b564-d7dc8f18c074.20220807T022335.696-0500.570D615179EC19BA5454.log
77414eee-054c-4251-b564-d7dc8f18c074.20220807T023742.213-0500.42759D279D05B3F5476F.log
77414eee-054c-4251-b564-d7dc8f18c074.20220807T042724.518-0500.340A0A2454D6C72CFDEA.log
a8e4022a-41c4-4480-b627-02fd6ca03413.20220807T073747.827-0500.B7411E2EF3038A32BF17.log
a8e4022a-41c4-4480-b627-02fd6ca03413.20220807T074255.366-0500.CE5DB99320C88F4CA300.log
a8e4022a-41c4-4480-b627-02fd6ca03413.20220807T074756.774-0500.7F5A11C5C8B6C9FF5E68.log
a8e4022a-41c4-4480-b627-02fd6ca03413.20220807T075255.354-0500.C460A6E0122A9DD787E6.log
bb57d78f-9bfa-41c0-9a78-f4709a7e02ba.20220807T012609.428-0500.942CD4B421572B911155.log
d80daa0d-124b-4061-86b8-d76e5c02d48a.20220807T011529.679-0500.A13D8D9EE5A5628F4C02.log
d80daa0d-124b-4061-86b8-d76e5c02d48a.20220807T012603.506-0500.4BA46E891A4DDA1D55B3.log
d80daa0d-124b-4061-86b8-d76e5c02d48a.20220807T023744.215-0500.3D5F1792A1FEDA5C2AAB.log
d80daa0d-124b-4061-86b8-d76e5c02d48a.20220807T040642.210-0500.637614FAEF716E613EC7.log
dbe57367-16c9-4115-ad75-0f33449b56e4.20220807T043150.243-0500.FE8A3AF63EDF427D33FC.log
Basically all files in this folder.
I expect the files whose names start with the two GUIDs in $E_list (a8e4022a-... and 042d7f01-...) not to appear in the output, but somehow they do.
I will appreciate your help.
You can use the -Exclude parameter of Get-ChildItem to greatly simplify and fix the code:
$E_list= 'a8e4022a-41c4-4480-b627-02fd6ca03413*', '042d7f01-46da-4f64-b0a8-1fa90e2386d8*'
Get-ChildItem $path\* -Exclude $E_list | ForEach-Object { $_.Name }
As for what you have tried:
As commenters noted, you need to swap the operands of -notcontains - i.e. $E_list -notcontains $name. Alternatively, use the -notin operator, i.e. $name -notin $E_list.
The * wildcards in the $E_list strings don't do any good either: the -contains and -notcontains operators don't support wildcard matching. For that purpose you'd have to use the -like or -notlike operators. The -Exclude parameter of Get-ChildItem, by contrast, does support wildcards.
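Applied to the original loop, a minimal sketch of the -notin fix might look like this (assuming PowerShell 3.0 or later, where -notin was introduced; $path still points at the folder of log files):

```powershell
$E_list = 'a8e4022a-41c4-4480-b627-02fd6ca03413',
          '042d7f01-46da-4f64-b0a8-1fa90e2386d8'

foreach ($file in (Get-ChildItem $path)) {
    # The first 36 characters of each name are the GUID prefix
    $sname = $file.Name.Substring(0, 36)
    if ($sname -notin $E_list) {   # scalar on the left, collection on the right
        Write-Output $file.Name
    }
}
```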
A bit different from the others. I'm retrieving an arraylist of files for processing (basically handling DLL registration on a local machine), and I need my script to properly handle multiple DLLs with the same name. The select -Unique doesn't work, since technically the files aren't duplicates - each has its own unique full path.
I need this script to retrieve all DLLs in a folder (as well as sub-folders), but only return the last instance of each named file. For example if I have files:
C:\Path\Update1\GRM.DLL
C:\Path\Update1\HTCP.DLL
C:\Path\Update2\GRM.DLL
C:\Path\Update3\GRM.DLL
The script should return the objects for Update3\GRM.DLL and Update1\HTCP.DLL.
[System.Collections.ArrayList]$dlls = @(Get-ChildItem -Path $PSScriptRoot -Recurse |
    Where-Object {$_.Extension -eq ".dll" -and $_.FullName -notmatch 'dll_Old'})
Edit: Got it going with this, but it's selecting the first instance that shows up, and I need the last. In this example, that means it's snagging Update1/GRM.DLL instead of Update3/GRM.DLL
$dlls = @(Get-ChildItem -Path $PSScriptRoot -Recurse | Where-Object {$_.Extension -eq ".dll" -and $_.FullName -notmatch 'dll_Old'}) | Select-Object -Unique
Use a hashtable to keep track of the last file seen for a specific file name:
$files = @{}
Get-ChildItem -Path $PSScriptRoot -File -Recurse -Filter *.dll |
    Where-Object FullName -notmatch 'dll_Old' |
    ForEach-Object { $files[$_.Name] = $_ }
$uniqueFiles = $files.Values
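As a tiny illustration of the last-one-wins behavior, here is the same idea with stand-in objects instead of real files (the paths are hypothetical):

```powershell
$files = @{}
$items = [pscustomobject]@{ Name = 'GRM.DLL'; FullName = 'C:\Path\Update1\GRM.DLL' },
         [pscustomobject]@{ Name = 'GRM.DLL'; FullName = 'C:\Path\Update3\GRM.DLL' }

# Assigning by Name means a later entry overwrites an earlier one with the same name
$items | ForEach-Object { $files[$_.Name] = $_ }

$files.Values.FullName   # C:\Path\Update3\GRM.DLL
```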
Mathias R. Jessen's helpful answer is probably the best (fastest) solution in this case, but here's an alternative based on the Group-Object cmdlet:
Get-ChildItem -LiteralPath $PSScriptRoot -Recurse -Filter *.dll |
    Where-Object FullName -notmatch 'dll_Old' |
    Group-Object Name |
    ForEach-Object { $_.Group[-1] }
Group-Object Name groups all matching files by their .Name property.
ForEach-Object { $_.Group[-1] } then extracts the last (-1) member from each resulting group.
Note that Group-Object will implicitly sort the groups by the grouping property, so the resulting list of file-info objects (System.IO.FileInfo, as output by Get-ChildItem) will be sorted by file name.
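A quick way to see the grouping in action without touching the filesystem is to feed it stand-in objects (the paths below are the hypothetical ones from the question):

```powershell
$files = [pscustomobject]@{ Name = 'GRM.DLL';  FullName = 'C:\Path\Update1\GRM.DLL' },
         [pscustomobject]@{ Name = 'HTCP.DLL'; FullName = 'C:\Path\Update1\HTCP.DLL' },
         [pscustomobject]@{ Name = 'GRM.DLL';  FullName = 'C:\Path\Update3\GRM.DLL' }

# One group per distinct Name; [-1] picks the last member encountered in each group
$files | Group-Object Name | ForEach-Object { $_.Group[-1].FullName }
# C:\Path\Update3\GRM.DLL
# C:\Path\Update1\HTCP.DLL
```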
Sorry for the dumb question, as I am a beginner in PowerShell. We have a lot of files coming into a directory; the count is always increasing and decreasing, because files are moved to another directory after we are done using them.
We have a priority list in a txt file, which contains only part of each file name.
For example, for the file name:
2017-06-5---666-12-05-01.txt
In the priority list, I have the partial name as ---666---.
How can I check whether the files in the folder are already in the priority list using PowerShell?
I wrote the script below, since I only need the files that are older than a given time, but it is not working.
Get-ChildItem -path $Struct.Path -filter $Struct.filter |
Where-Object {$_.CreationTime -lt $time} |
Where-Object {$_.Name -contains "*$PriorityList*"} |
ForEach-Object { $counterP++}
I have updated your code and now it is working perfectly:
Get-ChildItem -path $Struct.Path -filter $Struct.filter |
    Where-Object {$_.CreationTime -lt $time} |
    ForEach-Object { if ($_.Name -match $Priorfilter) { $counterP++ } }
First of all, the file name in your example doesn't match the partial name. So let's assume that pattern is "666". If you have several patterns, you can join them in one filter:
$PriorityList = "---666---","---555---"
$PriorityFilter = $PriorityList -join '|'
Then use this filter to check Name property:
Get-ChildItem -path $Struct.Path -filter $Struct.filter |
Where-Object {$_.CreationTime -lt $time} |
Where-Object {$_.Name -match $PriorityFilter} |
ForEach-Object { $counterP++}
The -contains operator expects a collection as its left operand and does not accept wildcards for matching.
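Putting the pieces together, a minimal sketch of the join-and-match approach using the simplified pattern (the first file name is from the question; the second, non-matching one is made up for contrast):

```powershell
$PriorityList   = '666', '555'
$PriorityFilter = $PriorityList -join '|'    # becomes the regex alternation '666|555'

'2017-06-5---666-12-05-01.txt' -match $PriorityFilter   # True: name contains 666
'2017-06-5---777-12-05-01.txt' -match $PriorityFilter   # False: matches neither pattern
```

Because -match treats its right operand as a regex, joining patterns with '|' tests all of them in a single comparison.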
I'm trying to create a synchronization script in PowerShell so that my applications in MDT are copied on a regular basis to our main file server, organized by folder name (in MDT, all applications live in one folder, whereas our main server splits applications by the department that uses them).
From what I've read on the web, the best way would be to populate an array with Get-ChildItem, which I've mostly figured out how to do (see the code below).
Once the array is populated, though, I don't know how to search it for specific results, nor how to use those results with Copy-Item.
In a nutshell, here's what I need to do: build an array using Get-ChildItem, query the resulting array for specific folders, and have those folders copied to specific destinations.
Here's the code I have so far:
$arr = Get-ChildItem \\slmtl-wds02.domain.inc\deploymentshare$\applications |
Where-Object {$_.PSIsContainer} |
Foreach-Object {$_.Name}
$sourcepath = '\\slmtl-wds02.domain.inc\deploymentshare$\applications'
$destSLARC = '\\slmtl-fs01.domain.inc\folder\it_services\private\software\service_desk\pc\SLARCMTL'
$destSLMTL = '\\slmtl-fs01.domain.inc\folder\it_services\private\software\service_desk\pc\SLMTL'
$destSLGLB = '\\slmtl-fs01.domain.inc\folder\it_services\private\software\service_desk\pc\SLGLB'
$destSLTECH = '\\slmtl-fs01.domain.inc\folder\it_services\private\software\service_desk\pc\SLTECH'
Thanks in advance for your help :)
$sourceLocation = "c:\analysis\"
$targetLocation = "c:\analysisCopy\"
$included = @("folder1", "folder2")
$result = @()
foreach ($i in $included) {
    $result += Get-ChildItem $sourceLocation -Filter $i | Where-Object {$_.PSIsContainer}
}
$result | ForEach-Object { Copy-Item $_.FullName -Destination $targetLocation -Recurse }
Hope this works. Change the path D:\ to your desired path, and enter the name you are looking for:
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.VisualBasic') | Out-Null
$Keyword = [Microsoft.VisualBasic.Interaction]::InputBox("Enter your Query")
Get-ChildItem D:\ -Recurse | Where-Object {$_.PSIsContainer -eq $false -and $_.Name -match $Keyword} | Copy-Item -Destination D:\test
Here is the story. I have a fileshare that is replicated between 2 servers located in different places in the world. DFS will not replicate a file if it has only been viewed, but I wouldn't want to delete that file/folder because it was used within the time period I have set (7 days). So to make sure that I don't remove still used files I have to check both locations for their LastAccessTime.
I currently have this
Set-ExecutionPolicy RemoteSigned
$limit = (Get-Date).AddDays(-7)
$PathOne = "FirstPath"
$PathTwo = "SecondPath"
$ToBeDeletedPathOne = Get-ChildItem -Path $PathOne -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.LastAccessTime -lt $limit }
$TobeDeletedPathTwo = Get-ChildItem -Path $PathTwo -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.LastAccessTime -lt $limit }
$DiffObjects = Compare-Object -referenceobject $ToBeDeletedPathOne -differenceobject $ToBeDeletedPathTwo -IncludeEqual
$ToBeDeletedOverall = $DiffObjects | Where-Object {$_.SideIndicator -eq "=="}
After this, I loop through and delete the files that are marked for deletion by both locations.
Part of the problem is that there is a tremendous number of files, so this can take a very long time. I wanted to make it better/faster. My idea is to have each FS server run the scan locally as a separate script, and have this script wait for both to return their output. Each machine can scan its own disks far more easily than a remote machine can, and the two locations would be scanned simultaneously.
The other part of the problem is that I have no idea how to do this. I will keep working on it, and if I solve it I will post here in case anyone finds this useful in the future.
You could run everything locally. Copy the script to the machines you want (make a script to copy them if you need to) then use something like PSTools to kick them off on the local machines. This should essentially run the script simultaneously on all machines.
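If PowerShell remoting (WinRM) happens to be enabled in your environment, Invoke-Command is an alternative to PSTools that achieves the same effect and returns the results directly. This is only a sketch; the server names and share path are assumptions, and $limit is the cutoff from the original script:

```powershell
$limit   = (Get-Date).AddDays(-7)
$servers = 'FS01', 'FS02'   # hypothetical server names

# Each server scans its own local disk; the matching file info
# comes back to the calling machine over the remoting channel.
$results = Invoke-Command -ComputerName $servers -ScriptBlock {
    param($limit, $path)
    Get-ChildItem -Path $path -Recurse -Force |
        Where-Object { !$_.PSIsContainer -and $_.LastAccessTime -lt $limit }
} -ArgumentList $limit, 'D:\Share'   # 'D:\Share' is the local path of the replicated folder
```

Invoke-Command runs against multiple computers in parallel by default, which addresses the "scan both locations simultaneously" requirement.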