PowerShell to delete files not matching an array of user names

I have a simple database query that returns an array of current users
e.g.
ABB016
ACQ002
AGU003
AHM007
I would like to delete all files in a particular folder whose base name (the part of the file name to the left of the extension, i.e. BaseName) cannot be found within this array.
The code below is what I currently have, however at the moment it would delete all the files in this directory
# Find files that do not have the same basename as current users' name and delete
Get-ChildItem -Path $DirectoryWithFiles -Recurse *.* | Where-Object {
    $SQLQeryResultsSet -NotContains $_.BaseName } |
    Remove-Item -WhatIf
What if: Performing the operation "Remove File" on target "ABB016.pdf".
What if: Performing the operation "Remove File" on target "ACQ002.pdf".
What if: Performing the operation "Remove File" on target "AGU003.pdf".
What am I doing wrong?

As your comments suggest, $SQLQeryResultsSet is not an array of strings, but a DataTable.
You could extract the individual string values from first column in each data row like this:
$ExcludedUsers = $SQLQeryResultsSet | ForEach-Object { '{0}' -f $_.Item(0) }
Then use this new variable in place of $SQLQeryResultsSet in the Where-Object filter scriptblock.
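Putting it together, here is a minimal, self-contained sketch. The user names and the throwaway folder below are invented stand-ins for the SQL query results and $DirectoryWithFiles, since the original query isn't shown:

```powershell
# Hypothetical sample data standing in for the extracted SQL query results:
$ExcludedUsers = 'ABB016', 'ACQ002'

# Set up a throwaway folder with example files (for demonstration only).
$DirectoryWithFiles = Join-Path ([IO.Path]::GetTempPath()) ([guid]::NewGuid())
New-Item -ItemType Directory -Path $DirectoryWithFiles | Out-Null
'ABB016', 'OLD123' | ForEach-Object {
    New-Item -ItemType File -Path (Join-Path $DirectoryWithFiles "$_.pdf") | Out-Null
}

# Once $ExcludedUsers holds plain strings, -notcontains matches BaseName correctly;
# only OLD123.pdf is flagged for deletion. Drop -WhatIf to actually delete.
Get-ChildItem -Path $DirectoryWithFiles -Recurse -File |
    Where-Object { $ExcludedUsers -notcontains $_.BaseName } |
    Remove-Item -WhatIf
```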


How to find if Powershell Array Contain part of another Array

I have 2 Arrays
$Array1 = Get-Disabledusers SIDS
$Array2 = %Unnecessarytext(SIDS).VHDX
I need to compare Array1 and Array2 and output only the things in Array2 that contain Array1.
Thing is, when I compare both objects, it returns not equal because they don't match exactly.
How do I get it to output the items in Array2 that contain the matching Disabled Users SIDS?
Should I run a foreach loop and compare a part of the Array?
I found this: How to find if Powershell Array Contains Object of Another Array
However this doesn't help as it will return not equal.
Clarified Question:
There is a folder in which there are VHDXs. The VHDXs are named based on a user's SID. However, there is a bunch of unnecessary text before and after the SIDs.
In Array1, I run:
Get-ADUser -Filter {Enabled -eq $false} | FT SID
In order to retrieve a list of disabled users and filter out their SIDs.
In Array2, I list the names of the files in the VHDX folder which look like this: text SID text. I want to compare both and return which files in the VHDX folders contain the SIDS of the disabled users.
You can do it this way: first get the list of SID values from all disabled users and store them in a variable. Then, since the file or folder names (unclear which) are not exact SIDs, you need to check whether each name contains a valid SID; this can be accomplished with a regular expression and the -match operator. If it does, the automatic variable $Matches lets you check whether that SID is -in the array of SIDs of disabled users; if so, output that file or folder and store it in $result:
$re = 'S-1-[0-59]-\d{2}-\d{8,10}-\d{8,10}-\d{8,10}-[1-9]\d{2,3}'
$sids = (Get-ADUser -Filter "Enabled -eq '$false'").SID.Value
$result = foreach($item in Get-ChildItem -Path path\to\something) {
    if($item.Name -match $re) {
        if($Matches[0] -in $sids) {
            $item
        }
    }
}
$result # => has all the files or folders existing in `$sids`
The regex used was taken from this answer; the only change required was the final \d{3} to \d{2,3} so that it matches any valid SID.
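To illustrate the mechanics without an AD connection, the same -match / $Matches / -in logic can be exercised against in-memory sample names (the SIDs and file names below are made up):

```powershell
$re = 'S-1-[0-59]-\d{2}-\d{8,10}-\d{8,10}-\d{8,10}-[1-9]\d{2,3}'

# Hypothetical disabled-user SIDs and VHDX file names:
$sids  = 'S-1-5-21-111111111-222222222-333333333-1001'
$names = 'Profile_S-1-5-21-111111111-222222222-333333333-1001.VHDX',
         'Profile_S-1-5-21-111111111-222222222-333333333-1002.VHDX',
         'NoSidHere.VHDX'

$result = foreach ($name in $names) {
    if ($name -match $re) {
        # $Matches[0] holds the SID extracted from the surrounding text.
        if ($Matches[0] -in $sids) { $name }
    }
}
$result   # only the first name, whose embedded SID is in $sids
```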

searching for multiple strings in multiple files in PowerShell

First of all, I've got a reliable search (thanks to some help on Stack Overflow) that checks for occurrences of different strings in a line over many log files.
I've now been tasked to include multiple searches, and since there are about 20 files and about a dozen search criteria, I don't want to have to access these files over 200 times. I believe the best way of doing this is with an array, but so far all methods I've tried have failed.
The search criteria are made up of a date (which obviously changes every day), a fixed string (ERROR), and a unique Java class name. Here is what I have:
$dateStr = Get-Date -Format "yyyy-MM-dd"
$errword = 'ERROR'
$word01 = [regex]::Escape('java.util.exception')
$pattern01 = "${dateStr}.+${errword}.+${word01}"
$count01 = (Get-ChildItem -Filter $logdir -Recurse | Select-String -Pattern $pattern01 -AllMatches |ForEach-Object Matches |Measure-Object).Count
Add-Content $outfile "$dateStr,$word01,$count01"
The easy way to expand this is to have a separate three-command entry (set word, set pattern, then search) for each class I want to search against - which I've done and it works, but it's not elegant, and then we're processing >200 files to run the search. I've tried to read the Java classes in from a simple text file with mixed results, but it's the only thing I've been able to get to work in order to simplify the search for 12 different patterns.
iRon provided an important pointer: Select-String can accept an array of patterns to search for, and reports matches for lines that match any one of them.
You can then get away with a single Select-String call, combined with a Group-Object call that allows you to group all matching lines by which pattern matched:
# Create the input file with class names to search for.
@'
java.util.exception
java.util.exception2
'@ > classNames.txt
# Construct the array of search patterns,
# and add them to a map (hashtable) that maps each
# pattern to the original class name.
$dateStr = Get-Date -Format 'yyyy-MM-dd'
$patternMap = [ordered] @{}
Get-Content classNames.txt | ForEach-Object {
    $patternMap[('{0}.+{1}.+{2}' -f $dateStr, 'ERROR', [regex]::Escape($_))] = $_
}
# Search across all files, using multiple patterns.
Get-ChildItem -File -Recurse $logdir | Select-String @($patternMap.Keys) |
    # Group matches by the matching pattern.
    Group-Object Pattern |
    # Output the result; send to `Set-Content` as needed.
    ForEach-Object { '{0},{1},{2}' -f $dateStr, $patternMap[$_.Name], $_.Count }
Note:
$logDir, as the name suggests, is presumed to refer to a directory in which to (recursively) search for log files. Passing it to -Filter wouldn't work, so I've removed that (which then positionally binds $logDir to the -Path parameter). -File limits the results to files; if other types of files are also present, add a -Filter argument as needed, e.g. -Filter *.log.
Select-String's -AllMatches switch is generally not required - you only need it if any of the patterns can match multiple times per line and you want to capture all of those matches.
Using @(...), the array-subexpression operator, around the collection of the hashtable's keys, $patternMap.Keys, i.e. the search patterns, is required purely for technical reasons: it forces the collection to be convertible to an array of strings ([string[]]), which is how the -Pattern parameter is typed.
The need for @(...) is surprising and may be indicative of a bug, as of PowerShell 7.2; see GitHub issue #16061.
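The multi-pattern grouping behavior can be sketched in isolation with string input piped directly to Select-String; the log lines and patterns below are invented:

```powershell
$patterns = 'ERROR.+java\.util\.exception', 'ERROR.+java\.io\.IOException'

# Hypothetical log lines:
$lines = '2024-01-01 ERROR java.util.exception: boom',
         '2024-01-01 ERROR java.io.IOException: disk',
         '2024-01-01 ERROR java.util.exception: again',
         '2024-01-01 INFO all good'

# Each MatchInfo records which pattern matched, so Group-Object Pattern
# yields one per-pattern count from a single pass over the input.
$lines | Select-String -Pattern @($patterns) |
    Group-Object Pattern |
    ForEach-Object { '{0},{1}' -f $_.Name, $_.Count }
```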

Replace Database connection string using PowerShell

I am working on a script that should display the current database a program is connected to, and also give an option to replace it with a new database (either by manually entering the database name or, preferably, by listing all databases on the local server\instance and giving users a "number selection" to pick the database they want to use).
The connection string is saved in a text file called server.exe.config. Below is an example of what the file contains; it's not the only data in the config file, obviously:
<add key="persistency.connection" value="data source=MyDatabaseServer;initial catalog=MyDatabase1;Integrated Security=True" />
I can use Get-Content to see the entire file and also use Where-Object {$_ -like "*initial catalog=*"} to see the only line which has the database configuration.
But I think it will be difficult for users to interpret which database is being used, so if possible I need a command that displays just the database name from the config file instead of the entire line, and saves that name for later replacement when the user selects a new database to write into the config file.
Possible?
While plain-text processing, as shown in your own answer, works in simple cases, for robustness it is usually preferable to use XML parsing, such as via the Select-Xml cmdlet:
$file = './server.exe.config'
# Extract the XML element of interest, using an XPath query
$xpathQuery = '/configuration/appSettings/add[@key = "persistency.connection"]'
$el = (Select-Xml $xpathQuery $file).Node
# Replace the database name in the element's value...
$newDatabase = 'NewDatabase'
$el.value = $el.value -replace '(?<=\binitial catalog=)[^;]+', $newDatabase
# ... and save the modified document back to the file.
# IMPORTANT: Be sure to use a *full* path to the output file,
# because PowerShell's current directory (location)
# usually differs from .NET's; Convert-Path ensures that.
$el.OwnerDocument.Save((Convert-Path $file))
Note that this technique works in principle for any XML file, not just .exe.config files.
I think this is simpler, and it works for me:
$ConnString = Get-Content -Path $filepath\server.exe.Config | Where-Object {$_ -like "*initial catalog=*"}
$ConnString -split '\s*;\s*' | ForEach-Object -Process {
    if ($_ -imatch '^initial catalog=(?<dbname>.+)$') {
        $ConnStringdb = $Matches.dbname
    }
}
I am then using the following simply to replace the text above back into the config file after a database name is entered manually. I would love to get it to work automatically, maybe by building an array or something.
Write-Host "Please Enter the database name you want to use for Production(Don't Leave Blank): " -NoNewline; $NewDatabaseConfig = Read-Host;
(Get-Content -Path $filepath\server.exe.Config -Raw) -Replace $ConnStringdb,$NewDatabaseConfig | Set-Content -Path $IPSServerProd\UBERbaseOfficeServer.exe.Config
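One caveat with the replacement above: -replace treats its first operand as a regular expression and replaces every occurrence of the old name anywhere in the file. A more targeted sketch (the config content here is an inline sample standing in for Get-Content -Raw) escapes the old name and anchors the match to the initial catalog key:

```powershell
$ConnStringdb      = 'MyDatabase1'
$NewDatabaseConfig = 'NewDatabase'

# Hypothetical config content; in practice this comes from Get-Content -Raw.
$content = '<add key="persistency.connection" value="data source=Srv;initial catalog=MyDatabase1;Integrated Security=True" />'

# Anchor on "initial catalog=" so only the database name is replaced,
# and escape the old name in case it contains regex metacharacters.
$updated = $content -replace "(?<=\binitial catalog=)$([regex]::Escape($ConnStringdb))(?=;)", $NewDatabaseConfig
$updated
```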

Is it possible to make IndexOf case-insensitive in PowerShell?

I've got a problem searching for an index in an array built from the query sessions command on a terminal server.
This is the problematic script:
# Array of logged users in terminal servers
$a=Get-RDUsersession -CollectionName "BLABLA" -ConnectionBroker BLABLA.BLA.BL
# Array of all users with two columns from active directory
$b=Get-ADUser -filter * -properties TelephoneNumber,SamAccountName
Now imagine logging in the terminal server using the account name TEST instead of test.
If I do:
$c = $b[$b.SamAccountName.indexof("test")].TelephoneNumber
then I don't get the telephone number.
I think that's because of the case sensitivity, isn't it? If I type TEST in the search command, I get the correct number.
Is there any simple way to solve this problem and make the search of the index case-insensitive?
I've read about using the [StringComparison]"CurrentCultureIgnoreCase" method, but it doesn't seem to work with arrays.
Thanks.
Since $b is an Object[] type, you would probably want to use Where-Object.
$b | Where-Object -FilterScript {$_.Samaccountname -like '*Smith*'} | Select-Object -ExpandProperty 'telephoneNumber'
That being said, an array in Powershell can be indexed case-insensitively if it is converted to a [Collections.Generic.List[Object]] type.
$b = [Collections.Generic.List[Object]]$b
$b.FindIndex( {$args[0].sAMAccountName -eq 'test'} )
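Since PowerShell's -eq is case-insensitive for strings, the FindIndex predicate above matches 'TEST' even when you search for 'test'. A self-contained sketch with made-up objects standing in for the AD results:

```powershell
# Hypothetical stand-ins for the objects returned by Get-ADUser:
$b = [Collections.Generic.List[Object]]@(
    [pscustomobject]@{ sAMAccountName = 'SMITH'; TelephoneNumber = '555-0199' }
    [pscustomobject]@{ sAMAccountName = 'TEST';  TelephoneNumber = '555-0100' }
)

# -eq inside the predicate compares strings case-insensitively,
# so 'test' finds the 'TEST' entry at index 1.
$i = $b.FindIndex( { $args[0].sAMAccountName -eq 'test' } )
$b[$i].TelephoneNumber   # 555-0100
```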
Note that pulling every single user object in AD and filtering with Where-Object or index matching can be very slow. You can instead call Get-ADUser as needed, or pull all AD users with a filter that returns only the users present in $a.
If you insist on having all ADUsers in one spot with one pull, consider looping over the list once to make a hash lookup so you can easily index the hash value.
# Create account lookup hash
$accountlookup = @{}
foreach ($element in $accounts) {
    $accountlookup[$element.SamAccountName] = $element
}
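A quick illustration with made-up objects; note that PowerShell hashtable keys are case-insensitive by default, which also sidesteps the original TEST-vs-test problem:

```powershell
# Hypothetical stand-ins for AD user objects:
$accounts = [pscustomobject]@{ SamAccountName = 'TEST';  TelephoneNumber = '555-0100' },
            [pscustomobject]@{ SamAccountName = 'smith'; TelephoneNumber = '555-0101' }

$accountlookup = @{}
foreach ($element in $accounts) {
    $accountlookup[$element.SamAccountName] = $element
}

# Hashtable keys match case-insensitively, so 'test' finds the 'TEST' entry:
$accountlookup['test'].TelephoneNumber   # 555-0100
```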
Hope that helps!

Powershell importing one csv column in array. Contains command doesn't work with this array

With a PowerShell script, I'm importing a CSV file into an array; everything works fine.
$csv = Import-Csv "C:\test.csv" -delimiter ";"
But I'm not able to easily find a value in a field named PeopleID directly.
The best method I've found is to loop through every line of the array and check whether the item I'm looking for exists, like:
foreach($item in $csv)
{
    if ($csv | where {$item.PeopleID -eq 100263} | select *) {
        # Good I found it!!!
    }
    $List.add($item.PeopleID) > $null
}
Instead I decide to import only my column PeopleID directly in an array to make it faster:
$csvPeople = Import-Csv "C:\test.csv" -delimiter ";" | select PeopleID
If you see higher, I also create an array $List that add every PeopleID in my loop.
So I have 2 arrays that are identical.
The problem is when I use the -contains operator:
if($csvPeople -contains 100263)
the result is false
if($List -contains 100263)
the result is true
What can I do to have my $csvPeople array work with -contains?
Importing a CSV column is faster than looping through the results and adding them to a new array, but only the looped array works.
Am I missing something when I import my CSV column, so that I get a "working" array?
Thanks
I think you are looking for -like, not -contains. -contains is a bit finicky about how it is used. If you replace your if syntax with this:
if($csvPeople -like "*100263*")
You should be good to go. Note the wildcards on either side; I'm putting these here for you since I don't know exactly what your data looks like. You may be able to remove or change them.
Obligatory -like vs -contains article if you are interested: http://windowsitpro.com/blog/powershell-contains
Also, @AnsgarWiechers' comment above will work. I believe you will still need to wrap your number in quotes, though. I don't like to do this as it requires an exact match. If you are working with the CSV in Excel or elsewhere and you have whitespace or oddball line-ending characters, then you might not get a hit with -contains.
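For what it's worth, one likely reason -contains fails here is that select PeopleID returns objects that each have a PeopleID property, not plain strings, and -contains compares whole elements. A minimal sketch (sample IDs invented) showing the difference when the property is expanded to raw values:

```powershell
# Simulate rows as Import-Csv | select PeopleID produces them:
# objects with a PeopleID property, not plain strings.
$csvPeople = @(
    [pscustomobject]@{ PeopleID = '100263' }
    [pscustomobject]@{ PeopleID = '100264' }
)

# -contains compares whole objects against the string, so this is False:
$csvPeople -contains '100263'

# Expanding the property yields plain strings, so -contains works:
$ids = $csvPeople | Select-Object -ExpandProperty PeopleID
$ids -contains '100263'   # True
```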
I just noticed this in your script above where you are doubling your efforts:
foreach($item in $csv) # <-- Here $item is a record in the CSV
{
    if ($csv | where {$item.PeopleID -eq 100263} | select *) { # <-- Here you are re-parsing the CSV even though you already have the data in $item
    }
    $List.add($item.PeopleID) > $null
}
Since I don't have the full picture of what you are trying to do I'm providing a solution that allows you to match and take action per record.
foreach($item in $csv)
{
    if ($item.PeopleID -eq 100263) {
        # Do something with $item here now that you matched it
    }
}
To address your 191 users, I would take a different approach. You should be able to load a raw list of PeopleIDs into a variable and then use the -in operator inside Where-Object to do a list-to-list comparison.
First load your target PeopleID's into a list. Assuming it is just a flat list, not a csv you could do your comparison like this:
$PeopleID_List = Get-Content YourList.txt
$csv | Where-Object { $_.PeopleID -in $PeopleID_List } | ForEach-Object {
    # Do something with the matched record here. Reference it with $_
}
This basically says: for each record in the CSV, check whether that record's PeopleID is in $PeopleID_List. If it is, take action on that record.
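With invented sample records standing in for the imported CSV and the text file, the list-to-list filter looks like this:

```powershell
# Hypothetical ID list, as Get-Content would return it:
$PeopleID_List = '100263', '100500'

# Hypothetical records, as Import-Csv would produce them:
$csv = [pscustomobject]@{ PeopleID = '100263'; Name = 'Alice' },
       [pscustomobject]@{ PeopleID = '100999'; Name = 'Bob' }

# Keep only records whose PeopleID appears in the list:
$matched = $csv | Where-Object { $_.PeopleID -in $PeopleID_List }
$matched.Name   # Alice
```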
