PowerShell searching matches over 3 CSVs ... slow execution - arrays

I have to find matches across 3 CSVs. The goal is to find out whether users have AccessRights on Public Folders in Exchange 2016. For ease of use I have already gathered and stored all the needed values in 3 CSVs:
"PF-Folder_Full.csv": a list of all the Publich Folders (more than 5000)
"PF-Mailboxes.csv": a list of all the users (around 50)
"PF-Permissions.csv": the result of
Get-PublicFolderClientPermission -Identity $Folder.Identity
looped through all the Public Folders (that takes ages)
I have written a script that does the job, but even on a fast computer it is extremely slow, because it has to loop through all the Public Folders and all the users and then find a match for both values in the permissions:
$Folders = Import-Csv -Path ".\PF-Folder_Full.csv" -Encoding Unicode
$Mailboxes = Import-Csv -Path ".\PF-Mailboxes.csv" -Encoding Unicode
$Permissions = Import-Csv -Path ".\PF-Permissions.csv" -Encoding Unicode

foreach ($Folder in $Folders) {
    foreach ($Mailbox in $Mailboxes) {
        $Permission = $Permissions | where {
            ($_.Identity -eq $Folder.Identity) -and
            ($_.User -eq $Mailbox.DisplayName)
        }
        if ($Permission) {
            # some code
        } else {
            # some other code
        }
        Remove-Variable Permission
    }
}
Is there a way to speed things up, possibly through the use of regular expressions?
I couldn't find any example that allows for extended matches between multiple arrays.

As Kory Gill mentioned in the comments, build two hashtables keyed on the Identity and DisplayName properties of the first two CSVs. Hashtable key lookups are effectively constant-time, so they replace the repeated Where-Object scans:
$Folders = @{}
Import-Csv -Path ".\PF-Folder_Full.csv" -Encoding Unicode | ForEach-Object {
    $Folders[$_.Identity] = $_
}

$Mailboxes = @{}
Import-Csv -Path ".\PF-Mailboxes.csv" -Encoding Unicode | ForEach-Object {
    $Mailboxes[$_.DisplayName] = $_
}
Then process the third CSV using these hashtables for lookups:
$Permissions = Import-Csv -Path ".\PF-Permissions.csv" -Encoding Unicode

foreach ($p in $Permissions) {
    if ($Folders.Contains($p.Identity) -and $Mailboxes.Contains($p.User)) {
        # some code
    } else {
        # some other code
    }
}
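The hashtable values hold the full CSV rows, so inside the "# some code" branch you can still reach the matching folder and mailbox records. A sketch only; the AccessRights column is an assumption about what PF-Permissions.csv contains:
# Sketch of what "# some code" could do: look up the original rows for the
# matched permission entry. AccessRights is assumed to be a CSV column.
$folder  = $Folders[$p.Identity]
$mailbox = $Mailboxes[$p.User]
'{0} has {1} on {2}' -f $mailbox.DisplayName, $p.AccessRights, $folder.Identity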
If you want the code to run just once per unique identity/user combination, you could build a hashtable keyed on the combined identity and mailbox name for filtering:
$Folders = Import-Csv -Path ".\PF-Folder_Full.csv" -Encoding Unicode |
    Select-Object -Expand Identity
$Mailboxes = Import-Csv -Path ".\PF-Mailboxes.csv" -Encoding Unicode |
    Select-Object -Expand DisplayName

$fltr = @{}
foreach ($f in $Folders) {
    foreach ($m in $Mailboxes) {
        # Group-Object joins multiple property values with ", ", so build the keys the same way
        $fltr["$f, $m"] = $true
    }
}
and then group the records from the third CSV:
Import-Csv -Path ".\PF-Permissions.csv" -Encoding Unicode |
    Group-Object Identity, User |
    ForEach-Object {
        if ($fltr.Contains($_.Name)) {
            # some code
        } else {
            # some other code
        }
    }
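Note that Group-Object joins the values of multiple grouping properties with a comma and a space, which is why the $fltr keys above are built as "$f, $m". A quick check:
# Quick check of the Name format Group-Object produces for two properties.
[pscustomobject]@{ Identity = 'Folder1'; User = 'UserA' } |
    Group-Object Identity, User |
    Select-Object -ExpandProperty Name    # Folder1, UserA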

Related

Using two arrays to create registry keys/values

Trying to automate our font installation process for new PCs.
To install fonts, Windows adds the .ttf, .otf, etc. file to C:\Windows\Fonts and then creates a corresponding registry key in HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Fonts. A typical registry key would look like this:
Arial (TrueType) | Arial.ttf
To automate this, I've made two arrays using Get-ChildItem:
$names = Get-ChildItem -Path "C:\corp\install\fonts" | Select-Object name | Out-String | ForEach-Object { $_ -Replace "----","" -Replace "Name","" -Replace ".otf","" -Replace ".ttf","" } | ForEach-Object { $_.Trim() }
$files = Get-ChildItem -Path "C:\corp\install\fonts" | Select-Object name | Out-String | ForEach-Object { $_ -Replace "----","" -Replace "Name","" } | ForEach-Object { $_.Trim() }
Each $name in $names will be the name of the registry key, and each $file in $files will be the data for that registry key.
How would I go about doing this? I've attempted to use hash tables, PSObjects, nested ForEach loops, all to no avail. I have had difficulty finding anything on here and elsewhere that matches this situation exactly.
Error checking is not really necessary since there will always be a corresponding value.
REVISED FINAL SOLUTION:
Write-Host "Installing corporate fonts..."
Copy-Item -Path "C:\corp\install\fonts\*" -Destination "C:\Windows\Fonts" -Force -Recurse
$fontList = @()
$fonts = Get-ChildItem "C:\corp\install\fonts" | Select-Object -ExpandProperty Name

ForEach ( $font in $fonts ) {
    $fontList += [PSCustomObject] @{
        Name = $font -Replace ".otf","" -Replace ".ttf",""
        File = $font
    } |
    ForEach-Object {
        New-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Fonts" -Name $_.Name -Value $_.File
    }
}
I must admit I don't fully understand your question, so forgive me if this response is way off base, but is this what you're looking for? A table with both pieces of data in one?
Function CreateVariables {
    $namevariables = @()
    $filenames = ( Get-ChildItem "C:\corp\install\fonts" ).Name
    Foreach ( $name in $filenames ) {
        $namevariables += [PSCustomObject] @{
            Name = $name -Replace "----","" -Replace "Name","" -Replace ".otf","" -Replace ".ttf",""
            File = $name -Replace "----","" -Replace "Name",""
        }
    }
    Return $namevariables
}
CreateVariables
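If that table is what you need, the remaining registry step could look something like this. Only a sketch; the registry path and the "(TrueType)" suffix are taken from the question's example:
# Sketch: write one registry value per font object returned by CreateVariables.
# The "(TrueType)" suffix mirrors the question's "Arial (TrueType) | Arial.ttf" example.
$fontKey = 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Fonts'
CreateVariables | ForEach-Object {
    New-ItemProperty -Path $fontKey -Name "$($_.Name) (TrueType)" -Value $_.File -PropertyType String -Force
}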
Piping both name and value to Set-ItemProperty seems impossible; ForEach-Object seems the way to go.
$path = 'hklm:\software\microsoft\windows nt\currentversion\fonts'
[pscustomobject]@{name='a name';value='a value'} |
foreach { set-itemproperty $path $_.name $_.value -whatif }
What if: Performing the operation "Set Property" on target "Item: HKEY_LOCAL_MACHINE\software\microsoft\windows nt\currentversion\fonts Property: a name".
You may prefer using this VBScript-like method to install fonts:
https://www.mondaiji.com/blog/other/it/10247-windows-install-fonts-via-command-line
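For reference, the approach described there boils down to the Shell COM object's special Fonts folder. A minimal sketch, reusing the folder path from this question:
# Sketch of the COM-based install the linked article describes. 0x14 is the
# shell's Fonts special folder; CopyHere registers the font the same way the
# Explorer "Install" action does.
$shell = New-Object -ComObject Shell.Application
$fontsFolder = $shell.Namespace(0x14)
Get-ChildItem -Path 'C:\corp\install\fonts' -Include '*.ttf','*.otf' -Recurse |
    ForEach-Object { $fontsFolder.CopyHere($_.FullName) }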

Export csv via pscustomobject displaying incorrectly powershell

I have the following code:
function realtest
{
    $files = Get-ChildItem -Path 'D:\data\' -Filter *.csv
    $tester = [PSCustomObject]@{}
    foreach ($file in $files)
    {
        $tempName = $file.BaseName
        $temp = Import-Csv $file
        $tester | Add-Member -MemberType NoteProperty -Name $tempName -Value $temp.$tempName
    }
    $tester
    $tester | Export-Csv "D:\result.csv" -NoTypeInformation
}
I am trying to export a bunch of data to CSV; however, when the data is written to the CSV it is shown as below:
"E0798T102","E0798T103"
"System.Object[]","System.Object[]"
but when I print it to the console it displays as below:
E0798T102 E0798T103
--------- ---------
{0, 0, 0, 0...} {0, 0, 0, 0...}
Ultimately, I want E0798T102 and E0798T103 as separate columns in result.csv.
Just to note, I will have 50 CSVs to loop through, and each should display as its own column.
Here is an incredibly inefficient answer to your question. If left as is, it assumes your CSV files already have a header with the CSV file basename:
$CSVs = Get-ChildItem -Path 'D:\data\' -Filter "*.csv" -File
$headers = $CSVs.BaseName
$table = [System.Data.DataTable]::new("Files")

foreach ($header in $headers) {
    $table.Columns.Add($header) | Out-Null
}

foreach ($CSV in $CSVs) {
    #$contents = Import-Csv $CSV -Header $CSV.BaseName # If CSV has no header
    $contents = Import-Csv $CSV # If CSV contains header
    $rowNumber = 0
    foreach ($line in $contents) {
        $rowCount = $table.Rows.Count
        if ($rowNumber -ge $rowCount) {
            $row = $table.NewRow()
            $row[$CSV.BaseName] = $line.$($CSV.BaseName)
            $table.Rows.Add($row)
        }
        else {
            $row = $table.Rows[$rowNumber]
            $row[$CSV.BaseName] = $line.$($CSV.BaseName)
        }
        $rowNumber++
    }
}

$table | Export-Csv output.csv -NoTypeInformation
You can uncomment the commented $contents line if your CSV files do not have a header. You will just have to comment out the next $contents variable assignment if you uncomment the first.
Based on your snippet, this can be significantly simplified:
function Get-Csv {
    $col = foreach ($file in Get-ChildItem -Path D:\data -Filter *.csv) {
        $csv = Import-Csv -Path $file.FullName
        [pscustomobject]@{
            $file.BaseName = $csv.($file.BaseName)
        }
    }
    $col | Export-Csv D:\result.csv -NoTypeInformation
    return $col
}
However, a CSV file seems like the wrong approach, because you're trying to embed objects under a header. That doesn't really work in a tabular format, as you only get one layer of depth. You should either expand all the properties on your objects, or use a different format that can represent depth, like JSON.
The reason for your formatting woes is how serialization works: you're getting a string representation of your objects.
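A quick illustration with made-up data: the CSV cmdlets call ToString() on each property value, so an array-valued property flattens to its type name:
# Illustration only: an array property serializes as "System.Object[]".
[pscustomobject]@{ E0798T102 = 0,0,0,0 } | ConvertTo-Csv -NoTypeInformation
# "E0798T102"
# "System.Object[]"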
Converting to JSON isn't difficult; you just trade your Export-Csv call for:
$col | ConvertTo-Json -Depth 100 | Out-File -FilePath D:\result.json
Note: I specify -Depth 100 because the cmdlet's default Depth is 2.
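Reading the file back later is the mirror operation (sketch):
# Round trip (sketch): ConvertFrom-Json rebuilds the objects, including the
# nested arrays that CSV could not represent.
$data = Get-Content -Path D:\result.json -Raw | ConvertFrom-Json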

wildcard in array

How can I use the 1709 as a wildcard? The value 1709 is stored in an array as $MoveItem.Version, but I can't figure out how to do a -like when the value comes from an array, as I can't put in a wildcard *. I also tried a -match.
The file name looks like this: REFW10-X86-1709_01-12-2017.wim.
The below code works fine, but I would like to automate it, so everything comes from the array. Is that possible?
Get-ChildItem -Path $OSPathTemp -Recurse | Where {
    ($_.Name -eq $MoveItem.File) -and
    ($_.Name -like "*1709*")
} | Select-Object -ExpandProperty FullName
$MoveItem.Version contains 1607,1706,1709. I would like to choose only the one with 1709. The final output should look like this:
foreach ($MoveItem in $MoveItems) {
    Get-ChildItem -Path $OSPathTemp -Recurse | Where {
        ($_.Name -eq $MoveItem.File) -and
        ($_.Name -like $MoveItem.Version)
    } | Select-Object -ExpandProperty FullName
}
The Array looks like this:
$MoveItem = @(
    [pscustomobject]@{File="REFW10-X86-1709_01-12-2017.wim";Version=1709}
    [pscustomobject]@{File="REFW10-X86-1706_01-12-2017.wim";Version=1706}
)
So you have an array (or similar) named $MoveItem whose items have a .File property that is a filename and a .Version property that holds the version number?
Test name: REFW10-X86-1709_01-12-2017.wim
Get-ChildItem -Path $OSPathTemp -Recurse |
    ForEach-Object {
        If ($_.Name -match '-(\d{4})_') { $Version = $Matches[1] }
        If ($Version -in $MoveItem.Version -and
            $_.Name -in $MoveItem.File) { $_.FullName }
    }

Iterate through Rows in SQL to Output to Text File

I have a SQL table that contains several hundred rows of data. One of the columns in this table contains text reports that were stored as plain text within the column.
Essentially, I need to iterate through each row of data in SQL and output the contents of each row's report column to its own individual text file with a unique name pulled from another column.
I am trying to accomplish this via PowerShell and I seem to be hung up. Below is what I have thus far.
foreach ($i=0; $i -le $Reports.Count; $i++)
{
    $SDIR = "C:\harassmentreports"
    $FILENAME = $Reports | Select-Object FILENAME
    $FILETEXT = $Reports | Select-Object TEXT
    $NAME = "$SDIR\$FILENAME.txt"
    if (!([System.IO.File]::Exists($NAME))) {
        Out-File $NAME | Set-Content -Path $FULLFILE -Value $FILETEXT
    }
}
Assuming that $Reports is a list of the records from your SQL query, you'll want to fix the following issues:
In an indexed loop use indexed access to the elements of your array:
$FILENAME = $Reports[$i] | Select-Object FILENAME
$FILETEXT = $Reports[$i] | Select-Object TEXT
Define variables outside the loop if their value doesn't change inside the loop:
$SDIR = "C:\harassmentreports"
for ($i = 0; $i -lt $Reports.Count; $i++) {
...
}
Expand properties if you want to use their value:
$FILENAME = $Reports[$i] | Select-Object -Expand FILENAME
$FILETEXT = $Reports[$i] | Select-Object -Expand TEXT
Use Join-Path for constructing paths:
$NAME = Join-Path $SDIR "$FILENAME.txt"
Use Test-Path for checking the existence of a file or folder:
if (-not (Test-Path -LiteralPath $NAME)) {
...
}
Use either Out-File
Out-File -FilePath $NAME -InputObject $FILETEXT
or Set-Content
Set-Content -Path $NAME -Value $FILETEXT
but not both of them. The basic difference between the two cmdlets is their default encoding: the former uses Unicode, the latter ASCII. Both allow you to change the encoding via the -Encoding parameter.
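For example, both can be told to write UTF-8:
# Illustrative: explicit UTF-8 with either cmdlet.
Out-File -FilePath $NAME -InputObject $FILETEXT -Encoding UTF8
Set-Content -Path $NAME -Value $FILETEXT -Encoding UTF8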
You may also want to reconsider using a for loop in the first place. A pipeline with a ForEach-Object loop might be a better approach:
$SDIR = "C:\harassmentreports"
$Reports | ForEach-Object {
    $file = Join-Path $SDIR ($_.FILENAME + '.txt')
    if (-not (Test-Path $file)) { Set-Content -Path $file -Value $_.TEXT }
}
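For completeness, $Reports could come from something like Invoke-Sqlcmd. The server, database, and table names below are placeholders; only the FILENAME and TEXT columns are taken from the question:
# Hypothetical query (assumes the SqlServer module is installed; all names
# except the FILENAME and TEXT columns are placeholders).
$Reports = Invoke-Sqlcmd -ServerInstance 'SQL01' -Database 'ReportsDB' `
    -Query 'SELECT FILENAME, TEXT FROM dbo.Reports'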

powershell pipe to array returns as character array

I am trying to build an array with the results of the following PowerShell script. I would then like to output only the unique results in that array (I am expecting duplicates). I tried, but it does not work. One thing I noticed is that my array is being converted to a character array, so if I call a single object in the array ($array[1]) it only displays one character, which may be why the select -Unique is not working.
Please help.
$servers="Server1","server2"
$regex="(\.\d\d\d.*\()"
$path = "SomePath"
$yesterday = (get-date).Date.AddDays(-1)
Foreach ($server in $servers) {Get-ChildItem -Path $path -recurse | where { $_.LastWriteTime -le $yesterday } | ForEach-Object { Get-Content -Path "$_" -TotalCount 2 | select-string $regex | % { $_.Matches } | % { $array += $_.Value.Substring(5,6) } }}
$array|select -Unique
If you initialize $array as an array, then your code will work. Without that, PowerShell treats $array += as string concatenation: the first value turns $array into a string, and every later += just appends characters to it.
$array = @()
$servers="Server1","server2"
$regex="(\.\d\d\d.*\()"
$path = "SomePath"
$yesterday = (get-date).Date.AddDays(-1)
Foreach ($server in $servers) {Get-ChildItem -Path $path -recurse | where { $_.LastWriteTime -le $yesterday } | ForEach-Object { Get-Content -Path "$_" -TotalCount 2 | select-string $regex | % { $_.Matches } | % { $array += $_.Value.Substring(5,6) } }}
$array|select -Unique
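To see why the initialization matters, compare (illustrative values):
# Without @(), the first += turns $array into a single string and later +=
# calls just append characters; with @(), each += adds a new array element.
$a = $null; $a += 'abc'; $a += 'def'
$a.Count    # 1  (one string, 'abcdef')
$a[1]       # 'b' - indexing a string returns a single character
$b = @();   $b += 'abc'; $b += 'def'
$b.Count    # 2  (two elements, 'abc' and 'def')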
