I have the following code:
function realtest
{
    $files = Get-ChildItem -Path 'D:\data\' -Filter *.csv
    $tester = [PSCustomObject]@{}
    foreach ($file in $files)
    {
        $tempName = $file.BaseName
        $temp = Import-Csv $file
        $tester | Add-Member -MemberType NoteProperty -Name $tempName -Value $temp.$tempName
    }
    $tester
    $tester | Export-Csv "D:\result.csv" -NoTypeInformation
}
I am trying to export a bunch of data to CSV; however, when the data is written to the CSV it is shown as below:
"E0798T102","E0798T103"
"System.Object[]","System.Object[]"
but when I print it to the console it displays as below:
E0798T102       E0798T103
---------       ---------
{0, 0, 0, 0...} {0, 0, 0, 0...}
Ultimately, I want E0798T102 and E0798T103 as separate columns in result.csv.
Just to note, I will have 50 CSVs to loop through, and each should display as its own column.
Here is an incredibly inefficient answer to your question. If left as-is, it assumes each CSV file already has a header matching that file's basename:
$CSVs = Get-ChildItem -Path 'D:\data\' -Filter "*.csv" -File
$headers = $CSVs.BaseName

$table = [System.Data.DataTable]::new("Files")
foreach ($header in $headers) {
    $table.Columns.Add($header) | Out-Null
}

foreach ($CSV in $CSVs) {
    #$contents = Import-Csv $CSV -Header $CSV.BaseName # If CSV has no header
    $contents = Import-Csv $CSV # If CSV contains header
    $rowNumber = 0
    foreach ($line in $contents) {
        $rowCount = $table.Rows.Count
        if ($rowNumber -ge $rowCount) {
            $row = $table.NewRow()
            $row[$CSV.BaseName] = $line.$($CSV.BaseName)
            $table.Rows.Add($row)
        }
        else {
            $row = $table.Rows[$rowNumber]
            $row[$CSV.BaseName] = $line.$($CSV.BaseName)
        }
        $rowNumber++
    }
}
$table | Export-Csv output.csv -NoTypeInformation
You can uncomment the commented $contents line if your CSV files do not have a header. You will just have to comment out the next $contents variable assignment if you uncomment the first.
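For reference, a minimal sketch of what -Header does on a headerless file (the file name here is hypothetical): Import-Csv then treats the first line as data and uses the supplied names as column headers.
# Hypothetical headerless CSV: supply the column name yourself,
# so the first line is read as data instead of as a header row.
Import-Csv 'D:\data\E0798T102.csv' -Header 'E0798T102'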
Based on your snippet, this can be significantly simplified:
function Get-Csv {
    $col = foreach ($file in Get-ChildItem -Path D:\data -Filter *.csv) {
        $csv = Import-Csv -Path $file.FullName
        # Property named after the file, holding that file's matching column
        [pscustomobject]@{
            $file.BaseName = $csv.($file.BaseName)
        }
    }
    $col | Export-Csv D:\result.csv -NoTypeInformation
    return $col
}
However, a CSV file seems like the wrong approach, because you're trying to embed objects under a header. This doesn't really work in a tabular format, as you only get one layer of depth. You should either expand all the properties on your objects, or use a different format that can represent depth, like JSON.
The reason for your formatting woes is how the serialization works: you're getting a string representation of your objects.
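As a quick illustration (a sketch reusing one of the column names from the question): CSV serialization calls ToString() on each property value, and an array stringifies to its type name.
# Sketch: an array property is flattened to its type name when serialized to CSV
$obj = [pscustomobject]@{ E0798T102 = 0, 0, 0, 0 }
$obj.E0798T102.ToString()               # System.Object[]
$obj | ConvertTo-Csv -NoTypeInformation # "E0798T102" followed by "System.Object[]"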
Converting to JSON isn't difficult; you just swap out your Export-Csv call:
$col | ConvertTo-Json -Depth 100 | Out-File -FilePath D:\result.json
Note: I specify -Depth 100 because the cmdlet's default Depth is 2.
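If you later need the data back as objects, a round-trip sketch (same assumed path):
# Read the JSON back into objects
$col = Get-Content -Path D:\result.json -Raw | ConvertFrom-Json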
I have a script that is looking for some values in xml-like files:
$path = "C:\Users\*.xml"
Get-ChildItem $path |
Select-String -Pattern "<status>", "<logicalIdentifier>" | Out-File out.txt
The result is a list of the matching lines from each file. I am trying to store these values in an array; however, nothing happens (no error message on the console, but nothing is printed either):
$path = "C:\Users\*.xml"
Get-ChildItem $path |
Select-String -Pattern "<status>", "<logicalRecordIdentifier>" |
Foreach-Object {
$id, $status = $_.Matches[0].Groups['<logicalIdentifier>', '<status>'].Value
[PSCustomObject] #{
ID = $id
Status = $status
}
}
I know it's a long shot, but I'm not sure what goes wrong.
Originally, the XML file has the following structure:
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<ACKReceipt xmlns="http://www.test.eu/ReceiptSchema_V1.xsd">
  <receiptTimestamp>2021-04-23T20:32:09.239Z</receiptTimestamp>
  <product>
    <type>LXGF-H78</type>
  </product>
  <receiptType>validation</receiptType>
  <validationReceipt>
    <reportedFilename>filename1</reportedFilename>
    <globalReceiptItem>
      <logicalIdentifier>4567-YYYMMDD</logicalIdentifier>
      <status>Accepted</status>
    </globalReceiptItem>
  </validationReceipt>
</ACKReceipt>
This code works for your xml structure:
Clear-Host
$path = "C:\tmp\xml\*.xml"
$foundFiles = Get-ChildItem $path
$arrayTable = $null
$arrayTable = @()
foreach ($file in $foundFiles) {
    $file.FullName
    [xml]$fileXML = Get-Content $file | Select-Object -Skip 1
    $currentLogicalIdentifier = $fileXML.ACKReceipt.validationReceipt.globalReceiptItem.logicalIdentifier
    $currentStatus = $fileXML.ACKReceipt.validationReceipt.globalReceiptItem.status
    $arrayTable += New-Object psobject -Property @{Status = $currentStatus; LogicalIdentifier = $currentLogicalIdentifier; File = $file.Name}
}
# Report to csv file
$arrayTable | Export-Csv -Path "C:\tmp\xml\report.csv" -UseCulture -NoTypeInformation -Encoding "UTF8"
Here is how it looks in the environment table:
You can easily export it as a CSV file to use as a report.
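As a side note, the same loop can also assign its output directly instead of growing an array with +=; a minimal sketch under the same paths and XML structure:
# Sketch: collect loop output directly instead of appending with +=
$arrayTable = foreach ($file in Get-ChildItem "C:\tmp\xml\*.xml") {
    [xml]$fileXML = Get-Content $file.FullName | Select-Object -Skip 1
    [pscustomobject]@{
        File              = $file.Name
        LogicalIdentifier = $fileXML.ACKReceipt.validationReceipt.globalReceiptItem.logicalIdentifier
        Status            = $fileXML.ACKReceipt.validationReceipt.globalReceiptItem.status
    }
}
$arrayTable | Export-Csv -Path "C:\tmp\xml\report.csv" -UseCulture -NoTypeInformation -Encoding "UTF8"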
I have to find matches over 3 CSVs. It is to find out whether users have AccessRights on Public Folders in Exchange 2016. For ease of use I have already searched for and stored all the needed values in 3 CSVs:
"PF-Folder_Full.csv": a list of all the Public Folders (more than 5,000)
"PF-Mailboxes.csv": a list of all the users (around 50)
"PF-Permissions.csv": the result of
Get-PublicFolderClientPermission -Identity $Folder.Identity
looped through all the Public Folders (that takes ages)
I have written a script that does the job, but even on a fast computer it is extremely slow, because it has to loop through all the Public Folders and all the users and then find a match for both values in the permissions.
$Folders = Import-Csv -Path ".\PF-Folder_Full.csv" -Encoding Unicode
$Mailboxes = Import-Csv -Path ".\PF-Mailboxes.csv" -Encoding Unicode
$Permissions = Import-Csv -Path ".\PF-Permissions.csv" -Encoding Unicode

foreach ($Folder in $Folders) {
    foreach ($Mailbox in $Mailboxes) {
        $Permission = $Permissions | where {
            ($_.Identity -eq $Folder.Identity) -and
            ($_.User -eq $Mailbox.DisplayName)
        }
        if ($Permission) {
            # some code
        } else {
            # some other code
        }
        Remove-Variable Permission
    }
}
Is there a way to speed things up, possibly through the use of regular expressions?
I couldn't find any example that allows for matching across multiple arrays like this.
As Kory Gill mentioned in the comments: build two hashtables from the Identity and DisplayName properties from the first two CSVs:
$Folders = @{}
Import-Csv -Path ".\PF-Folder_Full.csv" -Encoding Unicode | ForEach-Object {
    $Folders[$_.Identity] = $_
}

$Mailboxes = @{}
Import-Csv -Path ".\PF-Mailboxes.csv" -Encoding Unicode | ForEach-Object {
    $Mailboxes[$_.DisplayName] = $_
}
Then process the third CSV using these hashtables for lookups:
$Permissions = Import-Csv -Path ".\PF-Permissions.csv" -Encoding Unicode

foreach ($p in $Permissions) {
    if ($Folders.Contains($p.Identity) -and $Mailboxes.Contains($p.User)) {
        # some code
    } else {
        # some other code
    }
}
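Inside that loop the hashtables also give you constant-time access to the full matching records, for example (a sketch; the AccessRights column is assumed to exist in PF-Permissions.csv):
# Sketch: look up the matching folder and mailbox records for a report line
$folder  = $Folders[$p.Identity]
$mailbox = $Mailboxes[$p.User]
[pscustomobject]@{
    Folder       = $folder.Identity
    User         = $mailbox.DisplayName
    AccessRights = $p.AccessRights   # assumed column name
}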
If you want the code to run just once per unique identity/user combination, you could build a hashtable with the combined identity and mailbox name for filtering:
$Folders = Import-Csv -Path ".\PF-Folder_Full.csv" -Encoding Unicode |
    Select-Object -Expand Identity
$Mailboxes = Import-Csv -Path ".\PF-Mailboxes.csv" -Encoding Unicode |
    Select-Object -Expand DisplayName

$fltr = @{}
foreach ($f in $Folders) {
    foreach ($m in $Mailboxes) {
        # Group-Object joins multi-property group names with ", ", so use the same separator here
        $fltr["$f, $m"] = $true
    }
}
and then group the records from the third CSV:
Import-Csv -Path ".\PF-Permissions.csv" -Encoding Unicode |
    Group-Object Identity, User |
    ForEach-Object {
        if ($fltr.Contains($_.Name)) {
            # some code
        } else {
            # some other code
        }
    }
I have a SQL table that contains several hundred rows of data. One of the columns in this table contains text reports that were stored as plain text within the column.
Essentially, I need to iterate through each row of data in SQL and output the contents of each row's report column to its own individual text file with a unique name pulled from another column.
I am trying to accomplish this via PowerShell and I seem to be hung up. Below is what I have thus far.
foreach ($i=0; $i -le $Reports.Count; $i++)
{
    $SDIR = "C:\harassmentreports"
    $FILENAME = $Reports | Select-Object FILENAME
    $FILETEXT = $Reports | Select-Object TEXT
    $NAME = "$SDIR\$FILENAME.txt"
    if (!([System.IO.File]::Exists($NAME))) {
        Out-File $NAME | Set-Content -Path $FULLFILE -Value $FILETEXT
    }
}
Assuming that $Reports is a list of the records from your SQL query, you'll want to fix the following issues:
In an indexed loop use indexed access to the elements of your array:
$FILENAME = $Reports[$i] | Select-Object FILENAME
$FILETEXT = $Reports[$i] | Select-Object TEXT
Define variables outside the loop if their value doesn't change inside the loop:
$SDIR = "C:\harassmentreports"
for ($i = 0; $i -lt $Reports.Count; $i++) {
...
}
Expand properties if you want to use their value:
$FILENAME = $Reports[$i] | Select-Object -Expand FILENAME
$FILETEXT = $Reports[$i] | Select-Object -Expand TEXT
Use Join-Path for constructing paths:
$NAME = Join-Path $SDIR "$FILENAME.txt"
Use Test-Path for checking the existence of a file or folder:
if (-not (Test-Path -LiteralPath $NAME)) {
...
}
Use either Out-File
Out-File -FilePath $NAME -InputObject $FILETEXT
or Set-Content
Set-Content -Path $NAME -Value $FILETEXT
not both of them. The basic difference between the two cmdlets is their default encoding. The former uses Unicode, the latter ASCII encoding. Both allow you to change the encoding via the parameter -Encoding.
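For example, to write UTF-8 with either cmdlet:
# Either cmdlet can write UTF-8 explicitly
Out-File -FilePath $NAME -InputObject $FILETEXT -Encoding UTF8
Set-Content -Path $NAME -Value $FILETEXT -Encoding UTF8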
You may also want to reconsider using a for loop in the first place. A pipeline with a ForEach-Object loop might be a better approach:
$SDIR = "C:\harassmentreports"
$Reports | ForEach-Object {
$file = Join-Path $SDIR ($_.FILENAME + '.txt')
if (-not (Test-Path $file)) { Set-Content -Path $file -Value $_.TEXT }
}
I have a script that shows all the local users and their associated groups. However, I'm trying to output the results into a text file and that's where the script goes wrong, because it's not giving me the same results I'm receiving from the output window. For example, the code I have reads:
$LogFile = Test-Path C:\Users\FredAslami\Downloads\Test.txt
$LocalUsers = [ADSI]"WinNT://$env:COMPUTERNAME"
if ($LogFile) {
    $LocalUsers.Children | where {$_.SchemaClassName -eq 'user'} | Foreach-Object {
        $groups = $_.Groups() | Foreach-Object {
            $_.GetType().InvokeMember("Name", 'GetProperty', $null, $_, $null)
        }
        $_ | Select-Object @{n='UserName';e={$_.Name}},
                           @{n='Groups';e={$groups -join ';'}}
    }
    Write-Host "Got User Groups Info"
    Out-File -FilePath C:\Users\FredAslami\Downloads\Test.txt `
        -InputObject $LocalUsers -Append
    Write-Host "Added info to text"
}
$LocalUsers.Dispose()
When I run that, the text in the file reads:
distinguishedName :
Path : WinNT://R68-CUSTOM-01
I have also tried using Add-Content, but that doesn't work either. It will add something like:
System.DirectoryServices.DirectoryEntry
I also tried to debug using Write-Host after it retrieves the local users and group info, and another Write-Host after it writes the results into the text file, and noticed that it's writing the results before it has gathered all the info. So I tried using Start-Sleep, and that didn't seem to work.
On the second line you have $LocalUsers = [ADSI]"WinNT://$env:COMPUTERNAME". You never assigned it a different value, so that's what you're seeing as your output.
I would recommend piping your Select-Object statement to Export-Csv. Much easier and cleaner.
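A minimal sketch of that suggestion, reusing the question's pipeline and the Test.txt path (note the file will contain CSV data despite its .txt extension):
# Sketch: pipe the calculated properties straight to Export-Csv
$LocalUsers.Children | where { $_.SchemaClassName -eq 'user' } | Foreach-Object {
    $groups = $_.Groups() | Foreach-Object {
        $_.GetType().InvokeMember('Name', 'GetProperty', $null, $_, $null)
    }
    $_ | Select-Object @{n='UserName';e={$_.Name}},
                       @{n='Groups';e={$groups -join ';'}}
} | Export-Csv 'C:\Users\FredAslami\Downloads\Test.txt' -NoTypeInformation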
You get different results in screen and file output, because you're doing different things with your data. The pipeline starting with $localUsers.Children builds a list of the user objects and their group memberships and echoes that to the screen, but you don't do anything else with that data. Instead you're writing the unmodified variable $localUsers to the output file.
If you want tabular data to go both to the console and a file, I'd suggest using Write-Host for the console output, and Export-Csv for the file output:
$LocalUsers.Children | where {$_.SchemaClassName -eq 'user'} | Foreach-Object {
    $groups = $_.Groups() | Foreach-Object {
        $_.GetType().InvokeMember('Name', 'GetProperty', $null, $_, $null)
    }
    $o = New-Object -Type PSObject -Property @{
        'UserName' = $_.Name
        'Groups'   = $groups -join ';'
    }
    Write-Host $o
    $o
} | Export-Csv 'C:\Users\FredAslami\Downloads\Test.txt' -NoType
If you want the output to go to the success output stream instead of the console, you could capture the result in a variable and output that in two different ways:
$users = $LocalUsers.Children | where {
    $_.SchemaClassName -eq 'user'
} | Foreach-Object {
    $groups = $_.Groups() | Foreach-Object {
        $_.GetType().InvokeMember('Name', 'GetProperty', $null, $_, $null)
    }
    New-Object -Type PSObject -Property @{
        'UserName' = $_.Name
        'Groups'   = $groups -join ';'
    }
}
$users
$users | Export-Csv 'C:\Users\FredAslami\Downloads\Test.txt' -NoType
I have a root directory that consists of many folders and subfolders. I need to check whether particular files like *.sln or *.designer.vb exist in the folders or subfolders and output the result to a text file.
For Eg:
$root = "C:\Root\"
$FileType = ".sln",".designer.vb"
the text file will have result somewhat like below:
.sln ---> 2 files
.sln files path ---->
c:\Root\Application1\subfolder1\Test.sln
c:\Root\Application2\subfolder1\Test2.sln
Any help will be highly appreciated!
Regards,
Ashish
Try this:
function Get-ExtensionCount {
    param(
        $Root = "C:\Root\",
        $FileType = @(".sln", ".designer.vb"),
        $Outfile = "C:\Root\rootext.txt"
    )
    $output = @()
    foreach ($type in $FileType) {
        $files = Get-ChildItem $Root -Filter *$type -Recurse | ? { !$_.PSIsContainer }
        $output += "$type ---> $($files.Count) files"
        foreach ($file in $files) {
            $output += $file.FullName
        }
    }
    $output | Set-Content $Outfile
}
I turned it into a function with your values as default parameter-values. Call it by using
Get-ExtensionCount #for default values
Or
Get-ExtensionCount -Root "d:\test" -FileType ".txt", ".bmp" -Outfile "D:\output.txt"
Example of the output saved to the file:
.txt ---> 3 files
D:\Test\as.txt
D:\Test\ddddd.txt
D:\Test\sss.txt
.bmp ---> 2 files
D:\Test\dsadsa.bmp
D:\Test\New Bitmap Image.bmp
To get all the file counts at the start, try:
function Get-ExtensionCount {
    param(
        $Root = "C:\Root\",
        $FileType = @(".sln", ".designer.vb"),
        $Outfile = "C:\Root\rootext.txt"
    )
    # Filecount per type
    $header = @()
    # All the filepaths
    $filelist = @()
    foreach ($type in $FileType) {
        $files = Get-ChildItem $Root -Filter *$type -Recurse | ? { !$_.PSIsContainer }
        $header += "$type ---> $($files.Count) files"
        foreach ($file in $files) {
            $filelist += $file.FullName
        }
    }
    # Collect to single output
    $output = @($header, $filelist)
    $output | Set-Content $Outfile
}
Here's a one-liner to determine if at least one file with extension .txt or .ps1 exists in the directory $OutputPath:
(Get-ChildItem -Path $OutputPath -force | Where-Object Extension -in ('.txt','.ps1') | Measure-Object).Count
Explanation: the command tells you the number of files in the specified directory matching any of the listed extensions. You can append -ne 0 to the end, which returns true or false to be used in an if block.
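For example, a sketch with the same extensions as above:
# Example: branch on whether at least one matching file exists
if ((Get-ChildItem -Path $OutputPath -Force |
        Where-Object Extension -in ('.txt','.ps1') |
        Measure-Object).Count -ne 0) {
    Write-Host "Found at least one .txt or .ps1 file in $OutputPath"
}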
This will search the directory $root and its subdirectories for files of type $FileType, including hidden files and excluding directories:
$root = "C:\Root\";
$FileType = "*.sln", "*.designer.vb";
$results = Get-ChildItem -Path $root -Force -Recurse `
| Where-Object {
if ($_ -isnot [System.IO.DirectoryInfo])
{
foreach ($pattern in $FileType)
{
if ($_.Name -like $pattern)
{
return $true;
}
}
}
return $false;
}
Note that I've modified the strings in $FileType to be formatted as a wildcard pattern. Then group the files by extension:
$resultGroups = $results | Group-Object -Property 'Extension';
Then loop through each group and print the file count and paths:
foreach ($group in $resultGroups)
{
    # $group.Count: The number of files with that extension
    # $group.Group: A collection of FileInfo objects
    # $group.Name: The file extension with leading period
    Write-Host "$($group.Name) ---> $($group.Count) files";
    Write-Host "$($group.Name) files path ---->";

    foreach ($file in $group.Group)
    {
        Write-Host $file.FullName;
    }
}
To write the results to a file instead of the console, use the Out-File cmdlet instead of the Write-Host cmdlet.
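For example, a sketch that collects the same lines and writes them to a file (the output path is assumed):
# Sketch: emit the report lines to the pipeline and write them to a file
$report = foreach ($group in $resultGroups)
{
    "$($group.Name) ---> $($group.Count) files";
    "$($group.Name) files path ---->";
    foreach ($file in $group.Group) { $file.FullName };
}
$report | Out-File -FilePath "C:\Root\results.txt";  # assumed output path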