String replacement to variable - arrays

I've got the following script that works, and I'm trying to rewrite it more efficiently. The lines of code below work and accomplish what I want:
Get-Content "C:\Documents and Settings\a411882\My Documents\Scripts\printserveroutput.txt" | Select-String -SimpleMatch -AllMatches -Pattern "/GPR" | % {
$_.Line -replace '(?:.*)(/GPR)(?:.*)(?<=on\s)(\w+)(?:.*)', '$1,$2'
}
Get-Content "C:\Documents and Settings\a411882\My Documents\Scripts\printserveroutput.txt" | Select-String -SimpleMatch -AllMatches -Pattern "DV6" | % {
$_.Line -replace '(?:.*)(DV6)(?:.*)(?<=on\s)(\w+)(?:.*)', '$1,$2'
}
I repeat the same exact lines of code seven times, slightly altering what I'm searching for each time. The output I get is the following, which is what I want (note: this is just a small sample of the output):
/GPR,R3556
/GPR,R3556
While this works, I really don't like how cluttered the code is, so I decided to try re-writing it in a more concise way. I've re-written the code like this:
$My_Arr = "/GPR", "DV6", "DV7", "RT3", "DEV", "TST", "PRE"
$low = $My_Arr.getlowerbound(0)
$upper = $My_Arr.getupperbound(0)
for ($temp=$low; $temp -le $upper; $temp++){
$Test = $My_Arr[$Temp]
Get-Content "C:\Documents and Settings\a411882\My Documents\Scripts\printserveroutput.txt" | Select-String -SimpleMatch -AllMatches -Pattern $My_Arr[$temp] | % {
$_.Line -replace '(?:.*)($Test)(?:.*)(?<=on\s)(\w+)(?:.*)', '$1,$2'
}
}
The output that this gives me is the following:
10 Document 81, A361058/GPR0000151814_1: owned by A361058 was printed on R3556 via port IP_***.***.***.***. Size in bytes: 53704; pages printed: 2 20130219123105.000000-300
10 Document 80, A361058/GPR0000151802_1: owned by A361058 was printed on R3556 via port IP_***.***.***.***. Size in bytes: 53700; pages printed: 2 20130219123037.000000-300
This is almost correct; however, the -replace line is where my error occurs, since the code treats ($Test) as a literal string instead of expanding the variable. It is printing the entire line of the text file I'm parsing each time it finds /GPR in this example, rather than the desired output shown above. Would anyone know of a way to fix this line and get the same output as my original code?
EDIT: The output I'm getting right now with the newer code is exactly the text that is in the .txt file I'm trying to parse. There are more lines than that in the .txt, but for the most part it is identical to that. I'm only concerned with getting /GPR (or any of the other possible strings in the array) and then the server name, which comes after the word "on" each time.

I'd say this is caused by the single quotes, which prevent variable expansion.
PowerShell is trying to match the literal string '(?:.*)($Test)(?:.*)(?<=on\s)(\w+)(?:.*)' without replacing the $Test variable with its value.
Try replacing them with double quotes, but keep single quotes around the replacement string (so that $1 and $2 reach the regex engine instead of being expanded by PowerShell), as follows:
Get-Content "C:\Documents and Settings\a411882\My Documents\Scripts\printserveroutput.txt" | Select-String -SimpleMatch -AllMatches -Pattern $My_Arr[$temp] | % {
$_.Line -replace "(?:.*)($Test)(?:.*)(?<=on\s)(\w+)(?:.*)", '$1,$2'
}
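A quick way to see the quoting difference (a minimal sketch; the value is arbitrary):

```powershell
$Test = 'GPR'
'Pattern is ($Test)'   # single quotes: no expansion -> Pattern is ($Test)
"Pattern is ($Test)"   # double quotes: expanded     -> Pattern is (GPR)
```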

Does this work on your data?
$file = 'C:\Documents and Settings\a411882\My Documents\Scripts\printserveroutput.txt'
$regex = '(?:.*)(/GPR|DV6|DV7|RT3|DEV|TST|PRE)(?:.*)(?<=on\s)(\w+)(?:.*)'
(Get-Content $file) -match $regex -replace $regex,'$1,$2'

Related

Use txt file as list in PowerShell array/variable

I've got a script that searches for a string ("End program" in this case). It then goes through each file within the folder and outputs any files not containing the string.
It works perfectly when the phrase is hard-coded, but I want to make it more dynamic by keeping the string in a text file. In the future, I want to be able to add to the list of strings in the text file. I can't find this online anywhere, so any help is appreciated.
Current code:
$Folder = "\\test path"
$Files = Get-ChildItem $Folder -Filter "*.log" |
? {$_.LastWriteTime -gt (Get-Date).AddDays(-31)}
# String to search for within the file
$SearchTerm = "*End program*"
$Arr = @()
foreach ($File in $Files) {
$Text = Get-Content "$Folder\$File" | select -Last 1
if ($Text | WHERE {$Text -inotlike $SearchTerm}) {
$Arr += $File
}
}
if ($Arr.Count -eq 0) {
break
}
This is a simplified version of the code displaying only the problematic area. I'd like to put "End program" and another string "End" in a text file.
The following is what the contents of the file look like:
*End program*,*Start*
If you want to check whether a file contains (or doesn't contain) a number of given terms you're better off using a regular expression. Read the terms from a file, escape them, and join them to an alternation:
$terms = Get-Content 'C:\path\to\terms.txt' |
ForEach-Object { [regex]::Escape($_) }
$pattern = $terms -join '|'
Each term in the file should be in a separate line with no leading or trailing wildcard characters. Like this:
End program
Start
With that you can check if the files in a folder don't contain any of the terms like this:
Get-ChildItem $folder | Where-Object {
-not $_.PSIsContainer -and
(Get-Content $_.FullName | Select-Object -Last 1) -notmatch $pattern
}
If you want to check the entire files instead of just their last line change
Get-Content $_.FullName | Select-Object -Last 1
to
Get-Content $_.FullName | Out-String
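Putting those pieces together, the whole check might look like this (a sketch; the paths are assumptions taken from the question):

```powershell
# Read the search terms, escape regex metacharacters, join into an alternation
$terms   = Get-Content 'C:\path\to\terms.txt' | ForEach-Object { [regex]::Escape($_) }
$pattern = $terms -join '|'

# List files whose last line does not contain any of the terms
Get-ChildItem '\\test path' -Filter *.log | Where-Object {
    -not $_.PSIsContainer -and
    (Get-Content $_.FullName | Select-Object -Last 1) -notmatch $pattern
}
```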

Recursing Through Multiple Text Files with References

I have hundreds of text files in a folder which can often reference each other, going several levels deep. Not sure if I am explaining this well, so I will explain with an example.
Let's say folder "A" contains 500 .txt files. The first one could be called A.txt and somewhere in there it mentions B.txt, which in turn mentions C.txt and so on. I believe the number of levels down is no more than 10.
Now, I want to find certain text strings relating to A.txt by programmatically going through that file, then, if it sees references to other .txt files, going through them as well, and so on. The resulting output would be something like A_out.txt, which contains everything it found based on a regex.
I started out with this using Powershell but am now a little stuck:
$files = Get-ChildItem "C:\TEST\" -Filter *.txt
$regex = 'PCB.*;'
for ($i=0; $i -lt $files.Count; $i++) {
$infile = $files[$i].FullName
$outfile = $files[$i].BaseName + "_out.txt"
select-string $infile -Pattern $regex -AllMatches | % { $_.Matches } | % { $_.Value } > $outfile
}
It goes through every .txt file and outputs everything that matches the PCB.*; expression to its corresponding _out.txt file.
I have absolutely no idea how to now expand this to include references to the other files. I'm not even sure if this is possible in PowerShell or whether I need to use another language to achieve what I want.
I could get some office monkeys to do all this manually, but if this is relatively simple to code then it would save us a lot of time. Any help would be greatly appreciated :)
/Edit
Whilst running through this in my head, I thought I could build up an array for every time another one of the files is mentioned, and then repeat the process for those as well. However, back to my original problem, I have no idea how I would go about this.
/Edit 2:
Sorry, had been away for a few days and am only just picking this up. I have been using what I've learnt from this question and a few others to come up with the following:
function Get-FileReference
{
Param($FileName, $OutputFileName='')
if ($OutputFileName -eq '')
{
Get-FileReference $FileName ($FileName -replace '.xml$', '_out.xml')
}
else
{
Select-String $FileName -Pattern 'BusinessObject.[^"\r\n\s][\w.]*' -AllMatches | % { $_.Matches } | % { $_.Value } | Add-Content $OutputFileName
Set-Location C:\TEST
$References = (Select-String -Pattern '(?<=resid=")\d+' -AllMatches -path $FileName | % { $_.Matches } | % { $_.Value })
Write "SC References: $References" | Out-File OUTPUT.txt -Append
foreach ($Ref in $References)
{
$count
Write "$count" | Out-File OUTPUT.txt -Append
$count++
Write "SC Reference: $Ref" | Out-File OUTPUT.txt -Append
$xml = [xml](Get-Content 'C:\TEST\package.xml')
$res = $xml.SelectSingleNode('//res[@id = $Ref]/child::resver[last()]')
$resource = $res.id + ".xml"
Write "File to Check $resource" | Out-File OUTPUT.txt -Append
Get-FileReference $resource $OutputFileName
}
}
}
$files = gci "C:\TEST" *.xml
ForEach ($file in $files) {
Get-FileReference $file.FullName
}
Following my original question, I realised that this was a little bit more extensive than I originally thought and therefore had to tinker.
These are the notable points:
All the parent files are .xml, and the code that matches on "BusinessObject" etc. works as expected.
The references to other files are not simply .txt but require a pattern match of '(?<=resid=")\d+'.
This pattern match needs to be cross-referenced with another file, package.xml, and based on the value it returns, the file it next needs to look into is [newname].xml.
As before, those child .xml files could reference some of the other .xml files.
The code I have pasted above seems to be getting stuck in endless loops (hence the debugging output in there at the moment), and it does not like the use of $Ref in:
$res = $xml.SelectSingleNode('//res[@id = $Ref]/child::resver[last()]')
That results in the following error:
Exception calling "SelectSingleNode" with "1" argument(s): "Namespace Manager or XsltContext needed. This query has a prefix, variable, or user-defined function."
Since there could be hundreds of files, it dies when it gets over 1000.
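The SelectSingleNode failure happens because the XPath string is single-quoted, so $Ref reaches the XPath engine literally, where $-prefixed names are XPath variables that require an XsltContext. One hedged fix is to interpolate the PowerShell variable with double quotes (assuming the ids contain no quote characters):

```powershell
# Double quotes expand $Ref before the string reaches the XPath engine
$res = $xml.SelectSingleNode("//res[@id = '$Ref']/child::resver[last()]")
```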
A recursive function which tries to do what you want.
function Get-FileReference
{
Param($FileName, $OutputFileName='')
if ($OutputFileName -eq '')
{
Get-FileReference $FileName ($FileName -replace '\.txt$', '_out.txt')
}
else
{
Select-String -Pattern 'PCB.*;' -Path $FileName -AllMatches | Add-Content $OutputFileName
$References = (Select-String -Pattern '^.*\.txt' -AllMatches -path $FileName).Matches.Value
foreach ($Ref in $References)
{
Get-FileReference $Ref $OutputFileName
}
}
}
$files = gci *.txt
ForEach ($file in $files) { Get-FileReference $file.FullName }
It takes two parameters - a filename and an output filename. If called without an output filename, it assumes it's at the top of a new recursion tree and generates an output filename to append to.
If called with an output filename (i.e. by itself) it searches for PCB patterns, appends to the output, then calls itself on any file references, with the same output filename.
This assumes that file references appear on lines of their own, with no spaces, e.g. xyz.txt.
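The endless loops described in the question's second edit can be avoided by remembering which files have already been processed; one possible sketch (the hashtable name is an assumption):

```powershell
$Visited = @{}

function Get-FileReference
{
    Param($FileName, $OutputFileName = ($FileName -replace '\.txt$', '_out.txt'))

    # Break reference cycles: skip any file we have already seen
    if ($Visited.ContainsKey($FileName)) { return }
    $Visited[$FileName] = $true

    Select-String -Pattern 'PCB.*;' -Path $FileName -AllMatches | Add-Content $OutputFileName
    $References = (Select-String -Pattern '^.*\.txt' -AllMatches -Path $FileName).Matches.Value
    foreach ($Ref in $References)
    {
        Get-FileReference $Ref $OutputFileName
    }
}
```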

How to compare substrings within a folder and an array with PowerShell?

I have a folder with 100,000 files (pictures) which are named by their UPC code (8 to 14 numerical digits) followed by an underscore and other digits:
000012345678_00_1
And I have a list of 20,000 unique UPC codes in a Word document (separated by commas) which should match a fifth of these pictures (I also have this list in an Excel table).
000000000000, 000000000001, 000000000011
What I'm trying to do, is to find matches between my array (the 20,000 elements list) and files in my folder so as to extract only those 20,000 pictures from the folder.
I've started by cutting the file name at the first "_" so as to get only the relevant part of the file name:
$FName = ($File -split '_')[0]
To make things harder, I also need to add a wildcard "*" to the elements in the array, since some extra "0"s at the beginning of the file names might have been added that are not present in our array. For example, this UPC in the array, "05713901", refers to this file name: "00005713901_00.png"; so to find matches I will have to use the -like operator.
Then when I've found those matches, I'll just have to use Move-Item to a new folder or subfolder.
This is what I've started to code without any result:
$Directory = "C:\path_to_my_folder";
$AllFiles = Get-ChildItem $Directory
$FileNames = New-Object System.Collections.ArrayList;
foreach($File in $AllFiles)
{
$FName = ($File -split '_')[0]
$FileNames.Add($FName)
}
$Upc = Get-Content C:\path_to_my_word.docx
Compare-Object $FileNames $Upc
You can't read a .docx file using Get-Content, and even if you could, Compare-Object wouldn't work, because your Word file is a list of UPC codes separated by commas (a single string in PowerShell), while $FileNames is an array (multiple objects).
Copy the UPC codes from Excel to Notepad so you get a simple text file with one code per line, similar to this sample.
UPC.txt - Content:
000000000000
000000000001
000000000011
....
It would take a long time to run 100,000 files each through a 20,000-iteration -like test loop. I would instead create a regex pattern that matches any of the codes followed by an underscore. Ex:
$Directory = "C:\path_to_my_folder";
$AllFiles = Get-ChildItem $Directory
#Generate regex that matches 00001_ or 00002_ etc. Trimming leading and trailing whitespace just to be safe.
$regex = ((Get-Content -Path "c:\UPC.txt") | ForEach-Object { "$($_.Trim())_" }) -join '|'
#Get files that match
$AllFiles | Where-Object { $_.Name -match $regex } | ForEach-Object {
#Do something, ex. Move file.
Move-Item -Path $_.FullName -Dest C:\Destination
}
Or simply
$AllFiles | Where-Object { $_.Name -match $regex } | Move-Item -Destination "C:\Destination"
Save your UPC codes as a plain text file. As Frode F. suggested, copying them from Excel to Notepad is probably the easiest way to do it. Save that list. Then we load that list into PowerShell, and for each file we split at the underscore like you did, trim any leading zeros, and check whether the result is in the list of known codes. Any files in the list of known UPCs are moved with Move-Item:
#Import Known UPC List
$UPCList = Get-Content C:\Path\To\UPCList.txt
#Remove Leading Zeros From List
$UPCList = $UPCList | ForEach{$_.TrimStart('0')}
$Directory = "C:\path_to_my_folder"
Get-ChildItem $Directory | Where{$_.Name.Split('_')[0].TrimStart('0') -in $UPCList} | Move-Item -Dest C:\Destination
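The per-file-name logic in that pipeline boils down to this (using the sample name from the question):

```powershell
# Keep the part before the first underscore, then drop leading zeros
'00005713901_00_1'.Split('_')[0].TrimStart('0')   # -> '5713901'
```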

Replace a line of text in a file using powershell 3

I am trying to replace some text in a file. Currently I am replacing IP addresses with:
(Get-Content $editfile) | ForEach-Object { $_ -replace "10.10.37.*<", "10.10.37.$BusNet<" } | Set-Content $editfile
This code works well here. However I can't get the code to work with another line:
<dbserver>SVRNAME</dbserver>
Here is the code I have written for this line:
(Get-Content $editfile) | ForEach-Object { $_ -replace "<dbserver>*</dbserver>", "$DbSVRName" } | Set-Content $editfile
The code above should replace SVRNAME with the value of $DbSVRName, yet it does not. I know it's simple, and I know I am going to feel dumb afterwards. What am I missing?
While debugging, trying to find a solution, I found that for some reason the pattern doesn't match the *:
(Get-Content $editfile) | ForEach-Object { $_ -match "<dbserver>*</dbserver>" }
This code returns False for every line.
In regex, * doesn't match arbitrary text; it's a quantifier meaning "zero or more of the preceding element". You need .* and, to capture it non-greedily, (.*?):
$str = 'text <dbserver>SVRNAME</dbserver> text'
$replace = 'foo'
$str -replace '(<dbserver>)(.*?)(</dbserver>)', ('$1'+$replace+'$3')
Output
text <dbserver>foo</dbserver> text
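To see why the original pattern fails: the * quantifies the character before it, so `<dbserver>*` means `<dbserver` followed by zero or more `>` characters, which never lines up with the actual text:

```powershell
'<dbserver>SVRNAME</dbserver>' -match '<dbserver>*</dbserver>'    # False
'<dbserver>SVRNAME</dbserver>' -match '<dbserver>.*</dbserver>'   # True
```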

append data to a pipe-delimited file

I have a pipe-delimited file containing 5 columns. I need to append a sixth (pipe-delimited) column to the end of each row.
Old data:
a|b|c|d|e
p|q|r|s|t
New Data:
a|b|c|d|e|x
p|q|r|s|t|x
The sixth column (x) is a value which read from a text-file.
I am wondering if there is a quick way to append this data to the existing data file using PowerShell. The file contains a variable number of rows (between 10 and 100,000).
Any help is appreciated
Simple text operations should work:
$replace = 'x'
(Get-Content file.txt) -replace '$',"|$replace"
a|b|c|d|e|x
p|q|r|s|t|x
For large files, you can do this:
$replace = 'x'
filter add-data {$_ -replace '$',"|$replace"}
Get-Content file.txt -ReadCount 1000 | add-data | add-content newfile.txt
That should produce very good performance with large files.
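Two details make this fast: -ReadCount 1000 sends lines down the pipeline in arrays of 1000, and -replace applied to an array operates on each element, so every batch is rewritten in one operation. A quick illustration:

```powershell
# -replace on an array is applied per element; '$' anchors at end of line
@('a|b|c|d|e', 'p|q|r|s|t') -replace '$', '|x'   # -> a|b|c|d|e|x, p|q|r|s|t|x
```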
Assuming that your data does not already have headers in the CSV, you'll have to define them with the -Header parameter of the Import-Csv cmdlet. To run the example below, put your data into a file called c:\test\test.csv, then run the script in PowerShell or the PowerShell ISE.
# 1. Import the data
$Data = Import-Csv -Delimiter '|' -Path c:\test\test.csv -Header prop1,prop2,prop3,prop4,prop5;
# 2. Add a new member to each row
foreach ($Item in $Data) {
Add-Member -InputObject $Item -MemberType NoteProperty -Name prop6 -Value x;
}
# 3. Export the data to a new CSV file
$Data | Export-Csv -Delimiter '|' -Path c:\test\test.new.csv -NoTypeInformation;
# 4. Remove the double quotes around values
(Get-Content -Path c:\test\test.new.csv -Raw) -replace '"','' | Set-Content -Path c:\test\test.new.csv;
Original Data
The source data in c:\test\test.csv should look like this (according to your original post):
a|b|c|d|e
p|q|r|s|t
Resulting Data
After executing the script, your resulting data in c:\test\test.new.csv will look like this:
prop1|prop2|prop3|prop4|prop5|prop6
a|b|c|d|e|x
p|q|r|s|t|x
Random Sample Data Generation
Here is a short script that will generate a 10,000-line, randomized sample file to c:\test\test.csv:
$Random = { [System.Text.ASCIIEncoding]::ASCII.GetString((1..5 | % { [byte](Get-Random -Minimum 97 -Maximum 122); })).ToCharArray(); };
1..10000 | % { $('{0}|{1}|{2}|{3}|{4}' -f (& $Random)) } | Set-Content -Path c:\test\test.csv;
After running my first script against this sample data (10,000 lines), it took 1,729 milliseconds to execute. I would say that's pretty fast. Not that this is a race or anything.
I ran the sample file generator again, to generate 100,000 lines of data. After running the same script against that data, it took 19,784 milliseconds to run. It's roughly proportional to the 10,000 line test, but all in all, still doesn't take all that long. Is this a one-time thing, or does it need to be run on a schedule?
You could loop through the file line for line and just append the value in the loop:
Edit full sample code:
function append {
    process { $_ + "|x" }
}
$a = Get-Content yourcsv.csv
$a | append | Set-Content yourcsv.csv

Resources