I have a folder with oracle bi publisher reports
Folder/Usage Trend Report.xdmz
Folder/Usage Summary Report.xdmz
Folder/Charge Trend Report.xdmz
Folder/Consolidation Reports.xdmz
Folder/Charge Summary Report.xdmz
Each report is essentially a zip file and contains files such as:
_datamodel.xdm
~metadata.meta
security.xml
I want to make a batch file that searches all the _datamodel.xdm files in order to find a literal (e.g. INVOICE_NBR or invoice_nbr).
The output should be something like:
Report Name                 Literal      Usages
Consolidation Reports.xdmz  INVOICE_NBR  1
Can anyone help me do this?
Thanks
In the future, please show what you've tried to solve the problem on your own, and where you're getting stuck. For this time, I found the challenge interesting, so I whipped up a beginning for you. It's a Batch + PowerShell hybrid script. Save it with a .bat extension and salt to taste. Be advised that the regexp object that performs a count of strings uses case-sensitive matching; so "INVOICE_NBR" would not increment the count when searching for "invoice_nbr".
<# : batch portion
@echo off & setlocal
set "outerfile=*.xdmz"
set "innerfile=_datamodel.xdm"
set "search=invoice_nbr"
rem // re-launch self with PowerShell interpreter
powershell "iex (${%~f0} | out-string)"
goto :EOF
: end batch / begin PowerShell hybrid chimera #>
add-type -as System.IO.Compression.FileSystem
# // encapsulate loop into a collection for select | format-table
&{ foreach ($archive in (gci $env:outerfile)) {
    # // create a temporary directory within %temp%
    $tempdir = New-Item -path $env:temp -name ([Guid]::NewGuid()) -type dir
    [IO.Compression.ZipFile]::ExtractToDirectory($archive, $tempdir)
    # // For each innerfile found within the zip archive...
    gci -path $tempdir -filter $env:innerfile -recurse | %{
        new-object PSObject -property @{
            "Report Name" = $archive.Name
            "Usages" = ([regex]::Matches((gc $_.FullName | out-string), $env:search)).count
            "Literal" = $env:search
        }
    }
    Remove-Item $tempdir -recurse -force
} } | select "Report Name",Literal,Usages | format-table -auto
Example output:
Report Name               Literal     Usages
-----------               -------     ------
Usage Summary Report.xdmz invoice_nbr      2
Usage Trend Report.xdmz   invoice_nbr      2
If you want case-insensitive matching, add the following as a third argument to the [regex]::Matches() function:
[Text.RegularExpressions.RegexOptions]::IgnoreCase
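For example, a quick way to see the difference (the sample string below is just an illustration):

```powershell
$text = 'INVOICE_NBR appears once, and invoice_nbr appears once.'

# Default: case-sensitive -- only the lowercase occurrence matches
[regex]::Matches($text, 'invoice_nbr').Count
# -> 1

# With IgnoreCase, both occurrences match
[regex]::Matches($text, 'invoice_nbr',
    [Text.RegularExpressions.RegexOptions]::IgnoreCase).Count
# -> 2
```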
I'm trying to search through a number of log files with different filenames. I want to search for a hostname in each log and, when a match is found, have it copy that entire line to summary_[date].log and keep appending matching lines to it. So something I've started with is:
$captured = Get-ChildItem -recurse -Path \\nas1\share1 -Include *.log |
where { ($_.Name | Select-String -pattern ('PC1','PC2','PC3') -SimpleMatch) }
Now copy the line from each log file which contains the pattern and append it to a file with today's date stamp, so each week I'll have a file like \\nas1\share1\summary_03-07-2020.log
But this is not quite what I want as this will capture the filenames and append them to the $captured variable. It's also missing the code to copy any matching lines to a date stamped summary_[date].log
Each text file will contain, among other lines that start with a time stamp, something like this:
03-07-2020_14-36-17 - Backup of computer [PC1] is successfully created.
So what I want is to search several text files on a share for several hostnames. If a text file contains the hostname have it append the line which contains the hostname to summary_[date].log. Lastly, since the matching lines will all start with a date/time stamp I need to keep the contents of summary_[date].log file sorted from newest date to oldest.
Essentially I should end up with a summary_[date].log every week that will look similar to this:
03-07-2020_14-36-17 - Backup of computer [PC1] is successfully created.
03-07-2020_13-21-12 - Backup of computer [PC3] is successfully created.
03-07-2020_11-36-29 - Backup of computer [PC2] is successfully created.
By doing this I get a summary of all log files from that day in a single file which I will then automatically email to a specific email address.
How do I accomplish this?
Your current code selects a string from the file name, not the content.
To do what you are after, you can use something like this:
$logFolder = '\\nas1\share1'
$outFile = Join-Path -Path $logFolder -ChildPath ('summary_{0:dd-MM-yyyy}.log' -f (Get-Date))
$captured = Get-ChildItem -Recurse -Path $logFolder -Include *.log | ForEach-Object {
($_ | Select-String -Pattern 'PC1','PC2','PC3' -SimpleMatch).Line
} | Sort-Object # sorting is optional of course
#output on screen
$captured
# output to new log file
$captured | Set-Content -Path $outFile
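One caveat on the sort: the question asks for newest-to-oldest order, and `dd-MM-yyyy_HH-mm-ss` stamps don't sort chronologically as plain strings (the day field sorts before the month). A sketch that parses the leading timestamp instead (the sample lines stand in for `$captured` from the snippet above):

```powershell
# Sample matched lines (stand-ins for $captured from the snippet above)
$captured = '03-07-2020_11-36-29 - Backup of computer [PC2] is successfully created.',
            '03-07-2020_13-21-12 - Backup of computer [PC3] is successfully created.',
            '03-07-2020_14-36-17 - Backup of computer [PC1] is successfully created.'

# Sort newest-first by parsing the leading "dd-MM-yyyy_HH-mm-ss" stamp
# (the first 19 characters of each line) into a real [datetime]
$byNewest = $captured | Sort-Object -Descending {
    [datetime]::ParseExact($_.Substring(0, 19), 'dd-MM-yyyy_HH-mm-ss', $null)
}
$byNewest
```

Pipe `$byNewest` to `Set-Content -Path $outFile` in place of the plain `Sort-Object` above.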
Next send this file as attachment like:
# use splatting for cmdlets that take a lot of parameters
$params = @{
SmtpServer = 'yourmailserver'
From = 'logtester@yourcompany.com'
To = 'someone@yourcompany.com'
Subject = 'Summary'
Body = 'Hi, in the attachment a summary of the backup logs'
Attachments = $outFile
# etc.
}
Send-MailMessage @params
How big are your log files?
This will work, but it loads the entire content of all the files into memory, which may be inefficient with large files and/or a large number of files.
If there are many files, or large files, you could instead use nested foreach loops and loop through each computer name for each file.
This also uses -match rather than -like in Where-Object; you can switch to -like if you get false positives.
$FileContent = Get-Content -Path "\\nas1\share1\*.log"
$ComputerArray = 'PC1','PC2','PC3'
$DateStamp = get-date -Format dd-MM-yyyy
$OutFile = 'summary_' + $DateStamp + '.log'
foreach ($ComputerName in $ComputerArray)
{
$FileContent | Where {$_ -match $ComputerName} | Add-Content -Path $OutFile
}
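If memory is a concern, a streaming sketch with `Select-String` avoids holding whole files in memory: it reads each file line by line and emits only the matching lines (the share path is the one from the question):

```powershell
$ComputerArray = 'PC1', 'PC2', 'PC3'
$DateStamp = Get-Date -Format dd-MM-yyyy
$OutFile = 'summary_' + $DateStamp + '.log'

# Select-String streams each file; -SimpleMatch treats the names as literal
# text, and a line matches if any of the names occurs in it
Get-ChildItem -Path '\\nas1\share1' -Recurse -Include *.log -ErrorAction SilentlyContinue |
    Select-String -Pattern $ComputerArray -SimpleMatch |
    ForEach-Object { $_.Line } |
    Add-Content -Path $OutFile
```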
I am trying to create a batch file that I can use to type in a name of a folder and search multiple directories, then display the results in a new window. Example: I want to search for "tcash" in 3 separate directories, i.e. \\vm-xa01\users, \\vm-xa02\users and \\vm-xa03\users. How can I do this?
The original question had a PowerShell tag, so the answer is PowerShell. For a cmd (batch) script, I'd strongly suggest you move to PowerShell anyway. It's 2018, and cmd scripts require lots of tweaking.
In PowerShell, there's a built-in cmdlet Out-GridView that might be suitable. For example, displaying all the txt files in c:\some\path and its subdirectories requires just a few commands. Like so,
gci c:\some\path -Recurse | ? { $_.extension -eq ".txt" } | ogv
First off, get a recursive list of all files
gci c:\some\path -Recurse
Then select those that have extension .txt
| ? { $_.extension -eq ".txt" }
Finally, pass the results to out-gridview aka ogv
| ogv
I also think PowerShell is the better script language for your task.
You can do a dir/Get-ChildItem with ranges [1-3] similar to a Regular Expression, so:
Get-ChildItem "\\vm-xa0[1-3]\users\tcash" -File -Recurse | Out-GridView
should enumerate all matching files and display in a gui window.
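If the target is the folder itself rather than the files inside it, a small helper sketch restricts the listing to directories (`Find-Folder` is just a made-up name for illustration):

```powershell
function Find-Folder {
    param(
        [string[]]$Roots,   # e.g. '\\vm-xa01\users', '\\vm-xa02\users', '\\vm-xa03\users'
        [string]$Name       # folder name to look for, e.g. 'tcash'
    )
    # -Directory keeps only folders; -Filter matches the folder name
    Get-ChildItem -Path $Roots -Directory -Recurse -Filter $Name -ErrorAction SilentlyContinue
}

# Find-Folder -Roots '\\vm-xa01\users','\\vm-xa02\users','\\vm-xa03\users' -Name 'tcash' | Out-GridView
```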
I have put together the below PowerShell script, which scripts out all the user stored procedures (USPs) on a server.
Is there an option to split the output into individual files (instead of saving as one whole/large file)?
Get-ChildItem -Path SQLSERVER:\SQL\myserver\Default\Databases\mydb\StoredProcedures\ | %{$_.script() | out-file -Filepath "myfilelocation.sql"}
Try the following; it should append the procedure name to the file.
Get-ChildItem -Path SQLSERVER:\SQL\myserver\Default\Databases\mydb\StoredProcedures\ |
%{
#Deal with invalid chars in Procedure name i.e. [Customers\Remove]
$SProc = "$($_.name -replace '\\', '_')"
# $Sproc | Out-Host # Uncomment this to check the procedure names...
$_.script() |
out-file -Filepath "myfilelocation_$SProc.sql"
}
This should make the file name unique per procedure and keep the file from being over-written each time, which is what is currently happening with your script.
I have a script that I've been working on which reads a specified directory, locates .CSV files, executes some logic for each of the .CSV files, and ultimately renames them to .csv.archived.
The code was working fine last night; however, this morning, when I execute the code, it only loops through once. For example, last night I had 5 .csv files in the directory, and it would return the file names for all 5 files in a single run. Now, each time I execute my script, it grabs the first file, performs the actions intended, and then exits, forcing me to manually initiate the script for each file.
I've gutted the irrelevant code for testing purposes, and would be happy if someone could tell me that I am doing something wrong, and that I am not crazy.
Here's the code:
$iterations = 1
#set the location where the .CSV files will be pulled from
$Filecsv = get-childitem "\\SERVERPATH\Audit Test\" -recurse | where {$_.extension -eq ".csv"} | % {
$filename = $_.Name
}
#for each file found in the directory
ForEach ($Item in $Filecsv) {
#spit out the file name
"File Name: " + $filename
#count the times we've looped through
"Iterations : " + $iterations
# get the date and time from the system
$datetime = get-date -f MMddyy-hhmmtt
# rename the file
rename-item -path ("\\SERVERPATH\Audit Test\"+ $filename) -newname ($filename + $datetime + ".csv.archived")
$iterations ++
}
...and here is the Output:
For the example I showed you, I had four .CSV files in the directory. I had to manually execute my script, and each time it would perform as expected, but only for the first item it encounters in the directory. It doesn't actually loop through; what am I missing here?
Right here (folded at the pipe for readability):
$Filecsv = get-childitem "\\SERVERPATH\Audit Test\" -recurse |
where {$_.extension -eq ".csv"} |
% {$filename = $_.Name}
You're looping through your files and, for each one, setting $filename to the name of that file instead of letting the filenames accumulate in $Filecsv. Emit the names instead, so they collect in the variable:
$Filecsv = get-childitem "\\SERVERPATH\Audit Test\" -recurse |
where {$_.extension -eq ".csv"} |
% {$_.Name}
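Putting that fix into the whole loop, a corrected sketch wrapped in a function so the folder is easy to swap (`Invoke-CsvArchive` is just an illustrative name; the share path is the one from the question):

```powershell
function Invoke-CsvArchive {
    param([string]$Folder)   # e.g. "\\SERVERPATH\Audit Test\"
    # Keep the FileInfo objects themselves -- not just the last file's name
    $Filecsv = Get-ChildItem $Folder -Recurse | Where-Object { $_.Extension -eq '.csv' }
    $iterations = 1
    foreach ($Item in $Filecsv) {
        # spit out the file name and the loop count
        "File Name: " + $Item.Name
        "Iterations : " + $iterations
        # rename the file with a date/time suffix
        $datetime = Get-Date -Format MMddyy-hhmmtt
        Rename-Item -Path $Item.FullName -NewName ($Item.Name + $datetime + '.csv.archived')
        $iterations++
    }
}
```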
I hope someone can help me. I am pretty new to PowerShell and can't really script it myself, except by looking at existing code and modifying it.
I have found a PowerShell script that reports file share permissions for a specific share and recurses through the subfolders returning their permissions as well.
My problem is that I need to do this with a lot of shares, so I would like to be able to provide the script with a text file containing the share names. I know I need to do a foreach loop and read the names of the shares from a text file into an array, but I don't know how to do this. I guess it's pretty simple for someone with more experience.
This is the script I have used with a single entry.
http://mywinsysadm.wordpress.com/2011/08/17/powershell-reporting-ntfs-permissions-of-windows-file-shares/
#Set variables
$path = Read-Host "Enter the path you wish to check"
$filename = Read-Host "Enter Output File Name"
$date = Get-Date
#Place Headers on out-put file
$list = "Permissions for directories in: $Path"
$list | format-table | Out-File "C:\scripts\$filename"
$datelist = "Report Run Time: $date"
$datelist | format-table | Out-File -append "C:\scripts\$filename"
$spacelist = " "
$spacelist | format-table | Out-File -append "C:\scripts\$filename"
#Populate Folders & Files Array
[Array] $files = Get-ChildItem -path $path -force -recurse
#Process data in array
ForEach ($file in [Array] $files)
{
#Convert Powershell Provider Folder Path to standard folder path
$PSPath = (Convert-Path $file.pspath)
$list = ("Path: $PSPath")
$list | format-table | Out-File -append "C:\scripts\$filename"
Get-Acl -path $PSPath | Format-List -property AccessToString | Out-File -append "C:\scripts\$filename"
} #end ForEach
Sorry for the noob question. I plan to learn more when I have a bit more time but any help now would be massively appreciated.
Thanks in advance.
If you have a share name on each line within your text file, you can put all the shares into an array like this:
$path = "C:\ShareNames.txt"
$shareArray = gc $path
To access the first share you can use this syntax:
$shareArray[0]
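From there, a sketch that wraps the permissions report in a loop over the array (the share-list path and the C:\scripts output folder are assumptions carried over from the script above, and the report naming is just one way to do it):

```powershell
$shareArray = Get-Content 'C:\ShareNames.txt'

foreach ($share in $shareArray) {
    # One report per share; replace characters that are invalid in file names
    $reportName = 'perms_{0}.txt' -f ($share -replace '[\\:]', '_')
    "Permissions for directories in: $share" | Out-File "C:\scripts\$reportName"

    Get-ChildItem -Path $share -Force -Recurse | ForEach-Object {
        # Convert PowerShell provider path to a standard folder path
        $PSPath = Convert-Path $_.PSPath
        "Path: $PSPath" | Out-File -Append "C:\scripts\$reportName"
        Get-Acl -Path $PSPath |
            Format-List -Property AccessToString |
            Out-File -Append "C:\scripts\$reportName"
    }
}
```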