I'm testing a project and all my test cases are in MTM. I'm looking for a way to extract all the test results we have in MTM into a separate file. Is there any way to do that? Please share if you have any ideas.
Thanks a lot
If you wish to export the results of an automated run, you can download the .trx (test run execution) file from the attachments section and use XSL/XSLT to create an HTML report from it (you can also get a .trx file with the command-line tool tcm.exe run /export).
But if you created the test run by manual execution, this won't be possible. The only way to get a "result file" would be to parse the result of the test run using the TFS API (in C# or PowerShell via TfsTeamProjectCollection from Microsoft.TeamFoundation.TestManagement.Client) and store it in a file.
Or you can use the TFS REST API with this PowerShell script (save it as a .ps1 file), which lets you query the JSON, extract the data you want, and display it the way you want:
$RunId = Read-Host -Prompt "TFS Run Id"
$Url = "http://<tfsurl>/tfs/<CollectionName>/<TeamProject>/_apis/test/runs/$RunId/results"
$Client = New-Object System.Net.WebClient
$Client.Credentials = New-Object System.Net.NetworkCredential("<username>", "<password>", "<domain>")
$Json = $Client.DownloadString($Url) | ConvertFrom-Json
$Dict = @{}
ForEach($Test in $Json.value)
{
    $Key = "Run " + $Test.testRun.name + " [" + $Test.testRun.id + "]"
    $Val = $Test.testCase.name + " [" + $Test.testCase.id + "]" + " = " + $Test.outcome
    if (!$Dict.ContainsKey($Key))
    {
        $List = New-Object System.Collections.ArrayList
        $Dict.Add($Key, $List)
    }
    $IgnoreIndex = $Dict[$Key].Add($Val)
}
ForEach($Key in $Dict.Keys)
{
    Write-Host $Key
    ForEach($Val in $Dict[$Key])
    {
        Write-Host $Val
    }
}
Exit
(replace the <xxx> placeholders with your own values)
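Since the question asks for a separate file rather than console output, the same JSON can also be piped straight to Export-Csv. This is only a sketch built on the script above; the endpoint, JSON field names, and output path are the same placeholders/assumptions as before:

```powershell
# Sketch: same REST call as above, but flattened into rows and
# written to a CSV file instead of the console. <tfsurl>, the
# credentials, and the output path are placeholders.
$RunId = Read-Host -Prompt "TFS Run Id"
$Url = "http://<tfsurl>/tfs/<CollectionName>/<TeamProject>/_apis/test/runs/$RunId/results"
$Client = New-Object System.Net.WebClient
$Client.Credentials = New-Object System.Net.NetworkCredential("<username>", "<password>", "<domain>")
$Json = $Client.DownloadString($Url) | ConvertFrom-Json

# One row per test result: run id, test case, outcome
$Json.value |
    Select-Object @{n='RunId';e={$_.testRun.id}},
                  @{n='TestCase';e={$_.testCase.name}},
                  @{n='Outcome';e={$_.outcome}} |
    Export-Csv -Path "C:\temp\TestResults_$RunId.csv" -NoTypeInformation
```

The calculated properties in Select-Object just rename the nested JSON fields into flat CSV columns.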
Related
I am trying to automate a process of mine that involves fetching monthly reports that are generated on the first of each month for the previous month. The systems that generate these reports have different ways of distributing them, some are uploaded to an SFTP server, some are e-mailed as attachments, and the ones I am currently trying to fetch are stored on a web server.
Using some other code that I found, I have been able to authenticate to the web server at the URL one of the reports is hosted at, and I am able to successfully download the report.
Here is the code:
$request = Invoke-WebRequest $url -SessionVariable fb
$form = $request.Forms[0]
$form.Fields["username"] = "username"
$form.Fields["password"] = "password"
$downloadPath = "C:\Users\user\Documents\Reports_January_2019\File Servers\department1\"
$reportDate = (Get-Date -UFormat %b) + "01" + (Get-Date -UFormat %Y)
$url = "https://ourwebsite.com/schedule-reports/reportname/" + $reportDate + "/12-33-0-PM/servername1/monthlyreport.pdf"
Invoke-WebRequest -Uri ($url + $form.Action) -WebSession $fb -Method POST -Body $form.Fields
Invoke-WebRequest -Uri $url -OutFile ($downloadPath + "monthlyreport_servername1.pdf")
This code does work, so I am able to download the .pdf at the location specified. There are several other servers that the report is run on, so what I am trying to do is come up with an efficient way to download the reports generated for the other servers. For this, certain parts of the URL change. The parts that change are the date, the server name, and the name of the report which also has the server name in it. The $reportDate variable displays the dynamic part of the URL for the date the report was generated correctly.
I tried modifying this code and doing two foreach loops but this is where I am getting confused. Here is the code I am trying to make work to download all 7 reports using an array containing the server names:
$request = Invoke-WebRequest $url -SessionVariable fb
$form = $request.Forms[0]
$form.Fields["username"] = "username"
$form.Fields["password"] = "password"
$downloadPath = "C:\Users\user\Documents\Reports_January_2019\File Servers\$server\"
$servers = @("department1", "department2", "department3", "department4", "department5", "department6", "department7")
$serverNames = @("server1", "server2", "server3", "server4", "server5", "server6", "server7")
$reportDate = (Get-Date -UFormat %b) + "01" + (Get-Date -UFormat %Y)
$url = "https://ourwebsite.com/schedule-reports/reportname/" + "$reportDate" + "/12-33-0-PM/" + "$serverName" + "/monthlyreport.pdf"
foreach ($serverName in $serverNames) {
    Invoke-WebRequest -Uri ($url + $form.Action) -WebSession $fb -Method POST -Body $form.Fields
    foreach ($server in $servers) {
        Invoke-WebRequest -Uri $url -OutFile ($downloadPath + "monthlyreport_$serverName.pdf")
    }
}
The result when I run this is that I get 7 .pdf files downloaded and they are all named appropriately (monthlyreport_$serverName.pdf), but the file size is 6KB for all of them and they can't be opened in a PDF reader because they are somehow corrupted. 6KB does not match the file size of any of the reports so something is definitely messed up. Additionally, they are not moved into their own folder as per the $downloadPath variable, but rather all 7 files end up in "C:\Users\user\Documents\Reports_January_2019\File Servers\department7\"
Any guidance would be greatly appreciated! Thanks!
There are variable-expansion problems in some of your commands and variables. When you assign a variable outside of a loop using another variable that only gets its value inside the loop, the expression is evaluated once instead of on every iteration of the loop. I don't know whether the code below gives you exactly the result you are looking for, but I know that it fixes the variable-expansion problem.
$servers = @("department1", "department2", "department3", "department4", "department5", "department6", "department7")
$serverNames = @("server1", "server2", "server3", "server4", "server5", "server6", "server7")
$reportDate = (Get-Date -UFormat %b) + "01" + (Get-Date -UFormat %Y)
foreach ($serverName in $serverNames) {
    $url = "https://ourwebsite.com/schedule-reports/reportname/" + $reportDate + "/12-33-0-PM/" + $serverName + "/monthlyreport.pdf"
    $request = Invoke-WebRequest $url -SessionVariable fb
    $form = $request.Forms[0]
    $form.Fields["username"] = "username"
    $form.Fields["password"] = "password"
    Invoke-WebRequest -Uri ($url + $form.Action) -WebSession $fb -Method POST -Body $form.Fields
    foreach ($server in $servers) {
        $downloadPath = "C:\Users\user\Documents\Reports_January_2019\File Servers\" + $server + "\"
        Invoke-WebRequest -Uri $url -OutFile ($downloadPath + "monthlyreport_" + $serverName + ".pdf")
    }
}
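Note that the inner foreach still downloads the same $url into every department folder. If the two arrays correspond by position (department1 goes with server1, and so on), one way to pair them is a single index-based loop. This is only a sketch, assuming the login from the question has already populated the $fb session and that the positional pairing holds:

```powershell
# Sketch: walk both arrays by index so each server's report lands
# in its matching department folder. URL and paths are the
# placeholders from the question.
$servers     = @("department1", "department2", "department3", "department4", "department5", "department6", "department7")
$serverNames = @("server1", "server2", "server3", "server4", "server5", "server6", "server7")
$reportDate  = (Get-Date -UFormat %b) + "01" + (Get-Date -UFormat %Y)

for ($i = 0; $i -lt $serverNames.Count; $i++) {
    $serverName = $serverNames[$i]
    $url = "https://ourwebsite.com/schedule-reports/reportname/$reportDate/12-33-0-PM/$serverName/monthlyreport.pdf"
    $downloadPath = "C:\Users\user\Documents\Reports_January_2019\File Servers\" + $servers[$i] + "\"
    # Reuse the authenticated session for each download
    Invoke-WebRequest -Uri $url -WebSession $fb -OutFile ($downloadPath + "monthlyreport_$serverName.pdf")
}
```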
I have a PS script that should deploy a project to my SSIS server.
When I run the generated command in a console it runs fine but when the command is executed from Powershell it fails because of this (windows) error :
TITLE: SQL Server Integration Services
The path format is invalid.
Parameter name: DestinationPath (ISDeploymentWizard)
ADDITIONAL INFORMATION:
The path format is invalid. (Microsoft.SqlServer.IntegrationServices.Wizard.Common)
If I run the generated command from a console it runs fine:
D:\Deploy\ISDeploymentWizard.exe /Silent /ModelType:Project /SourcePath:"D:\Deploy\Receive\My_Beautiful_Project.ispac" /DestinationServer:"localhost" /DestinationPath:"/SSISDB/My Beautiful Project/My_Beautiful_Project" /ProjectPassword:"SuperSecretPassword"
The script (thanks to suggestions from Guenther Schmitz and Janne Tukaanen) :
#region script configuration
$SsisServer = "."
$ProjectFileFolder = "D:\Deploy\Receive"
$ProjectFileName = "My_Beautiful_Project.ispac"
$ProjectFilePassword = "SuperSecretPassword"
$FolderName = "My Beautiful Project"
$ProjectName = "My_Beautiful_Project"
$ISDeploymentWizard = "D:\Deploy\ISDeploymentWizard.exe"
#endregion
#region project deployment
# Create command line arguments
$DestinationPath = "/SSISDB/" + $FolderName + "/" + $ProjectName
$ProjectFilePath = $ProjectFileFolder + "\" + $ProjectFileName
$cmd = $ISDeploymentWizard
$arg1 = "/Silent"
$arg1a= "/ModelType:Project"
$arg2 = "/SourcePath:""$ProjectFilePath"""
$arg3 = "/DestinationServer:""$SsisServer"""
$arg4 = "/DestinationPath:""$DestinationPath"""
$arg5 = "/ProjectPassword:""$ProjectFilePassword"""
Write-Host "$cmd" $arg1 $arg1a $arg2 $arg3 $arg4 $arg5
& "$cmd" $arg1 $arg1a $arg2 $arg3 $arg4 $arg5
Write-Host "Done"
#endregion
There is no need to declare the variables $arg1 $arg1a $arg2 $arg3 $arg4 $arg5; just run the following command (why declare variables only to store their values in other variables?):
& $cmd /Silent /ModelType:Project /SourcePath:$ProjectFilePath /DestinationServer:$SsisServer /DestinationPath:$DestinationPath /ProjectPassword:$ProjectFilePassword
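One caveat: values that contain spaces (as $DestinationPath does here, "My Beautiful Project") can be split into separate arguments when passed bare. A defensive sketch is to build the arguments as an array and splat it; PowerShell then passes each element as one argument, quoting as needed. The variable name $isdArgs is mine:

```powershell
# Sketch: one array element per argument; splatting keeps elements
# with embedded spaces intact when invoking the native executable.
$isdArgs = @(
    '/Silent'
    '/ModelType:Project'
    "/SourcePath:$ProjectFilePath"
    "/DestinationServer:$SsisServer"
    "/DestinationPath:$DestinationPath"
    "/ProjectPassword:$ProjectFilePassword"
)
& $cmd @isdArgs
```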
You are missing the executable in the line below Write-Host.
change
& $arg1 $arg2 $arg3 $arg4 $arg5
to
& $cmd $arg1 $arg2 $arg3 $arg4 $arg5
If you have trouble starting console apps from PowerShell (typically because of multiple arguments), you can execute them through cmd (from within PowerShell):
cmd /c "$cmd $arg1 $arg2 $arg3 $arg4 $arg5"
There is also another option using the .NET Process class, so you don't have to go through cmd:
$ProcessInfo = New-Object System.Diagnostics.ProcessStartInfo
$ProcessInfo.FileName = "D:\Deploy\ISDeploymentWizard.exe"
$ProcessInfo.Arguments = "$arg1 $arg1a $arg2 $arg3 $arg4 $arg5"
$ProcessInfo.RedirectStandardError = $true
$ProcessInfo.RedirectStandardOutput = $true
$ProcessInfo.UseShellExecute = $false
$Process = New-Object System.Diagnostics.Process
$Process.StartInfo = $ProcessInfo
$Process.Start() | Out-Null
$output = $Process.StandardOutput.ReadToEnd()
$errors = $Process.StandardError.ReadToEnd()
$Process.WaitForExit()
Write-Host $output
Write-Error $errors
You can check this for some more details:
PowerShell, stream Process output and errors while running external process
Pretty sure the $DestinationPath needs to be a non-relative path. Change it to the full path, including the drive, and I think that will solve your problem.
So I have a script that is meant to send some pretty huge (700 MB) txt files to the FTP of a service we use that automatically sets our prices across a fleet of domains.
I'm using a byte array (as found here) to upload it to the site, and I have some very elementary built-in error handling as well as some database tie-ins.
If we run these separately, they move pretty fast. For whatever reason, after the completion of one block, the time until starting another block is CRAZY volatile. Sometimes it's 2 minutes, sometimes the script just sits around for a good 40 minutes before moving to the next block.
I'm sort of assuming the issue is I'm giving the thing more housekeeping than I should be? It's also worth noting that even stopping the script can sometimes take 15 minutes. (Like, If I just hit break on the script in the middle of it running, it can take a good 15-20 minutes to stop)
Also, for what it's worth, the script got MUCH worse in terms of runtime in the last few days. I have no idea what we could have changed to make it start taking so much longer but here we are.
Anyway, any insights would be appreciated.
Notes: I don't clear the content variable when I clear variables, should I?
Should I keep the rs open? I don't think I CAN because I'm connecting to the FTP with different usernames.
Here's the code (I actually have about 12 of the *.txt blocks, but they're identical so I've kept it down to two here):
#================================== SETUP BLOCK ==================================#
#get dat email ready
$strpasswd = Get-Content "PASSWORDFILE" | ConvertTo-SecureString
$mycreds = New-Object System.Management.Automation.PSCredential ("EXCHANGEUSER",$strpasswd)
$EmailTo = 'some#email','goes#here'
$EmailFrom = 'EXCHANGEUSER'
$EmailSubject = "CA Feed Issue Undefined Subject"
$emailbody = "Body Not Yet Defined"
$SmtpServer = 'MUHSERVER'
#Opens up database session so we can send queries
$strserver = "Server\MUHDB"
$strdatabase = "logs"
$strusername = "EXCHANGEUSER"
#createsdatabaseconnection
$sqlConnection = new-object System.Data.SqlClient.SqlConnection "server='$strserver';database='$strdatabase';Integrated Security=SSPI; User ID='$strusername'; password='$strpassword'"
$sqlConnection.Open()
#define the defaultquery
$strQuery =
"
INSERT INTO [logs].[dbo].[EventLog] (SourceID, Started, Completed, Result, Context, Machine)
values (50,1500,1500,'NOTEDEFINED','NOTDEFINED','Server\MUHCLIENTMACHINE-CAFeed')
"
#this is how I execute the command
#$sqlCommand = $sqlConnection.CreateCommand()
#$sqlCommand.CommandText = $strquery
#$sqlCommand.ExecuteReader()
#==================================Luna.txt ==================================#
##DEFINE THESE TO CREATE NEW FEEDS
$strFilename = "\\PATH\Luna.txt"
$ftp = [System.Net.FtpWebRequest]::Create("FTPLINK1")
$user = "USERNAME1"
$password = "PASSWORDREDACTED"
# create the FtpWebRequest and configure it
$ftp = [System.Net.FtpWebRequest]$ftp
# build authentication and connection
$ftp.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftp.Credentials = new-object System.Net.NetworkCredential($user,$password)
$ftp.UseBinary = $true
$ftp.UsePassive = $true
$ftp.timeout = -1
#start a timer and error handling
$starttime = (get-date).ToString()
$error.Clear()
# read in the file to upload as a byte array
$content = [System.IO.File]::ReadAllBytes("$strfilename")
$ftp.ContentLength = $content.Length
# get the request stream, and write the bytes into it
$rs = $ftp.GetRequestStream()
$rs.Write($content, 0, $content.Length)
$endtime = (get-date).ToString()
#error handle
if ($error)
{
    #Assemble the Query
    $sqlresult = "THERE IS AN ERROR, Check the error email for details"
    $sqlcontext = ($strfilename + '|' + $content.length + ' bytes')
    $strquery =
    "INSERT INTO [logs].[dbo].[EventLog] (SourceID, Started, Completed, Result, Context, Machine)
    values (50,'$starttime','$endtime','$sqlresult','$sqlcontext','Server\MUHCLIENTMACHINE-CAFEEDSCRIPT')"
    #Create Command and Execute.
    $sqlCommand = $sqlConnection.CreateCommand()
    $sqlCommand.CommandText = $strQuery
    $sqlCommand.ExecuteNonQuery()
    #Send dem emails
    $emailbody = "A file for the CA Feed failed on $strfilename at " + (get-date).ToString() + " with the error '$error[0]'"
    $emailsubject = "CA Feed Failed File"
    Send-MailMessage -SmtpServer $SmtpServer -to $EmailTo -from $EmailFrom -subject $EmailSubject -Body $emailbody
}
else
{
    write-host ("$strfilename" + ' Ran Without Errors')
    $sqlresult = "RAN WITHOUT ERRORS"
    $sqlcontext = ($strfilename + '|' + $content.length + ' bytes')
    $strquery =
    "INSERT INTO [logs].[dbo].[EventLog] (SourceID, Started, Completed, Result, Context, Machine)
    values (50,'$starttime','$endtime','$sqlresult','$sqlcontext','Server\MUHCLIENTMACHINE-CAFEEDSCRIPT')"
    #Create a command object.
    $sqlCommand = $sqlConnection.CreateCommand()
    $sqlCommand.CommandText = $strQuery
    $sqlCommand.ExecuteNonQuery()
}
# be sure to clean up after ourselves and get ready for next block
Clear-Variable -Name starttime,endtime,strfilename,sqlresult,sqlcontext,ftp
$rs.Close()
$rs.Dispose()
#==================================LDE.txt ==================================#
##DEFINE THESE TO CREATE NEW FEEDS
$strFilename = "\\PATH\LDE.txt"
$ftp = [System.Net.FtpWebRequest]::Create("FTPLINK2")
$user = "USERNAME2"
$password = "PASSWORDREDACTED"
# create the FtpWebRequest and configure it
$ftp = [System.Net.FtpWebRequest]$ftp
# build authentication and connection
$ftp.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftp.Credentials = new-object System.Net.NetworkCredential($user,$password)
$ftp.UseBinary = $true
$ftp.UsePassive = $true
$ftp.timeout = -1
#start a timer and error handling
$starttime = (get-date).ToString()
$error.Clear()
# read in the file to upload as a byte array
$content = [System.IO.File]::ReadAllBytes("$strfilename")
$ftp.ContentLength = $content.Length
# get the request stream, and write the bytes into it
$rs = $ftp.GetRequestStream()
$rs.Write($content, 0, $content.Length)
$endtime = (get-date).ToString()
#error handle
if ($error)
{
    #Assemble the Query
    $sqlresult = "THERE IS AN ERROR, Check the error email for details"
    $sqlcontext = ($strfilename + '|' + $content.length + ' bytes')
    $strquery =
    "INSERT INTO [logs].[dbo].[EventLog] (SourceID, Started, Completed, Result, Context, Machine)
    values (50,'$starttime','$endtime','$sqlresult','$sqlcontext','Server\MUHCLIENTMACHINE-CAFEEDSCRIPT')"
    #Create Command and Execute.
    $sqlCommand = $sqlConnection.CreateCommand()
    $sqlCommand.CommandText = $strQuery
    $sqlCommand.ExecuteNonQuery()
    #Send dem emails
    $emailbody = "A file for the CA Feed failed on $strfilename at " + (get-date).ToString() + " with the error '$error[0]'"
    $emailsubject = "CA Feed Failed File"
    Send-MailMessage -SmtpServer $SmtpServer -to $EmailTo -from $EmailFrom -subject $EmailSubject -Body $emailbody
}
else
{
    write-host ("$strfilename" + ' Ran Without Errors')
    $sqlresult = "RAN WITHOUT ERRORS"
    $sqlcontext = ($strfilename + '|' + $content.length + ' bytes')
    $strquery =
    "INSERT INTO [logs].[dbo].[EventLog] (SourceID, Started, Completed, Result, Context, Machine)
    values (50,'$starttime','$endtime','$sqlresult','$sqlcontext','Server\MUHCLIENTMACHINE-CAFEEDSCRIPT')"
    #Create a command object.
    $sqlCommand = $sqlConnection.CreateCommand()
    $sqlCommand.CommandText = $strQuery
    $sqlCommand.ExecuteNonQuery()
}
# be sure to clean up after ourselves and get ready for next block
Clear-Variable -Name starttime,endtime,strfilename,sqlresult,sqlcontext,ftp
$rs.Close()
$rs.Dispose()
I don't think anybody is going to debug that much code. Your best bet is to find where your issue actually is. I use a stopwatch like the one below; place the checkpoints strategically:
$SW = [System.Diagnostics.Stopwatch]::new()
$SW.Start()
#Your code block goes here
Write-Host "End of Code block 1"
$SW.Elapsed.TotalSeconds
#Another code block goes here
Write-Host "End of Code block 2"
$SW.Elapsed.TotalSeconds
Now if you are trying to break out and it is taking 15 minutes to respond, the script is probably stuck in an operation. It cannot respond until that operation finishes or fails.
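As an aside on the duplication in the question: each *.txt block could be collapsed into a single function so the housekeeping (closing and disposing the request stream) happens in one place, even when the upload fails. A minimal sketch, assuming one file per FTP endpoint; the function and parameter names here are invented, not from the original script:

```powershell
# Sketch: one reusable upload function instead of twelve copied blocks.
function Send-FtpFile {
    param(
        [string]$Path,      # UNC path of the file to upload
        [string]$FtpUrl,    # e.g. "ftp://host/target.txt" (placeholder)
        [string]$User,
        [string]$Password
    )
    $ftp = [System.Net.FtpWebRequest]::Create($FtpUrl)
    $ftp.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
    $ftp.Credentials = New-Object System.Net.NetworkCredential($User, $Password)
    $ftp.UseBinary = $true
    $ftp.UsePassive = $true
    $content = [System.IO.File]::ReadAllBytes($Path)
    $ftp.ContentLength = $content.Length
    $rs = $ftp.GetRequestStream()
    try {
        $rs.Write($content, 0, $content.Length)
    }
    finally {
        # Always release the stream, even if the write throws
        $rs.Close()
        $rs.Dispose()
    }
}

# Usage, with the credentials from the question's first block:
# Send-FtpFile -Path "\\PATH\Luna.txt" -FtpUrl "FTPLINK1" -User "USERNAME1" -Password "PASSWORDREDACTED"
```

Different usernames per endpoint are not a problem here, since each call creates its own FtpWebRequest with its own credentials.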
I've done a fair amount of research into how to run a parameterized PS script from SSIS, but I am having issues getting a parameterized script running. The PS script is as follows; if I hard-code the parameters into the script, it behaves as expected:
Param ([string]$filepath,[string]$filename)
$Path = $filepath
$InputFile = (Join-Path $Path $filename)
$Reader = New-Object System.IO.StreamReader($InputFile)
While (($Line = $Reader.ReadLine()) -ne $null) {
    If ($Line -match 'FILE\|([^\|]+)') {
        $OutputFile = "$($matches[1]).txt"
    }
    Add-Content (Join-Path $Path $OutputFile) $Line
}
Running it from an SSIS Execute Process Task, I am trying to build the Arguments property via an expression as follows:
"-ExecutionPolicy ByPass -File " + @[User::vPSScriptLocation] + " " + @[User::vFilePath] + " " + @[User::vFileName]
Evaluating the expression gives the following:
-ExecutionPolicy ByPass -File \\WorkDirectory\Script.ps1 \\transfer datafile.data
Upon execution, the task fails. The .ps1 is deleted from the work directory and SSIS gives the following error code:
Error: 0xC0029151 at Execute powershell script, Execute Process Task: In Executing "C:\Windows\System32\WindowsPowerShell\v1.0\PowerShell.exe" "-ExecutionPolicy ByPass -File \\WorkDirectory\Script.ps1 \\transfer datafile.data" at "", The process exit code was "-196608" while the expected was "0".
Looks like it's getting an empty string where it shouldn't? Any pointers appreciated.
OK, so since it appears I could not call a UNC path using an Execute Process Task, I decided to execute this within a Script Task with a reference added to System.Management.Automation, which allowed me to create a PowerShell instance. This is far from my ideal solution, as I really wanted to call a .ps1 file, but it looks like this is my only option given that I need to use a UNC path.
I built the PS script from my Dts variables and then executed it within the instance, which achieved the desired result:
public void Main()
{
    string filepath = Dts.Variables["User::vUNCPath"].Value.ToString();
    string filename = Dts.Variables["User::vFileName"].Value.ToString();
    string searchterm = Dts.Variables["User::vSearchTerm"].Value.ToString();
    bool fireAgain = true;

    // Build powershell script
    string script = "$Path = \"" + filepath + "\";" +
        "$InputFile = (Join-Path $Path \"" + filename + "\");" +
        "$Reader = New-Object System.IO.StreamReader($InputFile);" +
        "While (($Line = $Reader.ReadLine()) -ne $null) {" +
        "If ($Line -match '" + searchterm + "') { " +
        "$OutputFile = \"$($matches[1]).txt\"};" +
        "Add-Content (Join-Path $Path $OutputFile) $Line}";
    Dts.Events.FireInformation(0, "Info", "Powershell script built: " + script, String.Empty, 0, ref fireAgain);
    try
    {
        // Create instance to run script
        using (PowerShell psinstance = PowerShell.Create())
        {
            //Assign built script to this instance
            psinstance.AddScript(script);
            //Run powershell script
            psinstance.Invoke();
        }
        Dts.TaskResult = (int)ScriptResults.Success;
    }
    catch (Exception ex)
    {
        Dts.Events.FireError(0, "Error", ex.Message, String.Empty, 0);
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
}
If you're running your script as a .ps1 file with a param block like this, your execution call should pass the parameters by name:
"-ExecutionPolicy ByPass -File " + @[User::vPSScriptLocation] + " -filepath " + @[User::vFilePath] + " -filename " + @[User::vFileName]
This should do it, provided you use a valid filepath and filename.
If it doesn't work, please try writing your script as a function and testing it in a PowerShell console. Your script as a function looks like this:
function SearchLines
{
    Param (
        [string]$filepath,
        [string]$filename
    )
    $Path = $filepath
    $InputFile = (Join-Path $Path $filename)
    $Reader = New-Object System.IO.StreamReader($InputFile)
    While (($Line = $Reader.ReadLine()) -ne $null) {
        If ($Line -match 'FILE\|([^\|]+)') {
            $OutputFile = "$($matches[1]).txt"
        }
        Add-Content (Join-Path $Path $OutputFile) $Line
    }
}
Usage:
SearchLines -filepath "\\your\unc\path\here" -filename "filename.txt"
If this doesn't work for you, please let us know which error you got.
Thx.
UPDATE:
Based on your comments, I rewrote your function in the hope that it meets your requirements as closely as possible. The function now looks like this:
function SearchLines
{
    Param (
        [string]$InputFile
    )
    $FileContent = Get-Content $InputFile
    foreach ($Line in $FileContent)
    {
        If ($Line -match 'FILE\|([^\|]+)')
        {
            $OutputFile = "$($matches[1]).txt"
        }
        Add-Content -Path $OutputFile -Value $Line
    }
}
Usage:
SearchLines -InputFile c:\your\path\to\your\file.log
This function creates, for every matching line in the given file, a new file in the current folder, named after what is written in that line. The Join-Path cmdlet simply joins the two strings without any plausibility check; that's why you can simply pass the full path to your file instead of the path and the file name as separate parameters.
If you need the path of the input file to use for your output files, you can get it with these lines:
$tmpPath = Get-Childitem $InputFullPath
$Path = $tmpPath.Directory.FullName
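A small alternative sketch: Split-Path extracts the same directory portion purely from the string, so it works even when the file does not exist yet (Get-ChildItem requires the item to be present):

```powershell
# Equivalent to the two lines above, without touching the filesystem
$Path = Split-Path -Path $InputFullPath -Parent
```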
Because you didn't explain what exactly this script should do, I hope you can use this to get what you wanted.
Greetings
I am trying to write a script to scan for SQL maintenance task failures - see the script below. I appear to be unable to process more than 100 entries using EnumHistory(). Does anyone have a way around this?
Param(
    [int]$days = "30" # this hardly matters since EnumJobHistory is limited to 100 rows :-(
)
#http://powershell.com/cs/blogs/tobias/archive/2010/01/13/cancelling-a-pipeline.aspx
filter Stop-Pipeline([scriptblock]$condition = {$true})
{
    $_
    if (& $condition) {continue}
}
cls
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null
$instances = Get-Content "DailyMaintenanceMMCServerList.txt"
#loop through each instance
foreach ($instance in $instances)
{
    # Create an SMO connection to the instance
    $srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $instance
    $instance
    #get all the jobs on the server
    $jobs = $srv.JobServer.Jobs
    # Avoid exception on some servers?
    if (!$jobs.Count)
    {
        continue
    }
    #go through each job and find failures in the job history
    $jobs | % {
        do
        {
            $job = $_; $count = 0;
            $_.EnumHistory() |
                Stop-Pipeline { $_.Rundate -lt [datetime]::Today.AddDays(-$days) } |
                #? {$_.Message -notlike "*succeeded*" } |
                % { " " + ++$count + " " + $job.Name + " " + $_.RunDate + " " + ($_.Message).Substring(0,20) }
        } while ($false)
    }
}
As pointed out by Ben Thul, the maximum number of history rows kept is configured per server instance:
Check how much history the Agent is configured to keep. In powershell, you can get this from the MaximumJobHistoryRows in the JobServer object. Or right click on the agent in SSMS and look at "history". My guess is that it's only configured to keep 100 per job.
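If retention really is the limiting factor, it can be inspected and raised through the same SMO JobServer object the script already uses. A sketch only; the instance name and row counts are placeholders, not recommendations, and I have not verified these property names against every SQL Server version:

```powershell
# Sketch: raise the Agent's history retention via SMO so EnumHistory()
# can return more than ~100 rows per job.
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | Out-Null
$srv = New-Object Microsoft.SqlServer.Management.Smo.Server 'MyInstance'

# Current settings
$srv.JobServer.MaximumHistoryRows      # total rows kept across all jobs
$srv.JobServer.MaximumJobHistoryRows   # rows kept per job

# Raise them and persist the change on the server
$srv.JobServer.MaximumHistoryRows = 10000
$srv.JobServer.MaximumJobHistoryRows = 1000
$srv.JobServer.Alter()
```

The same values are visible in SSMS under SQL Server Agent > Properties > History.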