PowerShell script taking a long time when it seems like it shouldn't? - arrays

So I have a script that is meant to send some pretty huge (700 MB) txt files to the FTP of a service we use that automatically sets our prices across a fleet of domains.
I'm using a byte array (as found here) to upload each file to the site, and I have some very elementary built-in error handling as well as some database tie-ins.
If we run these separately, they move pretty fast. For whatever reason, after the completion of one block, the time until the next block starts is CRAZY volatile. Sometimes it's 2 minutes; sometimes the script just sits around for a good 40 minutes before moving to the next block.
I'm sort of assuming the issue is that I'm giving the thing more housekeeping than I should be? It's also worth noting that even stopping the script can sometimes take 15 minutes. (If I just hit break in the middle of a run, it can take a good 15-20 minutes to stop.)
Also, for what it's worth, the script got MUCH worse in terms of runtime in the last few days. I have no idea what we could have changed to make it start taking so much longer, but here we are.
Anyway, any insights would be appreciated.
Notes: I don't clear the $content variable when I clear variables; should I?
Should I keep the request stream ($rs) open? I don't think I CAN, because I'm connecting to the FTP with different usernames.
Here's the code (I actually have about 12 of the *.txt blocks, but they're identical, so I've only kept two here):
#================================== SETUP BLOCK ==================================#
#get dat email ready
$strpasswd = Get-Content "PASSWORDFILE" | ConvertTo-SecureString
$mycreds = New-Object System.Management.Automation.PSCredential ("EXCHANGEUSER",$strpasswd)
$EmailTo = 'some@email','goes@here'
$EmailFrom = 'EXCHANGEUSER'
$EmailSubject = "CA Feed Issue Undefined Subject"
$emailbody = "Body Not Yet Defined"
$SmtpServer = 'MUHSERVER'
#Opens up database session so we can send queries
$strserver = "Server\MUHDB"
$strdatabase = "logs"
$strusername = "EXCHANGEUSER"
#createsdatabaseconnection
$sqlConnection = new-object System.Data.SqlClient.SqlConnection "server='$strserver';database='$strdatabase';Integrated Security=SSPI; User ID='$strusername'; password='$strpasswd'"
$sqlConnection.Open()
#define the defaultquery
$strQuery =
"
INSERT INTO [logs].[dbo].[EventLog] (SourceID, Started, Completed, Result, Context, Machine)
values (50,1500,1500,'NOTEDEFINED','NOTDEFINED','Server\MUHCLIENTMACHINE-CAFeed')
"
#this is how I execute the command
#$sqlCommand = $sqlConnection.CreateCommand()
#$sqlCommand.CommandText = $strquery
#$sqlCommand.ExecuteReader()
#==================================Luna.txt ==================================#
##DEFINE THESE TO CREATE NEW FEEDS
$strFilename = "\\PATH\Luna.txt"
$ftp = [System.Net.FtpWebRequest]::Create("FTPLINK1")
$user = "USERNAME1"
$password = "PASSWORDREDACTED"
# create the FtpWebRequest and configure it
$ftp = [System.Net.FtpWebRequest]$ftp
# build authentication and connection
$ftp.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftp.Credentials = new-object System.Net.NetworkCredential($user,$password)
$ftp.UseBinary = $true
$ftp.UsePassive = $true
$ftp.timeout = -1
#start a timer and error handling
$starttime = (get-date).ToString()
$error.Clear()
# read in the file to upload as a byte array
$content = [System.IO.File]::ReadAllBytes("$strfilename")
$ftp.ContentLength = $content.Length
# get the request stream, and write the bytes into it
$rs = $ftp.GetRequestStream()
$rs.Write($content, 0, $content.Length)
$endtime = (get-date).ToString()
#error handle
if ($error)
{
#Assemble the Query
$sqlresult = "THERE IS AN ERROR, Check the error email for details"
$sqlcontext = ($strfilename + '|' + $content.length + ' bytes')
$strquery =
"INSERT INTO [logs].[dbo].[EventLog] (SourceID, Started, Completed, Result, Context, Machine)
values (50,'$starttime','$endtime','$sqlresult','$sqlcontext','Server\MUHCLIENTMACHINE-CAFEEDSCRIPT')"
#Create Command and Execute.
$sqlCommand = $sqlConnection.CreateCommand()
$sqlCommand.CommandText = $strQuery
$sqlCommand.ExecuteNonQuery()
#Send dem emails
$emailbody = "A file for the CA Feed failed on $strfilename at " + (get-date).ToString() + " with the error '$($error[0])'"
$emailsubject = "CA Feed Failed File"
Send-MailMessage -SmtpServer $SmtpServer -to $EmailTo -from $EmailFrom -subject $EmailSubject -Body $emailbody
}
else
{
write-host ("$strfilename" + ' Ran Without Errors')
$sqlresult = "RAN WITHOUT ERRORS"
$sqlcontext = ($strfilename + '|' + $content.length + ' bytes')
$strquery =
"INSERT INTO [logs].[dbo].[EventLog] (SourceID, Started, Completed, Result, Context, Machine)
values (50,'$starttime','$endtime','$sqlresult','$sqlcontext','Server\MUHCLIENTMACHINE-CAFEEDSCRIPT')"
#Create a command object.
$sqlCommand = $sqlConnection.CreateCommand()
$sqlCommand.CommandText = $strQuery
$sqlCommand.ExecuteNonQuery()
}
# be sure to clean up after ourselves and get ready for next block
Clear-Variable -Name starttime,endtime,strfilename,sqlresult,sqlcontext,ftp
$rs.Close()
$rs.Dispose()
#==================================LDE.txt ==================================#
##DEFINE THESE TO CREATE NEW FEEDS
$strFilename = "\\PATH\LDE.txt"
$ftp = [System.Net.FtpWebRequest]::Create("FTPLINK2")
$user = "USERNAME2"
$password = "PASSWORDREDACTED"
# create the FtpWebRequest and configure it
$ftp = [System.Net.FtpWebRequest]$ftp
# build authentication and connection
$ftp.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftp.Credentials = new-object System.Net.NetworkCredential($user,$password)
$ftp.UseBinary = $true
$ftp.UsePassive = $true
$ftp.timeout = -1
#start a timer and error handling
$starttime = (get-date).ToString()
$error.Clear()
# read in the file to upload as a byte array
$content = [System.IO.File]::ReadAllBytes("$strfilename")
$ftp.ContentLength = $content.Length
# get the request stream, and write the bytes into it
$rs = $ftp.GetRequestStream()
$rs.Write($content, 0, $content.Length)
$endtime = (get-date).ToString()
#error handle
if ($error)
{
#Assemble the Query
$sqlresult = "THERE IS AN ERROR, Check the error email for details"
$sqlcontext = ($strfilename + '|' + $content.length + ' bytes')
$strquery =
"INSERT INTO [logs].[dbo].[EventLog] (SourceID, Started, Completed, Result, Context, Machine)
values (50,'$starttime','$endtime','$sqlresult','$sqlcontext','Server\MUHCLIENTMACHINE-CAFEEDSCRIPT')"
#Create Command and Execute.
$sqlCommand = $sqlConnection.CreateCommand()
$sqlCommand.CommandText = $strQuery
$sqlCommand.ExecuteNonQuery()
#Send dem emails
$emailbody = "A file for the CA Feed failed on $strfilename at " + (get-date).ToString() + " with the error '$($error[0])'"
$emailsubject = "CA Feed Failed File"
Send-MailMessage -SmtpServer $SmtpServer -to $EmailTo -from $EmailFrom -subject $EmailSubject -Body $emailbody
}
else
{
write-host ("$strfilename" + ' Ran Without Errors')
$sqlresult = "RAN WITHOUT ERRORS"
$sqlcontext = ($strfilename + '|' + $content.length + ' bytes')
$strquery =
"INSERT INTO [logs].[dbo].[EventLog] (SourceID, Started, Completed, Result, Context, Machine)
values (50,'$starttime','$endtime','$sqlresult','$sqlcontext','Server\MUHCLIENTMACHINE-CAFEEDSCRIPT')"
#Create a command object.
$sqlCommand = $sqlConnection.CreateCommand()
$sqlCommand.CommandText = $strQuery
$sqlCommand.ExecuteNonQuery()
}
# be sure to clean up after ourselves and get ready for next block
Clear-Variable -Name starttime,endtime,strfilename,sqlresult,sqlcontext,ftp
$rs.Close()
$rs.Dispose()

I don't think anybody is going to debug that much code for you. Your best bet is to find out where your time is actually going. I use a stopwatch like the one below; place the checkpoints strategically:
$SW = [System.Diagnostics.Stopwatch]::new()
$SW.Start()
#Your code block goes here
Write-Host "End of Code block 1"
$SW.Elapsed.TotalSeconds
#Another code block goes here
Write-Host "End of Code block 2"
$SW.Elapsed.TotalSeconds
Now, if you are trying to break out and it is taking 15 minutes to respond, the script is probably stuck in the middle of an operation. It cannot respond until that operation finishes or fails.
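Since the twelve blocks are identical apart from the file name, FTP URL, and credentials, another way to both narrow down the slowdown and cut the housekeeping would be to factor the repeated block into a function and time each call. A rough sketch, with placeholder names and no claim that this is exactly your setup:

```powershell
# Hypothetical refactor of one upload block; file paths, URLs, and credentials are placeholders.
function Send-FtpFeed {
    param($LocalFile, $FtpUri, $User, $Password)
    $sw = [System.Diagnostics.Stopwatch]::StartNew()
    $ftp = [System.Net.FtpWebRequest]::Create($FtpUri)
    $ftp.Method      = [System.Net.WebRequestMethods+Ftp]::UploadFile
    $ftp.Credentials = New-Object System.Net.NetworkCredential($User, $Password)
    $ftp.UseBinary   = $true
    $ftp.UsePassive  = $true
    $rs = $null
    try {
        $content = [System.IO.File]::ReadAllBytes($LocalFile)
        $ftp.ContentLength = $content.Length
        $rs = $ftp.GetRequestStream()
        $rs.Write($content, 0, $content.Length)
    }
    finally {
        if ($rs) { $rs.Close(); $rs.Dispose() }   # always release the stream, even on error
        $content = $null                          # drop the 700 MB array so it can be collected
    }
    Write-Host "$LocalFile took $($sw.Elapsed.TotalSeconds) seconds"
}

Send-FtpFeed -LocalFile '\\PATH\Luna.txt' -FtpUri 'FTPLINK1' -User 'USERNAME1' -Password 'REDACTED'
```

With the work inside a function, the local variables go out of scope after each call, so the Clear-Variable housekeeping between blocks is no longer needed.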


How to remove encryption from all objects in SQL Server?

I have more than a hundred encrypted procedures and functions that I want to decrypt (I am trying a bacpac file export, but it fails because the procedures are encrypted). I tried using the dbForge SQL Decryptor decryption wizard for an in-place ALTER, but I get the error:
Definition is invalid. Can't find CREATE keyword.
When I try to see the DDL script of a stored procedure (using dbForge SQL Decryptor), I get the error:
A definition for the object dbo.pt_blocks cannot be shown because it is encrypted by a third party tool
I cannot find a resolution to this. Are there any solutions or other tools available for this?
Edit: I found this resource which mentions
take the source code and issue an ALTER command without the encryption option. Just take the source code and remove the WITH ENCRYPTION
How could I achieve this?
EDIT: I have enabled remote DAC. How can I decrypt everything? The accepted answer from this question has a broken link.
Edit: The problem has been solved by uninstalling a third party tool which was creating encrypted procedures.
Below is a PowerShell example that creates a script file of all encrypted objects, gleaned from Paul White's The Internals of WITH ENCRYPTION article. Change the data source and initial catalog in the 2 connection strings to the desired server and database as well as script file path.
A DAC connection is used to retrieve values from system tables so sysadmin server role membership is required. If run remotely, the SQL Server remote admin connections option must be enabled and TCP port 1434 allowed through the firewall.
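If the remote admin connections option is not already on, it can be enabled once with sp_configure. A minimal sketch using the same SqlClient approach as the script below (the server name is a placeholder, and sysadmin rights are required):

```powershell
# One-time setup sketch: enable remote DAC; 'YourServer' is a placeholder.
$conn = New-Object System.Data.SqlClient.SqlConnection("Data Source=YourServer;Initial Catalog=master;Integrated Security=SSPI")
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = "EXEC sp_configure 'remote admin connections', 1; RECONFIGURE;"
[void]$cmd.ExecuteNonQuery()
$conn.Close()
```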
The script can be run from the PowerShell ISE or from a command prompt after customization. Example command-line invocation, assuming script was saved to file "Decrypt-Objects.ps1".
powershell -ExecutionPolicy RemoteSigned -File C:\PowershellScripts\Decrypt-Objects.ps1
PowerShell script:
# PowerShell implementation of T-SQL code from https://sqlperformance.com/2016/05/sql-performance/the-internals-of-with-encryption
Function Get-DecryptedString($pwd, $data) {
    $key = [System.Array]::CreateInstance([int], 256)
    $box = [System.Array]::CreateInstance([int], 256)
    $cipher = [System.Array]::CreateInstance([byte], $data.Length)
    # Standard RC4 key-scheduling algorithm (KSA)
    for ($i = 0; $i -lt 256; ++$i) {
        $key[$i] = $pwd[$i % $pwd.Length]
        $box[$i] = $i
    }
    for ($j = $i = 0; $i -lt 256; ++$i) {
        $j = ($j + $box[$i] + $key[$i]) % 256
        $tmp = $box[$i]
        $box[$i] = $box[$j]
        $box[$j] = $tmp
    }
    # Pseudo-random generation: XOR the keystream with the encrypted bytes
    for ($a = $j = $i = 0; $i -lt $data.Length; ++$i) {
        ++$a
        $a %= 256
        $j += $box[$a]
        $j %= 256
        $tmp = $box[$a]
        $box[$a] = $box[$j]
        $box[$j] = $tmp
        $k = $box[(($box[$a] + $box[$j]) % 256)]
        $cipher[$i] = ($data[$i] -bxor $k)
    }
    $decryptedString = [System.Text.Encoding]::Unicode.GetString($cipher)
    return $decryptedString
}
Function Get-ClearObjectText($connectionString, $objectName) {
    $getRc4KeyQuery = @"
DECLARE
    @objectid integer = OBJECT_ID(@ObjectName),
    @family_guid binary(16),
    @objid binary(4),
    @subobjid binary(2);
-- Find the database family GUID
SELECT @family_guid = CONVERT(binary(16), DRS.family_guid)
FROM sys.database_recovery_status AS DRS
WHERE DRS.database_id = DB_ID();
-- Convert object ID to little-endian binary(4)
SET @objid = CONVERT(binary(4), REVERSE(CONVERT(binary(4), @objectid)));
SELECT
    -- Read the encrypted value
    @imageval = SOV.imageval,
    -- Get the subobjid and convert to little-endian binary
    @subobjid = CONVERT(binary(2), REVERSE(CONVERT(binary(2), SOV.subobjid)))
FROM sys.sysobjvalues AS SOV
WHERE
    SOV.[objid] = @objectid
    AND SOV.valclass = 1;
-- Compute the RC4 initialization key
SELECT @RC4key = HASHBYTES('SHA1', @family_guid + @objid + @subobjid);
"@
    $connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
    $connection.Open()
    $command = New-Object System.Data.SqlClient.SqlCommand($getRc4KeyQuery, $connection)
    ($command.Parameters.Add("@ObjectName", [System.Data.SqlDbType]::NVarChar, 261)).Value = $objectName
    ($command.Parameters.Add("@imageval", [System.Data.SqlDbType]::VarBinary, -1)).Direction = [System.Data.ParameterDirection]::Output
    ($command.Parameters.Add("@RC4key", [System.Data.SqlDbType]::Binary, 20)).Direction = [System.Data.ParameterDirection]::Output
    [void]$command.ExecuteNonQuery()
    $imageval = $command.Parameters["@imageval"].Value
    $RC4key = $command.Parameters["@RC4key"].Value
    $connection.Close()
    $decryptedString = Get-DecryptedString -pwd $RC4key -data $imageval
    Return $decryptedString
}
# ############
# ### MAIN ###
# ############
# DAC connection string for decryption
$dacConnectionString = "Data Source=admin:YourServer;Initial Catalog=YourDatabase;Integrated Security=SSPI"
# normal connection string for encrypted object list
$connectionString = "Data Source=YourServer;Initial Catalog=YourDatabase;Integrated Security=SSPI"
# target file path for clear encrypted objects DDL
$scriptFilePath = "C:\Scripts\EncryptedObjects.sql"
[void](New-Item -Path "C:\Scripts\EncryptedObjects.sql" -ItemType file -Force) # create directory (if needed) and empty script file
$EncryptedObjectQuery = @"
SELECT
    QUOTENAME(OBJECT_SCHEMA_NAME(object_id)) + '.' + QUOTENAME(name) AS QualifiedObjectName
FROM sys.objects
WHERE OBJECTPROPERTY(object_id, 'IsEncrypted') = 1;
"@
try {
$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
$command = New-Object System.Data.SqlClient.SqlCommand($EncryptedObjectQuery, $connection)
$connection.Open()
$reader = $command.ExecuteReader()
while ($reader.Read()) {
$createObjectScript = Get-ClearObjectText -connectionString $dacConnectionString -objectName $reader["QualifiedObjectName"]
$createObjectScript | Out-File -FilePath $scriptFilePath -Append
"GO" | Out-File -FilePath $scriptFilePath -Append
}
$connection.Close()
}
catch {
throw
}

Show message after script runs with row count

I have the following PowerShell script, which works fine. However, when it is run, it flashes up and disappears very quickly.
How could I display one message if $rowsAffected is greater than or equal to 1 and a different message if $rowsAffected is 0, and have it stay on screen until Enter is pressed or the window is closed?
#Create SQL Connection
$con = New-Object "System.Data.SqlClient.SQLConnection"
#Set Connection String
$con.ConnectionString = ("Data Source=.\SQL2017;Initial Catalog=DatabaseName;user id=test;password=test;")
$con.Open()
#run query
$sqlcmd = New-Object "System.Data.SqlClient.SqlCommand"
$sqlcmd.Connection = $con
$sqlcmd.CommandTimeout = 30
$sqlcmd.CommandText = "UPDATE Execution SET Execution = 1"
$rowsAffected = $sqlcmd.ExecuteNonQuery()
$con.Close()
This looks to do the trick.
If ($rowsAffected -ge 1)
{
"Your request for the file has been successfully submitted and will be processed in the next few minutes. "
pause
}
else
{
"ERROR! No file was found. Please contact support"
pause
}
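If pause feels opaque, Read-Host does the same job and makes the wait explicit; a minimal variant of the same logic (the messages are unchanged from above):

```powershell
if ($rowsAffected -ge 1) {
    Write-Host "Your request for the file has been successfully submitted and will be processed in the next few minutes."
}
else {
    Write-Host "ERROR! No file was found. Please contact support"
}
# Blocks until Enter is pressed, keeping the window open
Read-Host -Prompt "Press Enter to exit" | Out-Null
```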

Using Powershell to Bulk Import Large CSV into SQL Server

I came across a post discussing how to use PowerShell to bulk import massive amounts of data relatively fast. I have a typical csv file with about 5 million rows, formatted in the usual way.
I keep getting the same error messages regardless of whether I import a txt or a csv file. Playing around with the csvdelimiter/firstcolumnnames section also created its own issues.
I've spent hours trying to figure out how to get it to work with MY csv files, and I keep getting the same error messages no matter what I try. All field names accept Null, and they are identical in every way between the table and the csv file. I do not have a primary key for the database.
# Database variables
$sqlserver = "SERVERNAMEHERE"
$database = "autos"
$table = "AgedAutos"
# CSV variables
$csvfile = "C:\temp\aged.csv"
$csvdelimiter = "',"
$firstRowColumnNames = $true
################### No need to modify anything below ###################
Write-Host "Script started..."
$elapsed = [System.Diagnostics.Stopwatch]::StartNew()
[void][Reflection.Assembly]::LoadWithPartialName("System.Data")
[void][Reflection.Assembly]::LoadWithPartialName("System.Data.SqlClient")
# 50k worked fastest and kept memory usage to a minimum
$batchsize = 50000
# Build the sqlbulkcopy connection, and set the timeout to infinite
$connectionstring = "Data Source=$sqlserver;Integrated Security=true;Initial Catalog=$database;"
$bulkcopy = New-Object Data.SqlClient.SqlBulkCopy($connectionstring, [System.Data.SqlClient.SqlBulkCopyOptions]::TableLock)
$bulkcopy.DestinationTableName = $table
$bulkcopy.bulkcopyTimeout = 0
$bulkcopy.batchsize = $batchsize
# Create the datatable, and autogenerate the columns.
$datatable = New-Object System.Data.DataTable
# Open the text file from disk
$reader = New-Object System.IO.StreamReader($csvfile)
$columns = (Get-Content $csvfile -First 1).Split($csvdelimiter)
if ($firstRowColumnNames -eq $true) { $null = $reader.readLine() }
foreach ($column in $columns) {
$null = $datatable.Columns.Add()
}
# Read in the data, line by line
while (($line = $reader.ReadLine()) -ne $null) {
$null = $datatable.Rows.Add($line.Split($csvdelimiter))
$i++; if (($i % $batchsize) -eq 1) {
$bulkcopy.WriteToServer($datatable)
Write-Host "$i rows have been inserted in $($elapsed.Elapsed.ToString())."
$datatable.Clear()
}
}
# Add in all the remaining rows since the last clear
if($datatable.Rows.Count -gt 0) {
$bulkcopy.WriteToServer($datatable)
$datatable.Clear()
}
# Clean Up
$reader.Close(); $reader.Dispose()
$bulkcopy.Close(); $bulkcopy.Dispose()
$datatable.Dispose()
Write-Host "Script complete. $i rows have been inserted into the database."
Write-Host "Total Elapsed Time: $($elapsed.Elapsed.ToString())"
# Sometimes the Garbage Collector takes too long to clear the huge datatable.
[System.GC]::Collect()
Error message listed below.
Exception calling "WriteToServer" with "1" argument(s): "The given value of type String from the data source cannot be converted to
type date of the specified target column."
At C:\powershell_scripts\batch_csv_import-code1-working-test for auto table.ps1:43 char:3
+ $bulkcopy.WriteToServer($datatable)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : InvalidOperationException
340000 rows have been inserted in 00:00:03.5156162
I have no idea what that error means, since I cannot find anything useful on Google. I'm thinking one of the columns might be defined incorrectly in SQL Server, but I could be wrong.
Please help me figure out the problem. Thanks.
You are getting all the data in the first column because your value for $csvdelimiter is incorrect.
You have: $csvdelimiter = "',"
It should be: $csvdelimiter = ","

Issues getting column width set when creating a csv file from Powershell

I am having to update 500+ machines at work without knowing their patching status or what they will need patched. This will just be a script that tells me which machines are online or offline, so I know what to target with other scripts that are yet to be written.
This outputs to a CSV file, and the file can be opened successfully with all of the data in the proper columns.
How do I get the columns in the CSV file to be a set width, or autosized to the data in them?
I can get the data formatted the way I would like in PowerShell; it will display properly on screen. All of the ways I have tried to export it do not produce the correct result.
Also, if possible, can background color, font, bold, etc. be set from PowerShell and sent to the CSV file? Future scripts I write will be creating CSV files with 50+ columns and over 500 rows.
# Setting Time Variable
$Script_Time = (Get-Date -Format MM-dd-yyyy--HH-mm)
# XXXXXXXX Path mapping for Computer Names, Ping Logs, CSV Name
$Root_Path = $env:USERPROFILE + "\xxxx-Working-Path----\"
$Computer_Names_Path = $root_Path + "Computer_Names\"
$Computer_Names_File = $Computer_Names_Path + "\ComputerNames_Working.txt"
$Ping_Logs_Path = $Root_Path + "Logs\Ping_Logs\"
$Log_Csv_Name = $Ping_Logs_Path + "Ping Results--" + $Script_Time + ".csv"
# Blanking/Setting needed variables
$Path_Result = ""
$Ping_Result = ""
$Name = ""
$Machine_Status = ""
$Total_Count = 0
$Online_Count = 0
$Offline_Count = 0
# Gathering Computer Names from text file
$List_Of_Computers = Get-Content "$Computer_Names_File"
# Array creation for CSV output
$Results_Array = [System.Collections.ArrayList]@()
# Testing to see if folders exist
If (Test-Path $Computer_Names_Path){
$Path_Result = "Computer Names Path Exists"
Write-Output $Path_Result
}
Else{
$Path_Result = "Computer Names Path Does Not Exist"
Write-Output $Path_Result
Pause
}
If (Test-Path $Ping_Logs_Path){
$Path_Result = "Ping Logs Csv Path Exists"
Write-Output $Path_Result
}
Else{
$Path_Result = "Ping Logs Csv Path Does Not Exist"
Write-Output $Path_Result
Pause
}
# Loop to gather online status of each machine
Foreach ($Name in $List_Of_Computers) {
# Test for True result
If (test-connection -computername $name -buffersize 16 -count 1 -quiet){
$Ping_Result = "Online"
$Online_Count = $Online_Count + 1
}
# Test for false result
Else{
$Ping_Result = "Offline"
$Offline_Count = $Offline_Count + 1
}
# Displays Online status and adds to CSV Array
$Machine_Status = $Name + "," + $Ping_Result + "," + (Get-Date -Format MM-dd-yyyy--HH-mm-ss)
Write-Output $Machine_Status
[void]$Results_Array.add($Machine_Status)
}
# Total machine count
$Total_Count = $Results_Array.Count
# Adding Information to Top 2 rows in Array
$Results_Array.Insert(0,"No of Machines: $Total_Count,Online Total: $Online_Count,Offline Total: $Offline_Count")
$Results_Array.Insert(1,"Computer Name,Online Status,Timestamp")
# Sends Array to CSV File
set-content $Log_Csv_Name -Value $Results_Array
EDIT
Applied what I had to an Excel Application output. Thanks for pointing me in that direction. It is working perfectly. I just need to finish fine-tuning the formatting I want and learn how to save the workbook with PowerShell; both, I assume, I will be able to research easily.
Here is the new script I came up with after researching and applying what I found. Hopefully it will help others if they stumble upon this.
# Setting Time Variable
Write-Output "Setting Time Variable" ""
$Script_Time = (Get-Date -Format MM-dd-yyyy--HH-mm)
# XXXXXXXX Path Mapping for Computer Names and Ping Logs
Write-Output "Setting Path Variable" ""
$Root_Path = $env:USERPROFILE + "\XXXXXXXXXXX\"
$Computer_Names_Path = $root_Path + "Computer_Names\"
$Computer_Names_File = $Computer_Names_Path + "\ComputerNames_Working.txt"
$Ping_Logs_Path = $Root_Path + "Logs\Ping_Logs\"
# Testing to see if Folders Exist
Write-Output "Testing to see if Folder Paths exist" ""
If (Test-Path $Computer_Names_Path){
$Path_Result = "Computer Names Path Exists"
Write-Output $Path_Result
}
Else{
$Path_Result = "Computer Names Path Does Not Exist"
Write-Output $Path_Result
Pause
}
If (Test-Path $Ping_Logs_Path){
$Path_Result = "Ping Logs Csv Path Exists"
Write-Output $Path_Result ""
}
Else{
$Path_Result = "Ping Logs Csv Path Does Not Exist"
Write-Output $Path_Result
Pause
}
# Creating Excel Application
Write-Output "Creating Excel and Setting up the Sheet" ""
$New_Excel = New-Object -ComObject Excel.Application
# Makes Excel Visible
$New_Excel.Application.Visible = $true
$New_Excel.DisplayAlerts = $false
# Creating Excel WorkBook
$New_Book = $New_Excel.Workbooks.Add()
# Sets the Worksheet and Names it
$New_Sheet = $New_Book.Worksheets.Item(1)
$New_Sheet.name = "Ping Results--" + (Get-Date -Format MM-dd-yyyy--HH-mm)
# Selects the Worksheet
$New_Sheet.Activate()
# Blanking/Setting Needed Variables
Write-Output "Setting Start Variables" ""
$Path_Result = ""
$Ping_Result = ""
$Name = ""
$Machine_Status = ""
$Total_Count = 0
$Online_Count = 0
$Offline_Count = 0
$Row = 1
$Column = 1
# Creates the Title for the Sheet
Write-Output "Creating Title" ""
$New_Sheet.Cells.Item($Row,$Column) = "Ping Results"
$New_Sheet.Cells.Item($Row,$Column).Font.Size = 21
$New_Sheet.Cells.Item($Row,$Column).HorizontalAlignment = -4108
$New_Sheet.Cells.Item($Row,$Column).Font.Bold = $true
# Merging Cells for the Title
$New_Sheet.Range("A1:c1").Merge()
# Skipping Two Rows to leave space for Totals to be added at the end
$row = $Row + 2
# Creating Column Headers for the Data
Write-Output "Creating Column Headers" ""
$New_Sheet.Cells.Item($Row,$Column) = "Computer Name"
$New_Sheet.Cells.Item($Row,$Column).Font.Size = 16
$New_Sheet.Cells.Item($Row,$Column).Font.ColorIndex = 1
$New_Sheet.Cells.Item($Row,$Column).Font.Bold = $True
$column++
$New_Sheet.Cells.Item($Row,$Column) = "Online Status"
$New_Sheet.Cells.Item($Row,$Column).Font.Size = 16
$New_Sheet.Cells.Item($Row,$Column).Font.ColorIndex = 1
$New_Sheet.Cells.Item($Row,$Column).Font.Bold = $True
$column++
$New_Sheet.Cells.Item($Row,$Column) = "Timestamp"
$New_Sheet.Cells.Item($Row,$Column).Font.Size = 16
$New_Sheet.Cells.Item($Row,$Column).Font.ColorIndex = 1
$New_Sheet.Cells.Item($Row,$Column).Font.Bold = $True
# Sets the Column and Row for the First Set of Data
$Row = $Row + 1
$Column = 1
# Gathers the List of Computers
Write-Output "Gathering Computer Names" ""
$List_Of_Computers = Get-Content "$Computer_Names_File"
# Loop to get the status of each machine
Write-Output "Gathering Data from each Computer" ""
Write-Output "Computer Name Online Status Timestamp"
Foreach ($Name in $List_Of_Computers){
If (test-connection -computername $name -buffersize 16 -count 1 -quiet){
$Ping_Result = "Online"
$Online_Count = $Online_Count + 1
}
Else{
$Ping_Result = "Offline"
$Offline_Count = $Offline_Count + 1
}
$Machine_Status = $Name + " " + $Ping_Result + " " + (Get-Date -Format MM-dd-yyyy--HH-mm-ss)
Write-Output $Machine_Status
$New_Sheet.Cells.Item($row,$column) = $Name
$column = $Column + 1
$New_Sheet.Cells.Item($row,$column) = $Ping_Result
$Column = $Column + 1
$New_Sheet.Cells.Item($row,$column) = (Get-Date -Format MM-dd-yyyy--HH-mm-ss)
$Row = $row + 1
$Column = 1
}
# Adding Totals Information to Row 2
$New_Sheet.Cells.Item(2,1) = "# of Machines: " + $List_Of_Computers.Count
$New_Sheet.Cells.Item(2,2) = "Online Total: " + $Online_Count
$New_Sheet.Cells.Item(2,3) = "Offline Total: " + $Offline_Count
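For the saving step still to be researched, here is a rough sketch of a script footer (the save path is an assumption built from the variables above; SaveAs, Quit, and ReleaseComObject are standard Excel interop calls). AutoFit also addresses the original column-width question, since .xlsx, unlike CSV, can store column widths and formatting:

```powershell
# Hypothetical footer: autofit columns, save the workbook, and release Excel.
[void]$New_Sheet.UsedRange.EntireColumn.AutoFit()
$Save_Path = $Ping_Logs_Path + "Ping Results--" + $Script_Time + ".xlsx"
$New_Book.SaveAs($Save_Path)
$New_Excel.Quit()
# Release COM references so EXCEL.EXE does not linger after the script ends
[void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($New_Sheet)
[void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($New_Book)
[void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($New_Excel)
```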

How to extract a test log from MTM?

I'm testing a project and all my test cases are in MTM. I'm looking for a way to extract all the test results we have in MTM into a separate file. Is there any way to do that? Please share if you have any ideas.
Thanks a lot
If you wish to export the results of an automated run, you can download the .trx (test run execution) file from the attachments section and use XSL/XSLT to create an HTML report from it (you can also use the command-line tool tcm.exe run /export to get a .trx file).
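For reference, a typical tcm.exe invocation looks roughly like this (collection URL, team project, run id, and output path are placeholders; check tcm.exe help run for the exact switches on your TFS version):

```
tcm run /export /id:42 /resultsfile:"C:\Temp\Run42.trx" /collection:http://tfsserver:8080/tfs/DefaultCollection /teamproject:MyProject
```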
But if you created the test run by manual execution, this won't be possible. The only way to get a "result file" would be to parse the result of the test run using the TFS API (in C# or PowerShell via TfsTeamProjectCollection from Microsoft.TeamFoundation.TestManagement.Client) and store it in a file.
Or you can use the TFS REST API with this PowerShell script (save as .ps1), which lets you query the JSON and extract the data you want and display it the way you want:
$RunId = Read-Host -Prompt "TFS Run Id"
$Url = "http://<tfsurl>/tfs/<CollectionName>/<TeamProject>/_apis/test/runs/$RunId/results"
$Client = New-Object System.Net.WebClient
$Client.Credentials = New-Object System.Net.NetworkCredential("<username>", "<password>", "<domain>")
$Json = $Client.DownloadString($Url) | ConvertFrom-Json
$Dict = @{}
ForEach($Test in $Json.value)
{
    $Key = "Run " + $Test.testRun.name + " [" + $Test.testRun.id + "]"
    $Val = $Test.testCase.name + " [" + $Test.testCase.id + "]" + " = " + $Test.outcome
    if (!$Dict.ContainsKey($Key))
    {
        $List = New-Object System.Collections.ArrayList
        $Dict.Add($Key, $List)
    }
    $IgnoreIndex = $Dict[$Key].Add($Val)
}
ForEach($Key in $Dict.Keys)
{
    Write-Host $Key
    ForEach($Val in $Dict[$Key])
    {
        Write-Host $Val
    }
}
Exit
(replace values like <xxx> with yours)
