Best way to read ERRORLEVEL codes from a Windows executable executed from within a Tcl script - try/catch

I am pretty new to the Tcl world, so please excuse any naive questions.
I am trying to execute a Windows executable from a Tcl procedure. At the same time, I want to read the %errorlevel% returned by the executable and print some meaningful messages to the Tcl shell.
Example:
I have a Windows executable, "test.exe arg1", that returns various error codes depending on what happened:
0 - The script executed successfully
1 - The user interrupted the process manually, and the process exited.
2 - The user login was not found; the process exited.
3 - "arg1" was not specified; the process exited.
In my Tcl script, I have the following:
set result [catch {exec cmd /c test.exe arg1}]
if { $result == 3 } {
    puts "Argument undefined"
} elseif { $result == 2 } {
    puts "Login Failed"
} elseif { $result == 1 } {
    puts "Process Cancelled by user"
} elseif { $result == 0 } {
    puts "Command successful"
}
It appears that the return value of the catch command is either 1 or 0, and it does not reflect the %errorlevel% set by the Windows executable.
What is the best way to trap the %errorlevel% from the Windows executable and print appropriate error messages using Tcl?

The catch command takes two optional arguments: "resultVarName" and "optionsVarName". If you use those, you can examine the second one for the return code:
catch {exec cmd /c test.exe arg1} output options
puts [dict get $options -errorcode]
That would report something like: CHILDSTATUS 15567 1
The fields represent the error type, process ID, and the exit code. So you should check that the error type is "CHILDSTATUS" before taking that last number as the exit code. Other error types will have different data. This is actually more easily done with the try command:
try {
    exec cmd /c test.exe arg1
} on ok {output} {
    puts "Command successful"
} trap {CHILDSTATUS} {output options} {
    set result [lindex [dict get $options -errorcode] end]
    if {$result == 3} {
        puts "Argument undefined"
    } elseif {$result == 2} {
        puts "Login Failed"
    } elseif {$result == 1} {
        puts "Process Cancelled by user"
    }
}
Note: I tested this on Linux, but it should work very similarly on Windows.
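If your Tcl predates try (added in 8.6) but supports the options argument to catch (8.5 and later), roughly the same check can be done with plain catch. This is only a sketch, reusing the hypothetical test.exe and exit codes from the question:
# Run the command; catch returns non-zero when the child exits
# with a non-zero status (or when exec itself fails).
if {[catch {exec cmd /c test.exe arg1} output options]} {
    set errcode [dict get $options -errorcode]
    if {[lindex $errcode 0] eq "CHILDSTATUS"} {
        # -errorcode looks like: CHILDSTATUS <pid> <exitCode>
        set result [lindex $errcode end]
        switch -- $result {
            1 { puts "Process Cancelled by user" }
            2 { puts "Login Failed" }
            3 { puts "Argument undefined" }
        }
    } else {
        # Some other failure, e.g. the command could not be started.
        puts "exec failed: $output"
    }
} else {
    puts "Command successful"
}
The switch is only there to keep the mapping compact; the if/elseif chain from the question works just as well.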

Related

expect returns error "spawn id exp5 not open" when running a C program

I'm running some tests on a program with the following code:
set timeout -1
set filename "test"
set programName "./library"
spawn rm -f $filename.db $filename.ind
spawn ./$programName first_fit $filename
expect "Type command and argument/s."
expect "exit"
The program output is the following:
Type command and argument/s.
exit
Both lines are written using printf, and the next line that executes is fgets().
expect outputs the following error:
expect: spawn id exp5 not open
while executing
"expect "exit""
(file "add_data_test.sh" line 16)

Is there a way to read stdin without blocking the script's execution using VBScript?

I am trying to find a way to read stdin without blocking my VBScript's execution, but so far no luck.
What I want to achieve is the following (written in sh shell script):
for i in {1..3}; do
    read input;
    echo $input;
    sleep 1;
    if [ "$input" == "done" ]; then
        echo "process done";
        exit;
    fi
done
I tried the following in VBScript, but the script hangs in the first iteration, waiting for Enter in order to proceed:
input = ""
For i = 1 To 3
    WScript.Echo i
    WScript.Sleep 100
    If WScript.StdIn.AtEndOfStream Then
        input = input & WScript.StdIn.ReadLine()
        If input = "done" Then
            WScript.Echo "process done"
        End If
    End If
Next
Is there a way not to block my script while reading stdin?

Batch script terminates in case of error when using pipe operator

I need to perform error handling (check ERRORLEVEL) on an operation involving the pipe operator, but instead of the script continuing with a non-zero ERRORLEVEL, it terminates immediately. How can I avoid this behavior?
Consider the following example. (Note that this is a simplified, constructed example to illustrate the problem, not a meaningful script.)
someinvalidcommand
echo nextline
This will result in
> 'someinvalidcommand' is not recognized as ... command...
> nextline
In other words, the script continues after the error.
Now consider
echo firstline | someinvalidcommand
echo nextline
This will result in only
> 'someinvalidcommand' is not recognized as ... command ...
That is, it terminates before evaluating "echo nextline".
Why this behavior, and how can I avoid it? The purpose is to perform something similar to
someoperation | someotheroperation
IF NOT %ERRORLEVEL% == 0 (
    handleerror
)
but the error handling has no effect since it stops early.
Delegate it to another cmd instance
cmd /c" someoperation | someotheroperation "
if errorlevel 1 (
handleerror
)
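Applied to the simplified example from the question, the workaround would look roughly like this (a sketch; the exact ERRORLEVEL set by the failing command may vary, but it will be non-zero):
rem The pipeline runs inside a child cmd instance
cmd /c "echo firstline | someinvalidcommand"
if errorlevel 1 (
    echo pipeline failed
)
echo nextline
Because the failing pipeline runs in the child cmd instance, only that instance exits early; the outer script sees its exit code in ERRORLEVEL and continues with the next line.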

Is it possible for sshkit capture to not error when the command executed returns nothing

What I'm trying to achieve is a Capistrano 3 task that greps a log file on all servers. This would save a lot of time, as we have a lot of servers, so doing it manually, or even scripted but sequentially, takes ages.
I have a rough-around-the-edges task that actually works, except when one of the servers returns nothing for the grep. In that case the whole command falls over.
Hence I'm wondering whether there is a way to make capture accept empty returns.
namespace :admin do
  task :log_grep, :command, :file do |t, args|
    command = args[:command] || 'ask for a command'
    file = args[:file] || 'log_grep_results'
    outs = {}
    on roles(:app), in: :parallel do
      outs[host.hostname] = capture(:zgrep, "#{command}")
    end
    File.open(file, 'w') do |fh|
      outs.each do |host, out|
        fh.write(out)
      end
    end
  end
end
Should anyone else come to this question, here's the solution: raise_on_non_zero_exit: false
I wanted:
resp = capture %([ -f /var/run/xxx/xxx.pid ] && echo "ok")
error:
SSHKit::Command::Failed: [ -f /var/run/xxx/xxx.pid ] && echo "ok" exit status: 1
[ -f /var/run/xxx/xxx.pid ] && echo "ok" stdout: Nothing written
[ -f /var/run/xxx/xxx.pid ] && echo "ok" stderr: Nothing written
solution:
resp = capture %([ -f /var/run/xxx/xxx.pid ] && echo "ok"), raise_on_non_zero_exit: false
# resp => ""
So the workaround I used was to start adding what I'm calling Capistrano utility scripts to the repo; Capistrano then runs these scripts. All each script does is wrap a grep plus some logic to output something if the result is empty.
Capistrano code:
namespace :utils do
  task :log_grep, :str, :file, :save_to do |t, args|
    command_args = "#{args[:str]} #{args[:file]}"
    outs = {}
    on roles(:app), in: :parallel do
      outs[host.hostname] = capture(:ruby, "#{fetch(:deploy_to)}/current/bin/log_grep.rb #{args[:str]} #{args[:file]}")
    end
    file = args[:save_to]
    file ||= 'log_grep_output'
    File.open(file, 'w') do |fh|
      outs.each do |host, out|
        s = "#{host} -- #{out}\n"
        fh.write(s)
      end
    end
  end
end
Ruby script log_grep.rb:
a = `zgrep #{ARGV[0]} #{ARGV[1]}`
if a.empty?
  puts 'Nothing Found'
else
  puts a
end

PowerShell: break nested loops

PowerShell's break statement is supposed to be able to exit nested loops when given a label, but it just doesn't work. Here's my code:
$timestampServers = @(
    "http://timestamp.verisign.com/scripts/timstamp.dll",
    "http://timestamp.comodoca.com/authenticode",
    "http://timestamp.globalsign.com/scripts/timstamp.dll",
    "http://www.startssl.com/timestamp"
)
:outer for ($retry = 2; $retry -gt 0; $retry--)
{
    Write-Host retry $retry
    foreach ($timestampServer in $timestampServers)
    {
        Write-Host timestampServer $timestampServer
        & $signtoolBin sign /f $keyFile /p "$password" /t $timestampServer $file
        if ($?)
        {
            Write-Host OK
            break :outer
        }
    }
}
if ($retry -eq 0)
{
    WaitError "Digitally signing failed"
    exit 1
}
It prints the following:
retry 2
timestampServer http://timestamp.verisign.com/scripts/timstamp.dll
Done Adding Additional Store
Successfully signed and timestamped: C:\myfile.dll
OK
retry 1
timestampServer http://timestamp.verisign.com/scripts/timstamp.dll
Done Adding Additional Store
Successfully signed and timestamped: C:\myfile.dll
OK
ERROR: Digitally signing failed
What have I done wrong?
Can I have goto and labels, please?
I'm using Windows 7 and, I guess, PowerShell 2.0. This script needs to run on at least PS 2.
You do not add the colon when using break with a loop label. This line:
break :outer
should be written like this:
break outer
For a further demonstration, consider this simple script:
:loop while ($true)
{
    while ($true)
    {
        break :loop
    }
}
When executed, it will run forever without breaking. This script however:
:loop while ($true)
{
    while ($true)
    {
        break loop
    }
}
exits as it should because I changed break :loop to break loop.
So, I changed the code a bit to make things clearer:
$timestampServers = @(
    "http://timestamp.verisign.com/scripts/timstamp.dll",
    "http://timestamp.comodoca.com/authenticode",
    "http://timestamp.globalsign.com/scripts/timstamp.dll",
    "http://www.startssl.com/timestamp"
)
:outer for ($retry = 2; $retry -gt 0; $retry--)
{
    Write-Host retry $retry
    foreach ($timestampServer in $timestampServers)
    {
        Write-Host timestampServer $timestampServer
        #& $signtoolBin sign /f $keyFile /p "$password" /t $timestampServer $file
        if ($true)
        {
            break :outer
            Write-Host OK
        }
    }
}
if ($retry -eq 0)
{
    Write-Error "Digitally signing failed" ## you have a typo there
    exit 1
}
This produces the following:
retry 2
timestampServer http://timestamp.verisign.com/scripts/timstamp.dll
retry 1
timestampServer http://timestamp.verisign.com/scripts/timstamp.dll
C:\temp\t.ps1 : Digitally signing failed
+ CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
+ FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,t.ps1
So it skips Write-Host OK, but it also seems to keep looping. In other words, it acts like a 'continue' statement.
I changed it as the folks here suggested and removed the ':', although the PowerShell documentation does not rule it out:
if ($true)
{
    break outer
    Write-Host OK
}
I get the correct behavior.
retry 2
timestampServer http://timestamp.verisign.com/scripts/timstamp.dll
Long story short... do not use ':'
