expect returns error "spawn id exp5 not open" running a C program

I'm running some tests on a program with the following code:
set timeout -1
set filename "test"
set programName "./library"
spawn rm -f $filename.db $filename.ind
spawn ./$programName first_fit $filename
expect "Type command and argument/s."
expect "exit"
The program output is the following:
Type command and argument/s.
exit
Both lines are written using printf, and the next statement that executes is fgets().
expect outputs the following error:
expect: spawn id exp5 not open
while executing
"expect "exit""
(file "add_data_test.sh" line 16)


Best way to read ERRORLEVEL codes for a windows executable, executed from within a TCL script

I am pretty new to the Tcl world, so please excuse any naive questions.
I am trying to execute a Windows executable from a Tcl procedure. At the same time, I want to read the %errorlevel% returned by the Windows executable and print meaningful messages to the Tcl shell.
Ex:
I have a Windows executable, "test.exe arg1", that returns various exit codes:
0 - The script executed successfully.
1 - The user interrupted the process manually; the process exited.
2 - The user login was not found; the process exited.
3 - "arg1" was not specified; the process exited.
In my Tcl script, I have the following:
set result [catch {exec cmd /c test.exe arg1}]
if { $result == 3 } {
    puts "Argument undefined"
} elseif { $result == 2 } {
    puts "Login Failed"
} elseif { $result == 1 } {
    puts "Process Cancelled by user"
} elseif { $result == 0 } {
    puts "Command successful"
}
It appears that the return value of the catch command is only ever 1 or 0, so it does not carry the %errorlevel% information from the Windows executable.
What is the best way to trap the %errorlevel% info from the Windows executable and process appropriate error messages using Tcl?
The catch command takes two optional arguments: "resultVarName" and "optionsVarName". If you use those, you can examine the second one for the return code:
catch {exec cmd /c test.exe arg1} output options
puts [dict get $options -errorcode]
That would report something like: CHILDSTATUS 15567 1
The fields represent the error type, process ID, and the exit code. So you should check that error type is "CHILDSTATUS" before taking that last number as the exit code. Other error types will have different data. This is actually more easily done with the try command:
try {
    exec cmd /c test.exe arg1
} on ok {output} {
    puts "Command successful"
} trap {CHILDSTATUS} {output options} {
    set result [lindex [dict get $options -errorcode] end]
    if {$result == 3} {
        puts "Argument undefined"
    } elseif {$result == 2} {
        puts "Login Failed"
    } elseif {$result == 1} {
        puts "Process Cancelled by user"
    }
}
Note: I tested this on Linux, but it should work very similarly on Windows.
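The same branch-on-exit-code pattern can be sketched in POSIX shell for comparison. This is only an illustration: run_tool is a hypothetical stand-in for "test.exe arg1" that simply exits with code 3 (the "arg1 not specified" case from the question).

```shell
#!/bin/sh
# Stand-in for "test.exe arg1": a hypothetical command that exits
# with code 3 ("arg1 is not specified" in the example above).
run_tool() { sh -c 'exit 3'; }

run_tool
case $? in
    0) echo "Command successful" ;;
    1) echo "Process Cancelled by user" ;;
    2) echo "Login Failed" ;;
    3) echo "Argument undefined" ;;
esac
```

This prints "Argument undefined", since $? is examined immediately after the command returns.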

Is there a way to read stdin without blocking script's execution using a vbscript?

I am trying to find a way to read stdin without blocking my VBScript's execution, but so far with no luck.
What I want to achieve is the following (written in sh shell script):
for i in {1..3}; do
    read input
    echo $input
    sleep 1
    if [ "$input" == "done" ]; then
        echo "process done"
        exit
    fi
done
I tried the following in VBScript, but the script hangs on the first iteration, waiting for Enter before it can proceed:
input = ""
For i = 1 To 3
    WScript.Echo i
    WScript.Sleep 100
    If Not WScript.StdIn.AtEndOfStream Then
        input = input & WScript.StdIn.ReadLine()
        If input = "done" Then
            WScript.Echo "process done"
        End If
    End If
Next
Is there a way not to block my script while reading stdin?
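For comparison only (this is not a VBScript answer): in bash, the non-blocking behaviour the sh loop above is after can be approximated with read -t, which waits at most a given number of seconds instead of blocking until Enter is pressed. A minimal sketch:

```shell
#!/bin/bash
# Poll stdin up to 3 times; read -t 1 waits at most one second per
# attempt instead of blocking indefinitely on Enter.
for i in 1 2 3; do
    if read -t 1 input; then
        echo "$input"
        if [ "$input" = "done" ]; then
            echo "process done"
            exit 0
        fi
    fi
done
```

When read times out it returns a non-zero status, so the loop simply moves on to the next iteration.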

Batch script terminates in case of error when using pipe operator

I need to perform error handling (check ERRORLEVEL) on an operation involving the pipe operator, but instead of the script continuing with a non-zero ERRORLEVEL, it terminates immediately. How can I avoid this behavior?
Consider the following example. (Note that this is a simplified, constructed example to illustrate the problem, not a meaningful script.)
someinvalidcommand
echo nextline
This will result in
> 'someinvalidcommand' is not recognized as ... command...
> nextline
In other words, the script continues after the error.
Now consider
echo firstline | someinvalidcommand
echo nextline
This will result in only
> 'someinvalidcommand' is not recognized as ... command ...
That is, it terminates before evaluating "echo nextline"
Why this behavior, and how can I avoid it? The purpose is to perform something similar to
someoperation | someotheroperation
IF NOT %ERRORLEVEL% == 0 (
    handleerror
)
but the error handling has no effect, since the script stops early.
Delegate it to another cmd instance:
cmd /c "someoperation | someotheroperation"
if errorlevel 1 (
    handleerror
)
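The cmd.exe behaviour aside, a closely related "pipeline hides the first command's status" problem exists in Unix shells, where $? only reflects the last command in a pipeline. A bash sketch of the analogous fix (the command name is deliberately nonexistent):

```shell
#!/bin/bash
# By default, $? after a pipeline is the status of the LAST command,
# so the failure of the first stage is invisible.
nosuchcommand_xyz 2>/dev/null | cat
echo "status without pipefail: $?"

# pipefail makes the pipeline report the rightmost non-zero status.
set -o pipefail
nosuchcommand_xyz 2>/dev/null | cat
echo "status with pipefail: $?"
```

The first echo reports 0 (cat succeeded); the second reports 127 (command not found).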

How to execute a command using an array containing strings with spaces (Linux)

I have been stuck on an issue for a couple of hours now regarding bash shell arrays.
I am using an array of strings created from an inputFile using IFS=$'\n' as the separator.
Each index of the array may contain multiple words separated by spaces. Each string represents a combination of Options/Arguments that may be used with the execution of another shell script.
I attempt to supply these strings to the other shell script using variations of the following syntax:
# Run the testcases, case by case
for testcase in `cat $inputFile` ###"${testcases[@]}"
#for ((i = 0; i < ${#testcases[@]}; i++))
do
    #testcase="${testcases[i]}"
    # Print each case in inputFile
    echo "${testcase}"
    # Run the Archive Script, redirect output to /dev/null
    ./archive.sh "${testcase}" > /dev/null
    # Respond to $? return value
done
You will notice a few commented-out variations I have used to loop through the array (or to read directly into a local variable from the output of cat).
What is most frustrating is that the echo command works, and prints the string with spaces. However, when my script executes archive.sh with the string, the first word is read as a whole, but everything after the first space is parsed a single character at a time. :(
Please help! Here is the output I get when executing the script:
$ test_archive.sh lab6_testcase
Testcases Count: 6
Running cases:
-v dirA
./archive.sh: illegal option --
./archive.sh: illegal option -- d
./archive.sh: illegal option -- i
./archive.sh: illegal option -- r
./archive.sh: illegal option -- A
dirB
-v -r dirA
./archive.sh: illegal option --
./archive.sh: illegal option -- -
./archive.sh: illegal option -- r
./archive.sh: illegal option --
./archive.sh: illegal option -- d
./archive.sh: illegal option -- i
./archive.sh: illegal option -- r
./archive.sh: illegal option -- A
-v dirA dirB
./archive.sh: illegal option --
./archive.sh: illegal option -- d
./archive.sh: illegal option -- i
./archive.sh: illegal option -- r
./archive.sh: illegal option -- A
./archive.sh: illegal option --
./archive.sh: illegal option -- d
./archive.sh: illegal option -- i
./archive.sh: illegal option -- r
./archive.sh: illegal option -- B
dirc
-a
./archive.sh: illegal option -- a
TEST SUMMARY
SUCCESSFUL CASES:0
USAGE CASES:0
INVALID ARGUMENT CASES:6
-v dirA
dirB
-v -r dirA
-v dirA dirB
dirc
-a
Here is the full script, in case I have missed some key line triggering the error:
#!/bin/bash
# CONSTANTS
SUCCESS=0
USAGE_ERROR=1
INVALID_ARGUMENT=2
# Set the Input Field Separator to '\n'
IFS=$'\n'
# Setup counters
usageErrorCount=0
argumentErrorCount=0
successCount=0
# Check if not enough or too many arguments have been supplied
if (( $# != 1 ))
then
    echo Usage: $0 filename
    exit $USAGE_ERROR
fi
# Store the File in inputFile
inputFile=$1
# Check if the inputFile exists
if [[ ! -f $inputFile ]]
then
    # Report that the file does not exist
    echo "Exiting: The input file '$inputFile' does not exist!"
    exit $INVALID_ARGUMENT
fi
# Read the lines from the file, and place them in the array
testcases=( `cat $inputFile` )
echo Testcases Count: ${#testcases[@]}
# Inform user of case list
echo Running cases:
# Run the testcases, case by case
for testcase in `cat $inputFile` ###"${testcases[@]}"
#for ((i = 0; i < ${#testcases[@]}; i++))
do
    #testcase="${testcases[i]}"
    # Print each case in inputFile
    echo "${testcase}"
    # Run the Archive Script, redirect output to /dev/null
    ./archive.sh "${testcase}" > /dev/null
    # Use Switch Statement on Exit Status ($?) to:
    # 1. Increment counters
    # 2. Add testcase to appropriate array
    case $? in
        $USAGE_ERROR) # Add testcase to usage array
            usageCases[$usageErrorCount]="$testcase"
            # Up Usage Count
            usageErrorCount=$((usageErrorCount+1))
            ;;
        $INVALID_ARGUMENT) # Add testcase to argument array
            argumentCases[$argumentErrorCount]="$testcase"
            # Up Argument Count
            argumentErrorCount=$((argumentErrorCount+1))
            ;;
        $SUCCESS) # Add testcase to success array
            successCases[$successCount]="$testcase"
            # Up Success Count
            successCount=$(($successCount+1))
            ;;
    esac
done
# Format the Output
echo "TEST SUMMARY"
# Report Successful Cases
echo "SUCCESSFUL CASES:$successCount"
for testcase in ${successCases[@]}
do
    echo $testcase
done
# Report Usage Cases
echo "USAGE CASES:$usageErrorCount"
for testcase in ${usageCases[@]}
do
    echo $testcase
done
# Report Invalid Argument Cases
echo "INVALID ARGUMENT CASES:$argumentErrorCount"
for testcase in ${argumentCases[@]}
do
    echo $testcase
done
# Exit with Success
exit $SUCCESS
The problem is that your IFS is still set to $'\n', so ./archive.sh receives the whole line as a single argument and tries to make sense of its letters.
Change IFS back to ' ' and the problem should be gone!
Example:
# Run the testcases, case by case
for testcase in "${testcases[@]}"
do
    IFS=' '
    echo "${testcase}"
    ./archive.sh "${testcase}" > /dev/null
done
I found a solution to the issue after some digging.
The issue is that the shell parses the command line before the contents of the variable (or array element) are substituted, so the embedded spaces are never treated as argument separators.
Because of this, eval can be used to force the shell to re-parse the command after the variable's contents have been substituted, which splits the string into separate arguments again.
Here is a working version (reading directly from file):
for testcase in `cat $inputFile`
do
    # Print each case in inputFile
    echo " ${testcase}"
    # Run the Archive Script, redirect output to /dev/null
    eval ./lab5.sh ${testcase} > /dev/null
    # ... CONTINUE WITH SCRIPT ...
done
./archive.sh "${testcase}"
You are calling archive.sh with a single argument (with embedded whitespace). That's what double quotes do. Everything inside double quotes is a single word.
Since the first character of the argument is a dash, archive.sh takes the rest of the argument (including the embedded space) as options, and complain about them not being valid options.
You want to drop the quotes.
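A third option, offered as a hedged sketch rather than as one of the answers above: keep the line quoted while reading it, then split it explicitly into an array with read -ra, so each word becomes its own argument without eval and without exposing the string to glob expansion.

```shell
#!/bin/bash
# Split one testcase line into words on the default IFS, then pass
# the words as separate arguments.
testcase="-v dirA dirB"
read -ra args <<< "$testcase"
echo "argument count: ${#args[@]}"
printf 'arg: %s\n' "${args[@]}"
# In the loop above this would become:
#   ./archive.sh "${args[@]}" > /dev/null
```

Here "${args[@]}" expands to three separate words (-v, dirA, dirB), which is exactly what archive.sh's option parsing expects.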

How to catch signals from a shell script which is executing a program that may raise one?

I have a shell script which executes a program, fuseIO, which is basically a C program.
The idea is that this executable fuseIO may raise a SIGABRT via an abort() call inside it, in which case the while loop should exit.
How to accomplish that?
i=0
while [ $i != 10 ]
do
    echo "************ Iteration: $i *********\n" 2>&1 | tee -a log.txt
    ./fuseIO 2>&1 | tee -a log.txt   # This may raise SIGABRT
    i=`expr $i + 1`
    sleep 1
done
In part, see Exit status codes greater than — possible? The WIFSIGNALED stuff tells you that a process was signalled, but there's a problem for the shell encoding that, and it does it by encoding a signalled exit status as 128 + signal number (129 for HUP, 130 for INT, etc). To demonstrate shell and signal exit statuses:
$ cat killme.sh
#!/bin/bash
kill ${1:-"-INT"} $$
$ ./killme.sh -HUP; echo $?
Hangup: 1
129
$ ./killme.sh -TERM; echo $?
Terminated: 15
143
$ ./killme.sh -QUIT; echo $?
0
$ ./killme.sh -PIPE; echo $?
141
$ ulimit -a
core file size (blocks, -c) 0
...
$
This more or less justifies my '128+signum' claim (the -QUIT behaviour is unexpected, but explainable after a fashion — it normally dumps core, but didn't because ulimit has them disabled).
In bash, you can get 'a list of exit status values from the processes in the most-recently-executed foreground pipeline (which may contain only a single command)' via the array $PIPESTATUS. For example:
$ ./killme.sh | exit 31
$ echo ${PIPESTATUS[*]}
130 31
$
This corresponds to the 130 exit status for SIGINT (2) plus the explicit exit status 31. Note the notation: ${PIPESTATUS[0]} with the braces around the indexing. Double quotes and things work like $* vs $# vs "$*" vs "$#" for getting all the values in the array.
Applied to your two-part pipeline, you should be able to test:
if [[ ${PIPESTATUS[0]} == 134 ]]
then : FuseIO crash with SIGABRT
fi
Without the | tee, you can simply test $?:
if [[ $? -gt 128 && $? -lt 160 ]]
then : died a signalled death (probably)
else : died an ordinary death
fi
The 160 here is also an informed guess; it should be 128+SIGRTMAX. Note that if a process does exit 135, it will be treated as if it was signalled (even if it was not).
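Applied concretely, here is a bash sketch of detecting a SIGABRT in the first stage of a tee pipeline. The abort() in fuseIO is simulated with kill -ABRT; 134 = 128 + 6, SIGABRT's signal number.

```shell
#!/bin/bash
# Simulate fuseIO aborting: the first pipeline stage kills itself
# with SIGABRT, while tee (the last stage) exits 0.
bash -c 'kill -ABRT $$' 2>/dev/null | tee -a /dev/null
status=${PIPESTATUS[0]}
echo "first stage exited with: $status"
if [[ $status == 134 ]]; then
    echo "SIGABRT detected (128 + 6)"
fi
```

Note that $? alone would report tee's status (0) here; only PIPESTATUS[0] reveals that the first stage was signalled.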
