shell mock --define from array: ERROR: Bad option for '--define' ("dist). Use --define 'macro expr' - arrays

I am currently writing a script that should make it easier for me to build some RPMs using mock.
The plan is to make it possible to add values for mock's (and therefore rpmbuild's) --define parameter.
The error I get if I add such a define value is
ERROR: Bad option for '--define' ("dist). Use --define 'macro expr'
When I execute the script with something as simple as ./test.sh --define "dist .el7", the "debug" output is as follows:
/usr/bin/mock --init -r epel-7-x86_64 --define "dist .el7"
If I copy this and execute it in the shell directly, it actually works. Does anybody have an idea why this is the case?
My script can be cut down to the following:
#!/bin/sh
set -e
set -u
set -o pipefail

C_MOCK="/usr/bin/mock"
MOCK_DEFINES=()

_add_mock_define() {
    #_check_parameters_count_strict 1 ${#}
    local MOCK_DEFINE="${1}"
    MOCK_DEFINES+=("${MOCK_DEFINE}")
}

_print_mock_defines_parameter() {
    if [ ${#MOCK_DEFINES[@]} -eq 0 ]; then
        return 0
    fi
    printf -- "--define \"%s\" " "${MOCK_DEFINES[@]}"
}

_mock_init() {
    local MOCK_DEFINES_STRING="$(_print_mock_defines_parameter)"
    local MOCK_PARAMS="--init"
    MOCK_PARAMS="${MOCK_PARAMS} -r epel-7-x86_64"
    [ ! "${#MOCK_DEFINES_STRING}" -eq 0 ] && MOCK_PARAMS="${MOCK_PARAMS} ${MOCK_DEFINES_STRING}"
    echo "${C_MOCK} ${MOCK_PARAMS}"
    ${C_MOCK} ${MOCK_PARAMS}
    local RC=${?}
    if [ ${RC} -ne 0 ]; then
        _exit_error "Error while mock initializing ..." ${RC}
    fi
}

while (( ${#} )); do
    case "${1}" in
        -s|--define)
            shift 1
            _add_mock_define "${1}"
            ;;
    esac
    shift 1
done

_mock_init
exit 0
_mock_init
exit 0

After asking a coworker about this question, I was pointed to this question on Unix Stack Exchange: Unix Stackexchange question
The way this problem was solved can be broken down to the following lines:
DEFINES=()
DEFINES+=(--define "dist .el7")
DEFINES+=(--define "foo bar")
/usr/bin/mock --init -r epel-7-x86_64 "${DEFINES[@]}"
Just in case somebody else stumbles upon this kind of issue.
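For completeness, here is a minimal sketch of why the flat-string version breaks (using printf instead of mock so it runs anywhere): word splitting on the unquoted string splits at every space, so the inner quote characters stay literal and the define is torn apart, while an array element survives as a single argument.

```shell
#!/bin/bash
# Flat string: word splitting yields THREE arguments, with the quote
# characters kept literally -- this is what mock complained about:
args_string='--define "dist .el7"'
printf '[%s]\n' $args_string        # [--define] ["dist] [.el7"]

# Array: "dist .el7" stays one argument, exactly as typed in a shell:
args_array=(--define "dist .el7")
printf '[%s]\n' "${args_array[@]}"  # [--define] [dist .el7]
```

This is the same reason the accepted fix above uses `"${DEFINES[@]}"`: quoting the `[@]` expansion reproduces each array element as exactly one word.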

Related

check multiple directories and output status

Hello, I am trying to run a script that checks a list of directories, then outputs the status of each directory into new fields or an array.
Thanks in advance!
#!/bin/sh
dir00="/tmp/Apple"
dir01="/tmp/Banana"
dir02="/tmp/Carrot"
dirList=("$dir00" "$dir01" "$dir02")
dirName00="Apple"
dirName01="Banana"
dirName02="Carrot"
dirNames=("$dirName00" "$dirName01" "$dirName02")
for i in "${dirList[@]}"; do
    if [ -d "$i" ]; then
        echo "Directory Not Missing: $i"
        # write to a new array (dirStatus) either 0 or 1
    else
        echo "Directory Missing: $i"
        # write to a new array (dirStatus) either 0 or 1
    fi
done
# then I can do the following:
echo "${dirNames[0]} ${dirStatus[0]}"
# expected output if Apple is missing:
# Apple 1
The test [ -d path ] sets the exit status $?. Instead of using that status implicitly in an if statement you can append it explicitly to an array.
An exit status of 0 means "yes" and everything else (usually 1) means "no".
Since you didn't specify how exactly you'd like to store the results, here are two alternatives that could be useful:
Two Regular Arrays
#! /bin/bash
path=(tmp/DIR_0{0..2})
isDir=()
for p in "${path[@]}"; do
    [ -d "$p" ]
    isDir+=($?)
done
declare -p path isDir
One Associative Array
#! /bin/bash
declare -A isDir
for p in tmp/DIR_0{0..2}; do
    [ -d "$p" ]
    isDir["$p"]=$?
done
declare -p isDir
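Combining the two-regular-arrays approach with the asker's naming (dirList, dirNames, and dirStatus are the asker's hypothetical names), a sketch of the full flow might look like this:

```shell
#!/bin/bash
dirList=(/tmp/Apple /tmp/Banana /tmp/Carrot)
dirNames=(Apple Banana Carrot)
dirStatus=()
for d in "${dirList[@]}"; do
    if [ -d "$d" ]; then
        dirStatus+=(0)   # 0 = directory exists
    else
        dirStatus+=(1)   # 1 = directory missing
    fi
done
# prints e.g. "Apple 1" if /tmp/Apple does not exist:
echo "${dirNames[0]} ${dirStatus[0]}"
```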

How to return an array from a function and get the exit code returned from the function at the same time

I have many functions which return an array.
function myfunction() {
    local -i status=0
    local -a statusmsg=()
    ... do ....
    statusmsg=($(do something ...))
    if [[ ${status} -eq 0 ]]; then
        ....
        return 0
    else
        for (( statusmsgline=0; statusmsgline<${#statusmsg[@]}; statusmsgline++ ))
        do
            printf "%s\n" "${statusmsg[${statusmsgline}]}"
        done
        return 1
    fi
}
in the script I use mapfile as suggested here How to return an array in bash without using globals?
mapfile -t returnmsg <<< "$(myfunction "${param1}" "${param2}" "${paramX}" ...)"
if [ $? -eq 1 ] ; then
... do something
fi
Using mapfile, the array is returned just as it was generated, but the exit code is always 0 (mapfile's exit code), so I can't retrieve the status code returned by the function.
I tried to use shopt -s lastpipe and shopt -so pipefail but without any success.
Is there any way to retrieve the array from the function and its exit code at the same time?
Kind Regards
Read the whole array into one string variable, evaluate the status, then pass the string variable on to mapfile.
output=$(myfunction ...)
if [ $? -eq 1 ] ; then
# do something
fi
mapfile -t array <<< "$output"
Warning: if the output of myfunction is very long the script may become slow or may even run out of memory. In this case you should write to a file or file descriptor instead.
The $() removes trailing empty lines. If those are important to you, you can write either the exit status or the output to a file. Here we write the exit status, as it is always short; writing a long output to the file system and reading it back would have more overhead.
mapfile -t array < <(myfunction ...; echo $? > /tmp/status)
if [ "$(< /tmp/status; rm -f /tmp/status)" -eq 1 ] ; then
# do something
fi
There's also a way to work without these temporary variables/files by leveraging bash's options:
shopt -s lastpipe
myfunction ... | mapfile -t array
if [ "${PIPESTATUS[0]}" -eq 1 ] ; then
# do something
fi
# you may want to do `shopt -u lastpipe` here

How to read multiple arrays in Bash and skip the array after first match

This is my bash script
I have 4 sets of arrays, and for each set I am ssh-ing to each server to find out if the /data filesystem exists. On a match it should skip the rest of that array and move on to the next array. I am unable to do this with break, as it exits the entire loop. Any ideas?
declare -a siteA=("server01" "server02" "server03")
declare -a siteB=("server04" "server05" "server06")
declare -a siteC=("server07" "server08" "server09")
declare -a siteD=("server10" "server11" "server12")
cmd='df -h | grep /data'
for i in "${siteA[@]}" "${siteB[@]}" "${siteC[@]}" "${siteD[@]}"; do
    ping -c 2 ${i} > /dev/null 2>&1
    if [[ $? -eq 0 ]] ; then
        X=$(ssh root@${i} -q $cmd 2>&1)
        if [[ $X == "/data" ]]; then
            echo "$i: has /data"
        fi
    fi
done
To continue to the next array when you find a match, wrap your loop body in a function that takes the hosts as positional parameters, and call it once per site:
has_running_host() {
    for host
    do
        [code which `break`s on a match]
    done
}
has_running_host "${siteA[#]}"
has_running_host "${siteB[#]}"
[…]
Well, although it's not very nice, you could use two loops in combination with eval:
for j in siteA siteB siteC siteD; do
    for i in $(eval echo \${${j}[@]}); do
        echo $i
    done
done
This allows you to use break in the inner loop, which jumps to the next array.
That worked for me. Getting output from a remote ssh is also a challenge.
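If eval feels too fragile, bash 4.3+ also offers namerefs: when the control variable of a for loop has the nameref attribute, each word in the list is treated as a variable name, so the inner loop can iterate each array in turn and break only skips to the next array. A minimal sketch (siteA/siteB and the hostnames are placeholders):

```shell
#!/bin/bash
siteA=(server01 server02 server03)
siteB=(server04 server05 server06)

declare -n site   # nameref: each word in the loop list names an array
for site in siteA siteB; do
    for host in "${site[@]}"; do
        echo "checking $host"
        break   # leaves only the inner loop, on to the next array
    done
done
unset -n site
```

This prints "checking server01" and "checking server04": the break abandons the rest of siteA without touching the outer loop.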

Is there a way to generate the include map for a C file?

Given:
a C file with several included header files
a bunch of include file search folders
Is there a way to generate some kind of include map for the C file?
IDEs can sometimes help locate the definition/declaration of a symbol in a header file, but I think an include map can give me better insight into how these files are related as the project gets complicated, and help identify issues such as circular includes.
ADD 1
A similar thread, but not much help:
It only generates an include hierarchy as text in the Output window when building.
And it only works for native VC++ projects, not NMAKE C projects.
Displaying the #include hierarchy for a C++ file in Visual Studio
ADD 2
I just tried the Include Manager mentioned in above thread. Though not free, it's not expensive and perfectly fits in my scenario.
Not sure if this is quite what you're after, but I was curious what a graph of this would look like, so I threw this together. It's a bit of a mess, but workable for a throw-away script:
#!/bin/bash
INCLUDE_DIRS=()
# Add any include dirs here
# INCLUDE_DIRS+=("/usr/include")
# If you want to add flags for some pkg-config modules (space
# separated)
PKG_CONFIG_PKGS=""
FOLLOW_SYS_INCLUDES=y

# Collect the compiler's default include search path.
while read dir; do
    dir="$(readlink -f "${dir}")"
    for d in "${INCLUDE_DIRS[@]}"; do
        if [ "${dir}" = "${d}" ]; then
            continue 2
        fi
    done
    INCLUDE_DIRS+=("${dir}")
done < <(echo | cc -Wp,-v -x c - -fsyntax-only 2>&1 | grep '^ ' | cut -b2-)

PROCESSED=()

while read flag; do
    if [ -n "${flag}" ]; then
        INCLUDE_DIRS+=("${flag}")
    fi
done < <(pkg-config --cflags-only-I "${PKG_CONFIG_PKGS}" | sed -E 's/-I([^ ]*)/\1\n/g')

function not_found {
    echo " \"$1\" [style=filled,color=lightgrey];"
    echo " \"$2\" -> \"$1\""
}

function handle_file {
    filename="${1}"
    for f in "${PROCESSED[@]}"; do
        if [ "${f}" = "${1}" ]; then
            echo " \"$2\" -> \"$1\""
            return
        fi
    done
    PROCESSED+=("$1")
    if [ -n "${2}" ]; then
        echo " \"${2}\" -> \"${1}\";"
    fi
    if [ ! "${FOLLOW_SYS_INCLUDES}" = "y" ]; then
        for d in "${INCLUDE_DIRS[@]}"; do
            case "${1}" in
                "${d}"/*)
                    return
                    ;;
            esac
        done
    fi
    parse_file "$1"
}

function handle_include {
    case "${1}" in
        /*)
            handle_file "${1}" "$2"
            return
            ;;
    esac
    for dir in "${INCLUDE_DIRS[@]}"; do
        if [ -f "${dir}/${1}" ]; then
            handle_file "${dir}/${1}" "$2"
            return
        fi
    done
    not_found "${1}" "${2}"
}

function handle_include_2 {
    case "${1}" in
        /*)
            handle_file "${1}" "$2"
            return
            ;;
    esac
    FILE="$(readlink -f "$(dirname "${2}")/${1}")"
    if [ -f "${FILE}" ]; then
        handle_file "${FILE}" "$2"
    fi
}

function parse_file {
    while read name; do
        handle_include "$name" "$1";
    done < <(grep '^[ \t]*#[ \t]*include <' "$1" | sed -E 's/[ \t]*#[ \t]*include ?<([^>]+)>.*/\1/')
    while read name; do
        handle_include_2 "$name" "$1" "$PWD";
    done < <(grep '^[ \t]*#[ \t]*include "' "$1" | sed -E 's/[ \t]*#[ \t]*include \"([^"]+)\"/\1/')
}

echo "digraph G {"
echo "graph [rankdir = \"LR\"];"
parse_file "$(readlink -f "${1}")" "" "$PWD"
echo "}"
Pass it a file and it will generate a graphviz file. Pipe it to dot:
$ ./include-map.sh /usr/include/stdint.h | dot -Tx11
And you have something nice to look at.
Nowadays almost any full-featured IDE has this built in; Visual Studio or JetBrains' CLion are suitable for it. You didn't say which platform or environment you use, but it may be worth a try, even with the effort of setting up a project that compiles properly.
Back in the day I also found it useful to generate documentation with doxygen; as I remember, it will create such maps/links as well.
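If a plain text tree is enough, both GCC and Clang can print the include hierarchy themselves via the -H flag (one leading dot per nesting level, written to stderr), and -MM emits a make-style list of the non-system headers a file depends on. A quick sketch (demo.c is a throw-away file created on the spot):

```shell
#!/bin/sh
# Create a trivial C file to inspect.
cat > demo.c <<'EOF'
#include <stdint.h>
int main(void) { return 0; }
EOF

# -H prints each included header, indented by nesting depth, to stderr;
# -fsyntax-only avoids producing an object file.
cc -H -fsyntax-only demo.c 2> include-tree.txt
head -n 5 include-tree.txt

# -MM prints a make dependency line listing non-system headers only.
cc -MM demo.c

rm -f demo.c include-tree.txt
```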

Bash, confusing results for different file tests (test -f)

I am confused in bash by this expression:
$ var="" # empty var
$ test -f $var; echo $? # test if such file exists
0 # and this file exists, amazing!
$ test -f ""; echo $? # let's try doing it without var
1 # and all ok
I can't understand such bash behaviour, maybe anybody can explain?
It's because the empty expansion of $var is removed before test sees it. You are actually running test -f, and thus there is only one argument to test, namely -f. According to POSIX, a single argument like -f is true because it is not empty.
From POSIX test(1) specification:
1 argument:
Exit true (0) if `$1` is not null; otherwise, exit false.
There's never a test for a file with an empty file name. Now with an explicit test -f "" there are two args and -f is recognized as the operator for "test existence of path argument".
When var is empty, $var behaves differently depending on whether it is quoted:
test -f $var # <=> test -f ==> $? is 0
test -f "$var" # <=> test -f "" ==> $? is 1
So this example tells us that we should quote $var.
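A self-contained demonstration of the difference (the echoed strings are just labels):

```shell
#!/bin/sh
var=""
# Unquoted: $var vanishes, so test gets the single argument "-f",
# which is a non-empty string, so the result is true.
if test -f $var; then echo "unquoted: true"; fi
# Quoted: test gets two arguments, -f and "", and no file with an
# empty name exists, so the result is false.
if test -f "$var"; then :; else echo "quoted: false"; fi
```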
