Output parts of a heredoc conditionally - arrays

I've made a script that asks the user for input and then evaluates the input from that file. I was wondering if I could put if/else statements inside the heredoc passed to cat.
InfoCreateFile() {
touch $InfoFile
cat > "${InfoFile}" <<- 'EOF'
########################################
# System Information #
########################################
System IP=""
Domain=""
if [ "${panelinstall}" == "1" ]; then
Panel Subdomain=""
fi
if [ "${nodeinstall}" == "1" ]; then
Node Subdomain=""
fi
EOF
}
I know this is possible with arrays, but I'd like to do it without them; arrays are a bit of a pain. Anyway, if there's no other solution, could anyone give me an example of how I would do that using arrays?

In general, I'd just avoid cat here altogether; it's not giving you any value.
exec 3>"$InfoFile" # create and open output file
emit() { printf '%s\n' "$@" >&3; } # write each argument as a line in that file
emit '########################################'
emit '# System Information #'
emit '########################################'
emit ''
emit 'System IP=""'
emit 'Domain=""'
if [ "$panelinstall" = 1 ]; then
emit 'Panel Subdomain=""'
elif [ "$nodeinstall" = 1 ]; then
emit 'Node Subdomain=""'
fi
exec 3>&- # close output file
...is a lot simpler (and also more efficient to run) than something that forces use of a heredoc even when it's not called for, like:
cat >"$InfoFile" <<EOF
########################################
# System Information #
########################################
System IP=""
Domain=""
$(if [ "$panelinstall" = 1 ]; then
echo 'Panel Subdomain=""'
elif [ "$nodeinstall" = 1 ]; then
echo 'Node Subdomain=""'
fi)
EOF
Note how we use just <<EOF, not <<'EOF'; leaving out the quotes lets you do parameter expansions, command substitutions, &c inside the heredoc.
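Since the question also asks what the array version would look like: here is a rough sketch of that approach, collecting the lines in an array and writing them with a single printf (it assumes the same InfoFile, panelinstall and nodeinstall variables as the original script):
lines=(
    '########################################'
    '# System Information #'
    '########################################'
    ''
    'System IP=""'
    'Domain=""'
)
if [ "$panelinstall" = 1 ]; then
    lines+=('Panel Subdomain=""')
fi
if [ "$nodeinstall" = 1 ]; then
    lines+=('Node Subdomain=""')
fi
printf '%s\n' "${lines[@]}" > "$InfoFile"   # each array element becomes one line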

Related

How to return an array from a function and get the exit code returned from the function at the same time

I have many functions which return an array.
function myfunction() {
local -i status=0
local -a statusmsg=()
... do ....
statusmsg=($(do something ...))
if [[ ${status} -eq 0 ]]; then
....
return 0
else
for (( statusmsgline=0; statusmsgline<${#statusmsg[@]}; statusmsgline++ ))
do
printf "%s\n" "${statusmsg[${statusmsgline}]}"
done
return 1
fi
}
In the script I use mapfile, as suggested here: How to return an array in bash without using globals?
mapfile -t returnmsg <<< "$(myfunction "${param1}" "${param2}" "${paramX}" ...)"
if [ $? -eq 1 ] ; then
... do something
fi
Using mapfile, the array is returned just as it was generated, but the exit code is always 0 (mapfile's exit code), so I can't retrieve the status code returned by the function.
I tried to use shopt -s lastpipe and shopt -so pipefail but without any success.
Is there any way to retrieve the array from the function and the exit code at the same time?
Kind Regards
Read the whole array into one string variable, evaluate the status, then pass the string variable on to mapfile.
output=$(myfunction ...)
if [ $? -eq 1 ] ; then
# do something
fi
mapfile -t array <<< "$output"
Warning: if the output of myfunction is very long the script may become slow or may even run out of memory. In this case you should write to a file or file descriptor instead.
The $() strips trailing newlines, so trailing empty lines are lost. If those are important to you, then you can write either the exit status or the output to a file. Here we write the exit status, as it is always short. Writing long output to the file system and reading it back afterwards would have more overhead.
mapfile -t array < <(myfunction ...; echo $? > /tmp/status)
if [ "$(cat /tmp/status; rm -f /tmp/status)" -eq 1 ] ; then
# do something
fi
There's also a way to work without these temporary variables/files by leveraging bash's options:
shopt -s lastpipe
myfunction ... | mapfile -t array
if [ "${PIPESTATUS[0]}" -eq 1 ] ; then
# do something
fi
# you may want to do `shopt -u lastpipe` here
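For completeness, here is a minimal, self-contained sketch of the first variant, with a stub myfunction standing in for the real one (names are illustrative only):
#!/usr/bin/env bash

myfunction() {
    printf '%s\n' "first message" "second message"
    return 1                       # simulate a failure status
}

output=$(myfunction)               # $? is the function's return code here
status=$?

mapfile -t returnmsg <<< "$output" # split the captured output into an array

if [ "$status" -eq 1 ]; then
    printf 'failed with %d message line(s):\n' "${#returnmsg[@]}"
    printf '  %s\n' "${returnmsg[@]}"
fi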

Create associative array from grep output

I have a grep output and I'm trying to make an associative array from the output that I get.
Here is my grep output:
"HardwareSerialNumber": "123456789101",
"DeviceId": "devid1234",
"HardwareSerialNumber": "111213141516",
"DeviceId": "devid5678",
I want to use that output to define an associative array, like this:
array[123456789101]=devid1234
array[11213141516]=devid5678
Is that possible? I'm new at making arrays. I hope someone could help me in my problem.
Either pipe your grep output to a helper script with a while loop that uses a simple 0/1 toggle to pair up consecutive lines, taking the last field of each line to fill your array, e.g.
#!/bin/bash
declare -A array
declare -i n=0
arridx=
while read -r label value; do # read 2 fields
if [ "$n" -eq 0 ]
then
arridx="${value:1}" # strip 1st and lst 2 chars
arridx="${arridx:0:(-2)}" # save in arridx (array index)
((n++)) # increment toggle
else
arrval="${value:1}" # strip 1st and lst 2 chars
arrval="${arrval:0:(-2)}" # save in arrval (array value)
array[$arridx]="$arrval" # assign to associative array
n=0 # zero toggle
fi
done
for i in ${!array[@]}; do # output array
echo "array[$i] ${array[$i]}"
done
Or you can use process substitution containing the grep command within the script to do the same thing, e.g.
done < <( your grep command )
You can also add a check under the else clause, e.g. if [[ $label =~ DeviceId ]], to validate that you are on the right line and to catch any variation in the grep output content.
Example Input
$ cat dat/grepout.txt
"HardwareSerialNumber": "123456789101",
"DeviceId": "devid1234",
"HardwareSerialNumber": "111213141516",
"DeviceId": "devid5678",
Example Use/Output
$ cat dat/grepout.txt | bash parsegrep2array.sh
array[123456789101] devid1234
array[111213141516] devid5678
Parsing out the values is easy, and once you have them you can certainly use those values to build up an array. The trickiest part comes from the fact that you need to combine input from separate lines. Here is one approach; note that this script is verbose on purpose, to show what's going on; once you see what's happening, you can eliminate most of the output:
so.input
"HardwareSerialNumber": "123456789101",
"DeviceId": "devid1234",
"HardwareSerialNumber": "111213141516",
"DeviceId": "devid5678",
so.sh
#!/bin/bash
declare -a hardwareInfo
while [[ 1 ]]; do
# read in two lines of input
# if either line is the last one, we don't have enough input to proceed
read lineA < "${1:-/dev/stdin}"
# if EOF or empty line, exit
if [[ "$lineA" == "" ]]; then break; fi
read lineB < "${1:-/dev/stdin}"
# if EOF or empty line, exit
if [[ "$lineB" == "" ]]; then break; fi
echo "$lineA"
echo "$lineB"
hwsn=$lineA
hwsn=${hwsn//HardwareSerialNumber/}
hwsn=${hwsn//\"/}
hwsn=${hwsn//:/}
hwsn=${hwsn//,/}
echo $hwsn
# some checking could be done here to test that the value is numeric
devid=$lineB
devid=${devid//DeviceId/}
devid=${devid//\"/}
devid=${devid//:/}
devid=${devid//,/}
echo $devid
# some checking could be done here to make sure the value is valid
# populate the array
hardwareInfo[$hwsn]=$devid
done
# spacer, for readability of the output
echo
# display the array; in your script, you would do something different and useful
for key in "${!hardwareInfo[#]}"; do echo $key --- ${hardwareInfo[$key]}; done
cat so.input | ./so.sh
"HardwareSerialNumber": "123456789101",
"DeviceId": "devid1234",
123456789101
devid1234
"HardwareSerialNumber": "111213141516",
"DeviceId": "devid5678",
111213141516
devid5678
111213141516 --- devid5678
123456789101 --- devid1234
I created the input file so.input just for convenience. You would probably pipe your grep output into the bash script, like so:
grep-command | ./so.sh
EDIT #1: There are lots of choices for parsing out the key and value from the strings fed in by grep; the answer from @David C. Rankin shows another way. The best way depends on what you can rely on about the content and structure of the grep output.
There are also several choices for reading two separate lines that are related to each other; David's "toggle" approach is also good, and commonly used; I considered it myself, before going with "read two lines and stop if either is blank".
EDIT #2: I see declare -A in David's answer and in examples on the web; I used declare -a because that's what my version of bash wants (I'm using a Mac). So, just be aware that there can be differences.
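As a rough alternative sketch (assuming bash 4+, so declare -A is available, and that the grep output always alternates HardwareSerialNumber/DeviceId lines as in the sample), the pairing can also be done by calling read twice per loop iteration:
#!/bin/bash
declare -A array

# read each pair of lines in one iteration; the value field looks like
# "123456789101", so strip the leading quote and the trailing quote+comma
while read -r _ serial && read -r _ devid; do
    serial=${serial#\"}; serial=${serial%\",}
    devid=${devid#\"};   devid=${devid%\",}
    array[$serial]=$devid
done < dat/grepout.txt        # or: done < <( your grep command )

for key in "${!array[@]}"; do
    echo "array[$key]=${array[$key]}"
done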

shell mock --define from array: ERROR: Bad option for '--define' ("dist). Use --define 'macro expr'

I am currently writing a script which should make it easier for me to build some RPMs using mock.
The plan is to make it possible to add values for the mock (and therefore rpmbuild) --define parameter.
The error I get if I add such a define value is
ERROR: Bad option for '--define' ("dist). Use --define 'macro expr'
When I execute the script with something as simple as ./test.sh --define "dist .el7", the "debug" output is as follows:
/usr/bin/mock --init -r epel-7-x86_64 --define "dist .el7"
If I copy this line and execute it in the shell directly, it actually works. Does anybody have an idea why this is the case?
My script can be cut down to the following:
#!/bin/sh
set -e
set -u
set -o pipefail
C_MOCK="/usr/bin/mock"
MOCK_DEFINES=()
_add_mock_define() {
#_check_parameters_count_strict 1 ${#}
local MOCK_DEFINE="${1}"
MOCK_DEFINES+=("${MOCK_DEFINE}")
}
_print_mock_defines_parameter() {
if [ ${#MOCK_DEFINES[@]} -eq 0 ]; then
return 0
fi
printf -- "--define \"%s\" " "${MOCK_DEFINES[#]}"
}
_mock_init() {
local MOCK_DEFINES_STRING="$(_print_mock_defines_parameter)"
local MOCK_PARAMS="--init"
MOCK_PARAMS="${MOCK_PARAMS} -r epel-7-x86_64"
[ ! "${#MOCK_DEFINES_STRING}" -eq 0 ] && MOCK_PARAMS="${MOCK_PARAMS} ${MOCK_DEFINES_STRING}"
echo "${C_MOCK} ${MOCK_PARAMS}"
${C_MOCK} ${MOCK_PARAMS}
local RC=${?}
if [ ${RC} -ne 0 ]; then
_exit_error "Error while mock initializing ..." ${RC}
fi
}
while (( ${#} )); do
case "${1}" in
-s|--define)
shift 1
_add_mock_define "${1}"
;;
esac
shift 1
done
_mock_init
exit 0
After asking a coworker about this, I was pointed to this question on Unix Stack Exchange: Unix Stackexchange question
The way this problem was solved can be broken down to the following lines:
DEFINES=()
DEFINES+=(--define "dist .el7")
DEFINES+=(--define "foo bar")
/usr/bin/mock --init -r epel-7-x86_64 "${DEFINES[@]}"
Just in case somebody else stumbles upon this kind of issue.
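Applied to the script above, that roughly means building the whole parameter list as an array instead of flattening it into the MOCK_PARAMS string; a sketch of how _mock_init might look (names kept from the original, error handling omitted):
_mock_init() {
    local MOCK_PARAMS=(--init -r epel-7-x86_64)
    local MOCK_DEFINE
    # turn every stored define into its own --define 'macro expr' pair
    for MOCK_DEFINE in "${MOCK_DEFINES[@]}"; do
        MOCK_PARAMS+=(--define "${MOCK_DEFINE}")
    done
    echo "${C_MOCK} ${MOCK_PARAMS[*]}"     # debug output only
    "${C_MOCK}" "${MOCK_PARAMS[@]}"        # array expansion keeps each argument intact
}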

Is there a way to generate the include map for a C file?

Given:
a C file with several included header files
a bunch of include file search folder
Is there a way to generate some kind of include map for the C file?
IDEs can sometimes help locate the definition/declaration of a symbol in a header file, but I think an include map would give me better insight into how these files are related when the project gets complicated, and would help identify issues such as circular includes.
ADD 1
A similar thread, but not much help.
It only generates an include hierarchy as text in the Output window when building.
And it only works for native VC++ projects, not for NMAKE C projects.
Displaying the #include hierarchy for a C++ file in Visual Studio
ADD 2
I just tried the Include Manager mentioned in the above thread. Though not free, it's not expensive and fits my scenario perfectly.
Not sure if this is quite what you're after, but I was curious what a graph of this would look like, so I threw this together. It's a bit of a mess, but workable for a throw-away script:
#!/bin/bash
INCLUDE_DIRS=()
# Add any include dirs here
# INCLUDE_DIRS+=("/usr/include")
# If you want to add flags for some pkg-config modules (space
# separated)
PKG_CONFIG_PKGS=""
FOLLOW_SYS_INCLUDES=y
while read dir; do
dir="$(readlink -f "${dir}")"
for d in "${INCLUDE_DIRS[#]}"; do
if [ "${dir}" = "${d}" ]; then
continue
fi
done
INCLUDE_DIRS+=("${dir}")
done < <(echo | cc -Wp,-v -x c - -fsyntax-only 2>&1 | grep '^ ' | cut -b2-)
PROCESSED=()
while read flag; do
if [ -n "${flag}" ]; then
INCLUDE_DIRS+=("${flag}")
fi
done < <(pkg-config --cflags-only-I "${PKG_CONFIG_PKGS}" | sed -E 's/-I([^ ]*)/\1\n/g')
function not_found {
echo " \"$1\" [style=filled,color=lightgrey];"
echo " \"$2\" -> \"$1\""
}
function handle_file {
filename="${1}"
for f in "${PROCESSED[#]}"; do
if [ "${f}" = "${1}" ]; then
echo " \"$2\" -> \"$1\""
return
fi
done
PROCESSED+=("$1")
if [ -n "${2}" ]; then
echo " \"${2}\" -> \"${1}\";"
fi
if [ ! "${FOLLOW_SYS_INCLUDES}" = "y" ]; then
for d in "${INCLUDE_DIRS[#]}"; do
case "${1}" in
"${d}"/*)
return
;;
esac
done
fi
parse_file "$1"
}
function handle_include {
case "${1}" in
/*)
handle_file "${name}" "$2"
return
;;
esac
for dir in "${INCLUDE_DIRS[#]}"; do
if [ -f "${dir}/${1}" ]; then
handle_file "${dir}/${1}" "$2"
return
fi
done
not_found "${1}" "${2}"
}
function handle_include_2 {
case "${1}" in
/*)
handle_file "${1}" "$2"
return
;;
esac
FILE="$(readlink -f "$(dirname "${2}")/${1}")"
if [ -f "${FILE}" ]; then
handle_file "${FILE}" "$2"
fi
}
function parse_file {
while read name; do
handle_include "$name" "$1";
done < <(grep '^[ \t]*#[ \t]*include <' "$1" | sed -E 's/[ \t]*#[ \t]*include ?<([^>]+)>.*/\1/')
while read name; do
handle_include_2 "$name" "$1" "$PWD";
done < <(grep '^[ \t]*#[ \t]*include "' "$1" | sed -E 's/[ \t]*#[ \t]*include \"([^"]+)\"/\1/')
}
echo "digraph G {"
echo "graph [rankdir = \"LR\"];"
parse_file "$(readlink -f "${1}")" "" "$PWD"
echo "}"
Pass it a file and it will generate a graphviz file. Pipe it to dot:
$ ./include-map.sh /usr/include/stdint.h | dot -Tx11
And you have something nice to look at.
These days almost any full-featured IDE has this built in; Visual Studio or JetBrains' CLion are well suited for it. You didn't mention your platform or environment, but it may be worth a try, even if it takes some effort to set up the project so that it compiles.
Back in the day I also found it useful to generate documentation with Doxygen; as I remember, it will create such include maps/links as well.
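For reference, a rough sketch of that Doxygen route (assuming doxygen and graphviz/dot are installed; INCLUDE_GRAPH and INCLUDED_BY_GRAPH are the relevant options, and later assignments in a Doxyfile override earlier ones):
doxygen -g Doxyfile                 # generate a default configuration file
cat >> Doxyfile <<'EOF'
EXTRACT_ALL        = YES
HAVE_DOT           = YES
INCLUDE_GRAPH      = YES
INCLUDED_BY_GRAPH  = YES
EOF
doxygen Doxyfile                    # per-file include graphs appear in the HTML output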

BASH: Best way to set variable from array

Bash 4 on Linux ~
I have an array of possible values. I must restrict user input to these values.
Arr=(hello kitty goodbye quick fox)
User supplies value as argument to script:
bash myscript.sh -b var
Currently, I'm trying the following:
function func_exists () {
_var="$1"
for i in ${Arr[@]}
do
if [ "$i" == "$_var" ]
then
echo hooray for "$_var"
return 1
fi
done
return 0
}
func_exists $var
if [ $? -ne 1 ];then
echo "Not a permitted value."
func_help
exit $E_OPTERROR
fi
Seems to work fine, are there better methods for testing user input against an array of allowed values?
UPDATE: I like John K's answer ...can someone clarify the use of $@? I understand that this represents all positional parameters -- so we shift the first param off the stack and $@ now represents all remaining params, those being the passed array ...is that correct? I hate blindly using code without understanding ...even if it works!
Your solution is what I'd do. Maybe using a few more shell-isms, such as returning 0 for success and non-0 for failure like UNIX commands do in general.
# Tests if $1 is in the array ($2 $3 $4 ...).
is_in() {
value=$1
shift
for i in "$#"; do
[[ $i == $value ]] && return 0
done
return 1
}
if ! is_in "$var" "${Arr[#]}"; then
echo "Not a permitted value." >&2
func_help
exit $E_OPTERROR
fi
Careful use of double quotes makes sure this will work even if the individual array entries contain spaces, which is allowed. This is a two element array: list=('hello world' 'foo bar').
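For example, a quick check against that two-element list (illustrative values only):
list=('hello world' 'foo bar')
if is_in 'hello world' "${list[@]}"; then
    echo "permitted value"
else
    echo "Not a permitted value." >&2
fi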
Another solution. is_in is just a variable:
Arr=(hello kitty goodbye quick fox)
var='quick'
string=" ${Arr[*]} " # array to string, framed with blanks
is_in=1 # false
# try to delete the variable inside the string; true if length differ
[ "$string" != "${string/ ${var} /}" ] && is_in=0
echo -e "$is_in"
function func_exists () {
    # returns 1 if $1 is a permitted value, 0 otherwise
    # (same convention as the function in the question)
    case "$1" in
        hello|kitty|goodbye|quick|fox)
            return 1;;
        *)
            return 0;;
    esac
}
