Store values from arrays in config-file to variables - arrays

I've got the following problem:
I have a config file (written in bash) with multiple arrays; the number of arrays differs from config to config. Each array contains three values.
declare -a array0
array0=(value1 value2 value3)
#
declare -a array1
array1=(value1 value2 value3)
#
declare -a array2
array2=(value1 value2 value3)
Now, this config file is sourced into the main bash script. I want to go from array to array and store the values in individual variables. My current attempt:
for ((i=0; i<=2; i++))
do
    if [ "$i" = 0 ]
    then
        wantedvalue1="${array0[$i]}"
    fi
    if [ "$i" = 1 ]
    then
        wantedvalue2="${array0[$i]}"
    fi
    if [ "$i" = 2 ]
    then
        wantedvalue3="${array0[$i]}"
    fi
done
I guess this will work for one specific array, but how can I tell the script to process every array in the config file like this?
Thanks for any help!

You can find the arrays in your environment via set. This extracts the names of the arrays which have exactly three elements:
set | sed -n 's/^\([_A-Za-z][_A-Za-z0-9]*\)=(\[0]=.*\[2]="[^"]*")$/\1/p'
(The number of backslashes depends on your sed dialect. This worked for me on Debian, where backslashed parentheses are metacharacters for grouping, and bare parentheses are matched literally.)
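For example, with the config above, set prints each array on a line like this (the exact quoting can vary a bit between bash versions):
array0=([0]="value1" [1]="value2" [2]="value3")
which the sed command reduces to the bare name array0.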
I don't really see why you want to use a loop to extract just three elements, but the wacky indirect reference syntax in bash kind of forces it here.
for array in $(set |
        sed -n 's/^\([_A-Za-z][_A-Za-z0-9]*\)=(\[0]=.*\[2]="[^"]*")$/\1/p'); do
    for ((i=0, j=1; i<3; ++i, ++j)); do
        k="$array[$i]"
        eval wantedvalue$j=\'${!k}\'
    done
    :
    : code which uses the wantedvalues here
done
It would be a tad simpler if you just used another array for the wantedvalues. Then the pesky eval could be avoided, too.
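For example, a sketch of that variant (the array name wanted is my invention); indirect expansion with name[@] copies all three elements in one go:
for array in $(set |
        sed -n 's/^\([_A-Za-z][_A-Za-z0-9]*\)=(\[0]=.*\[2]="[^"]*")$/\1/p'); do
    ref="$array[@]"
    wanted=( "${!ref}" )   # wanted[0]..wanted[2] replace wantedvalue1..wantedvalue3
    :
    : code which uses "${wanted[@]}" here
done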

Related

Concatenate arrays using iteration in name with bash

I would like to concatenate an unlimited number of arrays in as few lines as possible, so I wrote the code below:
#!/bin/bash
declare -a list1=("element1")
declare -a list2=("element2")
declare -a list3=("element3")
declare -a list4=("element4")
declare -a list
for i in {1..4}
do
    list=( ${list[@]} ${list$i[@]} )
done
echo ${list[*]}
But the code above is not working because $i is not seen as a variable, and the error is: ${list$i[@]}: bad substitution
You can use variable indirection:
for i in {1..4} ; do
    ref="list$i[@]"
    list+=("${!ref}")
done
echo "${list[@]}"
The following code outputs all 4 lists concatenated together.
eval echo \${list{1..4}[*]}
This code runs filename expansion over the resulting list elements (a * in an element is replaced by matching filenames). Consider sacrificing 4 characters and writing eval echo \"\${list{1..4}[*]}\" instead.
Note that eval is evil (https://mywiki.wooledge.org/BashFAQ/048) and such code is confusing. I wouldn't write it in a real script; I would definitely use a loop. Use shellcheck to check your scripts.

Convert JSON dictionary into Bash one?

I am trying to receive output from an aws-cli command into Bash and use it as input to a second command.
I successfully saved the output into the variable with this helpful answer:
iid=$(aws ec2 run-instances ...)
and receive output like that:
{ "ImageId": "ami-0abcdef1234567890", "InstanceId": "i-1231231230abcdef0", "InstanceType": "t2.micro", ... }
I know that Bash since v4 supports associative arrays, but I'm struggling to convert the JSON into one.
I tried to parse the dictionary in Bash but received error:
must use subscript when assigning associative array
This is because the right key syntax for Bash dicts is =, not :.
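For example, a valid associative-array literal built from the sample output above would look like this (the name iid_dict is made up):
declare -A iid_dict=(
    [ImageId]="ami-0abcdef1234567890"
    [InstanceId]="i-1231231230abcdef0"
    [InstanceType]="t2.micro"
)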
Finally I accessed the members by using this marvelous answer with sed:
echo $iid|sed 's/{//g;s/}//g;s/"//g'|cut -d ':' -f2
My question: is there any standard way of creating a Bash dictionary from JSON or text besides regex? Best-practice?
Considering the snippet from the answer I used, the sed approach can be very verbose, and the verbosity grows with the number of keys/members:
for items in `echo $db|sed 's/{//;s/}//'`
do
    echo one${count} = `echo $items|sed 's/^.*\[//;s/\].*$//'|cut -d ',' -f1`
    echo two${count} = `echo $items|sed 's/^.*\[//;s/\].*$//'|cut -d ',' -f2`
    echo three${count} = `echo $items|sed 's/^.*\[//;s/\].*$//'|cut -d ',' -f3`
    echo four${count} = `echo $items|sed 's/^.*\[//;s/\].*$//'|cut -d ',' -f4`
    ...
done
For simple dicts it is OK, but for complex dictionaries with hundreds of keys and deep nesting it is almost unusable.
Is there any unified approach for arbitrary dictionary?
P.S. I found answers about solving the opposite (receiving Bash dict in Python), solving the task through jq, creating dict from Bash to Bash and from non-common input but nothing about specifically JSON. I prefer not to use jq and python and stick to the standard Bash toolset, most of the answers of this collective answer use 3rd-party tools. Is it possible at all?
One way to turn your JSON object members into a Bash4+'s associative array:
#!/usr/bin/env bash
# shellcheck disable=SC2155 # Associative array declaration from JSON
declare -A assoc=$(
    jq -r '"(",(to_entries | .[] | "["+(.key|@sh)+"]="+(.value|@sh)),")"' \
        input.json
)
# Debug dump the Associative array declaration
typeset -p assoc
Sample output:
declare -A assoc=([InstanceId]="i-1231231230abcdef0" [InstanceType]="t2.micro" [ImageId]="ami-0abcdef1234567890" )
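The members can then be accessed by key, e.g. (using the sample keys above):
echo "${assoc[InstanceId]}"   # prints: i-1231231230abcdef0
for key in "${!assoc[@]}"; do echo "$key -> ${assoc[$key]}"; done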

Sorting an array of pathnames (strings) [Bash]

I have seen way too many duplicates of this, but none of the answer codes or tips ever helped me, so I'm left confused.
input=/foo/bar/*;
#Contains something along the lines of
#/foo/bar/file1 /foo/bar/file2 /foo/bar/file3
#And I simply need
#/foo/bar/file3 /foo/bar/file2 /foo/bar/file1
output=($(for l in ${input[@]}; do echo $l; done | sort));
#Doesn't work, returns only the last entry from input
output=$(sort -nr ${input});
#Works, returns everything correctly reversed, but outputs the file contents and not the pathnames;
output=($(sort -nr ${input}));
#Outputs only the last entry and also its contents and not the pathname;
I tried many more options, but I'm not gonna fill this whole page with them, you get the gist.
Duplicates: (None of them helpful to me)
How can I sort the string array in linux bash shell?
How to sort an array in BASH
custom sort bash array
Sorting bash arguments alphabetically
You're confused about what an array is in bash: this does not declare an array:
input=/foo/bar/*
$input is just the string "/foo/bar/*" -- the list of files does not get expanded until you do something like for i in ${input[@]} where the "array" expansion is unquoted.
You want this:
input=( /foo/bar/* )
mapfile -t output < <(printf "%s\n" "${input[@]}" | sort -nr)
printf prints each element of input on its own line, sort -nr sorts those lines in reverse, and mapfile -t reads the sorted result back into the output array via process substitution.
You can use sort -r with printf, where input contains the glob string matching your filenames:
sort -r <(printf "%s\n" $input)
This works:
input=/foo/bar/*
output=`for l in $input ; do echo $l ; done | sort -r`

Bash array indirection in a function [duplicate]

Bash script to create multiple arrays from csv with unknown columns.
I am trying to write a script to compare two csv files with similar columns. I need it to locate the matching column from the other csv and compare any differences. The kicker is that I would like the script to be dynamic, allowing any number of columns to be entered while still being able to function. I thought I had a good plan to solve this, but it turns out I'm running into syntax errors. Here is a sample of a csv I need to compare.
IP address, Notes, Nmap-SSH, Nmap-SMTP, Nmap-HTTP, Nmap-HTTPS,
10.0.0.1, , open, closed, open, open,
10.0.0.2, , closed, open, closed, closed,
When I read the csv file, I was planning to look for "IF column == open; THEN populate this column's array with the IP address". This would have given me 4 lists in this scenario with the IPs that were listening on said port. I could then compare that to my security device configuration to make sure it was configured properly. Finally, to the meat: here is what I thought would accomplish creating the arrays for me to search later. However, I ran into a snag when I tried to use a variable inside an array name. Can my syntax be corrected, or is there just a better way to do this sort of thing?
#!/bin/bash
#
#
# This script compares config_cleaned_<ip>.txt output against ext_web_env.csv and outputs the differences
#
#
# Read from ext_web_env.csv file and create Array
#
FILENAME=./tmp/ext_web_env.csv
#
index=0
#
while read line
do
    # How many columns are in the .csv?
    varEnvCol=$(echo $line | awk -F, '{print NF}')
    echo "columns = $varEnvCol"
    # While loop to create array for each column
    while [ $varEnvCol != 2 ]
    do
        # Checks to see if port is open; if so then add IP address to array
        varPortCon=$(echo $line | awk -F, -v i=$varEnvCol '{print $i}')
        if [ $varPortCon = "open" ]
        then
            arr$varEnvCol[$index]="$(echo $line | awk -F, '{print $1}')"
            # I get this error message: "line 29: arr8[194]=10.0.0.194: command not found"
        fi
        echo "arrEnv$varEnvCol is: ${arr$varEnvCol[@]}"
        # Another error, but not as important since I am using this to debug: "line 31: arr$varEnvCol is: ${arr$varEnvCol[@]}: bad substitution"
        varEnvCol=$(($varEnvCol - 1))
    done
    index=$(($index + 1 ))
done < $FILENAME
UPDATE
I also tried using the eval command since all the data will be populated by other scripts.
but am getting this error message:
./compare.sh: line 41: arr8[83]=10.0.0.83: command not found
Here is my new code for this example:
if [[ $varPortCon = *'open'* ]]
then
    eval arr\$varEnvCol[$index]=$(echo $line | awk -F, '{print $1}')
fi
arr$varEnvCol[$index]="$(...)"
doesn't work the way you expect it to: you cannot assign to shell variables indirectly (via an expression that expands to the variable name) this way.
Your attempted workaround with eval is also flawed - see below.
tl;dr
If you use bash 4.3 or above:
declare -n targetArray="arr$varEnvCol"
targetArray[index]=$(echo $line | awk -F, '{print $1}')
bash 4.2 or earlier:
declare "arr$varEnvCol"[index]="$(echo $line | awk -F, '{print $1}')"
Caveat: This will work in your particular situation, but may fail subtly in others; read on for details, including a more robust, but cumbersome alternative based on read.
The eval-based solution mentioned by @shellter in a since-deleted comment is problematic not only for security reasons (as they mentioned), but also because it can get quite tricky with respect to quoting; for completeness, here's the eval-based solution:
eval "arr$varEnvCol[index]"='$(echo $line | awk -F, '\''{print $1}'\'')'
See below for an explanation.
Assign to a bash array variable indirectly:
bash 4.3+: use declare -n to effectively create an alias ('nameref') of another variable
This is by far the best option, if available:
declare -n targetArray="arr$varEnvCol"
targetArray[index]=$(echo $line | awk -F, '{print $1}')
declare -n effectively allows you to refer to a variable by another name (whether that variable is an array or not), and the name to create an alias for can be the result of an expression (an expanded string), as demonstrated.
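A self-contained sketch of the mechanism (bash 4.3+; the values are illustrative):
varEnvCol=8
index=0
declare -a arr8=()
declare -n targetArray="arr$varEnvCol"   # targetArray is now an alias for arr8
targetArray[index]="10.0.0.83"
declare -p arr8   # -> declare -a arr8=([0]="10.0.0.83")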
bash 4.2-: there are several options, each with tradeoffs
NOTE: With non-array variables, the best approach is to use printf -v. Since this question is about array variables, this approach is not discussed further.
[most robust, but cumbersome]: use read:
IFS=$'\n' read -r -d '' "arr$varEnvCol"[index] <<<"$(echo $line | awk -F, '{print $1}')"
IFS=$'\n' ensures that leading and trailing whitespace in each input line is left intact.
-r prevents interpretation of \ chars. in the input.
-d '' ensures that ALL input is captured, even multi-line.
Note, however, that any trailing \n chars. are stripped.
If you're only interested in the first line of input, omit -d ''
"arr$varEnvCol"[index] expands to the variable - array element, in this case - to assign to; note that referring to variable index inside an array subscript does NOT need the $ prefix, because subscripts are evaluated in arithmetic context, where the prefix is optional.
<<< - a so-called here-string - sends its argument to stdin, where read takes its input from.
[simplest, but may break]: use declare:
declare "arr$varEnvCol"[index]="$(echo $line | awk -F, '{print $1}')"
(This is slightly counter-intuitive, in that declare is meant to declare, not modify a variable, but it works in bash 3.x and 4.x, with the constraints noted below.)
Works fine OUTSIDE a FUNCTION - whether the array was explicitly declared with declare or not.
Caveat: INSIDE a function, only works with LOCAL variables - you cannot reference shell-global variables (variables declared outside the function) from inside a function that way. Attempting to do so invariably creates a LOCAL variable ECLIPSING the shell-global variable.
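A minimal illustration of that caveat (variable names made up):
g=( "global" )
f() {
    declare "g[0]=local"    # creates a NEW local g that eclipses the global one
    echo "inside:  ${g[0]}" # -> local
}
f
echo "outside: ${g[0]}"     # -> global (unchanged)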
[insecure and tricky]: use eval:
eval "arr$varEnvCol[index]"='$(echo $line | awk -F, '\''{print $1}'\'')'
CAVEAT: Only use eval if you fully control the contents of the string being evaluated; eval will execute any command contained in a string, with potentially unwanted results.
Understanding which variable references/command substitutions get expanded when is nontrivial; the safest approach is to delay expansion so that it happens when eval executes, rather than immediately, when the arguments are passed to eval.
For a variable assignment statement to succeed, the RHS (right-hand side) must eventually evaluate to a single token - either unquoted without whitespace or quoted (optionally with whitespace).
The above example uses single quotes to delay expansion; thus, the string passed mustn't contain single quotes directly and thus is broken into multiple parts with literal ' chars. spliced in as \'.
Also note that the LHS (left-hand side) of the assignment statement passed to eval must be a double-quoted string - using an unquoted string with selective quoting of $ won't work, curiously:
OK: eval "arr$varEnvCol[index]"=...
FAILS: eval arr\$varEnvCol[index]=...

bash4 read file into associative array

I am able to read a file into a regular array with a single statement:
local -a ary
readarray -t ary < $fileName
What is not working is reading a file into an associative array.
I have control over file creation, so I would like to do this as simply as possible, without loops if at all possible.
So file content can be following to be read in as:
keyname=valueInfo
But I am willing to replace = with another string if it cuts down on code, especially in a single-line statement like the one above.
And ...
So would it be possible to read such a file into an assoc array using something like an until or from - i.e. read into an assoc array until it hits a word - or would I have to do this as part of a loop?
This will allow me to keep a lot of similar values in same file, but read into separate arrays.
I looked at mapfile as well, but it does the same as readarray.
Finally ...
I am creating an options list - to select from - as below:
local -a arr=("${!1}")
select option in ${arr[*]}; do
    echo ${option}
    break
done
Works fine - however, the list shown is not sorted. I would like to have it sorted if at all possible.
Hope it is ok to put all 3 questions into 1 as the questions are similar - all on arrays.
Thank you.
First thing, associative arrays are declared with -A, not -a:
local -A ary
And if you want to declare a variable on global scope, use declare outside of a function:
declare -A ary
Or use -g if BASH_VERSION >= 4.2.
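For example (a sketch; the function name is made up):
populate() {
    declare -gA ary   # -g declares in the global scope even inside a function (bash 4.2+)
    ary[keyname]=valueInfo
}
populate
declare -p ary   # -> declare -A ary=([keyname]="valueInfo" )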
If your lines do have keyname=valueInfo, with readarray, you can process it like this:
readarray -t lines < "$fileName"
for line in "${lines[@]}"; do
    key=${line%%=*}
    value=${line#*=}
    ary[$key]=$value  ## Or simply ary[${line%%=*}]=${line#*=}
done
Using a while read loop can also be an option:
while IFS= read -r line; do
    ary[${line%%=*}]=${line#*=}
done < "$fileName"
Or
while IFS='=' read -r key value; do
    ary[$key]=$value
done < "$fileName"
