Why isn't my bash array append working? - arrays

This script was working just fine on an AIX box, but now on a Red Hat Linux box the arrays just don't seem to work. The bash version on the new box is 4.1.2.
I declare my array
declare -a gridNames=()
I get information about a grid
gridstats=`snmpwalk -v 2c -c splunk $host gridStatsTable -m $APPLIANCEMIB -OUQs -Ln`
As well as getting the stats from the above, I reuse it to find all the gridNames and then will use the array of gridNames to get stats about their maps.
while read -r process; do
gridNames=(${gridNames[@]} `grep gridName | awk -F "\"" '{print $(NF-1)}'`)
done <<< "$gridstats"
The awk part is tested and correctly returns a list of grid names (just one in this case), but when I echo the array gridNames it's empty.
I have also tried using
gridNames+=(`grep gridName | awk -F "\"" '{print $(NF-1)}'`)
but that doesn't work either.

Use the += operator to append to the array, make grep read from the current line, and feed the loop with process substitution:
while read -r process; do
gridNames+=( $(grep 'gridName' <<< "$process" | awk -F '"' '{print $(NF-1)}') )
done < <(snmpwalk -v 2c -c splunk $host gridStatsTable -m $APPLIANCEMIB -OUQs -Ln)
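Since the new box runs bash 4.1, you could also skip the read loop entirely and let readarray collect the filtered output. A minimal sketch, assuming each matching line carries one quoted gridName:
# readarray (bash 4+) splits on newlines; awk does the filtering and field extraction
readarray -t gridNames < <(snmpwalk -v 2c -c splunk "$host" gridStatsTable -m "$APPLIANCEMIB" -OUQs -Ln |
    awk -F '"' '/gridName/ {print $(NF-1)}')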

Related

bash script split array of strings to designed output

Data in log file is something like:
"ipHostNumber: 127.0.0.1
ipHostNumber: 127.0.0.2
ipHostNumber: 127.0.0.3"
that's my code snippet:
readarray -t iparray < "$destlog"
unset iparray[0]
for ip in "${iparray[@]}"
do
IFS=$'\n' read -r -a iparraychanged <<< "${iparray[@]}"
done
What I want is to transfer the IPs to another array and then read every line from that array and ping it.
UPDATE: I realized what I actually want: copy one array to another, but without the leading string, in this case cut off "ipHostNumber: " so that only the IPs remain.
Thanks in advance; if anything is missing please let me know.
Why do you need arrays at all? Just read the input and do the work.
while IFS=' ' read -r _ ip; do
ping -c1 "$ip"
done < "$destlog"
See https://mywiki.wooledge.org/BashFAQ/001 .
Another way is to use xargs and filtering:
awk '{print $2}' "$destlog" | xargs -P0 -n1 ping -c1
I prefer KamilCuk's xargs solution, but if you just wanted the array because you plan to reuse it -
$: readarray -t ip < <( sed '1d; s/^.* //' file )
Now it's loaded.
$: declare -p ip
declare -a ip=([0]="127.0.0.2" [1]="127.0.0.3")
$: for addr in "${ip[@]}"; do ping -c 1 "$addr"; done
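If you also want the prefix-stripping in pure bash, here is a small sketch (assuming every remaining line is literally "ipHostNumber: <address>"):
# Strip the fixed prefix with parameter expansion instead of sed
readarray -t lines < "$destlog"
ip=()
for line in "${lines[@]:1}"; do    # skip the first line, as in the sed solution
    ip+=( "${line#ipHostNumber: }" )
done
declare -p ip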

How to access last item in bash array on Mac OS?

I had to create a bash array on Mac OS as follows. The $1 represents the number of git commits you want to store in the array.
IFS=$'\n' read -rd '' -a array <<< "$(git log -n $1 | grep commit | awk '{print $2}')"
I can't access last array item as ${array[-1]}. I get the error "array: bad array subscript".
However, when I create the array on linux OS, I can access the last array item in the same way successfully.
readarray -t array <<< "$(git log -n $1 | grep commit | awk '{print $2}')"
echo ${array[-1]} is successful on Linux machine but not on Mac OS machine.
In a bash too old to support negative subscripts, you end up needing to do something like:
echo "${array[$((${#array[#]} - 1))]}"

Trouble with AWK'd command output and bash array

I am attempting to get a list of running VirtualBox VMs (the UUIDs) and put them into an array. The command below produces the output below:
$ VBoxManage list runningvms | awk -F '[{}]' '{print $(NF-1)}'
f93c17ca-ab1b-4ba2-95e5-a1b0c8d70d2a
46b285c3-cabd-4fbb-92fe-c7940e0c6a3f
83f4789a-b55b-4a50-a52f-dbd929bdfe12
4d1589ba-9153-489a-947a-df3cf4f81c69
I would like to take those UUIDs and put them into an array (possibly even an associative array for later use, but a simple array for now is sufficient)
If I do the following:
array1="( $(VBoxManage list runningvms | awk -F '[{}]' '{print $(NF-1)}') )"
The commands
array1_len=${#array1[@]}
echo $array1_len
Outputs "1" as in there's only 1 element. If I print out the elements:
echo ${array1[*]}
I get a single line of all the UUIDs
( f93c17ca-ab1b-4ba2-95e5-a1b0c8d70d2a 46b285c3-cabd-4fbb-92fe-c7940e0c6a3f 83f4789a-b55b-4a50-a52f-dbd929bdfe12 4d1589ba-9153-489a-947a-df3cf4f81c69 )
I did some research (Bash Guide/Arrays) on how to tackle this and found an approach using command substitution and redirection, but it produces an empty array:
while read -r -d '\0'; do
array2+=("$REPLY")
done < <(VBoxManage list runningvms | awk -F '[{}]' '{print $(NF-1)}')
I'm obviously missing something. I've looked at several similar questions on this site such as:
Reading output of command into array in Bash
AWK output to bash Array
Creating an Array in Bash with Quoted Entries from Command Output
Unfortunately, none have helped. I would appreciate any assistance in figuring out how to take the output and assign it to an array.
I am running this on macOS 10.11.6 (El Capitan) and bash version 3.2.57.
Since you're on a Mac:
brew install bash
Then with this bash as your shell, pipe the output to:
readarray -t array1
Of the -t option, the man page says:
-t Remove a trailing delim (default newline) from each line read.
If the bash4 solution is admissible, then the advice given
e.g. by gniourf_gniourf at reading-output-of-command-into-array-in-bash
is still sound.
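If installing a newer bash is not an option, a sketch that stays within the stock bash 3.2 (both += for arrays and process substitution work there):
# Append one UUID per line; avoids the -d '\0' pitfall from the question
array2=()
while IFS= read -r uuid; do
    array2+=( "$uuid" )
done < <(VBoxManage list runningvms | awk -F '[{}]' '{print $(NF-1)}')
declare -p array2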

Parse multiline output to array in shellscript

I tried to fetch all repos from a GitHub user and put them into an array within a shell script. Somehow the array doesn't recognize the newline as a separator and acts like there's only one multi-line element within the array.
Here's my sample code:
someUser=Joe
declare -a repos=$(curl -s "https://api.github.com/users/$someUser/repos?page=$PAGE&per_page=100" | grep -e 'git_url*' | cut -d \" -f 4 | cut -d"/" -f5 | cut -d"." -f1)
for repo in $repos; do
echo $repo
// some more stuff
done
The output from the curl and cut looks like that:
RepoA
RepoB
RepoC
[...]
How do I get each line treated as a separate element within the array? I use the array several times, so I need one fixed container with all the repositories.
This is the correct way to iterate over the elements of a Bash array:
for repo in "${repos[@]}"; do
Also, to create an array with the output of a command, you need to wrap the $(...) command substitution within (...), like this:
declare -a repos=($(curl -s ...))
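If bash 4 is available, readarray is a less fragile alternative, since it does not depend on word splitting of an unquoted command substitution. A sketch using the same pipeline:
readarray -t repos < <(curl -s "https://api.github.com/users/$someUser/repos?page=$PAGE&per_page=100" |
    grep -e 'git_url' | cut -d \" -f 4 | cut -d "/" -f 5 | cut -d "." -f 1)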
Thanks to @janos I fixed the script. The missing parentheses in the declare line were the problem. Here is the full code; maybe somebody would like to copy it.
#!/bin/bash
gitUrl="https://github.com"
gitUser="foobar"
cloneCmd="git clone"
fetchCmd="git fetch"
magenta="\033[35m"
green="\033[32m"
def="\033[0m"
declare -a repos=($(curl -s "https://api.github.com/users/$gitUser/repos?page=$PAGE&per_page=100" | grep -e 'git_url*' | cut -d \" -f 4 | cut -d"/" -f5 | cut -d"." -f1))
# Init clone
echo -e "$magenta Cloning new Repositories $def"
for repo in "${repos[@]}"
do
if [ -d $repo ]; then
echo -e "$green \tRepo $repo already exists $def"
continue
fi
$cloneCmd $gitUrl/$gitUser/$repo
done
echo -e "$green Colning finished $def"
# Update Repos
echo -e "$magenta Updating Repositories $def"
for repo in "${repos[@]}"
do
cd $repo
$fetchCmd
cd ..
done
echo -e "$green Update finished $def"

Execute bash command stored in associative array over SSH, store result

For a larger project whose details aren't relevant here, I need to collect system stats from the local system or a remote system. Since I'm collecting the same stats either way, I'm preventing code duplication by storing the stats-collecting commands in a Bash associative array.
declare -A stats_cmds
# Actually contains many more key:value pairs, similar style
stats_cmds=([total_ram]="$(free -m | awk '/^Mem:/{print $2}')")
I can collect local system stats like this:
get_local_system_stats()
{
# Collect stats about local system
complex_data_structure_that_doesnt_matter=${stats_cmds[total_ram]}
# Many more similar calls here
}
A precondition of my script is that ~/.ssh/config is set up such that ssh $SSH_HOSTNAME works without any user input. I would like something like this:
get_remote_system_stats()
{
# Collect stats about remote system
complex_data_structure_that_doesnt_matter=`ssh $SSH_HOSTNAME ${stats_cmds[total_ram]}`
}
I've tried every combination of single quotes, double quotes, backticks and such that I can imagine. Some combinations result in the stats command getting executed too early (bash: 7986: command not found), others cause syntax errors, others return null (single quotes around the stats command) but none store the proper result in my data structure.
How can I evaluate a command, stored in an associative array, on a remote system via SSH and store the result in a data structure in my local script?
Make sure that the commands you store in your array don't get expanded when you assign your array!
Also note that the complex-looking quoting style is necessary when nesting single quotes. See this SO post for an explanation.
stats_cmds=([total_ram]='free -m | awk '"'"'/^Mem:/{print $2}'"'"'')
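An equivalent way to embed the single quotes, which some find easier to scan, is the '\'' idiom (same command, different quoting only):
stats_cmds=([total_ram]='free -m | awk '\''/^Mem:/{print $2}'\''')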
And then just launch your ssh as:
sh "$ssh_hostname" "${stats_cmds[total_ram]}"
(yeah, I lowercased your variable name, since all-uppercase names in Bash are best reserved for environment and special shell variables). Then:
get_local_system_stats() {
# Collect stats about local system
complex_data_structure_that_doesnt_matter=$( ${stats_cmds[total_ram]} )
# Many more similar calls here
}
and
get_remote_system_stats() {
# Collect stats about remote system
complex_data_structure_that_doesnt_matter=$(ssh "$ssh_hostname" "${stats_cmds[total_ram]}")
}
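To collect every stat this way, here is a sketch that loops over the array keys (note it opens one SSH connection per stat; the next answer shows how to batch everything into a single round-trip):
# Run each stored command on the remote host and keep the results by key
declare -A remote_stats=()
for key in "${!stats_cmds[@]}"; do
    remote_stats[$key]=$(ssh "$ssh_hostname" "${stats_cmds[$key]}")
done
declare -p remote_stats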
First, I'm going to suggest an approach that makes minimal changes to your existing implementation. Then, I'm going to demonstrate something closer to best practices.
Smallest Modification
Given your existing code:
declare -A remote_stats_cmds
remote_stats_cmds=([total_ram]='free -m | awk '"'"'/^Mem:/{print $2}'"'"''
[used_ram]='free -m | awk '"'"'/^Mem:/{print $3}'"'"''
[free_ram]='free -m | awk '"'"'/^Mem:/{print $4}'"'"''
[cpus]='nproc'
[one_min_load]='uptime | awk -F'"'"'[a-z]:'"'"' '"'"'{print $2}'"'"' | awk -F "," '"'"'{print $1}'"'"' | tr -d " "'
[five_min_load]='uptime | awk -F'"'"'[a-z]:'"'"' '"'"'{print $2}'"'"' | awk -F "," '"'"'{print $2}'"'"' | tr -d " "'
[fifteen_min_load]='uptime | awk -F'"'"'[a-z]:'"'"' '"'"'{print $2}'"'"' | awk -F "," '"'"'{print $3}'"'"' | tr -d " "'
[iowait]='cat /proc/stat | awk '"'"'NR==1 {print $6}'"'"''
[steal_time]='cat /proc/stat | awk '"'"'NR==1 {print $9}'"'"'')
...one can evaluate these locally as follows:
result=$(eval "${remote_stats_cmds[iowait]}")
echo "$result" # demonstrate value retrieved
...or remotely as follows:
result=$(ssh "$hostname" bash <<<"${remote_stats_cmds[iowait]}")
echo "$result" # demonstrate value retrieved
No separate local and remote forms are required.
The Right Thing
Now, let's talk about an entirely different way to do this:
# no awful nested quoting by hand!
collect_total_ram() { free -m | awk '/^Mem:/ {print $2}'; }
collect_used_ram() { free -m | awk '/^Mem:/ {print $3}'; }
collect_cpus() { nproc; }
...and then, to evaluate locally:
result=$(collect_cpus)
...or, to evaluate remotely:
result=$(ssh "$hostname" bash <<<"$(declare -f collect_cpus); collect_cpus")
...or, to iterate through defined functions with the collect_ prefix and do both of these things:
declare -A local_results
declare -A remote_results
while IFS= read -r funcname; do
local_results["${funcname#collect_}"]=$("$funcname")
remote_results["${funcname#collect_}"]=$(ssh "$hostname" bash <<<"$(declare -f "$funcname"); $funcname")
done < <(compgen -A function collect_)
...or, to collect all the items into a single remote array in one pass, avoiding extra SSH round-trips and not eval'ing or otherwise taking security risks with results received from the remote system:
remote_cmd=""
while IFS= read -r funcname; do
remote_cmd+="$(declare -f "$funcname"); printf '%s\0' \"$funcname\" \"\$(\"$funcname\")\";"
done < <(compgen -A function collect_)
declare -A remote_results=( )
while IFS= read -r -d '' funcname && IFS= read -r -d '' result; do
remote_results["${funcname#collect_}"]=$result
done < <(ssh "$hostname" bash <<<"$remote_cmd")
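A hypothetical usage check, just printing whatever came back:
# Print the collected stats; keys and values depend on the collect_ functions defined above
for key in "${!remote_results[@]}"; do
    printf '%s = %s\n' "$key" "${remote_results[$key]}"
done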