Bash script: split array of strings to desired output

The data in the log file looks something like this:
"ipHostNumber: 127.0.0.1
ipHostNumber: 127.0.0.2
ipHostNumber: 127.0.0.3"
This is my code snippet:
readarray -t iparray < "$destlog"
unset iparray[0]
for ip in "${iparray[@]}"
do
IFS=$'\n' read -r -a iparraychanged <<< "${iparray[@]}"
done
What I want is to transfer the IPs to another array, then read every line from that array and ping it.
UPDATE: I've realized something: what I probably want is to copy one array to another but without part of the string, in this case cutting off "ipHostNumber: " so that only the IPs remain.
Thanks in advance, if there's anything missing please let me know.

Why do you need arrays at all? Just read the input and do the work.
while IFS=' ' read -r _ ip; do
ping -c1 "$ip"
done < "$destlog"
See https://mywiki.wooledge.org/BashFAQ/001 .
Another way is to use xargs and filtering:
awk '{print $2}' "$destlog" | xargs -P0 -n1 ping -c1

I prefer KamilCuk's xargs solution, but if you just wanted the array because you plan to reuse it -
$: readarray -t ip < <( sed '1d; s/^.* //' file )
Now it's loaded.
$: declare -p ip
declare -a ip=([0]="127.0.0.2" [1]="127.0.0.3")
$: for addr in "${ip[@]}"; do ping -c 1 "$addr"; done
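If the goal is just an array of bare IPs, here is another minimal sketch using only parameter expansion, assuming every line carries the literal "ipHostNumber: " prefix and the stray quotes are only an artifact of how the sample was pasted:
#!/usr/bin/env bash
destlog="hosts.log"                      # hypothetical path; use your own $destlog
readarray -t iparray < "$destlog"        # one log line per element
ips=()
for line in "${iparray[@]}"; do
    line=${line//\"/}                    # drop the stray double quotes
    ips+=( "${line#ipHostNumber: }" )    # strip the fixed prefix, keep only the IP
done
for addr in "${ips[@]}"; do
    ping -c 1 "$addr"
done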

Related

Why doesn't this split string expression give me an array?

I am wondering why this array expression in Bash doesn't give me an array. It just gives me the first element in the string:
IFS='\n' read -r -a POSSIBLE_ENCODINGS <<< $(iconv -l)
I want to try out all available encodings to see how reading different file encodings works for a script in R, and I am using this Bash script to create text files with all possible encodings:
#!/bin/bash
IFS='\n' read -r -a POSSIBLE_ENCODINGS <<< $(iconv -l)
echo "${POSSIBLE_ENCODINGS[#]}"
for CURRENT_ENCODING in "${POSSIBLE_ENCODINGS[#]}"
do
TRIMMED=$(echo $CURRENT_ENCODING | sed 's:/*$::')
iconv --verbose --from-code=UTF-8 --to-code="$TRIMMED" --output=encoded-${TRIMMED}.txt first_file.txt
echo "Current encoding: ${TRIMMED}"
echo "Output file:encoded-${TRIMMED}.txt"
done
EDIT: Code edited according to answers below:
#!/bin/bash
readarray -t possibleEncodings <<< "$(iconv -l)"
echo "${possibleEncodings[#]}"
for currentEncoding in "${possibleEncodings[#]}"
do
trimmedEncoding=$(echo $currentEncoding | sed 's:/*$::')
echo "Trimmed encoding: ${trimmedEncoding}"
iconv --verbose --from-code=UTF-8 --to-code="$trimmedEncoding" --output=encoded-${trimmedEncoding}.txt first_file.txt
echo "Current encoding: ${trimmedEncoding}"
echo "Output file:encoded-${trimmedEncoding}.txt"
done
You could just use readarray/mapfile instead, which are tailor-made for reading multi-line output into an array.
mapfile -t possibleEncodings < <(iconv -l)
The here-string is unnecessary when you can just run the command with process substitution. The <() makes the command's output appear as a file for mapfile to read from.
As for why your original attempt didn't work: you call read only once, but there are still strings to read on the subsequent lines. You either need to read until EOF in a loop, or use mapfile as above, which does the job for you.
As a side note, always use lowercase letters for user-defined variable/array and function names. This lets you distinguish your variables from the shell's own environment variables, which are uppercase.
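Putting those pieces together, a minimal sketch of the corrected script under those conventions (assuming first_file.txt exists as in the question; the trailing-slash trimming mirrors the sed call from the question):
#!/bin/bash
# Read every line of `iconv -l` into an array, one element per line.
mapfile -t possible_encodings < <(iconv -l)

for current_encoding in "${possible_encodings[@]}"; do
    trimmed=$(sed 's:/*$::' <<< "$current_encoding")   # strip trailing slashes, e.g. "UTF-8//" -> "UTF-8"
    iconv --verbose --from-code=UTF-8 --to-code="$trimmed" \
          --output="encoded-${trimmed}.txt" first_file.txt
    echo "Current encoding: ${trimmed}"
    echo "Output file: encoded-${trimmed}.txt"
done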
Because read reads only one line, the following while loop can be used:
arr=()
while read -r line; do
arr+=( "$line" )
done <<< "$(iconv -l)"
Otherwise, there is also the readarray builtin:
readarray -t arr <<< "$(iconv -l)"
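If you also want to avoid spawning sed for every element, here is a pure-bash sketch of the trimming step that is intended to be equivalent to sed 's:/*$::' (an assumption worth checking against your iconv output):
readarray -t encodings <<< "$(iconv -l)"
for enc in "${encodings[@]}"; do
    # Trim any run of trailing slashes: "UTF-8//" -> "UTF-8"
    while [[ $enc == */ ]]; do enc=${enc%/}; done
    echo "$enc"
done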

Why isn't my bash array append working?

This script was working just fine on an AIX box, but now on a Red Hat Linux box the arrays just don't seem to work. The Bash version on the new box is 4.1.2.
I declare my array
declare -a gridNames=()
I get information about a grid
gridstats=`snmpwalk -v 2c -c splunk $host gridStatsTable -m $APPLIANCEMIB -OUQs -Ln`
As well as getting the stats from the above, I reuse it to find all the gridNames, and then I will use the array of gridNames to get stats about their maps.
while read -r process; do
gridNames=(${gridNames[@]} `grep gridName | awk -F "\"" '{print $(NF-1)}'`)
done <<< "$gridstats"
The awk part is tested and correctly returns a list of grid names (just one in this case), but when I echo the array gridNames, it's empty.
I have also tried using
gridNames+=(`grep gridName | awk -F "\"" '{print $(NF-1)}'`)
but that doesn't work either.
You need to use the += operator to append elements to the array, and feed the loop with process substitution:
while read -r process; do
gridNames+=( $(grep 'gridName' <<< "$process" | awk -F '"' '{print $(NF-1)}') )
done < <(snmpwalk -v 2c -c splunk $host gridStatsTable -m $APPLIANCEMIB -OUQs -Ln)
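If you only need the grid names, here is a sketch of an alternative that drops the read loop entirely and lets readarray capture the filtered snmpwalk output (it assumes the same $host and $APPLIANCEMIB variables from the question):
# Build the array in one step from the snmpwalk output.
readarray -t gridNames < <(
    snmpwalk -v 2c -c splunk "$host" gridStatsTable -m "$APPLIANCEMIB" -OUQs -Ln |
    awk -F '"' '/gridName/ {print $(NF-1)}'
)
declare -p gridNames    # inspect what was captured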

sh shell: how do I grab and store values, which may have spaces, in an array?

I am trying to write a script to grab the users from the passwd file
USERS_LIST=( $( cat /etc/passwd | cut -d":" -f1 ) )
The above did the trick up until now, because I only had users with no spaces in their names.
However, this is not the case anymore. I need to be able to resolve usernames that may very well have spaces in their names.
I tried reading the file line by line, but the same problem exists (this is one line, but I have indented it here for clarity):
tk=($( while read line ; do
j=$(echo ${line} | cut -d":" -f1 )
echo "$j"
done < /etc/passwd )
)
Unfortunately, if I try to print the array, usernames with a space are split into two array cells.
So the username "named user" will occupy array locations [0] and [1].
How can I fix that in sh shell?
thank you for your help!
Arrays are bash (and ksh, and zsh) features not present in POSIX sh, so I'm assuming that you mean to ask about bash. You can't store anything in an array in sh, since sh doesn't have arrays.
Don't populate an array that way.
users_list=( $( cat /etc/passwd | cut -d":" -f1 ) )
...string-splits and glob-expands contents. Instead:
# This requires bash 4.0 or later
mapfile -t users_list < <(cut -d: -f1 </etc/passwd)
...or...
IFS=$'\n' read -r -d '' -a users_list < <(cut -d: -f1 </etc/passwd)
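A quick sanity-check sketch: each username should land in exactly one element, spaces included.
printf 'users found: %d\n' "${#users_list[@]}"   # element count
printf '[%s]\n' "${users_list[@]}"               # one bracketed username per line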
Now, if you really want POSIX sh compatibility, there is one array -- exactly one, the argument list. You can overwrite it if you see fit.
set --
cut -d: -f1 </etc/passwd >tempfile
while read -r username; do
set -- "$#" "$username"
done <tempfile
At that point, "$@" is an array of usernames.
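For example, still in plain POSIX sh, you can then walk the positional parameters (a sketch continuing the snippet above):
# Quoting "$@" keeps each username, embedded spaces and all, as one word.
for username in "$@"; do
    printf 'user: %s\n' "$username"
done
rm -f tempfile    # clean up the helper file from the snippet above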

Execute bash command stored in associative array over SSH, store result

For a larger project that's not relevant, I need to collect system stats from the local system or a remote system. Since I'm collecting the same stats either way, I'm preventing code duplication by storing the stats-collecting commands in a Bash associative array.
declare -A stats_cmds
# Actually contains many more key:value pairs, similar style
stats_cmds=([total_ram]="$(free -m | awk '/^Mem:/{print $2}')")
I can collect local system stats like this:
get_local_system_stats()
{
# Collect stats about local system
complex_data_structure_that_doesnt_matter=${stats_cmds[total_ram]}
# Many more similar calls here
}
A precondition of my script is that ~/.ssh/config is set up such that ssh $SSH_HOSTNAME works without any user input. I would like something like this:
get_remote_system_stats()
{
# Collect stats about remote system
complex_data_structure_that_doesnt_matter=`ssh $SSH_HOSTNAME ${stats_cmds[total_ram]}`
}
I've tried every combination of single quotes, double quotes, backticks and such that I can imagine. Some combinations result in the stats command getting executed too early (bash: 7986: command not found), others cause syntax errors, others return null (single quotes around the stats command) but none store the proper result in my data structure.
How can I evaluate a command, stored in an associative array, on a remote system via SSH and store the result in a data structure in my local script?
Make sure that the commands you store in your array don't get expanded when you assign your array!
Also note that the complex-looking quoting style is necessary when nesting single quotes. See this SO post for an explanation.
stats_cmds=([total_ram]='free -m | awk '"'"'/^Mem:/{print $2}'"'"'')
And then just launch your ssh as:
sh "$ssh_hostname" "${stats_cmds[total_ram]}"
(yeah, I lowercased your variable name because uppercase variable names in Bash are really sick). Then:
get_local_system_stats() {
# Collect stats about local system
complex_data_structure_that_doesnt_matter=$(eval "${stats_cmds[total_ram]}")   # eval is needed locally because the stored command contains a pipe
# Many more similar calls here
}
and
get_remote_system_stats() {
# Collect stats about remote system
complex_data_structure_that_doesnt_matter=$(ssh "$ssh_hostname" "${stats_cmds[total_ram]}")
}
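If the '"'"' dance is hard to read, one alternative sketch is to write the entry inside double quotes and escape only the $ that Bash would otherwise expand; this stores exactly the same command string as the single-quoted version above:
declare -A stats_cmds
# Double quotes let the inner single quotes stay literal; only $ needs escaping.
stats_cmds=([total_ram]="free -m | awk '/^Mem:/{print \$2}'")

printf '%s\n' "${stats_cmds[total_ram]}"   # prints: free -m | awk '/^Mem:/{print $2}'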
First, I'm going to suggest an approach that makes minimal changes to your existing implementation. Then, I'm going to demonstrate something closer to best practices.
Smallest Modification
Given your existing code:
declare -A remote_stats_cmds
remote_stats_cmds=([total_ram]='free -m | awk '"'"'/^Mem:/{print $2}'"'"''
[used_ram]='free -m | awk '"'"'/^Mem:/{print $3}'"'"''
[free_ram]='free -m | awk '"'"'/^Mem:/{print $4}'"'"''
[cpus]='nproc'
[one_min_load]='uptime | awk -F'"'"'[a-z]:'"'"' '"'"'{print $2}'"'"' | awk -F "," '"'"'{print $1}'"'"' | tr -d " "'
[five_min_load]='uptime | awk -F'"'"'[a-z]:'"'"' '"'"'{print $2}'"'"' | awk -F "," '"'"'{print $2}'"'"' | tr -d " "'
[fifteen_min_load]='uptime | awk -F'"'"'[a-z]:'"'"' '"'"'{print $2}'"'"' | awk -F "," '"'"'{print $3}'"'"' | tr -d " "'
[iowait]='cat /proc/stat | awk '"'"'NR==1 {print $6}'"'"''
[steal_time]='cat /proc/stat | awk '"'"'NR==1 {print $9}'"'"'')
...one can evaluate these locally as follows:
result=$(eval "${remote_stats_cmds[iowait]}")
echo "$result" # demonstrate value retrieved
...or remotely as follows:
result=$(ssh "$hostname" bash <<<"${remote_stats_cmds[iowait]}")
echo "$result" # demonstrate value retrieved
No separate form is required.
The Right Thing
Now, let's talk about an entirely different way to do this:
# no awful nested quoting by hand!
collect_total_ram() { free -m | awk '/^Mem:/ {print $2}'; }
collect_used_ram() { free -m | awk '/^Mem:/ {print $3}'; }
collect_cpus() { nproc; }
...and then, to evaluate locally:
result=$(collect_cpus)
...or, to evaluate remotely:
result=$(ssh "$hostname" bash <<<"$(declare -f collect_cpus); collect_cpus")
...or, to iterate through defined functions with the collect_ prefix and do both of these things:
declare -A local_results
declare -A remote_results
while IFS= read -r funcname; do
local_results["${funcname#collect_}"]=$("$funcname")
remote_results["${funcname#collect_}"]=$(ssh "$hostname" bash <<<"$(declare -f "$funcname"); $funcname")
done < <(compgen -A function collect_)
...or, to collect all the items into a single remote array in one pass, avoiding extra SSH round-trips and not eval'ing or otherwise taking security risks with results received from the remote system:
remote_cmd=""
while IFS= read -r funcname; do
remote_cmd+="$(declare -f "$funcname"); printf '%s\0' \"$funcname\" \"\$(\"$funcname\")\";"
done < <(compgen -A function collect_)
declare -A remote_results=( )
while IFS= read -r -d '' funcname && IFS= read -r -d '' result; do
remote_results["${funcname#collect_}"]=$result
done < <(ssh "$hostname" bash <<<"$remote_cmd")
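To see what came back, a short usage sketch over the populated array:
# Print every stat collected from the remote host.
for key in "${!remote_results[@]}"; do
    printf '%s = %s\n' "$key" "${remote_results[$key]}"
done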

Proper way to keep an array from a pipe in Bash

I saw quite a few different solutions for keeping an array populated from a pipe, but none seemed to do the trick for me. Currently my script works correctly, but the array "databasesarray" is lost after "done". How would I go about keeping this information with my complex pipe scheme?
databasesarray=()
N=0
dbs -d 123123 | grep db|awk '{print $2}'|while read db;
do
databasesarray[$N]="$db";
databasesarray[$N]+=$(gdb $db|grep dn);
echo ${N} ${databasesarray[$N]};
N=$(($N + 1));
done
A better and more efficient way of filling up the array in a loop:
databasesarray=()
while read -r db; do
databasesarray+=( "$db $(gdb "$db"|grep "dn")" )
done < <(dbs -d 123123 | awk '/db/{print $2}')
Your grep and awk can be combined into one awk call.
Instead of piping into while, it is better to use the process substitution < <(...) syntax, so the loop runs in the current shell and the array survives.
PS: You could also use read -a to fill the array, though note that read reads a single line, so this only splits one line of output into elements:
read -a databasesarray < <(dbs -d 123123 | awk '/db/{print $2}')
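Another option, if you would rather keep the pipe-into-while layout: in a non-interactive script (where job control is off by default), Bash 4.2 and newer can run the last pipeline segment in the current shell when the lastpipe option is enabled, so the array assignments survive the done. A sketch, reusing the dbs and gdb commands from the question:
#!/bin/bash
shopt -s lastpipe                # run the final pipeline command in this shell, not a subshell
databasesarray=()
dbs -d 123123 | awk '/db/{print $2}' | while read -r db; do
    databasesarray+=( "$db $(gdb "$db" | grep "dn")" )
done
printf '%s\n' "${databasesarray[@]}"   # the array is still populated after the loop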
