How to compare JSON data with an array using a shell script?

I have one JSON object, e.g.:
var bbview ={
"tbl_am_api":[
{
"Modified_User":"user1",
"Modified_Time":"04-Jul-2018 01:40:05",
"Line_Number":"SS001",
"Service_Type":"BB3",
"Status":"Yes",
"ID":3144526000014337832,
"Added_Time":"04-May-2018 11:37:29"
},
{
"Modified_User":"user2",
"Modified_Time":"04-Jul-2018 01:40:05",
"Line_Number":"SS002",
"Service_Type":"BB2",
"Status":"Yes",
"ID":3144526000014337832,
"Added_Time":"04-May-2018 11:37:29"
},
{
"Modified_User":"user3",
"Modified_Time":"04-Jul-2018 01:40:05",
"Line_Number":"SS004",
"Service_Type":"BB1",
"Status":"No",
"ID":3144526000014337832,
"Added_Time":"04-May-2018 11:37:29"
}
]
};
I want to compare this JSON data with an array, using Line_Number as the primary key.
arrayA = {[{Line_Number : SS001, Service_Type : BB3; Status : Yes}]}
arrayA has Line_Number SS001. Find this Line_Number in the JSON and check whether the Service_Type and Status values are the same or not. I want to write this as a shell script in a bash file. I am not proficient in shell scripting. Please help me.
Update:
I tried the following bash code, but it still fails. Please advise me.
echo "Download FMS AM API File"
rm -rf tbl_am_api_Report?authtoken=da84d49f334c33b88d30dd2c947a4ff0 && wget -q https://creator.zoho.com/api/json/fixed-management-system/view/tbl_am_api_Report?authtoken=da84d49f334c33b88d30dd2c947a4ff0&scope=creatorapi&zc_ownername=tmlbroadband < /dev/null
cat > tbl_api_Report?authtoken=da84d49f334c33b88d30dd2c947a4ff0 # read json file
for row in $(echo "${apiview}" | jq -r '.[] | @base64'); do
_jq() {
echo ${row} | base64 --decode | jq -r ${1}
}
echo $(_jq '.name') >> info.txt
done
mail -s "Test email" aa@gmail.com -A info.txt < /dev/null

The value of arrayA is not JSON so I am going to let you figure out how to extract the values of Line_Number and Status from arrayA (but see below). Once those values are available, one could proceed as illustrated here:
#!/bin/bash
bbview='...' # as above
echo "$bbview" |
jq --arg Line_Number SS001 --arg Status Yes '
.tbl_am_api
| map(select(.Line_Number==$Line_Number and .Status==$Status)) '
Output
[
{
"Modified_User": "user1",
"Modified_Time": "04-Jul-2018 01:40:05",
"Line_Number": "SS001",
"Service_Type": "BB3",
"Status": "Yes",
"ID": 3144526000014338000,
"Added_Time": "04-May-2018 11:37:29"
}
]
true/false
Under a different reading of the question, the following variant might be relevant:
echo "$bbview" |
jq --arg Line_Number SS001 --arg Status Yes '
.tbl_am_api
| map(select(.Line_Number==$Line_Number) | .Status==$Status) '
arrayA
If you are using a version of bash that supports associative arrays, you could define arrayA as an associative array, like so:
declare -A arrayA
arrayA=([Line_Number]=SS001 [Service_Type]=BB3 [Status]=Yes)
Then to retrieve the value associated with Line_Number, you would write: ${arrayA[Line_Number]}; etc.
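For example, once the JSON-side values have been extracted (into the hypothetical variables json_service and json_status below), the comparison step itself is plain bash. A minimal sketch, not a complete solution:

```shell
#!/bin/bash
# Sketch of the comparison step only; json_service and json_status are
# assumed to hold values already pulled out of the JSON (hypothetical names).
declare -A arrayA=([Line_Number]=SS001 [Service_Type]=BB3 [Status]=Yes)

json_service=BB3
json_status=Yes

if [ "${arrayA[Service_Type]}" = "$json_service" ] &&
   [ "${arrayA[Status]}" = "$json_status" ]; then
  result=same
else
  result=different
fi
echo "$result"   # same
```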

Related

return nvme list as array using jq and loop thru it

This is how I am grabbing all the NVME volumes:
all_nvme_volumes=$(sudo nvme list -o json | jq .Devices[].DevicePath)
This is how the output looks:
"/dev/nvme0n1" "/dev/nvme1n1" "/dev/nvme2n1" "/dev/nvme3n1" "/dev/nvme4n1" "/dev/nvme6n1"
How do I loop through them and process them individually?
I tried for r in "${all_nvme_volumes[@]}"; do echo "Device Name: $r"; done but the output is Device Name: "/dev/nvme0n1" "/dev/nvme1n1" "/dev/nvme2n1" "/dev/nvme3n1" "/dev/nvme4n1" "/dev/nvme6n1",
which is one string instead of each element of the array.
Populating a bash array with mapfile from null-delimited raw output from jq:
#!/usr/bin/env bash
mapfile -d '' all_nvme_volumes < <(
sudo nvme list --output-format=json |
jq --join-output '.Devices[].DevicePath + "\u0000"'
)
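The NUL-delimited pattern can be sanity-checked without nvme or jq, using printf as a stand-in for the pipeline (a toy example; -t is added here so the delimiters are dropped):

```shell
#!/usr/bin/env bash
# printf emits the paths NUL-separated, just as the jq filter above would
mapfile -d '' -t vols < <(printf '%s\0' /dev/nvme0n1 /dev/nvme1n1)
echo "${#vols[@]}"   # 2
echo "${vols[1]}"    # /dev/nvme1n1
```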
A solution for bash < 4.4:
#!/bin/bash
IFS=$'\t' read -r -a all_nvme_volumes < <(
sudo nvme list -o json | jq -r '[ .Devices[].DevicePath ] | @tsv'
)
note: device paths shouldn't be escaped by @tsv, so you won't need to unescape the values; but in case you use this trick for other purposes, you can unescape a value with printf -v value '%b' "$value"
How do I loop through them and process them individually?
Well, once you have the array, you can loop through its elements with:
for nvme_volume in "${all_nvme_volumes[@]}"
do
# process "$nvme_volume"
done
But, if you only need to loop through the nvme volumes without storing them, then you can use @LéaGris' null-delimiter method with a while loop:
#!/bin/bash
while IFS='' read -r -d '' nvme_volume
do
# process "$nvme_volume"
done < <(sudo nvme list -o json | jq -j '.Devices[].DevicePath + "\u0000"')

Use jq to replace txt file array string values from dictionary

I have a dictionary that looks like this:
{
"uid": "d6fc3e2b-0001a",
"name": "ABC Mgmt",
"type": "host"
}
{
"uid": "d6fc3e2b-0002a",
"name": "Server XYZ",
"type": "group"
}
{
"uid": "d6fc3e2b-0003a",
"name": "NTP Primary",
"type": "host"
}
{
"uid": "d6fc3e2b-0004a",
"name": "H-10.10.10.10",
"type": "host"
}
Then I have a txt file:
"d6fc3e2b-0001a"
"d6fc3e2b-0001a","d6fc3e2b-0002a","d6fc3e2b-0003a"
"d6fc3e2b-0004a"
Expected Output:
"ABC Mgmt"
"ABC Mgmt","Server XYZ","NTP Primary"
"H-10.10.10.10"
I have some trouble making jq use an array which is not in JSON format. I tried various solutions that I found, but none of them worked. I am rather new to scripting and need some help.
input=file.txt
while IFS= read -r line
do
{
value=$(jq -r --arg line "$line" \
'from_entries | .[($line | split(","))[]]' \
dictionary.json)
echo $name
}
done < "$input"
In the following solution, the dictionary file is read using the --slurpfile command-line option, and the lines of "text" are read using inputs in conjunction with the -n command-line option. The -r command-line option is used in conjunction with the @csv filter to produce the desired output.
Invocation
jq -n -R -r --slurpfile dict stream.json -f program.jq stream.txt
program.jq
(INDEX($dict[]; .uid) | map_values(.name)) as $d
| inputs
| split(",")
| map(fromjson)
| map($d[.])
| @csv
Caveat
The above assumes that the quoted values in stream.txt do not themselves contain commas.
If the quoted values in stream.txt do contain commas, then it would be much easier if the values given on each line in stream.txt were given as JSON entities, e.g. as an array of strings, or as a sequence of JSON strings with no separator character.
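For instance, if every line of stream.txt were itself a JSON array of uids, the split/fromjson handling would collapse to a single fromjson. A sketch (requires jq 1.6 for INDEX), with a two-entry inline dictionary standing in for stream.json:

```shell
#!/usr/bin/env bash
# Hypothetical variant: each input line is a JSON array such as
# ["d6fc3e2b-0001a","d6fc3e2b-0002a"], so commas inside names are safe.
dict='{"uid":"d6fc3e2b-0001a","name":"ABC Mgmt"}
{"uid":"d6fc3e2b-0002a","name":"Server XYZ"}'

out=$(printf '%s\n' '["d6fc3e2b-0001a","d6fc3e2b-0002a"]' |
  jq -n -R -r --slurpfile dict <(printf '%s\n' "$dict") '
    (INDEX($dict[]; .uid) | map_values(.name)) as $d
    | inputs | fromjson | map($d[.]) | @csv')
echo "$out"   # "ABC Mgmt","Server XYZ"
```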
Solution to problem described in a comment
Invocation
< original.json jq -r --slurpfile dict stream.json -f program.jq
program.jq
(INDEX($dict[]; .uid) | map_values(.name)) as $d
| .source
| map($d[.])
| @csv

Assigning an Array Parsed With jq to Bash Script Array

I parsed a json file with jq like this :
# cat test.json | jq '.logs' | jq '.[]' | jq '._id' | jq -s
It returns an array like this : [34,235,436,546,.....]
Using bash script i described an array :
# declare -a msgIds = ...
This array uses () instead of [], so when I pass the array given above to it, it won't work.
([324,32,45..]) causes a problem. If I remove the jq -s, an array forms with only one member in it.
Is there a way to solve this issue?
We can solve this problem in two ways:
Input string:
// test.json
{
"keys": ["key1","key2","key3"]
}
Approach 1:
1) Use jq -r (output raw strings, not JSON texts).
KEYS=$(jq -r '.keys' test.json)
echo $KEYS
# Output: [ "key1", "key2", "key3" ]
2) Use @sh (converts the input string to a series of space-separated strings). It removes the square brackets [] and commas (,) from the string.
KEYS=$(<test.json jq -r '.keys | @sh')
echo $KEYS
# Output: 'key1' 'key2' 'key3'
3) Use tr to remove the single quotes from the string output. To delete specific characters, use the -d option of tr.
KEYS=$(<test.json jq -r '.keys | @sh' | tr -d \')
echo $KEYS
# Output: key1 key2 key3
4) We can convert the space-separated string to an array by placing our string output in round brackets ().
This is also called compound assignment, where we declare the array with a bunch of values.
ARRAYNAME=(value1 value2 .... valueN)
#!/bin/bash
KEYS=($(<test.json jq -r '.keys | @sh' | tr -d \'\"))
echo "Array size: " ${#KEYS[@]}
echo "Array elements: "${KEYS[@]}
# Output:
# Array size: 3
# Array elements: key1 key2 key3
Approach 2:
1) Use jq -r to get the string output, then use tr to delete characters like square brackets, double quotes and commas.
#!/bin/bash
KEYS=$(jq -r '.keys' test.json | tr -d '[],"')
echo $KEYS
# Output: key1 key2 key3
2) Then we can convert the resulting string to an array by placing the string output in round brackets ().
#!/bin/bash
KEYS=($(jq -r '.keys' test.json | tr -d '[]," '))
echo "Array size: " ${#KEYS[@]}
echo "Array elements: "${KEYS[@]}
# Output:
# Array size: 3
# Array elements: key1 key2 key3
To correctly parse values that have spaces, newlines (or any other arbitrary characters) just use jq's #sh filter and bash's declare -a. (No need for a while read loop or any other pre-processing)
// foo.json
{"data": ["A B", "C'D", ""]}
str=$(jq -r '.data | @sh' foo.json)
declare -a arr="($str)" # must be quoted like this
$ declare -p arr
declare -a arr=([0]="A B" [1]="C'D" [2]="")
The reason that this works correctly is that @sh will produce a space-separated list of shell-quoted words:
$ echo "$str"
'A B' 'C'\''D' ''
and this is exactly the format that declare expects for an array definition.
Use jq -r to output a string "raw", without JSON formatting, and use the @sh formatter to format your results as a string for shell consumption. Per the jq docs:
@sh:
The input is escaped suitable for use in a command-line for a POSIX shell. If the input is an array, the output will be a series of space-separated strings.
So you can do, e.g.,
msgids=($(<test.json jq -r '.logs[]._id | @sh'))
and get the result you want.
From the jq FAQ (https://github.com/stedolan/jq/wiki/FAQ):
Q: How can a stream of JSON texts produced by jq be converted into a bash array of corresponding values?
A: One option would be to use mapfile (aka readarray), for example:
mapfile -t array <<< "$(jq -c '.[]' input.json)"
An alternative that might be indicative of what to do in other shells is to use read -r within a while loop. The following bash script populates an array, x, with JSON texts. The key points are the use of the -c option, and the use of the bash idiom while read -r value; do ... done < <(jq .......):
#!/bin/bash
x=()
while read -r value
do
x+=("$value")
done < <(jq -c '.[]' input.json)
++ To resolve this, we can use a very simple approach:
++ Since I am not aware of your input file, I am creating a file input.json with the following contents:
input.json:
{
"keys": ["key1","key2","key3"]
}
++ Use jq to get the value from the above file input.json:
Command: cat input.json | jq -r '.keys | @sh'
Output: 'key1' 'key2' 'key3'
Explanation: | @sh removes [ and "
++ To remove the single quotes as well, we use tr
command: cat input.json | jq -r '.keys | @sh' | tr -d \'
Explanation: use tr -d to delete the ' characters
++ To store this in a bash array we use () with backticks and print it:
command:
KEYS=(`cat input.json | jq -r '.keys | @sh' | tr -d \'`)
To print all the entries of the array: echo "${KEYS[*]}"

Bash array values as variables

Is it possible to use array values as variables?
For example, I have this script:
#!/bin/bash
SOURCE=$(curl -k -s $1 | sed 's/{//g;s/}//g;s/,/"\n"/g;s/:/=/g;s/"//g' | awk -F"=" '{ print $1 }')
JSON=$(curl -k -s $1 | sed 's/{//g;s/}//g;s/,/"\n"/g;s/:/=/g;s/"//g' | awk -F"=" '{ print $NF }')
data=$2
readarray -t prot_array <<< "$SOURCE"
readarray -t pos_array <<< "$JSON"
for ((i=0; i<${#prot_array[@]}; i++)); do
echo "${prot_array[i]}" "${pos_array[i]}" | sed 's/NOK/0/g;s/OK/1/g' | grep $2 | awk -F' ' '{ print $2,$3,$4 }'
done
EDIT:
I just added: grep $2 | awk -F' ' '{ print $2,$3,$4 }'
Usage:
./json.sh URL
Sample (very short) output:
DATABASE 1
STATUS 1
I don't want to echo out all the lines; I would like to use DATABASE STATUS as the variable $DATABASE and echo that out.
I just need the DATABASE (or any other) value from the command line.
Is it somehow possible to use something like this?
./json.sh URL $DATABASE
Happy to explain more if needed.
EDIT:
curl output without any formattings etc:
{
"VERSION":"R3.1",
"STATUS":"OK",
"DATABASES":{
"READING":"OK"
},
"TIMESTAMP":"2017-03-08-16-20-35"
}
Output using script:
VERSION R3.1
STATUS 1
DATABASES 1
TIMESTAMP 2017-03-08-16-21-54
What I want is described above: for example, use DATABASE as the variable $DATABASE and somehow get the value "1".
EDIT:
Random json from uconn.edu
./json.sh https://github.uconn.edu/raw/nam12023/novaLauncher/master/manifest.json
Another:
./json.sh https://gitlab.uwe.ac.uk/dc2-roskilly/angular-qs/raw/master/.npm/nan/2.4.0/package/package.json
Last output begins with:
name nan
version 2.4.0
From command line: ./json.sh URL version
At least it works for me.
I think you want to use jq something like this:
$ curl -k -s "$1" | jq --arg d DATABASES -r '
"VERSION \(.VERSION)",
"STATUS \(if .STATUS == "OK" then 1 else 0 end)",
"DATABASES \(if .[$d].READING == "OK" then 1 else 0 end)",
"TIMESTAMP \(.TIMESTAMP)"
'
VERSION R3.1
STATUS 1
DATABASES 1
TIMESTAMP 2017-03-08-16-20-35
(I'm probably missing a simpler way to convert a boolean value to an integer.)
Quick explanation:
The ,-separated strings each become a separate output line.
The -r option outputs a raw string, rather than a JSON string value.
The name of the database field is passed using the --arg option.
\(...) is jq's interpolation operator; the contents are evaluated as a JSON expression and the result is inserted into the string.
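The interpolation and the OK-to-1 trick can be tried standalone, without curl, by building the sample object inline (a minimal sketch):

```shell
#!/usr/bin/env bash
# --null-input (-n) builds the object in the filter instead of reading stdin
out=$(jq -nr '{"STATUS":"OK"} | "STATUS \(if .STATUS == "OK" then 1 else 0 end)"')
echo "$out"   # STATUS 1
```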

Accessing a JSON object in Bash - associative array / list / another model

I have a Bash script which gets data in JSON, I want to be able to convert the JSON into an accessible structure - array / list / or other model which would be easy to parse the nested data.
Example:
{
"SALUTATION": "Hello world",
"SOMETHING": "bla bla bla Mr. Freeman"
}
I want to get the value like the following: echo ${arr[SOMETHING]}
[ Different approach is optional as well. ]
If you want key and value, and based on How do i convert a json object to key=value format in JQ, you can do:
$ jq -r "to_entries|map(\"\(.key)=\(.value|tostring)\")|.[]" file
SALUTATION=Hello world
SOMETHING=bla bla bla Mr. Freeman
In a more general way, you can store the values into an array (myarray[key] = value) like this, just by feeding the jq output to the while loop with the while ... do; ... done < <(command) syntax:
declare -A myarray
while IFS="=" read -r key value
do
myarray[$key]="$value"
done < <(jq -r 'to_entries|map("\(.key)=\(.value)")|.[]' file)
And then you can loop through the values like this:
for key in "${!myarray[@]}"
do
echo "$key = ${myarray[$key]}"
done
For this given input, it returns:
SALUTATION = Hello world
SOMETHING = bla bla bla Mr. Freeman
Although this question is answered, I wasn't able to fully satisfy my
requirements with the posted answer. Here is a little write-up that'll help any
bash newcomers.
Foreknowledge
A basic associative array declaration
#!/bin/bash
declare -A associativeArray=([key1]=val1 [key2]=val2)
You can also use quotes (', ") around the declaration, its keys, and
values.
#!/bin/bash
declare -A 'associativeArray=([key1]=val1 [key2]=val2)'
And you can delimit each [key]=value pair via space or newline.
#!/bin/bash
declare -A associativeArray=([key1]=value1
['key2']=value2 [key3]='value3'
['key4']='value2' ["key5"]="value3"
["key6"]='value4'
['key7']="value5"
)
Depending on your quote variation, you may need to escape your string.
Using Indirection to access both key and value in an associative array
example () {
local -A associativeArray=([key1]=val1 [key2]=val2)
# print associative array
local key value
for key in "${!associativeArray[@]}"; do
value="${associativeArray["$key"]}"
printf '%s = %s\n' "$key" "$value"
done
}
Running the example function
$ example
key2 = val2
key1 = val1
Knowing the aforementioned tidbits allows you to derive the following snippets:
The following examples will all have the result as the example above
String evaluation
#!/usr/bin/env bash
example () {
local arrayAsString='associativeArray=([key1]=val1 [key2]=val2)'
local -A "$arrayAsString"
# print associative array
}
Piping your JSON into JQ
#!/usr/bin/env bash
# Note: usage of single quotes instead of double quotes for the jq
# filter. The former is preferred to avoid issues with shell
# substitution of quoted strings.
example () {
# Given the following JSON
local json='{ "key1": "val1", "key2": "val2" }'
# filter using `map` && `reduce`
local filter='to_entries | map("[\(.key)]=\(.value)") |
reduce .[] as $item ("associativeArray=("; . + ($item|@sh) + " ") + ")"'
# Declare and assign separately to avoid masking return values.
local arrayAsString;
# Note: no encompassing quotation (")
arrayAsString=$(jq --raw-output "${filter}" <<< "$json")
local -A "$arrayAsString"
# print associative array
}
jq -n / --null-input option + --argfile && redirection
#!/usr/bin/env bash
example () {
# /path/to/file.json contains the same json as the first two examples
local filter filename='/path/to/file.json'
# including bash variable name in reduction
filter='to_entries | map("[\(.key | @sh)]=\(.value | @sh) ")
| "associativeArray=(" + add + ")"'
# using --argfile && --null-input
local -A "$(jq --raw-output --null-input --argfile file "$filename" \
"\$file | ${filter}")"
# or for a more traceable declaration (using shellcheck or other) this
# variation moves the variable name outside of the string
# map definition && reduce replacement
filter='[to_entries[]|"["+(.key|@sh)+"]="+(.value|@sh)]|"("+join(" ")+")"'
# input redirection && --join-output
local -A associativeArray="$(jq --join-output "${filter}" < "$filename")"
# print associative array
}
Reviewing previous answers
@Ján Lalinský
To load JSON object into a bash associative array efficiently
(without using loops in bash), one can use tool 'jq', as follows.
# first, load the json text into a variable:
json='{"SALUTATION": "Hello world", "SOMETHING": "bla bla bla Mr. Freeman"}'
# then, prepare associative array, I use 'aa':
unset aa
declare -A aa
# use jq to produce text defining name:value pairs in the bash format
# using #sh to properly escape the values
aacontent=$(jq -r '. | to_entries | .[] | "[\"" + .key + "\"]=" + (.value | @sh)' <<< "$json")
# string containing whole definition of aa in bash
aadef="aa=($aacontent)"
# load the definition (because values may contain LF characters, aadef must be in double quotes)
eval "$aadef"
# now we can access the values like this: echo "${aa[SOMETHING]}"
Warning: this uses eval, which is dangerous if the json input is from unknown source (may contain malicious shell commands that eval may execute).
This could be reduced to the following
example () {
local json='{ "key1": "val1", "key2": "val2" }'
local -A associativeArray="($(jq -r '. | to_entries | .[] |
"[\"" + .key + "\"]=" + (.value | @sh)' <<< "$json"))"
# print associative array
}
@fedorqui
If you want key and value, and based on How do i convert a json object to key=value format in JQ, you can do:
$ jq -r "to_entries|map(\"\(.key)=\(.value|tostring)\")|.[]" file
SALUTATION=Hello world
SOMETHING=bla bla bla Mr. Freeman
In a more general way, you can store the values into an array (myarray[key] = value) like this, just by feeding the jq output to the while loop with the while ... do; ... done < <(command) syntax:
declare -A myarray
while IFS="=" read -r key value
do
myarray[$key]="$value"
done < <(jq -r "to_entries|map(\"\(.key)=\(.value)\")|.[]" file)
And then you can loop through the values like this:
for key in "${!myarray[@]}"
do
echo "$key = ${myarray[$key]}"
done
For this given input, it returns:
SALUTATION = Hello world
SOMETHING = bla bla bla Mr. Freeman
The main difference between this solution and my own is looping through the
array in bash or in jq.
Each solution is valid, and depending on your use case, one may be more useful
than the other.
Context: This answer was written to be responsive to a question title which no longer exists.
The OP's question actually describes objects, vs arrays.
To be sure that we help other people coming in who are actually looking for help with JSON arrays, though, it's worth covering them explicitly.
For the safe-ish case where strings can't contain newlines (and when bash 4.0 or newer is in use), this works:
str='["Hello world", "bla bla bla Mr. Freeman"]'
readarray -t array <<<"$(jq -r '.[]' <<<"$str")"
To support older versions of bash, and strings with newlines, we get a bit fancier, using a NUL-delimited stream to read from jq:
str='["Hello world", "bla bla bla Mr. Freeman", "this is\ntwo lines"]'
array=( )
while IFS= read -r -d '' line; do
array+=( "$line" )
done < <(jq -j '.[] | (. + "\u0000")' <<< "$str")
This is how it can be done recursively:
#!/bin/bash
SOURCE="$PWD"
SETTINGS_FILE="$SOURCE/settings.json"
SETTINGS_JSON=`cat "$SETTINGS_FILE"`
declare -A SETTINGS
function get_settings() {
local PARAMS="$@"
local JSON=`jq -r "to_entries|map(\"\(.key)=\(.value|tostring)\")|.[]" <<< "$1"`
local KEYS=''
if [ $# -gt 1 ]; then
KEYS="$2"
fi
while read -r PAIR; do
local KEY=''
if [ -z "$PAIR" ]; then
break
fi
IFS== read PAIR_KEY PAIR_VALUE <<< "$PAIR"
if [ -z "$KEYS" ]; then
KEY="$PAIR_KEY"
else
KEY="$KEYS:$PAIR_KEY"
fi
if jq -e . >/dev/null 2>&1 <<< "$PAIR_VALUE"; then
get_settings "$PAIR_VALUE" "$KEY"
else
SETTINGS["$KEY"]="$PAIR_VALUE"
fi
done <<< "$JSON"
}
To call it:
get_settings "$SETTINGS_JSON"
The array will be accessed like so:
${SETTINGS[grandparent:parent:child]}
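The colon-joined key scheme is ordinary associative-array indexing; with made-up keys it looks like this:

```shell
#!/bin/bash
# Hypothetical nested settings flattened into colon-joined keys
declare -A SETTINGS
SETTINGS[database:host]=localhost
SETTINGS[database:port]=5432
echo "${SETTINGS[database:port]}"   # 5432
```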
To load JSON object into a bash associative array efficiently (without using loops in bash), one can use tool 'jq', as follows.
# first, load the json text into a variable:
json='{"SALUTATION": "Hello world", "SOMETHING": "bla bla bla Mr. Freeman"}'
# then, prepare associative array, I use 'aa':
unset aa
declare -A aa
# use jq to produce text defining name:value pairs in the bash format
# using #sh to properly escape the values
aacontent=$(jq -r '. | to_entries | .[] | "[\"" + .key + "\"]=" + (.value | @sh)' <<< "$json")
# string containing whole definition of aa in bash
aadef="aa=($aacontent)"
# load the definition (because values may contain LF characters, aadef must be in double quotes)
eval "$aadef"
# now we can access the values like this: echo "${aa[SOMETHING]}"
Warning: this uses eval, which is dangerous if the json input is from unknown source (may contain malicious shell commands that eval may execute).
Building on @HelpNeeder's solution (nice one btw).
His solution wasn't really working with integers, so I made some additions: I extended the number of condition checks, so it's fair to say some performance is sacrificed.
This version works with integers and also floating point values.
SOURCE="$PWD"
SETTINGS_FILE="./test2.json"
SETTINGS_JSON=`cat "$SETTINGS_FILE"`
declare -A SETTINGS
get_settings() {
local PARAMS="$@"
local JSON=`jq -r "to_entries|map(\"\(.key)=\(.value|tostring)\")|.[]" <<< "$1"`
local KEYS=''
if [ $# -gt 1 ]; then
KEYS="$2"
fi
while read -r PAIR; do
local KEY=''
if [ -z "$PAIR" ]; then
break
fi
IFS== read PAIR_KEY PAIR_VALUE <<< "$PAIR"
if [ -z "$KEYS" ]; then
KEY="$PAIR_KEY"
else
KEY="$KEYS:$PAIR_KEY"
fi
res=$(jq -e . 2>/dev/null <<< "$PAIR_VALUE")
exitCode=$?
check=`echo "$PAIR_VALUE" | grep -E '^-?[0-9]*\.?[0-9]+$'`
# if [ "${res}" ] && [ $exitCode -eq "0" ] && [[ ! "${PAIR_VALUE}" == ?(-)+([0-9]) ]] ALTERNATIVE, works only for integer (not floating point)
if [ "${res}" ] && [ $exitCode -eq "0" ] && [[ "$check" == '' ]]
then
get_settings "$PAIR_VALUE" "$KEY"
else
SETTINGS["$KEY"]="$PAIR_VALUE"
fi
done <<< "$JSON"
}
get_settings "$SETTINGS_JSON"
Solution: use jq (a lightweight and flexible command-line JSON processor).
In bash I'd rather assign the JSON object to a variable and use jq to access and parse the right result from it. It's more convenient than parsing this structure with arrays, and it comes out of the box with many features, such as accessing nested and complex objects, select methods, built-in operators and functions, regex support, comparisons, etc.
example:
example='{"SALUTATION": "Hello world","SOMETHING": "bla bla bla Mr. Freeman"}'
echo "$example" | jq .SOMETHING
# output:
"bla bla bla Mr. Freeman"
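If the value is to feed another shell command, adding jq's -r flag strips the surrounding JSON quotes:

```shell
#!/bin/bash
example='{"SALUTATION": "Hello world","SOMETHING": "bla bla bla Mr. Freeman"}'
val=$(echo "$example" | jq -r .SOMETHING)
echo "$val"   # bla bla bla Mr. Freeman
```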
