This question already has answers here:
How to format a bash array as a JSON array
(6 answers)
Closed 5 months ago.
I am trying to write a bash script that generates UUIDs and writes them to a file in JSON array format.
I have written a simple script that generates the IDs and writes them to a file, but I am running into some issues.
Here is my implementation:
function uud {
for ((i=1;i<=$1;i++));
do
echo "`uuidgen`", >> $file
done
}
for file in file1.json file2.json;
do
$( uud $1)
done
I am having issues with converting the IDs into quoted strings, and with converting the results into a JSON array.
Currently my solution prints the IDs into the file in this format:
caca8fef-42d6-4b21-9d6b-0e40d348bd53,
e6e12fb5-4304-4ba9-b895-931bb4b58fbf,
df699ecd-d887-413e-8383-2a98ac2fa22f,
and this is what I want to achieve. How can I go about getting this result?
[
"caca8fef-42d6-4b21-9d6b-0e40d348bd53",
"e6e12fb5-4304-4ba9-b895-931bb4b58fbf",
"df699ecd-d887-413e-8383-2a98ac2fa22f"
]
Use jq to generate the JSON.
uud () {
for ((i=0; i< $1; i++)); do
uuidgen
done
}
for file in file1.json file2.json; do
uud 5 | jq -Rs 'rtrimstr("\n") | split("\n")' > "$file"
done
-Rs reads the entire input into a single JSON string, which you then split (after removing the trailing newline) on newlines to produce the desired array.
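To see what those pieces do in isolation, here is a minimal demo on fixed input (the -c flag is added only to keep jq's output on one line):

```shell
# Three newline-terminated lines are read as the single JSON string
# "a\nb\nc\n"; the trailing newline is trimmed, the rest split on "\n".
printf 'a\nb\nc\n' | jq -cRs 'rtrimstr("\n") | split("\n")'
# prints ["a","b","c"]
```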
One blogger suggests using two jq processes, which is more expensive but arguably simpler:
for file in file1.json file2.json; do
uud 5 | jq -R . | jq -s . > "$file"
done
I have adjusted your script slightly; try it:
function uud {
echo "[" > "$file"
for ((i=1; i<$1; i++)); do
echo "\"$(uuidgen)\"," >> "$file"
done
echo "\"$(uuidgen)\"" >> "$file" # last item in array without comma
echo "]" >> "$file"
}
for file in file1.json file2.json; do
uud "$1"
done
After you've run your existing loop...
#!/bin/sh
cat > ed1 <<EOF
%s/^/"/
%s/,$/",/
$s/,$//
1i
[
.
$a
]
.
wq
EOF
ed -s file < ed1
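Assuming the same input file (one id per line, each line ending in a comma, as shown in the question), the same transformation can be sketched as a single GNU sed pass; file1.json stands in for whichever file your loop produced:

```shell
# Quote each line, keep the comma outside the closing quote, drop the
# comma on the last line, then wrap the result in [ and ]
# (uses GNU sed's one-line syntax for the i and a commands).
sed -e 's/^/"/' -e 's/,$/",/' -e '$s/,$//' -e '1i[' -e '$a]' file1.json
```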
This question already has answers here:
Compare/Difference of two arrays in Bash
(9 answers)
Closed 2 years ago.
I have one array item1 with entries like
item1=($(cat abc.txt | grep "exam" |sed 's/.*exam//'|sed 's/^ *//g'|cut -d/ -f2))
So this array contains the following entries:
abc.raml def.raml xyz.schema check1.json check2.json
Now I want to check, for each item of array item1, whether it is present in another array item2.
I did this using a for loop:
for i in "${item1[@]}"; do
for j in "${item2[@]}"; do
if [[ $i == $j ]]
then
echo "file $i present in both arrays"
fi
done
done
This loop works fine.
But can I get a one-liner to check whether a particular element is present in another array, without using another for loop inside?
I tried this, but it is not working:
for i in "${item1[@]}"; do
echo ` "${item2[@]}" | grep "$i" `
if echo $? == 0
then
echo "file $i present in both arrays"
fi
done
Please help me here.
Print both arrays as a sorted list, then with comm extract elements present in both lists.
$ item1=(abc.raml def.raml xyz.schema check1.json check2.json)
$ item2=(abc.raml def.raml xyz.schema check1.json check2.json other)
$ comm -12 <(printf "%s\n" "${item1[@]}" | sort) <(printf "%s\n" "${item2[@]}" | sort) | xargs -I{} echo "file {} present in both arrays"
file abc.raml present in both arrays
file check1.json present in both arrays
file check2.json present in both arrays
file def.raml present in both arrays
file xyz.schema present in both arrays
to check if a particular element is present in another array
Print the array as list and grep the element.
if printf "%s\n" "${item1[@]}" | grep -qxF abc.raml; then
echo "abc.raml is present in item1"
fi
When working with array items that can have newline in them, use zero separated streams.
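As a sketch of that membership test with NUL separators (assuming GNU grep, whose -z option switches it to NUL-delimited records):

```shell
item1=($'first\nsecond' 'abc.raml')   # one element even contains a newline
needle='abc.raml'
# printf emits each element NUL-terminated; grep -z splits records on
# NUL, -x requires a whole-record match, -F takes the needle literally.
if printf '%s\0' "${item1[@]}" | grep -qzxF "$needle"; then
    echo "$needle is present in item1"
fi
```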
I parsed a json file with jq like this :
# cat test.json | jq '.logs' | jq '.[]' | jq '._id' | jq -s
It returns an array like this : [34,235,436,546,.....]
In my bash script I declared an array:
# declare -a msgIds = ...
Bash arrays use () instead of [], so when I pass the array given above into this declaration it won't work: ([324,32,45..]) causes a problem. If I remove the jq -s, an array forms with only one member in it.
Is there a way to solve this issue?
We can solve this problem in two ways:
Input string:
// test.json
{
"keys": ["key1","key2","key3"]
}
Approach 1:
1) Use jq -r (output raw strings, not JSON texts).
KEYS=$(jq -r '.keys' test.json)
echo $KEYS
# Output: [ "key1", "key2", "key3" ]
2) Use @sh (converts the input to a series of space-separated, shell-quoted strings). It removes the square brackets [] and commas (,) from the string.
KEYS=$(<test.json jq -r '.keys | @sh')
echo $KEYS
# Output: 'key1' 'key2' 'key3'
3) Use tr to remove the single quotes from the string output. To delete specific characters, use the -d option of tr.
KEYS=$(jq -r '.keys | @sh' test.json | tr -d \')
echo $KEYS
# Output: key1 key2 key3
4) We can convert the space-separated string to an array by placing the output in round brackets ().
This is also called compound assignment, where we declare the array with a list of values.
ARRAYNAME=(value1 value2 .... valueN)
#!/bin/bash
KEYS=($(jq -r '.keys | @sh' test.json | tr -d \'\"))
echo "Array size: " ${#KEYS[@]}
echo "Array elements: "${KEYS[@]}
# Output:
# Array size: 3
# Array elements: key1 key2 key3
Approach 2:
1) Use jq -r to get the string output, then use tr to delete characters like square brackets, double quotes and comma.
#!/bin/bash
KEYS=$(jq -r '.keys' test.json | tr -d '[],"')
echo $KEYS
# Output: key1 key2 key3
2) Then we can convert the resulting string to an array by placing the output in round brackets ().
#!/bin/bash
KEYS=($(jq -r '.keys' test.json | tr -d '[]," '))
echo "Array size: " ${#KEYS[@]}
echo "Array elements: "${KEYS[@]}
# Output:
# Array size: 3
# Array elements: key1 key2 key3
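Note that both approaches rely on word splitting, so keys containing spaces, brackets or commas would break. A sketch that avoids tr entirely (assuming bash 4+ for mapfile) is to have jq emit one key per line with .keys[]:

```shell
# .keys[] prints each array element on its own line (-r strips quotes);
# mapfile -t collects those lines into an array without word splitting.
mapfile -t KEYS < <(jq -r '.keys[]' test.json)
echo "Array size: ${#KEYS[@]}"
echo "Array elements: ${KEYS[@]}"
```

This still breaks on keys containing newlines; for those, see the NUL-delimited techniques shown in the answers below.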
To correctly parse values that have spaces, newlines (or any other arbitrary characters), just use jq's @sh filter and bash's declare -a. (No need for a while read loop or any other pre-processing.)
// foo.json
{"data": ["A B", "C'D", ""]}
str=$(jq -r '.data | @sh' foo.json)
declare -a arr="($str)" # must be quoted like this
$ declare -p arr
declare -a arr=([0]="A B" [1]="C'D" [2]="")
The reason that this works correctly is that @sh will produce a space-separated list of shell-quoted words:
$ echo "$str"
'A B' 'C'\''D' ''
and this is exactly the format that declare expects for an array definition.
Use jq -r to output a string "raw", without JSON formatting, and use the @sh formatter to format your results as a string for shell consumption. Per the jq docs:
@sh:
The input is escaped suitable for use in a command-line for a POSIX shell. If the input is an array, the output will be a series of space-separated strings.
So you can do e.g.
msgids=($(<test.json jq -r '.logs[]._id | @sh'))
and get the result you want.
From the jq FAQ (https://github.com/stedolan/jq/wiki/FAQ):
Q: How can a stream of JSON texts produced by jq be converted into a bash array of corresponding values?
A: One option would be to use mapfile (aka readarray), for example:
mapfile -t array <<< "$(jq -c '.[]' input.json)"
An alternative that might be indicative of what to do in other shells is to use read -r within a while loop. The following bash script populates an array, x, with JSON texts. The key points are the use of the -c option, and the use of the bash idiom while read -r value; do ... done < <(jq .......):
#!/bin/bash
x=()
while read -r value
do
x+=("$value")
done < <(jq -c '.[]' input.json)
++ To resolve this, we can use a very simple approach:
++ Since I do not know your input file, I am creating a file input.json with the following contents:
input.json:
{
"keys": ["key1","key2","key3"]
}
++ Use jq to get the value from the above file input.json:
Command: cat input.json | jq -r '.keys | @sh'
Output: 'key1' 'key2' 'key3'
Explanation: | @sh removes [ and "
++ To remove the ' quotes as well, we use tr
command: cat input.json | jq -r '.keys | @sh' | tr -d \'
Explanation: use tr's delete option (-d) to remove '
++ To store this in a bash array we use () with `` and print it:
command:
KEYS=(`cat input.json | jq -r '.keys | @sh' | tr -d \'`)
To print all the entries of the array: echo "${KEYS[*]}"
I have an array and a csv file. I'm trying to get data from the csv file that match with the data from the array.
Here's my array:
array[12345678876543]=ID00000111
array[87654321234567]=ID00000222
and here is the data from csv file:
12345678876543,floor1
87654321234567,floor2
I'm trying to get this output:
ID00000111 floor1
ID00000222 floor2
I tried this syntax, but I can only get the floor number:
for key in "${!array[@]}"; do
awk -F, -v serial="${key}" '$1 == serial { print $2; exit}' test.csv
done
I hope someone can help me with my problem.
I'm assuming that the first entry in your csv file is the key to the array.
#!/bin/bash
array[12345678876543]=ID00000111
array[87654321234567]=ID00000222
while read -r line; do
key=$(echo "$line" | cut -d, -f1)
val=$(echo "$line" | cut -d, -f2-)
echo "${array[$key]}" "$val"
done < test.csv
You can also do something like this which would be closer to what you have right now:
for key in "${!array[@]}"; do
echo "${array[$key]}" "$(grep "$key" test.csv | cut -d, -f2-)"
done
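If the csv is large, spawning one grep or awk per key gets expensive. The whole join can be sketched as a single awk pass instead: print the array as key,ID lines on stdin, load them while NR==FNR, then look up each csv row (this assumes keys and values contain no commas):

```shell
array[12345678876543]=ID00000111
array[87654321234567]=ID00000222
# The first input ("-", i.e. stdin) fills the id[] lookup table; the
# second input is the csv itself, printed only when its key is known.
for key in "${!array[@]}"; do
    printf '%s,%s\n' "$key" "${array[$key]}"
done | awk -F, 'NR==FNR { id[$1] = $2; next }
                ($1 in id) { print id[$1], $2 }' - test.csv
```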
I have a Bash script which receives data in JSON. I want to convert the JSON into an accessible structure (array / list / or another model) that makes the nested data easy to parse.
Example:
{
"SALUTATION": "Hello world",
"SOMETHING": "bla bla bla Mr. Freeman"
}
I want to get the value like the following: echo ${arr[SOMETHING]}
[ Different approach is optional as well. ]
If you want key and value, and based on How do i convert a json object to key=value format in JQ, you can do:
$ jq -r "to_entries|map(\"\(.key)=\(.value|tostring)\")|.[]" file
SALUTATION=Hello world
SOMETHING=bla bla bla Mr. Freeman
In a more general way, you can store the values into an array myarray[key] = value like this, just by providing jq to the while with the while ... do; ... done < <(command) syntax:
declare -A myarray
while IFS="=" read -r key value
do
myarray[$key]="$value"
done < <(jq -r 'to_entries|map("\(.key)=\(.value)")|.[]' file)
And then you can loop through the values like this:
for key in "${!myarray[@]}"
do
echo "$key = ${myarray[$key]}"
done
For this given input, it returns:
SALUTATION = Hello world
SOMETHING = bla bla bla Mr. Freeman
Although this question is answered, the posted answers didn't fully satisfy my requirements. Here is a little write-up that will help any bash newcomers.
Foreknowledge
A basic associative array declaration
#!/bin/bash
declare -A associativeArray=([key1]=val1 [key2]=val2)
You can also use quotes (', ") around the declaration, its keys, and
values.
#!/bin/bash
declare -A 'associativeArray=([key1]=val1 [key2]=val2)'
And you can delimit each [key]=value pair via space or newline.
#!/bin/bash
declare -A associativeArray=([key1]=value1
['key2']=value2 [key3]='value3'
['key4']='value2' ["key5"]="value3"
["key6"]='value4'
['key7']="value5"
)
Depending on your quote variation, you may need to escape your string.
Using Indirection to access both key and value in an associative array
example () {
local -A associativeArray=([key1]=val1 [key2]=val2)
# print associative array
local key value
for key in "${!associativeArray[@]}"; do
value="${associativeArray["$key"]}"
printf '%s = %s\n' "$key" "$value"
done
}
Running the example function
$ example
key2 = val2
key1 = val1
Knowing the aforementioned tidbits allows you to derive the following snippets:
The following examples will all have the result as the example above
String evaluation
#!/usr/bin/env bash
example () {
local arrayAsString='associativeArray=([key1]=val1 [key2]=val2)'
local -A "$arrayAsString"
# print associative array
}
Piping your JSON into JQ
#!/usr/bin/env bash
# Note: usage of single quotes instead of double quotes for the jq
# filter. The former is preferred to avoid issues with shell
# substitution of quoted strings.
example () {
# Given the following JSON
local json='{ "key1": "val1", "key2": "val2" }'
# filter using `map` && `reduce`
local filter='to_entries | map("[\(.key)]=\(.value)") |
reduce .[] as $item ("associativeArray=("; . + ($item|@sh) + " ") + ")"'
# Declare and assign separately to avoid masking return values.
local arrayAsString;
# Note: no encompassing quotation (")
arrayAsString=$(jq --raw-output "${filter}" <<< "$json")
local -A "$arrayAsString"
# print associative array
}
jq -n / --null-input option + --argfile && redirection
#!/usr/bin/env bash
example () {
# /path/to/file.json contains the same json as the first two examples
local filter filename='/path/to/file.json'
# including bash variable name in reduction
filter='to_entries | map("[\(.key | @sh)]=\(.value | @sh) ")
| "associativeArray=(" + add + ")"'
# using --argfile && --null-input
local -A "$(jq --raw-output --null-input --argfile file "$filename" \
"\$file | ${filter}")"
# or for a more traceable declaration (using shellcheck or other) this
# variation moves the variable name outside of the string
# map definition && reduce replacement
filter='[to_entries[]|"["+(.key|@sh)+"]="+(.value|@sh)]|"("+join(" ")+")"'
# input redirection && --join-output
local -A associativeArray=$(jq --join-output "${filter}" < "$filename")
# print associative array
}
Reviewing previous answers
@Ján Lalinský
To load JSON object into a bash associative array efficiently
(without using loops in bash), one can use tool 'jq', as follows.
# first, load the json text into a variable:
json='{"SALUTATION": "Hello world", "SOMETHING": "bla bla bla Mr. Freeman"}'
# then, prepare associative array, I use 'aa':
unset aa
declare -A aa
# use jq to produce text defining name:value pairs in the bash format
# using #sh to properly escape the values
aacontent=$(jq -r '. | to_entries | .[] | "[\"" + .key + "\"]=" + (.value | @sh)' <<< "$json")
# string containing whole definition of aa in bash
aadef="aa=($aacontent)"
# load the definition (because values may contain LF characters, aadef must be in double quotes)
eval "$aadef"
# now we can access the values like this: echo "${aa[SOMETHING]}"
Warning: this uses eval, which is dangerous if the json input is from unknown source (may contain malicious shell commands that eval may execute).
This could be reduced to the following
example () {
local json='{ "key1": "val1", "key2": "val2" }'
local -A associativeArray="($(jq -r '. | to_entries | .[] |
"[\"" + .key + "\"]=" + (.value | @sh)' <<< "$json"))"
# print associative array
}
@fedorqui
If you want key and value, and based on How do i convert a json object to key=value format in JQ, you can do:
$ jq -r "to_entries|map(\"\(.key)=\(.value|tostring)\")|.[]" file
SALUTATION=Hello world
SOMETHING=bla bla bla Mr. Freeman
In a more general way, you can store the values into an array myarray[key] = value like this, just by providing jq to the while with the while ... do; ... done < <(command) syntax:
declare -A myarray
while IFS="=" read -r key value
do
myarray[$key]="$value"
done < <(jq -r "to_entries|map(\"\(.key)=\(.value)\")|.[]" file)
And then you can loop through the values like this:
for key in "${!myarray[@]}"
do
echo "$key = ${myarray[$key]}"
done
For this given input, it returns:
SALUTATION = Hello world
SOMETHING = bla bla bla Mr. Freeman
The main difference between this solution and my own is whether the looping over the array happens in bash or in jq.
Each solution is valid, and depending on your use case, one may be more useful than the other.
Context: This answer was written to be responsive to a question title which no longer exists.
The OP's question actually describes objects, vs arrays.
To be sure that we help other people coming in who are actually looking for help with JSON arrays, though, it's worth covering them explicitly.
For the safe-ish case where strings can't contain newlines (and when bash 4.0 or newer is in use), this works:
str='["Hello world", "bla bla bla Mr. Freeman"]'
readarray -t array <<<"$(jq -r '.[]' <<<"$str")"
To support older versions of bash, and strings with newlines, we get a bit fancier, using a NUL-delimited stream to read from jq:
str='["Hello world", "bla bla bla Mr. Freeman", "this is\ntwo lines"]'
array=( )
while IFS= read -r -d '' line; do
array+=( "$line" )
done < <(jq -j '.[] | (. + "\u0000")' <<<"$str")
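On bash 4.4+ the same NUL-delimited stream can be consumed without an explicit loop, since readarray/mapfile gained a -d option for a custom delimiter; a sketch of the same idea:

```shell
str='["Hello world", "bla bla bla Mr. Freeman", "this is\ntwo lines"]'
# jq -j joins output with no newlines, so each element is terminated
# only by the NUL we append; readarray -d '' splits on exactly those
# NULs, and -t strips the delimiter from each element.
readarray -d '' -t array < <(jq -j '.[] + "\u0000"' <<<"$str")
printf '<%s>\n' "${array[@]}"
```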
This is how can it be done recursively:
#!/bin/bash
SOURCE="$PWD"
SETTINGS_FILE="$SOURCE/settings.json"
SETTINGS_JSON=`cat "$SETTINGS_FILE"`
declare -A SETTINGS
function get_settings() {
local PARAMS="$@"
local JSON=`jq -r "to_entries|map(\"\(.key)=\(.value|tostring)\")|.[]" <<< "$1"`
local KEYS=''
if [ $# -gt 1 ]; then
KEYS="$2"
fi
while read -r PAIR; do
local KEY=''
if [ -z "$PAIR" ]; then
break
fi
IFS== read PAIR_KEY PAIR_VALUE <<< "$PAIR"
if [ -z "$KEYS" ]; then
KEY="$PAIR_KEY"
else
KEY="$KEYS:$PAIR_KEY"
fi
if jq -e . >/dev/null 2>&1 <<< "$PAIR_VALUE"; then
get_settings "$PAIR_VALUE" "$KEY"
else
SETTINGS["$KEY"]="$PAIR_VALUE"
fi
done <<< "$JSON"
}
To call it:
get_settings "$SETTINGS_JSON"
The array will be accessed like so:
${SETTINGS[grandparent:parent:child]}
To load JSON object into a bash associative array efficiently (without using loops in bash), one can use tool 'jq', as follows.
# first, load the json text into a variable:
json='{"SALUTATION": "Hello world", "SOMETHING": "bla bla bla Mr. Freeman"}'
# then, prepare associative array, I use 'aa':
unset aa
declare -A aa
# use jq to produce text defining name:value pairs in the bash format
# using #sh to properly escape the values
aacontent=$(jq -r '. | to_entries | .[] | "[\"" + .key + "\"]=" + (.value | @sh)' <<< "$json")
# string containing whole definition of aa in bash
aadef="aa=($aacontent)"
# load the definition (because values may contain LF characters, aadef must be in double quotes)
eval "$aadef"
# now we can access the values like this: echo "${aa[SOMETHING]}"
Warning: this uses eval, which is dangerous if the json input is from unknown source (may contain malicious shell commands that eval may execute).
Building on @HelpNeeder's solution (nice one, btw).
His solution wasn't really working with integers, so I made some additions. I extended the condition checks, so it's fair to say some performance is sacrificed.
This version works with integers and also floating-point values.
SOURCE="$PWD"
SETTINGS_FILE="./test2.json"
SETTINGS_JSON=`cat "$SETTINGS_FILE"`
declare -A SETTINGS
get_settings() {
local PARAMS="$@"
local JSON=`jq -r "to_entries|map(\"\(.key)=\(.value|tostring)\")|.[]" <<< "$1"`
local KEYS=''
if [ $# -gt 1 ]; then
KEYS="$2"
fi
while read -r PAIR; do
local KEY=''
if [ -z "$PAIR" ]; then
break
fi
IFS== read PAIR_KEY PAIR_VALUE <<< "$PAIR"
if [ -z "$KEYS" ]; then
KEY="$PAIR_KEY"
else
KEY="$KEYS:$PAIR_KEY"
fi
res=$(jq -e . 2>/dev/null <<< "$PAIR_VALUE")
exitCode=$?
check=`echo "$PAIR_VALUE" | grep -E '^-?[0-9]*\.?[0-9]+$'` # quote the regex so the shell doesn't expand it
# if [ "${res}" ] && [ $exitCode -eq "0" ] && [[ ! "${PAIR_VALUE}" == ?(-)+([0-9]) ]] ALTERNATIVE, works only for integer (not floating point)
if [ "${res}" ] && [ $exitCode -eq "0" ] && [[ "$check" == '' ]]
then
get_settings "$PAIR_VALUE" "$KEY"
else
SETTINGS["$KEY"]="$PAIR_VALUE"
fi
done <<< "$JSON"
}
get_settings "$SETTINGS_JSON"
Solution: use jq (a lightweight and flexible command-line JSON processor).
In bash I'd rather assign the JSON object to a variable and use jq to access and parse the right result from it. It's more convenient than parsing this structure with arrays, and jq comes out of the box with multiple features, such as accessing nested and complex objects, select methods, built-in operators and functions, regex support, comparisons, etc.
example:
example='{"SALUTATION": "Hello world","SOMETHING": "bla bla bla Mr. Freeman"}'
echo "$example" | jq .SOMETHING
# output:
"bla bla bla Mr. Freeman"
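Nested fields are just a longer path expression; the .PERSON object here is made up for illustration:

```shell
example='{"PERSON": {"NAME": "Freeman", "GREETING": "Hello world"}}'
# -r prints the raw string without the surrounding JSON quotes
echo "$example" | jq -r '.PERSON.GREETING'
# prints Hello world
```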
This question already has answers here:
How can I store the "find" command results as an array in Bash
(8 answers)
Closed 4 years ago.
How do I put the result of find $1 into an array?
In for loop:
for /f "delims=/" %%G in ('find $1') do %%G | cut -d\/ -f6-
I want to cry.
In bash:
file_list=()
while IFS= read -d $'\0' -r file ; do
file_list=("${file_list[#]}" "$file")
done < <(find "$1" -print0)
echo "${file_list[@]}"
file_list is now an array containing the results of find "$1".
What's special about "field 6"? It's not clear what you were attempting to do with your cut command.
Do you want to cut each file after the 6th directory?
for file in "${file_list[@]}" ; do
echo "$file" | cut -d/ -f6-
done
But why "field 6"? Can I presume that you actually want to return just the last element of the path?
for file in "${file_list[@]}" ; do
echo "${file##*/}"
done
Or even
echo "${file_list[@]##*/}"
Which will give you the last path element for each path in the array. You could even do something with the result
for file in "${file_list[@]##*/}" ; do
echo "$file"
done
Explanation of the bash program elements:
(One should probably use the builtin readarray instead)
find "$1" -print0
Find stuff and 'print the full file name on the standard output, followed by a null character'. This is important as we will split that output by the null character later.
<(find "$1" -print0)
"Process Substitution" : The output of the find subprocess is read in via a FIFO (i.e. the output of the find subprocess behaves like a file here)
while ...
done < <(find "$1" -print0)
The output of the find subprocess is read by the while command via <
IFS= read -d $'\0' -r file
This is the while condition:
read
Read one line of input (from the find command). The return value of read is 0 unless EOF is encountered, at which point the while loop exits.
-d $'\0'
...taking the null character as delimiter (see QUOTING in the bash manpage). This is done because we terminated each file name with a null character via -print0 earlier.
-r
backslash is not considered an escape character as it may be part of the filename
file
Result (first word actually, which is unique here) is put into variable file
IFS=
The command is run with IFS (the special variable holding the characters on which read splits input into words) set to empty, because we don't want any word splitting.
And inside the loop:
file_list=("${file_list[@]}" "$file")
Inside the loop, the file_list array is just grown by $file, suitably quoted.
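As the aside about readarray hints, on bash 4.4+ (where readarray accepts -d '') the whole loop collapses to one line with the same safety guarantees:

```shell
# One array element per found path, safe for names containing spaces
# and newlines; -t strips the trailing NUL from each element.
readarray -d '' -t file_list < <(find "$1" -print0)
echo "${file_list[@]}"
```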
arrayname=( $(find $1) )
I don't understand your loop question. If you are asking how to work with that array, then in bash you can loop through all array elements like this:
for element in $(seq 0 $((${#arrayname[@]} - 1)))
do
echo "${arrayname[$element]}"
done
This is probably not 100% foolproof, but it will probably work 99% of the time (I used the GNU utilities; the BSD utilities won't work without modifications; also, this was done using an ext4 filesystem):
declare -a BASH_ARRAY_VARIABLE=$(find <path> <other options> -print0 | sed -e 's/\x0$//' | awk -F'\0' 'BEGIN { printf "("; } { for (i = 1; i <= NF; i++) { printf "%c"gensub(/"/, "\\\\\"", "g", $i)"%c ", 34, 34; } } END { printf ")"; }')
Then you would iterate over it like so:
for FIND_PATH in "${BASH_ARRAY_VARIABLE[@]}"; do echo "$FIND_PATH"; done
Make sure to enclose $FIND_PATH inside double-quotes when working with the path.
Here's a simpler pipeless version, based on the version of user2618594
declare -a names=$(echo "("; find <path> <other options> -printf '"%p" '; echo ")")
for nm in "${names[@]}"
do
echo "$nm"
done
To loop through a find, you can simply use find:
for file in $(find "$1"); do
echo "$file" | cut -d/ -f6-
done
It was what I got from your question.