Input:
I have a file called 'myseedips' with a set of IP addresses in it, in the below structure:
10.204.99.15
10.204.99.12
10.204.99.41
There can be 'n' number of IP addresses, one per line.
Output
I have no idea about bash programming, but I have to write a bash script that creates a JSON file in the below structure. The IP addresses have to be handled in a loop, so that the JSON changes/extends depending on the length of the myseedips file.
"cassandra": {
"nodes": [
{"ip_address": "10.204.99.15","type": "seed"},
{"ip_address": "10.204.99.12","type": "seed"},
{"ip_address": "10.204.99.41","type": "seed"}]
},
Also need to add logic to append a comma at the end of each node for all nodes except the last. Do not append a comma if there is only one node.
Example:
Maybe something like the below code logic, but in bash:
var j string
j = `"cassandra": {"nodes": [`
for i := 0; i < len(ips); i++ {
    j = j + `{"ip_address": "` + ips[i] + `","type": "seed"},`
}
j = j + `]}`
Thanks
Nissar Sheik
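For what it's worth, the pseudocode above translates fairly directly to bash, including the comma-after-all-but-the-last logic. A minimal sketch (the sample-file creation at the top is only so the snippet is self-contained; in practice myseedips already exists):

```shell
#!/bin/bash
# Demo input; in practice myseedips already exists.
printf '%s\n' 10.204.99.15 10.204.99.12 10.204.99.41 > myseedips

# Read all lines of the file into the array 'ips'.
mapfile -t ips < myseedips

json='"cassandra": {"nodes": ['
for i in "${!ips[@]}"; do
  json+="{\"ip_address\": \"${ips[$i]}\",\"type\": \"seed\"}"
  # Comma after every node except the last (none at all for a single node).
  if (( i < ${#ips[@]} - 1 )); then
    json+=','
  fi
done
json+=']},'

echo "$json"
```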
Further to Jeff's answer, please note that the transformation can be accomplished with a single invocation of jq. If your jq has the inputs filter:
jq -Rn '[inputs] | {cassandra:{nodes:map({ip_address:.,type:"seed"})}}' myseedips
Otherwise (dropping the empty string that -Rs leaves after the final newline):
jq -Rs 'split("\n") | map(select(length>0)) | {cassandra:{nodes:map({ip_address:.,type:"seed"})}}' myseedips
Using jq, you'll need an extra pass to convert the raw text into a workable array, but it's simple:
$ jq -R '.' myseedips | jq -s '{cassandra:{nodes:map({ip_address:.,type:"seed"})}}'
This yields the following:
{
  "cassandra": {
    "nodes": [
      {
        "ip_address": "10.204.99.15",
        "type": "seed"
      },
      {
        "ip_address": "10.204.99.12",
        "type": "seed"
      },
      {
        "ip_address": "10.204.99.41",
        "type": "seed"
      }
    ]
  }
}
awk to the rescue!
a template awk solution can be
$ awk 'BEGIN{print "header"}
NR==FNR{c=NR;next}
{print "prefix",$1,"suffix" (FNR<c?",":"]")}
END{print "footer"}' myseedips{,}
header
prefix 10.204.99.15 suffix,
prefix 10.204.99.12 suffix,
prefix 10.204.99.41 suffix]
footer
you can replace the header, footer, prefix, and suffix as needed.
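Filling in those placeholders for the JSON in the question, a sketch might look like this (the demo input is created inline; normally myseedips already exists):

```shell
#!/bin/bash
# Demo input; in practice myseedips already exists.
printf '%s\n' 10.204.99.15 10.204.99.12 10.204.99.41 > myseedips

# myseedips{,} expands to "myseedips myseedips": the first pass counts
# the lines (c=NR), the second prints one node per line, with a comma
# after all but the last.
out=$(awk 'BEGIN{print "\"cassandra\": {"; print "\"nodes\": ["}
     NR==FNR{c=NR;next}
     {print "{\"ip_address\": \"" $1 "\",\"type\": \"seed\"}" (FNR<c?",":"]")}
     END{print "},"}' myseedips{,})
echo "$out"
```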
I'm simply trying to replace an object's value in a JSON file with an array, using jq in a bash script.
The json file (truncated) looks like this:
{
"objects": {
"type": "foo",
"host": "1.1.1.1",
"port": "1234"
}
}
I want to replace the host object's value with an array of different values, so it looks like this:
{
"objects": {
"type": "foo",
"host": ["1.1.1.1","2.2.2.2"],
"port": "1234"
}
}
I tested around with this script. The input comes from a simple comma-separated string, which I convert into a proper JSON array (which seems to work).
But I'm not able to replace the value with the array.
#!/bin/bash
objectshost="1.1.1.1,2.2.2.2"
objectshost_array=$(jq -c -n --arg arg $objectshost '$arg|split(",")')
jq --arg value "$objectshost_array" '.objects.host = $value' ./test.json > ./test.json.tmp
The best I ended up with, is this:
{
"objects": {
"type": "foo",
"host": "[\"1.1.1.1\",\"2.2.2.2\"]",
"port": "1234"
}
}
The result seems somewhat logical, as the script simply replaces the value with the array's string representation. But it's not what I expected to get. ;)
I found some similar questions, but all of them were dealing with replacing values in existing arrays or key/value pairs; my problem seems to be the conversion from a single value to an array.
Can somebody please push me in the right direction? Or should I forget about jq and treat the JSON file as a simple text file?
Thanks in advance,
André
It would work with a conditional assignment, taking the new entries from the positional arguments:
jq '
.objects.host = (
.objects.host |
if type == "array"
then .
else [ . ]
end + $ARGS.positional
)
' input.json --args 1.2.3.4 2.2.2.2 4.4.4.4
Or the same as a stand-alone jq script, which is more readable and maintainable:
myJQScript:
#!/usr/bin/env -S jq --from-file --args
.objects.host = (
.objects.host |
if type == "array"
then .
else [ . ]
end + $ARGS.positional
)
Make it executable:
chmod +x myJQScript
Run it with arguments to add array entries to host
$ ./myJQScript 1.2.3.4 2.2.2.2 4.4.4.4 < input.json
{
"objects": {
"type": "foo",
"host": [
"1.1.1.1",
"1.2.3.4",
"2.2.2.2",
"4.4.4.4"
],
"port": "1234"
}
}
You can do it with a single jq command:
#!/bin/sh
objectshost="1.1.1.1,2.2.2.2"
jq --arg value "$objectshost" '.objects.host = ($value / ",")' ./test.json > ./test.json.tmp
This has the added benefit of not requiring bash arrays, and can be used with any shell.
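For the record, / on strings is jq's division-as-split shorthand, so the filter above really does produce an array; a quick check (assuming jq is installed):

```shell
# Dividing a string by a separator splits it, like split(",").
jq -cn '"1.1.1.1,2.2.2.2" / ","'
# → ["1.1.1.1","2.2.2.2"]
```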
If you already have a JSON array, you must use --argjson and not --arg: --arg always creates a variable of type string, whereas --argjson parses the value as a JSON entity.
#!/bin/bash
objectshost="1.1.1.1,2.2.2.2"
objectshost_array=$(printf '%s\n' "$objectshost" | jq -c 'split(",")')
jq --argjson value "$objectshost_array" '.objects.host = $value' ./test.json > ./test.json.tmp
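A quick way to see the difference between the two options (a sketch, assuming jq is on the PATH):

```shell
# --arg passes the text verbatim, so jq sees a single JSON string.
jq -cn --arg v '["a","b"]' '$v'       # → "[\"a\",\"b\"]"

# --argjson parses the text as JSON first, so jq sees an array.
jq -cn --argjson v '["a","b"]' '$v'   # → ["a","b"]
```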
First - a very simple code:
#!/bin/bash
a_string="string"
readarray -t a <<<"$a_string"
echo "${a[@]}"
echo "${#a[@]}"
# read empty string into array
emptystring=""
readarray -t b <<<"$emptystring"
echo "${b[@]}"
echo "${#b[@]}"
# and now - empty array
c=()
echo "${c[@]}"
echo "${#c[@]}"
And the output:
string
1

1

0
So reading an empty string into a bash array with readarray -t yields an array of length 1; only a truly empty array has length 0.
My question: why is this happening?
As my case - here is a fragment from my script - execute an API call, get JSON, and apply jq filter:
URL_list_json=$(curl -s --request GET "$curl_request" --header "Authorization: Bearer ${token}")
readarray -t URL_names <<<"$(echo "$URL_list_json" | jq -r '.[].Body.monitors[].name')"
Here is an example of JSON which results an empty array from the jq call:
[
{
"Header": {
"Region": "dc1",
"Tenant": "tenant1",
"Stage": "test"
},
"Body": {
"monitors": []
}
}
]
and here is JSON with populated content returning a non-empty array from jq call:
[
{
"Header": {
"Region": "dc2",
"Tenant": "tenant2",
"Stage": "qa"
},
"Body": {
"monitors": [
{
"enabled": true,
"entityId": "TEST-674BA97E74FC74AA",
"name": "production.example.com"
},
{
"enabled": false,
"entityId": "TEST-3E2D23438A973D1E",
"name": "test.example.com"
}
]
}
}
]
I need to check whether the URL_names array is empty or not. If it is not empty, iterate over its contents. If it is empty - meaning jq did not return any results - write to the log and move on.
If I use if ((${#URL_names[@]})) to check whether the array is empty, it evaluates to 1 even when the array contains just an empty string from the jq call, so this logic fails.
What are alternative approaches to deal with a case like the above?
I could assign the output of the jq filter to a string, check with an if statement whether the string is empty, and only then assign the string to an array, but this introduces additional elements. With that approach I could skip using arrays entirely - and I was hoping to use only arrays for this task.
Thanks
why it is happening?
Because it reads one line. From the bash manual's description of here strings:
[n]<<< word
[...] The result is supplied as a single string, with a newline appended, to the command on its standard input (or file descriptor n if n is specified).
Because there is a newline, readarray reads one empty line. You may do:
readarray -t b < <(printf "%s" "$emptystring")
Notes:
echo "$var" is not preferred. Use printf "%s" "$var" in a POSIX shell, or <<<"$var" in bash (when you do not care about the extra newline).
<<<"$(...)" always looks strange - the <<< has to allocate a subshell anyway. Use < <(...) instead.
You want to do: readarray -t URL_names < <(<<<"$URL_list_json" jq -r '.[].Body.monitors[].name')
If you want to check whether the array is empty, check it in jq, e.g. jq --exit-status '.[].Body.monitors[].name' and then just check the exit status.
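Putting that last suggestion together: with -e/--exit-status, jq exits with status 4 when the filter yields no output at all, so the emptiness check can live entirely in the if. A sketch, using an assumed sample payload:

```shell
#!/bin/bash
# Assumed sample: one tenant whose monitors list is empty.
URL_list_json='[{"Header":{"Region":"dc1"},"Body":{"monitors":[]}}]'

# jq -e exits non-zero (4) if the filter produces no output.
if names=$(jq -er '.[].Body.monitors[].name' <<<"$URL_list_json"); then
  readarray -t URL_names < <(printf '%s' "$names")
else
  URL_names=()   # no results from jq: write to log and move on
fi

echo "${#URL_names[@]}"
```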
I have a dictionary that looks like this:
{
"uid": "d6fc3e2b-0001a",
"name": "ABC Mgmt",
"type": "host"
}
{
"uid": "d6fc3e2b-0002a",
"name": "Server XYZ",
"type": "group"
}
{
"uid": "d6fc3e2b-0003a",
"name": "NTP Primary",
"type": "host"
}
{
"uid": "d6fc3e2b-0004a",
"name": "H-10.10.10.10",
"type": "host"
}
Then I have a txt file:
"d6fc3e2b-0001a"
"d6fc3e2b-0001a","d6fc3e2b-0002a","d6fc3e2b-0003a"
"d6fc3e2b-0004a"
Expected Output:
"ABC Mgmt"
"ABC Mgmt","Server XYZ","NTP Primary"
"H-10.10.10.10"
I have some trouble making jq use an array that is not in JSON format. I tried various solutions that I found, but none of them worked. I am rather new to scripting and need some help.
input=file.txt
while IFS= read -r line
do
{
value=$(jq -r --arg line "$line" \
'from_entries | .[($line | split(","))[]]' \
dictionary.json)
echo $name
}
done < "$input"
In the following solution, the dictionary file is read using the --slurpfile command-line option, and the lines of "text" are read using inputs in conjunction with the -n command-line option. The -r command-line option is used in conjunction with the #csv filter to produce the desired output.
Invocation
jq -n -R -r --slurpfile dict stream.json -f program.jq stream.txt
program.jq
(INDEX($dict[]; .uid) | map_values(.name)) as $d
| inputs
| split(",")
| map(fromjson)
| map($d[.])
| #csv
Caveat
The above assumes that the quoted values in stream.txt do not themselves contain commas.
If the quoted values in stream.txt do contain commas, then it would be much easier if the values given on each line in stream.txt were given as JSON entities, e.g. as an array of strings, or as a sequence of JSON strings with no separator character.
Solution to problem described in a comment
Invocation
< original.json jq -r --slurpfile dict stream.json -f program.jq
program.jq
(INDEX($dict[]; .uid) | map_values(.name)) as $d
| .source
| map($d[.])
| #csv
Given a bash array, how to convert it to a JSON array in order to output to a file with jq?
Additionally: is there a way to keep the server_nohup array unchanged instead of re-writing the whole JSON file each time?
newArray=(100 200 300)
jq -n --arg newArray $newArray '{
client_nohup: [
$newArray
],
server_nohup: [
]
}' > $projectDir/.watch.json
Current output:
{
"client_nohup": [
"100"
],
"server_nohup": []
}
Desired output:
{
"client_nohup": [
100,
200,
300
],
"server_nohup": []
}
(1) If all the values in newArray are valid as JSON values without spaces, then you could get away with piping the values as a stream, e.g.
newArray=(100 200 300)
echo "${newArray[@]}" |
jq -s '{client_nohup: ., server_nohup: []}'
(2)
Now let's suppose you merely wish to update the "nohup" object in a file, say nohup.json:
{ "client_nohup": [], "server_nohup": [ "keep me" ] }
Since you are using bash, you can then write:
echo "${newArray[@]}" |
jq -s --argjson nohup "$(cat nohup.json)" '
. as $newArray | $nohup | .client_nohup = $newArray
'
Output
(1)
{
"client_nohup": [
100,
200,
300
],
"server_nohup": []
}
(2)
{
"client_nohup": [
100,
200,
300
],
"server_nohup": [
"keep me"
]
}
Other cases
Where there's a will, there's a jq way :-)
See for example the accepted answer at How to format a bash array as a JSON array (though this is not a completely generic solution).
For a generic solution, see 𝑸: How can a variable number of arguments be passed to jq? How can a bash array of values be passed in to jq as a single argument? at the jq FAQ https://github.com/stedolan/jq/wiki/FAQ
Generic Solutions
To be clear, if the array values are known to be valid JSON, there are several good options; if the array values are arbitrary bash strings, then the only efficient, generic way to handle them with jq is by using jq's -R option (e.g. in conjunction with -s), but then the (bash) strings will all be read in as JSON strings, so any intended type information will be lost. (The point here hinges on the technicality that bash strings cannot contain NUL characters.)
Often, to alleviate the latter concern, one can convert numeric strings to JSON numbers, e.g. using the jq idiom: (tonumber? // .).
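As a quick illustration of that idiom (assuming jq is available), converting raw input lines to numbers where possible:

```shell
# Read raw lines with -R/inputs; numeric-looking strings become numbers,
# everything else stays a string.
printf '%s\n' 100 200 foo |
jq -cRn '[inputs | tonumber? // .]'
# → [100,200,"foo"]
```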
In general, the only truly safe way to do this is with multiple invocations of jq, adding each element to the output of the previous command.
arr='[]' # Empty JSON array
for x in "${newArray[#]}"; do
arr=$(jq -n --arg x "$x" --argjson arr "$arr" '$arr + [$x]')
done
This ensures that each element x of your bash array is properly encoded prior to adding it to the JSON array.
This is complicated, though, by the fact that bash doesn't distinguish between numbers and strings. This encodes your array as ["100", "200", "300"], not [100, 200, 300]. In the end, you need some awareness of what your array contains and must preprocess it accordingly.
I have a json array that looks like this:
{
"StackSummaries": [
{
"CreationTime": "2016-06-01T22:22:49.890Z",
"StackName": "foo-control-eu-west-1",
"StackStatus": "UPDATE_COMPLETE",
"LastUpdatedTime": "2016-06-01T22:47:58.433Z"
},
{
"CreationTime": "2016-04-13T11:22:04.250Z",
"StackName": "foo-bar-testing",
"StackStatus": "UPDATE_COMPLETE",
"LastUpdatedTime": "2016-04-26T16:17:07.570Z"
},
{
"CreationTime": "2016-04-10T01:09:49.428Z",
"StackName": "foo-ldap-eu-west-1",
"StackStatus": "UPDATE_COMPLETE",
"LastUpdatedTime": "2016-04-17T13:44:04.758Z"
}
]
}
I am looking to create text output that looks like this:
foo-control-eu-west-1
foo-bar-testing
foo-ldap-eu-west-1
Is jq able to do this? Specifically, what would the jq command line be that would select each StackName in the array and output each key one per line?
jq --raw-output '.StackSummaries[].StackName'
$ jq -r '[.StackSummaries[] | .StackName] | unique[]' input.json
foo-bar-testing
foo-control-eu-west-1
foo-ldap-eu-west-1
The -r option strips the quotation marks from the output. You might not want the call to 'unique'.
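To make the effect of -r concrete (a sketch):

```shell
json='{"StackSummaries":[{"StackName":"foo-bar-testing"}]}'

# Without -r the string is printed as JSON, i.e. quoted.
echo "$json" | jq '.StackSummaries[].StackName'     # → "foo-bar-testing"

# With -r the raw string is printed.
echo "$json" | jq -r '.StackSummaries[].StackName'  # → foo-bar-testing
```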
For reference, if you wanted all the key names:
$ jq '[.StackSummaries[] | keys[]] | unique' input.json
[
"CreationTime",
"LastUpdatedTime",
"StackName",
"StackStatus"
]
Here is another solution
jq -M -r '..|.StackName?|values' input.json