I have an array converted to JSON ($user_data_JSON = $user_data | ConvertTo-Json) in PowerShell which looks like this:
$user_data_JSON
{
[
{
"targetId": "5007Y00000K5nkjQAB",
"login": "login1",
"password": "scRHDztkKbO"
},
{
"targetId": "5007Y00000MNbDvQAL",
"login": "login2",
"password": "scRHDztkKbO"
}
]
}
But I need to modify it to have a key on top, like this:
$user_data_JSON
{
"logins":[
{
"targetId": "5007Y00000K5nkjQAB",
"login": "login1",
"password": "scRHDztkKbO"
},
{
"targetId": "5007Y00000MNbDvQAL",
"login": "login2",
"password": "scRHDztkKbO"
}
]
}
How can I achieve this?
I've already tried to create a new object, add the key, and convert it to JSON like this:
$jsonBase = @{}
$list = New-Object System.Collections.ArrayList
$list.Add("Foo")
$list.Add("Bar")
$jsonBase.Add("Data",$list)
$jsonBase | ConvertTo-Json
To get something like this:
{
"Data": [
"Foo",
"Bar"
]
}
but when I convert my array to JSON again, it looks kind of truncated:
$jsonBase | ConvertTo-Json -Depth 10
{
"logins": [
"[\r\n {\r\n \"targetId\": \"5007Y00000K5nkjQAB\",\r\n \"login\": \"login1\",\r\n \"password\": \"scRHDztkKbO\"\r\n },\r\n {\r\n
\"targetId\": \"5007Y00000MNbDvQAL\",\r\n \"login\": \"login2\",\r\n \"password\": \"scRHDztkKbO\"\r\n },"
]
}
How can I get a proper JSON object?
Thanks
Don't use the JSON (text) representation of your array ($user_data_JSON); use its original, object form ($user_data) to construct a wrapper object with the desired top-level property:
[pscustomobject] @{ logins = $user_data } | ConvertTo-Json -Depth 10
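For example, a minimal sketch using sample data shaped like the values shown in the question:
# Sample data mirroring the question's array of objects.
$user_data = @(
    [pscustomobject] @{ targetId = '5007Y00000K5nkjQAB'; login = 'login1'; password = 'scRHDztkKbO' }
    [pscustomobject] @{ targetId = '5007Y00000MNbDvQAL'; login = 'login2'; password = 'scRHDztkKbO' }
)
# Wrap the original objects (not the JSON text) and serialize once.
[pscustomobject] @{ logins = $user_data } | ConvertTo-Json -Depth 10
This prints the desired { "logins": [ ... ] } structure.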
As for what you tried:
If you use a preexisting JSON string as the value of a property to be converted to JSON with ConvertTo-Json, it is treated like any other string value, resulting in the representation you saw, with " characters escaped as \" and (CRLF) newlines as \r\n.
A simple example:
[pscustomobject] @{
foo = '{ "bar":
"baz" }'
} | ConvertTo-Json
Output (note how the foo property value became a single-line string with " and newlines escaped):
{
"foo": "{ \"bar\": \n \"baz\" }"
}
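If only the JSON text is at hand, a minimal sketch (assuming $user_data_JSON holds the array JSON shown at the top) is to rehydrate it with ConvertFrom-Json first and then wrap the resulting objects:
# Convert the JSON text back to objects, then build the wrapper and re-serialize.
$user_data = $user_data_JSON | ConvertFrom-Json
[pscustomobject] @{ logins = $user_data } | ConvertTo-Json -Depth 10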
For context, I am attempting to create a cmdlet that would allow for single-value substitutions on arbitrary JSON files, for use in some pipelines. I've managed to get this working for non-array-containing JSON.
A representative bit of JSON:
{"test": {
"env": "dev",
"concept": "abstraction",
"array": [
{"id":1, "name":"first"},
{"id":2, "name":"second"}
]
}
}
I want to be able to replace values by providing a function with a path like test.array[1].name and a value.
After using ConvertFrom-Json on the JSON above, I attempt to use the following function (based on this answer) to replace second with third:
function SetValue($object, $key, $value) {
$p1, $p2 = $key.Split(".")
$a = $p1 | Select-String -Pattern '\[(\d{1,3})\]'
if ($a.Matches.Success) {
$index = $a.Matches.Groups[1].Value
$p1 = ($p1 | Select-String -Pattern '(\w*)\[').Matches.Groups[1].Value
if ($p2.length -gt 0) { SetValue -object $object.$p1[$index] -key $p2 -value $value }
else { $object.$p1[$index] = $value }
}
else {
if ($p2.length -gt 0) { SetValue -object $object.$p1 -key $p2 -value $value }
else {
Write-Host $object.$p1
$object.$p1 = $value
}
}
}
$content = SetValue -object $content -key "test.array[1].name" -value "third"
Unfortunately this results in the following:
{ "test": {
"env": "dev",
"concept": "abstraction",
"array": [
"#{id=1; name=first}",
"#{id=2; name=third}"
]
}
}
If the values in the array aren't objects, the code works fine as presented; it's only when we get to objects within arrays that this output happens.
What would be a way to ensure that the returned JSON contains an array that is more in line with the input?
Edit: please note that the actual cause of the issue lay in not setting the -Depth parameter of ConvertTo-Json to 3 or greater. Doing so restored the resulting JSON to the expected format. The accepted answer was still helpful in investigating the cause.
While Invoke-Expression (iex) should generally be avoided, there are exceptional cases where it offers the simplest solution.
$fromJson = @'
{
"test": {
"env": "dev",
"concept": "abstraction",
"array": [
{"id":1, "name":"first"},
{"id":2, "name":"second"}
]
}
}
'@ | ConvertFrom-Json
$nestedPropertyAccessor = 'test.array[1].name'
$newValue = 'third'
Invoke-Expression "`$fromJson.$nestedPropertyAccessor = `"$newValue`""
Important:
Be sure that you either fully control or implicitly trust the content of the $nestedPropertyAccessor and $newValue variables, to prevent inadvertent or malicious execution of injected commands.
On re-conversion to JSON, be sure to pass a high-enough -Depth argument to ConvertTo-Json; with the sample JSON, at least -Depth 3 is required - see this post.
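For instance, a sketch of the full round trip with the variables above, using -Depth 3 (the minimum for this structure):
# After the Invoke-Expression assignment, re-serialize with sufficient depth;
# with the default -Depth of 2 the inner array objects would be stringified.
$fromJson | ConvertTo-Json -Depth 3
test.array[1].name now shows "third" in the output.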
I am unable to use ArrayList or avoid using += for array manipulation. I wish PowerShell had a universal add or append available.
I have the below JSON array for $aksAppRules.RulesText:
[{
"Name": "A2B",
"Description": null,
"SourceAddresses": [
"10.124.176.0/21",
"10.124.184.0/21"
],
"TargetFqdns": [
"*.github.com",
"*.grafana.com",
"*.trafficmanager.net",
"*.loganalytics.io",
"*.applicationinsights.io",
"*.azurecr.io",
"*.debian.org"
],
"FqdnTags": [],
"Protocols": [
{
"ProtocolType": "Https",
"Port": 443
}
],
"SourceIpGroups": []
},
{
"Name": "Y2office365",
"Description": null,
"SourceAddresses": [
"10.124.176.0/21",
"10.124.184.0/21"
],
"TargetFqdns": [
"smtp.office365.com"
],
"FqdnTags": [],
"Protocols": [
{
"ProtocolType": "Http",
"Port": 25
},
{
"ProtocolType": "Http",
"Port": 587
}
],
"SourceIpGroups": []
}
]
I managed to make this work with the below PowerShell snippet:
$new_list = @()
$collectionRules = $aksAppRules.RulesText | ConvertFrom-Json
foreach ($rule in $collectionRules) {
$protoArray = @()
ForEach ($protocol in $rule.Protocols) {
$protoArray += $protocol.ProtocolType + "," + $protocol.Port
}
#$new_list += , @($rule.Name, $rule.SourceAddresses, $rule.TargetFqdns, $protoArray )
# the comma right after += in the line below tells PowerShell to add a new record.
$new_list += , @{Name=$rule.Name; SourceAddresses=$rule.SourceAddresses; TargetFqdns=$rule.TargetFqdns; Protocol=$protoArray}
}
$new_list | ConvertTo-Json | ConvertFrom-Json | select Name, SourceAddresses, TargetFqdns, Protocol | Convert-OutputForCSV -OutputPropertyType Comma | Export-Csv .\test.csv
The resulting CSV looks the way I want.
I am unable to do this using ArrayList, or without using +=, which I hear is inefficient with large arrays.
I have to copy things to a new array because I have to change the key:value format of the original "Protocols" to a 2-D array.
Any pointers will be appreciated.
Yes, you should avoid using the increase assignment operator (+=) to create a collection, as it is quadratically expensive (every += copies the entire array). Instead, you should use the pipeline:
$collectionRules = $aksAppRules.RulesText | ConvertFrom-Json
$(foreach ($rule in $collectionRules) {
    [pscustomobject]@{
        Name = $rule.Name
        SourceAddresses = $rule.SourceAddresses
        TargetFqdns = $rule.TargetFqdns
        Protocol = @(
            ForEach ($protocol in $rule.Protocols) {
                $protocol.ProtocolType + "," + $protocol.Port
            }
        )
    }
}) | Convert-OutputForCSV -OutputPropertyType Comma | Export-Csv .\test.csv
Note 1: I have no clue why you are doing | ConvertTo-Json | ConvertFrom-Json; as far as I can see, there is no need for this if you use a [pscustomobject] rather than a [hashtable] type.
Note 2: I have no clue what the function Convert-OutputForCSV is doing and suspect it isn't required either (but I left it in).
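If all that function does is flatten the array-valued columns, a rough sketch without Convert-OutputForCSV might look like this (an assumption on my part; the ';' separator inside cells is arbitrary):
# Join array-valued properties into single strings so Export-Csv writes one flat cell per column.
$collectionRules | ForEach-Object {
    [pscustomobject]@{
        Name            = $_.Name
        SourceAddresses = $_.SourceAddresses -join ';'
        TargetFqdns     = $_.TargetFqdns -join ';'
        Protocol        = @($_.Protocols | ForEach-Object { '{0},{1}' -f $_.ProtocolType, $_.Port }) -join ';'
    }
} | Export-Csv .\test.csv -NoTypeInformation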
I have a JSON file that I use for work that I need to parse; it is in the following format:
bminter@ubuntu:~$ cat jqtest
{
"files":[
{
"BLOCK1":{
"SUBBLOCK1":{
"akey1":"avalue1",
"bkey1":"bvalue1",
"ckey1":"cvalue1"
},
"dkey1":"dvalue1",
"key":"evalue1"
}
},
{
"BLOCK-2":{
"SUBBLOCK2":{
"akey2":"avalue2",
"bkey2":"bvalue2"
},
"ckey2":"cvalue2",
"key":"dvalue2"
}
},
{
"BLOCK-A":{
"SUBBLOCK2":{
"akey2":"avalue2",
"bkey2":"bvalue2"
},
"ckey2":"cvalue2",
"key":"dvalue2"
}
}],
"NOBLOCK":"value",
"key":"NOBLOCKvalue"
}
So it's an array nested within a JSON file. jq .[] jqtest gives me everything in the file, even the data outside the array, except that outside the array I'm only given the values, not the keys:
bminter@ubuntu:~$ jq .[] jqtest
[
{
"BLOCK1": {
"SUBBLOCK1": {
"akey1": "avalue1",
"bkey1": "bvalue1",
"ckey1": "cvalue1"
},
"dkey1": "dvalue1",
"key": "evalue1"
}
},
{
"BLOCK-2": {
"SUBBLOCK2": {
"akey2": "avalue2",
"bkey2": "bvalue2"
},
"ckey2": "cvalue2",
"key": "dvalue2"
}
},
{
"BLOCK-A": {
"SUBBLOCK2": {
"akey2": "avalue2",
"bkey2": "bvalue2"
},
"ckey2": "cvalue2",
"key": "dvalue2"
}
}
]
"value"
"NOBLOCKvalue"
bminter@ubuntu:~$
Beyond that I can't access any block inside the array:
bminter@ubuntu:~$ jq '.[].BLOCK1' jqtest
jq: error (at jqtest:36): Cannot index array with string "BLOCK1"
bminter@ubuntu:~$ jq '.[].BLOCK-2' jqtest
jq: error (at jqtest:36): Cannot index array with string "BLOCK"
bminter@ubuntu:~$ jq '.[].BLOCK-A' jqtest
jq: error: A/0 is not defined at <top-level>, line 1:
.[].BLOCK-A
jq: 1 compile error
bminter@ubuntu:~$
How do I access the array?
The array of objects with non-uniform keys is making things a little tricky here. Once you've gotten past .files, you need to start using array iteration ([]) to access those elements and then use object operations like keys to go deeper.
Here is a function which may help in this situation. It scans .files for an object with a key matching the specified key and then returns the corresponding value:
def getfile($k): .files[] | select(keys[] | .==$k) | .[$k];
If jqtest contains the sample data, the command
$ jq -M '
def getfile($k): .files[] | select(keys[] | .==$k) | .[$k];
getfile("BLOCK1").SUBBLOCK1.akey1
' jqtest
returns
"avalue1"
Another approach is to use a function to convert .files[] into a more useful form. e.g.
$ jq -M '
def files: reduce .files[] as $f ({}; ($f|keys[0]) as $k | .[$k] = $f[$k]) ;
files
' jqtest
This returns a more uniform structure without arrays:
{
"BLOCK1": {
"SUBBLOCK1": {
"akey1": "avalue1",
"bkey1": "bvalue1",
"ckey1": "cvalue1"
},
"dkey1": "dvalue1",
"key": "evalue1"
},
"BLOCK-2": ...
so with it you can write
files.BLOCK1.SUBBLOCK1
to obtain
{
"akey1": "avalue1",
"bkey1": "bvalue1",
"ckey1": "cvalue1"
}
Note that jq will re-evaluate the files function with each use so the following form may be more practical:
files as $files
| $files.BLOCK1.SUBBLOCK1
If you find this representation useful you may want to skip the function and instead just start your filter with
.files = reduce .files[] as $f ({}; ($f|keys[0]) as $k | .[$k] = $f[$k])
e.g.
$ jq -M '
.files = reduce .files[] as $f ({}; ($f|keys[0]) as $k | .[$k] = $f[$k])
# more stuff goes here
' jqtest
which converts your input to
{
"files": {
"BLOCK1": {
"SUBBLOCK1": {
"akey1": "avalue1",
"bkey1": "bvalue1",
"ckey1": "cvalue1"
},
"dkey1": "dvalue1",
"key": "evalue1"
},
"BLOCK-2": {
"SUBBLOCK2": {
"akey2": "avalue2",
"bkey2": "bvalue2"
},
"ckey2": "cvalue2",
"key": "dvalue2"
},
"BLOCK-A": {
"SUBBLOCK2": {
"akey2": "avalue2",
"bkey2": "bvalue2"
},
"ckey2": "cvalue2",
"key": "dvalue2"
}
},
"NOBLOCK": "value",
"key": "NOBLOCKvalue"
}
making whatever else you need to do after that easier.
I'm trying to pack my data into objects before displaying them with ConvertTo-Json. The test case below shows perfectly how I'm dealing with data and what problem occurs:
$array = #("a","b","c")
$data = #{"sub" = #{"sub-sub" = $array}}
$output = #{"root" = $data}
ConvertTo-Json -InputObject $data
ConvertTo-Json -InputObject $output
Output (formatted by hand for clarity):
{ "sub": { "sub-sub": [ "a", "b", "c" ] }}
{ "root": { "sub": { "sub-sub": "a b c" } }}
Is there any way to assign $data to $output without this weird implicit casting?
As mentioned in the comments, ConvertTo-Json will try to flatten the object structure beyond a maximum nesting level, or depth, by converting whatever object it finds beyond that depth to a string.
The default depth is 2, but you can specify that it should go deeper with the Depth parameter:
PS C:\> @{root=@{level1=@{level2=@("level3-1","level3-2")}}}|ConvertTo-Json
{
"root": {
"level1": {
"level2": "level3-1 level3-2"
}
}
}
PS C:\> @{root=@{level1=@{level2=@("level3-1","level3-2")}}}|ConvertTo-Json -Depth 3
{
"root": {
"level1": {
"level2": [
"level3-1",
"level3-2"
]
}
}
}
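Applied to the question's variables, three levels of nesting need to survive (root -> sub -> sub-sub -> array), so -Depth 3 or anything higher keeps the inner array intact:
# With -Depth 3 the nested array is preserved instead of being flattened to "a b c".
ConvertTo-Json -InputObject $output -Depth 3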