I have a POST request whose body is JSON. For testing via the collection runner, I want to pick the values from a JSON data file, but I am unable to define variables inside an array and am stuck; I could use some help. The input data looks like this:
Input Data:
{
"field1": "1",
"field2": "111111111111111",
"field3": "value3",
"field4": [
[],
[],
[]
],
"master_field": {
"field5": 11,
"field6": 33.0,
"field7": [5, 184]
},
"field8": [
[10, 6, -1030],
[-83, 0, -999],
[-54, 21, -1054],
[-162, 21, -990]
],
"field9": 92
}
I tried building the request body in Postman as JSON, but it only works up to field3:
{
"field1": "{{field1}}",
"field2": "{{field2}}",
"field3": "{{field3}}",
"field4":
[
"{{field4}}"
]
}
It doesn't parse field4 onward. Thanks.
In order to store arrays in Postman variables, you have to stringify them.
Say your field4 value is an array in your test script; just JSON.stringify() it and save it in an environment variable.
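For example, a minimal pre-request script sketch, assuming the array comes from the collection-runner data file (the field4 name is just the one from the question):
// Read the array for this iteration from the data file and
// store it as a JSON string so it can be dropped into the body.
const field4 = pm.iterationData.get("field4"); // e.g. [[], [], []]
pm.environment.set("field4", JSON.stringify(field4));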
Then you can use the variable directly in your request body.
{
"field1": "{{field1}}",
"field2": "{{field2}}",
"field3": "{{field3}}",
"field4": {{field4}}
}
Note that the field4 variable is NOT inside quotes.
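If the values come from a collection-runner data file, keep in mind that the file itself is a JSON array with one object per iteration; hypothetically, for the input above it could start like:
[
  {
    "field1": "1",
    "field2": "111111111111111",
    "field3": "value3",
    "field4": [ [], [], [] ]
  }
]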
Background
I am using wiremock-jre8-standalone-2.35.0.jar
I want it to return a 200 response if the incoming request's array contains any values:
{
"field1": "data1",
"array": [
{...},
{...},
...
],
"field2": "data2",
"field3": "data3",
"field4": "data4",
"field5": "data5"
}
I want it to return a 400 response if the incoming request's array is empty:
{
"field1": "data1",
"array": [],
"field2": "data2",
"field3": "data3",
"field4": "data4",
"field5": "data5"
}
WireMock should match the incoming request against the "request": {...} section of the mapping below:
{
"id": "...",
"request": {
"urlPattern": "...",
"method": "POST",
"headers": {...},
"bodyPatterns": [
{
"matchesJsonPath": "$[?(#.length < 1)]"
}
]
},
"response": {
"status": 400,
"bodyFileName": "...",
"headers": {...}
},
"uuid": "..."
}
Problem
WireMock is rejecting my JSONPath expression in the bodyPatterns array:
[{"matchesJsonPath":"$[?(#.length < 1)]"}] is not a valid match operation
Yet it seems that the expression is valid according to https://jsonpath.com/ :
JSONPath
---
$[?(#.length < 1)]
Inputs
---
{
"field1": "data1",
"array": [],
"field2": "data2",
"field3": "data3",
"field4": "data4",
"field5": "data5"
}
Evaluation Results
---
[
[]
]
...What gives?
This is what gives:
JSON Path
Deems a match if the attribute value is valid JSON and matches the JSON Path expression supplied. A JSON body will be considered to match a path expression if the expression returns either a non-null single value (string, integer etc.), or a non-empty object or array.
https://wiremock.org/docs/request-matching/
My JSON Path expression returns an empty array, so it cannot be considered to match the path expression.
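One workaround (a sketch, not verified against 2.35.0) is the sub-matcher form of matchesJsonPath, pointing the expression at the array and comparing it against an empty JSON array in the 400 mapping:
"bodyPatterns": [
  {
    "matchesJsonPath": {
      "expression": "$.array",
      "equalToJson": "[]"
    }
  }
]
For the 200 mapping, a plain "matchesJsonPath": "$.array[0]" should only match when the array has at least one element.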
From the contacts, I'd like to select the value of the "id" field (47) and everything from the nested array "doNotContact". I could use some help defining the JSONPath filter I should use to select the value 47 and each value inside the nested array.
{
"total": "1",
"contacts": {
"47": {
"id": 47,
"isPublished": true,
"dateAdded": "2015-07-21T12:27:12-05:00",
"createdBy": 1,
"createdByUser": "Joe Smith",
"doNotContact": [{
"id": 2,
"reason": 2,
"comments": "",
"channel": "email",
"channelId": null
}]
}
}
}
I have tried paths like $.contacts.*.['id','doNotContact'], but this does not seem to work. I am normally able to solve this kind of problem using the website https://goessner.net/articles/JsonPath/.
Not all implementations support the comma-delimited selectors, e.g. ['id','doNotContact']. See the JSON Path comparison project site (specifically this test) for information as to which implementations support the syntax.
Secondly, please see this answer about omitting the dot before the bracket syntax.
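Putting the two points together, the following should work in implementations that do support union selectors (the exact result shape varies by implementation), returning 47 and the doNotContact array for the sample above:
$.contacts.*['id','doNotContact']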
Nested arrays get flattened when returned via "fields". I expected values from the same path to be merged, but not that the internal data structure would be modified.
Could someone explain whether I am doing something incorrectly, or whether this belongs as an Elasticsearch issue?
Steps to reproduce:
Create the 2D data
curl -XPOST localhost:9200/test/5 -d '{ "data": [ [100],[2,3],[6,7] ] }'
Query the data, specifying fields
curl -XGET localhost:9200/test/5/_search -d '{"query":{"query_string":{"query":"*"} }, "fields":["data"] }'
Result:
{"took":1,"timed_out":false,"_shards":{"total":5,"successful":5,"failed":0},"hits":{"total":1,"max_score":1.0,"hits":[{"_index":"test","_type":"5","_id":"AVdsHJrepOClGTFyoGqo","_score":1.0,"fields":{"data":[100,2,3,6,7]}}]}}
Repeat without the use of "fields":
curl -XGET localhost:9200/test/5/_search -d '{"query":{"query_string":{"query":"*"} } }'
Result:
{"took":1,"timed_out":false,"_shards":{"total":5,"successful":5,"failed":0},"hits":{"total":1,"max_score":1.0,"hits":[{"_index":"test","_type":"5","_id":"AVdsHJrepOClGTFyoGqo","_score":1.0,"_source":{ "data": [ [100],[2,3],[6,7] ] }}]}}
Notice that _source and fields differ, in that "fields" decomposes the 2D array into a 1D array.
When you specify nothing else in your request, what you get back for each hit is the "_source" object, that is, exactly the JSON you sent to ES during indexing (even including whitespace!).
When you use source filtering, as Andrey suggests, it's the same except you can include or exclude certain fields.
When you use the "fields" directive in your query, the return values are not taken from the _source but read directly from the Lucene index (see the docs). The key in your search response then switches from "_source" to "fields" to reflect this change.
As alkis said:
https://www.elastic.co/guide/en/elasticsearch/reference/current/array.html
These docs say up front that, yes, Elasticsearch does flatten arrays.
Instead of specifying "fields", I usually use source filtering.
Your query would change to something like:
curl -XGET <IPADDRESS>:9200/test/5/_search -d '{"_source":{"include": ["data"]}, "query":{"query_string":{"query":"*"} }}'
From here https://www.elastic.co/guide/en/elasticsearch/reference/current/array.html it seems that Elasticsearch considers them the same.
In Elasticsearch, there is no dedicated array type. Any field can contain zero or more values by default, however, all values in the array must be of the same datatype. For instance:
an array of strings: [ "one", "two" ]
an array of integers: [ 1, 2 ]
an array of arrays: [ 1, [ 2, 3 ]] which is the equivalent of [ 1, 2, 3 ]
an array of objects: [ { "name": "Mary", "age": 12 }, { "name": "John", "age": 10 }]
You could use an array of JSON objects with the nested data type and a nested query. The nested data type might be helpful here:
PUT /my_index
PUT /my_index/_mapping/my_type
{
"properties" : {
"data" : {
"type" : "nested",
"properties": {
"value" : {"type": "long" }
}
}
}
}
POST /my_index/my_type
{
"data": [
{ "value": [1, 2] },
{ "value": [3, 4] }
]
}
POST /my_index/my_type
{
"data": [
{ "value": [1, 5] }
]
}
GET /my_index/my_type/_search
{
"query": {
"nested": {
"path": "data",
"query": {
"bool": {
"must": [
{
"match": {
"data.value": 1
}
},
{
"match": {
"data.value": 2
}
}
]
}
}
}
}
}
I need to be able to quickly discern whether or not a JSON data structure contains an array in it. For example, the regex should return true for
{
"array": [
1,
2,
3
],
"boolean": true,
"null": null,
"number": 123,
"object": {
"a": "b",
"c": "d",
"e": "f"
},
"string": "Hello World"
}
and false for
{
"boolean": true,
"null": null,
"number": 123,
"object": {
"a": "b",
"c": "d",
"e": "f"
},
"string": "[Hello World]"
}
Any ideas?
I would MUCH prefer if this check could be done via regex instead of traversing the JSON, unless anyone can show me a better way.
You could format the JSON into some "canonical" form using a tool like jshon.
I do not recommend this approach.
$ cat foo.json
{"root": { "foo": 1, "bar": [2,3,4], "baz":"BAZ", "qux":[5,6, [7,8, [9, 10]]]}}
$ jshon < foo.json
{
"root": {
"foo": 1,
"bar": [
2,
3,
4
],
"baz": "BAZ",
"qux": [
5,
6,
[
7,
8,
[
9,
10
]
]
]
}
}
$ jshon < foo.json | grep '^\s*\]'
],
]
]
]
$ echo $?
0
Use this regex:
/:?\s+\[/g
And then you can simply do:
var json = "json here";
var containsArray = json.match(/:?\s+\[/g) !== null; // returns true or false
Demo: http://jsfiddle.net/pMMgb/1
Try
var a = 'json string here'; // the JSON text to test
var regex = /\[(.*)\]/;
regex.test(a);
After reviewing the answers, I've reconsidered, and I will go with parsing and traversing to find the answer to my question.
Regexing turned out to be unreliable, and the other options just add more convolution.
Thanks to everyone for the answers!
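For reference, a minimal parse-and-traverse sketch (plain JavaScript; names are illustrative):
function containsArray(value) {
    // true if the value itself is an array, or if any nested value is
    if (Array.isArray(value)) return true;
    if (value !== null && typeof value === "object") {
        return Object.values(value).some(containsArray);
    }
    return false;
}

var hasArray = containsArray(JSON.parse(jsonText)); // jsonText holds the JSON string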
I have a collection like the following:
{
_id: 5,
"org_name": "abc",
"items": [
{
"item_id": "10",
"data": [
// Values goes here
]
},
{
"item_id": "11",
"data": [
// Values goes here
]
}
]
},
// Another document
{
_id: 6,
"org_name": "sony",
"items": [
{
"item_id": "10",
"data": [
// Values goes here
]
},
{
"item_id": "11",
"data": [
// Values goes here
]
}
]
}
Each document corresponds to an individual organization, and each organization has an array of items.
What I need is to select individual elements from the items array by providing item_id.
I already tried this:
db.organizations.find({"_id": 5}, {items: {$elemMatch: {"item_id": {$in: ["10", "11"]}}}})
But it returns only the first matching item, either the one with item_id "10" or the one with item_id "11".
What I need is to get the values for both item_id 10 and 11 for the organization "abc". Please help.
update2:
db.organizations.aggregate([
// you can remove this to return all your data
{$match:{_id:5}},
// unwind array of items
{$unwind:"$items"},
// filter out all items not in 10, 11
{$match:{"items.item_id":{"$in":["10", "11"]}}},
// aggregate again into array
{$group:{_id:"$_id", "items":{$push:"$items"}}}
])
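With the sample "abc" document above, this pipeline should return roughly (data contents elided as in the question):
{ "_id": 5, "items": [ { "item_id": "10", "data": [ ... ] }, { "item_id": "11", "data": [ ... ] } ] }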
update:
db.organizations.find({
"_id": 5,
items: {$elemMatch: {"item_id": {$in: ["10", "11"]}}}
})
old: Looks like you need the aggregation framework, particularly the $unwind operator:
db.organizations.aggregate([
{$match:{_id:5}}, // you can remove this to return all your data
{$unwind:"$items"}
])
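Note that after $unwind each item becomes its own result document, so for _id 5 this should return roughly:
{ "_id": 5, "org_name": "abc", "items": { "item_id": "10", "data": [ ... ] } }
{ "_id": 5, "org_name": "abc", "items": { "item_id": "11", "data": [ ... ] } }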