This is my payload:
[
[
2452,
1,
"AA",
"SH289122275",
"82310",
"CB",
"83.5"
],
[
3456,
2,
"BB ",
"SH389122275",
"92310",
"BB",
"83.5"
]
]
How can I fit my payload into the transformation below dynamically (i.e. at [111,"aaa"],[222,"bbb"])? I may get more objects in my payload.
%dw 2.0
output application/java
---
{
attribute: Db::createArray("Database_Config","DEMO_OBJECTS",[
Db::createStruct("Database_Config","DEMO_OBJECT",[111,"aaa"]),
Db::createStruct("Database_Config","DEMO_OBJECT",[222,"bbb"])
])
}
It depends on the structure and order of the data in the input. Here is a possible example that you can adapt to your specific needs:
%dw 2.0
output application/java
---
{
attribute: Db::createArray("Database_Config","DEMO_OBJECTS",
payload map
Db::createStruct("Database_Config","DEMO_OBJECT",[ $[0], $[1] ... ]))
}
Related
I have a MongoDB document with the following attributes:
{
"label": [
"ibc",
"ibd",
"ibe"
],
"location": "vochelle st"
}
and I have to return the document only if the document's label exactly matches the given array, i.e. ["ibc","ibd"]. For that, I am using the query:
db.collection.find({"location":"vochelle st","label":{"$all":["ibc", "ibd"]}})
Actual Response:
{
"label": [
"ibc",
"ibd",
"ibe"
],
"location": "vochelle st"
}
Expected Response:
{}
Since the label "ibe" doesn't exist in the given array, the expected result has to be the empty dictionary.
Add $size to your query:
db.collection.find({
  location: "vochelle st",
  label: {
    $all: [
      "ibc",
      "ibd"
    ],
    $size: 2
  }
})
mongoplayground
1. Use $setIntersection to intersect the label array with the input array.
2. Compare the intersected array (from step 1) with the label array using $eq.
db.collection.find({
  "location": "vochelle st",
  $expr: {
    $eq: [
      {
        $setIntersection: [
          "$label",
          [
            "ibc",
            "ibd"
          ]
        ]
      },
      "$label"
    ]
  }
})
Sample Mongo Playground
If you want to check whether the array exactly matches your input, you don't need any operator; just compare it with your value (note that this comparison is order-sensitive):
db.collection.find({"location":"vochelle st","label": ["ibc", "ibd"]})
I have a JSON payload in Mule that I am trying to convert to XML. Whatever value is in "entity_id" should be the "payment_id" in my output, and whatever value comes with the key "selected_payment_option" should be the "method" in my output.
Please see my sample input here:
{
  "items": [
    {
      "payment": {
        "entity_id": 1485222,
        "method": "m2_kp"
      },
      "extension_attributes": {
        "payment_addition_info": [
          {
            "key": "m1_code",
            "value": "over_time"
          },
          {
            "key": "order_id",
            "value": "4f86-2cce-4870-ad05-089a333"
          },
          {
            "key": "selected_payment_option",
            "value": "slice_by_card"
          }
        ]
      }
    }
  ]
}
And my desired output for this would be
<root>
<payment>
<payment_id>1485222</payment_id>
<method>slice_by_card</method>
</payment>
</root>
Any help would be greatly appreciated. Thank you!
I used map since items is an array, so there can be multiple entries inside items.
I used default "" in case no key matches selected_payment_option.
DW
%dw 2.0
output application/xml
---
root: payment: payload.items map {
    payment_id: $.payment.entity_id,
    method: (flatten($..payment_addition_info) filter ($.key=="selected_payment_option"))[0].value default ""
}
Objective: Get all rows in a CSV file (minimum 1 row, possibly more), add a fixed string, and use the variable in a JSON array.
Sorry if the question is unclear; I'm not sure what the correct question to ask is.
I want the "entryList" format to be like below so I can successfully post the JSON object, but none of the approaches below work.
"entryList": [
{"entry": ["value_from","csv",""]},
{"entry": ["value_from","csv",""]},
{"entry": ["value_from","csv",""]}
]
Dumps JSON (full format):
x = {
    "act.addEntries": {
        "act.authToken": "test123",
        "act.resourceId": "asdasda=123asd1",
        "act.entryList": {
            "columns": [
                "domainName", "threatInfo", "riskScore"
            ],
            "entryList": [
                VARIABLE
            ]
        }
    }
}
"""
Sample CSV file
domainName,threatInfo,riskScore
xxx.xxx.com,Test1,
yyy.yyy.com,Test2,
zzz.zzz.com,Test3,
"""
# CSV to List
import csv

domainList = []
with open('Desktop/domainMatches.csv', 'r') as file:
    reader = csv.reader(file)
    col = next(reader)  # skip the header row
Approach 1 - String formatting
Problem: the JSON dump only shows 1 entryList value, and I couldn't find the correct way to escape {}, so it led to an undesired JSON object.
for data in reader:
    domainList.append(data)
    line = '{{"entry": {}}}'.format(data)
    print(line)
**OUTPUT:**
{"entry": ['xxx.xxx.com', 'Test1', '']}
{"entry": ['yyy.yyy.com', 'Test2', '']}
{"entry": ['zzz.zzz.com', 'Test3', '']}
...
...
"entryList": [
{{{}}}.format(line)
]
**OUTPUT - JSON Dumps:**
{
"act.addEntries": {
"act.authToken": "test123",
"act.resourceId": "asdasda=123asd1",
"act.entryList": {
"columns": [
"domainName",
"threatInfo",
"riskScore"
],
"entryList": [
"{{\"entry\": ['zzz.zzz.com', 'Test3', '']}}"
]
}
}
}
Approach 2 - String concatenation
Too many problems here as well; advice welcome.
line = ""
for value in reader:
line+="{"
line+=""" "entry": """
line+=str(value)
line+="},\n"
**OUTPUT:**
{"entry": ['xxx.xxx.com', 'Test1', '']},
{"entry": ['yyy.yyy.com', 'Test2', '']},
{"entry": ['zzz.zzz.com', 'Test3', '']},
...
...
"entryList": [
"""+line+"""
]
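A more robust way to get that shape is to skip manual string formatting entirely: build the entry list as Python lists/dicts and let json.dumps serialize the whole body at once. Below is a minimal sketch (not from the thread), reusing the sample token, resource id, and CSV path shown above:
import csv
import json

# Read the CSV rows (skipping the header) and wrap each row as {"entry": [...]}.
entry_list = []
with open('Desktop/domainMatches.csv', 'r') as file:
    reader = csv.reader(file)
    next(reader)  # header: domainName,threatInfo,riskScore
    for row in reader:
        entry_list.append({"entry": row})

# Assemble the full request body as a dict, then serialize it once.
body = {
    "act.addEntries": {
        "act.authToken": "test123",
        "act.resourceId": "asdasda=123asd1",
        "act.entryList": {
            "columns": ["domainName", "threatInfo", "riskScore"],
            "entryList": entry_list
        }
    }
}

print(json.dumps(body, indent=2))
Since json.dumps converts the Python lists and dicts itself, there is no brace escaping or string concatenation, and the trailing empty riskScore column comes through as "".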
I am interested in storing a key-value pair of metadata inside a JSON array containing multiple JSON objects. This will instruct a generic parser what to do with the list of JSON objects in the JSON array when it is processing it. Below is a sample JSON showing where I am hoping to have some sort of metadata field.
{
"Data": [
<< "metadata":"instructions" here >>
{
"foo": 1,
"bar": "barString"
},
{
"foo": 3,
"bar": "fooString"
}
]
}
What is the proper way to structure this mixed data JSON array?
I would add a Meta key as a peer of Data, like below. This would separate your data from the metadata.
{
"Meta": {
"metadata":"instructions"
},
"Data": [
{
"foo": 1,
"bar": "barString"
},
{
"foo": 3,
"bar": "fooString"
}
]
}
If you can modify the structure of the data, why not add a property meta with your instructions (i.e. Data.meta) and another property content (for want of a better word...) (i.e. Data.content), where the latter is the original array of objects?
That way, it is still valid JSON, and other implementations can read the meta field as well without much ado.
Edit: I just realized you would also have to make Data an object rather than an array. Your JSON would then become this:
{
"Data": {
"metadata": "instructions here",
"content": [
{
"foo": 1,
"bar": "barString"
},
{
"foo": 3,
"bar": "fooString"
}
]
}
}
This will probably be the most stable, maintainable and portable solution.
For reference, something similar has already been asked before.
After some additional discussion with another developer, we thought of one way to include the metadata instructions in the data JSON array.
{
"Data": [
{
"metadata": "Instructions"
},
{
"foo": 1,
"bar": "barString"
},
{
"foo": 3,
"bar": "fooString"
}
]
}
This approach does come with the limitation that index 0 of the data JSON array MUST contain a JSON Object containing the metadata and associated instructions for the generic parser. Failure to include this metadata object as index 0 would trigger an error case that the generic parser would need to handle. So it does have its trade-offs.
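For illustration only, here is a rough sketch (not from the discussion) of how a generic parser might enforce that index-0 contract; parse_data and apply_instructions are hypothetical names:
def apply_instructions(instructions, record):
    # Placeholder: what the instructions actually mean is application-specific.
    return {"instructions": instructions, **record}

def parse_data(document):
    data = document["Data"]
    # Contract: index 0 must be the metadata object, otherwise it is an error case.
    if not data or not isinstance(data[0], dict) or "metadata" not in data[0]:
        raise ValueError("Data[0] must be a metadata object with parser instructions")
    instructions = data[0]["metadata"]
    return [apply_instructions(instructions, record) for record in data[1:]]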
I will try to help you:
"metadata" : [
{
"foo": 1,
"bar": "barString"
},
{
"foo": 3,
"bar": "fooString"
}
]
Nested arrays get flattened when represented in "fields". I expect values from the same path to be merged, but not the internal data structure to be modified.
Could someone explain whether I am doing something incorrectly, or whether this belongs as an Elasticsearch issue?
Steps to reproduce:
Create the 2D data
curl -XPOST localhost:9200/test/5 -d '{ "data": [ [100],[2,3],[6,7] ] }'
Query the data, specifying fields
curl -XGET localhost:9200/test/5/_search -d '{"query":{"query_string":{"query":"*"} }, "fields":["data"] } }'
Result:
{"took":1,"timed_out":false,"_shards":{"total":5,"successful":5,"failed":0},"hits":{"total":1,"max_score":1.0,"hits":[{"_index":"test","_type":"5","_id":"AVdsHJrepOClGTFyoGqo","_score":1.0,"fields":{"data":[100,2,3,6,7]}}]}}
Repeat without the use of "fields":
curl -XGET localhost:9200/test/5/_search -d '{"query":{"query_string":{"query":"*"} } } }'
Result:
{"took":1,"timed_out":false,"_shards":{"total":5,"successful":5,"failed":0},"hits":{"total":1,"max_score":1.0,"hits":[{"_index":"test","_type":"5","_id":"AVdsHJrepOClGTFyoGqo","_score":1.0,"_source":{ "data": [ [100],[2,3],[6,7] ] }}]}}
Notice that _source and fields differ, in that "fields" decomposes the 2D array into a 1D array.
When you specify nothing else in your request, what you get back for each hit is the "_source" object, that is, exactly the JSON you sent to ES during indexing (even including whitespace!).
When you use source filtering, as Andrey suggests, it's the same, except you can include or exclude certain fields.
When you use the "fields" directive in your query, the return values are not taken from the _source but read directly from the Lucene index (see docs). The key in your search response then switches from "_source" to "fields" to reflect this change.
As alkis said:
https://www.elastic.co/guide/en/elasticsearch/reference/current/array.html
These docs say up front that, yes, Elasticsearch does flatten arrays.
Instead of specifying "fields" I usually do source filtering
Your query would change to something like:
curl -XGET <IPADDRESS>:9200/test/5/_search -d '{"_source":{"include": ["data"]}, "query":{"query_string":{"query":"*"} }}'
From here https://www.elastic.co/guide/en/elasticsearch/reference/current/array.html
it seems that Elasticsearch considers them the same:
In Elasticsearch, there is no dedicated array type. Any field can contain zero or more values by default, however, all values in the array must be of the same datatype. For instance:
an array of strings: [ "one", "two" ]
an array of integers: [ 1, 2 ]
an array of arrays: [ 1, [ 2, 3 ]] which is the equivalent of [ 1, 2, 3 ]
an array of objects: [ { "name": "Mary", "age": 12 }, { "name": "John", "age": 10 }]
You could use an array of JSON objects with the nested data type and a nested query.
Maybe the nested data type could be helpful:
PUT /my_index
PUT /my_index/_mapping/my_type
{
  "properties": {
    "data": {
      "type": "nested",
      "properties": {
        "value": { "type": "long" }
      }
    }
  }
}
POST /my_index/my_type
{
  "data": [
    { "value": [1, 2] },
    { "value": [3, 4] }
  ]
}
POST /my_index/my_type
{
  "data": [
    { "value": [1, 5] }
  ]
}
GET /my_index/my_type/_search
{
  "query": {
    "nested": {
      "path": "data",
      "query": {
        "bool": {
          "must": [
            { "match": { "data.value": 1 } },
            { "match": { "data.value": 2 } }
          ]
        }
      }
    }
  }
}