I have the following data:
{
"results":[
{
"ID":"1",
"products":[
{
"product":"car",
"number":"5"
},
{
"product":"computer",
"number":"212"
}
]
},
{
"ID":"2",
"products":[
{
"product":"car",
"number":"9"
},
{
"product":"computer",
"number":"463"
},
{
"product":"bicycle",
"number":"5"
}
]
}
]
}
And here is my query:
{
"query":{
"bool":{
"must":[
{
"wildcard":{
"results.products.product":"*car*"
}
},
{
"wildcard":{
"results.products.number":"*5*"
}
}
]
}
}
}
What I expect is to get only ID1, because only it has a product record with { "product":"car", "number":"5" }. But what I get is both ID1 and ID2, because ID2's first record has "product":"car" and its third record has "number":"5", just in separate elements.
How can I fix this query?
You need to define products as a nested type when creating the mapping. Try the following mapping example:
PUT http://localhost:9200/indexname
{
"mappings": {
"typename": {
"properties": {
"products" : {
"type" : "nested"
}
}
}
}
}
Then you can use a nested query to match entire elements of your array, which is exactly what you need:
{
"query": {
"nested": {
"path": "products",
"query": {
"bool": {
"must": [
{ "wildcard": { "products.product": "*car*" }},
{ "wildcard": { "products.number": "*5*" }}
]
}
}
}
}
}
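For a quick check, here is how sample documents could be indexed under that mapping (assuming each entry of your results array is stored as its own document, which is what the query above expects):
PUT http://localhost:9200/indexname/typename/1
{
  "ID": "1",
  "products": [
    { "product": "car", "number": "5" },
    { "product": "computer", "number": "212" }
  ]
}
PUT http://localhost:9200/indexname/typename/2
{
  "ID": "2",
  "products": [
    { "product": "car", "number": "9" },
    { "product": "computer", "number": "463" },
    { "product": "bicycle", "number": "5" }
  ]
}
With the nested mapping in place, both wildcard clauses must match inside the same products element, so the nested query above returns only document 1.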
I want to write a Jolt definition using cardinality MANY that transforms the object "PO_POD_LN_EVW1" into a list, and leaves it unchanged if it is already a list.
Input JSON:
{
"PURCHASE_ORDER_DISPATCH": {
"MsgData": {
"Transaction": {
"PSCAMA": {
"PUBLISH_RULE_ID": {
"IsChanged": "Y"
}
},
"PO_POD_HDR_EVW1": {
"STATE_BILL": "",
"CURRENCY_CD": {
"IsChanged": "Y",
"content": "USD"
},
"ADDRESS4_VNDR": "",
"PO_POD_LN_EVW1": {
"WG_ACCOUNT": 641100,
"LINE_NBR": {
"IsChanged": "Y",
"content": 1
},
"ITM_ID_VNDR": "B0798CX2Q9",
"PO_POD_SHP_EVW1": {
"LINE_NBR": {
"IsChanged": "Y",
"content": 1
}
}
},
"WG_ADDR_SEQ_NUM": 1
}
}
}
}
}
JOLT spec:
{
"operation": "cardinality",
"spec": {
"PURCHASE_ORDER_DISPATCH": {
"MsgData": {
"Transaction": {
"PO_POD_HDR_EVW1": "MANY"
}
}
}
}
}
Getting this error: Failed to Transform
Your Jolt spec is correct, but you should wrap all of your specs in a [] array.
Try this:
[
{
"operation": "cardinality",
"spec": {
"PURCHASE_ORDER_DISPATCH": {
"MsgData": {
"Transaction": {
"PO_POD_HDR_EVW1": "MANY"
}
}
}
}
}
]
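Since your goal is to turn PO_POD_LN_EVW1 into a list, the same cardinality pattern can also be applied one level deeper; a sketch, assuming PO_POD_HDR_EVW1 itself stays a single object in the input:
[
  {
    "operation": "cardinality",
    "spec": {
      "PURCHASE_ORDER_DISPATCH": {
        "MsgData": {
          "Transaction": {
            "PO_POD_HDR_EVW1": {
              "PO_POD_LN_EVW1": "MANY"
            }
          }
        }
      }
    }
  }
]
MANY wraps a single object in a list and leaves an existing list as it is, which matches the behaviour you describe.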
In my MongoDB database, I have a collection 'produits' with documents like this:
{
"_id": {
"$oid": "6048e97b4a5f000096007505"
},
"modeles": [
{
"id": "OppoA3",
"pieces": [
{
"id": "OppoA3avn"
},
{
"id": "OppoA3bat"
}]
},
{
"id": "OppoA1",
"pieces": [
{
"id": "OppoA1avn",
},
{
"id": "OppoA1batt",
}
]
}
]
}
How can I delete all modeles.pieces from all my documents?
I managed to delete them with a filter on modeles.id using the code below, but not across the whole collection:
db.produits.update(
{marque_id:'OPPO', 'modeles.id':'RENOZ'},
{$set:
{
'modeles.$.pieces': []
}
}
, { multi : true }
)
In the end, I would like all documents to look like this:
{
"_id": {
"$oid": "6048e97b4a5f000096007505"
},
"modeles": [
{
"id": "OppoA3",
"pieces": []
},
{
"id": "OppoA1",
"pieces": []
}
]
}
Thank you for your help.
I have written a JavaScript loop like this, but I don't think it is best practice:
async removePieces(){
var doc
try {
doc = await produitModel.find()
for (var produit of doc) {
for (var modele of produit.modeles) {
const filter = {'marque_id': produit.marque_id, 'modeles.id': modele.id}
const set = {
$set: {
'modeles.$.pieces': []
}
}
await db.collection('produits').updateOne(filter, set)
}
}
console.log('removePieces() ==> Terminé')
} catch(err) {
console.log(err)
}
}
db.produits.update({
  modeles: { // only documents that actually have a modeles array, otherwise the update would fail
    $exists: true
  }
},
{
  $set: {
    "modeles.$[].pieces": [] // $[] (MongoDB 3.6+) targets every element of modeles, not just the first matching one
  }
},
{
  multi: true
})
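If you prefer to do this from your Node.js code, the whole loop above collapses into a single updateMany call; a minimal sketch, assuming the same db handle and 'produits' collection as in your snippet, and a MongoDB 3.6+ server for $[]:
async function removeAllPieces() {
  try {
    // one statement empties pieces in every element of modeles, in every document
    const result = await db.collection('produits').updateMany(
      { modeles: { $exists: true } },         // skip documents without a modeles array
      { $set: { 'modeles.$[].pieces': [] } }  // $[] = all-positional operator
    )
    console.log('removeAllPieces() ==>', result.modifiedCount, 'documents updated')
  } catch (err) {
    console.log(err)
  }
}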
Here is my JSON in MongoDB Compass. I am just querying for the products whose rating is greater than a given value in each document.
Note: if I filter on PageCount it works fine, because that field is not inside a nested array.
{ PageCount: { $gte: 2 } } works.
The problem is with the inner array of arrays: if any one element matches, all of them are displayed.
When I run the query below, if any one of the entries has a rating of 99 or more, all the values are shown.
{"ProductField.ProductDetailFields.ProductDetailInfo.ProductScore.Rating": {$exists: true, $ne: null, $gte: 99}}
If I perform the above query, I get the output below.
How can I iterate (foreach-style) over the elements and check the condition in a MongoDB query?
{
"_id":{
"$oid":"5fc73a7b3fb52d00166554b9"
},
"ProductField":{
"PageCount":2,
"ProductDetailFields":[
{
"PageNumber":1,
"ProductDetailInfo":[
{
"RowIndex":0,
"ProductScore":{
"Name":"Samsung",
"Rating":99
}
},
{
"RowIndex":1,
"ProductScore":{
"Name":"Nokia",
"Rating":96
}
},
{
"RowIndex":2,
"ProductScore":{
"Name":"Apple",
"Rating":80
}
}
]
}
]
}
},
{
"_id":{
"$oid":"5fc73a7b3fb52d0016655450"
},
"ProductField":{
"PageCount":2,
"ProductDetailFields":[
{
"PageNumber":1,
"ProductDetailInfo":[
{
"RowIndex":0,
"ProductScore":{
"Name":"Sony",
"Rating":93
}
},
{
"RowIndex":1,
"ProductScore":{
"Name":"OnePlus",
"Rating":93
}
},
{
"RowIndex":2,
"ProductScore":{
"Name":"BlackBerry",
"Rating":20
}
}
]
}
]
}
}
#Misky How do I run this query? When I run it in the Mongo shell (Nosqlclient) it throws the error below. We are using 3.4.9: https://www.nosqlclient.com/demo/
Is this somewhat close to your idea?
db.collection.aggregate([{
$addFields: {
"ProductField.ProductDetailFields": {
$map: {
"input": "$ProductField.ProductDetailFields",
as: "pdf",
in: {
$filter: {
input: {
$map: {
"input": "$$pdf.ProductDetailInfo",
as: "e",
in: {
$cond: [
{
$gte: [
"$$e.ProductScore.Rating",
99
]
},
{
$mergeObjects: [
"$$e",
{
PageNumber: "$$pdf.PageNumber"
}
]
},
null
]
}
}
},
as: "i",
cond: {
$ne: [
"$$i",
null
]
}
}
}
}
}
}
},
{
$addFields: {
"ProductField.ProductDetailFields": {
"$arrayElemAt": [
"$ProductField.ProductDetailFields",
0
]
}
}
}])
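If you also want to drop the documents where nothing survived the filter (so the second document disappears entirely instead of keeping an empty array), you could append one more stage to the pipeline; a sketch, assuming the two $addFields stages above have already run:
{
  // keep only documents whose filtered ProductDetailFields still contains at least one entry
  $match: {
    "ProductField.ProductDetailFields.0": { $exists: true }
  }
}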
I have a complex structure similar to:
{
type: "Data",
data: [
"item1": {...},
"item2": {...},
...
"itemN": {
"otherStructure": {
"testData": [
{
"values": {...}
},
{
"values": {
"importantKey": ObjectId("23a2345gf651")
}
}
]
}
}
]
}
For this kind of data structure I want to change the data type of all these importantKey values from ObjectId to string.
I tried queries similar to:
db.collection.updateMany(
{type:"Data"},
{$toString: "data.$[element].otherStructure.testData.$[element].values.importantKey"},
{"data.element.otherStructure.testData.element.values.importantKey": {$type: "objectId"}})
But none of these attempts were successful.
So, is there an adequate way of updating such data?
UPDATE
Sorry for the confusion, my structure is more complex:
data.content.$[element].otherStructure.testData.keys.$[element].values.$[element].meta.importantKey
Each of these $[element] placeholders means that the field before it is a list of objects.
You may use this workaround:
db.collection.aggregate([
{
$addFields: {
"data.content": {
$map: {
input: "$data.content",
as: "data",
in: {
otherStructure: {
testData: {
keys: {
$map: {
input: "$$data.otherStructure.testData.keys",
as: "testData",
in: {
"values": {
$map: {
input: "$$testData.values",
as: "values",
in: {
"meta": {
"importantObject": {
"importantKey": {
$toString: "$$values.meta.importantObject.importantKey"
}
}
}
}
}
}
}
}
}
}
}
}
}
}
}
},
{$out:"collection"}
])
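As a side note, if you are on MongoDB 4.2 or newer, the same $toString reshaping can be run directly inside updateMany as a pipeline update, which avoids rewriting the whole collection with $out. A minimal sketch on a deliberately simplified structure (a top-level data array whose elements carry importantKey directly; the field names are only illustrative):
db.collection.updateMany(
  { type: "Data" },
  [
    {
      $set: {
        data: {
          $map: {
            input: "$data",
            as: "d",
            in: {
              // keep every other field of the element, only convert importantKey to a string
              $mergeObjects: [
                "$$d",
                { importantKey: { $toString: "$$d.importantKey" } }
              ]
            }
          }
        }
      }
    }
  ]
)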
I am new to Groovy and I want to construct a JSON object with the builder:
{
"query": {
"bool": {
"must": [
{
"bool": {
"should": [
{ "match": { "content": "scontent" } },
{ "match": { "title":"stitle" } }
]
}
},
{
"bool": {
"should": [
{ "match": { "a1": "v1" } },
{ "match": { "a2":"v2" } },
... and so on ...
{ "match": { "an":"vn" } }
]
}
}
]
}
},
"highlight": {
"fields": {
"content":{}
}
}
}
I searched a lot on other Stack Overflow posts and wrote the code below, but there is no way to get what I want:
JsonBuilder builder = new JsonBuilder()
def body = builder {
from Lib.or(qQuery.start, 0)
size Lib.or(qQuery.num, 10)
query {
bool {
must [
{
bool {
should [
{ match { content 'scontent' } },
{ match { title 'stitle' } }
]
}
},
{
bool {
should myVals.collect {[
'match' : { it.key it.value }
]}
}
}
]
}
}
highlight {
fields {
content {}
}
}
}
Thanks for any help!
I think you can make this work with the JsonBuilder as is, but it is usually easier to create the data structure using maps and lists (which is what the builder outputs) in Groovy, as you have more control there.
Example code:
import groovy.json.*
def data = [
query: [
bool: [
must: [
[bool:
[should: [
[match: [ content: 'scontent']],
[match: [ title: 'stitle']]
]]
],
[bool:
[should: [
[match: [ a1: 'v1']],
[match: [ a2: 'v2']],
[match: [ vn: 'vn']]
]]
]
]
]
]
]
println JsonOutput.prettyPrint(JsonOutput.toJson(data))
produces:
{
"query": {
"bool": {
"must": [
{
"bool": {
"should": [
{
"match": {
"content": "scontent"
}
},
{
"match": {
"title": "stitle"
}
}
]
}
},
{
"bool": {
"should": [
{
"match": {
"a1": "v1"
}
},
{
"match": {
"a2": "v2"
}
},
{
"match": {
"vn": "vn"
}
}
]
}
}
]
}
}
}
I did not include your full JSON as it takes up some space, but the structure is there. Note the use of lists ([valueA, valueB]) vs maps ([someKey: someValue]) in the data structure.
Granted, this makes the JsonBuilder less than 100% useful, but I haven't seen any concise way of including large JSON objects as elements of a list within the structure. You can do:
def json = new JsonBuilder()
json.query {
bool('list', 'of', 'values')
}
but for larger structures as list elements I would say go with the lists and maps approach.