Parsing JSON array in Codename One fails - codenameone

I am trying to use PropertyBusinessObject.getPropertyIndex().loadJsonList(storage) to load a list of PropertyBusinessObjects. It keeps failing for this JSON, which I have verified is valid:
[{
"_id": "5e8ef0485053da7500020c79",
"name": "CriminalAct",
"description": "A criminal act",
"label": "Criminal Act",
"_created": "2020-04-09T09:52:08.689Z",
"comment_services": [
"5c49c41840e295040000582a",
"5c49c2b140e2950400005811",
"5b632dfabba4456c000002b1"
],
"service_attributes": [
"5ba1192eda1139170000482c",
"5e8f02c45053da7500020d04",
"5e8f03355053da7500020d08",
"5e8f03d45053da7500020d0e",
"5e8f06585053da7500020d23",
"5e8f08c35053da7500020d35",
"5e8f0c035053da7500020d58",
"5c347ec511f06f26000027a4",
"5c3471f711f06f26000025fe",
"5c347d2811f06f2600002768"
],
"category": [
"5ca7939d0eaf051400001c5f",
"5d8e2f84bc58cd2800000854",
"5e8ee9655053da7500020c41"
],
"parent": [],
"providers": ["5a57a54c9d89946a00002216"],
"logo": ["5e8ef2a15053da7500020c8e"]
}]
I get this error:
java.lang.ArrayIndexOutOfBoundsException: -1
[EDT] 0:0:2,193 - Exception during JSON parsing at row: 1 column: 9 buffer:
[EDT] 0:0:2,197 - Exception: java.lang.NullPointerException - null
[EDT] 0:0:2,198 - Exception: java.lang.NullPointerException - null
at java.util.ArrayList.elementData(ArrayList.java:418)
at java.util.ArrayList.remove(ArrayList.java:495)
at com.codename1.io.JSONParser.endArray(JSONParser.java:625)
at com.codename1.io.JSONParser.parse(JSONParser.java:387)
at com.codename1.io.JSONParser.parseJSON(JSONParser.java:484)
at com.codename1.properties.PropertyIndex.loadJSONList(PropertyIndex.java:667)
I am quite puzzled as to what could be wrong.
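One way to narrow this down, since the parser dies with an empty buffer at row 1, column 9, is to read the stored JSON back and push it through com.codename1.io.JSONParser directly, bypassing the properties machinery. The sketch below is only a debugging aid under assumptions: "services" is a placeholder for whatever storage entry name was actually passed to loadJsonList, and top-level arrays are, as far as I recall, returned by the parser under a "root" key.

import com.codename1.io.JSONParser;
import com.codename1.io.Log;
import com.codename1.io.Storage;
import com.codename1.io.Util;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.Map;

// Debug sketch: read the stored JSON back and parse it with JSONParser directly.
// "services" is a placeholder for the storage entry name used with loadJsonList(...).
private void dumpStoredJson() {
    InputStream in = null;
    try {
        in = Storage.getInstance().createInputStream("services");
        if (in == null) {
            Log.p("Storage entry 'services' not found");
            return;
        }
        // For a top-level JSON array the parser normally returns a map whose
        // "root" key holds the list of element maps.
        Map<String, Object> result = new JSONParser().parseJSON(new InputStreamReader(in, "UTF-8"));
        Log.p("Parsed content: " + result);
    } catch (IOException err) {
        Log.e(err);
    } finally {
        Util.cleanup(in);
    }
}

If this prints the expected list, the problem is likely in how the property index maps the entries; if it fails the same way, the stored content itself is the thing to inspect.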

Related

Solr indexing nested objects array

We're trying to run an index in Solr (8.9.0, schemaless mode) of a list of items that each contain one or two arrays of objects, with one or more records per array. The sample below is the JSON we feed to the index:
[
{
"id": 8270861,
"type": "Product",
"title": "Stripped T-shirt"
"tags": [{
"tagId": 218,
"tagIcon": "smile,happy",
"tagHelpText": "",
"tagValue": "grand"
},
{
"tagId": 219,
"tagIcon": "frown,sad",
"tagHelpText": "",
"tagValue": "grand"
}],
"keywords": [
{
"keywordId": 742,
"type": "color"
},
{
"keywordId": 743,
"type": "size"
}]
}
]
2 problems we run into:
PROBLEM 1:
The output of the Solr query changes the format of the arrays to this (effectively removing the quotes):
...
"tags": [
"{tagIcon=smile,happy, tagHelpText=, tagId=218, tagValue=grand}",
"{tagIcon=frown,sad, tagHelpText=, tagId=219, tagValue=grand}"
],
"keywords": [
"{type=color, keywordId=742}",
"{type=size, keywordId=743}"
],
...
Is there a way to get the arrays to come back in the same format they were fed into the index:
"tags": [
{ "tagId": 218, "tagIcon": "smile,happy", "tagHelpText": "", "tagValue": "grand" },
{ "tagId": 219, "tagIcon": "frown,sad", "tagHelpText": "", "tagValue": "grand"}
]
to avoid any conflicts when the value is a comma-separated list. Are we missing some definition adjustments in the schema file? If so, do we need to define the children of those parent keys (e.g. "tags.tagIcon")?
PROBLEM 2:
The index seems to reject an array with a single element. If we feed it the same JSON as above, but with only one entry in the keywords array (or the tags array):
...
"keywords": [
{
"keywordId": 742,
"type": "color"
}]
...
it throws code 400, "Unknown operation for the an atomic update: type".
Any suggestions on this would be welcome.
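Not from the original thread, but one possible direction, hedged: in schemaless mode nested JSON objects tend to be flattened into their string form, and the JSON update handler can interpret a lone map value as an atomic-update operation, which would fit the 400 "Unknown operation" error above. A way to keep the structure is to index the nested records explicitly as child documents, sketched below with SolrJ; the URL, collection name and child id scheme are placeholders, and child documents normally require the _root_ (and optionally _nest_path_) fields in the schema.

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrInputDocument;

// Sketch: index the "tags" entries as explicit child documents so they keep
// their structure instead of being flattened to strings by schemaless mode.
// URL, collection name and the child id scheme are placeholders.
public class NestedIndexer {
    public static void main(String[] args) throws Exception {
        try (SolrClient client =
                new HttpSolrClient.Builder("http://localhost:8983/solr/products").build()) {
            SolrInputDocument parent = new SolrInputDocument();
            parent.addField("id", "8270861");
            parent.addField("type", "Product");
            parent.addField("title", "Stripped T-shirt");

            SolrInputDocument tag = new SolrInputDocument();
            tag.addField("id", "8270861-tag-218");
            tag.addField("tagId", 218);
            tag.addField("tagIcon", "smile,happy");
            tag.addField("tagValue", "grand");
            parent.addChildDocument(tag); // nested record stays a document

            client.add(parent);
            client.commit();
        }
    }
}

Reading the children back in their original nested shape would then typically go through the [child] document transformer (e.g. fl=*,[child]) rather than a plain field list.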

I have a datastudio error - basic json config

I've returned to try and make some Data Studio custom JavaScript.
So I started off with template-type settings and basic JS. The manifest is listing correctly - Data Studio sees the custom item.
It took a long time for it to be authorised.
However, on adding the custom JS, the console is reporting a load of errors:
first: data.0.type is not a valid config
second: data.0.elements.data.0.type is not a valid config.
JSON:
{
"data": [
{
"id": "idtestviz",
"label": "Dimension Element Heading",
"type":"DIMENSION"
}
]
,
"style": [
{
"id": "idtestvizstyles",
"label": "Test Styles",
"elements":[
{
"id":"idtestvizfontcolor",
"label":"Font Colour",
"defaultValue":"#FFFF00"
}
]
}
]
}
It did have options in before; same error.
And it appears to be the same as in https://developers.google.com/datastudio/visualization/define-config
It is also erroring on 'is already used in the config',
and that data.0.elements.style.0.elements.0.type is a required field that cannot be found.
Seems like there are more checks that need to be done.
Is there a validator for the JSON etc. before running, or has something updated on Google's side that their documentation hasn't caught up with yet?
Or the more likely aspect, I'm missing some critical stuff...
Regards
Vince
Re-checked my JSON config against a previous one that works and noted some errors in the objects. Corrected those, and the JSON errors in the console have gone away.
JS errors remain - working on those... closing this question. The corrected config is below:
{
"data": [
{
"id":"test_viz_data",
"label":"Test Viz Data",
"elements":[
{
"id": "text_viz_dimensions",
"label": "Dimension Element Heading",
"type": "DIMENSION",
"options": {
"min": 1,
"max": 1
}
}
,
{
"id": "test_metrics",
"label": "Metric fields",
"type": "METRIC",
"options": {
"min": 1,
"max": 1
}
}
]
}
]
,
"style": [
{
"id": "idstyles",
"label": "Test Styles",
"elements":[
{
"id":"idfontcolor",
"label":"Font Colour",
"type":"FONT_COLOR",
"defaultValue":"#FFFF00"
}
]
}
]
,
"interactions": [
]
}

Solr DIH showing data import successful but no docs retrieved via query

I am using the SolrEntityProcessor in my DIH config to reindex data from one collection to another.
Here is my DIH config:
<dataConfig>
<document>
<entity name="sep" processor="SolrEntityProcessor"
url="http://127.0.0.1:8983/solr/techPro2 "
query="*:*"/>
</document>
</dataConfig>
I have another collection, techproducts (the destination collection), which has the same configset (sample_techproducts_configs) as techPro2 (my source collection).
So after performing a full-import of the data, this is the output I get:
Indexing completed. Added/Updated: 10 documents. Deleted 0 documents.
(Duration: 01s) Requests: 0 , Fetched: 10 10/s, Skipped: 0 ,
Processed: 10 10/s Started: less than a minute ago
Also, in debug mode, here is the detailed output:
"responseHeader": {
"rf": 2147483647,
"status": 0,
"QTime": 120
},
"initArgs": [
"defaults",
[
"config",
"DIHconfigfile.xml"
]
],
"command": "full-import",
"mode": "debug",
"documents": [
{
"author": "Glen Cook",
"genre_s": "fantasy",
"price_c____l_ns": 699,
"series_t": "The Chronicles of The Black Company",
"price_c": "6.99,USD",
"author_s": "Glen Cook",
"_version_": 1656949432473616400,
"price": 6.99,
"cat": "book",
"name": "The Black Company",
"inStock": false,
"sequence_i": 1,
"id": "0812521390"
},
{},....{}
],
"verbose-output": [],
"status": "idle",
"importResponse": "",
"statusMessages": {
"Total Requests made to DataSource": "0",
"Total Rows Fetched": "10",
"Total Documents Processed": "10",
"Total Documents Skipped": "0",
"Full Dump Started": "2020-01-28 06:14:23",
"": "Indexing completed. Added/Updated: 10 documents. Deleted 0 documents.",
"Committed": "2020-01-28 06:14:23",
"Time taken": "0:0:0.77",
"Full Import failed": "2020-01-28 06:14:23"
}
}
Now, in the JSON response, the last key, "Full Import failed": "2020-01-28 06:14:23", says the import failed, even though the status message says indexing completed, and when I query the collection I get 0 docs returned:
{
"responseHeader":{
"zkConnected":true,
"status":0,
"QTime":12,
"params":{
"q":"*:*",
"_":"1580193223772"}},
"response":{"numFound":0,"start":0,"maxScore":0.0,"docs":[]
}}
Edit 1
Here are the errors in the logs:
Full Import failed:org.apache.solr.update.processor.DistributedUpdateProcessor$DistributedUpdatesAsyncException: Async exception during distributed update: Error from server at http://172.23.98.162:7574/solr/filmsCopy_shard2_replica_n6/: null
request: http://172.23.98.162:7574/solr/filmsCopy_shard2_replica_n6/
Remote error message: version conflict for /en/code_46 expected=1656968541372416000 actual=-1
at org.apache.solr.update.processor.DistributedZkUpdateProcessor.doDistribFinish(DistributedZkUpdateProcessor.java:1189)
at org.apache.solr.update.processor.DistributedUpdateProcessor.finish(DistributedUpdateProcessor.java:1096)
at org.apache.solr.update.processor.LogUpdateProcessorFactory$LogUpdateProcessor.finish(LogUpdateProcessorFactory.java:182)
at org.apache.solr.update.processor.UpdateRequestProcessor.finish(UpdateRequestProcessor.java:80)
at org.apache.solr.update.processor.UpdateRequestProcessor.finish(UpdateRequestProcessor.java:80)
at org.apache.solr.update.processor.UpdateRequestProcessor.finish(UpdateRequestProcessor.java:80)
at org.apache.solr.update.processor.UpdateRequestProcessor.finish(UpdateRequestProcessor.java:80)
at org.apache.solr.update.processor.UpdateRequestProcessor.finish(UpdateRequestProcessor.java:80)
at org.apache.solr.update.processor.UpdateRequestProcessor.finish(UpdateRequestProcessor.java:80)
at org.apache.solr.update.processor.UpdateRequestProcessor.finish(UpdateRequestProcessor.java:80)
at org.apache.solr.update.processor.UpdateRequestProcessor.finish(UpdateRequestProcessor.java:80)
at org.apache.solr.handler.dataimport.SolrWriter.close(SolrWriter.java:61)
at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:275)
at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:424)
at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:483)
at org.apache.solr.handler.dataimport.DataImporter.lambda$runAsync$0(DataImporter.java:466)
at java.base/java.lang.Thread.run(Thread.java:834)
More logs:
06:12:14.129 WARN (Thread-26) [ ] o.a.s.h.d.SolrWriter Error creating document : SolrInputDocument(fields: [_version_=1656968541868392448, name=Find Me Guilty, id=/en/find_me_guilty]) => org.apache.solr.common.SolrException: version conflict for /en/find_me_guilty expected=1656968541868392448 actual=-1
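The "version conflict ... actual=-1" messages suggest that the _version_ field copied over from the source collection is triggering optimistic-concurrency checks on the destination, which would explain why the import reports both "completed" and "failed" with nothing queryable afterwards. As an illustration only (not an answer from the thread), here is a SolrJ sketch of the same copy that drops _version_ before re-adding; the base URL and collection names are taken from the question, rows=1000 is a placeholder, and a real reindex would page with cursorMark. If the configset has copyField rules, the generated target fields may need skipping as well so they are not written twice.

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrDocument;
import org.apache.solr.common.SolrInputDocument;

// Sketch: copy docs from techPro2 to techproducts with SolrJ, skipping the
// _version_ field so the adds are not treated as optimistic-concurrency updates.
public class Reindex {
    public static void main(String[] args) throws Exception {
        try (SolrClient client =
                new HttpSolrClient.Builder("http://127.0.0.1:8983/solr").build()) {
            SolrQuery query = new SolrQuery("*:*");
            query.setRows(1000); // placeholder; page with cursorMark for real data sets
            for (SolrDocument src : client.query("techPro2", query).getResults()) {
                SolrInputDocument dest = new SolrInputDocument();
                for (String field : src.getFieldNames()) {
                    if (!"_version_".equals(field)) {
                        dest.addField(field, src.getFieldValue(field));
                    }
                }
                client.add("techproducts", dest);
            }
            client.commit("techproducts");
        }
    }
}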

Calling Parse function in Swift fails

I am working on a food delivery app which uses Parse as its backend. I am facing a problem while calling the placeOrder API through:
PFCloud.callFunction(inBackground: PlaceOrder, withParameters: params) { (data, err) in}
Please have a look at the JSON I need to post, below:
{
"source": "card_1EVYuOEynlyM6L4SHgBMJYRQ",
"userId": "YjSZYSXEp7",
"data": {
"menuItems": [{
"id": "QSYa2JDcIm",
"title": "Rice With Tibss(Beef)",
"menuTitle": "Rice With Tibss",
"submenuItem": [{
"id": "zaOo6G4KSV",
"name": "Beef",
"price": 12,
"desc": "Fillings?"
}],
"price": 24,
"qty": 1,
"storeId": "yqBCDmzaDP",
"storeName": "Ibex Ethiopian Cusine and Bar",
"orderType": "takeout",
"taxState": 0.0925,
"storeInfo": {
"cart_storeId": "yqBCDmzaDP",
"cart_storeName": "Ibex Ethiopian Cusine and Bar",
"cart_storeImage": "https://res.cloudinary.com/http-get-tolofood-com/image/upload/c_scale,h_199,q_auto,w_270/v1461575640/Ibex_lopx38.jpg",
"cart_storeCuisine": "Ethiopian",
"cart_storeDescription": "We always serve a quality food. We always serve a quality food. We always serve a quality food. We always serve a quality food.",
"cart_storeRating": 3.33,
"cart_storeDelivery": false,
"takeout": true,
"address": "12255 Greenville Ave,Dallas, TX 75243",
"slugname": "TX_DAL_ibex_ethiopian_cuisine_and_bar",
"multiple_location": false,
"cart_storeDeliveryFee": 15,
"cart_storeServes": "Lunch,Dinner",
"busy": false,
"cart_storeSeoSlug": "ibex-ethiopian-cusine-and-bar"
},
"enable": true,
"voice_read_mi_label": "fbgcb",
"voice_read_mi_option": false,
"menuTypeName": "Standard"
}],
"lastOrderType": "takeout",
"searchedAddress": "takeout",
"timeData": {
"day": "06-05-2019",
"time": "12:55 am",
"tz": "America/Los_Angeles"
}
},
"unavailable_option": "restaurant_recommendation"
}
And below is the Swift code which I have used to build and pass it.
let storeInfo: Dictionary = [CartStoreId: self.cartStoreId, CartStoreName: self.cartRestaurantName, CartStoreImage: self.cartStoreImage, CartStoreCuisine: self.cartStoreCuisine, CartStoreDescription: self.cartStoreDescription, CartStoreRating: self.cartStoreRating, CartStoreDelivery: self.cartStoreDelivery, Takeout: self.takeOut, Address: self.address, Slugname: self.slugName, MultipleLocation: self.multipleLocation, CartStoreDeliveryFee: self.cartStoreDelivery, CartStoreServes: self.cartStoreServes, Busy: self.busy, CartStoreSeoSlug: self.cartStoreSeoSlug] as Dictionary
let subMenuItem = ["id": "zaOo6G4KSV", "name": "Beef", "price": 12, "desc": "Fillings?", "voice_read_submi_label":"bf", "voice_read_submi_option":false, "disabled": false] as [String: Any]
let ordersDictionary = [
"id" : "1234",
"title" : "Test",
"menuTitle" : "MenuName",
"price" : 23,
"qty" : 2,
"storeId" : 23,
"orderType" : "standard",
"taxState" : 0.22,
"enable" : true,
"menuTypeName" : "Type Name",
"voice_read_mi_label":"fdfs",
"voice_read_mi_option":"false",
"submenuitem": subMenuItem,
"storeInfo": storeInfo
] as Dictionary
let timeData = ["day" : 17-06-2019, "time": "11:00 AM", "tz": "America/Los_Angeles"] as Dictionary
let data = ["menuItems": ordersDictionary, "lastOrderType": "takeout", "searchedAddress": "takeout", "timeData" : timeData] as Dictionary
let params = [UserId: self.userId, "source":"card_1EVYuOEynlyM6L4SHgBMJYRQ", "data": data, "unavailable_option":"restaurant_recommendation","_ApplicationId":"6EuadToYoFGJhI1sX8XnuFBz9tp9l3yH6HxzzXZO", "_JavaScriptKey":"rQkALu9saFtF2oq9yCibyw6mEcs3PVqct3uuP6vg", "_ClientVersion":"js1.6.14", "_InstallationId":"444ec64d-5fcc-7b8e-596e-6be627892c2a",
"_SessionToken":"r:c966376120c8eca77aa63c29d5bebe1a"] as Dictionary
After all this is done, I call the Parse function like below:
PFCloud.callFunction(inBackground: PlaceOrder, withParameters: params) { (data, err) in
if err != nil {
print(err!)
} else {
print(data!)
}
}
But this gives me an error after a few seconds saying:
"Error Domain=NSCocoaErrorDomain Code=3840 "JSON text did not start with array or object and option to allow fragments not set." UserInfo={NSDebugDescription=JSON text did not start with array or object and option to allow fragments not set.}"
I have searched the web for the error and made fixes accordingly, but still no success. Please help me, guys.
I noticed that your params var is not compatible with the JSON you sent: there are extra fields and also missing fields. Moreover, menuItems and submenuItem are arrays in your JSON but objects in your code. That is probably making the Cloud Code function fail, so you are not receiving a valid JSON back. Try the following and check if it works. If it does, just replace the values with your vars.
let params = [
"source": "card_1EVYuOEynlyM6L4SHgBMJYRQ",
"userId": "YjSZYSXEp7",
"data": [
"menuItems": [[
"id": "QSYa2JDcIm",
"title": "Rice With Tibss(Beef)",
"menuTitle": "Rice With Tibss",
"submenuItem": [[
"id": "zaOo6G4KSV",
"name": "Beef",
"price": 12,
"desc": "Fillings?"
]],
"price": 24,
"qty": 1,
"storeId": "yqBCDmzaDP",
"storeName": "Ibex Ethiopian Cusine and Bar",
"orderType": "takeout",
"taxState": 0.0925,
"storeInfo": [
"cart_storeId": "yqBCDmzaDP",
"cart_storeName": "Ibex Ethiopian Cusine and Bar",
"cart_storeImage": "https://res.cloudinary.com/http-get-tolofood-com/image/upload/c_scale,h_199,q_auto,w_270/v1461575640/Ibex_lopx38.jpg",
"cart_storeCuisine": "Ethiopian",
"cart_storeDescription": "We always serve a quality food. We always serve a quality food. We always serve a quality food. We always serve a quality food.",
"cart_storeRating": 3.33,
"cart_storeDelivery": false,
"takeout": true,
"address": "12255 Greenville Ave,Dallas, TX 75243",
"slugname": "TX_DAL_ibex_ethiopian_cuisine_and_bar",
"multiple_location": false,
"cart_storeDeliveryFee": 15,
"cart_storeServes": "Lunch,Dinner",
"busy": false,
"cart_storeSeoSlug": "ibex-ethiopian-cusine-and-bar"
],
"enable": true,
"voice_read_mi_label": "fbgcb",
"voice_read_mi_option": false,
"menuTypeName": "Standard"
]],
"lastOrderType": "takeout",
"searchedAddress": "takeout",
"timeData": [
"day": "06-05-2019",
"time": "12:55 am",
"tz": "America/Los_Angeles"
]
],
"unavailable_option": "restaurant_recommendation"
]

Ruby - parse JSON file with nested arrays to ruby hash without data loss

I have a file1.json with a structure like this:
[
{
"uri": "features/hdp.feature",
"id": "as-a-user-i-want-to-use-house-detailed-page",
"keyword": "Feature",
"name": "As a user I want to use house detailed page",
"description": "",
"line": 2,
"tags": [
{
"name": "#hdp",
"line": 1
}
],
"elements": [
{
As you can see, it is an array with nested key:value pairs and other arrays. I need to convert it to a Ruby hash, but when I perform JSON.parse(file1) it creates an array (http://prntscr.com/lqio6r) of Ruby hashes, arrays and so on. If I perform JSON.parse(file1).reduce(Hash.new, :merge) or JSON.parse(file1).reduce(Hash.new, :update) - as one of the answers on Stack Overflow suggested - the resulting hash loses about 60% of the JSON content. Can you please advise how I can convert the JSON file to a Ruby hash (without any data loss)?
UPD - the non-truncated array is here: https://gist.githubusercontent.com/M1khah/3337507e3ca1544e6098bc726bca90cb/raw/c8262ad753bd0eebf1180e111acd016ffc07d1a5/gistfile1.txt
What I want is a hash of hashes - something like this, instead of an array with nested hashes:
{
{
"uri": "features/hdp.feature",
"id": "as-a-user-i-want-to-use-house-detailed-page",
"keyword": "Feature",
"name": "As a user I want to use house detailed page",
"description": "",
"line": 2,
"tags": [
{
"name": "#hdp",
"line": 1
}
],
"elements": [
{
}
