EDIFACT DELFOR Decoding in Azure Logic Apps - azure-logic-apps

I am moving from X12 to other formats with Logic Apps + an Integration Account, without much EDI or Logic Apps experience.
Here is my DELFOR example:
UNB+UNOA:2+OURCODE:ZZ+THEIRCODE:01+180523:1005+157'
UNG+DELFOR+OURCODE+THEIRCODE+180523:1005+157+UN+D:96A'
UNH+15700001+DELFOR:D:96A:UN'
BGM+241+201805231005-1+5'
DTM+137:20180523:102'
DTM+323:2018052120180820:711'
NAD+MI+OURCODE::92'
NAD+SF+THEIRCODE::92'
NAD+SU+THEIRCODE::92'
UNS+D'
NAD+ST+OURCODE::92'
LIN+++TESTPARTA1:IN'
PIA+1+RECEIVERPARTNO:VP'
RFF+ON:PONUMBER55'
QTY+1:3080:EA'
SCC+4++W'
DTM+2:20180521:102'
QTY+1:0:EA'
SCC+4++W'
DTM+2:20180528:102'
QTY+1:0:EA'
SCC+4++W'
DTM+2:20180604:102'
QTY+1:880:EA'
SCC+4++W'
DTM+2:20180611:102'
QTY+1:2200:EA'
SCC+4++W'
DTM+2:20180618:102'
QTY+1:1760:EA'
SCC+4++W'
DTM+2:20180625:102'
QTY+1:1760:EA'
SCC+4++W'
DTM+2:20180702:102'
QTY+1:1760:EA'
SCC+4++W'
DTM+2:20180709:102'
QTY+1:1760:EA'
SCC+4++W'
DTM+2:20180716:102'
QTY+1:1320:EA'
SCC+4++W'
DTM+2:20180723:102'
QTY+1:1320:EA'
SCC+4++W'
DTM+2:20180730:102'
QTY+1:440:EA'
SCC+4++W'
DTM+2:20180806:102'
QTY+1:440:EA'
SCC+4++W'
DTM+2:20180813:102'
QTY+1:440:EA'
SCC+4++W'
DTM+2:20180820:102'
QTY+3:274560:C62'
SCC+2'
DTM+52:20180604:102'
QTY+3:275440:C62'
SCC+3'
DTM+52:20180611:102'
QTY+3:271480:C62'
DTM+50:20180522:102'
QTY+1:2640:C62'
DTM+50:20180522:102'
RFF+AAK:CG07656'
QTY+79:271480:C62'
DTM+52:20180522:102'
UNS+S'
UNT+69+15700001'
UNE+1+157'
UNZ+1+157'
I am using Liaison EDI Notepad for general validation; it reports no issues, and I find it a very reliable tool. I am open to other validation suggestions.
Now, for the Decode EDIFACT action in Logic Apps, which Microsoft schema do I use? I have selected EFACT_D96A_DELFOR.xsd from MicrosoftEdiXSDTemplates.zip, which is also available on GitHub. Is it the correct one for my sample? When I run the action, the raw input is:
{
"host": {
"connection": {
"name": "/subscriptions/..."
}
},
"method": "post",
"path": "/decode",
"queries": {
"componentSeparator": "58",
"dataElementSeparator": "43",
"decimalIndicator": "Comma",
"releaseIndicator": "63",
"repetitionSeparator": "42",
"segmentTerminator": "39",
"segmentTerminatorSuffix": "None"
},
"body": {
"$content-type": "application/octet-stream",
"$content": "VU5CK1VOT0E6MitPVVJDT0RFOlpaK1RIRUlSQ09ERTowMSsxODA1MjM6MTAwNSsxNTcnDQpVTkcrREVMRk9SK09VUkNPREUrVEhFSVJDT0RFKzE4MDUyMzoxMDA1KzE1NytVTitEOjk2QScNClVOSCsxNTcwMDAwMStERUxGT1I6RDo5NkE6VU4nDQpCR00rMjQxKzIwMTgwNTIzMTAwNS0xKzUnDQpEVE0rMTM3OjIwMTgwNTIzOjEwMicNCkRUTSszMjM6MjAxODA1MjEyMDE4MDgyMDo3MTEnDQpOQUQrTUkrT1VSQ09ERTo6OTInDQpOQUQrU0YrVEhFSVJDT0RFOjo5MicNCk5BRCtTVStUSEVJUkNPREU6OjkyJw0KVU5TK0QnDQpOQUQrU1QrT1VSQ09ERTo6OTInDQpMSU4rKytURVNUUEFSVEExOklOJw0KUElBKzErUkVDRUlWRVJQQVJUTk86VlAnDQpSRkYrT046UE9OVU1CRVI1NScNClFUWSsxOjMwODA6RUEnDQpTQ0MrNCsrVycNCkRUTSsyOjIwMTgwNTIxOjEwMicNClFUWSsxOjA6RUEnDQpTQ0MrNCsrVycNCkRUTSsyOjIwMTgwNTI4OjEwMicNClFUWSsxOjA6RUEnDQpTQ0MrNCsrVycNCkRUTSsyOjIwMTgwNjA0OjEwMicNClFUWSsxOjg4MDpFQScNClNDQys0KytXJw0KRFRNKzI6MjAxODA2MTE6MTAyJw0KUVRZKzE6MjIwMDpFQScNClNDQys0KytXJw0KRFRNKzI6MjAxODA2MTg6MTAyJw0KUVRZKzE6MTc2MDpFQScNClNDQys0KytXJw0KRFRNKzI6MjAxODA2MjU6MTAyJw0KUVRZKzE6MTc2MDpFQScNClNDQys0KytXJw0KRFRNKzI6MjAxODA3MDI6MTAyJw0KUVRZKzE6MTc2MDpFQScNClNDQys0KytXJw0KRFRNKzI6MjAxODA3MDk6MTAyJw0KUVRZKzE6MTc2MDpFQScNClNDQys0KytXJw0KRFRNKzI6MjAxODA3MTY6MTAyJw0KUVRZKzE6MTMyMDpFQScNClNDQys0KytXJw0KRFRNKzI6MjAxODA3MjM6MTAyJw0KUVRZKzE6MTMyMDpFQScNClNDQys0KytXJw0KRFRNKzI6MjAxODA3MzA6MTAyJw0KUVRZKzE6NDQwOkVBJw0KU0NDKzQrK1cnDQpEVE0rMjoyMDE4MDgwNjoxMDInDQpRVFkrMTo0NDA6RUEnDQpTQ0MrNCsrVycNCkRUTSsyOjIwMTgwODEzOjEwMicNClFUWSsxOjQ0MDpFQScNClNDQys0KytXJw0KRFRNKzI6MjAxODA4MjA6MTAyJw0KUVRZKzM6Mjc0NTYwOkM2MicNClNDQysyJw0KRFRNKzUyOjIwMTgwNjA0OjEwMicNClFUWSszOjI3NTQ0MDpDNjInDQpTQ0MrMycNCkRUTSs1MjoyMDE4MDYxMToxMDInDQpRVFkrMzoyNzE0ODA6QzYyJw0KRFRNKzUwOjIwMTgwNTIyOjEwMicNClFUWSsxOjI2NDA6QzYyJw0KRFRNKzUwOjIwMTgwNTIyOjEwMicNClJGRitBQUs6Q0cwNzY1NicNClFUWSs3OToyNzE0ODA6QzYyJw0KRFRNKzUyOjIwMTgwNTIyOjEwMicNClVOUytTJw0KVU5UKzY5KzE1NzAwMDAxJw0KVU5FKzErMTU3Jw0KVU5aKzErMTU3Jw=="
}
}
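Worth noting: the separator values in the queries block are decimal character codes, not literal characters. A quick sketch (my own check, not part of the connector) to confirm they match the delimiters in the sample message:

```python
# The Decode action's separator settings are decimal character codes.
# Mapping them back to characters shows they match the sample message.
separators = {
    "componentSeparator": 58,    # ':'
    "dataElementSeparator": 43,  # '+'
    "releaseIndicator": 63,      # '?'
    "repetitionSeparator": 42,   # '*'
    "segmentTerminator": 39,     # "'"
}
decoded = {name: chr(code) for name, code in separators.items()}
print(decoded)
```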
and the output includes an error that makes me think I am not validating correctly (wrong XSD?):
errorMessage:
Error: 1 (Field level error)
SegmentID: UNB
Position in TS: 1
Data Element ID: UNB3.2
Position in Segment: 4
Position in Field: 2
Data Value: 01
12: Invalid value in data element
I take it this is referring to the 01 leading off 01+180523:1005+157 in the first segment (line), next to the datetime 5/23/2018 10:05. Actually, the 01 refers to my Guest Identity. What is the issue being pointed out here?
[Screenshot of the relevant IA Agreement]
I am looking for advice on troubleshooting this, including how to match a given DELFOR (or other EDIFACT message) to the correct Microsoft XSD in general.
After I get through this, I plan to tackle VDA next. Is there anything about VDA you recommend I keep in mind?
Thank you
Note: 01 is not a problem for the Decode action and is used in the TI Automotive specs on page 29 of link. However, I changed it to ZZ, and this did move on to another error. The output is:
"badMessages": [
{
"UNB": {
"UNB_Segment": "UNB+UNOA:2+OURCODE:ZZ+THEIRCODE:ZZ+180523:1005+157'\r\n################",
"UNB2.1": "OURCODE",
"UNB2.2": "ZZ",
"UNB3.1": "THEIRCODE",
"UNB3.2": "ZZ"
},
"UNG": {
"UNG_Segment": "UNG+DELFOR+OURCODE+THEIRCODE+180523:1005+157+UN+D:96A'\r\n",
"UNG1": "DELFOR",
"UNG2.1": "OURCODE",
"UNG3.1": "THEIRCODE",
"UNG4.1": "180523",
"UNG4.2": "1005",
"UNG5": "157",
"UNG6": "UN",
"UNG7.1": "D",
"UNG7.2": "96A"
},
"UNH": {
"UNH1": "15700001",
"UNH2.1": "DELFOR",
"UNH2.2": "D",
"UNH2.3": "96A",
"UNH2.4": "UN"
}, ...
The error with ZZ is
Error encountered during parsing. The Edifact transaction set with id '15700001' contained in functional group with id '157',
in interchange with id '157', with sender id 'OURCODE', receiver id 'THEIRCODE' is being suspended with following errors:\r\n
Error: 1 (Miscellaneous error)\r\n\t70: \r\n\r\n
Error: 2 (Miscellaneous error)\r\n\t71: Transaction Set or Group Control Number Mismatch\r\n\r\n
Error: 3 (Miscellaneous error)\r\n\t29: Invalid count specified at interchange, group, or message levels\r\n\r\n",
Note that the UNT segment has the correct segment count and control number: UNT+69+15700001'
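That count can be double-checked independently with a small sketch (my own helper, not a Logic Apps API; it assumes ' terminates segments and ignores release characters):

```python
# Count segments from UNH through UNT (both inclusive, per EDIFACT rules)
# and compare against the count declared in the UNT segment itself.
def check_unt(message: str):
    segments = [s for s in message.replace("\r\n", "").split("'") if s]
    start = next(i for i, s in enumerate(segments) if s.startswith("UNH"))
    end = next(i for i, s in enumerate(segments) if s.startswith("UNT"))
    actual = end - start + 1                     # segments in the message
    declared = int(segments[end].split("+")[1])  # count stated in UNT
    return actual, declared

# For the sample above this should return (69, 69), matching UNT+69+15700001'.
```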
In the output, the payload reports an unrecognized schema, including the correctly parsed UNH segment:
<UnrecognizedSchema>
<UnrecognizedSegment>
UNH+15700001+DELFOR:D:96A:UN
</UnrecognizedSegment>
<UnrecognizedSegment>
BGM+241+201805231005-1+5
</UnrecognizedSegment>
<UnrecognizedSegment>
DTM+137:20180523:102
</UnrecognizedSegment>
<UnrecognizedSegment>
DTM+323:2018052120180820:711
</UnrecognizedSegment>
<UnrecognizedSegment>
NAD+MI+OURCODE::92
</UnrecognizedSegment>
<UnrecognizedSegment>
NAD+SF+THEIRCODE::92
</UnrecognizedSegment>
<UnrecognizedSegment>
NAD+SU+THEIRCODE::92
</UnrecognizedSegment>
<UnrecognizedSegment>
UNS+D
</UnrecognizedSegment>
<UnrecognizedSegment>
NAD+ST+OURCODE::92
</UnrecognizedSegment>
<UnrecognizedSegment>
LIN+++TESTPARTA1:IN
</UnrecognizedSegment>
<UnrecognizedSegment>
PIA+1+RECEIVERPARTNO:VP
</UnrecognizedSegment>
<UnrecognizedSegment>
RFF+ON:PONUMBER55
</UnrecognizedSegment>
<UnrecognizedSegment>
QTY+1:3080:EA
</UnrecognizedSegment>
<UnrecognizedSegment>
SCC+4++W
</UnrecognizedSegment>
<UnrecognizedSegment>
DTM+2:20180521:102
</UnrecognizedSegment>
<UnrecognizedSegment>
QTY+1:0:EA
</UnrecognizedSegment>
<UnrecognizedSegment>
SCC+4++W
</UnrecognizedSegment>
<UnrecognizedSegment>
DTM+2:20180528:102
</UnrecognizedSegment>
<UnrecognizedSegment>
QTY+1:0:EA
</UnrecognizedSegment>
<UnrecognizedSegment>
SCC+4++W
</UnrecognizedSegment>
<UnrecognizedSegment>
DTM+2:20180604:102
</UnrecognizedSegment>
<UnrecognizedSegment>
QTY+1:880:EA
</UnrecognizedSegment>
<UnrecognizedSegment>
SCC+4++W
</UnrecognizedSegment>
<UnrecognizedSegment>
DTM+2:20180611:102
</UnrecognizedSegment>
<UnrecognizedSegment>
QTY+1:2200:EA
</UnrecognizedSegment>
<UnrecognizedSegment>
SCC+4++W
</UnrecognizedSegment>
<UnrecognizedSegment>
DTM+2:20180618:102
</UnrecognizedSegment>
<UnrecognizedSegment>
QTY+1:1760:EA
</UnrecognizedSegment>
<UnrecognizedSegment>
SCC+4++W
</UnrecognizedSegment>
<UnrecognizedSegment>
DTM+2:20180625:102
</UnrecognizedSegment>
<UnrecognizedSegment>
QTY+1:1760:EA
</UnrecognizedSegment>
<UnrecognizedSegment>
SCC+4++W
</UnrecognizedSegment>
<UnrecognizedSegment>
DTM+2:20180702:102
</UnrecognizedSegment>
<UnrecognizedSegment>
QTY+1:1760:EA
</UnrecognizedSegment>
<UnrecognizedSegment>
SCC+4++W
</UnrecognizedSegment>
<UnrecognizedSegment>
DTM+2:20180709:102
</UnrecognizedSegment>
<UnrecognizedSegment>
QTY+1:1760:EA
</UnrecognizedSegment>
<UnrecognizedSegment>
SCC+4++W
</UnrecognizedSegment>
<UnrecognizedSegment>
DTM+2:20180716:102
</UnrecognizedSegment>
<UnrecognizedSegment>
QTY+1:1320:EA
</UnrecognizedSegment>
<UnrecognizedSegment>
SCC+4++W
</UnrecognizedSegment>
<UnrecognizedSegment>
DTM+2:20180723:102
</UnrecognizedSegment>
<UnrecognizedSegment>
QTY+1:1320:EA
</UnrecognizedSegment>
<UnrecognizedSegment>
SCC+4++W
</UnrecognizedSegment>
<UnrecognizedSegment>
DTM+2:20180730:102
</UnrecognizedSegment>
<UnrecognizedSegment>
QTY+1:440:EA
</UnrecognizedSegment>
<UnrecognizedSegment>
SCC+4++W
</UnrecognizedSegment>
<UnrecognizedSegment>
DTM+2:20180806:102
</UnrecognizedSegment>
<UnrecognizedSegment>
QTY+1:440:EA
</UnrecognizedSegment>
<UnrecognizedSegment>
SCC+4++W
</UnrecognizedSegment>
<UnrecognizedSegment>
DTM+2:20180813:102
</UnrecognizedSegment>
<UnrecognizedSegment>
QTY+1:440:EA
</UnrecognizedSegment>
<UnrecognizedSegment>
SCC+4++W
</UnrecognizedSegment>
<UnrecognizedSegment>
DTM+2:20180820:102
</UnrecognizedSegment>
<UnrecognizedSegment>
QTY+3:274560:C62
</UnrecognizedSegment>
<UnrecognizedSegment>
SCC+2
</UnrecognizedSegment>
<UnrecognizedSegment>
DTM+52:20180604:102
</UnrecognizedSegment>
<UnrecognizedSegment>
QTY+3:275440:C62
</UnrecognizedSegment>
<UnrecognizedSegment>
SCC+3
</UnrecognizedSegment>
<UnrecognizedSegment>
DTM+52:20180611:102
</UnrecognizedSegment>
<UnrecognizedSegment>
QTY+3:271480:C62
</UnrecognizedSegment>
<UnrecognizedSegment>
DTM+50:20180522:102
</UnrecognizedSegment>
<UnrecognizedSegment>
QTY+1:2640:C62
</UnrecognizedSegment>
<UnrecognizedSegment>
DTM+50:20180522:102
</UnrecognizedSegment>
<UnrecognizedSegment>
RFF+AAK:CG07656
</UnrecognizedSegment>
<UnrecognizedSegment>
QTY+79:271480:C62
</UnrecognizedSegment>
<UnrecognizedSegment>
DTM+52:20180522:102
</UnrecognizedSegment>
<UnrecognizedSegment>
UNS+S
</UnrecognizedSegment>
<UnrecognizedSegment>
UNT+42+15700001
</UnrecognizedSegment>

Thanks to everyone who commented.
In the end, changing the content was not an option, as we are processing EDI for various customers, not generating it. I opened a support request, Microsoft changed something on its end, and I tested it today, confirming this is no longer an issue, at least in the East US Azure region where our code runs.

Related

AWS Amplify - GraphQL + DataStore: Variable 'input' has coerced Null value for NonNull type String

I am running into a problem with the API generated by AWS Amplify.
Basically, I keep getting the following warning whenever I try to create an entity, and it is not persisted in DynamoDB:
Variable 'input' has coerced Null value for NonNull type 'String!'
The following are the pertinent parts of the GraphQL schema I used to create the backend.
enum EntityStatus {
ACTIVE
INACTIVE
ARCHIVED
}
type Address {
streetAddress1: String!
streetAddress2: String
city: String!
state: String!
zipCode: String!
country: String!
location: Location!
}
type Location {
lat: Float
lng: Float
}
type Tenant
@model
@auth(
rules: [
{ allow: groups, groups: ["Admin", "Coordinator", "Employees"], operations }
{ allow: groups, groups: ["Auditor"], operations: [read] }
]
) {
id: ID!
name: String!
address: Address!
phone: AWSPhone!
email: AWSEmail!
status: EntityStatus!
locale: String!
}
The code to create one of the Tenant entities is a simple call
try {
return await DataStore.save(new Tenant({ ...values }));
} catch (error) {
console.error(error);
}
The payload sent by DataStore is as follows:
{
"name": "Tenant 1",
"phone": "1234567890",
"email": "tenant@tenant.com",
"status": "ACTIVE",
"address": {
"city": "Anytown",
"state": "TAB",
"zipCode": "12345",
"country": "US",
"location": { "lat": 123.12, "lng": 123.12 }
},
"locale": "en-US",
"id": "f8be53bd-b1cb-4cbd-9b64-01fdf930da8a"
}
Here is the full Warning message
[WARN] 40:26.787 DataStore
Object { localModel: {…}, message: "Variable 'input' has coerced Null value for NonNull type 'String!'", operation: "Create", errorType: undefined, errorInfo: undefined, remoteModel: null }
errorInfo: undefined
errorType: undefined
localModel: Object { id: "f8be53bd-b1cb-4cbd-9b64-01fdf930da8a", name: "Tenant 1", phone: "1234567890", … }
_deleted: undefined
_lastChangedAt: undefined
_version: undefined
address: Object { city: "Anytown", state: "TAB", zipCode: "12345", … }
createdAt: undefined
email: "tenant@tenant.com"
id: "f8be53bd-b1cb-4cbd-9b64-01fdf930da8a"
locale: "en-US"
name: "Tenant 1"
phone: "1234567890"
status: "ACTIVE"
updatedAt: undefined
<prototype>: Object { … }
message: "Variable 'input' has coerced Null value for NonNull type 'String!'"
operation: "Create"
remoteModel: null
<prototype>: Object { … }
react_devtools_backend.js:3973:25
Figured it out: my payload was missing two fields.
I wish the error messages were more helpful.
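Not a fix for the vague error message itself, but a generic pre-save check of this kind would have named the missing fields up front. The required-field list mirrors the non-null fields in the schema above; `missing_required` is my own helper, not an Amplify API:

```python
# Compare a payload against the schema's non-null (required) fields so any
# missing or empty one is reported by name before calling DataStore.save.
REQUIRED_FIELDS = ["id", "name", "address", "phone", "email", "status", "locale"]

def missing_required(payload: dict) -> list:
    return [f for f in REQUIRED_FIELDS if payload.get(f) in (None, "")]

payload = {"id": "f8be53bd", "name": "Tenant 1", "status": "ACTIVE"}
print(missing_required(payload))  # -> ['address', 'phone', 'email', 'locale']
```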

How to place a WooCommerce order with meta_data using axios

I am creating a restaurant delivery React Native app using the WooCommerce API, and I am trying to place an order through the WooCommerce API using axios.
On the website, I can select a product and its optional dishes and place the order, and it is created successfully.
But in the app, I send the order like this:
{
payment_method: "cod",
payment_method_title: "Cash On Delivery",
set_paid: true,
billing: {
first_name: "John",
last_name: "Doe",
address_1: "969 Market",
address_2: "",
city: "San Francisco",
state: "CA",
postcode: "94103",
country: "US",
email: "john.doe@example.com",
phone: "(555) 555-5555"
},
shipping: {
first_name: "John",
last_name: "Doe",
address_1: "969 Market",
address_2: "",
city: "San Francisco",
state: "CA",
postcode: "94103",
country: "US"
},
line_items: [
{
product_id: 1479,
quantity: 2
},
],
shipping_lines: [
{
method_id: "flat_rate",
method_title: "Flat Rate",
total: "10.00"
}
]
};
It creates the order successfully, but it only shows the side items without calculating the price; that is, only the base price is shown on the order, without the additional products added.
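For what it's worth, the WooCommerce REST API lets each entry in line_items carry a meta_data array of key/value pairs, and add-on/option selections are usually passed that way. A sketch of the shape (the meta keys here are illustrative; the actual keys depend on the add-ons plugin in use):

```python
# Line item carrying option selections as meta_data. WooCommerce's REST API
# accepts meta_data as [{"key": ..., "value": ...}] on each line item; the
# exact keys an add-ons plugin expects will vary per plugin.
line_item = {
    "product_id": 1479,
    "quantity": 2,
    "meta_data": [
        {"key": "optional_dish", "value": "Extra Sauce"},  # illustrative key
        {"key": "optional_dish_price", "value": "2.50"},   # illustrative key
    ],
}
```

Note that meta_data is stored but does not by itself change the price; in practice the computed line totals often need to be sent explicitly via the item's total/subtotal fields, since the API will not recalculate plugin add-on pricing.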

Yup validation for nested object returns error (Objects are not valid as a React child)

These are my Formik form values that I want to validate with yup:
{
"images": [],
"fundingGoal": 1337,
"name": {
"en": "english name",
"fr": "french name"
},
"description": {
"en": "english project desc",
"fr": "french project desc"
},
"country": "Anguilla",
"city": "city",
"address": "address",
"location": "",
"socialMedia": "facebook",
"contact": "contact",
"currency": "USD",
"sdgs": [
"NO_POVERTY"
]
}
My current yup schema:
const ProjectInputSchema = yup.object().shape({
id: yup.string(),
name: yup.object().shape({
en: yup.string().required(),
fr: yup.string()
}),
description: yup.object(),
images: yup.array(),
...
});
How I render my form with the name object:
<Form.Group as={Col} md={{span: 5}} controlId="projectName">
<Form.Label>
{t('projectName')}
</Form.Label>
{
<Form.Control
type="text"
name="name.en"
value={(values['name'] as I18n).en}
onChange={handleChange}
/>
}
...
As soon as I type a character into name or description, I get the following console error:
Uncaught (in promise) Error: Objects are not valid as a React child (found: object with keys {fr}). If you meant to render a collection of children, use an array instead.
I followed the example implementation (https://codesandbox.io/s/y7q2v45xqx) for nested objects, but I do not see a difference.
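A likely cause of that message is rendering the nested error entry itself: with a nested schema, errors.name is an object ({en: ..., fr: ...}) rather than a string, and React cannot render an object as a child. An illustrative sketch of the difference in shape (Python dicts standing in for the JS error objects):

```python
# With a flat schema, an error entry is a string -> renderable directly.
flat_errors = {"city": "city is a required field"}

# With a nested schema, the entry for "name" is itself an object, so
# rendering {errors.name} (instead of e.g. {errors.name.en}) throws
# "Objects are not valid as a React child".
nested_errors = {"name": {"en": "name.en is a required field"}}

assert isinstance(flat_errors["city"], str)
assert isinstance(nested_errors["name"], dict)
```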

Query for JSON[] type in postgres

I have a table in Postgres that includes a column with the json[] datatype.
The structure of the data stored in this column is like this:
[{
"sym": "BTC",
"enn": "Bitcoin",
"fan": "",
"prc": 7284.46,
"c24": -4.33,
"mkc": 124460367747.02,
"mkp": 0
}, {
"sym": "ETH",
"enn": "Ethereum",
"fan": "",
"prc": 571.735,
"c24": -5.23,
"mkc": 57166582578.235,
"mkp": 0
}, {
"sym": "XRP",
"enn": "Ripple",
"fan": "",
"prc": 0.625291,
"c24": -6.28,
"mkc": 24539115471.842476,
"mkp": 0
}, {
"sym": "BCH",
"enn": "Bitcoin Cash",
"fan": "",
"prc": 1034.65,
"c24": -7.09,
"mkc": 17771148400,
"mkp": 0
}, {
"sym": "EOS",
"enn": "EOS",
"fan": "",
"prc": 13.2186,
"c24": -7.95,
"mkc": 11845841674.9512,
"mkp": 0
}]
What I need is to fetch the JSON object with a specific "sym" key, like this:
{
"sym": "BTC",
"enn": "Bitcoin",
"fan": "",
"prc": 7284.46,
"c24": -4.33,
"mkc": 124460367747.02,
"mkp": 0
}
I tried this:
select to_json(data)::json->'sym'->'BTC' from my_table;
but it's not working. I know it isn't working because my field is an array, not json, so I also tried:
select json_array_elements(to_json(data)::json->'sym'->'BTC') from my_table;
but that does not work either.
Any help?
Have you tried it with a CTE? There are certainly a dozen other ways to do the same thing, but I find CTEs quite elegant and easy to read.
WITH j AS (
SELECT json_array_elements('[{"sym":"BTC","enn":"Bitcoin","fan":"","prc":7284.46,"c24":-4.33,"mkc":124460367747.02,"mkp":0},{"sym":"ETH","enn":"Ethereum","fan":"","prc":571.735,"c24":-5.23,"mkc":57166582578.235,"mkp":0},{"sym":"XRP","enn":"Ripple","fan":"","prc":0.625291,"c24":-6.28,"mkc":24539115471.842476,"mkp":0},{"sym":"BCH","enn":"Bitcoin Cash","fan":"","prc":1034.65,"c24":-7.09,"mkc":17771148400,"mkp":0},{"sym":"EOS","enn":"EOS","fan":"","prc":13.2186,"c24":-7.95,"mkc":11845841674.9512,"mkp":0}]'::json) AS sym
)
SELECT *
FROM j
WHERE j.sym->>'sym'= 'BTC';
sym
------------------------------------------------------------------------------------------------
{"sym":"BTC","enn":"Bitcoin","fan":"","prc":7284.46,"c24":-4.33,"mkc":124460367747.02,"mkp":0}
(1 row)
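The filter the `->>` expression performs after json_array_elements unnests the array is just a string comparison on one key of each element. A sketch to make the semantics concrete, outside SQL:

```python
import json

# Two of the sample elements, abbreviated for readability.
rows = json.loads('[{"sym": "BTC", "prc": 7284.46}, {"sym": "ETH", "prc": 571.735}]')

# Equivalent of:  WHERE j.sym->>'sym' = 'BTC'  after json_array_elements
btc = [row for row in rows if row.get("sym") == "BTC"]
print(btc)  # -> [{'sym': 'BTC', 'prc': 7284.46}]
```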

Solr schema with nested object

I am using Solr version 5.5 on a single instance. I am trying to index the data below as one record:
{
"MLId": "00021BF6-BCC7-4F2E-8B8F-02587310A1B4",
"PublishDate": "2015-06-03",
"CompanyName": "GLI Finance Limited",
"Ticker": "GLI",
"Primary": "1",
"Exchange": "Channel Islands Securities Exchange",
"Line1": "Sarnia House",
"Line2": "Le Truchot",
"Line3": "St Peter Port",
"Line4": "Guernsey GY1 4NA",
"Line5": "Channel Islands",
"Country": "GBR",
"Phone": "",
"WebAddress": "http://www.glifund.com",
"NoOfEmployees": "",
"Turnover": "580000",
"TurnoverUSD": "992600.0000",
"FinancialYearEnd": "--12--",
"overView": "GLI Finance is a closed-ended investment company. It invests in senior secured loans and syndicated corporate loans issued primarily by middle market US companies. Its portfolio investment is managed by T2 Advisers. The company operates in the Channel Islands, the UK and the Cayman Islands. It is headquartered in St. Peter Port, Guernsey.|The company recorded revenues of £584.4 thousand (approximately $963.1 thousand) in the fiscal year ended December 2014. Its net loss was £13,626.4 thousand (approximately $22,457.6 thousand) in fiscal 2014.|",
"MajorProductsServices": "GLI Finance is a closed-ended investment company. The company's key activities include the following: Activities: Invests in senior secured loans and syndicated corporate loans issued primarily by middle market US companies",
"KeyEmployeesCount": "8",
"_childDocuments_": [
{
"FullName": "Geoffrey Richard Miller",
"JobTitle": "Chief Executive Officer and Executive Director",
"Board": "Executive Board"
},
{
"FullName": "Emma Stubbs",
"JobTitle": "Chief Financial Officer",
"Board": "Executive Board"
},
{
"FullName": "Patrick Anthony Seymour Firth",
"JobTitle": "Chairman",
"Board": "Non Executive Board"
},
{
"FullName": "Frederick Peter Forni",
"JobTitle": "Non-Executive Director",
"Board": "Non Executive Board"
},
{
"FullName": "James Henry Carthew",
"JobTitle": "Non-Executive Director",
"Board": "Non Executive Board"
},
{
"FullName": "Marc Krombach",
"JobTitle": "Managing Director",
"Board": "Senior Management"
},
{
"FullName": "Andrew Whelan",
"JobTitle": "Director, Lending",
"Board": "Senior Management"
},
{
"FullName": "Louise Beaumont",
"JobTitle": "Head, Public Affairs and Marketing",
"Board": "Senior Management"
}
],
"LocationsSubsidiariesCount": "5",
"Subsidiary": [
{
"SubsidiaryName": "GLIF BMS Holdings Limited",
"SubsidiaryAddressLine1": "",
"SubsidiaryAddressLine2": "",
"SubsidiaryAddressLine3": "",
"SubsidiaryAddressLine4": "",
"SubsidiaryAddressLine5": "",
"SubsidiaryAddressCountry": "GBR"
},
{
"SubsidiaryName": "Secured Loan Investments Limited",
"SubsidiaryAddressLine1": "",
"SubsidiaryAddressLine2": "",
"SubsidiaryAddressLine3": "",
"SubsidiaryAddressLine4": "",
"SubsidiaryAddressLine5": "Guernsey",
"SubsidiaryAddressCountry": "GBR"
},
{
"SubsidiaryName": "BMS Finance AB Limited",
"SubsidiaryAddressLine1": "",
"SubsidiaryAddressLine2": "",
"SubsidiaryAddressLine3": "",
"SubsidiaryAddressLine4": "",
"SubsidiaryAddressLine5": "",
"SubsidiaryAddressCountry": "GBR"
},
{
"SubsidiaryName": "NVF Tech Limited",
"SubsidiaryAddressLine1": "",
"SubsidiaryAddressLine2": "",
"SubsidiaryAddressLine3": "",
"SubsidiaryAddressLine4": "",
"SubsidiaryAddressLine5": "",
"SubsidiaryAddressCountry": "GBR"
},
{
"SubsidiaryName": "GLI Investments Holdings Sarl",
"SubsidiaryAddressLine1": "",
"SubsidiaryAddressLine2": "",
"SubsidiaryAddressLine3": "",
"SubsidiaryAddressLine4": "",
"SubsidiaryAddressLine5": "",
"SubsidiaryAddressCountry": "LUX"
}
]
}
I am getting an "Unknown command 'MLId'" error.
Could you please help me create the index?
Thanks,
Srilu
Short answer: put [] around the whole structure and it should work.
Long answer: Solr accepts JSON in three different forms:
1. A single arbitrary JSON document that Solr smart-maps to internal fields, creating them as needed (the schemaless approach). The request handler for this is "/update/json/docs". This works, but does not support child documents, and it maps all new fields to multiValued types.
2. A single JSON document that Solr expects to be a sequence of commands, such as add, delete, and commit. This is a Solr-specific format and does support child documents. The request handler for this is "/update".
3. A multi-document shortcut of form 2, where only the documents are provided, in an array, including child-document support.
You are hitting case 2 here, so Solr complains that the first thing it sees is not one of the known commands. By putting your object into an array, you switch to form 3 and it should work.
P.S. Your specific example also has other arrays of nested objects (specifically "Subsidiary") which will probably break indexing, but that is a separate problem/question.
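A minimal sketch of the suggested fix: serialize the document inside a JSON array before posting it to the "/update" handler (the endpoint URL in the comment is illustrative):

```python
import json

def wrap_for_update(doc: dict) -> str:
    # "/update" treats a bare JSON object as a command sequence; an array
    # selects the multi-document form instead, which supports _childDocuments_.
    return json.dumps([doc])

body = wrap_for_update({
    "MLId": "00021BF6-BCC7-4F2E-8B8F-02587310A1B4",
    "_childDocuments_": [{"FullName": "Geoffrey Richard Miller"}],
})
# POST body with Content-Type: application/json to e.g.
# http://localhost:8983/solr/<core>/update?commit=true
```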
