Hi, I am new to the Avro schema space and need to convert a JSON array into an Avro schema.
The JSON below describes a client, with a serviceName and an enabled flag for each service:
If enabled is true, the client has taken that particular service.
If enabled is false, the client has not taken that particular service.
{
"clientName": "Haven",
"serviceDetailsList": [
{
"serviceName": "Service1",
"enabled": true
},
{
"serviceName": "Service2",
"enabled": true
},
{
"serviceName": "Service3",
"enabled": true
},
{
"serviceName": "Service4",
"enabled": false
},
{
"serviceName": "Service5",
"enabled": false
},
{
"serviceName": "Service6",
"enabled": true
}
]
}
I tried the schema below but am not getting a proper result.
"fields":[
{"name": "serviceName", "type": [ "Boolean", "false" ] , "aliases":[
"service1" ]
},
{"name": "serviceName", "type": [ "Boolean", "false" ] , "aliases":[
"service2" ]
}
]
Any help would be appreciated.
Thank you, everyone. I tried again and was able to get the correct schema. The correct Avro schema is:
{
"name": "modelData",
"type": "record",
"namespace": "com.hi.model",
"fields": [
{
"name": "clientName",
"type": "string"
},
{
"name": "serviceDetailsList",
"type": {
"type": "array",
"items": {
"name": "serviceDetailsList_record",
"type": "record",
"fields": [
{
"name": "serviceName",
"type": "string"
},
{
"name": "enabled",
"type": "boolean"
}
]
}
}
}
]
}
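To sanity-check the schema, here is a minimal sketch using the Apache Avro Java library (it assumes the schema above is saved as modelData.avsc on the classpath; the file and class names are just placeholders):
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;
import java.io.ByteArrayOutputStream;
import java.util.List;

public class ModelDataCheck {
    public static void main(String[] args) throws Exception {
        // Parse the record schema shown above
        Schema schema = new Schema.Parser()
                .parse(ModelDataCheck.class.getResourceAsStream("/modelData.avsc"));

        // Schema of a single serviceDetailsList_record element
        Schema itemSchema = schema.getField("serviceDetailsList").schema().getElementType();

        // Build one service entry
        GenericRecord service = new GenericData.Record(itemSchema);
        service.put("serviceName", "Service1");
        service.put("enabled", true);

        // Build the top-level modelData record
        GenericRecord model = new GenericData.Record(schema);
        model.put("clientName", "Haven");
        model.put("serviceDetailsList", List.of(service));

        // Serialize it; a mismatched field or bad schema would throw here
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(schema).write(model, encoder);
        encoder.flush();
        System.out.println("Serialized " + out.size() + " bytes");
    }
}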
After spending many hours reading the documentation, following some tutorials, and trial & error, I just can't figure it out: how can I transform the following complex object with key objects to an array using a data flow in Azure Data Factory?
Input
{
"headers": {
"Content-Length": 1234
},
"body": {
"00b50a39-8591-3db3-88f7-635e2ec5c65a": {
"id": "00b50a39-8591-3db3-88f7-635e2ec5c65a",
"name": "Example 1",
"date": "2023-02-09"
},
"0c206312-2348-391b-99f0-261323a94d95": {
"id": "0c206312-2348-391b-99f0-261323a94d95",
"name": "Example 2",
"date": "2023-02-09"
},
"0c82d1e4-a897-32f2-88db-6830a21b0a43": {
"id": "00b50a39-8591-3db3-88f7-635e2ec5c65a",
"name": "Example 3",
"date": "2023-02-09"
},
}
}
Expected output
[
{
"id": "00b50a39-8591-3db3-88f7-635e2ec5c65a",
"name": "Example 1",
"date": "2023-02-09"
},
{
"id": "0c206312-2348-391b-99f0-261323a94d95",
"name": "Example 2",
"date": "2023-02-09"
},
{
"id": "00b50a39-8591-3db3-88f7-635e2ec5c65a",
"name": "Example 3",
"date": "2023-02-09"
}
]
AFAIK, your JSON keys are dynamic, so getting the desired result using a data flow might not be possible.
In this case, you can try the approach below as a workaround. This will work only if all of your keys have the same length.
This is my Pipeline:
First, I used a Lookup activity to read the JSON file, converted the lookup output to a string, and stored it in a variable with the expression below.
@substring(string(activity('Lookup1').output.value[0].body),2,sub(length(string(activity('Lookup1').output.value[0].body)),4))
Then I split that string variable on '},"' and stored the result in an array variable with the expression below.
@split(variables('res_str'),'},"')
It will give an array like below, where each element starts with the 36-character key (reconstructed from the sample input; the exact text depends on how string() serializes the body).
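00b50a39-8591-3db3-88f7-635e2ec5c65a":{"id":"00b50a39-8591-3db3-88f7-635e2ec5c65a","name":"Example 1","date":"2023-02-09"
0c206312-2348-391b-99f0-261323a94d95":{"id":"0c206312-2348-391b-99f0-261323a94d95","name":"Example 2","date":"2023-02-09"
0c82d1e4-a897-32f2-88db-6830a21b0a43":{"id":"00b50a39-8591-3db3-88f7-635e2ec5c65a","name":"Example 3","date":"2023-02-09"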
Give that array to a ForEach and, inside the ForEach, use an Append variable activity to store the keys in an array with the expression below.
@take(item(), 36)
Now the list of keys is in an array. After the above ForEach, use another ForEach activity to build the desired array of objects; use an Append variable activity inside it and give it the expression below.
@activity('Lookup1').output.value[0].body[item()]
The result array after this ForEach will match the expected output shown in the question.
If you want to store the above JSON array in a file, you need to use OPENJSON from SQL, because a Copy activity additional column only supports the string type, not an array type.
Use a SQL dataset as the Copy activity source and give the below SQL script in the query.
DECLARE @json NVARCHAR(MAX)
SET @json =
N'@{variables('json_arr')}'
SELECT * FROM
OPENJSON ( @json )
WITH (
id varchar(200) '$.id' ,
name varchar(32) '$.name',
date varchar(32) '$.date'
)
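For the sample data, that query would return rows like this (illustration only):
id                                     name        date
00b50a39-8591-3db3-88f7-635e2ec5c65a   Example 1   2023-02-09
0c206312-2348-391b-99f0-261323a94d95   Example 2   2023-02-09
00b50a39-8591-3db3-88f7-635e2ec5c65a   Example 3   2023-02-09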
In Sink, give a JSON dataset and select Array of Objects as File pattern.
Execute the pipeline and you will get the above array inside a file.
This is my Pipeline JSON:
{
"name": "pipeline1",
"properties": {
"activities": [
{
"name": "Lookup1",
"type": "Lookup",
"dependsOn": [],
"policy": {
"timeout": "0.12:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"userProperties": [],
"typeProperties": {
"source": {
"type": "JsonSource",
"storeSettings": {
"type": "AzureBlobFSReadSettings",
"recursive": true,
"enablePartitionDiscovery": false
},
"formatSettings": {
"type": "JsonReadSettings"
}
},
"dataset": {
"referenceName": "Json1",
"type": "DatasetReference"
},
"firstRowOnly": false
}
},
{
"name": "Lookup output to Str",
"description": "",
"type": "SetVariable",
"dependsOn": [
{
"activity": "Lookup1",
"dependencyConditions": [
"Succeeded"
]
}
],
"userProperties": [],
"typeProperties": {
"variableName": "res_str",
"value": {
"value": "#substring(string(activity('Lookup1').output.value[0].body),2,sub(length(string(activity('Lookup1').output.value[0].body)),4))",
"type": "Expression"
}
}
},
{
"name": "Split Str to array",
"type": "SetVariable",
"dependsOn": [
{
"activity": "Lookup output to Str",
"dependencyConditions": [
"Succeeded"
]
}
],
"userProperties": [],
"typeProperties": {
"variableName": "split_arr",
"value": {
"value": "#split(variables('res_str'),'},\"')",
"type": "Expression"
}
}
},
{
"name": "build keys array using split array",
"type": "ForEach",
"dependsOn": [
{
"activity": "Split Str to array",
"dependencyConditions": [
"Succeeded"
]
}
],
"userProperties": [],
"typeProperties": {
"items": {
"value": "#variables('split_arr')",
"type": "Expression"
},
"isSequential": true,
"activities": [
{
"name": "take first 36 chars of every item",
"type": "AppendVariable",
"dependsOn": [],
"userProperties": [],
"typeProperties": {
"variableName": "keys_array",
"value": {
"value": "#take(item(), 36)",
"type": "Expression"
}
}
}
]
}
},
{
"name": "build final array using keys array",
"type": "ForEach",
"dependsOn": [
{
"activity": "build keys array using split array",
"dependencyConditions": [
"Succeeded"
]
}
],
"userProperties": [],
"typeProperties": {
"items": {
"value": "#variables('keys_array')",
"type": "Expression"
},
"isSequential": true,
"activities": [
{
"name": "Append variable1",
"description": "append every object to array",
"type": "AppendVariable",
"dependsOn": [],
"userProperties": [],
"typeProperties": {
"variableName": "json_arr",
"value": {
"value": "#activity('Lookup1').output.value[0].body[item()]",
"type": "Expression"
}
}
}
]
}
},
{
"name": "Just for Res show",
"type": "SetVariable",
"dependsOn": [
{
"activity": "build final array using keys array",
"dependencyConditions": [
"Succeeded"
]
}
],
"userProperties": [],
"typeProperties": {
"variableName": "final_res_show",
"value": {
"value": "#variables('json_arr')",
"type": "Expression"
}
}
},
{
"name": "Copy data1",
"type": "Copy",
"dependsOn": [
{
"activity": "Just for Res show",
"dependencyConditions": [
"Succeeded"
]
}
],
"policy": {
"timeout": "0.12:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"userProperties": [],
"typeProperties": {
"source": {
"type": "AzureSqlSource",
"sqlReaderQuery": "DECLARE #json NVARCHAR(MAX)\nSET #json = \n N'#{variables('json_arr')}' \n \nSELECT * FROM \n OPENJSON ( #json ) \nWITH ( \n id varchar(200) '$.id' , \n name varchar(32) '$.name', \n date varchar(32) '$.date'\n )",
"queryTimeout": "02:00:00",
"partitionOption": "None"
},
"sink": {
"type": "JsonSink",
"storeSettings": {
"type": "AzureBlobFSWriteSettings"
},
"formatSettings": {
"type": "JsonWriteSettings",
"filePattern": "arrayOfObjects"
}
},
"enableStaging": false
},
"inputs": [
{
"referenceName": "AzureSqlTable1",
"type": "DatasetReference"
}
],
"outputs": [
{
"referenceName": "Target_JSON",
"type": "DatasetReference"
}
]
}
],
"variables": {
"res_str": {
"type": "String"
},
"split_arr": {
"type": "Array"
},
"keys_array": {
"type": "Array"
},
"final_res_show": {
"type": "Array"
},
"json_arr": {
"type": "Array"
}
},
"annotations": []
}
}
Result file:
I have an interaction model with a GetMenuIntent which I can invoke with "what's for {meal}". meal is a MealType custom slot with allowed values of "breakfast" and "lunch". I added validation on the meal slot in my GetMenuIntent to only allow those values defined in the slot type and it works great for those configured values.
However, after saving and building my model, when I put "what's for dinner" into the Utterance Profiler or the interactive tester, it ended up calling my FallbackIntent instead of reprompting for a correct value.
I feel like what I'm trying to do isn't really much different than Amazon's own example here.
Here's "whats for lunch" working correctly:
And here's "whats for dinner" ignoring my GetMenuIntent and calling FallbackIntent instead:
Here's my interaction model:
{
"interactionModel": {
"languageModel": {
"invocationName": "school menus",
"intents": [
{
"name": "AMAZON.CancelIntent",
"samples": []
},
{
"name": "AMAZON.HelpIntent",
"samples": []
},
{
"name": "AMAZON.StopIntent",
"samples": []
},
{
"name": "AMAZON.NavigateHomeIntent",
"samples": []
},
{
"name": "GetMenuIntent",
"slots": [
{
"name": "meal",
"type": "Meal"
},
{
"name": "date",
"type": "AMAZON.DATE"
}
],
"samples": [
"whats for {meal} {date}",
"what will you have for {meal} {date}",
"what is on the menu for {meal} {date}",
"what are we having for {meal} {date}",
"what we're having for {meal} {date}"
]
},
{
"name": "AMAZON.FallbackIntent",
"samples": []
}
],
"types": [
{
"values": [
{
"name": {
"value": "lunch"
}
},
{
"name": {
"value": "breakfast"
}
}
],
"name": "Meal"
}
]
},
"dialog": {
"intents": [
{
"name": "GetMenuIntent",
"confirmationRequired": false,
"prompts": {},
"slots": [
{
"name": "meal",
"type": "Meal",
"elicitationRequired": false,
"confirmationRequired": false,
"prompts": {},
"validations": [
{
"type": "hasEntityResolutionMatch",
"prompt": "Slot.Validation.806855880612.19281662909.602239253259"
}
]
},
{
"name": "date",
"type": "AMAZON.DATE",
"elicitationRequired": false,
"confirmationRequired": false,
"prompts": {}
}
]
}
],
"delegationStrategy": "ALWAYS"
},
"prompts": [
{
"id": "Slot.Validation.806855880612.19281662909.602239253259",
"variations": [
{
"type": "PlainText",
"value": "Hmm, I don't know about that menu type. Please try again."
}
]
}
]
},
"version": "48"
}
Since this is 6 months old, I assume you have figured out by now that your interaction model only includes lunch and breakfast as Meal values, so "dinner" doesn't match the slot and the utterance falls through to FallbackIntent.
My issue is basically described in the title, but I'll add a little more detail here.
I'm trying to proactively send an AddOrUpdateReport to the Alexa event gateway for my Smart Home skill; however, this fails with a 400 Bad Request error, and this is what the server responds with:
{
"header": {
"namespace": "System",
"name": "Exception",
"messageId": "a154410c-2364-4c5b-9028-accde5048e1e"
},
"payload": {
"code": "INVALID_REQUEST_EXCEPTION",
"description": "Event or endpoint is missing in the request."
}
}
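For context, I am sending the report as a plain HTTPS POST to the Alexa event gateway, roughly like this (a minimal sketch with java.net.http; the regional endpoint, token, and body here are placeholders, not my real values):
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SendAddOrUpdateReport {
    public static void main(String[] args) throws Exception {
        // Regional gateways: api.amazonalexa.com (NA), api.eu.amazonalexa.com (EU), api.fe.amazonalexa.com (FE)
        String gateway = "https://api.eu.amazonalexa.com/v3/events";
        String accessToken = "<access-token-from-Amazon>";        // placeholder
        String body = "<the AddOrUpdateReport JSON shown below>"; // placeholder

        HttpRequest request = HttpRequest.newBuilder(URI.create(gateway))
                .header("Authorization", "Bearer " + accessToken)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}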
I then decided to check whether my request was wrong by sending the one that can be found in their documentation, and I get the very same error.
This is the JSON copied from Amazon's documentation (I just replaced device metadata and the token):
{
"event": {
"header": {
"namespace": "Alexa.Discovery",
"name": "AddOrUpdateReport",
"payloadVersion": "3",
"messageId": "5f8a426e-01e4-4cc9-8b79-65f8bd0fd8a4"
},
"payload": {
"endpoints": [
{
"endpointId": "<unique ID of the endpoint>",
"manufacturerName": "Sample Manufacturer",
"description": "Smart Light by Sample Manufacturer",
"friendlyName": "Kitchen Light",
"additionalAttributes": {
"manufacturer" : "Sample Manufacturer",
"model" : "Sample Model",
"serialNumber": "<the serial number of the device>",
"firmwareVersion" : "<the firmware version of the device>",
"softwareVersion": "<the software version of the device>",
"customIdentifier": "<your custom identifier for the device>"
},
"displayCategories": [
"LIGHT"
],
"capabilities": [
{
"type": "AlexaInterface",
"interface": "Alexa.PowerController",
"version": "3",
"properties": {
"supported": [
{
"name": "powerState"
}
],
"proactivelyReported": true,
"retrievable": true
}
},
{
"type": "AlexaInterface",
"interface": "Alexa.BrightnessController",
"version": "3",
"properties": {
"supported": [
{
"name": "brightness"
}
],
"proactivelyReported": true,
"retrievable": true
}
}
],
"connections": [
],
"cookie": {
}
}
],
"scope": {
"type": "BearerToken",
"token": "access-token-from-Amazon"
}
}
}
}
What I don't understand is why it reports that event or endpoints are missing if they're clearly there and if this JSON is the one that they provide in their documentation.
Does anyone have this issue too?
EDIT: pasting my payload as requested.
{
"event": {
"header": {
"namespace": "Alexa.Discovery",
"name": "AddOrUpdateReport",
"payloadVersion": "3",
"messageId": "132137185729061389"
},
"payload": {
"endpoints": [
{
"endpointId": "Rmxvb2RTZW5zb3JfMDE=",
"friendlyName": "Flood sensor",
"description": "Flood sensor",
"manufacturerName": "MyCompany",
"displayCategories": [
"ACTIVITY_TRIGGER"
],
"cookie": {
"DeviceType": "FloodSensor"
},
"capabilities": [{
"interface": "Alexa",
"type": "AlexaInterface",
"version": "3"
},
{
"interface": "Alexa.EndpointHealth",
"type": "AlexaInterface",
"version": "3",
"properties": {
"supported": [{
"name": "connectivity"
}],
"proactivelyReported": true,
"retrievable": true
}
}
]
}
],
"scope": {
"type": "BearerToken",
"token": "<redacted>"
}
}
}
}
I can't get device discovery working. My discovery response is below. Alexa keeps saying that it can't find any new devices. Can you help?
This is the data passed from my local server to the skill's Lambda (written in Python) and then returned to Alexa.
{
"event": {
"header": {
"messageId": "810212af-b373-4a23-a976-67c5d79324e4",
"name": "Discover.Response",
"namespace": "Alexa.Discovery",
"payloadVersion": "3"
},
"payload": {
"endpoints": [
{
"capabilities": [
{
"interface": "Alexa.ContactSensor",
"properties": {
"proactivelyReported": true,
"retrievable": false,
"supported": [
{
"name": "detectionState"
}
]
},
"type": "AlexaInterface",
"version": "3"
},
{
"interface": "Alexa.EndpointHealth",
"properties": {
"proactivelyReported": true,
"retrievable": false,
"supported": [
{
"name": "connectivity"
}
]
},
"type": "AlexaInterface",
"version": "3"
}
],
"displayCategories": [
"CONTACT_SENSOR"
],
"endpointID": "523F5AA2-079A-4A9A-94E2-EA4259357F80",
"friendlyName": "HumiditySensor",
"manufacturerName": "THCGuard AirNode",
"description": "AirNode output rule"
}
]
}
}
}
endpointID should be endpointId (lower case 'd')
Here is the sample JSON:
[
{
"index": 0,
"object": {
"uri": "entities/oAFpSUX",
"type": "configuration/entityTypes/Pet",
"createdBy": "abc#xyz.com",
"createdTime": 1531431176965,
"updatedBy": "abc#xyz.com",
"updatedTime": 1531431177691,
"attributes": {
"Weight": [
{
"label": "5 lbs",
"value": {
"PetWeightMeasurement": [
{
"ov": true,
"value": "5",
}
],
"PetWeightUOM": [
{
"ov": true,
"value": "lbs",
"lookupCode": "lbs",
}
]
},
"ov": true,
"uri": "entities/oAFpSUX/attributes/Weight/1AeFvD8Kj"
}
],
"Identifiers": [
{
"label": "5155445576",
"value": {
"Type": [
{
"ov": false,
"value": "CRMO_Pet_Id",
}
],
"ID": [
{
"ov": true,
"value": "5155445576",
}
]
},
"ov": true,
"uri": "entities/oAFpSUX/attributes/Identifiers/1AeFvCrHh"
}
],
"Vaccination": [
{
"label": "Bordatella - 2018-10-26",
"value": {
"Type": [
{
"type": "configuration/entityTypes/Pet/attributes/Vaccination/attributes/Type",
"ov": true,
"value": "Bordatella",
"lookupCode": "4",
"lookupRawValue": "Bordatella",
"lookupAttributes": [
{
"name": "Sort Order",
"value": "3"
}
],
"uri": "entities/oAFpSUX/attributes/Vaccination/1AeFv9yGr/Type/1AeFvA2X7"
}
],
"ExpirationDate": [
{
"type": "configuration/entityTypes/Pet/attributes/Vaccination/attributes/ExpirationDate",
"ov": true,
"value": "2018-10-26",
"uri": "entities/oAFpSUX/attributes/Vaccination/1AeFv9yGr/ExpirationDate/1AeFvA6nN"
}
]
},
"ov": true,
"uri": "entities/oAFpSUX/attributes/Vaccination/1AeFv9yGr"
},
{
"label": "Distemper - 2018-10-25",
"value": {
"Type": [
{
"type": "configuration/entityTypes/Pet/attributes/Vaccination/attributes/Type",
"ov": true,
"value": "Distemper",
"lookupAttributes": [
{
"name": "Sort Order",
"value": "4"
}
],
"uri": "entities/oAFpSUX/attributes/Vaccination/1AeFv9YhJ/Type/1AeFv9cxZ"
}
],....
My question: I was able to get the value of "$..Vaccination..value.Type..value" as "Bordatella", so that works fine. However, what I now want is: if that value is 'Bordatella', extract the "value" under "ExpirationDate". Can someone please help me with how I can extract that "value" under "ExpirationDate"? I am not sure if I need to do that with some custom Groovy code or by using JMeter's If Controller. Any help would be greatly appreciated!
Thanks.
There are several ways to run jq queries from within Java - e.g.
https://github.com/eiiches/jackson-jq
https://github.com/arakelian/java-jq (available from Maven Central)
Java Native Access wrapper https://github.com/bskaggs/jjq . Supported platform is Linux only.
Assuming the sample input has been revised in the obvious way to make it valid JSON, the following jq filter will produce the output as shown:
.[].object.attributes.Vaccination[].value
| select(.Type[].value == "Bordatella")
| .ExpirationDate[].value
Output:
"2018-10-26"
Alternative
Here's a jq filter that is agnostic about the relative location of the "Vaccination" object:
..
| objects
| select(has("Vaccination"))
| .Vaccination[].value?
| select(.Type[].value == "Bordatella")
| .ExpirationDate[].value
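If you would rather stay in plain Java without a jq binding, the same selection can be sketched with Jackson's tree model (not jq; it assumes the response has been made valid JSON as noted above and saved as response.json, which is just a placeholder name):
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.File;

public class VaccinationLookup {
    public static void main(String[] args) throws Exception {
        JsonNode root = new ObjectMapper().readTree(new File("response.json"));

        // The response is an array of { "index": ..., "object": { "attributes": { "Vaccination": [...] } } }
        for (JsonNode entry : root) {
            for (JsonNode vaccination : entry.path("object").path("attributes").path("Vaccination")) {
                JsonNode value = vaccination.path("value");
                String type = value.path("Type").path(0).path("value").asText();
                if ("Bordatella".equals(type)) {
                    // Prints 2018-10-26 for the sample response
                    System.out.println(value.path("ExpirationDate").path(0).path("value").asText());
                }
            }
        }
    }
}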
Given the following JSON response:
[
{
"index": 0,
"object": {
"uri": "entities/oAFpSUX",
"type": "configuration/entityTypes/Pet",
"createdBy": "a...#xyz.com",
"createdTime": 1531431176965,
"updatedBy": "a...#xyz.com",
"updatedTime": 1531431177691,
"attributes": {
"Weight": [
{
"label": "5 lbs",
"value": {
"PetWeightMeasurement": [
{
"ov": true,
"value": "5"
}
],
"PetWeightUOM": [
{
"ov": true,
"value": "lbs",
"lookupCode": "lbs"
}
]
},
"ov": true,
"uri": "entities/oAFpSUX/attributes/Weight/1AeFvD8Kj"
}
],
"Identifiers": [
{
"label": "5155445576",
"value": {
"Type": [
{
"ov": false,
"value": "CRMO_Pet_Id"
}
],
"ID": [
{
"ov": true,
"value": "5155445576"
}
]
},
"ov": true,
"uri": "entities/oAFpSUX/attributes/Identifiers/1AeFvCrHh"
}
],
"Vaccination": [
{
"label": "Bordatella - 2018-10-26",
"value": {
"Type": [
{
"type": "configuration/entityTypes/Pet/attributes/Vaccination/attributes/Type",
"ov": true,
"value": "Bordatella",
"lookupCode": "4",
"lookupRawValue": "Bordatella",
"lookupAttributes": [
{
"name": "Sort Order",
"value": "3"
}
],
"uri": "entities/oAFpSUX/attributes/Vaccination/1AeFv9yGr/Type/1AeFvA2X7"
}
],
"ExpirationDate": [
{
"type": "configuration/entityTypes/Pet/attributes/Vaccination/attributes/ExpirationDate",
"ov": true,
"value": "2018-10-26",
"uri": "entities/oAFpSUX/attributes/Vaccination/1AeFv9yGr/ExpirationDate/1AeFvA6nN"
}
]
},
"ov": true,
"uri": "entities/oAFpSUX/attributes/Vaccination/1AeFv9yGr"
},
{
"label": "Distemper - 2018-10-25",
"value": {
"Type": [
{
"type": "configuration/entityTypes/Pet/attributes/Vaccination/attributes/Type",
"ov": true,
"value": "Distemper",
"lookupAttributes": [
{
"name": "Sort Order",
"value": "4"
}
],
"uri": "entities/oAFpSUX/attributes/Vaccination/1AeFv9YhJ/Type/1AeFv9cxZ"
}
]
}
}
]
}
}
}
]
Add a JSR223 PostProcessor as a child of the request which returns the above response.
Put the following code into the "Script" area:
// Parse the response body and iterate over the Vaccination attribute of the first entity
new groovy.json.JsonSlurper().parse(prev.getResponseData()).get(0).get('object').get('attributes').get('Vaccination').each { vaccination ->
    // Only pick the vaccination whose Type value is "Bordatella"
    if (vaccination.get('value').get('Type').get(0).get('value').equals('Bordatella')) {
        def expirationDate = vaccination.get('value').get('ExpirationDate').get(0).get('value')
        log.info('ExpirationDate: ' + expirationDate)   // write it to jmeter.log
        vars.put('ExpirationDate', expirationDate)      // expose it as ${ExpirationDate}
    }
}
If everything goes well (your JSON response matches the Groovy code) the ExpirationDate will be:
printed to jmeter.log file
stored as ${ExpirationDate} JMeter Variable
Demo:
More information:
Groovy: Parsing and producing JSON
Apache Groovy - Why and How You Should Use It