I'm trying to set up a Microsoft Flow. In short, I need to take JSON data retrieved from a device and parse it so that I can reference it in the flow steps below. In order to parse it, I need to provide the JSON schema to Flow. Microsoft Flow has an option to generate the schema from a sample payload (the results returned from the API call), but it's not generating it correctly. I'm hoping someone can help me; I need the correct JSON schema.
The data returned from the API:
[
null,
[
{
"user_id": 2003,
"user_label": "Test1"
},
{
"user_id": 2004,
"user_label": "Test2"
}
]
]
Schema generated in Flow from the above sample payload:
{
"type": "array",
"items": {}
}
I then tried to generate the schema from just the data. That seemed to work, but when the flow runs, I get a JSON validation error.
I tried generating it from just the data like this:
{
"user_id": 2003,
"user_label": "Test1"
}
This generated the schema like this:
{
"type": "object",
"properties": {
"user_id": {
"type": "number"
},
"user_label": {
"type": "string"
}
}
}
So you have two things going on: the nested object array, and the null.
You'll need another Parse JSON after the first Parse JSON, and you'll want to filter out the null before the second Parse JSON.
It took me a while to figure out, but I hope this helps.
Start by adding the Parse JSON step after whatever step is outputting the JSON.
Now, filter the array; make sure you use an 'Expression' when comparing with null.
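For example, the Filter array condition can be written as an expression along these lines (entered in advanced mode; the exact action names in your flow may differ):
@not(equals(item(), null))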
Add the second Parse JSON. You'll notice that you won't have the option to select the output "Item" of the Filter array step, so select 'Parse JSON' - Item for now (we will change this to use the output of the Filter array step in a moment).
The step should automatically change to an 'Apply to each'. In Parse JSON 2, generate the schema with this sample:
[
{
"user_id": 2003,
"user_label": "Test1"
},
{
"user_id": 2004,
"user_label": "Test2"
}
]
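That sample should produce a schema roughly like the following (the exact types are whatever the generator infers, so they may vary slightly):
{
"type": "array",
"items": {
"type": "object",
"properties": {
"user_id": {
"type": "integer"
},
"user_label": {
"type": "string"
}
},
"required": [
"user_id",
"user_label"
]
}
}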
Then, modify the 'Select an output from previous steps' field and change it (from the Body of the Parse JSON step) to the Body of the Filter array step.
Finally, add an action after Parse JSON 2 and select one of the fields from Parse JSON 2; this will automatically change that step into a nested Apply to each.
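Under the hood, the field reference inside that nested Apply to each ends up as an expression similar to this (the loop name Apply_to_each_2 is just a placeholder for whatever your flow calls it):
@items('Apply_to_each_2')?['user_label']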
You should end up with something like this:
I'm using a LogicApp triggered by an HTTP call. The call posts a JSON message which is a single-row array. I simply want to extract the single JSON object out of the array so that I can parse it, but I have spent several hours googling and trying various options to no avail. Here's an example of the array:
[{
"id": "866ef906-5bd8-44d8-af34-0c6906d2dfd7",
"subject": "Engagement-866ef906-5bd8-44d8-af34-0c6906d2dfd7",
"data": {
"$meta": {
"traceparent": "00-dccfde4923181d4196f870385d99cb84-52b8333f100b844c-00"
},
"timestamp": "2021-10-19T17:01:06.334Z",
"correlationId": "866ef906-5bd8-44d8-af34-0c6906d2dfd7",
"fileName": "show.xlsx"
},
"eventType": "File.Uploaded",
"eventTime": "2021-10-19T17:01:07.111Z",
"metadataVersion": "1",
"dataVersion": "1"
}]
Examples of what hasn't worked:
Parse JSON on the array, error: InvalidTemplate when specifying an array as the schema
For each directly against the http output, error: No dependent actions succeeded.
Any suggestions would be gratefully received.
You have to paste the example that you have provided into 'Use sample payload to generate schema' in the Parse JSON connector, and then you will be able to retrieve each individual object from the sample payload.
You can extract a single JSON object from your array by using its index in square brackets. E.g., in the example below you'd need to use triggerBody()?[0] instead of triggerBody(). 0 is the index of the first element in the array, 1 of the second, and so on.
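For example, the Parse JSON action's Content field could be set to the expression triggerBody()?[0], and the schema can then be generated from the single object rather than the array. A minimal schema along these lines, derived from the sample above (the types are what the generator would likely infer), would be:
{
"type": "object",
"properties": {
"id": { "type": "string" },
"subject": { "type": "string" },
"data": { "type": "object" },
"eventType": { "type": "string" },
"eventTime": { "type": "string" },
"metadataVersion": { "type": "string" },
"dataVersion": { "type": "string" }
}
}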
I need to loop through this optional array (it's the only section of the JSON I have trouble with).
As you can see from the code:
The optional bullseye has an array rings. Each entry in rings has an expansionCriteria array and may or may not have actions.
How do I iterate and get every type and threshold in expansionCriteria? I also need to access all skillsToRemove under actions, if available.
I am rather new to Logic Apps, so any help is appreciated.
"bullseye": {
"rings": [
{
"expansionCriteria": [
{
"type": "TIMEOUT_SECONDS",
"threshold": 180
}
],
"actions": {
"skillsToRemove": [
{
"name": "Claims Foundation",
"id": "60bd469a-ebab-4958-9ca9-3559636dd67d",
"selfUri": "/api/v2/routing/skills/60bd469a-ebab-4958-9ca9-3559636dd67d"
},
{
"name": "Claims Advanced",
"id": "bdc0d667-8389-4d1d-96e2-341e383476fc",
"selfUri": "/api/v2/routing/skills/bdc0d667-8389-4d1d-96e2-341e383476fc"
},
{
"name": "Claims Intermediate",
"id": "c790eac3-d894-4c00-b2d5-90cd8a69436c",
"selfUri": "/api/v2/routing/skills/c790eac3-d894-4c00-b2d5-90cd8a69436c"
}
]
}
},
{
"expansionCriteria": [
{
"type": "TIMEOUT_SECONDS",
"threshold": 5
}
]
}
]
}
Please let me know if you need more info.
To generate the schema, you can remove the name of the object at the top of the code ("bullseye":) so that the sample you paste is just the object containing rings.
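In other words, the sample to paste for schema generation would start like this (abbreviated here; keep the full rings array from the question):
{
"rings": [
{
"expansionCriteria": [
{
"type": "TIMEOUT_SECONDS",
"threshold": 180
}
]
}
]
}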
Thank you pramodvalavala-msft for posting your answer in MS Q&A for a similar thread.
" As you are working with a JSON Object instead of an Array, unfortunately there is no built-in function to loop over the keys. There is a feature request to add a method to extract keys from an object for scenarios like this, that you could up vote for it gain more traction.
You can use the inline code action to extract the keys from your object as an array (using Object.keys()). And then you can loop over this array using the foreach loop to extract the object that you need from the main object, which you could then use to create records in dynamics."
For more information, you can refer to the links below:
How to loop and extract items from Nested Json Array in Logic Apps
Nested ForEach Loop in Workflow
For a new project I want to use CakePHP 4 as a REST backend with a Vue.js frontend.
Now Cake uses a nested data structure while Vue.js uses a flat data structure.
My plan now is to convert the data in the backend.
Example Format:
CakePHP
{
"user": {
"id": 1,
"name": "Peter Maus"
"articles" : [
{
"id": 15,
"title": "First Post",
}
]
}
}
Vue.js
{
"user": {
"id": 1,
"name": "Peter Maus"
"articles" : [ 15 ]
},
"articles": [
{
"id": 15,
"title": "First Post",
}
]
}
So basically, instead of just sending JSON with
$this->viewBuilder()->setOption('serialize', ['user']);
I want to first "convert the data structure" and then send it as JSON.
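To illustrate the kind of conversion I mean, here is a rough sketch in plain PHP (the helper name and the array shape are just assumptions on my side, not an actual CakePHP API):
// Hypothetical helper: turn the nested CakePHP structure into the flat structure Vue.js expects.
// $user is assumed to be an associative array shaped like the CakePHP example above.
function flattenUser(array $user): array
{
    $articles = $user['articles'] ?? [];

    return [
        'user' => [
            'id' => $user['id'],
            'name' => $user['name'],
            // keep only the article ids on the user
            'articles' => array_column($articles, 'id'),
        ],
        // move the full article records to a top-level key
        'articles' => $articles,
    ];
}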
I have now found the following possibilities for the conversion based on the documentation:
Request - convert from Vue to Cake
I have seen that you can use Body Parser Middleware with your own parser.
But I still have JSON as the response format, and I don't want to override the standard JSON formatter.
Response - convert from Cake to Vue
Ideas:
I have seen "Data Views", but I'm not sure if they are suitable for this purpose.
Extend the ViewBuilder and write my own serialize() function.
How would I include my own ViewBuilder? Is that even possible?
Write a parser function in a parent entity from which all my entities inherit, and call that parse function before serializing the data.
I will probably need access to the entity relations to dynamically restructure the data, for both request and response.
What would be a reasonable approach?
I'm trying to load a JSON file where some of the arrays are empty.
{"house_account_payable":"0.00","house_account_receivable":"0.00","gift_sales_payable":"0.00","gift_sales_receivable":"0.00","store_credit_sales_payable":"0.00","percentage_row":null,"sales_per_period":[["02:00AM - 02:59AM",{"amount":0,"qty":0}],["03:00AM - 03:59AM",{"amount":0,"qty":0}]],"revenue_centers":[],"tax_breakdowns":[]}
This is giving the error:
Error while reading table: test2, error message: Failed to parse JSON: No object found when new array is started.; BeginArray returned false; Parser terminated before end of string
Could somebody help me on this?
Are you trying to load data from your local machine or GCS? Please remember to export as JSONL (newline-delimited JSON):
{"open_orders_ids": []}
{"unpaid_orders_ids": []}
Take a look at the documentation about nested and repeated columns.
EDIT:
Your JSON data should look like this:
{
"items": [
{
"house_account_payable": "0.00",
"house_account_receivable": "0.00",
"gift_sales_payable": "0.00",
"gift_sales_receivable": "0.00",
"store_credit_sales_payable": "0.00",
"percentage_row": "",
"sales_per_period": [
{
"AM02_00_AM02_59": {
"amount": "0",
"qty": "0"
}
},
{
"AM03_00_AM03_59": {
"amount": "0",
"qty": "0"
}
}
]
}
]
}
Following Felipe Hoffa's post, run the following commands:
jq -c .items[] <FILENAME>.json > <FILENAME>.jq.json
bq load --source_format NEWLINE_DELIMITED_JSON --autodetect <DATASET_ID>.<TABLENAME> <FILENAME>.jq.json
Let me know if this is what you are looking for.
There's no problem with the null arrays.
The problem lies in this shorter JSON:
{"sales_per_period":[["02:00AM - 02:59AM",{"amount":0,"qty":0}],["03:00AM - 03:59AM",{"amount":0,"qty":0}]]}
The arrays there hold elements of different types, and to bring it into a structured table, a different schema is needed.
For example:
{"sales_per_period":[{"a":"02:00AM - 02:59AM","b":{"amount":0,"qty":0}},{"a":"03:00AM - 03:59AM","b":{"amount":0,"qty":0}}]}
Now this loads easily into BigQuery:
bq load --source_format=NEWLINE_DELIMITED_JSON --autodetect temp.short delete.short.json
Can you change this source JSON easily outside BigQuery? Otherwise load it raw into BigQuery, and parse it with a JS UDF inside BigQuery.
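If changing it outside BigQuery is an option, a jq one-liner along these lines could do the reshaping (the file names here are just placeholders):
jq -c '.sales_per_period |= [.[] | {a: .[0], b: .[1]}]' short.json > delete.short.json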
Is it possible to send only certain entries of a JSON array?
I have a JSON object defined by the following schema:
"LineGroup": {
"type": "array",
"description": "Line group active",
"items": {
"type": "boolean"
},
"maxItems": 10
}
At the beginning, all entries are sent. But later on, only some entries change, and only these new values must be updated.
My syntax at the beginning, when I send the full array, is:
[{"LineGroup":"False"},{"LineGroup":"True"},...,{"LineGroup":"True"}]
What would the syntax be to send only one or two entries that have changed in this array? Or do I need to resend the whole array?
You could use JSON Patch to update the original JSON object:
http://jsonpatch.com/
The cool thing is that the patch documents themselves are also JSON documents.
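For example, a patch document that only updates the first and fourth entries of the array above might look like this (paths follow RFC 6902, and the indexes are zero-based):
[
{ "op": "replace", "path": "/0/LineGroup", "value": "True" },
{ "op": "replace", "path": "/3/LineGroup", "value": "False" }
]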