Non-Primitive Values from a Registered Context Provider are not showing a value - arrays

I'm using the /v2/registrations endpoint to register a Context Provider with the legacyForwarding flag set, so my Context Provider offers the /v1/queryContext endpoint.
When I return a simple value (Integer, String, etc.) such as a temperature, the data is added to the context correctly:
{
"contextResponses": [
{
"contextElement": {
"attributes": [
{
"name": "temperature",
"type": "Number",
"value": 27
}
],
"id": "urn:ngsi-ld:Store:001",
"isPattern": "false",
"type": "Store"
},
"statusCode": {
"code": "200",
"reasonPhrase": "OK"
}
}
]
}
However, when I try to return an array of strings from the Context Provider, as shown below:
{
"contextResponses": [
{
"contextElement": {
"attributes": [
{
"name": "tweets",
"type": "Array",
"value": [
"String 1",
"String 2"
]
}
],
"id": "urn:ngsi-ld:Store:002",
"isPattern": "false",
"type": "Store"
},
"statusCode": {
"code": "200",
"reasonPhrase": "OK"
}
}
]
}
I can see the request being sent in the log and I can retrieve the following entity:
{
"id": "urn:ngsi-ld:Store:002",
"type": "Store",
"address": {
"type": "PostalAddress",
"value": "",
"metadata": {}
},
"location": {
"type": "geo:json",
"value": "",
"metadata": {}
},
"name": {
"type": "Text",
"value": "Checkpoint Markt",
"metadata": {}
},
"tweets": {
"type": "Array",
"value": "",
"metadata": {}
}
}
As you can see, the "tweets" value is blank, but the attribute exists and the type has been received successfully.
My question is: how should I return an Array or an Object as a value from a Context Provider so that Orion is able to display the received data correctly?

Further investigation by the Orion team showed that this was indeed a bug, and an issue was raised against Orion 2.0.0. With the latest release the bug has been fixed.
The solution is to upgrade to a later version of Orion - currently 2.1.0 at the time of writing.
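For illustration, on a fixed Orion (2.1.0 or later) the forwarded array should come back intact when querying the entity. A rough sketch of the expected result, showing only the tweets attribute with the values taken from the Context Provider response above:
{
  "tweets": {
    "type": "Array",
    "value": ["String 1", "String 2"],
    "metadata": {}
  }
}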

Related

Custom policy does not maintain previous values on properties array

I am creating a custom policy following this documentation.
I created one, and it is working. The JSON schema for the policy is shown below:
{
"title": "ACME Custom Basic Auth Policy",
"description": "Basic Authentication policy which enforces security according with custom consumer credentials",
"type": "object",
"properties": {
"users": {
"title": "users",
"type": "array",
"items": {
"type": "object",
"required": [
"username",
"password"
],
"properties": {
"username": {
"title": "User Name",
"type": "string",
"default": []
},
"password": {
"title": "User Password",
"type": "string",
"#context": {
"#characteristics": [
"security:sensitive"
]
}
}
}
},
"minItems": 1
}
},
"#context": {
"#vocab": "anypoint://vocabulary/policy.yaml#",
"security": "anypoint://vocabulary/policy.yaml#"
},
"$id": "allow-dynamic-resources",
"$schema": "https://json-schema.org/draft/2019-09/schema"
}
When I go to API Manager, I can configure the values on the first attempt, but when I go back to change the values, they do not appear.
This happens only when I configure an array. If I configure it as an object, it works. How can I fix this?
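For illustration only, this is the kind of object-based shape the question says does round-trip in API Manager. The credentials wrapper name is made up for this sketch, and it trades the users array for a single set of credentials; it would replace the users entry under properties in the schema above:
{
  "credentials": {
    "title": "credentials",
    "type": "object",
    "required": [ "username", "password" ],
    "properties": {
      "username": { "title": "User Name", "type": "string" },
      "password": {
        "title": "User Password",
        "type": "string",
        "@context": { "@characteristics": [ "security:sensitive" ] }
      }
    }
  }
}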

Azure Logic App Partition key [X] is invalid with Cosmos DB

I need to save parsed JSON data to Cosmos DB. The HTTP trigger works as it should, as does the parsing, but I am getting Partition key [my_dynamic_key_value] is invalid.
Has anyone had a similar issue?
I have found this article (link) but am still getting the same error.
Thanks
EDIT 1
This is the flow for adding an item to the DB.
Schema:
{
"type": "array",
"items": {
"type": "object",
"properties": {
"id": {
"type": "string"
},
"Groups": {
"type": "array",
"items": {
"type": "string"
}
},
"JobName": {
"type": "string"
},
"Link": {
"type": "string"
},
"MinSalary": {
"type": "string"
},
"MaxSalary": {
"type": "string"
},
"Hours": {
"type": "string"
},
"WorkPattern": {
"type": "string"
},
"Details": {
"type": "array",
"items": {
"type": "object",
"properties": {
"Name": {
"type": "string"
},
"Detail": {
"type": "string"
}
},
"required": [
"Name",
"Detail"
]
}
}
},
"required": [
"id",
"Groups",
"JobName",
"Link",
"MinSalary",
"MaxSalary",
"Hours",
"WorkPattern",
"Details"
]
}
}
Here is a response:
{
"code": "BadRequest",
"message": "Partition key [1bb2d44f-a066-4fa8-8a78-0cdcea1a756c] is invalid.\r\nActivityId: 345f9a99-534b-40cb-9dc0-9863dc8c90f5, \r\nRequestStartTime: 2020-04-28T08:04:46.8249255Z, RequestEndTime: 2020-04-28T08:04:46.8249255Z, Number of regions attempted:1\r\n, Microsoft.Azure.Documents.Common/2.10.0"
}
You need to wrap the partition key value in double quotes. Refer to the sample screenshot below.
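As a minimal sketch of the relevant part of the Create or update document action in code view (connection details omitted; the container partition key path /id and the loop name For_each are assumptions, not values from the post), note the escaped double quotes around the dynamic value:
"Create_or_update_document": {
  "type": "ApiConnection",
  "inputs": {
    "body": "@items('For_each')",
    "headers": {
      "x-ms-documentdb-raw-partitionkey": "\"@{items('For_each')?['id']}\""
    },
    "method": "post",
    "path": "/dbs/@{encodeURIComponent('<database>')}/colls/@{encodeURIComponent('<container>')}/docs"
  }
}
In the designer this corresponds to typing "@{items('For_each')?['id']}" - including the surrounding double quotes - into the Partition key value field.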

How to loop through an array in a logic app?

I have managed to get all my user data into an array (see here), but now I cannot loop through the data. After building the array I converted it to JSON, but I can no longer address the fields as defined in my JSON schema.
The only thing I can address in my loop (I use the JSON body as input for the For each loop) is the body itself, not the individual fields like username, mail address, etc.
Should I change something in my JSON schema to overcome this, or is something else wrong?
Edit: Please find my JSON schema below:
{
"$schema": "http://json-schema.org/draft-04/schema#",
"items": [
{
"properties": {
"##odata.type": {
"type": "string"
},
"createdDateTime": {
"type": "string"
},
"employeeId": {
"type": "string"
},
"givenName": {
"type": "string"
},
"id": {
"type": "string"
},
"mail": {
"type": "string"
},
"onPremisesSamAccountName": {
"type": "string"
},
"surname": {
"type": "string"
},
"userPrincipalName": {
"type": "string"
}
},
"required": [
"##odata.type",
"id",
"givenName",
"surname",
"userPrincipalName",
"mail",
"onPremisesSamAccountName",
"employeeId",
"createdDateTime"
],
"type": "object"
}
],
"type": "array"
}
Please see the image for how the JSON looks:
Per my understanding, you just want to loop through your array to get each item's name, mail, and some other fields. As you mentioned in your question, you can use the JSON body as input for the For each loop. That is fine; there is no need to do anything more. Please refer to the screenshot below:
Initialize a variable with your JSON data.
Then parse it with the "Parse JSON" action.
Now set the body as the input for the For each loop, then use a variable and set its value to "mail" from "Parse JSON".
After running the logic app, we can see the mail field is looped as well. You can easily use the "mail", "name", and other fields in your "For each".
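As a minimal sketch of what this looks like in code view (the names For_each, Parse_JSON and MailVar are assumptions for this sketch), the individual fields are read with the items() function:
"For_each": {
  "type": "Foreach",
  "foreach": "@body('Parse_JSON')",
  "actions": {
    "Set_variable": {
      "type": "SetVariable",
      "inputs": {
        "name": "MailVar",
        "value": "@items('For_each')?['mail']"
      },
      "runAfter": {}
    }
  },
  "runAfter": {
    "Parse_JSON": [ "Succeeded" ]
  }
}
If the parsed payload wraps the array in a body property (as in the sample further down), the foreach input becomes @body('Parse_JSON')?['body'] instead.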
Update:
I checked your JSON schema, but it does not seem to match the JSON data you provided in your screenshot. May I know how you generated your JSON schema? On my side I generate the schema just by clicking the "Use sample payload to generate schema" button, and it generates the schema automatically.
I used a JSON data sample with the same structure as yours and generated its schema; please refer to the JSON data and schema below.
JSON data:
{
"body": [
{
"#odata.type": "test",
"id": "123456",
"givenName": "test",
"username": "test",
"userPrincipalName": "test",
"mail": "test#mail.com",
"onPremisesSamAccountName": "test",
"employeeId": "test",
"createdDateTime": "testdate"
},
{
"#odata.type": "test",
"id": "123456",
"givenName": "test",
"username": "test",
"userPrincipalName": "test",
"mail": "test#mail.com",
"onPremisesSamAccountName": "test",
"employeeId": "test",
"createdDateTime": "testdate"
}
]
}
schema:
{
"type": "object",
"properties": {
"body": {
"type": "array",
"items": {
"type": "object",
"properties": {
"##odata.type": {
"type": "string"
},
"id": {
"type": "string"
},
"givenName": {
"type": "string"
},
"username": {
"type": "string"
},
"userPrincipalName": {
"type": "string"
},
"mail": {
"type": "string"
},
"onPremisesSamAccountName": {
"type": "string"
},
"employeeId": {
"type": "string"
},
"createdDateTime": {
"type": "string"
}
},
"required": [
"##odata.type",
"id",
"givenName",
"username",
"userPrincipalName",
"mail",
"onPremisesSamAccountName",
"employeeId",
"createdDateTime"
]
}
}
}
}

Microsoft Flow Custom Connector webhook trigger definition and implementation: 404 not found after flow creation

I'm trying to create a custom connector for my API in Microsoft Flow so users can trigger flows based on a webhook implementation.
The authentication part seems to be working properly (I'm able to create connections). After I create a flow using my custom trigger, it never fires. When checking the data on my end, it seems that Flow was never able to register the subscription properly.
If I navigate to the management page for the flow, I get the following error message.
When I click on "Fix the trigger", I get the following details, where the Id parameter matches the id of the resource we are trying to subscribe to.
Here is the trigger definition:
{
"/AlertRules/{id}/webhooks": {
"x-ms-notification-content": {
"schema": {
"type": "object",
"properties": {
"Title": {
"type": "string",
"description": "Title"
},
"Text": {
"type": "string",
"description": "Text"
},
"Data": {
"type": "array",
"items": {
"$ref": "#/definitions/DataApi.Models.AlertEvent"
},
"description": "Data"
}
}
},
"description": ""
},
"post": {
"responses": {
"201": {
"description": "Created",
"schema": {
"type": "string"
}
}
},
"x-ms-trigger": "single",
"operationId": "NewAlertEvent",
"summary": "When a new Alert Event is created or updated",
"parameters": [
{
"name": "id",
"in": "path",
"required": true,
"type": "string",
"x-ms-visibility": "important",
"x-ms-dynamic-values": {
"operationId": "AlertRules.AlertRule.ListAlertRule",
"value-path": "Id",
"value-collection": "value",
"value-title": "Description"
}
},
{
"name": "body",
"in": "body",
"required": false,
"schema": {
"type": "string",
"x-ms-visibility": "internal",
"title": "",
"x-ms-notification-url": true
},
"x-ms-visibility": "internal"
}
]
}
}
The description of my delete operation:
{
"/AlertRuleSubscriptions({Id})": {
"delete": {
"tags": [
"AlertRuleSubscriptions.AlertRuleSubscription"
],
"summary": "Delete entity from AlertRuleSubscriptions",
"operationId": "AlertRuleSubscriptions.AlertRuleSubscription.DeleteAlertRuleSubscription",
"parameters": [
{
"in": "path",
"name": "Id",
"description": "key: Id",
"required": true,
"type": "string",
"format": "uuid",
"pattern": "^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}$",
"x-ms-docs-key-type": "AlertRuleSubscription"
},
{
"in": "header",
"name": "If-Match",
"description": "ETag",
"type": "string"
}
],
"responses": {
"204": {
"description": "Success"
},
"default": {
"$ref": "#/responses/error"
}
},
"x-ms-docs-operation-type": "operation"
}
}
}
And my POST operation does reply with a Location header that matches the format of the delete operation described above.
My questions are:
What is missing in my trigger declaration?
How can I get more details on the subscription creation and the error Microsoft Flow is generating?
After some internal discussions with Microsoft, we found two main issues.
First, I updated the body parameter of the POST request that creates the subscription to this:
{
"name": "body",
"in": "body",
"required": false,
"schema": {
"type": "object",
"properties": {
"callbackUrl": {
"type": "string",
"required": true,
"description": "callbackUrl",
"x-ms-notification-url": true,
"x-ms-visibility": "internal"
}
}
}
}
That is because the connector definitions don't support sending the callback URL in the body without using JSON formatting, and because Flow was implemented using the OpenAPI callback specification.
Second, I updated my API to support the specification mentioned above.
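For illustration, the subscription handshake now works roughly like this (the URL and ids below are placeholders, not values from the post): when the flow is saved, Flow POSTs a JSON body carrying its callback URL to /AlertRules/{id}/webhooks:
{
  "callbackUrl": "https://prod-00.westus.logic.azure.com/workflows/<workflow-id>/triggers/NewAlertEvent/paths/invoke?<signature-query>"
}
The API stores the subscription, calls that URL with the x-ms-notification-content payload whenever a new Alert Event is created or updated, and answers the POST with 201 plus a Location header such as /AlertRuleSubscriptions(<subscription-id>), which Flow invokes with DELETE to unsubscribe when the flow is removed or the trigger is re-registered.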

Send workflow information to custom connector

I need help with sending workflow information in the header/body of calls to a custom connector. I am trying to load a drop-down list in one of the parameters of a logic app using values returned from an API call. The API endpoint requires basic workflow information, such as the resource group and the workflow name, which are normally available in the headers of HTTP requests during logic app execution.
Normally, when I use @{workflow().name} in the logic app's JSON, it is substituted with the workflow name. In the case of a custom connector, the WDL syntax is passed as-is, without any transformation.
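For reference, @{workflow()} evaluates at run time to an object roughly like the one below (all values are placeholders); both the workflow name and the resource group can be read from it:
{
  "id": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Logic/workflows/<workflow-name>",
  "name": "<workflow-name>",
  "type": "Microsoft.Logic/workflows",
  "location": "<region>",
  "tags": {},
  "run": {
    "id": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Logic/workflows/<workflow-name>/runs/<run-id>",
    "name": "<run-id>",
    "type": "Microsoft.Logic/workflows/runs"
  }
}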
Here is a simplified Swagger JSON with all the relevant sections:
{
"swagger": "2.0",
"info": {
"title": "{{dynamicHostName}}",
"version": "1.0.0"
},
"host": "{{dynamicHostName}}",
"basePath": "/",
"schemes": [
"https"
],
"paths": {
"/sftpsource": {
"post": {
"operationId": "SftpSource",
"summary": "Sftp as source system",
"description": "Use Sftp as source system",
"produces": [
"application/json"
],
"consumes": [
"application/json"
],
"parameters": [
{
"name": "params",
"in": "body",
"required": true,
"schema": {
"type": "object",
"properties": {
"hostName": {
"type": "string",
"x-ms-summary": "Host Name",
"x-ms-visibility": "advanced"
},
"portNumber": {
"type": "string",
"x-ms-summary": "Port Number",
"x-ms-visibility": "advanced"
},
"userName": {
"type": "string",
"x-ms-summary": "User Name",
"x-ms-visibility": "advanced"
},
"password": {
"type": "string",
"x-ms-summary": "Password",
"x-ms-visibility": "advanced"
},
"filePath": {
"type": "string",
"x-ms-summary": "File Path"
},
"system": {
"type": "string",
"x-ms-visibility": "advanced",
"x-ms-summary": "System",
"x-ms-dynamic-values": {
"operationId": "GetTaggedSystems",
"parameters": {
"workflow-name": "#{workflow().name}"
},
"value-path": "systemId",
"value-title": "systemName"
}
}
}
}
}
],
"responses": {
"202": {
"description": "Request is queued"
},
"500": {
"description": "Server Error"
}
}
}
},
"/taggedSystems" : {
"get": {
"operationId": "GetTaggedSystems",
"summary": "Tagged Systems",
"x-ms-visibility": "advanced",
"description": "Get all systems tagged to this flow",
"parameters": [
{
"name": "workflow-name",
"in": "header",
"required": true,
"type": "string"
}
],
"responses": {
"200": {
"description": "OK",
"schema": {
"$ref": "#/definitions/TaggedSystems"
}
},
"202": {
"description": "Work is still in progress"
},
"500": {
"description": "An error occured while trying to fetch tagged systems."
}
}
}
}
},
"definitions": {
"TaggedSystems": {
"type": "array",
"items": {
"type": "object",
"properties": {
"systemId": {
"type": "string"
},
"systemName": {
"type": "string"
}
},
"required": [
"systemId",
"systemName"
]
}
}
}
}
You can define an internal parameter (as a header or body parameter) and pass dynamic expressions for whatever you need from the workflow environment at the time the flow executes.
For complete information about the flow, you can pass @{workflow()} as an internal parameter.
Hope it helps.
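As a sketch of what this could look like in the Swagger above (the default wiring here is an assumption for illustration, not something stated in the answer): mark the parameter as internal and give it a default carrying the expression, so the designer writes the expression into the action inputs and it is resolved when the flow executes:
{
  "name": "workflow-name",
  "in": "header",
  "required": true,
  "type": "string",
  "x-ms-visibility": "internal",
  "default": "@{workflow().name}"
}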

Resources