Custom policy does not maintain previous values on properties array - mulesoft

I am creating a custom policy following this documentation.
I created one, and it is working. The JSON schema for the policy is shown below:
{
"title": "ACME Custom Basic Auth Policy",
"description": "Basic Authentication policy which enforces security according with custom consumer credentials",
"type": "object",
"properties": {
"users": {
"title": "users",
"type": "array",
"items": {
"type": "object",
"required": [
"username",
"password"
],
"properties": {
"username": {
"title": "User Name",
"type": "string",
"default": []
},
"password": {
"title": "User Password",
"type": "string",
"#context": {
"#characteristics": [
"security:sensitive"
]
}
}
}
},
"minItems": 1
}
},
"#context": {
"#vocab": "anypoint://vocabulary/policy.yaml#",
"security": "anypoint://vocabulary/policy.yaml#"
},
"$id": "allow-dynamic-resources",
"$schema": "https://json-schema.org/draft/2019-09/schema"
}
When I go to API Manager, I can configure the values on the first attempt, but when I go back to change them, they do not appear.
This happens only when I configure an array; if I configure the property as an object instead, it works. How can I fix this?
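For comparison, this is roughly what configuring the property as an object instead of an array would look like (a sketch based on the schema above, holding a single fixed user; the property name "user" is illustrative):
"user": {
  "title": "user",
  "type": "object",
  "required": [ "username", "password" ],
  "properties": {
    "username": { "title": "User Name", "type": "string" },
    "password": {
      "title": "User Password",
      "type": "string",
      "@context": { "@characteristics": [ "security:sensitive" ] }
    }
  }
}
With that shape the previously saved values reportedly come back in API Manager, which suggests the persistence problem is specific to array-typed properties.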

Related

How to loop through an array in a logic app?

I have managed to get all my user data into an array (see here), but now I cannot loop through the data. After building the array I converted it to JSON, but I can no longer address the fields as defined in my JSON schema.
The only thing I can address in my loop (I use the JSON body as input for the For Each loop) is the body itself, not the individual fields like username, mail address etc.
Should I change something in my JSON schema to overcome this or is something else wrong?
Edit: Please find my JSON schema below:
{
"$schema": "http://json-schema.org/draft-04/schema#",
"items": [
{
"properties": {
"##odata.type": {
"type": "string"
},
"createdDateTime": {
"type": "string"
},
"employeeId": {
"type": "string"
},
"givenName": {
"type": "string"
},
"id": {
"type": "string"
},
"mail": {
"type": "string"
},
"onPremisesSamAccountName": {
"type": "string"
},
"surname": {
"type": "string"
},
"userPrincipalName": {
"type": "string"
}
},
"required": [
"##odata.type",
"id",
"givenName",
"surname",
"userPrincipalName",
"mail",
"onPremisesSamAccountName",
"employeeId",
"createdDateTime"
],
"type": "object"
}
],
"type": "array"
}
Please see the image for how the JSON looks:
Per my understanding, you just want to loop through your array to get each item's name, mail and other fields. As you mentioned in your question, you can use the JSON body as input for the For each loop. That's fine; there is no need to do anything more. Please refer to the screenshot below:
Initialize a variable with your JSON data.
Then parse it with the "Parse JSON" action.
Now set the Body as the input for the For each loop, and then use a variable and set its value to "mail" from "Parse JSON".
After running the logic app, we can see the mail field is picked up on every iteration. You can use "mail", "name" and the other fields easily in your "For each"; a sketch of the underlying workflow definition follows below.
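For reference, a minimal sketch of what these steps produce in the underlying workflow definition (the action and variable names Parse_JSON, For_each, userData and mailVar are placeholders for this illustration, not taken from the original flow):
"Parse_JSON": {
  "type": "ParseJson",
  "inputs": {
    "content": "@variables('userData')",
    "schema": { "type": "array", "items": { "type": "object" } }
  }
},
"For_each": {
  "type": "Foreach",
  "foreach": "@body('Parse_JSON')",
  "runAfter": { "Parse_JSON": [ "Succeeded" ] },
  "actions": {
    "Set_variable": {
      "type": "SetVariable",
      "inputs": {
        "name": "mailVar",
        "value": "@items('For_each')?['mail']"
      }
    }
  }
}
The key point is that the For each iterates over the parsed body, so each field of the current item can be addressed with items('For_each')?['fieldName'].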
Update:
I checked your JSON schema, but it doesn't seem to match the JSON data you provided in your screenshot. May I know how you generated your JSON schema? On my side I generate the JSON schema just by clicking the "Use sample payload to generate schema" button, and it generates the schema automatically.
I used a JSON data sample with the same structure as yours and generated its schema; please refer to the JSON data and schema below:
json data:
{
"body": [
{
"#odata.type": "test",
"id": "123456",
"givenName": "test",
"username": "test",
"userPrincipalName": "test",
"mail": "test#mail.com",
"onPremisesSamAccountName": "test",
"employeeId": "test",
"createdDateTime": "testdate"
},
{
"#odata.type": "test",
"id": "123456",
"givenName": "test",
"username": "test",
"userPrincipalName": "test",
"mail": "test#mail.com",
"onPremisesSamAccountName": "test",
"employeeId": "test",
"createdDateTime": "testdate"
}
]
}
schema:
{
"type": "object",
"properties": {
"body": {
"type": "array",
"items": {
"type": "object",
"properties": {
"##odata.type": {
"type": "string"
},
"id": {
"type": "string"
},
"givenName": {
"type": "string"
},
"username": {
"type": "string"
},
"userPrincipalName": {
"type": "string"
},
"mail": {
"type": "string"
},
"onPremisesSamAccountName": {
"type": "string"
},
"employeeId": {
"type": "string"
},
"createdDateTime": {
"type": "string"
}
},
"required": [
"##odata.type",
"id",
"givenName",
"username",
"userPrincipalName",
"mail",
"onPremisesSamAccountName",
"employeeId",
"createdDateTime"
]
}
}
}
}

Microsoft Flow Custom Connector webhook trigger definition and implementation: 404 not found after flow creation

I'm trying to create a custom connector for my API in Microsoft Flow so users can trigger flows based on a webhook implementation.
The authentication part seems to be working properly (I'm able to create connections). After creating a flow using my custom trigger, it never gets triggered. When checking the data on my end it seems that Flow was never able to register the subscription properly.
If I navigate to the management page for the flow, I get the following error message.
When I click on "fix the trigger", I get the following details, where the Id parameter matches the id of the resource we're trying to subscribe to.
Here is the trigger definition:
{
"/AlertRules/{id}/webhooks": {
"x-ms-notification-content": {
"schema": {
"type": "object",
"properties": {
"Title": {
"type": "string",
"description": "Title"
},
"Text": {
"type": "string",
"description": "Text"
},
"Data": {
"type": "array",
"items": {
"$ref": "#/definitions/DataApi.Models.AlertEvent"
},
"description": "Data"
}
}
},
"description": ""
},
"post": {
"responses": {
"201": {
"description": "Created",
"schema": {
"type": "string"
}
}
},
"x-ms-trigger": "single",
"operationId": "NewAlertEvent",
"summary": "When a new Alert Event is created or updated",
"parameters": [
{
"name": "id",
"in": "path",
"required": true,
"type": "string",
"x-ms-visibility": "important",
"x-ms-dynamic-values": {
"operationId": "AlertRules.AlertRule.ListAlertRule",
"value-path": "Id",
"value-collection": "value",
"value-title": "Description"
}
},
{
"name": "body",
"in": "body",
"required": false,
"schema": {
"type": "string",
"x-ms-visibility": "internal",
"title": "",
"x-ms-notification-url": true
},
"x-ms-visibility": "internal"
}
]
}
}
Here is the description of my delete operation:
{
"/AlertRuleSubscriptions({Id})": {
"delete": {
"tags": [
"AlertRuleSubscriptions.AlertRuleSubscription"
],
"summary": "Delete entity from AlertRuleSubscriptions",
"operationId": "AlertRuleSubscriptions.AlertRuleSubscription.DeleteAlertRuleSubscription",
"parameters": [
{
"in": "path",
"name": "Id",
"description": "key: Id",
"required": true,
"type": "string",
"format": "uuid",
"pattern": "^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}$",
"x-ms-docs-key-type": "AlertRuleSubscription"
},
{
"in": "header",
"name": "If-Match",
"description": "ETag",
"type": "string"
}
],
"responses": {
"204": {
"description": "Success"
},
"default": {
"$ref": "#/responses/error"
}
},
"x-ms-docs-operation-type": "operation"
}
}
}
And my post operation does reply with a Location header which matches the format of the delete operation described above.
My questions are:
What is missing in my trigger declaration?
How can I get more details on the subscription creation and the error Microsoft Flow is generating?
After some internal discussions with Microsoft we found two main issues.
First, I updated the body parameter of the POST request that creates the subscription to the following:
{
"name": "body",
"in": "body",
"required": false,
"schema": {
"type": "object",
"properties": {
"callbackUrl": {
"type": "string",
"required": true,
"description": "callbackUrl",
"x-ms-notification-url": true,
"x-ms-visibility": "internal"
}
}
}
}
That is because connector definitions don't support sending the callback URL in the body without JSON formatting, and because Flow is implemented according to the OpenAPI callback specification.
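With the updated definition, the subscription request Flow sends to the POST operation carries a small JSON body with the callback URL, roughly like the following (the URL shape is only illustrative of what Logic Apps/Flow generates):
{
  "callbackUrl": "https://prod-00.westeurope.logic.azure.com/workflows/<workflow-id>/triggers/<trigger-name>/run?<query-parameters>"
}
Your API then stores that URL for the subscription and posts the x-ms-notification-content payload to it whenever a new alert event occurs.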
Second, I updated my API to support the specification mentioned above.

Create SQL Server via Azure Resource Manager (ARM) template

I am trying to create a new Azure instance of SQL Server in which I would like to then create a few new databases.
I know from the Azure Portal that some sort of admin users could be:
an SA user (I think this means "Server Admin"; it looks like some sort of old way of managing a SQL Server instance, but at the same time it is very "basic" and proven to work)
an Active Directory user (not sure about the Azure terminology here, but it looks like this could be some "broad user" for the whole Azure platform, e.g. my own login for the Azure Portal; this is not specific to the database world).
I would like to create a SQL Server with an SA user to administer the server. From the Azure portal I cannot find a way to generate an ARM template for an SA user for the SQL Server instance.
I am copy-pasting from a 10,000-line ARM template for a very long list of SQL servers and databases, but I am not able to isolate the basic steps into a hopefully clean and short ARM template to start with.
This is the ARM template I am trying to deploy on Azure:
{
"$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"location": {
"type": "string",
"defaultValue": "westeurope"
},
"foo_sql_server_name": {
"defaultValue": "foo-sql-server",
"type": "String"
}
},
"resources": [
{
"type": "Microsoft.Sql/servers",
"kind": "v12.0",
"name": "[parameters('foo_sql_server_name')]",
"apiVersion": "2015-05-01-preview",
"location": "[parameters('location')]",
"scale": null,
"properties": {
"administratorLogin": "<MY_SA_USER_THAT_I_CAN_NOT_CREATE>",
"version": "12.0"
},
"dependsOn": []
}
]
}
When running the above with:
az group deployment create \
--name "deployDBs" \
--resource-group "MyCustomResourceGroup" \
--template-file ./templates/db.json # --verbose --debug
Then I get the following error message:
Deployment failed. Correlation ID: <A_CUSTOM_GUID>. {
"status": "Failed",
"error": {
"code": "ResourceDeploymentFailure",
"message": "The resource operation completed with terminal provisioning state 'Failed'.",
"details": [
{
"code": "InvalidParameterValue",
"message": "Invalid value given for parameter Password. Specify a valid parameter value."
}
]
}
}
When I remove the JSON field administratorLogin (because hopefully I could create the SA user somewhere else, in a way I have yet to figure out), I get the following error message:
Deployment failed. Correlation ID: <ANOTHER_CUSTOM_GUID>. {
"status": "Failed",
"error": {
"code": "ResourceDeploymentFailure",
"message": "The resource operation completed with terminal provisioning state 'Failed'.",
"details": [
{
"code": "InvalidParameterValue",
"message": "Invalid value given for parameter Login. Specify a valid parameter value."
}
]
}
}
I am not able to find the definition of the username/password pair for the SA user (Server Admin) in the 10,000-line auto-generated ARM template.
How could I create/inject a SA user for the SQL Server while deploying a new instance of a SQL Server?
The sa login you use on an on-premises SQL Server instance is known in Azure SQL as the admin login. You can provide the name of the admin login and its password as parameters, as shown in the sample template below:
{
"$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"yourservernameName": {
"type": "string",
"defaultValue": "yourservername2"
},
"yourservernameAdminLogin": {
"type": "string",
"defaultValue": "VeryWiseAdmin",
"minLength": 1
},
"yourservernameAdminLoginPassword": {
"type": "securestring",
"defaultValue": "ReplaceWithTheMostSecurePasswordThatEverExisted&NeverShareLikeThisWithAnyone!"
},
"dbnameName": {
"type": "string",
"defaultValue": "dbname",
"minLength": 1
},
"dbnameCollation": {
"type": "string",
"minLength": 1,
"defaultValue": "SQL_Latin1_General_CP1_CI_AS"
},
"dbnameEdition": {
"type": "string",
"defaultValue": "Basic"
},
"dbnameRequestedServiceObjectiveName": {
"type": "string",
"defaultValue": "Basic"
}
},
"variables": {
},
"resources": [
{
"name": "[parameters('yourservernameName')]",
"type": "Microsoft.Sql/servers",
"location": "West Europe",
"apiVersion": "2014-04-01-preview",
"dependsOn": [],
"tags": {
"displayName": "yourservername"
},
"properties": {
"administratorLogin": "[parameters('yourservernameAdminLogin')]",
"administratorLoginPassword": "[parameters('yourservernameAdminLoginPassword')]",
"version": "12.0"
},
"resources": [
{
"name": "[concat(parameters('yourservernameName'),'/AllowAllWindowsAzureIps')]",
"type": "Microsoft.Sql/servers/firewallRules",
"location": "[resourceGroup().location]",
"apiVersion": "2014-04-01-preview",
"dependsOn": [
"[resourceId('Microsoft.Sql/servers', parameters('yourservernameName'))]"
],
"properties": {
"startIpAddress": "0.0.0.0",
"endIpAddress": "0.0.0.0"
}
},
{
"name": "[concat(parameters('yourservernameName'),'/',parameters('dbnameName'))]",
"type": "Microsoft.Sql/servers/databases",
"location": "West Europe",
"apiVersion": "2014-04-01-preview",
"dependsOn": [
"[resourceId('Microsoft.Sql/servers', parameters('yourservernameName'))]"
],
"tags": {
"displayName": "dbname"
},
"properties": {
"collation": "[parameters('dbnameCollation')]",
"edition": "[parameters('dbnameEdition')]",
"maxSizeBytes": "1073741824",
"requestedServiceObjectiveName": "[parameters('dbnameRequestedServiceObjectiveName')]"
}
}
]
}
],
"outputs": {
"SomeString": {
"type": "string",
"value": "What ever you want to put here"
},
"ServerNameParam": {
"type": "string",
"value": "[parameters('yourservernameName')]"
},
"ServerResourceID": {
"type": "string",
"value": "[resourceId('Microsoft.Sql/servers', parameters('yourservernameName'))]"
},
"ServerObject": {
"type": "object",
"value": "[reference(parameters('yourservernameName'))]"
},
"SqlServerURL": {
"type": "string",
"value": "[reference(parameters('yourservernameName')).fullyQualifiedDomainName]"
},
"DbResourceID": {
"type": "string",
"value": "[resourceId('Microsoft.Sql/servers/databases', parameters('yourservernameName'), parameters('dbnameName'))]"
},
"DbObject": {
"type": "object",
"value": "[reference(parameters('dbnameName'))]"
},
"DbAdoConnString": {
"type": "string",
"value": "[concat('Server=tcp:',reference(parameters('yourservernameName')).fullyQualifiedDomainName,',1433;Initial Catalog=',parameters('dbnameName'),';Persist Security Info=False;User ID=',reference(parameters('yourservernameName')).administratorLogin,';Password=',reference(parameters('yourservernameName')).administratorLoginPassword,';MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;')]"
}
}
}
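If it helps, a matching parameters file for the template above might look roughly like this (values are placeholders; in practice you would pass the password at deployment time rather than committing it):
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "yourservernameName": { "value": "yourservername2" },
    "yourservernameAdminLogin": { "value": "VeryWiseAdmin" },
    "yourservernameAdminLoginPassword": { "value": "<supply-at-deploy-time>" }
  }
}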
Working sample:
"name": "name",
"type": "Microsoft.Sql/servers",
"location": "[resourceGroup().location]",
"apiVersion": "2014-04-01",
"properties": {
"administratorLogin": "somelogin",
"administratorLoginPassword": "somepasswordD1!"
}
Please note that SA might not be allowed as a username, and the password has complexity requirements.
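To get the clean and short starting point the question asks for, the fragment above could be wrapped into a minimal complete template along these lines (parameter names are illustrative; the server name is taken from the question):
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "sqlAdminLogin": { "type": "string" },
    "sqlAdminPassword": { "type": "securestring" }
  },
  "resources": [
    {
      "type": "Microsoft.Sql/servers",
      "apiVersion": "2014-04-01",
      "name": "foo-sql-server",
      "location": "[resourceGroup().location]",
      "properties": {
        "administratorLogin": "[parameters('sqlAdminLogin')]",
        "administratorLoginPassword": "[parameters('sqlAdminPassword')]",
        "version": "12.0"
      }
    }
  ]
}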
We wanted to create a temporary, unique password per resource group and not have to worry about passwords in template or parameter files, since these are checked into git. We solved it like this:
template.json:
{
"$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"vulnerabilityAssessments_Default_storageContainerPath": {
"type": "SecureString"
},
"servers_dev_name": {
"defaultValue": "dev-app",
"type": "String"
}
},
"variables": {
"servers_dev_password": "[concat('P', uniqueString(resourceGroup().id, '224F5A8B-51DB-46A3-A7C8-59B0DD584A41'), 'x', '!')]",
},
"resources": [
{
"type": "Microsoft.Sql/servers",
"apiVersion": "2019-06-01-preview",
"name": "[parameters('servers_dev_name')]",
"location": "northeurope",
"kind": "v12.0",
"properties": {
"administratorLogin": "OurSaName",
"administratorLoginPassword": "[variables('servers_dev_password')]",
"version": "12.0",
"publicNetworkAccess": "Enabled"
}
},
"To make sure that we are compliant with the Azure SQL database policy "Your password must contain characters from three of the following categories – English uppercase letters, English lowercase letters, numbers (0-9), and non-alphanumeric characters (!, $, #, %, etc.)", we insert one character for each category before and after the unique string."
Sources:
https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/template-functions-string#uniquestring
https://vivien-chevallier.com/Articles/automatically-generate-a-password-for-an-azure-sql-database-with-arm-template
Warning:
If you add to your parameters.json:
"servers_dev_password": {
"value": "[uniqueString(resourceGroup().id)]"
}
and add the parameter to template.json:
"servers_dev_password": {
"type": "SecureString"
}
then the actual password will literally be the string [uniqueString(resourceGroup().id)], because template functions are not evaluated inside a parameters file.
A thing to note is that the definition for uniqueString is:
Creates a deterministic hash string based on the values provided as parameters.
This means that if you want to create a unique password per deployment it would have to look something like this:
"parameters": {
"newGuid": {
"type": "string",
"defaultValue": "[newGuid()]"
}
}
"variables": {
"sqlserverAdminPassword": "[concat(uniqueString(guid(resourceGroup().id, deployment().name)), parameters('newGuid'), 'Tg2%')]"
}
Your password would then be updated on every deploy.
Source: https://stackoverflow.com/a/70325944/3850405

Send workflow information to custom connector

I need help with sending workflow information in the header/body of calls to a custom connector. I am trying to load a drop-down list in one of the parameters of a logic app using values returned from an API call. The API endpoint requires basic workflow information, such as the resource group and workflow name, which are normally available in the headers of HTTP requests made during logic app execution.
Normally when I use @{workflow().name} in the logic app's JSON, it is substituted with the workflow name. In the case of a custom connector, the WDL syntax is passed as-is without any transformation.
Here is a simplified swagger json with all relevant sections.
{
"swagger": "2.0",
"info": {
"title": "{{dynamicHostName}}",
"version": "1.0.0"
},
"host": "{{dynamicHostName}}",
"basePath": "/",
"schemes": [
"https"
],
"paths": {
"/sftpsource": {
"post": {
"operationId": "SftpSource",
"summary": "Sftp as source system",
"description": "Use Sftp as source system",
"produces": [
"application/json"
],
"consumes": [
"application/json"
],
"parameters": [
{
"name": "params",
"in": "body",
"required": true,
"schema": {
"type": "object",
"properties": {
"hostName": {
"type": "string",
"x-ms-summary": "Host Name",
"x-ms-visibility": "advanced"
},
"portNumber": {
"type": "string",
"x-ms-summary": "Port Number",
"x-ms-visibility": "advanced"
},
"userName": {
"type": "string",
"x-ms-summary": "User Name",
"x-ms-visibility": "advanced"
},
"password": {
"type": "string",
"x-ms-summary": "Password",
"x-ms-visibility": "advanced"
},
"filePath": {
"type": "string",
"x-ms-summary": "File Path"
},
"system": {
"type": "string",
"x-ms-visibility": "advanced",
"x-ms-summary": "System",
"x-ms-dynamic-values": {
"operationId": "GetTaggedSystems",
"parameters": {
"workflow-name": "#{workflow().name}"
},
"value-path": "systemId",
"value-title": "systemName"
}
}
}
}
}
],
"responses": {
"202": {
"description": "Request is queued"
},
"500": {
"description": "Server Error"
}
}
}
},
"/taggedSystems" : {
"get": {
"operationId": "GetTaggedSystems",
"summary": "Tagged Systems",
"x-ms-visibility": "advanced",
"description": "Get all systems tagged to this flow",
"parameters": [
{
"name": "workflow-name",
"in": "header",
"required": true,
"type": "string"
}
],
"responses": {
"200": {
"description": "OK",
"schema": {
"$ref": "#/definitions/TaggedSystems"
}
},
"202": {
"description": "Work is still in progress"
},
"500": {
"description": "An error occured while trying to fetch tagged systems."
}
}
}
}
},
"definitions": {
"TaggedSystems": {
"type": "array",
"items": {
"type": "object",
"properties": {
"systemId": {
"type": "string"
},
"systemName": {
"type": "string"
}
},
"required": [
"systemId",
"systemName"
]
}
}
}
}
You can define an internal parameter in the header/body through which you pass dynamic expressions for whatever you need from the workflow environment at the time the flow executes.
For complete information about the flow, you can pass @{workflow()} as an internal parameter.
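As a sketch of that suggestion applied to the swagger above, the workflow-name header could be declared as an internal parameter whose default value carries the expression, so the maker never fills it in and the logic app evaluates it at run time (whether a default on an internal parameter also covers the x-ms-dynamic-values scenario is something to verify):
{
  "name": "workflow-name",
  "in": "header",
  "required": true,
  "type": "string",
  "default": "@{workflow().name}",
  "x-ms-visibility": "internal"
}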
Hope it helps.

Angular schema form access Object with Arrays from Form

I have a scenario where I need to display arrays inside a named object, like this:
"actions": {
"singleSelection": [
{
"chartable": false,
"label": ""
}
]
}
I have accomplished it with the following schema:
"schema": {
"type": "object",
"title": "smart_report",
"properties": {
"actions": {
"title": "Actions",
"type": "object",
"properties": {
"singleSelection": {
"title": "Action: Single selection",
"type": "array",
"maxItems": 10,
"items": {
"type": "object",
"properties": {
"field": {
"title": "Field name",
"type": "string"
},
"label": {
"title": "Label",
"type": "string",
"description": "Label will be used for column name."
},
"chartable": {
"title": "Chartable",
"type": "boolean"
}
}
}
}
}
}
}
Now I am trying to set the 'notitle' flag on 'actions' in the form and to access the properties of 'actions', but it's not working as expected:
{
"key": "actions",
"notitle": true,
"properties": {
"key": "singleSelection",
"notitle": true,
"startEmpty": true
}
},
I still see the title for 'actions' as well as for 'singleSelection', and 'startEmpty' is also not set.
If you remove the titles from the schema, as well as using the notitle attribute, it should work.
(If you remove the title but do not use the notitle attribute, the title will be displayed with the same name as the key.)
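As a sketch, the form definition would then address the nested array by its key path instead of nesting a properties block (assuming Angular Schema Form's dotted key syntax; option names should be checked against the version in use):
[
  {
    "key": "actions.singleSelection",
    "notitle": true,
    "startEmpty": true
  }
]
Combined with removing (or leaving out) the titles in the schema, this should suppress both headings and start the array empty.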
