Need help understanding how to maintain Azure Key Vault across environments and use it in a Logic App - azure-logic-apps

I have a logic app which makes an HTTP call to a Key Vault URI to get the secret needed to connect to an external system. I developed this in the dev resource group. I want to know how to set up the key vault from the dev resource group in the other resource groups (test/prod), and also how to migrate the logic app and retrieve the secret per environment.

The solution is to use ARM templates and an Azure DevOps (or any other) pipeline. You can create ARM templates with different parameter values for different environments and use them to deploy your Logic App and Key Vault to each environment.
Logic App Template sample:
{
"$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
// Template parameters
"parameters": {
"<template-parameter-name>": {
"type": "<parameter-type>",
"defaultValue": "<parameter-default-value>",
"metadata": {
"description": "<parameter-description>"
}
}
},
"variables": {},
"functions": [],
"resources": [
{
// Start logic app resource definition
"properties": {
<other-logic-app-resource-properties>,
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {<action-definitions>},
// Workflow definition parameters
"parameters": {
"<workflow-definition-parameter-name>": {
"type": "<parameter-type>",
"defaultValue": "<parameter-default-value>",
"metadata": {
"description": "<parameter-description>"
}
}
},
"triggers": {
"<trigger-name>": {
"type": "<trigger-type>",
"inputs": {
// Workflow definition parameter reference
"<attribute-name>": "#parameters('<workflow-definition-parameter-name')"
}
}
},
<...>
},
// Workflow definition parameter value
"parameters": {
"<workflow-definition-parameter-name>": {
"value": "[parameters('<template-parameter-name>')]"
}
},
"accessControl": {}
},
<other-logic-app-resource-definition-attributes>
}
// End logic app resource definition
],
"outputs": {}
}
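Inside the workflow definition, the Key Vault call itself can take its URI from exactly this parameter chain. Below is a minimal sketch of such an HTTP action; the action name, the keyVaultSecretUri parameter, and the managed-identity authentication are illustrative assumptions rather than details from the original post:
"Get_secret": {
  "type": "Http",
  "inputs": {
    "method": "GET",
    // e.g. https://<vault-name>.vault.azure.net/secrets/<secret-name>
    "uri": "@{parameters('keyVaultSecretUri')}?api-version=7.4",
    "authentication": {
      "type": "ManagedServiceIdentity",
      "audience": "https://vault.azure.net"
    }
  },
  "runAfter": {}
}
The keyVaultSecretUri workflow parameter is then fed from a template parameter, as in the skeleton above, so each environment can point at its own vault.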
Key Vault template:
{
"name": "string",
"type": "Microsoft.KeyVault/vaults",
"apiVersion": "2018-02-14",
"location": "string",
"tags": {},
"properties": {
"tenantId": "string",
"sku": {
"family": "A",
"name": "string"
},
"accessPolicies": [
{
"tenantId": "string",
"objectId": "string",
"applicationId": "string",
"permissions": {
"keys": [
"string"
],
"secrets": [
"string"
],
"certificates": [
"string"
],
"storage": [
"string"
]
}
}
],
"vaultUri": "string",
"enabledForDeployment": "boolean",
"enabledForDiskEncryption": "boolean",
"enabledForTemplateDeployment": "boolean",
"enableSoftDelete": "boolean",
"createMode": "string",
"enablePurgeProtection": "boolean",
"networkAcls": {
"bypass": "string",
"defaultAction": "string",
"ipRules": [
{
"value": "string"
}
],
"virtualNetworkRules": [
{
"id": "string"
}
]
}
},
"resources": []
}
Moreover, you can read this article to understand more about setting up your ADO pipelines: Integrate ARM templates with Azure Pipelines
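To feed those template parameters per environment, each environment gets its own parameter file that the pipeline passes to the same template. A hedged example with hypothetical names and values (e.g. parameters.test.json):
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": { "value": "my-logicapp-test" },
    "keyVaultName": { "value": "my-keyvault-test" },
    "keyVaultSecretUri": { "value": "https://my-keyvault-test.vault.azure.net/secrets/externalSystemSecret" }
  }
}
The prod file differs only in its values; the templates themselves stay identical across environments.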

Related

Output of Stored Proc in Logic apps not becoming Dynamic content

I've made a stored proc, initialized its output in a variable, and am trying to make the output of the stored proc available as dynamic content in Logic Apps. However, once I put the variable in the Parse JSON step, it does not become dynamic content. Any advice on how to make my stored proc output available as dynamic content?
My output varies with the number of entries I'm able to retrieve, and I think that's my issue. I've set my Parse JSON schema to look for 6 entries and sometimes I get 8-10, which I believe is the problem. How can I make it so that, no matter how many entries come through from my stored proc, I can capture those values as dynamic content and use them?
It looks like you are checking the SQL connector's dynamic content rather than the Parse JSON connector's. Make sure you are looking at the dynamic content of Parse JSON.
Also, after reproducing this on my end, I was able to get the expected results without using Parse JSON at all.
If you follow this process, you can retrieve the values no matter how many entries are in the table. Below is the complete flow that worked for me.
Below is the schema of my logic app:
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Execute_stored_procedure_(V2)": {
"inputs": {
"host": {
"connection": {
"name": "#parameters('$connections')['sql']['connectionId']"
}
},
"method": "post",
"path": "/v2/datasets/#{encodeURIComponent(encodeURIComponent('default'))},#{encodeURIComponent(encodeURIComponent('default'))}/procedures/#{encodeURIComponent(encodeURIComponent('[dbo].[proc1]'))}"
},
"runAfter": {},
"type": "ApiConnection"
},
"Parse_JSON": {
"inputs": {
"content": "#body('Execute_stored_procedure_(V2)')",
"schema": {
"properties": {
"OutputParameters": {
"properties": {},
"type": "object"
},
"ResultSets": {
"properties": {
"Table1": {
"items": {
"properties": {
"firstname": {
"type": "string"
},
"lastname": {
"type": "string"
}
},
"required": [
"firstname",
"lastname"
],
"type": "object"
},
"type": "array"
}
},
"type": "object"
},
"ReturnCode": {
"type": "integer"
}
},
"type": "object"
}
},
"runAfter": {
"Execute_stored_procedure_(V2)": [
"Succeeded"
]
},
"type": "ParseJson"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
}
},
"triggers": {
"manual": {
"inputs": {
"schema": {}
},
"kind": "Http",
"type": "Request"
}
}
},
"parameters": {
"$connections": {
"value": {
"sql": {
"connectionId": "/subscriptions/xxxx/resourceGroups/xxxx/providers/Microsoft.Web/connections/sql",
"connectionName": "sql",
"id": "/subscriptions/xxxx/providers/Microsoft.Web/locations/eastus/managedApis/sql"
}
}
}
}
}
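If you prefer to skip Parse JSON entirely, the rows can also be consumed straight from the stored procedure output with a For each over the result set. A minimal sketch (the For_each_row and Compose_full_name action names are illustrative; the property path follows the schema above):
"For_each_row": {
  "type": "Foreach",
  "foreach": "@body('Execute_stored_procedure_(V2)')?['ResultSets']?['Table1']",
  "actions": {
    "Compose_full_name": {
      "type": "Compose",
      "inputs": "@{items('For_each_row')?['firstname']} @{items('For_each_row')?['lastname']}"
    }
  },
  "runAfter": {
    "Execute_stored_procedure_(V2)": [ "Succeeded" ]
  }
}
Because the loop simply iterates over whatever array comes back, it works whether the procedure returns 6 rows or 10.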

Queue Name is not binding to azure logic app after deployment

I am trying to deploy an Azure logic app to two environments, i.e. dev and prod.
I am using a single JSON file for the logic app, and each environment has its own parameter file. The Azure pipeline picks the parameter file for the environment. All the parameters defined in the parameter files get bound during deployment, but I'm having issues with the queue name.
This is how I am defining parameters in the parameters files:
"parameters": {
"logicAppName": {
"value": "Test-App-dev"
},
"QueueName": { "value": "testdev" }
}
This is how it is used in the path variable of the logic app JSON file. However, after deployment the queue name is not reflected in the Azure portal.
After reproducing this on our end, we found that it is due to an invalid template. Here is how we were able to pull through the required value.
First, create the parameter below in the template's parameters section:
"parameters": {
........
,
"QueueName": {
"defaultValue": "testdev",
"type": "String"
}
}
Now add the same parameter in the workflow resource's parameters section:
"parameters": {
"$connections": {
"value": {
"servicebus_1": {
"connectionId": "[parameters('connections_servicebus_1_externalid')]",
"connectionName": "servicebus-1",
"id": "***"
}
}
},
"QueueNamedev": {
"value": "[parameters('QueueName')]"
}
}
Then declare the same parameter in the workflow definition's parameters:
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
},
"$QueueNamedev": {
"defaultValue": {},
"type": "String"
}
}
RESULT:
Finally, we were able to read its value as 'testdev'. You can check from the run's parameters that the value of QueueNamedev is read as testdev.
Below is the template of my logic app
{
"$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"workflows_LogicApp12_name": {
"defaultValue": "LogicApp12",
"type": "String"
},
"connections_servicebus_1_externalid": {
"defaultValue": "/subscriptions/<YourSubscriptionID>/resourceGroups/<YourResourceGroup>/providers/Microsoft.Web/connections/servicebus-1",
"type": "String"
},
"QueueName": {
"defaultValue": "testdev",
"type": "String"
}
},
"variables": {},
"resources": [
{
"type": "Microsoft.Logic/workflows",
"apiVersion": "2017-07-01",
"name": "[parameters('workflows_LogicApp12_name')]",
"location": "centralus",
"properties": {
"state": "Enabled",
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
},
"QueueNamedev": {
"defaultValue": {},
"type": "String"
}
},
"triggers": {
"When_a_message_is_received_in_a_queue_(auto-complete)": {
"recurrence": {
"frequency": "Minute",
"interval": 3
},
"evaluatedRecurrence": {
"frequency": "Minute",
"interval": 3
},
"type": "ApiConnection",
"inputs": {
"host": {
"connection": {
"name": "#parameters('$connections')['servicebus_1']['connectionId']"
}
},
"method": "get",
"path": "/#{encodeURIComponent(encodeURIComponent(parameters('QueueNamedev')))}/messages/head",
"queries": {
"queueType": "Main"
}
}
}
},
"actions": {},
"outputs": {}
},
"parameters": {
"$connections": {
"value": {
"servicebus_1": {
"connectionId": "[parameters('connections_servicebus_1_externalid')]",
"connectionName": "servicebus-1",
"id": "/subscriptions/<YourSubscriptionID>/providers/Microsoft.Web/locations/centralus/managedApis/servicebus"
}
}
},
"QueueNamedev": {
"value": "[parameters('QueueName')]"
}
}
}
}
]
}
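For the prod deployment, only the parameter file changes. A hedged example of what a prod parameters file for the template above might look like (values are illustrative):
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "workflows_LogicApp12_name": { "value": "Test-App-prod" },
    "QueueName": { "value": "testprod" }
  }
}
The pipeline picks this file for the prod stage, and the QueueName value flows through the QueueNamedev workflow parameter into the trigger path.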

Azure sql private endpoint ARM template help needed

{
"type": "Microsoft.Network/privateEndpoints",
"apiVersion": "2020-06-01",
"name": "[variables('privateEndpointName')]",
"location": "[parameters('location')]",
"dependsOn": [
"[resourceId('Microsoft.Sql/servers', variables('sqlServerName'))]"
],
"properties": {
"subnet": {
**id": "[concat(resourceId('Microsoft.Network/virtualNetworks', variables('vnetName')),'/subnets/default')]"')]"**
},
"privateLinkServiceConnections": [
{
"name": "[variables('privateEndpointName')]",
"properties": {
"privateLinkServiceId": "[resourceId('Microsoft.Sql/servers',variables('sqlServerName'))]",
"groupIds": [
"sqlServer"
]
I am getting an error that the VNet resource is not found. The VNet is located in a different subscription from the one where the template is running.
How do I get the VNet ID from a different subscription using an ARM template?
The default is the current subscription. Specify the subscription and resource group to resolve the issue.
"[resourceId('11111111-1111-1111-1111-111111111111', 'otherResourceGroup', 'Microsoft.Storage/storageAccounts','examplestorage')]"
link to MS docs
The issue is just a wrong parameter reference. You can reference the VNet in a different subscription for the "id" property by generating its ID with the resourceId function, which returns the unique identifier of a resource. You use this function when the resource name is ambiguous or not provisioned within the same template.
The default is the current subscription. Since you need to retrieve a resource in another subscription, specify the subscriptionId, resourceGroupName, resourceType, and resourceName values for that subscription.
Changes:
From:
"id": resourceId([subscriptionId], [resourceGroupName], resourceType, resourceName1, [resourceName2], ...)
To:
"id": "[resourceId('OthersubscriptionId','virtualNetworkResourceGroup', 'Microsoft.Network/virtualNetworks/subnets', 'virtualNetworkName', 'subnet1Name')]"
This is how it looks:
{
"type": "Microsoft.Network/privateEndpoints",
"apiVersion": "2020-06-01",
"name": "[variables('privateEndpointName')]",
"location": "[parameters('location')]",
"dependsOn": [
"[resourceId('Microsoft.Sql/servers', variables('sqlServerName'))]"
],
"properties": {
"subnet": {
"id": "[resourceId('xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx', 'virtualNetworkResourceGroup', 'Microsoft.Network/virtualNetworks/subnets', 'virtualNetworkName', 'subnet1Name')]"
},
"privateLinkServiceConnections": [
{
"name": "[variables('privateEndpointName')]",
"properties": {
"privateLinkServiceId": "[resourceId('Microsoft.Sql/servers',variables('sqlServerName'))]",
"groupIds": [
"sqlServer"
]
}
}
]
}
},
Refer: Here and resourceId
Update ---
Additional example: To get the resource ID for a resource in a different subscription and resource group, provide the subscription ID and resource group name.
"[resourceId('xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx', 'otherResourceGroup', 'Microsoft.Storage/storageAccounts','examplestorage')]"
The following example shows how a resource from an external resource group can easily be used:
{
"$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"location": {
"type": "string"
},
"virtualNetworkName": {
"type": "string"
},
"virtualNetworkResourceGroup": {
"type": "string"
},
"subnet1Name": {
"type": "string"
},
"nicName": {
"type": "string"
}
},
"variables": {
"subnet1Ref": "[resourceId(parameters('virtualNetworkResourceGroup'), 'Microsoft.Network/virtualNetworks/subnets', parameters('virtualNetworkName'), parameters('subnet1Name'))]"
},
"resources": [
{
"type": "Microsoft.Network/networkInterfaces",
"apiVersion": "2015-05-01-preview",
"name": "[parameters('nicName')]",
"location": "[parameters('location')]",
"properties": {
"ipConfigurations": [
{
"name": "ipconfig1",
"properties": {
"privateIPAllocationMethod": "Dynamic",
"subnet": {
"id": "[variables('subnet1Ref')]"
}
}
}
]
}
}
]
}
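The sample above stays within one subscription; for the cross-subscription case in the question, the subscription ID is passed as the first argument to resourceId. A hedged sketch, adding a hypothetical vnetSubscriptionId parameter:
"parameters": {
  "vnetSubscriptionId": { "type": "string" },
  "virtualNetworkResourceGroup": { "type": "string" },
  "virtualNetworkName": { "type": "string" },
  "subnet1Name": { "type": "string" }
},
"variables": {
  "subnet1Ref": "[resourceId(parameters('vnetSubscriptionId'), parameters('virtualNetworkResourceGroup'), 'Microsoft.Network/virtualNetworks/subnets', parameters('virtualNetworkName'), parameters('subnet1Name'))]"
}
variables('subnet1Ref') can then be used directly as the subnet id of the private endpoint.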
---

ARM template deployment for Logic Apps with Datalake API Connection

Error: "The API connection 'azuredatalake' is not configured to support managed identity."
I tried to deploy an Azure logic app along with an API connection to access Azure Data Lake Gen1 using Managed Identity. This failed with the above error.
I also deployed the API connection separately, which succeeded but with the status "Unauthenticated", so the deployment of the corresponding logic app failed because of the missing API connection.
Partial template for the API connection, for reference:
{
"type": "Microsoft.Web/connections",
"apiVersion": "2016-06-01",
"name": "[parameters('azuredatalake_1_Connection_Name')]",
"location": "[parameters('location')]",
"kind": "V1",
"properties": {
"displayName": "azuredatalakemsi",
"customParameterValues": {},
"api": {
"id": "[variables('managedadlsApi')]"
}
}
}
If we want to access data stored in Data Lake Gen1, we must configure the right ACLs for the user or service principal used for authentication; otherwise we will not have permission to access the data. For more details, please refer to the official documentation. Note that the ACLs cannot be configured via an ARM template; this has to be done with PowerShell or the portal.
Regarding how to use MSI to access Azure Data Lake Gen1 from an Azure logic app, please refer to the following steps:
1. Enable MSI in the Azure logic app:
{
"apiVersion": "2016-06-01",
"type": "Microsoft.logic/workflows",
"name": "[variables('logicappName')]",
"location": "[resourceGroup().location]",
"identity": {
"type": "SystemAssigned"
},
"properties": {
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {},
"parameters": {},
"triggers": {},
"contentVersion": "1.0.0.0",
"outputs": {}
},
"parameters": {},
"dependsOn": []
}
2. Configure the ACLs on the Data Lake Gen1 account.
3. Create the connection and reference it from the workflow:
{
"type": "Microsoft.Web/connections",
"apiVersion": "2016-06-01",
"name": "[parameters('azuredatalake_1_Connection_Name')]",
"location": "[parameters('location')]",
"tags": {
"CreatedTime": "2021-05-24T03:11:28.9371899Z"
},
"kind": "V1",
"properties": {
"displayName": "test",
"customParameterValues": {},
"api": {
"id": "[concat(subscription().id,'/providers/Microsoft.Web/locations/', parameters('location'), '/managedApis/azuredatalake')]"
}
}
},
{
"type": "Microsoft.Logic/workflows",
"apiVersion": "2017-07-01",
"name": "[parameters('workflows_testlogic05_name')]",
"location": "[parameters('location')]",
"dependsOn": [
"[resourceId('Microsoft.Web/connections', parameters('azuredatalake_1_Connection_Name'))]"
],
"properties": {
"state": "Enabled",
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
}
},
"triggers": {
},
"actions": {
},
"outputs": {}
},
"parameters": {
"$connections": {
"value": {
"azuredatalake": {
"connectionId": "[resourceId('Microsoft.Web/connections', parameters('azuredatalake_1_Connection_Name'))]",
"connectionName": "azuredatalake",
"connectionProperties": {
"authentication": {
"type": "ManagedServiceIdentity"
}
},
"id": "[concat(subscription().id,'/providers/Microsoft.Web/locations/', parameters('location'), '/managedApis/azuredatalake')]"
}
}
}
}
}
}
For more details, please refer to
https://learn.microsoft.com/en-us/azure/logic-apps/create-managed-service-identity
I was able to solve this by modifying the ARM template for this API connection, namely by adding "parameterValueType": "Alternative".
Deployment of the Azure logic app along with the API connection to access Azure Data Lake Gen1 using Managed Identity is now successful.
Partial template for the API connection, for reference:
{
"type": "Microsoft.Web/connections",
"apiVersion": "2016-06-01",
"name": "[parameters('azuredatalake_1_Connection_Name')]",
"location": "[parameters('location')]",
"kind": "V1",
"properties": {
"displayName": "azuredatalakemsi",
"parameterValueType": "Alternative",
"customParameterValues": {},
"api": {
"id": "[variables('managedadlsApi')]"
}
}
}

Send workflow information to custom connector

I need help with sending workflow information in the header/body of calls to a custom connector. I am trying to load a drop-down list for one of the parameters of a logic app using values returned from an API call. The API endpoint requires basic workflow information, such as the resource group and workflow name, which are normally available in the headers of HTTP requests during logic app execution.
Normally, when I use @{workflow().name} in the logic app's JSON, it is substituted with the workflow name. In the case of the custom connector, the WDL syntax is passed as-is without any transformation.
Here is a simplified swagger JSON with all relevant sections.
{
"swagger": "2.0",
"info": {
"title": "{{dynamicHostName}}",
"version": "1.0.0"
},
"host": "{{dynamicHostName}}",
"basePath": "/",
"schemes": [
"https"
],
"paths": {
"/sftpsource": {
"post": {
"operationId": "SftpSource",
"summary": "Sftp as source system",
"description": "Use Sftp as source system",
"produces": [
"application/json"
],
"consumes": [
"application/json"
],
"parameters": [
{
"name": "params",
"in": "body",
"required": true,
"schema": {
"type": "object",
"properties": {
"hostName": {
"type": "string",
"x-ms-summary": "Host Name",
"x-ms-visibility": "advanced"
},
"portNumber": {
"type": "string",
"x-ms-summary": "Port Number",
"x-ms-visibility": "advanced"
},
"userName": {
"type": "string",
"x-ms-summary": "User Name",
"x-ms-visibility": "advanced"
},
"password": {
"type": "string",
"x-ms-summary": "Password",
"x-ms-visibility": "advanced"
},
"filePath": {
"type": "string",
"x-ms-summary": "File Path"
},
"system": {
"type": "string",
"x-ms-visibility": "advanced",
"x-ms-summary": "System",
"x-ms-dynamic-values": {
"operationId": "GetTaggedSystems",
"parameters": {
"workflow-name": "#{workflow().name}"
},
"value-path": "systemId",
"value-title": "systemName"
}
}
}
}
}
],
"responses": {
"202": {
"description": "Request is queued"
},
"500": {
"description": "Server Error"
}
}
}
},
"/taggedSystems" : {
"get": {
"operationId": "GetTaggedSystems",
"summary": "Tagged Systems",
"x-ms-visibility": "advanced",
"description": "Get all systems tagged to this flow",
"parameters": [
{
"name": "workflow-name",
"in": "header",
"required": true,
"type": "string"
}
],
"responses": {
"200": {
"description": "OK",
"schema": {
"$ref": "#/definitions/TaggedSystems"
}
},
"202": {
"description": "Work is still in progress"
},
"500": {
"description": "An error occured while trying to fetch tagged systems."
}
}
}
}
},
"definitions": {
"TaggedSystems": {
"type": "array",
"items": {
"type": "object",
"properties": {
"systemId": {
"type": "string"
},
"systemName": {
"type": "string"
}
},
"required": [
"systemId",
"systemName"
]
}
}
}
}
You can define an internal parameter in the header/body and pass dynamic expressions for whatever you need from the workflow environment at the time the flow executes.
For complete information about the workflow, you can pass @{workflow()} as an internal parameter.
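One possible reading of that suggestion, sketched against the swagger above (assuming, as the answer states, that the expression is evaluated when the connector is called), is to hide the workflow-name header from the designer and give it the expression as its default value:
{
  "name": "workflow-name",
  "in": "header",
  "required": true,
  "type": "string",
  "x-ms-visibility": "internal",
  "default": "@{workflow().name}"
}
With x-ms-visibility set to internal, the maker never sees the field, and the default would carry the workflow name on each call to GetTaggedSystems.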
Hope it helps.
