Is there a procedure for migrating azure logic app from one account to another account - azure-logic-apps

I have two Azure accounts with different subscriptions, and I want to migrate an Azure Logic App from the old account to the new account. Please help with the steps for the migration.

Navigate to your logic app. In the Overview pane, next to Subscription, you can find the Move option, which lets you migrate the app from the current account to another one.
Select the destination subscription and resource group from the Subscription and Resource group lists, and then select OK.
REFERENCES:
Move logic apps across subscriptions, resource groups, or regions
For Cloning
WAY - 1
One workaround that you can try is to clone the logic app into the same subscription and resource group and then move it to another subscription.
WAY - 2 Deploy the ARM template of your logic app to another subscription.
Step-1
Navigate to Export template and click Deploy.
Step-2
Change the subscription ID of your connectors, then select the subscription and resource group to which you want to deploy.
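For reference, the connector references you typically need to update live in the template's "$connections" parameter of the workflow resource. Below is a minimal, hedged sketch of what that block can look like in an exported Logic App template; the connector name "office365" and the placeholder IDs are purely illustrative and will differ in your own export.
"$connections": {
    "value": {
        "office365": {
            "connectionId": "/subscriptions/<target-subscription-id>/resourceGroups/<target-resource-group>/providers/Microsoft.Web/connections/office365",
            "connectionName": "office365",
            "id": "/subscriptions/<target-subscription-id>/providers/Microsoft.Web/locations/<target-region>/managedApis/office365"
        }
    }
}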

You made a comment that you want to clone your LogicApp to another subscription/account. If the first answer isn't what you want, then you can (in your words) clone the app by copying the code view and pasting it into a new LogicApp in the other subscription/account.
For example, this is a JSON definition for an extremely basic LogicApp I just created ...
{
    "definition": {
        "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
        "actions": {
            "Initialize_variable": {
                "inputs": {
                    "variables": [
                        {
                            "name": "Text String",
                            "type": "string",
                            "value": "This is a test"
                        }
                    ]
                },
                "runAfter": {},
                "type": "InitializeVariable"
            }
        },
        "contentVersion": "1.0.0.0",
        "outputs": {},
        "parameters": {},
        "triggers": {
            "Recurrence": {
                "evaluatedRecurrence": {
                    "frequency": "Month",
                    "interval": 12
                },
                "recurrence": {
                    "frequency": "Month",
                    "interval": 12
                },
                "type": "Recurrence"
            }
        }
    },
    "parameters": {}
}
You can literally take that JSON definition and paste it into any existing LogicApp and it will produce the same flow.

Related

In Azure Logic App how to iterate over a dictionary

I have an Azure Logic App where I want to iterate over a dictionary, as shown below.
"test": {
    "text1": "qabdstw1234",
    "text2": "vhry46578"
},
In my Logic App I am able to iterate over a list of dictionaries, but here I want to iterate over a single dictionary.
Does anybody know how to do it?
I have used Parse JSON in order to fetch the text1 data from the JSON which you have provided, and then used the Compose connector to get the same. Here is my Logic App:
Here is the schema generated from the sample JSON payload which you have provided:
{
    "type": "object",
    "properties": {
        "title": {
            "type": "string"
        },
        "text1": {
            "type": "array",
            "items": {
                "type": "string"
            }
        },
        "id": {
            "type": "string"
        },
        "status": {
            "type": "string"
        }
    }
}
Add a new action, search for Control and select the For each:
To process an array in your logic app, you can create a "Foreach" loop. This loop repeats one or more actions on each item in the array.
Source: Create loops that repeat workflow actions or process arrays in Azure Logic Apps
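For reference, here is a minimal, hedged sketch in workflow definition JSON (the action names and the trigger-body content expression are assumptions for illustration) of a Parse JSON action followed by a For each that loops over the text1 array from the schema above; adapt the content expression and schema to your own payload.
"actions": {
    "Parse_JSON": {
        "type": "ParseJson",
        "runAfter": {},
        "inputs": {
            "content": "@triggerBody()",
            "schema": {
                "type": "object",
                "properties": {
                    "text1": {
                        "type": "array",
                        "items": { "type": "string" }
                    }
                }
            }
        }
    },
    "For_each": {
        "type": "Foreach",
        "runAfter": {
            "Parse_JSON": [ "Succeeded" ]
        },
        "foreach": "@body('Parse_JSON')?['text1']",
        "actions": {
            "Compose": {
                "type": "Compose",
                "runAfter": {},
                "inputs": "@item()"
            }
        }
    }
}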

80 Postgres databases in Azure Data Factory to copy into

I'm using Azure Data Factory.
I'm using SQL Server as the source and Postgres as the target. The goal is to copy 30 tables from SQL Server, with transformation, to 30 tables in Postgres. The hard part is that I have 80 source and 80 target databases, all with the exact same layout but different data. It's one database per customer, so 80 customers each with their own database.
Linked services don't allow parameters for Postgres.
I have one dataset per source and target using parameters for schema and table names.
I have one pipeline per table with a SQL Server source and a Postgres target.
I can parameterize the SQL Server source in the linked service, but not Postgres.
The problem is: how can I copy 80 source databases to 80 target databases without adding 80 target linked services and 80 target datasets? Plus, I'd have to repeat all 30 pipelines per target database.
BTW, I'm only familiar with the UI; however, anything else that does the job is acceptable.
Any help would be appreciated.
There is a simple way to implement this. Essentially, you need a single linked service that reads the connection string out of Key Vault. You can then parameterize the source and target as Key Vault secret names, and easily switch between data sources just by changing the secret name. This relies on all connection-related information being enclosed within a single connection string.
I will provide a simple overview for PostgreSQL, but the same logic applies to the SQL Server source.
Implement a linked service for Azure Key Vault.
Add a linked service for Azure PostgreSQL that uses Key Vault to store the connection string in the format: Server=your_server_name.postgres.database.azure.com;Database=your_database_name;Port=5432;UID=your_user_name;Password=your_password;SSL Mode=Require;Keepalive=600; (I advise using the server name as the secret name).
Pass this parameter, which is essentially the secret name, into the pipeline (you can also implement a loop that accepts an array of secret names and processes them through a separate pipeline; a hedged sketch of such a loop follows the pipeline definition below).
Linked Service Definition for KeyVault:
{
    "name": "your_keyvault_name",
    "properties": {
        "description": "KeyVault",
        "annotations": [],
        "type": "AzureKeyVault",
        "typeProperties": {
            "baseUrl": "https://your_keyvault_name.vault.azure.net/"
        }
    }
}
Linked Service Definition for Postgresql:
{ "name": "generic_postgres_service".
"properties": {
"type": "AzurePostgreSql",
"parameters": {
"pg_database": {
"type": "string",
"defaultValue": "your_database_name"
}
},
"annotations": [],
"typeProperties": {
"connectionString": {
"type": "AzureKeyVaultSecret",
"store": {
"referenceName": "KeyVaultName",
"type": "LinkedServiceReference"
},
"secretName": "#linkedService().secret_name_for_server"
}
},
"connectVia": {
"referenceName": "AutoResolveIntegrationRuntime",
"type": "IntegrationRuntimeReference"
}
}
}
Dataset Definition for Postgresql:
{
    "name": "your_postgresql_dataset",
    "properties": {
        "linkedServiceName": {
            "referenceName": "generic_postgres_service",
            "type": "LinkedServiceReference",
            "parameters": {
                "secret_name_for_server": {
                    "value": "@dataset().secret_name_for_server",
                    "type": "Expression"
                }
            }
        },
        "parameters": {
            "secret_name_for_server": {
                "type": "string"
            },
            "schema_name": {
                "type": "string"
            },
            "table_name": {
                "type": "string"
            }
        },
        "annotations": [],
        "type": "AzurePostgreSqlTable",
        "schema": [],
        "typeProperties": {
            "schema": {
                "value": "@dataset().schema_name",
                "type": "Expression"
            },
            "table": {
                "value": "@dataset().table_name",
                "type": "Expression"
            }
        }
    }
}
Pipeline Definition for Postgresql:
{
    "name": "your_postgres_pipeline",
    "properties": {
        "activities": [
            {
                "name": "Copy_Activity_1",
                "type": "Copy",
                "dependsOn": [],
                "policy": {
                    "timeout": "7.00:00:00",
                    "retry": 0,
                    "retryIntervalInSeconds": 30,
                    "secureOutput": false,
                    "secureInput": false
                },
                ...
                ... part of the definition skipped ...
                ...
                "inputs": [
                    {
                        "referenceName": "your_postgresql_dataset",
                        "type": "DatasetReference",
                        "parameters": {
                            "secret_name_for_server": "secret_name"
                        }
                    }
                ]
            }
        ],
        "annotations": []
    }
}
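As mentioned above, you can drive this from a loop. Below is a hedged sketch of an outer pipeline that iterates over an array of secret names and executes a copy pipeline once per customer database. The names "loop_over_customer_databases", "copy_single_customer" and the sample secret names are assumptions; the inner pipeline is assumed to expose a "secret_name_for_server" parameter that it forwards to the dataset.
{
    "name": "loop_over_customer_databases",
    "properties": {
        "parameters": {
            "secret_names": {
                "type": "array",
                "defaultValue": [ "customer1_server", "customer2_server" ]
            }
        },
        "activities": [
            {
                "name": "ForEach_Customer",
                "type": "ForEach",
                "typeProperties": {
                    "items": {
                        "value": "@pipeline().parameters.secret_names",
                        "type": "Expression"
                    },
                    "isSequential": false,
                    "activities": [
                        {
                            "name": "Run_Copy_Pipeline",
                            "type": "ExecutePipeline",
                            "typeProperties": {
                                "pipeline": {
                                    "referenceName": "copy_single_customer",
                                    "type": "PipelineReference"
                                },
                                "waitOnCompletion": true,
                                "parameters": {
                                    "secret_name_for_server": "@item()"
                                }
                            }
                        }
                    ]
                }
            }
        ],
        "annotations": []
    }
}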

Azure Logic Apps parameters - how to use it?

For a few weeks now, there has been a new "Parameters" section in the Logic Apps designer. We can create parameters and use these parameters in the Logic Apps:
Unfortunately, there are two things that are missing (or not working yet):
We are not able to set the "current value" of a parameter (the Actual Value field is grayed out).
When we export the Logic App as an ARM template, the parameters are not used in the ARM template as ARM parameters.
Am I missing something, or is it just due to the fact that these features are not yet deployed?
I'm not sure either how default/actual values should work in the Logic App designer, but for exporting the ARM template with the (designer) parameters I'm using the latest build of LogicAppTemplateCreator, and it works for me.
The template snippet contains the "paramDateFrom" parameter with its default value, and it is used in the parameters section of the ARM template:
...
    "paramDateFrom": {
        "type": "string",
        "defaultValue": "2019-12-19"
    }
},
"variables": {},
"resources": [
    {
        "type": "Microsoft.Logic/workflows",
        "apiVersion": "2016-06-01",
        "name": "[parameters('logicAppName')]",
        "location": "[parameters('logicAppLocation')]",
        "dependsOn": [],
        "properties": {
            "definition": {
                "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
                "contentVersion": "1.0.0.0",
                "parameters": {
                    "DateFrom": {
                        "defaultValue": "[parameters('paramDateFrom')]",
                        "type": "String"
                    }
                },
...
The template parameters file contains the actual value:
...
"parameters": {
    "paramDateFrom": {
        "value": "2019-12-20"
    }
}
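For completeness, here is a hedged sketch of how an action inside the workflow definition could consume that parameter (the Compose action name is just illustrative):
"actions": {
    "Compose_DateFrom": {
        "type": "Compose",
        "runAfter": {},
        "inputs": "@parameters('DateFrom')"
    }
}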

Where can I provide filter for Azure Service Bus Topic Subscription from my logic app

I've created a service bus, topic, and subscription to that topic in Azure. I have a logic app that is triggered when a message arrives but I need to apply a filter (or rule?) to that subscription where it looks for a particular value in the message header before the logic app processes the message. I don't see anywhere in the logic app or in the Azure portal to create filters for the subscriptions. What mechanism exists to create a filter for a subscription?
How did you create your topic subscription? Did you use ARM templates?
When you are creating the subscription, you can add a SqlFilter to the rule applied in the topic subscription.
The ARM template below (taken from here) shows you how to add a SqlFilter to the Rule in a Topic Subscription.
"resources": [{
"apiVersion": "[variables('sbVersion')]",
"name": "[parameters('serviceBusNamespaceName')]",
"type": "Microsoft.ServiceBus/Namespaces",
"location": "[variables('location')]",
"sku": {
"name": "Standard",
},
"resources": [{
"apiVersion": "[variables('sbVersion')]",
"name": "[parameters('serviceBusTopicName')]",
"type": "Topics",
"dependsOn": [
"[concat('Microsoft.ServiceBus/namespaces/', parameters('serviceBusNamespaceName'))]"
],
"properties": {
"path": "[parameters('serviceBusTopicName')]"
},
"resources": [{
"apiVersion": "[variables('sbVersion')]",
"name": "[parameters('serviceBusSubscriptionName')]",
"type": "Subscriptions",
"dependsOn": [
"[parameters('serviceBusTopicName')]"
],
"properties": {},
"resources": [{
"apiVersion": "[variables('sbVersion')]",
"name": "[parameters('serviceBusRuleName')]",
"type": "Rules",
"dependsOn": [
"[parameters('serviceBusSubscriptionName')]"
],
"properties": {
"filterType": "SqlFilter",
"sqlFilter": {
"sqlExpression": "StoreName = 'Store1'",
"requiresPreprocessing": "false"
},
"action": {
"sqlExpression": "set FilterTag = 'true'"
}
}
}]
}]
}]
}]
You need to add your filter, using SQL-like expressions, in the properties member of the Rules sub-resource,
e.g.
"sqlExpression": "YourMessageProperty='YourExpectedValue'",
If you are not using ARM Templates, the Service Bus Explorer allows you to remove the default subscription rule and create a new one with your own SqlFilter.
HTH

Alexa skill Rest API

Can we use a REST API instead of using Lambda? The reason I'm asking is because we get the request, we know what Alexa accepts as a response, and we know that it is a POST, so we could connect all of these into a REST API. The whole project is based on JAX-RS, so we want to have it all in one place, without using Lambda or anything. Not that Lambda isn't great.
So the request that Alexa passes to Lambda is:
{
    "session": {
        "sessionId": "SessionId.a82f0b92-3650-4d45-8f12-e030ffc10894",
        "application": {
            "applicationId": "amzn1.echo-sdk-ams.app.8f35038e-13ac-4327-8e4f-e5df52dc1432"
        },
        "attributes": {},
        "user": {
            "userId": "amzn1.ask.account.AFP3ZWPOS2BGJR7OWJZ3DHPKMOMNWY4AY66FUR7ILBWANIHQN73QGGUEQZ7YXOLC7NYVD3JPUAHAGUS4ZFXJ6ZMS4EHO2CJFPWFLWLYZLDP7S227ADI54A2ZMLZLDO5CXSIB47ELNY54S2M7FDNJFHTSU67B7HB3UZUN6OUUR5BYS3UBRSIPBG4IWRLHUN36NXDYBWUM3NMQZRA"
        },
        "new": true
    },
    "request": {
        "type": "IntentRequest",
        "requestId": "EdwRequestId.bfdb3c27-028b-4224-977a-558129808e9a",
        "timestamp": "2016-07-11T17:52:55Z",
        "intent": {
            "name": "HelloWorldIntent",
            "slots": {}
        },
        "locale": "en-US"
    },
    "version": "1.0"
}
Response:
{
    "version": "1.0",
    "response": {
        "outputSpeech": {
            "type": "PlainText",
            "text": "Hello World!"
        },
        "card": {
            "content": "Hello World!",
            "title": "Greeter",
            "type": "Simple"
        },
        "shouldEndSession": true
    },
    "sessionAttributes": {}
}
Sure you can. In fact, when you are creating your skill in the Alexa Developer Portal, you have that option. The caveat is that you will need to manage your own TLS certificate and will have to make sure that the latency/responsiveness is decent based on the location of your users.
If you would like to explore this further, you can use Amazon's Java code examples. They can be found at: https://github.com/amzn/alexa-skills-kit-java.
You can definitely set up a RESTful service API for use with Alexa.
And, if you set it up in Azure, you don't even need to create your own certificate.
You can use a REST API as the endpoint for Alexa skills. The APIs will be invoked in the following manner:
[Configured_URL]/alexa/[intent]
where [Configured_URL] is the URL endpoint configured on the Amazon site for invoking the skill, and [intent] is the name of the intent.
You should host your service accordingly.
https://developer.amazon.com/public/solutions/alexa/alexa-skills-kit/docs/developing-an-alexa-skill-as-a-web-service
https://iwritecrappycode.wordpress.com/2016/04/01/create-an-alexa-skill-in-node-js-and-hosting-it-on-heroku/
