80 Postgres databases in Azure Data Factory to copy into

I'm using Azure Data Factory, with SQL Server as the source and Postgres as the target. The goal is to copy 30 tables from SQL Server, with transformation, to 30 tables in Postgres. The hard part is that I have 80 source and 80 target databases, all with the exact same layout but different data. It's one database per customer, so 80 customers, each with their own database.
The Linked Service doesn't allow parameters for Postgres.
I have one dataset per source and target, using parameters for schema and table names.
I have one pipeline per table, with a SQL Server source and a Postgres target.
I can parameterize the SQL Server source linked service, but not the Postgres one.
The problem is: how can I copy 80 source databases to 80 target databases without adding 80 target linked services and 80 target datasets? Plus I'd have to repeat all 30 pipelines per target database.
BTW, I'm only familiar with the UI; however, anything else that does the job is acceptable.
Any help would be appreciated.

There is a simple way to implement this. Essentially, you need a single Linked Service that reads the connection string out of Key Vault. You can then parameterize source and target as Key Vault secret names, and easily switch between data sources by just changing the secret name. This relies on all connection-related information being enclosed within a single connection string.
I will provide a simple overview for PostgreSQL, but the same logic applies to SQL Server as the source.
Implement a Linked Service for Azure Key Vault.
Add a Linked Service for Azure PostgreSQL that uses Key Vault to store the connection string in the format: Server=your_server_name.postgres.database.azure.com;Database=your_database_name;Port=5432;UID=your_user_name;Password=your_password;SSL Mode=Require;Keepalive=600; (I advise using the server name as the secret name).
Pass this parameter, which is essentially the correct secret name, in the pipeline (you can also implement a ForEach loop that accepts an array of secret names and runs the copy once per element; see the sketch after the pipeline definition below).
Linked Service Definition for KeyVault:
{
  "name": "your_keyvault_name",
  "properties": {
    "description": "KeyVault",
    "annotations": [],
    "type": "AzureKeyVault",
    "typeProperties": {
      "baseUrl": "https://your_keyvault_name.vault.azure.net/"
    }
  }
}
Linked Service Definition for Postgresql:
{ "name": "generic_postgres_service".
"properties": {
"type": "AzurePostgreSql",
"parameters": {
"pg_database": {
"type": "string",
"defaultValue": "your_database_name"
}
},
"annotations": [],
"typeProperties": {
"connectionString": {
"type": "AzureKeyVaultSecret",
"store": {
"referenceName": "KeyVaultName",
"type": "LinkedServiceReference"
},
"secretName": "#linkedService().secret_name_for_server"
}
},
"connectVia": {
"referenceName": "AutoResolveIntegrationRuntime",
"type": "IntegrationRuntimeReference"
}
}
}
Dataset Definition for Postgresql:
{
  "name": "your_postgresql_dataset",
  "properties": {
    "linkedServiceName": {
      "referenceName": "generic_postgres_service",
      "type": "LinkedServiceReference",
      "parameters": {
        "secret_name_for_server": {
          "value": "@dataset().secret_name_for_server",
          "type": "Expression"
        }
      }
    },
    "parameters": {
      "secret_name_for_server": {
        "type": "string"
      },
      "schema_name": {
        "type": "string"
      },
      "table_name": {
        "type": "string"
      }
    },
    "annotations": [],
    "type": "AzurePostgreSqlTable",
    "schema": [],
    "typeProperties": {
      "schema": {
        "value": "@dataset().schema_name",
        "type": "Expression"
      },
      "table": {
        "value": "@dataset().table_name",
        "type": "Expression"
      }
    }
  }
}
Pipeline Definition for Postgresql:
{
  "name": "your_postgres_pipeline",
  "properties": {
    "activities": [
      {
        "name": "Copy_Activity_1",
        "type": "Copy",
        "dependsOn": [],
        "policy": {
          "timeout": "7.00:00:00",
          "retry": 0,
          "retryIntervalInSeconds": 30,
          "secureOutput": false,
          "secureInput": false
        },
        ...
        ... (rest of the copy activity definition skipped)
        ...
        "inputs": [
          {
            "referenceName": "your_postgresql_dataset",
            "type": "DatasetReference",
            "parameters": {
              "secret_name_for_server": "secret_name"
            }
          }
        ]
      }
    ],
    "annotations": []
  }
}
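To fan this out across all 80 customer databases without duplicating pipelines, a wrapper pipeline can loop over an array of secret names and call the copy pipeline once per customer. This is only a minimal sketch: it assumes the copy pipeline above is changed to expose a secret_name parameter (instead of the hard-coded value) and pass it on to the dataset, and the secret_names parameter, the default values and the activity names are placeholders of my own.
{
  "name": "copy_all_customers_pipeline",
  "properties": {
    "parameters": {
      "secret_names": {
        "type": "array",
        "defaultValue": [ "customer1_server", "customer2_server" ]
      }
    },
    "activities": [
      {
        "name": "ForEach_Customer",
        "type": "ForEach",
        "dependsOn": [],
        "typeProperties": {
          "items": {
            "value": "@pipeline().parameters.secret_names",
            "type": "Expression"
          },
          "isSequential": false,
          "batchCount": 10,
          "activities": [
            {
              "name": "Run_Copy_For_Customer",
              "type": "ExecutePipeline",
              "typeProperties": {
                "pipeline": {
                  "referenceName": "your_postgres_pipeline",
                  "type": "PipelineReference"
                },
                "waitOnCompletion": true,
                "parameters": {
                  "secret_name": {
                    "value": "@item()",
                    "type": "Expression"
                  }
                }
              }
            }
          ]
        }
      }
    ],
    "annotations": []
  }
}
Whether to run customers sequentially or in parallel (isSequential / batchCount) mostly depends on how much load the source servers and the Postgres targets can take.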

Related

How to add required to sub array in a json schema?

I'm creating a JSON schema to define the necessary data with data types. Some of the data needs to be set as required fields, but I didn't find how to do that in the documentation.
For this JSON schema:
{
  "type": "object",
  "required": [
    "version",
    "categories"
  ],
  "properties": {
    "version": {
      "type": "string",
      "minLength": 1,
      "maxLength": 1
    },
    "categories": {
      "type": "array",
      "items": [
        {
          "title": {
            "type": "string",
            "minLength": 1
          },
          "body": {
            "type": "string",
            "minLength": 1
          }
        }
      ]
    }
  }
}
and JSON like this:
{
  "version": "1",
  "categories": [
    {
      "title": "First",
      "body": "Good"
    },
    {
      "title": "Second",
      "body": "Bad"
    }
  ]
}
I want to set title to be required too. It's in a sub-array. How do I set that in the JSON schema?
There are a few things wrong with your schema. I'm going to assume you're using JSON Schema draft 2019-09.
First, you want items to be an object, not an array, as you want it to apply to every item in the array.
If "items" is a schema, validation succeeds if all elements in the
array successfully validate against that schema.
If "items" is an array of schemas, validation succeeds if each
element of the instance validates against the schema at the same
position, if any.
https://datatracker.ietf.org/doc/html/draft-handrews-json-schema-02#section-9.3.1.1
Second, if the value of items should be a schema, you need to treat it like a schema in its own right.
If we take the item from your items array as a schema, it doesn't actually do anything, and you need to nest it in a properties keyword...
{
  "properties": {
    "title": {
      "type": "string",
      "minLength": 1
    },
    "body": {
      "type": "string",
      "minLength": 1
    }
  }
}
Finally, now that your items keyword value is a schema (subschema), you can add any keywords you can normally use, such as required, the same as you have done previously.
{
  "required": [
    "title"
  ],
  "properties": {
    ...
  }
}
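Putting those two changes together, the whole schema would look something like this (note the added "type": "object" on the item schema, a small tightening your original didn't have):
{
  "type": "object",
  "required": [
    "version",
    "categories"
  ],
  "properties": {
    "version": {
      "type": "string",
      "minLength": 1,
      "maxLength": 1
    },
    "categories": {
      "type": "array",
      "items": {
        "type": "object",
        "required": [
          "title"
        ],
        "properties": {
          "title": {
            "type": "string",
            "minLength": 1
          },
          "body": {
            "type": "string",
            "minLength": 1
          }
        }
      }
    }
  }
}
With this schema, your example document validates, and any item missing title is rejected.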

In Azure Logic Apps ARM template, what are the possible values for the AuthType property for a SQL Server connector using On-Premise Data Gateway?

I have an Azure Logic App with a SQL Server connector through an On-Premises Data Gateway; the connection is made using SQL Server Authentication. It works fine from the Logic App Designer.
No details about the connection are stored in the ARM template of the SQL Server connection, so if I want to automate the deployment of the Logic App, I need to add some values to the ARM template. The documentation for this is really poor, but I was still able to write this template:
{
  "type": "MICROSOFT.WEB/CONNECTIONS",
  "apiVersion": "2018-07-01-preview",
  "name": "[parameters('sql_2_Connection_Name')]",
  "location": "[parameters('logicAppLocation')]",
  "properties": {
    "api": {
      "id": "[concat(subscription().id, '/providers/Microsoft.Web/locations/', parameters('logicAppLocation'), '/managedApis/', 'sql')]"
    },
    "displayName": "[parameters('sql_2_Connection_DisplayName')]",
    "parameterValues": {
      "server": "[parameters('sql_2_server')]",
      "database": "[parameters('sql_2_database')]",
      "username": "[parameters('sql_2_username')]",
      "password": "[parameters('sql_2_password')]",
      "authType": "[parameters('sql_2_authtype')]",
      "sqlConnectionString": "[parameters('sql_2_sqlConnectionString')]",
      "gateway": {
        "id": "[concat('subscriptions/', subscription().subscriptionId, '/resourceGroups/', parameters('dataGatewayResourceGroup'), '/providers/Microsoft.Web/connectionGateways/', parameters('dataGatewayName'))]"
      }
    }
  }
}
But I can't find the correct value for the authType property corresponding to "SQL Server Authentication". The values windows and basic are accepted, but I couldn't find the one for "SQL Server Authentication".
Can someone please tell me what the value of the authType property is for "SQL Server Authentication"?
Use the following properties JSON inside your Web API connection:
"properties": {
"api": {
"id": "/subscriptions/<YourSubscriptionIDHere>/providers/Microsoft.Web/locations/australiaeast/managedApis/sql"
},
"parameterValueSet": {
"name": "sqlAuthentication",
"values": {
"server": {
"value": "SampleServer"
},
"database": {
"value": "WideWorldImporters"
},
"username": {
"value": "sampleuser"
},
"password": {
"value": "somepasssword"
},
"gateway": {
"value": {
"id": "/subscriptions/<subscriptionIDGoesHere>/resourceGroups/az-integration-study-rg/providers/Microsoft.Web/connectionGateways/<NameofTheGatewayHere>"
}
}
}
}
},
"location": "australiaeast"
That should do the trick.
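For context, parameterValueSet replaces the parameterValues block from the question's template rather than sitting alongside it, so the full connection resource ends up roughly like this (re-using the question's parameters; treat it as a sketch rather than a verified template):
{
  "type": "MICROSOFT.WEB/CONNECTIONS",
  "apiVersion": "2018-07-01-preview",
  "name": "[parameters('sql_2_Connection_Name')]",
  "location": "[parameters('logicAppLocation')]",
  "properties": {
    "api": {
      "id": "[concat(subscription().id, '/providers/Microsoft.Web/locations/', parameters('logicAppLocation'), '/managedApis/', 'sql')]"
    },
    "displayName": "[parameters('sql_2_Connection_DisplayName')]",
    "parameterValueSet": {
      "name": "sqlAuthentication",
      "values": {
        "server": { "value": "[parameters('sql_2_server')]" },
        "database": { "value": "[parameters('sql_2_database')]" },
        "username": { "value": "[parameters('sql_2_username')]" },
        "password": { "value": "[parameters('sql_2_password')]" },
        "gateway": {
          "value": {
            "id": "[concat('subscriptions/', subscription().subscriptionId, '/resourceGroups/', parameters('dataGatewayResourceGroup'), '/providers/Microsoft.Web/connectionGateways/', parameters('dataGatewayName'))]"
          }
        }
      }
    }
  }
}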

IBM CLOUD function action took too long to respond in IBM watson chat dialog

Hi, I am creating a chatbot. I developed an IBM Cloud Function (action) in IBM.
This is the action code:
{
  "context": {
    "my_creds": {
      "user": "ssssssssssssssssss",
      "password": "sssssssssssssssssssssss"
    }
  },
  "output": {
    "generic": [
      {
        "values": [
          {
            "text": ""
          }
        ],
        "response_type": "text",
        "selection_policy": "sequential"
      }
    ]
  },
  "actions": [
    {
      "name": "ssssssssssss/user-detail",
      "type": "server",
      "parameters": {
        "name": "<?input.text?>",
        "lastname": "<?input.text?>"
      },
      "credentials": "$my_creds",
      "result_variable": "$my_result"
    }
  ]
}
Now my user-detail action gives a response when I invoke it directly.
But when I check the output through my chatbot, I get "execution of cloud functions action took too long".
There is currently a 5 second limitation on processing time for a cloud function being called from a dialog node. If your process will need longer than this, you'll need to do it client side through your application layer.

Where can I provide filter for Azure Service Bus Topic Subscription from my logic app

I've created a service bus, a topic, and a subscription to that topic in Azure. I have a logic app that is triggered when a message arrives, but I need to apply a filter (or rule?) to that subscription so that it looks for a particular value in the message header before the logic app processes the message. I don't see anywhere in the logic app or in the Azure portal to create filters for subscriptions. What mechanism exists to create a filter for a subscription?
How did you create your topic subscription? Did you use ARM templates?
When you are creating the subscription, you can add a SqlFilter to the rule applied in the topic subscription.
The ARM template below (taken from here) shows you how to add a SqlFilter to the Rule in a Topic Subscription.
"resources": [{
"apiVersion": "[variables('sbVersion')]",
"name": "[parameters('serviceBusNamespaceName')]",
"type": "Microsoft.ServiceBus/Namespaces",
"location": "[variables('location')]",
"sku": {
"name": "Standard",
},
"resources": [{
"apiVersion": "[variables('sbVersion')]",
"name": "[parameters('serviceBusTopicName')]",
"type": "Topics",
"dependsOn": [
"[concat('Microsoft.ServiceBus/namespaces/', parameters('serviceBusNamespaceName'))]"
],
"properties": {
"path": "[parameters('serviceBusTopicName')]"
},
"resources": [{
"apiVersion": "[variables('sbVersion')]",
"name": "[parameters('serviceBusSubscriptionName')]",
"type": "Subscriptions",
"dependsOn": [
"[parameters('serviceBusTopicName')]"
],
"properties": {},
"resources": [{
"apiVersion": "[variables('sbVersion')]",
"name": "[parameters('serviceBusRuleName')]",
"type": "Rules",
"dependsOn": [
"[parameters('serviceBusSubscriptionName')]"
],
"properties": {
"filterType": "SqlFilter",
"sqlFilter": {
"sqlExpression": "StoreName = 'Store1'",
"requiresPreprocessing": "false"
},
"action": {
"sqlExpression": "set FilterTag = 'true'"
}
}
}]
}]
}]
}]
You need to add your filter using SQL-like expressions in the properties member of the Rules sub-resource, e.g.
"sqlExpression": "YourMessageProperty='YourExpectedValue'",
If you are not using ARM Templates, the Service Bus Explorer allows you to remove the default subscription rule and create a new one with your own SqlFilter.
HTH

"Allow access to Azure services" turn off by default from ARM template

Does anybody know how to set up an ARM template so that the "Allow access to Azure services" switch is turned OFF by default?
Here is what I currently have:
"resources": [
{
"name": "[parameters('serverName')]",
"type": "Microsoft.Sql/servers",
"location": "[parameters('location')]",
"apiVersion": "2014-04-01-preview",
"properties": {
"administratorLogin": "[parameters('administratorLogin')]",
"administratorLoginPassword": "[parameters('administratorLoginPassword')]",
"version": "[parameters('serverVersion')]"
},
"tags": {
"deploymentVersion": "[parameters('deploymentVersion')]",
"deploymentType": "[parameters('deploymentType')]"
},
"resources": [
{
"apiVersion": "2014-04-01-preview",
"dependsOn": [
"[concat('Microsoft.Sql/servers/', parameters('serverName'))]"
],
"location": "[parameters('location')]",
"name": "AllowAllWindowsAzureIps",
"properties": {
"endIpAddress": "0.0.0.0",
"startIpAddress": "0.0.0.0"
},
"type": "firewallrules"
}
]
} ]
Just modify endIpAddress and startIpAddress to 255.255.255.255, like below:
{
  "apiVersion": "2014-04-01-preview",
  "dependsOn": [
    "[concat('Microsoft.Sql/servers/', parameters('serverName'))]"
  ],
  "location": "[parameters('location')]",
  "name": "AllowAllWindowsAzureIps",
  "properties": {
    "endIpAddress": "255.255.255.255",
    "startIpAddress": "255.255.255.255"
  },
  "type": "firewallrules"
}
Just remove the nested resources section. Then the SQL server will deploy without that checkmark. Redeploying won't remove an existing rule, but deploying a new SQL server will work.
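For illustration, here is a sketch of the server resource from the question with the AllowAllWindowsAzureIps child removed and an ordinary client-range rule in its place (the rule name AllowClientRange and the 203.0.113.x addresses are placeholders):
{
  "name": "[parameters('serverName')]",
  "type": "Microsoft.Sql/servers",
  "location": "[parameters('location')]",
  "apiVersion": "2014-04-01-preview",
  "properties": {
    "administratorLogin": "[parameters('administratorLogin')]",
    "administratorLoginPassword": "[parameters('administratorLoginPassword')]",
    "version": "[parameters('serverVersion')]"
  },
  "resources": [
    {
      "apiVersion": "2014-04-01-preview",
      "type": "firewallrules",
      "name": "AllowClientRange",
      "location": "[parameters('location')]",
      "dependsOn": [
        "[concat('Microsoft.Sql/servers/', parameters('serverName'))]"
      ],
      "properties": {
        "startIpAddress": "203.0.113.0",
        "endIpAddress": "203.0.113.255"
      }
    }
  ]
}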
My experience is that when you use the name AllowAllWindowsAzureIps for a firewallrules resource, it ignores whatever range you specify in properties and just turns the "Allow access to Azure services" flag on.
If you don't want it enabled, don't include a resource with that name in your template.
Here is a Bicep example:
resource sqlServer 'Microsoft.Sql/servers@2022-02-01-preview' = {
  name: name
  location: location
  tags: tags
  properties: {
    administratorLogin: sqlAdministratorLogin
    administratorLoginPassword: sqlAdministratorLoginPassword
    version: '12.0'
  }
}

resource allowAccessToAzureServices 'Microsoft.Sql/servers/firewallRules@2020-11-01-preview' = {
  name: 'allow-access-to-azure-services'
  parent: sqlServer
  properties: {
    startIpAddress: '0.0.0.0'
    endIpAddress: '0.0.0.0'
  }
}
