Can the SQL query in a TileStache cfg vector layer have variables passed in as parameters?

Something like this, where a variable is passed in as a parameter:
"vector": {
"allowed origin": "*",
"provider": {
"class": "TileStache.Goodies.VecTiles:Provider",
"kwargs": {
"dbinfo": {
"host": "localhost",
"user": "postgres",
"database": "mydb"
},
"queries": [
"SELECT something_interesting, geometry AS __geometry__ FROM mydb where something_interesting > [**variable**]"
]
}
}
}
If not, what's the best approach to take? The idea is to be able to dynamically view and interact with PostGIS data. I know this perhaps defeats the ability to cache the JSON tiles.
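Since the queries in a TileStache layer config are static strings, one common workaround is to skip tile caching for the dynamic layer and serve filtered GeoJSON straight from PostGIS behind a tiny endpoint. Below is a minimal sketch, assuming Flask and psycopg2; the /features route, the mytable table, and the column names are illustrative placeholders, not TileStache API:

# A hypothetical endpoint that returns filtered PostGIS features as
# GeoJSON, bypassing TileStache's static per-layer queries.
import json

import psycopg2
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/features")
def features():
    # The filter value arrives as a query-string parameter, e.g. /features?min=42
    threshold = request.args.get("min", default=0.0, type=float)
    conn = psycopg2.connect(host="localhost", user="postgres", dbname="mydb")
    try:
        with conn.cursor() as cur:
            # Parameterized query: psycopg2 substitutes %s safely, so the
            # user-supplied threshold cannot inject SQL.
            cur.execute(
                "SELECT something_interesting, ST_AsGeoJSON(geometry) "
                "FROM mytable WHERE something_interesting > %s",
                (threshold,),
            )
            feats = [
                {
                    "type": "Feature",
                    "properties": {"something_interesting": row[0]},
                    "geometry": json.loads(row[1]),
                }
                for row in cur.fetchall()
            ]
    finally:
        conn.close()
    return jsonify({"type": "FeatureCollection", "features": feats})

if __name__ == "__main__":
    app.run(port=8000)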

Related

Using Volttron aggregation agent

I'm trying to get the aggregate agent to work with TimescaleDB.
https://volttron.readthedocs.io/en/main/developing-volttron/developing-agents/specifications/aggregate.html
I have a file aggregation.config:
{
    "connection": {
        "type": "postgresql",
        "params": {
            "dbname": "volttrondb",
            "host": "127.0.0.1",
            "port": 5432,
            "user": "user",
            "password": "password",
            "timescale_dialect": true
        }
    },
    "aggregations": [
        # list of aggregation groups, each with a unique aggregation_period and
        # a list of points that need to be collected
        {
            "aggregation_period": "1h",
            "use_calendar_time_periods": true,
            "utc_collection_start_time": "2016-03-01T01:15:01.000000",
            "points": [
                {
                    "topic_names": ["campus/building/fake/EKG_Cos", "campus/building/fake/EKG_Sin"],
                    "aggregation_topic_name": "campus/building/fake/avg_of_EKG_Cos_EKG_Sin",
                    "aggregation_type": "avg",
                    "min_count": 2
                }
            ]
        }
    ]
}
Then I run the following command:
vctl install services/core/SQLAggregateHistorian/ --tag aggregate-historian -c config/aggregation.config --start
It starts correctly: vctl status shows it running, and there are no errors in the log.
However, I do not see the point campus/building/fake/avg_of_EKG_Cos_EKG_Sin in the topics table.
Any suggestions?
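One quick diagnostic, for what it's worth: query the historian's topics table directly with the same credentials as aggregation.config, and see whether the aggregate topic was ever registered. A minimal sketch with psycopg2 (the topics table and topic_name column assume the standard VOLTTRON SQL historian schema; adjust if yours differ):

# Look for the aggregation topic in the historian's topics table,
# using the same credentials as aggregation.config.
import psycopg2

conn = psycopg2.connect(
    dbname="volttrondb",
    host="127.0.0.1",
    port=5432,
    user="user",
    password="password",
)
with conn.cursor() as cur:
    # List every topic containing "avg_of" to see whether
    # campus/building/fake/avg_of_EKG_Cos_EKG_Sin was ever created.
    cur.execute(
        "SELECT topic_name FROM topics WHERE topic_name ILIKE %s",
        ("%avg_of%",),
    )
    for (topic,) in cur.fetchall():
        print(topic)
conn.close()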

80 Postgres databases in Azure Data Factory to copy into

I'm using Azure Data Factory, with SQL Server as the source and Postgres as the target. The goal is to copy 30 tables from SQL Server, with transformation, to 30 tables in Postgres. The hard part is that I have 80 databases on each side, all with the exact same layout but different data. It's one database per customer, so 80 customers each with their own database.
Linked services don't allow parameters for Postgres.
I have one dataset per source and target, using parameters for schema and table names.
I have one pipeline per table, with a SQL Server source and Postgres target.
I can parameterize the SQL Server source in the linked service, but not Postgres.
The problem is: how can I copy 80 source databases to 80 target databases without adding 80 target linked services and 80 target datasets? On top of that, I'd have to repeat all 30 pipelines per target database.
BTW, I'm only familiar with the UI; however, anything else that does the job is acceptable.
Any help would be appreciated.
There is a simple way to implement this. Essentially, you need a single linked service that reads its connection string out of Key Vault. You can then parameterize source and target as Key Vault secret names and switch between data sources by just changing the secret name. This relies on all connection-related information being enclosed within a single connection string.
I will give a simple overview for PostgreSQL, but the same logic applies to SQL Server as the source.
Implement a linked service for Azure Key Vault.
Add a linked service for Azure PostgreSQL that uses Key Vault to store the access URL in the format: Server=your_server_name.postgres.database.azure.com;Database=your_database_name;Port=5432;UID=your_user_name;Password=your_password;SSL Mode=Require;Keepalive=600; (I advise using the server name as the secret name.)
Pass this parameter, which is essentially the correct secret name, in the pipeline (you can also implement a loop that accepts an array of x elements and fans n elements at a time out into separate pipeline runs). A scripted way to seed these secrets is sketched after the pipeline definition below.
Linked Service Definition for KeyVault:
{
    "name": "your_keyvault_name",
    "properties": {
        "description": "KeyVault",
        "annotations": [],
        "type": "AzureKeyVault",
        "typeProperties": {
            "baseUrl": "https://your_keyvault_name.vault.azure.net/"
        }
    }
}
Linked Service Definition for PostgreSQL (note that the secret-name parameter must be declared here, since the secretName expression and the dataset both reference it):
{
    "name": "generic_postgres_service",
    "properties": {
        "type": "AzurePostgreSql",
        "parameters": {
            "secret_name_for_server": {
                "type": "string",
                "defaultValue": "your_server_name"
            }
        },
        "annotations": [],
        "typeProperties": {
            "connectionString": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "your_keyvault_name",
                    "type": "LinkedServiceReference"
                },
                "secretName": "@linkedService().secret_name_for_server"
            }
        },
        "connectVia": {
            "referenceName": "AutoResolveIntegrationRuntime",
            "type": "IntegrationRuntimeReference"
        }
    }
}
Dataset Definition for PostgreSQL (schema_name and table_name are declared as dataset parameters alongside the secret name, since the typeProperties expressions reference them):
{
    "name": "your_postgresql_dataset",
    "properties": {
        "linkedServiceName": {
            "referenceName": "generic_postgres_service",
            "type": "LinkedServiceReference",
            "parameters": {
                "secret_name_for_server": {
                    "value": "@dataset().secret_name_for_server",
                    "type": "Expression"
                }
            }
        },
        "parameters": {
            "secret_name_for_server": {
                "type": "string"
            },
            "schema_name": {
                "type": "string"
            },
            "table_name": {
                "type": "string"
            }
        },
        "annotations": [],
        "type": "AzurePostgreSqlTable",
        "schema": [],
        "typeProperties": {
            "schema": {
                "value": "@dataset().schema_name",
                "type": "Expression"
            },
            "table": {
                "value": "@dataset().table_name",
                "type": "Expression"
            }
        }
    }
}
Pipeline Definition for PostgreSQL:
{
    "name": "your_postgres_pipeline",
    "properties": {
        "activities": [
            {
                "name": "Copy_Activity_1",
                "type": "Copy",
                "dependsOn": [],
                "policy": {
                    "timeout": "7.00:00:00",
                    "retry": 0,
                    "retryIntervalInSeconds": 30,
                    "secureOutput": false,
                    "secureInput": false
                },
                ...
                ... (definition skipped)
                ...
                "inputs": [
                    {
                        "referenceName": "your_postgresql_dataset",
                        "type": "DatasetReference",
                        "parameters": {
                            "secret_name_for_server": "secret_name"
                        }
                    }
                ]
            }
        ],
        "annotations": []
    }
}
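As an aside, creating the 80 per-customer connection-string secrets by hand is tedious, so it can be scripted. A hedged sketch using the azure-identity and azure-keyvault-secrets packages; the vault URL, server list, and credentials are placeholders, not part of the answer above:

# Create one connection-string secret per customer server, named
# after the server as suggested above.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

VAULT_URL = "https://your_keyvault_name.vault.azure.net/"

# Hypothetical list of the 80 customer servers and databases.
customers = [
    {"server": "customer1-server", "database": "customer1_db"},
    {"server": "customer2-server", "database": "customer2_db"},
    # ... 78 more
]

client = SecretClient(vault_url=VAULT_URL, credential=DefaultAzureCredential())

for c in customers:
    conn_str = (
        "Server={server}.postgres.database.azure.com;"
        "Database={database};Port=5432;"
        "UID=your_user_name;Password=your_password;"
        "SSL Mode=Require;Keepalive=600;"
    ).format(**c)
    # Secret name == server name, matching the linked service convention.
    client.set_secret(c["server"], conn_str)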

In Azure Logic Apps ARM template, what are the possible values for the AuthType property for a SQL Server connector using On-Premise Data Gateway?

I have an Azure Logic App with a SQL Server connector going through an on-premises data gateway; the connection is made using SQL Server Authentication. It works fine from the Logic App Designer.
No details about the connection are stored in the ARM template of the SQL Server connection, so if I want to automate the deployment of the Logic App, I need to add some values to the ARM template. The documentation for this is really poor, but I was able to write this template:
{
    "type": "MICROSOFT.WEB/CONNECTIONS",
    "apiVersion": "2018-07-01-preview",
    "name": "[parameters('sql_2_Connection_Name')]",
    "location": "[parameters('logicAppLocation')]",
    "properties": {
        "api": {
            "id": "[concat(subscription().id, '/providers/Microsoft.Web/locations/', parameters('logicAppLocation'), '/managedApis/', 'sql')]"
        },
        "displayName": "[parameters('sql_2_Connection_DisplayName')]",
        "parameterValues": {
            "server": "[parameters('sql_2_server')]",
            "database": "[parameters('sql_2_database')]",
            "username": "[parameters('sql_2_username')]",
            "password": "[parameters('sql_2_password')]",
            "authType": "[parameters('sql_2_authtype')]",
            "sqlConnectionString": "[parameters('sql_2_sqlConnectionString')]",
            "gateway": {
                "id": "[concat('subscriptions/', subscription().subscriptionId, '/resourceGroups/', parameters('dataGatewayResourceGroup'), '/providers/Microsoft.Web/connectionGateways/', parameters('dataGatewayName'))]"
            }
        }
    }
}
But I can't find the correct value for the authType property corresponding to "SQL Server Authentication". The values windows and basic are accepted, but neither corresponds to "SQL Server Authentication".
Can someone please tell me the value of the authType property corresponding to "SQL Server Authentication"?
Use the following properties JSON inside your Microsoft.Web/connections resource:
"properties": {
"api": {
"id": "/subscriptions/<YourSubscriptionIDHere>/providers/Microsoft.Web/locations/australiaeast/managedApis/sql"
},
"parameterValueSet": {
"name": "sqlAuthentication",
"values": {
"server": {
"value": "SampleServer"
},
"database": {
"value": "WideWorldImporters"
},
"username": {
"value": "sampleuser"
},
"password": {
"value": "somepasssword"
},
"gateway": {
"value": {
"id": "/subscriptions/<subscriptionIDGoesHere>/resourceGroups/az-integration-study-rg/providers/Microsoft.Web/connectionGateways/<NameofTheGatewayHere>"
}
}
}
}
},
"location": "australiaeast"
That should do the trick
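Since the goal here is automated deployment, a template like this can also be pushed programmatically. A rough sketch with the azure-mgmt-resource SDK; the resource group, file name, and parameter values are placeholders, and the exact deployment-body shape may vary by SDK version:

# Deploy an ARM template containing the Microsoft.Web/connections
# resource shown above.
import json

from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

SUBSCRIPTION_ID = "<YourSubscriptionIDHere>"
client = ResourceManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

with open("connection_template.json") as f:
    template = json.load(f)

poller = client.deployments.begin_create_or_update(
    "your-resource-group",
    "sql-connection-deployment",
    {
        "properties": {
            "mode": "Incremental",
            "template": template,
            # Template parameters, e.g. the connection name; add the
            # server, database, and credential parameters the same way.
            "parameters": {"sql_2_Connection_Name": {"value": "sql-conn"}},
        }
    },
)
print(poller.result().properties.provisioning_state)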

EventGrid Trigger - How to set clienttrackingid from triggerbody?

In a microservice environment where requests span multiple services, including Event Grid, I'd like to configure end-to-end logging with a correlation ID.
Inspired by this blog: https://toonvanhoutte.wordpress.com/2018/08/05/end-to-end-correlation-across-logic-apps/
How can I configure the Event Grid trigger's clientTrackingId with my correlation number from the event's data payload?
Check out my definition below, which does not work.
If I substitute "@{coalesce(json(triggerBody().Data)?.CorrelationNr, guid())}" with a string value, or even with "@parameters('$connections')['azureeventgrid']['connectionId']", it works like a charm.
"triggers": {
"When_a_resource_event_occurs": {
"correlation": {
"clientTrackingId": "#{coalesce(json(triggerBody().Data)?.CorrelationNr, guid())}"
},
"inputs": {
"body": {
"properties": {
"destination": {
"endpointType": "webhook",
"properties": {
"endpointUrl": "#{listCallbackUrl()}"
}
},
"filter": {
"includedEventTypes": [
"webhook.sp.updated"
]
},
"topic": "/subscriptions/xxxx/resourceGroups/xxx/providers/Microsoft.EventGrid/topics/WebHookManager"
}
},
"host": {
"connection": {
"name": "#parameters('$connections')['azureeventgrid']['connectionId']"
}
},
"path": "/subscriptions/#{encodeURIComponent('xxx')}/providers/#{encodeURIComponent('Microsoft.EventGrid.Topics')}/resource/eventSubscriptions",
"queries": {
"x-ms-api-version": "2017-06-15-preview"
}
},
"splitOn": "#triggerBody()",
"type": "ApiConnectionWebhook"
}
}
The Logic App does not trigger, and there is no error message.
Please check the description of clientTrackingId. Your Logic App has no run history because, with the definition you show, triggerBody() doesn't contain CorrelationNr.
Actually, your Event Grid trigger has detected the event; it just couldn't run the logic. You can go to EVALUATION and check the trigger history: the value is null, so it won't run.
If you use an HTTP request trigger, you could set an x-my-custom-correlation-id header, or set any key/value in the JSON body, then set the clientTrackingId with something like @{coalesce(json(triggerBody())['keyname'], guid())}.
And if you are using a trigger without headers, you have to point the value at a string or another parameter, like the connectionId you mentioned, or a parameter value you define yourself.
So the point is that the clientTrackingId must be set before the run starts, and its value must be obtainable at that point.
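For completeness: the expression above can only find CorrelationNr if every published event carries it in its data payload. A hedged sketch of publishing such an event to a custom Event Grid topic with the requests library; the topic endpoint, key, and subject are illustrative:

# Publish an event whose data payload carries the CorrelationNr that
# the trigger's clientTrackingId expression reads.
import datetime
import uuid

import requests

TOPIC_ENDPOINT = "https://webhookmanager.westeurope-1.eventgrid.azure.net/api/events"
TOPIC_KEY = "your-topic-access-key"

event = {
    "id": str(uuid.uuid4()),
    "eventType": "webhook.sp.updated",
    "subject": "webhooks/sp/updated",
    "eventTime": datetime.datetime.utcnow().isoformat() + "Z",
    "dataVersion": "1.0",
    # The field the Logic App reads via json(triggerBody().Data)?.CorrelationNr
    "data": {"CorrelationNr": str(uuid.uuid4())},
}

resp = requests.post(
    TOPIC_ENDPOINT,
    json=[event],  # Event Grid expects a JSON array of events
    headers={"aeg-sas-key": TOPIC_KEY},
)
resp.raise_for_status()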

URL path variables in Azure Logic App Custom Connectors

I'm trying to build a Logic Apps Custom Connector that can update a JIRA issue (a feature not currently available in the prebuilt connector).
Here is a cURL example from the JIRA documentation for this request
curl -D- -u fred:fred -X PUT --data {see below} -H "Content-Type: application/json" http://kelpie9:8081/rest/api/2/issue/QA-31
{
    "fields": {
        "assignee": {"name": "harry"}
    }
}
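For reference, the same request in Python with the requests library, using the host, credentials, and issue key from the cURL example; the issue key is the path segment to be parameterized:

# Mirror of the cURL example: PUT a new assignee onto issue QA-31,
# with the issue key interpolated into the URL path.
import requests

def assign_issue(issue_key: str, assignee: str) -> None:
    resp = requests.put(
        "http://kelpie9:8081/rest/api/2/issue/" + issue_key,
        json={"fields": {"assignee": {"name": assignee}}},
        auth=("fred", "fred"),
    )
    resp.raise_for_status()

assign_issue("QA-31", "harry")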
The QA-31 value is the unique identifier that I want to make a variable. Using Postman, I set it as an environment variable and successfully ran the request. But when I uploaded the Postman collection to my custom connector, the 'QA-31' value wasn't available as a path variable.
Then I tried editing the custom connector directly. In the Import from sample menu I replaced 'QA-31' in the URL with '{issueKey}'. This created a path variable, but it also prefixed the URL with '/en-us/widgets/manage', which I don't want.
Here is a picture of the problem
So there are a couple of questions here:
Why is my path variable in Postman not being picked up in the custom connector, while other requests from that collection work fine?
Why is my URL being prefixed with '/en-us/widgets/manage' when I add a path variable in the 'Import from sample' menu?
Thanks!
Inside the Logic Apps Custom Connector editor you may define path variables by enclosing the variable in braces (e.g. https://api.library.com/{method}/). This can be done manually during the "Definition" step of creating/editing your custom connector. However, the drawback is that you must use the "Import from sample" feature, which requires you to manually rewrite the particular request.
To answer your question, we can define the path variables in Postman and then run the V1 export.
You can define a path variable in a Postman request by prepending a ':' to the variable name, like so: https://api.library.com/:method/. This will add the key (method) and an optional value to the request's parameters field.
When you export as a Postman V1 collection, the resulting JSON looks like this:
{
    "id": "fc10d942-f460-4fbf-abb6-36943a112bf6",
    "name": "Custom Method Demo",
    "description": "",
    "auth": null,
    "events": null,
    "variables": [],
    "order": [
        "becb5ff8-6d31-48ee-be3d-8c70777d60aa"
    ],
    "folders_order": [],
    "folders": [],
    "requests": [
        {
            "id": "becb5ff8-6d31-48ee-be3d-8c70777d60aa",
            "name": "Custom Request Method",
            "url": "https://api.library.com/:method",
            "description": "Use a path variable to define a custom method.",
            "data": null,
            "dataMode": "params",
            "headerData": [],
            "method": "GET",
            "pathVariableData": [
                {
                    "key": "method",
                    "value": ""
                }
            ],
            "queryParams": [],
            "auth": {
                "type": "noauth"
            },
            "events": [
                {
                    "listen": "prerequest",
                    "script": {
                        "id": "b7b91243-0c58-4dc6-b3ee-4fb4ffc604db",
                        "type": "text/javascript",
                        "exec": [
                            ""
                        ]
                    }
                }
            ],
            "folder": null,
            "headers": "",
            "pathVariables": {
                "method": ""
            }
        }
    ]
}
Notice the "pathVariables" field, which corresponds to our custom path variable.
Now we can import this into our custom connector, and the path variable is properly interpreted as described in the first paragraph.
Hope that helps.
