How to implement "client" type action in IBM Watson Assistant client? - ibm-watson

I'm working with the IBM Watson Assistant service to build an AI chatbot, and I'm currently building the client-side UI for it. According to IBM's documentation, the service supports various actions to trigger external business logic using functions. I assume a "client" type action can help me trigger some JavaScript function, but I can't find any working example online. Please help me implement a "client" type action in my HTML/JavaScript client.

Check the question in this posting - https://developer.ibm.com/answers/questions/477020/help-with-custom-actions-for-ibm-watson-assistant/
There are two parts to client-side actions. The first is in the dialog, which signals that an action is required and where the result of the action should be placed. Client-side actions require you to have an orchestration layer, which means that you are in control of both the dialog flow and the orchestration layer that reacts to it. It is up to you how you get them to coordinate; you are not obliged to follow any pattern. The easiest way is to use the context: the dialog sets a context variable, e.g. 'PleaseDoSomething', and the application sees it and does something.
The documented Client Action construct is a specification that puts a structure on this process. It allows other orchestration layers to make sense of the action, and it lets you switch to Cloud Functions relatively easily.
If you use the Client Action construct, then the dialog JSON will look something like:
{
  "output": {
    "text": {
      "values": [
        "Hang on I need to look that up."
      ],
      "selection_policy": "sequential"
    },
    "actions": [
      {
        "name": "fetchBalance",
        "type": "client",
        "result_variable": "balance"
      }
    ]
  }
}
Your orchestration layer looks for 'actions', runs 'fetchBalance', and puts the result in the context field 'balance'.
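As a sketch of such an orchestration layer in the browser (the fetchBalance implementation, the handleResponse helper, and the session handling are illustrative assumptions, not part of any Watson SDK):

```javascript
// Sketch of a client-side orchestration layer for "client" type actions.
// fetchBalance, clientActions and handleResponse are illustrative names.

// Your own business logic that the dialog asked for.
function fetchBalance() {
  return 42.5; // e.g. read from your app's state or a local API
}

// Map of action names the dialog may request to local handlers.
const clientActions = { fetchBalance };

// Inspect a Watson Assistant response, run any requested client
// actions, and return the context updates to send back.
function handleResponse(response) {
  const contextUpdates = {};
  const actions = (response.output && response.output.actions) || [];
  for (const action of actions) {
    if (action.type === "client" && clientActions[action.name]) {
      // Put the result where the dialog said it should go.
      contextUpdates[action.result_variable] = clientActions[action.name]();
    }
  }
  return contextUpdates;
}
```

You would then merge contextUpdates into the context of your next /message call so the dialog can continue with the balance filled in.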

Related

Link a jira issue to another jira issue while exporting nunit report with XrayImportBuilder

I'm trying to export an NUnit result to Jira Xray using XrayImportBuilder. I need to link a Jira issue to another issue, but I got the error below. Am I missing something? XrayImportBuilder uses the v2 create endpoints ("rest/api/2/issue").
ERROR: Unable to confirm Result of the upload.....
Upload Failed! Status:400 Response:{"error":"Error creating Test Execution -
Issue create failed! - issuelinks: Field 'issuelinks' cannot be set.
It is not on the appropriate screen, or unknown."
The code I used in my Jenkins pipeline:
step([$class: 'XrayImportBuilder',
      endpointName: '/nunit/multipart',
      importFilePath: 'nunit3.xml',
      importToSameExecution: 'true',
      projectKey: 'AT',
      serverInstance: jiraServerId,
      importInParallel: 'true',
      inputInfoSwitcher: 'fileContent',
      importInfo: """{
          "fields": {
              "project": {
                  "key": "AT"
              },
              "summary": "${summary}",
              "issuetype": {
                  "name": "Test Execution"
              }
          },
          "update": {
              "issuelinks": [{
                  "add": {
                      "values": [
                          {
                              "type": {
                                  "id": "10102"
                              },
                              "outwardIssues": [
                                  {
                                      "key": "AT-23"
                                  }
                              ]
                          }
                      ]
                  }
              }]
          }
      }"""
])
When I ran it without the update fields it worked, but when I ran it with the update/issuelinks field it failed.
The documentation says that "importInfo and testImportInfo must comply with the same format as the Jira issue create/update REST API format", but it doesn't work as the API doc would suggest.
First, to clarify: the Jenkins plugin is a "wrapper" for invoking Xray REST API endpoints; it doesn't use Jira REST API endpoints directly.
Xray itself may internally use Jira REST API endpoints (or similar APIs).
Even though it is possible to create an issue and link it to other issues using Jira's REST API (as long as you have "Linked Issues" on the create screen), importing results through the Xray API with issue links is not supported for the time being, at least for Xray server/DC.
For Xray on Jira Cloud, if you have "Linked Issues" on the mentioned screen, it should work.
For Xray server/DC, you should only consider setting fields on the destination issue (i.e. the Test Execution that will be created) using the "fields" object in that JSON.
If you wish to upload results and then link the Test Execution issue to some other issues, you'll have to make a Jira REST API request yourself after uploading the results (example). However, there's currently a limitation that prevents the Jenkins plugin from setting a build variable with the key of the Test Execution issue that was created.
Therefore, this approach may not work; one workaround would be to fetch the most recently created Test Execution issue, but that would be unreliable.
You may also reach out to the Xray support team and ask for an improvement to support this in the future.
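As a sketch of that follow-up request (the issue keys, the link type name, and the base URL below are assumptions for illustration), the body for Jira's POST /rest/api/2/issueLink endpoint can be built like this:

```javascript
// Sketch: build the body for Jira's POST /rest/api/2/issueLink endpoint,
// linking a freshly created Test Execution to another issue.
// The keys and the "Relates" link type are illustrative assumptions;
// adjust the link type to whatever your Jira instance defines.
function buildIssueLinkBody(testExecKey, otherKey) {
  return {
    type: { name: "Relates" },
    inwardIssue: { key: testExecKey },
    outwardIssue: { key: otherKey }
  };
}

// You would POST this with your HTTP client of choice, e.g.:
// fetch(baseUrl + "/rest/api/2/issueLink", {
//   method: "POST",
//   headers: { "Content-Type": "application/json", Authorization: auth },
//   body: JSON.stringify(buildIssueLinkBody("AT-101", "AT-23"))
// });
```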

Logic App Deployment - integration account cannot be found: The workflow must be associated with an integration account to use the workflow run action

I have a logic app that was deploying from Visual Studio without issue a few weeks ago.
Today it's throwing the following error on deployment:
17:27:10 - "error": {
17:27:10 - "code": "IntegrationAccountAssociationRequired",
17:27:10 - "message": "The workflow must be associated with an integration account to use the workflow run action 'Liquid transform' of type 'Liquid'."
17:27:10 - }
My logic app has a parameter that references the integration account:
"IntegrationAccountRef": {
  "type": "string",
  "minLength": 1,
  "defaultValue": "/subscriptions/99x99x9x-9xx9-x99x-x99x-x99x99x99x99/resourcegroups/devResourceGroup/providers/Microsoft.Logic/integrationAccounts/devIntegrationAccount"
},
I also reference this parameter in the parameters section of the logic app resource, so the logic app knows its integration account:
"integrationAccount": {
  "id": "[parameters('IntegrationAccountRef')]"
}
Yet it still throws the error mentioned at the top.
Has something changed in how Logic Apps now reference integration accounts in an arm template?
Appreciate any advice and expertise.
Just to summarize the steps from the comments for other community members' reference:
Even though we set the reference to the integration account in the template code, it also needs to be set in the logic app properties within Visual Studio.
Click anywhere in the white space of the Visual Studio Logic App designer.
Look in the Properties window for the Integration Account selection.
Select the Integration Account you want to use and save your Logic App.

Pattern for consolidating API calls while maintaining loose coupling?

Single-page apps often make several API calls on first load. Something like a dashboard might: load user information, load main content, load sidebar content, etc.
Letting individual components worry about making these requests is easy but increases load times since you have n components making n requests.
One approach is to consolidate under a single API endpoint all of the calls made by any one page. So now the dashboard page no longer makes n calls to n different endpoints but 1 call to the newly-introduced /dashboard endpoint.
This solution comes at the cost of coupling server-side logic with client logic, which seems like a bad idea.
Another approach would be to limit this coupling to the client by consolidating all API calls to a single component which then could batch up all the queries, wait for the response, decompose it into its parts, and then distribute those results to the corresponding components. This approach seems better than the above, though there's still coupling.
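That second approach can be sketched as a small client-side batcher (all names here are illustrative, not from any particular library): components register what they need, the batcher coalesces everything requested in the same tick into one call, then distributes the results.

```javascript
// Sketch of a client-side request batcher. Components call request()
// instead of fetching directly; everything requested in the same tick
// is coalesced into a single call to sendBatch, and each component
// gets back only its own slice of the combined response.
class RequestBatcher {
  constructor(sendBatch) {
    this.sendBatch = sendBatch; // (queries) => Promise of results array
    this.pending = [];
    this.scheduled = false;
  }

  request(query) {
    return new Promise((resolve, reject) => {
      this.pending.push({ query, resolve, reject });
      if (!this.scheduled) {
        this.scheduled = true;
        // Flush on the next microtask so sibling components can join in.
        Promise.resolve().then(() => this.flush());
      }
    });
  }

  async flush() {
    const batch = this.pending;
    this.pending = [];
    this.scheduled = false;
    try {
      const results = await this.sendBatch(batch.map(item => item.query));
      batch.forEach((item, i) => item.resolve(results[i]));
    } catch (err) {
      batch.forEach(item => item.reject(err));
    }
  }
}
```

In a real app, sendBatch would POST the collected queries to a single (hypothetical) batch endpoint; here any function returning one result per query works.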
Question: is there a well-established pattern (or a popular library) for addressing this problem? I imagine that every large application runs into this problem at some point, yet I can't find any information on the subject.
I don't know how large your application is, but since you mentioned "every large application runs into this problem at some point", I think GraphQL is what you need.
GraphQL was developed internally by Facebook in 2012 before being publicly released in 2015. It basically provides a new approach to developing web APIs and has been contrasted with REST and other web service architectures. GraphQL supports read, write (they call it mutation), and subscribe to changes to data - realtime updates.
A little bit of history on why Facebook built GraphQL: in 2012, Facebook was working on their mobile applications for Android and iOS. They had to change their existing REST services to make them work for mobile platforms as well, given data fetching on devices with low network bandwidth. To resolve this, they started working on GraphQL so they could keep using their existing services on mobile platforms too.
There are many advantages to using GraphQL, but one of them is exactly what you are looking for: getting many APIs' responses in a single request.
Also, for the front end, if you are using React you can use Relay, the production-ready GraphQL client for React by Facebook. There is also another GraphQL client framework called Apollo.
Here is an example of how a query is sent and the response you get based on the requested query.
request:
{
  orders {
    id
    productsList {
      product {
        name
        price
      }
      quantity
    }
    totalAmount
  }
  header {
    id
    isReadOnly
    ...
  }
  ...
}
Response:
{
  "data": {
    "orders": [
      {
        "id": 1,
        "productsList": [
          {
            "product": {
              "name": "productName",
              "price": 20.55
            },
            "quantity": 100
          }
        ],
        "totalAmount": 80
      }
    ],
    "header": {
      "id": 1,
      "isReadOnly": false,
      ...
    }
  }
}
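A minimal client-side sketch of sending such a query over HTTP (the /graphql URL is an assumed example endpoint; Relay or Apollo would normally handle this for you):

```javascript
// Sketch: build a plain fetch() request for a GraphQL query.
// The /graphql endpoint and the query text are illustrative assumptions.
function buildGraphQLRequest(query, variables) {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query, variables })
  };
}

// Usage (not executed here):
// const res = await fetch("/graphql",
//   buildGraphQLRequest("{ orders { id totalAmount } }", {}));
// const { data } = await res.json();
```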

Can you edit a Logic App custom connector? and how does one deploy then maintain (update)

I created a Logic Apps custom connector successfully (via the portal, not ARM) and it's in use (demo) and working fine. It's a wrapper for an Azure function, intended to provide better usability up front for less tech-savvy users, i.e. exposing properties vs. providing raw JSON.
Anyhow, once created, my query is a simple one: can it be edited 1. in the portal? 2. via ARM (if it was created by ARM)? E.g. I want to add a better icon.
When I view the Logic Apps custom connector in the portal and click Edit, all it does is populate the Connector Name and no more (see below). All the original configuration, parameters, etc. is missing.
So my queries:
1. Is this the norm?
2. On export of the custom connector (Azure portal menu item), the template has almost nothing in it - no content for the connector details either?
3. Is there an ARM template to deploy this?
4. If yes to 3, how do you go about modifying it in the scenario where you have to?
5. I also understand that using it in a logic app creates an API Connection reference. Does this stand alone, almost derived from the custom connector? And would further uses of a modified connector create different API connections?
I feel I'm just missing some basic knowledge on how these are implemented, which in turn would explain the deployment and maintenance.
Anyone :) ?
EDIT:
I think I've come to learn the portal is very buggy. The Swagger editor loaded no content either and broke the screen. I've since tried a simpler connector, i.e. one without sample markup containing escaped regex patterns, and it seems happy to let me go back in and edit :) (Maybe one to report as a bug after all this.)
That said - yes, edit should be possible, but the other queries regarding ARM, export, redeploy and current connections still stand :)
You can deploy the Logic Apps custom connector really easily. You need to do the following steps:
1. Configure your custom connector with the proper settings and update it.
2. Once updated, click on the download link available at the top of the connector.
3. Download the ARM template skeleton using Export Template.
4. In the properties section, add a new property called swagger and paste in the swagger you downloaded in step 2.
5. Parameterise your ARM template.
6. Deploy using your choice of deployment tool - Azure DevOps, PowerShell, etc.
Please refer to the following ARM template for your perusal.
{
    "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "customApis_tempconnector_name": {
            "defaultValue": "tempconnector",
            "type": "String"
        }
    },
    "variables": {},
    "resources": [
        {
            "type": "Microsoft.Web/customApis",
            "apiVersion": "2016-06-01",
            "name": "[parameters('customApis_tempconnector_name')]",
            "location": "australiaeast",
            "properties": {
                "connectionParameters": {
                    "api_key": {
                        "type": "securestring",
                        "uiDefinition": {
                            "displayName": "API Key",
                            "description": "The API Key for this api",
                            "tooltip": "Provide your API Key",
                            "constraints": {
                                "tabIndex": 2,
                                "clearText": false,
                                "required": "true"
                            }
                        }
                    }
                },
                "backendService": {
                    "serviceUrl": "http://petstore.swagger.io/v2"
                },
                "description": "This is a sample server Petstore server. You can find out more about Swagger at [http://swagger.io](http://swagger.io) or on [irc.freenode.net, #swagger](http://swagger.io/irc/). For this sample, you can use the api key `special-key` to test the authorization filters.",
                "displayName": "[parameters('customApis_tempconnector_name')]",
                "iconUri": "/Content/retail/assets/default-connection-icon.e6bb72160664a5e37b9923c3d9f50ca5.2.svg",
                "swagger": { "Enter Swagger Downloaded from Step 2 here" }
            }
        }
    ]
}

How to POST/PUT simple data to endpoints in BreezeJS

I'm developing a SPA in Angular with BreezeJS. I could configure the client to get data from a typical RESTful API with no special support for Breeze capabilities (the metadata is written by hand). Now I'm struggling to create/update entities, since my server endpoints expect a simple structure, but BreezeJS's saveChanges sends an array like the one described in this question.
What I need is to change the data my app sends to the server from this:
// Current saveBundle
{ "entities": [
    { id: 4,
      label: "text",
      description: "longer text", ...,
      "entityAspect": { "entityTypeName": ... } }
] }
to this:
{
  id: 4,
  label: "text",
  description: "longer text", ...
}
Is there a method or property I can override, or something simple? I've been reading about extending Breeze or writing my own dataservice adapter, but I'm lost in those waters. I'm thinking of adding some validation to parse that object on the Laravel API server, but that is not an easy route since there's a lot of code laid down already and the intention is for the API to be generic so that other (Breeze-less) clients can consume it.
Thank you in advance
Take a look at the Breeze REST Adapter for Azure Mobile or the Breeze REST Adapter for Sharepoint. You should be able to convert one of them to meet your needs.
See the Todo-Zumo sample to see how the adapter is used.
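If you do end up writing your own dataservice adapter, the core transformation is small. A sketch (the helper name is an assumption; a real adapter must also implement the rest of Breeze's adapter interface):

```javascript
// Sketch: flatten one entry of a Breeze saveBundle into the plain object
// a conventional REST endpoint expects. The entityAspect property carries
// Breeze bookkeeping and is stripped; everything else is the payload.
function toPlainPayload(saveBundleEntity) {
  const { entityAspect, ...payload } = saveBundleEntity;
  return payload;
}
```

Inside a custom adapter's saveChanges, you would map saveBundle.entities through this and issue one POST/PUT per entity (or whatever your API's convention is).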
