Microsoft Teams Chatbot Not Responding - Integration With IBM Watson Bot

I have integrated an IBM Watson bot with Microsoft Teams, but the chatbot is not responding, whereas the same integration works fine with Skype. I also tried debugging the chatbot using the Bot Emulator, and it works there. Kindly help with this. Thanks in advance.
Below is the log for Bot Emulator:
[13:26:34] -> POST 202 [message] hello
[13:26:35] <- POST 200 Reply[message] Hello
[13:26:35] <- POST 200 Reply[event] Debug Event
[13:26:38] -> POST 202 [message] hi
[13:26:39] <- POST 200 Reply[message] Hola
[13:26:39] <- POST 200 Reply[event] Debug Event
[13:26:46] -> POST 202 [message] status of Ankita
[13:26:48] <- POST 200 Reply[message] Status of Ankita is RolledOn!
[13:26:48] <- POST 200 Reply[event] Debug Event
[13:56:20] -> POST 202 [conversationUpdate]
[13:56:20] -> POST 202 [conversationUpdate]
[13:56:31] -> POST 202 [message] Hi
[13:56:32] <- POST 200 Reply[event] Debug Event
[13:56:33] <- POST 200 Reply[message] Hola
[13:56:33] <- POST 200 Reply[event] Debug Event

Related

python flask + Server Sent Events(SSE) in Google App Engine(GAE)

Hi, I am trying to do SSE (Server-Sent Events) with Python Flask, quite similar to this question.
I am using SSE to plot a real-time graph in the web app.
My code works fine locally, but when I deploy it to GAE (Google App Engine), the data does not seem to be returned (yielded).
I found that I should set the response header
X-Accel-Buffering: no
as suggested in this guide here.
So I tried this code in "main.py":
import json
import time
from datetime import datetime
from random import random
from flask import Flask, Response, render_template

app = Flask(__name__)

# render "plot.html" and plot the real-time graph
# plot.html gets its values from the /chart-data event stream and updates the graph
@app.route('/plot_graph', methods=["GET", "POST"])
def plot_graph():
    return render_template('plot.html')

@app.route('/chart-data')
def chart_data():
    def generate_random_data():
        while True:
            # generate a random data point with a timestamp
            json_data = json.dumps(
                {'time': datetime.now().strftime('%Y-%m-%d %H:%M:%S'), 'value': random() * 100})
            yield f"data:{json_data}\n\n"
            time.sleep(0.5)
    resp = Response(generate_random_data(), mimetype='text/event-stream')
    resp.headers["X-Accel-Buffering"] = "no"
    return resp
Also, in the "plot.html" file, it gets values from /chart-data like this:
const source = new EventSource("/chart-data");
source.onmessage = function (event) {
const data = JSON.parse(event.data);
...
}
and this also works well when executed on my local machine, but it does not work on GAE.
As for the logs and error messages: when I ran the code on my local machine
and visited the root dir, /plot_graph, and /chart-data, the log looks like
127.0.0.1 - - [03/Jul/2020 14:14:19] "GET / HTTP/1.1" 200 -
127.0.0.1 - - [03/Jul/2020 14:14:20] "GET /favicon.ico HTTP/1.1" 404 -
127.0.0.1 - - [03/Jul/2020 14:14:23] "GET /plot_graph HTTP/1.1" 200 -
127.0.0.1 - - [03/Jul/2020 14:14:24] "GET /chart-data HTTP/1.1" 200 -
127.0.0.1 - - [03/Jul/2020 14:15:04] "GET /chart-data HTTP/1.1" 200 -
When I go to /plot_graph, I can see a GET request to /chart-data and it works fine (/plot_graph shows a real-time graph). Also, when I go to /chart-data directly, I can see the yielded values displayed in the browser.
In the GAE logs,
2020-07-03 05:22:23 default[20200703t141524] "GET / HTTP/1.1" 200
2020-07-03 05:22:23 default[20200703t141524] "GET /favicon.ico HTTP/1.1" 404
2020-07-03 05:22:38 default[20200703t141524] "GET /plot_graph HTTP/1.1" 200
On GAE, even after I open /plot_graph, the GET request to /chart-data does not seem to happen, and since it is not getting any values, the graph is not plotted (only the frame of the graph is displayed).
Also, I tried to go to /chart-data on the GAE web server directly, but it would not load, and I cannot see any GET request in the logs as I did on the local machine.
Can you please help me with this problem?

Passing JSON request starting with the character [ gives the error: invalid request format with GET operation [duplicate]

This question already has an answer here:
Karate DSL: Getting connection timeout error
(1 answer)
Closed 2 years ago.
When I pass a JSON request starting with [, I get an error that says: invalid request format with GET operation.
Here is my request. The same works fine in Postman.
[
{
"equipmentNumber": "76576",
"systemIdentifier": "00045F063547",
"serialNumber": "00098",
"materialNumber": "786786"
}
]
Your question is incomplete, but anyway I'll give it a shot. Here is a test that works perfectly fine in Karate; cut and paste it into a new Scenario and see for yourself.
* url 'https://httpbin.org/post'
* request
"""
[
{
"equipmentNumber": "76576",
"systemIdentifier": "00045F063547",
"serialNumber": "00098",
"materialNumber": "786786"
}
]
"""
* method post
Which results in this request:
1 > POST https://httpbin.org/post
1 > Accept-Encoding: gzip,deflate
1 > Connection: Keep-Alive
1 > Content-Length: 112
1 > Content-Type: application/json; charset=UTF-8
1 > Host: httpbin.org
1 > User-Agent: Apache-HttpClient/4.5.5 (Java/1.8.0_231)
[{"equipmentNumber":"76576","systemIdentifier":"00045F063547","serialNumber":"00098","materialNumber":"786786"}]
So if you are still stuck, follow this process - else no one can help you, given the lack of information in your question: https://github.com/intuit/karate/wiki/How-to-Submit-an-Issue

Thunderbird Lightning caldav sync doesn't show any data/events

When I try to synchronize my CalDAV server implementation with Thunderbird 45.4.0 and Lightning 4.7.4 (one particular calendar collection), it doesn't show any data or events in the calendar, even though the last call of the sequence provided the data.
In the Thunderbird error log I can see one error:
Timestamp: 07.11.16, 14:21:12
Error: [calCachedCalendar] replay action failed: null,
uri=http://127.0.0.1:8003/sap/sports/webdav/appsvc/webdav/services/
server.xsjs/cal/_D043133/, result=2147500037, op=[xpconnect wrapped
calIOperation]
Source file:
file:///Users/d043133/Library/Thunderbird/Profiles/hfbvuk9f.default/
extensions/%7Be2fda1a4-762b-4020-b5ad-a41df1933103%7D/calendar-
js/calCachedCalendar.js
Line: 327
the call sequence is as follows (detailed content via gist-links):
Propfind Request - Response
Options Request - Response
Propfind Request - Response
Report Request - Response - Response Raw
Synchronization with other clients such as the macOS calendar and the iOS calendar works in principle and shows the data. Does anyone have a clue what is going wrong here?
Not sure whether that is the cause, but I can see two incorrect things:
a) Your <href/> property has trailing spaces:
<d:href>/sap/sports/webdav/appsvc/webdav/services/server.xsjs/cal/_D043133/EVENT%3A070768ba5dd78ff15458f1985cdaabb1.ics
</d:href>
b) Your ORGANIZER property is not a valid URI:
ORGANIZER:_D043133
I was able to find the cause of the above issue by debugging Thunderbird, as proposed by Philipp. The REPORT response has HTTP status code 200, but since it is a multistatus response, Thunderbird/Lightning expects status code 207 ;-)
Thanks for the hints!
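For reference, the fix is only in the status line of the REPORT reply; a WebDAV multistatus body must be announced as 207 rather than 200. A sketch, with headers abbreviated:

```
HTTP/1.1 207 Multi-Status
Content-Type: application/xml; charset=utf-8

<d:multistatus xmlns:d="DAV:">
  ...
</d:multistatus>
```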

Web scraping dynamically loading data in R

I am using R to web-scrape data about reviews of 3D printer hubs from here. I need to grab the URL for each of the hubs in the search results. I started with the rvest package, but the data loads dynamically (I believe using AngularJS) and rvest could not capture it.
After reviewing Stack Overflow, I found I could load the webpage onto my computer and save it as an HTML file using phantomjs.org. I did that with the following code.
# this example scrapes the user table from:
url <- "https://www.3dhubs.com/3dprint#/?place=New%20York&latitude=40.7144&longitude=-74.006&distanceLimit=250&distanceUnit=miles&shipsToCountry=US&shipsToState=NY"
# write out a script phantomjs can process
writeLines(sprintf("var page = require('webpage').create();
page.open('%s', function () {
console.log(page.content); //page source
phantom.exit();
});", url), con="scrape.js")
# process it with phantomjs
system("phantomjs scrape.js > scrape.html")
# use rvest as you would normally use it
page_html <- read_html("scrape.html")
The above code did not load any of the desired data into R. Then I found the package rdom (https://github.com/cpsievert/rdom). rdom uses a similar technique, and it was able to load the names of each of the hubs, but not the link to the hub page.
tbl <- rdom::rdom("https://www.3dhubs.com/3dprint#/?place=New%20York&latitude=40.7144&longitude=-74.006&distanceLimit=250&distanceUnit=miles&shipsToCountry=US&shipsToState=NY")
htmltxt <- paste(capture.output(tbl, file=NULL), collapse="\n")
write(htmltxt, file = "scrape.html")
page_html <- read_html("scrape.html")
I have a very basic working knowledge of GET and POST requests. Using the Firebug add-on in Firefox, I was able to find the POST request that populates the fields:
https://hub-listings.3dhubs.com/listings
In the headers, the website only allows requests from 3dhubs.com. Here is the header for reference:
HTTP/1.1 200 OK
Access-Control-Allow-Origin: https://www.3dhubs.com
access-control-expose-headers: api-version, content-length, content-md5, content-type, date, request-id, response-time
Content-Type: application/json
Date: Wed, 14 Sep 2016 15:48:30 GMT
Content-Length: 227629
Connection: keep-alive
Is there some other technique I should try? Or does the "Access-Control-Allow-Origin" header make it impossible?
An additional question: the search results are paginated. The second page is only loaded when the "2" is selected at the bottom of the page, but the URL does not change from page 1 to page 2. How would you account for this in web scraping?
Here is another approach that can be considered :
library(RSelenium)
url <- "https://www.hubs.com/3d-printing/#/?place=New%20York&latitude=40.7144&longitude=-74.006&distanceLimit=250&distanceUnit=miles&shipsToCountry=US&shipsToState=NY"
shell('docker run -d -p 4445:4444 selenium/standalone-firefox')
remDr <- remoteDriver(remoteServerAddr = "localhost", port = 4445L, browserName = "firefox")
remDr$open()
remDr$navigate(url)
htmltxt <- remDr$getPageSource()[[1]]
You can also consider the following approach :
library(RDCOMClient)
url <- "https://www.hubs.com/3d-printing/#/?place=New%20York&latitude=40.7144&longitude=-74.006&distanceLimit=250&distanceUnit=miles&shipsToCountry=US&shipsToState=NY"
IEApp <- COMCreate("InternetExplorer.Application")
IEApp[['Visible']] <- TRUE
IEApp$Navigate(url)
Sys.sleep(5)
doc <- IEApp$Document()
htmltxt <- doc$documentElement()$innerHtml()

not connecting to web socket properly in gatling

I want to connect to a WebSocket through Gatling, but it is not working. The socket listener does not get any message. The code is given below. Can anyone suggest what the problem is? Is there any recording option for WebSockets in Gatling? The recorder only records HTTP requests.
val scn = scenario("RecordedSimulation")
.exec(http("login")
.post("/user/login")
.formParam("email", "username")
.formParam("password", "*******")
.check(headerRegex("Set-Cookie", "localhost:1337.sid=(.*); Path=/").saveAs("cookie")))
.pause(5)
.exec(http("get sid")
.get("/socket.io/?EIO=3&transport=polling&t=1468496890883-0")
.headers(headers_3)
.check(regex("\"sid\":\"(.*)\",").saveAs("sid"))
)
.pause(4)
.exec(ws("Connect WebSocket").open("/socket.io/?EIO=3&transport=websocket&sid=${sid}")
.headers(Map(
"Accept-Encoding" -> "gzip, deflate, sdch",
"Accept-Language" -> "en-US,en;q=0.8",
"Pragma" -> "no-cache",
"Host" -> "localhost:1337",
"Cache-Control" -> "no-cache",
"Connection" -> "Upgrade",
"Origin" -> "http://localhost:1337",
"Sec-WebSocket-Extensions" -> "permessage-deflate; client_max_window_bits",
"Sec-WebSocket-Key" -> "sBWXugNrGCMSXmO3BEm4yw==",
"Sec-WebSocket-Version" -> "13",
"Upgrade" ->"websocket",
"Cookie" -> "io=${sid}; __cfduid=d1cf62d5cf275e2c709080ad7610da8b61465800778; cf_clearance=42068d23995e3243b3ee748ac616389d5cc27d92-1468865525-1800; _gat=1; _ga=GA1.2.1134427855.1467369017; localhost:1337.sid=${cookie}"
)))
.pause(1)
.exec(http("Run")
.post("/posturl")
.headers(headers_13)
.body(RawFileBody("RecordedSimulation_0013_request.txt")))
.exec(ws("Set Check for instance ID")
.check(wsAwait.within(30).until(1).regex("\"intInstanceID\":\"(.*-.*-.*-.*-.*)\",").saveAs("instanceID")))
.pause(1)
.exec(ws("Say Hello WS")
.sendText("""{"text": "Hello, I'm ${sid}"}"""))
.exec( session =>{
println(session("cookie").as[String])
session
})
setUp(scn.inject(atOnceUsers(1))).protocols(httpProtocol)
}
You don't say how your expected incoming messages are related to your other actions, but beware that we don't currently buffer unmatched incoming messages.
So if a message arrives before you actually set a check, it's lost.
If you have some feedback about your WebSocket load-test usage, please share and discuss it on our mailing list. There's an ongoing effort to redesign WebSocket support for Gatling 3.
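In the scenario above, the check is only armed after the "Run" request has been sent, so a fast server push can arrive while no check is set and be dropped. A sketch of the reordered steps, reusing the names from the question (Gatling 2 DSL fragment, not a drop-in script):

```scala
// arm the WebSocket check BEFORE the request that triggers the push,
// so the pushed message is matched instead of discarded as unmatched
.exec(ws("Set Check for instance ID")
  .check(wsAwait.within(30).until(1)
    .regex("\"intInstanceID\":\"(.*-.*-.*-.*-.*)\",")
    .saveAs("instanceID")))
.exec(http("Run")
  .post("/posturl")
  .headers(headers_13)
  .body(RawFileBody("RecordedSimulation_0013_request.txt")))
```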
