Why does querying LiveChatTranscript return no rows when an Einstein bot executes the action? - salesforce

I am having trouble following this documentation: https://developer.salesforce.com/docs/atlas.en-us.bot_cookbook.meta/bot_cookbook/bot_cookbook_greet_customer_lex.htm
I get the message "FATAL_ERROR System.QueryException: List has no rows for assignment to SObject" when I start an interaction with the Einstein bot. When I try to query this sObject in Salesforce, I also get no rows. As far as I know, a LiveChatTranscript record should be created when an interaction starts. Does this only happen when a live agent answers the conversation? Does an Einstein bot conversation not create a LiveChatTranscript?
The interaction is via the WhatsApp channel.
I tried to query via a Flow and an Apex class and couldn't retrieve anything. I am also tracing the Apex class and the Integration User, and the only problem with the class is that the query returns no rows.
I am testing this because I need to obtain the LiveChatTranscript Id to send to a Genesys integration.
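The FATAL_ERROR itself is the generic Apex symptom of assigning a zero-row query result directly to a single sObject; in Apex the usual fix is to query into a `List<LiveChatTranscript>` and check `isEmpty()` before dereferencing. The same defensive pattern, sketched in Python with a hypothetical `run_soql` callable standing in for whatever client executes the query (`ChatKey` is a real LiveChatTranscript field):

```python
def first_or_none(records):
    """Return the first record of a SOQL result, or None when the query
    matched no rows (instead of raising like a direct sObject assignment)."""
    return records[0] if records else None

def find_transcript(run_soql, chat_key):
    # run_soql is assumed to return one dict per row.
    rows = run_soql(
        "SELECT Id FROM LiveChatTranscript WHERE ChatKey = '%s'" % chat_key
    )
    return first_or_none(rows)

# With a stubbed query that returns no rows, we get None rather than an error:
transcript = find_transcript(lambda soql: [], "abc-123")
```

Guarding like this also lets the caller retry later, which matters here because the transcript record may simply not exist yet at the point in the conversation where the query runs.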

Related

Direct to agent routing from standard Einstein bot in Salesforce

We have a requirement to route a chat directly from an Einstein bot to the specific agent linked to the Case when the user clicks the Transfer menu, but Einstein bots only support bot/queue/skill transfers. We have tried a lot of things, including an Apex class call to create/update PendingServiceRouting (PSR) and AgentWork records, but nothing seems to work. Has anybody done this? Any ideas would be much appreciated. Thanks
Called an Apex class to intercept the transfer by the bot and created/updated PSR and AgentWork records in that class to route to a specified PreferredUserId
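For illustration, the core of the direct-to-agent approach is creating an AgentWork record that names one specific user. A minimal sketch of the payload, assuming the standard AgentWork fields UserId, WorkItemId, and ServiceChannelId (the actual create call, e.g. via simple-salesforce, is only shown in a comment):

```python
def build_agent_work(user_id, work_item_id, service_channel_id):
    """Payload for an AgentWork record that offers the work item
    directly to a single agent (direct-to-agent routing)."""
    return {
        "UserId": user_id,                  # the preferred agent
        "WorkItemId": work_item_id,         # e.g. the chat or case Id
        "ServiceChannelId": service_channel_id,
    }

# Hypothetical usage with simple-salesforce (names assumed, not from the post):
# sf.AgentWork.create(build_agent_work(agent_id, chat_id, channel_id))
```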

Is there a way to raise a SNOW ticket as a notification for query failures in Snowflake?

I was going through the integration documents available for Snowflake and ServiceNow, but all of them are oddly focused on Snowflake consuming ServiceNow data for analytics. I didn't find anything about creating tickets for failures in Snowflake. Is that possible?
It's not about the monitoring and notification aspects of Snowflake, but about connecting to ServiceNow and raising a ticket for query failures (Tasks, stored procedures, etc.).
Any ideas?
There's no functionality like that as of now. I recommend you open an Idea for it; if enough customers want it, our Product Management team will review it.
For Snowpipe, we found a way to do it: we send the error message to SNS, and then a Lambda function calls the ServiceNow REST API to create a ticket.
For Tasks, we found it is possible to use External Functions to notify AWS whenever a Task fails, but we haven't implemented that yet.
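The SNS-to-Lambda step above can be sketched as a small handler that turns the Snowpipe error notification into a ServiceNow incident payload. The incident endpoint (`/api/now/table/incident`) is ServiceNow's standard Table API; the message field names (e.g. `pipeName`) follow Snowpipe's error-notification JSON but are read defensively here, and the actual authenticated POST is only shown in a comment:

```python
import json

def incident_from_sns_event(event):
    """Turn an SNS notification carrying a Snowpipe error into a
    ServiceNow incident payload for the Table API."""
    message = json.loads(event["Records"][0]["Sns"]["Message"])
    return {
        "short_description": "Snowpipe load failure: %s" % message.get("pipeName", "unknown"),
        "description": json.dumps(message, indent=2),
        "urgency": "2",
    }

def handler(event, context):
    payload = incident_from_sns_event(event)
    # In the real Lambda, POST this with basic auth, e.g.:
    # requests.post("https://<instance>.service-now.com/api/now/table/incident",
    #               json=payload, auth=(SNOW_USER, SNOW_PASSWORD))
    return payload
```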
Email is a simple way. You need to determine how your ServiceNow instance processes emails; we implemented incident creation from Azure App Insights based on emails.
In ServiceNow, find the Inbound Action you need to process the email, or create one.
ServiceNow provides every instance with an email account.
The instance email is usually xxxx@service-now.com.
If your instance URL is "audi.service-now.com", the email would be "audi@service-now.com".
For a PDI, the pattern is devXXXXX@servicenowdevelopers.com, e.g. dev12345@servicenowdevelopers.com.
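Composing that email is plain stdlib work. A minimal sketch, where the sender address, subject format, and SMTP host are assumptions; only the `<instance>@service-now.com` recipient pattern comes from the answer above:

```python
from email.message import EmailMessage

def build_failure_email(instance, task_name, error_text):
    """Compose the email that the ServiceNow Inbound Action will parse."""
    msg = EmailMessage()
    msg["To"] = "%s@service-now.com" % instance   # the instance's inbound address
    msg["From"] = "snowflake-alerts@example.com"  # assumed sender address
    msg["Subject"] = "Snowflake task failure: %s" % task_name
    msg.set_content(error_text)
    return msg

# Sending it is a plain SMTP call, e.g.:
# import smtplib
# with smtplib.SMTP("smtp.example.com") as s:
#     s.send_message(build_failure_email("audi", "NIGHTLY_LOAD", "error details"))
```

The Inbound Action on the ServiceNow side can then key off the subject prefix to decide which incident fields to fill.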

Automated DynamoDB Database Checks | ReactJS + AWS Amplify

My team and I are working on a full-stack application using ReactJS on the frontend and AWS Amplify on the backend. We are using AWS AppSync to query data in our DynamoDB tables (through GraphQL queries), Cognito for user authentication, and SES to send out emails to users. Basically, the user inputs some info (DynamoDB Table #1), that is matched against an opportunity database (DynamoDB Table #2), and the top 3 opportunities are shown to the user. If none are found, an email is sent to inform the user that they will receive an email when opportunities are found.

Now for the question: is there a way to automatically query a DynamoDB table (say, once a day, or every time a new opportunity is added to DynamoDB Table #2) and send out emails with matching opportunities to the users who were waiting for them? I tried using Lambda triggers, but the only way I could do it was by querying each row of DynamoDB Table #1 against DynamoDB Table #2, which is computationally infeasible because it uses up too many resources.

I am asking for advice on how to go about making that daily check, because I haven't been able to figure it out yet. Any responses are appreciated, and let me know if you need any additional information from my side. Thank you!
You could look into using DynamoDB Streams. When a new opportunity is added to DynamoDB, the stream triggers a Lambda. Your Lambda can then execute your business logic to match the opportunity with the appropriate user.
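The key efficiency gain is that the stream inverts the scan: each new opportunity is matched once against the waiting users, instead of re-scanning Table #1 against all of Table #2. A sketch of such a handler, where the matching rule, the `waiting_users` query, and the record shapes are assumptions (only the stream event envelope with `eventName` and `dynamodb.NewImage` is AWS's real format):

```python
def waiting_users():
    """Stand-in for querying DynamoDB Table #1 for users still waiting.
    Each user would be a dict like {"email": ..., "categories": [...]}."""
    return []

def matches(opportunity, user):
    """Hypothetical matching rule: the opportunity's category is one
    the user asked for."""
    return opportunity.get("category") in user.get("categories", [])

def handler(event, context):
    """Entry point for a Lambda wired to the stream on Table #2."""
    notified = []
    for record in event.get("Records", []):
        if record.get("eventName") != "INSERT":
            continue  # only react to newly added opportunities
        image = record["dynamodb"]["NewImage"]
        # Stream images use DynamoDB's typed format, e.g. {"S": "finance"}.
        opportunity = {k: list(v.values())[0] for k, v in image.items()}
        for user in waiting_users():
            if matches(opportunity, user):
                notified.append(user["email"])
                # send the SES email here, e.g. boto3.client("ses").send_email(...)
    return notified
```

For the once-a-day variant, the same handler body could instead be triggered by an EventBridge schedule rather than the stream.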

Maintaining the state of conversation in Facebook Messenger

I am building a Facebook Messenger bot which needs to take multiple inputs from the user. One way of storing all the inputs is to store them in a database. However, I found a field "metadata" in the Send API request. The description of this field is promising: "Custom string that will be re-delivered to webhook listeners". However, when I fill this field with user inputs, I don't get them back with the next call. I used the sample app provided by Facebook to do this test.
Has anyone found a use for the metadata property? Is it working?
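For reference, `metadata` is a real field on the `message` object of the Send API, and as far as I recall it is echoed back on the page's own webhook events (such as message echoes), not attached to the user's next inbound message, which would explain the behavior described. A sketch of the request body (the Graph API URL, version, and token handling in the comment are assumptions):

```python
def send_api_payload(psid, text, metadata):
    """Body for POST /me/messages. `metadata` is the custom string the
    Send API re-delivers to webhook listeners."""
    return {
        "recipient": {"id": psid},
        "message": {"text": text, "metadata": metadata},
    }

# Hypothetical usage with requests (PAGE_TOKEN is assumed):
# requests.post(
#     "https://graph.facebook.com/v19.0/me/messages",
#     params={"access_token": PAGE_TOKEN},
#     json=send_api_payload(user_psid, "What is your name?", "state=ASK_NAME"),
# )
```

So for multi-step input, a per-user state record in a database keyed by the PSID remains the reliable approach.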

Unable to query multiple Salesforce objects using MuleSoft Anypoint

I am new to Salesforce as well as Mule Anypoint, and I have the task of getting data from multiple Salesforce objects into a relational DB.
These are the Salesforce queries I need. Basically, for every lead I get, I need to perform the queries below to obtain contact_guid__c:
select ownerid from lead where id = '<lead_id>'
select EmployeeNumber from user where id = '<ownerid>'
select contact_guid__c from contact where Employee_Number__c = '<EmployeeNumber>'
I don't know how to combine these queries in a way that I can integrate into a Mule workflow.
I tried the following method (see screenshot), but it didn't work at all.
The For Each scope takes the ownerid and makes the second call to query EmployeeNumber from the User object.
Any help is much appreciated!
Similar to how you query a database, you can use multiple IDs in the WHERE clause of a Salesforce SOQL query. Refer to the following SOQL reference for use in Mule:
https://resources.docs.salesforce.com/sfdc/pdf/salesforce_soql_sosl.pdf
This way, you can reduce the number of calls from Mule to Salesforce.
Regarding flow changes, first check whether the For Each loop is required at all (a WHERE ... IN clause in SOQL may be sufficient); then you can call Salesforce three times (three Salesforce message processors).
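The batching idea above amounts to collecting the IDs from one query's results and interpolating them into the next query's IN list. A minimal sketch (quoting is naive and assumes trusted Salesforce IDs, not arbitrary user input):

```python
def soql_in_clause(field, ids):
    """Build a WHERE ... IN (...) filter so one query covers many records."""
    return "%s IN (%s)" % (field, ", ".join("'%s'" % i for i in ids))

# e.g. all owner IDs gathered from the first (lead) query:
owner_ids = ["005xx0000012345", "005xx0000067890"]
query = "SELECT Id, EmployeeNumber FROM User WHERE " + soql_in_clause("Id", owner_ids)
```

This collapses the per-lead loop into one User query and one Contact query, i.e. three Salesforce calls total regardless of how many leads arrive.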
