Snowflake sending messages to Google Pub/Sub

I'm interested in using Snowflake to send a JSON object to a Google Pub/Sub topic.
I'm struggling to find good documentation on how to do this.
https://community.snowflake.com/s/article/Automating-Your-Snowflake-Database-Cloning-with-GCP
Is this possible within Snowflake, or will this have to be done with something like Python?
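If it has to be done externally, I assume the Python route would look roughly like this (using the snowflake-connector-python and google-cloud-pubsub packages; the account details, query, and topic name are all placeholders):

    import snowflake.connector
    from google.cloud import pubsub_v1

    # Placeholder connection details -- replace with your own account and credentials.
    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="my_password",
        warehouse="my_wh", database="my_db", schema="public",
    )

    # OBJECT_CONSTRUCT(*) turns each row into a JSON object; the query is just an example.
    rows = conn.cursor().execute(
        "SELECT OBJECT_CONSTRUCT(*) FROM my_table LIMIT 10"
    ).fetchall()

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path("my-gcp-project", "my-topic")

    for (row_json,) in rows:
        # Each VARIANT comes back as a JSON string; publish it as the message payload.
        publisher.publish(topic_path, data=row_json.encode("utf-8")).result()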

Related

Complete list of Google Pub/Sub pre-defined topics?

I used Google Pub/Sub to receive status changes from a build workflow I have in Google Cloud Build. There's a pre-defined topic called cloud-builds: you just create a new topic named cloud-builds and Cloud Build updates will populate it.
The Pub/Sub topic to which Cloud Build publishes these build update messages is called cloud-builds.
https://cloud.google.com/build/docs/subscribe-build-notifications
I'm curious whether there is a complete list of pre-defined Pub/Sub topics that are automatically populated by different services.
I looked around the docs for Pub/Sub but couldn't find a complete list.
Sorry, there's no list of topics like this. There are some other services that offer Pub/Sub notifications, for example Google Cloud Storage, but there's no centralized place where they are all documented. You will need to refer to the documentation of the particular service you are interested in.
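To illustrate the per-service setup with the cloud-builds example, here is a rough Python sketch of pulling build status messages once that topic exists (the project and subscription names are placeholders, and the subscription itself has to be created separately):

    from concurrent.futures import TimeoutError
    import json

    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    # Assumes a subscription on the cloud-builds topic, e.g. created with:
    #   gcloud pubsub subscriptions create build-status --topic=cloud-builds
    subscription_path = subscriber.subscription_path("my-gcp-project", "build-status")

    def callback(message):
        # Each message body is the Build resource as JSON; status is QUEUED,
        # WORKING, SUCCESS, FAILURE, etc.
        build = json.loads(message.data.decode("utf-8"))
        print(build.get("id"), build.get("status"))
        message.ack()

    streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
    try:
        streaming_pull.result(timeout=60)  # listen for a minute in this sketch
    except TimeoutError:
        streaming_pull.cancel()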

Sending User Input from IBM Watson Assistant to Database or via E-Mail

I want to implement an evaluation feature in my chatbot. Users would be able to rate the service on a scale of 1 to 5 and make suggestions.
I guess I would use slots for that and store the provided data in a variable.
What would be the easiest way for me to save and access that data later?
Somehow I need to write it to a database and make that database easily accessible. Or, ideally, have Watson send an email with the feedback to me.
Is there an IBM Cloud Database service available for that?
What would be my first steps to achieve this? Maybe you have some tips or documentation links, or even code snippets if it's not too much work for you.
I used IBM Cloud Functions to get a joke from an API into Watson via a webhook, using code from the internet, so I am somewhat familiar with the concept, but I need more guidance and couldn't find anything helpful. Basically, I know nothing about Node.
I would recommend the tutorial and its code on how to build a database-driven Slackbot with Watson Assistant. It uses a webhook and Cloud Functions to interact with a database for various actions. You could use it as a blueprint for setting up the webhook and see how the database is invoked.
Make sure to secure the webhook. This can only be done using the command line (CLI), see the Cloud Functions doc on securing web actions.
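If you go that route, a minimal webhook action for storing the rating could look roughly like this in Python (the tutorial's own code may differ; the Cloudant credentials and the feedback database name are placeholders you would bind to the action as default parameters):

    # IBM Cloud Functions action: Watson Assistant calls it via the webhook and
    # passes the rating/suggestion collected by the slot-filling node.
    from cloudant.client import Cloudant

    def main(params):
        # cloudant_account / cloudant_apikey are assumed to be bound to the action;
        # "feedback" is an assumed database name.
        client = Cloudant.iam(params["cloudant_account"],
                              params["cloudant_apikey"],
                              connect=True)
        db = client.create_database("feedback", throw_on_exists=False)
        doc = db.create_document({
            "rating": params.get("rating"),
            "suggestion": params.get("suggestion"),
        })
        client.disconnect()
        return {"stored": doc.exists()}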

Use Cloud Pub/Sub to trigger sending of email

I'm trying to figure out how to use Cloud Pub/Sub to trigger the sending of an email when a file is added to a storage bucket.
I'm currently using PHP 7.2 in the Google App Engine standard environment. First I created a topic that receives a message when a file is added to the storage bucket. Then I created a pull subscription which reads the messages. I can view the messages in the GCP console, but what I would like is to be notified by email, preferably with a copy of the added file as an attachment. Is this possible? I tried looking for a solution or tutorial but came up empty.
You can implement the send-mail logic in a Cloud Function that is triggered by Pub/Sub (Node.js, Python, or Go).
Using Pub/Sub to trigger a Cloud Function
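In Python, such a function might look roughly like this (SendGrid is just one mail option; the API key, addresses, and trigger topic are placeholders, and attaching the file itself would additionally require downloading it from the bucket):

    import base64
    import json
    import os

    from sendgrid import SendGridAPIClient
    from sendgrid.helpers.mail import Mail

    def notify_upload(event, context):
        # Triggered by a message on the Pub/Sub topic fed by the GCS notification;
        # the payload is a JSON object describing the new object.
        payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
        bucket, name = payload.get("bucket"), payload.get("name")

        message = Mail(
            from_email="noreply@example.com",   # placeholder sender
            to_emails="me@example.com",         # placeholder recipient
            subject=f"New file uploaded: {name}",
            plain_text_content=f"gs://{bucket}/{name} was just added.",
        )
        SendGridAPIClient(os.environ["SENDGRID_API_KEY"]).send(message)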
Instead of using a pull subscription, you should probably use a push subscription with App Engine, combined with one of the third-party mail services such as SendGrid or Mailjet.
The upload of an object to GCS triggers a message to be sent to the topic, and the push subscription delivers that message to App Engine.
Unfortunately, there isn't a full tutorial covering exactly what you want, but hopefully this helps. Feel free to request a community tutorial for this by filing an issue on the GCP community tutorial repo.
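The App Engine side of a push subscription is just an HTTP handler that unwraps the push envelope. A rough sketch in Python/Flask (the question uses PHP, so this only shows the shape of the payload; send_email is a stand-in for whichever mail provider you pick):

    import base64
    import json

    from flask import Flask, request

    app = Flask(__name__)

    def send_email(subject, body):
        # Stand-in: replace with a call to SendGrid, Mailjet, etc.
        print(subject, body)

    @app.route("/pubsub/push", methods=["POST"])
    def pubsub_push():
        # Pub/Sub wraps the message in an envelope: {"message": {"data": <base64>, ...}}
        envelope = request.get_json()
        payload = json.loads(base64.b64decode(envelope["message"]["data"]).decode("utf-8"))
        send_email(
            subject=f"New file: {payload.get('name')}",
            body=f"gs://{payload.get('bucket')}/{payload.get('name')} was uploaded.",
        )
        # Any 2xx response acknowledges the message so Pub/Sub stops retrying it.
        return ("", 204)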

How to consume Chatbot Analytics?

We want to somehow consume a chatbot's analytics so we can create our own analytics site for our clients.
Is that possible at all?
Are there any tools that would help?
We don't want tools; we want to consume the data and present it on our own site on behalf of our clients.
Consider the chatbot being created via Chatfuel, API.AI, or something similar.
You can use the chatbotproxy.com API to fetch app- and page-specific metrics.
Currently it collects 10 metrics. Note: if there is no data, the API does not return 0; it skips keys with a 0 count. ChatbotProxy Metrics
We don't want tools; we want to consume the data and present it on our own site on behalf of our clients.
Assuming you are referring to consumer responses as the data, yes, that is possible.
To gather that data you could use AWS cloud services: you can build the chatbot with Amazon Lex and AWS Lambda. In the AWS ecosystem, you build the skeleton with Lex and provide the functionality in a Lambda function, which is triggered when an intent is matched.
Considering you want to do custom analysis on your consumers' responses, AWS provides the best solution: its implementations are flexible and transparent, and their SDKs are available for a diverse set of platforms.
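As a sketch of how the Lambda side can double as your data collector, a Lex fulfillment handler in Python (assuming the original V1 Lex event format; the reply text and logging to CloudWatch are just placeholders for whatever store your analytics site reads from):

    import json

    def lambda_handler(event, context):
        # Lex (V1) passes the matched intent, slot values, and the raw utterance.
        intent = event["currentIntent"]["name"]
        slots = event["currentIntent"]["slots"]
        utterance = event.get("inputTranscript")

        # Emit a structured record; CloudWatch Logs (or a DynamoDB/Kinesis write
        # here) becomes the raw data behind your own analytics site.
        print(json.dumps({"userId": event.get("userId"),
                          "intent": intent,
                          "slots": slots,
                          "utterance": utterance}))

        return {
            "dialogAction": {
                "type": "Close",
                "fulfillmentState": "Fulfilled",
                "message": {"contentType": "PlainText",
                            "content": "Thanks, got it."},  # placeholder reply
            }
        }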
If your bot is created using API.AI, unfortunately there's no way to consume analytics data via API calls; instead, they have developed and announced an Analytics dashboard in the API.AI console, where you can review statistics for a specific agent. The workaround is to log everything via webhooks and write your own analytics service, but you probably know that already.
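In that spirit, the webhook-logging approach is just an HTTP endpoint that records each request before replying; a rough Python/Flask sketch, with SQLite standing in for whatever store your analytics site actually reads from (field names assume the API.AI V1 webhook payload):

    import sqlite3

    from flask import Flask, request, jsonify

    app = Flask(__name__)
    db = sqlite3.connect("analytics.db", check_same_thread=False)
    db.execute("CREATE TABLE IF NOT EXISTS events (session TEXT, intent TEXT, query TEXT, score REAL)")

    @app.route("/webhook", methods=["POST"])
    def webhook():
        # API.AI (V1) puts the matched intent, confidence, and raw query under "result".
        req = request.get_json()
        result = req.get("result", {})
        db.execute("INSERT INTO events VALUES (?, ?, ?, ?)",
                   (req.get("sessionId"),
                    result.get("metadata", {}).get("intentName"),
                    result.get("resolvedQuery"),
                    result.get("score")))
        db.commit()
        # Echo back a simple fulfillment so the bot still answers the user.
        return jsonify({"speech": "Thanks!", "displayText": "Thanks!"})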

Apttus to MuleSoft integration

I am trying to connect to Apttus using MuleSoft, fetch product/item data, and send it to NetSuite. I don't have much information, but I heard Apttus is part of Salesforce, so can I use the Salesforce connector to connect? I have searched Google but didn't find any example of connecting Apttus to MuleSoft. Can anyone help? If it is part of Salesforce, which operation should I choose?
Yes, the Salesforce connector will integrate with Apttus, but only if you're using standard and supported functions of the Force.com platform.
You should therefore be able to create custom objects and retrieve or filter them through DataSense.
