Is there any way we can send email alerts if a stored procedure fails in Snowflake?
When I checked the Snowflake documentation, there was no mention of an email utility in Snowflake.
You can send email directly from Snowflake, optionally including data from a table/view as an attachment. This is done using a Snowflake external function, which in turn calls an AWS Lambda function via AWS API Gateway.
The first step is to set up the AWS Gateway. You may follow the instructions below:
Creating a Customizable External Function on AWS
If you got the sample function from Snowflake working, you have successfully set up the foundation for adding email functionality. Next is to set up an S3 bucket for creating the data files that need to be sent as email attachments.
Create an AWS S3 bucket with the name 'snowapi'. This bucket need not be exposed to the internet, so keep 'Block all public access' set to ON.
Now you need to give Snowflake access to this bucket. Create an IAM user 'snowflake'. Add Permissions -> Attach existing policy: AmazonS3FullAccess. Go to the 'Security Credentials' tab and 'Create access key'. Use the Access Key ID and Secret Access Key in the commands below to unload data into the S3 bucket.
CREATE OR REPLACE STAGE UTIL.AWS_S3_STAGE URL='s3://snowapi/'
    CREDENTIALS=(AWS_KEY_ID='ABCD123456789123456789'
                 AWS_SECRET_KEY='ABCD12345678901234567890123456789');
COPY INTO @UTIL.AWS_S3_STAGE/outbound/SampleData.csv
FROM (SELECT * FROM <your_table_or_view>)  -- placeholder: the data to attach
FILE_FORMAT = (TYPE = CSV)                 -- placeholder: adjust to your needs
OVERWRITE = TRUE
SINGLE = TRUE;
The next step is to create a new Lambda function using the Node.js code below. Note that this uses the SendGrid API; SendGrid has a forever-free tier with 100 emails per day. I installed this library locally and uploaded the zip file to AWS to create the Lambda function.
// Lambda function name: email
const sgMail = require('@sendgrid/mail');
var AWS = require('aws-sdk');
var s3 = new AWS.S3();

exports.handler = async (event, context, callback) => {
    sgMail.setApiKey(process.env.SENDGRID_KEY);
    const paramArray = JSON.parse(event.body).data[0];
    // paramArray[0] has the row number from Snowflake
    var message = {
        to: paramArray[1].replace(/\s/g, '').split(','),
        from: paramArray[2].replace(/\s/g, ''),
        subject: paramArray[3],
        html: paramArray[4]
    };
    // Attach file
    if (paramArray.length > 5) {
        var fileName = paramArray[5].substring(paramArray[5].lastIndexOf("/") + 1);
        var filePath = paramArray[5].substring(0, paramArray[5].lastIndexOf("/"));
        try {
            const params = {Bucket: process.env.BUCKET_NAME + filePath, Key: fileName};
            const data = await s3.getObject(params).promise();
            var fileContent = data.Body.toString('base64');
        } catch (e) {
            throw new Error(`Could not retrieve file from S3: ${e.message}`);
        }
        message.attachments = [{
            content: fileContent,
            filename: fileName,
            type: "application/text",
            disposition: "attachment"
        }];
    }
    try {
        await sgMail.send(message);
        return {
            'statusCode': 200,
            'headers': { 'Content-Type': 'application/json' },
            'body': "{'data': [[0, 'Email Sent to " + paramArray[1] + "']]}"
        };
    } catch (e) {
        return {
            'statusCode': 202,
            'headers': { 'Content-Type': 'application/json' },
            'body': "{'data': [[0, 'Error - " + e.message + "']]}"
        };
    }
};
Set the following two environment variables for the Lambda function:
SENDGRID_KEY: <sendgrid_api_key>
BUCKET_NAME: snowapi
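The external function created in the next step references an API integration object named aws_api_integration. That object comes out of the AWS Gateway setup linked earlier; as a rough sketch (the role ARN here is a placeholder for the role you created, and the URL is your own gateway endpoint):
-- Sketch only: substitute the role ARN and endpoint URL from your own gateway setup.
CREATE OR REPLACE API INTEGRATION aws_api_integration
    API_PROVIDER = aws_api_gateway
    API_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake-api-role'
    API_ALLOWED_PREFIXES = ('https://xxxxxxxxxx.execute-api.us-east-1.amazonaws.com/PROD/')
    ENABLED = TRUE;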
Create a Snowflake External Function:
create or replace external function util.aws_email
    (mailTo varchar, mailFrom varchar, subject varchar, htmlBody varchar, fileName varchar)
returns variant
api_integration = aws_api_integration
as 'https://xxxxxxxxxx.execute-api.us-east-1.amazonaws.com/PROD/email';
Create a wrapper procedure for the above external function:
create or replace procedure util.sendemail
    (MAILTO varchar, MAILFROM varchar, SUBJECT varchar, HTMLBODY varchar, FILENAME varchar)
returns string
language javascript
EXECUTE AS OWNER
as
$$
// Call the AWS Lambda function via the external function,
// defaulting any missing arguments.
var qry = "select util.aws_email(:1,:2,:3,:4,:5)";
// Note: the JavaScript keyword null must be in lowercase.
var stmt = snowflake.createStatement({
    sqlText: qry,
    binds: [MAILTO,
            MAILFROM || 'no-reply@yourdomain.com',
            SUBJECT || 'Email sent from Snowflake',
            HTMLBODY || '<p>Hi there,</p> <p>Good luck!</p>',
            FILENAME || null]
});
var rs;
try {
    rs = stmt.execute();
    rs.next();
    return rs.getColumnValue(1);
}
catch (err) {
    throw "ERROR: " + err.message.replace(/\n/g, " ");
}
$$;
All set! The end result is a clean call that sends email, like below.
Call SENDEMAIL('to_email@dummy.com, to_another_email@dummy.com',
               'from@yourdomain.com',
               'Test Subject',
               'Sample Body',
               null);
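To tie this back to the original question of alerting when a stored procedure fails: a minimal sketch, assuming the UTIL.SENDEMAIL wrapper above (the procedure name and recipient address are hypothetical), is to catch errors inside the procedure and mail the error text from the catch block.
create or replace procedure my_etl_proc()
returns string
language javascript
as
$$
try {
    // ... the procedure's actual work goes here ...
    snowflake.execute({sqlText: "select 1"});
    return 'SUCCESS';
} catch (err) {
    // On failure, alert via the wrapper procedure defined above;
    // MAILFROM and FILENAME fall back to their defaults.
    snowflake.execute({
        sqlText: "call util.sendemail(:1, :2, :3, :4, :5)",
        binds: ['admin@yourdomain.com', null,
                'ALERT: my_etl_proc failed',
                '<p>' + err.message.replace(/\n/g, ' ') + '</p>', null]
    });
    throw err;
}
$$;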
Good Luck!!
I believe there is no built-in email utility in Snowflake, but you can run your Snowflake stored procedure from Python and check the stored procedure's status; based on that status you can trigger an email from Python.
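A minimal sketch of that approach, assuming the snowflake-connector-python package and an SMTP server you can reach (all names below are placeholders):
import smtplib
from email.message import EmailMessage

import snowflake.connector

conn = snowflake.connector.connect(
    account='my_account', user='my_user', password='...',
    warehouse='my_wh', database='my_db', schema='my_schema')
try:
    conn.cursor().execute("CALL MY_PROC()")
except Exception as exc:
    # The procedure failed: build and send an alert email over plain SMTP.
    msg = EmailMessage()
    msg['Subject'] = 'Snowflake procedure MY_PROC failed'
    msg['From'] = 'alerts@example.com'
    msg['To'] = 'admin@example.com'
    msg.set_content(str(exc))
    with smtplib.SMTP('smtp.example.com') as smtp:
        smtp.send_message(msg)
finally:
    conn.close()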
Sending Email Notifications:
This feature uses the notification integration object, which is a Snowflake object that provides an interface between Snowflake and third-party services (e.g. cloud message queues, email, etc.). A single account can define a maximum of ten email integrations and enable one or more simultaneously.
To create an email notification integration, use the CREATE NOTIFICATION INTEGRATION command with TYPE=EMAIL:
CREATE [ OR REPLACE ] NOTIFICATION INTEGRATION [IF NOT EXISTS]
<integration_name>
TYPE=EMAIL
ENABLED={TRUE|FALSE}
ALLOWED_RECIPIENTS=('<email_address_1>' [, ... '<email_address_N>'])
[ COMMENT = '<string_literal>' ]
;
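For example, a concrete integration matching the my_email_int used further below (note that ALLOWED_RECIPIENTS addresses must belong to users of your Snowflake account who have verified their email addresses):
CREATE OR REPLACE NOTIFICATION INTEGRATION my_email_int
  TYPE=EMAIL
  ENABLED=TRUE
  ALLOWED_RECIPIENTS=('person1@example.com','person2@example.com');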
After creating the email notification integration, you can call SYSTEM$SEND_EMAIL() to send an email notification, as follows:
CALL SYSTEM$SEND_EMAIL(
'<integration_name>',
'<email_address_1> [, ... <email_address_N>]',
'<email_subject>',
'<email_content>'
);
...
For example:
CALL SYSTEM$SEND_EMAIL(
'my_email_int',
'person1@example.com, person2@example.com',
'Email Alert: Task A has finished.',
'Task A has successfully finished.\nStart Time: 10:10:32\nEnd Time: 12:15:45\nTotal Records Processed: 115678'
);
We use the snowsql command from bash scripts, with the "-o exit_on_error=true" option on the command line, and check the return code at the end of the step. If a Snowflake command has failed, the exit-on-error setting means Snowflake stops at the point of the error and returns control to the calling program.
If the return code is zero, then we move onto the next step.
If it is non-zero, then we call an error handler which sends an email and then quits the job.
We're on Amazon Linux for our orchestration, and we use mutt as an email application.
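A minimal sketch of that pattern (the SQL file, recipient, and subject are placeholders, and your mutt setup may differ):
#!/bin/bash
# Run one step; exit_on_error makes snowsql stop at the first failing command.
snowsql -o exit_on_error=true -f load_step.sql
rc=$?

if [ $rc -ne 0 ]; then
    # Non-zero return code: call the error handler path -- email and quit.
    echo "Snowflake step failed with return code $rc" \
        | mutt -s "ALERT: Snowflake job failed" ops-team@example.com
    exit $rc
fi
# Return code was zero: move on to the next step.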
I have created a Flutter Windows application and am now trying to save local data. I used the shared_preferences plugin, but it does not support Windows applications, so I found another package to save local data: the sqflite_common_ffi plugin.
But I don't know how to use it. I followed its example and it worked, but now I want to separate all of its functionality: I want to create the database in one method, insert values in another method, get values in another, and delete them in another.
CODE
import 'package:sqflite_common/sqlite_api.dart';
import 'package:sqflite_common_ffi/sqflite_ffi.dart';

class Storages {
  var databaseFactory = databaseFactoryFfi;
  Database db;

  createDatabase() async {
    // Init ffi loader if needed.
    sqfliteFfiInit();
    db = await databaseFactory.openDatabase(inMemoryDatabasePath);
    await db.execute('''
      CREATE TABLE Credential(
        id INTEGER PRIMARY KEY,
        token TEXT,
        password TEXT
      )
    ''');
  }

  saveToken(String token) async {
    await db.insert('Credential', <String, dynamic>{'token': token});
  }

  savePassword(String password) async {
    await db.insert('Credential', <String, dynamic>{'password': password});
  }

  getToken() async {
    var result = await db.query('Credential');
    print('Credential token database;$result');
    // await db.close();
  }

  getPassword() async {
    var result = await db.query('Credential');
    print('Credential password database;$result');
  }
}
Then I first initialized the database and saved the password to it from the login screen. But when I restart my application, I call the getPassword method from the initState of my splash screen:
Storages storage = new Storages();

@override
void initState() {
  super.initState();
  storage.getPassword();
}
and it throws this exception in my terminal:
Unhandled Exception: NoSuchMethodError: The method 'query' was called on null.
Receiver: null
Tried calling: query("Credential")
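The exception suggests db is still null: getPassword() runs on a fresh Storages instance without createDatabase() ever having been awaited, and an inMemoryDatabasePath database is discarded on restart anyway. One possible fix, sketched under those assumptions (the _ensureDb helper and the 'credentials.db' path are made up for illustration), is to open a file-backed database lazily from every method:
import 'package:sqflite_common/sqlite_api.dart';
import 'package:sqflite_common_ffi/sqflite_ffi.dart';

class Storages {
  Database? _db;

  // Open (and create) the database on first use so it is never null.
  Future<Database> _ensureDb() async {
    if (_db != null) return _db!;
    sqfliteFfiInit();
    _db = await databaseFactoryFfi.openDatabase('credentials.db');
    await _db!.execute('''
      CREATE TABLE IF NOT EXISTS Credential(
        id INTEGER PRIMARY KEY,
        token TEXT,
        password TEXT
      )
    ''');
    return _db!;
  }

  Future<void> savePassword(String password) async {
    final db = await _ensureDb();
    await db.insert('Credential', <String, dynamic>{'password': password});
  }

  Future<List<Map<String, Object?>>> getPassword() async {
    final db = await _ensureDb();
    return db.query('Credential');
  }
}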
How would you go about updating NetSuite through Salesforce?
I know that you would use NetSuite's RESTlets and Salesforce Apex code to connect the two, but how would you actually go about doing this in a step-by-step process?
To send data from Salesforce to NetSuite (specifically customer/account data), you will need to do some preliminary setup in both.
In NetSuite:
Create a RESTlet script that has, at the bare minimum, a get and a post.
For instance, I would create a JavaScript file on my desktop that contains:
/**
 * @NApiVersion 2.x
 * @NScriptType restlet
 */
//Use: Update NS customer with data (context) that is passed from SF
define(['N/record'], function(record) //use the record module
{
    function postData(context)
    {
        //load the customer I'm gonna update
        var cust = record.load({type: context.recordtype, id: context.id});
        log.debug("postData", "loaded the customer with NSID: " + context.id);

        //set some body fields
        cust.setValue("companyname", context.name);
        cust.setValue("entityid", context.name + " (US LLC)");
        cust.setValue("custentity12", context.formerName);
        cust.setValue("phone", context.phone);
        cust.setValue("fax", context.fax);

        //remove all addresses
        while (cust.getLineCount('addressbook') != 0)
            cust.removeLine('addressbook', 0);

        //add default billing address
        cust.insertLine('addressbook', 0);
        cust.setSublistValue('addressbook', 'defaultbilling', 0, true);
        cust.setSublistValue('addressbook', 'label', 0, 'BILL_TO');
        var billingAddress = cust.getSublistSubrecord('addressbook', 'addressbookaddress', 0);
        billingAddress.setValue('country', context.billingCountry);
        billingAddress.setValue('addr1', context.billingStreet);
        billingAddress.setValue('city', context.billingCity);
        billingAddress.setValue('state', context.billingState);
        billingAddress.setValue('zip', context.billingZip);

        //add default shipping address
        cust.insertLine('addressbook', 0);
        cust.setSublistValue('addressbook', 'defaultshipping', 0, true);
        cust.setSublistValue('addressbook', 'label', 0, 'SHIP_TO');
        var shippingAddress = cust.getSublistSubrecord('addressbook', 'addressbookaddress', 0);
        shippingAddress.setValue('country', context.shippingCountry);
        shippingAddress.setValue('addr1', context.shippingStreet);
        shippingAddress.setValue('city', context.shippingCity);
        shippingAddress.setValue('state', context.shippingState);
        shippingAddress.setValue('zip', context.shippingZip);

        //save the record
        var NSID = cust.save();
        log.debug("postData", "saved the record with NSID: " + NSID);
        return NSID; //success: return the ID to SF
    }

    //get and post both required, otherwise it doesn't work
    return {
        get: function() { return "get works"; },
        post: postData //this is where the sauce happens
    };
});
After you've saved this file, go to NetSuite > Customization > Scripting > Scripts > New.
Select the new file that you've saved and create the script record. Your script record in NetSuite should have GET and POST checked under Scripts.
Next, click deploy script and choose who will call this script, specifically the user that will log in on the Salesforce end into NetSuite.
On the deployment page you will need the External URL; it goes something like:
https://1234567.restlets.api.netsuite.com/app/site/hosting/restlet.nl?script=123&deploy=1
Note: If any data that you are updating is critical for a process, I would highly recommend creating this in the sandbox first for testing before moving to production.
In Salesforce sandbox:
Click YourName>Developer Console
In the developer console click File>New and create an Apex Class:
global class NetSuiteWebServiceCallout
{
    @future (callout=true) //allow RESTlet callouts to run asynchronously
    public static void UpdateNSCustomer(String body)
    {
        Http http = new Http();
        HttpRequest request = new HttpRequest();
        request.setEndPoint('https://1234567.restlets.api.netsuite.com/app/site/hosting/restlet.nl?script=123&deploy=1'); //external URL
        request.setMethod('POST');
        request.setHeader('Authorization', 'NLAuth nlauth_account=1234567, nlauth_email=login@login.com, nlauth_signature=password'); //login to NetSuite; this person must be included in the NS RESTlet deployment
        request.setHeader('Content-Type', 'application/json');
        request.setBody(body);
        HttpResponse response = http.send(request);
        System.debug(response);
        System.debug(response.getBody());
    }
}
You will have to set the endpoint here to the external URL, set nlauth_account to your NetSuite account number, and set the header's login email and password to a person who is on the deployment of the NS script. The body will be set in the trigger that calls this class.
Next, create the trigger that will call this class. I made this script run every time the account is updated in Salesforce.
trigger UpdateNSCustomer on Account (after update)
{
    for (Account a : Trigger.new)
    {
        String data = ''; //what to send to NS
        data = data + '{"recordtype":"customer","id":"' + a.Netsuite_Internal_ID__c + '","name":"' + a.Name + '","accountCode":"' + a.AccountCode__c + '",';
        data = data + '"formerName":"' + a.Former_Company_Names__c + '","phone":"' + a.Phone + '","fax":"' + a.Fax + '","billingStreet":"' + a.Billing_Street__c + '",';
        data = data + '"billingCity":"' + a.Billing_City__c + '","billingState":"' + a.Billing_State_Province__c + '","billingZip":"' + a.Billing_Zip_Postal_Code__c + '",';
        data = data + '"billingCountry":"' + a.Billing_Country__c + '","shippingStreet":"' + a.Shipping_Street__c + '","shippingCity":"' + a.Shipping_City__c + '",';
        data = data + '"shippingState":"' + a.Shipping_State_Province__c + '","shippingZip":"' + a.Shipping_Zip_Postal_Code__c + '","shippingCountry":"' + a.Shipping_Country__c + '"}';
        data = data.replaceAll('null', '').replaceAll('\n', ',').replace('\r', '');
        System.debug(data);
        NetSuiteWebServiceCallout.UpdateNSCustomer(data); //call the RESTlet
    }
}
In this script data is the body that you are sending to NetSuite.
Additionally, you will have to create an authorized endpoint for NetSuite in your remote site settings in Salesforce (sandbox and production). Go to Setup and quick-find 'Remote Site Settings', which is under Security Controls.
Create a new remote site with its remote site URL set to the first half of your external URL: https://1234567.restlets.api.netsuite.com.
From here, do some testing in the sandbox.
If all looks well, deploy the class and trigger to Salesforce production.
While implementing chat in a React web app using XMPP, the issue I've faced is getting sent messages stored in the MySQL DB instead of only in Mnesia.
I used the Strophe library to implement the chat part in the web environment.
I am totally new to this chat server.
Here is the workflow I'm following:
1. Connecting and authenticating to an ejabberd server.
2. Creating a MUC room with some users.
3. Sending a message to the already-created group. I designed the server so that it stores messages from the Mnesia DB into an archive table in MySQL.
4. Finally, getting the full roster list.
But I am stuck at the 3rd point: I can't see the message I sent to the created group in the archive table, and I think it is stored only temporarily in the Mnesia DB.
Please refer to the pasted code for sending a message:
var messagetype = (type) ? type : 'chat';
var reply;
if (messagetype === 'groupchat') {
    reply = window.$msg({
        to: messageTo,
        from: connection.jid,
        type: messagetype,
        id: connection.getUniqueId()
    }).c("body", {xmlns: window.Strophe.NS.CLIENT}).t(message);
}
else {
    reply = window.$msg({
        to: messageTo,
        from: connection.jid,
        type: messagetype
    }).c("body", {xmlns: window.Strophe.NS.CLIENT}).t(message);
}
connection.send(reply.tree());
console.log('I sent ' + messageTo + ': ' + message, reply.tree());
Please give me any ideas for fixing this issue.
/* Edited code */
connection.muc.join(messageTo, messageTo.split('@')[0], function(st) {
    console.log(st, 'status');
});
if (messagetype === 'groupchat') {
    reply = window.$msg({
        to: messageTo,
        from: connection.jid,
        type: messagetype,
        id: connection.getUniqueId()
    }).c("body").t(message).up()
      .c('request', {'xmlns': window.Strophe.NS.RECEIPTS});
} else {
    reply = window.$msg({
        to: messageTo,
        from: connection.jid,
        type: messagetype
    }).c("body").t(message).up();
}
connection.send(reply.tree());
console.log('I sent ' + messageTo + ': ' + message, reply.tree());
Before sending a message to the chat group, I join the MUC room; after that, I prepare the message content and send it.
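One thing worth checking, as a guess rather than something confirmed here: ejabberd only writes groupchat history to the SQL archive when mod_mam uses an SQL backend and the MUC room has MAM enabled. In ejabberd.yml that looks roughly like the fragment below (option names can vary between ejabberd versions):
# ejabberd.yml fragment -- assumes the MySQL backend is already configured
modules:
  mod_mam:
    db_type: sql     # archive to SQL instead of Mnesia
    default: always  # archive without requiring clients to opt in
  mod_muc:
    default_room_options:
      mam: true      # newly created rooms archive groupchat messages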
I'm trying to pull some data into a Google Sheets spreadsheet from an API that's been built using Google Cloud Endpoints. Here is the API declaration:
@Api(
    name = "myendpoint",
    namespace =
        @ApiNamespace
        (
            ownerDomain = "mydomain.com",
            ownerName = "mydomain.com",
            packagePath = "myapp.model"
        ),
    scopes = {SCOPES},
    clientIds = {ANDROID_CLIENT_ID, WEB_CLIENT_ID, API_EXPLORER_CLIENT_ID},
    audiences = {WEB_CLIENT_ID}
)
The method I'm trying to access has authentication enabled by means of the user parameter in the API declaration:
@ApiMethod(name = "ping", httpMethod = HttpMethod.GET, path = "ping")
public StringResponse getPing(User user) throws OAuthRequestException {
    CheckPermissions(user); //throws an exception if the user is null or doesn't have the correct permissions
    return new StringResponse("pong");
}
This works fine when using the generated client libraries or the gapi JS library. However, AFAIK I can't use those JS libraries in Apps Script.
I've got an OAuth2 flow working using the apps-script-oauth2 library, and I'm pretty much using the default setup for creating the service:
function getRuggedService() {
  // Create a new service with the given name. The name will be used when
  // persisting the authorized token, so ensure it is unique within the
  // scope of the property store.
  return OAuth2.createService(SERVICE_NAME)
      // Set the endpoint URLs, which are the same for all Google services.
      .setAuthorizationBaseUrl('https://accounts.google.com/o/oauth2/auth')
      .setTokenUrl('https://accounts.google.com/o/oauth2/token')
      // Set the client ID and secret, from the Google Developers Console.
      .setClientId(CLIENT_ID)
      .setClientSecret(CLIENT_SECRET)
      // Set the name of the callback function in the script referenced
      // above that should be invoked to complete the OAuth flow.
      .setCallbackFunction('ruggedAuthCallback')
      // Set the property store where authorized tokens should be persisted.
      .setPropertyStore(PropertiesService.getUserProperties())
      // Set the scopes to request (space-separated for Google services).
      .setScope(SCOPES)
      // Below are Google-specific OAuth2 parameters.
      // Sets the login hint, which will prevent the account chooser screen
      // from being shown to users logged in with multiple accounts.
      .setParam('login_hint', Session.getActiveUser().getEmail())
      // Requests offline access.
      .setParam('access_type', 'offline')
      // Forces the approval prompt every time. This is useful for testing,
      // but not desirable in a production application.
      .setParam('approval_prompt', 'auto');
      //.setParam('include_granted_scopes', 'true');
}
These are my methods for accessing the APIs
function getDriveDocs() {
  return executeApiMethod('https://www.googleapis.com/drive/v2/', 'files?maxResults=10');
}

function pingServer() {
  return executeApiMethod('https://myapp.appspot.com/_ah/api/myendpoint/v1/', 'ping');
}

function executeApiMethod(apiUrl, method) {
  var url = apiUrl + method;
  var service = getRuggedService();
  return UrlFetchApp.fetch(url, {
    'muteHttpExceptions': true,
    'method': 'get',
    'headers': {
      Authorization: 'Bearer ' + service.getAccessToken()
    }
  });
}
The getDriveDocs() method works perfectly, so I know my auth flow is working correctly. Also, if I call an unauthenticated method in my API, I get the correct response. However, when I call the authenticated 'ping' method, the 'user' parameter is always null. Am I missing something in the fetch call? Everything I've read so far seems to suggest that setting
Authorization: 'Bearer ' + service.getAccessToken()
should be enough.
Any help would be much appreciated!
This turned out to be a simple mistake: I had created a new OAuth2 credential in the Google dev console and had not added the new client ID to the API declaration. Here is the working API declaration:
@Api(
    name = "myendpoint",
    namespace =
        @ApiNamespace
        (
            ownerDomain = "mydomain.com",
            ownerName = "mydomain.com",
            packagePath = "myapp.model"
        ),
    scopes = {SCOPES},
    clientIds = {ANDROID_CLIENT_ID, WEB_CLIENT_ID, API_EXPLORER_CLIENT_ID, GAPPS_CLIENT_ID},
    audiences = {WEB_CLIENT_ID}
)
I have some sample code that successfully connects to SQL Server using a SQL Server user name and password. But I was wondering if there is a way to use integrated security with this script: basically, use the logged-in user's credentials without supplying a password in the script.
var sql = require('mssql');

var config = {
    server: '127.0.0.1',
    database: 'master',
    user: 'xx',
    password: 'xxx',
    options: {
        trustedConnection: true
    }
};

var connection = new sql.Connection(config, function(err) {
    // ... error checks
    if (err) {
        return console.log("Could not connect to sql: ", err);
    }
    // Query
    var request = new sql.Request(connection);
    request.query('select * from dbo.spt_monitor (nolock)', function(err, recordset) {
        // ... error checks
        console.dir(recordset);
    });
    // Stored Procedure
});
Wish I could add this as a comment, but I don't have enough reputation yet... what happens when you run this without providing a username/password in the config object?
Windows Authentication happens at the login level, so there is no need to provide it at the application level.
I just browsed the documentation, and it looks like you cannot provide a raw connection string to connect; instead, you would want to build something that looks like this:
var connectionString= 'Server=MyServer;Database=MyDb;Trusted_Connection=Yes;'
The source code of the mssql module is here: https://github.com/patriksimek/node-mssql/blob/master/src/msnodesql.coffee... maybe you can fork it and submit a pull request providing an optional flag for Windows Authentication; that flag would remove Uid={#{user}};Pwd={#{password}} (unneeded for Windows Authentication) from the CONNECTION_STRING_PORT variable in the module's source code.
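For what it's worth, a sketch of the Windows Authentication route that newer versions of the mssql module document: wrap the msnodesqlv8 driver and set trustedConnection, so no user/password is needed. This assumes npm install mssql msnodesqlv8 and uses the newer connect API rather than the old sql.Connection one:
var sql = require('mssql/msnodesqlv8');

var config = {
    server: '127.0.0.1',
    database: 'master',
    options: {
        trustedConnection: true // use the logged-in Windows user's credentials
    }
};

sql.connect(config, function(err) {
    if (err) {
        return console.log('Could not connect to sql: ', err);
    }
    new sql.Request().query('select * from dbo.spt_monitor (nolock)',
        function(err, result) {
            if (err) {
                return console.log('Query failed: ', err);
            }
            console.dir(result.recordset);
        });
});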