Journey Builder's custom activity: Fetch data extension data in bulk - Salesforce

I am new to Salesforce Marketing Cloud and journey builder.
https://developer.salesforce.com/docs/marketing/marketing-cloud/guide/creating-activities.html
We are building a Journey Builder custom activity that uses a data extension as the entry source; when the journey runs, it fetches a row and sends that data to our company's internal endpoint. The team got that part working. We are using postmonger.js.
I have a couple of questions:
Is there a way to retrieve the data from the data extension in bulk so that we can call our company's internal bulk endpoint? Calling the endpoint once per record in the data extension would not be efficient enough for our use case and won't work.
When the journey is invoked, an entry in the data extension is retrieved, and that data is sent to our internal endpoint, is there a mechanism to mark this entry as already sent so that the next time the journey runs, it won't process that entry again?
Here is a snippet of our customActivity.js that populates one record (I changed some variable names). Is there a way to populate multiple records so that when "execute" is called, it passes a list of payloads as input to our internal endpoint?
function save() {
    try {
        var TemplateNameValue = $('#TemplateName').val();
        var TemplateIDValue = $('#TemplateID').val();
        let auth = "{{Contact.Attribute.Authorization.Value}}";

        // One contact's data; the data-binding strings are resolved at send time
        payload['arguments'].execute.inArguments = [{
            "vendorTemplateId": TemplateIDValue,
            "field1": "{{Contact.Attribute.DD.field1}}",
            "eventType": TemplateNameValue,
            "field2": "{{Contact.Attribute.DD.field2}}",
            "field3": "{{Contact.Attribute.DD.field3}}",
            "field4": "{{Contact.Attribute.DD.field4}}",
            "field5": "{{Contact.Attribute.DD.field5}}",
            "field6": "{{Contact.Attribute.DD.field6}}",
            "field7": "{{Contact.Attribute.DD.field7}}",
            "messageMetadata": {}
        }];

        payload['arguments'].execute.headers = `{"Authorization":"${auth}"}`;
        payload['configurationArguments'].stop.headers = `{"Authorization":"default"}`;
        payload['configurationArguments'].validate.headers = `{"Authorization":"default"}`;
        payload['configurationArguments'].publish.headers = `{"Authorization":"default"}`;
        payload['configurationArguments'].save.headers = `{"Authorization":"default"}`;
        payload['metaData'].isConfigured = true;

        console.log(payload);
        connection.trigger('updateActivity', payload);
    } catch (err) {
        document.getElementById("error").style.display = "block";
        document.getElementById("error").innerHTML = err; // innerHTML, not innerHtml
    }
    console.log("Template Name: " + JSON.stringify(TemplateNameValue));
    console.log("Template ID: " + JSON.stringify(TemplateIDValue));
}
Any advice or ideas are highly appreciated!
Thank you.
Grace

Firstly, I implore you not to proceed with the design pattern of fetching data from Marketing Cloud for each subscriber that gets sent through the custom activity. For argument's sake, I'll list two big issues.
You have no way of locking down the configuration of data extension columns or column names in SFMC (Salesforce Marketing Cloud). If a malicious user, or simple human error, were to delete a column or change a column name, your service would stop receiving that value.
Secondly, Marketing Cloud has two sets of API limits: yearly and per minute. Depending on your licensing, you could run into the yearly limit.
The problem with the per-minute limits (2,500 calls for REST and 2,000 for SOAP) is that each use of the custom activity in Journey Builder multiplies the number of invocations per minute. Hitting this limit would cause issues for incremental data flows into SFMC.
I'd also suggest not retrieving any data from Marketing Cloud when a contact is sent through the custom activity. Users should pick which corresponding rows/data should be sent to the custom activity in their segmentation.
The eventDefinitionKey can be picked up from Postmonger via requestedTriggerEventDefinition in the eventDefinitionModel callback. The eventDefinitionKey can then be used to programmatically populate SFMC's POST call with data from the Journey Data model, allowing marketers to select what data is sent with the subscriber.
Following is some code showing how this would work in your customActivity.js:
connection.on(
    'requestedTriggerEventDefinition',
    function (eventDefinitionModel) {
        var eventKey = eventDefinitionModel['eventDefinitionKey'];
        save(eventKey);
    }
);

function save(eventKey) {
    // subscriberKey fetched directly from the Contact model
    // columnName is populated from the Journey Data model
    var params = {
        subscriberKey: '{{Contact.key}}',
        columnName: '{{Event.' + eventKey + '.columnName}}',
    };
    payload['arguments'].execute.inArguments = [params];
}
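To illustrate the receiving side of that payload, here is a rough, untested sketch of an execute handler, assuming a Node.js/Express service. The route path and the request shape are assumptions based on a typical config.json setup, so check them against the custom activity docs and your own configuration. Note that Journey Builder invokes the execute endpoint for each contact passing through the activity, so if your internal API is bulk-only, batching has to happen on your side (for example by buffering or queueing these payloads before forwarding them).
// Minimal sketch, assuming an Express app and no JWT verification
// (if useJwt is enabled in config.json, decode and verify the JWT first).
const express = require('express');
const app = express();
app.use(express.json());

app.post('/activity/execute', (req, res) => {
    // inArguments holds the values defined in customActivity.js,
    // already resolved for the contact currently passing through the activity
    const inArguments = (req.body && req.body.inArguments) || [];
    const args = Object.assign({}, ...inArguments);

    console.log('subscriberKey:', args.subscriberKey);
    console.log('columnName:', args.columnName);

    // Forward to your internal endpoint here, or buffer/queue the payloads
    // if you need to call a bulk endpoint downstream.
    res.status(200).json({ status: 'ok' });
});

app.listen(3000);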

Related

Firestore Security rules with payment form

I'm creating a raffle website. The user connects his wallet and pays for a raffle ticket. After the blockchain transaction is confirmed, I add his raffle ticket to a collection in Firestore.
This causes a security issue: if my Firebase security rules allow the user to write to the raffle ticket collection, he could create his own tickets without paying.
I need tickets to be added to the database only if payment has been successfully made.
I don't know how websites that take payments handle this. Maybe Firebase isn't a good solution?
My project is in React/TypeScript.
You say you do the payment over the blockchain, and I assume you use Solidity as your smart contract language?
Why don't you emit an event in your smart contract?
You then listen for these events on a (separate) server, which updates your (Firebase) database whenever an event is emitted.
(Untested) Sample Code:
How do you emit events in Solidity? (raffle.sol)
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

contract Raffle {
    event PaymentCompletion(address buyer, uint256 amountOfTickets);

    function buyTickets() external payable {
        emit PaymentCompletion(msg.sender, msg.value);
    }
}
How do you listen to these events?
When using web3.js:
const contract = new web3.eth.Contract(CONTRACT_ABI, CONTRACT_ADDRESS);
const latestBlock = await web3.eth.getBlockNumber();

// paymentEvents is an array containing the payment events of the last 500 blocks.
const paymentEvents = await contract.getPastEvents(
    'PaymentCompletion', // change if you're looking for a different event
    { fromBlock: latestBlock - 500, toBlock: 'latest' }
);
Now iterate through these events and put them into your database. You can also set up a subscription that notifies you whenever a new block is created, so you can check whether new events were included in that block.
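If you prefer the subscription approach over polling getPastEvents, an (untested) sketch could look like the following. It assumes web3.js is connected through a WebSocket provider, since HTTP providers don't support subscriptions, and it reuses the CONTRACT_ABI and CONTRACT_ADDRESS constants from above.
const Web3 = require('web3');
// Placeholder endpoint; subscriptions require a WebSocket provider
const web3 = new Web3('wss://your-node-url');

const contract = new web3.eth.Contract(CONTRACT_ABI, CONTRACT_ADDRESS);

contract.events.PaymentCompletion()
    .on('data', (event) => {
        // the emitted arguments are available on event.returnValues
        const { buyer, amountOfTickets } = event.returnValues;
        console.log(buyer, amountOfTickets);
        // write the payment into Firebase here, as shown below
    })
    .on('error', console.error);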
This is what it would look like to add the first blockchain event to the Firebase Realtime Database.
var db = admin.database();
var ref = db.ref("/payments");
// ...
ref.child("path/to/transaction").set({
    // web3.js exposes the emitted event arguments on returnValues
    buyer: paymentEvents[0].returnValues.buyer,
    amountOfTickets: paymentEvents[0].returnValues.amountOfTickets,
    // put the rest of your data here
}, (err) => {
    if (err) {
        console.error(err);
    }
});
Alternatively (if you don't want to handle the payment on the blockchain), you could also take a look at Stripe; it has a Firebase plugin for easy integration (but I've never tried it out). However, in my opinion, using the blockchain to handle the payment would be the cleanest solution (plus you avoid the fees Stripe charges).
I hope I could give you some good clues! Firebase should definitely be suitable for this.

How to update NetSuite through Salesforce?

How would you go about updating NetSuite through Salesforce?
I know that you would use a NetSuite RESTlet and Salesforce Apex code to connect the two, but how would you actually go about doing this in a step-by-step process?
To send data from Salesforce to NetSuite (specifically customer/account data) you will need to do some preliminary setup in both.
In NetSuite:
Create a RESTlet script that has, at a bare minimum, a get and a post handler.
For instance, I would create a JavaScript file on my desktop that contains:
/**
 * @NApiVersion 2.x
 * @NScriptType Restlet
 */
// Use: Update an NS customer with data (context) that is passed from SF
define(['N/record'], function (record) { // use the record module
    function postData(context) {
        // load the customer to update
        var cust = record.load({ type: context.recordtype, id: context.id });
        log.debug("postData", "loaded the customer with NSID: " + context.id);

        // set some body fields
        cust.setValue("companyname", context.name);
        cust.setValue("entityid", context.name + " (US LLC)");
        cust.setValue("custentity12", context.formerName);
        cust.setValue("phone", context.phone);
        cust.setValue("fax", context.fax);

        // remove all addresses
        while (cust.getLineCount('addressbook') != 0)
            cust.removeLine('addressbook', 0);

        // add default billing address
        cust.insertLine('addressbook', 0);
        cust.setSublistValue('addressbook', 'defaultbilling', 0, true);
        cust.setSublistValue('addressbook', 'label', 0, 'BILL_TO');
        var billingAddress = cust.getSublistSubrecord('addressbook', 'addressbookaddress', 0);
        billingAddress.setValue('country', context.billingCountry);
        billingAddress.setValue('addr1', context.billingStreet);
        billingAddress.setValue('city', context.billingCity);
        billingAddress.setValue('state', context.billingState);
        billingAddress.setValue('zip', context.billingZip);

        // add default shipping address
        cust.insertLine('addressbook', 0);
        cust.setSublistValue('addressbook', 'defaultshipping', 0, true);
        cust.setSublistValue('addressbook', 'label', 0, 'SHIP_TO');
        var shippingAddress = cust.getSublistSubrecord('addressbook', 'addressbookaddress', 0);
        shippingAddress.setValue('country', context.shippingCountry);
        shippingAddress.setValue('addr1', context.shippingStreet);
        shippingAddress.setValue('city', context.shippingCity);
        shippingAddress.setValue('state', context.shippingState);
        shippingAddress.setValue('zip', context.shippingZip);

        // save the record
        var NSID = cust.save();
        log.debug("postData", "saved the record with NSID: " + NSID);
        return NSID; // success: return the ID to SF
    }

    // get and post are both required, otherwise it doesn't work
    return {
        get: function () { return "get works"; },
        post: postData // this is where the sauce happens
    };
});
After you've saved this file, go to NetSuite > Customization > Scripting > Scripts > New.
Select the file that you've saved and create the script record. Your script record in NetSuite should have GET and POST checked under scripts.
Next, click Deploy Script and choose who will call this script, specifically the user that will log in to NetSuite from the Salesforce end.
On the deployment page you will need the External URL; it looks something like:
https://1234567.restlets.api.netsuite.com/app/site/hosting/restlet.nl?script=123&deploy=1
Note: If any data that you are updating is critical for a process, I would highly recommend building this in the sandbox first for testing before moving to production.
In the Salesforce sandbox:
Click Your Name > Developer Console.
In the Developer Console, click File > New and create an Apex class:
global class NetSuiteWebServiceCallout
{
    @future (callout=true) // allow RESTlet callouts to run asynchronously
    public static void UpdateNSCustomer(String body)
    {
        Http http = new Http();
        HttpRequest request = new HttpRequest();
        request.setEndPoint('https://1234567.restlets.api.netsuite.com/app/site/hosting/restlet.nl?script=123&deploy=1'); // external URL
        request.setMethod('POST');
        request.setHeader('Authorization', 'NLAuth nlauth_account=1234567, nlauth_email=login@login.com, nlauth_signature=password'); // NetSuite login; this user must be included in the NS RESTlet deployment
        request.setHeader('Content-Type', 'application/json');
        request.setBody(body);

        HttpResponse response = http.send(request);
        System.debug(response);
        System.debug(response.getBody());
    }
}
You will have to set the endpoint here to your External URL, set nlauth_account to your NetSuite account number, and set the Authorization header to the login email and password of a user who is included in the deployment of the NS script. The body will be set in the trigger that calls this class.
Next, create the trigger that will call this class. I made it run every time an Account is updated in Salesforce.
trigger UpdateNSCustomer on Account (after update)
{
    for (Account a : Trigger.new)
    {
        String data = ''; // what to send to NS
        data = data + '{"recordtype":"customer","id":"' + a.Netsuite_Internal_ID__c + '","name":"' + a.Name + '","accountCode":"' + a.AccountCode__c + '",';
        data = data + '"formerName":"' + a.Former_Company_Names__c + '","phone":"' + a.Phone + '","fax":"' + a.Fax + '","billingStreet":"' + a.Billing_Street__c + '",';
        data = data + '"billingCity":"' + a.Billing_City__c + '","billingState":"' + a.Billing_State_Province__c + '","billingZip":"' + a.Billing_Zip_Postal_Code__c + '",';
        data = data + '"billingCountry":"' + a.Billing_Country__c + '","shippingStreet":"' + a.Shipping_Street__c + '","shippingCity":"' + a.Shipping_City__c + '",';
        data = data + '"shippingState":"' + a.Shipping_State_Province__c + '","shippingZip":"' + a.Shipping_Zip_Postal_Code__c + '","shippingCountry":"' + a.Shipping_Country__c + '"}';
        data = data.replaceAll('null', '').replaceAll('\n', ',').replace('\r', '');

        System.debug(data);
        NetSuiteWebServiceCallout.UpdateNSCustomer(data); // call the RESTlet
    }
}
In this script, data is the body that you are sending to NetSuite.
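For reference, the assembled body that reaches the RESTlet ends up looking roughly like this (the values are made up for illustration); the property names are exactly what postData reads from context:
{
    "recordtype": "customer",
    "id": "1234",
    "name": "Example Company",
    "accountCode": "EX001",
    "formerName": "Example LLC",
    "phone": "555-0100",
    "fax": "555-0101",
    "billingStreet": "1 Main St",
    "billingCity": "Austin",
    "billingState": "TX",
    "billingZip": "78701",
    "billingCountry": "US",
    "shippingStreet": "1 Main St",
    "shippingCity": "Austin",
    "shippingState": "TX",
    "shippingZip": "78701",
    "shippingCountry": "US"
}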
Additionally, you will have to create an authorized endpoint for NetSuite in your Remote Site Settings in Salesforce (sandbox and production). Go to Setup and use Quick Find to locate Remote Site Settings, which is under Security Controls.
Create a new remote site with its Remote Site URL set to the first half of your External URL: https://1234567.restlets.api.netsuite.com.
From here, do some testing in the sandbox.
If all looks well, deploy the class and trigger to Salesforce production.

Twitter4j: Getting tweets takes too long

Hello, I have tweet IDs that I saved to a database earlier. But I noticed that I could not save the created time correctly (it is saved like 00:00:00). Therefore I want to update my tweets by tweet ID using the following code.
MyConnectionBuilder myConnection = new MyConnectionBuilder();
Twitter twitter = new TwitterFactory(myConnection.configuration.build()).getInstance();
Status status = twitter.showStatus(Long.parseLong(tweetId));
But it takes too much time to fetch the tweets. Is there a rate limit for this? If there is a rate limit, how can I make it faster?
Updating every single tweet via showStatus wastes your "credits" for a given time window (rate limit).
To update multiple tweets, you should use lookup, which accepts a maximum of 100 IDs per request; if you have more IDs than that, split them into batches of 100 and call lookup once per batch. This call uses the /statuses/lookup endpoint.
Rate-limit and endpoint documentation can be found in Twitter's API documentation.
Code snippet for it:
// ids is assumed to be a Long[] of tweet IDs; ArrayUtils.toPrimitive
// (Apache Commons Lang) converts it to the long[] that lookup expects
Twitter twitter = twitterFactory.getInstance();
ResponseList<Status> responseList = twitter.lookup(ArrayUtils.toPrimitive(ids));
if (responseList != null) {
    for (Status status : responseList) {
        // do what you need to do here
    }
}

firebase 3.0 on query

I am building a project where I want to extract a list from a query using Firebase 3.0. I am quite new to this, but I imagine there is a simple answer to my question.
I have this structure:
requests : {
    luke1 : {
        1 : {
            .../...
            users : {
                0 : {
                    username : joseph,
                    answered : 0
                },
                1 : {
                    username : mark,
                    answered : 1
                }
            }
        }
    }
}
Basically the logged-in user (luke1) sends a request to a number of users (joseph and mark), and let's say I'm logged in as user joseph.
I want to get a list of the requests sent to joseph which have not been answered yet.
var ref = firebase.database().ref("requests/");
I want to know how I can write the query.
Thanks for taking the time to read this; if you need more information from my end, please let me know.
When using Firebase (and most NoSQL databases), you will often find that you end up modeling the data for the way your app wants to consume it.
So with your current data model, you can easily get the requests sent by a specific user.
ref.child("requests/luke1").on("child_added", ...
But you cannot yet easily find the requests sent to a specific user. To allow querying for that data easily, you could add an inverted data structure to your database:
received: {
    joseph: {
        0: {
            from: luke1,
            answered: 0
        }
    }
}
Now you can easily get joseph's unanswered requests with:
ref.child("received/joseph").orderByChild("answered").equalTo(0).on("child_added", ...
Your initial response is likely that this sort of data duplication is bad. But it's actually quite common in NoSQL databases.
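If it helps, here is a rough sketch of how to keep both locations in sync with a single multi-location update, so the original request and its inverted copy are written together. The paths and field names follow the structures above, with the request and receiver keys simplified for illustration:
function sendRequest(fromUser, toUser, requestId) {
    var updates = {};
    // original structure: requests/<sender>/<requestId>/users/<receiver>
    updates['requests/' + fromUser + '/' + requestId + '/users/' + toUser] = {
        username: toUser,
        answered: 0
    };
    // inverted structure: received/<receiver>/<requestId>
    updates['received/' + toUser + '/' + requestId] = {
        from: fromUser,
        answered: 0
    };
    // multi-location update: both paths succeed or fail together
    return firebase.database().ref().update(updates);
}
For example, sendRequest('luke1', 'joseph', 1) would write both entries at once.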
There are many more ways to model this structure. For a great introduction to the topic, I recommend this article on NoSQL data modeling.
To achieve this kind of query, you need to store the current user's ID in a variable. After doing that, try something like the following query:
var ref = firebase.database().ref("requests").child(currentUserId).child("users");
If I'm not wrong, it will return the data that you want.

How to design a RESTful API with the right semantics?

For instance, when selling a subscription to a user, the system will:
create an organisation
create a user
create a subscription
create an authentication
send out an email
more operations based on business logic
And ALL of the above need to happen in the SAME DB transaction, as a unit of work.
In SOAP semantics, this can be abstracted as register(organisation, user, plan, authentication details, ...more parameters), which returns a subscription object.
But in the RESTful world, we only deal with resources (only nouns in the URL) and HTTP verbs, and I find it very hard to describe business logic like this rather than simple CRUD.
There is no requirement that RESTful interfaces map 1:1 to the database behind the API.
The logic in your case could be:
client -- POST: SubscriptionRequests(request) --> Server
client <-- RESPONSE: Status|Error -- Server
Upon success, the Status response could contain properties with URIs to the resulting new entries, such as SubscriptionURI = "Subscriptions/ID49343" and UserURI = "Users/User4711".
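For illustration, the body of that Status response could look something like this (the property names are only a suggestion, not a prescribed format):
{
    "status": "created",
    "subscriptionURI": "Subscriptions/ID49343",
    "userURI": "Users/User4711"
}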
And then someone could later on ask about active subscriptions with:
client -- GET: Subscriptions --> Server
client <-- RESPONSE: Subscriptions | Error -- Server
This scheme could be considered RESTful. There is no problem with the fact that the server has to manipulate a database (invisible to the client), or with how it does that.
There is also no problem with subsequent GET operations on the Subscriptions resource (and the Users resource, for that matter) yielding different output than before the SubscriptionRequest operation was executed.
There is also no compelling reason to create a more chatty interface just because you happen to have a certain database model behind it.
In that sense, it would be worse if you created an API like:
client -- POST: Users(newUser) --> Server
client <-- RESPONSE: Status|Error -- Server
(if adding user worked bla bla ... )
client -- POST: Subscriptions(userId,other data..) --> Server
client <-- RESPONSE: Status|Error -- Server
That would basically just mean you did not design your API but simply copied the structure of the database tables behind it (and those will change next week).
In summary, it is not the business of API design to care about how the implementation handles the database. Whether you need transactions, or use some other way to make sure everything that needs to be done gets done, is up to the implementation of that SubscriptionRequests POST handler.
In fact, you are thinking in RPC mode ;-)
With REST, you must think in terms of resources and representations. What you want to do is add a subscription, so I would suggest having a list resource for subscriptions with a POST method that implements the registration. In the request payload, you provide what you need for the subscription and get back hints about the created subscription.
Here is a sample of the request:
POST /subscriptions/
{
    "organization": {
        "id": "organizationId",
        "name": "organization name",
        (...)
    },
    "user": {
        "lastName": "",
        (...)
    }
}
Here is a sample of the response:
{
    "id": "subscriptionId",
    "credentials": {
        (...)
    },
    (...)
}
Note that the payloads are proposals and perhaps don't exactly match your subscription, user, ... structures, so feel free to adapt them.
Hope it helps you,
Thierry
