Act upon response of AutoSync insert operation - extjs

I have a Store with autoSync: true.
When loading the store, I'm getting complete models:
[{"id":11,"active":true,"name":"ABC","sens":7,"details":119,"type":13,"acl":false,"create":true,"delete":true,"owner":"alexander","members":"gustave\njerome"}]
When syncing a new model to the server, I'll send it with "id":0, so the server knows it has to create a new one. The server will then respond with {"success":true,"data":[12],"debug":[]}, where 12 is the id of the newly created entry.
Now I have to add a callback function for the autoSync operation to patch the ids I receive back into the store.
If I had synced manually, this would have been easy:
Ext.getStore("RightsStore").sync({
    success: function() {
    }
});
But how can I get a special success or a callback function for insert syncs into a store that works with autoSync?

If the server sent {"success":true, "data":[{"id":12, ...}]} then you would not need to do anything. It's best if the server sends back the complete records it received for the CRUD operation, with the updated data (in the same order). Ext takes care of the rest.
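For illustration, a response to the create in the question could look like the one below, echoing the whole record back with the new id (the field values are only placeholders, and the exact envelope depends on your proxy's reader configuration):
{"success":true,"data":[{"id":12,"active":true,"name":"DEF","sens":7,"details":120,"type":13,"acl":false,"create":true,"delete":true,"owner":"alexander","members":""}],"debug":[]}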

If you're unable to change the server output as @Saki mentioned, you can just listen to the load event and update the records with the new id there.
store.on('load', me.loaded, me);
loaded: function(store, records) {
}
More details here - http://docs.sencha.com/extjs/4.2.2/#!/api/Ext.data.Store-event-load
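If you want a hook on the sync itself rather than on a subsequent load, an ExtJS 4.2 store also fires a write event after every successful sync (autoSync calls sync() internally, so it fires there too). A minimal sketch of that alternative, using the store from the question:
Ext.getStore("RightsStore").on('write', function (store, operation) {
    if (operation.action === 'create') {
        // records that were just created; if the server echoed the new ids
        // back in its response, the reader has already applied them here
        Ext.Array.each(operation.getRecords(), function (record) {
            console.log('created record, id is now', record.getId());
        });
    }
});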

Related

Journey builder's custom activity: Fetch data extension data in bulk

I am new to Salesforce Marketing Cloud and journey builder.
https://developer.salesforce.com/docs/marketing/marketing-cloud/guide/creating-activities.html
We are building journey builder's custom activity in which it will use a data extension as the source and when the journey builder is invoked, it will fetch a row and send this data to our company's internal endpoint. The team got that part working. We are using the postmonger.js.
I have a couple of questions:
Is there a way to retrieve the data from data extension in bulk so that we can call our company's internal bulk endpoint? Calling the endpoint for each record in the data extension for our use case would not be efficient enough and won't work.
When the journey is invoked and an entry in the data extension is retrieved and that data is sent to our internal endpoint, is there a mechanism to mark this entry as already sent, so that the next time the journey runs it won't process the entry that has already been sent?
Here is a snippet of our customActivity.js which populates one record (I changed some variable names). Is there a way to populate multiple records, so that when "execute" is called it passes a list of payloads as input to our internal endpoint?
function save() {
    try {
        var TemplateNameValue = $('#TemplateName').val();
        var TemplateIDValue = $('#TemplateID').val();
        let auth = "{{Contact.Attribute.Authorization.Value}}";
        payload['arguments'].execute.inArguments = [{
            "vendorTemplateId": TemplateIDValue,
            "field1": "{{Contact.Attribute.DD.field1}}",
            "eventType": TemplateNameValue,
            "field2": "{{Contact.Attribute.DD.field2}}",
            "field3": "{{Contact.Attribute.DD.field3}}",
            "field4": "{{Contact.Attribute.DD.field4}}",
            "field5": "{{Contact.Attribute.DD.field5}}",
            "field6": "{{Contact.Attribute.DD.field6}}",
            "field7": "{{Contact.Attribute.DD.field7}}",
            "messageMetadata": {}
        }];
        payload['arguments'].execute.headers = `{"Authorization":"${auth}"}`;
        payload['configurationArguments'].stop.headers = `{"Authorization":"default"}`;
        payload['configurationArguments'].validate.headers = `{"Authorization":"default"}`;
        payload['configurationArguments'].publish.headers = `{"Authorization":"default"}`;
        payload['configurationArguments'].save.headers = `{"Authorization":"default"}`;
        payload['metaData'].isConfigured = true;
        console.log(payload);
        connection.trigger('updateActivity', payload);
    } catch (err) {
        document.getElementById("error").style.display = "block";
        document.getElementById("error").innerHTML = err;
    }
    console.log("Template Name: " + JSON.stringify(TemplateNameValue));
    console.log("Template ID: " + JSON.stringify(TemplateIDValue));
}
});
Any advice or ideas are highly appreciated!
Thank you.
Grace
Firstly, I implore you not to proceed with the design pattern of fetching data from Marketing Cloud for each subscriber that gets sent through the custom activity. For argument's sake, I'll list two big issues.
You have no way of limiting the configuration of data extension columns or column names in SFMC (Salesforce Marketing Cloud). If a malicious user, or plain human error, deleted a column or changed a column name, your service would stop receiving that value.
Secondly, Marketing Cloud has two sets of API limitations, yearly and per-minute. Depending on your licensing, you could run into the yearly limit.
The problem you have with the per-minute limitation (2,500 calls for REST and 2,000 for SOAP) is that each use of the custom activity in Journey Builder multiplies the number of invocations per minute. Hitting this limit would cause issues for incremental data flows into SFMC.
I'd also suggest not retrieving any data from Marketing Cloud when a customer gets sent through a custom activity. Users should instead pick, in their segmentation, which corresponding rows/data should be sent to the custom activity.
The eventDefinitionKey can be picked up from Postmonger through the requestedTriggerEventDefinition event, whose handler receives the eventDefinitionModel. The eventDefinitionKey can then be used to programmatically populate SFMC's POST call with data from the Journey Data model, thus allowing marketers to select what data is sent with the subscriber.
The following code shows how it would work in your customActivity.js:
connection.on(
    'requestedTriggerEventDefinition',
    function (eventDefinitionModel) {
        var eventKey = eventDefinitionModel['eventDefinitionKey'];
        save(eventKey);
    }
);

function save(eventKey) {
    // subscriberKey fetched directly from Contact model
    // columnName is populated from the Journey Data model
    var params = {
        subscriberKey: '{{Contact.key}}',
        columnName: '{{Event.' + eventKey + '.columnName}}',
    };
    payload['arguments'].execute.inArguments = [params];
}
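Note that Postmonger only fires requestedTriggerEventDefinition in response to an explicit request, so the handler above is typically wired up during initialization. A minimal sketch of that wiring, assuming the usual Postmonger setup in customActivity.js (connection being the Postmonger.Session the file already creates):
var connection = new Postmonger.Session(); // usually already present in customActivity.js

$(window).ready(function () {
    // tell Journey Builder the configuration iframe is ready
    connection.trigger('ready');
    // ask for the trigger event definition; the response arrives through the
    // 'requestedTriggerEventDefinition' handler shown above
    connection.trigger('requestTriggerEventDefinition');
});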

Not getting a notification when trying to use real-time database communications

I'm trying to use web sockets to add a new notification to my app. I'm using Backand as my server, and can't seem to get a grasp on responding to the new event. Here's my server side code to send the event:
//Action: Update - After data saved and committed
function backandCallback(userInput, dbRow, parameters, userProfile) {
//Send to array of users
socket.emitUsers("new_notification",userInput, ["admins"]);
return {};
}
And my client code to receive it:
//Wait for server updates on 'items' object
Backand.on('new_notification', function (data) {
//Get the 'items' object that have changed
console.log(data);
});
But when I try to receive the event in my app code, I never see a notification of type "new_notification". Can someone please let me know what's going on?
Make sure you are using emitRole instead of emitUsers. The code above emits to a user array, but it looks like you want to send a notification to the "admins" role. Change this to emitRole() and you should be ok.
For roles, note that the name is case sensitive, so it will be:
socket.emitRole("items_updated",userInput, "Admin");
For users the syntax is:
socket.emitUsers("items_updated", userInput, ["user2@gmail.com","user1@gmail.com"]);
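Putting the two pieces together, a minimal sketch of the corrected server-side action from the question, switched to emitRole with the case-sensitive role name (assuming the role in your Backand app is actually called "Admin"):
//Action: Update - After data saved and committed
function backandCallback(userInput, dbRow, parameters, userProfile) {
    // emit to everyone in the "Admin" role instead of to a user array
    socket.emitRole("new_notification", userInput, "Admin");
    return {};
}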

How to store and load the connections of jsplumb (Using AngularJS and jsPlumb)

Can anyone please help me out with how to store and load the connections and anchor placements in jsPlumb with AngularJS?
Please check the image link below:
(http://i.stack.imgur.com/YC1PR.jpg)
The first one is the image without connections, the second one with connections.
When I reload the page, I still need to get the connections back in the same places.
Whenever a connection is established, the "connection" event is triggered. You need to store the connection's endpoint details in that handler so that you can retrieve them later.
First make sure that you have set proper ids (uuids) for your endpoints. You can set them manually at endpoint creation time:
var e0 = jsPlumb.addEndpoint("div1",{uuid:"div1_ep1"}), // You can also set uuid based on element it is placed on
e1 = jsPlumb.addEndpoint("div2",{uuid:"div2_ep1"});
Now bind the connection event where you will store the established connections info:
var uuid = [], index = 0; // array to store the endpoint pairs
jsPlumb.bind("connection", function(ci) {
    var eps = ci.connection.endpoints;
    console.log(eps[0].getUuid() + "->" + eps[1].getUuid()); // store this information in a 2d array or any other format you wish
    uuid[index] = [];
    uuid[index][0] = eps[0].getUuid(); // source endpoint id
    uuid[index++][1] = eps[1].getUuid(); // target endpoint id
});
You can convert the array to JSON and store it on the server side. When the page is refreshed, you need to retrieve the JSON data and restore the connections. To connect endpoints based on their uuids, make use of:
jsPlumb.connect({ uuids:["div1_ep1","div2_ep1"] });
Here is the jsFiddle for making connections based on endpoints.
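For the restore step, a minimal sketch, assuming the saved pairs come back from the server as a JSON array of [sourceUuid, targetUuid] pairs (savedJson is a placeholder for whatever your AngularJS service returns):
var savedPairs = JSON.parse(savedJson); // e.g. [["div1_ep1","div2_ep1"], ...]
jsPlumb.ready(function () {
    savedPairs.forEach(function (pair) {
        // the endpoints must have been re-created with the same uuids
        // (via addEndpoint, as above) before connecting them
        jsPlumb.connect({ uuids: [pair[0], pair[1]] });
    });
});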

Persisting a backbone model and the id

Let's assume we create a new model class and instantiate a person using it:
var User = Backbone.Model.extend({
    urlRoot: '/user'
});
var nou = new User({
    name: "nourdine"
});
Now, of course, we want to persist it. Since we haven't set an id, Backbone will issue a POST request, telling the server we intend to create an entity under /user containing the data {name: "nourdine"}. Here's how we do it:
nou.save(null, {
    success: function (model, response, options) {
        // ... what do I do here?
    }
});
The server will now create a record in the DB containing the JSON data, rearranged in some form, and assign an ID to it. NOW:
1 - What is the server supposed to return in the HTTP response? A JSON containing the JSON provided by the client plus the newly created fields, namely the ID of the new record?
2 - Who is going to update the model on the client with this data? Me? As a matter of fact, I would like to tell the model on the client that a new ID has been assigned to it by the server, so that the next time I do user.save() I get a PUT rather than a POST. But who is supposed to update the model on the client?
Thanks
So this is my workflow for this:
client -> create model and populate with data
client -> save model (model.save())
server -> create server-side version of the model using the data, assign an id
server -> respond with success and the id of the newly created model
client -> in the success callback, set the id to the one passed back
Now the only potential issue I have with my workflow is that if something did not get set on the server successfully but the model was still created, my client model would no longer reflect that of the server; but I minimize this by returning an error if the model could not be created exactly as passed.
And now I am able to call model.save() again, this time having the id, so it initiates a PUT request.
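A minimal sketch of that workflow on the client side, assuming the server answers the POST with JSON such as {"id": 12}:
nou.save(null, {
    success: function (model, response, options) {
        // if the response body is valid JSON, Backbone has already merged it
        // into the model; setting the id by hand is only needed when it isn't
        if (model.isNew()) {
            model.set('id', response.id);
        }
        // from here on, nou.save() issues PUT /user/12 instead of POST /user
    }
});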
From the documentation for Backbone.Model:
After a successful server-side save, the client is (optionally) updated with the server-side state.
So if you return valid JSON, your model will be updated automatically.
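For example, if the POST above were answered with a body like the one below (values illustrative), Backbone would merge the id into the model on its own:
{"id": 12, "name": "nourdine"}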

Backbone - Handle empty response from the server

I use fetch to retrieve my models from the server. It works great when I have data in my DB, but how do I handle an empty response sent by the server to the client?
For example, if no data has been saved by the user yet, the server sends an HTTP 200 response with an empty array and Backbone triggers the error callback, but I just want to inform the user that there is no data saved in the DB. In this case an empty response just means there is no model to load, and I don't want to create any model from the response.
Here is the code I use:
app.plans.fetch({
    success: function(data) {
        app.Notifications.updateMessages({text: "Plans loaded."});
    },
    error: function () {
        app.Notifications.updateMessages({text: "Error."});
    }
});
How can the server indicate that everything is all right but there is just no data?
If you are responsible for the back end and you can change the response behavior, you should use HTTP status 204 (No Content) for an empty response instead of 200.
This should help.
If the server responds with a 200, it shouldn't trigger the error callback. That's weird.
Could you please add an example of the server response?
Also, the success and error callbacks take the following arguments: (collection, response, options).
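Building on that signature, a minimal sketch of a fetch call that treats an empty (but successful) response as "no data yet" instead of an error (the notification texts are placeholders):
app.plans.fetch({
    success: function (collection, response, options) {
        if (collection.length === 0) {
            app.Notifications.updateMessages({text: "No plans saved yet."});
        } else {
            app.Notifications.updateMessages({text: "Plans loaded."});
        }
    },
    error: function (collection, response, options) {
        app.Notifications.updateMessages({text: "Error."});
    }
});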
