A good way to structure my database app's RESTful hierarchy, and pushing to an array - AngularJS

I use Firebase with Ionic (AngularJS) and this is my first time using Firebase. I'm thinking about my database hierarchy and I'm a bit confused.
myapp {
  "gyms": {
    "0": {},
    "1": {},
    "2": {}
  },
  "users": {
    "0": {},
    "1": {},
    "2": {}
  },
  "sessions": {
    "0": {
      "user_id": {
        "session_id": {
          "routes": {
            "0": {},
            "1": {},
            "2": {}
          }
        }
      }
    },
    "1": {},
    "2": {}
  }
}
What I need with REST:
// GYMS
gyms/
GET all gym objects
POST: I manually add a new gym with id 0, 1, 2, ...
gyms/:gym_id
GET the gym corresponding to the id
example: gyms/0
gyms/:gym_id/users
GET all users from a specific gym
// USERS
users/
GET all users
POST create a new user; I would like a numeric user_id like 0, 1, 2, ...
users/:user_id
GET the user corresponding to the id
// SESSIONS
sessions/
GET all sessions objects
sessions/:user_id
GET all sessions from a specific user
POST create a new session for the user
sessions/:user_id/:session_id
GET a specific session from a specific user
sessions/:user_id/:session_id/routes
GET all routes from a specific session and user
POST create a new route for the session
sessions/:user_id/:session_id/routes/:route_id
GET a specific route from a specific session and user
To summarize: gyms contain users, users have sessions, and each session contains routes. Each user is associated with a gym, and each session belongs to a user. This is my first time designing a JSON database, so is this a good structure?
I said above that I want IDs like 0, 1, 2 so the RESTful URLs look like sessions/12/4/routes/8. But I have seen the article about arrays and push() with unique IDs generated by Firebase, like "-JyNAzZHIoMEpBe39a55". What is the function for pushing an object onto an array so that it gets an ID like 0, 1, 2? URLs like sessions/JyNAzZHIoMEpBe39a55/JyNAzZHIoMEpBe39a57/routes/JyNAzZHIoMEpBe39a57 are less human-readable, no?
Sorry for my bad English!

Sequential indices may be more readable. Whether they are better depends on your aims. Firebase was designed for highly scalable multi-user systems, where sequential indices are likely to become a bottleneck as many users add items to the database concurrently.
From the Firebase blog post on our push IDs:
We created push IDs for collaborative, multi-user applications to handle situations where many clients are adding to a list at the same time. We needed IDs that could be generated client-side so that they wouldn’t need to wait for a round-trip to the server. And we needed them to be unique so that many clients could append data to the same location at the same time without worrying about conflicts.
You're free to come up with your own scheme, of course, but this summarizes why Firebase recommends using push IDs.
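To make the trade-off concrete: the conflict that push IDs avoid can be sketched in a few lines of plain JavaScript. This is a toy illustration only, not Firebase's actual implementation; `nextIndexKey` and `toyPushId` are made-up names.

```javascript
// Two clients each read the current list, compute "next index = length",
// and write. Because they read the same snapshot concurrently, they
// target the same key and one write silently overwrites the other.
function nextIndexKey(list) {
  return String(Object.keys(list).length);
}

const serverList = { "0": "a", "1": "b" };

// Both clients see the same snapshot before either write lands.
const keyFromClientA = nextIndexKey(serverList); // "2"
const keyFromClientB = nextIndexKey(serverList); // "2" -- same key: conflict

// A push-ID-style key (timestamp + random suffix) can be generated
// client-side without a round-trip, and two clients get distinct keys.
// This is a toy generator, not Firebase's real algorithm.
function toyPushId() {
  return Date.now().toString(36) + "-" + Math.random().toString(36).slice(2, 8);
}
const keyA = toyPushId();
const keyB = toyPushId();
console.log(keyFromClientA === keyFromClientB); // true: sequential keys collide
```

In the real SDK, `ref.push(value)` generates such a chronologically ordered unique key for you.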

Related

Journey builder's custom activity: Fetch data extension data in bulk

I am new to Salesforce Marketing Cloud and Journey Builder.
https://developer.salesforce.com/docs/marketing/marketing-cloud/guide/creating-activities.html
We are building a custom Journey Builder activity that uses a data extension as its source; when the journey is invoked, it fetches a row and sends that data to our company's internal endpoint. The team got that part working. We are using postmonger.js.
I have a couple of questions:
1. Is there a way to retrieve the data from the data extension in bulk, so that we can call our company's internal bulk endpoint? Calling the endpoint once per record in the data extension would not be efficient enough for our use case and won't work.
2. When the journey is invoked, an entry from the data extension is retrieved, and that data is sent to our internal endpoint, is there a mechanism to mark the entry as already sent, so that the next time the journey runs it won't process entries that have already been sent?
Here is a snippet of our customActivity.js, which populates one record (I changed some variable names). Is there a way to populate multiple records, so that when "execute" is called it passes a list of payloads as input to our internal endpoint?
function save() {
  try {
    var TemplateNameValue = $('#TemplateName').val();
    var TemplateIDValue = $('#TemplateID').val();
    let auth = "{{Contact.Attribute.Authorization.Value}}";
    payload['arguments'].execute.inArguments = [{
      "vendorTemplateId": TemplateIDValue,
      "field1": "{{Contact.Attribute.DD.field1}}",
      "eventType": TemplateNameValue,
      "field2": "{{Contact.Attribute.DD.field2}}",
      "field3": "{{Contact.Attribute.DD.field3}}",
      "field4": "{{Contact.Attribute.DD.field4}}",
      "field5": "{{Contact.Attribute.DD.field5}}",
      "field6": "{{Contact.Attribute.DD.field6}}",
      "field7": "{{Contact.Attribute.DD.field7}}",
      "messageMetadata": {}
    }];
    payload['arguments'].execute.headers = `{"Authorization":"${auth}"}`;
    payload['configurationArguments'].stop.headers = `{"Authorization":"default"}`;
    payload['configurationArguments'].validate.headers = `{"Authorization":"default"}`;
    payload['configurationArguments'].publish.headers = `{"Authorization":"default"}`;
    payload['configurationArguments'].save.headers = `{"Authorization":"default"}`;
    payload['metaData'].isConfigured = true;
    console.log(payload);
    connection.trigger('updateActivity', payload);
  } catch (err) {
    document.getElementById("error").style.display = "block";
    // innerHTML, not innerHtml
    document.getElementById("error").innerHTML = err;
  }
  console.log("Template Name: " + JSON.stringify(TemplateNameValue));
  console.log("Template ID: " + JSON.stringify(TemplateIDValue));
}
Any advice or ideas are highly appreciated!
Thank you.
Grace
Firstly, I implore you not to proceed with the design pattern of fetching data from Marketing Cloud for each subscriber that gets sent through the custom activity. For argument's sake, I'll list two big issues.
You have no way of locking down the configuration of data extension columns or column names in SFMC (Salesforce Marketing Cloud). If a malicious user, or simple human error, deleted a column or changed a column name, your service would stop receiving that value.
Secondly, Marketing Cloud has two sets of API limitations: yearly and per minute. Depending on your licensing, you could run into the yearly limit.
The problem with the per-minute limits (2,500 calls for REST and 2,000 for SOAP) is that every use of the custom activity in Journey Builder multiplies the number of invocations per minute. Hitting this limit would cause issues for incremental data flows into SFMC.
I'd also suggest not retrieving any data from Marketing Cloud when a customer is sent through a custom activity. Instead, users should pick which rows/data should be sent to the custom activity in their segmentation.
The eventDefinitionKey can be picked up from Postmonger by listening for requestedTriggerEventDefinition and reading it from the eventDefinitionModel argument. The eventDefinitionKey can then be used to programmatically populate the POST call with data from the Journey Data model, allowing marketers to select which data is sent with the subscriber.
The following code shows how this would work in your customActivity.js:
connection.on(
  'requestedTriggerEventDefinition',
  function (eventDefinitionModel) {
    var eventKey = eventDefinitionModel['eventDefinitionKey'];
    save(eventKey);
  }
);

function save(eventKey) {
  // subscriberKey fetched directly from the Contact model
  // columnName is populated from the Journey Data model
  var params = {
    subscriberKey: '{{Contact.key}}',
    columnName: '{{Event.' + eventKey + '.columnName}}',
  };
  payload['arguments'].execute.inArguments = [params];
}

Azure B2C Active Directory: Update one property on all User

In my current project I'm using Microsoft's Azure B2C Active Directory.
My plan is to update a specific property (testClaim) on every single user.
What I'm actually doing is loading all the users in my AD and updating each of them in a foreach loop.
var requestBody = new SetTestClaimRequest
{
    ClaimName = "testClaim",
    Value = "thisIsATestValue"
};

var client = new RestClient("myRes");
var request = new RestRequest(Method.PUT);
request.AddJsonBody(requestBody);
The problem I'm facing is that the Graph API begins to block my requests after just a few, answering with the following error:
Error Calling the Graph API:
{
  "odata.error": {
    "code": "Request_ThrottledTemporarily",
    "message": {
      "lang": "en",
      "value": "Your request is throttled temporarily. Please try after 150 seconds."
    },
    "requestId": "ccf8a936-490e-4c4a-87aa-125157b2e6dd",
    "date": "2020-04-17T12:37:44"
  }
}
Is there a way to avoid this without throttling my requests?
In my opinion, throttling my own requests isn't an option, because it would take multiple hours to update the number of users I'm dealing with.
No, there is no way to bypass the throttling limits. It may take some hours to process at the accepted rate; try a maximum of 1,000 operations per minute. Make sure to implement back-off logic if you get an HTTP 429.
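The back-off logic can be sketched as follows (shown in JavaScript for brevity; the same loop structure applies to the C# foreach above). The delay values are illustrative, and `callApi` is a placeholder for the real Graph API request, not an actual API:

```javascript
// Retry a throttled call with exponential back-off.
// callApi: async function resolving to an object with a numeric `status`.
async function withBackoff(callApi, maxRetries = 5, baseDelayMs = 1000) {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const res = await callApi();
    if (res.status !== 429) return res; // not throttled: done
    // Wait 1x, 2x, 4x, ... the base delay before retrying.
    const delay = baseDelayMs * 2 ** attempt;
    await new Promise(resolve => setTimeout(resolve, delay));
  }
  throw new Error(`still throttled after ${maxRetries} retries`);
}
```

If the response carries a concrete hint (the error above says "try after 150 seconds"), prefer that wait time over the computed delay.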

Firebase Database - Share Data When UID Unknown?

In Firebase Database, if user_a has data they can access and they want to share this data with user_b, what is the best practice and database structure for securely sharing this data between specific users?
Important: user_a doesn't have any information about user_b account, e.g. uid.
Detailed Example:
1) user_a has a list of clientele.
"users": {
  "user_a": {
    "clients": [
      "clientUid",
      "clientUid2"
    ]
  }
},
"clients": {
  "clientUid": {
    "firstName": "John",
    "lastName": "Doe"
  },
  "clientUid2": {
    "firstName": "Joe",
    "lastName": "Bloggs"
  }
}
2) user_b signs up. user_a now wants to share user_b data in clients with the user_b account that signed up.
Put another way: user_a has a list of clients, one of them creates an account and needs to link to the information user_a has already entered for them.
An important element here is that there is no list of users or accounts from which a 'friend request' could be made to ask for more data. I have experimented with creating a short unique ID that a user can enter when they sign up to access their data, but I am unsure whether this is a good way forward and have encountered issues.
This is different from previously asked 'shared data' questions because it focuses on how the link between the two users is made, not just the before and after.
I know of two ways:
Either the users must both know a certain value, known as a shared secret.
Or you allow the users to search for each other.
Shared secret
A shared secret can easily be modeled in the database as a location that you can read once you know it. Your short unique ID is an example of such a shared secret, where user_a "determines" the secret and then sends it to user_b out of band.
A simple flow:
1. user_a clicks a Get secret button in the app.
2. The app determines a secret code, e.g. correct-horse-battery-staple, and:
2a. writes the secret code to the database in a non-queryable-but-readable location, together with the actual information to share, in your case the UID of user_a;
2b. shows the secret code to user_a.
3. user_a copies the code and sends it to user_b via some other communication mechanism, e.g. text, email, or chat.
4. user_b clicks the Enter secret button in the app and pastes the secret.
5. The app reads that location, and is able to read the UID of user_a.
The data structure for this could be something like:
"secrets": {
  "correct-horse-battery-staple": "uidOfUserA"
}
Which you secure with these rules:
{
  "rules": {
    "secrets": {
      "$secret": {
        ".read": true
      }
    }
  }
}
So the above rules don't allow anyone to read /secrets itself, hence nobody can get a list of all secrets. But once someone knows a specific secret, they can look up the data associated with it.
Allowing search
The alternative is to have a list of users in your database, and allow user_b to search on some property that they know from user_a. A common property might be their email address, or their display name, which you could store in the database like:
"users": {
  "uidOfUserA": {
    "email": "userA@domain.com",
    "displayName": "user A"
  },
  "uidOfUserB": {
    "email": "userB@anotherdomain.com",
    "displayName": "user B"
  }
}
Allowing users to search for each other is a bit more involved: it will require server-side code. The reason is that searching over a dataset requires read access to that dataset, and in Firebase, if you can read a dataset, you can read all data under it. In other words, to let a user search you'd need to make all of /users readable, and allowing this would disclose too much information.
So to implement a search securely, you'll need to have rules like this:
{
  "rules": {
    "users": {
      "$uid": {
        ".read": true,
        ".write": "auth.uid === $uid"
      }
    }
  }
}
In words: anyone can read the profile for a user for whom they know the UID, but each individual user can only modify their own profile.
With this, you can implement a Cloud Function that then takes the email or display name as input, and searches across all /users. Since Cloud Functions access the database with administrative privileges, they are not restricted by the above security rules.
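The lookup inside such a Cloud Function is simple once the function can read all of /users with administrative privileges. A minimal sketch of just the search logic (the Cloud Function wiring and admin SDK setup are omitted, and `findUidByEmail` is an illustrative name, not a Firebase API):

```javascript
// Given the full /users map (readable only with admin privileges)
// and an email address, return the matching UID, or null.
function findUidByEmail(users, email) {
  for (const [uid, profile] of Object.entries(users)) {
    if (profile.email === email) return uid;
  }
  return null;
}

// Mirrors the /users structure shown above.
const users = {
  uidOfUserA: { email: "userA@domain.com", displayName: "user A" },
  uidOfUserB: { email: "userB@anotherdomain.com", displayName: "user B" },
};
console.log(findUidByEmail(users, "userA@domain.com")); // "uidOfUserA"
```

The function then returns only the matching UID (or a small profile subset) to the caller, never the whole /users dataset.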

How should Watson Conversation say good afternoon based on the time zone?

If the user logs in to the website in the morning, Watson should say Good morning!
If the user logs in to the website in the afternoon, Watson should say Good afternoon!
If the user logs in to the website in the evening, Watson should say Good evening!
I've written it like this:
{
  "conditions": "now().before('12:00:00')",
  "output": {
    "text": {
      "values": [ "Good morning!" ]
    }
  }
}
But after closing the JSON editor, the code changes to this:
{
  "output": {
    "text": {
      "values": [
        "Good morning!"
      ]
    }
  }
}
Can anyone please say what the solution is? Please provide the entire code for good morning, good afternoon, and good evening.
You can't define conditions in the JSON editor, so it deletes any field that is not part of the schema.
You can set the condition within the tooling UI in the IF statement section; just paste in your condition there. As the functionality has recently changed, you will need to do the following:
On the Welcome node, click the "Customise" cog and select "Allow multiple responses".
Then set your conditions at each response.
If you are using the workspace API, I recommend exporting your workspace to see how a node block is correctly structured. Alternatively, you can check the API spec:
https://www.ibm.com/watson/developercloud/conversation/api/v1/#create_workspace
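As a sketch, the three responses on the Welcome node could use conditions like these. The 12:00/18:00 boundaries are assumptions (adjust to taste), and note that now() evaluates on the server, so if per-visitor time zones matter you may need to supply the user's local time or time zone from the client:

```
Response 1 - IF: now().before('12:00:00')   -> "Good morning!"
Response 2 - IF: now().before('18:00:00')   -> "Good afternoon!"
Response 3 - IF: true                       -> "Good evening!"
```

Responses are evaluated top to bottom, so the second condition only fires between 12:00 and 18:00, and the unconditional third response catches everything from 18:00 onward.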

Is it possible to process JSON and access its parameters when using Service Bus?

I have seen that it is possible to add a JSON schema when using the "HTTP Request" trigger, by adding the schema in the "Request Body JSON Schema" box.
I have also looked at adding a schema in the "Integration Account", but that section of the documentation says it is "to confirm that XML documents you receive are valid", which is not what I am looking for.
I am using an Azure Service Bus queue.
In this case I am using PeekLock as the trigger. The idea is that the input on the Service Bus will always be in a certain format, all of it JSON. I don't care or need to know what happens before the Service Bus; all I know is that each message will have the same format. My logic app is supposed to receive the message from the Service Bus, mail it to whoever it is supposed to go to, and attach anything there is to add from blob storage. I want to be able to access certain "tags" or "parameters", since Service Bus only has a few tags of its own.
I used jsonschema.net to generate the schema; here is the JSON showing what a message will look like:
{
  "items": [
    {
      "Key": "XXXXXX-XXXX-XXXX-XXXX-XXXXXXX",
      "type": "Email",
      "data": {
        "subject": "Who is the father?",
        "bodyBlobUID": "00000000-0000-0000-0000-000000000000",
        "to": [
          "darth.vader@hotmail.com"
        ],
        "cc": [
          "luke.skywalker@nomail.com"
        ],
        "bcc": [
          "leia.skywalker@nomail.com"
        ],
        "encoding": "System.Text.UTF8Encoding",
        "isBodyHtml": false,
        "organisationUID": "00000000-0000-0000-0000-000000000000",
        "BlobUIDs": [
          "luke.skywalker@nomail.com"
        ]
      }
    }
  ]
}
So my question has two parts:
1. Is it possible to add JSON schemas without using the HTTP Request trigger, when using Service Bus?
2. If #1 is possible, or it can be done in another way, how do I access the tags or parameters of the JSON? At the moment I am trying to do transformations using schemas and maps with the Integration Account, but it seems unnecessary.
UPDATE: Parse JSON is now available in Logic Apps.
We will be releasing an action called Parse JSON next week: you can specify the Service Bus output as the payload, define the schema of the payload, and then use custom friendly tokens in subsequent steps.
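A sketch of what that can look like in the workflow definition, assuming the Service Bus trigger delivers the message body base64-encoded in ContentData (a common shape for Service Bus triggers; the schema here is abbreviated from the one above):

```
"Parse_JSON": {
  "type": "ParseJson",
  "inputs": {
    "content": "@base64ToString(triggerBody()?['ContentData'])",
    "schema": {
      "type": "object",
      "properties": {
        "items": { "type": "array" }
      }
    }
  }
}
```

Subsequent actions can then reference tokens such as @body('Parse_JSON')?['items'][0]?['data']?['subject'] instead of parsing the raw message.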
