We have data returned from a REST service in the JSON format below, and I want to group the data by ActivityStartDate and then display it grouped that way. I can iterate through the results and build groups by ActivityStartDate, but that does not seem like an efficient method if more items are returned.
Is there a better way to achieve this?
{
"results": [
{
"Id": 1,
"Title": "Food Promotion - 1",
"ActivityStartDate": "2015-12-12T08:00:00Z",
"ActivityEndDate": "2016-01-12T08:00:00Z",
"ActivityDescription": "Two for one if dream and do not dream of fack promotions",
"ShowInHistory": true,
"ShowUpdated": true,
"Modified": "2015-12-14T21:28:37Z",
"Created": "2015-12-14T21:28:37Z"
}
]
}
Reason to group in front end
Assume you can get a collection from the back end via GET /activities. If the returned JSON array is either small or pre-sorted, it may simply be more flexible to group late (in the front end). This division of labor between front end and back end follows the separation of concerns:
The terms front end and back end refer to the separation of concerns between the presentation layer (front end) and the data access layer (back end).
Limitations: the data should be small, so that it can be fetched completely and sorted independently in the front end. Large data sets, by contrast, can lead to performance issues when loading/grouping; they should be pre-sorted on the server, which can be achieved if your REST API (back end) supports sorting/paging for collection resources.
How to group in Angular
See the Stack Overflow question "typescript1.7 - How to group data in Angular 2?" for advice on Angular's pipes, demonstrated on Plunker in the answer.
Similarly, for the older AngularJS there is a filter component that achieves the same goal:
transformation/grouping/filtering of data.
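Whether it ends up in a pipe, a filter, or plain controller code, the grouping itself can be done in a single linear pass, so it stays cheap even for larger result sets. A minimal sketch against the JSON shown in the question (grouping on the date part of ActivityStartDate, rather than the full timestamp, is an assumption about the desired grouping key):
// Minimal sketch: group results by the date part of ActivityStartDate.
function groupByStartDate(results) {
  return results.reduce(function (groups, item) {
    var key = item.ActivityStartDate.substring(0, 10); // e.g. "2015-12-12"
    (groups[key] = groups[key] || []).push(item);
    return groups;
  }, {});
}

// usage: var grouped = groupByStartDate(response.results);
Each item is visited exactly once, so the cost grows linearly with the number of items returned.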
Intro
I have a Firestore database similar to a social media DB, with 3 collections: Users, Events, and EventUpdates. My goal is to create a feed with event updates created by me and my friends, so I have to expand my database with friendship connections. But I am struggling with 3 problems, and hopefully somebody here can push me in the right direction to solve them.
Problem/Question 1:
I added the username and user image to the EventUpdate model so it's easier to query. I've heard denormalisation is the way to go in a NoSQL database. But if a user updates his user image, I have to update all eventUpdates created by that user. That sounds like something you don't want to do. Is there a better way to do this?
Problem/Question 2:
How can I create a data structure that is optimised for the following query: get eventUpdates from me and my friends, ordered by date?
Problem/Question 3:
How do I store likes? I can keep a counter in an eventUpdate, but this becomes a problem when I denormalise eventUpdates (see the current solution underneath EDIT).
Data structure example:
{
"users": {
"1": { "name": "Jack", "imageUrl": "http://lorempixel.nl" }
},
"events": {
"A": {
"name": "BeerFestival",
"date": "2018/09/05",
"creatorId": "1"
}
},
"eventUpdates": {
"1": {
"timestamp": "13243543",
"creatorId: "1",
"creatorName": "Jack",
"creatorImageUrl": "http://lorempixel.nl",
"eventId": "A",
"message": "Lorem ipsum"
}
}
}
EDIT
OK, after some trial and error I ended up with the following structure. This structure seems to work, but my problem with this solution is that I need to make a lot of write calls to update a single eventUpdate because of all the copies in each feed (1000 followers means 1000 copies). And it looks like I need to do that a lot.
For example, I would like to add a like button to each event update. This triggers an update on all EventUpdate copies. To me it looks like Firebase is not suited to my project, and I'm thinking of replacing it with a SQL DB. Can anyone here change my mind with a better solution?
{
"users": {
"user1": { "name": "Jack",
"imageUrl": "http://lorempixel.nl",
"followers": ["user1"]
}
},
"feeds": {
"user1": {
"eventUpdates": {
"1": {
"timestamp": "13243543",
"creatorId: "1",
"eventId": "A",
"message": "Lorem ipsum"
}
},
"following": {
"user1": {
"name": "Jack",
"imageUrl": "http://lorempixel.nl",
"followers": ["user1"]
}
}
    }
  },
  "events": {
    "A": {
      "name": "BeerFestival",
      "date": "2018/09/05",
      "creatorId": "1"
    }
  }
}
I added the username and user image to the EventUpdate model so it's easier to query. I've heard denormalisation is the way to go in a NoSQL database.
That's right, denormalization is a common practice when it comes to Firebase. If you are new to NoSQL databases, I recommend you watch the video Denormalization is normal with the Firebase Database for a better understanding. It is for the Firebase Realtime Database, but the same rules apply to Cloud Firestore.
But if a user updates his user image, I have to update all eventUpdates created by that user. That sounds like something you don't want to do. Is there a better way to do this?
Yes, that's also correct: you need to update all the places where that image exists. Because you have chosen google-cloud-firestore as a tag, I recommend you see my answer from this post, because in the case of many write operations Firestore might be a little costly. Please also see the Firestore pricing plans.
Regarding Firestore, instead of holding an entire object you can hold only a reference to the picture. In that case there is nothing you need to update. It's always a trade-off between these two techniques, and unfortunately there is no middle way: you either hold objects or only references to objects. For that, please see my answer from this post.
How can I create a data structure that is optimised for the following query: get eventUpdates from me and my friends, ordered by date?
As far as I can see, your schema is more a Firebase Realtime Database schema than a Cloud Firestore one. To answer your question: yes, you can. Speaking of Firestore, you can create a collection named eventUpdates that holds eventUpdate objects, and to query it according to a timestamp, a query like this is needed:
FirebaseFirestore rootRef = FirebaseFirestore.getInstance();
CollectionReference eventUpdatesRef = rootRef.collection("eventUpdates");
Query query = eventUpdatesRef.orderBy("timestamp", Query.Direction.ASCENDING);
But please note that the timestamp field should be of type Date and not long. Please also take a look at my answer from this post to see how you can add a date property in a Cloud Firestore database.
How do I store likes? I can keep a counter in an eventUpdate, but this becomes a problem when I denormalise eventUpdates (see the current solution underneath EDIT).
You can simply add likes, but I recommend you see the last part of my answer from this post: you might consider keeping that count in the Firebase Realtime Database rather than in Cloud Firestore. The two databases work very well together.
This structure seems to work, but my problem with this solution is that I need to make a lot of write calls to update a single eventUpdate because of all the copies in each feed (1000 followers means 1000 copies). And it looks like I need to do that a lot.
You might also take a look at my answer from this post.
To me it looks like Firebase is not suited to my project, and I'm thinking of replacing it with a SQL DB. Can anyone here change my mind with a better solution?
I don't see it that way. There are many apps out there that have exactly the same mechanism as yours and are working very well.
If you want your feed items to stay in sync with the real user data (a new profile image when the user changes it, for example), you can simply store the user ID in the eventUpdate document. This way you don't have to keep them in sync manually; every time you have to display an item in the feed, you can simply fetch the user data, and you can easily query many eventUpdates on the userId and created_at fields (assuming you have them).
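As a rough sketch of that query with the Firestore web SDK (the userId and created_at field names come from the paragraph above, and currentUserId is a hypothetical variable holding the viewer's UID):
// Hedged sketch, Firestore web SDK (v8-style namespaced API).
var db = firebase.firestore();

db.collection("eventUpdates")
  .where("userId", "==", currentUserId)  // one author; friends need one query each or an "in" query
  .orderBy("created_at", "desc")         // where() + orderBy() on different fields requires a composite index
  .limit(20)
  .get()
  .then(function (snapshot) {
    snapshot.forEach(function (doc) {
      console.log(doc.id, doc.data());
    });
  });
For "me and my friends" you would either run this query once per friend ID and merge the results client-side, or use an in query on userId (which Firestore limits to a small number of values per query).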
To implement likes in your feed, the right solution depends on a number of things, such as traffic volume.
The simplest way is to update a likes field with a transaction, but Firestore limits the update frequency on a single document to roughly once per second. Plus, a transaction can easily fail if more than 5 transactions are trying to update the same document.
To implement a more solid likes system take a look at this page from the official Firebase docs.
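As a minimal, hedged sketch of that transaction approach with the Firestore web SDK (the likes field name and the eventUpdateId variable are assumptions for illustration):
// Hedged sketch: increment a "likes" counter inside a transaction (web SDK, v8-style API).
var updateRef = firebase.firestore().collection("eventUpdates").doc(eventUpdateId);

firebase.firestore().runTransaction(function (transaction) {
  return transaction.get(updateRef).then(function (doc) {
    var newLikes = (doc.data().likes || 0) + 1;
    transaction.update(updateRef, { likes: newLikes });
  });
});
For higher write volumes, the distributed-counter pattern described on that docs page spreads the count over several shard documents so no single document has to absorb all the writes.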
Firestore takes a different approach to the NoSQL world. Once you know the data you will use (as you already do), there are some very important decisions to make about how the data will be structured, and they depend a lot on how the data grows, what kinds of queries you will need, and how often you will run them. In some cases you can create a root collection that aggregates data, and queries become easier.
There is a great video from the Firebase channel that might help. Check it out:
How to Structure Your Data | Get to Know Cloud Firestore #5
https://www.youtube.com/watch?v=haMOUb3KVSo
[UPDATED] December 26th
Other videos that might help you model and query your data:
How to Connect Firebase Users to their Data - 3 Methods
https://www.youtube.com/watch?v=jm66TSlVtcc
How to NOT get a 30K Firebase Bill
https://www.youtube.com/watch?v=Lb-Pnytoi-8
Model Relational Data in Firestore NoSQL
https://www.youtube.com/watch?v=jm66TSlVtcc
I'm building a chat app with Firebase (and AngularJS), and I have a data structure similar to the one on this Firebase documentation page. This structure is good for not having to retrieve huge amounts of unneeded data, but there is one very simple thing I don't seem to understand.
With my data looking like below, when a user connects to my app:
How do I retrieve their 10 most recently updated groups and keep this list updated as new messages are posted in groups?
// An index to track Ada's memberships
{
"users": {
"alovelace": {
"name": "Ada Lovelace",
// Index Ada's groups in her profile
"groups": {
// the value here doesn't matter, just that the key exists
"techpioneers": true,
"womentechmakers": true
}
},
...
},
"groups": {
"techpioneers": {
"name": "Historical Tech Pioneers",
"members": {
"alovelace": true,
"ghopper": true,
"eclarke": true
},
"lastUpdateTime": [SOME TIMESTAMP HERE]
},
...
}
}
More information if you care to read
As you can see, I've added "lastUpdateTime": [SOME TIMESTAMP HERE] to the code above, because that's how I do it in my app. I can't figure out what the "refresh process" for a user's group list should be.
If my user has 100 groups, should I retrieve a list of the group IDs and get the actual groups one by one to be able to only keep the 10 most recent (I'm pretty sure this is not the way to go)?
Also, whenever someone posts a message in a group it will update the lastUpdateTime in Firebase, but how do I keep the user's group list synchronized with this?
I've tried some very ugly combinations of child events and orderBys, as well as entire chains of functions executing whenever something fires, but it doesn't work and seems extremely complicated for nothing. The whole idea of flattening the data is to keep queries/fetching to a minimum, and I feel that what I have done so far is way too heavy.
To show the list of the 10 most recently updated groups:
ref.child("groups").orderByChild("lastUpdateTime").limitToLast(10)
If you use this approach, please flatten your data further, since the query will now end up retrieving the members of each group, which is not needed for displaying a list of groups.
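To keep that list live as messages are posted, you can attach child listeners to the same query: with a limitToLast window, Firebase fires child_added when a group enters the top 10, child_changed when its data updates, and child_removed when it drops out. A minimal sketch with the Realtime Database JavaScript SDK (the UI helper functions are hypothetical):
var recentGroupsQuery = firebase.database()
  .ref("groups")
  .orderByChild("lastUpdateTime")
  .limitToLast(10);

recentGroupsQuery.on("child_added", function (snapshot) {
  addOrUpdateGroupInList(snapshot.key, snapshot.val()); // hypothetical UI helper
});
recentGroupsQuery.on("child_changed", function (snapshot) {
  addOrUpdateGroupInList(snapshot.key, snapshot.val());
});
recentGroupsQuery.on("child_removed", function (snapshot) {
  removeGroupFromList(snapshot.key); // hypothetical UI helper
});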
If you want a list of the groups the user is subscribed to, ordered by the last update, you have a few options:
store the last update timestamp for each user's subscriptions
load the user's groups and re-order them client-side
store the last update timestamp for each user's subscriptions
Store the timestamp the group was last updated for each user subscribed to the group:
"usersgroups": {
"alovelace": {
// the value is the timestamp the group was last updated
"techpioneers": 14952982198532978,
"womentechmakers": 14852982198532979
  },
  ...
}
You'll note that I split the group memberships from the user profiles here, since you shouldn't nest such loosely related data.
Then you can get the list of the user's groups in the correct order with:
ref.child("usersgroups/alovelace").orderByValue()
The main problem with this approach is that you'll need to update the timestamp of a group for all of its members on every post, so writes become a lot more expensive.
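As a hedged sketch of what that write amplification looks like (memberIds is a hypothetical array of the group's member keys, loaded beforehand), the timestamps for all members can at least be written in a single multi-path update:
var fanOut = {};
memberIds.forEach(function (uid) {
  fanOut["usersgroups/" + uid + "/techpioneers"] = firebase.database.ServerValue.TIMESTAMP;
});
// one multi-path update instead of one write per member
firebase.database().ref().update(fanOut);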
load the user's groups and re-order them client-side
This may sound like it'll be slower, but it actually won't be too bad. Since you're only loading the groups the user is a member of, the number won't be too high, and Firebase pipelines the requests, so performance is a lot better than you might expect. See "Speed up fetching posts for my social network app by using query instead of observing a single event repeatedly".
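A hedged sketch of that client-side approach, using the alovelace index from the question (the path names follow the structures shown above):
// Load the user's group keys, fetch each group (Firebase pipelines these
// reads over a single connection), then sort locally by lastUpdateTime.
firebase.database().ref("users/alovelace/groups").once("value").then(function (snap) {
  var keys = Object.keys(snap.val() || {});
  return Promise.all(keys.map(function (key) {
    return firebase.database().ref("groups/" + key).once("value");
  }));
}).then(function (groupSnaps) {
  var groups = groupSnaps.map(function (s) {
    return Object.assign({ id: s.key }, s.val());
  });
  groups.sort(function (a, b) { return b.lastUpdateTime - a.lastUpdateTime; });
  var tenMostRecent = groups.slice(0, 10);
  console.log(tenMostRecent);
});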
I have created a basic API using Laravel and am currently building the front end with Angular. Something I am struggling to decide on is how/where to cross-reference data in the form of IDs with their actual values.
A task object is currently returned from the API as the following:
{
"id":1,
"task_owner":7,
"client":2,
"campaign":17,
"created_by":1,
"title":"Finalise agenda for Call.",
"notes":null,
"viewed":0,
"priority":1,
"start_date":"2016-08-10",
"end_date":"2016-08-11",
"sub_tasks":[
{
"id":1,
"title":"my first subtask"
}
]
}
When displaying the task, I obviously want to show actual values, not IDs, for client, campaign, created_by, etc. But I also need the IDs to update those tables later, and for filters (i.e. show only tasks where client_id = 2).
So should I cross-reference these bits of data on the server and include them as part of my task object, or should I pull all user, client and campaign data in separate API calls first and then cross-reference on the front end?
Eloquent is a powerful tool when querying. I would use the with() method to eager-load whatever related models you want to include:
Task::with('task_owner', 'campaign', 'created_by')->get();
This requires, of course, that the relationships are correctly set up in the Task model.
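With those relationships in place, with() nests the related records into the JSON the API returns, so the front end gets both the foreign keys and the display values in one response. A hedged sketch of the shape (the relation names and nested fields are illustrative, and in practice the relationship methods are usually named so they don't collide with the plain foreign-key columns):
{
    "id": 1,
    "title": "Finalise agenda for Call.",
    "client_id": 2,
    "client": { "id": 2, "name": "..." },
    "campaign_id": 17,
    "campaign": { "id": 17, "name": "..." },
    ...
}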
This seems like a common situation, so I've been searching for a question like this, but I can't find one. I have built a RESTful Web API in ASP.NET with two related tables: Photos and PhotoGroups. The Photos table has a PhotoGroupID column that relates to the same column in PhotoGroups. In my AngularJS single-page application I have retrieved data from both tables using the standard RESTful queries. I am displaying a page with the Photos listed in a grid layout, and one of the columns is the PhotoGroupID: numbers like 1, 2, 3, and 4.
So how can I display the names of those groups instead of those numbers by joining the two queries in a RESTful fashion? I know how to add a new method to the Web API that gives me the joined data, but that doesn't seem natural in a RESTful sense. What is the common way to do this on the client side in AngularJS? Is there some kind of filter that joins the two tables, or some special syntax to bind a column to the PhotoGroup name column? I'm going to run into many cases like this when displaying information from related tables, and I want to do it the right way.
I started to solve this problem by adding a new method to my Web API that joined the Photos and PhotoGroups tables, but I quickly realized that I would be returning a collection of anonymous objects. Resolving that would require declaring a new class containing the columns I wanted to return, which seemed to be getting much more complicated than the beauty and simplicity of REST.
So I pondered the alternatives in AngularJS and decided that I could write a custom filter that converts the PhotoGroupID into a GroupName using the PhotoGroups JSON array I already had in the controller scope. The code that implements that solution is shown below. It's a little messy, but not very, and it keeps the Web API nice and simple.
In Filter.js:
angular.module('PhotoFilters', []).filter('groupName', function ()
{
return function (input, scope)
{
var groupName = "Not Found";
angular.forEach(scope.PhotoGroups, function (group)
{
if (group.PhotoGroupID == input)
{
groupName = group.GroupName;
}
});
return groupName;
};
});
In my HTML table:
<td>{{ x.PhotoGroupID | groupName:this }}</td>
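One possible refinement, not part of the original solution: the angular.forEach scan runs once per table cell on every digest cycle. If the grid or the group list gets large, a hedged alternative is to build an ID-to-name lookup object once in the controller and pass it to a simpler filter:
// Hypothetical variation: build an id -> name map once when PhotoGroups loads...
$scope.groupNames = {};
angular.forEach($scope.PhotoGroups, function (group) {
    $scope.groupNames[group.PhotoGroupID] = group.GroupName;
});

// ...and let the filter do a constant-time lookup instead of a scan.
angular.module('PhotoFilters').filter('groupNameFast', function () {
    return function (input, names) {
        return names[input] || "Not Found";
    };
});
It would then be used as {{ x.PhotoGroupID | groupNameFast:groupNames }} in the table cell.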
This question may be too broad/conceptual for the SO community, but I'll give it a shot.
Quick Project Overview:
I have a project that consists of a front-end application requesting data from a database via Angular $http requests. Each request is pretty much mapped one-to-one with a controller that visualizes the data specified in that request.
For example, I can specify keywords over a certain timeframe with:
get/A/kwords/?year=2013&month=9
and receive:
[
{"kword": "a", "count": 100, },
{"kword": "b", "count": 200, },
...
]
which I then plug in to a d3 directive.
The Problem:
I've reached the point in the project where I'm forced to give extra work either to the people developing the back end or to those developing the front end. As the app currently stands, the database sends large chunks of JSON data that the front end then has to run transformation functions over in order to shape the data into the format required by the different d3 directives. For example, some JSON responses include excess data, so the front end needs logic to standardize the data entering the directives.
This is logic that I do not think the front end should be forced to handle. In my mind, the front end should only have to interact with the JSON request parameters, not with the format of the actual data coming in. I think it makes more sense for the back end to serve up data in consistent formats depending on the URL params.
For example, instead of the back end serving up data formatted like this:
/get/B/kwords/?year=2013&month=9&limit=6
[
{
"kword": "a",
"data": [{"impressions": 100, "clicks": 150, "conversions": 200} ]
},
{
"kword": "b",
"data":[{"impressions": 50, "clicks": 60, "conversions": 70} ]
},
...
]
and forcing the front end to break apart this nested array-object-array-object structure, I should be able to specify a data=impressions parameter in the request:
/get/B/kwords/?year=2013&month=9&limit=6&data=impressions
[
{
"kword": "a",
"data": 100,
},
{
"kword": "b",
"data": 50,
},
...
]
Does this make sense/is this a reasonable request?
I was in a similar situation, and I initially went with the approach where the back end handles the filtering and the front end just binds data to d3.
The problem is that this is very, very slow. Each $http request took 1-3 seconds, so the filtering experience was not very good: you had to click a filter and wait to see a response.
It's actually much easier to send as much data as possible to the front end and do the filtering there. While the initial page load takes a bit longer, filtering is then instant. I ended up rewriting both the back end and the front end to do the work on the front end. I tried to make the initial data sent from the back end as flat as possible, then iterated through that array and pushed the relevant data onto properties of a JavaScript object to transform it quickly.
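A hedged sketch of that kind of client-side transform, using the response shape from the question (the function and variable names are illustrative):
// Pick one metric out of the nested "data" array for each keyword row.
function pickMetric(rows, metric) {
  return rows.map(function (row) {
    return { kword: row.kword, data: row.data[0][metric] };
  });
}

// e.g. inside the controller, once the $http call resolves:
// $scope.impressions = pickMetric(response.data, "impressions");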
If I were to do this project again, I would try exploring the libraries dc.js and crossfilter to avoid writing some of my filtering logic.
These are examples of just how fast filtering can be on the client side:
http://dc-js.github.io/dc.js/
http://square.github.io/crossfilter/