Backbone.js: Binding an array of collections to a function

I have an array of collections (coll_array). All collections are bound to the same function (process_coll) on all events, so any change to any collection in the array results in execution of the same function. My problem is how to identify the collection on which the event took place. If I could pass arguments to the target function I could pass the identity of the collection, but as far as I know there is no way to do that with Backbone events.
initialize: function() {
    _(this).bindAll('process_coll');
    var coll_array = []; // array of collections
    for (var i = 0; i < coll_array.length; i++) {
        coll_array[i].bind('all', this.process_coll);
        coll_array[i].fetch();
    }
},
process_coll: function() {
    // some code here
    // how do I get the specific collection which resulted in the execution of this function?
}

You are probably better off listening for specific events.
initialize: function() {
    var coll_array = []; // array of collections
    for (var i = 0; i < coll_array.length; i++) {
        coll_array[i].bind('reset', this.reset_coll);
        coll_array[i].bind('add', this.add_coll);
        coll_array[i].bind('remove', this.remove_coll);
        coll_array[i].fetch();
    }
},
reset_coll: function(collection, options) {
    // the collection argument is the one you want
},
add_coll: function(model, collection, options) {
    // the collection argument is the one you want
},
remove_coll: function(model, collection, options) {
    // the collection argument is the one you want
}

Related

Fetch a Backbone collection with only the models that have a specified value

I have a dictionary of the form {name: value}
A = {
    name1: x,
    name2: y,
    name3: z
}
I want to fetch a collection (consisting of models that have 'name' as one of their attributes), but to be optimal I want to fetch only those models whose 'name' attribute exists in my dictionary.
Is there a way to do specific filtering like that?
If you're doing the filtering client-side, overriding the filter method is really not the way to go.
Once overridden, you no longer have it available should you need it later. Also, modifying the collection itself from within the filter method is an undesirable side effect.
Instead you should be using the parse method, which will automatically be called when fetching the collection.
Now as I understand it, you want to limit the fetched set to models with names matching the keys in your dictionary.
If so, I would do the following:
parse: function(response, options) {
    // Do we want to filter the response?
    if (options.filterNames) {
        // Filter
        response = _.filter(response, function(obj) {
            // Check if this model's name is one of the allowed names
            return _.contains(options.filterNames, obj.name);
        });
    }
    // Backbone will use the return value to create the collection
    return response;
}
And then call fetch using
someCollection.fetch({filterNames: _.keys(someDictionary)});
If you're certain you will always be filtering the collection on fetch, you can omit passing the option and just use the dictionary within parse.
Alternatively you could create a fetchFiltered() method on the collection, which would then invoke the line above.
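A minimal sketch of that fetchFiltered() alternative (the collection name here is an assumption):
var SomeCollection = Backbone.Collection.extend({
    // parse: as defined above ...
    fetchFiltered: function(dictionary) {
        // Delegate to fetch, passing the dictionary keys as filterNames
        return this.fetch({ filterNames: _.keys(dictionary) });
    }
});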
After investigations and trials, here are the two ways this can be resolved:
1. Client-side filtering after fetching the collection from the server. This is the less optimal method, especially when the collection is huge; in situations where you really want 5 models out of a 1000-model collection, it can be overkill. But if the server side has no logic for accepting and applying a filter, client-side filtering should look something like this.
Override the collection's filter method, something like:
var filter = {
    filter: function() {
        var results = _.filter(this.models, function(model) {
            // Perform the check on this model, e.g. compare it to your local dict
            if (checkPassed) {
                return true;
            }
            return false;
        });
        results = _.map(results, function(model) {
            return model.toJSON();
        });
        // Reset the existing collection to the filtered models
        this.reset(results);
    }
};
var ExtendedCollection = OriginalCollection.extend(filter);
2. Server-side filtering. Pass a filter option in the fetch ajax call to the server, and have the server understand the filter and return the collection based on it (something like the sketch below).
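A sketch of what the client side of option 2 might look like, assuming the server accepts a "names" query parameter (the parameter name is an assumption); fetch options are forwarded to jQuery.ajax, so the data option becomes the query string:
// The server would receive e.g. ?names=name1,name2,name3
collection.fetch({
    data: { names: _.keys(A).join(',') }
});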

Backbone model seen in the success callback and the error callback is different (Backbone save)

I have a Backbone model which has Backbone Collections in it. When I save the model and it succeeds, the model object in the success callback is properly structured, as it was before. But when an error occurs (say, a validation error), in the error callback the model object is modified (the Collections inside the model object are converted into Arrays). As a result, all my functions defined for those Collections are now "undefined" and give me errors.
save: function() {
    this.model.save(
        _.extend(originalModel.toJSON() || {}, this.model.toJSON()),
        {
            success: this.onSaveSuccess,
            error: this.onSaveError,
            include: []
        }
    );
},
onSaveSuccess: function(model) {
    // Here the model is properly structured
},
onSaveError: function(model, response) {
    // Here the model is modified; all collections are now arrays.
    // I have to explicitly call my parse method to restructure it.
    model = model.parse(model.attributes);
}
I would like to know why this is happening. Am I doing something wrong here?
For the sake of this example, let's assume the attribute of the model that holds the collection is called "people". It isn't clearly documented, but model.save(attributes) actually behaves like:
model.set(attributes);
model.save();
Here's the relevant annotated source of save(...). What your code is doing is first setting the "people" attribute to the array of people, then attempting to save it. When the save fails, your model has the array, not the collection, as the value of "people".
I suspect your end point is returning the full representation of the model on success, and your model is correctly parsing that representation & re-building the Collection at that point. But your error handler won't do that automatically.
As an aside, in my experience Models that contain Collections are hard to manage & reason about. I've had better luck having a Model that contains an array of data, and then having a method on that Model to build a Collection on the fly. Something like:
var MyModel = Backbone.Model.extend({
    // ...
    getPeople: function() {
        // See if we've previously built this collection
        if (!this._peopleCollection) {
            var people = this.get('people');
            this._peopleCollection = new Backbone.Collection(people);
        }
        return this._peopleCollection;
    }
});
This removes the Collection concept from the server communication (where it's pretty unnecessary), while also providing a smarter data layer for your application (smart Models are a good thing).
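For example, a usage sketch of the getPeople() accessor above (the data is made up):
var model = new MyModel({
    people: [{ name: "Alice" }, { name: "Bob" }] // hypothetical data
});

// The collection is built lazily on first access and then cached
model.getPeople().each(function(person) {
    console.log(person.get("name"));
});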
The solution for this is passing wait: true in the options. With wait: true, the model's attributes are not modified unless the server returns a valid response.
save: function() {
    this.model.save(
        _.extend(originalModel.toJSON() || {}, this.model.toJSON()),
        {
            success: this.onSaveSuccess,
            error: this.onSaveError,
            wait: true, // don't update the model's attributes until the server responds successfully
            include: []
        }
    );
},

Backbone model structure gets changed when returning models from a web worker

I am trying to reset a backbone collection with an array of models. It gets reset but the model structure is changed (nested one level).
Here is a detailed explanation:
Model
var SeatModel = Backbone.Model.extend({
    defaults: {
    },
    initialize: function() {
        console.log('Model initialized');
    }
});
Collection
var myCollection = Backbone.Collection.extend({
    url: "",
    parse: function(data) {
    },
    initialize: function() {
        console.log('Collection initialized');
    }
});
Now, I am executing some logic in a web worker, which generates an array of models. The size of the array varies depending on the url I hit.
When the array is ready, I reset the data in the collection using something like:
(Before this, I have instantiated the collection and set it in a service object.)
worker.onmessage = function(e) {
    newDataForCollection = e.data;
    // update the collection
    service.get("myCollection").reset(newDataForCollection);
};
After getting reset, the structure of the collection gets changed to something like:
models: Array[3154]
  [0...99]
    0: g.Model
      attributes:
        attributes:
          price: "12"
Whereas it should be like:
models: Array[3154]
  [0...99]
    0: g.Model
      attributes:
        price: "12"
Also the number of models in the array gets reduced. (Should have been around 6100 in this case).
I am unable to figure out what causes the internal structure to get nested by one level when invoking reset on the collection.
Updated Post
Figured it out. We cannot send objects with functions through postMessage, so the models in the array arrive with just their attributes and no functions. This was related to: Passing objects to a web worker.
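A minimal sketch of the resulting workaround, assuming the worker posts plain attribute hashes (e.g. built with toJSON()) instead of Backbone model instances:
// Inside the worker: send plain data, not model objects
// self.postMessage(models.map(function(m) { return m.toJSON(); }));

// On the main thread the handler stays the same:
worker.onmessage = function(e) {
    // e.data is now an array of plain objects like [{ price: "12" }, ...]
    service.get("myCollection").reset(e.data);
};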

Marionette.js - hijacking CompositeView functions to create streaming pagination

I am creating a streaming paginated list of views. We start the app with an empty collection and add items to the collection at regular intervals. When the size of the collection passes a page_size attribute, the rest of the models should not get rendered, but the CompositeView should add page numbers to click on.
I am planning on creating a render function for my CompositeView that only renders items based on the current page number and page size, by having a function in my collection that returns a list of models, like this:
get_page_results: function(page_number) {
    var all_models = this.models;
    var models_start = page_number * this.page_size;
    var models_end = models_start + this.page_size;
    // return the array of results for that page
    return all_models.slice(models_start, models_end);
}
My question is: should I even be using Marionette's CompositeView for this? It seems like I'm overriding most of the functionality of Marionette's CollectionView to get what I want.
Every time the number of items in my collection changes two things need to be updated:
The itemViews in the collection view
The page numbers at the bottom of the composite view
My strong recommendation is not to do this in the view layer. You're going to add a ton of code to your views, and you're going to end up duplicating a lot of this code between multiple views (one for displaying the data, one for page list and counts, one for ...).
Instead, use a decorator pattern to build a collection that knows how to handle this. I do this for filtering, sorting and paging data, and it works very well.
For example, here's how I set up filtering (running in a JSFiddle here: http://jsfiddle.net/derickbailey/vm7wK/)
function FilteredCollection(original) {
    var filtered = Object.create(original);
    filtered.filter = function(criteria) {
        var items;
        if (criteria) {
            items = original.where(criteria);
        } else {
            items = original.models;
        }
        filtered.reset(items);
    };
    return filtered;
}
var stuff = new Backbone.Collection();
var filtered = FilteredCollection(stuff);
var view = Backbone.View.extend({
    initialize: function() {
        this.collection.on("reset", this.render, this);
    },
    render: function() {
        var result = this.collection.map(function(item) { return item.get("foo"); });
        this.$el.html(result.join(","));
    }
});
In your case, you won't be doing filtering like this... but the idea for paging and streaming would be the same.
You would track which page number you are on in your "PagingCollection", and then when your original collection is reset or has new data added to it, the PagingCollection functions would re-calculate which data needs to be in the final paged collection instance and reset that collection with the data you need.
Something like this (though this is untested and incomplete; you'll need to fill in some details and flesh it out for your app's needs):
function PagingCollection(original) {
    var paged = Object.create(original);
    paged.currentPage = 0;
    paged.totalPages = 0;
    paged.pageSize = 0;
    paged.setPageSize = function(size) {
        paged.pageSize = size;
    };
    original.on("reset", function() {
        paged.currentPage = 0;
        paged.totalPages = original.length / paged.pageSize;
        // get the models you need from "original" and then
        // call paged.reset(models) with that list
    });
    original.on("add", function() {
        paged.currentPage = 0;
        paged.totalPages = original.length / paged.pageSize;
        // get the models you need from "original" and then
        // call paged.reset(models) with that list
    });
    return paged;
}
Once you have the collection decorated with the paging info, you pass the paged collection to your CollectionView or CompositeView instance. These will properly render the models that are found in the collection that you pass to it.
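A minimal sketch of that wiring, with an assumed template id and an assumed item view:
var stuff = new Backbone.Collection();
var paged = PagingCollection(stuff);
paged.setPageSize(10);

var ResultsView = Backbone.Marionette.CompositeView.extend({
    template: "#results-template", // assumed template id
    itemView: ResultItemView       // assumed item view defined elsewhere
});

// The view renders whatever models the paged collection currently holds
var view = new ResultsView({ collection: paged });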
Regarding CollectionView vs CompositeView ... a CompositeView is a CollectionView (it extends directly from it) that allows you to render a model / template around the collection. That's the main difference: they both deal with collections, and render a list of views from that collection.
We have built a set of mixins for backbone.marionette that you may find useful (https://github.com/g00fy-/marionette.generic/).
You could use PaginatedMixin, which allows a Backbone collection to be paginated, plus PrefetchMixin, so you don't have to pass a prefetched collection to a view.
The only code you would have to write is:
YourListView = Generic.ListView.mixin(PaginatedMixin, LoadingMixin, PrefetchMixin).extend({
    paginateBy: 10,
    template: "#your-list-template",
    itemViewOptions: { template: "#your-itemview-template" },
    fetchPage: function(page) {
        this.page = page;
        return this.collection.refetch({ data: { page: page } }); // your code here
    },
    hasNextPage: function() {
        return true; // your code here
    }
});
For a working example see https://github.com/g00fy-/stack.reader/blob/master/js/views.js

Backbone.js firing Collection change event multiple times

In one of my Backbone.js views I am updating the attribute "read" of the current model (an instance of Message) using this.model.set( { read: true } );. I verified that this command is only executed once (I know about "ghost events"). As you can see below, I configured the Collection so that on a "change" event the whole Collection gets saved into a variable.
Unfortunately the saveToVar function gets called 3 times instead of once! Also, the first time saveToVar is called, this correctly consists of all the collection's models, whilst the 2nd and 3rd time this only has one model, namely the one I did the update on.
I tracked everything down piece by piece but I have no clue why this happens.
window.Message = Backbone.Model.extend({
});

window.MessageCollection = Backbone.Collection.extend({
    model: Message,
    initialize: function() {
        this.on("change", this.saveToVar);
    },
    saveToVar: function(e) {
        App.Data.Messages = this.toJSON();
        return;
    }
});
In your jsfiddle, you're doing this:
App.Collections.message = new MessageCollection([ ... ]);
var elements = App.Collections.message.where({ id: 4 });
var item = new MessageCollection(elements);
Your where call will return models that are in the message collection, not copies of those models but exactly the same model objects that are in message. Now you have two references to your id: 4 model:
The original one buried inside App.Collections.message.
The one in elements[0].
Both of those references are pointing at the same object. Then you add elements to another MessageCollection. Now you have something like this:
App.Collections.message.models[3]     item.models[0]
                |                            |
                +-------> [{id: 4}] <--------+
Both of those collections will be notified about change events on the id: 4 model since collections listen to all events on their members:
Any event that is triggered on a model in a collection will also be triggered on the collection directly, for convenience.
And your collection listens for "change" events in itself:
initialize: function() {
    this.on("change", this.saveToVar);
}
So when you do this:
this.model.set({ read: true });
in your view, both collections will be notified since that model happens to be in both collections.
If we alter your event handler to look like this:
saveToVar: function() {
    console.log(_(this.models).pluck('cid'));
}
then you'll see that the same cid (a unique identifier that Backbone generates) appears in both collections. You can also attach a random number to each collection and see what you get in saveToVar: http://jsfiddle.net/ambiguous/mJvJJ/1/
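For instance, a quick sketch of that diagnostic (using string tags instead of random numbers; the tag property is only there for logging):
App.Collections.message.tag = "original";
item.tag = "filtered";

// In MessageCollection:
saveToVar: function() {
    console.log(this.tag, _(this.models).pluck('cid'));
}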
You probably shouldn't have one model in two collections, and you probably shouldn't have two copies of the same model kicking around either, so cloning elements[0] before creating item might not be a good idea. You might need to reconsider your architecture.
