Require.js & Marionette: defining includes when extending classes - backbone.js

I'm working on a project using Require.js, Backbone and Marionette, and the define/function calls at the top of my files are getting a bit ludicrous. I'd like to find a way to move the "includes" out of the topmost define block and into the extend calls where they are relevant.
So my structure roughly looks like this:
define(['underscore', 'jquery', 'handlebars', 'someTemplate', 'someModel', etc...], function (_, $, Handlebars, template, model, etc...) {
    var someView = Backbone.Marionette.ItemView.extend({
        // code here
    });
    return someView;
});
So as I add more views to the file, that define list gets really, really long. I've tried doing something like:
var someView = define(['someTemplate', 'someModel'], function (template, model) {
    return Backbone.Marionette.ItemView.extend({
        // code here
    });
});
But then someView is undefined when I call it later. Am I doing this wrong, or is it not possible?

You should split your modules. Having a really long list of dependencies is a sign that your module does too much. Most of the time all a view needs is its model or collection, and maybe some subviews and the global event bus.
Also think about whether you really need to require jQuery, Backbone, etc. These are things you will need in most of your modules; it's easier to combine them in one file and load them up front. There is not much advantage to requiring those files individually; all it does is clutter your define section with a lot of boilerplate.
The define call, by the way, does not return anything, which is why someView is undefined in your example. If you really want to go with this solution you have to use require when you use someView later on. But even then it's much cleaner to put someView in its own module file.
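For what it's worth, a minimal sketch of that suggestion (the file path and module ids are illustrative, and Backbone/Marionette are assumed to be loaded up front as described above):
// views/SomeView.js -- one view per file, listing only the deps this view needs
define(['someTemplate', 'someModel'], function (template, SomeModel) {
    return Backbone.Marionette.ItemView.extend({
        template: template
        // code here
    });
});

// any module that uses the view just declares it as a dependency
define(['views/SomeView', 'someModel'], function (SomeView, SomeModel) {
    var view = new SomeView({ model: new SomeModel() });
    // render it, show it in a region, etc.
});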

You can use the shim section of your require.js config file to aggregate your "common" or needed modules into one namespace:
shim: {
    marionette: {
        deps: ["backbone", "json2", "bootstrap", "backboneRpc"],
        exports: 'Marionette'
    },
    flot_pie: {
        deps: ['jquery', 'charting', 'flot_resize'],
        exports: '$.plot'
    }
}
So now you can pull in everything you need like this:
define([
    'marionette',
    'flot_pie'
], function (Marionette, graph) {
    // Marionette and $.plot (here named graph) are both available in this callback
});
Related

M, V, C arrays vs require array inside Controller

What is the Sencha's way of preloading the dependencies?
Inside a controller, is it better to put views and stores in arrays like this:
views: ['Full.Path.To.View', 'Full.Path.To.View'],
stores: ['Full.Path.To.Store', 'Full.Path.To.Store']
or just inside requires:
requires: ['Full.Path.To.View', 'Full.Path.To.View', 'Full.Path.To.Store', 'Full.Path.To.Store']
In some cases (when I want Ext.syncRequire()) only the second option works, so I wanted your opinion. Note that this is being called inside a Controller, not inside app.js in Ext.application.
Thank you!
EDIT:
This is my solution and vision of how it should be:
var views = ['MyApp.view.Login.LoginForm', 'MyApp.view.Main.Index'];
var models = ['MyApp.model.Company'];
var stores = ['MyApp.store.Tables'];
var senchaData = ['Ext.field.Text', 'Ext.Button'];
var dependencies = [].concat(views, models, stores, senchaData);

Ext.define('MyApp.controller.Login', {
    extend: 'Ext.app.Controller',
    requires: dependencies,
    config: {
        refs: {
            loginForm: '#loginForm',
            loginButton: '#loginForm #loginButton'
        },
        control: {
            loginButton: {
                tap: 'onLogin'
            }
        }
    }...
You have to understand how controller dependencies get resolved by the class manager, because that is what makes the difference.
Note: the last time I dug into the controller classes was with ExtJS 4.2, so something might have changed slightly.
These arrays (models, views, controllers) have some benefits.
One benefit is cleaner code due to the separate arrays; a second is that the controller can predict the namespace of each class based on the app namespace and the array. You have to know that these arrays get resolved at definition time!
Now, these three arrays are nice, but the class manager doesn't know about them, which is why Ext.app.Controller injects appropriate hooks. The hooks get triggered when the class gets extended and will require all classes found in any of the four arrays (models, stores, views, controllers). This forces the class manager to load these classes immediately.
Note that Ext.app.Application is the only one that initializes the controllers, but since ExtJS 4.2 it is possible to do it on your own with only a little effort.
That is all I can say based on your info. If this doesn't help, please be more precise about which case fails to load.
Edit
Why not define it this way? It is easier to read, isn't it?
Ext.define('MyApp.controller.Login', {
    extend: 'Ext.app.Controller',
    views: ['Login.LoginForm', 'Main.Index'],
    models: ['Company'],
    stores: ['Tables'],
    requires: ['Ext.field.Text', 'Ext.Button'],
    //...
});

What's a good Backbone pattern for managing model instances?

I'm trying to minimize server calls by avoiding any requests I can.
Let's say, for the sake of an example, I have a collection of Matchboxes which belong to Users and have Tags assigned, and then also have a collection of Tags and a collection of Users as part of other pages. Getting matchboxes retrieves the user and tag info, so that I can instantiate all required models with one request; accessing the Tags and Users pages retrieves similar collections (only they deal solely with their respective models).
My problem: if matchboxes is one page, and tags and users are two other pages, what's a good way to make sure only one model is ever instantiated for any given entity? I.e. if I go into users or tags and edit an entry associated with a matchbox, the matchbox should hold that same entry, allowing it to listen and react to the updates without sending new requests when going back to the matchbox page.
I've looked over Backbone.relational but it doesn't seem to do what I need, and would rather not wall myself into a framework. So solutions involving patterns are preferable.
Ended up using http://pathable.github.io/supermodel/ which uses the pattern of overwriting the model attribute on collections with a custom function which calls a special Model.create that itself returns an existing (updated with the new values if necessary) instance of said model. The Model.create call has to be used everywhere else in code for unique models.
So essentially every model has an all() method which is a collection of all instances by id. Whenever a model is added it is checked against that collection, and an existing object is returned if one exists; the data used to instantiate the duplicate is used to update the existing object, ensuring the data is not stale (which is a nice bonus on top of the uniqueness I wanted).
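A rough sketch of that idea in plain Backbone, in case it helps (this is not Supermodel's actual code; the names and details are simplified):
// a model whose create() hands back the existing instance for a known id
var UniqueModel = Backbone.Model.extend({}, {
    _byId: {},   // registry of all instances, keyed by id
    create: function (attrs, options) {
        var existing = attrs && this._byId[attrs.id];
        if (existing) {
            existing.set(attrs);   // refresh the old instance with the new data
            return existing;
        }
        var instance = new this(attrs, options);
        this._byId[instance.id] = instance;
        return instance;
    }
});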
The cleanest method seems to be to just wrap the model function into a function that returns it for clearer use; then for every collection that needs to have unique models wrap said model in the function. I came up with this at the moment:
app.single = function (modelPrototype) {
    return function (attrs, options) {
        return modelPrototype.create(attrs, options);
    };
};
(app there is just a scope global, tied to a particular namespace)
So in collections instead of,
model: app.Model
I would then use
model: app.single(app.Model),
Whenever I update an entry in one part of the application, the change trickles down to every other collection/model, since if it's the same instance from the user's perspective, it's the same instance in code too.
That's about all I could tell from reading through the pattern's code and documentation, which is sufficient for my own uses.
I suspect this solution would still have some issues if you're caching renders, but I haven't found a use for that (I prefer to re-render whenever I can to avoid dealing with various artifacts), so it's all good for me.
Unfortunately the codebase seems to be partially abandoned, so while it works with Backbone 1.0.0 (as far as unique models go), I may need to re-create/fork the pattern in future projects.
I think you should think twice about nesting your models and collections in this way, especially if it's primarily for the purpose of easing the bootstrapping of your app. Instead, try to use id's for inter-referencing between models as much as possible. This design problem you have is most likely only the first of many to come if you structure your model/collection tree in a certain way now, only to find it too inflexible later.
That being said, if all you need is for models referencing other models/collections to refer to the same model/collection instance, then simply instantiating them during bootstrap and passing them in to their respective parent models would be sufficient. You could either load some bootstrap data in one request or, preferably, inline that data in the HTML:
<script>
    var bs_data = {
        users: [
            ...
        ],
        tags: [
            ...
        ],
        matchboxes: [
            ...
        ]
    };
</script>
And then instantiate the corresponding models or collections using the bootstrap data:
var matchboxes = new Matchboxes();
matchboxes.set(bs_data.matchboxes);
var users = new Users({matchboxes:matchboxes});
users.set(bs_data.users);
The bootstrap data would come from the same backend so your models and collections would already be in sync without having to fetch anything.
As for design patterns: passing dependencies as constructor arguments is actually the dependency injection pattern, although more automated solutions for doing so exist.
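For illustration only (the view names here are made up), the injection simply amounts to handing the bootstrapped instances to whatever needs them instead of letting each view fetch or build its own:
// the instances created at bootstrap time are passed straight in
var usersView = new UsersView({ collection: users });
var matchboxesView = new MatchboxesView({ collection: matchboxes, users: users });
// inside MatchboxesView.initialize(options): this.users = options.users;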
To make sure only one model is ever instantiated and shared among the other elements that use it, so they can listen and update when any of them makes a change to it, you can use the singleton pattern.
If you use RequireJS you can get the same effect by always returning the model already instantiated. For example:
// the shared model
define([
    'jquery',
    'underscore',
    'backbone'
], function ($, _, Backbone) {
    'use strict';

    var Model = Backbone.Model.extend({
        // ...
    });

    // return it instantiated, so we'll get the same object back whenever we use this model (singleton)
    return new Model();
});
// a view using the model
define([
    'jquery',
    'underscore',
    'backbone',
    'model'
], function ($, _, Backbone, modelInstance) {
    'use strict';

    var View = Backbone.View.extend({
        initialize: function () {
            // listen to what other elements do
            this.listenTo(modelInstance, 'eventFromOtherElement', this.doSomething);

            // when this element does something, other elements should be listening to that event
            modelInstance.trigger('thisViewEvent');
        },
        doSomething: function () {
            // ...
        }
    });

    return View;
});

Best practice for shared objects in Backbone/Require Application

I've been developing Backbone applications for a little while now, and am just starting to learn to use Backbone with Require.js.
In my backbone app that I am refactoring, I defined a namespace like this: App.model.repo. This model is used over and over again in different views. I do the same thing with a few collections, for example, App.collection.files. These models and collections are bootstrapped in with the initial index file request.
I did find this example, which looks like a great way to get that bootstrapped data in. However, I am struggling with the best way to reuse/share these models and collection between views.
I can think of three possible solutions. Which is best and why? Or is there another solution I am missing entirely?
Solution 1
Define these common models and collections in the index (when they are bootstrapped in), and then pass them along to each Backbone view as an option (to initialize).
define(['jquery', 'underscore', 'backbone', 'handlebars', 'text!templates/NavBar.html'],
    function ($, _, Backbone, Handlebars, template) {
        return Backbone.View.extend({
            template: Handlebars.compile(template),
            initialize: function (options) {
                this.repoModel = options.repoModel; // common model passed in
            }
        });
    }
);
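The caller then hands the shared instance in when constructing the view (a small sketch; NavBarView is just an illustrative name for the module above):
// e.g. in the router or module that already holds the bootstrapped App.model.repo
var navBar = new NavBarView({ repoModel: App.model.repo });
navBar.render();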
This seems clean as far as separation goes, but could get funky quick, with tons of things being passed all over the place.
Solution 2
Define a globals module, and add commonly used models and collections to it.
// models/Repo.js
define(['backbone'],
    function (Backbone) {
        return Backbone.Model.extend({
            idAttribute: 'repo_id'
        });
    }
);

// globals.js (within index.php, for bootstrapping data)
define(['underscore', 'models/Repo'],
    function (_, RepoModel) {
        var globals = {};
        globals.repoModel = new RepoModel(<?php echo json_encode($repo); ?>);
        return globals;
    }
);

define(['jquery', 'underscore', 'backbone', 'handlebars', 'text!templates/NavBar.html', 'globals'],
    function ($, _, Backbone, Handlebars, template, globals) {
        var repoModel = globals.repoModel; // repoModel from globals
        return Backbone.View.extend({
            template: Handlebars.compile(template),
            initialize: function (options) {
            }
        });
    }
);
Does this solution defeat the whole point of AMD?
Solution 3
Make some models and collections return an instance, instead of a constructor (effectively making them Singletons).
// models/repo.js
define(['backbone'],
    function (Backbone) {
        var Repo = Backbone.Model.extend({
            idAttribute: 'repo_id'
        });
        // return an instance, not the constructor
        return new Repo();
    }
);
// Included in index.php for bootstrapping data
require(['jquery', 'backbone', 'models/repo', 'routers/Application'],
    function ($, Backbone, repoModel, ApplicationRouter) {
        repoModel.set(<?php echo json_encode($repo); ?>);
        new ApplicationRouter({el: $('.site-container')});
        Backbone.history.start();
    }
);
define(['jquery', 'underscore', 'backbone', 'handlebars', 'text!templates/NavBar.html', 'models/repo'],
    function ($, _, Backbone, Handlebars, template, repoModel) {
        // repoModel has values set by index.php
        return Backbone.View.extend({
            template: Handlebars.compile(template),
            initialize: function (options) {
            }
        });
    }
);
I worry this could get really confusing as to what is a constructor and what is an instance.
End
If you read this far, you are awesome! Thanks for taking the time.
In my case, I prefer option 3. Although, to prevent confusion, I put every singleton instance in its own folder named instances. Also, I tend to separate the model/collection from the instance module.
Then, I just call them in:
define([
    "instance/photos"
], function (photos) { /* do stuff */ });
I prefer this option as every module is forced to define its dependencies (which is not the case with a namespace, for example). Solution 2 could do the job, but if I'm using AMD I want my modules as small as possible; plus, keeping them small makes it easier to unit test them.
And lastly, about unit tests: I can just re-define the instance inside my unit test to use mock data. So, definitely, option 3.
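As a rough sketch of that last point (assuming RequireJS 2.x, where requirejs.undef() is available, and a mock that only needs a Backbone collection):
// in the test setup: drop the real instance module and register a mock in its place
requirejs.undef('instance/photos');
define('instance/photos', ['backbone'], function (Backbone) {
    return new Backbone.Collection([{ id: 1, title: 'mock photo' }]);
});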
You can see an example of this pattern on an Open source app I'm working on ATM: https://github.com/iplanwebsites/newtab-bookmarks/tree/master/app
I would take a look at this example repo https://github.com/tbranyen/github-viewer
It is a working example of Backbone Boilerplate (https://github.com/tbranyen/backbone-boilerplate).
Backbone Boilerplate adds a lot of unnecessary fluff, but what is really useful about it is that it gives some clear directions on common patterns for developing complex JavaScript apps.
I'll try and come back later today to answer your question more specifically (if someone doesn't beat me to it :)
I prefer Solution 1. It is generally good to avoid using singletons, and using globals is also something to avoid, especially since you are using RequireJS.
Here are some advantages I can think of for Solution 1:
It makes the view code more readable. Someone looking at the module for the first time can immediately see from the initialize function which models it uses. If you use globals, something might be accessed 500 lines down in the file.
It makes it easier to write unit tests for the view code, since you can pass in fake models in your tests.

Backbone structure

I'm new to backbone, but have watched several tutorial screencasts on it, both with and without requirejs.
My question involves the setup structure (both file structure if using require, and/or variable/object structure).
Most of the tutorials I have watched seem to prefer an App.Models, App.Collections, and App.Views approach, and each item inside has the name of the module: i.e.,
App.Models.todo = Backbone.Model.extend({...});
App.Collections.todos = Backbone.Collection.extend({...});
App.Views.todo = Backbone.View.extend({...});
After a little research, trying to find someone who uses the same style I would like to use, I finally found: File structure for a web app using requirejs and backbone. They seem to prefer more of an App.[Module Name] method: i.e.,
App.Todo.Model = Backbone.Model.extend({...});
App.Todo.Collection = Backbone.Collection.extend({...});
App.Todo.Views = Backbone.View.extend({...});
I personally prefer the App.[Module Name] structure over having my modules split up, but would like to know the benefits, if any, of having the different structures.
Which structure do you use, and how has it helped you over a different structure you may have seen or used in the past?
I like the approach described in this blog:
http://weblog.bocoup.com/organizing-your-backbone-js-application-with-modules/
If you are using requireJS you don't need/want to attach the models/views to a global namespace object attached to the window (no App.Views, App.Models). One of the nice things about using requireJS or a different AMD module loader is that you can avoid globals.
You can define a model like this:
define(['underscore', 'backbone'],
    function (_, Backbone) {
        var MyModel = Backbone.Model.extend({});
        return MyModel;
    });
Then you define a view:
define(['underscore', 'backbone', 'tpl!templates/someTemplate.html'],
    function (_, Backbone, template) {
        var MyView = Backbone.View.extend({});
        return MyView;
    });
Now you have a model and a view with no globals. Then if some other module needs to create one of these (maybe your App module), you add it to the define() array and you have it.
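For example, a parent module could pull both in without touching any global (the paths here are illustrative):
// e.g. app.js
define(['models/myModel', 'views/myView'], function (MyModel, MyView) {
    var model = new MyModel();
    var view = new MyView({ model: model });
    view.render();
});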

Backbone-localstorage doesn't work with require.js

I'm trying to add localStorage support to my collections in backbone.js, but for some reason require.js won't load it.
Here's what is in the main.js file that requirejs loads:
require.config({
    paths: {
        'jquery': 'libs/jquery/jquery-1.7.1.min',
        'underscore': 'libs/underscore/underscore-min',
        'backbone': 'libs/backbone/backbone-min',
        'backbone-localstorage': 'libs/backbone-localstorage/backbone-localstorage-min',
        'text': 'libs/require/text'
    }
});
You can see the full source at https://github.com/tominated/Vendotron. I can tell it's not loading because when I put the localstorage snippet into my collection, it errors out in chrome's console saying that Store isn't defined.
Any idea what I'm doing wrong?
As Paul said, you are not requiring the localstorage module anywhere. Require.js 2.0 has a specific mechanism for JavaScript code that is essentially a plugin for other code - the shim option. Including localStorage would look like this:
require.config({
    baseUrl: "/js/",
    paths: {
        jquery: 'lib/jquery-1.8.0',
        underscore: 'lib/underscore-1.3.3',
        backbone: 'lib/backbone-0.9.2',
        'backbone.localStorage': 'lib/backbone.localStorage'
    },
    shim: {
        underscore: {
            exports: "_"
        },
        backbone: {
            deps: ['underscore', 'jquery'],
            exports: 'Backbone'
        },
        'backbone.localStorage': {
            deps: ['backbone'],
            exports: 'Backbone'
        }
    }
});
This example was copied from the article "Build Backbone Apps Using RequireJS" which also explains how to structure your code and how to compile the code into one file when deploying the application.
There are a couple problems.
First, you are setting the path to backbone-localstorage, but you are never requiring it anywhere, so it is never actually loaded. Setting that path is basically defining a shortcut to it, not loading it.
The second problem is that, like backbone itself, most backbone plugins are not AMD modules. They want to have Backbone loaded first, so they can add their extensions to it.
It looks like you are using an AMD fork of Backbone, but not backbone-localstorage. You could try to find an existing one, or make your own similar to this.
Either that, or you can try to load backbone-localstorage as-is (adding it to the dependencies list of your define call), but you would need to use the order! plugin to make sure Backbone is loaded first.
Looking inside the plugin's source code, where underscore and backbone are required: your path definitions in the require config should tally with the paths required there, i.e. they are case sensitive.
