ExtJS Ajax vs ExtJS Store for a single row from the server

Scenario: I have a URL that returns details about the currently logged-in user, i.e.:
one record
no list needed
Options I have:
Perform a manual ExtJS Ajax call each time, having to insert the callback code everywhere I need it.
Create an ExtJS Store once and fetch the first record from the store instance every time.
Question: Are there any better options? I'm using ExtJS 4.1.

You don't need to go into low-level details to do something like that; use the standard tools. Define a Model, assign it a Proxy of the type you need, and load it. See the docs: http://docs.sencha.com/ext-js/4-1/#!/api/Ext.data.Model
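A minimal sketch of that approach (the URL, field names, and JSON root below are assumptions, not from the question):

// Hypothetical model for the logged-in user; adjust fields, URL and reader root to your API.
Ext.define('App.model.CurrentUser', {
    extend: 'Ext.data.Model',
    fields: ['id', 'Name', 'Email'],
    proxy: {
        type: 'ajax',
        url: '/currentUser', // assumed endpoint returning a single record
        reader: {
            type: 'json',
            root: 'data'
        }
    }
});

// Load a single record by id; no store is needed.
App.model.CurrentUser.load(1, {
    success: function (user) {
        console.log(user.get('Name'));
    }
});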

Why are you doing an Ajax call each time?
Do it once, and assign the result to a global variable:
Ext.Ajax.request({
    url: 'page.php',
    params: {
        id: 1
    },
    success: function (response) {
        var text = response.responseText;
        window.user = Ext.JSON.decode(text);
        my.custom.launchFunction(); // does your viewport etc. for you
    }
});
Having done this, you can now, from anywhere, do
console.log(user.Name);
It's neater, and faster, than doing a store lookup.
As a bonus, your user object can be much more complex than the store would handle without a ton of config.

Related

Search a user in a list of usernames in AngularJS

I am looking for something that one would probably find in hundreds of tutorials for AngularJS, but I don't quite know where to look.
I want the best practice for searching for a user by name out of a list of tens of thousands stored in the database. I am thinking about something similar to Facebook's "friend search" field: as the user starts typing, proposed results should appear.
If the list were already on the client, the simple ng-filter behaviour would be enough, but I don't want to dump the whole database into a JSON file.
Could somebody give me some hints on how to approach this problem and where the pitfalls are? The backend is a Symfony2 application with Doctrine, if that matters...
Thank you!
For querying tens of thousands of records from a combobox-like component, you should use a typeahead with an asynchronous call.
See this component:
https://github.com/angular-ui/bootstrap/tree/master/src/typeahead
Your data has to be indexed (Lucene or equivalent) to get acceptable performance.
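A rough sketch of that asynchronous approach (the /users endpoint, the q/limit parameters, and the controller name are assumptions; check the typeahead directive's docs for the exact attribute syntax of your ui-bootstrap version):

// Assumed markup:
// <input type="text" ng-model="selectedUser"
//        typeahead="user.name for user in searchUsers($viewValue)">

angular.module('app').controller('UserSearchCtrl', function ($scope, $http) {
    // Return a promise; the typeahead directive waits for it to resolve.
    $scope.searchUsers = function (term) {
        return $http.get('/users', { params: { q: term, limit: 10 } })
            .then(function (response) {
                return response.data; // expected: an array of user objects
            });
    };
});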
I believe you are looking for the Select2 component, more exactly Select2 with remote data.
It is integrated into angular-ui.
In your controller, you can set the ajax parameter as described in the Select2 API documentation:
$scope.select2Options = {
    ajax: {
        url: "http://api.rottentomatoes.com/api/public/v1.0/movies.json",
        dataType: 'jsonp',
        quietMillis: 100,
        data: function (term, page) { // page is the one-based page number tracked by Select2
            return {
                q: term,        // search term
                page_limit: 10, // page size
                page: page      // page number
            };
        }
    }
};
The back-end implementation will be your job.

Should I reload data from the REST service or add locally when posting new data in a single page application

For example, we have a simple single page application for a TODO list. When the user adds a new item to the list, how would you recommend populating the list of todos?
POST the new item to the REST service and on success reload the data from the REST service, or
POST the new item to the REST service and add the item to the list locally
It's really up to you, and depends on whether you need to get anything back from the server, like an id property. It also depends on whether or not your list is filtered in any way server-side; if it is, the new item may or may not be included. Specifically for a TODO list, I might just add it to the list instead of reloading the whole thing.
My approach would be to make the application as responsive as possible. So what I'd do is ensure that the changes are reflected in the browser as soon as possible, and then use the promise from $http to roll back the changes in case there's a failure.
Documentation for $http
You can do something as simple as
$http({url: '/something'})
.success(function() { /*do something*/ })
.error(function() { /*uh-oh*/ });
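For example, an optimistic-update sketch along those lines (the /todos endpoint and the shape of the todo object are assumptions):

// Add the item locally first, then POST it; roll back if the request fails.
$scope.addTodo = function (text) {
    var todo = { text: text, done: false };
    $scope.todos.push(todo); // reflect the change in the UI immediately

    $http.post('/todos', todo)
        .success(function (saved) {
            todo.id = saved.id; // keep the server-generated id for later edits
        })
        .error(function () {
            // uh-oh: undo the optimistic change
            var i = $scope.todos.indexOf(todo);
            if (i !== -1) { $scope.todos.splice(i, 1); }
        });
};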
Option (1) is more robust as it will surface any problems that occur during the save. It should also provide you with the resource ID, which you will need if you later want to edit and save that todo item; without it you would not know which item to update (unless you are saving everything in a NoSQL type way).

How to reuse a store in an Ext JS 4 MVC application without multiple reloading?

I am working on an Ext JS 4 MVC application.
The application runs a Viewport, which contains a tab panel.
Each tab has its own controller and multiple views.
See my sandbox at http://wap7.ru/folio/ext-reuseable-store/TE.html
I have one Store used several times (e.g. in a top-menu combobox on one tab, and in the clients grid on another).
The Store is configured with autoLoad: true.
The Proxy is configured in the Model.
My problem: the Store is loaded multiple times - once for every mention in a controller's stores array.
If I remove it from one controller's stores array, the combobox becomes empty, even though it specifies store: Ext.getStore('STORE-ID').
Please give me a hint, or an example of re-using a Store (not a Model) as in http://docs.sencha.com/ext-js/4-0/#!/guide/mvc_pt2
You can just instantiate your store and load it; remove the autoLoad.
var store = Ext.create('App.store.YourStore').load();
Then pass that store to all your components, just like you would do when you want a paging bar connected to a grid.
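A minimal sketch of sharing one instance that way (the field names below are placeholders, not from the question):

// Create and load the store once (store class taken from the question).
var clientStore = Ext.create('App.store.PlatformClient').load();

// Hand the same instance to every component that needs it.
var grid = Ext.create('Ext.grid.Panel', {
    store: clientStore,
    columns: [{ text: 'Name', dataIndex: 'name', flex: 1 }]
});

var combo = Ext.create('Ext.form.field.ComboBox', {
    store: clientStore,
    displayField: 'name',
    valueField: 'id'
});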
This normally works fine for getting your store from other controllers:
Ext.getStore('PlatformClient');
I've never tried to put the same store in more than one controller's stores array. That seems strange to me.
There are a couple of other oddities about the code you have posted, though. Maybe they're just typos, and maybe they don't make any difference to the framework, but they're different from what I normally do, so I'll point them out:
First, your models array in the "typical controller" contains a store:
models: [
    'te.store.PlatformClient'
],
That one is probably just a typo.
Second, I don't put the full namespace in my stores arrays. This may or may not make a difference, but maybe ExtJS is prepending the namespace on top of the namespace you have written out, so it thinks you are instantiating a different store whenever it initializes a new controller, thereby causing it to reload. For example, this is more "normal", for whatever it's worth:
stores: [
    'Taxonomy',
    'PlatformClient',
    'DataType'
],
controllers: [
    'Taxonomies',
    'DataType',
    'DataSale',
    'Clients'
],
Try setting it up like that and get rid of the duplicate stores in the other controllers' stores arrays.
Also, I want to make sure that you caught the bit in the docs about not needing to define a storeId config for MVC stores. The framework will automatically give the store this:
storeId: [StoreClassName]
So in your example, you would get this automatically:
storeId: 'PlatformClient'
Someone else had trouble with MVC stores recently and it traced back to the framework being flabbergasted by the audacity of the dev supplying their own storeId config with the MVC pattern.
We used to have similar problems with loading. We also had issues with filtering when the same store is used in multiple places (you may want to set a filter on one but not on the other). Therefore we load all the stores at application launch. Then, whenever we need a store for display purposes, instead of using the original we clone it in memory with a utility function like the one below.
/**
 * Use this, for example, if you want to apply a filter on a store
 * but you don't want the original store to change, so:
 * the singleton store has no filter,
 * and you clone it to be used with filters in some places.
 *
 * Note: the clone uses a memory proxy, so no changes to it are persistent;
 * changes will have no effect on the local/remote db.
 */
createStore: function (storeId, data) {
    //
    // Creates a new store from the given array of records without
    // registering the new store.
    // See cloneStore for more info
    //
    var modelName = storeId;
    var prevStore = Ext.getStore(storeId);
    data = data || prevStore.data.all;
    var clonedStore = Ext.create('App.store.' + storeId, {
        data: data,
        model: 'App.model.' + modelName,
        proxy: 'memory'
    });
    // Keep the original (not the clone) registered with the StoreManager.
    Ext.data.StoreManager.register(prevStore);
    return clonedStore;
}
Please note that the cloned copy uses a memory proxy, so write operations shouldn't be performed on it. If you need to update the store, always use the original.
I hope this helps in some way.
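A hypothetical usage of that helper (the filter field and the detailGrid reference are placeholders):

// Clone the registered 'PlatformClient' store; filter only the clone so the
// original shared store keeps its unfiltered data.
var filteredClients = this.createStore('PlatformClient');
filteredClients.filter('active', true);

// Point the detail grid at the filtered clone.
detailGrid.reconfigure(filteredClients);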

Add a "permanent" filter to a store, until I manually call clearFilter

I'm using a store to fetch the specializations of all hikers (so a hiker has many specializations). However, I have a detail window where these specializations are added/removed/shown only for the currently selected hiker (yes, it's a detail window).
My problem here is that my store fetches data for all hikers, but I want it to show, when the detail window is up, only the data for the given hiker. Notice also that I'm showing the data in a data grid, so the user can possibly add filters, and I noticed that if I add a filter with store.filter({...}) and the user adds a filter with the data grid, my filters are removed (so basically they are useless).
Which approach should I use? Do you have any suggestions? I was thinking about one store for each hiker, but I don't like that solution.
Edit 1:
I also noticed that I can't create a filter in the same way the data grid builds them:
Ext.create('Ext.util.Filter', {
    type: 'numeric',
    comparison: 'eq',
    field: 'hiker_id',
    property: 'hiker_id',
    value: me.hiker.get('id'),
    root: 'data'
})
Which is annoying, because I have already implemented server-side functionality that parses the grid filters.
We just pass our filters in JSON format through the store's extra params and parse that on the back end. The extra params stay the same while paging or refreshing the grid.
grid.getStore().getProxy().extraParams = { filter: yourFilter };
How are your users applying the filter?
store.filter accepts both arrays and functions, so you can do quite a bit with it without actually juggling the data on the server (e.g. manage an array that is being passed to the filter, test against an object somewhere, or whatever).
A permanent filter? Not so much, but since you can add multiple filters etc., it is relatively trivial to keep track of which filters are in place.
http://docs.sencha.com/ext-js/4-1/#!/api/Ext.data.Store-method-filter
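For instance, a sketch of keeping one "permanent" filter in a single place and re-applying it together with any other filters whenever the grid has wiped them (hiker_id and me.hiker come from the question; the helper and store names are made up):

// The "permanent" filter, kept around so it can always be re-applied.
var hikerFilter = new Ext.util.Filter({
    property: 'hiker_id',
    value: me.hiker.get('id'),
    exactMatch: true
});

// Re-apply the permanent filter plus whatever else you are tracking.
function applyFilters(store, extraFilters) {
    store.clearFilter(true); // silent clear
    store.filter([hikerFilter].concat(extraFilters || []));
}

applyFilters(specializationStore);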

How to generate model ids with Backbone.js

I am setting up the Backbone sync mechanism and am a bit confused about where to generate the ids for the models.
When I create a new model, should Backbone be generating and setting the id, or am I supposed to implement an id generation method, or is there some sort of mechanism where I "PUT" the data to the server, which generates the id and returns a model with the id?
I'm providing a second answer to simplify the code you need to study to get to the main points you're pondering: the actual round trip from model to server and how ids play their role.
Say you define a model - let's go with Jurassic Park.
// Define your model
var Dinosaur = Backbone.Model.extend({
    defaults: {
        cavemanEater: undefined // Has one property, nom nom or not.
    },
    urlRoot: 'dino' // This urlRoot is where the model can be saved or retrieved
});
var tRex = new Dinosaur({'cavemanEater':true});
You now have instantiated a dinosaur that is a meat eater. Roar.
console.log(tRex);
What you should notice is that in the properties of tRex, your model does not have an id. Instead, you will see a cid, which you can think of as a temporary id that Backbone automatically assigns to your models. When a model doesn't have an id, it is considered new. The concept of persisting a model (either to a database or local storage) is what allows you to go back to that resource after you've created it and do things like save (PUT) or destroy (DELETE). It would be hard to find that resource if you had no way to point directly at it again! In order to find that resource, your model needs an id, something it currently does not have.
So, as the above answers have explained, it is the job of your database (or localStorage, or some other solution) to provide Backbone with a resource id. Most of the time this is the resource's own id, aka the primary key id of your model in some table.
With my setup, I use PHP and MySQL. I would have a table called Dinosaur, and each row would be a persistent representation of my dino model. So I'd have an id column (unique auto-incrementing int) and a cavemanEater column (bool).
The data communication flow happens like this.
You create a model.
The model is new, so it only has a cid - no proper id yet.
You save the model.
The JSON representation of your model is SENT to your server (POST).
Your server saves it to the table and gives it a resource id.
Your server SENDS BACK a JSON representation of the data: {id: uniqueID}.
Backbone RECEIVES this JSON representation with the id.
Backbone automagically updates your model with the id.
Here is what annotated code looks like.
CLIENT:
tRex.save();
// {'cavemanEater':true} is sent to my server
// It uses the urlRoot 'dino' as the URL to send to, e.g. http://www.example.com/dino
SERVER:
// It is set up to accept POST requests on this specific route: '/dino'
// Server parses the json into something it can work with, e.g. an associative array
// Server saves the data to the database. Our data has a new primary id of 1.
// Data is now persisted, and we use this state to get the new id of this dino.
$dinoArray = array('id'=>1, 'cavemanEater'=>true);
$dinoJSON = json_encode($dinoArray);
// Server does something to send $dinoJSON back.
CLIENT:
// If successful, Backbone receives this JSON with the id and updates your model.
Now your tRex has an id = 1. Or should I say...
tRex.toJSON();
// RETURNS {'id':'1', 'cavemanEater':'true'}
Congrats. If you now call tRex.isNew(), it will return false.
Backbone is smart. It knows to POST new models and PUT models that already have a resource id.
The next time you do this:
tRex.save();
Backbone will make a PUT request to the following URL.
http://www.example.com/dino/1
That is the default behavior, by the way. But what you'll notice is that the URL is different from the one used by the first save. On the server you would need a route that accepts /dino/:id, as opposed to /dino.
It will use the /urlRoot/:id route pattern for your models by default unless you tweak it otherwise.
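If you do want to tweak it, one option is overriding url() on the model; a hypothetical sketch (the /api/v2 path is made up for illustration):

var Dinosaur = Backbone.Model.extend({
    urlRoot: 'dino',
    // Hypothetical override: point requests at a versioned API path instead.
    url: function () {
        var base = '/api/v2/dino';
        return this.isNew() ? base : base + '/' + encodeURIComponent(this.id);
    }
});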
Unfortunately, dinosaurs are extinct.
tRex.destroy();
This will call... Can you guess? Yep. DELETE request to /dino/1.
Your server must distinguish between requests to different routes in order for Backbone to work. There are several server-side technologies that can do this.
Someone mentioned Sinatra if you're using Ruby. Like I said, I use PHP and the Slim PHP framework. It is inspired by Sinatra, so it's similar, and I love it. The author writes some clean code. How these RESTful server implementations work is outside the scope of this discussion, though.
I think this is the basic full journey of new Backbone data with no id, across the internets to your server, which generates and sends back the resource id, so your model can live happily ever after. (Or not, if you destroy() it...)
I don't know if this is too basic for you, but hopefully it will help someone else who runs into this problem. Backbone is really fun to program with.
Other similar Answers:
Ways to save Backbone JS model data
or is there some sort of mechanism where I "PUT" the data to the server, which generates the id and returns a model with the id?
Kind of. When you call the save method of your model, Backbone makes a POST XHR and your application server should respond with JSON containing an id.
You can see an example here: http://addyosmani.com/blog/building-backbone-js-apps-with-ruby-sinatra-mongodb-and-haml/
Quoting from the link:
post '/api/:thing' do
  # parse the post body of the content being posted, convert to a string, insert into
  # the collection #thing and return the ObjectId as a string for reference
  oid = DB.collection(params[:thing]).insert(JSON.parse(request.body.read.to_s))
  "{\"id\": \"#{oid.to_s}\"}"
end
If you don't know Ruby, keep in mind that the last expression evaluated in a method is automatically returned.
What I understand from your question is that you want to have a collection with models that exist on the server. In order to get these models into the collection, you'd have to call fetch() on the collection.
The url would be "/users" or something similar, which would have to return an array of objects with user data in there. Each item in the array would then be passed to UserCollection.add(). Well, actually they would all be passed at once, but you get the point.
After this your collection is populated. The url on the Model is meant for updating and saving the individual model. The url of the collection will also be used for creating models. Backbone's sync is RESTful, like Ruby on Rails. You can actually learn more about it in the documentation for Ruby on Rails:
http://guides.rubyonrails.org/routing.html
What you would generally do is have a different url for your model than for your controller. After populating your collection, you have ids for each model in there because they came from the server.
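A small sketch of that flow (the /users endpoint and attribute names are assumptions for illustration):

// A collection backed by a server-side list of users.
var User = Backbone.Model.extend({ urlRoot: '/users' });

var UserCollection = Backbone.Collection.extend({
    model: User,
    url: '/users'
});

var users = new UserCollection();

// GET /users; the server returns an array of objects, each with its id,
// so every model in the collection already knows its resource id.
users.fetch({
    success: function (collection) {
        console.log(collection.pluck('id'));
    }
});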
Now when you add a new model based on user input you'd do something like this:
var HomeModel = Backbone.Model.extend({
    defaults: {
        lead: "not logged in"
    },
    url: 'test.php',
    initialize: function () {
        _.bindAll(this, 'handleSave', 'handleError');
        // save() already knows whether this.isNew() and will POST or PUT accordingly.
        this.save(undefined, { success: this.handleSave, error: this.handleError });
    },
    handleSave: function (model, response) {
        // Update the model with whatever the server sent back (e.g. the new id).
        this.set(response);
    },
    handleError: function () {
    }
});

var HomeView = Backbone.View.extend({
    el: 'div',
    initialize: function () {
        _.bindAll(this, 'render');
        this.model = new HomeModel();
        this.model.bind("change", this.render);
    },
    render: function () {
        // Do things to render...
    }
});
var homeView = new HomeView();
The example is from someone else's question I answered; I just added the relevant parts.
The general idea is to save the model when it is created. If you need it somewhere else, you can move the code into a function of the model and call that based on events or anything else.
