ExtJS 4 DirectProxy simultaneous store load - extjs

I have 2 stores with different models but the same direct proxy configuration. When I load these 2 stores (I call store.load() for both stores at the same time), Ext sends only one request (containing both loads) and the second store is not populated with data. I tried setting batchActions to false with no success. I am using Ext Direct Spring on the server side.
Proxy configuration:
proxy: {
    type: 'direct',
    batchActions: false,
    directFn: doctorDirectController.getAll,
    reader: {
        type: 'json',
        root: 'records'
    }
}
When I set a timeout of 1 second, everything works fine:
this.doctorStore1.load();
var me = this;
setTimeout(function() {
    me.doctorStore2.load();
}, 1000);
So, two questions:
How do I force the direct proxy not to batch the getAll requests?
Why is the second store not populated with data? The request and response contain tids that match up.

It turned out to be a problem with Spring. To handle cyclic references between entities (which caused problems during Jackson serialization) I used the following annotation:
@JsonIdentityInfo(generator=ObjectIdGenerators.IntSequenceGenerator.class, property="@doctorId")
When 2 simultaneous store loads occurred, they were serialized in one batch and sent back in one response. The above annotation removed all the duplicates during JSON serialization, causing the second store not to be populated with data.

Related

Block or extend Breeze metadata auto fetching

The current application is an Angular application with Breeze. The application has ~7 entity managers and different data domains (metadata). When the application runs we try to fetch metadata for the entity managers, like:
app.run(['$rootScope', 'datacontext1', ..., function($rootScope, datacontext1, ...) {
    datacontext1.loadMetadata();
    ...
    datacontext7.loadMetadata();
}]);
Every datacontext has its own entity manager and loadMetadata is:
function loadMetadata() {
    manager.fetchMetadata().then(function(mdata) {
        if (mdata === 'already fetched') {
            return;
        }
        ...
        applyCustomMetadata(); // Do some custom job with metadata/entity types
    });
}
Metadata comes from the server asynchronously. A few modules have really big metadata, around 200Kb, which takes some time to load and apply to the entity manager. It is possible that the first Breeze data request executed on the same entity manager starts before this loadMetadata operation has finished, and as I understand it, Breeze then automatically fetches the metadata again. Usually this is not a problem, since the metadata endpoint is cached on the server, but sometimes it produces very strange Breeze behavior: EntityManager.fetchMetadata resolves its promise as "already fetched", and in that case the applyCustomMetadata() operation cannot be executed.
As I understand it, the problem is inside Breeze and the approach it uses to resolve the metadata promise (it seems the HTTP adapter is a singleton, the second request overrides the metadata with the "already fetched" string, and the applyCustomMetadata() operation never executes).
I need to figure out some way to resolve the issue without significant changes to the application.
Logically, I need to keep the whole application from using the entity managers until loadMetadata is done. I am looking for any way at the Breeze level to disable the automatic metadata fetch if one is already in progress (not interrupting the request, just waiting and trying again after some time). Any other ideas are fine as well.
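For illustration, one way to approximate that "wait instead of re-fetching" behavior at the datacontext level, without touching Breeze internals, is to memoize the in-flight metadata promise so every consumer (including the first data query) waits on the same fetch. This is only a minimal sketch reusing the manager and applyCustomMetadata names from the snippet above; the getCustomers helper and the 'Customers' resource are purely illustrative:
var metadataPromise = null; // one per datacontext / entity manager

function loadMetadata() {
    // Reuse the in-flight (or already resolved) promise so concurrent
    // callers never trigger a second fetchMetadata for the same manager.
    if (!metadataPromise) {
        metadataPromise = manager.fetchMetadata().then(function (mdata) {
            if (mdata !== 'already fetched') {
                applyCustomMetadata(); // custom work with metadata/entity types
            }
            return manager.metadataStore;
        });
    }
    return metadataPromise;
}

// Every query helper in the datacontext waits on the same promise:
function getCustomers() {
    return loadMetadata().then(function () {
        return breeze.EntityQuery.from('Customers').using(manager).execute();
    });
}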
Why are you allowing queries to execute before the metadata is loaded? Therein lies your problem.
I have an application bootstrapper that I expose through a global variable; none of my application activities depending on the Entity Manager are started until preliminary processes complete:
var bootstrapper = {
    pageReady: ko.observable(false)
};
initBootstrapper();
return bootstrapper;
function initBootstrapper() {
    window.MyApp.entityManagerProvider.initialize() // load metadata, lookups, etc
        .then(function () {
            window.MyApp.router.initialize(); // setup page routes, home ViewModel, etc
            bootstrapper.pageReady(true); // show homepage
        });
};
Additionally, depending on the frequency of database changes occurring in your organization, you may wish to deliver the metadata to the client synchronously on page load. See this documentation for further details:
http://breeze.github.io/doc-js/metadata-load-from-script.html

ExtJs Model Proxy vs. Store Proxy

OK, I'm stuck on what should be a basic task in ExtJs. I'm writing a simple login script that sends a user name and password combination to a RESTful web service and receives a GUID if the credentials are correct.
My question is, do I use a Model Proxy or a Store Proxy?
To my understanding, Models represent a single record, whereas Stores are for handling sets of data containing more than one record. If this is correct then it would seem that a Model proxy is the way to go.
Following Sencha's documentation at http://docs.sencha.com/extjs/4.2.1/#!/api/Ext.data.Model the code would look something like this:
Ext.define('AuthenticationModel', {
    extend: 'Ext.data.Model',
    fields: ['username', 'password'],
    proxy: {
        type: 'rest',
        url: '/authentication'
    }
});
//get a reference to the authentication model class
var AuthenticationModel = Ext.ModelManager.getModel('AuthenticationModel');
So far everything is OK, until the next step:
//Use the configured RestProxy to make a GET request
AuthenticationModel.load('???', {
    success: function(session) {
        console.log('Login successful');
    }
});
The load() method for the Model class is a static call expecting a single unique identifier. Logins typically depend upon two factors, username and password.
So it appears Store proxies are the only way to validate someone's username and password credential combination in ExtJS. Can someone verify and explain? Any help to understand this would be greatly appreciated.
You just need to know the following:
The store will use its own proxy if you configured one for this instance; if not, it takes the proxy from the model.
So you can easily go with two proxy configurations to enable the multi-CRUD operations on the store and the single-CRUD operations on the Models. Note that the static load method of the Model expects the model id, because it is supposed to load a model by just one id (yes, composite keys are not supported). You will also have to fetch the model instance in the callback (as you did).
Back to your Username/password problem
You can give your session Model a custom 'loadSession' method:
loadSession: function(username, password, config) {
    config = Ext.apply({}, config);
    config = Ext.applyIf(config, {
        action: 'read',
        username: username,
        password: password
    });
    var operation = new Ext.data.Operation(config),
        scope = config.scope || this,
        callback;
    callback = function(operation) {
        var record = null,
            success = operation.wasSuccessful();
        if (success) {
            record = operation.getRecords()[0];
            // If the server didn't set the id, do it here
            if (!record.hasId()) {
                record.setId(username); // take care to apply the right id here!!!
            }
            Ext.callback(config.success, scope, [record, operation]);
        } else {
            Ext.callback(config.failure, scope, [record, operation]);
        }
        Ext.callback(config.callback, scope, [record, operation, success]);
    };
    this.getProxy().read(operation, callback, this);
}
Now call this instead of load.
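A call could then look roughly like this, assuming loadSession is added as a static method next to load (for example via the model's statics config); the credentials and callbacks here are only illustrative:
AuthenticationModel.loadSession('jdoe', 'secret', {
    success: function(record, operation) {
        // The credentials were accepted; the GUID is on the returned record
        console.log('Login successful: ' + record.getId());
    },
    failure: function(record, operation) {
        console.log('Login failed');
    }
});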
I found this in the Sencha documentation, App Architecture Part 2:
Use proxies for models:
It is generally good practice to do this as it allows you to load and
save instances of this model without needing a store. Also, when
multiple stores use this same model, you don’t have to redefine your
proxy on each one of them.
Use proxies for stores:
In Ext JS 4, multiple stores can use the same data model, even if the
stores will load their data from different sources. In our example,
the Station model will be used by the SearchResults and the Stations
store, both loading the data from a different location. One returns
search results, the other returns the user’s favorite stations. To
achieve this, one of our stores will need to override the proxy
defined on the model.
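A rough sketch of that override, using the Station example named in the quote (the URLs, fields, and reader config here are illustrative, not from the guide):
Ext.define('Station', {
    extend: 'Ext.data.Model',
    fields: ['id', 'name'],
    proxy: {
        // Default proxy, used by any store that does not define its own
        type: 'ajax',
        url: '/stations/favorites',
        reader: { type: 'json', root: 'results' }
    }
});

// Stations store: no proxy configured, so it falls back to the model's proxy
var stations = Ext.create('Ext.data.Store', {
    model: 'Station'
});

// SearchResults store: overrides the proxy defined on the model
var searchResults = Ext.create('Ext.data.Store', {
    model: 'Station',
    proxy: {
        type: 'ajax',
        url: '/stations/search',
        reader: { type: 'json', root: 'results' }
    }
});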

What are the best practices for pre-fetching in backbone.js?

What's the best way to pre-fetch the next page in backbone.js?
Is there a built-in mechanism to do that, or do I have to take care of it myself by making Ajax calls and storing the results?
Also, is there a way to preload an entire page, as in jQuery Mobile (http://jquerymobile.com/demos/1.2.0/docs/pages/page-cache.html)?
There is no built-in support for such a thing. It depends on your use case, but you could do a number of things.
1) Use setTimeout() to wait a short time before fetching the data you might need shortly. (Probably not a good solution.)
2) Set up an event handler to fetch the data on a specific event, or something similar:
$('#my-button').on('hover', function() {
    // fetch data
});
To fetch the data you can use the fetch() function on a Backbone model or collection, which returns a jqXHR (or you can use a straight $.ajax() call). You can then wait and see whether it passed or failed:
var fetch = myModel.fetch();
fetch.done(function(data) {
    // Do something with data, maybe store it in a variable that you can use later
})
.fail(function(jqXHR) {
    // Handle the failed ajax call
    // Use the jqXHR to get the response text and/or response status to do something useful
});
No built-in support, but it is actually easy to add. Please refer to the concept of a View Manager; it is able to handle both "view-keeping" tasks and transitions.
In short, the concept is: the view manager is a component responsible for switching from one application view to another. It disposes of the existing view, so it prevents zombies and memory leaks. It can also handle transitions between view switches.
Here is how I handle the loading of pages into an "endless scrolling" list.
Make your backend pagination aware
First of all, you require a DB backend which is capable of handling page load requests:
As an example, refer to my modelloader project on Git, which provides a small CoffeeScript-based framework integrated into a Node.js/Mongoose environment.
Model Loader on Git contains additional information and samples.
Here are the most important points:
Your backend should support the following pagination features:
Each request returns only a partial response, limited to, for example, 20 records (the size of a page).
By default, the last JSON record entry returned by a request contains additional technical and meta information about the request, necessary to allow consumers to implement paging:
{
    _maxRec: "3",
    _limit: "20",
    _offset: "0"
}
_maxRec lists the total number of records in the collection
_limit lists the maximum number of records returned per request
_offset tells you which set of records was passed back, i.e. an _offset of 200 would mean the result list skipped the first 200 records and presents records 201-220
The backend should support the following pagination control parameters:
http(s)://<yourDomainName>/<versionId>/<collection>?offset=<number>
Use offset to skip a number of records; for example, with a limit of 20, you would send a first request with offset=0, then offset=20, then offset=40, and so on until you reach _maxRec.
In order to reduce DB activity, you should provide a possibility to avoid recalculating _maxRec on subsequent requests:
http(s)://<yourDomainName>/<versionId>/<collection>?maxRec=<number>
By passing in a maxRec parameter (normally the one obtained from an earlier paging request), the request handler will bypass the database count statement, which results in one DB activity less (a performance optimization). The passed-in number will be passed back via the _maxRec entry. Normally a consumer fetches the _maxRec number in the first request and passes it back for subsequent requests, resulting in faster data access.
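As a small client-side sketch of that request sequence (the base URL below is a placeholder; only the offset/maxRec convention comes from the description above) — the actual Backbone wiring follows in the next section:
// Build the URL for the next page, reusing _maxRec once it is known.
function nextPageUrl(baseUrl, offset, maxRec) {
    var url = baseUrl + '?offset=' + offset;
    if (maxRec != null) {
        url += '&maxRec=' + maxRec; // lets the server skip its count query
    }
    return url;
}

// Example sequence for a limit of 20:
//   nextPageUrl('/v1/stations', 0)        -> '/v1/stations?offset=0'
//   nextPageUrl('/v1/stations', 20, 135)  -> '/v1/stations?offset=20&maxRec=135'
//   ...continue until offset >= _maxRec from the last response.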
Fire off Backbone Model requests when necessary
So now you have to implement, on the Backbone side, the firing of page-loading requests when necessary.
In the example below we assume a Backbone.View which has a list loaded into a jquery.tinyscrollbar-based HTML element. The list initially contains the first 20 records, loaded via the URL:
http(s)://<yourDomainName>/<versionId>/<collection>?offset=0
In this case the View would listen to the following scrolling events:
events:
    'mousewheel': 'checkScroll'
    'mousemove': 'checkScroll'
The goal is that as soon as the user has scrolled down to the bottom of the scrollable list (e.g. reaches a point 30px above the end of the scrollable list), a request is fired to load the next 20 entries. The following code sample describes the necessary steps:
checkScroll: () =>
    # We calculate the actual scroll point within the list
    ttop = $(".thumb").css("top")
    ttop = ttop.substring(0, ttop.length - 2)
    theight = $(".thumb").css("height")
    theight = theight.substring(0, theight.length - 2)
    triggerPoint = 30 # 30px from the bottom
    fh = parseFloat(ttop) + parseFloat(theight) + triggerPoint
    # The if will turn true if the end of the scrollable list
    # is below 30 pixel, as well as no other loading request is
    # actually ongoing
    if fh > @scrollbarheight && !@isLoading && @endLessScrolling
        # Make sure that during fetch phase no other request intercepts
        @isLoading = true
        log.debug "BaseListView.checkscroll " + ttop + "/" + theight + "/" + fh
        # So let's increase the offset by the limit
        # BTW these two variables (limit, offset) will be retrieved
        # and updated by the render method when it's getting back
        # the response of the request (not shown here)
        skipRec = @offset + @limit
        # Now set the model URL and trigger the fetch
        @model.url = @baseURL + "?offset=" + skipRec + "&maxRec=" + @maxRec
        @model.fetch
            update: true
            remove: false
            merge: false
            add: true
            success: (collection, response, options) =>
                # Set isLoading to false, as well as
                # the URL to the original one
                @isLoading = false
                @model.url = @baseURL
            error: (collection, xhr, options) =>
                @isLoading = false
                @model.url = @baseURL
The render method of the view gets the response back and updates the scrollable list, which grows in size and allows the user to continue scrolling down through the newly loaded entries.
This nicely loads all the data in a paged manner.

Sencha and Java Servlet sync()

I am adapting a tutorial to change from localStorage to a Java servlet, but I am having some problems. I am trying to persist the changes a user makes by calling sync(), but I am getting these errors:
[WARN][Ext.data.Operation#process] Unable to match the record that came back from the server.
I tried checking whether the updated values were being sent to the servlet:
String name = request.getParameter("name");
is null. How do I send the updated values back to the server and read them? I tried looking for a Sencha Touch + servlets tutorial but can't find anything.
This is my sync code:
var showsStore = Ext.getStore("Shows");
if (null == showsStore.findRecord('name', currentShow.data.name)) {
    showsStore.add(currentShow);
}
showsStore.sync();
The expected return for a store sync is an array of JSON records, or nothing.
Add this to your data store:
writer: {
    type: 'json',
    rootProperty: 'data',
    encode: true,
    writeAllFields: true
},
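For context, a Sencha Touch 2-style sketch of where that writer could sit is below; the store name, model, and URL are placeholders. With encode: true the records arrive at the servlet as a JSON string in a request parameter named after rootProperty, so on the Java side you would read request.getParameter("data") and parse it, rather than reading individual field parameters such as "name":
Ext.define('MyApp.store.Shows', {
    extend: 'Ext.data.Store',
    config: {
        model: 'MyApp.model.Show',
        proxy: {
            type: 'ajax',
            url: '/shows',               // placeholder servlet URL
            reader: {
                type: 'json',
                rootProperty: 'data'     // server wraps returned records in "data"
            },
            writer: {
                type: 'json',
                rootProperty: 'data',
                encode: true,            // send records as a 'data' form parameter
                writeAllFields: true
            }
        }
    }
});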

Writing store updates back to server

I have read the Sencha tutorials, API docs, and forums for a while but am hitting a brick wall. I have a MySQL/PHP backend that stores a web application's data. I have the following Sencha/ExtJS construct:
App.stores.user = new Ext.data.Store({
    model: 'User',
    proxy: new Ext.data.AjaxProxy({
        url: 'app/stores/scripts/connect.php',
        extraParams: {
            method: 'user',
            user_id: 3
        },
        reader: {
            type: 'json',
            root: 'root'
        }
    }),
    autoLoad: true
});
Data is loaded into the store fine, but I have a form that directly updates a User instance.
App.controllers.account = new Ext.Controller({
    save: function (options) {
        options.user.set(options.data);
        options.user.save(); // Generates error: Uncaught Error: You are using a ServerProxy but have not supplied it with a url.
    }
});
How can I successfully implement/hook the functions to write the dirty records back to the server? How is the request prepared and passed?
Thank you.
Up until Ext JS 3.3.1, the store needs to be configured with a writer when performing CRUD operations. It isn't needed for read-only use. That's why your store loads, but you get an error during write operations. In your case you'll have to specify a JsonWriter. Also, if you're not using the store in a RESTful manner, you'll also need to specify an HttpProxy object to tell the store at which URL to publish the create/update/delete operations.
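Translated to the proxy style used in the question, a sketch might look like this; the writer block is the addition being suggested, and the rest simply mirrors the original store config. Note also that record.save() goes through the model's proxy rather than the store's, which is likely what the "ServerProxy ... not supplied it with a url" error points at, so either give the User model its own proxy with a url or add the changed record through the store and save/sync the store instead:
App.stores.user = new Ext.data.Store({
    model: 'User',
    proxy: new Ext.data.AjaxProxy({
        url: 'app/stores/scripts/connect.php',
        extraParams: {
            method: 'user',
            user_id: 3
        },
        reader: {
            type: 'json',
            root: 'root'
        },
        writer: {                 // added so create/update/delete get encoded
            type: 'json',
            writeAllFields: true
        }
    }),
    autoLoad: true
});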
Check the docs for Ext 4. It should be something similar there too.
