AngularJS performance of $scope.$watch vs manual function call?

Assuming I have a very complicated object where each key has very large nested structures for values. Each entry in the object (key, complex nested value pair) comes as a result of a call to some http service.
Performance-wise... is it better to....
1) In a service, $watch the object for any new keys, then, inside the watch listener, call a function to do something.
2) After a successful HTTP request, manually call the functions that would otherwise have been in the watch listener to process the new information.
I think the Angular way is to watch, but if option 1 would cause a performance issue due to the constant watching, I would prefer option 2.
What is the most performant way to do this, given that the object being watched could potentially hold a large volume of keys/values?

I would recommend creating a version on the scope and watching that. i.e.:
$scope.version = 1;
Then, whenever your code changes something about the object, simply increment the version:
$scope.version += 1;
And finally, set your watch on "version" to respond to any changes.
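A minimal sketch of this pattern, assuming an $http-backed service fills the object (the data object, endpoint, and processNewEntries handler below are illustrative, not from the original question):
$scope.data = {};   // the large, deeply nested object
$scope.version = 1;
$http.get('/api/entries').then(function (response) {
  angular.extend($scope.data, response.data); // merge the new key/value pairs
  $scope.version += 1;                        // signal the change
});
// Watching the integer is cheap: each digest compares one number instead of
// deep-comparing the whole nested object.
$scope.$watch('version', function (newVal, oldVal) {
  if (newVal !== oldVal) {
    processNewEntries($scope.data); // hypothetical handler
  }
});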

Related

AngularJS: Create a watcher that would get garbage collected

Is it possible to create a watcher that is tied to an object such that
1) the watcher's handler function uses the object but does not prevent it from being garbage collected; and
2) if the object is garbage collected, the watcher is also deregistered?
The use case is that a component has a collection of internal "helpers" that caches derived data, so they need to $watch scope variables of the parent. However, they are not angular components (no individual scopes for them) and they're added/removed dynamically without a way of hooking their life-cycle.
I guess I can $watch the collection and process any removals manually, but this seems extremely cumbersome. A more direct solution is welcomed.
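For reference, a hedged sketch of the manual alternative: since Angular offers no GC-tied deregistration, each helper can keep the deregistration function that $scope.$watch returns and expose an explicit dispose() for whatever code removes it (all names below are illustrative):
function Helper(scope, watchExpression) {
  var self = this;
  this.cache = null;
  // $watch returns a deregistration function; keep it for later.
  this.unwatch = scope.$watch(watchExpression, function (value) {
    self.cache = deriveData(value); // hypothetical derivation of cached data
  });
}
Helper.prototype.dispose = function () {
  this.unwatch(); // explicitly deregister so the helper can be collected
};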

Avoiding watchers with AngularJS on Object.defineProperty

I have extended my entities, which are provided by breezejs, with properties whose values are the result of a recursive calculation. The properties are display-only and I never want to store their values.
In particular I added a "start" and an "end" property which depends on the "end" date of the entity before it and the timespan of the current entity. These values are deterministic and there is no sense in storing them. This works brilliantly until I have a few thousand entities. At this point performance tanks due to the two way binding as a result of all of the properties that were touched during the calculation.
I have a caching scheme in place and simply want angularjs to do nothing during the calculation of the "start" and "end" properties.
How can I prevent any watchers/two way binding from occurring while I am calculating these properties?
In a broader sense, how can you work with large amounts of data within angularjs without drowning in watchers?
Edit:
Something like this appears to be the solution:
https://coderwall.com/p/d_aisq/speeding-up-angularjs-s-digest-loop
There, the watchers on the scope are suspended during an operation and then restored after I am done:
var watchers = scope.$$watchers;
scope.$$watchers = [];
// do fast computation that touches a lot of data
scope.$$watchers = watchers;
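A slightly safer variant, sketched below, wraps the trick in a helper that restores the watchers even if the computation throws; note that $$watchers is a private Angular detail and may change between versions:
function withWatchersSuspended(scope, fn) {
  var watchers = scope.$$watchers; // stash the registered watchers
  scope.$$watchers = [];
  try {
    fn(); // fast computation that touches a lot of data
  } finally {
    scope.$$watchers = watchers; // always restore them
  }
}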
Edit 2:
This is before binding to the view; this is in assembling the data for binding. I don't think one-time bindings can be done in the controller?
As @zeroflagL suggests, one-time binding is an available option:
An expression that starts with :: is considered a one-time expression. One-time expressions will stop recalculating once they are stable, which happens after the first digest if the expression result is a non-undefined value
Expressions - angularjs.org
Also, if you want to run a calculation that won't trigger a digest cycle, don't change any members on the $scope. Just use var, and at the end of the process do $scope.data = newValue;
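A minimal sketch of that approach, with hypothetical names: the loop below works entirely on local variables, so no watcher sees the intermediate values, and only the final assignment is visible to the digest:
function computeSchedule(entities) {
  var previousEnd = null;
  var results = [];
  for (var i = 0; i < entities.length; i++) {
    var start = previousEnd;
    var end = addTimespan(start, entities[i].timespan); // hypothetical helper
    results.push({ start: start, end: end });
    previousEnd = end;
  }
  return results;
}
// Angular only notices this single assignment.
$scope.schedule = computeSchedule(entities);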

AngularJS: is using angular.extend to bind $scope to a service bad/good practice?

I'd like to know about using
angular.extend($scope, MyService);
Does it break the OOP encapsulation principle?
Does it smell like MyService.call($scope)?
Could you face variable and function conflicts?
Is this a bad/good practice?
Typically, in my experience, services are injected into the controller and then called from that scope. I wouldn't say that using extend to copy over functions and properties is necessarily bad, but it might undermine some of the purpose of IoC (inversion of control), which is what injection in Angular is based on.
Does it break OOP...?
My understanding is that what you would get from this call is additional functions and service calls directly on your scope. This doesn't break OOP in the sense that scope is an object and would have functions applied. Provided those functions + properties make sense on the scope it seems like a fine thing to do from that perspective.
Does it smell like MyService.call($scope)?
As I said in the first paragraph - I don't see why you wouldn't just call the service and either share data or pass in references to objects to the service. Another pattern that is common in angular is to use a promise to process returned data in your scope. That looks like:
MyService.callFunction(parameters).then(function (data) {
  // process data here. Since this runs in $scope you can also use $scope normally.
  // $scope.$apply() is in progress and will complete when the function returns
});
All the service does is provide the data to the scope then. Point is that I think there are better patterns than "extend".
Can you face conflicts?
In the call angular.extend(a, b); data, properties and functions are copied from b to a. If something already exists on a it will be overwritten with the data from b. So technically the short answer is "yes", you can face conflicts.
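A small illustration of that risk, with hypothetical names:
$scope.save = function () { /* controller-specific save logic */ };
// If MyService also exposes a save() function, the controller's version
// above is silently overwritten by the copy below.
angular.extend($scope, MyService);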
The bottom line
So at the end of the day this isn't a bad pattern but there are probably more common patterns I would try to use first.

Is it an antipattern to use angular's $watch in a controller?

In my never ending quest to do things the "proper" angular way, I have been reading a lot about how to have controllers observe the changes in models held in angular services.
Some sites say using a $watch on a controller is categorically wrong:
DON'T use $watch in a controller. It's hard to test and completely unnecessary in almost every case. Use a method on the scope to update the value(s) the watch was changing instead.
Others seem fine with it as long as you clean up after yourself:
The $watch function itself returns a function which will unbind the $watch when called. So, when the $watch is no longer needed, we simply call the function returned by $watch.
There are SO questions and other reputable sites that seem to say right out that using a $watch in a controller is a great way to notice changes in an angular-service-maintained model.
The https://github.com/angular/angular.js/wiki/Best-Practices site, which I think we can give a bit more weight to, says outright that $scope.$watch should replace the need for events. However, for complex SPAs handling upwards of 100 models and REST endpoints, choosing $watch to avoid events with $broadcast/$emit could end up with lots of watches. On the other hand, if we don't use $watch, for non-trivial apps we end up with tons of event spaghetti.
Is this a lose/lose situation? Is it a false choice between events and watches? I know you can use the 2-way binding for many situations, but sometimes you just need a way to listen for changes.
EDIT
Ilan Frumer's comment made me rethink what I'm asking, so perhaps instead of just asking whether it is subjectively good/bad to use a $watch in a controller, let me put the questions this way:
Which implementation is likely to create a performance bottleneck first? Having controllers listen for events (which had to have been broadcast/emitted), or setting up $watch-es in controllers. Remember, large-scale app.
Which implementation creates a maintenance headache first: $watch-es or events? Arguably there is a coupling (tight/loose) either way... event watchers need to know what to listen for, and $watch-es on external values (like MyDataService.getAccountNumber()) both need to know about things happening outside their $scope.
EDIT (over a year later)
Angular has changed / improved a lot since I asked this question, but I still get +1's on it, so I thought I would mention that in looking at the angular team's code, I see a pattern when it comes to watchers in controllers (or directives where there is a scope that gets destroyed):
// $watch returns its deregistration function; passing that function as the
// $destroy listener deregisters the watcher when the scope is torn down:
$scope.$on('$destroy', $scope.$watch('scopeVariable', functionIWantToCall));
What this does is take what the $watch function returns (a function that can be called to deregister the watcher) and give that to the event handler for when the controller is destroyed. This automatically cleans up the watcher.
Whether watches in controllers are a code smell or not, if you use them, I believe the Angular team's use of this pattern should serve as a strong recommendation for how to use them.
Thanks!
I use both, because honestly, I view them as different tools for different problems.
I'll give an example from an application that I built. I had a complex WebSocket Service that received dynamic data models from a web-socket server. The service itself doesn't care what the model looks like, but, of course, the controller sure does.
When the controller was initiated, it set up a $watch on the service data object so that it knew when its particular data object had arrived (like waiting for Service.data.foo to exist). As soon as that model came into existence, it was able to bind to it and create a two-way data-binding to it; the watch became obsolete and was destroyed.
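A hedged sketch of that "watch until it exists, then unbind" pattern (the service and property names are illustrative):
var unwatch = $scope.$watch(
  function () { return WebSocketService.data.foo; },
  function (value) {
    if (angular.isDefined(value)) {
      $scope.foo = value; // the model now exists; bind to it
      unwatch();          // the watch is obsolete, so deregister it
    }
  }
);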
On the other side, the service was responsible for broadcasting certain events as well, because sometimes the client would receive literal commands from the server. For instance, the server might request that the client send some metadata that was stored on the $rootScope throughout the application. An .on() was set up on the $rootScope during the module.run() step to listen for those commands from the server, gather the needed information from other services, and call the WebSocket service back to send the data as requested. Alternatively, if I had done this using a $watch(), I would have needed to set up some arbitrary variable for it to watch, like metadataRequests, which I would need to increment every time I received a request. A broadcast achieves the same thing without having to live in permanent memory, like that variable would.
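A sketch of that command-style event flow; the module reference, event name, and service methods below are all assumptions:
// In the WebSocket service, when a metadata command arrives from the server:
$rootScope.$broadcast('server:requestMetadata');
// In module.run(), a single listener gathers the data and replies:
app.run(function ($rootScope, MetadataService, WebSocketService) {
  $rootScope.$on('server:requestMetadata', function () {
    WebSocketService.send(MetadataService.gather());
  });
});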
Essentially, I use a $watch() when there is a specific value that I want to see change (especially if I need to know the before-and-after values), and I use events if there are more high-level conditions that have been met that the controllers need to know about.
With regards to performance, I couldn't tell you which one is going to bottleneck first, but I feel like thinking of it this way will let you use the strengths of each feature where they are strongest. For instance, if you use $on() instead of $watch() to look for changes in data, you will not have access to the values before and after the change, which could limit what you are trying to do.
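For example, a $watch listener receives both values, which an $on handler does not:
$scope.$watch('account.balance', function (newVal, oldVal) {
  // both the previous and the current value are available here
  console.log('balance changed from', oldVal, 'to', newVal);
});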
All that two-way data-binding is is a $watch on whatever scope property you give to ng-model. ng-model has a controller that allows other directives, like input and form, to sync the ng-model value and re-render the view on a change, which is detected by their registration of events on the DOM.
In essence, ng-model's $watch compares the value in the model to the value it has internally. The value it has internally is set by supporting directives (input, form, etc.).
IMHO, the only "events" you should react to in an Angular application are user-created ones (i.e. DOM events). These are solved with directives on the DOM and ng-model linking to the model.
Also, naturally, there is async, for which Angular provides $q, whose callbacks invoke a $digest.
As for performance, the Angular docs say it really well: watches run on every $digest, so make them fast. What's every $digest? Angular traverses all of your active scopes. Each scope has watches, which it executes, performing comparisons on their values. If there are diffs, it runs again (the next loop around). It's not that simple, because it's optimized, but all of your "angular code" executes in this $digest loop. A lot of directives might invoke a digest with scope.$apply(...), which will cause watches of whatever value they changed to notice and do their thing.
So, your original question: is it an anti-pattern? Absolutely not, if you know how to use it. Though I'd just use ng-model, because it has had 1.2.10+ iterations of pretty smart people working on it... All of the other "reactive-ness" of your app can be handled by $q, $timeout and the like.
I think they all have their proper place and, for me, it would be difficult to say stop using one for the others.
Data binding should always be used to keep your data model in sync with changes from the view. I think we can all agree on that.
I think using a watch on a controller to trigger some action based on a data change is useful, like watching a complex data model to calculate a running total for an invoice, or watching a model to flag it as dirty.
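For example, a deep watch (third argument true) can recompute an invoice total whenever any line item changes; the by-value comparison makes this the expensive kind of watch, so it suits small models (names below are illustrative):
$scope.$watch('invoice.items', function (items) {
  $scope.invoice.total = (items || []).reduce(function (sum, item) {
    return sum + item.price * item.quantity;
  }, 0);
}, true); // deep watch: compares by value on every digest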
I have used broadcast/emit/on when sending a message or an indication of some change from one scope to another that may be several layers away. I have created a custom directive where a broadcast event has been used as a hook to take some action in a controller.

Dereferencing objects with angularJS' $resource

I'm new to AngularJS and I am currently building a webapp using a Django/Tastypie API. This webapp works with posts, and an API call (GET) looks like:
{
  title: "Bootstrap: wider input field - Stack Overflow",
  link: "http://stackoverflow.com/questions/978...",
  author: "/v1/users/1/",
  resource_uri: "/v1/posts/18/"
}
To request these objects, I created an Angular service which embeds resources declared like the following one:
Post: $resource(ConfigurationService.API_URL_RESOURCE + '/v1/posts/:id/')
Everything works like a charm, but I would like to solve two problems:
How to properly replace the author field with its value? In other words, how to resolve every reference field as automatically as possible?
How to cache this value to avoid several calls to the same endpoint?
Again, I'm new to AngularJS and I might have missed something in my comprehension of the $resource object.
Thanks,
Maxime.
With regard to question one, I know of no trivial, out-of-the-box solution. I suppose you could use custom response transformers to launch subsidiary $resource requests, attaching promises from those requests as properties of the response object (in place of the URLs). Recent versions of the $resource factory allow you to specify such a transformer for $resource instances. You would delegate to the global default response transformer ($httpProvider.defaults.transformResponse) to get your actual JSON, then substitute properties and launch subsidiary requests from there. And remember, when delegating this way, to pass along the first TWO, not ONE, parameters your own response transformer receives when it is called, even though the documentation mentions only one (I learned this the hard way).
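A hedged sketch of that idea: the delegation to $http's default transformer is real, but the User resource, the idFromUri helper, and the overall wiring are assumptions, not a tested recipe:
Post: $resource(ConfigurationService.API_URL_RESOURCE + '/v1/posts/:id/', {}, {
  get: {
    method: 'GET',
    transformResponse: function (data, headersGetter) {
      // Delegate with BOTH parameters, as noted above.
      var post = $http.defaults.transformResponse[0](data, headersGetter);
      // Swap the reference URL for a $resource instance (a promise-backed stub).
      post.author = User.get({ id: idFromUri(post.author) }); // hypothetical
      return post;
    }
  }
})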
There's no way to know when every last promise has been fulfilled, but I presume you won't have any code that will depend on this knowledge (as it is common for your UI to just be bound to bits and pieces of the model object, itself a promise, returned by the original HTTP request).
As for question two, I'm not sure whether you're referring to your main object (in which case $scope should suffice as a means of retaining a reference) or these subsidiary objects that you propose to download as a means of assembling an aggregate on the client side. Presuming the latter, I guess you could do something like maintaining a hash relating URLs to objects in your $scope, say, and have the success functions on your subsidiary $resource requests update this dictionary. Then you could make the response transformer I described above check the hash first to see if it's already got the resource instance desired, getting the $resource from the back end only when such a local copy is absent.
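A sketch of such a hash, with hypothetical names: consult the cache before issuing a request, and store the $resource instance on a miss:
var resourceCache = {};
function getUserByUri(uri) {
  if (!resourceCache[uri]) {
    resourceCache[uri] = User.get({ id: idFromUri(uri) }); // hypothetical helper
  }
  return resourceCache[uri];
}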
This is all a bunch of work (and round trips) to assemble resources on the client side when it might be much easier just to assemble your aggregate in your application layer and serve it up pre-cooked. REST sets no expectations for data normalization.
