AngularJS memory consumption issue

Recently I had the opportunity of building a new web application, and thought of trying out Angular to get a good understanding of it. So yeah, I'm fairly new to this framework.
After understanding the nuances of the framework, I found it surprisingly easy to work with. Everything about my experience had been just great, until users started reporting the utterly laggy performance of the application.
The application is fairly simple: it has two screens. One shows a list of deals, and the other lets users add or edit deal information. This second page is a simple form expecting the user to enter deal-related information. It looks like this:
The outlined sections are rendered using ng-repeat. The retailers list has some 530 entries whereas the brand list has about 400 entries.
After a bit of profiling, I figured out that visiting this second form screen would keep on increasing the memory consumption of the browser. The first screen doesn't have any such effect. I simply toggled between the first screen and this second form screen, and found that every time this screen would get loaded, the memory consumption would spike by 50-75 MB. Eventually, the browser would just freeze up. Here's how the memory profile looks:
As you can see, the consumption keeps going up, and there's no sign of any GC! Every spike in the node count and memory trace corresponds to a visit to the second form-based screen.
Now I have already checked out a whole lot of issues around angular and memory consumption, but each of them mentions that the $scope for any of the views will get removed when a new view loads. The DOM node count certainly doesn't indicate such a thing for me :/
I also came across two important points related to the usage of ng-repeat (both illustrated in the sketch below):
Avoid invocation of any function within the ng-repeat directive.
Don't have a two-way binding using ng-model within an ng-repeat directive.
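For concreteness, here is a minimal sketch of what both anti-patterns look like in a template. The names (getBrands, brands, retailers) are hypothetical, not my actual markup:

<!-- 1) A function call in ng-repeat is re-invoked on every digest cycle: -->
<li ng-repeat="brand in getBrands()">{{brand.name}}</li>
<!-- prefer iterating over a precomputed scope property: -->
<li ng-repeat="brand in brands">{{brand.name}}</li>

<!-- 2) ng-model inside ng-repeat creates a two-way binding and a watcher
     per row, which adds up quickly over ~500 rows: -->
<li ng-repeat="retailer in retailers">
    <input type="checkbox" ng-model="retailer.selected">
</li>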
Both of these I've avoided in the second screen, and yet, the memory consumption is going through the roof.
My question might seem like yet another memory-related question w.r.t. Angular, but I've really tried to get some sort of closure on this and haven't found one.
Would really appreciate any assistance on this, as my decision to progress with the usage of angular for the rest of the portal hinges on solving this issue.
Thanks for reading!
Update 1
Based on Ilan's suggestion, let me add that I make use of two plugins: one for rendering the dropdowns and one for implementing the date-picker.
For the dropdowns I'm using Bootstrap-select, and for the date-picker I'm using Bootstrap-datepicker.
For bootstrap-select to work, I had to write a custom directive that emits an event once ng-repeat has rendered its $last item. It looks something like this:
.directive('onFinishRender', function($timeout) {
    return {
        restrict : 'A',
        link : function(scope, element, attr) {
            // Only the last item rendered by ng-repeat triggers the event
            if (scope.$last === true) {
                // $timeout defers the emit until after the DOM has been rendered
                $timeout(function() {
                    scope.$emit('ngRepeatFinished');
                });
            }
        }
    };
});
Then in the controller, I rely on this event to invoke the render for the dropdown plugin:
$scope.$on('ngRepeatFinished', function(ngRepeatFinishedEvent) {
    $('#retailer').selectpicker('render');
});
For bootstrap-datepicker, I do not have to do anything so elaborate, as I only need to wrap the date input field using JS.
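For what it's worth, jQuery plugins like these keep their own references to DOM nodes. A speculative sketch of explicitly tearing them down when the form view's scope is destroyed (the '#dealDate' id is made up, and I haven't verified either plugin's destroy method against the versions in use):

$scope.$on('$destroy', function() {
    // Let the plugins release their cached DOM references and handlers
    $('#retailer').selectpicker('destroy');
    $('#dealDate').datepicker('destroy');
});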
Update 2
After turning off the plugins, the memory consumption reduces drastically. However, the problem of a leak still persists. Earlier, whenever the form view was getting loaded, the memory would spike by 50-60 MB. After turning off the plugins, it spikes by 25-35 MB. But as you can see below, the memory consumption keeps on piling up.

I recently spent days and nights hunting down memory leaks similar to yours. There is no direct answer to your question; you will have to do the research, but I can give you some pointers for finding the leak.
Don't use any plugins in your Chrome browser except the developer toolbar when debugging the leak.
The Timeline is good for figuring out that there is a leak, but to actually see the leak, use the Profiles tab. It runs a GC every time you take a heap snapshot and gives you a clue as to whether you have improved or not.
If you are seeing a memory leak in the MBs, then it is coming from DOM elements. With the size of the leaks you mentioned, I can tell that your whole document is hanging around as a detached DOM tree, because one or two components on your page are not getting released and are still reachable from the attached DOM.
Remove all the elements from your second page and do the switch to see if memory still increases. If it does, the first page has the leak; otherwise do the same with the second page.
Once you have located which page has the leak, remove all the components from that page and add them back one by one to see when the leak returns.
Hope these steps help you in some way. Also, in case it helps: I have found that using $timeout in a directive can cause leaks. A sketch of the fix is below.
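For example, here is your onFinishRender directive again, with the timeout promise cancelled when the scope is torn down (same code as yours plus the cleanup; a sketch, untested):

.directive('onFinishRender', function($timeout) {
    return {
        restrict : 'A',
        link : function(scope, element, attr) {
            if (scope.$last === true) {
                // Keep the promise so it can be cancelled if the scope dies first
                var pending = $timeout(function() {
                    scope.$emit('ngRepeatFinished');
                });
                scope.$on('$destroy', function() {
                    $timeout.cancel(pending);
                });
            }
        }
    };
});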

Related

React error in render/flush: RangeError: Maximum call stack size exceeded in flush()

I've been in the process of rewriting an old AngularJS app in React (actually it's using preact, chosen by the developer who started this project initially).
This app handles large, deeply nested objects that get displayed via Material UI accordions and tables. The data is more WIDE than deep, but at any rate, React has trouble rendering it all without this RangeError.
I've been dancing with this issue for a while now and have avoided it by strategically managing accordions and not rendering data for accordions that are not open.
I've commonly seen this reported as a recursion issue, and I've carefully reviewed the code to confirm there is no recursion involved. Plenty of iteration, but no recursion.
Please note the stack trace, it's hitting this in the flush() function, which is not in our application code, but in the Chrome debugger VM. I've set breakpoints and it appears to be something related to DOM operations as the objects being flushed are React elements. Here's a code snippet from the point where this error is hit:
function flush(commit) {
    const {
        rootId,
        unmountIds,
        operations,
        strings,
        stats
    } = commit;
    if (unmountIds.length === 0 && operations.length === 0) return;
    const msg = [rootId, ...flushTable(strings)];
    if (unmountIds.length > 0) {
        msg.push(MsgTypes.REMOVE_VNODE, unmountIds.length, ...unmountIds);
    }
    msg.push(...operations); // <-- error occurs here when operations.length is too long
And the stack trace logged when the error occurs:
VM12639:1240 Uncaught (in promise) RangeError: Maximum call stack size exceeded
    at flush (<anonymous>:1240:8)
    at Object.onCommit (<anonymous>:3409:19)
    at o._commit.o.__c (<anonymous>:3678:15)
    at QRet.Y.options.__c (index.js:76:17)
    at Y (index.js:265:23)
    at component.js:141:3
    at Array.some (<anonymous>)
    at m (component.js:220:9)
The error occurs when operations is too large. Normally it will be anywhere from a dozen or so in length up to maybe 3000, depending on what's going on, but when I try to load our page displaying the wide/deep nested object, this number is more like 150000, which apparently is choking the spread operator.
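To illustrate the mechanism (a standalone sketch, not our app code): spreading an array passes every element as a separate call argument, and engines cap the number of arguments, so a sufficiently large array overflows the stack. Pushing in bounded chunks avoids that:

const msg = [];
const operations = new Array(150000).fill(0);
// msg.push(...operations); // RangeError: Maximum call stack size exceeded (exact limit is engine-dependent)

const CHUNK = 10000; // arbitrary size, well below typical argument limits
for (let i = 0; i < operations.length; i += CHUNK) {
    msg.push(...operations.slice(i, i + CHUNK));
}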
My sense is that this type of app is a challenge for React. I cannot think of another example of a React app that displays data the way we do with this. If anyone here has experience with this sort of dataset and can offer suggestions as to how to make this work, please share.
My guess is I'm going to need to somehow break this object up into smaller chunks that represent smaller updates, but I'm posting here in case there's something I can learn.
It looks similar to this open issue on the React repo, only it happens in a different place (also in dev tools). Might be worth reporting your issue there too. So probably React is otherwise "fine" rendering this amount of elements, though you'll inevitably get slow performance.
Likely the app is just displaying too much data, or doing it inefficiently.
but when I try to load our page displaying the wide/deep nested object this number is more like 150000, ...
150000 DOM operations is a really high number. Either your app really does display a whole lot of elements, or the old AngularJS app had too many wrapper elements and these were preserved. Since you mention it concerns data tables, it's probably the first reason. In any case, complex applications always need some platform-specific optimization.
If you can give an idea about the intended use case, or even better, share (parts of) the code, that would help others give more targeted advice. Are the 150k operations close to what would happen in real-world usage, or is it just a very inflated number for stress testing? Do you see any other performance regressions, compared to the Angular app, with very complex objects? How many tables are on the screen at a time?
A few hundred visible elements on the screen already gets quite cramped. So where are all these extra operations coming from? Either you're loading a super long page of which a user can only see a few percent at a time, or the HTML structure is unnecessarily deeply nested.
Suggested performance improvements
I wouldn't say React isn't suitable for really large amounts of data, but you do need to watch out for some things yourself. React is only your vehicle to apply changes to the DOM. Putting a large amount of elements in the DOM is always going to lead to decreased performance, and is something you usually want to avoid.
In this case you could consider whether it's necessary to display all the table's data, which is probably the bulk of the operations. Using pagination would resolve the problem, and might even make the app more user-friendly. A sketch of that idea follows.
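A minimal sketch of pagination (component and prop names are hypothetical): only the current page's rows ever become React elements, so each commit stays small regardless of the dataset size.

function PagedTable({ rows, pageSize = 50 }) {
    // Only pageSize rows are mounted at any time
    const [page, setPage] = React.useState(0);
    const visible = rows.slice(page * pageSize, (page + 1) * pageSize);
    return (
        <div>
            <table><tbody>
                {visible.map(row => <MyComplexDataRow key={row.id} data={row} />)}
            </tbody></table>
            <button disabled={page === 0} onClick={() => setPage(p => p - 1)}>Prev</button>
            <button onClick={() => setPage(p => p + 1)}>Next</button>
        </div>
    );
}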
If that's not an option, you can maybe use a library like react-lazyload to show/hide the items as they enter/exit the visible part of the table. To achieve this, use their unmountIfInvisible prop. You can then replace a complex data row with a single element that has the same height. The latter is important to preserve the scroll height.
<LazyLoad
    height={100}
    offset={100}
    unmountIfInvisible
    placeholder={<tr height={100}/>}
>
    <MyComplexDataRow />
</LazyLoad>
This way your data table never consists of many more complex elements than can be seen in the viewport. You probably need to tune the offset a bit so that a row is always ready in time as the table is being scrolled.

Memory increment of the application in the browser while running it for several hours

I am stuck with the memory growth of my application, and as it is a single-page app I can't even reload it. After running my application for around 5-6 hours, memory reaches around 600 MB from an initial load of about 120 MB. We made some fixes for this, like setting refs to null in componentWillUnmount(), and memory came down to 400 MB over the same testing period. Still, in the heap snapshots (taken with Chrome's built-in functionality) I can see a lot of detached elements, evidently caused by other parts of the code. So, is there any way to remove all the detached elements when leaving a certain page? And why doesn't the browser remove them from memory, given that detached elements retain some of it?
A DOM node can only be garbage collected when there are no references to it from either the page's DOM tree or JavaScript code.
I suggest you take a look at your code and see if there are functions running when you don't want them to. If you use React or similar frameworks, you have to be careful with their lifecycle (important!).
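For instance, with React the rule of thumb is that everything registered in componentDidMount must be released in componentWillUnmount. A generic sketch (not your code; the names are made up):

class Chart extends React.Component {
    componentDidMount() {
        this.onResize = () => this.forceUpdate();
        window.addEventListener('resize', this.onResize); // global reference created here...
        this.timer = setInterval(() => this.poll(), 5000);
    }
    componentWillUnmount() {
        window.removeEventListener('resize', this.onResize); // ...must be released here
        clearInterval(this.timer); // or the closure keeps the unmounted component reachable
    }
    poll() { /* hypothetical polling logic */ }
    render() { return <canvas />; }
}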
Also, see https://developers.google.com/web/tools/chrome-devtools/memory-problems/
There is a lot of useful information there, such as:
- Investigate memory allocation by function
- Spot frequent garbage collections
So is there any way that I can remove all the detached-element while leaving the certain page or why don't browser removed this from memory
I can't offer any more accurate assumptions or suggestions when all the information we have amounts to "I use JavaScript". Countless consequences from countless combinations of libraries, stacks and techniques make this impossible to guess.

Best open-source grid with smooth, infinite scrolling

When I started working on my current project I was given quite an arduous task: to build something that, in essence, is supposed to replace the big spreadsheet people use internally in my company.
That's why I thought a paginated table would never work, and quite honestly I think pagination is stupid. Displaying dynamically changing data on a paginated table is lame. Say an item on page #2 can, with the next data update, land on page whatever.
So we needed to build a grid with nice infinite scroll. Don't get me wrong, I've tried many different solutions. First, I built a vanilla ng-repeat thing and tried using ng-infinite-scroll, and then ng-scroll from UI.Utils. That quickly got me to the point where scrolling became painfully slow, and I hadn't even used any crazy stuff like complicated cell templates, ng-ifs or filters. Very soon performance became my biggest pain. When I started adding stuff like resizable columns and custom cell templates, no browser could handle all those bindings anymore.
Then I tried ng-grid, and at first I kinda liked it: easy to use, and it has a few nice features I needed. But soon I realized ng-grid is awful. The current version is stuffed with bugs, all the contributors have stopped fixing those and switched to working on the next version, and only God knows when that will be ready to use. ng-grid turned out to be pretty much worse than even vanilla ng-repeat.
I kept trying to find something better. trNgGrid looked good, but it's way too simplistic and doesn't offer the features I was looking for out of the box.
ng-table didn't look much different from ng-grid; it probably would've caused me the same performance issues.
And of course I needed to find a way to optimize bindings. I tried bind-once and wasn't satisfied: the grid was still laggy. (upd: Angular 1.3 offers the {{::foo}} syntax for one-time binding; a quick sketch is below.)
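For reference, the one-time binding difference in a cell template (Angular 1.3+): the watcher is deregistered once the value first stabilizes, instead of being re-checked on every digest.

<!-- regular binding: re-evaluated on every digest cycle -->
<td>{{row.name}}</td>
<!-- one-time binding: the watcher is dropped after row.name resolves -->
<td>{{::row.name}}</td>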
Then I tried React. The initial experiment looked promising, but in order to build something more complicated I'd need to learn React's specifics; besides, that thing feels kinda non-angularesque, and who knows how to test directives built with Angular+React. All my efforts to build nice automated testing failed: I couldn't find a way to make React and PhantomJS like each other (which is probably more Phantom's problem; is there a better headless browser?). Also, React doesn't solve the "appending to DOM" problem: when you push new elements into the data array, the browser blocks the UI thread for a few milliseconds. That of course is a completely different type of problem.
My colleague (who's working on the server side of things), after seeing my struggles, grumbled that I had already spent too much time trying to solve performance problems. He made me try SlickGrid, telling me stories about how it is freakin' zee best grid widget. I honestly tried it, and quickly wanted to burn my computer. That thing completely depends on jQuery and a bunch of jQueryUI plugins, and I refuse to suddenly drop back to the medieval times of web development and lose all the Angular goodness. No, thank you.
Then I came across ux-angularjs-datagrid, and I really, really, really liked it. It uses some smart bad-ass algorithms to keep things very responsive. The project is young, yet it looks very promising. I was able to build a basic grid with lots of rows (I mean a huge number of rows) without straying too much from the way of Angular zen, and scrolling stayed smooth. Unfortunately it's not a complete grid-widget solution: you won't have resizable columns and other things out of the box, the documentation is somewhat lacking, etc.
Also I found this article, and had mixed feelings about it: these guys applied a few undocumented hacks to Angular, and most probably those would break with future versions of Angular.
Of course there are at least a couple of paid options, like Wijmo and Kendo UI. Those are compatible with Angular; however, the examples shown are quite simple paginated tables, and I'm not sure it's even worth trying them. I might end up having the same performance issues. Also, you can't selectively pay just for the grid widget; you have to buy the entire suite, full of shit I'll probably never use.
So, finally, to my question: is there a good, guaranteed, less painful way to have a nice grid with infinite scrolling? Can someone point to good examples, projects or web pages? Is it safe to use ux-angularjs-datagrid, or is it better to build my own thing using Angular and React? Has anybody ever tried Kendo or Wijmo grids?
Please! Don't vote to close this question. I know there are a lot of similar questions on Stack Overflow, and I've read through almost every single one of them, yet the question remains open.
Maybe the problem is not with the existing widgets but with the way you use them.
You have to understand that with over 2000 bindings, Angular's digest cycles can take too long for the UI to render smoothly. In the same vein, the more HTML nodes you have on your page, the more memory you will use, and you might exceed the browser's capacity to render that many nodes smoothly. This is one of the reasons why people use this "lame" pagination.
In the end, what you need in order to get something "smooth" is to limit the amount of data displayed on the page. To make that transparent to the user, you can do pagination on scroll.
This plunker shows you the idea, with smart-table. When scrolling down, the next page is loaded (you would have to implement the previous-page case when scrolling up), and at any time the maximum number of rows is 40.
function getData(tableState) {
    // here you could create a query string from tableState
    // fake ajax call
    $scope.isLoading = true;
    $timeout(function () {
        // if we reset (like after a search or an order)
        if (tableState.pagination.start === 0) {
            $scope.rowCollection = getAPage();
        } else {
            // we load more
            $scope.rowCollection = $scope.rowCollection.concat(getAPage());
            // remove the first nodes if needed
            // (getAPage, lastStart and maxNodes are defined elsewhere in the plunker)
            if (lastStart < tableState.pagination.start && $scope.rowCollection.length > maxNodes) {
                $scope.rowCollection.splice(0, 20);
            }
        }
        lastStart = tableState.pagination.start;
        $scope.isLoading = false;
    }, 1000);
}
This function is called whenever the user scrolls down and reaches a threshold (throttled, of course, for performance reasons),
but the important part is where you remove the first entries from the model once you have loaded more than a given amount of data.
I'd like to bring your attention to Angular Grid. I had exactly the same problems as you describe, so I ended up writing (and sharing) my own grid widget. It can handle very large datasets, and it has excellent scrolling.

Understanding AngularJS and Google Chrome memory management

I was wondering why, even in a simple SPA built with AngularJS, there seems to be DOM leakage. I may be misinterpreting this, but the way I look at it, allocated DOM elements are not being released properly.
The procedure to reproduce is as follows:
navigate to the page on the screenshot with the simple AngularJS application
turn on timeline recording in developer tools
force garbage collection
add an item, and then remove it
force garbage collection
repeat the last two steps at least 3 times
On the screenshot you can see that after you add an item and remove it, there seem to be two more DOM elements left after garbage collection (a jump from 502 to 504 DOM elements).
I was hoping that someone could shed some light on this before I get deeper into investigating what is happening. The reason for this test was a more complicated AngularJS SPA that I am working on, which also seems to leak memory.
I'm doing a similar thing now. What I've noticed is a couple of things:
1) Look at any usage of $(window).fn() -- where fn is any of the functions on the window object. If you're doing that more than once, you're adding multiple event handlers to the global object, which causes a memory leak.
2) $rootScope.$watch() -- similarly, if you're doing this more than once (say, when loading a directive), then you're adding multiple handlers to the global object. (A cleanup sketch for points 1 and 2 follows this list.)
3) In my tests (where I go back and forth between two pages) it seems that Chrome consumes a large amount of memory (in my case, almost 1 GB) before garbage collection kicks in. Maybe when you click "collect garbage" Chrome is not actually doing a GC? Or it runs a GC for JavaScript but not for DOM elements?
4) If you add an event handler to a DOM element and then remove the element from the DOM, the handler never gets GC'ed.
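A hedged sketch of the cleanup pattern for points 1 and 2: deregister everything when the scope that installed it is destroyed (the directive name and watch target are made up):

.directive('resizeAware', function($rootScope) {
    return {
        link: function(scope, element) {
            var onResize = function() { scope.$applyAsync(); };
            $(window).on('resize', onResize);

            // $watch returns a deregistration function; keep it around
            var unwatch = $rootScope.$watch('someGlobalValue', function(value) {
                // react to the change here
            });

            scope.$on('$destroy', function() {
                $(window).off('resize', onResize); // remove only this directive's handler
                unwatch();                         // drop the root-scope watcher
            });
        }
    };
});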

How can I exclude an element from an Angular scope?

My premise was wrong. While AngularJS was certainly slowing things down, it was not due to the problem I describe below. However, it was flim's answer to my question (how to exclude an element from an Angular scope) that made it possible to prove this.
I'm building a site that generates graphs using d3+Raphael from AJAX-fetched data. This results in a LOT of SVG or VML elements in the DOM, depending on what type of chart the user chooses to render (pie has few; line and stacked bar have many, for example).
I'm running into a problem where entering text into text fields controlled by AngularJS brings Firefox to a crawl. I type a few characters, then wait 2-3 seconds for them to suddenly appear, then type a few more, etc. (Chrome seems to handle this a bit better.)
When there is no graph on the page (the user has not provided enough data for one to be generated), editing the contents of these text fields is fine. I assume AngularJS is having trouble when it tries to update the DOM and there are hundreds of SVG or VML elements it has to look through.
The graph, however, contains nothing that AngularJS need worry itself with. (There are, however, UI elements both before and after the graph that it DOES need to pay attention to.)
I can think of two solutions:
put the graph's DIV outside the AngularJS controller, and use CSS to position it where it's actually wanted
tell AngularJS - somehow - to ignore the graph's DIV; to skip over it when keeping the view and model in sync
The second option seems preferable to me, since it keeps the document layout sane/semantic. Is there any way to do this? (Or some even better solution I have not thought of?)
Have you tried ng-non-bindable? http://docs.angularjs.org/api/ng.directive:ngNonBindable
<ANY ng-non-bindable>
...
</ANY>
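Applied to this case, it could look something like this (controller and element names are illustrative): Angular compiles the bindings around the graph but leaves everything inside the ng-non-bindable DIV alone, so the SVG/VML elements never enter the digest.

<div ng-controller="ChartPageCtrl">
    <input ng-model="chartTitle">            <!-- still managed by Angular -->
    <div id="graph" ng-non-bindable></div>   <!-- Angular skips everything inside -->
    <button ng-click="redraw()">Redraw</button>
</div>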
