React error in render/flush: RangeError: Maximum call stack size exceeded in flush() - reactjs

I've been in the process of rewriting an old AngularJS app in React (actually it's using preact, chosen by the developer who started this project initially).
This app handles large, deeply nested objects that get displayed via Material UI accordions and tables. The data is more WIDE than deep, but at any rate, React has trouble rendering it all without hitting this RangeError.
I've been dancing with this issue for a while now and have avoided it by strategically managing accordions and not rendering data for accordions that are not open.
I've commonly seen this reported as a recursion issue, and I've carefully reviewed the code to confirm there is no recursion involved. Plenty of iteration, but no recursion.
Please note from the stack trace that it's hitting this in the flush() function, which is not in our application code but in the Chrome debugger VM. I've set breakpoints, and it appears to be something related to DOM operations, as the objects being flushed are React elements. Here's a code snippet from the point where this error is hit:
function flush(commit) {
  const {
    rootId,
    unmountIds,
    operations,
    strings,
    stats
  } = commit;
  if (unmountIds.length === 0 && operations.length === 0) return;
  const msg = [rootId, ...flushTable(strings)];
  if (unmountIds.length > 0) {
    msg.push(MsgTypes.REMOVE_VNODE, unmountIds.length, ...unmountIds);
  }
  msg.push(...operations); // <-- error occurs here when operations is too long
  // ...rest of flush elided
}
And here is the stack trace logged when the error occurs:
VM12639:1240 Uncaught (in promise) RangeError: Maximum call stack size exceeded
at flush (<anonymous>:1240:8)
at Object.onCommit (<anonymous>:3409:19)
at o._commit.o.__c (<anonymous>:3678:15)
at QRet.Y.options.__c (index.js:76:17)
at Y (index.js:265:23)
at component.js:141:3
at Array.some (<anonymous>)
at m (component.js:220:9)
The error occurs when operations is too large. Normally it will be anywhere from a dozen or so entries up to maybe 3000, depending on what's going on, but when I try to load our page displaying the wide/deep nested object it's more like 150000, which apparently chokes the spread operator.
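For reference, the usual JavaScript workaround for this pitfall, since fn(...arr) passes every array element as a separate argument, is to push in chunks rather than spreading the whole array at once (not that I can patch the devtools code itself; the chunk size below is an arbitrary guess at a safe value):

const CHUNK = 10000; // assumed safe chunk size
for (let i = 0; i < operations.length; i += CHUNK) {
  // spread only a bounded slice, so the argument count stays small
  msg.push(...operations.slice(i, i + CHUNK));
}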
My sense is that this type of app is a challenge for React. I cannot think of another example of a React app that displays data the way we do with this. If anyone here has experience with this sort of dataset and can offer suggestions as to how to make this work, please share.
My guess is I'm going to need to somehow break this object up into smaller chunks that represent smaller updates, but I'm posting here in case there's something I can learn.

It looks similar to this open issue on the React repo, only it happens in a different place (also in the dev tools). It might be worth reporting your issue there too. So React itself is probably otherwise "fine" rendering this number of elements, though you'll inevitably get slow performance.
Likely the app is just displaying too much data, or doing it inefficiently.
but when I try to load our page displaying the wide/deep nested object this number is more like 150000, ...
150000 DOM operations is a really high number. Either your app really does display a whole lot of elements, or the old AngularJS app had too many wrapper elements and these were preserved. Since you mention it concerns data tables, it's probably the former. In any case, complex applications always need some platform-specific optimization.
If you can give an idea about the intended use case, or even better, share (parts of) the code, that would help others give more targeted advice. Are the 150k operations close to what would happen in real-world usage, or is it just a very inflated number from stress testing? Do you see any other performance regressions with very complex objects, compared to the Angular app? How many tables are on the screen at a time?
A few hundred visible elements on the screen already gets quite cramped. So where are all these extra operations coming from? Either you're loading a super long page of which a user can only see a few percent at a time, or the HTML structure is unnecessarily deeply nested.
Suggested performance improvements
I wouldn't say React isn't suitable for really large amounts of data, but you do need to watch out for some things yourself. React is only your vehicle to apply changes to the DOM. Putting a large amount of elements in the DOM is always going to lead to decreased performance, and is something you usually want to avoid.
In this case you could consider whether it's necessary to display all of the table's data, which is probably the bulk of the operations. Using pagination would resolve the problem, and might even make the app more user-friendly.
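As a rough illustration, here's a minimal client-side pagination sketch (the component, prop, and Row names are made up; useState is shown from React, and preact exposes the same hook via preact/hooks):

import { useState } from 'react';

// Only the current page's rows are rendered, so the DOM never holds
// more than pageSize complex rows at once.
function PagedTable({ rows, pageSize = 50 }) {
  const [page, setPage] = useState(0);
  const last = Math.max(Math.ceil(rows.length / pageSize) - 1, 0);
  const visible = rows.slice(page * pageSize, (page + 1) * pageSize);
  return (
    <div>
      <table>
        <tbody>{visible.map((row) => <Row key={row.id} row={row} />)}</tbody>
      </table>
      <button onClick={() => setPage((p) => Math.max(p - 1, 0))}>Prev</button>
      <button onClick={() => setPage((p) => Math.min(p + 1, last))}>Next</button>
    </div>
  );
}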
If that's not an option, you may be able to use a library like react-lazyload to show/hide the items as they enter/exit the visible part of the table. To achieve this, use its unmountIfInvisible prop. You can then replace a complex data row with a single element of the same height. The height matters, to preserve the scroll height.
<LazyLoad
  height={100}
  offset={100}
  unmountIfInvisible
  placeholder={<tr height={100}/>}
>
  <MyComplexDataRow />
</LazyLoad>
This way your data table never consists of many more complex elements than can be seen in the viewport. You'll probably need to tune the offset a bit so that rows are always ready in time as the table is being scrolled.

Related

What are the alternative ways of implementing a Floodlight Counter tag (in Google Tag Manager)?

Here is the story of the current implementation:
I have over 50 ad campaigns. To track user behavior, I have implemented a Floodlight Count tag for each of them. However, this is eating up a lot of container size. Therefore, I am looking for a solution with which I can fire Floodlights dynamically, or get a similar result without implementing Floodlights at all.
I have already implemented this solution. However, it increases the load time of the webpage, as it contains a RegEx table (and my RegEx table has over 50 entries).
I am looking for a solution that involves minimal use of custom variables, mostly using what is available by default in GTM.
What's available by default in GTM (not counting custom variables) wouldn't be sufficient to cover 50 ad campaigns effectively. You could always create 50 triggers and 50 tags, hardcoding your ids and maintaining them separately in different tags, but that is quite far from best practice. It may actually be the worst practice, especially given how the container size is hard-limited in GTM.
Even a large rLUT (regex lookup table) should have a very insignificant cost, given that the variable it's executed against is reasonably small and a URL is most definitely a tiny string. Have you actually measured the page load speed with and without the rLUT? Have you debugged the load-speed issues and narrowed them down to the rLUT? It's actually quite an achievement to make GTM significantly affect page load speed, given that it's all async and non-blocking.
You could reimplement the rLUT as a Custom JavaScript (CJS) variable, which may be preferable if you're good at JS, but it won't improve performance, just as the rLUT won't hinder it.
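For reference, a Custom JavaScript variable in GTM is just an anonymous function that returns a value, so the whole lookup can live in one variable. A sketch, where every path pattern and advertiser id below is a made-up placeholder:

// Maps the page path to a campaign id; one row per campaign.
function() {
  var map = [
    [/\/spring-sale\//, 'DC-1111111'],
    [/\/summer-promo\//, 'DC-2222222']
    // ...48 more rows
  ];
  var path = {{Page Path}}; // built-in GTM variable
  for (var i = 0; i < map.length; i++) {
    if (map[i][0].test(path)) {
      return map[i][1];
    }
  }
}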

React Simple Global Entity Cache instead of Flux/React/etc

I am writing a little "fun" Scala/Scala.js project.
On my server I have Entities which are referenced by uuid-s
(inside Ref-s).
For the sake of "fun", I don't want to use flux/redux architecture but still use React on the client (with ScalaJS-React).
What I am trying to do instead is to have a simple cache. For example:
when a React UserDisplayComponent wants to display the Entity User with uuid=0003,
then the render() method calls into the Cache (which is passed in as a prop);
let's assume that this is the first time that the UserDisplayComponent asks for this particular User (with uuid=0003) and the Cache does not have it yet;
then the Cache makes an AJAX call to fetch the User from the server;
when the AJAX call returns, the Cache triggers a re-render;
BUT! now, when the component asks the Cache for the User, it gets the User Entity from the Cache immediately and does not trigger an AJAX call.
The way I would like to implement this is the following:
I start a render();
"stuff" inside render() asks the Cache for all sorts of Entities;
the Cache returns either Loading or the Entity itself;
at the end of render(), the Cache sends all the AJAX requests to the server and waits for all of them to return;
once all AJAX requests have returned (let's assume that they do, for the sake of simplicity), the Cache triggers a "re-render()", and now all Entities that were requested before are provided by the Cache right away.
Of course it can happen that newly arrived Entities trigger render() to fetch yet more Entities, for example if I load an Entity of type case class UserList(ul: List[Ref[User]]). But let's not worry about that for now.
QUESTIONS:
1) Am I doing something really wrong if I handle state this way?
2) Is there an already existing solution for this?
I looked around, but everything was Flux/Redux etc. along those lines, which I want to AVOID for the sake of:
"fun"
curiosity
exploration
playing around
I think this simple cache will be simpler for my use case because I want to take the "Ref"-based "domain model" over to the client in a simple way: as if the client were on the server and the network were infinitely fast with zero latency (this is what the cache would simulate).
Consider what issues you need to address to build a rich dynamic web UI, and what libraries / layers typically handle those issues for you.
1. DOM Events (clicks etc.) need to trigger changes in State
This is relatively easy. DOM nodes expose callback-based listener API that is straightforward to adapt to any architecture.
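For example (button and updateState are placeholders for your own node and state handler):

// The raw, callback-based DOM API that any architecture adapts:
button.addEventListener('click', function (event) {
  updateState({ clicked: true }); // hand the event off to your state layer
});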
2. Changes in State need to trigger updates to DOM nodes
This is trickier because it needs to be done efficiently and in a maintainable manner. You don't want to re-render your whole component from scratch whenever its state changes, and you don't want to write tons of jQuery-style spaghetti code to manually update the DOM, as that would be too error-prone even if efficient at runtime.
This problem is mainly why libraries like React exist; they abstract this away behind a virtual DOM. But you can also abstract it away without a virtual DOM, as my own Laminar library does.
Forgoing a library solution to this problem is only workable for simpler apps.
3. Components should be able to read / write Global State
This is the part that Flux / Redux solve. Specifically, these are issues #1 and #2 all over again, except applied to global state as opposed to component state.
4. Caching
Caching is hard because cache needs to be invalidated at some point, on top of everything else above.
Flux / Redux do not help with this at all. One of the libraries that does help is Relay, which works much like your proposed solution, except it's far more elaborate and built on top of React and GraphQL. Reading its documentation will help you with your problem. You can definitely implement a small subset of Relay's functionality in plain Scala.js if you don't need the whole React / GraphQL baggage, but you should know the prior art.
5. Serialization and type safety
This is the only issue on this list that relates to Scala.js as opposed to Javascript and SPAs in general.
Scala objects need to be serialized to travel over the network: into JSON, protobufs, or whatever else, but you need a system for this that does not involve error-prone manual work. There are many Scala.js libraries that address this issue, such as upickle, Autowire, endpoints, sloth, etc. Key words: "Scala JSON library" or "Scala type-safe RPC", depending on what kind of solution you want.
I hope these principles suffice as an answer. When you understand these issues, it should be obvious whether your solution will work for a given use case or not. As it is, you didn't describe how your solution addresses issues 2, 4, and 5. You can use some of the libraries I mentioned or implement your own solutions with similar ideas / algorithms.
On a minor technical note, consider implementing an async, Future-based API for your cache layer, so that it returns Future[Entity] instead of Loading | Entity.
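To illustrate the shape of that API, here is a sketch in JavaScript, with Promises standing in for Scala Futures (all names are made up):

// One in-flight or resolved promise per uuid, so repeated asks never refetch.
class EntityCache {
  constructor(fetchEntity) {
    this.fetchEntity = fetchEntity; // e.g. uuid => AJAX call returning a promise
    this.entries = new Map();       // uuid -> Promise<Entity>
  }
  get(uuid) {
    if (!this.entries.has(uuid)) {
      this.entries.set(uuid, this.fetchEntity(uuid));
    }
    return this.entries.get(uuid);
  }
  invalidate(uuid) {
    this.entries.delete(uuid); // next get() refetches
  }
}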

Memory increment of the application in the browser while running it for several hours

I am stuck with the memory growth of my application, and as it is a single-page app I can't even reload it. After running the application for around 5-6 hours, its memory footprint reaches around 600 MB from an initial 120 MB. We made some fixes, like setting refs to null in componentWillUnmount(), and memory dropped to 400 MB over the same test period. But I can still see a lot of detached elements in the heap snapshot (taken with Chrome's built-in tooling), which must be caused by other parts of the code. So is there any way to remove all the detached elements when leaving a certain page? And why doesn't the browser remove them from memory, given that the detached elements retain some of it?
A DOM node can only be garbage collected when there are no references to it from either the page's DOM tree or JavaScript code.
I suggest you take a look at your code and check whether any functions keep running when you don't want them to. If you use React or a similar framework, you have to be careful with its lifecycle (important!).
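For example, a typical lifecycle mistake looks like this (a hypothetical component, but the pattern is very common):

class Chart extends React.Component {
  componentDidMount() {
    this.onResize = () => this.redraw();
    window.addEventListener('resize', this.onResize);
  }
  componentWillUnmount() {
    // Without this, window still references the handler, the handler
    // references the component, and its detached DOM nodes can't be GC'd.
    window.removeEventListener('resize', this.onResize);
    this.node = null; // also drop stored DOM refs, as you already did
  }
  redraw() { /* ... */ }
  render() {
    return <div ref={(n) => { this.node = n; }} />;
  }
}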
Also see https://developers.google.com/web/tools/chrome-devtools/memory-problems/
There is a lot of useful information there, such as:
- Investigate memory allocation by function
- Spot frequent garbage collections
So is there any way that I can remove all the detached elements while leaving a certain page, or why doesn't the browser remove them from memory?
I can't offer a more accurate assumption or suggestion when all the information we have boils down to "I use JavaScript". Countless consequences of countless combinations of libraries, stacks, and techniques make this impossible to guess.

How to make JSON load faster with large data (over HTTP or a web page)

Requesting the page (over HTTP or a web page) is very slow, or even crashes, unless I load my JSON with less data. I really need to solve this since sooner or later I will be using large amounts of data frequently. Here is my JSON data: --->>>
Notes:
1. The JSON contains only Strings and Integers.
2. I used to view my JSON as a tree view in JSONView, a plugin for Google Chrome.
I am using Angular and Node.js. Thanks.
A quick summary of all the things that come to my mind:
I had a similar issue once. My solutions may require changes to the UI.
Pagination
I doubt you can display that much data at one time, so the strategy should be dividing your data into small amounts and then only loading more when the client asks for it.
This way, the whole dataset is no longer stored in RAM as it is currently. This is how forums work (only 20 topics at a time).
Just imagine if StackOverflow made you load the entire history of questions on the main page; how many GB would your browser need just for that?
You can use pagination in the classic way (buttons with page numbers, like Google) or as infinite scroll, as you prefer.
For that you need to adapt your API and keep track on the front end, at every moment, of the index of the pages you have already loaded. There are plenty of examples of this in AngularJS.
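On the Node side, the paginated endpoint can be very small. An Express-style sketch (the route, query parameter names, and the in-memory items array are all assumptions):

// Serves one page at a time instead of the whole JSON blob.
app.get('/api/items', function (req, res) {
  var page = parseInt(req.query.page, 10) || 0;
  var size = Math.min(parseInt(req.query.size, 10) || 20, 100); // cap page size
  res.json({
    page: page,
    total: items.length,
    items: items.slice(page * size, (page + 1) * size)
  });
});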
Only show the beginning of the data
When you look at Facebook comments, you may see a "show more" button. In their case it may be there to avoid breaking the UI, but it can also be used to avoid loading too much data.
You can display only the main lines of your data (titles or the like) and add a button so the user can load more details if they want.
In your data model, the cost seems to be on the second level of "C". Just load data until this second level, and download the remaining part (for this object) only if the user asks for it.
Once again, there's no need to overload; your client's RAM will be thankful, and your client's mobile 3G connection too.
Optimize your data structure
If this is still not enough:
As StefanArya said in a comment, remove the "I" attribute, which is redundant with the JSON key; you can use Object.keys() to get the key names instead.
You also may not need that much precision on your floats.
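As a tiny illustration (the "I" and "V" field names are guesses at your structure):

// Drop the redundant "I" field and recover the name from the object key;
// trim float precision while you're at it.
var data = { A1: { I: 'A1', V: 1.23456789 }, A2: { I: 'A2', V: 2.34567891 } };
Object.keys(data).forEach(function (key) {
  delete data[key].I;                    // redundant with the key
  data[key].V = +data[key].V.toFixed(2); // 2 decimals instead of 8
});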
If I see any other ideas, I'll edit this post later.

Best open-source grid with smooth, infinite scrolling

When I started working on my current project I was given quite an arduous task: to build something that, in essence, is supposed to replace a big spreadsheet people use internally in my company.
That's why I thought a paginated table would never work, and quite honestly I think pagination is stupid. Displaying dynamically changing data in a paginated table is lame: an item on page #2 can, with the next data update, land on page whatever.
So we needed to build a grid with nice infinite scroll. Don't get me wrong, I've tried many different solutions. First, I built a vanilla ng-repeat thing and tried using ng-infinite-scroll, and then ng-scroll from UI.Utils. That quickly got me to the point where scrolling became painfully slow, and I hadn't even used any crazy stuff like complicated cell templates, ng-ifs or filters. Very soon performance became my biggest pain. When I started adding things like resizable columns and custom cell templates, no browser could handle all those bindings anymore.
Then I tried ng-grid, and at first I kinda liked it: easy to use, and it has a few nice features I needed. But soon I realized ng-grid is awful. The current version is stuffed with bugs, all contributors have stopped fixing them and switched to working on the next version, and only God knows when that will be ready to use. ng-grid turned out to be pretty much worse than even vanilla ng-repeat.
I kept trying to find something better. trNgGrid looked good, but it's way too simplistic and doesn't offer the features I was looking for out of the box.
ng-table didn't look much different from ng-grid; it would probably have caused me the same performance issues.
And of course I needed to find a way to optimize bindings. I tried bind-once and wasn't satisfied; the grid was still laggy. (Update: Angular 1.3 offers the {{::foo}} syntax for one-time binding.)
Then I tried React. The initial experiment looked promising, but to build something more complicated I'd need to learn React's specifics; besides, the thing feels kinda non-Angularesque, and who knows how to test directives built with Angular + React. All my efforts to build nice automated testing failed: I couldn't find a way to make React and PhantomJS like each other (which is probably more Phantom's problem; is there a better headless browser?). Also, React doesn't solve the "appending to DOM" problem: when you push new elements into the data array, the browser blocks the UI thread for a few milliseconds. That, of course, is a completely different type of problem.
My colleague (who's working on the server side of things), after seeing my struggles, grumbled that I'd already spent too much time trying to solve performance problems. He made me try SlickGrid, telling me stories about how it's freakin' zee best grid widget. I honestly tried it, and quickly wanted to burn my computer. That thing completely depends on jQuery and a bunch of jQueryUI plugins, and I refuse to suddenly drop back into the medieval times of web development and lose all the Angular goodness. No, thank you.
Then I came across ux-angularjs-datagrid, and I really, really, really liked it. It uses a smart bad-ass algorithm to keep things very responsive. The project is young yet looks very promising. I was able to build a basic grid with lots of rows (I mean a huge number of rows) without straying too far from the way of Angular zen, and scrolling stayed smooth. Unfortunately it's not a complete grid widget solution: you won't get resizable columns and other things out of the box, the documentation is somewhat lacking, etc.
I also found this article and had mixed feelings about it; these guys applied a few undocumented hacks to Angular, and most probably those would break with future versions of Angular.
Of course there are at least a couple of paid options, like Wijmo and Kendo UI. They are compatible with Angular; however, the examples shown are quite simple paginated tables, and I'm not sure it's worth even trying them, as I might end up with the same performance issues. Also, you can't selectively pay for just the grid widget, you have to buy the entire suite, full of shit I'll probably never use.
So, finally, to my question: is there a good, guaranteed, less painful way to get a nice grid with infinite scrolling? Can someone point me to good examples, projects, or web pages? Is it safe to use ux-angularjs-datagrid, or is it better to build my own thing using Angular and React? Has anybody ever tried the Kendo or Wijmo grids?
Please don't vote to close this question. I know there are a lot of similar questions on StackOverflow, and I've read through almost every single one of them, yet the question remains open.
Maybe the problem is not with the existing widgets but with the way you use them.
You have to understand that with over 2000 bindings, Angular's digest cycles can take too long for the UI to render smoothly. In the same vein, the more HTML nodes you have on your page, the more memory you use, and you might exceed the browser's capacity to render so many nodes smoothly. This is one of the reasons people use this "lame" pagination.
In the end, what you need in order to get something "smooth" is to limit the amount of data displayed on the page. To make that transparent, you can paginate on scroll.
This plunker shows the idea, with smart-table. When scrolling down, the next page is loaded (you would have to implement the previous page when scrolling up), and at any time the maximum number of rows is 40.
function getData(tableState) {
  //here you could create a query string from tableState
  //fake ajax call
  $scope.isLoading = true;
  $timeout(function () {
    //if we reset (like after a search or an order)
    if (tableState.pagination.start === 0) {
      $scope.rowCollection = getAPage();
    } else {
      //we load more
      $scope.rowCollection = $scope.rowCollection.concat(getAPage());
      //remove the first nodes if needed
      if (lastStart < tableState.pagination.start && $scope.rowCollection.length > maxNodes) {
        //remove the first nodes
        $scope.rowCollection.splice(0, 20);
      }
    }
    lastStart = tableState.pagination.start;
    $scope.isLoading = false;
  }, 1000);
}
This function is called whenever the user scrolls down and reaches a threshold (throttled, of course, for performance reasons), but the important part is that you remove the first entries from the model once you have loaded more than a given amount of data.
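For completeness, here is roughly what that throttled scroll check could look like (a sketch: viewport, the 200px threshold, and the 100ms delay are arbitrary assumptions, and loadNextPage stands in for whatever ends up calling getData):

var ticking = false;
viewport.addEventListener('scroll', function () {
  if (ticking) return; // throttle: at most one check per 100ms
  ticking = true;
  setTimeout(function () {
    ticking = false;
    var nearBottom =
      viewport.scrollTop + viewport.clientHeight >= viewport.scrollHeight - 200;
    if (nearBottom) loadNextPage();
  }, 100);
});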
I'd like to bring your attention to Angular Grid. I had exactly the same problems as you describe, so I ended up writing (and sharing) my own grid widget. It can handle very large datasets and it has excellent scrolling.
