React performance issue in Mixin.closeall

I've created a React application with a grid that you can filter. I used Griddle for the grid and then added a control that shows the number of visible rows when filtering. However, when filtering there is about a half-second delay between when I calculate the new values to write out and when the update is done by React. As you can see in the gif below, the console logging from my code runs to the bottom of the stack, and from there it's all React code.
I profiled this issue in Chrome and the results are shown in the image below. The top two calls are the last parts of my code. As you can see, that code executes quickly, and then about half a second is spent in the Mixin.closeall call. Any ideas on what could be causing this?
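One mitigation worth trying while investigating where the time goes (a general technique, not a confirmed fix for Mixin.closeall): debounce the filter input so Griddle re-renders once per pause in typing instead of on every keystroke. A minimal sketch, with the scheduler injectable so the behavior is easy to verify; the 250 ms wait is an arbitrary assumption:

```javascript
// Minimal debounce sketch: delays fn until waitMs of inactivity.
// schedule/cancel default to setTimeout/clearTimeout but are injectable.
function debounce(fn, waitMs, schedule = setTimeout, cancel = clearTimeout) {
  let timer = null;
  return (...args) => {
    if (timer !== null) cancel(timer);
    timer = schedule(() => {
      timer = null;
      fn(...args);
    }, waitMs);
  };
}

// Usage idea (hypothetical handler name): wrap whatever recomputes the
// filtered rows, e.g.
// const onFilterChange = debounce(applyFilter, 250);
```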

Related

How to load items using react-window-infinite-loader with react-window FixedSizeGrid

I'm attempting to use react-window-infinite-loader with a FixedSizeGrid from react-window.
Before I start, I created a codesandbox with a minimal example which I'll reference in the question.
The issues I'm having are:
The grid doesn't seem to load all of the items in view. This is seen on initial load where the bottom two items on the first page are still in the loading state. This is strange because according to the docs, the infinite loader will load rows up until the threshold, which is 15 by default.
When scrolling, items near the end of the grid are not loaded in. In the example this can be seen by scrolling to the bottom and then seeing a batch of items in the loading state.
I think my issues come from using the infinite loader with a grid while following the example for a list. In particular, I suspect the isItemLoaded function, because the index it receives doesn't seem to correspond to the item's flat index; I believe it's actually the row index.
In the end I'm not sure how to resolve this, as all the documentation I've seen uses the infinite loader with a list component instead of a grid component.
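For what it's worth, InfiniteLoader's callbacks are list-oriented and expect flat item indices, so wiring it to a FixedSizeGrid typically means converting between (rowIndex, columnIndex) pairs and flat indices yourself. A sketch of that conversion, where COLUMN_COUNT stands in for your grid's actual column count:

```javascript
// Convert grid cell coordinates to the flat item index that
// InfiniteLoader's isItemLoaded / loadMoreItems callbacks expect.
const COLUMN_COUNT = 3; // placeholder: your grid's column count

const cellToIndex = (rowIndex, columnIndex) =>
  rowIndex * COLUMN_COUNT + columnIndex;

// Adapt FixedSizeGrid's onItemsRendered payload (row ranges) to the
// flat { visibleStartIndex, visibleStopIndex } shape InfiniteLoader wants.
const adaptGridItemsRendered =
  (onItemsRendered) =>
  ({ visibleRowStartIndex, visibleRowStopIndex }) =>
    onItemsRendered({
      visibleStartIndex: visibleRowStartIndex * COLUMN_COUNT,
      visibleStopIndex: visibleRowStopIndex * COLUMN_COUNT + COLUMN_COUNT - 1,
    });
```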

Scroll position changes after loading more items into a react-virtualized list

The following issues below point to the same problem I am experiencing:
Scroll jumps to the top when List data gets updated
InfiniteLoader jumps when scrolling up after loadMoreRows completes
Adding new items to List causes Scrollbar Jumping
(Using react-virtualized) vsx-registry: scrolling and selecting a tree-node jumps back to the top
And of course my sandbox here. When you are testing in my sandbox, scroll to the bottom as fast as you can, then stop and wait for loading. One or more jumps will be clearly noticeable.
I am also cross-posting this on GitHub right now. If anyone comes up with a fix, it would help a lot of people using react-virtualized.
Attempts:
When I remove the CellMeasurer for both the content and the loader and replace the row height with a manually defined function, the problem seems to disappear.
To minimize the scenario, I removed the AutoSizer and WindowScroller, leaving only the InfiniteLoader and CellMeasurer. The problem is still there. The jump occurs exactly when rendering the items InfiniteLoader appends to the list.
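One thing worth trying (an assumption on my part, not a confirmed fix): when the new rows arrive, clear the CellMeasurerCache entries only from the first appended index onward, so the measured heights (and therefore the offsets) of rows above the viewport don't change and the scroll position stays anchored. Sketched against a cache assumed to expose clear(rowIndex, columnIndex) like react-virtualized's CellMeasurerCache:

```javascript
// Sketch: invalidate cached measurements only for rows at or after
// firstNewIndex, leaving earlier rows' heights (and offsets) untouched.
// `cache` is assumed to have clear(rowIndex, columnIndex).
function clearMeasurementsFrom(cache, firstNewIndex, rowCount) {
  for (let row = firstNewIndex; row < rowCount; row++) {
    cache.clear(row, 0);
  }
}
```

In real code you would call this from the loadMoreRows promise resolution, then recompute row heights from firstNewIndex.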

Highcharts addPoint animation

I'm using addPoint with animation=true to update a series in Highcharts. When the update frequency is higher than the animation duration, the whole time series shows a strange animation.
Example: http://jsfiddle.net/t797ewpw/13/
Related issue: https://github.com/highcharts/highcharts/issues/7926
I tried to find a workaround (perhaps a callback I could use to track the state of a running animation and delay calls to addPoint), but there seems to be no such callback (only one for the initial animation).
Are there any other workarounds for this issue?
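Since there appears to be no animation-complete callback for addPoint, one workaround sketch (my own idea, not a Highcharts API) is to buffer incoming points and release at most one per tick, driving the tick from a timer no faster than the animation duration:

```javascript
// Buffer points and flush at most one per tick(), so a new addPoint call
// never interrupts the previous point's running animation.
function createPointQueue(flush) {
  const queue = [];
  return {
    push(point) {
      queue.push(point);
    },
    tick() {
      if (queue.length > 0) flush(queue.shift());
    },
    get pending() {
      return queue.length;
    },
  };
}

// Usage idea (hypothetical wiring):
// const q = createPointQueue((p) => series.addPoint(p, true, true));
// setInterval(() => q.tick(), 1000); // interval >= animation duration
```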

Chartjs - Dataset colors are getting overloaded with angular-chartjs

I am using the AngularJS wrapper for Chart.js. I update the graph when new data arrives from a push event. I am also coloring in between selected graph datasets.
When I go to another page and come back to the graph page, the dataset colors become more opaque each time, until the transparency is gone entirely.
It seems to me that the dataset colors don't get cleared when I revisit the page. However, the issue goes away when I refresh the page.
How can I clear the previous dataset details when I revisit?
I tried chart.clear() but it didn't work.
angular-chartjs is outdated and doesn't really work with newer versions of Chart.js.
Check this issue out: #677.
The issue's author gives an example of how to implement Chart.js with a wrapper. Be aware that Chart.js alone doesn't handle the colors anymore; you need to set them yourself.
On the other hand, if you really don't want to do it on your own, you could try my wrapper implementation for Chart.js.
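As an illustration of setting the colors yourself (the palette values below are arbitrary examples, not anything the wrapper requires), you can assign backgroundColor and borderColor per dataset before handing the data to Chart.js:

```javascript
// Sketch: assign explicit colors to each dataset instead of relying on
// the wrapper to pick them. The palette is an arbitrary example.
const PALETTE = [
  'rgba(54, 162, 235, 0.5)',
  'rgba(255, 99, 132, 0.5)',
  'rgba(75, 192, 192, 0.5)',
];

function colorizeDatasets(datasets) {
  return datasets.map((ds, i) => ({
    ...ds,
    backgroundColor: PALETTE[i % PALETTE.length],
    // opaque variant of the same color for the border
    borderColor: PALETTE[i % PALETTE.length].replace('0.5', '1'),
  }));
}
```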

Dealing with elements that are not visible due to zoom level

We have automation processes that have been scraping a website for months. Recently we discovered that our scraper isn't working.
Turns out, the website has been changed so that they have a table with 9 columns placed inside a frame. However, either the frame or table is styled improperly and only the first 6 columns are on the screen at 100% zoom because the table is too big. When you zoom out to about 50%, the rest of the columns manage to fit in the view.
Unfortunately, there is a button on the very last column that I need to click, but Selenium doesn't like it and says the element is not in view at 100% zoom.
When I use the following to check whether it's in view or not
while (!elmt.isDisplayed()) {
    // wait for it
}
and manually zoom the page out so that the column with the button comes into view, it then successfully clicks the button.
How can I zoom the page out?
Yes, I had a similar issue while testing in Safari.
The workaround is to zoom the page out programmatically before you perform any operation on that page.
With the help of JavaScript you can do this:
document.body.style.zoom = "30%"
Or whatever value is required. In Selenium, use the JavaScript executor to run this script.
This workaround worked perfectly for me.
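In case it helps, here is one way to build and run that script from Node's selenium-webdriver (the zoomScript helper and the 50% value are just for illustration):

```javascript
// Build the page-zoom script for a given percentage. In Selenium you
// would run it via the JavaScript executor before clicking, e.g.
// (selenium-webdriver, Node, hypothetical driver instance):
//   await driver.executeScript(zoomScript(50));
const zoomScript = (percent) => `document.body.style.zoom = "${percent}%";`;
```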
