I am working on a Backbone.js application which is nearly done. My problem is that the application seems to require a lot of CPU: on a regular MacBook Air, the Firefox process goes up to 30% CPU when you visit my website.
I can't think of any reason for this. I have 6-7 different Views and a table with about 60 Views (each entry/row is a View object). I also use setInterval() to fetch updates from the API every 10 seconds, but that is 4 HTTP requests in total with a content length of ~1000, which should be totally acceptable.
According to Backbone-Eye I have 66 Models, 67 Views, 4 Collections and 1 Router. I also took a "JavaScript CPU profile", and it shows that a lot of CPU time is spent on rendering/painting, but gives no information on how to reduce it.
I would appreciate any tips on how to reduce the CPU load in my Backbone app.
Stagger the 4 requests you make every 10 seconds: make each one poll somewhere between 9.8 and 10.2 seconds instead of firing them all at exactly 10 seconds.
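A minimal sketch of that staggering, assuming the 4 periodic fetches are driven by Backbone collections held in an illustrative collections array:

    collections.forEach(function (collection) {
      function poll() {
        collection.fetch();
        setTimeout(poll, 9800 + Math.random() * 400); // re-schedule at 9.8-10.2 s
      }
      setTimeout(poll, Math.random() * 400); // desynchronize the first round too
    });

A self-rescheduling setTimeout instead of setInterval also prevents requests from piling up when a fetch takes longer than the polling period.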
After you do these 4 fetches, check whether the content has changed, and only re-render the views if the content from the fetch has changed.
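Backbone can do this check for you: fetch() merges the response via set(), and set() only fires "change" events for attributes whose values actually differ, so binding render to the change event keeps identical API responses from triggering any repaint. A sketch (this.template is assumed to be a precompiled template function):

    var RowView = Backbone.View.extend({
      initialize: function () {
        // render only runs when an attribute value actually changed
        this.listenTo(this.model, 'change', this.render);
      },
      render: function () {
        this.$el.html(this.template(this.model.toJSON()));
        return this;
      }
    });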
Do you have view memory leaks, i.e. zombie views? Do you properly close each row view? Read How To: Detect Backbone Memory Leaks.
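For the row views, closing them properly usually comes down to calling remove() before re-rendering the table. A sketch, assuming the table view keeps its children in an illustrative rowViews array:

    var TableView = Backbone.View.extend({
      closeRows: function () {
        // remove() detaches the element and (since Backbone 0.9.9) also
        // calls stopListening(), unbinding everything registered via
        // listenTo() -- dangling handlers are what keep zombie views alive
        _.invoke(this.rowViews || [], 'remove');
        this.rowViews = [];
      }
    });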
I'm using a Snowflake trial account to do a performance test.
I run 9 heavy queries (20 minutes on an XS cluster) at the same time and watch the warehouse or History pane. However, the page takes far too long to update: about 30 seconds.
I think the cloud services layer (like a Hadoop head node?) doesn't have adequate resources to do this.
Is it because I'm using the trial version? Will this still happen if I use the Enterprise or Business Critical editions?
The "cloud services", is actually unrelated to your warehouses, and 9 queries is not enough overload that. But at the same time the trial accounts might be on slightly underpowered SC layer, but I am not sure why that would be the case. The allocated credit spend is just that.
I am puzzled about what you are trying to "test" by running many queries, all slow for the warehouse size, at the same time.
When you say the page takes 30 seconds to load, do you mean that if you do nothing the query execution status/time only updates every ~30 seconds, or that if you do a full page reload the page is blank for 30 seconds?
I have an application made in Ionic, and it has several sections. One section, on first use, downloads a dataset of 18,000 records (using $http requests) and stores it in a local database (PouchDB). I then read those 18,000 records back and loop over them for certain operations I need.
The problem is that every time I access this section, it gradually becomes slower. In the application settings (app.js) I have the option
cache: false
set on all the routes of my app.
This is supposed to release memory, so I do not understand why the section gradually becomes slower and slower.
That is the way I have my project set up, with controllers and the data in HTML views.
Every time I go to this section I read the 18,000 records and perform operations on them; it is necessary to walk all 18,000 records each time I enter. For example, I have the list of all the cities of my country: someone selects a city from a dropdown, and I then walk all the records for the relevant operations. That is more or less what I have; a sketch of the flow is below.
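A sketch of that flow, assuming a PouchDB instance db, an illustrative API URL, and a hypothetical per-record helper (none of these names are the actual code):

    // one-time download of the ~18,000 records
    $http.get('/api/records').then(function (res) {
      return db.bulkDocs(res.data);
    });

    // ...then on every visit to the section:
    db.allDocs({ include_docs: true }).then(function (result) {
      result.rows.forEach(function (row) {
        processRecord(row.doc); // the per-record operations
      });
    });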
This only happens when I build for Android devices, but I have a Samsung Galaxy S7 and the problem does not occur there; it is always fast. That makes me think it may be a RAM problem.
What can I do?
I have a report with a refresh time of about 1 minute, that I reduced from 5 minutes by doing the following:
Displaying fewer elements (e.g. tables, charts)
Making the formulae as simple as possible (e.g. simplifying nested if statements)
Making the queries simpler (e.g. pulling through fewer columns)
My question is, are there any other ways for me to reduce refresh time that I haven't considered?
Asking around the office showed that the methods I used were optimal from my side; the slow refresh times were caused by systemic under-investment by my employer, e.g.
Universes being clunky and indexed incorrectly
Slow internet connection/being restricted to IE
Slow laptop to access BO4
This led me to build a business case for investing in the infrastructure, by quantifying the amount of time lost waiting for reports to refresh.
I have a complex dashboard that I would like to update every minute. It is an AngularJS SPA with an IIS backend running in Azure.
The dashboard shows approximately 30-40 dashlet widgets. Each widget needs approximately 10 collections of data entities, and each collection gets about 3-5 new data points every minute.
I want to ensure that the app in the browser performs well and stays interactive (this is very important), and that my web servers are scalable (this is secondary, because I'd rather add more web servers than sacrifice speed and interactivity in the browser).
Should I update the whole dashboard at once? (1 very large call, which will probably download 1,200-1,600 data entities... probably a lot more for some users and a lot less for others.) This option puts the most strain on the web servers and is the least scalable from the web server perspective, but I am not sure what the browser impact is.
Update a single dashlet widget at a time? (30-40 chunky calls, each one returning about 40 pieces of information)
Update each collection of data entities inside the dashboard individually? (About 300-400 tiny calls, each returning ~3-5 pieces of information)
One can assume that the total server time to generate the data for 1 large update and for 300-400 individual data points is very similar.
Timeliness of updates is not /super/ important: if some widgets update 10 seconds late and some on time, that's fine. Responsiveness of the browser and the website is important in general, so that if a user decides to interact with the dashlets, everything should feel very responsive.
Appreciate any advice
AngularJS optimizations are all about:
Making sure binding expression evaluation is fast.
Minimizing the number of things being watched.
Minimizing the number of digest cycles: the phase when Angular checks for model changes by comparing old and new model values.
But before you begin to fix or optimize any of the above, it is important to understand how Angular data binding works and what digest cycles are. This SO post should help you in that regard.
Coming back to the possible optimizations.
Fixing the first one is a matter of making sure that any functions you use in binding expressions evaluate fast and do not do any compute-intensive or remote work, simply because binding expressions are evaluated multiple times during multiple digest cycles.
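A sketch of the usual fix, moving the work out of the binding expression (rows and expensiveFilter are illustrative names):

    // {{ filterRows() }} in the template would re-run on every digest;
    // watching the input and precomputing a plain property runs the
    // expensive work only when the data actually changes
    $scope.$watchCollection('rows', function (rows) {
      $scope.filteredRows = expensiveFilter(rows); // bind the view to {{ filteredRows }}
    });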
Second, minimize the number of watches. This requires that you analyze your view's binding behavior (a combined sketch follows this list):
Are there parts which, once bound, do not change? Use the bindonce directive, or if you are on Angular 1.3, Angular itself supports one-time binding using the :: syntax in expressions.
Create DOM elements only for things visible in the view: using ng-if or ng-switch rather than ng-show/ng-hide can help. See how ngInfiniteScroll works.
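A sketch of both ideas in one inline template, assuming a hypothetical dashlet directive registered on an app module:

    app.directive('dashlet', function () {
      return {
        scope: { item: '=' },
        template:
          // ng-if keeps hidden widgets out of the DOM entirely, so none
          // of their bindings are registered (unlike ng-show/ng-hide)
          '<div ng-if="item.visible">' +
          '  <span>{{::item.label}}</span>' + // one-time binding, watcher dropped after the first stable value
          '  <span>{{item.value}}</span>' +   // live binding
          '</div>'
      };
    });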
Lastly, reducing the number of digest cycles helps, as it means fewer dirty checks to perform over the lifetime of the app.
Angular will perform these digest cycles at various times during application execution.
Each remote call also results in a complete digest cycle, hence reducing remote calls will be helpful.
There are some further optimization possibilities if you use scope.$digest instead of scope.$apply: scope.$apply triggers an app-wide digest cycle, whereas scope.$digest only dirty-checks that scope and its children.
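For example, if the dashboard updates arrive outside Angular (a raw WebSocket here; socket and widgetScope are illustrative names):

    socket.onmessage = function (event) {
      widgetScope.data = JSON.parse(event.data);
      // a local dirty check of this widget's scope and its children,
      // instead of the $rootScope-down digest that $apply() would trigger
      widgetScope.$digest();
    };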
To actually optimize the digest cycle, look at Building Huuuuuge Apps with AngularJS by Brian Ford.
But before anything else, measure how things are working using tools like Batarang, and make sure such optimizations are actually required.
I have noticed that my CakePHP site is very, very slow. I have rewritten my entire site in CakePHP with exactly the same functionality, and it takes 400 ms to generate every page instead of 20 ms. 400 ms is far from the 50-100 ms parse times I am hoping to achieve. Site speed is very important to me; it was one of the reasons I moved away from learning more about Drupal.
When I wrote all SQL queries myself and worked with simple includes, there was no need to do much optimizing. Now, though, I have to start optimizing the code.
All pages show, in a block, the number of users, news posts, articles and a few other things that have been posted. This takes 9 SQL queries and seems to cost some performance. That is what I want to use caching for.
At the moment my site doesn't get that many visitors, and I'm mainly rebuilding it to become a better web developer, so the high parse time bums me out. I am going to remove Croogo altogether and work only with self-written code; I have already stumbled on many horribly performance-degrading parts of Croogo.
I would like to run a cronjob every 2 hours that executes those 9 queries and saves the results in the cache. My question is how I can keep data in the cache longer: data is normally cached for 10 minutes, but I'd like to cache this specific data for 150 minutes. I know it can be done via core.php, but I don't want to cache everything for 150 minutes, just the statistics data for the leftmost block at www.daweb.nl.
Statistics
Articles:
Members:
Javascripts: 29
News posts: 4
Nodes: 16
PHP Scripts:
Members, Articles and PHP Scripts are empty, which means nobody has accessed the pages that generate the relevant data. I could write a long block of code with a lot of if (cache exists) / else (generate cache), but that is not going to make things much prettier either, and I would have no idea where to place that code. I am not looking to write a bunch of code in app_controller.php; that can't be good for the site.
If site speed is more important to you than the automagic Cake has to offer, then you might want to look at CodeIgniter.
Anyway, here's how to set the cache settings for elements: http://book.cakephp.org/view/1083/Caching-Elements