I am developing a website in Drupal 7. Each page takes 8 to 10 seconds to load the first time; after that it takes 1 to 2 seconds. I have enabled caching, CSS compression and JS compression.
Please suggest how to reduce the first-time loading time.
There are some steps by which you can improve your site's performance:
If you use images on your site, keep them small (kilobytes rather than megabytes).
Use HTTP compression.
Use a caching layer such as Varnish, Memcached or APC.
Enable Drupal's built-in cache system (in your case it is already enabled).
Enable CSS and JS compression (in your case it is already enabled).
For further details, please follow these two links:
drupal.stackexchange.com/why-is-drupal-7-so-slow
5-ways-to-improve-performance-in-drupal
I have a small but fun-to-develop app. It was a quick experiment to learn a bit more about Redux and React, and I got to the point where I consider the app mature enough to start learning about optimization.
I did some pure-component optimization attempts, but they didn't improve the time to first load, so I moved on.
The next optimization I tried was using React.lazy to lazily load some components that I don't need on the first render. For example, I have an error component that I only need if I have to show an unlikely error, so that is what I split out, and surprisingly (according to Lighthouse) all the first-load metrics (time to interactive, first meaningful paint, etc.) got way worse.
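The split looks roughly like this (a minimal sketch; ErrorScreen and its path are placeholder names for illustration, not the actual ones from the repo):

import React, { Suspense, lazy } from 'react';

// Only fetch the error screen's chunk when an error actually has to be shown.
// (ErrorScreen and './ErrorScreen' are placeholders, not real project files.)
const ErrorScreen = lazy(() => import('./ErrorScreen'));

export default function App({ error }) {
  if (!error) return <div>normal app content</div>;
  return (
    <Suspense fallback={null}>
      <ErrorScreen message={error.message} />
    </Suspense>
  );
}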
Here is a screenshot of the report before trying to use React.lazy:
As you can see, from the performance point of view there was not much to improve, but as I was trying to learn modern React, I tried anyway. Here is the best I have been able to get using React.lazy to split out one component:
As you can see, it is way worse. The problems it detected were not all related to caching policies; here they are:
It seems the main thread is getting busier parsing all the JavaScript. That makes no sense to me, because inspecting the network requests in detail in Chrome DevTools (on the Performance tab) shows the resulting bundles being downloaded in parallel. The bundles in both versions are almost the same size, except that the split version of the application has 5 chunks instead of 2:
All four requests are GET, priority High, MIME type application/javascript:

Request                                                               Duration    Encoded data   Decoded body
bundle.js (without code split)                                        45.62 ms    6.5 KB         31.0 KB
bundle.js (with React.lazy split)                                     28.63 ms    7.1 KB         33.7 KB
0.chunk.js (first downloaded chunk)                                   255.83 ms   579 KB         2.7 MB
5.chunk.js (first chunk with React.lazy split; labeled 5 but loaded first)   276.40 ms   559 KB   2.6 MB
My conclusion is that React.lazy introduces a significant overhead that only pays off if the size of the loaded components is big enough.
However, does that mean that big applications can never score high on first paint?
I made some bigger apps with Vue that scored almost 90 on performance, so I'm pretty sure I'm doing something wrong here.
Something to mention is that the first screenshot is being served from GitHub Pages while the second is being served locally, but that should not influence the problem at hand, should it?
The code for the non-split version of the app is publicly available here: https://github.com/danielo515/itunes
The biggest time consumption is “Script Evaluation” at 1.672ms, so try to reduce that time:
Analyze the size of your JavaScript and see which libraries you can replace with smaller versions or with plain JavaScript. If you use CRA, try Analyzing the Bundle Size or webpack-bundle-analyzer (see the sketch after this list). For example, instead of lodash you may be able to use the smaller lodash-es.
Use server-side rendering. Consider using loadable-components (as advised by the React docs). But if you use a slow server (or a low level of caching), this can increase the "Time to First Byte".
Use pre-rendering into static HTML files.
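A minimal sketch of wiring webpack-bundle-analyzer into a hand-rolled webpack config (CRA hides its config, so there you would follow its "Analyzing the Bundle Size" recipe instead):

// webpack.config.js — assumes you control the webpack config yourself.
const { BundleAnalyzerPlugin } = require('webpack-bundle-analyzer');

module.exports = {
  // ...your existing entry / output / loader configuration...
  plugins: [
    // Writes a static report.html showing what each chunk contains, so the
    // heaviest libraries (the candidates for replacement) are easy to spot.
    new BundleAnalyzerPlugin({ analyzerMode: 'static', openAnalyzer: false }),
  ],
};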
Also, a very useful tool for web-page speed analysis is webpagetest.org.
I found that the first web page load for the CN1 JavaScript build takes too long, about 2 minutes.
I attached a screenshot of Chrome's network panel and found that classes.js is the heaviest file. Is it possible to zip it?
Second, there are two theme files that are downloaded sequentially. Is it possible to load them at the same time?
Kindly advise.
Normally I would answer that you can look at the performance section of the developer guide, but the relevant sections there relate to reducing the theme.res size, which seems pretty small in your case.
The largest portion of your code is the class files, so I'm guessing the best way to reduce them is to further reduce dependencies so the obfuscator can remove more dead code. Keep in mind that the classes.js file is cached and can be deployed via CDNs such as Cloudflare to improve download speeds. It can be served in gzipped form as well, which is part of the CDN repertoire.
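If you want to verify from the browser that classes.js is actually coming down compressed, a rough console check using the Resource Timing API looks like this (the exact path of classes.js depends on your deployment):

// Find the classes.js entry among the resources the page has loaded.
const entry = performance.getEntriesByType('resource')
  .find(e => e.name.endsWith('/classes.js'));
if (entry) {
  // encodedBodySize much smaller than decodedBodySize => compression is active.
  console.log('encoded:', entry.encodedBodySize, 'decoded:', entry.decodedBodySize);
}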
I'm facing a performance issue.
The app itself reacts pretty well once it's loaded, but when I press F5 (and the app reloads), I see some strange idle time that kills my app's loading time.
When starting my app:
I make some queries to my web server: they take around 500 ms - 700 ms
I present the requested data
I translate every piece of text (which generates a lot of watchers); I use angular-translate for that. (If you have a better internationalisation solution, I'm interested.)
When the app has started, I have 1200 watchers. I tried removing ALL translations, which left around 700 watchers, but I do not see any difference in behaviour.
I have around 25 modules loaded; 18 are mandatory for the app to run.
Well, let's dig into the issue. Here is a Chrome Timeline representing my problem.
I presented 2 different ones.
Any idea why I have those idle times?
Any idea how I can investigate the root cause of that issue further?
Edit: information 09/06/2015
All my JS code is in the appcache, so all my code is loaded at the app's first init and then served from cache. I have added all vendors / libs / JS / fonts / images etc. to the appcache.
During that time there are 5 or 6 requests, mostly in the first block. This is a private corporate AngularJS app.
From those requests we grant the user's rights and initialise the app.
We have tried removing all HTML generation from the app, keeping only the JS, to see if the blocking point was the rendering: no effect.
We have tried removing angular-translate, which generates a lot of watchers for on-the-fly translation: no effect.
Thanks for the support,
Have you tried using one-time bindings for parts you only want to show for reading, not editing, e.g. dynamically generated tables / lists (see the sketch below)?
Hello {{::name}}!
more here
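A minimal sketch of one-time bindings in a generated list (module, controller and property names are made up for illustration):

<!-- Each row is rendered once; its watchers are dropped after the first stable digest. -->
<ul ng-controller="ItemsCtrl">
  <li ng-repeat="item in ::items">{{::item.label}}</li>
</ul>

angular.module('app').controller('ItemsCtrl', function ($scope) {
  // Data that never changes after load: ideal for one-time bindings.
  $scope.items = [{ label: 'First' }, { label: 'Second' }];
});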
I would like to implement my own slideshow and image gallery (for a photo reporting website).
Is there a best approach or technique (using GAE and GWT) related to:
reducing the slideshow's loading time (for a slideshow containing 30 images (960px * 780px), for example, should I load them all first and then let the user navigate?)
should I do the scaling (the images' resolution is greater than the browser's) on the server side or on the client side?
are there any well-known problems concerning storage (if I have a lot of images)?
If you have some advice or links about these topics, could you please post them? Thank you.
Question 1: preload vs lazy
Answer: The more you load up front, the longer your instance takes to spin up and the more bandwidth you use. So in general you should probably use a lazy loader but prefetch the thumbnails and the next image (see the sketch after these answers).
Question 2: image scaling
Answer: I suggest creating a scaled version on upload that you serve, and letting the user download the full-size image on demand. Don't do the scaling on the client; again, the bandwidth would eat you alive.
Question 3: storage
Answer: Use the Blobstore (Python or Java) instead of db.BlobProperty, because it saves money on storage and allows files over 1 MB.
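A minimal client-side sketch of the first answer's idea, i.e. show the current image and prefetch only the next one (plain JavaScript with placeholder URLs and element id; in GWT you would express the same thing in Java):

// Show the current slide and warm the browser cache for the next one only.
const urls = ['/img/photo-1.jpg', '/img/photo-2.jpg', '/img/photo-3.jpg'];

function show(index) {
  document.getElementById('slide').src = urls[index];
  // Prefetch the next image so navigation feels instant,
  // without downloading all 30 images up front.
  new Image().src = urls[(index + 1) % urls.length];
}

show(0);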
Is there an elegant way to determine the size of data downloaded from a website, bearing in mind that not all requests will go to the domain you originally visited and that other browsers may be polling in the background at the same time? Ideally I'd like to look at the size of each individual page, or, for a Flash site, the total downloaded over time.
I'm looking for some kind of browser plug-in or Fiddler script. I'm not sure Fiddler would work due to the issues pointed out above.
I want to compare sites similar to mine for total file size, and keep track of my own site too.
Firebug and HttpFox are two Firefox plugins that can be used to determine the size of data downloaded from a website for a single page. While Firebug is a great tool for any web developer, HttpFox is a more specialized plugin for analyzing HTTP requests / responses (with their sizes).
You can install both and try them out; just be sure to disable one while the other is enabled.
If you need a site-wide measurement:
If the website is made of plain HTML and assets (CSS, images, Flash, ...), you can check how big the folder containing the website is on the server (this assumes you can log in to the server).
You can mirror the website locally using wget, curl or a GUI application like Site Sucker, and check how big the folder containing the mirror is.
If you know the website is huge but you don't know how much, you can estimate its size, e.g. www.mygallery.com has 1000 galleries; each gallery loads an average of 20 images; every image is stored in 2 different sizes (thumbnail and full size) at an average of n KB per image; ...
Keep in mind that if you download or estimate a dynamic website, you are dealing with what the website produces, not with the real size of the website on the server. A small PHP script can produce tons of HTML.
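If a script is preferable to a plugin, a rough per-page number can also be pulled from the browser console with the Resource Timing API (a sketch; cross-origin resources report a size of 0 unless they send Timing-Allow-Origin):

// Sum the transfer sizes of everything the current page has requested so far.
const entries = performance.getEntriesByType('resource');
const bytes = entries.reduce((total, e) => total + (e.transferSize || 0), 0);
console.log(entries.length + ' requests, ' + (bytes / 1024).toFixed(1) + ' KB transferred');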
Have you tried Firebug for Firefox?
The "Net" panel in Firebug will tell you the size and fetch time of each fetched file, along with the totals.
You can download the entire site and then you will know for sure!
https://www.httrack.com/