Improve TinyMCE performance in Angular app

I've set up TinyMCE in my Angular application using the latest (4.x) version of TinyMCE and the latest version of angular-ui/ui-tinymce (https://github.com/angular-ui/ui-tinymce).
All of the code is minified.
In my application I have multiple instances of TinyMCE on a page (up to three), and the application uses the Angular routing mechanism.
Everything is set up correctly: the editors work, and each of them has its own configuration.
The problem I'm facing now is performance. Whenever I load a new page, the TinyMCE instances recreate themselves even if they are already there (i.e. in the DOM)! Creating a TinyMCE editor takes some time (up to 3 seconds); the amount of text in it doesn't seem to matter much.
I've tried using tinyMCE's gzip compressor but I couldn't get it to work.
What actions can I take to improve the performance in my application?
If at all relevant: I'm using a Java backend and AngularJS version 1.2.16.

How to optimize the initialization speed of TinyMCE
Here are some actions you can take to boost the initialization/loading time of TinyMCE.
Install and use the TinyMCE Compressor.
This bundles all the JavaScript HTTP requests into one big request and gzip-compresses the result, reducing its size by roughly 75%.
Enable the button_tile_map option (should be enabled by default).
This makes the icons load faster since multiple image requests are replaced with a few tilemap requests.
Compress other scripts using the custom scripts option inside the compressor.
There might be other third party scripts on the same page. These can be added to the compressor as well.
Disable plugins that you don't need.
Remember to remove them from both the tinyMCE.init and the tinyMCE_GZ.init calls (see the sketch after this list).
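For reference, this is roughly how the compressor is wired up in the TinyMCE 3.x world, where it is available: tinyMCE_GZ.init has to run before tinyMCE.init, and the plugin/theme lists should match in both calls (the plugin names below are purely illustrative):

tinyMCE_GZ.init({
    plugins: 'table,paste,advlink',   // same list you pass to tinyMCE.init
    themes: 'advanced',
    languages: 'en',
    disk_cache: true,
    debug: false
});

tinyMCE.init({
    mode: 'textareas',
    theme: 'advanced',
    plugins: 'table,paste,advlink'    // keep in sync with tinyMCE_GZ.init above
});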
There is currently no compressor for TinyMCE 4 with a Java backend, unfortunately.
And as you've already said, all of the code is minified.
So the only thing I can advise is to remove unused plugins and reduce the number of requests by concatenating multiple JS files into as few files as possible.
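If you stay on TinyMCE 4 with ui-tinymce, the plugin trimming can at least be centralized; a minimal sketch, assuming the uiTinymceConfig value that ui-tinymce merges into every editor's options (the plugin selection is only an example -- keep what you actually use):

angular.module('myApp', ['ui.tinymce'])
    .value('uiTinymceConfig', {
        plugins: 'link lists paste',   // only the plugins you really need
        menubar: false,                // dropping the menubar/statusbar also
        statusbar: false               // shaves a little off the init time
    });

Per-instance options passed to the ui-tinymce directive are still applied on top of this shared config.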

Related

Is it possible to build an AngularJS App without using multiple files

I was wondering if it is possible to build an AngularJS app in a single file versus having multiple .js files.
During development, having one file makes no sense, because such code is very hard to maintain. For production you can use special tools so the user will see only one JS file. This is done for Angular itself -- the Angular source consists of many files, but when you include Angular in a project you include just one angular-XXX.js file.
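As a sketch of such a build step, assuming grunt-contrib-concat and grunt-contrib-uglify (paths and task names are illustrative):

// Gruntfile.js
module.exports = function (grunt) {
    grunt.initConfig({
        concat: {
            dist: {
                // list the module-defining file first so angular.module('app', [...])
                // runs before any angular.module('app').controller(...) calls
                src: ['src/app.js', 'src/**/*.js'],
                dest: 'dist/app.js'
            }
        },
        uglify: {
            dist: {
                files: { 'dist/app.min.js': ['dist/app.js'] }
            }
        }
    });
    grunt.loadNpmTasks('grunt-contrib-concat');
    grunt.loadNpmTasks('grunt-contrib-uglify');
    grunt.registerTask('build', ['concat', 'uglify']);
};

Keep in mind that minifying AngularJS code is only safe if you use array-style dependency injection annotations (or run a tool like ng-annotate first); otherwise the mangled argument names break the injector.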

Why does Sencha Touch take so long to load?

I used Sencha 1 in the past and I am going now for Sencha Touch 2... And I am interested in using Sencha Cmd...
So... I did the basic sencha command:
sencha generate app Nameofmyapp path/to/myapp
And when I try to load the website... it takes over 10 seconds to load (and it has no functionality yet). I have seen other people having the same issues as me, and I found the most useful answer here (and it solved the problem):
Sencha Touch 2.2.0 loads very slowly. Is it normal?
But it's nonsense... Is it not possible to make it run faster without needing to minify everything? Is there something Sencha Cmd does that I should remove because it's too heavy and useless?
Should I not use Sencha Cmd?
Well, I think the minification isn't the problem; rather, all components (whether you use them or not) are loaded from separate files. The hundreds of requests take time. When you build your app with sencha app build production, all (and only) the required components are concatenated into one big file that loads quite rapidly.
Have a look at the docs at Using Sencha Cmd with Sencha Touch, which state:
Sencha Cmd automates all optimizations for your application, including the following:
Resolving dependencies required by the application and only including exactly what is used for optimal file size/performance.
Enabling HTML5 application cache via automatic generation of "cache.manifest" and resources checksum.
Minifying all JavaScript and CSS assets.
Storing all JavaScript and CSS assets inside local storage on first load and patching them via delta updates between releases.
As a result, your production build can load instantly on subsequent access and updates on the fly with minimal network transfer.
Edit: There is a way to speed up development by changing sencha-touch.js to sencha-touch-all.js in your app.json file under "js". Next, use the Sencha Cmd command sencha app refresh, which updates your bootstrap.js file. Now all components are loaded from one single file. You need to change this back before building your app, though, otherwise all components will end up in your build, even the ones you are not using.
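The relevant app.json fragment might look roughly like this during development (the exact paths depend on what sencha generate app produced for your project; any other flags your generated file has on these entries stay as they are):

"js": [
    { "path": "touch/sencha-touch-all.js" },
    { "path": "app.js" }
]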

Is YUI Compressor useless for ExtJS app minification

My manager wants me to look into initially using YUI Compressor to minify our ExtJS 4.2 app.
So I wrote a Python script to concatenate all my ExtJS app files into a single file, and then minified that one file with YUI Compressor.
But I get errors related to objects not found, because order matters with JavaScript.
So for an app with many files, with multiple developers adding new files, it seems questionable whether YUI Compressor can be effectively used to minify ExtJS apps.
Is this true, or am I missing something?
YUI Compressor is a must-have tool in the deployment process because it reduces JS and CSS files by about 40%.
In your case you should use YUI Compressor AFTER you compile your ExtJS app into one JS file. It is not effective to compress dozens of tiny files before concatenating them.
Of course the order of files for concatenation matters. It is based on the dependency requirements declared in each file (each of which is actually an ExtJS class).
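In other words, the flow is roughly: produce one concatenated file first (with Sencha Cmd or a build script that follows the requires declarations), then compress that single file, for example (the version number is illustrative):

java -jar yuicompressor-2.4.8.jar app-all.js -o app-all.min.js --charset utf-8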
I got tired of using Sencha Cmd because it's huge, inconvenient to install on servers, and hard to automate and integrate into deployment processes.
I developed my own tool, Extapp, which builds ExtJS applications. It requires a Java JRE to run the jar file.
Extapp : https://github.com/liberborn/extapp
Using YUI Compressor is not useless; the problem lies in concatenating files without analyzing the dependencies between them.

What (AMD) script loader to use for a mobile site

I'm starting work on a new version of a mobile site. I am looking into using an AMD script loader and have pretty much narrowed it down to RequireJS and lsjs. I know there are many pros and cons to both, but I am trying to figure all of those out for the mobile version of my site. Does anyone have experience with these libs at the mobile level? Just trying to get a discussion going here about what people think is the best way to go. (Anyone with 1500 rep want to create an lsjs tag? :) ) Maybe either of the creators of these libraries (Todd Burke or Richard Backhouse) has an opinion on this.
thanks
EDIT:
Thanks to Simon Smith for the great info down below. Has anyone used lsjs? It looks very promising in terms of speed, but it does not have the user base, documentation, or (I think) features of RequireJS/curl.
I would say use RequireJS until you're ready to go to production. Then compile your scripts and replace RequireJS with Almond. It's a bare-bones library made by James Burke (author of RequireJS) so you can rely on it to work seamlessly:
Some developers like to use the AMD API to code modular JavaScript, but after doing an optimized build, they do not want to include a full AMD loader like RequireJS, since they do not need all that functionality. Some use cases, like mobile, are very sensitive to file sizes.
By including almond in the built file, there is no need for RequireJS. almond is around 1 kilobyte when minified with Closure Compiler and gzipped.
https://github.com/jrburke/almond
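A minimal r.js build config for that swap might look like this (paths and the module name 'main' are illustrative):

// build.js -- run with: node r.js -o build.js
({
    baseUrl: 'js',
    name: 'lib/almond',        // ship almond instead of the full RequireJS loader
    include: ['main'],         // your application entry module
    insertRequire: ['main'],   // kick off the app once the bundle executes
    out: 'dist/main-built.js',
    wrap: true                 // wrap everything in an IIFE so no globals leak
})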
EDIT:
Curl.js is also an option. I haven't used it, but I know that it is a lot smaller than RequireJS. I did a bit of research as to why:
RequireJS does the following over Curl (via James Burke):
Supports multiversion/contexts, useful for mock testing, but you can get by without it
Supports loading plain JS files via require, does not have to be an AMD module
Supports special detection and work with older versions of jQuery (should not be an issue if you use jQuery 1.7.1 or later)
(At the moment) better support for simplified wrapped commonjs style: define(function(require) {});
In short, if you are only going to deal with AMD modules in your app,
do not need the multiversion/context support, and are not using the
simplified commonjs wrapping style, or using an older jQuery, then
curl can be a good choice.
https://groups.google.com/forum/?fromgroups=#!topic/requirejs/niUyLZrivgs
And the author of Curl:
RequireJS runs in more places than curl.js, including WebWorkers and node.js. It's also got more "battle testing" than curl.js, which may mean it has less bugs around edge cases. curl.js is also missing a few important features, such as preloading of implicit dependencies and support for AMD-wrapped commonjs modules. These are both coming in version 0.6 (late next week).
On the plus side, curl.js...
is as small as 1/4 the size of RequireJS -- even when bundled with the js! and domReady! plugins it is still less than half the size.
is faster at loading modules than RequireJS, but only meaningfully so in IE6-8 or in development (non-build) environments.
supports pluggable module loaders for formats other than AMD (we're working on unwrapped CJSM/1.1 and CJSM/2.0, for instance).
supports configuration-based dependency management via IOC containers like wire.js (via cram.js).
supports inlining of css (via cram.js) and concatenation of css (via cram.js 0.3 by end of year).
https://github.com/cujojs/curl/issues/35#issuecomment-2954344
Back in 2014 I faced the same problem. I had some extra requirements though in order to make the site fast on mobile:
Small enough to be inlined (to avoid paying an extra request-tax to get the loader onboard).
Inlined config file (to get rid of a request).
Config file in pure JavaScript (no parsing overhead).
Let the browser do the actual loading of files (browsers are smart these days).
Connect all asynchronously loaded modules together.
Support for single-page apps that include legacy code with sprinkled $(function(){...}) constructs, while I still insist on loading jQuery late and asynchronously to speed things up.
After evaluating RequireJS, curl, lsjs and a bunch of others, I concluded that none of them came close enough to what I needed for my projects. Eventually I decided to create my own lockandload AMD-loader. I didn't open-source it at the time, because that meant writing documentation. But I recently open-sourced it with fresh docs in case it benefits others.
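Just to illustrate the "let the browser do the actual loading" idea from the list above in a generic way (this is not lockandload's actual API, only a hand-rolled sketch):

(function (files, done) {
    var remaining = files.length;
    files.forEach(function (src) {
        var s = document.createElement('script');
        s.src = src;
        s.async = true;                         // the browser schedules the downloads
        s.onload = function () {
            if (--remaining === 0) { done(); }  // every script has executed
        };
        document.head.appendChild(s);
    });
}(['/js/jquery.js', '/js/app.js'], function () {
    // everything is in; legacy $(function () { ... }) handlers can safely run now
}));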

Sencha too slow

I introduced a Sencha grid in one of my JSPs. Locally Sencha is quite fast, but on an external server it is too slow.
I followed the deployment instructions here:
http://docs.sencha.com/ext-js/4-0/#!/guide/getting_started
using ext-debug.js and my app.js.
Then, in my JSP, I imported app-all.js (670KB) and ext.js.
Where am I going wrong?
Thanks
app-all.js is 670KB, which is a very big file. You should refactor, optimize and minify the code so it will be smaller. You could even separate it into multiple files per class or implementation and do a dynamic JS load (but that would take more time). A good target would be something as small as ext.js.
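If you go the dynamic-loading route, ExtJS 4 can fetch class files on demand through Ext.Loader; a minimal sketch (the MyApp names and paths are placeholders):

Ext.Loader.setConfig({
    enabled: true,
    paths: { 'MyApp': 'app' }          // namespace mapped to a folder with one file per class
});

Ext.require('MyApp.view.UserGrid', function () {
    // this class file is only downloaded when it is actually needed
});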
Also, if you have access to your webserver (e.g. Apache/Tomcat), you could turn on gzip compression to compress files before sending them to browsers. Also look out for other webserver optimizations.
(By the way, your question sounds more like a webserver issue than a Sencha-related issue.)
Another way to improve the load time of your application is making sure ext.js and app-all.js are cached by the browser. This way the first time your application loads it will be slow, but the following loads will be faster.
Look into Cache-Control, Expires and other HTTP cache-controlling headers (this appears to be a nice explanation). Your server should generate these headers when sending the files you want to be cached.
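For example, the responses for ext.js and app-all.js could carry headers along these lines (the values are illustrative; use a long max-age only if you version the file names, otherwise clients may miss updates):

Cache-Control: public, max-age=31536000
Expires: Thu, 31 Dec 2015 23:59:59 GMT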
The real problem, as it appears from the timeline, is the slow connection to the server (10 seconds to load 206/665 KB is slow for most connections), so you should check whether other server problems are causing the slowness.
