qooxdoo is slow to run when developing with source-all

When running:
python generate.py source-all
I get all the libraries in my application. This is all good.
When running the application, qooxdoo loads all classes separately.
I want to use qooxdoo as an online development tool where only the build should be run in the end. However, when running both the server and the client in dev mode, loading each qooxdoo class separately makes everything slow.
Can I instead include http://cdnjs.cloudflare.com/ajax/libs/qooxdoo/4.1/q.min.js or a local copy from the server software library folder, and for development only run:
python generate.py source

First I'd like to clarify the terms, because I felt some ambiguity in the question's wording.
Terms
The qooxdoo generator's execution unit is a job, e.g. the info job is invoked as ./generate.py info. Jobs that are involved in dependency management (finding how an application class depends on other classes from the application, the framework and 3rd-party libraries) produce a target. A target may include original classes as-is (referenced by their full paths), built parts (sets of concatenated classes, possibly optimised, plus optional metadata), or a mix of the two. The target is loaded by a web browser via a loader script.
Source target
Source target jobs are the means of dependency management for development (writing code). There are three of them.
source
With the source job all the classes of the application stay in their original source form, and their files are loaded directly from their original paths on the file system.
The target includes only the actual dependencies from the application, the framework and the libraries. All classes are loaded as-is (hundreds of requests). You may hit loading issues even when loading from file:// in some browsers (e.g. the test runner may not wait long enough for the AUT to load).
source-all
source-all will include all known classes, be they part of your application, the qooxdoo framework, or any other qooxdoo library or contribution you might be using.
The target includes all existing classes from the application, the framework and the declared libraries. All classes are loaded as-is (hundreds of requests, more than for the source job), so there are even more loading issues than with the source job.
source-hybrid (the default job)
The source-hybrid job concatenates the contents of the classes that make up the application into a few parts (files), leaving only your own application classes separate. With the other class files (framework, libraries, contribs) chunked together you get nearly the loading speed of the build version, while at the same time retaining the accessibility of your own application files for debugging.
The target includes only the actual dependencies from the application, the framework and the libraries. All non-application classes are concatenated into parts (a dozen or two). Application classes are loaded as-is; the rest is loaded from parts. This gives the best loading performance.
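Because source-hybrid is the default job, a bare ./generate.py run already uses it. For reference, the default is declared in the application's config.json; a minimal sketch based on the standard skeleton (the application name is a placeholder, and the include path is as I recall it from the skeleton):
{
  "name" : "myapp",    // placeholder application name
  "include" :
  [
    { "path" : "${QOOXDOO_PATH}/tool/data/config/application.json" }
  ],
  "default-job" : "source-hybrid"
}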
Build target
There's only one build target job, build. However, depending on configuration it can produce a single-file build or a partial (multi-file, usually several files) build. Unlike the source targets, the build target's files are optimised (minified) and intended for production deployment.
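Whether the build comes out as a single file or as several is controlled by the parts configuration of the build job. A rough sketch with placeholder part names (not taken from any particular application):
"build" :
{
  "packages" :
  {
    "parts" :
    {
      // "boot" must contain the classes needed to start the application
      "boot"     : { "include" : [ "${QXTHEME}", "${APPLICATION}.Application" ] },
      // additional parts are loaded on demand via qx.io.PartLoader
      "settings" : { "include" : [ "${APPLICATION}.view.settings.*" ] }
    }
  }
}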
Online development
There's an example application for online development, the playground (application/playground in the SDK). Let's look at the relevant jobs from its config.json.
"playground-compile" :
{
"extend" : [ "libraries" ],
//...
"include" :
[
"${APPLICATION}.*",
"qx.*"
],
"exclude" :
[
"qx.test.*",
"qx.dev.unit.*",
//..
],
//...
},
"build-script" :
{
"extend" : [ "playground-compile" ],
//...
},
"source" :
{
"extend" : [ "playground-compile" ],
//...
}
As you can see above, for both targets all qooxdoo classes (except tests and development classes) are included so you can use them in playground snippets. This is essentially your case as well. The rest depends on the sort of online development you are doing and your requirements. You may base it on source-hybrid (easier to debug) or on the build target (faster to load), or mix in some custom configuration, basing it on the playground's config.
One important thing to note: if you plan to have significantly more complex code than the playground snippets, implemented in a number of classes that depend on each other, you will need to handle dependency management yourself (i.e. load classes in the proper order). If your "online" code deals with a subset of the framework, working with tabular data for instance, it makes sense to include only that subset (e.g. qx.ui.table.* instead of qx.*). For your information, the playground's single-file build target is 2.1 MB (~550 kB gzipped).
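For example, to include only the table-related framework classes instead of everything, the compile job shown above could be narrowed down roughly like this (the job name and the class patterns are illustrative, not taken from the actual playground config):
"myapp-compile" :
{
  "include" :
  [
    "${APPLICATION}.*",
    "qx.ui.table.*"
  ],
  "exclude" :
  [
    "qx.test.*",
    "qx.dev.unit.*"
  ]
}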
Misc
q.min.js is qxWeb. It is a jQuery/jQuery UI-like library. Basically, from an application development point of view, qxWeb has nothing to do with the normal qooxdoo workflow.

A build created with source-all loads each class, framework and application classes alike, individually, i.e. it loads the hundreds of framework classes one by one (last time I checked, it also loaded framework classes which are not used by the application code).
As a workaround, the framework devs have created the source-hybrid job. It serves the same purpose, but creates and loads a concatenation of the framework classes instead of loading each class individually (and I think it minifies the framework classes too). Using source-hybrid instead of source-all should improve loading speed significantly.
(What would be great would be for the framework devs to add an option to the build* jobs to also generate source maps. But that's not in the generator so far.)

Related

Optimizing build time for multiple extjs applications

We have a modular monolith application, each module being an extjs app. The modules share a lot of features and functionality, so most of the code sits in a common extjs package that gets imported into each module; the modules themselves are relatively thin. We also provide an accessibility build, i.e. everything is built at least twice (once with the normal theme, once with high contrast), and for some apps more often (some logic is managed through extjs macros to exclude/include different regions at build time).
The end result is an agonizing build time: ~10 apps, each built at least twice, each build lasting just under 2 minutes. It's all because each app is built from scratch. Is there a straightforward way to build them together, so that instead of rebuilding the extjs code / common package code / themes 10 times, they would be built once and reused in the build process of all apps?
"Saving and restoring sets" looks very relevant, but it seems to be a lower-level feature which, as far as I understand it, would be useful if we were reimplementing the build process from scratch and tossing out app.json. Is there a clear way to incorporate it into existing higher-level features like sencha app build?
You could go ahead and build the packages (if any) separately from the application, then drop the packages into the appropriate directories of the build. However, Sencha Cmd and the way the class system and dependency resolution work make it really hard to untangle the build process, so it's hard to give general advice here.
You might want to look into the package loader of Ext JS and the "uses" configuration option for the app.json.
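As a rough sketch (the property placement is as I remember it from Sencha Cmd, so treat it as an assumption), a dynamically loaded package would be listed under "uses" rather than "requires" in app.json:
{
  "name" : "MyModule",                 // placeholder app name
  "requires" : [ "font-awesome" ],     // packages built into the app
  "uses" : [ "common-package" ]        // placeholder for a package loaded on demand by the package loader
}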

How to allow additional 3rd party React modules to be installed after compilation of a static web server

I'm struggling with a collision of technical terms (most especially the term "plugin", which has about seven different meanings within the React development stack).
Short question:
Is there a way to pre-compile static webpack modules that can be installed separately from a main static React web application, while still sharing modules contained in the main web application? (That's the question as best I can formulate it with my relatively naïve React developer skills.) I'd like the ability to plug in web user interface components supplied by 3rd-party developers after the fact, i.e. installable runtime React UI components that do not require React compilation at install time.
Details:
I have a static React web app that allows remote control of audio plugins (specifically LV2 audio plugins). It's a single-page static React app (which communicates via web sockets with the running application), hosted by a static C++ web server. Realtime and IoT agility requirements make a Python-hosted dynamic web server and runtime compilation an unattractive prospect (https://github.com/rerdavies/pipedal).
What I want to do is allow extension of the web UI using separate bundles provided by 3rd-party LV2 plugins. The ideal solution would be to allow static webpack bundles pre-compiled by the LV2 plugins and placed in /usr/lib/lv2/<pluginname.lvw>/resource directories to be consumed by the web app at runtime. I'm using a custom C++ web server, so redirecting URLs into the /usr/lib/lv2/xx/resource directories is straightforward.
The main app would be distributed as one apt package. LV2 plugins would be compiled (potentially by 3rd-party developers) against an "sdk package" provided by the main app build, after the main app was built, and then distributed in separate packages. Ideally, I'd like to pre-compile the UI code for the plugins to static webpack modules before their installers are built.
I more-or-less understand how I would do this if I were using raw CLI tools and configuration files (tsc, webpack, babel). But I can't help thinking I would be reinventing a wheel. (And I do have concerns that I'm going to incur serious version-dependency problems).
I would like to code-share the base modules (React, the @mui controls, and a limited set of app-supplied components and interfaces).
I see a path through the various tools to make this happen using my own custom build script, I think. I can get the TypeScript compiler to do code-splitting; I can probably figure out how to get the Babel transpiler to do the right thing. I think I understand how to write webpack configuration files that will share modules from the main app, and I can see a likely path to building and distributing an npm package that does the setup and build of LV2 plugin projects, plus the supporting CMake build rules for building and installing such packages, &c. But I'm concerned that I'm going to go down a large rabbit hole trying to reinvent something that surely must exist already. And I can imagine seven thousand ways for this to go horribly wrong. :-P
So far, I have implemented the TypeScript compiler portion of the build procedure, and writing the various bits to dynamically intercept and service resource requests in the web server is trivial. But it has become painfully obvious that I also need Babel and webpack build steps as well.
I haven't yet looked at the react-scripts package contents to see if I can steal code to build what I want there. Perhaps that's a viable path.
Is there a way to do this with off-the-shelf npm packages and off-the-shelf npm build procedures? I can find all kinds of bits to get me part way; but the integration of all the bits is rather daunting. Should I just do the deed, and start writing my own custom build scripts to make this happen?

qooxdoo: How to handle non build time plugins?

Given the case that you have a basic GUI that must be extensible by plugins that are not known when the generate run of the main GUI is done. Contributed plugins may consist of a manifest, resources, localization, and some code that is executable in the GUI environment and can provide custom widgets.
From what I can see at the moment, it could be done by either:
Letting a plugin developer build against the ordinary source, generating a part for the plugin, and then manually registering a qx.io.part.Part with the generated parts in the GUI running on the non-developer side (see the sketch below).
Just loading a combined source JS for that plugin, containing the resources, and loading them manually via eval.
I'd personally prefer the first one, as it already includes everything that might be used by a plugin. But it uses a method that is marked as internal.
Are there any experiences with that? Are there other, more elegant ways to achieve that?
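For illustration, the part declaration for the first option might look roughly like this in the plugin developer's config.json ("myplugin" is a placeholder namespace; whether the resulting part can legitimately be registered via qx.io.part.Part on the running GUI is exactly the open question):
"source" :
{
  "packages" :
  {
    "parts" :
    {
      "boot"     : { "include" : [ "${QXTHEME}", "${APPLICATION}.Application" ] },
      // placeholder part holding only the plugin's classes
      "myplugin" : { "include" : [ "myplugin.*" ] }
    }
  }
}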

Qooxdoo build-all deployment

I'm new to qooxdoo. I'd like to use it for an embedded web interface for an application I'm developing right now.
To keep building my application as easy as possible, I'd like to stay away from running the python build scripts after every change if possible. Because the website will only be used once in a while by a single user, load times etc. are not a big concern for me.
I've read about the "build-all" target but could not find a detailed description on how to activate it with the current release. Can someone explain how I can get a complete desktop build of qooxdoo?
You don't have to run generate.py every time you change the code, only every time you reference a new class. During development it's usually relatively infrequent that you have to re-run the generator, compared to how often you will do the edit/save/alt-tab/refresh/test cycle.
But you can do what you're asking during development by using the "source-all" target, eg:
./generate.py source-all
When loading an app from a file:// URL this is fine because file:// URLs are very fast, but you can optimise this manually by modifying your config.json to incorporate specific sets of classes.
To do this, in your application's config.json, add (or edit) a job called "source" and add:
"jobs": {
"source": {
"include": [ "qx.ui.*" ]
}
This will cause all of the qx.ui.* classes to be included in the ./generate.py source build of your application; obviously you can fine-tune this further.
When it comes to deploying your application, use ./generate.py build because this will produce a minimised, optimised version (with debug code removed etc.) which uses only those classes that are required.
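If you want to see or tweak what the build optimises, the generator exposes per-job compile options. A minimal sketch, with the option names recalled from the generator documentation (treat them as assumptions and check against your SDK version):
"build" :
{
  //...
  "compile-options" :
  {
    "code" :
    {
      // optimisations applied to the generated build (names assumed, verify per SDK version)
      "optimize" : [ "basecalls", "comments", "privates", "strings", "variables", "variants", "whitespace" ]
    }
  }
}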
If you are still looking for a build version of Qooxdoo, here is my qxSimple project. It includes some examples.
http://adeliz.github.io/qxsimple/
You can also generate your own build version by following these steps:
Download the latest qooxdoo release
Go into the framework folder
Edit the config.json file
Uncomment the //build-all line
Run generate.py build-all
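For orientation, the line to uncomment sits in the framework config.json's list of exported jobs; a rough sketch (both the placement and the surrounding entries are assumptions and vary between releases):
"export" :
[
  "api",
  "build",
  //"build-all",     // remove the leading slashes to expose the job
  "clean",
  "distclean",
  "source"
]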

Where do you keep common reference files for multiple Silverlight projects?

I would like to know what the industry standards are, or what you suggest, for the following situation.
I am creating multiple Silverlight projects which get published at different dates. All these projects use various shared code (common DLLs). This shared code is used on the client side or the server side. My question is: if the shared code changes, would you recompile all the affected projects and release them, or recompile only when you are making a change to the actual code which uses the shared component?
For now, on the client side, we create an assembly reference folder in each Silverlight project and put the latest required DLLs in it. This way the XAP itself contains all required files, it does not conflict with other projects, and it works fine. With this approach I do not rebuild any other client-side code just because a common DLL changed. If the common DLL change is required for multiple projects, we drop the latest copy into all affected projects, build them, and distribute them.
On the other hand, on the server side (Domain Services using EF), all the service code sits under the bin folder of the web site. So if I make a change to a common DLL, I not only need to publish the latest common DLL for the current project to work, but also recompile all the other services to use the new DLL.
Would like to know your opinions and suggestions.
Thanks
There are two approaches possible:
Add Common Code to the solution and have a project reference
Get the build process to build to a folder and reference from there
I prefer the first option. I always build and debug using the latest code and do not have to worry about stale references. I have used the second approach in the past; it is messy and can waste your team's time chasing bugs that do not exist (because an old version is referenced). In fact, I remember Visual Studio sometimes would not pick up a later version even when it was available.
Another alternative for your Silverlight projects would be to use MEF to dynamically download a XAP file containing the common libraries. Then if the common libraries change, you could publish an updated "CommonLibraries.xap", and your Silverlight clients can pick up the refresh independently of the rest of the Silverlight application.
You could follow the same approach with other projects that use these common libraries. The applications could dynamically load the common libraries so that the common libraries can be refreshed independently.
If possible, consider consuming the "common library" code via WCF services.
