Angular: one module for every component: antipattern?

So I have encountered a practice where people create a module for each component that has service dependencies. That way, when someone wants to use a given component, they don't have to read through the code to see which providers to add to their own module. Is this an antipattern? Will it cause performance issues or something?
Are there any recommended guidelines on lower/upper limits for how many components/directives/providers/etc. should be in a given module? Has the Angular/AngularJS ecosystem been tested with hundreds of modules on a view, compared with bundling the same components into, say, 20 or so modules instead?

Generally, there are different module types in Angular and guidance as to what they should contain and what modules should import them:
Widget modules - Contains mostly UI components, but no services. Imported by Feature modules.
Feature modules - Contains domain-specific private components. Imported by AppModule.
Service modules - Contains services exclusively. Imported by AppModule.
Routed modules - A specialized Feature module that is the target of routing.
Routing modules - Contains navigation routes and resolver/guard services.
Modules can have dependencies on other modules. For example, a Widgets module could be expected to be used with a Services module, where AppModule imports the ServicesModule, and FeatureModule imports the WidgetsModule. The BrowserModule/CommonModule is an example of this pattern; so is RouterModule.forRoot()/RouterModule.forChild().
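As a rough sketch of how those module types fit together (all class names below are made up for illustration, not from the question):
import { NgModule } from '@angular/core';
import { CommonModule } from '@angular/common';
// Hypothetical components, assumed to be defined elsewhere in the project
import { ButtonComponent, CardComponent } from './widgets/components';
import { UserListComponent, UserDetailComponent } from './users/components';

// Widget module: UI components only, no services; imported by feature modules
@NgModule({
  imports: [CommonModule],
  declarations: [ButtonComponent, CardComponent],
  exports: [ButtonComponent, CardComponent]
})
export class WidgetsModule {}

// Feature module: domain-specific, private components; imported by AppModule
@NgModule({
  imports: [CommonModule, WidgetsModule],
  declarations: [UserListComponent, UserDetailComponent]
})
export class UsersModule {}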
I would say it's overkill to have one module per component. It would be hard to organize and group common functionality together and leverage services in any meaningful way, and it could easily become unwieldy once the imports for a single module run into double digits.
[Edit]
After re-reading this question, I would like to add a clarification because I think the one-component approach with NgModule encapsulation deserves more attention. I don’t believe in 1:1 module-to-component - that would be overkill. However, I am in full support of 1:many module-to-component where the module exports only one component, and all the other components are private to the module. This latter approach is known as NgModule encapsulation, and it is an excellent way to build your application in a way that loosely couples your top-level components.
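A minimal sketch of that NgModule encapsulation pattern (component names are hypothetical): several components are declared, but only the top-level one is exported.
import { NgModule } from '@angular/core';
import { CommonModule } from '@angular/common';
// Hypothetical components; only SearchPageComponent is meant to be public
import { SearchPageComponent } from './search-page.component';
import { SearchBoxComponent } from './search-box.component';
import { SearchResultsComponent } from './search-results.component';

@NgModule({
  imports: [CommonModule],
  declarations: [SearchPageComponent, SearchBoxComponent, SearchResultsComponent],
  exports: [SearchPageComponent]  // the inner components stay private to this module
})
export class SearchModule {}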

If you have a relatively small app then bloating a single module is ok. Once your app starts to get relatively large then it makes sense to start lazy loading modules so your users don't have to download the entire build on app start. Grouping related functionality into modules makes things easier to maintain as well.
So for a smallish site, a single module will be fine but as your app grows it makes sense to start refactoring components and services into modules.
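For reference, lazy loading a feature module is a router-level concern; a hedged sketch (paths and class names are placeholders):
import { NgModule } from '@angular/core';
import { RouterModule, Routes } from '@angular/router';

const routes: Routes = [
  {
    path: 'orders',
    // the orders bundle is only downloaded when the user first navigates to /orders
    loadChildren: () => import('./orders/orders.module').then(m => m.OrdersModule)
  }
];

@NgModule({
  imports: [RouterModule.forRoot(routes)],
  exports: [RouterModule]
})
export class AppRoutingModule {}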

Practically, Angular prefers you to have a few basic modules -
Feature modules, a Shared module, a Routing module, a Core module.
But if the application is big, it is better to lazy load modules; in that case having a module for every component is the best option, because it increases app speed and the browser only loads the component that is actually required.

Related

Micro-Frontends, Web Components, and Sharing Libraries

so I'm working on migrating my company's app to a micro-frontend approach. We are following the standard described in https://micro-frontends.org/. While under the hood everything is currently using React, we are wrapping things with Web Components so that we will have the freedom and flexibility to be framework-agnostic in the future. We've got a working architecture up and running and so far it is working beautifully. We even created a fancy compatibility layer on top of the Web Component spec which allows us to pass React-like props to the Web Components, including Objects, Arrays, and even Functions. This allows for much better interaction between them.
The main concern we have right now is duplication of libraries. We're a React shop, so even though we have this framework agnostic approach, everything is using React. While this new approach gives us the ability to individually upgrade pieces of our app to a newer React version (finally), we still don't like the idea of so much duplication of the React library.
To put it in perspective, even Gzipped, React/ReactDOM are over 40kb. That's super tiny individually, but scaled up it starts to take up more and more bandwidth. RAM-wise it's less of an issue, about 130kb for those libraries, and given the RAM capacity of most devices now it's not a huge deal.
But, of course, we want things to be as optimized and streamlined as possible. So I'm hoping someone can suggest a way for the micro-frontend apps (the ones wrapped in a Web Component) to get React and other libraries from the parent app.
You should know that the parent app JavaScript is loaded prior to the micro-frontends. Each micro-frontend is loaded via a <script> tag. Lastly, we are NOT using the Shadow DOM at the moment, a tradeoff we made to benefit how we are migrating our existing code into the new micro-frontend architecture.
The core idea is to tell the module bundler how to package your micro-frontends.
Assuming you are using Webpack to bundle your applications, here are the two things that you need to do.
Step 1:
Declare React as an external dependency like this in your Webpack config:
externals: {
  'react': 'React',
  'react-dom': 'ReactDOM'
},
Step 2:
Before you load your parent application's JS, ensure that you are loading React and ReactDOM from a CDN or an equivalent place:
<script crossorigin src="https://unpkg.com/react@16/umd/react.production.min.js"></script>
<script crossorigin src="https://unpkg.com/react-dom@16/umd/react-dom.production.min.js"></script>
Put these scripts in your main index.html, which is responsible for bootstrapping your entire SPA.
Explanation
When you declare a certain package/library as external, Webpack does not include it as part of the bundle. It assumes that the outer environment will make that particular dependency available as a global variable. In the case of React, it uses React and ReactDOM as the global variables.
By doing this and including them via a CDN, you will be left with exactly one copy of React and ReactDOM. When a user visits the application for the first time it will be slower, but once cached it should not be a problem.
Further, you can extend this idea and also declare them as external for your parent app or parent shell container.
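Putting it together, a minimal webpack.config.js for one micro-frontend might look like this (entry and output names are illustrative, not taken from the question):
// webpack.config.js for a single micro-frontend bundle
module.exports = {
  entry: './src/index.js',
  output: {
    filename: 'orders-mfe.js',   // loaded by the shell via a <script> tag
    path: __dirname + '/dist'
  },
  // React and ReactDOM resolve to the globals provided by the shell's CDN scripts
  externals: {
    'react': 'React',
    'react-dom': 'ReactDOM'
  }
};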
A possible solution is to prepare the library using an import map, but since import maps are not supported by older browsers such as IE11, I recommend using SystemJS:
https://github.com/systemjs/systemjs
This example in particular seems close to your case:
https://github.com/systemjs/systemjs-examples/tree/master/loading-code/react-hello-world
In your HTML you do:
<script type="systemjs-importmap">
{
  "imports": {
    "react": "https://cdn.jsdelivr.net/npm/react/umd/react.production.min.js",
    "react-dom": "https://cdn.jsdelivr.net/npm/react-dom/umd/react-dom.production.min.js"
  }
}
</script>
<script src="https://cdn.jsdelivr.net/npm/systemjs/dist/system.min.js"></script>
Then you can import React, as the browser (via SystemJS) knows where to take it from.
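For example, application code loaded through SystemJS can then resolve the bare specifiers from the map above (a rough sketch):
// Anywhere in code loaded via SystemJS
System.import('react').then(function (ns) {
  // With the UMD build loaded as a global, the library typically sits on the default export
  var React = ns.default || ns;
  console.log('Shared React version:', React.version);
});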
Potentially, you could build some kind of library that looks into each micro-frontend's (MFE's) package.json to learn which libraries it needs, and dynamically creates an imports object similar to the example above.
Keep in mind that you also need to cover version checking. Such a library could store a map that connects each dependency and version with the place where it is accessible. Imagine that in the case above we had to deal with a different React version for each MFE :).
Then, while loading another MFE, the library could check whether the required dependencies have already been included; if some are missing it downloads them, otherwise it reuses what has already been fetched.
Another option is Webpack 5 with Module Federation, which I wouldn't recommend yet as it is not stable enough for production, but it is possible to start playing with it. The hard part will be covering the version check, so you will most probably need another abstraction layer to deal with it.
https://indepth.dev/webpack-5-module-federation-a-game-changer-in-javascript-architecture/
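If you do experiment with Module Federation, the relevant piece is the plugin's shared section; a hedged sketch (names and paths are placeholders):
// webpack.config.js (Webpack 5) for one micro-frontend
const { ModuleFederationPlugin } = require('webpack').container;

module.exports = {
  plugins: [
    new ModuleFederationPlugin({
      name: 'ordersMfe',
      filename: 'remoteEntry.js',
      exposes: { './OrdersApp': './src/OrdersApp' },
      // one React/ReactDOM instance shared between the host and all remotes
      shared: {
        react: { singleton: true },
        'react-dom': { singleton: true }
      }
    })
  ]
};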

Can I NOT use requirejs in marionette/backbone?

People mention requirejs together with marionette, backbonejs and the like.
requirejs seems to be an asset loader -- executing your rules about when to load what.
I know the first 'page' of my single-page-app already needs most of the files. If I don't mind loading all files in one go, can I simply ignore requirejs?
Technically, yes. The only dependencies for Marionette/Backbone are:
jQuery v1.8+
Underscore v1.4.4 - 1.6.0
Backbone v1.0.0 - 1.1.2 (preferred)
Backbone.Wreqr (comes automatically with the bundled build)
Backbone.BabySitter (comes automatically with the bundled build)
Further, require.js can manage your code structure in a way that makes your code much more resource efficient in the end. From my point of view, for a simple application where you only need a simple set of views, models and collections with a manageable amount of code, it's OK to proceed without require.js.
But if your application has complex logic and a higher number of resources, it's good to go with require.js, because it's not good to send 15+ individual resource requests to the server at the very beginning of your application load. Require can combine any number of your resources into one server resource (see the r.js sketch below). That's the advantage.
What I prefer is one request for all CSS, one for all JS, and one sprite image for graphics if things get big; that allows you to create a fast-performing application.
Take your decision by looking at the amount of resources in the project. It's not essential to have require.js from the beginning of your application's development.
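For completeness, that "combine everything into one request" step is what the RequireJS optimizer (r.js) does; a minimal build profile might look like this (paths are placeholders):
// build.js - run with: node r.js -o build.js
({
  baseUrl: 'js',
  mainConfigFile: 'js/main.js',  // reuse the existing require.config paths/shims
  name: 'main',                  // the top-level module to trace dependencies from
  out: 'dist/main.min.js',       // everything concatenated and minified into one file
  optimize: 'uglify2'
})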

Keeping Angular modules completely separated

I'm trying to set up a large angular web app, with an architecture as described here
What I can't get my head around is how to make modules completely independent from other modules. All GitHub boilerplates or example projects which focus on modules seem to have modules which heavily depend on common modules or functions/services. That would sort of defeat the purpose of keeping everything separate, wouldn't it?
Take the following two examples:
I have a utilities module which handles some basic functions (hashes etc.), and another module which handles all communication with an API. Imagine a User module which needs to make hashes and communicate with the API - how do I handle that? Directly injecting the Util and API modules as dependencies would break their independence, and putting copies of them into the User module as well would mean a lot of duplicate code (imagine multiple modules using the Util and API modules). Or should I use a mediator to mediate the communication?
User information, which is stored in the User module, is something that should be used almost application-wide, and in the facade as well for example (the facade should handle security according to the article). How can I allow all of the application to access that information without everything being dependent?
Thanks in advance :)
I answered a similar question here.
You are on the right track with separating behaviors into modules.
The idea is to embrace Dependency Injection as a means of collecting these behaviors into a rolled up module. With an IoC container at your disposal you can exercise discipline in your naming and other conventions to keep your app loosely coupled through configurable components.
We have a general app type module that simply registers the appropriate feature modules. That is in turn bootstrapped by a boot module that simply calls angular.bootstrap(['app']) somewhere (ours is in the page).
We can easily decorate or replace existing services/controllers/whatever with this setup and teams can work in isolation while features scale at different rates.
For example:
// A self-contained widget module: a directive plus its own private service
var myWidget = angular.module('myWidget', []);
myWidget.directive('myWidget', function(){...});
myWidget.factory('somethingImportant', function(){...});

// A domain module holding the models
var myDomainModel = angular.module('myDomain', []);
myDomainModel.factory('someModel', function(){...});
... //more models

// The app module simply rolls up the feature modules and adds app-wide policy
var app = angular.module('app', ['myWidget', 'myDomain']);
app.factory('applicationWidePolicyHere', function(){...});

//on index.html; das boot.
(function(){ angular.bootstrap(document, ['app']); })();
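Decorating an existing service, as mentioned above, then stays non-invasive; a small sketch using $provide.decorator against the somethingImportant factory from the example:
// Wrap an existing factory without touching the module that defined it
app.config(['$provide', function ($provide) {
  $provide.decorator('somethingImportant', ['$delegate', function ($delegate) {
    // $delegate is the original service instance; extend or replace behavior here
    $delegate.logged = true;
    return $delegate;
  }]);
}]);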

What is angular-loader.js for?

I saw a similar question on the Google groups and also here on Stackoverflow. Both times the question was not answered. The code in this file doesn't make it very clear about what exactly it does and how it is used. Also it's not clear from the Angular documentation.
Can someone explain how this is used? Also, can this be used along with Require.js?
Angular loader allows your angular scripts to be loaded in any order.
As the angular-seed project shows us, the Angular loader does not have any specific API; you just put it at the top of your index file (so that it's executed first) and then proceed to load your application files any way you prefer.
But, the most important thing for your use case is that you don't really need angular loader at all. RequireJS also allows your files to be loaded in any order, but it also provides you with many other features that angular loader just isn't made for.
So, yes, you may use it with RequireJS, but you don't need to, because it becomes redundant.
Angular modules solve the problem of removing global state from the application and provide a way of configuring the injector. As opposed to AMD or require.js modules, Angular modules don't try to solve the problem of script load ordering or lazy script fetching. These goals are orthogonal and both module systems can live side by side and fulfil their goals.
http://docs.angularjs.org/tutorial/step_07#anoteaboutdiinjectorandproviders
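To make the "side by side" point concrete, a common (hypothetical) pattern is to let an AMD loader decide when files load, while Angular modules handle the dependency injection inside them:
// app.module.js - RequireJS decides when this file loads;
// the Angular module system handles DI once it has.
define(['angular', 'services/userService'], function (angular, userService) {
  var app = angular.module('app', []);
  app.factory('userService', userService);
  return app;
});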
It allows you to asynchronously load files when bootstrapping your Angular application. A good example is the angular-seed project, which has an index-async.html file that does this.
This is useful for using other libraries that load in modules asynchronously.
See angular-async-loader:
https://github.com/subchen/angular-async-loader/
To asynchronously load the following component types:
controller
service
filter
directive
value
constant
provider
decorator

How do you structure your Backbone + RequireJS applications?

I've been struggling to strike the right balance between reusability and complexity when it comes to organizing my Backbone objects into AMD modules (for medium- to large-scale applications).
(A) Should every Backbone object (models, views, etc) be in their own module?
(B) Should related Backbone objects be in the same AMD module? (ie: PersonModel, PersonCollection, PersonView objects in the same module definition)
Option (A) seems to allow the most flexibility and reusability, but also the most complexity because of the (potentially) high number of files, while option (B) may make things easier to manage but is less flexible and really difficult to unit test.
How is (or has) everyone else structured these things?
A good thing about requirejs is that it allows you to abstract the physical files into structured namespaces. You can take approach (A) and create each Backbone class in its own file, then create a "namespace" module to glue all the related classes together.
// Suppose you have PersonView.js, PersonCollection.js, PersonModel.js as modules;
// create a Person module to function as a namespace
define(["PersonModel", "PersonCollection", "PersonView"], function(model, collection, view) {
  return {
    Model: model,
    Collection: collection,
    View: view
  };
});
This keeps the modules organized in their own files and gives you some flexibility to write one module per class without requiring you to expose this organization to the rest of the application (I really don't like having to write require(["PersonView", "PersonModel", ... ]) every time I need to use the Person objects; it's easier and cleaner for consumers to declare a dependency on a "namespace" instead of on independent classes).
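Consumers can then depend on the single namespace module (following the sketch above):
// Elsewhere in the app - one dependency instead of three
define(['Person'], function (Person) {
  var people = new Person.Collection();
  return new Person.View({ collection: people });
});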
For medium to large backbone projects I prefer to use requirejs with a separate module for every model, collection, and view. I also use the "Text" plugin for requirejs so I can load underscore templates just as I would any other module. This for me seems to be the sanest way to manage a large project and I have never really felt overwhelmed with the number of files I have.
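As an illustration of the "Text" plugin mentioned above, a view can declare its underscore template as just another dependency (paths are made up):
define(['backbone', 'underscore', 'text!templates/person.html'],
  function (Backbone, _, personTemplate) {
    return Backbone.View.extend({
      template: _.template(personTemplate),  // raw HTML string loaded by the text! plugin
      render: function () {
        this.$el.html(this.template(this.model.toJSON()));
        return this;
      }
    });
  });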
+1 on using the requirejs optimizer when pushing your app to production. Works really well.
http://requirejs.org/docs/optimization.html
I just released an open source toolkit which will hopefully help others as much as it helps me. It is a composition of many open source tools which gives you a working requirejs backbone app out of the box.
It provides single commands to run: a dev web server, a Jasmine single-browser test runner, a Jasmine js-test-driver multi-browser test runner, and concatenation/minification for JavaScript and CSS. It also outputs an unminified version of your app for production debugging, precompiles your Handlebars templates, and supports internationalization.
No setup is required. It just works.
http://github.com/davidjnelson...
