Does it make sense to eject a project from Create React App and "take back control"?

Create React App is an awesome way to set up a new React project. However, I can see it forces certain baked-in decisions onto you, e.g. using Jest rather than other test runners such as Karma/Mocha. As I am setting up a new greenfield project with React, I am trying to work out whether the best practice is to stay with it and accept certain constraints, or whether most teams end up ejecting and, in the parlance of Brexit, "take back control", and what the reasoning is.

create-react-app actually has a lot of sensible defaults that make it an ideal starting point. The maintainers also regularly update things to stay in sync with where the industry is going, so that's great. And it is maintained by some of the same people responsible for React.
The biggest drawback (and strength) is that it doesn't include many other libraries. You have to add those yourself.
But if you do that you occasionally find that you need to add or tweak a small thing in the Babel/Webpack config.
Luckily, there is a middle ground. Using a library like react-app-rewired (https://github.com/timarney/react-app-rewired) allows you to make small changes to the Webpack config without ejecting just yet.
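As a rough sketch of how that looks in practice (assuming react-app-rewired's convention of a config-overrides.js file in the project root; the alias below is purely illustrative):

// config-overrides.js -- picked up by react-app-rewired instead of the stock react-scripts config
const path = require('path');

module.exports = function override(config, env) {
  // Example tweak: add a resolve alias without ejecting
  config.resolve.alias = {
    ...config.resolve.alias,
    components: path.resolve(__dirname, 'src/components'),
  };
  return config;
};

You also swap the react-scripts commands for react-app-rewired in your package.json scripts (e.g. "start": "react-app-rewired start"), which is what lets the override file take effect.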
Once you do that, you will want to be very careful when upgrading react-scripts, because every upgrade can break the changes you have layered on top of their Webpack config.
But only once that pain is too much would I consider ejecting.

Related

Is more granular versioning in a monorepo with a container possible?

My team has a monorepo written with React, built with Webpack, and managed with Lerna.
Currently, our monorepo contains a package for each screen in the app, plus a "container" package that is basically a router that lazily serves each screen. The container package has all the screens' packages as dependencies.
The problem we keep running into is that, by Lerna's convention, that container package always contains the latest version of each screen. However, we aren't always ready to take the latest version of each screen to production.
I think we need more granular control over the versions of each screen/dependency.
What would be a better way to handle this? Module Federation? peerDependencies? Are there other alternatives?
I don't know if this is right for your use case, as you may need to stick with a monorepo for some reason, but we have a similar situation where our frontend needs to pull in different screens from different custom packages. The way we handle this is to structure each screen or set of screens as its own npm package in its own directory (this can be as simple as creating a corresponding package.json), publish it to its own private Git repository, and then install it in the container package via npm as you would any other module (you will need to create a Git token or set up SSH access if you use a private repo).
The added benefit of this is that you can use Git release tags to mark commits with versions (we wrote our own utility that manages this process for us automatically using Git hooks), and you can then use the same semver ranges you would use with a regular npm package to control which version you install.
For example, one of the dependencies in your container's package.json could look something like this: "my-package": "git+ssh://git@github.<company>.com:<org or user>/<repo>#semver:^1.0.0", and on the GitHub side you would mark your commit with the tag v1.0.0. Now just import your components and render as needed.
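To illustrate the tagging side (assuming you cut the release tag by hand rather than with a hook-based utility like the one mentioned above):

git tag v1.0.0          # in the screen package's repo, tag the commit you want to release
git push origin v1.0.0  # publish the tag so npm can resolve it
npm install             # in the container package, re-resolves the highest tag matching ^1.0.0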
However, we aren't always ready to take the latest version of each screen to production.
If this is a situation that occurs very often, then maybe a monorepo is not the best way to structure the project code? A monorepo is ideal to accommodate the opposite case, where you want to be sure that everything integrates together before merging.
This is often the fastest way to develop, as you don't end up pushing an indeterminate amount of integration work into the future. If you (or someone else) have to come back to it later you'll lose time context switching. Getting it out of the way is more efficient and a monorepo makes that as easy as it can be.
If you need to support older versions of code for some time because it's depended on by code you can't change (yet), you can sometimes just store multiple versions on your main branch. React is a great example, take a look at all the .new.js and .old.js files:
https://github.com/facebook/react/tree/e099e1dc0eeaef7ef1cb2b6430533b2e962f80a9/packages/react-reconciler/src (the last commit on main at the time of writing)
Sure, it creates some "duplication", but if you need to have both versions working and maintained for the foreseeable future, it's much easier if they're both there all the time. Everybody gets to pull it, nobody can miss it because they didn't check out some tag/branch.
You shouldn't overdo this either, of course. Ideally it's a temporary measure you can remove once no code depends on the old version anymore.
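A minimal sketch of that pattern (the file names and feature flag here are illustrative, not how React actually wires its forks together):

// Reconciler.js -- forwards to one of two co-existing implementations kept on main
const { enableNewImplementation } = require('./featureFlags'); // hypothetical flag module

module.exports = enableNewImplementation
  ? require('./Reconciler.new')  // the rewrite being rolled out
  : require('./Reconciler.old'); // the stable version still depended on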

How to force-refresh browser cache automatically for each React App deployment in 2022?

By default, when a React Application is deployed, the cached version will display, even if changes were made in the new build. This requires the end user to hard-refresh their page (Ctrl+F5) to see new changes. It is not feasible to have a user do this each time.
With the constant upgrades and the speed at which software evolves, there are many different ways to accomplish this. However, the answers vary widely because the accepted practices change from year to year.
What is the best practice to accomplish this in 2022? Is there a short and easy way to do this?
Edit: I am using the create-react-app project template.
This has to do with the browser caching the old pages, not with React. To avoid it you need to either force a refresh, as you mentioned, or change your build strategy so that output files get a different name on each build; this can be done with Webpack caching: https://webpack.js.org/guides/caching/.
This way your scripts will have different names with each new deployment, and the browser will have to fetch the new scripts when it loads the updated index.html.
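A minimal sketch of the approach from that guide (plain Webpack config; create-react-app, which the question mentions, already produces content-hashed bundle filenames in its production build, so in that case the main thing left to check is that index.html itself is not served with long-lived cache headers):

// webpack.config.js (excerpt) -- content-based hashes so changed files get new names
module.exports = {
  output: {
    filename: '[name].[contenthash].js',
    chunkFilename: '[name].[contenthash].chunk.js',
  },
};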

What are the benefits of adding configs to package.json?

I have always placed different "tooling configurations" in their own files in my front-end projects.
For example: babel in babel.config.js, jest in jest.config.js, eslint in an .eslintrc.json, etc.
I have noticed recently, however, that it is possible to place many of these configurations directly in a project's package.json file instead.
I did some digging around online and asked a few colleagues, but no one seems to be able to give me a definitive answer as to why one might prefer one approach over the other.
Is it purely a matter of preference?
My suggestion is to avoid stuffing package.json with custom configs mainly for two reasons.
The first is developer expectations. Put yourself in the position of someone who is just getting started with JavaScript and the npm ecosystem: you see something that's not documented, at least not where you expect it to be. That kind of experience can easily drive folks away, especially if they come from stricter developer platforms and languages.
The second is keyword collision, or more importantly the mere possibility of it. We can't expect npm to avoid using some key in the future just because some shiny new library is already using it, can we?
On the other hand, having dedicated files for Babel, Browserslist, PostCSS and so on is a much simpler, self-explanatory approach, and every one of those projects already recommends using dedicated files for configuration.
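For a concrete (illustrative) comparison, the same settings can live either as keys inside package.json:

{
  "name": "my-app",
  "browserslist": ["defaults", "not IE 11"],
  "jest": {
    "testEnvironment": "jsdom"
  }
}

or in the dedicated files those tools recommend (.browserslistrc and jest.config.js), which is also what their documentation shows first.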
Here are the Pros and Cons of package.json configs IMO:
Pro package.json becomes a single source of truth for your app's packages, scripts, and, now, configs. It's a one-stop shop for everything related to your app.
Pro It declutters your root directory. Having a separate config file floating around for all your different packages gets messy.
Con Keyword collision possibility. If Node comes out with a new keyword that happens to match that of an existing package config, then you have to either not use the keyword or move the config to a separate file.
Con Most package documentation references the separate config file in their examples. It could be confusing for troubleshooting or new team members.
Con Your package.json file could get HUGE. And if many configs are managed, the potential for a merge conflict with another team member increases greatly.
Overall it comes down to personal preference. There are some tools like husky (https://github.com/typicode/husky) that put config in package.json by default. Our team does a combination of both.
I think there is no benefit; you just make the package.json file more complicated, and there is no reason for that. If you use separate files, the setup is more declarative and people understand the project better.

What is the general practice for an Express and React based application: keeping the server and client code in the same or different projects/folders?

I am from a Microsoft background, where I always kept server and client applications in separate projects.
Now I am writing a client-server application with Express as the back end and React as the front end. Since I am a total newbie to these two tools, I would like to know:
what is the general practice:
keeping the Express (server) code base and the React (client) code base as separate projects, or keeping the server and client code bases together in the same project? I could not think of any pros & cons of either approach.
Your valuable recommendations are welcome!
PS: please do not mark this question as opinionated; I believe I have a valid reason to ask for recommendations.
I would prefer keeping the server and client as separate projects, because that way we can easily manage their dependencies, dev dependencies and unit test files.
Also, if we need to move to a different framework for the front end at a later point, we can do that without disturbing the server.
In my opinion, it's probably best to have separate projects here. But you made me think a little about the "why" for something that seems obvious at first glance, but maybe is not.
My expectation is that a project should mostly be organized one-to-one around building a single type of target, whether that is a website, a mobile app, or a backend service. Projects are usually an expression of all the dependencies needed to build or otherwise output one functioning, standalone software component. Build and testing tools in the software development ecosystem are organized around this convention, as are industry expectations.
Even if you could make the argument that there are advantages to monolithic projects that generate multiple software components, you are going against people's expectations and that creates the need for more learning and communication. So all things being equal, it's better to go with a more popular choice.
Other common disadvantages of monolithic projects:
greater tendency for design to become tightly coupled and brittle
longer build times (if using one "build everything" script)
takes longer to figure out what the heck all this code in the project is!
It's also quite possible to make macro-projects that work with multiple sub-projects, and in a way have the benefits of both approaches. This is basically just some kind of build script that grabs the output of sub-project builds and does something useful with them in a combination, e.g. deploy to a server environment, run automated tests.
Finally, all devs should be equipped with tools that let them hop between discrete projects easily. If there are pains in doing this, it's best to solve them without resorting to a monolithic project structure.
Some examples of practices that help with developing React/Node-based software that relies on multiple projects:
The IDE easily supports editing multiple projects. And not in some cumbersome "one project loaded at a time" way.
Projects are deployed to a repository that can be easily used by npm or yarn to load in software components as dependencies.
Use "npm link" to work with editable local versions of sub-projects all at once. More generally, don't require a full publish and deploy action to have access to sub-projects you are developing along with your main React-based project.
Use automated build systems like Jenkins to handle macro tasks like building projects together, deploying, or running automated tests.
Use versioning scrupulously in package.json. Let each software component have its own version number and follow the semver convention, which indicates when changes may break compatibility.
If you have a single team (or developer) working on the front-end and back-end software, then set the dependency versions in package.json to always get the latest versions of the sub-projects (packages).
If you have separate teams working on the front-end and back-end software, you may want to relax the dependency versions to major version numbers only, using a semver range in package.json. (Basically, you want some protection from breaking changes; see the sketch after this list.)
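To make the last two points concrete (package names and versions here are purely illustrative):

{
  "dependencies": {
    "my-ui-screens": "latest",
    "my-api-client": "^2.1.0"
  }
}

"latest" (or "*") always resolves to the newest published version of the sub-project on install, while "^2.1.0" accepts 2.x releases from 2.1.0 upward but refuses a breaking 3.0.0.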

Any reasons not to use Angular CLI to get started?

I'm trying to decide if I should use Angular CLI for a new project. My primary reason for doing so would be to avoid the hassles of setting up a new project right now and instead focus on learning the new Angular and building the application.
I'm coming from Angular 1.x so the hassles for me stem from learning all the new tooling in addition to the new Angular. Most of the docs reference systemjs but webpack seems like the direction the community is moving in so I would like to go that route.
I would prefer to learn and become comfortable with the Angular toolchain (including webpack) but I'd like to push that off a little if possible. I generally don't prefer "black boxes" like the CLI.
I would like to start by using the CLI and then break away at a point in the future when I have time to invest in learning more about webpack, etc. My question is: What limitations does the CLI put on me, can I easily break away from it in the future, and generally what else should I consider before using it as a quick way to get started?
I started working with Angular 2 while it was in Alpha, long before Angular-cli was available. During this early stage I struggled with the build environment - I was using systemjs and a whole pile of self-built spaghetti gulp tasks to handle transpiling, minifying, bundling, etc. For every hour I spent writing Angular code, however, it seemed I was spending two hours on the build environment. Did I learn a lot? Sure. Was it a good use of my time? Not very.
The angular-cli changed all of that. It was built by the Angular team to accomplish all of the development and build tasks that an Angular developer needs. It is always improving, and when there have been problems they have been addressed quickly. I can now create an ng2 project in a few minutes with "ng new projectname --style=scss". I can run immediately in development mode with "ng serve". Changes get compiled automatically. I can build for production with "ng build -prod -aot" and have my entire ng2 project ready for production in minutes, with Ahead-of-Time (AoT) compilation and tree-shaking.
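To summarize the workflow described above (commands and flags exactly as used in this answer; later CLI versions have renamed or removed some of these flags):

ng new projectname --style=scss   # scaffold a new project with SCSS styling
ng serve                          # dev server, recompiles on change
ng build -prod -aot               # production build with AoT compilation and tree-shaking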
So my advice to anyone would be this. If you want to quickly get into the serious work of building ng2 apps, and not waste your time re-creating the build and production environments yourself, then use angular-cli.
If you have time to burn and want to learn more about what's underneath the hood with angular2, then have a go at it yourself; you will certainly gain a better understanding of things; but you'll just end up using angular-cli anyway.
I am going to argue against using the CLI.
I have been using Angular 2 from the early days, before RC. Indeed, there was a lot of confusion back then, having to deal with all these packaging solutions (require.js, system.js, webpack). I have to admit that it was not a pleasant time, and it drained a lot of time.
BUT
Nowadays, I have a strong skill set in setting up builds and deployments. I have experimented with lots of possible ways to configure it and to achieve greatness. Recently we had to develop a plugin architecture for our webapp at the office. Guess what: knowing webpack saved my skin big time. I was able to find an initial solution which was not that great; eventually, after polishing it and taking advantage of webpack, we created a really nice solution with minimal code, without interfering with either the webpack or the Angular architecture.
There is no chance I would have been able to do this without all the pain of having to deal with webpack constantly. I very often hear misconceptions from my peers about how webpack and Angular work. I do my best to explain, but nothing beats doing it yourself. I'm sorry to say it, but hiding behind the CLI will do you no good. A senior developer whom I can trust to create new architecture must have solid webpack, Angular and TypeScript know-how.
If you do not understand these tools properly, you will be relegated to the menial tasks of applying existing patterns, never to be trusted to go out there into the wild jungle and create the new architecture for others to follow. You need to be able to think for yourself and make your own decisions.
Conclusion
See whether or not the CLI is the tool for your current task and choose accordingly. Don't just blindly follow the first advice you see. If you are in charge of a project and you have to call the shots, knowing webpack is a must.
I am in the same situation as you, but after doing a lot of research on the subject I have come to the conclusion that it is perfectly fine to use the CLI to build Angular 2 projects.
The CLI is supported by the Angular team and is in constant development with a big community - even tutorials and the ng2 book use the CLI as the configuration.
The CLI has Webpack integrated and exposes the configuration through the CLI JSON file, but I read that it will soon be possible to use the command 'ng eject' to eject the webpack config file itself (if needed).
I believe that in the future (and even now) it is normal to use the CLI with its integrated Webpack, instead of using Webpack as a separate bundler.
