React hot reload doesn't work in Docker container

I am trying to set up React with Docker, but for some reason I cannot get hot reload to work. Currently, if I create a file it recompiles, but if I change something in an existing file it does not. I didn't change any packages or configuration files; the project was generated with npx create-react-app projectname --template typescript.
From researching this online I found out that I needed to add CHOKIDAR_USEPOLLING=true to a .env file. I tried this, but it didn't work; I tried placing the .env in every directory in case I had put it in the wrong one. I also added it to the docker-compose.yml environment.
In addition, I tried downgrading react-scripts to 4.0.3 because I found this; that also didn't work.
I also tried changing a file locally and then checking whether it also changes inside the Docker container. It does, so I'm fairly sure my Docker-related files are correct.
Versions
Node 16.14
Docker Desktop 4.5.1 (Windows)
react 17.0.2
react-scripts 5.0.0
Directory structure
project/
│ README.md
│ docker-compose.yml
│
└───frontend/
│ Dockerfile
│ package.json
│ src/
│ ...
Dockerfile
FROM node:16.14-alpine3.14
WORKDIR /app
COPY package.json .
COPY package-lock.json .
RUN npm install
CMD ["npm", "start"]
docker-compose.yml
services:
  frontend:
    build: ./frontend
    ports:
      - "3000:3000"
    volumes:
      - "./frontend:/app"
      - "/app/node_modules"
    environment:
      CHOKIDAR_USEPOLLING: "true"

If you are on Windows and use react-scripts 5.x.x or above, CHOKIDAR_USEPOLLING does not work. Change your package.json instead in the following way:
"scripts": {
  ...
  "start": "WATCHPACK_POLLING=true react-scripts start",
  ...
}
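Alternatively, the same variable can be set in the compose file instead of the start script; a minimal sketch, applied to the docker-compose.yml from the question:

environment:
  WATCHPACK_POLLING: "true"

Since watchpack reads this from the environment, it should behave the same as putting it in the start script.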

The Dockerfile you have is great for when you want to package your app into a container, ready for deployment. It's not so good for development where you want to have the source outside the container and have the running container react to changes in the source.
What I do is keep the Dockerfile for packaging the app and only build that when I'm done.
When developing, I can often do without a Dockerfile at all, just by running a container and mapping my source code into it.
For instance, here's a command I use to run a node app:
docker run -u=1000:1000 -v $(pwd):/app -w=/app -d -p 3000:3000 --rm --name=nodedev node bash -c "npm install && npm run dev"
As you can see, it just runs a standard node image. Let me go through the different parts of the command:
-u 1000:1000: 1000 is my UID and GID on the host. By running the container with the same IDs, any files created by the container will be owned by me on the host.
-v $(pwd):/app: map the current directory into the /app directory in the container
-w /app: set the working directory in the container to /app
-d: run detached
-p 3000:3000: map port 3000 in the container to port 3000 on the host
--rm: remove the container when it exits
--name=nodedev: give it a name, so I can kill it without looking it up
At the end there's a command for the container, bash -c "npm install && npm run dev", which starts by installing any dependencies and then runs the dev script in the package.json file. That script starts node in a mode where it hot reloads.
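For reference, a minimal sketch of such a dev script, with nodemon as the reloader (nodemon is an example choice here, not something this answer prescribes):

"scripts": {
  "dev": "nodemon src/index.js"
}

nodemon restarts the process whenever a watched source file changes.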

Polling wasn't my issue; watching the Docker output I saw it recompiled correctly. My problem was related to networking in the hot reloader. I found that the browser tries to open a websocket back to the node server at ws://localhost:3000/sockjs-node, but with the network settings on my computer this wouldn't route back to the Docker container. I added
WDS_SOCKET_HOST=127.0.0.1
to the environment variables. After restarting, the browser successfully connects to ws://127.0.0.1:3000/sockjs-node and hot reloading works properly.
This was tied to a docker-compose setup similar to the original post. I was also able to make things work with a variation on the docker run approach, but it was much slower to launch.
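For reference, a minimal sketch of the relevant addition to a compose file like the one in the question:

environment:
  CHOKIDAR_USEPOLLING: "true"
  WDS_SOCKET_HOST: "127.0.0.1"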

WSL Workaround for CRA 5.0+
watch.js
const fs = require('fs');
const path = require('path');

if (process.env.NODE_ENV === 'development') {
  const webPackConfigFile = path.resolve('./node_modules/react-scripts/config/webpack.config.js');
  let webPackConfigFileText = fs.readFileSync(webPackConfigFile, 'utf8');

  if (!webPackConfigFileText.includes('watchOptions')) {
    if (webPackConfigFileText.includes('performance: false,')) {
      webPackConfigFileText = webPackConfigFileText.replace(
        'performance: false,',
        "performance: false,\n\t\twatchOptions: { aggregateTimeout: 200, poll: 1000, ignored: '**/node_modules', },"
      );
      fs.writeFileSync(webPackConfigFile, webPackConfigFileText, 'utf8');
    } else {
      throw new Error(`Failed to inject watchOptions`);
    }
  }
}
package.json
"scripts": {
"start": "node ./watch && react-scripts start",

I had the same issue once. The problem was in my Dockerfile: the WORKDIR was /var/app while in my docker-compose.yml I mounted the current working directory to /var/app/frontend. I just removed the /frontend suffix and it works fine. Please check yours. Thanks.
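In other words, the WORKDIR in the Dockerfile and the container side of the bind mount must point to the same path. A minimal sketch (paths follow this answer):

# Dockerfile
WORKDIR /var/app

# docker-compose.yml
volumes:
  - ./:/var/app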

Make sure you use the --legacy-watch flag in your package.json file:
{
  "name": "history",
  "version": "1.0.0",
  "description": "",
  "main": "./src/index.js",
  "scripts": {
    "start": "node ./src/index.js",
    "start:dev": "nodemon --legacy-watch ./src/index.js"
  },
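With that script in place, start the watcher inside the container with:

npm run start:dev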

I had the same issue, and my mistake was that I was editing on my local machine and expecting the code to reload inside the Docker container. Since the two are in different locations, it will never work out, and I had to restart docker-compose again and again.
Later I used the VS Code command palette
"Attach to Running Container..."
option, selected the required container, cd'd into my code folder, and made my changes, and live reload (or applying code changes on page refresh) started working.
This solved one issue; the next one is to use my local ssh-key inside the Docker container so that I can do my git sync inside the container itself.

Setting NODE_ENV="development" in the Dockerfile enables hot reload.
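In Dockerfile form that would be (a sketch; on its own this only helps if the tooling in the image switches on watch behaviour in development mode):

ENV NODE_ENV=development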

Related

Unable to start React with TypeScript in Docker container

I'm trying to npm run start a React application which was created with --template typescript.
TypeScript is therefore installed (as a React dependency), but my Docker container complains with a generic error message that TypeScript isn't installed. I'm therefore unable to start the application inside a Docker container.
Everything works when I start the application (with the same package.json) outside the container.
Error
> frontend#0.1.0 start /app
> react-scripts start
It looks like you're trying to use TypeScript but do not have typescript installed.
Please install typescript by running npm install typescript.
npm ERR! Exit status 1
I added TypeScript via npm install typescript and rebuilt the Docker container. But it still shows the error message.
Even after adding typescript manually as a dependency (even inside the container, with a direct call to npm install typescript there!), the container complained about not being able to find TypeScript (which doesn't seem to be true, as I can verify that TypeScript was installed inside the container: tsc --version shows the correct output).
Code
My Dockerfile looks like this:
FROM node:15.4.0-alpine3.12
# set working directory
ARG app_path=/app
WORKDIR ${app_path}
# add `node_modules/.bin` to $PATH
ENV PATH ${app_path}/node_modules/.bin:$PATH
# set Docker port
EXPOSE 3000
# copy configs, no need to copy src files as they get bind mounted later on (see docker-compose)
COPY package*.json ./
COPY tsconfig.json ./
# install all app dependencies
RUN npm install --silent
# validate typescript installation
RUN tsc --version
My docker-compose.yaml file looks like this:
version: '3.8'
services:
  frontend:
    build:
      context: ./frontend
      dockerfile: ../Dockerfile
    image: frontend:dev
    container_name: dev_frontend_react
    ports:
      - 4000:3000
    command: ['npm', 'run', 'start']
    volumes:
      - ${HOST_PROJECT_PATH}/frontend:/app
      # add a virtual volume to overwrite the copied node_modules folder as the
      # container installs its own node_modules
      - node_modules:/app/node_modules
    environment:
      - NODE_ENV=development
    restart: unless-stopped
volumes:
  node_modules:
    name: frontend_node_modules
Same question, non-working solution
I found another solution to the same question here on StackOverflow:
Asked to install Typescript when already installed when building Docker image
But that solution hard-copies the project into the container and creates an actual build, which is not acceptable for me as it prevents React's hot reload feature from working.
During the Docker image build (i.e. in the Dockerfile) you are installing the dependencies into the node_modules folder inside the container (RUN npm install --silent). TypeScript (tsc --version) works as it gets installed there.
Later, in the docker-compose file, you are replacing the node_modules folder with a folder from the host machine. So, effectively, the commands from the Dockerfile have no effect.
I'm not sure what your goal is, but I can see several options:
if you want the Docker container to install dependencies into the folder on the host machine (?), you need to do it when the container is being started, not when the image is being built (see the sketch after this list)
if you just want to use dependencies from the host folder, make sure that typescript is there
if your goal is to run the app in Docker but still be able to edit and hot-reload it, try mounting the src folder alone (but then you won't be able to add new dependencies)
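For the first option, a minimal sketch (assuming the compose file from the question): override the container's command so dependencies are installed into the mounted volume at start-up, before the dev server runs:

command: sh -c "npm install && npm run start"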
UPDATE:
Actually, your solution should work. Docker should copy the files from node_modules inside the container into the newly mounted volume. There is one gotcha, though: it does this only when the volume is newly created (see populate a volume using a container). So, to solve the problem, try removing the volume and rerunning docker-compose:
docker volume rm frontend_node_modules
Unfortunately you need to do this every time there are any changes to the dependencies. Alternatively you can use:
docker-compose down -v
which removes volumes as well (by default docker-compose down does not remove volumes, hence the -v switch).

Loopback 4 served static files not showing on kubernetes

I have been banging my head against this for a while, and maybe someone else has a better idea of what my issue is. I have a React and LB4 application and I want to host it on our Kubernetes cluster. I build the React project, put it into a build folder in my LB4 project, and serve those files while also using the LB4 backend for APIs. I put everything in a Docker container, and when I run the container locally, it works as I would expect. When I put the container on Kubernetes, I am able to hit the APIs from the LoopBack project but get a 404 when trying to hit the GUI.
In my LB4 project, I have this to serve the static files:
constructor(options: ApplicationConfig = {}) {
  super(options);

  // Set up the custom sequence
  this.sequence(MySequence);

  // Set up default home page
  this.static('/', path.join(__dirname, '../build'));

  // Customize @loopback/rest-explorer configuration here
  this.bind(RestExplorerBindings.CONFIG).to({
    path: '/explorer',
  });
  this.component(RestExplorerComponent);

  this.projectRoot = __dirname;
and here is the Dockerfile that I'm using:
RUN mkdir -p /src/app
COPY . /src/app
WORKDIR src/app
ARG env_name
ENV NODE_ENV=env_name
ENV PUBLIC_PATH "/"
RUN npm install
RUN npm run build:client
COPY /src/client/build /src/server/
EXPOSE 3001
CMD ["npm", "run", "start"]
Anyone notice anything that might be the issue? I would appreciate it greatly. Thanks.
Edit: I think I've partly found the issue. It looks like the copying of the static files at the COPY step in my Dockerfile doesn't quite work as I intended, so I think it's looking at an empty folder on the Kubernetes cluster. Now to see why that isn't working.
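A likely cause (an assumption, not verified against this project): COPY sources are resolved against the build context on the host, not the image filesystem, so the output of the earlier RUN npm run build:client step is invisible to that COPY. Moving files produced during the build needs a RUN step instead; a sketch with guessed paths:

# COPY cannot see files created by earlier RUN steps,
# so move the build output with a shell command instead (paths are guesses)
RUN cp -r ./client/build /src/server/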

Multiple env files choosing which one on docker - create react app

I am building a React app using CRA. The application has only client-side code, which means there is no Node.js part.
I have two different environments, development and production. As the CRA docs explain, there is an order of preference:
.env
.env.development
.env.production
So if a .env.production file is there in the repo, CRA will pick a config based on the script I run: npm run build uses .env.production, and npm start uses .env.development if that file is there.
So I can add .env, .env.development, and .env.production, but when I build the image in Docker I can give only one command, either npm start or npm run build. How should I solve this?
Install a local development environment; typically your only host dependency will be Node itself. Use the .env.development file there, via something like the Webpack dev server, with a command like yarn start.
Use Docker principally as a deployment mechanism. Your Dockerfile can build your application using the .env.production file, and then copy it into something like an Nginx container that doesn't need Node at all. It should follow the pattern in the CRA Creating a Production Build docs. Loosely,
FROM node:lts AS build
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm install
COPY . .
ENV NODE_ENV=production
RUN npm run build

FROM nginx
COPY --from=build /app/build /usr/share/nginx/html
# Base image provides default EXPOSE, CMD
This pattern gets around all of the difficulties of trying to make Docker act like a local development environment (filesystem permissions, node_modules not updating, live reloading being flaky, ...) by just using an actual local development environment; but at deployment time you get the benefits of a self-contained Docker image with no host dependencies.
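To build and smoke-test the production image locally (image name and host port here are arbitrary examples):

docker build -t myapp-prod .
docker run --rm -p 8080:80 myapp-prod

The nginx base image listens on port 80, so the app is then available at http://localhost:8080.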

Using Docker for create react app without using Nodejs to serve the file

Without using Node.js to serve the static files, I am trying to build a Docker image for a create-react-app project with the folder structure below:
sampleapp -
client
src
...
DockerFile
So client was built with create-react-app client; the application just consumes services and renders the results.
Dockerfile:
FROM node:10.15.1-alpine
COPY . /var/panda/client
WORKDIR /var/panda/client
RUN npm install --no-cache && npm run build
EXPOSE 4000
CMD ["npm", "start"]
How can I start the Docker container locally and in prod, and is the above Dockerfile fine for running the application as a production build?
If you're asking how to run an image, you simply do docker build -t client ., and then docker run client. Your Dockerfile is not fine for a prod environment because it runs as root. You should add the following lines just before the last line:
RUN adduser -D user
USER user
Once you've run npm run build you will have a set of static files you can serve. Anything that serves static files will work fine. One typical setup is to use a multi-stage build to first build the application, then serve it out of an Nginx image:
FROM node:10.15.1-alpine AS build
COPY . /var/panda/client
WORKDIR /var/panda/client
RUN npm install --no-cache && npm run build
FROM nginx:1.17
COPY --from=build /var/panda/client/build /usr/share/nginx/html
For day-to-day development, just use the ordinary CRA tools like npm run start directly on your desktop system. You'll get features like live reloading, and you won't have to mess with Docker volumes or permissions. Since the application ultimately runs in the browser it can't participate in things like Docker networking; except for pre-deployment testing there's not really any advantage to running it in Docker.

Host an Angular App, ExpressJS endpoint on a AWS EC2 server

First of all, I would like to inform readers that I am pretty new to Node.js, Angular, and Express.
I have partially completed a project where I need to create a website in AngularJS with server-side logic (ExpressJS).
While developing, I realised that hosting or deploying a MEAN stack isn't as straightforward as a LAMP stack.
So I'm requesting a solution to the following problem:
I want to host a website developed in Angular, with the endpoint in ExpressJS and the database in MySQL.
I have tried to find solutions to this, but none of them painted a clear picture for me.
Sadly, the server I have is a free tier due to budget constraints, and it's a plain Ubuntu 18.04 system.
Here is one link that I tried to understand, but it is for Azure.
This one was kind of more helpful but again it raised many questions.
Since I am new to this technology, I would be grateful if somebody would walk me through the deployment process of Angular and Express together on the same server.
I would go with Docker: one container running a node image and another running a mysql image. The node container will run your Angular and Express app. Also, with Docker you will have no difference between your development environment and your production environment.
Do you have Docker installed? Which OS are you using?
Download node image from Docker Hub:
docker pull node
Then I would create a Dockerfile to generate an image from the node image, copying all your source code into it.
FROM node:latest
LABEL author="Your Name"
ENV NODE_ENV=production PORT=3000
COPY . /app
WORKDIR /app
RUN npm install
EXPOSE $PORT
ENTRYPOINT ["npm", "start"]
The COPY command copies the source code of your current directory (.) into the /app directory inside the container. WORKDIR sets the context where your commands will be executed inside the container, so you can run npm install where your package.json is. RUN downloads all app dependencies inside the container. ENTRYPOINT executes the file that starts your app, as specified in your package.json file, like below:
"name": "app",
"version": "1.0.0",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1",
"start": "node index.js"
},
"license": "ISC",
"dependencies": { ... }
.dockerignore file (so you do not copy your node modules, Dockerfile, etc. into your container):
node_modules
npm-debug.log
Dockerfile*
docker-compose*
.dockerignore
.git
.gitignore
README.md
LICENSE
.vscode
To create an image based on the above Dockerfile (place the Dockerfile in your app's folder and run docker build there):
docker build -t image_name .
To run your image in a Docker container:
docker run -d -p 3000:3000 image_name
Running the container like this, you can open your app in the browser at DOCKER_HOST_IP:PORT.
Assuming your app listens on port 3000, we map the external port 3000 to the internal port 3000 inside the container where your app is running.
EXPRESS
In order for express to serve your files, you need to set express.static:
// serve client side code.
app.use('/', express.static('app_folder'));
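If the Angular app uses client-side routing, you may also want a fallback to index.html so that deep links survive a page refresh; a sketch, reusing the app_folder placeholder from above:

const path = require('path');

// send index.html for any route not handled above,
// so client-side routes resolve after a refresh
app.get('*', (req, res) => {
  res.sendFile(path.join(__dirname, 'app_folder', 'index.html'));
});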
You can git clone your app on an EC2 instance and then install a systemd service. Here's an example of a service file:
[Unit]
Description=My App
After=syslog.target network.target
[Service]
Environment=NODE_ENV=production
ExecStart=/usr/bin/node /home/appuser/repo-app/index.js
WorkingDirectory=/home/appuser/repo-app/
Restart=always
StandardOutput=syslog
StandardError=syslog
SyslogIdentifier=myapp
User=appuser
Group=appuser
[Install]
WantedBy=multi-user.target
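To install and start it (assuming the unit file is saved as myapp.service):

sudo cp myapp.service /etc/systemd/system/
sudo systemctl daemon-reload
sudo systemctl enable --now myapp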
You can also make good use of HAProxy in front of your Express endpoint.
