How to access a React app running in a Docker container? - reactjs

Dockerfile
FROM node:alpine
WORKDIR '/app'
COPY package.json .
RUN npm install
COPY . .
CMD ["npm", "run", "start"]
docker-compose.yml
version: "3"
services:
  web:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "3000:3000"
    volumes:
      - /app/node_modules
      - .:/app
    command: ["npm", "start"]
Commands I ran to bring this up:
docker-compose -f docker-compose.yml up --build
After this I went to https://localhost:3000 and the project did not load.
Here is reproducible repo https://github.com/reyanshmishra/My-Portfolio-ReactJS
Thanks

You can't run the React project by building it like this. You have to add the following line to your Dockerfile to serve the built application:
# Install `serve` to run the application.
RUN npm install -g serve
Example Dockerfile
FROM node:alpine
WORKDIR '/app'
COPY package.json .
# Copy all local files into the image.
COPY . .
RUN npm install
RUN npm audit fix
# Build for production.
RUN npm run build --production
# Install `serve` to run the application.
RUN npm install -g serve
# Set the command to start the node server.
CMD serve -s build
# Tell Docker about the port we'll run on.
EXPOSE 5000
You can run the image like this (your app will run on port 5000 by default, so you have to update the port mapping in your docker-compose.yml as well):
$ docker run -p 5000:5000 <image name>
If you need to access the image without running the container, see:
How to access a docker image?
If you need to access the running container:
$ docker exec -ti <container id> bash
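For reference, the docker-compose.yml from the question would then need its port mapping changed to match (a sketch assuming everything else stays the same):

```yaml
version: "3"
services:
  web:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "5000:5000"  # serve listens on 5000 here, per the Dockerfile above
```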

From the webpack-dev-server documentation:
Either method will start a server instance and begin listening for connections from localhost on port 8080.
I guess you can modify docker-compose.yml to:
ports:
  - "3000:8080"
And then you should be able to access your app using http://localhost:3000.
OR
You can modify your webpack config to use port 3000 instead of the default 8080:
devServer: {
  contentBase: path.join(__dirname, "src"),
  hot: true,
  inline: true,
  historyApiFallback: true,
  stats: {
    colors: true
  },
  port: 3000
},

Related

React JS doesn't update in Docker compose

First of all, I am really new to Docker and everything related to it. I tried tutorials and instructions from the web, but after seeing the answers to this question I guess I have no option other than moving to Linux completely (which I really don't want to do).
I have a docker-compose project with ASP.NET Core, Nginx and React. FrontEnd and BackEnd have their own dockerfile and dockerignore.
The frontend doesn't pick up code changes at all.
I tried to rebuild only the client service using docker-compose build --no-cache, but after two minutes of compiling I saw that nothing had changed.
The only solution that works for me is to delete the docker-compose project and compile every service again, which makes development much more difficult.
dockerfile:
FROM node:16-alpine
WORKDIR /app
ENV PATH /app/node_modules/.bin:$PATH
COPY ./package.json /app
RUN npm install
COPY . .
RUN npm run build
CMD ["npm", "start"]
docker-compose.yml (client service):
client:
  image: client
  build:
    context: ./walletfrontend
    dockerfile: Dockerfile
  environment:
    - WATCHPACK_POLLING=true
I tried WATCHPACK_POLLING=true as suggested in another question, but I think it does nothing, and I am not really sure what it is needed for.
UPDATE:
So I think I found a solution in this article:
https://shahmirprogrammer.medium.com/docker-with-react-changes-reflect-real-time-inside-a-container-f83acf208f8a
It really does update the changes in real time.
So my next goal is to translate this strange command, which I don't have a clue what it does, into its docker-compose equivalent :)
docker run -d -p 3000:3000 -v /app/node_modules -v $(pwd):/app --name dockerized-react-app react-app-image:1
UPDATE #2:
So I think the solution is to use volumes with the correct path to my host machine:
volumes:
  - /app/node_modules
  - ./walletfrontend:/app
It took almost a week and it was really difficult to find a solution for this. I hope this helps other newbies like me in the future.
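For anyone else translating that docker run command, it corresponds roughly to the following compose service (a sketch; the service name is an assumption, and the paths match UPDATE #2):

```yaml
services:
  client:
    build:
      context: ./walletfrontend
      dockerfile: Dockerfile
    ports:
      - "3000:3000"             # -p 3000:3000
    volumes:
      - /app/node_modules       # -v /app/node_modules (anonymous volume)
      - ./walletfrontend:/app   # -v $(pwd):/app (bind mount for live reload)
```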
FROM node:16-alpine
WORKDIR /app
ENV PATH /app/node_modules/.bin:$PATH
COPY ./package.json /app
COPY . .
RUN npm install
RUN npm run build
CMD ["npm", "start"]
Copy all the files before you run npm install.
A really useful tutorial on this can be found here:
https://mherman.org/blog/dockerizing-a-react-app/
Dockerfile
# pull official base image
FROM node:13.12.0-alpine
# set working directory
WORKDIR /app
# add `/app/node_modules/.bin` to $PATH
ENV PATH /app/node_modules/.bin:$PATH
# install app dependencies
COPY package.json ./
COPY package-lock.json ./
RUN npm install --silent
RUN npm install react-scripts@3.4.1 -g --silent
# add app
COPY . ./
# start app
CMD ["npm", "start"]
docker-compose.yml
version: '3.7'
services:
  sample:
    container_name: sample
    build:
      context: .
      dockerfile: Dockerfile
    volumes:
      - '.:/app'
      - '/app/node_modules'
    ports:
      - 3001:3000
    environment:
      - CHOKIDAR_USEPOLLING=true
The guide I just linked also contains info on how to get your app production-ready.

Docker Compose builds successfully, and from within the container curl returns the webpage, but not from the host

I'm building out a Docker dev environment and moving a bunch of existing apps into containers.
I have an old React app which compiles using webpack; no matter what I do in the docker-compose file, it remains unreachable from my host machine (macOS).
Running this works successfully, but not from the host:
$ docker exec -it <container id> /bin/sh
# curl localhost:8080
<!DOCTYPE html>
<html>
...
</html>
Dockerfile
# syntax=docker/dockerfile:1
FROM node:16.16.0-buster as base
WORKDIR /app
ENV PATH /app/node_modules/.bin:$PATH
COPY package.json ./
COPY yarn.lock ./
COPY webpack.common.js ./
## import prod, because after install the build script tests if it can compile successfully
COPY webpack.prod.js ./
COPY . ./
RUN yarn install
EXPOSE 8080
FROM base as dev
CMD ["yarn", "run", "dev"]
Option 1) Doesn't Work
version: '3.8'
services:
  backbone-ui:
    network_mode: host
    build:
      context: .
    container_name: backbone-ui
    ports:
      - target: 8080
        host_ip: 127.0.0.1
        published: 8080
        protocol: http
        mode: host
    volumes:
      - ./:/app
    command: yarn run dev
Option 2) Doesn't Work
version: '3.8'
services:
  backbone-ui:
    network_mode: host
    build:
      context: .
    container_name: backbone-ui
    ports:
      - 8080:8080
    volumes:
      - ./:/app
    command: yarn run dev
Option 3) Doesn't Work
version: '3.8'
services:
  backbone-ui:
    build:
      context: .
    container_name: backbone-ui
    ports:
      - 8080:8080
    volumes:
      - ./:/app
    command: yarn run dev
Failed Attempt
Dockerfile (paired with Option 2 compose.yml)
# syntax=docker/dockerfile:1
FROM node:16.16.0-buster as base
WORKDIR /app
ENV PATH /app/node_modules/.bin:$PATH
COPY package.json ./
COPY yarn.lock ./
COPY webpack.common.js ./
COPY webpack.prod.js ./
COPY . ./
RUN yarn install
FROM base as dev
EXPOSE 8080
CMD ["yarn", "run", "dev"]
Failed Attempt
updated package.json script
"dev": "webpack-dev-server --open --config webpack.dev.js --hot --host 0.0.0.0"
I confirmed after rebuilding that the app is listening on 0.0.0.0:8080, but it still failed to respond from the host:
$ docker exec -it <container id> /bin/sh
# curl 0.0.0.0:8080
<!DOCTYPE html>
<html>
...
</html>
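For what it's worth, two things usually have to hold at the same time on macOS: the compose file must use a plain port mapping (network_mode: host has no effect on Mac, since the ports stay inside the Docker VM), and the dev server must bind 0.0.0.0. A sketch combining Option 3 with the updated dev script:

```yaml
version: '3.8'
services:
  backbone-ui:
    build:
      context: .
    container_name: backbone-ui
    ports:
      - "8080:8080"
    volumes:
      - ./:/app
    # relies on the package.json "dev" script passing --host 0.0.0.0
    command: yarn run dev
```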

Dockerized React app not recompiling code

I'm trying to dockerize a basic CRA template created through npx create-react-app project-name, whose Dockerfile looks like:
FROM node:latest
WORKDIR /usr/src/client
COPY ./ ./
RUN npm install
EXPOSE 3000
CMD ["npm", "start"]
I've built the image by running docker build -t containername .
and then run it with docker run -it -p 3000:3000 containername
Everything works fine: the container runs successfully and I can see the webpage in the browser.
The problem is that webpack hot reloading is not working, so the app does not recompile on changes.
The same question was posed already here and here, sadly with unsuccessful results. The problem seems to appear for Windows users, but in my case I'm on a Mac.
I've tried already:
Updating npm start script with CHOKIDAR_USEPOLLING=true react-scripts start
Adding EXPOSE 35729 as explained here
Any suggestion is highly appreciated, thank you in advance!
I think the webpack server doesn't see any new changes, because you modify your local files, but the container uses its own copies at runtime, which were passed in at build time. So you should mount your local dir into the container.
I suggest you use docker-compose to mount your work dir from host to container:
docker-compose.yml
version: '3.4'
services:
  app:
    build: ./
    volumes:
      - ./src:/usr/src/client
    ports:
      - 3000:3000 # HOST:CONTAINER
    command: npm start
or maybe use -v $(pwd)/src:/app/src in your docker run ... command
Unbelievably, the issue was caused by the working directory naming.
To fix it, I simply had to change it from /usr/src/client to /app and it started recompiling, even though I have no clue why.
FROM node:latest
WORKDIR /app
COPY ./ ./
RUN npm install
EXPOSE 3000
CMD ["npm", "start"]
The following worked for me.
docker run -p 3000:3000 -v ${PWD}:/usr/app -e CHOKIDAR_USEPOLLING=true globoreactapp/latest
Run that in Windows PowerShell.
If you are using the Windows cmd prompt, you may try:
docker run -p 3000:3000 -v %cd%:/usr/app -e CHOKIDAR_USEPOLLING=true globoreactapp/latest
And if you are running on some Linux distro, you can try something like this:
docker run -p 3000:3000 -e CHOKIDAR_USEPOLLING=true -v $(pwd):/usr/app globoreactapp/latest
So here specifying
-e CHOKIDAR_USEPOLLING=true
made the difference for me.
Also with docker-compose, you need to add the same environment variable. Mine looks as follows.
version: '3'
services:
  redis-server:
    image: 'redis'
  node-app:
    build:
      context: ./globoappserver
    ports:
      - "9080:8080"
    container_name: api-server
  ui:
    build:
      context: ./globo-react-app-ui
    environment:
      - CHOKIDAR_USEPOLLING=true
    ports:
      - "7000:3000"
    stdin_open: true
    volumes:
      - ./globo-react-app-ui:/usr/app

Docker container works locally but not when uploaded to elasticbeanstalk

My docker container works locally, I'm trying to deploy it on elastic beanstalk using travis.
My travis build is successful. The docker container has been tested locally and it works. On AWS Elastic Beanstalk I get a "Not a file/Directory error" for my build directory.
Dockerfile
FROM node:alpine as builder
WORKDIR '/app'
COPY package.json .
RUN npm install
COPY . .
CMD ["npm", "run", "build"]
# Run phase
FROM nginx
EXPOSE 80
COPY --from=builder /app/build /usr/share/nginx/html
Dockerfile.dev
FROM node:alpine
WORKDIR '/app'
COPY package.json .
RUN npm install
COPY . .
CMD ["npm", "run", "start"]
travis.yml
sudo: required
services:
  - docker
before_install:
  - docker build -t *******/docker -f Dockerfile.dev .
script:
  - docker run -e CI=true *******/docker npm run test -- --coverage
deploy:
  provider: elasticbeanstalk
  region: "ap-south-1"
  app: "docker"
  env: "Docker-env-2"
  bucket_name: "***********************"
  bucket_path: "docker"
  on:
    branch: master
  access_key_id: $AWS_ACCESS_KEY
  secret_access_key: $AWS_SECRET_KEY
Following are the logs -
Travis output
Elastic Beanstalk output
Any help would be appreciated, thanks!
To run it locally, I run the following commands:
1) docker build -t *******/docker .
2) docker run -it -p <port>:80 <container_id>
It works as expected and I can reach the server on localhost:<port>.
I've put the same commands on the travis.yml file as well.
There are two Dockerfiles because I only need the "build" directory in the production container and can ignore the rest of the directories to save space.
I realized that the build directory was listed in the .gitignore file, preventing Travis CI from accessing it since it isn't in the repo.
Once I removed the entry and re-deployed, it worked perfectly.
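The fix itself can be sketched like this (the .gitignore contents below are an assumption for illustration; the real file will differ):

```shell
# Stand-in for the repo's .gitignore, which wrongly ignores "build".
printf 'node_modules\nbuild\n' > example.gitignore
# Drop the "build" entry so the compiled output can be committed
# and reach Travis / Elastic Beanstalk.
sed -i '/^build$/d' example.gitignore
cat example.gitignore
```

After removing the entry, commit the build directory and re-deploy.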

"Create React App" with Docker

I was wondering if anyone had any experience using create-react-app with docker. I was able to get it set up with a Dockerfile like:
FROM node
RUN mkdir /src
WORKDIR /src
ADD package.json /src/package.json
RUN npm install
EXPOSE 3000
CMD [ "npm", "start" ]
And then used a docker-compose file like:
app:
  volumes:
    - "./app:/src"
  ports:
    - "3000:3000"
    - "35729:35729"
  build: ./app
This allowed me to start up the container and view the app. However livereload didn't work when saving files in the mounted volume and webpack created several .json.gzip files in the src directory.
Any suggestions for getting this working correctly?
Yeah, as aholbreich mentioned, I'd use npm install / npm start locally on my machine for development, just because it's so easy. It's probably possible with docker-compose, mounting volumes etc. too, but I think it could be a bit fiddly to set up.
For deployment you can then very easily use a Dockerfile. Here's an example Dockerfile I'm using:
FROM node:6.9
# Create app directory
RUN mkdir -p /src/app
WORKDIR /src/app
# to make npm test run only once non-interactively
ENV CI=true
# Install app dependencies
COPY package.json /src/app/
RUN npm install && \
    npm install -g pushstate-server
# Bundle app source
COPY . /src/app
# Build and optimize react app
RUN npm run build
EXPOSE 9000
# defined in package.json
CMD [ "npm", "run", "start:prod" ]
You need to add the start:prod option to your package.json:
"scripts": {
  "start": "react-scripts start",
  "start:prod": "pushstate-server build",
  "build": "react-scripts build",
  "test": "react-scripts test --env=jsdom",
  "eject": "react-scripts eject"
},
You can run the tests on your CI service with:
docker run <image> npm test
There's nothing stopping you from running this docker container locally as well to make sure things work as expected.
I recently made a small project called hello-docker-react which does just what the OP is looking for.
It's made with docker-compose, create-react-app, yarn, a node image, and a small entrypoint script.
Live reload works flawlessly and I haven't found any problems yet.
https://github.com/lopezator/hello-docker-react
Here is a good guide for this:
https://mherman.org/blog/dockerizing-a-react-app/
for development
# base image
FROM node:9.6.1
# set working directory
RUN mkdir /usr/src/app
WORKDIR /usr/src/app
# add `/usr/src/app/node_modules/.bin` to $PATH
ENV PATH /usr/src/app/node_modules/.bin:$PATH
# install and cache app dependencies
COPY package.json /usr/src/app/package.json
RUN npm install --silent
RUN npm install react-scripts@1.1.1 -g --silent
# start app
CMD ["npm", "start"]
for production
# build environment
FROM node:9.6.1 as builder
RUN mkdir /usr/src/app
WORKDIR /usr/src/app
ENV PATH /usr/src/app/node_modules/.bin:$PATH
COPY package.json /usr/src/app/package.json
RUN npm install --silent
RUN npm install react-scripts@1.1.1 -g --silent
COPY . /usr/src/app
RUN npm run build
# production environment
FROM nginx:1.13.9-alpine
COPY --from=builder /usr/src/app/build /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
Not exactly a direct improvement of the author's code, but I was able to get a development environment working with very little code - and no direct dependency on node on my machine - like this:
docker-compose.yml
services:
  node:
    image: node:16
    user: "node"
    command: "npm start"
    working_dir: /app
    volumes:
      - ./:/app
    ports:
      - 3000:3000
This way, you avoid creating docker images from a Dockerfile.
Usage is generally like this:
install dependencies before running: docker compose run node npm install
run development environment: docker compose up
install new dependencies: docker compose run node npm install [package name]
clean up docker instances created with compose run: docker compose rm
While using Docker in development with create-react-app, I discovered that it is possible to override the webpackDevServer configuration by adding CHOKIDAR_USEPOLLING=1 to your .env file. This will make file watching work again. It even refreshes the browser page on the host! The only thing I noticed is that it doesn't open a webpage automatically.
I can also advise adding tty: true to your service to get your original console output back in your terminal. To remove the container name prefixes in the logs, you can run something like this after docker-compose up -d:
docker-compose logs -f --tail=100 client | cut -f2 -d "|"
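Putting both tips together, a service definition might look like this (a sketch; the service name and paths are assumptions):

```yaml
# The project root's .env file just needs the line: CHOKIDAR_USEPOLLING=1
# (create-react-app reads .env automatically).
services:
  client:
    build: .
    tty: true          # brings the original console output back
    volumes:
      - ./:/app
    ports:
      - "3000:3000"
```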
Running with CRA 4.0 and many dependencies
.dockerignore
.git
.gitignore
node_modules
build
Dockerfile.dev
FROM node:alpine
WORKDIR /app
COPY package.json /app
RUN yarn install
COPY . .
CMD ["yarn", "start"]
docker-compose.dev.yml
version: "3.8"
services:
  print:
    stdin_open: true
    build:
      context: .
      dockerfile: Dockerfile.dev
    ports:
      - "3000:3000"
    volumes:
      - ".:/app"
      - "/app/node_modules"
Dockerfile.prod
FROM node:alpine as build
WORKDIR /app
COPY package.json /app
RUN yarn install
COPY . /app
RUN yarn run build
FROM nginx:stable-alpine
COPY ./nginx/nginx.conf /etc/nginx/conf.d/default.conf
COPY --from=build /app/build /usr/share/nginx/html
docker-compose.prod.yml
version: "3.8"
services:
  print:
    stdin_open: true
    build:
      context: .
      dockerfile: Dockerfile.prod
    ports:
      - "80:80"
nginx.conf
server {
  listen 80;
  server_name frontend;
  location / {
    root /usr/share/nginx/html;
    index index.html;
    try_files $uri /index.html;
  }
}
To run:
docker-compose.exe -f .\docker-compose.prod.yml up --build
or
docker-compose.exe -f .\docker-compose.dev.yml up --build
Here is a simple (pure docker) solution without local installation of runtime (e.g. node):
cd /tmp
docker run -it --rm -v "$PWD":/app -w /app node yarn create react-app my-app
sudo chown -R $USER:root my-app/
cd my-app
nano docker-compose.yml # see docker-compose.yml below
docker compose up -d
docker-compose.yml:
services:
  node:
    image: node:16-alpine
    environment:
      - CHOKIDAR_USEPOLLING=true
      - FAST_REFRESH=true
    working_dir: /app
    ports:
      - '3000:3000'
    command: "yarn start"
    volumes:
      - './:/app'
Open localhost:3000 in your browser. Hot reload should work out of the box.
