Dockerizing a MERN stack app: how to define ports

I want to dockerize my MERN app. Here is the Dockerfile for my frontend:
FROM node:18.8-alpine
COPY . ./app
WORKDIR /app
RUN npm install
EXPOSE 3000
CMD ["npm", "start"]
And here is the Dockerfile for my backend:
FROM node:18.8-alpine
COPY . ./app
WORKDIR /app
RUN npm install
EXPOSE 3000
CMD ["node", "server.js"]
I also want to use docker-compose to run the frontend and backend together, and this is the config file:
version: '3.8'
services:
  backend:
    build: ./backend
    container_name: backend_C
    ports:
      - 80:80
  frontend:
    build: ./frontend
    container_name: frontend_C
    ports:
      - 3000:3000
    stdin_open: true
    tty: true
The problem is that since both my backend and frontend run on port 3000, there is a conflict when I run the images. I don't know how to specify different ports for them.

You can simply map different host ports onto the same container port:
version: '3.8'
services:
  backend:
    build: ./backend
    container_name: backend_C
    ports:
      - 3000:3000
  frontend:
    build: ./frontend
    container_name: frontend_C
    ports:
      - 8080:3000
    stdin_open: true
    tty: true
Now your backend is reachable on host port 3000 and your frontend on host port 8080, while inside their containers both still listen on port 3000. Only the host side of each ports mapping has to be unique; each container has its own network namespace, so identical container-side ports never conflict.
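If you would rather change the port the dev server itself listens on instead of just the host mapping, note that Create React App's react-scripts reads a PORT environment variable at startup. A minimal sketch, assuming the frontend is a create-react-app project (the 3001 value is an arbitrary choice, not from the question):

frontend:
  build: ./frontend
  container_name: frontend_C
  ports:
    - 8080:3001   # host 8080 -> container 3001
  environment:
    - PORT=3001   # react-scripts honors PORT at startup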

Related

Uncaught Error: Cannot find module '@blueprintjs/table'

I need help. I have a project created with Docker, and I run it with docker-compose up, but I get Uncaught Error: Cannot find module '@blueprintjs/table'. The project consists of three containers, and the problem is only in the skynet.refinedev container. Please tell me how to solve this. I have tried installing @blueprintjs/table via npm and with docker-compose exec skynet.refinedev npm install @blueprintjs/core --save.
docker-compose.yml:
services:
  skynet.nextjs:
    container_name: skynet.nextjs
    stdin_open: true
    build:
      context: ./nextjs-test
      dockerfile: Dockerfile
    volumes:
      - ./nextjs-test:/nextjs-test
      - /nextjs-test/node_modules/
    ports:
      - '${FORWARD_DASHBOARD_PORT:-3004}:3002'
    command: npm run dev
    networks:
      - skynet.frontend
  skynet.refinedev:
    container_name: skynet.refinedev
    stdin_open: true
    build:
      context: ./refinedev-test
      dockerfile: Dockerfile
    volumes:
      - ./refinedev-test:/refinedev-test
      - /refinedev-test/node_modules/
    ports:
      - '${FORWARD_FRONTEND_PORT:-3005}:3001'
      # - '${FORWARD_FRONTEND_PORT:-3007}:3007'
      # - '${FORWARD_FRONTEND_PORT:-3008}:3008'
    command: npm run start
    networks:
      - skynet.frontend
  skynet.velzon:
    container_name: skynet.velzon
    stdin_open: true
    build:
      context: ./velzon-reactstrap
      dockerfile: Dockerfile
    volumes:
      - ./velzon-reactstrap:/velzon-reactstrap
      - /velzon-reactstrap/node_modules/
    ports:
      - '${FORWARD_TEMPLATE_PORT:-3006}:3000'
    command: npm run start
    networks:
      - skynet.frontend
networks:
  skynet.frontend:
    driver: bridge
  ch_ntw:
    driver: bridge
    ipam:
      config:
        - subnet: 10.222.1.0/24
volumes:
  sailmysql:
    driver: local
  sailredis:
    driver: local
  sailmeilisearch:
    driver: local
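One likely culprit, given this file: the anonymous volume /refinedev-test/node_modules/ pins the node_modules that existed when the container was first created, and compose carries that volume over when it recreates the container, so even rebuilding the image doesn't refresh it. A sketch of the usual remedy, assuming @blueprintjs/table is listed in refinedev-test/package.json so the Dockerfile's npm install can pick it up:

# rebuild the image so npm install runs against the updated package.json
docker-compose build skynet.refinedev
# recreate the container and discard its old anonymous volumes
docker-compose up -d --renew-anon-volumes skynet.refinedev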

How can I exclude node_modules from shared volumes in a Docker container?

I am working on a MERN stack project and am trying to use Docker for both the development and production versions of the project. I have created a docker-compose file for each mode (dev, prod), with three services (frontend, backend, database). Everything connects correctly and works just fine, but to publish changes in development mode I use volumes, and since I am a Windows user, the node_modules in my project folder and the node_modules in the container (which are Linux builds of the same packages) conflict and generate an error. I am providing my docker-compose file as well.
docker-compose.yml
services:
  devengers:
    container_name: devengers-root
    build:
      context: .
      dockerfile: Dockerfile.development
    image: devengers
  backend:
    container_name: devengers-backend
    image: devengers
    ports:
      - 3000:3000
    environment:
      - MONGODB_URL=mongodb://database:27017
    networks:
      - local_net
    depends_on:
      - devengers
      - database
    command: npm run start:dev
    volumes:
      - ".:/Devengers"
  frontend:
    container_name: devengers-frontend
    image: devengers
    ports:
      - 8080:8080
    environment:
      - API=http://backend:3000
    networks:
      - local_net
    depends_on:
      - backend
      - database
    command: npm run dev
    volumes:
      - ".:/Devengers"
  database:
    container_name: devengers-database
    image: mongo:4.0-xenial
    ports:
      - 27017:27017
    networks:
      - local_net
    volumes:
      - mongodb_data:/data/db
networks:
  local_net:
volumes:
  mongodb_data:
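The usual fix is to add an anonymous volume for the node_modules path after the bind mount; the more specific mount shadows that directory, so the Linux-built node_modules baked into the image stays in place and the Windows host copy never leaks in. A minimal sketch against the backend service above, assuming the image's build step installs dependencies at /Devengers/node_modules (the frontend service would get the same extra line):

backend:
  volumes:
    - ".:/Devengers"
    - "/Devengers/node_modules"   # anonymous volume masks the host's node_modules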

TemplateDoesNotExist at /: index.html in Docker | React with Django

Why isn't my React build folder served after running the python manage.py collectstatic command in the Dockerfile? I have tried for a long time to dockerize my whole project, but I failed to collect the static files. Please have a look at where I went wrong.
This is my backend/Dockerfile inside the Django project:
FROM python:3.9.5-slim
ENV PYTHONUNBUFFERED 1
RUN pip install --upgrade pip
RUN apt-get update && apt-get install build-essential python-dev -y
COPY requirements.txt requirements.txt
RUN pip3 install -r requirements.txt && pip3 install uwsgi
WORKDIR /django
COPY . .
CMD ['RUN', 'python', 'manage.py', 'collectstatic', '--noinput']
EXPOSE 8000
CMD ["uwsgi", "--http", ":9090", "--wsgi-file", "IMS/wsgi.py", "--master", "--processes", "4", "--threads", "2"]
And this is my frontend/Dockerfile inside my React folder:
FROM node:13.12.0-alpine
WORKDIR /react
COPY . .
RUN npm install
RUN npm run build
EXPOSE 3000
And finally, this is my docker-compose.yml file, where I set up three services:
version: "3.3"
services:
backend:
build:
context: ./backend/
command: gunicorn IMS.wsgi --bind 0.0.0.0:8000
container_name: ims-backend
restart: always
ports:
- "8000:8000"
networks:
- ims-network
ims-postgres:
image: "postgres:14beta2-alpine3.14"
restart: always
container_name: ims-postgres
hostname: emrdsaws.czlz2b677ytu.ap-south-1.rds.amazonaws.com
environment:
- POSTGRES_PASSWORD=zkf!%uW_?&UG%4
- POSTGRES_DB=imsrds
- POSTGRES_USER=dbrdsubuntume12
ports:
- 5432
frontend:
build:
context: ./frontend/
volumes:
- react_build:/react/build
expose:
- 3000
stdin_open: true
nginx:
image: nginx:latest
ports:
- "8080:8080"
volumes:
- ./nginx/nginx-setup.conf:/etc/nginx/conf.d/default.conf:ro
- react_build:/var/www/react
depends_on:
- backend
- frontend
- ims-postgres
volumes:
react_build:
networks:
ims-network:
driver: bridge
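For nginx to serve the build that the frontend container writes into the react_build volume, the mounted nginx-setup.conf has to point at /var/www/react and fall back to index.html for client-side routes. That file isn't shown in the question, so this is only a sketch of what it would minimally need (the /api/ proxy block and its prefix are assumptions about how the Django backend is reached):

server {
    listen 8080;

    # serve the React build shared through the react_build volume
    location / {
        root /var/www/react;
        try_files $uri /index.html;
    }

    # forward API calls to the Django backend service (assumed prefix)
    location /api/ {
        proxy_pass http://backend:8000;
    }
}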

Docker and React configuration [WINDOWS 10 home and VSCODE]

I can't get all the routes of my React app with the docker-compose up command; it only lets me access the default route. I can access all the routes successfully when I run npm locally. Am I missing something, maybe in the containerisation? Any ideas why this is happening?
Here's my .yml file
version: "3"
services:
client:
build:
context: ./client
dockerfile: Dockerfile
image: fc-client-app
restart: always
ports:
- "80:80"
volumes:
- /client-app/node_modules
- .:/client-app
depends_on:
- "server"
server:
build:
context: ./server
dockerfile: Dockerfile
image: fc-server-app
ports:
- "8080:8080"
volumes:
- /server-app/node_modules
- .:/server-app
The problem is with the client service.
And here's the Dockerfile of my client service:
FROM node:lts
WORKDIR /usr/src/client-app
ENV PATH /usr/src/client-app/node_modules/.bin:$PATH
COPY package*.json ./
RUN npm install
RUN npm install react-scripts@3.4.1 -g
COPY . .
EXPOSE 80
CMD ["npm", "start"]
You are exposing port 8080 in your client Dockerfile, but the port specified in docker-compose for your client service is 80; 8080 is for your server service. Please try changing the client port in your Dockerfile.

How do I mount my local React directory into my React docker container?

I'm trying to build a React 16.13.0 app, running in a Docker container (alongside a Django app). I would like to mount my local React directory so that my React docker container reads its files from there so that if I change a file on my local file system, it's automatically picked up by my React docker container. I have this docker-compose.yml file ...
version: '3'
services:
  ...
  client:
    build:
      context: ./client
    volumes:
      - /app/node_modules
      - ./client:/app
    ports:
      - '3001:3000'
    restart: always
    container_name: web-app
    environment:
      - NODE_ENV=development
      - REACT_APP_PROXY=http://localhost:9090
    #command: npm run start
    depends_on:
      - web
  ...
This is the Dockerfile in my React directory (client/Dockerfile):
FROM node:10-alpine AS alpine
# A directory within the virtualized Docker environment
# Becomes more relevant when using Docker Compose later
WORKDIR /usr/src/app
# Copies package.json and package-lock.json to Docker environment
COPY package*.json ./
# Installs all node packages
RUN npm install
# Finally runs the application
CMD [ "npm", "start" ]
Sadly, this doesn't seem to be working. Changes to my local file system are not getting reflected in my running Docker container. What else should I be doing?
Your Dockerfile seems OK. Here is a portion of a docker-compose.yml; note the environment variable CHOKIDAR_USEPOLLING=true at the bottom.
version: '3.7'
services:
  react:
    container_name: react
    build:
      context: react/
      dockerfile: Dockerfile
    volumes:
      - './react:/app'
      - '/app/node_modules'
    stdin_open: true
    ports:
      - 3000:3000
    environment:
      - CHOKIDAR_USEPOLLING=true
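One caveat, in case the flag seems to have no effect: CHOKIDAR_USEPOLLING is read by react-scripts 4 and earlier, while react-scripts 5 (which moved to webpack 5) expects WATCHPACK_POLLING instead. Setting both is a harmless sketch if you're unsure which version you're on:

    environment:
      - CHOKIDAR_USEPOLLING=true   # read by react-scripts 4 and earlier
      - WATCHPACK_POLLING=true     # read by react-scripts 5 / webpack 5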
