Docker and React configuration [WINDOWS 10 home and VSCODE] - reactjs

I can't access all of my React routes when I start the app with docker-compose up; only the default route of the React app loads. All of the routes work fine when I run the app locally with npm. Am I missing something, maybe in the containerisation?
Any ideas why this is happening?
Here's my .yml file
version: "3"
services:
  client:
    build:
      context: ./client
      dockerfile: Dockerfile
    image: fc-client-app
    restart: always
    ports:
      - "80:80"
    volumes:
      - /client-app/node_modules
      - .:/client-app
    depends_on:
      - "server"
  server:
    build:
      context: ./server
      dockerfile: Dockerfile
    image: fc-server-app
    ports:
      - "8080:8080"
    volumes:
      - /server-app/node_modules
      - .:/server-app
The problem is with the client service.
And here is the Dockerfile for the client service:
FROM node:lts
WORKDIR /usr/src/client-app
ENV PATH /usr/src/client-app/node_modules/.bin:$PATH
COPY package*.json ./
RUN npm install
RUN npm install react-scripts@3.4.1 -g
COPY . .
EXPOSE 80
CMD ["npm", "start"]

Check which port your client app actually listens on versus what you publish in docker-compose. Your compose file maps 80:80 for the client service (8080:8080 is for the server), and the client Dockerfile exposes 80, but CMD ["npm", "start"] runs the react-scripts dev server, which listens on port 3000 by default unless you override it with the PORT environment variable. Please try changing the client port so the two line up.
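A minimal sketch of the two ways to line the ports up, assuming the client really is served by the stock react-scripts dev server (the service name comes from your compose file; PORT is CRA's standard override):

  # Option 1: publish host port 80 to the dev server's default port 3000
  client:
    ports:
      - "80:3000"

  # Option 2: keep 80:80 and tell react-scripts to listen on port 80
  client:
    ports:
      - "80:80"
    environment:
      - PORT=80

Note that EXPOSE in the Dockerfile is documentation only; what matters is the port the process listens on and the mapping under ports: in docker-compose.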

Related

Dockerizing a MERN stack app, how to define ports

I want to dockerize my MERN app, here is the dockerfile for my frontend:
FROM node:18.8-alpine
COPY . ./app
WORKDIR /app
RUN npm install
EXPOSE 3000
CMD ["npm", "start"]
and here is the dockerfile for my backend:
FROM node:18.8-alpine
COPY . ./app
WORKDIR /app
RUN npm install
EXPOSE 3000
CMD ["node", "server.js"]
I also want to use docker-compose to run both frontend and backend together and this is the config file:
version: '3.8'
services:
  backend:
    build: ./backend
    container_name: backend_C
    ports:
      - 80:80
  frontend:
    build: ./frontend
    container_name: frontend_C
    ports:
      - 3000:3000
    stdin_open: true
    tty: true
The problem is that since both my backend and frontend run on port 3000, there is a conflict when I run my images. I don't know how to specify and change the ports for them.
You can simply map different host ports to the same container port:
version: '3.8'
services:
  backend:
    build: ./backend
    container_name: backend_C
    ports:
      - 3000:3000
  frontend:
    build: ./frontend
    container_name: frontend_C
    ports:
      - 8080:3000
    stdin_open: true
    tty: true
Now your backend is available on host port 3000 and your frontend on host port 8080.
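As a quick sanity check once both containers are up, you can hit the published host ports from your machine:

  # backend, published as 3000:3000
  curl http://localhost:3000
  # frontend (the CRA dev server), published as 8080:3000
  curl http://localhost:8080

Inside the Compose network the containers still reach each other by service name and container port (for example http://backend:3000 from the frontend), independently of the host-side mappings.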

How to point docker to run a React app and open in browser?

I am trying to run my React app in Docker, but I cannot connect to it. I can see the container is up and listening on 0.0.0.0:3000, but the app does not open in a browser. I am new to Docker and still figuring out how it works. How can I connect and open the app?
docker-compose.yml
version: "3"
services:
  node:
    build: .
    image: node:16
    container_name: myapp
    tty: true
    stdin_open: true
    command: bash
    restart: always
    working_dir: /app
    volumes:
      - ./:/app
    ports:
      - 3000:3000
Dockerfile
FROM node:16-alpine
WORKDIR /app
COPY package.json .
RUN npm install
COPY . .
CMD ["npm", "start"]
The container is running with port 3000 mapped:
e810d9f622c0 node:16 "docker-entrypoint.s…" 5 minutes ago Up 5 minutes 0.0.0.0:3000->3000/tcp, :::3000->3000/tcp
Your docker-compose file contains a number of misunderstandings:
- When you have both build: and image:, docker-compose will build an image based on your Dockerfile and tag it with the name you give in image:. However, you name yours node:16, which is not a good idea, since it could be mistaken for the official node image.
- You have both tty: and stdin_open: set to true. If you just want to run node, there is no need to run the container interactively.
- Your command: overrides the CMD defined in the Dockerfile, so your npm start command is no longer run.
- Setting working_dir: to /app is superfluous, since WORKDIR is already set to /app in the Dockerfile.
- Mapping a volume onto /app hides everything at that path inside the image. You have probably added it to get hot reload, but doing it this way is not a good idea, since it makes everything your Dockerfile builds into /app irrelevant.
That leaves us with:
version: "3"
services:
  node:
    build: .
    container_name: myapp
    restart: always
    ports:
      - 3000:3000
Run that and you should be able to go to http://localhost:3000/ and see your app.
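For completeness, a typical way to rebuild the image and start it (assuming docker-compose.yml sits next to the Dockerfile in the project root):

  # rebuild the image and start the service in the foreground
  docker-compose up --build
  # then open http://localhost:3000/ in your browser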

Connect with database - docker-compose up

I'm trying to write a single Docker Compose file that brings up a web app (React), an API (.NET Core 2.1) and an SQL Server instance.
When I initialize the database and run the API with the dotnet CLI, it works (using a connection string with Server=localhost). However, from what I've been googling, localhost does not work between containers, and when everything runs in containers I can't get my .NET Core API to connect to SQL Server.
Can anyone shed some light on what I am doing wrong?
I have this repo:
https://github.com/lucasgozzi/sagetest
I'm currently using a branch named 'docker'. Here are my Dockerfiles and compose file in case you don't want to clone the repo.
Backend dockerfile:
FROM mcr.microsoft.com/dotnet/core/aspnet:2.1
WORKDIR /app
FROM mcr.microsoft.com/dotnet/core/sdk:2.1
WORKDIR /src
COPY . .
RUN dotnet restore "./Api/Api.csproj"
RUN dotnet build "Api/Api.csproj" -c Release -o /app/build
RUN dotnet publish "Api/Api.csproj" -c Release -o /app/publish
EXPOSE 5000
WORKDIR /app/publish
ENTRYPOINT ["dotnet", "Api.dll"]
Frontend dockerfile:
# base image
FROM node:12.2.0-alpine
# set working directory
WORKDIR /app
EXPOSE 3000
# add `/app/node_modules/.bin` to $PATH
ENV PATH /app/node_modules/.bin:$PATH
# install and cache app dependencies
COPY . .
RUN npm install --silent
RUN npm install react-scripts -g --silent
# start app
CMD ["npm", "start"]
Docker compose:
version: '3.1'
services:
  api:
    container_name: "teste-sage-api"
    image: 'teste-sage-api'
    build:
      context: ./backend
      dockerfile: Dockerfile
    volumes:
      - ./backend:/var/www/backend
    ports:
      - "5000:5000"
    depends_on:
      - "database"
    networks:
      - sagetest-network
  web:
    container_name: "teste-sage-web"
    image: 'teste-sage-web'
    build:
      context: ./frontend_react
      dockerfile: Dockerfile
    ports:
      - "3000:3000"
    depends_on:
      - "api"
    networks:
      - sagetest-network
  database:
    container_name: "sql-server"
    image: "mcr.microsoft.com/mssql/server"
    environment:
      SA_PASSWORD: "Teste#123"
      ACCEPT_EULA: "Y"
    ports:
      - "1433:1433"
    networks:
      - sagetest-network
networks:
  sagetest-network:
    driver: bridge
You can access the database from your containers using the service name, which in your case is database (so the connection string should use Server=database instead of Server=localhost). But you also need to make sure the database container is up and ready before connecting to it; depends_on is not enough here, because it only waits for the container to start, not for SQL Server to be ready to accept connections. You may need to implement a wait-for step in your dotnet container. Check this for more info: https://docs.docker.com/compose/startup-order/
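If you want Compose itself to wait, newer Compose versions (the Compose specification, as used by docker compose v2) support a healthcheck on the database combined with the long form of depends_on. A rough sketch of the relevant services, assuming sqlcmd is available at /opt/mssql-tools/bin/sqlcmd inside the SQL Server image (the path varies between image versions):

  database:
    image: "mcr.microsoft.com/mssql/server"
    environment:
      SA_PASSWORD: "Teste#123"
      ACCEPT_EULA: "Y"
    healthcheck:
      # report healthy only once SQL Server answers a trivial query
      test: ["CMD-SHELL", "/opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P 'Teste#123' -Q 'SELECT 1' || exit 1"]
      interval: 10s
      timeout: 5s
      retries: 10
  api:
    build:
      context: ./backend
      dockerfile: Dockerfile
    depends_on:
      database:
        condition: service_healthy   # start the API only after the healthcheck passes

Alternatively (or additionally), have the API retry the SQL connection in a loop at startup, which is what the linked startup-order page recommends anyway for resilience.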

How do I mount my local React directory into my React docker container?

I'm trying to build a React 16.13.0 app running in a Docker container (alongside a Django app). I would like to mount my local React directory into the container so that when I change a file on my local file system, the change is automatically picked up by the React container. I have this docker-compose.yml file ...
version: '3'
services:
  ...
  client:
    build:
      context: ./client
    volumes:
      - /app/node_modules
      - ./client:/app
    ports:
      - '3001:3000'
    restart: always
    container_name: web-app
    environment:
      - NODE_ENV=development
      - REACT_APP_PROXY=http://localhost:9090
    #command: npm run start
    depends_on:
      - web
  ...
This is the Dockerfile in my React directory (client/Dockerfile) ...
FROM node:10-alpine AS alpine
# A directory within the virtualized Docker environment
# Becomes more relevant when using Docker Compose later
WORKDIR /usr/src/app
# Copies package.json and package-lock.json to Docker environment
COPY package*.json ./
# Installs all node packages
RUN npm install
# Finally runs the application
CMD [ "npm", "start" ]
Sadly, this doesn't seem to be working. Changes to my local file system are not getting reflected in my running Docker container. What else should I be doing?
The Dockerfile seems OK. Here is the relevant portion of a docker-compose.yml. Note the environment variable CHOKIDAR_USEPOLLING=true at the bottom.
version: '3.7'
services:
  react:
    container_name: react
    build:
      context: react/
      dockerfile: Dockerfile
    volumes:
      - './react:/app'
      - '/app/node_modules'
    stdin_open: true
    ports:
      - 3000:3000
    environment:
      - CHOKIDAR_USEPOLLING=true
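CHOKIDAR_USEPOLLING targets the chokidar-based watcher used by older react-scripts releases; if you later move to react-scripts 5 (webpack 5), the documented switch is WATCHPACK_POLLING instead (check the advanced configuration docs for your version). A sketch of an environment block covering both:

    environment:
      # react-scripts <= 4: chokidar watcher
      - CHOKIDAR_USEPOLLING=true
      # react-scripts 5 / webpack 5: watchpack watcher
      - WATCHPACK_POLLING=true

Polling is mainly needed when the source is bind-mounted from a Windows or macOS host, where native file-change events often do not propagate into the container.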

How to copy some data from one container to another container by docker-compose.yml

I am developing a React application. For deployment I want to use nginx as the web server. I have written a docker-compose file with two services (one for the React app and another for the nginx web server). The nginx service only needs the 'build' folder from the React project.
My question is: how can I copy the 'build' folder from the React container into the nginx container's directory while the React container is running?
Please take a look at the Dockerfiles and the yaml file.
docker-compose.yaml
version: "3"
services:
  nginx-server:
    image: nginx_server:dev
    container_name: nginx
    build:
      context: ./nginx
      dockerfile: Dockerfile
    restart: always
    # I want to do something like this:
    command: >
      sh -c "cp -R /build/ /var/www/html/"
    volumes:
      - .:/react_app_server/nginx
    ports:
      - 80:80
    depends_on:
      - react-app
    networks:
      - server_network
  react-app:
    container_name: my_react_app
    build:
      context: .
      dockerfile: ./Dockerfile
    image: my_react_app:dev
    tty: true
    volumes:
      - .:/react_app
    ports:
      - "1109:1109"
    networks:
      - frontend_network
    command: >
      bash -c "npm run-script build"
networks:
  frontend_network:
    driver: bridge
  server_network:
    driver: bridge
volumes:
  static-volume:
Dockerfile for React app
FROM node:10.16.3
RUN mkdir /app
WORKDIR /app
COPY . /app
ENV PATH /app/node_modules/.bin:$PATH
RUN npm install --silent
RUN npm install react-scripts@3.0.1 -g --silent
RUN npm run-script build
Dockerfile for Nginx
FROM nginx:1.16.1-alpine
RUN rm /etc/nginx/conf.d/default.conf
COPY /prod.conf /etc/nginx/conf.d
Project Directory
My_React_App
  build
  nginx
    Dockerfile
    prod.conf
  node_modules
  public
  src
  .dockerignore
  docker-compose.yaml
  Dockerfile
  package-lock.json
  package.json
  README.md
You don't actually need to copy the data; sharing the same volume should work.
You need to share a named volume between the two containers.
Your docker-compose.yaml should be:
version: "3"
services:
  nginx-server:
    image: nginx_server:dev
    container_name: nginx
    ...
    volumes:
      - .:/react_app_server/nginx
      - app-volume:/var/www/html/
    ...
  react-app:
    container_name: my_react_app
    build:
      context: .
      ...
    volumes:
      - .:/react_app
      - app-volume:/path/to/build/folder/
    ...
NOTE: Here app-volume is a named volume mounted at the directory inside the react-app container where the build folder is expected to be created. The same app-volume named volume is also mounted inside the nginx container at /var/www/html/, which is where you want the build folder to end up.
Alternatively, instead of a named volume you can mount the same host directory into both containers and share the data that way: -v /samepath/on/host:/path/to/build/folder and -v /samepath/on/host:/var/www/html.
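Putting it together, a fuller sketch of the named-volume approach might look like the one below. It is only illustrative: it assumes the build output ends up in /app/build (the WORKDIR from your React Dockerfile plus CRA's default build directory), drops the bind mounts for brevity, and adds the top-level volumes: declaration the named volume needs.

version: "3"
services:
  nginx-server:
    build:
      context: ./nginx
      dockerfile: Dockerfile
    ports:
      - 80:80
    volumes:
      # nginx serves whatever react-app wrote into the shared volume
      - app-volume:/var/www/html/
    depends_on:
      - react-app
  react-app:
    build:
      context: .
      dockerfile: ./Dockerfile
    command: >
      bash -c "npm run-script build"
    volumes:
      # the CRA build output directory is shared through the named volume
      - app-volume:/app/build
volumes:
  app-volume:

Make sure prod.conf points nginx's root at /var/www/html (or adjust the mount path to wherever your nginx config expects the static files).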
Hope this helps.
