Docker | Google Cloud - Fails when react start - reactjs

I'm trying to dockerize a MERN stack, but when the time comes for React to start, the container exits with status 0.
This is the structure of my project:
- project
  - server
    - api
      api.yml
    server.js
    Dockerfile
  - www
    app.yml
    Dockerfile
  docker-compose.yml
The www folder contains the starter files generated by npx create-react-app www.
The content of server/Dockerfile is:
FROM node:latest
RUN mkdir -p /usr/server
WORKDIR /usr/server
RUN npm install -g nodemon
EXPOSE 3000
CMD [ "npm", "start" ]
The content of www/Dockerfile is:
FROM node:latest
RUN mkdir -p /usr/www/src/app
WORKDIR /usr/www/src/app
EXPOSE 3000
CMD [ "npm", "start" ]
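Worth noting (an editorial aside, not part of the original post): neither Dockerfile copies package.json or runs npm install, so npm start can only work if node_modules already exists in the bind-mounted host folder. A more self-contained sketch of www/Dockerfile might look like this:

```dockerfile
FROM node:latest
WORKDIR /usr/www/src/app
# Install dependencies in the image so the container doesn't rely on a
# host-side node_modules arriving through the bind mount.
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD [ "npm", "start" ]
```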
And finally, the content of docker-compose.yml is:
version: '3.7'
services:
  mongodb:
    image: mongo
    ports:
      - 27017:27017
  api:
    build: ./server/
    ports:
      - "6200:6200"
    volumes:
      - ./server:/usr/server
    depends_on:
      - mongodb
  www:
    build: ./www/
    ports:
      - 3000:3000
    volumes:
      - ./www:/usr/www/src/app
    depends_on:
      - api
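Since the log isn't shown, one assumption worth checking: an exit with status 0 is a known create-react-app behavior, because react-scripts start exits cleanly when no interactive terminal is attached. A minimal sketch of the usual workaround in the compose file:

```yaml
# Sketch (assumed fix, matching common create-react-app behavior):
# keep stdin open and allocate a TTY so `react-scripts start`
# doesn't exit immediately with status 0.
www:
  build: ./www/
  stdin_open: true   # equivalent of `docker run -i`
  tty: true          # equivalent of `docker run -t`
  ports:
    - 3000:3000
  volumes:
    - ./www:/usr/www/src/app
  depends_on:
    - api
```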
Now, in the title I mention Google Cloud because I tried to deploy the separate "server" and "www" parts to my production environment. The deployment of server works correctly, but www fails with an error.
The errors generated by Docker and Google Cloud seem very similar, or am I wrong? Could it be a React problem, or am I wrong in both cases?
I also leave the contents of the app.yaml and api.yaml files.
app.yaml
runtime: nodejs
env: flex

# Only for developing
manual_scaling:
  instances: 1

resources:
  cpu: 1
  memory_gb: 0.5
  disk_size_gb: 10

handlers:
- url: /.*
  static_files: build/index.html
  upload: build/index.html
- url: /
  static_dir: build
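Two cautions about this app.yaml (editorial observations, worth verifying against the current App Engine docs): handlers match top to bottom, so the catch-all /.* entry shadows the / entry below it; and static_files / static_dir handlers are a standard-environment feature, while env: flex expects the Node process itself to serve the files. If the standard environment were used instead, the usual create-react-app ordering looks like this (runtime name illustrative):

```yaml
# Sketch (assumes the App Engine *standard* environment):
# serve real static assets first, then fall back to index.html
# for client-side routes.
runtime: nodejs20
handlers:
- url: /static
  static_dir: build/static
- url: /.*
  static_files: build/index.html
  upload: build/index.html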
The content of the api.yaml file is the same as that of app.yaml but without the handlers section.

Related

Traefik Django & React setup

Recently I came across server configuration using GitLab CI/CD and docker-compose. I have two separate repositories on GitLab: one for Django and the other for React JS.
The Django Repo contains the following production.yml file:
version: '3'

volumes:
  production_postgres_data: {}
  production_postgres_data_backups: {}
  production_traefik: {}

services:
  django: &django
    build:
      context: .
      dockerfile: ./compose/production/django/Dockerfile
    image: one_sell_production_django
    platform: linux/x86_64
    expose: # new
      - 5000
    depends_on:
      - postgres
      - redis
    env_file:
      - ./.envs/.production/.django
      - ./.envs/.production/.postgres
    command: /start
    labels: # new
      - "traefik.enable=true"
      - "traefik.http.routers.django.rule=Host(`core.lwe.local`)"
  postgres:
    build:
      context: .
      dockerfile: ./compose/production/postgres/Dockerfile
    image: one_sell_production_postgres
    expose:
      - 5432
    volumes:
      - production_postgres_data:/var/lib/postgresql/data:Z
      - production_postgres_data_backups:/backups:z
    env_file:
      - ./.envs/.production/.postgres
  traefik: # new
    image: traefik:v2.2
    ports:
      - 80:80
      - 8081:8080
    volumes:
      - "./compose/production/traefik/traefik.dev.toml:/etc/traefik/traefik.toml"
      - "/var/run/docker.sock:/var/run/docker.sock:ro"
  redis:
    image: redis:6
This works perfectly using Traefik. I also have the following code for the React JS repo:
version: '3.8'
services:
  frontend:
    build:
      context: ./
      dockerfile: Dockerfile
    expose:
      - 3000
    labels: # new
      - "traefik.enable=true"
      - "traefik.http.routers.django.rule=Host(`lwe.local`)"
    restart: 'always'
    env_file:
      - .env
Now I don't know how to connect the Django and React JS repos through Traefik, or what the CI/CD configuration should look like. The following is the CI/CD configuration for the Django repo (I omitted unnecessary info and only include the deploy stage):
deploy:
  stage: deploy
  tags:
    - docker
  when: always
  before_script:
    - mkdir -p .envs/.production/
    - touch .envs/.production/.django
    - touch .envs/.production/.postgres
    - touch .env
    - chmod +x ./setup_env.sh
    - sh setup_env.sh
    - less .envs/.production/.django
    - less .envs/.production/.postgres
    - docker-compose -f production.yml build
    - docker-compose -f production.yml run --rm django python manage.py migrate
  script:
    - docker-compose -f local.yml up -d
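One common pattern for wiring two compose projects together (a sketch, not from the original post) is a shared external Docker network, so the Traefik container in the Django stack can reach the frontend container from the React stack. Note also that both files above name their router django; router names must be unique, so the React side should use its own. Assuming the network is created once with docker network create traefik-public:

```yaml
# Sketch (names are assumptions): both stacks join one external network.

# Django repo, production.yml (excerpt):
services:
  traefik:
    networks:
      - traefik-public
networks:
  traefik-public:
    external: true

# React repo, compose file (excerpt) -- note the unique router name:
services:
  frontend:
    networks:
      - traefik-public
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.frontend.rule=Host(`lwe.local`)"
networks:
  traefik-public:
    external: true
```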

React App doesn't refresh on changes using Docker-Compose

Consider the Docker compose
version: '3'
services:
  frontend:
    build:
      context: ./frontend
    container_name: frontend
    command: npm start
    stdin_open: true
    tty: true
    volumes:
      - ./frontend:/usr/app
    ports:
      - "3000:3000"
  backend:
    build:
      context: ./backend
    container_name: backend
    command: npm start
    environment:
      - PORT=3001
      - MONGO_URL=mongodb://api_mongo:27017
    volumes:
      - ./backend/src:/usr/app/src
    ports:
      - "3001:3001"
  api_mongo:
    image: mongo:latest
    container_name: api_mongo
    volumes:
      - mongodb_api:/data/db
    ports:
      - "27017:27017"
volumes:
  mongodb_api:
And the React Dockerfile:
FROM node:14.10.1-alpine3.12
WORKDIR /usr/app
COPY package.json .
RUN npm i
COPY . .
Folder structure:
-frontend
-backend
-docker-compose.yml
When I change files inside src it doesn't reflect on the Docker side.
How can we fix this?
Here is the answer:
If you are running on Windows, please read this: Create-React-App has some issues detecting when files get changed on Windows based machines. To fix this, please do the following:
In the root project directory, create a file called .env
Add the following text to the file and save it: CHOKIDAR_USEPOLLING=true
That's all!
Don't use the same directory name for different services. Instead of /usr/app for both, use something like /client/app for the client and /server/app for the backend; then it all works. Also set environment: - CHOKIDAR_USEPOLLING=true, use FROM node:16.5.0-alpine, and you can use stdin_open: true.
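Putting those suggestions together, the frontend service might look like this (paths and versions are illustrative, not from the answer above):

```yaml
# Sketch combining the suggestions above (paths are assumptions):
services:
  frontend:
    build: ./frontend
    stdin_open: true                 # keep react-scripts from exiting
    environment:
      - CHOKIDAR_USEPOLLING=true     # force the file watcher to poll the bind mount
    volumes:
      - ./frontend:/client/app       # distinct path per service, as suggested
      - /client/app/node_modules     # keep image-installed deps out of the mount
    ports:
      - "3000:3000"
```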

Docker Compose with React and Nginx

I'm trying to use docker-compose for deployment of my React app, which uses an Express backend and a Postgres database. My idea is to have shared volumes in my docker-compose, then build from my Dockerfile into the volume so that Nginx can serve the files. The problem is that it works when I build the project the first time, but if I change something in my React client and run "docker-compose up --build", everything looks like it builds as it should, yet the files served are still the same. Is the COPY command in my Dockerfile not overwriting the old files?
Dockerfile in my React Client Project
FROM node:13.12.0-alpine as build
WORKDIR /app
COPY package.json ./
COPY package-lock.json ./
RUN npm install
COPY . ./
RUN npm run build
FROM node:13.12.0-alpine
COPY --from=build /app/build /var/lib/frontend
docker-compose
version: "3.7"
services:
  callstat_backend:
    build: ./callstat-backend
    restart: always
    ports:
      - "3000:3000"
    env_file:
      - keys.env
    depends_on:
      - postgres
  callstat_frontend:
    build: ./callstat-client
    volumes:
      - frontend/:/var/lib/frontend
  postgres:
    image: postgres:11.2-alpine
    ports:
      - "5432:5432"
    volumes:
      - pgdata:/var/lib/postgresql/data
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: callstat
  nginx:
    image: nginx
    volumes:
      - frontend:/usr/share/nginx/html
      - ./nginx.conf:/etc/nginx/conf.d/default.conf
    ports:
      - "80:80"
    depends_on:
      - callstat_frontend
volumes:
  pgdata:
  frontend:
Maybe I'm taking a totally wrong approach here?
You can run the commands in the following order:
# stop down the services
docker-compose stop
# remove the previously created docker resources
docker-compose rm
# bring up the services again
docker-compose up --build
This way your previous volume will be removed and a new one will be created with the updated changes.
NOTE: This is okay from the development perspective, but Docker volumes are really expected to persist between deployments. For artifacts like code changes, images should ideally be published as part of the build process. To get a little more insight into this topic you can refer to https://github.com/docker/compose/issues/2127
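Following that note, an alternative sketch (not the poster's setup, and the nginx.conf path is an assumption) that avoids the stale-volume problem entirely: bake the build into an nginx image, so every `--build` ships fresh files without any shared volume.

```dockerfile
# Sketch: multi-stage build that serves the React build from nginx itself.
FROM node:13.12.0-alpine AS build
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm install
COPY . ./
RUN npm run build

FROM nginx:alpine
# Path to nginx.conf is illustrative; adjust to your repo layout.
COPY nginx.conf /etc/nginx/conf.d/default.conf
COPY --from=build /app/build /usr/share/nginx/html
```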

Angular in docker compose does not reload changes

I am new to Docker and I am trying to make an application using django-rest and Angular. My current docker-compose file looks like this:
version: '3'
services:
  db:
    image: postgres
    environment:
      - POSTGRES_DB=pirate
      - POSTGRES_USER=django
      - POSTGRES_PASSWORD=secreat
    volumes:
      - db-data:/var/lib/postgresql/data
    ports:
      - "5432:5432"
  backend:
    entrypoint: /entrypoint.sh
    build: ./backend
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
    healthcheck:
      test: ["CMD", "curl", "--fail", 'http://localhost:8000']
      interval: 10s
      timeout: 5s
      retries: 3
  frontend:
    build: ./frontend
    volumes:
      - .:/app
    healthcheck:
      test: ["CMD", "curl", "--fail", 'http://localhost:4200']
      interval: 10s
      timeout: 5s
      retries: 3
    ports:
      - "4200:4200"
  nginx:
    build: ./nginx
    healthcheck:
      test: ["CMD", "curl", "--fail", 'http://localhost']
      interval: 10s
      timeout: 5s
      retries: 3
    ports:
      - "80:80"
    links:
      - frontend
volumes:
  db-data:
And this is my angular Dockerfile:
FROM node:8.6
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY package.json /usr/src/app
RUN npm install
# Here starts angular cli workaround
USER node
RUN mkdir /home/node/.npm-global
ENV PATH=/home/node/.npm-global/bin:$PATH
ENV NPM_CONFIG_PREFIX=/home/node/.npm-global
RUN npm install -g @angular/cli
# Here ends
COPY . /usr/src/app
CMD ["npm", "start"]
And now this is the problem: whenever I change something in my Angular code, the Docker image with Angular does not reload the changes. I don't know what I am doing wrong.
The problem is related to how the filesystem works in Docker. To fix this, I suggest you perform hot reloads (you have to add EXPOSE 49153 in the Dockerfile and ports: - '49153:49153' in docker-compose.yml).
There are other solutions like inotify or nodemon, but they require the --poll option when you start your application. The problem is that they keep polling the fs for changes, and if the application is big your machine will be a lot slower than you'd like.
I think I found the issue. You copy . into /usr/src/app, but you're mounting .:/app as a volume. This means that if you get into your Docker instance you'll find your application in 2 places: /app and /usr/src/app.
To fix this you should have this mapping: .:/usr/src/app
Btw, you're going to use the node_modules from your host and this might create some issues. To avoid this you can add an empty volume mapping: /usr/src/app/node_modules
If you get inside your running container, you'll find that the folder app exists twice. You can try it, by executing:
docker exec -it $instanceName /bin/sh
ls /app
ls /usr/src/app
The problem is that only the content of /app changes during your coding, while your application is currently executing the content of /usr/src/app which remains always the same.
Your frontend in the docker-compose should look like this:
frontend:
  build: ./frontend
  volumes:
    - .:/usr/src/app
    - /usr/src/app/node_modules
I came across the same issue in Docker Desktop for Windows. I know it has been a while, but for anybody who came here looking for an answer like me, you should follow these steps.
Modify the start command to "start": "ng serve --host 0.0.0.0 --poll 500" in the scripts section of package.json. (Here the number 500 means the client will check every 500 milliseconds whether a change has been made; you can reduce this number.)
Make sure port 49153 is exposed in Dockerfile (use correct node version)
FROM node:10.16.3-alpine
WORKDIR /app
COPY package.json .
RUN npm install
COPY . .
EXPOSE 4200 49153
CMD npm run start
Map ports and volumes in docker-compose.yml
version: "3.7"
services:
  webapp:
    build: .
    ports:
      - "4200:4200"
      - "49153:49153"
    volumes:
      - "/app/node_modules"
      - ".:/app"
After that running docker-compose up will build an image and spin up a new container which will automatically reload with new changes.

Docker compose with subdirectory and live reload

I created an app using create-react-app and set up docker compose to set up the container and start the app. When the app is in the root directory, the app starts and the live reload works. But when I move the app to a subdirectory, I can get the app to start, but the live reload does not work.
Here's the working setup:
Dockerfile
FROM node:7.7.2
ADD . /code
WORKDIR /code
RUN npm install
EXPOSE 3000
CMD npm start
docker-compose.yml
version: "2"
services:
  client:
    build: .
    ports:
      - "3000:3000"
    volumes:
      - .:/code
Directory structure
app
  - node_modules
  - docker-compose
  - Dockerfile
  - package.json
  - src
  - public
Here's the structure that I would like:
app
  - server
  - client
    / node_modules
    / Dockerfile
    / package.json
    / src
    / public
  - docker-compose.yml
I've tried every variation that I can think of, but the live reload will not work.
The first thing I had to do was change the build location:
version: "2"
services:
  client:
    build: ./client
    ports:
      - "3000:3000"
    volumes:
      - .:/code
Then I got an error when trying to run docker-compose up:
npm ERR! enoent ENOENT: no such file or directory, open '/code/package.json'
So I changed the volume to - .:/client/code and rebuilt and ran the command and the app started, but no live reload.
Anyway to do this when the app is in a subdirectory?
There's no difference to the paths inside the container when you move your local directory. So you only need to change the local references.
The volume mount should come from ./client
version: "2"
services:
  client:
    build: ./client
    ports:
      - "3000:3000"
    volumes:
      - ./client:/code
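One addition worth considering (my sketch, not part of the accepted answer): if node_modules was installed into the image at build time, the ./client bind mount will hide it unless client/node_modules also exists on the host. An anonymous volume keeps the image's copy visible:

```yaml
# Sketch: preserve image-installed dependencies under the bind mount.
services:
  client:
    build: ./client
    ports:
      - "3000:3000"
    volumes:
      - ./client:/code
      - /code/node_modules   # anonymous volume shields node_modules
```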
