Docker installation for running a Node and React environment

I am facing the following error while installing Docker:
Service 'webpack' failed to build: failed to register layer: open /var/lib/docker/aufs/layers/7e80462cf605c738f8d502a5d2707a4e4a7fb03daad65d0113240d9f1428df0f: no such file or directory
version: "2"
services:
webpack:
image : express-react-image
build: .
command: ./bin/webpack-dev
volumes:
- .:/src/app
environment:
- VIRTUAL_HOST=localhost
ports:
- "8080:8080"
networks:
- front_tier
server:
build: .
command: ./bin/start-web
volumes:
- .:/src/app
environment:
- VIRTUAL_HOST=localhost
- APP_HOST=http://localhost:3000
ports:
- "3000:3000"
networks:
- front_tier
volumes:
data:
driver: local
networks:
front_tier:
driver: bridge

The error message you're seeing, failed to register layer, means that Docker is failing while building an image because it can't find a cached layer it expects to find. The easiest way to resolve this is probably to remove all your cached layers and start the build from scratch. docker-compose rm might do the trick. If not, I'd start removing containers with docker rm and images with docker rmi until you've got a clean enough slate that it works.
If manual cleanup is slow or tedious, you might have better luck with docker system prune.
Regardless of how you get your cached layers cleaned up, I'd expect a fresh docker-compose up to work after that.
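Concretely, a possible cleanup sequence (a sketch, and destructive, since it deletes stopped containers and unused data):
# remove the compose project's stopped containers
docker-compose rm -f
# if that isn't enough, remove unused containers, networks,
# dangling images, and build cache (review the prompt first)
docker system prune
# then rebuild from scratch
docker-compose up --build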

Related

Docker React issue/bug? (Docker building React images with stale code)

Basically I've set up a web app with this stack: db: MySQL, frontend: React.js, backend: FastAPI (Python).
It's SSL-secured (the domain is proxied through Cloudflare).
NGINX is used for the service endpoints: api.domain.com is the API and domain.com is the frontend, plus some SSL key handling.
Even though this is, quote-unquote, "production", I'm still running a React development server for prototyping.
**PROBLEM:**
I'm running this project on a VPS. When I update the frontend on the VPS, I then delete all Docker containers and images via the commands:
docker rm -vf $(docker ps -aq) #Delete all containers and volumes
docker rmi -f $(docker images -aq) #Delete all images
The changes still come out stale. Caching in my browser is disabled as well, and I've tried multiple methods of clearing the cache; it's not that.
The webpack bundle.js file still has its old contents when served by React. Specifically, I edited some endpoint constants for production, and that constant is stale in bundle.js! When I change things for the backend (Python files), they work just fine and are updated on docker-compose up -d / docker-compose up, but React is acting up.
I've tried:
docker-compose pull
docker-compose build --no-cache
docker-compose
I made sure all my frontend files were in fact saved.
No luck...
The API is running fine on HTTPS and React.js is also running fine, but the changes are just stale; it's very weird.
docker-compose.yml
version: "3.9"
services:
db:
image: mysql:${MYSQL_VERSION}
restart: always
environment:
- MYSQL_DATABASE=${MYSQL_DB}
- MYSQL_USER=${MYSQL_USERNAME}
- MYSQL_PASSWORD=${MYSQL_PASSWORD}
- MYSQL_ROOT_PASSWORD=${MYSQL_ROOT_PASSWORD}
ports:
- "${MYSQL_PORT}:${MYSQL_PORT}"
expose:
- "${MYSQL_PORT}"
volumes:
- db:/var/lib/mysql
networks:
- mysql_network
backend:
container_name: fastapi-backend
build: ./backend/app
volumes:
- ./backend:/code
ports:
- "${FASTAPI_PORT}:${FASTAPI_PORT}"
env_file:
- .env
depends_on:
- db
networks:
- mysql_network
- backend
restart: always
frontend:
container_name: react-frontend
build: ./frontend/client
ports:
- "${REACT_PORT}:${REACT_PORT}"
depends_on:
- backend
networks:
- backend
restart: always
volumes:
db:
driver: local
networks:
backend:
driver: bridge
mysql_network:
driver: bridge
This was working for my last release; for some reason it just stopped updating. Docker is creating the images fine, just with stale code...
This is being run on Ubuntu 22.04 (a Debian-like Linux distribution).
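One thing worth ruling out here, though it goes beyond the commands in the question: docker rmi does not clear BuildKit's build cache, so a rebuild can still reuse cached layers containing the old bundle, and a Cloudflare-proxied domain can additionally serve a stale bundle.js from its edge cache until the zone cache is purged. A possible extra cleanup step:
# clear the build cache too, then rebuild and recreate from scratch
docker builder prune -af
docker-compose build --no-cache
docker-compose up -d --force-recreate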

Docker react image doesn't hot reload

Read all the posts around. Tried with a lower "react-scripts" (mine is 5.0.1), used CHOKIDAR_USEPOLLING: 'true', basically everything on the first two pages of Google.
Hot reloading still doesn't work. My docker-compose.yaml:
version: '3.3'
services:
  database:
    container_name: mysql
    image: mysql
    command: --default-authentication-plugin=mysql_native_password
    environment:
      MYSQL_ROOT_PASSWORD: toma123
      MYSQL_DATABASE: api
      MYSQL_USER: toma
      MYSQL_PASSWORD: toma123
    ports:
      - '4306:3306'
    volumes:
      - mysql-data:/var/lib/mysql
  php:
    container_name: php
    build:
      context: ./php
    ports:
      - '9000:9000'
    volumes:
      - ./../api:/var/www/api
    depends_on:
      - database
  nginx:
    container_name: nginx
    image: nginx:stable-alpine
    ports:
      - '8080:80'
    volumes:
      - ./../api:/var/www/api
      - ./nginx/default.conf:/etc/nginx/conf.d/default.conf
    depends_on:
      - php
      - database
  react:
    container_name: react
    build:
      context: ./../frontend
    ports:
      - '3001:3000'
    volumes:
      - node_modules:/home/app/node_modules
volumes:
  mysql-data:
    driver: local
  node_modules:
    driver: local
And my React Dockerfile:
FROM node
RUN mkdir -p /home/app
WORKDIR /home/app
COPY package*.json ./
RUN npm install
COPY . .
CMD ["npm", "start"]
Even if a lot of your application is in Docker, it doesn't mean you need to use Docker exclusively. Since one of Docker's primary goals is to prevent containers from accessing host files, it can be tricky to convince it to emulate a normal host live-development environment.
Install Node on your host. You likely have it anyway, or you can trivially apt-get install or brew install it.
Start everything except the front-end application you're developing using Docker; then start your application on the host in the normal way.
docker-compose up -d nginx
npm run dev
You may need to make a couple of configuration changes for this to work. For example, in this development environment the database address will be localhost:4306, but when deployed it will be database:3306, and you'll need to do things like configure Webpack to proxy backend requests to http://localhost:8080. You might set environment variables in your docker-compose.yml for this, and have your code default to the values used in the non-Docker development environment.
const dbHost = process.env.DB_HOST || 'localhost';
const dbPort = process.env.DB_PORT || 4306;
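For the Webpack proxy piece, a minimal sketch, assuming webpack-dev-server 4 (if this is Create React App, the "proxy" field in package.json plays the same role):
// webpack.config.js (sketch; the /api prefix is an assumption)
module.exports = {
  // ...the rest of your configuration...
  devServer: {
    proxy: {
      // forward browser requests for /api to the backend Compose publishes on 8080
      '/api': 'http://localhost:8080',
    },
  },
};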
In your Compose setup, do not mount volumes: over your application code or libraries. Once you get to the point of doing final integration testing on this code, build and run the actual image you're going to deploy. So the Compose section for the front end might look like:
version: '3.8'
services:
  react:
    build: ../frontend
    ports:
      - '3001:3000'
    # no volumes:
    # container_name: is also unnecessary

cannot dockerize react app: unable to connect to database

I am trying to dockerize a React app with a Postgres database.
I am new to Docker, so I followed tutorials online to come up with the Dockerfile and docker-compose shown below.
Dockerfile
# pull the official base image
FROM node:13.12.0-alpine
# set the working directory
WORKDIR /app
EXPOSE 1338
# add `/app/node_modules/.bin` to $PATH
ENV PATH /app/node_modules/.bin:$PATH
# install application dependencies
COPY package.json ./
COPY package-lock.json ./
RUN npm i
# add app
COPY . ./
# start app
CMD ["npm", "start"]
docker-compose.yml
version: '3.7'
services:
  sample:
    container_name: sample
    build:
      context: .
      dockerfile: ./Dockerfile
    volumes:
      - '.:/app'
      - '/app/node_modules'
    ports:
      - 1338:1338
    environment:
      - CHOKIDAR_USEPOLLING=true
      - ASPNETCORE_URLS=https://+:1338
      - ASPNETCORE_HTTPS_PORT=1338
    depends_on:
      - db
  db:
    container_name: db
    image: postgres:14-alpine
    restart: always
    ports:
      - "5432:5432"
    environment:
      POSTGRES_DB: ###
      POSTGRES_USER: ###
      POSTGRES_PASSWORD: ###
      # I hid this information for privacy purposes, but I am 100% sure I input it correctly.
    volumes:
      - ./db-data/:/var/lib/postgresql/data/
  adminer:
    image: adminer
    restart: always
    ports:
      - 8080:8080
volumes:
  pgdata1:
So what happened is: when I tried to run docker-compose up, the db part seemed to have no issue, since it logged database system is ready to accept connections. However, the "sample" part ended up with an error:
Server wasn't able to start properly.
error Error: connect ECONNREFUSED 127.0.0.1:5432
at TCPConnectWrap.afterConnect [as oncomplete]
which does not make much sense to me, since the database is already up, so there should not be any connection issue at all.
Feel free to share your view; any idea would be appreciated. Thank you.
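One likely culprit, though the server code isn't shown in the question: inside a container, 127.0.0.1 refers to that container itself, not to the db service, so the app has to connect to the Compose service name instead. A minimal sketch, assuming node-postgres and a hypothetical DB_HOST variable:
// db.js (sketch; assumes the pg package)
const { Pool } = require('pg');
const pool = new Pool({
  // use the Compose service name, not 127.0.0.1
  host: process.env.DB_HOST || 'db',
  port: 5432,
});
module.exports = pool;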

Docker file system not loading into localhost

Currently I am working on a full-stack application with a React frontend, MySQL DB, and Apache PHP instance. Something seems to be up with my changes going from my Docker container to localhost. I can write from my local machine -> Docker, but it seems like localhost is not reading React from my Docker container.
I know that my mount is working correctly (local machine -> Docker file system), because whenever I make changes in my IDE and save, then go and cat App.js within my Docker container, the changes are there.
Any insight would be helpful. I think what is happening is that Docker is taking a copy of the files upon creating the container, because whenever I remake the container, my changes go through to localhost.
p.s. I'm newish to Docker, so let me know if you need more information. Thanks!
docker-compose
version: "3.7"
services:
frontend:
container_name: frontend
build:
context: "./hartley_react"
dockerfile: Dockerfile
volumes:
- "./hartley_react:/app"
- "/app/node_modules"
ports:
- 3000:3000
stdin_open: true
environment:
- CHOKIDAR_USEPOLLING=true
command: npm start
php:
container_name: php
build:
context: "./dockerfiles/php-img/"
ports:
- "80:80"
volumes:
- ./src:/var/www/html/
db:
container_name: db
image: mysql
command: --default-authentication-plugin=mysql_native_password
restart: always
environment:
MYSQL_ROOT_PASSWORD: example
MYSQL_DATABASE: userdb
MYSQL_USER: my_user
MYSQL_PASSWORD: my_password
volumes:
- ./mysqldata:/var/lib/mysql
adminer:
container_name: adminer
depends_on:
- db
image: adminer
restart: always
ports:
- 8080:8080
volumes:
my-mysqldata:
frontend:
React Dockerfile
FROM node:17.4.0-alpine
WORKDIR /app
COPY package.json .
RUN npm install
COPY . ./
EXPOSE 3000
CMD [ "npm", "start" ]
I guess your problem is that npm start does not auto-reload when you edit your files. For that you could use nodemon or supervisor, which can reload the project each time a file is updated. Otherwise you have to restart manually (probably by restarting the Docker container).
There are a few things you can try:
Check your package.json file, specifically scripts, to see whether start gives you npm start with a hot-reload option or not.
To do so, you may run the full test locally (without Docker) and check whether the changes you make in the HTML (frontend) are indeed reflected in your application locally without rebuilding.
Secondly, create another (custom) script inside package.json to get npm run dev (available in React, but not sure for your case) with hot reload, or use nodemon for that.
Once you have that, use it in your docker-compose file in place of CMD [ "npm", "start" ], as in the sketch below.
To me, your Dockerfile and docker-compose file, along with the named volume definition, look OK.
Only one thing, though: I'm not sure why you specified command: npm start in the docker-compose file when you have already covered that part in your Dockerfile while creating the image.
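If you do keep the command override in Compose, a sketch of what the frontend service could look like (the dev script is hypothetical; you would have to add it to package.json yourself):
frontend:
  build:
    context: "./hartley_react"
  ports:
    - 3000:3000
  environment:
    - CHOKIDAR_USEPOLLING=true
  # "dev" is a hypothetical package.json script, e.g. "dev": "react-scripts start"
  command: npm run dev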

Docker Compose with React and Nginx

I'm trying to use docker-compose for deployment of my React app, which uses an Express backend and a Postgres database. My idea is to have shared volumes from my docker-compose, then build from my Dockerfile into the volume so that Nginx can serve the files. The problem is that it works when I build the project the first time, but if I change something in my React client and run "docker-compose up --build", it looks like everything builds as it should, yet the files served are still the same. Is the COPY command in my Dockerfile not overwriting the old files?
Dockerfile in my React Client Project
FROM node:13.12.0-alpine as build
WORKDIR /app
COPY package.json ./
COPY package-lock.json ./
RUN npm install
COPY . ./
RUN npm run build
FROM node:13.12.0-alpine
COPY --from=build /app/build /var/lib/frontend
docker-compose
version: "3.7"
services:
callstat_backend:
build: ./callstat-backend
restart: always
ports:
- "3000:3000"
env_file:
- keys.env
depends_on:
- postgres
callstat_frontend:
build: ./callstat-client
volumes:
- frontend/:/var/lib/frontend
postgres:
image: postgres:11.2-alpine
ports:
- "5432:5432"
volumes:
- pgdata:/var/lib/postgresql/data
environment:
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
POSTGRES_DB: callstat
nginx:
image: nginx
volumes:
- frontend:/usr/share/nginx/html
- ./nginx.conf:/etc/nginx/conf.d/default.conf
ports:
- "80:80"
depends_on:
- callstat_frontend
volumes:
pgdata:
frontend:
Maybe I'm taking a totally wrong approach here?
You can run the commands in the following order:
# stop down the services
docker-compose stop
# remove the previously created docker resources
docker-compose rm
# bring up the services again
docker-compose up --build
This way your previous volume will be removed and a new one will be created with the updated changes.
NOTE: This is okay from a development perspective, but Docker volumes are really expected to persist between deployments. For artifacts like code changes, images should ideally be published as part of the build process. To get a little more insight into this topic, you can refer to https://github.com/docker/compose/issues/2127
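Note that docker-compose rm by itself removes containers but not named volumes such as frontend; if the stale files live in that volume, a fuller reset (sketch) would be:
# stop and remove containers, networks, and named volumes
docker-compose down --volumes
# rebuild the images and recreate everything
docker-compose up --build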
