What is this error? I can't run a React project in Docker. I tried to run the project in a container using a docker-compose.yml file and a Dockerfile:
app | npm ERR! syscall open
app | npm ERR! path /app/package.json
app | npm ERR! errno -2
app | npm ERR! enoent ENOENT: no such file or directory, open '/app/package.json'
app | npm ERR! enoent This is related to npm not being able to find a file.
app | npm ERR! enoent
app |
app | npm ERR! A complete log of this run can be found in:
app | npm ERR! /root/.npm/_logs/2022-03-25T09_02_12_947Z-debug-0.log
docker-compose.yml
version: '3.6'
services:
  pdp-front:
    build:
      context: .
      dockerfile: pdp-front/Dockerfile
    command: npm run start
    container_name: app
    ports:
      - "9999:9999"
    volumes:
      - ./:/pdp-front
      - /pdp-front/node_modules
Dockerfile
FROM node:alpine as builder
WORKDIR /app
COPY pdp-front/package.json ./app
RUN npm install
COPY . ./
ENV PATH /app/node_modules/.bin:$PATH
CMD ["npm", "start"]
You're dropping content into several different paths inside the container. I expect, if you docker-compose run pdp-front sh to get an interactive shell and look around, you'll see a file /app/app that's actually the package.json file, and then the rest of your application is inside a subdirectory /app/pdp-front/....
The easiest approach here is to set up the pdp-front/Dockerfile so that it builds an image based only on its own directory. So for any COPY instructions, the left-hand side will be relative to the pdp-front directory, and the right-hand side will be relative to the specified WORKDIR. The Dockerfile can look like:
FROM node:alpine as builder
WORKDIR /app
# no subdirectories on either side of this COPY
COPY package.json ./
RUN npm install
COPY . ./
ENV PATH /app/node_modules/.bin:$PATH
CMD ["npm", "start"]
Then in the docker-compose.yml file, you can specify the pdp-front directory as the build context directory and then use the default dockerfile: Dockerfile setting relative to that directory. Eliminating some other unnecessary options, this can reduce to just
version: '3.8'
services:
  pdp-front:
    build: ./pdp-front
    ports:
      - "9999:9999"
You need the build: { context: ., dockerfile: pdp-front/Dockerfile } in the case where the application needs content from one of its sibling directories. In this case you'll need to work out the right directory layout; perhaps
# default to the application-specific subdirectory
WORKDIR /app/pdp-front
# and install its specific dependencies
COPY pdp-front/package.json ./
RUN npm ci
# copy the entire multi-project tree into /app
COPY . /app/
# RUN npm run build
# npm start runs with the pdp-front subdirectory as the working directory
CMD ["npm", "start"]
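For completeness, a matching docker-compose.yml for this sibling-directory layout might look like the following. This is a sketch based on the build options already mentioned; the port mapping is carried over from the original file:

```yaml
version: '3.8'
services:
  pdp-front:
    build:
      context: .
      dockerfile: pdp-front/Dockerfile
    ports:
      - "9999:9999"
```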
Related
version: "1.0.0"
services:
  ########################################################################################################
  ############################################# VALIDLY #################################################
  ########################################################################################################
  validly-studio:
    build:
      context: ./studio
      dockerfile: Dockerfile
    volumes:
      - type: bind
        source: ./studio
        target: /app
      - /app/node_modules
    restart: unless-stopped
    ports:
      - 3000:3000
    networks:
      - validly
networks:
  validly:
Above is my docker-compose.yml file.
FROM node:16.14-alpine
# set working directory
WORKDIR /app
# install app dependencies
COPY package.json ./
COPY package-lock.json ./
RUN npm install
# add app
COPY . ./
# start app
CMD ["npm", "start"]
This is my Dockerfile.
Docker builds the React app and tells me to go to localhost:3000, where the app is running. But when I go to localhost:3000, I get connection refused.
In your Dockerfile you copy everything into /app.
And in your docker-compose you bind-mount ./studio over /app and put an anonymous volume at /app/node_modules.
This will not work.
Choose one of the two, and my instinct (and many errors in the past) tells me that you should build everything in the Dockerfile, including node_modules, and not touch it in docker-compose.
Dockerfile
Template Dockerfile for React (this one is for a Next.js environment, which only makes it more complex):
# Dependencies Container
FROM node:lts-alpine3.12 AS deps
RUN apk add --no-cache libc6-compat
WORKDIR /app
# Here we create node_modules
COPY package.json ./
COPY package-lock.json ./
RUN npm install -g npm@7.24.0 --no-update-notifier
RUN npm --version
RUN npm ci --no-update-notifier
# Rebuild the source code only when needed
FROM node:lts-alpine3.12 AS builder
WORKDIR /app
COPY . .
# Here we copy node_modules from previous intermediate container
COPY --from=deps /app/node_modules ./node_modules
RUN npm install -g npm@7.24.0 --no-update-notifier
RUN npm --version
RUN node -v
RUN npm run build --no-update-notifier
# Production Image
FROM node:16-bullseye AS runner
WORKDIR /app
ENV NODE_ENV production
# Here we only copy. No building needed, keeps the image small.
COPY --from=builder /app/.next ./.next
COPY --from=builder /app/node_modules ./node_modules
COPY --from=builder /app/package.json ./package.json
COPY --from=builder /app/package-lock.json ./package-lock.json
RUN addgroup --gid 1001 nodejs
RUN adduser --uid 1002 --ingroup nodejs --disabled-password --gecos "" nextjs
RUN chown -R nextjs:nodejs /app/.next
USER nextjs
docker-compose.yml
Here a template docker-compose.yml using the above Dockerfile
version: "3.9"
services:
  webshop:
    build:
      context: ./build
      dockerfile: Dockerfile_webshop
    image: mywebshop
    restart: "no"
    container_name: MyWebshop
    command: ["npm", "start"]
As you see, no volumes needed.
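One caveat with this template: without a ports mapping, the container's port is not published to the host, so localhost:3000 would still refuse connections. A sketch of the addition, assuming the app listens on port 3000 as in the original compose file:

```yaml
services:
  webshop:
    ports:
      - 3000:3000
```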
I created a Dockerfile for the frontend project and a docker-compose file for the whole project. Running the frontend within a container works fine; however, when I try with docker-compose I always get the same error:
Attaching to frontend_dev
frontend_dev | npm ERR! code ENOENT
frontend_dev | npm ERR! syscall open
frontend_dev | npm ERR! path /app/package.json
frontend_dev | npm ERR! errno -2
frontend_dev | npm ERR! enoent ENOENT: no such file or directory, open '/app/package.json'
frontend_dev | npm ERR! enoent This is related to npm not being able to find a file.
frontend_dev | npm ERR! enoent
frontend_dev |
frontend_dev | npm ERR! A complete log of this run can be found in:
frontend_dev | npm ERR! /root/.npm/_logs/2021-01-10T22_25_43_776Z-debug.log
frontend_dev exited with code 254
The Dockerfile in the frontend folder looks like this:
#pull base image
FROM node:13.12.0-alpine
#set working directory
WORKDIR /app
#install app dependencies
COPY package.json ./
COPY package-lock.json ./
RUN npm install
#add frontend to docker container
COPY . ./
#launch frontend app
CMD [ "npm", "start" ]
and the docker-compose.yml is as follows:
version: "3.0"
services:
  frontend:
    container_name: frontend_dev
    build:
      context: ./frontend
      dockerfile: Dockerfile
    volumes:
      - /app/node_modules
      - .:/app
    ports:
      - 3001:3000
    stdin_open: true
    environment:
      - CHOKIDAR_USEPOLLING=true
Here is the structure of the project:
docker-compose.yml is at the root, next to both the frontend and backend folders, and only frontend contains its Dockerfile.
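Given the layout just described, the bind mount .:/app maps the repository root (which has no package.json) over /app, while the image was built from ./frontend — the same path mismatch as in the first question above. A sketch of a corrected volumes section, assuming the frontend code lives in ./frontend relative to docker-compose.yml:

```yaml
volumes:
  - /app/node_modules
  - ./frontend:/app
```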
I have a React app and I need to run it in Docker. Inside this container I need to build 3 instances of the same code, but with different environments, by replacing .env.production with my .env.production2 and .env.production3 files. I have a problem with the Dockerfile: if I don't use RUN npm install after changing WORKDIR, the build stops with an error:
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm WARN Local package.json exists, but node_modules missing, did you mean to install?
So right now it works only with this Dockerfile:
FROM node:12 as build-box
COPY . /app/expert
COPY . /app/expert-control-chat
COPY . /app/expert-control-support
WORKDIR /app/expert
ARG NPMTOKEN
ENV NPMTOKEN=$NPMTOKEN
RUN npm config set _auth $NPMTOKEN
RUN npm install
# Build
FROM build-box as publish
WORKDIR /app/expert
RUN npm run build
WORKDIR /
RUN rm -rf /app/expert-control-chat/.env.production
COPY .env.production.chat-control /app/expert-control-chat/.env.production
WORKDIR /app/expert-control-chat
RUN npm install
RUN npm run build
WORKDIR /
RUN rm -rf /app/expert-control-support/.env.production
COPY .env.production.chat-support /app/expert-control-support/.env.production
WORKDIR /app/expert-control-support
RUN npm install
RUN npm run build
FROM nginx as runtime
RUN rm -rf /etc/nginx/conf.d
COPY nginx.conf /etc/nginx/nginx.conf
WORKDIR /app/expert
COPY --from=publish /app/expert/build ./
WORKDIR /app/expert-control-chat
COPY --from=publish /app/expert-control-chat/build ./
WORKDIR /app/expert-control-support
COPY --from=publish /app/expert-control-support/build ./
# Start
EXPOSE 3000
CMD ["nginx", "-g", "daemon off;"]
Can you tell me the correct way to make this build?
I think there is another way to build it, but I can't figure it out.
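One way to avoid the three separate npm installs is to install dependencies once and rebuild the same source tree with each env file swapped in. This is an untested sketch, not a definitive build: it assumes a create-react-app-style setup that reads .env.production at build time, and reuses the file and directory names from the Dockerfile above:

```dockerfile
FROM node:12 AS build-box
ARG NPMTOKEN
ENV NPMTOKEN=$NPMTOKEN
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm config set _auth $NPMTOKEN
RUN npm install
COPY . .
RUN mkdir /out
# first build uses the checked-in .env.production
RUN npm run build && mv build /out/expert
# swap in the chat-control env file and rebuild the same tree
COPY .env.production.chat-control .env.production
RUN npm run build && mv build /out/expert-control-chat
# swap in the chat-support env file and rebuild
COPY .env.production.chat-support .env.production
RUN npm run build && mv build /out/expert-control-support

FROM nginx AS runtime
COPY nginx.conf /etc/nginx/nginx.conf
COPY --from=build-box /out/expert /app/expert
COPY --from=build-box /out/expert-control-chat /app/expert-control-chat
COPY --from=build-box /out/expert-control-support /app/expert-control-support
EXPOSE 3000
CMD ["nginx", "-g", "daemon off;"]
```

The key point is that node_modules is installed once and reused for all three builds, since only the .env.production file changes between them.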
I am deploying a React app to Heroku via Travis CI. The fact that I'm using Heroku doesn't really affect what I'm about to ask, I'm pretty sure; it's just there for context. Travis successfully deploys the app until I add a testing step (the script section) in .travis.yml:
language: generic
sudo: required
services:
  - docker
before_install:
  - docker build -t myapp:prod -f Dockerfile.prod .
script:
  - docker run -e CI=true myapp:prod npm run test
after_success:
  - docker build -t myapp:prod -f Dockerfile.prod .
  - echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_ID" --password-stdin
  - docker push myapp:prod
deploy:
  provider: heroku
  app: myapp
  skip_cleanup: true
  api_key:
    secure: <my_key>
However, my Dockerfile.prod is a multi-stage node + nginx build where the nginx stage doesn't keep any node or npm tooling:
# build environment
FROM node:13.12.0-alpine as builder
# set working directory
WORKDIR /app
# add `/app/node_modules/.bin` to $PATH
ENV PATH /app/node_modules/.bin:$PATH
# install app dependencies
COPY package.json ./
COPY package-lock.json ./
# some CI stuff I guess
RUN npm ci
RUN npm install react-scripts@3.4.1 -g --silent
COPY . ./
RUN npm run build
# production environment
FROM nginx:stable-alpine
COPY nginx.conf /etc/nginx/conf.d/default.conf
# If using React Router
COPY --from=builder /app/build /usr/share/nginx/html
# For Heroku
CMD sed -i -e 's/$PORT/'"$PORT"'/g' /etc/nginx/conf.d/default.conf && nginx -g 'daemon off;'
Therefore, it is my understanding that .travis.yml tries to run that npm run test command inside my nginx container and can't execute npm commands (no node installed, right?). So guided by SO answers such as this one I started adding commands into that nginx stage such as
COPY package.json ./
COPY package-lock.json ./
RUN apk add --update npm
but I realized I might be approaching this the wrong way. Should I perhaps be adding npm through Travis? That is, should I include in .travis.yml in the scripts section something like docker run -e CI=true myapp:prod apk add --update npm and whatever else is necessary? This would result in a smaller nginx image no? However, would I run into problems with package.json from the node stage in Dockerfile.prod or anything like that?
In summary, to use Travis CI to test a dockerized React app served with nginx, at what point should I install npm into my image? Does it happen as part of script in .travis.yml, or does it happen in Dockerfile.prod? If it is recommended to run npm tests inside Dockerfile.prod, would I do that in the first stage (node) or the second (nginx)?
Thanks
EDIT: Not sure if this can be considered solved, but a user on Reddit recommended simply adding RUN npm run test right before the RUN npm run build.
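In Dockerfile terms, that suggestion puts the tests in the first (node) stage, so a test failure aborts the image build and the nginx stage never needs npm at all. A sketch of the relevant lines, assuming create-react-app's test runner (CI=true disables watch mode):

```dockerfile
COPY . ./
# run tests in the builder stage; a failure stops the whole image build
RUN CI=true npm run test
RUN npm run build
```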
At the top of my React component (Coffee.jsx), I have this import:
import ReactPlayer from 'react-player';
The package 'react-player' is certainly installed: it is present in package.json and node_modules/.
My code runs inside a Docker container. Every time I spin my containers up, like so:
docker-compose -f docker-compose-dev.yml up -d
I am getting this error:
./src/components/Coffees.jsx
Module not found: Can't resolve 'react-player' in '/usr/src/app/src/components'
this is what console shows me:
Brewing.jsx:22 Uncaught Error: Cannot find module 'react-player'
at webpackMissingModule (Brewing.jsx:22)
at Module../src/components/Coffees.jsx (Brewing.jsx:22)
at __webpack_require__ (bootstrap:781)
at fn (bootstrap:149)
at Module../src/App.jsx (Spotify.css:4)
at __webpack_require__ (bootstrap:781)
at fn (bootstrap:149)
at Module../src/index.js (spotify-auth.js:8)
at __webpack_require__ (bootstrap:781)
at fn (bootstrap:149)
at Object.0 (index.js:10)
at __webpack_require__ (bootstrap:781)
at checkDeferredModules (bootstrap:45)
at Array.webpackJsonpCallback [as push] (bootstrap:32)
at main.chunk.js:1
docker-compose-dev.yml:
client:
  build:
    context: ./services/client
    dockerfile: Dockerfile-dev
  volumes:
    - './services/client:/usr/src/app'
    - '/usr/src/app/node_modules'
  ports:
    - 3000:3000
  environment:
    - NODE_ENV=development
    - REACT_APP_WEB_SERVICE_URL=${REACT_APP_WEB_SERVICE_URL}
  depends_on:
    - web
Dockerfile-dev:
# base image
FROM node:11.12.0-alpine
# set working directory
WORKDIR /usr/src/app
# add `/usr/src/app/node_modules/.bin` to $PATH
ENV PATH /usr/src/app/node_modules/.bin:$PATH
# install and cache app dependencies
COPY package.json /usr/src/app/package.json
COPY package-lock.json /usr/src/app/package-lock.json
RUN npm ci
RUN npm install react-scripts@2.1.8 -g --silent
# start app
CMD ["npm", "start"]
folder structure:
services/
  docker-compose-dev.yml
  node_modules/
  client/
    Dockerfile-dev
    package.json
    package-lock.json
    node_modules/
      react-player/
Temporary fix:
The hack that fixes this is waiting for some time, along with some forced changes in my code in either Coffee.jsx or Brewing.jsx.
After I save the changed code, the package is found.
Then, when I stop the containers and bring them up again, the problem resumes. I have tried using the flag --build after up -d, to no avail.
What's going on? How do I fix this?
More persistent fix:
After removing volumes from docker-compose-dev.yml and rebuilding, like so:
#volumes:
#- './services/client:/usr/src/app'
#- '/usr/src/app/node_modules'
I still get the error:
client_1 | > client#0.1.0 start /usr/src/app
client_1 | > react-scripts start
client_1 |
client_1 | Could not find a required file.
client_1 | Name: index.html
client_1 | Searched in: /usr/src/app/public
client_1 | npm ERR! code ELIFECYCLE
client_1 | npm ERR! errno 1
client_1 | npm ERR! client#0.1.0 start: `react-scripts start`
client_1 | npm ERR! Exit status 1
client_1 | npm ERR!
client_1 | npm ERR! Failed at the client#0.1.0 start script.
client_1 | npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
client_1 |
client_1 | npm ERR! A complete log of this run can be found in:
client_1 | npm ERR! /root/.npm/_logs/2019-11-05T15_14_42_967Z-debug.log
Then it only works if I uncomment the volumes again and run the containers with them. An answer explaining the reasons for
a) the temporary fix
b) the more permanent fix
would be very appreciated.
Managing node_modules is such a pain with Docker. There are great discussions on Stack Overflow about how you can run a JavaScript app with Docker. Here is how I do it.
Dockerfile
FROM node:11.12.0-alpine
# first installed node_modules in cache and copy them to src folder
RUN mkdir /usr/src/cache
WORKDIR /usr/src/cache
COPY package.json .
RUN npm install -q
# now make a different directory for src code
RUN mkdir /usr/src/app
WORKDIR /usr/src/app
# add node_modules/.bin to PATH so installed packages can be run directly
ENV PATH /usr/src/app/node_modules/.bin:$PATH
COPY . .
docker-compose.yaml
app:
  build: .
  image: app
  container_name: services.app
  volumes:
    - .:/usr/src/app
  ports:
    - 3000:5000
  # this will copy node_modules into the src folder; otherwise node_modules would be
  # wiped out, since we don't have node_modules on the host machine
  command: /usr/src/app/entrypoint.sh prod
And my entrypoint.sh file looks like this:
#!/bin/sh
cp -r /usr/src/cache/node_modules/. /usr/src/app/node_modules/
exec npm start
So the basic idea here is: when you build the image, you store node_modules somewhere out of the way, and when you actually run the container, you copy that node_modules into the app folder. This way, your local node_modules never clashes with the one in Docker.
You can add node_modules to .dockerignore if you want to make the COPY faster.
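A minimal .dockerignore for that layout might look like the following sketch; add any other build artifacts your project produces:

```
node_modules
npm-debug.log
```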
First, bring the services down:
docker-compose -f docker-compose-dev.yml down
Then rebuild the services (without cache):
docker-compose -f docker-compose-dev.yml build --no-cache
Finally, run the services:
docker-compose -f docker-compose-dev.yml up
I would go for installing packages whenever the container comes up: an entrypoint script installs them on every start, and you can add a control mechanism to decide when to install fresh modules and when to keep the ones baked in at build time.
#!/bin/sh
npm ci
npm install react-scripts@2.1.8 -g --silent
exec npm start
With this approach the container always installs up-to-date node modules, and you don't need to rebuild the image each time during development.
We can also control this behaviour:
#!/bin/sh
if [ "$PACKAGE_UPDATE" = true ] ; then
  echo 'installing fresh node modules'
  # you can also remove existing modules at this step
  npm ci
  npm install react-scripts@2.1.8 -g --silent
fi
exec npm start
So the docker run that triggers a fresh package install is:
docker run -e PACKAGE_UPDATE=true -it my_image
and mount an anonymous volume so the container's node_modules doesn't conflict with the host's:
volumes:
  - './services/client:/usr/src/app'
  - '/usr/src/app/node_modules'
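In docker-compose terms, the same control flag can be set via environment. This is a sketch that reuses the service and path names from the earlier compose file, and assumes the script above is baked into the image as its entrypoint:

```yaml
services:
  client:
    build:
      context: ./services/client
      dockerfile: Dockerfile-dev
    environment:
      - PACKAGE_UPDATE=true
    volumes:
      - './services/client:/usr/src/app'
      - '/usr/src/app/node_modules'
```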