React and Docker - Uncaught Error: Cannot find module 'react-player' - reactjs

At the top of my react component (Coffee.jsx), I have this import:
import ReactPlayer from 'react-player';
The package 'react-player' is definitely installed: it is listed in package.json and present in node_modules/.
My code runs inside a Docker container. Every time I spin my containers up, like so:
docker-compose -f docker-compose-dev.yml up -d
I am getting this error:
./src/components/Coffees.jsx
Module not found: Can't resolve 'react-player' in '/usr/src/app/src/components'
This is what the console shows me:
Brewing.jsx:22 Uncaught Error: Cannot find module 'react-player'
at webpackMissingModule (Brewing.jsx:22)
at Module../src/components/Coffees.jsx (Brewing.jsx:22)
at __webpack_require__ (bootstrap:781)
at fn (bootstrap:149)
at Module../src/App.jsx (Spotify.css:4)
at __webpack_require__ (bootstrap:781)
at fn (bootstrap:149)
at Module../src/index.js (spotify-auth.js:8)
at __webpack_require__ (bootstrap:781)
at fn (bootstrap:149)
at Object.0 (index.js:10)
at __webpack_require__ (bootstrap:781)
at checkDeferredModules (bootstrap:45)
at Array.webpackJsonpCallback [as push] (bootstrap:32)
at main.chunk.js:1
docker-compose-dev.yml:
client:
  build:
    context: ./services/client
    dockerfile: Dockerfile-dev
  volumes:
    - './services/client:/usr/src/app'
    - '/usr/src/app/node_modules'
  ports:
    - 3000:3000
  environment:
    - NODE_ENV=development
    - REACT_APP_WEB_SERVICE_URL=${REACT_APP_WEB_SERVICE_URL}
  depends_on:
    - web
Dockerfile-dev:
# base image
FROM node:11.12.0-alpine
# set working directory
WORKDIR /usr/src/app
# add `/usr/src/app/node_modules/.bin` to $PATH
ENV PATH /usr/src/app/node_modules/.bin:$PATH
# install and cache app dependencies
COPY package.json /usr/src/app/package.json
COPY package-lock.json /usr/src/app/package-lock.json
RUN npm ci
RUN npm install react-scripts@2.1.8 -g --silent
# start app
CMD ["npm", "start"]
Folder structure:
services/
  docker-compose-dev.yml
  node_modules/
  client/
    Dockerfile-dev
    package.json
    package-lock.json
    node_modules/
      react-player/
Temporary fix:
The hack that fixes this is to wait for some time and then force some change in my code, either in Coffee.jsx or Brewing.jsx.
After I save the changed code, the package is found.
Then, when I stop the containers and bring them up again, the problem returns. I have tried using the --build flag after up -d, to no avail.
What's going on? How do I fix this?
More persistent fix:
After removing the volumes from docker-compose-dev.yml and rebuilding, like so:
#volumes:
#- './services/client:/usr/src/app'
#- '/usr/src/app/node_modules'
I still get the error:
client_1 | > client#0.1.0 start /usr/src/app
client_1 | > react-scripts start
client_1 |
client_1 | Could not find a required file.
client_1 | Name: index.html
client_1 | Searched in: /usr/src/app/public
client_1 | npm ERR! code ELIFECYCLE
client_1 | npm ERR! errno 1
client_1 | npm ERR! client#0.1.0 start: `react-scripts start`
client_1 | npm ERR! Exit status 1
client_1 | npm ERR!
client_1 | npm ERR! Failed at the client#0.1.0 start script.
client_1 | npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
client_1 |
client_1 | npm ERR! A complete log of this run can be found in:
client_1 | npm ERR! /root/.npm/_logs/2019-11-05T15_14_42_967Z-debug.log
Then it only works if I uncomment the volumes again and run the containers with the volumes in place. An answer explaining the reasons behind
a) the temporary fix
b) the more permanent fix
would be very much appreciated.

Managing node_modules is such a pain with Docker. There are great discussions on Stack Overflow about how to run a JavaScript app with Docker. Here is how I do it.
Dockerfile
FROM node:11.12.0-alpine
# first installed node_modules in cache and copy them to src folder
RUN mkdir /usr/src/cache
WORKDIR /usr/src/cache
COPY package.json .
RUN npm install -q
# now make a different directory for src code
RUN mkdir /usr/src/app
WORKDIR /usr/src/app
# set path to run packages from node_modules
ENV NODE_PATH=/usr/src/app/node_modules/.bin
COPY . .
docker-compose.yaml
app:
  build: .
  image: app
  container_name: services.app
  volumes:
    - .:/usr/src/app
  ports:
    - 3000:5000
  # this copies node_modules into the src folder; otherwise node_modules would be wiped out,
  # since we don't have node_modules on the host machine
  command: /usr/src/app/entrypoint.sh prod
And my entrypoint.sh file looks like
#!/bin/sh
# (sh, not bash: the node alpine image does not ship with bash)
cp -r /usr/src/cache/node_modules/. /usr/src/app/node_modules/
exec npm start
So the basic idea here is: when you build the image, you store node_modules somewhere in the image (the cache directory), and when you actually run the container, you copy that node_modules into the app folder. This way, your local node_modules never clashes with the one inside the container.
You can add node_modules to .dockerignore if you want to make the COPY faster.
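For example, a minimal .dockerignore could look like the sketch below. Only the node_modules line is the one this answer relies on; the other entries are common additions you may or may not want.
# .dockerignore - keep the build context small so COPY . . stays fast
node_modules
.git
build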

First, bring the services down:
docker-compose -f docker-compose-dev.yml down
Then rebuild the services (without cache):
docker-compose -f docker-compose-dev.yml build --no-cache
Finally, bring the services up:
docker-compose -f docker-compose-dev.yml up
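If it helps, the same three steps can be chained into a single line (identical commands, just combined):
docker-compose -f docker-compose-dev.yml down && \
docker-compose -f docker-compose-dev.yml build --no-cache && \
docker-compose -f docker-compose-dev.yml up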

I would go for installing the packages whenever the container comes up: they get installed on every start, and you can also add a control mechanism to decide when to install fresh modules and when to keep the ones baked in at build time.
#!/bin/sh
npm ci
npm install react-scripts@2.1.8 -g --silent
exec npm start
With this approach the container always installs up-to-date node modules, and you don't need to rebuild the image each time during development.
We can also control this behaviour:
#!/bin/sh
if [ "$PACKAGE_UPDATE" = true ] ; then
echo 'installing fresh node modules'
# you can also remove existing modules at this step
npm ci
npm install react-scripts@2.1.8 -g --silent
fi
exec npm start
So the docker run that installs fresh packages on start would be:
docker run -e PACKAGE_UPDATE=true -it my_image
And mount the anonymous volume so the container's node_modules does not conflict with the host's:
volumes:
  - './services/client:/usr/src/app'
  - '/usr/src/app/node_modules'
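As a rough sketch (not part of the original question), the script above could be wired into the existing Dockerfile-dev like this, assuming it is saved as entrypoint.sh next to the Dockerfile:
# base image
FROM node:11.12.0-alpine
# set working directory
WORKDIR /usr/src/app
# add `/usr/src/app/node_modules/.bin` to $PATH
ENV PATH /usr/src/app/node_modules/.bin:$PATH
# build-time install stays as before, so the container still works when PACKAGE_UPDATE is unset
COPY package.json package-lock.json /usr/src/app/
RUN npm ci
RUN npm install react-scripts@2.1.8 -g --silent
# hypothetical name/location for the script shown above
COPY entrypoint.sh /usr/src/app/entrypoint.sh
RUN chmod +x /usr/src/app/entrypoint.sh
# start through the script instead of npm start directly
CMD ["sh", "/usr/src/app/entrypoint.sh"]
In docker-compose, the switch is then just one more entry under environment, e.g. - PACKAGE_UPDATE=true.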

Related

Error when building container for react + storybook in prod environment

I'm trying to set up docker for a react application and storybook, but I'm running into an error when building it.
Here's the error:
#0 95.75 npm ERR! path /usr/src/app
#0 95.75 npm ERR! command failed
#0 95.75 npm ERR! signal SIGKILL
#0 95.75 npm ERR! command sh -c build-storybook -s public
The React image works as expected, and I was able to get both working in a dev environment, but for prod I get that error when building the Storybook image.
Here's the Dockerfile I have for storybook:
FROM node:19-alpine3.16 as build
WORKDIR /usr/src/app
COPY package.json ./
RUN npm install
COPY . .
RUN npm run build-storybook
COPY . .
FROM nginx:1.23.3
COPY --from=build /usr/src/app/storybook-static /usr/share/nginx/html/storybook
And here's the docker-compose file:
version: '3.8'
services:
  storybook:
    container_name: storybook
    build:
      context: .
      dockerfile: Dockerfile.storybook.prod
  react:
    container_name: react
    build:
      context: .
      dockerfile: Dockerfile.prod
    ports:
      - 8080:80
    env_file:
      - ./.env.prod
My goal is to have the react app running on the root and storybook on /storybook. For dev, which won't build prod static files, both are running with no issues.
Any clue what I'm doing wrong?

Reactjs Dockerisation Failed

What is this error? I can't run my React project in Docker. I tried to run it in a container using the docker-compose.yml file and Dockerfile below.
app | npm ERR! syscall open
app | npm ERR! path /app/package.json
app | npm ERR! errno -2
app | npm ERR! enoent ENOENT: no such file or directory, open '/app/package.json'
app | npm ERR! enoent This is related to npm not being able to find a file.
app | npm ERR! enoent
app |
app | npm ERR! A complete log of this run can be found in:
app | npm ERR! /root/.npm/_logs/2022-03-25T09_02_12_947Z-debug-0.log
docker-compose.yml
version: '3.6'
services:
  pdp-front:
    build:
      context: .
      dockerfile: pdp-front/Dockerfile
    command: npm run start
    container_name: app
    ports:
      - "9999:9999"
    volumes:
      - ./:/pdp-front
      - /pdp-front/node_modules
Dockerfile
FROM node:alpine as builder
WORKDIR /app
COPY pdp-front/package.json ./app
RUN npm install
COPY . ./
ENV PATH /app/node_modules/.bin:$PATH
CMD ["npm", "start"]
You're dropping content into several different paths inside the container. I expect, if you docker-compose run pdp-front sh to get an interactive shell and look around, you'll see a file /app/app that's actually the package.json file, and then the rest of your application is inside a subdirectory /app/pdp-front/....
The easiest approach here is to set up the pdp-front/Dockerfile so that it builds an image based only on its own directory. For any COPY instructions, the left-hand side is then relative to the pdp-front directory, and the right-hand side is relative to the specified WORKDIR. The Dockerfile can look like:
FROM node:alpine as builder
WORKDIR /app
# no subdirectories on either side of this COPY
COPY package.json ./
RUN npm install
COPY . ./
ENV PATH /app/node_modules/.bin:$PATH
CMD ["npm", "start"]
Then in the docker-compose.yml file, you can specify the pdp-front directory as the build context directory and then use the default dockerfile: Dockerfile setting relative to that directory. Eliminating some other unnecessary options, this can reduce to just
version: '3.8'
services:
  pdp-front:
    build: ./pdp-front
    ports:
      - "9999:9999"
You need the build: { context: ., dockerfile: pdp-front/Dockerfile } in the case where the application needs content from one of its sibling directories. In this case you'll need to work out the right directory layout; perhaps
# default to the application-specific subdirectory
WORKDIR /app/pdp-front
# ...and install its specific dependencies
COPY pdp-front/package.json ./
RUN npm ci
# copy the entire multi-project tree into /app
COPY . /app/
# RUN npm run build
# runs with the pdp-front subdirectory as WORKDIR
CMD ["npm", "start"]

Hot reload with docker react app doesn't work

I am trying to use my React app with Docker without always rebuilding the image from my Dockerfile. I use a docker-compose file to build my React app, but I always get the same error.
PS: I'm new to Docker, so I have certainly made some mistakes.
Before docker-compose up, I run docker-compose build.
When I run docker-compose up I get this error:
frontend_1 | npm ERR! code ENOENT
frontend_1 | npm ERR! syscall open
frontend_1 | npm ERR! path /app/package.json
frontend_1 | npm ERR! errno -2
frontend_1 | npm ERR! enoent ENOENT: no such file or directory, open '/app/package.json'
frontend_1 | npm ERR! enoent This is related to npm not being able to find a file.
frontend_1 | npm ERR! enoent
frontend_1 |
frontend_1 | npm ERR! A complete log of this run can be found in:
frontend_1 | npm ERR! /root/.npm/_logs/2021-03-23T10_26_47_886Z-debug.log
rfid-commisioning_frontend_1 exited with code 254
My Dockerfile:
WORKDIR /app
ENV PATH /app/node_modules/.bin:$PATH
COPY package.json ./
COPY package-lock.json ./
RUN npm install
COPY . ./
CMD ["npm", "start"]
My docker-compose file:
version: "3"
services:
  frontend:
    build: Front/RFID
    command: ["npm", "start"]
    environment:
      - CHOKIDAR_USEPOLLING="true"
      - NODE_ENV=development
    volumes:
      - "./Front/RFID/:/app"
      - "/app/node_modules"
    ports:
      - "3000:3000"
Frontend folder structure: [screenshot]
Global folder structure: [screenshot]

npm run tests in docker with travis CI

I am deploying a react app to Heroku via TravisCI. The fact that I'm using Heroku doesn't really affect what I'm about to ask, I'm pretty sure, it's just there for context. Travis successfully deploys the app until I add a testing step (the script section) in .travis.yml:
language: generic
sudo: required
services:
  - docker
before_install:
  - docker build -t myapp:prod -f Dockerfile.prod .
script:
  - docker run -e CI=true myapp:prod npm run test
after_success:
  - docker build -t myapp:prod -f Dockerfile.prod .
  - echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_ID" --password-stdin
  - docker push myapp:prod
deploy:
  provider: heroku
  app: myapp
  skip_cleanup: true
  api_key:
    secure: <my_key>
However, my Dockerfile.prod is a multi-stage node + nginx where the nginx stage doesn't keep any node or npm stuff:
# build environment
FROM node:13.12.0-alpine as builder
# set working directory
WORKDIR /app
# add `/app/node_modules/.bin` to $PATH
ENV PATH /app/node_modules/.bin:$PATH
# install app dependencies
COPY package.json ./
COPY package-lock.json ./
# some CI stuff I guess
RUN npm ci
RUN npm install react-scripts@3.4.1 -g --silent
COPY . ./
RUN npm run build
# production environment
FROM nginx:stable-alpine
COPY nginx.conf /etc/nginx/conf.d/default.conf
# If using React Router
COPY --from=builder /app/build /usr/share/nginx/html
# For Heroku
CMD sed -i -e 's/$PORT/'"$PORT"'/g' /etc/nginx/conf.d/default.conf && nginx -g 'daemon off;'
Therefore, it is my understanding that .travis.yml tries to run that npm run test command inside my nginx container and can't execute npm commands (no node installed, right?). So guided by SO answers such as this one I started adding commands into that nginx stage such as
COPY package.json ./
COPY package-lock.json ./
RUN apk add --update npm
but I realized I might be approaching this the wrong way. Should I perhaps be adding npm through Travis? That is, should I include in .travis.yml in the scripts section something like docker run -e CI=true myapp:prod apk add --update npm and whatever else is necessary? This would result in a smaller nginx image no? However, would I run into problems with package.json from the node stage in Dockerfile.prod or anything like that?
In summary, to use TravisCI to test a dockerized react app served with nginx, at what point should I install npm into my image? Does it happen as part of script in .travis.yml or does it happen in Dockerfile.prod? If it is recommended to run the npm tests inside Dockerfile.prod, would I do that in the first stage (node) or the second (nginx)?
Thanks
EDIT: Not sure if this can be considered solved, but a user on Reddit recommended simply adding RUN npm run test right before the RUN npm run build.
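For reference, here is a sketch of what that suggestion would look like in the existing builder stage; the CI=true prefix on the test command is an assumption so that react-scripts runs the tests once and exits instead of watching:
# build environment
FROM node:13.12.0-alpine as builder
WORKDIR /app
ENV PATH /app/node_modules/.bin:$PATH
COPY package.json package-lock.json ./
RUN npm ci
RUN npm install react-scripts@3.4.1 -g --silent
COPY . ./
# assumption: CI=true makes the CRA test runner do a single pass instead of watching
RUN CI=true npm run test
RUN npm run build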

Could not get uid/gid when building Node/Docker

My Dockerfile uses Alpine and installs react-scripts globally. When it tries to install it, it fails with a "could not get uid/gid" error. I added the "--unsafe-perm" option to the npm install -g command. The Docker container is then created successfully, but the permissions in the container are messed up for the installed files: I see the user and group set to 1000 for all of them. I tried adding the following command to the Dockerfile right before the install step, but that didn't help.
RUN npm -g config set user root
Build error
Error: could not get uid/gid
[ 'nobody', 0 ]
at /usr/local/lib/node_modules/npm/node_modules/uid-number/uid-number.js:37:16
at ChildProcess.exithandler (child_process.js:296:5)
at ChildProcess.emit (events.js:182:13)
at maybeClose (internal/child_process.js:961:16)
at Process.ChildProcess._handle.onexit (internal/child_process.js:250:5)
TypeError: Cannot read property 'get' of undefined
at errorHandler (/usr/local/lib/node_modules/npm/lib/utils/error-handler.js:205:18)
at /usr/local/lib/node_modules/npm/bin/npm-cli.js:76:20
at cb (/usr/local/lib/node_modules/npm/lib/npm.js:228:22)
at /usr/local/lib/node_modules/npm/lib/npm.js:266:24
at /usr/local/lib/node_modules/npm/lib/config/core.js:83:7
at Array.forEach (<anonymous>)
at /usr/local/lib/node_modules/npm/lib/config/core.js:82:13
at f (/usr/local/lib/node_modules/npm/node_modules/once/once.js:25:25)
at afterExtras (/usr/local/lib/node_modules/npm/lib/config/core.js:173:20)
at Conf.<anonymous> (/usr/local/lib/node_modules/npm/lib/config/core.js:231:22)
/usr/local/lib/node_modules/npm/lib/utils/error-handler.js:205
if (npm.config.get('json')) {
^
TypeError: Cannot read property 'get' of undefined
at process.errorHandler (/usr/local/lib/node_modules/npm/lib/utils/error-handler.js:205:18)
at process.emit (events.js:182:13)
at process._fatalException (internal/bootstrap/node.js:472:27)
ERROR: Service 'sample-app' failed to build: The command '/bin/sh -c npm install react-scripts#1.1.1 -g' returned a non-zero code:
Dockerfile
# build environment
FROM node:10-alpine as builder
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
ENV PATH /usr/src/app/node_modules/.bin:$PATH
COPY package.json /usr/src/app/package.json
RUN npm install
RUN npm install react-scripts#1.1.1 -g
COPY . /usr/src/app
RUN npm run build
# production environment
FROM nginx:1.13.9-alpine
COPY --from=builder /usr/src/app/build /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
UPD: Fixed in nodejs 12.4.0?
Check if this is linked to nodejs/docker-node issue 813:
The root cause seems to be the thread stack size:
The default stack size for new threads on glibc is determined based on the resource limit governing the main thread’s stack (RLIMIT_STACK).
It generally ends up being 2-10 MB.
There are three possible solutions:
- Talk to the Alpine team to fix it. There were some discussions already.
- Fix it in the node Docker Alpine image.
- Set a default of npm_config_unsafe_perm=true in the Docker image as a workaround until it's fixed.
You already tried the third option, but consider also:
Alternatively, you should switch to the slim (Debian) variant until this gets fixed upstream by the Alpine team.
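As a sketch, switching to the slim variant would just mean changing the base image of the build stage and leaving the rest of the Dockerfile unchanged (node:10-slim is the Debian-based counterpart of node:10-alpine):
# build environment
FROM node:10-slim as builder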
I faced the same issue with the node-alpine image when I was dockerizing my React application.
I resolved it with the following Dockerfile configuration.
FROM node:8.10.0-alpine
# Set a working directory
WORKDIR /usr/src/app
COPY ./build/package.json .
COPY ./build/yarn.lock .
# To handle 'not get uid/gid'
RUN npm config set unsafe-perm true
# Install Node.js dependencies
RUN yarn install --production --no-progress
# Copy application files
COPY ./build .
# Install pm2
RUN npm install -g pm2 --silent
# Run the container under "node" user by default
USER node
CMD ["pm2", "start", "mypm2config.yml", "--no-daemon", "--env", "preprod"]
