Docker container exiting immediately after starting when using npm init react-app - reactjs

I am trying to start a Docker container with a React project created using npm init react-app.
This is my Dockerfile:
# Specify a base image
FROM node:alpine
WORKDIR /usr/app
# Install some dependencies
COPY ./package.json ./
RUN npm install
COPY ./ ./
# Default command
CMD ["npm", "run", "start"]
docker build . creates an image successfully (with a lot of npm warnings), and then when I run docker run <image> this is the output in my terminal:
> mytest@0.1.0 start /usr/app
> react-scripts start
ℹ 「wds」: Project is running at http://172.17.0.2/
ℹ 「wds」: webpack output is served from
ℹ 「wds」: Content not from webpack is served from /usr/app/public
ℹ 「wds」: 404s will fallback to /
Starting the development server...
As soon as it hits Starting the development server... it stops running in my terminal. If I check docker ps I can see no containers are running; if I run docker ps -a I can see a container was started up and then exited immediately.
docker logs shows the terminal output above. Has anybody run into this situation? It's only with my npm init react-app project; my other Node.js + Express projects run fine with the exact same Dockerfile.

I tried downgrading but it didn't work. What worked for me was adding this to the react app service in my docker-compose file:
stdin_open: true
This solution is also suggested in this issue on GitHub.
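For context, a minimal docker-compose.yml sketch showing where the flag goes (the service name, build context, and ports here are assumptions, not from the original question):
version: "3"
services:
  react-app:
    build: .
    ports:
      - "3000:3000"
    stdin_open: true # keep STDIN open so react-scripts does not exit immediately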

docker run -it -p 80:3000 imagename resolved my problem.

Running the command with -it worked for me:
docker run -it -p 3001:3000 Imageid

I solved the problem. Inside package.json, replace "react-scripts": "3.4.1" with "react-scripts": "3.4.0".
Then rebuild the image and it works!
They messed something up in "react-scripts": "3.4.1".
Just use version 3.4.0.
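For illustration, the relevant part of package.json would end up looking like this (other dependencies omitted):
"dependencies": {
  ...
  "react-scripts": "3.4.0"
}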

So nothing is wrong with the setup; it turns out there is an open issue for this:
https://github.com/facebook/create-react-app/issues/8688
Downgrading to 3.4.0 solves it for now.

Ran into the same issue. The -it flag resolved it:
docker run -it -p 3001:3001 <image-id>

I got the same issue as you and ran the command below to fix it:
sudo docker run -it -p 3001:3000 Image-Name
Hope it helps.

You can use the -itd flag as well, to run the container detached:
sudo docker run -itd -p 3001:3000 Image-Name

Using the suggestion mentioned in this answer on Stack Overflow and on GitHub, adding
ENV CI=true to the Dockerfile before CMD ["npm", "run", "start"] worked.
I'm not sure why this works; it may have something to do with this point. You can read more about the CI=true environment variable in the React docs.
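For reference, this is a sketch of the question's Dockerfile with that one line added (CI=true tells react-scripts it is running in a CI environment, per the React docs linked above):
# Specify a base image
FROM node:alpine
WORKDIR /usr/app
# Install some dependencies
COPY ./package.json ./
RUN npm install
COPY ./ ./
# Tell react-scripts it is running in a CI environment
ENV CI=true
# Default command
CMD ["npm", "run", "start"]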

Related

npm run tests in docker with travis CI

I am deploying a React app to Heroku via TravisCI. The fact that I'm using Heroku doesn't really affect what I'm about to ask, I'm pretty sure; it's just there for context. Travis successfully deploys the app until I add a testing step (the script section) in .travis.yml:
language: generic
sudo: required
services:
- docker
before_install:
- docker build -t myapp:prod -f Dockerfile.prod .
script:
- docker run -e CI=true myapp:prod npm run test
after_success:
- docker build -t myapp:prod -f Dockerfile.prod .
- echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_ID" --password-stdin
- docker push myapp:prod
deploy:
  provider: heroku
  app: myapp
  skip_cleanup: true
  api_key:
    secure: <my_key>
However, my Dockerfile.prod is a multi-stage node + nginx build where the nginx stage doesn't keep any node or npm stuff:
# build environment
FROM node:13.12.0-alpine as builder
# set working directory
WORKDIR /app
# add `/app/node_modules/.bin` to $PATH
ENV PATH /app/node_modules/.bin:$PATH
# install app dependencies
COPY package.json ./
COPY package-lock.json ./
# some CI stuff I guess
RUN npm ci
RUN npm install react-scripts@3.4.1 -g --silent
COPY . ./
RUN npm run build
# production environment
FROM nginx:stable-alpine
COPY nginx.conf /etc/nginx/conf.d/default.conf
# If using React Router
COPY --from=builder /app/build /usr/share/nginx/html
# For Heroku
CMD sed -i -e 's/$PORT/'"$PORT"'/g' /etc/nginx/conf.d/default.conf && nginx -g 'daemon off;'
Therefore, it is my understanding that .travis.yml tries to run that npm run test command inside my nginx container and can't execute npm commands (no node installed, right?). So, guided by SO answers such as this one, I started adding commands into that nginx stage, such as
COPY package.json ./
COPY package-lock.json ./
RUN apk add --update npm
but I realized I might be approaching this the wrong way. Should I perhaps be adding npm through Travis? That is, should I include in the script section of .travis.yml something like docker run -e CI=true myapp:prod apk add --update npm and whatever else is necessary? This would result in a smaller nginx image, no? However, would I run into problems with package.json from the node stage in Dockerfile.prod or anything like that?
In summary, to use TravisCI to test a dockerized React app served with nginx, at what point should I install npm into my image? Does it happen as part of script in .travis.yml or does it happen in Dockerfile.prod? If it is recommended to run the npm tests inside Dockerfile.prod, would I do that in the first stage (node) or the second (nginx)?
Thanks
EDIT: Not sure if this can be considered solved, but a user on Reddit recommended simply adding RUN npm run test right before the RUN npm run build.
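With that change, the build stage of Dockerfile.prod would look roughly like this (a sketch; the ENV CI=true line is an assumption added so react-scripts test runs once during the build instead of entering watch mode):
# build environment
FROM node:13.12.0-alpine as builder
WORKDIR /app
ENV PATH /app/node_modules/.bin:$PATH
# assumption: CI=true makes react-scripts run the tests once, non-interactively
ENV CI=true
COPY package.json ./
COPY package-lock.json ./
RUN npm ci
RUN npm install react-scripts@3.4.1 -g --silent
COPY . ./
# run the tests as part of the image build; a failing test fails the build
RUN npm run test
RUN npm run build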

Docker container works locally but not when uploaded to elasticbeanstalk

My Docker container works locally; I'm trying to deploy it on Elastic Beanstalk using Travis.
My Travis build is successful, and the Docker container has been tested locally and works. On AWS Elastic Beanstalk I get a "Not a file/Directory" error for my build directory.
Dockerfile
FROM node:alpine as builder
WORKDIR '/app'
COPY package.json .
RUN npm install
COPY . .
CMD ["npm", "run", "build"]
#Run Phase
FROM nginx
EXPOSE 80
COPY --from=builder /app/build /usr/share/nginx/html
Dockerfile.dev
FROM node:alpine
WORKDIR '/app'
COPY package.json .
RUN npm install
COPY . .
CMD ["npm", "run", "start"]
travis.yml
sudo: required
services:
- docker
before_install:
- docker build -t *******/docker -f Dockerfile.dev .
script:
- docker run -e CI=true *******/docker npm run test -- --coverage
deploy:
  provider: elasticbeanstalk
  region: "ap-south-1"
  app: "docker"
  env: "Docker-env-2"
  bucket_name: "***********************"
  bucket_path: "docker"
  on:
    branch: master
  access_key_id: $AWS_ACCESS_KEY
  secret_access_key: $AWS_SECRET_KEY
Following are the logs -
Travis output
Elastic Beanstalk output
Any help would be appreciated, thanks!
To run it locally, I run the following commands:
1) docker build -t *******/docker .
2) docker run -it -p <port>:80 <container_id>
It works as expected and I can reach the server on localhost:<port>.
I've put the same commands in the travis.yml file as well.
There are two Dockerfiles because I only need the "build" directory in the production container, and I can ignore the rest of the directories to save space.
I realized that the build directory was listed in the .gitignore file, preventing travis-ci from accessing it as it isn't in the repo.
Once I removed that entry and re-deployed, it worked perfectly.
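In other words, the .gitignore contained something like this (the exact contents are a guess, shown only for illustration), and deleting the build entry was the fix:
/node_modules
# this entry kept the build directory out of the repo, so Travis never uploaded it
/build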

Not able to access the react application which is in the docker images

I've implemented a React application and packaged it into a Docker image. When I run the following command, I'm not able to access the React application.
Command: docker run -p 3000:3000 reactapp
> warmup@0.1.0 start /usr/src/app
> react-scripts start
ℹ 「wds」: Project is running at http://172.17.0.2/
ℹ 「wds」: webpack output is served from
ℹ 「wds」: Content not from webpack is served from /usr/src/app/public
ℹ 「wds」: 404s will fallback to /
Starting the development server...
I'm getting the above message, but I'm not able to access the React application.
Below is my Dockerfile.
FROM node
# A directory within the virtualized Docker environment
# Becomes more relevant when using Docker Compose later
WORKDIR /usr/src/app
# Copies package.json and package-lock.json to Docker environment
COPY package*.json ./
# Installs all node packages
RUN npm install
# Copies everything over to Docker environment
COPY . .
# Uses port which is used by the actual application
EXPOSE 3000
# Finally runs the application
CMD [ "npm", "start" ]
Check if you are hitting https://github.com/facebook/create-react-app/issues/8688
You might want to run the Docker container with the -it option:
docker run -it -p 3000:3000 reactapp

React script exits immediately after starting development server using docker run command

I installed the module called create-react-app, then executed the create-react-app frontend command to generate the project. After the project was generated, I made a custom Dockerfile (Dockerfile.dev) for building a custom image. The image was built successfully, but while running that image using the command docker run b99c49b119be it exits immediately after saying "Starting the development server...". See below for the error.
I ran this command to build the custom image: docker build -f Dockerfile.dev .
Error
Successfully built b99c49b119be
[root@client frontend]# docker run b99c49b119be
> frontend@0.1.0 start /app
> react-scripts start
ℹ 「wds」: Project is running at http://172.17.0.2/
ℹ 「wds」: webpack output is served from
ℹ 「wds」: Content not from webpack is served from /app/public
ℹ 「wds」: 404s will fallback to /
Starting the development server...
Dockerfile
FROM node:alpine
WORKDIR '/app'
COPY package.json .
RUN npm install
COPY . .
CMD ["npm", "run", "start"]
For development purposes, just use npm start, which uses react-scripts to run and watch your application for changes.
To deploy your application using Docker, you can use a static file server like serve in Docker. Use the Dockerfile below.
FROM node:8-alpine
RUN npm install serve -g
COPY build/ .
EXPOSE 5000
CMD ["serve", "-s", "."]
Before you build the image, run npm run build to produce a production-ready distribution, which is a collection of all the files in the build folder of your project.
For more details on how to deploy a React app created with create-react-app, see the create-react-app docs.
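For example, the full flow would look something like this (the image tag my-react-app is just a placeholder, and this assumes serve listens on port 5000 as the EXPOSE line suggests):
npm run build                          # produce the production build/ folder
docker build -t my-react-app .         # copy build/ into the serve image
docker run -p 5000:5000 my-react-app   # serve it on http://localhost:5000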
You need to add the -it flag to run the container in interactive mode (keep STDIN open and allocate a pseudo-tty):
docker run -it CONTAINER_ID
This is caused by a recent update in the Create React App library, and adding this flag should resolve your issue.

can't run react app with docker container

I have a React app which simply shows a hello-world message, but when I try to run the app through a Docker container I have this problem. After this message, the process stops without running the app:
ℹ 「wds」: Project is running at http://172.17.0.2/
ℹ 「wds」: webpack output is served from
ℹ 「wds」: Content not from webpack is served from /app/public
ℹ 「wds」: 404s will fallback to /
Starting the development server...
I can't understand what I should do, because I have a very small app with basic code in the Dockerfile:
FROM node:alpine
RUN mkdir /app
COPY . /app
WORKDIR /app
COPY package.json ./
RUN npm install
CMD ["npm", "start"]
Do I need to install webpack-dev-server? I tried, but got a version error saying the manually added server has a lower version than the one already installed, so I re-installed webpack-dev-server.
I created the app with create-react-app, so I think every dependency is managed automatically.
Does anyone have an idea how I can solve the problem? Thanks in advance.
Command which I use to build: docker build . -t lucki
Command to run the image: docker run -p 3000:3000 lucki
This is the project structure:
After adding DEBUG=* in the Dockerfile, I get this response:
The problem is that dev mode will not run if it is not attached to an interactive terminal.
Change your docker command to include an interactive terminal:
Add -it to your docker run command (-i interactive, -t pseudo-TTY) e.g. docker run -it -p 3000:3000 your_container
Canonical troubleshooting
Make sure the code runs without docker
Does npm start work on the command line?
Showing debug info
Add DEBUG=* as an environment variable inside your container. DEBUG is an environment variable which controls logging for many Node modules.
In your Dockerfile, add
ENV DEBUG=*
Or on the command line, add -e 'DEBUG=*' to your docker command.
This may help spot error messages which are somehow getting swallowed
Run node directly
Instead of running npm start, run your file directly.
e.g. in your Dockerfile,
CMD ["node", "index.js"]
Try running another docker container
If this is a problem with your docker setup, running a known good container may help you discover it.
docker run --rm -it node:alpine
Improvements
Your Dockerfile could also be simplified a bit.
FROM node:alpine
WORKDIR /app
COPY package*.json ./
RUN npm install --production
COPY . .
CMD ["npm", "start"]
mkdir is not needed, as WORKDIR automatically creates the directory.
package*.json will also copy package-lock.json
--production will skip installing devDependencies
Putting the COPY command last will leverage cache better (you won't have to re-run npm install unless your dependencies have changed)
You might also want to use Tini. Tini forwards signals, which means docker stop and pressing control+c in an interactive terminal will actually stop the node process immediately.
If you are using Docker 1.13+, add --init to the command line to have signals forwarded and processes reaped. On older versions, follow the instructions in the Tini README.
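For example (ports and image name are placeholders):
docker run --init -it -p 3000:3000 your_container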
I faced the same problem and it was fixed by using the following command.
Cause of the problem: due to the React project setup, it requires an input trigger to start the server; if there is none, it will stop automatically.
Fix: add -it to the docker run command.
Example: docker run --name main-app -it -p 3000:3000 main-image-react
I got the same issue.
It was solved when I used:
docker run -it -p 3000:80 <image name>
Using -it in addition to -p and -d solved the issue.
