Docker image wouldn't start React app in Google Compute Engine - reactjs

Backstory: I have copied the files to the Google Compute Engine VM and am trying to run docker-compose up --build, which in theory should start 3 different instances: rasa-ui, rasa-api and rasa-actions. However, it does not start the one that has the React app on it (rasa-ui). I figured the Dockerfile might be the problem.
I tried running only the Docker image for the React app (rasa-ui) and it doesn't start the app at all.
On my local machine the code runs and starts the app fine.
Commands I used (and that worked) for docker images locally were:
docker build -t rasa-ui -f Dockerfile .
docker run -p 3000:3000 rasa-ui
When I use the same commands in the VM, it builds the image, but doesn't run it (and doesn't show any errors when I run it).
Dockerfile:
FROM node:alpine
WORKDIR /app
COPY package.json ./
COPY package-lock.json ./
COPY ./ ./
RUN npm i
CMD ["npm", "run", "start"]
Any suggestions?
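A debugging sketch (a suggestion, not from the original post): one common cause of this symptom is that create-react-app's dev server exits as soon as stdin closes, so keeping stdin open with -it and checking the container's status and logs usually surfaces the problem:

```shell
# Build with an explicit build context (the trailing dot is required)
docker build -t rasa-ui -f Dockerfile .

# react-scripts start can exit when stdin is closed, so keep it open with -it;
# in docker-compose this corresponds to stdin_open: true and tty: true
docker run -it -p 3000:3000 --name rasa-ui rasa-ui

# If it still exits silently, inspect status and logs for the startup error
docker ps -a --filter name=rasa-ui
docker logs rasa-ui
```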

Related

Trouble running a docker container for react-app where package.json is in a subfolder

I am trying to create a dockerfile for a project that has the following folder structure:
HDWD-project
|_ client
|  |_ package.json
|_ server
   |_ package.json
Client is a React app and I am just working with this at the moment, before including server, which is the backend.
I am having real trouble figuring out the logic of the Dockerfile and have googled furiously for the last two days. All the examples are too easy.
I just can't seem to get the React app to start in the container, and I get varying error messages. But I need to know the Dockerfile is fine before I proceed.
FROM node:latest
WORKDIR HDWD-project
COPY ./client/package.json .
RUN npm install
COPY . .
RUN cd client
CMD ["npm", "start"]
Going forward I have a script that can start both the server and the client, but I'm just trying to get my head around docker and getting the client frontend to run fine.
Would anyone be able to correct me on where the issue in this config is and explain it?
This is a Dockerfile for the frontend (client in your case). You can put it under your client folder and build the image from there with docker build -t image-name:tag-name . Note that in your original file, RUN cd client only affects that one layer; every later instruction starts from WORKDIR again, so the CMD still runs outside client.
# Pull the latest node image from dockerhub
FROM node:latest
# Create app directory
WORKDIR /usr/src/app
# Copy package.json and package-lock.json to the workdir
COPY package*.json ./
# Install the dependencies
RUN npm install
# Bundle app source
COPY . .
# Run the app in docker
CMD ["npm", "start"]
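Alternatively, if you would rather keep the Dockerfile at the root of HDWD-project and build with the whole project as context, a sketch (the paths assume the folder structure shown above):

```dockerfile
FROM node:latest
# WORKDIR persists across layers, unlike `RUN cd client`
WORKDIR /usr/src/app
# Copy only the client's package manifests first, for layer caching
COPY client/package*.json ./
RUN npm install
# Then copy the rest of the client source
COPY client/ ./
CMD ["npm", "start"]
```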

How to maintain a single dockerfile for all the environments for react js?

I have written the following Dockerfile after looking at lots of implementations of React with multi-stage builds. I am trying to have a single Dockerfile for all the environments. Currently dev + prod, but in future: dev, qa, automation, staging, pt, preprod, prod.
# Create a base image with dependencies
FROM node:16.15.0-alpine3.14 AS base
ENV NPM_CONFIG_LOGLEVEL warn
RUN addgroup app && adduser -S -G app app
USER app
WORKDIR /app
ENV PATH /app/node_modules/.bin:$PATH
COPY package.json yarn.lock ./
# Create a development environment
FROM base AS development
ENV NODE_ENV development
RUN yarn install --frozen-lockfile
COPY . .
EXPOSE 3000
CMD [ "yarn", "start" ]
# Generate build
FROM base AS builder
ENV NODE_ENV production
RUN yarn install --production
COPY . .
RUN yarn build
# Create a production image
FROM nginx:1.21.6-alpine as production
ENV NODE_ENV production
COPY --from=builder /app/build /usr/share/nginx/html
COPY nginx.conf /etc/nginx/conf.d/default.conf
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
So I have a couple of questions:
In the above Dockerfile, when I target production in my docker-compose.yml file, will it run the development stage as well?
If so, how can I avoid it being run? The development stage has a different yarn install command and it also copies the src folder, which would be redundant in the production stage.
Should I strive to have a single Dockerfile and multiple docker-compose.yml files like docker-compose.dev.yml, docker-compose.prod.yml etc?
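On the first two questions: with BuildKit (the default builder in recent Docker releases), only the stages that the requested target depends on are built, so selecting the production target skips the development stage entirely. A sketch (the image tags here are hypothetical):

```shell
# Builds only the base -> builder -> production chain;
# the unreferenced development stage is skipped by BuildKit
docker build --target production -t myapp:prod .

# Build the development image for local work
docker build --target development -t myapp:dev .
```

In docker-compose.yml, the same selection is made with build.target under the service.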
You can have a base file docker-compose.yml which contains the common services. You can then create docker-compose.prod.yml, docker-compose.dev.yml, etc., which contain environment-specific overrides. When executing the docker compose command, you can specify multiple files and the configuration will be merged and executed as a single file.
docker-compose -f docker-compose.yml -f docker-compose.prod.yml up -d
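A minimal sketch of that override pattern (service and target names are hypothetical):

```yaml
# docker-compose.yml — common definition shared by every environment
services:
  web:
    build:
      context: .
    ports:
      - "3000:3000"

# docker-compose.prod.yml — merged on top for production
services:
  web:
    build:
      target: production
    ports:
      - "80:80"
```

Note that list-valued options such as ports are concatenated rather than replaced when files are merged, so in practice the base file is often kept free of ports and each environment file declares its own.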

Docker not detecting changes for react app in windows

I generated a base react app with create-react-app and generated a Dockerfile.dev inside of the project directory
FROM node:16-alpine
WORKDIR '/app'
COPY ./package.json ./
RUN npm install
COPY ./ ./
CMD [ "npm","start" ]
Ran the build with docker build -f Dockerfile.dev -t dawnmd/react .
Started the docker container with docker run -it -p 3000:3000 -v /app/node_modules -v ${PWD}:/app dawnmd/react
The app does not detect changes from the host (i.e. Windows 11) when I change something in the host files.
It's true that CHOKIDAR_USEPOLLING=true doesn't work with docker and windows 11.
However there is a workaround - you can use vscode remote container extension found here:
VSCode Documentation: https://code.visualstudio.com/docs/remote/containers
VSCode Remote Download: https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-containers
If you have Docker Desktop, VSCode, and the VSCode Remote Containers extension installed, you can then do the following:
1. Open the React project in VSCode.
2. Press CTRL + SHIFT + P, then type and select: "Remote-Containers: Open Folder in Container".
VSCode Open in Remote Containers
3. It will then ask to "Add Development Container Configuration Files", where you can select one of the following: from a Dockerfile, from docker-compose, or from a predefined container configuration. Your project may be different from mine, so the selection is up to you.
VS Code Remote-Container Configuration
4. Once everything has loaded, select Terminal from the VSCode menu, then New Terminal, and type "npm start" or "yarn start" as shown in the figure.
Running React in Remote-Container
*Opinion: the benefit of running React in a VSCode remote container is that the computer doesn't have to work as hard, since React is recompiled/rebuilt only on demand, whenever you save a JS file. CHOKIDAR_USEPOLLING, by contrast, polls for changes every second, which made my computer struggle between the constant watching and the constant reloads in the browser.
Note: any changes made in the remote container are also saved on the host machine.
Note: you would have to use Git Bash or another terminal (i.e. PowerShell) to execute Git commands, as VSCode opens a terminal that resides inside the container.
For windows using power-shell the run command will be:
docker run -it --rm -v ${PWD}:/app -v /app/node_modules -p 3000:3000 -e CHOKIDAR_USEPOLLING=true <IMAGE>
CHOKIDAR_USEPOLLING=true enables a polling mechanism in chokidar (the file watcher used by the dev server, which wraps fs.watch, fs.watchFile, and fsevents) so that hot reloading works across the bind mount.
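The same run command can be written as a compose service (the service name here is hypothetical):

```yaml
services:
  react:
    build:
      context: .
      dockerfile: Dockerfile.dev
    ports:
      - "3000:3000"
    volumes:
      - .:/app
      - /app/node_modules   # anonymous volume so the host mount doesn't hide node_modules
    environment:
      - CHOKIDAR_USEPOLLING=true
```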

docker bind mount not working in react app

I am using Docker Toolbox on Windows Home and having trouble figuring out how to get a bind mount working in my frontend app. I want changes to be reflected when I change content in the src directory.
App structure:
Dockerfile:
FROM node
WORKDIR /app
COPY package.json .
RUN npm install
COPY . .
EXPOSE 3000
CMD [ "npm", "start" ]
Docker commands:
(within the frontend dir) docker build -t frontend .
docker run -p 3000:3000 -d -it --rm --name frontend-app -v ${cwd}:/app/src frontend
Any help is highly appreciated.
EDIT
cwd -> E:\docker\multi\frontend
cwd/src is also not working. However, I find that with /e/docker/multi/frontend/src the changes are reflected upon re-running the same image.
I have run into the same issue. It feels like we should use nodemon to look for file changes and restart the app, because that is what the projects in the Docker reference and tutorials do.
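Docker Toolbox runs the daemon inside a VirtualBox VM that by default only shares C:\Users, and host paths must be written in POSIX style, which is consistent with the /e/... path working above. A sketch of the mount (the /app/src target assumes the Dockerfile's WORKDIR /app):

```shell
# Docker Toolbox expects /e/... instead of E:\...; a drive other than C:
# must also be added as a VirtualBox shared folder for the mount to work
docker run -p 3000:3000 -d -it --rm --name frontend-app \
  -v /e/docker/multi/frontend/src:/app/src frontend
```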

Why does docker run do nothing when I try to run my app?

I made a website in React and I'm trying to deploy it to an Nginx server by using Docker. My Dockerfile is in the root folder of my project and looks like this:
FROM tiangolo/node-frontend:10 as build-stage
WORKDIR /app
COPY . ./
RUN yarn run build
# Stage 1, based on Nginx, to have only the compiled app, ready for production with Nginx
FROM nginx:1.15
COPY --from=build-stage /app/build/ /usr/share/nginx/html
# Copy the default nginx.conf provided by tiangolo/node-frontend
COPY --from=build-stage /nginx.conf /etc/nginx/conf.d/default.conf
When I run docker build -t mywebsite . on the docker terminal I receive a small warning that I'm building a docker image from windows against a non-windows Docker host but that doesn't seem to be a problem.
However, when I run docker run mywebsite nothing happens, at all.
In case it's necessary, my project website is hosted on GitHub: https://github.com/rgomez96/Tecnolab
What are you expecting? Nothing will happen on the console except the nginx log.
You should see something happening if you go to http://ip_of_your_container.
Otherwise, you can just launch your container with this command:
docker container run -d -p 80:80 mywebsite
With this command you'll be able to connect to your nginx at http://localhost, as you are forwarding all traffic from port 80 of your host to port 80 of your container.
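A quick way to confirm the container is up and serving (a hypothetical check, not from the original answer):

```shell
docker container ls       # mywebsite should be listed with 0.0.0.0:80->80/tcp
curl -I http://localhost  # an HTTP response here means nginx is serving the build
```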