Hot reload on React with docker-compose failing on Windows - reactjs

I'm trying to get React to update the content of the site when a file is saved. I'm using VS Code, which doesn't have safe write. I'm using docker-compose on Windows via Docker Desktop.
Dockerfile:
FROM node:17-alpine
WORKDIR /front
ARG FRONT_CMD
ARG API_HOSTNAME
ENV REACT_APP_API_HOSTNAME=$API_HOSTNAME
COPY . .
RUN npm i @emotion/react @emotion/styled
CMD $FRONT_CMD
relevant part of docker-compose.yml:
frontend:
  volumes:
    - ./frontend/src:/front/src
    - /front/node_modules
  build:
    context: ./frontend
    dockerfile: Dockerfile
    args:
      - FRONT_CMD=${FRONT_CMD}
      - API_HOSTNAME=${API_HOSTNAME}
  env_file:
    - .env.dev
  networks:
    - internal
  environment:
    - CHOKIDAR_USEPOLLING=true
    - FAST_REFRESH=false
    - NODE_ENV=development
Everything is running behind Traefik. CHOKIDAR_USEPOLLING and FAST_REFRESH seem to make no difference. I start with 'docker-compose --env-file .env.dev up'; within that file FRONT_CMD="npm start", which behaves just fine. The .env.dev file should be a clear indication of a dev build to React (and is; it works the same without the addition), but I added NODE_ENV just to be safe. I tried adding all of them to the build args to be extra sure, but nothing changes. The React files live in the 'frontend' folder, which is in the same location as docker-compose.yml.
Every time React says it compiled successfully and warns me that it's a development build.
The only suspicion I have left is that there's some issue with files being updated locally on Windows while Docker uses Linux, but I have no idea where to go from there even if that's the case.

The shortest way I found was to start from the other side: attach the editor to the container instead of updating the container based on changes in the local file system. I followed the guide here: https://code.visualstudio.com/docs/remote/attach-container
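One more thing worth checking before giving up on host-side watching: CHOKIDAR_USEPOLLING only affects webpack 4 / react-scripts 4 and earlier. If the project is on react-scripts 5 (webpack 5), the watcher is Watchpack and polling is enabled with a different variable. A sketch of the environment section under that assumption (the react-scripts version is not stated in the question):

```yaml
environment:
  - WATCHPACK_POLLING=true    # react-scripts 5 / webpack 5
  - CHOKIDAR_USEPOLLING=true  # react-scripts 4 / webpack 4 (no effect on 5)
```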

Related

Code Error : While running docker container with docker volume

I am facing the issue below.
I have a Docker container (a React app) running with a volume attached to it, so that my code changes are reflected automatically, and it's working fine.
The relevant part of the docker-compose file is below:
volumes:
  - ".:/app"
  - "./src:/web/src"
  - "/app/node_modules"
ports:
  - "3000:3000"
  - "35729:35729"
Now I have created a clone of the same repo and pulled it to my local machine into a different folder. I am trying to run the container with
docker-compose up -d --build
It spins up the container and replaces the existing one, but throws an error while the app is running:
service-worker.js:26 Uncaught (in promise) TypeError: Failed
to fetch
at service-worker.js:26:16 (anonymous) @ service-worker.js:26
log.js:21 [HMR] Waiting for update signal from WDS... main.chunk.js:10
Uncaught Error: Module build failed (from
./node_modules/sass-loader/dist/cjs.js): SassError: Can't find
stylesheet to import. ╷ 1 │ @import "./themes.scss"; │
^^^^^^^^^^^^^^^
But when I comment out the following line in the volumes section of the docker-compose file and run docker-compose up, there is no error and the app loads fine:
- "./src:/web/src"
I tried removing the volumes, deleting the container, and running it again. But when the above line is present in docker-compose.yml it does not work.
Any idea how to resolve this and keep the volume attached?
Try deleting the directory or files where the volumes were stored and
run docker compose up --build -d rather than docker compose up -d --build.

Running react-snap on AWS codebuild

I have a React website that I host on AWS. I have created a pipeline in AWS CodePipeline that connects to my GitHub repository, automatically builds the project using CodeBuild, and deploys it to S3.
I'm trying to add react-snap to the project. It works well locally, but when I build it in CodeBuild I get this error:
Error: Failed to launch chrome!
/codebuild/output/src159566889/src/node_modules/puppeteer/.local-chromium/linux-686378/chrome-linux/chrome: error while loading shared libraries: libXss.so.1: cannot open shared object file: No such file or directory
TROUBLESHOOTING: https://github.com/GoogleChrome/puppeteer/blob/master/docs/troubleshooting.md
at onClose (/codebuild/output/src159566889/src/node_modules/puppeteer/lib/Launcher.js:348:14)
at Interface.<anonymous> (/codebuild/output/src159566889/src/node_modules/puppeteer/lib/Launcher.js:337:50)
at Interface.emit (events.js:326:22)
at Interface.close (readline.js:416:8)
at Socket.onend (readline.js:194:10)
at Socket.emit (events.js:326:22)
at endReadableNT (_stream_readable.js:1241:12)
at processTicksAndRejections (internal/process/task_queues.js:84:21)
error Command failed with exit code 1.
I have tried to google it but didn't find anything specific to CodeBuild and react-snap. I found similar questions about running Chrome on CodeBuild, but they related to different environments like Angular, so I wasn't able to copy their solutions.
This is what my current buildspec.yaml file looks like
version: 0.2
env:
  variables:
    S3_BUCKET: "xyz"
    STAGE: "beta"
phases:
  install:
    commands:
      - yarn install
  build:
    commands:
      - echo "Building for $STAGE"
      - yarn build
      - sam package --template-file cloudformation/Root.json --s3-bucket ${S3_BUCKET} --s3-prefix WebsiteCF/${CODEBUILD_RESOLVED_SOURCE_VERSION} --output-template-file build/packaged-template.yaml
artifacts:
  files:
    - '**/*'
  base-directory: 'build'
Based on the instructions in the link provided by the error, I tried adding this, but it didn't work:
install:
  commands:
    - PYTHON=python2 amazon-linux-extras install epel -y
    - yum install -y chromium
    - yarn install
I managed to get it working using these steps:
Make sure your CodeBuild project uses the aws/codebuild/standard:5.0 image
(go to CodeBuild -> Edit -> Environment -> Override image).
Create an addArgs.sh file in your project with this content:
# modifies react-snap defaultOptions to add the --no-sandbox and --disable-setuid-sandbox flags so that puppeteer/chromium can run in the codebuild standard image
sed -i "s/puppeteerArgs: \[\],/puppeteerArgs: \[\"--no-sandbox\", \"--disable-setuid-sandbox\"\],/" ./node_modules/react-snap/index.js
echo changed arguments in react-snap
To your buildspec.yml file, add these lines to the install stage
# Install chrome headless
- apt-get -y update
- apt-get --assume-yes install chromium-browser
- sh ./addArgs.sh # run custom script to change options on react-snap to make it work
I found the answer from here - https://github.com/stereobooster/react-snap/issues/122
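To see exactly what the sed line does before pointing it at node_modules, it can be dry-run against a scratch file containing the default option line (the path /tmp/react-snap-index.js here is just a stand-in for ./node_modules/react-snap/index.js):

```shell
# Create a scratch file with the default react-snap option line
printf 'puppeteerArgs: [],\n' > /tmp/react-snap-index.js

# Same substitution idea as addArgs.sh: add the sandbox flags
sed -i 's/puppeteerArgs: \[\],/puppeteerArgs: ["--no-sandbox", "--disable-setuid-sandbox"],/' /tmp/react-snap-index.js

cat /tmp/react-snap-index.js
# puppeteerArgs: ["--no-sandbox", "--disable-setuid-sandbox"],
```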

GitLab CI Pipeline Job gives error JavaScript heap out of memory

We have a WordPress plugin written in JS with the help of the tool wp-reactivate.
Our goal is a GitLab CI pipeline that increments the version in all places, builds the project, and deploys it to the WordPress.org SVN repository. So far the SVN deployment works (incrementing the version number is not implemented yet), but we have a problem building the project: the GitLab CI runner refuses to finish the process because it runs out of available memory.
We have already tried (with no effect):
Setting GENERATE_SOURCEMAP=false
Setting NODE_OPTIONS="--max_old_space_size=8192"
Running node --max-old-space-size=8192
Our .gitlab-ci.yml file:
stages:
  - build
  - deploy

default:
  image: node

BuildApp:
  stage: build
  before_script:
    - GENERATE_SOURCEMAP=false
    - NODE_OPTIONS=\"--max_old_space_size=8192\"
    - node --max-old-space-size=8192
  script:
    - yarn
    - yarn prod

PluginSVN:
  stage: deploy
  before_script:
    - apt-get install subversion
    - curl -o /usr/bin/deploy.sh https://git-cdn.e15r.co/open-source/wp-org-plugin-deploy/raw/master/scripts/deploy.sh
    - chmod +x /usr/bin/deploy.sh
  script: /usr/bin/deploy.sh
  when: on_success
Is there any way to increase the amount of available memory, or reduce the amount of memory required for building the project?
Check the GitLab forum:
Every shared runner has only 1 CPU and 4 GB RAM, which means adjusting the Node options won't help.
For me, self-hosted was the answer. Whether I installed gitlab-runner directly on the host or in Docker, I got the same issue.
I finally found the root cause: the EC2 instance I created was too small (a t2.micro).
After resizing it to a t3.medium (any instance type with 4 GB+ of memory should be fine), the build works without this issue.
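A side note on why the listed attempts may have had no effect: bare assignments in before_script set shell variables but are not exported to child processes such as yarn, and `node --max-old-space-size=8192` just starts a throwaway Node process. A hedged sketch using a variables: block instead (the 4096 value is an assumption; size it to the runner's actual memory):

```yaml
BuildApp:
  stage: build
  variables:                                   # CI variables are exported to child
    GENERATE_SOURCEMAP: "false"                # processes, unlike bare assignments
    NODE_OPTIONS: "--max-old-space-size=4096"  # in before_script
  script:
    - yarn
    - yarn prod
```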

Failure in build using Travis, AWS Elasticbeanstalk and Docker

I am facing an issue while building my React project using GitHub as the repository, Travis as CI, and AWS Elastic Beanstalk as the service to run my app with Docker. I am able to run my test suite, but after that it does not deploy my app to AWS, and I am not getting any error in the Travis console except the one below.
Below is my Travis .yml file configuration:
language: generic
services:
  - docker
before_install:
  - docker build -t heet1996/my-profile -f Dockerfile.dev .
script:
  - docker run heet1996/my-profile npm run test -- --coverage
deploy:
  provider: elasticbeanstalk
  region: "us-east-1"
  app: "My-profile"
  env: "MyProfile-env"
  bucket_name: "elasticbeanstalk-us-east-1-413920612934"
  bucket_path: "My-profile"
  on:
    branch: master
  access_key_id: $AWS_ACCESS_KEY
  secret_access_key: "$AWS_SECRET_KEY"
Let me know if you need more information
A couple of things you could try:
Your script command needs to set the environment variable CI=true.
So
script:
  - docker run heet1996/my-profile npm run test -- --coverage
becomes
script:
  - docker run -e CI=true heet1996/my-profile npm run test -- --coverage
Also AWS needs the access variables to be named differently.
Change
access_key_id: $AWS_ACCESS_KEY
secret_access_key: "$AWS_SECRET_KEY"
To
access_key_id: "$AWS_ACCESS_KEY_ID"
secret_access_key: "$AWS_SECRET_ACCESS_KEY"
Using the option --coverage, your test will hang waiting for input, hence the message: "...no output has been received in the last 10m0s...".
At some point --coverage was probably able to stop the tests (some used it for that purpose), but I guess it was not meant for that, and subsequent versions removed that behavior.
Your test must conclude and the conclusion be a success for the deployment by Travis to begin.
Use instead the option --watchAll=false. So you should have:
...
script:
  - docker run heet1996/my-profile npm run test -- --watchAll=false
...
That takes care of the obvious issue of your test never concluding (which could be the only issue). Afterward, make sure that your tests succeed. Then you can worry about other issues such as authentication on AWS, etc.

Reading an environment variable in react which was set by docker

I am using docker to build my react application and deploy it in nginx.
I have set an environment variable in docker-compose.yml
version: '2'
services:
  nginx:
    container_name: ui
    environment:
      - HOST_IP_ADDRESS=xxx.xxx.xx.xx
    build:
      context: nginx/
    ports:
      - "80:80"
After the Docker container is created, I can see the value when I echo the variable inside the container.
However, when I try to read it in React using process.env.HOST_IP_ADDRESS, it logs undefined.
I read in a blog post somewhere that env variables can only be accessed in a production environment. Since I am building the app and deploying it to nginx, I should be able to access it, but for some reason I am not able to read it.
Am I doing something fundamentally wrong here? If so, please let me know the solution. I am not a React expert; I am just managing someone else's code.
UPDATE:
The Dockerfile looks as follows:
FROM node:8 as ui-builder
WORKDIR /home/ui
COPY helloworld .
RUN npm install
RUN npm run build
FROM nginx
COPY --from=ui-builder /home/ui/build /usr/share/nginx/html
CMD ["nginx", "-g", "daemon off;"]
The React Component snippet is as follows:
import React, { Component } from 'react';
class HelloWorld extends Component {
render() {
console.log(process.env.HOST_IP_ADDRESS);
return (
<div className="helloContainer">
<h1>Hello, world!</h1>
</div>
);
}
}
export default HelloWorld;
I would like to thank everyone who posted answers and comments. The problem I was facing was solved using a combination of these answers and some help from other resources.
As suggested by @DavidMaze (in the comments), I started looking into the webpack config present in my code. I found that webpack was reading all the environment variables declared inside the container.
So I started experimenting with my Dockerfile and docker-compose.yml, as I realized that REACT_APP_HOST_IP_ADDRESS was not being passed as an environment variable when React was building the code.
The first thing I changed was the Dockerfile. I statically declared the IP inside the Dockerfile for testing:
ENV REACT_APP_HOST_IP_ADDRESS localhost
By doing this I was able to see the value localhost among the env variables read by webpack.
Next I tried passing the env variable from docker-compose to the Dockerfile as suggested by @Alex in his answer, but it didn't work.
So I referred to https://github.com/docker/compose/issues/5600 and changed docker-compose.yml and the Dockerfile as follows:
docker-compose.yml
version: '2'
services:
  nginx:
    container_name: ui
    build:
      context: nginx/
      args:
        REACT_APP_HOST_IP_ADDRESS: ${IP_ADDRESS}
    ports:
      - "80:80"
where IP_ADDRESS is exported as an env variable.
Dockerfile
FROM node:8 as ui-builder
WORKDIR /home/ui
COPY helloworld .
RUN npm install
ARG REACT_APP_HOST_IP_ADDRESS
ENV REACT_APP_HOST_IP_ADDRESS $REACT_APP_HOST_IP_ADDRESS
RUN npm run build
FROM nginx
COPY --from=ui-builder /home/ui/build /usr/share/nginx/html
CMD ["nginx", "-g", "daemon off;"]
React Component
import React, { Component } from 'react';
class HelloWorld extends Component {
render() {
console.log(process.env.REACT_APP_HOST_IP_ADDRESS);
return (
<div className="helloContainer">
<h1>Hello, world!</h1>
</div>
);
}
}
export default HelloWorld;
This configuration makes the variables passed via args in docker-compose available as ARGs during the image build, where they can in turn be declared as env variables that React uses during the build, provided webpack reads the env variables.
Webpack can read the env variables using DefinePlugin:
https://webpack.js.org/plugins/define-plugin/.
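For intuition: DefinePlugin does not make a real environment available in the browser; it textually replaces process.env.* expressions with literal values at build time. A toy sketch of that substitution (inlineEnv is a made-up helper for illustration, not the webpack API):

```javascript
// Toy model of DefinePlugin: replace "process.env.NAME" in source text
// with the JSON-encoded value known at build time.
function inlineEnv(source, env) {
  return source.replace(/process\.env\.([A-Z_][A-Z0-9_]*)/g, (match, name) =>
    name in env ? JSON.stringify(env[name]) : match
  );
}

const bundled = inlineEnv(
  "const url = process.env.REACT_APP_API_HOSTNAME;",
  { REACT_APP_API_HOSTNAME: "http://api:9200" }
);
console.log(bundled); // const url = "http://api:9200";
```

This is why the variable must exist at build time (as an ARG/ENV in the builder stage); a variable set only on the running nginx container is never seen by the already-built bundle.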
Make sure you prefix your variables with REACT_APP_ (as seen here), otherwise it won't be picked up by React.
You should check the following:
I. Your env variables have the REACT_APP_ prefix.
II. Your Dockerfile has ARG and ENV commands like:
ARG REACT_APP_DEBUG
ENV REACT_APP_DEBUG=$REACT_APP_DEBUG
III. You pass the value as a build arg.
In docker-compose.yml it looks like:
services:
  my-app:
    build:
      args:
        - REACT_APP_DEBUG=True
or in docker build it looks like
docker build -t my_app:dev --build-arg REACT_APP_DEBUG=True .
Env variables should start with REACT_APP_, otherwise they are not picked up and your environment variable will not work:
environment:
  - REACT_APP_DEBUG=TRUE
Otherwise, docker-compose.yml is not valid and you will see an error message:
services.client.environment contains an invalid type, it should be an object, or an array
Here is a working sample:
docker-compose.yml
version: "3.3"
services:
  client:
    container_name: client
    environment:
      - REACT_APP_DEBUG=TRUE
    build:
      dockerfile: Dockerfile
      context: ./web/client
Dockerfile
FROM node:6.0.0
# Set env variable
ARG REACT_APP_DEBUG
ENV REACT_APP_DEBUG=$REACT_APP_DEBUG
# that will be empty
RUN echo "DEBUG": $REACT_APP_DEBUG
Run:
docker-compose run client node
> process.env.REACT_APP_DEBUG
'TRUE'
Here is my solution using ENV in my Dockerfile, DefinePlugin in the webpack.config.js and process.env in my javascript codes:
First set you environment variable and its value in your Dockerfile :
...
RUN npm install
ENV MY_ENV_VAR my_env_value
...
Then using DefinePlugin plugin, add it to process.env in webpack.config.js:
const webpack = require('webpack');
...
plugins: [
  new webpack.DefinePlugin({
    'process.env.MY_ENV_VAR': JSON.stringify(env.MY_ENV_VAR),
  }),
],
...
...
And finally use the env variable in your code:
const host = process.env.MY_ENV_VAR || 'a_default_value_in_case_no_env_is_found';
I checked how it's done in API Platform; the config just defines constants based on env (the '.env' file):
export const API_HOST = process.env.REACT_APP_API_ENTRYPOINT;
export const API_PATH = '/';
Importing this gives you a single value (API_HOST), whereas process.env.HOST_IP_ADDRESS refers to a deep object structure that is unavailable at runtime.
I use GitHub CI to set secrets per environment. First, in the GH Actions file I use run: docker build ... --build-arg REACT_APP_APIURL=${{ secrets.REACT_APP_APIURL }} .
Then I use them in the Dockerfile in my repo to build the final image with the React app:
ARG REACT_APP_APIURL
RUN test -n "$REACT_APP_APIURL" || (echo "REACT_APP_APIURL not set in GH/your environment" && false)
...
RUN npm run build
This value is picked up by npm run build automatically (used in my React TypeScript codebase as process.env.REACT_APP_APIURL). I chose to check for the value and let the app fail immediately on load if something is wrong with my Docker image or configuration somewhere.
export const config = {
apiUrl: setApiUrlFromEnv(),
};
function setApiUrlFromEnv() {
if (process.env.REACT_APP_APIURL) {
return process.env.REACT_APP_APIURL;
} else {
// if something goes wrong in setup, do not start app and show err directly in web console
throw new Error(
`ENV not configured properly (REACT_APP_APIURL) to use desired ENV variables (${process.env.REACT_APP_APIURL})`
);
}
}
Step 1: Add args for the env to the docker-compose file.
We use args instead of the environment field because environment values are not available at build time.
services:
  ...
  web_app:
    build:
      context: .
      dockerfile: Dockerfile
      args:
        - MY_ENV=HELLO_WORLD
Step 1 (alternative): If the image is built by Cloud Build instead of docker-compose, add the args to the cloudbuild.yml file:
steps:
  - name: ...
    args:
      ...
      - --build-arg
      - MY_ENV=HELLO_WORLD
Step 2: Add ARGs and ENVs to the Dockerfile.
We use the ARG command to receive variables from the docker-compose args, and ENV to set the env for the build:
ARG MY_ENV
ENV MY_ENV=$MY_ENV
RUN echo "$MY_ENV"
Step 3: Update the webpack config.
Use webpack.ProvidePlugin({ process: "process/browser" }) to enable process in the web app.
Use webpack.DefinePlugin to define the env variables available to the web app.
Add the process library as a dependency with npm i -S process.
plugins: [
  new webpack.ProvidePlugin({
    process: "process/browser"
  }),
  new webpack.DefinePlugin({ "process.env": JSON.stringify(process.env) })
]
Technically, we can't use environment variables in browser context, that's why we usually use DefinePlugin or EnvironmentPlugin in webpack based projects like CRA and Vue-CLI to statically replace process.env.* with environment variables.
But this way forces us to rebuild the whole application multiple times (e.g., for development, staging, and production).
To fix this, I want to share you a set of plugins: import-meta-env, with these plugins, you only need to define and pass the env you want to use and the plugins do the rest for you.
During production, you can use this plugin to statically replace import.meta.env.* with some expression (we use import.meta because process.env is a Node specific object), and on the startup of container, you can run a special script to inject your environment variables which may passed from docker run, stored in your Google Cloud Run, etc.
I have also created an example for Docker.
Hope this helps you and people who needs it.
Accessing the container environment at startup time with TypeScript / React / Docker
Here is a solution that works with .env files, which can be included via
env_file: myapp.env in docker-compose, or directly as .env.
The basic idea follows this approach: https://blog.codecentric.de/react-application-container-environment-aware-kubernetes-deployment
Basic idea
Provide a config.js file as a statically hosted resource under public at container startup. Use the entrypoint of the Docker container to generate config.js. Link to config.js within index.html to make it available in the app.
Full working example
Step by step instruction. Git repo here
Create example app
npx create-react-app read-env-example --template typescript
Navigate to fresh app
cd read-env-example
Create Dockerfile
mkdir -p docker/build
docker/build/Dockerfile
# build environment
FROM node:19-alpine3.15 as builder
WORKDIR /app
ENV PATH /app/node_modules/.bin:$PATH
COPY package.json ./
COPY package-lock.json ./
RUN npm ci
RUN npm install react-scripts@5.0.1 -g
COPY . ./
RUN PUBLIC_URL="." npm run build
# production environment
FROM nginx:stable-alpine
COPY --from=builder /app/build /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
COPY docker/build/docker-entrypoint.sh /
RUN chmod +x docker-entrypoint.sh
ENTRYPOINT ["/docker-entrypoint.sh"]
Create docker-entrypoint.sh
This script will be executed at container start.
It generates the config.js file containing all environment variables starting with 'MYAPP' under window.extended.
docker/build/docker-entrypoint.sh
#!/bin/sh -eu
# POSIX function syntax (the "function" keyword is not supported by alpine's /bin/sh)
generateConfigJs(){
  echo "/*<![CDATA[*/"
  echo "window.extended = window.extended || {};"
  for i in $(env | grep '^MYAPP')
  do
    key=$(echo "$i" | cut -d"=" -f1)
    val=$(echo "$i" | cut -d"=" -f2-)
    echo "window.extended.${key}='${val}';"
  done
  echo "/*]]>*/"
}

generateConfigJs > /usr/share/nginx/html/config.js
nginx -g "daemon off;"
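The generator can be exercised locally without Docker to check its output. This sketch inlines the same idea (using a while read loop so values survive word splitting) and prints to stdout instead of the nginx html directory; MYAPP_API_ENDPOINT is just the example variable from this guide:

```shell
# Same idea as docker-entrypoint.sh: turn MYAPP* env vars into config.js lines
generateConfigJs() {
  echo "/*<![CDATA[*/"
  echo "window.extended = window.extended || {};"
  env | grep '^MYAPP' | while IFS='=' read -r key val; do
    echo "window.extended.${key}='${val}';"
  done
  echo "/*]]>*/"
}

export MYAPP_API_ENDPOINT='http://localhost:9200'
generateConfigJs
# /*<![CDATA[*/
# window.extended = window.extended || {};
# window.extended.MYAPP_API_ENDPOINT='http://localhost:9200';
# /*]]>*/
```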
Create docker-compose.yml
mkdir docker/run
docker/run/docker-compose.yml
version: "3.2"
services:
  read-env-example:
    image: read-env-example:0.1.0
    ports:
      - 80:80
    env_file:
      - myapp.env
Create runtime config for your app
docker/run/myapp.env
MYAPP_API_ENDPOINT='http://elasticsearch:9200'
Create config.js <-- this is where .env will be injected.
public/config.js
/*<![CDATA[*/
window.extended = window.extended || {};
window.extended.MYAPP_API_ENDPOINT='http://localhost:9200';
/*]]>*/
Note: This file will be completely overwritten by the docker-entrypoint.sh. For development purposes you can set it to any value that is appropriate, e.g. when used together with npm start.
Include config.js in index.html
public/index.html
<head>
...
<script type="text/javascript" src="%PUBLIC_URL%/config.js" ></script>
...
</head>
<body>
Make use of your environment variable
src/index.tsx
...
declare global {
interface Window { extended: any; }
}
root.render(
<React.StrictMode>
<App {...{MYAPP_API_ENDPOINT:window.extended.MYAPP_API_ENDPOINT}}/>
</React.StrictMode>
);
...
src/App.tsx
...
type Config={
MYAPP_API_ENDPOINT:string
}
function App(props : Config) {
return (
<div className="App">
<header className="App-header">
<div>
You have configured {props.MYAPP_API_ENDPOINT}
</div>
</header>
</div>
);
}
...
src/App.test.tsx
test('renders learn react link', () => {
render(<App {...{MYAPP_API_ENDPOINT:"teststring"}}/>);
const linkElement = screen.getByText(/You have configured teststring/i);
expect(linkElement).toBeInTheDocument();
});
Build and test
npm install
npm test
Create docker image
docker build -f docker/build/Dockerfile -t read-env-example:0.1.0 .
Run container
docker-compose -f ./docker/run/docker-compose.yml up
Navigate to your app
Open http://localhost in your browser.
You will see the content of MYAPP_API_ENDPOINT like you provided in your docker/run/myapp.env.
Further usage
You can provide additional variables starting with MYAPP. The docker-entrypoint.sh script searches for all variables starting with MYAPP and makes them available through the window object.
