Unable to run Cypress in a Dockerized React project - "Please reinstall Cypress by running: cypress install"

I spent two days just trying to get Cypress running in Docker so I can test locally. Here is the Dockerfile:
FROM cypress/base:16 as UIbuilder
ENV NODE_ENV development
WORKDIR /opt/app
COPY custom-ui .
RUN npm install
RUN npm run build
RUN npm pack
FROM cypress/base:16 as APPbuilder
ARG REACT_APP_PORTAL_API
ARG REACT_APP_ENVIRONMENT="production"
ENV REACT_APP_PORTAL_API=$REACT_APP_PORTAL_API
ENV REACT_APP_ENVIRONMENT=$REACT_APP_ENVIRONMENT
WORKDIR /opt/app
ENV APP_DIR /opt/app/
COPY . .
# Build App
COPY --from=UIbuilder /opt/app/*.tgz .
RUN npm install ./*.tgz --legacy-peer-deps
RUN chown -R node /opt/app/node_modules
RUN npm install
USER node
RUN npx cypress verify
# CMD ["ws", "--directory", ".", "--spa", "index.html", "--log.format", "combined"]
EXPOSE 3000
CMD ["npm","start"]
And here is the docker-compose.dev.yml file
version: "3.7"
services:
  app:
    container_name: frontend
    image: frontend
    build: .
    env_file:
      - .env.dev
    ports:
      - "3000:3000"
    volumes:
      - .:/opt/app:rw
      - /opt/app/node_modules
      - ./project-cypress/cypress:/opt/app/cypress
      - ./project-cypress/cypress.config.js:/opt/app/cypress.config.js
    environment:
      - CYPRESS_baseUrl=http://app
    command: npx cypress run
Here is how the project directory is structured (a React project):
frontend
  custom-ui
  build
  cypress
    e2e
    fixtures
    screenshots
    support
    videos
  public
  src
  ....
  cypress.config.js
Every time I run docker-compose -f docker-compose.dev.yml up --build,
I get the following error:
Step 21/23 : RUN npx cypress verify
---> Running in 9c2b1398e724
No version of Cypress is installed in: /home/node/.cache/Cypress/10.8.0/Cypress
Please reinstall Cypress by running: cypress install
----------
Cypress executable not found at: /home/node/.cache/Cypress/10.8.0/Cypress/Cypress
----------
Platform: linux-x64 (Debian - 11.3)
Cypress Version: 10.8.0
ERROR: Service 'app' failed to build : The command '/bin/sh -c npx cypress verify' returned a non-zero code: 1
Any ideas about what I'm doing wrong?
Regards

In your package.json file, try adding:
"cy:run": "cypress install && cypress run"
Then, in the command section of the docker-compose.dev.yml file, run cy:run:
command: npm run cy:run
I know it seems odd (you may ask yourself why you should run cypress install if you're already using a Cypress image), but I had the same problem, and this fixed it for me.
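Put together, the relevant part of package.json would look something like this (a minimal sketch; any other scripts shown here are placeholders, not from the original project):

```json
{
  "scripts": {
    "start": "react-scripts start",
    "cy:run": "cypress install && cypress run"
  }
}
```

The cypress install step downloads the Cypress binary into the container's cache directory before cypress run looks for it, which is why it works even inside an image that already ships Cypress's system dependencies.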

Related

How do you get "react-scripts test" to work with AWS CodePipeline?

I have a React app that I generated using Create React App. I'd like to make sure the tests pass before the build starts in AWS CodePipeline. Here is my buildspec.yml file, which works just fine without the npm test command. Once I add the test command, it runs the tests, which I have set up to fail, and then the build just hangs.
version: 0.2
phases:
  install:
    commands:
      - cd react/menu && npm install
  build:
    commands:
      - npm test
      - npm run-script build
artifacts:
  files: "**/*"
  base-directory: "react/menu/build"
I was able to get it to work by turning off watch mode, as #haseeb-anwar suggested:
version: 0.2
phases:
  install:
    commands:
      - cd react/menu && npm install
  build:
    commands:
      - npm test -- --watchAll=false
      - npm run-script build
artifacts:
  files: "**/*"
  base-directory: "react/menu/build"
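An equivalent approach (not from the original answer): Create React App's test runner also disables watch mode whenever the CI environment variable is set, so the buildspec could set it instead of passing a flag:

```yaml
build:
  commands:
    - CI=true npm test
    - npm run-script build
```

Either way, the tests run once and exit with a non-zero status on failure, which stops the CodeBuild phase instead of hanging in watch mode.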

Google App Engine - Node.js entrypoint as defined in the app.yaml isn't triggering

I have a Node.js application that's been Dockerized with gcr.io/google-appengine/nodejs, and it deploys and runs fine. Knowing that by default GAE runs the "start" script in package.json, I simply replaced what was already there with next start -p 8080, which works without a problem.
I'd like to trigger an alternative script rather than "start", so I created a script called "cloud-start" and plugged in the above command as its value. In app.yaml I added the "entrypoint" property with yarn cloud-start as the value:
runtime: custom
env: flex
entrypoint: yarn cloud-start
service: my-app
vm_health_check:
  enable_health_check: False
manual_scaling:
  instances: 1
resources:
  memory_gb: 4
The "cloud-start" script is never executed, though. I even tried replacing "yarn" with "npm" and still had no luck. Why would my entrypoint not be triggering?
Here's my Dockerfile, in case it's relevant:
# Use the base App Engine Docker image, based on Ubuntu 16.04.
FROM gcr.io/google-appengine/nodejs
# Install locate for debugging purposes
RUN apt-get update -y && \
    apt-get install --no-install-recommends -y -q \
    locate
COPY . /app
WORKDIR /app
RUN npm install --global yarn
RUN yarn
RUN yarn static
EXPOSE 8080
Try entrypoint: npm run cloud-start; it works for me.
Explanation of this answer can be found here: https://issuetracker.google.com/issues/110097743#comment11
yarn is only available at build time to install your dependencies. The yarn executable is not available in the Node.js runtime after that.
If you want to run your "dev" script, just use "start": "npm run dev"
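Applied to the question's setup, the fix amounts to a scripts section like this (a sketch based on the commands described above) together with an npm-based entrypoint in app.yaml:

```json
{
  "scripts": {
    "start": "next start -p 8080",
    "cloud-start": "next start -p 8080"
  }
}
```

```yaml
entrypoint: npm run cloud-start
```

Since yarn only exists at build time in this runtime, any script that must run after deployment has to be invoked through npm.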

npm run tests in docker with travis CI

I am deploying a React app to Heroku via Travis CI. (The fact that I'm using Heroku doesn't really affect the question; it's just context.) Travis successfully deploys the app until I add a testing step (the script section) to .travis.yml:
language: generic
sudo: required
services:
  - docker
before_install:
  - docker build -t myapp:prod -f Dockerfile.prod .
script:
  - docker run -e CI=true myapp:prod npm run test
after_success:
  - docker build -t myapp:prod -f Dockerfile.prod .
  - echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_ID" --password-stdin
  - docker push myapp:prod
deploy:
  provider: heroku
  app: myapp
  skip_cleanup: true
  api_key:
    secure: <my_key>
However, my Dockerfile.prod is a multi-stage node + nginx where the nginx stage doesn't keep any node or npm stuff:
# build environment
FROM node:13.12.0-alpine as builder
# set working directory
WORKDIR /app
# add `/app/node_modules/.bin` to $PATH
ENV PATH /app/node_modules/.bin:$PATH
# install app dependencies
COPY package.json ./
COPY package-lock.json ./
# some CI stuff I guess
RUN npm ci
RUN npm install react-scripts@3.4.1 -g --silent
COPY . ./
RUN npm run build
# production environment
FROM nginx:stable-alpine
COPY nginx.conf /etc/nginx/conf.d/default.conf
# If using React Router
COPY --from=builder /app/build /usr/share/nginx/html
# For Heroku
CMD sed -i -e 's/$PORT/'"$PORT"'/g' /etc/nginx/conf.d/default.conf && nginx -g 'daemon off;'
Therefore, it is my understanding that .travis.yml tries to run that npm run test command inside my nginx container, which can't execute npm commands (no Node installed, right?). So, guided by SO answers such as this one, I started adding commands to the nginx stage such as
COPY package.json ./
COPY package-lock.json ./
RUN apk add --update npm
but I realized I might be approaching this the wrong way. Should I perhaps be adding npm through Travis? That is, should I include in .travis.yml in the scripts section something like docker run -e CI=true myapp:prod apk add --update npm and whatever else is necessary? This would result in a smaller nginx image no? However, would I run into problems with package.json from the node stage in Dockerfile.prod or anything like that?
In summary, to use TravisCI to test a dockerized react app served with nginx, at what point should I install npm into my image? Does it happen as part of script in .travis.yml or does it happen in Dockerfile.prod? If it is recommened to npm run tests inside Dockerfile.prod, would I do that in the first stage (node) or the second (nginx)?
Thanks
EDIT: Not sure if this can be considered solved, but a user on Reddit recommended simply adding RUN npm run test right before RUN npm run build.
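Applied to the Dockerfile.prod above, that suggestion means running the tests inside the node build stage, so the final nginx image never needs npm at all (a sketch; CI=true makes react-scripts run the tests once instead of entering watch mode):

```dockerfile
# build environment
FROM node:13.12.0-alpine as builder
WORKDIR /app
ENV PATH /app/node_modules/.bin:$PATH
COPY package.json package-lock.json ./
RUN npm ci
COPY . ./
# Fail the image build if the tests fail; CI=true disables watch mode
RUN CI=true npm run test
RUN npm run build

# production environment: only the static build output is copied over
FROM nginx:stable-alpine
COPY nginx.conf /etc/nginx/conf.d/default.conf
COPY --from=builder /app/build /usr/share/nginx/html
```

With the tests in the build stage, the `docker run … npm run test` line in .travis.yml becomes unnecessary: a failing test makes `docker build` itself fail, and the nginx stage stays minimal.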

Dockerized react app differences between running the image with and without docker-compose

# Dockerfile

# pull base image
FROM node:10.15.1-alpine
# set working directory
WORKDIR /app
# add `/app/node_modules/.bin` to $PATH
ENV PATH /app/node_modules/.bin:$PATH
# install app dependencies
COPY ./package.json ./
COPY ./yarn.lock ./
RUN yarn install
RUN yarn global add react-scripts@3.4.1  # not sure if this is even necessary
# add app
COPY . ./
# start app
CMD ["yarn", "start"]
Command I'm running to build it:
docker build -t matrix-fe:dev .
Command to run it:
docker run -it --rm -v ${PWD}:/app -v /app/node_modules -p 3000:3000 -e CHOKIDAR_USEPOLLING=true matrix-fe:dev
And then this is my docker-compose.yml:
version: '3.7'
services:
  matrix-fe:
    container_name: matrix-fe
    build:
      context: ./a-fe
      dockerfile: Dockerfile
    volumes:
      - './a-fe:/app'
      - '/app/node_modules'
    ports:
      - 3000:3000
    environment:
      - CHOKIDAR_USEPOLLING=true
Then to build and run it:
docker-compose up --build
Error I'm getting:
matrix-fe | yarn run v1.13.0
matrix-fe | $ react-scripts start
matrix-fe | It looks like you're trying to use TypeScript but do not have typescript installed.
matrix-fe | Please install typescript by running yarn add typescript.
matrix-fe | If you are not trying to use TypeScript, please remove the tsconfig.json file from your package root (and any TypeScript files).
matrix-fe |
Why is this happening? I can obviously try to also install TypeScript, but it is a dependency in package.json and should already be installed; I also added node_modules to the PATH. How is running the image with docker-compose different from running it without? Compose does create a different image called matrix_matrix-fe, but the Dockerfile hasn't changed. docker-compose.yml is in the top-level folder; the structure looks like this:
/matrix
  /a-fe
    ./package.json
    ./Dockerfile
    ...
  /a-be
  ./docker-compose.yml
Help me understand what's different; the volumes are the same, the ENV variables are the same, and I'm not seeing anything off.
Edit: I forgot to mention that running the image without docker-compose doesn't output any errors; it works properly.

How to build and deploy ReactJs using docker-compose

I'm trying to set things up so that every time I change the app I don't need to build it manually before running the docker-compose file. What I want is that when I change code in my application (React), I just run the docker-compose file, and docker-compose builds the app and serves it with nginx.
Here's what my docker-compose.yml looks like:
version: '2'
services:
  nginx:
    image: 'bitnami/nginx:1.14.2'
    ports:
      - '80:8080'
    volumes:
      - ./build:/var/www/my-app
      - ./nginx.conf:/opt/bitnami/nginx/conf/nginx.conf:ro
Right now with this setup I need to build the application myself by running npm run build, and then run the docker-compose file so it picks up the changes.
I don't know exactly how to do it, but I assume I need to create a Dockerfile that runs npm run build and then calls bitnami/nginx:1.14.2, based on their docs: https://hub.docker.com/r/bitnami/nginx/
FROM node:8.7.0-alpine
RUN npm install
RUN npm run build
docker run --name nginx \
-v /path/to/my_vhost.conf:/opt/bitnami/nginx/conf/vhosts/my_vhost.conf:ro \
-v /path/to/nginx-persistence/nginx/conf/bitnami/certs:/bitnami/nginx/conf/bitnami/certs \
bitnami/nginx:latest
and in docker-compose.yml use build: . instead of image: bitnami/nginx.
You should use a multi-stage build for this. Your Dockerfile should look like this:
# Stage 1 - Building image
FROM node:8.7.0-alpine as node
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build
# Stage 2 - Running image
FROM bitnami/nginx:1.14.2
COPY --from=node /usr/src/app/build /var/www/my-app
COPY ./nginx.conf /opt/bitnami/nginx/conf/nginx.conf
And your docker-compose:
version: '3.3'
services:
  myApp:
    image: myapp:1.0
    container_name: my-app
    build: .
    ports:
      - 80:8080
I adapted this from one of my projects so if you have any issues let me know and I'll check them.
I hope it helps.
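One detail worth noting: the 80:8080 port mapping works because the Bitnami nginx image runs as a non-root user and listens on 8080 by default, so the nginx.conf you copy in must also listen on 8080. A minimal server block for this setup might look like the following (a sketch; the paths come from the Dockerfile above, and since the file replaces the main /opt/bitnami/nginx/conf/nginx.conf, in practice this block would sit inside the http { } section of a full configuration):

```nginx
server {
  listen 8080;
  root /var/www/my-app;
  index index.html;
  # If using React Router, fall back to index.html for client-side routes
  location / {
    try_files $uri $uri/ /index.html;
  }
}
```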
