React.js: download a file located in another folder

I'm currently working on a web/mobile project and implementing a download button to download the mobile client.
I'm using Docker images and Docker Compose to deploy the project: one image builds the APK with Flutter, and another builds and runs the web front end.
Here is the docker-compose file:
version: "3.9"
services:
client_mobile:
build: ./flutter
volumes:
- apk-volume:/app/client
web:
build: ./front-web
env_file:
- ./front-web/.env
ports:
- "8081:3000"
depends_on:
- client_mobile
restart: always
volumes:
- apk-volume:/app/client
volumes:
apk-volume:
Here is the Flutter Dockerfile:
FROM androidsdk/android-30
# Install flutter
RUN apt-get update
RUN apt-get install -y bash git unzip wget
RUN apt-get clean
WORKDIR /
RUN git clone https://github.com/flutter/flutter.git
ENV PATH "$PATH:/flutter/bin"
ENV FLUTTER_PATH /flutter/
RUN flutter upgrade
RUN flutter precache
# Install gradle
RUN wget https://services.gradle.org/distributions/gradle-7.3.3-bin.zip
RUN mkdir /opt/gradle
RUN unzip -d /opt/gradle gradle-7.3.3-bin.zip
ENV PATH "$PATH:/opt/gradle/gradle-7.3.3/bin"
RUN rm gradle-7.3.3-bin.zip
RUN mkdir -p /app/android
RUN echo "sdk.dir=$ANDROID_SDK" >> /app/android/local.properties
RUN echo "flutter.sdk=$FLUTTER_PATH" >> /app/android/local.properties
RUN echo "flutter.buildMode=debug" >> /app/android/local.properties
RUN echo "flutter.versionName=1.0.0" >> /app/android/local.properties
RUN echo "flutter.versionCode=1" >> /app/android/local.properties
WORKDIR /app/android
COPY ./ /app/
RUN gradle --refresh-dependencies
RUN flutter build apk --release
CMD mv ../build/app/outputs/apk/release/app-release.apk /app/client/app-release.apk
Here is the Dockerfile for the React front end:
FROM node AS builder
WORKDIR /usr/src/app
COPY ./package.json /usr/src/app/
RUN cd /usr/src/app; npm install -g npm@latest; npm install
COPY ./ /usr/src/app/
RUN cd /usr/src/app; npm run build
FROM builder
WORKDIR /usr/src/app
COPY --from=builder /usr/src/app/build /usr/src/app/build
RUN npm install -g serve
EXPOSE 3000
CMD ["serve", "-s", "build", "-l", "3000"]
And here is how I'm downloading the APK file:
<a href='/app/client/area_mobile.apk' download style={{textDecoration: "none"}}>
  <div className='userPanel__sideBar__rows__row'>
    <div style={{marginRight: "5%", marginTop: "2%"}}>
      <DownloadIcon />
    </div>
    <p>Download Mobile</p>
  </div>
</a>
When I click it, a file is downloaded, but it's named index.html and contains the equivalent of the page I'm currently on.
Do you know how to deal with this?
Thanks in advance.

OK, so I've found a way: put the .apk directly into the build folder after building the app. That way serve treats it as a real static file instead of falling back to index.html for an unknown path.
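For anyone wanting the concrete shape of that, here is one possibility (a sketch only, reusing the volume name and paths from the compose file and Dockerfiles above, not necessarily the exact change made): mount the shared APK volume inside the served build directory at runtime, so the file becomes a real static asset.
web:
  build: ./front-web
  env_file:
    - ./front-web/.env
  ports:
    - "8081:3000"
  depends_on:
    - client_mobile
  restart: always
  volumes:
    # mount the shared volume inside the folder that `serve -s build` serves,
    # so /client/app-release.apk resolves to a real file instead of the SPA fallback
    - apk-volume:/usr/src/app/build/client
The link then points at the path inside the build output, e.g. <a href='/client/app-release.apk' download>, matching the file name produced by the Flutter container's CMD.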

Related

Docker container with .net and react app not loading client

To begin, I created a .NET Core 6 project with React.js from Visual Studio 2022.
I have added Docker to my project as well.
I have been following this tutorial:
"Quickstart: Use Docker with a React Single-page App in Visual Studio"
https://learn.microsoft.com/en-us/visualstudio/containers/container-tools-react?view=vs-2022
I got to the point where I'm able to build my image; however, when I start my Docker container it runs only my backend API, and the React app/client does not load at all.
Any ideas?
Here is what my Dockerfile looks like:
#See https://aka.ms/containerfastmode to understand how Visual Studio uses this Dockerfile to build your images for faster debugging.
FROM mcr.microsoft.com/dotnet/aspnet:6.0 AS base
WORKDIR /app
EXPOSE 80
EXPOSE 443
RUN apt-get update
RUN apt-get install -y curl
RUN apt-get install -y libpng-dev libjpeg-dev curl libxi6 build-essential libgl1-mesa-glx
RUN curl -sL https://deb.nodesource.com/setup_lts.x | bash -
RUN apt-get install -y nodejs
FROM mcr.microsoft.com/dotnet/sdk:6.0 AS build
RUN apt-get update
RUN apt-get install -y curl
RUN apt-get install -y libpng-dev libjpeg-dev curl libxi6 build-essential libgl1-mesa-glx
RUN curl -sL https://deb.nodesource.com/setup_lts.x | bash -
RUN apt-get install -y nodejs
WORKDIR /src
COPY ["admin_tool_api_ui/admin_tool_api_ui.csproj", "admin_tool_api_ui/"]
RUN dotnet restore "admin_tool_api_ui/admin_tool_api_ui.csproj"
COPY . .
WORKDIR "/src/admin_tool_api_ui"
RUN dotnet build "admin_tool_api_ui.csproj" -c Release -o /app/build
FROM build AS publish
RUN dotnet publish "admin_tool_api_ui.csproj" -c Release -o /app/publish
FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "admin_tool_api_ui.dll"]
First of all, I feel your pain. Thanks to Microsoft documentation I'm now a person with a high level of patience; I've learned to accept things as they are in life.
Second, I think you are missing the front-end build, which is this piece:
FROM node:16 AS build-web
COPY ./admin_tool_api_ui/ClientApp/package.json /admin_tool_api_ui/ClientApp/package.json
COPY ./admin_tool_api_ui/ClientApp/package-lock.json /admin_tool_api_ui/ClientApp/package-lock.json
WORKDIR /admin_tool_api_ui/ClientApp
RUN npm ci
COPY ./admin_tool_api_ui/ClientApp/ /admin_tool_api_ui/ClientApp
RUN npm run build
So your final Dockerfile can be something like this:
#See https://aka.ms/containerfastmode to understand how Visual Studio uses this Dockerfile to build your images for faster debugging.
FROM mcr.microsoft.com/dotnet/aspnet:6.0 AS base
WORKDIR /app
EXPOSE 80
EXPOSE 443
FROM mcr.microsoft.com/dotnet/sdk:6.0 AS build
RUN apt-get update
RUN apt-get install -y curl
RUN apt-get install -y libpng-dev libjpeg-dev curl libxi6 build-essential libgl1-mesa-glx
RUN curl -sL https://deb.nodesource.com/setup_lts.x | bash -
RUN apt-get install -y nodejs
WORKDIR /src
COPY ["admin_tool_api_ui/admin_tool_api_ui.csproj", "admin_tool_api_ui/"]
RUN dotnet restore "admin_tool_api_ui/admin_tool_api_ui.csproj"
COPY . .
WORKDIR "/src/admin_tool_api_ui"
RUN dotnet build "admin_tool_api_ui.csproj" -c Release -o /app/build
FROM build AS publish
RUN dotnet publish "admin_tool_api_ui.csproj" -c Release -o /app/publish
FROM node:16 AS build-web
COPY ./admin_tool_api_ui/ClientApp/package.json /admin_tool_api_ui/ClientApp/package.json
COPY ./admin_tool_api_ui/ClientApp/package-lock.json /admin_tool_api_ui/ClientApp/package-lock.json
WORKDIR /admin_tool_api_ui/ClientApp
RUN npm ci
COPY ./admin_tool_api_ui/ClientApp/ /admin_tool_api_ui/ClientApp
RUN npm run build
FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
COPY --from=build-web /admin_tool_api_ui/ClientApp/build ./ClientApp/build
ENTRYPOINT ["dotnet", "admin_tool_api_ui.dll"]
I just set the property "Copy to Output Directory" = "Copy always" for all React-related files in the ClientApp directory hierarchy via the "Properties" dialog.
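If you would rather have that in the project file than in the Properties dialog, the equivalent MSBuild entry looks roughly like this (a sketch; the ClientApp\build glob is an assumption based on the layout above):
<!-- in admin_tool_api_ui.csproj -->
<ItemGroup>
  <!-- copy the React build output to the output/publish directory -->
  <Content Include="ClientApp\build\**" CopyToOutputDirectory="Always" />
</ItemGroup>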

npm run tests in docker with travis CI

I am deploying a React app to Heroku via Travis CI. The fact that I'm using Heroku doesn't really affect what I'm about to ask, I'm pretty sure; it's just there for context. Travis successfully deploys the app until I add a testing step (the script section) in .travis.yml:
language: generic
sudo: required
services:
  - docker
before_install:
  - docker build -t myapp:prod -f Dockerfile.prod .
script:
  - docker run -e CI=true myapp:prod npm run test
after_success:
  - docker build -t myapp:prod -f Dockerfile.prod .
  - echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_ID" --password-stdin
  - docker push myapp:prod
deploy:
  provider: heroku
  app: myapp
  skip_cleanup: true
  api_key:
    secure: <my_key>
However, my Dockerfile.prod is a multi-stage node + nginx where the nginx stage doesn't keep any node or npm stuff:
# build environment
FROM node:13.12.0-alpine as builder
# set working directory
WORKDIR /app
# add `/app/node_modules/.bin` to $PATH
ENV PATH /app/node_modules/.bin:$PATH
# install app dependencies
COPY package.json ./
COPY package-lock.json ./
# some CI stuff I guess
RUN npm ci
RUN npm install react-scripts@3.4.1 -g --silent
COPY . ./
RUN npm run build
# production environment
FROM nginx:stable-alpine
COPY nginx.conf /etc/nginx/conf.d/default.conf
# If using React Router
COPY --from=builder /app/build /usr/share/nginx/html
# For Heroku
CMD sed -i -e 's/$PORT/'"$PORT"'/g' /etc/nginx/conf.d/default.conf && nginx -g 'daemon off;'
Therefore, it is my understanding that .travis.yml tries to run that npm run test command inside my nginx container and can't execute npm commands (no Node installed, right?). So, guided by SO answers such as this one, I started adding commands to that nginx stage, such as:
COPY package.json ./
COPY package-lock.json ./
RUN apk add --update npm
but I realized I might be approaching this the wrong way. Should I perhaps be adding npm through Travis? That is, should I include in the script section of .travis.yml something like docker run -e CI=true myapp:prod apk add --update npm and whatever else is necessary? That would result in a smaller nginx image, wouldn't it? However, would I run into problems with package.json from the node stage in Dockerfile.prod, or anything like that?
In summary, to use Travis CI to test a Dockerized React app served with nginx, at what point should I install npm into my image? Does it happen as part of script in .travis.yml, or does it happen in Dockerfile.prod? If it is recommended to run the tests inside Dockerfile.prod, would I do that in the first stage (node) or the second (nginx)?
Thanks
EDIT: Not sure if this can be considered solved, but a user on Reddit recommended simply adding RUN npm run test right before the RUN npm run build.
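That puts the tests inside the node build stage of Dockerfile.prod, before the production bundle is created; a sketch of that stage under the same layout as above (CI=true keeps react-scripts from starting the interactive watcher):
# build environment
FROM node:13.12.0-alpine as builder
WORKDIR /app
ENV PATH /app/node_modules/.bin:$PATH
COPY package.json package-lock.json ./
RUN npm ci
COPY . ./
# run the test suite as part of the image build; a failing test fails `docker build`
RUN CI=true npm run test
RUN npm run build
With that in place the Travis script step only needs to build the image, and nothing npm-related ever has to exist in the nginx stage.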

How to access create-react-app run in a docker container?

I followed the steps under https://mherman.org/blog/dockerizing-a-react-app/
My setup:
Windows 10 Home
docker commands are run in the Docker Quickstart Terminal https://docs.docker.com/toolbox/toolbox_install_windows/
How to reproduce: Follow the steps from the first link:
Install create-react-app globally:
npm install -g create-react-app@3.4.1
Generate a new app:
$ npm init react-app sample --use-npm
$ cd sample
Create a Dockerfile in the root of the directory:
# pull official base image
FROM node:13.12.0-alpine
# set working directory
WORKDIR /app
# add `/app/node_modules/.bin` to $PATH
ENV PATH /app/node_modules/.bin:$PATH
# install app dependencies
COPY package.json ./
COPY package-lock.json ./
RUN npm install --silent
RUN npm install react-scripts@3.4.1 -g --silent
# add app
COPY . ./
# start app
CMD ["npm", "start"]
Add .dockerignore:
node_modules
build
.dockerignore
Dockerfile
Dockerfile.prod
Build and tag the Docker image:
$ docker build -t sample:dev .
Spin up the container:
$ docker run \
-it \
--rm \
-v ${PWD}:/app \
-v /app/node_modules \
-p 3001:3000 \
-e CHOKIDAR_USEPOLLING=true \
sample:dev
(Screenshots of the Docker Quickstart Terminal output and of the project structure were attached to the original question.)
However, when I go to localhost:3001 as described in the post, the app does not load.
Any idea what I'm missing?
When you run the container, specify the port mapping like this: 3000:80 (use :80 as the container port):
$ docker run -it --rm -p 3000:80 sample:prod
You can then navigate to http://localhost:3000/ to see the CRA app.
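That mapping assumes the production image from the same guide, where the static build is served by nginx on container port 80; a rough sketch of that Dockerfile.prod (not the dev Dockerfile shown in the question):
# build stage
FROM node:13.12.0-alpine as build
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm ci
COPY . ./
RUN npm run build
# production stage: nginx listens on 80, hence the 3000:80 mapping
FROM nginx:stable-alpine
COPY --from=build /app/build /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
(The dev image built above, sample:dev, listens on container port 3000, which is what the original -p 3001:3000 mapping targets.)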

How do I get a directory from a container down to the travis-ci working directory?

I am getting an error when Travis CI builds my app in a Docker container. The build folder is not coming down. Here are the error logs:
Deploying application
Initialized empty Git repository in /tmp/d20190115-5107-1w5c6ge/work/.git/
Switched to a new branch 'gh-pages'
cd -
cd /tmp/d20190115-5107-1w5c6ge/work
rsync: change_dir "/app/build" failed: No such file or directory (2)
rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1183) [sender=3.1.0]
Could not copy /app/build.
Here are my .travis.yml and Dockerfile:
# Grants super user permissions
sudo: required
# travis ci installs docker into the travis container
services:
  - docker
# before tests are run, build the docker image
before_install:
  - docker build -t dvontrec/fn-killers -f Dockerfile.dev .
script:
  # SHOULD ADD TESTS
  - docker run dvontrec/fn-killers pwd
  - docker run dvontrec/fn-killers ls
# Steps before deploy:
defore_deploy:
  - docker run dvontrec/fn-killers -f npm run build
# Steps to deploy to github pages
deploy:
  provider: pages
  skip_cleanup: true
  github_token: $github_token
  on:
    branch: master
And here is the Dockerfile:
FROM node:alpine
WORKDIR './app'
COPY package.json .
RUN npm install
COPY . .
CMD ["npm", "run", "start-docker"]
Does anyone know how to get the files down from the container?
I found out what I did wrong: to deploy with Docker you need an nginx stage that copies the build output over. Here is the Dockerfile I used:
# Build phase
FROM node:alpine as builder
WORKDIR '/app'
COPY package.json .
RUN npm install
COPY . .
RUN npm run build
# Run phase
FROM nginx
EXPOSE 80
COPY --from=builder /app/build /usr/share/nginx/html
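If the goal is still to get the build directory down into the Travis working directory (the rsync error happened because the build only ever existed inside the container, never on the Travis host), one way is to create a throwaway container from the image and copy the folder out with docker cp; a sketch using the image name and the nginx Dockerfile above:
# before_deploy: copy the built site out of the image into the Travis workspace
docker build -t dvontrec/fn-killers .
id=$(docker create dvontrec/fn-killers)        # create a container without starting it
docker cp "$id":/usr/share/nginx/html ./build  # copy the built assets to ./build on the host
docker rm -v "$id"                             # remove the temporary container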

"Create React App" with Docker

I was wondering if anyone had any experience using create-react-app with docker. I was able to get it set up with a Dockerfile like:
FROM node
RUN mkdir /src
WORKDIR /src
ADD package.json /src/package.json
RUN npm install
EXPOSE 3000
CMD [ "npm", "start" ]
And then used a docker-compose file like:
app:
  volumes:
    - "./app:/src"
  ports:
    - "3000:3000"
    - "35729:35729"
  build: ./app
This allowed me to start up the container and view the app. However, live reload didn't work when saving files in the mounted volume, and webpack created several .json.gzip files in the src directory.
Any suggestions for getting this working correctly?
Yeah, as aholbreich mentioned, I'd use npm install / npm start locally on my machine for development, just because it's so easy. It's probably possible with docker-compose, mounting volumes etc. too, but I think it could be a bit fiddly to set up.
For deployment you can then very easily use a Dockerfile. Here's an example Dockerfile I'm using:
FROM node:6.9
# Create app directory
RUN mkdir -p /src/app
WORKDIR /src/app
# to make npm test run only once non-interactively
ENV CI=true
# Install app dependencies
COPY package.json /src/app/
RUN npm install && \
npm install -g pushstate-server
# Bundle app source
COPY . /src/app
# Build and optimize react app
RUN npm run build
EXPOSE 9000
# defined in package.json
CMD [ "npm", "run", "start:prod" ]
You need to add the start:prod option to your package.json:
"scripts": {
"start": "react-scripts start",
"start:prod": "pushstate-server build",
"build": "react-scripts build",
"test": "react-scripts test --env=jsdom",
"eject": "react-scripts eject"
},
You can run the tests on your CI service with:
docker run <image> npm test
There's nothing stopping you from running this docker container locally as well to make sure things work as expected.
I recently made a small project called hello-docker-react, which does just what the OP is looking for.
It's made with docker-compose, create-react-app, yarn, a node image, and a small entrypoint script.
Live reload works flawlessly, and I haven't found any problems yet.
https://github.com/lopezator/hello-docker-react
Here is a good guide for this:
https://mherman.org/blog/dockerizing-a-react-app/
For development:
# base image
FROM node:9.6.1
# set working directory
RUN mkdir /usr/src/app
WORKDIR /usr/src/app
# add `/usr/src/app/node_modules/.bin` to $PATH
ENV PATH /usr/src/app/node_modules/.bin:$PATH
# install and cache app dependencies
COPY package.json /usr/src/app/package.json
RUN npm install --silent
RUN npm install react-scripts@1.1.1 -g --silent
# start app
CMD ["npm", "start"]
For production:
# build environment
FROM node:9.6.1 as builder
RUN mkdir /usr/src/app
WORKDIR /usr/src/app
ENV PATH /usr/src/app/node_modules/.bin:$PATH
COPY package.json /usr/src/app/package.json
RUN npm install --silent
RUN npm install react-scripts@1.1.1 -g --silent
COPY . /usr/src/app
RUN npm run build
# production environment
FROM nginx:1.13.9-alpine
COPY --from=builder /usr/src/app/build /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
Not exactly a direct improvement of the author's code, but I was able to get a development environment working with very little code (and no direct dependency on Node on my machine), like this:
docker-compose.yml
services:
  node:
    image: node:16
    user: "node"
    command: "npm start"
    working_dir: /app
    volumes:
      - ./:/app
    ports:
      - 3000:3000
This way, you avoid creating docker images from a Dockerfile.
Usage is generally like this:
install dependencies before running: docker compose run node npm install
run development environment: docker compose up
install new dependencies: docker compose run node npm install [package name]
clean up docker instances created with compose run: docker compose rm
While using Docker in development with create-react-app, I discovered that it is possible to override the webpackDevServer configuration by adding CHOKIDAR_USEPOLLING=1 to your .env file. This makes file watching work again. It even refreshes the browser page on the host! The only thing I noticed is that it doesn't open a web page automatically.
I can also advise adding tty: true to your service to get your original console output back in your terminal. To remove the container name prefixes in the logs, you can run something like this after running docker-compose up -d:
docker-compose logs -f --tail=100 client | cut -f2 -d "|"
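If you prefer keeping these settings in the compose file rather than in .env, both can live on the service itself; a sketch (the service name client is an assumption, matching the logs command above):
services:
  client:
    build: .
    ports:
      - "3000:3000"
    environment:
      # make the CRA dev server poll for file changes inside the container
      - CHOKIDAR_USEPOLLING=1
    # keep the dev server's interactive console output readable
    tty: true
    volumes:
      - ./:/app
      - /app/node_modules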
Running with CRA 4.0 and many dependencies
.dockerignore
.git
.gitignore
node_modules
build
Dockerfile.dev
FROM node:alpine
WORKDIR /app
COPY package.json /app
RUN yarn install
COPY . .
CMD ["yarn", "start"]
docker-compose.dev.yml
version: "3.8"
services:
print:
stdin_open: true
build:
context: .
dockerfile: Dockerfile.dev
ports:
- "3000:3000"
volumes:
- ".:/app"
- "/app/node_modules"
Dockerfile.prod
FROM node:alpine as build
WORKDIR /app
COPY package.json /app
RUN yarn install
COPY . /app
RUN yarn run build
FROM nginx:stable-alpine
COPY ./nginx/nginx.conf /etc/nginx/conf.d/default.conf
COPY --from=build /app/build /usr/share/nginx/html
docker-compose.prod.yml
version: "3.8"
services:
print:
stdin_open: true
build:
context: .
dockerfile: Dockerfile.prod
ports:
- "80:80"
nginx.conf
server {
  listen 80;
  server_name frontend;

  location / {
    root /usr/share/nginx/html;
    index index.html;
    try_files $uri /index.html;
  }
}
To run
docker-compose.exe -f .\docker-compose.yml up --build
or
docker-compose.exe -f .\docker-compose.dev.yml up --build
Here is a simple (pure Docker) solution without a local installation of the runtime (e.g. Node):
cd /tmp
docker run -it --rm -v "$PWD":/app -w /app node yarn create react-app my-app
sudo chown -R $USER:root my-app/
cd my-app
nano docker-compose.yml # see docker-compose.yml below
docker compose up -d
docker-compose.yml:
services:
  node:
    image: node:16-alpine
    environment:
      - CHOKIDAR_USEPOLLING=true
      - FAST_REFRESH=true
    working_dir: /app
    ports:
      - '3000:3000'
    command: "yarn start"
    volumes:
      - './:/app'
Open localhost:3000 in your browser. Hot reload should work out of the box.
