Structure of an Angular / Node.js repository which will run in Docker

We have a repository with an application written in Angular.
The frontend needs an nginx Docker container to host it, and the Node.js backend needs its own Node.js container, so our app will be split into two linked containers.
So to write the two Dockerfiles (one for each image) we have to split up the folders in our repo like:
root
  Angular : contains the Dockerfile for nginx
  NodeJS  : contains the Dockerfile for Node.js
But the problem is that they both need package.json (Angular for the devDependencies, Node.js for the dependencies).
What is the best structure in the repo for this kind of application?

The nginx and Node.js containers can share a volume in Docker Compose if you like, using the volumes_from parameter (available in Compose file format 2; it was removed in version 3). It mounts all of the volumes from another service or container, optionally with read-only (ro) or read-write (rw) access. https://docs.docker.com/compose/compose-file/
volumes_from:
- service_name
- container_name
- service_name:rw
In your case, package.json can live in your Node.js container and still be accessible to the nginx container through this parameter.
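For illustration, a minimal docker-compose.yml sketch (version 2 syntax; the service names and build paths are assumptions based on the folder layout above):
version: "2"
services:
  node:
    build: ./NodeJS
    volumes:
      - /usr/src/app        # package.json and the app live here
  nginx:
    build: ./Angular
    ports:
      - "80:80"
    volumes_from:
      - node:ro             # read-only access to the node service's files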

Related

Is there a way to copy a file that gets added by building a Docker image into another container?

I'm trying to dockerize a React app I made by creating one container for development, one for production, and a last one for hosting the app. In my production Dockerfile I use the command RUN npm run build
so that I get the build folder I need to host the app, but I can't copy it into the hosting container using docker-compose. Is there a way to do it with docker-compose, or do I need to build the app in the hosting container (and drop the production container, since it won't have any purpose anymore)?
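A multi-stage build (covered in the answers further down) usually sidesteps this: the production build and the hosting server live in one Dockerfile, so nothing has to be copied between running containers. A minimal sketch, with assumed paths:
# build stage: produce the static bundle
FROM node AS build
WORKDIR /app
COPY . .
RUN npm ci && npm run build
# hosting stage: keep only the build output
FROM nginx
COPY --from=build /app/build /usr/share/nginx/html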

How to deploy app with Docker Compose + React + Django + Nginx?

I'm building an app using Docker Compose, React, Django and Nginx. After struggling for a few days I managed to set up a docker-compose file that successfully connected all these services, from collecting the React static files and having Nginx serve them, to having Nginx point to the Django static files instead of Django serving them, to adding other services like Celery to the Docker Compose config.
However, it seems like there's no easy place to publish + deploy this container (the Docker registry doesn't accept containers, I think?). All I could find was Azure and AWS integrations, which are definitely a step up from the Heroku deployment I was doing before. My Heroku deployment no longer works, as it needs the React + Django apps to be at the same folder depth, or it won't let me use the 'heroku/nodejs' buildpack. Is there a deployment option that lets me keep the separate folder structure and ease of development of Docker Compose, without being as complex as Azure or AWS? Thanks in advance!
You can upload your container to the Heroku Container Registry:
https://devcenter.heroku.com/categories/deploying-with-docker
Add a heroku.yml file:
build:
  docker:
    web: Dockerfile
run:
  web: bundle exec puma -C config/puma.rb
then with the Heroku CLI:
heroku create
heroku container:push web
heroku container:release web

How to deploy multiple containers on Heroku?

I have created an API (Flask-RESTful) service, a UI (in ReactJS) and a proxy service, each having its own Dockerfile in its respective folder.
There is a docker-compose.yaml file in the main repository. It works locally when running docker-compose -f docker-compose.prod.yaml up, but I am unable to find a way to deploy multiple containers on Heroku.
Here is my github repo: https://github.com/Darpan313/Flask-React-nginx-Docker-Compose
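The Heroku Container Registry approach from the previous answer extends to multiple Dockerfiles: heroku.yml can build one image per process type, with the caveat that only the web process receives HTTP traffic. A hedged sketch, assuming the proxy acts as the web process (the folder names are assumptions):
build:
  docker:
    web: proxy/Dockerfile
    api: api/Dockerfile
Then set the app to the container stack and push:
heroku stack:set container
git push heroku main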

Setup react app build folder onto Google Kubernetes

Currently, I have a repo that contains both a Node.js Express backend and a React frontend. The repo's image is in Google Container Registry and is used on a Google Kubernetes cluster. A load balancer provides a URL, where the backend serves the static React build. In the future, I want to separate the backend and frontend into two different repos (one for the backend and one for the frontend).
I believe making changes for the backend in the cluster won't be difficult, but I am having trouble figuring out how to add the React frontend, since the build folder will live in a different repo than the backend. I read online that to serve a React app on GCP, you would upload the build folder to a bucket and have that bucket served on App Engine, which provides a URL to access it on the web.
I'm wondering if this is how it would be done on a Kubernetes cluster or if there is a different approach since it is not using App Engine, rather Google Kubernetes.
I hope this makes sense (I am still fairly new to Google Cloud) and any feedback/tips will be appreciated!
Thanks!
There are different approaches to this.
Approach 1: Serve your frontend via Google Cloud Storage.
There is a guide in the GCP documentation, Hosting a static website, that covers the setup. After the build, copy all the files to Cloud Storage and you are done.
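For example, syncing a local build folder into a bucket (the bucket name is a placeholder) can be done with gsutil:
# -m parallelizes the transfer, -r recurses into subfolders
gsutil -m rsync -r ./build gs://your-frontend-bucket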
Approach 2: Add your frontend to your backend while building the Docker image
Build your frontend and pack it into a Docker image with something like this:
FROM node AS build
WORKDIR /app
COPY . .
RUN npm ci && npm run build
FROM scratch
COPY --from=build /app/dist /app
Build your backend and copy the frontend:
FROM myapp/frontend as frontend
FROM node
# build the backend here
COPY --from=frontend /app /path/where/frontend/belongs
This decouples both builds but you will always have to deploy the backend for a frontend change.
Approach 3: Serve your frontend with nginx (or another web server)
FROM node AS build
WORKDIR /app
COPY . .
RUN npm ci && npm run build
FROM nginx
COPY --from=build /app/dist /usr/share/nginx/html
You might also adapt the nginx.conf to enable routing without hash paths. See this article by codecentric for more information on that.
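For client-side routing without hash paths, the usual approach is to fall back to index.html for unknown paths, so the SPA router can take over. A minimal nginx.conf sketch, assuming the paths from the image above:
server {
    listen 80;
    root /usr/share/nginx/html;
    location / {
        # serve the file if it exists, otherwise hand the route to the SPA
        try_files $uri $uri/ /index.html;
    }
}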

create-react-app + docker = QA and PROD Deploy

I'm using create-react-app for my projects, with Docker as my dev environment.
Now I would like to know the best practice for deploying my project to AWS (I'll deploy the Docker image).
Maybe my question is a dumb one, but I'm really stuck on it.
My Dockerfile runs yarn start; for dev that is enough, I don't need to build anything, my bundle runs in memory. For QA or PROD I would build with npm run build, but as I understand it, that creates a new folder with the files that should be used in the prod env.
That said, my question is: what is the best practice for this kind of situation?
Thanks.
This is what I did:
Use npm run build to build all static files.
Use the official _/nginx image to build a customized HTTP server which serves those static files (via a Dockerfile).
Upload the customized image to Amazon's registry so it can run on EC2 Container Service (ECS).
Load the image in ECS task. Then use ELBv2 to start a load balance server to forward all outside requests to ECS.
(Optional) Enable HTTPS in ELBv2.
One time things:
Figure out the mechanism of ECS. You need to create at least one host server for ECS. I used the Amazon ECS-Optimized AMI.
Create a Docker repository in ECR (EC2 Container Registry) so you can upload your customized Docker image (see the push commands after this list).
Create ECS task definition(s) for your service.
Create ECS cluster(s) and add task(s).
Configure ELBv2 so it can forward the traffic to your internal ECS dynamic port.
(Optional) Write script to automate everyday deployment.
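Pushing the customized image into the ECR repository typically looks like the following (the account ID, region, and repository name are placeholders):
# authenticate Docker against the ECR registry (AWS CLI v2)
aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com
# tag the local image and push it
docker tag myapp:latest 123456789012.dkr.ecr.us-east-1.amazonaws.com/myapp:latest
docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/myapp:latest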
I would get paid if someone wants me to do those things for them. Or you can figure it out by yourself, following these clues.
However, if your website is a simple static site, I recommend GitHub Pages: it's free and simple. My solution is for multiple static + dynamic applications which may involve other services (e.g. Redis, Elasticsearch) and require daily/hourly deployments.
You would have to run npm run build and then copy the resulting files into your container. You could use a separate Dockerfile.build to build the files, extract them, and add them to your final container. The final container should be able to serve the files; you can base it on nginx or another server, or use it as a data-volume container alongside your existing server container.
Recent versions of Docker (17.05 and later) make this process easier with multi-stage builds: the build container and the final container can both be defined in the same Dockerfile.
Here's a simple example for your use case:
# build stage (node:onbuild is deprecated, so copy and install explicitly)
FROM node AS builder
WORKDIR /usr/src/app
COPY . .
RUN npm install && npm run build
# final stage: keep only the static build output
FROM nginx:latest
COPY --from=builder /usr/src/app/build /usr/share/nginx/html
You'd probably want to include your own nginx configuration file.
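If you do, a single extra line in the final stage copies it into the image (the filename is an assumption):
COPY nginx.conf /etc/nginx/conf.d/default.conf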
More on multistage builds here:
https://docs.docker.com/engine/userguide/eng-image/multistage-build/
