How to deploy an app with Docker Compose + React + Django + Nginx?

I'm building an app using Docker Compose, React, Django and Nginx. After struggling for a few days I managed to set up a docker-compose file that successfully connected all these services, from collecting the React static files and having Nginx serve them, to having Nginx point to the Django static files instead of Django serving them, to adding other services like Celery to the Docker Compose config.
However, it seems like there's no easy place to publish + deploy this container setup (the Docker registry doesn't accept Compose projects, I think?). All I could find was Azure and AWS integrations, which are definitely a step up in complexity from the Heroku deployment I was doing before. My Heroku setup no longer works, since it needs the React and Django apps to sit at the same folder depth; otherwise it won't let me use the 'heroku/nodejs' buildpack. Is there a deployment option that lets me keep the separate folder structure + ease of development of Docker Compose, without being as complex as Azure and AWS? Thanks in advance!

You can upload your container to the Heroku Container Registry:
https://devcenter.heroku.com/categories/deploying-with-docker
Add a heroku.yml file:

build:
  docker:
    web: Dockerfile
run:
  web: bundle exec puma -C config/puma.rb

(The run command above is for a Ruby app; use your own start command, e.g. gunicorn for Django.)
Then with the Heroku CLI:

heroku create
heroku container:push web
heroku container:release web
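For the Django + Celery stack described in the question, a heroku.yml might look like the sketch below. The project name, Dockerfile layout, and process commands are assumptions, not taken from the question:

```yaml
# hypothetical heroku.yml for a Django + Celery project
# "myproject" and the single shared Dockerfile are placeholders
build:
  docker:
    web: Dockerfile
    worker: Dockerfile
run:
  web: gunicorn myproject.wsgi --bind 0.0.0.0:$PORT
  worker: celery -A myproject worker --loglevel=info
```

Heroku injects $PORT at runtime, so the web process must bind to it.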

Related

How to deploy multiple containers on Heroku?

I have created an API (Flask-RESTful) service, a UI (in ReactJS) and a proxy service, each with its own Dockerfile in its respective folder.
There is also a docker-compose.yaml file in the main repository. It works locally when running docker-compose -f docker-compose.prod.yaml up, but I am unable to find a way to deploy multiple containers on Heroku.
Here is my github repo: https://github.com/Darpan313/Flask-React-nginx-Docker-Compose

Setup react app build folder onto Google Kubernetes

Currently, I have a repo that contains both a Node.js Express backend and a React frontend. The repo's image is in Google Container Registry and is used on a Google Kubernetes cluster. A load balancer provides a URL, and that backend URL serves the static build folder. In the future, I want to separate the backend and frontend into two different repos (one for the backend and one for the frontend).
I believe making changes for the backend in the cluster won't be difficult, but I am having trouble figuring out how to add the React frontend to this since the build folder will be in a different repo than the backend. I read online that to serve a React app on GCP, you would upload the build folder onto a bucket and have that bucket served on App Engine, which will provide a url to access it on the web.
I'm wondering if this is how it would be done on a Kubernetes cluster or if there is a different approach since it is not using App Engine, rather Google Kubernetes.
I hope this makes sense (I am still fairly new to Google Cloud) and any feedback/tips will be appreciated!
Thanks!
There are different approaches to this.
Approach 1: Serve your frontend via Google Cloud Storage.
There is a guide in the GCP documentation, Hosting a static website, that shows how to set this up. After the build, copy all the files to Cloud Storage and you are done.
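As a sketch of this approach, assuming the gsutil CLI is authenticated and the bucket name is a placeholder:

```shell
# build the frontend, then copy the output to a Cloud Storage bucket
npm run build
gsutil -m rsync -r ./build gs://my-frontend-bucket
# serve index.html for both the main page and errors (SPA-style routing)
gsutil web set -m index.html -e index.html gs://my-frontend-bucket
```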
Approach 2: Add your frontend to your backend while building the Docker image
Build your frontend and pack it into a Docker image with something like this:
FROM node AS build
WORKDIR /app
COPY . .
RUN npm ci && npm run build
FROM scratch
COPY --from=build /app/dist /app
Build your backend and copy the frontend:
FROM myapp/frontend AS frontend
FROM node
# build backend here
COPY --from=frontend /app /path/where/frontend/belongs
This decouples both builds but you will always have to deploy the backend for a frontend change.
Approach 3: Serve your frontend with nginx (or another web server)
FROM node AS build
WORKDIR /app
COPY . .
RUN npm ci && npm run build
FROM nginx
COPY --from=build /app/dist /usr/share/nginx/html
You might also adapt the nginx.conf to enable routing without hash paths. See this article by codecentric for more information on that.
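As a minimal sketch of such an nginx.conf change (the root path matches the Dockerfile above, but the exact config is an assumption):

```nginx
# serve the SPA and fall back to index.html for client-side routes
server {
    listen 80;
    root /usr/share/nginx/html;
    index index.html;

    location / {
        try_files $uri $uri/ /index.html;
    }
}
```

With try_files, a request like /some/route that has no matching file on disk is answered with index.html, so the React router can handle the path without hash-based URLs.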

Is it possible to deploy different monorepo packages to different domains

I'm trying to deploy a monorepo of 2 React packages (repos), where each package is deployed to a different AWS S3 bucket. Is it possible?
package A to be deployed to api.mywebsite.com
package B to be deployed to www.mywebsite.com
Welcome to Stack Overflow!
I suggest you do the following:
In your build step, create two artifacts (e.g. api, web). For example, if your npm build creates a ./build directory, run cp -r build api for the api package, and similarly for the web package.
In your deployment step, run something like aws s3 cp api s3://api.mywebsite.com --recursive or aws s3 sync api s3://api.mywebsite.com, and the same for the web artifact.
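Putting both steps together, a deploy script might look like this sketch; the package folder names and bucket names are assumptions:

```shell
#!/bin/sh
set -e

# build each package and stage its artifact
(cd packages/api && npm run build && cp -r build ../../api)
(cd packages/web && npm run build && cp -r build ../../web)

# upload each artifact to its own bucket
aws s3 sync api s3://api.mywebsite.com --delete
aws s3 sync web s3://www.mywebsite.com --delete
```

For the buckets to answer on those domains, the bucket names must match the domain names and static website hosting must be enabled on each bucket.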

How to push existing docker image to google app-engine

I am currently using the App Engine Maven plugin, which seems to trigger a Google Cloud Build to build a Docker image and then push it to App Engine.
Is it possible for me to just push an existing Docker image from Docker Hub or Google Container Registry?
You can deploy to App Engine using a specific Docker image hosted on Google Container Registry by using the --image-url flag like this:
gcloud app deploy --image-url=[HOSTNAME]/[PROJECT-ID]/[IMAGE]
See doc here for more info on the hostname options.
It is also possible to do this through the Dockerfile in your app directory.
I noticed this while searching for ways to customize Google's own NGINX container in the App Engine instance (this is what is used to serve your app).
The first line of the Nginx Dockerfile is FROM nginx. This references the 'nginx' image in the default image registry, so it could be any image in the default registry referenced by name. The default registry appears to be Docker Hub (I did not investigate whether Google mirrors it or similar).
In this way, your app directory only needs to contain 2 files: app.yaml and Dockerfile.
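As a sketch, such a minimal app directory could look like the two files below; the static/ folder name is an assumption:

app.yaml:

```yaml
runtime: custom
env: flex
```

Dockerfile:

```dockerfile
# start from the default nginx image and add the app's static files
FROM nginx
COPY static/ /usr/share/nginx/html/
```

gcloud app deploy then builds this Dockerfile via Cloud Build and deploys the result to the App Engine flexible environment.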

Structure of Angular / NodeJS repository which will run in docker

We have a repository with an application written in Angular.
It needs a Docker container with nginx to be hosted.
The Node.js backend needs its own Node.js container, so our app will be split into 2 containers, which will be linked.
So to write 2 dockerfiles (one for each image) we have to split up our folders in our repo like:
root
Angular : contains dockerfile for nginx
NodeJS : contains dockerfile for nodejs
But the problem is that they both need the package.json (Angular for the devDependencies and Node.js for the dependencies).
Which is the best structure in the repo for your application?
The Nginx and Node.js containers can share a volume in Docker Compose if you like. You can use the volumes_from parameter. It mounts all of the volumes from another service or container, optionally specifying read-only (ro) or read-write (rw) access. https://docs.docker.com/compose/compose-file/

volumes_from:
  - service_name
  - container_name
  - service_name:rw

In your case, package.json can live in your Node.js container but also be accessible by the Nginx container using this parameter.
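Note that volumes_from only exists in Compose file format version 2; it was removed in version 3. A sketch of the same idea with a named volume instead (the service names, build paths, and mount points are assumptions):

```yaml
version: "3.8"
services:
  node:
    build: ./NodeJS
    volumes:
      - shared:/app/shared              # the node service writes here
  nginx:
    build: ./Angular
    volumes:
      - shared:/usr/share/nginx/html:ro # nginx reads the same data
volumes:
  shared:
```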
