Is it possible to deploy different monorepo packages to different domains - reactjs

I'm trying to deploy a monorepo of 2 React packages (repos), where each package is deployed to a different AWS S3 bucket. Is that possible?
Package A should be deployed to api.mywebsite.com
Package B should be deployed to www.mywebsite.com

Welcome to Stack Overflow!
I suggest you do the following:
In your build step, create two artifacts (e.g. api, web). For example, if your npm build creates a ./build directory, do cp -r build api for the api project, and similarly for the web project.
In your deployment step, do something like aws s3 cp api s3://api.mywebsite.com --recursive or aws s3 sync api s3://api.mywebsite.com, and the same for the web artifact.
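Put together, the two steps could look like this in a CI script (a minimal sketch; the packages/ paths are assumptions about your monorepo layout):
# Build step: build each package and stage its output as a named artifact
(cd packages/api && npm run build && cp -r build ../../api)
(cd packages/web && npm run build && cp -r build ../../web)
# Deployment step: sync each artifact to its own bucket
aws s3 sync api s3://api.mywebsite.com
aws s3 sync web s3://www.mywebsite.com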

Related

Git CD for a "Laravel + React" application

I have an application that uses Laravel as the backend and React as the frontend.
The two applications are stored in separate repositories.
In the local environment, I serve the Laravel application with "php artisan serve" and the React application with "npm run start".
The two applications communicate with each other through POST/GET APIs.
Now I want to create a "deploy" repository.
The deploy repository should have two folders:
backend (containing Laravel application)
frontend (containing React application)
I want that, every time a merge is made to the main branch of one of the two repos (backend or frontend), the changes are pushed to the deploy repository too.
The deploy repo will take care of building the app and eventually build a docker image.
Is this possible?
Are there better ways/patterns to achieve what I want?
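One way to wire this up (a minimal sketch, assuming a CI step in each source repo with push access to the deploy repo; the repo URL and folder name are placeholders) is a script that runs after each merge to main:
#!/bin/sh
# Runs in the frontend repo's CI after a merge to main; mirrors the
# working tree into the deploy repo's frontend/ folder and pushes.
git clone git@example.com:me/deploy-repo.git /tmp/deploy
rsync -a --delete --exclude .git ./ /tmp/deploy/frontend/
cd /tmp/deploy
git add frontend
git commit -m "Sync frontend from upstream main"
git push origin main
A mirror-image script in the backend repo would populate the backend/ folder; the deploy repo's own CI can then build the app and the Docker image.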

Jenkins Integration with S3 for hosting React Application

I am new to Jenkins. I have a React.js application hosted on Amazon S3, and I wanted to make a CI/CD pipeline for it using Jenkins. Most of it is done, but I am stuck at the last step: connecting Jenkins with Amazon S3. I am able to get the updated code from GitHub and generate a build in the Jenkins server. Now I want to move this new build to AWS S3 for static website hosting.
I hosted my Jenkins server on EC2 Instance.
Can anyone help me to achieve this? Thanks in advance
You have 2 solutions to implement:
1. Configure the AWS CLI on the Jenkins agent or controller where you are running the build, and in a Jenkins step run a command to copy those files to the S3 bucket,
e.g. aws s3 cp
https://docs.aws.amazon.com/cli/latest/reference/s3/cp.html
2. Use the S3 Publisher plugin to copy the artifact to the bucket.
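For option 1, the shell command in the Jenkins build step could look like this (a minimal sketch; the bucket name is a placeholder and credentials come from the agent's AWS config):
# Jenkins "Execute shell" step, run after the npm build
aws s3 sync ./build s3://my-react-app-bucket --delete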

Set up React app build folder on Google Kubernetes

Currently, I have a repo that contains both a Node.js Express backend and a React frontend. The repo's image is in Google Container Registry and is used on a Google Kubernetes cluster. A load balancer provides a URL, where the backend serves the static build. In the future, I want to separate the backend and frontend into two different repos (one for the backend and one for the frontend).
I believe making changes for the backend in the cluster won't be difficult, but I am having trouble figuring out how to add the React frontend, since the build folder will be in a different repo than the backend. I read online that to serve a React app on GCP, you would upload the build folder to a bucket and have that bucket served on App Engine, which provides a URL to access it on the web.
I'm wondering if this is how it would be done on a Kubernetes cluster, or if there is a different approach since it is not using App Engine but Google Kubernetes.
I hope this makes sense (I am still fairly new to Google Cloud) and any feedback/tips will be appreciated!
Thanks!
There are different approaches to this.
Approach 1: Serve your frontend via Google Cloud Storage.
There is a guide in the GCP documentation, Hosting a static website, that shows how to set this up. After the build, copy all the files to Cloud Storage and you are done.
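The copy step after the build could be as simple as this (a minimal sketch; the bucket name is a placeholder):
# Push the build output to the bucket backing the static site
gsutil -m rsync -r ./build gs://my-frontend-bucket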
Approach 2: Add your frontend to your backend while building the Docker image.
Build your frontend and pack it into a Docker image with something like this:
# Stage 1: build the frontend
FROM node AS build
WORKDIR /app
COPY . .
RUN npm ci && npm run build
# Stage 2: keep only the build output as a minimal artifact image
FROM scratch
COPY --from=build /app/dist /app
Build your backend and copy the frontend:
FROM myapp/frontend AS frontend
FROM node
# ... build the backend here ...
COPY --from=frontend /app /path/where/frontend/belongs
This decouples both builds but you will always have to deploy the backend for a frontend change.
Approach 3: Serve your frontend with nginx (or another web server)
# Stage 1: build the frontend
FROM node AS build
WORKDIR /app
COPY . .
RUN npm ci && npm run build
# Stage 2: serve the build output with nginx
FROM nginx
COPY --from=build /app/dist /usr/share/nginx/html
You might also adapt the nginx.conf to enable routing without hash paths. See this article by codecentric for more information on that.
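For client-side routing without hash paths, the relevant part of nginx.conf might look like this (a minimal sketch, to be placed inside the server { } block):
# Fall back to index.html so the React router can handle the path
location / {
    root /usr/share/nginx/html;
    try_files $uri $uri/ /index.html;
}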

Can I deploy a local AngularJS project to Google App Engine using ng deploy?

I created a project locally on my Mac using the ng CLI; specifically, "ng new". The project runs locally, but I do not have a way to deploy it to my account in Google Cloud App Engine.
I followed Google's tutorial using gcloud commands in the cloud, but I prefer to use my local repository, etc., as I am running on the free tier until I can afford to be commercially viable.
NOTE: You require a billing account to create a REPO in GCP.
I finally found this in the Google Cloud Platform documents. So, basically, follow the instructions on Google's site (URL below) to create a remote and a local project. Write your code in the local repo, then push to the remote repo and deploy with gcloud commands (in a cloud shell at https://console.cloud.google.com) from the remote repository.
https://cloud.google.com/source-repositories/docs/quickstart
(Synopsis of the page intro ...)
Quickstart
This page shows you how to set up a GCP repository and use it as a remote for a local Git repository.
The sections below walk you through the steps of creating a local Git repository that contains files for a sample App Engine application, adding a GCP repository as a remote, and pushing the contents of the local repository.
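The flow from the quickstart boils down to something like this (a sketch; my-project-id and my-repo are placeholders):
# Create the GCP repository and add it as a remote of the local repo
gcloud source repos create my-repo --project=my-project-id
git remote add google https://source.developers.google.com/p/my-project-id/r/my-repo
git push google master
# Then deploy from the remote repository in a cloud shell
gcloud app deploy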

App Engine - Deployment: appcfg.py vs git

I've recently started with App Engine and I had been using the regular old deployment method, appcfg.py.
Now I want to start deploying using release pipelines. I created a pipeline in my project settings, then authenticated myself in gcloud.
Now if I do gcloud init myproj-id, I should theoretically get the content of my project pulled from the server, right? But that doesn't happen.
https://developers.google.com/cloud/sdk/gcloud/reference/init
If you have enabled push-to-deploy in the Cloud Console, one of the things that gcloud init will do for you is cloning the Google-hosted git repository associated with PROJECT
So, my questions:
Why is the content not pulled?
What happens if I push my project via git now? How would Appengine manage my previously deployed project via appcfg.py versus my git push'd project?
Why is the content not pulled?
Did you configure your repository by using the Configure your repository link at the top of the pipeline configuration page?
What happens if I push my project via git now? How would Appengine manage my previously deployed project via appcfg.py versus my git push'd project?
Unless you use different versions, App Engine won't make a difference; your git-pushed project will overwrite your previously deployed one.
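For example, deploying the appcfg.py build under an explicit version keeps it from colliding with what you push via git (a sketch; the version name and app directory are placeholders):
# Deploy under a named version so it doesn't clobber the git-pushed default
appcfg.py update -V legacy ./myapp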
