I have an application that uses Laravel as backend and React as frontend
The two applications are stored in separate repositories.
In the local environment, I serve the Laravel application with "php artisan serve" and the React application with "npm run start".
The two applications communicate with each other through POST/GET APIs.
Now I want to create a "deploy" repository.
The deploy repository should have two folders:
backend (containing Laravel application)
frontend (containing React application)
I want the changes to be pushed to the deploy repository every time a merge is made on the main branch of either repo (backend or frontend).
The deploy repo will take care of building the app and eventually building a Docker image.
Is this possible?
Are there better ways/patterns to achieve what I want?
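One possible approach, sketched here with hypothetical names: a CI job (e.g. a GitHub Actions or GitLab CI step) in each source repo runs on every merge to main, exports the merged code into the matching folder of the deploy repo, and pushes. DEPLOY_REPO_URL and the folder layout are assumptions, not a prescribed setup.
# Sketch of a CI step run in the backend repo after a merge to main;
# the frontend repo would do the same into the frontend/ folder.
git clone "$DEPLOY_REPO_URL" ../deploy
rm -rf ../deploy/backend && mkdir -p ../deploy/backend
git archive HEAD | tar -x -C ../deploy/backend   # export the merged tree without .git
cd ../deploy
git add backend
git commit -m "Sync backend from main" && git push origin main
Whether a separate deploy repo is worth it is debatable; a common alternative is to let each repo's own CI build and publish its Docker image directly.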
Related
I have a React app with a Django backend. Every time I need to deploy the app I have to follow these steps:
Git pull the new Django code
Copy the React build from local to AWS
Collect static files using python manage.py collectstatic
I have to automate this process with the bare minimum of dependencies. What can be done in such a case?
Thanks in Advance.
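With nothing beyond a shell and ssh, those three steps can live in a single script run from the local machine; the host name and paths below are placeholders, a minimal sketch rather than a finished pipeline.
#!/bin/sh
# deploy.sh - hypothetical host and paths; run locally after building the React app
set -e
ssh user@aws-host 'cd /srv/myapp && git pull origin main'                        # pull the new Django code
scp -r frontend/build/* user@aws-host:/srv/myapp/frontend_build/                # copy the local React build
ssh user@aws-host 'cd /srv/myapp && python manage.py collectstatic --noinput'   # collect static files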
We have an app that is deployed to an Azure Web App
It's a create-react-app project.
My question is, when wanting to configure for a particular environment, e.g. API URLs, do you need to replace the env file and run "yarn build" again?
I am noticing that certain fields such as process.env.REACT_APP_API_URL are getting baked in at build time, so I am thinking that a re-build is needed when needing to re-configure.
Is there a better way to pull configuration in a react app that doesn't require a re-build?
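Yes, create-react-app inlines REACT_APP_* variables at build time, so rebuilding per environment is the default. One common way to avoid that (a sketch assuming a containerized, nginx-style setup, not anything CRA provides out of the box) is to generate a small config file when the server starts and have the app read it at runtime:
#!/bin/sh
# entrypoint.sh - hypothetical startup script; writes runtime config next to the static build
cat > /usr/share/nginx/html/env.js <<EOF
window.__RUNTIME_CONFIG__ = { API_URL: "${API_URL}" };
EOF
exec nginx -g 'daemon off;'
The React code then reads window.__RUNTIME_CONFIG__.API_URL instead of process.env.REACT_APP_API_URL, and index.html loads env.js in a <script> tag before the app bundle, so only the environment configuration changes per deployment, not the build.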
I recently created an Express and MongoDB API and after that I connected that API to my React application successfully, and it is running well. But one situation is occurring during deployment: I need to deploy both projects separately, which means I need two hosting plans for them. So I want both projects running on the same host. Is it possible? And how?
A build of any React application (unless webpack-dev-server is used to compile it) creates output consisting of static files: script bundles and .html file(s) that reference the bundles in <script> tags.
Those files can and should be copied to Express for the production build. Then you have one server on one host. When the React application runs inside a client (e.g. a browser), it gets everything (.html files, script bundles, API responses) from a single source: the Express backend/server. Which, by the way, excludes any possibility of the CORS issues so many people are asking about here on SO.
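In practice that copy step can be as small as the following (directory names are assumptions; the boilerplate project below automates the same idea in its build scripts):
# Minimal sketch: build the React client, then hand the static output to Express
cd client
yarn build                        # emits the production bundles into client/build
cp -r build/. ../server/public/   # Express serves this folder, e.g. with express.static('public')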
As an example, here is a boilerplate project that demonstrates all of that.
If you execute commands:
git clone https://github.com/winwiz1/crisp-react.git
cd crisp-react
yarn install && yarn start:prod
and then point your browser to localhost:3000, you will have the React app running inside your browser, and it got everything from a single Express backend.
crisp-react has two subdirectories: client with the React client app and server with the Express backend. The build command copies the build artifacts from one subdirectory to the other. So you can even rename or delete the client subdirectory after a build; it won't affect the Express backend.
I'm working on a project using React for the frontend and Laravel for the backend, communicating through a RESTful API.
I developed each one in a separate directory, but now I'm trying to deploy them in the same folder and I don't really know what to do.
Or can I deploy each one in its own folder? If yes, how can I run them on the same server (Apache)?
The directory really shouldn't matter. Since React is a frontend JavaScript framework, it runs on the client, while the Laravel backend runs on the server itself. All you need to do is serve the entry-point HTML and the JavaScript file created from your React project to the client.
I assume you're thinking about the "development server" that you run while developing the React app. You need to do a production build, depending on your build environment, and serve the files to the client in some way.
When using create react app you can use the deployment build instructions: https://facebook.github.io/create-react-app/docs/deployment
So to summarise:
Host your Laravel backend on the Apache server
Upload the entry-point HTML (you can serve this via Laravel; create a template with the correct HTML)
Serve the production JavaScript file for your React app (just include it on the same HTML page)
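Concretely, one way to wire that up (a sketch, folder names are assumptions): build the React app, drop its output into Laravel's public/ directory, and serve the generated index.html as a Blade view behind a catch-all route.
# Hedged sketch: copy a create-react-app production build into the Laravel app
cd frontend
npm run build
cp -r build/static ../backend/public/static                      # JS/CSS bundles
cp build/index.html ../backend/resources/views/app.blade.php     # served by e.g. Route::view('/{any}', 'app')->where('any', '.*')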
For the next 6 months I have to manage two or more developers on a web application development project, based on Laravel (the backend framework used for the RESTful web services) and AngularJS (the frontend framework which calls my web services).
As far as I know, the Angular code must reside in the public folder (the Laravel public folder), but this way I cannot use different repositories for the two submodules (frontend app and backend app) in order to assign each developer their own repo (frontend repo to the frontend dev, backend repo to the backend dev).
How can I organize the project, allowing each dev to work independently?
How can I organize the project, in order to be able to deploy frontend and backend code on production server in different moments?
I plan to use agile methods.
Git Submodules can be the solution here.
You can organize your code this way: main repo (backend) + submodule (frontend).
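Setting that up once might look like this (the repo URL is a placeholder; the submodule path just has to match the folder used in the deploy commands below):
# One-time setup inside the backend repo: mount the frontend repo as a submodule
git submodule add https://example.com/your-org/frontend.git frontend
git commit -m "Add frontend repo as a submodule"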
to deploy backend only:
cd backend
git fetch && git reset --hard origin/master
to deploy frontend only:
cd frontend
git fetch && git reset --hard origin/master
to deploy both backend and frontend:
cd backend
git fetch && git reset --hard origin/master
git submodule sync
git submodule update --init --recursive
Of course this is only a simple example, but I assume it's clear enough to get the point of Git submodules :)