I've split my React application into 3 different projects, using CRA for all of them: auth, X and Y. The user is first sent to auth, then I redirect them to either X or Y based on some info.
It works perfectly in the PRODUCTION environment (because they run on the same domain), but in dev, X and Y fail to authenticate the user: because they run on different ports (different origins), the data in local storage is not shared between auth, X and Y.
I've tried to find a way to use a reverse proxy (http-proxy) to host the React dev servers on the same domain, but that failed too, because the services could not find the assets/static folder, resulting in 404s. I also tried http-proxy-middleware, as recommended on the CRA docs page, but failed with that as well. Is there an easier way that I'm not seeing?
Edit: Found something new, but that also failed. I used react-app-rewired to override the CRA scripts and set publicPath in DEV, but now my bundle.js request returns an index.html file.
The following code does proxy to the corresponding React project, but the assets are requested from the wrong path.
// Assumed setup: Express app in front of the CRA dev servers (the ports are examples)
const express = require('express');
const httpProxy = require('http-proxy');

const app = express();
const servers = { login: 'http://localhost:3001', implementation: 'http://localhost:3002' };
const apiProxy = httpProxy.createProxyServer();

app.all("/login/*", function(req, res) {
  console.log('redirecting to Login');
  apiProxy.web(req, res, {target: servers.login});
});
app.all("/implementacao/*", function(req, res) {
  console.log('redirecting to Implementation');
  apiProxy.web(req, res, {target: servers.implementation});
});
So I used react-app-rewired (with customize-cra) to change the public path:
const { override } = require('customize-cra');

module.exports = {
  webpack: override(
    (config) => {
      config.output.publicPath = '/login/';
      return config;
    },
  ),
  jest: (config) => {
    return config;
  },
  devServer: (configFunction) => (proxy, allowedHost) => {
    return configFunction(proxy, allowedHost);
  },
  paths: (paths, env) => {
    return paths;
  },
};
Now the asset requests are made correctly to /login/, but the dev server always returns an index.html file.
Even with react-app-rewired overriding the config to use publicPath in dev, the assets are not served from the publicPath.
There is already a pull request on CRA to use PUBLIC_URL in dev mode.
Is there an easier way that I'm not seeing?
Another approach would be to use multiple React Single Page Applications (SPAs) inside one application, see crisp-react. E.g. instead of 3 CRAs in 3 applications/projects, have 3 SPAs in one application/project. The backend can get data from other backend servers transparently for each SPA.
How do I migrate from a set of existing CRA projects to using crisp-react?
Background
crisp-react comes with two stock SPAs called ‘First’ and ‘Second’. Both render some explanatory/sample UI.
Migration overview
1. Pick one CRA project and migrate it to the ‘First’ SPA. When finished, you have two CRAs left and two crisp-react SPAs: ‘First’ (renders your UI) and ‘Second’ (still renders the sample UI). Rename the ‘First’ SPA to give it a more meaningful name.
2. Pick another CRA and migrate it. When finished, you have one CRA left and two crisp-react SPAs, both rendering your UI.
3. Modify crisp-react to add the third SPA and then migrate the remaining CRA to the third SPA.
Migration steps (sketch)
1.1 Follow crisp-react Getting Started.
1.2 The landing page of the First SPA is rendered by crisp-react/client/src/entrypoints/first.tsx
The landing page of the CRA is rendered by src/index.tsx
Replace the content of the former with the latter.
1.3 The first CRA consists of React components: src/App.tsx and others you added. Copy the components to crisp-react/client/src/components/from-first-cra/
1.4 Ensure crisp-react client app compiles: From crisp-react/client/ execute: yarn compile
1.5 Ensure crisp-react client app builds: From crisp-react/client/ execute: yarn build
1.6 Ensure crisp-react client looks ok without backend data: see client Usage Scenarios.
1.7 Get the backend (e.g. Express) involved: see backend Usage Scenarios.
1.8 Milestone reached: the browser can be pointed at the backend (Express) and get html files and bundles from it - which results in the first SPA rendering its initial UI (to the extent possible without data supplied via API endpoints).
1.9 Decide how the first SPA will get data from the API. There are 3 basic choices here:
- the API endpoints are implemented in Express, so you can retire your backend servers
- Express does expose API endpoints but acts as a reverse proxy, getting data from your backend servers (a sketch of this option follows the list)
- Express knows nothing about the API; data is supplied by backend servers that are queried directly by the components inside the first SPA.
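A minimal sketch of the second option, with Express acting as a reverse proxy for API calls (the /api mount path and the backend URL are assumptions, not part of crisp-react):

// Sketch of option 2: the /api mount path and backend URL are assumptions.
const express = require('express');
const { createProxyMiddleware } = require('http-proxy-middleware');

const app = express();
// Forward /api/* to the existing backend server that owns the data.
app.use('/api', createProxyMiddleware({
  target: 'http://internal-backend:8080',
  changeOrigin: true,
}));

app.listen(3000);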
2.1 Second SPA
as above
...
Related
Simple setup:
React App created with create-react-app
ASP.NET Core web API - a couple of controllers (currently no security until I make it work)
Both the API and Application are deployed to Azure.
When I run the app locally with the configured proxy (contacting the deployed API on Azure), it works correctly and makes the calls.
If I try the API directly from my machine it works too (Postman for example).
When I open the deployed React app, the application loads correctly but the call to the API doesn't get proxied. What I mean is it's not returning 404 or 403 - it returns status 200, but the call is made to the app itself instead of being proxied to the API.
I've tried both the "proxy" configuration in package.json and "http-proxy-middleware". Both work with the locally running app, but not when deployed. Here is the configuration of the proxy:
// src/setupProxy.js
const { createProxyMiddleware } = require('http-proxy-middleware');

module.exports = function (app) {
  app.use(
    '/api',
    createProxyMiddleware({
      target: 'https://XXXXXX.azurewebsites.net',
      changeOrigin: true,
    })
  );
};
I suppose it's something related to the configuration of the Node server used when I deploy to Azure, but I don't have a clue what.
I've used the following tutorial for deployment: https://websitebeaver.com/deploy-create-react-app-to-azure-app-services
But as seen from its content, there is no proxy part in it.
After long troubleshooting I realized the issue was due to a wrong understanding of the proxy configuration.
I had not realized that the proxy configuration is respected only in development mode and totally ignored in the production build. And of course, while debugging the issue, I did not realize the Azure deployment pipeline was doing a production build. So the resolution was to detect production/development at the application layer and inject the correct URL.
Here I hit another issue - process.env.NODE_ENV, which I was using to distinguish between development and production, was undefined in my index.tsx - it was available in App.tsx and all its children, but not in index.tsx where my dependency container was initialized.
What resolved my issue is a package called dotenv. I just imported it into index.tsx and process.env was available:
import * as dotenv from 'dotenv';
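Building on that, here is a minimal sketch of the "inject the correct URL" idea described above; the Azure hostname is the same placeholder used earlier and apiBaseUrl is a name made up for illustration:

// Hypothetical sketch: pick the API base URL from the build mode.
// 'https://XXXXXX.azurewebsites.net' is a placeholder, not a real host.
const apiBaseUrl =
  process.env.NODE_ENV === 'production'
    ? 'https://XXXXXX.azurewebsites.net' // deployed API
    : ''; // empty string -> relative URLs, handled by the dev proxy

export default apiBaseUrl;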
I have an existing ASP.NET Core application (that uses Razor pages) and I am trying to convert it, one component at a time, to React until I can completely make it an SPA. The idea is to create an entry point for each of my Razor pages until I can combine them all into one SPA. I have most of this working except for the use of webpack-dev-server to serve my bundles. The problem I am having is that the ASP.NET app runs on port 44321 and the dev server runs on port 8080, so the script tags in my .cshtml files cannot see the bundles that are being hosted from webpack.
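For context, the setup described above implies a multi-entry webpack config that emits one bundle per page; a rough sketch (the entry names and paths are assumptions, not taken from the sample repository):

// Sketch only: entry names and paths are assumptions, not from the sample repo.
// Loader configuration for .tsx files is omitted for brevity.
const path = require('path');

module.exports = {
  entry: {
    home: './ClientApp/src/home.tsx',   // one entry per Razor page
    about: './ClientApp/src/about.tsx',
  },
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: '[name].bundle.js',       // matches the [name].bundle.js script tags below
  },
};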
I can temporarily change them from:
<script src="./dist/[name].bundle.js"></script>
To something like:
<script src="http://localhost:8080/[name].bundle.js"></script>
to get around this, but this is not a long-term solution.
I have created a sample application to showcase what I am trying to accomplish here: https://github.com/jkruse24/AspNetReact.
Is there any way to either get my ASP.Net application to listen on the port that webpack-dev-server is serving to (without changing my script tags) or to have my webpack-dev-server serve to the port that my ASP.Net app is running on?
I have tried to use the .NET Core SPA middleware (Microsoft.AspNetCore.SpaProxy), but either I have not configured it correctly or I am misunderstanding what it is used for. Upon adding the code below (which is commented out in my GitHub sample), my application still looks in the .\dist directory for my bundles (which are still there from running actual builds).
if (env.IsDevelopment())
{
    app.UseSpa(spa =>
    {
        spa.Options.SourcePath = "./ClientApp";
        spa.UseReactDevelopmentServer(npmScript: "start");
        spa.UseProxyToSpaDevelopmentServer("http://localhost:8080");
    });
}
I ended up getting this working using the .NET Core SPA middleware. When I originally tried to use the middleware, it was working fine, but I didn't have my webpack dev server configured to serve my bundles at the correct location.
As you can see above, I was serving them to
http://localhost:8080/[name].bundle.js
when they needed to be served to
http://localhost:8080/dist/[name].bundle.js
My problem was that my webpack publicPath was not set correctly. I made an update commit on my repository here. More specifically, this was the file diff that solved my problem.
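For reference, the relevant part of such a dev webpack config might look roughly like this (a sketch; only the /dist/ public path and port 8080 come from the description above, the rest is assumed):

// Sketch: only publicPath and the port reflect the setup described above.
module.exports = {
  // ...entries, loaders, etc. omitted...
  output: {
    filename: '[name].bundle.js',
    publicPath: '/dist/', // bundles served at http://localhost:8080/dist/[name].bundle.js
  },
  devServer: {
    port: 8080,
  },
};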
I am building a React website with a Firebase Functions backend.
I'm using firebase serve to locally host the Node.js backend that I connect to my React code through Express API endpoints, and I am using react-scripts start to test my React frontend app.
All the GET requests in my React app use relative endpoints (e.g. /some-endpoint) to communicate with my local Firebase server. But they are running on different ports: firebase serve hosts the backend at localhost:5000 while the React dev server hosts the app at localhost:3000.
I tried many things and couldn't find any workable way to make this run. In the end I added my React project as a subfolder in my Firebase project and pointed the hosting public path in firebase.json at my React build directory. It works now, but I always have to run npm run build on my React app after every change so it compiles into the build directory, which is painfully slow.
What is the proper way to do this, i.e. debug the React app and the Firebase backend together?
I finally enabled cross-origin requests on my server using the cors module.
Server-side code:
// `app` is the Express app exported through Firebase Functions
const cors = require("cors");

app.get("/test", (req, res) => {
  // enable CORS for this route so the React dev server (a different origin) can call it
  return cors()(req, res, async () => {
    res.send("working...");
  });
});
Then adding a simple config file on the React side, to switch between local debugging and deployed testing, really helped.
config.js
var domain = "";
// domain = "http://localhost:5000";
export {domain}
Whenever I use APIs in React, I simply comment/uncomment the second line to switch between local and deployed testing.
Then, wherever I call an API, I prepend `domain` to the URL in every reference, e.g. fetch requests:
import { domain } from "config.js";
fetch(domain + "/int-search", ...
Then it worked fine running both the Firebase backend and the React application on localhost, using firebase serve for the backend and npm start for my React app.
I was hoping to deploy a Next.js app with a Laravel API. I had developed React apps with CRA before, and in those I used the API server to serve the index.html of the CRA as the entry point of the app.
But with Next.js, I only found out after development that it needs a Node.js server to serve it (which is my bad, I didn't notice that). There is an option, next export, that builds a static representation of the Next.js app, and it has an index.html. I am serving that index.html as the entry of the app from my Laravel API. It serves the page, but only some of the static content loads.
What I was hoping to know: is it possible to host the API and the Next.js app from a single PHP shared hosting without any Node server? If so, how? If not, what could be the alternatives?
Actually the accepted answer is completely wrong. When you run yarn build with package.json set to "build": "next build && next export", you get an out folder whose contents can be served without a Node.js server.
Now, since you are using Laravel and you use the out folder, you will only load half of the page because the routes are not set properly. For that to work you need to edit your next.config.js to:
module.exports = {
  distDir: '/resources/views',
  assetPrefix: '/resources/views',
}
This sets the output directory to the views directory in Laravel. Now this will work as an SPA (single page application); for the dynamic routes you need to match each one with a view file that exists in your out folder.
For each route that you have, you need to create a new "get" route in Laravel:
Route::get('/', function () {
    return require resource_path('views/index.html');
});

Route::get('/contacts', function () {
    return require resource_path('views/contacts.html');
});

Route::get('/post/{slug}', function () {
    return require resource_path('views/post/[slug].html');
});
Notice that you can pass a wildcard for dynamic routes and they will all work. Once you do that and deploy your out folder inside /resources/views in Laravel, it's going to work.
Apparently there is no alternative to a Node.js server, which is not an option for me currently, so I unfortunately had to abandon Next.js, create a CRA app, and reuse as much from the Next.js code as I could.
I have a web application in React in which I needed to implement a contact form. The application was created using create-react-app and a server folder was added. For the form I used SendGrid mail. The server runs on port 4567; how do I build the app so it works on the domain? It is a one-page application.
Thanks, it is important.
When running in production, a React app is just HTML, CSS, and JavaScript. These files are sent from your server to a client when requested, in the same way that requests/responses are handled for any web page. There are a few steps that need to be done before your React app is ready for production.
1. Create a Production Build
First you need to create a production build of your app. This process takes all of your separate .js or .jsx files and puts them together into a single minified file, and the same for .css. Then your index.html is updated to include a link to the CSS and script to the JS. This is done so that only three files will need to be sent rather than the 10s or 100s that exist in development.
If you used create-react-app to start your application, you can use the command:
npm run build
to do this. Otherwise, you need to have webpack installed, and then run:
node_modules/.bin/webpack --config webpack.prod.js --mode production
(which you might want to add as a script to package.json).
See React: Optimizing Performance for more.
2. Serve your Application
Now your server should have a route for your application and when it receives a request on that route, the server should respond by sending index.html from your client/build/ directory (where client/ is the directory of the React app).
Here is an example with Node/Express as the server (in app.js):
const path = require('path');

// `app` is your Express instance
app.get('*', (req, res) => {
  res.sendFile(path.join(__dirname, 'client', 'build', 'index.html'));
});
Note that this is just one way to send a static file using Node, and it can easily be done with any server.
Additional
You mentioned you want to submit forms with your application. If your routes for receiving POST requests match the routes that the forms are on (e.g. the form is on /form and the server listens for POST on /form), you can just use the default HTML form submission. However, this is not a great way to do things when using React, because routing will then be controlled by your server rather than by React. Instead you should use some sort of AJAX method to submit the form.
Since your server is now serving your React app (rather than React serving itself as in development), you can just make relative requests and those requests will be made to your server. For example the request (using the fetch API):
const models = await fetch('/api/models');
Will be made to your_host/api/models by default.
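In the same way, submitting your contact form can be a relative POST request, for example (a sketch; the /api/contact route and the field names are assumptions):

// Sketch of an AJAX form submission; /api/contact and the field names are assumptions.
async function submitContactForm({ email, message }) {
  const res = await fetch('/api/contact', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ email, message }),
  });
  return res.json();
}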
For development, in package.json add:
"proxy": "http://localhost:4567"