I have the following website, which is a React-based site. I have an nginx load balancer in front of two backend servers. The individual servers work perfectly, but behind the load balancer the site rarely loads, and looking at the browser dev tools there are a ton of 404 Not Found errors:
https://junoscan.skynetexplorers.com
I don't understand why the site does not load. Sometimes a browser will suddenly start working properly. For example, Brave Browser currently does not work on my desktop but started working on my cell phone. What is happening? How do I fix this behavior?
##
# Set Rate Limiting (DDoS protection)
##
limit_req_zone $binary_remote_addr zone=req_zone:10m rate=5r/s;
# This is the internal server behind the proxy
upstream bdipper_node {
least_conn;
server cluster.provider-0.prod.sjc1.akash.pub:31375;
server cluster.provider-2.prod.ewr1.akash.pub:31639;
}
map $http_upgrade $connection_upgrade {
default upgrade;
'' close;
}
# This is the public facing listening server AND configures SSL for the website
server {
root /_next/static/chunks;
sendfile on;
tcp_nopush on;
sendfile_max_chunk 1m;
tcp_nodelay on;
keepalive_timeout 65;
listen 443 ssl;
location / {
limit_req zone=req_zone burst=20 nodelay;
proxy_pass http://bdipper_node/;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection $connection_upgrade;
proxy_set_header Host $host:443;
}
}
# This redirects http to https
server {
listen 80 ;
return 301 https://$host$request_uri;
}
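One common cause of this pattern (an assumption, not something the config alone confirms) is that each backend serves a different build of the site, so the hashed /_next/static/chunks URLs generated by one backend return 404 from the other. Under that assumption, a minimal hedged sketch of a workaround is to pin each client to a single backend, for example with ip_hash instead of least_conn:
# Hedged sketch: sticky upstream selection, assuming the 404s come from
# the two backends serving different builds (and therefore different chunk hashes).
upstream bdipper_node {
ip_hash;  # route each client IP to the same backend consistently
server cluster.provider-0.prod.sjc1.akash.pub:31375;
server cluster.provider-2.prod.ewr1.akash.pub:31639;
}
If that assumption holds, the longer-term fix is to deploy an identical build to both backends so either one can serve any chunk URL.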
I am stuck on CORS errors after switching to a domain name via Cloudflare. I just can't seem to get the backend API to be reached via the domain name, like https://www.my_domain.com/api
What I have done:
Applied the nginx solution below and restarted nginx
In Cloudflare, updated DNS management with three A records: *, www, and <my_domain.com>
In Cloudflare, set Always Use HTTPS to On
Enabled the cors library in ExpressJS
The nginx solution I applied is below...
Note: I have React running on port 3000. React shows when visiting the domain name in the browser!
location /api {
proxy_pass http://localhost:8000;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header X-Forwarded-For $remote_addr;
proxy_set_header Connection 'upgrade';
proxy_set_header Host $host;
proxy_cache_bypass $http_upgrade;
}
location / {
proxy_pass http://localhost:3000;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header X-Forwarded-For $remote_addr;
proxy_set_header Connection 'upgrade';
proxy_set_header Host $host;
proxy_cache_bypass $http_upgrade;
}
Command I used to test and restart nginx:
sudo nginx -t && sudo systemctl restart nginx
In Cloudflare DNS management...
A record, Name: *, Content: <server_ip>, TTL: Auto, Proxy status: DNS only
A record, Name: <my_domain_name>, Content: <server_ip>, TTL: Auto, Proxy status: Proxied
A record, Name: www, Content: <server_ip>, TTL: Auto, Proxy status: Proxied
In Cloudflare, under SSL/TLS > Edge Certificates, Always Use HTTPS is set to On.
Enabled the cors library in ExpressJS
And yes, I have added the cors library to Express like so:
const express = require("express");
const cors = require("cors");
require('dotenv').config()
const app = express();
if (process.env.NODE_ENV === 'production') {
app.use(cors({ origin: `${process.env.CLIENT_URL}` }));
}
My backend .env file:
NODE_ENV=production
CLIENT_URL=http://my_domain.com
My frontend React .env file:
CLIENT_URL=/api
I also noticed that the browser console shows the request being sent to http://<server_ip>:8000/api and not https://<domain_name>/api as expected.
Also, I just made the change to the domain name today.
Despite having the settings above, when I try to log in, I see the request get blocked due to a CORS error.
What could be causing the CORS error?
Followed this tutorial...
https://medium.com/@nishankjaintdk/serving-a-website-on-a-registered-domain-with-https-using-nginx-and-lets-encrypt-8d482e01a682
Resulted in discovering these instructions...
https://certbot.eff.org/lets-encrypt/ubuntubionic-nginx
Ran these commands (from the certbot instructions) in my linux terminal...
sudo snap install core
sudo snap refresh core
sudo apt-get remove certbot
sudo snap install --classic certbot
sudo ln -s /snap/bin/certbot /usr/bin/certbot
sudo certbot --nginx
There was a prompt for domain names at this point. So, I entered my domains like this...
my_domain.com www.my_domain.com
Per the instructions, I then tested the renewal (certificates have to be renewed periodically, but certbot sets up a cron/systemd timer so this should hopefully happen automatically):
sudo certbot renew --dry-run
The domain's API worked but the React app did not show yet, suggesting some issue in Cloudflare.
After that, I got my website to show up in the browser like this...
Logged into Cloudflare.com
Went to the domain's SSL/TLS encryption settings
Changed it from Flexible to Full
Clicked on Edge Certificates
Turned off Always Use HTTPS
After doing all of the above, the domain showed my React app, which fully worked with the API and no CORS errors.
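For reference, here is a hedged sketch of roughly what the certbot-managed server block ends up looking like after these steps. The certificate paths follow certbot's standard /etc/letsencrypt/live layout and the locations mirror the snippets above; treat the exact names and ports as assumptions, since certbot writes the actual lines:
# Hedged sketch: server block after `sudo certbot --nginx`, assuming the
# /api and / locations from the question and certbot's default cert paths.
server {
server_name my_domain.com www.my_domain.com;
listen 443 ssl; # managed by Certbot
ssl_certificate /etc/letsencrypt/live/my_domain.com/fullchain.pem; # managed by Certbot
ssl_certificate_key /etc/letsencrypt/live/my_domain.com/privkey.pem; # managed by Certbot
# API requests are proxied to the Express backend on port 8000
location /api {
proxy_pass http://localhost:8000;
proxy_set_header Host $host;
proxy_set_header X-Forwarded-For $remote_addr;
}
# Everything else is proxied to the React app on port 3000
location / {
proxy_pass http://localhost:3000;
proxy_set_header Host $host;
proxy_set_header X-Forwarded-For $remote_addr;
}
}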
We are making a Docusaurus v2 website.
After building the website on the server, it works fine with https. Here is a part of my_server_block.conf:
server {
listen 3001 ssl;
ssl_certificate /certs/server.crt;
ssl_certificate_key /certs/server.key;
ssl_session_cache shared:SSL:1m;
ssl_session_timeout 5m;
ssl_ciphers HIGH:!aNULL:!MD5;
ssl_prefer_server_ciphers on;
location / {
proxy_pass http://localhost:3002;
proxy_redirect off;
proxy_set_header Host $host:$server_port;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Ssl on;
}
}
On localhost, http works. However, we now need to test https on localhost. But https returns an error, even though I started it with HTTPS=true yarn start: "This site can't provide a secure connection. localhost sent an invalid response. ERR_SSL_PROTOCOL_ERROR"
Does anyone know what I should do to make https work in localhost?
Edit 1: I tried HTTPS=true SSL_CRT_FILE=certs/server.crt SSL_KEY_FILE=certs/server.key yarn start, and https://localhost:3001 still returned the same error. Note that certs/server.crt and certs/server.key are the files that make https work on our production server via nginx:
server {
listen 3001 ssl;
ssl_certificate /certs/server.crt;
ssl_certificate_key /certs/server.key;
You are using Nginx, so use it for SSL offloading (your current config) and don't start https on the Docusaurus site. The user's browser will use https, but Docusaurus itself will be using http.
If you start https on the Docusaurus site while still proxying with http (proxy_pass http://localhost:3002;), then the problem is obvious: a plain-http connection to an https endpoint. You could of course proxy with the https protocol (proxy_pass https://localhost:3002;), but that may need more advanced configuration. Just keep it simple and do the SSL offloading in Nginx.
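If you also want https on localhost without enabling it in Docusaurus, one option is to run the same kind of SSL-offloading block locally. This is a minimal sketch, assuming a self-signed certificate at /certs/server.crt and /certs/server.key and the dev server listening on port 3002:
# Hedged local sketch: nginx terminates TLS, the Docusaurus dev server stays on plain http.
server {
listen 3001 ssl;
ssl_certificate /certs/server.crt;
ssl_certificate_key /certs/server.key;
location / {
# plain http to the local dev server started with `yarn start` (no HTTPS=true)
proxy_pass http://localhost:3002;
proxy_set_header Host $host:$server_port;
}
}
Then browse to https://localhost:3001 while the dev server runs over plain http on 3002.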
There is an issue with https support on localhost in react-dev-utils@^9.0.3, which is a dependency of Docusaurus.
https://github.com/facebook/create-react-app/issues/8075
https://github.com/facebook/create-react-app/pull/8079
It is fixed in react-dev-utils@10.1.0.
Docusaurus 2 uses Create React App's utils internally and you might need to specify the path to your cert and key as per the instructions here. I'm not familiar with the server config so I can't help you there.
Maybe this answer will be helpful - How can I provide a SSL certificate with create-react-app?
I bought a VPS on OVH, which currently runs on Debian 9.
I successfully installed SSL over the default ports (80 and 443), and it works great when serving basic HTML.
However, I'm totally lost when it comes to running my React app (a basic app to try out the configuration).
It works over http in Safari but doesn't work at all in Chrome: "This site can't provide a secure connection. wecode-it.fr sent an invalid response.
ERR_SSL_PROTOCOL_ERROR"
I already checked the date on my server, which is correct.
I'm running my app locally with npm start and want to use development mode for now. If you have any advice, though, on building the app for production, I'll take it too. I think I'll use Docker but I don't know how to use it yet.
Here is my nginx configuration.
server {
listen 80 default_server;
listen [::]:80 default_server;
server_name wecode-it.fr www.wecode-it.fr;
root /usr/share/nginx/html;
index index.html;
location ~ /.well-known {
allow all;
}
return 301 https://$server_name$request_uri;
}
server {
listen 443 ssl http2 default_server;
listen [::]:443 ssl http2 default_server;
ssl on;
ssl_certificate /etc/letsencrypt/live/wecode-it.fr/fullchain.pem;
ssl_certificate_key /etc/letsencrypt/live/wecode-it.fr/privkey.pem;
ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
ssl_prefer_server_ciphers on;
ssl_ecdh_curve secp384r1;
ssl_session_cache shared:SSL:10m;
ssl_session_tickets off; # Requires nginx >= 1.5.9
ssl_stapling on; # Requires nginx >= 1.3.7
ssl_stapling_verify on; # Requires nginx => 1.3.7
resolver 8.8.8.8 8.8.4.4 valid=300s;
resolver_timeout 5s;
add_header Strict-Transport-Security "max-age=63072000; includeSubdomains";
add_header X-Frame-Options DENY;
add_header X-Content-Type-Options nosniff;
ssl_dhparam /etc/ssl/certs/dhparam.pem;
location / {
proxy_pass https://MYIP:3030;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection 'upgrade';
proxy_set_header Host $host;
proxy_cache_bypass $http_upgrade;
}
}
I'm available to answer any of your questions.
Thank you.
You have to decide whether you want to go through the proxy or connect directly.
Some SSL settings are missing (if the certificate is local).
But for now, you can try this configuration:
server {
listen 443 ssl;
server_name backend1.example.com;
ssl_certificate /etc/ssl/certs/server.crt;
ssl_certificate_key /etc/ssl/certs/server.key;
#...
location /yourapp {
proxy_pass http://url_to_app.com;
#...
}
}
See more in the docs:
https://docs.nginx.com/nginx/admin-guide/security-controls/securing-http-traffic-upstream/#configuring-upstream-servers
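If you do decide to keep https on the upstream instead of offloading, here is a hedged sketch of the kind of extra directives that linked page describes. The hostname and certificate path are placeholders, not values from your setup:
location /yourapp {
# Hedged sketch: proxying to an https upstream instead of SSL offloading.
proxy_pass https://url_to_app.com;
# Verify the upstream certificate against a trusted CA bundle (placeholder path).
proxy_ssl_trusted_certificate /etc/ssl/certs/trusted_ca_cert.crt;
proxy_ssl_verify on;
proxy_ssl_verify_depth 2;
# Reuse SSL sessions to the upstream to reduce handshake overhead.
proxy_ssl_session_reuse on;
}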
I completely reinstalled my VPS, did every step from scratch, and it's now working. I think I got lost trying too many things, and starting all over again made it simple.
I am running a Django-based web application inside a set of Docker containers, and I'm trying to include both a REST API (using django-rest-framework) and the ReactJS app that consumes it. All my other apps are served over HTTPS, but I am running into Mixed Active Content errors when the React app hits the REST API inside the Docker network. The React app is hosted within my NGINX container and served up as a static site.
Here's the relevant config for my Nginx container:
# SSL Website
server {
listen 443 http2 ssl;
listen [::]:443 http2 ssl;
server_name *.domain.com;
ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
ssl_ciphers EECDH+CHACHA20:EECDH+AES128:RSA+AES128:EECDH+AES256:RSA+AES256:EECDH+3DES:RSA+3DES:!MD5;
ssl_prefer_server_ciphers on;
ssl_certificate /etc/nginx/ssl/my_cert.crt;
ssl_certificate_key /etc/nginx/ssl/my_key.key;
ssl_stapling on;
ssl_stapling_verify on;
access_log /home/logs/access.log;
error_log /home/logs/error.log;
upstream django {
server web:9000;
}
location /
{
include uwsgi_params;
# Proxy settings
proxy_pass http://django;
proxy_http_version 1.1;
proxy_buffering off;
proxy_set_header Host $http_host;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection "upgrade";
}
# REACT APPLICATION
location /faqs {
autoindex on;
sendfile on;
alias /usr/share/nginx/html/faqs;
}
}
During development the React app was hitting my REST API from outside the network, so resource calls used https, like so:
axios.get('https://myapp.domain.com/api/')
and everything went relatively smoothly, barring the occasional CORS error.
However, now that both the React app and the API are running inside the Docker network, NGINX is not involved in the communication between containers, and the routes look like this:
axios.get('http://web:9000/api')
This gives me the aggravating Mixed Active Content Error.
I've seen multiple questions similar to this, but most are either not using Docker containers or use NGINX directives I've already got in my config file. Given the popularity of Docker for these kinds of loosely coupled applications, I would imagine solutions abound for this kind of problem. Sadly I have not managed to come across any, and as such, any suggestions would be greatly appreciated.
Since your application serves both an API and a web client from the same endpoint, you have a "gateway" in nginx that routes all requests to one or the other. So far, common practice (although you are missing a load balancer, but that's a different discussion).
All requests to your API should be over https. You should also be serving your static site over https, with the same certificate, from the same domain. If this isn't the case, there is your problem.
Furthermore, all routes and URLs inside your React application should be relative. That means the React app doesn't need to know what your domain is. Neither, ideally, should your API, although that is sometimes harder to do.
Your axios call, given that the React app is served from the same domain over https, should be:
axios.get('/api')
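To make that relative call work, the nginx gateway has to route /api to the backend and everything else to the static React build, all on the same https origin. Here is a hedged sketch reusing the names from the config above; the /api prefix and the exact paths are assumptions for illustration:
# Hedged sketch: one https origin serving both the React build and the API,
# so the browser only ever makes relative, same-origin https requests.
upstream django {
server web:9000;
}
server {
listen 443 ssl http2;
server_name myapp.domain.com;
ssl_certificate /etc/nginx/ssl/my_cert.crt;
ssl_certificate_key /etc/nginx/ssl/my_key.key;
# API requests go to the Django container over the internal Docker network
location /api {
proxy_pass http://django;
proxy_set_header Host $http_host;
}
# Everything else is the static React build
location / {
root /usr/share/nginx/html/faqs;
try_files $uri /index.html;
}
}
The point is that the internal hostname web:9000 only ever appears in the proxy_pass; the React bundle never references it directly.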
I have an app where the admin side is built in Angular, but the consumer-facing front end is not, and it lives on a different server.
I'd like visitors going to:
domain.com/#/admin => Angular app
But if they go to:
domain.com
this goes to the other server.
Is this possible?
Thanks.
For a little more clarity, I think what I need is this:
I have a Rails app as an API that Angular consumes. Rails lives on server A and Angular lives on server B.
Both servers use nginx as the web server.
Also, the Rails app serves up its own content, so I want any path other than /admin to go to Rails.
What I want is: if users go to:
mydomain.com --> they hit the Rails app and content on server A
when they go to:
mydomain.com/admin or mydomain.com/#/admin --> they hit the Angular app on server B
I think I almost figured it out with the following nginx config on the Rails server, server A:
upstream serverA {
server rails;
server unix:///var/www/rails/shared/tmp/sockets/puma.sock;
}
upstream serverB {
server angular;
}
server {
listen 80;
server_name serverA;
root /var/www/rails/current/public;
location / {
proxy_set_header Host $host;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_pass http://serverA;
}
location ~* ^/assets/ {
expires 1y;
add_header Cache-Control public;
add_header Last-Modified "";
add_header ETag "";
break;
}
location /admin/ {
proxy_set_header Host $host;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_pass http://serverB/#/;
}
}
While this gets me close, my Angular scripts and styles that are referenced by my Angular index.html file are not being found.
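Here is a hedged sketch of one way this is often handled, assuming the Angular app can be built with a base href of /admin/ (e.g. ng build --base-href /admin/): drop the /#/ from proxy_pass, since the URL fragment is never sent to the server, and proxy the whole /admin/ prefix to server B:
# Hedged sketch: the fragment (#/...) never reaches nginx, so it cannot appear in
# proxy_pass. Proxy the /admin/ prefix as-is and let Angular's client-side router
# handle the #/ part in the browser.
location /admin/ {
proxy_set_header Host $host;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_pass http://serverB/;
}
With a base href of /admin/, the index.html asset URLs become /admin/... as well, so they match this location instead of falling through to the Rails location / block, which would explain the missing scripts and styles.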