I have a React and .NET 6 project, each in a separate container and brought up together on the same Docker virtual network via Docker Compose. Both start and run fine, but the React frontend is unable to communicate with the backend.
For whatever reason, I can access the weather API on my backend from my browser (it still doesn't work from the frontend), but not any other endpoint. https://localhost:5001/api/weatherforecast returns the generated weather data (the default tutorial weather API).
But when I go to other endpoints such as https://localhost:5001/api/user?userId=1, I get the following: Error: The SSL connection could not be established, see inner exception.
I'm not sure whether it is an issue with setupProxy.js or something wrong with the self-signed cert on the .NET backend.
docker-compose.yml
version: "3.8"

networks:
  api-network:

services:
  server:
    build:
      context: ./server-testing/server
      args:
        PROXY: ${PROXY}
    container_name: 'server'
    ports:
      - '5000:5000'
      - '5001:5001'
    networks:
      - api-network
    environment:
      - ASPNETCORE_ENVIRONMENT=Development
      - ASPNETCORE_Kestrel__Certificates__Default__Password=${PW}
      - ASPNETCORE_Kestrel__Certificates__Default__Path=/https/aspnetapp.pfx
    volumes:
      - ~/.aspnet/https:/https:ro
  client:
    build:
      context: ./client
      args:
        PROXY: ${PROXY}
    container_name: 'client'
    depends_on:
      - server
    ports:
      - '3000:3000'
    networks:
      - api-network
    stdin_open: true  # Keep STDIN open even if not attached
    tty: true         # Allocate a pseudo-TTY
Frontend Setup
The production React build is served by an nginx layer.
In my package.json, I have "start": "set HTTPS=true&&react-scripts start" instead of "start": "react-scripts start".
// setupProxy.js
const { createProxyMiddleware } = require('http-proxy-middleware');

const context = [
  "/api"
];

module.exports = function (app) {
  const appProxy = createProxyMiddleware(context, {
    target: 'https://server:5001/',
    secure: false,      // don't reject the backend's self-signed certificate
    changeOrigin: true
  });
  app.use(appProxy);
};
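For reference, a string entry in http-proxy-middleware's `context` array acts as a path-prefix filter, so only requests whose path begins with `/api` are forwarded to the target. A minimal sketch of that matching rule (a hypothetical helper for illustration, not the library's actual code):

```javascript
// Hypothetical illustration of how a string context behaves as a prefix filter:
// a request is proxied when its path starts with one of the configured prefixes.
const context = ['/api'];

function shouldProxy(pathname, prefixes) {
  return prefixes.some((prefix) => pathname.startsWith(prefix));
}

console.log(shouldProxy('/api/user', context));      // true  -> forwarded to the target
console.log(shouldProxy('/static/app.js', context)); // false -> served by the dev server
```

So `/api/weatherforecast` and `/api/user` would both match the configured context here; the filter itself is not the difference between the working and failing endpoints.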
Backend Setup
Self-signed certs generated with dotnet dev-certs https -ep %USERPROFILE%\.aspnet\https\aspnetapp.pfx -p { password here }
Three endpoints:
/weatherforecast
/user/
/cart/
Endpoints are accessed as https://localhost:5001/api/endpoint?params
If you need any more information, please let me know. Thanks!
Related
Basically I've set up a web app with this stack: db: MySQL, frontend: React.js, backend: FastAPI (Python).
It's served over SSL (the domain is proxied through Cloudflare).
NGINX is used for the service endpoints: api.domain.com is the API, domain.com is the frontend, plus some SSL key handling.
Even though this is only nominally "production", I'm still running a React development server for prototyping.
**PROBLEM:**
I'm running this project on a VPS. When I update the frontend on the VPS, I delete all Docker containers and images via the commands:
docker rm -vf $(docker ps -aq)     # Delete all containers and volumes
docker rmi -f $(docker images -aq) # Delete all images
The changes still remain old. Caching in my browser is disabled as well, and I've tried multiple methods of clearing the cache, so it's not that.
The webpack bundle.js file still has its old contents when served by React. Specifically, I edited some endpoint constants for production, and those constants are stale in bundle.js. Changes to the backend (Python files) work just fine and are picked up on docker-compose up -d / docker-compose up, but React is acting up.
I've tried:
docker-compose pull
docker-compose build --no-cache
docker-compose
Made sure all my frontend files were in fact saved.
No luck...
The API is running fine on HTTPS, and React.js is also running fine, but the changes are just stale; it's very weird.
docker-compose.yml
version: "3.9"

services:
  db:
    image: mysql:${MYSQL_VERSION}
    restart: always
    environment:
      - MYSQL_DATABASE=${MYSQL_DB}
      - MYSQL_USER=${MYSQL_USERNAME}
      - MYSQL_PASSWORD=${MYSQL_PASSWORD}
      - MYSQL_ROOT_PASSWORD=${MYSQL_ROOT_PASSWORD}
    ports:
      - "${MYSQL_PORT}:${MYSQL_PORT}"
    expose:
      - "${MYSQL_PORT}"
    volumes:
      - db:/var/lib/mysql
    networks:
      - mysql_network
  backend:
    container_name: fastapi-backend
    build: ./backend/app
    volumes:
      - ./backend:/code
    ports:
      - "${FASTAPI_PORT}:${FASTAPI_PORT}"
    env_file:
      - .env
    depends_on:
      - db
    networks:
      - mysql_network
      - backend
    restart: always
  frontend:
    container_name: react-frontend
    build: ./frontend/client
    ports:
      - "${REACT_PORT}:${REACT_PORT}"
    depends_on:
      - backend
    networks:
      - backend
    restart: always

volumes:
  db:
    driver: local

networks:
  backend:
    driver: bridge
  mysql_network:
    driver: bridge
This was working for my last release; for some reason it just stopped updating. Docker is creating the images fine, just with stale code...
This is being run on Ubuntu 22.04 (a Debian-based Linux distribution).
I'm new to Docker and containers in general. I'm trying to containerize a simple MERN-based todo list application. Locally on my PC, I can successfully send HTTP POST requests from my React frontend to my Node.js/Express backend and create a new todo item. I use the 'proxy' field in my client folder's package.json file, as shown below:
React starts up on port 3000, my API server starts up on 3001, and with the proxy field defined, all is good locally.
My issue arises when I containerize the three services (i.e. React, API server, and MongoDB). When I try to make the same fetch post request, I receive the following console error:
I will provide the code for my docker-compose file; perhaps it is useful in finding a solution?
version: '3.7'

services:
  client:
    depends_on:
      - server
    build:
      context: ./client
      dockerfile: Dockerfile
    image: jlcomp03/rajant-client
    container_name: container_client
    command: npm start
    volumes:
      - ./client/src/:/usr/app/src
      - ./client/public:/usr/app/public
      # - /usr/app/node_modules
    ports:
      - "3000:3000"
    networks:
      - frontend
    stdin_open: true
    tty: true
  server:
    depends_on:
      - mongo
    build:
      context: ./server
      dockerfile: Dockerfile
    image: jlcomp03/rajant-server
    container_name: container_server
    # command: /usr/src/app/node_modules/.bin/nodemon server.js
    volumes:
      - ./server/src:/usr/app/src
      # - /usr/src/app/node_modules
    ports:
      - "3001:3001"
    links:
      - mongo
    environment:
      - NODE_ENV=development
      - MONGODB_CONNSTRING='mongodb://container_mongodb:27017/todo_db'
    networks:
      - frontend
      - backend
  mongo:
    image: mongo
    restart: always
    container_name: container_mongodb
    volumes:
      - mongo-data:/data/db
    ports:
      - "27017:27017"
    networks:
      - backend

volumes:
  mongo-data:
    driver: local
  node_modules:
  web-root:
    driver: local

networks:
  backend:
    driver: bridge
  frontend:
My intuition tells me the issue(s) lies in some configuration parameter I am not addressing in my docker-compose.yml file? Please help!
Your proxy config won't work with containers because of its use of localhost.
The Docker bridge network docs provide some insight why:
Containers on the default bridge network can only access each other by IP addresses, unless you use the --link option, which is considered legacy. On a user-defined bridge network, containers can resolve each other by name or alias.
I'd suggest creating your own bridge network and communicating via container name or alias.
{
  "proxy": "http://container_server:3001"
}
Another option is to use http://host.docker.internal:3001.
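If you want the same client code to work both on the host and inside Compose, the proxy target can be derived from an environment variable instead of being hard-coded. A sketch under the assumption that your Compose file sets something like `IN_DOCKER=true` for the client service (the variable name and targets here are hypothetical):

```javascript
// Hypothetical helper: pick the API target based on where the client runs.
// IN_DOCKER is an assumed env var you would set in docker-compose.yml.
function apiTarget(env) {
  if (env.IN_DOCKER === 'true') {
    // Inside Compose, the server is reachable by its container name
    // on the user-defined bridge network.
    return 'http://container_server:3001';
  }
  // On the host, fall back to the published port on localhost.
  return 'http://localhost:3001';
}

module.exports = apiTarget;

console.log(apiTarget(process.env));
```

This keeps one codebase working in both environments without editing package.json per machine.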
I have created Docker containers for a React app and a Flask app. As of now, I am getting the error: Proxy error: Could not proxy request /backend_endpoint from localhost:3000 to http://api:5000. In my package.json, I have the line "proxy": "http://api:5000". I read in another post that you are supposed to replace localhost:5000 with container_name:5000 in package.json, which is why I have "proxy": "http://api:5000". I would appreciate any advice/insight, thank you. Here is my docker-compose.yml:
updated docker-compose.yml
```
version: '3.7'
services:
  client:
    build:
      context: ./react-app
      dockerfile: Dockerfile
    tty: true
    ports:
      - "3000:3000"
    volumes:
      - ./react-app:/app
      - /app/node_modules
  api:
    build:
      context: ./react-app/api
      dockerfile: Dockerfile
    ports:
      - "5000:5000"
```
Error message: Proxy error: Could not proxy request /user_entry from localhost:3000 to http://api:5000. See https://nodejs.org/api/errors.html#errors_common_system_errors for more information (ECONNREFUSED).
I have three components in my docker-compose, mysql db, flask server and react webapp.
When I do request to the server in the chrome, it gives me:
OPTIONS http://server:5000/getTables net::ERR_CONNECTION_TIMED_OUT
But if I get into react container using docker exec -it and then curl -X POST server:5000/getTables, it works fine.
This is how I set up CORS in the Flask app:
@app.route('/getTables', methods=['GET', 'POST', 'OPTIONS'])
@cross_origin(origin='*')
This is my docker-compose.yaml:
version: '3'

services:
  db:
    image: mysql:5.7
    restart: always
    environment:
      MYSQL_DATABASE: 'TEST'
      MYSQL_USER: 'xxx'
      MYSQL_PASSWORD: 'xxx'
      MYSQL_ROOT_PASSWORD: 'xxx'
    ports:
      - '3306:3306'
    expose:
      - '3306'
    volumes:
      - my-db:/var/lib/mysql
  server:
    build: ./flask
    restart: on-failure
    ports:
      - '5000:5000'
    expose:
      - '5000'
    volumes:
      - ./flask:/server
  web:
    build: ./react
    ports:
      - 80:3000

volumes:
  my-db:
This is the flask-server log:
server_1 | * Serving Flask app "app" (lazy loading)
server_1 | * Environment: production
server_1 | WARNING: This is a development server. Do not use it in a production deployment.
server_1 | Use a production WSGI server instead.
server_1 | * Debug mode: off
server_1 | * Running on http://0.0.0.0:5000/ (Press CTRL+C to quit)
Is there anything wrong with my docker-compose file? Thank you in advance for the help!
EDIT
It seems to be an issue with my React web app container's network setup. When I run the React app locally it works fine, but it will not connect to the external API once I put it inside Docker.
If you can send a request to localhost:5000 from your host machine, then your docker-compose is fine. You just need to add
127.0.0.1 server
to your hosts file.
The Error
When deploying to Azure Web Apps with Multi-container support, I receive an "Invalid Host Header" message from https://mysite.azurewebsites.com
Local Setup
This runs fine.
I have two Docker containers: client, a React app, and server, an Express app hosting my API. I am using a proxy in client to reach the API on server.
In client's package.json I have defined:
"proxy": "http://localhost:3001"
I use the following docker compose file to build locally.
version: '2.1'

services:
  server:
    build: ./server
    expose:
      - ${APP_SERVER_PORT}
    environment:
      API_HOST: ${API_HOST}
      APP_SERVER_PORT: ${APP_SERVER_PORT}
    ports:
      - ${APP_SERVER_PORT}:${APP_SERVER_PORT}
    volumes:
      - ./server/src:/app/project-server/src
    command: npm start
  client:
    build: ./client
    environment:
      - REACT_APP_PORT=${REACT_APP_PORT}
    expose:
      - ${REACT_APP_PORT}
    ports:
      - ${REACT_APP_PORT}:${REACT_APP_PORT}
    volumes:
      - ./client/src:/app/project-client/src
      - ./client/public:/app/project-client/public
    links:
      - server
    command: npm start
Everything runs fine.
On Azure
When deploying to Azure I have the following. client and server images have been stored in Azure Container Registry. They appear to load just fine from the logs.
In my App Service > Container Settings I am loading the images from Azure Container Registry (ACR) and I'm using the following configuration (Docker compose) file.
version: '2.1'

services:
  client:
    image: <clientimage>.azurecr.io/clientimage:v1
    build: ./client
    expose:
      - 3000
    ports:
      - 3000:3000
    command: npm start
  server:
    image: <serverimage>.azurecr.io/<serverimage>:v1
    build: ./server
    expose:
      - 3001
    ports:
      - 3001:3001
    command: npm start
I have also defined WEBSITES_PORT as 3000 in Application Settings.
This results in the "Invalid Host Header" error on my site.
Things I've tried
• Serving the app from the static folder in server. This works in that it serves the app, but it messes up my authentication. I need to be able to serve the static portion from client's App.js and have that talk to my Express API for database calls and authentication.
• In my docker-compose file binding the front end to:
ports:
- 3000:80
• A few other port combinations but no luck.
Also, I think this has something to do with the proxy in client's package.json based on this repo
Any help would be greatly appreciated!
Update
It is the proxy setting.
This somewhat solves it. By removing "proxy": "http://localhost:3001" I am able to load the website, but the suggested answer in that question does not work for me; i.e., I am now unable to access my API.
I've never used Azure before, and I also don't use a proxy (due to its random connection issues), but if your application is basically running Express, you can utilize CORS. (As a side note, it's more common to run your Express server on 5000 than on 3001.)
I first set up an env/config.js folder and file like so:
module.exports = {
  development: {
    database: 'mongodb://localhost/boilerplate-dev-db',
    port: 5000,
    portal: 'http://localhost:3000',
  },
  production: {
    database: 'mongodb://localhost/boilerplate-prod-db',
    port: 5000,
    portal: 'http://example.com',
  },
  staging: {
    database: 'mongodb://localhost/boilerplate-staging-db',
    port: 5000,
    portal: 'http://localhost:3000',
  },
};
Then, depending on the environment, I can implement cors where I'm defining express middleware:
const cors = require('cors');
const config = require('./path/to/env/config.js');
const env = process.env.NODE_ENV;

app.use(
  cors({
    credentials: true,
    origin: config[env].portal,
  }),
);
Please note the portal and the AJAX requests MUST have matching host names. For example, if my application is hosted on http://example.com, my front-end API requests must target http://example.com/api/ (not http://localhost:3000/api/), and the portal env must match the host name http://example.com. This setup is flexible and necessary when running multiple environments.
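Following that idea, the front end can build its request URLs from the per-environment portal value so the host names always line up. A small sketch (the config values and `/api` prefix are illustrative, mirroring the config.js shape earlier in this answer):

```javascript
// Sketch: derive absolute API URLs from the per-environment portal value,
// so requests always share the portal's host name (values are illustrative).
const config = {
  development: { portal: 'http://localhost:3000' },
  production: { portal: 'http://example.com' },
};

function apiUrl(env, path) {
  // Same origin as the portal, so the CORS origin check always passes.
  return `${config[env].portal}/api${path}`;
}

console.log(apiUrl('production', '/login')); // 'http://example.com/api/login'
```

With this in place, switching NODE_ENV changes both the CORS origin and the request URLs together.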
Or if you're using the create-react-app, then simply eject your app and implement a proxy inside the webpack production configuration.
Or migrate your application to my fullstack boilerplate, which implements the cors example above.
So, I ended up having to move off of containers and serve the React app in a more typical MERN architecture, with the Express server hosting the React app from its static build folder. I set up some routes with PassportJS to handle my authentication.
Not my preferred solution, I would have preferred to use containers, but this works. Hope this points someone out there in the right direction!