I am unable to connect to Socket.IO in production.
This is my Traefik setup; my backend receives all calls that go to /api:
backend:
  build:
    context: ./backend
    dockerfile: Dockerfile
  command: ["npm", "run", "start"]
  labels:
    - "traefik.enable=true"
    - "traefik.http.routers.backend.rule=Host(`mydomain.com`) && PathPrefix(`/api`)"
  networks:
    - web
My Express server:
export const io = new Server(server, {
  cors: {
    origin: [
      "http://localhost:5173",
      "https://mysecretdomain.com",
    ],
  },
});
My React frontend:
let socket: any;
if (import.meta.env.MODE === 'development') {
  socket = io(API_CONFIGS.SOCKET_IO_URL);
} else {
  // the app runs with traefik and is available under the prefix /api
  socket = io(API_CONFIGS.SOCKET_IO_URL, { path: '/api/socket.io' });
}
In development everything works fine, but in production I'm getting a 404 error.
Example call from my domain:
https://mysecretdomain/api/socket.io/?EIO=4&transport=polling&t=OKSmUX8
Status:404
I am completely out of ideas. Can someone help me out?
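For what it's worth, one common cause of exactly this 404 (an assumption about the setup above, not something confirmed in the thread) is that the client was given path: '/api/socket.io' while the server was left at its default path of /socket.io, so the handshake URL never matches anything the server handles. A stdlib-only toy model of that mismatch:

```javascript
// Toy model (an illustration, not real Socket.IO code): the server answers
// only requests whose URL starts with its configured `path` option, which
// defaults to "/socket.io".
function handshakeStatus(serverPath, requestUrl) {
  return requestUrl.startsWith(serverPath) ? 200 : 404;
}

// Default server path vs. the production client's prefixed path:
console.log(handshakeStatus("/socket.io", "/api/socket.io/?EIO=4&transport=polling")); // 404
// If the server is also configured with the "/api/socket.io" path:
console.log(handshakeStatus("/api/socket.io", "/api/socket.io/?EIO=4&transport=polling")); // 200
```

If that is the cause, passing the same path option on the backend, new Server(server, { path: '/api/socket.io', ... }), should make client and server agree.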
Related
I have created two very simple containers to understand/test HTTP requests between containers, but for some reason I just can't get them communicating. I keep getting GET http://backend:5000/ net::ERR_NAME_NOT_RESOLVED.
My first container is a simple React app with no functionality; it just reads the container name from process.env.REACT_APP_URL and makes a GET request with fetch(`http://${url}:5000/`).
import { useState } from "react";

const MyComponent = () => {
  const [message, setmessage] = useState("Hello");

  async function buttonClick() {
    let url = process.env.REACT_APP_URL;
    try {
      let response = await fetch(`http://${url}:5000/`);
      console.log("This is response", response);
      setmessage(response.data);
    } catch (error) {
      console.log("error occured:", error);
    }
  }

  return (
    <>
      <p>{message}</p>
      <button onClick={buttonClick}>Click Me!</button>
    </>
  );
};

export default MyComponent;
My second container is another incredibly simple Flask app, with Hello World served on the homepage route and nothing else.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/")
def hello_world():
    return jsonify("Hello World"), 200

if __name__ == '__main__':
    app.run(host="0.0.0.0")
And their corresponding Dockerfiles:
FROM node:17.4.0-alpine
# set working directory
WORKDIR /app
# add `/app/node_modules/.bin` to $PATH
ENV PATH /app/node_modules/.bin:$PATH
# install app dependencies
COPY package.json ./
COPY package-lock.json ./
RUN npm install --silent
RUN npm install react-scripts@3.4.1 -g --silent
# add app
COPY . ./
# start app
CMD ["npm", "start"]
FROM python:3.6.5-alpine
RUN apk update && apk upgrade && apk add gcc musl-dev libc-dev libc6-compat linux-headers build-base git libffi-dev openssl-dev
COPY . .
RUN pip install -r requirements.txt
CMD ["python", "./myfile.py"]
Finally, I am using docker-compose to orchestrate these containers:
version: "3"
services:
  backend:
    build:
      context: ./server
    container_name: backend
    expose:
      - 5000
    ports:
      - 5000:5000
  frontend:
    build:
      context: ./web
    container_name: frontend
    expose:
      - 3000
    ports:
      - 3000:3000
    environment:
      - REACT_APP_URL=backend
    depends_on:
      - "backend"
    links:
      - "backend:backend"
My file system is as follows:
/sample-app
  |_ server
  |_ web
  |_ docker-compose.yml
I have been trying to understand what I am doing wrong and I just can't find it. I appreciate any help. 🙏 🙏
Your frontend code runs in the browser on the host machine, so it reaches the backend through the port published on the host; this is not container-to-container communication (which is what you would need if, say, the backend container wanted to connect to a DB container). Therefore the hostname should be localhost, not backend.
Try with the following change
frontend:
  ...
  ...
  environment:
    - REACT_APP_URL=localhost
  ...
  ...
You need to convert the response to JSON as well.
let response = await fetch(`http://${url}:5000/`);
const data = await response.json();
console.log("This is response", data);
setmessage(data);
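To summarize the rule this answer relies on (a sketch with assumed URLs, not code from the thread): requests issued by the browser must target the host-published port, while only code running in another container on the same Docker network can use the compose service name.

```javascript
// Hypothetical helper illustrating the addressing rule: the browser lives
// on the host, so it must use localhost plus the published port; the
// service name "backend" only resolves inside the Docker network.
function apiBaseUrl(callerIsBrowser) {
  return callerIsBrowser
    ? "http://localhost:5000"  // host-published port (ports: 5000:5000)
    : "http://backend:5000";   // container-to-container via Docker DNS
}

console.log(apiBaseUrl(true));  // http://localhost:5000
console.log(apiBaseUrl(false)); // http://backend:5000
```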
I'm trying to have my React front end application interact with a Flask API, both Dockerized and built together with docker-compose. Here is the docker-compose.yml file:
version: "3.9"
services:
  server:
    build: ./server
    ports:
      - "80:5000"
    volumes:
      - ./server:/app
    environment:
      FLASK_ENV: development
    env_file:
      - ./.env
  web:
    build: ./app
    ports:
      - "3000:3000"
    volumes:
      - ./app:/user/src/app
    depends_on:
      - server
The package.json looks like this:
{
  "name": "housing",
  "version": "0.1.0",
  "private": true,
  ...
  "proxy": "http://server:80"
}
And then in App.js file trying to call the API with:
callAPI( some_arg ) {
  var h = new Headers();
  h.append("Content-Type", "application/json");
  h.append("Access-Control-Allow-Origin", "*");

  var raw = JSON.stringify({"some_arg": some_arg});

  var requestOptions = {
    method: 'POST',
    headers: h,
    body: raw,
    redirect: 'follow'
  };

  const url = '/api/some_service';
  fetch(url, requestOptions).then(res => res.json()).then(data => {
    this.setState({some_component_data: data});
  });
}
Unfortunately doing this results in an error:
Proxy error: Could not proxy request /api/some_service from localhost:3000 to http://server:80.
It works fine if I replace server with 0.0.0.0 but I'd quite like to use the actual container name in package.json. How can I do this?
My use case is a little different (Django + Redis), but I would try some combination of these two things:
Remove the http:// and just use server:80.
Specify container_name in your docker-compose file. I don't know whether this is actually necessary or if the service name is used to connect, but it's worth a shot if the first change doesn't work on its own.
For my use case, the connection string is just redis://redis and the docker-compose section for that service looks like this:
redis:
  image: redis
  container_name: redis
  restart: always
  command: redis-server --requirepass <password>
  volumes:
    - redis_data:/data
  ports:
    - "6379:6379"
The Dockerfile for my React client:
FROM node:10
WORKDIR /app/client
COPY ["package.json", "package-lock.json", "./"]
RUN npm install --production
COPY . .
EXPOSE 3000
CMD ["npm", "start"]
The Dockerfile for my Express backend:
FROM node:10
WORKDIR /app/server
COPY ["package.json", "package-lock.json", "./"]
RUN ls
RUN npm install --production
COPY . .
EXPOSE 5000
CMD ["node", "server.js"]
My docker-compose.yml file in my project's root:
version: '3'
services:
  backend:
    build:
      context: ./backend
      dockerfile: ./Dockerfile
    image: "isaacasante/mcm-backend"
    ports:
      - "5000:5000"
  frontend:
    build:
      context: ./client
      dockerfile: ./Dockerfile
    image: "isaacasante/mcm-client"
    ports:
      - "3000:3000"
    links:
      - "backend"
My server.js file under my backend folder:
var express = require("express");
var cors = require("cors");
var app = express();
var path = require("path");
// Enable CORS and handle JSON requests
app.use(cors());
app.use(express.json());
app.post("/", function (req, res, next) {
  // console.log(req.body);
  res.json({ msg: "This is CORS-enabled for all origins!" });
});
// Set router for email notifications
const mailRouter = require("./routers/mail");
const readerRouter = require("./routers/reader");
const notificationsRouter = require("./routers/booking-notifications");
app.use("/email", mailRouter);
app.use("/reader", readerRouter);
app.use("/notifications", notificationsRouter);
if (process.env.NODE_ENV === "production") {
  app.use(express.static("mcm-app/build"));
  app.get("*", (req, res) => {
    res.sendFile(path.join(__dirname, "mcm-app", "build", "index.html"));
  });
}

app.listen(5000, function () {
  console.log("server starting...");
});
When I run:
docker-compose up
I get the following output in my terminal:
$ docker-compose up
Starting mcm_fyp_backend_1 ... done
Starting mcm_fyp_frontend_1 ... done
Attaching to mcm_fyp_backend_1, mcm_fyp_frontend_1
backend_1 | server starting...
frontend_1 |
frontend_1 | > mcm-app#0.1.0 start /app/client
frontend_1 | > react-scripts start
frontend_1 |
frontend_1 | ℹ 「wds」: Project is running at http://172.18.0.3/
frontend_1 | ℹ 「wds」: webpack output is served from
frontend_1 | ℹ 「wds」: Content not from webpack is served from /app/client/public
frontend_1 | ℹ 「wds」: 404s will fallback to /
frontend_1 | Starting the development server...
frontend_1 |
mcm_fyp_frontend_1 exited with code 0
My frontend exits with code 0, and I can't load my app. My backend is running, though.
What am I doing wrong, and how can I get my React-Express-Node app running with Docker Compose?
I found the solution to prevent my frontend service from exiting with code 0: I had to add tty: true for it in my docker-compose.yml file. Also, to make the frontend-backend interaction work as expected, I had to change the proxy entry in my client's package.json to the following:
"proxy": "http://backend:5000"
And I changed my links command in my docker-compose.yml to this:
links:
  - "backend:be"
After rebuilding, everything is working as intended.
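For reference, a sketch of what the fixed frontend service could look like (my reconstruction from the description above; the tty line is the addition, the rest is copied from the compose file earlier in the post):

```yaml
frontend:
  build:
    context: ./client
    dockerfile: ./Dockerfile
  image: "isaacasante/mcm-client"
  ports:
    - "3000:3000"
  tty: true      # keeps the react-scripts dev server from exiting with code 0
  links:
    - "backend:be"
```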
I'm trying to finalize an HTTPS connection for a production server I'm building but am running into issues.
I've tried a few guides on how to set it up individually for both React and Node but cannot seem to get the connection working.
Here's a simplified version of my server.js file for express:
const express = require('express');
const fs = require('fs');
const https = require('https');
const helmet = require('helmet');

const server = express();
server.use(helmet);

https.createServer({
  key: fs.readFileSync('server.key'),
  cert: fs.readFileSync('server.cert')
}, server)
  .listen(port, () => console.log(`Listening on port ${port}`));
And the frontend start script in my package.json:
"scripts": {
  "start": "set HTTPS=true && react-scripts start"
}
But when I go to https://localhost:3000 I get the following in the console, and the screen is just white:
Failed to load resource: net::ERR_EMPTY_RESPONSE index.js:1437
TypeError: Failed to fetch
Any thoughts on what I'm doing wrong? Thanks in advance for your time and for any help you can provide.
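One detail worth flagging in the snippet above (my observation, not something raised in the thread): helmet is a middleware factory, so it has to be called, server.use(helmet()), rather than passed bare as server.use(helmet). Registering the uncalled factory means requests never get a response, which matches the ERR_EMPTY_RESPONSE symptom. A stdlib-only toy of the factory pattern:

```javascript
// Toy middleware factory (hypothetical, not the real helmet API): like
// helmet, it must be *called* to produce the actual (req, res, next)
// handler that Express expects.
function makeSecurityMiddleware() {
  return function middleware(req, res, next) {
    res.headers["x-frame-options"] = "DENY"; // stand-in for helmet's headers
    next();
  };
}

const handler = makeSecurityMiddleware(); // note the call: the () matters
const res = { headers: {} };
let nextCalled = false;
handler({}, res, () => { nextCalled = true; });
console.log(nextCalled);                     // true: the chain continues
console.log(res.headers["x-frame-options"]); // DENY
```

Passing makeSecurityMiddleware itself (without the call) would hand Express a function whose signature does not behave like a request handler, and the request would simply hang.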
I'm using Gatsby with netlify-lambda, which creates a server for functions on port 9000:
http://localhost:9000/myFunctionName
In production the address of the functions is:
/.netlify/functions/myFunctionName
So I would like to have a dev mode proxy that serves http://localhost:9000/ when I call /.netlify/functions.
My custom Webpack config in gatsby-node.js:
exports.modifyWebpackConfig = ({ config, stage }) => {
  if (stage === 'develop') {
    config.merge({
      devServer: {
        proxy: {
          '/.netlify/functions': {
            target: 'http://localhost:9000',
            pathRewrite: {
              '^/\\.netlify/functions': ''
            }
          }
        }
      }
    })
  }
}
This does not work.
I also tried https://www.gatsbyjs.org/docs/api-proxy/#api-proxy, but I need to rewrite the URL, not just add a prefix to it.
What's the best way to use netlify-lambda with Gatsby?
Thanks
Update: Gatsby now supports Express middleware, added in this merged PR. This does not expose the webpack dev server's proxy configuration, but it does allow using ordinary Express proxy middleware.
To use it with netlify-lambda, just add this to your gatsby-config.js:
const proxy = require("http-proxy-middleware")

module.exports = {
  developMiddleware: app => {
    app.use(
      "/.netlify/functions/",
      proxy({
        target: "http://localhost:9000",
        pathRewrite: {
          "/.netlify/functions/": "",
        }
      })
    )
  }
}
https://www.gatsbyjs.org/docs/api-proxy/#advanced-proxying
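To illustrate what the pathRewrite entry above accomplishes (a toy reimplementation, not http-proxy-middleware's actual internals): the /.netlify/functions/ prefix is stripped before the request is forwarded, so /.netlify/functions/myFunctionName reaches the lambda server at localhost:9000 as /myFunctionName.

```javascript
// Toy version of the rewrite rule { "/.netlify/functions/": "" }:
// drop the prefix, keep the function name, forward to the dev server.
function rewriteToLambda(requestPath) {
  const prefix = "/.netlify/functions/";
  if (!requestPath.startsWith(prefix)) return requestPath; // untouched
  return "/" + requestPath.slice(prefix.length);
}

console.log(rewriteToLambda("/.netlify/functions/myFunctionName"));
// -> /myFunctionName
```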
Unfortunately, the Gatsby development proxy configuration is not at all extensible. Since Gatsby does not use the webpack dev server, its proxy options are not available. See https://github.com/gatsbyjs/gatsby/issues/2869#issuecomment-378935446
I achieve this use case by putting an nginx proxy in front of both my Gatsby development server and the netlify-lambda server. This is a very simplified version of the proxy server config (trailing slash is important):
server {
  listen 8001;

  location /.netlify/functions {
    proxy_pass http://0.0.0.0:9000/;
  }

  location / {
    proxy_pass http://0.0.0.0:8000;
  }
}
I'm using an nginx proxy anyhow, so that's not an undesirable amount of extra dev setup, but it would certainly be better if Gatsby supported this common case.