I'm trying to test my React application on a mobile device. I'm using ngrok to make my local server available to other devices and have gotten this working with a variety of other applications. However, when I try to connect ngrok to the React dev server, I get the error:
Invalid Host Header
I believe the React dev server rejects requests from other hosts by default. Any thoughts?
I'm encountering a similar issue and found two solutions that work, at least for viewing the application directly in a browser:
ngrok http 8080 --host-header="localhost:8080"
ngrok http --host-header=rewrite 8080
Obviously, replace 8080 with whatever port you're running on.
This solution still raises an error when I use it in an embedded page that pulls bundle.js from the React app. I think that because the header is rewritten to localhost, the embedded page ends up looking for the bundle on localhost, which the app is no longer running on.
Option 1
If you do not need authentication, you can add these flags to the ngrok command:
ngrok http 9000 --host-header=rewrite
or
ngrok http 9000 --host-header="localhost:9000"
But in this case authentication will not work on your website, because ngrok rewrites the Host header and the session is not valid for your ngrok domain.
Option 2
If you are using webpack you can add the following configuration:
devServer: {
disableHostCheck: true
}
In that case the authentication header will remain valid for your ngrok domain.
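Note that disableHostCheck was removed in webpack-dev-server 4; if you are on v4 or later, a rough equivalent (a sketch, assuming the newer option name) is:
devServer: {
  // replaces disableHostCheck: true in webpack-dev-server >= 4
  allowedHosts: 'all'
}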
I don't know why, but I tried everything above and it didn't work for me.
What finally worked for me is this:
ngrok http https://localhost:4200 -host-header="localhost:4200"
It might be useful for someone.
If you use webpack devServer, the simplest way is to set disableHostCheck (check the webpack docs), like this:
devServer: {
contentBase: path.join(__dirname, './dist'),
compress: true,
host: 'localhost',
// host: '0.0.0.0',
port: 8080,
disableHostCheck: true //for ngrok
},
I used this setup in a React app and it works. I created a config file named configstrp.js that contains the following:
module.exports = {
  ngrok: {
    // use the local frontend port to connect
    enabled: process.env.NODE_ENV !== 'production',
    port: process.env.PORT || 3000,
    subdomain: process.env.NGROK_SUBDOMAIN,
    authtoken: process.env.NGROK_AUTHTOKEN
  }
};
Require the file in the server.
const configstrp = require('./config/configstrp.js');
const ngrok = configstrp.ngrok.enabled ? require('ngrok') : null;
and connect as such
if (ngrok) {
  console.log('Starting ngrok tunnel');
  ngrok.connect(
    {
      addr: configstrp.ngrok.port,
      subdomain: configstrp.ngrok.subdomain,
      authtoken: configstrp.ngrok.authtoken,
      host_header: 'localhost:3000' // rewrite the Host header to match the local dev server
    },
    (err, url) => {
      if (err) {
        console.error('ngrok tunnel failed to open', err);
      } else {
        console.log(`ngrok tunnel opened at ${url}`);
      }
    }
  );
}
Do not pass a subdomain if you do not have a custom domain.
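Newer versions of the ngrok npm package are promise-based, so the same idea can be written roughly like this (a sketch, not taken from the setup above; adjust the port and options to your config):
const ngrok = require('ngrok');

(async () => {
  // connect() resolves with the public URL of the tunnel
  const url = await ngrok.connect({
    addr: 3000,                      // local port your app listens on
    authtoken: process.env.NGROK_AUTHTOKEN,
    host_header: 'localhost:3000'    // avoid the Invalid Host Header error
  });
  console.log(`ngrok tunnel opened at ${url}`);
})();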
Windows, ngrok v3
ngrok http <url> --host-header=<host>:<port>
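For example, with a React dev server on port 3000 (the port here is just an assumption), that becomes:
ngrok http http://localhost:3000 --host-header=localhost:3000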
Related
I have an issue with the @react-oauth/google npm package.
When I run the React app on port 3000 and the Django backend on port 8000, everything works. But after I build the React app and serve it on port 8000, when I try to log in via Google I get these errors:
Failed to load resource: the server responded with a status of 400 m=credential_button_library:45 [GSI_LOGGER]: The given origin is not allowed for the given client ID.
I double-checked 'Authorised JavaScript origins' and 'Authorised redirect URIs' (image attached),
but the given origin is allowed, so what could be the problem?
I read about similar problems here on the site and also tried ChatGPT, but nothing helped.
This is my configuration:
CORS_ALLOWED_ORIGINS = [
    "http://localhost:8000",
    "http://localhost:3000",
    "http://127.0.0.1:3000",
    "http://127.0.0.1:8000"
]

class GoogleLogin(SocialLoginView):
    adapter_class = GoogleOAuth2Adapter
    callback_url = ['http://localhost:8000', 'http://localhost:3000', 'http://127.0.0.1:8000', 'http://localhost:8000/accounts/google/login/callback/'] # !
    client_class = OAuth2Client
Comment out CORS_ALLOWED_ORIGINS and try CORS_ALLOW_ALL_ORIGINS = True. If that doesn't work, try removing http://localhost:3000 and http://127.0.0.1:8000 from both places (CORS_ALLOWED_ORIGINS and GoogleLogin). If neither of those works, try running your React app on port 3000, change the port in the URL to 8000, and try again.
Forgive bad formatting as it is my first question on here, and thanks in advance for reading!
I am currently writing a remote web application that utilises Apache Guacamole to allow RDP, VNC, and SSH connections. The components I am using are:
Django for backend server - API calls (database info) and Guacamole Websocket Transmissions;
I am using Pyguacamole with Django consumers to handle Guacamole Server communication;
Reactjs for frontend and proxy;
Nginx for reverse proxy;
All this is hosted on a CentOS Stream 8 VM.
Basically, my websocket has trouble communicating through a proxy. When I run the application without a proxy (Firefox on the CentOS machine hitting localhost:3000 directly), the Guacamole connection works! Though in that case the application communicates directly with the Django server on port 8000. What I want is for the React application to proxy websocket communications to port 8000 for me, so my Nginx proxy only has to deal with port 3000 in production.
Here is the code I have tried for my react proxy (src/setupProxy.js):
const { createProxyMiddleware } = require('http-proxy-middleware');

let proxy_location = '';

module.exports = function(app) {
  app.use(createProxyMiddleware('/api', { target: 'http://localhost:8000', changeOrigin: true, logLevel: "debug" }));
  app.use(createProxyMiddleware('/ws', { target: 'ws://localhost:8000' + proxy_location, ws: true, changeOrigin: true, logLevel: "debug" }));
};
I have also already tried with http://localhost:8000 for the ws target url. Also, the api proxy works, but I am unsure if the ws proxy works. After making a websocket request, the consumer does a guacamole handshake, but disconnects the websocket before it can send anything back.
Also, the HPM output shows that it does try upgrading to websocket, but the client disconnects immediately.
Do let me know if you require more information.
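As a side note, for the /ws proxy to be exercised at all, the socket has to be opened against the React dev server's own origin (port 3000) rather than against port 8000 directly; a minimal client-side sketch (the /ws/guacamole/ path is only an illustrative placeholder):
// Open the socket against the page's own origin so setupProxy.js can forward it to :8000
const protocol = window.location.protocol === 'https:' ? 'wss' : 'ws';
const socket = new WebSocket(`${protocol}://${window.location.host}/ws/guacamole/`);

socket.onopen = () => console.log('websocket open');
socket.onmessage = (event) => console.log('guacamole data:', event.data);
socket.onclose = (event) => console.log('websocket closed with code', event.code);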
I managed to find what was wrong. It was a small mistake, but I felt the need to update this thread.
Basically, in my consumers I used accept() instead of websocket_accept(), receive() instead of websocket_receive(), and so on. A careless mistake on my part, but I hope this helps someone out!
The Error
When deploying to Azure Web Apps with Multi-container support, I receive an "Invalid Host Header" message from https://mysite.azurewebsites.net
Local Setup
This runs fine.
I have two Docker containers: client, a React app, and server, an Express app hosting my API. I am using a proxy to reach my API on server.
In client's package.json I have defined:
"proxy": "http://localhost:3001"
I use the following docker compose file to build locally.
version: '2.1'
services:
  server:
    build: ./server
    expose:
      - ${APP_SERVER_PORT}
    environment:
      API_HOST: ${API_HOST}
      APP_SERVER_PORT: ${APP_SERVER_PORT}
    ports:
      - ${APP_SERVER_PORT}:${APP_SERVER_PORT}
    volumes:
      - ./server/src:/app/project-server/src
    command: npm start

  client:
    build: ./client
    environment:
      - REACT_APP_PORT=${REACT_APP_PORT}
    expose:
      - ${REACT_APP_PORT}
    ports:
      - ${REACT_APP_PORT}:${REACT_APP_PORT}
    volumes:
      - ./client/src:/app/project-client/src
      - ./client/public:/app/project-client/public
    links:
      - server
    command: npm start
Everything runs fine.
On Azure
When deploying to Azure I have the following. The client and server images are stored in Azure Container Registry, and according to the logs they appear to load just fine.
In my App Service > Container Settings I am loading the images from Azure Container Registry (ACR) and I'm using the following configuration (Docker compose) file.
version: '2.1'
services:
  client:
    image: <clientimage>.azurecr.io/clientimage:v1
    build: ./client
    expose:
      - 3000
    ports:
      - 3000:3000
    command: npm start

  server:
    image: <serverimage>.azurecr.io/<serverimage>:v1
    build: ./server
    expose:
      - 3001
    ports:
      - 3001:3001
    command: npm start
I have also defined the following in Application Settings:
WEBSITES_PORT set to 3000.
This results in the "Invalid Host Header" error on my site.
Things I've tried
• Serving the app from the static folder in server. This works in that it serves the app, but it messes up my authentication. I need to be able to serve the static portion from client's App.js and have that talk to my Express API for database calls and authentication.
• In my docker-compose file, binding the front end to:
  ports:
    - 3000:80
• A few other port combinations but no luck.
Also, I think this has something to do with the proxy in client's package.json based on this repo
Any help would be greatly appreciated!
Update
It is the proxy setting.
This somewhat solves it: by removing "proxy": "http://localhost:3001" I am able to load the website, but the answer suggested there does not work for me, i.e. I am now unable to access my API.
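If the proxy has to go, one possible workaround (just a sketch; REACT_APP_API_URL is a made-up variable name, not part of the original setup) is to point the client at the API explicitly and configure the host per environment:
// client/src/api.js -- illustrative sketch
import axios from 'axios';

// CRA only exposes env vars prefixed with REACT_APP_; set this per environment
const api = axios.create({
  baseURL: process.env.REACT_APP_API_URL || 'http://localhost:3001',
  withCredentials: true, // keep cookies flowing for authentication
});

export default api;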
I've never used Azure before, and I also don't use a proxy (due to its random connection issues), but if your application is basically running Express, you can utilize CORS. (As a side note, it's more common to run your Express server on port 5000 than 3001.)
I first set up an env folder with a config.js file like so:
module.exports = {
  development: {
    database: 'mongodb://localhost/boilerplate-dev-db',
    port: 5000,
    portal: 'http://localhost:3000',
  },
  production: {
    database: 'mongodb://localhost/boilerplate-prod-db',
    port: 5000,
    portal: 'http://example.com',
  },
  staging: {
    database: 'mongodb://localhost/boilerplate-staging-db',
    port: 5000,
    portal: 'http://localhost:3000',
  }
};
Then, depending on the environment, I can implement cors where I'm defining express middleware:
const express = require('express');
const cors = require('cors');
const config = require('./path/to/env/config.js');

const env = process.env.NODE_ENV;
const app = express();

// allow the portal for the current environment to send credentialed requests
app.use(
  cors({
    credentials: true,
    origin: config[env].portal,
  }),
);
Please note the portal and the AJAX requests MUST have matching host names. For example, if my application is hosted on http://example.com, my front-end API requests must be made to http://example.com/api/ (not http://localhost:3000/api/ -- see how I implement it for my website), and the portal env must match the host name http://example.com. This setup is flexible and necessary when running multiple environments.
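One simple way to keep the hosts matched (a sketch; none of this comes from the boilerplate itself) is to derive the API base URL from the page's own origin:
// api.js -- illustrative only
import axios from 'axios';

// Requests go to the same origin the page was served from, so the host always matches
const api = axios.create({
  baseURL: `${window.location.origin}/api`,
  withCredentials: true,
});

export default api;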
Or, if you're using create-react-app, simply eject your app and implement a proxy inside the webpack production configuration.
Or migrate your application to my fullstack boilerplate, which implements the cors example above.
So, I ended up having to move off of containers and serve the React app up in more of a typical MERN architecture with the Express server hosting the React app from the static build folder. I set up some routes with PassportJS to handle my authentication.
Not my preferred solution, I would have preferred to use containers, but this works. Hope this points someone out there in the right direction!
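For reference, a minimal sketch of that arrangement (the paths and the ./routes/api router module are illustrative, not taken from the original project):
// server/index.js -- minimal sketch of serving the CRA build from Express
const path = require('path');
const express = require('express');

const app = express();

// API routes are mounted first so they are not swallowed by the static handler
app.use('/api', require('./routes/api')); // hypothetical router module

// Serve the compiled React app from its build folder
app.use(express.static(path.join(__dirname, '../client/build')));

// Fall back to index.html so client-side routing still works
app.get('*', (req, res) => {
  res.sendFile(path.join(__dirname, '../client/build', 'index.html'));
});

app.listen(process.env.PORT || 3001);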
I am having a bit of trouble setting up my create-react-app application to proxy requests to my test hosting on Microsoft Azure. I have set up the proxy in my app's package.json as follows:
"proxy":{
"/api/*":{
"target":"https://mytestbackend.azurewebsites.net",
"secure":false
}
}
I have set up an axios request to be sent to the backend server on Azure. It is in a stand-alone .js file which I call from one of my React application's events. It looks like this:
import axios from 'axios';

const login = async (username, password) => {
  console.log("Username to send is:" + username);
  console.log("password to send is:" + password);
  let response = await axios.post('/api/user/login', { username: username, password: password });
  console.log(response);
};

export { login };
The problem can't be in my React components, because those two console.log() calls show that the entered values are being received. If I remove the "secure": false setting from package.json, the request fails with HTTP Error 500. But if I use the secure setting, it fails with a 404 page. Can someone please shed a little light on what I am doing wrong? Can I only use the proxy on localhost? The documentation suggests otherwise. Any help is greatly appreciated.
I have verified that CORS is enabled for the domain on which the dev server is running on the Azure Management Portal. And if I do the request by using the backend's URL directly (that is, not using the create-react-app proxy), it works. The problem must be something in the way the proxy is configured.
The response text for the HTTP Error 500, which happens when not using secure, is:
Proxy error: Could not proxy request /api/user/login from localhost:3000 to https://mytestbackend.azurewebsites.net (undefined).
Additional info: I have also tested by running my backend locally on my development machine. The error message still occurs, but instead of "undefined" the parenthesis says "UNABLE_TO_VERIFY_LEAF_SIGNATURE". If using "secure": false, I can call the login endpoint successfully, but calls to other endpoints which require authentication fail because the cookie is not sent by axios.
Doing:
curl -v https://mytestbackend.azurewebsites.net/api/user/login
Has this output:
* SSLv3, TLS handshake, Client hello (1):
* SSLv3, TLS handshake, Server hello (2):
* SSLv3, TLS handshake, CERT (11):
* SSLv3, TLS alert, Server hello (2):
* SSL certificate problem: unable to get local issuer certificate
* Closing connection #0
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: http://curl.haxx.se/docs/sslcerts.html
curl performs SSL certificate verification by default, using a "bundle"
of Certificate Authority (CA) public keys (CA certs). If the default
bundle file isn't adequate, you can specify an alternate file
using the --cacert option.
If this HTTPS server uses a certificate signed by a CA represented in
the bundle, the certificate verification probably failed due to a
problem with the certificate (it might be expired, or the name might
not match the domain name in the URL).
If you'd like to turn off curl's verification of the certificate, use
the -k (or --insecure) option.
create-react-app uses webpack-dev-server, which in turn uses http-proxy-middleware: https://github.com/chimurai/http-proxy-middleware#options
So you can use all of the options from that package.
One key header that is important when the backend is hosted externally is host. This can cause issues if it is not correct; see this example:
Websocket works on EC2 url but not on ElasticBeanstalk URL
Next, the cookies might be associated with localhost. I checked and they should go through without any modification, but you might want to use the cookieDomainRewrite: "" option as well.
So the final config I would use is below:
"proxy":{
"/api/*":{
"target":"https://mytestbackend.azurewebsites.net",
"secure":false,
"headers": {
"host": "mytestbackend.azurewebsites.net"
},
"cookieDomainRewrite": ""
}
}
Also, on your client you want to use withCredentials: true:
let userinfo =await axios.get('/api/secured/userinfo',{withCredentials:true});
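If most of your requests need credentials, you can also set this once globally instead of per call (this is standard axios, not something specific to the setup above):
// set once, e.g. in your app's entry point
axios.defaults.withCredentials = true;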
create-react-app's proxy is backed by http-proxy-middleware, so it should support the full set of its options.
Some things I would try:
The path to match may be /api/** instead of /api/* if you want to match multiple levels deep (e.g. for /api/user/login).
You may need to add changeOrigin: true if you're proxying to something remotely (not on localhost)
You will likely want to keep secure: false as you aren't running localhost with https.
So in total, I would try
"proxy":{
"/api/**": {
"target": "https://mytestbackend.azurewebsites.net",
"secure": false,
"changeOrigin": true
}
}
After days of trying unsuccessfully to do this, I finally found a setup that works. The proxy is configured like this:
"proxy": {
"/api/user/login": {
"target": "https://localhost:44396",
"logLevel": "debug",
"secure": false
},
"/api/secured/userinfo": {
"target": "https://localhost:44396",
"secure": false,
"logLevel":"debug",
"secure":false
}
Requests to both endpoints on the client use withCredentials: true:
try {
  await axios({
    method: 'post',
    url: '/api/user/login',
    withCredentials: true,
    data: { username: username, password: password }
  });
  let userinfo = await axios.get('/api/secured/userinfo', { withCredentials: true });
  return userinfo;
} catch (err) {
  // handle a failed login or userinfo request
  console.error(err);
}
As you can see, I've moved to testing on my local dev machine. For whatever reason, this setup refuses to work on the azure-hosted backend. I would have preferred that it work as I originally intended, but at least now I can continue with my project.
I am finding myself pretty stuck using grunt-connect-proxy to make calls from my Yeoman-generated Angular app running on port 9000 to my Laravel backend, which is running on port 8000. After following the instructions on the grunt-connect-proxy GitHub page, I see the following message upon running grunt serve:
Running "configureProxies:server" (configureProxies) task
Proxy created for: /api to localhost:8000
I have my proxies set up here in connect.proxies directly following connect.options:
proxies: [{
  context: '/api', // the context of the data service
  host: 'localhost', // wherever the data service is running
  port: 8000 // the port that the data service is running on
}],
In my controller I then attempt to make a call to the API to test my proxy:
var Proxy = $resource('/api/v1/purchase');
Proxy.get(function(test){
console.log(test);
});
The result of this in my console is a 500 error, which makes me think the call was still handled on port 9000 rather than being proxied to 8000:
http://localhost:9000/api/v1/purchase 500 (Internal Server Error)
Here is a link to a gist containing my full gruntfile: https://gist.github.com/JohnBueno/7d48027f739cc91e0b79
I have seen quite a few posts on this but so far none of them have been of much help to me.
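One thing worth double-checking (a sketch based on the grunt-connect-proxy and grunt-contrib-connect READMEs, not on the gist above) is that the proxy middleware is actually registered in the connect middleware chain of the serve target:
// Gruntfile.js (excerpt) -- illustrative only
var proxySnippet = require('grunt-connect-proxy/lib/utils').proxyRequest;

// inside the connect target's options:
middleware: function (connect, options, middlewares) {
  // put the proxy in front of the generator's default middleware
  // so /api/* requests are forwarded to localhost:8000 before static serving
  middlewares.unshift(proxySnippet);
  return middlewares;
}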