I'm building a simple web app (MEAN) and deploying it to GCP (Google Cloud Platform). I have built an application where my client runs on port 4200 and the server runs on port 3000. I have two choices: combine my client and server to run on the same port, or simply allow the connection between 4200 and 3000. Currently, I get an error when trying to proxy requests between these two ports. Any ideas how I can open 3000 so 4200 can talk to it?
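For reference, the usual dev-time approach is to let the Angular CLI dev server proxy API calls over to 3000. A minimal sketch, assuming your Express routes live under /api (the /api path and the file name proxy.conf.json are assumptions, adjust to your own routes):

{
  "/api": {
    "target": "http://localhost:3000",
    "secure": false,
    "changeOrigin": true
  }
}

Then start the client with ng serve --proxy-config proxy.conf.json; the browser only ever talks to 4200 while the dev server forwards /api requests to 3000.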
Related
I have a microservice that is deployed on a Payara app server. Everything works so far. Now I am trying to include an embedded Redis server which listens on port 6379.
In order to use this, I have to open that port and ensure that traffic to this port is redirected to my application.
I have already configured a network listener on 6379, so Payara is listening on this port too.
But how can I ensure that traffic on this port is forwarded to my own application?
I'm running a local Express server on port 5000. However, when I fetch http://localhost:5000/api from my React web app (XAMPP), the server does not respond. How would I self-host my Express app to make it accessible from anywhere, not just during development?
Ideally, if your Express server is running on port 5000, then it should be accessible at localhost:5000 or 127.0.0.1:5000. Try looking at the console logs in your browser for details (and provide more information). Another thing to look out for is whether it's running on HTTP or HTTPS in production.
By default the server will run on all interfaces, not just localhost, so you can access it remotely by using its IP address on that network. (see https://stackoverflow.com/a/33957043/9726680)
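A minimal sketch of what that looks like in the Express app itself (names are illustrative, assuming a plain Express setup):

const express = require('express');
const app = express();

app.get('/api', (req, res) => res.json({ ok: true }));

// app.listen(port) with no host argument binds to all interfaces (0.0.0.0),
// so the server is reachable via the machine's LAN or public IP as well.
app.listen(5000, () => console.log('Listening on port 5000 on all interfaces'));

// To restrict it to local access only, bind explicitly to 127.0.0.1:
// app.listen(5000, '127.0.0.1');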
I have a Rails application deployed on Google App Engine. I visit myapp.com/resque with a local Redis server running and it works. When doing this in production, I can't seem to connect. I have a Redis VM instance deployed on Google Compute Engine and I cannot redis-cli -h 123.123.123:6379 into it from any of the servers. It only returns Could not connect to Redis at 11.111.11.1:6379: Connection refused. I've tried using both the internal and external IPs, with no luck.
I had the same issue.
It was due to the bind setting in the Redis config. By default, Redis is bound only to 127.0.0.1.
It's in the redis.conf file:
bind 127.0.0.1
As mentioned in the Redis security documentation, it's there so that only trusted clients are allowed to connect, and by default that is only localhost.
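A rough sketch of the relevant redis.conf change (these are standard Redis directives, but verify them against your Redis version, and note that exposing Redis beyond localhost without a password is dangerous):

# Default: only local clients can connect
# bind 127.0.0.1
# Bind to all interfaces (or, better, only the VM's internal IP)
bind 0.0.0.0
# Require a password if Redis is reachable from other machines
requirepass your-strong-password

On Compute Engine you will likely also need a firewall rule that allows tcp:6379 from the machines that should reach Redis.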
I have a .NET app that is serving up an Angular 1.4 app. I'm running a second Angular app in the nav bar that connects to a second server running Node.js and Socket.io.
I can connect the client to the server when I'm running the server on my localhost via
io.connect('http://localhost:8080')
but not when I'm trying to connect to my Heroku instance via
io.connect('http://my-weird-heroku-generated-url.com').
Is this because I'm not specifying the Heroku port? Do I need to use the unaliased numeric IP address of my Heroku app (similar to the 127.0.0.1:8080 address/port for my localhost)?
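For context: Heroku routes public traffic on ports 80/443 and hands your dyno a dynamic port through the PORT environment variable, so a common pattern (URL and names illustrative) looks like this:

// server.js on Heroku: listen on the port Heroku assigns, not a hard-coded 8080
const server = require('http').createServer();
const io = require('socket.io')(server);
const port = process.env.PORT || 8080;
server.listen(port, () => console.log('Socket.io listening on ' + port));

// client side (browser): connect to the public Heroku URL with no explicit port
io.connect('http://my-weird-heroku-generated-url.com');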
The current setup:
Google Compute Engine running Windows Server 2012 (GCE Server 2012)
Google Compute Engine running Debian Wheezy (GCE Server Wheezy)
GCE Server 2012 has one open port, TCP 3389, to GCE Server Wheezy
GCE Server Wheezy is running Guacamole (with NLA enabled) on Tomcat 7 and is working at x.x.x.x:8080/guacamole/
So I have what I hope is a secure connection between GCE Server 2012 and GCE Server Wheezy. Now I want to be able to access x.x.x.x:8080/guacamole/ securely, but the setup with SSL has been difficult.
What I want to know is whether it's possible to access GCE Server Wheezy through Google App Engine, which already has great SSL protection. Essentially, I would like to open one port and IP address (range), and/or some sort of internal connection between a GCE website and GAE, and then access everything through GAE. My assumption is that since traffic between GCE and GAE never leaves Google's internal infrastructure and they are tied only to each other, this would be an easy and affordable way to add powerful SSL encryption to my Guacamole/Tomcat setup.
Alternative ideas to easily add SSL to my setup would also be greatly appreciated.
Set up the HTTP load balancer and you're set in a few mouse clicks:
HTTP/HTTPS load balancing provides global load balancing for incoming HTTP or HTTPS requests, allowing these requests to be sent to different sets of backends based on patterns in the URL. HTTP requests can be load balanced based on port 80 or port 8080. HTTPS requests can be load balanced on port 443. HTTPS load balancing also supports SPDY and HTTP/2. HTTP/HTTPS load balancing does not support WebSocket.
See https://cloud.google.com/compute/docs/load-balancing/http/
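A rough command-line sketch of that setup (resource names, zone, and certificate files are placeholders; flags can vary between gcloud versions):

gcloud compute instance-groups unmanaged create guac-group --zone us-central1-a
gcloud compute instance-groups unmanaged add-instances guac-group --zone us-central1-a --instances gce-server-wheezy
gcloud compute instance-groups unmanaged set-named-ports guac-group --zone us-central1-a --named-ports http:8080
gcloud compute health-checks create http guac-health --port 8080 --request-path /guacamole/
gcloud compute backend-services create guac-backend --global --protocol HTTP --port-name http --health-checks guac-health
gcloud compute backend-services add-backend guac-backend --global --instance-group guac-group --instance-group-zone us-central1-a
gcloud compute url-maps create guac-map --default-service guac-backend
gcloud compute ssl-certificates create guac-cert --certificate cert.pem --private-key key.pem
gcloud compute target-https-proxies create guac-proxy --url-map guac-map --ssl-certificates guac-cert
gcloud compute forwarding-rules create guac-https --global --target-https-proxy guac-proxy --ports 443

Keep in mind the caveat in the quote above: this load balancer does not support WebSocket, which is relevant for Guacamole.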