I have a microservice that is deployed on a Payara app server. Everything works so far. Now I am trying to include an embedded Redis server that listens on port 6379.
In order to use this, I have to open that port and ensure that the traffic to it is redirected to my application.
I have already configured a network listener on 6379, so Payara is listening on this port too.
But how can I make sure that traffic on this port is forwarded to my own application?
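For reference, such a listener can be created with asadmin roughly like this (the listener name and the protocol it is attached to are placeholders, not values from my setup):

    # hypothetical names; attach the listener to whatever protocol you actually use
    asadmin create-network-listener --listenerport 6379 --protocol http-listener-1 redis-listener
    asadmin list-network-listeners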
If there is a VOLTTRON Central deployment on ZMQ, would I need to open network ports on a firewall if the actual VOLTTRON Central instance is behind that firewall?
Basically I am looking at deploying an edge device in a building to collect some BACnet data (a temporary research deployment), and hoping to point this edge device instance at our central VOLTTRON instance, which runs a SQL historian behind a firewall.
Does ZMQ run on port 5555? And would I need to have our firewall opened up, or port forwarding on this port, to handle the bidirectional ZMQ bus?
By default VOLTTRON itself is on port 22916 (this is the ZMQ port VOLTTRON uses); this is configured independently of the web port. When initializing a web instance there are more dependencies required than the initial bootstrap.py installs, so you will want to run bootstrap.py --web to make sure those are added.
If the edge devices will have a web server on them, then there must be an inbound connection from a browser or code to reach that edge device. In order for a VOLTTRON Central agent to connect to an edge device, the edge device instance must have the vip-address of the central instance in its $VOLTTRON_HOME/config file or within the platform agent's config file. Edge devices should have the VolttronCentralPlatform agent installed on them for this scenario.
ZMQ can run on whatever port you configure it to use. To configure VOLTTRON, set the vip-address in the ~/.volttron/config file to whatever port you would like, e.g. vip-address=tcp://127.0.0.1:22916 (bound only to the 127.0.0.1 IP address).
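As a rough sketch of what that config file can look like (the values here are examples, not requirements), it is INI-style:

    # $VOLTTRON_HOME/config (typically ~/.volttron/config)
    [volttron]
    # ZMQ (VIP) bus address; use 0.0.0.0 instead of 127.0.0.1 to listen on all interfaces
    vip-address = tcp://127.0.0.1:22916
    # only needed if this instance also serves the web UI (requires bootstrap.py --web)
    # bind-web-address = https://your-host:8443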
I am making a webapp that allows users to connect to their own database and run various commands. How do I make it so that they can connect to anything BUT localhost (I don't want them messing with my database)? I do need to have localhost access for my website backend stuff so I can't simply disable it in one of the config files.
I am using Golang if that makes any difference.
They would be connecting to their own remote hosts, not to a database on localhost.
Add pg_hba.conf entries that reject connections from Unix sockets and all local IP addresses.
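Assuming the database is PostgreSQL and the website backend connects as its own role (the role name backend below is just an example), a minimal pg_hba.conf sketch could look like the following; entries are matched top to bottom, so the backend's allow rules must come before the reject rules:

    # allow the website backend's own role over the Unix socket and loopback
    local   all   backend                      peer
    host    all   backend   127.0.0.1/32       scram-sha-256
    host    all   backend   ::1/128            scram-sha-256
    # reject everyone else arriving via the Unix socket or local addresses
    local   all   all                          reject
    host    all   all       127.0.0.1/32       reject
    host    all   all       ::1/128            reject

Reload PostgreSQL after editing (for example with SELECT pg_reload_conf();) for the changes to take effect.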
I'm running a local Express server on port 5000. However, when I fetch http://localhost:5000/api from my React web app (served by XAMPP), the server does not respond. How would I self-host my Express app to make it accessible from anywhere, not just during development?
Ideally, if your Express server is running on port 5000 then it should be accessible at localhost:5000 or 127.0.0.1:5000. Try looking at the console logs in your browser for details (and provide more information). Another thing to look out for is whether it's running on HTTP or HTTPS in production.
By default the server will run on all interfaces, not just localhost, so you can access it remotely by using its IP address on that network (see https://stackoverflow.com/a/33957043/9726680).
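As a minimal sketch (the /api route and port 5000 come from the question, the rest is illustrative), an Express app that listens without specifying a host is bound to all interfaces and reachable via the machine's LAN IP:

    // minimal Express server exposing /api on port 5000
    const express = require('express');
    const app = express();

    app.get('/api', (req, res) => {
      res.json({ status: 'ok' });
    });

    // omitting the host (or passing '0.0.0.0') binds to every interface,
    // so http://<server-ip>:5000/api also works from other machines
    app.listen(5000, () => console.log('Express listening on port 5000'));

To make it reachable from anywhere rather than just the local network, the machine still has to be exposed to the Internet (port forwarding on the router or hosting on a public server), typically behind a reverse proxy.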
The current setup:
Google Compute Engine running Windows Server 2012 (GCE Server 2012)
Google Compute Engine running Debian Wheezy (GCE Server Wheezy)
GCE Server 2012 has one open port, TCP 3389, to GCE Server Wheezy
GCE Server Wheezy is running Guacamole (with NLA enabled) and Tomcat 7, serving at x.x.x.x:8080/guacamole/
So I have what I hope is a secure connection between GCE Server 2012 and GCE Server Wheezy. Now I want to be able to access x.x.x.x:8080/guacamole/ securely, but setting up SSL has been difficult.
What I want to know is whether it's possible to access GCE Server Wheezy through Google App Engine, which already has great SSL protection. Essentially, I would like to open one port and IP address (range) and/or some sort of internal connection between a GCE website and GAE, and then access everything through GAE. My assumption is that since traffic between GCE and GAE never leaves Google's internal infrastructure and they are tied only to each other, this would be an easy and affordable way to add strong SSL encryption to my Guacamole/Tomcat setup.
Alternative ideas to easily add SSL to my setup would also be greatly appreciated.
Set up the HTTP load balancer and you're set in a few mouse clicks...
HTTP/HTTPS load balancing provides global load balancing for incoming HTTP or HTTPS requests, allowing these requests to be sent to different sets of backends based on patterns in the URL. HTTP requests can be load balanced based on port 80 or port 8080. HTTPS requests can be load balanced on port 443. HTTPS load balancing also supports SPDY and HTTP/2. HTTP/HTTPS load balancing does not support WebSocket.
See https://cloud.google.com/compute/docs/load-balancing/http/
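If you prefer the command line to the console, here is a rough gcloud sketch of putting an HTTPS load balancer in front of the Tomcat backend on port 8080; every resource name, the zone, and the certificate files are placeholders, and it assumes the Guacamole VM is already in an instance group:

    # map a named port on the (hypothetical) instance group to Tomcat's 8080
    gcloud compute instance-groups set-named-ports guac-ig --named-ports http8080:8080 --zone us-central1-a
    gcloud compute ssl-certificates create guac-cert --certificate=cert.pem --private-key=key.pem
    gcloud compute health-checks create http guac-hc --port 8080 --request-path /guacamole/
    gcloud compute backend-services create guac-backend --protocol HTTP --port-name http8080 --health-checks guac-hc --global
    gcloud compute backend-services add-backend guac-backend --instance-group guac-ig --instance-group-zone us-central1-a --global
    gcloud compute url-maps create guac-map --default-service guac-backend
    gcloud compute target-https-proxies create guac-proxy --url-map guac-map --ssl-certificates guac-cert
    gcloud compute forwarding-rules create guac-https --global --target-https-proxy guac-proxy --ports 443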
I am making an application and I need to connect to a database that is on an Amazon server.
It works fine locally, but I need direct access to the database without SSL tunneling.
In the AWS console, port 3306 is opened.