Does WebLogic support local configuration across multiple managed servers? (WebLogic 11g)

In WebLogic, is there a way to ensure that every machine in the domain (with managed servers) has the local configuration files?

Related

How do I connect to a local SQL Server database from an ASP.NET Core application running inside local Docker or Kubernetes?

I created an ASP.NET Core Web API using VS2017. After that I enabled Docker support for my application.
Next, I implemented the EF Core feature. I tested the application locally and it worked fine; the database was also created. But whenever I run the application inside local Docker or local Kubernetes, it doesn't work as I expected, because I used the local SQL Server: the container running inside Docker or Kubernetes doesn't know about the SQL Server instance or its database.
Can anyone suggest how to use the local database from a container running inside either Docker or Kubernetes?
You need to give the container the host's IP. On Docker Desktop you can use the hostname "host.docker.internal" to connect to the host machine; on Linux it is not defined by default, but recent Docker versions let you add it with --add-host=host.docker.internal:host-gateway.
If this hostname does not work for you, you have two IP addresses to try. One is Docker's gateway, which should start with 10.x.x.x or 172.x.x.x depending on how you set it up. Normally, to learn this one, run docker inspect <container> and look for the default gateway in the network section. However, Kubernetes might change these, and it might provide a better means to access the host. I have not used Kubernetes, so I don't know.
The other option is the IP address of the host assigned by your network using DHCP. It should normally start with 192.168.x.x.
Your containers should be able to access applications on your host using these IP addresses. If the problem persists, turn off your firewall and try pinging the host from inside the containers.
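A quick way to see which of these addresses actually reaches the host from inside the container is a small TCP probe. A minimal sketch in Python; the gateway and DHCP addresses below are examples (substitute what docker inspect reports), and 1433 is SQL Server's default port:

```python
import socket

def can_reach(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers refused connections, timeouts, and DNS failures
        return False

# Candidates to try from inside the container; the two IPs are examples.
for candidate in ("host.docker.internal", "172.17.0.1", "192.168.1.10"):
    if can_reach(candidate, 1433):  # 1433 = SQL Server's default port
        print(f"SQL Server reachable via {candidate}")
        break
```

Whichever candidate succeeds is the one to put in the application's connection string.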

Restrict database connection to application running from a network share using TNSNames

I have a somewhat unique (though probably not) situation. I have users who access a 3rd-party application over a network share. This application connects to an Oracle database. The problem is, we have Production, QA, Test, and Dev databases, with separate shares/applications for each, but the application doesn't care which database it connects to. So I have users launching the Test application for testing and logging into the Production database. This causes major issues.
Is there any way to restrict what database they log into by network share?
I tried using tnsnames.ora on each server that houses each version of the application, and that works great... if they run it on the server. But since all users have Oracle installed on their local machines and they run the application from a network share, their local Oracle client takes over and allows them to connect to any database (using LDAP).
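One possible workaround (a sketch, untested against this exact setup) is to launch the application through a small wrapper on each share that points TNS_ADMIN at a directory shipped on that share: the tnsnames.ora there lists only that environment's database, and a sqlnet.ora with NAMES.DIRECTORY_PATH=(TNSNAMES) keeps the local client from falling back to LDAP. The share path and executable name below are hypothetical:

```python
import os
import subprocess

def env_with_local_tns(share_dir):
    """Copy the current environment, overriding TNS_ADMIN so Oracle name
    resolution uses the tnsnames.ora/sqlnet.ora shipped on the share."""
    env = dict(os.environ)
    env["TNS_ADMIN"] = os.path.join(share_dir, "tns")
    return env

def launch_app(share_dir, exe="app.exe"):
    """Start the shared application with the restricted TNS configuration.
    share_dir and exe are placeholders for the real share layout."""
    return subprocess.Popen([os.path.join(share_dir, exe)],
                            env=env_with_local_tns(share_dir))
```

Users would launch the wrapper instead of the executable directly, so each share can only resolve its own database alias.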

Should we have a Database Server without an External IP?

I was thinking of not setting any external IP addresses on the CockroachDB cluster that I will be launching, as I would be connecting to the cluster using the internal IP addresses anyway.
I am deploying the cluster on GCP (Google Cloud Platform) and would have other application servers that would have External IPs and would interface with the Database Cluster.
I wanted to know if there are any drawbacks to the above configuration, or if it is generally not good practice.
In my opinion it is a good decision to avoid external IPs as much as possible, especially for databases and backend servers. It is an increase in security at the expense of convenience in connecting (e.g. direct SSH).
You'll still be able to SSH to such servers as needed by opening SSH tunnels from other VMs in the same VPC, or over a VPN (OpenVPN, tinc, ...).
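To illustrate the tunnel approach, here is a small helper that builds the ssh port-forwarding command (a sketch; the bastion hostname and internal IP are placeholders, and 26257 is CockroachDB's default SQL port):

```python
def ssh_tunnel_cmd(bastion, db_host, db_port=26257, local_port=26257):
    """Build an ssh command that forwards local_port on this machine to
    db_port on the internal-only database node, via a bastion VM that
    does have an external IP. -N means no remote command, just forwarding."""
    return ["ssh", "-N", "-L", f"{local_port}:{db_host}:{db_port}", bastion]

# Example (placeholder names): after running this command, point your SQL
# client at localhost:26257 and traffic is relayed to the internal node.
cmd = ssh_tunnel_cmd("bastion.example.com", "10.128.0.5")
```

The database nodes themselves never need an external IP; only the bastion is exposed.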

How to connect the local host to an internet server in NetBeans (Java)?

I have developed a NetBeans project with a database connected using a local Derby server. When I distribute this project to other computer systems, how do I make updates done on the table through other systems reflect on all the other systems? Does this involve connecting the host to an internet server?

How to use my local database when developing an Openshift application?

I'm giving the OpenShift platform a try, but I don't understand how to configure it to use my local database instance (MySQL, Postgres, MongoDB...) when doing local testing.
Should I use environment variables like OPENSHIFT__DB_HOST on my local machine?
Could I use Maven profiles or something like that to use a different datasource depending on the environment?
Thanks
I have a Python/MongoDB app. I use:
import os

if 'OPENSHIFT_DATA_DIR' in os.environ:
    # On OpenShift: read the connection details from its environment variables.
    db_host = os.environ['OPENSHIFT_MONGODB_DB_HOST']
    db_port = int(os.environ['OPENSHIFT_MONGODB_DB_PORT'])
else:
    # Local: use the local DB, or reach the OpenShift DB via port forwarding.
    db_host, db_port = 'localhost', 27017
This tells me whether I'm running on OpenShift or on my local PC.
I have two separate databases: a small local one for testing and the actual one on OpenShift.
If on OpenShift, I just use their environment variables. If on my PC, I just connect to my local development database.
You could also use the port-forward command rhc port-forward -a <app> and then connect to your OpenShift DB instance. You would still have to determine whether you are local or on OpenShift to connect correctly.