So, I am trying to retrieve a file from a remote host and copy it to my local Ubuntu instance. I can ssh into the remote host from my Ubuntu instance just fine, but when I try to connect with scp to transfer the file I get the error "connection timed out".
I am using the following command:
sudo scp angela@192.168.194.57:/bm/data/'2021-02-03 21.22.23 - northvancouver.pdf' /home/angela
I think it might have to do with the firewall/security settings on the server. The server requires an SSH key to connect, which I have set up on my Ubuntu instance. If I try to connect the same way I do with ssh (e.g. angela@buildmapperserver-arbutus, the server name I normally ssh into), the address is not recognized, which I assume is because scp needs an IP address?
I don't know what the issue is or how to fix it. We have a GitHub repository synced to the server, and I would just push the files to the repo as a shortcut, but some of the files are too large for that.
Any ideas? If you need more information, just ask.
Are you able to successfully connect to the remote server using the command below?
ssh angela@192.168.194.57
If ssh works, then check the firewall settings; they may be blocking the file transfer.
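As a quick check from the client side, you can test whether anything is reachable on the SSH port at that IP (scp uses the same port 22 as ssh); something like this should do, assuming netcat is installed:
nc -vz 192.168.194.57 22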
Using sftp instead of scp works, so I am doing that.
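For reference, the equivalent transfer over sftp looks roughly like this, with the path taken from the question; the get command is run at the sftp prompt:
sftp angela@192.168.194.57
get "/bm/data/2021-02-03 21.22.23 - northvancouver.pdf" /home/angela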
You can try specifying the path to the key:
scp -i path/to/private/key user@ip:source/path target/path
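Applied to the command from the question, that would look roughly like the line below; the key path ~/.ssh/id_rsa is an assumption, and the spaces are backslash-escaped because classic scp passes the remote path through the remote shell a second time:
scp -i ~/.ssh/id_rsa "angela@192.168.194.57:/bm/data/2021-02-03\ 21.22.23\ -\ northvancouver.pdf" /home/angela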
Related
I have an ASP.NET Core app that connects to a SQL Server database. When I run it in Docker on my local computer, everything works as expected, but when I run the Docker image on a Linux server (CentOS 8) I get a network error when trying to connect to the database. I don't know what to do, since I used the actual server's IP in the connection string and it still does not work.
Thanks a lot
You should configure the firewall correctly and check whether SELinux is blocking you in some way.
Just to speed up your testing, try again after running these commands:
sudo su -
systemctl disable firewalld
setenforce 0
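If the connection works after that, you know which layer was in the way; you can then turn both back on and look for a narrower fix, for example by inspecting the recent SELinux denials:
setenforce 1
systemctl enable --now firewalld
ausearch -m avc -ts recent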
I have got VSCode Remote via SSH working successfully between my Mac client and our CentOS server. Our server is not online by default, and I have to manually set proxies for it to download the Server.
The VSCode Server wants to re-wget itself every now and then when I connect. When it does, I need to kill the connection, set the proxies, and reconnect to allow it to download the Server files.
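For context, "setting the proxies" here just means exporting the usual variables that wget honours on the server before reconnecting; the proxy host below is a placeholder:
export http_proxy=http://proxy.example.com:3128
export https_proxy=http://proxy.example.com:3128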
My question: Is this wget procedure required (i.e. important updates) or can I bypass it?
Thanks in advance
I'm trying to add a PostgreSQL database as a data source in IntelliJ IDEA Ultimate.
I've worked with a data source through ONE SSH tunnel already. But now the database server is behind a firewall which only accepts SSH connections from a management server. The only way to access the DB server goes over the management server.
So I (or IntelliJ) have to connect via ssh to this server and then, by using another user, tunnel via ssh to the database server.
Everything clear? :-D
The problem is that IntelliJ only offers to configure one SSH tunnel. But after the first tunnel I need to use a second one to finally connect to the database server...
Any Ideas?
Thx in advance.
I'd create a local port forward using OpenSSH or any similar tool which forwards 127.0.0.1:2222 to firewall:22 via the management server, then use the IntelliJ IDEA tunnel configuration pointing at 127.0.0.1:2222 like you would with a single tunnel.
ssh -L 127.0.0.1:2222:firewall:22 <management server>
You can configure an External Tool to automate this process. On a Windows machine I had a great experience with Bitvise SSH Client for creating tunnels/port forwards and starting them automatically.
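If you don't want to retype the command each time, roughly the same forward can also live in ~/.ssh/config, so a plain ssh -N mgmt-tunnel brings it up; the host alias here is made up:
Host mgmt-tunnel
    HostName <management server>
    LocalForward 127.0.0.1:2222 firewall:22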
ssh supports your scenario out of the box. The trick is to create two entries in your ~/.ssh/config file for the management server, one for your-user and one for special-user. Then use ProxyJump to chain your connections together.
So, start by setting up a Host section for the management server and the user you are connecting to from your local machine:
Host mgmt
HostName management.server.com
User your-user
...
Then, set up a Host for the user on the management server that you will be logging in as:
Host mgmt-special-user
HostName management.server.com
User special-user
To this same host, add a directive to tell ssh to create a tunnel to your DB:
LocalForward <free-port-on-your-box> <db-ip-or-host>:<db-port>
Then tell ssh that this host can be reached from the first host:
ProxyJump mgmt
You can now ssh mgmt-special-user from your local machine. ssh will automatically jump through the mgmt host, and will also automatically extend the tunnel through mgmt and back to your local machine.
ProxyJump (-J) was added in OpenSSH 7.3 (released in 2016).
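Putting the pieces together, the relevant part of ~/.ssh/config might look roughly like this; the DB host, the free local port 15432, and Postgres's default port 5432 are placeholders you would substitute:
Host mgmt
    HostName management.server.com
    User your-user

Host mgmt-special-user
    HostName management.server.com
    User special-user
    ProxyJump mgmt
    LocalForward 15432 db.internal.example:5432
Once ssh mgmt-special-user is up, IntelliJ can be pointed at localhost:15432 as if the database were running locally, without any tunnel configuration of its own.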
I would like to access an Azure SQL database. It is accessible on port 1433. However, the firewall is blocking this request. Therefore, I would like to tunnel my connection.
Setup
The client is running on Windows 10. I can connect to a remote Linux server. This server runs at home, and when I'm at home I am able to access the SQL database, so I assume my Linux server is also able to connect to the Azure database.
Tunnel request?
I want to access the database via an ODBC connection. So, I tried to tunnel the connection using PuTTY:
PuTTY connects to the Linux server and tunnels localhost:2433 to server-url:1433
Client connects to localhost:2433
Client has access to database
However, this is not working.
What goes wrong?
I am able to connect to the Linux server using PuTTY.
I have setup a tunnel inside putty at the Connection > SSH > Tunnels page:
Source port: 2433
Destination: server-url:1433
I have set the radio buttons to Local and auto.
What is going wrong here? I don't know how to investigate this properly. Maybe there is a problem on my Linux server and it can't connect to Azure SQL, but I think my tunnel is not working correctly. Can you help?
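One way to separate those two possibilities is to test each hop on its own; for instance, from the Linux server you can check whether the database port is reachable at all (server-url being the placeholder used above):
nc -vz server-url 1433
If that succeeds but the tunnelled connection from Windows still fails, the problem is more likely in the tunnel setup than on the server.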
I didn't get it to work with PuTTY, but if you have access to a shell you can use:
ssh -L <local_port>:localhost:<remote_port> user@server -i "path to your private key file if you need one for authentication"
So in your specific example it would be
ssh -L 2433:localhost:1433 user@server-url
I used this to create an ODBC connection from RStudio (you can use the built-in terminal to establish the tunnel) to a Postgres DB running in a Docker container on Azure.
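Once the tunnel is up, the ODBC side only needs to point at the forwarded local port; for Azure SQL a connection string would look roughly like the line below, where the driver version, database name, and credentials are assumptions:
Driver={ODBC Driver 17 for SQL Server};Server=127.0.0.1,2433;Database=mydb;Uid=myuser;Pwd=mypassword;
Note that Azure SQL enforces encrypted connections, so you may also need to allow the certificate name mismatch (e.g. TrustServerCertificate=yes) when connecting through 127.0.0.1 instead of the real host name.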
I want to know if it is possible to manage a PostgreSQL database on OpenShift using pgAdmin, in the same way that we can manage an Amazon RDS database.
Use port forwarding on your local machine to establish a connection to your remote server:
rhc port-forward -a applicationName
Now check the output on your command line to see which port the Postgres database service is mapped to; if available, this will be 5432 by default.
A step-by-step guide to port forwarding is available at Getting Started with Port Forwarding on OpenShift.
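Once the forward from rhc port-forward is active, you can sanity-check it from the command line before pointing pgAdmin at it; the database name and user below are placeholders:
psql -h 127.0.0.1 -p 5432 -U adminUser dbName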
Another way would be to configure the SSH tunnel directly in your connection settings:
Tunnel host would be your application's name
Username would be the first part of the string you use when ssh'ing to that host, i.e. the XXX in XXX@application-name
Identity file would be id_rsa in the .ssh folder in your home directory
Password would be your OpenShift password
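For comparison, those settings roughly correspond to a manual tunnel of the form below, where XXX is your gear/user ID and the database host is whatever address the gear reports for PostgreSQL (a placeholder here):
ssh -i ~/.ssh/id_rsa -L 5432:<postgres-host>:5432 XXX@application-name
pgAdmin (or psql) can then connect to 127.0.0.1:5432 on your own machine.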