Need to copy a remotely hosted file via a shell command

There is a file hosted remotely on a server that does not support shell access. I bought a new server that does support shell access, so now I want to copy the file from the old server to the new one via a shell command using PuTTY.
The file URL is like this http://www.domain.com/file.gzip and it is username/password protected.
To be more specific, I want to copy a backup of a home directory from cPanel to my new server via a shell command. I did this a few months ago, but I don't remember how anymore, and I also failed to find it on Google.

Why don't you use wget? That way you can download your files to your new server:
$ wget --user='userhere' --password='myPassword' http://youroldhost/backup.zip
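If the old host serves the backup over HTTP with a username and password, the command above should work as-is. As a rough follow-up sketch (assuming the URL from the question, and that the backup turns out to be a gzipped tar archive, which a cPanel home-directory backup normally is), you can give it a predictable local name and unpack it on the new server:
$ wget --user='userhere' --password='myPassword' -O backup.tar.gz http://www.domain.com/file.gzip
$ tar -xzvf backup.tar.gz
The -O option only sets the local filename; if the file is a plain gzip rather than a tar archive, gunzip would be the extraction step instead.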

Related

Azure Data Studio: unable to restore SQL Server database - Access Denied

I'm in the process of migrating the database from one server to another. When I try to select the backup file (.bak) within Docker, I get an 'Access Denied' error. How can I give the Docker container access to the file?
I had the same error, and what I concluded is that the problem was the file not being in the shape it should be. To be precise: since I have remote access to my Linux server with a GUI, I had simply copied the .bak file from the Windows machine to Linux, which, I repeat, is not the advised way to transfer files from one OS to another. I solved the problem by uploading the .bak to Google Drive, generating a download link, and then using that link in the terminal.
To generate a proper link from Google Drive I recommend the following guide:
https://bytesbin.com/skip-google-drive-virus-scan-warning-large-files/
When you get the link type in the terminal:
$ curl -L -o The_Name.bak "The_link"
For restoring the database follow this tutorial:
https://learn.microsoft.com/en-us/sql/linux/tutorial-restore-backup-in-sql-server-container?view=sql-server-ver16
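For reference, a minimal sketch of the container side, assuming the container is named sql1, that the image still ships sqlcmd at /opt/mssql-tools/bin/sqlcmd, and that the password and database name below are placeholders; the .bak is the one downloaded with the curl command above:
$ docker exec -it sql1 mkdir -p /var/opt/mssql/backup
$ docker cp The_Name.bak sql1:/var/opt/mssql/backup/
$ docker exec -it sql1 /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P '<YourPassword>' -Q 'RESTORE DATABASE MyDb FROM DISK = "/var/opt/mssql/backup/The_Name.bak" WITH REPLACE'
Depending on the backup, you may also need WITH MOVE clauses to relocate the data and log files inside the container; the Microsoft tutorial linked above covers that syntax.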

Informatica Cloud - Picking up files from SFTP and inserting records in Salesforce

Our objective is as follows:
a) Pick up a file "Test.csv" from a Secure FTP location.
b) After picking up the file we need to insert the contents of the file into an object in Salesforce.
I created the following connection for the remote SFTP location (the one that will contain "Test.csv"), shown in the Step 1 screenshot.
Step 1: [screenshot of the SFTP connection definition]
I then started to build a Data Synchronization Task, shown in the Step 2 screenshot.
Step 2: [screenshot of the Data Synchronization Task wizard]
What we want is for Informatica Cloud to connect to the secure FTP location and extract the contents of a .csv at that location into our object in Salesforce.
But as you can see in Step 2, it does not allow me to choose a .csv file from that remote location.
Instead, the wizard prompts me to choose a file from a local directory (my machine, where the secure agent is running), and this is not what I want.
What should I do in this scenario? Can someone help?
You can write a UNIX script to transfer the file to your secure agent and then use Informatica to read the file. Although I have never tried using SFTP in Cloud, I have used Cloud, and I do know that all files are tied to the location of the secure agent (either a server or a local computer).
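As a rough sketch of such a script (assuming key-based, non-interactive SSH access to the SFTP host; the host, user, and secure agent staging directory below are placeholders), sftp in batch mode can pull the file down before the Informatica task runs:
#!/bin/sh
# Pull Test.csv from the remote SFTP host into the secure agent's local staging directory.
sftp -b - user@sftp.example.com <<'EOF'
get /remote/path/Test.csv /opt/infaagent/staging/Test.csv
EOF
You would then point the flat-file connection at that local staging directory and schedule the script to run just before the task.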
The local directory is used for template files. The idea is that you set up the task using a local template and then IC will connect to the FTP site when you actually run the task.
The Informatica video below shows how this works at around 1:10:
http://videos.informaticacloud.com/2FQjj/secure-ftp-and-salesforececom-using-informatica-cloud/
Can you elaborate on the secure agent OS, as in Windows or Linux?
For a Windows environment you will have to call the script using the WinSCP or Cygwin utility; I recommend the former.
For Linux, the basic commands in the script should work.
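Going back to the Windows case, a minimal sketch of driving WinSCP from a script (host, credentials, paths, and the host key are placeholders to replace with your own):
winscp.com /script=get_file.txt
where get_file.txt contains something like:
open sftp://myuser:mypassword@sftp.example.com/ -hostkey="ssh-rsa 2048 xx:xx:xx...xx"
get /remote/path/Test.csv C:\InfaAgent\staging\
exit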

Transferring files with SFTP

So I'm trying to transfer files to a remote computer over SSH. I've used the sftp command, used lls to confirm the file is present on the local computer, and then ran the put filename command. However, I get the same result each time:
stat filename: No such file or directory
I just don't know what's going wrong! Any help or troubleshooting tips would be appreciated.
If you're currently using Windows you can download WinSCP and use that to transfer files. It has a nice graphical interface that is easy to interact with.
Well, supposing that you are on a Linux/Unix environment, you could use scp. Typically, the syntax for an scp command would be like this:
$ scp foobar.txt your_username@remotehost.net:/some/remote/directory
The above command copies the file foobar.txt, which resides on the local computer, to a specific directory on the remote machine, using a username (you will be asked for a password).
The sftp command line client uses the ssh transport and will tunnel your connections using your key. So if you have ssh access, you should also have sftp access. This is a secure option for people who are more comfortable with ftp. Most GUI ftp clients should also support sftp.
I was also facing this issue when trying to upload files from the local machine to the remote server. My commands were fine, but the mistake I was making was this: I had logged into the remote server with ssh and then started sftp from there. That way, sftp treats the remote server as the local side (since I had logged into it first via ssh) when using a command like the one below:
put /c/path/to/file.txt
So the thing to do is to log in to the server directly via sftp from your local machine and put your local files there.
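In other words, start sftp from your local machine and check what the local side is pointing at before running put. A minimal session sketch (hostname and paths are placeholders):
$ sftp your_username@remotehost.net
sftp> lpwd                       # show the local working directory
sftp> lls                        # confirm the file is really there
sftp> lcd /c/path/to             # move the local side if the file is elsewhere
sftp> cd /some/remote/directory  # choose where it should land on the remote side
sftp> put file.txt
The "stat filename: No such file or directory" error usually just means the path given to put does not exist on the machine sftp was started from.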

Can I create a shared folder on remote machine?

I am trying to automate the Account Creation process in Active Directory and I want to create the user home directory on a server and then I want it to become a shared folder with some user permissions.
I can create a folder on that (remote) machine, but I cannot convert it into a shared folder. Is there a way I can do that? Note that I can create a shared folder locally and set permissions, but I cannot do that when the machine is different.
Which language are you using to script?
You can do exactly what you are doing locally on a remote computer by using PsExec from Sysinternals. You'll find at the end of this post how to do it with LDAP mixed in with WMI.
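As a rough sketch of that approach (the machine name, path, share name, and account below are placeholders, and PsExec needs admin rights on the remote machine), you can run net share remotely and then set NTFS permissions with icacls:
psexec \\FILESERVER01 cmd /c "net share jdoe$=D:\Homes\jdoe /GRANT:DOMAIN\jdoe,FULL"
psexec \\FILESERVER01 cmd /c "icacls D:\Homes\jdoe /grant DOMAIN\jdoe:(OI)(CI)F"
The first command creates the share with share-level permissions; the second grants the user full control on the folder itself, so the two sets of permissions line up.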

How to download another website's file into my website?

Hi, I have a domain.com website and I was wondering how I would download a file from a simple URL into my website space. Is it done using shell scripts, or what?
Use wget, or wget for Windows.
It depends on the tools you have available on the web server. If you have ssh access, and either wget or curl is installed on the server, then that would be the easiest way: ssh into the server, and then issue a command like this:
wget http://example.com/resource
If you don't have ssh access, you'll need to write a script in whatever scripting language you have available on the server, make sure it's properly protected, and then run it.
That completely depends on what kind of access you have to the server.
When you have shell access, for example you could do a simple wget <url> to download a file, and then you would just move it to the correct location.
Otherwise, if it is shared hosting and you only have FTP access, you can put a script on the server that you can execute, which then downloads the remote file.
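If the host lets you run anything at all (a cron job, a one-off command, or a small script you upload and trigger), a sketch of such a download script could be as simple as the following, with the URL and destination path as placeholders and the script protected or removed once it has run:
#!/bin/sh
# Download a remote file into the site's web space.
curl -L -o /home/youruser/public_html/downloaded-file.zip "http://example.com/resource"
Whether this works at all, and whether curl or wget is even available, depends entirely on the hosting plan, as the answers above note.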
