I am interested in creating a batch file (ran on my computer) that can copy files from a server location to a computer connected to my network.
Something like:
xcopy //SERVER/FILE CONNECTEDCOMPUTER
It would be fine if I had to run the batch from the server or something like that. I just want to be able to remotely send a file to a connected computer.
As long as you have access to the files (they are on a share you can read from), XCOPY should work fine. If you map the share to a local drive letter, you can use the normal XCOPY syntax, just as if you were copying locally.
Without a mapped drive, simply use something like this to copy from server to C:\ :
XCOPY \\SERVERNAME\SHARENAME\FILEORDIRECTORIES C:\
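If you prefer the mapped-drive variant, a minimal sketch looks like this (SERVERNAME, SHARENAME, the drive letter and the destination folder are all placeholders):

```batch
:: Map the share to a drive letter (add /user:DOMAIN\user if other credentials are needed)
net use Z: \\SERVERNAME\SHARENAME

:: /E copies subdirectories (including empty ones), /I treats the target as a directory, /Y suppresses overwrite prompts
xcopy Z:\* C:\destination\ /E /I /Y

:: Remove the mapping when done
net use Z: /delete
```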
I am not sure I understand your aversion to sharing if what you want to do is share files... but you could install an SSH server for Windows on the remote machine, such as Bitvise.
Then all you need to do is use
winscp file remote:
or
pscp file remote:
if you go the putty route.
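For pscp, a concrete invocation might look like the following (the user name, host and paths are hypothetical, and the remote machine is assumed to be running an SSH server such as Bitvise):

```batch
:: Copy a local file to the remote machine over SSH; you will be prompted for the password
pscp C:\data\report.txt user@remotehost:incoming/
```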
Or you could install the free Filezilla FTP Server on the remote machine and send your file via FTP.
Related
I have installed Jenkins at IP address 1.1.1.01, and a .bat file exists on a remote file server at IP address 1.1.1.02 (the address may differ by user, because I will pass the IP address as a parameter).
Can I deploy that .bat file through a Jenkins pipeline?
You first need to check, independently of Jenkins, whether you can access 1.1.1.02 (or any other remote server IP) from 1.1.1.01, assuming 1.1.1.01 is the server executing your job.
If, from 1.1.1.01, you can SSH to 1.1.1.02, or scp to 1.1.1.02, then you can copy a file (such as your .bat file) from 1.1.1.01 to 1.1.1.02, or vice versa.
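For instance, assuming an SSH server is running on 1.1.1.02 and `jenkins` is a valid account there (both assumptions, not something stated in the question), the copy step could be as simple as:

```batch
:: From 1.1.1.01: copy the bat file to 1.1.1.02 over SSH
scp C:\jobs\deploy.bat jenkins@1.1.1.02:scripts/
```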
I'm trying to run a batch file from a local Windows server that calls on computers in my domain to pull from the shared folder and run an exe. I'm not sure if my script is trying to do too much or too little.
So I run the below batch locally
X:\pstools\psexec.exe \\Computer -d -u DOMAIN\user -p password -i \\SERVER\test\testfile.bat
and testfile.bat:
@echo off
pushd \\SERVER\test\
call program.exe
popd
When I run the script, psexec runs and I get a confirmation that testfile.bat was started on the target computer. On the targeted computer nothing happens. If I navigate to the share on the targeted computer and run testfile.bat, I get "CMD.EXE was not started with the above path as the current directory. UNC paths are not supported. Defaulting to Windows directory." From there the computer runs the called .exe with no issues.
If I target this towards another server in my domain it executes perfectly, but not on domain computers. I thought maybe a GPO issue, but I can't find a solution.
Thanks for any knowledge or help provided!
Thanks for all the tips everyone! This is how I ended up getting it working for anyone who might have the same issue.
Script on Server:
x:\pstools\psexec.exe \\Computer -d -s -c batchfile.bat (use @computers.txt to target a text file with computers listed)
Similar to what I was trying before, but to ensure the command line runs as the SYSTEM account on the remote PC you have to specify "-s". PsExec's "-c" switch copies the batch file to the remote system so it runs locally. You can include a "-f" to overwrite it if you've already copied it previously. I just put the batch file in the pstools folder.
Batchfile.bat:
pushd \\networkdrive
call (.bat/.cmd/.exe/etc.)
popd
I disabled the firewall for testing, but I believe you have to open TCP-445/UDP-137 for PSEXEC. Otherwise, that's it. Super simple solution.
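If you'd rather not disable the firewall entirely, the two ports mentioned can be opened on the target with netsh (the rule names here are arbitrary):

```batch
:: Allow inbound SMB (used by PsExec) and NetBIOS name service
netsh advfirewall firewall add rule name="PsExec SMB" dir=in action=allow protocol=TCP localport=445
netsh advfirewall firewall add rule name="PsExec NetBIOS" dir=in action=allow protocol=UDP localport=137
```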
I am writing a script to install a program remotely on requested computers. The only problem is that the file we invoke to install the application is actually a shortcut that points to an .exe and an .ini file, so it installs with specific parameters. Is there a way I can run the shortcut from the batch file so that it points to both files and installs on the user's computer with the parameters already set?
Is this what you are looking for?
xcopy "source" "destination"
Example: xcopy E:\file C:\Users\user\destination
I am trying to copy a lot of files from my FTP server, ftp://ftp.prodega.ch, so:
I created a code.txt file with this text:
open ftp://ftp.prodega.ch
user
password
lcd /D "E:\f2\" //this is my local directory
cd Bilder1/ //this is ftp folder
mget *
pause
Then I execute this line in cmd: ftp -s:code.txt, but I get this error:
unknown host: ftp://ftp.prodega.ch
help me please
When connecting to a server via the built-in FTP client you have to skip ftp://!
So open ftp.prodega.ch will fix your issue.
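Putting it together, a corrected code.txt might look like this (user, password and paths are the placeholders from the question; `prompt` is added to switch off per-file confirmation, which would otherwise consume script lines during mget):

```
open ftp.prodega.ch
user
password
lcd "E:\f2"
cd Bilder1
prompt
mget *
```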
However, you might still face another problem. The standard Windows FTP client doesn't support passive mode, which most servers require. If you are not able to modify the server, you won't be able to download the files. In that case, consider using a PowerShell script instead, or a different FTP client with command-line support.
I am trying to call a .cmd file on a remote server (which works) and, from that .cmd file, call an external executable on the remote server to compress some files. This used to work in an older environment (the remote server was a 2003 machine), but we are migrating to a new 2012r2 server and now I am getting a path-not-found error. I know the paths are correct, because locally I can run all of these commands without any problems. Let me lay it out a bit more cleanly:
Calling server:
I use the following command line to call the script which lives on the remote server:
\\server1\path1a\path1b\myscript.cmd \\server1\path2a\path2b\
Remote server:
On here the contents of the "myscript.cmd" file is:
@echo off
e:\utils\gzip.exe -N -3 -a %1\p*
if %errorlevel% GTR 0 goto zipfail
echo ZIP WORKED!
exit
:zipfail
echo ZIP FAILED with error: %errorlevel%
exit
As you can see, I am passing in a parameter pointing to where the source files to be zipped live. The account on the calling server that I am using has Full Access (both file and share level) to the directory where the .cmd file lives, as well as to the local path e:\utils on the remote server where the gzip utility lives. I can run this from the remote server and it all works normally, but when I try to call it from a remote machine I get back an error of "the system cannot find the path specified". I've confirmed that the issue is not the "e:\utils\gzip.exe" path: if that is missing or incorrect, I get a different error stating that it cannot find the gzip utility. That means the issue is getting gzip to launch and have it access the remote path where the files are to be compressed.
(BTW, I have tried putting gzip to the same path where the .cmd file lives, same results.)
Any ideas? Is this some new security restriction on 2012 whereby a remote executing script is unable/not allowed to access remote executables?
Paul was right - it's been too long since I have worked with batch files, and I am starting to forget the basics... grrrr!
The path (and the called executable) was missing on the calling machine (i.e. the machine running the script): because the .cmd is invoked over a UNC path, it actually executes on the caller, so e:\utils\gzip.exe has to exist there too. Once I fixed that, it was working great!