I am trying to call a .cmd file on a remote server (which works), and from that .cmd file call an external executable on the remote server to compress some files. This used to work in our old environment (the remote server was a 2003 machine), but we are migrating to a new 2012 R2 server and now I am getting a path-not-found error. I know the paths are correct, because locally I can run all of these commands without any problems. Let me lay it out a bit more cleanly:
Calling server:
I use the following command line to call the script which lives on the remote server:
\\server1\path1a\path1b\myscript.cmd \\server1\path2a\path2b\
Remote server:
On here the contents of the "myscript.cmd" file is:
@echo off
e:\utils\gzip.exe -N -3 -a %1\p*
if %errorlevel% GTR 0 goto zipfail
echo ZIP WORKED!
exit
:zipfail
echo ZIP FAILED with error: %errorlevel%
exit
As you can see, I am passing in a parameter pointing to where the source files to be zipped live. The account on the calling server that I am using has Full Access (both file- and share-level) to the directory where the .cmd file lives, as well as to the local path e:\utils on the remote server where the gzip utility lives. I can run this from the remote server and it all works normally, but when I try to call it from a remote machine I get back an error of "the system cannot find the path specified". I've confirmed that the issue is not the "e:\utils\gzip.exe" path, in that if that is missing or incorrect I get a different error stating that it cannot find the gzip utility. That means the issue is getting gzip to launch and have it access the remote path where the files to be compressed live.
(BTW, I have tried putting gzip in the same path where the .cmd file lives; same results.)
Any ideas? Is this some new security restriction on 2012 whereby a remotely executed script is unable/not allowed to access remote executables?
Paul was right: it's been too long since I have worked with batch files, and I am starting to forget the basics... grrrr!
The path (and called executable) was missing on the calling machine (i.e., the machine actually running the script). Once I fixed that, it was working great!
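One way to make such a script location-independent is to reference the executable relative to the script itself via %~dp0, so whichever machine actually runs the .cmd (and when the script is invoked by UNC path, that is the calling machine) finds gzip without needing it on a local drive. A sketch, assuming gzip.exe has been copied next to myscript.cmd:

```batch
@echo off
rem %~dp0 expands to the directory this script lives in (here, the UNC share),
rem so gzip.exe is found no matter which machine runs the script.
"%~dp0gzip.exe" -N -3 -a %1\p*
if %errorlevel% GTR 0 goto zipfail
echo ZIP WORKED!
exit /b 0
:zipfail
echo ZIP FAILED with error: %errorlevel%
exit /b 1
```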
Related
I have gone through the following guide to set up an SSIS package to retrieve a text file located on an SFTP server:
https://www.mssqltips.com/sqlservertip/3435/using-sftp-with-sql-server-integration-services/
To summarize, the SSIS package executes PSFTP.exe (A PuTTY tool) which takes the necessary credentials to connect to the server. It also takes a batch file that it executes after connecting. This batch file contains the commands to retrieve the desired text file. To start from the guide, it simply contains a cmd command to change directory, and a get command to retrieve the file:
cmd DataDump
get TeleMarketingResults.txt
All of this works fine.
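For context, the Execute Process Task in that setup effectively runs a command line along these lines (the host, credentials, and paths below are placeholders, not values from the guide):

```batch
rem -pw supplies the password, -b points at the script of psftp commands.
psftp.exe -pw secret -b C:\scripts\sftp_commands.txt user@sftp.example.com
```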
The issue arises when I try to make this batch file logic more complex as it does not seem to recognize basic keywords. For instance, I would like to modify it to retrieve the most recent file, so I tried adding this:
for /f %%i in ('dir /b/a-d/od/t:c') do set LAST=%%i
echo The most recently created file is %LAST%
but then I get these errors:
psftp: unknown command "for"
psftp: unknown command "echo"
If I execute the batch file manually in a local directory, it works. The issue only occurs when passing it as a parameter to PSFTP.exe. Why is this?
A psftp script file can contain psftp commands only. for, set, and echo are cmd.exe commands, not psftp commands (and psftp's own dir does not understand cmd switches like /b or /od).
There's hardly any reasonable way to retrieve the latest file using psftp alone. You would have to do it in two steps: first retrieve the listing and store it to a file, then parse that file with some smart batch-file commands to find the latest file, and then run psftp again to download that file. It is cumbersome and inefficient, as it requires two connections.
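As a rough sketch of that two-connection approach (host, credentials, and paths are placeholders, and the parsing step is deliberately naive: psftp's ls output has permission/size/date columns that a real script would have to tokenize and sort properly):

```batch
@echo off
rem Connection 1: store the remote listing in listing.txt.
echo cd DataDump> listcmds.txt
echo ls>> listcmds.txt
psftp.exe -pw secret -b listcmds.txt user@host > listing.txt

rem Naive parse: assumes a Unix-style long listing where the file name is
rem the 9th column, and keeps the last line's name.
for /f "tokens=9" %%i in (listing.txt) do set "LAST=%%i"

rem Connection 2: download the file we picked.
echo cd DataDump> getcmds.txt
echo get %LAST%>> getcmds.txt
psftp.exe -pw secret -b getcmds.txt user@host
```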
You would be better off using a more powerful SFTP client. For example, it's trivial with my WinSCP SFTP client. See
Question WinSCP select most recent file or
WinSCP article Downloading the most recent file.
We have a desktop application that dynamically generates a command file to pull specific files that have the current date in their names. So in the end we have a command file that looks like this:
lcd e:\localpath
mget Filename0111.dat
mget Filenametwo0111.dat
mget Filenamethree0111.dat
bye
Where 0111 is MMDD. The command file is created via a .bat file that the desktop app executes. The application then connects to the remote server via PSFTP.exe and runs that command file to pull the files.
The problem we're running into is that we updated PSFTP.exe to a newer version because of a separate issue. Now, if a file is not available on the remote server, it returns error code 2, which stops the rest of the files from being retrieved. So if the first file in the list doesn't exist, the run fails and the rest of the files are not downloaded.
Is there a way to ignore error code 2 so that the rest of the files get retrieved? I had thought at first to run PSFTP.exe and its commands through a batch file, but that didn't work.
Any ideas?
PSFTP.exe has a command-line option, -be, that makes it continue executing the batch file when a command fails. From the documentation:
When running a batch file, this option causes PSFTP to continue processing even if a command fails to complete successfully.
You might want this to happen if you wanted to delete a file and didn't care if it was already not present, for example.
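In other words, add -be to the PSFTP command line your application builds, for example (host, credentials, and path are placeholders):

```batch
rem -be keeps processing the command file even when an mget fails with error 2.
psftp.exe -be -pw secret -b E:\scripts\commands.txt user@sftp.example.com
```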
I'm executing a batch file on a remote server from our build server (Jenkins) through SSH. The batch file uses fart.exe commands to find and replace text. I have placed the fart.exe in C:\Windows\System32 and I'm invoking it as C:\windows\system32\fart.exe in the batch file.
The command works perfectly fine on the remote server, but when invoking it through SSH I get an error in the Jenkins log:
'"C:\windows\system32\fart.exe"' is not recognized as internal or external command.
This is the only error I'm getting and the other commands successfully execute in the batch script. Both of the servers are Windows Server 2012 R2.
I tried adding the path to system variable but it didn't work.
This is how the Fart.exe is used in the batch script.
for /R "%BACKUP_SOURCE%" %%G in (%ConfigFile%) do (
    "C:\windows\system32\fart.exe" "%%G" %PlaceHolder% %AppPath%
)
Invoking From Jenkins
I don't think the way I invoke the batch script matters, because it is done through the Jenkins SSH plugin. The batch script gets invoked successfully; the error occurs when executing fart.exe.
I tried invoking a different command from an exe located in the same path, and that one was invoked successfully, so I guess the issue is isolated to fart.exe.
I am interested in creating a batch file (run on my computer) that can copy files from a server location to a computer connected to my network.
Something like:
xcopy //SERVER/FILE CONNECTEDCOMPUTER
It would be fine if I had to run the batch from the server or something like that. I just want to be able to remotely send a file to a connected computer.
As long as you have access to the files (the files are on a share you can read from), XCOPY should work fine. If you map the share to a local drive letter, you can use the normal XCOPY syntax as if you were copying locally.
Without a mapped drive, simply use something like this to copy from server to C:\ :
XCOPY \\SERVERNAME\SHARENAME\FILEORDIRECTORIES C:\
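The same pattern reaches a second machine directly, as long as the target also exposes a share you can write to. A sketch with placeholder names (C$ is the built-in administrative share and requires admin rights on the target; a regular share works the same way):

```batch
rem /Y suppresses overwrite prompts; /D copies only files newer than the target.
XCOPY \\SERVERNAME\SHARENAME\FILEORDIRECTORIES \\CONNECTEDPC\C$\Dest\ /Y /D
```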
I am not sure I understand your aversion to sharing if what you want to do is share files... but you could install an SSH server for Windows on the remote machine, such as Bitvise.
Then all you need to do is use
winscp file remote:
or
pscp file remote:
if you go the putty route.
Or you could install the free Filezilla FTP Server on the remote machine and send your file via FTP.
There is an ftp command in my batch script:
FTP -n -s:D:\scripts\Test\get.ftp
Where get.ftp contains all ftp commands including "mget abc*".
The issue is that when no file with a name matching abc* is available, mget does not fail. Also, if any other ftp command fails, the script does not exit with error status 1; i.e., "FTP -n -s:D:\scripts\Test\get.ftp" exits without reporting any issue.
I am not able to make the batch script fail when there is no file to pick up.
Any suggestions from someone who has faced a similar issue would be appreciated.
-Krishna
The mget command works by obtaining a remote folder listing and parsing the list for the wildcard pattern you provide. As long as the listing can be obtained successfully, it is not considered an error if your pattern did not match any of the files on the list.
Your batch script can be set up to compare the local folder listing before and after invoking the ftp command, to check whether a file was actually downloaded. You could also use a scripted FTP solution, such as Kermit or a dedicated FTP scripting tool, to get more control over error reporting.
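A minimal sketch of that before/after comparison (the local path and file pattern are placeholders):

```batch
@echo off
rem Count matching local files before the transfer.
for /f %%c in ('dir /b e:\localpath\abc* 2^>nul ^| find /c /v ""') do set BEFORE=%%c

FTP -n -s:D:\scripts\Test\get.ftp

rem Count again afterwards; if nothing new arrived, fail the script.
for /f %%c in ('dir /b e:\localpath\abc* 2^>nul ^| find /c /v ""') do set AFTER=%%c
if %AFTER% LEQ %BEFORE% (
    echo No files were downloaded.
    exit /b 1
)
```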