I'm working on a project that requires sending multiple files to the same node. The files are all available for sending at the same time, and I created a simple Connect:Direct (C:D) shell script to send them. I looped the call to this script to send all the files (about 20) at the same time.
In my script I intend to delete each file within the loop, after the C:D script has been called. However, a colleague at work told me that the files may not be sent on the spot but may instead be put in a queue for transmission at a later stage if the C:D node is busy, and so deleting the files would cause errors.
Can someone advise whether this is the case? Are the files not physically copied even if they are put in a queue?
I find it strange that the C:D script completes with a successful return code and gives me a process number, yet I still cannot safely delete the file.
Thanks,
Sergio
You can use an if statement for each copy step in your C:D process: if the exit code for that step is 0, the file is deleted and C:D moves on to the next step to copy the next file.
if (step01 = 0) then
step02 run task             /* follow-up task on the sending node, e.g. the delete */
       sysopts="rm filename"
eif
Using Shake to create an mp3 (this is just a learning example), I use lame and then id3v2 to tag it.
If the lame succeeds, but the id3v2 fails, then I'm left with the mp3 file in place; but of course it is "wrong". I was looking for an option to automatically delete target files if a producing command errors, but I can't find anything. I can do this manually by checking the exit code and using removeFiles, or by building in a temporary directory and moving as the last step; but this seems like a common-enough requirement (make does this by default), so I wonder if there's a function or simple technique that I'm just not seeing.
The reason Make does this by default is that if Make has a partial incomplete file on disk, it considers the task to have run successfully and be up to date, which breaks everything. In contrast, Shake records that a task ran successfully in a separate file (.shake.database), so it knows that your mp3 file isn't complete, and will rebuild it next time.
While Shake doesn't need you to delete the file, you might still want to do so, to avoid confusing users. You can do that with actionOnException, something like:
let generateMp3 = do cmd "lame" ... ; cmd "id3v2" ...
let deleteMp3 = removeFile "foo.mp3"
actionOnException generateMp3 deleteMp3
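For a fuller picture, here is a hypothetical, self-contained rule along the same lines (the file names, tag values and lame/id3v2 arguments are made up; actionOnException and removeFiles are the relevant parts of the Shake API):
import Development.Shake

main :: IO ()
main = shakeArgs shakeOptions $ do
    want ["foo.mp3"]
    -- Build foo.mp3 from input.wav; if either command fails,
    -- delete whatever partial foo.mp3 was left behind.
    "foo.mp3" %> \out -> do
        need ["input.wav"]
        let generateMp3 = do
                cmd_ "lame" ["input.wav", out]
                cmd_ "id3v2" ["--artist", "Example", out]
        actionOnException generateMp3 (removeFiles "." [out])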
I'm writing an SSIS package that runs on a schedule to poll a folder location.
The files will be picked up by a Foreach Loop container; however, these files are quite large and therefore take time to be copied into the directory.
I'd like to know how SSIS behaves when a Foreach file loop is run on a directory containing a file that is still being copied. Will it skip over the file because it is not complete? Is there a danger that SSIS will attempt to load a partly copied file?
No, there is no danger to the file, nor will it be copied partially (unless you are removing it after the copying is done; that would be a disaster ;)).
It has nothing to do with the loop being in place. The File System Task is the one you need to consider: it behaves as if you were copying the file manually, so ask yourself what would happen if you did that. What you do after the copy from the source to the destination folder also matters (for example, are you removing the source?).
In short, nothing would happen: no partial files will be copied.
I'm pretty much a noob at this, so any help is appreciated.
I'm trying to run the video transcoding executable REDline on all .R3D files in a given folder. REDline only accepts single files, which is the issue. I finally got it to search recursively for the files I need, but my problem is that the search passes the next result to REDline before the first one has finished transcoding. The search results are handed to REDline through a loop variable.
Here's the code:
for /r D:\folder\ %%a in (*) do (
"C:/Program Files/REDCINE-X PRO 64-bit/REDLine.exe" --exportPreset "Prores_Intermediate" --i "%%~dpnxa" --useRSX 2 --masterRMDFolder "" -s 0 -e 95
)
After about .7 seconds, REDline reports 'received stop message from client'.
I don't think this is a REDline error, as I have been able to transcode single files successfully.
Thanks.
Try start /wait when running the executable.
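For example, a minimal sketch of the loop from the question with start /wait added (the parameters are copied from the question, and the pattern is narrowed to *.R3D as described there; note that start treats the first quoted argument as a window title, hence the empty "" before the path):
for /r D:\folder\ %%a in (*.R3D) do (
    rem start /wait makes cmd block until REDLine.exe exits before the next iteration
    start "" /wait "C:\Program Files\REDCINE-X PRO 64-bit\REDLine.exe" --exportPreset "Prores_Intermediate" --i "%%~dpnxa" --useRSX 2 --masterRMDFolder "" -s 0 -e 95
)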
If that does not help, the executable might start another executable which does the actual job. In that case, identify the other executable using Process Monitor or Process Explorer and check its command-line parameters to see if you can run that executable directly.
If you can't run the other process yourself, you can wait until that process has exited. See Wait for executable to finish here on StackOverflow.
I'm not familiar with that particular executable, but here are some suggestions you may try:
- Check the allowed parameters for REDLine.exe (documentation or possibly /?) to find out whether any of them might address your problem
- There are tools that allow you to check for the existence of a process (Microsoft Sysinternals, check the ones starting with PS*.exe) http://technet.microsoft.com/de-de/sysinternals/bb545021.aspx
Use these to improve your loop so that it checks whether the process is still running before continuing with the next iteration, or loops without doing anything until the process has exited; for a fixed delay inside such a loop you can repeatedly ping localhost, for example (see the sketch after this list)
- Check the error codes returned by the programs to see if that helps you
- Put the code that starts your encoding inside its own batch file, and then check whether that process is currently running as well
- Create a dummy file before starting and delete it once finished, to tell whether an instance of your batch is still running
- Check the difference between calling a command directly from a batch file, and using the start command to run it at the same time
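As a rough sketch of the wait-until-the-process-has-exited idea (it assumes the process to watch is called REDLine.exe; the ping is just a cheap way to sleep for about two seconds):
:wait_for_redline
tasklist /FI "IMAGENAME eq REDLine.exe" | find /I "REDLine.exe" >nul
if not errorlevel 1 (
    rem REDLine is still running; wait roughly two seconds and check again
    ping -n 3 127.0.0.1 >nul
    goto wait_for_redline
)
rem REDLine has exited; safe to continue with the next file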
I have a problem when trying to upload multiple files to one WinSCP directory. I can manage to copy just one single file, but the problem is that I need to upload many files that are generated by a piece of software; the names are not fixed, so I need to use wildcards in order to copy all of them. I have tried many variants of the code, but it was all unsuccessful. The code I am using is:
open "sftp://myserver:MyPass#sfts.us.myserver.com" -hostkey="hostkey"
put "C:\from*.*" "/Myserverfolder/Subfolder/"
exit
This code does actually copy the first alphabetically named file, but it ignores the rest of the files.
Any help with it would be much appreciated
Try this in the script:
lcd C:\from
cd /Myserverfolder/Subfolder
put *
Try doing it all manually first so you can see what's going on.
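Putting that together with the session from the question, the whole script might look like this (untested sketch; the host, credentials and hostkey are just the placeholders from the question):
open "sftp://myserver:MyPass#sfts.us.myserver.com" -hostkey="hostkey"
lcd C:\from
cd /Myserverfolder/Subfolder
put *
exit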
I have a batch file, b1.bat, which internally starts two other batch files, b2.bat and b3.bat; b2.bat internally calls b4.bat, and the root batch file, b1.bat, waits until those three (b2, b3 and b4) finish. In summary, the scenario is like this:
b1.bat -> b2.bat -> b4.bat
       -> b3.bat
I want to write the output of all 4 batch files (b1.bat, b2.bat, b3.bat and b4.bat) into a single log file, my_log.txt. I want to do this with minimal effort, i.e., changing as few batch files as possible, because I have a lot of batch files like this without logging and I want to provide logging for them.
I) Is it possible to control the log file output from the parent batch file, i.e., b1.bat?
II) Do I need to change every batch file, adding a redirection operator that writes its output to the log file?
I couldn't find a proper solution for this. Please advise.
Assuming you are NOT doing any asynchronous processing using START, you should be able to simply use:
b1.bat >my_log.txt
You might also want to capture error messages by appending 2>&1 to the command.
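As a rough sketch (the echo lines are made up; the point is that a single redirection on the outer call also captures the output of the batch files it calls, as long as they are run synchronously with call rather than start):
rem b1.bat - unchanged, calls the other scripts synchronously
@echo off
echo starting b1
call b2.bat
call b3.bat

rem invoke it so that stdout and stderr of all four scripts go into one log:
b1.bat >my_log.txt 2>&1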