Copy subdirectories one at a time with a pause - batch-file

I have a folder tree with several subdirectories within subdirectories. I am looking for a batch script to move one bottom-level subdirectory, wait 10 minutes, move the next, wait 10 minutes, and so on. Alternatively I could move one file, wait a minute, then move the next and so on, as long as it replicates the directory structure.
I have been able to COPY files from a folder with a delay between each file using instructions found here: Copy Paste Files with delay
However I want to MOVE files and include subdirectories.
The reason behind this request is that I am moving a large amount of media (video specifically) and it needs to be checked into a database once it lands in the scanned location. Because the database is in use, dumping the entire contents of the top level folder will overload the system. At the moment I am manually moving one bottom level folder at a time and it is taking up way too much of my time.
Thanks!
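A minimal sketch of one way to do this, assuming robocopy and timeout are available (both ship with Windows Vista and later). The paths are placeholders, and a "bottom level" folder is taken to mean one with no subfolders of its own:
@echo off
setlocal enabledelayedexpansion
rem Placeholder paths; adjust to the real source and target trees.
set "src=C:\MediaSource"
set "dst=D:\MediaTarget"
rem The directory list is captured up front, so moving folders mid-loop is safe.
for /f "delims=" %%D in ('dir /ad /b /s "%src%"') do (
    rem Treat a folder as bottom level if it has no subfolders.
    dir /ad /b "%%D" 2>nul | findstr . >nul
    if errorlevel 1 (
        rem Rebuild the relative path so the target mirrors the source tree.
        set "rel=%%D"
        set "rel=!rel:%src%=!"
        rem robocopy /MOVE copies the files, then deletes the source folder.
        robocopy "%%D" "%dst%!rel!" /MOVE
        rem Wait 10 minutes (600 seconds) before moving the next folder.
        timeout /t 600 /nobreak >nul
    )
)
For the per-file variant, the same pattern applies with a file loop and timeout /t 60 for the one-minute pause.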

Related

Date in NLog file name and limit the number of log files

I'd like to achieve the following behaviour with NLog for rolling files:
1. prevent renaming or moving the file when starting a new file, and
2. limit the total number or size of old log files to avoid capacity issues over time
The first requirement can be achieved e.g. by adding a timestamp like ${shortdate} to the file name. Example:
logs\trace2017-10-27.log <-- today's log file to write
logs\trace2017-10-26.log
logs\trace2017-10-25.log
logs\trace2017-10-24.log <-- keep only the last 2 files, so delete this one
According to other posts, however, it is not possible to use a date in the file name together with archive parameters like maxArchiveFiles. If I use maxArchiveFiles, I have to keep the log file name constant:
logs\trace.log <-- today's log file to write
logs\archive\trace2017-10-26.log
logs\archive\trace2017-10-25.log
logs\archive\trace2017-10-24.log <-- keep only the last 2 files, so delete this one
But in this case, on the first write of every day, it moves yesterday's trace to the archive and starts a new file.
The reason I'd like to prevent moving the trace file is that we use a Splunk log monitor that watches the files in the log folder for updates, reads the new lines and feeds them to Splunk.
My concern is that if I have an event written at 23:59:59.567, the next event at 00:00:00.002 clears the previous content before the log monitor is able to read it in that fraction of a second.
To be honest I haven't tested this scenario, as it would be complicated to set up since my team doesn't own Splunk - so please correct me if this cannot happen.
Note also that I know it is possible to feed Splunk directly in other ways, e.g. via a network connection, but the current Splunk setup at our company reads from log files, so it would be easier that way.
Any idea how to solve this with NLog?
When using NLog 4.4 (or older) you have to go into Halloween mode and do some trickery.
This example makes hourly log files in the same folder, and ensures archive cleanup is performed after 840 hours (35 days):
fileName="${logDirectory}/Log.${date:format=yyyy-MM-dd-HH}.log"
archiveFileName="${logDirectory}/Log.{#}.log"
archiveDateFormat="yyyy-MM-dd-HH"
archiveNumbering="Date"
archiveEvery="Year"
maxArchiveFiles="840"
archiveFileName - Using {#} allows the archive cleanup to generate a proper file wildcard.
archiveDateFormat - Must match the ${date:format=} of the fileName (so remember to update both date formats if a change is needed).
archiveNumbering=Date - Configures the archive cleanup to parse the file names as dates.
archiveEvery=Year - Activates the archive cleanup, but also the archive file operation. Because the configured fileName already produces a new file every hour, we don't want any additional archive operations (e.g. this avoids generating extra empty files at midnight).
maxArchiveFiles - How many archive files to keep around.
With NLog 4.5 (still in beta) it will be a lot easier, as one just has to specify maxArchiveFiles. See also https://github.com/NLog/NLog/pull/1993

Word default folder is different when started programmatically

We have a 2 step process that collects all filenames from a folder into a Word document for use elsewhere.
The original process was to run an old DOS batchfile that collected the filenames into a DOS .txt. Then we launched a Word .docx with a macro that imported the .txt and massaged the formatting. After visual inspection we hit ‘Save’ and that was it.
I had the bright idea that a step could be taken out by launching Word directly from the batch, so I inserted the line: start winword filename. This works great except that the default location that Word wants to save in is now my Templates folder. Running it the old way still works perfectly.
The question is: why is the default location changed by launching Word programmatically and how can it be forced back to the correct location?
Thanks
you can use:
start "" /D "C:\path\to\folder" winword.exe
this starts winword.exe with C:\path\to\folder as its working directory, so Word offers that folder as the default save location (the empty quotes supply the window title, which start expects once the path is quoted).
I'll assume that winword.exe can be found from where the batch file runs (current directory or PATH).
for help with the start command: http://ss64.com/nt/start.htm
I investigated the start command, but never did figure out why it operated differently. The eventual solution was to include the Save action in the macro. I still don't know why we didn't have to do that before, but it works now so we're declaring success and moving on.

Connect Direct Multiple files, one Node.

I'm working on a project that requires sending multiple files to the same node. The files are available for sending at the same time, and I created a simple C:D shell script to send the files. I looped the call to this script to send all the files (about 20) at the same time.
In my script I intend to delete the files within the loop, after the C:D script is called. However, a colleague at work told me that the files may not be sent on the spot but rather put in a queue for transmission at a later stage if the C:D node is busy, and hence deleting the files would cause errors.
Can someone advise if this is the case? Are the files not physically copied even if put in a queue?
I find it weird that the C:D script would complete with a successful return code and give me the process number, and yet I still cannot delete the file.
Thanks,
Sergio
You can use an if statement for each file: only when the exit code of the copy step is 0 is the file deleted, and C:D then moves on to the step that copies the next file. Roughly like this (the exact run task syntax varies by platform, so check the Connect:Direct Process guide):
if (step01 = 0) then
   /* delete the source file only after the copy step succeeded */
   step02 run task sysopts="rm filename"
eif

batch file copy multiple .txt files with date restrictions from multiple directories to one directory

I am trying to copy all .txt files that I have scattered throughout several subdirectories of one main directory into another directory using a batch file. I have researched this site and found lots of answers at this link: batch file Copy files with certain extensions from multiple directories into one directory, like the code below from Jay:
set dSource=C:\Main directory\sub directory
set dTarget=D:\Documents
set fType=*.doc
for /f "delims=" %%f in ('dir /a-d /b /s "%dSource%\%fType%"') do (
copy /V "%%f" "%dTarget%\" 2>nul
)
My question is how to modify this code, or other code from that link, to copy only files with time stamps in a certain range; e.g. I only want to copy .txt files created from Jan 1, 2012 to Nov 1, 2012.
My suggestion for finding and moving *.txt files with a last modification date in a definite time period, whether in a directory tree, an entire drive, or even multiple drives, is:
Start Windows Explorer.
Click on button Search.
Open advanced search options for finding files and folders.
Select/enter to search for files by last modification date.
Enter the two dates to specify time period or select the time period.
Run the search.
Select all found files in search result, for example with Ctrl+A.
Press Ctrl+X to mark the found files for being cut (moved).
Open the folder into which the files should be moved.
Press Ctrl+V to paste the files (move them).
That's it.
Nobody needs to code a batch job for this task unless the find + move job has to be done periodically as a scheduled task.
The exact steps for such an advanced search for files in a definite time period with Windows Explorer depend on the version of Windows. See for example the computer tips
Find files by date modified in Windows for Windows 8/7/Vista, or
for Windows XP: How do I search for a file on my computer?
And of course there are many freeware and shareware tools which also support finding files by various search criteria, such as a last modification date within a specified time period, and moving them.
Well, that does not really answer the question, as it does not contain the batch code for doing the job. So I answer this question with another question:
Why think about coding a batch file for such a task, which is hard to adapt for varying dates, when dozens of GUI applications including Windows Explorer do the same with simple user input and no coding skills required, and when the find + move must be done only once or just from time to time with changed criteria?
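That said, if a scripted approach is wanted after all, here is a rough sketch using robocopy's built-in date filters. Two caveats: robocopy filters on the last-modified date, not the creation date, and with /S it mirrors the source folder structure under the target rather than flattening everything into one directory. The paths are the ones from the question:
@echo off
rem /MAXAGE:20120101 excludes files last modified before Jan 1, 2012;
rem /MINAGE:20121101 excludes files last modified after Nov 1, 2012.
set "dSource=C:\Main directory\sub directory"
set "dTarget=D:\Documents"
robocopy "%dSource%" "%dTarget%" *.txt /S /MAXAGE:20120101 /MINAGE:20121101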

Replay a file-based data stream

I have a live stream of data based on files in different formats. Data comes over the network and is written to files in certain subdirectories in a directory hierarchy. From there it is picked up and processed further. I would like to replay e.g. one day of this data stream for testing and simulation purposes. I could duplicate the data stream for one day to a second machine and „record“ it this way, by just letting the files pile up without processing or moving them.
I need something simple like a Perl script which takes a base directory, looks at all contained files in subdirectories and their creation time and then copies the files at the same time of the day to a different base directory.
Simple example: I have files a/file.1 2012-03-28 15:00, b/file.2 2012-03-28 09:00, c/file.3 2012-03-28 12:00. If I run the script/program on 2012-03-29 at 08:00 it should sleep until 09:00, copy b/file.2 to ../target_dir/b/file.2, then sleep until 12:00, copy c/file.3 to ../target_dir/c/file.3, then sleep until 15:00 and copy a/file.1 to ../target_dir/a/file.1.
Does a tool like this already exist? It seems I’m missing the right search keywords to find it.
The environment is Linux, command line preferred. For one day it would be thousands of files with a few GB in total. The timing does not have to be ultra-precise. Second resolution would be good, minute resolution would be sufficient.
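Such a tool may well exist, but the logic is small enough to sketch. Below is a rough outline in Python rather than the requested Perl (purely as an illustration); it keys on mtime, since Linux file systems generally don't record a creation time, and the two directory names are placeholders:
#!/usr/bin/env python3
# Rough sketch: replay a recorded day of files by copying each file into the
# target tree at the same time of day it originally arrived (second resolution).
import os
import shutil
import time
from datetime import datetime

SRC = "recorded_day"   # placeholder: base directory holding the recorded files
DST = "target_dir"     # placeholder: base directory the consumer watches

# Collect (time of day in seconds, relative path) for every recorded file.
entries = []
for root, _dirs, files in os.walk(SRC):
    for name in files:
        path = os.path.join(root, name)
        t = datetime.fromtimestamp(os.path.getmtime(path))
        entries.append((t.hour * 3600 + t.minute * 60 + t.second,
                        os.path.relpath(path, SRC)))

# Replay in time-of-day order, sleeping until each file's original time.
for seconds, rel in sorted(entries):
    now = datetime.now()
    now_seconds = now.hour * 3600 + now.minute * 60 + now.second
    if seconds > now_seconds:
        time.sleep(seconds - now_seconds)
    target = os.path.join(DST, rel)
    os.makedirs(os.path.dirname(target), exist_ok=True)   # mirror subdirectories
    shutil.copy2(os.path.join(SRC, rel), target)          # copy2 keeps timestamps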
