Hello,
First off, I'm new to programming.
My study project is to mount an image (E01) on a free drive letter and copy out some registry files and NTUSER.DAT files.
The image files are on a server: \foldername\number\number\name.E01
I have a working (small) script that mounts an image with only a single partition, so finding the registry hives wasn't that hard.
Now I want to mount an image with more than one partition, so it isn't clear whether the partition I need to search for the registry hives is on the first mounted drive letter.
If there are 3 partitions, then there will be 3 drive letters shown in Explorer.
So I have to determine which drive letters are already in use before mounting, and which drive letters were added after mounting, so I can search at those locations. I think I need to compare the two and then use the first free drive letter. That drive letter should be added to:
osfmount -a -t file -f \\servername\number\number\name.E01 -m #:
where the "#" is the driveletter. i don't know if i can alter the command so that it accept that # is a value that is pulled from another line. or something like that. don't know how to explain this.
it has to be in a batch file. so no python or any other scripting language
a co-worker pointed this to me:
pattern="foo"
for _dir in *"${pattern}"*; do
[ -d "${_dir}" ] && dir="${_dir}" && break
done
echo "${dir}"
yah, right....
Maybe someone can point me in the right direction in a way that I understand.
I will certainly type more code/lines than necessary, but hey, I'm a newbie.
Thanks.
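One possible direction (a minimal sketch only; the osfmount line is copied straight from the question and I haven't tested it against OSFMount): a batch file can find the first unused drive letter, store it in a variable, and then substitute that variable where the "#" goes.

@echo off
rem Find the first drive letter that is not currently in use.
rem ("if exist X:\" is a simple heuristic; empty card readers or disconnected
rem network drives may need a smarter check.)
set "freedrive="
for %%d in (D E F G H I J K L M N O P Q R S T U V W X Y Z) do (
    if not defined freedrive if not exist %%d:\ set "freedrive=%%d"
)
echo First free drive letter: %freedrive%:
rem Use the variable in place of the "#" from the question.
osfmount -a -t file -f \\servername\number\number\name.E01 -m %freedrive%:

For the multi-partition case, you can run the same kind of loop again after mounting, compare which letters have appeared, and then search each new letter for the registry hives.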
I came across a problem with Subversion at work. I want to create a post-commit hook (post-commit.bat file) that records information about the last transaction.
The code looks like this:
@echo off
set file="D:\mypath\logfile%2.txt"
svn log D:\'my path to repro'\ -r %2 -v > %file%
The %2 corresponds to the last revision number. It creates the file with the correct number and tries to write to it, but then the commit hangs and the file remains open. The curious thing is that if I manually trigger the command with a valid revision number, the whole thing works. Only with the hook does it hang, and it also does not commit the files.
Can anyone help me or does anyone have any ideas about my problem?
I found a solution; maybe it will be helpful for someone.
I used the wrong command, "svn log". Instead you have to use "svnlook changed ..." on the server to get the latest information about the last commit.
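For reference, a minimal post-commit.bat along those lines might look like this (the output path is just the one from the question; Subversion passes the repository path as the first argument and the revision as the second to post-commit hooks):

@echo off
rem %1 = repository path, %2 = committed revision (both supplied by Subversion).
set REPOS=%1
set REV=%2
rem svnlook reads the repository directly on the server, so it avoids the
rem client-side "svn log" call that was hanging inside the hook.
svnlook info -r %REV% %REPOS% > "D:\mypath\logfile%REV%.txt"
svnlook changed -r %REV% %REPOS% >> "D:\mypath\logfile%REV%.txt"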
I need something that can copy a specified file anywhere and everywhere on my drive (or computer) where that file already exists; i.e. update a file. I tried to search this site, in case I'm not the first, and found this:
CMD command line: copy file to multiple locations at the same time
But that's not quite the same.
Example:
Say I have a file called CurrentList.txt, and I have copies of it all over my hard drive. But then I change it and I want all the copies to update. So I want to copy the newer one over all the others. It could 'copy if newer', but generally I know it's newer, so it could also just find every instance and copy over it.
I was originally going to use some kind of .bat file that would have to iterate over every folder seeking the file in question, but my batch file programming is limited/rusty. Then I looked to see if xcopy could do it, but I don't think so...
For how I will use it most, I generally know where those files are going to be, so it actually might be as good or better if I could specify it (using the example above): "copy CurrentList.txt, overwriting all other copies wherever found in the C:\Lists folder and all subfolders".
I would really like to be able to have it in a context menu, so I could (from File Explorer) right-click on a file or selected files and choose the option to distribute it.
Thanks in advance for any ideas.
Use the "replace" command...
replace CurrentList.txt C:\Lists /s
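If you only want to overwrite copies that are older than the source (the "copy if newer" behavior mentioned in the question), replace also has a /u switch:

replace CurrentList.txt C:\Lists /s /u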
I have a simple script to move files between directories. Basically, it is:
mv /dir/* /dir/proc/
saved into a shell script "mvproc.sh".
For some reason, when I run the script (sh mvproc.sh) the file indeed gets moved, but it does not retain the filename and instead gets just an empty filename. When I run the same command at the bash prompt, it works fine, however.
This script used to work fine on Debian, but we had a hard drive failure and I am now migrating everything over to an Ubuntu machine.
Any idea why this is happening? It seems so simple yet I cannot figure it out.
Many thanks.
Edit:
I think I found the solution. For some reason there were carriage returns (and maybe line breaks or something) in the script that I could not see while editing it in either Notepad++ or gedit. To solve this, when I open the script in gedit, I do a Save As and select Unix/Linux in the line-ending drop-down towards the bottom. This hopefully gets rid of the weird carriage returns even though I could not see them.
Hopefully this helps some poor soul like me in the future pulling their hair out over this!
Thanks!
Try: mv /dir/file /dir/proc/file
You are indeed moving the file, but aren't specifying a destination name. Other usages of mv:
Move and rename: mv /dir/filename /dir/proc/newfilename
Rename: mv /dir/filename /dir/newfilename
I have a folder with a few files in it; I like to keep my folder clean of any stray files that can end up in it. Such stray files may include automatically generated backup files or log files, but it could be as simple as someone accidentally saving to the wrong folder (my folder).
Rather than having to pick through all this all the time, I would like to know if I can create a batch file that only keeps a number of specified files (by name and location) but deletes anything not on the "list".
[edit] Sorry, when I first saw the question I read bash instead of batch. I won't delete this not-so-useful answer since, as was pointed out in the comments, it could be done with Cygwin.
You can list the files, exclude the ones you want to keep with grep, and then pass them to rm.
If all the files are in one directory:
ls | grep -v -f ~/.list_of_files_to_exclude | xargs rm
or in a directory tree
find . | grep -v -f ~/.list_of_files_to_exclude | xargs rm
where ~/.list_of_files_to_exclude is a file with the list of patterns to exclude (one per line).
Before testing it, make a backup copy and substitute echo for rm to see if the output is really what you want.
White lists for file survival are an incredibly dangerous concept. I would strongly suggest rethinking that.
If you must do it, might I suggest that you actually implement it thus:
Move ALL files to a backup area (one created per run, such as a directory named after the current date and time).
Use your white list to copy back the files that you wanted to keep, such as with copy c:\backups\2011_04_07_11_52_04\*.cpp c:\original_dir.
That way, you keep all the non-white-listed files in case you screw up (and you will at some point, trust me), and you don't have to worry about negative logic in your batch file (remove all files that aren't of all these types), instead using the simpler option (copy back every file that is of each type).
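A rough batch sketch of that safer approach (the folder names and the keep-list below are placeholders; a real script would also want a per-run timestamped backup directory):

@echo off
rem Step 1: move ALL files from the watched folder into a backup area.
set "SRC=C:\myfolder"
set "BACKUP=C:\backups\run1"
mkdir "%BACKUP%"
move "%SRC%\*.*" "%BACKUP%\"
rem Step 2: copy back only the files on the white list.
for %%f in (keepme.txt report.xlsx notes.docx) do copy "%BACKUP%\%%f" "%SRC%\"

Everything not on the list stays in the backup directory, so a mistake in the list costs you nothing.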
I have a bunch of MP3 files split up into artist\album folders, and I want to move these all into a single directory and get rid of the directory structure itself, using a Windows batch file (hence the tags).
You can start from:
for /R %%x in (*.mp3) do move "%%x" "c:\dir"
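If you also want to remove the emptied artist\album folders afterwards (my reading of "get rid of the directory itself"), something like this, run from the root of the MP3 tree in the same batch file, should finish the job:

rem Delete the now-empty subdirectories, deepest first (rd only removes empty folders).
for /f "delims=" %%d in ('dir /ad /b /s ^| sort /r') do rd "%%d" 2>nul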
Use the Windows search function to search for *.MP3, wait for it to finish. Select all results and use cut. Paste into the target directory.
Then the subdirectories should be empty. You can select them all at once and delete them.
For a bit of an overkill of an effort, install any set of Unix utilities (e.g. Cygwin, many others) and do "mv */*/* final_dir" :)
Of course, you will be left with a highly useful and uber-cool set of Unix utilities for Windows.
Another overkill is to install ActivePerl and do it in Perl:
use File::Copy;
map { move($_, $final_dir) || die "Can not move $_: $!" } glob("basedir/*/*/*");
The number of files a directory can hold is file-system dependent; my experimental result was that you can have thousands of files at the same level (I didn't try > 10000, but > 1000 was OK).
EDIT: I see you want to do it with "win batch" (in one of your comments added later)... I'll leave my answer up as an alternative...
I've used JP Software's 4NT (a cmd.exe replacement) to do this.
cd <root of mp3 tree>
global /i move *.mp3 \newdir
Just beware that newdir absolutely must not be a child of <root of mp3 tree>.
global executes a command (the move command here) in every subdirectory of the starting directory. /i tells it to ignore return codes (a directory might contain zero MP3 files).
4NT is no longer sold, but "Take Command" should also work.
artist> move *.mp3 destinationDirectory will work, I believe.
This should be moved to Super User, first off. Second, I use MusicBrainz for my MP3 library.
Since the question has gotten more complex, let me elaborate on MusicBrainz.
You point it at a music folder, as deep as you want, and it grabs all songs found in that directory. It then offers to retag them based on its user-generated DB. It uses some crazy method of audio fingerprinting to identify any songs that either lack metadata or need the right metadata (say goodbye to Aretha Franklin doing "Son of Preacher Man" and the famous Rolling Stones cover of "Brown Eyed Girl").
After finishing up with any metadata correction, you hit Save, and it will:
a) replace/add the meta data tags
b) move your mp3 files into directories based on any pattern you specify
c) if you set this, it will delete any folders that it leaves empty upon file relocation
So, you could simply tell it NOT to retag and NOT to use any metadata for the folder destination, and it will do all that you want (and more if you want it to).
I have mine set to grab stuff from my "Giant Music Mess" folder and then put the files into folders based on artist, album, and disc, and finally give each MP3 file a "track# - title" rename. Something like Music Library/%Artist%/%Album%/%Vol%/%#% - %title