Search for .exe files matching a size condition

I want to search for all .exe files greater than 200 kB or smaller than 120 kB in the current folder and its subfolders. Then I want to move them to another folder called "folder", run the file called "executable.exe" in that folder in an infinite loop, and show some information about its memory consumption.
Any ideas?

Using a GNU-ish find (not sure what MinGW uses), something like this?
cd your_folder
find . -name '*.exe' \( -size +200k -o -size -120k \) -exec mv {} folder \;
cd folder
run_some_executable.exe
The find finds your files, exec's a move of each one to your folder.
Then it cd's to your folder and runs the executable.
You'd then have to run another tool to check memory consumption.
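A minimal sketch of that last step, assuming the binary is the executable.exe from the question and that a POSIX-style ps is available (MinGW's tools may differ), restarting the program forever and printing its resident memory once a second:
cd folder
while true; do
  ./executable.exe &
  pid=$!
  while kill -0 "$pid" 2>/dev/null; do
    ps -o rss= -p "$pid"   # resident set size in kB
    sleep 1
  done
done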

Related

How to move (mv) file by file starting from the last on macOS

I would like to move file by file from one folder to another starting with the last file and so on until there are no more files in the source folder.
I already managed to find and isolate the last file in the source folder with find and move it to the destination folder with mv:
find ~/Music/Music -not -path '*/\.*' -type f | gtac | head -1 | xargs -I '{}' mv {} ~/Music/iTunes/iTunes\ Media/Automatically\ Add\ to\ Music.localized
Is it also possible to add a rest time between the different mv calls?
Thank you in advance for your answers
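One rough sketch of that, assuming the same source and destination paths as in your command, is to drive the moves from a while-read loop with a one-second pause between them:
find ~/Music/Music -not -path '*/\.*' -type f | gtac | while IFS= read -r f; do
  mv "$f" ~/Music/iTunes/iTunes\ Media/Automatically\ Add\ to\ Music.localized/
  sleep 1
done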

Is there a way to move n files from n folders, where each folder contains 1 file with the same extension, i.e. (.pdf)?

I have 126 folders in the same directory, with names like this:
"5d843c63-2043-499b-abd6-6ea0bbde5f58"
Each folder contains 1 pdf file.
I want to move all of these files into one folder.
I used the following command and managed to locate all the pdfs in the separate folders:
locate ~/Desktop/wps\ /*pdf
Then I tried to use " | " to pipe them into mv and move them all at once, but that didn't work. I used the command
locate ~/Desktop/wps\ /*pdf | mv ~/Desktop/pdffff/
and the output was: mv: missing destination file operand after '/home/yousef/Desktop/pdffff/'
Using Ubuntu.
I don't know the locate command, but if it returns a list of paths of the files you want, do
mv `locate ~/Desktop/wps\ /*pdf` /path_to/new_folder_to_store_pdfs
The backticks ` ` will execute the locate command first, then the output of locate will be substituted into the outer mv command.
I did it just now with find and it worked.
mv `find . -name '*pdf'` new_folder
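Note that the backtick substitution splits the output on whitespace, so both variants break if a folder or file name contains spaces (like the "wps " folder above). A safer sketch, assuming the pdfs sit under ~/Desktop/wps\ / and should end up in ~/Desktop/pdffff/, is to let find run mv itself:
find ~/Desktop/wps\ / -type f -name '*.pdf' -exec mv {} ~/Desktop/pdffff/ \;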

Getting specific files from server

Using Terminal and Shell/Bash commands, is there a way to retrieve specific files from a web directory? I.e.
Directory: www.site.com/samples/
copy all files ending in ".h" into a folder
The folder also contains text files and other associated files that are of no use.
Thanks :)
There are multiple ways of achieving this recursively:
1. Using find
1.1 Making the directories, using mkdir -p to create the recursive folder structure without errors
cd path;
mkdir backup
find www.site.com/samples/ -type d -exec mkdir -p {} backup/{} \;
1.2 Finding the specific files and copying them into the backup folder, with -p to preserve permissions
find www.site.com/samples/ -name \*.h -exec cp -p {} backup/{} \;
2. Using tar, which is actually suited to the reverse kind of job, i.e. excluding specific files; the part of the question about the useless text files matches this approach better.
You can add as many excludes as you like:
tar --exclude=*.txt --exclude=*.filetype2 --exclude=*.filetype3 -cvzf site-backup.tar.gz www.site.com
mv www.site.com www.site.com.1
tar -xvzf site-backup.tar.gz
You can use wget for that, but if there are no links to those files, i.e. they exist but are not referenced from any HTML page, then brute force is the only option.
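If the .h files are linked from the directory listing, a recursive wget with an accept pattern is one way to do it (a sketch only; the URL is the one from the question, -r recurses, -np stops it from ascending to the parent, -nd flattens the result into the current directory, -A keeps only matching names):
wget -r -np -nd -A '*.h' http://www.site.com/samples/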
cp -aiv /www.site.com/samples/*.h /somefolder/
http://linux.die.net/man/1/cp

Replace all files of a certain type in all directories and subdirectories?

I have played around with the find command and anything else I can think of but nothing will work.
I would like my bash script to be able to find all of a file type in a given directory and all of its subdirectories and replace the file with another.
For example, let's say
/home/test1/randomfolder/index.html
/home/test1/randomfolder/stuff.html
/home/different/stuff/index.html
/home/different/stuff/another.html
Each of those .html files needs to be found when the program is given /home/ as the directory to search, and then replaced by echoing the other file into it.
Is this possible in bash?
This should more or less get you going in the right direction:
for file in `find . -type f -name \*.html`; do echo "new content" > "$file"; done
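That for loop splits find's output on whitespace, so it breaks on filenames containing spaces, and it hard-codes the new content. A sketch that avoids both, assuming the replacement content lives in a hypothetical /path/to/replacement.html, hands the copying to find directly:
find /home/ -type f -name '*.html' -exec cp /path/to/replacement.html {} \;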

symlink-copying a directory hierarchy

What's the simplest way on Linux to "copy" a directory hierarchy so that a new hierarchy of directories are created while all "files" are just symlinks pointing back to the actual files on the source hierarchy?
cp -s does not work recursively.
I just did a quick test on a linux box and cp -sR /orig /dest does exactly what you described: creates a directory hierarchy with symlinks for non-directories back to the original.
cp -as /root/absolute/path/name dest_dir
will do what you want. Note that the source name must be an absolute path, it cannot be relative. Else, you'll get this error: "xyz-file: can make relative symbolic links only in current directory."
Also, be careful as to what you're copying: if dest_dir already exists, you'll have to do something like:
cp -as /root/absolute/path/name/* dest_dir/
cp -as /root/absolute/path/name/.* dest_dir/
Starting from above the original & new directories, I think this pair of find(1) commands will do what you need:
find original -type d -exec mkdir -p new/{} \;
find original -type f -exec ln -s "$PWD/{}" new/{} \;
The first instance sets up the directory structure by finding only directories in the original tree and recreating them in the new tree. The second creates the symlinks to the original files in the new tree; prefixing the target with $PWD makes the links absolute, so they don't dangle when resolved from inside the new tree.
There's also the "lndir" utility (from X) which does such a thing; I found it mentioned here: Debian Bug report #301030: can we move lndir to coreutils or debianutils? , and I'm now happily using it.
I googled around a little bit and found a command called lns, available from here.
If you feel like getting your hands dirty
Here is a trick that will automatically create the destination folder, subfolders and symlink all files recursively.
In the folder that contains the files and subfolders you want to symlink:
create a file shell.sh:
nano shell.sh
copy and paste this charmer:
#!/bin/bash
export DESTINATION=/your/destination/folder/
export TARGET=/your/target/folder/
find . -type d -print0 | xargs -0 bash -c 'for DIR in "$@";
do
echo "${DESTINATION}${DIR}"
mkdir -p "${DESTINATION}${DIR}"
done' -
find . -type f -print0 | xargs -0 bash -c 'for file in "$@";
do
ln -s "${TARGET}${file}" "${DESTINATION}${file}"
done' -
Save the file with Ctrl+O
Close the file with Ctrl+X
Make your script executable: chmod +x shell.sh
Run your script: ./shell.sh
Happy hacking!
I know the question was regarding shell, but since you can call perl from shell, I wrote a tool to do something very similar to this, and posted it on perlmonks a few years ago. In my case, I generally wanted directories to remain links until I decide otherwise. It'd be a fairly trivial change to do this automatically and recursively.
