find and rename multiple files in multiple folders

I'm looking for a way to rename multiple files across multiple folders. For example, I have a file called "jobsforms.html.bak" in several folders under:
/home/sites/juk/jobsforms.html.bak
/home/sites/juan/jobsforms.html.bak
/home/sites/pedro/jobsforms.html.bak
/home/sites/luois/jobsforms.html.bak
I want to rename all the files found to "jobsforms.html". How can I do that?
I was trying this approach:
find /home/sites -name "jobsforms.html.bak" -exec bash -c 'mv "$1" "${1%/*}/jobsforms.html"' -- {} \;
Can anyone help me with how to go about this?
Thank you,
David

You could pipe the output of find to awk, using the sub function to remove the .bak suffix from the filename:
find /home/sites -name "jobsforms.html.bak" | awk '{ori=$0; sub(/\.bak$/,"",$0); system("mv \"" ori "\" \"" $0 "\"")}'
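Note that piping filenames through awk breaks on paths containing spaces or quotes. A more robust sketch (same /home/sites layout as above) hands the paths straight from find to a small inline shell:
# rename every jobsforms.html.bak to jobsforms.html, safe for odd filenames
find /home/sites -name 'jobsforms.html.bak' -exec sh -c '
  for f do
    mv "$f" "${f%.bak}"   # strip the trailing .bak suffix
  done' sh {} +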

Related

Run an awk script on every file of a certain type in a directory

I have a directory with several hundred .log files in it, and I have a script to pull some info out of them and print it to an existing file. Running it on one file goes like
awk -f HLGcheck.sh 1-1-1.log >> outputs.txt
and this works fine. I've looked around for several hours online and I can't seem to find a decent way to have it run on all .log files in the directory. Any help from people smarter than me would be appreciated.
Some techniques:
If the awk script can only handle one file at a time, use a for loop (shown further below) or:
find . -name '*.log' -exec awk -f HLGcheck.sh '{}' \; >> outputs.txt
If the awk script can handle multiple files:
awk -f HLGcheck.sh *.log >> outputs.txt
find . -name '*.log' -exec awk -f HLGcheck.sh '{}' + >> outputs.txt
bash has a for loop for this purpose:
$ for f in *.log; do awk -f HLGcheck.sh "$f" >> outputs.txt; done
You can refer to the file being processed as "$f".
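If the logs also live in subdirectories, a minimal sketch using bash's globstar option (assuming bash 4 or later):
shopt -s globstar nullglob   # ** matches recursively; nullglob skips empty matches
for f in **/*.log; do
    awk -f HLGcheck.sh "$f" >> outputs.txt
done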

Getting specific files from server

Using Terminal and shell/Bash commands, is there a way to retrieve specific files from a web directory? I.e.
Directory: www.site.com/samples/
copy all files ending in ".h" into a folder
The folder contains text files and other associated files that are of no use.
Thanks :)
There are multiple ways of achieving this recursively:
1. using find
1.1 recreating the directory tree (mkdir -p creates nested folders without errors)
cd path;
mkdir backup
find www.site.com/samples/ -type d -exec mkdir -p backup/{} \;
1.2 finding the specific files and copying them to the backup folder (-p preserves permissions)
find www.site.com/samples/ -name \*.h -exec cp -p {} backup/{} \;
2. using tar (actually for the reverse kind of job, i.e. to exclude specific files; since part of the question concerns unwanted text files, this fits too)
You can add as many excludes as you like:
tar --exclude='*.txt' --exclude='*.filetype2' --exclude='*.filetype3' -cvzf site-backup.tar.gz www.site.com
mv www.site.com www.site.com.1
tar -xvzf site-backup.tar.gz
You can use wget for that, but only if the files are linked to; if they exist but are not referenced from any HTML page, then brute force is the only option.
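If directory listing is enabled on the server, a minimal wget sketch (the flags are standard GNU wget options: -r recurses, -np refuses to ascend above /samples/, -nd flattens the output, -A filters by pattern, -P sets the local target folder):
wget -r -np -nd -A '*.h' -P backup/ http://www.site.com/samples/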
cp -aiv /www.site.com/samples/*.h /somefolder/
http://linux.die.net/man/1/cp

Looping through sub folders not working Unix

I have a folder with multiple sub-folders and each sub-folder contains 10-15 files. I want to perform a certain operation only on the text files in these folders. The folders contain other types of files as well. For now, I am just trying to write a simple for loop to access every file.
for /r in *.txt; do "need to perform this on every file"; done
This gives me an error: -bash: `/r': not a valid identifier
Thanks for the help.
P.S. I am using Cygwin on Windows 7.
Your /r is the problem; that's not a valid identifier (as bash said), so you need to drop the /. Also, this won't recurse into subdirectories. If your operation is simple, you can directly use the -exec option of find. {} is a placeholder for the filename.
find . -name "*.txt" -exec ls -l {} \;
Otherwise, try something like
for r in $( find . -name "*.txt" ) ; do
echo "$r"
#more actions...
done
With bash:
shopt -s globstar
for file in **/*.txt; do ...
I would use "find" for your use case. Something like:
find . -name "*.txt" -exec doSomeThing {} \;

Script to delete specific files from all zip archives in a directory

I'd like to delete the file with a name, say file.xml, from all zip archives in a given directory. Any simple solution is welcome, whether it's a bash script, python code or whatever.
Thanks!
Something like this would do it recursively in a directory:
#!/bin/bash
find /my/dir -name "*.zip" -exec zip {} -d file.xml \;
Something along these lines:
find /starting/folder -name "*.zip" | xargs -I{} zip -d {} file.xml
Do make a backup of your zips beforehand...
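Building on that advice, a minimal sketch that backs up each archive before stripping file.xml (assuming GNU find and the Info-ZIP zip tool):
#!/bin/bash
# back up each archive, then delete file.xml from it
find /my/dir -name '*.zip' -print0 | while IFS= read -r -d '' z; do
    cp -p "$z" "$z.bak"    # keep a backup copy next to the original
    zip -d "$z" file.xml   # note: zip exits nonzero if file.xml is absent
done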

symlink-copying a directory hierarchy

What's the simplest way on Linux to "copy" a directory hierarchy so that a new hierarchy of directories are created while all "files" are just symlinks pointing back to the actual files on the source hierarchy?
cp -s does not work recursively.
I just did a quick test on a Linux box, and cp -sR /orig /dest does exactly what you described: it creates a directory hierarchy with symlinks for non-directories back to the original.
cp -as /root/absolute/path/name dest_dir
will do what you want. Note that the source name must be an absolute path; it cannot be relative, or you'll get this error: "xyz-file: can make relative symbolic links only in current directory."
Also, be careful as to what you're copying: if dest_dir already exists, you'll have to do something like:
cp -as /root/absolute/path/name/* dest_dir/
cp -as /root/absolute/path/name/.* dest_dir/
(Beware that in bash the .* glob also matches . and .., which would pull in the parent directory; consider shopt -s dotglob with a single * instead.)
Starting from above the original & new directories, I think this pair of find(1) commands will do what you need:
find original -type d -exec mkdir -p new/{} \;
find original -type f -exec ln -s {} new/{} \;
The first instance sets up the directory structure by finding only directories in the original tree and recreating them in the new tree. The second creates the symlinks to the original files in the new tree. Note that if "original" is a relative path, each link target is resolved relative to the link's own location inside "new" and will dangle; make the targets absolute so they resolve.
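For example, a sketch run from the parent of both trees (GNU find substitutes {} even inside a larger argument, which POSIX does not guarantee):
mkdir -p new
find original -type d -exec mkdir -p new/{} \;
find original -type f -exec ln -s "$PWD"/{} new/{} \;   # absolute link targets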
There's also the "lndir" utility (from X) which does such a thing; I found it mentioned here: Debian Bug report #301030: can we move lndir to coreutils or debianutils? , and I'm now happily using it.
I googled around a little bit and found a command called lns, available from here.
If you feel like getting your hands dirty
Here is a trick that will automatically create the destination folder, subfolders and symlink all files recursively.
In the folder that contains the files and subfolders you want to symlink:
create a file shell.sh:
nano shell.sh
copy and paste this charmer:
#!/bin/bash
export DESTINATION=/your/destination/folder/
export TARGET=/your/target/folder/
find . -type d -print0 | xargs -0 bash -c 'for DIR in "$@";
do
echo "${DESTINATION}${DIR}"
mkdir -p "${DESTINATION}${DIR}"
done' -
find . -type f -print0 | xargs -0 bash -c 'for file in "$@";
do
ln -s "${TARGET}${file}" "${DESTINATION}${file}"
done' -
save the file ctrl+O
close the file ctrl+X
Make your script executable: chmod +x shell.sh
Run your script ./shell.sh
Happy hacking!
I know the question was regarding shell, but since you can call Perl from shell, I wrote a tool to do something very similar to this and posted it on PerlMonks a few years ago. In my case, I generally wanted directories to remain links until I decide otherwise. It'd be a fairly trivial change to do this automatically and recursively.
