Looping through sub folders not working Unix - file

I have a folder with multiple sub-folders and each sub-folder contains 10-15 files. I want to perform a certain operation only on the text files in these folders. The folders contain other types of files as well. For now, I am just trying to write a simple for loop to access every file.
for /r in *.txt; do "need to perform this on every file"; done
This gives me the error: -bash: `/R': not a valid identifier
Thanks for the help.
P.S. I am using Cygwin on Windows 7.

Your /r is the problem; as bash said, it's not a valid identifier (you need to drop the /). Also, a plain glob like this won't recurse into subdirectories. If your operation is simple, you can directly use find's -exec option; {} is a placeholder for the filename.
find . -name "*.txt" -exec ls -l {} \;
Otherwise, try something like
for r in $( find . -name "*.txt" ) ; do
echo $r
#more actions...
done
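Note that looping over $( find ... ) splits file names that contain spaces. If that matters, a more robust sketch (assuming your find supports -print0, as GNU find does) is:
find . -name "*.txt" -print0 | while IFS= read -r -d '' r ; do
echo "$r"
#more actions...
done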

With bash:
shopt -s globstar
for file in **/*.txt; do ...
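A fuller sketch of that loop (echo is just a stand-in for whatever you need to do to each file; nullglob makes the loop a no-op when nothing matches):
shopt -s globstar nullglob
for file in **/*.txt; do
echo "processing $file"   # replace with the real operation
done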

I would use find for this use case.
Something like:
find . -name "*.txt" -exec doSomeThing {} \;

Related

Error when trying to recursively add file extension to all files

Referring to this post, recursively add file extension to all files, I am trying to recursively add extensions to many files within many separate subfolders. All of the files appearing at the end of my subfolders do not have any extension at all, and I would like to give them all a .html extension.
I have tried the following in my command prompt after using cd to change to the parent directory that I would like to use:
find /path -type f -not -name "*.*" -exec mv "{}" "{}".html \;
However, I receive the following error: "FIND: Invalid switch"
I am new to using the command prompt for this type of manipulation, so please excuse my ignorance. I am thinking that maybe I have to change the /path to the directory I want it to look through, but I tried that to no avail.
I have also tried the following command:
find . -type f -exec mv '{}' '{}'.html \;
and receive the following error: FIND: Parameter format not correct
I am running Windows 10.
It seems like -not isn't available in your version of find; use ! instead:
find /path -type f \! -name "*.*" -exec mv "{}" "{}".html \;
From the find manual:
-not expr
Same as ! expr, but not POSIX compliant.
-not is an archaic form of logical negation; the current form is ! (exclamation). It has to be followed by a boolean expression. In this case, you followed it with -name, which fouled the command line parsing. -name is another option, not a valid expression operator.
You need to build the negation within your regular expression: negate the period, not the entire name.
I see another strong indicator: what is FIND? The command you supposedly ran is find; UNIX is case-significant. At whatever command line you're using, type man find or find --help to get a list of options and semantics. I'm worried that the bash you have under Windows isn't full-featured.
Are you familiar with the Windows command rename? It has a syntax similar to the UNIX mv, although it will work on multiple files. For instance
rename [^.]* *.html
I think would work for a single directory.
Apologies to all who commented and left answers. I believe I was unclear that I was trying to use this specifically from the Windows cmd prompt. I used the following to add extensions to all files at the end of my subfolders:
FOR /R %f IN (*.) DO REN "%f" *.html

Run an awk script on every file of a certain type in a directory

I have a directory with several hundred .log files in it, and I have a script to pull some info out of them and print it to an existing file. Running it on one file goes like
awk -f HLGcheck.sh 1-1-1.log >> outputs.txt
and this works fine. I've looked around for several hours online and I can't seem to find a decent way to have it run on all .log files in the directory. Any help from people smarter than me would be appreciated.
Some techniques:
If the awk script can only handle one file at a time, use a for loop as shown or
find . -name '*.log' -exec awk -f HLGcheck.sh '{}' \; >> outputs.txt
If the awk script can handle multiple files:
awk -f HLGcheck.sh *.log >> outputs.txt
find . -name '*.log' -exec awk -f HLGcheck.sh '{}' \+ >> outputs.txt
bash has a for loop for this purpose:
$ for f in *.log; do your_processing_here; done
Inside the loop you can refer to the current file as $f.
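For example, with the command from your question, that loop would look like this (a sketch, assuming HLGcheck.sh processes one file at a time):
for f in *.log; do awk -f HLGcheck.sh "$f" >> outputs.txt; done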

find and rename multiple files on multiple folder

I am trying to find a way to rename multiple files across multiple folders. For example, I have a file called "jobsforms.html.bak" in several folders under:
/home/sites/juk/jobsforms.html.bak
/home/sites/juan/jobsforms.html.bak
/home/sites/pedro/jobsforms.html.bak
/home/sites/luois/jobsforms.html.bak
I want to rename all of the files found to "jobsforms.html". How can I do that?
I was trying this approach:
find /home/sites -name "jobsform.html.bak" -exec bash -c 'mv "$1" "${1%/*}"/jobsform.html' -- {} \;
Can anyone help me with how to go about this?
Thank you,
David
You could pipe the output of find to awk, using the sub function to remove a substring from the filename:
find /home/sites -name "jobsforms.html.bak" | awk '{ori=$0; sub(/\.bak$/,"",$0); system("mv \""ori"\" "$0)}'
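Your own find/bash attempt is close, too; just note that the -name pattern has to match the actual file name (jobsforms.html.bak, with an s). A sketch along those lines, stripping the .bak suffix with a parameter expansion:
find /home/sites -name "jobsforms.html.bak" -exec bash -c 'mv "$1" "${1%.bak}"' -- {} \;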

Append some text to the end of multiple files in Linux

How can I append the following code to the end of numerous php files in a directory and its sub directory:
</div>
<div id="preloader" style="display:none;position: absolute;top: 90px;margin-left: 265px;">
<img src="ajax-loader.gif"/>
</div>
I have tried with:
echo "my text" >> *.php
But the terminal displays the error:
bash : *.php: ambiguous redirect
I usually use tee because I think it looks a little cleaner and it generally fits on one line.
echo "my text" | tee -a *.php
You don't specify the shell; you could try the foreach command. Under tcsh (and I'm sure a very similar construct is available for bash) you can interactively say something like:
foreach i (*.php)
foreach> echo "my text" >> $i
foreach> end
$i will take on the name of each file each time through the loop.
As always, when doing operations on a large number of files, it's probably a good idea to test them in a small directory with sample files to make sure it works as expected.
Oops, bash is in the error message (I'll tag your question with it). The equivalent bash loop would be:
for i in *.php
do
echo "my text" >> $i
done
If you want to cover the directories one level below the one where you are, you can specify
*/*.php
rather than *.php
BashFAQ/056 does a decent job of explaining why what you tried doesn't work. Have a look.
Since you're using bash (according to your error), the for command is your friend.
for filename in *.php; do
echo "text" >> "$filename"
done
If you'd like to pull "text" from a file, you could instead do this:
for filename in *.php; do
cat /path/to/sourcefile >> "$filename"
done
Now ... you might have files in subdirectories. If so, you could use the find command to find and process them:
find . -name "*.php" -type f -exec sh -c "cat /path/to/sourcefile >> {}" \;
The find command identifies which files to process using conditions like -name and -type, then the -exec option runs basically the same thing I showed you in the previous for loop. The final \; tells find that this is the end of the arguments to the -exec option.
You can man find for lots more details about this.
The find command is portable and is generally recommended for this kind of activity especially if you want your solution to be portable to other systems. But since you're currently using bash, you may also be able to handle subdirectories using bash's globstar option:
shopt -s globstar
for filename in **/*.php; do
cat /path/to/sourcefile >> "$filename"
done
You can man bash and search for "globstar" for more details about this. This option requires bash version 4 or higher.
NOTE: You may have other problems with what you're doing. PHP scripts don't have to end with ?>, so if a file ends while still inside a PHP block, the appended HTML will be interpreted as PHP code.
You can use sed combined with find. Assume your project tree is
/MyProject/
/MyProject/Page1/file.php
/MyProject/Page2/file.php
etc.
Save the code you want to append in /MyProject/ and call it append.txt.
From /MyProject/ run:
find . -name "*.php" -print | xargs sed -i '$r append.txt'
Explanation:
find does what it says: it finds all the .php files, including those in subdirectories.
xargs passes the .php files that have just been found to sed (i.e. runs sed on them).
sed does the appending. '$r append.txt' means go to the last line of the file ($) and read in (insert) whatever is in append.txt there. Don't forget -i, otherwise sed will just print the appended file instead of saving it.
Source: http://www.grymoire.com/unix/Sed.html#uh-37
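If any of the paths contain spaces, a null-delimited variant of the same pipeline (assuming GNU find and xargs) avoids the word-splitting problem:
find . -name "*.php" -print0 | xargs -0 sed -i '$r append.txt'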
You can do the following (it works even if there are spaces in your file paths):
#!/bin/bash
# Create a temporary file named /tmp/end_of_my_php.txt
cat << EOF > /tmp/end_of_my_php.txt
</div>
<div id="preloader" style="display:none;position: absolute;top: 90px;margin-left: 265px;">
<img src="ajax-loader.gif"/>
</div>
EOF
find . -type f -name "*.php" | while IFS= read -r the_file
do
echo "Processing $the_file"
#cp "$the_file" "${the_file}.bak" # Uncomment if you want to save a backup of your file
cat /tmp/end_of_my_php.txt >> "$the_file"
done
echo
echo done
PS: You must run the script from the directory you want to browse
Inspired by @Dantastic's answer:
echo "my text" | tee -a file1.txt | tee -a file2.txt

'ctags' command is not creating tags for C header files

I am trying to create a tags file manually for C sources (*.c and *.h) using the ctags command. Unfortunately the tags file does not have entries for all of the files, especially the header files.
I am using the following command at the command prompt:
find . -name \*.[ch] -exec ctags {} \;
Kindly point out if I am missing some flag or something else above.
If you execute (your version):
find . -name \*.[ch] -exec ctags {} \;
then find executes ctags once for each file that is found. The tags file is overwritten each time, and only the tags for the last file remain.
Instead, you need to tell find to execute ctags exactly once, and specify all the matching files in one call. This is how you do that:
find . -name \*.[ch] -exec ctags {} +
OR (I like trojanfoe's version from the comment below because it is easier to read):
ctags $(find . -name \*.[ch])
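As a side note, if your ctags build supports it (Exuberant and Universal Ctags do), you can skip find entirely and let ctags recurse itself:
ctags -R .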
