Error when trying to recursively add file extension to all files

Referring to this post, recursively add file extension to all files, I am trying to recursively add extensions to many files within many separate subfolders. None of the files at the bottom of my subfolders have any extension at all, and I would like to give each of them a .html extension.
I have tried the following in my command prompt after using cd to change to the parent directory that I would like to use:
find /path -type f -not -name "*.*" -exec mv "{}" "{}".html \;
However, I receive the following error: "FIND: Invalid switch"
I am new to using the command prompt for this type of manipulation, so please excuse my ignorance. I am thinking that maybe I have to change /path to the directory I want it to search through, but I tried that to no avail.
I have also tried the following command:
find . -type f -exec mv '{}' '{}'.html \;
and receive the following error: FIND: Parameter format not correct
I am running Windows 10.

It seems like -not isn't available in your version of find; use ! instead:
find /path -type f \! -name "*.*" -exec mv "{}" "{}".html \;
From find manual:
-not expr
Same as ! expr, but not POSIX compliant.

-not is a GNU extension for logical negation; the portable form is ! (exclamation), usually escaped from the shell as \!. It has to be followed by an expression to negate, and here it is followed by -name "*.*", which is a perfectly valid test, so the syntax itself is not what broke.
The stronger clue is the error message: what is FIND? The command you supposedly ran is find, and UNIX is case-significant. An uppercase FIND complaining about an "Invalid switch" sounds like Windows' own FIND.EXE, not a UNIX find at all. At whatever command line you're using, type man find or find --help to get a list of options and semantics. I'm worried that whatever environment you have under Windows isn't running a full-featured find.
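A quick way to check this from the cmd prompt (where is a standard Windows command that reports which executable a name resolves to):
where find
If it answers C:\Windows\System32\find.exe, you are getting the Windows FIND, and you would need to install or explicitly invoke a UNIX-style find (e.g. from Git Bash or Cygwin).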
Are you familiar with the Windows command rename? It has a syntax similar to the UNIX mv, although it will work on multiple files. For instance
rename *. *.html
I think would work for a single directory.

Apologies to all who commented and left answers. I believe I was unclear that I was trying to do this specifically from the Windows cmd prompt. I used the following to add extensions to all files at the bottom of my subfolders:
FOR /R %f IN (*.) DO REN "%f" *.html
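For what it's worth, if you run this from a batch file rather than typing it at the prompt, the loop variable must be doubled:
FOR /R %%f IN (*.) DO REN "%%f" *.html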

Related

'ctags' command is not creating tags for C header files

I am trying to create a tags file manually for C sources (*.c and *.h) using the ctags command. Unfortunately the tags file does not contain entries for all files, especially the header files.
I am using following command on the command prompt:
find . -name \*.[ch] -exec ctags {} \;
Kindly point out if I am missing some flag or something else above.
If you execute (your version):
find . -name \*.[ch] -exec ctags {} \;
then find executes ctags once for each file that is found. The tags file is overwritten each time, and only the tags for the last file remain.
Instead, you need to tell find to execute ctags exactly once, and specify all the matching files in one call. This is how you do that:
find . -name \*.[ch] -exec ctags {} +
OR (I like trojanfoe's version from the comment below because it is easier to read):
ctags $(find . -name \*.[ch])
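Another sketch, assuming Exuberant/Universal ctags with its -L option (which reads the list of files from stdin), that also keeps ctags to a single invocation and avoids command-line length limits:
find . -name '*.[ch]' | ctags -L -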

Replace all files of a certain type in all directories and subdirectories?

I have played around with the find command and anything else I can think of but nothing will work.
I would like my bash script to be able to find all of a file type in a given directory and all of its subdirectories and replace the file with another.
For example, let's say
/home/test1/randomfolder/index.html
/home/test1/randomfolder/stuff.html
/home/different/stuff/index.html
/home/different/stuff/another.html
Each of those .html files needs to be found when the program is given /home/ as the directory to search, and then replaced by echoing the other file into it.
Is this possible in bash?
This should more or less get you going in the right direction:
for file in $(find . -type f -name '*.html'); do echo "new content" > "$file"; done
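If any of the filenames could contain spaces, a more defensive sketch (here /path/to/replacement.html is a hypothetical name for the file whose contents should overwrite each match):
find /home -type f -name '*.html' -print0 |
while IFS= read -r -d '' file; do
    cat /path/to/replacement.html > "$file"   # overwrite each match in place
done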

Looping through sub folders not working Unix

I have a folder with multiple sub-folders and each sub-folder contains 10-15 files. I want to perform a certain operation only on the text files in these folders. The folders contain other types of files as well. For now, I am just trying to write a simple for loop to access every file.
for /r in *.txt; do "need to perform this on every file"; done
This gives me an error: -bash: `/r': not a valid identifier
Thanks for the help.
P.S. I am using Cygwin on Win 7.
Your /r is the problem: as bash said, it's not a valid identifier, so you need to drop the /. Also, this won't recurse into subdirectories. If your operation is simple, you can use find's -exec option directly; {} is a placeholder for the filename.
find . -name "*.txt" -exec ls -l {} \;
Otherwise, try something like
for r in $(find . -name "*.txt"); do
    echo "$r"
    # more actions...
done
With bash:
shopt -s globstar
for file in **/*.txt; do ...
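Spelled out a little more (the echo is only a placeholder for your real operation):
shopt -s globstar
for file in **/*.txt; do
    echo "processing $file"   # replace with the actual per-file command
done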
I would use find for your use case, something like
find . -name "*.txt" -exec doSomeThing {} \;

symlink-copying a directory hierarchy

What's the simplest way on Linux to "copy" a directory hierarchy so that a new hierarchy of directories are created while all "files" are just symlinks pointing back to the actual files on the source hierarchy?
cp -s does not work recursively.
I just did a quick test on a linux box and cp -sR /orig /dest does exactly what you described: creates a directory hierarchy with symlinks for non-directories back to the original.
cp -as /root/absolute/path/name dest_dir
will do what you want. Note that the source name must be an absolute path; it cannot be relative, or you'll get this error: "xyz-file: can make relative symbolic links only in current directory."
Also, be careful as to what you're copying: if dest_dir already exists, you'll have to do something like:
cp -as /root/absolute/path/name/* dest_dir/
cp -as /root/absolute/path/name/.* dest_dir/
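Be aware that the .* glob also matches . and .., which can drag the parent directory into the copy. A sketch that sidesteps this, using the /. idiom to mean "the directory's contents, dotfiles included":
cp -as /root/absolute/path/name/. dest_dir/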
Starting from above the original & new directories, I think this pair of find(1) commands will do what you need:
find original -type d -exec mkdir new/{} \;
find original -type f -exec ln -s {} new/{} \;
The first instance sets up the directory structure by finding only directories in the original tree and recreating them in the new tree. The second creates the symlinks to the original files in the new tree.
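Since the link targets here are relative to wherever each link ends up, they can dangle; an absolute-path variant of the same pair (a sketch, run from the directory containing both trees, relying on GNU find expanding {} inside a larger argument) would be:
find original -type d -exec mkdir -p new/{} \;
find original -type f -exec ln -s "$PWD"/{} new/{} \;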
There's also the "lndir" utility (from X) which does such a thing; I found it mentioned here: Debian Bug report #301030: can we move lndir to coreutils or debianutils? , and I'm now happily using it.
I googled around a little bit and found a command called lns, available from here.
If you feel like getting your hands dirty
Here is a trick that will automatically create the destination folder, subfolders and symlink all files recursively.
In the folder that contains the files and subfolders you want to symlink:
create a file shell.sh:
nano shell.sh
copy and paste this charmer:
#!/bin/bash
export DESTINATION=/your/destination/folder/
export TARGET=/your/target/folder/
find . -type d -print0 | xargs -0 bash -c 'for DIR in "$@";
do
    echo "${DESTINATION}${DIR}"
    mkdir -p "${DESTINATION}${DIR}"
done' -
find . -type f -print0 | xargs -0 bash -c 'for file in "$@";
do
    ln -s "${TARGET}${file}" "${DESTINATION}${file}"
done' -
save the file: Ctrl+O
close the file: Ctrl+X
Make your script executable: chmod +x shell.sh
Run your script: ./shell.sh
Happy hacking!
I know the question was regarding shell, but since you can call perl from shell, I wrote a tool to do something very similar to this, and posted it on perlmonks a few years ago. In my case, I generally wanted directories to remain links until I decide otherwise. It'd be a fairly trivial change to do this automatically and recursively.

Text specification for a tree of files?

I'm looking for examples of specifying files in a tree structure, for example, for specifying the set of files to search in a grep tool. I'd like to be able to include and exclude files and directories by name matches. I'm sure there are examples out there, but I'm having a hard time finding them.
Here's an example of a possible syntax:
*.py *.html
*.txt *.js
-*.pyc
-.svn/
-*combo_*.js
(this would mean: include files with extensions .py, .html, .txt, and .js; exclude .pyc files, anything under a .svn directory, and any file matching *combo_*.js)
I know I've seen these sorts of specifications in other tools before. Is this ringing any bells for anyone?
There is no single standard format for this kind of thing, but if you want to copy something that is widely recognized, have a look at the rsync documentation. Look at the chapter on "INCLUDE/EXCLUDE PATTERN RULES."
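As a sketch of how the example spec above might translate into rsync filter rules (rsync applies rules first-match-wins, so the excludes precede the includes they override; src/ and dest/ are placeholders):
rsync -a \
  --exclude='.svn/' --exclude='*.pyc' --exclude='*combo_*.js' \
  --include='*/' \
  --include='*.py' --include='*.html' --include='*.txt' --include='*.js' \
  --exclude='*' \
  src/ dest/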
Apache Ant provides 'ant globs' or patterns, where:
**/foo/**/*.java
means "any file ending in '.java' in a directory which includes a directory named 'foo' in its path" -- including ./foo/X.java
In your example syntax, is it implicitly understood that there's an escaping character so that you can explicitly include a file that begins with a dash? (The same question goes for any other wildcard characters, but I suppose I'd expect to see more files with dashes in their names than asterisks.)
Various command shells use * (and possibly ? to match a single char), as in your example, but they generally only match against a string of characters that doesn't include a path component separator (i.e. '\' on Windows systems, '/' elsewhere). I've also seen such source control apps as Perforce use additional patterns that can match against path component separators. For instance, with Perforce the pattern "foo/...ext" (without quotes) will match all files under the foo/ directory structure that end with "ext", regardless of whether they are in foo/ itself or in one of its descendant directories. This seems to be a useful pattern.
If you're using bash, you can use the extglob extension to get some nice globbing functions. Enable it as follows:
shopt -s extglob
Then you can do things like the following:
# everything but .html, .gif or .jpg files
ls -d !(*.html|*.gif|*.jpg)
# list file9, file22 but not fileit
ls file+([0-9])
# begins with apl or un only
ls -d +(apl*|un*)
See also this page.
How about find in unixish environments?
Find can, of course, do more than build a list of files, but that is one of the common ways it is used. From the man page:
NAME
find -- walk a file hierarchy
SYNOPSIS
find [-H | -L | -P] [-EXdsx] [-f pathname] pathname ... expression
find [-H | -L | -P] [-EXdsx] -f pathname [pathname ...] expression
DESCRIPTION
The find utility recursively descends the directory tree for each
pathname listed, evaluating an expression (composed of the
``primaries'' and ``operands'' listed below) in terms of each file
in the tree.
To achieve your goal I would write something like (formatted for readability):
find . \( -name '*.py' -o -name '*.html' -o -name '*.txt' -o -name '*.js' \) \
     \! -name '*combo_*.js' \
     \! -path '*/.svn/*' \
     -print
Moreover, there is an idiomatic pattern using xargs which makes find suitable for sending the whole list so constructed to an arbitrary command, as in:
find /path -type f -print0 | xargs -0 rm
find(1) is a fine tool as described in the previous answer, but if it gets more complicated, you should consider either writing your own script in any of the usual suspects (Ruby, Perl, Python et al.) or trying one of the more powerful shells such as zsh, which has ** globbing and lets you specify patterns to exclude. The latter is probably more complicated, though.
You might want to check out ack, which allows you to specify file types to search in with options like --perl, etc.
It also ignores .svn directories by default, as well as core dumps, editor cruft, binary files, and so on.
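For example (a sketch using ack's documented options; adjust the type flag to your language):
ack --perl 'pattern'               # search only files ack classifies as Perl
ack --ignore-dir=build 'pattern'   # skip an extra directory beyond the defaults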
