I've got an array of filenames I'd like to be ignored by stow, for example
IGNORES=('post_install\.sh' 'dummy')
(actually, that list isn't fixed but read from a file and will not always have the same length, so hardcoding like below won't work).
I form command-line flags out of the array like so
IGNORES=("${IGNORES[#]/#/--ignore=\'}")
IGNORES=("${IGNORES[#]/%/\'}")
When I do
stow -v "${IGNORES[#]}" -t $home $pkg
however, the ignores are not respected by stow, but it doesn't complain about invalid arguments either. Directly writing
stow -v --ignore='post_install\.sh' --ignore='dummy' -t $home $pkg
does work though.
What is the difference between these two ways of passing the --ignore flags, and any ideas on how to fix the issue? To my understanding, "${IGNORES[@]}" should evaluate to one word per array element and have the intended effect (I tried removing the quotes and indexing the array with * too, but to no avail).
Thanks!
So while writing the post, I came across the solution: The single quotes I added here
IGNORES=("${IGNORES[#]/#/--ignore=\'}")
IGNORES=("${IGNORES[#]/%/\'}")
became part of the file names to ignore, so only a file literally named 'dummy' (quotes included) would be skipped; doing only
IGNORES=("${IGNORES[#]/#/--ignore=}")
has the desired effect. I still need to check how this copes with spaces in the array elements, but my guess is that it works just fine: the need to quote words containing spaces only arises when a complete command line is split into words, as in
stow -v --ignore='the file' -t $home $pkg
vs
stow -v --ignore=the file -t $home $pkg
which is not a problem here, since "${IGNORES[@]}" already delivers each element as exactly one word.
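For completeness, a minimal sketch of the whole flow with the patterns read from a file, as described above; the file name ignores.txt and the one-pattern-per-line layout are assumptions on my part:
#!/usr/bin/env bash
# Read one ignore pattern per line into an array (ignores.txt is a placeholder name).
mapfile -t IGNORES < ignores.txt

# Prefix every element with --ignore= ; no literal quotes needed, since the array
# keeps each flag as a single word even if a pattern contains spaces.
IGNORES=("${IGNORES[@]/#/--ignore=}")

stow -v "${IGNORES[@]}" -t "$home" "$pkg"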
Related
I have a bash script with the set -o nounset option (and I want that!).
Now, I want to construct a command invocation, but I don't know the number of arguments beforehand, so I want to use an array for that (example below). However, when ARRAY is an empty array, "${ARRAY[@]}" fails.
Question: how do I @-expand the array ("${ARRAY[@]}") so that the expansion does not fail when set -o nounset is on?
Example:
# Clone git repo. Use --reference if ${reference_local_repo} exist.
reference_local_repo=.....
test -d "${reference_local_repo}" \
&& reference=("--reference" "${reference_local_repo}") \
|| reference=()
git clone "${reference[#]}" http://address/of/the/repo
Of course, I could use the following instead:
# bad example
reference=''
test -d "${reference_local_repo}" && reference="--reference ${reference_local_repo}"
... but that wouldn't work if the path to the local repo contained whitespace.
As a workaround, instead of reference=() I use reference=("-c" "dummy.dummy=dummy"). That way I avoid an empty array, and Bash does not complain. Alternatively, I can (rename the array variable and) have "clone" as the first array element. So I got this working, but I'd like to learn The Proper Way.
For the record, I'm using GNU bash, version 4.3.42(1)-release (x86_64-pc-linux-gnu).
To answer your specific question: The very old and simple way to deal with this is:
${reference[@]+"${reference[@]}"}
If reference is unset, nothing is expanded.
If it is set, all its components are expanded.
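For illustration, a runnable sketch of the guard under nounset; printf is used here only to make the word boundaries visible, and the URL is the placeholder from the question. (As an aside, bash 4.4 and later no longer treat an empty array expansion as unset, so this guard matters mostly on 4.3 and earlier.)
#!/usr/bin/env bash
set -o nounset

reference=()                                      # empty: contributes no words at all
printf '<%s> ' git clone ${reference[@]+"${reference[@]}"} http://address/of/the/repo; echo

reference=(--reference "/path/with spaces/repo")  # set: one word per element, spaces preserved
printf '<%s> ' git clone ${reference[@]+"${reference[@]}"} http://address/of/the/repo; echo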
Read the historical roots for this use:
Once upon a time, 20 or so years ago, some broken minor variants of the Bourne shell substituted an empty string "" for "$@" if there were no arguments, which is the situation the analogous ${1+"$@"} idiom was invented to work around.
Of course, in this specific case:
test -d "${reference_local_repo}" && abool="" || unset abool
git clone ${abool+--reference "$reference_local_repo"} http://address/of/the/repo
When abool is set to the null string ("") (or some other value, if you choose to use one), it is set, and in the next line it expands to what is after the plus (yes, as exactly two parameters).
When abool is unset, it completely disappears in the next line expansion.
Maybe this is more verbose:
unset abool
if test -d "${reference_local_repo}"; then abool="ValidDir"; fi
git clone ${abool+--reference "$reference_local_repo"} http://address/of/the/repo
I don't understand why you're using an array here. You could just:
test -d "${reference_local_repo}" \
&& reference="${reference_local_repo}" \
|| reference=""
git clone ${reference:+--reference "$reference"} http://address/of/the/repo
Now there are no undefined variables, and no mucking about with arrays for what is actually a single value.
You may use an auxiliary variable (or just redefine the same variable) to check if an array has anything:
foo=${your_array[@]:-}
and then:
git clone ... "${foo}" ...
This is compatible with the nounset flag. The :- expansion at the end of the variable (${your_array[@]:-}) will yield an empty string if $your_array is undefined.
This question already has answers here:
How do you store a list of directories into an array in Bash (and then print them out)?
(4 answers)
Closed 7 years ago.
I need to save the contents of two directories in an array to compare them later. That's the solution I wrote:
DirContent()
{
#past '$1' directorys to 'directorys'
local DIRECTORYS=`ls -l --time-style="long-iso" $1 | egrep '^d' | awk '{print $8}'`
local CONTENT
local i
for DIR in $DIRECTORYS
do
i=+1
CONTENT[i]=${DIR}
done
echo $CONTENT
}
Then when I try to print this array I get empty output. Both directories are not empty. Please tell me what I am doing wrong here.
Thanks, Siery.
The core of this question is answered in the one I marked as a duplicate. Here are a few more pointers:
All uppercase variable names are discouraged as they are more likely to clash with environment variables.
You assign to DIRECTORYS (should probably be "directories") the output of a complicated command, which suffers from a few deficiencies:
Instead of backticks as in var=`command`, the syntax var=$(command) is preferred.
egrep is deprecated and grep -E is preferred.
The grep and awk commands could be combined to awk '/^d/ { print $8 }'.
There are better ways to get directories, for example find, but the output of find shouldn't be parsed either.
You shouldn't process the output of ls programmatically: filenames can contain spaces, newlines, other special characters...
DIRECTORYS is now just one long string, and you rely on word splitting to iterate over it. Again, spaces in filenames will trip you up.
DIR isn't declared local.
To increase i, you'd use (( ++i )).
CONTENT[i]=${DIR} is actually okay: the i is automatically expanded here and doesn't have to be prepended by a $. Normally you'd want to quote your variables like "$dir", but in this case we happen to know that it won't be split any further as it already is the result of word splitting.
Array indices start at zero and you're skipping zero. You should increase the counter after the assignment.
Instead of using a counter, you can just append to an array with content+=("$dir").
To print the contents of an array, you'd use echo "${CONTENT[@]}".
But really, what you should do instead of all this: a call like DirContent some_directory is equivalent to echo some_directory/*/, and if you want that in an array, you'd just use
arr=(some_directory/*/)
instead of the whole function – this even works for weird filenames. And is much, much shorter.
If you have hidden directories (names starts with .), you can use shopt -s dotglob to include them as well.
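Putting those pointers together, a sketch of how the function could look; dir_content and /some/directory are placeholder names of mine:
#!/usr/bin/env bash
shopt -s nullglob                       # an empty directory yields an empty array, not a literal '*/'

dir_content() {
    local dir=$1
    local -a content=("$dir"/*/)        # one element per subdirectory, spaces and newlines are fine
    printf '%s\n' "${content[@]}"       # print one entry per line
}

dir_content /some/directory
Each entry comes back with a trailing slash; strip it with "${content[@]%/}" if that matters for the later comparison.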
You can try
for ((i = 0; i < ${#CONTENT[*]}; i++))
do
    echo "${CONTENT[$i]}"
done
instead of echo $CONTENT
Also, these changes are required:
(( i += 1 ))
CONTENT[$i]=${DIR}
in your above code
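Putting both fixes together with the original variables, the corrected loops would look roughly like this (still subject to the ls-parsing caveats raised in the other answer):
i=0
for DIR in $DIRECTORYS
do
    CONTENT[$i]=${DIR}
    (( i += 1 ))
done

for ((i = 0; i < ${#CONTENT[*]}; i++))
do
    echo "${CONTENT[$i]}"
done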
Is it possible to send an array variable from the command line
(where argsGrep="$@" and the command-line input is something to the effect of -i Something) to a grep command
e.g.
result=$(grep $argsGrep ./file)
When $argsGrep contains only the term to be searched for, it works just fine, but the moment it contains more than the text and includes a grep option, I can't get it to work whatsoever.
Don't use the intermediate string. It will just break things.
Just expand "$#" at the point you need it.
If you must save the contents of "$#" for some reason then you must use another array.
argsarr=("$@")
result=$(grep "${argsarr[#]}" ./file)
Suppose I have some program called "combine" that takes input of "red", "green" and "blue"-type files to produce an output file (let's say "color.jpg")... BUT the number of each type is arbitrary. Let's also suppose that there's no way to determine what type the file is except through how the user classifies them. What do people usually do in this case?
For instance, on the command line, some of the approaches might be:
command red1,red2,red3 green1,green2 blue1 color.jpg
This comma-approach breaks down if commas can appear in the filenames. It's the approach I like the most though. Another idea would be
command "red1 red2 red3" "green1 green2" "blue1" color.jpg
but this approach also has trouble with spaces in names.
I could also require ASCII files containing lists giving the files of each type:
command redlist greenlist bluelist color.jpg
but this requires lugging around extra files.
Further ideas? Is there a standard LINUX way of doing this?
The standard way would be this:
command --red red1.jpg --red red2.jpg --blue blue1.jpg
With short options:
command -r red1.jpg -r red2.jpg -b blue1.jpg
With bash brace expansion:
command -r={red1,red2}.jpg -b blue1.jpg
(The shell expands this to -r=red1.jpg -r=red2.jpg before the program ever sees it, which is equivalent to the previous invocation as long as the program also accepts the -r=file spelling.)
Doing things this way avoids arbitrary limitations like "no commas in filenames" and also makes your program more interoperable with standard *nix utilities like xargs and so on.
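If the hypothetical combine program were itself a shell script, a sketch of parsing that option style with getopts might look like this (all names here are illustrative, not part of any real program):
#!/usr/bin/env bash
# Collect repeated -r/-g/-b options into arrays; the last argument is the output file.
reds=() greens=() blues=()
while getopts 'r:g:b:' opt; do
    case $opt in
        r) reds+=("$OPTARG") ;;
        g) greens+=("$OPTARG") ;;
        b) blues+=("$OPTARG") ;;
        *) echo "usage: combine [-r red]... [-g green]... [-b blue]... output" >&2; exit 1 ;;
    esac
done
shift $((OPTIND - 1))
output=$1

printf 'red:    %s\n' "${reds[@]}"
printf 'green:  %s\n' "${greens[@]}"
printf 'blue:   %s\n' "${blues[@]}"
printf 'output: %s\n' "$output"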
Another way is accepting:
command -r redfile1 redfile2 -b bluefile1 bluefile2 -g green1
so that:
command -r red* -b blue* -g green*
is possible.
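Parsing that style takes a little more work, since each flag introduces an arbitrary number of file arguments; here is a sketch of one way a script could do it (again, all names are illustrative):
#!/usr/bin/env bash
# Assign every non-option argument to the array selected by the most recent flag.
reds=() greens=() blues=()
current=

add_file() {
    case $current in
        r) reds+=("$1") ;;
        g) greens+=("$1") ;;
        b) blues+=("$1") ;;
        *) echo "file given before any option: $1" >&2; exit 1 ;;
    esac
}

for arg in "$@"; do
    case $arg in
        -r|-g|-b) current=${arg#-} ;;              # remember the most recent flag
        -*)       echo "unknown option: $arg" >&2; exit 1 ;;
        *)        add_file "$arg" ;;
    esac
done

printf 'red:  %s\n' "${reds[@]}"
printf 'blue: %s\n' "${blues[@]}"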
Edit: I think this has been answered successfully, but I can't check 'til later. I've reformatted it as suggested though.
The question: I have a series of files, each with a name of the form XXXXNAME, where XXXX is some number. I want to move them all to separate folders called XXXX and have them called NAME. I can do this manually, but I was hoping that by naming them XXXXNAME there'd be some way I could tell Terminal (I think that's the right name, but not really sure) to move them there. Something like
mv *NAME */NAME
but where it takes whatever * was in the first case and regurgitates it to the path.
This is on some form of Linux, with a bash shell.
In the real-life case, the files are 0000GNUmakefile, with sequential numbering. I'm having to make lots of similar-but-slightly-altered versions of a program to compile and run on a cluster as part of my research. It would probably have been quicker to write a program to edit all the files and put them in the right place in the first place, but I didn't.
This is probably extremely simple, and I should be able to find an answer myself, if I knew the right words. Thing is, I have no formal training in programming, so I don't know what to call things to search for them. So hopefully this will result in me getting an answer, and maybe knowing how to find out the answer for similar things myself next time. With the basic programming I've picked up, I'm sure I could write a program to do this for me, but I'm hoping there's a simple way to do it just using functionality already in Terminal. I probably shouldn't be allowed to play with these things.
Thanks for any help! I can actually program in C and Python a fair amount, but that's through trial and error largely, and I still don't know what I can do and can't do in Terminal.
SO many ways to achieve this.
I find that the old standbys sed and awk are often the most powerful.
ls | sed -rne 's:^([0-9]{4})(NAME)$:mv -iv & \1/\2:p'
If you're satisfied that the commands look right, pipe the command line through a shell:
ls | sed -rne 's:^([0-9]{4})(NAME)$:mv -iv & \1/\2:p' | sh
I put NAME in brackets and used \2 so that if it varies more than your example indicates, you can come up with a regular expression to handle your filenames better.
To do the same thing in gawk (GNU awk, the variant found in most GNU/Linux distros):
ls | gawk '/^[0-9]{4}NAME$/ {printf("mv -iv %s %s/%s\n", $1, substr($0,1,4), substr($0,5))}'
As with the first sample, this produces commands which, if they make sense to you, can be piped through a shell by appending | sh to the end of the line.
Note that with all these mv commands, I've added the -i and -v options. This is for your protection. Read the man page for mv (by typing man mv in your Linux terminal) to see if you should be comfortable leaving them out.
Also, I'm assuming with these lines that all your directories already exist. You didn't mention if they do. If they don't, here's a one-liner to create the directories.
ls | sed -rne 's:^([0-9]{4})(NAME)$:mkdir -p \1:p' | sort -u
As with the others, append | sh to run the commands.
I should mention that it is generally recommended to use constructs like for (in Tim's answer) or find instead of parsing the output of ls. That said, when your filename format is as simple as /[0-9]{4}word/, I find the quick sed one-liner to be the way to go.
Lastly, if by NAME you actually mean "any string of characters" rather than the literal string "NAME", then in all my examples above, replace NAME with .*.
The following script will do this for you. Copy the script into a file on the remote machine (we'll call it sortfiles.sh).
#!/bin/bash
# Get all files in current directory having names XXXXsomename, where X is an integer
files=$(find . -name '[0-9][0-9][0-9][0-9]*')
# Build a list of the XXXX patterns found in the list of files
dirs=
for name in ${files}; do
    dirs="${dirs} $(echo ${name} | cut -c 3-6)"
done
# Remove redundant entries from the list of XXXX patterns
dirs=$(echo ${dirs} | tr ' ' '\n' | sort -u)   # split into lines first, then deduplicate
# Create any XXXX directories that are not already present
for name in ${dirs}; do
    if [[ ! -d ${name} ]]; then
        mkdir ${name}
    fi
done
# Move each of the XXXXsomename files to the appropriate directory
for name in ${files}; do
    mv ${name} $(echo ${name} | cut -c 3-6)
done
# Return from script with normal status
exit 0
From the command line, do chmod +x sortfiles.sh
Execute the script with ./sortfiles.sh
Just open the Terminal application, cd into the directory that contains the files you want moved/renamed, and copy and paste these commands into the command line.
shopt -s extglob    # needed for the extended *(...) patterns used below
for file in [0-9][0-9][0-9][0-9]*; do
    dirName="${file%%*([^0-9])}"
    mkdir -p "$dirName"
    mv "$file" "$dirName/${file##*([0-9])}"
done
This assumes all the files that you want to rename and move are in the same directory. The file globbing also assumes that there are at least four digits at the start of the filename. If there are more than four digits, the file will still be caught, but not if there are fewer than four. If there are fewer than four, take off the appropriate number of [0-9]s from the for line.
It does not handle the case where "NAME" (i.e. the name of the new file you want) starts with a number.
See this site for more information about string manipulation in bash.