I want to read a list of patterns that may (probably do) contain wildcards from a file
The patterns might look like this:
/vobs/of_app/unix/*
/vobs/of_app/bin/*
etc
My first attempt was to do this:
old_IFS=$IFS
IFS=$'\n'
array=($(cat $file))
This worked fine when the patterns did not match anything in the filesystem, but when they did match, they got expanded: instead of containing the patterns, my array contained the directory listings of the specified directories. This was no good.
I next tried quoting like this
array=("$(cat $file)")
But this dumped the entire contents of the file into the 0th element of the array.
How can I prevent it from expanding the wildcards into directory listings while still putting each line of the file into a separate array element?
Bash 4 introduced readarray:
readarray -t array < "$file"
and you're done. On older Bash versions, append to the array in a while read loop instead:
array=()
while IFS= read -r line; do
array+=("$line")
done < "$file"
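A quick way to confirm that readarray keeps the glob patterns literal is a sketch like this (patterns.txt is a hypothetical sample file standing in for $file):

```shell
# Write two wildcard patterns to a sample file, then read them back.
printf '%s\n' '/vobs/of_app/unix/*' '/vobs/of_app/bin/*' > patterns.txt
readarray -t array < patterns.txt
declare -p array    # each pattern is its own element, unexpanded
```

Because readarray never passes the lines through word splitting or pathname expansion, the wildcards survive even when they would match real directories.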
I need to split a comma separated, but quoted list of strings into an indexed bash array in a script.
I know there are a lot of posts on the web in general and also on SO that show how to create an indexed array from a given line / string, but I could not find any example that builds the array elements the way I need. I apologise if I have missed any obvious examples on SO itself.
I am reading a file that I receive from someone, and cannot change it.
The file is formatted like this
"Grant ACL","grantacls.sh"
"Revoke ACL","revokeacls.sh"
"Get ACls for Topic","topicacls.sh"
"Get Topics for User with ACLs","useracls.sh"
I need to create an array for each line above, where the separator is the comma and each quoted string becomes an array element. I have tried various options. The latest attempt used a construct like this, copied from some example on the web:
parseScriptMapLine=${scriptName[$IN_OPTION]}
mapfile -td ',' script1 < <(echo -n "${parseScriptMapLine//, /,}")
declare -p script1
echo "script1 $script1"
where scriptName is an associative array created from the original file, with 1, 2, etc. as the keys and the part after the '=' sign as the values.
The above snippet prints
script1
I need to split the value part into an indexed array, so that I can pass the second element as a parameter. When creating the indexed array from the value string, losing the quotes is fine, and keeping them is fine too. The associative array looks like this:
1="Grant ACL","grantacls.sh"
2="Revoke ACL","revokeacls.sh"
3="Get ACls for Topic","topicacls.sh"
4="Get Topics for User with ACLs","useracls.sh"
I have looked at a lot of examples, but haven't been able to get this particular requirement working.
Thank you
With apologies, I could not understand what you wanted - this sounds like an X/Y Problem. Can you clarify?
Maybe this?
$: while IFS=',"' read -r _ a _ _ d _ && [[ -n "$d" ]]; do echo "a=[$a] d=[$d]"; done < file
a=[Grant ACL] d=[grantacls.sh]
a=[Revoke ACL] d=[revokeacls.sh]
a=[Get ACls for Topic] d=[topicacls.sh]
a=[Get Topics for User with ACLs] d=[useracls.sh]
That will let you do whatever you wanted with the fields, which I named a and d.
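Building on that, here is a sketch that collects those fields into two parallel arrays (the array names and the acls.csv file name are illustrative):

```shell
# Hypothetical sample file in the format from the question
cat > acls.csv <<'EOF'
"Grant ACL","grantacls.sh"
"Revoke ACL","revokeacls.sh"
EOF

names=() scripts=()
# Split on both commas and quotes; the _ variables soak up the empty fields
while IFS=',"' read -r _ a _ _ d _ && [[ -n $d ]]; do
  names+=("$a")
  scripts+=("$d")
done < acls.csv
```

After the loop, "${scripts[1]}" holds the script from the second line, ready to pass as a parameter.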
If you just want to load the lines of the file into an array -
$: mapfile -t script1 < file
$: for i in "${!script1[@]}"; do echo "$i=${script1[i]}"; done
0="Grant ACL","grantacls.sh"
1="Revoke ACL","revokeacls.sh"
2="Get ACls for Topic","topicacls.sh"
3="Get Topics for User with ACLs","useracls.sh"
If you want a two-dimensional array, then sorry, you're going to have to use something besides bash, or get more creative.
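One common workaround, sketched here with illustrative names, is to fake two dimensions with an associative array keyed by "row,column":

```shell
declare -A table
row=0
while IFS=',"' read -r _ name _ _ script _ && [[ -n $script ]]; do
  table[$row,0]=$name      # column 0: the description
  table[$row,1]=$script    # column 1: the script name
  row=$((row + 1))
done <<'EOF'
"Grant ACL","grantacls.sh"
"Revoke ACL","revokeacls.sh"
EOF
echo "${table[1,1]}"       # revokeacls.sh
```

The "row,column" key is just a string to bash, but it lets you address either field of any line.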
I have two directories with photos that I want to manipulate to output a random order of the files each time a script is run. How would I create such a list?
d1=/home/Photos/*.jpg
d2=/mnt/JillsPC/home/Photos/*.jpg
# somehow make a combined list, files = d1 + d2
# somehow randomise the file order
# during execution of the for;do;done loop, no file should be repeated
for f in $files; do
echo $f # full path to each file
done
I wouldn't use variables if you don't have to. It's more natural if you chain a couple of commands together with pipes or process substitution. That way everything operates on streams of data without loading the entire list of names into memory all at once.
You can use shuf to randomly permute input lines, and find to list files one per line. Or, to be maximally safe, let's use \0 separators. Finally, a while loop with process substitution reads line by line into a variable.
while IFS= read -d $'\0' -r file; do
echo "$file"
done < <(find /home/Photos/ /mnt/JillsPC/home/Photos/ -name '*.jpg' -print0 | shuf -z)
That said, if you do want to use some variables then you should use arrays. Arrays handle file names with whitespace and other special characters correctly, whereas regular string variables muck them all up.
d1=(/home/Photos/*.jpg)
d2=(/mnt/JillsPC/home/Photos/*.jpg)
files=("${d1[@]}" "${d2[@]}")
Iterating in order would be easy:
for file in "${files[@]}"; do
echo "$file"
done
Shuffling is tricky though. shuf is still the best tool but it works best on a stream of data. We can use printf to print each file name with the trailing \0 we need to make shuf -z happy.
d1=(/home/Photos/*.jpg)
d2=(/mnt/JillsPC/home/Photos/*.jpg)
files=("${d1[@]}" "${d2[@]}")
while IFS= read -d $'\0' -r file; do
echo "$file"
done < <(printf '%s\0' "${files[@]}" | shuf -z)
Further reading:
How can I read a file (data stream, variable) line-by-line (and/or field-by-field)?
How can I find and safely handle file names containing newlines, spaces or both?
I set variables in a loop that's in a pipeline. Why do they disappear after the loop terminates? Or, why can't I pipe data to read?
How can I randomize (shuffle) the order of lines in a file? Or select a random line from a file, or select a random file from a directory?
I came up with this solution after some more reading:
files=(/home/roy/Photos/*.jpg /mnt/JillsPC/home/jill/Photos/*.jpg)
printf '%s\n' "${files[@]}" | sort -R
Edit: updated with John's improvements from comments.
You can add any number of directories into an array declaration (though see the caveat about complex names in the comments).
According to its man page, sort -R shuffles by sorting on a random hash of the keys (the man page points to shuf for true shuffling), so unlike shuf it groups identical lines together.
This was the original, which works, but is not as robust as the above:
files=(/home/roy/Photos/*.jpg /mnt/JillsPC/home/jill/Photos/*.jpg)
(IFS=$'\n'; echo "${files[*]}") | sort -R
With IFS=$'\n', echoing the array will display it line by line ($'\n' is ANSI-C quoting for escape sequences, so unlike '\n', $'\n' is the correct way to set IFS to a line break). IFS is not needed when using the printf method above.
echo "${files[*]}" prints all array elements at once, joined by the first character of the IFS defined in the subshell.
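A small sketch of that joining behavior, with made-up file names:

```shell
files=(a.jpg "b c.jpg" d.jpg)
# Inside double quotes, [*] joins the elements with the first character of IFS.
# Setting IFS in a command substitution subshell keeps the change local.
joined=$(IFS=$'\n'; echo "${files[*]}")
printf '%s\n' "$joined"
```

With three elements this prints three lines, and the name containing a space stays intact because it was never re-split.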
I want to be able to store a directory's contents inside of an array. I know that you can use:
#!/bin/bash
declare -A array
for i in Directory/*; do
array[$i]=$i
done
to store directory contents in an associative array. But if there are subdirectories inside of a directory, I want to be able to store them and their contents inside of the same array. I also tried using:
declare -A arr1
find Directory -print0 | while read -d $'\0' file; do
arr1[$file]=$file
echo "${arr1[$file]}"
done
but this just runs into the problem where the array contents vanish once the while loop ends, because the pipeline runs the loop in a subshell that is discarded.
I even tried the following:
for i in $(find Directory/*); do
arr2[$i]="$i"
echo $i
done
but the output is a total disaster for files containing any spaces.
How can I store both a directory and all of its subdirectories (and their subdirectories if need be) inside of a single array?
So you know, you don't need associative arrays. A simpler way to add an element to a regular indexed array is:
array+=("$value")
Your find|while approach is on the right track. As you've surmised, you need to get rid of the pipeline. You can do that with process substitution:
while IFS= read -r -d '' file; do
arr1+=("$file")
done < <(find Directory -print0)
Another way to do this without find is with globbing. If you just want the files under Directory it's as simple as:
array=(Directory/*)
If you want to recurse through all of its subdirectories as well, you can enable globstar and use **:
shopt -s globstar
array=(Directory/**)
The globbing methods are really nice because they automatically handle file names with whitespace and other special characters.
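If a pattern might match nothing, nullglob is also worth enabling; otherwise the unmatched pattern itself ends up in the array as a literal string. A sketch with a hypothetical Directory tree:

```shell
shopt -s globstar nullglob
# Build a small tree to glob over (names are illustrative)
mkdir -p Directory/sub
touch Directory/a.txt "Directory/sub/b c.txt"

array=(Directory/**)          # files and subdirectories, recursively
missing=(Directory/*.nope)    # with nullglob: an empty array, not the pattern
printf '%s\n' "${array[@]}"
```

The name with a space arrives as a single element, with no quoting gymnastics required.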
I am able to read a file into a regular array with a single statement:
local -a ary
readarray -t ary < "$fileName"
What is not happening is reading a file into an assoc. array.
I have control over file creation, so I would like to do this as simply as possible, without loops if possible at all.
So the file content can be the following, to be read in as:
keyname=valueInfo
But I am willing to replace = with another string if it cuts down on code, especially in a one-liner like the above.
And ...
So would it be possible to read such a file into an assoc array using something like an until - i.e. read into the assoc array until it hits a marker word - or would I have to do this as part of a loop?
This would allow me to keep a lot of similar values in the same file, but read them into separate arrays.
I looked at mapfile as well, but it does the same as readarray.
Finally ...
I am creating an options list - to select from - as below:
local -a arr=("${!1}")
select option in ${arr[*]}; do
echo ${option}
break
done
Works fine - however, the list shown is not sorted. I would like to have it sorted if at all possible.
Hope it is ok to put all 3 questions into 1 as the questions are similar - all on arrays.
Thank you.
First thing, associative arrays are declared with -A not -a:
local -A ary
And if you want to declare a variable in the global scope, use declare outside of a function:
declare -A ary
Or use -g if BASH_VERSION >= 4.2.
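A sketch of that scoping behavior (make_map and the key are illustrative names, and -g needs bash >= 4.2):

```shell
make_map() {
  declare -gA ary        # -g makes the array global despite the function scope
  ary[colour]=red
}
make_map
echo "${ary[colour]}"    # red
```

Without -g, declare inside a function behaves like local, and ary would vanish when make_map returns.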
If your lines do have keyname=valueInfo, with readarray, you can process it like this:
readarray -t lines < "$fileName"
for line in "${lines[@]}"; do
key=${line%%=*}
value=${line#*=}
ary[$key]=$value ## Or simply ary[${line%%=*}]=${line#*=}
done
Using a while read loop can also be an option:
while IFS= read -r line; do
ary[${line%%=*}]=${line#*=}
done < "$fileName"
Or
while IFS='=' read -r key value; do
ary[$key]=$value
done < "$fileName"
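For the third question: select shows the options in array order, so sort the array before the select statement. A sketch, assuming the elements contain no newlines:

```shell
arr=(pear apple cherry)
# printf emits one element per line; sort orders them; mapfile reads them back
mapfile -t sorted < <(printf '%s\n' "${arr[@]}" | sort)
printf '%s\n' "${sorted[@]}"
# then: select option in "${sorted[@]}"; do ...; done
```

This keeps the sorting outside the select, so the menu numbering still matches the displayed order.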
I have a file (my_ID_file) of IDs, one ID per line, with no other excess whitespace. It was created using the cut command from another file. The file looks like 101 lines of this...
PA10
PA102
PA103
PA105
PA107
PA109
I am trying to use these IDs in a for loop to create a directory structure, so I use readarray to create the array...
readarray TIDs < my_ID_file
and then use a for loop as such to create the directory structure...
for T in "${TIDs[#]}"
do
mkdir "$T"_folder
done
This produces directories named...
PA10?_folder
PA102?_folder
PA103?_folder
PA105?_folder
PA107?_folder
PA109?_folder
If, however, I declare the array manually, like...
TIDs=(PA10 PA102 PA103 PA105 PA107 PA109)
and then run the for loop, I get the correct directory structure, like...
PA10_folder
PA102_folder
PA103_folder
PA105_folder
PA107_folder
PA109_folder
Where are these question marks coming from? How can I declare arrays from files like this without the question marks appearing in subsequent uses of the array?
Thanks
Your text file has DOS-style line endings. The ? appears because ls displays the carriage return at the end of each name as a question mark. Try od -c my_ID_file to see them.
Solution: dos2unix my_ID_file
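If dos2unix is not available, the carriage returns can also be stripped in bash itself; a sketch with inline sample data standing in for my_ID_file:

```shell
# Simulate a file with DOS (\r\n) line endings
readarray -t TIDs < <(printf 'PA10\r\nPA102\r\n')
TIDs=("${TIDs[@]%$'\r'}")    # remove one trailing CR from every element
printf '[%s]\n' "${TIDs[@]}"
```

The "${array[@]%pattern}" expansion applies the suffix removal to each element in one pass, so no explicit loop is needed.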
Take #2: readarray does not remove each line's trailing newline by default. You really want
readarray -t TIDs < my_ID_file
reference: http://www.gnu.org/software/bash/manual/bashref.html#index-mapfile