I have a function to which I am passing a path.
What happens is that function setSub calls function testSub, and under certain conditions testSub calls setSub with a different path.
Here is what I have so far:
shopt -s nullglob
function setSub() {
    local assets=("$1"/*)
    echo "${#assets[@]}" ######### Here
    for asset in "${assets[@]}"; do
        if [ -d "$asset" ]; then
            setSub "$asset"
        fi
    done
}
The place I marked 'Here' outputs the array length. The problem is that whenever it calls itself, the assets array has length 0. The sample above should drill down and list the number of items in each directory. (That is what the sample does, not what my whole script does.)
It seems what I was looking for was shopt -s dotglob.
My test case only had dot files in the directory.
But after @l0b0's suggestion I did a bit of research and found this.
After reading about the issues with globbing, I thought it best to replace it with find.
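For the record, here is a minimal sketch of that find-based replacement. It keeps the structure of the sample above, uses a null-delimited read so odd file names survive, and only looks at the immediate children of each directory (the exact find options are an assumption; adjust to taste):

function setSub() {
    local assets=()
    # Collect every entry directly under "$1", dotfiles included
    while IFS= read -r -d '' asset; do
        assets+=("$asset")
    done < <(find "$1" -mindepth 1 -maxdepth 1 -print0)
    echo "${#assets[@]}"
    for asset in "${assets[@]}"; do
        if [ -d "$asset" ]; then
            setSub "$asset"
        fi
    done
}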
I have encountered a very curious problem while trying to learn bash.
Usually, trying to print the array by simply passing the variable name like this only outputs the first member, Hello.
#!/bin/bash
declare -a test
test[0]="Hello"
test[1]="World"
echo $test # Only prints "Hello"
BUT, for some reason this piece of code prints out ALL members of the given array.
#!/bin/bash
declare -a files
counter=0
for file in "./*"
do
    files[$counter]=$file
    let $((counter++))
done
echo $files # prints "./file1 ./file2 ./file3" and so on
And I can't seem to wrap my head around why it outputs the whole array instead of only the first member. I think it has something to do with my usage of the for loop, but I was unable to find any concrete answer. It's driving me crazy!
Please send help!
When you quoted the pattern, you only created a single entry in your array:
$ declare -p files
declare -a files=([0]="./*")
If you had quoted the parameter expansion, you would see
$ echo "$files"
./*
Without the quotes, the expansion is subject to pathname generation, so echo receives multiple arguments, each of which is printed.
To build the array you expected, drop the quotes around the pattern. The results of pathname generation are not subject to further word-splitting (or recursive pathname generation), so no quotes would be needed.
for file in ./*
do
...
done
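To make the difference concrete, here is a runnable sketch of the corrected loop (run it in a directory with a few files; it uses += to append instead of a manual counter):

#!/bin/bash
declare -a files
for file in ./*           # unquoted, so the pattern expands to each matching name
do
    files+=("$file")      # append an element; no counter needed
done
echo "$files"             # still prints only the first element
echo "${files[@]}"        # prints every element
declare -p files          # shows the full array structure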
I am trying to loop through an array of directories using a bash script so I can list directories with their timestamp, ownership, etc. using ls -arlt. I am reviewing bash, so I would like some feedback.
It works with declare -a for those indirect references, but for each directory it outputs an extra directory listing from /home/user.
I tried to use declare -n and declare -r for each directory, and that doesn't work either.
#!/bin/bash
# Bash variables
acpi=/etc/acpi
apm=/etc/apm
xml=/etc/xml
array=( acpi apm xml )
# Function to display timestamp, ownership ...
displayInfo()
{
    for i in "${array[@]}"; do
        declare -n curArray=$i
        if [[ -d ${curArray} ]]; then
            declare -a _acpi=${curArray[0]} _apm=${curArray[1]} _xml=${curArray[2]}
            echo "Displaying folder apci: "
            cd $_acpi
            ls -alrt
            read -p "Press enter to continue"
            echo "Displaying folder apm: "
            cd $_apm
            ls -alrt
            read -p "Press enter to continue"
            echo "Displaying folder xml: "
            cd $_xml
            ls -alrt
            read -p "Press enter to continue"
        else
            echo "Displayed Failed" >&2
            exit 1
        fi
    done
}
displayInfo
exit 0
It outputs an extra directory listing of /home/user, and I don't want that output.
There are a lot of complex and powerful shell features being used here, but in ways that don't fit together or make sense. I'll go over the mistakes in a minute, but first let me just show how I'd do it. One thing I will use that you might not be familiar with is indirect variable references with ${!var} -- this is like using a nameref variable, but IMO it's clearer what's going on.
acpi=/etc/acpi
apm=/etc/apm
xml=/etc/xml
array=( acpi apm xml )
displayInfo()
{
    for curDirectory in "${array[@]}"; do
        if [[ -d ${!curDirectory} ]]; then
            echo "Displaying folder $curDirectory:"
            ls -alrt "${!curDirectory}"
            read -p "Press enter to continue"
        else
            echo "Error: ${!curDirectory} does not exist or is not a directory" >&2
            exit 1
        fi
    done
}
displayInfo
(One problem with this is that it does the "Press enter to continue" thing after each directory, rather than just between them. This can be fixed, but it's a little more work.)
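A minimal sketch of that fix, prompting only between directories rather than after every one:

displayInfo()
{
    local first=1
    for curDirectory in "${array[@]}"; do
        if [[ -d ${!curDirectory} ]]; then
            (( first )) || read -p "Press enter to continue"
            first=0
            echo "Displaying folder $curDirectory:"
            ls -alrt "${!curDirectory}"
        else
            echo "Error: ${!curDirectory} does not exist or is not a directory" >&2
            exit 1
        fi
    done
}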
Ok, now for what went wrong with the original. My main recommendation for you would be to try mentally stepping through your code to see what it's doing. It can help to put set -x before it, so the shell will print its interpretation of what it's doing as it runs, and see how it compares to what you expected. Let's do a short walkthrough of the displayInfo function:
for i in "${array[@]}"; do
This will loop over the contents of array, so on the first pass through the loop i will be set to "acpi". Good so far.
declare -n curArray=$i
This creates a nameref variable pointing to the other variable acpi -- this is similar to what I did with ${!var}, and basically reasonable so far. Well, with one exception: the name suggests it's an array, but acpi is a plain variable, not an array.
if [[ -d ${curArray} ]]; then
This checks whether the contents of the acpi variable, "/etc/acpi" is the path of an existing directory (which it is). Still doing good.
declare -a _acpi=${curArray[0]} _apm=${curArray[1]} _xml=${curArray[2]}
Here's where things go completely off the rails. curArray points to the variable acpi, so ${curArray[0]} etc are equivalent to ${acpi[0]} etc. But acpi isn't an array, it's a plain variable, so ${acpi[0]} gets its value, and ${acpi[1]} and ${acpi[2]} get nothing. Furthermore, you're using declare -a (declare arrays), but you're just assigning single values to _acpi, _apm, and _xml. They're declared as arrays, but you're just using them as plain variables (basically the reverse of how you're using curArray -> acpi).
There's a deeper confusion here as well. The for loop above is iterating over "acpi", "apm", and "xml", and we're currently working on "acpi". During this pass through the loop, you should only be working on acpi, not also trying to work on apm and xml. That's the point of having a for loop there.
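A quick way to see this in an interactive shell, using the same acpi variable:

$ acpi=/etc/acpi
$ echo "${acpi[0]}"         # a plain variable behaves like element 0
/etc/acpi
$ echo "got '${acpi[1]}'"   # any other index expands to nothing
got ''
$ declare -p acpi           # confirms it is a plain variable, not an array
declare -- acpi="/etc/acpi"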
Ok, that's the main problem here, but let me just point out a couple of other things I'd consider bad practice:
cd $_apm
ls -alrt
Using a variable reference without double-quotes around it like this invites parsing confusion; you should almost always put double-quotes, like cd "$_apm". Also, using cd in a script is dangerous because if it fails the rest of the script will execute in the wrong place. In this case, _apm is empty, so without double-quotes it's equivalent to just cd, which moves to your home directory. This is why you're getting that result. If you used cd "$_apm" it would get an error instead... but since you don't check for that it'll go ahead and still list an irrelevant location.
It's almost always better to avoid cd and its complications entirely, and just use explicit paths, like ls -alrt "$_apm".
echo "Displayed Failed" >&2
exit 1
Do you actually want to exit the entire script if one of the directories doesn't exist? It'd make more sense to me to just return 1 (which exits just the function, not the entire script), or better yet continue (which just goes on to the next iteration of the loop -- i.e. the next directory on the list). I left the exit in my version, but I'd recommend changing it.
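For example, the else branch in my version could become something like this sketch, which warns and moves on instead of aborting:

else
    echo "Warning: ${!curDirectory} does not exist or is not a directory" >&2
    continue    # skip this entry and go on to the next directory
fi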
One more similar thing:
acpi=/etc/acpi
apm=/etc/apm
xml=/etc/xml
array=( acpi apm xml )
Is there any actual reason to use this array -> variable name -> actual directory path system (and resulting indirect expansion or nameref complications), rather than just having an array of directory paths, like this?
array=( /etc/acpi /etc/apm /etc/xml )
I left the indirection in my version above, but really if there's no reason for it I'd remove the complication.
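For comparison, with plain paths in the array the function needs no indirection at all; a sketch equivalent to my version above:

array=( /etc/acpi /etc/apm /etc/xml )

displayInfo()
{
    for curDirectory in "${array[@]}"; do
        if [[ -d $curDirectory ]]; then
            echo "Displaying folder $curDirectory:"
            ls -alrt "$curDirectory"
            read -p "Press enter to continue"
        else
            echo "Error: $curDirectory does not exist or is not a directory" >&2
            exit 1
        fi
    done
}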
I have an assignment I am working on, but I am having a problem getting it started. Some of the assignment text is below, which can help guide me in the right direction.
My main problem is getting the list of files into an array. I think if I can do that, the rest should be easy. I can push files passed as arguments into an array, but I don't know how to get all of the files from a directory into an array, with each file as its own element.
Any help would be greatly appreciated!
Thanks to Benjamin W's comment:
Just use files=(*)
Or, if you want to include hidden files and don't want to get in trouble with empty folders, use this (thanks to Fred's comment):
shopt -s nullglob dotglob
files=(*)
#!/bin/bash
shopt -s nullglob
arr=(/home/*)
for ((i=0; i<${#arr[@]}; i++)); do
    echo "${arr[$i]}"
done
This script checks if it has been given any parameters ((( $# == 0 ))), and if not, it uses set -- "$PWD" to set the first positional parameter to the current directory.
After that, for f (which is short for for f in "$@") loops over all the parameters for processing.
#!/bin/bash
(( $# == 0 )) && set -- "$PWD"
for f; do
    # Do something with f
done
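Tying that back to the question, the loop body could fill an array from each directory that was passed (or from the current directory by default); the files name here is just for illustration:

#!/bin/bash
shopt -s nullglob dotglob
(( $# == 0 )) && set -- "$PWD"
files=()
for f; do
    files+=("$f"/*)       # append every entry of directory $f to the array
done
printf '%s\n' "${files[@]}"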
I want to be able to store a directory's contents inside of an array. I know that you can use:
#!/bin/bash
declare -A array
for i in Directory; do
    array[$i]=$i
done
to store directory contents in an associative array. But if there are subdirectories inside of a directory, I want to be able to store them and their contents inside of the same array. I also tried using:
declare -A arr1
find Directory -print0 | while read -d $'\0' file; do
    arr1[$file]=$file
    echo "${arr1[$file]}"
done
but this just runs into the problem where the array contents vanish once the while loop ends due to the subshell being discarded from the pipeline (not sure if I'm describing this correctly).
I even tried the following:
for i in $(find Directory/*); do
    arr2[$i]="$i"
    echo $i
done
but the output is a total disaster for files containing any spaces.
How can I store both a directory and all of its subdirectories (and their subdirectories if need be) inside of a single array?
So you know, you don't need associative arrays. A simpler way to add an element to a regular indexed array is:
array+=("$value")
Your find|while approach is on the right track. As you've surmised, you need to get rid of the pipeline. You can do that with process substitution:
while IFS= read -r -d $'\0' file; do
    arr1+=("$file")
done < <(find Directory -print0)
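Since the while loop now runs in the current shell rather than in a pipeline subshell, the array is still populated afterwards, e.g.:

echo "Collected ${#arr1[@]} paths"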
Another way to do this without find is with globbing. If you just want the files under Directory it's as simple as:
array=(Directory/*)
If you want to recurse through all of its subdirectories as well, you can enable globstar and use **:
shopt -s globstar
array=(Directory/**)
The globbing methods are really nice because they automatically handle file names with whitespace and other special characters.
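Either way, quoting the expansion when you loop over the result keeps names with spaces intact:

for f in "${array[@]}"; do
    printf '%s\n' "$f"
done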
In a shell script, I need to apply the same shell code to all files that either have .F90 or .F as extension.
For the moment I use
for file in *.F90; do ...
# Code I need to run
for file in *.F; do ...
# Same block of code copy-pasted.
Is there a way to merge these two loops, making an array of matching files and then applying the action?
I'm not sure I understand your question, because I don't see why you would need an array.
This would be legal syntax:
for file in *.F90 *.F ; do ...
To keep a list of the files impacted, you could do:
shopt -s nullglob
files="*.F90 *.F"
for file in ${files} ; do ...
Note: the nullglob line prevents lame (IMO) behavior should *.F90 or *.F not match any files.
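If you do want an actual array of the matching files, as the question asks, the same globs can go straight into an array assignment; a sketch:

shopt -s nullglob
files=( *.F90 *.F )
for file in "${files[@]}"; do
    echo "processing $file"    # code to run on each file goes here
done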