In a shell script, I need to apply the same shell code to all files that either have .F90 or .F as extension.
For the moment I use
for file in *.F90; do
    # Code I need to run
done

for file in *.F; do
    # Same block of code, copy-pasted
done
Is there a way to merge these two loops by building an array of matching files and then applying the action to each?
I'm not sure I understand your question, because I don't see why you would need an array.
This would be legal syntax:
for file in *.F90 *.F; do ...
To keep a list of the files impacted, you could do:
shopt -s nullglob
files=( *.F90 *.F )
for file in "${files[@]}"; do ...
Note: the nullglob line prevents the lame (IMO) default behavior of passing the pattern through as a literal string should *.F90 or *.F not match any files.
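Putting it together, a minimal self-contained sketch (quoting every expansion keeps file names with spaces intact):

#!/usr/bin/env bash
shopt -s nullglob        # unmatched patterns expand to nothing instead of themselves
files=( *.F90 *.F )
echo "found ${#files[@]} Fortran files"
for file in "${files[@]}"; do
    # Code you need to run, for example:
    echo "processing $file"
done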
I use rsync to copy files according to pairwise defined sources/destinations, read from two config-files.
When transferring a single pair (one file), it works.
When using two or more pairs, the first transfer always fails with:
rsync: link_stat "MyFILEPATH\#015" failed: No such file or directory (2)
All following copies are successful.
Shuffling the order of the files doesn't change the behavior.
It's always the first one.
So I'd rule out corrupted or missing files as a cause.
This is my script:
#!/bin/bash
mapfile -t sources <"source-files.txt"
mapfile -t destinations <"destination-folders.txt"
for i in "${!sources[#]}"; do
rsync -av -- "${sources[i]}" "${destinations[i]}"
done
In the config-files, the sources file lists one file path per line and the destinations file lists one folder path per line, without quotes or any whitespace.
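One observation: `\#015` in rsync's error output is an octal escape for a carriage return, so the failing path may simply carry a stray `\r`, e.g. from a config file saved with Windows (CRLF) line endings. A minimal sketch that strips a trailing carriage return from every entry before transferring, assuming that is the cause:

#!/bin/bash
mapfile -t sources <"source-files.txt"
mapfile -t destinations <"destination-folders.txt"
for i in "${!sources[@]}"; do
    src="${sources[i]%$'\r'}"            # drop a trailing carriage return, if present
    dst="${destinations[i]%$'\r'}"
    rsync -av -- "$src" "$dst"
done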
I am writing a little script that outputs a list of duplicate files in the directory, i.e. pairs of XXX.jpg and XXX (1).jpg. I want to use the output of this script as an argument to a command, namely ql (quicklook), so I can look through all such images (to verify they are indeed duplicate images, not just similar filenames). For instance, I can do `ql *\(*`, which lets me look through all the files 'XXX (1).jpg'; but I want that list to also include the original 'XXX.jpg' files.
Here is my script so far:
dups=()
for file in *\(*; do
dups+=( "${file}" )
breakdown=( $file )
dupfile="${breakdown[0]}.jpg"
if [ -e "$dupfile" ]; then
dups+=( "$dupfile" )
fi
done
echo ${dups[@]}
As far as building an array of the required filenames goes, it works. But when it comes to invoking something like ql $(./printdups.sh), the command gets confused by the filenames with spaces. It will attempt to open 'XXX' as a file, and then '(1).jpg' as another file. So the question is, how can I echo this array such that filenames with spaces are recognised as such by the command I pass it to?
I have tried changing line 3 to:
dups+=( "'$file'" )
And:
dups+=( "${file/ /\ }" )
Both to no avail.
You can't pass arrays from one process to another. All you are doing is writing a space-separated sequence of file names to standard output, and the unquoted command substitution in ql $(./printdups.sh) fails for the same reason you need an array in the first place: word-splitting does not distinguish between spaces in file names and spaces between file names.
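A quick demonstration of that failure mode, with hypothetical file names:

$ dups=( "XXX (1).jpg" "XXX.jpg" )
$ printf '<%s>\n' $(echo "${dups[@]}")   # unquoted substitution: split at every space
<XXX>
<(1).jpg>
<XXX.jpg>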
I would recommend defining a function, rather than a script, and have that function populate a global array that you can access directly after the function has been called.
get_dups () {
dups=()
for file in *\(*; do
dups+=( "$file" )
read -r -a breakdown <<< "$file" # safer way to split the name into parts; -r keeps backslashes literal
dupfile="${breakdown[0]}.jpg"
if [ -e "$dupfile" ]; then
dups+=( "$dupfile" )
fi
done
}
get_dups
ql "${dups[#]}"
I want to be able to store a directory's contents inside of an array. I know that you can use:
#!/bin/bash
declare -A array
for i in Directory/*; do
array[$i]=$i
done
to store directory contents in an associative array. But if there are subdirectories inside of a directory, I want to be able to store them and their contents inside of the same array. I also tried using:
declare -A arr1
find Directory -print0 | while read -d $'\0' file; do
arr1[$file]=$file
echo "${arr1[$file]}"
done
but this just runs into the problem where the array contents vanish once the while loop ends, because the pipeline runs the loop in a subshell whose variables are discarded (not sure if I'm describing this correctly).
I even tried the following:
for i in $(find Directory/*); do
arr2[$i]="$i"
echo $i
done
but the output is a total disaster for files containing any spaces.
How can I store both a directory and all of its subdirectories (and their subdirectories if need be) inside of a single array?
So you know, you don't need associative arrays. A simpler way to add an element to a regular indexed array is:
array+=("$value")
Your find|while approach is on the right track. As you've surmised, you need to get rid of the pipeline. You can do that with process substitution:
while IFS= read -r -d '' file; do   # IFS= and -r protect leading/trailing spaces and backslashes
arr1+=("$file")
done < <(find Directory -print0)
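If your bash is 4.4 or newer, mapfile can consume the NUL-delimited output directly and replace the loop entirely (a minimal sketch):

mapfile -d '' -t arr1 < <(find Directory -print0)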
Another way to do this without find is with globbing. If you just want the files under Directory it's as simple as:
array=(Directory/*)
If you want to recurse through all of its subdirectories as well, you can enable globstar and use **:
shopt -s globstar
array=(Directory/**)
The globbing methods are really nice because they automatically handle file names with whitespace and other special characters.
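One caveat: ** matches directories as well as files. If you only want regular files, you can filter while filling the array (a sketch, assuming bash 4 or newer for globstar):

shopt -s globstar nullglob
files=()
for f in Directory/**; do
    [[ -f $f ]] && files+=("$f")   # keep regular files, skip directories
done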
I have a function to which I am passing a path.
What happens is function setSub calls function testSub and under certain conditions testSub calls setSub with a different path.
Here is what I have so far
shopt -s nullglob
function setSub() {
local assets=("$1"/*)
echo ${#assets[@]} ######### Here
for asset in "${assets[@]}";
do
if [ -d "$asset" ]; then
setSub "$asset"
fi;
done
}
The place I marked 'Here' outputs the array length. The problem is that whenever the function calls itself, the assets array has length 0. The sample above should drill down and list the number of items in each directory. (That is what the sample does, not what my whole script does.)
It seems what I was looking for was shopt -s dotglob
My test case only had dot files in the directory.
But after @l0b0's suggestion I did a bit of research and found this.
After reading about the issues with globbing, I thought it best to replace it with find.
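For reference, a sketch of what that find-based replacement could look like; find sees dot files by default, so dotglob is no longer needed:

function setSub() {
    local assets=()
    while IFS= read -r -d '' entry; do
        assets+=("$entry")
    done < <(find "$1" -mindepth 1 -maxdepth 1 -print0)
    echo "${#assets[@]}"              # the place marked 'Here' above
    local asset
    for asset in "${assets[@]}"; do
        if [ -d "$asset" ]; then
            setSub "$asset"
        fi
    done
}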
I used the statement echo *([!min]).css to get all filenames in the current directory with the .css extension, except for the ones with the .min.css extension. That worked in my interactive bash.
However, when I use this to initialize an array in a bash script like that
files=(*([!min]).css)
it doesn't work anymore. Bash says there is an unexpected opening bracket somewhere. My editor's syntax highlighting also suggests that the brackets of the glob inside the array initialization are mismatched, but I wasn't able to get it right.
Any advice? Thanks.
EDIT: I use GNU Bash 4.3.033 on ArchLinux.
To use extended globs, you must enable the extglob shell option. Put it at the start of your script, just below the shebang:
#!/usr/bin/env bash
shopt -s extglob
#...
files=( !(*.min).css )
#...
Note that shell options are not inherited, so even though you may have extglob enabled in the interactive bash you run the script from, you still have to explicitly enable it in the script.
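For example, combined with nullglob so that the array ends up empty rather than containing the literal pattern when nothing matches (a minimal sketch):

#!/usr/bin/env bash
shopt -s extglob nullglob
files=( !(*.min).css )
echo "matched ${#files[@]} stylesheets"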