Bash extended globbing brackets break array initialization - arrays

I used the statement echo *([!min]).css to get all filenames in the current directory with the .css extension, except for the ones with the .min.css extension. That worked in an interactive bash session.
However, when I use this to initialize an array in a bash script like this
files=(*([!min]).css)
it doesn't work anymore. Bash says there is an unexpected opening bracket somewhere. My editor's syntax highlighting also suggests that the brackets of the glob inside the array initialization are wrong, but I wasn't able to get it right.
Any advice? Thanks.
EDIT: I use GNU Bash 4.3.033 on ArchLinux.

To use extended globs, you must enable the extglob shell option. Put it at the start of your script, just below the shebang:
#!/usr/bin/env bash
shopt -s extglob
#...
files=( !(*.min).css )
#...
Note that shell options are not inherited, so even though you may have extglob enabled in the interactive bash you run the script from, you still have to explicitly enable it in the script.
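As a quick check, a minimal sketch of the whole script (it simply lists whatever non-minified .css files are in the current directory):
#!/usr/bin/env bash
shopt -s extglob               # must be set before the pattern below is parsed
files=( !(*.min).css )         # every .css file except *.min.css
printf '%s\n' "${files[@]}"    # print one matched name per line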

Related

CMake: how to avoid escaping spaces in command line? [duplicate]

I'm trying to create a custom command that runs with some environment variables, such as LDFLAGS, whose value needs to be quoted if it contains spaces:
LDFLAGS="-Lmydir -Lmyotherdir"
I cannot find a way to include this argument in a CMake custom command, due to CMake's escaping rules. Here's what I've tried so far:
COMMAND LDFLAGS="-Ldir -Ldir2" echo blah VERBATIM)
yields "LDFLAGS=\"-Ldir -Ldir2\"" echo blah
COMMAND LDFLAGS=\"-Ldir -Ldir2\" echo blah VERBATIM)
yields LDFLAGS=\"-Ldir -Ldir2\" echo blah
It seems I either get the whole string quoted, or the escaped quotes don't resolve when used as part of the command.
I would appreciate either a way to include the literal double-quote or as an alternative a better way to set environment variables for a command. Please note that I'm still on CMake 2.8, so I don't have the new "env" command available in 3.2.
Note that this is not a duplicate of When to quote variables? as none of those quoting methods work for this particular case.
The obvious choice - often recommended when hitting the boundaries of COMMAND especially with older versions of CMake - is to use an external script.
I just wanted to add some simple COMMAND only variations that do work and won't need a shell, but are - I have to admit - still partly platform dependent.
One example would be to put only the quoted part into a variable:
set(vars_as_string "-Ldir -Ldir2")
add_custom_target(
QuotedEnvVar
COMMAND env LD_FLAGS=${vars_as_string} | grep LD_FLAGS
)
Which actually does escape the space and not the quotes.
Another example would be to add it with escaped quotes as a "launcher" rule:
add_custom_target(
LauncherEnvVar
COMMAND env | grep LD_FLAGS
)
set_target_properties(
LauncherEnvVar
PROPERTIES RULE_LAUNCH_CUSTOM "env LD_FLAGS=\"-Ldir -Ldir2\""
)
Edit: Added examples for multiple quoted arguments without the need of escaping quotes
Another example would be to "hide some of the complexity" in a function and - if you want to add this to all your custom command calls - use the global/directory RULE_LAUNCH_CUSTOM property:
function(set_env)
  get_property(_env GLOBAL PROPERTY RULE_LAUNCH_CUSTOM)
  if (NOT _env)
    set_property(GLOBAL PROPERTY RULE_LAUNCH_CUSTOM "env")
  endif()
  foreach(_arg IN LISTS ARGN)
    set_property(GLOBAL APPEND_STRING PROPERTY RULE_LAUNCH_CUSTOM " ${_arg}")
  endforeach()
endfunction(set_env)
set_env(LDFLAGS="-Ldir1 -Ldir2" CFLAGS="-Idira -Idirb")
add_custom_target(
MultipleEnvVar
COMMAND env | grep -E 'LDFLAGS|CFLAGS'
)
Alternative (for CMake >= 3.0)
I think what we actually are looking for here (besides cmake -E env ...) is a Bracket Argument, which allows any character without the need to add backslashes:
set_property(
GLOBAL PROPERTY
RULE_LAUNCH_CUSTOM [=[env LDFLAGS="-Ldir1 -Ldir2" CFLAGS="-Idira -Idirb"]=]
)
add_custom_target(
MultipleEnvVarNew
COMMAND env | grep -E 'LDFLAGS|CFLAGS'
)
References
0005145: Set environment variables for ADD_CUSTOM_COMMAND/ADD_CUSTOM_TARGET
How to modify environment variables passed to custom CMake target?
[CMake] How to set environment variable for custom command
cmake: when to quote variables?
You need three backslashes. I needed this recently to get a preprocessor define from PkgConfig and apply it to my C++ flags:
pkg_get_variable(SHADERDIR movit shaderdir)
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -DSHADERDIR=\\\"${SHADERDIR}\\\"")
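To illustrate, assuming shaderdir resolves to /usr/share/movit (a hypothetical value), the flags keep one level of escaped quotes on the compile line, and the shell strips the backslashes before the compiler sees the define:
# what ends up in CMAKE_CXX_FLAGS / on the generated compile line
-DSHADERDIR=\"/usr/share/movit\"
# what the compiler receives after the shell removes the backslashes
-DSHADERDIR="/usr/share/movit"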
Florian's answer is wrong on several counts:
Putting the quoted part in a variable makes no difference.
You should definitely use VERBATIM. It fixes platform-specific quoting bugs.
You definitely shouldn't use RULE_LAUNCH_CUSTOM for this. It isn't intended for this and only works with some generators.
You shouldn't use env as the command. It isn't available on Windows.
It turns out the real reason the OP's code doesn't work is that CMake always fully quotes the first word after COMMAND because it's supposed to be the name of an executable. You simply shouldn't put environment variables first.
For example:
add_custom_command(
OUTPUT q1.txt
COMMAND ENV_VAR="a b" echo "hello" > q1.txt
VERBATIM
)
add_custom_target(q1 ALL DEPENDS q1.txt)
$ VERBOSE=1 make
...
"ENV_VAR=\"a b\"" echo hello > q1.txt
/bin/sh: ENV_VAR="a b": command not found
So how do you pass an environment variable with spaces? Simple.
add_custom_command(
OUTPUT q1.txt
COMMAND ${CMAKE_COMMAND} -E env ENV_VAR="a b" echo "hello" > q1.txt
VERBATIM
)
Ok, I removed my original answer as the one proposed by @Florian is better. There is one additional tweak needed for multiple quoted args. Consider a list of environment variables as such:
set(my_env_vars LDFLAGS="-Ldir1 -Ldir2" CFLAGS="-Idira -Idirb")
In order to produce the desired expansion, convert to string and then replace ; with a space.
set(my_env_string "${my_env_vars}") #produces LDFLAGS="...";CFLAGS="..."
string(REPLACE ";" " " my_env_string "${my_env_string}")
Then you can proceed with @Florian's brilliant answer and add the custom launch rule. If you need semicolons in your string then you'll need to convert them to something else first.
Note that in this case I didn't need to launch with env:
set_target_properties(mytarget PROPERTIES RULE_LAUNCH_CUSTOM "${my_env_string}")
This of course depends on your shell.
On second thought, my original answer is below as I also have a case where I don't have access to the target name.
set(my_env LDFLAGS=\"-Ldir -Ldir2\" CFLAGS=\"-Idira -Idirb\")
add_custom_command(COMMAND sh -c "${my_env} grep LDFLAGS" VERBATIM)
This technique still requires that the semicolons from the list->string conversion be replaced.
Some folks suggest using ${CMAKE_COMMAND} and passing your executable as an argument, e.g.:
COMMAND ${CMAKE_COMMAND} -E env "$(WindowsSdkDir)/bin/x64/makecert.exe" ...
That worked for me.

How to merge two wildcard expressions

In a shell script, I need to apply the same shell code to all files that either have .F90 or .F as extension.
For the moment I use
for file in *.F90 ; do ...
# Code I need to run
for file in *.F ; do ...
# Same block of code copy-pasted.
Is there a way to merge these two loops, making an array of matching files and then applying the action?
I'm not sure I understand your question, because I don't see why you would need an array.
This would be legal syntax:
for file in *.F90 *.F ; do ...
To keep a list of the files impacted, you could do:
shopt -s nullglob
files="*.F90 *.F"
for file in ${files} ; do ...
Note: the nullglob option prevents the lame (IMO) behavior of the pattern being passed through literally should *.F90 or *.F not match any files.
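Since the question asks about collecting the matches into an array first, here is a minimal sketch of that variant (bash; the echo is just a placeholder for the real block of code):
shopt -s nullglob
files=( *.F90 *.F )
for file in "${files[@]}"; do
  echo "processing $file"
done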

Run C program from shell script [duplicate]

I have a script in unix that looks like this:
#!/bin/bash
gcc -osign sign.c
./sign < /usr/share/dict/words | sort | squash > out
Whenever I try to run this script it gives me an error saying that squash is not a valid command. squash is a shell script stored in the same directory as this script and looks like this:
#!/bin/bash
awk -f squash.awk
I have execute permissions set correctly but for some reason it doesn't run. Is there something else I have to do to make it able to run like shown? I am rather new to scripting so any help would be greatly appreciated!
As mentioned in @Biffen's comment, unless . is in your $PATH variable, you need to specify ./squash for the same reason you need to specify ./sign.
When parsing a bare word on the command line, bash checks all the directories listed in $PATH to see if said word is an executable file living inside any of them. Unless . is in $PATH, bash won't find squash.
To avoid this problem, you can tell bash not to go looking for squash by giving bash the complete path to it, namely ./squash.
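Putting that together, a minimal sketch of the fixed script (same commands and paths as in the question):
#!/bin/bash
gcc -o sign sign.c
# ./squash spells out the path, so bash does not have to search $PATH for it
./sign < /usr/share/dict/words | sort | ./squash > out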

Create array in bash with variables as array name

I'm not sure if this has been answered, I've looked and haven't found anything that looks like what I'm trying to do. I also posted this to stackexchange (https://unix.stackexchange.com/questions/189293/create-array-in-bash-with-variables-as-array-name)
I have a number of shell scripts that are capable of running against a ksh or bash shell, and they make use of arrays. I created a function named "setArray" that interrogates the running shell and determines what builtin to use to create the array - for ksh, set -A, for bash, typeset -a. However, I'm having some issues with the bash portion.
The function takes two arguments, the name of the array and the value to add. This then becomes ${ARRAY_NAME} and ${VARIABLE_VALUE}. Doing the following:
set -A $(eval echo \${ARRAY_NAME}) $(eval echo \${${ARRAY_NAME}[*]}) "${VARIABLE_VALUE}"
works perfectly in ksh. However,
typeset -a $(eval echo \${ARRAY_NAME})=( $(eval echo \${${ARRAY_NAME}[*]}) "${VARIABLE_VALUE}" )
does not. This provides
bash: syntax error near unexpected token '('
I know I can just make it a list of strings (e.g. MYARRAY="one two three") and just loop through it using the IFS, but I don't want to lose the ability to use an array either.
Any thoughts ?
Given the assertion that the ksh portion of this function is working, only the bash portion needs to be created, for which the following should work and, I believe, be safe and robust (though evidence to the contrary is welcome).
eval $ARRAY_NAME+=\(\"\$VARIABLE_VALUE\"\)
The first round of expansion only expands $ARRAY_NAME (the other expansions are escaped), giving
eval array+=("$VARIABLE_VALUE")
which eval then causes to be evaluated again normally, appending the value to the named array.
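Putting the two shells together, a rough sketch of what the dispatching function might look like (the function name and the two variables come from the question; the ksh line is the one the question says already works):
setArray() {
  ARRAY_NAME=$1
  VARIABLE_VALUE=$2
  if [ -n "${BASH_VERSION:-}" ]; then
    # bash: append the value to the named array via one level of eval expansion
    eval $ARRAY_NAME+=\(\"\$VARIABLE_VALUE\"\)
  else
    # ksh: rebuild the array with set -A, as in the question
    set -A $(eval echo \${ARRAY_NAME}) $(eval echo \${${ARRAY_NAME}[*]}) "${VARIABLE_VALUE}"
  fi
}
setArray MYARRAY "one"
setArray MYARRAY "two three"   # a value with a space stays one element in the bash branch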

Is it possible to combine bash and awk script files?

I have a bash script where I get the value of a variable that I would like to use in awk.
Is it possible to include a whole awk file in bash (the way it is possible with bash script files), e.g.:
#!/bin/sh
var1=$1
source myawk.sh
and myawk.sh:
print $1;
Bash and awk are different languages, each with their own interpreter of the same name. The tiny sample you show is stripped down too far to make much sense:
You've marked both files as shell scripts; one using the shebang #!/bin/sh and the other using the extension .sh. Obviously the shell can read shell script, and the command to do so is called . in Bourne shell (or source in csh and bash).
The shell script assigns a variable, but you're not using it anywhere. Did you mean passing it on to the awk script?
Both the awk and shell script use $1, which has different meanings for them (in bash, it's from the command line or a set command; in awk, it's from a parsed input line).
The two tools are often used in tandem, as the shell is better at combining separate programs and awk is better at reformatting tabular or structured text. It was so common that a whole language evolved to combine the tasks; Perl's roots are as a combination of shell, awk and sed.
If you just wanted to pass a variable from the shell script into an awk script, use -v. The man page is your friend.
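If passing the value is all that's needed, a minimal sketch of the -v route (data.txt is a hypothetical input file):
#!/usr/bin/env bash
var1=$1
# -v copies the shell variable's value into the awk variable v1
awk -v v1="$var1" '{ print v1, $1 }' data.txt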
First of all, if you're writing bash, don't use #!/bin/sh: that puts you in compatibility mode, which is only necessary if you're writing for portability (and then you have to adhere to the POSIX standard).
Now, regarding your question, you just have to run awk from inside your bash script, like this:
#!/bin/bash
var1=$1
awk -f myawk.sh
Also, you should probably use .awk as the extension.
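For completeness, a rough sketch of how the two files could fit together (input.txt is a hypothetical data file; note that a bare print $1; is not a valid awk program on its own, it needs to sit inside an action block):
# myawk.awk -- the question's one-liner, wrapped in an action block
{ print v1, $1 }

# script.sh
#!/bin/bash
var1=$1
# -f loads the awk program from a file; -v passes the bash variable in as v1
awk -v v1="$var1" -f myawk.awk input.txt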
Or, many people do something like this:
#!/usr/bin/env bash
#Bash things start
...
var1=$1
#Bash things stop
#Awk things start,
#may use pipes or variable to interact with bash
awk -v V1="$var1" '
#AWK program, can even include awk scripts here.
'
#Bash things
I suggest this page here by Bruce Barnett:
http://www.grymoire.com/Unix/Awk.html#uh-3
You can also put the awk program in double quotes so the shell expands variables inside it, but that gets confusing.
Personally, I just try to avoid the fancy GNU additions of bash or awk and keep my scripts ksh+(n)awk compatible.
As a hardcore AWK user, I soon realized that doing the following was really a huge help:
Defining and exporting an AWK_REPO variable in my bashrc
#Content of bashrc
export AWK_REPO=~/bin/AWK
Storing every AWK script I write there, using the .awk extension.
You can then call it from anywhere like this:
awk -f $AWK_REPO/myScript.awk $file
or even, using shebangs and adding AWK_REPO to PATH (with export PATH=${AWK_REPO}:${PATH}):
myScript.awk $file
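For the shebang variant to work, the script itself needs an awk shebang (the path to awk may differ on your system) and execute permission, e.g.:
#!/usr/bin/awk -f
# myScript.awk: print the first field of every input line
{ print $1 }
followed by chmod +x "$AWK_REPO/myScript.awk" so it can be called by name once AWK_REPO is on PATH.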
