How to pass an array as an argument inside ARGV

I have a ruby script that takes two inputs for ARGV. The second input is an array of files. I'm having trouble iterating through the file array passed to ARGV. Here is what I have so far:
arg1, FILES = ARGV
FILES.each do |fname|
#do something with fname
end
I'm running my script from the command line like this:
ruby myScript.rb arg1 ['path/to/file1.jpg', '/path/to/file2.jpg']
And I'm getting the following error:
zsh: bad pattern: [path/to/file1.jpg,
Enclosing the array argument in single quotes like this:
ruby myScript.rb arg1 '['path/to/file1.jpg', '/path/to/file2.jpg']'
produces a different error, because Ruby receives it as a single String rather than an Array.
How can I accomplish this correctly?

Use
arg1, *FILES = ARGV
and don't place any brackets or commas during invocation:
ruby myScript.rb arg1 file1 file2 file3
EDIT: If you want to add another argument, you can do:
arg1, arg2, *FILES = ARGV
or
arg1, *FILES, arg2 = ARGV

You can't pass an array as a command-line argument. All arguments are passed as strings.
But given your code, you could just pass the arguments like this:
$ ruby myScript.rb arg1 path/to/file1.jpg /path/to/file2.jpg
And then, change your first line of code to this:
arg1, *FILES = ARGV
Afterwards, arg1 == 'arg1' and FILES == ['path/to/file1.jpg', '/path/to/file2.jpg'].
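Since the shell does the word splitting before Ruby ever runs, each space-separated word becomes one ARGV entry. A minimal sketch, with a hypothetical show_args shell function standing in for the Ruby script:

```shell
#!/bin/sh
# show_args is a hypothetical stand-in for the Ruby script:
# it prints each argument the shell delivers, 0-indexed like Ruby's ARGV
show_args() {
  i=0
  for a; do
    echo "ARGV[$i]=$a"
    i=$((i+1))
  done
}
show_args arg1 path/to/file1.jpg /path/to/file2.jpg
```

This prints ARGV[0]=arg1, ARGV[1]=path/to/file1.jpg and ARGV[2]=/path/to/file2.jpg, so `arg1, *FILES = ARGV` in Ruby captures both paths in FILES.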

You can either split manually, e.g. arg, arr = ARGV[0], ARGV[1..-1], or use the splat operator: arg, *arr = ARGV

arg1, _files = ARGV
files = eval(_files)
files.each { |f| ... }
But there are reasons to not use eval (see Is 'eval' supposed to be nasty?).
You might pass the files list as a json string and then do JSON.parse on it to be safer, e.g.:
require 'json'
arg1, _files = ARGV
files = JSON.parse(_files)
files.each { |f| ... }
#> ruby myScript.rb arg1 '["path/to/file1.jpg", "/path/to/file2.jpg"]'
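The single quotes matter here: they make the whole JSON list arrive as one ARGV entry. A quick sketch (count_args is a made-up helper, not part of the script above):

```shell
#!/bin/sh
# count_args is a hypothetical helper that shows how many arguments
# arrive and what the second one contains
count_args() {
  echo "argc=$#"
  echo "second=$2"
}
count_args arg1 '["path/to/file1.jpg", "/path/to/file2.jpg"]'
```

argc=2 confirms the JSON stayed intact as a single string, ready for JSON.parse.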

Related

How to portably use "${@:2}"?

On Allow for ${@:2} syntax in variable assignment they say I should not use "${@:2}" because it breaks across different shells, and that I should use "${*:2}" instead.
But using "${*:2}" instead of "${@:2}" makes no sense, because "${@:2}" is not equivalent to "${*:2}", as the following example shows:
#!/bin/bash
check_args() {
echo "\$#=$#"
local counter=0
for var in "$@"
do
counter=$((counter+1));
printf "$counter. '$var', ";
done
printf "\\n\\n"
}
# setting arguments
set -- "space1 notspace" "space2 notspace" "lastargument"; counter=1
echo $counter': ---------------- "$*"'; counter=$((counter+1))
check_args "$*"
echo $counter': ---------------- "${*:2}"'; counter=$((counter+1))
check_args "${*:2}"
echo $counter': ---------------- "${@:2}"'; counter=$((counter+1))
check_args "${@:2}"
-->
GNU bash, version 4.4.12(1)-release (x86_64-pc-linux-gnu)
1: ---------------- "$*"
$#=1
1. 'space1 notspace space2 notspace lastargument',
2: ---------------- "${*:2}"
$#=1
1. 'space2 notspace lastargument',
3: ---------------- "${@:2}"
$#=2
1. 'space2 notspace', 2. 'lastargument',
If I cannot use "${@:2}" (as they say), what equivalent can I use instead?
This is the original question, Process all arguments except the first one (in a bash script), and its only answer that keeps arguments with spaces together is to use "${@:2}".
There's context that's not clear in the question unless you follow the links. It's concerning the following recommendation from shellcheck.net:
local _help_text="${@:2}"
^-- SC2124 Assigning an array to a string! Assign as array, or use * instead of @ to concatenate.
Short answer: Don't assign lists of things (like arguments) to plain variables, use an array instead.
Long answer: Generally, "${@:2}" will get all but the first argument, with each treated as a separate item ("word"). "${*:2}", on the other hand, produces a single item consisting of all but the first argument stuck together, separated by a space (or whatever the first character of $IFS is).
But in the specific case where you're assigning to a plain variable, the variable is only capable of storing a single item, so var="${@:2}" also collapses the arguments down to a single item, but it does it in a less consistent way than "${*:2}". To avoid this, use something that is capable of storing multiple items: an array. So:
Really bad: var="${@:2}"
Slightly less bad: var="${*:2}"
Much better: arrayvar=("${@:2}") (the parentheses make this an array)
Note: to get the elements of the array back, with each one treated properly as a separate item, use "${arrayvar[@]}". Also, arrays are not supported by all shells (notably, dash doesn't support them), so if you use them you should be sure to use a bash shebang (#!/bin/bash or #!/usr/bin/env bash). If you really need portability to other shells, things get much more complicated.
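A short bash sketch of the difference (the variable names are illustrative):

```shell
#!/bin/bash
# "${*:2}" collapses all-but-first into one string;
# ("${@:2}") keeps one array element per original argument
set -- "first arg" "second arg" "third arg"
joined="${*:2}"
arr=("${@:2}")
echo "joined=$joined"      # joined=second arg third arg
echo "count=${#arr[@]}"    # count=2
```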
Neither ${@:2} nor ${*:2} is portable, and many shells will reject both as invalid syntax. If you want to process all arguments except the first, you should get rid of the first with a shift.
first="${1}"
shift
echo The arguments after the first are:
for x; do echo "$x"; done
At this point, the first argument is in "$first" and the positional parameters are shifted down one.
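A self-contained POSIX-sh sketch of this shift approach, showing that arguments containing spaces keep their boundaries:

```shell
#!/bin/sh
# Save the first argument, shift it away, then loop over the rest;
# "bravo charlie" stays one argument throughout
set -- alpha "bravo charlie" delta
first="$1"
shift
echo "first=$first"
echo "rest_count=$#"
for x; do echo "rest: $x"; done
```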
This demonstrates how to combine all the "${@}" arguments into a single variable without hacks like "${@:1}" or "${@:2}":
#!/bin/bash
function get_all_arguments_as_single_one_unquoted() {
single_argument="$(printf "%s " "${@}")";
printf "unquoted arguments %s: '%s'\\n" "${#}" "${single_argument}";
}
function get_all_arguments_as_single_one_quoted() {
single_argument="${1}";
printf "quoted arguments %s: '%s'\\n" "${#}" "${single_argument}";
}
function escape_arguments() {
escaped_arguments="$(printf '%q ' "${@}")";
get_all_arguments_as_single_one_quoted "${escaped_arguments}";
get_all_arguments_as_single_one_unquoted ${escaped_arguments};
}
set -- "first argument" "last argument";
escape_arguments "${@}";
-->
GNU bash, version 4.4.12(1)-release (x86_64-pc-linux-gnu)
quoted arguments 1: 'first\ argument last\ argument '
unquoted arguments 4: 'first\ argument last\ argument '
As @William Pursell's answer points out, if you would like to get only the "${@:2}" arguments, you can add a shift call before escaping "${@}":
function escape_arguments() {
shift;
escaped_arguments="$(printf '%q ' "${@}")";
get_all_arguments_as_single_one_quoted "${escaped_arguments}";
get_all_arguments_as_single_one_unquoted ${escaped_arguments};
}
-->
GNU bash, version 4.4.12(1)-release (x86_64-pc-linux-gnu)
quoted arguments 1: 'last\ argument '
unquoted arguments 2: 'last\ argument '
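The core trick in this answer is printf '%q ', which backslash-escapes each argument so spaces survive later word splitting. A minimal bash sketch:

```shell
#!/bin/bash
# %q re-quotes each argument; the result is one string that can be
# safely re-split into the original words later
escaped="$(printf '%q ' "first argument" "last argument")"
echo "$escaped"   # prints: first\ argument last\ argument (with a trailing space)
```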

Tcl library: how to use Tcl_Eval() to set the return result for a C-coded Tcl command extension?

Let's suppose I implemented a new Tcl command in C and registered it with Tcl_CreateObjCommand, and inside this C code I call Tcl_Eval to evaluate a string that creates an associative array and stores it in a variable tmp. How can I set this tmp variable, created with Tcl_Eval(), as the result object returned from the C function?
Example:
int MyCommand(
ClientData clientData,
Tcl_Interp* interp,
int argc,
char* argv[])
{
int rc = Tcl_Eval(interp,
"array set tmp [list {key1} {value1} {key2} {value2}]");
if (rc != TCL_OK) {
return rc;
}
//???
Tcl_SetObjResult(interp, ?? tmp variable from eval??);
return TCL_OK;
}
When I run the Tcl interpreter with the above C-extension, I would expect to see this result:
TCL> set x [MyCommand]
TCL> puts "$x(key1)"
value1 # Currently an error; x is not set
TCL> puts "$x(key2)"
value2 # Currently an error; x is not set
In a way the above works. Just not the way I want it to. For Example, if I type:
TCL> set x [MyCommand]
TCL> puts "$tmp(key1)"
value1 # It works! Except I didn't want to set a global variable tmp
TCL> puts "$tmp(key2)"
value2 # It works! Except I didn't want to set a global variable tmp
(Maybe it's a "feature" to set tmp instead?) Anyway, I still want it to work the correct way, returning the value through the normal "return" mechanism.
It should be legal to call Tcl_Eval() from inside the C command extension, because the documentation for the Tcl library states that it is legal to make nested calls to Tcl_Eval to evaluate other commands. I just don't know how to copy the result object from Tcl_Eval into the "return" object of the C extension procedure.
I see two problems here. You can't set the return value of a command to be the value of an array because arrays are not values. Arrays are collections of variables indexed by a string. It's a common misunderstanding. You could return the value of an element of an array. If you want a key / value map that is a proper Tcl value, consider a dictionary. Dictionaries are values and can be returned as the value of a command.
The second problem: why are you using Tcl_Eval() to create an array? It is much simpler to use Tcl_SetVar() or one of its several variations to build an array.
The recommended way to set an array (given you're working with char* values in the first place) is using calls to Tcl_SetVar2 (so named because it takes variable names as two parts).
Tcl_SetVar2(interp, "tmp", "key1", "value1", 0);
Tcl_SetVar2(interp, "tmp", "key2", "value2", 0);
Idiomatically, you'd use a name passed in as an argument to your C command implementation, so that the caller can tell you which variable to write into, and you'd want to check the results too:
int MyCommand(
ClientData clientData,
Tcl_Interp* interp,
int argc,
char* argv[])
{
// Missing: check # of arguments
if (Tcl_SetVar2(interp, argv[1], "key1", "value1", 0) == NULL)
return TCL_ERROR;
if (Tcl_SetVar2(interp, argv[1], "key2", "value2", 0) == NULL)
return TCL_ERROR;
return TCL_OK;
}
You'd then call that like this:
MyCommand x
# It has no meaningful result.
puts $x(key1)
puts $x(key2)

fragmenting data from text file into list of facts to prolog file

I want to separate the data file into a list of facts of the form functor(arg1, arg2, ..., argN), where the functor name is the uppercase line and the arguments are the lowercase lines that follow it.
The new clauses are then saved in a Prolog file created at execution time.
file.txt
FUNCTOR1
arg1
arg2
FUNCTOR2
arg1
arg2
arg3
FUNCTOR3
arg1
arg2
arg3
arg4
result :
?- split_data_to_facts('file.txt',List,'file.pl').
List = ['functor1(arg1,arg2)','functor2(arg1,arg2,arg3)','functor3(arg1,arg2,arg3,arg4)'].
file.pl
(a "." is appended at the end of each clause)
functor1(arg1,arg2).
functor2(arg1,arg2,arg3).
functor3(arg1,arg2,arg3,arg4).
after building and compiling the new prolog file file.pl:
?- functor2(X,Y,Z).
X=arg1,
Y=arg2,
Z=arg3;
yes
Let's assume a builtin read_line_to_codes/2 is available: then you could apply a lookahead of one line:
process_file(Path) :-
open(Path, read, In),
read_line_to_codes(In, Line1),
read_line_to_codes(In, Line2), % assume not empty
process_lines(Line2, [Line1], In, Facts),
maplist(writeln, Facts). % just for test
process_lines(end_of_file, LastFactDef, In, [LastFact]) :-
lines_fact(LastFactDef, LastFact),
close(In).
process_lines([U|Us], LastFactDef, In, [LastFact|Facts]) :-
upper_lower(U, _),
lines_fact(LastFactDef, LastFact),
read_line_to_codes(In, Line),
process_lines(Line, [[U|Us]], In, Facts).
process_lines(Last, Lines, In, Facts) :-
read_line_to_codes(In, Line),
process_lines(Line, [Last|Lines], In, Facts).
lines_fact(Lines, Fact) :-
reverse(Lines, [FunctorUpper|ArgCodes]),
maplist(make_lower, FunctorUpper, FunctorCodes),
maplist(atom_codes, [Functor|Args], [FunctorCodes|ArgCodes]),
Fact =.. [Functor|Args].
% if uppercase get lowercase
upper_lower(U, L) :-
between(0'A, 0'Z, U), L is 0'a + U - 0'A.
make_lower(C, L) :- upper_lower(C, L) ; L = C.
Running a test in SWI-Prolog (where read_line_to_codes/2 and between/3 are available by default):
?- process_file('/home/carlo/test/file.txt').
functor1(arg1,arg2)
functor2(arg1,arg2,arg3)
functor3(arg1,arg2,arg3,arg4)
true

use execl with some null arguments

The program reads some values from a config file: some are defined, some are not, some have the value 0, and some are active.
I have the following code:
char *arg1="", *arg1_value="", *arg2="", *arg2_value="", *arg3="", *arg3_value="", *arg4="", *arg4_value="";
//part where I read from config file
execl("./test", "test", arg1, arg1_value, arg2, arg2_value, arg3, arg3_value, arg4, arg4_value, (char*) 0);
How can I use execl so that variables that are missing or set to 0 don't affect the others? I'm sure that setting their value to "" is not the best approach.
I assume that each argX is a switch like "-xxxx", that argX_value is its corresponding value, and that you always want to pass both if argX is defined. Then you could use execv() instead of execl(), like this:
#define MAXARGS 4
char *argv[2*MAXARGS+2];
int i, argc;
argv[0] = "prog";
argc = 1;
if( arg1 && strcmp( arg1, "" ) != 0 ) {
argv[argc++] = arg1;
argv[argc++] = arg1_value;
}
// the same for arg2 to arg4
argv[argc] = NULL;
execv( "./prog", argv );
Initializing the arguments with default values and passing only the appropriate ones (by checking whether each value is still the default) to execl could be the only option we have in such a condition.
if(argv1 == defval_arg1)
execl("./prog","prog", "required args in case argv1 is missing according to syntax of ./prog");
if(argv2 == defval_arg2)
execl("./prog","prog", "required args in case argv2 is missing");
Update
If there are more arguments to check, you can simply use a loop to build the array of args to pass, by checking the argument values.
for(i=0;i<10;i++){
if (args[i]!=default_value_args[i]) {
args_2b_passed[j++]=args[i];
}
}
and then use a variant of the exec family that takes the args_2b_passed array (see man 3 exec for more help).

How to create a TCL function with optional arguments using SWIG?

I have a simple C/C++ app that has an optional Tcl interpreter, with the function wrappers generated using SWIG. For several of the functions all the arguments are optional. How is this typically handled? I'd like to support a Tcl command like this, where any of the arguments is optional but the C function takes fixed arguments:
//TCL command
get_list [options] filename
-opt1
-opt2
-opt3 arg1
-opt4 arg2
filename
//C function
static signed get_list(bool opt1,
bool opt2,
char * arg1,
objectType * arg2,
char * fileName)
Currently I have something like this:
static pList * get_list(char * arg1=NULL,
char * arg2=NULL,
char * arg3=NULL,
tObject * arg4=NULL)
This has many problems such as enforcing the object pointer is always the last argument. The SWIG documentation talks at length about C functions with variable arguments using "..." but I don't think this is what I need. I'd like the C function arguments to be fixed.
The easiest method is to wrap a Tcl procedure around the outside, like this:
rename get_list original.get_list
proc get_list args {
if {[llength $args] == 0 || [llength $args] % 2 == 0} {
error "wrong # args: ..."; # Do a proper error message here!
}
# Handle the required argument
set filename [lindex $args end]
# Initialize the defaults
array set opts {
-opt1 false
-opt2 false
-opt3 ""
-opt4 ""
}
# Merge in the supplied options
foreach {opt val} [lrange $args 0 end-1] {
if {![info exists opts($opt)]} {
error "unrecognized option \"$opt\""
}
set opts($opt) $val
}
# Hand off to C level...
original.get_list $opts(-opt1) $opts(-opt2) $opts(-opt3) $opts(-opt4) $filename
}
If you've got Tcl 8.6, that last handoff is best done with tailcall so the rewriting code is cut out of the Tcl stack. It's not vital though, as SWIGged code rarely resolves names of Tcl commands and variables.
