How to *not* expand "~" in tcsh backtick/eval wrapper?

I'm not interested in a "why tcsh is evil" response; this question requires tcsh for its use. If this is a bug in tcsh, please just let me know that.
I'm generally pretty good at quoting output but I've been stumped by a use case I have to deal with. Here's the simplified case:
I have an executable that I eval the output of. Sometimes it needs to print output (to stdout) so I have tried using echo (builtin), /bin/echo and /bin/printf.
As a simple example, say my executable is called "simpleWrap" and it outputs:
echo "hello world"
So I run (eventually using an alias, but that's irrelevant here):
eval `simpleWrap`
And I get, as expected:
hello world
But here's the problem. Sometimes I need a tilde in the output. So let's try some examples. We'll put tildes in the simpleWrap output (this is not the contents of the script, but what it outputs):
echo "These are tildes: ~ and \~ and \\~"
And surprisingly when I eval the output of simpleWrap now, I get:
These are tildes: /home/dave and \~ and \~
Either the ~ is expanded to my home directory, or the \ protects it, but I can't get rid of the backslash.
How can I just print a '~' in the output of an eval with backticks?
I believe the backticks are forcing the ~ expansion, but they aren't doing it in a consistent way. For example, if I skip the backticks and just do:
eval echo "These are tildes: ~ and \~ and \\~"
Then I get a consistent, expected output:
These are tildes: /home/dave and ~ and \~
Is there something wrong with substitution in backticks, or am I missing the proper quoting possibility? (I've also tried wrapping in single and double quotes to no avail)

The difference arises from the order in which shell substitutions take place, and from the different behavior of \ when it is inside double quotes as opposed to outside them.
To demonstrate the behavior of \:
echo \~
=> ~
echo "\~"
=> \~
echo "\\~"
=> \~
Scenario 1
eval `simpleWrap`
Here, "simpleWrap" outputs a "raw" string which includes tildes. The string is then passed unchanged (no substitutions) to the eval command (because that's how backticks work), which basically runs a new shell. So the new shell sees this command line:
echo "These are tildes: ~ and \~ and \\~"
(Note that the quotes are seen by the eval command).
The output of this command is the one you didn't expect, i.e. both \~ (call it A) and \\~ (call it B) produce the same output. Why?
First of all, neither A nor B is substituted, because tilde substitution only happens if the tilde is the first character in its word, which isn't the case for either.
Now, for A, since \~ is not a known escape sequence (unlike \n for example), the shell leaves it as is, producing \~.
For B, \\ is a known escape sequence, so the shell correctly interprets it as \, then appends the rest of the string to it, producing \~ again.
Scenario 2
eval echo "These are tildes: ~ and \~ and \\~"
Here, there's only one command line, which includes the tildes. First, the shell substitutions run, which affect neither A nor B, but at this stage the quotes are dropped (they are used for grouping words into a single argument, and are not passed to the command). Then eval runs (echo hasn't run yet) and is passed this as input: echo These are tildes: ~ and \~ and \\~.
Note that the quotes are gone. The output of the command without the quotes is what you expect (see the demonstration above):
echo These are tildes: /home/dave and \~ and \\~
=> These are tildes: /home/dave and ~ and \~
How to make it work?
Drop the quotes!
simpleWrap should print this:
echo These are tildes: ~ and \~ and \\~
instead of this:
echo "These are tildes: ~ and \~ and \\~"

Related

shell send args to a C program with spaces [duplicate]

This question already has answers here:
How can I store a command in a variable in a shell script?
(12 answers)
Closed 4 years ago.
These work as advertised:
grep -ir 'hello world' .
grep -ir hello\ world .
These don't:
argumentString1="-ir 'hello world'"
argumentString2="-ir hello\\ world"
grep $argumentString1 .
grep $argumentString2 .
Despite 'hello world' being enclosed by quotes in the second example, grep interprets 'hello (and hello\) as one argument and world' (and world) as another, which means that, in this case, 'hello will be the search pattern and world' will be the search path.
Again, this only happens when the arguments are expanded from the argumentString variables. grep properly interprets 'hello world' (and hello\ world) as a single argument in the first example.
Can anyone explain why this is? Is there a proper way to expand a string variable that will preserve the syntax of each character such that it is correctly interpreted by shell commands?
Why
When the string is expanded, it is split into words, but it is not re-evaluated to find special characters such as quotes or dollar signs or ... This is the way the shell has 'always' behaved, since the Bourne shell back in 1978 or thereabouts.
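You can see this directly (printf here just brackets each argument it receives):

$ var="'hello world'"
$ printf '[%s]\n' $var
['hello]
[world']

The value is split on the space, but the embedded single quotes stay literal.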
Fix
In bash, use an array to hold the arguments:
argumentArray=(-ir 'hello world')
grep "${argumentArray[#]}" .
Or, if brave/foolhardy, use eval:
argumentString="-ir 'hello world'"
eval "grep $argumentString ."
On the other hand, discretion is often the better part of valour, and working with eval is a place where discretion is better than bravery. If you are not completely in control of the string that is eval'd (if there's any user input in the command string that has not been rigorously validated), then you are opening yourself to potentially serious problems.
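A contrived illustration of the risk, where the injected echo stands in for something far nastier:

argumentString="-ir 'hello world' /dev/null; echo INJECTED"
eval "grep $argumentString ."
# eval sees: grep -ir 'hello world' /dev/null; echo INJECTED .
# ...so the grep runs, and then so does the injected echo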
Note that the sequence of expansions for Bash is described in Shell Expansions in the GNU Bash manual. Note in particular sections 3.5.3 Shell Parameter Expansion, 3.5.7 Word Splitting, and 3.5.9 Quote Removal.
When you put quote characters into variables, they just become plain literals (see http://mywiki.wooledge.org/BashFAQ/050; thanks @tripleee for pointing out this link).
Instead, try using an array to pass your arguments:
argumentString=(-ir 'hello world')
grep "${argumentString[#]}" .
In looking at this and related questions, I'm surprised that no one brought up using an explicit subshell. For bash, and other modern shells, you can execute a command line explicitly. In bash, it requires the -c option.
argumentString="-ir 'hello world'"
bash -c "grep $argumentString ."
This works exactly as the original questioner desired. There are two restrictions to this technique:
You can only use single quotes within the command or argument strings.
Only exported environment variables will be available to the command.
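The second restriction is easy to demonstrate (the variable name here is made up):

$ pattern='hello world'
$ bash -c 'echo "[$pattern]"'
[]
$ export pattern
$ bash -c 'echo "[$pattern]"'
[hello world]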
Also, this technique handles redirection, piping, and other shellisms. You can use bash builtins as well as any other command that works at the command line, because you are essentially asking a subshell bash to interpret the string directly as a command line. Here's a more complex example, a somewhat gratuitously complex ls variant.
cmd="prefix=`pwd` && ls | xargs -n 1 echo \'In $prefix:\'"
bash -c "$cmd"
I have built command processors both this way and with parameter arrays. Generally, this way is much easier to write and debug, and it's trivial to echo the command you are executing. OTOH, param arrays work nicely when you really do have abstract arrays of parameters, as opposed to just wanting a simple command variant.

Parameter expansion for find command

Consider the code (the variable $i is there because it was in a loop, adding several conditions to the pattern, e.g. *.a and *.b, and so on, but to illustrate this problem one wildcard pattern is enough):
#!/bin/bash
i="a"
PATTERN="-name bar -or -name *.$i"
find . \( $PATTERN \)
If run in a folder containing the files bar and foo.a, it works, outputting:
./foo.a
./bar
But if you now add a new file to the folder, namely zoo.a, then it no longer works:
find: paths must precede expression: zoo.a
Presumably, because the wildcard in *.$i gets expanded by the shell to foo.a zoo.a, which leads to an invalid find command pattern. So one attempt at a fix is to put quotes around the wildcard pattern. Except it does not work:
with single quotes -- PATTERN="-name bar -or -name '*.$i'" the find command outputs only bar. Escaping the single quotes (\') yields the same result.
idem with double quotes: PATTERN="-name bar -or -name \"*.$i\"" -- only bar is returned.
in the find command, if $PATTERN is replaced with "$PATTERN", out comes an error (for single quotes same error, but with single quotes around the wildcard pattern):
find: unknown predicate `-name bar -or -name "*.a"'
Of course, replacing $PATTERN with '$PATTERN' also does not work... (no expansion whatsoever takes place).
The only way I could get it to work was to use... eval!
FINDSTR="find . \( $PATTERN \)"
eval $FINDSTR
This works properly:
./zoo.a
./foo.a
./bar
Now after a lot of googling, I saw it mentioned several times that to do this kind of thing, one should use arrays. But this doesn't work:
i="a"
PATTERN=( -name bar -or -name '*.$i' )
find . \( "${PATTERN[#]}" \)
# result: ./bar
In the find line the array has to be enclosed in double quotes, because we want it to be expanded. But single quotes around the wildcard expression don't work, and neither does omitting the quotes altogether:
i="a"
PATTERN=( -name bar -or -name *.$i )
find . \( "${PATTERN[#]}" \)
# result: find: paths must precede expression: zoo.a
BUT DOUBLE QUOTES DO WORK!!
i="a"
PATTERN=( -name bar -or -name "*.$i" )
find . \( "${PATTERN[#]}" \)
# result:
# ./zoo.a
# ./foo.a
# ./bar
So I guess my question are actually two questions:
a) in this last example using arrays, why are double quotes required around the *.$i?
b) using an array in this way is supposed to expand «to all elements individually quoted». How would one do this with a variable (cf. my first attempt)? After getting this to function, I went back and tried using a variable again, with backslashed single quotes, or \\', but nothing worked (I just got bar). What would I have to do to emulate "by hand", as it were, the quoting done when using arrays?
Thank you in advance for your help.
Required reading:
BashFAQ — I'm trying to put a command in a variable, but the complex cases always fail!
a) in this last example using arrays, why are double quotes required around the *.$i?
You need to use some form of quoting to prevent the shell from performing glob expansion on *. Variables are not expanded in single quotes so '*.$i' doesn't work. It does inhibit glob expansion but it also stops variable expansion. "*.$i" inhibits glob expansion but allows variable expansion, which is perfect.
To really delve into the details, there are two things you need to do here:
Escape or quote * to prevent glob expansion.
Treat $i as a variable expansion, but quote it to prevent word splitting and glob expansion.
Any form of quoting will do for item 1: \*, "*", '*', and $'*' are all acceptable ways to ensure it's treated as a literal asterisk.
For item 2, double quoting is the only answer. A bare $i is subject to word splitting and globbing -- if you have i='foo bar' or i='foo*' the whitespace and globs will cause problems. \$i and '$i' both treat the dollar sign literally, so they're out.
"$i" is the only quoting that does everything right. It's why common shell advice is to always double quote variable expansions.
The end result is, any of the following would work:
"*.$i"
\*."$i"
'*'."$i"
"*"."$i"
'*.'"$i"
Clearly, the first is the simplest.
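To see why the unquoted form misbehaves, compare (run in a directory where *.foo matches nothing, so the unmatched glob is left as-is):

$ i='foo bar'
$ printf '[%s]\n' *.$i
[*.foo]
[bar]
$ printf '[%s]\n' "*.$i"
[*.foo bar]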
b) using an array in this way is supposed to expand «to all elements individually quoted». How would one do this with a variable (cf. my first attempt)? After getting this to function, I went back and tried using a variable again, with backslashed single quotes, or \\', but nothing worked (I just got bar). What would I have to do to emulate "by hand", as it were, the quoting done when using arrays?
You'd have to cobble together something with eval, but that's dangerous. Fundamentally, arrays are more powerful than simple string variables. There's no magic combination of quotes and backslashes that will let you do what an array can do. Arrays are the right tool for the job.
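Since the question mentions building the pattern up in a loop, here is a sketch of that with an array (the extension list is invented):

exts=(a b c)
PATTERN=( -name bar )
for i in "${exts[@]}"; do
    PATTERN+=( -or -name "*.$i" )
done
find . \( "${PATTERN[@]}" \)

Each element keeps its integrity, no matter what whitespace or glob characters it contains.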
Could you explain in a little more detail why ... PATTERN="-name bar -or -name \"*.$i\"" does not work? The quoted double quotes should, when the find command is actually run, expand the $i but not the glob.
Sure. Let's say we write:
i=a
PATTERN="-name bar -or -name \"*.$i\""
find . \( $PATTERN \)
After the first two lines run, what is the value of $PATTERN? Let's check:
$ i=a
$ PATTERN="-name bar -or -name \"*.$i\""
$ printf '%s\n' "$PATTERN"
-name bar -or -name "*.a"
You'll notice that $i has already been replaced with a, and the backslashes have been removed.
Now let's see how exactly the find command is parsed. In the last line $PATTERN is unquoted because we want all the words to be split apart, right? When you write a bare variable name, Bash ends up performing an implied split+glob operation: word splitting followed by glob expansion. What does that mean, exactly?
Let's take a look at how Bash performs command-line expansion. In the Bash man page under the "Expansion" section we can see the order of operations:
Brace expansion
Tilde expansion, parameter and variable expansion, arithmetic expansion, command substitution, and process substitution
Word splitting
Pathname (AKA glob) expansion
Quote removal
Let's run through these operations by hand and see how find . \( $PATTERN \) is parsed. The end result will be a list of strings, so I'll use a JSON-like syntax to show each stage. We'll start with a list containing a single string:
['find . \( $PATTERN \)']
As a preliminary step, the command-line as a whole is subject to word splitting.
['find', '.', '\(', '$PATTERN', '\)']
Brace expansion -- No change.
Variable expansion
['find', '.', '\(', '-name bar -or -name "*.a"', '\)']
$PATTERN is replaced. For the moment it is all a single string, whitespace and all.
Word splitting
['find', '.', '\(', '-name', 'bar', '-or', '-name', '"*.a"', '\)']
The shell scans the results of variable expansion that did not occur within double quotes for word splitting. $PATTERN was unquoted, so it's expanded. Now it is a bunch of individual words. So far so good.
Glob expansion
['find', '.', '\(', '-name', 'bar', '-or', '-name', '"*.a"', '\)']
Bash scans the results of word splitting for globs. Not the entire command-line, just the tokens -name, bar, -or, -name, and "*.a".
It looks like nothing happened, yes? Not so fast! Looks can be deceiving. Bash actually performed glob expansion. It just happened that the glob didn't match anything. But it could...†
Quote removal
['find', '.', '(', '-name', 'bar', '-or', '-name', '"*.a"', ')']
The backslashes are gone. But the double quotes are still there.
After the preceding expansions, all unquoted occurrences of the characters \, ', and " that did not result from one of the above expansions are removed.
And that's the end result. The double quotes are still there, so instead of searching for files named *.a it searches for ones named "*.a" with literal double quotes characters in their name. That search is bound to fail.
Adding a pair of escaped quotes \" didn't at all do what we wanted. The quotes didn't disappear like they were supposed to and broke the search. Not only that, but they also didn't inhibit globbing like they should have.
TL;DR — Quotes inside a variable aren't parsed the same way as quotes outside a variable.
† The first four tokens have no special characters. But the last one, "*.a", does. That asterisk is a wildcard. If you read the "pathname expansion" section of the man page carefully you'll see that there's no mention of quotes being ignored. The double quotes do not protect the asterisk.
Hang on! What? I thought quotes inhibit glob expansion!
They do—normally. If you write quotes out by hand they do indeed stop glob expansion. But if you put them inside an unquoted variable, they don't.
$ touch 'foobar' '"foobar"'
$ ls
foobar "foobar"
$ ls foo*
foobar
$ ls "foo*"
ls: foo*: No such file or directory
$ var="\"foo*\""
$ echo "$var"
"foo*"
$ ls $var
"foobar"
Read that over carefully. If we create a file named "foobar"—that is, it has literal double quotes in its filename—then ls $var prints "foobar". The glob is expanded and matches the (admittedly contrived) filename!
Why didn't the quotes help? Well, the explanation is subtle, and tricky. The man page says:
After word splitting ... bash scans each word for the characters *, ?, and [.
Any time Bash performs word splitting it also expands globs. Remember how I said unquoted variables are subject to an implied split+glob operator? This is what I meant. Splitting and globbing go hand in hand.
If you write ls "foo*" the quotes prevent foo* from being subject to splitting and globbing. However if you write ls $var then $var is expanded, split, and globbed. It wasn't surrounded by double quotes. It doesn't matter that it contains double quotes. By the time those double quotes show up it's too late. Word splitting has already been performed, and so globbing is done as well.
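The takeaway in code form, continuing the same example: store the word in an array and quote the expansion, so the quoting happens at expansion time rather than being baked into the data:

$ files=('foo*')
$ ls "${files[@]}"
ls: foo*: No such file or directory

The pattern is protected, because the double quotes around ${files[@]} are real shell syntax, not literal characters inside a string.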

What is the difference between bash arrays with the notation ${array[*]} and ${array[@]} [duplicate]

I'm taking a stab at writing a bash completion for the first time, and I'm a bit confused about the two ways of dereferencing bash arrays (${array[@]} and ${array[*]}).
Here's the relevant chunk of code (it works, but I would like to understand it better):
_switch()
{
    local cur perls
    local ROOT=${PERLBREW_ROOT:-$HOME/perl5/perlbrew}
    COMPREPLY=()
    cur=${COMP_WORDS[COMP_CWORD]}
    perls=($ROOT/perls/perl-*)
    # remove all but the final part of the name
    perls=(${perls[*]##*/})
    COMPREPLY=( $( compgen -W "${perls[*]} /usr/bin/perl" -- ${cur} ) )
}
bash's documentation says:
Any element of an array may be referenced using ${name[subscript]}. The braces are required to avoid conflicts with the shell's filename expansion operators. If the subscript is ‘@’ or ‘*’, the word expands to all members of the array name. These subscripts differ only when the word appears within double quotes. If the word is double-quoted, ${name[*]} expands to a single word with the value of each array member separated by the first character of the IFS variable, and ${name[@]} expands each element of name to a separate word.
Now I think I understand that compgen -W expects a string containing a wordlist of possible alternatives, but in this context I don't understand what "${name[@]} expands each element of name to a separate word" means.
Long story short: ${array[*]} works; ${array[@]} doesn't. I would like to know why, and I would like to understand better what exactly ${array[@]} expands into.
(This is an expansion of my comment on Kaleb Pederson's answer -- see that answer for a more general treatment of [@] vs [*].)
When bash (or any similar shell) parses a command line, it splits it into a series of "words" (which I will call "shell-words" to avoid confusion later). Generally, shell-words are separated by spaces (or other whitespace), but spaces can be included in a shell-word by escaping or quoting them. The difference between [@] and [*]-expanded arrays in double-quotes is that "${myarray[@]}" leads to each element of the array being treated as a separate shell-word, while "${myarray[*]}" results in a single shell-word with all of the elements of the array separated by spaces (or whatever the first character of IFS is).
Usually, the [@] behavior is what you want. Suppose we have perls=(perl-one perl-two) and use ls "${perls[*]}" -- that's equivalent to ls "perl-one perl-two", which will look for a single file named perl-one perl-two, which is probably not what you wanted. ls "${perls[@]}" is equivalent to ls "perl-one" "perl-two", which is much more likely to do something useful.
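Concretely (a quick check in a directory where neither file exists; the exact ls error text varies by platform):

$ perls=(perl-one perl-two)
$ ls "${perls[*]}"
ls: cannot access 'perl-one perl-two': No such file or directory
$ ls "${perls[@]}"
ls: cannot access 'perl-one': No such file or directory
ls: cannot access 'perl-two': No such file or directory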
Providing a list of completion words (which I will call comp-words to avoid confusion with shell-words) to compgen is different; the -W option takes a list of comp-words, but it must be in the form of a single shell-word with the comp-words separated by spaces. Note that command options that take arguments always (at least as far as I know) take a single shell-word -- otherwise there'd be no way to tell when the arguments to the option end, and the regular command arguments (/other option flags) begin.
In more detail:
perls=(perl-one perl-two)
compgen -W "${perls[*]} /usr/bin/perl" -- ${cur}
is equivalent to:
compgen -W "perl-one perl-two /usr/bin/perl" -- ${cur}
...which does what you want. On the other hand,
perls=(perl-one perl-two)
compgen -W "${perls[#]} /usr/bin/perl" -- ${cur}
is equivalent to:
compgen -W "perl-one" "perl-two /usr/bin/perl" -- ${cur}
...which is complete nonsense: "perl-one" is the only comp-word attached to the -W flag, and the first real argument -- which compgen will take as the string to be completed -- is "perl-two /usr/bin/perl". I'd expect compgen to complain that it's been given extra arguments ("--" and whatever's in $cur), but apparently it just ignores them.
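You can make the shell-word boundaries visible with printf (each bracketed line below is one shell-word):

$ perls=(perl-one perl-two)
$ printf '[%s]\n' "${perls[*]} /usr/bin/perl"
[perl-one perl-two /usr/bin/perl]
$ printf '[%s]\n' "${perls[@]} /usr/bin/perl"
[perl-one]
[perl-two /usr/bin/perl]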
Your title asks about ${array[@]} versus ${array[*]} (both within {}) but then you ask about $array[*] versus $array[@] (both without {}), which is a bit confusing. I'll answer both (within {}):
When you quote an array variable and use @ as a subscript, each element of the array is expanded to its full content regardless of whitespace (actually, one of $IFS) that may be present within that content. When you use the asterisk (*) as the subscript (regardless of whether it's quoted or not) it may expand to new content created by breaking up each array element's content at $IFS.
Here's the example script:
#!/bin/bash
myarray[0]="one"
myarray[1]="two"
myarray[3]="three four"

echo "with quotes around myarray[*]"
for x in "${myarray[*]}"; do
    echo "ARG[*]: '$x'"
done

echo "with quotes around myarray[@]"
for x in "${myarray[@]}"; do
    echo "ARG[@]: '$x'"
done

echo "without quotes around myarray[*]"
for x in ${myarray[*]}; do
    echo "ARG[*]: '$x'"
done

echo "without quotes around myarray[@]"
for x in ${myarray[@]}; do
    echo "ARG[@]: '$x'"
done
And here's its output:
with quotes around myarray[*]
ARG[*]: 'one two three four'
with quotes around myarray[@]
ARG[@]: 'one'
ARG[@]: 'two'
ARG[@]: 'three four'
without quotes around myarray[*]
ARG[*]: 'one'
ARG[*]: 'two'
ARG[*]: 'three'
ARG[*]: 'four'
without quotes around myarray[@]
ARG[@]: 'one'
ARG[@]: 'two'
ARG[@]: 'three'
ARG[@]: 'four'
I personally usually want "${myarray[@]}". Now, to answer the second part of your question, ${array[@]} versus $array[@].
Quoting the bash docs, which you quoted:
The braces are required to avoid conflicts with the shell's filename expansion operators.
$ myarray=
$ myarray[0]="one"
$ myarray[1]="two"
$ echo ${myarray[@]}
one two
But, when you do $myarray[@], the dollar sign is tightly bound to myarray so it is evaluated before the [@]. For example:
$ ls $myarray[@]
ls: cannot access one[@]: No such file or directory
But, as noted in the documentation, the brackets are for filename expansion, so let's try this:
$ touch one@
$ ls $myarray[@]
one@
Now we can see that the filename expansion happened after the $myarray expansion.
And one more note, $myarray without a subscript expands to the first value of the array:
$ myarray[0]="one four"
$ echo $myarray[5]
one four[5]

Eval madness in ksh

Man I hate eval...
I'm stuck with this ksh, and it has to be this way.
There's this function I need, which receives a variable name and a value. It does some things with the contents of that variable and the value, and then has to update the variable that was passed in. Sort of:
REPORT="a text where TADA is wrong.."
setOutputReport REPORT "this"
echo $REPORT
a text where this is wrong..
Where the function would be something like
function setOutputReport {
    eval local currentReport=\$$1
    local reportVar=$1
    local varValue=$2
    newReport=$(echo "$currentReport"|sed -e 's/TADA/$varValue')
    # here be dragons
    eval "$reportVar=\"$newReport\""
}
I had this headache before; I never manage to get this eval right at first. Important here: the REPORT var may contain multiple lines (\n's). This might be important, as one of my attempts managed to correctly replace the contents of the variable with the first line only :/
thanks.
One risk, not with eval but with the "varValue" as the replacement in the sed command: if varValue contains a slash, the sed command will break
local varValue=$(printf "%s\n" "$2" | sed 's:/:\\/:g')
local newReport=$(echo "$currentReport"|sed -e "s/TADA/$varValue/")
If your printf has the %q specifier, that will add a layer of security -- %q escapes things like quotes, backticks and dollar signs, and also escaped chars like newline and tab:
eval "$(printf "%s=%q" "$reportVar" "$newReport")"
Here's an example of what %q does (this is bash, I hope your version of ksh corresponds):
$ y='a `string` "with $quotes"
and multiple
lines'
$ printf "%s=%q\n" x "$y"
x=$'a `string` "with $quotes"\nand multiple\nlines'
$ eval "$(printf "%s=%q" x "$y")"
$ echo "$x"
a `string` "with $quotes"
and multiple
lines
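Putting the pieces together, a corrected version of the original function might look like this (a sketch, assuming ksh93, where typeset inside a function {...} definition gives local variables and printf supports %q):

function setOutputReport {
    typeset reportVar=$1
    typeset varValue currentReport newReport
    # escape slashes so the value is safe as sed replacement text
    varValue=$(printf '%s\n' "$2" | sed 's:/:\\/:g')
    # fetch the current contents of the named variable
    eval currentReport=\$$reportVar
    newReport=$(printf '%s\n' "$currentReport" | sed "s/TADA/$varValue/")
    # %q re-quotes the new value so the assignment survives eval intact
    eval "$(printf '%s=%q' "$reportVar" "$newReport")"
}

REPORT="a text where TADA is wrong.."
setOutputReport REPORT "this"
echo "$REPORT"
# a text where this is wrong..

(ksh93 also offers namerefs via typeset -n, which would avoid the evals entirely.)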
