csh color prompt via eval

My goal is to set a prompt in the csh shell via eval.
I've created a Tcl script that prints a command (in real life it does much more):
puts "set prompt=\"%m %{\\033\[1;31m%}red prompt%{\\033\[0m%} %~ >\""
Then, in csh, I defined an alias:
alias red_prompt 'eval `/usr/bin/tclsh test_prompt.tcl`'
When I use that alias, I get an error:
> red_prompt
Missing ].
I tried playing with backslashes, but it did not help.
How can I make it work?

You don't really need an external command; you can use %{...%} to output escape sequences:
set prompt = '%{\033[1;31m%}RED%{\033[0m%}%% '
Relevant manpage entry:
%{string%}
Includes string as a literal escape sequence. It should be
used only to change terminal attributes and should not move
the cursor location. This cannot be the last sequence in
prompt.
There are also some shortcuts for standout, bold, and underline (but not colours):
%S (%s)
Start (stop) standout mode.
%B (%b)
Start (stop) boldfacing mode.
%U (%u)
Start (stop) underline mode.

Related

add_history problem while trying to make a minishell [duplicate]

I'm using the readline library in C to create a bash-like prompt. When I tried to make the prompt colorful with color sequences like these, the coloring works great, but the cursor spacing is messed up: the input wraps around too early, and it wraps onto the same line, so it starts overwriting the prompt. I thought I should escape the color sequences with \[ and \] like
readline("\[\e[1;31m$\e[0m\] ")
But that prints the square brackets, and if I escape the backslashes it prints those too. How do I escape the color codes so the cursor still works?
The way to tell readline that a character sequence in a prompt string doesn't actually move the cursor when output to the screen is to surround it with the markers RL_PROMPT_START_IGNORE (currently, this is the character literal '\001' in readline's C header file) and RL_PROMPT_END_IGNORE (currently '\002').
And as @Joachim and @Alter said, use '\033' instead of '\e' for portability.
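For reference, a minimal C sketch of that approach (this assumes GNU readline is installed and linked with -lreadline; \001 and \002 are the values of RL_PROMPT_START_IGNORE and RL_PROMPT_END_IGNORE in readline.h):
/* Wrap the non-printing colour codes in \001 ... \002 so readline does not
 * count them when computing the cursor position. */
#include <stdio.h>
#include <stdlib.h>
#include <readline/readline.h>

int main(void)
{
    /* \033[1;31m = red, \033[0m = reset; each is bracketed by \001/\002 */
    const char *prompt = "\001\033[1;31m\002$\001\033[0m\002 ";
    char *line;

    while ((line = readline(prompt)) != NULL) {
        printf("you said: %s\n", line);
        free(line);   /* readline() returns malloc'd memory */
    }
    return 0;
}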
I found this question when looking to refine the GNU readline prompt in a bash script. Like readline in C code, \[ and \] aren't special but \001 and \002 will work when given literally via the special treatment bash affords quoted words of the form $'string'. I've been here before (and left unsatisfied due to not knowing to combine it with $'…'), so I figured I'd leave my explanation here now that I have a solution.
Using the information provided here, I arrived at this result:
C1=$'\001\033[1;34m\002' # blue - from \e[1;34m
C0=$'\001\033[0;0m\002' # reset - from \e[0;0m
while read -p "${C1}myshell>$C0 " -e command; do
echo "you said: $command"
done
This gives a blue prompt that says myshell> and has a trailing space, without colors for the actual command. Neither hitting Up nor entering a command that wraps to the next line will be confused by the non-printing characters.
As explained in the accepted answer, \001 (Start of Heading) and \002 (Start of Text) are the RL_PROMPT_START_IGNORE and RL_PROMPT_END_IGNORE markers, which tell bash and readline not to count anything between them for the purpose of painting the terminal. (Also found here: \033 is more reliable than \e and since I'm now using octal codes anyway, I might as well use one more.)
There seems to be quite the dearth of documentation on this; the best I could find was in perl's documentation for Term::ReadLine::Gnu, which says:
PROMPT may include some escape sequences. Use RL_PROMPT_START_IGNORE to begin a sequence of non-printing characters, and RL_PROMPT_END_IGNORE to end the sequence.

How to safely pass an arbitrary text as parameter to a program in a shell script?

I'm writing a GUI application for character recognition that uses Tesseract. I want to allow the user to specify a custom shell command to be executed with /bin/sh -c when the text is ready.
The problem is the recognized text can contain literally anything, for example && rm -rf some_dir.
My first thought was to make it like in many other programs, where
the user can type the command in a text entry, and then special strings (like in printf()) in the command are replaced by the appropriate data (in my case, it might be %t). Then the whole string is passed to execvp(). For example, here is a screenshot from qBittorrent:
The problem is that even if I properly escape the text before replacing %t, nothing prevents the user from adding extra quotes around the specifier:
echo '%t' >> history.txt
So the full command to be executed is:
echo ''&& rm -rf some_dir'' >> history.txt
Obviously, that's a bad idea.
The second option is to only let the user choose an executable (with a file selection dialog), so I can manually put the text from Tesseract as argv[1] for execvp(). The idea is that the executable can be a script where users can put anything they want and access the text with "$1". That way, command injection is not possible (I think). Here's an example script a user can create:
#!/bin/sh
echo "$1" >> history.txt
Are there any pitfalls with this approach? Or maybe there's a better way to safely pass arbitrary text as a parameter to a program in a shell script?
In-Band: Escaping Arbitrary Data In An Unquoted Context
Don't do this. See the "Out-Of-Band" section below.
To make an arbitrary C string (containing no NULs) evaluate to itself when used in an unquoted context in a strictly POSIX-compliant shell, you can use the following steps:
Prepend a ' (moving from the required initial unquoted context to a single-quoted context).
Replace each literal ' within the data with the string '"'"'. These characters work as follows:
' closes the initial single-quoted context.
" enters a double-quoted context.
' is, in a double-quoted context, literal.
" closes the double-quoted context.
' re-enters single-quoted context.
Append a ' (returning to the required initial single-quoted context).
This works correctly in a POSIX-compliant shell because the only character that is not literal inside of a single-quoted context is '; even backslashes are parsed as literal in that context.
However, this only works correctly when sigils are used only in an unquoted context (thus putting onus on your users to get things right), and when a shell is strictly POSIX-compliant. Also, in a worst-case scenario, you can have the string generated by this transform be up to 5x longer than the original; one thus needs to be cautious around how the memory used for the transform is allocated.
(One might ask why '"'"' is advised instead of '\''; this is because backslashes change their meaning when used inside legacy backtick command substitution syntax, so the longer form is more robust).
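Purely as an illustration of the transform described above (and, again, the out-of-band approaches below are preferred), here is a minimal C sketch; the function name and the caller-supplied output buffer are assumptions made for the example:
/* Apply the single-quote transform: prepend ', replace each ' with '"'"',
 * append '. Each embedded quote grows to five characters, so the caller
 * must size `out` for roughly 5x the input plus a few bytes. */
void shell_single_quote(const char *in, char *out)
{
    *out++ = '\'';                          /* enter single-quoted context */
    for (; *in != '\0'; in++) {
        if (*in == '\'') {
            const char *rep = "'\"'\"'";    /* close ', quoted literal ', reopen ' */
            while (*rep != '\0')
                *out++ = *rep++;
        } else {
            *out++ = *in;                   /* everything else is literal inside '...' */
        }
    }
    *out++ = '\'';                          /* close the single-quoted context */
    *out = '\0';
}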
Out-Of-Band: Environment Variables, Or Command-Line Arguments
Data should only be passed out-of-band from code, such that it's never run through the parser at all. When invoking a shell, there are two straightforward ways to do this (other than using files): Environment variables, and command-line arguments.
In both of the below mechanisms, only the user_provided_shell_script need be trusted (though this also requires that it be trusted not to introduce new or additional vulnerabilities; invoking eval or any moral equivalent thereto voids all guarantees, but that's the user's problem, not yours).
Using Environment Variables
Excluding error handling (if setenv() returns a nonzero result, this should be treated as an error, and perror() or similar should be used to report to the user), this will look like:
setenv("torrent_name", torrent_name_str, 1);
setenv("torrent_category", torrent_category_str, 1);
setenv("save_path", path_str, 1);
# shell script should use "$torrent_name", etc
system(user_provided_shell_script);
A few notes:
While values can be arbitrary C strings, it's important that the variable names be restricted -- either hardcoded constants as above, or prefixed with a constant (lowercase 7-bit ASCII) string and tested to contain only characters which are permissible in shell variable names; a sketch of such a check follows these notes. (A lowercase prefix is advised because POSIX-compliant shells use only all-caps names for variables that modify their own behavior; see the POSIX spec on environment variables, particularly the note that "The name space of environment variable names containing lowercase letters is reserved for applications. Applications can define any environment variables with names from this name space without modifying the behavior of the standard utilities").
Environment space is a limited resource; on modern Linux, the maximum combined storage for both environment variables and command-line arguments is typically on the scale of 128kb; thus, setting large environment variables will cause execve()-family calls with large command lines to fail. Validating that length is within reasonable domain-specific limits is wise.
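A rough sketch of the kind of name check mentioned in the first note (the "ocrapp_" prefix and the function name are made-up examples):
/* Accept only names consisting of a fixed lowercase prefix followed by
 * characters that are valid in shell variable names (letters, digits, '_'). */
#include <ctype.h>
#include <stdbool.h>
#include <string.h>

static bool valid_user_env_name(const char *name)
{
    static const char prefix[] = "ocrapp_";
    size_t plen = sizeof(prefix) - 1;

    if (strncmp(name, prefix, plen) != 0)
        return false;
    for (const char *p = name + plen; *p != '\0'; p++) {
        if (!isalnum((unsigned char)*p) && *p != '_')
            return false;
    }
    return true;
}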
Using Command-Line Arguments:
This version requires an explicit API, such that the user configuring the trigger command knows which value will be passed in $1, which will be passed in $2, etc.
/* You'll need to do the usual fork() before this, and the usual waitpid() after
* if you want to let it complete before proceeding.
* Lots of Q&A entries on the site already showing the context.
*/
execl("/bin/sh", "-c", user_provided_shell_script,
"sh", /* this is $0 in the script */
torrent_name_str, /* this is $1 in the script */
torrent_category_str, /* this is $2 in the script */
path_str, /* this is $3 in the script */
NUL);
Any time you're running commands with even the possibility of user input making its way into them, you must escape for the shell context.
There's no built-in function in C to do this, so you're on your own, but the basic idea is to render user parameters as either properly escaped strings or as separate arguments to some kind of execution function (e.g. the exec family); a sketch of the latter follows.
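A rough sketch of the separate-arguments route, in the spirit of the question's second option (the function name is made up, and the fork()/waitpid() handling is left to the caller):
/* Pass the recognized text as its own argv element so no shell ever parses
 * it; the user-chosen script sees it as "$1". Assumes this runs in a child
 * process created with fork(). */
#include <stdio.h>
#include <unistd.h>

static void run_user_script(const char *script_path, const char *recognized_text)
{
    execl(script_path, script_path,     /* argv[0] for the script */
          recognized_text,              /* "$1" inside the script */
          (char *)NULL);
    perror("execl");                    /* reached only if exec fails */
}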

Perfectly forwarding arguments in batch

I have a small python script:
# args.py
import sys; print(sys.argv)
How can I write a .bat wrapper file that forwards all of the arguments to this script?
To eliminate my shell from the tests, I'm going to invoke it as:
import subprocess
import sys
def test_bat(*args):
    return subprocess.check_output(['args.bat'] + list(args), encoding='ascii')
The obvious choice of batch file
@echo off
python args.py %*
Works for simple cases:
>>> test_bat('a', 'b', 'c')
"['args.py', 'a', 'b', 'c']\n"
>>> test_bat('a', 'b c')
"['args.py', 'a', 'b c']\n"
But rapidly falls apart when tried on arbitrary strings:
>>> test_bat('a b', 'c\n d')
"['args.py', 'a b', 'c']\n" # missing d
>>> test_bat('a', 'b^^^^^c')
"['args.py', 'a', 'b^c']\n" # missing ^^^^
Is it even possible to make a bat file pass on its arguments unmodified?
To prove it's not subprocess causing the issue - try running the above with
def test_py(*args):
    return subprocess.check_output([sys.executable, 'args.py'] + list(args), encoding='ascii')
All of the tests behave as expected
Similar questions:
Get list of passed arguments in Windows batch script (.bat) - does not address lossless forwarding
Redirecting passed arguments to a windows batch file - addresses the same ideas as my question, but incorrectly closed as a duplicate of the above, and with less clear test-cases
Forwarding all batch file parameters to inner command - question does not consider corner-cases, accepted answer does not work for them
In short: There is no robust way to pass arguments through as-is via a batch file, because of how cmd.exe interprets arguments; note that cmd.exe is invariably involved, given that it is the interpreter needed to execute batch files, even if you invoke the batch file using an API that requests no shell involvement.
The problem in a nutshell:
On Windows, invoking an external program requires use of a command line as a single string for technical reasons. Therefore, even using an array-based, shell-free way of invoking an external program requires automated composition of a command line in which the individual arguments are embedded.
E.g., Python's subprocess.check_output() accepts the target executable and its arguments individually, as the elements of an array, as demonstrated in the question.
The target executable is invoked directly, using a command line that was automatically composed behind the scenes, without using the platform's shell as an intermediary (the way that Python's os.system() call does, for instance) - unless it so happens the target executable itself requires that shell as the executing interpreter, as is the case with cmd.exe for batch files.
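To make the "single string" point concrete, here is a rough C sketch of the underlying Windows call (the command line shown is only an example; error handling is minimal):
/* CreateProcess() takes the arguments as one pre-composed command-line
 * string, so "a b" had to be quoted by hand when building cmdline. */
#include <windows.h>

int run_example(void)
{
    STARTUPINFOA si = { sizeof(si) };
    PROCESS_INFORMATION pi;
    char cmdline[] = "python.exe args.py \"a b\" c";   /* one flat string */

    if (!CreateProcessA(NULL, cmdline, NULL, NULL, FALSE, 0, NULL, NULL, &si, &pi))
        return 1;
    WaitForSingleObject(pi.hProcess, INFINITE);
    CloseHandle(pi.hThread);
    CloseHandle(pi.hProcess);
    return 0;
}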
Composing the command line requires selective double-quoting and escaping of embedded " characters when embedding the individual arguments; typically that involves:
Using enclosing double-quoting ("..."), but only around arguments that contain whitespace (spaces).
Escaping embedded double quotes as \"
Notably, no other characters trigger double-quoting or individual escaping, even though those characters may have special meaning to a given shell.
While this approach works well with most external programs, it does NOT work reliably with batch files:
Unfortunately, cmd.exe doesn't treat the arguments as literals, but interprets them as if you had submitted the batch-file call in an interactive console (Command Prompt).
Combined with how the command line is composed (as described above), this results in many ways that the arguments can be misinterpreted and break the invocation altogether.
The primary problem is that arguments that end up unquoted in the command line that cmd.exe sees may break the invocation, namely if they contain characters such as &, |, > or <.
Even if the invocation doesn't break, characters such as ^ may get misinterpreted.
See below for specific examples of problematic arguments.
Trying to work around the problem on the calling side with embedded quoting - e.g., using '"^^^^^"' as an argument in Python - does not work, because most languages, including Python, use \" to escape " characters behind the scenes, which cmd.exe does not recognize (it only recognizes "").
Hypothetically, you could painstakingly ^-escape individual characters in whitespace-free arguments, but not only is that quite cumbersome, it still wouldn't address all issues - see below.
Jeb's answer commendably addresses some of these issues inside the batch file, but it is quite complex and it too cannot address all issues - see next point.
There is no way to work around the following fundamental restrictions:
cmd.exe fundamentally cannot handle arguments with embedded newlines (line breaks):
Parsing the argument list simply stops at the first newline encountered.
CR (0xD) characters in isolation are quietly removed.
The interpretation of % as part of an environment-variable reference (e.g., %OS%) cannot be suppressed:
%% does NOT help, because, curiously and unfortunately, the parsing rules of an interactive cmd.exe session apply(!), where the only way to suppress expansion is to employ the "variable-name disrupter trick", e.g., %^OS%, which only works in unquoted arguments - in double-quoted arguments, you fundamentally cannot prevent expansion.
You're lucky if the env. variable happens not to exist; the token is then left alone (e.g., %NoSuchVar% or %No Such Var%; note that cmd.exe does support variable names with spaces).
Examples of whitespace-free arguments that either break batch-file invocation or result in unwanted alteration of the value:
^^^^^
^ in unquoted strings is cmd.exe's escape character that escapes the next character, i.e., treats it as a literal; ^^ therefore represents a literal, single ^, so the above yields ^^, with the last ^ getting discarded
a|b
| separates commands in a pipeline, so cmd.exe will attempt to pipe the part of the command line before | to a command named b and the invocation will most likely break or, perhaps worse, will not work as intended and execute something it shouldn't.
To make this work, you'd need to define the argument as 'a^^^|b' (sic) on the Python side.
Note that a & b would not be affected, because the embedded whitespace would trigger double-quoting on the Python side, and use of & inside "..." is safe.
Other characters that pose similar problems are & < >
Interesting question, but it's tricky.
The main problem is that %* can't be used here, as it modifies the content or fails completely depending on the content.
To get the unmodified argv, you should use a technique like Get list of passed arguments in Windows batch script (.bat).
@echo off
SETLOCAL DisableDelayedExpansion
SETLOCAL
REM Capture the unmodified %* into argv.txt via the prompt/echo trick
for %%a in (1) do (
    set "prompt=$_"
    echo on
    for %%b in (1) do rem * #%*#
    @echo off
) > argv.txt
ENDLOCAL
REM Read the captured line back
for /F "delims=" %%L in (argv.txt) do (
    set "argv=%%L"
)
SETLOCAL EnableDelayedExpansion
REM Strip everything up to the first # and the two trailing characters
set "argv=!argv:*#=!"
set "argv=!argv:~0,-2!"
REM argv now contains the unmodified content of %* .
c:\dev\Python35-32\python.exe args.py !argv!
This can be used to build a wrapper, with limitations:
Carriage returns can't be fetched at all.
Line feeds currently cannot be fetched in a safe way.

How to "source" a shell file in C?

I have one C program and one shell script, and I'd like to "source" the shell script from my C program.
I tried using the system() function; with it I can run the script properly, but my colors don't work.
For example, take CYAN, which I defined as:
CYAN='\e[96m'
it shows only \e[96m, and some functions just fail with the message:
./myscript.sh: 27: [: y: unexpected operator
Is there some solution?
A program that is not itself the shell cannot "source" a file of shell commands as the shell itself can do. A program can run such a file as a script, either directly or by invoking a shell to run it, but the script then gets its own environment, and any changes it applies to that environment do not propagate to the parent process's environment.
Programs receive their environment as a function of program startup. If you want a variable to be set in a program's environment then by far the easiest thing to do is arrange for it to be set when the program is invoked, either by exporting it from the parent process's environment or by wrapping program launch in a script that arranges for the same. There are additional alternatives on the process startup side, as well.
If a C program wants to alter its environment after startup, then it can use the setenv() and unsetenv() functions. Those are defined by POSIX, not C itself, but if we're talking about sourcing shell commands then it seems reasonable to assume a POSIX context.
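For illustration, a minimal sketch of a program adjusting its own environment after startup, reusing the CYAN value from the question (the variable name and value are only examples):
/* Set an environment variable in this process and use it; the change is
 * visible to this process and to children it spawns, not to the parent. */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    if (setenv("CYAN", "\033[96m", 1) != 0) {   /* 1 = overwrite if already set */
        perror("setenv");
        return 1;
    }
    printf("%sthis text is cyan\033[0m\n", getenv("CYAN"));
    return 0;
}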
Additionally, if you are trying to define CYAN as a shell variable whose contents are an ANSI escape sequence, then your syntax is wrong. No escape sequences at all are recognized within ordinary single quotes (even closing single quote cannot be escaped). Within double quotes the backslash does function as an escape character, but in a strict sense: C-style character codes are not supported there. If, again, you're processing that in the shell, as opposed to in C, then you appear to want
CYAN=$'\e[96m'
(Note the $, which is essential for \e to be recognized as representing the "escape" character, and which causes the shell to recognize a few other C-style escape sequences as well.)

Foreign language characters replaced by "?"

I am working on a program which takes file/folder names as input. Currently, when I try to run a file which has foreign-language characters in its name, each character is replaced by a ?. I am running my exe on the command prompt, so trying to run that particular file results in an error. When I use DIR on the command prompt, it also displays a ? for each character of the file name. Is there any way to display the actual foreign-language characters in the command prompt? I believe this could be what's causing my exe not to work with any of those files.
This is the text that I am trying to read - 科普書籍推展教案 which is being replaced by ? on the console.
The command prompt can only display characters in your current ACP. So, if you have files with names outside the ACP, you're going to see ?. You can use chcp to pick a different code page, but there is no code page for full Unicode in the DOS box.
Inside your code, you need to learn to use the 'W' APIs to work with full Unicode pathnames. The safest thing is to just #define _UNICODE and use it uniformly.
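For illustration, a minimal sketch of the wide-character approach (this assumes the Microsoft CRT; wmain and _O_U16TEXT are Microsoft-specific, and real file handling would use the corresponding W functions such as FindFirstFileW):
/* Receive file names as wide-character arguments and print them without a
 * lossy conversion through the ANSI code page. */
#define UNICODE
#define _UNICODE
#include <fcntl.h>
#include <io.h>
#include <stdio.h>

int wmain(int argc, wchar_t *argv[])
{
    _setmode(_fileno(stdout), _O_U16TEXT);   /* let wprintf reach the console as UTF-16 */
    for (int i = 1; i < argc; i++)
        wprintf(L"%ls\n", argv[i]);          /* e.g. 科普書籍推展教案 */
    return 0;
}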
