Calling execve on bash scripts: the scripts can't find their arguments - C

I have two (Ubuntu Linux) bash scripts which take input arguments. They need to be run simultaneously. I tried execve with arguments e.g.
char *argv[10] = { "/mnt/hgfs/F/working/script.sh", "file1", "file2", NULL };
execve(argv[0], argv, NULL);
but the bash script can't seem to find any of its arguments at $0, $1, $2, etc. The script is:
printf "gcc -c ./%s.c -o ./%s.o\n" $1 $1;
gcc -c ./$1.c -o ./$1.o -g
exit 0;
The output is:
gcc -c ./main.c -o ./main.o
followed by a lot of errors like:
/usr/include/libio.h:53:21: error: stdarg.h: No such file or directory
What's missing?

Does your script start with the hashbang line? I think that's a must, something like:
#!/bin/bash
For example, see the following C program:
#include <stdio.h>
#include <unistd.h>

/* argv[0] is the script to run; argv[1] is what the script sees as $1. */
char *argv[10] = { "./qq.sh", "file1", NULL };

int main (void) {
    int rc = execve (argv[0], argv, NULL);   /* only returns on failure */
    printf ("rc = %d\n", rc);
    return 0;
}
When this is compiled and run with the following qq.sh file, it outputs rc = -1:
echo $1
when you change the file to:
#!/bin/bash
echo $1
it outputs:
file1
as expected.
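Incidentally, if you want to see why execve() failed in the rc = -1 case, check errno: a script with no hashbang typically fails with ENOEXEC ("Exec format error"), while a hashbang pointing at a non-existent interpreter gives ENOENT. A small variation on the example above, with a perror() call as the only addition:
#include <stdio.h>
#include <unistd.h>

char *argv[10] = { "./qq.sh", "file1", NULL };

int main (void) {
    int rc = execve (argv[0], argv, NULL);
    /* execve() only returns on failure; errno says why. */
    perror ("execve");   /* e.g. "Exec format error" when the hashbang is missing */
    printf ("rc = %d\n", rc);
    return 0;
}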
The other thing to watch out for is the VMware shared folders, evidenced by /mnt/hgfs. If the file was created with a Windows-type editor, it may have "DOS" line endings (carriage return / line feed), and that may well be causing problems with the execution of the scripts.
You can check for this by running:
od -xcb /mnt/hgfs/F/working/script.sh
and seeing if any \r characters appear.
For example, if I use the shell script with the hashbang line in it (but append a carriage return to that line), I also get the rc = -1 output, meaning it couldn't find the shell.
And, now, based on your edits, your script has no trouble interpreting the arguments at all. The fact that it outputs:
gcc -c ./main.c -o ./main.o
is proof positive of this since it's seeing $1 as main.
The problem you actually have is that the compiler is running but it cannot find stdarg.h, included from your libio.h file - this has nothing to do with whether bash can see those arguments.
My suggestion is to try and compile it manually with that command and see if you get the same errors. If so, it's a problem with what you're trying to compile rather than a bash or exec issue.
If it does compile okay by hand, the failure may be because your execve call destroys the environment variables: you pass NULL as the environment, so the compiler runs without PATH and the other settings it normally relies on.
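If wiping the environment turns out to be the issue, one option is to hand the script your current environment instead of NULL. A minimal sketch, using the POSIX environ global and the paths from the question:
#include <stdio.h>
#include <unistd.h>

extern char **environ;   /* the caller's current environment (POSIX) */

int main (void) {
    char *argv[] = { "/mnt/hgfs/F/working/script.sh", "file1", "file2", NULL };

    /* Pass environ instead of NULL so the script (and gcc) inherit PATH etc. */
    execve (argv[0], argv, environ);

    /* execve() only returns if it failed. */
    perror ("execve");
    return 1;
}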

Related

Why doesn't the execve command in C on macOS allow the 'which' command to work?

Why does the execve command in C on macOS not allow the 'which' command to work? It works on non-Mac devices.
#include <errno.h>
#include <sys/types.h>
#include <sys/stat.h>
#include <fcntl.h>
#include <unistd.h>
#include <stdio.h>

int main()
{
    int fd;
    char cmd[] = "/bin/cat";
    char cmd1[] = "/usr/bin/which";
    char *s[] = { "which", "ls", NULL };

    if (execve(cmd1, s, NULL) == -1)
        perror("oops ur wrong!!");
}
Expected output
 clang-7 -pthread -lm -o main main.c
 ./main
/bin/ls

but on a Mac, it returns nothing.
macOS
The code works. It doesn't work well, but it does work.
Because you've used execve() and provided NULL as the environment, PATH is not set, so /usr/bin/which has nowhere to look for ls and cannot find it.
On my machine (a MacBook Pro running macOS Big Sur 11.7.1 — it's a work machine and the company IT is behind the times), /usr/bin/which is a universal binary with two architectures. If I run /usr/bin/which ozymandias on the command line, there is no output (I don't have a command ozymandias anywhere), but the exit status is 1 (failure). That's an odd implementation — not reporting an error — but it works within its limits.
You can see this effect with:
$ (unset PATH; /usr/bin/which ls)
$ echo $?
1
$
If you use execv() instead of execve() and remove the , NULL from the argument list, the output is /bin/ls and the exit status is 0.
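For reference, a minimal sketch of that change; execv() leaves the caller's environment intact, so which gets the usual PATH:
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    char cmd1[] = "/usr/bin/which";
    char *s[] = { "which", "ls", NULL };

    /* execv() passes the current environment through unchanged. */
    if (execv(cmd1, s) == -1)
        perror("oops ur wrong!!");
}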
Linux
Just for comparison, on a RHEL 7.4 machine, I get different results:
$ which -a which
which='alias | /usr/bin/which --tty-only --read-alias --show-dot --show-tilde'
/usr/bin/alias
/usr/bin/which
/usr/bin/which
$ file /usr/bin/which
/usr/bin/which: ELF 64-bit LSB executable, x86-64, version 1 (SYSV), dynamically linked (uses shared libs), for GNU/Linux 2.6.32, BuildID[sha1]=317ba624d2914607bf9246993446803a977fbc18, stripped
$ /usr/bin/which which
/usr/bin/which
$ (unset PATH; /usr/bin/which which)
/usr/bin/which: no which in ((null))
$ /usr/bin/which ozymandias
/usr/bin/which: no ozymandias in (/work2/jleffler/bin:/u/jleffler/bin:/usr/perl/v5.34.0/bin:/usr/gcc/v12.2.0/bin:/usr/local/bin:/usr/bin:/usr/sbin)
$ /usr/bin/which --help
Usage: /usr/bin/which [options] [--] COMMAND [...]
Write the full path of COMMAND(s) to standard output.
--version, -[vV] Print version and exit successfully.
--help, Print this help and exit successfully.
--skip-dot Skip directories in PATH that start with a dot.
--skip-tilde Skip directories in PATH that start with a tilde.
--show-dot Don't expand a dot to current directory in output.
--show-tilde Output a tilde for HOME directory for non-root.
--tty-only Stop processing options on the right if not on tty.
--all, -a Print all matches in PATH, not just the first
--read-alias, -i Read list of aliases from stdin.
--skip-alias Ignore option --read-alias; don't read stdin.
--read-functions Read shell functions from stdin.
--skip-functions Ignore option --read-functions; don't read stdin.
Recommended use is to write the output of (alias; declare -f) to standard
input, so that which can show aliases and shell functions. See which(1) for
examples.
If the options --read-alias and/or --read-functions are specified then the
output can be a full alias or function definition, optionally followed by
the full path of each command used inside of those.
Report bugs to <which-bugs@gnu.org>.
$
PATH sanitized — radically shortened.
The which command reports an error when it can't find the command. It is a standalone executable on this Linux machine, and the which alias feeds it the aliases so it can report on them. The -a option reports on all the things that could be known as which (the second which in which -a which).
I found that adding envp (the third argument to main, which carries the environment) to the execve() call made it work:
#include <errno.h>
#include <sys/types.h>
#include <sys/stat.h>
#include <fcntl.h>
#include <unistd.h>
#include <stdio.h>

int main(int argc, char *argv[], char *envp[])
{
    char cmd1[] = "/usr/bin/which";
    char *s[] = { "which", "ls", NULL };

    /* Pass the environment we received (envp) so which can see PATH. */
    if (execve(cmd1, s, envp) == -1)
        perror("oops ur wrong!!");
}
thanks anyways

Syntax error reported in basic C code - what is wrong?

The code is given below.
#include <stdio.h>
#include <string.h>
#include <stdlib.h>

int main ( int argc, char *argv[] )
{
    //FILE *fps;
    char secret[512] = " ";
    FILE *fps = fopen("/etc/comp2700/share/secret", "r");
    if (fps == NULL)
    {
        printf("Secret file not found\n");
        return 1;
    }
    fgets(secret, 512, fps);
    printf("Secret: %s\n", secret);
    fclose(fps);
    return 0;
}
When I am trying to run this program it is repeatedly throwing the following error:
./attack1.c: line 4: syntax error near unexpected token `('
./attack1.c: line 4: `int main ( int argc, char *argv[] )'
You need to compile your source file with gcc as follows
gcc -o attack attack1.c
then run it with
./attack
You should read up on the difference between compiled and interpreted languages.
There is a short video here explaining the difference.
You cannot run your C program from the command line as ./attack1.c. Normally the shell would refuse to execute the C source file because it should not have execute permission, but for some reason, on your system, it must have the x bits and is read by the default shell as a script.
Of course this fails because attack1.c contains C code, not a command file. Note that the #include lines are interpreted as comments by the shell and the error only occurs at line 4.
To run a C program, you must first compile it to produce an executable:
gcc -Wall -o attack1 attack1.c
And then run the executable if there were no compilation errors:
./attack1
You can combine these commands as
gcc -Wall -o attack1 attack1.c && ./attack1
First, you need to compile the attack1.c code using the following command:
gcc attack1.c
This will create one executable file a.out which you can run using the following command:
./a.out
Hope this helps you.

GCC/Command Prompt error: '.' is not recognized as an internal or external command

I'm fairly new to C and am completely new to using the command prompt and GCC to compile and run my programs. I'm struggling to find the right words to ask this question properly so please bear with me, I am doing my best.
I need to use GCC to compile and run this C program but I'm getting an error that I do not understand. In this example program, I was told to use these lines to compile and run the code:
$ gcc -Wall -std=c99 -o anagrams anagrams.c
$ ./anagrams dictionary1.txt output1.txt
So that is what I did. GCC does compile the program file, so the first line does not give me any error. But GCC does not like the second line, as shown below:
C:\Users\...\Example>gcc -Wall -std=c99 -o anagrams anagrams.c
C:\Users\...\Example>./anagrams dictionary1.txt output1.txt
'.' is not recognized as an internal or external command,
operable program or batch file.
Everywhere I look, it says to use "./filename" to run the program after compiling so I don't understand why it is not working for me. Any help or advice would be really appreciated.
Also, here is the main() of the program to show why those two .txt files are needed:
int main(int argc, char *argv[])
{
    AryElement *ary;
    int aryLen;

    if (argc != 3) {
        printf("Wrong number of arguments to program.\n");
        printf("Usage: ./anagrams infile outfile\n");
        exit(EXIT_FAILURE);
    }

    char *inFile = argv[1];
    char *outFile = argv[2];

    ary = buildAnagramArray(inFile, &aryLen);
    printAnagramArray(outFile, ary, aryLen);
    freeAnagramArray(ary, aryLen);

    return EXIT_SUCCESS;
}
'.' is not recognized as an internal or external command
This is not a GCC error. The error is issued by the shell when trying to run a command.
On Windows this
./anagrams dictionary1.txt output1.txt
should be
.\anagrams dictionary1.txt output1.txt
as on Windows the path delimiter is \ as opposed to Unix-like systems where it is /.
On both systems . denotes the current directory.
The reason for the crash you mention in your comment is not obvious from the minimal sources you show. Also this is a different question.

compiling and running a c program using exec()

I am writing a program using execv() that compiles and runs another program. I've written up a simple C program named helloWorld.c that when executed outputs, "Hello world," and a second file named testExec.c that is supposed to compile and run helloWorld.c. I've been looking around everywhere to find a way to do this, but I haven't found any answers. The code in testExec.c is:
#include <stdio.h>
#include <unistd.h>

int main(){
    char *args[] = {"./hellWorld.c", "./a.out", NULL};
    execv("usr/bin/cc", args);
    return 0;
}
testExec.c compiles with no errors. However, when I run it I get an error that says, "fatal error: -fuse-linker-plugin, but liblto_plugin.so not found. compilation terminated." Which I think means helloWorld.c is being compiled but when it comes time to run helloWorld.c this error is thrown. I thought maybe that was because I had a.out and helloWorld.c prefaced with './'. I removed './' from both, then either one individually, and still no luck.
I also did 'sudo apt-get install build-essential' along with 'sudo apt-get install gcc'. I wasn't sure if that would resolve the issue but I really wasn't sure what else to try. Anyway, any help would be appreciated!
You're missing the leading slash when calling cc.
Also, the first argument in the argument list is the name of the executable. The actual arguments come after that. You're also not using -o to specify the name of the output file.
#include <stdio.h>
#include <unistd.h>

int main(){
    char *args[] = {"cc", "-o", "./a.out", "./hellWorld.c", NULL};
    execv("/usr/bin/cc", args);
    return 0;
}
EDIT:
The above only compiles. If you want to compile and run, you can do this:
#include <stdio.h>
#include <stdlib.h>   /* for system() */
#include <unistd.h>

int main(){
    system("cc -o ./a.out ./hellWorld.c");
    execl("./a.out", "a.out", (char *)NULL);
    return 0;
}
Although this is probably best done as a shell script:
#!/bin/sh
cc -o ./a.out ./hellWorld.c
./a.out
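If you would rather stay within the exec() family and avoid system(), a sketch along these lines should also work, assuming the same hellWorld.c and a.out paths, and only running the program if the compile succeeds:
#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void){
    pid_t pid = fork();
    if (pid < 0) {
        perror("fork");
        return 1;
    }
    if (pid == 0) {
        /* Child: run the compiler. */
        char *args[] = {"cc", "-o", "./a.out", "./hellWorld.c", NULL};
        execv("/usr/bin/cc", args);
        perror("execv cc");      /* only reached if execv fails */
        _exit(127);
    }

    /* Parent: wait for the compiler and check that it exited successfully. */
    int status;
    if (waitpid(pid, &status, 0) == -1 || !WIFEXITED(status) || WEXITSTATUS(status) != 0) {
        fprintf(stderr, "compilation failed\n");
        return 1;
    }

    /* Replace this process with the freshly built program. */
    execl("./a.out", "a.out", (char *)NULL);
    perror("execl a.out");
    return 1;
}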

printf doesn't work, even with newline and fflush()

I'm compiling the below C code with gcc. No errors are thrown during compilation or at runtime. I ran through the code with gdb, and the answer given in sum is correct at the end, yet the printf() does not display anything on the screen. I've tried all sorts of combinations of fprintf(), printf(), and fflush(), but nothing works.
What do I need to change so the program will print the result to stdout?
#include <stdio.h>
#include <stdlib.h>

int main()
{
    int num = 9;
    int i, sum;
    i = 1, sum = 0;
    while (i < 2 * num) {
        sum = sum + i * i;
        ++i;
    }
    printf("sum: %d\n", sum);
    fflush(stdout);
    return 0;
}
The code is correct, and should print sum: 1785 for any conforming implementation.
This is a guess (update: which turns out to be correct), but ...
You've named the source file test.c, and you compile it with:
$ gcc test.c -o test
(or something similar) and execute it with:
$ test
which produces no output.
The problem is that test is a standard Unix command (and also a built-in command in some shells). When you type a command name to the shell, it first looks for built-in commands, then for executables in directories specified in your $PATH environment variable.
To execute a command in the current directory, prepend ./ to the name:
$ ./test
sum: 1785
$
This applies to any command in the current directory. There are so many built-in commands that you can't reasonably avoid colliding with them. Cultivating the habit of running executables in the current directory by typing ./whatever means that name collisions don't matter.
(Don't be tempted to add . to the front of your $PATH; that's dangerous. Think about what could happen if you cd into a directory and type ls, if there happens to be a malicious ls command there.)
There is nothing wrong with your program. It has to work. Try running it with redirection:
./a.out > myout
and see if you get any output. If not, I'd suspect a problem with some kind of standard library mismatch.
Another option would be to build with the Sun C compiler instead of gcc and see if that works. If it does, gcc is the culprit.
