popen and output of system command - c

I have to figure out the available space in /mnt/ in my application. I wrote the following code, but execute_cmd sometimes returns junk after the actual output, for example "4.5K" followed by garbage. Where am I going wrong? Could someone review the code and tell me why execute_cmd returns a junk byte at the end, and how I can improve it?
char *execute_cmd(char *cmd)
{
    FILE *fp;
    char path[100];
    int ii = 0;
    //char ii = 0;
    char *buffer = malloc(1024);
    char len = 0;
    /* Open the command for reading. */
    fp = popen(cmd, "r");
    if (fp == NULL) {
        printf("Failed to run command\n");
        exit(1);
    }
    printf("Running command is: %s\n", cmd);
    memset(buffer, 0, sizeof(buffer));
    do {
        len = fread(path, 100, 1, fp); /* Is it okay to use fread? I do not know how many bytes to read as this function is a generic function which can be used for executing any command */
        strcat(buffer, path);
        printf("Number of bytes is: %d\n", len);
    } while (len != 0);
    len = strlen(buffer);
    printf("Buffer contents are: %s %d\n", buffer, len);
    /* close */
    pclose(fp);
}

void main()
{
    char *buffer = "df -h | grep \"/mnt\" | awk '{ print $4}'"; /* FIXME */
    char len;
    char units;
    float number;
    char dummy = 0;
    char *avail_space;
    avail_space = execute_cmd(buffer);
    len = strlen(avail_space);
    units = avail_space[len - 1];
    printf("Available space is: %s %d %c end here\n", avail_space, len, units);
    number = strtof(avail_space, NULL);
    printf("Number is: %f\n", number);
}

sizeof(buffer) is sizeof(char*), which is probably 8 (or maybe 4). So your memset only clears a little bit of buffer. But with your use of fread, it's not just buffer that needs to be cleared; it's the temporary path.
Uninitialized local variables like path are not zero-initialised. You could use memset(path, 0, sizeof(path)); to clear it -- here the sizeof works because path really is an array -- but simpler is to initialise it in the declaration: char path[100] = "";.
Since fread does not NUL-terminate what it reads, there might be arbitrary garbage following it, making the strcat Undefined Behaviour. In fact, the strcat is totally unnecessary and a waste of cycles. You know how much data you read (it's in len) so you know exactly where to read the next chunk and you can do so directly without a temporary buffer and without a copy.
For future reference, if you are planning on calling malloc and then using memset to clear the allocated region, you should instead use calloc. That's what it's there for.
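Putting those points together, here is a minimal sketch of how execute_cmd could look (a fixed 1024-byte result buffer for illustration; a production version would realloc as needed and check errors more thoroughly):
int execute_cmd_sketch; /* placeholder name only if you keep both versions side by side */
char *execute_cmd(const char *cmd)
{
    FILE *fp = popen(cmd, "r");
    if (fp == NULL)
        return NULL;

    size_t cap = 1024;
    size_t used = 0;
    char *buffer = calloc(cap, 1);      /* zero-filled, so it is always NUL-terminated */
    if (buffer == NULL) {
        pclose(fp);
        return NULL;
    }

    size_t n;
    while ((n = fread(buffer + used, 1, cap - used - 1, fp)) > 0) {
        used += n;
        if (used >= cap - 1)
            break;                      /* buffer full; a real version would realloc here */
    }
    buffer[used] = '\0';
    pclose(fp);
    return buffer;                      /* caller must free() the result */
}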

Related

using memcmp to compare buffers with 0 bytes

I am trying to compare the text of two files byte by byte using memcmp, after reading both of them into memory (one file into a buffer, char* or char[], I tried both). The problem is that the file I read into a buffer contains a lot of 0 bytes, which makes it stop at the first 0 byte, treating it as a terminating null, and that leads to a segmentation fault. How can I make the function keep comparing bytes even though there are 0 bytes?
I already checked whether the buffer is filled: when I print it byte by byte it shows all of the bytes, including the 0 bytes, but when I print it as a whole with printf("%s", buffer) I only get the first byte (the second byte is a 0 byte).
void detect_virus(char *buffer, unsigned int size){
    link* l = (link*) malloc(sizeof(link));
    load(l);
    unsigned int location = 0;
    while(l != NULL){
        location = 0;
        while(location < size - l->vir->SigSize){
            int isVirus = memcmp(buffer + location, l->vir->sig, l->vir->SigSize);
            if(isVirus == 0)
                printf("%d, %s, %d\n", location, l->vir->virusName, l->vir->SigSize);
            location++;
        }
    }
    free(l);
}

void detect(link* list){
    char filename[50];
    fgets(filename, 50, stdin);
    sscanf(filename, "%s", filename);
    FILE* file = fopen(filename, "rb");
    char* buffer = (char*) malloc(10000);
    fseek(file, 0, SEEK_END);
    unsigned int size = ftell(file);
    fseek(file, 0, SEEK_SET);
    fread(buffer, 1, size, file);
    detect_virus(buffer, size);
    fclose(file);
}
I get a segmentation fault the first time memcmp is called, instead of it comparing the texts fully. Any ideas how to fix that?
Edit
Code for the load function:
void load(link* list){
    printf("Enter Viruses file name: \n");
    char* filename = (char*) malloc(100);
    fgets(filename, 100, stdin);
    sscanf(filename, "%s", filename);
    FILE* file = fopen(filename, "r");
    while(!feof(file)){
        short length = 0;
        fread(&length, 2, 1, file);
        if(length == 0)
            break;
        struct virus* v = (struct virus*)malloc(length);
        fseek(file, -2, SEEK_CUR);
        fread(v, length, 1, file);
        v->SigSize = v->SigSize - 18;
        list_append(list, v);
    }
    list = list->nextVirus;
    free(filename);
    fclose(file);
}
As a note, I tested the function before and it worked.
Edit
I found out the problem, thank you all!
Per 7.21.6.7 The sscanf function, paragraph 2 of the C standard:
The sscanf function is equivalent to fscanf, except that input is obtained from a string (specified by the argument s) rather than from a stream. Reaching the end of the string is equivalent to encountering end-of-file for the fscanf function. If copying takes place between objects that overlap, the behavior is undefined.
Note the final sentence about overlapping objects.
In your code:
sscanf(filename, "%s", filename);
the filename array certainly overlaps with the filename array, thus invoking undefined behavior.
Remove that line of code.
You also need to add error checking, especially checking that the return from fopen() is not NULL.
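A minimal sketch of what that part of detect() could look like after those two changes (strcspn is one common way to trim the newline that fgets leaves in the buffer):
char filename[50];
if (fgets(filename, sizeof(filename), stdin) == NULL)
    return;                               /* no input at all */
filename[strcspn(filename, "\n")] = '\0'; /* drop the trailing newline, if any */

FILE* file = fopen(filename, "rb");
if (file == NULL) {
    perror(filename);                     /* say why the open failed */
    return;
}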

Using popen("ls -la") produces strange result

I wrote some C code to get the result of an "ls -la" command using popen and write the result into a char buffer. The code looks like this:
unsigned int ls(char *destination, const char *username, const char *relative_path)
{
    printf("LS IMP\n");
    //if(!username || !relative_path) return -1;
    FILE *ls_pipe = NULL;
    unsigned long ls_pipe_size = -1;
    const char ls_command[] = "ls -la ";
    char ls_path[255] = "/home/";
    char ls_full_command[255];
    char buffer[255];
    bzero(buffer, 255);
    char *entries = NULL;
    bzero(ls_full_command, 255);
    strcat(ls_path, username);
    strcat(ls_path, relative_path);
    strcat(ls_full_command, ls_command);
    strcat(ls_full_command, ls_path);
    printf("AFTER CATS\n");
    ls_pipe = popen(ls_full_command, "r");
    if(ls_pipe == NULL) return -1;
    printf("Pipe ok!");
    fseek(ls_pipe, 0, SEEK_END);
    ls_pipe_size = ftell(ls_pipe);
    rewind(ls_pipe);
    printf("Filesize: %lu\n", ls_pipe_size);
    int i;
    for(i = 0; i < 100; i++)
    {
        fread(buffer, 1, 255, ls_pipe);
        printf("%s", buffer);
    }
    //entries = (char*) malloc(sizeof(char) * ls_pipe_size);
    //if(entries == NULL) return -1;
    printf("Entries ok!\n");
    //if(ls_pipe_size != fread(destination, sizeof(char), ls_pipe_size, ls_pipe)) return -1;
    fclose(ls_pipe);
    return strlen(destination);
}
The problem is that the reported size of the pipe is huge (?), and after the proper output three entries keep repeating, seemingly forever.
Is there any way of reading from the pipe without knowing the exact number of lines in the result, short of something like another popen with wc -l?
Thanks
P.S. There are some leftover modifications in the code from when I was trying to work out what was going wrong, and the malloc didn't work because of the insane size reported for the pipe.
You can't seek on a pipe — period. Any value you get back from ftell() is immaterial or erroneous. You can't rewind a pipe because you can't seek on a pipe. You can only read data once from a pipe.
So, you need to redesign the code to read an indefinite amount of data.
Here's some reasonably working code — but I needed to adapt it to Mac OS X and my machine, so instead of /home/ it uses /Users/, and the call to ls() uses my user name. The code properly handles buffers full of data that do not end with a null (listing about 570 lines of output for my bin directory). I've left the interface to ls unchanged although it almost doesn't use destination and returning the length of destination is otherwise unrelated to what it is doing. It also uses pclose() to close the pipe. Using pclose() avoids leaving zombies around and returns the exit status of the executed program where fclose() will not.
#include <assert.h>
#include <stdio.h>
#include <string.h>
#include <stdlib.h>

static unsigned int ls(char *destination, const char *username, const char *relative_path)
{
    printf("LS IMP\n");
    assert(destination != 0 && username != 0 && relative_path != 0);
    const char ls_command[] = "ls -la ";
    char ls_path[255] = "/Users/";
    char ls_full_command[255];
    snprintf(ls_full_command, sizeof(ls_full_command), "%s %s%s/%s",
             ls_command, ls_path, username, relative_path);
    FILE *ls_pipe = popen(ls_full_command, "r");
    if (ls_pipe == NULL)
        return -1;
    printf("Pipe ok!\n");
    char buffer[255];
    int nbytes;
    while ((nbytes = fread(buffer, 1, 255, ls_pipe)) > 0)
        printf("%.*s", nbytes, buffer);
    putchar('\n');
    printf("Entries ok!\n");
    pclose(ls_pipe);
    return strlen(destination);
}

int main(void)
{
    unsigned int length = ls("/", "jleffler", "bin");
    printf("ls() returned %u\n", length);
    return(0);
}
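As noted above, pclose() also gives you the exit status of the command, which fclose() cannot; a minimal sketch of checking it (uses the wait-status macros from <sys/wait.h>):
int status = pclose(ls_pipe);
if (status == -1)
    perror("pclose");                        /* pclose itself failed */
else if (WIFEXITED(status))
    printf("ls exited with status %d\n", WEXITSTATUS(status));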

C program not entering for loop, but executing everything right up until it

For some reason, when running a program it refuses to enter a for loop and instead just hangs.
main()
{
    char *buffer;
    int chunk_offset, current_chunk_number, total_chunks;
    int filelen;
    filelen = ReadFileintoBuffer( buffer);
    printf("file read \n"); // error break point 1
    int events = filelen/eventlen; // number of 512 + 2 32 bit events events with timecode
    int sp = 0;
    int i,j;
    int toterror[25];
    printf("file of length %d has %d events \n", filelen, events);
    printf(" i = %d \n", i);
    for( i = 0; i < events; i++)
    {
        printf("analyzed %d events of %d", i, events);
        sp=0;
        int error[13];
        Analyzeevent(buffer+i*eventlen, error);
        if(error[0])
        {
            sp = 12;
        }
        for( j =1; j < 13; j++)
        {
            toterror[j+sp] += error[j];
        }
    }
    printf("post for loop");
    Printerrs(toterror, events);
    exit(0);
}
It prints everything down to i = (9727988 in this particular case), then nothing; it all just stops. Any idea what happened? I learned C++, and programming in C is very strange and awkward for me right now. The compiler doesn't throw up any warnings or anything.
thank you for your help in advance.
Edit: for ring0, here's the code for ReadFileintoBuffer:
int ReadFileintoBuffer( char *buffer)
{
    int filelen;
    char text[200];
    printf("Input File: " );
    scanf( "%s" ,text);
    FILE *file = 0;
    int i;
    //Open file
    file = fopen(text, "rb");
    if (!file)
    {
        fprintf(stderr, "Unable to open file \n");
        exit(0);
    }
    //Get file length
    fseek(file, 0, SEEK_END); // find the end of the file
    filelen=ftell(file); // set the current pointer (currently at end from above) as file length
    fseek(file, 0, SEEK_SET); // set pointer back to beginning of file
    //Allocate memory
    buffer=(char *)malloc(filelen+1);
    if (!buffer)
    {
        fprintf(stderr, "Memory error! \n");
        fclose(file);
        return;
    }
    //Read file contents into buffer
    fread(buffer, filelen, 1, file);
    fclose(file); // close the file, all in buffer now
    return filelen;
}
You may be getting some buffering issues with printf. Try adding a \n to the print inside the for loop. (Like you have in some of the earlier prints)
It may be executing all the code but you aren't seeing the printed result.
You need to use a debugger - gdb
What is eventlen here? What is its type? What are its value and filelen's value?
int events = filelen/eventlen;
It could be possible that your events count is negative, and so it refuses to enter this loop!
for( i = 0; i < events; i++)
Looking at
char *buffer;
filelen = ReadFileintoBuffer( buffer);
it seems something is not going right.
We don't see the code of ReadFileintoBuffer, but it cannot use the buffer parameter in any useful way:
buffer is a non-allocated pointer
its value, and not its reference, is passed to the function
Thus, either
buffer should be allocated prior to calling ReadFileintoBuffer(), e.g. with malloc(), or
its reference should be passed instead, like
filelen = ReadFileintoBuffer( &buffer);
It all depends on the ReadFileintoBuffer() function.
Edit based on ReadFileintoBuffer() code
The code has a problem with buffer as expected.
The local buffer parameter is allocated in the function, but its pointer value is not copied into the caller's buffer.
There are several ways to fix this - one of which is, as mentioned above, to pass the reference to the buffer pointer, and not its value to the function.
filelen = ReadFileintoBuffer( &buffer); // note the &
And the function itself - buffer related:
int ReadFileintoBuffer( char **buffer) // note the **
{
    // ...
    //Allocate memory ( using *buffer instead of buffer )
    *buffer=(char *)malloc(filelen+1);
    if (!*buffer)
    {
        // ...
    }
    //Read file contents into *buffer
    fread(*buffer, filelen, 1, file);
    // ...
}
This way the main() buffer variable allocation is actually performed by that function.
Note that the solution you approved does not fix the problem: it just empties the printf buffer... Unless buffer is fixed, your program cannot run correctly.
If you are trying to track down a problem with intermittent printf's, then use them together with fflush; a trailing \n works as well when stdout is line buffered (as it is on a terminal). Otherwise you can't tell anything by looking at a seemingly hanging app.
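For example, applied to the print inside the loop above:
printf("analyzed %d events of %d\n", i, events); /* trailing \n flushes line-buffered stdout */
fflush(stdout);                                  /* forces output even when stdout is fully buffered */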
eventlen is not defined prior to this line:
int events = filelen/eventlen;
Use printf to see its value.

C getline function not reading lines as specified

I need getline() to read the request header sent by my browser to the webserver I'm programming. This is the getMessage function which is supposed to do that task:
char *getMessage(int fd) {
    FILE *sstream = fdopen(fd, "r");
    // initialise block to 1 char and set it to null
    char *block = malloc(sizeof(char));
    *block = '\0';
    int size = 1;
    // Read from the file descriptor fd (using a FILE stream) until a blank line is
    // received.
    // Read 100 lines (buffersize) from sstream and put into the buffer. If lines have
    // been successfully read concatenate them with block.
    int buffersize = 100;
    char *buffer = malloc (buffersize + 1);
    while(getline(&buffer,&buffersize,sstream) != -1){
        int length = strlen(buffer);
        printf("Buffer length: %d\n",length);
        block = realloc(block,strlen(block)+strlen(buffer)+1);
        strcat(block,buffer);
        if(strcmp(buffer,"\r\n") == 0) break;
    }
    int len = strlen(block);
    printf("Block length: %d\n", len);
    printf("%s \n", block);
    return block;
}
Basically the input to the getMessage function (fd) is the file descriptor from the listening socket declared in my main method. I have verified that the output is correct. Now I need to convert the output read from the file descriptor into a string and return that string. But every time I run my server it gets stuck in the while loop, not executing the statements in the loop.
EDIT: Added a loop-terminating condition; now it jumps to "Block length" immediately.
Help is much appreciated!
If you are using the POSIX 2008 getline() function, then you're throwing away useful information: it returns the length of the line it reads, so if you capture that value, you do not need the strlen() in the loop.
If the code blocks on a getline() call, it probably means that the upstream socket is not closed, but there is no data being sent any more. Your sending code needs to close the socket so that this code can detect EOF.
Or, since you discuss 'a blank line', maybe your code should be checking for a line containing just \r\n (or maybe just \n) and breaking out of the loop; your code is not doing that at the moment.
Your loop also exhibits quadratic behaviour because you are repeatedly using strcat(). You would do better to keep tabs on the end of the string and simply strcpy() the new data after the old, then adjust the pointer to the end of the string.
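A minimal sketch of that idea, assuming len holds the length getline() just returned (realloc error handling omitted):
size_t used = 0;                           /* bytes collected so far */
/* ... inside the read loop, after getline() returns len ... */
block = realloc(block, used + len + 1);
memcpy(block + used, buffer, len + 1);     /* copy the chunk and its terminating NUL */
used += len;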
On further review, I note that you use fdopen() to open a file stream based on the file descriptor, but you neither close it nor return the file stream to the caller for closing. This leads to a leakage problem.
Rule of Thumb: if you allocate a resource, you should release it, or pass it back to be released.
I recommend changing the interface to use an already-open FILE *, and doing the fdopen() in the calling code. Alternatively, if you won't need the file descriptor again, you can keep the current interface and use fclose() before returning, but this will close the underlying file descriptor too.
This code works for me (MacOS X 10.7.2; XCode 4.2.1):
#include <unistd.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

extern char *getMessage(FILE *);

char *getMessage(FILE *fp)
{
    char *block = 0;
    size_t size = 0;
    size_t buffersize = 0;
    char *buffer = 0;
    ssize_t newlen;
    while ((newlen = getline(&buffer, &buffersize, fp)) > 0)
    {
        printf("Buffer length: %ld\n", (long)newlen);
        block = realloc(block, size + newlen + 1);
        strcpy(&block[size], buffer);   /* append at the known end of block */
        size += newlen;
        if (strcmp(buffer, "\r\n") == 0)
            break;
    }
    printf("Block length: %zd\n", size);
    if (size > 0)
        printf("<<%s>>\n", block);
    return block;
}

int main(void)
{
    char *msg;
    while ((msg = getMessage(stdin)) != 0)
    {
        printf("Double check: <<%s>>\n", msg);
        free(msg);
    }
    return 0;
}
I tested it with a file with DOS-style line endings as standard input, with both a blank line as the last line and with a non-blank line. Two blank lines in a row also seemed to be OK.
char buffer = (char *) malloc (buffersize + 1);
should be:
char *buffer = malloc (buffersize + 1);

In C, how should I read a text file and print all strings

I have a text file named test.txt
I want to write a C program that can read this file and print the content to the console (assume the file contains only ASCII text).
I don't know how to get the size of my string variable. Like this:
char str[999];
FILE * file;
file = fopen( "test.txt" , "r");
if (file) {
    while (fscanf(file, "%s", str)!=EOF)
        printf("%s",str);
    fclose(file);
}
The size 999 doesn't work because the string returned by fscanf can be larger than that. How can I solve this?
The simplest way is to read a character, and print it right after reading:
int c;
FILE *file;
file = fopen("test.txt", "r");
if (file) {
    while ((c = getc(file)) != EOF)
        putchar(c);
    fclose(file);
}
c is int above, since EOF is a negative number, and a plain char may be unsigned.
If you want to read the file in chunks, but without dynamic memory allocation, you can do:
#define CHUNK 1024 /* read 1024 bytes at a time */
char buf[CHUNK];
FILE *file;
size_t nread;

file = fopen("test.txt", "r");
if (file) {
    while ((nread = fread(buf, 1, sizeof buf, file)) > 0)
        fwrite(buf, 1, nread, stdout);
    if (ferror(file)) {
        /* deal with error */
    }
    fclose(file);
}
The second method above is essentially how you will read a file with a dynamically allocated array:
char *buf = malloc(chunk);
if (buf == NULL) {
    /* deal with malloc() failure */
}
/* otherwise do this. Note 'chunk' instead of 'sizeof buf' */
while ((nread = fread(buf, 1, chunk, file)) > 0) {
    /* as above */
}
Your method of fscanf() with %s as format loses information about whitespace in the file, so it is not exactly copying a file to stdout.
There are plenty of good answers here about reading in chunks; I'm just going to show you a little trick that reads all the content at once into a buffer and prints it.
I'm not saying it's better. It isn't, and as Ricardo mentioned it can sometimes be bad, but I find it's a nice solution for the simple cases.
I sprinkled it with comments because there's a lot going on.
#include <stdio.h>
#include <stdlib.h>

char* ReadFile(char *filename)
{
    char *buffer = NULL;
    int string_size, read_size;
    FILE *handler = fopen(filename, "r");

    if (handler)
    {
        // Seek the last byte of the file
        fseek(handler, 0, SEEK_END);
        // Offset from the first to the last byte, or in other words, filesize
        string_size = ftell(handler);
        // go back to the start of the file
        rewind(handler);

        // Allocate a string that can hold it all
        buffer = (char*) malloc(sizeof(char) * (string_size + 1) );

        // Read it all in one operation
        read_size = fread(buffer, sizeof(char), string_size, handler);

        // fread doesn't set it so put a \0 in the last position
        // and buffer is now officially a string
        buffer[string_size] = '\0';

        if (string_size != read_size)
        {
            // Something went wrong, throw away the memory and set
            // the buffer to NULL
            free(buffer);
            buffer = NULL;
        }

        // Always remember to close the file.
        fclose(handler);
    }

    return buffer;
}

int main()
{
    char *string = ReadFile("yourfile.txt");
    if (string)
    {
        puts(string);
        free(string);
    }

    return 0;
}
Let me know if it's useful or you could learn something from it :)
Instead, just print the characters directly onto the console, because the text file may be very large and reading it all in may require a lot of memory.
#include <stdio.h>
#include <stdlib.h>

int main() {
    FILE *f;
    int c;    /* int, not char, so EOF can be distinguished from valid characters */
    f=fopen("test.txt","rt");
    while((c=fgetc(f))!=EOF){
        printf("%c",c);
    }
    fclose(f);
    return 0;
}
Use "read()" instead o fscanf:
ssize_t read(int fildes, void *buf, size_t nbyte);
DESCRIPTION
The read() function shall attempt to read nbyte bytes from the file associated with the open file descriptor, fildes, into the buffer pointed to by buf.
Here is an example:
http://cmagical.blogspot.com/2010/01/c-programming-on-unix-implementing-cat.html
Working part from that example:
f=open(argv[1],O_RDONLY);
while ((n=read(f,l,80)) > 0)
    write(1,l,n);
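For reference, a self-contained version of that snippet might look like this (POSIX only; assumes <fcntl.h> and <unistd.h>, with minimal error handling):
#include <fcntl.h>
#include <unistd.h>

int main(int argc, char *argv[])
{
    if (argc < 2)
        return 1;
    int fd = open(argv[1], O_RDONLY);
    if (fd == -1)
        return 1;
    char buf[80];
    ssize_t n;
    while ((n = read(fd, buf, sizeof buf)) > 0)
        write(1, buf, n);                 /* 1 is the standard output descriptor */
    close(fd);
    return 0;
}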
An alternate approach is to use getc/putc to read/write 1 char at a time. A lot less efficient. A good example: http://www.eskimo.com/~scs/cclass/notes/sx13.html
You can use fgets and limit the size of the read string.
char *fgets(char *str, int num, FILE *stream);
You can change the while in your code to:
while (fgets(str, 100, file)) printf("%s", str);
Two approaches leap to mind.
First, don't use scanf. Use fgets() which takes a parameter to specify the buffer size, and which leaves any newline characters intact. A simple loop over the file that prints the buffer content should naturally copy the file intact.
Second, use fread() or the common C idiom with fgetc(). These would process the file in fixed-size chunks or a single character at a time.
If you must process the file over white-space delimited strings, then use either fgets or fread to read the file, and something like strtok to split the buffer at whitespace. Don't forget to handle the transition from one buffer to the next, since your target strings are likely to span the buffer boundary.
If there is an external requirement to use scanf to do the reading, then limit the length of the string it might read with a precision field in the format specifier. In your case with a 999 byte buffer, then say scanf("%998s", str); which will write at most 998 characters to the buffer leaving room for the nul terminator. If single strings longer than your buffer are allowed, then you would have to process them in two pieces. If not, you have an opportunity to tell the user about an error politely without creating a buffer overflow security hole.
Regardless, always validate the return values and think about how to handle bad, malicious, or just malformed input.
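A minimal sketch of that bounded read with the return value checked (the field width is one less than the buffer size to leave room for the terminator):
char str[999];
while (fscanf(file, "%998s", str) == 1)
    printf("%s ", str);                   /* note: the file's original whitespace is lost */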
You can use getline() to read your text file without worrying about large lines:
getline() reads an entire line from stream, storing the address of the buffer containing the text into *lineptr. The buffer is null-terminated and includes the newline character, if one was found.
If *lineptr is set to NULL before the call, then getline() will allocate a buffer for storing the line. This buffer should be freed by the user program even if getline() failed.
bool read_file(const char *filename)
{
    FILE *file = fopen(filename, "r");
    if (!file)
        return false;

    char *line = NULL;
    size_t linesize = 0;
    while (getline(&line, &linesize, file) != -1) {
        printf("%s", line);
    }
    free(line);   /* free once, after the loop; getline() reuses and grows the buffer */
    fclose(file);
    return true;
}
You can use it like this:
int main(void)
{
    if (!read_file("test.txt")) {
        printf("Error reading file\n");
        exit(EXIT_FAILURE);
    }
}
I use this version
char* read(const char* filename){
    FILE* f = fopen(filename, "rb");
    if (f == NULL){
        exit(1);
    }
    fseek(f, 0L, SEEK_END);
    long size = ftell(f)+1;
    fclose(f);

    f = fopen(filename, "r");
    void* content = memset(malloc(size), '\0', size);
    fread(content, 1, size-1, f);
    fclose(f);

    return (char*) content;
}
You could read the entire file with dynamic memory allocation, but that isn't a good idea, because if the file is too big you could run into memory problems.
So it is better to read short parts of the file and print them.
#include <stdio.h>
#define BLOCK 1000

int main() {
    FILE *f=fopen("teste.txt","r");
    int size;
    char buffer[BLOCK];
    // ...
    while((size=fread(buffer,sizeof(char),BLOCK,f))>0)
        fwrite(buffer,sizeof(char),size,stdout);
    fclose(f);
    // ...
    return 0;
}
