Calling fopen gives malloc error at runtime - c

I'm reading a file's contents into a string with this function
void readFile2String(char **string, char location[]){
    FILE *fileList;
    int size;
    char *temp;
    char command[1024];

    if((fileList = fopen(location, "r")) == NULL){
        perror(" Error opening list of directories: ");
        exit(2);
    }
    fseek(fileList, 0, SEEK_END);
    size = ftell(fileList);
    rewind(fileList);
    temp = malloc((size + 1) * sizeof(char));
    fread(temp, sizeof(char), size, fileList);
    temp[size] = 0; // Terminate string with 0
    *string = temp;
    fclose(fileList);
}
and I'm using the string for further manipulation. I'm calling this function as
char *temp;
readFile2String(&temp, fileName);
I successfully get the file contents in the string. But when, at a later point, I try to use fopen again, I get a malloc error at runtime. I've tried commenting out the call to this function, and after that I can use fopen as many times as I want. What is wrong with my function?
Thanks.

"fopen()" isn't causing the memory corruption' Neither is failing to "free()" (if in fact you're not doing a free().
You're validating the return from "fopen()" - good. Q: Why aren't you also checking fseek(), ftell(), rewind() and fread() for error conditions?
My guess is temp[size]=0; might be the culprit that actually causes the memory corruption. Or perhaps fread(). Knowing "size" would definitely be useful.
SUGGESTION:
Carefully step through with a debugger, and validate your I/O returns each step of the way.
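To illustrate the suggestion, here is a sketch of the same function with every I/O return validated (same interface and error-handling style as the question's code; whether any of these checks fires is exactly what the debugger walk-through would tell you):

```c
#include <stdio.h>
#include <stdlib.h>

/* Sketch of the question's function with each I/O call checked.
   Error paths exit(2), matching the original's style. */
void readFile2String(char **string, char location[])
{
    FILE *fileList;
    long size;
    char *temp;

    if ((fileList = fopen(location, "r")) == NULL) {
        perror(" Error opening list of directories: ");
        exit(2);
    }
    if (fseek(fileList, 0, SEEK_END) != 0) {
        perror("fseek");
        exit(2);
    }
    if ((size = ftell(fileList)) < 0) {   /* ftell returns -1L on error */
        perror("ftell");
        exit(2);
    }
    rewind(fileList);                     /* rewind itself reports no error */
    if ((temp = malloc(size + 1)) == NULL) {
        perror("malloc");
        exit(2);
    }
    if (fread(temp, 1, size, fileList) != (size_t)size) {
        fprintf(stderr, "short read\n");
        exit(2);
    }
    temp[size] = 0;   /* Terminate string with 0 */
    *string = temp;
    fclose(fileList);
}
```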

Related

free()-ing a char* stops working after sscanf()?

I'm having some trouble with memory issues. The issue is that when the line is freed (free(line)), there is a
free(): invalid size error.
From what I know, sscanf doesn't modify the string that is passed into it. Weirdly enough, the free(line) inside the if statement works fine. I'm not sure what the problem is, because I've freed the char* like this in other parts of my program without issues, albeit without the sscanf call. Any help would be appreciated.
char* line;
read_line(read, &line, 0);
printf("%s\n", line); // gives "playinfoA/30"
char playerLetter[1];
char numberOfPlayers[2];
char temp[1];
if (sscanf(line, "playinfo%1s%1s%s", playerLetter, temp,
        numberOfPlayers) != 3) {
    free(line);
    return -1;
}
//free(line);
return 0;
If your problem is that free(line) doesn't work when the if block fails, then you might want to check whether line is actually pointing to something.
Since you did not initialize line, which is a pointer to char, the only other way it can come to point at some memory location is the call to read_line.
Now, I'm not sure what read_line does, but try passing line instead of &line?

How to solve a C program raising a malloc error with opened files?

I have a program to read a file (given as a parameter in argv) line by line and output it to a different file. Here is the code:
#include <stdlib.h>
#include <stdio.h>
#include <string.h>

#define CAT_PREFIX_LEN 4
#define CAT_PREFIX "bsh_"

int main(int argc, char** argv) {
    FILE *toread, *towrite;
    char *line;
    size_t len = 0;
    ssize_t read;
    char catfile[CAT_PREFIX_LEN + strlen(argv[1]) + 1];
    // Opening file to read
    toread = fopen(argv[1], "r");
    // Create output file name
    strcpy(catfile, CAT_PREFIX);
    strcat(catfile, argv[1]);
    // Opening file to write
    towrite = fopen(catfile, "w");
    while((read = getline(&line, &len, toread)) != -1) {
        fprintf(towrite, line);
    }
    fclose(toread);
    fclose(towrite);
}
However if I try to run this program with a test file test.txt I get the error:
convert_script(21115,0x7fffcb3f43c0) malloc: *** error for object 0x10caca4c0: pointer being realloc'd was not allocated
*** set a breakpoint in malloc_error_break to debug
Abort trap: 6
I've tried multiple things. The first was to comment out the whole while loop; if I do, the error goes away. I also tried removing the opening, closing and writing to test.txt; if I remove that, the error also goes away. I don't really know what could be causing it. I am running it on a MacBook Air and compiling it with gcc.
char *line;
is not initialized and contains an indeterminate value.
The Linux getline() man page says
If *lineptr is set to NULL and *n is set 0 before the call, then
getline() will allocate a buffer for storing the line. This buffer
should be freed by the user program even if getline() failed.
Since you're on a Mac, the POSIX getline() standard adds this:
The application shall ensure that *lineptr is a valid argument that could be passed to the free() function. If *n is non-zero,
the application shall ensure that *lineptr either points to an
object of size at least *n bytes, or is a null pointer.
For strict compliance with the POSIX standard, you should initialize line to NULL:
char *line = NULL;
Edit: The official Apple getline() man page states:
The caller may provide a pointer to a malloced buffer for the line in
*linep, and the capacity of that
buffer in *linecapp. These functions expand the buffer as needed, as if via realloc(). If linep
points to a NULL pointer, a new buffer will be allocated. In either case, *linep and *linecapp will be
updated accordingly.
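Putting the answer's advice together, here is a minimal sketch (a hypothetical copy_lines helper rather than the question's exact main) with line initialized to NULL as the man pages require, the getline buffer freed afterwards, and the data never passed as a format string:

```c
#define _POSIX_C_SOURCE 200809L
#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>

/* Sketch: copy one already-opened stream to another with getline(). */
int copy_lines(FILE *toread, FILE *towrite)
{
    char *line = NULL;   /* must be NULL (or a malloc'd buffer) */
    size_t len = 0;      /* must be 0 when line is NULL */
    ssize_t nread;

    while ((nread = getline(&line, &len, toread)) != -1)
        fputs(line, towrite);   /* not fprintf(towrite, line) */

    free(line);   /* getline's buffer is the caller's to free */
    return 0;
}
```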

C program just exits after trying to write to stdout

So I've written a small function (part of a larger program), and when I run it and input "GET", it exits with a value of 1. To be honest, I'm still grasping the concepts of open, read and write to stdout, and I'm not sure what I've done wrong here.
int input_arg()
{
    MainStruct val; // variables are loaded from a config file to this structure
    char *getInput;
    char *fileInput;
    FILE *loadfile;
    char buffer[1024];
    int n;
    int defaultFile = val.def; // success.txt value read when fileparser.c is run

    printf("http >> :");
    fflush(NULL);
    fscanf(stdin, "%s", getInput);
    if (getInput == "GET")
    {
        loadfile = fopen(defaultFile, "r");
        if (loadfile == NULL)
        {
            fprintf(stderr, "error loading default resource: PROGRAM WILL EXIT");
            exit(0);
        }
        while ((n = read(loadfile, buffer, sizeof(buffer))) > 0) // reads file (not sure this should be a while loop)
        {
            if ((write(STDOUT_FILENO, buffer, n)) < 0) // writes to stdout
            {
                perror("failed to display file to output");
                close(loadfile);
                exit(1);
            }
        }
    }
}
For compiling purposes, the val.def pointer is a string, as below:
char defaultFile = "success.txt";
I'm unsure what I'm missing here. I tried changing the structure pointer to a simple char string to see if it was anything there, but it didn't actually make any difference. I think the problem is with the while loop... I don't think it should be there, but I have yet to find an example where a while loop ISN'T used in a write-to-stdout scenario.
Thanks
It crashes because you have not allocated any memory for getInput to point at, so the program crashes when it attempts to follow the pointer, which does not point to anything useful.
Either allocate memory dynamically, for example with malloc, or replace it with a static buffer.
Also, you may want to look at strcmp for comparing strings. Comparing strings in C with == does not compare their contents; it only compares the pointers to them.
fscanf(stdin,"%s", getInput);
getInput is never initialized or allocated memory. Fix it by allocating memory:
getInput = malloc(200);
Your program has serious issues; the most important one is that you are using fopen() with read(), and that is wrong.
The read() function takes an int as its first parameter, a file descriptor that you can create via the open() function, not fopen(), which returns a FILE * object. So change[1]
FILE *loadFile;
to
int loadFile;
and
loadFile = fopen(defaultFile, "r");
to
loadFile = open(defaultFile, O_RDONLY);
and then to check for failure
if (loadFile == -1) /* it failed to open check errno? perhaps... */
You must enable compiler warnings to catch this kind of mistake, because the first parameter of read() in your program is of an incompatible type.
The fscanf() function expects a valid pointer for each "%s" specifier; you are passing an uninitialized pointer to it, and dereferencing it inside of fscanf() is undefined behavior.
You need to allocate space for it, something like this should work
char inputBuffer[100];
if (fscanf(stdin, "%99s", inputBuffer) != 1)
thereWasAProblemGettingInput_DoNotUse_inputBuffer_InTheCodeThatFollows();
Note that:
I used inputBuffer as a name for the variable; this doesn't affect compilation or execution at all, but readability matters.
Used "%99s" to prevent buffer overflow.
Checked the value returned by fscanf() to make sure that the inputBuffer has valid data and was properly initialized.
String comparison in C is not like in many other languages. In your code,
if (getInput == "GET")
is comparing the addresses of getInput and the string literal "GET", which will not be the same unless you make getInput point to "GET". Since you want to compare the contents, you need
if (strcmp(inputBuffer, "GET") == 0)
instead, and do not forget to include the string.h header.
[1] Note that loadFile is also a bad choice for a variable name; it sounds like a function name. inputFile would be more appropriate.
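Combining the fixes above, a sketch of the corrected input path (dump_file_if_get is a hypothetical helper for illustration; the real code would get the command from fscanf into a bounded buffer as shown earlier):

```c
#include <stdio.h>
#include <string.h>
#include <fcntl.h>
#include <unistd.h>

/* Sketch: strcmp for the comparison, and open()/read() with a
   file descriptor instead of fopen()'s FILE *. */
int dump_file_if_get(const char *command, const char *path)
{
    char buffer[1024];
    ssize_t n;
    int fd;

    if (strcmp(command, "GET") != 0)   /* == would compare pointers */
        return -1;

    if ((fd = open(path, O_RDONLY)) == -1) {
        perror("open");
        return -1;
    }
    while ((n = read(fd, buffer, sizeof(buffer))) > 0) {
        if (write(STDOUT_FILENO, buffer, n) < 0) {
            perror("failed to display file to output");
            close(fd);
            return -1;
        }
    }
    close(fd);
    return 0;
}
```

The while loop itself is correct, by the way: read() may return fewer bytes than requested, so looping until it returns 0 (end of file) is the normal pattern.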

Weird segfault after open file?

I'm getting a segfault when I try to print fname. Can someone explain why this is happening? Is it that I'm not allowed to write to a file, close it, and then read from a file? argv[2] is defined. I've tried with multiple different allocations.
int main(int argc, char *argv[]){
    // Other stuff
    char *rfile = calloc(1, 2049);
    strcpy(rfile, argv[2]);
    FILE *wfile;
    wfile = fopen(argv[2], "w");
    SaveIndexToFile(index, wfile, alpha); // This does a whole bunch of writing to the file.
    fclose(wfile);
    // alpha is declared here, and works just fine.
    RemakeIndex(rfile, alpha);
    return 1;
}

int RemakeIndex(char *fname, WordList *alpha){
    printf("%s", fname);
    return 1;
}
You are not checking the return value of fopen. If fopen fails, it can return NULL, and doing anything with that NULL is undefined behavior. Place this check after opening the file:
if (wfile == NULL){
    perror("fopen");
    return 1;
}
And check whether the argc count is three. If you do not give arguments to ./a.out, then accessing argv[2] can also lead to a segmentation fault.
Is it that I'm not allowed to write to a file, close it, and then read from a file?
Yes, you are not allowed to read from a file [stream] after it has been closed.
Note on the (OP's) wording:
char * rfile is called a "pointer to char".
FILE * is called a "file pointer" (or also just "pointer to FILE"), or commonly (but formally wrong) just a "file".
Also, RemakeIndex() is called in main() without proper prototyping.
To fix this
either add a prototype before main():
int RemakeIndex(char *, WordList *);
or move the whole implementation of RemakeIndex() before main().
Also the printf() calls' output might not show up immediately on the console, as stdout is line buffered.
To fix this
either print out a trailing new-line:
printf("%s\n", fname);
or printf to stderr, which isn't line buffered by default:
fprintf(stderr, "%s\n", fname);
or flush stdout after having printed to it:
printf("%s\n", fname);
fflush(stdout);
Prototyping the function is very important, the GCC compiler will assume an implicitly declared function (RemakeIndex in your code) has two arguments which are both int, which would make your code look like this:
int RemakeIndex(int fname, int alpha) {
printf("%s", (char *)fname);
return 1;
}
On a 64-bit machine with GCC, where pointers are 64 bits and ints are 32 bits, your arguments will be truncated to 32 bits, which is likely to cause a segfault. The other answers have mentioned prototyping the function, and if you are using a 64-bit compiler I would suggest that this is your problem.
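A minimal sketch of the prototype fix (WordList is left as an opaque placeholder type, and run stands in for main's call site, purely for illustration):

```c
#include <stdio.h>

typedef struct WordList WordList;   /* opaque placeholder for this sketch */

int RemakeIndex(char *fname, WordList *alpha);   /* prototype before first use */

int run(void)
{
    /* With the prototype visible, the compiler passes real 64-bit pointers;
       an implicitly declared function would be assumed to take ints. */
    return RemakeIndex("index.dat", NULL);
}

int RemakeIndex(char *fname, WordList *alpha)
{
    (void)alpha;
    printf("%s\n", fname);   /* trailing '\n' so line-buffered stdout flushes */
    return 1;
}
```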
It ended up being that I was allocating way too much memory on the heap. I had a loop that was allocating strings of max_length of an unsigned int. Thank you for all of your comments and help!

When does the strdup function fail?

I have the following code, which uses the strdup function:
#include <stdlib.h>
#include <stdio.h>
#include <string.h>

char source[] = "The Source String ";

int main()
{
    char *dest;
    if ((dest = _strdup(source)) == NULL)
    {
        fprintf(stderr, " Error allocation memory. ");
        exit(1);
    }
    printf("The destination = %s\n", dest);
    return 0;
}
It successfully says The Source String, but I am interested in which situations it fails and how good its usage is in day-to-day problems. I know that strdup is defined by
char *strdup (const char *s)
{
    char *d = malloc (strlen (s) + 1);   // Space for length plus nul
    if (d == NULL) return NULL;          // No memory
    strcpy (d, s);                       // Copy the characters
    return d;                            // Return the new string
}
If our string is not NULL, is there any chance of strdup failing?
Yes, if malloc fails to allocate memory and returns NULL.
This could reasonably happen when you're trying to duplicate a very large string, or if your address space is very fragmented and nearly full (so that malloc can't find a contiguous block of memory to allocate), or on embedded systems where not much memory is available.
The chance of strdup failing is determined by the chance of malloc failing. On modern operating systems with virtual memory, a malloc failure is a very rare thing. The OS may have even killed your entire process before the system gets so low on memory that malloc has to return NULL.
It's not unheard of to run out of memory if there is a memory leak.
So it's not a bad idea to check for NULL, print an error message, and maybe even exit at that point.
Note that things like printf may not work (in my experience they don't) if you run out of memory. So you have to use the low-level write or similar, and the file descriptor you're using (if you're writing to a log file) should already be opened.
