Memory leak with getline and strsep - c

I get a memory leak while using getline together with strsep. I know strsep modifies line - could this be the cause, so that line is not freed correctly?
FILE *file = fopen("keywords.txt", "r");
if (file) {
    char *line = NULL;
    size_t len = 0;
    ssize_t read;
    while ((read = getline(&line, &len, file)) != -1) { // Line 35
        char *token;
        while ((token = strsep(&line, "\t")) != NULL) {
            // Do stuff
        }
    }
    free(line);
    fclose(file);
}
Valgrind returns this:
==6094== 4,680 bytes in 39 blocks are definitely lost in loss record 7 of 7
==6094== at 0x4C2AB80: malloc (in /usr/lib/valgrind/vgpreload_memcheck-amd64-linux.so)
==6094== by 0x51AEBB4: getdelim (iogetdelim.c:66)
==6094== by 0x4009B3: read_keywords (main.c:35)
==6094== by 0x400959: renew_init (main.c:64)
==6094== by 0x400A48: main (main.c:68)
If I comment out strsep, there's no memory leak.
Tips?

When you pass &line to strsep, it will change the value of line. At the end of the inner loop, line will be NULL and free(line) will do nothing. This will also cause getline to allocate a new buffer instead of reusing the current one.
You should copy line to a new variable, e.g. char *line2 = line; and pass &line2 to strsep.


Why do I have to malloc the buffer each time I call 'getline' on a new file pointer?

I have a daemon written in C that reads a file every INTERVAL seconds line by line and does some work on each line if it matches the criteria. I can't leave the file open since the file is being accessed and modified by another API, so this is a snippet of the code to show the issue:
#include <stdlib.h>
#include <stdio.h>
#include <unistd.h>

#define INTERVAL 5

int main(int argc, char *argv[])
{
    size_t len = 100;
    FILE *fp;
    char *line = NULL;
    line = malloc(len * sizeof(char));
    size_t read;
    int iteration = 1;
    while (1)
    {
        printf("iteration %d\n", iteration++);
        fp = fopen("input.txt", "r+");
        if (fp == NULL)
            exit(EXIT_FAILURE);
        while ((read = getline(&line, &len, fp)) != -1)
        {
            printf("line is: %s\n", line);
            // Do some more work with the data.
            // If the password has expired remove it from the file.
        }
        fclose(fp);
        if (line)
        {
            free(line);
        }
        sleep(INTERVAL);
    }
}
When running this code, it results in this:
iteration 1
line is: 1656070481 qwerty12345
line is: 1656070482 qwerty
iteration 2
line is: 1656070481 qwerty12345
line is: 1656070482 qwerty
daemon(37502,0x1027e4580) malloc: *** error for object 0x600003ca8000: pointer being freed was not allocated
daemon(37502,0x1027e4580) malloc: *** set a breakpoint in malloc_error_break to debug
zsh: abort ./daemon
So the problem is in this line:
if (line)
{
free(line);
}
It looks like somehow the pointer is being deallocated somewhere inside the while loop. So, to solve this, I have to call the malloc function at the start of the while(1) loop to reallocate the memory with each iteration, like so:
while (1)
{
    printf("iteration %d\n", iteration++);
    line = malloc(len * sizeof(char));
This solves the issue, but I want to understand why the issue happened in the first place. Why is the pointer being deallocated after the second iteration?
have a daemon written in C
Then call daemon(3).
so the problem is in this line:
if (line) {
    free(line);
}
Yes. So you need to clear line:
if (line) {
    free(line);
    line = NULL;
    len = 0;
}
On the next loop, line will be NULL and you won't (incorrectly) free it twice.
(I wouldn't bother freeing inside the loop, but this is just my opinion.)
Of course, before getline you want to allocate line, so before
while ((read = getline(&line, &len, fp)) != -1)
code:
if (!line) {
    line = malloc(100);
    if (line)
        len = 100;
}
Consider using Valgrind and in some cases Boehm conservative GC (or my old Qish GC downloadable here). You may then have to avoid getline and use simpler and lower-level alternatives (e.g., read(2) or simply fgetc(3)...).
Read the GC handbook.
Perhaps use Frama-C on your source code.
(Consider also using Rust, Go, C++, OCaml, or Common Lisp/SBCL to entirely recode your daemon.)
On Linux, see also inotify(7).

Invalid write of size due to strcat

I have this code for an assignment where I want to create and open a file with the same base name as an input file given via command-line argument, but with a different extension (for example: I pass the file "filename.in" in the terminal via argv and I want to create and open the file "filename.out"). However, I keep getting the error "Invalid write of size 1" and a few others when I run my program in valgrind, and I really cannot see where they are coming from.
Code:
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
#include "roap.h"

int main(int argc, char** argv) {
    FILE* filePtr;
    LabList* head = NULL;
    if (argc != 3) {
        printf("Numero de argumentos errado! \n");
        exit(EXIT_FAILURE);
    }
    char flag[] = "-s";
    if ( strcmp(argv[1] , flag) != 0 )
    {
        fprintf(stdout, "Flag '-s' necessária para esta fase do projeto!"); // checks that the -s flag is present
        exit(EXIT_FAILURE);
    }
    char *file_arg /* argument given in the terminal referring to the file */, *filename /* the file's "own" name */, *file_arg_aux;
    char dot = '.';
    char ponto[] = ".";
    char *extencao;
    int read_ok;
    file_arg = (char*) calloc(1, strlen(argv[2]) +1 ); // checks that the extension really is .in1
    file_arg_aux = (char*) calloc(1, strlen(argv[2]) +1 );
    strcpy(file_arg, argv[2]);
    strcpy(file_arg_aux, argv[2]);
    filename = strtok(file_arg, ponto);
    extencao = strrchr(file_arg_aux, dot);
    if ((read_ok = strcmp(extencao, ".in1")) != 0 )
    {
        fprintf(stdout, "Extensão inválida!");
        exit(EXIT_FAILURE);
    }
    filePtr = fopen(argv[2], "r");
    if (filePtr == NULL) {
        printf("Erro ao abrir o ficheiro %s !\n", argv[2]);
        exit(EXIT_FAILURE);
    } else {
        head = readLab(filePtr, head);
        fclose(filePtr);
    }
    char extencao_out[] = ".sol1";
    FILE* file_out = fopen( strcat(filename, extencao_out) , "w");
Valgrind output:
==123== Invalid write of size 1
==123== at 0x483EC5E: strcat (in /usr/lib/x86_64-linux-gnu/valgrind/vgpreload_memcheck-amd64-linux.so)
==123== by 0x1095B2: main (main.c:60)
==123== Address 0x4a47051 is 0 bytes after a block of size 17 alloc'd
==123== at 0x483DD99: calloc (in /usr/lib/x86_64-linux-gnu/valgrind/vgpreload_memcheck-amd64-linux.so)
==123== by 0x10944E: main (main.c:36)
==123==
==123== Syscall param openat(filename) points to unaddressable byte(s)
==123== at 0x4962D1B: open (open64.c:48)
==123== by 0x48E5195: _IO_file_open (fileops.c:189)
==123== by 0x48E5459: _IO_file_fopen@@GLIBC_2.2.5 (fileops.c:281)
==123== by 0x48D7B0D: __fopen_internal (iofopen.c:75)
==123== by 0x48D7B0D: fopen@@GLIBC_2.2.5 (iofopen.c:86)
==123== by 0x1095C1: main (main.c:60)
==123== Address 0x4a47051 is 0 bytes after a block of size 17 alloc'd
==123== at 0x483DD99: calloc (in /usr/lib/x86_64-linux-gnu/valgrind/vgpreload_memcheck-amd64-linux.so)
==123== by 0x10944E: main (main.c:36)
Line 36: file_arg = (char*) calloc(1, strlen(argv[2]) +1 );
Line 60: FILE* file_out = fopen( strcat(filename, extencao_out) , "w");
This ...
file_arg = (char*) calloc(1, strlen(argv[2]) +1 )
... allocates exactly enough space for a copy of argv[2], and assigns it to file_arg. That's well and good, and it is afterward ok to strcpy(file_arg, argv[2]).
Given the value of ponto, this ...
filename = strtok(file_arg, ponto);
... truncates the string to which file_arg points by overwriting the first '.' (if any) with a string terminator, and it returns a copy of (pointer) file_arg. That's ok in itself.
But then here:
FILE* file_out = fopen( strcat(filename, extencao_out) , "w");
strcat(filename, extencao_out) attempts to append the contents of string extencao_out (".sol1") in place of the original extension, which you already verified, a bit awkwardly, was ".in1". Because exactly enough space was allocated for the original file name, and no more, there is not enough room to accommodate the longer name the program is now trying to construct. The allocated block is overrun by one byte, just as Valgrind tells you.
I would suggest moving the declaration of extencao_out far enough earlier that you can instead allocate like so:
file_arg = (char*) calloc(1, strlen(argv[2]) + strlen(extencao_out) + 1);
That will overallocate by four bytes with your present combination of extensions, but
four bytes is negligible;
it probably will place no extra memory burden at all on the system about 75% of the time; and
it will be flexible towards other behavioral variations you might later want to support, such as input file names without extensions.
Declare the variables on separate lines, with comments; each of these is a pointer, and none of them has any memory reserved yet.
char* file_arg; /*argumento indicado no terminal para referir ao ficheiro*/
char* filename; /*nome "próprio" do ficheiro*/
char* file_arg_aux;
char* extencao;
Here you assign memory to file_arg, but only enough to hold the original argv string contents (plus null-terminator). Since you later reuse this buffer, you might just assign enough extra space for future needs.
file_arg = (char*) calloc(1, strlen(argv[2]) +1 ); //verifica se de facto a extensão é .in1
file_arg_aux = (char*) calloc(1, strlen(argv[2]) +1 );
Just use strdup() instead of calloc() and strcpy(),
strcpy(file_arg, argv[2]);
strcpy(file_arg_aux, argv[2]);
These functions both find the '.'/"." in the filename/file_arg, which you later use to append the new file extension (extencao?),
filename = strtok(file_arg, ponto);
extencao = strrchr(file_arg_aux, dot);
Here is your expected extension, which has length=4,
strcmp(extencao, ".in1")
Here is your new file extension, which has length=5,
char extencao_out[] = ".sol1";
Build your out filename with enough space to hold the filename and the new extension,
char* outfilename = calloc( strlen(filename) + strlen(extencao_out) + 1, 1 );
strcpy(outfilename, filename);
strcat(outfilename, extencao_out);
FILE* file_out = fopen( outfilename, "w");

Freeing array of dynamic strings / lines in C

I am writing a program that sorts the lines from an input text file. It does its job; however, I get memory leaks under valgrind.
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

char* getline(FILE * infile)
{
    int size = 1024;
    char * line = (char*)malloc(size);
    int temp;
    int i = 0;
    do
    {
        (temp = fgetc(infile));
        if (temp != EOF)
            line[i++] = (char)temp;
        if (i >= size)
        {
            size *= 2;
            line = (char*)realloc(line, size);
        }
    } while (temp != '\n' && temp != EOF);
    if (temp == EOF)
        return NULL;
    return line;
}

void print_line(char * line)
{
    printf("%s", line);
}

int myCompare(const void * a, const void * b)
{
    const char *pa = *(const char**)a;
    const char *pb = *(const char**)b;
    return strcmp(pa, pb);
}

int main(int argc, char* argv[])
{
    FILE * infile;
    FILE * outfile;
    infile = fopen(argv[1], "r");
    if (infile == NULL)
    {
        printf("Error");
        exit(3);
    }
    outfile = fopen(argv[2], "w");
    if (outfile == NULL)
    {
        printf("Error");
        exit(3);
    }
    char * line;
    char **all_lines;
    int nlines = 0;
    while ((line = getline(infile)) != NULL)
    {
        print_line(line);
        nlines++;
    }
    all_lines = malloc(nlines * sizeof(char*));
    rewind(infile);
    int j = 0;
    printf("%d\n\n", nlines);
    while ((line = getline(infile)) != NULL)
    {
        all_lines[j] = line;
        j++;
    }
    qsort(all_lines, nlines, sizeof(char*), myCompare);
    for (int i = 0; i < nlines; i++)
    {
        print_line(all_lines[i]);
        fprintf(outfile, "%s", all_lines[i]);
    }
    for (int i = 0; i < nlines; i++)
    {
        free(all_lines[i]);
    }
    free(all_lines);
    fclose(infile);
    fclose(outfile);
    return 0;
}
Any ideas where they might come from? I loop over all_lines[] and free the content, then free all_lines itself.
UPDATE:
Ok, so I've made the updates you suggested. However, valgrind now reports errors for the fprintf calls in my program. Here is what it says:
11 errors in context 2 of 2:
==3646== Conditional jump or move depends on uninitialised value(s)
==3646== at 0x40BA4B1: vfprintf (vfprintf.c:1601)
==3646== by 0x40C0F7F: printf (printf.c:35)
==3646== by 0x80487B8: print_line (sort_lines.c:44)
==3646== by 0x804887D: main (sort_lines.c:77)
==3646== Uninitialised value was created by a heap allocation
==3646== at 0x4024D12: realloc (vg_replace_malloc.c:476)
==3646== by 0x8048766: getline (sort_lines.c:30)
==3646== by 0x804889A: main (sort_lines.c:75)
I would like to know why it reports an error on simply fprintf-ing those lines to a text file. I've read that it may be something concerning a gcc optimization turning fprintf into fputs, but I don't get the idea.
There are multiple problems in your code:
function getline:
the string in the line buffer is not properly '\0' terminated at the end of the do / while loop.
It does not free the line buffer upon end of file, hence memory leak.
It does not return a partial line at end of file if the file does not end with a '\n'.
neither malloc nor realloc return values are checked for memory allocation failure.
You should realloc the line to the actual size used to reduce the amount of memory consumed. Currently, you allocate at least 1024 bytes per line. Sorting the dictionary will require 100 times more memory than needed!
function MyCompare:
The lines read include the '\n' at the end. Comparison may yield unexpected results: "Hello\tworld\n" will come before "Hello\n". You should strip the '\n' in getline and modify the printf formats appropriately.
function main:
You do not check if command line arguments are actually provided before trying to open them with fopen, invoking undefined behaviour
The first loop that counts the number of lines does not free the lines returned by getline... big memory leak!
The input stream cannot always be rewound. If your program is given its input via a pipe, rewind will fail except for very small input sizes.
The number of lines read in the second loop may differ from the count from the first loop if the file was modified asynchronously by another process. If the file grew, you invoke undefined behaviour when you load it; if it shrank, you invoke undefined behaviour when you sort the array. You should reallocate the all_lines array as you read the lines in a single loop.
Printing the lines before sort is not very useful and complicates testing.

How to read in two text files and count the amount of keywords?

I have tried looking around, but to me files are the hardest thing to understand so far as I am learning C, especially text files; binary files were a bit easier. Basically, I have to read in two text files, both containing words formatted like this: "hard, working,smart, works well, etc..". I am supposed to compare the text files and count the keywords. I would show some code, but honestly I am lost, and the only thing I have down is just nonsense besides this.
#include <time.h>
#include <stdlib.h>
#include <stdio.h>

#define SIZE 1000

void resumeRater();

int main()
{
    int i;
    int counter = 0;
    char array[SIZE];
    char keyword[SIZE];
    FILE *fp1, *fp2;
    int ch1, ch2;
    errno_t result1 = fopen_s(&fp1, "c:\\myFiles\\resume.txt", "r");
    errno_t result2 = fopen_s(&fp2, "c:\\myFiles\\ideal.txt", "r");
    if (fp1 == NULL) {
        printf("Failed to open");
    }
    else if (fp2 == NULL) {
        printf("Failed to open");
    }
    else {
        result1 = fread(array, sizeof(char), 1, fp1);
        result2 = fread(keyword, sizeof(char), 1, fp2);
        for (i = 0; i < SIZE; i++)
        {
            if (array[i] == keyword[i])
            {
                counter++;
            }
        }
        fclose(fp1);
        fclose(fp2);
        printf("Character match: %d", counter);
    }
    system("pause");
}
When you have a situation where you are doing a multiple of something (like reading 2 files), it makes a lot of sense to plan ahead. Rather than muddying the body of main with all the code necessary to read 2 text files, create a function that reads the text file for you and have it return an array containing the lines of the file. This really helps you concentrate on the logic of what your code needs to do with the lines rather than filling space with getting the lines in the first place. There is nothing wrong with cramming it all into one long main, but from a readability, maintenance, and program-structure standpoint, it makes everything more difficult.
If you structure the read function well, you can reduce your main to the following. This reads both text files into character arrays and provides the number of lines read, in a total of 4 lines (plus the check to make sure you provided two filenames to read):
int main (int argc, char **argv) {

    if (argc < 3 ) {
        fprintf (stderr, "error: insufficient input, usage: %s <filename1> <filename2>\n", argv[0]);
        return 1;
    }

    size_t file1_size = 0; /* placeholders to be filled by readtxtfile */
    size_t file2_size = 0; /* for general use, not needed to iterate   */

    /* read each file into an array of strings,
       number of lines read, returned in file_size */
    char **file1 = readtxtfile (argv[1], &file1_size);
    char **file2 = readtxtfile (argv[2], &file2_size);

    return 0;
}
At that point you have all your data and you can work on your keyword code. Reading from text files is a very simple matter. You just have to get comfortable with the tools available. When reading lines of text, the preferred approach is to use line input to read an entire line at a time into a buffer. You then parse the buffer to get what it is you need. The line-input tools are fgets and getline. Once you have read the line, you have tools like strtok, strsep or sscanf to separate what you want from the line. Both fgets and getline read the newline at the end of each line as part of their input, so you may need to remove the newline to meet your needs.
Storing each line read is generally done by declaring a pointer to an array of char* pointers. (e.g. char **file1;) You then allocate memory for some initial number of pointers. (NMAX in the example below) You then access the individual lines in the file as file1_array[n] when n is the line index 0 - lastline of the file. If you have a large file and exceed the number of pointers you originally allocated, you simply reallocate additional pointers for your array with realloc. (you can set NMAX to 1 to make this happen for every line)
What you use to allocate memory, and how you reallocate, can influence how you make use of the arrays in your program. Careful choices of calloc to initially allocate your arrays, and then using memset when you reallocate to set all unused pointers to 0 (null), can really save you time and headache. Why? Because, to iterate over your array, all you need to do is:
n = 0;
while (file1[n]) {
    /* do something with file1[n] */
    n++;
}
When you reach the first unused pointer (i.e. the first file1[n] that is 0), the loop stops.
Another very useful function when reading text files is strdup. strdup will automatically allocate space for line using malloc, copy line to the newly allocated memory, and return a pointer to the new block of memory. This means that all you need to do to allocate space for each pointer and copy the line read by getline into your array is:
file1[n] = strdup (line);
That's pretty much it. You have read your file, filled your array, and know how to iterate over each line in the array. What is left is cleaning up and freeing the allocated memory when you no longer need it. By making sure that your unused pointers are 0, this too is a snap. You simply iterate over your file1[n] pointers again, freeing them as you go, and then free (file1) at the end. You're done.
This is a lot to take in, and there are a few more things to it. On the initial read of the file, if you noticed, we also declare a file1_size = 0; variable, and pass its address to the read function:
char **file1 = readtxtfile (argv[1], &file1_size);
Within readtxtfile, the value at the address of file1_size is incremented by 1 each time a line is read. When readtxtfile returns, file1_size contains the number of lines read. As shown, this is not needed to iterate over the file1 array, but you often need to know how many lines you have read.
To put this all together, I created a short example of the functions to read two text files, print the lines in both and free the memory associated with the file arrays. This explanation ended up longer than I anticipated. So take time to understand how it works, and you will be a step closer to handling textfiles easily. The code below will take 2 filenames as arguments (e.g. ./progname file1 file2) Compile it with something similar to gcc -Wall -Wextra -o progname srcfilename.c:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define NMAX 256

char **readtxtfile (char *fn, size_t *idx);
char **realloc_char (char **p, size_t *n);
void prn_chararray (char **ca);
void free_chararray (char **ca);

int main (int argc, char **argv) {

    if (argc < 3 ) {
        fprintf (stderr, "error: insufficient input, usage: %s <filename1> <filename2>\n", argv[0]);
        return 1;
    }

    size_t file1_size = 0; /* placeholders to be filled by readtxtfile */
    size_t file2_size = 0; /* for general use, not needed to iterate   */

    /* read each file into an array of strings,
       number of lines read, returned in file_size */
    char **file1 = readtxtfile (argv[1], &file1_size);
    char **file2 = readtxtfile (argv[2], &file2_size);

    /* simple print function */
    if (file1) prn_chararray (file1);
    if (file2) prn_chararray (file2);

    /* simple free memory function */
    if (file1) free_chararray (file1);
    if (file2) free_chararray (file2);

    return 0;
}

char** readtxtfile (char *fn, size_t *idx)
{
    if (!fn) return NULL;   /* validate filename provided       */
    char *ln = NULL;        /* NULL forces getline to allocate  */
    size_t n = 0;           /* max chars to read (0 - no limit) */
    ssize_t nchr = 0;       /* number of chars actually read    */
    size_t nmax = NMAX;     /* check for reallocation           */
    char **array = NULL;    /* array to hold lines read         */
    FILE *fp = NULL;        /* file pointer to open file fn     */

    /* open / validate file */
    if (!(fp = fopen (fn, "r"))) {
        fprintf (stderr, "%s() error: file open failed '%s'.", __func__, fn);
        return NULL;
    }

    /* allocate NMAX pointers to char* */
    if (!(array = calloc (NMAX, sizeof *array))) {
        fprintf (stderr, "%s() error: memory allocation failed.", __func__);
        return NULL;
    }

    /* read each line from fp - dynamically allocated */
    while ((nchr = getline (&ln, &n, fp)) != -1)
    {
        /* strip newline or carriage rtn */
        while (nchr > 0 && (ln[nchr-1] == '\n' || ln[nchr-1] == '\r'))
            ln[--nchr] = 0;

        array[*idx] = strdup (ln); /* allocate/copy ln to array        */
        (*idx)++;                  /* increment value at index         */
        if (*idx == nmax)          /* if lines exceed nmax, reallocate */
            array = realloc_char (array, &nmax);
    }

    if (ln) free (ln);   /* free memory allocated by getline */
    if (fp) fclose (fp); /* close open file descriptor       */

    return array;
}

/* print an array of character pointers. */
void prn_chararray (char **ca)
{
    register size_t n = 0;
    while (ca[n])
    {
        printf (" arr[%3zu] %s\n", n, ca[n]);
        n++;
    }
}

/* free array of char* */
void free_chararray (char **ca)
{
    if (!ca) return;
    register size_t n = 0;
    while (ca[n])
        free (ca[n++]);
    free (ca);
}

/* realloc an array of pointers to strings setting memory to 0.
 * reallocate an array of character arrays setting
 * newly allocated memory to 0 to allow iteration
 */
char **realloc_char (char **p, size_t *n)
{
    char **tmp = realloc (p, 2 * *n * sizeof *p);
    if (!tmp) {
        fprintf (stderr, "%s() error: reallocation failure.\n", __func__);
        // return NULL;
        exit (EXIT_FAILURE);
    }
    p = tmp;
    memset (p + *n, 0, *n * sizeof *p); /* memset new ptrs 0 */
    *n *= 2;
    return p;
}
valgrind - Don't Forget To Check For Leaks
Lastly, anytime you allocate memory in your code, make sure you use a memory checker such as valgrind to confirm you have no memory errors and to confirm you have no memory leaks (i.e. allocated blocks you have forgotten to free, or that have become unreachable). valgrind is simple to use, just valgrind ./progname [any arguments]. It can provide a wealth of information. For example, on this read example:
$ valgrind ./bin/getline_readfile_fn voidstruct.c wii-u.txt
==14690== Memcheck, a memory error detector
==14690== Copyright (C) 2002-2012, and GNU GPL'd, by Julian Seward et al.
==14690== Using Valgrind-3.8.1 and LibVEX; rerun with -h for copyright info
==14690== Command: ./bin/getline_readfile_fn voidstruct.c wii-u.txt
==14690==
<snip - program output>
==14690==
==14690== HEAP SUMMARY:
==14690== in use at exit: 0 bytes in 0 blocks
==14690== total heap usage: 61 allocs, 61 frees, 6,450 bytes allocated
==14690==
==14690== All heap blocks were freed -- no leaks are possible
==14690==
==14690== For counts of detected and suppressed errors, rerun with: -v
==14690== ERROR SUMMARY: 0 errors from 0 contexts (suppressed: 2 from 2)
Pay particular attention to the lines:
==14690== All heap blocks were freed -- no leaks are possible
and
==14690== ERROR SUMMARY: 0 errors from 0 contexts (suppressed: 2 from 2)
You can ignore the (suppressed: 2 from 2) which just indicate I don't have the development files installed for libc.

C Memory leaks and Valgrind output

I am doing some learning with C, and am having trouble identifying a memory leak situation.
First, some code:
My main function:
#define FILE_NAME "../data/input.txt"

char * testGetLine( FILE * );
int testGetCount(void);

int main(void)
{
    int count = 0;
    FILE * fptr;
    if ((fptr = fopen(FILE_NAME, "r")) != NULL) {
        char * line;
        while ((line = testGetLine(fptr)) != NULL) {
            printf("%s", line);
            free(line); count++;
        }
        free(line); count++;
    } else {
        printf("%s\n", "Could not read file...");
    }
    // testing statements
    printf("testGetLine was called %d times\n", testGetCount());
    printf("free(line) was called %d times\n", count);
    fclose(fptr);
    return 0;
}
and my getline function:
#define LINE_BUFFER 500

int count = 0;

char * testGetLine(FILE * fptr)
{
    extern int count;
    char * line;
    line = malloc(sizeof(char) * LINE_BUFFER);
    count++;
    return fgets(line, LINE_BUFFER, fptr);
}

int testGetCount(void) {
    extern int count;
    return count;
}
My understanding is that I need to call free every time I call my testGetLine function, which I do. By my count, on a simple text file with four lines, I need to call free 5 times. I verify that with my testing statements in the following output:
This is in line 01
Now I am in line 02
line 03 here
and we finish with line 04
testGetLine was called 5 times
free(line) was called 5 times
What I am having trouble with is, valgrind says that I alloc 6 times, and am only calling free 5 times. Here is truncated output from valgrind:
HEAP SUMMARY:
in use at exit: 500 bytes in 1 blocks
total heap usage: 6 allocs, 5 frees, 3,068 bytes allocated
500 bytes in 1 blocks are definitely lost in loss record 1 of 1
at 0x4C2B3F8: malloc (in /usr/lib/valgrind/vgpreload_memcheck-amd64-linux.so)
by 0x4007A5: testGetLine (testGetLine.c:13)
by 0x400728: main (tester.c:16)
LEAK SUMMARY:
definitely lost: 500 bytes in 1 blocks
indirectly lost: 0 bytes in 0 blocks
possibly lost: 0 bytes in 0 blocks
still reachable: 0 bytes in 0 blocks
suppressed: 0 bytes in 0 blocks
I feel I am missing something with the memory management. Where is the 6th memory allocation that valgrind says I am using? and how should I free it?
Followup to implement Adrian's answer
testGetLine adjustment:
char * testGetLine(FILE * fptr)
{
    extern int count;
    char * line;
    line = malloc(sizeof(char) * LINE_BUFFER);
    count++;
    if (fgets(line, LINE_BUFFER, fptr) == NULL) {
        line[0] = '\0';
    }
    return line;
}
main while loop adjustment:
while ((line = testGetLine(fptr))[0] != '\0') {
    printf("%s", line);
    free(line); count++;
}
free(line); count++;
fgets return description:
On success, the function returns str. If the end-of-file is
encountered while attempting to read a character, the eof indicator is
set (feof). If this happens before any characters could be read, the
pointer returned is a null pointer (and the contents of str remain
unchanged). If a read error occurs, the error indicator (ferror) is
set and a null pointer is also returned (but the contents pointed by
str may have changed).
When fgets doesn't read anything, it doesn't return the char * that you called malloc on; it returns NULL. Therefore, the buffer from your last malloc call is never freed: the free(line) after your while loop is handed that NULL, so it doesn't do what you want.
Solution: change your return and return line instead:
char * testGetLine(FILE * fptr)
{
    extern int count;
    char * line;
    line = malloc(sizeof(char) * LINE_BUFFER);
    count++;
    fgets(line, LINE_BUFFER, fptr);
    return line;
}
