Invalid write of size 1 due to strcat - C

I have this code for an assignment where I want to create and open a file whose name is taken from an input file given via command-line argument, but with a different extension (example: I pass the file "filename.in" in the terminal via argv and I want to create and open the file "filename.out"). However, I keep getting the error "Invalid write of size 1" and a few others when I run my program in Valgrind, and I really cannot see where they are coming from.
Code:
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
#include "roap.h"

int main(int argc, char** argv) {
    FILE* filePtr;
    LabList* head = NULL;
    if (argc != 3) {
        printf("Numero de argumentos errado! \n");
        exit(EXIT_FAILURE);
    }
    char flag[] = "-s";
    if (strcmp(argv[1], flag) != 0)
    {
        fprintf(stdout, "Flag '-s' necessária para esta fase do projeto!"); // checks that the -s flag is present
        exit(EXIT_FAILURE);
    }
    char *file_arg /* argument given on the terminal referring to the file */, *filename /* the file's "proper" name */, *file_arg_aux;
    char dot = '.';
    char ponto[] = ".";
    char *extencao;
    int read_ok;
    file_arg = (char*) calloc(1, strlen(argv[2]) + 1); // checks that the extension really is .in1
    file_arg_aux = (char*) calloc(1, strlen(argv[2]) + 1);
    strcpy(file_arg, argv[2]);
    strcpy(file_arg_aux, argv[2]);
    filename = strtok(file_arg, ponto);
    extencao = strrchr(file_arg_aux, dot);
    if ((read_ok = strcmp(extencao, ".in1")) != 0)
    {
        fprintf(stdout, "Extensão inválida!");
        exit(EXIT_FAILURE);
    }
    filePtr = fopen(argv[2], "r");
    if (filePtr == NULL) {
        printf("Erro ao abrir o ficheiro %s !\n", argv[2]);
        exit(EXIT_FAILURE);
    } else {
        head = readLab(filePtr, head);
        fclose(filePtr);
    }
    char extencao_out[] = ".sol1";
    FILE* file_out = fopen( strcat(filename, extencao_out), "w");
Valgrind output:
==123== Invalid write of size 1
==123== at 0x483EC5E: strcat (in /usr/lib/x86_64-linux-gnu/valgrind/vgpreload_memcheck-amd64-linux.so)
==123== by 0x1095B2: main (main.c:60)
==123== Address 0x4a47051 is 0 bytes after a block of size 17 alloc'd
==123== at 0x483DD99: calloc (in /usr/lib/x86_64-linux-gnu/valgrind/vgpreload_memcheck-amd64-linux.so)
==123== by 0x10944E: main (main.c:36)
==123==
==123== Syscall param openat(filename) points to unaddressable byte(s)
==123== at 0x4962D1B: open (open64.c:48)
==123== by 0x48E5195: _IO_file_open (fileops.c:189)
==123== by 0x48E5459: _IO_file_fopen@@GLIBC_2.2.5 (fileops.c:281)
==123== by 0x48D7B0D: __fopen_internal (iofopen.c:75)
==123== by 0x48D7B0D: fopen@@GLIBC_2.2.5 (iofopen.c:86)
==123== by 0x1095C1: main (main.c:60)
==123== Address 0x4a47051 is 0 bytes after a block of size 17 alloc'd
==123== at 0x483DD99: calloc (in /usr/lib/x86_64-linux-gnu/valgrind/vgpreload_memcheck-amd64-linux.so)
==123== by 0x10944E: main (main.c:36)
Line 36: file_arg = (char*) calloc(1, strlen(argv[2]) +1 );
Line 60: FILE* file_out = fopen( strcat(filename, extencao_out) , "w");

This ...
file_arg = (char*) calloc(1, strlen(argv[2]) +1 )
... allocates exactly enough space for a copy of argv[2], and assigns it to file_arg. That's well and good, and it is afterward ok to strcpy(file_arg, argv[2]).
Given the value of ponto, this ...
filename = strtok(file_arg, ponto);
... truncates the string to which file_arg points by overwriting the first '.' (if any) with a string terminator, and it returns a copy of (pointer) file_arg. That's ok in itself.
But then here:
FILE* file_out = fopen( strcat(filename, extencao_out) , "w");
the call strcat(filename, extencao_out) attempts to append the contents of string extencao_out (".sol1") in place of the original extension, which you already verified, a bit awkwardly, was ".in1". Because exactly enough space was allocated for the original file name, no more, there is not enough room to accommodate the longer name the program is now trying to construct. The program writes one byte past the end of the allocated block, just as Valgrind tells you.
I would suggest moving the declaration of extencao_out far enough earlier that you can instead allocate like so:
file_arg = (char*) calloc(1, strlen(argv[2]) + strlen(extencao_out) + 1);
That will overallocate by four bytes with your present combination of extensions, but
four bytes is negligible;
it probably will place no extra memory burden at all on the system about 75% of the time; and
it will be flexible towards other behavioral variations you might later want to support, such as input file names without extensions.
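Concretely, the suggestion amounts to something like the following sketch (the helper name build_out_name and its factoring into a function are mine, not the asker's):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Sketch: build "filename.sol1" from "filename.in1". The buffer is sized
   for the full input name plus the new extension, so the strcat can never
   write past the end, whichever name is longer. */
char *build_out_name(const char *in_name, const char *out_ext)
{
    char *buf = calloc(1, strlen(in_name) + strlen(out_ext) + 1);
    if (buf == NULL)
        return NULL;
    strcpy(buf, in_name);
    char *dot = strrchr(buf, '.');  /* last '.', if any */
    if (dot != NULL)
        *dot = '\0';                /* strip the old extension */
    strcat(buf, out_ext);           /* fits: buffer was sized above */
    return buf;
}
```

Note that this also copes with input file names that have no extension at all, the flexibility mentioned above.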

Declare the variables on separate lines, with comments; each of these is a pointer, and no memory has been declared/reserved yet.
char* file_arg;      /* argument given on the terminal referring to the file */
char* filename;      /* the file's "proper" name */
char* file_arg_aux;
char* extencao;
Here you assign memory to file_arg, but only enough to hold the original argv string contents (plus null terminator). Since you later reuse this buffer, you might as well allocate enough extra space for future needs.
file_arg = (char*) calloc(1, strlen(argv[2]) +1 ); //verifica se de facto a extensão é .in1
file_arg_aux = (char*) calloc(1, strlen(argv[2]) +1 );
Just use strdup() instead of calloc() and strcpy(),
strcpy(file_arg, argv[2]);
strcpy(file_arg_aux, argv[2]);
These functions both find the '.'/"." in the filename/file_arg, which you later use to append the new file extension (extencao?),
filename = strtok(file_arg, ponto);
extencao = strrchr(file_arg_aux, dot);
Here is your expected extension, which has length=4,
strcmp(extencao, ".in1")
Here is your new file extension, which has length=5,
char extencao_out[] = ".sol1";
Build your out filename with enough space to hold the filename and the new extension,
char* outfilename = calloc(1, strlen(filename) + strlen(extencao_out) + 1);
strcpy(outfilename, filename);
strcat(outfilename, extencao_out);
FILE* file_out = fopen( outfilename, "w");
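An alternative sketch: snprintf carries the destination size with the call, so a sizing mistake truncates instead of overflowing (the helper name make_out_name is illustrative, not from the original code):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Sketch: join a filename stem and a new extension into a freshly
   allocated string; snprintf never writes more than `need` bytes. */
char *make_out_name(const char *stem, const char *ext)
{
    size_t need = strlen(stem) + strlen(ext) + 1;  /* +1 for the terminator */
    char *out = malloc(need);
    if (out != NULL)
        snprintf(out, need, "%s%s", stem, ext);
    return out;
}
```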

Related

fscanf reads one line out of 7 wrong

I have a zipped text file with 7 very long lines of text containing information for the decoding of a JPEG encoded file.
When I try to read the unzipped file with my C program, line by line with fscanf, I get the first 3 and the last 3 lines correctly; just the 4th line isn't read as a string as expected.
The output of the 4th line is a very long string filled with 1 and 0.
If I look at the input file with Notepad or a hex editor everything looks fine as it should.
If I manually create a text file with the same structure (but with shorter lines) fscanf works fine.
There is no difference whether I unzip the file with my program or do it manually.
FILE *tmpdata;
char enc_path[256];
int arrsize;

// Building the absolute path
sprintf(enc_path, "%s%stmp.txt", dest, src_name);
arrsize = unzip(); // gives back size of the file

// not the best way to create the output strings,
// but I don't know the size of the lines.
char masse[10];
char ytabelle[arrsize / 3];
char cbtabelle[arrsize / 3];
char crtabelle[arrsize / 2];
char ywerte[arrsize / 3];
char cbwerte[arrsize / 3];
char crwerte[arrsize / 3];

if ((tmpdata = fopen(enc_path, "r")) == NULL) {
    printf("Error: can't read input file\n");
    return EXIT_FAILURE;
}
fscanf(tmpdata, "%s %s %s %s %s %s %s", masse, ytabelle, cbtabelle, crtabelle, ywerte, cbwerte, crwerte);
The input file looks like:
512x512
Y{42:110000;13:111000;...;0:0;}
CB{42:110000;13:111000;...;0:0;}
CR{42:110000;13:111000;...;0:0;}
000111010010111001110000111100011...
100011011101110001101000011100110...
100011101110110111011001100111011...
If I print the separate strings:
512x512
Y{42:110000;13:111000;...;0:0;}
CB{42:110000;13:111000;...;0:0;}
111001111111111000110000111111000...
000111010010111001110000111100011...
100011011101110001101000011100110...
100011101110110111011001100111011...
There are multiple reasons for your program to misbehave:
you may allocate too much data with automatic storage (aka on the stack), causing erratic behavior.
the strings in the file might contain embedded spaces, causing fscanf() to read words instead of lines.
you do not tell fscanf() the size of the destination arrays. fscanf() may store data beyond the end of the destination arrays, overflowing into the next array (which would explain the observed behavior) or causing some other undefined behavior.
It is very cumbersome to pass the size of the destination arrays when they are not simple constants. I suggest you use fgets() instead of fscanf() to read the file contents and allocate the arrays with malloc() to a larger size to avoid problems:
FILE *tmpdata;
char enc_path[256];
size_t arrsize;

// Building the absolute path
snprintf(enc_path, sizeof enc_path, "%s%stmp.txt", dest, src_name);
arrsize = unzip(); // gives back size of the file

// not the best way to create the output strings,
// but I don't know the size of the lines.
char masse[16];
size_t ytabelle_size = arrsize + 2;
size_t cbtabelle_size = arrsize + 2;
size_t crtabelle_size = arrsize + 2;
char *ytabelle = malloc(ytabelle_size);
char *cbtabelle = malloc(cbtabelle_size);
char *crtabelle = malloc(crtabelle_size);
size_t ywerte_size = arrsize + 2;
size_t cbwerte_size = arrsize + 2;
size_t crwerte_size = arrsize + 2;
char *ywerte = malloc(ywerte_size);
char *cbwerte = malloc(cbwerte_size);
char *crwerte = malloc(crwerte_size);

if (!ytabelle || !cbtabelle || !crtabelle || !ywerte || !cbwerte || !crwerte) {
    printf("Error: cannot allocate memory\n");
    return EXIT_FAILURE;
}
if ((tmpdata = fopen(enc_path, "r")) == NULL) {
    printf("Error: cannot open input file\n");
    return EXIT_FAILURE;
}
if (!fgets(masse, sizeof masse, tmpdata)
    || !fgets(ytabelle, ytabelle_size, tmpdata)
    || !fgets(cbtabelle, cbtabelle_size, tmpdata)
    || !fgets(crtabelle, crtabelle_size, tmpdata)
    || !fgets(ywerte, ywerte_size, tmpdata)
    || !fgets(cbwerte, cbwerte_size, tmpdata)
    || !fgets(crwerte, crwerte_size, tmpdata)) {
    printf("Error: cannot read input file\n");
    return EXIT_FAILURE;
}
// file contents were read, arrays should have a trailing newline, which
// you should strip or handle in the decoding phase.
...
If you are using the GNU libc or some modern POSIX systems, you could use the m prefix in fscanf() to allocate the space for the words read from the file. Using this allows for a simpler but non-portable solution:
FILE *tmpdata;
char enc_path[256];
size_t arrsize;

// Building the absolute path
snprintf(enc_path, sizeof enc_path, "%s%stmp.txt", dest, src_name);
arrsize = unzip(); // gives back size of the file

// the m prefix makes fscanf() allocate each string,
// so all destinations are pointers initialized to NULL.
char *masse = NULL;
char *ytabelle = NULL;
char *cbtabelle = NULL;
char *crtabelle = NULL;
char *ywerte = NULL;
char *cbwerte = NULL;
char *crwerte = NULL;

if ((tmpdata = fopen(enc_path, "r")) == NULL) {
    printf("Error: cannot open input file\n");
    return EXIT_FAILURE;
}
if (fscanf(tmpdata, "%ms %ms %ms %ms %ms %ms %ms", &masse,
           &ytabelle, &cbtabelle, &crtabelle,
           &ywerte, &cbwerte, &crwerte) != 7) {
    printf("Error: cannot read input file\n");
    return EXIT_FAILURE;
}
...
PS: Unlike in German, nouns are not capitalized in English, except for proper nouns such as language, people and place names.
Maybe avoid stack allocation?
char masse[10];
char *ytabelle = malloc(arrsize / 3);  if (!ytabelle)  exit(EXIT_FAILURE);
char *cbtabelle = malloc(arrsize / 3); if (!cbtabelle) exit(EXIT_FAILURE);
char *crtabelle = malloc(arrsize / 2); if (!crtabelle) exit(EXIT_FAILURE);
char *ywerte = malloc(arrsize / 3);    if (!ywerte)    exit(EXIT_FAILURE);
char *cbwerte = malloc(arrsize / 3);   if (!cbwerte)   exit(EXIT_FAILURE);
char *crwerte = malloc(arrsize / 3);   if (!crwerte)   exit(EXIT_FAILURE);

/* use as before */

free(ytabelle);
free(cbtabelle);
free(crtabelle);
free(ywerte);
free(cbwerte);
free(crwerte);

Need help figuring out what the Problem with fgets is

So basically I have a file-pointer to a file that has 80 digits between 0 and 1 and I need to get them into a string to then do something with it.
The function returns NULL, and I cannot find what's wrong, because a NULL return just means "error".
FILE *fpr = fopen(path, "r");
FILE *fpw = fopen("code.txt", "w");
char *str = calloc(81, sizeof(char));
if (fpr == NULL || fpw == NULL) {
    printf("yikes");
}
if (fgets(str, 80, fpr) != NULL) { // HERE IT'S NULL
    int p1 = 0;
    int p2 = 0;
I really thought it through, and I'm either really dumb or there is no obvious problem.
There are a few problems in the code fragment:
if any of the files cannot be opened, you still call fgets(), which has undefined behavior if fpr is NULL. Make a separate test for each FILE *, print a more explicit error message and exit the program.
you should pass the size of the array to fgets(): 81 instead of 80.
the array should be allocated with at least 82 bytes: 80 characters plus the trailing newline and a null byte terminator.
you do not test for memory allocation failure. You should not even allocate memory; a local array is fine for a small size like 82 bytes.
Here is a corrected version:
#include <errno.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
...
char str[82];
FILE *fpr = fopen(path, "r");
if (fpr == NULL) {
    fprintf(stderr, "cannot open input file %s: %s\n", path, strerror(errno));
    exit(1);
}
FILE *fpw = fopen("code.txt", "w");
if (fpw == NULL) {
    fprintf(stderr, "cannot open output file %s: %s\n", "code.txt", strerror(errno));
    exit(1);
}
if (fgets(str, sizeof str, fpr)) {
    int p1 = 0;
    int p2 = 0;
    ...
Always test error conditions and print explicit error messages; you will save yourself countless hours of debugging time.

How to read in two text files and count the amount of keywords?

I have tried looking around but, to me, files are the hardest thing to understand so far as I am learning C, especially text files; binary files were a bit easier. Basically I have to read in two text files, both containing words formatted like this: "hard, working,smart, works well, etc..". I am supposed to compare the text files and count the keywords. I would show some code, but honestly I am lost, and the only thing I have down is just nonsense besides this.
#include <time.h>
#include <stdlib.h>
#include <stdio.h>
#define SIZE 1000
void resumeRater();
int main()
{
    int i;
    int counter = 0;
    char array[SIZE];
    char keyword[SIZE];
    FILE *fp1, *fp2;
    int ch1, ch2;
    errno_t result1 = fopen_s(&fp1, "c:\\myFiles\\resume.txt", "r");
    errno_t result2 = fopen_s(&fp2, "c:\\myFiles\\ideal.txt", "r");
    if (fp1 == NULL) {
        printf("Failed to open");
    }
    else if (fp2 == NULL) {
        printf("Failed to open");
    }
    else {
        result1 = fread(array, sizeof(char), 1, fp1);
        result2 = fread(keyword, sizeof(char), 1, fp2);
        for (i = 0; i < SIZE; i++)
        {
            if (array[i] == keyword[i])
            {
                counter++;
            }
        }
        fclose(fp1);
        fclose(fp2);
        printf("Character match: %d", counter);
    }
    system("pause");
}
When you have a situation where you are doing a multiple of something (like reading 2 files), it makes a lot of sense to plan ahead. Rather than muddying the body of main with all the code necessary to read 2 text files, create a function that reads the text file for you and have it return an array containing the lines of the file. This really helps you concentrate on the logic of what your code needs to do with the lines rather than filling space with getting the lines in the first place. Now there is nothing wrong with cramming it all in one long main, but from a readability, maintenance, and program structure standpoint, it makes everything more difficult.
If you structure the read function well, you can reduce your main to the following. This reads both text files into character arrays and provides the number of lines read in a total of 4 lines (plus the check to make sure your provided two filenames to read):
int main (int argc, char **argv) {

    if (argc < 3) {
        fprintf (stderr, "error: insufficient input, usage: %s <filename1> <filename2>\n", argv[0]);
        return 1;
    }

    size_t file1_size = 0;  /* placeholders to be filled by readtxtfile */
    size_t file2_size = 0;  /* for general use, not needed to iterate   */

    /* read each file into an array of strings,
       number of lines read, returned in file_size */
    char **file1 = readtxtfile (argv[1], &file1_size);
    char **file2 = readtxtfile (argv[2], &file2_size);

    return 0;
}
At that point you have all your data and you can work on your keyword code. Reading from text files is a very simple matter. You just have to get comfortable with the tools available. When reading lines of text, the preferred approach is to use line input to read an entire line at a time into a buffer. You then parse the buffer to get what it is you need. The line-input tools are fgets and getline. Once you have read the line, you then have tools like strtok, strsep or sscanf to separate what you want from the line. Both fgets and getline read the newline at the end of each line as part of their input, so you may need to remove the newline to meet your needs.
Storing each line read is generally done by declaring a pointer to an array of char* pointers (e.g. char **file1;). You then allocate memory for some initial number of pointers (NMAX in the example below). You then access the individual lines in the file as file1[n], where n is the line index, 0 through the last line of the file. If you have a large file and exceed the number of pointers you originally allocated, you simply reallocate additional pointers for your array with realloc. (You can set NMAX to 1 to make this happen for every line.)
What you use to allocate memory, and how you reallocate, can influence how you make use of the arrays in your program. Careful choices, such as calloc to initially allocate your arrays, and then memset when you reallocate to set all unused pointers to 0 (null), can really save you time and headache. Why? Because, to iterate over your array, all you need to do is:
n = 0;
while (file1[n]) {
    <do something with file1[n]>;
    n++;
}
When you reach the first unused pointer (i.e. the first file1[n] that is 0), the loop stops.
Another very useful function when reading text files is strdup (char *line). strdup will automatically allocate space for line using malloc, copy line to the newly allocated memory, and return a pointer to the new block of memory. This means that all you need to do to allocate space for each pointer and copy the line read by getline to your array is:
file1[n] = strdup (line);
That's pretty much it. You have read your file, filled your array, and know how to iterate over each line in the array. What is left is cleaning up and freeing the memory allocated when you no longer need it. By making sure that your unused pointers are 0, this too is a snap. You simply iterate over your file1[n] pointers again, freeing them as you go, and then free (file1) at the end. You're done.
This is a lot to take in, and there are a few more things to it. On the initial read of the file, if you noticed, we also declare a file1_size = 0; variable, and pass its address to the read function:
char **file1 = readtxtfile (argv[1], &file1_size);
Within readtxtfile, the value at the address of file1_size is incremented by 1 each time a line is read. When readtxtfile returns, file1_size contains the number of lines read. As shown, this is not needed to iterate over the file1 array, but you often need to know how many lines you have read.
To put this all together, I created a short example of the functions to read two text files, print the lines in both and free the memory associated with the file arrays. This explanation ended up longer than I anticipated, so take time to understand how it works, and you will be a step closer to handling text files easily. The code below takes 2 filenames as arguments (e.g. ./progname file1 file2). Compile it with something similar to gcc -Wall -Wextra -o progname srcfilename.c:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define NMAX 256

char **readtxtfile (char *fn, size_t *idx);
char **realloc_char (char **p, size_t *n);
void prn_chararray (char **ca);
void free_chararray (char **ca);

int main (int argc, char **argv) {

    if (argc < 3) {
        fprintf (stderr, "error: insufficient input, usage: %s <filename1> <filename2>\n", argv[0]);
        return 1;
    }

    size_t file1_size = 0;  /* placeholders to be filled by readtxtfile */
    size_t file2_size = 0;  /* for general use, not needed to iterate   */

    /* read each file into an array of strings,
       number of lines read, returned in file_size */
    char **file1 = readtxtfile (argv[1], &file1_size);
    char **file2 = readtxtfile (argv[2], &file2_size);

    /* simple print function */
    if (file1) prn_chararray (file1);
    if (file2) prn_chararray (file2);

    /* simple free memory function */
    if (file1) free_chararray (file1);
    if (file2) free_chararray (file2);

    return 0;
}

char **readtxtfile (char *fn, size_t *idx)
{
    if (!fn) return NULL;    /* validate filename provided       */

    char *ln = NULL;         /* NULL forces getline to allocate  */
    size_t n = 0;            /* max chars to read (0 - no limit) */
    ssize_t nchr = 0;        /* number of chars actually read    */
    size_t nmax = NMAX;      /* check for reallocation           */
    char **array = NULL;     /* array to hold lines read         */
    FILE *fp = NULL;         /* file pointer to open file fn     */

    /* open / validate file */
    if (!(fp = fopen (fn, "r"))) {
        fprintf (stderr, "%s() error: file open failed '%s'.", __func__, fn);
        return NULL;
    }

    /* allocate NMAX pointers to char* */
    if (!(array = calloc (NMAX, sizeof *array))) {
        fprintf (stderr, "%s() error: memory allocation failed.", __func__);
        return NULL;
    }

    /* read each line from fp - dynamically allocated */
    while ((nchr = getline (&ln, &n, fp)) != -1)
    {
        /* strip newline or carriage rtn */
        while (nchr > 0 && (ln[nchr-1] == '\n' || ln[nchr-1] == '\r'))
            ln[--nchr] = 0;

        array[*idx] = strdup (ln);  /* allocate/copy ln to array        */
        (*idx)++;                   /* increment value at index         */
        if (*idx == nmax)           /* if lines exceed nmax, reallocate */
            array = realloc_char (array, &nmax);
    }

    if (ln) free (ln);    /* free memory allocated by getline */
    if (fp) fclose (fp);  /* close open file descriptor       */

    return array;
}

/* print an array of character pointers. */
void prn_chararray (char **ca)
{
    register size_t n = 0;
    while (ca[n])
    {
        printf (" arr[%3zu] %s\n", n, ca[n]);
        n++;
    }
}

/* free array of char* */
void free_chararray (char **ca)
{
    if (!ca) return;
    register size_t n = 0;
    while (ca[n])
        free (ca[n++]);
    free (ca);
}

/* realloc an array of pointers to strings setting memory to 0.
 * reallocate an array of character arrays setting
 * newly allocated memory to 0 to allow iteration
 */
char **realloc_char (char **p, size_t *n)
{
    char **tmp = realloc (p, 2 * *n * sizeof *p);
    if (!tmp) {
        fprintf (stderr, "%s() error: reallocation failure.\n", __func__);
        // return NULL;
        exit (EXIT_FAILURE);
    }
    p = tmp;
    memset (p + *n, 0, *n * sizeof *p); /* memset new ptrs 0 */
    *n *= 2;
    return p;
}
valgrind - Don't Forget To Check For Leaks
Lastly, anytime you allocate memory in your code, make sure you use a memory checker such as valgrind to confirm you have no memory errors and to confirm you have no memory leaks (i.e. allocated blocks you have forgotten to free, or that have become unreachable). valgrind is simple to use: just valgrind ./progname [any arguments]. It can provide a wealth of information. For example, running it on this read example:
$ valgrind ./bin/getline_readfile_fn voidstruct.c wii-u.txt
==14690== Memcheck, a memory error detector
==14690== Copyright (C) 2002-2012, and GNU GPL'd, by Julian Seward et al.
==14690== Using Valgrind-3.8.1 and LibVEX; rerun with -h for copyright info
==14690== Command: ./bin/getline_readfile_fn voidstruct.c wii-u.txt
==14690==
<snip - program output>
==14690==
==14690== HEAP SUMMARY:
==14690== in use at exit: 0 bytes in 0 blocks
==14690== total heap usage: 61 allocs, 61 frees, 6,450 bytes allocated
==14690==
==14690== All heap blocks were freed -- no leaks are possible
==14690==
==14690== For counts of detected and suppressed errors, rerun with: -v
==14690== ERROR SUMMARY: 0 errors from 0 contexts (suppressed: 2 from 2)
Pay particular attention to the lines:
==14690== All heap blocks were freed -- no leaks are possible
and
==14690== ERROR SUMMARY: 0 errors from 0 contexts (suppressed: 2 from 2)
You can ignore the (suppressed: 2 from 2) which just indicate I don't have the development files installed for libc.

Memory leak with getline and strsep

I get a memory leak while using getline together with strsep. I know strsep modifies line - could this be the cause, i.e. that line is not freed correctly?
FILE *file = fopen("keywords.txt", "r");
if (file) {
    char *line = NULL;
    size_t len = 0;
    ssize_t read;

    while ((read = getline(&line, &len, file)) != -1) { // Line 35
        char *token;
        while ((token = strsep(&line, "\t")) != NULL) {
            // Do stuff
        }
    }
    free(line);
    fclose(file);
}
Valgrind returns this:
==6094== 4,680 bytes in 39 blocks are definitely lost in loss record 7 of 7
==6094== at 0x4C2AB80: malloc (in /usr/lib/valgrind/vgpreload_memcheck-amd64-linux.so)
==6094== by 0x51AEBB4: getdelim (iogetdelim.c:66)
==6094== by 0x4009B3: read_keywords (main.c:35)
==6094== by 0x400959: renew_init (main.c:64)
==6094== by 0x400A48: main (main.c:68)
If I comment out strsep, there's no memory leak.
Tips?
When you pass &line to strsep, it will change the value of line. At the end of the inner loop, line will be NULL and free(line) will do nothing. This will also cause getline to allocate a new buffer instead of reusing the current one.
You should copy line to a new variable, e.g. char *line2 = line; and pass &line2 to strsep.
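A minimal sketch of that fix, wrapped in a helper that counts tokens so the effect is observable (the function name count_tokens and the counting are mine; getline and strsep are glibc/POSIX extensions, hence _GNU_SOURCE):

```c
#define _GNU_SOURCE   /* for getline() and strsep() on glibc */
#include <stdio.h>
#include <stdlib.h>

/* Sketch: `line` keeps ownership of the getline buffer; strsep advances
   a separate cursor, so free(line) still frees the right pointer and
   getline can reuse the buffer on the next iteration. */
int count_tokens(const char *path)
{
    int count = 0;
    FILE *file = fopen(path, "r");
    if (file) {
        char *line = NULL;
        size_t len = 0;
        while (getline(&line, &len, file) != -1) {
            char *cursor = line;   /* strsep modifies this copy, not line */
            char *token;
            while ((token = strsep(&cursor, "\t")) != NULL)
                count++;           /* do stuff with token */
        }
        free(line);                /* line still points at the buffer */
        fclose(file);
    }
    return count;
}
```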

Get a segment fault while reading a file

I want to read the whole file content and print it out, but I get a segmentation fault, and I can't find what's wrong with the code ...
#include <stdio.h>
#include <stdlib.h>

int main()
{
    FILE *file;
    long fsize;

    file = fopen("./input.txt", "r");
    if (file != NULL) {
        // get file size
        fseek(file, 0, SEEK_END);
        fsize = ftell(file);
        rewind(file);

        // print
        char *file_content;
        fgets(file_content, fsize, file);
        puts(file_content);
    }
    else {
        printf("open failure\n");
    }
    fclose(file);
    return 0;
}
The pointer you pass to fgets (file_content) is uninitialized. It should be pointing to a block of memory large enough to contain the specified number (fsize) of bytes. You can use malloc to allocate the memory.
char* file_content = (char*)malloc(fsize);
char * file_content is just a pointer; you need to allocate memory to store the string.
char *file_content;
file_content = malloc(fsize);
"..but I get a segment fault"
Obviously because you're attempting to write to an uninitialized file_content.
Allocate memory for file_content before use:
char *file_content = malloc(fsize);
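Pulling those answers together, a sketch of the corrected pattern; fread is used instead of fgets, since fgets stops at the first newline while the goal was the whole file (the helper name read_whole_file is mine):

```c
#include <stdio.h>
#include <stdlib.h>

/* Sketch: measure the file, allocate fsize + 1 bytes so there is room
   for a null terminator, read it all with fread, and hand back a
   malloc'd string the caller must free. */
char *read_whole_file(const char *path)
{
    FILE *file = fopen(path, "r");
    if (file == NULL)
        return NULL;

    fseek(file, 0, SEEK_END);
    long fsize = ftell(file);
    rewind(file);

    char *content = malloc(fsize + 1);
    if (content != NULL) {
        size_t got = fread(content, 1, (size_t)fsize, file);
        content[got] = '\0';   /* terminate whatever was actually read */
    }
    fclose(file);
    return content;
}
```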