heap-buffer-overflow with fprintf - c

I'm updating my question, very sorry for asking it the wrong way.
I have now distilled my problem into a single, self-contained piece of code:
#include <stdio.h>
#include <stdlib.h>

static __inline__ char* fileRead(char* file){
    FILE* fp;
    long fileSize;
    char* fileContents;

    fp = fopen(file, "rb");
    if(!fp){
        perror(file);
        exit(1);
    }
    /* this block writes the size of the file in fileSize */
    fseek(fp, 0L, SEEK_END);
    fileSize = ftell(fp);
    rewind(fp);
    /* allocate memory for entire content */
    fileContents = malloc(fileSize + 1);
    if(!fileContents){
        fclose(fp);
        fputs("memory alloc fails", stderr);
        exit(1);
    }
    /* copy the file into the buffer */
    if(fread(fileContents, fileSize, 1, fp) != 1){
        fclose(fp);
        free(fileContents);
        fputs("entire read fails", stderr);
        exit(1);
    }
    /* close the file */
    fclose(fp);
    return fileContents;
}

int main(){
    char* head10 = "";
    char* fileName = "testhtml.html";
    FILE* out = fopen(fileName, "w");

    head10 = fileRead("head10.html");
    printf("%s\n", head10);
    out = fopen(fileName, "wb");
    fprintf(out, "%s\n", head10);
    fclose(out);
    free(head10);
    return 0;
}
Here is the head10.html file.
I'm compiling it with -fsanitize=address, and I'm getting a heap-buffer-overflow.
The error seems to be caused at the line fprintf(out, "%s\n", head10);.
head10 is the only malloc'd variable, so that makes sense.
I can print it without problems with printf, but when I try to write it to a file with fprintf, a heap-buffer-overflow is generated.
===EDIT===
Looks like the problem came from using fprintf with a malloc'd var, as fprintf itself uses malloc under the hood, so the original alloc gets lost, and memory leaks.
So I rewrote my functions without using malloc:
#define _POSIX_C_SOURCE 200809L /* for getline() */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static __inline__ void fileReset(char* fileName){
    FILE* out = fopen(fileName, "w");
    fwrite("", sizeof(char), strlen(""), out);
    fclose(out);
}

static __inline__ void fileAppend(char* fileName, char* string){
    FILE* out = fopen(fileName, "a"); /* using "a" to APPEND */
    if(fwrite(string, sizeof(char), strlen(string), out) != strlen(string)){
        printf("==file write error\n");
        exit(EXIT_FAILURE);
    }
    fclose(out);
}

static __inline__ void fileAppendFile(char* source, char* dest){
    FILE* in = fopen(source, "r");
    char *line = NULL;
    size_t len = 0;
    ssize_t read; /* getline() returns ssize_t */

    while ((read = getline(&line, &len, in)) != -1) {
        fileAppend(dest, line);
    }
    free(line);
    fclose(in);
}

int main(){
    char* fileName = "testhtml.html";
    char* theme = "dark";

    fileReset(fileName);
    fileAppendFile("head10.html", fileName);
    fileAppend(fileName, theme);
    return 0;
}
Thanks a lot for all the help. I'm very much a noob here and didn't know what -lasan was; now I know what an invaluable tool it is!
==EDIT-2==
As pointed out by EmployedRussian, the problem in the original code was NOT fprintf, but the lack of a terminating '\0'. Look at their answer below; it does fix my original code :)

Looks like the problem came from using fprintf with a malloc'd var, as fprintf itself uses malloc under the hood, so the original alloc gets lost, and memory leaks.
I am afraid you learned the wrong lesson here.
While fprintf may indeed use malloc under the hood, your problem doesn't have anything to do with that.
I created a head10.html file containing abc\n (4 characters). Running your program with that input file produced:
==10173==ERROR: AddressSanitizer: heap-buffer-overflow on address 0x602000000015 at pc 0x7fb5db2c7054 bp 0x7ffd44e74de0 sp 0x7ffd44e74590
READ of size 6 at 0x602000000015 thread T0
#0 0x7fb5db2c7053 (/usr/lib/x86_64-linux-gnu/libasan.so.5+0x4d053)
#1 0x5654101dd435 in main /tmp/foo.c:43
#2 0x7fb5db0dde0a in __libc_start_main ../csu/libc-start.c:308
#3 0x5654101dd199 in _start (/tmp/a.out+0x1199)
0x602000000015 is located 0 bytes to the right of 5-byte region [0x602000000010,0x602000000015)
allocated by thread T0 here:
#0 0x7fb5db381628 in malloc (/usr/lib/x86_64-linux-gnu/libasan.so.5+0x107628)
#1 0x5654101dd2db in fileRead /tmp/foo.c:20
#2 0x5654101dd425 in main /tmp/foo.c:42
#3 0x7fb5db0dde0a in __libc_start_main ../csu/libc-start.c:308
So the problem is that you allocated 5 bytes (as expected), but fprintf tried to read a 6th character from that buffer.
Why would it do that? Because the format you used, %s, expects to find a terminating NUL character (i.e. it expects a properly terminated C string), and you gave it a pointer to a non-terminated string with the following bytes:
a b c \n X
What value does the fifth byte contain? It's undefined (it came from malloc, and no value was written into it). Since that value is not NUL, fprintf tries to read the next (6th) byte, and that's when Address Sanitizer signals the error and aborts your program.
The correct fix is to NUL-terminate the string, like so:
if (fread(fileContents, fileSize, 1, fp) != 1) { /* ... handle error ... */ }
fileContents[fileSize] = '\0'; // NUL-terminate the string.
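For completeness, here is the question's fileRead with only that fix applied; nothing else changes:

#include <stdio.h>
#include <stdlib.h>

static __inline__ char* fileRead(char* file){
    FILE* fp = fopen(file, "rb");
    long fileSize;
    char* fileContents;

    if(!fp){
        perror(file);
        exit(1);
    }
    /* determine the file size */
    fseek(fp, 0L, SEEK_END);
    fileSize = ftell(fp);
    rewind(fp);
    /* one extra byte for the terminating NUL */
    fileContents = malloc(fileSize + 1);
    if(!fileContents){
        fclose(fp);
        fputs("memory alloc fails", stderr);
        exit(1);
    }
    if(fread(fileContents, fileSize, 1, fp) != 1){
        fclose(fp);
        free(fileContents);
        fputs("entire read fails", stderr);
        exit(1);
    }
    fileContents[fileSize] = '\0'; /* NUL-terminate so %s knows where to stop */
    fclose(fp);
    return fileContents;
}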

Related

fgets statement reads first line and not sure how to modify because I have to return a pointer [duplicate]

I need to copy the contents of a text file to a dynamically-allocated character array.
My problem is getting the size of the contents of the file; Google reveals that I need to use fseek and ftell, but for that the file apparently needs to be opened in binary mode, and that gives only garbage.
EDIT: I tried opening in text mode, but I get weird numbers. Here's the code (I've omitted simple error checking for clarity):
long f_size;
char* code;
size_t code_s, result;
FILE* fp = fopen(argv[0], "r");
fseek(fp, 0, SEEK_END);
f_size = ftell(fp); /* This returns 29696, but file is 85 bytes */
fseek(fp, 0, SEEK_SET);
code_s = sizeof(char) * f_size;
code = malloc(code_s);
result = fread(code, 1, f_size, fp); /* This returns 1045, it should be the same as f_size */
The root of the problem is here:
FILE* fp = fopen(argv[0], "r");
argv[0] is your executable program, NOT the parameter. It certainly won't be a text file. Try argv[1], and see what happens then.
You cannot determine the size of a file in characters without reading the data, unless you're using a fixed-width encoding.
For example, a file in UTF-8 which is 8 bytes long could be anything from 2 to 8 characters in length.
That's not a limitation of the file APIs, it's a natural limitation of there not being a direct mapping from "size of binary data" to "number of characters."
If you have a fixed-width encoding then you can just divide the size of the file in bytes by the number of bytes per character. ASCII is the most obvious example of this, but if your file is encoded in UTF-16 and you happen to be on a system which treats UTF-16 code points as the "native" internal character type (which includes Java, .NET and Windows) then you can predict the number of "characters" to allocate as if UTF-16 were fixed width. (UTF-16 is variable width due to Unicode characters above U+FFFF being encoded in multiple code points, but a lot of the time developers ignore this.)
I'm pretty sure argv[0] won't be a text file.
Give this a try (haven't compiled this, but I've done this a bazillion times, so I'm pretty sure it's at least close):
char* readFile(char* filename)
{
    FILE* file = fopen(filename, "r");
    if(file == NULL)
    {
        return NULL;
    }
    fseek(file, 0, SEEK_END);
    long int size = ftell(file);
    rewind(file);
    char* content = calloc(size + 1, 1);
    fread(content, 1, size, file);
    fclose(file); /* close the stream so the handle isn't leaked */
    return content;
}
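A minimal caller for that function might look like this (just a sketch, assuming the filename is passed as the first command-line argument):

#include <stdio.h>
#include <stdlib.h>

char* readFile(char* filename);   /* the function above */

int main(int argc, char **argv)
{
    char *content;

    if (argc < 2) {
        fprintf(stderr, "Usage: %s filename\n", argv[0]);
        return 1;
    }
    content = readFile(argv[1]);
    if (content == NULL) {
        fprintf(stderr, "could not open %s\n", argv[1]);
        return 1;
    }
    printf("%s", content);
    free(content);   /* readFile() allocates with calloc(), so the caller frees */
    return 0;
}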
If you're developing for Linux (or other Unix-like operating systems), you can retrieve the file-size with stat before opening the file:
#include <stdio.h>
#include <sys/stat.h>
int main() {
struct stat file_stat;
if(stat("main.c", &file_stat) != 0) {
perror("could not stat");
return (1);
}
printf("%d\n", (int) file_stat.st_size);
return (0);
}
EDIT: Now that I see the code, I have to side with the other posters:
The array that takes the arguments from the program-call is constructed this way:
[0] name of the program itself
[1] first argument given
[2] second argument given
[n] n-th argument given
You should also check argc before trying to use a field other than '0' of the argv-array:
if (argc < 2) {
printf ("Usage: %s arg1", argv[0]);
return (1);
}
argv[0] is the path to the executable and thus argv[1] will be the first user-submitted input. Try to alter it and add some simple error checking, such as checking if fp == 0, and we might be able to help you further.
You can open the file, put the cursor at the end of the file, store the offset, go back to the top of the file, and take the difference.
You can use fseek for text files as well.
fseek to the end of the file
ftell the offset
fseek back to the beginning
and you have the size of the file
Kind of hard with no sample code, but fstat (or stat) will tell you how big the file is. You allocate the memory required, and slurp the file in.
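As a sketch of that idea (POSIX fstat() on an already-open stream; the file name "input.txt" and the minimal error handling are just placeholders):

#include <stdio.h>
#include <stdlib.h>
#include <sys/stat.h>

int main(void)
{
    FILE *fp = fopen("input.txt", "rb");
    struct stat st;
    char *buf;

    if (fp == NULL) {
        perror("input.txt");
        return 1;
    }
    /* fstat() works on a file descriptor, so use fileno() on the stream */
    if (fstat(fileno(fp), &st) != 0) {
        perror("fstat");
        fclose(fp);
        return 1;
    }
    buf = malloc(st.st_size + 1);
    if (buf == NULL || fread(buf, 1, st.st_size, fp) != (size_t)st.st_size) {
        fprintf(stderr, "allocation or read failed\n");
        fclose(fp);
        free(buf);
        return 1;
    }
    buf[st.st_size] = '\0';   /* make the slurped data usable as a C string */
    printf("read %ld bytes\n", (long)st.st_size);
    fclose(fp);
    free(buf);
    return 0;
}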
Another approach is to read the file a piece at a time and extend your dynamic buffer as needed:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#define PAGESIZE 128
int main(int argc, char **argv)
{
char *buf = NULL, *tmp = NULL;
size_t bufSiz = 0;
char inputBuf[PAGESIZE];
FILE *in;
if (argc < 2)
{
printf("Usage: %s filename\n", argv[0]);
return 0;
}
in = fopen(argv[1], "r");
if (in)
{
/**
* Read a page at a time until reaching the end of the file
*/
while (fgets(inputBuf, sizeof inputBuf, in) != NULL)
{
/**
* Extend the dynamic buffer by the length of the string
* in the input buffer
*/
tmp = realloc(buf, bufSiz + strlen(inputBuf) + 1);
if (tmp)
{
/**
* Add to the contents of the dynamic buffer
*/
buf = tmp;
buf[bufSiz] = 0;
strcat(buf, inputBuf);
bufSiz += strlen(inputBuf) + 1;
}
else
{
printf("Unable to extend dynamic buffer: releasing allocated memory\n");
free(buf);
buf = NULL;
break;
}
}
if (feof(in))
printf("Reached the end of input file %s\n", argv[1]);
else if (ferror(in))
printf("Error while reading input file %s\n", argv[1]);
if (buf)
{
printf("File contents:\n%s\n", buf);
printf("Read %lu characters from %s\n",
(unsigned long) strlen(buf), argv[1]);
}
free(buf);
fclose(in);
}
else
{
printf("Unable to open input file %s\n", argv[1]);
}
return 0;
}
There are drawbacks with this approach; for one thing, if there isn't enough memory to hold the file's contents, you won't know it immediately. Also, realloc() is relatively expensive to call, so you don't want to make your page sizes too small.
However, this avoids having to use fstat() or fseek()/ftell() to figure out how big the file is beforehand.
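For what it's worth, one common way to cut down the number of realloc() calls is to grow the buffer geometrically (doubling the capacity) rather than extending it by one line at a time; the sketch below is my own variation on that idea, not the answer's code, and it reads with fread() rather than fgets():

#include <stdio.h>
#include <stdlib.h>

/* Read all of `in` into one heap buffer, doubling the capacity as needed. */
static char *slurp(FILE *in, size_t *outLen)
{
    size_t cap = 128, len = 0, n;
    char *buf = malloc(cap), *tmp;

    if (buf == NULL)
        return NULL;
    while ((n = fread(buf + len, 1, cap - len, in)) > 0) {
        len += n;
        if (len == cap) {                    /* buffer full: double it */
            tmp = realloc(buf, cap *= 2);
            if (tmp == NULL) {
                free(buf);
                return NULL;
            }
            buf = tmp;
        }
    }
    buf[len] = '\0';                         /* len < cap here, so this stays in bounds */
    *outLen = len;
    return buf;
}

int main(int argc, char **argv)
{
    size_t len;
    FILE *in = argc > 1 ? fopen(argv[1], "rb") : stdin;
    char *text;

    if (in == NULL) {
        perror(argv[1]);
        return 1;
    }
    text = slurp(in, &len);
    if (text == NULL) {
        fprintf(stderr, "out of memory\n");
        return 1;
    }
    printf("Read %lu bytes\n", (unsigned long)len);
    free(text);
    if (in != stdin)
        fclose(in);
    return 0;
}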

How to edit a specific line of a txt file in C

I am currently trying to edit specific lines of a .txt file in C. The file that I'm using looks like this:
(image: Pixel location and RGB Color)
Now let's say I want to change what's written on the specific line that is highlighted in the image:
400,300: (255,255,255) #FFFFFF
into this:
400,300: (000,000,000) #000000
Basically, I'm trying to create a black dot at specific pixels, in this case at 400,300. This is the code I have:
#include <stdio.h>
int main(void)
{
    const char *filename = "sample.txt";
    int x = 400;
    int y = 300;
    FILE *fp;
    fp = fopen(filename, "w+");
    // Algorithm that reads all the file
    // If("Operation that reads" == x+","+y)
    // {
    //    Replace the line information after where it starts with "400,300"
    //    Like this : 400,300: (000,000,000) #000000
    // }
    // Algorithm that saves the file with the changes.
    fclose(fp);
    printf("Ok - File %s saved\n", filename);
    return 0;
}
Creating, opening, and editing .txt files is kind of new for me, so I don't know what to do; the more I read about it, the more confused I get. How do I approach this problem, and what code would fit here?
Update 1:
FILE *fp;
fp = fopen(filename, "w+");
if ( fp == NULL )
{
printf("Error while opening file");
}
OK, so after reading what you have posted below, I came up with an idea, but it still needs work. I would read everything from the file into a char array. After that I would search each slot for the specific line I was looking for and keep its slot number. Then I would walk the array and, when it reaches that specific slot, replace the needed data. Now all I need to do is swap the information that's in the file for the one that's in the array, save the file, and the problem is solved. But I'm getting errors in the code, and I'm missing the bits of code that would clear the txt file and save the new data.
Update 2:
#include <stdio.h>
int main(void)
{
int x,y;
int k = 0;
int noline; // Used to locate which line is the string im looking for
char search; // Used to compare with each string
char blackcode = (char)000; // In RGB, Black uses (000,000,000)
char blackhexcode = (char)000000; // The hexcode for black is #000000
const char *filename = "sample.txt";
char* strings[480000]; // Since its a 800x600 resolution picture, it needs that many lines.
char line[30]; // Space created to store whats inside each line of the file before transfering
char temp;
FILE * fp;
fp= fopen(filename, "r+");
if ( fp == NULL )
{
printf("Error while opening file");
}
else
{
while(fgets(line, sizeof line, fp))
{
strings[k]=strdup(line); // ERROR HERE! What Am I missing?
k++;
}
for(k = 0; k< sizeof strings; k++)
{
temp = scanf("%[^:]s", strings[k]);
search = ("%s,%s",x,y);
if(temp = search)
{
noline = k;
}
else
{
printf("Error : Wrong Coordinates");
}
}
for(k = 0; k < sizeof strings; k++)
{
if(k == noline)
{
strings[k] = ("%d,%d: (%s,%s,%s) #%s", x, y, blackcode, blackcode, blackcode, blackhexcode); // ERROR HERE! What did i did wrong?
}
}
// Code that cleans the txt file and saves the array back to txt file
}
fclose(fp);
}
What you are missing is somewhat conceptual, and somewhat related to fopen. When you think about opening a file with fopen, you need to pay particular attention to the effect of the file modes. If you look carefully at the man page for either "w" or "w+", you will see that in both cases the existing file is truncated (to 0-length in the case of "w").
To avoid this issue, one approach is to read the entire file into a buffer and then make changes to the buffer, writing the modified buffer back to the original filename. This avoids the possibility of attempting to insert/delete bytes without rewriting the remainder of the file.
To handle reading the file into a buffer, the linked question "overwriting a specific line on a text file?" provides a roadmap to changing a single line in a file. Your case is different: you want to find/replace all occurrences of a particular pattern (that is where the truncation issue poses challenges). However, much of the solution there can be applied to reading the file itself into a buffer, specifically the use of fseek and ftell.
Using fseek and ftell provides a simple way to determine the size (or length) of the file, which can then be used to allocate space to hold the entire file in memory. Below is one approach: a simple function that takes the address of a character pointer and a file pointer, then uses fseek and ftell to allocate the required memory and reads the file into the buffer (filebuf) in a single operation with fread. The buffer is filled in place and also returned. A pointer to the file length, fplen, is passed to the function so the length is made available back in the calling function (main() in this case). Returning a pointer to the buffer on success (NULL otherwise) allows assignment of the return value, if desired, and gives a way to determine success/failure of the read:
char *read_file_into_buf (char **filebuf, long *fplen, FILE *fp)
{
fseek (fp, 0, SEEK_END);
if ((*fplen = ftell (fp)) == -1) { /* get file length */
fprintf (stderr, "error: unable to determine file length.\n");
return NULL;
}
fseek (fp, 0, SEEK_SET); /* allocate memory for file */
if (!(*filebuf = calloc (*fplen, sizeof **filebuf))) { /* element size is one char, not a char pointer */
fprintf (stderr, "error: virtual memory exhausted.\n");
return NULL;
}
/* read entire file into filebuf */
if (!fread (*filebuf, sizeof **filebuf, *fplen, fp)) {
fprintf (stderr, "error: file read failed.\n");
return NULL;
}
return *filebuf;
}
Once you have the file in memory, the second piece of the puzzle is simply to scan through the buffer and make the replacements you need. There are a number of different tweaks you can apply to optimize the search/replace, but the following is just a straightforward, basic search/replace where the only optimization attempt is a comparison of the starting character before using the normal string.h string comparison functions to check for your specified search string. The function returns the number of replacements made, so you can determine whether a write out to the original filename is required:
unsigned find_replace_text (char *find, char *rep, char *buf, long sz)
{
long i;
unsigned rpc = 0;
size_t j, flen, rlen;
flen = strlen (find);
rlen = strlen (rep);
for (i = 0; i < sz; i++) {
/* if char doesn't match first in find, continue */
if (buf[i] != *find) continue;
/* if find found, replace with rep */
if (strncmp (&buf[i], find, flen) == 0) {
for (j = 0; buf[i + j] && j < rlen; j++)
buf[i + j] = rep[j];
if (buf[i + j])
rpc++;
}
}
return rpc;
}
Putting all the pieces together, a short example program using your sample data could be written as follows. The program expects the filename as the first argument (or it will read from stdin and write to stdout by default if no filename is given). There are always additional validation checks you can include as well:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <limits.h>
char *read_file_into_buf (char **filebuf, long *fplen, FILE *fp);
unsigned find_replace_text (char *find, char *rep, char *buf, long sz);
int main (int argc, char **argv) {
char *srchstr = "400,300";
char *repstr = "400,300: (000,000,000) #000000";
char *filebuf = NULL;
long int fplen = 0;
FILE *fp = NULL;
/* open file for reading (default stdin) */
fp = argc > 1 ? fopen (argv[1], "r") : stdin;
if (!fp) { /* validate file open */
fprintf (stderr, "error: file open failed '%s'\n", argv[1]);
return 1;
}
if (!read_file_into_buf (&filebuf, &fplen, fp)) return 1;
if (fplen < 1 || fplen >= INT_MAX) { /* validate file length */
fprintf (stderr, "error: length of file invalid for fwrite use.\n");
return 1;
}
if (fp != stdin) fclose (fp);
/* find/replace text in filebuf */
if (!find_replace_text (srchstr, repstr, filebuf, fplen)) {
printf ("no replacements made.\n");
return 0;
}
/* open file for writing (default stdout) */
fp = argc > 1 ? fopen (argv[1], "w") : stdout;
if (!fp) { /* validate file open */
fprintf (stderr, "error: file open failed '%s'\n", argv[1]);
return 1;
}
/* write modified filebuf back to filename */
if (fwrite (filebuf, sizeof *filebuf, (size_t)fplen, fp) != (size_t)fplen) {
fprintf (stderr, "error: file write failed.\n");
return 1;
}
if (fp != stdout)
if (fclose (fp) == EOF) {
fprintf (stderr, "error: fclose() returned EOF\n");
return 1;
}
free (filebuf);
return 0;
}
Just include the functions at the bottom of the file. You can then:
Compile
gcc -Wall -Wextra -O3 -o bin/fread_file fread_file.c
(or use the equivalent compile string with your compiler)
Input File
$ cat dat/rbgtst.txt
400,280: (234,163,097) #EAA361
400,300: (255,255,255) #FFFFFF
400,320: (064,101,160) #4065A0
400,340: (220,194,110) #DCC26E
Use/File After Replacement
$ ./bin/fread_file dat/rbgtst.txt
$ cat dat/rbgtst.txt
400,280: (234,163,097) #EAA361
400,300: (000,000,000) #000000
400,320: (064,101,160) #4065A0
400,340: (220,194,110) #DCC26E
or reading from stdin writing to stdout:
$ ./bin/fread_file <dat/rbgtst.txt
400,280: (234,163,097) #EAA361
400,300: (000,000,000) #000000
400,320: (064,101,160) #4065A0
400,340: (220,194,110) #DCC26E
Memory/Error Check
In any code you write that dynamically allocates memory, you have 2 responsibilities regarding any block of memory allocated: (1) always preserve a pointer to the starting address of the block of memory so that (2) it can be freed when it is no longer needed.
It is imperative that you use a memory error checking program to ensure you haven't written beyond/outside your allocated block of memory, attempted to read or base a jump on an uninitialized value, and finally to confirm that you have freed all the memory you have allocated.
For Linux, valgrind is the normal choice. There are many subtle ways to misuse a new block of memory. Using a memory error checker allows you to identify any problems and validate proper use of the memory you allocate, rather than finding out a problem exists through a segfault. There are similar memory checkers for every platform. They are all simple to use; just run your program through one. E.g.:
$ valgrind ./bin/fread_file dat/rbgtst.txt
==13768== Memcheck, a memory error detector
==13768== Copyright (C) 2002-2012, and GNU GPL'd, by Julian Seward et al.
==13768== Using Valgrind-3.8.1 and LibVEX; rerun with -h for copyright info
==13768== Command: ./bin/fread_file dat/rbgtst.txt
==13768==
==13768==
==13768== HEAP SUMMARY:
==13768== in use at exit: 0 bytes in 0 blocks
==13768== total heap usage: 3 allocs, 3 frees, 2,128 bytes allocated
==13768==
==13768== All heap blocks were freed -- no leaks are possible
==13768==
==13768== For counts of detected and suppressed errors, rerun with: -v
==13768== ERROR SUMMARY: 0 errors from 0 contexts (suppressed: 2 from 2)
You want to confirm All heap blocks were freed -- no leaks are possible and ERROR SUMMARY: 0 errors from 0 contexts (ignore the suppressed note which simply relates to missing debug symbol files not installed on my system)
Look over the code and understand what it is doing. This isn't presented as the only way of doing what you are attempting to do, but it is presented as an example of how to approach the problem while avoiding a number of pitfalls inherent in trying to change a line-at-a-time in an existing file utilizing offsets and a number of reads/writes to the file. Let me know if you have questions.
You cannot rewrite a specific line of a txt file in general.
Actually, a txt file is just a sequence of bytes. Lines are separated from each other only by the special symbol '\n' (or the symbols '\r', '\n': there are two conventions).
So, if you rewrite some line, you have to move the data (lines) that remain in the file after your new line.
But if your new line has the same length as before, you can write it over the old line without any worries.
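As a rough sketch of that same-length case (my own example, untested against the poster's exact file; it assumes plain '\n' line endings, lines shorter than the buffer, and a replacement exactly as long as the original line):

#include <stdio.h>
#include <string.h>

/* Overwrite the first line that starts with `prefix` with `replacement`.
 * Only safe when strlen(replacement) equals the length of the old line. */
static int overwrite_line(const char *fname, const char *prefix,
                          const char *replacement)
{
    char line[256];
    FILE *fp = fopen(fname, "r+");          /* read/update, no truncation */
    if (fp == NULL)
        return -1;

    while (fgets(line, sizeof line, fp) != NULL) {
        if (strncmp(line, prefix, strlen(prefix)) == 0) {
            /* step back over the line we just read and overwrite it */
            fseek(fp, -(long)strlen(line), SEEK_CUR);
            fwrite(replacement, 1, strlen(replacement), fp);
            fclose(fp);
            return 0;
        }
    }
    fclose(fp);
    return 1;                                /* prefix not found */
}

int main(void)
{
    /* "sample.txt" and the pixel line below are just the example data
     * from the question */
    return overwrite_line("sample.txt", "400,300:",
                          "400,300: (000,000,000) #000000");
}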
The best approach I can think of for something like this is to open the file in read-only mode and then copy everything into a new file opened in "w+" mode. You go line by line through the read file until you find a line that you wish to change, then you write the replacement line yourself into the new copy, skip that line in the read file, and continue on.
Once the copy is what you want, you can rename it to the original file name. Then it will act as if you edited the file the way you wanted to.
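A minimal sketch of that copy-and-rename approach (the file names, buffer size, and matching rule are just placeholders; error handling trimmed for brevity):

#include <stdio.h>
#include <string.h>

int main(void)
{
    FILE *in  = fopen("sample.txt", "r");
    FILE *out = fopen("sample.tmp", "w");
    char line[256];

    if (in == NULL || out == NULL) {
        perror("fopen");
        return 1;
    }
    while (fgets(line, sizeof line, in) != NULL) {
        if (strncmp(line, "400,300:", 8) == 0)
            fputs("400,300: (000,000,000) #000000\n", out);  /* replaced line */
        else
            fputs(line, out);                                 /* copied as-is */
    }
    fclose(in);
    fclose(out);
    /* swap the copy in for the original */
    if (rename("sample.tmp", "sample.txt") != 0) {
        perror("rename");
        return 1;
    }
    return 0;
}

Note that on some systems rename() will not replace an existing file, so you may need to remove() the original first.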

Access violation when searching through a file

This is my algorithm for searching for a term in a file.
void ricerca_file(char* frase){
    char* prelievo = "";
    file = fopen("*userpath*\\file.bin", "rb");
    while((fgets(prelievo, sizeof(prelievo), file)) != NULL){
        if((strstr(prelievo, frase)) != NULL)
            printf("frase trovata!\n");
    }
    fclose(file);
    printf("%s", prelievo);
}
I ask for the input of frase in this way:
char* frase = "";
printf("insert the term that you want to search..");
scanf("%s", frase);
and then i call the function with:
ricerca_file(frase);
The program gives me this error after I enter the input (e.g. the number 2):
prove1.exe: 0xC0000005: Access violation writing location 0x00F67BC3.
If there is a handler for this exception, the program may be safely
continued.
What am I doing wrong?
In case it wasn't clear, I'm learning, and I haven't really understood how to manage searching for a term in a file.
I guess that with this algorithm I can miss lots of matches, because if I search for "hello" and the strstr function moves 5 characters per cycle, then in a file with text like "abchelloabc" it will first look at "abche" and not find anything, and after the first cycle it will move on to the "lloab" part and then "c". Am I right in thinking that it works like that, and that this is wrong?
prelievo points to a string literal. This is constant data that cannot be written to. And sizeof(prelievo) will be 2 or 4 (or whatever size pointers are on your system), which is not what you want.
You'll need to instead point prelievo to an array of characters that can be modified:
char prelievo[1000];
The same problems and solution apply to frase:
char frase[1000];
You need to actually provide memory to save the string you scan into. Try something like this instead:
char frase[80];
printf("insert the term that you want to search..");
fgets(frase, 80, stdin);
This allocates enough space for 80 characters and then reads one line of input.
Please also check the results of all these functions: If they return an error, you should act appropriately.
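Putting both suggestions together, a corrected version of the function might look roughly like this (the buffers become arrays, the FILE pointer is made local, and basic error checks are added; the file path is the same placeholder as in the question):

#include <stdio.h>
#include <string.h>

void ricerca_file(const char *frase)
{
    char prelievo[1000];                      /* a real buffer, not a string literal */
    FILE *file = fopen("*userpath*\\file.bin", "rb");

    if (file == NULL) {
        perror("fopen");
        return;
    }
    while (fgets(prelievo, sizeof prelievo, file) != NULL) {
        if (strstr(prelievo, frase) != NULL)
            printf("frase trovata!\n");
    }
    fclose(file);
}

int main(void)
{
    char frase[80];

    printf("insert the term that you want to search..");
    fflush(stdout);                           /* make sure the prompt appears */
    if (fgets(frase, sizeof frase, stdin) != NULL) {
        frase[strcspn(frase, "\n")] = '\0';   /* drop the trailing newline */
        ricerca_file(frase);
    }
    return 0;
}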
what am I doing wrong:
regarding:
char* prelievo = "";
file = fopen("*userpath*\\file.bin", "rb");
while((fgets(prelievo, sizeof(prelievo), file)) != NULL){
...
The call to fgets() needs to have a pointer to a writable buffer as its first parameter.
Here 'prelievo' is only a pointer to an empty string literal, not a buffer you can write into.
suggestion 1)
char* prelievo = malloc( 1024 );
if ( prelievo ) {
    file = fopen("*userpath*\\file.bin", "rb");
    while((fgets(prelievo, 1024, file)) != NULL){ /* sizeof(prelievo) would only give the pointer size */
suggestion 2)
char prelievo[1024];
file = fopen("*userpath*\\file.bin", "rb");
while((fgets(prelievo, sizeof(prelievo), file)) != NULL){
This answer is not exactly related to your problem, but since you already got your answers, I will try to explain some of the problems you will run into if you ignore error checking.
If we do not check for errors/return values and the program works fine, that does not mean the program is OK or safe.
Let's take the following scenario as an example.
#include<stdio.h>
#include<string.h>
#include<stdlib.h>
char *printFile(char *fileName){
size_t length,size;
char *buffer;
FILE *file;
file = fopen (fileName , "r" );
fseek (file , 0 , SEEK_END);
length = (size_t)ftell (file);
fseek (file , 0 , SEEK_SET);
buffer = malloc(length);
if (buffer == NULL){
fputs ("Memory error",stderr);
exit (2);
}
size = fread (buffer,1,length,file);
if (size != length){
fputs ("Reading error",stderr);
exit(3);
}
fclose (file);
return buffer;
}
int main (void) {
char *fileName = "test.txt";
char *stringToSearch = "Addams";
char *fileContent = printFile(fileName);
if (strstr(fileContent, stringToSearch)){
printf("%s was Found\n",stringToSearch);
}else{
printf("%s was not Found\n",stringToSearch);
}
free(fileContent);
return 0;
}
The file test.txt has the following content:
Michael Jackson
Bryan Addams
Jack Sparrow
So now if I run this program I get:
Addams was Found
Everything seems to be OK, but what happens if I try to share this program with someone? Or what happens if I try to run it on another computer?
Well:
Segmentation fault (core dumped)
OMG, what just happened? Simple: the file test.txt is missing, and I did not check for that in my program; that's why.
Let's move on, create that file, and run the program again:
Addams was not Found
Huh, I succeeded, didn't I? Well, no: valgrind has another opinion:
==3657== Invalid read of size 1
==3657== at 0x4C32FF4: strstr (in /usr/lib/valgrind/vgpreload_memcheck-amd64-linux.so)
==3657== by 0x400A2D: main (in /home/michi/program)
==3657== Address 0x54202b0 is 0 bytes after a block of size 0 alloc'd
==3657== at 0x4C2BBA0: malloc (in /usr/lib/valgrind/vgpreload_memcheck-amd64-linux.so)
==3657== by 0x40095E: printFile (in /home/michi/program)
==3657== by 0x400A16: main (in /home/michi/program)
What happened is that I tried to read a file which was newly created, without checking whether that file had any content, and then performed a lot of work on it.
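For reference, here is printFile with the two missing pieces added (a check that fopen() succeeded, and a terminating NUL so strstr() stays inside the buffer); everything else is unchanged from the code above:

#include <stdio.h>
#include <stdlib.h>

char *printFile(char *fileName)
{
    size_t length, size;
    char *buffer;
    FILE *file = fopen(fileName, "r");

    if (file == NULL) {                 /* catches the missing-file crash */
        perror(fileName);
        exit(1);
    }
    fseek(file, 0, SEEK_END);
    length = (size_t)ftell(file);
    fseek(file, 0, SEEK_SET);

    buffer = malloc(length + 1);        /* one extra byte for the NUL */
    if (buffer == NULL) {
        fputs("Memory error", stderr);
        exit(2);
    }
    size = fread(buffer, 1, length, file);
    if (size != length) {
        fputs("Reading error", stderr);
        exit(3);
    }
    buffer[length] = '\0';              /* so strstr() stops inside the buffer */
    fclose(file);
    return buffer;
}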

Is it legal to use freopen and after it fopen ?

Suppose I have a string char* str.
I print it to the buffer in the following way:
char buf[MAX_LEN];
freopen("tmp","w",stdout);
printf("%s\n",str);
fflush(stdout);
fp = fopen(tmp,"r");
if (fp == NULL) return;
fgets(buf,MAX_LEN,fp);
fclose(fp);
fclose(stdout);
Can this code cause an invalid stream buffer handle?
Is it legal to use freopen and after it fopen?
Because of constraints of my system, I can't use fprintf and sprintf.
In theory, it's perfectly legal and works fine. It's even its main use case, according to its man page:
The freopen() function opens the file whose name is the string
pointed to by path and associates the stream pointed to by stream with
it. The original stream (if it exists) is closed. The mode argument
is used just as in the fopen() function. The primary use of the
freopen() function is to change the file associated with a standard
text stream (stderr, stdin, or stdout)
In practice, your code won't work: there are some mistakes, mainly the mix-up between "tmp" and tmp, and missing headers. This code will work:
#include <stdio.h>
#define MAX_LEN 512
int main() {
const char* str = "data\n";
FILE* fp;
char buf[MAX_LEN];
freopen("tmp","w",stdout);
printf("%s\n",str);
fflush(stdout);
fp = fopen("tmp","r");
if (fp == NULL) return 1; /* main() returns int, so return a value */
fgets(buf,MAX_LEN,fp);
// here, buf gets str's content
fclose(fp);
fclose(stdout);
return 0;
}

*** glibc detected *** free(): invalid next size (normal): 0x0a03c978 *** [duplicate]

This question already has answers here:
Facing an error "*** glibc detected *** free(): invalid next size (fast)"
(2 answers)
Closed 8 years ago.
I'm writing a socket program to download images. The problem is that when I test my code on small pics like GIF it works fine, but when I run it with JPG pics (bigger than GIF) I get this error message:
*** glibc detected *** /home/ubuntu/NetBeansProjects/myDownloader/dist/Debug/GNU-Linux-x86/mydownloader: free(): invalid next size (normal): 0x0a03c978 ***
Please see the code and I'll give more information about the error.
FILE* pFile;
long lSize;
unsigned char* buffer;
size_t result;
FILE* combinedFile = fopen("mypic.jpg", "wb+");
for(i = 1; i <= numberOfPartitions; i++)
{
sprintf(filename, "part%d", i);
pFile = fopen(filename, "rb");
//obtain file size
fseek(pFile , 0 , SEEK_END);
lSize = ftell(pFile);
rewind(pFile);
// allocate memory to contain the whole file:
buffer = (unsigned char*) malloc(sizeof(unsigned char) * (lSize + 1));
if(buffer == NULL)
{
fputs("Memory error", stderr);
exit(2);
}
// copy the file into the buffer:
result = fread(buffer, 1, lSize, pFile);
if(result != lSize)
{
fputs("Reading error", stderr);
exit(3);
}
else
{
unsigned char* temp = strstr(buffer, "\r\n\r\n");
temp = temp + 4;
int len = lSize - (temp - buffer);
//printf("i : %d len is : %d plen is %f\n",i,len,pLen);
if(i != numberOfPartitions)
fwrite(temp, 1, len - 1, combinedFile);
else
fwrite(temp, 1, len, combinedFile);
}
fclose(pFile);
printf("crash here\n");
free(buffer);
}
fclose(combinedFile);
I get the error from this part; as I said, when the image size is small it works fine, but with a bigger size it breaks!
P.S.: The program divides the pic into several files and then re-combines them, so the combining part is the one that causes the error.
Any help will be very much appreciated since I've been stuck with this error for more than 3 days!
You don't verify that the fopen() calls all succeed; this is a recipe for trouble.
You don't check the ftell() gives you a plausible value in lSize.
You don't verify that the strstr() operation actually finds the marker string. If it doesn't, it will return NULL and the following length operations are then bogus. But the error suggests that your code has written out of bounds, rather than just read data out of bounds.
You could declare the first four variables into the body of the loop instead of outside the loop.
You don't show the declaration of the variable filename; could that be a char pointer with no space allocated? Or is it an array that is big enough?
It is an odds-on bet that something has written beyond the end of some allocated space. It is not immediately obvious that there's anything wrong with this code, but the trouble could be elsewhere yet it is this code that suffers the effects of transgressions elsewhere. This is quite common with memory problems; the code that finds the problem isn't the code that causes it.
Does the malloc() on your machine return null or a non-null pointer when you allocate zero bytes? Both are legitimate responses.
If ftell() returns -1, then malloc() would allocate a buffer for 0 bytes, but the fread() would attempt to read up to 4 GB of data, which might overflow the space. OTOH, if ftell() fails, it is likely that fread() will fail too.
Have you printed out the sizes of the files? Is it the second partial file that crashes, or a later file?
I've taken the code you supplied, wrapped it up as a main() function, supplied missing variables and headers, and run it under valgrind. (MacOS X 10.6.6, GCC 4.5.2, Valgrind 3.6.0) It shows no problem. So, your trouble is most probably not in this code per se; something else earlier in your program trampled out of bounds of allocated memory and caused this to fail. I generated the 4 part files using the script:
{ echo "Header:control-Vcontrol-Mreturncontrol-Vcontrol-M";
dd if=/dev/random bs=1k count=4; } >part1
So each file was 4107 bytes long.
Working Code
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
int main(void)
{
char filename[32];
FILE* pFile;
long lSize;
char *buffer;
ssize_t result;
FILE* combinedFile = fopen("mypic.jpg", "wb+");
int numberOfPartitions = 4;
int i;
for(i = 1; i <= numberOfPartitions; i++)
{
sprintf(filename, "part%d", i);
pFile = fopen(filename, "rb");
fseek(pFile , 0 , SEEK_END);
lSize = ftell(pFile);
rewind(pFile);
printf("size(%d) = %ld\n", i, lSize);
buffer = (char*) malloc(sizeof(char) * (lSize + 1));
if (buffer == NULL)
{
fputs("Memory error", stderr);
exit(2);
}
result = fread(buffer, 1, lSize, pFile);
if (result != lSize)
{
fputs("Reading error", stderr);
exit(3);
}
else
{
char* temp = strstr(buffer, "\r\n\r\n");
temp = temp + 4;
int len = lSize - (temp - buffer);
if(i != numberOfPartitions)
fwrite(temp, 1, len - 1, combinedFile);
else
fwrite(temp, 1, len, combinedFile);
}
fclose(pFile);
printf("crash here\n");
free(buffer);
}
fclose(combinedFile);
return 0;
}
I've not inserted all the error checking that I would if it were my own program.
The output file in my scheme is 16381 bytes long; that is 3 bytes short. The problem there is the fwrite() calls. The fread() code told you how many bytes it read; you subtracted the bytes for the header, and then subtracted one more. So, that if/else code reduces to just the fwrite() in the else.
Actually, I can't find anything obviously wrong with your memory or file handling in the code above; the crash on free() might just be a symptom of something in your code writing into malloc()'s personal space...
You could use memory checkers such as Valgrind or debuggers like gdb to take a closer look.
The only possibly wrong thing that comes to mind is that buffer is not necessarily NUL-terminated, and as such the strstr() search can happily run past its end; adding buffer[lSize] = '\0'; after the malloc-NULL-check should fix that. Also, just to be sure, check that strstr() actually found what it was looking for (it returns NULL if it didn't). You may also want to check that all your fopen() calls actually succeed (return non-NULL).
If none of this helps, printouts of the values of len, lSize, temp, and buffer just before the fwrite() calls would be helpful.
