*** glibc detected *** free(): invalid next size (normal): 0x0a03c978 *** [duplicate]

This question already has answers here:
Facing an error "*** glibc detected *** free(): invalid next size (fast)"
(2 answers)
Closed 8 years ago.
I'm writing a socket program to download images. When I test my code on small pictures such as GIFs it works fine, but when I run it with larger JPG pictures I get the error message:
*** glibc detected *** /home/ubuntu/NetBeansProjects/myDownloader/dist/Debug/GNU-Linux-x86/mydownloader: free(): invalid next size (normal): 0x0a03c978 ***
Please see the code and I'll give more information about the error.
FILE* pFile;
long lSize;
unsigned char* buffer;
size_t result;
FILE* combinedFile = fopen("mypic.jpg", "wb+");
for(i = 1; i <= numberOfPartitions; i++)
{
sprintf(filename, "part%d", i);
pFile = fopen(filename, "rb");
//obtain file size
fseek(pFile , 0 , SEEK_END);
lSize = ftell(pFile);
rewind(pFile);
// allocate memory to contain the whole file:
buffer = (unsigned char*) malloc(sizeof(unsigned char) * (lSize + 1));
if(buffer == NULL)
{
fputs("Memory error", stderr);
exit(2);
}
// copy the file into the buffer:
result = fread(buffer, 1, lSize, pFile);
if(result != lSize)
{
fputs("Reading error", stderr);
exit(3);
}
else
{
unsigned char* temp = strstr(buffer, "\r\n\r\n");
temp = temp + 4;
int len = lSize - (temp - buffer);
//printf("i : %d len is : %d plen is %f\n",i,len,pLen);
if(i != numberOfPartitions)
fwrite(temp, 1, len - 1, combinedFile);
else
fwrite(temp, 1, len, combinedFile);
}
fclose(pFile);
printf("crash here\n");
free(buffer);
}
fclose(combinedFile);
I get the error from this part; as I said, it works fine when the image is small, but with bigger sizes it breaks!
P.S: The program divides the picture into several files and then re-combines them, so the combining part is the one that causes the error.
Any help will be very much appreciated since I've been stuck with this error for more than 3 days!

You don't verify that the fopen() calls all succeed; this is a recipe for trouble.
You don't check that ftell() gives you a plausible value in lSize.
You don't verify that the strstr() operation actually finds the marker string. If it doesn't, it will return NULL and the following length operations are then bogus. But the error suggests that your code has written out of bounds, rather than just read data out of bounds.
You could declare the first four variables in the body of the loop instead of outside it.
You don't show the declaration of the variable filename; could that be a char pointer with no space allocated? Or is it an array that is big enough?
It is an odds-on bet that something has written beyond the end of some allocated space. It is not immediately obvious that there's anything wrong with this code, but the trouble could be elsewhere yet it is this code that suffers the effects of transgressions elsewhere. This is quite common with memory problems; the code that finds the problem isn't the code that causes it.
Does the malloc() on your machine return null or a non-null pointer when you allocate zero bytes? Both are legitimate responses.
If ftell() returns -1, then malloc() would allocate a buffer for 0 bytes, but the fread() would attempt to read up to 4 GB of data, which might overflow the space. OTOH, if ftell() fails, it is likely that fread() will fail too.
Have you printed out the sizes of the files? Is it the second partial file that crashes, or a later file?
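Put together, the checks in the list above might look like this sketch (read_part is a hypothetical helper, not from the question; the variable names mirror the question's code):

```c
#include <stdio.h>
#include <stdlib.h>

/* Read a whole partition file into a malloc'd, NUL-terminated buffer.
 * Returns NULL on any failure (fopen, fseek, ftell, malloc, or fread),
 * instead of ploughing on with bogus values. */
unsigned char *read_part(const char *filename, long *out_size)
{
    FILE *pFile = fopen(filename, "rb");
    if (pFile == NULL)                                  /* check fopen() */
        return NULL;

    if (fseek(pFile, 0, SEEK_END) != 0) { fclose(pFile); return NULL; }
    long lSize = ftell(pFile);
    if (lSize < 0) { fclose(pFile); return NULL; }      /* check ftell() */
    rewind(pFile);

    unsigned char *buffer = malloc((size_t)lSize + 1);
    if (buffer == NULL) { fclose(pFile); return NULL; } /* check malloc() */

    if (fread(buffer, 1, (size_t)lSize, pFile) != (size_t)lSize) {
        free(buffer); fclose(pFile); return NULL;       /* check fread() */
    }
    buffer[lSize] = '\0';   /* so a later strstr() cannot run off the end */
    fclose(pFile);
    *out_size = lSize;
    return buffer;
}
```

The caller would then test the return value for NULL before looking for the header marker.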
I've taken the code you supplied, wrapped it up as a main() function, supplied missing variables and headers, and run it under valgrind. (MacOS X 10.6.6, GCC 4.5.2, Valgrind 3.6.0) It shows no problem. So, your trouble is most probably not in this code per se; something else earlier in your program trampled out of bounds of allocated memory and caused this to fail. I generated the 4 part files using the script:
{ echo "Header:control-Vcontrol-Mreturncontrol-Vcontrol-M";
dd if=/dev/random bs=1k count=4; } >part1
So each file was 4107 bytes long.
Working Code
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
int main(void)
{
char filename[32];
FILE* pFile;
long lSize;
char *buffer;
ssize_t result;
FILE* combinedFile = fopen("mypic.jpg", "wb+");
int numberOfPartitions = 4;
int i;
for(i = 1; i <= numberOfPartitions; i++)
{
sprintf(filename, "part%d", i);
pFile = fopen(filename, "rb");
fseek(pFile , 0 , SEEK_END);
lSize = ftell(pFile);
rewind(pFile);
printf("size(%d) = %ld\n", i, lSize);
buffer = (char*) malloc(sizeof(char) * (lSize + 1));
if (buffer == NULL)
{
fputs("Memory error", stderr);
exit(2);
}
result = fread(buffer, 1, lSize, pFile);
if (result != lSize)
{
fputs("Reading error", stderr);
exit(3);
}
else
{
char* temp = strstr(buffer, "\r\n\r\n");
temp = temp + 4;
int len = lSize - (temp - buffer);
if(i != numberOfPartitions)
fwrite(temp, 1, len - 1, combinedFile);
else
fwrite(temp, 1, len, combinedFile);
}
fclose(pFile);
printf("crash here\n");
free(buffer);
}
fclose(combinedFile);
return 0;
}
I've not inserted all the error checking that I would if it were my own program.
The output file in my scheme is 16381 bytes long; that is 3 bytes short. The problem there is the fwrite() calls. The fread() code told you how many bytes it read; you subtracted the bytes for the header, and then subtracted one more. So, that if/else code reduces to just the fwrite() in the else.

Actually I can't find anything obviously wrong with your memory or file handling in the code above, the crash on free() might just be a symptom of something in your code writing into malloc()'s personal space...
You could use memory checkers such as Valgrind or debuggers like gdb to take a closer look.
The only possibly wrong thing that comes to mind is that buffer is not necessarily NUL-terminated, so the strstr() search can happily run past its end; adding buffer[lSize] = '\0'; after the malloc NULL-check should fix that. Also, just to be sure, check that strstr() actually found what it was looking for (it returns NULL if it didn't). You may also want to check that all your fopen() calls succeed (return non-NULL).
If none of this helps, printouts of the values of len, lSize, temp and buffer just before the fwrite() calls would be helpful.
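The two fixes described in this answer can be sketched as follows (find_body is a hypothetical helper; it assumes the buffer was allocated with lSize + 1 bytes, as in the question):

```c
#include <stddef.h>
#include <string.h>

/* NUL-terminate the buffer (which must have room for lSize + 1 bytes),
 * then locate the byte just past the "\r\n\r\n" header separator.
 * Returns NULL when the marker is absent, instead of letting the caller
 * compute a bogus length from a NULL pointer. */
char *find_body(char *buffer, long lSize)
{
    buffer[lSize] = '\0';              /* strstr() can no longer overrun */
    char *temp = strstr(buffer, "\r\n\r\n");
    if (temp == NULL)                  /* marker missing: tell the caller */
        return NULL;
    return temp + 4;
}
```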

Related

fgets statement reads first line and not sure how to modify because I have to return a pointer [duplicate]

I need to copy the contents of a text file to a dynamically-allocated character array.
My problem is getting the size of the contents of the file; Google reveals that I need to use fseek and ftell, but for that the file apparently needs to be opened in binary mode, and that gives only garbage.
EDIT: I tried opening in text mode, but I get weird numbers. Here's the code (I've omitted simple error checking for clarity):
long f_size;
char* code;
size_t code_s, result;
FILE* fp = fopen(argv[0], "r");
fseek(fp, 0, SEEK_END);
f_size = ftell(fp); /* This returns 29696, but file is 85 bytes */
fseek(fp, 0, SEEK_SET);
code_s = sizeof(char) * f_size;
code = malloc(code_s);
result = fread(code, 1, f_size, fp); /* This returns 1045, it should be the same as f_size */
The root of the problem is here:
FILE* fp = fopen(argv[0], "r");
argv[0] is your executable program, NOT the parameter. It certainly won't be a text file. Try argv[1], and see what happens then.
You cannot determine the size of a file in characters without reading the data, unless you're using a fixed-width encoding.
For example, a file in UTF-8 which is 8 bytes long could be anything from 2 to 8 characters in length.
That's not a limitation of the file APIs, it's a natural limitation of there not being a direct mapping from "size of binary data" to "number of characters."
If you have a fixed-width encoding then you can just divide the size of the file in bytes by the number of bytes per character. ASCII is the most obvious example of this, but if your file is encoded in UTF-16 and you happen to be on a system which treats UTF-16 code points as the "native" internal character type (which includes Java, .NET and Windows) then you can predict the number of "characters" to allocate as if UTF-16 were fixed width. (UTF-16 is variable width due to Unicode characters above U+FFFF being encoded in multiple code points, but a lot of the time developers ignore this.)
I'm pretty sure argv[0] won't be a text file.
Give this a try (haven't compiled this, but I've done this a bazillion times, so I'm pretty sure it's at least close):
char* readFile(char* filename)
{
FILE* file = fopen(filename,"r");
if(file == NULL)
{
return NULL;
}
fseek(file, 0, SEEK_END);
long int size = ftell(file);
rewind(file);
char* content = calloc(size + 1, 1);
fread(content,1,size,file);
return content;
}
If you're developing for Linux (or other Unix-like operating systems), you can retrieve the file-size with stat before opening the file:
#include <stdio.h>
#include <sys/stat.h>
int main() {
struct stat file_stat;
if(stat("main.c", &file_stat) != 0) {
perror("could not stat");
return (1);
}
printf("%d\n", (int) file_stat.st_size);
return (0);
}
EDIT: As I see the code, I have to get into the line with the other posters:
The array that takes the arguments from the program-call is constructed this way:
[0] name of the program itself
[1] first argument given
[2] second argument given
[n] n-th argument given
You should also check argc before trying to use a field other than '0' of the argv-array:
if (argc < 2) {
printf ("Usage: %s arg1", argv[0]);
return (1);
}
argv[0] is the path to the executable, and thus argv[1] will be the first user-submitted argument. Try altering that, add some simple error checking (such as checking whether fp == 0), and we might be able to help you further.
You can open the file, put the cursor at the end of the file, store the offset, go back to the top of the file, and take the difference.
You can use fseek for text files as well.
fseek to end of file
ftell the offset
fseek back to the begining
and you have size of the file
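Those four steps can be wrapped up like this (a sketch; note that on Windows in text mode the ftell() value need not equal the number of characters fread() will deliver):

```c
#include <stdio.h>

/* fseek to the end, ftell the offset, fseek back to the beginning:
 * returns the size of the file in bytes, or -1 on failure. */
long file_size(FILE *fp)
{
    if (fseek(fp, 0, SEEK_END) != 0)
        return -1;
    long size = ftell(fp);          /* offset of the end == size in bytes */
    if (fseek(fp, 0, SEEK_SET) != 0)
        return -1;
    return size;
}
```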
Kind of hard with no sample code, but fstat (or stat) will tell you how big the file is. You allocate the memory required, and slurp the file in.
Another approach is to read the file a piece at a time and extend your dynamic buffer as needed:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#define PAGESIZE 128
int main(int argc, char **argv)
{
char *buf = NULL, *tmp = NULL;
size_t bufSiz = 0;
char inputBuf[PAGESIZE];
FILE *in;
if (argc < 2)
{
printf("Usage: %s filename\n", argv[0]);
return 0;
}
in = fopen(argv[1], "r");
if (in)
{
/**
* Read a page at a time until reaching the end of the file
*/
while (fgets(inputBuf, sizeof inputBuf, in) != NULL)
{
/**
* Extend the dynamic buffer by the length of the string
* in the input buffer
*/
tmp = realloc(buf, bufSiz + strlen(inputBuf) + 1);
if (tmp)
{
/**
* Add to the contents of the dynamic buffer
*/
buf = tmp;
buf[bufSiz] = 0;
strcat(buf, inputBuf);
bufSiz += strlen(inputBuf) + 1;
}
else
{
printf("Unable to extend dynamic buffer: releasing allocated memory\n");
free(buf);
buf = NULL;
break;
}
}
if (feof(in))
printf("Reached the end of input file %s\n", argv[1]);
else if (ferror(in))
printf("Error while reading input file %s\n", argv[1]);
if (buf)
{
printf("File contents:\n%s\n", buf);
printf("Read %lu characters from %s\n",
(unsigned long) strlen(buf), argv[1]);
}
free(buf);
fclose(in);
}
else
{
printf("Unable to open input file %s\n", argv[1]);
}
return 0;
}
There are drawbacks with this approach; for one thing, if there isn't enough memory to hold the file's contents, you won't know it immediately. Also, realloc() is relatively expensive to call, so you don't want to make your page sizes too small.
However, this avoids having to use fstat() or fseek()/ftell() to figure out how big the file is beforehand.

Can't read whole file in c

I'm trying to move content from one file to another.
My code:
char *path = extractFileName(args[1]);
if (path == 0)
return -1;
FILE *input = fopen(path, "r");
rewind(input);
fseek(input, 0L, SEEK_END);
long sz = ftell(input);
printf("sz: %ld\n", sz);
rewind(input);
size_t a;
FILE *result = fopen("result.mp3", "w");
size_t counter = 0;
char buffer[128];
while ((a = fread(&buffer[0], 1, 128, input)) != 0) {
fwrite(&buffer[0], 1, a, result);
counter += a;
}
printf("%d\n", counter);
printf("ferror input: %d\n", ferror(input));
printf("feof input: %d\n", feof(input));
After execution it prints
sz: 6675688
25662
ferror input: 0
feof input: 16
As far as I know this means that C knows the size of the input file is 6675688 bytes (about 6.6 MB), but it returns EOF when I try to read more than 25662 bytes. What am I doing wrong?
Since your output filename is result.mp3, it's a safe bet you're dealing with non-textual data. That means you should be opening your files in binary mode - "rb" and "wb" respectively. If you're running this code on Windows, not doing that would explain the behavior you're seeing (On that platform, reading a particular byte (0x1A) in text mode causes it to signal end of file even when it's not actually the end), and using binary mode will fix it. On other OSes, it's a no-op but still clues the reader into your intentions and the type of data you're expecting to work with, and is thus a good idea even if it's not strictly needed on them.
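A minimal binary-safe copy loop following that advice might look like this (a sketch; copy_binary and the buffer size are illustrative, not from the question):

```c
#include <stdio.h>

/* Copy src to dst in binary mode.  "rb"/"wb" keep bytes such as 0x1A and
 * CR/LF intact on Windows, and are a harmless no-op on other platforms.
 * Returns the number of bytes copied, or -1 if either open fails. */
long copy_binary(const char *src, const char *dst)
{
    FILE *in = fopen(src, "rb");
    if (in == NULL)
        return -1;
    FILE *out = fopen(dst, "wb");
    if (out == NULL) { fclose(in); return -1; }

    char buffer[128];
    size_t n;
    long total = 0;
    while ((n = fread(buffer, 1, sizeof buffer, in)) != 0) {
        fwrite(buffer, 1, n, out);   /* write only what was actually read */
        total += (long)n;
    }
    fclose(in);
    fclose(out);
    return total;
}
```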

Why wont my code work to open a new file using sprintf and fopen?

int main()
{
int size = 512, i = 1;
char buffer[1000];
char *newFileTemp;
char const *chunk = "Chunk";
memset(buffer, 0, sizeof(buffer));
FILE *fb;
FILE *fp=fopen("blah.txt", "r");
if (fp == NULL)
{
perror("doesnt exist");
return 0;
}
fread(buffer,sizeof(char),sizeof(buffer), fp);
sprintf(newFileTemp, "%s%i", chunk, i);
printf("blah check %s",newFileTemp);
fb = fopen(newFileTemp, "wb");
if (fb == NULL)
{
perror("doesnt exist");
return 0;
}
fwrite(buffer, sizeof(char), sizeof(buffer), fb);
fclose(fp);
fclose(fb);
return 0;
}
I'm trying to use sprintf to create a new file named chunk1 that has the data of the file blah.txt (blah.txt is already created). But even though the code compiles properly, it doesn't create a new file. Please help.
What you are experiencing is called an undefined behaviour, because you are using newFileTemp before you have initialized it. To correct the problem, initialize it like this:
newFileTemp = (char*)malloc(100);
or declare it like this:
char newFileTemp[100];
The reason is that sprintf expects newFileTemp to have enough space allocated for storing the string it is formatting (it will not allocate it for you).
Also:
If you use malloc don't forget to free.
Don't forget to check the success of functions like fread and fwrite.
You'll have another problem later: in your call to fwrite you always try to write 1000 bytes (sizeof(buffer) is 1000 bytes), even if your file had fewer. This is where the return value of fread comes into play (it returns the actual number of bytes read): you need to use that return value instead of sizeof(buffer).
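Sketching that last point: the count fread() returns is what should be handed to fwrite() (copy_chunk here is a hypothetical helper, not part of the question's code):

```c
#include <stdio.h>

/* Write only the bytes fread() actually delivered, not sizeof(buffer).
 * Returns the number of bytes transferred (0 at end of file). */
size_t copy_chunk(FILE *in, FILE *out, char *buffer, size_t bufsize)
{
    size_t got = fread(buffer, 1, bufsize, in);   /* may be < bufsize */
    if (got > 0)
        fwrite(buffer, 1, got, out);              /* exactly 'got' bytes */
    return got;
}
```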

Reading file line by line with fgets in C, invalid read of size [duplicate]

This question already has answers here:
Valgrind on OS X Yosemite, giving bogus errors? [duplicate]
(4 answers)
Closed 7 years ago.
I keep getting a valgrind error in my code, and after three hours I remain clueless so I need your help people.
I basically just read the files contained in a directory and parse them; I've copied the shortest example of my code that still produces the error:
int main(int argc, char** argv) {
parse_files_dir("/Users/link_to_dir_example/");
return (EXIT_SUCCESS);
}
void parse_files_dir(char *dirLink){
int dLink_l =strlen(dirLink);
int max_len = dLink_l*2;
char* full_path=malloc(sizeof(char)*(max_len+1));
//check if null pointer...
strncpy(full_path, dirLink, dLink_l);
DIR *dir;
struct dirent *dir_con;
dir=opendir(dirLink);
if (dir == NULL){
fprintf(stderr, "Problem opening directory: \"%s\". Aborting...\n", dirLink);
exit(EXIT_FAILURE);
}
while((dir_con = readdir(dir)) != NULL){
if (dir_con->d_name[0] == '.') continue;
if (dLink_l+strlen(dir_con->d_name)>max_len) //realloc full path..
strncpy(&full_path[dLink_l], dir_con->d_name, strlen(dir_con->d_name));
full_path[dLink_l+strlen(dir_con->d_name)] = '\0';
parse_one_file(full_path); // (*) <=== valgrind complain
full_path[dLink_l] = '\0';
}
free(full_path);
closedir(dir);
}
So now the actual problem method:
void parse_one_file(char* link) {
FILE *file = fopen(link, "r");
if (file == NULL) //error message
int line_len=0;
int line_max=1000;
char* line= malloc(sizeof(char)*line_max);
line[0] = '\0';
char* line_full= malloc(sizeof(char)*line_max);
line_full[0] = '\0';
int line_full_len = 0;
//check all allocations for null pointers
while(fgets(line, line_max, file) != NULL){ // <=== Here is where valgrind complains !!!!
line_len = strlen(line);
if (line[line_len-1] == '\n'){
strncpy(&line_full[line_full_len], line, line_len);
line_full_len+=line_len;
}
else{
//Check if line_full has enough memory left
strncpy(&line_full[line_full_len], line, line_len);
line_full_len+=line_len;
}
line[0] = '\0';
}
free(line);
free(line_full);
fclose(file);
}
I keep getting the error:
==4929== Invalid read of size 32
==4929== at 0x1003DDC1D: _platform_memchr$VARIANT$Haswell (in /usr/lib/system/libsystem_platform.dylib)
==4929== by 0x1001CF66A: fgets (in /usr/lib/system/libsystem_c.dylib)
==4929== by 0x100000CD8: parse_one_file (main.c:93)
==4929== by 0x100000B74: parse_files_dir (main.c:67)
==4929== by 0x100000944: main (main.c:28)
==4929== Address 0x100804dc0 is 32 bytes before a block of size 4,096 in arena "client"
So I really don't see where my mistake is: I keep emptying the buffer line, and I never read more bytes than are allocated there.
The interesting thing I noticed is that if the directory dirLink contains only one file, the error does not occur, but with two or more files it does. So I thought the mistake was in how I generate the path full_path, but then (just for testing) I replaced the line marked (*) with
parse_one_file("another/example/path/");
and the error remained.
Unless your file is less than 1000 bytes in total you are writing over the end of the line_full buffer which is only 1000 bytes total in size. This will invariably clobber your memory and lead to spurious errors like the one you experience in fgets.
if(line[line_len-1] == '\n'){
strncpy(&line_full[line_full_len], line, line_len);
line_full_len+=line_len;
}
This is not quite correct: you can only strncpy() (line_max - line_full_len) bytes; there is no guarantee that you can copy line_len bytes. In other words, starting from position line_full[500], you can't write another 1000 bytes.
The same error is in the else branch.
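A bounded append along those lines might look like this (append_bounded is a hypothetical helper illustrating the (line_max - line_full_len) limit; it truncates instead of clobbering memory):

```c
#include <stddef.h>
#include <string.h>

/* Append at most the remaining capacity.  'cap' is the total size of dst,
 * 'used' how many bytes are already in it.  Returns the new 'used' count;
 * dst stays NUL-terminated and never grows past cap bytes. */
size_t append_bounded(char *dst, size_t cap, size_t used,
                      const char *src, size_t len)
{
    if (used + 1 >= cap)           /* buffer already full */
        return used;
    size_t room = cap - used - 1;  /* keep one byte for the NUL */
    if (len > room)
        len = room;                /* truncate rather than overflow */
    memcpy(dst + used, src, len);
    used += len;
    dst[used] = '\0';
    return used;
}
```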

Access violation when searching through a file

This is my algorithm for searching a term into a file.
void ricerca_file(char* frase){
char* prelievo = "";
file = fopen("*userpath*\\file.bin", "rb");
while((fgets(prelievo, sizeof(prelievo), file)) != NULL){
if((strstr(prelievo, frase)) != NULL)
printf("frase trovata!\n");
}
fclose(file);
printf("%s", prelievo);
}
I ask for the input of frase in this way:
char* frase = "";
printf("insert the term that you want to search..");
scanf("%s", frase);
and then I call the function with:
ricerca_file(frase);
The program gives me this error at runtime, after I enter the input (e.g. the number 2):
prove1.exe: 0xC0000005: Access violation writing location 0x00F67BC3.
If there is a handler for this exception, the program may be safely
continued.
What am I doing wrong?
In case it wasn't clear, I'm learning, and I haven't really understood how to manage searching for a term in a file.
I guess that with this algorithm I could miss lots of matches: if I search for "hello" in a file containing "abchelloabc", and strstr moves 5 characters per cycle, it would first see "abche" and find nothing, then "lloab", then "c". Am I right that it works like that, and that this is wrong?
prelievo points to a string literal. This is constant data that cannot be written to. And sizeof(prelievo) will be 2 or 4 (or whatever size pointers are on your system), which is not what you want.
You'll need to instead point prelievo to an array of characters that can be modified:
char prelievo[1000];
The same problems and solution apply to frase:
char frase[1000];
You need to actually provide memory to save the string you scan into. Try something like this instead:
char frase[80];
printf("insert the term that you want to search..");
fgets(frase, 80, stdin);
This allocates enough space for 80 characters and then reads one line of input.
Please also check the results of all these functions: If they return an error, you should act appropriately.
Regarding "what am I doing wrong?", look at:
char* prelievo = "";
file = fopen("*userpath*\\file.bin", "rb");
while((fgets(prelievo, sizeof(prelievo), file)) != NULL){
...
The call to fgets() needs a pointer to a writable buffer as its first parameter.
Here prelievo points only to an empty string literal, which provides no writable space.
suggestion 1)
char* prelievo = malloc( 1024 );
if ( prelievo ) {
file = fopen("*userpath*\\file.bin", "rb");
while((fgets(prelievo, 1024, file)) != NULL){ /* sizeof(prelievo) would only be the pointer size */
suggestion 2)
char prelievo[1024];
file = fopen("*userpath*\\file.bin", "rb");
while((fgets(prelievo, sizeof(prelievo), file)) != NULL){
This answer is not exactly related to your problem, but since you already got your answers, I will try to explain what can go wrong if you ignore these checks.
If we do not check for errors/return values and the program happens to work fine, that does not mean the program is OK or safe.
Let's take the following scenario as an example.
#include<stdio.h>
#include<string.h>
#include<stdlib.h>
char *printFile(char *fileName){
size_t length,size;
char *buffer;
FILE *file;
file = fopen (fileName , "r" );
fseek (file , 0 , SEEK_END);
length = (size_t)ftell (file);
fseek (file , 0 , SEEK_SET);
buffer = malloc(length);
if (buffer == NULL){
fputs ("Memory error",stderr);
exit (2);
}
size = fread (buffer,1,length,file);
if (size != length){
fputs ("Reading error",stderr);
exit(3);
}
fclose (file);
return buffer;
}
int main (void) {
char *fileName = "test.txt";
char *stringToSearch = "Addams";
char *fileContent = printFile(fileName);
if (strstr(fileContent, stringToSearch)){
printf("%s was Found\n",stringToSearch);
}else{
printf("%s was not Found\n",stringToSearch);
}
free(fileContent);
return 0;
}
The file test.txt has the following content:
Michael Jackson
Bryan Addams
Jack Sparrow
So now if I run this program I get:
Addams was Found
Everything seems to be OK, but what happens if I try to share this program with someone, or run it on another computer?
well:
Segmentation fault (core dumped)
What just happened? Simple: the file test.txt is missing, and I did not check for that in my program.
Let's move on, create that file, and run the program again:
Addams was not Found
So I succeeded, didn't I? Well, no; valgrind has another opinion:
==3657== Invalid read of size 1
==3657== at 0x4C32FF4: strstr (in /usr/lib/valgrind/vgpreload_memcheck-amd64-linux.so)
==3657== by 0x400A2D: main (in /home/michi/program)
==3657== Address 0x54202b0 is 0 bytes after a block of size 0 alloc'd
==3657== at 0x4C2BBA0: malloc (in /usr/lib/valgrind/vgpreload_memcheck-amd64-linux.so)
==3657== by 0x40095E: printFile (in /home/michi/program)
==3657== by 0x400A16: main (in /home/michi/program)
What happened is that I tried to read a newly created file without checking whether it had any content, and then performed a lot of processing on it.
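A version of printFile() with the missing checks might look like this sketch (readFileChecked is a hypothetical name; it also NUL-terminates the buffer so the strstr() in main() stays in bounds even for an empty file):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Like printFile() above, but it checks fopen()/ftell() and returns a
 * NUL-terminated buffer, so the caller's strstr() cannot read past the
 * end of the allocation.  Returns NULL on failure instead of crashing. */
char *readFileChecked(const char *fileName)
{
    FILE *file = fopen(fileName, "r");
    if (file == NULL) {                      /* missing file: fail loudly */
        perror(fileName);
        return NULL;
    }
    fseek(file, 0, SEEK_END);
    long length = ftell(file);
    if (length < 0) { fclose(file); return NULL; }
    fseek(file, 0, SEEK_SET);

    char *buffer = malloc((size_t)length + 1);  /* +1 for the terminator */
    if (buffer == NULL) { fclose(file); return NULL; }

    size_t size = fread(buffer, 1, (size_t)length, file);
    buffer[size] = '\0';        /* empty file -> "" rather than garbage */
    fclose(file);
    return buffer;
}
```

With this version, the missing-file case prints an error and returns NULL, and the empty-file case gives strstr() a valid empty string, so both valgrind complaints above go away.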