C - Exponential function causes a segfault with file IO

EDIT: To answer some questions, this is the revised and still-failing code (most of it was there to begin with, but I should have been explicit that I initialise the file pointer, etc.). Again, it only works if I either add a write before the exp() or remove the exp() entirely:
FILE *outfile;
char *outfilename;
outfilename = (char *)malloc(FILENAME_MAX*sizeof(char));
strcpy(outfilename, "outfile.txt");
outfile = fopen(realoutfilename, "w");
/* If this is uncommented, there isn't a segfault
if (realoutfile != NULL && imoutfile != NULL) {
    fprintf(outfile, "\r\n");
    fseek(outfile, 0, SEEK_SET);
}
*/
gauss = (double*) calloc(points, sizeof(double));
/* Maths and stuff */
if (outfile != NULL) {
    for (i = 0; i < points; i++) {
        /* this prints fine */
        printf("%g,\r\n", gauss[i]);
        /* Seg fault is here */
        fprintf(outfile, "%g,\r\n", gauss[i]);
    }
}
fclose(outfile);
free(outfile);
And I'm compiling with:
gcc main.c -lm -Wall -Wextra -Werror -Wshadow -g -o main
To clarify, it doesn't reach the end of the function - so it's not the freeing that it crashes on. The crash is when it tries to write to the file in that for loop.
I've checked that exp() isn't overflowing or underflowing; as I say, I can printf the output, but writing it to the file is a no-go. It also fails if I try a simple call, say exp(2).
The gdb backtrace is below (I'm not that familiar with gdb, but I thought it might help):
#0 0xff15665c in _malloc_unlocked () from /lib/libc.so.1
#1 0xff15641c in malloc () from /lib/libc.so.1
#2 0xff1a8c80 in _findbuf () from /lib/libc.so.1
#3 0xff1a8f0c in _wrtchk () from /lib/libc.so.1
#4 0xff1ad834 in _fwrite_unlocked () from /lib/libc.so.1
#5 0xff1ad798 in fwrite () from /lib/libc.so.1
#6 0x000128ac in gaussian ()
#7 0x00010f78 in main ()
Any help would be greatly appreciated!

The problem is here:
outfilename = (char *)malloc(FILENAME_MAX*sizeof(char));
outfilename = "file.txt";
You can't assign a string like that, you have to use strcpy:
strcpy(outfilename, "file.txt");
What's happening is that you are overwriting the outfilename pointer with the string assignment (and leaking the buffer malloc returned). Then you try to free it with free(outfilename);. Since you are now freeing a string literal, the behavior is undefined, hence the crash you are getting.
As for why it crashes in one case and not the other: the behavior is undefined, therefore anything is allowed to happen. It's possible that your exponential-function code changes the heap/stack layout in a way that tips it between crashing and not crashing.
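To make the difference concrete, here is a minimal standalone sketch (the file name is just an example) contrasting the broken pointer assignment with the strcpy fix:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    char *name = malloc(FILENAME_MAX);
    if (name == NULL)
        return 1;

    /* WRONG: name = "file.txt"; would discard (and leak) the malloc'd block
       and make name point at a string literal, so free(name) would be undefined. */

    /* RIGHT: copy the characters into the buffer malloc returned. */
    strcpy(name, "file.txt");

    printf("%s\n", name);
    free(name);   /* fine: name still points at the malloc'd block */
    return 0;
}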
EDIT: I hope it's just a typo or mis-copy, but I also don't see where outfile is initialized. If it really is never initialized, then that's the other error (and most likely the one that's causing your particular segfault).
So it should look like this:
FILE *outfile;
char *outfilename;
outfilename = (char *)malloc(FILENAME_MAX*sizeof(char));
strcpy(outfilename, "file.txt");
outfile = fopen(outfilename, "w");
if (outfile == NULL) {
    // Error, the file could not be opened.
}
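Putting the pieces together, a minimal sketch of the whole write path from the question might look like this (points, gauss and the Gaussian maths are stand-ins for the question's real code, and the error handling is deliberately simple):
#include <math.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    int points = 100;
    char *outfilename = malloc(FILENAME_MAX);
    double *gauss = calloc(points, sizeof(double));
    if (outfilename == NULL || gauss == NULL)
        return 1;

    strcpy(outfilename, "outfile.txt");      /* copy into the buffer; don't reassign the pointer */

    FILE *outfile = fopen(outfilename, "w");
    if (outfile == NULL) {
        perror(outfilename);
        return 1;
    }

    for (int i = 0; i < points; i++) {
        gauss[i] = exp(-(double)i);          /* stand-in for the real maths */
        fprintf(outfile, "%g,\r\n", gauss[i]);
    }

    fclose(outfile);
    free(outfilename);                       /* free the name buffer, not the FILE pointer */
    free(gauss);
    return 0;
}
Built with the gcc command from the question (it already passes -lm for exp), this should write the file without crashing.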

First, you really should compile with all warnings enabled and with debugging information produced; with gcc that means the -Wall -g flags.
FILE *outfile;
outfilename = (char *)malloc(FILENAME_MAX*sizeof(char));
outfilename = "file.txt";
You should use strdup, and I see no call to fopen, e.g.:
outfile = fopen(outfilename, "r");
And you should learn to use the debugger gdb (or perhaps its ddd graphical front-end).
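To make the first two suggestions concrete, here is a minimal sketch using strdup (POSIX) in place of the malloc+strcpy pair, with the fopen result checked; it opens for writing, since that is what the question does:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    char *outfilename = strdup("file.txt");   /* allocate and copy in one call */
    if (outfilename == NULL)
        return 1;
    FILE *outfile = fopen(outfilename, "w");
    if (outfile == NULL) {
        perror(outfilename);
        free(outfilename);
        return 1;
    }
    fprintf(outfile, "hello\n");
    fclose(outfile);
    free(outfilename);   /* valid: strdup's result comes from malloc */
    return 0;
}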

outfilename = "file.txt";
/* snip */
free(outfilename);
You can only free something you got back from malloc (or calloc/realloc). You can't pass free a pointer to a string literal!
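A minimal illustration of that rule:
#include <stdlib.h>
#include <string.h>

int main(void)
{
    char *heap = malloc(16);
    if (heap == NULL)
        return 1;
    strcpy(heap, "file.txt");
    free(heap);               /* OK: this pointer came from malloc */

    char *literal = "file.txt";
    /* free(literal); */      /* undefined behavior: a string literal was never malloc'd */
    (void)literal;            /* silence the unused-variable warning */
    return 0;
}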


Why do segfaults occur with string.h functions?

With the same command, my program works without any problem on my coworker's PC.
But on my PC, the program crashes with a segfault.
The GDB backtrace on the core dump reads as follows:
#0 strrchr () at ../sysdeps/x86_64/strrchr.S:32
32 ../sysdeps/x86_64/strrchr.S: no such file or directory
(gdb) bt
#0 strrchr () at ../sysdeps/x86_64/strrchr.S:32
#1 0x00007f10961236d7 in dirname (path=0x324a47a0 <error: Cannot access memory at address 0x324a47a0>) at dirname.c:31
I'm already compiling the executable with -g -ggdb options.
The odd thing is that under valgrind the program works without error on my PC as well.
How can I solve the problem? I've observed that the errors occur only with strrchr, strcmp, strlen, and other string.h functions.
+Edit: the gdb backtrace indicates that the program crashes here:
char* base_dir = dirname(get_abs_name(test_dir));
where get_abs_name is defined as
char* get_abs_name(char* dir) {
    char abs_path[PATH_MAX];
    char* c = malloc(PATH_MAX*sizeof(char));
    realpath(dir, abs_path);
    strcpy(c, abs_path);
    return c;
}
+Edit2: 'dir' is the path of a certain file, like '../program/blabla.jpg'.
Using valgrind,
printf("%s\n", dir)
normally prints '/home/frozenca/path_to_program'.
I can't figure out why the program crashes without valgrind.
We cannot know for sure without a Minimal, Complete, and Verifiable example. Your code looks mostly correct (albeit convoluted), except you do not check for errors.
char* get_abs_name(char* dir) {
    char abs_path[PATH_MAX];
    char* c = malloc(PATH_MAX*sizeof(char)); /* this may return NULL */
    realpath(dir, abs_path);                 /* this may return NULL */
    strcpy(c, abs_path);
    return c;
}
Now, how could this lead to an error like you see? Well, if malloc returns NULL, you'll get a crash right away in strcpy. But if realpath fails:
The content of abs_path remains undefined.
So strcpy(c, abs_path) will copy undefined content. That could mean copying just one byte, if abs_path[0] happens to be \0, but it could also mean massive heap corruption. Which one you get depends on unrelated conditions, such as how the program is compiled and whether a debugging tool such as valgrind is attached.
TL;DR: get into the habit of checking every function that may fail.
char* get_abs_name(char* dir) {
    char abs_path[PATH_MAX];
    char* c = malloc(PATH_MAX*sizeof(char));
    if (!c) { return NULL; }
    if (!realpath(dir, abs_path)) {
        free(c);
        return NULL;
    }
    strcpy(c, abs_path);
    return c;
}
Or, here, you can simplify it a lot, assuming a GNU or POSIX.1-2008 system:
char * get_abs_name(const char * dir) {
    return realpath(dir, NULL);
}
Note however that either way, in your main program, you also must check that get_abs_name() did not return NULL, otherwise dirname() will crash.
Drop your function entirely and use the return value of realpath(dir, NULL) instead.
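For instance, a minimal sketch of the calling code under that simplification (assuming a POSIX.1-2008 realpath that allocates when passed NULL; the path is the example from the question):
#include <libgen.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const char *test_dir = "../program/blabla.jpg";   /* example path from the question */
    char *abs = realpath(test_dir, NULL);   /* POSIX.1-2008: allocates the result itself */
    if (abs == NULL) {
        perror("realpath");
        return 1;
    }
    char *base_dir = dirname(abs);          /* note: dirname may modify abs in place */
    printf("%s\n", base_dir);
    free(abs);                              /* free the buffer realpath allocated */
    return 0;
}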

C segmentation fault when opening file

This seems to be a really simple one, but I can't figure it out after not touching C programming in four years.
I was trying to open a file in main()
int main(int argc, const char * argv[])
{
    FILE * fp = fopen("data.txt","r");
    ...
    return(0);
}
The program compiled, but when I tried to run it in gdb, the following error occurred:
Program received signal SIGSEGV, Segmentation fault.
0x00000000004016c6 in main ()
The error occurs when the program tries to open the file "data.txt". What could cause it? Thanks!
I suspect your error lies in this bit of code:
...
In other words, there's nothing in the other code shown that appears to be wrong.
The most likely case is that the file doesn't exist, or it doesn't exist in the directory where the program is running (which, if you're in an IDE, usually turns out to be somewhere other than you think it is).
And, in that case, you're getting NULL from the fopen, then later using it, something like:
FILE *fp = fopen ("no_such_file.txt", "r");
int ch = fgetc (fp);
You should generally check return values from all functions that use them to indicate success or failure:
#include <stdio.h>

int main (void) {
    FILE *fp = fopen ("no_such_file.txt", "r");
    if (fp == NULL) {
        perror ("Opening no_such_file.txt");
        return 1;
    }

    // You can use fp here.
    puts ("It worked.");

    fclose (fp);
    return 0;
}
What could cause the error?
The most likely cause of the error is that the file data.txt could not be opened (e.g. because it doesn't exist, or it's not in the current directory, or your program doesn't have permission to read it). That will cause fopen() to return NULL. Then if your code (in the ... section) tries to call fread() or fgets() or whatever and passes in the NULL pointer, that will cause a crash. You need to check the value returned by fopen() to make sure it is non-NULL before trying to use it.

Segfault occurring after a week of running?

I've got a C program which runs fine, but after around a week of running it always seems to segfault. I've compiled it with -g and run it through gdb, and it looks like it's pointing to the following code.
In my main loop I call a function (added, actually, to try and debug why it's crashing):
char config_debug[10];
I then read a conf file and, based on the current setting in it, set config_debug to "true".
Then in my program I call this:
(line 312):
debug("send off data",config_debug);
This is the function:
int debug(char *debug_info, char *config_debug)
{
    chomp(config_debug);
    if ( strcmp(config_debug,"true") == 0 )
    {
        FILE *fp;
        fp = fopen("/tmp/debug.log", "a");
(line 55):
        fprintf(fp, debug_info);
        fprintf(fp, "\n");
        fclose(fp);
    }
    return 0;
}

void chomp(char *s) {
    while (*s && *s != '\n' && *s != '\r') s++;
    *s = 0;
}
Can anyone see anything wrong with the above 2 functions?
Here is a trace if it helps:
Program terminated with signal 11, Segmentation fault.
#0 0xb6d7a67c in vfprintf () from /lib/arm-linux-gnueabihf/libc.so.6 (gdb) bt
#0 0xb6d7a67c in vfprintf () from /lib/arm-linux-gnueabihf/libc.so.6
#1 0xb6d83cd8 in fprintf () from /lib/arm-linux-gnueabihf/libc.so.6
#2 0x0000a848 in debug (debug_info=0xc304 "send off data", config_debug=0xbec0cb5c "true") at station.c:55
#3 0x0000b614 in main (argc=1, argv=0xbec0cd94) at station.c:312
fprintf(fp, debug_info);
is wrong: it treats debug_info as a format string, so it has undefined behavior if debug_info contains a % (followed by some characters, like s for instance).
You should read fprintf(3) and enable all warnings, e.g. by passing -Wall -g to your cross-compiler (clang would have warned you about this, and gcc should too, at least with -Wextra). In your case you can simply replace that faulty fprintf with a simpler and faster call to fputs(3), like:
fputs(debug_info, fp);
(In embedded applications, fputs is often worth using, since it is faster than fprintf; sometimes the compiler even optimizes an fprintf call into something simpler.)
and replace the fprintf(fp, "\n"); with a simple putc('\n', fp);
BTW, it is confusing to have config_debug be both a global variable and a parameter. Avoid name collisions to improve readability. Be sure that config_debug and debug_info are null-terminated strings.
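Putting those suggestions together, a corrected debug() might look like the sketch below; the fopen check is an extra precaution not discussed above, since an unchecked NULL fp would crash fputs just as the original fprintf did:
#include <stdio.h>
#include <string.h>

void chomp(char *s);   /* as defined in the question */

int debug(char *debug_info, char *config_debug)
{
    chomp(config_debug);
    if (strcmp(config_debug, "true") == 0) {
        FILE *fp = fopen("/tmp/debug.log", "a");
        if (fp == NULL)
            return -1;              /* added check: a NULL fp would crash fputs too */
        fputs(debug_info, fp);      /* no format string, so '%' in the message is harmless */
        putc('\n', fp);
        fclose(fp);
    }
    return 0;
}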
fprintf(fp, debug_info);
fprintf(fp, "\n");
The above two statements are wrong. Modify them like this:
fprintf(fp,"%s", debug_info);
fprintf(fp,"%s", "\n");
see fprintf()
Added from Basile Starynkevitch's comment:
You can also use fputs() and fputc(). These are simpler and more efficient:
fputs(debug_info,fp);
fputc('\n',fp);

Segmentation fault backtracked to vfprintf?

I get a segmentation fault, and using gdb and backtrace, it is thrown in vfprintf.
#0 0x006e8779 in vfprintf () from /lib/libc.so.6
#1 0x006f265f in fprintf () from /lib/libc.so.6
#2 0x08049fd1 in write_tofile (logfile=0x9843090 "~/www/log") at example.c:446
It happens when I call
file = fopen(log_file, "a"); // log_file = "~/www/log"
fprintf(file, buffer);
Can fopen handle files from different directories? Would anyone have a clue as to why it segfaults here?
Using '~' as an abbreviation for your home directory is a shell thing, and isn't necessarily available in C. This is likely to cause the fopen to fail, and you're not checking the return code.
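If the log really does need to live under the home directory, one option (a sketch; open_home_log is just an illustrative helper name) is to build the path yourself from the HOME environment variable, which is what the shell expands '~' to:
#include <stdio.h>
#include <stdlib.h>

FILE *open_home_log(void)
{
    const char *home = getenv("HOME");   /* what the shell would expand '~' to */
    if (home == NULL)
        return NULL;
    char path[4096];                      /* assumed to be long enough here */
    snprintf(path, sizeof path, "%s/www/log", home);
    return fopen(path, "a");              /* may still be NULL; the caller must check */
}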
You must never fail to check for errors in operations that aren't 100% under your control. If you don't know for certain that the file exists and that the open will succeed (and that's something you really cannot know for sure, ever), you must test:
FILE * f = fopen(log_file, "a");
if (!f) { /*error, die? */ }
fprintf(f, buffer);
Also make sure that buffer is a valid pointer to the first character of a null-terminated array of characters, and that the string doesn't contain any format specifiers.
For just printing a raw string str, it is safer to use fputs(str, f), or fprintf(f, "%s", str) if you must.
Check the contents of buffer. You have either an unescaped % symbol or no null-terminating character (or both).
Presumably fopen failed & returned a NULL pointer.
You should check the return value of fopen before using it, and use errno to determine what the error was.
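For example, a minimal check that reports why the open failed, using errno and strerror:
#include <errno.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    FILE *file = fopen("~/www/log", "a");   /* the path from the question; '~' is not expanded */
    if (file == NULL) {
        fprintf(stderr, "fopen failed: %s\n", strerror(errno));
        return 1;
    }
    fputs("log entry\n", file);
    fclose(file);
    return 0;
}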

C segmentation faults due to fopen(). How to trace and what to look for?

(This was asked on the ffmpeg-devel list, but was considered way off-topic there, so I'm posting it here.)
ffmpeg.c pulls in multiple .c files that use log.c's av_log -> av_log_default_callback function, which uses fputs:
void av_log_default_callback(void* ptr, int level, const char* fmt, va_list vl)
{
    ...
    snprintf(line, sizeof(line), "[%s # %p] ", (*parent)->item_name(parent), parent);
    ... call to colored_fputs
Screen output:
static void colored_fputs(int level, const char *str){
    ...
    fputs(str, stderr);
    // this causes sigsegv just by fopen()
    FILE * xFile;
    xFile = fopen('yarr', 'w');
    //fputs(str, xFile); fclose(xFile); // compile me. BOOM!
    av_free(xFile); // last idea that came to mind: using the local free() version to avoid re-creation
Every time fopen is put into the code, it gives a segmentation fault for no obvious reason. Why might this kind of thing happen here? Maybe due to blocking main I/O?
What are general 'blockers' that should be investigated in such a situation? Pthreads (involved in code somewhere)?
fopen takes strings as arguments; you're giving it character literals:
xFile = fopen('yarr', 'w');
Should be
xFile = fopen("yarr", "w");
if(xFile == NULL) {
perror("fopen failed");
return;
}
The compiler should have warned about this, so make sure you've turned warning flags on (and remember to read the warnings and fix them).
