Segmentation Fault 11 in C File I/O - c

I'm writing a function which searches a text file formatted like this:
#User1\pass\
#User2\pass\
#User3\pass\
I have written the function Check_User:
int Check_User(char input[20], FILE *userlist)
{
    int c, i;
    fseek(userlist, 0, SEEK_SET);
    while(1)
    {
        while((c = fgetc(userlist)) != '#')
        {
            if(c == EOF)
                return 1;
        }
        while(input[i] == (c = fgetc(userlist)))
        {
            i++;
        }
        i = 0;
        if(c == '\\')
            return 0;
    }
}
Which is called from here:
while(Check_User(username, userlist) == 0)
{
    printf("Username already in use. Please select another:");
    Get_Input(username);
}
Check_User scans the file pointed to by *userlist to see whether username[20] contains a username that is already in use. If it is already in use, the caller asks Get_Input for a new username.
The program makes it all the way to the while loop and then gives me a segmentation fault: 11. I have read that this often stems from trying to write beyond the end of an array, or generally doing things to memory you don't have access to. Userlist.txt has been opened in r+ mode.
Thanks for the help.

You did not initialize your variable i before its first use. In C, declared variables are not initialized to numerical 0 (as they would be in C# or Java); they simply hold whatever value was present at their memory location beforehand. Thus, the value of i may be far bigger than the length of the string input, and input[i] may access an invalid memory location.
As a side note: to quickly debug this yourself under Linux, compile the program with debug symbols (gcc -g) and then use valgrind --tool=memcheck <your program> to let valgrind find the source of the error for you.
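For reference, a minimal corrected sketch of Check_User, keeping the original logic and only making sure i holds 0 before each comparison pass:
int Check_User(char input[20], FILE *userlist)
{
    int c, i = 0;                              /* initialize i before its first use */
    fseek(userlist, 0, SEEK_SET);
    while(1)
    {
        while((c = fgetc(userlist)) != '#')    /* skip ahead to the next '#' marker */
        {
            if(c == EOF)
                return 1;                      /* end of file reached: name not found */
        }
        i = 0;                                 /* restart the comparison at input[0] */
        while(input[i] == (c = fgetc(userlist)))
        {
            i++;
        }
        if(c == '\\')
            return 0;                          /* matched the full name up to the '\' separator */
    }
}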

Related

Realloc FAILS with >= 27 chars

I've stumbled upon the following problem: I wrote a program in C which functions as a variation of Mergesort, just with forking. For it, I wrote a function which reads exactly one line from an input stream specified by a parameter, adds a '\0' at the end, and returns a boolean indicating whether a newline character exists or not. The source code is below:
static bool getInputLine(FILE *f, char **input, int *size_in) {
    int size = 0;
    char todo;
    do {
        if((*input = realloc(*input, (size + 1) * sizeof(char))) == NULL)
            exitWithStatus(EXIT_FAILURE, "reallocing FAILED");
        todo = fgetc(f);
        if(todo == EOF) break;
        (*input)[size] = todo;
        if(todo == '\n' && size == 0){
            fprintf(stdout, "\n");
            break;
        }
        size++;
    } while(todo != '\n');
    if((*input = realloc(*input, (size+1) * sizeof(char))) == NULL)
        exitWithStatus(EXIT_FAILURE, "reallocing FAILED");
    (*input)[size] = '\0';
    *size_in = size;
    return size == 0 || todo != '\n';
}
The code works for (almost) any input, or so I thought. For some reason, if a line is longer than 26 chars, I get:
realloc(): invalid next size
Aborted (core dumped)
I'm relatively new to C, and this had me confused as ever. Any help is greatly appreciated, thanks in advance!
My OS is Linux Fedora, if that helps in any way. Should you need anything else, please let me know :)
And just fyi, the input string is malloc'd with sizeof(char) before calling the function! It is freed again when the process exits.
EDIT: Despite help from various sides, I have not been able to solve this issue. I had to generally rethink my strategy to solve the problem as a whole. I suppose it had something to do with global variables being inherited by the forked child processes, and hence borking the heap structs in some way or another. Unfortunately, I lost track of what happened exactly, despite restructuring my code to make it more debug-friendly.
Thank you very much for your help regardless! :)
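Since the thread ended unresolved, here is a defensive sketch of the same line-reading approach, purely illustrative: it stores fgetc()'s result in an int so the EOF comparison is reliable, and keeps the per-character realloc of the original. The exitWithStatus helper is assumed from the question; the headers are my additions.
#include <stdbool.h>
#include <stdio.h>
#include <stdlib.h>

static bool getInputLine(FILE *f, char **input, int *size_in) {
    int size = 0;
    int todo;                                          /* int, so EOF (-1) compares correctly */
    do {
        if((*input = realloc(*input, size + 2)) == NULL)   /* room for the char plus '\0' */
            exitWithStatus(EXIT_FAILURE, "reallocing FAILED");
        todo = fgetc(f);
        if(todo == EOF) break;
        (*input)[size] = (char) todo;
        if(todo == '\n' && size == 0){
            fprintf(stdout, "\n");
            break;
        }
        size++;
    } while(todo != '\n');
    (*input)[size] = '\0';                             /* terminate the collected line */
    *size_in = size;
    return size == 0 || todo != '\n';
}
The char-versus-int detail alone does not explain the "invalid next size" abort, though; that message usually means the heap metadata was already corrupted somewhere else, so the fork/global-state suspicion in the edit above is plausible.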

Usage of ferror with a regular file in c

In this related question How to use feof and ferror for fgets (minishell in C), the answers aren't really clear for my problem.
What I'm trying to do is this: I want to read all of the characters from a plain text file on my disk. It's not a link, device file, socket, etc., just a regular text file. Once I've read all the characters, I want to see if everything succeeded. What I'm seeing now is that in my Debug builds everything goes through successfully, but in my Release builds ferror() always indicates there is an error, while the Debug build returns 0 in the end. In both cases I can see in a debugger that the content of the file has been obtained.
int my_function(FILE *f) {
    int c;
    while ((c = fgetc(f)) != EOF) {
        char character = (char) c;
        // store character in a dynamically growing buffer
    }
    // append '\0' byte to the buffer (to terminate the string).
    if (ferror(f)) {
        // return error
        return 1;
    }
    return 0; // no error.
}
The rationale for taking an already opened file as the function argument is to make it relatively easy to read from a file without having to bother with platform-dependent encoding. This is in a private part of my library; the public functions handle the case where f == NULL.
Do I need to call clearerr(f) first, because the error flag is not initialized in Release builds?
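For reference, a minimal sketch of the pattern in question with clearerr() called up front (read_all is a placeholder name, not from the original code): clearerr() resets the stream's error and end-of-file indicators, so a later ferror() only reflects errors raised by this read pass.
#include <stdio.h>

int read_all(FILE *f) {
    int c;
    clearerr(f);                       /* reset error and EOF indicators on the stream */
    while ((c = fgetc(f)) != EOF) {
        /* store (char) c in a dynamically growing buffer */
    }
    return ferror(f) ? 1 : 0;          /* non-zero only if this pass hit a read error */
}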

C - Different array behavior linux/windows?

I have a little problem with C programming. I have to write a program for university, and I wrote the whole program on my Windows PC.
I tried it afterwards on a Linux system and got different output.
Here is the code:
char *compress(char *input,int lineSize)
{
    int n = 0;
    char *compr = (char*)malloc(lineSize * sizeof(char));
    //compr++;
    char c = *input;
    while(*input != '\0')
    {
        printf("compr: %s\n",compr);
        if(*input != c)
        {
            snprintf(compr,lineSize,"%s%c%d",compr,c,n);
            n = 0;
            c = *input;
            n++;
        }
        else
        {
            n++;
        }
        input++;
    }
    snprintf(compr,lineSize,"%s%c%d%c",compr,c,n,'\0');
    printf("compr: %s\n",compr);
    return compr;
}
This works as it should on my Windows PC, but when I run it on Linux I get a file-writing error and the "compr" array is empty :/
I hope someone can help me; I couldn't find any solution.
Thanks
Compile with warnings:
warning: function returns address of local variable [enabled by default]
Another possible problem:
snprintf(compr,lineSize,"%s%c%d%c",compr,c,n,'\0');
From the sprintf man page:
Some programs imprudently rely on code such as the following
sprintf(buf, "%s some further text", buf);
to append text to buf. However, the standards explicitly note that the
results are undefined if source and destination buffers overlap when
calling sprintf(), snprintf(), vsprintf(), and vsnprintf(). Depending
on the version of gcc(1) used, and the compiler options employed,
calls such as the above will not produce the expected results.
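One way around that restriction, sketched here only to illustrate the man-page warning (compress_runs and the offset bookkeeping are my additions, not the poster's code), is to remember how much has already been written and print each new "<char><count>" piece after the existing text instead of passing compr as both source and destination:
#include <stdio.h>
#include <stdlib.h>

char *compress_runs(const char *input, int lineSize)
{
    if (lineSize < 1)
        return NULL;
    char *compr = malloc(lineSize);
    if (compr == NULL)
        return NULL;
    compr[0] = '\0';
    int offset = 0;
    while (*input != '\0') {
        char c = *input;
        int n = 0;
        while (*input == c) {                      /* count the run of identical characters */
            n++;
            input++;
        }
        int written = snprintf(compr + offset, lineSize - offset, "%c%d", c, n);
        if (written < 0 || written >= lineSize - offset)
            break;                                 /* buffer full: stop instead of overflowing */
        offset += written;                         /* append after what is already there */
    }
    return compr;
}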

Segmentation fault in fscanf() function

I am writing some code for a C wordsearch program and, when running one of the files, I get a segmentation fault on the fscanf function, but I don't know where the error is.
I already searched for the answer and understood that I must initialize my integer variables (I have already done that) and that I must pass the variables to fscanf() with '&' (done that too).
Here is the code (in the main() function):
int i;
int nl = 0;
int nc = 0;
int a, b, d;
char mat[1000][1000]; //character matrix for wordsearch letters
int x[1000], y[1000];
int l, c, n;
printf("Chose the wordsearch you want to use.\n For example: t01.in, t02.in, t03.in, t04.in, (...),t30.in\n");
char wordsearch_file[8];
fgets(wordsearch_file,8,stdin); //user enters f.e. 't01.in'
FILE *stream;
char buffer[1024];
sprintf(buffer, "home/adminuser/Desktop/LA/ETAPA2/sopas/%s", wordsearch_file);
stream = fopen(buffer,"r");
if((fscanf (stream,"%d%d", &nl, &nc)) > 0){ // SEG. FAULT happens here
    for(l = 0; l < nl; l++) {
        for(c = 0; c < nc; c++)
            mat[l][c] = fgetc(stream) != EOF;
        fgetc(stream);
    }
}
I wanted 'nl' (number of lines) to read the first 3 and 'nc' (number of columns) to read the other 3.
The 't01.in' file:
3 3
SIA
ORR
EDI
Anytime you open an external resource (file, database, socket), or make any system call whatsoever, always check for a valid stream or return code.
The first thing you should do is add a check on stream instead of blindly calling fscanf(stream, ...) without knowing whether the fopen() call succeeded.
Then work out why fopen() failed. I suggest printing out the filename, checking that it exists, and/or using perror() to print the system error. perror() will tell you exactly what is wrong, and if I had to guess, it would be, as #BLUEPIXY mentioned, a newline in the filename.
stream = fopen(buffer,"r");
if(!stream) {
    perror("what the problem: ");
}
Lastly, learn how to use the debugger to analyze the core file. If you aren't getting a core dump, set your ulimit correctly. From memory, you want "ulimit -c unlimited". Find out your current ulimits by typing simply "ulimit" at the shell prompt. Then re-run your crashing program. Once you get a core file, run GNU debugger on it.
gdb program.exe core
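Putting those suggestions together, a sketch of the opening sequence with the likely fixes applied; the strcspn() trick for trimming the newline that fgets() leaves in the buffer, the leading '/' on the path, the switch to snprintf, and the exact-count check on fscanf are my guesses, not confirmed parts of the original program:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    char wordsearch_file[8];
    printf("Chose the wordsearch you want to use.\n For example: t01.in, t02.in, (...), t30.in\n");
    if (fgets(wordsearch_file, sizeof wordsearch_file, stdin) == NULL)
        return EXIT_FAILURE;
    wordsearch_file[strcspn(wordsearch_file, "\n")] = '\0';   /* strip the trailing newline */

    char buffer[1024];
    snprintf(buffer, sizeof buffer,
             "/home/adminuser/Desktop/LA/ETAPA2/sopas/%s", wordsearch_file);

    FILE *stream = fopen(buffer, "r");
    if (!stream) {                     /* never call fscanf() on a NULL stream */
        perror(buffer);
        return EXIT_FAILURE;
    }

    int nl = 0, nc = 0;
    if (fscanf(stream, "%d%d", &nl, &nc) == 2) {
        /* ... read the nl x nc letter grid here ... */
    }
    fclose(stream);
    return EXIT_SUCCESS;
}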

C - Reading File Error.. !f_in is null and dont know why?

// Trying to read file
void readFilee(char *namefile){
    FILE *f_in = fopen(namefile,"r");
    char x;
    int i = 0;
    if(!f_in){ printf("Error"); exit(0); }
    /* read to EOF */
    while(1){
        x = getc(f_in);
        if(x == '\n') continue;
        archivo[i] = x;
        if(x == EOF) break;
        i++;
    }
    tamArchivo = i;
    fclose(f_in);
}
The fact is that the error is f_in being null, but I do not understand why. I'm trying to connect a server with multiple clients; the clients are initialized and waiting for a connection.
At the "if" verification check I get "Error".
Add
archivo[i] = '\0';
after the while loop terminates. This will add a null character at the end of your string archivo.
Also, the memory space of your archivo array may not be sufficient to hold the whole content of the file, which could cause a buffer overflow and thus a crash.
Verify f_in is not NULL before using it. I bet fopen fails and this causes f_in to be NULL. Later when you call getc, it will crash.
If that is not the case, try debugging your application with cgdb for instance and see which line causes the crash.
Try replacing while (1) with while (!feof(f_in)). This will exit the loop when the file is at the end. You shouldn't check for end of file like you are in your loop. This would require the end of file character to be somewhere in your file. The value of EOF is actually specifically designed so it doesn't appear in your files.
Other things to look at are uncommenting your check that the file was opened successfully. If this check is failing then you shouldn't attempt to use any of the fxxx functions on the file pointer.
You are not checking the array boundaries of "archivo"; that would be the first thing I'd check.
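Pulling the answers together, a sketch of the read function with the usual fixes applied: an int to hold getc()'s return value, a bounds check against the buffer, and perror() to explain why fopen() failed. ARCHIVO_MAX is a placeholder for the real size of the global archivo array, which the question never shows:
#include <stdio.h>
#include <stdlib.h>

#define ARCHIVO_MAX 4096               /* placeholder: the real capacity of archivo */

char archivo[ARCHIVO_MAX];
int tamArchivo;

void readFilee(const char *namefile){
    FILE *f_in = fopen(namefile, "r");
    if(!f_in){
        perror(namefile);              /* explain why fopen failed, not just "Error" */
        exit(EXIT_FAILURE);
    }
    int x;                             /* int, so the EOF comparison is reliable */
    int i = 0;
    while((x = getc(f_in)) != EOF && i < ARCHIVO_MAX - 1){
        if(x == '\n') continue;        /* skip newlines, as in the original */
        archivo[i++] = (char) x;
    }
    archivo[i] = '\0';                 /* terminate the string, as suggested above */
    tamArchivo = i;
    fclose(f_in);
}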
