I have a function that reads integers with certain format from a file.
It works fine as desired, but whenever I try to close the file with fclose(),
fclose() always returns EOF.
Any suggestions why? I am a student and still learning.
I have put the code below. Please let me know if you need the "processing" code. Thanks :)
// Open the file
FILE *myFile = fopen(fileName, "r");
if (myFile == NULL) {
    // Handle error
    fprintf(stderr, "Error opening file for read \n");
    exit(1);
}

while(myFile != EOF)
{
    // read and process the file
    // this part works.
}

// always returns EOF here. WHY?
if (fclose(myFile) == EOF) {
    // Handle the error!
    fprintf(stderr, "Error closing input file.\n");
    exit(1);
}
printf("Done reading the file.");
EDIT:
Thank you for all the replies. Sorry I cannot post the code as this is part of my homework. I was trying to get some help, I am not asking you guys to make the code for me. Posting code is illegal according to my Prof (since other students can see and probably copy, that's what he told me). I can only post the code after Sunday. For now, I will try to modify my code and avoid using fscanf. Thanks and my apology.
This:
while(myFile != EOF)
is actually illegal (a constraint violation). Any conforming C compiler is required to issue a diagnostic; gcc, by default, merely issues a warning (which does qualify as a "diagnostic").
If gcc gave you a warning, you should pay attention to it; gcc often issues warnings for things that IMHO should be treated as fatal errors. And if it didn't give you a warning, you're probably invoking it with options that disable warnings (which is odd, because it does produce that warning by default). A good set of options to use is -Wall -Wextra -std=c99 -pedantic (or adjust the -std=... option to enforce a different version of the standard if you like).
myFile is of pointer type, specifically FILE*. EOF is of type int, and typically expands to (-1). You cannot legally compare a pointer value to an integer value (except for the special case of a null pointer constant, but that doesn't apply here.)
Assuming the program isn't rejected, I'd expect that to result in an infinite loop, since myFile would almost certainly never be equal to EOF.
You could change it to
while (!feof(myFile))
but that can cause other problems. The correct way to detect end-of-file while reading from a file is to use the value returned by whatever function you're using to read the data. Consult the documentation for the function you're using to see what it returns when it encounters end-of-file or an error condition. The feof() function is useful for determining, after you've finished reading, whether you encountered end-of-file or an error condition.
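For instance, if the reading is done with fscanf (which the question mentions), a loop driven by its return value could look like the sketch below; myFile is assumed to be the already-opened, NULL-checked stream from the question, and the %d format is an assumption about the file's layout:

int value;

/* fscanf returns the number of items converted; 1 means we got an integer,
 * anything else means end-of-file, a read error, or a malformed token. */
while (fscanf(myFile, "%d", &value) == 1) {
    /* process value */
}

if (ferror(myFile))
    fprintf(stderr, "Read error.\n");
else if (!feof(myFile))
    fprintf(stderr, "Stopped at input that is not an integer.\n");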
Since there is nothing you can do to a regular file opened in read-only mode that would cause fclose to error out, you very probably have a bug in the code you didn't show which is stomping on the myFile structure.
Also, the test myFile != EOF will never be false, because you set myFile to the return value of fopen, which will never give you EOF, and you already checked it for NULL. Did you mean something like:
int c;  /* c must be an int, not a char, so it can hold EOF */
while ((c = fgetc(myFile)) != EOF) {
    // do stuff
}
What does errno say? Add this to your code:
#include <errno.h>
#include <string.h>
if (fclose(myFile) == EOF) {
    // Handle the error!
    fprintf(stderr, "Error closing input file: errno = %d, error = %s\n", errno, strerror(errno));
    exit(1);
}
Hope this helps.
Regards.
while(myFile != EOF)
{
// read and process the file
// this part works.
}
If this part is ok then
fclose(myFile);
will definitely return EOF, because the while loop only terminates when myFile == EOF (this is a wrong comparison between a pointer and an int; do not ignore the warning). EOF is a macro defined in stdio.h, and according to glibc it is -1. If your loop terminates, that means your myFile pointer was changed to EOF somewhere. This is your mistake.
Go through your code: you must be changing the FILE pointer myFile somewhere, and it should not be overwritten by you, since it points to the file structure used for all the file operations.
NOTE
myFile, which is a pointer to a FILE, should not be used as an lvalue in any assignment statement.
Change while(myFile != EOF){...} to while(!feof(myFile)){...}.
myFile is a pointer to a FILE struct (a memory address), not an "end of file" indicator.
Related
Trying to read from a file to use in a small game I've created. I'm using the fgets function, and it's giving a segmentation fault, not sure why.
The file it's reading just contains "20 10" in a txt file, as this is the map size.
My readfile function is shown below
if (argc == 2) {
    f = fopen("map.txt", "r");
    if (NULL == f) {
        printf("File cannot be opened.");
    }
    while (fgets(fileRead, 50, f) != NULL) {
        printf("%s", fileRead);
    }
    fclose(f);
}
The if (argc == 2) can be ignored, this is just to make this section run, as I'm modifying a file so just running this function by satisfying that if statement.
I am fairly new to C, so apologies if I'm missing something minor. Worth noting I'm programming in C89 and using the -Wall -ansi -pedantic compile options, as this is University work and the tutors want us to do C89.
EDIT:
char userInput, fileRead[50];
FILE* f;
Declaration of variables.
Assuming that your problem is indeed in your posted code and not somewhere else in the program, then I believe that your problem is caused by the following issue:
After calling fopen, you check the return value of the function immediately afterwards, to verify that it succeeded. However, if it doesn't succeed and it returns NULL, all you do is print an error message to stdout but continue execution as if it succeeded. This will cause fgets to be called with NULL as the stream argument, which will invoke undefined behavior and probably cause your segmentation fault.
In the comments section, you raised the following objection to this explanation:
However it doesn't print the error message anyway and still segmentation faults, so I think the problem isn't here?
This objection of yours is flawed, for the following reason:
When a segmentation fault occurs, execution of the program is immediately halted. The content of the output buffer is not flushed. This means that output can get lost when a segmentation fault happens. This is probably what is happening in your case.
If you want to ensure that the output actually gets printed even in the case of a segmentation fault, you should flush the output buffer by calling fflush( stdout ); immediately after the print statement. Alternatively, you can print to stderr instead of stdout. In contrast to stdout, the stream stderr is unbuffered by default, so that it does not have this problem.
You can test whether my suspicion is correct by changing the line
printf("File cannot be opened.");
to
printf("File cannot be opened.");
fflush( stdout );
or to:
fprintf( stderr, "File cannot be opened." );
If the error message now gets printed, then this probably means that my suspicion was correct.
In any case, I recommend that you change the lines
if (NULL == f) {
printf("File cannot be opened.");
}
to the following:
if (NULL == f) {
fprintf( stderr, "File cannot be opened." );
exit( EXIT_FAILURE );
}
That way, the program will exit immediately if an error occurs, instead of continuing execution.
Please note that the code posted above requires you to #include <stdlib.h>.
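Putting the above together, here is a minimal self-contained sketch of the corrected reading code (map.txt, the 50-byte buffer and the C89 style are taken from the question; this is an illustration, not the only way to structure it):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char fileRead[50];
    FILE *f;

    f = fopen("map.txt", "r");
    if (NULL == f) {
        fprintf(stderr, "File cannot be opened.\n");
        return EXIT_FAILURE;   /* stop here instead of passing NULL to fgets/fclose */
    }
    while (fgets(fileRead, 50, f) != NULL) {
        printf("%s", fileRead);
    }
    fclose(f);
    return 0;
}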
I wrote the following code in GDB online debugger :
#include <stdio.h>

int main()
{
    printf("jkjkkjkj");
    int p , n;
    FILE *fp;
    printf("jkjkkjkj2");
    fp = fopen("abc.txt","r");
    while ( (n = getc(fp))!= EOF)
    {
        printf( "the chareacter here is %d \n", n);
    }
    n = fclose(fp);
    return 0;
}
While executing the code I am getting a segmentation fault at the line where I am trying to fetch the characters from the file. I know the segmentation fault is happening because the file does not exist.
However, what intrigues me is the absence of the messages that I am trying to print on the screen. I tried checking in the debugger and at one point I found:
"optimized out" written near the line number
However, when I tried putting getchar() here and there, the messages got printed on the screen even though the segmentation fault persists.
How can this be explained? Why is this happening? Why are the messages printed when I put getchar() at different places?
I had also tried writing this code on a Solaris server and compiling it using GCC. The code compiled, but I did not get any output message even when a file with the provided name existed in the directory.
As answered by Yunnosch, you probably forgot to check against failure of fopen(3). A better habit is to always check that, at least by coding:
fp = fopen("abc.txt","r");
if (fp == NULL) { perror("fopen abc.txt"); exit(EXIT_FAILURE); }
and take the habit of doing at least that everywhere (exit and EXIT_FAILURE come from <stdlib.h>). Using perror(3) (or strerror(3) with errno(3)) is a useful habit to get, since you want some reason related to the failure (given by errno, perhaps through perror).
More generally, always read the documentation of functions that you are using (for standard functions, at least on some reference website, and possibly in the C11 standard n1570), and take care of handling their failure (at the very least, by checking against failure and exiting with a useful message to stderr); for Unix functions, see their man pages (on Linux, start on intro(2) and intro(3); for Solaris, start with intro(2) & intro(3)..). In your Unix terminal, try also man fopen ... For POSIX standard, start here.
what intrigues me is the absence of the messages that I am trying to print on the screen.
That is simple. stdout is buffered (see also setvbuf(3)), and often line-buffered. So a printf which does not end with a \n has its output still inside the buffer, and not yet on the screen. The habit to get is to almost always end your printf(3) control format string with a newline, or else to flush the buffer explicitly using fflush(3).
For a newbie, there is rarely a good reason to end your printf format string without an explicit \n. So use instead
printf("jkjkkjkj\n");
Otherwise, call fflush(NULL); quite often in your program. BTW, for these buffering reasons, fflush(NULL) should be done before calls to system(3), fork(2), execve(2) and other important program-wide functions.
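To see the buffering effect in isolation, here is a tiny sketch of my own (not from the original program); it assumes a POSIX system, since _exit terminates the process without flushing stdio buffers:

#include <stdio.h>
#include <unistd.h>   /* _exit(), POSIX */

int main(void)
{
    printf("you will probably NOT see this");   /* no '\n': stays in stdout's buffer */
    /* fflush(stdout); */                       /* uncomment this and the message appears */
    _exit(1);                                   /* ends the process; unflushed output is lost */
}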
"optimized out" written near the line number
That probably happens in the C standard library itself (e.g. in getc from some libc.so), which is usually not compiled with debug information. In practice, trust your C standard library: you are much more likely to have bugs in your code than in libc.
Your own source code should be compiled with gcc -Wall -Wextra -g (asking the GCC compiler to give all warnings and debug info in DWARF format, usable by the gdb debugger) and you need to improve your code to get no warnings at all before using the gdb debugger.
Be aware of undefined behavior, spend several hours reading about UB, and be scared of UB.
Try guarding against NULL in fp and for good measure make sure the debug output gets printed (as in comment by Some Programmer Dude).
#include <stdio.h>
int main(void)
{
int p , n;
FILE *fp;
printf("jkjkkjkj2\n");
fp = fopen("abc.txt","r");
if (NULL != fp)
{
while ( (n = getc(fp))!= EOF)
{
printf( "the chareacter here is %d \n", n);
}
n = fclose(fp);
} else
{
printf("File opening failed somehow!\n");
}
return 0;
}
Note the nice touch (by Basile Starynkevitch) to only close what was successfully opened.
I am working through some practice questions on file I/O in C. Below is one of the programs.
#include <stdio.h>
#include <stdlib.h>

int main()
{
    char fname[] = "poem.txt";
    FILE *fp;
    char ch;

    fp = fopen ( fname, "tr");
    if (fp == NULL)
    {
        printf("Unable to open file...\n");
        exit(1);
    }
    while ((ch = fgetc(fp)) != EOF)
    {
        printf("%c", ch);
    }
    printf("\n");
    return 0;
}
As you can see in the statement
fp = fopen ( fname, "tr");
The mode "tr" is not a valid mode (as I understand). I was expecting gcc to give an error (or a warning) while compiling the above program. However, gcc does not give any error (or warning) while compiling it.
However, as expected, when I run the program it exits printing "Unable to open file...", which means fopen() returned NULL because there was an error while opening the file.
-bash-4.1$ ./a.out
Unable to open file...
-bash-4.1$
(The file poem.txt exists, so this is because of the invalid mode given to fopen(). I checked by changing the mode to "r" and it works fine, displaying the contents of "poem.txt".)
-bash-4.1$ ./a.out
THis is a poem.
-bash-4.1$
I was expecting gcc to give an error (or warning) message for the invalid mode.
Why does gcc not give any error (or warning) for this?
The compiler doesn't check what you do; it only checks the syntax.
However, at run time, if the code is written like so:
#include <stdio.h>
#include <stdlib.h>

int main()
{
    char fname[] = "poem.txt";
    FILE *fp;
    int ch;   /* int rather than char, so the comparison with EOF is reliable */

    fp = fopen(fname, "tr");
    if (fp == NULL)
    {
        perror( "fopen for poem.txt failed");
        exit( EXIT_FAILURE );
    }
    while ((ch = fgetc(fp)) != EOF)
    {
        printf("%c", ch);
    }
    printf("\n");
    return 0;
}
then a proper error message is output:
...$ ./untitled
fopen for poem.txt failed: Invalid argument
This is Undefined Behavior:
Per Annex J.2 "Undefined Behavior", it is UDB if:
—The string pointed to by the mode argument in a call to the fopen function does not exactly match one of the specified character sequences (7.19.5.3).
Although Annex J is informative, looking at §7.19.5.3:
/3 The argument mode points to a string. If the string is one of the following, the file is open in the indicated mode. Otherwise, the behavior is undefined.
Basically, the compiler can blow you off here - a standard library function name (and behavior) can be used outside of the inclusion of a standard header (for example, non-standard extensions, completely user-defined behavior, etc.). The Standard specifies what a conforming library implementation shall include and how it shall behave, but does not require you to use that standard library (or define behavior for a specific implementation explicitly specified as UDB territory: at this point, if your parameter types match it's a legal function call).
A really good lint utility might help you here.
How is the compiler supposed to know what the valid arguments for a function are?
To do it you'd be building too much knowledge in the compiler - it would have to recognize functions and their parameters by name. What if you want to override the function? What if different modes are valid on different platforms?
In Windows programming, "tr" is not a valid mode, although "rt" is. The t means text and the r means read. (If you are using gcc and linking to MS's C runtime then you will be able to use this.)
However you still don't see t very often because it is the default and therefore redundant; the other option for this setting is b meaning binary. But MS do seem to explicitly use t in their examples to make it clear that translation is intended.
The behaviour of text mode and binary mode for a stream is implementation-defined, although the intent is that binary mode reads the characters exactly as they appear on disk, and text mode may perform translations relevant to text processing; most famously, converting \r\n in MS text files to \n in your program.
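To make the difference concrete, here is a small sketch of my own; lines.txt is a hypothetical file containing "hi" followed by CRLF, and the values in the comments assume a Windows C runtime (on POSIX systems both loops print the same bytes):

#include <stdio.h>

int main(void)
{
    FILE *ft = fopen("lines.txt", "r");    /* text mode: "\r\n" is translated to '\n' */
    FILE *fb = fopen("lines.txt", "rb");   /* binary mode: '\r' is returned as-is */
    int c;

    if (ft == NULL || fb == NULL)
        return 1;

    while ((c = fgetc(ft)) != EOF)
        printf("text:   %d\n", c);   /* 104 105 10 */
    while ((c = fgetc(fb)) != EOF)
        printf("binary: %d\n", c);   /* 104 105 13 10 */

    fclose(ft);
    fclose(fb);
    return 0;
}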
I've been trying simple file handling in C, and I wanted to make sure that the file can be accessed, so I tried this:
#include <stdio.h>

main()
{
    CheckFile();
}

int CheckFile()
{
    int checkfile = 0;
    FILE *fp1;
    fp1 = fopen("users.sav","r");
    if(fp1==NULL)
    {
        fopen("users.sav","w");
        fclose(fp1);
    }
    if(checkfile!=0)printf("\nERROR ACCESSING FILE!\nNow exiting program with exit code: %d\n",checkfile);exit(1);
    return 0;
}
then it displays
Segmentation fault (core dumped)
but it doesn't segfault if the file already exists beforehand (e.g. when I created it manually or when I run the program the second time).
Please help. I need this for our final project due in a week and I haven't gotten the hang of files and pointers yet.
I'm using "gcc (Ubuntu/Linaro 4.8.1-10ubuntu9) 4.8.1"
P.S. I saw this in another question:
There's no guarantee in your original code that the fopen is actually working, in which case it will return NULL and the fclose will not be defined behaviour.
So how exactly do I check if it worked?
That's normal when you call fclose(fp1) while fp1 is NULL.
BTW
fopen("users.sav","w");
is useless because you don't assign the return value to a file pointer. That means the users.sav file will be opened for writing, but you will never be able to write anything to it.
fopen returns a FILE pointer. On failure it will return NULL and set the global errno to indicate the error. If you want to check errno, you have to check it after you check whether fopen returned NULL.
if (fp1 == NULL)
{
printf("fopen failed, errno = %d\n", errno);
}
Otherwise, you may get an errno from something else, not necessarily your fopen call. Also include errno.h. You also don't need to call fopen("users.sav","w"); again. You aren't reassigning the pointer nor checking it again.
I don't see a reason to call fclose here since if fopen returns NULL, there isn't anything to close. That is probably the reason for your seg fault. You are trying to close a null pointer. More information on fopen failures.
Another comment on your code. If you are going to return an int from CheckFile, it should probably not be 0 on fail. I would return -1 to indicate an error. Better yet, you could return the global errno. Also, main should be int main() and you should return 0; at the end. I don't particularly care for your naming scheme of CheckFile. In C, check_file or camelCase of checkFile would be better.
In CheckFile, your one line if statement could be formatted and work more properly if you formatted it on multiple lines. It doesn't do what you think it does currently:
if(checkfile!=0)
{
printf("\nERROR ACCESSING FILE!\nNow exiting program with exit code: %d\n", checkfile);
exit(1);
}
Also, checkfile is never set anywhere in your code other than to zero. So the code in the if statement will not execute, period.
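Putting those points together, here is a sketch of what CheckFile might look like (the file name is kept from the question; creating the file when it is missing is an assumption about the original intent, and this is only one way to fix it):

#include <stdio.h>

/* Returns 0 on success, -1 if the file can neither be opened nor created. */
int CheckFile(void)
{
    FILE *fp1 = fopen("users.sav", "r");

    if (fp1 == NULL)
    {
        /* The file is missing (or unreadable): try to create it,
         * and this time keep the pointer that fopen returns. */
        fp1 = fopen("users.sav", "w");
        if (fp1 == NULL)
        {
            fprintf(stderr, "\nERROR ACCESSING FILE!\n");
            return -1;
        }
    }
    fclose(fp1);   /* only close what was actually opened */
    return 0;
}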
I'm not really sure what you're trying to do, but the immediate problem is here:
if(fp1==NULL)
fclose(fp1);
After asserting that fp1 is NULL, you're trying to call fclose on the null pointer, which will cause a segmentation fault.
If all you want to do is verify that the file exists, try something like What's the best way to check if a file exists in C? (cross platform)
The man page of fclose says -
The behaviour of fclose() is undefined if the stream parameter is an
illegal pointer, or is a descriptor already passed to a previous
invocation of fclose().
The error is in the if block in your code.
if(fp1==NULL)
{
fopen("users.sav","w");
fclose(fp1); // passing NULL to fclose invokes undefined behaviour
}
Another unrelated problem:
This line is probably not what you want:
if(checkfile!=0)printf("\nERROR ACCESSING FILE!\nNow exiting program with exit code: %d\n",checkfile);exit(1);
If we write it correctly formatted the error becomes obvious:
if (checkfile != 0)
printf("\nERROR ACCESSING FILE!\nNow exiting program with exit code: %d\n",checkfile);
exit(1);
return 0 ;
Actually we will get to exit(1) even if checkfile is zero.
You probably want this:
if (checkfile != 0)
{
printf("\nERROR ACCESSING FILE!\nNow exiting program with exit code: %d\n",checkfile);
exit(1);
}
return 0 ;
Conclusion: format your code correctly and many errors will suddenly look obvious.
While debugging some code I got something like below:
#include<stdio.h>
int main()
{
FILE *fb = fopen("/home/jeegar/","r");
if(NULL == fb)
printf("it is null");
else
printf("working");
}
Here in fopen I gave a somewhat valid path name but not a filename. Shouldn't fopen return NULL then? But it does not return null!
Edit:
If I give the path of a valid directory to fopen, then it prints working.
If I give the path of an invalid directory to fopen, then it prints it is null.
Edit:
The spec says:
Upon successful completion, fopen() shall return a pointer to the object
controlling the stream. Otherwise, a null pointer shall be returned.
So here, whether an error code is set or not, it MUST return NULL.
And setting an error code is an extension to the ISO C standard.
THE ERROR CODE IS ALSO NOT GOING TO BE SET HERE:
#include<stdio.h>
#include <errno.h>
int main()
{
errno = 0;
FILE *fb = fopen("/home/jeegar/","r");
if(fb==NULL)
printf("its null");
else
printf("working");
printf("Error %d \n", errno);
}
OUTPUT IS
workingError 0
I think that in Unix everything (directories included) is considered to be a file, so fopen should work on them.
The posix man page man 3p fopen says, in the section ERRORS:
The fopen() function shall fail if:
[...]
EISDIR The named file is a directory and mode requires write access.
(Emphasis mine). Since you are not requesting write access, and chances are that the path you use is a directory, the function does not fail.
About what can you use with a FILE* that refers to a directory, I have no idea.
As you might very well be aware, pretty much everything on a Linux system is a file, and if it's not a file then it's a process (corrections & remarks welcome :) ). A directory is treated like a file which lists other files (reference from TLDP), so opening a directory for reading as a file is a valid operation and thus you do not get any error. Writing to it is not allowed, though, so if you open the directory in write or append mode, the fopen operation will fail (this has been very well covered in other responses and in the linked fopen documentation). Most file operations, like read and write on this stream, will fail with an error stating that it is a directory. The only use that could be found was finding the size of the file (a directory in this case) using fseek to SEEK_END and ftell (which will most likely give a result of 4096).
Regarding using errno to get meaningful messages, you can use perror (from stdio.h) and pass a message which will be printed before the error text, or strerror (from string.h) and pass it errno (from errno.h).
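As a quick illustration, here is a sketch of my own (assuming a Linux/glibc system where /tmp is a directory): fopen itself succeeds, and it is the first read that fails, typically with EISDIR.

#include <stdio.h>
#include <errno.h>
#include <string.h>

int main(void)
{
    FILE *fp = fopen("/tmp", "r");   /* a directory, opened read-only */
    int c;

    if (fp == NULL)
    {
        perror("fopen /tmp");
        return 1;
    }
    printf("fopen on the directory succeeded\n");

    errno = 0;
    c = fgetc(fp);                   /* the read is what actually fails */
    if (c == EOF && ferror(fp))
        printf("fgetc failed: %s\n", strerror(errno));   /* typically: Is a directory */

    fclose(fp);
    return 0;
}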
Hope this helps!
How to check that errno?
You can check errno for example:
#include <errno.h>
#include <stdio.h>
#include <stdlib.h>
int main(void)
{
    FILE *fp;

    errno = 0;
    fp = fopen("file.txt", "r");
    if (fp == NULL)
    {
        /* Here you can check your error types: errno was set by fopen */
        fprintf(stderr, "Error %d\n", errno);
        perror("fopen file.txt");
        exit(1);
    }
    fclose(fp);
    return 0;
}
You can find the error types in the Error section of http://pubs.opengroup.org/onlinepubs/009695399/functions/fopen.html.