C: fprintf does not work

I have a long C program. At the beginning I open two files and write something to them:
ffitness_data = fopen("fitness_data.txt", "w");
if (ffitness_data == NULL) {
    printf("Impossible to open the fitness data file\n");
    exit(1);
} else {
    fprintf(ffitness_data, "#This file contains all the data that are function of fitness.\n");
    fprintf(ffitness_data, "#Columns: f,<p>(f),<l>(f).\n\n");
}
fmeme_data = fopen("meme_data.txt", "w");
if (fmeme_data == NULL) {
    printf("Impossible to open the meme data file\n");
    exit(1);
} else {
    fprintf(fmeme_data, "#This file contains all the data relative to memes.\n");
    fprintf(fmeme_data, "#Columns: fitness, popularity, lifetime.\n\n");
}
Everything is fine at this step: the files are open and two lines have been written to each.
Then I have a long simulation of a stochastic process, whose code is not interesting for the question's purposes: the files and their pointers are never used. At the end of the process I have:
for (i = 0; i < data; i++) {
    fprintf(fmeme_data, "%f\t%d\t%f\n", meme[i].fitness, meme[i].popularity, meme[i].lifetime);
}
for (i = 0; i < 40; i++) {
    fprintf(ffitness_data, "%f\t%f\t%f\n", (1.0/40)*(i+0.5), popularity_histo[i], lifetime_histo[i]);
}
Then I call fflush() and fclose() on both files.
If I run the code on my laptop, both files are filled. If the code runs on a remote server, the file fitness_data.txt contains only the first print, i.e. the lines starting with #, but doesn't contain the data. I want you to note that:
The other file never gives me problems.
I'm used to this server. Something similar never happened.
Given all this information, the question is:
Why does a command, always used in the same way and in the same code, always work on one server, while on a different server it sometimes works and sometimes doesn't?
Admins: I don't think this question is a duplicate. All similar questions were solved by adjusting the code (here) or adding fflush() (here) and similar things. This is not a problem in the code (in my modest opinion), because it works on my laptop. I bet it works on most machines.

We can't say for certain what's going on here, because we don't have your full program nor do we have access to the server where the problem happens. But, we can give you some debugging advice.
When a C program behaves differently on one computer than another, the very first thing you should suspect is memory corruption. The best available tool for finding memory corruption is valgrind. Fix the first invalid operation it reports and repeat until it reports no more invalid operations. There are excellent odds that the problem will have then gone away.
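To make that concrete, here is a minimal, hypothetical sketch (array name and size invented, not taken from the question's code) of the kind of off-by-one heap write valgrind flags immediately:
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    double *histo = malloc(40 * sizeof *histo); /* room for 40 bins */
    int i;
    for (i = 0; i <= 40; i++)  /* BUG: i == 40 writes one past the end */
        histo[i] = 0.0;
    printf("%f\n", histo[0]);
    free(histo);
    return 0;
}
Run under valgrind ./a.out, this produces an "Invalid write of size 8" with a stack trace pointing at the loop. An overrun like this elsewhere in a program can silently corrupt a nearby FILE structure and cause exactly this kind of machine-dependent misbehavior.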
Turn up the warning levels as high as they can go and fix all of the complaints, even the ones that look silly.
You say you are calling fflush and fclose, but are you checking whether they failed? Check thoroughly, like this:
if (ferror(ffitness_data) || fflush(ffitness_data) || fclose(ffitness_data)) {
    perror("write error on fitness_data.txt");
    exit(1);
}
Does the problem go away if you change the optimization level you are compiling with? If so, you may have a bug that causes "undefined behavior". Unfortunately there are a lot of possible ways to do that and I can't easily explain how to look for them.
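For example (a deliberately artificial sketch, not taken from your program), reading an uninitialized variable is one common source of undefined behavior whose visible effect depends on the optimization level:
#include <stdio.h>

int main(void)
{
    int x; /* BUG: never initialized */
    if (x > 0) /* undefined behavior: x holds an indeterminate value */
        printf("positive\n");
    else
        printf("not positive\n");
    return 0;
}
At -O0 this may print the same answer on every run; at -O2 the compiler may assume anything about x, so the output can change or the branch can vanish entirely. gcc -Wall usually warns about this particular case, which is another reason to turn warnings all the way up.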
Use a tool like C-Reduce to cut your program down to a smaller program that still doesn't work correctly but is short enough to post here in its entirety.
Read and follow the instructions in the article "How to Debug Small Programs".

Related

How can I debug the cause of a 0xc0000417 exit code

I get an exit error code 0xc0000417 (which translates to STATUS_INVALID_CRUNTIME_PARAMETER) in my executable (mixed Fortran/C) and try to find out what's causing it. It seems to occur when trying to write to a file which I infer because the file is created but there's nothing in it. Yet I have the suspicion that's not the /real/ cause. When I disable writing of that file, which is done from C code, it crashes when writing a different file, this time from Fortran code.
The unfortunate thing is: this only happens after the program (a CPU heavy calculation) has finished after having run for ~2-3 days. When I tried to shorten the calculation time by various means to facilitate debugging, the problem did not occur anymore. It almost seemed like the long runtime was crucial for triggering the problem.
I tried running it in Visual Studio 2015, but VS does not break/stop (like it would if e.g. a segfault had happened) despite having turned on breaking at all C++ Exceptions and all Common Language Runtime Exceptions, as was suggested in some other thread.
What I would like VS to do is to either break whenever that error code is 'produced' and examine the values of variables or at least get a stack trace.
I searched intensively but I could not find a satisfactory solution to my problem. In essence, my question is similar to how to debug "Invalid parameter passed to C runtime function"? but the problem does not occur with the linux version of my program, so I'm looking for directions on how to debug it on Windows, either with Visual Studio or some other tool.
Edit:
Sadly, I was not able to find any convenient means of breaking automatically when the error occurs. So I went with the manual way of setting a breakpoint (in VS) near the supposed crash and stepping through the code.
It turned out that I got a NULL pointer from fopen:
myfile = fopen("somedir\\somefile.xml", "w");
despite the file being created. But when trying to write to that file (via the NULL handle!), a segfault occurred. Strangely, it seems I only get a NULL pointer from fopen when the process has a long lifetime. But that's off-topic for this question.
Edit 2:
Checking the global errno variable gave error code 22, which again translates to an invalid argument. However, the argument to fopen is not invalid, as I verified with the debugger, and the file is actually created correctly (with 0 bytes length). Now I think that error code 22 is simply misleading, because when I check $err, hr (via a watch in VS) I get:
0x000005aa ERROR_NO_SYSTEM_RESOURCES : Insufficient system resources exist to complete the requested service.
Just like mentioned here, I have plenty of HD space (1.4 GB), plenty of free RAM (3.2 GB), and I fear it is something not directly caused by my program but something due to broken Windows design of file handling (it does not happen under Linux).
Edit 3: OK, it seems it is not Windows itself that's the culprit but rather the Intel Fortran compiler I'm using. Every time I'm doing formatted write statements in my program, a Mutant (Windows speak for mutex) handle is leaked. Using WinDbg and !htrace -enable, then running a bit further, break and issue !htrace -diff gives loads of these backtraces:
0x00000000777ca25a: ntdll!NtCreateMutant+0x000000000000000a
0x000007fefd54a1b7: KERNELBASE!CreateMutexExW+0x0000000000000057
0x000007fefd551d60: KERNELBASE!CreateMutexExA+0x0000000000000050
0x000007fedfab24db: libifcoremd!for_lge_ssll+0x0000000000001dcb
0x000007fedfb03ed6: libifcoremd!for_write_int_fmt+0x0000000000000056
0x000000014085aa21: myprog!MY_ROUTINE+0x0000000000000121
During the program runtime these mutant handles seem to accumulate until they exhaust all handle resources (16711680 handles) so that there's nothing left for file handles.
Edit 4: It's a bug in the Intel Fortran runtime libraries that has been fixed in a later version (see here). Using the patched version of libifcoremd.dll fixes the problem, i.e. the handle count does not increase anymore during formatted writes.
It could be too many open files or leaked (not closed) handles. You can check that with e.g. Process Explorer (I think you could see the number of handles in the process with it).
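If you want the program to watch for this itself, here is a minimal sketch (the reporting function and where you call it are invented for illustration) using the Win32 GetProcessHandleCount API to log the process's own handle count; a count that climbs without bound during formatted writes would confirm the leak without attaching a debugger:
#include <windows.h>
#include <stdio.h>

/* Call this periodically, e.g. every N iterations of the main loop. */
static void report_handle_count(void)
{
    DWORD count = 0;
    if (GetProcessHandleCount(GetCurrentProcess(), &count))
        fprintf(stderr, "open handles: %lu\n", (unsigned long)count);
    else
        fprintf(stderr, "GetProcessHandleCount failed: %lu\n",
                (unsigned long)GetLastError());
}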

Simple C program labelled as virus

I was writing a program to input a string and output alternate characters of the string. I could compile it once, but building it made my antivirus program report it as a virus (Gen:variant.graftor.74557).
Does any part of my code do something malicious enough for it to be called a virus?
#include <stdio.h>
#include <string.h>
void altchar()
{
    char a[50];
    printf("Enter a string");
    gets(a);
    int i = 0;
    for (i = 0; i < strlen(a); i += 2)
        printf("%c", *(a + i));
}
int main()
{
    altchar();
    return 0;
}
Other C programs compile very smoothly with no clashes with my AV.
Update:
My AV has no problem with the gets() function, and other programs that use gets work smoothly.
Update 2:
By the way, I can run the program exactly once, then it is moved to quarantine.
And the output is nothing and the compiler tells me
Process returned 1971248979 (0x757EDF53) execution time: -0.000 s
For curious minds, I use Bitdefender Antivirus!
The "virus" detected is actually a placeholder name for the F-Secure generic trojan detector. It looks into programs for suspect behaviour. Unfortunately that kind of analysis is bound to sometimes producing false positives.
Maybe your harmless code matches some known malware behaviour on a byte code level? Try making a small change to your code and see if the problem goes away. Otherwise you can submit your program (info on the paged linked above) as a false positive to help them improve their database.
The classic methodology of antivirus software is to take a few well-chosen bytes from an infected file and use those as an identifier string: the scanner searches executables, and if the bytes match some bytes in an executable (this check is usually done when opening (running) an executable, or when performing a full system scan), it marks it as a virus.
Fix your code (as per comments), and recompile ... see what happens :)
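For illustration only, the naive byte-signature matching described above might look roughly like this sketch (function and parameter names invented; real scanners are far more sophisticated):
#include <string.h>
#include <stddef.h>

/* Returns nonzero if the byte pattern `sig` occurs anywhere in `data`. */
static int matches_signature(const unsigned char *data, size_t len,
                             const unsigned char *sig, size_t siglen)
{
    size_t i;
    if (siglen == 0 || siglen > len)
        return 0;
    for (i = 0; i + siglen <= len; i++)
        if (memcmp(data + i, sig, siglen) == 0)
            return 1;
    return 0;
}
A harmless executable whose compiled bytes happen to contain such a pattern gets flagged, which is exactly what a false positive is.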

fopen gives no error, but file stays empty

I'm having trouble writing to a file, and the problem is so weird that I don't even know what exactly I should ask for.
First: I use C, I am forced to keep it conforming to the C89/90 standard and to compile with gcc. My OS is Windows 7 Home Premium, 64-bit.
What I want to do, is to open three filestreams like this:
FILE *A = fopen("A.txt", "r");
FILE *B = fopen("B.txt", "w");
FILE *D = fopen("C.txt", "w");
The first one only to read, the two others to write. During the program, I write with
fprintf(B, "%c", letter);
fprintf(D, "%c", letter);
integers, interpreted as ASCII characters, into the files. I did this twenty times before; it always worked. Now the "D" file stays empty. If I change the order of the streams to:
FILE *A = fopen("A.txt", "r");
FILE *D = fopen("C.txt", "w");
FILE *B = fopen("B.txt", "w");
my file "B" stays empty! So always the last one opened does not work. But why?! I can't see any difference to my other programms, which are working, and this program also works, except in the case of this third filestream. I'm compiling with -Wall and -pedantic, the Compiler is not complaining, the Programm ist not crashing, everything works but the third stream!
Has anyone any idea, or even better, experience with a problem like this? I tried for about an hour without getting any clue.
Edit: Thanks for all the comments! I've been programming in C for about two months now, so there are many things I'm not really sure of.
@Mhd.Tahawi: Yes, now I did, but no difference.
@Kninnug: The files get opened successfully.
@Bit Fiddling Code Monkey: Yes, as far as I know, there is a limit somewhere. But I worked with 6 file streams at the same time a week ago and everything was fine.
@Pandrei: Yes, the output in the other file was fine.
@Martin James: I'm not sure what you mean. If you mean fclose or something similar: since so many of you asked me for this, I tried it, but there was no difference.
@hmjd: 'letter' is an integer that gets its value from fgetc(A).
@squeamish ossifrage: I'm printing character by character into the file and get '1' for each, which seems to be OK.
Edit 2: Now, that's nice: I'm facing some kind of Heisenbug. I was going to check the return values, as so many of you told me to.
So I initialized a new int retval = 0; and wrote:
letter = fgetc(A);
while (letter != EOF)
{
    retval = fprintf(B, "%c", letter);
    printf("%d", retval); /* Want to see the return values on stdout */
    if (retval < 0) /* If writing fails, the program stops */
    {
        printf("Error while writing in file");
        return (-1);
    }
    letter = fgetc(A);
}
And this one works! The new file gets filled, everything is fine. Except for the fact that I have absolutely no idea why the old one:
letter = fgetc(A);
while (letter != EOF)
{
    fprintf(B, "%c", letter);
    letter = fgetc(A);
}
which worked ten thousand times before, didn't do anything at all.
Has anyone any further ideas? However, thanks so far for your ideas!
Unless you called fclose() your standard library will buffer the output data and you won't see it on disk until you either write enough, or flush it via fflush() or close the file via fclose() (or disable buffering via setbuf()/setvbuf()).
Obviously, disabling buffering or flushing after each I/O operation obviates the whole purpose of buffered I/O making performance suboptimal.
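As a minimal sketch of the failure mode (file name invented): the data only reliably reaches the disk once the stream is flushed or closed, and checking the return value of fclose() catches write errors too:
#include <stdio.h>

int main(void)
{
    FILE *out = fopen("out.txt", "w");
    if (out == NULL)
        return 1;
    fprintf(out, "hello\n"); /* data sits in the stdio buffer for now */
    /* If the program crashed or exited abnormally here,
       out.txt could be left empty on disk. */
    if (fclose(out) == EOF) /* flushes the buffer, then closes */
        perror("out.txt");
    return 0;
}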
I think I found the source of the problem, so I thought it might be helpful to share that it seems to be compiler-related.
As written above, I use gcc as part of the MinGW package.
I asked a friend of mine, who uses Pelles C, to compile the code himself, and voilà: for him, it worked fine!
So it seems like gcc may, under certain conditions, have problems with file streams. But I can't really say which conditions.
However, if anyone else has a problem like this, try it with another compiler!

Reading .png file in C and sending it over a socket

I am currently working on a simple server implemented in C.
Processing .jpg files works fine, but .png files give me a segmentation fault. I never get past this chunk of code. Why might this be?
fseek(file, 0, SEEK_END);
lSize = ftell(file);
rewind(file);
Thanks.
It's far more likely that you were accessing those arrays in a problematic fashion. Check the logic in your buffering code. Make sure you have your buffer sizes #define'd in a central location, rather than hardcoding sizes and offsets. You made it quit crashing, but if you missed an underlying logic error, you may run into mysterious problems down the road when you change something else. It is probably worth your time to deliberately break the program again and figure out WHY it's broken. As others have suggested, a debugger would be an excellent idea at this point. Or post a more complete example of your code.
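As a hedged sketch of that advice (the helper name read_whole_file is invented), here is one way to size the buffer from ftell and check every call before touching the data:
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical helper: reads an entire binary file into a heap
   buffer sized from ftell; returns NULL on any failure. */
static char *read_whole_file(const char *path, long *out_size)
{
    FILE *file = fopen(path, "rb");
    char *buf = NULL;
    long size;

    if (file == NULL)
        return NULL;
    if (fseek(file, 0, SEEK_END) != 0)
        goto fail;
    size = ftell(file);
    if (size < 0)
        goto fail;
    rewind(file);
    buf = malloc((size_t)size);
    if (buf == NULL)
        goto fail;
    if (fread(buf, 1, (size_t)size, file) != (size_t)size)
        goto fail;
    fclose(file);
    *out_size = size;
    return buf;
fail:
    free(buf);
    fclose(file);
    return NULL;
}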

Is there a way to read HD data past EOF?

Is there a way to read a file's data but continue reading the data on the hard drive past the end of file? For normal file I/O I could just use fread(), but, obviously, that will only read to the end of the file. And it might be beneficial if I add that I need this on a Windows computer.
All my Googling for a way to do this is instead coming up with results about unrelated topics concerning EOF, such as people having problems with normal I/O.
My reasoning for this is that I accidentally deleted part of the text in a text file I was working on, and it was an entire day's worth of work. I Googled up a bunch of file recovery stuff, but it all seems to be about recovering deleted files, whereas my problem is that the file is still there but missing some of its information. I'm hoping some of that data still exists directly after the currently marked end of file and is neither fragmented elsewhere nor already claimed or otherwise overwritten. Since I can't find a program that helps with this specifically, I'm hoping I can quickly make something up for it (I understand that, depending on what is involved, this might not be as feasible as just redoing the work, but I'm hoping that's not the case).
As far as I can foresee, though I might not be correct (not sure, which is why I'm asking for help), there are 3 possibilities.
Worst of the three: I have to look up Windows API functions that allow direct access to the entire hard drive (similar to its functions for memory, perhaps? I have experience with those), scan the whole drive for the data I can still access from the file, and then just continue looking at what's after it.
Second: I can get a pointer to the file, but then I still have to get raw access to the HD, at least now with a pointer to the file in it?
Best of the three: Just open the file for write access, seek to the end, then write a ways past EOF to claim more space, hoping that Windows won't clean the data before it hands it over to me, so that the "garbage" I get back is the previous data in that spot, which would actually be what I'm looking for. This would be awesome if it were that simple, but I'm afraid to test it out because I'd lose the data if it failed, so hopefully someone else already knows. The PC in question is running Vista Home Premium, if that matters to anyone who knows the gory details of Windows.
Do any of those three seem plausible? Whether yea or nay, I'm also open (and eager) for other suggestions, especially ones better than my silly ideas, and especially if they come with directions toward specific functions to get the job done.
Also, if anyone else actually has heard of a recovery program that doesn't just recover deleted files but which would actually work for a situation like this, and which is free and trustworthy, that works too.
Thanks in advance for any assistance.
You should get a utility for scanning the free space of a hard drive and recovering data from it, for example PhotoRec or foremost. Note however that if you've been using the machine much at all (even web browsing, which will create files in your cache), the data has likely already been overwritten. Do not save your recovery tools on the same hard drive, or even use the same PC to download them; get them from another computer and save them to a USB device, then run them from that device.
As for the conceptual content of your question, files are abstract objects. There is no such thing as data "past eof" except (depending on the implementation) perhaps up to the next multiple of the filesystem/disk "blocksize". Also it's possible (very likely) that your editor "saved" the file by truncating it and writing everything newly from the beginning, meaning there's not necessarily any correspondence between the old and new storage.
Your question doesn't make a lot of sense -- by definition there is nothing in the file after the EOF. By your further description, it appears that you want to read whatever happens to be on the disk after the last byte that is used by the file, which might be random garbage (unused space) or might be some other file. But in either case, this isn't 'data after the EOF', it's just data on the disk that's not part of the file. It's even possible that it might be some other part of the same file, if the filesystem happens to lay out its data that way -- some filesystems scatter blocks in seemingly random ways across the disk, and figuring out which bytes belong to which files requires understanding the filesystem metadata.
