I am running a simulation in C and need to store three 100x100 matrices roughly 1000 times. My program runs just fine when I'm not writing the data to file, but when I run it and write the data, I get a segmentation fault after about 250 time steps, and I don't understand why.
My save function looks like this:
void saveData(Simulation* sim, int number) {
    sprintf(pathname_vx, "data/xvel%d.dat", number);
    sprintf(pathname_vy, "data/yvel%d.dat", number);
    sprintf(pathname_rho, "data/rho%d.dat", number);

    FILE* vx_File  = fopen(pathname_vx, "w");
    FILE* vy_File  = fopen(pathname_vy, "w");
    FILE* rho_File = fopen(pathname_rho, "w");

    int iX, iY;
    double ux, uy, rho;
    for (iY=0; iY<sim->ly; ++iY) {
        for (iX=0; iX<sim->lx; ++iX) {
            computeMacros(sim->lattice[iX][iY].fPop, &rho, &ux, &uy);
            fprintf(vx_File, "%f ", ux);
            fprintf(vy_File, "%f ", uy);
            fprintf(rho_File, "%f ", rho);
        }
        fprintf(vx_File, "\n");
        fprintf(vy_File, "\n");
        fprintf(rho_File, "\n");
    }
    fclose(vx_File);
    fclose(vx_File);
    fclose(vy_File);
}
where 'Simulation' is a struct containing a lattice (100x100 matrix) with 3 different variables 'rho', 'ux', 'uy'. The 'number' argument is just a counting variable to name the files correctly.
gdb says the following, but it doesn't help me much.
Program received signal EXC_BAD_ACCESS, Could not access memory.
Reason: KERN_INVALID_ADDRESS at address: 0x0000000000000010
0x00007fff87c6ebec in __vfprintf ()
I'm not that experienced in programming, so I guess there are better ways to write data to file. Any attempt to clarify why my approach doesn't work is highly appreciated.
Thanks
jon
Looks like you're closing vx_File twice and not closing rho_File at all. This means you're leaving rho_File open on each iteration, and thus using up a file descriptor each time through.
I'd guess the program fails because you're running out of file descriptors. (Since this happens around the 250th iteration, I'd guess your limit is 256.) Once you're out of file descriptors, one of the fopen() calls will return NULL. Since you don't check the return value of fopen(), the crash occurs when you attempt to fprintf() to a NULL handle.
Looks like a NULL pointer dereference. You need to check the result of fopen() to make sure it succeeded (non-NULL result).
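For example, something along these lines (a sketch only, keeping the names from your function): check each fopen() result and close each of the three files exactly once:

    FILE* vx_File  = fopen(pathname_vx, "w");
    FILE* vy_File  = fopen(pathname_vy, "w");
    FILE* rho_File = fopen(pathname_rho, "w");
    if (vx_File == NULL || vy_File == NULL || rho_File == NULL) {
        fprintf(stderr, "saveData: could not open output files for step %d\n", number);
        return;   /* bail out instead of writing through a NULL FILE* */
    }
    /* ... write the data as before ... */
    fclose(vx_File);
    fclose(vy_File);
    fclose(rho_File);   /* each stream closed exactly once */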
Maybe you run out of memory when you are creating those thousand 100x100 matrices (or whatever exactly is happening). You could then end up with an incomplete sim->lattice that might contain NULL pointers.
Do you check whether your malloc() calls succeed? If they can't allocate memory, they return NULL.
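For example (the exact layout of sim->lattice is a guess here, so treat this only as a sketch of the check):

    sim->lattice = malloc(sim->lx * sizeof *sim->lattice);   /* hypothetical allocation */
    if (sim->lattice == NULL) {
        fprintf(stderr, "out of memory allocating lattice\n");
        exit(EXIT_FAILURE);
    }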
I made the file "wor.txt" in the same program and I closed its write stream. But when I try to access it on the first run (when the file is created), it gives a segmentation fault, whereas when I re-run the program it runs successfully.
When I delete the automatically generated file and run the program again, it gives a segmentation fault, and on the second run (without deleting the file) it runs successfully again.
NOTE: There is data in the text file, so it is not empty (I have seen it in the file manager after the first run).
FILE *fp1 = fopen("wor.txt", "r");
FILE *f1  = fopen("wordsa.txt", "ab+");
if ((f1 == NULL) || (f2 == NULL)) {
    printf("f1 or f2 is null");
}
char c = '0';
while ((c) != EOF) {
    printf("Here is one marker\n");
    c = fgetc(fp1); //This Line gives error
    printf("Here is another marker\n");
    fputc(c, f1);
}
A char is not sufficient to hold EOF; change the type to int.
Check the man page of fgetc(): it returns an int, and you should use the same data type for storing and further using the return value.
That said, when either f1 or fp1 is NULL, you are continuing anyway and accessing those file pointers, which is undefined behaviour. You should make that NULL check meaningful and either return or exit so that the code accessing those pointers is never reached.
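A minimal corrected loop might look like this (a sketch, assuming both files opened successfully):

    int c;   /* int, so EOF is distinguishable from every valid character value */
    while ((c = fgetc(fp1)) != EOF) {
        fputc(c, f1);
    }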
Incorrect check.
To properly detect a failure to open the stream, check fp1, not f2. Then the code will fail gracefully when the files do not open properly rather than seg-faulting.
FILE *fp1 = fopen("wor.txt", "r");
FILE *f1  = fopen("wordsa.txt", "ab+");
// if((f1==NULL)||(f2==NULL)){
if ((f1 == NULL) || (fp1 == NULL)) {
    printf("f1 or fp1 is null");
}
Also use int c, as fgetc() typically returns 256 + 1 different values (the unsigned char values plus EOF), and a char is insufficient to uniquely distinguish them.
I'm working on 64-bit Xubuntu 14.04.
I have a fairly large C program and to test new features, I usually implement them in a separate program to iron out any bugs and whatnot before incorporating them into the main program.
I have a function that takes a const char* as argument, to indicate the path of a file (/dev/rx.bin in this case). It opens the file, reads a specific number of bytes into an array and then does some things before exporting the new data to a different file.
First off I allocate the array:
int16_t *samples = (int16_t *)calloc(rx_length, 2 * sizeof(samples[0]));
Note that rx_length is for example 100 samples (closer to 100 000 in the actual program), and it's calculated from the same constants.
Next I open the file and read from it:
uint32_t num_samples_read;

FILE *in_file = fopen(file, "rb");
if (in_file == NULL) {
    ferror(in_file);
    return 1;
}

num_samples_read = fread(samples, 2 * sizeof(samples[0]), rx_length, in_file);
Here's the kicker: the return value from fread() is not the same between the test program and the main program, even though the code is identical. For example, when I should be reading 100 000 samples from a 400 kB file (100 000 samples, one int16_t for the real part and one int16_t for the imaginary part, i.e. four bytes per sample), the value returned is 99328 in the main program. For the life of me I cannot figure out why.
I've tested the output of every single variable used in any calculation, and up until fread() everything is identical.
I should also note that the function is in a separate header in my main program, but I figured that since printing every constant/definition gives the expected result, that isn't where I'm making the mistake.
If there's anything that I might have missed, any input would be greatly appreciated.
Regards.
Thank you chux for reminding me to close and answer.
Closing the file was the problem in my main program, it never occurred within the test environment because the input file was not being modified there.
Once the RX thread has completed its task, make a call to fclose():
rx_task_out:
    fclose(p->out_file);
    // close device
    // free sample buffer
    return NULL;
Previously, the file was only closed when an error occurred while creating the RX thread.
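As an aside, checking the return value of fread() would have made the short read obvious straight away. A rough sketch (using the names from the question, with num_samples_read widened to size_t):

    size_t num_samples_read = fread(samples, 2 * sizeof(samples[0]), rx_length, in_file);
    if (num_samples_read != (size_t)rx_length) {
        if (feof(in_file))
            fprintf(stderr, "short read: got %zu of %lu samples (hit EOF)\n",
                    num_samples_read, (unsigned long)rx_length);
        else if (ferror(in_file))
            perror("fread");
    }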
I'm writing a struct into a file, but reading it back returns garbage. Here is my code:
ptFile = fopen("funcionarios.dat", "ab+");
fwrite(&novoFunc, sizeof(strFunc), 1, ptFile);
The values in the struct novoFunc are not garbage, either before or after the fwrite.
However, when I read the values back from the file:
ptFile = fopen("funcionarios.dat", "rb+");
[...]
fseek(ptFile, i*sizeof(strFunc), SEEK_SET); //on the loop, i goes from 0 to total structs
fread(&funcionario, sizeof(strFunc), 1, ptFile);
printf("Code: %d; Name: %s; Address: %s; CPF: %d; Sales: %d\n", funcionario.codigo, funcionario.nome, funcionario.endereco, funcionario.cpf, funcionario.numVendas);
Any idea why? The code was working fine, and I don't remember making any significant changes.
Thanks in advance
Edit: Struct definition
typedef struct func {
    int codigo;
    char nome[50];
    char endereco[100];
    int cpf;
    int numVendas;
    int ativo;
} strFunc;
Edit2: It just got weirder: it works fine on Linux (using NetBeans and the GCC compiler), but it doesn't on Windows (Dev-C++ and Code::Blocks). Well, the entire code is here:
http://pastebin.com/XjDzAQCx
The function cadastraFucionario() registers the user, and when I use listaFuncionarios() to list all the registered data, it returns the garbage. Here is a screenshot of what listaFuncionarios() returns:
http://img715.imageshack.us/img715/3002/asodfadhf.jpg
I'm sorry the code isn't in English.
You say: "The code was working fine, and I dont remember doing significative changes."
When it was working fine, it wrote some structures into your file.
Maybe later it was still working fine, and it appended some additional structures at the end of your file. The original data still remained at the beginning of your file. So when you read the beginning of the file, you read the original data. Maybe.
Are you sure that you read garbage? Are you sure that you didn't just read old data?
In your code:
ptFile = fopen("funcionarios.dat", "ab+");
Appending is the right thing to do for some purposes but not for others. Do you need wb+ instead?
This:
it works fine on Linux ... but it doesn't on Windows
is a big red flag. Windows has "text" files, which are different from "binary" files. On Linux and other Unixes, there is no difference.
Two lines in your source stand out:
fopen("funcionarios.dat", "rb+");
and later
fopen("funcionarios.dat", "r+");
That is, sometimes you open the file in "binary" mode, and sometimes in "text" mode. Make sure you always open any file in "binary" mode (that is, with the b in the mode string) if you ever intend to read or write non-text data.
Here are two problems in your function retornaIndice.
while (!feof(ptFile)) {
    fseek(ptFile, sizeof(strFunc)*i, SEEK_SET);
    fread(&tmpFunc, sizeof(strFunc), 1, ptFile);
You aren't checking the result of fread(). After reading the last record, EOF has not been reached yet, so you will try another read. That read will hit EOF and return 0, but since you aren't checking for that 0, you will use garbage data and only exit the loop the next time the while condition is tested.
if (codigo != 0 && tmpFunc.ativo) {
    if (tmpFunc.codigo == codigo) {
        return i;
    }
If you find a matching record at this point, you return without closing ptFile. The leaked handle shouldn't cause garbage data to be written to the file, but it doesn't inspire confidence either.
Some of your other functions have the same errors.
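A rough sketch of how the retornaIndice loop could look with the fread() result checked instead of feof(), and without the per-iteration fseek(), which isn't needed when reading sequentially (the names are taken from your code; the -1 "not found" return is an assumption):

    int i = 0;
    strFunc tmpFunc;
    while (fread(&tmpFunc, sizeof(strFunc), 1, ptFile) == 1) {
        if (codigo != 0 && tmpFunc.ativo && tmpFunc.codigo == codigo) {
            fclose(ptFile);   /* don't leak the handle on the early return */
            return i;
        }
        ++i;
    }
    fclose(ptFile);
    return -1;   /* record not found */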
I'm experiencing some problems while trying to read a binary file in C.
This has never happened to me before, and I don't really know how to handle it.
So, there's this structure called "hash_record"; many of them are stored in my "HASH_FILE" file in binary mode. This is the structure:
typedef struct hash_record {
    char *hash;
    char *filename;
} hash_record;
I write the file in this way:
hash_record hrec;
[...] code that fills the structure's fields [...]

FILE* hash_file = fopen(HASH_FILE, "ab");
fwrite(&hrec, sizeof(hash_record), 1, hash_file);
fclose(hash_file);
This is just a summary; the fwrite() call is inside a loop so that I can fill the file with many hash_records.
Then, immediately after that piece of code, I start reading the file and printing some data to be sure everything went well. This is the code:
int print_data() {
    hash_record rec;
    printf("Data:\n");

    FILE* hash_file = fopen("hash.bin", "rb");
    if (hash_file == NULL)
        return -1;

    while (fread(&rec, sizeof(hash_record), 1, hash_file) == 1)
        printf("Filename: %s - Hash: %s", rec.filename, rec.hash);

    fclose(hash_file);
    return 0;
}
And everything works just fine!
The problem is that if I write the binary file in one instance of my program and then quit it, when I open it again (commenting out the code which writes the file so it only reads), I get a segmentation fault. The error appears when I call the printf() inside the while() loop. If I just print a plain string without touching "rec", no error occurs, so I'm assuming there's something wrong with the data stored in "rec".
Any idea?
Thank you!
You are writing out pointers. When you read them back in from the same instance of the program, the data is in the same place and the pointers are meaningful. If you read them in from another instance, the pointers are bad.
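One common fix is to store the strings inside the struct, so that fwrite()/fread() round-trips the actual characters rather than addresses. A sketch (the array sizes are just placeholders, not anything from your code):

    typedef struct hash_record {
        char hash[65];        /* e.g. a hex digest plus the terminating '\0' (size is an assumption) */
        char filename[256];   /* assumed maximum filename length */
    } hash_record;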
Hey, I've tried a lot of programs in Visual Studio, and in most of them, when I try taking input from a stream (using fscanf), it invariably throws a "Debug Assertion Failed" error
and goes on to say:
stream != NULL. Since I have gotten this error a number of times, I assume there is a flaw in the way I'm using fscanf. I would appreciate it if someone could show me the correct usage, or give me a small demo that illustrates it.
I tried looking up the error; in most places it said I hadn't closed the file, but I have, and I'm a little confused. I appreciate any help, thanks a lot :)
printf("Enter No of states\n");
Q=5;
// scanf("%d",&Q);
// READING ZERO MATRIX
// reading the matrix from f0.sta
{
FILE *fp;
fp = fopen("c:\\tc\\fuzzy\\f0.sta","r");
for(i=1;i<=Q;i++)
for(j=1;j<=Q;j++)
fscanf(fp,"%f",&a0[i][j]);
fclose(fp);
}
// READING ONE MATRIX
// reading the matrix from f0.sta
FILE *fp;
fp = fopen("c:\\tc\\fuzzy\\f1.sta","r");
for(i=1;i<=Q;i++)
for(j=1;j<=Q;j++)
fscanf(fp,"%f",&a1[i][j]);
fclose(fp);
This is the code bit.
It sounds like fp is NULL. The most likely reason is that one of the files (or both) do not exist, or can't be opened (for example, because some other process is using it).
I would start by adding some error checking after the two fopen() calls: compare the result to NULL and if it is NULL, examine errno.
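For example (the path is taken from your code):

    FILE *fp = fopen("c:\\tc\\fuzzy\\f0.sta", "r");
    if (fp == NULL) {
        perror("fopen c:\\tc\\fuzzy\\f0.sta");   /* prints a message based on errno */
        return 1;
    }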
Your loop counters start at 1 instead of 0, which is odd for C programming. What's likely happening is you're not allocating enough space in the array, i.e. you have
double a[5][5];
when you need
double a[6][6];
so you're stepping on something past the end of the array. Better to have your loop be
for(i=0;i<Q;i++)
for(j=0;j<Q;j++)
so you don't waste the 0 slots in the array.
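Putting both suggestions together, a rough sketch for the zero matrix (assuming a0 is declared as float a0[5][5], so that the %f conversion matches, inside a function that returns int):

    FILE *fp = fopen("c:\\tc\\fuzzy\\f0.sta", "r");
    if (fp == NULL) {
        perror("fopen f0.sta");
        return 1;
    }
    for (i = 0; i < Q; i++)
        for (j = 0; j < Q; j++)
            if (fscanf(fp, "%f", &a0[i][j]) != 1) {   /* check each conversion succeeded */
                fprintf(stderr, "bad or missing value at row %d, column %d\n", i, j);
                fclose(fp);
                return 1;
            }
    fclose(fp);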