Overwriting a record in a random-access file - C

I have a random access file opened in "r+b" mode with records of equal length. Can I change the contents of a record after reading it and overwrite it in place?
I tried the following code, but when I run it I get: Segmentation fault (core dumped)
#include <stdio.h>
int main()
{
    struct tala {
        int rec_no;
        long file_no;
    };
    FILE *file_locking;
    struct tala t, f;
    file_locking = fopen("/path/to/my/file.bin", "rb+");
    t.rec_no = 1;
    t.file_no = 3;
    if (fwrite(&t, sizeof(struct tala), 1, file_locking) == 0)
        printf("Error opening file");
    t.rec_no = 0;
    rewind(file_locking);
    if (fwrite(&t, sizeof(struct tala), 1, file_locking) == 0)
        printf("Error opening file");
    rewind(file_locking);
    if (fread(&f, sizeof(struct tala), 1, file_locking) == 0)
        printf("Error opening file");
    printf("\n %d", f.rec_no);
    printf("\n %ld", f.file_no);
    fclose(file_locking);
}

Yes you can. Just remember to always call fseek between reads and writes.
Quoting the fopen man page:
Reads and writes may be intermixed on read/write streams in any order. Note that ANSI C requires that a file positioning function intervene between output and input, unless an input operation encounters end-of-file.
Extra tip: always check the return value of fopen and related functions, and handle errors (use perror or strerror to print out what failed).

Yes. The only thing to pay attention to is that you have to call fflush or a file positioning function before switching from output to input, and call a file positioning function (or be at end-of-file) before switching from input to output.
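For reference, here is a sketch of how the question's program might look with the fopen result checked and the required positioning calls in place (the path is the question's placeholder):

#include <stdio.h>
#include <stdlib.h>

struct tala {
    int rec_no;
    long file_no;
};

int main(void)
{
    struct tala t, f;
    FILE *file_locking = fopen("/path/to/my/file.bin", "rb+");
    if (file_locking == NULL) {
        perror("fopen failed");  /* an unchecked NULL here is the likely cause of the segfault */
        return EXIT_FAILURE;
    }
    t.rec_no = 1;
    t.file_no = 3;
    if (fwrite(&t, sizeof(struct tala), 1, file_locking) != 1)
        perror("first write failed");
    /* reposition before overwriting the record in place */
    rewind(file_locking);
    t.rec_no = 0;
    if (fwrite(&t, sizeof(struct tala), 1, file_locking) != 1)
        perror("second write failed");
    /* a positioning call is also required between the write and the read */
    rewind(file_locking);
    if (fread(&f, sizeof(struct tala), 1, file_locking) != 1)
        perror("read failed");
    printf("%d\n", f.rec_no);
    printf("%ld\n", f.file_no);
    fclose(file_locking);
    return 0;
}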


can't access a place in memory

I'm trying to read a binary file of 32 bytes in C, but I keep getting "segmentation fault (core dumped)" when I run my program.
It would be great if somebody could help me out by pointing out where I went wrong.
My code is below:
int main()
{
    char *binary = "/path/to/myfiles/program1.ijvm";
    FILE *fp;
    char buffer[32];

    // Open read-only
    fp = fopen(binary, "rb");
    // Read 128 bytes into buffer
    fread(buffer, sizeof(char), 32, fp);
    return 0;
}
It's because of the path. Make sure that "/path/to/myfiles/program1.ijvm" points to an existing file.
You should always check the return value of fopen.
// Open read-only
fp = fopen(binary, "rb");
if (fp == NULL) {
    perror("problem opening the file");
    exit(EXIT_FAILURE); // exit and EXIT_FAILURE require <stdlib.h>
}
Notice also that you are reading 32 bytes into your buffer, not 128 as your comment says.
You must check the return value of fopen().
I'm assuming you are getting the segfault in the fread() call because your data file doesn't exist or couldn't be opened, and you are trying to work on a NULL FILE structure.
See the following safe code:
#include <stdio.h>
#include <stdint.h>

#define SIZE_BUFFER 32

int main()
{
    char *binary = "data.txt";
    FILE *fp = NULL;
    char buffer[SIZE_BUFFER];

    // Open read-only
    fp = fopen(binary, "rb");
    // Read SIZE_BUFFER bytes into buffer
    if (fp)
    {
        printf("Elements read %zu\n", fread(buffer, sizeof(char), SIZE_BUFFER, fp));
        fclose(fp);
    }
    else
    {
        // perror() shows a text description of what failed and why
        perror("Unable to open file");
    }
    return 0;
}
When I execute this code it doesn't crash: it prints the number of elements read if the file was opened, or "Unable to open file" followed by the reason if it could not be opened.
As mentioned in the comments, you should also close the file before exiting. Another thing you can do is the following:
FILE *fp = fopen(.....);
Instead of declaring and assigning in two separate steps.
There are two things to look at.
The fopen(3) function can fail for some reason, which means fp is NULL, and then you are trying to use the null pointer in fread(3). This can crash. #OznOg has already given a subtle hint to look in this direction.
If the fopen call is a success (i.e. fp is non-NULL after calling fopen), the fread itself is fine: it reads 32 chars into buffer, which is declared with exactly 32 elements, not into binary. So the unchecked NULL pointer is the culprit here.

C program not writing to text file

So here is the problem:
I want to write some text to a text file, but I came across some weird situations:
SAMPLE 1:
int main()
{
    FILE *file = fopen("structures.txt", "a+"); // open the file for reading & writing
    int choice = 0;
    if (file == NULL)
        puts("Unable to open text file");
    else {
        do {
            scanf("%d", &choice);
            fprintf(file, "This is testing for fprintf...\n");
        } while (choice > 0);
    }
    fclose(file);
    return 0;
}
In this version of the program, nothing ever gets written to the text file, and I can't understand why. But the strangest thing for me comes next:
SAMPLE 2:
int main()
{
    FILE *file = fopen("structures.txt", "a+"); // open the file for reading & writing
    int choice = 0;
    if (file == NULL)
        puts("Unable to open text file");
    else {
        do {
            scanf("%d", &choice);
            fprintf(file, "This is testing for fprintf...\n");
            fclose(file); // Added this line, the writing works
        } while (choice > 0);
    }
    fclose(file);
    return 0;
}
After adding fclose(file); directly after the fprintf call, the program successfully writes:
This is testing for fprintf...
to the text file.
My questions are:
Does that mean that I have to open my text file whenever I want to write some text to it and close it directly afterwards?
Does fclose() have anything to do with the writing process?
What are the factors that prevent fprintf() from working? (1st sample)
How can I open and close the text file just ONCE, at the start of the program and at the end of it (respectively), guaranteeing at the same time that my program will work flawlessly?
Does that mean that I have to open my text file whenever I want to write some text to it and close it directly afterwards?
No, but it may mean you need to close the file before the contents you've written are actually flushed and written out to the file.
Does fclose() have anything to do with the writing process?
Most file streams are buffered, meaning that each write goes to memory; it is not written to disk until the buffer is full or you close the file.
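(As an aside, beyond what this answer says: if you want every write to bypass the buffer entirely, you can disable buffering for the stream with the standard setvbuf call, placed right after fopen and before any other I/O on the stream:)

setvbuf(file, NULL, _IONBF, 0); /* _IONBF = unbuffered; each fprintf goes straight to the file */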
What are the factors that prevent fprintf() from working? (1st sample)
Anything you get significantly wrong.
How can I open and close the text file just ONCE, at the start of the program and at the end of it (respectively), guaranteeing at the same time that my program will work flawlessly?
You could call something like fflush(). But are you sure you tried this and the file contained nothing even after you finally closed the file at the end?
I solved the problem using int fflush(FILE *stream) (which #Jonathan hinted at).
Flushing output on a buffered stream means transmitting all
accumulated characters to the file. There are many circumstances when
buffered output on a stream is flushed automatically:
When you try to do output and the output buffer is full.
When the stream is closed.
When the program terminates by calling exit.
When a newline is written, if the stream is line buffered.
Whenever an input operation on any stream actually reads data from its file.
So basically, I just had to call fflush() after the call to fprintf() :
fprintf(file, "This is testing for fprintf...\n");
fflush(file);
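Putting it together, a complete version of Sample 1 with the flush added might look like this (a sketch of the fix described above):

#include <stdio.h>

int main()
{
    FILE *file = fopen("structures.txt", "a+"); // open the file for reading & writing
    int choice = 0;
    if (file == NULL) {
        puts("Unable to open text file");
        return 1;
    }
    do {
        scanf("%d", &choice);
        fprintf(file, "This is testing for fprintf...\n");
        fflush(file); // push the buffered text out to the file immediately
    } while (choice > 0);
    fclose(file);
    return 0;
}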

Why do the strings output using fprintf end up not being written to the output file if my program is terminated via CTRL-C?

Why does fprintf give different results in the following example programs?
Example 1:
int main(){
    FILE *f;
    char buf[512];
    char name[128] = {"filename"};
    f = fopen(name, "w");
    fprintf(f, "asdas\n");
    fprintf(f, "asdas\n");
    while(1){}
    return 0;
}
If I terminate this program using CTRL+C, I get an empty file named filename.
However, using
Example 2:
int main(){
    FILE *f;
    char buf[512];
    char name[128] = {"filename"};
    f = fopen(name, "w");
    while(1){
        fprintf(f, "asdas\n");
    }
    return 0;
}
If I terminate this program using CTRL+C, I get a file named filename, and it contains many lines with the string asdas.
Why are the strings not written to the file in the first example, but written in the second?
In the second case, there are enough fprintf calls for the internal buffers to be flushed to disk.
With the first program, if you put an fflush(f) before the while loop, the strings will be written to the file.
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    FILE *f = fopen("filename", "w");
    if (!f) {
        perror("Failed to open 'filename' for writing");
        exit(EXIT_FAILURE);
    }
    fprintf(f, "asdas\n");
    fprintf(f, "asdas\n");
    if (fflush(f) != 0) {
        perror("Flushing output failed");
        exit(EXIT_FAILURE);
    }
    while (1) { }
    return 0;
}
Output:
C:\...\Temp> cl file.c
Microsoft (R) C/C++ Optimizing Compiler Version 18.00.31101 for x64
...
/out:file.exe
C:\...\Temp> file
^C
C:\...\Temp> type filename
asdas
asdas
Keep in mind:
Upon successful completion, fflush() shall return 0; otherwise, it shall set the error indicator for the stream, return EOF, and set errno to indicate the error.
As mentioned in the answer by #SinanÜnür, this is indeed an issue with data being held in internal buffers. You need to flush manually in the first case to get that data actually written to the file.
However, FWIW, I just want to add here that you see this behavior because of the abnormal termination of the program by a signal (generated by CTRL+C).
If your program had ended normally (for example, by calling exit() after a large-enough but controlled while() loop), then both cases would have shown the same behavior, since in that scenario all open streams with unwritten buffered data are flushed automatically.
The exit() function shall then flush all open streams with unwritten buffered data and close all open streams. Finally, the process shall be terminated ...
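To see this, a bounded variant of Example 1 (my illustration, not from the answer) terminates normally and leaves the data in the file without any explicit fflush:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    FILE *f = fopen("filename", "w");
    if (!f)
        return EXIT_FAILURE;
    for (int i = 0; i < 1000; i++)  // a controlled loop instead of while(1)
        fprintf(f, "asdas\n");
    exit(EXIT_SUCCESS);  // normal termination flushes and closes all open streams
}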

fopen c with multiple files

In my software I have to read multiple txt databases in a serial way: I read the first, do something with the info I got from that file, then open another one for writing, and so on.
Sometimes I get an error on opening OR creating a file, and after that ALL of the following opens/creations fail too, even though they use different functions, different variables, and different files.
So for example I call the function below, which uses two files, and I get the error "* error while opening file -%s- ..\n"; after that, every other fopen() in my code goes wrong!
This is an example of code for one single file:
FILE *filea;
if ((filea = fopen(databaseTmp, "rb")) == NULL) {
    printf("* error while opening file -%s- ..\n", databaseTmp);
    fclose(filea);
    printf("---------- createDatabaseBackup ----------\n");
    return -1;
}
int emptyFolder = 1;
FILE *fileb;
if ((fileb = fopen(databaseBackup, "ab")) == NULL) {
    printf("* error while opening file -%s- ..\n", databaseBackup);
    fclose(fileb);
    printf("---------- createDatabaseBackup ----------\n");
    return -1;
}
else {
    int i = 0;
    char c[500] = "";
    for (i = 0; fgets(c, 500, filea); i++) {
        fprintf(fileb, "%s", c);
        emptyFolder = 0;
    }
}
fclose(fileb);
fclose(filea);
There is an upper limit on the number of open handles for a given process. Maybe you have a handle leak in your program?
An error while creating a file typically means you don't have access permission to the parent folder.
Those error log messages belong to your program, and you can enhance them: there is an error number (errno) set by the OS when fopen fails. You can print it (for example with perror) to get more information about your issue.
If fopen returned NULL, the file wasn't opened, so there's no point in trying to fclose it. In fact, passing a null pointer to fclose is undefined behavior, which may well explain why everything misbehaves afterwards.
Note also that fgets reads at most 499 characters here and always null-terminates the buffer, so the fprintf is safe; what the loop doesn't do is distinguish end-of-file from a read error, which you could check with ferror(filea) after the loop.
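A corrected sketch of the routine (my illustration; createDatabaseBackup and the variable names are taken from the question):

#include <stdio.h>

int createDatabaseBackup(const char *databaseTmp, const char *databaseBackup)
{
    FILE *filea = fopen(databaseTmp, "rb");
    if (filea == NULL) {
        perror(databaseTmp);   // report why the open failed
        return -1;             // do NOT fclose a NULL pointer
    }
    FILE *fileb = fopen(databaseBackup, "ab");
    if (fileb == NULL) {
        perror(databaseBackup);
        fclose(filea);         // close only the file that actually opened
        return -1;
    }
    char c[500];
    int emptyFolder = 1;       // tracked as in the question
    while (fgets(c, sizeof c, filea)) {
        fprintf(fileb, "%s", c);
        emptyFolder = 0;
    }
    fclose(fileb);
    fclose(filea);
    return 0;
}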

why fread sometimes encounters "Bad file descriptor"?

I am reading from a file like this:
#include <stdio.h>
#include <stdlib.h>

int main() {
    FILE *fp = fopen("sorted_hits", "r+");
    while (!feof(fp)) {
        int item_read;
        int *buffer = (int *)malloc(sizeof(int));
        item_read = fread(buffer, sizeof(int), 1, fp);
        if (item_read == 0) {
            printf("at file %ld\n", ftell(fp));
            perror("read error:");
        }
    }
}
This file is big and I get the "Bad file descriptor" error sometimes. ftell indicates the file position at which the error occurred.
I don't know why it happens only "sometimes". Is that normal? Does the problem lie in my code or in my hard disk? How should I handle this?
perror prints whatever is in errno as a descriptive string. errno gets set to an error code whenever a system call fails; if a system call DOESN'T fail, errno is not modified and will continue to contain whatever it contained before. Now, if fread returns 0, that means that either there was an error OR you reached the end of the file. In the latter case errno is not set and might contain any random garbage from before.
So in this case, the "Bad file descriptor" message you're getting probably just means there hasn't been an error at all. You should be checking ferror(fp) to see whether an error has actually occurred.
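In other words, something along these lines (a sketch; only consult errno when ferror reports a real error):

if (fread(buffer, sizeof(int), 1, fp) == 0) {
    if (ferror(fp))
        perror("read error");             // errno is meaningful here
    else if (feof(fp))
        printf("end of file reached\n");  // not an error; errno is stale
}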
You seem to be mixing text and binary modes when reading the file.
Normally when you use fread you read from a binary file, i.e. fread reads a number of bytes matching the buffer size, but you seem to be opening the file in text mode ("r+"). ftell doesn't work reliably on files opened in text mode because newlines are treated differently from other characters.
Open the file in binary mode (untranslated) instead:
FILE *fp = fopen("sorted_hits", "rb+");
If that's really what your loop looks like, my guess would be that you're getting a more or less spurious error: your process is running out of memory because the loop leaks it so badly (it calls malloc on every iteration, with no matching call to free anywhere).
It's also possible (but a lot less likely) that you're running into a little problem from your (common but nearly always incorrect) use of while (!feof(fp)).
Your call to printf also gives undefined behavior if you mismatch the conversion and the type (though on many current systems it's irrelevant because long and int are the same size).
Fixing those may or may not remove the problem you've observed, but at least if you still see it, you'll have narrowed down the possibilities of what may be causing the problem.
#include <stdio.h>

int main() {
    FILE *fp = fopen("sorted_hits", "r+");
    if (fp == NULL) {
        perror("sorted_hits");
        return 1;
    }
    int buffer;
    while (0 != fread(&buffer, sizeof(int), 1, fp))
        ; // read the file but ignore its contents
    if (ferror(fp)) {
        printf("At file: %ld\n", ftell(fp));
        perror("read error");
    }
    fclose(fp);
    return 0;
}
