File Handling in C

Sorry for asking a very simple question. I'm new to coding.
Input txt file:
5,3
001
110
111
110
001
I need to print the output as
U1: 001
U2: 110
U3: 111
U4: 110
U5: 001
Up to now I was able to print the contents with this:
#include <stdio.h>
void main() {
    FILE *fopen(), *fp;
    int c;
    fp = fopen("read.txt", "r");
    c = getc(fp);
    while (c != EOF) {
        putchar(c);
        c = getc(fp);
    }
    fclose(fp);
}
Can anybody tell me how I should proceed?

Most of your work is done inside the while loop.
But you're not doing enough work ...
a) you need to count lines
b) you need to print the "U#"
Suggestion: create a variable for the line counting business and rewrite your loop to consider it. Here are a few snippets:
int linecount = 0;
printf("U%d: ", linecount);
if (c == '\n') linecount += 1;
Oh! You really shouldn't declare the prototype for fopen yourself; it is already declared by #include <stdio.h>.
And well done for declaring c as int. Many people make the mistake of declaring it char, which cannot represent both EOF and the full range of characters.
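Putting those snippets together, a minimal sketch might look like the following (it assumes the first line of read.txt is the "5,3" header, which the desired output skips):

#include <stdio.h>

int main(void) {
    FILE *fp = fopen("read.txt", "r");
    if (fp == NULL)
        return 1;

    int c;
    /* skip the header line ("5,3") before numbering the rest */
    while ((c = getc(fp)) != EOF && c != '\n')
        ;

    int linecount = 0;
    int at_line_start = 1;   /* true whenever the next character begins a new line */
    while ((c = getc(fp)) != EOF) {
        if (at_line_start) {
            linecount += 1;
            printf("U%d: ", linecount);
            at_line_start = 0;
        }
        putchar(c);
        if (c == '\n')
            at_line_start = 1;
    }
    fclose(fp);
    return 0;
}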

I would read the data a line at a time with fgets. When you have a complete line in a buffer, it'll be pretty easy to get fprintf to put in your line header with a format like "U%d: %s\n".
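For example, a rough sketch of that approach (assuming each line fits in the buffer and that the first "5,3" line should be skipped):

#include <stdio.h>

int main(void) {
    FILE *fp = fopen("read.txt", "r");
    if (fp == NULL)
        return 1;

    char line[256];
    int n = 0;

    /* read and ignore the first line ("5,3") */
    if (fgets(line, sizeof line, fp) == NULL) {
        fclose(fp);
        return 1;
    }
    while (fgets(line, sizeof line, fp) != NULL) {
        /* fgets keeps the trailing '\n', so the format needs no extra newline */
        printf("U%d: %s", ++n, line);
    }
    fclose(fp);
    return 0;
}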

Related

K&R different outputs from basically the same thing

I am currently working through K&R for C. In section 1.5, we basically create a word-count program. The code is as follows:
#include <stdio.h>
#define IN 1
#define OUT 0
int main()
{
    int c;
    long int nl, nc, nw;
    int state;
    nl = nw = nc = 0;
    state = OUT;
    while ((c = getchar()) != EOF) {
        ++nc;
        if (c == '\n') {
            ++nl;
        }
        if (c == ' ' || c == '\n' || c == '\t') {
            state = OUT;
        } else if (state == OUT) {
            state = IN;
            ++nw;
        }
    }
    printf("\n%ld %ld %ld\n", nl, nc, nw);
}
When I compile this program with gcc, run the executable, and type something such as
Hello
World
I get output
1 11 2
which makes sense, as we have 1 '\n' newline character, 11 characters (including the newline), and 2 words. What is interesting is when I do
vim hello.txt
and type
Hello
World
and then issue the command
cat hello.txt|./a.out
I get the following output
2 12 2
Why is the output different from before? All I am doing is streaming the file into the executable, so I do not understand what is different. Please do explain.
Thank you.
Not your program's feature, but Vim's. See: What does the noeol indicator at the bottom of a vim edit session mean?
Vim automatically appends a newline at the end of the file if there isn't one when saving.
Type
:set noeol
in Vim and save the file again, then your program will output 11 as the second number.
P.S. It's redundant to use cat something | program, just program < something is enough.
After you type the characters into the file and save it, a 0x0a (newline) byte is automatically added at the end of the file.
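If you want to verify this yourself, here is a tiny sketch (the file name hello.txt is taken from the question) that reports whether the file's last byte is a newline:

#include <stdio.h>

int main(void) {
    FILE *fp = fopen("hello.txt", "rb");
    if (fp == NULL)
        return 1;

    int c, last = EOF;
    while ((c = fgetc(fp)) != EOF)
        last = c;        /* remember the final byte of the file */
    fclose(fp);

    if (last == '\n')
        printf("file ends with a newline (0x0a)\n");
    else
        printf("no trailing newline\n");
    return 0;
}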

Format file to have 5 numbers per line

I am working on a text file containing integers separated by spaces, for instance:
1 2 57 99 8 14 22 36 98 445 1001 221 332 225 789 1111115 147 0 1 21321564 544 489 654 61266 5456 15 19
I would like to re-format this file so that every line but the last contains exactly 5 integers, and the last line contains at most 5.
My code:
#include <stdio.h>
#include <stdlib.h>
int main()
{
    FILE *f; // main file (A.txt)
    FILE *g; // file copy (B.txt)
    // open A.txt to read data
    f = fopen("file/path/here/A.txt", "r");
    if (f == NULL) {
        printf("Read error.\n");
        fclose(f);
        return -1;
    }
    // open B.txt to write data
    g = fopen("file/path/here/B.txt", "w");
    if (g == NULL) {
        printf("Write error.\n");
        fclose(g);
        return -2;
    }
    int line = 1; // first line in output file
    while (!feof(f)) { // not end-of-file
        char number[1000];
        int i = 0;
        for (i = 0; i <= 4; i++)
            if (fscanf(f, "%s", number) == 1) { // one number read
                fprintf(g, "%s", line + i, number);
            }
        line += i;
    }
    // close files
    fclose(f);
    fclose(g);
    return 0;
}
When I run this in Code::Blocks, I get the 'Segmentation fault (core dumped) Process returned 139' message. I suspect that the problem lies in the 'if' statement and my use of formats. Needless to say, I'm relatively new to C. How might I fix this?
The immediate reason for your segmentation fault is the expression fprintf(g, "%s", line + i, number);: the "%s" conversion says you will pass a pointer to a string (i.e. a char*), but you actually pass a number (i.e. line + i). The value of line + i, which is probably 1, ..., is therefore interpreted as a pointer to memory address 1, which is not allowed to be dereferenced. It is as if you wrote fprintf(g, "%s", 1), which crashes too.
So basically change this expression to fprintf(g, "%s", number);, and it should at least not crash (unless you have numbers with more than 999 digits).
There are some other issues in your code, e.g. when one of the fopen calls fails you call fclose on the very pointer you just found to be NULL, which is undefined behaviour, and when B.txt fails to open, A.txt is never closed.
But maybe the "crash fix" above gets you moving again, so that you can continue working on your own.
The issue is with the use of fscanf and then fprintf.
fscanf knows how to parse a string into a number. E.g. fscanf(f, "%d", &var);. This reads a signed integer from the file handle f into the variable var. This can then be printed with fprintf.
As it stands, each fscanf(f, "%s", number) reads just one whitespace-delimited token as text (assuming 1000 chars is enough), so you are shuffling the digits around as strings rather than working with numbers.
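Putting both answers together, a rough sketch of the corrected program might look like this (the paths are shortened to A.txt and B.txt here):

#include <stdio.h>

int main(void) {
    FILE *f = fopen("A.txt", "r");      // input: integers separated by whitespace
    if (f == NULL) {
        printf("Read error.\n");
        return -1;
    }
    FILE *g = fopen("B.txt", "w");      // output: five integers per line
    if (g == NULL) {
        printf("Write error.\n");
        fclose(f);
        return -2;
    }

    int value, count = 0;
    while (fscanf(f, "%d", &value) == 1) {   // read one integer at a time
        fprintf(g, "%d", value);
        count++;
        // start a new line after every fifth number, otherwise separate with a space
        fputc(count % 5 == 0 ? '\n' : ' ', g);
    }
    if (count % 5 != 0)                      // terminate a partial last line
        fputc('\n', g);

    fclose(f);
    fclose(g);
    return 0;
}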

fgetc dropping characters when reading 2d pixel map with for loop

I am reading a 2D pixel array from a PPM file; the one I am testing is 5 pixels wide and 5 tall. I am aware that an RGB PPM file has three colour values per pixel, not just one; I had forgotten that before I wrote this code, but the problem persists in the same way even after updating for it. I have simplified the problem down to just the array to isolate it. From what I can tell, characters seem to be both dropped and, in some cases, replaced with newline characters. Any insight into why this is happening would be greatly appreciated, and if I have left something out I will update the question as soon as I am aware of it.
#include <stdio.h>
int main(int args, char *argv[]) {
    int w = 5, h = 5;
    FILE *f = fopen(argv[1], "rb");
    int c = 'a'; //I am setting this so as to avoid the off chance of c being defined as EOF
    for (int i = 0; i < h && c != EOF; i++) {
        for (int j = 0; j < w && (c = fgetc(f)) != EOF; j++)
            printf("%c", c);
        fgetc(f); //To remove the '\n' character; I am not using fgets because it stops at '\n' and it is possible for a rgb value to be == to '\n'
        printf("\n");
    }
    fclose(f);
    return 0;
}
Test File I am using:
12345
abcde
12345
abcde
12345
Output I am getting:
12345
abcd
123
5
ab
de
1
Thanks in advance!
Edit: This is running on the Windows 10 command prompt.
The problem is that '\n' on a Windows machine actually ends up producing two characters, a carriage return (ASCII code 13) and a line feed (ASCII code 10). When you open a file in binary mode, those line endings are not translated back to a single character. You're only accounting for one of these characters, so you're getting off by a character on each line you read.
To illustrate this, replace your printf("%c", c); with printf("%d ", c);. I get the following output:
49 50 51 52 53
10 97 98 99 100
13 10 49 50 51
53 13 10 97 98
100 101 13 10 49
You can see those 10s and 13s shifting through.
Now try adding a second fgetc(f); to eat the line feed and it will work much better. Keep in mind, however, that this only works on files with CRLF line endings. Port it to Linux or Mac and you will have more troubles.
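A small sketch of that suggestion, extended so that the extra read only happens when a carriage return is actually present (and therefore also copes with LF-only files):

#include <stdio.h>

int main(int argc, char *argv[]) {
    if (argc < 2)
        return 1;
    FILE *f = fopen(argv[1], "rb");
    if (f == NULL)
        return 1;

    int w = 5, h = 5;
    int c = 'a';
    for (int i = 0; i < h && c != EOF; i++) {
        for (int j = 0; j < w && (c = fgetc(f)) != EOF; j++)
            printf("%c", c);
        // consume the line ending: CR LF in Windows files, a lone LF elsewhere
        c = fgetc(f);
        if (c == '\r')
            c = fgetc(f);
        printf("\n");
    }
    fclose(f);
    return 0;
}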

fgetc not starting at beginning of large txt file

I am working in C and have a text file of 617 kB that I am trying to read with fgetc. For some reason fgetc seems to start at a random place within the file. I have tried moving the file pointer to the beginning with fseek, with no success. I can get fgetc to work fine with smaller files. Any help is appreciated.
Sample input is 25,000 lines of data similar to:
Product
23 660
2366 3
237 09
2 3730
23734
23 773
241 46
Source:
#define _CRT_SECURE_NO_WARNINGS
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
void print(FILE *category) {
    int ch = 'a';
    while ((ch = fgetc(category)) != EOF) {
        printf("%c", ch);
    }
    getchar();
}
int main(void) {
    FILE *category = fopen("myFile", "r");
    if (category == NULL) {
        puts("category file not found");
    }
    else {
        print(category);
        fclose(category);
        return 0;
    }
}
I suspect the problem lies elsewhere.
Where is the output from this program going? If you're sending it to the console, its scrollback buffer may not be large enough to hold the whole file, so it might just look as if fgetc() is starting in an odd place.
Try redirecting the output of the program to a new text file and compare its size with the size of the input file, e.g.:
./category_print >output.txt

How to skip data when working with files in C

I have an issue with an assignment regarding files.
Here is the assignment:
I am asked to write a program that adds, to each line in a text file, the number of that line. For example, if the original file was:
Hi my name is Oria
I study programming
I love dogs
I use stackoverflow
It will be changed to:
1 Hi my name is Oria
2 I study programming
3 I love dogs
4 I use stackoverflow
But I don't know how to skip a line. After I've written the first number, how do I advance the FILE pointer to the first character of the next line?
This can be done by writing the result into another file.
Read each line of the input file using fgets and keep a loop count, then write the count and the line to the output file.
#include <stdio.h>
int main()
{
    FILE *src, *dest;
    char buf[64];
    int i = 0;
    src = fopen("in.txt", "r");
    dest = fopen("out.txt", "w");
    while (fgets(buf, 64, src) != NULL) {
        i++;
        fprintf(dest, "%d %s", i, buf);
    }
    fclose(src);
    fclose(dest);
    return 0;
}
Use getline(3) to read lines in a loop. Within the loop, you can skip lines at will.
while (1) {
    ....
    getline();
    if (...)
        continue;
}
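For the line-numbering task itself, a minimal sketch using getline (a POSIX function, so it needs _POSIX_C_SOURCE on some systems; the in.txt / out.txt names are reused from the fgets answer above) might look like this:

#define _POSIX_C_SOURCE 200809L
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    FILE *src = fopen("in.txt", "r");
    FILE *dest = fopen("out.txt", "w");
    if (src == NULL || dest == NULL)
        return 1;

    char *line = NULL;   // getline allocates and grows this buffer as needed
    size_t cap = 0;
    int n = 0;
    while (getline(&line, &cap, src) != -1) {
        n++;
        fprintf(dest, "%d %s", n, line);
    }

    free(line);
    fclose(src);
    fclose(dest);
    return 0;
}

Unlike the fixed 64-byte buffer in the fgets version, getline handles lines of any length.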
