I am trying to read a file in C, but when I read it and write it to stdout, it also prints a # character that is not in my file. What is the reason?
#include <stdio.h>

int main() {
    FILE *fp;
    int br;
    char buffer[10];
    int i;

    fp = fopen("a.txt", "r");
    while (1) {
        br = fread(buffer, 1, 10, fp);
        printf("%s", buffer);
        if (br == 0)
            break;
    }
}
Output:
1234567891#2345678912#3456789
12#3456789
12#
The file:
123456789123456789123456789
Your fread call correctly reads up to 10 bytes, but printf with %s requires the string to be null-terminated. You can fix this by increasing the buffer to 11 bytes and, after every call to fread, writing a zero at the end of the data, i.e. buffer[br] = 0;.
The other way is to tell printf the size of your data by calling printf("%.*s", br, buffer);. Then you don't need to modify your buffer array at all.
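A minimal sketch of the original loop with the first fix applied (a.txt and the 10-byte reads come from the question); the %.*s alternative is shown in a comment:

#include <stdio.h>

int main(void) {
    FILE *fp = fopen("a.txt", "r");
    if (fp == NULL)
        return 1;

    char buffer[11];              /* 10 data bytes + 1 for the terminator */
    size_t br;

    while ((br = fread(buffer, 1, 10, fp)) > 0) {
        buffer[br] = 0;           /* terminate exactly after the bytes read */
        printf("%s", buffer);
        /* alternative, without touching the buffer:
           printf("%.*s", (int)br, buffer); */
    }

    fclose(fp);
    return 0;
}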
Dynamically allocate your buffer and have it be initialized to zeros like this:
char *buffer = calloc(1, 11);
<do your read loop>
free(buffer);
This way you get a zero byte at the end, which terminates the string when it is printed. When C prints a string with %s it expects it to be terminated by a null (zero) byte.
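A minimal sketch of the calloc variant under the same assumptions as the question's code; note that it still writes a terminator after each fread, because a short read would otherwise leave stale bytes from the previous, longer read visible:

#include <stdio.h>
#include <stdlib.h>

int main(void) {
    FILE *fp = fopen("a.txt", "r");
    if (fp == NULL)
        return 1;

    char *buffer = calloc(1, 11);   /* 11 zeroed bytes: 10 data bytes + '\0' */
    if (buffer == NULL) {
        fclose(fp);
        return 1;
    }

    size_t br;
    while ((br = fread(buffer, 1, 10, fp)) > 0) {
        buffer[br] = 0;             /* keep the string terminated after every read */
        printf("%s", buffer);
    }

    free(buffer);
    fclose(fp);
    return 0;
}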
Related
I have written a basic code which writes into a file a string in binary mode (using fwrite()). Also I can read the same string from the file (using fread()) in to the buffer and print it. It works but in the part where I read from the file, extra junk is also read into the buffer. My question is how to know the length of the bytes to be read, correctly?
The following is the code --
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <errno.h>

#define BUFSZ 81

char * get_string (char *, size_t);

int main (int argc, char * argv[])
{
    if (argc != 2)
    {
        fprintf (stderr, "Invalid Arguments!!\n");
        printf ("syntax: %s <filename>\n", argv[0]);
        exit (1);
    }

    FILE * fp;
    if ((fp = fopen (argv[1], "ab+")) == NULL)
    {
        fprintf (stderr, "Cannot open file <%s>\n", argv[1]);
        perror ("");
        exit (2);
    }

    char string[BUFSZ];
    char readString[BUFSZ];
    size_t BYTES, BYTES_READ;

    puts ("Enter a string: ");
    get_string (string, BUFSZ);
    // printf ("You have entered: %s\n", string);

    BYTES = fwrite (string, sizeof (char), strlen (string), fp);
    printf ("\nYou have written %zu bytes to file <%s>.\n", BYTES, argv[1]);

    printf ("\nContents of the file <%s>:\n", argv[1]);
    rewind (fp);
    BYTES_READ = fread (readString, sizeof (char), BUFSZ, fp);
    printf ("%s\n", readString);
    printf ("\nYou have read %zu bytes from file <%s>.\n", BYTES_READ, argv[1]);

    getchar ();
    fclose (fp);
    return 0;
}

char * get_string (char * str, size_t n)
{
    char * ret_val = fgets (str, n, stdin);
    char * find;

    if (ret_val)
    {
        find = strchr (str, '\n');
        if (find)
            * find = '\0';
        else
            while (getchar () != '\n')
                continue;
    }
    return ret_val;
}
in the part where I read from the file, extra junk is also read into the buffer.
No, it isn't. Since you're opening the file in append mode, it's possible that you're reading in extra data preceding the string you've written, but you are not reading anything past the end of what you wrote, because there isn't anything there to read. When the file is initially empty or absent, you can verify that by comparing the value of BYTES to the value of BYTES_READ.
What you are actually seeing is the effect of the read-back data not being null terminated. You did not write the terminator to the file, so you could not read it back. It might be reasonable to avoid writing the terminator, but in that case you must supply a new one when you read the data back in. For example,
readString[BYTES_READ] = '\0';
My question is how to know the length of the bytes to be read, correctly?
There are various possibilities. Among the prominent ones are:
- use fixed-length data
- write the string length to the file, ahead of the string data (a sketch of this follows below).
Alternatively, in your particular case, when the file starts empty and you write only one string in it, there is also the possibility of capturing and working with how many bytes were read instead of knowing in advance how many should be read.
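A minimal sketch of the second option (a length prefix written before the string data), assuming the same BUFSZ-sized buffers as the question's code; write_record and read_record are illustrative helper names, not anything from the question, and the raw size_t prefix assumes the file is read back on the same platform:

#include <stdio.h>
#include <string.h>

/* Hypothetical helpers illustrating length-prefixed records. */
static int write_record(FILE *fp, const char *s)
{
    size_t len = strlen(s);
    if (fwrite(&len, sizeof len, 1, fp) != 1)   /* length first ... */
        return -1;
    if (fwrite(s, 1, len, fp) != len)           /* ... then the bytes */
        return -1;
    return 0;
}

static int read_record(FILE *fp, char *buf, size_t bufsz)
{
    size_t len;
    if (fread(&len, sizeof len, 1, fp) != 1)
        return -1;
    if (len >= bufsz)                           /* record too long for the buffer */
        return -1;
    if (fread(buf, 1, len, fp) != len)
        return -1;
    buf[len] = '\0';                            /* re-supply the terminator */
    return 0;
}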
First of all you get a string from the user, which will contain up to BUFSZ-1 characters (the get_string() function removes the trailing newline, or discards any characters beyond the BUFSZ limit).
For example, the user might have entered Hello\n, so that after the get_string() call the string array contains
-------------------
|H|e|l|l|o|'\0'|...
-------------------
Then you fwrite the string buffer to the output file, writing strlen (string) bytes. This doesn't include the string terminator '\0'.
In our example the contents of the output file are
--------------
|H|e|l|l|o|...
--------------
Finally you read back from the file. But since the readString array is not initialized, the file contents will be followed by whatever junk characters happen to be present in the uninitialized array.
For example, readString could have the following initial contents:
---------------------------------------------
|a|a|a|a|a|T|h|i|s| |i|s| |j|u|n|k|!|'\0'|...
---------------------------------------------
and after reading from the file
---------------------------------------------
|H|e|l|l|o|T|h|i|s| |i|s| |j|u|n|k|!|'\0'|...
---------------------------------------------
So the following string would be printed:
HelloThis is junk!
In order to avoid these issues, you have to make sure that a trailing terminator is present in the target buffer. So, just initialize the array in this way:
char readString[BUFSZ] = { 0 };
In this way at least a string terminator will be present in the target array.
Alternatively, memset it to 0 before every read:
memset (readString, 0, BUFSZ);
I want to print the contents of a .txt file to the command line like this:
#include <stdio.h>
#include <string.h>
#include <fcntl.h>
#include <unistd.h>

int main() {
    int fd;
    char buffer[1000];

    fd = open("testfile.txt", O_RDONLY);
    read(fd, buffer, strlen(buffer));
    printf("%s\n", buffer);
    close(fd);
}
The file testfile.txt looks like this:
line1
line2
line3
line4
The program prints only the first 4 letters: line.
When using sizeof instead of strlen the whole file is printed.
Why is strlen not working?
It is incorrect to use strlen at all in this program. Before the call to read, the buffer is uninitialized and applying strlen to it has undefined behavior. After the call to read, some number of bytes of the buffer are initialized, but the buffer is not necessarily a proper C string; strlen(buffer) may return a number having no relationship to the amount of data you should print out, or may still have UB (if read initialized the full length of the array with non-nul bytes, strlen will walk off the end). For the same reason, printf("%s\n", buffer) is wrong.
Your program also can't handle files larger than the buffer at all.
The right way to do this is by using the return value of read, and write, in a loop. To tell read how big the buffer is, you use sizeof. (Note: if you had allocated the buffer with malloc rather than as a local variable, then you could not use sizeof to get its size; you would have to remember the size yourself.)
#include <unistd.h>
#include <stdio.h>

int main(void)
{
    char buf[1024];
    ssize_t n;

    while ((n = read(0, buf, sizeof buf)) > 0)
        write(1, buf, n);
    if (n < 0) {
        perror("read");
        return 1;
    }
    return 0;
}
Exercise: cope with short writes and write errors.
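One way that exercise might be approached, as a sketch (write_all is an illustrative name, not a standard function): keep calling write until everything has been written or an error is reported:

#include <unistd.h>

/* Keep calling write until all `len` bytes are written or an error occurs.
   Returns 0 on success, -1 on error (errno is left set by write). */
static int write_all(int fd, const char *buf, size_t len)
{
    while (len > 0) {
        ssize_t n = write(fd, buf, len);
        if (n < 0)
            return -1;
        buf += n;            /* skip the part that was written */
        len -= (size_t)n;    /* and write the remainder next time around */
    }
    return 0;
}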
When using sizeof instead of strlen the whole file is printed. Why is
strlen not working?
strlen works by walking through the char array passed to it and counting characters until it encounters a 0. In your case, buffer is not initialized, so strlen reads elements of an uninitialized array while looking for that 0, and reading uninitialized memory like this gives you undefined behavior.
sizeof works differently: it returns the size in bytes of the object itself, without looking for a 0 inside the array the way strlen does.
As correctly noted in other answers, read will not null-terminate the string for you, so you have to do it manually or declare buffer as:
char buffer[1000] = {0};
In this case, printing the buffer with printf and %s after reading the file will work, provided read didn't fill the entire array with bytes none of which is 0.
Extra:
Null-terminating a string means you append a 0 to it somewhere. This is how most of the string-related functions determine where the string ends.
Why is strlen not working?
Because when you call it in read(fd, buffer, strlen(buffer));, you haven't yet assigned a valid string to buffer. It contains some indeterminate data which may or may not have a 0-valued element. Based on the behavior you report, buffer just so happens to have a 0 at element 4, but that's not reliable.
The third parameter tells read how many bytes to read from the file descriptor; if you want to read as many bytes as buffer is sized to hold, use sizeof buffer. read will return the number of bytes read from fd (0 for EOF, -1 for an error). Note also that read does not zero-terminate the input, so using strlen on buffer even after calling read would still be an error.
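Putting those points together, a minimal sketch of the question's program (testfile.txt and the 1000-byte buffer come from the question; it still performs a single read, so it only shows the first sizeof buffer - 1 bytes of a larger file):

#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>

int main(void) {
    char buffer[1000];
    int fd = open("testfile.txt", O_RDONLY);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    /* Read at most sizeof buffer - 1 bytes, leaving room for the terminator. */
    ssize_t n = read(fd, buffer, sizeof buffer - 1);
    if (n < 0) {
        perror("read");
        close(fd);
        return 1;
    }
    buffer[n] = '\0';        /* now printing with %s is safe */
    printf("%s\n", buffer);

    close(fd);
    return 0;
}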
I'm currently working on creating a binary file. Here is what I have tried.
Example 1:
#include <stdio.h>

int main() {
    /* Create the file */
    int a = 5;
    FILE *fp = fopen("file.bin", "wb");
    if (fp == NULL)
        return -1;
    fwrite(&a, sizeof(a), 1, fp);
    fclose(fp);
    return 0;
}
Example 2:
#include <stdio.h>
#include <string.h>

int main()
{
    FILE *fp;
    char str[256] = {'\0'};

    strcpy(str, "3aae71a74243fb7a2bb9b594c9ea3ab4");

    fp = fopen("file.bin", "wb");
    if (fp == NULL)
        return -1;
    fwrite(str, sizeof str, 1, fp);
    fclose(fp);
    return 0;
}
Example 1 gives the right output in binary form. But Example 2, where I'm passing a string, doesn't give me the right output. It writes the input string I have given into the file and then appends some extra data (in binary form).
I don't understand it, and I'm unable to figure out what mistake I'm making.
The problem is that sizeof str is 256, that is, the entire size of the locally declared character array. However, the data you are storing in it does not require all 256 characters. The result is that the write operation writes all the characters of the string plus whatever else happens to be in the rest of the array (in your Example 2, the zero bytes left over from the initializer). Try the following line as a fix:
fwrite(str, strlen(str), 1, fp);
C strings are null-terminated, meaning that anything after the '\0' character must be ignored. If you read the file written by Example 2 into a str[256] and print it out using printf("%s", str), you would get the original string back with no extra characters, because the null terminator would be read into the buffer as well, providing proper termination for the string.
The reason you see the extra "garbage" in the output is that fwrite does not interpret the str[] array as a C string; it treats it as a buffer of 256 bytes. A text editor does not treat the null character as a terminator either, so the trailing bytes of str that were written to the file show up when you open it.
If you want the string written to the file to end at the last valid character, use strlen(str) for the size in the call of fwrite.
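A minimal sketch of Example 2 with that fix, followed by a read-back that uses fread's return value to place the terminator (the file name file.bin and the 32-character string come from the question):

#include <stdio.h>
#include <string.h>

int main(void)
{
    char str[256] = {'\0'};
    strcpy(str, "3aae71a74243fb7a2bb9b594c9ea3ab4");

    /* Write only the bytes of the string, not the whole 256-byte array. */
    FILE *fp = fopen("file.bin", "wb");
    if (fp == NULL)
        return -1;
    fwrite(str, 1, strlen(str), fp);
    fclose(fp);

    /* Read it back: fread's return value tells us how many bytes arrived. */
    char readback[256];
    fp = fopen("file.bin", "rb");
    if (fp == NULL)
        return -1;
    size_t n = fread(readback, 1, sizeof readback - 1, fp);
    readback[n] = '\0';
    fclose(fp);

    printf("%s\n", readback);
    return 0;
}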
I'm having some issues reading back the contents of my array. I'm not sure I'm storing the data correctly, as my result for every line is '1304056712'.
#include <stdio.h>
#include <stdlib.h>

#define INPUT "Input1.dat"

int main(int argc, char **argv) {
    int data_index, char_index;
    int file_data[1000];
    FILE *file;
    int line[5];

    file = fopen(INPUT, "r");
    if (file) {
        data_index = 0;
        while (fgets(line, sizeof line, file) != NULL) {
            //printf("%s", line); ////// the line seems to be ok here
            file_data[data_index++] = line;
        }
        fclose(file);
    }

    int j;
    for (j = 0; j < data_index; j++) {
        printf("%i\n", file_data[j]); // when i display data here, i get '1304056712'
    }
    return 0;
}
I think you need to say something like
file_data[data_index++] = atoi(line);
From your results I assume the file is a plain-text file.
You cannot simply read a line from the file (a string, i.e. an array of characters) into an array of integers; this will not work. When using pointers to write data (as you do by passing line to fgets()), no conversion is done for you. Instead, you should read the line into an array of chars and then convert it to an integer using sscanf(), atoi(), or some other function of your choice.
fgets reads newline terminated strings. If you're reading binary data, you need fread. If you're reading text, you should declare line as an array of char big enough for the longest line in the file.
Because file_data is an array of int, file_data[data_index] is a single int. It is being assigned a pointer (the base address of the int line[5] buffer), which is why every element prints as the same address-like number. If you're reading binary data, file_data should be an array of integers filled with fread. If you're reading strings, it should be an array of strings, i.e. char pointers, like char *file_data[1000].
You also need to initialize data_index = 0 outside the if (file) ... block, because the output loop needs it to be set even if the file failed to open. And when looping and storing input, the loop should check that it hasn't reached the size of the array it's storing into.
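A minimal sketch that puts those points together (read each line into a char buffer, convert it with atoi, initialize data_index before the if, and bound the stores), keeping the INPUT and file_data names from the question and assuming one integer per line:

#include <stdio.h>
#include <stdlib.h>

#define INPUT "Input1.dat"

int main(void) {
    int file_data[1000];
    int data_index = 0;                 /* set even if the file fails to open */
    char line[64];                      /* a char buffer big enough for one line */
    FILE *file = fopen(INPUT, "r");

    if (file) {
        while (data_index < 1000 && fgets(line, sizeof line, file) != NULL) {
            file_data[data_index++] = atoi(line);   /* convert the text to an int */
        }
        fclose(file);
    }

    for (int j = 0; j < data_index; j++) {
        printf("%i\n", file_data[j]);
    }
    return 0;
}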
I'm writing a program that encrypts a file by adding 10 to each character. Somehow a portion of the program's working directory is being printed to the file, and I have no idea why.
#include <stdio.h>

int main(void){
    FILE *fp;
    fp = fopen("tester.csv", "r+");
    Encrypt(fp);
    fclose(fp);
}

int Encrypt(FILE *fp){
    int offset = 10;
    Shift(fp, offset);
}

int Decrypt(FILE *fp){
    int offset = -10;
    Shift(fp, offset);
}

int Shift(FILE *fp, int offset){
    char line[50], tmp[50], character;
    long position;
    int i;

    position = ftell(fp);
    while(fgets(line, 50, fp) != NULL){
        for(i = 0; i < 50; i++){
            character = line[i];
            character = (offset + character) % 256;
            tmp[i] = character;
            if(character == '\n' || character == 0){ break; }
        }
        fseek(fp, position, SEEK_SET);
        fputs(tmp, fp);
        position = ftell(fp);
        fseek(stdin, 0, SEEK_END);
    }
}
the file originally reads
this, is, a, test
i, hope, it, works!
after the program is run:
~rs}6*s}6*k6*~o}~
/alexio/D~6*y|u}+
k6*~o}~
/alexio/D
where users/alexio/Desktop is part of the path. How does this happen???
Because you "encode" the string, tmp won't be null-terminated (that's your case), or it may even contain a null before the end of the string (when (character + offset) % 256 == 0). Later you try to write it as a string, which runs past the end of your buffer and outputs part of your program arguments.
Use fread and fwrite.
The line
fputs(tmp,fp);
writes out a probably non-null terminated string. So it continues to copy memory to the file until it finds a null.
You need to add a null to the end of 'tmp' in the case where the loop breaks on a newline.
A number of things:
You're encoding all 50 chars from your read buffer, regardless of how many were actually read with fgets(). Recall that fgets() reads a line, not an entire buffer (unless the line is longer than the buffer, and yours is not). Anything past the length of the line read from the file is stack garbage.
You're then dumping all that extra garbage data, and beyond, by not terminating your tmp[] string before writing it with fputs(), which you should not be using anyway. Yet more stack garbage.
Solution: use fread() and fwrite() for this encoding. There is no reason to be using string functions whatsoever. When you write your decoder you'll thank yourself for using fread() and fwrite().
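A minimal sketch of that approach, assuming the same read-then-seek-back, in-place layout as the original Shift() and the file name tester.csv from the question; it works on raw bytes, so null bytes and newlines need no special handling:

#include <stdio.h>

/* Shift every byte of the file by `offset`, in place, using fread/fwrite. */
static int shift_file(FILE *fp, int offset)
{
    unsigned char buf[64];
    size_t n;
    long pos = ftell(fp);

    while ((n = fread(buf, 1, sizeof buf, fp)) > 0) {
        for (size_t i = 0; i < n; i++)
            buf[i] = (unsigned char)(buf[i] + offset);  /* wraps modulo 256 */
        fseek(fp, pos, SEEK_SET);      /* go back and overwrite what was just read */
        fwrite(buf, 1, n, fp);
        fseek(fp, 0, SEEK_CUR);        /* positioning call required between write and next read */
        pos = ftell(fp);
    }
    return 0;
}

int main(void)
{
    FILE *fp = fopen("tester.csv", "r+");
    if (fp == NULL)
        return 1;
    shift_file(fp, 10);                /* encrypt; shift_file(fp, -10) would decrypt */
    fclose(fp);
    return 0;
}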