Read a number and then an array from a file in C

How can I read a number and then an array from a file? I mean, my file looks like this:
3
7
8
9
3 is the number of components; 7, 8, 9 are the components of the array: arr[1], arr[2], arr[3].

One way to implement this is as follows.
First, open the file for reading:
FILE *fp = fopen( "filename.txt", "r" );
Then check that the call to fopen() was successful and handle any error:
if( ! fp )
{
perror( "fopen to read filename.txt failed" );
exit( EXIT_FAILURE );
}
Note: perror() writes both your error message and the system's textual reason for the error to stderr, which is where error messages should go.
Reserve a variable to hold the count of the values that follow:
int maxLoops;
Then read the first number and use it as the maximum iteration count of a loop, checking for errors, of course:
if( fscanf( fp, "%d", &maxLoops ) != 1 )
{
fprintf( stderr, "fscanf to read loop count failed\n" );
exit( EXIT_FAILURE );
}
Note: the scanf() family of functions does not set errno when an input format specifier (in this case %d) fails to match, so you need to output an error message yourself using something like fprintf().
Note: the scanf() family of functions returns the number of successful input format conversions (or EOF)
Note: exit() and EXIT_FAILURE are exposed via:
#include <stdlib.h>
Then, reserve an array for the following entries in the file, using the variable length array (VLA) feature of C:
int dataArray[ maxLoops ];
Now, set up the loop that will read the rest of the data
for( int i = 0; i < maxLoops; i++ )
{
For each pass through the loop, read another entry into the array, again checking for errors:
if( fscanf( fp, "%d", &dataArray[i] ) != 1 )
{
fprintf( stderr, "fscanf for data value failed\n" );
exit( EXIT_FAILURE );
}
} // end the loop
Then, clean up before doing anything else:
fclose( fp );
What you do with the data is up to you. You might want to print out each of the data values with a loop, similar to:
for( int i = 0; i < maxLoops; i++ )
{
printf( "entry %d = %d\n", i, dataArray[i] );
}
Note: when calling printf() there is no need to take the address of a variable (unless the address is what you want to print). However, when reading into a variable, as with fscanf(), you do need the address of the variable.
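Putting the pieces above together, a minimal complete sketch might look like this (assuming the input file is named filename.txt):
#include <stdio.h>
#include <stdlib.h>

int main( void )
{
    FILE *fp = fopen( "filename.txt", "r" );
    if( ! fp )
    {
        perror( "fopen to read filename.txt failed" );
        exit( EXIT_FAILURE );
    }

    int maxLoops;
    if( fscanf( fp, "%d", &maxLoops ) != 1 )
    {
        fprintf( stderr, "fscanf to read loop count failed\n" );
        exit( EXIT_FAILURE );
    }

    // a real program would also verify that maxLoops is positive
    // before declaring a variable length array of that size
    int dataArray[ maxLoops ];

    for( int i = 0; i < maxLoops; i++ )
    {
        if( fscanf( fp, "%d", &dataArray[i] ) != 1 )
        {
            fprintf( stderr, "fscanf for data value failed\n" );
            exit( EXIT_FAILURE );
        }
    }
    fclose( fp );

    for( int i = 0; i < maxLoops; i++ )
    {
        printf( "entry %d = %d\n", i, dataArray[i] );
    }
    return 0;
}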

Related

File Not Opening in C?

I tried to use read/write/append modes with both fprintf and fscanf, and they're not working. The location of the folder where I've saved my file is "C:\coding projects\test\Assignment Template" and the name of the file is "Assignment Template.txt".
#include <stdio.h>
#include <stdlib.h>
int main()
{
char str[500];
FILE *ptr;
ptr=fopen("C:\coding projects\test\Assignment Template\Assignment template.txt","r");
fscanf(ptr,"%s",str);
return 0;
}
PS: I also tried to use the location of the folder only, without the file name, and also tried the name of the file only, without the folder location, but none of it seems to work.
As pointed out by others, the single backslashes in the pathname to your file are wrong: in a C string literal a lone '\' begins an escape sequence. You need to replace each "\" with "\\", OR with "/". Either solution would work.
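For example, either of these string literals would work in the fopen() call:
fopen( "C:\\coding projects\\test\\Assignment Template\\Assignment template.txt", "r" );   /* backslashes escaped */
fopen( "C:/coding projects/test/Assignment Template/Assignment template.txt", "r" );       /* forward slashes, which the Windows C runtime also accepts */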
Second, "not working" is not a helpful diagnostic, and following it with "nothing happened" does not supply any extra information.
Here is an example showing how to write this functionality so that you can at least understand where a problem might be occurring.
#include <stdio.h>
#include <stdlib.h>

int main( void )
{
// Separate the filepath so it can be used in error message if necessary.
char *fname = "C:/coding projects/test/Assignment Template/Assignment template.txt";
char str[ 500 + 1 ]; // One extra for trailing '\0'
// temporary debugging report to confirm pathname is as expected.
printf( "Attempt open of '%s'\n", fname );
FILE *fp = fopen( fname, "r" );
// ALWAYS test return values for possible errors
if( fp == NULL ) {
// Able to report what went wrong
fprintf( stderr, "Failed to open %s\n", fname );
exit( EXIT_FAILURE );
}
// fscanf returns how many variables were 'satisfied'
// use that information
// Also, set a limit that won't overflow the buffer being filled
int num = fscanf( fp, "%500s", str );
// 'temporary' diagnostic "debugging" report to the console
printf( "Loaded %d items\n", num );
// clean up
fclose( fp );
return 0;
}
This is not "debugging with print statements"... To move forward developing code, one adds-or-modifies only a VERY few lines of code, then TESTS the consequences of those changes before adding/modifying a few more lines. "Incremental development". 'Testing' involves having clear expectations of what should happen and "seeing" if those expectations have been met. Had you printed the string that is the pathname of the file you want to open, you would have seen a problem before writing one more line of code. "Slowly and methodically, ALWAYS testing/checking."

How to use the "write" system call using C language in Linux?

My code is working fine. The only error I'm getting is that after the program writes the text into the file i.e. text1.txt, the text file prints some weird symbols like /00 when I actually open it.
int fd;
fd = open("text1.txt", O_RDWR);
char text[] = "This is my file.";
write(fd,text,sizeof(text));
You need to ensure that open succeeded instead of blindly writing to the file-descriptor.
Always check the return value of a syscall (and most C standard library functions) and check errno if the return value indicated an error.
Your string literal will include a hidden '\0' (NUL terminator) character after the dot.
Writing text directly to the file will therefore include the trailing '\0', which is what you're seeing.
These issues can be rectified by:
Always checking the return value of a syscall - and in this case: print a helpful error message to stdout and perform any necessary cleanup (the goto closeFile; statement).
Because C doesn't have native try/catch or RAII, it is difficult to write terse error-handling and cleanup code, but using goto for common clean-up code is generally acceptable in C, hence the goto closeFile statement.
Using strlen to get the actual length of the string.
Though in a pinch it's okay to use sizeof(text) - 1, provided you're in a scope where the compiler still knows the length of text; sizeof() won't work once you cross a function boundary, because the array decays to a pointer (see the short illustration after the example below).
Like so:
#include <stdio.h>
#include <string.h>
#include <errno.h>
#include <fcntl.h>
#include <unistd.h>

// `text` is taken as a parameter here; the original snippet used a variable
// declared elsewhere in the question's code.
void writeToFile( const char *text ) {
    // Use `O_WRONLY` instead of `O_RDWR` if you're only writing to the file.
    // Use `O_CREAT` to create the file if it doesn't already exist; O_CREAT
    // also requires a third argument giving the permissions of a newly created file.
    int fd = open( "text1.txt", O_CREAT | O_WRONLY, 0644 );
    if( fd == -1 ) {
        printf( "Error opening file: errno: %d - %s\n", errno, strerror( errno ) );
        return;
    }
    size_t textLength = strlen( text );
    ssize_t written = write( fd, text, textLength );
    if( written == -1 ) {
        printf( "Error writing text: errno: %d - %s\n", errno, strerror( errno ) );
        goto closeFile;
    }
    else if( (size_t)written < textLength ) {
        printf( "Warning: Only %zd of %zu bytes were written.\n", written, textLength );
        goto closeFile;
    }
    else {
        // Carry on as normal.
    }
closeFile:
    if( close( fd ) == -1 ) {
        printf( "Error closing file: errno: %d - %s\n", errno, strerror( errno ) );
    }
}
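As a small aside on the strlen versus sizeof(text) - 1 point above, here is a minimal, self-contained illustration of array-to-pointer decay; the helper name takesPointer is made up for this demo:
#include <stdio.h>
#include <string.h>

static void takesPointer( const char text[] )
{
    // Inside this function `text` has decayed to `const char *`,
    // so sizeof(text) is the size of a pointer, not of the original array.
    printf( "sizeof inside the callee: %zu\n", sizeof(text) );
}

int main( void )
{
    char text[] = "This is my file.";
    printf( "sizeof in the caller: %zu (16 characters plus the trailing '\\0')\n", sizeof(text) );
    printf( "strlen: %zu\n", strlen( text ) );
    takesPointer( text );
    return 0;
}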

how to print first 10 lines of a text file using Unix system calls?

I want to write my own version of the head Unix command, but my program is not working.
I am trying to print the first 10 lines of a text file, but instead the program prints all the lines. I specify the file name and number of lines to print via command-line arguments. I am only required to use Unix system calls such as read(), open() and close().
Here is the code:
#include "stdlib.h"
#include "stdio.h"
#include <fcntl.h>
#include <stdlib.h>
#include <unistd.h>
#define BUFFERSZ 256
#define LINES 10
void fileError( char*, char* );
int main( int ac, char* args[] )
{
char buffer[BUFFERSZ];
int linesToRead = LINES;
int in_fd, rd_chars;
// check for invalid argument count
if ( ac < 2 || ac > 3 )
{
printf( "usage: head FILE [n]\n" );
exit(1);
}
// check for n
if ( ac == 3 )
linesToRead = atoi( args[2] );
// attempt to open the file
if ( ( in_fd = open( args[1], O_RDONLY ) ) == -1 )
fileError( "Cannot open ", args[1] );
int lineCount = 0;
//count no. of lines inside file
while (read( in_fd, buffer, 1 ) == 1)
{
if ( *buffer == '\n' )
{
lineCount++;
}
}
lineCount = lineCount+1;
printf("Linecount: %i\n", lineCount);
int Starting = 0, xline = 0;
// xline = totallines - requiredlines
xline = lineCount - linesToRead;
printf("xline: %i \n\n",xline);
if ( xline < 0 )
xline = 0;
// count for no. of line to print
int printStop = lineCount - xline;
printf("printstop: %i \n\n",printStop);
if ( ( in_fd = open( args[1], O_RDONLY ) ) == -1 )
fileError( "Cannot open ", args[1] );
//read and print till required number
while (Starting != printStop) {
read( in_fd, buffer, BUFFERSZ );
Starting++; //increment starting
}
//read( in_fd, buffer, BUFFERSZ );
printf("%s \n", buffer);
if ( close( in_fd ) == -1 )
fileError( "Error closing files", "" );
return 0;
}
void fileError( char* s1, char* s2 )
{
fprintf( stderr, "Error: %s ", s1 );
perror( s2 );
exit( 1 );
}
What am I doing wrong?
It's very odd that you open the file and scan it to count the total number of lines before going on to echo the first lines. There is absolutely no need to know in advance how many lines there are altogether before you start echoing lines, and it does nothing useful for you. If you're going to do it anyway, however, then you ought to close() the file before you re-open it. For your simple program, this is a matter of good form, not of correct function; the misbehavior you observe is unrelated to that.
There are several problems in the key portion of your program:
//read and print till required number
while (Starting != printStop) {
read( in_fd, buffer, BUFFERSZ );
Starting++; //increment starting
}
//read( in_fd, buffer, BUFFERSZ );
printf("%s \n", buffer);
You do not check the return value of your read() call in this section. You must check it, because it tells you not only whether there was an error / end-of-file, but also how many bytes were actually read. You are not guaranteed to fill the buffer on any call, and only in this way can you know which elements of the buffer afterward contain valid data. (Pre-counting lines does nothing for you in this regard.)
You are performing raw read()s, and apparently assuming that each one will read exactly one line. That assumption is invalid. read() does not give any special treatment to line terminators, so you are likely to have reads that span multiple lines, and reads that read only partial lines (and maybe both in the same read). You therefore cannot count lines by counting read() calls. Instead, you must scan the valid characters in the read buffer and count the newlines among them.
You do not actually print anything inside your read loop. Instead, you wait until you've done all your reading, then print whatever is in the buffer after the last read. That's not going to serve your purpose when you don't get all the lines you need in the first read, because each subsequent successful read will clobber the data from the preceding one.
You pass the buffer to printf() as if it were a null-terminated string, but you do nothing to ensure that it is, in fact, terminated. read() does not do that for you.
I have trouble believing your claim that your program always prints all the lines of the designated file, but I can believe that it prints all the lines of the specific file you're testing it on. It might do that if the file is short enough that the whole thing fits into your buffer. Your program then might read the whole thing into the buffer on the first read() call (though it is not guaranteed to do so), and then read nothing on each subsequent call, returning 0 (end of file) and leaving the buffer unchanged. When you finally print the buffer, it still contains the whole contents of the file.
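For illustration, here is a rough sketch of a read loop that addresses those points: it checks the return value of read(), counts newlines inside each buffer rather than per call, and writes the output as it goes. The function name echoLines and its signature are mine, not part of the original assignment.
#include <unistd.h>

/* Echo the first `linesToRead` lines of the file already open on `in_fd`. */
static void echoLines( int in_fd, int linesToRead )
{
    char buffer[256];
    ssize_t rd_chars;
    int lines = 0;

    while ( lines < linesToRead
            && ( rd_chars = read( in_fd, buffer, sizeof buffer ) ) > 0 )
    {
        ssize_t i = 0;
        // scan only the bytes actually read, stopping once enough newlines are seen
        while ( i < rd_chars && lines < linesToRead )
        {
            if ( buffer[i++] == '\n' )
                lines++;
        }
        // i now marks the end of the portion we are allowed to echo
        if ( write( STDOUT_FILENO, buffer, (size_t)i ) == -1 )
            return;   // a real program would report the error
    }
}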

No segmentation fault for accessing out of bound memory

I'm not a good English speaker.
In my program I want to copy text that exists in a txt file into an array.
typedef struct Chaine
{
char * Lachaine;
int Taille_C;
} Chaine ;
int main (void)
{
Chaine *Tab_Texte=NULL;
Tab_Texte=(Chaine*)malloc(sizeof(Chaine));
FILE* Texte= NULL;
Texte = fopen("chaines", "r");
fseek(Texte, 0, SEEK_END);
Tab_Texte->Taille_C=ftell(Texte);
fseek(Texte, 0, SEEK_SET);
Tab_Texte->Lachaine=NULL;
Tab_Texte->Lachaine=(char*)malloc(sizeof(char)*Tab_Texte->Taille_C);
fread(Tab_Texte->Lachaine,sizeof(char)*(Tab_Texte->Taille_C),1,Texte);
printf("%s",Tab_Texte->Lachaine);
return 0;
}
Here everything works great. But when I change
Tab_Texte->Lachaine=(char*)malloc(sizeof(char)*Tab_Texte->Taille_C);
to ( for example )
Tab_Texte->Lachaine=(char*)malloc(sizeof(char)*Tab_Texte->Taille_C - 10);
it still works. It is supposed to give me a segmentation fault, because sizeof(char)*Tab_Texte->Taille_C - 10 is smaller than sizeof(char)*Tab_Texte->Taille_C, and therefore smaller than the text in the file.
Can you tell me why it always works?
What you're experiencing is called undefined behavior.
Accessing memory past the end of an allocation
Using a char array that is not null-terminated as a string
Supplying an invalid file pointer
Any of these results in undefined behavior; a segmentation fault is one possible side effect, but it is not guaranteed.
Please
Check for the success of fopen() before using the returned pointer
Null-terminate a char array before using it as a string
free() the allocated memory when you are done with it
Do not cast the return value of malloc()/calloc()
The reason it "always works" is that what you describe is undefined behavior, so it is not really defined what should happen.
In some situations it may lead to a segmentation fault, but not always. So your "always" is really "sometimes"; it just turns out the conditions haven't been right for the segmentation fault to happen.
Consider the following fixes to your code:
You should not do this
printf("%s",Tab_Texte->Lachaine);
since your Tab_Texte->Lachaine is not null terminated
You can try to do it like this
fwrite(Tab_Texte->Lachaine, 1, Tab_Texte->Taille_C, stdout);
More generally, your code never checks the return value of any function that can fail.
For example
Texte = fopen("chaines", "r");
if (Texte == NULL)
weAreInTroubleIfWeCall_fread_OnTexte_SoAbort();
The same applies to malloc. Also, you don't need to cast the return value of malloc.
You should free the result of malloc when you no longer need it.
You will get a segfault if you read or write to memory that is not allocated for your program. If you misuse malloc, you may not get a segfault, depending on the way the underlying operating system loads your program into memory. In that case, you may be writing or reading from your own memory, but in different locations, potentially overwriting other variables.
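To see that point in action, here is a deliberately broken toy program (not from the question): it overruns a small heap allocation, yet on most systems it appears to run fine. Tools such as valgrind or AddressSanitizer will still flag it.
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main( void )
{
    char *p = malloc( 6 );             // room for "hello" plus the '\0'
    if ( p == NULL )
        return EXIT_FAILURE;
    strcpy( p, "hello, world" );       // writes 13 bytes into a 6-byte block: undefined behavior
    printf( "%s\n", p );               // will often still print "hello, world"
    free( p );                         // may work, may complain, may even crash here
    return 0;
}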
// the reason a seg fault did not occur is that the code still reads
// Tab_Texte->Taille_C bytes (the full size of the file) into a buffer that
// is only 10 bytes smaller; a small overrun of a heap allocation usually
// goes unnoticed (and the malloc size only goes badly wrong if the file has fewer than 10 bytes)
//eliminate a lot of clutter in your code via proper definition of a struct rather than a typedef
//do not cast the returned value from malloc (and family)
//check the returned value from malloc to assure successful operation
//check the returned value from fopen to assure successful operation
//check the returned value from fseek to assure successful operation
//check the returned value from ftell to assure successful operation
//cleanup when exiting program, including free for malloc'd areas, closing files, etc
//check the returned value from fread to assure successful operation
#include <stdio.h> // fopen(), fclose(), fread(), fseek(), ftell()
#include <stdlib.h> // exit(), EXIT_FAILURE, free(), malloc()
struct Chaine
{
char * Lachaine;
int Taille_C;
};
int main (void)
{
struct Chaine *Tab_Texte=NULL;
if( NULL == (Tab_Texte=malloc(sizeof(struct Chaine)) ) )
{ // then, malloc failed
perror("malloc failed");
exit( EXIT_FAILURE );
}
// implied else, malloc successful
FILE* Texte= NULL;
if(NULL == (Texte = fopen("chaines", "r")) )
{ // then fopen failed
perror( "fopen failed for chaines for read");
free(Tab_Texte);
exit( EXIT_FAILURE );
}
// implied else, fopen successful
if( 0 != fseek(Texte, 0, SEEK_END) )
{ // then fseek failed
perror( "fseek for end of file failed" );
fclose(Texte);
free(Tab_Texte);
exit( EXIT_FAILURE );
}
// implied else, fseek successful
if( -1L == (Tab_Texte->Taille_C=ftell(Texte) ) )
{ // then ftell failed
perror("ftell failed" );
fclose(Texte);
free(Tab_Texte);
exit( EXIT_FAILURE );
}
// implied else, ftell successful
if( 0 != fseek(Texte, 0, SEEK_SET) )
{ // then fseek failed
perror( "fseek for start of file failed" );
fclose(Texte);
free(Tab_Texte);
exit( EXIT_FAILURE );
}
// implied else, fseek successful
Tab_Texte->Lachaine=NULL;
if( NULL == (Tab_Texte->Lachaine=malloc(Tab_Texte->Taille_C + 1) ) ) // +1 for a terminating '\0'
{ // then, malloc failed
perror( "malloc failed for file size" );
fclose(Texte);
free(Tab_Texte);
exit( EXIT_FAILURE );
}
// implied else, malloc successful
if( 1 != fread(Tab_Texte->Lachaine, Tab_Texte->Taille_C, 1, Texte) ) // read Taille_C bytes, not sizeof(int)
{ // fread failed
perror( "fread for whole file failed" );
fclose(Texte);
free(Tab_Texte->Lachaine);
free(Tab_Texte);
exit( EXIT_FAILURE );
}
// implied else, fread successful
Tab_Texte->Lachaine[Tab_Texte->Taille_C] = '\0'; // terminate the buffer so "%s" is safe
printf("%s",Tab_Texte->Lachaine);
// cleanup
fclose(Texte);
free(Tab_Texte->Lachaine);
free(Tab_Texte);
return 0;
} // end function: main

Error checking and the added length thereof - is there an analog to interrupts from embedded system programming?

Of course it is necessary to check whether certain operations occurred as expected: calls to malloc, fopen, fgetc
However, sometimes adding these checks makes the code far too long, especially for very simple functions. For example, I have a function where I must open a file, read in a few parameters, and allocate memory corresponding to what was just read in.
Therefore, the code ends up looking something like:
Open file
Check if file opened
Read parameter
Check that EOF was not hit (if it was, the file format is incorrect)
Allocate memory
Check if memory allocation occurred as expected
Etc.
There appears to be quite a bit of redundancy here. At least for my simple program, if any of the above checks fails, I simply report the error and return control to the operating system. The code ends up looking something like this:
if(filePointer == NULL){
perror("Error X occured");
exit(EXIT_FAILURE);
}
So, a simple few-line function turns into perhaps 20 or more lines because of this error checking. Is there some way to aggregate the handling of these errors?
Just wondering if there was something that I missed.
EDIT: For example, is there a way to interrupt the flow of program when certain events occur? I.e. if EOF is read prematurely, then jump to some function that informs the user (something like an interrupt in embedded systems).
This is a question that every C programmer asks at some point in his/her career. You are correct that some portions of your code will have more lines of error handling code than actual useful productive code. One technique I've used in the past to streamline error handling is to implement an error function, like this
static FILE *fpin = NULL;
static FILE *fpout = NULL;
typedef unsigned char BYTE; // BYTE is not a standard type, so define it here
static BYTE *buffer = NULL;
static void error( char *msg, char *name )
{
if ( msg != NULL )
{
if ( name != NULL )
fprintf( stderr, "%s: %s\n", msg, name );
else
fprintf( stderr, "%s\n", msg );
}
if ( fpin != NULL )
fclose( fpin );
if ( fpout != NULL )
fclose( fpout );
if ( buffer != NULL )
free( buffer );
exit( 1 );
}
which then gets used like this
int main( int argc, char *argv[] )
{
if ( argc != 3 )
error( "Usage: ChangeBmp infile outfile" );
if ( (fpin = fopen( argv[1], "rb" )) == NULL )
error( "Unable to open input file", argv[1] );
if ( (fpout = fopen( argv[2], "wb" )) == NULL )
error( "Unable to open output file", argv[2] );
size = sizeof( bmphead );
if ( fread( &bmphead, 1, size, fpin ) != size )
error( "Unable to read header", NULL );
size = sizeof( bmpinfo );
if ( fread( &bmpinfo, 1, size, fpin ) != size )
error( "Unable to read info", NULL );
Of course, this only works if the error function has access to all of the necessary variables. For simple, single file programs, I just make the necessary variables global. In a larger project, you might have to manage the variables more carefully.
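If globals are undesirable, one option (a sketch of my own, not taken from the answer above) is to gather the resources into a small context struct and hand it to the error helper:
#include <stdio.h>
#include <stdlib.h>

struct cleanup_ctx
{
    FILE *fpin;
    FILE *fpout;
    void *buffer;
};

static void error_ctx( struct cleanup_ctx *ctx, const char *msg, const char *name )
{
    if ( msg != NULL )
    {
        if ( name != NULL )
            fprintf( stderr, "%s: %s\n", msg, name );
        else
            fprintf( stderr, "%s\n", msg );
    }
    if ( ctx->fpin != NULL )
        fclose( ctx->fpin );
    if ( ctx->fpout != NULL )
        fclose( ctx->fpout );
    free( ctx->buffer );   // free(NULL) is a no-op
    exit( 1 );
}

int main( int argc, char *argv[] )
{
    struct cleanup_ctx ctx = { NULL, NULL, NULL };
    if ( argc != 3 )
        error_ctx( &ctx, "Usage: ChangeBmp infile outfile", NULL );
    if ( (ctx.fpin = fopen( argv[1], "rb" )) == NULL )
        error_ctx( &ctx, "Unable to open input file", argv[1] );
    /* ... the rest of the processing ... */
    fclose( ctx.fpin );
    return 0;
}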
One common way to address this, at least to reduce apparent code size, is wrapping the various checks with macros: e.g.,
#define CHECK_NULL(expr) do { \
    if ((expr) == NULL) { \
        perror("Error X"); \
        exit(EXIT_FAILURE); \
    } \
} while (0)

CHECK_NULL(p = malloc(size));
CHECK_NULL(filePointer = fopen("foo.txt", "r"));
The do { ... } while (0) wrapper keeps the macro safe inside if/else chains and lets each use end with an ordinary semicolon.
As for interrupting control flow, other languages typically use exceptions. Something similar can be approximated in C with setjmp() and longjmp(), but the cleanup of open files and allocations still has to be done by hand, and it isn't usually how error handling is done in practice in C.
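For completeness, here is a minimal sketch of that setjmp()/longjmp() approach; the jump buffer, function and file names are made up for the example:
#include <setjmp.h>
#include <stdio.h>
#include <stdlib.h>

static jmp_buf errorJump;   // the "catch" point lives in main()

static void readParameters( FILE *fp )
{
    int n;
    if ( fscanf( fp, "%d", &n ) != 1 )
        longjmp( errorJump, 1 );   // "throw": unwind straight back to setjmp()
    /* ... read further parameters, allocate memory, etc. ... */
}

int main( void )
{
    FILE *fp = fopen( "params.txt", "r" );
    if ( fp == NULL )
    {
        perror( "fopen failed" );
        return EXIT_FAILURE;
    }
    if ( setjmp( errorJump ) != 0 )
    {
        // "catch": every longjmp( errorJump, ... ) lands here
        fprintf( stderr, "Input file has the wrong format\n" );
        fclose( fp );
        return EXIT_FAILURE;
    }
    readParameters( fp );
    fclose( fp );
    return EXIT_SUCCESS;
}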
