Program stops reading file using fgets - C

It would be nice if anyone could help me with my program. I am trying to read a CSV file and move it into a 2D array. It stops on the 17th line (out of 200).
#include <stdio.h>
#include <stdlib.h>

int main ()
{
    FILE * pFile;
    double **tab;
    char bufor [100];
    int i = 0;
    tab = (double**)malloc(sizeof(double*));
    pFile = fopen ("sygnal1.csv" , "r");
    if (pFile == NULL) printf("Error");
    else
        while (fgets (bufor , 100 , pFile))
        {
            tab[i] = (double *) malloc(2 * sizeof(double));
            sscanf(bufor, "%lf, %lf,", &tab[i][0], &tab[i][1]);
            printf("%lf.%lf.\n", tab[i][0], tab[i][1]); // It's here only for testing
            i++;
        }
    printf("number of lines read %d\n", i);
    fclose (pFile);
    system("PAUSE");
    return 0;
}

You haven't completely allocated memory for tab yet. You've just allocated one (uninitialised) pointer. When i > 0 you're into Undefined Behaviour. You need to allocate at least as many elements as there might be lines in your file, e.g.
tab = malloc(sizeof(*tab) * MAX_LINES);
or use realloc after each iteration to increase the number of elements.
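The realloc route can be sketched as a small helper that doubles the row-pointer array on demand. This is only a sketch: the name grow_rows and the starting capacity of 16 are illustrative, not from the question.

```c
#include <stdlib.h>

/* Make sure the row-pointer array can hold index n; doubles the
 * capacity when full. Returns the (possibly moved) array, or NULL on
 * allocation failure, in which case the old array is still valid. */
static double **grow_rows(double **tab, size_t *cap, size_t n)
{
    if (n < *cap)
        return tab;                      /* already big enough */
    size_t newcap = *cap ? *cap * 2 : 16;
    while (newcap <= n)
        newcap *= 2;
    double **tmp = realloc(tab, newcap * sizeof *tmp);
    if (tmp == NULL)
        return NULL;                     /* caller still owns tab */
    *cap = newcap;
    return tmp;
}
```

In the question's loop you would call something like `double **tmp = grow_rows(tab, &cap, i); if (tmp == NULL) break; tab = tmp;` before writing `tab[i]`.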

tab = (double**)malloc(sizeof(double*));
You're only allocating 1 element in this array. All other accesses are writing over unallocated chunks of memory and probably causing damage.
Try reallocing periodically.

You allocated space for only one double * in tab. If you know the number of lines you want to store, then do:
tab = malloc(sizeof(*tab) * NB_LINES);
Also, don't cast the return of malloc.

Related

Need to read a file with multiple lines of integers into an array

I need to read a file of ints into an array in C. A sample of the file I need to read is below, though note the files this will process can have thousands or hundreds of thousands of lines.
127
234
97
8723
I've gotten the file open in C, read how many lines there are so I know how many spaces my array needs, but I can't seem to read/parse each line into the array.
FILE *file;
int N = 0;
char filePath[30];
char endFile;

printf("What file should be used?\n");
scanf("%s", filePath);

file = fopen(filePath, "r");
if (file == NULL) {
    printf("This file failed to open.\n");
    break;
}

for (endFile = getc(file); endFile != EOF; endFile = getc(file))
    if (endFile == '\n') {
        N = N + 1;
    }

int myArray[N];

while (fscanf(file, "%d\n", &a) != EOF) {
    fscanf(file, "%d\n", &a); // I'm not sure this line is needed...
    printf("%d\n", a);
    M[i] = a;
}
From here, I need to read the file contents into myArray, with each line being the corresponding spot in the array (i.e. line zero is myArray[0], line one is myArray[1], etc.). I can't seem to find a way to do this, though I see several methods to do tab-delimited 2d arrays or csv multi-dimensional arrays.
Please also let me know if creating the array/determining the array size can be done in a better way than literally counting new-line characters...
There's no need to first "count the number of lines".
The following code cautiously grows an array of integers (by increments of 10).
#define GROW 10
int *rec = NULL, nRec = 0, sz = 0;

while (fgets(buf, sizeof buf, ifp) != NULL) {
    if (nRec == sz) {
        rec = realloc(rec, (nRec + GROW) * sizeof *rec);
        /* omitting test for failure */
        sz += GROW;
    }
    rec[nRec++] = atoi(buf);
}
This shows what is possible.
Note that realloc() can fail, returning NULL. It's up to you to add a bit of code to handle that condition.
Further, a common convention is to double the size of the allocation when needed (because realloc() may not be 'cheap'). You can decide whether to grow the array in fixed increments (of 1024?) or exponentially.
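One way to add the missing failure test without leaking the records already read is to realloc into a temporary pointer first. A sketch of that idiom (append_int is a made-up name; GROW as above):

```c
#include <stdlib.h>

#define GROW 10

/* Append v to *rec, growing by GROW elements when full.
 * Returns 0 on success, -1 on allocation failure; on failure *rec is
 * left intact, so the caller can still use or free it. */
static int append_int(int **rec, int *nRec, int *sz, int v)
{
    if (*nRec == *sz) {
        int *tmp = realloc(*rec, (*sz + GROW) * sizeof *tmp);
        if (tmp == NULL)
            return -1;          /* *rec is still valid and freeable */
        *rec = tmp;
        *sz += GROW;
    }
    (*rec)[(*nRec)++] = v;
    return 0;
}
```

Assigning `rec = realloc(rec, ...)` directly, as in the snippet above, loses the only pointer to the old block if realloc returns NULL; the temporary avoids that leak.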

Segmentation fault 11 while trying to read a text file and qsort it

Receiving a segmentation fault 11 while trying to read in a .txt file and qsort it in C. This is a homework assignment for my CS class; the professor has given us a qsort function he has written, and we need to make it faster using POSIX threads. I'm showing the relevant code here for reading in a text file, then creating an array and using qsort on it. The program works for an arbitrary array of strings I made up, so I'm reasonably sure it's something to do with the way I am reading in the .txt file and processing it. poem.txt is in the same directory, and the error handler works if I change the file name. Any ideas?
int main() {
    double start, end;
    double total;
    char *array[100000];
    char buffer[MAX_LENGTH];
    int i = 0;
    FILE *fp;

    fp = fopen("poem.txt", "r");
    if (fp < 0) {
        fprintf(stderr, "Couldn't open file to read in. | error number %d : %s \n", errno, strerror(errno));
        exit(1);
    }

    // First "function" to read in a text file for sorting
    while (fscanf(fp, "%s", buffer) == 1) {
        array[i] = malloc(MAX_LENGTH);
        strcpy(array[i++], buffer);
    }

    // printf("%s\n", array[1]); /* print for troubleshooting */
    start = clock();
    sortThreaded(array, i);

    // freeing the memory used in the array from malloc
    for (int j = 0; array[j]; j++) {
        free(array[j]);
    }

    end = clock();
    total = (end - start) / CLOCKS_PER_SEC;
    printf(" total clocks: %f\n", total);
    fclose(fp);
    return 0;
}
First, as noted above, you should check
if (NULL == fp)
not
if (fp < 0)
Because fopen returns NULL if it can't open a file.
Second, it's unsafe to use a bare %s in fscanf. If an input line is longer than MAX_LENGTH, you can get a "*** stack smashing detected ***" crash.
You can use
char *fgets(char *s, int size, FILE *stream);
or give fscanf a field width of at most MAX_LENGTH - 1, e.g.
fscanf(fp, "%99s", buffer) /* if MAX_LENGTH is 100 */
Even though it shouldn't be the problem here.
Third, array is not initialized, so it contains garbage. In
for (int j = 0; array[j]; j++) {
    free(array[j]);
}
you may be freeing addresses that were never allocated. A related problem applies to
fclose(fp);
because the broken fp < 0 check means a failed fopen falls through to the read loop and to fclose(NULL).
A couple notes/possibilities I see:
When checking fp: if the fopen() fails, it will be set to NULL, not a negative number, so that could be a source of error.
if (fp == NULL) {
    fprintf(stderr, "Couldn't open file to read in. | error number %d : %s \n", errno, strerror(errno));
    exit(1);
}
When you do the fscanf() and the strcpy(), the buffer could overflow; it's best to use fgets() and strncpy() and specify MAX_LENGTH:
while (fgets(buffer, MAX_LENGTH, fp) != NULL) {
    array[i] = malloc(MAX_LENGTH);
    strncpy(array[i++], buffer, MAX_LENGTH);
}
The free() loop could use the same iterator i and decrement to be sure we are freeing everything we allocated, no more no less.
while (--i >= 0){
free(array[i]);
}
Segmentation faults generally happen when you try to access memory that you are not allowed to access or that doesn't exist; sometimes they are thrown by an MPU (memory protection unit).
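Putting both answers' fixes together (NULL check on fopen, bounded fgets, a free loop that mirrors the allocations), the reading part could be factored like this. This is only a sketch: read_lines is a made-up name and the MAX_LENGTH value of 1000 is illustrative, since the question does not show the macro's definition; it also sizes each copy with strlen so every string is NUL-terminated.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define MAX_LENGTH 1000   /* assumed; the question does not show its value */

/* Read up to max lines from fp into array[]; returns the count.
 * Each entry is malloc'd and NUL-terminated, so a decrementing free
 * loop over the returned count releases exactly what was allocated. */
static int read_lines(FILE *fp, char *array[], int max)
{
    char buffer[MAX_LENGTH];
    int i = 0;

    while (i < max && fgets(buffer, sizeof buffer, fp) != NULL) {
        array[i] = malloc(strlen(buffer) + 1);
        if (array[i] == NULL)
            break;                 /* stop on allocation failure */
        strcpy(array[i++], buffer);
    }
    return i;
}
```

The caller then frees with `while (--n >= 0) free(array[n]);`, matching the second answer's suggestion.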

C: How to input only int data from a text file of unknown size

void inputData()
{
    FILE* fp = fopen("input_001.txt", "r");
    if (fp == NULL)
    {
        exit(FILE_FAILED_TO_OPEN);
    }
    fseek(fp, 0, SEEK_END); // set position to end of file
    int size = ftell(fp);
    if (size == 0)
    {
        exit(PARSING_ERROR_EMPTY_FILE);
    }
    int i;
    char *input;
    char *errPtr = NULL;
    char *data;
    fgets(input, 64, fp);
    for (i = 0; i < size; i++)
    {
        strtol(input[i], *errPtr, 10);
        // testing the output
        printf("%d\n", input[i]);
    }
    fclose(fp);
    if (fclose(fp) != 0)
    {
        exit(FILE_FAILED_TO_CLOSE);
    }
}
I am trying to input data from a text file of unknown size into an array and keep only the integers. The format of the text file is one number per line and there can be any number of lines. I have included my attempt at trying to input the data, but my coding was not going so well. I want to use fgets() and then strtol() to convert the lines I get from fgets() into integers and put them in an array so I can work with that data. Any help is appreciated!
You haven't allocated any space for input to point to. I saw your earlier version had a malloc; you can use that, or just a char array.
Did you mean to use data? Because you're not using it yet.
fgets reads at most one line at a time, so you need to put your reads in a loop.
You appear to be converting the string to a number multiple times. For instance, if the first line were "12345", this code would get 12345, then 2345, 345, etc. This was presumably not your intention.
You're incrementing i up to size. size is the file size and might be quite large, but you only read a maximum of 64 characters into the buffer. (Or would have, if space had been allocated.)
In short, this code is very confused and I recommend starting over from scratch. Decide whether you want to read the entire file at once, or one line at a time; I recommend the latter, as it takes less memory and is simpler. If you want to store the numbers in an array, you can do that with malloc and then realloc as needed to grow the array dynamically.
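The line-at-a-time route pairs fgets() with strtol(), as the question intended. A sketch of the conversion step (parse_int_line is a made-up helper name); unlike atoi, strtol lets you detect lines that contain no number at all:

```c
#include <stdlib.h>

/* Returns 1 and stores the converted value in *out if the line starts
 * with a valid integer (leading whitespace allowed), 0 otherwise.
 * atoi cannot make this distinction: it returns 0 for both "0" and "abc". */
static int parse_int_line(const char *line, long *out)
{
    char *end;
    long v = strtol(line, &end, 10);

    if (end == line)
        return 0;            /* no digits were consumed */
    *out = v;
    return 1;
}
```

Inside the fgets loop you would call this on each line and append successful conversions to a realloc-grown array.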
Do not use strtol; use atoi instead, and use a loop to read the lines.
Just a quick answer:
int count = 0;
char input[64];

while (fgets(input, sizeof input, fp) != NULL)
{
    count++;
}
fseek(fp, 0, SEEK_SET);

int *arr = malloc(count * sizeof *arr); /* was "new int[count]", which is C++, not C */
count = 0;
while (fgets(input, sizeof input, fp) != NULL)
{
    int a = atoi(input);
    arr[count++] = a;
}
If you don't know how many lines are in the file, you have a few options:
Loop through the file, counting the number of lines (call it nlines). Then declare the array int numbers[nlines].
Use a linked list instead of an array to store the numbers.
Dynamically allocate blocks of an array.
As for reading the integer data itself, something like this would work if you decide to go with the first option:
void inputData(const char* fname, const unsigned int nlines)
{
    FILE* fp = fopen(fname, "r");
    if (fp == NULL) {
        exit(EXIT_FAILURE);
    }
    int numbers[nlines];
    double d = 0;
    unsigned int i = 0;
    /* loop on fscanf's return value rather than feof() */
    while (i < nlines && fscanf(fp, "%lf", &d) == 1) {
        numbers[i++] = (int)floor(d); /* needs <math.h> */
    }
    fclose(fp);
}

Freeing array of dynamic strings / lines in C

I am writing a program that is sorting the lines from the input text file. It does its job, however I get memory leaks using valgrind.
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

char* getline(FILE * infile)
{
    int size = 1024;
    char * line = (char*)malloc(size);
    int temp;
    int i = 0;
    do
    {
        (temp = fgetc(infile));
        if (temp != EOF)
            line[i++] = (char)temp;
        if (i >= size)
        {
            size *= 2;
            line = (char*)realloc(line, size);
        }
    } while (temp != '\n' && temp != EOF);
    if (temp == EOF)
        return NULL;
    return line;
}

void print_line(char * line)
{
    printf("%s", line);
}

int myCompare(const void * a, const void * b)
{
    const char *pa = *(const char**)a;
    const char *pb = *(const char**)b;
    return strcmp(pa, pb);
}

int main(int argc, char* argv[])
{
    FILE * infile;
    FILE * outfile;
    infile = fopen(argv[1], "r");
    if (infile == NULL)
    {
        printf("Error");
        exit(3);
    }
    outfile = fopen(argv[2], "w");
    if (outfile == NULL)
    {
        printf("Error");
        exit(3);
    }
    char * line;
    char **all_lines;
    int nlines = 0;
    while ((line = getline(infile)) != NULL)
    {
        print_line(line);
        nlines++;
    }
    all_lines = malloc(nlines * sizeof(char*));
    rewind(infile);
    int j = 0;
    printf("%d\n\n", nlines);
    while ((line = getline(infile)) != NULL)
    {
        all_lines[j] = line;
        j++;
    }
    qsort(all_lines, nlines, sizeof(char*), myCompare);
    for (int i = 0; i < nlines; i++)
    {
        print_line(all_lines[i]);
        fprintf(outfile, "%s", all_lines[i]);
    }
    for (int i = 0; i < nlines; i++)
    {
        free(all_lines[i]);
    }
    free(all_lines);
    fclose(infile);
    fclose(outfile);
    return 0;
}
Any ideas where they might come from? I loop over all_lines[] and free the content, then free all_lines itself.
UPDATE:
Ok, so I've done the updates you have suggested. However, now valgrind throws error for functions fprintf in my program. Here is what it says:
11 errors in context 2 of 2:
==3646== Conditional jump or move depends on uninitialised value(s)
==3646== at 0x40BA4B1: vfprintf (vfprintf.c:1601)
==3646== by 0x40C0F7F: printf (printf.c:35)
==3646== by 0x80487B8: print_line (sort_lines.c:44)
==3646== by 0x804887D: main (sort_lines.c:77)
==3646== Uninitialised value was created by a heap allocation
==3646== at 0x4024D12: realloc (vg_replace_malloc.c:476)
==3646== by 0x8048766: getline (sort_lines.c:30)
==3646== by 0x804889A: main (sort_lines.c:75)
I would like to know why it reports an error on simply fprintf-ing those lines to a text file. I've read that it has something to do with a gcc optimization turning printf into fputs, but I don't get the idea.
There are multiple problems in your code:
function getline:
the string in the line buffer is not properly '\0' terminated at the end of the do / while loop.
It does not free the line buffer upon end of file, hence memory leak.
It does not return a partial line at end of file if the file does not end with a '\n'.
neither malloc nor realloc return values are checked for memory allocation failure.
You should realloc the line to the actual size used to reduce the amount of memory consumed. Currently, you allocate at least 1024 bytes per line; sorting a dictionary of short words would require 100 times more memory than needed!
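A version of getline with those fixes applied might look like this. It is only a sketch keeping the question's interface; the function is renamed getline_fixed here to avoid clashing with POSIX getline, and the 1024-byte starting size is kept from the question.

```c
#include <stdio.h>
#include <stdlib.h>

/* Reads one line, including the '\n' if present. Returns a malloc'd,
 * '\0'-terminated string, or NULL at end of file (nothing leaked) or on
 * allocation failure. A final line without '\n' is still returned. */
char *getline_fixed(FILE *infile)
{
    size_t size = 1024;
    size_t i = 0;
    char *line = malloc(size);
    int temp;

    if (line == NULL)
        return NULL;
    while ((temp = fgetc(infile)) != EOF) {
        if (i + 2 > size) {              /* keep room for temp and '\0' */
            char *tmp = realloc(line, size * 2);
            if (tmp == NULL) {
                free(line);              /* don't leak on failure */
                return NULL;
            }
            line = tmp;
            size *= 2;
        }
        line[i++] = (char)temp;
        if (temp == '\n')
            break;
    }
    if (i == 0) {                        /* EOF before any character */
        free(line);                      /* fixes the end-of-file leak */
        return NULL;
    }
    line[i] = '\0';                      /* proper termination */
    return line;  /* could realloc(line, i + 1) here to trim the buffer */
}
```
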
function MyCompare:
The lines read include the '\n' at the end. Comparison may yield unexpected results: "Hello\tworld\n" will come before "Hello\n". You should strip the '\n' in getline and modify the printf formats appropriately.
function main:
You do not check that the command line arguments were actually provided before passing them to fopen, invoking undefined behaviour if they are missing.
The first loop that counts the number of lines does not free the lines returned by getline... big memory leak!
The input stream cannot always be rewound. If your program is given its input via a pipe, rewind will fail except for very small input sizes.
The number of lines read in the second loop may differ from the count from the first loop if the file was modified asynchronously by another process. If the file grew, you invoke undefined behaviour when you load it; if it shrank, you invoke undefined behaviour when you sort the array. You should reallocate the all_lines array as you read the lines in a single loop.
Printing the lines before sort is not very useful and complicates testing.

Reading text file into an array of lines in C

Using C I would like to read in the contents of a text file in such a way as to have when all is said and done an array of strings with the nth string representing the nth line of the text file. The lines of the file can be arbitrarily long.
What's an elegant way of accomplishing this? I know of some neat tricks to read a text file directly into a single appropriately sized buffer, but breaking it down into lines makes it trickier (at least as far as I can tell).
Thanks very much!
Breaking it down into lines means parsing the text and replacing all the EOL (by EOL I mean \n and \r) characters with 0.
In this way you can actually reuse your buffer and store just the beginning of each line into a separate char * array (all by doing only 2 passes).
In this way you do a single read of the whole file plus two parsing passes, which should improve performance.
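The two-pass scheme described above can be sketched as an in-place splitter (split_lines is an illustrative name; it also strips a '\r' preceding each '\n', covering both EOL characters mentioned):

```c
#include <stdlib.h>
#include <string.h>

/* Split buf in place: pass 1 counts lines, pass 2 cuts each line at its
 * '\n' (dropping a preceding '\r') and records its start. Returns a
 * malloc'd array of pointers into buf; *nlines receives the count. */
static char **split_lines(char *buf, size_t *nlines)
{
    size_t len = strlen(buf), n = 0;

    for (size_t i = 0; i < len; i++)   /* pass 1: count lines */
        if (buf[i] == '\n')
            n++;
    if (len > 0 && buf[len - 1] != '\n')
        n++;                           /* final line lacks '\n' */

    char **lines = malloc((n ? n : 1) * sizeof *lines);
    if (lines == NULL)
        return NULL;

    size_t k = 0, start = 0;
    for (size_t i = 0; i < len; i++) { /* pass 2: record starts, cut */
        if (buf[i] == '\n') {
            buf[i] = '\0';
            if (i > start && buf[i - 1] == '\r')
                buf[i - 1] = '\0';
            lines[k++] = buf + start;
            start = i + 1;
        }
    }
    if (start < len)
        lines[k++] = buf + start;      /* final unterminated line */
    *nlines = k;
    return lines;
}
```

Only the pointer array is allocated; the line text stays in the original buffer, as the answer suggests.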
It's possible to count the number of lines in the file (loop fgets), then create a 2-dimensional array with the first dimension being the number of lines + 1. Then just re-read the file into the array.
You'll need to define the length of the elements, though, or do a pass to find the longest line.
Example code:
FILE *inFile = fopen(FILENAME, "r");
int lineCount = 0;
int inputError = 0;
char word[MAX_LINE];

while (inputError != EOF) {
    inputError = fscanf(inFile, "%s\n", word);
    lineCount++;
}
fclose(inFile);

// Above iterates lineCount++ after the EOF to allow for an array
// that matches the line numbers
char names[lineCount][MAX_LINE];
inFile = fopen(FILENAME, "r");
for (int i = 1; i < lineCount; i++)
    fscanf(inFile, "%s", names[i]);
fclose(inFile);
For C (as opposed to C++), you'd probably wind up using fgets(). However, you might run into issues due to your arbitrary length lines.
Perhaps a Linked List would be the best way to do this?
The compiler won't like having an array with no idea how big to make it. With a Linked List you can have a really large text file, and not worry about allocating enough memory to the array.
Unfortunately, I haven't learned how to do linked lists, but maybe somebody else could help you.
If you have a good way to read the whole file into memory, you are almost there. After you've done that you could scan the file twice. Once to count the lines, and once to set the line pointers and replace '\n' and (and maybe '\r' if the file is read in Windows binary mode) with '\0'. In between scans allocate an array of pointers, now that you know how many you need.
You can use this approach:
#include <stdlib.h> /* exit, malloc, realloc, free */
#include <stdio.h>  /* fopen, fgetc, fputs, fwrite */

struct line_reader {
    /* All members are private. */
    FILE *f;
    char *buf;
    size_t siz;
};

/*
 * Initializes a line reader _lr_ for the stream _f_.
 */
void
lr_init(struct line_reader *lr, FILE *f)
{
    lr->f = f;
    lr->buf = NULL;
    lr->siz = 0;
}

/*
 * Reads the next line. If successful, returns a pointer to the line,
 * and sets *len to the number of characters, at least 1. The result is
 * _not_ a C string; it has no terminating '\0'. The returned pointer
 * remains valid until the next call to next_line() or lr_free() with
 * the same _lr_.
 *
 * next_line() returns NULL at end of file, or if there is an error (on
 * the stream, or with memory allocation).
 */
char *
next_line(struct line_reader *lr, size_t *len)
{
    size_t newsiz;
    int c;
    char *newbuf;

    *len = 0; /* Start with empty line. */
    for (;;) {
        c = fgetc(lr->f); /* Read next character. */
        if (ferror(lr->f))
            return NULL;
        if (c == EOF) {
            /*
             * End of file is also end of last line,
             * unless this last line would be empty.
             */
            if (*len == 0)
                return NULL;
            else
                return lr->buf;
        } else {
            /* Append c to the buffer. */
            if (*len == lr->siz) {
                /* Need a bigger buffer! */
                newsiz = lr->siz + 4096;
                newbuf = realloc(lr->buf, newsiz);
                if (newbuf == NULL)
                    return NULL;
                lr->buf = newbuf;
                lr->siz = newsiz;
            }
            lr->buf[(*len)++] = c;
            /* '\n' is end of line. */
            if (c == '\n')
                return lr->buf;
        }
    }
}

/*
 * Frees internal memory used by _lr_.
 */
void
lr_free(struct line_reader *lr)
{
    free(lr->buf);
    lr->buf = NULL;
    lr->siz = 0;
}

/*
 * Read a file line by line.
 * http://rosettacode.org/wiki/Read_a_file_line_by_line
 */
int
main()
{
    struct line_reader lr;
    FILE *f;
    size_t len;
    char *line;

    f = fopen("foobar.txt", "r");
    if (f == NULL) {
        perror("foobar.txt");
        exit(1);
    }

    /*
     * This loop reads each line.
     * Remember that line is not a C string.
     * There is no terminating '\0'.
     */
    lr_init(&lr, f);
    while (line = next_line(&lr, &len)) {
        /*
         * Do something with line.
         */
        fputs("LINE: ", stdout);
        fwrite(line, len, 1, stdout);
    }
    if (!feof(f)) {
        perror("next_line");
        exit(1);
    }
    lr_free(&lr);
    return 0;
}