I am trying to open a file (Myfile.txt) and concatenate each line into a single buffer, but I am getting unexpected output. The problem is that my buffer is not getting updated with the last concatenated lines. Is anything missing in my code?
Myfile.txt (The file to open and read)
Good morning line-001:
Good morning line-002:
Good morning line-003:
Good morning line-004:
Good morning line-005:
.
.
.
Mycode.c
#include <stdio.h>
#include <string.h>
int main(int argc, const char * argv[])
{
    /* Define a temporary variable */
    char Mybuff[100]; // (I don't want to fix this size; any option?)
    char *line = NULL;
    size_t len = 0;
    FILE *fp;

    fp = fopen("Myfile.txt", "r");
    if (fp == NULL)
    {
        printf("the file couldn't be opened\n");
        return 1;
    }

    while (getline(&line, &len, fp) != -1)
    {
        // Any function to concatenate the strings, here the "line"
        strcat(Mybuff, line);
    }
    fclose(fp);

    printf("Mybuff is: [%s]\n", Mybuff);
    return 0;
}
I am expecting my output to be:
Mybuff is: [Good morning line-001:Good morning line-002:Good morning line-003:Good morning line-004:Good morning line-005:]
But I am getting a segmentation fault (run-time error) and a garbage value. Anything I can do? Thanks.
Specify MyBuff as a pointer, and use dynamic memory allocation.
#include <stdlib.h> /* for dynamic memory allocation functions */

char *MyBuff = calloc(1, 1); /* allocate one character, initialised to zero */
size_t length = 1;

while (getline(&line, &len, fp) != -1)
{
    size_t newlength = length + strlen(line);
    char *temp = realloc(MyBuff, newlength);
    if (temp == NULL)
    {
        /* Allocation failed. Have a tantrum or take recovery action */
    }
    else
    {
        MyBuff = temp;
        length = newlength;
        strcat(MyBuff, line);
    }
}

/* Do whatever is needed with MyBuff */

free(MyBuff);

/* Also, don't forget to release memory allocated by getline() */
The above will leave newlines in MyBuff for each line read by getline(). I'll leave removing those as an exercise.
Note: getline() is POSIX, not standard C. A function like fgets() is available in standard C for reading lines from a file, although it doesn't allocate memory the way getline() does.
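For reference, here is a minimal sketch of the same concatenation done with fgets() and a growing buffer (untested; assumes the same Myfile.txt and omits most error handling):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    FILE *fp = fopen("Myfile.txt", "r");
    if (fp == NULL)
        return 1;

    char chunk[256];                 /* temporary buffer for one fgets() call */
    char *MyBuff = calloc(1, 1);     /* start with an empty string */
    size_t length = 1;               /* current allocation, including the '\0' */
    if (MyBuff == NULL)
    {
        fclose(fp);
        return 1;
    }

    while (fgets(chunk, sizeof chunk, fp) != NULL)
    {
        size_t newlength = length + strlen(chunk);
        char *temp = realloc(MyBuff, newlength);
        if (temp == NULL)
            break;                   /* allocation failed; keep what we have */
        MyBuff = temp;
        length = newlength;
        strcat(MyBuff, chunk);       /* append the chunk (may be a partial line) */
    }
    fclose(fp);

    printf("MyBuff is: [%s]\n", MyBuff);
    free(MyBuff);
    return 0;
}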
So yeah, I saw many questions similar to this one, but thought I'd try solving it my way. I'm getting a huge amount of garbage text after running it (it compiles fine).
I'm trying to read a string of unknown size from a file. I thought about allocating pts with a size of 2 (1 char and the null terminator) and then using realloc to increase the size of the char array for every char that exceeds the size of the array.
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
int main()
{
    char *pts = NULL;
    int temp = 0;
    pts = malloc(2 * sizeof(char));
    FILE *fp = fopen("txtfile", "r");

    while (fgetc(fp) != EOF) {
        if (strlen(pts) == temp) {
            pts = realloc(pts, sizeof(char));
        }
        pts[temp] = fgetc(fp);
        temp++;
    }

    printf("the full string is a s follows : %s\n", pts);
    free(pts);
    fclose(fp);
    return 0;
}
You probably want something like this:
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
#define CHUNK_SIZE 1000 // initial buffer size
int main()
{
    int ch;                           // you need int, not char, for EOF
    int size = CHUNK_SIZE;
    char *pts = malloc(CHUNK_SIZE);
    FILE* fp = fopen("txtfile", "r");
    int i = 0;

    while ((ch = fgetc(fp)) != EOF)   // read one char until EOF
    {
        pts[i++] = ch;                // add char into buffer
        if (i == size)                // if buffer full ...
        {
            size += CHUNK_SIZE;       // increase buffer size
            pts = realloc(pts, size); // reallocate new size
        }
    }
    pts[i] = 0;                       // add NUL terminator

    printf("the full string is a s follows : %s\n", pts);
    free(pts);
    fclose(fp);
    return 0;
}
Disclaimers:
this is untested code, it may not work, but it shows the idea
there is absolutely no error checking for brevity, you should add this.
there is room for other improvements, it can probably be done even more elegantly
Leaving aside for now the question of whether you should do this at all:
You're pretty close with this solution, but there are a few mistakes.
while (fgetc(fp) != EOF) {
This line is going to read one char from the file and then discard it after comparing it against EOF. You'll need to save that byte to add to your buffer. A type of syntax like while ((tmp=fgetc(fp)) != EOF) should work.
pts = realloc(pts, sizeof(char));
Check the documentation for realloc, you'll need to pass in the new size in the second parameter.
pts = malloc(2 * sizeof(char));
You'll need to zero this memory after acquiring it. You probably also want to zero any memory given to you by realloc, or you may lose the null off the end of your string and strlen will be incorrect.
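Putting those three fixes together, the read loop from the question might look something like this (an untested sketch that still grows the buffer one byte at a time and still omits error checking; fp is assumed to be open as in the question):

int tmp;
size_t temp = 0;
char *pts = calloc(2, sizeof(char));      /* zeroed, so strlen(pts) starts at 0 */

while ((tmp = fgetc(fp)) != EOF) {        /* save the byte instead of discarding it */
    if (strlen(pts) == temp) {
        pts = realloc(pts, temp + 2);     /* pass the new size to realloc */
        pts[temp + 1] = '\0';             /* keep the string NUL-terminated */
    }
    pts[temp] = (char)tmp;
    temp++;
}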
But as I alluded to earlier, using realloc in a loop like this when you've got a fair idea of the size of the buffer already is generally going to be non-idiomatic C design. Get the size of the file ahead of time and allocate enough space for all the data in your buffer. You can still realloc if you go over the size of the buffer, but do so using chunks of memory instead of one byte at a time.
Probably the most efficient way (as mentioned in the comment by Fiddling Bits) is to read the whole file in one go (after first getting the file's size):
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
#include <sys/stat.h>
int main()
{
    size_t nchars = 0; // Declare here and set to zero...
    // ... so we can optionally try using the "stat" function, if the O/S supports it...
    struct stat st;
    if (stat("txtfile", &st) == 0) nchars = st.st_size;

    FILE* fp = fopen("txtfile", "rb"); // Make sure we open in BINARY mode!

    if (nchars == 0) // This code will be used if the "stat" function is unavailable or failed ...
    {
        fseek(fp, 0, SEEK_END); // Go to end of file (NOTE: SEEK_END may not be implemented - but PROBABLY is!)
        // while (fgetc(fp) != EOF) {} // If your system doesn't implement SEEK_END, you can do this instead:
        nchars = (size_t)(ftell(fp)); // File size in bytes
    }

    char* pts = calloc(nchars + 1, sizeof(char)); // Add one for the NUL terminator
    if (pts != NULL)
    {
        fseek(fp, 0, SEEK_SET); // Return to start of file...
        fread(pts, sizeof(char), nchars, fp); // ... and read one great big chunk!
        printf("the full string is a s follows : %s\n", pts);
        free(pts);
    }
    else
    {
        printf("the file is too big for me to handle (%zu bytes)!", nchars);
    }
    fclose(fp);
    return 0;
}
On the issue of the use of SEEK_END, see this cppreference page, where it states:
Library implementations are allowed to not meaningfully support SEEK_END (therefore, code using it has no real standard portability).
On whether or not you will be able to use the stat function, see this Wikipedia page. (But it is now available in MSVC on Windows!)
I am trying to process a 250 MB file using a program written in C.
The file is basically a dataset and I want to read just some of the columns and (more importantly) break one of them (which is originally a string) into a sequence of characters.
However, even though I have plenty of RAM available, the code is killed by konsole (using KDE Neon) every time I run it.
The source is available below:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
int main() {
    FILE *arquivo;
    char *line = NULL;
    size_t len = 0;
    int i = 0;
    int j;
    int k;
    char *vetor[500];
    int acertos[45];

    FILE *licmat = fopen("licmat.csv", "w");

    //creating the header
    fprintf(licmat,"CO_CATEGAD,CO_UF_CURSO,ACERTO09,ACERTO10,ACERTO11,ACERTO12,ACERTO13,ACERTO14,ACERTO15,ACERTO16,ACERTO17,ACERTO18,ACERTO19,ACERTO20,ACERTO21,ACERTO22,ACERTO23,ACERTO24,ACERTO25,ACERTO26,ACERTO27,ACERTO28,ACERTO29,ACERTO30,ACERTO31,ACERTO32,ACERTO33,ACERTO34,ACERTO35\n");

    if ((arquivo = fopen("MICRODADOS_ENADE_2017.csv", "r")) == NULL) {
        printf ("\nError");
        exit(0);
    }

    //reading one line at a time
    while (getline(&line, &len, arquivo)) {
        char *ptr = strsep(&line,";");
        j=0;
        //breaking the line into a vector based on ;
        while(ptr != NULL)
        {
            vetor[j]=ptr;
            j=j+1;
            ptr = strsep(&line,";");
        }
        //filtering based on content
        if (strcmp(vetor[4],"702")==0 && strcmp(vetor[33],"555")==0) {
            //copying some info
            fprintf(licmat,"%s,%s,",vetor[2],vetor[8]);
            //breaking the string (32) into isolated characters
            for (k=0;k<27;k=k+1) {
                fprintf(licmat,"%c", vetor[32][k]);
                if (k<26) {
                    fprintf(licmat,",");
                }
            }
            fprintf(licmat,"\n");
        }
        i=i+1;
    }
    free(line);
    fclose(arquivo);
    fclose(licmat);
}
The output is perfect up to the point where the program is killed. The output file is just 640 KB and has only about 10,000 lines.
What could be the issue?
It looks to me like you're mishandling the memory buffer managed by getline() - which allocates/reallocates as needed - by the use of strsep(), which seems to manipulate that same pointer value.
Once line has been updated to reflect some other element on the line, it's no longer pointing to the start of allocated memory, and then boom the next time getline() needs to do anything with it.
Use a different variable to pass to strsep():
while (getline(&line, &len, arquivo) > 0) { // use ">=" if you want blank lines
    char *parseline = line;
    char *ptr = strsep(&parseline,";");
    // do the same thing later
The key thing here: you're not allowed to muck with the value of line other than to free() it at the end (which you do), and you can't let any other routine do it either.
Edit: updated to reflect getline() returning < 0 on error (h/t to @user3121023)
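For completeness, the corrected reading loop might look roughly like this (a sketch built from the snippet above; the filtering and fprintf() code stays exactly as in the question):

while (getline(&line, &len, arquivo) > 0) {
    char *parseline = line;              /* strsep() advances this copy, not line */
    char *ptr = strsep(&parseline, ";");
    j = 0;
    while (ptr != NULL) {
        vetor[j] = ptr;
        j = j + 1;
        ptr = strsep(&parseline, ";");
    }
    /* ... filtering and fprintf() calls as before ... */
}
free(line);                              /* line still points to getline()'s buffer */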
The objective here is to take a whole text file, dump it into a buffer, and then use the strcasestr() function to find a pointer to the word I am looking for within my buffer. It constantly gives me a segmentation fault. At first I thought it might be the size, so I tried smaller sizes, but that doesn't work either. The function only works with strings I create inside the actual code (e.g. char * bob = "bob"; char * bobsentence = "bob is cool"; strstr(bobsentence, bob);), which leads me to believe it has something to do with the fgets(). Any help is appreciated, really stuck on this one.
#define _GNU_SOURCE //to use strcasestr
#include <unistd.h>
#include <sys/types.h>
#include <dirent.h>
#include <stdio.h>
#include <string.h>
#include <stdio.h>
#include <stdlib.h>
void textEdit(char *path, char *word){
    printf("%s\n", path);

    FILE *textFile;
    //FILE *locationFile;
    //FILE *tempFile;
    char counter[1024];
    int count = 0;

    textFile = fopen(path, "r+");
    //locationFile = fopen(path, "r+");

    //opens file to read and write and opens temp file to write
    if( textFile == NULL){ //|| tempFile == NULL || locationFile == NULL) ) {
        printf ("\nerror\n");
        return;
    }

    // SECTION : ALLOCATES MEMORY NEEDED FOR COPY TEXT IN ARRAY

    // finds number of lines to estimate total size of array needed for buffer
    while((fgets(counter, sizeof(counter), textFile)) != NULL){
        count++;
    }
    fclose(textFile);

    FILE *tempFile = fopen(path, "r+");

    count *= 1024;
    printf("%d %zu\n",count, sizeof(char));

    char *buffer = malloc(count); //1024 is the max number of characters per line in a traditional txt
    if(buffer == NULL){ //error with malloc
        return;
    }

    // SECTION : DUMPS TEXT INTO ARRAY
    if(fgets(buffer, count, tempFile) == NULL){
        printf("error");
    } //dumps all text into array

    printf("%s\n", buffer);

    char * searchedWord;
    while((searchedWord = strcasestr(buffer, word)) != NULL){
    }

    fclose(tempFile);
    //fclose(locationFile);
    free(buffer);
}
It looks like you forgot to initialize the count variable to 0:
int count = 0;
Without that, you increment it while it contains a random value, possibly even a negative one.
Also, note that your use of strcasestr doesn't look correct. The function returns a pointer to the first occurrence that matches. It doesn't remember already-found matches, so if a match exists, this loop will run forever. Instead it should look like:
char *pos = buffer;
while((pos = strcasestr(pos, word)) != NULL){
    searchedWord = pos;
    /* do something with searchedWord but remember that it belongs to
       allocated buffer and can't be used after free() */
    pos++;
}
Using C I would like to read in the contents of a text file in such a way as to have when all is said and done an array of strings with the nth string representing the nth line of the text file. The lines of the file can be arbitrarily long.
What's an elegant way of accomplishing this? I know of some neat tricks to read a text file directly into a single appropriately sized buffer, but breaking it down into lines makes it trickier (at least as far as I can tell).
Thanks very much!
Breaking it down into lines means parsing the text and replacing all the EOL characters (by EOL I mean \n and \r) with 0.
This way you can actually reuse your buffer and store just the beginning of each line in a separate char * array (all by doing only 2 passes).
That means one read for the whole file plus 2 parsing passes, which would probably improve performance.
It's possible to count the number of lines in the file (loop fgets), then create a 2-dimensional array with the first dimension being the number of lines + 1. Then just re-read the file into the array.
You'll need to define the length of the elements, though. Or do a pass to find the longest line size.
Example code:
FILE *inFile = fopen(FILENAME, "r");
char word[MAX_LINE];
int inputError = 0;
int lineCount = 0;

while(inputError != EOF) {
    inputError = fscanf(inFile, "%s\n", word);
    lineCount++;
}
fclose(inFile);
// Above iterates lineCount++ after the EOF to allow for an array
// that matches the line numbers

char names[lineCount][MAX_LINE];

inFile = fopen(FILENAME, "r");
for(int i = 1; i < lineCount; i++)
    fscanf(inFile, "%s", names[i]);
fclose(inFile);
For C (as opposed to C++), you'd probably wind up using fgets(). However, you might run into issues due to your arbitrary-length lines.
Perhaps a linked list would be the best way to do this?
The compiler won't like having an array without knowing how big to make it. With a linked list you can handle a really large text file and not worry about allocating enough memory for an array.
Unfortunately, I haven't learned how to do linked lists, but maybe somebody else could help you.
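To flesh that suggestion out, here is a minimal sketch of a linked list of lines (the node structure and field names are just for illustration; getline() and strdup() are POSIX, not standard C, and error handling is kept minimal):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* One node per line of the file. */
struct node {
    char *text;
    struct node *next;
};

int main(void)
{
    FILE *fp = fopen("file.txt", "r");
    if (fp == NULL)
        return 1;

    struct node *head = NULL, *tail = NULL;
    char *line = NULL;
    size_t len = 0;

    /* getline() handles arbitrarily long lines by growing its own buffer. */
    while (getline(&line, &len, fp) != -1) {
        struct node *n = malloc(sizeof *n);
        if (n == NULL)
            break;                       /* error handling kept minimal */
        n->text = strdup(line);          /* copy the line into the node */
        n->next = NULL;
        if (tail == NULL)
            head = n;
        else
            tail->next = n;
        tail = n;
    }
    free(line);
    fclose(fp);

    /* Walk the list, then release it. */
    for (struct node *n = head; n != NULL; ) {
        fputs(n->text, stdout);
        struct node *next = n->next;
        free(n->text);
        free(n);
        n = next;
    }
    return 0;
}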
If you have a good way to read the whole file into memory, you are almost there. After you've done that you can scan the buffer twice: once to count the lines, and once to set the line pointers and replace '\n' (and maybe '\r', if the file was read in Windows binary mode) with '\0'. In between the scans, allocate an array of pointers, now that you know how many you need.
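A minimal sketch of that two-pass approach, assuming the whole file is already in a NUL-terminated buffer called buf (a hypothetical name; allocation checks omitted):

/* Pass 1: count lines in the in-memory buffer. */
size_t nlines = 0;
for (char *p = buf; *p != '\0'; p++)
    if (*p == '\n')
        nlines++;

/* Allocate one pointer per line (plus one for a last line with no '\n'). */
char **lines = malloc((nlines + 1) * sizeof *lines);

/* Pass 2: record line starts and cut the buffer at each EOL. */
size_t i = 0;
char *start = buf;
for (char *p = buf; *p != '\0'; p++) {
    if (*p == '\n') {
        *p = '\0';                     /* terminate this line in place */
        if (p > buf && p[-1] == '\r')
            p[-1] = '\0';              /* also drop a Windows '\r' */
        lines[i++] = start;
        start = p + 1;
    }
}
if (*start != '\0')                    /* final line without a trailing '\n' */
    lines[i++] = start;
/* lines[0..i-1] now point to the individual lines inside buf. */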
You can do it this way:
#include <stdlib.h> /* exit, malloc, realloc, free */
#include <stdio.h> /* fopen, fgetc, fputs, fwrite */
struct line_reader {
    /* All members are private. */
    FILE *f;
    char *buf;
    size_t siz;
};
/*
* Initializes a line reader _lr_ for the stream _f_.
*/
void
lr_init(struct line_reader *lr, FILE *f)
{
    lr->f = f;
    lr->buf = NULL;
    lr->siz = 0;
}
/*
* Reads the next line. If successful, returns a pointer to the line,
* and sets *len to the number of characters, at least 1. The result is
* _not_ a C string; it has no terminating '\0'. The returned pointer
* remains valid until the next call to next_line() or lr_free() with
* the same _lr_.
*
* next_line() returns NULL at end of file, or if there is an error (on
* the stream, or with memory allocation).
*/
char *
next_line(struct line_reader *lr, size_t *len)
{
    size_t newsiz;
    int c;
    char *newbuf;

    *len = 0;                 /* Start with empty line. */
    for (;;) {
        c = fgetc(lr->f);     /* Read next character. */
        if (ferror(lr->f))
            return NULL;

        if (c == EOF) {
            /*
             * End of file is also end of last line,
             * unless this last line would be empty.
             */
            if (*len == 0)
                return NULL;
            else
                return lr->buf;
        } else {
            /* Append c to the buffer. */
            if (*len == lr->siz) {
                /* Need a bigger buffer! */
                newsiz = lr->siz + 4096;
                newbuf = realloc(lr->buf, newsiz);
                if (newbuf == NULL)
                    return NULL;
                lr->buf = newbuf;
                lr->siz = newsiz;
            }
            lr->buf[(*len)++] = c;

            /* '\n' is end of line. */
            if (c == '\n')
                return lr->buf;
        }
    }
}
/*
* Frees internal memory used by _lr_.
*/
void
lr_free(struct line_reader *lr)
{
    free(lr->buf);
    lr->buf = NULL;
    lr->siz = 0;
}
/*
* Read a file line by line.
* http://rosettacode.org/wiki/Read_a_file_line_by_line
*/
int
main()
{
    struct line_reader lr;
    FILE *f;
    size_t len;
    char *line;

    f = fopen("foobar.txt", "r");
    if (f == NULL) {
        perror("foobar.txt");
        exit(1);
    }

    /*
     * This loop reads each line.
     * Remember that line is not a C string.
     * There is no terminating '\0'.
     */
    lr_init(&lr, f);
    while (line = next_line(&lr, &len)) {
        /*
         * Do something with line.
         */
        fputs("LINE: ", stdout);
        fwrite(line, len, 1, stdout);
    }
    if (!feof(f)) {
        perror("next_line");
        exit(1);
    }
    lr_free(&lr);
    return 0;
}
I'm dealing with small text files that I want to read into a buffer while I process them, so I've come up with the following code:
...
char source[1000000];
FILE *fp = fopen("TheFile.txt", "r");
if(fp != NULL)
{
    while((symbol = getc(fp)) != EOF)
    {
        strcat(source, &symbol);
    }
    fclose(fp);
}
...
Is this the correct way of putting the contents of the file into the buffer, or am I abusing strcat()?
I then iterate through the buffer thus:
for(int x = 0; (c = source[x]) != '\0'; x++)
{
//Process chars
}
char source[1000000];
FILE *fp = fopen("TheFile.txt", "r");
if(fp != NULL)
{
    while((symbol = getc(fp)) != EOF)
    {
        strcat(source, &symbol);
    }
    fclose(fp);
}
There are quite a few things wrong with this code:
It is very slow (you are reading the file one character at a time).
If the file size exceeds sizeof(source), this is prone to buffer overflows.
Really, when you look at it more closely, this code should not work at all. As stated in the man pages:
The strcat() function appends a copy of the null-terminated string s2 to the end of the null-terminated string s1, then add a terminating `\0'.
You are appending a character (not a NUL-terminated string!) to a string that may or may not be NUL-terminated. The only time I can imagine this working according to the man-page description is if every character in the file is NUL-terminated, in which case this would be rather pointless. So yes, this is most definitely a terrible abuse of strcat().
The following are two alternatives to consider using instead.
If you know the maximum buffer size ahead of time:
#include <stdio.h>
#define MAXBUFLEN 1000000
char source[MAXBUFLEN + 1];
FILE *fp = fopen("foo.txt", "r");
if (fp != NULL) {
    size_t newLen = fread(source, sizeof(char), MAXBUFLEN, fp);
    if ( ferror( fp ) != 0 ) {
        fputs("Error reading file", stderr);
    } else {
        source[newLen++] = '\0'; /* Just to be safe. */
    }
    fclose(fp);
}
Or, if you do not:
#include <stdio.h>
#include <stdlib.h>
char *source = NULL;
FILE *fp = fopen("foo.txt", "r");
if (fp != NULL) {
    /* Go to the end of the file. */
    if (fseek(fp, 0L, SEEK_END) == 0) {
        /* Get the size of the file. */
        long bufsize = ftell(fp);
        if (bufsize == -1) { /* Error */ }

        /* Allocate our buffer to that size. */
        source = malloc(sizeof(char) * (bufsize + 1));

        /* Go back to the start of the file. */
        if (fseek(fp, 0L, SEEK_SET) != 0) { /* Error */ }

        /* Read the entire file into memory. */
        size_t newLen = fread(source, sizeof(char), bufsize, fp);
        if ( ferror( fp ) != 0 ) {
            fputs("Error reading file", stderr);
        } else {
            source[newLen++] = '\0'; /* Just to be safe. */
        }
    }
    fclose(fp);
}
free(source); /* Don't forget to call free() later! */
Yes - you would probably be arrested for your terrible abuse of strcat!
Take a look at getline(); it reads the data a line at a time, and importantly it can limit the number of characters you read, so you don't overflow the buffer.
Strcat is relatively slow because it has to search the entire string for the end on every character insertion.
You would normally keep a pointer to the current end of the string storage and pass that to getline as the position to read the next line into.
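The answer above mentions getline(); here is a minimal sketch of the same end-pointer idea using standard fgets() (which likewise caps how many characters are read into the remaining space), assuming the fixed-size source buffer from the question:

#include <stdio.h>
#include <string.h>

int main(void)
{
    static char source[1000000];     /* buffer size taken from the question */
    FILE *fp = fopen("TheFile.txt", "r");
    if (fp == NULL)
        return 1;

    char *end = source;              /* current end of the stored text */
    size_t remaining = sizeof source;

    /* Read the next line directly at the end of what we already have. */
    while (remaining > 1 && fgets(end, (int)remaining, fp) != NULL)
    {
        size_t n = strlen(end);      /* only scan the newly read part */
        end += n;
        remaining -= n;
    }
    fclose(fp);

    printf("%s", source);
    return 0;
}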
If you're on a linux system, once you have the file descriptor you can get a lot of information about the file using fstat()
http://linux.die.net/man/2/stat
so you might have
#include <sys/stat.h>
#include <fcntl.h>
#include <unistd.h>

int main(void)
{
    struct stat st;
    int fd = open("TheFile.txt", O_RDONLY); // get file descriptor
    fstat(fd, &st);
    // the size of the file is now in st.st_size
    close(fd);
    return 0;
}
This avoids seeking to the beginning and end of the file.
See this article from JoelOnSoftware for why you don't want to use strcat.
Look at fread for an alternative. Use it with 1 for the size when you're reading bytes or characters.
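For example, a minimal sketch using the source buffer and fp from the question (element size 1, as suggested):

size_t n = fread(source, 1, sizeof source - 1, fp);  /* read up to the buffer size */
source[n] = '\0';                                    /* NUL-terminate what was read */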
Why don't you just use the array of chars you have? This ought to do it:
source[i] = getc(fp);
i++;
Not tested, but should work. And yes, it could be better implemented with fread; I'll leave that as an exercise for the reader.
#define DEFAULT_SIZE 100
#define STEP_SIZE 100

char *buffer = malloc(DEFAULT_SIZE); /* must be heap memory for realloc() to be valid */
size_t buffer_sz = DEFAULT_SIZE;
size_t i = 0;
int c; /* int, not char, so EOF can be detected */

while ((c = fgetc(fp)) != EOF) {
    buffer[i] = (char)c;
    i++;
    if (i >= buffer_sz) {
        buffer_sz += STEP_SIZE;
        char *tmp = realloc(buffer, buffer_sz);
        if (tmp == NULL) { free(buffer); exit(1); } // ensure we don't have a memory leak
        buffer = tmp;
    }
}
buffer[i] = 0;
Methinks you want fread:
http://www.cplusplus.com/reference/clibrary/cstdio/fread/
Have you considered mmap()? You can read from the file directly as if it were already in memory.
http://beej.us/guide/bgipc/output/html/multipage/mmap.html
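A minimal sketch of the mmap() approach (POSIX only; hypothetical file name, and most error handling omitted):

#include <fcntl.h>
#include <stdio.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

int main(void)
{
    int fd = open("TheFile.txt", O_RDONLY);
    if (fd == -1)
        return 1;

    struct stat st;
    fstat(fd, &st);                       /* file size is needed for the mapping length */

    /* Map the whole file read-only; its contents appear as ordinary memory. */
    char *data = mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    if (data == MAP_FAILED)
        return 1;

    fwrite(data, 1, st.st_size, stdout);  /* process the bytes in place */

    munmap(data, st.st_size);
    close(fd);
    return 0;
}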