I'm trying to read from a text file using the read function and store to a buffer. I then have to check the file again to see if any changes were made, and if there are, to reallocate memory and store contents to the same buffer (appending), character by character until EOF is reached. My code looks like this so far:
int fileSize=0;
fileSize=fileStat.st_size; /*fileSize is how many bytes the file is, when read initially*/
char buf[fileSize];
read(0, buf, fileSize);
/*now, I have to check if the file changed*/
int reader;
void *tempChar;
int reader=read(0, tempChar, 1);
while(reader!=0){
/*this means the file grew...I'm having trouble from here*/
I tried a lot of things, but I always end up having problems when I try to append the contents from "tempChar" to "buf". I know I should use the realloc function, but I'm still having problems. Any help would be appreciated.
Thanks!
realloc() only works on memory obtained from malloc(), calloc(), or realloc(); you cannot use it on an array like your buf.
If you want to do that, you have to use a pointer and allocate the memory dynamically.
Example:
char *buf;
buf = malloc(fileSize);
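For example, here is a minimal sketch of the idea (not a drop-in fix, and error handling is trimmed): it assumes fd is the descriptor you are reading from and fileSize the size you got from fstat, and it needs <stdlib.h> and <unistd.h>. Because the buffer starts out malloc'd, it can later be grown with realloc as each extra character arrives:

char *buf = malloc(fileSize);
ssize_t n = read(fd, buf, fileSize);      /* initial contents               */
size_t used = (n > 0) ? (size_t)n : 0;

char c;
while (read(fd, &c, 1) == 1) {            /* the file grew by another byte  */
    char *tmp = realloc(buf, used + 1);   /* grow the buffer by one byte    */
    if (tmp == NULL)
        break;                            /* keep the old buffer on failure */
    buf = tmp;
    buf[used++] = c;                      /* append the new character       */
}
/* ... use buf (used bytes, not null-terminated) ... */
free(buf);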
I have the following function, and I am wondering if there is a way to pass a string or char array into it instead of stdout, so I can get the printed representation as a string.
void print_Type(Type t, FILE *f)
{
    fprintf(f, "stuff ...");
}
print_Type(t, stdout);
I have already tried this:
int SIZE = 100;
char buffer[SIZE];
print_Type(t, buffer);
But this is what I am seeing:
�����
Something like this
FILE* f = fmemopen(buffer, sizeof(buffer), "w");
print_Type(t, f);
fclose(f);
The fmemopen(void *buf, size_t size, const char *mode) function opens a stream. The stream allows I/O to be performed on the string or memory buffer pointed to by buf.
Yes, there is sprintf(); notice the leading s rather than f.
int SIZE = 100;
char buffer[SIZE];
sprintf(buffer, "stuff %d", 10);
This function prints to a string s rather than a file f. It has exactly the same properties and parameters as fprintf(); the only difference is the destination, which must be a char array (either declared as an array or allocated dynamically, usually via malloc).
Note: This function is dangerous as it does not check the length and can easily overrun the end of the buffer if you are not careful.
If you are using a later version of C (C99 or newer), a better function is snprintf, which adds the extra buffer-length checking.
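For example, a minimal sketch: snprintf never writes past the size you give it, and its return value tells you how long the full output would have been, so truncation can be detected:

char buffer[100];
int n = snprintf(buffer, sizeof buffer, "stuff %d", 10);
if (n < 0 || (size_t)n >= sizeof buffer) {
    /* encoding error, or the output was truncated to fit */
}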
The problem with fmemopen is that it cannot resize the buffer. fmemopen did exist in Glibc for quite some time, but it was standardized only in POSIX.1-2008. But that revision included another function that handles dynamic memory allocation: open_memstream(3):
char *buffer = NULL;
size_t size = 0;
FILE* f = open_memstream(&buffer, &size);
print_Type(t, f);
fclose(f);
buffer will now point to a null-terminated buffer, with size bytes before the extra null terminator. I.e. if you didn't write any null bytes yourself, then strlen(buffer) == size. The buffer is allocated by open_memstream; free it yourself once you are done with it.
Thus the only merit of fmemopen is that it can be used to write to a memory buffer at a fixed location or of a fixed length, whereas open_memstream should be used everywhere else, where the location of the buffer does not matter.
fmemopen has yet another undesired feature: writes may fail only when the buffer is flushed, not before. Since the target is in memory, there is no point in buffering the writes, so if you do choose fmemopen, the Linux manual page fmemopen(3) recommends disabling buffering with setbuf(f, NULL);
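Putting that together, a fixed-buffer variant might look like the sketch below (reusing the question's print_Type and t; fmemopen writes the terminating null byte when the stream is flushed or closed, provided there is room for it):

char buffer[100];
FILE *f = fmemopen(buffer, sizeof buffer, "w");
setbuf(f, NULL);   /* unbuffered, so write errors surface immediately */
print_Type(t, f);
fclose(f);         /* buffer now holds the null-terminated output */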
First of all, I'm quite new with C, and I know this is a very repeated question, however, I could not find anything that could help me with my problem.
Here is my code:
It takes a text file and stores each line in an array.
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
int main()
{
    FILE *file;
    file = fopen("test.txt", "r");
    char buffer[600];
    char *lines[10000];
    int i = 0;
    while(fgets(buffer, sizeof(buffer), file))
    {
        lines[i] = malloc(sizeof(buffer));
        strcpy(lines[i], buffer);
        i++;
        free(lines[i]);
    }
    fclose(file);
    return 1;
}
This works fine for small text files.
However, it doesn't work with large ones, even when I set buffer and lines to much bigger sizes. Actually, if I increase buffer[] and *lines[] to something like 1000000 bytes, it doesn't output anything (if I understood well, it gives undefined behaviour). And I need to get this working with a file of 100,000 lines of variable length.
So, how could I declare a very large array so I can pass each line? Since, as I exposed, it doesn't work with a large file.
Any help is appreciated!
char *lines[10000]; is just an array of pointers to the lines, not the array (memory) that is going to store the actual lines.
malloc is allocating a chunk of memory for each line; you are supposed to call free only when you are done using that chunk.
If you remove the free your solution would work, but you need to remember to free at some other point.
And I need to get this work with a 100.000 lines file with variable length lines,
So, how could I declare a very large array so I can pass each line?
This line
char *lines[10000];
gives you a variable with automatic storage duration - often called a local variable.
On most systems such a variable is located on the stack, and most systems have a fixed limit for the size of the stack, and thereby also a limit on the size of such a local variable.
So if you change the code to
char *lines[1000000];
to be able to handle larger files, it is likely that the variable uses too much memory on the stack, i.e. you get a stack overflow.
A simple solution is to allocate the variable dynamically. Like:
char **lines = malloc(1000000 * sizeof *lines);
This will allocate 1000000 char-pointers and you can use lines as if it's an array - for instance like:
lines[i] = malloc(sizeof(buffer));
For something like this I'll also recommend that you take a look at realloc so that you can adjust the size of memory as needed.
Besides that your use of free seems strange and it's for sure wrong as you increment i between the malloc and the free.
You can allocate just as much space as you need, so you get rid of the fixed and limited sizes.
I have "massaged" your example in this way. The only thing I didn't do is a first pass through the file to obtain the longest line, so I kept the fixed buffer length.
Allocate only as many pointer to the lines as you need. For this you define a pointer to pointers to char.
Allocate only as many characters for each line as you need. This is done most conveniently with the function strdup(). If your library doesn't have it (it is not standard) you can replace it with the right combination of strlen(), malloc(), and strcpy(). How to do this is left as an exercise for you. ;-)
Handle allocation errors, especially if you plan to read huge files.
Free the allocated memory blocks; the order in which the lines are freed is not important, but lines itself has to be kept until all lines[*] are freed.
This is the code:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
int main()
{
    FILE *file;
    file = fopen("test.txt", "r");
    if (file == NULL)
    {
        // any error handling you like...
        return EXIT_FAILURE;
    }
    char buffer[600];
    char **lines = NULL;
    int i = 0;
    while (fgets(buffer, sizeof(buffer), file))
    {
        lines = realloc(lines, (i + 1) * sizeof (char*));
        if (lines == NULL)
        {
            // any error handling you like...
            return EXIT_FAILURE;
        }
        lines[i] = strdup(buffer);
        if (lines[i] == NULL)
        {
            // any error handling you like...
            return EXIT_FAILURE;
        }
        i++;
    }
    fclose(file);
    // work with the lines
    for (int j = 0; j < i; ++j)
    {
        free(lines[j]);
    }
    free(lines);
    return EXIT_SUCCESS;
}
Some notes:
Because of the realloc() on each line, the run time of your program will scale badly for files with a giant number of lines. To improve this you might use a better algorithm, for example allocating in geometrically growing steps (see the sketch after these notes). But this is a completely different issue.
You don't need to free allocated memory yourself if you need the memory until the end of the program; the operating system reclaims all of the process's memory when it exits.
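As mentioned in the first note, one common improvement (a sketch, not the only option) is to keep a separate capacity and double it whenever it is exhausted, so the number of realloc() calls grows only logarithmically with the number of lines. The capacity variable below is new; the rest mirrors the loop above:

size_t capacity = 0;
char **lines = NULL;
int i = 0;

while (fgets(buffer, sizeof(buffer), file))
{
    if ((size_t)i == capacity)
    {
        // grow geometrically: 16, 32, 64, ... pointers
        size_t new_capacity = capacity ? capacity * 2 : 16;
        char **tmp = realloc(lines, new_capacity * sizeof *tmp);
        if (tmp == NULL)
        {
            // any error handling you like; `lines` is still valid here
            return EXIT_FAILURE;
        }
        lines = tmp;
        capacity = new_capacity;
    }
    lines[i] = strdup(buffer);
    if (lines[i] == NULL)
    {
        return EXIT_FAILURE;
    }
    i++;
}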
#include <stdio.h>
#include <stdlib.h>
int main()
{
    FILE *input_f;
    input_f = fopen("Input.txt", "r"); //Opens the file in read mode.
    if (input_f != NULL)
    {
        char line[2048];
        while( fgets(line, sizeof line, input_f) != NULL )
        {
            //do something
        }
        fclose(input_f); //Close the input file.
    }
    else
    {
        perror("File couldn't be opened"); //Will print that the file couldn't be opened and why.
    }
    return 0;
}
Hi. I know I can read line by line with code like this in C, but I don't want to limit the line size, as this code does with 2048.
I thought about using malloc, but I don't know the size of the line before I read it, so IMO it cannot be done.
Is there a way to not to limit line size?
This question is just for my curiosity, thank you.
When you are allocating memory dynamically, you will want to change:
char line[2048];
to
#define MAXL 2048       /* the use of a define will become apparent when you */
size_t maxl = MAXL;     /* need to check to determine if a realloc is needed */
char *line = malloc (maxl * sizeof *line);
if (!line) {            /* always check to ensure the allocation succeeded */
    /* ...error: memory allocation failed... */
}
You then read up to (maxl - 1) chars or a newline (if using fgetc, etc.), or read a chunk and check whether line[strlen (line) - 1] == '\n' to determine whether you read the entire line (if using fgets); POSIX requires all lines to terminate with a newline. If you read maxl characters (fgetc) or did not read the newline (fgets), then it was a short read and more characters remain. Your choice then is to realloc (generally doubling the size) and try again. To realloc:
char *tmp = realloc (line, 2 * maxl);
if (tmp) {
    line = tmp;
    maxl *= 2;
}
Note: never reallocate using your original pointer alone (e.g. line = realloc (line, 2 * maxl)), because if realloc fails it returns NULL without freeing the old block, and assigning that NULL directly to line would overwrite your only pointer to the data that existed in line. Also note that maxl is typically doubled each time you realloc, but you are free to choose whatever size-increasing scheme you like. (If you want the newly allocated memory zeroed, you can use memset to initialize it to zero; this is useful in situations where you want to ensure your line is always null-terminated.)
That is the basic dynamic allocation/reallocation scheme. Note that you keep reading until you have the complete line, so you will need to restructure your loop test. And lastly, since you allocated the memory, you are responsible for freeing it when you are done with it. A tool you cannot live without is valgrind (or a similar memory checker) to confirm you are not leaking memory.
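A sketch of that restructured loop, under the same assumptions as above (line and maxl from the allocation shown earlier, input_f being the open file; error handling abbreviated):

size_t len = 0;

while (fgets (line + len, (int)(maxl - len), input_f)) {
    len += strlen (line + len);
    if (len && line[len - 1] == '\n')       /* the whole line was read           */
        break;
    char *tmp = realloc (line, 2 * maxl);   /* short read: grow and read more    */
    if (!tmp)
        break;                              /* line still holds the partial read */
    line = tmp;
    maxl *= 2;
}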
Tip: if you want to ensure your string is always null-terminated, then after allocating your block of memory, zero all of its characters. As mentioned earlier, memset is available, but if you choose calloc instead of malloc it will zero the memory for you. However, on realloc the new space is NOT zeroed either way, so calling memset is required regardless of which function originally allocated the block.
Tip 2: Look at the POSIX getline. getline will handle the allocation/reallocation needed, so long as line is initialized to NULL. getline also returns the number of characters actually read, dispensing with the need to call strlen after fgets to determine the same.
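A minimal sketch of that approach (POSIX getline allocates and grows the buffer itself; you free it once at the end; input_f is the open file from the question):

char *line = NULL;    /* getline allocates and grows this for us     */
size_t n = 0;         /* current allocation size, updated by getline */
ssize_t nchr;         /* number of characters read, including '\n'   */

while ((nchr = getline (&line, &n, input_f)) != -1) {
    /* ... use line (nchr characters long) ... */
}
free (line);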
Let me know if you have additional questions.
Consider 2 thoughts:
An upper bound of allocated memory is reasonable. The nature of the task should have some idea of a maximum line length, be it 80, 1024 or 1 Mbyte.
With a clever OS, actual usage of allocated memory may not occur until needed. See Why is malloc not "using up" the memory on my computer?
So let code allocate 1 big buffer to limit pathological cases and let the underlying memory management (re-)allocate real memory as needed.
#define N (1000000)
char *buf = malloc(N);
...
while (fgets(buf, N, stdin) != NULL) {
    size_t len = strlen(buf);
    if (len == N-1) {
        perror("Excessive Long Line");
        exit(EXIT_FAILURE);
    }
}
free(buf);
I need to read in a file. The first line of the file is the number of lines in the file, and the function should return an array of strings, with the last element being NULL to indicate the end of the array.
char **read_file(char *fname)
{
    char **dict;
    printf("Reading %s\n", fname);
    FILE *d = fopen(fname, "r");
    if (! d) return NULL;

    // Get the number of lines in the file
    // the first line in the file is the number of lines, so I have to get the 0th element
    char *size;
    fscanf(d, "%s[^\n]", size);
    int filesize = atoi(size);

    // Allocate memory for the array of character pointers
    dict = NULL; // Change this

    // Read in the rest of the file, allocating memory for each string
    // as we go.

    // NULL termination. Last entry in the array should be NULL.

    printf("Done\n");
    return dict;
}
I put some comments because I know that's what I'm to do, but I can't seem to figure out how to put it in actual code.
To solve this problem you need to do one of two things.
Read the file as characters then convert to integers.
Read the file directly as integers.
For the first, you would use fread into a char array and then use atoi to convert to an integer.
For the second, you would use fscanf with the %d specifier to read directly into an int variable.
fscanf does not allocate memory for you. Passing it a random pointer as you have will only cause trouble. (I recommend avoid fscanf).
The question code has a flaw:
char *size;
fscanf(d, "%s[^\n]", size);
Although the above may compile, it will not function as expected at runtime. The problem is that fscanf() needs the memory address of where to write the parsed value. While size is a pointer that can store a memory address, it is uninitialized, and points to no specific memory in the process' memory map.
The following may be a better replacement:
fscanf(d, " %d%*c", &filesize);
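For completeness, here is a hedged sketch of how the whole read_file() could be filled in along the lines of the comments in the question. It is one reasonable way, not the only one: it assumes each remaining line fits in a fixed fgets buffer, and that <stdio.h>, <stdlib.h> and <string.h> are included. strdup() is POSIX rather than standard C; it can be replaced with malloc(strlen(buffer) + 1) plus strcpy() if needed.

char **read_file(char *fname)
{
    printf("Reading %s\n", fname);
    FILE *d = fopen(fname, "r");
    if (!d) return NULL;

    // The first line holds the number of lines that follow
    int filesize = 0;
    if (fscanf(d, " %d%*c", &filesize) != 1 || filesize < 0) {
        fclose(d);
        return NULL;
    }

    // One pointer per line, plus one for the terminating NULL
    char **dict = malloc((filesize + 1) * sizeof *dict);
    if (!dict) {
        fclose(d);
        return NULL;
    }

    char buffer[1024];
    int i = 0;
    while (i < filesize && fgets(buffer, sizeof buffer, d)) {
        buffer[strcspn(buffer, "\n")] = '\0';  // strip the trailing newline
        dict[i] = strdup(buffer);              // copy the line into its own allocation
        if (!dict[i]) break;
        i++;
    }
    dict[i] = NULL;                            // last entry marks the end of the array

    fclose(d);
    printf("Done\n");
    return dict;
}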
So one of the ways to take user-keyboard input in C I know is as follows:
char buffer[LENGTH_KNOWN] = "";
scanf("%s",buffer);
I was wondering if there is any way to take arbitrary length user input. I tried something as follows but I ended up getting a segfault.
char* buffer = "";
scanf("%s",buffer);
printf("%s",buffer);
However this seems to work:
char* buffer = "TEST........keeps going....................."
scanf("%s",buffer);
printf("%s",buffer);
Can anybody explain why I am getting this error and is there any easy way out to scanf arbitrary user input without using malloc and checking buffer overflow?
Thanks in advance!
Actually both are wrong since you can't write to a string literal (in both your examples buffer points to a string literal).
It's impossible to get arbitrarily-long input via a single scanf. You need to get input in a loop and keep adding to a real buffer.
is there any easy way out to scanf arbitrary user input without using
malloc and checking buffer overflow
Use a ready-made function that does it for you. Something like getline(3) (not part of standard C, unfortunately, but available on POSIX systems).
ssize_t getline (char **lineptr, size_t *n, FILE *stream)
This function reads an entire line from stream, storing the text
(including the newline and a terminating null character) in a buffer
and storing the buffer address in *lineptr.
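For instance, a minimal sketch with stdin (getline allocates the buffer itself; you only free it once you are done):

char *buffer = NULL;
size_t size = 0;

if (getline(&buffer, &size, stdin) != -1)
    printf("%s", buffer);
free(buffer);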
char* buffer = "";
scanf("%s",buffer);
printf("%s",buffer);
In the above code you are not allocating any memory for the buffer, so allocate memory for the buffer and then read the values into it.
char *buffer = "..."; is totally wrong. It is not how you allocate memory because the thing in double quotes is a string literal and it is read-only. You can allocate memory like this:
char buffer[1024];
or dynamically using malloc:
char *buffer = malloc(1024);
/* .... */
free(buffer);
You never know how many bytes you might read, and hence cannot allocate the memory exactly in advance. So I guess you need to get the input in a loop and add it to the buffer.
char *buffer;
buffer = malloc(1024);                        // 1024 is the maximum user input length, or use whatever you want
scanf("%1023s", buffer);                      // the width limit keeps scanf from overrunning the buffer
buffer = realloc(buffer, strlen(buffer) + 1); // re-allocate the buffer to fit the user input (+ 1 for the terminating null character)
printf("%s", buffer);