Using fgets to read through file in C

I am trying to read through the given file and then tokenize it. The only problem I'm having is fgets. The file open receives no errors. I have seen this elsewhere on the site, but no matter how I set this up, including setting fileLine to a fixed size like (char fileLine[200]), I get a segmentation fault. Thanks in advance for any help.
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <fcntl.h>
#include <unistd.h>
int main(int argc, char *argv[]){
char *fileName = "0";
char *tokenize, *savePtr;
struct Record *database= malloc(sizeof(database[0]));
int recordNum =0;
char *fileLine = malloc(sizeof(char *));//have replaced with fileline[200] still didnt work
FILE *fd = open(fileName,O_RDWR);
if(fd< 0){
perror("ERROR OPENING FILE");
}
while(fgets(fileLine,200,fd) !=NULL){
printf("%s\n", fileLine);
tokenize = strtok_r(fileLine,",",&savePtr);
while(tokenize != NULL){
//TOKENIZING into a struct
}
}
}

Why use open() with FILE? Use fopen() instead.
#include <stdio.h>
#include <string.h>

int main(int argc, char *argv[]) {
    char *fileName = "test.txt";
    char *tokenize, *savePtr;
    char fileLine[200] = {0}; // init this to be NULL terminated
    FILE *fd = fopen(fileName, "r");
    if (fd == 0) { // error check, equal to 0 as iharob said, not less than 0
        perror("ERROR OPENING FILE");
        return -1;
    }
    while (fgets(fileLine, 200, fd) != NULL) {
        printf("%s\n", fileLine);
        tokenize = strtok_r(fileLine, ",", &savePtr);
        while (tokenize != NULL) {
            tokenize = strtok_r(NULL, ",", &savePtr); // do not forget to pass NULL
            //TOKENIZING into a struct
        }
    }
    return 0;
}
As Weather Vane said, fd < 0 would work if you used open(). However, with fopen(), you should check whether the pointer is NULL, equivalently fd == 0.
A comparison of these file-opening functions can be found in:
open and fopen function
C fopen vs open
The way I think of it is that fopen() is the higher-level interface.
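If you really do want the low-level descriptor from open(), POSIX fdopen() can wrap it in a FILE * so that fgets() still works. A minimal sketch, assuming the same test.txt as above:
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>

int main(void)
{
    int fd = open("test.txt", O_RDONLY); /* here fd really is an int, so fd < 0 is the right check */
    if (fd < 0) {
        perror("open");
        return -1;
    }
    FILE *fp = fdopen(fd, "r");          /* wrap the descriptor for the stdio functions */
    if (fp == NULL) {
        perror("fdopen");
        close(fd);
        return -1;
    }
    char fileLine[200];
    while (fgets(fileLine, sizeof fileLine, fp) != NULL)
        printf("%s", fileLine);
    fclose(fp);                          /* also closes the underlying descriptor */
    return 0;
}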

This line
char *fileLine = malloc(sizeof(char *));
allocates memory for a char * type, 4 or 8 bytes (depending on the platform).
So when you do
fgets(fileLine,200,fd)
it expects there to be 200 bytes of memory available.
Try this:
char *fileLine = malloc(200);
if (fileLine == NULL) { ... } // check for error
which will allocate the memory required.
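Putting it together, a minimal sketch of the corrected allocation, assuming fd is the FILE * returned by fopen() as in the other answers:
char *fileLine = malloc(200);            /* 200 bytes of characters, not sizeof(char *) */
if (fileLine == NULL) {
    perror("malloc");
    return -1;
}
while (fgets(fileLine, 200, fd) != NULL) {
    printf("%s\n", fileLine);
}
free(fileLine);                          /* release the buffer when done */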

You are using open() instead of fopen().
You can't be sure that the file opened correctly, because fopen() does not return an integer but a pointer to a FILE object; on failure it returns NULL, so the right condition is
FILE *file;
file = fopen(filename, "r");
if (file == NULL)
{
    perror("fopen()");
    return -1;
}
In your code, you still go on to call fgets() even when fopen() fails; you should abort the program in that case.
Also, malloc() takes the number of bytes as its size parameter, so if you want fgets() to be limited to reading just count bytes, the malloc() call should be
char *buffer;
size_t count;
count = 200; /* or a value obtained someway */
buffer = malloc(count);
if (buffer == NULL)
{
    fclose(file);
    perror("malloc()");
    return -1;
}
All the problems in your code would be pointed out by the compiler if you enable compilation warnings.
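For example, with GCC (the file name is just a placeholder):
gcc -Wall -Wextra -o myprogram myprogram.c
Among other things, this will warn about initializing a FILE * with the int that open() returns.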

Related

Is using getline() in a while loop bad practice?

If I have a file which contains a '\n' on virtually every line, and I use getline() in a while loop to read the content, is this bad practice?
My understanding is that getline() will be called again for each '\n' reached, and so realloc() will also be called each time, which seems inefficient.
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
int main(int argc, char **argv) {
FILE *fp = fopen("file.txt", "r");
if (fp == NULL) {
perror("Unable to open file: ");
exit(1);
}
char *buffer = NULL;
size_t len = 0;
ssize_t characters;
while ((characters = getline(&buffer, &len, fp)) != -1) {
fputs(buffer, stdout);
}
fclose(fp);
free(buffer);
buffer = NULL;
return 0;
}
There is no problem calling getline in a loop as you do. As a matter of fact, this is exactly the intended use case, and it is how the example in the man page uses it.
realloc is only called if the array allocated so far is too short for the current line, so it will be called rarely, and you can lower the number of calls even further by allocating an initial array and setting its length in the len variable.
Since getline returns the number of characters read, you could write the line using fwrite instead of fputs. Note however that the behavior is subtly different: fwrite will output lines read from the file even if they contain embedded null bytes, whereas fputs would stop at the first null byte. This is usually not an issue, as text files rarely contain null bytes.
Here is a modified version:
#include <errno.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(int argc, char **argv) {
    FILE *fp = fopen("file.txt", "r");
    if (fp == NULL) {
        fprintf(stderr, "Unable to open file %s: %s\n",
                "file.txt", strerror(errno));
        return 1;
    }
    char *buffer = NULL;
    size_t len = 0;
    ssize_t characters;
#if 1
    // optional code to show how to start with a preallocated buffer
    if ((buffer = malloc(256)) != NULL)
        len = 256;
#endif
    while ((characters = getline(&buffer, &len, fp)) != -1) {
        fwrite(buffer, 1, characters, stdout);
    }
    fclose(fp);
    free(buffer);
    return 0;
}

Error when passing filename by reference (C)

I need to pass a char pointer and a filename by reference, and allocate a size for the pointer in that same function (in the program, the size to allocate is read from the file).
I created the file and also checked it in a hex editor, and it does exist.
I have tried building it with GCC and under Cygwin; it doesn't seem to be a compiler-specific problem.
The following code is just the barebones, but still contains the same error:
GDB tells me it's a segfault caused by the file not existing. ("No such file or directory").
Where did I go wrong?
#include <stdlib.h>
#include <stdio.h>
#include <string.h>
#define SIZE 50
#define NAME "name"
int loadFile(char** board, char* filename);
int main(int argc, char const *argv[])
{
char* board;
char* filename = NAME;
loadFile(&board, filename);
board[49] = '\0';
return 0;
}
int loadFile(char** board, char* filename)
{
FILE* source = fopen(filename, "rb");
if(source == NULL)
{
printf("Error loading file.\n");
return -1;
}
*board = malloc(SIZE);
if(fread(*board, sizeof(char), SIZE, source) != SIZE)
{
fclose(source);
printf("Error loading file.\n");
return -2;
}
return 0;
}
Following the many comments with suggestions: I have seen strange things happen when a pointer to a constant string literal is passed:
#define NAME "name"
char* filename = NAME;
This assigns filename a pointer to a constant string literal. Please try:
#define NAME "name"
char filename[] = NAME;
Check your status values:
rv = loadFile(&board, filename);
if (rv == -1) {
perror("Error opening file");
return 1;
}
This should (hopefully) give you an error message you can troubleshoot further. Using a const char * as an argument to fopen() should be OK.
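Putting the suggestions together, a minimal sketch of main() with the status checked and the allocated buffer released (SIZE, NAME and loadFile() are as defined in the question):
int main(int argc, char const *argv[])
{
    char *board = NULL;
    char filename[] = NAME;       /* writable array instead of a pointer to a literal */
    int rv = loadFile(&board, filename);
    if (rv != 0) {
        fprintf(stderr, "loadFile failed with status %d\n", rv);
        return 1;
    }
    board[SIZE - 1] = '\0';
    free(board);                  /* loadFile() allocated it with malloc */
    return 0;
}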

segfaulting with fgets, even though fopen doesn't return NULL in C

I've looked all over and made sure there were no warnings, but my code to replace text with digits keeps segfaulting. Any help?
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
int main(int argc , char *argv[])
{
FILE *file;
file = fopen(argv[1] , "r");
char *line = malloc(1024);
if(file != NULL)
{
while(fgets(line , sizeof(line) , file))
{
//things
}
}
else
{
printf("ERROR: %s NOT AVAILABLE" , argv[1]);
}
return 0;
}
Replace:
char *line = malloc(1024);
with:
char line[1024] = {0};
or:
char line[1024];
if you don't want to clear out the line buffer.
Otherwise, you end up with two problems.
First:
sizeof(line)
returns the size of the pointer (4 or 8 bytes). That's not what you want.
Second: You have a memory leak because you don't free the line pointer at the end.
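A quick way to see the sizeof difference from the first point (a fragment; the sizes shown are typical for a 64-bit platform):
char stack_line[1024];
char *heap_line = malloc(1024);
printf("%zu\n", sizeof(stack_line)); /* 1024: the whole array */
printf("%zu\n", sizeof(heap_line));  /* 8: just the pointer */
free(heap_line);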
You can use malloc if you want, but then you will want to write clean(er) code to do it. You might do something like:
#define MAX_LINE_LENGTH 1024

/* ... */

char *line = NULL;
line = malloc(MAX_LINE_LENGTH);
if (!line) {
    fprintf(stderr, "Error: Could not allocate space for line buffer!\n");
    exit(EXIT_FAILURE);
}

FILE *file = NULL;

/* Avoid undefined behavior by making sure filename argument holds a value */
if (argv[1])
    file = fopen(argv[1], "r");

if (file != NULL) { /* You could also do "if (file) { ... }" */
    while (fgets(line, MAX_LINE_LENGTH, file)) {
        /* ... */
    }
}

free(line);
line = NULL;
As a habit, explicitly initialize pointers to NULL, and check that they actually hold a value before using them. Welcome to C!

String search C program for command prompt

I wrote a C program that searches for a string. The problem is that my MyStrstr() function doesn't work from the command prompt; it only works in the IDE. Can anyone advise me how to fix the code so it works from the command prompt? With regards...
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
#define ARGUMENT_COUNT 3
#define FILE_INDEX 2
#define SEARCH_INDEX 1
#define BUFFER 256
#define SUCCESS 0
#define ERRCODE_PARAM 1
#define ERRCODE_FILENAME 2
#define MSG_USAGE "String Search Program === EXER5 === by Newbie\nUsage: %s Search_String fileName"
#define MSG_ERROR "Can not open file. [%s]"
char* MyStrstr(char* pszSearchString, char* pszSearchWord);
int main(int argc, char* argv[])
{
FILE* pFile = NULL;
char szData[BUFFER];
char* pszCutString = NULL;
if(argc != ARGUMENT_COUNT) {
printf(MSG_USAGE, argv[0]);
return ERRCODE_PARAM;
}
pFile = fopen(argv[FILE_INDEX], "r");
if(pFile == NULL) {
printf(MSG_ERROR, argv[FILE_INDEX]);
return ERRCODE_FILENAME;
}
pszCutString = MyStrstr(szData, argv[SEARCH_INDEX]);
if(pszCutString != NULL) {
printf("%s", pszCutString);
}
fclose(pFile);
pFile = NULL;
return SUCCESS;
}
char* MyStrstr(char* pszSearchString, char* pszSearchWord) {
int nFcount = 0;
int nScount = 0;
int nSearchLen = 0;
int nIndex = 0;
char* pszDelString = NULL;
char cSLen = 0;
size_t len = 0;
if(pszSearchString == NULL || pszSearchWord == NULL) {
return NULL;
}
while(pszSearchWord[nSearchLen] != '\0') {
nSearchLen++;
}
if(nSearchLen <= 0){
return pszSearchString;
}
cSLen = *pszSearchWord++;
if (!cSLen) {
return (char*) pszSearchString;
}
len = strlen(pszSearchWord);
do {
char cMLength;
do {
cMLength = *pszSearchString++;
if (!cMLength)
return (char *) 0;
} while (cMLength != cSLen);
} while (strncmp(pszSearchString, pszSearchWord, len) != 0);
return (char *) (pszSearchString - 1);
}
You want to open a file, search the contents of that file for a string and return/print that. You are instead doing:
char szData[256]; // <-- making an uninitialized buffer
char* pszCutString = NULL;
pFile = fopen(argv[2], "r"); // <-- Opening a file
pszCutString = MyStrstr(szData, argv[1]); // <-- searching the buffer
if(pszCutString != NULL) {
printf("%s", pszCutString);
}
fclose(pFile); // <-- Closing the file
So you never fill your buffer szData with the contents of the file named in argv[2]. You're trying to search an uninitialized buffer for a string; you're lucky the result is just "no output comes out".
You need to take the contents of the file in argv[2] and place them in the buffer szData, then do the search. This could be accomplished by adding a call to a function like read() or fscanf().
Note 1:
I assume when you say this "worked" in the IDE, the code was a little different and you weren't using the command line arguments.
Note 2:
you should also check that fopen() worked before trying to read from or close pFile, and if your file may be larger than 256 characters you will need to change your code: either use a dynamically sized string, loop the buffer fills (but then you have to worry about breaking a word apart), or use some other mechanism to cover the full file.
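A minimal sketch of that missing step, swapping in fgets() (rather than read() or fscanf()) since the code already has a FILE *; it reuses the question's names and searches line by line rather than the whole file at once:
while (fgets(szData, BUFFER, pFile) != NULL) {
    pszCutString = MyStrstr(szData, argv[SEARCH_INDEX]);
    if (pszCutString != NULL) {
        printf("%s", pszCutString);
    }
}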

String arrays in C

I wrote some code to read files.
What is wrong with the following code? I always get the last filename if I print any array item.
#include <stdio.h>
#include <string.h>
char **get_files()
{
FILE *fp;
int status;
char file[1000];
char **files = NULL;
int i = 0;
/* Open the command for reading. */
fp = popen("ls", "r");
if (fp == NULL) {
printf("Failed to run command\n" );
//exit;
}
while (fgets(file, sizeof(file)-1, fp) != NULL) {
files = (char **)realloc(files, (i + 1) * sizeof(char *));
//files[i] = (char *)malloc(sizeof(char));
files[i] = file;
i++;
}
printf("%s", files[0]);
return files;
}
int main()
{
char **files = NULL;
int i =0 ;
files = get_files("");
}
you should use
files[i] = strdup(file);
instead of
files[i] = file;
The second version only makes files[i] point to your reading buffer, which is always the same. With the next fgets, you'll overwrite the contents of file and thus the contents of files[i], which actually points to the same location in memory.
In fact, at the end, all of files[0]..files[n] will point to the same location as file does.
With strdup(..) you're allocating a new buffer and copying the contents of file there.
pclose is missing for your popen. popen is only POSIX, not C89/C99.
No memory-alloc check in the example, it's your work ;-)
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

char **get_files(char **list)
{
    FILE *fp;
    char file[1000];
    int i = 1;
    /* Open the command for reading. */
    fp = popen("ls -l", "r");
    if (!fp)
        perror("Failed to run command\n"), exit(1);
    while (fgets(file, sizeof file, fp)) {
        /* grow the array, shift the existing entries (including the NULL
           sentinel at the end) up by one, and put a copy of the new line in front */
        list = realloc(list, ++i * sizeof *list);
        memmove(list + 1, list, (i - 1) * sizeof *list);
        *list = strcpy(malloc(strlen(file) + 1), file);
    }
    pclose(fp);
    return list;
}

int main(void)
{
    char **files = get_files(calloc(1, sizeof *files)), **start = files;
    while (*files) {
        puts(*files);
        free(*files++);
    }
    free(start);
    return 0;
}
Calling popen() on 'ls' is a bad way to do this. Take a look at opendir(), readdir(), rewinddir() and closedir(); see the sketch after these points.
You are reusing the file array. After you've read a filename, you need to use strdup to take a copy of it, and put that copy into the files array. Otherwise, every element in files just points to the same string.
Your array in char file[1000] is single dimension, regardless of how you re-allocate memory (unless I'm missing something obvious). If you are reading an unknown number of files, then a linked list is probably the best way to go about this.
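A minimal sketch of the opendir()/readdir() approach mentioned in the first point, copying each name with strdup() so it survives the next readdir() call (allocation checks kept short; "." is just an example directory):
#include <dirent.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    DIR *dir = opendir(".");
    if (dir == NULL) {
        perror("opendir");
        return 1;
    }
    char **files = NULL;
    size_t n = 0;
    struct dirent *entry;
    while ((entry = readdir(dir)) != NULL) {
        files = realloc(files, (n + 1) * sizeof *files);
        files[n++] = strdup(entry->d_name);   /* copy, so the name survives the next readdir() */
    }
    closedir(dir);
    for (size_t i = 0; i < n; i++) {
        puts(files[i]);
        free(files[i]);
    }
    free(files);
    return 0;
}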
