Building a basic shell, more specifically using execvp() - c

In my program I am taking user input and parsing it into a 2d char array. The array is declared as:
char parsedText[10][255] = {{""},{""},{""},{""},{""},
{""},{""},{""},{""},{""}};
and I am using fgets to grab the user input and sscanf to parse it. This all works as I think it should.
After this I want to pass parsedText into execvp: parsedText[0] should contain the path, and if any arguments are supplied they should be in parsedText[1] through parsedText[9].
What is wrong with execvp(parsedText[0], parsedText[1])?
One thing probably worth mentioning is that if I only supply a command such as "ls" without any arguments it appears to work just fine.
Here is my code:
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include "308shell.h"
int main( int argc, char *argv[] )
{
    char prompt[40] = "308sh";
    char text[40] = "";
    char parsedText[10][40] = {{""},{""},{""},{""},{""},
                               {""},{""},{""},{""},{""}};

    // Check for arguments to change the prompt.
    if(argc >= 3){
        if(!(strcmp(argv[1], "-p"))){
            strcpy(prompt, argv[2]);
        }
    }
    strcat(prompt, "> ");

    while(1){
        // Display the prompt.
        fputs(prompt, stdout);
        fflush(stdout);

        // Grab user input and parse it into parsedText.
        mygetline(text, sizeof text);
        parseInput(text, parsedText);

        // Check if the user wants to exit.
        if(!(strcmp(parsedText[0], "exit"))){
            break;
        }

        execvp(parsedText[0], parsedText[1]);
        printf("%s\n%s\n", parsedText[0], parsedText[1]);
    }
    return 0;
}
char *mygetline(char *line, int size)
{
    if ( fgets(line, size, stdin) )
    {
        char *newline = strchr(line, '\n');   /* check for trailing '\n' */
        if ( newline )
        {
            *newline = '\0';                  /* overwrite the '\n' with a terminating null */
        }
    }
    return line;
}
char *parseInput(char *text, char parsedText[][40]){
    char *ptr = text;
    char field [ 40 ];
    int n;
    int count = 0;

    while (*ptr != '\0') {
        int items_read = sscanf(ptr, "%s%n", field, &n);
        strcpy(parsedText[count++], field);
        field[0]='\0';
        if (items_read == 1)
            ptr += n;      /* advance the pointer by the number of characters read */
        if ( *ptr != ' ' ) {
            strcpy(parsedText[count], field);
            break;         /* didn't find an expected delimiter, done? */
        }
        ++ptr;             /* skip the delimiter */
    }
}

execvp takes a pointer to a pointer (char **), not a pointer to an array. It's supposed to be a pointer to the first element of an array of char * pointers, terminated by a null pointer.
Edit: Here's one (not very good) way to make an array of pointers suitable for execvp:
char argbuf[10][256] = {{0}};
char *args[10] = { argbuf[0], argbuf[1], argbuf[2], /* ... */ };
Of course in the real world your arguments probably come from a command line string the user entered, and they probably have at least one character (e.g. a space) between them, so a much better approach would be to either modify the original string in-place, or make a duplicate of it and then modify the duplicate, adding null terminators after each argument and setting up args[i] to point to the right offset into the string.
You could instead do a lot of dynamic allocation (malloc) every step of the way, but then you have to write code to handle every possible point of failure. :-)
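For illustration, here is a minimal sketch of that in-place approach (the run_command helper, the fixed cap of 10 arguments, and the fork()/waitpid() handling are my own additions for the example, not part of the question's code):

#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/wait.h>

/* Tokenize `line` in place, build a NULL-terminated argument vector,
 * and run the command in a child process so the shell itself survives. */
static void run_command(char *line)
{
    char *args[11];                /* up to 10 arguments plus the terminating NULL */
    int n = 0;

    for (char *tok = strtok(line, " \t\n"); tok != NULL && n < 10; tok = strtok(NULL, " \t\n"))
        args[n++] = tok;           /* each args[i] points into `line` */
    args[n] = NULL;                /* execvp requires a NULL terminator */

    if (n == 0)
        return;                    /* blank line, nothing to run */

    pid_t pid = fork();
    if (pid == 0) {                /* child: replace its image with the command */
        execvp(args[0], args);
        perror("execvp");          /* only reached if execvp fails */
        _exit(127);
    } else if (pid > 0) {
        waitpid(pid, NULL, 0);     /* parent: wait for the command to finish */
    }
}

Note that calling execvp directly from the shell loop, as in the question, would replace the shell process itself on the first successful command.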

Related

Segmentation Fault With Strcmp With One Input

I am having an issue with the following code.
I have a global variable
char tokens[512][80];
Along with code:
int main(int argc, char** argv) {
    char *input = malloc(sizeof(char) * 80);
    while (1) {
        printf("mini-shell>");
        fgets(input, 80, stdin);
        parse(input);
        if (strcmp(tokens[0], "cd") == 0) {
            cd();
        }
        else if (strcmp(tokens[0], "exit") == 0) {
            exit(1);
        }
    }
}

void parse(char str[]) {
    int index = 0;
    char* str_ptr = strtok(str, " ");
    while (str_ptr != NULL) {
        strcpy(tokens[index], str_ptr);
        str_ptr = strtok(NULL, " \0\r\n");
        //printf("%d\n", index);
        index = index + 1;
    }
}
I found that if I enter exit for stdin I get a Segmentation fault, but if I enter cd .. for stdin I don't. Why is this so?
We don't know what the definition of the cd() function is, but there are a number of things that you may wish to consider in this program.
First, I don't believe there's any benefit to dynamically allocating 80 bytes of memory for the input buffer when you can easily do so automatically on the stack with char input[80]; - this is free and easy and requires no deallocation when you're done.
If you do this, you derive the size with fgets(input, sizeof input, stdin) where if you change the size of your input line from 80 to some other number, you only have to change it once: the sizeof on an array pulls the size directly.
Your parse() routine needs a little bit of help also. It's a really good idea to declare the function via the extern as shown so that when the compiler sees you call the function in the loop (right after the fgets), it knows the parameter and return types. Otherwise it has to make assumptions.
Because parse() is splitting apart the line you read from input, it's not required to copy the strings to some other place, so you can turn tokens from a multi-dimensional array into a simple array of pointers. As you run strtok() through the line to split up the parameters, you can store just the pointer, knowing that they will be pointing to stable data until the next fgets().
Also: your code does not strictly require or use this, but adding a NULL pointer to the end of the tokens list is a really good idea: otherwise, how does the caller know how many parameters were actually entered? This code checks whether the user entered just a blank line or not.
We've also changed the loop around a little bit so that the strtok() call appears just once instead of twice, including the \n in the delimiter set as noted in the comments.
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

char *tokens[512];

extern void parse(char *str);

int main(int argc, char** argv) {
    char input[80];

    while (1) {
        printf("mini-shell> "); fflush(stdout);   // make sure user sees prompt
        fgets(input, sizeof input, stdin);
        parse(input);

        if (tokens[0] == NULL) continue;          // user entered blank line

        if (strcmp(tokens[0], "cd") == 0) {
            cd();
        }
        else if (strcmp(tokens[0], "exit") == 0) {
            exit(1);
        }
    }
}

void parse(char *str) {
    int index = 0;
    char* str_ptr;

    while ( (str_ptr = strtok(str, " \n")) != NULL)
    {
        tokens[index++] = str_ptr;
        str = NULL;   // for next strtok() loop
    }
    tokens[index] = NULL;
}

C remove special characters from string

I am very new to C, and I have created a function that removes special characters from a string and returns a new string (without the special characters).
At first glance, this seemed to be working well, I now need to run this function on the lines of a (huge) text file (1 Million sentences). After a few thousand lines/sentences (About 4,000) I get a seg fault.
I don't have much experience with memory allocation and strings in C, I have tried to figure out what the problem with my code is, unfortunately without any luck.
Here is the code:
#include <stdio.h>
#include <stdlib.h>
#include <ctype.h>
char *preproccessString(char *str) {
    // Create a new string of the size of the input string, so this might be
    // bigger than needed but should never be too small
    char *result = malloc(sizeof(str));

    // Array of allowed chars with a 0 on the end to know when the end of the
    // array is reached, I don't know if there is a more elegant way to do this
    // Changed from array to string for sake of simplicity
    char *allowedCharsArray = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";

    // Initalize two integers
    // i will be increased for every char in the string
    int i = 0;
    // j will be increased every time a new char is added to the result
    int j = 0;

    // Loop over the input string
    while (str[i] != '\0') {
        // l will be increased for every char in the allowed chars array
        int l = 0;
        // Loop over the chars in the allowed chars array
        while (allowedCharsArray[l] != '\0') {
            // If the char (From the input string) currently under consideration
            // (index i) is present in the allowed chars array
            if (allowedCharsArray[l] == toupper(str[i])) {
                // Set char at index j of result string to uppercase version of
                // char currently under consideration
                result[j] = toupper(str[i]);
                j++;
            }
            l++;
        }
        i++;
    }
    return result;
}
Here is the rest of the program, I think the problem is probably here.
int main(int argc, char *argv[]) {
    char const * const fileName = argv[1];
    FILE *file = fopen(fileName, "r");
    char line[256];

    while (fgets(line, sizeof(line), file)) {
        printf("%s\n", preproccessString(line));
    }
    fclose(file);
    return 0;
}
You have several problems.
You're not allocating enough space. sizeof(str) is the size of a pointer, not the length of the string. You need to use
char *result = malloc(strlen(str) + 1);
+ 1 is for the terminating null byte.
You didn't add a terminating null byte to the result string. Add
result[j] = '\0';
before return result;
Once you find that the character matches an allowed character, there's no need to keep looping through the rest of the allowed characters. Add break after j++.
Your main() function is never freeing the results of preprocessString(), so you might be running out of memory.
while (fgets(line, sizeof(line), file)) {
    char *processed = preproccessString(line);
    printf("%s\n", processed);
    free(processed);
}
You could address most of these problems if you have the caller pass in the result string, instead of allocating it in the function. Just use two char[256] arrays in the main() function.
int main(int argc, char *argv[])
{
    char const* const fileName = argv[1];
    FILE* file = fopen(fileName, "r");
    char line[256], processed[256];

    while (fgets(line, sizeof(line), file)) {
        preprocessString(line, processed);
        printf("%s\n", processed);
    }
    fclose(file);
    return 0;
}
Then just change the function so that the parameters are:
void preprocessString(const char *str, char *result)
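A possible body for that version (this sketch is my own illustration, not the answerer's exact code):

#include <ctype.h>
#include <string.h>

/* Copy only the allowed (alphabetic) characters of str into result, uppercased.
 * result must be able to hold at least strlen(str) + 1 bytes, which holds here
 * because both buffers are 256 chars. */
void preprocessString(const char *str, char *result)
{
    const char *allowed = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";
    size_t j = 0;

    for (size_t i = 0; str[i] != '\0'; i++) {
        int up = toupper((unsigned char)str[i]);
        if (strchr(allowed, up))
            result[j++] = (char)up;
    }
    result[j] = '\0';   /* no malloc, so nothing for the caller to free */
}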
A good rule of thumb is to make sure there is one free for every malloc/calloc call.
Also, a good tool to keep note of for the future is Valgrind. It's very good at catching these kinds of errors.
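For example, assuming the program is built as a.out and the data file is named sentences.txt (both names are just placeholders), a typical check looks like:

gcc -g yourprogram.c
valgrind --leak-check=full ./a.out sentences.txt

Valgrind will then report any blocks that were allocated but never freed, along with where they were allocated.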
There are some major issues in your code:
the amount of memory allocated is incorrect, sizeof(str) is the number of bytes in a pointer, not the length of the string it points to, which would also be incorrect. You should write char *result = malloc(strlen(str) + 1);
the memory allocated in preproccessString is never freed, causing memory leaks and potentially for the program to run out of memory on very large files.
you do not set a null terminator at the end of the result string
Lesser issues:
you do not check if filename was passed nor if fopen() succeeded.
there is a typo in preproccessString, it should be preprocessString
you could avoid memory allocation by passing a properly sized destination array.
you could use isalpha instead of testing every letter
you should cast the char values as unsigned char when passing them to toupper because char may be a signed type and toupper is undefined for negative values except EOF.
there are too many comments in your source file, most of which are obvious but make the code less readable.
Here is a modified version:
#include <ctype.h>
#include <errno.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

// transform the string in `str` into buffer dest, keeping only letters and uppercasing them.
char *preprocessString(char *dest, const char *str) {
    int i, j;
    for (i = j = 0; str[i] != '\0'; i++) {
        if (isalpha((unsigned char)str[i]))
            dest[j++] = toupper((unsigned char)str[i]);
    }
    dest[j] = '\0';
    return dest;
}

int main(int argc, char *argv[]) {
    char line[256];
    char dest[256];
    char *filename;
    FILE *file;

    if (argc < 2) {
        fprintf(stderr, "missing filename argument\n");
        return 1;
    }
    filename = argv[1];
    if ((file = fopen(filename, "r")) == NULL) {
        fprintf(stderr, "cannot open %s: %s\n", filename, strerror(errno));
        return 1;
    }
    while (fgets(line, sizeof(line), file)) {
        printf("%s\n", preprocessString(dest, line));
    }
    fclose(file);
    return 0;
}
The following proposed code:
cleanly compiles
performs the desired functionality
properly checks for errors
properly checks for length of input string parameter
makes use of the fact that strchr() also checks the terminating NUL byte
limits scope of visibility of local variables
the calling function is expected to properly clean up by passing the returned value to free()
the calling function is expected to check the returned value for NULL
informs compiler the user knows and accepts when an implicit conversion is made.
moves allowedCharsArray to static storage so it does not have to be re-initialized on each call, and marks it 'const' to help the compiler catch errors
and now the proposed code: (note: edited per comments)
#include <stdio.h>
#include <stdlib.h>
#include <ctype.h>
#include <string.h>
char *preproccessString(char *str)
{
    // Create a new string of the size of the input string, so this might be
    // bigger than needed but should never be too small
    char *result = calloc( sizeof( char ), strlen(str)+1);
    if( !result )
    {
        perror( "calloc failed" );
        return NULL;
    }

    // Array of allowed chars
    static const char *allowedCharsArray = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";

    // Loop over the input string
    for( int j=0, i=0; str[i]; i++)
    {
        if( strchr( allowedCharsArray, (char)toupper( str[i] ) ) )
        {
            // Set char at index j of result string to uppercase version of
            // char currently under consideration
            result[j] = (char)toupper(str[i]);
            j++;
        }
    }
    return result;
}
I think the problem is that you are using malloc, which allocates memory from the heap, and since you are calling this function again and again without freeing the results, you are running out of memory.
To solve this issue you have to call the free() function on the pointer returned by your preprocessString function.
In your main block
char *result=preprocessString(inputstring);
//Do whatever you want to do with this result
free(result);

C-divide input string - segmentation error

I want my code to work in this way:
input: a[] = "create /dir/bar"
and save it into these strings:
b[] = create
c[] = /dir/bar
There is also a case in which I save another string, for example:
a[] = write /foo/bar "test"
b[] = write
c[] = /foo/bar
d[] = test (without the "")
Here is my code:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define SPACE ' '

void divide(char a[], char b[], char c[], char d[]);

int main(int argc, char const *argv[]) {
    char a[50+1];
    char b[50+1];
    char c[50+1];
    char d[50+1];
    int i;

    scanf("%s\n", a);
    divide(a, b, c, d);
    for(i=0; b[i]!='\0'; i++)
        printf("%s %s %s \n", b, c, d);
    return 0;
}

void divide(char a[], char b[], char c[], char d[]){
    int i, j;

    for(i=0; a[i]!=SPACE; i++)
        b[i]=a[i];
    b[i]='\0';

    for(; a[i]==SPACE; i++)
        ;
    for(j=0; a[i]!='\0'; i++, j++)
        c[j]=a[i];
    c[j]='\0';

    for(; a[i]==SPACE; i++)
        ;
    if(a[i]=='"'){
        i++;
        for(j=0; a[i]!='"'; i++)
            d[j]=a[i];
        d[j]='\0';
        return;
    }
}
but it does not work: I get a segmentation fault right after the program reads the input. Where is the problem?
I must not use malloc, because it takes too much time (I have to process thousands of these lines) and it does not respect a limit. (This is for a university project.)
You may be making this a little more difficult than it needs to be. Yes, you can tokenize a string by repeated calls to sscanf or with repeated reads with scanf, but the C library provides a tool to tokenize words from a line of text. Smartly enough named strtok.
You simply declare a constant string holding the delimiters you wish to break the words on (e.g. delims = " \t"; to break the words on space or tab), then call strtok (str, delims) to return the first token (word), and then loop over repeated calls to strtok (NULL, delims) to parse the remaining words (or until you reach your max of 3 words).
(note the first call to strtok uses str as the first parameter, while all subsequent calls use NULL)
This is a far more flexible way to handle an unknown number of tokens in a string.
Instead of using a[], b[], c[], etc.. consider using just a single buf[] to read the line of input into, and then an array of strings to hold the parameters (which allows you to use an index variable during your loops over strtok to assign and copy the correct string to the associated index).
Don't use void as a return in circumstances like this. Why not use a meaningful return (like the number of parameters in the line of text). That way, you know how many were read (or tokenized) in your divide function. Give it a return that can provide useful information, e.g.
size_t divide (char *buf, char (*params)[MAXC+1]);
Which will now return a size_t type containing the number of parameters that result from each call to divide.
Putting it altogether, (and using fgets to read the entire line of input), you could do something like the following:
#include <stdio.h>
#include <string.h>

enum { MAXP = 3, MAXC = 50 };   /* max parameters & chars */

size_t divide (char *buf, char (*params)[MAXC+1]);

int main (void) {

    char buf[MAXC * 4 + 1] = "";
    char params[MAXP][MAXC + 1];    /* array to hold 3 parameters */
    size_t i, len, nparams = 0;

    /* use fgets for line-oriented user input */
    printf ("\nenter commands: ");
    if (!fgets (buf, sizeof buf, stdin)) {
        fprintf (stderr, "error: insufficient input.\n");
        return 1;
    }

    len = strlen (buf);           /* get length */
    if (buf[len - 1] == '\n')     /* validate last char is '\n' */
        buf[--len] = 0;           /* overwrite with nul-terminating char */
    else {                        /* short read -- handle error */
        fprintf (stderr, "error: incomplete input read.\n");
        return 1;
    }

    nparams = divide (buf, params);

    for (i = 0; i < nparams; i++)
        printf ("parameter[%zu] : %s\n", i, params[i]);

    return 0;
}

/* divide using strtok */
size_t divide (char *buf, char (*params)[MAXC+1])
{
    char *delims = " \t",           /* delimiters for strtok */
         *p = buf;                  /* pointer to buf */
    size_t n = 0;                   /* var to return number of params */

    p = strtok (buf, delims);       /* tokenize first parameter */

    while (p) {                     /* loop until all words exhausted or limit reached */
        strncpy (params[n++], p, MAXC);   /* copy token to params array */
        if (n == MAXP)              /* check if limit reached */
            break;
        p = strtok (NULL, delims);  /* get next token */
    }

    return n;   /* return the number of parameters found */
}
Example Use/Output
$ ./bin/splitparams
enter commands: create /dir/bar
parameter[0] : create
parameter[1] : /dir/bar
$ ./bin/splitparams
enter commands: write /foo/bar "test"
parameter[0] : write
parameter[1] : /foo/bar
parameter[2] : "test"
Or providing a bunch of extra words (to validate handling of only 3)
$ ./bin/splitparams
enter commands: write /foo/bar "test" and some more stuff
parameter[0] : write
parameter[1] : /foo/bar
parameter[2] : "test"
If you run this simple program
#include <stdio.h>

int main(int argc, char const *argv[]) {
    char a[50+1];
    scanf("%s\n", a);
    printf("|%s|\n", a);
    return 0;
}
and give the input "create foo", you'll get the output
|create|
As you can see you only got the first word, i.e. "create", instead of the expected "create foo" as
scanf("%s\n", a);
will only give the first word. Consequently your divide function will fail. Instead of scanf you could do
fgets(a, 51, stdin);
to make sure the whole input is read into array a.
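One thing to keep in mind (my own note, using the a, b, c, d arrays from the question's main and the <string.h> header already included there): fgets keeps the trailing newline, so you would strip it before calling divide, for example:

if (fgets(a, sizeof a, stdin) != NULL) {
    a[strcspn(a, "\n")] = '\0';   /* remove the newline fgets stores */
    divide(a, b, c, d);
}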
In general your program lacks a lot of range checking and input validation. You should add that.
Another problem I see is that in case the input is
create /dir/bar
you never initialize the string d but you still print it in main. That is undefined behaviour.
Try:
char d[50+1];
d[0] = '\0'; // Add this line

Processing outputs of multiple inputs in C

This may not be trivial, but I would like to know the best way to process multiple outputs. For example:
Input
First line of input will contain a number T = number of test cases. Following lines will contain a string each.
Output
For each string, print on a single line, "UNIQUE" - if the characters are all unique, else print "NOT UNIQUE"
Sample Input
3
DELHI
london
#include<iostream>
Sample Output
UNIQUE
NOT UNIQUE
NOT UNIQUE
So how can I accomplish outputs like that? My code so far is:
int main(int argc, char *argv[])
{
    int inputs, count=0;
    char str[100];
    char *ptr;

    scanf("%d",&inputs);
    while(inputs-- >0)
    {
        scanf("%s",str);
        for(ptr=str; *ptr!='\0'; ptr++)
        {
            if( *ptr == *(ptr+1))
            {
                count++;
            }
        }
        if(count>0)
        {
            printf("NOT UNIQUE");
        }
        else
        {
            printf("UNIQUE");
        }
    }
}
But the above will obviously print the output after each input, whereas I want the output only after all the inputs have been entered: if the user enters 3, the user then has to give 3 strings, and only after that should the program report whether each of the given strings is unique or not. So I want to know how I can achieve the result given in the problem. Another thing I want to know: I am using an array of 100 chars, which can hold a string of up to 100 characters, but what do I have to do if I want to handle strings with no length limit? Just declaring char *str is no good, so what should I do?
Hope this helps:
#include <stdio.h>

int main(int argc, char *argv[])
{
    int inputs, count=0;
    char str[20];
    char *ptr;
    char *dummy;

    scanf("%d",&inputs);
    while(inputs-- >0)
    {
        count = 0;   /* reset for each test case */
        scanf("%s",str);
        for(ptr=str; *ptr!='\0'; ptr++)
        {
            for(dummy=ptr+1; *dummy != '\0'; dummy++)
            {
                if( *ptr == *dummy)
                {
                    count=1;
                }
            }
            if(count == 1)
                break;
        }
        if(count>0)
        {
            printf("NOT UNIQUE\n");
        }
        else
        {
            printf("UNIQUE\n");
        }
    }
}
If you want to save stuff for later use, you must store it somewhere. The example below stores up to 10 lines in buf and then points str to the current line:
#include <stdlib.h>
#include <stdio.h>
#include <string.h> /* for strlen */
#include <ctype.h> /* for isspace */
int main(int argc, char *argv[])
{
    int ninput = 0;
    char buf[10][100];   /* storage for 10 strings */
    char *str;           /* pointer to current string */
    int i;

    printf("Enter up to 10 strings, blank to end input:\n");
    for (i = 0; i < 10; i++) {
        int l;
        str = buf[i];

        /* read line and break on end-of-file (^D) */
        if (fgets(str, 100, stdin) == NULL) break;

        /* delete trailing newline & spaces */
        l = strlen(str);
        while (l > 0 && isspace(str[l - 1])) l--;
        str[l] = '\0';

        /* break loop on empty input */
        if (l == 0) break;

        ninput++;
    }

    printf("Your input:\n");
    for (i = 0; i < ninput; i++) {
        str = buf[i];
        printf("[%d] '%s'\n", i + 1, str);
    }
    return 0;
}
Note the two separate loops for input and output.
I've also rejiggled your input. I'm not very fond of fscanf; I prefer to read input line-wise with fgets and then analyse the line with strtok or sscanf. The advantage over fscanf is that your strings may contain white-space. The drawback is that you have a newline at the end which you usually don't want and have to "chomp".
If you want to allow for longer strings, you should use dynamic allocation with malloc, although I'm not sure if it is useful when reading user input from the console. Tackle that when you have understood the basics of fixed-size allocation on the stack.
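On POSIX systems, one option along those lines is getline(), which allocates and grows the line buffer for you. A minimal sketch (my own illustration, not part of the original answer; some systems may require #define _POSIX_C_SOURCE 200809L before the includes):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char *line = NULL;    /* getline allocates and resizes this buffer */
    size_t cap = 0;
    ssize_t len;

    while ((len = getline(&line, &cap, stdin)) != -1) {
        /* len is the number of characters read, including the newline */
        printf("read %zd chars: %s", len, line);
    }
    free(line);           /* a single free releases the grown buffer */
    return 0;
}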
Other people have already pointed you to the error in your check for uniqueness.

Parsing text in C

I have a file like this:
...
words 13
more words 21
even more words 4
...
(General format is a string of non-digits, then a space, then any number of digits and a newline)
and I'd like to parse every line, putting the words into one field of the structure, and the number into the other. Right now I am using an ugly hack of reading the line while the chars are not numbers, then reading the rest. I believe there's a clearer way.
Edit: You can use pNum-buf to get the length of the alphabetical part of the string, and use strncpy() to copy that into another buffer. Be sure to add a '\0' to the end of the destination buffer. I would insert this code before the pNum++.
int len = pNum - buf;
strncpy(newBuf, buf, len);
newBuf[len] = '\0';
You could read the entire line into a buffer and then use:
char *pNum;
if (pNum = strrchr(buf, ' ')) {
    pNum++;
}
to get a pointer to the number field.
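Putting the two fragments together, a sketch of a complete line parse might look like this (buf, newBuf, and the 256-byte sizes are just illustrative):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    char buf[256], newBuf[256];

    while (fgets(buf, sizeof buf, stdin)) {
        char *pNum = strrchr(buf, ' ');   /* last space separates words from number */
        if (pNum == NULL)
            continue;                     /* no space: not a "words number" line */

        int len = pNum - buf;             /* length of the word part */
        strncpy(newBuf, buf, len);
        newBuf[len] = '\0';               /* terminate the copied word part */

        pNum++;                           /* advance past the space to the digits */
        int value = atoi(pNum);           /* convert the numeric field */
        printf("words: '%s'  number: %d\n", newBuf, value);
    }
    return 0;
}

atoi could be swapped for strtol if you want error checking on the numeric field.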
fscanf(file, "%s %d", word, &value);
This gets the values directly into a string and an integer, and copes with variations in whitespace and numerical formats, etc.
Edit
Ooops, I forgot that you had spaces between the words.
In that case, I'd do the following. (Note that it truncates the original text in 'line')
// Scan to find the last space in the line
char *p = line;
char *lastSpace = NULL;
while (*p != '\0')
{
    if (*p == ' ')
        lastSpace = p;
    p++;
}
if (lastSpace == NULL)
    return("parse error");

// Replace the last space in the line with a NUL
*lastSpace = '\0';

// Advance past the NUL to the first character of the number field
lastSpace++;

char *word = line;
int number = atoi(lastSpace);
You can solve this using stdlib functions, but the above is likely to be more efficient as you're only searching for the characters you are interested in.
Given the description, I think I'd use a variant of this (now tested) C99 code:
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
#include <ctype.h>

struct word_number
{
    char word[128];
    long number;
};

int read_word_number(FILE *fp, struct word_number *wnp)
{
    char buffer[140];
    if (fgets(buffer, sizeof(buffer), fp) == 0)
        return EOF;
    size_t len = strlen(buffer);
    if (buffer[len-1] != '\n')                // Error if line too long to fit
        return EOF;
    buffer[--len] = '\0';
    char *num = &buffer[len-1];
    while (num > buffer && !isspace((unsigned char)*num))
        num--;
    if (num == buffer)                        // No space in input data
        return EOF;
    char *end;
    wnp->number = strtol(num+1, &end, 0);
    if (*end != '\0')                         // Invalid number as last word on line
        return EOF;
    *num = '\0';
    if (num - buffer >= sizeof(wnp->word))    // Non-number part too long
        return EOF;
    memcpy(wnp->word, buffer, num - buffer + 1);  // +1 copies the terminating null too
    return(0);
}

int main(void)
{
    struct word_number wn;
    while (read_word_number(stdin, &wn) != EOF)
        printf("Word <<%s>> Number %ld\n", wn.word, wn.number);
    return(0);
}
You could improve the error reporting by returning different values for different problems.
You could make it work with dynamically allocated memory for the word portion of the lines.
You could make it work with longer lines than I allow.
You could scan backwards over digits instead of non-spaces - but this allows the user to write "abc 0x123" and the hex value is handled correctly.
You might prefer to ensure there are no digits in the word part; this code does not care.
You could try using strtok() to tokenize each line, and then check whether each token is a number or a word (a fairly trivial check once you have the token string - just look at the first character of the token).
Assuming that the number is immediately followed by '\n', you can read each line into a char buffer, use sscanf("%d") on the line to get the number, and then calculate how many characters that number takes up at the end of the text string.
Depending on how complex your strings become you may want to use the PCRE library. At least that way you can compile a perl'ish regular expression to split your lines. It may be overkill though.
Given the description, here's what I'd do: read each line as a single string using fgets() (making sure the target buffer is large enough), then split the line using strtok(). To determine if each token is a word or a number, I'd use strtol() to attempt the conversion and check the error condition. Example:
#include <stdlib.h>
#include <stdio.h>
#include <string.h>

#define MAX_LINE_LENGTH 256   /* example limits -- pick sizes that suit your data */
#define MAX_STR_SIZE     64
#define MAX_NUM_STRINGS  32

/**
 * Read the next line from the file, splitting the tokens into
 * multiple strings and a single integer. Assumes input lines
 * never exceed MAX_LINE_LENGTH and each individual string never
 * exceeds MAX_STR_SIZE. Otherwise things get a little more
 * interesting. Also assumes that the integer is the last
 * thing on each line.
 */
int getNextLine(FILE *in, char (*strs)[MAX_STR_SIZE], int *numStrings, int *value)
{
    char buffer[MAX_LINE_LENGTH];
    int rval = 1;

    if (fgets(buffer, sizeof buffer, in))
    {
        char *token = strtok(buffer, " ");
        *numStrings = 0;
        while (token)
        {
            char *chk;
            *value = (int) strtol(token, &chk, 10);
            if (*chk != 0 && *chk != '\n')
            {
                strcpy(strs[(*numStrings)++], token);
            }
            token = strtok(NULL, " ");
        }
    }
    else
    {
        /**
         * fgets() hit either EOF or error; either way return 0
         */
        rval = 0;
    }
    return rval;
}

/**
 * sample main
 */
int main(void)
{
    FILE *input;
    char strings[MAX_NUM_STRINGS][MAX_STR_SIZE];
    int numStrings;
    int value;

    input = fopen("datafile.txt", "r");
    if (input)
    {
        while (getNextLine(input, strings, &numStrings, &value))
        {
            /**
             * Do something with strings and value here
             */
        }
        fclose(input);
    }
    return 0;
}
