I have been reading through the documentation for strtoul()/strtoull() from here, and under the "Conforming To" section towards the bottom, it makes these two points:
strtoul(): POSIX.1-2001, POSIX.1-2008, C89, C99, SVr4.
strtoull(): POSIX.1-2001, POSIX.1-2008, C99.
These two lines, in addition to other references throughout the document, indicate to me that the function strtoull should not be available when compiling a program under the C89/C90 standard. However, when I run a quick test with gcc, it allows me to call this function, regardless of the standard that I specify.
First, the code I am using to test:
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    unsigned long long x;
    const char *str = "1234";

    x = strtoull(str, NULL, 10);

    printf("%llu\n", x);
    return 0;
}
And here is my compilation command:
gcc test.c -std=c89 -pedantic -Wall -Wextra
Now, in fairness it does warn me of the compatibility issue:
test.c: In function ‘main’:
test.c:6:16: warning: ISO C90 does not support ‘long long’ [-Wlong-long]
 unsigned long long x;
                ^~~~
test.c:9:6: warning: implicit declaration of function ‘strtoull’; did you mean ‘strtoul’? [-Wimplicit-function-declaration]
 x = strtoull(str, NULL, 10);
     ^~~~~~~~
     strtoul
test.c:11:9: warning: ISO C90 does not support the ‘ll’ gnu_printf length modifier [-Wformat=]
 printf("%llu\n", x);
        ^~~~~~~~
These warning messages are exactly what I would expect given the documentation. They notify me that the function I have called cannot be found, and even that the C90 standard doesn't support unsigned long long. However, when I attempt to run this code, it works just fine, with no crashes or other kinds of errors. It prints the value 1234, as desired. So, based on this experiment, I have a few questions that I was hoping someone more seasoned than I could answer.
Is this a matter of me not providing the necessary compilation flags to enforce the 'strict' C89 standard?
Is this a case of me misunderstanding the documentation, or is there some documentation for gcc itself that I should refer to? And, if so, where could I find it?
Is there something fundamental about the compiling/linking process that I am not understanding, which explains this issue?
Why would I be warned of an incompatibility, even warned that the function I am calling does not exist, but the code still works with no issue?
Does this experiment imply that the -std=c89 -pedantic flags do not actually enforce the C89/C90 standard?
As a final note, I am not trying to say I want to use this function in C89, I was just curious about the compatibility restriction, and then confused about the answer.
Thanks in advance for any responses!
From a C89/C90 compiler's point of view, the only thing wrong with your code is the use of unsigned long long which looks like a syntax error. The standard requires only that the compiler produce a "diagnostic" in this case, and GCC has done so with its "ISO C90 does not support long long" warning. There is no requirement that this error should be fatal, and the compiler can decide to handle the code some other way if it wants. GCC obviously chooses to understand it as the long long type which it supports as an extension to C89.
The use of strtoull then just looks like some function that you made up, as C89 had no way of knowing that this name would be special in some future version of the standard. (Well, they did specify that more functions starting with str could be added to <string.h> in the future, but that doesn't make your code illegal for C89.) You haven't declared it, but C89 allowed implicit declarations, so it's understood to be declared as int strtoull();, i.e. returning int and with unspecified arguments. AFAIK no diagnostic was required for implicit declarations, but GCC chooses to issue one anyway. So it's treated like any other call to a function not defined in this source file, and the compiler presumes that some other part of your program (including the libraries you use) will define it.
And in fact some other part of your program does define it, namely libc, since your libc conforms to C99 and later. (You know, hopefully, that libc is not part of GCC.) C library authors generally don't provide a version of the library that only includes functions from a particular standard version, since having so many different libraries around would be awkward and inefficient. So linking succeeds.
Note, though, that because of the implicit declaration, the program may not actually work correctly. The compiler will generate code on the incorrect assumption that strtoull returns int, which, depending on your system's calling conventions, may cause all sorts of problems. On x86-64, it means that your program will only look at the low 32 bits of the result and will sign-extend them to 64 bits. So if you try to convert a number that fits in unsigned long long but not in 32 bits, you'll get the wrong result.
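To see that concretely, here is a minimal sketch (assuming an x86-64 machine, compiled like the snippet in the question: -std=c89, with <stdlib.h> deliberately left out):

#include <stdio.h>

int main(void)
{
    /* No <stdlib.h>, so under C89 rules strtoull gets an implicit
       declaration: int strtoull();  (returning int) */
    unsigned long long x;

    /* 4294967296 is 2^32: it fits in unsigned long long, but its
       low 32 bits are all zero. */
    x = strtoull("4294967296", NULL, 10);

    /* On x86-64 this will likely print 0 rather than 4294967296,
       because only the low 32 bits of the return value survive. */
    printf("%llu\n", x);
    return 0;
}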
If you want a program that would work on a system that only supports C89 and nothing else, it's your responsibility to look at the diagnostics issued by the compiler and fix the corresponding problems. The -pedantic-errors option mentioned in comments can help with this, as it causes compilation to fail when such diagnostics are issued.
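For instance, recompiling the question's test program with

gcc test.c -std=c89 -pedantic-errors -Wall -Wextra

turns the ISO C90 diagnostics (such as the long long one) into hard errors, so the build fails instead of merely warning. (The implicit declaration itself is legal C89, so that particular diagnostic stays a warning in C89 mode.)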
It would also help if you could find a C89-only libc, but that's not GCC's problem. Still, GCC's implicit declaration warnings give you some assistance in noticing that you have called a function that you never declared, and which may not be defined the way you intended.
As a final point, it's historically been part of GCC's design philosophy that they don't think "enforcing the standard" is really part of what they want to do. They saw their goal as writing a compiler that helps people write and compile programs that are useful, not a linter that checks for conformance with coding standards; they figured the latter should be a separate project, and not one that they were interested in. As such, they were liberal in providing extensions to the standard language, and not particularly diligent in providing ways for programs to avoid using them. They did provide the -pedantic option but apparently with some reluctance, as you can tell from the derogatory name.
Related
What is meant by the term "implicit declaration of a function"? A call to a standard library function without including the appropriate header file produces a warning as in the case of:
int main()
{
    printf("How is this not an error?");
    return 0;
}
Shouldn't using a function without declaring it be an error? Please explain in detail. I searched this site and found similar questions, but could not find a definitive answer. Most answers said something about including the header file to get rid of the warning, but I want to know how this is not an error.
It should be considered an error. But C is an ancient language, so it's only a warning.
Compiling with -Werror (GCC) fixes this problem.
When C doesn't find a declaration, it assumes this implicit declaration: int f();, which means the function can receive whatever you give it and returns an int. If this happens to be close enough (and in the case of printf, it is), then things can work. In some cases (e.g., when the function actually returns a pointer, and pointers are larger than ints), it may cause real trouble.
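Here is a small sketch of the "close enough" case (f is a made-up function; compile with -std=c89, where the implicit declaration is still legal):

#include <stdio.h>

/* f is defined below, but no declaration is visible at the call
   site, so a C89 compiler assumes: int f(); */
int main(void)
{
    int n = f(40, 2);   /* luckily matches the real signature */
    printf("%d\n", n);  /* prints 42 */
    return 0;
}

int f(int a, int b)
{
    return a + b;
}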
Note that this was fixed in newer C standards (C99 and C11). In these standards, calling an undeclared function is a constraint violation, so the compiler must at least diagnose it. However, GCC doesn't implement these standards by default, so you still get the warning.
Implicit declarations are not valid in C.
C99 removed this feature (present in C89).
GCC chooses to only issue a warning by default with -std=c99, but a compiler has the right to refuse to translate such a program.
To complete the picture, since -Werror might be considered too "invasive", for GCC (and LLVM) a more precise solution is to turn just this warning into an error, using the option:
-Werror=implicit-function-declaration
See How can I make this GCC warning an error?.
Regarding general use of -Werror: of course, having warning-free code is recommended, but at some stages of development it might slow down prototyping.
For historical reasons going back to the very first version of C, a call with no prototype in scope passes each argument as whatever type it is (after the default promotions). So an argument could be an int or a double or a char*. Without a prototype, the compiler passes whatever size the argument happens to be, and the function being called had better use the correct argument type to receive it.
For more details, look up K&R C.
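Here is a sketch of what those promotions mean in practice (half is a made-up function; the old-style declaration and K&R-style definition are C89 idioms):

#include <stdio.h>

double half();   /* old-style declaration: says nothing about parameters */

int main(void)
{
    float f = 10.0f;
    /* With no prototype in scope, f undergoes the default argument
       promotions and is passed as a double. */
    printf("%f\n", half(f));
    return 0;
}

double half(x)
double x;        /* K&R-style definition: receives the promoted double */
{
    return x / 2.0;
}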
An implicitly declared function is one that has neither a prototype nor a definition, but is called somewhere in the code. Because of that, the compiler cannot verify that this is the intended usage of the function (whether the count and the type of the arguments match). Resolving the references to it is done after compilation, at link-time (as with all other global symbols), so technically it is not a problem to skip the prototype.
It is assumed that the programmer knows what he is doing and this is the premise under which the formal contract of providing a prototype is omitted.
Nasty bugs can happen if the function is called with arguments of the wrong type or count. The most likely manifestation of this is corruption of the stack.
Nowadays this feature might seem as an obscure oddity, but in the old days it was a way to reduce the number of header files included, hence faster compilation.
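For instance, here is a sketch of the wrong-count hazard (sum3 is a made-up function; this compiles under C89, but the call is undefined behavior):

#include <stdio.h>

int main(void)
{
    /* Implicit declaration: int sum3(); -- the compiler cannot check
       the argument count, so the missing third argument is whatever
       garbage happens to sit in that register or stack slot. */
    printf("%d\n", sum3(1, 2));
    return 0;
}

int sum3(int a, int b, int c)
{
    return a + b + c;
}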
C is a very low-level language, so it permits you to create almost any legal object (.o) file that you can conceive of. You should think of C as basically dressed-up assembly language.
In particular, C does not require functions to be declared before they are used. If you call a function without declaring it, the use of the function becomes its (implicit) declaration. In a simple test I just ran, this is only a warning in the case of built-in library functions like printf (at least in GCC), but for random functions, it will compile just fine.
Of course, when you try to link, and it can't find foo, then you will get an error.
In the case of library functions like printf, some compilers contain built-in declarations for them so they can do some basic type checking, so when the implicit declaration (from the use) doesn't match the built-in declaration, you'll get a warning.
The C function malloc() is declared in stdlib.h.
It should give an error if we don't include this file, but this code works fine with a little warning.
My question is, if malloc() works without this header file, then why do we need to include it? Please help clarify my concepts.
#include <stdio.h>

int main()
{
    int a, b, *p;
    p = (int*)malloc(sizeof(int)*5);
    for (a = 0; a < 5; a++)
        p[a] = a*9;
    for (b = 0; b < 5; b++)
        printf("%d ", p[b]);
}
In C (before C99), unfortunately, you don't need a prior declaration for functions. If the compiler encounters a call to an unknown function, it creates an implicit declaration for it ("mmm'kay, this is how it is used, so I will assume the types of the arguments are...").
Do not rely on this "feature" and in general do not write code that compiles with warnings.
Read the warning. It says it's invalid. The compiler is simply too kind to you. In Clang this works, but it might not in other compilers.
At least include it to suppress the warning. Unnecessary warnings are annoying. Any program should compile with warnings treated as errors (I always enable that).
It appears that this is your compiler's magic. Not including the necessary headers may work on your compiler (which I suppose is by Microsoft), but it won't necessarily compile elsewhere (and that includes future versions of the same compiler). Write standard-conforming, portable code.
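For example, here is a portable rewrite of the snippet from the question, with the header included and the allocation checked and freed:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int a, b, *p;

    p = malloc(sizeof(int) * 5);   /* declared in <stdlib.h>; no cast needed in C */
    if (p == NULL)
        return 1;

    for (a = 0; a < 5; a++)
        p[a] = a * 9;
    for (b = 0; b < 5; b++)
        printf("%d ", p[b]);

    free(p);
    return 0;
}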
stdlib.h is the general-purpose standard header; it declares the dynamic memory allocation functions along with other standard utilities.
For example, if you want the user to have time to read the displayed information at the end of your program, you might read a character from the keyboard first. Note, though, that getch(), often used for this, is non-standard: it is declared in <conio.h> on old DOS/Windows compilers, not in stdlib.h. The portable equivalent is getchar() from <stdio.h>.
Like many things in C, the reason that no error is generated when there is no prototype is historical. In the early days, people often didn't bother prototyping functions, because pointers and integers were usually the same size and integral types smaller than int were promoted to int when passed as parameters (and floating point was rarely used for systems programming).
If at any point they had changed the compiler to give an error if a function was not prototyped then it would have broken many programs and would not have gained widespread acceptance.
With 64 bit addressing we are now entering a period when integers and pointers are not the same size and programs will most likely break if you do not prototype functions like malloc() that return a pointer.
In gcc, always set the following options for your own programs: -Werror -Wstrict-prototypes
I am currently learning and experimenting with C and am using Bloodshed's DEV-C++ as an IDE.
Now, I just realized that the following piece of code (as it is... no includes or anything) compiles and runs:
main()
{
    printf("%d", strlen("hello"));
}
Now, if I'm not mistaken, shouldn't two header files be included in this source for it to work? stdio.h and string.h... but as you can see, I did not add them and the code still compiled and ran successfully.
My complaint is that I want the compiler to be "strict" because since I'm still learning C, I don't want the code to run if normally it shouldn't.
So, is there any way to prevent Dev-C++ from 'correcting my mistakes' when it comes to includes, i.e. making it kinda more "strict"?
C90 had a feature (absent from C99 and C++) called implicit function declaration: when you used a name not yet declared in a function call, the compiler behaved as if

extern int identifier();

had been seen. That feature was dropped from C99, and most compilers had an option to warn about it even before C99 was promulgated.
Even when staying within C90, it is not recommended style to use this. If you have to maintain code that makes use of it and can't add prototypes, check that:
the function returns an int (this is the case for printf, but for strlen, which returns a size_t, validity is implementation-dependent, since size_t may be int or something else)
the function isn't variadic (true for strlen, but not for printf)
the types of the arguments are not modified by the default argument promotions (char, short, and float are), you take care to cast pointers to void* when the expected type is void*, and you cast NULL to the correct pointer type (these are the same things you have to pay attention to for variadic arguments, BTW)
If those conditions aren't met -- and they aren't for either call in your code -- you enter the realm of undefined behavior.
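For comparison, here is a version with the proper declarations in scope; the cast keeps the printf format valid even under C89, where %zu is not available:

#include <stdio.h>
#include <string.h>

int main(void)
{
    /* strlen is now known to return size_t; the cast makes the
       argument match the %lu conversion. */
    printf("%lu\n", (unsigned long)strlen("hello"));
    return 0;
}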
I don't know if this actually is a DevC++ issue, but in any case you should consider ditching it. It is no longer being developed and is very buggy. I recommend changing to Code::Blocks, which is better in every way and allows you to use the very latest GCC compiler.
One of the possibilities for 'undefined behaviour' - which you get if you call a variadic function without a visible prototype - is that your code compiles and runs successfully.
If you're using gcc as the underlying compiler then you should be able to pass flags such as -std=c89 -pedantic -Wall -Wextra and get warnings about code such as the snippet that you've posted.