Implicit int and implicit declaration of functions with the gcc compiler - C

I read in the C99 standard's list of changes:
- remove implicit function declaration
- remove implicit int
But when I try to compile this code with gcc in C99 mode (-std=c99 -pedantic):
main(void) {
    f(3);
    return 0;
}

int f(int a) {
    ....
}
I expect two errors, but I get only two warnings:
- warning: return type defaults to ‘int’
- warning: implicit declaration of function ‘f’
Shouldn't these be errors in C99? According to http://gcc.gnu.org/c99status.html, both items are marked "done".
Thanks.

The C standard requires a diagnostic for any translation unit containing a violation of a syntax rule or constraint. It does not require such diagnostics to be fatal; the compiler is free to continue processing the source file. The behavior of the resulting executable, if any, is undefined. The standard makes no distinction between warnings and fatal errors.
(The only thing that requires a compiler to reject a source file is the #error directive.)
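For instance, a translation unit can use #error to force outright rejection; a minimal sketch (an illustration, not from the original answer):

    /* Refuse to translate at all unless the compiler claims C99 support. */
    #if !defined(__STDC_VERSION__) || __STDC_VERSION__ < 199901L
    #error "this file requires a C99 (or later) compiler"
    #endif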
Conclusion: when compiling C, take warnings very seriously.

I don't believe the compiler is required to produce a fatal error. Use -Werror if you're concerned...

Two points: first, it may (usually does) take a specific set of flags to get a compiler to conform with the standard.
Second, all that's required by the standard is that the implementation issue a "diagnostic" in the case of an error -- but it's up to the implementation to define what is or isn't a diagnostic. It's free to say a "warning" is a diagnostic if it wants to. When a diagnostic is issued, it may quit compiling, or it can compile the code anyway.
Bottom line: what it's doing is probably enough to conform, for whatever that's worth.
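In practice, if you want gcc to reject the code in the question, you have to say so explicitly. Per gcc's own documentation (quoted later in this thread), either of the following should turn those two warnings into hard errors (flag spellings assume a reasonably recent gcc):

    gcc -std=c99 -pedantic-errors test.c
    gcc -std=c99 -Werror=implicit-function-declaration -Werror=implicit-int test.c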

Related

strtoull() Availability in C89

I have been reading through the documentation for strtoul()/strtoull() from here, and under the "Conforming To" section toward the bottom, it makes these two points:
strtoul(): POSIX.1-2001, POSIX.1-2008, C89, C99 SVr4.
strtoull(): POSIX.1-2001, POSIX.1-2008, C99.
These two lines, in addition to other references throughout the document indicate to me that the function strtoull should not be available when compiling a program using the c89/c90 standard. However, when I run a quick test with gcc, it allows me to call this function, regardless of the standard that I specify.
First, the code I am using to test:
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    unsigned long long x;
    const char *str = "1234";

    x = strtoull(str, NULL, 10);
    printf("%llu\n", x);
    return 0;
}
And here is my compilation command:
gcc test.c -std=c89 -pedantic -Wall -Wextra
Now, in fairness it does warn me of the compatibility issue:
test.c: In function ‘main’:
test.c:6:16: warning: ISO C90 does not support ‘long long’ [-Wlong-long]
     unsigned long long x;
                   ^~~~
test.c:9:6: warning: implicit declaration of function ‘strtoull’; did you mean ‘strtoul’? [-Wimplicit-function-declaration]
     x = strtoull(str, NULL, 10);
         ^~~~~~~~
         strtoul
test.c:11:9: warning: ISO C90 does not support the ‘ll’ gnu_printf length modifier [-Wformat=]
     printf("%llu\n", x);
            ^~~~~~~~
These warning messages are exactly what I would expect given the documentation. It notifies me that the function I have specified cannot be found, and even that the C90 standard doesn't support unsigned long long. However, when I attempt to run this code, it works just fine, with no crashing or other types of errors. It prints the value 1234, as desired. So, based on this experiment, I have a few questions that I was hoping someone more seasoned than I could answer.
Is this a matter of me not providing the necessary compilation flags to enforce the 'strict' C89 standard?
Is this a case of me misunderstanding the documentation, or is there some documentation for gcc itself that I should refer to? And, if so, where could I find it?
Is there something fundamental about the compiling/linking process that I am not understanding, which explains this issue?
Why would I be warned of an incompatibility, even warned that the function I am calling does not exist, but the code still works with no issue?
Does this experiment imply that the -std=c89 -pedantic flags do not actually enforce the C89/C90 standard?
As a final note, I am not trying to say I want to use this function in C89, I was just curious about the compatibility restriction, and then confused about the answer.
Thanks in advance for any responses!
From a C89/C90 compiler's point of view, the only thing wrong with your code is the use of unsigned long long which looks like a syntax error. The standard requires only that the compiler produce a "diagnostic" in this case, and GCC has done so with its "ISO C90 does not support long long" warning. There is no requirement that this error should be fatal, and the compiler can decide to handle the code some other way if it wants. GCC obviously chooses to understand it as the long long type which it supports as an extension to C89.
The use of strtoull then just looks like some function that you made up, as C89 had no way of knowing that this name would be special in some future version of the standard. (Well, they did specify that more functions starting with str could be added to <string.h> in the future, but that doesn't make your code illegal for C89.) You haven't declared it, but C89 allowed implicit declarations, so it's understood to be declared as int strtoull();, i.e. returning int and with unspecified arguments. AFAIK no diagnostic was required for implicit declarations, but GCC chooses to issue one anyway. So it's treated like any other call to a function not defined in this source file, and the compiler presumes that some other part of your program (including the libraries you use) will define it.
And in fact some other part of your program does define it, namely libc, since your libc conforms to C99 and later. (You know, hopefully, that libc is not part of GCC.) C library authors generally don't provide a version of the library that only includes functions from a particular standard version, since having so many different libraries around would be awkward and inefficient. So linking succeeds.
Note, though, that because of the implicit declaration, the program may not actually work correctly. The compiler will generate code that incorrectly assumes strtoull returns int, which, depending on your system's calling conventions, may cause all sorts of problems. On x86-64, it means that your program will only look at the low 32 bits of the result and will sign-extend them to 64 bits. So if you try to convert a number that fits in unsigned long long but not in 32 bits, you'll get the wrong result.
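Here is a minimal sketch of that failure mode (a hypothetical demonstration: the behavior is undefined, and the commentary assumes x86-64 and C89 mode):

    /* build with: gcc -std=c89 demo.c */
    #include <stdio.h>
    /* <stdlib.h> deliberately NOT included, so strtoull gets an implicit
       declaration and is assumed to return int */
    int main(void)
    {
        unsigned long long x = strtoull("4294967296", NULL, 10); /* 2^32 */
        /* On x86-64 the caller keeps only the low 32 bits of the result
           (zero here) and sign-extends them, so this may print 0 rather
           than 4294967296. */
        printf("%llu\n", x);
        return 0;
    }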
If you want a program that would work on a system that only supports C89 and nothing else, it's your responsibility to look at the diagnostics issued by the compiler and fix the corresponding problems. The -pedantic-errors option mentioned in comments can help with this, as it causes compilation to fail when such diagnostics are issued.
It would also help if you could find a C89-only libc, but that's not GCC's problem. But its implicit declaration warnings do give you some assistance in noticing that you have called a function which you may not have intended for your program to define.
As a final point, it's historically been part of GCC's design philosophy that they don't think "enforcing the standard" is really part of what they want to do. They saw their goal as writing a compiler that helps people write and compile programs that are useful, not a linter that checks for conformance with coding standards; they figured the latter should be a separate project, and not one that they were interested in. As such, they were liberal in providing extensions to the standard language, and not particularly diligent in providing ways for programs to avoid using them. They did provide the -pedantic option but apparently with some reluctance, as you can tell from the derogatory name.

Who detects misspelled function name? Compiler or Linker?

According to C How to Program (Deitel):
Standard library functions like printf and scanf are not part of the C programming language. For example, the compiler cannot find a spelling error in printf or scanf. When the compiler compiles a printf statement, it merely provides space in the object program for a “call” to the library function. But the compiler does not know where the library functions are—the linker does. When the linker runs, it locates the library functions and inserts the proper calls to these library functions in the object program. Now the object program is complete and ready to be executed. For this reason, the linked program is called an executable. If the function name is misspelled, it is the linker which will spot the error, because it will not be able to match the name in the C program with the name of any known function in the libraries.
These statements leave me doubtful because of the existence of header files. These files are included during the preprocessing phase, before compilation, and, as I have read, they are used by the compiler.
So if I write print instead of printf, how can the compiler not see that there is no function declared with that name and throw an error?
If it is as the book says, why can I declare functions in header files if the compiler doesn't look at them?
So if I write print instead of printf, how can the compiler not see that there is no function declared with that name and throw an error?
You are right. If you made a typo in any function name, any modern compiler should complain about it. For example, gcc complains for the following code:
$ cat test.c
int main(void)
{
    unknown();
    return 0;
}
$ gcc -c -Wall -Wextra -std=c11 -pedantic-errors test.c
test.c: In function ‘main’:
test.c:3:5: error: implicit declaration of function ‘unknown’ [-Wimplicit-function-declaration]
    unknown();
    ^
However, in the pre-C99 era of the C language, any function whose declaration wasn't seen by the compiler was assumed to return an int. So, if you are compiling in pre-C99 mode, a compiler isn't required to warn about it.
Fortunately, this implicit int rule was removed in C99, and a compiler is required to issue a diagnostic for it in modern C (>= C99).
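You can watch the rule change by compiling the same translation unit in both modes (exact diagnostics depend on the gcc version; as noted above, -pedantic-errors is what promotes the warning to an error):

    gcc -c -std=c89 test.c                    # implicit declaration accepted (older gcc)
    gcc -c -std=c99 test.c                    # warning: implicit declaration of function
    gcc -c -std=c99 -pedantic-errors test.c   # hard error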
But if you provide only a declaration or prototype for the function:
$ cat test.c
int unknown(void); /* function prototype */

int main(void)
{
    unknown();
    return 0;
}
$ gcc -c -Wall -Wextra -std=c11 -pedantic-errors test.c
$
(Note: I have used the -c flag to compile without linking; if you don't use -c, then compiling and linking are done in a single step, and the error would then come from the linker.)
There's no issue despite the fact, you do not have definition for unknown() anywhere. This is because the compiler assumes unknown() has been defined elsewhere and only when the linker looks to resolve the symbol unknown, it'll complain if it can't find the definition for unknown().
Typically, the header file(s) only provide the necessary declarations or prototypes (I have provided a prototype for unknown directly in the file itself in the above example; it might as well be done via a header file) and usually not the actual definition. Hence, the author is correct in the sense that the linker is the one that spots this error.
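If you do let it link, the complaint comes from the linker rather than the compiler; with GNU tools it typically looks something like:

    $ gcc -Wall -Wextra -std=c11 test.c
    ...: in function `main': undefined reference to `unknown'
    collect2: error: ld returned 1 exit status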
So if I write print instead of printf, how can the compiler not see that there is no function declared with that name and throw an error?
The compiler can see that there is no declaration in scope for the identifier designating the function. Most will emit a warning under those circumstances, and some will emit an error, or can be configured to do so.
But that's not the same thing as the compiler detecting that the function doesn't exist. It's the compiler detecting that the function name has not been declared. The compiler will exhibit the same behavior if you spell the function name correctly but do not include a prior declaration for it.
Furthermore, C90 and pre-standardization C permitted calls to functions without any prior declaration. Such calls do not conform to C99 or later, but most compilers still do accept them (usually with a warning) for compatibility purposes.
If it is as the book says, why can I declare function in header files if the compiler doesn't watch them?
The compiler does see them, and does use the declarations. It relies on the prototype, if the declaration provides one, to perform appropriate argument and return value conversions when you call the function. Moreover, if you call a function whose argument types are altered by the default argument promotions and no prototype is in scope at the point of the call, then the call is non-conforming and undefined behavior results.
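As a sketch of why the prototype matters (hypothetical files, compiled as C90 so that the missing prototype is merely accepted):

    /* file: sum.c */
    double sum(double a, double b) { return a + b; }

    /* file: main.c -- no prototype for sum is in scope */
    #include <stdio.h>

    int main(void)
    {
        /* Without a prototype, the arguments undergo only the default
           argument promotions: 1 and 2 are passed as int while sum()
           reads two doubles, and the return value is wrongly assumed
           to be int. Both mismatches are undefined behavior. */
        double r = sum(1, 2); /* likely garbage, not 3.0 */
        printf("%f\n", r);
        return 0;
    }

With a prototype such as double sum(double, double); in scope, the compiler would instead convert both arguments to double and the call would be well defined.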

C Compiler not throwing error when no type is specified

Why does the program below not produce an error?
dfljshfksdhfl;
#include <stdio.h>
int main () {
    return 0;
}
gcc would just throw a warning:
test.c:1:1: warning: data definition has no type or storage class [enabled by default]
This is because, even though implicit int has not been part of the C standard since C99, some compilers still support it, mainly to avoid breaking a lot of old code. So this line:
dfljshfksdhfl;
ends up being equivalent to:
int dfljshfksdhfl;
clang gives us a much more informative warning by default:
warning: type specifier missing, defaults to 'int' [-Wimplicit-int]
dfljshfksdhfl;
^~~~~~~~~~~~~
We can use the -pedantic-errors flag to turn this into an error, although oddly enough this does not work for clang and so we have to resort to -Werror and turn all warnings into errors, which is actually a good habit to get into. As remyabel points out for clang we can also use -Werror=implicit-int.
I've already answered a similar question (actually I'm pretty sure it's a duplicate, but whatever) and the answer is found in the C99 rationale.
A new feature of C99:
In C89, all type specifiers could be omitted from the declaration specifiers in a declaration. In such a case int was implied. The Committee decided that the inherent danger of this feature outweighed its convenience, and so it was removed. The effect is to guarantee the production of a diagnostic that will catch an additional category of programming errors. After issuing the diagnostic, an implementation may choose to assume an implicit int and continue to translate the program in order to support existing source code that exploits this feature.
Shafik's answer tells you one way to turn the warning into an error (for Clang). If you consider -Werror to be too strict, you can turn just that one warning into an error with -Werror=implicit-int. In GCC, it appears that -pedantic-errors is necessary.
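Concretely, something along these lines (flag support varies by compiler version):

    clang -Werror=implicit-int test.c       # promote just this diagnostic
    gcc -std=c99 -pedantic-errors test.c    # gcc: pedantic diagnostics become errors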
First of all, gcc is not a conforming C compiler by default. It implements a dialect of C89/C90 with GNU extensions.
You can use -std=cNN -pedantic (where NN can be 90, 99, or 11) to cause it to (attempt to) conform to a specified version of the ISO C standard. C90 permitted implicit int; it was dropped in C99.
But C compilers are not actually required to generate fatal error messages (except for a #error directive). The standard's requirement (N1570 5.1.1.3p1) is:
A conforming implementation shall produce at least one diagnostic message (identified in an implementation-defined manner) if a preprocessing translation unit or translation unit contains a violation of any syntax rule or constraint, even if the behavior is also explicitly specified as undefined or implementation-defined. Diagnostic messages need not be produced in other circumstances.
A non-fatal warning qualifies as a "diagnostic message". A conforming C compiler can print a warning for any error -- even a syntax error -- and then continue to successfully compile the source file. (This is how some compiler-specific language extensions may be supported.)
Personally, I find gcc to be overly lax about certain errors; in my opinion, a missing int should be treated as a fatal error. But that's just my preference, not a requirement imposed by the standard.
The lesson here is that you should not assume that mere warnings are harmless. Ideally, compiling your code should produce no diagnostics at all. Cases where it's ok to ignore warnings are rare (but they do exist, since compilers are free to warn about perfectly valid code).

Why isn't GCC's acceptance of void-pointer arithmetic considered a bug? [duplicate]

There are at least three different posts about how void pointer arithmetic is prohibited in C; that gcc 4.8.2 allows it, assuming that a void is of byte size; and how one can turn on extra pedantic warnings to trigger an error. Here is an example:
#include <stdio.h>

/* compile: gcc -Wall -o try try.c */
int main() {
    char *str = "string";
    void *vp = (void *) str;
    ++vp; /* arithmetic on a void pointer. huh? */
    printf("%s\n", (char *) vp);
    return 0;
}
My question is about thinking about what a C compiler is supposed to do in case of invalid code. Is it not considered a bug when a compiler does not issue a compile error on invalid code?
And this seems like bizarre behavior for a compiler anyway: even if gcc does not issue a compile error, at the very least it could issue a "deprecated" warning with the default compiler flags. And even with -Wall, it still does not give a warning. Huh? It surprised me because gcc seems very mature otherwise, and C is not exactly a novel or complex language.
The C standard makes an attempt to perform pointer arithmetic on void* a constraint violation, which means that any conforming C compiler must issue at least one diagnostic message for any program containing such an attempt. That diagnostic may be a non-fatal warning; in that case, the compiler may then go on to generate code whose behavior is defined by the implementation.
gcc, by default, does not warn about pointer arithmetic on void*. This means that gcc, by default, is not a conforming C compiler.
One could argue that this is a bug, but in its default mode gcc is not a compiler for standard C but for GNU C. (A Fortran compiler's failure to be a conforming C compiler is also not a bug.)
Carefully chosen command-line options can force gcc to at least attempt to be conforming. For example:
gcc -std=cXX -pedantic
where XX is one of 90, 99, or 11, will cause gcc to warn about pointer arithmetic on void*. Replacing -pedantic with -pedantic-errors causes it to treat such arithmetic as a fatal error.
Invalid standard C code can certainly be accepted by a specific compiler; that's called a compiler extension.
That is what's happening in this case. From https://gcc.gnu.org/onlinedocs/gcc/Pointer-Arith.html:
In GNU C, addition and subtraction operations are supported on pointers to void and on pointers to functions. This is done by treating the size of a void or of a function as 1.
If you need your code to be portable, it's always a good idea to stick with standard C, but if your code runs only on a specific platform, it's no harm to use certain compiler extensions.
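If you do want to stay within the standard, the portable way to express the same step is to do the arithmetic on a char *, whose size is 1 by definition; a minimal sketch:

    #include <stdio.h>

    int main(void)
    {
        char *str = "string";
        void *vp = str;
        /* Cast to char * before incrementing: the portable equivalent
           of gcc's ++vp extension. */
        vp = (char *)vp + 1;
        printf("%s\n", (char *)vp); /* prints "tring" */
        return 0;
    }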
C11 standard n1570 §6.5.6/2:
For addition, either both operands shall have arithmetic type, or one operand shall be a pointer to a complete object type and the other shall have integer type. (Incrementing is equivalent to adding 1.)
The language for C++ is similar.
It's definitely not standards-conforming behaviour. I think the GCC team already know that.
The answer is that a compliant compiler should issue a diagnostic, and then generate whatever code it likes (or not).

How to turn "implicit declaration" warnings in $CC into errors?

Preamble: My C may be fairly rusty; I first started writing C programs somewhere around 1993. Compilers may have been different back then, but I recall that when one attempted to refer to a C function that was not declared, the compiler would abort. This is from memory.
Currently, I am perplexed as to why GCC (4.4.3) is so forgiving when I [intentionally] mismatch or omit the declaration of bar below relative to its definition in bar.c. Because the compiler does not stop me, the program proceeds to a fatal addressing error at runtime: since bar wants an address and is given an integer, it ends up dereferencing that integer as an address.
A strict compiler, or so I would think, would abort on me with an error. Am I missing something? My build command line is as follows:
cc -o foobar -g -Wall -std=c99 -fexec-charset=ISO-8859-1 -DDEBUG foo.c bar.c
With foo.c:
int main() {
    int a;
    bar(a);
    return 0;
}

and bar.c:

void bar(int *a) {
    *a = 1;
}
I have intentionally omitted the declaration of bar and, as mentioned, intentionally pass it an integer (could be anything really) instead of the address that its actual definition would otherwise mandate. Because $(CC) does not stop me, I end up with a segmentation fault (x86, Ubuntu 10.04). I am aware that a pre-C99 compiler would implicitly create an int bar() declaration for bar if none is otherwise found, but in this case that's obviously not what I want at all!
I want to protect myself from the kind of errors -- where I make the human mistake of mismatching declarations and definitions or omitting the former altogether.
I tried to instead just invoke the compiler without the linking step (with the -c switch) -- but it doesn't matter as compiling still succeeds with warnings. The linker might complain though, but I want the compiler to stop me before that happens.
I do not actually want to turn all my warnings into errors (e.g. with -Werror), because:
I could have included the wrong float bar(double a); at the top of foo.c, which would eliminate the warning altogether but wouldn't change the fact that the resulting program crashes; alas, the result is a program that compiles successfully without warnings (even with the -Wall switch) but is still faulty
I have and will have other types of warnings that should stay warnings and not prevent successfully building the program
It would be dealing with the effect of the problem, rather than the problem itself
It's not just the types of warnings, but also particular instances thereof; I wouldn't want to turn a specific warning into an error because in some instances that would not be applicable; this would be too "coarse" of a solution which doesn't take into account the specifics of and the context in which the warning occurred
To turn this warning into an error when compiling with gcc, pass the switch -Werror=implicit-function-declaration to the compiler.
Trying to answer your "why" question: yes, it might look odd that this is by default a warning and not an error. This is for historical reasons. For details, see e.g. Why does/did C allow implicit function and typeless variable declarations?, or read it in Ritchie's own words at http://cm.bell-labs.com/who/dmr/chist.html.
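Applied to the build line from the question, that would be:

    cc -o foobar -g -Wall -std=c99 -Werror=implicit-function-declaration \
       -fexec-charset=ISO-8859-1 -DDEBUG foo.c bar.c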
You could probably force additional warnings for gcc:
-Wmissing-prototypes
-Wmissing-declarations
Using both (along with -Werror) will probably help you to avoid such situations, but require some more code writing.
EDIT: Example
// file: mis1.c
int main(void)
{
    int a;
    bar(a);
    return 0;
}

// file: mis2.c
#include <stdio.h>

double bar(double a)
{
    printf("%g\n", a);
    return a;
}
Compiling with gcc 3.3.4 (DJGPP) as:
gcc -Wall -Wmissing-prototypes -Wmissing-declarations -Werror mis2.c mis1.c -o mis.exe
Compiler output:
mis2.c:5: warning: no previous prototype for `bar'
mis1.c: In function `main':
mis1.c:6: warning: implicit declaration of function `bar'
Fix? Include the following header in both files:
// file: mis.h
extern int bar(int);
Recompiling you get:
mis2.c:6: error: conflicting types for `bar'
mis.h:3: error: previous declaration of `bar'
Fix? Declare and define bar everywhere in the same way; for example, correct mis.h:
// file: mis.h
extern double bar(double);
Likewise you could change bar() in mis2.c to match that of mis.h.
From the gcc docs on warnings:
-Wimplicit-function-declaration (C and Objective-C only)
Give a warning whenever a function is used before being declared. In C99 mode (-std=c99 or -std=gnu99), this warning is enabled by default and it is made into an error by -pedantic-errors. This warning is also enabled by -Wall.
...
-pedantic-errors (my emphasis) Like -pedantic, except that errors are produced rather than warnings.
...
-pedantic
Issue all the warnings demanded by strict ISO C and ISO C++; reject all programs that use forbidden extensions, and some other programs that do not follow ISO C and ISO C++. For ISO C, follows the version of the ISO C standard specified by any -std option used.
It looks to me that -pedantic-errors will do what you want (turn these warnings into errors), however it sounds like it will also turn on a host of other checks you may or may not want. =/
The closest I found to a solution to my problem was to simply use the flag -combine which indirectly causes the compiler to abort compilation when attempting to call a function that is missing a prototype or where prototypes mismatch or do not match the definition.
I am afraid it has drawbacks though. Since input files now are combined in one compilation run, one ends up with a single object file, which has some implications of its own. In short, -combine does something more than just fix my problem, and that may be a problem in itself.
You can turn all warnings into errors with

    cc [..] -Werror [..]

This will partially solve your problem.
I could have included the wrong float bar(double a); at the top of foo.c, which eliminates the warning altogether, but doesn't change the fact that the resulting program crashes. Alas, a program that compiles successfully without warnings (even with the -Wall switch) and beautifully crashes at runtime.
Therefore it is essential, in addition to other measures, to include the same header file (containing the prototype) in both foo.c and bar.c. This ensures that the correct prototype is applied in both places.
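A minimal sketch of that arrangement (hypothetical header, following the question's file layout):

    /* file: bar.h -- the single, authoritative prototype */
    #ifndef BAR_H
    #define BAR_H
    void bar(int *a);
    #endif

    /* both foo.c and bar.c then begin with: */
    #include "bar.h"

With this prototype in scope, passing an int where an int * is expected becomes a constraint violation that the compiler must diagnose, rather than a silent runtime crash.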
