Anybody know why this is compiling successfully? - C

Does anybody know why this is compiling successfully in C?
int main(){
    display();
    return 0;
}

void display(){
    printf("Why am I compiling successfully?");
}
I thought that when no declaration is provided, C assumes extern int Function_name(arg1, arg2, ...). Thus this should give an error, but it works! I know that Ideone is suppressing the warnings, but my question is: why is it not just giving a straight error? (In C++ it's a straight error.)

Turn up the warning level in your compiler and you should get two warnings:
display not declared, int assumed
and
display redeclared
Edit:
Older versions of C (pre C99) aren't really that bothered about return types or argument types. You could say it's part of the K&R legacy. For instance, if you don't explicitly specify the argument types, the compiler won't check them at all.
C++ is stricter, which IMO is a good thing. I always provide declarations and always specify the argument lists when I code in C.
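For illustration, here is the question's program rewritten the way I'd write it, with an explicit declaration before the first call (and the <stdio.h> include that printf needs):

#include <stdio.h>

void display(void);   /* declaration visible before the first call */

int main(void){
    display();
    return 0;
}

void display(void){
    printf("Why am I compiling successfully?");
}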

It's compiling because C uses a lot of defaults to be backwards compatible. In K&R C, you couldn't specify function prototypes, so the compiler would just assume that you know what you're doing when you call a function.
Later (in ANSI C, and even in C99), C still had to accept the empty parameter list alongside the explicit form:
void display(void);
void display();
The empty list means "unspecified arguments", so such declarations must be accepted as well.
That's why you can call display() before any declaration of it appears.
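A short sketch of the difference between the two forms under C89/C99 rules (hypothetical functions f and g):

void f(void){}   /* prototype-style: f takes exactly no arguments */
void g(){}       /* empty list: callers' arguments go unchecked */

int main(void){
    /* f(1); */  /* constraint violation: too many arguments */
    g(1, 2);     /* compiles silently, though the call is undefined */
    return 0;
}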
printf() is similar. You will get a linker error if you forget -lc but from the compiler's point of view, the code is "good enough".
That will change as soon as you enable all the warnings your compiler has to offer, and it will fail with an error when you disable K&R compatibility or enable strict ANSI checks.
Which is why C's entry in "How to Shoot Yourself in the Foot Using Any Programming Language" lists is usually just "you shoot yourself in the foot".

It depends on your C version (C89, C90, C99, ...).
For function return values, prior to C99 it was explicitly specified that if no function declaration was visible, the translator provided one. These implicit declarations defaulted to a return type of int.
Justification, quoting a commentary on the C Standard (6.2.5, page 506):
Prior to C90 there were no function prototypes. Developers expected to be able to interchange arguments that had signed and unsigned versions of the same integer type. Having to cast an argument, if the parameter type in the function definition had a different signedness, was seen as counter to C's easy-going type-checking system and a little intrusive. The introduction of prototypes did not completely do away with the issue of interchangeability of arguments. The ellipsis notation specifies that nothing is known about the expected type of arguments. Similarly, for function return values, prior to C99 it was explicitly specified that if no function declaration was visible the translator provided one. These implicit declarations defaulted to a return type of int. If the actual function happened to return the type unsigned int, such a default declaration might have returned an unexpected result. A lot of developers had a casual attitude toward function declarations. The rest of us have to live with the consequences of the Committee not wanting to break all the source code they wrote. The interchangeability of function return values is now a moot point, because C99 requires that a function declaration be visible at the point of call (a default declaration is no longer provided).

It probably does assume the declaration you wrote, but since you are not passing any parameters, it just happens to work. It's the same as declaring int main() where it should really be int main(int argc, char *argv[]).
So if you tried to pass some parameters (different from the default ones) from main and then use them in display, it would probably fail.
BTW, for me it compiles, but generates a warning:
$ gcc z.c
z.c:8:6: warning: conflicting types for ‘display’ [enabled by default]
z.c:4:5: note: previous implicit declaration of ‘display’ was here

When I compile with gcc, I get warnings about redeclaration of display, as you would expect.
Did you get warnings?
It might "run" because C doesn't mangle function names (like C++). So the linker looks for a symbol 'display' and finds one. The linker uses this address to run display. I would expect results to not be what you expect all the time.

Related

Implicit int declaration of a variable when type qualifier is specified [duplicate]

If "a function" were compiled separately, the mismatch would not be detected, "the function" would return a double that main would treat as an int... In the light of what we have said about how declarations must match definitions this might seems surprising. The reason a mismatch can happen is that if there is no function prototype, a function is implicitly declared by its first appearance in an expression, such as
sum += "the function"(line);
If a name that has not been previously declared occurs in an expression and is followed by a left parenthesis, it is declared by context to be a function name, the function is assumed to return an int, and nothing is assumed about its arguments.
I apologize beforehand for the ambiguous question, but what does this mean?
By the way this is page 73 chapter 4.3 from Brian W. Kernighan and Dennis M. Ritchie's C Programming Language book, 2nd edition.
K&R2 covers the 1989/1990 version of the language. The current ISO C standard, published in 1999 2011, drops the "implicit int" rule, and requires a visible declaration for any function you call. Compilers don't necessarily enforce this by default, but you should be able to request more stringent warnings -- and you definitely should. In well-written new code, the rule is irrelevant (but it is necessary to understand it).
An example: the standard sqrt() function is declared in <math.h>:
double sqrt(double);
If you write a call without the required #include <math.h>:
double x = 64.0;
double y = sqrt(x);
a C90 compiler will assume that sqrt returns int -- and it will generate code to convert the result from int to double. The result will be garbage, or perhaps a crash.
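A complete sketch of that failure mode (the exact behavior varies by platform; a C99-or-later compiler rejects the call instead):

#include <stdio.h>
/* #include <math.h> deliberately omitted */

int main(void){
    double x = 64.0;
    double y = sqrt(x);   /* C90 assumes int sqrt(), so the double result
                             is misinterpreted as an int and converted back */
    printf("%f\n", y);    /* prints garbage rather than 8.0 */
    return 0;
}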
(You could manually declare sqrt yourself, but that's the wrong solution.)
So don't do that. Always include whatever header is required for any function you call. You might get away with calling an undeclared function if it does return int (and if your compiler doesn't enforce strict C99 or C11 semantics, and if a few other conditions are satisfied), but there's no good reason to do so.
Understanding the "implicit int" rule is still useful for understanding the behavior of old or poorly written code, but you should never depend on it in new code.
Function prototypes were introduced into the language late.
Before prototypes, the compiler would assume that every argument passed to every unknown function should be passed as an integer and would assume that the return value was also an integer.
This worked fine for the few cases where it was correct, but meant people had to write programs in an awkward order so that functions would never rely on unknown functions that did not match this expectation.
When prototypes were introduced into C89 (aka ANSI C or ISO C), the prototypes allow the compiler to know exactly what types of arguments are expected and what types of results will be returned.
It is strongly recommended that you use function prototypes for all new code; when working on an entirely old code base, the prototypes might be harmful. (Or, if the code must be compilable on a pre-ANSI C compiler, then you might wish to leave off the prototypes so it can be built on the ancient software. gcc is the only place I've seen this in a long time.)
It's just stating that if the compiler comes across code that calls an unknown function, then it implicitly treats it as if it had already seen a declared prototype of the form int unknown();

What happens when I leave out an argument to a function in C?

First of all, I know this way of programming is not good practice. For an explanation of why I'm doing this, read on after the actual question.
When declaring a function in C like this:
int f(n, r) {…}
The types of r and n will default to int. The compiler will likely generate a warning about it, but let's choose to ignore that.
Now suppose we call f but, accidentally or otherwise, leave out an argument:
f(25);
This will still compile just fine (tested with both gcc and clang). However, there is no warning from gcc about the missing argument.
So my question is:
Why does this not produce a warning (in gcc) or error?
What exactly happens when it is executed? I assume I'm invoking undefined behaviour but I'd still appreciate an explanation.
Note that it does not work the same way when I declare int f(int n, int r) {…}; neither gcc nor clang will compile that.
Now if you're wondering why I would do such a thing: I was playing Code Golf and tried to shorten my code, which used a recursive function f(n, r). I needed a way to call f(n, 0) implicitly, so I defined F(n) { return f(n, 0); }, which was a little too many bytes for my taste. So I wondered whether I could just omit the parameter. I can't; it still compiles but no longer works.
While I was optimizing this code, it was pointed out to me that I could just leave out the return at the end of my function – no warning from gcc about this either. Is gcc just too tolerant?
You don't get any diagnostics from the compiler because you are not using modern "prototyped" function declarations. If you had written
int f(int n, int r) {…}
then a subsequent f(25) would have triggered a diagnostic. With the compiler on the computer I'm typing this on, it's actually a hard error.
"Old-style" function declarations and definitions intentionally cause the compiler to relax many of its rules, because the old-style code that they exist for backward compatibility with would do things like this all the dang time. Not the thing you were trying to do, hoping that f(25) would somehow be interpreted as f(25, 0), but, for instance, f(25) where the body of f never looks at the r argument when its n argument is 25.
The pedants commenting on your question are pedantically correct when they say that literally anything could happen (within the physical capabilities of the computer, anyway; "demons will fly out of your nose" is the canonical joke, but it is, in fact, a joke). However, it is possible to describe two general classes of things that are what usually happens.
With older compilers, what usually happens is, code is generated for f(25) just as it would have been if f only took one argument. That means the memory or register location where f will look for its second argument is uninitialized, and contains some garbage value.
With newer compilers, on the other hand, the compiler is liable to observe that any control-flow path passing through f(25) has undefined behavior, and based on that observation, assume that all such control-flow paths are never taken, and delete them. Yes, even if it's the only control-flow path in the program. I have actually witnessed Clang spit out main: ret for a program all of whose control-flow paths had undefined behavior!
GCC not complaining about f(n, r) { /* no return statement */ } is another case like (1), where the old-style function definition relaxes a rule. void was invented in the 1989 C standard; prior to that, there was no way to say explicitly that a function does not return a value. So you don't get a diagnostic because the compiler has no way of knowing that you didn't mean to do that.
Independently of that, yes, GCC's default behavior is awfully permissive by modern standards. That's because GCC itself is older than the 1989 C standard and nobody has reexamined its default behavior in a long time. For new programs, you should always use -Wall, and I recommend also at least trying -Wextra, -Wpedantic, -Wstrict-prototypes, and -Wwrite-strings. In fact, I recommend going through the "Warning Options" section of the manual and experimenting with all of the additional warning options. (Note however that you should not use -std=c11, because that has a nasty tendency to break the system headers. Use -std=gnu11 instead.)
First off, the C standard doesn't distinguish between warnings and errors. It only talks about "diagnostics". In particular, a compiler can always produce an executable (even if the source code is completely broken) without violating the standard.1
The types of r and n will default to int.
Not anymore. Implicit int has been gone from C since 1999. (And your test code requires C99 because for (int i = 0; ... isn't valid in C90).
In your test code gcc does issue a diagnostic for this:
.code.tio.c: In function ‘f’:
.code.tio.c:2:5: warning: type of ‘n’ defaults to ‘int’ [-Wimplicit-int]
It's not valid code, but gcc still produces an executable (unless you enable -Werror).
If you add the required types (int f(int n, int r)), it uncovers the next issue:
.code.tio.c: In function ‘main’:
.code.tio.c:5:3: error: too few arguments to function ‘f’
Here gcc somewhat arbitrarily decided not to produce an executable.
Relevant quotes from C99 (and probably C11 too; this text hasn't changed in the n1570 draft):
6.9.1 Function definitions
Constraints
[...]
If the declarator includes an identifier list, each declaration in the declaration list shall
have at least one declarator, those declarators shall declare only identifiers from the
identifier list, and every identifier in the identifier list shall be declared.
Your code violates a constraint (your function declarator includes an identifier list, but there is no declaration list), which requires a diagnostic (such as the warning from gcc).
Semantics
[...] If the
declarator includes an identifier list, the types of the parameters shall be declared in a
following declaration list.
Your code violates this shall rule, so it has undefined behavior. This applies even if the function is never called!
6.5.2.2 Function calls
Constraints
[...]
If the expression that denotes the called function has a type that includes a prototype, the
number of arguments shall agree with the number of parameters. [...]
Semantics
[...]
[...] If the number of arguments does not equal the number of parameters, the
behavior is undefined. [...]
The actual call also has undefined behavior if the number of arguments passed doesn't match the number of parameters the function has.
As for omitting return: This is actually valid as long as the caller doesn't look at the returned value.
Reference (6.9.1 Function definitions, Semantics):
If the } that terminates a function is reached, and the value of the function call is used by
the caller, the behavior is undefined.
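A sketch of both sides of that rule (hypothetical function f; the commented-out line is the undefined case):

int f(int n){
    if (n > 0)
        return n;
    /* reaching the closing } here without a return is permitted */
}

int main(void){
    f(-1);                /* fine: the return value is never used */
    /* int x = f(-1); */  /* undefined: the caller would use the value */
    return 0;
}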
1 The sole exception seems to be the #error directive, about which the standard says:
The implementation shall not successfully translate a preprocessing translation unit
containing a #error preprocessing directive unless it is part of a group skipped by
conditional inclusion.

Function without any return type

I know there are two types of functions: void and the ones that return something (int, double, etc). But what if a function is declared without any return statements? Is it considered to be a void function? For example,
myFunction(int value){
.......
}
A function declared without any return type is considered returning an int. This is an ancient rule going back to the original K&R C, left in the language for backward compatibility.
Integer promotions and conversion of float arguments to doubles are done for functions without prior definition or forward declaration for the same reason - backward compatibility with really old code.
It is needless to say that relying on these rules for new development is a very bad practice. One should always declare return type explicitly, and forward-declare or define all functions before calling them.
First off, I have to say that your code:
myFunction(int value){
.......
}
is a function definition, not just a declaration. A pure declaration would look like:
myFunction(int value);
or
myFunction(int); // also valid, but less recommended due to poor readability
Also note that every definition can serve as a declaration of the function, but not vice versa.
Let's go back to your question. Under the earlier standards, myFunction(int value); is equivalent to int myFunction(int value);. In fact, under those standards compilers assume that all undeclared functions return an int.
Since C99, however, a declaration without a return type is no longer a valid function declaration, although a compiler is still very likely to treat the missing type as int.
Anyway, this is very bad coding style and should be avoided, because compile-time checking cannot be performed and you are depending on undefined behavior, which is never trustworthy. If you don't need a return value, you can just write:
void myFunction(int value);
See also: Are prototypes required for all functions in C89, C90 or C99?
If your compiler is set up correctly to compile according to the C standard, such functions are not allowed.
For example gcc -std=c11 -pedantic-errors -Wall.
If you have a 20+ year old compiler, though, such functions were allowed. The return type would then default to int if not stated explicitly. Similarly, if no function prototype was visible, the compiler would make up a nonsense function declaration based on the function call, with the return type defaulting to int.
There was never any rationale for why the "implicit int" feature made sense. What it did in practice was block the compiler from doing static type checks, and it thereby created countless bugs.
The dangerous "implicit int" was therefore promptly removed from the C language in 1999.
This is actually one of the main reasons why you shouldn't be using a C compiler that follows some ancient version of the C language in the year 2016.

Implicit declaration of recursive function in C [duplicate]

What is meant by the term "implicit declaration of a function"? A call to a standard library function without including the appropriate header file produces a warning as in the case of:
int main(){
    printf("How is this not an error?");
    return 0;
}
Shouldn't using a function without declaring it be an error? Please explain in detail. I searched this site and found similar questions, but could not find a definitive answer. Most answers said something about including the header file to get rid of the warning, but I want to know how this is not an error.
It should be considered an error. But C is an ancient language, so it's only a warning.
Compiling with -Werror (GCC) fixes this problem.
When C doesn't find a declaration, it assumes this implicit declaration: int f();, which means the function can receive whatever you give it, and returns an integer. If this happens to be close enough (and in case of printf, it is), then things can work. In some cases (e.g., the function actually returns a pointer, and pointers are larger than ints), it may cause real trouble.
Note that this was fixed in newer C standards (C99 and C11). In these standards, this is an error. However, GCC doesn't implement these standards by default, so you still get the warning.
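A classic instance of the pointer case is calling malloc without <stdlib.h> (a sketch; on a platform where pointers are wider than int, the assumed int return value truncates the pointer, and modern compilers reject this code outright):

#include <stdio.h>
/* #include <stdlib.h> deliberately omitted */

int main(void){
    char *p = malloc(100);   /* implicitly declared as int malloc(), so a
                                64-bit pointer can be truncated to an int */
    printf("%p\n", (void *)p);
    return 0;
}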
Implicit declarations are not valid in C.
C99 removed this feature (present in C89).
GCC chooses to only issue a warning by default with -std=c99, but a compiler has the right to refuse to translate such a program.
To complete the picture: since -Werror might be considered too "invasive",
for GCC (and Clang), a more precise solution is to turn just this warning into an error, using the option:
-Werror=implicit-function-declaration
See How can I make this GCC warning an error?.
Regarding general use of -Werror: of course, having warning-free code is advisable, but at some stages of development it might slow down prototyping.
For historical reasons going back to the very first version of C, a call passes each argument as whatever type it has. So it could be an int or a double or a char*. Without a prototype, the compiler will pass the argument at whatever size it is, and the function being called had better use the correct argument type to receive it.
For more details, look up K&R C.
An implicitly declared function is one that has neither a prototype nor a definition, but is called somewhere in the code. Because of that, the compiler cannot verify that this is the intended usage of the function (whether the count and the type of the arguments match). Resolving the references to it is done after compilation, at link-time (as with all other global symbols), so technically it is not a problem to skip the prototype.
It is assumed that the programmer knows what he is doing and this is the premise under which the formal contract of providing a prototype is omitted.
Nasty bugs can happen if calling the function with arguments of a wrong type or count. The most likely manifestation of this is a corruption of the stack.
Nowadays this feature might seem as an obscure oddity, but in the old days it was a way to reduce the number of header files included, hence faster compilation.
C is a very low-level language, so it permits you to create almost any legal object (.o) file that you can conceive of. You should think of C as basically dressed-up assembly language.
In particular, C does not require functions to be declared before they are used. If you call a function without declaring it, the use of the function becomes its (implicit) declaration. In a simple test I just ran, this is only a warning in the case of built-in library functions like printf (at least in GCC), but for random functions, it will compile just fine.
Of course, when you try to link and the linker can't find a definition for your function (say, foo), then you will get an error.
In the case of library functions like printf, some compilers contain built-in declarations for them so they can do some basic type checking, so when the implicit declaration (from the use) doesn't match the built-in declaration, you'll get a warning.

Confused over function call in pre-ANSI C syntax

I'm dealing with some pre-ANSI C syntax. Say I have the following function call in one conditional:
BPNN *net;
// Some more code
double val;
// Some more code, and then,
if (evaluate_performance(net, &val, 0)) {
But then the function evaluate_performance was defined as follows (below the function which has the above-mentioned conditional):
evaluate_performance(net, err)
BPNN *net;
double *err;
{
How come evaluate_performance was defined with two parameters but called with three arguments? What does the '0' mean?
And, by the way, I'm pretty sure that it isn't calling some other evaluate_performance defined elsewhere; I've grepped through all the files involved and I'm pretty sure we are supposed to be talking about the same evaluate_performance here.
Thanks!
If you call a function that doesn't have a declared prototype (as is the case here), then the compiler assumes that it takes an arbitrary number and types of arguments and returns an int. Furthermore, char and short arguments are promoted to ints, and floats are promoted to doubles (these are called the default argument promotions).
This is considered bad practice in new C code, for obvious reasons: if the function doesn't return int, badness could ensue; you prevent the compiler from checking that you're passing the correct number and types of arguments; and arguments might get promoted incorrectly.
C99, the latest edition of the C standard, removes this feature from the language, but in practice many compilers still allow it even when operating in C99 mode, for legacy compatibility.
As for the extra parameters, they are technically undefined behavior according to the C89 standard. But in practice, they will typically just be ignored by the runtime.
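A sketch of the default argument promotions in action (hypothetical old_style function; note that the old-style definition must declare the promoted types to match):

int old_style();   /* no prototype: arguments are promoted, never checked */

int main(void){
    char c = 'A';
    float x = 1.5f;
    old_style(c, x);    /* c is passed as int, x as double */
    return 0;
}

/* old-style definition: parameters use the promoted types int and double */
int old_style(c, x)
int c;
double x;
{
    return c;
}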
The code is incorrect, but in a way that a compiler is not required to diagnose. (A C99 compiler would complain about it.)
Old-style function definitions don't specify the number of arguments a function expects. A call to a function without a visible prototype is assumed to return int and to have the number and type(s) of arguments implied by the calls (with narrow integer types being promoted to int or unsigned int, and float being promoted to double). (C99 removed this; your code is invalid under the C99 standard.)
This applies even if the definition precedes the call (an old-style definition doesn't provide a prototype).
If such a function is called incorrectly, the behavior is undefined. In other words, it's entirely the programmer's responsibility to get the arguments right; the compiler won't diagnose errors.
This obviously isn't an ideal situation; it can lead to lots of undetected errors.
Which is exactly why ANSI added prototypes to the language.
Why are you still dealing with old-style function definitions? Can you update the code to use prototypes?
Even standard C compilers are somewhat permissive when it comes to this. Try running the following:
#include <stdio.h>

int foo()
{
    printf("here");
}

int main()
{
    foo(3,4);
    return 0;
}
It will, to some people's surprise, output "here". The extra arguments are simply ignored. Of course, this depends on the compiler.
Overloading doesn't exist in C, so having two conflicting declarations in the same translation unit would not work.
That must be quite an old compiler not to err on this one, or it simply had not seen the declaration of the function yet!
Some compilers would not warn/err when calling an undefined function. That's probably what you're running into. I would suggest you look at the command line flags of the compiler to see whether there is a flag you can use to get these warnings because you may actually find quite a few similar mistakes (too many parameters is likely to work just fine, but too few will make use of "undefined" values...)
Note that it is legitimate to pass extra parameters when the ellipsis is used, as in printf():
int printf(const char *format, ...);
I would imagine that the function had 3 parameters at some point and the last one was removed because it was unused, and some parts of the code were not corrected as they ought to have been. I would remove that 3rd parameter, just in case the stack goes in the wrong order and thus fails to send the correct parameters to the function.
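For comparison, a minimal variadic sketch (hypothetical sum_ints): with an ellipsis the extra arguments are deliberate, and the callee retrieves them with the <stdarg.h> macros:

#include <stdarg.h>
#include <stdio.h>

int sum_ints(int count, ...){   /* the ellipsis makes extras legitimate */
    va_list ap;
    int total = 0;
    va_start(ap, count);
    for (int i = 0; i < count; i++)
        total += va_arg(ap, int);   /* each extra argument is read as int */
    va_end(ap);
    return total;
}

int main(void){
    printf("%d\n", sum_ints(3, 1, 2, 3));   /* prints 6 */
    return 0;
}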
