GCC typically yields this warning when the proper header file is not included. This link --> www.network-theory.co.uk/docs/gccintro/gccintro_19.html says that because the function declaration is implicit (rather than explicitly declared via a header), the wrong argument types could actually be passed to the function, yielding incorrect results. I don't understand this. Does this mean the compiler generates code that pushes something of the machine's word size onto the stack for the callee to consume, and hopes for the best?
Detail is appreciated.
If the compiler doesn't have specific information about how an argument should be passed, such as when there's no prototype or for arguments passed where the prototype has an ellipsis ('...'), the compiler follows certain rules for passing the arguments. These rules basically follow what occurred in pre-standard (or K&R) C, before prototypes were used. Paraphrased from C99 6.5.2.2/6 "Function calls":
* the integer promotions are applied
* if the argument has float type it's promoted to double
After these default argument promotions are applied, the argument is simply copied to wherever the compiler normally copies arguments (generally, the stack). So a struct argument would be copied to the stack.
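As a minimal sketch of those promotions in action (the function and variable names here are made up for illustration), consider a variadic function: the char argument below arrives as an int and the float as a double, which is why va_arg must name the promoted types:

#include <stdarg.h>
#include <stdio.h>

static void show(int count, ...)
{
    va_list ap;
    va_start(ap, count);
    int i = va_arg(ap, int);       /* the char was promoted to int */
    double d = va_arg(ap, double); /* the float was promoted to double */
    va_end(ap);
    printf("%d args: %d %f\n", count, i, d);
}

int main(void)
{
    char c = 'A';
    float f = 1.5f;
    show(2, c, f); /* the default argument promotions apply to c and f */
    return 0;
}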
If the actual function implementation doesn't match how the compiler prepares the arguments, then you get undefined behavior (with exceptions: a signed/unsigned mismatch is fine if the value is representable in both types, and pointers to char and pointers to void can be mixed and matched).
Also, in C90, if the function is implicitly declared (which C99 doesn't permit, though it does permit functions without prototypes), the return type defaults to int. Once again, if the actual function returns something else, undefined behavior results.
In classic K&R C, that's pretty much what happened: there were default coercions (anything smaller than int was promoted to int, for example), and for backwards compatibility any function without a prototype is still called that way. By and large, the only indication you got of passing the wrong type was a weird result or maybe a core dump. Which is where you get in trouble: when the function has a prototype, the exact (not coerced/promoted) value is passed. So if you're passing a char, then with a prototype in scope a single byte may be pushed by the caller, but otherwise 4 bytes (on most current platforms). If the caller and callee disagree about this, Bad Things happen.
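Here's a hypothetical two-file sketch (names invented) of that caller/callee disagreement:

/* impl.c -- the real definition returns a double */
double get_ratio(void)
{
    return 0.5;
}

/* main.c -- no declaration in scope, so a C90 compiler implicitly
   declares "int get_ratio()". The caller fetches an int from wherever
   ints come back, while the callee stored a double somewhere else
   entirely: undefined behavior, with no diagnostic required in C90. */
int main(void)
{
    int r = get_ratio(); /* likely garbage */
    return r;
}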
Related
I know that how arguments are passed to functions is not part of the C standard, and is dependent on the hardware architecture and calling convention.
I also know that an optimizing compiler may automatically inline functions to save on call overhead, and omit code that has no "side effects".
But, I have a question about a specific case:
Let's say there is a non-trivial function that cannot be inlined or removed, must be called, and is declared to take no arguments:
int veryImportantFunc() {
    /* do some important stuff */
    return result;
}
But this function is called with arguments:
int result = veryImportantFunc(1, 2, 3);
Is the compiler allowed to call the function without passing these arguments?
Or is there some standard or technical limitation that would prevent this kind of optimization?
Also, what if argument evaluation has side effects:
int counter = 1;
int result = veryImportantFunc(1, ++counter, 3);
Is the compiler obligated to evaluate the arguments even without passing them, or would it be legal to drop the evaluation, leaving counter == 1?
And finally, what about extra arguments:
char* anotherFunc(int answer) {
    /* Do stuff */
    return question;
}
if this function is called like this:
char* question = anotherFunc(42, 1);
Can the 1 be dropped by the compiler based on the function declaration?
EDIT: To clarify: I have no intention of writing the kind of code that is in my examples, and I did not find this in any code I am working on.
This question is to learn about how compilers work and what the relevant standards say, so to all of you who advised me to stay away from this kind of code: thank you, but I already know that.
To begin with, "declared to take no arguments" is wrong: int veryImportantFunc() declares a function accepting an unspecified number of arguments. This is obsolete C style and shouldn't be used. For a function taking no arguments, use (void).
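A two-line sketch of the difference (placeholder names):

void oldstyle();     /* unspecified arguments: oldstyle(1, 2) still compiles */
void noargs(void);   /* prototype: noargs(1, 2) is rejected at compile time */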
Is the compiler allowed to call the function without passing these arguments?
If the actual function definition does not match the number of arguments, the behavior is undefined.
Also, what if argument evaluation has side effects
Doesn't matter, since arguments are evaluated (in unspecified order) before the function is called.
Is the compiler obligated to evaluate the arguments even without passing them, or would it be legal to drop the evaluation, leaving counter == 1?
It will evaluate the arguments and then invoke undefined behavior. Anything can happen.
And finally, what about extra arguments:
Your example won't compile, as it isn't valid C.
The following quotes from the C standard are relevant to your different questions:
6.5.2.2 Function calls
...
2. If the expression that denotes the called function has a type that includes a prototype, the number of arguments shall agree with the number of parameters.
...
4. An argument may be an expression of any complete object type. In preparing for the call to a function, the arguments are evaluated, and each parameter is assigned the value of the corresponding argument.
...
6. If the expression that denotes the called function has a type that does not include a prototype, the integer promotions are performed on each argument, and arguments that have type float are promoted to double. These are called the default argument promotions. If the number of arguments does not equal the number of parameters, the behavior is undefined. If the function is defined with a type that includes a prototype, and either the prototype ends with an ellipsis (, ...) or the types of the arguments after promotion are not compatible with the types of the parameters, the behavior is undefined. If the function is defined with a type that does not include a prototype, and the types of the arguments after promotion are not compatible with those of the parameters after promotion, the behavior is undefined.
...
10. There is a sequence point after the evaluations of the function designator and the actual arguments but before the actual call. Every evaluation in the calling function (including other function calls) that is not otherwise specifically sequenced before or after the execution of the body of the called function is indeterminately sequenced with respect to the execution of the called function.
Let's say there is a non-trivial function that cannot be inlined or removed, must be called, and is declared to take no arguments:
int veryImportantFunc() {
    /* do some important stuff */
    return result;
}
But this function is called with arguments:
There are two possibilities:
the function is declared with a "full" prototype such as
int veryImportantFunc(void);
in this case the call with extra arguments won't compile, as the number of parameters and arguments must match;
the function is declared as taking an unspecified number of arguments, i.e. the declaration visible to the call site is
int veryImportantFunc();
in this case, the call is undefined behavior, as the usage doesn't match the actual function definition.
All the other considerations about optimization aren't particularly interesting, as what you are trying to do is illegal however you look at it.
We can stretch this and imagine a situation where passing extra, useless arguments is legal, for example a variadic function that never uses the extra arguments (sketched after the footnote below).
In this case, as always, the compiler is free to perform any such optimization as long as the observable behavior isn't impacted and the program proceeds "as if" executed according to the C abstract machine.
Given that the details of argument passing aren't observable(1), the compiler could in principle optimize away the argument passing, while the argument evaluation may still need to be done if it has some observable impact on the program state.
That being said, I have a hard time imagining how such an optimization could be implemented in the "classical linking model", but with LTCG it shouldn't be impossible.
(1) The only observable effects according to the C standard are IO and reads/writes on volatile variables.
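Here is the promised sketch (made-up names) of the legal variant: a variadic function that never reads its trailing arguments, so an implementation could in principle skip passing them without changing any observable behavior:

/* No <stdarg.h> needed: the extra arguments are never read. */
int mostly_fixed(int important, ...)
{
    return important * 2;
}

int main(void)
{
    return mostly_fixed(21, 1, 2, 3); /* legal: the 1, 2, 3 are simply ignored */
}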
Following Pascal's wager, it is better to be wrong in believing the compiler can make this optimisation than to be right in believing it doesn't. It serves no purpose to wrongly define a function; if you really must, you can always put a stub in front of it:
#define Used(x) ((void)(x)) /* assumed helper: marks a parameter as deliberately unused */

int RealSlimShady(void) {
    return Dufus;
}

int MaybeSlimShady(int Mathew, int Mathers) {
    Used(Mathew);
    Used(Mathers);
    return RealSlimShady();
}
Everyone is happy, and if your compiler is worth its salt, there will be 0 code overhead.
Earlier I asked a question about whether, if the return of malloc is cast, the error that would otherwise be flagged is hidden,
and the answer I got was: In (older dialects of) C, you can call a function which hasn't been declared; that implicitly declares a function with a return type of int. So, if you were to call malloc() without including the header, you'd get code that erroneously assumed that it returned int. Without a cast, you'd get a compiler error when you tried to assign it to a pointer. With a cast, the code would compile, potentially giving obscure runtime errors and lengthy debugging sessions.
I understand that without inclusion of the <stdlib.h> header file the compiler implicitly declares a function with a return type of int.
But I'm confused: according to malloc()'s definition, as written by its creator, it returns void * whenever we use it, but without <stdlib.h> it is treated as returning int. So how does its definition change? Is the compiler implicitly changing void * to int, or is there some other reason?
I know that I couldn't explain my question properly, but if anybody understands it, please explain it to me.
For C, all undeclared functions are implicitly declared to return int and take an unspecified number of unspecified arguments.
The problem with malloc is that an int and a void * may not be the same size, or even compatible. For example, on modern 64-bit machines pointers are 64-bit types while int is a 32-bit type, so if you don't have a proper declaration of malloc the compiler will drop the high 32 bits of the pointer.
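A minimal sketch of how that goes wrong on a C90 compiler (the cast is exactly what lets it compile):

/* Deliberately missing: #include <stdlib.h> */

int main(void)
{
    /* The call implicitly declares "int malloc()", and the cast
       silences the int-to-pointer complaint. On an LP64 platform the
       64-bit pointer returned by the real malloc is squeezed through
       a 32-bit int, losing the high bits. */
    char *p = (char *)malloc(100);
    return p == 0;
}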
In standard C, this will never happen. If you forget to include <stdlib.h> on a standard compiler and then call malloc, you will get a compiler error "implicit declaration" or similar, because the function has no prototype.
In older, obsolete versions of the C standard, there was a rule that said that if no prototype of a called function was visible, the return type of the function was assumed to be int. This was of course very dangerous and led to all kinds of obscure bugs. Therefore, the implicit int "feature" was removed from the C language 15 years ago, when the C99 standard was released.
I'm dealing with some pre-ANSI C syntax. I have the following function call in one conditional:
BPNN *net;
// Some more code
double val;
// Some more code, and then,
if (evaluate_performance(net, &val, 0)) {
But then the function evaluate_performance was defined as follows (below the function which has the above-mentioned conditional):
evaluate_performance(net, err)
BPNN *net;
double *err;
{
How come evaluate_performance was defined with two parameters but called with three arguments? What does the '0' mean?
And, by the way, I'm pretty sure that it isn't calling some other evaluate_performance defined elsewhere; I've grepped through all the files involved and I'm pretty sure we are supposed to be talking about the same evaluate_performance here.
Thanks!
If you call a function that doesn't have a declared prototype (as is the case here), then the compiler assumes that it takes an arbitrary number and types of arguments and returns an int. Furthermore, char and short arguments are promoted to ints, and floats are promoted to doubles (these are called the default argument promotions).
This is considered bad practice in new C code, for obvious reasons: if the function doesn't return int, badness could ensue; you prevent the compiler from checking that you're passing the correct number and types of arguments; and arguments might get promoted incorrectly.
C99, the latest edition of the C standard, removes this feature from the language, but in practice many compilers still allow it even when operating in C99 mode, for legacy compatibility.
As for the extra arguments, passing them is technically undefined behavior according to the C89 standard. But in practice, they will typically just be ignored by the runtime.
The code is incorrect, but in a way that a compiler is not required to diagnose. (A C99 compiler would complain about it.)
Old-style function definitions don't specify the number of arguments a function expects. A call to a function without a visible prototype is assumed to return int and to have the number and type(s) of arguments implied by the call (with narrow integer types being promoted to int or unsigned int, and float being promoted to double). (C99 removed this; your code is invalid under the C99 standard.)
This applies even if the definition precedes the call (an old-style definition doesn't provide a prototype).
If such a function is called incorrectly, the behavior is undefined. In other words, it's entirely the programmer's responsibility to get the arguments right; the compiler won't diagnose errors.
This obviously isn't an ideal situation; it can lead to lots of undetected errors.
Which is exactly why ANSI added prototypes to the language.
Why are you still dealing with old-style function definitions? Can you update the code to use prototypes?
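For reference, a prototyped rewrite might look like the sketch below (the exact signature is a guess, since the question doesn't show what the third argument was for). With the prototype visible, the three-argument call becomes a compile-time error instead of silent undefined behavior:

int evaluate_performance(BPNN *net, double *err); /* declaration, e.g. in a header */

int evaluate_performance(BPNN *net, double *err)  /* prototype-style definition */
{
    /* ... do the work, store the error through err ... */
    return 0; /* placeholder */
}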
Even standard C compilers are somewhat permissive when it comes to this. Try running the following:
#include <stdio.h>

int foo()
{
    printf("here");
}

int main()
{
    foo(3, 4);
    return 0;
}
It will, to some people's surprise, output "here". The extra arguments are just ignored. Of course, it depends on the compiler.
Overloading doesn't exist in C, so having two declarations of the same function would not work in the same translation unit.
That must be quite an old compiler not to err on this one, or it simply had not seen the declaration of the function yet!
Some compilers would not warn/err when calling an undefined function. That's probably what you're running into. I would suggest you look at the command line flags of the compiler to see whether there is a flag you can use to get these warnings because you may actually find quite a few similar mistakes (too many parameters is likely to work just fine, but too few will make use of "undefined" values...)
Note that it is possible to do this (pass extra arguments) when the declaration uses an ellipsis, as in printf():
int printf(const char *format, ...);
I would imagine that the function had 3 parameters at some point and the last one was removed because it was unused, and some parts of the code were not corrected as they ought to have been. I would remove that 3rd parameter, just in case the stack is laid out in the wrong order and thus fails to send the correct parameters to the function.
See, I have made a library which has several .h and several .c files.
Now, under some circumstances, I have not removed a warning that says
warning: implicit declaration of function ‘getHandle’
Now I want to ask you, does this cause any problem in the binary of my library?
Will it negatively affect the execution of my library on embedded platforms or anywhere else?
In C90, a call to a function with no visible declaration creates an implicit declaration of a function returning int and taking the promoted types of the arguments. If your getHandle function returns a pointer, for example, then the compiler will generate code assuming that it returns an int. Assigning the result to a pointer object should at least trigger another warning. It might work ok (if int and the pointer type are the same size, and int and pointer function results are returned in the same way), or it could go badly wrong (if, for example, int is 32 bits and pointers are 64 bits, or if pointer results are returned in 68K-style address registers).
The C99 standard removed implicit int from the language. Calling a function with no visible declaration is a constraint violation, requiring a diagnostic and possibly causing your program to be rejected.
And if you just fix the problem, you don't have to waste time figuring out how and whether it will work if you don't fix it.
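The fix is simply to give getHandle a visible prototype, e.g. in a header included by every caller. A hypothetical sketch (the return type is invented for illustration):

/* handle.h -- hypothetical */
#ifndef HANDLE_H
#define HANDLE_H

typedef struct Handle Handle; /* assumed return type, for illustration only */
Handle *getHandle(void);

#endif

With that declaration in scope, the compiler can both check every call and generate correct code for the pointer return value.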
In such a case the compiler cannot verify that the usage of getHandle is proper - that is, whether the return value and the arguments are of the right type and count. It will just accept the way you call it without any explicit statement that this is the right way.
Adding a prototype of the function is a way to tell the compiler that this is the intended usage of the function and it will be a compile-time error not to comply to that.
It may cause some nasty bugs in case there is a mismatch between the way it's called and the way the function expects its arguments. The most likely effect will be a corruption of the stack.
If there isn't a mismatch, it won't make any difference at runtime.
Quick question:
strlen(char*) works perfectly regardless of whether I #include <string.h> or not.
All I get from the compiler is a warning about implicit declaration, but functionally it works as intended.
Why is that?
When you invoke undefined behavior, one possible behavior is that the program behaves as you expected it to, on your system, and with the current version of libraries and system software you have installed. This does not mean it's okay to do this. Actually, a correct C99 compiler should not silently allow implicit function declarations; it should give you a diagnostic.
Function prototypes in C are not compulsory. They're useful indications to the compiler so that it can do type checking on the arguments passed in. When you don't include string.h, a default signature is assumed for the function, which is why you get the warning.
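To make that concrete, here is a sketch of the situation the question describes; the real strlen returns size_t, but without the header a lenient compiler assumes int:

/* Deliberately not including <string.h> */
#include <stdio.h>

int main(void)
{
    /* Implicitly declared as "int strlen()". The linker still finds the
       real function, and on platforms where an int result and a size_t
       result come back the same way, it appears to work. */
    printf("%d\n", (int)strlen("hello"));
    return 0;
}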
If a function is called without a prototype in scope, the compiler will generate code to call a function which accepts whatever types of parameters are passed and returns an integer result. If the parameters match those the function expects, and if the function returns its result in a way the calling code can handle(*), all will be well. The purpose of prototyping is to ensure that arguments get converted to the expected types if possible, and that compilation will fail if they cannot be converted. For example, if a non-prototyped function expects an argument of type long and one attempts to pass an int, any of the following may occur:
The program may crash outright
Things may work as expected
The function may execute as though it were passed some arbitrary different parameter value
The program may continue to run, but with arbitrary values corrupting any or all program variables.
Demons may fly out of the programmer's nose
By contrast, if the function were prototyped, the compiler would be guaranteed to do whatever was necessary to convert the 'int' to a 'long' prior to calling the function.
When C was originally conceived, prototypes didn't exist, and the programmer was responsible for ensuring that all arguments were passed with the precise types expected. In practice, it's a real pain to ensure that all function arguments are always the exact proper types (e.g. when passing the value five to a function that expects a long, one must write it as either "5L" or "(long)5"). Realistically speaking, there's never(**) any reason to rely upon implicit argument types, except with variadic functions.
(*) Any of the things that can happen with incorrect parameter types can happen with incorrect return types, except that when a function is expected to return int and the actual return value would fit in an int, the results are more likely to be correct than when incorrect parameter types are used.
(**) The only exceptions I can think of would be if one was programming for some really old hardware for which a prototype-aware compiler was unavailable, or if one was programming for a code-golf or similar competition. The latter I consider puzzle-solving rather than programming, and I'm unaware of any hardware people would be interested in using for which the former condition would apply.
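A short sketch (hypothetical names) of the guarantee described above:

void takes_long(long value); /* prototype in scope */

void caller(void)
{
    takes_long(5); /* the int 5 is converted to long automatically */
}

/* Without the prototype, pre-ANSI code had to write takes_long(5L). */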
Because its declaration is equal to the so-called 'default declaration': the compiler expects any unknown function to return int and to take parameters as passed at the first use of the function in the code.
Usually this is because another header file which you have included ALSO includes string.h. Obviously it is bad practice to assume that you don't need to include something just because something else does, but it is most likely responsible for this effect.
I'm guessing it's because in C an int is the default data type returned from a function.
Can you give a fuller code example?
The function prototype is in the include files. So if you don't include those files, a fixed prototype of int function_name(); is assumed. The code of strlen() itself is in the library files that are linked in, so the function gives correct output (but only because the call happens to be compatible with that assumed int function_name(); declaration).