Quick question:
Calling strlen() on a char * works perfectly regardless of whether I #include <string.h> or not.
All I get from the compiler is a warning about an implicit declaration, but functionally it works as intended.
Why is that?
When you invoke undefined behavior, one possible behavior is that the program behaves as you expected it to, on your system, and with the current version of the libraries and system software you have installed. That does not mean it's okay to do this. In fact, a conforming C99 compiler is required to diagnose an implicit function declaration, and many will give you an error.
Function prototypes in C are not compulsory. They're useful indications to the compiler so that it can type-check the arguments passed to functions. When you don't include string.h, a default signature is assumed for the function, which is why you get the warning.
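For instance, here is a minimal sketch (assuming a pre-C99 compiler; this is not the questioner's code) of what the compiler effectively sees when <string.h> is omitted:

/* <string.h> deliberately omitted: the compiler acts as if you had
   declared "int strlen();" yourself, so it only warns. */
#include <stdio.h>

int main(void)
{
    const char *s = "hello";
    int n = strlen(s);      /* warning about an implicit declaration of strlen */
    printf("%d\n", n);      /* happens to print 5 on typical systems */
    return 0;
}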
If a function is called without a prototype in scope, the compiler will generate code to call a function that accepts whatever types of parameters are passed and returns an int result. If the parameters match those the function expects, and if the function returns its result in a way the calling code can handle(*), all will be well. The purpose of prototyping is to ensure that arguments get converted to the expected types where possible, and that compilation fails when they cannot be converted. For example, if a non-prototyped function expects an argument of type 'long' and one attempts to pass an 'int', any of the following may occur:
The program may crash outright
Things may work as expected
The function may execute as though it were passed some arbitrary different parameter value
The program may continue to run, but with arbitrary values corrupting any or all program variables.
The computer may cause demons to fly out of the programmer's nose
By contrast, if the function were prototyped, the compiler would be guaranteed to do whatever was necessary to convert the 'int' to a 'long' prior to calling the function.
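To make that concrete, here is a small sketch of the 'long' versus 'int' scenario, assuming a compiler that still accepts old-style (K&R) declarations and definitions; f is a made-up function:

#include <stdio.h>

long f();          /* non-prototype declaration: argument types are unknown */

int main(void)
{
    printf("%ld\n", f(5L));   /* correct: the caller supplies exactly a long */
    printf("%ld\n", f(5));    /* passes an int where a long is expected:
                                 undefined behavior (may "work" by accident) */
    return 0;
}

long f(x)          /* old-style definition, no prototype */
long x;
{
    return x * 2;
}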
When C was originally conceived, prototypes didn't exist, and the programmer was responsible for ensuring that all arguments were passed with the precise types expected. In practice, it's a real pain to ensure that all function arguments are always the exact proper types (e.g. when passing the value five to a function that expects a long, one must write it as either "5L" or "(long)5"). Realistically speaking, there's never(**) any reason to rely upon implicit argument types, except with variadic functions.
(*) Any of the things that can happen with incorrect parameter types can happen with incorrect return types, except that when a function is expected to return 'int' and the actual return value would fit in an 'int', the results are more likely to be correct than when incorrect parameter types are used.
(**) The only exceptions I can think of would be if one was programming for some really old hardware for which a prototype-aware compiler was unavailable, or if one was programming for a code-golf or similar competition. The latter I consider puzzle-solving rather than programming, and I'm unaware of any hardware people would be interested in using for which the former condition would apply.
Because its declaration is equal to the so-called 'default declaration': the compiler assumes any unknown function returns int and takes parameters of whatever types are passed the first time the function is used in the code.
Usually this is because another header file which you have included ALSO includes string.h. Obviously it is bad practice to assume that you don't need to include something just because something else does, but it is most likely responsible for this effect.
I'm guessing it's because in C, int is the default return type of a function.
Can you give a fuller code example?
The function prototype is in the include files. Even if you don't include those files, the compiler assumes a default declaration, int function_name();. The code of strlen() is in the library files that get linked in, so the call produces the correct output, but only because strlen happens to be compatible enough with that assumed declaration.
Related
What will happen if I don't include the header files when compiling a C program? I know that I get warnings, but the program runs perfectly.
I know that the header files contain function declarations. Therefore when I don't include them, how does the compiler figure it out? Does it check all the header files?
I know that I get warnings, but the program runs perfectly.
That is an unfortunate legacy of pre-ANSI C: the language did not require function prototypes, and standard C still does not require them to this day (usually, a warning can be produced to find functions called without a prototype).
When you call a function with no prototype in scope, the C compiler makes assumptions about the function being called:
The function's return type is assumed to be int
All parameters are assumed to be declared (i.e. no ... vararg stuff)
All parameters are assumed to be whatever you pass after default promotions, and so on.
If the function being called with no prototype fits these assumptions, your program will run correctly; otherwise, it's undefined behavior.
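As a sketch of those assumptions in action (assuming a pre-C99 compiler, e.g. gcc -std=gnu89, and deliberately omitting <stdlib.h>; this is not code from the question):

#include <stdio.h>
/* <stdlib.h> deliberately omitted, so neither function below is declared. */

int main(void)
{
    int    i = atoi("42");    /* implicitly "int atoi()": happens to fit, works  */
    double d = atof("3.14");  /* implicitly "int atof()": wrong return type, UB  */
    printf("%d %f\n", i, d);  /* 42, then garbage (or worse) on many systems     */
    return 0;
}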
Before the 1989 ANSI C standard, there was no way to declare a function and indicate the types of its parameters. You just had to be very careful to make each call consistent with the called function, with no warning from the compiler if you got it wrong (like passing an int to sqrt()). In the absence of a visible declaration, any function you call was assumed to return int; this was the "implicit int" rule. A lot of standard functions do return int, so you could often get away with omitting a #include.
The 1989 ANSI C standard (which is also, essentially, the 1990 ISO C standard) introduced prototypes, but didn't make them mandatory (and they still aren't). So if you write, without the declaration from <stdio.h> in scope,
int c = getchar();
it would actually work, because getchar() returns an int.
The 1999 ISO C standard dropped the implicit int rule, and made it illegal (actually a constraint violation) to call a function with no visible declaration. So if you call a standard function without the required #include, a C99-conforming compiler must issue a diagnostic (which can be just a warning). Non-prototype function declarations (ones that don't specify the types of the arguments) are still legal, but they're considered obsolescent.
(The 2011 ISO C standard didn't change much in this particular area.)
But there's still plenty of code out there that was written for C90 compilers, and most modern compilers still support the older standard.
So if you call a standard function without the required #include, what will probably happen is that (a) the compiler will warn you about the missing declaration, and (b) it will assume that the function returns int and takes whatever number and type(s) of arguments you actually passed it (also accounting for type promotion, such as short to int and float to double). If the call is correct, and if your compiler is lenient, then your code will probably work -- but you'll have one more thing to worry about if it fails, perhaps for some unrelated reason.
Variadic functions like printf are another matter. Even in C89/C90, calling printf with no visible prototype had undefined behavior. A compiler can use an entirely different calling convention for variadic functions, so printf("hello") and puts("hello") might generate quite different code. But again, for compatibility with old code, most compilers use a compatible calling convention, so for example the first "hello world" program in K&R1 will probably still compile and run.
You can also write your own declarations for standard functions; the compiler doesn't care whether it sees a declaration in a standard header or in your own source file. But there's no point in doing so. Declarations have changed subtly from one version of the standard to the next, and the headers that came with your implementation should be the correct ones.
So what will actually happen if you call a standard function without the corresponding #include?
In a typical working environment, it doesn't matter, because with any luck your program won't survive code review.
In principle, any compiler that conforms to C99 or later may reject your program with a fatal error message (gcc behaves this way with -std=c99 -pedantic-errors). In practice, most compilers will merely print a warning. The call will probably work if the function returns int (or if you ignore the result) and if you get all the argument types correct. If the call is incorrect, the compiler may not be able to print good diagnostics. If the function doesn't return int, the compiler will probably assume that it does, and you'll get garbage results, or even crash your program.
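A classic way to see the "garbage results" case is sqrt without <math.h>; a sketch (how badly it misbehaves varies, and a compiler that treats sqrt as a built-in may even get it right, which only makes the problem harder to spot):

/* <math.h> deliberately omitted; link with -lm if your system needs it. */
#include <stdio.h>

int main(void)
{
    double r = sqrt(2.0);   /* assumed to return int, but really returns double */
    printf("%f\n", r);      /* often prints garbage instead of 1.414214 */
    return 0;
}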
So you can study this answer of mine, follow up by reading the various versions of the C standard, find out exactly which edition of the standard your compiler conforms to, and determine the circumstances in which you can safely omit a #include header -- with the risk that you'll mess something up next time you modify your program.
Or you can pay attention to your compiler's warnings (which you've enabled with whatever command-line options are available), read the documentation for each function you call, add the required #includes at the top of each source file, and not have to worry about any of this stuff.
First of all: just include them.
If you don't, the compiler will use a default declaration for undeclared functions, which amounts to:
int functionName();
that is, a function returning int and taking an unspecified number and types of arguments. So it will compile, and link if the functions are available, but you may well have problems at run time.
There are a lot of things you can't do if you leave out headers:
(I'm hoping to get some more from the comments since my memory is failing on this ...)
You can't use any of the macros defined in the headers. This can be significant (see the snippet right after this list).
The compiler can't check that you are calling functions properly, since the headers supply the parameter types it needs for that.
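To make the macro point concrete, here is a made-up snippet; each commented name below comes from a header and simply does not exist if the corresponding #include is removed:

#include <stdio.h>    /* FILE, NULL, EOF */
#include <stdlib.h>   /* EXIT_SUCCESS, EXIT_FAILURE */

int main(void)
{
    FILE *f = NULL;             /* FILE is a typedef, NULL a macro, both from headers */
    if (f == NULL)
        return EXIT_SUCCESS;    /* EXIT_SUCCESS is a macro from <stdlib.h> */
    return EXIT_FAILURE;
}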
For compatibility with old programs, C compilers can compile code that calls functions which have not been declared, assuming the parameters and return value are of type int. What can happen? See for example this question: Troubling converting string to long long in C. I think it's a great illustration of the problems you can run into if you don't include the necessary headers and so don't declare the functions you use. What happened was that the asker tried to use atoll without including stdlib.h, where atoll is declared:
char s[30] = { "115" };
long long t = atoll(s);
printf("Value is: %lld\n", t);
Surprisingly, this printed 0, not 115 as expected! Why? Because the compiler didn't see the declaration of atoll and assumed its return value was an int, so it picked up only part of the value produced by the function; in other words, the return value got truncated.
That's one of the reasons it is recommended to compile your code with -Wall (all warnings on).
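For completeness, here is that fragment again with the missing include added, sketched as a complete program; with the declaration visible, the full long long return value is used:

#include <stdio.h>
#include <stdlib.h>   /* declares: long long atoll(const char *); */

int main(void)
{
    char s[30] = { "115" };
    long long t = atoll(s);
    printf("Value is: %lld\n", t);   /* now prints 115 as expected */
    return 0;
}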
When you use malloc without including <stdlib.h>, you get an implicit declaration warning:
"warning: incompatible implicit declaration of built-in function ‘malloc’"
This warning is due to the fact that the compiler assumes malloc is declared as int malloc(size) while it is actually void *malloc(size).
But how does it know about void *malloc(size)? We have not included anything from the header files. So how is it comparing my call against something that has not been included?
And after that, how does my code work? How does it find the correct malloc definition and use it?
Is there some defined order in which function definitions are searched?
When you call a function f that has not been declared, an implicit declaration like this takes effect:
int f();
Note that you can still pass arguments to f, since it was not declared int f(void); (this is for backwards compatibility with K&R C).
Thus, the compiler doesn't "know" that malloc receives a size argument; in fact you can pass it whatever you want, and it doesn't care.
So, the mere fact that the code works is pure luck. In the case of malloc, if the code works, it just means that an int is the same size as a pointer on your platform, no more, no less, so you can still call malloc and assign its result to a pointer, since no bits get trimmed or lost.
The true function is found in the linking stage, after compilation takes place. By this time, your code has already been compiled against a wrong function declaration. Of course, if the linker can't find the function anywhere in its path, an error is reported and everything is aborted.
For the case of malloc, and other standard library functions, there may be built-in versions to increase performance. There's even an option for gcc to disable built-in functions (-fno-builtin or -fno-builtin-function). From the gcc manpage:
GCC normally generates special code to handle certain built-in
functions more efficiently; for instance, calls to "alloca" may become
single instructions that adjust the stack directly, and calls to
"memcpy" may become inline copy loops. The resulting code is often
both smaller and faster, but since the function calls no longer appear
as such, you cannot set a breakpoint on those calls, nor can you
change the behavior of the functions by linking with a different
library. In addition, when a function is recognized as a built-in
function, GCC may use information about that function to warn about
problems with calls to that function, or to generate more efficient
code, even if the resulting code still contains calls to that
function.
So, in the particular case of malloc, this is how the compiler "knows" its usual signature. Try to compile this code with gcc:
int main(void) {
    char *a = malloc(12, 13, 14, 15);
    return 0;
}
You will see it will abort with a compilation error:
test.c:3: error: too many arguments to function `malloc'
If you use the option -fno-builtin, the error goes away and the warning is different:
test.c:3: warning: implicit declaration of function `malloc'
This is the same warning you get every time you use a regular function that was not previously declared, because now the compiler is ignoring what it "knows" about these functions. This example uses gcc, but other compilers will have similar behavior.
But how does it know about void *malloc(size)? We have not included anything from the header files. So how is it comparing my call against something that has not been included?
Most modern compilers have a built-in (i.e. hard-coded in the compiler) list of common standard library functions, so the compiler knows what the declaration should be even without the function being declared (explicitly or implicitly) or called.
This does not mean that the correct declaration will be used (because the implicit declaration rule overrides the compiler's apparent knowledge), but at least you know that you're doing something wrong.
The exact purpose of these built-ins is that the compiler can warn you if you forget to include a header file. (There are other purposes as well, such as the opportunity to perform intrinsic optimizations by knowing the semantics/implementation of a built-in function, but those do not apply here.)
The reason you got this specific warning is that malloc is a built-in function in this specific compiler (as the warning states). This function receives special treatment from the compiler. The compiler has intrinsic knowledge of how this function has to be declared, so when the implicit declaration contradicts the proper declaration, the warning is issued immediately.
In the general case, with ordinary (non-built-in) functions, you will get a similar warning for an implicitly declared function when the compiler actually discovers the first explicit declaration of the same function (assuming there's a mismatch between the two). In that case the warning is issued later: not at the point of the call, but at the point of the explicit declaration. That's the way it works with ordinary functions.
But with such standard (built-in) functions as malloc the compiler allows itself a bit of "cheating".
It doesn't find the correct definition if you don't declare it properly (either with an explicit declaration in your source code, or by including the proper header). It uses a default, implicit declaration for any functions that are called before they're declared. The implicit declaration includes the assumption that the function returns int.
It's then comparing the default declaration with the way that you called the function. It notices that you're assigning the result to a pointer variable rather than an integer, so it warns you that these are incompatible.
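A minimal snippet that reproduces the situation being described (the exact wording of the warnings varies between compilers and versions):

/* No #include <stdlib.h> on purpose. */
int main(void)
{
    char *p = malloc(12);   /* gcc typically reports both "implicit declaration
                               of function 'malloc'" and "incompatible implicit
                               declaration of built-in function 'malloc'" here  */
    return p != 0;          /* just to keep p used; the leak is irrelevant here */
}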
Do I really need header files for using libraries?
My code works without header files and the libraries are linked perfectly, except that I get a "warning: implicit declaration of function" message. Still, it's just a warning and my program runs.
A header file is a file that allows programmers to separate certain elements of a program's source code into reusable files.
A library file is the executable code that works according to what is specified in the header file.
I get a "warning: implicit declaration of function" message.
There may be many reasons for this warning: maybe your header guard is not working right, or maybe you declared the function as Func but called it as func.
Check this out:
Including a header file is equivalent to copying the contents of the header file, but we don't do that by hand because it would be very error-prone; it is not a good idea to copy the contents of a header file into the source files, especially if we have multiple source files comprising our program.
For most beginners this warning is first seen when they start using malloc.
So I am using that example. For the warning in this case you need to add:
#include <stdlib.h>
This file contains the declaration of malloc. If you don't include it, the compiler thinks you want to define your own function named malloc, and it warns you because:
You don't explicitly declare it and
There already is a built-in function with that name which has a different signature than the one that was implicitly declared (when a function is declared implicitly, its return type is assumed to be int, which isn't compatible with the real malloc, which takes a size_t and returns a void *).
My code works without header files and the libraries are linked perfectly except that I get a "warning: implicit declaration of function" message. Still it's just a warning; my program runs.
It is a bad practice to ignore a warning from the compiler. Programs that are not compiled properly could have bugs that might cause you bigger problems and errors in the future.
So for example, if you get a warning:
prog.c:12:5: warning: implicit declaration of /*some function used in the program*/
It is expected that you include the header file in which that function is declared.
Do I really need header files for using libraries?
Yes.
My code works without header files
No, it doesn't. It only appears to be working.
and the libraries are linked perfectly except that I get a "warning: implicit declaration of function" message
In this case, I wouldn't call it "perfect".
Still it's just a warning; my program runs.
And you can never know what it does since it now invokes undefined behavior. "Just a warning"? Sure. There's a reason the compiler warns you. Respect warnings and fix them.
Here's the quote from C99 (6.5.2.2p6), which explains why the program has UB without headers (more precisely, without proper function prototypes, which you don't seem to have either):
If the expression that denotes the called function has a type that does not include a prototype, the integer promotions are performed on each argument, and arguments that have type float are promoted to double. These are called the default argument promotions. If the number of arguments does not equal the number of parameters, the behavior is undefined.
If the function is defined with a type that includes a prototype, and either the prototype ends with an ellipsis (, ...) or the types of the arguments after promotion are not compatible with the types of the parameters, the behavior is undefined.
If the function is defined with a type that does not include a prototype, and the types of the arguments after promotion are not compatible with those of the parameters after promotion, the behavior is undefined, except for the following cases:
— one promoted type is a signed integer type, the other promoted type is the corresponding unsigned integer type, and the value is representable in both types;
— both types are pointers to qualified or unqualified versions of a character type or void.
There's a very good reason header files are used. People who invented and designed the language knew what they were doing, and they made this decision because it's a necessary thing and a good solution to the problem that types need to be visible across compilation units.
I'm dealing with some pre-ANSI C syntax. I have the following function call in one conditional:
BPNN *net;
// Some more code
double val;
// Some more code, and then,
if (evaluate_performance(net, &val, 0)) {
But then the function evaluate_performance was defined as follows (below the function which has the above-mentioned conditional):
evaluate_performance(net, err)
BPNN *net;
double *err;
{
How come evaluate_performance was defined with two parameters but called with three arguments? What does the '0' mean?
And, by the way, I'm pretty sure that it isn't calling some other evaluate_performance defined elsewhere; I've grepped through all the files involved and I'm pretty sure we're talking about the same evaluate_performance here.
Thanks!
If you call a function that doesn't have a declared prototype (as is the case here), then the compiler assumes that it takes an arbitrary number and types of arguments and returns an int. Furthermore, char and short arguments are promoted to ints, and floats are promoted to doubles (these are called the default argument promotions).
This is considered bad practice in new C code, for obvious reasons: if the function doesn't return int, badness could ensue; you prevent the compiler from checking that you're passing the correct number and types of arguments; and arguments might get promoted incorrectly.
C99 removed this feature from the language, but in practice many compilers still allow it even when operating in C99 mode, for legacy compatibility.
As for the extra arguments, passing them is technically undefined behavior according to the C89 standard, but in practice they will typically just be ignored at run time.
The code is incorrect, but in a way that a compiler is not required to diagnose. (A C99 compiler would complain about it.)
Old-style function definitions don't specify the number of arguments a function expects. A call to a function without a visible prototype is assumed to return int and to take the number and type(s) of arguments implied by the call (with narrow integer types being promoted to int or unsigned int, and float being promoted to double). (C99 removed this rule; your code is invalid under the C99 standard.)
This applies even if the definition precedes the call (an old-style definition doesn't provide a prototype).
If such a function is called incorrectly, the behavior is undefined. In other words, it's entirely the programmer's responsibility to get the arguments right; the compiler won't diagnose errors.
This obviously isn't an ideal situation; it can lead to lots of undetected errors.
Which is exactly why ANSI added prototypes to the language.
Why are you still dealing with old-style function definitions? Can you update the code to use prototypes?
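For illustration, this is roughly the shape of that modernization, using made-up stand-in types since BPNN belongs to the question's code:

#include <stdio.h>

struct net { int layers; };                 /* stand-in for BPNN */

int eval(struct net *n, double *err);       /* a real prototype, visible to callers */

int main(void)
{
    struct net nn = { 3 };
    double e = 0.0;
    if (eval(&nn, &e))                      /* eval(&nn, &e, 0) would now be a hard error */
        printf("err = %f\n", e);
    return 0;
}

int eval(struct net *n, double *err)        /* definition matches the prototype */
{
    *err = 0.5 * n->layers;
    return 1;
}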
Even standard C compilers are somewhat permissive when it comes to this. Try running the following:
#include <stdio.h>

int foo()
{
    printf("here");
    return 0;
}

int main()
{
    foo(3, 4);
    return 0;
}
It will, to some people's surprise, output "here". The extra arguments are simply ignored. Of course, this depends on the compiler.
Overloading doesn't exist in C, so having two conflicting declarations of the same function would not work in the same translation unit.
That must be quite an old compiler not to err on this one, or it simply had not seen the declaration of the function yet!
Some compilers will not warn or err when calling an undeclared function. That's probably what you're running into. I would suggest you look at the command-line flags of the compiler to see whether there is a flag you can use to get these warnings, because you may actually find quite a few similar mistakes (too many parameters is likely to work just fine, but too few will make use of "undefined" values...).
Note that it is possible to do this (pass extra arguments) when the function is declared with an ellipsis, as printf() is:
int printf(const char *format, ...);
I would imagine that the function had three parameters at some point and the last was removed because it was unused, and some parts of the code were not corrected as they ought to have been. I would remove that third argument, just in case the arguments end up on the stack in an order where the extra value displaces the parameters the function actually reads.
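For contrast, here is a small sketch of how a function declared with an ellipsis legitimately accepts a varying number of arguments (the names here are made up):

#include <stdarg.h>
#include <stdio.h>

/* Adds up 'count' ints supplied after the fixed parameter. */
int sum_ints(int count, ...)
{
    va_list ap;
    int total = 0;

    va_start(ap, count);
    while (count-- > 0)
        total += va_arg(ap, int);
    va_end(ap);
    return total;
}

int main(void)
{
    printf("%d\n", sum_ints(3, 10, 20, 30));   /* prints 60 */
    printf("%d\n", sum_ints(2, 1, 2));         /* prints 3  */
    return 0;
}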
I have made a library which has several .h and several .c files.
Now, under some circumstances, I have not removed a warning that says:
warning: implicit declaration of function ‘getHandle’
Now I want to ask you, does this cause any problem in the binary of my library?
Will it negatively affect the execution of my library on embedded platforms or anywhere else?
In C90, a call to a function with no visible declaration creates an implicit declaration of a function returning int and taking the promoted types of the arguments. If your getHandle function returns a pointer, for example, then the compiler will generate code assuming that it returns an int. Assigning the result to a pointer object should at least trigger another warning. It might work ok (if int and the pointer type are the same size, and int and pointer function results are returned in the same way), or it could go badly wrong (if, for example, int is 32 bits and pointers are 64 bits, or if pointer results are returned in 68K-style address registers).
The C99 standard removed implicit int from the language. Calling a function with no visible declaration is a constraint violation, requiring a diagnostic and possibly causing your program to be rejected.
And if you just fix the problem, you don't have to waste time figuring out how and whether it will work if you don't fix it.
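A contrived two-file sketch of the hazard described above, with getHandle used only as a stand-in name; on an LP64 system, where int is 32 bits and pointers are 64 bits, the implicit int really can truncate the returned pointer:

/* handle.c -- the real function returns a pointer */
void *getHandle(void)
{
    static int object;
    return &object;
}

/* main.c -- compiled separately, with NO declaration of getHandle in scope */
int main(void)
{
    void *h = getHandle();   /* assumed to return int: the pointer may be
                                truncated to 32 bits and then "widened" back */
    return h != 0;
}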
In such a case the compiler cannot verify that the usage of getHandle is proper - that is, whether the return value and the arguments are of the right type and count. It will just accept the way you call it without any explicit statement that this is the right way.
Adding a prototype of the function is a way to tell the compiler that this is the intended usage of the function, and it will then be a compile-time error not to comply with it.
It may cause some nasty bugs in case there is a mismatch between the way it's called and the way the function expects its arguments. The most likely effect will be a corruption of the stack.
If there isn't a mismatch, it won't make any difference at runtime.