I'm working with a large C-only project, and I keep getting bitten by the following problem:
Let's say I have a function
void MyFunction(int parameter)
{
printf("parameter: %d\n", parameter);
}
Which normally gets called
int aVariable = 5;
MyFunction(aVariable);
However, apparently due to the C standard specifications, this does not cause a compilation error:
int aVariable = 5;
MyFunction(&aVariable); // No error signaled, but causes all sorts of mayhem
How can I catch this kind of error, specifically in Visual Studio? Is there any setting I can turn on to make it stricter?
Any strategy you might recommend (besides "don't make typos")?
Edit:
I might add that due to the (crappy) nature of the project, the sample code already generates tons of warnings; I am not sure I can remove all of them in the time I have. Generating more warnings might not be the best option -- however, being able to single out this specific warning (as one answer suggests) might be the solution to this particular problem.
If the function MyFunction is declared with a prototype before the point of the call, then
MyFunction(&aVariable);
is a genuine full-blown constraint violation in C, i.e. it is what we usually call "an error". That's what the language specification says. In other words, your belief that this is somehow allowed "due to the C standard specifications" is incorrect: it is explicitly disallowed by the C standard.
Any C compiler will issue at least a warning for such code, which you should also pay attention to. If your C compiler does not report this violation as an error, it can usually be changed through compiler setup.
In the case of the Visual Studio compiler, one approach is to note the warning number issued in such cases and ask the compiler to convert that specific warning into an error. This can be done either through #pragma warning or through the project settings.
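For example, a sketch of the #pragma route, assuming the mismatch shows up as C4047 ("differs in levels of indirection") or C4024 ("different types for formal and actual parameter"), which are the numbers MSVC commonly reports for this kind of pointer/integer mix-up; substitute whatever number your build actually prints:
/* Promote the specific mismatch warnings to errors for this translation unit.
   The warning numbers below are assumptions -- check the build output first. */
#pragma warning(error: 4047)   /* 'differs in levels of indirection' */
#pragma warning(error: 4024)   /* 'different types for formal and actual parameter' */
The project-settings route is equivalent: the /weNNNN switch (for example /we4047) promotes a single warning to an error without making every other warning fatal, which matters given the amount of existing warnings mentioned in the edit.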
Insert a function declaration of the form
void MyFunction(int parameter);
before attempting to call it. If you don't, then when MyFunction() is called a C compiler (prior to C99) is required to assume that MyFunction() accepts an unspecified number and types of arguments and returns int.
Hence the code will compile without error, even if the function is called with incorrect arguments or the calling code tries to use the return value (which a void function does not provide). If the arguments supplied do not match what the actual function definition expects, the result is undefined behaviour.
Declaring the function before calling it is good practice, and allows the compiler to detect a problem, and issue errors or warnings as needed. Without the declaration, some compilers do issue warnings, but not all compilers do.
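For instance, a minimal sketch built from the question's own function; it deliberately keeps the bad call so the diagnostic can be seen, since with the prototype in scope the compiler is required to report the mismatched argument:
#include <stdio.h>
void MyFunction(int parameter);   /* prototype visible before any call */
int main(void)
{
    int aVariable = 5;
    MyFunction(&aVariable);       /* int* passed where int is expected: now a reportable constraint violation */
    return 0;
}
void MyFunction(int parameter)
{
    printf("parameter: %d\n", parameter);
}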
Related
What is meant by the term "implicit declaration of a function"? A call to a standard library function without including the appropriate header file produces a warning as in the case of:
int main(){
printf("How is this not an error?");
return 0;
}
Shouldn't using a function without declaring it be an error? Please explain in detail. I searched this site and found similar questions, but could not find a definitive answer. Most answers said something about including the header file to get rid of the warning, but I want to know how this is not an error.
It should be considered an error. But C is an ancient language, so it's only a warning.
Compiling with -Werror (GCC) fixes this problem.
When C doesn't find a declaration, it assumes this implicit declaration: int f();, which means the function can receive whatever you give it, and returns an integer. If this happens to be close enough (and in the case of printf, it is), then things can work. In some cases (e.g., the function actually returns a pointer, and pointers are larger than ints), it may cause real trouble.
Note that this was fixed in newer C standards (C99 and C11). In these standards, this is an error. However, GCC doesn't implement these standards by default, so you still get the warning.
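As a sketch of the pointer case (make_buffer and the two-file split are made up for illustration): with no declaration visible, the call is implicitly declared as returning int, so on platforms where pointers are wider than int the returned address can come back truncated; most compilers will at least warn about assigning the int result to a pointer, but the code still builds and misbehaves.
/* file1.c -- no declaration of make_buffer() is visible here */
int main(void)
{
    char *p = make_buffer(64);   /* implicitly declared as 'int make_buffer()':       */
                                 /* the returned pointer may be truncated to int size */
    p[0] = 'x';                  /* likely crash if the address was truncated         */
    return 0;
}
/* file2.c */
#include <stdlib.h>
char *make_buffer(unsigned n)
{
    return malloc(n);
}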
Implicit declarations are not valid in C.
C99 removed this feature (present in C89).
GCC chooses to only issue a warning by default with -std=c99, but a compiler has the right to refuse to translate such a program.
To complete the picture: since -Werror might be considered too "invasive",
for GCC (and Clang/LLVM) a more targeted solution is to turn just this warning into an error, using the option:
-Werror=implicit-function-declaration
See How can I make this GCC warning an error?.
Regarding general use of -Werror: of course, warning-free code is desirable, but at some stages of development it can slow down prototyping.
For historical reasons going back to the very first version of C, the compiler passes the argument as whatever type it happens to be. So it could be an int or a double or a char*. Without a prototype, the compiler passes whatever size the argument is, and the function being called had better use the correct argument type to receive it.
For more details, look up K&R C.
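A hypothetical illustration (print_number and the file split are invented for this sketch): under the old rules the caller just passes a double, and with no prototype in scope the callee has no way to know, so it reads the argument as if it were an int and prints garbage.
/* caller.c -- no prototype for print_number() in scope */
int main(void)
{
    print_number(3.14);   /* a double is passed; nothing converts it to int */
    return 0;
}
/* callee.c */
#include <stdio.h>
void print_number(int n)  /* expects an int: reads the wrong bytes/register */
{
    printf("%d\n", n);
}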
An implicitly declared function is one for which neither a prototype nor a definition is visible at the point where it is called. Because of that, the compiler cannot verify that this is the intended usage of the function (whether the count and the type of the arguments match). Resolving the references to it is done after compilation, at link time (as with all other global symbols), so technically it is not a problem to skip the prototype.
It is assumed that the programmer knows what he is doing and this is the premise under which the formal contract of providing a prototype is omitted.
Nasty bugs can happen if the function is called with arguments of the wrong type or count. The most likely manifestation of this is corruption of the stack.
Nowadays this feature might seem as an obscure oddity, but in the old days it was a way to reduce the number of header files included, hence faster compilation.
C is a very low-level language, so it permits you to create almost any legal object (.o) file that you can conceive of. You should think of C as basically dressed-up assembly language.
In particular, C does not require functions to be declared before they are used. If you call a function without declaring it, the use of the function becomes its (implicit) declaration. In a simple test I just ran, this is only a warning in the case of built-in library functions like printf (at least in GCC), but for random functions, it will compile just fine.
Of course, when you try to link, and it can't find foo, then you will get an error.
In the case of library functions like printf, some compilers contain built-in declarations for them so they can do some basic type checking, so when the implicit declaration (from the use) doesn't match the built-in declaration, you'll get a warning.
Related
How does the compiler know the prototype of sleep function or even printf function, when I did not include any header file in the first place?
Moreover, if I specify sleep(1,1,"xyz") or any arbitrary number of arguments, the compiler still compiles it.
But the strange thing is that gcc is able to find the definition of this function at link time. I don't understand how this is possible, because the actual sleep() function takes a single argument only, while our program passed three.
/********************************/
int main()
{
    short int i;
    for(i = 0; i < 5; i++)
    {
        printf("%d", i);
        sleep(1);
    }
    return 0;
}
Lacking a more specific prototype, the compiler will assume that the function returns int and takes whatever number of arguments you provide.
Depending on the CPU architecture arguments can be passed in registers (for example, a0 through a3 on MIPS) or by pushing them onto the stack as in the original x86 calling convention. In either case, passing extra arguments is harmless. The called function won't use the registers passed in nor reference the extra arguments on the stack, but nothing bad happens.
Passing in fewer arguments is more problematic. The called function will use whatever garbage happened to be in the appropriate register or stack location, and hijinks may ensue.
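For example (add_three and the file split are invented for this sketch), a call that supplies fewer arguments than the definition expects, which only compiles because no prototype is visible at the call site:
/* caller.c -- no prototype for add_three() is visible here */
#include <stdio.h>
int main(void)
{
    printf("%d\n", add_three(1));   /* b and c are whatever was left in the */
                                    /* argument registers or stack slots    */
    return 0;
}
/* callee.c */
int add_three(int a, int b, int c)
{
    return a + b + c;
}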
In classic C, you don't need a prototype to call a function. The compiler will infer that the function returns an int and takes an unknown number of parameters. This may work on some architectures, but it will fail if the function returns something other than int, like a structure, or if there are any parameter conversions.
In your example, sleep is seen and the compiler assumes a prototype like
int sleep();
Note that the argument list is empty. In C, this is NOT the same as void. This actually means "unknown". If you were writing K&R C code, you could have unknown parameters through code like
int sleep(t)
int t;
{
/* do something with t */
}
This is all dangerous, especially on some embedded chips where the way parameters are passed to an unprototyped function differs from the way they are passed to one with a prototype.
Note: prototypes aren't needed for linking. Usually, the linker automatically links with a C runtime library like glibc on Linux. The association between your use of sleep and the code that implements it happens at link time long after the source code has been processed.
I'd suggest that you use the feature of your compiler to require prototypes to avoid problems like this. With GCC, it's the -Wstrict-prototypes command line argument. In the CodeWarrior tools, it was the "Require Prototypes" flag in the C/C++ Compiler panel.
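Coming back to the empty-parentheses point above, a small declaration sketch of the difference (under C89/C99/C11 rules; C23 changes the meaning of () to match (void)):
/* Two alternative declarations, shown together only for contrast: */
int sleep();       /* empty parentheses: parameter info is "unknown", so a call   */
                   /* like sleep(1, 1, "xyz") compiles without complaint          */
int sleep(void);   /* explicitly no parameters: passing any argument at all is a  */
                   /* constraint violation the compiler must diagnose             */
The real sleep() of course takes one argument; the names here just mirror the snippet above, and the point is only what each declaration form tells the compiler.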
C will guess int for unknown types. So, it probably thinks sleep has this prototype:
int sleep(int);
As for giving multiple parameters and linking...I'm not sure. That does surprise me. If that really worked, then what happened at run-time?
This is to do with something called 'K & R C' and 'ANSI C'.
In good old K & R C, if something is not declared, it is assumed to be int. So anything that looks like a function call, but is not declared as a function, will automatically get a return type of int, with argument types taken from the actual call.
However, people later figured out that this can be very bad sometimes. So several compilers added warnings, and C++ made it an error. I think gcc has some flag (-ansi or -pedantic?) which makes this condition an error.
So, in a nutshell, this is historical baggage.
Other answers cover the probable mechanics (all guesses, since the compiler was not specified).
The issue you have is that your compiler and linker have not been set to enable every possible error and warning. For any new project there is (virtually) no excuse for not doing so; for legacy projects there is more excuse, but you should still strive to enable as many as possible.
Depends on the compiler, but with gcc (for example, since that's the one you referred to), some of the standard (both C and POSIX) functions have builtin "compiler intrinsics". This means that the compiler library shipped with your compiler (libgcc in this case) contains an implementation of the function. The compiler will allow an implicit declaration (i.e., using the function without a header), and the linker will find the implementation in the compiler library because you're probably using the compiler as a linker front-end.
Try compiling your objects with the '-c' flag (compile only, no link), and then link them directly using the linker. You will find that you get the linker errors you expect.
Alternatively, gcc supports options to disable the use of intrinsics: -fno-builtin or for granular control, -fno-builtin-function. There are further options that may be useful if you're doing something like building a homebrew kernel or some other kind of on-the-metal app.
In a non-toy example, another file may include the header you missed. Reviewing the preprocessor output is a nice way to see what you actually end up compiling.