I had this question in a test, and I still don't understand the answer I was given:
Let's say I wrote the following code:
#include <math.h>
#include <stdio.h>
float cos(float x){
    return 1-x*x/4;
}

int main()
{
    printf("%0f",cos(0.05f)+sin(0.05f));
}
Let's assume cos and sin are declared and defined in the math library (receiving and returning double), and I'm trying to link my code with the math library.
Another assumption is that cos is defined in math.c.
The question was:
"Will the code compile/link successfully? if so, which cos function
will be called?"
The answer was:
"Yes, the code will compile and my cos will be called".
How could this behavior be explained? Aren't these multiple definitions of the same function?
Your teacher may have made a mistake and intended to use double cos(double x). In this case, many C implementations will accept the program, and it will link and run because the linker takes every module from the object modules it is supplied but only takes the needed modules from the libraries it is supplied. Thus, because cos is already defined in the program, the linker will not take it from the math library. However, although this works in many C implementations, it violates the rules of standard C, which reserves the library identifiers; normal programs may not define them.
Another possibility is that your teacher did not intend to include math.h. This would make the declaration of cos not be an error, since it would not be conflicting with another declaration, but it would mean that sin should also be declared by the program, since it is used.
It will not compile.
I added a return 0; at the end of main() to remove a second problem with -Wall -Werror. If you do this you will see:
$ gcc -Wall -Werror costest1.c -o costest -lm
costest1.c:5:1: error: conflicting types for ‘cos’
This fails at the compile stage because math.h also declares a function called cos. Note the prototype for cos is:
double cos(double x);
not
float cos(float x);
If you did not include math.h, you would be able to compile, but would get:
$ gcc -Wall -Werror costest1.c -o costest -lm
costest1.c:5:1: error: conflicting types for built-in function ‘cos’ [-Werror]
costest1.c: In function ‘main’:
costest1.c:13:3: error: implicit declaration of function ‘sin’ [-Werror=implicit-function-declaration]
costest1.c:13:32: error: incompatible implicit declaration of built-in function ‘sin’ [-Werror]
cc1: all warnings being treated as errors
This is because cos is not a normal function but is handled as a builtin; as the messages show, sin is treated the same way. If cos were a normal function, you would see a duplicate symbol error of some sort at link time.
In C you cannot have two functions with the same name, even if they take different arguments. In C++ you can: identically named functions may be overloaded if they differ in their parameter lists (but not if they differ only by return type).
Online C2011 standard:
6.7 Declarations
...
Constraints
...
4 All declarations in the same scope that refer to the same object or function shall specify
compatible types
The code you posted violates the above constraint; math.h declares cos as
double cos(double x);
This behavior cannot be explained as C; it can be explained as C++, which allows for name overloading.
Make sure you're really talking about C and not C++, otherwise you are going to wind up being very confused.
EDIT:
I assume the question was about C++ and not C since compiling your code as a C program will generate conflicting types error: https://eval.in/93380.
This behaviour is caused by function overloading. Under C++ name mangling, the library's cos() gets a mangled name that encodes its double parameter, while a cos() redeclared to accept a float gets a different mangled name, so the two do not conflict. A call to cos() with a float argument is therefore resolved to the float version, which is your own cos().
Note that Function Overloading is allowed only in C++ and not C.
In C you can only do it without changing the arguments and the return types of the function (https://eval.in/93381) otherwise the previous error will be generated.
As the assumption clarifies (cos is defined in math.c), the cos() function in this case is not the one declared in math.h, so the answer would be true if it was defined to accept an argument of type float and to return a value of type float. In that case the program will compile without any issues.
It will not work at all.
You will see an error like this one:
conflicting types for 'cos'
I got this with Code::Blocks using the gcc compiler.
There is another solution you can use, please check this post: Override a function call in C
No it does not compile.
The header file math.h declares double cos(double). Attempting to overload is not allowed in C.
error C2371: 'cos' : redefinition; different basic types
I just resolved an absolute headbanger of a problem, and the issue was so simple, yet so elusive. So frustratingly hidden behind a lack of compiler feedback and an excess of compiler complacency (which is rare!). During writing this post, I found a few similar questions, but none that quite match my scenario.
Calling method without typeless argument produces a compiler error when the definition includes strongly typed args.
Why does gcc allow arguments to be passed to a function defined to be with no arguments? and C function with incomplete declaration both pass excess arguments to an argumentless function.
Why does an empty declaration work for definitions with int arguments but not for float arguments? does contain a successfully building declaration/definition mismatch, but has no invocation, where I would expect to see a too few arguments message.
I have a function declaration with no args, a call to that function with no args, and the function definition below with args. Somehow, C manages to successfully call the function, no warning, no error, but very undefined behaviour. Where does the function get the missing argument from? Why don't I get a linker error since the no-arg function isn't defined? Why don't I get a compiler error because I'm redefining a function with a different signature? Why, oh why, is this allowed?
Compiling as C++ code (gcc -x c++, enabling Compile To Binary on Godbolt) I get a linker error as expected, because of course C++ allows overloading, and the no-arg overload isn't defined. By checking with Godbolt, compiling with Clang and MSVC as C code also both build successfully, with only MSVC spitting out a minor warning.
Here is my reduced example for Godbolt.
// Compile with GCC or Clang -x c -Wall -Wextra
// Compile with MSVC /Wall /W4 /Tc
#include <stdio.h>
#include <stdlib.h>
// This is just so Godbolt can do an MSVC build
#ifndef _MSC_VER
# include <unistd.h>
#else
# define read(file, output, count) (InputBuffer[count] = count, fd)
#endif
static char InputBuffer[16];
int ReadInput(); // <-- declared with no args
int main(void)
{
    int count;
    count = ReadInput(); // <-- called with no args
    printf("%c", InputBuffer[0]); // just so the results, and hence the entire function call,
    printf("%d", count);          // don't get optimised away by not being used (even though I'm
    return 0;                     // not using any optimisation... just being cautious)
};

int ReadInput(int fd) // <-- defined with args!
{
    return read(fd, InputBuffer, 1); // arg is definitely used, it's not like it's optimised away!
};
Where does the function get the missing argument from?
Typically, the called function is compiled to get its parameters from the places the arguments would be passed according to the ABI (Application Binary Interface) being used. This is necessarily true when the called function is in a separate translation unit (and there is no link-time optimization), so the compiler cannot adjust it according to the calling code. If the call and the called function are in the same translation unit, the compiler could do other things.
For example, if the ABI says the first int class parameter is passed in processor register r4, then the called function will get its parameter from register r4. Since the caller has not put an argument there, the called function gets whatever value happens to be in r4 from previous use.
Why don't I get a linker error since the no-arg function isn't defined?
C implementations generally resolve identifiers by name only. Type information is not part of the name or part of resolution. A function declared as int ReadInput() has the same name as a function declared as int ReadInput(int fd), and, as far as the linker is concerned, a definition of one will satisfy a reference to the other.
Why don't I get a compiler error because I'm redefining a function with a different signature?
The definitions are compatible. In C, the declaration int ReadInput() does not mean the function has no parameters. It means “There is a function named ReadInput that returns int, and I am not telling you what its parameters are.”
The declaration int ReadInput(int fd) means “There is a function named ReadInput that returns int, and it takes one parameter, an int.” These declarations are compatible; neither says anything inconsistent with the other.
Why, oh why, is this allowed?
History. Originally, C did not supply parameter information in function declarations, just in definitions. The prototype-less declarations are still allowed so that old software continues to work.
Other answers explained why it is legal to call a function that was declared without a prototype (but that it is your responsibility to get the arguments right). But you might be interested in the -Wstrict-prototypes warning option accepted by both GCC and clang, which is documented to "Warn if a function is declared or defined without specifying the argument types." Your code then yields warning: function declaration isn't a prototype.
Try it on godbolt.
(I'm kind of surprised this warning isn't enabled with -Wall -Wextra.)
In C, unlike in C++, declaring a function with no arguments means that the function may have as many arguments as you'd like. If you want to make it really not have any arguments, you just have to explicitly declare that:
int ReadInput(void);
The following code:
#include <stdio.h>
#include <math.h>
int main(void)
{
    long long int a;
    scanf("%lld", &a);
    printf("%lf", sqrt(a));
    return 0;
}
gives output:
source_file.c: In function ‘main’:
source_file.c:9:5: warning: ignoring return value of ‘scanf’, declared with attribute warn_unused_result [-Wunused-result]
scanf("%lld", &a);
^
/tmp/ccWNm2Vs.o: In function `main':
source.c:(.text.startup+0x65): undefined reference to `sqrt'
collect2: error: ld returned 1 exit status
However, if I do long long int a = 25; and delete the scanf statement, or simply do sqrt(25), both of them work (correctly give output 5.000000).
I checked this question, but it is for C++ and uses function overloading, whereas afaict C does not have function overloading (that's why we have sqrtf, sqrt, sqrtl if I'm not wrong). Moreover, the above code fails whether I take long long int or double type of a. So, the questions are probably not related.
Also, regarding the other linked question, the error did not occur for me for constantly defined values, which instead happens to be the case with the linked question.
What is the reason then? Why would a constant value work for sqrt, while a variable user input value won't?
As was mentioned in the comments, you haven't linked against libm, so sqrt is undefined at link time.
Why would a constant value work for sqrt, while a variable user input value won't?
Because GCC recognizes sqrt as a builtin function and is able to evaluate the square root of a compile-time constant at compilation time, and forgoes the call to sqrt altogether, thus avoiding the subsequent linker error.
The ISO C90 functions ... sqrt, ... are all recognized as built-in functions unless -fno-builtin is specified (or -fno-builtin-function is specified for an individual function).
If you were to add -fno-builtin-sqrt, you would see the linker error regardless of what you pass to sqrt.
Functions declared in stdlib.h and stdio.h have implementations in libc.so (or libc.a for static linking), which is linked into the executable by default (as if -lc were specified).
GCC can be instructed to avoid this automatic link with the -nostdlib or -nodefaultlibs options.
Functions in math.h have implementations in libm.so (or libm.a for static linking), and libm is not linked in by default.
Basically, the C++ runtime libstdc++ requires libm, so if you compile a C++ program with GCC (g++), libm is automatically linked in.
According to C How to Program (Deitel):
Standard library functions like printf and scanf are not part of the C programming language. For example, the compiler cannot find a spelling error in printf or scanf. When the compiler compiles a printf statement, it merely provides space in the object program for a “call” to the library function. But the compiler does not know where the library functions are—the linker does. When the linker runs, it locates the library functions and inserts the proper calls to these library functions in the object program. Now the object program is complete and ready to be executed. For this reason, the linked program is called an executable. If the function name is misspelled, it is the linker which will spot the error, because it will not be able to match the name in the C program with the name of any known function in the libraries.
These statements leave me doubtful because of the existence of header files. These files are included during the preprocessing phase, before the compilation phase, and, as I read, they are used by the compiler.
So if I write print instead of printf how can't the compiler see that there is no function declared with that name and throw an error?
If it is as the book says, why can I declare function in header files if the compiler doesn't watch them?
So if I write print instead of printf how can't the compiler see that there is no function declared with that name and throw an error?
You are right. If you made a typo in any function name, any modern compiler should complain about it. For example, gcc complains for the following code:
$ cat test.c
int main(void)
{
    unknown();
    return 0;
}
$ gcc -c -Wall -Wextra -std=c11 -pedantic-errors test.c
test.c: In function ‘main’:
test.c:3:5: error: implicit declaration of function ‘unknown’ [-Wimplicit-function-declaration]
unknown();
^
However, in the pre-C99 era of the C language, any function whose declaration wasn't seen by the compiler was assumed to return an int. So, if you are compiling in pre-C99 mode, a compiler isn't required to warn about it.
Fortunately, this implicit int rule was removed from the C language in C99, and a compiler is required to issue a diagnostic for it in modern C (>= C99).
But if you provide only a declaration or prototype for the function:
$ cat test.c
int unknown(void); /* function prototype */

int main(void)
{
    unknown();
    return 0;
}
$ gcc -c -Wall -Wextra -std=c89 -std=c11 test.c
$
(Note: I have used -c flag to just compile without linking; but if you don't use -c then compiling & linking will be done in a single step and the error would still come from the linker).
There's no issue despite the fact that you do not have a definition for unknown() anywhere. This is because the compiler assumes unknown() is defined elsewhere; only when the linker tries to resolve the symbol unknown will it complain, if it can't find the definition.
Typically, the header file(s) only provide the necessary declarations or prototypes (I have provided a prototype for unknown directly in the file itself in the above example -- it might as well be done via a header file) and usually not the actual definition. Hence, the author is correct in that sense that the linker is the one that spots the error.
So if I write print instead of printf how can't the compiler see that there is no function declared with that name and throw an error?
The compiler can see that there is no declaration in scope for the identifier designating the function. Most will emit a warning under those circumstances, and some will emit an error, or can be configured to do so.
But that's not the same thing as the compiler detecting that the function doesn't exist. It's the compiler detecting that the function name has not been declared. The compiler will exhibit the same behavior if you spell the function name correctly but do not include a prior declaration for it.
Furthermore, C90 and pre-standardization C permitted calls to functions without any prior declaration. Such calls do not conform to C99 or later, but most compilers still do accept them (usually with a warning) for compatibility purposes.
If it is as the book says, why can I declare function in header files if the compiler doesn't watch them?
The compiler does see them, and does use the declarations. Moreover, it relies on the prototype, if the declaration provides one, to perform appropriate argument and return value conversions when you call the function. Moreover, if you use functions whose argument types are altered by the default argument promotions, then your calls to such functions are non-conforming if no prototype is in scope at the point of the call. Undefined behavior results.
Recently I've learnt about implicit function declarations in C. The main idea is clear but I have some troubles with understanding of the linkage process in this case.
Consider the following code ( file a.c):
#include <stdio.h>
int main() {
    double someValue = f();
    printf("%f\n", someValue);
    return 0;
}
If I try to compile it:
gcc -c a.c -std=c99
I see a warning about implicit declaration of function f().
If I try to compile and link:
gcc a.c -std=c99
I have an undefined reference error. So everything is fine.
Then I add another file (file b.c):
double f(double x) {
    return x;
}
And invoke the next command:
gcc a.c b.c -std=c99
Surprisingly everything is linked successfully. Of course after ./a.out invocation I see a rubbish output.
So, my question is: How are programs with implicitly declared functions linked? And what happens in my example under the hood of compiler/linker?
I read a number of topics on SO like this, this and this one but still have problems.
First of all, since C99, implicit declaration of a function has been removed from the standard. Compilers may still support it for compiling legacy code, but it's nothing mandatory. Quoting the standard's foreword:
remove implicit function declaration
That said, as per C11, chapter §6.5.2.2
If the function is defined with a type that does not include a prototype, and the types of
the arguments after promotion are not compatible with those of the parameters after
promotion, the behavior is undefined.
So, in your case,
the function call itself is implicit declaration (which became non-standard since C99),
and due to the mismatch of the function signature [Implicit declaration of a function were assumed to have an int return type], your code invokes undefined behavior.
Just to add a bit more reference, if you try to define the function in the same compilation unit after the call, you'll get a compilation error due to the mismatch signature.
However, since your function is defined in a separate compilation unit (and the prototype declaration is missing), the compiler has no way to check the signatures. After compilation, the linker takes the object files and, due to the absence of any type checking in the linker (and of any type info in the object files), happily links them. The end result is a successful compile and link, and undefined behaviour at run time.
Here is what is happening.
Without a declaration for f(), the compiler assumes an implicit declaration like int f(), i.e. returning int with unspecified parameters. And then happily compiles a.c.
When compiling b.c, the compiler does not have any prior declaration for f(), so it intuits it from the definition of f(). Normally you would put some declaration of f() in a header file, and include it in both a.c and b.c. Because both the files will see the same declaration, the compiler can enforce conformance. It will complain about the entity that does not match the declaration. But in this case, there is no common prototype to refer to.
In C, the compiler does not store any information about the prototype in the object files, and the linker does not perform any checks for conformance (it can't). All it sees is a unresolved symbol f in a.c and a symbol f defined in b.c. It happily resolves the symbols, and completes the link.
Things break down at run time though, because the compiler sets up the call in a.c based on the prototype it assumed there. Which does not match what the definition in b.c looks for. f() (from b.c) will get a junk argument off the stack, and return it as double, which will be interpreted as int on return in a.c.
How are programs with implicitly declared functions linked? And what happens in my example under the hood of the compiler/linker?
The implicit int rule has been outlawed by the C standard since C99. So it's not valid to have programs with implicit function declarations.
It hasn't been valid since C99. Before that, if no visible declaration was available, the compiler implicitly declared the function with an int return type.
Surprisingly everything is linked successfully. Of course after
./a.out invocation I see a rubbish output.
Because you didn't have a prototype, the compiler implicitly declares f() with an int return type. But the actual definition of f() returns a double. The two types are incompatible and this is undefined behaviour.
This is undefined even in C89/C90, in which the implicit int rule is valid, because the implicit declaration is not compatible with the actual type f() returns. So this example (with a.c and b.c) is undefined in all C standards.
It's not useful or valid anymore to have implicit function declarations, so the actual detail of how the compiler/linker handles them is only of historical interest. It goes back to the pre-standard days of K&R C, which didn't have function prototypes and in which functions returned int by default. Function prototypes were added to C in the C89/C90 standard. Bottom line: you must have prototypes (or define functions before use) for all functions in valid C programs.
After compiling, all type information is lost (except maybe in debug info, but the linker doesn't pay attention to that). The only thing that remains is "there is a symbol called "f" at address 0xdeadbeef".
The point of headers is to tell C about the type of the symbol, including, for functions, what arguments it takes and what it returns. If you mismatch the real ones with the ones you declare (either explicitly or implicitly), you get undefined behavior.
Anybody knows why is this compiling successfully in C?
int main(){
    display();
    return 0;
}

void display(){
    printf("Why am I compiling successfully?");
}
I thought that when a declaration is not provided, C assumes extern int function_name(arg1, arg2, ...). Thus this should give an error; however, it's working! I know that Ideone is suppressing the warnings, but my question is: why is it not giving a straight error? (In C++, however, it's a straight error.)
Turn up the warning level in your compiler and you should get 2 warnings,
display not declared, int assumed
and
display redeclared
Edit:
Older versions of C (pre C99) aren't really that bothered about return types or argument types. You could say it's part of the K&R legacy. For instance, if you don't explicitly specify the argument types, the compiler won't check them at all.
C++ is stricter, which IMO is a good thing. I always provide declarations and always specify the argument lists when I code in C.
It's compiling because C uses a lot of defaults to be backwards compatible. In K&R C, you couldn't specify function prototypes, so the compiler would just assume that you know what you're doing when you call a function.
Later (at least ANSI C, but maybe even in C99), C didn't really have a way to distinguish
void display(void);
void display();
so the empty declaration must be accepted as well.
That's why you can call display() without defining it first.
printf() is similar. You will get a linker error if you forget -lc but from the compiler's point of view, the code is "good enough".
That will change as soon as you enable all warnings that your compiler has to offer and it will fail with an error when you disable K&C compatibility or enable strict ANSI checks.
Which is why "C" is often listed as "you shoot yourself into the foot" in "How to Shoot Yourself In the Foot Using Any Programming Language" kind of lists.
It depends on your C standard (C89, C90, C99, ...).
For function return values, prior to C99 it was explicitly specified that if no function declaration was visible, the translator provided one. These implicit declarations defaulted to a return type of int.
Justification, quoted from a commentary on the C Standard (6.2.5, page 506):
Prior to C90 there were no function prototypes. Developers expected to be able to interchange arguments that had signed and unsigned versions of the same integer type. Having to cast an argument, if the parameter type in the function definition had a different signedness, was seen as counter to C's easy-going type-checking system and a little intrusive. The introduction of prototypes did not completely do away with the issue of interchangeability of arguments. The ellipsis notation specifies that nothing is known about the expected type of arguments. Similarly, for function return values, prior to C99 it was explicitly specified that if no function declaration was visible the translator provided one. These implicit declarations defaulted to a return type of int. If the actual function happened to return the type unsigned int, such a default declaration might have returned an unexpected result. A lot of developers had a casual attitude toward function declarations. The rest of us have to live with the consequences of the Committee not wanting to break all the source code they wrote. The interchangeability of function return values is now a moot point, because C99 requires that a function declaration be visible at the point of call (a default declaration is no longer provided).
It probably does (assume such a declaration as you wrote), but as you are not passing parameters, it just works fine. It's the same as if you declare int main() where it should actually be int main(int argc, char *argv[]).
So if you tried to pass some parameters (different from the default ones) from main and then used them in display, it would probably fail.
BTW, for me it compiles, but generates warning:
$ gcc z.c
z.c:8:6: warning: conflicting types for ‘display’ [enabled by default]
z.c:4:5: note: previous implicit declaration of ‘display’ was here
When I compile with gcc, I get warnings about redeclaration of display, as you would expect.
Did you get warnings?
It might "run" because C doesn't mangle function names (unlike C++). So the linker looks for a symbol 'display' and finds one, and that address is used to call display. I would expect the results to not be what you expect all the time.