Why is it possible to redefine C library functions? - c

I noticed that if I write a function named getline, this function will be used if I invoke it, even if I #include <stdio.h>, but if I don't write such a function, the one from stdio.h will be used.
I expected instead to get a linker error, the same as if I had done the following:
foo.c:
int f() { return 0; }
main.c:
int f() { return 1; }
int main() { return f(); }
Compile:
$ gcc -c foo.c
$ gcc -c main.c
$ gcc foo.o main.o
/usr/bin/ld: main.o: in function `f':
main.c:(.text+0x0): multiple definition of `f'; foo.o:foo.c:(.text+0x0): first defined here
collect2: error: ld returned 1 exit status
The linker error makes sense to me; when the linker attempts to combine the object files into a single binary, it doesn't know how to resolve the invocation of f(); should it use foo.o's f() or main.o's f()?
But then why don't I get such a linker error when I write my own versions of getline or other C library functions?
This came up because I noticed that when compiling with -std=c99, gcc gives me an implicit-function-declaration warning for using getline. If I add an explicit function prototype, it works correctly, which implies that glibc's getline is being linked. So I tested what happens if I write my own getline: the linker uses it instead and produces no error... The same appears to be true for other C library functions. Why is this? Why don't I get a linker error instead?
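For concreteness, a minimal experiment along those lines might look like this (a sketch, assuming glibc on Linux; not the exact code from the question):
mygetline.c:
#include <stdio.h>
#include <sys/types.h>

/* Our own getline, with the same signature as glibc's.
   Calls resolve to this definition, not the library's. */
ssize_t getline(char **lineptr, size_t *n, FILE *stream) {
    (void)lineptr; (void)n; (void)stream;
    return -1; /* do nothing, so the override is observable */
}

int main(void) {
    char *line = NULL;
    size_t cap = 0;
    printf("%zd\n", getline(&line, &cap, stdin)); /* prints -1: ours ran */
    return 0;
}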

Linkers process library files differently than object files. The following discusses typical behavior for linkers. Details may vary with specific linkers and command-line switches or other settings.
When a linker processes an object file, it includes the entire object file in the output file it is building. As it is doing this, it builds a list of symbols that the object files use (refer to) but that are not defined yet.
A library file consists of multiple object modules inside a containing file. When a linker processes a library file, it examines each module in the library file and compares the symbols that module defines to that list of symbols that are needed but not yet defined. When it finds such a module, the linker includes that module in the output file. (The linker may also go back to earlier modules in the same library file, in case a later module uses a symbol that an earlier one defines.)
Any modules in the library file that do not provide a needed symbol are not needed in the output file, so the linker does not include them.
A consequence of this is that, if the same symbol is defined in more than one object file, there will be a multiple-definition error, because both files are built into the output file. However, if a symbol is defined once in the object files and once in the library, the definition in the library will not be used: when the linker considers the module containing it, that symbol is no longer on the list of needed symbols, so the linker does not include that module in the output file. The output file therefore ends up with just one definition of the symbol, the one from the object modules.
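You can watch this happen by moving the question's foo.o into an archive (a sketch using GNU binutils):
$ gcc -c foo.c main.c
$ ar rcs libfoo.a foo.o    # same f(), but now inside a library
$ gcc main.o -L. -lfoo     # links cleanly: f is already defined by main.o,
                           # so the archive member is never pulled in
$ ./a.out; echo $?         # prints 1, main.c's definition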
There are some complications to this. Suppose a module in a library defines both sin and cos, and an object module defines sin and uses both sin and cos. When the linker processes the object module, it will note that sin and cos are both used. The reference to sin will be satisfied by the object module, but cos is still needed. Then, when the linker processes the library, it will find cos and include that module. But that module also defines sin, so there will be two definitions of sin in the output file, and the linker will complain. So you can get multiple-definition errors from library modules this way.
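A sketch of that situation with made-up names, a() and b() standing in for sin and cos:
ab.c:
int a(void) { return 1; }
int b(void) { return 2; }
main2.c:
int a(void) { return 3; }
int main(void) { return a() + b(); }
$ gcc -c ab.c main2.c
$ ar rcs libab.a ab.o
$ gcc main2.o -L. -lab    # b is needed, so ab.o is pulled in; its a()
                          # collides with main2.o's: multiple definition of `a'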
Another complication is that the order of processing matters. If the linker first processes an object module that needs getline, and then a library module that defines getline, and then an object module that defines getline, the library module will be included in the output file (because getline was needed when the linker processed the library), and the object module that defines getline will also be included (because the linker includes all object files). So the output will have multiple definitions of getline, and the linker will complain. This is one reason why libraries are generally processed last, so that all object modules are processed first, and only things that are needed from libraries are taken.
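With GNU ld you can see the order dependence directly (a sketch with hypothetical files: uses.o calls getline, mygl.o defines it, libmy.a contains another definition):
$ gcc uses.o libmy.a mygl.o -o app
  # the library is scanned while getline is still undefined, so its member is
  # pulled in; then mygl.o adds a second definition -> multiple definition error
$ gcc uses.o mygl.o libmy.a -o app
  # mygl.o satisfies getline before the library is scanned; the library member
  # is skipped and the link succeeds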
In spite of this linker behavior, you cannot rely on defining your own versions of standard C routines. Compilers may have built-in knowledge about how the routines are specified by the C standard, and they may replace calls to those routines with other code. If you do need to provide your own version of a standard routine, the compiler may have a switch to disable its special treatment of that routine. For example, GCC has -fno-builtin-function, where function is replaced with a particular name, to tell it to disable special knowledge of a function.
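A well-known instance of this: GCC knows what printf does, and at -O1 it typically rewrites printf("Hello, world!\n") into a call to puts. You can observe and disable this (a sketch):
hello.c:
#include <stdio.h>
int main(void) { printf("Hello, world!\n"); return 0; }
$ gcc -O1 -S hello.c && grep call hello.s                       # typically shows: call puts
$ gcc -O1 -fno-builtin-printf -S hello.c && grep call hello.s   # now: call printf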

Related

Why is there a need to link with `printf.o` if it's already defined inside `stdio.h`?

As far as I understand, when we include stdio.h the entire file is copied into the text of our program (basically we prepend it). Then, when we make a function call to printf(), why does it have to be linked first?
Or is it that stdio.h contains just the declarations of those functions, and the linker finds the compiled object code for whatever function we invoke, for example printf()?
I've read about this in a few places but it's still not quite clear to me.
Header files like stdio.h generally just contain a declaration that defines the name of the function, the types of its arguments, and its return value. This is enough for the compiler to generate calls to the function, but it is not a definition. The actual code that implements the function is going to be in a library (with an extension like .a or .so or .lib).
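The difference in code (illustrative only):
/* declaration - what a header like stdio.h provides; no machine code results */
int printf(const char *format, ...);

/* definition - the actual code; this is what lives in the library */
int add(int x, int y) { return x + y; }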
Nope. printf() resides in libc.so, the C standard library. The compiler automatically adds -lc as an option when it calls the linker, so you don't need to pass that option yourself. Only if you link your object files by calling the linker ld directly do you need to include the option yourself (along with some other files that form the C runtime; the linker doesn't know that you are linking a C program, so it doesn't know which files to link with your modules, but the compiler does). See the documentation of the compiler you use, or just pass the -v option when building the executable to see the command line the compiler uses to call the linker; you will see all the objects and libraries the compiler requires for every C program.

Which functions are included in executable file compiled by gcc with "-static" on? And which functions are not?

When a C program is compiled with GCC's -static option, the final executable includes tons of C standard library functions. For example, my whole program is:
#include <stdio.h>

int main(int argc, char *argv[]) {
    printf("Hello, world!\n");
    return 0;
}
I checked the compiled executable, and functions like strcmp(), mktime(), realloc(), etc. are included in it, even though my program never calls them. However, some functions from stdlib.h are missing, like rand(), system(), etc. My experiment environment: Ubuntu 14.04 (Linux kernel 3.13.0); GCC 4.8.2. I would like to know which C standard functions are included in the executable file when -static is turned on.
Static linking means that ALL the libraries your program needs are linked into your executable at compile time. In other words, your program will be larger, but it will be very independent (portable), as the executable contains everything it needs to run.
This means that with -static you will have ALL the functions from your included libraries. You didn't pull those libraries in explicitly, but printf() alone already uses a large amount of library code.
In other words, we cannot tell you which functions are included in your program when using -static, because it varies from program to program.
Static libs are archives of object files. Linking against them only brings in those archive members that resolve undefined symbol references, and this works recursively (e.g., you may call a(), which calls b(), which calls c()). If each archive member defined exactly one symbol (e.g., a.o only defines a(), etc.), you'd get only those symbols that were needed (recursively). In practice, an archive member may also define other symbols (e.g., a.o may define both a() and a variable), so you'll get the symbols that resolve undefined references along with any symbols that happen to share an object file with a needed definition.
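A sketch of that last point (hypothetical files):
a.c:
int stray = 42;                /* referenced by nobody */
int a(void) { return 1; }
use.c:
extern int a(void);
int main(void) { return a(); }
$ gcc -c a.c use.c && ar rcs liba.a a.o
$ gcc -static use.o -L. -la -o app
$ nm app | grep stray          # stray was linked in: it shares a.o with a()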

Does the linker refer to the main code

Let's assume I have three source files: main.c, a.c and b.c. main.c calls some (but not all) of the functions defined in a.c; none of the functions defined in b.c are called (used) by main.c. The main function is in main.c. Then we have a makefile that compiles all the source files (main.c, a.c and b.c) and links them to produce an executable file, in my case an Intel HEX file.
My question is: does the linker know in which file the main function resides, and use that to determine which parts of the object files to link together? I mean, if the linker produces the exe file based only on the recipe of the rule to make the target, then no matter how many functions are called in our application code, the size of the executable will be the same, because the recipe says to link all the object files. For example, we compile the three source files and get three object files: main.o, a.o and b.o (the bigger the object files are, the bigger the exe file is). I know you would say: if you don't want anything from b.c, then don't include it in the build. But that means every time I want to change the application (include/exclude modules) I have to change the makefile too. And another thing: how does the linker know which part of an object file to take; does it understand the C language? I hope you understand my question, excuse my bad English.
1) Does the linker know in which file the main function resides and knowing that to determine what part of the object files to link together?
Maybe there are options in your toolchain (compiler/linker) to enable this kind of optimization, i.e. removing unused functions at link time, but I have big doubts for global functions (it could be possible for static functions).
2) And another thing is how the linker knows what part of the object file to take, does it understand the C language?
The linker may detect that a function or variable is not used by the application (once again, check the available options), but that is not really the objective of this tool. However, if you compile/link some functions as library functions (see options), you can generate a "library" file and then link this library with other object files. The functions of the library will then be included by the linker ONLY if they are used.
What I suggest: use compilation flags (#ifdef...) to include or exclude parts of code from compilation/link.
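In fact, GCC and GNU ld do have such options: -ffunction-sections places each function in its own section, and --gc-sections lets the linker discard sections nothing refers to, even from plain object files:
$ gcc -ffunction-sections -fdata-sections -c main.c a.c b.c
$ gcc -Wl,--gc-sections main.o a.o b.o -o app   # unused functions from a.o and b.o are dropped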
If you want only those functions in the executable that are eventually called from main, use a library of object files.
Basically, the smallest unit the linker will extract from a library is the object file. Whatever undefined symbols that object file references will also be resolved, recursively, until all symbols are resolved.
In other words, if none of the symbols in an object file are needed, it won't end up in the result. If at least one symbol is needed, it will get linked in its entirety.
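Concretely, for the question's three files (a sketch):
$ gcc -c main.c a.c b.c
$ ar rcs libparts.a a.o b.o       # wrap a.o and b.o in an archive
$ gcc main.o -L. -lparts -o app   # a.o is pulled in for the functions main.c uses;
                                  # b.o defines nothing that is needed and is omitted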
No, the linker does not understand C. Note that a lot of language compilers create object files (C++, FORTRAN, ..., and assemblers). A linker resolves symbols, which are names attached to values.
John Levine has written a book, "Linkers and Loaders", available on the 'net, which will give you an in-depth understanding of linkers, symbols, and object files.

Compiler warning not generated for multiple definitions

The problem I am facing is that a function with the same signature is defined in two .c files, and this does not give a compile-time error. I have included the declaration in a .h file, which is included in both .c files.
For example:
int add(int x, int y) { return x + y; }
The same definition is given in two .c files (say A.c and B.c), and the declaration is in one .h file which is included in both A.c and B.c. Why does this not give a compile-time error, and how can I make it give one?
Even the linker is not giving any error; it looks like it is taking the first definition.
I am using the GCC compiler (MinGW).
I found another pattern in this. If I use an include guard in the header file:
#ifndef H_H_
#define H_H_
the linker does not give a warning, but if I don't use it, the linker gives a warning, which is expected.
This situation is undefined behaviour with no diagnostic required.
Consult your linker's documentation to see if it has any options to report multiple definition of functions.
The compiler doesn't analyze your program as a whole. It simply processes one .c file at a time. If the declaration in the .h file matches the definition in the .c file, then everything is good as far as the compiler is concerned.
The linker will detect that the function was defined twice and will generate a "duplicate symbol" error.
The compiler sees each source file separately from the others. It includes the content of the header file(s) into A.c, then generates an object file A.obj from A.c. A.obj will contain symbols for the variables and functions defined in A.c. On the other hand, the compiler processes B.c separately, without checking the content of A.c or any other source file. It starts by including the header file(s) into B.c, then generates B.obj, which also includes symbols for the variables and functions defined in B.c.
As a result, you will not get errors at compile time, because the duplicated function is not detected by the compiler. It is the linker's job to check symbol consistency and that no duplicates are present. The linker takes all the generated object files in order to produce an executable, and it must assign a unique memory address to each symbol. For example, if at some point in your code (let's say in the main function) a function from A.c is called, this is actually translated into a jump to the memory address where that function is located. Now imagine if two functions with the same signature coexisted in the executable, each symbol with a different address: how could the processor figure out which function you intend to call? For that reason, if the linker finds a duplicated symbol, it signals an error.
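For reference, this is what normally happens with GCC and the GNU linker (a sketch; if your MinGW build links without complaint, comparing its command line against this is a good starting point):
A.c:
int add(int x, int y) { return x + y; }
B.c:
int add(int x, int y) { return x + y; }
main.c:
extern int add(int, int);
int main(void) { return add(1, 2); }
$ gcc A.c B.c main.c
# expected: ld reports multiple definition of `add'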
As #Matt-McNabb says: consult your linker documentation.
The only other cause I can come up with is that the linker compares the two function bodies, finds they are identical, and ignores one. You can check this by slightly changing the code, for example to return y + x.

Detect undefined symbols in C header file

Suppose I coded a C library which provides a bunch of "public" functions, declared in a mylib.h header file. Those functions are supposedly implemented in (say) a mylib.c file, which is compiled to (say) a static lib: mylib.c -> mylib.o -> mylib.a.
Is there some way to detect that I forgot to provide the implementation of some declared function in mylib.h? (Yes, I know about unit testing, good practices, etc - and, yes, I understand the meaning of a plain function declaration in C).
Suppose mylib.h declares a void func1(); and this function was not coded in the provided library. This will trigger an error only if the linker needs to use that function; otherwise, it will compile fine, even without warnings - AFAIK. Is there a way (perhaps compiler dependent) to trigger a warning for declared but not implemented functions, or is there any other way to deal with this issue?
BTW: nm -u does not list all declared-but-undefined functions, only those "used" by the library, i.e., those that would trigger an error in the linking phase if not defined somewhere. (Which makes sense: the library object file knows nothing about header files, of course.)
Basically, the most reliable way is to have a program (or possibly a series of programs) which formally exercise each and every one of the functions. If one of the programs fails to link because of a missing symbol, you've goofed.
I suppose you could try to do something by editing a copy of the header into a source file (as in, file ending .c), converting the function declarations into dummy function definitions:
Original:
extern int somefunc(void);
Revised:
extern int somefunc(void){}
Then compile the modified source with minimum warnings - and ignore anything to do with "function that is supposed to return a value doesn't". Then compare the defined symbols in the object file from the revised source with the defined symbols in the library (using nm -g on Unix-like systems). Anything present in the object file that isn't present in the library is missing and should be supplied.
Note: if your header includes other headers of your own which define functions, you need to process all of those. If your header includes standard headers such as <stdio.h>, then clearly you won't be defining functions such as fopen() or printf() in the ordinary course of events. So, choose the headers you reprocess into source code carefully.
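One way to mechanise that comparison on a Unix-like system (a crude sketch: the sed step is a rough stand-in for the hand edit described above, and will choke on multi-line prototypes, typedefs, and the like):
$ sed 's/;[[:space:]]*$/{}/' mylib.h > check.c   # declarations -> empty definitions
$ gcc -w -c check.c                              # -w silences the missing-return warnings
$ nm -g check.o | awk '$2 == "T" { print $3 }' | sort > declared
$ nm -g mylib.a | awk '$2 == "T" { print $3 }' | sort > implemented
$ comm -23 declared implemented                  # declared in the header, absent from the library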
There's no easy way.
For example, you can analyse the output of clang -Xclang -ast-print-xml or gcc-xml and filter out declarations with no implementations for a given .h file.
You could grep for the signatures of exported functions in both the .h and the .c, and compare the lists.
Use wc -l to count the matches; both numbers should be equal.
Another thought that just came to mind: it is IMHO not possible to handle this with the compiler alone, since it is not always the case that a function declared in mylib.h is implemented in mylib.c.
Is there some way to detect that I forgot to provide the implementation of some declared function in mylib.h?
Write the implementation first, then worry about header contents -- because that way, it can be flagged.