Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Questions asking for code must demonstrate a minimal understanding of the problem being solved. Include attempted solutions, why they didn't work, and the expected results. See also: Stack Overflow question checklist
Closed 9 years ago.
I want to know: what is the difference between preprocessor directives and libraries in C?
So far, I know that preprocessor directives are used to include external files. (Do these external files count as libraries?)
And that libraries are ready-made, already-compiled programs written for us. (Is that why we need preprocessor directives to include them in C?)
Preprocessor directives appear in source code. There are many different directives. One of them is #include, which is used to include a header file. Header files contain a collection of declarations, often for functions and types (and sometimes variables) found in a library. However, a header is not a library.
Libraries are collections of object files that have already been compiled. The C standard does not recognize that libraries exist (but it does recognize the preprocessor and defines its required behaviour). Libraries are listed on the linker (compiler) command line, often using the -lname notation to link a library called libname, with the -L option used to specify places to search for libraries.
Note that the majority of the functions defined in the standard C library are found by the linker without it needing to be told where to look for them. The exception is the maths functions, which are kept in a separate library for historical reasons (primarily related to machines that did not always have floating-point arithmetic built in: sometimes they had FP coprocessors, such as the Intel 80386 + 80387, and sometimes they needed software emulation of the missing hardware). On many systems, you need to specify -lm to link the maths library; on others, you don't (the code is in the main system C library).
Generally speaking, headers are not found in the same directories as libraries (it would be a messy, unprofessional project that installed headers into the same directory as its libraries).
Especially in C++, there are libraries that do not have pre-compiled object files; the services are defined exclusively through the header. These are much less common in C. It is most sensible to regard the files as headers, not libraries. A header defines a set of services that can be used by the compiler. A library provides the support for such services. (If you think about it, or take a look in your system, you will find that <stdio.h> does not include the source for fprintf() — to take one example of many — but it does declare fprintf() in such a manner that your program can use it and so that the actual function from the standard C library will be used at runtime.)
Dynamic linking (the loading of shared objects, also known as shared libraries or dynamic link libraries (DLLs)), where the library files are not loaded until runtime, after the program has started up, is another whole platform-specific bag o' worms.
Preprocessor commands do lots of things; one of them is including files, such as header files. Libraries mostly provide compiled code to do things for you, which is very different. However, most libraries require your code to include a header file from the library so that your code knows about the types and functions available in the library.
There are a lot of preprocessor directives; I'll list some of them here:
#define: used to define constants or macros (with or without arguments)
#include: includes your own header files (using " ") or system/library headers (using < >)
#if and #ifdef: used to compile parts of the code only when certain conditions are met (they are always closed by #endif)
...
You can find out a lot more about preprocessor directives here.
Both languages have a way for a single program/library to span over multiple files. C languages use include statements using explicitly created header files, whereas Fortran has the compiler generate an interface from a single module. Forgive me in advance if I misunderstood how either language compiles its source code.
In C/C++ programs you usually create .c files containing source code, and .h files containing the interface to your code so that other source files can anticipate what is in the source, which is then compiled all together into libraries, object files, etc.
In Fortran programs (90+), you can put code into separate modules, and instead of explicitly writing a header/interface file for each one, the compiler will generate interfaces for them and put them into separate binaries (.mod files) in addition to the compiled object files. Creating libraries, object files, etc then requires you compile/link them together.
Why is there this subtle difference between how these languages compile their interfaces? Is it just a quirk that resulted from the long histories of either language?
C is far less organized than you make it sound. The use of "header" files is purely a convention; the preprocessor allows arbitrary textual inclusions, and there is no principled division of "interface" and "implementation" built into the language.
The simple fact of the matter is that linking separate translation units of a C program is not checked or "safe"; it's your responsibility that the pieces that you link actually fit together.
In early C, function declarations could be implied, and function arguments were not checked at all. So you could stick a call foo(1, 2, 3) into your code, and it would imply the declaration of a function int foo(), and the arguments would need to match the parameters that the eventual function definition would use.
The two key features in C that make header files useful are declarations and function prototypes. If you agree to always require an explicit declaration of a function before use (something compilers can warn you about), then you can provide that declaration in a "header file", and there's a better chance that the link fits. Function prototypes are an extension of the concept of declarations that automatically fix up arguments at the call site with the parameters in the function definition, and so if you choose to use all those features, you can reasonably well document your function call interface via header files. But all that is purely convention!
What does include and link REALLY do? What are the differences? And why do I need to specify both of them?
When I write #include <math.h> and then add -lm to the compile command, what do #include <math.h> and -lm do, respectively?
In my understanding, when linking a library, you need its .h file and its .o file. Does this suggest that #include <math.h> takes in the .h file while -lm takes in the .o file?
The reason that you need both a header (the interface description) and the library (the implementation) is that C separates the two more clearly than languages like C# or Java do. One can compile a C function (e.g. by invoking gcc -c <sourcefile>) which calls library code even when the called library is not present; the header, which contains the interface description, suffices. (This is not possible with C# or Java; the assemblies resp. the class files/jars must be present.) During the link stage, though, the library must be there, even when it's dynamic, afaik.
With C#, Java, or script languages, by contrast, the implementation contains all information necessary to define the interface. The compiler (which is not as clearly separated from the linker) looks in the jar file or the C# assembly which contain called implementations and obtains information about function signatures and types from there.
Theoretically, that information could probably be present in a library written in C as well; it's basically the debug information. But the classic C compiler (as opposed to the linker) is oblivious to libraries or object files and cannot parse them. (One should remember that the "compiler" executable you usually use to compile a C program, e.g. gcc, is a "compiler driver" which interprets the command-line arguments and calls the programs which actually do the work, e.g. the preprocessor, the actual compiler, and the actual linker, to create the desired output.)
So in theory, if you have a properly annotated library in a known location, you could probably write a compiler which compiles a C function against it without having function declarations and type definitions; the compiler would have to produce the proper declarations. The compiler would have to know which library to parse (which corresponds to setting a C# project "Reference" in VS or having a class path and name/class correspondence in Java).
It would probably be easiest to use a well-known debugging format like stabs or dwarf and extract the interface definitions from it with a little helper program which uses the API for the debug format, extracts the information and produces a C header which is prepended to every source file. That would be the job of the compiler driver, and the actual compiler would still be oblivious to that.
It's because header files contain only declarations, while .o files (or .obj, .dll, or .lib files) contain the definitions of the functions.
If you open a .h file, you will not see the code of the functions, because that is in the libraries.
One reason is commercial: you may need to publish your code while keeping the source inside your company. Libraries are compiled, so you can publish them without giving away the source.
Header files only tell the compiler what types and functions it can find in the library.
The header files are kind of a table-of-contents plus a kind of dictionary for the compiler. It tells the compiler what the library offers and gives special values readable names.
The library file itself contains the contents.
What you are asking about are two entirely different things.
Don't worry, I will explain them to you.
You use the # symbol to instruct the preprocessor to include the math.h header file, which contains the function prototypes of fabs(), ceil(), etc.
And you use -lm to instruct the linker to include the pre-compiled function definitions of fabs(), ceil(), etc. in the executable file.
Now, you may ask why we have to explicitly link the library file of the math functions, unlike for other functions, and the answer is that it is due to historical reasons.
I want to be able to write small scripts to analyze the macros, prototypes, etc., that are effectively included on the platform I am compiling on when I #include the official C header files associated with large external libraries in my C programs. (Assume I am compiling with gcc. Answers for other compilers are of interest too.)
Serious obstacles to this are recursive includes, conditional compilation directives, and the interactions between the two. So I seek a tool that will recursively find the text effectively included by processing these directives, producing a single header file, the inclusion of which is equivalent to the inclusion of the official ones for the current platform only.
(It would be very nice if in addition to supplying the conditionally relevant macros and C declarations, that comments were preserved, and #line directives inserted from time-to-time so as to indicate the origin of various parts of the output. But all of this is less than vital.)
I don't insist upon recursive output from non-top-level #includes, though to work properly, the tool will clearly have to recursively visit #included files inside the headers it is asked to pursue. So the tool could leave those lower level #include directives in its output, rather than recursively interpolating their recursively processed bodies.
Is there a tool out there that specializes header files in this fashion?
My basic question is how the compilation process works to use standard library routines. When I #include <stdio.h> in C, does the preprocessor take the entire standard library and paste it into my source file?
If this is so, when I use a library routine, how is it that the linker is involved?
The preprocessor is, as its name implies, a program that runs before the compiler. All it does is simple text substitutions.
When a #include directive is found, it simply "pastes" the complete file into the place where the directive was. The same applies to macro expansions, when a macro "call" is detected, the body of the macro is "pasted" into its place.
The preprocessor has nothing to do with libraries. It's just that C (and C++) needs to have all its functions and variables declared before they are used, and so putting the declarations in a header file that is included by the preprocessor is a simple way to get these declarations from libraries.
There are basically two types of libraries: header-only libraries, and libraries you need to link with. The first type, header-only libraries, is exactly what the name implies: they are fully contained in the header files you include. However, the vast majority of libraries are ones you need to link with. This is done in a step after the compiler has done its work, by a special program, the linker. How this is used depends on the environment, of course.
In general, compilation of a program can be divided into these steps:
Editing
Preprocessor
Compiler
Linker
The editing step is what you do to create your source.
The preprocessor and compilation steps are often put together into a single step, which is probably why there is some confusion among beginners as to what the preprocessor really does.
The final step, linking, takes the output from the compiler and uses it together with the libraries you specified to create the final executable.
When I #include something in C, does the preprocessor take the entire standard library and paste it into my source file?
Only the header files you #include.
If this is so, when I use a library routine, how is it that the linker is involved?
The standard library headers contain declarations only. The definition (implementation) of the functions is in a library file, most likely /usr/lib/libc.ext (ext being an OS-dependent extension).
When you #include something in your source code, the preprocessor pastes whatever you #include into your source file.
But specifically, if you include a header file from a library, you are just including function declarations like void a();, and the linker finds implementations of these functions in the library itself.
Possible Duplicates:
[C] Header per source file.
In C++ why have header files and cpp files?
C++ - What should go into an .h file?
Is the only reason header files exist in C so that a developer can quickly see what functions are available and what arguments they can take? Or does it have something to do with the compiler?
Why has no other language used this method? Is it just me, or does it seem that keeping two copies of every function signature will only lead to more maintenance and more room for errors? Or is knowing about header files just something every C developer must know?
Header files are needed to declare functions and variables that are available. You might not have access to the definitions (=the .c files) at all; C supports binary-only distribution of code in libraries.
The compiler needs the information in the header files to know what functions, structures, etc are available and how to use them.
All languages needs this kind of information, although they retrieve the information in different ways. For example, a Java compiler does this by scanning either the class-file or the java source code to retrieve the information.
The drawback of the Java way is that the compiler potentially needs to hold much more information in memory to be able to do this. That is no big deal today, but in the seventies, when the C language was created, it was simply not possible to keep that much information in memory.
The main reason headers exist is to share declarations among multiple source files.
Say you have the function float *f(int a, int b) defined in the file a.c and reused in b.c and d.c. To allow the compiler to properly check arguments and return values, you either put the function prototype in a header file and include it in the .c source files, or you repeat the prototype in each source file.
Same goes for typedef etc.
While you could, in theory, repeat the same declaration in each source file, it would become a real nightmare to properly manage it.
Some languages use a similar approach. I remember TurboPascal units being not very different. You would put uses ... at the beginning to signal that you were going to require functions that were defined elsewhere. I can't remember whether that was carried over into Delphi as well.
Know what is in a library at your disposal.
Split the program into bite-size chunks for the compiler. Compiling a megabyte of C files simultaneously will take more resources than most modern hardware can offer.
Reduce compiler load. Why should it know in screen display procedures about deep database engine? Let it learn only of functions it needs now.
Separate private and public data. This use isn't frequent, but you may implement in C what C++ uses private fields for: each .c file includes two .h files, one with declarations of private stuff, the other with whatever others may require from the file. Less chance of a namespace conflict, and safer due to encapsulation.
Alternate configs. Makefile decides which header to use, and the same code may service two different platforms given two different header files.
probably more.