A Java programmer has questions regarding C header files

I have a fair amount of practice with Java as a programming language, but I am completely new to C. I understand that a header file contains forward declarations for methods and variables. How is this different from an abstract class in Java?

The short answer:
Abstract classes are a concept of object-oriented programming. Header files are a necessity due to the way the C language is constructed. The two cannot really be compared.
The long answer:
To understand header files, and the need for them, you must understand the concepts of "declaration" and "definition". In C and C++, a declaration states that something exists somewhere, for example a function:
void Test(int i);
We have now declared that somewhere in the program there exists a function Test that takes a single int parameter. A definition then states what that something actually is:
void Test(int i)
{
...
}
Here we have defined what the function void Test(int) actually is.
Global variables are declared using the extern keyword:
extern int i;
They are defined without the extern keyword:
int i;
When you compile a C program, each source file (.c file) is compiled into an object file (.obj or .o). Definitions are compiled into the object file as actual code. When all the source files have been compiled, the object files are linked into the final executable. Therefore a function should be defined in only one .c file; otherwise the same symbol is defined in several object files, and the linker will normally reject the program with a "multiple definition" error. The same goes for global variables: defining the same global in two .c files either fails to link or, on toolchains that silently merge such definitions, produces subtle bugs because code written against two supposedly independent variables ends up sharing one.
But functions defined in one .c file cannot see functions defined in other .c files. So if, from file1.c, you need to call the function Test(int) defined in file2.c, a declaration of Test(int) must be visible when file1.c is compiled. When file1.c is compiled into file1.obj, the resulting object file records that it needs Test(int) to be defined somewhere. When the program is linked, the linker finds that file2.obj contains the definition that file1.obj depends on.
If no object file contains the definition, you get a linker error rather than a compiler error (linker errors are considerably harder to find and correct than compiler errors, because you get no file name and line number pointing at the problem).
So you use the header file to store declarations for the definitions stored in the corresponding source file.
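To make this concrete, here is a minimal sketch using the file and function names from the explanation above (the global counter and the body of Test are invented for illustration):
/* file2.h - declarations only */
#ifndef FILE2_H
#define FILE2_H
void Test(int i);        /* declares that Test exists somewhere */
extern int counter;      /* declares a global defined elsewhere */
#endif

/* file2.c - definitions */
#include <stdio.h>
#include "file2.h"
int counter = 0;         /* the one and only definition of the global */
void Test(int i)         /* the one and only definition of the function */
{
    counter += i;
    printf("counter = %d\n", counter);
}

/* file1.c - uses Test through the declaration in file2.h */
#include "file2.h"
int main(void)
{
    Test(42);            /* the linker later resolves this against file2's object file */
    return 0;
}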

IMO it's mainly because many C programmers seem to think that Java programmers don't know how to program “for real”, e.g. handling pointers, memory and so on.
I would rather compare headers to Java interfaces, in the sense that they generally define how the API must be used.
Headers are basically just a way to avoid copy-pasting: the preprocessor simply includes the content of the header in the source file when it encounters an #include directive.
You put in a header every declaration that the user will commonly use.

Here are the answers:
Java has had a bad reputation among some hardcore C programmers mainly because they think:
it's "too easy" (no memory-management, segfaults)
"can't be used for serious work"
"just for the web" or,
"slow".
Java is hardly the easiest language in the world these days, compared to languages like Python, etc.
It is used in many desktop apps - applets aren't even used that often. Finally, Java will always be slower than C, because it is not compiled directly to machine code. Sometimes, though, extreme speed isn't needed. Anyway, the JVM isn't the slowest language VM ever.
When you're working in C, there are no abstract classes.
All a header file does is contain code which is pasted into other files. The main reason you put declarations in a header file is so that they appear at the top of every file that needs them; that way you don't have to care about the order in which you define your functions in the actual implementation file.
While you can kind-of use OO concepts in C, the language has no built-in support for classes and similar fundamentals of OO. Implementing inheritance in plain C is awkward at best, so you never really get full OO, or abstract classes for that matter. I would suggest sticking to plain old structs.
If it makes it easier for you to learn, by all means think of headers as abstract classes (with the implementation file being the inheriting class); but IMHO it is a difficult mindset to use when working in a language without explicit support for those features.
I'm not sure if Java has them, but I think a closer analogue could be partial classes in C#.

If you forward declare something, you eventually have to deliver an implementation, otherwise the linker will complain. The header lets you present a "module"'s public API and makes the declarations available (for type checking and so on) to other parts of the program.

Comprehensive reading: Learning C from Java - recommended for developers who are coming from Java to C.

I think that there is a lot of derision for Java simply because it's popular.
Abstract classes and interfaces specify a contract or a set of functions that can be invoked on an object of a certain type. Function prototypes in C only really do compile time type checking of function arguments/return values.

While your first question seems subjective to me, I will answer to the second one:
A header file contains the declarations which are then made available to other files via #inclusion by the preprocessor.
For instance, you will declare a function in a header and implement it in a .c file. Other files will be able to use the function as long as they can see the declaration (by including the header file).
At linking time the linker will look among the object files, or the various libraries linked, for some object which provides the code for the function.
A typical pattern is: you distribute the header files for your library, and a dll (for instance) which contains the object code. Then in your application you include the header, and the compiler will be able to compile because it will find the declaration in the header. No need to provide the actual implementation of the code, which will be available for the linker through the dll.
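As a rough sketch of that pattern (the library and file names here are made up, and the exact build command depends on your toolchain):
/* mylib.h - the header you ship to users of the library */
#ifndef MYLIB_H
#define MYLIB_H
int mylib_add(int a, int b);   /* declaration only; the code lives in the library */
#endif

/* app.c - the user's application */
#include "mylib.h"
int main(void)
{
    return mylib_add(2, 3);    /* compiles because the declaration is visible */
}
app.c compiles against the header alone; the definition of mylib_add is supplied at link time by the shipped library, with something like cc app.c -L. -lmylib.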

C programs run directly on the machine, while Java programs run inside the JVM, so a common belief is that Java programs are slow. Also, Java hides some low-level constructs from you (pointers, direct memory access), memory management, etc.
In C the declaration and definition of a function are separated. The declaration "declares" that there exists a function which takes certain arguments and returns something. The definition "defines" what the function actually does. The former goes in header files, the latter in the actual code. When you compile your code, you use the header files to tell your compiler that such a function exists, and you link in a binary that contains the machine code for the function.
In Java, the binary code itself also contains the declaration of the functions, so it is enough for the compiler to look at the class files to get both the definition and declaration of the available functions.

Related

Is there a way to limit the use of a function to its library in C?

So I'm working on a static C library (like a library.a file) for a school project. There are multiple functions, some of which are placed in different files, so I can't use the static keyword. Is there a way that those functions could be limited to the library itself, an equivalent to static for libraries?
The C language does not have any formal sense of a unit of program organization larger than a single translation unit but smaller than a whole program. Libraries are a foreign concept to the language specification, provided by substantially all toolchains, but not part of the language itself. Thus, no, the C language does not define a mechanism other than static to declare that a function identifier can be referenced only by a proper subset of all translation units contributing to a program.
Such limitations are supported by some shared library formats, such as ELF, and it is common for C implementations targeting such shared libraries to provide extensions that enable those facilities to be engaged, but the same generally is not true for static libraries.
Note also that in all these cases, we're talking about the linkage of function identifiers, not about genuinely controlling access to the functions. In principle, any function in the program can be called from anywhere in the program via a function pointer pointing to it.
Frame challenge: why do you care?
The usual accommodation for having functions with external linkage that you don't want library clients to call directly would be to omit those functions from the library's public header files. What does it matter if some intrepid person can analyze the library to discover and possibly call those functions? Your public headers and documentation tell people how the library should be used. If people use it in other ways then that's on them.
It may not be possible to completely "hide" the existence of a set of restricted functions from other source files if they are defined with external linkage, since their identifiers will be visible when linking (as noted in the other answer).
However, if you are only looking to prevent someone from inadvertently calling restricted functions, one of these approaches may be useful:
In some of my projects, I have used #define and #ifdef statements to block restricted functions from being used throughout the program. For example, in my Hardware Abstraction Layer (HAL) library C source files, I typically place #define HAL__ prior to any #include statements. Then I place an #ifdef HAL__ ... #endif block around any restricted function definitions in my header files (a sketch appears after this answer). Someone with intention could easily bypass this by adding #define HAL__ to their source code or by modifying the header file, but it provides some protection against unintentional use of restricted functions and other definitions.
Place the restricted function definitions in a separate header file used to build the library itself (for example library.a) and provide only header files containing non-restricted function declarations with the library. The identifiers for any functions defined will still be visible to the linker, but without the prototypes, it will be difficult for anyone to call them.
Again, if having the identifiers for any restricted functions visible throughout the program would be a problem (for example, duplicating other identifiers), then these options will not work. Also, if the goal is to prevent developers from intentionally calling restricted functions, then these options will not work, although option 2 would make this more difficult. If the intention is only to prevent unintentional calls to the restricted functions and there is no concern with having duplicate identifiers in the program, then these options may help.
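A minimal sketch of the first approach; the HAL__ guard is the one described above, the file and function names are invented, and here the guard is placed around the restricted declarations in the header:
/* hal.h */
#ifndef HAL_H
#define HAL_H
void hal_read(void);                /* public API, always visible */
#ifdef HAL__
void hal_internal_reset(void);      /* restricted: only visible where HAL__ is defined */
#endif
#endif

/* hal.c - part of the library itself */
#define HAL__                        /* defined before any #include, as described above */
#include "hal.h"
void hal_read(void)           { /* ... */ }
void hal_internal_reset(void) { /* ... */ }

/* client.c - an ordinary user of the library */
#include "hal.h"                     /* HAL__ is not defined here, so the restricted
                                        prototype is never seen; an accidental call
                                        draws at least an implicit-declaration diagnostic */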

Where are the header functions defined? [duplicate]

When I include some function from a header file in a C++ program, does the entire header file code get copied into the final executable, or is machine code generated only for the specific function I use? For example, if I call std::sort from the <algorithm> header in C++, is machine code generated only for the sort() function or for the entire <algorithm> header file?
I think that a similar question exists somewhere on Stack Overflow, but I have tried my best to find it (I glanced over it once, but lost the link). If you can point me to that, it would be wonderful.
You're mixing two distinct issues here:
Header files, handled by the preprocessor
Selective linking of code by the C++ linker
Header files
These are simply copied verbatim by the preprocessor into the place that includes them. All the code of algorithm is copied into the .cpp file when you #include <algorithm>.
Selective linking
Most modern linkers won't link in functions that aren't getting called in your application. I.e. write a function foo and never call it - its code won't get into the executable. So if you #include <algorithm> and only use sort here's what happens:
The preprocessor shoves the whole algorithm file into your source file
You call only sort
The linker analyzes this and adds only the code of sort (and any functions it calls) to the executable. The code for the other algorithms isn't added.
That said, C++ templates complicate the matter a bit further. It's a complex issue to explain here, but in a nutshell: templates get expanded by the compiler for all the types you're actually using. So if you have a vector of int and a vector of string, the compiler will generate two copies of the whole code for the vector class in your program. Since you are using it (otherwise the compiler wouldn't generate it), the linker also places it into the executable.
In fact, the entire file is copied into the .cpp file, and it depends on the compiler/linker whether it picks up only the 'needed' functions or all of them.
In general, simplified summary:
a debug configuration typically compiles in all non-template functions,
a release configuration strips unneeded functions.
Plus it depends on attributes: a function declared for export will never be stripped.
On the other hand, template function variants are 'generated' when used, so only the ones you explicitly use are compiled in.
EDIT: header file code isn't generated; in most cases it is hand-written.
If you #include a header file in your source code, it acts as if the text in that header was written in place of the #include preprocessor directive.
Generally headers contain declarations, i.e. information about what's inside a library. This way the compiler allows you to call things for which the code exists outside the current compilation unit (e.g. the .cpp file you are including the header from). When the program is linked into an executable that you can run, the linker decides what to include, usually based on what your program actually uses. Libraries may also be linked dynamically, meaning that the executable file does not actually include the library code but the library is linked at runtime.
It depends on the compiler. Most compilers today do flow analysis to prune out uncalled functions. http://en.wikipedia.org/wiki/Data-flow_analysis

Using my linked list code in a new c program

Apologies in advance since this seems extremely basic.
I have my linked list file, linkedList.c, and I would like to include it in my new C file so that I don't have to code the whole linked list again. In Java I just had to place it in the same folder and then I could create an object of the class linkedList in the new file; however, C doesn't seem to work this way. If I try to use
#include "linkedList.c"
at the start of my new file then I receive errors since main has now been defined twice along with my Boolean variable. How exactly do I go about solving this?
You could #include any kind of (syntactically valid) C code, but you generally should not (by convention) include a .c file. Read more about the C preprocessor.
In practice, you should consider making a library (to be linked for reuse), and separate your shared code into a .h header file (containing declarations) that you would #include for re-use, and an implementation .c file. Of course, don't define any main in the shared source code. In some simple cases and on some operating systems, you might also share a single object file (or a few of them) and the related header files.
Your shared header would declare functions and extern variables (and #define some macros). It could also contain the definition (with their body) of short static inline functions.
Your shared implementation would define these (and other) functions and variables.
C programming entails a lot of conventions (and you need to define your own ones). Look at existing examples (some free software source code from github, or from a Linux distribution). For reusable container libraries, look into glib (from GTK) and also sglib (which uses a lot of preprocessor tricks) and many others.
Because C does not have any notion of namespaces it is wise (for readability and other reasons) to have a consistent naming convention, e.g. starting all the public names (of functions and variables and macros in headers) of your library by some common prefix.
You need to create a header file, linkedList.h, and declare your linked list function prototypes in there for the functions you have already defined in your linkedList.c file. Then you can use #include "linkedList.h" to reuse your code.
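A minimal sketch of that layout (the node type and the list_push function are placeholders; use whatever your linkedList.c actually defines):
/* linkedList.h - declarations only */
#ifndef LINKEDLIST_H
#define LINKEDLIST_H
struct node {
    int value;
    struct node *next;
};
struct node *list_push(struct node *head, int value);
#endif

/* linkedList.c - keep the definitions here, but remove its main() */
#include <stdlib.h>
#include "linkedList.h"
struct node *list_push(struct node *head, int value)
{
    struct node *n = malloc(sizeof *n);
    if (n == NULL)
        return head;                 /* allocation failed; keep the old list */
    n->value = value;
    n->next = head;
    return n;
}

/* newProgram.c - the only file that defines main() */
#include "linkedList.h"
int main(void)
{
    struct node *head = list_push(NULL, 1);
    (void)head;
    return 0;
}
Compile both .c files together (for example cc newProgram.c linkedList.c) and the linker combines them; since the new file only includes the header, nothing ends up defined twice.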

Included files, all or nothing?

If I #include a file in C, do I get the entire contents of the file linked in, or just the parts I use?
If it has 10 functions in it, and I only use one of the functions, does the code for the other nine functions get included in my executable? This is especially relevant for me right now as I am working on a microcontroller and memory is precious.
Firstly, header files do not get "linked in". #include is basically a textual copy-paste feature. Everything from your include file gets pasted by preprocessor into the final translation unit, which will later be seamlessly processed by the compiler proper. The compiler proper knows nothing about any header files or #include directives.
Secondly, it means that if in your code you declared or defined some function or variable that you do not use, it is completely irrelevant whether it came from a header file through #include or was written directly in source file. There's absolutely no difference.
Thirdly, the question is: what exactly do you have in your header file that you include? Typically, header files do not define objects and functions, they simply declare them. Declarations do not produce any code, regardless whether you use the function or not. Declarations simply tell the compiler that the code (generated from the function definition) already exists elsewhere. Thus, as long as we are talking about typical header files, #include directives and header files by themselves have no effect on final code size.
Fourthly, if your header file is of some unusual kind that contains function (or object) definitions, then see "firstly" and "secondly" above. The compiler proper can see only one translation unit at a time, for which reason a typical strategy for the compiler proper is to completely discard unused entities with internal linkage (i.e. static objects and functions) and keep all entities with external linkage. Entities with external linkage cannot be discarded by compiler proper, since they might be needed in some other translation unit.
Fifthly, at linking stage linker can see the program in its entirety and, for that reason, can discard unused objects and functions, if it is advanced enough for that (and if you allow linker to do it). Meanwhile, inclusion-exclusion precision of a typical run-of-the-mill linker is limited to a single object file. Each object file is atomic to such linker. This means that if you want to be able to exclude unused functions on per-function basis, you might have to adopt "one function per object file" strategy, i.e. write one and only one function per .c file. Of course, this is only possible when you write your own code. If some third-party library you want to use does not adhere to this convention, then you might not be able to exclude individual functions.
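A small illustration of the "Thirdly" and "Fourthly" points above (all names are invented):
/* util.h - declarations only, so including it adds no code anywhere */
int used_fn(int x);
int unused_fn(int x);

/* util.c */
#include "util.h"
static int unused_helper(int x) { return x + 1; }  /* internal linkage and never called:
                                                      the compiler may discard it entirely
                                                      (and will usually warn about it) */
int used_fn(int x)   { return x * 2; }             /* external linkage: kept in util.o */
int unused_fn(int x) { return x * 3; }             /* also kept in util.o; whether it reaches
                                                      the final binary is up to the linker,
                                                      as described above */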
If you #include a file in C, the entire contents of that file are added to your source file and compiled by your compiler. A header file, though, usually only has declarations of functions and no definitions (so no actual code is compiled).
The linker, on the other hand, takes all the functions from all the libraries and compiled source code and merges them into the final output file. At this time, the linker will discard any functions that you aren't using.
So, to answer your question: only the functions you use (and indirectly depend on) will be included in your final program file, and this is independent of what files you #include. Happy hacking!
You have to distinguish between different scenarios:
What does the included header file contain? Declarations of external functions only, or also static function definitions?
How are the implementations of the external functions declared in that header file distributed? Are they all implemented in one .c file, or spread across several .c files?
Regarding point 1: merely #includeing external declarations does not make any code part of your object file. And definitions of static functions that are part of the header file, but which are not referenced by your code, may not become part of your object file either; this is a fairly common optimization, but it depends on your compiler.
Regarding point 2: Some linkers can only link whole object files, all or nothing. That means, if all the external functions declared in a header file are implemented in one .c file, and, if your code references at least one of these functions, chances are that you will get the whole object file, including all the other functions you don't use. Some linkers, however, can avoid this and remove unused parts when linking object files.
One brute-force approach to deal with non-optimizing linkers is, to put every external function into a .c file of its own. You will, however, have to find a way to deal with the situation that some of these functions refer to a common static function that is part of the original .c file...
#include simply presents the compiler, ultimately, with what looks like a single file (and if you use -save-temps on GCC you will see that exactly one file is handed to the actual compiler). It is no more complicated than that. So if you have some function prototypes or defines in your .c file, having them come from an include makes no difference whatsoever; the end result is the same.
If the things you include contain actual code, function bodies and not just prototypes, then it is the same as if you had written that code in the .c file itself. Whether or not those functions show up in the final binary depends on whether you gave them external linkage or marked them static, on whether you optimized, and so on. The same goes for variables, structures and other things.
Not all linkers are the same, but a common way to do it is that whatever the compiler left in the object file goes into the final binary. But if you take those objects and make a library out of them, then some (many?) linkers don't pull everything into the binary, only the portions that are required to resolve the dependencies.

What is a C header file? [duplicate]

Possible Duplicates:
[C] Header per source file.
In C++ why have header files and cpp files?
C++ - What should go into an .h file?
Is the only reason header files exist in C so that a developer can quickly see what functions are available and what arguments they take? Or is it something to do with the compiler?
Why has no other language used this method? Is it just me, or does it seem that having 2 sets of function definitions will only lead to more maintenance and more room for errors? Or is knowing about header files just something every C developer must know?
Header files are needed to declare functions and variables that are available. You might not have access to the definitions (=the .c files) at all; C supports binary-only distribution of code in libraries.
The compiler needs the information in the header files to know what functions, structures, etc are available and how to use them.
All languages needs this kind of information, although they retrieve the information in different ways. For example, a Java compiler does this by scanning either the class-file or the java source code to retrieve the information.
The drawback of the Java way is that the compiler potentially needs to hold much more information in memory to be able to do this. This is no big deal today, but in the seventies, when the C language was created, it was simply not possible to keep that much information in memory.
The main reason headers exist is to share declarations among multiple source files.
Say you have the function float *f(int a, int b) defined in the file a.c and reused in b.c and d.c. To allow the compiler to properly check arguments and return values, you either put the function prototype in a header file and include it in the .c source files, or you repeat the prototype in each source file.
Same goes for typedef etc.
While you could, in theory, repeat the same declaration in each source file, it would become a real nightmare to properly manage it.
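For the example above, the shared header could be as small as this (a.h is an assumed name for the header that accompanies a.c):
/* a.h */
#ifndef A_H
#define A_H
float *f(int a, int b);   /* the prototype, written exactly once */
#endif
b.c and d.c then simply #include "a.h" instead of each repeating the prototype.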
Some languages use a similar approach. I remember TurboPascal units being not very different: you would put uses ... at the beginning to signal that you were going to require functions that were defined elsewhere. I can't remember whether that carried over into Delphi as well.
Know what is in a library at your disposal.
Split the program into bite-size chunks for the compiler. Compiling a megabyte of C files simultaneously will take more resources than most modern hardware can offer.
Reduce compiler load. Why should it, while compiling screen display procedures, know about the deep database engine? Let it see only the functions it needs now.
Separate private and public data. This use isn't frequent, but you may implement in C what C++ uses private fields for: each .c file includes two .h files, one with declarations of private stuff, the other with whatever other files may require from it. There is less chance of a namespace conflict, and it is safer thanks to the encapsulation (see the sketch after this list).
Alternate configs. The makefile decides which header to use, and the same code may serve two different platforms given two different header files.
probably more.
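A sketch of the private/public header split mentioned in the list above (all names are invented):
/* widget_public.h - what other modules may use */
#ifndef WIDGET_PUBLIC_H
#define WIDGET_PUBLIC_H
void widget_draw(void);
#endif

/* widget_private.h - included only by widget.c */
#ifndef WIDGET_PRIVATE_H
#define WIDGET_PRIVATE_H
#include "widget_public.h"
void widget_recalc_layout(void);    /* "private": not advertised to other files */
#endif

/* widget.c */
#include "widget_private.h"
void widget_draw(void)          { widget_recalc_layout(); }
void widget_recalc_layout(void) { /* ... */ }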

Resources