I've read up a bit on preprocessor directives and I've seen #import being used a few times in C programs. I'm not sure what the difference is between them. Some sites have said that #include is only used for header files and that #import is used more in Java and is deprecated in C.
If that's the case, why do some programs still use #import and how exactly is it different from #include? Also, I've used #import in a few of my C programs and it seems to work fine and do the same thing as #include.
This is well explained in the GNU CPP (C preprocessor) manual, although the behaviour is the same in clang (and possibly other C compilers, but not MSVC):
The problem. Summary: You don't usually want to include the same header twice into a single translation unit, because that can lead to duplicate declarations, which is an error. However, since included files may themselves want to include other files, it is hard to avoid.
Some non-standard solutions (including #import). Summary: #import in the including file and #pragma once in the included file both prevent duplicate inclusion. But #pragma once is a much better solution, because the includer shouldn't need to know whether duplicate inclusion is acceptable.
The linked document calls #import a "deprecated extension", which is a slightly odd way of describing a feature which was never part of any standard C version. But it's not totally meaningless: many preprocessor implementations do allow #import (which is a feature of Objective-C), so it is a common extension. Calling it deprecated is a way of saying that the extension will never be part of any C standard, regardless of how widespread implementations are.
If you want to use an extension, use #pragma once; that also might not be present in a future standard, but changing it for a given header file will only require a change in one place instead of in every file which includes the header. C++ and even C are likely at some point to develop some kind of module feature which will allow inclusion guards to finally be replaced.
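To make the difference concrete, here is a minimal sketch (the file names point.h and main.c are made up) showing where the knowledge about duplicate inclusion lives in each case:

/* point.h -- the header protects itself, so includers can use plain #include */
#pragma once                      /* non-standard, but widely supported */
struct point { int x, y; };

/* main.c -- duplicate #includes of a self-guarded header are harmless */
#include "point.h"
#include "point.h"                /* skipped: point.h is already guarded */

/* With #import, by contrast, every including file must remember to write
   #import "point.h" instead of #include, so the burden is on the includer. */

Either #pragma once or a classic #ifndef guard keeps that knowledge in the one place it belongs: the header itself.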
As mentioned in comments, #import is not standard and can mean different things for different compilers.
With Microsoft's compiler, for example, #import can automatically generate and include a header file at compilation time.
Simple. It can be the same, but some compilers handle #import differently, like the Microsoft compiler, which will automatically generate and include a header for the specified file at compilation time.
I am new to C and am just learning the basics of modularising my code for neatness and maintainability. I am reading a lot of people saying not to include .c files directly but instead to use .h files with associated .c files.
My question is, when writing a library which is exposed/included via its .h file - does the compiler dedupe common includes, or are they included each time they are referenced?
For instance in my above application, I am using printf in my main and also in my foo library.
When running:
gcc -o app main.c foo/foo.c && ./app
I get the expected outputs printed to the console, however does the compiler remove duplicates of the <stdio.h> include or is it included once for main.c and once again for foo.c?
No, the compiler does not remove them. Nor should it, because sometimes (although it's rare) headers are written with the purpose of being included several times with different effects each time. So the compiler can't just omit these subsequent inclusions.
That's why people put include guards in headers (#ifndef FOO_H_ in this case).
Each file, regardless of whether it is a .h or .c file, should include what it needs. It should not rely on a header having already been included somewhere else. If something is included twice in the current compilation unit, the include guards will make sure headers are only included once, regardless of how many files try to include them.
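For the foo.h from the question, a classic include guard might look like this (FOO_H_ is just the conventional macro name mentioned above):

/* foo/foo.h */
#ifndef FOO_H_
#define FOO_H_

void foo(void);

#endif /* FOO_H_ */

The first inclusion defines FOO_H_; any later inclusion in the same translation unit sees that it is already defined and skips the body.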
As a side note, #pragma once, even though it's not in the C standard, is a de-facto standard compiler extension. So you can just do:
#pragma once
void foo();
It's one of those rare cases where a non-standard compiler extension is so widely supported that it's safe to use.
On the contrary, each compilation unit ("main.c" and "foo.c" in your case) needs that include. Otherwise the compiler would not know the prototype of printf() (see the note below). Each compilation unit (aka "module") is compiled on its own.
You might be mixing up headers and linkable files (object code files and libraries).
The contents of a header file replace the #include line during preprocessing. "stdio.h" contains the prototype of printf(), among a lot of other stuff, but not the implementation of the function.
If the compiler generates the object code for "main.c" and "foo.c", each of them includes an unresolved reference to printf().
Finally the linker will include the object code for printf(), but just once. This single instance of the function is called by both callers. This is where the deduplication you are asking about actually happens.
You might wonder why you don't have to add the library to your command line. This is a convenience feature of most compiler drivers, as nearly all applications want the standard libraries. You might like to add "-v" to the command line to see what really happens. Other options can suppress this automation.
Note: Some compilers are quite smart and know a lot of standard functions. They will accept the source and produce a nice warning. But don't rely on this.
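As a rough sketch of what that means for the files in the question (the contents are assumed), each translation unit includes <stdio.h> for itself and is compiled independently; only the linker ties the resulting objects to the single printf() in the C library:

/* foo/foo.c -- one translation unit */
#include <stdio.h>          /* foo.c needs the printf() prototype itself */
#include "foo.h"

void foo(void)
{
    printf("hello from foo\n");
}

/* main.c -- another translation unit, compiled on its own */
#include <stdio.h>          /* including it again here costs nothing at link time */
#include "foo/foo.h"

int main(void)
{
    printf("hello from main\n");
    foo();
    return 0;
}

Building in explicit steps, e.g. gcc -c main.c && gcc -c foo/foo.c && gcc -o app main.o foo.o, makes the separation visible: each .o contains an unresolved reference to printf(), and the linker resolves both against the one copy in the C library.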
Does including header files such as stdio.h, conio.h or any other make our code or program heavier?
Including header files inserts all the content from them into the translation unit on pre-processing.
If the header has only declarations (which is usually the case) and the functions are implemented in library files, the code doesn't get heavier. If a header file contains implementations, they are compiled into every translation unit that includes it, which does make the output larger.
You can read more about compilation steps here:
http://www.tenouk.com/ModuleW.html
Why don't you try including them and building an EXE, then not including them and building an EXE, and see what happens to the file size. I suspect you'll find that the linker is clever enough to only build what's necessary into the EXE.
For typical release builds, no. Header files typically only contain declarations, and unused declarations do not contribute to the size of release builds. Header files can contain inline function definitions, which might be emitted by your compiler if the definition is not optimized out. However, this will probably not apply to system headers like <stdio.h>.
For typical debug builds, yes. Debugging data often contains information about declarations even if those declarations aren't used in the program. This way you can use those declarations in the debugger. Modern debug information includes function definitions, enums, and even preprocessor definitions these days.
That said, you can put anything in a header file.
The primary effect of unnecessary header file inclusions is to make the build process take longer.
Normally a header file contains declarative statements only. Declarations in themselves do not add to code size. Referencing the declared symbols anywhere in the code will cause the linker to include the code defining the declared symbols.
Because a header file must be parsed by the compiler in the context of the code in which it is included, the build time can be extended by header file inclusion. Very large or deeply nested headers such as windows.h for example can have a considerable effect in this way.
Additionally if you are using an IDE with code navigation and comprehension facilities such as auto-complete and syntax checking, the IDE must parse the code in a similar manner to the compiler, and here again header files can slow that process down.
While you should avoid including unnecessary header files, those containing declarations used in the code are unavoidable; or at least, avoiding them will likely lead to errors and repetition, and make the code harder to maintain.
I like to keep my files clean, so I prefer to take out includes I don't need. Lately I've been just commenting the includes out and seeing if it compiles without warnings (-Wall -Wextra -pedantic, minus a couple very specific ones). I figure if it compiles without warnings I didn't need it.
Is this actually a safe way to check if an include is needed or can it introduce UB or other problems? Are there any specific warnings I need to be sure are enabled to catch potential problems?
n.b. I'm actually using Objective C and clang, so anything specific to those is appreciated, but given the flexibility of Objective C I think if there's any trouble it will be a general C thing. Certainly any problems in C will affect Objective C.
In principle, yes.
The exception would be if two headers interact in some hidden way. Say, if you:
include two different headers which define the same symbol differently,
both definitions are syntactically valid and well-typed,
but one definition is good, the other breaks your program at run-time.
Hopefully, your header files are not structured like that. It's somewhat unlikely, though not inconceivable.
I'd be more comfortable doing this if I had good (unit) tests.
Usually just commenting out the inclusion of the header is safe, meaning: if the header is needed then there will be compiler errors when you remove it, and (usually) if the header is not needed, the code will still compile fine.
This should not be done without inspecting the header to see what it adds though, as there is the (not exactly typical) possibility that a header only provides optional #define's (or #undef's) which will alter, but not break, the way a program is compiled.
The only way to be sure is to build your code without the header (if it's able to build in the first place) and run a proper regimen of testing to ensure its behavior has not changed.
No. Apart from the reasons already mentioned in other answers, it's possible that the header is needed and another header includes it indirectly. If you remove the #include, you won't see an error but there may be errors on other platforms.
In general, no. It is easy to introduce silent changes.
Suppose header.h defines some macros like
#define WITH_FEATURE_FOO
The C file including header.h tests the macro
#ifdef WITH_FEATURE_FOO
do_this();
#else
do_that();
#endif
Your files compile cleanly and with all warnings enabled with or without the inclusion of header.h, but the result behaves differently. The only way to get a definitive answer is to analyze which identifiers a header defines/declares and see if at least one of them appears in the preprocessed C file.
One tool that does this is FlexeLint from Gimpel. I don't get paid for saying this, even though they should :-) If you want to avoid shelling out big bucks, an approach I have been taking is to compile a C file to an object file with and without the header; if both compilations succeed, check whether the object files are identical. If they are the same, you don't need the header (but watch out for include directives wrapped in #ifdefs that are enabled by a -DWITH_FEATURE_FOO option).
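A rough sketch of that object-file comparison (file and header names assumed; compile without -g, because debug information records the include and would make otherwise identical objects differ):

gcc -c -O2 file.c -o with_header.o
# temporarily comment out the suspect #include in file.c, then:
gcc -c -O2 file.c -o without_header.o
cmp with_header.o without_header.o && echo "header appears to be unused"

If cmp reports a difference, the header contributed something (a macro, an inline function, an #undef, ...) even though the build succeeded without it.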
This is just a general compiler question, directed at C based languages.
If I have some code that looks like this:
#include "header1.h"
#include "header2.h"
#include "header3.h"
#include "header4.h" //Header where #define BUILD_MODULE is located
#ifdef BUILD_MODULE
//module code to build
#endif //BUILD_MODULE
Will all of the code associated with those headers get built even if BUILD_MODULE is not defined? The compiler just "pastes" the contents of headers, correct? So this would essentially build a useless bunch of header code that just takes up space?
All of the text of the headers will be included in the compilation, but they will generally have little or no effect, as explained below.
C does not have any concept of “header code”. A compilation of the file in the question would be treated the same as if the contents of all the included files appeared in a single file. Then what matters is whether the contents define any objects or functions.
Most declarations in header files are (as header files are commonly used) just declarations, not definitions. They just tell the compiler about things; they do not actually cause objects or code to be created. For the most part, a compiler will not generate any data or code from declarations that are not definitions.
If the headers define external objects or functions, the compiler must generate data (or space) or code for them, because these objects or functions could be referred to from other source files to be compiled later and then linked with the object produced from the current compilation. (Some linkers can determine that external objects or functions are not used and discard them.)
If the headers define static objects or functions (to be precise, objects with internal or no linkage), then a compiler may generate data or code for these. However, the optimizer should see that these objects and functions are not referenced, and therefore generation may be suppressed. This is a simple optimization, because it does not require any complicated code or data analysis, simply an observation that nothing depends on the objects or functions.
So, the C standard does not guarantee that no data or code is generated for static objects or functions, but even moderate quality C implementations should avoid it, unless optimization is disabled.
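A small sketch of that distinction, with invented names:

/* Typical header content: declarations only, so nothing is emitted for them. */
void frobnicate(int x);        /* function declaration */
extern int frob_count;         /* object declaration */

/* Definitions, which do (or may) cause the compiler to emit something: */
int frob_table[256];           /* external definition: space must be reserved,
                                  since another translation unit may reference it */

/* Internal linkage: if nothing in this translation unit references it,
   an optimizing compiler can simply drop it. */
static int twice(int x)
{
    return 2 * x;
}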
Depends on the actual compiler. Optimizing compilers will not generate the output for unrequired code, whereas dumber compilers will.
gcc (a very common C compiler for open-source platforms) will optimize your code with the -O option, which will not generate unneeded expressions.
Code in #ifdef statements where the target is not defined will never generate output, as this would violate the language specifications.
Conceptually, at least, include/macro processing is a separate step from compilation. The main source file is read and a new temporary file is constructed containing all the included code. If anything is "#ifdefed out" then that code is not included in the temporary file. At the same time, the occurrences of macro names are replaced with the text they "expand" into. It is that resulting file, with all the includes included, etc, that is fed into the actual compiler.
Some compilers do this literally (and you can even "capture" the intermediate file) while others sort of simulate it (and actually require an entire separate step if you request that the intermediate file be produced). But most compilers have one means or another of producing the file for your examination.
The C/C++ standards lay out some rather arcane rules that must be followed to assure that any "simulated" implementation doesn't somehow change the behavior of the resulting code, vs the "literal" approach.
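With gcc or clang, for instance, the intermediate file can be captured explicitly (main.c is just a placeholder name):

gcc -E main.c -o main.i      # stop after preprocessing and keep the expanded source
gcc -save-temps -c main.c    # or compile normally, but keep main.i (and main.s) around

Inspecting main.i shows exactly what the compiler proper sees: includes expanded, macros replaced, and #ifdefed-out regions already gone.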
I have two large framework libraries, whose header files are included in my project. Either one works flawlessly, but including both causes erratic behaviour (but no error message related to the macro).
I assume that they both #define a macro of the same name. What is the most efficient way to identify the problematic macro?
I assume that they both #define a macro of the same name.
That should generate at least a warning by the compiler (if they are in the same translation unit).
How do I identify redefined macros in C/C++?
As far as I know there is no straight-forward way.
Either one works flawlessly, but including both causes erratic behaviour
Can you please give us some details on the erratic behaviour? What actually happens? What makes you think it's the macro definitions?
If the header files are badly written and #undef SYMBOL ... #define SYMBOL something-else then you can't trace this. Simply #defining a macro twice should at least issue a warning. You'd have to look more closely at the 'erratic behavior'.
Try looking at the preprocessed output to determine what's different about it when you #include the header files and when you don't. With gcc and with MSVC, you'd compile with -E to print the preprocessor output to stdout. (It likely will be many thousands of lines, so you'll want to pipe it to a file.)
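One concrete variant of that, using gcc's -dM switch (which dumps every macro that ends up defined): assuming two tiny test files a.c, which includes only the first framework's header, and ab.c, which includes both, a diff of the two dumps shows what the second header added or silently redefined.

gcc -E -dM a.c  | sort > a_macros.txt
gcc -E -dM ab.c | sort > ab_macros.txt
diff a_macros.txt ab_macros.txt      # changed lines are macros the second header altered

You may need the same -I flags you use for the real build so the headers are found.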
You should be able to run ctags over your source.
ctags can generate a tags file that, amongst other things, contains the names and locations of the macros in your C files.
You can control the types of symbols that ctags will store in your tags file through the use of the --c-kinds option.
eg.
ctags --c-kinds=+d -f tags --recurse ./your_source_directory/
You can then search for duplicates in the resultant tags file.
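Since a tags file is tab-separated with the tag name in the first column, duplicated names can then be spotted with something like:

cut -f1 tags | sort | uniq -d      # tag names that are defined more than once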
grep for #define ?
Are you sure the problem isn't something other than a macro (for example, pragmas for structure packing, global memory allocators, clashes in the global namespace for class names, messing with the locale, ...)?
1. Compile with all warnings on; the warnings should tell you when a macro is already defined (maybe you can modify the code in order to fix this).
2. If (1) doesn't help, then you should try to create function wrappers for each library. This way you avoid including the conflicting headers by including the wrapped headers and using the wrapped functions. This is laborious, but it's sometimes the only way to make two libraries coexist in an application.
Basically, solution (2) makes a separation between the libraries. An example of such a conflict is ACE with wxWidgets (version 2.8) when forced to use precompiled libraries that are compiled with different options (one library Unicode, the other ASCII).