Compile and link .h header files that many .c source programs use

I work in a group whose test bucket has hundreds of .c source programs. The .c programs are fairly small, and they all include the same 10 .h header files. These .h files are fairly large.
Each time we get a new library file to test, we run a script that recompiles the whole test bucket and runs it against the new library. The problem is that the compiling takes fairly long, especially if the environment is virtual.
Is there a way to compile the .h header files once, put the result in a separate object file, and have the many .c source files link against that object file? I think this would speed up compile time. I am willing to change/remove all the #include directives in the .c source programs.
Any suggestions for speeding up compile time are greatly appreciated.
Also, I should say that the script executes a makefile per .c source test program; the makefile is not told to compile all programs in the current directory. Each test program is compiled into its own executable.

You could use the precompiled header feature. See http://gcc.gnu.org/onlinedocs/gcc/Precompiled-Headers.html
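In practice this might look like the following sketch, assuming the ten headers are gathered into a single umbrella header (common.h is a made-up name) that every test program includes before anything else:

# compile the umbrella header once; gcc writes common.h.gch next to it
gcc -O2 common.h
# later compiles that #include "common.h" with the same flags automatically
# use common.h.gch instead of re-parsing the ten large headers
gcc -O2 -c test001.c

Note that gcc only uses the .gch file when the include appears before any other C code and the compile options match, so the umbrella header should be the very first #include in each test program.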

You asked for further suggestions to speed up your compilation.
One way is to use ccache. Basically, ccache keeps a cache of the object files compiled so far and returns them (instead of recompiling over and over) when it recognises that the same source file is being compiled again.
Using it should be as simple as:
Install ccache
Prefix your gcc/cc/g++ command with ccache
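For example, on a Debian-style system (the package name and compile line are only illustrative):

# install ccache
sudo apt-get install ccache
# run the usual compile through ccache; identical recompiles come back
# from the cache instead of being compiled again
ccache gcc -c test001.c -o test001.o

If the per-test makefiles use the standard CC variable, running make CC="ccache gcc" gets the same effect without editing every command line.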

Rewrite your headers: strip out everything except the declarations and leave those in the header, and move all of the implementation into new .c files. Compile those as a library, link your programs against it, and distribute the library with the runtime system.
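A minimal sketch of that split, with made-up names; only the prototype stays in the header, and the body moves to a .c file that is compiled once into a library:

/* bigstuff.h - declarations only */
int checksum(const char *buf, int len);

/* bigstuff.c - the body that used to live in the header */
#include "bigstuff.h"
int checksum(const char *buf, int len)
{
    int s = 0;
    for (int i = 0; i < len; i++)
        s += (unsigned char)buf[i];
    return s;
}

# build the implementation once as a static library
gcc -c bigstuff.c
ar rcs libbigstuff.a bigstuff.o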

If I understand correctly, the way libraries typically work is by providing precompiled code in object or shared library files (.o / .so on Linux systems), along with header files (.h) for use in projects.
What happens is that when you compile, the #include <library.h> directive finds that header and pastes its contents into the source file being compiled. Then, once the source file is compiled, it is linked against the precompiled library code. That way, the library can be used by a huge number of projects without being compiled from source each time. The only part that must be recompiled when building against a library is the (relatively) small amount of code in the headers, which essentially makes the library's functions and variables accessible to the source code.
All this means that to drastically speed up compilation, your best bet is to take all of the function bodies out of the 10 .h files and leave only the function prototypes in the headers. Once you have all of the functions in separate .c source files, you can compile them into an object file (typically with the -c flag). Then, whenever you need to compile a new program against the 10 headers, you include your stripped-down version of the headers and link against the precompiled object. Since only the new code in the .c file has to be compiled, instead of all of the code in the headers, the process should be much faster.
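Concretely, the per-test build could then look like this (file names are placeholders, and -ltestlib stands for whatever library the bucket is tested against):

# done once, only when the shared code itself changes
gcc -c common.c -o common.o
# done per test program: only the small test file is compiled,
# the large shared code is merely linked
gcc -o test001 test001.c common.o -ltestlib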

Related

Importing Modules — customized Modules in C

I am currently learning the C programming language and I'm having some issues importing modules I created.
I created a small module that reads input with fgets and flushes the stdin buffer, and I don't want to keep rewriting that code every single time. I just want to import this small module the way I used to in Python. I didn't know how, because I'm not using an IDE; I'm just compiling with gcc in a terminal and using a text editor. I tried searching with Google, but in vain.
You should create a header for your module that declares the functions in the module – and any other information that a consumer of the module needs. You might call that header weekly.h, a pun on your name, but you can choose any name you like within reason.
You should create a library (shared or static — that's up to you) that contains the functions (and any global variables, if you're so gauche as to have any) defined by your module. You might call it libweekly.so or libweekly.a — or using the extensions appropriate to your machine (.dylib and .a on macOS, for example). The source files might or might not be weekly.c — if there's more than one function, you'll probably have multiple source files, so they won't all be weekly.c. You should put this code (the header and the source files and their makefile) into a separate source directory.
You should install the header(s) and the library in a well-known location (e.g. $HOME/include for headers and $HOME/lib for the library — or maybe in the corresponding directories under /usr/local), and then ensure that the right options are used when compiling (-I$HOME/include for the headers) or linking (-L$HOME/lib and -lweekly).
Your source code using the module would contain:
#include "weekly.h"
and your code would be available. With shared libraries in $HOME/lib, you would have to ensure that the runtime system knows where to find the library. If you install it in /usr/local, that is done for you already. If you install it in $HOME/lib, you have to investigate things like /etc/ld.so.conf or the LD_LIBRARY_PATH or DYLIB_LIBRARY_PATH environment variables, etc.
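Assuming the layout above, a build against the module might look like this (myprog.c is a placeholder for your own source):

gcc -I$HOME/include -c myprog.c
gcc -o myprog myprog.o -L$HOME/lib -lweekly
# only needed for a shared libweekly.so installed outside the standard paths
export LD_LIBRARY_PATH=$HOME/lib:$LD_LIBRARY_PATH

With a static libweekly.a the LD_LIBRARY_PATH step is not needed.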
You need to create a header file (.h) with your function declarations, types and extern variables. Then, in the program where you want to use those functions, include this .h file and add the compiled .o file (with your functions) to the object file list. And you are done.

linking object files and linking static libraries containing these files

Hello Stack Overflow Community,
I am working on a C project to interleave multiple C programs into one binary, which can run the interleaved programs as threads or forks for benchmarking purposes.
To do that, I run make in each program folder of the desired programs and prelink all the .o files with "ld -r" into one new .o file. After that I add a specifically named function to each of these "big" .o files, which does nothing but run the main() of that program and provide argc and argv. Then I use objcopy to localize every global symbol except the undefined ones and the one of my specific function which runs main(). At last I link these manipulated .o files together with my program, which runs the specifically named functions as threads, as forks, or one after another.
Now to my question/problem:
I ran into a problem with static libs. I was using ffmpeg for testing, and it builds static libs such as libavcodec, libavutil and so on. Unfortunately, "ld -r" does not link .a files. So I tried to extract these libs with ar -x and then link the extracted .o files, in the way mentioned above, into the "big" new .o file. But it did not work, because libavcodec and libavutil both include the file ff_inverse.o. That is obviously not a problem when I just build ffmpeg, which links these static libraries. But still, both libraries include it, so there must be a mechanism that decides which ff_inverse.o to use and link. So my question: how does this work? Where is the difference?
The way ld handles normal linking is to prioritize the libraries. Libraries listed first on the command line are searched first, and only if symbols are still unresolved does it move on to the next library. When linking static libraries, it ignores the name of each .o file, because the name is unnecessary; only the exported symbols matter. You may want to emulate that behaviour by extracting the libraries in a sorted order.
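As a small illustration, with two made-up archives that both contain an object file defining the same symbol:

# liba.a and libb.a each contain their own copy of inverse.o
gcc -o prog main.o -L. -la -lb
# main.o's reference is satisfied by the copy inside liba.a, because that
# archive is searched first; the duplicate inside libb.a is never pulled in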

C Header Files with the same declarations but different implementations

I have two sets of header files and .c files in my project. I will only ever be including one of these headers, but I want the option to quickly swap the header I'm including. Both header files have some declarations that are exactly the same, but the implementations in the .c files are different. Basically, what I need is a way to tell the compiler to only compile the .c file that is associated with the header I'm including elsewhere in my program.
You could always specify the .c or .o file that you're going to link against at compile/link time, for instance
gcc -o myexe file1.c/file1.o
or
gcc -o myexe file2.c/file2.o
You could even make this a separate make target if you have a makefile and the same header with two different implementations. I would recommend just using one header file and changing the underlying implementation; there is no point in having two headers with similar declarations.
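For instance, a makefile could expose one target per implementation (file names are placeholders; recipe lines must start with a tab):

impl1: main.c file1.c
	gcc -o myexe main.c file1.c

impl2: main.c file2.c
	gcc -o myexe main.c file2.c

Then make impl1 or make impl2 builds against the corresponding implementation.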
If both header files are exactly the same, then you don't need to maintain two header files. You can keep only one copy, and any code that needs the header can include that single copy.
You can always specify which .c file you want to compile while compiling. In gcc, you can name the C file to be compiled on the command line. In Visual Studio, you can add the correct C file to the project.
I suggest you maintain only one header file and include that in your code. Introduce a flag in the makefile to choose which implementation gets linked. You have not mentioned what you are using to build.
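As a sketch of such a flag, assuming GNU make (names are made up; file1.c is the default implementation and can be overridden on the command line):

IMPL ?= file1.c

myexe: main.c $(IMPL)
	gcc -o myexe main.c $(IMPL)

Build with plain make for the default, or make IMPL=file2.c for the other implementation.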

Copy C files to include

I have a set of C files that I would like to use. Can I just copy them to my include directory, or do I have to compile them? I would think they would be compiled into the final binary.
You need to compile those C files if you want to use them.
To make use of what's in those C files, you'll need a header file that declares what's inside them.
Those header files are what you'd put in your include folder, and you'll compile the C files together with your other C files. (Or you could make a library out of those C files.)
Yes, they need to be compiled so that they are available at the linking step. C is not an interpreted language, so having the sources present in an include directory would do nothing for execution.
You can keep the source files in the same location. The include files will be in the include directory. You can use the compilation option -I./<include-file-directory> to specify where to fetch the include files from.
The final binary will be the compiled version of all the source files you give to the compiler. You have to explicitly specify every file to be compiled, along with the name of the final executable.
If you don't, a default executable named a.out is created (I am assuming the platform is Linux and the compiler is gcc) in the directory where you compile.
Check the link for more details on compilation using Makefile.
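For example, with the copied sources in ./src, their headers in ./include and your own program in main.c (all of these names are made up):

gcc -I./include -c src/util.c
gcc -I./include -o myprog main.c util.o

The -I flag only tells the compiler where to find the headers; the .c files themselves still have to be listed for compilation.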

"tc" "th" files for C program

I first encountered files ending with .tc and .th in a C library (http://www.vlfeat.org/api/files.html; only the .tc files are listed there. To see the .th files, one has to download the source code at http://www.vlfeat.org/download/vlfeat-0.9.5-bin.tar.gz. They are under the vl directory.). I just wonder what they mean and how they relate to normal .h and .c files?
Thanks and regards!
They are used as templates: the files are not compiled directly, but are #included into the corresponding .c or .h file after setting #defines that affect the final result.
One example is what happens in mathop_sse2.c. They include the same mathop_sse2.tc twice, but the first time FLT is defined as VL_TYPE_DOUBLE and the second time as VL_TYPE_FLOAT. That way they avoid duplicating the exact same code for different types.
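The same trick can be shown with a tiny made-up example: a template file included twice with different macro settings, producing a float and a double version of one function body.

/* sum.tc - hypothetical template, never compiled on its own */
T SUFFIX(sum)(const T *v, int n)
{
    T s = 0;
    for (int i = 0; i < n; i++)
        s += v[i];
    return s;
}

/* sum.c - includes the template once per type */
#define T float
#define SUFFIX(name) name ## _f
#include "sum.tc"
#undef T
#undef SUFFIX

#define T double
#define SUFFIX(name) name ## _d
#include "sum.tc"
/* sum.c now defines both sum_f() and sum_d() from a single template body */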
Seems most likely these files were written using the Borland compiler.
