I have embedded software written entirely in C and Assembly, and for its build process I am using SCons and GCC. The source code is organized into different folders, and each folder represents a "standalone" sub-project (e.g., an application ELF, a static library, etc.).
The main SConstruct file, located in the project's main folder, builds a default environment, adds to it some compilation flags shared among all the sub-projects, and then invokes the SConscript placed in every sub-project, exporting the default environment.
Each sub-project builds a new environment by cloning the default one and customizes it with new builders, new compilation flags, etc.
The issue arises because I have to add, for all sub-projects, some actions after each object file's compilation. In short, I need to generate a preprocessed file for each ".c" and ".S" file, invoking the preprocessor with the same flags used by the normal compilation. To do this, I think the best solution is to add the compilation flag "-save-temps=obj" to the default environment (this flag tells the compiler to keep its temporary files), so all sub-projects will inherit this behavior.
The problem is that SCons doesn't track each generated temporary file. Considering that:
for each .c file, gcc will create two temporary files, .i and .s
for each .S file, gcc will create a single temporary file, .s
I need to add to the default Object builder:
A SideEffect to tell SCons that a .c-to-.o compilation will also create a .i file;
A SideEffect to tell SCons that a .c-to-.o compilation will also create a .s file;
A SideEffect to tell SCons that a .S-to-.o compilation will also create a .s file.
Is there a way to do this using only the file suffixes, without enumerating each target object file?
Furthermore, on each temporary file I need to invoke a custom tool, again with the same compilation flags used for its compilation, in order to create other files with additional debug information. How can I do this? Is there a way to add a post-action to the Object builder?
Thanks,
Ciro
I am building a dynamic library (.so) file for Android with around 100 local C files. The files all include a file c_macros.h, but the c_macros.h in question differs for different groups of files. For example, foo0.c and bar0.c need to include the c_macros.h in the directory 0/, whereas foo1.c and bar1.c need to include the c_macros.h in the directory 1/.
I see that one can define LOCAL_C_INCLUDES for the entirety of a compilation so that all .o files will use those local includes. However, can LOCAL_C_INCLUDES be specified for a single file (or a group of files) and then changed, so that the right directories are included for the right files?
One solution is just to build different .so files depending on which c_macros.h is being used, but this adds an overhead of around 10 KB for each .so file, so I'd like to squash everything into one big .so file if possible; but then I'd need to sort out the LOCAL_C_INCLUDES issue.
You can build the code into separate static libraries (where you can easily set different LOCAL_C_INCLUDES for each) and then build one single .so file that includes the static libraries. When linking the final .so file, this doesn't incur any extra overhead (the object files from the static libraries behave just like normal individual object files).
I am working with a different compiler, CC. It doesn't work like GCC.
When I was using GCC, I could do "gcc -o exe_filename source_filename" and the output would be an executable file.
When I use CC, I need two steps. First I compile the source files (suppose they involve a .c and a .h file), and this creates a .lis file and a .obj file. Then I run a link command, which creates a .exe file.
What is the relationship between LIS, OBJ and EXE files? I ask this because I wonder which files I need if I want to use the exe on another machine without including unnecessary files. If the LIS and OBJ files are only used for compilation, I don't need them on the other machine.
The compiler takes C files (and includes H files as referenced) and produces object (OBJ) and listing (LIS) files. The object file contains the code and data, but has unresolved external references. The listing typically includes line numbers, error and warning messages, and optional sections such as a type and variable cross-reference.
The linker combines object files and resolves external references to libraries. The result is an executable (EXE) image. (Or shareable image when creating libraries.)
Only the executable file needs to be copied from one system to another to run the application. The listing may be useful for interpreting error messages as it provides the properly correlated line numbers. The object could be useful if the application needs to be relinked due to changes in libraries, particularly if the target system has older versions than the original system.
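To make the "unresolved external references" idea concrete, here is a minimal two-file C sketch (the file names and the greet() function are made up for illustration):

    /* greet.c -- compiled on its own into greet.obj; it defines greet(). */
    #include <stdio.h>

    void greet(void)
    {
        printf("hello\n");
    }

    /* main.c -- compiled on its own into main.obj. This object file
     * contains an unresolved external reference to greet(); only when
     * the linker combines main.obj with greet.obj (and the runtime
     * libraries) does the fully resolved EXE come out. */
    void greet(void);   /* declaration only; the definition is elsewhere */

    int main(void)
    {
        greet();
        return 0;
    }

Compiling main.c alone succeeds and produces an object file, but producing the executable fails until greet.obj is supplied to the linker.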
The OBJ files are the compiled C files, in a format that can be "linked" together by a linker and turned into an EXE.
Compile -> OBJ -> Link -> EXE
The LIS file is just informational output describing the C code that the compiler ends up compiling.
All you need once everything is compiled and linked is the EXE.
You don't need the other files. The exe will work fine by itself.
I don't know much about LIS files, but the difference between OBJ and EXE is that an OBJ file may contain unresolved symbols, whereas in an EXE file all symbols are linked and resolved.
If the other machine has the same hardware, then you can run the EXE directly; otherwise you have to cross-compile.
gcc -MD file.c creates a dependency output file named file.d. But I don't understand the need for this file (the dependency file), because when an error occurs during compilation, no dependency file is generated. Can anyone shed some light on when they have used this dependency file, or on the usefulness of this file/feature of gcc?
The file.d file can be understood by make. You typically generate the .d files first, include them in your Makefile, and then recompile the C files only if one of the included headers has changed.
Don't bother with it if you don't use make.
GCC documentation says:
Instead of outputting the result of preprocessing, output a rule suitable for make describing the dependencies of the main source file. The preprocessor outputs one make rule containing the object file name for that source file, a colon, and the names of all the included files, including those coming from -include or -imacros command line options.
Hello Stack Overflow Community,
I am working on a C project to interleave multiple C programs into one binary, which can run the interleaved programs as threads or forks for benchmarking purposes.
So I run make in each desired program's folder and prelink all of its .o files with "ld -r" into one new .o file. After that I add a specifically named function to each of these "big" .o files, which does nothing but run that program's main(), providing argc and argv. Then I use objcopy to localize every global symbol except the undefined ones and the one for my specific function that runs main(). Finally, I link these manipulated .o files together with my program, which runs the specifically named functions as threads, as forks, or one after another.
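A rough C sketch of what such a wrapper function might look like (the names run_prog1 and prog1_main are hypothetical; in practice the program's main() would first have been renamed, e.g. with objcopy --redefine-sym, so that each interleaved program keeps a distinct entry point):

    /* Wrapper added to one prelinked "big" .o file. It does nothing
     * but call that program's (renamed) main(), supplying argc/argv. */
    extern int prog1_main(int argc, char **argv);

    int run_prog1(void)
    {
        /* A fixed argument vector for the embedded program. */
        static char *argv[] = { "prog1", (char *)0 };
        return prog1_main(1, argv);
    }
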
Now to my Question/Problem:
I ran into a problem with static libs. I was using FFmpeg for testing, and it builds static libs such as libavcodec, libavutil, and so on. Unfortunately, "ld -r" does not link .a files. So I tried to extract these libs with ar -x and then link the extracted .o files, in the way mentioned above, into the "big" new .o file. But it did not work, because libavcodec and libavutil both contain the file ff_inverse.o. That is obviously not a problem when I just build FFmpeg, which links these static libraries. But still, both libraries include it, so there must be a mechanism that chooses which ff_inverse.o to use and link. So my question: how does this work? Where is the difference?
The way ld does it with normal linking is to prioritize the libraries. Libraries listed first on the command line are linked in first, and only if symbols are still unresolved does it move on to the next library. When linking static libraries, it ignores the name of each .o file, because the name is unnecessary; only the exported symbols matter. You may want to emulate that behavior by extracting the libraries in that same order.
I have a set of C files that I would like to use. Can I just copy them to my include directory, or do I have to compile them? I would think they would be compiled into the final binary.
You need to compile those C files if you want to use them.
To make use of what's in those C files, you'll need a header file that declares what's inside them.
Those header files are what you'd put in your include folder, and you'll compile the C files together with your other C files. (Or you could make a library out of those C files.)
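For example, a minimal sketch (file and function names are made up):

    /* mathutils.h -- this is what goes in the include directory;
     * it declares what mathutils.c provides. */
    #ifndef MATHUTILS_H
    #define MATHUTILS_H

    int add(int a, int b);

    #endif

    /* mathutils.c -- this file must be compiled and linked together
     * with the rest of your code; copying it into the include
     * directory would not be enough. */
    #include "mathutils.h"

    int add(int a, int b)
    {
        return a + b;
    }

Any other file that wants to call add() just does #include "mathutils.h", and you compile both .c files together (for instance, gcc main.c mathutils.c).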
Yes, they need to be compiled so that they are available at the linking step. C is not an interpreted language, so having the sources present in an include directory would do nothing for execution.
You can keep the source files in the same location. The include files will be in the include directory. You can use the compilation option -I./<include-file-directory> to specify where to fetch the include files from.
The final binary will be the compiled version of all the source files you give to the compiler. You have to explicitly specify every file to be compiled, along with the name of the final executable.
If you don't, a default executable named a.out is created (assuming the platform is Linux and the compiler is gcc) in the directory where you compile.
Check the link for more details on compilation using a Makefile.