Unit testing a program that has lots of macros - C

Recently I needed to add unit tests to a legacy program.
But it contains lots of macros, like:
#ifdef CONFIG_XXX
do xxx
#endif
#ifdef CONFIG_YYY
do yyy
#endif
Currently, the generic program paths are covered by unit tests, so I want to add tests that cover the code inside the macro blocks (the different program paths).
It seems that I need to compile and run my program with a particular set of macros each time, and designing the macro combinations so that they cover the program paths while keeping the number of compilations down is really not easy.
So I plan to move all the hardware-related code into an arch folder; the macros then move from the C files into the makefile, but I still need to compile with a particular set of macros each time to get the unit tests working.
Does anyone have experience with this problem?
Thanks for your comments.

I think you can just use gcc -D to build many versions of the binary program,
and use a script to compile and run them all. :)
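For example, a rough sketch of such a script (untested; the file names, macro names, and binary name are placeholders) that builds and runs the tests once per macro combination:

#!/bin/sh
# Sketch only: build the unit tests once per macro combination and run them.
# legacy.c / test_main.c / ut_bin are hypothetical names.
for DEFS in "" "-DCONFIG_XXX" "-DCONFIG_YYY" "-DCONFIG_XXX -DCONFIG_YYY"; do
    gcc -Wall $DEFS legacy.c test_main.c -o ut_bin || exit 1
    echo "== tests with: ${DEFS:-no extra defines} =="
    ./ut_bin || echo "FAILED with: $DEFS"
done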

Related

Best Practices in embedded C: How to use preprocessor directives to auto-configure a project for its specific hardware

I have two boards, each with the same MCU as the target. The difference is that the peripherals are not 100% the same (let's say they overlap by maybe 90%). So far my colleague has two macros, and he either comments them out or not, so that #ifdef/#endif can be used to tell the preprocessor which includes to use and which to ignore.
I'm thinking of better ways to do this. I don't like the idea of people having to search for the correct line to comment out each time they want the correct build for their hardware; this should be automated and/or better documented, IMHO.
The best I came up with is multiple "build sets" that would then be called "hardware-1" and "hardware-2" or something (of course more descriptive...). These build sets would then each have different "-I" options to define the two macros my colleague already used.
For cmake I found this thread:
Define preprocessor macro through CMake?
Is this the way to go, or are there better, more elegant ways? How would you solve this situation? Maybe the question also comes down to "What are the best practices to tackle this?"
Thanks for your input
J
The best I came up with is multiple "build sets" that would then be called "hardware-1" and "hardware-2" or something (of course more descriptive...). These build sets would then each have different "-I" options to define the two macros my colleague already used.
You mean -D, not -I, but yes, defining the macros via the compiler command line is one of the traditional approaches to this. How you might achieve that depends somewhat on your build system, but with a hand-rolled makefile it is common to define make variables for target-specific flags, and to put those, appropriately commented, at the top of the top-level makefile. Sometimes these are intended to be modified at build time, but sometimes there are simply different makefiles, or else which set of flags to use is controlled by the target requested on the make command line.
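For instance, a rough sketch of such a hand-rolled makefile (untested; all names are placeholders) could keep the choice in a single variable near the top, overridable from the command line:

# Select the hardware here, or override at build time: make HW_FLAGS=-DHARDWARE_2
HW_FLAGS ?= -DHARDWARE_1
CFLAGS   += -Wall $(HW_FLAGS)

firmware: main.c board.c
	$(CC) $(CFLAGS) -o $@ main.c board.c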
For cmake I found [...]. Is this the way to go or are there better ways that are more elegant?
If you are using cmake already then yes, cmake's facilities for adding macro definitions to the compiler command line would be a great approach. If you are not using cmake then no, switching to a cmake-based build system would be way overkill for just solving the problem described. For systems where CMake will generate makefiles, it is basically a wrapper for what I already described.
I happen to be a fan of the Autotools. If you have an Autotools-based build system then there are different ways to set up this sort of thing, but if you don't, then setting up autotooling for just this purpose would be overkill. It is perhaps worth mentioning, however, that a standard Autotools approach would work by putting the definitions of the adjustable control macros in a header file, and having all the source files include that header. The Autotools would generate that header programmatically, but that's not essential -- you could set up such a header manually and update it as needed, and that would still solve the problem of knowing where to look for the macro definitions.
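For example, a hand-maintained header along these lines (the macro names are placeholders) would keep all the knobs in one well-known place:

/* config.h -- sketch of a hand-maintained configuration header.
 * Every source file does #include "config.h"; to switch boards,
 * change which macro is defined here (names are hypothetical). */
#ifndef CONFIG_H
#define CONFIG_H

#define HARDWARE_1 1
/* #define HARDWARE_2 1 */

#endif /* CONFIG_H */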
Normally one can specify preprocessor defines as part of the compilation command.
gcc -Wall -Darduino embedded.c
So assuming Linux/Make you could use
make clean arduino
or
make clean atmega2560
and simply have two targets named that in the make file.
Each one would have -Darduino or -Datmega2560 as part of its compile command.
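A minimal makefile sketch of that idea (untested; the target names and -D flags come from the example above, the file names are placeholders):

CFLAGS = -Wall

.PHONY: arduino atmega2560 clean

arduino:
	$(CC) $(CFLAGS) -Darduino embedded.c -o embedded_arduino

atmega2560:
	$(CC) $(CFLAGS) -Datmega2560 embedded.c -o embedded_atmega2560

clean:
	rm -f embedded_arduino embedded_atmega2560

With this, "make clean arduino" rebuilds the Arduino flavour from scratch, as described above.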
If you are using some sort of IDE like MSVC, on the project properties page, under C/C++ you would find a Preprocessor area, and you can add one or the other as part of the preprocessor defines.
Preprocessor Definitions arduino;_DEBUG;_CONSOLE;%(PreprocessorDefinitions)

Is it possible to see the macros of a compiled C program?

I am trying to learn C, and I have a compiled C file whose macros I want to view. Is there a tool to view the macros of a compiled C file?
No. That's literally impossible.
The preprocessor is a textual replacement that happens before the main compile pass. There is no difference between using a macro and putting the code the macro expands to in its place.*
*Ignoring debug information. But even then, the expanded code can be made to report whatever file and line number you like, e.g. with the #line directive.
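A tiny illustration of what that means: after preprocessing, the compiler sees exactly the same code for both functions below, so the binary contains no trace of the macro.

#define SQUARE(x) ((x) * (x))

/* These two functions are identical once the preprocessor has run. */
int with_macro(int n)    { return SQUARE(n); }
int without_macro(int n) { return ((n) * (n)); }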
They're always defined in the header file(s) that you've pulled in with #include, or in files that those headers in turn #include.
This may involve a lot of digging. It may involve going into files that make no sense to you because they're not written for casual inspection.
Any macros of any importance are usually documented. They may use other more complex implementation-specific macros that you shouldn't concern yourself with ordinarily, but if you're curious how they work the source is all there.
That being said, this is only relevant if you have the source, and more specifically a complete build environment. Once compiled, all these definitions, like the source itself, do not appear in the executable and cannot be inferred directly from it, especially not from a release build.
Unlike Java or C#, C compiles directly to machine code, so there's no way to easily reverse that back to the source. There are "decompilers" that try, but they can only really guess at the original source. VM-based languages like Java and C# only lightly compile the code, so there are a lot of hints as to how that code was generated, and reversing it is an easier process.

Expanding a C macro selectively [duplicate]

I was wondering whether it is possible, and if so how, to run a C preprocessor, like cpp, on a C++ source file and process only the conditional directives (#if, #endif, etc.). I would like the other directives to stay intact in the output file.
I'm doing some analysis of C# code, and there is no separate C# preprocessor. My idea is to run a C preprocessor on the C# file and process only the conditionals. That way, for example, the #region directive would stay in the file, but cpp appears to remove #region.
You might be looking for a tool like coan:
Coan is a software engineering tool for analysing preprocessor-based configurations of C or C++ source code. Its principal use is to simplify a body of source code by eliminating any parts that are redundant with respect to a specified configuration.
It's precisely designed to process #if and #ifdef preprocessor lines, and remove code accordingly, but it has a lot of other possible uses.
The Linux unifdef command does what you want:
http://linux.die.net/man/1/unifdef
Even if you're not on Linux, the source is available on the web.
BTW, this is a duplicate of another question: Way to omit undefined preprocessor branches by default with unifdef?
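Typical usage is something like the following (file names and symbols are placeholders; note the caveat about C#-specific directives in the next answer):

# Resolve only the #if/#ifdef branches controlled by FEATURE_X and FEATURE_Y;
# other branches and ordinary lines are passed through unchanged.
unifdef -DFEATURE_X -UFEATURE_Y input.cs > output.cs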
Oh, this is the same task I had in the past. I tried the cpp, unifdef, and coan tools - all of them stumbled over C#-specific preprocessor things like #region. In the end I decided to make my own:
https://github.com/gaDZella/undefine.
The tool has a pretty simple set of options compared to the tools mentioned above, but it is fully compatible with the C# preprocessor syntax.
You can use the g++ -E option to stop after the preprocessing stage.
-E -> stop after the preprocessing stage. The output is preprocessed source code, which is sent to standard output.
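For example (the file names are placeholders; note that -E expands everything the preprocessor handles, not just the conditionals):

g++ -E source.cpp -o source.ii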

Check for time.h / sys/time.h compatibility

I am currently porting an autotools project to CMake that uses the AC_HEADER_TIME autotools macro to check if I can include both time.h and sys/time.h.
How can this be done with CMake?
It isn't the time test specifically, but for an example of how to do that sort of thing you might check out this example from BRL-CAD:
http://sourceforge.net/p/brlcad/code/HEAD/tree/brlcad/trunk/misc/CMake/test_srcs/sys_wait_test.c
The code is then used to do a test in CMake with:
CHECK_C_SOURCE_RUNS(${CMAKE_SOURCE_DIR}/misc/CMake/test_srcs/sys_wait_test.c SYS_WAIT_TEST_RUNS)
You can also use CHECK_C_SOURCE_COMPILES if you don't want to try to run the code (we usually do, but that sort of thing is situation dependent.) See http://www.cmake.org/cmake/help/v3.0/module/CheckCSourceRuns.html
and
http://www.cmake.org/cmake/help/v3.0/module/CheckCSourceCompiles.html for more information about using variables to control how the compilation is done. Generally speaking if you need to set those variables, you'll want to cache their current values before specifying the test values and then restore the original values after the test, i.e.
set(CMAKE_REQUIRED_FLAGS_CACHE ${CMAKE_REQUIRED_FLAGS})
set(CMAKE_REQUIRED_FLAGS "-foo_flag")
CHECK_C_SOURCE_RUNS(${CMAKE_SOURCE_DIR}/CMake/time_test.c TIME_TEST_RUNS)
set(CMAKE_REQUIRED_FLAGS ${CMAKE_REQUIRED_FLAGS_CACHE})
I've had a few puzzling behaviors in tests over the years that resulted from leftovers from one test interfering with another.
You can use try_compile() with a sample source file that #includes both headers, and check the result.
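Either way, for the specific time.h / sys/time.h question, an untested sketch using the CheckCSourceCompiles module could look like the following (the result variable name TIME_WITH_SYS_TIME mirrors the macro that AC_HEADER_TIME defines):

include(CheckCSourceCompiles)
# Can <sys/time.h> and <time.h> be included together?
check_c_source_compiles("
#include <sys/time.h>
#include <time.h>
int main(void) { return 0; }
" TIME_WITH_SYS_TIME)

if(TIME_WITH_SYS_TIME)
  add_definitions(-DTIME_WITH_SYS_TIME=1)
endif()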

How to automatically call all functions in C source code

Have you ever heard of automatic C code generators?
I have to do a somewhat unusual piece of API research that includes at least one attempted execution of every function. It may lead to crashes or segmentation faults - no matter. I just need to register every function call.
So I got a long list (several hundred) of functions from the sources using
ctags -x --c-kinds=f *.c
Can I use any tool to generate code that calls every one of them? Thanks a lot.
UPD: thanks for all your answers.
You could also consider customizing the GCC compiler, e.g. with a MELT extension (which would, for example, generate the testing code during a customized compilation). Then you might even define your own #pragma or __attribute__ to parameterize these functions (enabling their auto-testing, giving default arguments for testing, etc.).
However, I'm not sure it is the right approach for unit testing. There are many unit testing frameworks (but I am not very familiar with them).
Maybe something like autoconf could help you with that, as described here. In particular, look at AC_CHECK_FUNCS. Autoconf creates small programs to test the existence of registered functions.
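For what it's worth, the ctags listing above can also be post-processed into a crude driver. A rough, untested sketch (it blindly declares every function as taking no arguments, so many of the generated calls will be wrong or will crash, which the question says is acceptable):

ctags -x --c-kinds=f *.c | awk '{ print $1 }' | sort -u > funcs.txt

{
  echo '#include <stdio.h>'
  sed 's/.*/extern int &();/' funcs.txt
  echo 'int main(void) {'
  sed 's/.*/    puts("&"); &();/' funcs.txt
  echo '    return 0;'
  echo '}'
} > call_all.c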
