Makefile arguments to C definitions - requires clean

I have a makefile that I use to build an embedded project in C. When I build the project I pass an argument like the following so that a #define is set.
make UID=ID123
In the makefile I have
ifdef UID
CFLAGS+=-DUID=\"$(UID)\"
endif
And in the source code e.g. app.h I have
#ifndef UID
#define UID NOUID
#endif
The problem I am facing is that this works only if I clean the project first. Since the project is quite big, this takes a lot of time between recompilations.
How can this be avoided? Can the make program selectively rebuild only the files that are affected, like it does when a file gets edited? Does removing the object files that this #define affects help, or is that a bad idea?
The reason this is necessary is so that programming 100 devices, each will have a unique ID passed to the program at build time.
Thanks.

There are several things you can do, but some of them are pretty involved. It's a question of what your priorities are, and how much arcane Make-craft you want to do.
First you can figure out which object files depend on UID. Having such a list will save you a lot of work; for instance, you can remove only those object files and then rebuild them, instead of removing and rebuilding all object files with clean:
clean_uid_objects:
	rm -f $(UID_OBJECTS)
You can maintain this list yourself:
OBJECTS := foo.o bar.o
UID_OBJECTS := baz.o qux.o
or with some cleverness you might have Make construct the list on the fly, but explaining that one would take a while.
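One possible on-the-fly approach is a sketch like the following, assuming the sources sit in one directory, objects are named source.c -> source.o, and every file that mentions UID really depends on it:

```shell
# List the object files whose sources reference the UID macro.
# Inside the makefile you would wrap this pipeline in $(shell ...):
#   UID_OBJECTS := $(patsubst %.c,%.o,$(shell grep -l 'UID' *.c))
grep -l 'UID' *.c | sed 's/\.c$/.o/'
```

Note this is a blunt text match: a source that merely mentions UID in a comment would also land on the list, which is harmless (a few extra rebuilds) but not minimal.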
You can also have Make keep track of which UID you used the last time you rebuilt the object files. After all, if you haven't changed it, then you don't have to rebuild anything because of it. One way to do that is to record the last UID in a file called, say, UID_save. Then you can include that file in the makefile, and Make will update it (and reread it, and restart itself) when you pass a new UID:
-include UID_save

ifdef UID
CFLAGS+=-DUID=\"$(UID)\"
ifneq ($(UID),$(OLDID))
.PHONY: UID_save
endif
endif

UID_save:
	@echo OLDID:=$(UID) > $@
Once you have that working, you can make UID_save a prerequisite of the object files that depend on UID, so that no cleaning is necessary at all.
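In makefile terms that last step is a single extra rule (a sketch; UID_OBJECTS is the hand-maintained list from above):

```make
# Any object that bakes in UID gets UID_save as a prerequisite, so a
# changed UID rebuilds exactly those objects and nothing else.
$(UID_OBJECTS): UID_save
```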

I ended up using
.PHONY: myfile1.c myfile2.c myfile1.h
to force rebuild the files that depend on the argument passed.

My answer is similar to the last part of Beta's answer, but it shows a precise, systematic method. What you want is to have UID behave as if it were a file rather than a variable, so that you can make targets depend on it. Here is how to do it; see my answer in this post:
How do I add a debug option to Makefile
You write the function DEPENDABLE_VAR. Then, if you want a variable UID to be "dependable", you simply call that function:
$(eval $(call DEPENDABLE_VAR,UID))
and now you can write things like:
foobar.o: UID
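For reference, here is a sketch of what such a DEPENDABLE_VAR function can look like (adapted from memory of the linked answer, so treat the details as an assumption and check the original). It stores the variable's value in a file of the same name and rewrites that file only when the value changes, so the file's timestamp tracks the value:

```make
define DEPENDABLE_VAR
.PHONY: phony_$1
$1: phony_$1
	@if [ "`cat $1 2>/dev/null`" != '$($1)' ]; then \
		printf '%s' '$($1)' > $1; \
	fi
endef
```

The recipe runs on every invocation (via the phony prerequisite), but it only touches the file, and thus only triggers downstream rebuilds, when the stored value and the current value differ.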

Related

Define header file Macros at compile using Make, and recompile source files with those updated Macros

I am new to makefiles and am struggling a little bit to get things to work how I want.
I have a header file with some structs and a bunch of macros that I have sitting in #ifndef statements so that they have a default value if they're not defined on the command line.
I have three .c files that all include this header to make use of these macros.
In my makefile, I need to be able to compile all my source files, which have the header file as a dependent.
What I would like to be able to do is when I run make, if I include new values for those macros, that the header file is updated, and this triggers the source files to also be recompiled so that they include the new values.
I don't know if this is the correct way to go about it. The issue I am trying to avoid is passing these new values and having only one of the source files recompile (because make identifies only that one as changed), so that different values end up baked into the three programs.
What I would like to be able to do is when I run make, if I include new values for those macros, that the header file is updated, and this triggers the source files to also be recompiled so that they include the new values.
And how do you suppose that make would recognize that the macro definitions in use have changed? It cannot easily compare the definitions themselves to their values as of previous runs.
What make can do pretty easily is compare the timestamp of the file wherein the macro definitions are recorded to the timestamps of the object files. It can do that by making the former file a prerequisite for each of the latter files.
The file containing the macro definitions might be the makefile itself or some file included into it. If you use the makefile itself then you will sometimes get recompilations for reasons other than the macro definitions having changed, but many of those reasons will in fact be good reasons. The object files and binaries have a bona fide dependency on the instructions used to build them.
So you might have something along these lines:
SOURCES = larry.c curly.c moe.c
OBJECTS = $(SOURCES:.c=.o)
DEFINES = -DOPTION1=1 -DOPTION2=0 -DOPTION3=1
prog: $(OBJECTS)
	$(CC) -o $@ $^

%.o: %.c stooges.h Makefile
	$(CC) -c -o $@ $(DEFINES) $<
It would also be possible to use a similar procedure to update the timestamp of the header file, thus forcing recompilation indirectly, but that's synthetic. I recommend expressing the true dependencies instead.
OR, for a single-developer project, it would probably be feasible to just make it a developer responsibility to manually perform a clean build after performing makefile updates that require it. That scales poorly as you add developers, though.

How to include hundreds of folders in a Makefile?

I'm working on a C++ project at work where we need to develop a small piece for a larger application. We were given headers and static libraries for all of the code that we should need to reference. These are strewn throughout multiple folders and we placed all of that inside a common folder.
When writing our code, we'll need to include the headers and libraries as a part of our compilation process. Is there an elegant solution to doing this in a Makefile, or do I have to explicitly list each include folder with -I, each library folder with -L, and each library with -l?
Or is there an alternative to a Makefile that might make sense for this?
Edit: Here is an example of the folder structure:
common
folder1
subfolder1
include
libs
subfolder2
...
subfolder10
folder2
...
...
folder10
code
makefile
ourStuff
There are multiple levels of folders under common containing headers and libraries. We need to include code from there.
It was also asked why we don't just explicitly list the path in our #include statements. This code will be living in the main application once we're done, and it doesn't exactly follow the folder structure we were given.
Well, given the above structure it's simple enough to generate the things you want. For example if you want to add all -I... flags to CXXFLAGS, you can use:
INCDIRS := $(wildcard ../common/*/*/include)
CXXFLAGS += $(addprefix -I,$(INCDIRS))
Similar for -L flags:
LIBDIRS := $(wildcard ../common/*/*/libs)
LDFLAGS += $(addprefix -L,$(LIBDIRS))
Linking all the libraries is slightly more complicated. Assuming they're all static libraries you can do something like this:
LIBFILES := $(notdir $(wildcard ../common/*/*/libs/lib*.a))
LDLIBS += $(patsubst lib%.a,-l%,$(LIBFILES))
Of course this is assuming you don't have any naming conflicts / all libraries are unique.
Obviously, your question can be formulated like this: "Do I have to write a plethora of include paths, or is there some managed/automatic way to do that?" The question may pop up in the context of a makefile, but this is mainly because make does not try to cloak the complexity of software building from the programmer.

Trying to evade to another build system buys you nothing if the components you are using were not fitted into the larger build algorithm by their original programmers. If you receive pre-configured build parts (e.g. in the form of a CMake project), then you save a great deal of work, needing only to tie things together at some abstraction level high up in the hierarchy. The downside is that you are then locked into this build methodology, possibly with ramifications radiating out into parts of your project where they do as much harm as good.

You may want to read this thread: https://softwareengineering.stackexchange.com/questions/407056/the-case-against-path-expressions-in-include-directives
The cheapest way to at least partially achieve what you want in GNU make is the function wildcard-rec (see https://github.com/markpiffer/gmtt#call-wildcard-reclist-of-globs), which has a fairly flexible input-output relation. You can e.g. collect all paths of the form project/component_a/**/include/ in a whole subtree, or all header files under such paths with project/component_a/**/include/*.h.
PS: simply include gmtt.mk at the top of your makefiles.
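A hypothetical usage sketch (the exact call syntax here is my assumption from the README, so verify it against the linked documentation):

```make
include gmtt.mk

# Collect every include/ directory anywhere below ../common and
# convert the list into -I flags for the compiler.
INCDIRS := $(call wildcard-rec,../common/**/include/)
CXXFLAGS += $(addprefix -I,$(INCDIRS))
```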

Reading a set of variables into a Makefile from a C source file?

I have an AVR8 GCC application that can be built with a standard makefile. Because some folks who want to build the application don't want to set up make and such (or have trouble doing so), I also have figured out how to set the project up so it can be compiled from the Arduino IDE as well.
All is working.
But I normally set some items, like the version number, in the makefile, creating the VERSION string there and passing it as a define into each source file compilation. When building from the Arduino IDE, that step obviously does not occur, so I have to create a second #define in the Arduino sketch stub to recreate it.
This means when I update the version, I need to do so in 2 places, in the makefile and in the source file.
The easy option is to simply move the VERSION creation to the source file, where both can use it. And, I'm OK doing that, but
The makefile actually needs the version information, both to create the right filename (think app_v1.2.3.4.bin) and embed the version number into the bin file since it is used by the boot-loader (if requested) to ensure the version the boot-loader flashes is newer than the one already in FLASH. So, if I move the VERSION, RELEASE, MODIFICATION, etc. defines into the C code, I need to find a way to pull them back into the makefile.
I tried using the file read operations in the makefile, but they seem to ignore:
#define VERSION 0
with the prefaced '#' char.
I see there are some options to run sed/awk/etc. in bash, but I don't want to make too many assumptions about the environment, and the makefile currently runs on Windows as well as Unix/Linux without any differences.
I tried a few stack overflow examples, but nothing seems to yield those 4 numbers from any file, .h or otherwise.
I'm OK with creating version.h with just:
#define VERSION 0
#define RELEASE 1
#define MODIFICATION 2
#define FIX 4
If I can read it into the makefile and create the variables I need.
Jim
You may take a look at gmtt, which was designed exactly with your use case in mind. In gmtt the following should read and analyze your header file:
include gmtt.mk
# create a 3-column table from the header file. The first column is just the "#define"
VNR_TABLE := 3 $(file < version.h)
# Extract the values from the table: select column 3 from VNR_TABLE where column 2 equals a string constant.
# Be careful not to introduce spaces in the compare!
VER := $(call select,3,$(VNR_TABLE),$$(call str-eq,$$2,VERSION))
REL := $(call select,3,$(VNR_TABLE),$$(call str-eq,$$2,RELEASE))
MODF := $(call select,3,$(VNR_TABLE),$$(call str-eq,$$2,MODIFICATION))
FIX := $(call select,3,$(VNR_TABLE),$$(call str-eq,$$2,FIX))
I couldn't test it but I think you get the idea.
PS: using a GNUmake library just means placing the included file alongside the makefile.
I think in this case you can use the ‘file’ function of makefiles.
It allows you to write (with > specifier) or read (with < specifier) to/from files. Then you can trim (with filter-out) your variables inside your makefile.
Source: https://www.gnu.org/software/make/manual/html_node/File-Function.html#File-Function
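For the four-line version.h shown in the question, a minimal sketch with $(file <) could look like this (GNU make 4.2+; it assumes the file keeps exactly that layout and ordering, since newlines count as plain whitespace once the content is in a variable):

```make
VNR := $(file < version.h)
# The words are: #define VERSION 0 #define RELEASE 1 ... so the
# values sit at fixed positions 3, 6, 9 and 12.
VERSION      := $(word 3,$(VNR))
RELEASE      := $(word 6,$(VNR))
MODIFICATION := $(word 9,$(VNR))
FIX          := $(word 12,$(VNR))
```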
You can use GNU make's $(shell ...) function to extract the macro expansions. Assuming VERSION is defined in src.c and tokens are delimited by spaces (not tabs):
VERSION := $(shell sed -n -e "s/^\#define VERSION *\(.*\)/\1/p" src.c)
.PHONY: all
all:
	@echo VERSION=$(VERSION)

Makefile : Parameterize so that it compiles the highest numbered file

I'm building a makefile, and I want it to be tolerant of the fact that I have multiple main functions, possibly considering only the file with the highest alphabetical value as the entry point.
Is there a way to do this or something similar?
There are a few ways to do that, depending on what level you want or need to make that decision. Here are three that I know of:
Find out what to build before actually building, by calling $(shell find | grep) or $(shell grep -R ...) to identify only the files to be built (inclusion) or to filter out the files not to be built (exclusion).
Use objdump to analyze the object files and filter out all of those that have a main() function except the last one. You could use $(shell sort ...) or $(sort ...) as a helper.
If you know which objects can contain main(), and you are using a compiler/linker combination that allows weak declarations to be shadowed, you could use $(sort) on the list of objects and $(lastword) to get the one with the highest name, and put that first (or last, depending on how your linker resolves symbols) in the list of linker inputs.
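As a sketch of that last option (the file names here are made up; MAIN_SRCS would be your hand-maintained or grep-derived list of sources that define main()):

```make
MAIN_SRCS := main_alpha.c main_beta.c main_gamma.c
# $(sort) orders the list alphabetically; $(lastword) picks the
# highest name, which becomes the only main-containing object that
# is handed to the linker.
ENTRY := $(lastword $(sort $(MAIN_SRCS)))
OBJS  := common.o util.o $(ENTRY:.c=.o)
```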

Make file for larger directory structure

I've got several directories with subdirectories containing c or asm files and I want them all compiled/assembled and then linked. I'm not especially picky where the object files go (e.g. a special bin folder or in the src folder) as long as a make clean removes them all.
The structure would look something like this:
/src
/dir1
/dir1_1
+file1_1.s
+file1_2.s
+file1.s
/dir2
+file2.c
I'm sure there's some easy way to create a makefile that compiles all files without me having to specify where it should look (compiling all files in one directory is doable with wildcards, but what then?).
Do a Google search for 'recursive make considered harmful'. You'll find the original article which postulates that the recursive make procedure is a bad way of doing business, and you'll find some links to other places which debate the validity of the proposition.
Basically, there are two ways to do builds in a directory hierarchy (with make).
Recursive make: each directory contains a makefile which builds in sub-directories and then builds in the current directory.
Non-recursive make: the makefile includes all the dependent makefiles, and builds up the complete dependency structure for the entire project and only builds the requisite software.
I work routinely on a product where the main build sequence is driven by a hybrid system that uses a shell script plus one makefile for each directory. One section of the product is managed by a 'RMCH' makefile; most of it is not. The build script deals with phases of the build, and sequences the directories, and runs make in each directory when it is time to do so. (The source code is in 20k+ files spread over a multitude of directories - it is a big project/product.)
I've also converted a medium-small project (about 20 directories of relevance, and about 400 source files) to work with RMCH (from a script + makefile-per-directory system). It was a bit mind-blowing at first, but works pretty neatly now it is done. Whether I did it correctly is open for debate; it was primarily a learning exercise, though I also did some work modifying the code to work with a modern curses library instead of the archaic BSD library that was used as a part of the code (archaic, as in 1982-vintage - the code was last seriously developed in about 1986) and generally upgrading to modern (standard C) standards. It was also a chance to work with git - so, all in all, quite an extensive learning experience.
If you can wrap your brain around RMCH, it is a good system. If done correctly, with complete and accurate dependency tracking, it removes the guess-work from the build sequence, and it does run fast. However, migrating even a medium size project to it is fairly hard work - it would be a daunting task to do it on the main product I work on, though the system might well benefit from it.
An alternative is to look at other alternatives to make, such as cmake, rake, scons, bras, imake, or ant or whatever else takes your fancy. Most of those are easily discoverable via a Google search; the hard one is bras, which is based on Tcl (as in Tcl/Tk), but is probably largely dead now. And imake is mentioned more for completeness than as a serious suggestion. You might also look at the GNU Autotools. Those do not abandon make; they build atop make.
If your project is small enough, you might get away with using a single hand-crafted makefile instead of a more sophisticated build system: check out the manual page on transformation functions to see what's possible.
Your example project could be compiled with the following non-recursive makefile:
targets = $(patsubst %$(1),%$(2),$(foreach dir,$(3),$(wildcard $(dir)/*$(1))))
asmdirs := src/dir1 src/dir1/dir1_1
cdirs := src/dir2
asmobjects := $(call targets,.s,.o,$(asmdirs))
cobjects := $(call targets,.c,.o,$(cdirs))
.PHONY : all clean

all : $(asmobjects) $(cobjects)

clean :
	rm -f $(asmobjects) $(cobjects)

$(cobjects) : %.o : %.c
	gcc -o $@ -c $<

$(asmobjects) : %.o : %.s
	gcc -o $@ -c $<
However, because make can access the shell, you could also use standard Unix tools like find instead of the somewhat limited builtin functions, e.g.
asmsources := $(shell find src -name '*.s')
csources := $(shell find src -name '*.c')
asmobjects := $(asmsources:.s=.o)
cobjects := $(csources:.c=.o)
