How do I "version" a C binary file on Linux platforms?

Usually the practice is not to include binaries in source control repositories. I am using Mercurial, and I would like to know if anyone has experience with embedding a version (major + minor) in a C binary, so that when it is distributed, a command-line argument like mybinaryApp --version prints a unique version that I can control at build time.

The way that I embed version numbers in the code is to #define VERSION_MAJOR (and VERSION_MINOR) in a separate header file, include it in the files that need the version number, and then use those macros where needed. You are then free to change the version number in one place, without having to continually modify the files that use it.
This is essentially what most more advanced tools do.
Alternatively, if you want a build-specific tag, you can use __DATE__ and __TIME__ to embed the build date and time.
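For illustration, a minimal sketch of this approach (the header name, the macro values, and the program name are made up here):

/* Sketch only: in practice VERSION_MAJOR/VERSION_MINOR would live in a
 * separate version.h controlled by the build, not inline like this. */
#include <stdio.h>
#include <string.h>

#define VERSION_MAJOR 1   /* normally in version.h */
#define VERSION_MINOR 4

/* Stringify the numeric macros at compile time */
#define STR_(x) #x
#define STR(x) STR_(x)
#define VERSION_STRING STR(VERSION_MAJOR) "." STR(VERSION_MINOR)

int main(int argc, char **argv)
{
    if (argc > 1 && strcmp(argv[1], "--version") == 0) {
        /* __DATE__/__TIME__ add a build-specific tag on top of the version */
        printf("mybinaryApp %s (built %s %s)\n",
               VERSION_STRING, __DATE__, __TIME__);
        return 0;
    }
    /* ... normal program logic ... */
    return 0;
}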

The usual way is to use the autotools (autoconf, automake, autoheader, et al.). They make an include file config.h available, and this file defines the PACKAGE_VERSION macro that you can print out with --version.
This process is independent of version control. The simple reason is that you will want to distribute your project as a source tarball, and people must be able to build that tarball without access to your version control.
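A sketch of the C side, assuming a config.h generated by autoconf (PACKAGE_NAME and PACKAGE_VERSION come from the AC_INIT line of configure.ac; the values in the comment are made up):

/* Assumes configure.ac contains something like:
 *   AC_INIT([mybinaryApp], [1.4])
 * autoconf then defines PACKAGE_NAME and PACKAGE_VERSION in config.h. */
#include "config.h"
#include <stdio.h>
#include <string.h>

int main(int argc, char **argv)
{
    if (argc > 1 && strcmp(argv[1], "--version") == 0) {
        printf("%s %s\n", PACKAGE_NAME, PACKAGE_VERSION);
        return 0;
    }
    return 0;
}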

Are you looking for something like Mercurial's KeywordExtension?

Related

Tool to compare .h files

I was wondering if there is a tool to compare C header files against the main version of those same files locally. To be more specific, I have to compare the macros in the header files from the main version against the macros in the header files generated by a library, and find out which macros don't match or don't exist in the generated .h files. Both versions of these header files are in a local workspace. Also, the Ubuntu version I'm working with (Ubuntu 18.04) doesn't have a GUI, so I would have to display the results on the command line.
I can also try to work it out on Windows 10 if necessary. I appreciate any suggestions, thanks.
Yes, the tool you are looking for is called diff. It has been available on the Linux command line since the beginning of time:
diff {my.h} {orig.h}
and it supports many different output formats, such as "side-by-side" and "unified".
If you prefer a graphical utility, there are many available for Linux or Windows. Just google "diff gui" and you will find a few.
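If it is specifically the macros you want to compare, a small bash sketch (the file names are the placeholders from above) can pre-filter the #define lines before diffing:

# Compare only the #define lines of the two headers, sorted so that
# ordering differences don't show up as changes (bash process substitution)
diff -u <(grep '^#define' orig.h | sort) <(grep '^#define' my.h | sort)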

CMake add_subdirectory use different compiler [duplicate]

It seems like CMake is fairly entrenched in its view that there should be one, and only one, CMAKE_CXX_COMPILER for all C++ source files. I can't find a way to override this on a per-target basis. This makes a mix of host-and-cross compiling in a single CMakeLists.txt very difficult with the built-in CMake facilities.
So, my question is: what's the best way to use multiple compilers for the same language (i.e. C++)?
It's impossible to do this with CMake.
CMake only keeps one set of compiler properties, which is shared by all targets in a CMakeLists.txt file. If you want to use two compilers, you need to run CMake twice. This is true even for building, e.g., 32-bit and 64-bit binaries with the same compiler toolchain.
The quick-and-dirty way around this is using custom commands. But then you end up with what are basically glorified shell-scripts, which is probably not what you want.
The clean solution is: Don't put them in the same CMakeLists.txt! You can't link between different architectures anyway, so there is no need for them to be in the same file. You may reduce redundancies by refactoring common parts of the CMake scripts into separate files and include() them.
The main disadvantage here is that you lose the ability to build with a single command, but you can solve that by writing a wrapper in your favorite scripting language that takes care of calling the different CMake-makefiles.
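As a sketch of such a wrapper (the directory names and the toolchain file are hypothetical, and the -S/-B options need CMake 3.13 or later):

#!/bin/sh
# One CMake tree per compiler, configured and built back to back
cmake -S host   -B build/host   -DCMAKE_CXX_COMPILER=g++
cmake --build build/host
cmake -S target -B build/target -DCMAKE_TOOLCHAIN_FILE=$PWD/cross.cmake
cmake --build build/target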
You might want to look at ExternalProject:
http://www.kitware.com/media/html/BuildingExternalProjectsWithCMake2.8.html
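A minimal sketch of that idea (the target names, directories, and toolchain file are invented for illustration):

include(ExternalProject)

# Host tools get their own sub-configure with the native compiler
ExternalProject_Add(host_tools
  SOURCE_DIR      ${CMAKE_CURRENT_SOURCE_DIR}/host
  CMAKE_ARGS      -DCMAKE_CXX_COMPILER=g++
  INSTALL_COMMAND "")

# The cross-compiled part is configured with its own toolchain file
ExternalProject_Add(firmware
  SOURCE_DIR      ${CMAKE_CURRENT_SOURCE_DIR}/firmware
  CMAKE_ARGS      -DCMAKE_TOOLCHAIN_FILE=${CMAKE_CURRENT_SOURCE_DIR}/cross.cmake
  INSTALL_COMMAND "")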
It's not impossible, as the top answer suggests. I have the same problem as the OP: I have some sources that are cross-compiled for a Raspberry Pi Pico, and some unit tests that I run on my host system.
To make this work, I'm using the very shameful set to override the compiler in the CMakeLists.txt of my test folder. It works great.
if(DEFINED ENV{HOST_CXX_COMPILER})
  # Honor a host compiler chosen via environment variable
  set(CMAKE_CXX_COMPILER $ENV{HOST_CXX_COMPILER})
else()
  # Otherwise fall back to the system g++ for the tests
  set(CMAKE_CXX_COMPILER "g++")
endif()
# Clear cross-compilation flags inherited from the parent scope
set(CMAKE_CXX_FLAGS "")
The CMake devs/community seem very much against using set to change the compiler, for some reason. They assume that you need one compiler for the entire project, which is an incorrect assumption for embedded-systems projects.
My solution above works, and I think it fits the philosophy: users can still choose their compiler via the environment variable, and if it's not set then g++ is assumed. set only changes variables for the current scope, so this doesn't affect the rest of the project.
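An invocation would then look something like this (clang++ is just an example override):

# Run the test build with clang++ instead of the g++ fallback
HOST_CXX_COMPILER=clang++ cmake -S . -B build && cmake --build build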
To extend @Bill Hoffman's answer:
Build your project as a super-build, using some kind of template like the one at https://github.com/Sarcasm/cmake-superbuild, which will configure both the dependencies and your project as ExternalProjects (standalone CMake configure/build/install environments).

Are CMake/Autotools useful for non-standard compilers?

I am working on a complex project written in C and assembly for an embedded target running on an Analog Devices DSP. The toolchain is close to gcc, but there are plenty of differences. Moreover, I use a lot of autogeneration scripts based on Jinja2 to generate header files from data extracted from a database, and I have plenty of compiler flags.
I currently have a Makefile written from scratch. It is about 400 lines long and works pretty well. It automatically discovers the sources across the directories and tracks all the dependencies, i.e.:
a.tmpl ---> jinja ---> a.c ---> a.o
            ^
a.yaml -----'
I would like to know whether tools such as CMake or automake could be useful in my case. In other words, can I use these tools to simplify my Makefile and improve its readability?
CMake works perfectly well with generated sources. Just add an appropriate custom command:
add_custom_command(
  OUTPUT a.c
  COMMAND jinja <args>       # regenerate a.c from the template
  DEPENDS a.tmpl a.yaml)     # rerun when the template or the data changes
add_executable(a a.c)

Autotools and version control

I know, it must be a silly question.
Assume I have a library using an autotools build system.
I have all those configure, configure.ac, Makefile.am, config.h and many other files in my project root folder. Some of them were written by a developer; others are generated by the autotools.
The question is: if I use a version control system (in my case, hg), which of all these autotools files should be tracked by the VCS and which should be hgignore'd?
Thanks,
Serge
I think the best procedure is to only put files under version control that are not generated - people working with the VCS are developers and should have the autotools installed on their machines, checking in generated files will only cause trouble for them.
On the other hand you have to make sure that source-level distribution is done with all generated files in place, so that non-developers are able to build the software without the autotools installed.
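As a concrete sketch, a Mercurial .hgignore along these lines covers the usual generated autotools files (adjust it to what your project actually generates):

syntax: glob
# generated by autoconf/autoheader/automake, or by running configure
configure
config.h
config.h.in
config.log
config.status
aclocal.m4
autom4te.cache
Makefile
Makefile.in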
There are two schools of thought on this:
"I want to see the project exactly as it was at time/version X"
"I can always re-generate anything which was automatically generated later"
I generally fall into the latter group personally, but the former can be nice if there are/were problems with the build system in some specific version that you probably don't have installed any more.
In your example configure and config.h are both (probably) autogenerated, so if you're going to include them in version control I'd be inclined to include the Makefile.ins too.
In my projects this usually means having no more autotools related files than configure.ac, Makefile.am, the documentation if it's GNU and a directory called m4 which includes any custom/non-standard macros my configure.ac requires.

How to visualise a graph of C structs that contain / point to one another?

I am using Ubuntu 10.04, and studying programming of kernel objects.
I have come across some rather complicated structs which I have difficulties reading, so I thought I'd try to find some tool that can help me visualise them.
The only thing I could find so far is VCG, which has a C Struct Visualization Example (an image of it is on that page), and it looks like something I'd like to use.
The first thing to note is that the last VCG packaged for Ubuntu is vcg (1.30debian-6) in Hardy, but the .deb package can be downloaded and installed on Ubuntu Lucid without problems.
However, it seems this package is only a VCG viewer (similar to vcgviewer, I'd guess). The vcgviewer page notes:
To generate compiler graph data with newest gcc compilers use:
gcc -g -da -dv -fdump-tree-original-raw -fdump-tree-all-all
So, apparently I'd have to use those switches along with gcc while compiling, to generate .vcg graph files from the C source.
The problem, however, is that I'm building a kernel module, which only references the Linux headers, as I try to avoid recompiling the entire kernel as much as I can. And it seems that as soon as I try to use the -fdump-tree-... switches in that context (a kernel module), gcc wants to start compiling the rest of the kernel too! (And it obviously fails, both at compiling and at generating the .vcg graphs, since I only have the kernel headers, not the sources.)
So my question is: is there a tool that would produce .vcg or .dot graphs of structs, simply using a plain-text header file as input? (It would not have to resolve all dependencies; just those in header files in the same directory.)
EDIT: it is actually not that important for me that the backend is .vcg or .dot in particular, I mentioned them just because I've found them so far; any sort of software that would allow similar struct visualization, regardless of backend, is welcome :)
PS: Note that if you do not want to use VCG viewers for viewing .vcg graphs, you can convert the .vcg format to the .dot format and use graphviz for the visualisation instead. What worked for me is graph-easy (a Perl tool on search.cpan.org), which was first packaged in Ubuntu with the Maverick edition as libgraph-easy-perl (however, the .deb file can, again, be downloaded and installed without problems on Lucid). libgraph-easy-perl installs a graph-easy script, which then allows you to do things like:
graph-easy test.vcg --as_dot | dot -Tpng -o test.vcg.png
See also "[graphviz-interest] VCG files" and "Diego Novillo - Re: can't find VCG viewer" for another vcg-to-dot script (which, unfortunately, didn't work for me).
I have had good experiences using doxygen for that task. It is designed to create documentation out of annotated source files, but it can already give you a lot without the annotations, including various graphs.
Doxygen uses dot for the graph creation.
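A minimal Doxyfile sketch for this use case (INPUT is a placeholder path; the options themselves are standard doxygen settings):

# Doxyfile sketch: struct graphs from unannotated headers
INPUT               = ./include
EXTRACT_ALL         = YES   # document everything, even without annotations
HAVE_DOT            = YES   # use Graphviz dot to render the graphs
COLLABORATION_GRAPH = YES   # draw member/pointer relations between structs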
I've managed to successfully build a kernel module with vcg generation by doing the following:
Creating a linked copy of the kernel source or header directory with cp -al /usr/src/linux-srcdir /tmp/tmp-srcdir, since gcc wants to write to the current working directory.
Adding EXTRA_CFLAGS="-g -da -dv -fdump-tree-original-raw -fdump-tree-all-all" to the make command line, e.g. make -C /tmp/tmp-srcdir M=`pwd` EXTRA_CFLAGS="-g -da -dv -fdump-tree-original-raw -fdump-tree-all-all". The vcg files are generated in /tmp/tmp-srcdir.
