Adding files to /usr/local/include - c

I recently installed openjtalk on a Linux machine, and I want to be able to wrap it in Go. The openjtalk source tree has several subfolders with different sources, which I assume the compiler finds because of the makefiles.
Should I copy each of those subfolders into /usr/local/include? Is that a "correct" way of fixing include dependencies? From what I tested, Go seems to find the included files if I copy them, but I'm not sure this is the correct, Linux way of doing things.

It's usually not a good idea to change the location of external libraries. Some libraries install themselves into the compiler's include path, but for those that don't, adding their directories to the compiler's include search path is the better approach.
For example, with gcc you can pass -I/your/header/directory to add that directory to the include search path. Usually people put this information in the Makefile. That way you can keep an external library's source code in your repository and just tell the compiler to also look for the headers there; when setting up a new working environment, all you have to do is pull from the repository.
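As a minimal sketch of that idea (the path third_party/open_jtalk/include and the file wrapper.c are placeholders for wherever you keep the vendored sources):
# Placeholder path: point the compiler at wherever the openjtalk headers live in your tree
CPPFLAGS += -Ithird_party/open_jtalk/include

wrapper.o: wrapper.c
	$(CC) $(CPPFLAGS) $(CFLAGS) -c -o $@ wrapper.c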

Related

Makefile Project

I would like to know if there is a method to add source files automatically from a specific directory without writing them in the Makefile.am. So I need an option for this, if one exists. Thanks for the help.
Some make implementations, such as GNU's, have features that address this objective, but there is no portable mechanism for it, and the Autotools are all about portability. It would therefore be poor Autotools style to rely on such a mechanism.
Even if you were satisfied to ignore portability implications, there would still be a limited scope for this, because Makefile.am associates source files with the target(s) to which they contribute. You would need there to be a direct link between the source directory in question and the target, or possibly between more specific filename patterns and targets. Sometimes such associations exist, but other times they don't.
Portability questions aside, I am not much of a fan of this sort of thing in general, as it lays a trap. If you engage such a mechanism then it is hard to add a source file without breaking the build, for example. The one-time cost of adding all the source names when autotooling an existing project is not that bad -- I've done it for several large projects -- and the cost to maintain your Makefile.am when you add sources is trivial.
But if you nevertheless want something along these lines then I would suggest engaging a source generator. It might look like this:
Makefile.am
bin_PROGRAMS = myprog
include myprog_sources.am
generate_myprog_sources.sh
#!/bin/bash
myprog_sources=(src/myprog/*.c)
echo "myprog_SOURCES = ${myprog_sources[*]}" > myprog_sources.am
That uses a separate shell script to generate Automake code for a _SOURCES definition naming all the .c files in the specified directory (that is, src/myprog). That's written to its own file, myprog_sources.am, for simplicity and safety, and the Makefile.am file uses an Automake include directive to incorporate it. When you want to adjust the build for a different complement of source files, you run the script. You can also make temporary changes to the source list by manually updating it without running the script.
This is not quite the level of hands-free automation that you asked for, but it's still pretty automatic, and it's better suited to the tools you are using.
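A typical round-trip after adding or removing source files might then look like this (a sketch, assuming the script sits at the top of the source tree next to Makefile.am):
bash generate_myprog_sources.sh   # regenerate myprog_sources.am from src/myprog/*.c
autoreconf --install              # let the Autotools pick up the new list
./configure && make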

Finding C libraries not included by default

Coming from programming environments that have package managers, I experience a lot of discomfort installing and using libraries that are not included by default.
For example, #include <threads.h> triggers the error 'threads.h' file not found. I found that the compiler looks for header files in /Library/Developer/CommandLineTools/usr/include/c++/v1 by issuing gcc -print-prog-name=cpp -v. I am not sure if this is a complete folder list. How do I find the ones that it doesn't find by default? I am on macOS, but a Windows solution is also welcome.
The question doesn't really say whether you are building your own project, or someone else's, and whether you use an IDE or some build system. I'll try to give a generic answer suitable for most scenarios.
But first, it's header files, not libraries (which are a different kind of pain, by the way). You need to explicitly make them available to the compiler, unless they reside on a standard search path. Alas, it's a lot of manual work sometimes, especially when you need to build a third-party project with a ton of dependencies.
I am not sure if this is a complete folder list.
Figuring out the standard include paths of your compiler can be tricky. Here's one question that has some hints: What are the GCC default include directories?
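For reference, you can also ask the compiler itself for its search list; the interesting output goes to stderr, and this works both for gcc and for Apple's clang (which is what gcc usually resolves to on macOS):
# prints the '#include <...> search starts here:' list to stderr
echo | gcc -xc -E -v -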
How do I find the ones that it doesn't find by default?
They may or may not be present on your machine. If they are, you'll have to find out where they are located. Otherwise you have to figure out what library they belong to, then download and unpack (and probably build) it. Either way, you will have to specify the path to that library's header files in your IDE (or Makefile, or whatever you use). Oh, and you need to make sure that the library version matches the version required by the project. Fun!
On macOS you can use third-party package managers (e.g. brew) to handle library installation for you.
pkg-config is not available on macOS, unless you install it from a third-party source.
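As a sketch of that route, combining Homebrew with pkg-config (libpng is only an example package; substitute whatever library you actually need):
brew install pkg-config libpng
# pkg-config then reports the right -I/-L/-l flags for the installed library
gcc main.c $(pkg-config --cflags --libs libpng) -o main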
If you are building your own project, a somewhat better solution is to use CMake and its find_package command. However, only libraries supported by CMake can be discovered this way. Fortunately, their collection of supported libraries is quite extensive, and you can make your own find_package scripts. Moreover, CMake is cross-platform, and it can handle versioning for you.

Why aren't changes to header files accounted for in the Makefiles of mature C projects?

I have been reading up on make and looking at the Makefiles for popular C projects on GitHub to cement my understanding.
One thing I am struggling to understand is why none of the examples I've looked at (e.g. lz4, linux and FFmpeg) seem to account for header file dependencies.
For my own project, I have header files that contain:
Numeric and string constants
Macros
Short, inline functions
It would seem essential, therefore, to take any changes to these into account when determining whether to recompile.
I have discovered that gcc can automatically generate Makefile fragments from dependencies, as in this SO answer, but I haven't seen this used in any of the projects I've looked at.
Can you help me understand why these projects apparently ignore header file dependencies?
I'll attempt to answer.
The source distributions of some projects include a configure script which creates a makefile from a template.
So the end user who needs to recompile the package for his/her target just has to do:
$ configure --try-options-until-it-works
$ make
Things can go wrong during the configure phase, but this has nothing to do with the makefile itself. The user has to download stuff, adjust paths or configure switches, and run again until the makefile is successfully generated.
But once the makefile is generated, things should go pretty smoothly from there for the user, who only needs to build the product once to be able to use it.
A small portion of users will need to change some source code. In that case, they'll have to clean everything, because the provided makefile isn't how the actual developers manage their builds. They may use other systems (Code::Blocks, Ant, gprbuild...) and provide the makefile just to automate building from scratch and avoid depending on a more complex build system. make is fairly standard, even on Windows/MinGW.
Note that there are some filesystems which provide build auditing (ClearCase), where dependencies are managed automatically (clearmake).
If you see the makefile as a batch script to build all the sources, you don't need to bother adding a dependency system using:
a template makefile
a gcc -MM command to append dependencies to it (which takes time)
Note that you can build it yourself with some extra work (adding a depend target to your makefile, or letting the compiler emit the dependencies, as sketched below).
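One common pattern is to have the compiler write a dependency fragment next to each object and pull those fragments back into the makefile. A rough sketch, assuming GNU make and gcc/clang, with placeholder file names:
CC     = gcc
CFLAGS = -Wall -MMD -MP   # -MMD writes a .d fragment per object, -MP adds dummy targets for headers
OBJS   = main.o util.o    # placeholder object list

myprog: $(OBJS)
	$(CC) -o $@ $(OBJS)

# pick up the generated header dependencies, if any exist yet
-include $(OBJS:.o=.d)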

User defined .c and .h file management

I am building a small library of my own *.c & *.h files and am not sure how I should manage them, especially when including them into a project. I'm using Codeblocks on Ubuntu in case that matters. For each .c/.h file pair, I have a Codeblocks project that is a playground where I can modify & test out any changes or newfound bugs.
I'm thinking I should compile the .c into libraries (.a/.so), put them into respective custom 'XXX/bin' and 'XXX/include' folders, and include/link from those locations (add to the PATH).
The other option (which I've been doing for right or wrong) is to add the .c file directly to my project and #include the full path of the .h file (I know this is wrong, but it works).
How do you all manage your .c and .h files?
Actually, both ways (preparing object files and using the headers with them when compiling the program, and adding the source code and headers to each project with a full or relative path) are quite normal. You should choose whichever is convenient for you. I do not know how Code::Blocks works, but I suppose that, like most IDEs, it can track dependencies, optimize build time, and rebuild libraries (components of a complex project) when some files are updated.
My suggestion is to consider a project build tool (project generator) like CMake. You will be able to configure the build process for any project and use different compilers, as well as different compile options for different projects, while the source files (*.c and *.h) stay where they are.
Start with the CMake tutorial and other documentation.
Of course, at first it will not be easy to deal with the syntax, but once you get used to it you will realize how convenient it is.
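For comparison, the library route from the question can also be done by hand; roughly like this (a sketch only, with placeholder names and paths):
# build the object and archive it into a static library
gcc -c mylib.c -o mylib.o
ar rcs libmylib.a mylib.o

# copy the pieces into your custom folders (placeholders)
mkdir -p $HOME/mylibs/include $HOME/mylibs/lib
cp mylib.h $HOME/mylibs/include/
cp libmylib.a $HOME/mylibs/lib/

# point any project at those folders when compiling and linking
gcc main.c -I$HOME/mylibs/include -L$HOME/mylibs/lib -lmylib -o main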
I think it is possible to include headers by their exact path, on a Linux platform at least.
Assuming you put your *.h files in /usr/include, you can just use
#include "/usr/include/your_header.h"
Without moving your source files, you can add the same line to every new source you write, but VolAnd's suggestions above are probably more standard ways of managing this.

C convention - Adding H files to project file

A friend at work told me today that
It is a known convention not to add header files to the project file in a C project.
I was shocked and couldn't find any logical reason for this (I felt that such a convention would just make it harder to reach the files I need).
He explained it by saying that the .h file doesn't really contain compilable C code, so it is not "part of the project", just metadata.
FYI, we are currently working on an embedded project.
For example, the project file could be an eww file with IAR Workbench, a vcxproj in Visual Studio, or a cproject file with Eclipse.
Has anyone ever encountered this kind of convention, and can you say how popular it is and what the practical advantage/logic of it is?
I don't believe this is a convention. Header files describe the interfaces between parts of your program, which I would argue is more important than the specific bits of code for many projects. If you move into C++, you may also find significant portions of the project's code implemented in headers to support templates in older C++ versions.
Your IDE is meant to keep the code you're using front and center, so you can access the source that you need and edit any code while minimizing context switching.
My advice: Add the headers to your project, but categorize them in a separate folder, filter group, or other mechanism to make them easy to access. Make sure they're visible to the compiler, set their build targets to not compile (since they're just being included) and you should be set.
There are no disadvantages to adding header files to the project.
Some advantages I find:
If I create 'Source Files', 'Header Files', etc. folders and add the respective files to them, it looks neat when you open the project in your IDE, as I can directly see which header files are used in my project (most of the time created by you).
In some IDEs (e.g. MSVC), I can search for a header file directly using the search window if that header file has been added to the project. Otherwise I need to open one of the C/C++ files which includes this header and open it from the line where the #include appears.
So it is up to you whether you want everything organized; depending on your IDE etc., you can add or exclude header files from the project.
Hope this helps.
The only reason I can think of to not include .h files in your project file is if they aren't a part of your project, for example stdio.h. I have seen people do this before and it can cause problems. The main issue is that it can make your project non-portable. It can also lead to people accidentally modifying files that they shouldn't.
Is it possible that's what your friend was talking about?
Based on your comments, it looks like one counterexample is enough to show that .h files are sometimes included in project files. Here's an example of a Qt qmake .pro file for a project which lists header files:
TEMPLATE = app
CONFIG += console
CONFIG -= app_bundle
CONFIG -= qt
SOURCES += main.c \
    module.c
HEADERS += \
    module.h
To get that, I used Qt Creator first to create the project as a "plain C project", then generated the module.h and module.c stubs with Qt Creator, which added both to the .pro file. Now, having the .h files in the .pro is optional: the project would compile without them, but it would be harder to navigate, and I'd have to manually remove each .h from the .pro after generating them.
At the complete opposite end, there are build systems (with their project files), also for C projects, where you don't actually need to list any files. You just list directories, or even follow a standard directory layout, and the build system will scan the directories and compile the project according to its rules. I think this is possible with CMake at least. And of course, for many project file types (like plain Makefiles) you can use wildcards to find all the .c files in a directory, as in the sketch below.
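A minimal GNU make example of that wildcard approach (src/ and myapp are placeholder names):
SRCS := $(wildcard src/*.c)   # every .c file under src/, no manual listing
OBJS := $(SRCS:.c=.o)

myapp: $(OBJS)
	$(CC) -o $@ $(OBJS)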
As to reasons why .h files would not be listed:
The compiler will find the .h files based on #include directives; they are not given on the command line (only include directories are).
A modern IDE will scan the sources anyway and find all the used include files without them being listed.
If .h file lists are maintained manually, then, since per the two points above nothing will actually fail if one is forgotten, the list in the project file may get out of date when someone forgets to add or remove an entry as things change.
Listing build dependencies for each .c file is actually a bit different from just listing the .h files in project files, and is best handled automatically.
Using version control should remove any ambiguity between files which are really part of the project even if they are not used (because they are under version control), and files which are just clutter that should be removed or ignored.
So, if having .h files listed in a project file is any extra work, and if it does not offer any concrete advantage (for example with some IDE), then a convention of just not having them seems sensible.
