Proper project organization with common source files and sub-projects - c

I have a straight C project that builds several different inter-related modules into a single image using a single top-level makefile and recursive calls to the modules. This all works fine, although I know it's not using the best structure. I now need to restructure it because of some changes to the project, and I'd like to do it "right". In addition, I've found that I'm using some common code in the modules that right now is just copied into each one, so I'd like to fix that too.
The further complication is that I'm using Subversion and the common code that's being used is stored in a separate repo from the project, so I can't just import each file that's used.
Here is the structure I think I'd like to use, but I'm not completely sure how to write the makefiles to actually work with it (but I can handle that in another question if needed).
build
+ common
| + lib1
| + lib2
+ module1
| + obj
+ module2
| + obj
+ module3
| + obj
+ output
common would be an svn:external pointing to a folder in the other repo that holds the common source files. The makefile in each module would build intermediate object files locally (this is required because each module compiles differently, so the common files do not produce common binaries) and then put its final binary in the shared output directory for the top-level makefile to combine into the single final image.
Is this a reasonable structure to use for make to deal with?
Module 1 uses a third-party library that may be switched out later. Should it be a sub-directory of module1, or should it be in common (hard to do with subversion, unless I mix it into the folder on the other repo), or should it be added as another directory under build?
Module 2 compiles to a static library for Module 3 to use. Should Module 3's makefile explicitly know about Module 2 or should the header file be in the common directory (the library will be in the output directory already)?
There are other common definitions that this project needs to set up for all the modules, which would ordinarily be in a header file in the common directory, but since that directory is coming from a subversion external, what are my other options for doing this?

First, some of these design questions are matters of taste and habit, and arguments about them can verge on religious war.
Second, your common directory is poorly named, since it serves specifically as a store of the code from the other repository (a good thing to have), and not as a place for all sources shared by multiple modules (a different good thing to have). So I suggest you add another directory such as build/headers/, for headers that are common to more than one module, but are stored in your own repository.
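For reference, a sketch of wiring build/common up as an svn:external (the repository URL here is a placeholder):
$ svn propset svn:externals 'common http://svn.example.com/common-repo/trunk/common' build
$ svn update build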
Is the structure reasonable for make? Yes, once you add headers/.
Where should Module 1's third-party library live? A subdir of module1/ is a good place for it; if the other modules don't use it, there's no reason for them to be able to see it.
Should Module 3's makefile know about Module 2? headers/ is literally made for this: put Module 2's header there. Module 3 has no business knowing more about Module 2's implementation than that.
And the shared project-wide definitions? Again, headers/ is the place for them.
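To make the layout concrete, here is a minimal sketch of a top-level makefile for it; the module list, artifact names, and the final combine step are assumptions standing in for your real build:
MODULES := module1 module2 module3

all: output/final.img

# Each module makefile compiles its own copies of the common sources
# into its local obj/ and drops its final artifact in ../output.
.PHONY: all $(MODULES)
$(MODULES):
	$(MAKE) -C $@

# module3 links against the static library module2 places in output/
module3: module2

# The combine step is project-specific; cat is only a stand-in here.
output/final.img: $(MODULES)
	cat output/module1.bin output/module3.bin > $@
In each module makefile, the include paths would then be along the lines of CFLAGS += -I../headers -I../common/lib1, so the shared headers and common sources are visible without being copied.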

Related

How to open file with path relative to c file, not working directory

I'm modifying a large codebase written in C. This code is designed to be compiled and run from an arbitrary working directory, so that configuration files can be read from, and output written to, the working directory; this makes it easier to organize the setup and outputs of the code.
The additions I've made to this code need to read data from a few data files. I would like to place these in the same directory as the .c file where they are read, with a directory structure as follows:
big_project/
|-- models/
| |-- my_file.c
| |-- my_data.txt
My problem comes in when trying to open this data file using relative paths. Typically relative paths are relative to the working directory, which would not work in my case since the working directory can be arbitrary. From inside my_file.c, how can I open my_data.txt for reading using relative paths?
Based on our conversation in the comments, you have several alternatives, which I'll list below from, IMHO, most to least desirable.
You never specified whether you're on Windows or GNU+Linux, or doing cross-platform development, but I'm sure you can adapt these suggestions to your platform.
Multiple and Custom Config Files (Recommended)
You could modify your program to look at your platform's standard location for program data and/or configuration files. For example, you could have it look for a standard config at /etc/<your-program>/default.conf in GNU+Linux or %APPDATA%\<your-program>\default.conf in Windows.
If different users need to use their own personal configs, the program could also be made to accept a config file path as an argument. For example:
GNU+Linux:
$ ./your-program --config ${HOME}/.your-program/my.conf
Windows:
> your-program.exe --config %userprofile%\your-program\my.conf
Note that the use of %userprofile% may change based on Windows versions and/or shells used (e.g. standard cmd.exe vs powershell).
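A minimal sketch of that argument handling in C (the --config flag matches the examples above; the default path and the parsing are purely illustrative):
#include <stdio.h>
#include <string.h>

int main(int argc, char **argv)
{
    /* Illustrative GNU+Linux default; use %APPDATA% on Windows. */
    const char *config_path = "/etc/your-program/default.conf";

    /* A command-line argument overrides the standard location. */
    for (int i = 1; i + 1 < argc; i++) {
        if (strcmp(argv[i], "--config") == 0)
            config_path = argv[i + 1];
    }

    FILE *cfg = fopen(config_path, "r");
    if (cfg == NULL) {
        perror(config_path);
        return 1;
    }
    /* ... read the configuration here ... */
    fclose(cfg);
    return 0;
}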
Compiling in the Path (Not Recommended)
Based on your comments, a short-term workaround could be to compile the absolute path of the source file into the binary via the __FILE__ macro, so the program can recover that path at runtime. As I said in my comment:
if you're completely sure about the program always being placed in the same directory for everyone, then you can make the __FILE__ macro expand to an absolute path by passing the full path when compiling; e.g. after gcc $(pwd)/your-file.c, printing __FILE__ will show the full path the file had at compile time, not run time. (Can't add enough disclaimers here, though)
Please note that there are many reasons not to use this approach. I'm simply suggesting it as a short-term workaround to pull out of an existing crisis-level situation, while you (hopefully) take a closer look at the more desirable approaches to handling configuration and path-finding.
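For completeness, a sketch of what that workaround looks like in code; it assumes POSIX dirname() and, again, only works if the file was compiled with an absolute path such as gcc $(pwd)/my_file.c:
#include <libgen.h>   /* dirname() (POSIX) */
#include <limits.h>
#include <stdio.h>
#include <string.h>

/* Open my_data.txt from the directory containing this source file.
 * Only valid if __FILE__ expanded to an absolute path at compile time. */
static FILE *open_sibling_data(void)
{
    char src[PATH_MAX];
    char path[PATH_MAX];

    strncpy(src, __FILE__, sizeof src - 1);
    src[sizeof src - 1] = '\0';   /* dirname() may modify its argument */
    snprintf(path, sizeof path, "%s/my_data.txt", dirname(src));
    return fopen(path, "r");
}

int main(void)
{
    FILE *f = open_sibling_data();
    if (f == NULL) {
        perror("my_data.txt");
        return 1;
    }
    fclose(f);
    return 0;
}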

Golang not able to see templates in external package

I am attempting to write a reusable package in Go. I'm using a structure similar to that described here but slightly different:
/src/bitbucket.org/EXTERNAL_PROJECT_NAME/EXTERNAL_PACKAGE_NAME/...
/src/INTERNAL_PROJECT_NAME/INTERNAL_PACKAGE_NAME/...
Or should the second line be:
/src/bitbucket.org/INTERNAL_PROJECT_NAME/INTERNAL_PACKAGE_NAME/...
Everything works until I need to access a non-Go file that exists in the external package. For example, I have some built-in templates that I would like to be available without having to include them in my internal project's templates directory.
To that end, I have a "templates" directory in the external project where I want to house some built-in templates, and a "templates" directory in my internal project where custom templates will go. But when I attempt to parse templates from the external project's template directory, it can't find them.
So how would I go about indicating that I want to get the templates from the external package directory instead of the internal one? I could adjust the path to something like the following:
../../bitbucket.org/EXTERNAL_PROJECT_NAME/EXTERNAL_PACKAGE_NAME/templates/file.html
but this is obviously very clumsy and depends on individual setup, so that's not going to work. In general, if I want to reference a file in an external package instead of my internal project directory, how would I do this gracefully?
Thanks!
Turns out there is a pretty simple solution. Looks something like the following:
package main

import (
	"fmt"
	"go/build"
)

func main() {
	SrcRoot := "/src"
	PackageDir := "/bitbucket.org/EXTERNAL_PROJECT/EXTERNAL_PACKAGE"
	// Base path from which to reference templates inside the external package.
	InternalTemplateDir := build.Default.GOPATH + SrcRoot + PackageDir + "/templates/"
	fmt.Println(InternalTemplateDir)
}
build.Default.GOPATH gives us the path to the directory containing all our Go code. From there, I want to reference the templates directory in the package source. With InternalTemplateDir, I now have the base path from which to reference templates within the external package.
For ease of use, I will probably build a template loader that checks for a file on an internal file path first and then checks for the same file in the external package, so that any given template can be overridden by including it internally, but essential templates will all have built in versions as well.
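A sketch of that fallback lookup (findTemplate is a hypothetical helper; the directory arguments follow the layout above):
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// findTemplate prefers the internal project's templates/ directory and
// falls back to the external package's built-in copy.
func findTemplate(internalDir, externalDir, name string) (string, error) {
	for _, dir := range []string{internalDir, externalDir} {
		p := filepath.Join(dir, name)
		if _, err := os.Stat(p); err == nil {
			return p, nil
		}
	}
	return "", fmt.Errorf("template %q not found in %q or %q", name, internalDir, externalDir)
}

func main() {
	// The second argument would be the InternalTemplateDir computed above.
	p, err := findTemplate("templates", "/external/package/templates", "file.html")
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println("using", p)
}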
If it's not a Go source file (i.e. bitbucket.org/EXTERNAL_PROJECT_NAME/EXTERNAL_PACKAGE_NAME/file.go), the go tool won't fetch or track it; your best bet is something like https://github.com/jteeuwen/go-bindata.
But I really think you should rethink your problem and use a different approach to it.

Usual place for custom library

I'm writing a simple C program using Eclipse. I have several system includes, but now I have downloaded the source code of a library that consists of one *.c and one *.h file. What is the proper manner of placing these files? Should I place them in the workspace, create an Include directory next to src, or put them in /usr/local/include?
If you want to make the headers available to other users on the machine, /usr/local/include would be the place to put them. Otherwise keep them somewhere in your $HOME.
Another alternative would be to put the sources in /usr/local/lib/<tool>/src/include and symlink /usr/local/lib/<tool>/src/include to /usr/local/include/<tool>, with <tool> being the name of the tool.
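For example, with <tool> replaced by a concrete name (mytool is a placeholder):
$ sudo mkdir -p /usr/local/lib/mytool/src/include
$ sudo cp library.h /usr/local/lib/mytool/src/include/
$ sudo ln -s /usr/local/lib/mytool/src/include /usr/local/include/mytool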

shared libraries creating a soft link

Redhat 5.5
gcc version 4.1.2
I have a directory called lib, and in that directory I have all the shared libraries (about 30) that we get from our customer, whose API we use. We link against this API.
directory structure:
/usr/CSAPI/lib
However, our customer will update their API so we get new libraries, normally about 3 or 4.
What I have been doing when I get new libraries is to move the old ones into another directory and replace them with the new libraries in the lib directory.
/usr/CSAPI/Old_libs
The new and old libraries have the same name, i.e.
libcs.so < old
libcs.so < new
Is there a better way to manage this? I was thinking of creating a soft link, but as the names are the same, I am not sure this will work.
Many thanks,
Usually libraries are versioned, not just "the same name".
You'll have a file in your /usr/lib directory for each version:
/usr/lib/libFLAC.so.8.2.0
/usr/lib/libFLAC.so.8.2.1
/usr/lib/libFLAC.so.8.2.2
Then you symlink the major library versions to the latest minor version:
/usr/lib/libFLAC.so.8 -> /usr/lib/libFLAC.so.8.2.2
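Such a symlink is created or updated with, for example:
$ ln -sf /usr/lib/libFLAC.so.8.2.2 /usr/lib/libFLAC.so.8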
The benefit of this is that API changes will add new files and update the symlinks, but if I need to specify a specific API version number, the file is still right there.
This isn't set in stone, so do whatever works for your release process :)
Symlinks are a very good way to handle this. I would do something slightly differently. I would create a directory structure like:
/usr/CSAPI/lib_v1
/usr/CSAPI/lib_v2
and in each of these I would put the actual files. I would then create a separate directory:
/usr/CSAPI/lib
which only contains symlinks to the actual files in lib_v1, lib_v2, etc.
This way lib has the most current version, but if you need, you can use a previous version by simply changing your LD_LIBRARY_PATH.
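A sketch of that workflow in shell (version numbers and paths are illustrative):
$ mkdir -p /usr/CSAPI/lib_v2
$ cp new-drop/*.so /usr/CSAPI/lib_v2/
$ mkdir -p /usr/CSAPI/lib
$ ln -sf /usr/CSAPI/lib_v2/*.so /usr/CSAPI/lib/
$ export LD_LIBRARY_PATH=/usr/CSAPI/lib:$LD_LIBRARY_PATH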

Setting up Netbeans/Eclipse for Linux Kernel Development

I'm doing some Linux kernel development, and I'm trying to use Netbeans. Despite its declared support for Make-based C projects, I cannot create a fully functional Netbeans project. This is despite having Netbeans analyze a kernel binary that was compiled with full debugging information. Problems include:
files are wrongly excluded: Some files are incorrectly greyed out in the project, meaning Netbeans does not believe they are compiled into the kernel when in fact they are. The main problem is that Netbeans then misses any definitions that exist in those files, such as data structures and functions, as well as macro definitions.
cannot find definitions: Pretty self-explanatory - often times, Netbeans cannot find the definition of something. This is partly a result of the above problem.
can't find header files: self-explanatory
I'm wondering if anyone has had success with setting up Netbeans for Linux kernel development, and if so, what settings they used. Ultimately, I'm looking for Netbeans to be able to either parse the Makefile (preferred) or extract the debug information from the binary (less desirable, since this can significantly slow down compilation), and automatically determine which files are actually compiled and which macros are actually defined. Then, based on this, I would like to be able to find the definitions of any data structure, variable, function, etc. and have complete auto-completion.
Let me preface this question with some points:
I'm not interested in solutions involving Vim/Emacs. I know some people like them, but I'm not one of them.
As the title suggests, I would also be happy to know how to set up Eclipse to do what I need
While I would prefer perfect coverage, something that only misses one in a million definitions is obviously fine
SO's useful "Related Questions" feature has informed me that the following question is related: https://stackoverflow.com/questions/149321/what-ide-would-be-good-for-linux-kernel-driver-development. Upon reading it, the question is more of a comparison between IDE's, whereas I'm looking for how to set-up a particular IDE. Even so, the user Wade Mealing seems to have some expertise in working with Eclipse on this kind of development, so I would certainly appreciate his (and of course all of your) answers.
Cheers
Eclipse seems to be pretty popular for Linux kernel development:
http://cdtdoug.blogspot.com/2008/12/linux-kernel-debugging-with-cdt.html
http://jakob.engbloms.se/archives/338
http://revver.com/video/606464/debugging-the-linux-kernel-using-eclipsecdt-and-qemu/
I previously wrote up an answer; now I have worked out all the details of the solution and would like to share them. Unfortunately Stack Overflow does not allow me to edit the previous answer, so I'm writing it up in this new one.
It involves a few steps.
[1] The first step is to modify the Linux build scripts to leave the dep files in place. By default, those dep files are removed after being used in the build. They contain exact dependency information about which other files a C file depends on, and we need them to create a list of all the files involved in a build. So modify the files under linux-x.y.z/scripts so that they do not remove the dep files, e.g. by prefixing the rm command with echo, like this:
linux-3.1.2/scripts
Kbuild.include: echo do_not_rm1 rm -f $(depfile);
Makefile.build: echo do_not_rm2 rm -f $(depfile);
The other steps are detailed in my github code project file https://github.com/minghuascode/Nbk/blob/master/note-nbkparse. Roughly you do:
[2] Configure with your usual method of configuration, but be sure to use the "O=" option to build the object files into a separate directory.
[3] Then build Linux with the same "O=" option plus "V=1", and save the make output to a file (see the example after these steps).
[4] Run my nbkparse script from the above github project. It does:
[4.1] Read in the make log file, and the dep files. Generate a mirroring command.
[4.2] Run the mirroring command to hard-link the relevant source files into a separate tree, and generate a make-log file for NetBeans to use.
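For example, steps [2] and [3] might look like this (the output directory and log file name are illustrative):
$ make O=../kbuild defconfig
$ make O=../kbuild V=1 2>&1 | tee ../kbuild/build.log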
Now create a NetBeans C project using the mirrored source tree and the generated log file. NetBeans should be able to resolve all the kernel symbols. And you will only see the files involved in the build.
The Eclipse wiki has a page about this: HowTo use the CDT to navigate Linux kernel source
I have been doing some embedded Linux development, including kernel module development, and have imported the entire Linux kernel source code into Eclipse as a separate project. I have been building the kernel itself outside of Eclipse (so far), but I don't see any reason why I shouldn't be able to set up the build environment within Eclipse to build the kernel. For my projects, as long as I set up the PATH properties to point to the appropriate Linux source include directories, it seems to be pretty good about name completion for struct fields, etc.
I can't really comment on whether it is picking up the correct defines and not greying out the corresponding sections, as I haven't really paid too much attention to the files within the kernel itself (so far).
I was also wondering about using Netbeans as a Linux C IDE, as I do prefer Netbeans for Java GUI development.
I think this would work (I've done each step for various projects):
[1] Modify kernel build scripts to leave .d files. By default they are removed.
[2] Log the build process to a file.
[3] Write a script to parse the build log.
[3.1] From the build log, you know every .c file.
[3.2] From the .c file, you know which is the corresponding .d file.
[3.3] Look into .d files to find out all the included .h files.
[3.4] Form a complete .c and .h file list.
[4] Now create a new dir, and use "ln -s" or "ln" to pick files of interest.
Now, create a Netbeans project for existing source code in the [4].
Configure code assistance to use make-log file. You should see
exactly the effective source code as when you build it at [2].
Some explanations to the above steps:
At [2], do a real build so the log file contains the exact files and flags of interest; Netbeans will later use those exact flags when parsing.
At [4], pick only the files you want to see; incorporating the whole kernel tree into Netbeans would be impractical (a sketch of this step follows below).
There is a trick to parsing .d files: Many of the depended items are not real paths to a .h file, they are a modified entry for part of the linux config sections in the auto config file. You may need to reverse the modification to figure out which is the real header file.
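As a rough sketch of steps [3] and [4], assuming the build log contains compiler command lines with paths to the .c files (the mirror directory and log name are illustrative):
# Collect every .c file mentioned in the build log and hard-link it
# into a mirror tree for the Netbeans project to index.
# (Hard links require the mirror to be on the same filesystem; use ln -s otherwise.)
mkdir -p "$HOME/kernel-mirror"
grep -oE '[^ ]+\.c' build.log | sort -u | while read -r f; do
    mkdir -p "$HOME/kernel-mirror/$(dirname "$f")"
    ln "$f" "$HOME/kernel-mirror/$f"
done
# Headers listed in the .d files would be linked into the mirror the same way.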
Actually, there is a topic about this on the NetBeans site: http://forums.netbeans.org/ntopic3075.html. There is also a wiki page linked from that discussion: wiki.netbeans.org/CNDLinuxKernel. Basically, it asks you to prefix make with CFLAGS="-g3 -gdwarf-2".
I found this link very helpful in setting up proper indexing in Eclipse. It requires running a script to adjust the Eclipse environment to match your kernel options; in my case:
$ autoconf-to-eclipse.py ./include/generated/autoconf.h .
An illustrated guide to indexing the linux kernel in eclipse
