I have a large number of source files (~10,000) and they are scattered across several folders.
I want to know if there is a way to skip certain folders that I know haven't changed.
For example, consider the following folder structure:
A (SConstruct is here)
|
-> B (unchanged, 1000 files)
-> C (unchanged, 1000 files)
-> D (changed, 1 file)
Once I do a complete build the first time, I want it to compile everything (B, C, D). But when I then modify a file in D (and I know that), I would like to build only folder D, skip B and C, and finally link them all together to form the final binary (B, C and the new D).
I have been looking for quite some time now but have not been able to figure it out. Is it even possible? Can I tell SCons to look for changes only in a particular folder?
First, I'd investigate using Decider('timestamp-match') or even building a custom Decider function. That should speed up your dependency-checking time.
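For example, a minimal sketch (the custom function below is purely illustrative; SCons also accepts 'MD5-timestamp' as a built-in compromise):
# In your SConstruct: use timestamp-only up-to-date checks instead of
# the default content-signature checks.
env = Environment()
env.Decider('timestamp-match')

# Or supply your own function (sketch; prev_ni may lack a timestamp on the
# very first build, hence the getattr guard):
def decide_if_changed(dependency, target, prev_ni, repo_node=None):
    return dependency.get_timestamp() != getattr(prev_ni, 'timestamp', None)

env.Decider(decide_if_changed)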
But to answer your specific question, yes it is possible to not build the targets in B and C. If you don't invoke a builder for the targets in those subdirectories, you just won't build them. Just have an if that selectively chooses which env.Object() (or similar) functions to invoke.
When I fleshed out your example, I chose to have each subdirectory create a library that would be linked into the main executable, and to only invoke env.SConscript() for the directories that the user chooses. Here is one way to implement that:
A/SConstruct:
subdirs = ['B','C','D']
AddOption('--exclude', default=[], action='append', choices=subdirs)
env = Environment(EXCLUDES = GetOption('exclude'))
env.SConscript(
    dirs=[subdir for subdir in subdirs
          if subdir not in env['EXCLUDES']],
    exports='env')
env2 = env.Clone()
env2.PrependUnique(LIBPATH=subdirs,
                   LIBS=subdirs)
env2.Program('main.c')
B/SConscript:
Import('env')
env.Library('B', env.Glob('*.c'))
C/SConscript:
Import('env')
env.Library('C', env.Glob('*.c'))
D/SConscript:
Import('env')
env.Library('D', env.Glob('*.c'))
To do a global build: scons
To do a build after modifying a single file in D: scons --exclude=B --exclude=C
EDIT
Similarly, you can add a whitelist option to your SConstruct. The idea is the same: only invoke builders for certain objects.
Here is a SConstruct similar to above, but with a whitelist option:
subdirs = ['B','C','D']
AddOption('--only', default=[], action='append', choices=subdirs)
env = Environment(ONLY = GetOption('only') or subdirs)
env.SConscript(
    dirs=env['ONLY'],
    exports='env')
env2 = env.Clone()
env2.PrependUnique(LIBPATH=subdirs,
                   LIBS=subdirs)
env2.Program('main.c')
To build everything: scons
To rebuild D and relink main program: scons --only=D
If D is independent of B and C, just specify your target in D (program/library), or the whole directory, explicitly as a target on the command line, e.g. scons D/myprog.exe. SCons will expand the required dependencies automatically, and as such doesn't traverse the unrelated folders B and C.
Note how you can specify an arbitrary number of targets, so
scons D/myprog.exe B
is allowed too.
I'm using autotools on a C project that, after installation, needs a particular directory structure in /var/lib as follows:
/var/lib/my-project/
    data/
    configurations/
        local/
        extra/
    inputs/
I'm currently using the macro AS_MKDIR_P in configure.ac like so:
AS_MKDIR_P(/var/lib/my-project/data)
AS_MKDIR_P(/var/lib/my-project/configurations/local)
AS_MKDIR_P(/var/lib/my-project/configurations/extra)
AS_MKDIR_P(/var/lib/my-project/inputs)
But that requires the configure script to be run with root permissions, which I don't think is the way to go. I think the instructions to create this directory structure need to be in Makefile.am, so that make install creates them rather than configure, but I have no idea how to do that.
You really, really, really do not want to specify /var/lib/my-project. As the project maintainer, you have the right to specify relative paths, but the user may change DESTDIR or prefix. If you ignore DESTDIR and prefix and just install your files in /var/lib without regard for the user's requests, then your package is broken. It is not just slightly damaged, it is completely unusable. The autotool packaging must not specify absolute paths; that is for downstream packagers (i.e., those that build *.rpm or *.deb or *.dmg or ...). All you need to do is add something like this to Makefile.am:
configdir = $(pkgdatadir)/configurations
localdir = $(configdir)/local
extradir = $(configdir)/extra
inputdir = $(pkgdatadir)/inputs
mydatadir = $(pkgdatadir)/data
config_DATA = cfg.txt
local_DATA = local.txt
extra_DATA = extra.txt
input_DATA = input.txt
mydata_DATA = data.txt
This will put input.txt in $(DESTDIR)$(pkgdatadir)/inputs, etc. If you want that final path to be /var/lib/my-project, then you can specify datadir appropriately at configure time. For example:
$ CONFIG_SITE= ./configure --datadir=/var/lib > /dev/null
This will assign /var/lib to datadir, so that pkgdatadir will be /var/lib/my-project, and a subsequent make install DESTDIR=/path/to/foo will put the files in /path/to/foo/var/lib/my-project/. It is essential that your auto-tooled package honor things like prefix (which for these files was essentially overridden here by the explicit assignment of datadir) and DESTDIR. The appropriate time to specify paths like /var/lib is when you run the configure script. For example, you can add the options to the configure invocation in your rpm spec file, in debian/rules, or in whatever file your packaging system uses. The autotools provide a very flexible packaging system which can easily be used by many different packaging systems (unfortunately, the word "package" is highly overloaded!). Embrace that flexibility.
According to autotools documentation (here and here), there are hooks that you can specify in Makefile.am that will run at specific times during the installation. For my needs I will use install-exec-hook (or install-data-hook) which will be run after all executables (or data) have been installed:
install-exec-hook:
    $(MKDIR_P) /var/lib/my-project/data
    $(MKDIR_P) /var/lib/my-project/configurations/local
    $(MKDIR_P) /var/lib/my-project/configurations/extra
    $(MKDIR_P) /var/lib/my-project/inputs
MKDIR_P is a variable containing the command mkdir -p, or an equivalent if the system's mkdir doesn't support the -p option. To make it available in Makefile.am you have to call the macro AC_PROG_MKDIR_P in configure.ac.
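For reference, a minimal configure.ac sketch that provides $(MKDIR_P) (the project name, version, and file list are placeholders):
# configure.ac (sketch)
AC_INIT([my-project], [1.0])
AM_INIT_AUTOMAKE([foreign])
# Defines MKDIR_P so Makefile.am can use $(MKDIR_P)
AC_PROG_MKDIR_P
AC_CONFIG_FILES([Makefile])
AC_OUTPUT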
I have recently converted my work's make-based build system to Shake. I am now trying to make the Shake build a little more robust to changes in the directory structure so that I do not have to regenerate the build system.
Each of my projects is C-based and has the following directory structure:
src
    source folder 1
    source folder 2
inc
    inc folder 1
    inc folder 2
I am able to capture all the source files, but what I can't get to work is capturing the include folders. I am trying to capture the root inc folder and its subfolders into a variable in the build system. I have been using the following setup:
includes = getDirectoryDirs "inc"
This gives me the include subfolders but not the root folder inc itself, which I thought I could work around, but then inc will not be tracked.
What I would like is to have something like
includes = getDirectoryDirAndRoot "inc"
which would capture each of the subdirectories as well as the root directory and have them tracked by the build system.
That aside, I have also tried to use
gcc -o out includes
But I would need to have every element in includes prepended with "-I" which I can't seem to figure out.
How would one go about doing this in Shake? In make I can accomplish all of this by using make's shell function and a couple of string manipulation functions.
I think the question can be interpreted both ways, and both ways are useful (you may even want both), so I'll give two answers:
Answer 1: You want the C file to be recompiled if any file in the inc directory changes.
"*.c" *> \out -> do
headerFiles <- getDirectoryFiles "inc" "**/*.h"
need headerFiles
...
This snippet gets a list of all header files in the inc directory (or its subdirectories) and introduces a dependency on them. If any header file changes, this rule will rerun.
Answer 2: You want to get the list of directories to pass to gcc.
"*.c" *> \out -> do
includeDirs <- getDirectoryDirs "inc"
cmd "gcc -c" [out] "-Iinc" (map ("-Iinc/" ++) includeDirs)
...
This snippet gets the directory names under inc and then uses map to prepend -Iinc/ to them. It also passes -Iinc directly. If the list of directories under inc changes, this rule will rebuild. However, if the header files contained in those directories change, nothing will rebuild. You can add dependencies on the header files actually used with the gcc -MD family of flags, as described in the Shake user manual, or by using the technique from Answer 1.
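As a sketch of the -MD route, following the pattern in the Shake user manual (this assumes each .o sits next to its .c, uses .m as the dependency-file extension, and uses the older *> operator as above; newer Shake spells it %>):
-- needs: import Development.Shake
--        import Development.Shake.FilePath
--        import Development.Shake.Util (needMakefileDependencies)
"*.o" *> \out -> do
    let src = out -<.> "c"
    let dep = out -<.> "m"
    () <- cmd "gcc -c" [src] "-o" [out] "-MMD -MF" [dep] "-Iinc"
    -- Re-run this rule when any header gcc actually read changes.
    needMakefileDependencies dep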
Have a look at addOracle and its cousin addOracleCache. This should allow you to depend on information besides the files themselves, such as directories to be included.
But I would need to have every element in includes prepended with "-I" which I can't seem to figure out.
You can use Haskell here. If you have a list of directories directories :: [FilePath], you can turn those into compiler flags with
asIncludes :: [FilePath] -> [String]
asIncludes = fmap ("-I" ++)
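For example (a usage sketch; srcFile and includeDirs are hypothetical names for values you already have in scope inside a rule):
-- asIncludes ["inc/foo", "inc/bar"] == ["-Iinc/foo", "-Iinc/bar"]
cmd "gcc -c" [srcFile] (asIncludes includeDirs)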
A friend and I are using Qt Creator with Boost to build a game engine. So far the plan is that the engine will be a shared library, which we can exercise with a test executable that will eventually turn into the game we want to make.
The problem is mainly header files. I'd like to find some way for Qt Creator to recognize the engine's header files as soon as the latest build of the engine finishes, or even as soon as they're added. At first I was thinking of a Python script, executed as a build step in Qt Creator after the engine had been built, that would simply copy the header files to a system directory (/usr/include, for example, on a *nix system). The IDE would then find the header files when linking the engine with the test executable, and we'd also have full autocompletion support.
Of course, environment variables would be used, and while I prefer developing on Linux, my friend prefers Windows, so we agreed to each handle development for our respective platforms.
While this seems like a workable solution, I think the Python script idea may be overkill. Is there a better way to do this?
Update
With the suggested qmake script, I end up getting this error:
cp -f "/home/amsterdam/Programming/atlas/Engine/AtlasEngine/"AtlasEngine_global.h "/"
cp: cannot create regular file `/AtlasEngine_global.h': Permission denied
make: Leaving directory `/home/amsterdam/Programming/atlas/Engine/AtlasEngine__GCC__Linux__Debug'
make: *** [libAtlasEngine.so.1.0.0] Error 1
15:20:52: The process "/usr/bin/make" exited with code 2.
Error while building project AtlasEngine (target: Desktop)
When executing build step 'Make'
My adjustments look as follows:
# Copy over build artifacts
SRCDIR = $$ATLAS_PROJ_ROOT
DESTDIR = $$ATLAS_INCLUDE
# Look for header files there too
INCLUDEPATH += $$SRCDIR
# Dependencies: mylib. Only specify the libs you depend on.
# Leave out for building a shared library without dependencies.
#win32:LIBS += $$quote($$SRCDIR/mylib.dll)
# unix:LIBS += $$quote(-L$$SRCDIR) -lmylib
DDIR = \"$$SRCDIR/\" #<--DEFAULTS
SDIR = \"$$IN_PWD/\"
# Replace slashes in paths with backslashes for Windows
win32:file ~= s,/,\\,g
win32:DDIR ~= s,/,\\,g
win32:SDIR ~= s,/,\\,g
for(file, HEADERS) {
    QMAKE_POST_LINK += $$QMAKE_COPY $$quote($${SDIR}$${file}) $$quote($$DDIR) $$escape_expand(\\n\\t)
}
I have managed to overcome this using some qmake magic that works cross-platform. It copies the shared libraries (either .dll or .so files) along with the header files to a dlls directory at a level next to your current project.
Put this at the end of your .pro files and change the paths/libs accordingly.
# Copy over build artifacts
MYDLLDIR = $$IN_PWD/../dlls
DESTDIR = \"$$MYDLLDIR\"
# Look for header files there too
INCLUDEPATH += $$MYDLLDIR
# Dependencies: mylib. Only specify the libs you depend on.
# Leave out for building a shared library without dependencies.
win32:LIBS += $$quote($$MYDLLDIR/mylib.dll)
unix:LIBS += $$quote(-L$$MYDLLDIR) -lmylib
DDIR = \"$$MYDLLDIR/\"
SDIR = \"$$IN_PWD/\"
# Replace slashes in paths with backslashes for Windows
win32:file ~= s,/,\\,g
win32:DDIR ~= s,/,\\,g
win32:SDIR ~= s,/,\\,g
for(file, HEADERS) {
    QMAKE_POST_LINK += $$QMAKE_COPY $$quote($${SDIR}$${file}) $$quote($$DDIR) $$escape_expand(\\n\\t)
}
Then adjust LD_LIBRARY_PATH in the 'Run settings' of your project to point to that same dlls directory (relatively).
Yes, it's ugly with the escaping for paths with spaces and backslashes, but I found it to work well cross-platform, tested on Windows (XP, 7) and Linux. And yes, it requires changing an environment setting to run your project, but at least you no longer need external (Python) scripts or to install into a system directory, which would require root privileges.
Improvements are welcome.
I'm not sure if anyone else is having issues with this, but for whatever reason qmake wasn't able to access my user-specified environment variables properly.
Since that was the case, one solution I came up with was to add the variables as qmake configuration variables.
If you're on a UNIX-based system, the first thing you'll want to do is append the location of qmake, which should be in your QtSDK folder, to your system $PATH, like so:
export PATH=$PATH:/path/to/QtSDK/...../qmake_root
From there, you can do something along the lines of:
qmake -set "VARIABLE" "VALUE"
In this case, I simply did:
qmake -set "ATLAS_PROJ_ROOT" $ATLAS_PROJ_ROOT.
And then I accessed it in my Qmake project file (.pro) with:
VAR = $$[ATLAS_PROJ_ROOT]
More info can be found here.
I have a C project that has the following structure
Main/
    Makefile.am
    bin/
    src/
        Makefile.am
        main.c
        SomeLibrarySource/
            SomeFuncs.c
            SomeFuncs.h
The main.c contains the main function that uses functions defined in the SomeFuncs.{h/c} files.
I want to use autotools for this project. I read a couple of resources on autotools, but I was only able to manage using autotools for a single-level project where all source, object, and other files reside in the same directory.
Then I got some links that talked about using autotools for deep projects like this one and then I got confused.
Right now I have two Makefile.am files, as follows:
Makefile.am:
SUBDIRS = src
src/Makefile.am:
mainprgdir = ../
mainprg_PROGRAMS = main
main_SOURCES = main.c
I am pretty sure that these files should not be as I have them now :P
How do I use autotools for the above project structure? (At least, what should be in those Makefile.am files, and where should I place them?)
EDIT:
One more thing! In the end, I would like to have the object files created in the bin directory.
Thanks
mainprgdir = ../ does not make a whole lot of sense (you don't know what it will be relative to at installation time). Probably what was intended:
# Main/Makefile.am
# ↓━━ install location for `make install` (the bin_ prefix → $(bindir))
# |            ↓━━ target for compilation
bin_PROGRAMS = bin/main
# ↓━━ based upon the compilation target name (bin/main → bin_main)
bin_main_SOURCES = src/main.c
There are two main approaches. If the functions in SomeLibrarySource are used only by main, then there's no need to build a separate library and you can simply specify the source files in src/Makefile.am
main_SOURCES = main.c SomeLibrarySource/SomeFuncs.c
However, if you actually want to use the functions in other code in your tree, you do not want to compile SomeFuncs.c multiple times but should use a convenience library.
# Assigning main_SOURCES is redundant
main_SOURCES = main.c
main_LDADD = SomeLibrarySource/libSomeFuncs.a
noinst_LIBRARIES = SomeLibrarySource/libSomeFuncs.a
AM_CPPFLAGS = -I$(srcdir)/SomeLibrarySource
(You'll need AC_PROG_RANLIB in configure.ac to use convenience libraries.)
If the source file is named SomeFuncs.c, automake will not need Makefile.am to specify SomeLibrarySource_libSomeFuncs_a_SOURCES, but if the names of the source files do not match the name given in noinst_LIBRARIES, SomeLibrarySource_libSomeFuncs_a_SOURCES should be set to the list of files used to build the library. Note that you do not need to specify main_SOURCES, since main.c is the default value if it is left unspecified (but it's not a bad idea to be explicit). (In all of this, I am not comfortable using CamelCase names, but the system I'm on has a case-insensitive file system (the biggest mistake Apple ever made) and the examples given here are working for me. YMMV.)
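For the layout in the question, that explicit assignment would look something like this (a sketch for src/Makefile.am, listing the header alongside the source so it gets distributed):
SomeLibrarySource_libSomeFuncs_a_SOURCES = \
    SomeLibrarySource/SomeFuncs.c \
    SomeLibrarySource/SomeFuncs.h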
You could of course do a recursive make, or build the library as a separate project and install it. (I like the final option. Libraries with useful features should exist on their own.)
I have a C program built using Autotools. In src/Makefile.am, I define a macro with the path to installed data files:
AM_CPPFLAGS = -DAM_INSTALLDIR='"$(pkgdatadir)"'
The problem is that I need to run make install before I can test the binary (since it needs to be able to find the data files).
I can define another macro with the path of the source tree so the data files can be located without installing:
AM_CPPFLAGS = -DAM_INSTALLDIR='"$(pkgdatadir)"' -DAM_TOPDIR='"$(abs_top_srcdir)"'
Now, I would like the following behavior:
If the binary was installed via make install, use AM_INSTALLDIR to fetch data files.
If the binary was not installed, use AM_TOPDIR to fetch data files.
Is this possible? Is there a better approach to this problem?
What I do (in https://rhdunn.github.com/cainteoir/) is:
const char *basedir = getenv("CAINTEOIR_DATADIR");
if (!basedir)
    basedir = DATADIR "/" PACKAGE; // e.g. /usr/share/cainteoir-engine
and then run it (in tests/harness.py) as:
CAINTEOIR_DATADIR=`pwd`/data src/apps/metadata/metadata test_file.epub
This then allows the user to change the location of where to get the data if they wish.
Making the program able to use a run-time configuration as proposed by reece is a good solution. If for some reason you do not want it to be configurable at run-time, a common solution is to build a test binary differently than the installed binary (there are other problems associated with this, in particular ensuring that the program you are testing has behavior that is consistent with the program that is installed.) An easy way to do that is something like:
bin_PROGRAMS = foo
check_PROGRAMS = test-foo
test_foo_SOURCES = $(foo_SOURCES)
AM_CPPFLAGS = -DINSTALLDIR='"$(pkgdatadir)"'
test_foo_CPPFLAGS = -DINSTALLDIR='"$(abs_top_srcdir)"'
Rather than using a binary with a different name, you might want to have a dedicated tests directory and build the program using the same name as the original.
Note that I've changed the name from AM_INSTALLDIR to INSTALLDIR. Automake reserves names
beginning with "AM_" for its own use, and by using that name you are stomping on Automake's
namespace.
A bit of additional information first: The data files are under active development, and I have various scripts that need to call binaries using local data files, whereas installed binaries should use stable, installed data files.
My original solution made use of an environment variable, as proposed by reece. But I didn't want to manage setting up environment variables in various places, and I didn't want any risk of the wrong data files being picked up due to a mistake.
So the solution I ended up with was to define macros for both locations at build time, and add a flag (-local) to the binaries to force local data files to be used.
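A minimal sketch of what that selection can look like in C (INSTALLDIR and TOPDIR stand in for whatever macro names the build actually defines; the flag handling is deliberately simplistic):
#include <stdio.h>
#include <string.h>

/* Both paths are baked in at build time, e.g. via
 *   -DINSTALLDIR='"$(pkgdatadir)"' -DTOPDIR='"$(abs_top_srcdir)"'   */
#ifndef INSTALLDIR
#define INSTALLDIR "/usr/local/share/my-project"
#endif
#ifndef TOPDIR
#define TOPDIR "."
#endif

int main(int argc, char **argv)
{
    const char *datadir = INSTALLDIR;  /* default: installed data files */

    /* -local forces the in-tree data files. */
    if (argc > 1 && strcmp(argv[1], "-local") == 0)
        datadir = TOPDIR;

    printf("using data files from %s\n", datadir);
    return 0;
}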