How to do build variants in shake?

How can I pass parameters to Shake and then use them in my rules? I would like the equivalent of:
make ARCH=x86_64
or
make DEBUG=YES
etc...
For now I've been using environment variables and the getEnv function to simulate these.
How can I make a dependency on a given parameter (environment variable) so that it is read once per build, stored somewhere, and accessible from multiple rules?
E.g. at the moment I call getEnv multiple times in multiple rules, but the environment variable can change during the build. Also, a debug flag, for example, alters both compiler and linker flags, meaning the information needs to be available in both the compile and link rules.

Your existing approach of using environment variables should work, and since environment variables are already tracked by Shake, one approach is to parse DEBUG=YES and turn it into an environment variable. For example:
import Development.Shake
import Data.List (partition)
import Control.Monad (forM_)
import System.Environment (setEnv)

main = shakeArgsWith shakeOptions [] $ \_ args -> do
    let (vars, files) = partition ('=' `elem`) args
    forM_ vars $ \v -> let (a, '=':b) = break (== '=') v in setEnv a b
    return $ Just $ if null files then rules else want files >> withoutActions rules

rules :: Rules ()
rules = ...
Since environment variables are local to a process (and its child processes), this will probably work fine.
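For example (a hypothetical compile rule, not part of the answer; the file layout and gcc flags are invented), the tracked getEnv means a change to DEBUG between runs reruns exactly the rules that read it:
"_build//*.o" *> \out -> do
    let src = dropDirectory1 out -<.> "c"
    need [src]
    debug <- getEnv "DEBUG"          -- tracked: the rule depends on the value of DEBUG
    let flags = if debug == Just "YES" then ["-g", "-O0"] else ["-O2"]
    cmd "gcc -c" flags [src] "-o" [out]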
Of course, storing such information in the environment variables is a bit ugly. You can create your own oracle to store the information instead:
main = shakeArgsWith shakeOptions [] $ \_ args -> do
    let (vars, files) = partition ('=' `elem`) args
    let vars2 = [(a, b) | v <- vars, let (a, '=':b) = break (== '=') v]
    return $ Just $ do
        global <- addOracle $ \(Global x) -> return $ lookup x vars2
        if null files then rules global else want files >> withoutActions (rules global)

newtype Global = Global String deriving (Show, Typeable, Eq, Hashable, Binary, NFData)

rules :: (Global -> Action (Maybe String)) -> Rules ()
rules global = undefined
Now, instead of writing the information into an environment variable with setEnv, we store it in an oracle with addOracle. The value is still tracked as a dependency, but no longer interferes with any real environment variables.
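As a hypothetical sketch of what rules could look like (the file names and gcc flags are invented), both the compile and the link rule can query the same oracle, so a single DEBUG setting, read once per build, drives both:
rules :: (Global -> Action (Maybe String)) -> Rules ()
rules global = do
    "_build//*.o" *> \out -> do
        let src = dropDirectory1 out -<.> "c"
        need [src]
        debug <- global (Global "DEBUG")
        let cflags = if debug == Just "YES" then ["-g", "-O0"] else ["-O2"]
        cmd "gcc -c" cflags [src] "-o" [out]
    "_build/result" *> \out -> do
        debug <- global (Global "DEBUG")
        need ["_build/main.o"]
        cmd "gcc" (["-g" | debug == Just "YES"]) "-o" [out] ["_build/main.o"]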

Related

Handling multiple build configurations in parallel

How can I build one set of source files using two different configurations without having to rebuild everything?
My current setup adds an option --config=rel which will load all options from build_rel.cfg and compile everything to the directory build_rel/.
data Flags = FlagCfg String
  deriving (Show, Eq)

flags = [Option ['c'] ["config"]
          (ReqArg (\x -> Right $ FlagCfg x) "CFG")
          "Specify which configuration to use for the build"]

main :: IO ()
main = shakeArgsWith shakeOptions{shakeChange=ChangeModtimeAndDigest}
         flags $
         \flags targets -> return $ Just $ do
           let buildDir = "build" ++
                 foldr (\a def -> case (a, def) of
                          (FlagCfg cfg, "") -> '_':cfg
                          _ -> def)
                       "" flags
           -- Settings are read from a config file.
           usingConfigFile $ buildDir ++ ".cfg"
           ...
If I then run
build --config=rel
build --config=dev
I will end up with two builds
build_rel/
build_dev/
However, every time I switch configuration I end up rebuilding everything. I would guess this is because all my oracles have "changed". I would like all oracles to be specific to my two different build directories so that changes will not interfere between builds using different configurations.
I know there is a -m option to specify where the database should be stored, but I would rather not have to pass two options that must be kept in sync all the time.
build --config=rel -m build_rel
Is there a way to update the option shakeFiles after the --config option is parsed?
Another idea was to parameterize all my Oracles to include my build configuration but then I noticed that usingConfigFile uses an Oracle and I would have to reimplement that as well. Seems clunky.
Is there some other way to build multiple targets without having to rebuild everything? It seems like such a trivial thing to do but still, I can't figure it out.
There are a few solutions:
Separate databases
If you want the two directories to be entirely unrelated, with nothing shared between them, then changing the database as well makes most sense. There's currently no "good" way to do that (either pass two flags, or pre-parse some of the command line). However, it should be easy enough to add:
shakeArgsOptionsWith
    :: ShakeOptions
    -> [OptDescr (Either String a)]
    -> (ShakeOptions -> [a] -> [String] -> IO (Maybe (ShakeOptions, Rules ())))
    -> IO ()
Which would then let you control both settings from a single flag.
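For example, a sketch of how that could look (shakeArgsOptionsWith with this signature is available in newer Shake releases; the FlagCfg flag is reused from the question above, and the build rules themselves are elided):
main :: IO ()
main = shakeArgsOptionsWith shakeOptions flags $ \opts cfgFlags targets -> do
    let cfg      = head ([c | FlagCfg c <- cfgFlags] ++ ["dev"])
        buildDir = "build_" ++ cfg
    return $ Just
        ( opts{shakeFiles = buildDir}        -- the database now follows the configuration
        , do usingConfigFile $ buildDir ++ ".cfg"
             want targets
             -- ... the rest of the rules from the question, compiling into buildDir
        )
Running build --config=rel then keeps both the outputs and the Shake database under build_rel, with no second flag to keep in sync.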
Single database
If you want a single database, you could load all the config files and specify config keys like release.destination = ... and debug.destination = .... A rule for */output.txt would then look up the config based on the prefix of the output, e.g. release/output.txt would look up release.destination. The advantage here is that anything that does not change between debug and release (e.g. documentation) can potentially be shared.
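A sketch of the rule that does the prefix-based lookup (getConfig comes from Development.Shake.Config; the config file name and the use of writeFile' are invented for illustration):
usingConfigFile "build.cfg"   -- defines release.destination = ... and debug.destination = ...
"*/output.txt" *> \out -> do
    let prefix = takeDirectory1 out                 -- "release" or "debug"
    dest <- fromMaybe "" <$> getConfig (prefix ++ ".destination")
    writeFile' out dest
Rules whose behaviour is identical in both configurations (documentation, say) need no prefix lookup at all and are built only once.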

Using conditional logic with env variable in GCC Linker script

I have a build environment variable available, say for differentiating two hardware variants: HW_VER1.
We are using a linker script in the build.
So we want something like the following, but of course it gives the error "invalid syntax in flags":
MEMORY
{
ifeq ($(HW_VER1),YES)
iram0_0_seg : org = 0x00080400, len = 0x21C00
else
iram0_0_seg : org = 0x00080400, len = 0xf1C00
endif
}
The requirement is to avoid having two linker scripts with exactly the same contents except for the len value above, and instead let the build system decide which one to use based on the environment variable (the bigger or smaller len depending on the hardware version).
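One possible approach in Shake (a sketch under the assumption that the linker script itself can be a generated build product, with the length chosen from the tracked HW_VER1 environment variable; the file name memory.ld is invented):
"memory.ld" *> \out -> do
    hwVer1 <- getEnvWithDefault "NO" "HW_VER1"
    let len = if hwVer1 == "YES" then "0x21C00" else "0xf1C00"
    -- writeFileChanged only touches the file when the contents differ,
    -- so an unchanged HW_VER1 does not force a relink
    writeFileChanged out $ unlines
        [ "MEMORY"
        , "{"
        , "  iram0_0_seg : org = 0x00080400, len = " ++ len
        , "}"
        ]
The link rule then needs memory.ld, so flipping HW_VER1 regenerates the script and relinks.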

How to override Shake configuration on the command-line

I maintain small configuration files per project read via usingConfigFile. I'd like to be able to override any of those settings on the command line. It seems using shakeArgsWith (rather than shakeArgs) is the first step on the way but I don't see an obvious way to wire that through to the values produced by getConfig. Is there a standard approach for doing this?
There isn't a standard approach, but I know several larger build systems have invented something. A combination of shakeArgsWith, readConfigFile and usingConfig should do it. Something like (untested):
main = shakeArgsWith shakeOptions [] $ \_ args -> return $ Just $ do
    file <- liftIO $ readConfigFile "myfile.cfg"
    usingConfig $ Map.union (argsToSettings args) file
    myNormalRules
Where argsToSettings is some function that parses your arguments and turns them into settings - e.g. breaking on the first = symbol or similar.
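For instance, a minimal argsToSettings could be (a sketch, assuming key=value arguments and that Map is a qualified import of Data.HashMap.Strict, which is the type usingConfig expects):
argsToSettings :: [String] -> Map.HashMap String String
argsToSettings = Map.fromList . map split
  where
    -- "cc=gcc" becomes ("cc","gcc"); an argument with no '=' gets an empty value
    split arg = let (key, rest) = break (== '=') arg in (key, drop 1 rest)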

How should I interpolate environment variables in Shake file patterns?

In my Makefiles, I prefer having the output directory defined by an environment variable rather than hard-coded (with some reasonable default value if it's unset). For example, a Make rule would look like:
$(OUTPUT_DIR)/some_file: deps
#build commands
I have yet to figure out how to achieve a similar goal in Shake. I like using getEnvWithDefault to grab the value of the environment variable or a reasonable default, but no amount of bashing it with binds or lambdas has allowed me to combine it with (*>).
How might it be possible to interpolate an environment variable in a FilePattern for use with (*>)?
The function getEnvWithDefault runs in the Action monad, and the name of the rule has to be supplied in a context where you cannot access the Action monad, so you can't translate this pattern the way you tried. There are a few alternatives:
Option 1: Use lookupEnv before calling shake
To exactly match the behaviour of Make you can write:
main = do
    outputDir <- fromMaybe "output" <$> lookupEnv "OUTPUT_DIR"
    shakeArgs shakeOptions $ do
        (outputDir </> "some_file") *> \out -> do
            need deps
            -- build commands
Here we use the lookupEnv function (from System.Environment) to grab the environment variable before we start running Shake. We can then define a file that precisely matches the environment variable.
Option 2: Don't force the output in the rule
Alternatively, we can define a rule that builds some_file regardless of what directory it is in, and then use the tracked getEnvWithDefault when we say which file we want to build:
main = shakeArgs shakeOptions $ do
    "//some_file" *> \out -> do
        need deps
        -- build commands
    action $ do
        out <- getEnvWithDefault "output" "OUTPUT_DIR"
        need [out </> "some_file"]
Here the rule pattern can build anything, and the caller picks what the output should be. I prefer this variant, but there is a small risk that if the some_file pattern overlaps in some way you might get name clashes. Introducing a unique name, so all outputs are named something like $OUTPUT_DIR/my_outputs/some_file eliminates that risk, but is usually unnecessary.

How to write compilation rules for a bootstrapping compiler

I want to write build rules for a self-hosted compiler. Taking the example of GHC: the GHC compiler is written in Haskell and compiles Haskell. I want to first compile the source using an existing copy of the GHC compiler (phase1), then compile the compiler using the phase1 compiler (phase2), then compile the compiler again using the phase2 compiler (phase3). How can I encode that in Shake?
This problem is similar to writing fixed-point build rules. Some assumptions:
I assume each source file is compiled to one object file with no additional dependencies (the complexities of include/import files are orthogonal)
I assume the objects and results from phase1 end up in the directory phase1 etc.
You can define:
want ["phase3/ghc" <.> exe]
let getPhase x = read $ drop (length "phase") $ takeDirectory1 x :: Int
"//*.o" *> \out ->
let src = dropDirectory1 out -<.> "hs"
let phase = getPhase out
let compiler = if p == 1 then "ghc" else "phase" ++ show (p-1) </> "ghc" <.> exe
need $ src : [compiler | p /= 1]
cmd [compiler] "-c" [src] "-o" out
("//ghc" <.> exe) *> \out ->
let os = map (takeDirectory1 out </>) ["Main.o","Module2.o",...]
need os
cmd "link -o" [out] os
