Lumiera uses a build system based on SCons
SCons is an open source software construction tool based on build definition scripts written in Python. Within these build scripts, we define a data structure to describe the parts and dependencies of our software. When executed, SCons evaluates those definitions and the actual files in the source tree to derive a build strategy, which is then performed to actually (re)build the software.
|===
| just build Lumiera:  | scons -j#   (# ≔ number of CPUs)
| build + run Tests:   | scons check   → see target/,testlog
| development build:   | scons testcode
| install:             | scons install   (installs below /usr/local/, ⚠ sudo)
| see possible config: | scons -h   (⚠ settings are sticky → see ./optcache)
|===
→ see also: Known shortcomings of SCons

NOTE: the following overview of SCons concepts is based on the introductory pages on the SCons Wiki
Environment:: When SCons starts building the project, it creates its own environment, with dependency trees, helper functions, builders and other machinery. This SCons environment is built in memory, and parts of it are saved to disk to speed things up on the next start. The definition of the build happens within this abstracted build environment. This often confuses people coming from Makefiles, where the “environment” is actually the System Environment.
System Environment:: the familiar operating system environment with variables such as PATH, HOME etc. It is usually accessible via the os.environ mapping in Python, and therefore in SCons too. SCons does not automatically import any settings from the System Environment (like flags for compilers, or paths for tools), because it is designed to be a cross-platform tool with predictable behaviour. If you rely on any system paths or environment variables, you need to pick up those settings explicitly in your build definition.
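For instance, a build that needs the user's search path or compiler choice must state so explicitly (plain SCons API, not Lumiera-specific):

[source,python]
----
# SConstruct -- hand selected system settings to the build environment explicitly
import os

env = Environment()                        # starts out with a minimal, predictable ENV
env['ENV']['PATH'] = os.environ['PATH']    # let spawned tools use the system search path
if 'CC' in os.environ:
    env['CC'] = os.environ['CC']           # honour a compiler choice made in the shell
----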
SConstruct:: when SCons runs, it executes a build definition script written in Python by the user. By convention, this main script is called SConstruct and is located in the root of the source tree. It is a full-featured Python module, executed within a specifically prepared environment.
SConscript:: these files are also SCons scripts, but they are placed in subdirectories of the project. Typically they are used to organise hierarchical builds and are included from the main SConstruct file in the project root. Often, all of the actual build definitions reside in SConscript files in the sub-trees of the project.
Builder:: the SCons build system revolves around the metaphor of a Builder. This is a SCons object invoked explicitly from the scripts to declare that there is something to build, thereby transforming sources into a target. The target thus depends on the sources, and typically those source nodes were created by previous builder invocations. The use of Builders is declarative: it states that a transformation (build step) has to happen, while the knowledge of how this can be achieved is kept implicit within the build system.
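As an illustration of this declarative style (generic SCons, not our actual build definition):

[source,python]
----
# declare *what* to build -- SCons knows *how* to compile and link on this platform
env = Environment()
objects = env.Object(['main.c', 'util.c'])   # builder: .c -> .o nodes
env.Program('hello', objects)                # builder: .o nodes -> executable target
----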
Action:: Actions are functors that perform something (execute an external command or call a Python function, for instance). A Builder retains a list of Actions needed to update its targets; those Actions are run when needed.
Node:: the basic building block of the dependency graph, whose arcs are created by using Builders: a Node represents a filesystem object, such as a file or directory, which may also be a build result and thus need not exist yet. There are also Alias nodes, and Value nodes which represent setting values. The power of SCons lies in the fact that dependencies can be tracked and a build strategy can be derived automatically, based on the structure of this dependency graph. And because the building blocks of that graph are abstract, users can represent the specifics of their build in a uniform way.
Scanner:: when a builder is defined, SCons relies on modular scanner components to “understand” the sources of the build step. Scanners may inspect source files to discover additional dependencies referenced inside. Thus, SCons comes with built-in knowledge about the source files and artefacts created by a typical build, and further types can be added through plug-ins.
Tool:: any further, external component that adds Builders, Scanners and other helpers to SCons environments for use within the scripts. There are special tools for configuring the platform, to detect libraries and further requirements. Tools do not perform build steps themselves; rather, they configure the build environment to reflect the needs of the specific build.
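A small generic example, loading tools into an environment and probing the platform with a configure context (standard SCons API):

[source,python]
----
# tools equip the environment; a Configure context performs platform checks
env = Environment(tools=['default', 'textfile'])      # standard toolchain + one extra tool
conf = Configure(env)
if not conf.CheckLibWithHeader('m', 'math.h', 'c'):   # detect the math library
    print('math library is missing')
    Exit(1)
env = conf.Finish()
----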
Construction Variables:: all key-value settings within a Construction Environment which are used to instruct SCons about builds. Construction Variables can describe compiler flags, locations of commands to execute, and many other characteristics. They are used for text substitution in command template strings for the invocation of external commands, relying on the usual $VARIABLE syntax. Since the configuration of a SCons environment is defined by its Construction Variables, sub-environments with special configuration may be created.
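For example (generic SCons):

[source,python]
----
# Construction Variables parametrise the environment and its command templates
env = Environment(CCFLAGS=['-O2'], CPPPATH=['src'])
env.Append(CPPDEFINES=['NDEBUG'])                    # extend an existing setting
dbg = env.Clone(CCFLAGS=['-g', '-O0'])               # sub-environment with special config
print(env.subst('will compile with: $CC $CCFLAGS'))  # the usual $VARIABLE substitution
----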
Signature:: SCons computes a signature for elements of the dependency graph, using a cryptographic hash function with the property that the same input reproducibly leads to the same signature. The default hash function is MD5. Signatures are used throughout SCons to identify file contents, build command lines, or cached build artefacts. Signatures are chained to determine if something needs to be re-built.
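The decision function can even be exchanged per environment (standard SCons API):

[source,python]
----
env.Decider('MD5')                # content signatures (the default)
# env.Decider('timestamp-newer')  # would mimic the classic Make behaviour instead
----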
Target:: any Node or “build result” encountered through the definition of the build is a target. The actual build is triggered by requesting a target, which typically might be an executable known to reside at some location in the tree, or a target directory where the build is expected to place its results. Special alias targets may be defined, based on other targets, to set off standard build sequences. Notably, a default target can be defined for the build.
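For instance, an alias target established as the default (generic SCons):

[source,python]
----
prog = env.Program('hello', ['hello.c'])
env.Alias('build', prog)    # 'scons build' now sets off this standard build sequence
Default('build')            # a plain 'scons' invocation builds this target
----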
Within our build system, we leverage the power of the Python programming language to create abstractions tailored to the needs of our project. Located in the admin/scons subdirectory, you’ll find a collection of Python modules providing these building blocks:
* the LumieraEnvironment is created as a subclass of the standard SCons build environment; it is outfitted with preconfigured custom builders for executables, libraries, extension modules, Lumiera plug-ins and icon resources.
* all these custom builders implement a set of conventions and directory locations within the tree. These are defined (and can be adjusted) in the Setup.py module. This way, each builder automatically places the generated artefacts at the standard build and installation locations.
* for defining individual targets and builder invocations, we rely on build helpers to process whole source sub-trees rather than individual files. Mostly, just placing a source file into the appropriate sub-tree is sufficient to get it compiled, linked and installed in the standard way.
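A hypothetical helper in that spirit (the real helpers live in admin/scons and are more elaborate):

[source,python]
----
# sketch of a sub-tree helper: every source file placed below 'tree' gets compiled
def compileSubtree(env, tree):
    return [env.SharedObject(src) for src in Glob(tree + '/*.cpp')]

objects = compileSubtree(env, 'steam')    # sub-tree name chosen for illustration
----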
src/:: All source code of the core application resides below src/. Building these components is controlled by the SConscript located within this application source root. By convention, this is also the root for header includes — all headers should be included relative to src/.
Within this application core tree, there are sub-trees for the main layers comprising the application. Each of these sub-trees is built into a shared library and then linked against the application framework and common services residing in src/common. These common services are in turn built into a shared library, liblumieracommon.so, as is the collection of helper classes and support facilities, known as our support library, liblumierasupport.so. Besides, there is a sub-tree for core plug-ins and helper tools.
One of these sub-trees, residing in src/stage, forms the upper layer, or user-interaction layer. Contrary to the lower layers, the Stage Layer (GUI) is optional and the application is fully operational without it. Thus, the GTK GUI is built and loaded as a Lumiera plug-in.
test/:: Since our development is test-driven, about half of the overall code can be found in unit and integration tests, arranged below test/. A separate SConscript file defines the various kinds of test artefacts to be created:
* the tests covering C++ components are organised into test-suites, residing in separate sub-trees. Currently (as of 11/2025), we link each sub-tree into a shared test library, where the individual translation units define the individual test case classes. Finally, all these test units are linked together with a testrunner main() into the test-suite executable (see the sketch after this list).
* plain-C tests are defined in test collections, grouped thematically into several subdirectories. Here, each translation unit provides a separate main() function and is linked into a stand-alone executable (yet still linked against the appropriate shared libraries of the main application layers).
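The arrangement for the C++ test-suites can be sketched as follows; the suite and runner names are assumptions, not the actual definitions from test/SConscript:

[source,python]
----
# hypothetical test-suite definition: one shared lib per sub-tree, one runner executable
Import('env')
suite_lib = env.SharedLibrary('testsuite-steam', Glob('steam/*.cpp'))  # test case classes
env.Program('test-steam', ['testrunner.cpp'],                          # assumed runner source
            LIBS=['testsuite-steam'], LIBPATH=['.'])
----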
There is a separate subtree for research and experiments. The rationale is to provide a simplified and flexible dependency structure for investigating fundamental problems and trying out new technologies. Notably, there is a source file try.cpp, which is linked against all of the core libraries and is re-used whenever some language feature needs further investigation or a new implementation technique is pioneered.
data/:: the data/ subtree holds resources, configuration files and icons for the GUI. Most of our icons are defined as SVG graphics. The build process creates a helper executable (rsvg_convert) to render these vector graphics, with the help of lib Cairo, into icon collections of various sizes.
Largely, the documentation is written in Asciidoc and provided online in the documentation section of our website. The plain-text sources of this documentation tree are shipped alongside the code. Besides, we build the Doxygen API documentation there, and we create design and technical specs and drawings in SVG and UML.
target/:: This is where the results of the build process are created. Lumiera is organised into a self-contained folder structure. As long as the relative locations, as found within target/, are kept intact, the application will be able to start up and find all its resources. Consequently, there is no need to “install” Lumiera — it can always be launched from a package bundle placed into some directory. In fact, the “install” target just copies this folder structure into the standard installation locations, in accordance with the Filesystem Hierarchy Standard for Unix systems.
NOTE: unfortunately SCons is a bit peculiar regarding the object files created during the build process. So, for the time being, we are building in-tree. (Update 2025: this aspect has been improved upstream, yet we did not find the time to rework our build system accordingly. Apologies for that…)
As described above, a custom environment baseclass LumieraEnvironment provides preconfigured builders, which always define an installation target alongside the build target. These installation targets are arranged into a subtree with a prefix, which is by default usr/local — following the conventions of the Filesystem Hierarchy Standard. As long as you just build, you won’t notice these installation targets, since by default they are located outside of the project tree. However, by customising the build options PREFIX and INSTALLDIR, this installation tree can be relocated. For example, our DEB package uses PREFIX=usr INSTALLDIR=debian/lumiera (please note that such settings are cached in ./optcache and that changing them causes a full rebuild).
All of the build processing is launched through the scons Python script, usually installed into /usr/bin with the SCons package. Just invoking

 scons -h

prints a summary of all custom options, with the targets and toggles defined for our build.
build:: the default target: it creates the shared libs, the application, core plug-ins and the GUI
testcode:: additionally builds the research and unit-test code
check:: builds the test code and runs our test-suites
research:: builds just the research tree
doc:: builds the documentation (currently just Doxygen)
all:: builds the application, the test-suites and the documentation
install:: builds and installs the Lumiera application
By convention, an invocation of scons -c <TARGET> will clean up everything the given target would build. Thus, invoking scons -c / is the most global clean operation: it cleans up all build artefacts and un-installs Lumiera (recall: every defined node or directory is also a target).
By deliberate choice, SCons does not support the concept of a separate “configure” stage. The necessary dependency detection is performed before each build, but with effective caching of the detected settings. Currently, we expect all dependencies to be installed regularly into the system. Custom packages can be installed under /usr/local — however, we do not (yet) support custom libraries in arbitrary locations passed as configuration. Please use your package manager.
SCons stores MD5 sums of all source files, all configure checks and all command lines used to invoke compilers and external tools. The decision about what needs to be rebuilt is based entirely on these checksums. For one, this means that configure checks are re-run only when necessary. It also means that changes to some compiler switches automatically cause all affected parts of the application to be re-built. And of course it means that you only ever compile what is necessary.
With SCons, there is no need for the usual “cleaning paranoia”. Similarly, there is no need for CCache (but using DistCC rocks!). Unfortunately, calculating those MD5 sums requires some time on each build, even if the net result is that nothing happens at all.
We provide several custom configuration options (run scons -h to get a summary). All of these options are sticky: once set, the build system will record them in the file ./optcache and apply them the same way in subsequent builds. It is fine to edit ./optcache with a text editor.
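SCons supports such persistent settings through its Variables mechanism; a minimal sketch of how sticky options of this kind can be wired up (simplified compared to our actual Setup):

[source,python]
----
# sticky build options, persisted in the cache file 'optcache'
opts = Variables('optcache')                # pre-load values stored by earlier runs
opts.Add(PathVariable('PREFIX', 'installation prefix', 'usr/local',
                      PathVariable.PathAccept))
env = Environment(variables=opts)
opts.Save('optcache', env)                  # record the effective settings for next time
----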
The following sections provide additional information helpful when adapting the build system. It should be noted that the SCons scripts are Python modules, invoked within a special setup. Python has a host of specific tweaks and kinks, notably regarding visibility, definition order, imports and the use of standard data types like lists, dictionaries and generator functions. Python knowledge is widespread nowadays, yet we had ample opportunity to notice that, for people not familiar with the Python idiom, the SCons scripts may seem arcane and confusing.
After some basic checks of the setup and the given command line, the SCons builder immediately loads the SConstruct as a module — and expects this Python DSL code to build a project model data structure. The actual build is then driven by evaluating the dependency graph implied by that model.
The individual SConscript definitions for each subfolder must be activated explicitly from the SConstruct, using the SConscript(dirs=[...]) builder function. Note furthermore that the order of the dirs mentioned in this invocation matters, since each SConscript usually imports some global variables at the beginning and exports other settings at the end. Before evaluating a SConscript, the working directory is changed accordingly.
TIP: you can launch scons under a Python debugger, using the Lumiera directory as working location, and set breakpoints in SConstruct or in any of our custom Python modules, to investigate problems with some build definition not taking effect as expected. Inspect the dictionary of the Environment with the debugger to find out what has actually been configured…
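Besides launching scons through an external debugger (or via the --debug=pdb option of SCons), a breakpoint can be planted directly into the build definition:

[source,python]
----
# temporary breakpoint somewhere in SConstruct or a SConscript
import pdb; pdb.set_trace()     # at the prompt, inspect e.g. env.Dictionary()
----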
The Lumiera build system engages in a specific start-up sequence, which is explicitly coded and goes beyond the standard workings of the SCons build system (a condensed sketch follows the list below):
. first we add our tool directory below admin/scons to the search path, so that the tools and Python modules defined there become visible. When adapting the build system, you should familiarise yourself with the contents of this directory!
. next we create our custom root LumieraEnvironment instance, which is stored and exported in the global variable env. This environment setup is performed by the Python module admin/scons/Setup.py: this module holds module-level definitions for the standard path locations; all these settings are collected into a dictionary and placed into the member field env.path of the LumieraEnvironment. All our custom builders use these central settings as shared configuration.
. once the base constructor of the SCons Environment class is invoked, the command line is evaluated to populate the Construction Variables.
. next, the constructor of our custom LumieraEnvironment installs our custom tools and builder functions.
. the last step in the start-up sequence is to invoke the Platform.configure(env) function, which performs all the library and platform checks. This function also configures the default compiler flags.
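Condensed into code, the start-up roughly takes this shape; the entry-point names here are illustrative assumptions, the real code lives in SConstruct and admin/scons/Setup.py:

[source,python]
----
# condensed sketch of the start-up sequence (entry-point names are assumptions)
import sys
sys.path.append('admin/scons')          # (1) make our tools and python modules visible

import Setup                            # (2) module with the standard path locations
env = Setup.defineBuildEnvironment()    #     hypothetical entry point: creates the
                                        #     LumieraEnvironment and populates env.path

import Platform
Platform.configure(env)                 # (3) library/platform checks + default compiler flags
----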
Several of the SConscript files in subdirectories create a nested build environment, which (obviously) derives from LumieraEnvironment. This way we can configure additional link dependencies and build configurations for some subtrees, like building the GUI against GTK, or handling plug-in modules specifically.
The custom builders like env.Program, env.SharedLibrary, env.LoadableModule and env.LumieraPlugin are also defined in the LumieraEnvironment.py module. All these builders are implemented through the common baseclass WrappedStandardExeBuilder, which wraps the standard SCons builders and establishes a special arrangement whereby an install target is always defined alongside the build target. This install target is “dropped off” as a side-effect, while the build target is returned.
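The idea can be sketched as follows (simplified; the actual implementation in LumieraEnvironment.py wraps several builders and handles more cases):

[source,python]
----
# simplified sketch: build target and install target defined in one go
def buildProgram(env, name, sources, **kw):
    prog = env.Program(name, sources, **kw)      # the regular build target
    env.Install(env.path['installBin'], prog)    # install target dropped off as side-effect
    return prog                                  # only the build target is handed back
                                                 # ('installBin' key name is illustrative)
----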
In SCons, a builder returns a list of target nodes, and these can be passed flexibly to further builders. At several places in our SConscript definitions, we use Python functions defined within the script to manipulate and aggregate such target lists. Notably, specific sets of targets can be combined into a shared object (dynamic library), which is then again a SCons target and can be passed to other executable builders as a library dependency for compilation and linking. Look into src/SConscript or test/SConscript for examples of this technique — which we also use to define the global compound target variables core, app_lib, core_lib, vault_lib, and support_lib. These in turn are essential for building the layered dependency hierarchy of our code.
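A condensed example of the technique (the file layout and names are assumptions; see src/SConscript for the real definitions):

[source,python]
----
# aggregate the targets of a sub-tree into a shared library, then link against it
Import('env')
support_objs = [env.SharedObject(f) for f in Glob('lib/*.cpp')]   # assumed layout
support_lib  = env.SharedLibrary('lumierasupport', support_objs)  # again a SCons target
core = env.Program('lumiera', ['main.cpp'],
                   LIBS=['lumierasupport'], LIBPATH=['.'])        # lib passed as dependency
Export('support_lib core')                                        # compound target variables
----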