As of this writing (Feb-11-2026) there's effectively only one
environment (*) in DUNE DAQ, containing all packages needed for any
far detector development and operation. It's installed as nightly
builds, whose names have the format `NFD_DEV_<YYMMDD>_A9`, and as
candidate and stable builds, whose names contain `fddaq` (e.g.,
`fddaq-v5.5.0-a9`). Historically there was also a near detector
environment, though for the time being that's gone by the
wayside. These instructions describe how to add a new environment.
Any environment follows the classic far detector environment
model. That is, it includes a Spack umbrella package whose sole purpose
is to depend on other DUNE DAQ Spack-installed packages which,
unlike it, contain actual code. These packages can be
either core packages (reused by more than one environment,
e.g. `logging`) or specific to the environment in question
(e.g., `fddetdataformats` for the far detector environment). The core
packages are depended on by an umbrella package, `coredaq`, which has
different variants depending on the environment in
question. E.g., `coredaq` with `subset=fddaq` will depend on data
readout packages such as `dpdklibs`, whereas a data-analysis-focused
variant likely would not. `coredaq` itself, along with depending on
core DUNE DAQ packages, depends on an umbrella package called
`externals`, which depends on the external packages that DUNE DAQ
packages need and which itself has the same variant. So we can
represent the Spack packages in the far detector environment as
follows:
```
fddaq ^coredaq subset=fddaq ^externals subset=fddaq
```
Just like the `subset` variant of `coredaq` controls which core
packages are loaded in an environment, the `subset` variant of
`externals` controls which external packages are in the
environment. E.g., `externals subset=fddaq` depends on `dpdk`, while
an `externals` with a `subset` value for a data-analysis-focused
environment shouldn't.
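To make the variant mechanics concrete, here's a runnable sketch of how a `subset` variant can gate dependencies. The `variant`/`depends_on` functions below are shims standing in for Spack's real DSL (in an actual `package.py` you'd subclass `Package` and write `depends_on("dpdk", when="subset=fddaq")` directly); the `cmake` dependency is just a placeholder for illustration.

```python
# Standalone sketch of subset-gated dependencies, in the spirit of an
# externals-style package.py. The two functions below are shims for
# Spack's DSL so the example runs outside Spack.

_deps = []  # (package, when-condition) pairs recorded by the shim

def variant(name, values=(), default=None):
    pass  # Spack would register the variant; the shim just accepts it

def depends_on(pkg, when=None):
    _deps.append((pkg, when))  # record the (possibly conditional) dependency

# externals-style recipe: dpdk is only pulled in for the fddaq subset
variant("subset", values=("fddaq", "yourenv"), default="fddaq")
depends_on("cmake")                       # placeholder: needed by every subset
depends_on("dpdk", when="subset=fddaq")   # readout-only external

def deps_for(subset):
    """Resolve the dependency list for one value of the subset variant."""
    return [p for p, w in _deps if w is None or w == f"subset={subset}"]

print(deps_for("fddaq"))    # ['cmake', 'dpdk']
print(deps_for("yourenv"))  # ['cmake']
```

Spack's concretizer does the equivalent resolution for real: a spec like `externals subset=fddaq` activates exactly the `depends_on(..., when="subset=fddaq")` declarations.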
Last but not least, a DUNE DAQ software environment also includes a
Python `.venv` with pip-installed packages. Again, some Python packages
may be needed for some environments but not for others.
We can use the far detector example to examine how the above model is implemented. Firstly, packages in the environment and their versions are defined in YAML files. They can be found as follows from the base of `daq-release`:

- `configs/coredaq/<env name>-<build type>/release.yaml`: the core DUNE DAQ Spack-installed packages and the Spack-installed external packages for the environment
- `configs/<env name>/<env name>-<build type>/release.yaml`: the non-core (target-specific) DUNE DAQ Spack-installed packages and the Python environment packages
- `configs/<env name>/<env name>-<build type>/dbt-build-order.cmake`: the order in which packages will build in a software development work area based on the environment
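Purely for orientation, a `release.yaml` conceptually just names a release and pins package versions. The fragment below is hypothetical in every key and value; consult an actual `release.yaml` in `daq-release` for the real schema.

```yaml
# Hypothetical shape, for orientation only -- the real schema is
# whatever daq-release defines.
release: yourenv-develop
packages:
  - name: yourpkg1   # placeholder package name
    version: develop
  - name: yourpkg2
    version: develop
```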
Note that it's largely for historic reasons that the external packages
are defined alongside the core packages and the pip-installed Python
packages are defined alongside the non-core packages.
Additionally, the `package.py` files Spack uses to install the DUNE
DAQ packages are provided in `daq-release`. Those for the core
packages are available in
`spack-repos/coredaq-repo-template/packages`, while those specific to an
environment are available in `spack-repos/<env name>-repo-template/packages`.
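Putting the two locations together, the repo-template layout looks roughly like this (entries other than the umbrella packages named in the text are elided):

```
spack-repos/
├── coredaq-repo-template/
│   └── packages/              # package.py's for the core packages
│       ├── coredaq/package.py
│       ├── externals/package.py
│       └── .../package.py
└── fddaq-repo-template/
    └── packages/              # package.py's specific to the fd environment
        ├── fddaq/package.py
        └── .../package.py
```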
We can break this down into a series of steps. In the steps below, for
convenience we'll pretend the new environment is called `yourenv`. The
first steps, based on the description in the previous section, are
fairly obvious:
- Create a `configs/coredaq/yourenv-develop/` configuration directory, copy `release.yaml` from `configs/coredaq/coredaq-develop`, and define what Spack-installed core DUNE DAQ packages and externals your environment needs by deleting any unnecessary packages.
- Similarly, create a `configs/yourenv/yourenv-develop/` configuration directory, copy `release.yaml` and `dbt-build-order.cmake` over from `configs/fddaq/fddaq-develop`, and edit `release.yaml` to define which environment-specific Spack-installed DUNE DAQ packages and pip-installed DUNE DAQ Python packages go into the environment. Don't forget to change the references to `fddaq` in the copied file to the name of your new environment.
- Create `spack-repos/yourenv-repo-template`.
- Copy `spack-repos/fddaq-repo-template/repo.yaml` into it and, again, replace the name `fddaq` in the file.
- Create a subdirectory, `spack-repos/yourenv-repo-template/packages`, which itself is to be populated with subdirectories containing `package.py`s (`yourpkg1/package.py`, `yourpkg2/package.py`, etc.). This, of course, is analogous to the subdirectories in `spack-repos/fddaq-repo-template/packages`, but will contain the `package.py`s for the Spack-installed DUNE DAQ packages referred to in `configs/yourenv/yourenv-develop/release.yaml` rather than for `fddaq`.
- Create the subdirectories for the repos which have code (i.e., not the umbrella package just yet) and create their `package.py` files. These `package.py` files will be processed by `make-release-repo.py`, and hence you'll want to include tokens for that script to swap out (`XVERSIONX`, etc.) - see this `package.py` for an example.
- Now create `spack-repos/yourenv-repo-template/packages/yourenv`, copy `spack-repos/fddaq-repo-template/packages/fddaq/package.py` into it, and of course change the references to `fddaq` in the copied file.
- Add a new value to the `subset` variant in `spack-repos/coredaq-repo-template/packages/externals/package.py`, giving it the name of your environment (`variant('subset', values=('fddaq', 'yourenv' ...)`).
- Likewise for `spack-repos/coredaq-repo-template/packages/coredaq/package.py`.
- In `scripts/spack/release-setup-tools.sh`, edit the section which works with the `$TARGET_ABBREV` variable to add a mapping between the target abbreviation and its full name (e.g., `FD` <-> `fddaq`).
- In `scripts/checkout-daq-package.py`, edit the line which begins with `pkgs = yaml_dict.get("coredaq", []) ...` and add the name of the environment.
- Copy `scripts/templates/build-NEWENV-alma9.yml` over to `.github/workflows`, and change its name so that `NEWENV` has the environment name swapped in - e.g., `.github/workflows/build-yourenv-alma9.yml`.
- Swap the `NEWENV` token in the body of `.github/workflows/build-yourenv-alma9.yml`, as well as `NEWABBREV`, which should be replaced by the target abbreviation you chose a few steps earlier.
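The file-copying steps above can be sketched programmatically. This is a hedged illustration, not a supported `daq-release` tool: it fabricates a minimal fake checkout in a temporary directory (a real checkout obviously contains much more), and the blanket `fddaq` → `yourenv` text substitution stands in for the careful hand-edits the steps call for.

```python
# Sketch of the copy/rename steps for a new environment. A tiny fake
# daq-release tree is fabricated in a temp dir so the sketch runs
# standalone; the paths follow the steps described above.
import shutil
import tempfile
from pathlib import Path

def scaffold_new_env(base, env="yourenv"):
    """Copy the fddaq/coredaq config and repo-template files into place
    for a new environment, returning the created paths."""
    created = []
    # Core config: copy release.yaml, then prune unneeded packages by hand
    src = base / "configs/coredaq/coredaq-develop/release.yaml"
    dst = base / f"configs/coredaq/{env}-develop/release.yaml"
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy(src, dst)
    created.append(str(dst.relative_to(base)))
    # Env-specific config: release.yaml and dbt-build-order.cmake
    for name in ("release.yaml", "dbt-build-order.cmake"):
        src = base / "configs/fddaq/fddaq-develop" / name
        dst = base / f"configs/{env}/{env}-develop" / name
        dst.parent.mkdir(parents=True, exist_ok=True)
        # crude stand-in for hand-editing the fddaq references
        dst.write_text(src.read_text().replace("fddaq", env))
        created.append(str(dst.relative_to(base)))
    # Repo template: repo.yaml plus the umbrella package's package.py
    for rel in ("repo.yaml", "packages/fddaq/package.py"):
        src = base / "spack-repos/fddaq-repo-template" / rel
        dst = base / f"spack-repos/{env}-repo-template" / rel.replace("fddaq", env)
        dst.parent.mkdir(parents=True, exist_ok=True)
        dst.write_text(src.read_text().replace("fddaq", env))
        created.append(str(dst.relative_to(base)))
    return created

# Fabricate a minimal fake checkout and run the sketch against it
results = []
with tempfile.TemporaryDirectory() as tmp:
    base = Path(tmp)
    for p in ("configs/coredaq/coredaq-develop/release.yaml",
              "configs/fddaq/fddaq-develop/release.yaml",
              "configs/fddaq/fddaq-develop/dbt-build-order.cmake",
              "spack-repos/fddaq-repo-template/repo.yaml",
              "spack-repos/fddaq-repo-template/packages/fddaq/package.py"):
        (base / p).parent.mkdir(parents=True, exist_ok=True)
        (base / p).write_text("name: fddaq\n")
    results = scaffold_new_env(base)

for f in results:
    print(f)
```

The edits to `externals`/`coredaq` `package.py` files, the shell and Python scripts, and the workflow template still have to be made by hand as described above.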
At this point, everything is in place to run this workflow and install the environment in `/cvmfs`.
(*) Not to be confused with the idea of a developer vs. production environment