Support Spack Usage Scenarios

Marc Paterno edited this page Oct 17, 2025 · 14 revisions

This page provides descriptions of a variety of Spack usage scenarios that CSAID intends to support. They are grouped by role. We do not attempt to list every possible task for each role. Our goal is to provide enough scenarios that the approach to others is clear by analogy.

Required initial concretization and build

The current aim is to get to a build and distribution candidate for LArSoft. Thus we will not include Mu2e code in this solve.

To simulate the usage scenarios we want to support, we first need a "base environment" containing:

  1. a specified version of larsoft (and thus a specified version of art)
  2. the development version of each supported experiment (to ensure all the right dependencies are built, and that experiments will be able to use the built larsoft for experiment code development).
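As a sketch, such a base environment might be expressed in a spack.yaml like the following; the larsoft version and the experiment package names (dunesw, sbndcode) are illustrative placeholders, not the actual contents of the release:

```yaml
# spack.yaml -- sketch of the base environment (specs are placeholders)
spack:
  specs:
  - larsoft@09.91.04     # a specified larsoft version (and thus a specified art)
  - dunesw@develop       # development version of each supported experiment;
  - sbndcode@develop     # package names here are examples only
```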

Rules for creating the global concretization and build

In the environment configuration (spack.yaml) file

  1. The compiler version(s) we want should be specified as toolchains: stanzas, with a default compiler toolchain specified in an all section using require: (or prefer: if multiple compilers are being used).
  2. The target architecture should be specified in an all section as "x86_64_v3".
  3. The variants key should include "generator=ninja"
  4. The variants key should include "cxxstd=17"
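Taken together, rules 1–4 might produce a spack.yaml fragment like this sketch. The toolchain stanza assumes a Spack version with toolchains: support, and its exact syntax (and the toolchain name gcc14) are placeholders to be checked against the Spack documentation for the version in use:

```yaml
# spack.yaml -- sketch of the rules above; toolchain syntax not verified
# against any particular Spack release
spack:
  toolchains:
    gcc14:                       # toolchain name is a placeholder
    - spec: "%c=gcc@14 %cxx=gcc@14 %fortran=gcc@14"
  packages:
    all:
      require:
      - "%gcc14"                 # default compiler toolchain (use prefer:
                                 # instead if multiple compilers are in play)
      - "target=x86_64_v3"       # target architecture
      variants: generator=ninja cxxstd=17
```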

In the packages.yaml file

  1. The versions of art and larsoft used for the previous exercise are good.
  2. No package constraints should use prefer except to work around concretizer issues.
  3. Packages with specified version numbers should be required.
  4. Non-default variants should be required.
  5. Because we care about the compiler version, it should be required, or specified in the "specs" list to work around concretizer issues.
  6. The bootstrap script already marks the appropriate packages as external; no additions should be made except as listed below.
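A sketch of what rules 1–5 could look like in packages.yaml; all version numbers and the non-default variant shown are hypothetical placeholders:

```yaml
# packages.yaml -- sketch only; versions and variants are placeholders
packages:
  art:
    require: "@3.14.04"      # pinned version: required, not preferred
  larsoft:
    require: "@09.91.04"
  root:
    require:
    - "@6.28.12"
    - "+gdml"                # a hypothetical non-default variant, also required
```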

Rules for experiment package files

  1. No package file should contain an all key (because we do not want them to override the global concretization).
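For example, an experiment's package file should constrain only the experiment's own packages, with no all: key (the package name below is illustrative):

```yaml
# <exp>_package.yaml -- sketch; constrains only experiment packages
packages:
  dunesw:
    require: "@develop"
```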

art developer

This means someone working on art code and possibly other code upon which art depends, but not code from higher-level layers of the system.

1. Development of art code (only)

2. Development of art code and root code.

Data Management Tool Developer

1. Building and distributing a new DM tool release (e.g. sam-web-client, metacat) compatible with an existing experiment code release

2. Distributing a new config file package (e.g. ifdhc_config) that overrides the existing installation

(aka "Fun with spack deprecate")
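A sketch of how spack deprecate could be used here: it replaces one installation with a link to another, so existing consumers of the old install are transparently redirected to the new one. The specs below are placeholders:

```sh
# Redirect the old config installation to the new one (specs are placeholders).
spack deprecate ifdhc_config@2.6.6 ifdhc_config@2.6.20
```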

larsoft developer

This means someone working on larsoft code and possibly other code upon which larsoft depends, but not working on experiment code.

1. Development of new larsoft code (only)

2. Development of new larsoft and art code

Experiment developer

This means a member of an experiment, working on code from the experiment and possibly other code upon which the experiment's code depends.

1. Development of new experiment code (only).

2. Development of new experiment code and larsoft code.

SciSoft release manager

1. Building and distributing a new larsoft release based on an existing art release

  • LArSoft releases are made via Spack environments.

  • LArSoft's larsoft_specs.yaml and larsoft_packages.yaml files reside in a repository

    • When creating a new LArSoft release, the release manager will use a tagged version of those files
    • The release manager will also use tagged versions of lower-down configurations (e.g. art_specs.yaml and art_packages.yaml)
  • LArSoft's larsoft_packages.yaml includes tagged versions of LArSoft packages

  • Each experiment's <exp>_package.yaml file includes only @develop versions

  • After tagging each LArSoft package with its new version (for a given LArSoft release)

    • The corresponding Spack recipes are updated with the new versions
    • The repository containing the Spack recipes is tagged with a version number corresponding to the LArSoft release

2. Building and distributing a new art release based on an existing fife release

3. Building and distributing a new substrate release
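The version conventions in scenario 1 above (tagged versions in larsoft_packages.yaml, @develop-only versions in each experiment's file) can be sketched as two YAML documents; file contents and version numbers are illustrative:

```yaml
# larsoft_packages.yaml -- tagged LArSoft package versions (placeholders)
packages:
  larsoft:
    require: "@09.91.04"
---
# <exp>_package.yaml -- development versions only
packages:
  dunesw:
    require: "@develop"
```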

Experiment release manager

1. Building and distributing a new experiment release based upon an existing larsoft release

2. Distributing a new config file release (e.g. ifdh_config) that overrides the one shipped with existing installs

DAQ roles?

Should we add some DAQ-related roles and scenarios here, at a similar level of detail to what is above? Anything that makes use of the releases of shared infrastructure code we will be making with Spack is reasonable to include.
