Support Spack Usage Scenarios
This page describes a variety of Spack usage scenarios that CSAID intends to support, grouped by role. We do not attempt to list every possible task for each role; our goal is to provide enough scenarios that the approach to other scenarios is clear by analogy.
The current aim is to get to a build and distribution candidate for LArSoft. Thus we will not include Mu2e code in this solve.
To simulate the usage scenarios we want to support, we first need a "base environment" containing the following (a sketch of such an environment's spack.yaml appears after this list):
- a specified version of larsoft (and thus a specified version of art)
- the development version of each supported experiment's code (to ensure all the right dependencies are built, and that experiments will be able to use the built larsoft for experiment code development).
- The compiler version(s) we want should be specified as `toolchains:` stanzas, with a default compiler toolchain specified in an `all:` section using `require:` (or `prefer:` if multiple compilers are being used).
- The target architecture should be specified in the `all:` section as `x86_64_v3`.
- The `variants:` key should include `generator=ninja`.
- The `variants:` key should include `cxxstd=17`.
- The versions of art and larsoft used for the previous exercise are good.
- No package constraints should use `prefer:` except to work around concretizer issues.
- Packages with specified version numbers should be required.
- Non-default variants should be required.
- Because we care about the compiler version, it should be required, or specified in the `specs:` list to work around concretizer issues.
- The bootstrap script already marks the appropriate packages as external; no additions should be made except as listed below.
- No package file should contain an `all:` key (because we do not want individual package files to override the global concretization).
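A minimal sketch of such an environment's `spack.yaml`, using a plain compiler `require:` rather than a full `toolchains:` stanza for brevity; the spec list, version numbers, and compiler version are hypothetical placeholders:

```yaml
spack:
  # Illustrative specs only; the real list comes from larsoft_specs.yaml
  specs:
    - larsoft@09.90.01
    - art@3.14.04
  packages:
    all:
      # Default compiler and target for every package in the environment
      require:
        - "%gcc@13.1.0"
        - "target=x86_64_v3"
      variants: generator=ninja cxxstd=17
    # Packages with pinned versions are required, not preferred
    larsoft:
      require: "@09.90.01"
    art:
      require: "@3.14.04"
```

Per the rules above, `prefer:` appears nowhere, and only the environment-level `all:` section carries global settings.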
This means someone working on art code and possibly other code upon which art depends, but not code from higher-level layers of the system.
1. Building and distributing a new DM tool release (e.g. sam-web-client, metacat) compatible with an existing experiment code release
2. Distributing a new config file package (e.g. ifdhc_config) that overrides the existing installation
(aka "Fun with `spack deprecate`"; a sketch follows below)
This means someone working on LArSoft code and possibly other code upon which LArSoft depends, but not working on experiment code.
This means a member of an experiment, working on code from the experiment and possibly other code upon which the experiment's code depends.
- LArSoft releases are made via Spack environments.
- LArSoft's `larsoft_specs.yaml` and `larsoft_packages.yaml` files reside in a repository
  - When creating a new LArSoft release, the release manager will use a tagged version of those files
  - The release manager will also use tagged versions of lower-down configurations (e.g. `art_specs.yaml` and `art_packages.yaml`)
- LArSoft's `larsoft_packages.yaml` includes tagged versions of LArSoft packages (see the sketch after this list)
- Each experiment's `<exp>_package.yaml` file includes only `@develop` versions
- After tagging each LArSoft package with its new version (for a given LArSoft release):
  - The corresponding Spack recipes are updated with the new versions
  - The repository containing the Spack recipes is tagged with a version number corresponding to the LArSoft release
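A minimal sketch of the contrast between the two kinds of packages files, assuming Spack's `require:` mechanism; the package names and version numbers are hypothetical placeholders:

```yaml
# larsoft_packages.yaml: LArSoft packages pinned to their release tags
packages:
  larsoft:
    require: "@09.90.01"
  larreco:
    require: "@09.90.01"
```

```yaml
# <exp>_package.yaml: experiment packages always track the development branch
packages:
  sbndcode:
    require: "@develop"
```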
2. Distributing a new config file release (e.g. ifdh_config) that overrides the one shipped with existing installs
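This is the distribution half of the scenario; the override on each installed system would use `spack deprecate` as sketched earlier. A minimal sketch of pushing the new release to a shared binary mirror, assuming a mirror named `fnal-mirror` has already been configured (the mirror name and version number are hypothetical):

```sh
# Build the new config release and push it to the shared binary cache
spack install ifdh_config@3.0
spack buildcache push fnal-mirror ifdh_config@3.0

# Refresh the mirror index so client Spack instances see the new package
spack buildcache update-index fnal-mirror
```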
Should we add some DAQ-related roles and scenarios here, at a similar level of detail to what is above? Anything that makes use of the releases of shared infrastructure code we will be making with Spack is reasonable to include.