Commit 9810191
[BC Breaking] Config System Refactor: TOML to Python Dataclass Registry (#2386)
**NOTE**: This PR is a large refactor of the codebase.
https://github.com/pytorch/torchtitan/releases/tag/v0.2.2 contains the
latest release cut right before this PR was merged.
# author's note
This refactor mainly tries to address two issues:
- bad encapsulation: previously the monolithic `JobConfig` was leaked
everywhere
- difficulty iterating and experimenting on model architecture and training
components
The main changes are:
- Strict encapsulation, even at the cost of a (hopefully temporary)
bloated interface when calling subcomponents (e.g. the validator). We should
try to find the right abstraction for cross-component visibility.
- Each `Configurable` component owns its own `Config`, which builds the
owner component. It achieves modularization via polymorphism and
inheritance, both classic concepts in OOP.
- This is partly inspired by repos like
[AXLearn](https://github.com/apple/axlearn) (in particular, @ruomingp's
[ML API
Styles](https://github.com/apple/axlearn/blob/main/docs/ml_api_style.md)),
GitHub issues (e.g. #1055),
and offline discussions (with @Chillee, @ailzhang, etc.).
- Similar functionality can alternatively be achieved in other ways,
e.g. `_target_` in
[Hydra](https://hydra.cc/docs/advanced/instantiate_objects/overview/),
but there are opinions against coupling with Hydra's other offerings. See
#1415
- Main entry point switches from TOML files to Python functions (a.k.a.
`config_registry.py` in each model).
- TOML has the constraint that everything needs to be registered
explicitly before it can be used, e.g. our quantization components need
to be registered under string names. Python's language-level implicit
registration is, we believe, more minimal, and should be fairly
easy to extend/modify to support TOML/YAML when users build upon /
fork torchtitan.
- That said, Python configs provide much more power, e.g. one can use
arbitrary logic to create (the config of) a component, which is hard to
express in TOML/YAML, thus creating extra difficulty when users want
to migrate to their own favorite config system. The only mitigation is
to stay conservative in our use of such power.
- We still use [tyro](https://github.com/brentyi/tyro) to convert
config dataclasses to a CLI, still with the limitation that users need to
construct customized config classes all the way from the root level
(`Trainer.Config` now, `JobConfig` in the past).
- If the CLI is not needed, a new trainer (or any high-level) config is not
required.
- To support "polymorphic construction" from CLI without the hassle,
check out
[chz](https://github.com/openai/chz/blob/main/docs/04_command_line.md#polymorphic-construction).
This PR also
- updates the docs -- there might be remaining outdated docs, please
raise issues or help fix them
- moves ft to experiments, continuing the effort in
#2311
Remaining work
- [AutoParallel CI
failure](https://github.com/pytorch/torchtitan/actions/runs/22165425254/job/64091572780?pr=2386)
seems to be caused by the way RoPE is authored, and needs a change in
autoparallel. (cc @xmfan)
- being fixed in meta-pytorch/autoparallel#321
- [CompilerToolkit CI
failure](https://github.com/pytorch/torchtitan/actions/runs/22168015737/job/64099486707?pr=2386)
`TypeError: forward() missing 1 required positional argument:
'fwd_rng_state_2'` cc @yiming0416 please help take a look
- [SimpleFSDP CI
failure](https://github.com/pytorch/torchtitan/actions/runs/22168015749/job/64099486149?pr=2386)
is the same as #2312, around
dynamic shapes for the for-loop MoE experts computation. (cc @pianpwk)
- being fixed in #2399
- Fix integration scripts for MAST, Zoomer, etc.
- organize docs from `docs/` into subfolders, as we have more
content to cover in general
- generate and store serialized configs (maybe not in the repo)
- continue SAC refactor in
#2357, but somehow keep the
every-other-mm policy (cc @mori360)
- refactor RoPE in general, at least resolving the following TODOs in
code (cc @shuhuayu)
- having to set, with no validation, rope dim == decoder dim // attention
n_heads
- consolidate `apply_rotary_emb_complex` and
`apply_rotary_emb_single_complex`
- address #2417
Longer-term issues
- More careful design about what to put in config vs. runtime build kwargs.
(thanks @ailzhang)
- ModelSpec not serializable. There may be multiple solutions, but we
can potentially consolidate `model.py` and `parallelize.py` by
- sharing AC, compile, DP application across all Decoder models
- putting per-module TP/CP/EP sharding plan inside model itself
- Right now `BaseModel.update_from_config` violates encapsulation by
passing the Trainer config into the Model config. This could be avoided by
Python logic either at config construction time or in the trainer.
- Refactor `init_weights` into `Module.Config` instead of keeping it in
`Module`
- The benefit is that param init becomes configurable; o/w we are
coupling the module implementation with its weight init.
- This may require refactor of current TransformerBlock and its config.
E.g. `weight_init_std` may need to be put in config, with
`__post_init__` determining its value. (See related complaints /
discussions on `__post_init__` by
[chz](https://github.com/openai/chz/blob/main/docs/21_post_init.md))
Note to reviewer:
Although I believe the changes in this PR come naturally as a bundle,
you may (or may not) find the stack of 16 commits easier to review, as I
tried to split the changes in a logical manner. I apologize for the
giant PR.
# claude-generated summary
## Summary
This PR refactors torchtitan's configuration and training infrastructure
in 15 incremental, backwards-incompatible commits. The central change
replaces TOML config files and a monolithic `JobConfig` parser with
**typed Python dataclass configs**, a **`Configurable` base class**
pattern, and a **`config_registry`** module per model.
**270 files changed, 10,025 insertions, 11,418 deletions.**
---
## Motivation
The previous system used TOML files parsed by a custom `ConfigManager`
that layered CLI overrides on top. While simple, this had several
friction points:
1. **No type safety at config boundaries.** TOML values are
strings/ints/floats parsed at runtime. A typo in a key name (e.g.,
`training.stpes`) silently becomes a default value.
2. **Flat namespace.** All config sections (`[model]`, `[training]`,
`[optimizer]`, `[checkpoint]`, ...) lived in a single `JobConfig` class.
Every component received the full `JobConfig` even when it only needed a
few fields.
3. **Experiment extension was ad-hoc.** Experiments that needed custom
config fields (e.g., SimpleFSDP's `compile.graph_passes` or
FaultTolerant's `fault_tolerance.*`) required a `custom_config_module`
TOML key and a runtime `_merge_configs` call to graft new fields onto
`JobConfig`.
4. **Model args were disconnected from model code.** A `ModelArgs`
dataclass in `args.py` defined hyperparameters, but the `TrainSpec` that
bundled model + parallelization + loss was registered separately, with
no type-level link between them.
---
## What Changed
### 1. `Configurable` Base Class
A new `Configurable` base class (`torchtitan/config/configurable.py`)
establishes a universal pattern:
```python
class Configurable:
    @dataclass(kw_only=True, slots=True)
    class Config:
        def build(self, **kwargs):
            return self._owner(config=self, **kwargs)

    def __init_subclass__(cls, **kwargs):
        # Auto-wires Config.build() -> cls(config=..., **kwargs)
        # Enforces @dataclass(kw_only=True, slots=True) on every Config
```
Every configurable component (Trainer, model, optimizer, tokenizer,
dataloader, checkpoint manager, metrics, validators, quantization
converters, ...) follows this pattern. Calling `config.build()`
constructs the owning class.
### 2. `Trainer.Config` Replaces `JobConfig`
The monolithic `JobConfig` is replaced by `Trainer.Config`, a nested
dataclass that aggregates typed sub-configs:
```python
class Trainer(Stateful, Configurable):
    @dataclass(kw_only=True, slots=True)
    class Config(Configurable.Config):
        model_spec: ModelSpec | None = None  # set by config_registry, suppressed from CLI
        job: JobConfig = ...
        training: TrainingConfig = ...
        parallelism: ParallelismConfig = ...
        optimizer: OptimizersContainer.Config = ...
        lr_scheduler: LRSchedulersContainer.Config = ...
        checkpoint: CheckpointManager.Config = ...
        dataloader: BaseDataLoader.Config = ...
        metrics: MetricsProcessor.Config = ...
        # ... etc.
```
Each sub-config is the `Config` class of the component that consumes it
(e.g., `CheckpointManager.Config` is defined inside
`CheckpointManager`). Components receive only their own config, not the
entire training config.
### 3. `config_registry.py` Replaces TOML Files
Each model defines a `config_registry.py` with functions that return
complete `Trainer.Config` instances:
```python
# torchtitan/models/llama3/config_registry.py
def llama3_debugmodel() -> Trainer.Config:
    return Trainer.Config(
        job=JobConfig(description="Llama 3 debug training", ...),
        model_spec=model_registry("debugmodel"),
        optimizer=OptimizersContainer.Config(lr=8e-4),
        training=TrainingConfig(local_batch_size=8, seq_len=2048, steps=10),
        dataloader=HuggingFaceTextDataLoader.Config(dataset="c4_test"),
        # ...
    )

def llama3_debugmodel_float8() -> Trainer.Config:
    config = llama3_debugmodel()
    config.model_converters = ModelConvertersContainer.Config(
        converters=[Float8LinearConverter.Config(enable_fsdp_float8_all_gather=True)]
    )
    return config
```
### 4. `TrainSpec` -> `ModelSpec`
`TrainSpec` is renamed to `ModelSpec` with a narrower scope: it holds
only model-specific concerns (model config, parallelization function,
loss function, state dict adapter). All training-level concerns
(optimizer, LR scheduler, checkpointing, etc.) live in `Trainer.Config`.
### 5. Model Configs: Flat `ModelArgs` -> Nested Dataclass Hierarchy
Model hyperparameters move from a flat `ModelArgs` dataclass into a
nested `Config` hierarchy that mirrors the module tree:
```python
# Before (main): flat args.py
@dataclass
class ModelArgs:
    dim: int = 4096
    n_layers: int = 32
    n_heads: int = 32
    # ... 20+ flat fields

# After (this PR): nested Config in model class
class Llama3Model(Decoder):
    @dataclass(kw_only=True, slots=True)
    class Config(Decoder.Config):
        layer: Llama3TransformerBlock.Config  # contains attention + FFN configs
        rope: RoPE.Config  # contains RoPE-specific params
```
### 6. `train.py` Split
The monolithic `train.py` (~800 lines) is split into:
- `train.py` (~60 lines): thin entry point that calls
`ConfigManager.parse_args()` and `config.build()`
- `trainer.py` (~850 lines): the `Trainer` class with training loop
logic
### 7. Experiment Extension via Inheritance
Experiments extend the config system through dataclass subclassing
instead of runtime config merging:
```python
# torchtitan/experiments/simple_fsdp/configs.py
@dataclass(kw_only=True, slots=True)
class SimpleFSDPConfig(Trainer.Config):
    compile: SimpleFSDPCompileConfig = field(default_factory=SimpleFSDPCompileConfig)
```
Their `config_registry.py` returns the subclassed config type, and
`tyro` auto-generates CLI parsing for the extended fields.
---
## UX Comparison
### Launching Training
```bash
# Before (main)
CONFIG_FILE="./torchtitan/models/llama3/train_configs/llama3_8b.toml" ./run_train.sh
# After (this PR)
MODEL=llama3 CONFIG=llama3_8b ./run_train.sh
```
### CLI Overrides
```bash
# Before (main)
CONFIG_FILE="./torchtitan/models/llama3/train_configs/debug_model.toml" ./run_train.sh \
    --training.steps 100 --parallelism.tensor_parallel_degree 2
# After (this PR)
./run_train.sh --training.steps 100 --parallelism.tensor_parallel_degree 2
# (defaults to MODEL=llama3, CONFIG=llama3_debugmodel via run_train.sh)
```
CLI override syntax is unchanged (`--section.field value`), but `tyro`
now provides typed `--help` output generated from the dataclass tree.
### Defining a New Model Config
```bash
# Before: create a new TOML file, copy-paste sections, edit values
cp train_configs/debug_model.toml train_configs/my_experiment.toml
vim train_configs/my_experiment.toml
# After: write a Python function that mutates an existing config
def my_experiment() -> Trainer.Config:
    config = llama3_debugmodel()
    config.training.steps = 100
    config.optimizer.lr = 1e-4
    return config
```
### Adding Experiment-Specific Config Fields
```python
# Before (main): custom_config_module in TOML + runtime _merge_configs
# Requires: TOML key pointing to a Python module, dynamic dataclass creation
# After (this PR): dataclass inheritance
@dataclass(kw_only=True, slots=True)
class MyExperimentConfig(Trainer.Config):
    my_custom_field: str = "default"
```
### Float8 / Quantization Configuration
```python
# Before (main): TOML section
# [quantize.linear.float8]
# enable_fsdp_float8_all_gather = true
# precompute_float8_dynamic_scale_for_fsdp = true
# After (this PR): typed config object
model_converters=ModelConvertersContainer.Config(
    converters=[
        Float8LinearConverter.Config(
            enable_fsdp_float8_all_gather=True,
            precompute_float8_dynamic_scale_for_fsdp=True,
        ),
    ],
),
```
---
## Limitations and Trade-offs
### 1. Configs are no longer declarative text files
TOML files were readable by anyone without Python knowledge. The new
config_registry functions are Python code, which requires understanding
imports, function calls, and dataclass construction. For users who only
need to tweak hyperparameters, the CLI override syntax
(`--training.steps 100`) works the same, but understanding the full
config requires reading Python.
### 2. Steeper learning curve for contributors
Adding a new model now requires understanding the `Configurable`
protocol, nested `Config` dataclass hierarchy, and the `config_registry`
pattern. The old approach of copying a TOML file and editing values had
a lower barrier to entry.
### 3. Config serialization is more complex
TOML files were trivially serializable and diffable. The new system
supports `to_dict()` + JSON serialization, but configs containing
callables (e.g., `ModelSpec.parallelize_fn`) cannot be fully
round-tripped. The `model_spec` field is excluded from serialization and
suppressed from CLI parsing.
### 4. tyro dependency
The CLI parsing now depends on `tyro`, a third-party library. While
`tyro` is well-maintained and provides typed CLI generation from
dataclasses, it is an additional dependency that must be kept compatible
with the dataclass patterns used here.
### 5. `@dataclass(slots=True)` constraints
The `Configurable` base class enforces `@dataclass(kw_only=True,
slots=True)` on all Config classes. While this provides memory
efficiency and prevents accidental attribute assignment, `slots=True`
prevents dynamic attribute addition and makes multiple inheritance with
other slotted classes more constrained. Each Config subclass in a deep
hierarchy must repeat the `@dataclass(kw_only=True, slots=True)`
decorator.
### 6. Two-level indirection for model selection
The old system required one identifier: `--job.config_file
path/to/file.toml`. The new system requires two: `--module llama3
--config llama3_8b`. While this separates model identity from training
recipe, it adds an extra argument.
---
## Numerics Verification
All model configs were verified for numerical equivalence against the
main branch (commit `10d8a306`):
NOTE
- only models that can fit on 8 GPUs are tested
- only a subset of parallelism combinations is tested
| Model | Status | Notes |
|-------|--------|-------|
| llama3 (debugmodel, 8B) | Bitwise match | |
| llama3 (debugmodel_flex_attn) | Bitwise match | |
| qwen3 (0.6B, 1.7B, 32B, MoE debugmodel) | Bitwise match | |
| deepseek_v3 (debugmodel, 16B) | Close (max diff 0.00014) | Pre-existing main branch bug: missing `eps` in final RMSNorm |
| llama4 debugmodel | Bitwise match | _irope variants don't work on main (FlexAttn `'dict' object has no attribute 'BLOCK_SIZE'`) but now work after this PR |
| **gpt_oss** debugmodel | First-step loss match | `--debug.deterministic` causes loss to be NaN; o/w first step loss matches, with minor differences after (likely caused by flex?) |
| flux | Bitwise match | |
---
## Migration Guide
| Old (main) | New (this PR) |
|---|---|
| `CONFIG_FILE="path/to/config.toml" ./run_train.sh` | `MODEL=llama3 CONFIG=llama3_8b ./run_train.sh` |
| `--job.config_file path.toml` | `--module llama3 --config llama3_8b` |
| `train_configs/*.toml` | `config_registry.py` functions |
| `TrainSpec` | `ModelSpec` |
| `ModelArgs` / `args.py` | Nested `Model.Config` dataclass |
| `custom_config_module` + `_merge_configs()` | Subclass `Trainer.Config` |
| `build_model_converters()` free function | `ModelConvertersContainer.Config.build()` |
| `build_metrics_processor()` free function | `MetricsProcessor.Config.build()` |
280 files changed: +10502 -12043 lines.