@@ -38,29 +38,10 @@ For the complete list of architectures and their configurations, see :mod:`fairs
 Output Format
 -------------
 
-The converter produces:
-
-1. Model weights in the reference format:
+The converter produces model weights in the reference format:
 - Single checkpoint: ``consolidated.00.pth``
 - Sharded checkpoints: ``consolidated.{00,01,02...}.pth``
 
-2. ``params.json`` containing model configuration:
-
-   .. code-block:: json
-
-      {
-        "model": {
-          "dim": 2048,                // Model dimension
-          "n_layers": 16,             // Number of layers
-          "n_heads": 32,              // Number of attention heads
-          "n_kv_heads": 8,            // Number of key/value heads (if different from n_heads)
-          "multiple_of": 256,         // FFN dimension multiple
-          "ffn_dim_multiplier": 1.5,  // FFN dimension multiplier (if not 1.0)
-          "rope_theta": 500000.0,     // RoPE theta value
-          "norm_eps": 1e-5            // Layer norm epsilon
-        }
-      }
-
 Usage Example
 -------------
 
@@ -72,16 +53,6 @@ Usage Example
      /path/to/fairseq2/checkpoint \
      /path/to/output/dir
 
-2. Convert to HuggingFace format:
-
-   .. code-block:: bash
-
-      fairseq2 llama write_hf_config --model <architecture> <fairseq2_checkpoint_dir>
-
-   * ``<architecture>``: The architecture of the model, e.g. ``llama3`` (see :mod:`fairseq2.models.llama`)
-
-   * ``<fairseq2_checkpoint_dir>``: Path to the directory containing your Fairseq2 checkpoint, where ``config.json`` will be added.
-
 .. note::
 
    The architecture passed to ``--model`` must exist and be registered in, e.g., :meth:`fairseq2.models.llama._config.register_llama_configs`.
@@ -91,8 +62,6 @@ API Details
 
 .. autoclass:: ConvertLLaMACheckpointHandler
 
-.. autoclass:: WriteHFLLaMAConfigHandler
-
 See Also
 --------
 