[Bug]: Serialized QuantizationMixin subclasses fail to load #1906

@brian-dellabetta

Description

โš™๏ธ Your current environment

I introduced a bug in #1772

๐Ÿ› Describe the bug

When a recipe containing modifiers that extend the QuantizationMixin class is serialized, it fails to load back from the YAML. If the config_groups field is set, the targets field is populated during validation with the full list of targets nested in config_groups. When the recipe is saved, both targets and config_groups are written out, and on re-load the error below is hit because targets and config_groups cannot both be set.

๐Ÿ› ๏ธ Steps to reproduce

from compressed_tensors.quantization import (
    QuantizationScheme,
    QuantizationStrategy,
    QuantizationType,
    QuantizationArgs,
)
from llmcompressor.modifiers.quantization import (
    QuantizationModifier,
)
from llmcompressor.recipe import Recipe

recipe = Recipe.from_modifiers(
    QuantizationModifier(
        config_groups={
            "group_0": QuantizationScheme(
                targets=["Linear"],
                weights=QuantizationArgs(
                    num_bits=4,
                    type=QuantizationType.INT,
                    strategy=QuantizationStrategy.GROUP,
                    group_size=128,
                    symmetric=True,
                    dynamic=False,
                ),
            )
        },
        ignore=["lm_head"],
    )
)

yaml_str = recipe.yaml()

new_recipe = Recipe.create_instance(path_or_modifiers=yaml_str)

This results in the following error:

pydantic_core._pydantic_core.ValidationError: 1 validation error for QuantizationModifier
  Value error, Please specify either `targets` or `config_groups` [type=value_error, input_value={'group': 'default', 'con..., 'ignore': ['lm_head']}, input_type=dict]
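For reference, here is a minimal standalone pydantic sketch of the same failure mode (this is not the actual llmcompressor code; the model and field layout are hypothetical): an after-validator derives targets from config_groups, so the dumped data contains both fields, and the mutual-exclusion check then rejects it on re-load.

from pydantic import BaseModel, model_validator


class SketchModifier(BaseModel):
    # Hypothetical stand-in for a QuantizationMixin subclass
    targets: list[str] = []
    config_groups: dict[str, list[str]] = {}

    @model_validator(mode="before")
    @classmethod
    def validate_exclusive(cls, data):
        # Mirrors the "either `targets` or `config_groups`" check from the error above
        if isinstance(data, dict) and data.get("targets") and data.get("config_groups"):
            raise ValueError("Please specify either `targets` or `config_groups`")
        return data

    @model_validator(mode="after")
    def infer_targets(self):
        # Validation fills `targets` from `config_groups` ...
        if self.config_groups and not self.targets:
            self.targets = [t for group in self.config_groups.values() for t in group]
        return self


modifier = SketchModifier(config_groups={"group_0": ["Linear"]})
dumped = modifier.model_dump()  # ... so the dump now contains *both* fields
SketchModifier(**dumped)        # raises the same ValidationError on re-load

The round trip through Recipe.yaml() and Recipe.create_instance() hits the same conflict because the derived targets field is serialized alongside config_groups.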
