Labels: bug (Something isn't working)
Description
⚙️ Your current environment
I introduced a bug in #1772
🐛 Describe the bug
When a recipe containing modifiers that extend the `QuantizationMixin` class is serialized, it fails to load back from the yaml. If the `config_groups` field is set, validation populates the `targets` field with the full list of targets nested inside `config_groups`. On save, both `targets` and `config_groups` are therefore written out, and on re-load the error below is hit, because `targets` and `config_groups` are not allowed to be set at the same time.
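The failure mode can be reproduced in isolation with a toy Pydantic model (a hypothetical sketch, NOT the actual `QuantizationMixin` code): an "after" validator derives `targets` from `config_groups`, serialization then emits both fields, and re-validation rejects the pair.

```python
# Hypothetical sketch of the round-trip bug; field names mirror the issue,
# but this model is illustrative only, not code from llm-compressor.
from typing import Optional

from pydantic import BaseModel, ValidationError, model_validator


class MixinSketch(BaseModel):
    targets: Optional[list[str]] = None
    config_groups: Optional[dict[str, list[str]]] = None

    @model_validator(mode="before")
    @classmethod
    def check_exclusive(cls, values):
        # Mirrors the "Please specify either `targets` or `config_groups`" check
        if isinstance(values, dict) and values.get("targets") and values.get("config_groups"):
            raise ValueError("Please specify either `targets` or `config_groups`")
        return values

    @model_validator(mode="after")
    def infer_targets(self):
        # Validation fills `targets` with every target nested in `config_groups`
        if self.config_groups is not None and not self.targets:
            self.targets = [t for group in self.config_groups.values() for t in group]
        return self


model = MixinSketch(config_groups={"group_0": ["Linear"]})
dumped = model.model_dump()  # now contains BOTH `targets` and `config_groups`

try:
    MixinSketch(**dumped)  # re-loading trips the exclusivity check
except ValidationError as err:
    print(f"round trip failed: {err}")
```

The dump is valid yaml/json on its own; it is only the mutual-exclusion validator running again on re-load that rejects it.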
🛠️ Steps to reproduce
```python
from compressed_tensors.quantization import (
    QuantizationScheme,
    QuantizationStrategy,
    QuantizationType,
    QuantizationArgs,
)
from llmcompressor.modifiers.quantization import QuantizationModifier
from llmcompressor.recipe import Recipe

recipe = Recipe.from_modifiers(
    QuantizationModifier(
        config_groups={
            "group_0": QuantizationScheme(
                targets=["Linear"],
                weights=QuantizationArgs(
                    num_bits=4,
                    type=QuantizationType.INT,
                    strategy=QuantizationStrategy.GROUP,
                    group_size=128,
                    symmetric=True,
                    dynamic=False,
                ),
            )
        },
        ignore=["lm_head"],
    )
)

yaml_str = recipe.yaml()
new_recipe = Recipe.create_instance(path_or_modifiers=yaml_str)
```
This results in the error:
```
pydantic_core._pydantic_core.ValidationError: 1 validation error for QuantizationModifier
  Value error, Please specify either `targets` or `config_groups` [type=value_error, input_value={'group': 'default', 'con..., 'ignore': ['lm_head']}, input_type=dict]
```
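One way such a round trip can be made safe, shown here on a toy model (a hypothetical illustration, not a proposed patch for llm-compressor), is to drop the derived field during serialization so only `config_groups` survives the dump:

```python
# Hypothetical workaround sketch: a wrap-mode model_serializer removes the
# derived `targets` field whenever `config_groups` is set, so re-validation
# only ever sees one of the two mutually exclusive fields.
from typing import Optional

from pydantic import BaseModel, model_serializer, model_validator


class RoundTripSafe(BaseModel):
    targets: Optional[list[str]] = None
    config_groups: Optional[dict[str, list[str]]] = None

    @model_validator(mode="before")
    @classmethod
    def check_exclusive(cls, values):
        if isinstance(values, dict) and values.get("targets") and values.get("config_groups"):
            raise ValueError("Please specify either `targets` or `config_groups`")
        return values

    @model_validator(mode="after")
    def infer_targets(self):
        # Derive `targets` from the nested groups, as described in the issue
        if self.config_groups is not None and not self.targets:
            self.targets = [t for group in self.config_groups.values() for t in group]
        return self

    @model_serializer(mode="wrap")
    def drop_derived_targets(self, handler):
        data = handler(self)
        # Omit the derived field so the dump round-trips cleanly
        if self.config_groups is not None:
            data.pop("targets", None)
        return data


model = RoundTripSafe(config_groups={"group_0": ["Linear"]})
reloaded = RoundTripSafe(**model.model_dump())  # round trip now succeeds
print(reloaded.targets)
```

Whether the real fix should live in serialization or in the validator ordering is a design question for the maintainers; this only demonstrates that the conflict is between the derived field and the exclusivity check.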