Commit 8f327a6

[Training] Deprecate Training Support (#1882)

SUMMARY: Emits a `DeprecationWarning` when `train` is called. Example:

```
/home/dsikka/llm-compressor/examples/quantization_2of4_sparse_w4a16/llama7b_sparse_w4a16.py:83: DeprecationWarning: Training support will be removed in future releases. Please use the llmcompressor Axolotl integration for fine-tuning https://developers.redhat.com/articles/2025/06/17/axolotl-meets-llm-compressor-fast-sparse-open
  train(
```
1 parent 4c95fd2 commit 8f327a6

File tree

1 file changed

+8
-0
lines changed


src/llmcompressor/entrypoints/train.py

Lines changed: 8 additions & 0 deletions
```diff
@@ -10,6 +10,7 @@
 import math
 import os
 
+from compressed_tensors.utils import deprecated
 from loguru import logger
 from transformers import PreTrainedModel
 
@@ -22,6 +23,13 @@
 from .utils import post_process, pre_process
 
 
+@deprecated(
+    message=(
+        "Training support will be removed in future releases. Please use "
+        "the llmcompressor Axolotl integration for fine-tuning "
+        "https://developers.redhat.com/articles/2025/06/17/axolotl-meets-llm-compressor-fast-sparse-open"  # noqa: E501
+    )
+)
 def train(**kwargs) -> PreTrainedModel:
     """
     Fine-tuning entrypoint that supports vanilla fine-tuning and
```
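The implementation of the `deprecated` helper from `compressed_tensors.utils` is not part of this diff. A minimal sketch of what such a decorator factory typically does, assuming it simply emits a `DeprecationWarning` on each call (the real helper may differ in details), could look like:

```python
import functools
import warnings


def deprecated(message: str):
    """Decorator factory that warns with `message` when the wrapped
    function is called. A hypothetical sketch, not the actual
    compressed_tensors.utils implementation."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # stacklevel=2 points the warning at the caller's line,
            # matching the file:line shown in the commit summary above
            warnings.warn(message, DeprecationWarning, stacklevel=2)
            return func(*args, **kwargs)
        return wrapper
    return decorator


@deprecated(message="Training support will be removed in future releases.")
def train(**kwargs):
    # stand-in for the real entrypoint body
    return "training"
```

Calling `train()` then surfaces the deprecation notice while still running the wrapped function unchanged, which is why existing training scripts keep working after this commit.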

0 commit comments

Comments
 (0)