feat(evaluation): Add lm-eval to Pruna Metrics #1125
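
The PR title references adding the lm-eval harness to Pruna's metrics. As background only, here is a minimal standalone sketch of invoking the EleutherAI lm-eval harness directly via its public `simple_evaluate` API; the Pruna-side integration surface is not visible on this run page, and the model checkpoint and task below are illustrative placeholders, not values from the PR.

```python
# Hypothetical sketch: running lm-eval directly on a Hugging Face model.
# The checkpoint and task names are placeholders; Pruna's own wrapper
# around lm-eval is not shown on this run page.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",                    # lm-eval's Hugging Face backend
    model_args="pretrained=gpt2",  # illustrative checkpoint
    tasks=["hellaswag"],           # illustrative benchmark task
    num_fewshot=0,
    batch_size=8,
)

# Per-task metric values (e.g. acc, acc_norm) live under results["results"].
print(results["results"]["hellaswag"])
```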

Re-run triggered: March 10, 2026 15:23
Status: Success
Total duration: 2m 48s
Artifacts: 1

Workflow: build.yaml
Triggered on: pull_request
Matrix: build

Artifacts

Produced during runtime
Name: pruna-0.3.2-py3-none-any.whl
Size: 325 KB
Digest: sha256:352a7c8f20cd00c483e2c86f04299326c1d57286e616cd464c5c0577aa6611c9
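
If you download the wheel artifact, you can check it against the digest listed above. A small sketch using the standard library, assuming the file was saved to the current directory under its published name:

```python
# Verify the downloaded wheel against the SHA-256 digest shown in the
# artifacts listing above. Assumes the artifact filename is unchanged.
import hashlib

EXPECTED = "352a7c8f20cd00c483e2c86f04299326c1d57286e616cd464c5c0577aa6611c9"

with open("pruna-0.3.2-py3-none-any.whl", "rb") as fh:
    digest = hashlib.sha256(fh.read()).hexdigest()

assert digest == EXPECTED, f"digest mismatch: {digest}"
print("wheel digest verified")
```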