This repository was archived by the owner on Sep 10, 2025. It is now read-only.

Commit a602e23

Add file hash to cache key
Summary: Test Plan: Reviewers: Subscribers: Tasks: Tags:
1 parent 1f7748d commit a602e23

File tree

2 files changed: +5 −4 lines

.github/workflows/pull.yml (2 additions, 2 deletions)

@@ -904,9 +904,9 @@ jobs:
           echo "et-git-hash=$(cat ${TORCHCHAT_ROOT}/install/.pins/et-pin.txt)" >> "$GITHUB_ENV"
       - name: Load or install ET
         id: install-et
-        uses: actions/cache@v3
+        uses: actions/cache@v4
         env:
-          cache-key: et-build-${{runner.os}}-${{runner.arch}}-${{env.et-git-hash}}
+          cache-key: et-build-${{runner.os}}-${{runner.arch}}-${{env.et-git-hash}}-${{ hashFiles('torchchat/utils/scripts/install_et.sh') }}
         with:
           path: ./et-build
           key: ${{env.cache-key}}
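
For context on what this workflow change does: actions/cache looks up entries by the literal key string, so appending hashFiles('torchchat/utils/scripts/install_et.sh') means any edit to the install script produces a new key and forces a rebuild of ./et-build, even when the pinned ET commit in et-pin.txt is unchanged. The sketch below shows how such a step fits into a standalone workflow; the workflow name, job name, checkout step, and the "Record ET pin" step are illustrative placeholders, not the actual torchchat job.

# Minimal sketch, assuming a simplified job layout (not the real torchchat workflow).
# The cache key combines OS, architecture, the pinned ET commit, and a digest of the
# install script, so changing any one of them yields a fresh cache entry.
name: et-cache-sketch        # placeholder workflow name
on: push
jobs:
  et-cache-demo:             # placeholder job name
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Record ET pin  # placeholder for the step that exports et-git-hash
        run: echo "et-git-hash=$(cat install/.pins/et-pin.txt)" >> "$GITHUB_ENV"
      - name: Load or install ET
        id: install-et
        uses: actions/cache@v4
        env:
          cache-key: et-build-${{ runner.os }}-${{ runner.arch }}-${{ env.et-git-hash }}-${{ hashFiles('torchchat/utils/scripts/install_et.sh') }}
        with:
          path: ./et-build
          key: ${{ env.cache-key }}

Keying on the script's content hash in addition to the pin also covers the case where the build recipe itself changes (new flags, extra dependencies) without a new ET commit being pinned, which is the staleness a pin-only cache key would otherwise hide.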

torchchat/model.py (3 additions, 2 deletions)

@@ -30,8 +30,9 @@
     SequenceParallel,
 )
 from torch.nn import functional as F
-
-import lm_eval # noqa
+# TODO: remove this after we figure out where in torchtune an `evaluate` module
+# is being imported, which is being confused with huggingface's `evaluate``.
+import lm_eval # noqa
 from torchtune.models.clip import clip_vision_encoder
 from torchtune.models.llama3_1._component_builders import llama3_1 as llama3_1_builder
 from torchtune.models.llama3_2_vision._component_builders import (
