
Commit b60e4d4 (parent 04db016)

Fix CI rate limiting (#1449)

* fix CI
* fix

File tree

8 files changed (+12 −7 lines changed)

.github/workflows/test_inc.yml (2 additions, 2 deletions)

@@ -15,6 +15,7 @@ concurrency:
 
 env:
   TRANSFORMERS_IS_CI: true
+  HF_TOKEN: ${{ secrets.HF_HUB_READ_TOKEN }}
 
 jobs:
   build:
@@ -48,5 +49,4 @@ jobs:
       - name: Test with Pytest
         run: |
           pytest tests/neural_compressor
-        env:
-          HF_HUB_READ_TOKEN: ${{ secrets.HF_HUB_READ_TOKEN }}
+

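The workflow changes above move the token from a per-step `env:` block to the job-wide `env:` block under the name `HF_TOKEN`, so every Hub request made during the run is authenticated and escapes anonymous rate limits. A minimal sketch of the assumed resolution order (the `resolve_token` helper is illustrative, not the `huggingface_hub` API; the library does read `HF_TOKEN` from the environment when no token is passed explicitly):

```python
import os

def resolve_token(explicit_token=None):
    # Illustrative resolution order, assumed to mirror how huggingface_hub
    # picks up credentials: an explicit argument wins, otherwise fall back
    # to the HF_TOKEN variable exported by the workflow's env: block.
    if explicit_token is not None:
        return explicit_token
    return os.environ.get("HF_TOKEN")

os.environ["HF_TOKEN"] = "hf_dummy_for_illustration"  # set by CI in practice
print(resolve_token())               # prints "hf_dummy_for_illustration"
print(resolve_token("hf_explicit"))  # prints "hf_explicit"
```

Exporting the token once at the workflow level is what makes the per-step `env:` block in the Pytest step (deleted above) redundant.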
.github/workflows/test_ipex.yml (1 addition, 0 deletions)

@@ -15,6 +15,7 @@ concurrency:
 
 env:
   TRANSFORMERS_IS_CI: true
+  HF_TOKEN: ${{ secrets.HF_HUB_READ_TOKEN }}
 
 jobs:
   build:

.github/workflows/test_openvino.yml (1 addition, 1 deletion)

@@ -21,7 +21,7 @@ env:
   UV_TORCH_BACKEND: cpu
   UV_SYSTEM_PYTHON: true
   TRANSFORMERS_IS_CI: true
-  HF_HUB_READ_TOKEN: ${{ secrets.HF_HUB_READ_TOKEN }}
+  HF_TOKEN: ${{ secrets.HF_HUB_READ_TOKEN }}
 
 jobs:
   build:

.github/workflows/test_openvino_nightly.yml (1 addition, 1 deletion)

@@ -27,7 +27,7 @@ env:
   UV_TORCH_BACKEND: cpu
   UV_SYSTEM_PYTHON: true
   TRANSFORMERS_IS_CI: true
-  HF_HUB_READ_TOKEN: ${{ secrets.HF_HUB_READ_TOKEN }}
+  HF_TOKEN: ${{ secrets.HF_HUB_READ_TOKEN }}
 
 jobs:
   build:

.github/workflows/test_openvino_notebooks.yml (1 addition, 0 deletions)

@@ -18,6 +18,7 @@ concurrency:
 
 env:
   TRANSFORMERS_IS_CI: true
+  HF_TOKEN: ${{ secrets.HF_HUB_READ_TOKEN }}
 
 jobs:
   build:

.github/workflows/test_openvino_slow.yml (1 addition, 1 deletion)

@@ -24,7 +24,7 @@ env:
   UV_TORCH_BACKEND: cpu
   UV_SYSTEM_PYTHON: true
   TRANSFORMERS_IS_CI: true
-  HF_HUB_READ_TOKEN: ${{ secrets.HF_HUB_READ_TOKEN }}
+  HF_TOKEN: ${{ secrets.HF_HUB_READ_TOKEN }}
 
 jobs:
   build:

optimum/intel/openvino/modeling_sentence_transformers.py (3 additions, 0 deletions)

@@ -153,3 +153,6 @@ def tokenize(
             )
         )
         return output
+
+    def get_model_kwargs(self):
+        return []
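The new `get_model_kwargs` stub returns an empty list, advertising that this module forwards no extra model kwargs. A hypothetical sketch of how a caller might aggregate such lists across pipeline modules (the class and function names below are illustrative, not the sentence-transformers API):

```python
class OVModule:
    # Hypothetical stand-in for the OpenVINO sentence-transformers module:
    # it advertises no extra model kwargs, exactly like the new stub.
    def get_model_kwargs(self):
        return []

class MaskModule:
    # Hypothetical module that does forward an extra kwarg.
    def get_model_kwargs(self):
        return ["attention_mask"]

def collect_model_kwargs(modules):
    # Hypothetical aggregation a pipeline might perform over its modules;
    # a module returning an empty list simply contributes nothing.
    kwargs = []
    for module in modules:
        kwargs.extend(module.get_model_kwargs())
    return kwargs

print(collect_model_kwargs([OVModule()]))                # prints []
print(collect_model_kwargs([OVModule(), MaskModule()]))  # prints ['attention_mask']
```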

tests/openvino/test_modeling.py (2 additions, 2 deletions)

@@ -572,9 +572,9 @@ def test_load_from_hub_and_save_text_speech_model(self):
     @slow
     def test_load_model_from_hub_private_with_token(self):
         model_id = "optimum-internal-testing/tiny-random-phi-private"
-        token = os.environ.get("HF_HUB_READ_TOKEN", None)
+        token = os.environ.get("HF_TOKEN", None)
         if not token:
-            self.skipTest("Test requires a token `HF_HUB_READ_TOKEN` in the environment variable")
+            self.skipTest("Test requires a token `HF_TOKEN` in the environment variable")
 
         model = OVModelForCausalLM.from_pretrained(model_id, token=token, revision="openvino")
         self.assertIsInstance(model.config, PretrainedConfig)
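The updated test now reads the token from `HF_TOKEN` and skips cleanly when it is absent, so forks and unauthenticated runs stay green. The gate pattern can be reproduced in a self-contained form (a sketch, not the actual optimum test; the `hf_` prefix check is an assumption for illustration):

```python
import os
import unittest

class TokenGatedTest(unittest.TestCase):
    def test_private_model_access(self):
        # Skip instead of fail when no token is available, so CI runs
        # without secrets do not break.
        token = os.environ.get("HF_TOKEN", None)
        if not token:
            self.skipTest("Test requires a token `HF_TOKEN` in the environment variable")
        # Assumed convention for illustration: Hub tokens start with "hf_".
        self.assertTrue(token.startswith("hf_"))

if __name__ == "__main__":
    os.environ.pop("HF_TOKEN", None)  # demonstrate the skip path
    suite = unittest.TestLoader().loadTestsFromTestCase(TokenGatedTest)
    result = unittest.TextTestRunner(verbosity=0).run(suite)
    print(len(result.skipped))  # one test skipped when HF_TOKEN is unset
```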
