Commit 9fb5ef4

build: add new tokenizer.
1 parent: a98cde9

2 files changed (+2 −2 lines)


installer/Dockerfile-vector-model

Lines changed: 2 additions & 2 deletions
@@ -11,11 +11,11 @@
 # Not sure why re-pulling the vector model with the script above ends up much larger than before, so keep using the script below to reuse the vector model that was already built
 
 FROM python:3.11-slim-bookworm AS tmp-stage1
-COPY installer/install_model_token.py install_model_token.py
+COPY installer/install_model_bert_base_cased.py install_model_bert_base_cased.py
 RUN pip3 install --upgrade pip setuptools && \
     pip install pycrawlers && \
     pip install transformers && \
-    python3 install_model_token.py && \
+    python3 install_model_bert_base_cased.py && \
     cp -r model/base/hub model/tokenizer
 
 FROM ghcr.io/1panel-dev/maxkb-vector-model:v1.0.1 AS vector-model
installer/install_model_token.py → installer/install_model_bert_base_cased.py

File renamed without changes.
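The renamed script's contents are not shown in this diff (it was renamed without changes), but the Dockerfile steps above (pip install transformers, then cp -r model/base/hub model/tokenizer) suggest it pre-downloads the bert-base-cased tokenizer into model/base so the resulting Hugging Face cache can be copied into model/tokenizer. A minimal sketch of such a script, assuming the Hugging Face cache is redirected into model/base:

# install_model_bert_base_cased.py -- hypothetical sketch, not the actual script
# from this repository; shown only to illustrate the build step above.
# Assumption: the script pre-downloads the bert-base-cased tokenizer so that
# the Dockerfile can copy model/base/hub into model/tokenizer.
import os

# Redirect the Hugging Face cache into model/base; downloads then land in model/base/hub.
os.environ["HF_HOME"] = os.path.join(os.getcwd(), "model", "base")

from transformers import AutoTokenizer  # imported after HF_HOME is set so the cache path is picked up

# Download and cache the bert-base-cased tokenizer files.
AutoTokenizer.from_pretrained("bert-base-cased")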
