After full fine-tuning with --quantization_bit 8, the loaded 32K model and the checkpoint do not match, causing inference to fail #559
Closed
15024287710Jackson started this conversation in Bad Case
Replies: 2 comments
-
The command that was executed:
-
#253 We are currently unable to reproduce this error; you could ask the community in that discussion thread.
-
(ChatGLM3EnvTest) [ma-user finetune_chatmodel_demo]$python inference.py --pt-checkpoint "/home/ma-user/work/output2023120702_finetune_ds_1000data/govern_1000_data-20231207-180940-1e-4/checkpoint-200" --model /home/ma-user/work/chatglm3-6b-32k
[2023-12-08 15:00:13,259] [INFO] [real_accelerator.py:158:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Loading checkpoint shards: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████| 7/7 [00:06<00:00, 1.11it/s]
Some weights of ChatGLMForConditionalGeneration were not initialized from the model checkpoint at /home/ma-user/work/chatglm3-6b-32k and are newly initialized: ['transformer.prefix_encoder.embedding.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
Traceback (most recent call last):
File "inference.py", line 29, in
model.transformer.prefix_encoder.load_state_dict(new_prefix_state_dict)
File "/home/ma-user/anaconda3/envs/ChatGLM3EnvTest/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1605, in load_state_dict
self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for PrefixEncoder:
Missing key(s) in state_dict: "embedding.weight".
(ChatGLM3EnvTest) [ma-user finetune_chatmodel_demo]$
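For context, the --pt-checkpoint option of inference.py is meant for P-Tuning v2 checkpoints that contain prefix-encoder weights. The sketch below reconstructs, from the traceback above, roughly what that loading path does and why a checkpoint produced by full fine-tuning trips over it; the file name pytorch_model.bin, the pre_seq_len value, and the checkpoint path are assumptions for illustration, not taken from the repository.

    import os
    import torch
    from transformers import AutoConfig, AutoModel

    base_model = "/home/ma-user/work/chatglm3-6b-32k"
    pt_checkpoint = "/path/to/checkpoint-200"  # hypothetical placeholder path

    # The P-Tuning branch loads the base model with a prefix encoder attached
    # (hence the "newly initialized: transformer.prefix_encoder.embedding.weight"
    # warning in the log above).
    config = AutoConfig.from_pretrained(base_model, trust_remote_code=True, pre_seq_len=128)
    model = AutoModel.from_pretrained(base_model, config=config, trust_remote_code=True)

    # Keep only the prefix-encoder weights from the checkpoint.
    prefix_state_dict = torch.load(os.path.join(pt_checkpoint, "pytorch_model.bin"))
    new_prefix_state_dict = {
        k[len("transformer.prefix_encoder."):]: v
        for k, v in prefix_state_dict.items()
        if k.startswith("transformer.prefix_encoder.")
    }

    # A full fine-tuning checkpoint has no transformer.prefix_encoder.* keys,
    # so the filtered dict is empty and strict loading raises:
    #   Missing key(s) in state_dict: "embedding.weight".
    model.transformer.prefix_encoder.load_state_dict(new_prefix_state_dict)

If the checkpoint really did come from full fine-tuning, one thing worth trying (assuming the checkpoint directory contains a complete config and weights) is loading it directly as the model instead of passing it as --pt-checkpoint, e.g. AutoModel.from_pretrained("/path/to/checkpoint-200", trust_remote_code=True).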