After finetuning, loading the corresponding checkpoint and starting openai_api.py, requests cannot invoke the tools functions — how can this be fixed? #535
Replies: 1 comment
- You need to add a `functions` field to the POST request body to declare the tools; see https://github.com/THUDM/ChatGLM3/blob/main/openai_api_demo/openai_api_request.py for reference.
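Concretely, the fix is to declare the tools in a `functions` field sent alongside the chat messages. A minimal sketch using only the standard library — the endpoint URL, model name, and the `get_weather` tool schema below are assumptions for illustration, not taken from the linked demo:

```python
import json
import urllib.request

# Assumed local endpoint exposed by `python openai_api.py`.
API_URL = "http://127.0.0.1:8000/v1/chat/completions"

payload = {
    "model": "chatglm3-6b",  # assumed model name; adjust to your deployment
    "messages": [{"role": "user", "content": "What is the weather in Beijing?"}],
    # The "functions" field declares the tools the model is allowed to call.
    "functions": [
        {
            "name": "get_weather",  # hypothetical tool for illustration
            "description": "Look up the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"}
                },
                "required": ["city"],
            },
        }
    ],
}


def call_with_tools() -> dict:
    """POST the request; requires the API server to be running locally."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

If the field is present and the checkpoint was finetuned on tool-use data, the response should contain a function-call message naming the declared tool instead of plain text.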
- After finetuning, I load the checkpoint as shown below and start `python openai_api.py`, but requests do not support the tools functions. Do I need to register the tools functions inside openai_api.py, or is some other approach required?
```python
import os

import torch
from transformers import AutoConfig, AutoModel

# Load the base model with the P-tuning v2 prefix length used during finetuning.
config = AutoConfig.from_pretrained(MODEL_PATH, trust_remote_code=True, pre_seq_len=128)
model = AutoModel.from_pretrained(MODEL_PATH, trust_remote_code=True, config=config)

# Load the finetuned prefix-encoder weights and strip the module prefix from the keys.
prefix_state_dict = torch.load(
    os.path.join(
        "***ChatGLM3/finetune_chatmodel_demo/output/tool_alpaca_pt-20231204-194658-128-2e-2",
        "pytorch_model.bin",
    )
)
new_prefix_state_dict = {}
for k, v in prefix_state_dict.items():
    if k.startswith("transformer.prefix_encoder."):
        new_prefix_state_dict[k[len("transformer.prefix_encoder."):]] = v
print("Loaded from pt checkpoints", new_prefix_state_dict.keys())
model.transformer.prefix_encoder.load_state_dict(new_prefix_state_dict)

# Keep half precision on GPU; fall back to float32 on CPU.
model = model.to(DEVICE).eval() if "cuda" in DEVICE else model.float().to(DEVICE).eval()
```