Description
Contact Information
No response
MaxKB Version
1.10.8-lts
Problem Description
Steps to Reproduce

Enable a Tool in the AI chat node and chat with it.
The expected correct result
No response
Related log output
Traceback (most recent call last):
  File "/home/ubuntu/projects/MaxKB/apps/setting/models_provider/impl/base_chat_open_ai.py", line 98, in get_num_tokens_from_messages
    return super().get_num_tokens_from_messages(messages)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/.cache/pypoetry/virtualenvs/maxkb-ZHVvQRGB-py3.11/lib/python3.11/site-packages/langchain_openai/chat_models/base.py", line 1264, in get_num_tokens_from_messages
    raise NotImplementedError(
NotImplementedError: get_num_tokens_from_messages() is not presently implemented for model cl100k_base. See https://platform.openai.com/docs/guides/text-generation/managing-tokens for information on how messages are converted to tokens.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/ubuntu/projects/MaxKB/apps/application/flow/workflow_manage.py", line 506, in hand_event_node_result
    for r in result:
  File "/home/ubuntu/projects/MaxKB/apps/application/flow/step_node/ai_chat_step_node/impl/base_chat_node.py", line 104, in write_context_stream
    _write_context(node_variable, workflow_variable, node, workflow, answer, reasoning_content)
  File "/home/ubuntu/projects/MaxKB/apps/application/flow/step_node/ai_chat_step_node/impl/base_chat_node.py", line 48, in _write_context
    message_tokens = chat_model.get_num_tokens_from_messages(node_variable.get('message_list'))
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/projects/MaxKB/apps/setting/models_provider/impl/base_chat_open_ai.py", line 100, in get_num_tokens_from_messages
    tokenizer = TokenizerManage.get_tokenizer()
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/projects/MaxKB/apps/common/config/tokenizer_manage_config.py", line 18, in get_tokenizer
    TokenizerManage.tokenizer = GPT2TokenizerFast.from_pretrained(
                                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/.cache/pypoetry/virtualenvs/maxkb-ZHVvQRGB-py3.11/lib/python3.11/site-packages/transformers/tokenization_utils_base.py", line 2025, in from_pretrained
    return cls._from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/.cache/pypoetry/virtualenvs/maxkb-ZHVvQRGB-py3.11/lib/python3.11/site-packages/transformers/tokenization_utils_base.py", line 2063, in _from_pretrained
    slow_tokenizer = (cls.slow_tokenizer_class)._from_pretrained(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/.cache/pypoetry/virtualenvs/maxkb-ZHVvQRGB-py3.11/lib/python3.11/site-packages/transformers/tokenization_utils_base.py", line 2278, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/.cache/pypoetry/virtualenvs/maxkb-ZHVvQRGB-py3.11/lib/python3.11/site-packages/transformers/models/gpt2/tokenization_gpt2.py", line 153, in __init__
    with open(vocab_file, encoding="utf-8") as vocab_handle:
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: expected str, bytes or os.PathLike object, not NoneType

Additional Information
No response
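
A possible workaround, sketched under assumptions rather than taken from MaxKB itself: when langchain raises NotImplementedError for the cl100k_base model name, count tokens with tiktoken's cl100k_base encoding directly instead of falling back to GPT2TokenizerFast.from_pretrained, which fails here because the tokenizer files cannot be resolved and vocab_file ends up None. The helper name count_tokens_fallback is hypothetical and for illustration only.

```python
# Sketch only, assuming tiktoken is available (langchain-openai depends on it).
# count_tokens_fallback is a hypothetical helper, not part of MaxKB.
import tiktoken
from langchain_core.messages import BaseMessage


def count_tokens_fallback(messages: list[BaseMessage]) -> int:
    # Use tiktoken's cl100k_base encoding directly instead of downloading
    # the GPT-2 tokenizer, which fails offline (vocab_file is None).
    encoding = tiktoken.get_encoding("cl100k_base")
    # Rough count: only message text is encoded; the per-message formatting
    # overhead OpenAI adds is ignored.
    return sum(len(encoding.encode(str(m.content))) for m in messages)
```

The except branch of get_num_tokens_from_messages in base_chat_open_ai.py (line 100 in the traceback) could call such a helper instead of TokenizerManage.get_tokenizer(), avoiding the GPT2TokenizerFast download path entirely.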