docs(huggingface): add chat usage guide and regression tests for init_chat_model #33194
Conversation
Force-pushed `docs(huggingface): add chat usage guide` from 4968b4e to 9de7872
We've moved docs away from this repo - see https://github.com/langchain-ai/docs
Thanks! I'll move the docs to langchain-ai/docs and keep this PR tests-only.
Thanks for the contribution! Wondering if you'd be able to port it to the new docs monorepo?
import types
from importlib import util as import_util
from types import SimpleNamespace
from typing import Any, Optional
We want to handle this via integration tests rather than in chat models. Could you remove the unit test?
@@ -0,0 +1,86 @@
---
Could you add these to the new langchain docs?
https://docs.langchain.com/oss/python/contributing/integrations-langchain
@Accidental-MVP @eyurtsev Thanks for the documentation update!
Thanks for the guidance! I'll move the docs to langchain-ai/docs and close this PR. Quick question on tests: you mentioned handling this via integration tests rather than unit tests. Where would you like those to live (e.g., in the Hugging Face partner package, standard-tests, or elsewhere), and what scope would you prefer covered (current failure mode vs. post-fix behavior)? Happy to open a follow-up PR in the right spot and align with house style.
Description
Add offline unit tests documenting the current Hugging Face initialization bug and a short docs page for HF chat usage.
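The offline stubbing these tests rely on can be sketched as follows. This is a minimal sketch with assumed names (`fake_pipeline`, `transformers_stub`): the real tests use pytest's `monkeypatch` fixture, while this sketch assigns to `sys.modules` directly, which is the same underlying mechanism.

```python
import sys
import types
from types import SimpleNamespace

def fake_pipeline(model=None, task=None, **kwargs):
    """Stand-in for transformers.pipeline that records its arguments."""
    return SimpleNamespace(model=model, task=task, kwargs=kwargs)

# Inject a fake `transformers` module so imports resolve offline,
# without the real (heavy, network-dependent) package installed.
transformers_stub = types.ModuleType("transformers")
transformers_stub.pipeline = fake_pipeline
sys.modules["transformers"] = transformers_stub

# Any later `from transformers import pipeline` now resolves to the stub.
from transformers import pipeline

pipe = pipeline(model="gpt2", task="text-generation", max_new_tokens=10)
print(pipe.task)    # text-generation
print(pipe.kwargs)  # {'max_new_tokens': 10}
```

Because `sys.modules` is consulted before the import machinery runs, the stub wins even when the real package is installed; `monkeypatch.setitem(sys.modules, ...)` adds automatic cleanup on top of this.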
Tests (unit, offline)
- Adds `libs/langchain/tests/unit_tests/chat_models/test_init_chat_model_hf.py`.
- Stubs `transformers.pipeline` and `langchain_huggingface.chat_models.huggingface.ChatHuggingFace` via `monkeypatch.setitem(sys.modules, ...)`.
- Patches `importlib.util.find_spec` so nested modules are treated as installed.
- Passes `task="text-generation"` explicitly to avoid default drift.
- Asserts that `init_chat_model(..., model_provider="huggingface")` raises `TypeError` (since `ChatHuggingFace` is constructed without `llm`):
  - with `max_tokens` present (documenting the `max_tokens → max_new_tokens` expectation)
  - with `timeout`/`max_retries` present (they shouldn't be sent to the pipeline)

Docs
- Adds `docs/docs/integrations/chat/huggingface.mdx` following the required template sections: `transformers.pipeline` → `ChatHuggingFace(llm=...)`.
- Mentions `init_chat_model(..., model_provider="huggingface")` usage, clearly gated as future.
- Covers dependencies (`langchain-huggingface`, `transformers`) and prefers `max_new_tokens` over `max_tokens`.

Issue
Refs #28226
Related: #33167, #32941
Dependencies
None.
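The `max_tokens → max_new_tokens` expectation the tests document could, after the fix, amount to a remapping along these lines. This is a hedged sketch: `remap_hf_kwargs` is a hypothetical helper, not a function in the codebase, and the exact remapping in the eventual fix is an assumption.

```python
def remap_hf_kwargs(kwargs: dict) -> dict:
    """Hypothetical sketch: adapt LangChain-style kwargs to what
    transformers.pipeline accepts, per the expectations the tests document."""
    out = dict(kwargs)
    if "max_tokens" in out:
        # transformers generation uses max_new_tokens, not max_tokens
        out["max_new_tokens"] = out.pop("max_tokens")
    # Client-side options that shouldn't be forwarded to the pipeline
    for client_only in ("timeout", "max_retries"):
        out.pop(client_only, None)
    return out

print(remap_hf_kwargs({"max_tokens": 64, "timeout": 5, "max_retries": 2}))
# {'max_new_tokens': 64}
```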
Test plan
Local (Windows/PowerShell), using the repo's `uv` test group.
Expected: 3 passed (they assert the current `TypeError`), no hard failures.

Lint & CI
Ran `ruff` locally:
py -m pipx run ruff check libs/langchain/tests/unit_tests/chat_models/test_init_chat_model_hf.py --fix
py -m pipx run ruff format libs/langchain/tests/unit_tests/chat_models/test_init_chat_model_hf.py
Notes for reviewers
Tests assert the current behavior (`pytest.raises(TypeError)`) so we can flip them to assert the fixed behavior once the upstream change merges (pipeline created and passed as `llm=...`).
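The before/after flip described here can be sketched with stand-ins. Everything below is hypothetical (`FakeChatHuggingFace`, `buggy_init_chat_model`, `fixed_init_chat_model` are illustrative names, not the real APIs); the sketch only shows why construction without `llm` raises `TypeError` today and what the post-fix assertion would check.

```python
class FakeChatHuggingFace:
    """Stand-in mirroring the real class's requirement that llm be supplied."""
    def __init__(self, llm, **model_kwargs):
        self.llm = llm
        self.model_kwargs = model_kwargs

def buggy_init_chat_model(**kwargs):
    # Current (buggy) path: ChatHuggingFace constructed without llm,
    # so Python raises TypeError for the missing required argument.
    return FakeChatHuggingFace(**kwargs)

try:
    buggy_init_chat_model(max_tokens=64)
    raise AssertionError("expected TypeError")
except TypeError:
    print("raises TypeError, as the tests currently assert")

def fixed_init_chat_model(**kwargs):
    # Post-fix expectation: a pipeline is created first and passed as llm=...
    pipe = object()  # stands in for transformers.pipeline(...)
    return FakeChatHuggingFace(llm=pipe, **kwargs)

model = fixed_init_chat_model(max_new_tokens=64)
assert model.llm is not None
```

Flipping the tests would then mean replacing the `pytest.raises(TypeError)` context with assertions like the final two lines.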