Passing Additional Parameters to _call() Method in Custom LLM Class Doesn't Work as Expected #29307
zhang-boyu asked this question in Q&A (Unanswered)
Checked other resources
Commit to Help
Example Code
Description
I am implementing a custom LLM by subclassing langchain.llms.base.LLM and overriding the _call() method. My goal is to pass additional parameters such as max_tokens to _call(), but the values I provide are not reaching the method. My code is shown above under Example Code.
When running the code, the _call() method logs the following output:
Running _call, prompt: test, max_tokens: 1000, stop: <end>
It seems the max_tokens parameter I passed (max_tokens=100) is not being forwarded to _call() and is defaulting to 1000.
My Questions:
Why is max_tokens=100 not being passed to the _call() method?
How can I modify my code to ensure that the max_tokens value is correctly passed to _call() via kwargs?
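For reference, the behavior in question can be reduced to a dependency-free sketch. `MiniLLM` below is a hypothetical stand-in, not LangChain's actual base class; it only illustrates the general pattern that extras such as `max_tokens` reach `_call()` through `**kwargs`, and that a default (here 1000) fills in whenever they are not forwarded:

```python
from typing import Any, List, Optional

class MiniLLM:
    """Hypothetical stand-in for an LLM base class (NOT LangChain's API).

    invoke() forwards **kwargs down to the subclass's _call(); if a caller's
    extra keyword arguments were dropped at this layer, _call() would only
    ever see its own defaults.
    """

    def invoke(self, prompt: str, stop: Optional[List[str]] = None, **kwargs: Any) -> str:
        return self._call(prompt, stop=stop, **kwargs)

class CustomLLM(MiniLLM):
    def _call(
        self,
        prompt: str,
        stop: Optional[List[str]] = None,
        max_tokens: int = 1000,  # default seen when nothing is forwarded
        **kwargs: Any,
    ) -> str:
        return f"Running _call, prompt: {prompt}, max_tokens: {max_tokens}, stop: {stop}"

llm = CustomLLM()
print(llm.invoke("test", stop=["<end>"]))                   # falls back to the default of 1000
print(llm.invoke("test", stop=["<end>"], max_tokens=100))   # forwarded value: 100
```

In this sketch the forwarded value arrives only because both `invoke()` and `_call()` accept and pass along `**kwargs`; a break anywhere in that chain reproduces the symptom described above, where the default appears in the log instead of the supplied value.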
System Info
LangChain version: 0.2.17
LangChain-Core version: 0.2.43
Python version: 3.8.19
Endpoint setup: The endpoint is a local server receiving POST requests.