Labels
bug (Something isn't working)
Description
Package
langchain-aws
Checked other resources
- I added a descriptive title to this issue
- I searched the LangChain documentation with the integrated search
- I used the GitHub search to find a similar issue and didn't find it
- I am sure this is a bug and not a question or request for help
Example Code
import os
from langchain_aws import ChatBedrockConverse
llm = ChatBedrockConverse(
    region_name=os.environ["AWS_REGION"],
    aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
    aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
    model_id="eu.amazon.nova-lite-v1:0",
)
llm.invoke("test", additional_model_request_fields={"inferenceConfig": {"topK": 1}})

Error Message and Stack Trace (if applicable)
/Users/user/projects/app/.venv/bin/python /Users/user/projects/app/bug_demo.py
Traceback (most recent call last):
File "/Users/user/projects/app/bug_demo.py", line 13, in <module>
llm.invoke("test", additional_model_request_fields={"inferenceConfig": {"topK": 1}})
~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/user/projects/app/.venv/lib/python3.13/site-packages/langchain_core/language_models/chat_models.py", line 402, in invoke
self.generate_prompt(
~~~~~~~~~~~~~~~~~~~~^
[self._convert_input(input)],
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...<6 lines>...
**kwargs,
^^^^^^^^^
).generations[0][0],
^
File "/Users/user/projects/app/.venv/lib/python3.13/site-packages/langchain_core/language_models/chat_models.py", line 1121, in generate_prompt
return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/user/projects/app/.venv/lib/python3.13/site-packages/langchain_core/language_models/chat_models.py", line 931, in generate
self._generate_with_cache(
~~~~~~~~~~~~~~~~~~~~~~~~~^
m,
^^
...<2 lines>...
**kwargs,
^^^^^^^^^
)
^
File "/Users/user/projects/app/.venv/lib/python3.13/site-packages/langchain_core/language_models/chat_models.py", line 1233, in _generate_with_cache
result = self._generate(
messages, stop=stop, run_manager=run_manager, **kwargs
)
File "/Users/user/projects/app/.venv/lib/python3.13/site-packages/langchain_aws/chat_models/bedrock_converse.py", line 1086, in _generate
_handle_bedrock_error(e)
~~~~~~~~~~~~~~~~~~~~~^^^
File "/Users/user/projects/app/.venv/lib/python3.13/site-packages/langchain_aws/chat_models/bedrock_converse.py", line 1621, in _handle_bedrock_error
raise error
File "/Users/user/projects/app/.venv/lib/python3.13/site-packages/langchain_aws/chat_models/bedrock_converse.py", line 1082, in _generate
response = self.client.converse(
messages=bedrock_messages, system=system, **params
)
File "/Users/user/projects/app/.venv/lib/python3.13/site-packages/botocore/client.py", line 602, in _api_call
return self._make_api_call(operation_name, kwargs)
~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/user/projects/app/.venv/lib/python3.13/site-packages/botocore/context.py", line 123, in wrapper
return func(*args, **kwargs)
File "/Users/user/projects/app/.venv/lib/python3.13/site-packages/botocore/client.py", line 1078, in _make_api_call
raise error_class(parsed_response, operation_name)
botocore.errorfactory.ValidationException: An error occurred (ValidationException) when calling the Converse operation: The model returned the following errors: Malformed input request: #: extraneous key [inference_config] is not permitted, please reformat your input and try again.
Process finished with exit code 1

Description
LangChain turns valid arguments that comply with the expected AWS request schema into invalid ones by forcing all nested keys into snake_case notation: the camelCase key `inferenceConfig` is rewritten to `inference_config`, which Bedrock rejects as an extraneous key.
The code example worked correctly in previous versions. I believe this regression was introduced in #829
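For illustration, a minimal sketch of the kind of recursive key transformation that would produce exactly this failure (this is a hypothetical reconstruction, not the library's actual code — the function names `to_snake_case` and `snake_case_keys` are invented here):

```python
import re

def to_snake_case(name: str) -> str:
    # Insert "_" before each interior uppercase letter, then lowercase.
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()

def snake_case_keys(obj):
    # Recursively rename all nested dict keys to snake_case.
    if isinstance(obj, dict):
        return {to_snake_case(k): snake_case_keys(v) for k, v in obj.items()}
    return obj

# The user-supplied payload already matches the Converse API schema...
payload = {"inferenceConfig": {"topK": 1}}
# ...but after the transformation it no longer does:
print(snake_case_keys(payload))  # {'inference_config': {'top_k': 1}}
```

Because the Converse API expects camelCase keys verbatim, `additional_model_request_fields` should be passed through untouched rather than run through any case normalization.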