
ChatBedrockConverse: passing empty inferenceConfig object and empty system array causes Bedrock ValidationException when using Prompt Management #9310

@nityam-dixit-at-quicko


Checked other resources

  • This is a bug, not a usage question. For questions, please use the LangChain Forum (https://forum.langchain.com/).
  • I added a very descriptive title to this issue.
  • I searched the LangChain.js documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain.js rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

According to the AWS documentation, if you use a prompt from Prompt Management, you may not include inferenceConfig, system, additionalModelRequestFields, or toolConfig in the Converse request; doing so is treated as a forbidden “override.”

import { ChatBedrockConverse } from "@langchain/aws";

const llm = new ChatBedrockConverse({
  // Important: this is a Prompt Management ARN, not a base model id
  model: "arn:aws:bedrock:ap-south-1:123456789012:prompt/abc123:1",
  // temperature, maxTokens, and system messages are intentionally left unset
  // (undefined / absent)
});

await llm.invoke("Ping"); // or await llm.stream("Ping");

Error Message and Stack Trace (if applicable)

D:\developer\repository\experimental\genai-adapter-v-90-final\node_modules\@aws-sdk\client-bedrock-runtime\dist-cjs\index.js:1829
    const exception = new ValidationException({
                      ^
ValidationException: Conflict encountered. Overriding 'inferenceConfig' during runtime is not yet supported
    at de_ValidationExceptionRes (D:\developer\repository\experimental\genai-adapter-v-90-final\node_modules\@aws-sdk\client-bedrock-runtime\dist-cjs\index.js:1829:23)
    at de_CommandError (D:\developer\repository\experimental\genai-adapter-v-90-final\node_modules\@aws-sdk\client-bedrock-runtime\dist-cjs\index.js:1649:25)
    at processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async D:\developer\repository\experimental\genai-adapter-v-90-final\node_modules\@smithy\middleware-serde\dist-cjs\index.js:8:24
    at async D:\developer\repository\experimental\genai-adapter-v-90-final\node_modules\@smithy\core\dist-cjs\index.js:121:20
    at async D:\developer\repository\experimental\genai-adapter-v-90-final\node_modules\@smithy\middleware-retry\dist-cjs\index.js:254:46
    at async D:\developer\repository\experimental\genai-adapter-v-90-final\node_modules\@aws-sdk\middleware-logger\dist-cjs\index.js:5:26
    at async ChatBedrockConverse._streamResponseChunks (D:\developer\repository\experimental\genai-adapter-v-90-final\node_modules\@langchain\aws\src\chat_models.ts:923:26)
    at async ChatBedrockConverse._generate (D:\developer\repository\experimental\genai-adapter-v-90-final\node_modules\@langchain\aws\src\chat_models.ts:839:24)
    at async D:\developer\repository\experimental\genai-adapter-v-90-final\node_modules\@langchain\core\src\language_models\chat_models.ts:529:35 {        
  '$fault': 'client',
  '$response': HttpResponse {
    statusCode: 400,
    reason: undefined,
    headers: {
      ':status': 400,
      date: 'Mon, 03 Nov 2025 11:59:23 GMT',
      'content-type': 'application/json',
      'content-length': '100',
      'x-amzn-requestid': '7ce86070-cba5-4f82-8a99-f99aa06a922e',
      'x-amzn-errortype': 'ValidationException:http://internal.amazon.com/coral/com.amazon.bedrock/'
    },
    body: ClientHttp2Stream {
      _events: [Object],
      _readableState: [ReadableState],
      _writableState: [WritableState],
      allowHalfOpen: true,
      _maxListeners: undefined,
      _eventsCount: 11,
      [Symbol(shapeMode)]: true,
      [Symbol(kCapture)]: false,
      [Symbol(async_id_symbol)]: 31,
      [Symbol(kSession)]: [ClientHttp2Session],
      [Symbol(timeout)]: null,
      [Symbol(state)]: [Object],
      [Symbol(request)]: null,
      [Symbol(proxySocket)]: null,
      [Symbol(sent-headers)]: [Object: null prototype],
      [Symbol(origin)]: 'https://bedrock-runtime.ap-south-1.amazonaws.com',
      [Symbol(requestAsyncResource)]: [AsyncResource],
      [Symbol(id)]: 1,
      [Symbol(kHandle)]: [Http2Stream]
    }
  },
  '$retryable': undefined,
  '$metadata': {
    httpStatusCode: 400,
    requestId: '7ce86070-cba5-4f82-8a99-f99aa06a922e',
    extendedRequestId: undefined,
    cfId: undefined,
    attempts: 1,
    totalRetryDelay: 0
  }
}

Description

When ChatBedrockConverse is used with an Amazon Bedrock Prompt Management prompt (i.e., model is a prompt ARN), two runtime fields are still sent even when they are effectively empty:

  • an inferenceConfig object whose properties (maxTokens, temperature, etc.) are all undefined, but the object itself is present;
  • a system field that is an empty array when there are no system messages.

Because Bedrock forbids overriding these fields at runtime for Prompt Management prompts, the request is rejected with the ValidationException shown above. A sketch of the conditional omission I would expect is below.
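
A minimal sketch of the guard I would expect when the model id is a Prompt Management ARN (pruneEmptyConverseFields is a hypothetical helper for illustration, not the actual @langchain/aws internals):

import type { ConverseCommandInput } from "@aws-sdk/client-bedrock-runtime";

// Illustrative helper: omit inferenceConfig when none of its properties are set,
// and omit system when the array is empty, so nothing is "overridden" at runtime.
function pruneEmptyConverseFields(input: ConverseCommandInput): ConverseCommandInput {
  const pruned = { ...input };

  if (
    pruned.inferenceConfig &&
    Object.values(pruned.inferenceConfig).every((v) => v === undefined)
  ) {
    delete pruned.inferenceConfig;
  }

  if (Array.isArray(pruned.system) && pruned.system.length === 0) {
    delete pruned.system;
  }

  return pruned;
}

The same check would presumably need to apply to the streaming path, since ConverseStreamCommandInput carries the same inferenceConfig and system fields.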

System Info

Node version: v22.17.1
@langchain/aws version: ^1.0.0
Platform: Windows 10 (win32 10.0.19045)
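
As a temporary workaround, the empty fields can be stripped from the serialized request with AWS SDK middleware before the request is signed. This is only a sketch: it assumes the ChatBedrockConverse instance exposes its underlying BedrockRuntimeClient as a client property, and that the Converse body is a JSON string at the "build" middleware step; neither is a supported or documented API.

import { ChatBedrockConverse } from "@langchain/aws";
import { HttpRequest } from "@smithy/protocol-http";

const llm = new ChatBedrockConverse({
  model: "arn:aws:bedrock:ap-south-1:123456789012:prompt/abc123:1",
});

// Assumption: `llm.client` is the underlying BedrockRuntimeClient.
llm.client.middlewareStack.add(
  (next) => async (args: any) => {
    const request = args.request;
    if (HttpRequest.isInstance(request) && typeof request.body === "string") {
      const body = JSON.parse(request.body);
      // Drop the "empty" fields that Bedrock rejects for Prompt Management ARNs.
      if (body.inferenceConfig && Object.keys(body.inferenceConfig).length === 0) {
        delete body.inferenceConfig;
      }
      if (Array.isArray(body.system) && body.system.length === 0) {
        delete body.system;
      }
      request.body = JSON.stringify(body);
      // Keep Content-Length consistent with the modified body.
      request.headers["content-length"] = String(Buffer.byteLength(request.body));
    }
    return next(args);
  },
  { step: "build", name: "stripEmptyPromptOverrides" }
);

await llm.invoke("Ping");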

Labels: bug