Merged
30 changes: 15 additions & 15 deletions docs-website/reference/integrations-api/amazon_bedrock.md
Original file line number Diff line number Diff line change
@@ -1135,7 +1135,7 @@ def __init__(
generation_kwargs: Optional[Dict[str, Any]] = None,
streaming_callback: Optional[StreamingCallbackT] = None,
boto3_config: Optional[Dict[str, Any]] = None,
tools: Optional[Union[List[Tool], Toolset]] = None,
tools: Optional[ToolsType] = None,
*,
guardrail_config: Optional[Dict[str, str]] = None) -> None
```
@@ -1168,7 +1168,8 @@ function that handles the streaming chunks. The callback function receives a
[StreamingChunk](https://docs.haystack.deepset.ai/docs/data-classes#streamingchunk) object and switches
the streaming mode on.
- `boto3_config`: The configuration for the boto3 client.
- `tools`: A list of Tool objects or a Toolset that the model can use. Each tool should have a unique name.
- `tools`: A list of Tool and/or Toolset objects, or a single Toolset for which the model can prepare calls.
Each tool should have a unique name.
- `guardrail_config`: Optional configuration for a guardrail that has been created in Amazon Bedrock.
This must be provided as a dictionary matching either
[GuardrailConfiguration](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_GuardrailConfiguration.html).
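The parameter docs above require every tool to have a unique name. A minimal sketch of how such a uniqueness check can be written (plain dicts stand in for Haystack `Tool` objects; this is illustrative, not the library's actual validation code):

```python
def check_unique_tool_names(tools):
    """Raise if two tools share a name (plain dicts stand in for Tool objects)."""
    seen = set()
    for tool in tools:
        name = tool["name"]
        if name in seen:
            raise ValueError(f"Duplicate tool name: {name!r}")
        seen.add(name)


tools = [{"name": "get_weather"}, {"name": "search_docs"}]
check_unique_tool_names(tools)  # passes: all names distinct

try:
    check_unique_tool_names(tools + [{"name": "get_weather"}])
except ValueError as e:
    print(e)  # Duplicate tool name: 'get_weather'
```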
@@ -1227,12 +1228,10 @@ Instance of `AmazonBedrockChatGenerator`.

```python
@component.output_types(replies=List[ChatMessage])
def run(
messages: List[ChatMessage],
streaming_callback: Optional[StreamingCallbackT] = None,
generation_kwargs: Optional[Dict[str, Any]] = None,
tools: Optional[Union[List[Tool], Toolset]] = None
) -> Dict[str, List[ChatMessage]]
def run(messages: List[ChatMessage],
streaming_callback: Optional[StreamingCallbackT] = None,
generation_kwargs: Optional[Dict[str, Any]] = None,
tools: Optional[ToolsType] = None) -> Dict[str, List[ChatMessage]]
```

Executes a synchronous inference call to the Amazon Bedrock model using the Converse API.
@@ -1248,7 +1247,8 @@ Supports both standard and streaming responses depending on whether a streaming
- `stopSequences`: List of stop sequences to stop generation.
- `temperature`: Sampling temperature.
- `topP`: Nucleus sampling parameter.
- `tools`: Optional list of Tools that the model may call during execution.
- `tools`: A list of Tool and/or Toolset objects, or a single Toolset for which the model can prepare calls.
Each tool should have a unique name.

**Raises**:

@@ -1265,11 +1265,10 @@ A dictionary containing the model-generated replies under the `"replies"` key.
```python
@component.output_types(replies=List[ChatMessage])
async def run_async(
messages: List[ChatMessage],
streaming_callback: Optional[StreamingCallbackT] = None,
generation_kwargs: Optional[Dict[str, Any]] = None,
tools: Optional[Union[List[Tool], Toolset]] = None
) -> Dict[str, List[ChatMessage]]
messages: List[ChatMessage],
streaming_callback: Optional[StreamingCallbackT] = None,
generation_kwargs: Optional[Dict[str, Any]] = None,
tools: Optional[ToolsType] = None) -> Dict[str, List[ChatMessage]]
```

Executes an asynchronous inference call to the Amazon Bedrock model using the Converse API.
@@ -1285,7 +1284,8 @@ Designed for use cases where non-blocking or concurrent execution is desired.
- `stopSequences`: List of stop sequences to stop generation.
- `temperature`: Sampling temperature.
- `topP`: Nucleus sampling parameter.
- `tools`: Optional list of Tool objects or a Toolset that the model can use.
- `tools`: A list of Tool and/or Toolset objects, or a single Toolset for which the model can prepare calls.
Each tool should have a unique name.

**Raises**:

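The `ToolsType` annotation introduced in these signatures accepts a list mixing `Tool` and `Toolset` objects, or a single `Toolset`. A hedged sketch of how such a value can be normalized to a flat tool list (the `Tool`/`Toolset` classes below are minimal stand-ins, not the real `haystack.tools` classes):

```python
class Tool:
    """Minimal stand-in for haystack.tools.Tool; only the name is modeled."""
    def __init__(self, name):
        self.name = name


class Toolset:
    """Minimal stand-in for haystack.tools.Toolset: an iterable of Tools."""
    def __init__(self, tools):
        self.tools = list(tools)

    def __iter__(self):
        return iter(self.tools)


def flatten_tools(tools):
    """Normalize ToolsType-like input (None, a single Toolset, or a list
    of Tool/Toolset items) into a flat list of Tools."""
    if tools is None:
        return []
    if isinstance(tools, Toolset):
        return list(tools)
    flat = []
    for item in tools:
        if isinstance(item, Toolset):
            flat.extend(item)
        else:
            flat.append(item)
    return flat


mixed = [Tool("a"), Toolset([Tool("b"), Tool("c")])]
print([t.name for t in flatten_tools(mixed)])  # ['a', 'b', 'c']
```

With the real classes, the same mixed value could be passed straight to the `tools` parameter of the generators above.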
39 changes: 20 additions & 19 deletions docs-website/reference/integrations-api/anthropic.md
@@ -196,7 +196,7 @@ def __init__(api_key: Secret = Secret.from_env_var("ANTHROPIC_API_KEY"),
streaming_callback: Optional[StreamingCallbackT] = None,
generation_kwargs: Optional[Dict[str, Any]] = None,
ignore_tools_thinking_messages: bool = True,
tools: Optional[Union[List[Tool], Toolset]] = None,
tools: Optional[ToolsType] = None,
*,
timeout: Optional[float] = None,
max_retries: Optional[int] = None)
@@ -231,7 +231,8 @@ Supported generation_kwargs parameters are:
`ignore_tools_thinking_messages` is `True`, the generator will drop so-called thinking messages when tool
use is detected. See the Anthropic [tools](https://docs.anthropic.com/en/docs/tool-use#chain-of-thought-tool-use)
for more details.
- `tools`: A list of Tool objects or a Toolset that the model can use. Each tool should have a unique name.
- `tools`: A list of Tool and/or Toolset objects, or a single Toolset, that the model can use.
Each tool should have a unique name.
- `timeout`: Timeout for Anthropic client calls. If not set, it defaults to the default set by the Anthropic client.
- `max_retries`: Maximum number of retries to attempt for failed requests. If not set, it defaults to the default set by
the Anthropic client.
@@ -275,12 +276,10 @@ The deserialized component instance.

```python
@component.output_types(replies=List[ChatMessage])
def run(
messages: List[ChatMessage],
streaming_callback: Optional[StreamingCallbackT] = None,
generation_kwargs: Optional[Dict[str, Any]] = None,
tools: Optional[Union[List[Tool], Toolset]] = None
) -> Dict[str, List[ChatMessage]]
def run(messages: List[ChatMessage],
streaming_callback: Optional[StreamingCallbackT] = None,
generation_kwargs: Optional[Dict[str, Any]] = None,
tools: Optional[ToolsType] = None) -> Dict[str, List[ChatMessage]]
```

Invokes the Anthropic API with the given messages and generation kwargs.
@@ -290,8 +289,9 @@ Invokes the Anthropic API with the given messages and generation kwargs.
- `messages`: A list of ChatMessage instances representing the input messages.
- `streaming_callback`: A callback function that is called when a new token is received from the stream.
- `generation_kwargs`: Optional arguments to pass to the Anthropic generation endpoint.
- `tools`: A list of Tool objects or a Toolset that the model can use. Each tool should
have a unique name. If set, it will override the `tools` parameter set during component initialization.
- `tools`: A list of Tool and/or Toolset objects, or a single Toolset, that the model can use.
Each tool should have a unique name. If set, it will override the `tools` parameter set during component
initialization.

**Returns**:

@@ -305,11 +305,10 @@ A dictionary with the following keys:
```python
@component.output_types(replies=List[ChatMessage])
async def run_async(
messages: List[ChatMessage],
streaming_callback: Optional[StreamingCallbackT] = None,
generation_kwargs: Optional[Dict[str, Any]] = None,
tools: Optional[Union[List[Tool], Toolset]] = None
) -> Dict[str, List[ChatMessage]]
messages: List[ChatMessage],
streaming_callback: Optional[StreamingCallbackT] = None,
generation_kwargs: Optional[Dict[str, Any]] = None,
tools: Optional[ToolsType] = None) -> Dict[str, List[ChatMessage]]
```

Async version of the run method. Invokes the Anthropic API with the given messages and generation kwargs.
@@ -319,8 +318,9 @@ Async version of the run method. Invokes the Anthropic API with the given messag
- `messages`: A list of ChatMessage instances representing the input messages.
- `streaming_callback`: A callback function that is called when a new token is received from the stream.
- `generation_kwargs`: Optional arguments to pass to the Anthropic generation endpoint.
- `tools`: A list of Tool objects or a Toolset that the model can use. Each tool should
have a unique name. If set, it will override the `tools` parameter set during component initialization.
- `tools`: A list of Tool and/or Toolset objects, or a single Toolset, that the model can use.
Each tool should have a unique name. If set, it will override the `tools` parameter set during component
initialization.

**Returns**:

@@ -392,7 +392,7 @@ def __init__(region: str,
None]] = None,
generation_kwargs: Optional[Dict[str, Any]] = None,
ignore_tools_thinking_messages: bool = True,
tools: Optional[List[Tool]] = None,
tools: Optional[ToolsType] = None,
*,
timeout: Optional[float] = None,
max_retries: Optional[int] = None)
@@ -425,7 +425,8 @@ Supported generation_kwargs parameters are:
`ignore_tools_thinking_messages` is `True`, the generator will drop so-called thinking messages when tool
use is detected. See the Anthropic [tools](https://docs.anthropic.com/en/docs/tool-use#chain-of-thought-tool-use)
for more details.
- `tools`: A list of Tool objects that the model can use. Each tool should have a unique name.
- `tools`: A list of Tool and/or Toolset objects, or a single Toolset, that the model can use.
Each tool should have a unique name.
- `timeout`: Timeout for Anthropic client calls. If not set, it defaults to the default set by the Anthropic client.
- `max_retries`: Maximum number of retries to attempt for failed requests. If not set, it defaults to the default set by
the Anthropic client.
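Both `run` and `run_async` note that a `tools` argument passed at call time overrides the value set during component initialization. The selection rule can be sketched as follows (illustrative, not the component's actual internals):

```python
def select_tools(init_tools, runtime_tools):
    """Runtime tools take precedence when explicitly provided;
    otherwise fall back to the tools given at initialization."""
    return runtime_tools if runtime_tools is not None else init_tools


init = ["tool_a"]                      # stands in for tools set in __init__
print(select_tools(init, None))        # ['tool_a']  (falls back to init)
print(select_tools(init, ["tool_b"]))  # ['tool_b']  (runtime override wins)
print(select_tools(init, []))          # []  (an explicit empty list also overrides)
```

Under this sketch, only `None` means "not provided"; any explicit value, including an empty list, replaces the init-time tools for that call.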
17 changes: 9 additions & 8 deletions docs-website/reference/integrations-api/cohere.md
@@ -745,7 +745,7 @@ def __init__(api_key: Secret = Secret.from_env_var(
streaming_callback: Optional[StreamingCallbackT] = None,
api_base_url: Optional[str] = None,
generation_kwargs: Optional[Dict[str, Any]] = None,
tools: Optional[Union[List[Tool], Toolset]] = None,
tools: Optional[ToolsType] = None,
**kwargs: Any)
```

@@ -770,7 +770,8 @@ Some of the parameters are:
`accurate` results or `fast` results.
- 'temperature': A non-negative float that tunes the degree of randomness in generation. Lower temperatures
mean less random generations.
- `tools`: A list of Tool objects or a Toolset that the model can use. Each tool should have a unique name.
- `tools`: A list of Tool and/or Toolset objects, or a single Toolset that the model can use.
Each tool should have a unique name.

<a id="haystack_integrations.components.generators.cohere.chat.chat_generator.CohereChatGenerator.to_dict"></a>

@@ -814,7 +815,7 @@ Deserialized component.
def run(
messages: List[ChatMessage],
generation_kwargs: Optional[Dict[str, Any]] = None,
tools: Optional[Union[List[Tool], Toolset]] = None,
tools: Optional[ToolsType] = None,
streaming_callback: Optional[StreamingCallbackT] = None
) -> Dict[str, List[ChatMessage]]
```
@@ -828,8 +829,8 @@ Invoke the chat endpoint based on the provided messages and generation parameter
potentially override the parameters passed in the __init__ method.
For more details on the parameters supported by the Cohere API, refer to the
Cohere [documentation](https://docs.cohere.com/reference/chat).
- `tools`: A list of tools or a Toolset for which the model can prepare calls. If set, it will override
the `tools` parameter set during component initialization.
- `tools`: A list of Tool and/or Toolset objects, or a single Toolset for which the model can prepare calls.
If set, it will override the `tools` parameter set during component initialization.
- `streaming_callback`: A callback function that is called when a new token is received from the stream.
The callback function accepts StreamingChunk as an argument.

@@ -847,7 +848,7 @@ A dictionary with the following keys:
async def run_async(
messages: List[ChatMessage],
generation_kwargs: Optional[Dict[str, Any]] = None,
tools: Optional[Union[List[Tool], Toolset]] = None,
tools: Optional[ToolsType] = None,
streaming_callback: Optional[StreamingCallbackT] = None
) -> Dict[str, List[ChatMessage]]
```
@@ -861,8 +862,8 @@ Asynchronously invoke the chat endpoint based on the provided messages and gener
potentially override the parameters passed in the __init__ method.
For more details on the parameters supported by the Cohere API, refer to the
Cohere [documentation](https://docs.cohere.com/reference/chat).
- `tools`: A list of tools for which the model can prepare calls. If set, it will override
the `tools` parameter set during component initialization.
- `tools`: A list of Tool and/or Toolset objects, or a single Toolset for which the model can prepare calls.
If set, it will override the `tools` parameter set during component initialization.
- `streaming_callback`: A callback function that is called when a new token is received from the stream.

**Returns**:
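`run`'s `generation_kwargs` "potentially override the parameters passed in the `__init__` method"; the usual pattern for this is a shallow dict merge in which runtime keys win. A sketch under that assumption (not the actual Cohere client code):

```python
def merge_generation_kwargs(init_kwargs, runtime_kwargs):
    """Shallow-merge generation kwargs; runtime keys win over init-time keys."""
    merged = dict(init_kwargs or {})
    merged.update(runtime_kwargs or {})
    return merged


init_kwargs = {"temperature": 0.3, "max_tokens": 256}
runtime_kwargs = {"temperature": 0.9}
print(merge_generation_kwargs(init_kwargs, runtime_kwargs))
# {'temperature': 0.9, 'max_tokens': 256}
```

Note the merge is shallow: unspecified init-time keys (`max_tokens` here) survive, while any key passed at call time replaces its init-time value wholesale.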
26 changes: 13 additions & 13 deletions docs-website/reference/integrations-api/google_genai.md
@@ -131,7 +131,7 @@ def __init__(*,
generation_kwargs: Optional[Dict[str, Any]] = None,
safety_settings: Optional[List[Dict[str, Any]]] = None,
streaming_callback: Optional[StreamingCallbackT] = None,
tools: Optional[Union[List[Tool], Toolset]] = None)
tools: Optional[ToolsType] = None)
```

Initialize a GoogleGenAIChatGenerator instance.
@@ -156,7 +156,8 @@ For Gemini 2.5 series, supports `thinking_budget` to configure thinking behavior
- Positive integer: Set explicit budget
- `safety_settings`: Safety settings for content filtering
- `streaming_callback`: A callback function that is called when a new token is received from the stream.
- `tools`: A list of Tool objects or a Toolset that the model can use. Each tool should have a unique name.
- `tools`: A list of Tool and/or Toolset objects, or a single Toolset for which the model can prepare calls.
Each tool should have a unique name.

<a id="haystack_integrations.components.generators.google_genai.chat.chat_generator.GoogleGenAIChatGenerator.to_dict"></a>

@@ -201,7 +202,7 @@ def run(messages: List[ChatMessage],
generation_kwargs: Optional[Dict[str, Any]] = None,
safety_settings: Optional[List[Dict[str, Any]]] = None,
streaming_callback: Optional[StreamingCallbackT] = None,
tools: Optional[Union[List[Tool], Toolset]] = None) -> Dict[str, Any]
tools: Optional[ToolsType] = None) -> Dict[str, Any]
```

Run the Google Gen AI chat generator on the given input data.
@@ -215,8 +216,8 @@ the default config. Supports `thinking_budget` for Gemini 2.5 series thinking co
default settings.
- `streaming_callback`: A callback function that is called when a new token is
received from the stream.
- `tools`: A list of Tool objects or a Toolset that the model can use. If provided, it will
override the tools set during initialization.
- `tools`: A list of Tool and/or Toolset objects, or a single Toolset for which the model can prepare calls.
If provided, it will override the tools set during initialization.

**Raises**:

@@ -235,12 +236,11 @@

```python
@component.output_types(replies=List[ChatMessage])
async def run_async(
messages: List[ChatMessage],
generation_kwargs: Optional[Dict[str, Any]] = None,
safety_settings: Optional[List[Dict[str, Any]]] = None,
streaming_callback: Optional[StreamingCallbackT] = None,
tools: Optional[Union[List[Tool], Toolset]] = None) -> Dict[str, Any]
async def run_async(messages: List[ChatMessage],
generation_kwargs: Optional[Dict[str, Any]] = None,
safety_settings: Optional[List[Dict[str, Any]]] = None,
streaming_callback: Optional[StreamingCallbackT] = None,
tools: Optional[ToolsType] = None) -> Dict[str, Any]
```

Async version of the run method. Run the Google Gen AI chat generator on the given input data.
@@ -255,8 +255,8 @@ See https://ai.google.dev/gemini-api/docs/thinking for possible values.
default settings.
- `streaming_callback`: A callback function that is called when a new token is
received from the stream.
- `tools`: A list of Tool objects or a Toolset that the model can use. If provided, it will
override the tools set during initialization.
- `tools`: A list of Tool and/or Toolset objects, or a single Toolset for which the model can prepare calls.
If provided, it will override the tools set during initialization.

**Raises**:

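Each generator in this diff accepts a `streaming_callback` that is invoked once per received chunk. A minimal sketch of a callback object that accumulates chunk content (the `StreamingChunk` class below is a stand-in for Haystack's data class; only its `content` attribute is modeled):

```python
class StreamingChunk:
    """Stand-in for haystack.dataclasses.StreamingChunk; only `content` is modeled."""
    def __init__(self, content):
        self.content = content


class ChunkCollector:
    """Callable that collects streamed chunk content and exposes the text so far."""
    def __init__(self):
        self.parts = []

    def __call__(self, chunk):
        # The generator invokes this once per streamed chunk.
        self.parts.append(chunk.content)

    @property
    def text(self):
        return "".join(self.parts)


callback = ChunkCollector()
for piece in ("Hel", "lo, ", "world"):
    callback(StreamingChunk(piece))
print(callback.text)  # Hello, world
```

Any callable with this shape could be passed as `streaming_callback`, either at initialization or per `run`/`run_async` call.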