Conversation

kristapratico (Owner)

Example:

```python
import openai
from openai.types.chat import ChatCompletionSystemMessageParam, ChatCompletionUserMessageParam
from openai.lib.azure_types import (
    AzureCognitiveSearchChatExtensionConfiguration,
    AzureCognitiveSearchChatExtensionParameters,
)

client = openai.AzureOpenAI()

messages = [
    ChatCompletionSystemMessageParam(role="system", content="You are a helpful assistant."),
    ChatCompletionUserMessageParam(role="user", content="How is Azure machine learning different than Azure OpenAI?"),
]

response = client.chat.completions.with_azure_options(
    data_sources=[
        AzureCognitiveSearchChatExtensionConfiguration(
            type="AzureCognitiveSearch",
            parameters=AzureCognitiveSearchChatExtensionParameters(
                index_name="openai-test-index-carbon-wiki",
                endpoint="endpoint",
            ),
        )
    ]
).create(
    model="gpt-4",
    messages=messages,
)
print(response)
```

Intellisense:

[screenshot]

Note: You can still use response helpers like `with_raw_response` and `with_streaming_response`, but order matters; they must be called after `with_azure_options(...)`:

```python
response = client.chat.completions.with_azure_options(
    data_sources=[
        AzureCognitiveSearchChatExtensionConfiguration(
            type="AzureCognitiveSearch",
            parameters=AzureCognitiveSearchChatExtensionParameters(
                index_name="openai-test-index-carbon-wiki",
                endpoint="endpoint",
            ),
        )
    ]
).with_raw_response.create(
    model="gpt-4",
    messages=messages,
)
```
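The ordering constraint follows naturally from the chaining pattern: `with_azure_options(...)` returns a small proxy object that carries the Azure-only parameters and re-exposes `create` and the response helpers. Below is a minimal, self-contained sketch of that pattern; the class and method names here are illustrative only, not the actual SDK implementation:

```python
from dataclasses import dataclass, field
from typing import Any


@dataclass
class _AzureOptionsProxy:
    """Illustrative proxy: holds Azure-only parameters and forwards create()."""

    completions: "Completions"
    azure_extras: dict[str, Any] = field(default_factory=dict)

    def create(self, **kwargs: Any) -> dict[str, Any]:
        # Merge the stored Azure-only fields into the outgoing request body.
        return self.completions._request({**kwargs, "extra_body": self.azure_extras})

    @property
    def with_raw_response(self) -> "_AzureOptionsProxy":
        # Response helpers must come after with_azure_options(...):
        # only this proxy knows about the Azure extras, so calling a helper
        # first would drop them. (Sketch: returns self instead of wrapping.)
        return self


class Completions:
    def with_azure_options(self, **azure_extras: Any) -> _AzureOptionsProxy:
        return _AzureOptionsProxy(self, azure_extras)

    def _request(self, body: dict[str, Any]) -> dict[str, Any]:
        # Stand-in for the actual HTTP call; returns the body it would send.
        return body


completions = Completions()
body = completions.with_azure_options(
    data_sources=[{"type": "AzureCognitiveSearch"}]
).with_raw_response.create(model="gpt-4", messages=[])
print(body["extra_body"])
```

The sketch shows why reversing the order cannot work: a helper invoked directly on `completions` would never see the proxy's stored `azure_extras`.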
