@@ -249,12 +249,6 @@ Let's search by similarity:
 
 [!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb?name=search_similarity)]
 
- ```python
- results = vector_store.similarity_search(query="thud", k=1)
- for doc in results:
-     print(f"* {doc.page_content} [{doc.metadata}]")
- ```
-
 ## Using Azure OpenAI models
 
 If you're using Azure OpenAI models with the `langchain-azure-ai` package, use the following URL:
@@ -277,43 +271,11 @@ First, configure logging to the level you are interested in:
 
 [!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-chat-models.ipynb?name=configure_logging)]
 
- ```python
- import sys
- import logging
-
- # Acquire the logger for this client library. Use 'azure' to affect both
- # 'azure.core' and 'azure.ai.inference' libraries.
- logger = logging.getLogger("azure")
-
- # Set the desired logging level. logging.INFO or logging.DEBUG are good options.
- logger.setLevel(logging.DEBUG)
-
- # Direct logging output to stdout:
- handler = logging.StreamHandler(stream=sys.stdout)
- # Or direct logging output to a file:
- # handler = logging.FileHandler(filename="sample.log")
- logger.addHandler(handler)
-
- # Optional: change the default logging format. Here we add a timestamp.
- formatter = logging.Formatter("%(asctime)s:%(levelname)s:%(name)s:%(message)s")
- handler.setFormatter(formatter)
- ```
 
 To see the payloads of the requests, pass the argument `logging_enable=True` in `client_kwargs` when instantiating the client:
 
 [!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-chat-models.ipynb?name=create_client_with_logging)]
 
- ```python
- import os
- from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel
-
- model = AzureAIChatCompletionsModel(
-     endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
-     credential=os.environ["AZURE_INFERENCE_CREDENTIAL"],
-     model="mistral-medium-2505",
-     client_kwargs={"logging_enable": True},
- )
- ```
 
 Use the client as usual in your code.
 