@@ -249,12 +249,6 @@ Let's search by similarity:

 [!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb?name=search_similarity)]

-```python
-results = vector_store.similarity_search(query="thud", k=1)
-for doc in results:
-    print(f"* {doc.page_content} [{doc.metadata}]")
-```
-
 ## Using Azure OpenAI models

 If you're using Azure OpenAI models with the `langchain-azure-ai` package, use the following URL:
@@ -277,43 +271,11 @@ First, configure logging to the level you are interested in:

 [!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-chat-models.ipynb?name=configure_logging)]

-```python
-import sys
-import logging
-
-# Acquire the logger for this client library. Use 'azure' to affect both
-# 'azure.core' and 'azure.ai.inference' libraries.
-logger = logging.getLogger("azure")
-
-# Set the desired logging level. logging.INFO or logging.DEBUG are good options.
-logger.setLevel(logging.DEBUG)
-
-# Direct logging output to stdout:
-handler = logging.StreamHandler(stream=sys.stdout)
-# Or direct logging output to a file:
-# handler = logging.FileHandler(filename="sample.log")
-logger.addHandler(handler)
-
-# Optional: change the default logging format. Here we add a timestamp.
-formatter = logging.Formatter("%(asctime)s:%(levelname)s:%(name)s:%(message)s")
-handler.setFormatter(formatter)
-```

 To see the payloads of the requests, when instantiating the client, pass the argument `logging_enable=True` to the `client_kwargs`:

 [!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-chat-models.ipynb?name=create_client_with_logging)]

-```python
-import os
-from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel
-
-model = AzureAIChatCompletionsModel(
-    endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
-    credential=os.environ["AZURE_INFERENCE_CREDENTIAL"],
-    model="mistral-medium-2505",
-    client_kwargs={"logging_enable": True},
-)
-```

 Use the client as usual in your code.

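A note on the logging snippet in the hunk above: its comment says configuring the `azure` logger affects both `azure.core` and `azure.ai.inference`. That works because Python loggers inherit their effective level from ancestor loggers and propagate records to ancestor handlers. A minimal stdlib-only sketch of that behavior (logger names taken from the snippet; no Azure packages required):

```python
import logging
import sys

# Configure only the parent 'azure' logger, as the snippet does.
logger = logging.getLogger("azure")
logger.setLevel(logging.DEBUG)
handler = logging.StreamHandler(stream=sys.stdout)
handler.setFormatter(
    logging.Formatter("%(asctime)s:%(levelname)s:%(name)s:%(message)s")
)
logger.addHandler(handler)

# Child loggers such as 'azure.core' set no level of their own, so their
# effective level is inherited from 'azure', and their records propagate
# up to the handler attached to 'azure'.
child = logging.getLogger("azure.core")
print(child.getEffectiveLevel() == logging.DEBUG)  # True
```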