@@ -293,7 +293,7 @@ export AZURE_INFERENCE_CREDENTIAL="<your-key-goes-here>"
 
 Then create the client:
 
-[!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb.ipynb ?name=create_embed_model_client)]
+[!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb?name=create_embed_model_client)]
 
 ```python
@@ -308,7 +308,7 @@ embed_model = AzureAIEmbeddingsModel(
 
 The following example shows a simple use of an in-memory vector store:
 
-[!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb.ipynb ?name=create_vector_store)]
+[!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb?name=create_vector_store)]
 
 ```python
 from langchain_core.vectorstores import InMemoryVectorStore
@@ -318,7 +318,7 @@ vector_store = InMemoryVectorStore(embed_model)
 
 Let's add some documents:
 
-[!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb.ipynb ?name=add_documents)]
+[!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb?name=add_documents)]
 
 ```python
 from langchain_core.documents import Document
@@ -332,7 +332,7 @@ vector_store.add_documents(documents=documents)
 
 Let's search by similarity:
 
-[!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb.ipynb ?name=search_similarity)]
+[!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb?name=search_similarity)]
 
 ```python
 results = vector_store.similarity_search(query="thud", k=1)
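
The hunks above walk the docs through the full embeddings flow: create the client, build an in-memory vector store, add documents, and run a similarity search. As a rough, dependency-free sketch of what that flow amounts to under the hood — toy fixed vectors stand in for `AzureAIEmbeddingsModel`, and `ToyVectorStore` is an illustrative stand-in, not the LangChain `InMemoryVectorStore` API:

```python
import math

# Toy embedder standing in for AzureAIEmbeddingsModel: each word maps to a
# fixed 3-dimensional vector, so the example is deterministic and offline.
TOY_VECTORS = {
    "thud": [1.0, 0.0, 0.0],
    "bang": [0.9, 0.1, 0.0],
    "quiet": [0.0, 0.0, 1.0],
}

def embed(text: str) -> list[float]:
    # Unknown words get a default vector orthogonal to everything above.
    return TOY_VECTORS.get(text, [0.0, 1.0, 0.0])

def cosine(a: list[float], b: list[float]) -> float:
    # Cosine similarity: dot product over the product of vector norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

class ToyVectorStore:
    """Minimal in-memory store: keeps (text, vector) pairs, ranks by cosine."""

    def __init__(self) -> None:
        self._docs: list[tuple[str, list[float]]] = []

    def add_documents(self, texts: list[str]) -> None:
        # Embed each document once at insertion time.
        self._docs.extend((text, embed(text)) for text in texts)

    def similarity_search(self, query: str, k: int = 1) -> list[str]:
        # Embed the query, then return the k closest documents by cosine.
        query_vec = embed(query)
        ranked = sorted(self._docs, key=lambda d: cosine(query_vec, d[1]),
                        reverse=True)
        return [text for text, _ in ranked[:k]]

store = ToyVectorStore()
store.add_documents(["bang", "quiet"])
print(store.similarity_search("thud", k=1))  # "bang" is nearest to "thud"
```

The real `InMemoryVectorStore` does essentially this ranking, except the vectors come from the deployed Azure AI embeddings model rather than a lookup table.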