@@ -293,7 +293,7 @@ export AZURE_INFERENCE_CREDENTIAL="<your-key-goes-here>"

Then create the client:

- [!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb.ipynb ?name=
+ [!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb?name=
create_embed_model_client)]

```python
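For reference, a minimal sketch of the client creation that the hunk truncates, assuming the `langchain-azure-ai` package, an `AZURE_INFERENCE_ENDPOINT` environment variable alongside the credential exported above, and an illustrative model name; check the package reference for the exact constructor parameters:

```python
import os

from langchain_azure_ai.embeddings import AzureAIEmbeddingsModel

# Build the embeddings client from environment variables. The model name is a
# placeholder; use whichever embeddings deployment your endpoint serves.
embed_model = AzureAIEmbeddingsModel(
    endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
    credential=os.environ["AZURE_INFERENCE_CREDENTIAL"],
    model_name="text-embedding-3-large",
)
```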
@@ -308,7 +308,7 @@ embed_model = AzureAIEmbeddingsModel(

The following example shows how to use a vector store in memory:

- [!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb.ipynb ?name=create_vector_store)]
+ [!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb?name=create_vector_store)]

```python
from langchain_core.vectorstores import InMemoryVectorStore
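A minimal sketch of the full step, assuming the store is built directly from the client created above; `InMemoryVectorStore` embeds documents with `embed_model` as they are added and keeps everything in process memory, so it suits demos rather than production retrieval:

```python
from langchain_core.vectorstores import InMemoryVectorStore

# The store delegates embedding to embed_model and holds vectors in memory.
vector_store = InMemoryVectorStore(embed_model)
```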
@@ -318,7 +318,7 @@ vector_store = InMemoryVectorStore(embed_model)

Let's add some documents:

- [!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb.ipynb ?name=add_documents)]
+ [!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb?name=add_documents)]

```python
from langchain_core.documents import Document
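A sketch of the add step with illustrative documents (the notebook's actual contents and metadata aren't visible in the hunk); `Document` and `add_documents` are standard LangChain APIs, and the ids are optional:

```python
from langchain_core.documents import Document

# Illustrative documents; ids are optional but make later updates easier.
documents = [
    Document(id="1", page_content="foo", metadata={"baz": "bar"}),
    Document(id="2", page_content="thud", metadata={"bar": "baz"}),
]

# Each document is embedded with embed_model and stored in the vector store.
vector_store.add_documents(documents=documents)
```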
@@ -332,7 +332,7 @@ vector_store.add_documents(documents=documents)

Let's search by similarity:

- [!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb.ipynb ?name=search_similarity)]
+ [!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb?name=search_similarity)]

```python
results = vector_store.similarity_search(query="thud", k=1)
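A short sketch of how the results might be inspected; each hit is a `Document`, and the in-memory store also exposes `similarity_search_with_score` if you want the relevance score with each match:

```python
# Each result is a Document; print its content and metadata.
for doc in results:
    print(f"* {doc.page_content} [{doc.metadata}]")

# The scored variant returns (Document, score) pairs.
for doc, score in vector_store.similarity_search_with_score(query="thud", k=1):
    print(f"{score:.3f}: {doc.page_content}")
```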