@@ -214,17 +214,6 @@ Then create the client:
 
 [!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb?name=create_embed_model_client)]
 
-
-```python
-from langchain_azure_ai.embeddings import AzureAIEmbeddingsModel
-
-embed_model = AzureAIEmbeddingsModel(
-    endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
-    credential=os.environ["AZURE_INFERENCE_CREDENTIAL"],
-    model="text-embedding-3-large",
-)
-```
-
 The following example shows a simple use of an in-memory vector store:
 
 [!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb?name=create_vector_store)]
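Behind any in-memory vector store, similarity search reduces to comparing embedding vectors, most commonly with cosine similarity. A minimal, self-contained sketch of that idea (the character-count `embed` below is a toy stand-in for a real model such as `text-embedding-3-large`, not the `langchain_azure_ai` API):

```python
import math

def embed(text: str) -> list[float]:
    # Toy stand-in for a real embeddings model: letter frequencies.
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine_similarity(a: list[float], b: list[float]) -> float:
    # cos(theta) = (a . b) / (|a| * |b|); 0.0 if either vector is all zeros.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

print(cosine_similarity(embed("foo"), embed("food")))  # high (shared letters)
print(cosine_similarity(embed("foo"), embed("thud")))  # → 0.0 (no overlap)
```

A real embeddings model captures semantic rather than lexical overlap, but the ranking mechanics are the same.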
@@ -234,15 +223,6 @@ Let's add some documents:
 
 [!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb?name=add_documents)]
 
-```python
-from langchain_core.documents import Document
-
-document_1 = Document(id="1", page_content="foo", metadata={"baz": "bar"})
-document_2 = Document(id="2", page_content="thud", metadata={"bar": "baz"})
-
-documents = [document_1, document_2]
-vector_store.add_documents(documents=documents)
-```
-
 
 Let's search by similarity:
 
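For illustration only (this is not the `langchain_core` API), the add-documents-then-search flow in the hunk above can be modeled with a tiny in-memory store; `embed` is again a hypothetical stand-in for a real embeddings model:

```python
import math
from dataclasses import dataclass, field

def embed(text: str) -> list[float]:
    # Hypothetical stand-in for a real embeddings model.
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

@dataclass
class Doc:
    id: str
    page_content: str
    metadata: dict = field(default_factory=dict)

class ToyVectorStore:
    """Keeps (document, vector) pairs and ranks them against a query."""
    def __init__(self) -> None:
        self._rows: list[tuple[Doc, list[float]]] = []

    def add_documents(self, documents: list[Doc]) -> None:
        for doc in documents:
            self._rows.append((doc, embed(doc.page_content)))

    def similarity_search(self, query: str, k: int = 1) -> list[Doc]:
        q = embed(query)
        ranked = sorted(self._rows, key=lambda row: cosine(q, row[1]), reverse=True)
        return [doc for doc, _ in ranked[:k]]

store = ToyVectorStore()
store.add_documents([
    Doc(id="1", page_content="foo", metadata={"baz": "bar"}),
    Doc(id="2", page_content="thud", metadata={"bar": "baz"}),
])
print([d.id for d in store.similarity_search("food", k=1)])  # → ['1']
```

The notebook cells referenced above do the same thing with real LangChain classes and a real embeddings endpoint.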
@@ -302,7 +282,7 @@ You can configure your application to send telemetry to Azure Application Insights
     application_insights_connection_string = "instrumentation...."
     ```
 
-2. Using the Azure AI Foundry SDK and the project connection string ([!INCLUDE [hub-project-name](../../includes/hub-project-name.md)]s only).
+2. Using the Azure AI Foundry SDK and the project connection string (**[!INCLUDE [hub-project-name](../../includes/hub-project-name.md)]s only**).
 
    1. Ensure you have the package `azure-ai-projects` installed in your environment.
 
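An Application Insights connection string is a semicolon-separated list of `Key=Value` pairs (for example `InstrumentationKey=...;IngestionEndpoint=...`). As a small sketch, with placeholder values, pulling individual settings out of one looks like this:

```python
def parse_connection_string(connection_string: str) -> dict[str, str]:
    """Split a 'Key=Value;Key=Value' connection string into a dict."""
    settings = {}
    for pair in connection_string.split(";"):
        if pair:
            key, _, value = pair.partition("=")
            settings[key] = value
    return settings

# Placeholder values; a real connection string comes from the Azure portal
# or from the Azure AI Foundry SDK.
cs = (
    "InstrumentationKey=00000000-0000-0000-0000-000000000000;"
    "IngestionEndpoint=https://example.applicationinsights.azure.com/"
)
print(parse_connection_string(cs)["InstrumentationKey"])
```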