@@ -214,17 +214,6 @@ Then create the client:
 
 [!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb?name=create_embed_model_client)]
 
-
-```python
-from langchain_azure_ai.embeddings import AzureAIEmbeddingsModel
-
-embed_model = AzureAIEmbeddingsModel(
-    endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
-    credential=os.environ['AZURE_INFERENCE_CREDENTIAL'],
-    model="text-embedding-3-large",
-)
-```
-
 The following example shows a simple example using a vector store in memory:
 
 [!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb?name=create_vector_store)]
@@ -234,15 +223,6 @@ Let's add some documents:
 
 [!notebook-python[](~/azureai-samples-main/scenarios/langchain/getting-started-with-langchain-embeddings.ipynb?name=add_documents)]
 
-```python
-from langchain_core.documents import Document
-
-document_1 = Document(id="1", page_content="foo", metadata={"baz": "bar"})
-document_2 = Document(id="2", page_content="thud", metadata={"bar": "baz"})
-
-documents = [document_1, document_2]
-vector_store.add_documents(documents=documents)
-```
 
 Let's search by similarity:
 
@@ -302,7 +282,7 @@ You can configure your application to send telemetry to Azure Application Insights
 application_insights_connection_string = "instrumentation...."
 ```
 
-2. Using the Azure AI Foundry SDK and the project connection string ([!INCLUDE [hub-project-name](../../includes/hub-project-name.md)]s only).
+2. Using the Azure AI Foundry SDK and the project connection string (**[!INCLUDE [hub-project-name](../../includes/hub-project-name.md)]s only**).
 
     1. Ensure you have the package `azure-ai-projects` installed in your environment.
 
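As a rough sketch of what step 2 can look like with the `azure-ai-projects` package: connect to the project with its connection string, then read the Application Insights connection string attached to it. The client and method names below follow the pattern in the beta `azure-ai-projects` SDK and the environment variable name is a placeholder, so verify both against your installed version:

```python
import os

from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

# Connect to the Foundry project (hub-based projects only). The environment
# variable name is illustrative; use wherever you keep your connection string.
project_client = AIProjectClient.from_connection_string(
    credential=DefaultAzureCredential(),
    conn_str=os.environ["PROJECT_CONNECTION_STRING"],
)

# Retrieve the Application Insights connection string configured for the project.
application_insights_connection_string = project_client.telemetry.get_connection_string()
```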