
Commit 4bba848

Apply pencil edits for blocking issues from PR review
1 parent e8fcf17 commit 4bba848

File tree

1 file changed: +3 −3 lines changed


articles/cognitive-services/openai/tutorials/embeddings.md

Lines changed: 3 additions & 3 deletions
@@ -111,11 +111,11 @@ echo export AZURE_OPENAI_ENDPOINT="REPLACE_WITH_YOUR_ENDPOINT_HERE" >> /etc/environment
 
 ---
 
-After setting the environment variables you may need to close and reopen jupyter notebooks or whatever IDE you are using in order for the environment variables to be accessible.
+After setting the environment variables you may need to close and reopen Jupyter notebooks or whatever IDE you are using in order for the environment variables to be accessible.
 
 Run the following code in your preferred Python IDE:
 
-If you wish to view the jupyter notebook that corresponds to this tutorial you can download the tutorial from our [samples repo](https://github.com/Azure-Samples/Azure-OpenAI-Docs-Samples/blob/main/Samples/Tutorials/Embeddings/embedding_billsum.ipynb)
+If you wish to view the Jupyter notebook that corresponds to this tutorial you can download the tutorial from our [samples repo](https://github.com/Azure-Samples/Azure-OpenAI-Docs-Samples/blob/main/Samples/Tutorials/Embeddings/embedding_billsum.ipynb).
 
 ## Import libraries and list models
 
@@ -276,7 +276,7 @@ df_bills
 
 **Output:**
 
-:::image type="content" source="../media/tutorials/tokens-dataframe.png" alt-text="Screenshot of the DataFrame with a new column called `n_tokens`." lightbox="../media/tutorials/tokens-dataframe.png":::
+:::image type="content" source="../media/tutorials/tokens-dataframe.png" alt-text="Screenshot of the DataFrame with a new column called n_tokens." lightbox="../media/tutorials/tokens-dataframe.png":::
 
 To understand the n_tokens column a little more as well how the text is tokenized, it can be helpful to run the following code:
 
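For context on the first hunk: the step it touches has the reader export AZURE_OPENAI_ENDPOINT as an environment variable and then restart Jupyter or their IDE so the value becomes visible before running the tutorial's Python code. A minimal sketch of how a Python session might confirm the variables are picked up; the AZURE_OPENAI_API_KEY name and the check itself are illustrative assumptions, not code quoted from the article:

```python
import os

# Read the values exported in the earlier step. AZURE_OPENAI_ENDPOINT comes
# from the hunk header above; AZURE_OPENAI_API_KEY is an assumed companion name.
endpoint = os.getenv("AZURE_OPENAI_ENDPOINT")
api_key = os.getenv("AZURE_OPENAI_API_KEY")

if not endpoint or not api_key:
    # If either value is missing, the IDE was likely started before the
    # variables were exported -- close and reopen it, then re-run this cell.
    raise RuntimeError("Azure OpenAI environment variables are not set.")

print(f"Using endpoint: {endpoint}")
```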
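For context on the second hunk: n_tokens is the tutorial's per-document token count column, and the sentence preserved in the diff points to follow-up code that shows how the text is tokenized. A small sketch of that idea using tiktoken; the cl100k_base encoding choice and the sample_text placeholder are assumptions here rather than content taken from the commit:

```python
import tiktoken

# Count tokens the same way an n_tokens-style column would
# (assumption: the cl100k_base encoding matches the tutorial's model).
tokenizer = tiktoken.get_encoding("cl100k_base")

sample_text = "SECTION 1. SHORT TITLE. This Act may be cited as the ..."  # placeholder text

token_ids = tokenizer.encode(sample_text)
print(f"n_tokens for this sample: {len(token_ids)}")

# Decode each token id back to its byte fragment to see how the text was split.
for token_id in token_ids[:10]:
    print(token_id, tokenizer.decode_single_token_bytes(token_id))
```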
