Merged
Changes from all commits (26 commits):
- 931e7e6 Update pangolin_pipeline.ipynb (RamiyapriyaS-NIH, Apr 16, 2025)
- 2ec2194 Github Action: Lint Notebooks (Apr 16, 2025)
- 1afc4d0 Update pangolin_pipeline.ipynb (RamiyapriyaS-NIH, Apr 16, 2025)
- b2b022e Update pangolin_pipeline.ipynb (RamiyapriyaS-NIH, Apr 16, 2025)
- bb8facc Update pangolin_pipeline.ipynb (RamiyapriyaS-NIH, Apr 16, 2025)
- a8d06df Resized images in microsoft_authenticator.md (RamiyapriyaS-NIH, Apr 16, 2025)
- 2c669be Update KubeFlow_Azure.md (RamiyapriyaS-NIH, Apr 18, 2025)
- 36b14c1 Add files via upload (RamiyapriyaS-NIH, Apr 18, 2025)
- 55bae58 Add files via upload (RamiyapriyaS-NIH, Apr 18, 2025)
- 6017768 Add files via upload (RamiyapriyaS-NIH, Apr 18, 2025)
- 455520c Update Azure_Open_AI_README.md (RamiyapriyaS-NIH, Apr 18, 2025)
- a4d4a6a Add files via upload (RamiyapriyaS-NIH, Apr 18, 2025)
- ec4b89e Add files via upload (RamiyapriyaS-NIH, Apr 18, 2025)
- 89e1177 Add files via upload (RamiyapriyaS-NIH, Apr 18, 2025)
- 5e66ed2 Add files via upload (RamiyapriyaS-NIH, Apr 18, 2025)
- 024a5b8 Update Azure_Open_AI_README.md (RamiyapriyaS-NIH, Apr 18, 2025)
- 25ebcdb Update create_index_from_csv.md (RamiyapriyaS-NIH, Apr 21, 2025)
- 9d0b5e8 Update AzureAIStudio_langchain.ipynb (RamiyapriyaS-NIH, Apr 21, 2025)
- f42f7c8 Update AzureOpenAI_embeddings.ipynb (RamiyapriyaS-NIH, Apr 22, 2025)
- 77d6f43 Add files via upload (RamiyapriyaS-NIH, Apr 23, 2025)
- d4b8964 Add files via upload (RamiyapriyaS-NIH, Apr 23, 2025)
- 469dcd7 Update create_index_from_csv.md (RamiyapriyaS-NIH, Apr 23, 2025)
- 8bc8667 Update AzureAIStudio_index_structured_notebook.ipynb (RamiyapriyaS-NIH, Apr 23, 2025)
- 900ac63 Update AzureAIStudio_index_structured_with_console.ipynb (RamiyapriyaS-NIH, Apr 23, 2025)
- 6928eb9 Update Pubmed_RAG_chatbot.ipynb (RamiyapriyaS-NIH, Apr 23, 2025)
- a41d24a Update SpleenSeg_Pretrained-4_27.ipynb (RamiyapriyaS-NIH, Apr 23, 2025)
68 changes: 38 additions & 30 deletions docs/KubeFlow_Azure.md
@@ -13,52 +13,60 @@
# Azure Setup

To log into Azure from the command line interface, run the following commands
-az login
-az account set --subscription <NAME OR ID OF SUBSCRIPTION>

```
az login
az account set --subscription <NAME OR ID OF SUBSCRIPTION>
```
Create a resource group (if necessary)
-az group create -n <RESOURCE_GROUP_NAME> -l <LOCATION>

```
az group create -n <RESOURCE_GROUP_NAME> -l <LOCATION>
```

Create a specifically defined cluster:
-az aks create -g <RESOURCE_GROUP_NAME> -n <NAME> -s <AGENT_SIZE> -c <AGENT_COUNT> -l <LOCATION> --generate-ssh-keys

```
az aks create -g <RESOURCE_GROUP_NAME> -n <NAME> -s <AGENT_SIZE> -c <AGENT_COUNT> -l <LOCATION> --generate-ssh-keys
```


# KubeFlow installation

Create user credentials. You only need to run this command once.
-az aks get-credentials -n <NAME> -g <RESOURCE_GROUP_NAME>

```
az aks get-credentials -n <NAME> -g <RESOURCE_GROUP_NAME>
```
Download the kfctl v1.2.0 release from the [Kubeflow releases page](https://github.com/kubeflow/kfctl/releases/tag/v1.2.0)

Unpack the tar ball
-tar -xvf kfctl_v1.2.0_<platform>.tar.gz

Unpack the tar ball.
```
tar -xvf kfctl_v1.2.0_<platform>.tar.gz
```
Run the following commands, in order, to set up and deploy Kubeflow. The code below includes an optional command to add the kfctl binary to your path. If you don't add the binary to your path, you must use the full path to the kfctl binary each time you run it.

```
export PATH=$PATH:"<path-to-kfctl>

- export PATH=$PATH:"<path-to-kfctl>

- export KF_NAME=<your choice of name for the Kubeflow deployment>
export KF_NAME=<your choice of name for the Kubeflow deployment>

- export BASE_DIR=<path to a base directory>
export BASE_DIR=<path to a base directory>

- export KF_DIR=${BASE_DIR}/${KF_NAME}
export KF_DIR=${BASE_DIR}/${KF_NAME}

- export CONFIG_URI="https://raw.githubusercontent.com/kubeflow/manifests/v1.2-branch/kfdef/kfctl_k8s_istio.v1.2.0.yaml"
export CONFIG_URI="https://raw.githubusercontent.com/kubeflow/manifests/v1.2-branch/kfdef/kfctl_k8s_istio.v1.2.0.yaml"

- mkdir -p ${KF_DIR}
- cd ${KF_DIR}
- kfctl apply -V -f ${CONFIG_URI}
mkdir -p ${KF_DIR}
cd ${KF_DIR}
kfctl apply -V -f ${CONFIG_URI}
```

Run this command to check that the resources have been deployed correctly in namespace kubeflow:

```
kubectl get all -n kubeflow
```

Run this command to check that the resources have been deployed correctly in namespace kubeflow

- kubectl get all -n kubeflow

Open the KubeFlow Dashboard , the default installation does not create an external endpoint but you can use port-forwarding to visit your cluster. Run the following command

- kubectl port-forward svc/istio-ingressgateway -n istio-system 8080:80

Next, open http://localhost:8080 in your browser.
Open the KubeFlow Dashboard. The default installation does not create an external endpoint, but you can use port-forwarding to visit your cluster. Run the following command:

```
kubectl port-forward svc/istio-ingressgateway -n istio-system 8080:80
```
Next, open http://localhost:8080 in your browser.
23 changes: 8 additions & 15 deletions docs/create_index_from_csv.md
@@ -1,16 +1,12 @@
### Create an Azure search index from a csv file
:sparkles: Here we outline how to create an Azure search index from a CSV file summarizing funded award data exported from Reporter.nih.gov

### 1) Generate input CSV
### 1) Download input CSV
:ear: If you already have your csv ready, skip to section (2)

Our input data comes from the csv export option for [Reporter.nih.gov](https://reporter.nih.gov/). Navigate to reporter.nih.gov and select `Advanced Search`. Input your search parameters. In this case we filtered for awards made by NIGMS in FY 23. In the top right, select `Export`.
Download this public [CSV file](https://www.kaggle.com/datasets/henryshan/2023-data-scientists-salary?resource=download) from Kaggle to use as our input.

Select your export columns and make sure you export as a csv. In the example input data file we only selected 'Title', 'Project_ID', and 'Total_Cost', although a few other columns were also exported.

![Export from Reporter](/docs/images/1_export_reporter_csv.png)

If using the UI to upload, you need to make two small edits to the csv that gets exported. First, remove the extra comma at the end of each line. Second, replace the spaces in column names in the header row. You can do this using something like Python, or just do a find/replace in a text editor.
![Kaggle-csv](/docs/images/kaggle-input.jpeg)

### 2) Import data into Azure blob storage
:ear: If you already added your data to blob storage skip to section (3)
@@ -35,13 +31,13 @@ Navigate to AI Search and [create a new search](https://learn.microsoft.com/en-u

![Create new search](/docs/images/5_create_new_db.png)

Click `Import data`
Click `Import data`.

![Import Data](/docs/images/6_import_data.png)

Now fill out all the necessary parameters.
+ Data Source: Select `Azure Blob Storage`. New options will drop down.
+ Data source name: This can be anything, but go with something like `grant-data`.
+ Data source name: This can be anything, but go with something like `ds-salaries-data`.
+ Data to extract: Select `Content and metadata`.
+ Parsing mode: Select `Delimited text`. Check the `First Line Contains Header` box and leave `Delimiter Character` as `,`.
+ Connection string: Click `Choose an existing connection` and navigate to your storage account and container.
@@ -51,24 +47,21 @@ Now fill out all the necessary parameters.
+ Description: *Optional*.
+ If you get errors when trying to go to the next screen, make sure you don't have trailing commas in your csv, and there are not spaces in the header names. If this happens, fix those errors, re-upload to blob storage, and then try again!

![Connect to blog](/docs/images/7_connect_to_blob.png)
![Connect to blog](/docs/images/import-data.jpeg)

Skip ahead to `Customize target index`.
+ Give your index a name.
+ Make `Project_Number` your key.
+ Make sure the expected column names are present under fields. For the columns you expect to use, select `Retrievable` and `Searchable`. If you select all the columns you will just pay for indexing you are not using.

![Customize index](/docs/images/8_target_index.png)
![Customize index](/docs/images/index-csv.jpeg)

Advance to `Create an indexer`, name your indexer, then click `Submit`.

![Create indexer](/docs/images/9_create_indexer.png)
![Create indexer](/docs/images/create-indexer.jpeg)

Navigate to `Indexes` on the left panel and wait until your index shows as many documents as you have lines in your file. It will read 0 documents until it is finished indexing. The example 500 line csv takes about one minute.

![Check index](/docs/images/10_check_index.png)


And that is it! Now return to [the tutorial notebook to run queries against this csv using GPT-4]( /notebooks/GenAI/notebooks/AzureAIStudio_index_structured_with_console.ipynb).
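For anyone who would rather script these steps than click through the portal, a minimal sketch using the `azure-search-documents` Python SDK is shown below. The endpoint, key, index name, and field names are illustrative assumptions, not values taken from this PR.

```python
# Sketch: create a simple Azure AI Search index from a CSV and upload the rows.
# Assumes: pip install azure-search-documents pandas
# The endpoint, admin key, index name, and column names below are placeholders.
import pandas as pd
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from azure.search.documents.indexes import SearchIndexClient
from azure.search.documents.indexes.models import (
    SearchIndex,
    SimpleField,
    SearchableField,
    SearchFieldDataType,
)

endpoint = "https://<your-search-service>.search.windows.net"
credential = AzureKeyCredential("<your-admin-key>")
index_name = "ds-salaries-index"

# Define only the fields you plan to query, so you are not paying to index unused columns.
fields = [
    SimpleField(name="id", type=SearchFieldDataType.String, key=True),
    SearchableField(name="job_title", type=SearchFieldDataType.String),
    SimpleField(name="salary_in_usd", type=SearchFieldDataType.Int64, filterable=True),
]

index_client = SearchIndexClient(endpoint, credential)
index_client.create_or_update_index(SearchIndex(name=index_name, fields=fields))

# Upload each CSV row as a document; the key field must be a string.
df = pd.read_csv("ds_salaries.csv")
docs = [
    {"id": str(i), "job_title": row["job_title"], "salary_in_usd": int(row["salary_in_usd"])}
    for i, row in df.iterrows()
]
SearchClient(endpoint, index_name, credential).upload_documents(documents=docs)
```

An index created this way can then be queried from the tutorial notebook just like one created through the portal.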


Binary file added docs/images/RM-chat-playground-spaces.jpeg
Binary file added docs/images/RM-hello.jpeg
Binary file added docs/images/RM-parameters.jpeg
Binary file added docs/images/RM_chat-button.jpeg
Binary file added docs/images/RM_chat-playground.jpeg
Binary file added docs/images/RM_create-open-ai-1.jpeg
Binary file added docs/images/RM_gpt-4o-deploy.jpeg
Binary file added docs/images/RM_gpt-4o-selection.jpeg
Binary file added docs/images/create-indexer.jpeg
Binary file added docs/images/import-data.jpeg
Binary file added docs/images/index-csv.jpeg
Binary file added docs/images/kaggle-input.jpeg
24 changes: 15 additions & 9 deletions docs/microsoft_authenticator.md
@@ -1,28 +1,34 @@
# Setting up Microsoft Authenticator on an iPhone

1. Search for and download the Microsoft Authenticator in the App store

   ![Search App Store](/docs/images/Auth_2.png)
1. Search for and download the Microsoft Authenticator in the App store.

<img src="/docs/images/Auth_2.png" alt="Auth_2" width="200" style="text-align:center"/>


2. Accept the privacy statement.

<img src="/docs/images/Auth_3.png" alt="Auth_3" width="200" style="text-align:center"/>

   ![Search App Store](/docs/images/Auth_3.png)

3. Click **Add School or Work Account**

![Search App Store](/docs/images/Auth_4.png)

   <img src="/docs/images/Auth_4.png" alt="Auth_4" width="200" style="text-align:center"/>


4. Select **Scan QR Code** and give permission to access your camera if needed. Scan the QR code that appears on your computer screen.

<img src="/docs/images/Auth_5.png" alt="Auth_5" width="200" style="text-align:center"/>

![Search App Store](/docs/images/Auth_5.png)

5. Now you should see the NIH account on your authenticator list.

<img src="/docs/images/Auth_7.png" alt="Auth_7" width="200" style="text-align:center"/>

![Search App Store](/docs/images/Auth_7.png)

6. Now you can access codes or respond to a numerical prompt from Microsoft Online.

<img src="/docs/images/Auth_x.png" alt="Auth_x" width="200" style="text-align:center"/>

![Search App Store](/docs/images/Auth_x.png)



36 changes: 12 additions & 24 deletions notebooks/GenAI/Azure_Open_AI_README.md
@@ -30,57 +30,45 @@ Navigate to Azure OpenAI. The easiest way is to search at the top of the page.

At the time of writing, Azure OpenAI is in Beta and only available to customers via an application form, if you click **Create** that is the message you will see. If you click **Create** and do not get this message, then feel free to create a new OpenAI Service. Otherwise, please email us at [email protected] and ask us to set this part up for you. Once you have an OpenAI Service provisioned, click to open it.

![click to open azure open ai](/docs/images/2_select_openai_project.png)
![click to open azure open ai](/docs/images/RM_create-open-ai-1.jpeg)

Now click **Go to Azure OpenAI Studio** or **Explore** to be connected to the Azure OpenAI studio user interface.
Now click **Go to Azure OpenAI Studio** or **Explore** to be connected to the Azure OpenAI studio user interface. Click **Chat**.

![connect to OpenAI UI](/docs/images/3_connet_open_ai.png)
![connect to OpenAI UI](/docs/images/RM_chat-button.jpeg)

Click **Chat**

![click chat image](/docs/images/4_click_chat.png)

Next, you need to deploy an OpenAI model.

## Deploy an OpenAI model

On the left navigation panel, click **Models**

![Click Models](/docs/images/10_click_models.png)

Select the (A) `gpt-35-turbo model`, click (B) **Deploy**. You can learn more about the available models by clicking (C) **Learn more about the different types of base models**, or [here](https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/models).
Select **Create new deployment** and **from base models**. Select the model `gpt-4o-mini`. Under **Deployment type**, select **Standard**. You can learn more about the available models by clicking **Learn more about the different types of base models**, or [here](https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/models).

![Deploy the model](/docs/images/11_deploy_model.png)

Name your deployment and then click **Create**.
![Click Models](/docs/images/RM_gpt-4o-selection.jpeg)

![Name your Deployment](/docs/images/12_name_your_deployment.png)
Click **Create**. Now if you select `Deployments` on the left panel, you should see your deployed model listed.

Now if you select `Deployments` on the left panel, you should see your deployed model listed.

![Check Deployments](/docs/images/13_check_deployments.png)
![Name your Deployment](/docs/images/RM_chat-playground.jpeg)

Run a quick test to ensure our deployment is acting as expected. Navigate to `Chat`, add an optional system message (we will cover this more later), and then type `Hello World` in the chat box. If you get a response, things are working well!

![test model](/docs/images/14_test_your_model.png)
![test model](/docs/images/RM-hello.jpeg)
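If you would rather verify the deployment from code instead of the Chat playground, a minimal sketch with the `openai` Python SDK might look like the following. The endpoint, API key, API version, and deployment name are placeholders, not values from this repository.

```python
# Sketch: send a "Hello World" test message to an Azure OpenAI deployment.
# Assumes: pip install openai, and a deployment named "gpt-4o-mini" (rename to match yours).
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com/
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",                            # assumed; use the version your resource supports
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # the deployment name you chose, not necessarily the base model name
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello World"},
    ],
)
print(response.choices[0].message.content)
```

If you get a response here, the deployment is working just as it would in the playground.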

Now we will look at [adding and querying over your own data](#Upload-your-own-data-and-query-over-it) and then review [prompt engineering best practices](#prompt-engineering-best-practices) using a general GPT model.

## Chat Playground Navigation

If you have not already (A) Navigate to the Chat Playground. Here we will walk through the various options available to you. First, you can specify a `System Message` which tells the model what context with which to respond to inquiries. To modify this, (B) select `System message`, then (B) input a [System Message](https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/system-message#define-the-models-profile-capabilities-and-limitations-for-your-scenario) in the prompt box, then (D) click **Save**.
If you have not already, navigate to the Chat Playground. The middle of the page is where you interact with the model (A) through the chat prompts. First, you can specify a message that tells the model the context with which to respond to inquiries. To modify this, enter a prompt in the dialogue box (B).

On the next tab over, you can (A) add your own data, which we dive into in the [next section](#Upload-your-own-data-and-query-over-it). In the middle of the page is where you actually interact with the model (B) through the chat prompts. Always (C) clear the chat after each session.
Always (C) clear the chat after each session. On the next tab over, you can (D) add your own data, which we dive into in the [next section](#Upload-your-own-data-and-query-over-it).

![add your own data](/docs/images/18_add_custom_data.png)
![add your own data](/docs/images/RM-chat-playground-spaces.jpeg)

On the far right, you can modify which model you are deploying, which allows you to switch between different model deployments depending on the context.

![modify deployment](/docs/images/19_deployment.png)

Finally, you can select the `parameters` tab to modify the model parameters. Review [this presentation](/notebooks/GenAI/search_documents/aoai_workshop_content.pdf) to learn more about the parameters.

![modify parameters](/docs/images/20_parameters.png)
![modify parameters](/docs/images/RM-parameters.jpeg)
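The playground parameters map directly onto arguments of the chat completions call, so settings you tune in the UI can be reproduced in code. A short sketch under the same placeholder assumptions as the test sketch above:

```python
# Sketch: the main playground parameters expressed as API arguments.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",  # assumed API version
)

# Each argument below corresponds to a control on the `parameters` tab.
response = client.chat.completions.create(
    model="gpt-4o-mini",      # assumed deployment name
    messages=[{"role": "user", "content": "Summarize what an embedding is in one sentence."}],
    temperature=0.7,          # randomness of the output (lower = more deterministic)
    top_p=0.95,               # nucleus-sampling cutoff
    max_tokens=256,           # cap on the length of the response
    frequency_penalty=0.0,    # penalize tokens in proportion to how often they have appeared
    presence_penalty=0.0,     # penalize tokens that have appeared at all, encouraging new topics
)
print(response.choices[0].message.content)
```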

## Upload your own data and query over it

@@ -166,11 +166,11 @@
" azure_endpoint = os.getenv(\"AZURE_OPENAI_ENDPOINT\")\n",
" )\n",
"\n",
"def generate_embeddings(text, model=\"text-embedding-ada-002\"):\n",
"def generate_embeddings(text, model=\"text-embedding-3-small\"):\n",
" return client.embeddings.create(input = [text], model=model).data[0].embedding\n",
"\n",
"#adding embeddings for job title to get more accurate search results\n",
"df['job_title_vector'] = df['job_title'].apply(lambda x : generate_embeddings (x)) # model should be set to the deployment name you chose when you deployed the text-embedding-ada-002 (Version 2) model"
"df['job_title_vector'] = df['job_title'].apply(lambda x : generate_embeddings (x)) # model should be set to the deployment name you chose when you deployed the text-embedding-3-small model"
]
},
{
@@ -392,7 +392,7 @@
"outputs": [],
"source": [
"endpoint = \"https://{}.search.windows.net/\".format(service_name)\n",
"index_client = SearchIndexClient(endpoint, AzureKeyCredential(index_key))"
"index_client = SearchIndexClient(endpoint, AzureKeyCredential(search_key))"
]
},
{
@@ -610,7 +610,7 @@
"outputs": [],
"source": [
"from azure.search.documents import SearchClient\n",
"search_client = SearchClient(endpoint, index_name, AzureKeyCredential(index_key))"
"search_client = SearchClient(endpoint, index_name, AzureKeyCredential(search_key))"
]
},
{
@@ -741,7 +741,7 @@
"outputs": [],
"source": [
"response = client.chat.completions.create(\n",
" model=\"gpt-4\",\n",
" model=\"gpt-4o\",\n",
" messages=[\n",
" {\"role\": \"system\", \"content\": \"You are a helpful assistant who answers only from the given Context and answers the question from the given Query. If you are asked to count then you must count all of the occurances mentioned.\"},\n",
" {\"role\": \"user\", \"content\": \"Context: \"+ context + \"\\n\\n Query: \" + query}\n",
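Taken together, the snippets in this notebook embed a query, retrieve matching documents from the index, and hand them to the chat model as context. A minimal end-to-end sketch is given below; it reuses `client`, `search_client`, and `generate_embeddings` as defined in the notebook, and the `VectorizedQuery` class (from `azure-search-documents` 11.4+) is an assumption of this sketch, not something shown in the diff.

```python
# Sketch: query the vector index and answer with the deployed chat model.
# Assumes client (AzureOpenAI), search_client, and generate_embeddings exist as in the notebook,
# and that the index has a "job_title" text field and a "job_title_vector" vector field.
from azure.search.documents.models import VectorizedQuery

query = "machine learning roles"
vector_query = VectorizedQuery(
    vector=generate_embeddings(query),  # text-embedding-3-small embedding of the query
    k_nearest_neighbors=5,
    fields="job_title_vector",
)

results = search_client.search(search_text=None, vector_queries=[vector_query])
context = "\n".join(doc["job_title"] for doc in results)

response = client.chat.completions.create(
    model="gpt-4o",  # deployment name assumed to match the base model name
    messages=[
        {"role": "system", "content": "Answer only from the given Context."},
        {"role": "user", "content": "Context: " + context + "\n\n Query: " + query},
    ],
)
print(response.choices[0].message.content)
```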
@@ -188,7 +188,7 @@
"#run query output on model\n",
"search_results = str(list(search_client.search(query)))\n",
"response = client.chat.completions.create(\n",
" model=\"gpt-4\",\n",
" model=\"gpt-4o\",\n",
" messages=[\n",
" {\"role\": \"system\", \"content\": \"You are an NIH Program Officer\"},\n",
" {\"role\": \"user\", \"content\": \"Context: \"+ search_results + \"\\n\\n Query: \" + query}\n",
4 changes: 2 additions & 2 deletions notebooks/GenAI/notebooks/AzureAIStudio_langchain.ipynb
@@ -101,7 +101,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Try loading in your deployed model and running a simple prompt"
"Try loading in your deployed model and running a simple prompt."
]
},
{
@@ -176,7 +176,7 @@
"metadata": {},
"outputs": [],
"source": [
"chain = load_summarize_chain(model, chain_type=\"stuff\")\n",
"chain = load_summarize_chain(llm, chain_type=\"stuff\")\n",
"\n",
"chain.run(pages)"
]
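For context on the fix above (passing the LangChain model object `llm` into `load_summarize_chain`), here is a hedged sketch of how the surrounding pieces might fit together; the deployment name, API version, and PDF path are placeholder assumptions, not values from the notebook.

```python
# Sketch: summarize a PDF with an Azure OpenAI chat model via LangChain.
# Assumes: pip install langchain langchain-openai langchain-community pypdf
import os
from langchain_openai import AzureChatOpenAI
from langchain_community.document_loaders import PyPDFLoader
from langchain.chains.summarize import load_summarize_chain

llm = AzureChatOpenAI(
    azure_deployment="gpt-4o-mini",  # assumed deployment name
    api_version="2024-02-01",        # assumed API version
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
)

# Load the document, split it into pages, and run the "stuff" summarize chain over them.
pages = PyPDFLoader("example.pdf").load_and_split()
chain = load_summarize_chain(llm, chain_type="stuff")
print(chain.run(pages))
```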
4 changes: 2 additions & 2 deletions notebooks/GenAI/notebooks/AzureOpenAI_embeddings.ipynb
@@ -54,7 +54,7 @@
"metadata": {},
"outputs": [],
"source": [
"pip install \"openai\" \"requests\""
"! pip install \"openai\" \"requests\""
]
},
{
@@ -106,7 +106,7 @@
"outputs": [],
"source": [
"# read the data file to be embedded\n",
"df = pd.read_csv('microsoft-earnings.csv')\n",
"df = pd.read_csv('../microsoft-earnings.csv')\n",
"print(df)"
]
},
6 changes: 3 additions & 3 deletions notebooks/GenAI/notebooks/Pubmed_RAG_chatbot.ipynb
@@ -60,7 +60,7 @@
"id": "9dbd13e7-afc9-416b-94dc-418a93e14587",
"metadata": {},
"source": [
"In this tutorial we will be using Azure OpenAI which (if you havent already) you can learn how to deploy [here](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/create-resource?pivots=cli). This tutorial utilizes the model **gpt-35-turbo** version 0301 and the embeddings model **text-embedding-ada-002** version 2."
"In this tutorial we will be using Azure OpenAI which (if you havent already) you can learn how to deploy [here](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/create-resource?pivots=cli). This tutorial utilizes the model **gpt-4o** and the embeddings model **text-embedding-3-small**."
]
},
{
@@ -935,7 +935,7 @@
"id": "10628e98-5486-4222-ad36-52ae4ad3a5c0",
"metadata": {},
"source": [
"For our index we will be adding in a **content_vector** field which represents each chuck embedded. **Embedding** means that we are converting our text into a **numerical vectors** that will help our model find similar objects like documents that hold similar texts or find similar photos based on the numbers assigned to the object, basically capturing texts meaning and relationship through numbers. Depending on the model you choose you have to find an embedder that is compatible to our model. Since we are using a OpenAI model the compatible embedding model will be **text-embedding-ada-002**."
"For our index we will be adding in a **content_vector** field which represents each chuck embedded. **Embedding** means that we are converting our text into a **numerical vectors** that will help our model find similar objects like documents that hold similar texts or find similar photos based on the numbers assigned to the object, basically capturing texts meaning and relationship through numbers. Depending on the model you choose you have to find an embedder that is compatible to our model. Since we are using a OpenAI model the compatible embedding model will be **text-embedding-3-small*."
]
},
{
@@ -970,7 +970,7 @@
"os.environ[\"AZURE_OPENAI_ENDPOINT\"] = \"<Your Azure OpenAI Endpoint>\"\n",
"\n",
"embeddings = AzureOpenAIEmbeddings(\n",
" azure_deployment=\"text-embedding-ada-002\",\n",
" azure_deployment=\"text-embedding-3-small\",\n",
" chunk_size=10, #processing our chunks in batches of 10\n",
")\n",
"embedding_function = embeddings.embed_query"
@@ -393,14 +393,14 @@
"outputs": [],
"source": [
"mmar = {\n",
" RemoteMMARKeys.ID: \"monai_spleen_ct_segmentation\"\",\n",
" RemoteMMARKeys.ID: \"monai_spleen_ct_segmentation\",\n",
" RemoteMMARKeys.NAME: \"monai_spleen_ct_segmentation\",\n",
" RemoteMMARKeys.FILE_TYPE: \"zip\",\n",
" RemoteMMARKeys.HASH_TYPE: \"md5\",\n",
" RemoteMMARKeys.HASH_VAL: None,\n",
" RemoteMMARKeys.MODEL_FILE: os.path.join(\"models\", \"model.pt\"),\n",
" RemoteMMARKeys.CONFIG_FILE: os.path.join(\"configs\", \"train.json\"),\n",
" RemoteMMARKeys.VERSION: \"0.5.3\",\n",
" RemoteMMARKeys.VERSION: \"0.5.3\"\n",
"}"
]
},