articles/search/search-get-started-portal-import-vectors.md
This quickstart helps you get started with integrated vectorization.
### Supported data sources

The **Import and vectorize data** wizard supports the following data sources:

+ [Azure Storage](/azure/storage/common/storage-account-create) for blobs and tables. Azure Storage must be a standard performance (general-purpose v2) account. Access tiers can be hot, cool, or cold.

+ [Azure Data Lake Storage (ADLS) Gen2](/azure/storage/blobs/create-data-lake-storage-account), which is an Azure Storage account with a hierarchical namespace enabled. You can confirm that you have Data Lake Storage by checking the **Properties** tab on the **Overview** page.

:::image type="content" source="media/search-get-started-portal-import-vectors/data-lake-storage.png" alt-text="Screenshot of the storage account properties page showing Data Lake Storage.":::
> This quickstart provides steps for data sources that work with whole files: Azure Blob storage, ADLS Gen2, and OneLake.
### Supported embedding models
Use an embedding model on an Azure AI platform in the same region as Azure AI Search.

| Provider | Supported models |
|---|---|
|[Azure OpenAI Service](https://aka.ms/oai/access)| text-embedding-ada-002, text-embedding-3-large, or text-embedding-3-small. |
|[Azure AI Studio model catalog](/azure/ai-studio/what-is-ai-studio)| Azure, Cohere, and Facebook embedding models. |
|[Azure AI services multi-service account](/azure/ai-services/multi-service-resource)|[Azure AI Vision multimodal](/azure/ai-services/computer-vision/how-to/image-retrieval) for image and text vectorization. Azure AI Vision multimodal is available in selected regions. [Check the documentation](/azure/ai-services/computer-vision/how-to/image-retrieval?tabs=csharp) for an updated list. Depending on how you [attach the multi-service resource](cognitive-search-attach-cognitive-services.md), the account might need to be in the same region as Azure AI Search. |
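When you later define the vector field in your index, you need the output dimensions of the model you pick. As a quick reference, here's a sketch assuming the documented default dimensions for the Azure OpenAI models listed above (the text-embedding-3 models also accept a smaller `dimensions` request parameter):

```python
# Default output dimensions for the Azure OpenAI embedding models listed above.
# Verify against current Azure OpenAI documentation before relying on these.
EMBEDDING_DIMENSIONS = {
    "text-embedding-ada-002": 1536,
    "text-embedding-3-small": 1536,
    "text-embedding-3-large": 3072,
}

def vector_field_dimensions(model: str) -> int:
    """Return the dimensions to declare on the index's vector field."""
    return EMBEDDING_DIMENSIONS[model]

print(vector_field_dimensions("text-embedding-3-large"))  # 3072
```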
If you use the Azure OpenAI Service, the endpoint must have an associated [custom subdomain](/azure/ai-services/cognitive-services-custom-subdomains). A custom subdomain is an endpoint that includes a unique name (for example, `https://hereismyuniquename.cognitiveservices.azure.com`). If the service was created through the Azure portal, this subdomain is automatically generated as part of your service setup. Ensure that your service includes a custom subdomain before using it with the Azure AI Search integration.
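As a quick sanity check, you can inspect the endpoint's hostname before running the wizard. The helper below is a heuristic sketch, not an official validation: it assumes custom subdomains end in `.cognitiveservices.azure.com` or `.openai.azure.com`, while regional endpoints look like `<region>.api.cognitive.microsoft.com`.

```python
from urllib.parse import urlparse

def has_custom_subdomain(endpoint: str) -> bool:
    """Heuristically check whether an endpoint uses a custom subdomain.

    Custom subdomains take the form <unique-name>.cognitiveservices.azure.com
    or <unique-name>.openai.azure.com; regional endpoints look like
    <region>.api.cognitive.microsoft.com.
    """
    host = urlparse(endpoint).hostname or ""
    custom_suffixes = (".cognitiveservices.azure.com", ".openai.azure.com")
    return host.endswith(custom_suffixes)

print(has_custom_subdomain("https://hereismyuniquename.cognitiveservices.azure.com"))  # True
print(has_custom_subdomain("https://westeurope.api.cognitive.microsoft.com"))          # False
```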
For the purposes of this quickstart, all of the preceding resources must have public access enabled.
If private endpoints are already present and you can't disable them, the alternative option is to run the respective end-to-end flow from a script or program on a virtual machine. The virtual machine must be on the same virtual network as the private endpoint. [Here's a Python code sample](https://github.com/Azure/azure-search-vector-samples/tree/main/demo-python/code/integrated-vectorization) for integrated vectorization. The same [GitHub repo](https://github.com/Azure/azure-search-vector-samples/tree/main) has samples in other programming languages.

### Permissions

You can use key authentication and full access connection strings, or Microsoft Entra ID with role assignments. We recommend role assignments for search service connections to other resources.

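The two connection styles differ only in the request header sent to the service. Here's a minimal sketch: the `api-key` and `Authorization: Bearer` header names are the Azure AI Search REST conventions, while the values shown are placeholders you'd substitute with a real admin key or Microsoft Entra ID access token.

```python
def search_headers(api_key=None, entra_token=None):
    """Build request headers for an Azure AI Search REST call.

    Pass api_key for key authentication, or entra_token for role-based
    access with a Microsoft Entra ID bearer token.
    """
    headers = {"Content-Type": "application/json"}
    if api_key:
        headers["api-key"] = api_key  # key authentication
    elif entra_token:
        headers["Authorization"] = f"Bearer {entra_token}"  # role assignments
    return headers

print(search_headers(api_key="<admin-key>"))
print(search_headers(entra_token="<entra-id-access-token>"))
```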
1. On Azure AI Search, [enable roles](search-security-enable-roles.md).
If you're starting with the free service, you're limited to three indexes, data sources, skillsets, and indexers. The Basic tier limits you to 15 of each. Make sure you have room for the extra items before you begin; this quickstart creates one of each object.
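Since the wizard creates one of each object, you can check headroom up front. This is a sketch of the arithmetic only, assuming you've already retrieved the current object counts from your service (for example, via the REST list operations):

```python
# Object limits per the tiers described above: free allows 3 of each,
# Basic allows 15 of each.
TIER_LIMITS = {"free": 3, "basic": 15}

def has_room(tier, counts):
    """counts maps object type -> how many already exist on the service.

    The wizard creates one index, data source, skillset, and indexer,
    so each count plus one must fit within the tier limit.
    """
    limit = TIER_LIMITS[tier]
    return all(counts.get(kind, 0) + 1 <= limit
               for kind in ("indexes", "dataSources", "skillsets", "indexers"))

print(has_room("free", {"indexes": 2, "dataSources": 3, "skillsets": 1, "indexers": 0}))  # False
```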

### Check for semantic ranker

The wizard supports semantic ranking, but only on the Basic tier and higher, and only if semantic ranker is already [enabled on your search service](semantic-how-to-enable-disable.md). If you're using a billable tier, check whether semantic ranker is enabled.

## Prepare sample data

This section points you to content that works for this quickstart.
1. Sign in to the [Azure portal](https://portal.azure.com/) with your Azure account, and go to your Azure Storage account.
1. On the left pane, under **Data Storage**, select **Containers**.
1. Create a new container and then upload the [health-plan PDF documents](https://github.com/Azure-Samples/azure-search-sample-data/tree/main/health-plan) used for this quickstart.
1. Before you leave the lakehouse, copy the URL, or get the workspace and lakehouse IDs, so that you can specify the lakehouse in the wizard. The URL is in this format: `https://msit.powerbi.com/groups/00000000-0000-0000-0000-000000000000/lakehouses/11111111-1111-1111-1111-111111111111?experience=data-engineering`.
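The workspace and lakehouse IDs can be pulled straight out of a URL in that format. A small sketch, using the example URL above:

```python
import re

def parse_lakehouse_url(url):
    """Extract the workspace and lakehouse GUIDs from a lakehouse URL of the
    form .../groups/<workspace-id>/lakehouses/<lakehouse-id>?..."""
    match = re.search(r"/groups/([0-9a-fA-F-]{36})/lakehouses/([0-9a-fA-F-]{36})", url)
    if not match:
        raise ValueError("URL does not contain workspace and lakehouse IDs")
    return match.group(1), match.group(2)

url = ("https://msit.powerbi.com/groups/00000000-0000-0000-0000-000000000000"
       "/lakehouses/11111111-1111-1111-1111-111111111111?experience=data-engineering")
workspace_id, lakehouse_id = parse_lakehouse_url(url)
print(workspace_id)  # 00000000-0000-0000-0000-000000000000
print(lakehouse_id)  # 11111111-1111-1111-1111-111111111111
```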
---
<a name="connect-to-azure-openai"></a>
Support for OneLake indexing is in preview.