This tutorial shows how to import an OpenAPI specification backend API in JSON format into Azure API Management. For this example, you import the open source [Petstore API](https://petstore3.swagger.io/).
Once you import the backend API into API Management, your API Management API becomes a façade for the backend API. You can customize the façade to your needs in API Management without touching the backend API. For more information, see [Transform and protect your API](transform-api.md).
After import, you can manage the API in the Azure portal.
:::image type="content" source="media/import-and-publish/created-api.png" lightbox="media/import-and-publish/created-api.png" alt-text="Screenshot of a new API in API Management in the portal.":::
## Prerequisites
|Setting|Value|Description|
|-------|-----|-----------|
|**OpenAPI specification**|*https:\//petstore3.swagger.io/api/v3/openapi.json*|Specifies the backend service implementing the API and the operations that the API supports. <br/><br/>The backend service URL appears later as the **Web service URL** on the API's **Settings** page.<br/><br/>After import, you can add, edit, rename, or delete operations in the specification. |
|**Include query parameters in operation templates**| Selected (default) | Specifies whether to import required query parameters in the specification as template parameters in API Management. |
|**Display name**|After you enter the OpenAPI specification URL, API Management fills out this field based on the JSON.|The name displayed in the [developer portal](api-management-howto-developer-portal.md).|
|**Name**|After you enter the OpenAPI specification URL, API Management fills out this field based on the JSON.|A unique name for the API.|
|**Description**|After you enter the OpenAPI specification URL, API Management fills out this field based on the JSON.|An optional description of the API.|
|**URL scheme**|**HTTPS**|Which protocols can access the API.|
|**API URL suffix**|*petstore*|The suffix appended to the base URL for the API Management service. API Management distinguishes APIs by their suffix, so the suffix must be unique for every API for a given publisher.|
|**Tags**||Tags for organizing APIs for searching, grouping, or filtering.|
|**Products**|**Unlimited**|Association of one or more APIs. In certain tiers, an API Management instance comes with two sample products: **Starter** and **Unlimited**. You publish an API in the developer portal by associating the API with a product.<br/><br/> You can include several APIs in a product and offer product [subscriptions](api-management-subscriptions.md) to developers through the developer portal. To add this API to another product, type or select the product name. Repeat this step to add the API to multiple products. You can also add APIs to products later from the **Settings** page.<br/><br/> For more information about products, see [Create and publish a product](api-management-howto-add-products.md).|
|**Gateways**|**Managed**|API gateway(s) that expose the API. This field is available only in **Developer** and **Premium** tier services.<br/><br/>**Managed** indicates the gateway built into the API Management service and hosted by Microsoft in Azure. [Self-hosted gateways](self-hosted-gateway-overview.md) are available only in the Premium and Developer service tiers. You can deploy them on-premises or in other clouds.<br/><br/> If no gateways are selected, the API won't be available and your API requests won't succeed.|
|**Version this API?**|Select or deselect|For more information, see [Publish multiple versions of your API](api-management-get-started-publish-versions.md).|
> [!NOTE]
> To publish the API to API consumers, you must associate it with a product.
1. Select **Create** to create your API.
If you have problems importing an API definition, see the [list of known issues and restrictions](api-management-api-import-restrictions.md).
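Conceptually, the import walks the specification's `paths` object and creates one API Management operation per path-and-method pair. The following sketch shows that mapping on a small inline fragment (illustrative only, not the full Petstore document):

```python
# Sketch: enumerate the operations an OpenAPI document describes.
# The inline spec below is a trimmed, illustrative fragment.
HTTP_METHODS = {"get", "put", "post", "delete", "patch", "head", "options"}

spec = {
    "openapi": "3.0.2",
    "info": {"title": "Swagger Petstore", "version": "1.0"},
    "paths": {
        "/pet/findByStatus": {
            "get": {"operationId": "findPetsByStatus",
                    "summary": "Finds Pets by status"},
        },
        "/pet/{petId}": {
            "get": {"operationId": "getPetById", "summary": "Find pet by ID"},
            "delete": {"operationId": "deletePet", "summary": "Deletes a pet"},
        },
    },
}

def list_operations(spec):
    """Return (method, path, operationId) for every operation in the spec."""
    ops = []
    for path, item in spec.get("paths", {}).items():
        for method, op in item.items():
            if method.lower() in HTTP_METHODS:
                ops.append((method.upper(), path, op.get("operationId")))
    return sorted(ops)
```

Each tuple returned here corresponds to one operation you can later edit, rename, or delete in the portal.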
You can call API operations directly from the Azure portal, which provides a convenient way to view and test the operations. In the portal's test console, by default, APIs are called by using a key from the built-in all-access subscription. You can also test API calls by using a subscription key scoped to a product.
1. In the left navigation of your API Management instance, select **APIs** > **Swagger Petstore**.
1. Select the **Test** tab, and then select **Finds Pets by status**. The page shows the *status* **Query parameter**. Select one of the available values, such as *pending*. You can also add query parameters and headers here.
In the **HTTP request** section, the **Ocp-Apim-Subscription-Key** header is filled in automatically for you, which you can see if you select the "eye" icon.
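The same call works from any HTTP client outside the portal: compose the gateway URL from your instance hostname, the API URL suffix, and the operation path, and pass the key in the `Ocp-Apim-Subscription-Key` header. A minimal sketch (the hostname and key below are placeholders, not values from this tutorial):

```python
def build_apim_request(gateway_host, api_suffix, operation_path,
                       subscription_key, params=None):
    """Compose the URL and headers for calling an API Management operation."""
    url = f"https://{gateway_host}/{api_suffix.strip('/')}{operation_path}"
    headers = {"Ocp-Apim-Subscription-Key": subscription_key}
    return url, headers, params or {}

# Placeholder instance name and key -- replace with your own values.
url, headers, params = build_apim_request(
    "apim-hello-world.azure-api.net",  # <your-instance>.azure-api.net
    "petstore",                        # the API URL suffix chosen at import
    "/pet/findByStatus",
    "<your-subscription-key>",
    params={"status": "pending"},
)
# To actually send the request (requires the `requests` package):
# import requests
# response = requests.get(url, headers=headers, params=params)
```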
`articles/azure-netapp-files/azure-netapp-files-understand-storage-hierarchy.md`
## Large volumes
Azure NetApp Files allows you to create [large volumes](large-volumes.md) up to 1 PiB in size. Large volumes begin at a capacity of 50 TiB and scale up to 1 PiB. Regular Azure NetApp Files volumes are offered between 50 GiB and 102,400 GiB.
For more information, see [Requirements and considerations for large volumes](large-volumes-requirements-considerations.md).
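The size ranges above can be made concrete with a small helper that checks which volume type can hold a requested quota (a sketch; the thresholds come from the ranges stated in this article, and the two ranges overlap between 50 TiB and 100 TiB):

```python
# Classify a requested quota (in GiB) against the Azure NetApp Files
# volume size ranges described in this article.
TIB = 1024          # GiB per TiB
PIB = 1024 * TIB    # GiB per PiB

def volume_types_for(quota_gib):
    """Return the volume types that can satisfy a quota given in GiB."""
    types = []
    if 50 <= quota_gib <= 102_400:          # regular: 50 GiB - 102,400 GiB
        types.append("regular")
    if 50 * TIB <= quota_gib <= 1 * PIB:    # large: 50 TiB - 1 PiB
        types.append("large")
    return types
```

Note that 102,400 GiB equals 100 TiB, so quotas between 50 TiB and 100 TiB fit either volume type.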
- [Performance considerations for Azure NetApp Files](azure-netapp-files-performance-considerations.md)
- [Create a capacity pool](azure-netapp-files-set-up-capacity-pool.md)
- [Manage a manual QoS capacity pool](manage-manual-qos-capacity-pool.md)
- [Understand large volumes](large-volumes.md)
- [Requirements and considerations for large volumes](large-volumes-requirements-considerations.md)
---
title: Understand large volumes in Azure NetApp Files
description: Learn about the benefits, use cases, and requirements for using large volumes in Azure NetApp Files.
services: azure-netapp-files
author: b-ahibbard
ms.service: azure-netapp-files
ms.custom: references_regions
ms.topic: conceptual
ms.date: 10/29/2024
ms.author: anfdocs
---

# Understand large volumes in Azure NetApp Files

Volumes in Azure NetApp Files are the way you present high performance, cost-effective storage to your network attached storage (NAS) clients in the Azure cloud. Volumes act as independent file systems with their own capacity, file counts, ACLs, snapshots, and file system IDs. These qualities provide a way to separate datasets into individual secure tenants.
:::image type="content" source="./media/large-volumes/large-volumes-diagram.png" alt-text="Diagram of large and regular volume size." lightbox="./media/large-volumes/large-volumes-diagram.png":::
All resources in Azure NetApp Files have [limits](azure-netapp-files-resource-limits.md). _Large_ volumes have the following capacity limits:

| Limit | Value |
| - | -- |
| Capacity | <ul><li>50 TiB minimum</li><li>1 PiB maximum (or [2 PiB by special request](azure-netapp-files-resource-limits.md#request-limit-increase))</li></ul> |
In many cases, a regular volume can handle the performance needs of a production workload, particularly for database workloads, general file shares, and Azure VMware Solution or virtual desktop infrastructure (VDI) workloads. When workloads are metadata heavy or require scale beyond what a regular volume can handle, a large volume can meet those needs with minimal cost impact.
For instance, the following graphs show that a large volume can deliver two to three times the performance at scale of a regular volume.
For more information about performance tests, see [Large volume performance benchmarks for Linux](performance-large-volumes-linux.md) and [Regular volume performance benchmarks for Linux](performance-benchmarks-linux.md).
For example, in benchmark tests using Flexible I/O Tester (FIO), a large volume achieved higher I/OPS and throughput than a regular volume.
:::image type="content" source="./media/large-volumes/large-regular-volume-comparison.png" alt-text="Diagram comparing large and regular volumes with random I/O." lightbox="./media/large-volumes/large-regular-volume-comparison.png":::
:::image type="content" source="./media/large-volumes/large-volume-throughput.png" alt-text="Diagram comparing large and regular volumes with sequential I/O." lightbox="./media/large-volumes/large-volume-throughput.png":::
## Workload types and use cases
Regular volumes can handle most workloads. Once capacity, file count, performance, or scale limits are reached, new volumes must be created. This condition adds unnecessary complexity to a solution.
Large volumes allow workloads to extend beyond the current limitations of regular volumes. The following table shows some examples of use cases for each volume type.
| Volume type | Primary use cases |
| - | -- |
| Regular volumes | <ul><li>General file shares</li><li>SAP HANA and databases (Oracle, SQL Server, Db2, and others)</li><li>VDI/Azure VMware Solution</li><li>Capacities less than 50 TiB</li></ul> |
| Large volumes | <ul><li>General file shares</li><li>High file count or high metadata workloads (such as electronic design automation, software development, FSI)</li><li>High capacity workloads (such as AI/ML/LLM, oil &amp; gas, media, healthcare images, backup, and archives)</li><li>Large-scale workloads (many client connections such as FSLogix profiles)</li><li>High performance workloads</li><li>Capacity quotas between 50 TiB and 1 PiB</li></ul> |
## More information
* [Requirements and considerations for large volumes](large-volumes-requirements-considerations.md)
* [Storage hierarchy of Azure NetApp Files](azure-netapp-files-understand-storage-hierarchy.md)
* [Resource limits for Azure NetApp Files](azure-netapp-files-resource-limits.md)