Commit e6432ec

update throughput values, examples, and storage hierarchy graphic
1 parent 2429816 commit e6432ec

5 files changed, +12 -13 lines changed


articles/azure-netapp-files/azure-netapp-files-service-levels.md

Lines changed: 3 additions & 4 deletions
@@ -5,7 +5,7 @@ services: azure-netapp-files
 author: b-hchen
 ms.service: azure-netapp-files
 ms.topic: conceptual
-ms.date: 01/24/2025
+ms.date: 03/10/2025
 ms.author: anfdocs
 ---
 # Service levels for Azure NetApp Files
@@ -71,8 +71,7 @@ The following diagram illustrates the scenarios for the SAP HANA volumes but wit
 
 :::image type="content" source="./media/azure-netapp-files-service-levels/flexible-service-sap-hana-examples.png" alt-text="Diagram of Flexible service level throughput with SAP HANA volumes." lightbox="./media/azure-netapp-files-service-levels/flexible-service-sap-hana-examples.png":::
 
-
-The example extends to the Flexible service as well. A Flexible service level capacity pool can be used to create the following volumes. Each volume provides the individual size and throughput to meet your application requirements:
+The example extends to the Flexible service level as well. A Flexible service level capacity pool can be used to create the following volumes. Each volume provides the individual size and throughput to meet your application requirements:
 
 - SAP HANA data volume: Size 4 TiB with up to 704 MiB/s
 - SAP HANA log volume: Size 0.5 TiB with up to 256 MiB/s
@@ -87,10 +86,10 @@ As illustrated in the diagram, the SAP HANA backup volume received the 128MiB/s
 | - | - | -- |
 | 1 | 128 | 5 * 128 * 1 = 640 |
 | 2 | 128 | 5 * 128 * 2 = 1,280 |
-| 5 | 128 | 5 * 128 * 5 = 3,200 |
 | 10 | 128 | 5 * 128 * 10 = 6,400 |
 | 50 | 128 | 5 * 128 * 50 = 32,000 |
 | 100 | 128 | 5 * 128 * 100 = 64,000 |
+| 1,024 | 128 | 5 * 128 * 1,024 = 655,360 |
 
 >[!NOTE]
 >A Flexible capacity pool with 128 MiB/s throughput assigned to it is only charged for the allocated capacity.
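For readers checking the arithmetic in the updated table, here is a minimal sketch (Python, not part of the commit) that reproduces the 5 * 128 * N values. Reading the first column as the capacity pool size in TiB and the last column as the resulting maximum pool throughput in MiB/s is an assumption, since the table's header row isn't included in this hunk.

```python
# Minimal sketch of the arithmetic in the diff above; the column meanings
# (pool size in TiB, maximum throughput in MiB/s) are assumed, not taken
# from the table header, which this hunk doesn't show.

BASE_THROUGHPUT_MIBPS = 128  # the "128" column in each table row
MULTIPLIER = 5               # the constant factor used in every row

def flexible_pool_max_throughput(pool_size_tib: int) -> int:
    """Reproduce the table's formula: 5 * 128 * pool size."""
    return MULTIPLIER * BASE_THROUGHPUT_MIBPS * pool_size_tib

for size_tib in (1, 2, 10, 50, 100, 1024):
    print(f"{size_tib:>5} TiB -> {flexible_pool_max_throughput(size_tib):,} MiB/s")
# Prints 640, 1,280, 6,400, 32,000, 64,000, and 655,360 MiB/s,
# matching the rows kept and added in this commit.
```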

articles/azure-netapp-files/azure-netapp-files-set-up-capacity-pool.md

Lines changed: 1 addition & 1 deletion
@@ -5,7 +5,7 @@ services: azure-netapp-files
 author: b-hchen
 ms.service: azure-netapp-files
 ms.topic: how-to
-ms.date: 02/28/2025
+ms.date: 03/10/2025
 ms.author: anfdocs
 ms.custom: references_regions
 ---

articles/azure-netapp-files/large-volumes-requirements-considerations.md

Lines changed: 4 additions & 4 deletions
@@ -6,7 +6,7 @@ author: b-ahibbard
 ms.service: azure-netapp-files
 ms.custom: references_regions
 ms.topic: conceptual
-ms.date: 02/25/2025
+ms.date: 03/10/2025
 ms.author: anfdocs
 ---
 # Requirements and considerations for large volumes
@@ -20,17 +20,17 @@ The following requirements and considerations apply to large volumes. For perfor
 * A regular volume can’t be converted to a large volume.
 * You must create a large volume at a size of 50 TiB or larger. The maximum size of a large volume is 1,024 TiB, though 2-PiB large volumes are available on request depending on regional dedicated capacity availability. To request 2-PiB large volumes, contact your account team.
 * You can't resize a large volume to less than 50 TiB.
-A large volume cannot be resized to more than 30% of its lowest provisioned size. This limit is adjustable via [a support request](azure-netapp-files-resource-limits.md#resource-limits). When requesting the resize, specify the exact size desired size in TiB.
+A large volume cannot be resized to more than 30% of its lowest provisioned size. This limit is adjustable via [a support request](azure-netapp-files-resource-limits.md#resource-limits). When requesting the resize, specify the desired size in TiB.
 * Large volumes are currently not supported with Azure NetApp Files backup.
 * You can't create a large volume with application volume groups.
 * Currently, large volumes aren't suited for database (HANA, Oracle, SQL Server, etc.) data and log volumes. For database workloads requiring more than a single volume’s throughput limit, consider deploying multiple regular volumes. To optimize multiple volume deployments for databases, use [application volume groups](application-volume-group-concept.md).
-* Throughput ceilings for the three performance tiers (Standard, Premium, and Ultra) of large volumes are based on the existing 100-TiB maximum capacity targets. You're able to grow to 1 PiB with the throughput ceiling per the following table:
+* The throughput ceiling for all three performance tiers (Standard, Premium, and Ultra) of large volumes is 12,800 MiB/s. You're able to grow to 1 PiB with the throughput ceiling per the following table:
 
 <table><thead>
 <tr>
 <th></th>
 <th colspan="2">Capacity</th>
-<th colspan="2">Linear performance scaling per TiB up to maximum allowed capacity tier throughput (large volume) </th>
+<th colspan="2">Linear performance scaling per TiB up to maximum 12,800 MiB/s </th>
 </tr></thead>
 <tbody>
 <tr>
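The changed bullet and table header above describe linear per-TiB scaling that is capped at a single 12,800 MiB/s ceiling for all three service levels. The sketch below (Python, not part of the commit) is a rough illustration of that cap; the per-TiB rates of 16, 64, and 128 MiB/s for Standard, Premium, and Ultra are assumed from the generally documented Azure NetApp Files service levels and don't appear in this diff.

```python
# Illustrative only: linear throughput scaling per TiB with a shared
# 12,800 MiB/s ceiling for large volumes. The per-TiB rates are assumptions;
# only the 12,800 MiB/s figure comes from this commit.

LARGE_VOLUME_CEILING_MIBPS = 12_800

THROUGHPUT_PER_TIB_MIBPS = {
    "Standard": 16,   # assumed
    "Premium": 64,    # assumed
    "Ultra": 128,     # assumed
}

def large_volume_throughput_limit(service_level: str, quota_tib: int) -> int:
    """Scale linearly with the volume quota, then cap at the shared ceiling."""
    return min(THROUGHPUT_PER_TIB_MIBPS[service_level] * quota_tib,
               LARGE_VOLUME_CEILING_MIBPS)

print(large_volume_throughput_limit("Ultra", 50))      # 6400 (still in the linear range)
print(large_volume_throughput_limit("Ultra", 1024))    # 12800 (capped at the ceiling)
print(large_volume_throughput_limit("Standard", 1024)) # 12800 (16 * 1,024 also hits the cap)
```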

articles/azure-netapp-files/large-volumes.md

Lines changed: 4 additions & 4 deletions
@@ -6,7 +6,7 @@ author: b-ahibbard
 ms.service: azure-netapp-files
 ms.custom: references_regions
 ms.topic: conceptual
-ms.date: 10/29/2024
+ms.date: 03/10/2025
 ms.author: anfdocs
 ---
 # Understand large volumes in Azure NetApp Files
@@ -21,15 +21,15 @@ All resources in Azure NetApp files have [limits](azure-netapp-files-resource-li
 | - | - |
 | Capacity | <ul><li>50 GiB minimum</li><li>100 TiB maximum</li></ul> |
 | File count | 2,147,483,632 |
-| Performance | <ul><li>Standard: 1,600</li><li>Premium: 1,600</li><li>Ultra: 4,500</li></ul> |
+| Performance (MiB/s) | <ul><li>Standard: 1,600</li><li>Premium: 1,600</li><li>Ultra: 4,500</li><li>Flexible: 4,500</li></ul> |
 
 Large volumes have the following limits:
 
 | Limit type | Values |
 | - | - |
 | Capacity | <ul><li>50 TiB minimum</li><li>1 PiB maximum (or [2 PiB by special request](azure-netapp-files-resource-limits.md#request-limit-increase))</li></ul> |
 | File count | 15,938,355,048 |
-| Performance | <ul><li>Standard: 1,600</li><li>Premium: 6,400</li><li>Ultra: 12,800</li></ul> |
+| Performance | The large volume performance limit is 12,800 MiB/s regardless of service level. |
 
 
 ## Large volumes effect on performance
@@ -53,7 +53,7 @@ Regular volumes can handle most workloads. Once capacity, file count, performanc
 Large volumes allow workloads to extend beyond the current limitations of regular volumes. The following table shows some examples of use cases for each volume type.
 
 | Volume type | Primary use cases |
-| - | -- |
+| - | --- |
 | Regular volumes | <ul><li>General file shares</li><li>SAP HANA and databases (Oracle, SQL Server, Db2, and others)</li><li>VDI/Azure VMware Service</li><li>Capacities less than 50 TiB</li></ul> |
 | Large volumes | <ul><li>General file shares</li><li>High file count or high metadata workloads (such as electronic design automation, software development, financial services)</li><li>High capacity workloads (such as AI/ML/LLP, oil & gas, media, healthcare images, backup, and archives)</li><li>Large-scale workloads (many client connections such as FSLogix profiles)</li><li>High performance workloads</li><li>Capacity quotas between 50 TiB and 1 PiB</li></ul> |
 
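Taken together, the limits and use-case tables updated in this file suggest a simple way to reason about when a workload outgrows a regular volume. The sketch below is a hypothetical helper, not anything from the commit or the product; the limit values are copied from the tables above.

```python
# Hypothetical helper (not from the commit) using the limit values shown in
# the updated tables: regular volumes top out at 100 TiB, ~2.1 billion files,
# and 1,600/1,600/4,500/4,500 MiB/s by service level; large volumes allow up
# to 1 PiB, ~15.9 billion files, and 12,800 MiB/s regardless of service level.

GIB_PER_TIB = 1024

REGULAR_MAX_CAPACITY_GIB = 100 * GIB_PER_TIB
REGULAR_MAX_FILES = 2_147_483_632
REGULAR_MAX_THROUGHPUT_MIBPS = {"Standard": 1_600, "Premium": 1_600,
                                "Ultra": 4_500, "Flexible": 4_500}

def needs_large_volume(capacity_gib: int, files: int,
                       throughput_mibps: int, service_level: str) -> bool:
    """True when any requirement exceeds the regular volume limits above."""
    return (capacity_gib > REGULAR_MAX_CAPACITY_GIB
            or files > REGULAR_MAX_FILES
            or throughput_mibps > REGULAR_MAX_THROUGHPUT_MIBPS[service_level])

# Example: a 200-TiB, high file count EDA share needing 8,000 MiB/s on Ultra
# exceeds the regular volume capacity and throughput limits, so it falls into
# the "Large volumes" column of the use-case table.
print(needs_large_volume(200 * GIB_PER_TIB, 5_000_000_000, 8_000, "Ultra"))  # True
```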
