`articles/azure-netapp-files/azure-netapp-files-service-levels.md` (3 additions, 4 deletions)

```diff
@@ -5,7 +5,7 @@ services: azure-netapp-files
 author: b-hchen
 ms.service: azure-netapp-files
 ms.topic: conceptual
-ms.date: 01/24/2025
+ms.date: 03/10/2025
 ms.author: anfdocs
 ---
 # Service levels for Azure NetApp Files
@@ -71,8 +71,7 @@ The following diagram illustrates the scenarios for the SAP HANA volumes but wit
 
 :::image type="content" source="./media/azure-netapp-files-service-levels/flexible-service-sap-hana-examples.png" alt-text="Diagram of Flexible service level throughput with SAP HANA volumes." lightbox="./media/azure-netapp-files-service-levels/flexible-service-sap-hana-examples.png":::
 
-
-The example extends to the Flexible service as well. A Flexible service level capacity pool can be used to create the following volumes. Each volume provides the individual size and throughput to meet your application requirements:
+The example extends to the Flexible service level as well. A Flexible service level capacity pool can be used to create the following volumes. Each volume provides the individual size and throughput to meet your application requirements:
 
 - SAP HANA data volume: Size 4 TiB with up to 704 MiB/s
 - SAP HANA log volume: Size 0.5 TiB with up to 256 MiB/s
@@ -87,10 +86,10 @@ As illustrated in the diagram, the SAP HANA backup volume received the 128MiB/s
 | - | - | -- |
 | 1 | 128 | 5 * 128 * 1 = 640 |
 | 2 | 128 | 5 * 128 * 2 = 1,280 |
-| 5 | 128 | 5 * 128 * 5 = 3,200 |
 | 10 | 128 | 5 * 128 * 10 = 6,400 |
 | 50 | 128 | 5 * 128 * 50 = 32,000 |
 | 100 | 128 | 5 * 128 * 100 = 64,000 |
+| 1,024 | 128 | 5 * 128 * 1,024 = 655,360 |
 
 >[!NOTE]
 >A Flexible capacity pool with 128 MiB/s throughput assigned to it is only charged for the allocated capacity.
```
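
The arithmetic in this table is easy to check. Here's a minimal sketch in Python that reproduces the formula column, assuming the first column is the capacity pool size in TiB and the result is the maximum pool throughput in MiB/s; the function name and constants are invented for illustration, not an Azure API:

```python
# Minimal sketch, not an Azure API: reproduces the table's formula column,
# assuming max pool throughput (MiB/s) = 5 * 128 MiB/s * pool size (TiB).

BASE_THROUGHPUT_MIB_S = 128  # figure from the table's middle column
SCALE_FACTOR = 5             # multiplier from the formula column

def max_flexible_pool_throughput(pool_size_tib: int) -> int:
    """Maximum assignable throughput (MiB/s) for a Flexible capacity pool."""
    return SCALE_FACTOR * BASE_THROUGHPUT_MIB_S * pool_size_tib

for size in (1, 2, 10, 50, 100, 1024):
    print(f"{size:>5} TiB -> {max_flexible_pool_throughput(size):,} MiB/s")
# The last line prints 655,360 MiB/s, matching the row added in this change.
```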

`articles/azure-netapp-files/large-volumes-requirements-considerations.md` (4 additions, 4 deletions)

```diff
@@ -6,7 +6,7 @@ author: b-ahibbard
 ms.service: azure-netapp-files
 ms.custom: references_regions
 ms.topic: conceptual
-ms.date: 02/25/2025
+ms.date: 03/10/2025
 ms.author: anfdocs
 ---
 # Requirements and considerations for large volumes
@@ -20,17 +20,17 @@ The following requirements and considerations apply to large volumes. For perfor
 * A regular volume can’t be converted to a large volume.
 * You must create a large volume at a size of 50 TiB or larger. The maximum size of a large volume is 1,024 TiB, though 2-PiB large volumes are available on request depending on regional dedicated capacity availability. To request 2-PiB large volumes, contact your account team.
 * You can't resize a large volume to less than 50 TiB.
-* A large volume cannot be resized to more than 30% of its lowest provisioned size. This limit is adjustable via [a support request](azure-netapp-files-resource-limits.md#resource-limits). When requesting the resize, specify the exact size desired size in TiB.
+* A large volume cannot be resized to more than 30% of its lowest provisioned size. This limit is adjustable via [a support request](azure-netapp-files-resource-limits.md#resource-limits). When requesting the resize, specify the desired size in TiB.
 * Large volumes are currently not supported with Azure NetApp Files backup.
 * You can't create a large volume with application volume groups.
 * Currently, large volumes aren't suited for database (HANA, Oracle, SQL Server, etc.) data and log volumes. For database workloads requiring more than a single volume’s throughput limit, consider deploying multiple regular volumes. To optimize multiple volume deployments for databases, use [application volume groups](application-volume-group-concept.md).
-* Throughput ceilings for the three performance tiers (Standard, Premium, and Ultra) of large volumes are based on the existing 100-TiB maximum capacity targets. You're able to grow to 1 PiB with the throughput ceiling per the following table:
+* The throughput ceiling for all three performance tiers (Standard, Premium, and Ultra) of large volumes is 12,800 MiB/s. You're able to grow to 1 PiB with the throughput ceiling per the following table:
 
 <table><thead>
 <tr>
 <th></th>
 <th colspan="2">Capacity</th>
-<th colspan="2">Linear performance scaling per TiB up to maximum allowed capacity tier throughput (large volume) </th>
+<th colspan="2">Linear performance scaling per TiB up to maximum 12,800 MiB/s </th>
 | Capacity | <ul><li>50 TiB minimum</li><li>1 PiB maximum (or [2 PiB by special request](azure-netapp-files-resource-limits.md#request-limit-increase))</li></ul> |
 | Performance | The large volume performance limit is 12,800 MiB/s regardless of service level. |
 
 ## Large volumes effect on performance
@@ -53,7 +53,7 @@ Regular volumes can handle most workloads. Once capacity, file count, performanc
 Large volumes allow workloads to extend beyond the current limitations of regular volumes. The following table shows some examples of use cases for each volume type.
 
 | Volume type | Primary use cases |
-| - | -- |
+| - | ---|
 | Regular volumes | <ul><li>General file shares</li><li>SAP HANA and databases (Oracle, SQL Server, Db2, and others)</li><li>VDI/Azure VMware Service</li><li>Capacities less than 50 TiB</li></ul> |
 | Large volumes | <ul><li>General file shares</li><li>High file count or high metadata workloads (such as electronic design automation, software development, financial services)</li><li>High capacity workloads (such as AI/ML/LLP, oil & gas, media, healthcare images, backup, and archives)</li><li>Large-scale workloads (many client connections such as FSLogix profiles)</li><li>High performance workloads</li><li>Capacity quotas between 50 TiB and 1 PiB</li></ul> |
```
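
Two of the rules touched in this second file lend themselves to a quick worked check. First, the resize bullet: a minimal sketch, reading the 30% limit as growth to at most 30% above the volume's lowest provisioned size (the function, names, and example values are hypothetical, not an Azure API):

```python
# Minimal sketch, not an Azure API: checks a requested large-volume resize
# against the documented limits. The 50 TiB floor, 1,024 TiB ceiling, and the
# 30%-over-lowest-provisioned-size growth cap come from the article; the
# 30% interpretation (new size <= 1.3 * lowest size) is an assumption.

MIN_SIZE_TIB = 50
MAX_SIZE_TIB = 1024  # 2 PiB is available by special request

def resize_is_allowed(requested_tib: float, lowest_provisioned_tib: float) -> bool:
    """Return True if a large-volume resize request stays within limits."""
    if requested_tib < MIN_SIZE_TIB or requested_tib > MAX_SIZE_TIB:
        return False
    # Growth beyond 30% over the lowest provisioned size needs a support request.
    return requested_tib <= 1.3 * lowest_provisioned_tib

print(resize_is_allowed(60, 50))   # True: within 30% of a 50 TiB volume
print(resize_is_allowed(100, 50))  # False: over the 30% cap, needs support
```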
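
Second, the new flat 12,800 MiB/s ceiling combined with linear per-TiB scaling. A sketch under the assumption that the per-TiB rates are the published service level figures (Standard 16, Premium 64, Ultra 128 MiB/s per TiB), which is consistent with the removed reference to 100-TiB capacity targets (128 × 100 = 12,800):

```python
# Minimal sketch, not an Azure API: models linear per-TiB scaling capped at
# the 12,800 MiB/s large-volume ceiling that now applies to all three tiers.
# The per-TiB rates below are assumed from the published service level specs.

LARGE_VOLUME_CEILING_MIB_S = 12_800

PER_TIB_MIB_S = {"Standard": 16, "Premium": 64, "Ultra": 128}

def large_volume_throughput(service_level: str, quota_tib: float) -> float:
    """Throughput (MiB/s) scales linearly with quota until the ceiling."""
    return min(PER_TIB_MIB_S[service_level] * quota_tib, LARGE_VOLUME_CEILING_MIB_S)

# An Ultra large volume hits the ceiling at 100 TiB (128 * 100 = 12,800);
# a Standard large volume doesn't reach it until 800 TiB (16 * 800 = 12,800).
print(large_volume_throughput("Ultra", 100))     # 12800.0
print(large_volume_throughput("Standard", 500))  # 8000.0
```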