
Commit 7d387f7

Merge pull request #281228 from b-ahibbard/lg-vol
new large volume limits
2 parents b871d1e + 2979a33 commit 7d387f7

8 files changed (+62, -28 lines)

articles/azure-netapp-files/azure-netapp-files-cost-model.md

Lines changed: 1 addition & 1 deletion
@@ -18,7 +18,7 @@ For cost model specific to cross-region replication, see [Cost model for cross-r
 Azure NetApp Files is billed on provisioned storage capacity, which is allocated by creating capacity pools. Capacity pools are billed monthly based on a set cost per allocated GiB per hour. Capacity pool allocation is measured hourly.
 
-Capacity pools must be at least 1 TiB and can be increased or decreased in 1-TiB intervals. Capacity pools contain volumes that range in size from a minimum of 100 GiB to a maximum of 100 TiB for regular volumes and up to 500 TiB for [large volumes](azure-netapp-files-understand-storage-hierarchy.md#large-volumes). Volumes are assigned quotas that are subtracted from the capacity pool’s provisioned size. For an active volume, capacity consumption against the quota is based on logical (effective) capacity, being active filesystem data or snapshot data. See [How Azure NetApp Files snapshots work](snapshots-introduction.md) for details.
+Capacity pools must be at least 1 TiB and can be increased or decreased in 1-TiB intervals. Capacity pools contain volumes that range in size from a minimum of 100 GiB to a maximum of 100 TiB for regular volumes and up to 1 PiB for [large volumes](azure-netapp-files-understand-storage-hierarchy.md#large-volumes). Volumes are assigned quotas that are subtracted from the capacity pool’s provisioned size. For an active volume, capacity consumption against the quota is based on logical (effective) capacity, being active filesystem data or snapshot data. See [How Azure NetApp Files snapshots work](snapshots-introduction.md) for details.
 
 ### Pricing examples
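The pool and volume sizing rules above reduce to simple arithmetic. The following is an illustrative sketch only (the helper name `remaining_pool_capacity_gib` is ours, not part of any Azure SDK): it subtracts volume quotas from a pool's provisioned size and enforces the 1-TiB pool minimum and the 100-GiB-to-1-PiB volume quota range.

```python
# Hypothetical sketch (not an Azure API): models how volume quotas are
# subtracted from a capacity pool's provisioned size, per the rules above.

GIB_PER_TIB = 1024

def remaining_pool_capacity_gib(pool_size_tib: int, volume_quotas_gib: list[int]) -> int:
    """Return unallocated pool capacity in GiB after subtracting volume quotas.

    Pools must be at least 1 TiB, sized in whole-TiB intervals; volume
    quotas span 100 GiB (regular minimum) to 1 PiB = 1,048,576 GiB
    (large-volume maximum).
    """
    if pool_size_tib < 1:
        raise ValueError("capacity pools must be at least 1 TiB")
    for quota in volume_quotas_gib:
        if not 100 <= quota <= 1024 * GIB_PER_TIB:
            raise ValueError(f"volume quota {quota} GiB out of range")
    return pool_size_tib * GIB_PER_TIB - sum(volume_quotas_gib)

# Example: a 200-TiB pool holding one 50-TiB large volume and one
# 4-TiB regular volume.
print(remaining_pool_capacity_gib(200, [50 * GIB_PER_TIB, 4 * GIB_PER_TIB]))  # 149504
```

Because billing is per allocated GiB per hour, the pool's provisioned size (not the sum of volume quotas) is what drives the monthly charge.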

articles/azure-netapp-files/azure-netapp-files-introduction.md

Lines changed: 1 addition & 1 deletion
@@ -40,7 +40,7 @@ Azure NetApp Files is designed to provide high-performance file storage for ente
 | Small-to-large volumes | Easily resize file volumes from 100 GiB up to 100 TiB without downtime. | Scale storage as business needs grow without over-provisioning, avoiding upfront cost.
 | 1-TiB minimum capacity pool size | 1-TiB capacity pool is a reduced-size storage pool compared to the initial 4-TiB minimum. | Save money by starting with a smaller storage footprint and lower entry point, without sacrificing performance or availability. Scale storage based on growth without high upfront costs.
 | 2,048-TiB maximum capacity pool | 2048-TiB capacity pool is an increased storage pool compared to the initial 500-TiB maximum. | Reduce waste by creating larger, pooled capacity and performance budget, and share and distribute across volumes.
-| 50-500 TiB large volumes | Store large volumes of data up to 500 TiB in a single volume. | Manage large datasets and high-performance workloads with ease.
+| 50-1,024 TiB large volumes | Store large volumes of data up to 1,024 TiB in a single volume. | Manage large datasets and high-performance workloads with ease.
 | User and group quotas | Set quotas on storage usage for individual users and groups. | Control storage usage and optimize resource allocation.
 | Virtual machine (VM) networked storage performance | Higher VM network throughput compared to disk IO limits enable more demanding workloads on smaller Azure VMs. | Improve application performance at a smaller VM footprint, improving overall efficiency and lowering application license cost.
 | Deep workload readiness | Seamless deployment and migration of any-size workload with well-documented deployment guides. | Easily migrate any workload of any size to the platform. Enjoy a seamless, cost-effective deployment and migration experience.

articles/azure-netapp-files/azure-netapp-files-resource-limits.md

Lines changed: 2 additions & 1 deletion
@@ -31,7 +31,8 @@ The following table describes resource limits for Azure NetApp Files:
 | Maximum size of a single regular volume | 100 TiB | No |
 | Minimum size of a single [large volume](large-volumes-requirements-considerations.md) | 50 TiB | No |
 | Large volume size increase | 30% of lowest provisioned size | Yes |
-| Maximum size of a single [large volume](large-volumes-requirements-considerations.md) | 500 TiB | No |
+| Maximum size of a single [large volume](large-volumes-requirements-considerations.md) | 1,024 TiB | No |
+| Maximum size of a single large volume on dedicated capacity (preview) | 2,048 TiB | No |
 | Maximum size of a single file | 16 TiB | No |
 | Maximum size of directory metadata in a single directory | 320 MB | No |
 | Maximum number of files in a single directory | *Approximately* 4 million. <br> See [Determine if a directory is approaching the limit size](#directory-limit). | No |

articles/azure-netapp-files/azure-netapp-files-understand-storage-hierarchy.md

Lines changed: 2 additions & 2 deletions
@@ -73,11 +73,11 @@ When you use a manual QoS capacity pool with, for example, an SAP HANA system, a
 - A volume's capacity consumption counts against its pool's provisioned capacity.
 - A volume’s throughput consumption counts against its pool’s available throughput. See [Manual QoS type](#manual-qos-type).
 - Each volume belongs to only one pool, but a pool can contain multiple volumes.
-- Volumes contain a capacity of between 100 GiB and 100 TiB. You can create a [large volume](#large-volumes) with a size of between 50 TiB and 500 TiB.
+- Volumes contain a capacity of between 100 GiB and 100 TiB. You can create a [large volume](#large-volumes) with a size of between 50 TiB and 1 PiB.
 
 ## Large volumes
 
-Azure NetApp Files allows you to create volumes up to 500 TiB in size. Large volumes begin at a capacity of 50 TiB and scale up to 500 TiB. Regular Azure NetApp Files volumes are offered between 100 GiB and 102,400 GiB.
+Azure NetApp Files allows you to create large volumes up to 1 PiB in size. Large volumes begin at a capacity of 50 TiB and scale up to 1 PiB. Regular Azure NetApp Files volumes are offered between 100 GiB and 102,400 GiB.
 
 For more information, see [Requirements and considerations for large volumes](large-volumes-requirements-considerations.md).

articles/azure-netapp-files/includes/large-volumes-notice.md

Lines changed: 1 addition & 1 deletion
@@ -11,7 +11,7 @@ ms.custom: include file
 # azure-netapp-files-create-volumes
 ---
 
-Regular volumes quotas are between 100 GiB and 100 TiB. Large volume quotas range from 50 TiB to 500 TiB in size. If you intend for the volume quota to fall in the large volume range, select **Yes**.
+Regular volume quotas are between 100 GiB and 100 TiB. Large volume quotas range from 50 TiB to 1 PiB in size. If you intend for the volume quota to fall in the large volume range, select **Yes**. Volume quotas are entered in GiB.
 
 >[!IMPORTANT]
 > If this is your first time using large volumes, you must first [register the feature](../large-volumes-requirements-considerations.md#register-the-feature) and request [an increase in regional capacity quota](../azure-netapp-files-resource-limits.md#request-limit-increase).
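Since quotas are entered in GiB, the large-volume bounds convert with 1 TiB = 1,024 GiB. A small illustrative sketch (the constant and function names are ours, not anything in the portal or SDK):

```python
GIB_PER_TIB = 1024

# Large-volume quota bounds expressed in GiB, as entered in the portal.
LARGE_VOLUME_MIN_GIB = 50 * GIB_PER_TIB    # 50 TiB -> 51,200 GiB
LARGE_VOLUME_MAX_GIB = 1024 * GIB_PER_TIB  # 1 PiB  -> 1,048,576 GiB

def is_large_volume_quota(quota_gib: int) -> bool:
    """True if a GiB quota falls in the large-volume range (select **Yes**)."""
    return LARGE_VOLUME_MIN_GIB <= quota_gib <= LARGE_VOLUME_MAX_GIB

print(is_large_volume_quota(51_200))     # 50 TiB  -> True
print(is_large_volume_quota(1_048_576))  # 1 PiB   -> True
print(is_large_volume_quota(10_240))     # 10 TiB  -> False (regular volume range)
```

Note that quotas from 50 TiB to 100 TiB fall in both ranges; the **Large volume** selection, not the size alone, determines how the volume is created.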

articles/azure-netapp-files/large-volumes-requirements-considerations.md

Lines changed: 35 additions & 8 deletions
@@ -18,19 +18,46 @@ This article describes the requirements and considerations you need to be aware
 The following requirements and considerations apply to large volumes. For performance considerations of *regular volumes*, see [Performance considerations for Azure NetApp Files](azure-netapp-files-performance-considerations.md).
 
 * A regular volume can’t be converted to a large volume.
-* You must create a large volume at a size of 50 TiB or larger. A single volume can't exceed 500 TiB.
+* You must create a large volume at a size of 50 TiB or larger. A single volume can't exceed 1 PiB.
 * You can't resize a large volume to less than 50 TiB.
   A large volume can't be resized to more than 30% above its lowest provisioned size. This limit is adjustable via [a support request](azure-netapp-files-resource-limits.md#resource-limits).
 * Large volumes are currently not supported with Azure NetApp Files backup.
 * You can't create a large volume with application volume groups.
 * Currently, large volumes aren't suited for database (HANA, Oracle, SQL Server, etc.) data and log volumes. For database workloads requiring more than a single volume’s throughput limit, consider deploying multiple regular volumes. To optimize multiple volume deployments for databases, use [application volume groups](application-volume-group-concept.md).
-* Throughput ceilings for the three performance tiers (Standard, Premium, and Ultra) of large volumes are based on the existing 100-TiB maximum capacity targets. You're able to grow to 500 TiB with the throughput ceiling per the following table:
-
-  | Capacity tier | Volume size (TiB) | Throughput (MiB/s) |
-  | --- | --- | --- |
-  | Standard | 50 to 500 | 1,600 |
-  | Premium | 50 to 500 | 6,400 |
-  | Ultra | 50 to 500 | 10,240 |
+* Throughput ceilings for the three performance tiers (Standard, Premium, and Ultra) of large volumes scale linearly with provisioned capacity up to a per-tier maximum. You're able to grow to 1 PiB with the throughput ceiling per the following table:
+
+  <table><thead>
+  <tr>
+  <th></th>
+  <th colspan="2">Capacity</th>
+  <th colspan="2">Linear performance scaling per TiB up to maximum throughput</th>
+  </tr></thead>
+  <tbody>
+  <tr>
+  <td>Capacity tier</td>
+  <td>Minimum volume size<br>(TiB)</td>
+  <td>Maximum volume size (TiB)</td>
+  <td>Minimum throughput (MiB/s)</td>
+  <td>Maximum throughput (MiB/s)</td>
+  </tr>
+  <tr>
+  <td>Standard (16 MiB/s per TiB)</td>
+  <td>50</td>
+  <td>1,024</td>
+  <td>800</td>
+  <td>12,800</td>
+  </tr>
+  <tr>
+  <td>Premium (64 MiB/s per TiB)</td>
+  <td>50</td>
+  <td>1,024</td>
+  <td>3,200</td>
+  <td>12,800</td>
+  </tr>
+  <tr>
+  <td>Ultra (128 MiB/s per TiB)</td>
+  <td>50</td>
+  <td>1,024</td>
+  <td>6,400</td>
+  <td>12,800</td>
+  </tr>
+  </tbody>
+  </table>
+
+\* 2-PiB large volumes are available on request depending on regional dedicated capacity availability. To request 2-PiB large volumes, contact your account team.
 
 * Large volumes aren't currently supported with standard storage with cool access.
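The linear scaling in the table above can be sketched as follows (illustrative only: the function name is ours, and the per-TiB rates and ceilings are passed in as parameters taken from the table rather than hard-coded as authoritative):

```python
def large_volume_throughput_mibps(size_tib: int, mibps_per_tib: int,
                                  max_mibps: int) -> int:
    """Throughput ceiling: linear per-TiB scaling, capped at the tier maximum."""
    if not 50 <= size_tib <= 1024:
        raise ValueError("large volumes range from 50 TiB to 1 PiB (1,024 TiB)")
    return min(size_tib * mibps_per_tib, max_mibps)

# Standard tier per the table: 16 MiB/s per TiB, 12,800 MiB/s ceiling.
print(large_volume_throughput_mibps(50, 16, 12_800))    # minimum size: 800
print(large_volume_throughput_mibps(1024, 16, 12_800))  # capped at: 12800
```

Because the cap is shared, higher tiers simply reach it at smaller volume sizes; for example, at 64 MiB/s per TiB the Premium ceiling is reached at 200 TiB.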

@@ -73,7 +100,7 @@ Support for Azure NetApp Files large volumes is available in the following regio
 >[!IMPORTANT]
 >Before you can use large volumes, you must first request [an increase in regional capacity quota](azure-netapp-files-resource-limits.md#request-limit-increase).
 
-Once your [regional capacity quota](regional-capacity-quota.md) has increased, you can create volumes that are up to 500 TiB in size. When creating a volume, after you designate the volume quota, you must select **Yes** for the **Large volume** field. Once created, you can manage your large volumes in the same manner as regular volumes.
+Once your [regional capacity quota](regional-capacity-quota.md) has increased, you can create volumes that are up to 1 PiB in size. When creating a volume, after you designate the volume quota, you must select **Yes** for the **Large volume** field. Once created, you can manage your large volumes in the same manner as regular volumes.
 
 ### Register the feature
