Commit f7e6d8c

Merge pull request #289438 from b-ahibbard/large-vol-new
Large vol new
2 parents e097198 + 18eb9a6 commit f7e6d8c

7 files changed: +69 −1 lines changed

articles/azure-netapp-files/TOC.yml

Lines changed: 2 additions & 0 deletions
```diff
@@ -99,6 +99,8 @@
   href: regional-capacity-quota.md
 - name: Understand default and individual user and group quotas
   href: default-individual-user-group-quotas-introduction.md
+- name: Understand large volumes
+  href: large-volumes.md
 - name: Requirements and considerations for large volumes
   href: large-volumes-requirements-considerations.md
 - name: Performance
```

articles/azure-netapp-files/azure-netapp-files-understand-storage-hierarchy.md

Lines changed: 2 additions & 1 deletion
```diff
@@ -77,7 +77,7 @@ When you use a manual QoS capacity pool with, for example, an SAP HANA system, a
 
 ## Large volumes
 
-Azure NetApp Files allows you to create large volumes up to 1 PiB in size. Large volumes begin at a capacity of 50 TiB and scale up to 1 PiB. Regular Azure NetApp Files volumes are offered between 50 GiB and 102,400 GiB.
+Azure NetApp Files allows you to create [large volumes](large-volumes.md) up to 1 PiB in size. Large volumes begin at a capacity of 50 TiB and scale up to 1 PiB. Regular Azure NetApp Files volumes are offered between 50 GiB and 102,400 GiB.
 
 For more information, see [Requirements and considerations for large volumes](large-volumes-requirements-considerations.md).
 
@@ -88,4 +88,5 @@ For more information, see [Requirements and considerations for large volumes](la
 - [Performance considerations for Azure NetApp Files](azure-netapp-files-performance-considerations.md)
 - [Create a capacity pool](azure-netapp-files-set-up-capacity-pool.md)
 - [Manage a manual QoS capacity pool](manage-manual-qos-capacity-pool.md)
+- [Understand large volumes](large-volumes.md)
 - [Requirements and considerations for large volumes](large-volumes-requirements-considerations.md)
```

articles/azure-netapp-files/large-volumes-requirements-considerations.md

Lines changed: 1 addition & 0 deletions
```diff
@@ -118,6 +118,7 @@ If this is your first time using large volumes, register the feature with the [l
 
 ## Next steps
 
+* [Understand large volumes](large-volumes.md)
 * [Storage hierarchy of Azure NetApp Files](azure-netapp-files-understand-storage-hierarchy.md)
 * [Resource limits for Azure NetApp Files](azure-netapp-files-resource-limits.md)
 * [Create an NFS volume](azure-netapp-files-create-volumes.md)
```
articles/azure-netapp-files/large-volumes.md (new file)

Lines changed: 64 additions & 0 deletions

---
title: Understand large volumes in Azure NetApp Files
description: Learn about the benefits, use cases, and requirements for using large volumes in Azure NetApp Files.
services: azure-netapp-files
author: b-ahibbard
ms.service: azure-netapp-files
ms.custom: references_regions
ms.topic: conceptual
ms.date: 10/29/2024
ms.author: anfdocs
---

# Understand large volumes in Azure NetApp Files

Volumes in Azure NetApp Files are the way you present high-performance, cost-effective storage to your network-attached storage (NAS) clients in the Azure cloud. Volumes act as independent file systems with their own capacity, file counts, ACLs, snapshots, and file system IDs. These qualities provide a way to separate datasets into individual secure tenants.
:::image type="content" source="./media/large-volumes/large-volumes-diagram.png" alt-text="Diagram of large and regular volume size." lightbox="./media/large-volumes/large-volumes-diagram.png":::

All resources in Azure NetApp Files have [limits](azure-netapp-files-resource-limits.md). _Regular_ volumes have the following limits:

| Limit type | Limits |
| - | - |
| Capacity | <ul><li>50 GiB minimum</li><li>100 TiB maximum</li></ul> |
| File count | 2,147,483,632 |
| Performance | <ul><li>Standard: 1,600 MiB/s</li><li>Premium: 1,600 MiB/s</li><li>Ultra: 4,500 MiB/s</li></ul> |

_Large_ volumes have the following limits:

| Limit type | Values |
| - | - |
| Capacity | <ul><li>50 TiB minimum</li><li>1 PiB maximum (or [2 PiB by special request](azure-netapp-files-resource-limits.md#request-limit-increase))</li></ul> |
| File count | 15,938,355,048 |
| Performance | <ul><li>Standard: 1,600 MiB/s</li><li>Premium: 6,400 MiB/s</li><li>Ultra: 12,800 MiB/s</li></ul> |
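The capacity boundaries in the two tables can be checked programmatically. The sketch below is an illustrative helper (`volume_type_for` is a hypothetical name, not part of any Azure SDK); the thresholds come directly from the limits above:

```python
# Capacity limits from the tables above, expressed in GiB.
# volume_type_for is an illustrative helper, not part of any Azure SDK.
TIB = 1024            # GiB per TiB
PIB = 1024 * TIB      # GiB per PiB

REGULAR_MIN, REGULAR_MAX = 50, 100 * TIB    # 50 GiB to 100 TiB
LARGE_MIN, LARGE_MAX = 50 * TIB, 1 * PIB    # 50 TiB to 1 PiB

def volume_type_for(capacity_gib: int) -> str:
    """Return the Azure NetApp Files volume type(s) that fit a requested size."""
    fits_regular = REGULAR_MIN <= capacity_gib <= REGULAR_MAX
    fits_large = LARGE_MIN <= capacity_gib <= LARGE_MAX
    if fits_regular and fits_large:
        return "regular or large"   # 50 TiB to 100 TiB overlaps both ranges
    if fits_regular:
        return "regular"
    if fits_large:
        return "large"
    if capacity_gib > LARGE_MAX:
        return "out of range"       # above 1 PiB requires a special request
    return "below minimum"

print(volume_type_for(500 * TIB))   # -> large
```

Note the overlap: a request between 50 TiB and 100 TiB can be served by either volume type, so the decision there is driven by file count and performance needs rather than capacity alone.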
33+
34+
35+
## Large volumes effect on performance
36+
37+
In many cases, a regular volume can handle the performance needs for a production workload, particularly when dealing with database workloads, general file shares, and Azure VMware Service or virtual desktop infrastructure (VDI) workloads. When workloads are metadata heavy or require scale beyond what a regular volume can handle, a large volume can increase performance needs with minimal cost impact.
38+
39+
For instance, the following graphs show that a large volume can deliver two to three times the performance at scale of a regular volume.
40+
41+
For more information about performance tests, see [Large volume performance benchmarks for Linux](performance-large-volumes-linux.md) and [Regular volume performance benchmarks for Linux](performance-benchmarks-linux.md).
42+
43+
For example, in benchmark tests using Flexible I/O Tester (FIO), a large volume achieved higher I/OPS and throughput than a regular volume.
44+
45+
:::image type="content" source="./media/large-volumes/large-regular-volume-comparison.png" alt-text="Diagram comparing large and regular volumes with random I/O." lightbox="./media/large-volumes/large-regular-volume-comparison.png":::
46+
47+
:::image type="content" source="./media/large-volumes/large-volume-throughput.png" alt-text="Diagram comparing large and regular volumes with sequential I/O." lightbox="./media/large-volumes/large-volume-throughput.png":::
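The measured two-to-three-times advantage can be sanity-checked against the per-service-level throughput ceilings in the limits tables earlier in this article. This is illustrative arithmetic only; the ceilings are configured maximums, not benchmark results:

```python
# Throughput ceilings per service level, from the limits tables above.
regular = {"Standard": 1600, "Premium": 1600, "Ultra": 4500}
large = {"Standard": 1600, "Premium": 6400, "Ultra": 12800}

ratios = {tier: large[tier] / regular[tier] for tier in regular}
for tier, ratio in ratios.items():
    print(f"{tier}: {ratio:.2f}x")   # e.g. Ultra: 2.84x
```

The Ultra ceiling ratio (about 2.8x) lines up with the benchmarked range, while actual gains depend on workload mix and client scale.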
48+
49+
## Work load types and use cases
50+
51+
Regular volumes can handle most workloads. Once capacity, file count, performance, or scale limits are reached, new volumes must be created. This condition adds unnecessary complexity to a solution.
52+
53+
Large volumes allow workloads to extend beyond the current limitations of regular volumes. The following table shows some examples of use cases for each volume type.
54+
55+
| Volume type | Primary use cases |
56+
| - | -- |
57+
| Regular volumes | <ul><li>General file shares</li><li>SAP HANA and databases (Oracle, SQL Server, Db2, and others)</li><li>VDI/Azure VMware Service</li><li>Capacities less than 50 TiB</li></ul> |
58+
| Large volumes | <ul><li>General file shares</li><li>High file count or high metadata workloads (such as electronic design automation, software development, FSI)</li><li>High capacity workloads (such as AI/ML/LLP, oil & gas, media, healthcare images, backup, and archives)</li><li>Large-scale workloads (many client connections such as FSLogix profiles)</li><li>High performance workloads</li><li>Capacity quotas between 50 TiB and 1 PiB</li></ul> |
59+
60+
## More information
61+
62+
* [Requirements and considerations for large volumes](large-volumes-requirements-considerations.md)
63+
* [Storage hierarchy of Azure NetApp Files](azure-netapp-files-understand-storage-hierarchy.md)
64+
* [Resource limits for Azure NetApp Files](azure-netapp-files-resource-limits.md)