`articles/azure-netapp-files/azure-netapp-files-understand-storage-hierarchy.md` (2 additions, 1 deletion)
```diff
@@ -77,7 +77,7 @@ When you use a manual QoS capacity pool with, for example, an SAP HANA system, a
 
 ## Large volumes
 
-Azure NetApp Files allows you to create large volumes up to 1 PiB in size. Large volumes begin at a capacity of 50 TiB and scale up to 1 PiB. Regular Azure NetApp Files volumes are offered between 50 GiB and 102,400 GiB.
+Azure NetApp Files allows you to create [large volumes](large-volumes.md) up to 1 PiB in size. Large volumes begin at a capacity of 50 TiB and scale up to 1 PiB. Regular Azure NetApp Files volumes are offered between 50 GiB and 102,400 GiB.
 
 For more information, see [Requirements and considerations for large volumes](large-volumes-requirements-considerations.md).
 
```
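The size ranges quoted in this hunk are easy to sanity-check with a little unit arithmetic. A minimal sketch, assuming binary units (the `volume_type_for` helper is illustrative only, not part of any Azure SDK):

```python
# Unit arithmetic for the volume size ranges quoted above (binary units).
GIB = 2**30
TIB = 2**40
PIB = 2**50

# Regular volumes: 50 GiB to 102,400 GiB (exactly 100 TiB).
REGULAR_MIN, REGULAR_MAX = 50 * GIB, 102_400 * GIB
# Large volumes: 50 TiB to 1 PiB.
LARGE_MIN, LARGE_MAX = 50 * TIB, 1 * PIB

def volume_type_for(size_bytes: int) -> str:
    """Classify a requested quota against the documented ranges."""
    if REGULAR_MIN <= size_bytes <= REGULAR_MAX:
        # Sizes from 50 TiB to 100 TiB fall in both ranges; a regular
        # volume suffices unless large-volume capabilities are needed.
        return "regular"
    if LARGE_MIN <= size_bytes <= LARGE_MAX:
        return "large"
    raise ValueError("outside the documented volume size limits")

print(102_400 * GIB == 100 * TIB)  # True: 102,400 GiB is exactly 100 TiB
```

Note the deliberate overlap: quotas between 50 TiB and 100 TiB can be served by either volume type, which is why the classifier above prefers "regular" in that band.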
```diff
@@ -88,4 +88,5 @@ For more information, see [Requirements and considerations for large volumes](la
 - [Performance considerations for Azure NetApp Files](azure-netapp-files-performance-considerations.md)
 - [Create a capacity pool](azure-netapp-files-set-up-capacity-pool.md)
 - [Manage a manual QoS capacity pool](manage-manual-qos-capacity-pool.md)
+- [Understand large volumes](large-volumes.md)
 - [Requirements and considerations for large volumes](large-volumes-requirements-considerations.md)
```
`articles/azure-netapp-files/large-volumes.md` (39 additions, 3 deletions)
```diff
@@ -6,15 +6,30 @@ author: b-ahibbard
 ms.service: azure-netapp-files
 ms.custom: references_regions
 ms.topic: conceptual
-ms.date: 10/18/2024
+ms.date: 10/29/2024
 ms.author: anfdocs
 ---
 # Understand large volumes in Azure NetApp Files
 
+Volumes in Azure NetApp Files are the way you present high performance, cost-effective storage to your network attached storage (NAS) clients in the Azure cloud. Volumes act as independent file systems with their own capacity, file counts, ACLs, snapshots and file system IDs, which provides a way to separate datasets into individual secure tenants.
 
+:::image type="content" source="./media/large-volumes/large-volumes-diagram.png" alt-text="Diagram of large and regular volume size." lightbox="./media/large-volumes/large-volumes-diagram.png":::
 
-Volumes in Azure NetApp Files are the way you present high performance, cost-effective storage to your network attached storage (NAS) clients in the Azure cloud. Volumes act as independent file systems with their own capacity, file counts, ACLs, snapshots and file system IDs, which provides a way to separate datasets into individual secure tenants.
+All resources in Azure NetApp files have [limits](azure-netapp-files-resource-limits.md). _Regular_ volumes have the following limits:
+| Capacity | <ul><li>50 TiB minimum</li><li>1 PiB maximum (or [2 PiB by special request](azure-netapp-files-resource-limits.md#request-limit-increase))</li></ul> |
```
```diff
@@ -25,4 +40,25 @@ For instance, the following graphs show that a large volume can deliver 2-3x the
 
 For more information about performance tests, see [Large volume performance benchmarks for Linux](performance-large-volumes-linux.md) and [Regular volume performance benchmarks for Linux](performance-benchmarks-linux.md).
 
-For example, benchmarks using FIO, a large volume was capable of higher IOPS and throughput than a regular volume.
+For example, in benchmark tests using Flexible I/O Tester (FIO), a large volume achieved higher IOPS and throughput than a regular volume.
+
+:::image type="content" source="./media/large-volumes/large-regular-volume-comparison.png" alt-text="Diagram comparing large and regular volumes with random I/O." lightbox="./media/large-volumes/large-regular-volume-comparison.png":::
+
+:::image type="content" source="./media/large-volumes/large-volume-throughput.png" alt-text="Diagram comparing large and regular volumes with sequential I/O." lightbox="./media/large-volumes/large-volume-throughput.png":::
+
+## Workload types and use cases
+
+Regular volumes can handle most workloads. Once capacity, file count, performance, or scale limits are reached, new volumes must be created. This condition adds unnecessary complexity to a solution.
+
+Large volumes allow workloads to extend beyond the current limitations of regular volumes. The following table shows some examples of use cases for each volume type.
+
+| Volume type | Primary use cases |
+| - | -- |
+| Regular volumes | <ul><li>General file shares</li><li>SAP HANA and databases (Oracle, SQL Server, Db2, and others)</li><li>VDI/Azure VMware Service</li><li>Capacities less than 50 TiB</li></ul> |
+| Large volumes | <ul><li>General file shares</li><li>High file count or high metadata workloads (such as electronic design automation, software development, FSI)</li><li>High capacity workloads (such as AI/ML/LLP, oil & gas, media, healthcare images, backup and archives)</li><li>Large-scale workloads (many client connections such as FSLogix profiles)</li><li>High performance workloads</li><li>Capacity quotas between 50 TiB and 1 PiB</li></ul> |
+
+## More information
+
+* [Requirements and considerations for large volumes](large-volumes-requirements-considerations.md)
+* [Storage hierarchy of Azure NetApp Files](azure-netapp-files-understand-storage-hierarchy.md)
+* [Resource limits for Azure NetApp Files](azure-netapp-files-resource-limits.md)
```
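The FIO comparison referenced in this hunk can be reproduced in spirit with a job file along the following lines. This is a hedged sketch only: the mount path, block size, job count, and queue depth are assumptions for illustration, not the configuration used for the documented benchmarks (see the linked benchmark articles for those).

```ini
; Illustrative fio job for a random-read comparison on an Azure NetApp Files
; volume. All values below are example assumptions, not the documented setup.
[global]
directory=/mnt/anf-volume   ; assumed mount point of the volume under test
direct=1                    ; bypass the client page cache
ioengine=libaio
time_based=1
runtime=60

[randread-8k]
rw=randread
bs=8k
size=10G
numjobs=8
iodepth=32
```

Running the same job against a regular volume and a large volume (changing only `directory`) gives a like-for-like IOPS comparison of the kind the graphs above summarize.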