
Commit 007ea89

Commit message: warnings
1 parent 377b0f9 commit 007ea89

10 files changed: +48 additions, -26 deletions

articles/container-apps/storage-mounts.md

Lines changed: 1 addition & 1 deletion
@@ -248,7 +248,7 @@ To enable Azure Files storage in your container, you need to set up your environ
 | Requirement | Instructions |
 |--|--|
 | Azure account | If you don't have one, [create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F). |
-| Azure Storage account | [Create a storage account](../storage/common/storage-account-create.md?tabs=azure-cli#create-a-storage-account-1). |
+| Azure Storage account | [Create a storage account](../storage/common/storage-account-create.md?tabs=azure-cli#create-a-storage-account). |
 | Azure Container Apps environment | [Create a container apps environment](environment.md). |

 ### Configuration
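
The storage account prerequisite in that table can also be created from the command line. A minimal Azure CLI sketch, where the account name, resource group, location, and SKU are illustrative placeholders rather than values taken from the article:

```azurecli-interactive
# Create a general-purpose v2 storage account to back the Azure Files share
# used by the container app (names, location, and SKU are placeholders).
az storage account create \
    --name mystorageacct \
    --resource-group my-resource-group \
    --location eastus \
    --kind StorageV2 \
    --sku Standard_LRS
```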

articles/databox/data-box-disk-limits.md

Lines changed: 0 additions & 1 deletion
@@ -48,7 +48,6 @@ For the latest information on Azure storage service limits and best practices fo
 - The hierarchy of files is maintained while uploading to the cloud for both blobs and Azure Files. For example, you copied a file at this path: `<container folder>\A\B\C.txt`. This file is uploaded to the same path in cloud.
 - Any empty directory hierarchy (without any files) created under *BlockBlob* and *PageBlob* folders isn't uploaded.
 - If you don't have long paths enabled on the client, and any path and file name in your data copy exceeds 256 characters, the Data Box Split Copy Tool (DataBoxDiskSplitCopy.exe) or the Data Box Disk Validation tool (DataBoxDiskValidation.cmd) will report failures. To avoid this kind of failure, [enable long paths on your Windows client](/windows/win32/fileio/maximum-file-path-limitation?tabs=cmd#enable-long-paths-in-windows-10-version-1607-and-later).
-- To improve performance during data uploads, we recommend that you [enable large file shares on the storage account and increase share capacity to 100 TiB](../../articles/storage/files/storage-how-to-create-file-share.md#enable-large-file-shares-on-an-existing-account). Large file shares are only supported for storage accounts with locally redundant storage (LRS).
 - If there are any errors when uploading data to Azure, an error log is created in the target storage account. The path to this error log is available in the portal when the upload is complete and you can review the log to take corrective action. Don't delete data from the source without verifying the uploaded data.
 - If you specified managed disks in the order, review the following additional considerations:

articles/import-export/storage-import-export-data-to-files.md

Lines changed: 2 additions & 4 deletions
@@ -30,9 +30,7 @@ In this tutorial, you learn how to:
 Before you create an import job to transfer data into Azure Files, carefully review and complete the following list of prerequisites. You must:

 - Have an active Azure subscription to use with Import/Export service.
-- Have at least one Azure Storage account. See the list of [Supported storage accounts and storage types for Import/Export service](storage-import-export-requirements.md).
-- Consider configuring large file shares on the storage account. During imports to Azure Files, if a file share doesn't have enough free space, auto splitting the data to multiple Azure file shares is no longer supported, and the copy will fail. For instructions, see [Configure large file shares on a storage account](../storage/files/storage-how-to-create-file-share.md?tabs=azure-portal#enable-large-file-shares-on-an-existing-account).
-- For information on creating a new storage account, see [How to create a storage account](../storage/common/storage-account-create.md).
+- Have at least one Azure Storage account. See the list of [Supported storage accounts and storage types for Import/Export service](storage-import-export-requirements.md). For information on creating a new storage account, see [How to create a storage account](../storage/common/storage-account-create.md).
 - Have an adequate number of disks of [supported types](storage-import-export-requirements.md#supported-disks).
 - Have a Windows system running a [supported OS version](storage-import-export-requirements.md#supported-operating-systems).
 - Download the current release of the Azure Import/Export version 2 tool, for files, on the Windows system:
@@ -314,7 +312,7 @@ Install-Module -Name Az.ImportExport
 [!INCLUDE [storage-import-export-verify-data-copy](../../includes/storage-import-export-verify-data-copy.md)]

 > [!NOTE]
-> In the latest version of the Azure Import/Export tool for files (2.2.0.300), if a file share doesn't have enough free space, the data is no longer auto split to multiple Azure file shares. Instead, the copy fails, and you'll be contacted by Support. You'll need to either configure large file shares on the storage account or move around some data to make space in the share. For more information, see [Configure large file shares on a storage account](../storage/files/storage-how-to-create-file-share.md?tabs=azure-portal#enable-large-file-shares-on-an-existing-account).
+> In the latest version of the Azure Import/Export tool for files (2.2.0.300), if a file share doesn't have enough free space, the data is no longer auto split to multiple Azure file shares. Instead, the copy fails, and you'll be contacted by Support. You might need to move some data around to make space in the share.

 ## Samples for journal files
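
Because the copy now fails outright when the target share runs out of space, it can help to check how much data the share already holds, and compare it with the share's quota, before starting the import. A minimal Azure CLI sketch; the account and share names are placeholders, and the command needs storage credentials such as an account key or SAS token:

```azurecli-interactive
# Show the approximate amount of data already stored on the target share, in GiB
# (names are placeholders; authenticate with --account-key or --sas-token).
az storage share stats \
    --account-name mystorageacct \
    --name myfileshare
```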

articles/storage/common/storage-account-create.md

Lines changed: 1 addition & 1 deletion
@@ -581,7 +581,7 @@ Alternately, you can delete the resource group, which deletes the storage accoun

 [!INCLUDE [GPv1 support statement](../../../includes/storage-account-gpv1-support.md)]

-General purpose v1 (GPv1) storage accounts can no longer be created from the Azure portal. If you need to create a GPv1 storage account, follow the steps in section [Create a storage account](#create-a-storage-account-1) for PowerShell, the Azure CLI, Bicep, or Azure Templates. For the `kind` parameter, specify `Storage`, and choose a `sku` or `SkuName` from the [table of supported values](#storage-account-type-parameters).
+General purpose v1 (GPv1) storage accounts can no longer be created from the Azure portal. If you need to create a GPv1 storage account, follow the steps in section [Create a storage account](#create-a-storage-account) for PowerShell, the Azure CLI, Bicep, or Azure Templates. For the `kind` parameter, specify `Storage`, and choose a `sku` or `SkuName` from the [table of supported values](#storage-account-type-parameters).

 ## Next steps
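
To make the `kind` and `sku` guidance concrete, here's a minimal Azure CLI sketch; the account name, resource group, and location are placeholders, and `Standard_LRS` is just one of the supported SKU values:

```azurecli-interactive
# Create a general-purpose v1 (GPv1) account by passing --kind Storage;
# the names and location here are illustrative placeholders.
az storage account create \
    --name mygpv1account \
    --resource-group my-resource-group \
    --location eastus \
    --kind Storage \
    --sku Standard_LRS
```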

articles/storage/file-sync/file-sync-planning.md

Lines changed: 1 addition & 5 deletions
@@ -374,11 +374,7 @@ For more information about encryption in transit, see [requiring secure transfer

 [!INCLUDE [storage-files-tiers-overview](../../../includes/storage-files-tiers-overview.md)]

-#### Regional availability
-
-[!INCLUDE [storage-files-tiers-large-file-share-availability](../../../includes/storage-files-tiers-large-file-share-availability.md)]
-
-## Azure file sync region availability
+## Azure File Sync region availability

 For regional availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=storage).

articles/storage/files/storage-how-to-create-file-share.md

Lines changed: 40 additions & 0 deletions
@@ -283,6 +283,46 @@ az storage share-rm update \

 ---

+### Expand existing file shares
+
+If you enable large file shares on an existing storage account, you must expand existing file shares in that storage account to take advantage of the increased capacity and scale.
+
+# [Portal](#tab/azure-portal)
+1. From your storage account, select **File shares**.
+1. Right-click your file share, and then select **Quota**.
+1. Enter the new size that you want, and then select **OK**.
+
+![The Azure portal UI with Quota of existing file shares](media/storage-files-how-to-create-large-file-share/update-large-file-share-quota.png)
+
+# [PowerShell](#tab/azure-powershell)
+To set the quota to the maximum size, use the following command. Replace `<YourResourceGroupName>`, `<YourStorageAccountName>`, and `<YourStorageAccountFileShareName>` with your information.
+
+```powershell
+$resourceGroupName = "<YourResourceGroupName>"
+$storageAccountName = "<YourStorageAccountName>"
+$shareName="<YourStorageAccountFileShareName>"
+
+# update quota
+Update-AzRmStorageShare `
+    -ResourceGroupName $resourceGroupName `
+    -StorageAccountName $storageAccountName `
+    -Name $shareName `
+    -QuotaGiB 102400
+```
+
+# [Azure CLI](#tab/azure-cli)
+To set the quota to the maximum size, use the following command. Replace `<yourResourceGroupName>`, `<yourStorageAccountName>`, and `<yourFileShareName>` with your information.
+
+```azurecli-interactive
+az storage share-rm update \
+    --resource-group <yourResourceGroupName> \
+    --storage-account <yourStorageAccountName> \
+    --name <yourFileShareName> \
+    --quota 102400
+```
+
+---
+
 ## Delete a file share

 To delete an Azure file share, you can use the Azure portal, Azure PowerShell, or Azure CLI. SMB Azure file shares can be recovered within the [soft delete](storage-files-prevent-file-share-deletion.md) retention period.
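
A possible follow-up to the quota commands added above is to read the quota back and confirm the expansion took effect. A minimal Azure CLI sketch reusing the same placeholder names; the `shareQuota` field queried here is assumed to be the quota, in GiB, reported by the share's management resource:

```azurecli-interactive
# Read back the share's provisioned quota (GiB) after the update;
# placeholder names match the example above.
az storage share-rm show \
    --resource-group <yourResourceGroupName> \
    --storage-account <yourStorageAccountName> \
    --name <yourFileShareName> \
    --query shareQuota
```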

includes/data-box-data-upload-caveats.md

Lines changed: 1 addition & 2 deletions
@@ -22,8 +22,7 @@ ms.author: shaas
 - Use different storage accounts for SMB and NFS.
 - Don't copy the same data to the same end destination in Azure using both SMB and NFS. In these cases, the final outcome can't be determined.
 - Although copying via both SMB and NFS in parallel can work, we don't recommend doing that as it's prone to human error. Wait until your SMB data copy is complete before you start an NFS data copy.
-- Upload management:
-- To improve performance during data uploads, we recommend that you [enable large file shares on the storage account and increase share capacity to 100 TiB](../articles/storage/files/storage-how-to-create-file-share.md#enable-large-file-shares-on-an-existing-account).
+- Upload management:
 - If there are any errors when uploading data to Azure, an error log is created in the target storage account. The path to this error log is available when the upload is complete, and you can review the log to take corrective action. Don't delete data from the source without verifying the uploaded data.
 - File metadata and NTFS permissions can be preserved when the data is uploaded to Azure Files using guidance in [Preserving file ACLs, attributes, and timestamps with Azure Data Box](../articles/databox/data-box-file-acls-preservation.md).
 - The hierarchy of the files is maintained while uploading to the cloud for both blobs and Azure Files. For example, you copied a file at this path: `<container folder>\A\B\C.txt`. This file is uploaded to the same path in cloud.

includes/data-box-order-portal.md

Lines changed: 1 addition & 10 deletions
@@ -73,15 +73,7 @@ To order and device, perform the following steps in the Azure portal:

 This quota is used for billing. After your data is uploaded to the datacenter, you should adjust the quota to meet your needs. For more information, see [Understanding billing](../articles/storage/files/understanding-billing.md).

-- If you're using a **General Purpose v1** or **General Purpose v2** storage account, you can enable large file shares to allow data uploads of up to 100 TiB per share. If large file shares aren't enabled, a data upload to Azure fails after reaching the 5-TiB standard share limit.
-
-If you select a General Purpose v1 or v2 storage account that supports Azure file shares without large file shares enabled, the **Enable large file shares** button is displayed. To enable large file shares for one or more storage accounts, select **Enable large file shares**, and then enable large file shares on each storage account that requires large file shares.
-
-Once you enable large file shares on an account, the storage account is upgraded and this upgrade can't be reversed. For more information, see [Large file shares](../articles/storage/files/storage-how-to-create-file-share.md?tabs=azure-portal#enable-large-file-shares-on-an-existing-account).
-
-:::image type="content" source="media/data-box-order-portal/data-box-import-07.png" alt-text="Screenshot of the Enable option for a Data Box order that imports files to storage accounts. The Enabled button is highlighted.":::
-
-- If you're using a **General Purpose v1**, **General Purpose v2**, or **Blob** storage account, both the **Enable copy to archive** and **Enable large file shares** options are shown. Enabling **Copy to archive** allows you to send your blobs to the archive tier automatically. Any data uploaded to the archive tier remains offline and needs to be rehydrated before it can be read or modified.
+- If you're using a **General Purpose v1**, **General Purpose v2**, or **Blob** storage account, the **Enable copy to archive** option is shown. Enabling **Copy to archive** allows you to send your blobs to the archive tier automatically. Any data uploaded to the archive tier remains offline and needs to be rehydrated before it can be read or modified.

 When **Copy to archive** is enabled, an extra `Archive` share is available during the copy process. The extra share is available for [SMB, NFS, REST, and data copy service](../articles/databox/data-box-deploy-copy-data.md) methods.
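
Because archived data stays offline until it's rehydrated, it may help to see what rehydration looks like after the import completes. A minimal, illustrative Azure CLI sketch; the account, container, and blob names are placeholders, and storage credentials (for example, an account key) are assumed:

```azurecli-interactive
# Rehydrate an archived blob by moving it back to the hot tier
# (names are placeholders; supply credentials such as --account-key).
az storage blob set-tier \
    --account-name mystorageacct \
    --container-name mycontainer \
    --name myarchivedblob \
    --tier Hot \
    --rehydrate-priority Standard
```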

@@ -90,7 +82,6 @@ To order and device, perform the following steps in the Azure portal:
 > [!NOTE]
 > Storage accounts with virtual networks are supported. To allow the Data Box service to work with secured storage accounts, enable the trusted services within the storage account network firewall settings. For more information, see how to [Add Azure Data Box as a trusted service](../articles/storage/common/storage-network-security.md#exceptions).

-
 #### To use managed disks

 When using Data Box to create **Managed disk(s)** from on-premises virtual hard disks (VHDs), you also need to provide the following information:

includes/data-box-storage-account-size-limits.md

Lines changed: 1 addition & 2 deletions
@@ -12,5 +12,4 @@ Here are the limits on the size of the data that's copied into a storage account
 | Size of data copied into Azure storage account | Default limit |
 |---------------------------------------------------------------------|------------------------|
 | Block blob and page blob | Maximum limit is the same as the [Storage limit defined for Azure Subscription](../articles/azure-resource-manager/management/azure-subscription-service-limits.md#azure-storage-limits) and it includes data from all the sources including Data Box. |
-| Azure Files | <ul><li>Data Box supports large file shares (100 TiB) if enabled before creation of the Data Box order.<ul><li>If large file shares are not enabled before order creation, maximum file share size supported is 5 TiB.</li><li>To improve performance during data uploads, we recommend that you [enable large file shares on the storage account and increase share capacity to 100 TiB](../articles/storage/files/storage-how-to-create-file-share.md#enable-large-file-shares-on-an-existing-account). Large file shares are only supported for storage accounts with locally redundant storage (LRS).</li></ul><li>Data Box supports Azure Premium File Shares, which allow a total of 100 TiB for all shares in the storage account.<ul><li>Maximum usable capacity is slightly less because of the space that copy logs and audit logs use. A minimum 100 GiB each is reserved for the copy log and audit log. For more information, see [Audit logs for Azure Data Box, Azure Data Box Heavy](../articles/databox/data-box-audit-logs.md).</li><li>All folders under *StorageAccount_AzFile* must follow this limit. For more information, see [Create an Azure file share](../articles/storage/files/storage-how-to-create-file-share.md).</li></ul></li></ul> |
-
+| Azure Files | Data Box supports Azure premium file shares, which allow a total of 100 TiB for all shares in the storage account. Maximum usable capacity is slightly less because of the space that copy logs and audit logs use. A minimum 100 GiB each is reserved for the copy log and audit log. For more information, see [Audit logs for Azure Data Box, Azure Data Box Heavy](../articles/databox/data-box-audit-logs.md). All folders under *StorageAccount_AzFile* must follow this limit. For more information, see [Create an Azure file share](../articles/storage/files/storage-how-to-create-file-share.md). |
Binary file not shown.
