Commit 4913e7a

Merge pull request #283665 from Becha8/patch-5
Update how-to-guide-upload-data.md
2 parents fe51178 + 8033f32 commit 4913e7a

File tree

1 file changed

+10
-6
lines changed

articles/modeling-simulation-workbench/how-to-guide-upload-data.md

Lines changed: 10 additions & 6 deletions
@@ -1,12 +1,12 @@
 ---
 title: Import data into Azure Modeling and Simulation Workbench
 description: Learn how to import data into a chamber in Azure Modeling and Simulation Workbench.
-author: lynnar
-ms.author: lynnar
-ms.reviewer: yochu
+author: becha8
+ms.author: becha
+ms.reviewer: becha
 ms.service: modeling-simulation-workbench
 ms.topic: how-to
-ms.date: 01/01/2023
+ms.date: 08/05/2024
 # Customer intent: As a Chamber User in Azure Modeling and Simulation Workbench, I want to import data into my chamber.
 ---
 

@@ -40,16 +40,20 @@ Open your web browser and go to the [Azure portal](https://portal.azure.com/). E
 1. Use the AzCopy command to upload your file. For example, use `azcopy copy <sourceFilePath> "<uploadURL>"`.
 
 > [!NOTE]
-> Supported characters for the file name are alphanumeric characters, underscores, periods, and hyphens.
+> Supported characters for the file name are alphanumeric characters, underscores, periods, and hyphens. Make sure that the file name doesn't contain any spaces, because spaces cause the data import to fail.
 >
-> The data pipeline processes only files at the root. It doesn't process subfolders.
+> The data pipeline processes only files at the root. It doesn't process subfolders. If you're importing multiple smaller files, we recommend that you zip or tarball them into a single file.
+>
+> Gigabyte-sized tarballs and zipped files are supported, up to a maximum of 200 GB per file. Ensure that each individual file is less than the maximum allowed size.
 
 1. Confirm that the uploaded file resource with the source file name appears under **Chamber** > **Data Pipeline** > **File**.
 
 A Chamber Admin or Chamber User can access the uploaded file from the chamber by accessing the following path: */mount/datapipeline/datain*.
 
 > [!IMPORTANT]
 > If you're importing multiple smaller files, we recommend that you zip or tarball them into a single file. Gigabyte-sized tarballs and zipped files are supported, depending on your connection type and network speed.
+> The */mount/datapipeline/datain* directory has a capacity of 1 TB. If the imported dataset is larger than this, free up space by moving the files over to */mount/chamberstorages/<Workbench chamber storage>*.
+> The */mount/datapipeline* directory is backed by Azure Files, whereas the */mount/chamberstorages* directory is backed by high-performance Azure NetApp Files. Always copy the tools, binaries, and IP from the */mount/datapipeline/datain* folder to the */mount/chamberstorages* directory under the specific chamber's private storage.
 
 ## Next steps
 