azure-stack/hci/manage/refs-deduplication-and-compression.md (30 additions, 5 deletions)
@@ -4,7 +4,7 @@ description: Learn how to use ReFS deduplication and compression in Azure Stack
 author: alkohli
 ms.author: alkohli
 ms.topic: how-to
-ms.date: 04/10/2024
+ms.date: 09/24/2024
 ---
 
 # Optimize storage with ReFS deduplication and compression in Azure Stack HCI
@@ -148,25 +148,24 @@ After you enable this feature, you can run a one-time job manually or schedule r
 
 Before you run, you should also factor in these other considerations:
 
-- The first run after enabling this feature is always a full scan and optimization of the entire volume. If the `FullRun` parameter is specified, the optimization covers the entire volume rather than only new or unoptimized data.
 - If you don't specify a compression format, the default algorithm is LZ4. You can change the algorithm from one run to another as needed.
 - You can specify more parameters for more complex use cases. The cmdlet used in this section is for the simplest use case.
-- The Full Run, Excluded folder, Excluded file extensions, and Minimum last modified time hours filters apply only when running deduplication, and don't apply when running compression.
+- The Excluded folder, Excluded file extensions, and Minimum last modified time hours filters apply only when running deduplication, and don't apply when running compression.
 
 **Manually run ReFS deduplication and compression jobs**
 
 - To start a job immediately, run the following cmdlet. Once you start a job, its `State` might appear as `NotStarted` because it could still be in the initialization phase.
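As a hedged sketch of the manual run described above (cmdlet names match the ReFS deduplication cmdlets this article covers, but the exact parameters and the volume path are illustrative assumptions, not a definitive invocation):

```powershell
# Start a one-time ReFS deduplication and compression job on a volume.
# The volume path is a placeholder; -Duration caps how long the job may run.
Start-ReFSDedupJob -Volume "C:\ClusterStorage\Volume1" -Duration (New-TimeSpan -Hours 4)

# Check the job state. Immediately after starting, State can still show
# NotStarted while the job is in its initialization phase.
Get-ReFSDedupStatus -Volume "C:\ClusterStorage\Volume1"
```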
@@ -394,6 +393,32 @@ The duration limit is in place to prevent any performance impact on customer wor
394
393
395
394
The following section lists the known issues that currently exist with ReFS deduplication and compression.
396
395
396
+
### When Compression is enabled, using the `-FullRun` parameter on jobs after the first optimization run might result in a deadlock in the system.
397
+
398
+
**Status:** Open.
399
+
400
+
Avoid using `-FullRun` in manually started jobs unless the `Type` is `Dedup`.
401
+
402
+
Follow these steps as a temporary workaround to mitigate this issue:
403
+
404
+
1. Disable ReFS deduplication and compression on the volume:
405
+
406
+
```powershell
407
+
Disable-ReFSDedup -Volume <path>
408
+
```
409
+
410
+
1. Decompress the volume using `refsutil`:
411
+
412
+
```powershell
413
+
refsutil compression /c /f NONE <vol>
414
+
```
415
+
416
+
1. Re-enable ReFS deduplication and compression with the `Dedup` only mode, if needed:
417
+
418
+
```powershell
419
+
Enable-ReFSDedup -Volume <path> -Type Dedup
420
+
```
421
+
397
422
### Scheduling jobs to run simultaneously on multiple CSVs within a single cluster can potentially trigger CSV movements and negatively impact performance.
0 commit comments