azure-stack/hci/manage/refs-deduplication-and-compression.md
30 additions & 4 deletions
@@ -4,7 +4,7 @@ description: Learn how to use ReFS deduplication and compression in Azure Stack
 author: alkohli
 ms.author: alkohli
 ms.topic: how-to
-ms.date: 04/10/2024
+ms.date: 09/24/2024
 ---

 # Optimize storage with ReFS deduplication and compression in Azure Stack HCI
@@ -148,7 +148,7 @@ After you enable this feature, you can run a one-time job manually or schedule r
 Before you run, you should also factor these other considerations:

-- The first run after enabling this feature is always a full scan and optimization of the entire volume. If the `FullRun` parameter is specified, the optimization covers the entire volume rather than new or unoptimized data.
+- The first run after enabling this feature is always a full scan and optimization of the entire volume. If the `FullRun` parameter is specified, the optimization covers the entire volume rather than only new or unoptimized data. Don't use the `FullRun` parameter when the `Type` is `DedupandCompress` or `Compress`.
 - If you don't specify a compression format, the default algorithm is LZ4. You can change the algorithm from one run to another as needed.
 - You can specify more parameters for more complex use cases. The cmdlet used in this section is for the simplest use case.
 - The Full Run, Excluded folder, Excluded file extensions, and Minimum last modified time hours filters apply only when running deduplication, and don't apply when running compression.
@@ -158,15 +158,15 @@ Before you run, you should also factor these other considerations:
 - To start a job immediately, run the following cmdlet. Once you start a job, its `State` might appear as `NotStarted` because it could still be in the initialization phase.
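The cmdlet itself is not shown in this hunk. As a hedged sketch only, assuming the `Start-ReFSDedupJob` and `Get-ReFSDedupStatus` cmdlets and the volume path shown here as a placeholder, starting a one-time job might look like this:

```powershell
# Sketch, not the article's exact command: assumes the ReFSDedup cmdlets
# available on Azure Stack HCI and a CSV volume path (placeholder below).
# No compression format is specified, so the default LZ4 algorithm applies.
Start-ReFSDedupJob -Volume "C:\ClusterStorage\Volume1"

# Check progress; State may read 'NotStarted' while the job initializes.
Get-ReFSDedupStatus -Volume "C:\ClusterStorage\Volume1"
```

Per the considerations above, omit `-FullRun` here unless the volume's `Type` is `Dedup`.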
@@ -422,6 +422,32 @@ Once ReFS deduplication and compression is disabled on a volume, the ETW channel

 If the CSV is moved to another server of the cluster while compression is in progress, the job failed event isn't logged in the ReFS deduplication channel. However, we don't anticipate significant usage impact because of this issue.

+### When Compression is enabled, using `-FullRun` on jobs after the first optimization run might result in a deadlock in the system
+
+**Status:** Open.
+
+Avoid using `-FullRun` in manually started jobs unless the `Type` is `Dedup`.
+
+Follow these steps as a temporary workaround to mitigate this issue:
+
+1. Disable ReFS deduplication and compression on the volume:
+
+   ```powershell
+   Disable-ReFSDedup -Volume <path>
+   ```
+
+1. Decompress the volume using `refsutil`:
+
+   ```powershell
+   refsutil compression /c /f NONE <vol>
+   ```
+
+1. Re-enable ReFS deduplication and compression with the `Dedup` only mode, if needed: