
Commit f0cb13f

readme formatting
1 parent 5ee93fd commit f0cb13f

File tree

1 file changed (+8, -6 lines)


README.md

Lines changed: 8 additions & 6 deletions
@@ -218,9 +218,10 @@ key: "INTER_OP_THREAD_COUNT"
 }
 ```
 
-**NOTE**: This parameter is set globally for the PyTorch backend.
-The value from the first model config file that specifies this parameter will be used.
-Subsequent values from other model config files, if different, will be ignored.
+> [!NOTE]
+> This parameter is set globally for the PyTorch backend.
+> The value from the first model config file that specifies this parameter will be used.
+> Subsequent values from other model config files, if different, will be ignored.
 
 * `INTRA_OP_THREAD_COUNT`:
 
@@ -242,9 +243,10 @@ key: "INTRA_OP_THREAD_COUNT"
 }
 ```
 
-**NOTE**: This parameter is set globally for the PyTorch backend.
-The value from the first model config file that specifies this parameter will be used.
-Subsequent values from other model config files, if different, will be ignored.
+> [!NOTE]
+> This parameter is set globally for the PyTorch backend.
+> The value from the first model config file that specifies this parameter will be used.
+> Subsequent values from other model config files, if different, will be ignored.
 
 * Additional Optimizations: Three additional boolean parameters are available to disable
 certain Torch optimizations that can sometimes cause latency regressions in models with
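For reference, the notes reformatted in both hunks sit directly under `config.pbtxt` parameter blocks of the shape sketched below. This is a minimal sketch assuming Triton's usual `parameters`/`string_value` model-config syntax; the thread-count values of "1" are illustrative, not taken from the diff.

```
# Sketch (assumed layout): the NOTE admonitions above follow blocks like these
# in the README's config.pbtxt examples. Values shown are placeholders.
parameters: {
  key: "INTER_OP_THREAD_COUNT"
  value: {
    string_value: "1"
  }
}
parameters: {
  key: "INTRA_OP_THREAD_COUNT"
  value: {
    string_value: "1"
  }
}
```

As the NOTE text states, both parameters apply globally to the PyTorch backend, so the first model config that sets them determines the value used for every model served by that backend.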
