This repository was archived by the owner on Jun 3, 2025. It is now read-only.

Commit be779c5

add clarification for create a recipe enablement in user guide

1 parent 3b49f10 commit be779c5

File tree

1 file changed: +4 −0 lines changed


src/content/user-guide/recipes/enabling.mdx

Lines changed: 4 additions & 0 deletions
```diff
@@ -27,6 +27,7 @@ To enable all of this, the integration code is accomplished by writing a handful
 ```python
 from sparseml.pytorch.optim import ScheduledModifierManager
 
+## fill in definitions below
 model = Model()  # model definition
 optimizer = Optimizer()  # optimizer definition
 train_data = TrainData()  # train data definition
```
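For context, the diff above only annotates the setup portion of the snippet. A minimal, self-contained sketch of the pattern being documented follows: fill in the definitions, then hand the model and optimizer to a recipe manager. The `RecipeManager` class below is a hypothetical stand-in used so the sketch runs on its own; it is not the real `ScheduledModifierManager` API from SparseML.

```python
# Hypothetical sketch of the "fill in definitions, then apply a recipe" flow.
# RecipeManager is a stand-in for illustration only, NOT the sparseml API.

class Model:
    """Stand-in for your model definition."""

class Optimizer:
    """Stand-in for your optimizer definition."""
    def step(self):
        return "stepped"

class RecipeManager:
    """Hypothetical manager: in the real library, modify() wraps the
    optimizer so each step also applies the recipe's modifiers."""
    def __init__(self, num_epochs):
        self.num_epochs = num_epochs

    def modify(self, model, optimizer, steps_per_epoch):
        # this sketch simply passes the optimizer through
        return optimizer

## fill in definitions below
model = Model()
optimizer = Optimizer()
train_data = list(range(1024))  # dummy train data definition
batch_size = 64
num_train_batches = len(train_data) // batch_size  # batches per training epoch

manager = RecipeManager(num_epochs=10)
optimizer = manager.modify(model, optimizer, steps_per_epoch=num_train_batches)
```

The same setup-then-modify shape repeats in the Keras and TensorFlow V1 hunks below; only the framework-specific definitions change.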
```diff
@@ -58,6 +59,7 @@ To enable all of this, the integration code you'll need to write is only a handf
 ```python
 from sparseml.keras.optim import ScheduledModifierManager
 
+## fill in definitions below
 model = None  # your model definition
 optimizer = None  # your optimizer definition
 num_train_batches = len(train_data) / batch_size  # your number of batches per training epoch
```
```diff
@@ -94,6 +96,7 @@ The `ScheduledModifierManager` can override the necessary callbacks in the estim
 ```python
 from sparseml.tensorflow_v1.optim import ScheduledModifierManager
 
+## fill in definitions below
 estimator = None  # your estimator definition
 num_train_batches = len(train_data) / batch_size  # your number of batches per training epoch
 
```
```diff
@@ -118,6 +121,7 @@ from sparseml.tensorflow_v1.utils import tf_compat
 from sparseml.tensorflow_v1.optim import ScheduledModifierManager
 
 
+## fill in definitions below
 with tf_compat.Graph().as_default() as graph:
     # Normal graph setup....
     num_train_batches = len(train_data) / batch_size  # your number of batches per training epoch
```
