
Commit 67bc080

agunapal and svekars authored
Apply suggestions from code review
Co-authored-by: Svetlana Karslioglu <[email protected]>
1 parent 9ee64d9 commit 67bc080

File tree

1 file changed

+2
-2
lines changed


recipes_source/torch_export_aoti_python.py

Lines changed: 2 additions & 2 deletions
@@ -51,7 +51,7 @@
 # We will use the TorchVision pretrained `ResNet18` model and TorchInductor on the
 # exported PyTorch program using :func:`torch._inductor.aot_compile`.
 #
-# .. note::
+# .. note::
 #
 # This API also supports :func:`torch.compile` options like ``mode``
 # This means that if used on a CUDA enabled device, you can, for example, set ``"max_autotune": True``
@@ -107,7 +107,7 @@
 # we added a new API called :func:`torch._export.aot_load` to load the shared library in the Python runtime.
 # The API follows a structure similar to the :func:`torch.jit.load` API . You need to specify the path
 # of the shared library and the device where it should be loaded.
-# .. note::
+# .. note::
 #
 # In the example above, we specified ``batch_size=1`` for inference and it still functions correctly even though we specified ``min=2`` in
 # :func:`torch.export.export`.
