
Commit 08b1d19

fix spelling

1 parent b1c2165

File tree

1 file changed (+6 −6 lines)

intermediate_source/torch_export_tutorial.py

Lines changed: 6 additions & 6 deletions
@@ -365,7 +365,7 @@ def forward(
 # When guards involve symbols allocated for input dimensions, the program contains restrictions on what input shapes are valid;
 # i.e. the program's dynamic behavior. The symbolic shapes subsystem is the part responsible for taking in all the emitted guards
 # and producing a final program representation that adheres to all of these guards. Before we see this "final representation" in
-# an ExportedProgram, let's look at the guards emitted by the toy model we're tracing.
+# an ``ExportedProgram``, let's look at the guards emitted by the toy model we're tracing.
 #
 # Here, each forward input tensor is annotated with the symbol allocated at the start of tracing:

@@ -416,7 +416,7 @@ def forward(
 # so interesting currently, since this export call doesn't emit any guards related to symbol bounds and each base symbol has
 # a generic bound, but this will come up later.
 #
-# So far, because we've been exporting this toy model, this experience has been misrepresentative of how hard
+# So far, because we've been exporting this toy model, this experience has not been representative of how hard
 # it typically is to debug dynamic shapes guards & issues. In most cases it isn't obvious what guards are being emitted,
 # and which operations and parts of user code are responsible. For this toy model we pinpoint the exact lines, and the guards
 # are rather intuitive.
@@ -542,7 +542,7 @@ def forward(self, w, x, y, z):
 # and presenting what export believes is the overall dynamic behavior of the program. The drawback of this design appears once the user has stronger expectations or
 # beliefs about the dynamic behavior of these models - maybe there is a strong desire on dynamism and specializations on particular dimensions are to be avoided at
 # all costs, or maybe we just want to catch changes in dynamic behavior with changes to the original model code, or possibly underlying decompositions or meta-kernels.
-# These changes won't be detected and the ``export()`` call will most likely succeed, unless tests are in place that check the resulting ExportedProgram representation.
+# These changes won't be detected and the ``export()`` call will most likely succeed, unless tests are in place that check the resulting ``ExportedProgram`` representation.
 #
 # For such cases, our stance is to recommend the "traditional" way of specifying dynamic shapes, which longer-term users of export might be familiar with: named ``Dims``:

@@ -555,7 +555,7 @@ def forward(self, w, x, y, z):
 
 ######################################################################
 # This style of dynamic shapes allows the user to specify what symbols are allocated for input dimensions, min/max bounds on those symbols, and places restrictions on the
-# dynamic behavior of the ExportedProgram produced; ConstraintViolation errors will be raised if model tracing emits guards that conflict with the relations or static/dynamic
+# dynamic behavior of the ``ExportedProgram`` produced; ``ConstraintViolation`` errors will be raised if model tracing emits guards that conflict with the relations or static/dynamic
 # specifications given. For example, in the above specification, the following is asserted:
 # - ``x.shape[0]`` is to have range ``[4, 256]``, and related to ``y.shape[0]`` by ``y.shape[0] == 2 * x.shape[0]``.
 # - ``x.shape[1]`` is static.
@@ -571,7 +571,7 @@ def forward(self, w, x, y, z):
 
 ######################################################################
 # One common issue with this specification style (before ``Dim.AUTO`` was introduced), is that the specification would often be mismatched with what was produced by model tracing.
-# That would lead to ConstraintViolation errors and export suggested fixes - see for example with this model & specification, where the model inherently requires equality between
+# That would lead to ``ConstraintViolation`` errors and export suggested fixes - see for example with this model & specification, where the model inherently requires equality between
 # dimensions 0 of ``x`` and ``y``, and requires dimension 1 to be static.
 
 class Foo(torch.nn.Module):
@@ -596,7 +596,7 @@ def forward(self, x, y):
 # - ``None`` is a good option for static behavior:
 # - ``dynamic_shapes=None`` (default) exports with the entire model being static.
 # - specifying ``None`` at an input-level exports with all tensor dimensions static, and alternatively is also required for non-tensor inputs.
-# - specfiying ``None`` at a dimension-level specializes that dimension, though this is deprecated in favor of ``Dim.STATIC``.
+# - specifying ``None`` at a dimension-level specializes that dimension, though this is deprecated in favor of ``Dim.STATIC``.
 # - specifying per-dimension integer values also produces static behavior, and will additionally check that the provided sample input matches the specification.
 #
 # These options are combined in the inputs & dynamic shapes spec below:
