
[export] update dynamic shapes section #3183


Closed
pianpwk wants to merge 7 commits

Conversation

pianpwk
Contributor

@pianpwk pianpwk commented Dec 11, 2024

Description

Updates the export dynamic shapes tutorial to reflect our newer APIs.

Checklist

  • The issue that is being fixed is referenced in the description (see above: "Fixes #ISSUE_NUMBER")
  • Only one issue is addressed in this pull request
  • Labels from the issue that this PR is fixing are added to this pull request
  • No unnecessary issues are included in this pull request

cc @avikchaudhuri @gmagogsfm @zhxchen17 @tugsbayasgalan @angelayi @suo @ydwu4


pytorch-bot bot commented Dec 11, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/tutorials/3183

Note: Links to docs will display an error until the docs builds have been completed.

❌ 2 New Failures

As of commit e42792e with merge base 7038ce7:

NEW FAILURES - The following jobs have failed:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@pianpwk pianpwk marked this pull request as ready for review December 11, 2024 23:08
@pianpwk pianpwk changed the title from "init" to "[export] update dynamic shapes section" Dec 11, 2024
# example inputs given to the initial ``torch.export.export()`` call.
# If we try to run the ``ExportedProgram`` in the example below with a tensor
# with a different shape, we get an error:
# This section covers dynamic behavior and representation of exported programs. Dynamic behavior is


In line 14, it says "This tutorial provides a snapshot of torch.export usage as of PyTorch 2.3."

Should we update that to PyTorch 2.5 or 2.6?

def forward(self, x, y):
    return torch.nn.functional.relu(self.lin(x + y), inplace=True)
######################################################################
# Before we look at the program that's produced, let's understand what specifying ``dynamic_shapes`` entails,


nit: can we make this regular text instead of a comment? It's a little hard to read because of the color and line breaks.


similarly for the other chunks of long text below

Contributor Author


Oh hmm I thought this was fixed, my bad

Contributor Author


Actually I think they just need to be regenerated.. not sure how


@yushangdi yushangdi Dec 12, 2024


> Actually I think they just need to be regenerated.. not sure how

I thought the preview here should be the regenerated one? They are still comments here. https://docs-preview.pytorch.org/pytorch/tutorials/3183/intermediate/torch_export_tutorial.html#constraints-dynamic-shapes

Not sure if it has anything to do with the CI failure

https://github.com/pytorch/tutorials/actions/runs/12287425965/job/34289430334?pr=3183

@pianpwk pianpwk closed this Jan 6, 2025